Deployment/build tool between Ant and Chef - Windows

So I've been agonizing over embracing a deployment/configuration management tool like Chef or Puppet for a good long while. Not because I have any hesitation about them in general, but because I don't think they are a good fit for our specific scenario.
As far as I can see, these types of tools are targeted at frequent/wide-scale deployments, where you need to roll out software to 10s-1000s of systems. In our environment, we have a collection of ~25 different web services spread across half a dozen runtimes, with 1-8 deployments of each in production currently. Our big deployment problem is that each of the services has a different deployment story, and it's entirely manual, so it tends to be time consuming and error prone. Another wrinkle is that different instances in production may be different versions of the software, so we may need to concurrently support multiple deployment stories for a single service.
So I feel like we need something more like Ant/Maven/Rake, which is customized for each service. However, my experience with those is they are generally focused on local operations, and specific to a given language/runtime.
Is there a runtime-agnostic framework for describing and orchestrating building/testing/deployment in the manner I'm interested in?
I'm sure if I hit them long enough, I could get Rake or Puppet to do these for me, but I'm looking for something built for this purpose.
(Oh, and to make things worse, everything runs on Windows)
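To make "describing and orchestrating" concrete, here is roughly the shape of thing I'm imagining, sketched in Python purely for illustration (every service name, version and command below is invented):

    # deploy_stories.py - illustrative only; services, versions and commands are invented.
    import subprocess

    # Each (service, version) pair gets its own "story": an ordered list of commands.
    STORIES = {
        ("billing-api", "2.1"): [
            ["msbuild", r"src\Billing.sln", "/p:Configuration=Release"],
            ["robocopy", r"build\Release", r"\\web01\d$\apps\billing", "/MIR"],
        ],
        ("billing-api", "1.7"): [
            ["nant", "-buildfile:legacy.build", "package"],
            ["robocopy", r"build\legacy", r"\\web02\d$\apps\billing", "/MIR"],
        ],
    }

    def run_step(step):
        rc = subprocess.call(step)
        # robocopy reports success with exit codes 0-7, unlike most tools
        ok = rc < 8 if step[0].lower() == "robocopy" else rc == 0
        if not ok:
            raise RuntimeError("step failed (exit %d): %s" % (rc, " ".join(step)))

    def deploy(service, version):
        for step in STORIES[(service, version)]:
            print("running:", " ".join(step))
            run_step(step)

    if __name__ == "__main__":
        deploy("billing-api", "2.1")

The point is that each service/version keeps its own story, but they all run through one driver; I'd just rather not grow and maintain that driver myself.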
Thanks!

Here's another alternative you might want to consider: kwatee (I'm affiliated) is a free, lightweight deployment tool which, besides having a web management interface, can also integrate with Ant (or Maven, or anything else, via its Python CLI) to automate build & deployment on dev/test environments, for instance.
One of the nice things is the web configuration interface, which makes it pretty easy to quickly configure your deployment stories, i.e. which software/version goes on which server. It's often necessary to set up different parameters in configuration files depending on the target server. For that you can "templatize" your packages using kwatee variables (similar to environment variables), which are configured with different values for each server.
Software must be registered in kwatee's repository in the form of a folder of files, an archive (zip, tar, tar.gz, bzip2, war), or a single file (e.g. an exe). MSIs are not supported. To deploy on Windows, kwatee needs the servers to have either telnet/ftp or ssh/scp (there are free tools out there).
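For what it's worth, the per-server variable idea is easy to picture even without the tool. The sketch below shows only the concept, using Python's string.Template; the placeholder names are invented and kwatee's own templating syntax will differ:

    # Concept sketch of per-server "templatized" configuration; this is not
    # kwatee's engine or syntax, just an illustration using string.Template.
    from string import Template

    template = Template("db.host=$DB_HOST\nlog.dir=$LOG_DIR\n")

    # One set of variable values per target server.
    servers = {
        "web01": {"DB_HOST": "db01.internal", "LOG_DIR": r"D:\logs\web01"},
        "web02": {"DB_HOST": "db02.internal", "LOG_DIR": r"D:\logs\web02"},
    }

    for name, variables in servers.items():
        print("---", name, "---")
        print(template.substitute(variables))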

Related

Creating an installer for consultingware

At the company I work, we have a product that for all intents and purposes could be called consultingware. It's a platform for EDI with quite a few moving parts. The back-end is an ESB written in Java SE, the front-end is a Java EE application running on GlassFish, the database is typically on an MSSQL server, and RabbitMQ is used as queueing middleware. It's domain-agnostic in the sense that different message models and mappings can be deployed. Setting up a new environment tends to take quite a while, but a lot of it is mundane tasks that could easily be automated by filling in the right parameters and running scripts: T-SQL for the database, asadmin scripts on GlassFish, and the ESB configs are XML, so an XSLT transformation on a template would do the job.
This is never going to become a simple installation, but having an "installer" that does most of the work for you, lists prerequisite steps, presents the user with a convenient way of supplying necessary parameters, generates some scripts, and puts things in place would be nice; even if only the devs ever use it, it would make life easier. Although the software is technically platform-independent, it tends to be run on Windows Server.
Just making a Java application that does the above wouldn't be very difficult, but rather than reinvent the wheel (and make a probably very ugly GUI) I'd like to see if any existing solutions fit the bill. InstallShield and Inno Setup look promising. So the question is, which existing tool could provide the following, or alternatively, is making something from scratch worth it?
Call other executables or installers (for GlassFish, for example).
Run shell scripts (for the asadmin setup).
Connect to a (MSSQL) database and run scripts.
Perform XSLT transformations (could be via a Java method call/jar execution).
Set up services.
Maybe have some way of checking if prerequisites are fulfilled (check that GlassFish and RabbitMQ are installed, the DB is accessible...)
FWIW, you can do all of those things from an MSI. There are a number of tools out there that make the process easier. I use a free one called MAKEMSI that is excellent: http://dennisbareis.com/makemsi.htm
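Whichever packaging route you take, the custom actions usually boil down to shelling out to the same handful of commands the question lists. Here is a rough sketch of that orchestration in Python; every path, script name, and the choice of asadmin/sqlcmd/sc is an assumption for illustration, not part of any particular product:

    # install_steps.py - illustrative orchestration an installer's custom actions
    # might wrap; all paths, script names and commands are invented.
    import shutil
    import subprocess
    import sys

    def check_prerequisites():
        missing = [tool for tool in ("asadmin", "sqlcmd") if shutil.which(tool) is None]
        if missing:
            sys.exit("missing prerequisites: " + ", ".join(missing))

    def run(cmd):
        print("running:", " ".join(cmd))
        subprocess.check_call(cmd)

    def main():
        check_prerequisites()
        # GlassFish setup via an asadmin script
        run(["asadmin", "multimode", "--file", "setup-domain.asadmin"])
        # Database objects via T-SQL
        run(["sqlcmd", "-S", "dbserver", "-d", "edi", "-i", "create-schema.sql"])
        # ESB config from an XML template (the XSLT step delegated to a jar)
        run(["java", "-jar", "config-transform.jar", "template.xml", "esb-config.xml"])
        # Register the ESB as a Windows service
        run(["sc", "create", "EsbService", "binPath=", r"C:\esb\esb.exe"])

    if __name__ == "__main__":
        main()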

Trouble developing on mirrored, but separate, production environment

I'm having some problems with the "development environment should be as close as possible to the production environment".
(Production machine's operating system is Linux.)
My understanding of development steps (roughly):
code, compile, test/run, repeat
"Normally" I would go through these on my own machine, then push the code to CI for testing, and possibly deploy. The CI would be responsible for running the tests in an environment that matches production, this way if the tests pass, it's safe to assume that the code works in production as well.
The problem of a larger environment
☑ Database - of some kind.
☑ Job Processing Pool - for some long-running background tasks.
☑ User Account Management - used by other systems as well.
☑ Centralized Logging - for sanity.
☑ Reverse Proxy - to tie individual HTTP-accessible services together under the same URL but different paths.
☐ And possible other services or collections of services.
Solutions?
All on my own machine? No way in hell.
All on a virtual machine? Maybe, but security-wise, if this setup is supposed to mirror the production environment, and the production environment really were all on one machine like this, well... that might not be such a good idea in case of a breach.
Divide by responsibility and set them up on multiple virtual machines? Who's gonna manage all those machines? I think it's possible to do better than this.
Use containers such as Docker, or slap something similar together yourself? Sounds good: (possibly) very fast iteration cycles, separation of concerns, some security by separation, and easy reproducibility.
For the sake of simplicity, let's say that our containerization tooling of choice is Docker, and we are not going to build one ourselves with libvirt / lxc tooling / direct kernel calls.
So Docker it is, possibly with CoreOS or Project Atomic. So now there is a container for an application (or multiple applications) that has been separated from the rest of the system, and can be brought up nearly identically anywhere.
Solution number 1: Production environment is pretty and elegant.
Problem number 1: This is not a development environment.
The development environment
Whichever option I choose to avoid sprinkling the production environment onto my own machine, the problem remains the same:
Even though the production environment is correctly set up, I have to run the compilation and testing somewhere, before being able to deploy (be it to another testing round by CI or whatever).
How do I solve this?
Can it really be that the proper way to solve this is by writing code on my own machine, having it synchronized/directly visible in a virtualized-mirrored-production-like environment, which automates running of the tests?
What happens when I don't want to run all the tests, but only the portion I'm working on right now? Do I edit the automated compilation process every time? What about remote debugging, since multiple systems must be orchestrated to run in the correct way, and the debugger must attach to one of the programs in between? Not to mention the speed of the "code, test" cycle, which would be _very_ slow.
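For the test-subset part at least, the glue I imagine is small: with the source tree bind-mounted into a production-like container, a watcher can rerun just the slice of the suite I point it at. The container name, paths and pytest below are placeholders, not a real setup:

    # devloop.py - sketch of "edit locally, test inside a production-like container";
    # assumes a running container named "devbox" with the source bind-mounted at /src.
    import subprocess
    import sys
    import time
    from pathlib import Path

    def snapshot(root):
        """Map every tracked source file to its modification time."""
        return {p: p.stat().st_mtime for p in Path(root).rglob("*.py")}

    def run_tests(test_path):
        # Run only the requested slice of the suite, inside the container.
        subprocess.call(["docker", "exec", "devbox", "pytest", "/src/" + test_path])

    def main(root=".", test_path="tests/"):
        seen = snapshot(root)
        while True:
            time.sleep(1)
            current = snapshot(root)
            if current != seen:
                seen = current
                run_tests(test_path)

    if __name__ == "__main__":
        main(test_path=sys.argv[1] if len(sys.argv) > 1 else "tests/")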
This sounds a helluva lot like CI, but multiple developers can't all use and modify the same CI setup, so they probably need to have this setup on their own machines.
I was also thinking that the developers could each use a completely virtualized OS that contained all the development tools and mirrored the production environment, but that would force veteran users to adopt the tooling of the virtual development environment, which doesn't sound like such a good idea.

Executables deployment in multiple servers regularly

If there is a code base and many servers, how should I deploy the executable to multiple servers regularly? I need a script to do this.
There are many system management tools. Chef is what we use; it is great, but there is a bit of a learning curve. You may want to look into your OS's default package manager and distribute your code base as packages.
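If a package manager feels like overkill for now, even a small push script buys repeatability. Something along these lines; the hostnames, paths and restart command are placeholders, and it assumes key-based SSH access to every host:

    # push_release.py - minimal push-style deployment sketch; hostnames, paths and
    # the restart command are invented, and key-based SSH access is assumed.
    import subprocess

    HOSTS = ["app01", "app02", "app03"]
    ARTIFACT = "build/myservice"
    REMOTE_PATH = "/opt/myservice/myservice"

    def deploy(host):
        # Copy the new executable alongside the old one, keep a backup, swap, restart.
        subprocess.check_call(["scp", ARTIFACT, "%s:%s.new" % (host, REMOTE_PATH)])
        remote_cmd = (
            "cp -f {p} {p}.bak 2>/dev/null || true; "
            "mv {p}.new {p}; "
            "sudo systemctl restart myservice"
        ).format(p=REMOTE_PATH)
        subprocess.check_call(["ssh", host, remote_cmd])

    if __name__ == "__main__":
        for host in HOSTS:
            print("deploying to", host)
            deploy(host)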

Using Puppet (or anything else) instead of a bash script for an SSH-based deployment

I have a custom build and deployment script which works over SSH and deploys to servers (running macOS). The bash script does a lot of simple things like copying files, backing up the old ones, and applying the correct SQL scripts for a forward-moving database. But there are some advanced things, like starting a remote SQL upgrade procedure which can be disconnected from; once the deployment script is started again, it only goes forward if the SQL script has been applied completely (in short, there is some flow control happening, and bash is not really ideal for such stuff).
The script is already huge and is a mess, since bash is not meant for this kind of detailed logic. Can you recommend some tools or libraries which would make things easier?
From what you tell us, I think you need a deployment tool rather than a configuration management tool.
To simplify, I'll distinguish the two like this:
A deployment tool is a 'push' tool: When you press the button, the required actions are run to make the deployment. It's a one-step process (it can have multiple actions, but it's launched once).
A configuration management tool is usually a 'pull' tool, where your servers periodically check if their configuration is exactly as the CM server tells them to be - and apply changes, if needed. You configure your servers once, and after that the system assures that all is as it should be. It is also a great tool to easily clone systems.
For deployment tools, I personally know Fabric, a great Python tool. But there is also Capistrano in the Ruby world. I don't know of any others.
For CM tools, Puppet and Chef seem to be the preferred choice of people nowadays. Cfengine is an older tool, which had some problems (I don't know if that has changed).
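To give an idea of what that looks like in practice, a Fabric (1.x style) fabfile for roughly the steps you describe might be sketched like this; the host, paths, SQL step and launchctl call are placeholders, not your actual setup:

    # fabfile.py - rough Fabric 1.x sketch of a push deployment; the host, paths
    # and commands are placeholders.
    from fabric.api import cd, env, put, run

    env.hosts = ["deploy@app01.example.com"]

    def deploy(version):
        release = "/opt/app/releases/%s" % version
        run("mkdir -p %s" % release)
        put("build/app.tar.gz", release)
        with cd(release):
            run("tar xzf app.tar.gz")
            # Only switch the 'current' symlink if the SQL upgrade completed.
            run("./apply_sql_upgrade.sh && ln -sfn %s /opt/app/current" % release)
        run("launchctl kickstart -k system/com.example.app")

You would run it with something like fab deploy:1.2.3, and the flow control you struggle with in bash (waiting on the SQL upgrade, resuming a half-finished deployment) becomes ordinary Python.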
Here are my recommendations:
Puppet
Chef
cfengine
These are all free (as in beer) and allow you to do what you're wanting. They will require you to adapt your current bash script into modules to fit their design/framework. It's a bit of work, but in the long run it tends to be better since the frameworks take care of error checking, converging configurations and a lot of other things you'd have to manually insert into your own code were you doing this yourself.
I've also used Opsware previously for this sort of thing, but that costs a fair bit of cash and, for what you're trying to do, does not offer significantly more benefit.
In some cases, moving from a bash script to a complete solution is not as straightforward as many cloud services claim.
With 'don't try new things when you're on a deadline' in mind:
it could also be good timing to refactor your bash scripts.
I have done automated, repeatable deployments in the past using PaaS, or just using Git/SVN hooks with deployogi (which is bash): https://github.com/coderofsalvation/deployogi
I understand your situation, but I'm not sure whether it's fair to say that the bash language implies 'a mess' and 'complex'.
Every language allows you to hide complexity, no?
I guess code (in whatever language) gets overly complex when time does not allow us to refactor :)
PaaS is great. But always needed? I think not.

Does CI need a CI server

Is a CI server required for continuous integration?
In order to facilitate continuous integration you need to automate the build, distribution, and deployment processes. Each of these steps is possible without any specialized CI server. Coordinating these activities can be done through file notifications and other low-level mechanisms; however, a database-driven backend (a CI server) coordinating these steps greatly enhances the reliability, scalability, and maintainability of your systems.
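As a very rough illustration of coordinating those steps without a CI server, the loop can be as small as this; the repository layout, build and test commands, and the notification step are placeholders:

    # poor_mans_ci.py - minimal "CI without a CI server" polling loop; the build
    # and test commands and the notification step are placeholders.
    import subprocess
    import time

    def head_revision():
        return subprocess.check_output(["git", "rev-parse", "origin/master"]).strip()

    def build_and_test():
        for cmd in (["git", "pull"], ["make", "build"], ["make", "test"]):
            if subprocess.call(cmd) != 0:
                return False
        return True

    def main():
        last = None
        while True:
            subprocess.call(["git", "fetch", "origin"])
            current = head_revision()
            if current != last:
                last = current
                ok = build_and_test()
                print("build", "passed" if ok else "FAILED", "at", current.decode())
                # A real setup would send email or another notification here.
            time.sleep(60)

    if __name__ == "__main__":
        main()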
You don't need a dedicated server, but a build machine of some kind is invaluable, otherwise there is no single central place where the code is always being built and tested. Although you can mimic this effect using a developer machine, there's the risk of overlap with the code that is being changed on that machine.
BTW I use Hudson, which is pretty lightweight - it doesn't need much to get it going.
It's important to use a dedicated machine so that you get independent verification, without corruption.
For small projects, it can be a pretty basic machine, so don't let hardware costs get you down. You probably have an old machine in a closet that is good enough.
You can also avoid dedicated hardware by using a virtual machine. Best bet is to find a server that is doing something else but is underloaded, and put the VM on it.
Before I ever heard the term "continuous-integration" (This was back in 2002 or 2003) I wrote a nightly build script that connected to cvs, grabbed a clean copy of the main project and the five smaller sub-projects, built all the jars via ant then built and redeployed a WAR file via a second ant script that used the tomcat ant tasks.
It ran via cron at 7pm and sent email with a bunch of attached output files. We used it for the entire 7 months of the project and it stayed in use for the next 20 months of maintenance and improvements.
It worked fine, but I would prefer Hudson over bash scripts, cron, and ant.
A separate machine is really necessary if you have more than one developer on the project.
If you're using the .NET technology stack here's some pointers:
CruiseControl.NET is fairly lightweight. That's what we use. You could probably run it on your development machine without too much trouble.
You don't need to install or run Visual Studio unless you have Visual Studio Setup Projects. Instead, you can use a free command line build tool called MSBuild.
