I hear these terms used together and wonder what the difference is. How are they related to continuous builds and continuous deployments?
Continuous integration / continuous builds is all about getting developers to commit code to a source code repository little and often (and to get the latest version from the repository, so any further changes are based on other developers' recent changes). This reduces the time wasted on merge resolution, because small, frequent merges are much easier to resolve.
The process is best automated using a build server, which can also run any unit tests. Feedback is then provided to the developers in the case of a build or test failure, so that any issues can be fixed quickly.
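To make that concrete, here is a minimal sketch in Python of the kind of job a build server runs on every commit: fetch the latest code, build it, run the unit tests and report the result. The repository URL, the project layout and the use of pip/pytest are assumptions for the example, not the behaviour of any particular CI product.

    import subprocess
    import sys

    REPO = "https://example.com/myproject.git"   # hypothetical repository URL
    WORKDIR = "ci-workspace"

    def run(cmd, cwd=None):
        """Run a command and raise if it exits with a non-zero status."""
        print("$ " + " ".join(cmd))
        return subprocess.run(cmd, cwd=cwd, check=True)

    def main():
        try:
            run(["git", "clone", "--depth", "1", REPO, WORKDIR])
            run([sys.executable, "-m", "pip", "install", "-e", "."], cwd=WORKDIR)
            run([sys.executable, "-m", "pytest"], cwd=WORKDIR)   # the unit tests
        except subprocess.CalledProcessError as exc:
            print(f"BUILD FAILED: {exc}")   # feedback for the developers
            sys.exit(1)
        print("BUILD OK: artefacts are ready to publish")

    if __name__ == "__main__":
        main()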
Continuous deployment involves the automated deployment of the artefacts from the build process onto the test and production environments. To mitigate the risk involved with this, people often use feature toggles to separate the release (in a controlled way) from the deployment.
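As an illustration of how a feature toggle separates release from deployment, here is a minimal Python sketch; the toggle names, the in-memory toggle store and the checkout functions are made up for the example.

    # The new code path is deployed to production, but it only runs once the
    # toggle is switched on, so the release is separated from the deployment.
    TOGGLES = {
        "new_checkout_flow": False,   # deployed, but not yet released to users
        "improved_search": True,      # released to everyone
    }

    def is_enabled(feature: str) -> bool:
        # In a real system this would come from config, a database or a toggle service.
        return TOGGLES.get(feature, False)

    def legacy_checkout(cart):
        return f"legacy checkout of {len(cart)} items"

    def new_checkout(cart):
        return f"new checkout of {len(cart)} items"

    def checkout(cart):
        if is_enabled("new_checkout_flow"):
            return new_checkout(cart)    # new code, dark until toggled on
        return legacy_checkout(cart)     # existing, proven code path

    print(checkout(["book", "coffee"]))   # -> "legacy checkout of 2 items"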
Continuous delivery is less about the technology and more about the organisation's approach to software delivery (although it does make heavy use of automation).
DevOps is a much larger area that generally emphasises breaking down barriers between developers and operations teams, and getting them to collaborate in a way where they benefit from combined skills. In a company this typically leads to more automation of environment provisioning, build deployment and monitoring (including reacting automatically to problems and to scaling), and in some cases software-defined networking. In some organisations, dedicated DevOps teams have been created.
Continuous Delivery (CD) is a concept that was first described in the 2010 book Continuous Delivery, co-authored by Jez Humble and David Farley, both of ThoughtWorks.
Continuous Integration and Continuous Delivery often get confused with one another, but there are some key differences:
CI can be done by one dev, whereas CD requires team collaboration
CD cannot be done without CI
CD is a linear journey, whereas CI is a continuous feedback (build) loop moving CD forward
With CD you are always ready to push to prod
CI allows you to check your code into the repo multiple times so you can detect issues early on
Here is a quote from Martin Fowler:
"Continuous Integration is a software development practice where
members of a team integrate their work frequently, usually each person
integrates at least daily - leading to multiple integrations per day.
Each integration is verified by an automated build (including test) to
detect integration errors as quickly as possible. Many teams find that
this approach leads to significantly reduced integration problems and
allows a team to develop cohesive software more rapidly."
The main difference between Continuous Delivery and Continuous Deployment is automation: with Continuous Deployment you automate the deployment to production itself. This works well if you are pushing to production multiple times a day, or for a variety of other reasons.
As for DevOps, that's a whole other ball of wax. People often think DevOps is a role or a tool, but it's really a culture. You don't "do" DevOps. Here's a quote from Mike Kavis that I like quite a bit:
"DevOps is a culture shift or a movement that encourages great
communication and collaboration (aka teamwork) to foster building
better-quality software more quickly with more reliability."
There's probably some ambiguity in how the continuous xxx phrases are used by different people, but I think this blog post sums it up pretty well.
http://blog.assembla.com/assemblablog/tabid/12618/bid/92411/Continuous-Delivery-vs-Continuous-Deployment-vs-Continuous-Integration-Wait-huh.aspx
DevOps is more of an overarching idea than a specific practice, a bit like Agile is the idea, and unit testing is a practice.
I'm doing research regarding continuous integration tools and their benefits. For my research I'm looking at the following tools:
GitLab CI
Jenkins
Bamboo
GoCD
TeamCity
Now, I won't bother you with all the requirements and benefits. But so far I'm not finding many differences between the tools, except for these:
Fan-in/fan-out support (GoCD)
Community size; Jenkins and GitLab seem to have the most contributors
Costs
Open source or not
Number of plugins available
I was wondering if people who have had to choose a continuous integration tool could share their experience: why they chose that tool, and whether there are differences worth thinking about before choosing that I didn't cover.
Right now I'm leaning towards GoCD because of the fan-in/fan-out support and the visualisation of the continuous delivery pipeline. Does anybody have experience with the support for issues with this tool?
Thanks in advance,
Disclaimer: I was an active contributor to GoCD until last fall.
I haven't used GitLab CI so won't talk about that :) Also, I haven't used any of these tools in the past year.
I think TeamCity is a good CI tool. It integrates very well with the IDE if you want to debug failures, and the test reports are brilliant. But I don't think they are that advanced in the CD space, and in my opinion you need both. If you are interested only in CI, though, you might want to give it a look. However, you will miss out on some of the good features of GoCD I've mentioned below.
Jenkins has a huge community, but Jenkins has its own disadvantages. Many times one plugin doesn't work because of compatibility issues with another plugin, for instance.
GoCD has fan-in/fan-out support, which avoids many unnecessary builds, saving a lot of build time and resources. The value stream map is intuitive and helps to get a better picture of the build stages from a developer's, QA's or even Delivery Manager's point of view. The pipeline modelling in GoCD is also very good. If you read Jez Humble and David Farley's book on Continuous Delivery, you will see the power behind such a build design.
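To illustrate what fan-in buys you, here is a conceptual sketch in Python (not GoCD configuration; the pipeline names and results are invented): a downstream pipeline only triggers once every upstream pipeline is green on the same source revision, instead of rebuilding once per upstream result.

    # Latest result reported by each (hypothetical) upstream pipeline.
    upstream_results = {
        "build-app":     {"revision": "abc123", "passed": True},
        "integration":   {"revision": "abc123", "passed": True},
        "security-scan": {"revision": "abc122", "passed": True},  # still on an older revision
    }

    def fan_in_ready(results) -> bool:
        """Trigger downstream only if every upstream passed on one common revision."""
        revisions = {r["revision"] for r in results.values()}
        return all(r["passed"] for r in results.values()) and len(revisions) == 1

    if fan_in_ready(upstream_results):
        print("trigger the deploy pipeline")
    else:
        print("wait: upstream pipelines are not yet green on the same revision")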
Now, to your second question:
Right now I'm leaning towards GoCD because of the fan-in/fan-out support and
the visualisation of the continuous delivery pipeline. Does anybody have
experience with the support for issues with this tool?
Good to hear that :P I love GoCD. The support is good. If you choose to go the Open Source way, the mailing list is pretty active. You can expect a reply from the GoCD team within a day or two. Of course, your questions have to be genuine and specific. Looking through the forums before posting a question helps :)
You can also choose to buy support for GoCD from ThoughtWorks. They used to offer multiple support tiers; I'm not sure of the current support model. You might face issues only when your DB grows very large (~5-7 GB), at which point you might want to go for the proprietary Postgres DB support from ThoughtWorks. I've seen very few users of GoCD with that DB size.
I have a lot of experience with TeamCity and some with GoCD. If you are interested in fan-in/fan-out, it's also possible to do the same in TeamCity -- it's called Build Chains.
Also, there is a good post about this topic on the official blog.
If I could choose, I would prefer TeamCity. It's a more mature and more feature-rich product, suitable for use in a corporate environment.
Taking an in-depth look at CI, a question came up: is an agile development process a prerequisite to be able to work with Continuous Integration?
Would it be possible to implement a CI process in a traditional, team-based development process?
My gut feeling says that agility is more or less a prerequisite, but "gut feeling" is not an argument when talking to management... :-)
And is there any documentation out there about this? Everything I found takes it for granted that you already work agile.
I would argue that continuous integration is good practice in almost all development teams, whether you are following an agile process or not (along with source control and free coffee). I've used it in agile teams, traditional teams and when I am coding alone - it has always added value.
For any development process, CI gives you:
Immediate feedback on any build errors (e.g. when a developer has forgotten to add or check in a file)
Immediate feedback on unit test failures (if you have written unit tests, which again are a good idea whether you are following an agile process or not)
Your QA team having up-to-date binaries to test with
Automating the build process (which greatly reduces the chances of error when you release your software)
Have a look at Jenkins - it's free and pretty easy to set up.
CI is not really tied to an agile or non-agile methodology (although some methodologies state that they require it, while others just indirectly imply it or don't mention it at all).
CI is only a tool (think of it like a keyboard) which helps you, during development, to eliminate some bugs as soon as possible.
Actually, the only thing you need to do CI is to configure your version control system to trigger a build tool (for example via a post-commit hook, as sketched below), and to ask all developers to commit/fetch code as soon as they are reasonably sure it will compile. That is enough to start continuous integration; then, of course, you can add unit tests etc.
So, the answer: agile is not a requirement, and you can implement CI in any process, without adopting XP, Scrum or whatever methodology.
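Here is the kind of post-commit hook mentioned above, as a rough Python sketch (saved as .git/hooks/post-commit and made executable). The build server URL and its trigger endpoint are hypothetical, and real CI servers usually provide their own hooks or polling.

    #!/usr/bin/env python3
    # Rough sketch of a git post-commit hook that asks a build server to build
    # the commit that was just made. The server URL/endpoint are placeholders.
    import json
    import subprocess
    import urllib.request

    BUILD_SERVER = "http://ci.example.local:8080/trigger"   # placeholder endpoint

    def current_commit() -> str:
        return subprocess.run(
            ["git", "rev-parse", "HEAD"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()

    def main():
        payload = json.dumps({"commit": current_commit()}).encode("utf-8")
        req = urllib.request.Request(
            BUILD_SERVER, data=payload,
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(req, timeout=5)
            print("build server notified")
        except OSError as exc:
            # Don't block the developer's workflow if the CI server is unreachable.
            print(f"warning: could not reach build server: {exc}")

    if __name__ == "__main__":
        main()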
We know this is good to have, but I find myself justifying it to my employer. Please pitch in on why a development team needs a build server.
There are multiple reasons to use build servers. In no particular order and off the top of my head:
You simplify the developers' workflow and reduce the chance of mistakes. Your build server can take care of multiple steps such as checking out the latest code, making sure the required software is installed, etc. There's no chance of a developer having some stray DLLs on their machine that can cause the build to pass or fail seemingly at random.
Your build server can replicate your target environment (operating system, etc.) and there's less of a chance of something working on developers' desktops and breaking in production.
While it's a good practice for developers to test everything they check in, sometimes they just don't. Then it's good to have the build server there to catch test errors and let the team know the product is broken.
Centralized builds provide easy access to code metrics -- which tests passed, which failed, how often, how well is your code covered by your tests, etc. Having a solid understanding of the quality state of the codebase reduces maintenance and testing costs by providing timely feedback that allows errors to be fixed quickly and easily.
Product deployment is simplified -- the developer or QA doesn't have to remember multiple manual steps. It can be easily automated.
The link between developers and QA is simplified. QA personnel can go to a known location to grab the latest, properly versioned builds.
It's easy to set up builds for release branches, providing an extra safety net for products in their release stage, when making code changes must be done with extra care.
To avoid the "but it works on my box" issue.
Have a consistent, known environment where the software is built to avoid dependencies on local dev boxes.
You can use a virtual server to avoid (much) extra cost if you need to.
Immediate knowledge of which unit tests are currently passing and which are not; furthermore, you'll also know if a once-passing unit test starts to fail.
This should sum up why it is critical to have a build server:
http://www.codinghorror.com/blog/2006/10/the-build-server-your-projects-heart-monitor.html
It's a continuous quality test dashboard; it shows you statistics about how the quality of your software is doing, and it shows them to you now. (JUnit, Cobertura)
It makes sure developers aren't hamstrung by other developers breaking the build, and encourages developers to write better code. (FindBugs, PMD)
It saves you time and money throughout the year by getting better code from developers the first time - less money on testing and retesting - and by getting more code from the same developers, because they're less likely to trip each other up.
Two main reasons that non-technical people can relate to:
It improves the productivity of the dev team because problems are identified earlier.
It makes the state of the project very obvious. I've shown my management the build status dashboard and now they look at it all the time.
One more thing. Something like Hudson is very simple to set up - you might want to simply run it somewhere in a corner for a while and then show it later.
This is my principal argument:
all official releases must be built in a controlled environment. No exceptions.
simply because you never know how the developers create their personal releases.
You also don't need to talk about a build server as in "a blade that costs an arm and a leg". The first build server I set up was a desktop machine that sat unplugged in a corner. It served us very well for more than 3 years.
Once you have your build machine, you can start adding some features (Hudson is great) and implement everything that the other posters mentioned.
Once your build machine becomes indispensable to your organization (and everyone sees its benefits), you will be able to ask for a shiny new blade if you wish :-)
The simplest thing you can do to convince your employer to have a build server is to tell them that they will be able to release faster and with better quality.
Faster releases come from the immediate feedback about the quality of the build. If someone breaks the build, he or she can fix it immediately, thus avoiding a delay in the build and release schedule. Without a build server the team will have to spend time trying to find out what happened, when it happened, and how to fix it.
Better quality is achieved by the build server running bug detection tools automatically every time someone checks changes into the version control system. You don't mention what the main development language in your organization is, but such tools, from advanced commercial ones to simple free ones, exist for practically all languages. Lint, FxCop, FindBugs and PMD come to mind.
You may also check this presentation on benefits of continuous integration for a more extensive discussion.
We have 3 branches {Dev, Test, Release} and will have continuous integration set up for each branch. We want to be able to assign build qualities to each branch, e.g. Dev - Ready for test...
Does anyone have experience with this who can offer advice or a best-practice approach?
We are using TFS 2008 and we are aware that it has Build Qualities built in. What we are looking for is when to apply a quality and what kind of qualities people use.
Thanks :)
Your goal here is to get the highest quality possible in each branch, balanced against the burden of verifying that level of quality.
Allowing quality to drop in any branch is always harmful. Don't think you can let the Dev branch go to hell and then fix it up before merging. It doesn't work well, for two reasons:
Recovering is harder than you expect. When a branch is badly broken, you don't know how broken it really is, because each issue hides others. It's also hard to make progress on any one issue, because you'll run into other problems along the way.
Letting quality drop saves you nothing. People sometimes say "quality, cost, schedule - pick any 2" or something like that. The false assumption here is that you "save" by allowing quality to slip. The problem is that as soon as quality drops, so does "velocity" - the speed at which you get work done. The good news is that keeping quality high doesn't really cost extra, and everyone enjoys working with a high-quality code base.
The compromise you have to make is on how much time you spend verifying quality.
If you do Test Driven Development well, you will end up with a comprehensive set of extremely fast, reliable unit tests. Because of those qualities, you can reasonably require developers to run them before checking in, and run them regularly in each branch, and require that they pass 100% at all times. You can also keep refactoring as you go, which lets you keep velocity high over the life of the project.
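As one concrete way to "run them before checking in", the fast suite can be wired into a local pre-commit gate. This is only a sketch and assumes the tests run with pytest (adapt the command to your tooling), e.g. saved as .git/hooks/pre-commit:

    #!/usr/bin/env python3
    # Local pre-check-in gate: run the fast unit test suite and refuse the
    # commit if anything fails. Assumes a pytest suite; adapt as needed.
    import subprocess
    import sys

    result = subprocess.run([sys.executable, "-m", "pytest", "-q", "--maxfail=1"])
    if result.returncode != 0:
        print("Unit tests failed - fix them before checking in.")
        sys.exit(1)   # a non-zero exit aborts the commit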
Similarly, if you write automated integration / customer tests well, so they run quickly and reliably, then you can require that they be run often, as well, and always pass.
On the other hand, if your automated tests are flaky, if they run slowly, or if you regularly operate with "known failures", then you will have to back off on how often people must run them, and you'll spend a lot of time working through these issues. It sucks. Don't go there.
Worst case, most of your tests are not automated. You can't run them often, because people are really slow at these things. Your non-release branch quality will suffer, as will the merging speed and development velocity.
Assessing the quality of a build in a deterministic and reproducible way is definitely challenging. I suggest the following:
If you are set up to do automated regression testing then all those tests should pass.
Developers should integration test each of their changes using an official Dev build newly installed on an official and clean test rig and give their personal stamp of approval.
When these two items are satisfied for a particular Dev build you can be reasonably certain that promoting this build to Test will not be wasting the time of your QA team.
I know that using continuous integration improves the quality of my code base, and speeds up releases, but what is the best way to convince clients that they want it on their next project?
Say exactly what you've said in the question:
Speeding up releases = earlier market penetration = more money
Improving code quality = less time fixing bugs = less cost
So long as you can help them set it up reasonably quickly and cheaply, I can't see why it would be a problem.
In addition to making the standard arguments I quote the data from this paper:
Alan MacCormack, Chris Kemerer, Michael Cusumano, and Bill Crandall, “Trade-offs between Productivity and Quality in Selecting Software Development Practices”, IEEE Software, September-October 2003
Namely:
Integration/regression testing at each code check-in = 36% reduction in defect rate
Daily builds = 93% rise in LOC output/programmer
So CI gives you higher productivity and better quality. Who doesn't want that?
You have made some assertions. If you want to sell the idea to your clients, you are going to have to answer these questions:
How does it improve your code quality?
Compilation and build issues are identified on a regular basis. And if used in conjunction with automated integration and unit tests, you will be able to identify bugs just as regularly.
How does it speed up your releases?
If you automate the build and deployment process you eliminate the downtime required from the development team to ship a new build for testing.
You have a history of successful builds to fall back on if you run out of time and are prepared to ship with incomplete features.
I am not sure how interested clients are in continuous integration. I think selling the idea to the development team is a more worthwhile exercise in many cases.
That said, clients will always like to hear:
Your project will always be in a working state.
All code is tested as we write it.