We have several projects under way, and there are dependency relationships among them. Together, all of the projects make up the final software product.
We have set up a DEV build environment that does snapshot builds against the LATEST dependencies. Any change triggers a snapshot build (a Jenkins job), which in turn triggers the snapshot builds of all dependent projects, so if a change breaks some project, that project's own build notifies its owner.
The question is about the release. The DEV build is continuous, and we want to release EVERY project from a particular timestamp at which the dev build was GREEN across all projects.
How can we set up such a release process?
thanks.
Jenkins provides some post-build actions. You can use them to publish/archive every successfully built artifact wherever you want.
Your Release-Job can then take all those artifacts and deploy them. That way you can be sure all artifacts come from GREEN builds, and the Release-Job is also independent of all the continuous jobs.
If you want to be really cool, run some smoke tests (e.g. is the database connection working, are external APIs working, etc.) in the Release-Job as well.
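If the Release-Job happens to invoke Ant, a couple of cheap smoke tests could look roughly like this (just a sketch; the host name, port and URL are placeholders for your own environment):

<target name="smoke-test">
    <!-- is something listening on the database port? (db.example.com is a placeholder) -->
    <condition property="db.reachable">
        <socket server="db.example.com" port="5432"/>
    </condition>
    <fail unless="db.reachable" message="Database is not reachable"/>
    <!-- does the external API answer with a non-error HTTP response? (URL is a placeholder) -->
    <condition property="api.up">
        <http url="https://api.example.com/health"/>
    </condition>
    <fail unless="api.up" message="External API did not respond"/>
</target>

If any check fails, the Release-Job goes red and nothing gets deployed.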
best,
marco
I'm wondering how people manage their project artefacts through an environment lifecycle of, say, DEV - AQA - CQA - RELEASE, and whether there are best practices to follow.
I use a Jenkins build server to build my projects (code checkout then maven build). My artefacts all have version 1.0.0-SNAPSHOT and are published to a local .m2 repo on the build server. There are also Jenkins jobs that rebuild the DEV system (on the same server) using those artefacts. The project build is automated whenever someone checks in code. The DEV build is automated on a nightly basis.
At some point, my lead developer determines that our project is fit to go to AQA (the first level of testing environment on a different server).
For this I need to mark the artefacts as version 1.0.0-1 and publish to a remote AQA repository (it's actually a Nexus repo).
The Maven deploy plugin sounds like the right approach, but how do I change the version number to be effectively 1.0.0-$release (where $release is just an incrementing number starting from 1)? Would Maven/Nexus be able to manage the value of $release, or would I need a simple properties file in my project to store/update the last-used $release?
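To make the question concrete, what I have in mind is roughly the fragment below, where Jenkins would pass the number in via -Drelease.number=<build number> (just a sketch; the repository id and URL are placeholders, and I know older Maven versions warn about an expression inside <version>, so perhaps the versions-maven-plugin's versions:set goal is the cleaner way to rewrite the version before deploying):

<!-- pom.xml fragment (sketch) -->
<version>1.0.0-${release.number}</version>

<properties>
    <!-- default for local builds; the CI job overrides it, e.g. -Drelease.number=1 -->
    <release.number>SNAPSHOT</release.number>
</properties>

<distributionManagement>
    <repository>
        <id>aqa-releases</id>                                          <!-- placeholder -->
        <url>https://nexus.example.com/content/repositories/aqa</url>  <!-- placeholder -->
    </repository>
</distributionManagement>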
Furthermore, someone tests AQA and determines it's fit to move on to CQA (the second testing environment). This is 'promote to CQA'. So my requirement is to copy the artefact from the AQA Nexus repo and publish it to the CQA Nexus repo.
Likewise, after CQA, there'd be a 'promote to RELEASE' job too.
I think the version value remains unchanged during the 'promote' phases. I'd expect the AQA repo to see all versions 1-50, but CQA only 25 and 50, then RELEASE only 50, for example.
I can find loads of info about Maven plugins/goals/phases, but very little prescriptive guidance on how or where to use them outside of the immediate development environment.
Any suggestions gratefully received.
Staging/promoting is out of scope for Maven. Once an artifact is deployed/uploaded to a remote repository, that system is responsible for the rest of the release cycle. If you use Nexus, read this chapter about staging: http://books.sonatype.com/nexus-book/reference/staging.html
Build numbers are just that: build numbers. They are not promotion/staging numbers.
You should come up with another means of tracking your promotions, because otherwise people have to "just know" that build 10.1.4-2 is the same as 10.1.4-6. Certainly, all of the Maven-related tooling will treat those two builds as different builds.
In addition, if a person grabs the wrong copy of the build, managing staging within your build number will only increase the confusion: if you don't remove all of the 10.1.4-2 builds, someone might pick up a copy of one without realizing that the build has since been promoted to 10.1.4-6. For the "latest" staging number to be the one most likely to be grabbed, you would have to do two things (which are impossible in combination):
Remove all the old staging numbers, updating them to the new ones.
Ensure that no copy of an old staging number escaped the update.
Since people can generally copy files without being tracked, since some files might not be reachable at the time of the "update", and since you cannot reach all of the files simultaneously, such a system is doomed to fail.
Instead, I recommend (if you must track by file) placing the same file in different "staging directories". This defines the release gateways by whether the file exists in a certain directory, and makes it clear that it is the same file going through the entire process. In addition, it becomes easy to have the various stages of verification poll their respective directories (and you can write Jenkins tasks to promote from one directory to another, if you really wish).
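As a trivial sketch of such a promotion step (the directory layout and file name below are placeholders), an Ant-driven Jenkins job would do little more than copy the unmodified artifact into the next gateway directory:

<target name="promote-to-cqa">
    <!-- "promotion" = the same, byte-identical file becoming visible in the next gateway directory -->
    <copy file="${staging.root}/aqa/myapp-1.0.0-50.jar"
          todir="${staging.root}/cqa"
          preservelastmodified="true"/>
</target>

The CQA verification job then simply polls ${staging.root}/cqa for new files.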
We are using Bamboo as our integration server. Each build produces binary packages of our products. Some of the built artifacts then go to QA.
Is there a way to retain the build artifacts of a certain build number, irrespective of the global build expiry configuration? For example, at some point we identify one of the built artifacts as a release candidate.
QA should be able to download that specific package even after a week. Right now we are copying the build artifacts from the CI server to another machine. It is scripted, but it is still a manual process.
In Hudson there is an option called 'keep this build forever'.
Depending on your Bamboo version, this is possible by applying a label to the build. Under a plan's configuration, on the "Miscellaneous" tab, you can set a label which can then be used to prevent a build from expiring.
For example, our system has builds that can get labelled "SaveBuild" which then prevents their expiry.
I understand that during development build artifacts are placed in the snapshot repository.
When a product needs to go to QA for testing, do teams pull from the snapshot repository? Or do they do a full build, deploy to the release repository, and then give it to QA from there?
Also, if my snapshot repository holds all the build artifacts from each build, how is this commonly cleaned up? I could see keeping the last 5 builds from the build server, but not every one. I'm using Artifactory, if that helps.
Opinions differ; here's my approach:
Snapshots are for Dev
Typically used for "throwaway" builds. I publish them from my CI server, triggered by changes committed to the source code. The purpose of the snapshot build is to share the latest tested artifact from a particular team. This is important as teams might be sharing jars between each other.
This approach is much more stable than our previous approach of sharing the source code (constant fire-fighting when another team committed something that broke their build... and, by extension, everyone else's).
Cleaning up snapshots
I use Nexus to manage my repository; it has a scheduled task that can be configured to periodically purge the snapshot repository. I'd imagine Artifactory has similar functionality.
Release candidates are for QA
I treat QA like a full-blown release to the customer. That's why I prefer the term "Release Candidate".
The key difference between a release candidate build and a snapshot is that the source code is "tagged" or "labelled" (depending on your SCM system's terminology). What you're doing is drawing a line in the sand from which you can conveniently re-create the binary on demand.
Nexus Professional has a staging suite, which enables development to cut a new release and hold it in a temporary "staging repository". Obviously some releases will fail testing, in which case they're dropped; others are either promoted to the next group in the chain or released to the public area.
There are several methods of implementing this "promotional model" of release management.
How release revisions are managed
I use the following numbering convention for my releases.
<major number>.<minor number>.<patch number>.<build number>
Example:
1.0.0.24
(For smaller/simpler projects I might alter the convention and drop the patch number).
Ivy has a wonderfully useful buildnumber task to manage the incrementing build number, based on what has already been published to your repository.
<property name="release.candidate" value="1.0.0"/>
..
<ivy:buildnumber organisation="${ivy.organisation}" module="${ivy.module}" revision="${release.candidate}"/>
..
<echo message="Release revision: ${ivy.new.revision}"/>
The release.candidate property is typically stored in a properties file, under version control. What I really like about this solution is that it enables parallel branch management (See answer to this question).
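Once the new revision has been resolved, publishing the release candidate is straightforward (a sketch; the resolver name and artifact pattern depend on your ivysettings.xml):

<ivy:publish resolver="release-repo" pubrevision="${ivy.new.revision}" status="release" overwrite="false">
    <artifacts pattern="dist/[artifact]-[revision].[ext]"/>
</ivy:publish>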
I've started using TeamCity personal builds, via the new Git remote run feature in TeamCity 6.5. Doing a single build works fine; I have a project that compiles from source, and I gave it a Branch Remote Run trigger.
However, it looks like TeamCity only triggers the one project that has the Branch Remote Run trigger applied. I have several unit test projects, set up in a chain with Finish Build triggers, and none of these get run. Furthermore, if I try to start a custom build of one of these unit test projects, I can't use the artifacts from my personal build: I can only pick artifacts from one of the 'official' builds.
Can I get TeamCity personal builds to work with build chains?
With the setup that you have (snapshot dependencies and finish build triggers), you can achieve build chaining by submitting your personal changes to the builds you are looking to trigger. For example, if you have projects A and B where B depends on A, run the remote build against project B: A will be triggered first and B will be added to the queue. Both of these builds will have your personal changes.
If you are using the TeamCity Visual Studio plugin you can select which builds you want to send your changes to and you just need to tick the box for B instead of A.
The Finish Build trigger won't be fired, but the build chaining means that A must be built first.
More info - http://confluence.jetbrains.net/display/TCD7/Build+Chain
(You have tagged TeamCity 6.5, but 7 has now been released so I have included the documentation for the newer version)
I suggest you set up your chain not with a Finish Build trigger but with TeamCity's "Snapshot dependencies" feature, and set up the artifact dependencies based on those snapshots.
Please read about snapshot dependencies in TeamCity here.
I am the configuration manager for an IT firm. Currently we are using the Anthill build management server for all our build-related purposes. We are looking to implement continuous integration in our development life cycle.
Currently the build process is done manually. Suppose there are five projects A, B, C, D and E, where E is the parent project and the dependency chain looks like this:
A->B->C->D->E
What we do is build A first, update B's project.xml to the latest version of A, build B, and so on and so forth until all dependent projects have been built and finally the parent project is built.
What I am thinking of is automating the entire process, i.e. automatically finding the dependencies, building them first, then updating the versions in the parent projects and building them again to a newer version.
Would Continuum do this for me? If not, is there any other CI tool that does this?
Hudson does this really well. If you're using Maven, it will even figure out the build dependencies for you automatically after the first build; otherwise you can define the build dependencies manually. I.e., it lets you configure the system to build project B after a successful project A build.
I'm not sure if it matters to you, but Hudson is also open source.
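For example, once project B's pom.xml declares A as an ordinary dependency, both the Maven reactor and Hudson can work out the build order themselves (the coordinates below are placeholders):

<!-- in project B's pom.xml (sketch) -->
<dependency>
    <groupId>com.example</groupId>      <!-- placeholder -->
    <artifactId>project-a</artifactId>  <!-- placeholder -->
    <version>1.2.3-SNAPSHOT</version>
</dependency>

With a SNAPSHOT dependency like this, Hudson's Maven job type can also be told to rebuild B whenever a new A has been built.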
If not, is there any other CI tool that does this?
I like TeamCity, which does pretty much everything you'll need. With the latest version (and a plugin from JetBrains), there's even Git support.
On the other hand, any continuous integration system should handle dependencies easily.
We use Zed Builds and Bugs for a setup similar to this. We have a master project that has sub-project dependencies and the build system handles everything in the proper order.
We also have very small, tight builds for the sub-projects so that each of them can be built when the developers commit to source control. The Zed Server is capable of pulling the latest artifacts from these small builds and putting them together into larger builds, but we haven't yet used that feature.
Our check-ins trigger the small CI builds, and then twice per day the entire application is re-built from scratch, following the dependency chain.
I'd agree with OregonGhost, though, any CI system should be able to set up this type of chain.
I don't think you need a CI tool for this. Try to automate it using a build script, and use Continuum (or any other CI tool) to trigger your preferred build tool.
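As a minimal sketch of such a build script (assuming Ant, that each sub-project exposes a publish target, and placeholder directory names), the master script just walks the chain in order:

<target name="build-all">
    <!-- build in dependency order: A -> B -> C -> D -> E -->
    <ant dir="projectA" target="publish"/>
    <ant dir="projectB" target="publish"/>
    <ant dir="projectC" target="publish"/>
    <ant dir="projectD" target="publish"/>
    <ant dir="projectE" target="publish"/>
</target>

Continuum (or any other CI tool) then only needs to run build-all on every commit.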