Building multiple "things" on wercker

We are trying to migrate a project that is currently built on Jenkins to Wercker.
We have a Play Framework (2.4.x) application that makes heavy use of a multi-module architecture. To be more specific, it's quite close to that of the Guardian.
Within one Git repository, we have the following structure:
What we did earlier was to detect where changes occurred since the last build. The result of that query might look like this:
/app1/
/app3/
/app5/
We then build only those projects, usually by invoking parallel, parameterized builds for each of those modules.
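A minimal sketch of that detection step, assuming a small Kotlin script (the script name is made up, each module lives in a top-level directory, and the last built commit is passed in by the CI environment):

```kotlin
// detect-changes.main.kts -- hypothetical helper, not part of wercker itself.
// Prints the top-level module directories touched since the last built commit.
import java.io.File

// The base commit would normally come from the CI environment
// (e.g. the commit of the last successful build); the default is just for local runs.
val lastBuilt = args.getOrElse(0) { "HEAD~1" }

val git = ProcessBuilder("git", "diff", "--name-only", "$lastBuilt..HEAD")
    .redirectErrorStream(true)
    .start()

val changedModules = git.inputStream.bufferedReader().readLines()
    .mapNotNull { path -> path.substringBefore('/', "").ifEmpty { null } }
    .filter { File(it).isDirectory }   // keep only top-level module directories
    .toSortedSet()

git.waitFor()

// Each printed module would then get its own (parallel, parameterized) build.
changedModules.forEach(::println)
```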
Is there a pattern that would allow us to keep this mechanism running on wercker?

Related

Teamcity build chain templating

We have a set of different web application projects which are delivered by TeamCity to different environments. At the moment we are doing all-in-one builds: compile, package and deploy at once, all based on templates.
Now I am investigating a way to separate concerns: one build tests and produces the package, another delivers it. Naturally, both builds have their own templates. Is there a way to template this build chain, so that when I choose some meta-template, both builds are created with the artifact dependencies already in place?
Sadly not. What we've had to do is clone the build chains. We've put in place a mechanism which makes sure the set-ups of the different chains do not diverge.
Another option, admittedly ugly, is to set up a single build chain and let each of your projects pretend that it's a separate VCS branch. In this case there's a single set-up (for the single build chain) and to view the history of a given project you filter by that project's "branch". Needless to say, that's not how branches should be used and you may run into some issues down the line.
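One way to implement such a "keep the set-ups from diverging" mechanism is TeamCity's versioned-settings Kotlin DSL, where an ordinary Kotlin function plays the role of the meta-template and stamps out both configurations with the dependencies wired up. A rough sketch; the "webApp" name, the shell scripts and the artifact paths are placeholders:

```kotlin
// .teamcity/settings.kts -- a sketch in the TeamCity Kotlin DSL (2019.2 format);
// "webApp", the shell scripts and the artifact paths below are placeholders.
import jetbrains.buildServer.configs.kotlin.v2019_2.*
import jetbrains.buildServer.configs.kotlin.v2019_2.buildSteps.script

version = "2019.2"

// The "meta-template": emits the package build and the delivery build,
// already linked by snapshot and artifact dependencies.
fun buildAndDeliverChain(appName: String): List<BuildType> {
    val pack = BuildType {
        id("${appName}_Package")
        name = "$appName: test + package"
        artifactRules = "dist/$appName.zip"
        steps { script { scriptContent = "./build.sh $appName" } }
    }
    val deliver = BuildType {
        id("${appName}_Deliver")
        name = "$appName: deliver"
        steps { script { scriptContent = "./deploy.sh $appName" } }
        dependencies {
            snapshot(pack) {}
            artifacts(pack) { artifactRules = "$appName.zip" }
        }
    }
    return listOf(pack, deliver)
}

project {
    buildAndDeliverChain("webApp").forEach { buildType(it) }
}
```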

Do composite builds make multi-module builds obsolete?

I have a hard time understanding when to use composite builds vs. multi-module builds. It seems both can be used to achieve similar things.
Are there still valid use cases for multi-module builds?
In my opinion, a multi-module build is a single system which is built and released together. Every module in the build should have the same version and is likely developed by the same team and committed to a single repository (git/svn etc).
I think that a composite build is for development only and for use in times when a developer is working on two or more systems (likely in different repositories with different release cycles/versions). eg:
Developing a patch for an open source library whilst validating the changes in another system
Tweaking a utility library in a separate in-house repository (perhaps shared by multiple teams) whilst validating the changes in another system
A fix/improvement that spans two or more systems (likely in separate repos)
I don't think that a composite build should be committed to source control or built by continuous integration. I think CI should use jars from a repo (e.g. Nexus). Basically, I think composite builds serve the same purpose as the "resolve workspace artifacts" checkbox in m2e.
Please note that one of the restrictions on a composite build is that it can not include another composite build. So I think it's safer to commit multi-module builds to source control and use composite builds to join them together locally for development only.
These are my opinions on how the two features should be used; I'm sure there are valid exceptions to the above.
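To make that "development only" usage concrete, here is a minimal sketch in the Gradle Kotlin DSL; the module names, the com.example:shared-utils coordinates, the ../shared-utils path and the CI environment variable are all assumptions:

```kotlin
// settings.gradle.kts of the multi-module system (versioned and released together).
rootProject.name = "consumer-app"

// Modules that share one version and live in this repository:
include(":core", ":web")

// Developer-local composite: pull the separately released library in as source.
// Gradle substitutes the binary dependency on com.example:shared-utils with the
// included build, provided the coordinates match. The guard keeps CI using the
// published jar from the artifact repository (e.g. Nexus).
if (System.getenv("CI") == null && rootDir.resolve("../shared-utils").exists()) {
    includeBuild("../shared-utils")
}
```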
We use our own monorepo with monobuild-type detection and use composite builds for CI and CD to staging (any microservices that end up building from your changes auto-deploy to staging). I disagree that composite builds are just for development, as we use them to get to production in a monorepo/monobuild setup.
A full multi-project build at Orderly Health would take an estimated 15-20 minutes, based on webpieces taking 5 minutes and on the fact that modifying a library in OrderlyHealth that affects EVERY project takes about 15 minutes.
Instead, we detect which projects changed and which leaf nodes depend on them; all leaf nodes are composite projects pulling in libraries that pull in libraries, and the general average build time is 3 minutes. (That is a 5x boost on build time right there.)
later
Dean

Build dependencies and local builds with continuous integration

Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control and we're trying to understand what would be the best choice for a build server. We have started looking into TeamCity, but there are some issues we're having trouble with which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
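For illustration, here is a rough sketch of the Gradle direction mentioned above; it assumes the infra projects publish branch-named snapshot artifacts, and every coordinate, property name and URL below is a placeholder:

```kotlin
// build.gradle.kts of MyProj -- a sketch, not a drop-in setup.
// Assumption: InfraProj1/InfraProj2 publish artifacts under branch-named snapshot
// versions (e.g. "feature_branch-SNAPSHOT") to a shared repository.

val infraProj1Branch = (findProperty("infraProj1Branch") ?: "feature_branch") as String
val infraProj2Branch = (findProperty("infraProj2Branch") ?: "master") as String

val infra by configurations.creating          // holds the binary infra dependencies

repositories {
    maven(url = "https://artifacts.example.com/snapshots")   // placeholder URL
}

dependencies {
    infra("com.example:infra-proj1:$infraProj1Branch-SNAPSHOT")
    infra("com.example:infra-proj2:$infraProj2Branch-SNAPSHOT")
}

// Copy the resolved artifacts into the build tree so a local MSBuild/compiler
// invocation can reference them the same way the CI build does.
tasks.register<Copy>("fetchInfra") {
    from(infra)
    into(layout.buildDirectory.dir("infra"))
}
```

A developer (or the build server) would then run something like gradlew fetchInfra -PinfraProj1Branch=feature_branch before the actual compile step.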
IMHO both issues you mention really fall into the config management category and are thus, as you say, unrelated to the build server choice.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file containing all your project components and their dependencies (each with its own git/other repo) and their exact versions - effectively tying them together coherently (you may find it useful to store other metadata in this component down the road as well, like component specific SCM info if using a mix of SCMs across the workspace).
A workspace pull wrapper script would first pull this metadata git repo, parse the content file and then pull all the other project components and their dependencies according to the content file info. Any build in such a workspace would have all the parts it needs.
When time comes to modify either the code in a project component or the version of one of the dependencies you'll need to also update this content file in the metadata git repo to reflect the update and commit it - this is how your project makes progress coherently, as a whole.
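A minimal sketch of such a pull wrapper, assuming a Kotlin script and a simple "name repoUrl ref" content file; the file name, the format and all URLs are made up:

```kotlin
// pull-workspace.main.kts -- hypothetical wrapper script.
// Assumed content file format (one component per line):
//   infra-proj1  git@example.com:infra-proj1.git  v1.4.2
//   my-proj      git@example.com:my-proj.git      feature_branch
import java.io.File

fun run(vararg cmd: String, dir: File = File(".")) {
    val exit = ProcessBuilder(*cmd).directory(dir).inheritIO().start().waitFor()
    check(exit == 0) { "command failed: ${cmd.joinToString(" ")}" }
}

// 1. Pull the metadata repo first.
val metadataRepo = "git@example.com:project-metadata.git"   // placeholder URL
if (!File("metadata").exists()) run("git", "clone", metadataRepo, "metadata")
run("git", "pull", dir = File("metadata"))

// 2. Pull every component and dependency at the exact version pinned in the content file.
File("metadata/content.txt").readLines()
    .map { it.trim() }
    .filter { it.isNotEmpty() && !it.startsWith("#") }
    .forEach { line ->
        val (name, url, ref) = line.split(Regex("\\s+"))
        if (!File(name).exists()) run("git", "clone", url, name)
        run("git", "fetch", "--all", dir = File(name))
        run("git", "checkout", ref, dir = File(name))
    }
```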
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.

How to split a big Jenkins job/project into smaller jobs without compromising functionality?

We're trying to improve our Jenkins setup. So far we have two directories: /plugins and /tests.
Our project is a multi-module project of Eclipse Plugins. The test plugins in the /tests folder are fragment projects dependent on their corresponding productive code plugins in /plugins.
Until now, we had just one Jenkins job which checked out both /plugins and /tests, built all of them and produced the Surefire results etc.
We're now thinking about splitting the project into smaller jobs corresponding to features we provide. It seems that the way we tried to do it is suboptimal.
We tried the following:
We created a job for the core feature. This job checks out the whole /plugins and /tests directories and only builds the plugins the feature is comprised of. This job has a separate pom.xml which defines the core artifact and tells about the modules contained in the feature.
We created a separate job for the tests that should be run on the feature plugins. This job uses the cloned workspace from the core job. This job is to be run after the core feature is built.
I somehow think this is less than optimal.
For instance, only the core job can update the checked out files. If only the tests are updated, the core feature does not need to be built again, but it will be.
As soon as I have a feature which is dependent on the core feature, this feature would either need to use a clone of the core feature workspace or check out its own copy of /plugins and /tests, which would lead to bloat.
Using a cloned workspace, I can't update my sources. So when I have a feature depending on another feature, I can only do the job if the core feature is updated and built.
I think I'm missing some basic stuff here. Can someone help? There definitely is an easier way for this.
EDIT: I'll try to formulate what I think would ideally happen if everything works:
check if the feature components have changed (i.e. an update on them is possible)
if changed, build the feature
Build the dependent features, if necessary (i.e. check the corresponding job)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
Finally, the project job should
do a nightly build
check out all sources from /plugins and /tests
build all, test all, send results to Sonar
Additionally, it would be neat if the nightly build was unnecessary because the builds and test results of the projects' features would be combined in the project job results.
Is something like this possible?
Starting from the end of the question. I would keep a separate nightly job that does a clean check-out (gets rid of any generated stuff before check-out), builds everything from scratch, and runs all tests. If you aren't doing a clean build, you can't guarantee that what is checked into your repository really builds.
check if the feature components have changed (i.e. an update on them is possible)
if changed, build the feature
Build the dependent features, if necessary (i.e. check the corresponding job)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
[I am assuming that by "dependent features" in 1 you mean the things needed by the "feature" in 2.]
To do this, I would say that you have multiple jobs.
a job for every individual feature and every dependent feature that simply builds that feature. The jobs should be started by SCM changes for the (dependent) feature.
I wouldn't keep test jobs separate from compile jobs; that allows the possibility that successfully compiled code is never tested. Instead, I would rely on the fact that when a build step fails in Jenkins, it normally aborts further build steps.
The trick is going to be in how you thread all of these together.
Let's say we have a feature and its build job, called F1, that depends on two features, DF1.1 and DF1.2, each with their own build jobs.
Both DF1.1 and DF1.2 should be configured to trigger the build of F1.
F1 should be configured to get the artifacts it needs from the latest successful DF1.1 and DF1.2 builds. Unfortunately, the very nice "Clone SCM" plugin is not going to be of much help here as it only pulls from one previous job. Perhaps one of the artifact publisher plugins might be useful, or you may need to add some custom build steps to put/get artifacts.

Continuum finding dependencies and building on chain-dependent projects

I am the Configuration manager for an IT firm. Currently we are using anthill build management server for all our build related purposes. We are looking to implement Continuous Integration in our development life cycle.
Currently the building process is done manually. Suppose there are 5 projects A, B, C, D, E, where E is the parent project, and the dependency chain goes like this:
A->B->C->D->E
What we do is build A first, update project.xml of B to the latest version of A, build B, and so on and so forth until all dependent projects are built and finally the parent project is built.
What I am thinking of is automating the entire process, i.e. automatically finding out the dependencies, building them first, then updating the version in the parent projects and building them again to a newer version.
Would Continuum do this for me? If not, is there any other CI tool that does this?
Hudson does this really well. If you're using Maven, it'll even figure out the build dependencies for you automatically after the first build; otherwise you can define the build dependencies manually. That is, it lets you configure the system to build project B after a successful project A build.
I'm not sure if it matters to you, but Hudson is also open source.
If not, is there any other CI tool that does this?
I like TeamCity, which does pretty much everything you'll need. With the latest version (and a plugin from JetBrains), there's even Git support.
On the other hand, any continuous integration system should handle dependencies easily.
We use Zed Builds and Bugs for a setup similar to this. We have a master project that has sub-project dependencies and the build system handles everything in the proper order.
We also have very small, tight builds for the sub-projects so that each of them can be built when the developers commit to source control. The Zed Server is capable of pulling the latest artifacts from these small builds and putting them together into larger builds, but we haven't yet used that feature.
Our check-ins trigger the small CI builds, and then twice per day the entire application is re-built from scratch, following the dependency chain.
I'd agree with OregonGhost, though: any CI system should be able to set up this type of chain.
I don't think you need a CI tool for this. Try to automate it using a build script, and use Continuum (or any other CI tool) to trigger your preferred build tool.
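For illustration, a rough sketch of such a driver script, here in Kotlin; the A->E chain is the one from the question, and the mvn invocation is just a stand-in for whatever build tool is actually in use:

```kotlin
// build-chain.main.kts -- a hypothetical driver script, not a Continuum feature.
// Declares who depends on whom, then builds in dependency order (A before B, ... before E).

val dependsOn = mapOf(                 // project -> its direct dependencies
    "A" to emptyList(),
    "B" to listOf("A"),
    "C" to listOf("B"),
    "D" to listOf("C"),
    "E" to listOf("D")
)

// Simple topological ordering via depth-first search.
val ordered = mutableListOf<String>()
val visited = mutableSetOf<String>()
fun visit(project: String) {
    if (!visited.add(project)) return
    dependsOn.getValue(project).forEach(::visit)
    ordered += project
}
dependsOn.keys.forEach(::visit)

// Build each project before anything that depends on it.
ordered.forEach { project ->
    println("Building $project ...")
    val exit = ProcessBuilder("mvn", "-f", "$project/pom.xml", "clean", "install")
        .inheritIO().start().waitFor()
    check(exit == 0) { "build of $project failed" }
}
```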
