Maven executing goals concurrently

How can I execute several Maven goals on the same project concurrently? My exact scenario: after installing several gigabytes of artifacts locally, I would start deployment to Nexus, but since that will take a while, I would also like to start executing additional goals with the exec plugin at the same time. The two tasks are not dependent on one another, but run one after the other they add up to a significant waste of time.
I was hoping Maven has some sort of idiom for this, but other solutions are welcome. I would need a way to fail the build if either task fails (bonus points for a solution that notifies the other goal so it can fail fast).
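For concreteness, here is a rough shell sketch of what I mean (not a Maven-native idiom; the profile name and exact goals are placeholders), where the overall run fails if either invocation fails:
#!/usr/bin/env bash
# Rough sketch: run the two independent Maven invocations as background jobs.
set -u
mvn deploy &                       # long-running upload to Nexus
deploy_pid=$!
mvn exec:exec -Pextra-tasks &      # placeholder for the additional exec-plugin goals
exec_pid=$!
status=0
wait "$deploy_pid" || status=1     # wait returns the exit code of the given job
wait "$exec_pid"   || status=1
exit "$status"                     # non-zero if either invocation failed
A true fail-fast would additionally have to kill the other job as soon as one of them fails.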

Related

Maven and Jenkinsfile - skipping previous phases

I'm exploring Jenkins staging functionality and I want to devise a fast and lean setup.
Basically, Jenkins promotes the use of stages to partition the build process and provide nice visual feedback about its progress.
So the Jenkinsfile goes something like
stage("Build")
bat("mvn compile")
stage("Test")
bat("mvn test")
stage("Deploy")
bat("mvn deploy")
This works well, but feels wrong, because the test and deploy runs both repeat activities from previous phases.
As a result, in this setup I am building three times (although compilation is skipped because nothing has changed) and testing twice (once in the test run and again in the deploy run).
When I google around I can find various switches and one of them works for skipping unit tests, but the compilation and dependency resolution steps happen regardless of what I do.
Do I need to choose between speed and stages in this case or can I have both?
I mean:
stage("Resolve dependencies, build, test and deploy")
bat("mvn deploy")
is by far the fastest approach, but it doesn't produce a nice progress table in Jenkins.
To bring Gradle-style incremental builds to Maven phases, you can use the takari-lifecycle Maven plugin.
Once the plugin is applied you get all of its benefits. In your example, the Test stage (which runs mvn test) will avoid compilation because the code was already compiled in the previous stage, and the Deploy stage will avoid compiling both the main and the test sources; the tests, however, will be executed again, so I suggest adding -DskipTests there.
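For illustration, the per-stage commands would then look roughly like this (the incremental behaviour comes from takari-lifecycle; the only flag added here is -DskipTests on deploy):
# Build stage
mvn compile
# Test stage: the compile step is skipped because nothing changed since the previous stage
mvn test
# Deploy stage: skip re-running the tests that already ran in the Test stage
mvn deploy -DskipTests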

Does maven support concurrent execution of different projects

I have a query regarding Maven.
Does Maven support concurrent execution of different projects which are not related?
To elaborate: I have 4 different projects and I want to run the "mvn site" command on all of them in 4 different terminals.
So the question is, does Maven support this?
Thanks
@Raj Jain,
By "different projects which are not related" I take you to mean each one has a different pom file sitting in a different directory. If so, then the answer is yes, but.
Yes, in theory, you can build all of them concurrently, for instance by opening 4 xterms, cd'ing into each project's directory, and running mvn clean install in each in rapid succession.
And yes, each of the builds will run in its own directory in a self-contained manner, creating a local subdir called target/ to store all the build artifacts.
But there is a slight risk of builds interfering with one another as they write to what's called the local repo. Especially if they depend on the same jars, they might write the same file to the same folder simultaneously, causing the build to get corrupted. This doesn't happen often, especially after the first time the build runs since the local repo is now fully populated.
However, if you want extra insurance against this kind of collision, have each build write to its own repository using mvn -Dmaven.repo.local=/tmp/repository1/
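For example, roughly (project directories and repository paths are just placeholders):
# one background build per project, each isolated in its own local repository
(cd project-a && mvn clean install -Dmaven.repo.local=/tmp/repository-a) &
(cd project-b && mvn clean install -Dmaven.repo.local=/tmp/repository-b) &
(cd project-c && mvn clean install -Dmaven.repo.local=/tmp/repository-c) &
(cd project-d && mvn clean install -Dmaven.repo.local=/tmp/repository-d) &
wait   # wait for all four builds; check each build's output for failures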
Hope that helps.
Add all the projects to a pom.xml as modules, so that when you run mvn site on that pom.xml, all of the modules will be built.

How to get Hudson to correctly build multiple modules changed by a single commit

Consider a Maven project with multiple interdependent modules: let's say, three jar modules A, B, and C, which are dependencies for a war module Z. I have a separate Hudson build for each of these modules, so that only modules that have changed are re-built.
My issue is that if I commit a changeset that changes both module A and module Z, Z may be built before A and fail; only when A completes does it trigger a rebuild of Z, which then passes. Allowing builds to regularly fail for reasons to do with build ordering rather than "real" failures desensitizes us to real failures; we end up ignoring builds that have legitimately broken because we are used to assuming they will eventually flip back.
I have been managing this through the use of quiet periods, blocking when upstream builds are running, etc. But in practice, my build has more modules than the example I've given, many of which take a while to build and test. I also have a small horde of diligent developers making frequent commits.
This means my jar modules are constantly building, only rarely leaving a gap for my war module(s) to build. So the war doesn't build very frequently, meaning it takes a long time to find out when we've broken it, and also takes longer to identify which change broke it.
Also, the constant running of builds means that if I commit a change that touches jars A and B, the war file Z may be built once for jar A (which builds quickly), and then again for jar B (which takes longer). This makes it hard to understand the results of a given commit.
I've considered using the join plugin, but this appears to require all of the modules to build every time. Since I actually have quite a few jar modules, I really don't want to have to build them all every time, I only want to build the ones that have changed for a given commit.
Are there any better ways to handle this?
Thanks
This is always a difficult problem (and I've re-written this answer more than once!)
In terms of a technical solution, you want something that will wait until several different jobs are no longer running before it starts to run. If that's difficult to quantify, it's going to be difficult to put in place. I'll be very interested to see what technical solutions are suggested in this thread.
I guess you have to look at why your jobs are being run, and how often. If there's any code that requires unit testing in your WAR, could you move it out into its own module? That way you can run only the integration tests, using the war, every hour/30 mins and not worry about where and when the individual modules are.
You may want to also look at what your modules contain. Do they ALL have to be modules? Can you perhaps reduce the fragmentation - it might help reduce the complexity of what you are attempting to schedule :)
I understand and applaud your efforts to get as much tested as soon as possible - but sometimes a smoke test is all you can do if there's a constant churn of code.
The approach we're now looking at is combining some Maven modules into single Hudson jobs, rather than having a one to one mapping of modules to jobs.
Specifically, if a war module's dependencies are fairly small and quick to build on their own, building them in the same job with the war ensures that all of the code from a single commit is built together, at least for that given war file.
This does result in duplication - we have multiple war files using the same jars, so the jars are essentially rebuilt for every war, rather than once only. But in practice, the jars are quick to build, and this makes the war files conceptually cleaner.
This would be less attractive if the jars took a while to build and test, since the combined jars + war job would then be quite long, giving us long feedback loops for problems within the jars. Getting the balance right is important.
So my takeaway: don't assume that one Hudson/Jenkins job per module is the best way to go, and don't be afraid to rebuild the same code in multiple jobs.

How to time (profile) Maven goals in a multi-module project

We have a huge project with many submodules. A full build takes currently over 30mins.
I wonder how this time is distributed over the different plugins/goals, e.g. tests and static analysis (findbugs, pmd, checkstyle, etc.).
Would it be possible to time the build to see where (in both dimensions: modules and goals) most time is spent?
The maven-buildtime-extension is a Maven plugin that can be used to see the time taken by each goal:
https://github.com/timgifford/maven-buildtime-extension
If you run the build in a CI server like TeamCity or Jenkins (formerly Hudson), it will give you timestamps for every step in the build process and you should be able to use these values to determine which goals/projects are taking the most time.
I don't think there is any way built in to maven to do this. In fact, in the related question artbristol posted, there is a link to a Maven feature request for this functionality. Unfortunately, this issue is unresolved and I don't know if it will ever be added.
The other potential solution is to write your own plugin which would provide this build metadata for you.
I don't think there is a way to determine the timing of particular goals. What you can do is run particular goals separately to see how long they take. So instead of doing an "mvn install", which runs all of your tests, checkstyle, etc., just do "mvn checkstyle:checkstyle" to see how long that takes for a particular module.
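For example (the module name is a placeholder; you could also just cd into the module instead of using -pl):
time mvn checkstyle:checkstyle -pl some-module   # time a single goal for a single module
time mvn test -pl some-module                    # compare against the test run for the same module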
Having everything done every time is nice when it's done by an automated server (Continuum/Jenkins/Hudson), but when you are building locally, sometimes it's better to be able to just compile. One thing you can do is have the static analysis goals run ONLY when you pass in a certain parameter or profile. Another option is to have them run only when maven.test.skip=false.
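As a rough sketch of that idea, many of the common plugins expose skip properties that can be set on the command line or bound to a profile (the exact property names depend on the plugin versions you use, so treat these as assumptions to verify):
# a quick local build that skips the tests and the static-analysis goals
mvn install -DskipTests -Dcheckstyle.skip=true -Dpmd.skip=true -Dfindbugs.skip=true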
If you are using a continuous build, try having the static analysis only done every 4 hours, or daily.

How to recombine builds in TeamCity?

We have a lot of tests. I can break these up so that they run on separate agents after an initial compile build happens, but is there a way I can recombine these results? Having 8 build configurations that all need to be green makes it hard to see whether you've got one ubergreen build.
Is there a way in TeamCity to recombine / join builds once we've split them out? TW-9990 might help - allowing ANDs in the dependencies.
We found the answer which certainly works from TeamCity 5:
One compile build,
N test-only builds that take compile.zip!** and copy it to where the compile output would normally be (via a template).
Consolidated finish:
Finish Build Trigger: Wait for a successful build in: ...
Snapshot Dependencies: Do not run new build if there is a suitable one
Only use successful builds from suitable ones
This all seems to work nicely and the whole shebang is easily copied for branches etc. I'm very happy; this has worked well for us for many months now.
No idea how to do that natively. Here are my first thoughts on how I would try to tackle such a thing though:
Saving test results to files
Publishing the test result files as build artifacts
Creating a 'Merge build'
Adding artifact dependency onto the individual test projects
Writing a custom 'build' script using something like (N)Ant. This would parse the individual test results and publish the results as per the TC KB
Good luck!
Thinking outside the box, you could have an overall build which doesn't really do anything (or use one of your test build configs as your 'master'), with snapshot dependencies on each of your split test builds. That way if any of them fails, the 'master' will fail because one of the dependent builds failed.
TW-9990 looks to be concerned with build triggering rather than dependencies.
