I'm a Maven noob currently working with Maven/Jenkins to perform some downstream jobs on the back of a release task and my team has hit a problem.
What we are trying to achieve is to pass in the version tag into the downstream jobs once the main build has been executed. We had been trying to achieve this using the M2 plugin, but it appears to execute in a Build->Downstream Jobs->Release cycle, and we need to have a Build->Release->Downstream Jobs pattern.
We therefore decided to create a separate job that uses a build step to perform the release as a goal. Here are the directives we are using to achieve this:
-Pdmt -Dresume=false release:clean release:prepare release:perform -DautoVersionSubmodules
A consequence of abandoning the M2 plugin for this job is that the prompt requesting the version number no longer appears. Since then, we've been trying to achieve this via the Post-build Actions by passing in pre-defined parameters. The issue for us is knowing how to pass in a dynamic parameter based on the previously executed job.
An alternative angle we were looking at was specifying a properties file that the main job could tokenize prior to its usage in the downstream jobs.
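Roughly, the idea was: a shell build step writes the version into a properties file, and the downstream jobs receive it via the Parameterized Trigger Plugin's "Parameters from properties file" option. A sketch of the shell step (the file and parameter names are placeholders, and help:evaluate with -DforceStdout needs a reasonably recent maven-help-plugin):

# run after the release build: capture the project version for downstream jobs
RELEASE_VERSION=$(mvn -q -DforceStdout -Dexpression=project.version help:evaluate)
echo "RELEASE_VERSION=${RELEASE_VERSION}" > release.properties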
Does anyone have any advice on how we might achieve this workflow, or if it's even possible?
OK, just for the record: it looks like this is outside the scope of the Maven Release plugin:
To answer the question I had to enumerate some of the assumptions made by the Release plugin. I can tell you about these baseline assumptions and you can decide whether or not something like the Maven Release plugin is appropriate for you.
What are these assumptions?
Your codebase is going to be versioned and released as a "unit". What does this mean? This means that you are going to be releasing an entire project at once with all of its submodules. [sic] In Github it means that the Maven Release plugin is going to operate on an entire repository.
The complex relationships between the repositories, the releases, and the resulting artifacts prevent us from automating the task in the manner described in the question.
At the moment, we are generating Jenkins jobs using the Job DSL plugin. Typically, we have the following jobs per project:
CI build (SNAPSHOT build)
Deployment, one per stage
Integration test (nightly build)
Creation of a release
Reports (Maven site, during the night)
Am I right that there can be only one Jenkinsfile in the project's repository? How could I map our requirements onto the new Jenkins pipeline?
I'm asking because we're going to install version 2 of Jenkins, and I'm not sure whether we should abandon our Jenkins job generation and use Jenkinsfiles instead.
There are a couple of options which might help you to migrate over to Jenkins pipelines. But you don't have to, especially not all at once. And you're not limited to a single Jenkinsfile: a multibranch pipeline expects one Jenkinsfile per branch by default, but a plain pipeline job can point at any script path in the repository, so several pipeline definitions can coexist in one repo.
You can use a shared library to define functions that can be used in multiple jobs, e.g. a buildOurThing function.
You can trigger an existing job (of whatever kind) using the build step. So you could model your new pipeline around existing jobs.
You can still use parameterized builds, if you want to use the deployment job with different targets.
Jenkins pipeline is really worth using, but maybe don't force yourself into an immediate switch. If Job DSL works for you, keep it. If you have new jobs (pipelines) to create, get familiar with pipelines.
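To make the second and third options concrete, here is a minimal Jenkinsfile sketch; the job names and the parameter are placeholders for whatever your generated jobs look like:

pipeline {
    agent any
    parameters {
        // mirrors a parameter of the existing deployment job
        string(name: 'STAGE', defaultValue: 'test', description: 'deployment stage')
    }
    stages {
        stage('Build') {
            steps {
                // reuse the existing CI job instead of re-implementing it here
                build job: 'myproject-ci'
            }
        }
        stage('Deploy') {
            steps {
                // hand the stage through to the existing parameterized deployment job
                build job: 'myproject-deploy',
                      parameters: [string(name: 'STAGE', value: params.STAGE)]
            }
        }
    }
}

Each stage only delegates to an existing job, so you can migrate one job at a time.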
We're trying to improve our Jenkins setup. So far we have two directories: /plugins and /tests.
Our project is a multi-module project of Eclipse Plugins. The test plugins in the /tests folder are fragment projects dependent on their corresponding productive code plugins in /plugins.
Until now, we had just one Jenkins job which checked out both /plugins and /tests, built all of them and produced the Surefire results etc.
We're now thinking about splitting the project into smaller jobs corresponding to features we provide. It seems that the way we tried to do it is suboptimal.
We tried the following:
We created a job for the core feature. This job checks out the whole /plugins and /tests directories and builds only the plugins the feature is composed of. This job has a separate pom.xml which defines the core artifact and lists the modules contained in the feature.
We created a separate job for the tests that should be run on the feature plugins. This job uses the cloned workspace from the core job. This job is to be run after the core feature is built.
I somehow think this is less than optimal.
For instance, only the core job can update the checked out files. If only the tests are updated, the core feature does not need to be built again, but it will be.
As soon as I have a feature which is dependent on the core feature, this feature would either need to use a clone of the core feature workspace or check out its own copy of /plugins and /tests, which would lead to bloat.
Using a cloned workspace, I can't update my sources. So when I have a feature depending on another feature, I can only run its job once the core feature has been updated and built.
I think I'm missing some basic stuff here. Can someone help? There definitely is an easier way for this.
EDIT: I'll try to formulate what I think would ideally happen if everything works:
check if the feature components have changed (i.e. new changes are available to check out)
if changed, build the feature
Build the dependent features, if necessary (i.e. check whether the corresponding job needs to run)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
Finally, the project job should
do a nightly build
check out all sources from /plugins and /tests
build all, test all, send results to Sonar
Additionally, it would be neat if the nightly build was unnecessary because the builds and test results of the projects' features would be combined in the project job results.
Is something like this possible?
Starting from the end of the question. I would keep a separate nightly job that does a clean check-out (gets rid of any generated stuff before check-out), builds everything from scratch, and runs all tests. If you aren't doing a clean build, you can't guarantee that what is checked into your repository really builds.
check if the feature components have changed (i.e. new changes are available to check out)
if changed, build the feature
Build the dependent features, if necessary (i.e. check whether the corresponding job needs to run)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
[I am assuming that by "dependent features" in 1 you mean the things needed by the "feature" in 2.]
To do this, I would set up multiple jobs:
A job for every individual feature and every dependent feature that simply builds that feature. These jobs should be started by SCM changes for the (dependent) feature.
I wouldn't keep test jobs separate from compile jobs. That allows the possibility that successfully compiled code is never tested. Instead, I would rely on the fact that when a build step fails in Jenkins, it normally aborts further build steps.
The trick is going to be in how you thread all of these together.
Let's say we have a feature and its build job, called F1, which is built on two dependent features DF1.1 and DF1.2, each with their own build jobs.
Both DF1.1 and DF1.2 should be configured to trigger the build of F1.
F1 should be configured to get the artifacts it needs from the latest successful DF1.1 and DF1.2 builds. Unfortunately, the very nice "Clone SCM" plugin is not going to be of much help here as it only pulls from one previous job. Perhaps one of the artifact publisher plugins might be useful, or you may need to add some custom build steps to put/get artifacts.
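If you later move these jobs to pipelines, the wiring could look roughly like this. This is only a sketch: it assumes the Copy Artifact plugin is installed, and the job names come from the example above.

// Jenkinsfile for F1 (sketch)
pipeline {
    agent any
    triggers {
        // rebuild F1 whenever DF1.1 or DF1.2 completes successfully
        upstream(upstreamProjects: 'DF1.1,DF1.2', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Fetch dependency artifacts') {
            steps {
                // pull the artifacts of the latest successful dependent builds
                copyArtifacts projectName: 'DF1.1', selector: lastSuccessful()
                copyArtifacts projectName: 'DF1.2', selector: lastSuccessful()
            }
        }
        stage('Build F1') {
            steps {
                sh 'mvn clean verify'
            }
        }
    }
}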
I have a project configured in Jenkins that polls an SCM and begins a build when a change is posted. There is a post-build action to build another project. My question: the project that is built afterwards has its own parameters, so how do I know which parameter value is used when the post-build action triggers? Right now, if I use 'Choices', is it just picking the first one? How do I have it pick other ones?
OK, let's take it one by one :)
If you want to see which parameters were used, you can install this plugin: Show Build Parameters Plugin
If you want to trigger a build with specific parameters, use this plugin: Parameterized Trigger Plugin
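With that plugin, add the post-build action "Trigger parameterized build on other projects" and supply predefined parameters, one per line, for example (the name and value are placeholders and must match a parameter defined on the downstream job):

MY_CHOICE=second-option

And to answer the last part: if a parameterized job is triggered without explicit values, it runs with its defaults, and the default of a Choice parameter is the first entry in its list. That is why you currently see the first choice being picked.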
I want to collect build metrics for a Maven build (metrics like total time taken by the build, status of the build (SUCCESS or FAILURE), test results, etc.) and store them for analysis. All this information is available in the log, but I need to collect it at the end of the build and call a service with the data.
This feature should be available wherever a Maven build is done, so it should be associated with the lifecycle. But I am not sure whether Maven has any hooks to tap to get this kind of information.
- Kamal
You might want to look into Continuous Integration, which will build your project every time you commit to the repository. I personally like Jenkins, where you can install the Global Build Stats Plugin, which I think will cover what you want to do.
I found a way to profile the Maven build on developer machines.
For Maven 3 and above, Maven exposes build events through the EventSpy API. An example profiler is available at https://github.com/tesla/tesla-profiler . So we implemented our own profiler, and it logs the data to a central server.
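A minimal EventSpy looks roughly like this (a sketch of the API, not the actual tesla-profiler code; the reporting call is a placeholder):

import javax.inject.Named;
import javax.inject.Singleton;

import org.apache.maven.eventspy.AbstractEventSpy;
import org.apache.maven.execution.ExecutionEvent;
import org.apache.maven.execution.MavenSession;

// discovered as a component when packaged and installed as a Maven extension
@Named
@Singleton
public class BuildMetricsSpy extends AbstractEventSpy {

    @Override
    public void onEvent(Object event) {
        // Maven fires many event types; execution events carry the build status
        if (event instanceof ExecutionEvent) {
            ExecutionEvent exec = (ExecutionEvent) event;
            if (exec.getType() == ExecutionEvent.Type.SessionEnded) {
                // the build is finished: collect what you need and call your service
                report(exec.getSession());
            }
        }
    }

    private void report(MavenSession session) {
        // placeholder: e.g. POST duration, result and test counts to a central service
    }
}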
For Maven 2.x, there is no easy way. I modified Maven to expose the events and wrote a listener to track the data.
Kamal's answer requires you to modify your local Maven installation, which is not a portable solution.
Here is a Maven plugin which collects the execution time of each plugin and saves it as an HTML report. It utilizes Maven's core extensions feature, which was introduced in 2015, so you don't have to modify your Maven installation.
https://github.com/jcgay/maven-profiler
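For reference, the extension is registered per project via Maven's core extensions descriptor, roughly like this (the coordinates and version are quoted from memory; check the project's README before using them):

<!-- .mvn/extensions.xml in the project root -->
<extensions>
  <extension>
    <groupId>fr.jcgay.maven</groupId>
    <artifactId>maven-profiler</artifactId>
    <version>3.1</version> <!-- example version only -->
  </extension>
</extensions>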
We have a huge project with many submodules. A full build currently takes over 30 minutes.
I wonder how this time is distributed over the different plugins/goals, e.g. tests, static analysis (findbugs, pmd, checkstyle, etc.).
Would it be possible to time the build to see where (in both dimensions: modules and goals) most of the time is spent?
The maven-buildtime-extension is a maven plugin that can be used to see the times of each goal:
https://github.com/timgifford/maven-buildtime-extension
If you run the build in a CI server like TeamCity or Jenkins (formerly Hudson), it will give you timestamps for every step in the build process and you should be able to use these values to determine which goals/projects are taking the most time.
I don't think there is any way built in to maven to do this. In fact, in the related question artbristol posted, there is a link to a Maven feature request for this functionality. Unfortunately, this issue is unresolved and I don't know if it will ever be added.
The other potential solution is to write your own plugin which would provide this build metadata for you.
I don't think there is a way to determine the timing of particular goals. What you can do is run particular goals separately to see how long they take. So instead of "mvn install", which runs all of your tests, checkstyle, etc., just run "mvn checkstyle:checkstyle" to see how long that takes for a particular module.
Having everything run every time is nice when it's done by an automated server (Continuum/Jenkins/Hudson), but when you are building locally, sometimes it's better to be able to just compile. One thing you can do is have the static analysis goals run ONLY when you pass in a certain parameter or profile, as sketched below. Another option is to have them run only when maven.test.skip=false.
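For example, if the analysis plugins are bound inside a profile (the profile name here is made up), developers get a fast default build and the full checks run only on demand:

mvn clean install              # local build, no static analysis
mvn clean install -P analysis  # full build including checkstyle/findbugs/pmd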
If you are using a continuous build, try having the static analysis only done every 4 hours, or daily.