Jenkins - changing pom.xml versions manually - maven

I need to change a project version. I can do it by creating a Maven project (job) in Jenkins and adding the Maven goal versions:set. I can also do it manually.
Question: Is there any downside of changing pom.xml versions manually (using sed/awk)?

Changing the version using a Maven plugin is definitely better than a manual approach (e.g. sed/awk) for several reasons:
You stay within the Maven ecosystem and as such avoid undesirable and unforeseen side effects
The versions:set goal also automatically takes care of propagating the change to sub-modules in the case of a multi-module Maven project, since the goal:
Sets the current project's version and based on that change propagates that change onto any child modules as necessary.
You can make use of several additional options provided by the goal, like filtering (e.g. only change for certain groupId/artifactId, again in case of multi-module)
Maintenance-wise, you have a better chance of keeping it solid across different versions of Maven
In general, if Maven (or one of its plugins) already provides the same functionality: simply don't reinvent the wheel
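For reference, a minimal invocation looks like this (the version number is a placeholder):
mvn versions:set -DnewVersion=1.2.0
# versions:set keeps pom.xml.versionsBackup files until you decide:
mvn versions:commit   # accept the change and remove the backup poms
mvn versions:revert   # or roll back to the backups instead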

Related

Automatically force update pom to newer versions from inside a test

This is not about updating the pom dependencies using a Maven goal. I already have that sorted out.
So I am responsible for creating, packaging and maintaining common utilities. These common utilities are in turn used by all the teams in the org. The problem is that the teams using these utilities do not update the dependencies unless it is a last resort. We would like them to use the latest release version of our common utilities, barring very few exceptions.
Now I have come across the Maven Versions Plugin by MojoHaus, which I think serves my need by using two goals - versions:update-properties and versions:use-latest-releases. It serves my purpose except for two things:
I do not see a way to exclude certain groupId:artifactId pairs from the dependency/property update
We really want this to be a compulsory thing (maybe part of the test execution - this is mainly for test automation utilities) rather than a Maven goal, because if it is a Maven goal, it needs to be invoked separately and hence becomes optional for the teams.
We know that forcefully updating to the latest version might cause some issues with defect reproducibility, but we are willing to take that risk. Our utilities are really test products.
Any direction/help on this is appreciated.
Edit: We already run our tests using the Maven goals clean install, so they use the existing pom. We want the dependency update to happen before the tests run. It would also be desirable to commit the changes to source control (Bitbucket) if possible.
We have our test setup in Jenkins, but teams also run tests on local machines.
Edit: Found the answer to #1. The plugin lets you exclude groupId/artifactId patterns (regex) via the excludes and excludesList tags.
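For illustration, the invocations might then look roughly like this (coordinates and patterns are made up; excludes/excludesList take comma-separated groupId:artifactId patterns):
mvn versions:update-properties -DexcludesList='org.example:legacy-util,org.example:frozen-api'
mvn versions:use-latest-releases -Dexcludes='org.example:legacy-*'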
Sorry for insisting. From the Maven point of view I see two main solutions:
Your projects and your utility jar are tightly coupled and every project always needs to use the latest version. Then you can bundle all projects and your utility in one multi-module project. This makes sure that everything is up to date all the time, but it requires that all projects and your utilities are always built together (not separately).
You distribute your utilities to different projects which build and release at different times. Then it is up to the projects to decide when to update. There is unfortunately no standard way to deprecate jars.
If I understand you correctly, you want something "in the middle". This may be hard to achieve.

Good approach to Maven project design, or an antipattern?

Context:
I have a multi-module Maven project that looks like:
Root
    ModuleA
    ModuleB
    ModuleC
    ModuleD
    .......
There are around 25 modules under the Root:
A few of them represent the core of the application (5 modules)
Each of the remaining modules represents the business-process implementation related to a type of customer. These modules are completely independent of one another.
When packaging or releasing the 'Root' project, the artifact generated is a single ZIP file aggregating all the JARs related to 'Root' modules.
The single ZIP file is generated according to an assembly descriptor, it represents the delivery artifact.
At deployment time on the target environment, the single ZIP is unzipped under a directory where it is consumed (class-loaded) by an 'engine', a Java web application that provides the final services.
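For context, a descriptor producing such a ZIP typically looks something like this sketch (id and settings invented for illustration):
<!-- src/assembly/bundle.xml: aggregate all reactor module JARs into one ZIP -->
<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0">
  <id>bundle</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <moduleSets>
    <moduleSet>
      <useAllReactorProjects>true</useAllReactorProjects>
      <binaries>
        <outputDirectory>/</outputDirectory>
        <unpack>false</unpack>
      </binaries>
    </moduleSet>
  </moduleSets>
</assembly>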
Constraints
The 'business constraints' on one side,
and the desire to reduce regressions between different versions on the other.
The above constraints led us to adopt the following release scenarios:
Either we release the Root and ALL its submodules. The resulting ZIP then aggregates all the submodule JARs with the same version, containing something similar to:
[ModuleA-1.1.jar, ModuleB-1.1.jar, ModuleC-1.1.jar, ModuleD-1.1.jar, ......., ModuleX-1.1.jar].
Or we release the Root and A FEW of its submodules, the ones we want to update.
The resulting ZIP still aggregates all the submodule JARs: the released submodules with their newly released versions, the unreleased submodules with another 'appropriate' version.
For example, after such an incremental release, the ZIP will contain something similar to:
[ModuleA-1.2.jar, ModuleB-1.1.jar, ModuleC-1.2.jar, ModuleD-1.1.1.jar, ......., ModuleX-1.1.2.jar].
These two scenarios were made possible by:
either declaring the modules as Maven modules ('module') for the first scenario,
or declaring the modules as Maven dependencies ('dependency') for the second, INCREMENTAL scenario (see the sketch below).
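To make the difference concrete, here is a rough sketch of the two Root pom variants (groupId invented; versions taken from the example above):
<!-- Scenario 1: submodules are built and released together with Root -->
<modules>
  <module>moduleA</module>
  <module>moduleB</module>
</modules>
<!-- Scenario 2 (incremental): already-released submodules are pulled in as dependencies -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>moduleB</artifactId>
    <version>1.1</version>
  </dependency>
</dependencies>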
Question
Both scenarios are working perfectly, BUT in the 2nd (INCREMENTAL) scenario, maven-release-plugin:prepare commits to the SCM (SVN) all the modules [ModuleA, ModuleB, ModuleD, .... ModuleX]: it uploads both the released and the non-released ones, even though the 'non-released' modules are declared as a 'dependency' in the pom and not as a 'module'.
1/ IS THERE a way to avoid uploading the 'non-released' modules? Is there a way to inject an 'exclude directory list' into the SCM SVN provider?
2/ A MORE global question: is the approach we use a correct one, or is it an anti-pattern? In the latter case, what should the alternative be?
Thank you.
To me, your approach looks like an antipattern. I recommend only having projects in the same hierarchy that you want to release together. Projects with a different release lifecycle should live on their own - otherwise you will keep running into the issues you mentioned. If you run the release plugin from a root directory (multi-module setup), all of the content of that root directory will be tagged in SVN.
In your case, I would probably create the following hierarchies:
Core
One per customer type
Potentially one per type to bundle them (zip), depending on your structure
I would group it by the way you create the release. It might mean that you have to run the release plugin a couple of times instead of just once when you make a change e.g. in Core, but it will be a lot cleaner.
Your packaging project will then pull in all of the dependencies and package/assemble them.
If you have common configuration options, I recommend putting them into a common parent pom. This doesn't have to be your root (multi-module) pom.
Did you try running the maven-release-plugin with the -pl (--projects) argument plus the list of all modules you want to release?
Basically, this argument allows you to specify the list of modules against which the Maven command should be performed (if you omit it, all submodules are included; that is the default behavior).
See more details about this command line here.
I never tried it with the maven-release-plugin, and I don't know if it will work, especially regarding SCM operations.
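For what it's worth, such an invocation would look roughly like this (module names invented; untested with the release plugin, as said):
mvn -pl moduleA,moduleC release:prepare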

One (multi-module) maven project per svn trunk or not

My usual practice is to have a single maven project (can be multi-module) per svn trunk like this:
trunk/ (style 1)
/pom.xml
/submod-1
/submod-2
Basically, the entire trunk is treated as a single release package. I find this easier to manage. There's an aggregation/parent pom to manage all modules within this trunk.
However, I noticed some of my peers organize it like this:
trunk/ (style 2)
/project-1
/pom.xml
/project-2
/pom.xml
Basically, within the single SVN trunk, project-1 and project-2 need to be managed separately, i.e. I cannot check out the trunk and work with its contents as a single multi-module Maven project - something I appreciate about style 1.
Q1: When would style 2 be a good idea, if at all?
Q2: Can someone tell/point me to best practices on how to manage maven projects with subversion?
I have seen instances of both.
In my opinion, it depends on how independent the projects/modules are. If they have their own release plans, then it makes sense for them to have their own trunk, branches and tags. However, if the projects/modules are always released together, it may make sense for them to be part of a single trunk.
From a maven perspective, if the modules of a multi-module project inherit their <version> from the parent, then it should follow style 1 above.
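In that style, a child module omits its own <version> and inherits it from the parent, e.g. (coordinates invented for illustration):
<!-- submod-1/pom.xml -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.example</groupId>
    <artifactId>trunk-parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>
  <artifactId>submod-1</artifactId>
  <!-- no <version> element: it is inherited from the parent -->
</project>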
I pretty much agree with @Raghuram's answer, but would like to add the following:
If you want to release projects together with the same life cycle and the same version number, you need the Style 1, with the parent POM in the root of the trunk. This parent POM then usually includes the sub projects as modules to be built from the root.
Style 2 makes sense if you have a set of independent projects (e.g. utilities), where you want a single SVN location (for easier administration), but have different release cycles. In this case, you don't need a root aggregator POM that includes the single project as modules. It might still make sense to have a common parent POM for the utility projects that centralizes things like compiler settings and plugin management. The difference to Style 1 is that you would never release all of the projects together. You would release a stable version of just the parent POM and then release the single utility projects as needed.
We have used both styles with great success - the main difference really is in the way and frequency you use for releasing.
With Style 2 (and the utility project example) I don't see an issue if these projects share the same SVN tags and trunk. A developer can still decide whether he wants to check out the whole utility trunk or just a single project.
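A sketch of such a non-aggregating common parent for Style 2 might look like this (names and settings invented; note the deliberate absence of a <modules> section):
<!-- utilities-parent/pom.xml: shared settings only, released on its own -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.utils</groupId>
  <artifactId>utilities-parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <properties>
    <!-- centralized compiler settings, plugin versions, etc. -->
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
</project>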
Just look at this question and at least its top answer. It actually summarizes the case and says pretty much everything about it.
Just think of continuous integration: it makes the first approach more flexible to use with CI servers like Jenkins, Bitbucket or Bamboo...

Automating Maven artifact releasing

For a project with a large number of Maven artifacts (both internally generated and external), how does one go about automating the release of the internally controlled artifacts as part of an overall product release?
Things to be aware of about this question: we use Jenkins and the Maven release plugin, so the operation of releasing a single artifact is automated (albeit the operation to kick-start the process is manual). However, the process of releasing all the changed artifacts over the course of a release is not automated (i.e. one has to manually kick-start the release of each artifact). Part of the problem is that almost nothing is released until the end of the release; prior to that, everything remains in SNAPSHOT. We have a huge number of components as well as numerous applications/services (over 30) which rely on the plethora of components. So it is not just a case of picking a component and releasing it: there are release dependency hierarchies that must be followed (i.e. start at the bottom, releasing components that do not use other components, and work your way up until all the applications/services are released).
It is also worth noting that we use two common parent poms which, for the most part, control the versions of the external artifact dependencies and the internal component dependencies. Some pom files for components and applications may override this, but this is (or should be) an exception and should be for a good, but temporary, reason. So when an internal artifact is released, the version in the corresponding parent dependency pom should also be updated.
The product has a release number (of course); however, the various pom files technically do not share this version number. While this is not strictly true, the idea is that when parts of the software are set to end-of-life, they will not be updated in the future; thus, while a limited number of artifact versions match the product's version at present, this will eventually not be the case.
Any thoughts on ways to get this process automated would be greatly appreciated. Also if you feel what I have described seems to be a crazy way to manage the software, then please provide a comment. Thank you.
You might be able to make use of the Maven Versions plugin which can help formalise versions for projects.
For example, the use-next-releases goal may allow you to release the lowest level of project and then more rapidly bring those released versions into the projects that depend on them.
There may also be scope to use the use-next-versions goal if you fancy releasing components as necessary and simply bringing your projects to the "latest" version that has been formally released.
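For example (goal names as documented by the plugin; run per project, starting at the bottom of the dependency hierarchy):
# move dependencies to the next released version
mvn versions:use-next-releases
# or move dependencies to the next available version, snapshots included
mvn versions:use-next-versions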

Maven : Multimodule projects and versioning

What are the best practices for software versioning and multi-module projects with Maven?
I mean, when I create a multi-module project with Maven, what is the best approach for versioning? To use a single version for all the modules (defined in the top project)? To use a version for each module (defined in the POM of each module)? Is there another approach that I'm missing? What are the pros and cons of each approach?
In general, are the different modules released together (possibly sharing the same version number)?
Thanks
Honestly, it depends on what you would like to do. Multi-module projects are created for multiple reasons, one of them being that you only need to deploy what has changed instead of all modules.
Think about it this way: if you had a non-multi-module project and you only had to change one line in the services layer, you have to rebuild the entire project and deploy all of the code again...even though only your services layer will change.
With multi-module projects, you can regenerate your project and deploy only what changed...your services. This reduces risk and you're assured that only your services module changed.
You also have a multitude of benefits to using multi-module projects that I'm not listing here but there is certainly a huge benefit to NOT keeping your version numbers of your modules in sync.
When you build your project, consider deploying it to a repository that will hold all compatible jars together for builds (each build creates a new folder with the parent-most pom version number). That way, you don't need to keep documentation about which jars are compatible...they're all just deployed together with a build number.
I was looking for a solution for this exact problem myself and versions-maven-plugin was exactly what I needed. I don't like the release plugin communicating with the SCM system. The versions plugin does just what we need: it sets a new version number in all poms of the project:
mvn versions:set -DnewVersion=2.0.0
Then I can proceed with commits, tags and an official build server build...
EDIT:
How well the versions plugin works depends on how the Maven multi-module project has been organised; as a result, it often does not update all POM files in a complex multi-module project.
I've found that sed and find do the job much more reliably:
sed -i 's/1.0.0-SNAPSHOT/1.0.0-RC1/g' `find . -name 'pom.xml'`
Typically you create a multi-module project because you have deemed that the various modules are parts of a single whole. Maybe the client-piece, the controller-piece and the services-piece. Or maybe the UI with services.
In any case, it makes sense to have the version numbers for the various modules to move in lock-step. However Maven does not enforce that as a rule.
As to your question:
are the different modules released together (possibly sharing the same version number)
I would think so. That is one of the reasons for having it as a multi-module project. Otherwise you could have the modules as independent projects.
Of course this is the kind of stuff that is rife with edge cases and exceptions ;-)
I had the same problem with a project I'm working on. I also decided to use separate versions, and even the dependency on the parent pom only has to be updated if some of the managed dependencies change (so mostly as @vinnybad describes it).
Two additions
exists-maven-plugin
With the "org.honton.chas.exists-maven-plugin", only the modules that have actually changed will be deployed to the repository, which is really great, because the corresponding Docker images will then also only be published if something has changed in one of the services. This avoids "polluting" the image repository with different but unchanged versions.
versioning
One main downside of the "separated versions" approach is the set of questions it raises regarding versioning:
What's the current version of my project?
Which module versions work with each other? (Even though they don't directly depend on each other, one may rely on what another does, e.g. they share the database schema.)
To solve that, I put all module versions into the dependency-management part of the parent pom, even if no other module depends on them. An "integration-test" module could address compatibility by depending on all of the modules - and of course testing them together.
This way I am "forced" to update the parent pom with every change, since it refers to the released module versions. The parent pom thus carries the "leading" version, and its dependency-management block states the versions of all modules that are compatible with each other (which is ensured by the integration test).
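A condensed sketch of that arrangement (module names and versions invented for illustration):
<!-- parent pom: carries the "leading" version and pins compatible module versions -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>service-a</artifactId>
      <version>1.3.0</version>
    </dependency>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>service-b</artifactId>
      <version>2.1.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>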
