I'm trying to design a solution that allows a single Virgo application to provide backwards compatibility for integrating with multiple versions of an external service provider.
For example, the application, call it PortalApp, is a portal that currently integrates with version 2.3 of ThirdPartyApp. ThirdPartyApp v3.0 is coming out soon with new features, so the new version of PortalApp will have features that won't work with the older version of ThirdPartyApp.
I don't require being able to serve both versions dynamically at run time, just one or the other. I've already established that I can have two versions of a module in the Virgo usr repository, and load one or the other based on the .plan file used at server start.
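For reference, each .plan file simply pins a different version of the provider bundle - roughly like this (a sketch; the bundle name is a placeholder):

<plan name="portal.plan" version="1.0.0" scoped="false" atomic="true"
        xmlns="http://www.eclipse.org/virgo/schema/plan">
    <artifact type="bundle" name="com.example.thirdpartyprovider" version="[1.0.0, 1.0.0]"/>
</plan>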
For simplicity, we can assume the project is currently set up like this:
PortalApp
- web-app
- ThirdPartyProvider
There are a number of other modules that depend on ThirdPartyProvider, so changing the artifactId would break those chains. What I'd like to do is build two different versions of the same module. Something like this:
PortalApp
- web-app
- - 1.0
- - 2.0
- ThirdPartyProvider
- - 1.0
- - 2.0
I tried creating a parent pom.xml in web-app (packaging: pom) that identified both 1.0 and 2.0 as modules, but only one of them builds.
Can a single build of the PortalApp project build both versions of a module?
No, it's not possible (actually it is, but you really wouldn't want to do that, because it's a world of pain). The pom has a version tag, and this is the version of the artifact that is built.
Rather than do that, you should create a multi-module project with one module for each web-app, as in your second diagram, each depending on the relevant version of ThirdPartyProvider. You would then factor out the common code from these web-apps. Usually this produces two things: a common web-app that web-app:1 and web-app:2 depend on (this creates what's called an 'overlay', which pushes the contents of the common app into the other two apps without overwriting existing files), and a shared Java library containing the common Java classes (depending on how you use the third-party API, you may need two of these too).
You then build both web-apps, producing two artifacts, web-app-1.war and web-app-2.war, each with a dependency on the relevant ThirdPartyProvider and on the common classes lib.
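A sketch of web-app-1's pom under that layout (coordinates invented); web-app-2 would be identical except for the ThirdPartyProvider version:

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example.portal</groupId>
    <artifactId>web-app-1</artifactId>
    <version>1.0</version>
    <packaging>war</packaging>
    <dependencies>
        <!-- war-type dependency: maven-war-plugin merges it in as an overlay -->
        <dependency>
            <groupId>com.example.portal</groupId>
            <artifactId>web-app-common</artifactId>
            <version>1.0</version>
            <type>war</type>
        </dependency>
        <!-- shared java classes factored out of the two web-apps -->
        <dependency>
            <groupId>com.example.portal</groupId>
            <artifactId>portal-common</artifactId>
            <version>1.0</version>
        </dependency>
        <!-- the provider version this variant targets -->
        <dependency>
            <groupId>com.example.portal</groupId>
            <artifactId>ThirdPartyProvider</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
</project>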
To keep all artifacts at the same version you could use the Versions Maven Plugin to set the version to 2.0 or 3.0, and in your build you can select the provider with mvn package -DthirdParty.version=3.0
Make sure your pom.xml contains a property like this:
<properties>
    <thirdParty.version>2.0</thirdParty.version>
</properties>
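That property is then referenced wherever the provider is declared as a dependency, so a single -D flag switches every module at once (groupId invented):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>ThirdPartyProvider</artifactId>
    <version>${thirdParty.version}</version>
</dependency>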
Then you can use Jenkins to automate the builds and ensure that everything is OK in your continuous integration process.
Related
I'm looking for advice on how to build artifacts that are composed of combinations of multiple modules without repeating a lot of boilerplate for all possibilities:
We have software that is deployed as a .war into Tomcat and as an .amp into Alfresco running on the same Tomcat instance.
Everything related to Alfresco / .amp does not matter for the scope of this question. For simplicity, just assume a single .war artifact as far as Maven is concerned.
We use the open-core model and have a free version that consists of some code that ends up in an .amp, plus a .war file that contains the Angular frontend and several backend libraries.
At the moment we have two plugins in our software - each plugin provides an additional .amp file and adds a .jar and config files to the .war - and we have lots of extensions - each extension overwrites/extends some Angular frontend files and also adds XML configuration to the .war and/or .amp.
Now I'm trying to migrate to Maven from an ancient Ant-based build setup that basically just copies the plugins/extensions over the base install at deploy time.
I need to be able to create configurations like core + plugin-a + extension-b or core + plugin-a + plugin-b + extension-c, so that I have several .amp artifacts and a single .war artifact for each configuration.
It would be nice if it were also possible to aggregate extensions, like core + plugin-a + plugin-b + extension-c + extension-d.
At the moment I'm using the maven-assembly-plugin for the .war and the frontend-maven-plugin for Angular, and the assembly plugin just copies the compiled artifacts into the war.
The .war itself is a Maven module.
I could go on with this strategy and create modules for every extension and every plugin, but then I would need a module for every possible combination of the extensions and plugins.
To make it worse, some extensions/plugins are commercial and live in different repositories - so I can't just add everything to the open-core POM.
I've looked into profiles, but I'm not sure whether they would solve my problem, as I would need something like a central registry for all the submodules.
Something like mvn clean package -Pextension-a,extension-b,plugin-a that creates the artifacts would be great - see the sketch below of what I have in mind.
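For illustration only (module names are the examples from above), profiles in the aggregator pom could switch modules on and off:

<profiles>
    <profile>
        <id>plugin-a</id>
        <modules>
            <module>plugin-a</module>
        </modules>
    </profile>
    <profile>
        <id>extension-b</id>
        <modules>
            <module>extension-b</module>
        </modules>
    </profile>
</profiles>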
How do I tackle this problem with Maven? Are there projects with these requirements where I can look at how it's solved?
This answer is a bit speculative, as I do not know anything about Alfresco.
Have you thought about writing a Maven plugin that downloads an extension/plugin (maybe as a zip file from your repository?), unpacks it and applies it to your project?
Then you could call the Maven plugin with different lists of extensions/plugins.
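You may not even need a custom plugin: the stock maven-dependency-plugin can already download and unpack such an archive. A sketch (coordinates invented, untested):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-extension-a</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <!-- hypothetical coordinates of the extension archive -->
                        <groupId>com.example</groupId>
                        <artifactId>extension-a</artifactId>
                        <version>1.0.0</version>
                        <type>zip</type>
                    </artifactItem>
                </artifactItems>
                <!-- unpack here, then include this directory in the war/amp assembly -->
                <outputDirectory>${project.build.directory}/extensions</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>

One such execution per extension, each wrapped in a matching profile, would give you the -Pextension-a,extension-b selection described in the question.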
In the end I found Bazel with rules_jvm_external.
The concept of WORKSPACE files that allow dependencies via git/maven/http/etc. is perfect for this. Being able to also build the Angular frontend with Bazel and create lightweight Docker images, as well as the cached incremental builds, makes it a perfect fit.
However, transitioning from Maven to Bazel is not straightforward - but after learning the concepts I won't look back!
I'm fairly inexperienced and completely new to the whole world of build tools, so here's my situation: I am developing a webapp with JSF, PrimeFaces and Hibernate on wildfly-9.0.2-final. All Java files (incl. ManagedBeans, DAOs, model classes, etc.) are currently in a regular Eclipse Java project called MyApp-CORE. There is no HTML or any other resources in that project, but all the third-party libraries like PrimeFaces, commons-xy, etc. Then I've got two dynamic web projects with all the .xhtml files and stuff. Both web projects include the CORE in their build path (all done via Eclipse's built-in tools). Basically I followed "Structure for multiple JSF projects with shared code" so far. All projects are versioned using Git.
I was now asking myself how to mavenize the whole thing and also how to properly include tests. The final result should be:
I want a build file for each web project that includes the CORE dependency and all of its transitive dependencies, creates a .war file and deploys it either on the production system or locally (depending on some parameters I want to be able to maintain).
This build file could then, for instance, test and build the CORE and then the .war file.
Since I'm using JSF, just about the only option for testing is JSFUnit. Should I test each web project individually and put all the test cases there (which would be highly redundant, because they're mostly the same and only a few features differ), or should I rather create a separate web project called MyApp-TEST which tests the CORE.jar and also - depending on some configuration - each web project?
I've already created a structure that makes it possible to include the core in the web projects, but unfortunately I lose the perks of hot deployment in WildFly when just including it as a dependency from my local Maven repository.
So, to summarize it:
What would be a best practice for this setup, eventually leading to a continuous integration scenario?
How should I include the test cases (full integration tests that test actual UI behaviour)?
Which tool (Maven, Gradle, Ant, etc.) would be best for that task?
How can I keep using hot deployment for smooth development?
Thanks in advance for any comments, hints or shared experience!
I am looking for a way to ensure that all the features I deploy in Karaf require dependencies that are of the same version. The project is composed of more than 40 bundles which makes it difficult to verify manually.
I am thinking of developing a Maven plug-in that would perform the check, but first I would like to be sure that such a solution does not already exist.
If you want to be sure you use the same versions, create a parent project and define the versions of your dependencies only there, in its dependencyManagement section. That way you can be sure all your modules get the same dependencies. Of course this only makes sense if the modules are very closely related (e.g. belong to the same application / release unit).
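A minimal sketch of such a parent (coordinates invented); each module then declares the dependency without a version and inherits the managed one:

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>app-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <dependencyManagement>
        <dependencies>
            <!-- every module referencing this artifact gets this version -->
            <dependency>
                <groupId>org.apache.commons</groupId>
                <artifactId>commons-lang3</artifactId>
                <version>3.12.0</version>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>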
Why would you even want to do this? Each bundle should depend on the versions of the packages it needs, and that dependency should be a range. So if you compile against an API package version 1.0.0, and you are a consumer of that API, then you should import with the range [1.0.0, 2.0.0). Refer to the OSGi Core Release 5 specification, section 3.7.3 ("Semantic Versioning"), for details.
At runtime the OSGi Framework will ensure that your bundle is wired to a package version that is within its permitted range. Obviously if you have non-overlapping version ranges from different importers then the Framework will not be able to satisfy them with a single exporter.
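If you build your bundles with the maven-bundle-plugin, such a range can be declared in the bundle instructions; a sketch (package name invented):

<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- API consumer: accept any 1.x of the package, but not 2.0.0 -->
            <Import-Package>com.example.api;version="[1.0.0,2.0.0)",*</Import-Package>
        </instructions>
    </configuration>
</plugin>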
We create releases and upgrade the versions of our multi module project with the maven release plugin, which is usually triggered with a Jenkins release build. The problem is that we have a couple of modules whose versions should be updated as well, but which should not be built on the server - only the version upgrade should be performed. (These are just testing stuff only used occasionally by developers.) Any good ideas how to do that?
I believe the accepted policy is that every module that forms part of a project should be built. That includes the modules intended to test others.
If internally you have a situation in which a module is intended to test another but should definitely not form part of the build, you may wish to extract that module into another project and adjust it to depend on the artifacts of the original project's module.
You may still find that your CI server (such as Jenkins) recognises this dependency and builds your test project because it detects a changed dependency. In that case disable the automatic building of this test project.
You may find it less "smelly" to correct the testing procedure so that your test modules are designed to be run on each build. Having a set of tests that should not be run automatically seems contrary to modern software engineering approaches.
What are the best practices for software versioning in multi-module projects with Maven?
I mean, when I create a multi-module project with Maven, what is the best approach to versioning? Use a single version for all the modules (defined in the top project)? Use a version for each module (defined in the POM of each module)? Is there another approach that I'm missing? What are the pros and cons of each approach?
In general, are the different modules released together (possibly sharing the same version number)?
Thanks
Honestly, it depends on what you would like to do. Multi-module projects are created for multiple reasons, one of them being that you only need to deploy what has changed instead of all modules.
Think about it this way: if you had a non-multi-module project and you only had to change one line in the services layer, you have to rebuild the entire project and deploy all of the code again...even though only your services layer will change.
With multi-module projects, you can regenerate your project and deploy only what changed...your services. This reduces risk and you're assured that only your services module changed.
There is also a multitude of other benefits to using multi-module projects that I'm not listing here, but there is certainly a huge benefit to NOT keeping the version numbers of your modules in sync.
When you build your project, consider deploying it to a repository that will hold all compatible jars together for builds (each build creates a new folder with the parent-most pom version number). That way, you don't need to keep documentation about which jars are compatible...they're all just deployed together with a build number.
I was looking for a solution for this exact problem myself and versions-maven-plugin was exactly what I needed. I don't like the release plugin communicating with the SCM system. The versions plugin does just what we need: it sets a new version number in all poms of the project:
mvn versions:set -DnewVersion=2.0.0
Then I can proceed with commits, tags and an official build server build...
EDIT:
How well the versions plugin works depends on how the Maven multi-module project is organised: as a result, it often does not update all POM files in a complex multi-module project.
I've found that sed and find do the job much more reliably:
sed -i 's/1.0.0-SNAPSHOT/1.0.0-RC1/g' `find . -name 'pom.xml'`
Typically you create a multi-module project because you have deemed that the various modules are parts of a single whole. Maybe the client piece, the controller piece and the services piece. Or maybe the UI with services.
In any case, it makes sense to have the version numbers for the various modules to move in lock-step. However Maven does not enforce that as a rule.
As to your question
are the different modules released together (possibly sharing the same version number)
I would think so. That is one of the reasons for making it a multi-module project in the first place. Otherwise you could have the modules as independent projects.
Of course this is the kind of stuff that is rife with edge cases and exceptions ;-)
I had the same problem with a project I'm working on. I also decided to use separate versions, and even the dependency on the parent pom only has to be updated if some of the managed dependencies change (so mostly as @vinnybad describes it).
Two additions
exists-maven-plugin
With the org.honton.chas:exists-maven-plugin, only the modules that have actually changed are deployed to the repository, which is really great, because the corresponding Docker images are then also only published if something has changed in one of the services. This avoids "polluting" the image repository with different but unchanged versions.
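A minimal configuration sketch (check the plugin's documentation for the current version and defaults):

<plugin>
    <groupId>org.honton.chas</groupId>
    <artifactId>exists-maven-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <!-- sets maven.deploy.skip if the same artifact already exists in the remote repository -->
                <goal>remote</goal>
            </goals>
        </execution>
    </executions>
</plugin>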
versioning
One main downside of the "separate versions" approach is the set of questions it raises:
What's the current version of my project?
Which module versions work with each other? (Even though they don't directly depend on each other, one may rely on what another does, e.g. when they share the database schema.)
To solve that, I put all module versions into the dependency-management section of the parent pom, even if no other module depends on them. An "integration-test" module could solve that by depending on all of the modules - and of course testing them together.
This way I am "forced" to update the parent pom with every change, since it refers to the released module versions. The parent pom thus carries the "leading" version, and its dependency-management block states the versions of all modules that are compatible with each other (which is ensured by the integration test).
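Concretely, the parent's block then enumerates the versions that are known to work together (names invented):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>service-a</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>service-b</artifactId>
            <version>2.1.4</version>
        </dependency>
    </dependencies>
</dependencyManagement>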