Server/Client build using Maven/Jenkins

I'm asking for recommendations on how to configure our Client/Server build. We have a client/server architecture in which the versions of the Client and the Server are tightly coupled, meaning:
- they are both stored in the same git repository
- during development, changes to the Client and the Server must in some cases be made at the same time to maintain compatibility
- the Client code is dependent on the Server code
- at the moment they do not share a parent POM
I'm wondering what would be the best way to have both Client and Server compiled from the same branch, keeping in mind that branches are created all the time and developers may run a job on a private branch using parameterized builds.
Options:
- Create a combined parent POM.
- Create a Jenkins job that executes the two Maven builds one after the other using shell commands.
- Add the branch name to the version and deploy the Server to Artifactory.

Maven modules
You can use modules to aggregate both the client and server projects within the same Maven build. You don't have to have a parent POM in the inheritance sense, but you do need one more POM to aggregate the two projects. You'll need the following directory/file structure (normally all POM files are named pom.xml; I've renamed them here for clarity):
main-pom.xml
server/
    server-pom.xml
client/
    client-pom.xml
In main-pom.xml specify the projects to be built in the "modules" section:
<modules>
<module>server</module>
<module>client</module>
</modules>
The module names have to be the directory names. Don't worry about the order; Maven will figure it out according to the dependencies.
You can combine this approach with inheritance (the real parent POM) if it makes sense to share properties between your client and server projects (version, common dependencies, plugins, etc.).
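For illustration, here is a minimal sketch of what the aggregator POM (main-pom.xml above) could look like; the coordinates are placeholders:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <!-- hypothetical coordinates, adjust to your project -->
    <groupId>com.example</groupId>
    <artifactId>main</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <!-- an aggregator must use pom packaging -->
    <packaging>pom</packaging>
    <modules>
        <module>server</module>
        <module>client</module>
    </modules>
</project>

Running mvn install from the directory containing main-pom.xml then builds both modules in one go; if client-pom.xml declares a dependency on the server artifact, Maven builds the server first.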

Related

GitLab CI issue with parent and child POM

I have 6 microservices in my project and I have separated them into 6 projects in GitLab. When I build the microservices all together, or build the parent POM first and the child POMs separately afterwards outside GitLab, it works; but with gitlab-ci I am not able to build them, as they fail with a non-resolvable parent POM. Can someone please let me know how I can build these microservices independently (building the parent POM and keeping the artifact available for all other projects)?
I tried caching and artifacts in GitLab, but they are strictly bound to a single project.
If you always want to build those six microservices together, put them into one multi-module project. Then you have one project on GitLab and everything will be much easier.
If you need to keep them separate, then you need a Maven package registry. You can use the Package Registry that is included in GitLab, or an external one like Artifactory.
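As a sketch of the second option, the parent POM could be deployed to GitLab's Maven Package Registry by adding a distributionManagement section along these lines (the URL follows GitLab's documented pattern; the host and project ID are placeholders):

<distributionManagement>
    <repository>
        <id>gitlab-maven</id>
        <!-- placeholder host and project ID -->
        <url>https://gitlab.example.com/api/v4/projects/123/packages/maven</url>
    </repository>
</distributionManagement>

The child projects would then add the same URL as a <repository> so they can resolve the parent POM, and the CI job for the parent would run mvn deploy. Authentication (e.g. via a CI job token in settings.xml) is configured separately.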

Maven: how to not depend on Parent POM

I have a Maven multi-module project. Something like this:
- ParentProject
  - ChildA
  - ChildB
  - ChildC
The child projects inherit from a Parent POM (ParentProject) solely for the reason of sharing stuff like <build>, <scm> and <properties>, so as to not repeat it in all the child modules. Thus, the objective of the parent-child relationship is not related to dependencies in any way. It plays a role at build-time, not at runtime, so to speak.
The child projects' artifacts are meant for consumption by a wider audience, hence they'll be published to a centralized repo.
How do I "break" the relationship from the child up to the parent, as seen from the perspective of a consumer of a child?
Let's say another project, ProjectX, adds a dependency on ChildA. When doing this, the Maven client will not only download the POM and artifact of ChildA itself, but will also try to download the POM for ParentProject. However, there's absolutely no need for that POM from a consumer's point of view. It doesn't contain information that the consumer needs to know.
How can I break this relationship from consumer's perspective? Forcing the POM for ParentProject to be published into a repo seems pointless as nobody has any need for it there.
Perhaps there's another way that Maven will let me share things like build instructions and properties between projects without mandating that a Parent POM exists in a centralized repo?
Or perhaps there's some way I can manipulate the POM for the Child projects which gets put into the centralized repo (removing the <parent> element as it is irrelevant).
Perhaps it's only me, but I feel that Maven is conflating two unrelated concepts here (build-time vs. consume-time), forcing unnecessary roundtrips and unnecessary artifacts in the repo. I haven't dabbled with Gradle yet, but I wonder if it does this any better?
Usually, the Maven POM is both build POM and consumer POM. This is not ideal, and will probably change in future versions of Maven.
At the moment, your best option seems to be the flatten Maven plugin, which allows you to remove "unnecessary" parts of the POM before uploading it.
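A minimal sketch of such a configuration (the plugin version and phase bindings here are assumptions; by default the plugin writes a flattened consumer POM with the <parent> element resolved away, and that flattened POM is what gets installed/deployed):

<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>flatten-maven-plugin</artifactId>
            <version>1.5.0</version>
            <executions>
                <!-- generate the flattened consumer POM during the build -->
                <execution>
                    <id>flatten</id>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>flatten</goal>
                    </goals>
                </execution>
                <!-- remove the generated .flattened-pom.xml on mvn clean -->
                <execution>
                    <id>flatten.clean</id>
                    <phase>clean</phase>
                    <goals>
                        <goal>clean</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>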

Reusing Mule connectors and validation flows

How can I reuse Mule code (flows, exception strategies, database connectors, validators) across several projects? These are application-specific reusable artifacts, not enterprise-wide reuse.
For example: I have some master code (validators, flows, and exception strategies) which should be reused in 15 different flows (i.e. 15 different Mule projects). We are not using Maven at the moment. One way I explored is that we could jar it and publish it to a local Nexus repo, and re-use it via the POM. Is there any other way?
If possible, I would also like to make it dynamic, such that if I change the master code and deploy it, the change takes effect without having to redeploy the projects that use it.
You can reuse flows etc. (everything which is in the Mule XML files) and Java classes by placing them in a plain Java project, building a jar from it, and placing the jar on the classpath of the importing Mule projects.
To use the parts defined in the XML files, import them with <spring:import>.
Your question sounds like you already know this part.
I recommend building all Mule projects and the so-called master project with Maven: the Mule projects with packaging mule, the master project with packaging jar.
Maven will pack the master part inside the using projects, so there is no dynamic update.
When you want this dynamic update, don't build with Maven, or set the scope to "provided". In this case the master is not packaged into the other Mule projects. You have to make sure it is on the server classpath, e.g. in lib/user. Then you can change it there, restart the Mule server, and all projects get the update.
The same, with another level of indirection and the possibility of grouping, can be done with Mule domains.
All the dynamic behaviour described so far only works for on-premise Mule servers, not for CloudHub.
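As a sketch of the provided-scope variant, each importing Mule project could declare the master jar like this (the coordinates are hypothetical):

<dependency>
    <!-- hypothetical coordinates for the shared "master" jar -->
    <groupId>com.example</groupId>
    <artifactId>master-flows</artifactId>
    <version>1.0.0</version>
    <!-- provided: available at build time, but not packaged into the Mule app -->
    <scope>provided</scope>
</dependency>

With this scope the jar must then be placed in the Mule server's lib/user directory so it is found at runtime.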

What is the best way to structure maven projects to make a client jar?

New to Maven here... coming from the Ant world.
I need to create a client jar with a few files that will give my client the ability to write to my DB and make REST calls to my services.
These are mainly classes that wrap a REST connection and a DB client.
Is it possible to produce this artifact as a side effect of my main Maven project?
E.g.: the main project produces a bundle when I run mvn package, but I'd like to produce the client jar by providing some other parameters...
What you need here is a multi-module maven project.
The structure goes like this:
-- Parent Module
----- Child 1 Module
----- Child 2 Module
Here you can have all the code/files of your main app in the Child 1 module and put all the code/files for the client in the Child 2 module.
The parent module is just an aggregator which produces an artifact of type pom, whereas each of your child modules will produce an individual jar.
You can then use the second jar in your client.
For a detailed understanding of how multi-module projects work, see the Maven documentation.
The standard Maven way is "one project, one jar". This means that the cleanest way to achieve your goal is to set up a multi-module project where you have one module for your "normal" jar and one for your "client" jar. But there are other possibilities:
1. If you are talking about an EJB, you can use the maven-ejb-plugin and create a client artifact. Unfortunately, both artifacts then share the same POM (and, therefore, the same dependencies).
2. You can use the maven-assembly-plugin to assemble a set of files and deploy them as a side artifact (same problem as in 1.).
3. You can use the maven-install-plugin and maven-deploy-plugin to install/deploy entirely different artifacts along with your main artifact. These artifacts need to be created beforehand, e.g. by a custom Maven plugin.
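As a sketch of option 2, the main project could attach a client jar via the maven-assembly-plugin; the plugin version and descriptor path here are assumptions:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.6.0</version>
    <executions>
        <execution>
            <id>client-jar</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
            <configuration>
                <!-- hypothetical descriptor selecting only the client classes -->
                <descriptors>
                    <descriptor>src/assembly/client.xml</descriptor>
                </descriptors>
            </configuration>
        </execution>
    </executions>
</plugin>

The <id> in the assembly descriptor (e.g. client) becomes the classifier of the attached jar, so consumers can reference it with <classifier>client</classifier>.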

Is there any way to configure Archiva to download missing Maven project modules if they aren't in the local workspace?

I'm confused about how Archiva fully works. I understand that if we had a core set of dependencies, we could use Archiva as our local Maven repo.
The thing I don't understand is how Archiva manages build artifacts from your own projects.
Say I have a multi-module Maven project; we can even use the one from Sonatype as an example: http://www.sonatype.com/books/mvnex-book/reference/multimodule-sect-building-multimodule.html
What if I wanted to have one team working on the Simple Model app, while I wanted another to work on the Simple Webapp, but I didn't want either to have the projects they AREN'T assigned to in their local workspace? The Webapp needs the Model to build, but I don't want the Webapp team having direct access to the Model.
Is there any way Maven can detect that the build artifact for the Model wasn't in a Webapp dev's workspace, and pull it from our local Archiva repo, so they can still build the Webapp despite not having the Model (Maven module project) code in their workspace?
The Model project will be like any other third-party dependency and be downloaded from Archiva automatically, provided:
- the Webapp project specifies the Model project as a dependency, and
- the Model project is deployed to Archiva periodically (by a Continuous Integration system or other means).
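In POM terms the first condition is just an ordinary dependency declaration; the coordinates here are hypothetical:

<dependency>
    <!-- hypothetical coordinates of the Model module -->
    <groupId>com.example</groupId>
    <artifactId>simple-model</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>

As long as Archiva is configured as a repository (or mirror) in the developers' settings.xml, Maven resolves simple-model from Archiva whenever the module is not present in the local workspace.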
