GitLab CI: Spring Boot dependency on another project - spring-boot

I have a clustered application architecture where 3 of my primary services make use of a dependency artifact (let's call it commons) that contains the model files and other utilities shared by those 3 services.
Presently, I have all 3 Spring Boot applications deployed on Kubernetes through GitLab CI, with Artifactory for image management.
Now, each time I make changes to my commons service, I have to bump the version of commons in its pom.xml (so that it doesn't conflict with the previous artifact in Artifactory), change the dependency version in the poms of my other 3 services, and push all 4 projects (commons first, so that the new build is available in Artifactory, and then the other 3).
Is there a better way to manage this? I would have preferred it if my 3 services were able to fetch the latest commons version and use it in their poms.
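For illustration, a minimal sketch of how such a dependency is typically declared in each service's pom.xml; the coordinates and the version property are hypothetical placeholders, not taken from the question:

    <!-- Hypothetical coordinates and property name, for illustration only. -->
    <properties>
      <commons.version>1.4.2</commons.version>
    </properties>

    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>commons</artifactId>
        <version>${commons.version}</version>
      </dependency>
    </dependencies>

With the version held in a single property, the manual bump described above reduces to one edit per service, and a CI job could script that edit with the Versions Maven Plugin, e.g. mvn versions:set-property -Dproperty=commons.version -DnewVersion=1.4.3.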

This is currently supported in Reliza Hub (disclaimer: I'm developing the project).
The workflow to get the latest release is documented here (see workflow 2, "Get Latest Release Of A Project Or Product").
The idea is the following:
You define a project for your shared library and configure GitLab CI to automatically stream build metadata to Reliza Hub on every build using the Reliza Client.
Automatic versioning can also be maintained via Reliza Hub (meaning that the Hub increments versions for you on every build based on your chosen versioning schema); you need to use the getversion command of the Reliza Client for that.
You can then use these automatic version increments to update the version in your pom.xml at build time, so that part of the process is fully automated.
Once that is done, in the CI pipeline of each of the 3 dependent services you include a call to Reliza Hub using the getlatestrelease command of the Reliza Client for your shared library. This call returns all the metadata for the latest release of the shared library, including its version.
You can then plug this version into the pom files of your dependent services.
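As a rough sketch of what that could look like in one of the dependent services' pipelines (the image name, flags, variable names and JSON field below are assumptions for illustration, not verbatim Reliza syntax, so check the Reliza Hub documentation for the exact invocation):

    # Hypothetical .gitlab-ci.yml fragment for one dependent service.
    stages: [prepare, build]

    get-commons-version:
      stage: prepare
      image: relizaio/reliza-cli          # assumed image name
      script:
        # Ask Reliza Hub for the latest release of the shared library (placeholder flags).
        - |
          COMMONS_VERSION=$(reliza-cli getlatestrelease \
              -i "$RELIZA_API_ID" -k "$RELIZA_API_KEY" \
              --project "$COMMONS_PROJECT_ID" --branch master | jq -r '.version')
        - echo "COMMONS_VERSION=$COMMONS_VERSION" >> build.env
      artifacts:
        reports:
          dotenv: build.env               # exposes COMMONS_VERSION to later jobs

    build:
      stage: build
      image: maven:3-jdk-11
      script:
        # Plug the resolved version into this service's pom.xml, then build.
        - mvn versions:set-property -Dproperty=commons.version -DnewVersion="$COMMONS_VERSION"
        - mvn -B package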
Hope this helps.

Related

Apache Karaf auto-adding library to jre.properties during build?

I have a library that relies on exporting the sun.reflect package from jre.properties. During testing I have been adding this manually. What can I do to ensure it is added automatically within Apache Karaf?
Changes to etc/jre.properties require a container restart. If you are deploying this Karaf instance inside a Linux container (e.g. Docker), you would simply include this change as part of the container image build.
However, if you are deploying into a virtual machine environment, you'd want to make this part of your organization's custom build of Karaf. I suggest using a Maven project with the Assembly plugin to apply all of your organization's changes (LDAP, security, SSL certs, etc/jre.properties, and so on). It would then create a new .tar.gz or .zip file, and you would deploy your app into that modified Karaf instance.
There is an example in the HYTE Runtime build here:
HYTE Runtime
Technically, you could leverage the feature deployment mechanism to deploy an updated file, but this won't cause the Karaf instance to restart.
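For the container case mentioned above, a minimal sketch of how the edited file could be baked into the image; the base image tag and the Karaf install path are assumptions, so adjust them to your own image layout:

    # Assumed base image and install path -- adjust to your own Karaf image.
    FROM apache/karaf:4.4.5

    # Ship a pre-edited etc/jre.properties that already lists sun.reflect among
    # the exported JRE packages, so no manual edit is needed after startup.
    COPY etc/jre.properties /opt/apache-karaf/etc/jre.properties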

Reusing Mule connectors and validation flows

How can I reuse Mule code (flows, exception strategies, database connectors, validators) across several projects? These are application-specific reusable artifacts, not enterprise-wide reuse.
For example: I have some master code (validators, flows, and exception strategies) which should be reused in 15 different flows (i.e. 15 different Mule projects). We are not using Maven at the moment. One way I explored is to jar the master code, publish it to a local Nexus repo, and reuse it via the pom. Is there any other way?
If possible, I would also like to make it dynamic, so that if I change the master code and deploy it, the change takes effect without having to redeploy the projects that use it.
You can reuse flows etc. (everything that lives in Mule XML files) and Java classes by placing them in a plain Java project, building a jar from it, and placing that jar on the classpath of the importing Mule projects.
To use the artifacts defined in the XML files, import those files into your Mule configuration (in Mule 3 this is typically done with a Spring import of the XML resource).
Your question sounds like you already know this part.
I recommend building all Mule projects and the so-called master project with Maven: the Mule projects with packaging mule, the master project with packaging jar.
Maven will package the master artifact inside the projects that use it, so there is no dynamic update.
If you want the dynamic update, either don't build with Maven or set the dependency scope to "provided" (see the sketch below). In that case the master jar is not packaged into the other Mule projects; you have to make sure it is on the server classpath, e.g. in lib/user. Then you can replace it there, restart the Mule server, and all projects pick up the update.
The same, with another level of indirection and a way of grouping, can be done with Mule domains.
All the dynamic behavior described so far works only for on-premise Mule servers, not for CloudHub.
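A minimal sketch of that provided-scope declaration in each Mule project's pom.xml; the coordinates are hypothetical placeholders:

    <!-- Hypothetical coordinates; "provided" keeps the master jar out of the packaged
         Mule app, so the copy dropped into the server's lib/user is used instead. -->
    <dependency>
      <groupId>com.example.mule</groupId>
      <artifactId>master-commons</artifactId>
      <version>1.0.0</version>
      <scope>provided</scope>
    </dependency>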

gradle wrapper downloaded multiple times for different projects

I have got micro-services where each service is a separate project. In the gradle wrapper properties files, I have defined the same distribution URL:
distributionUrl=http://nexus-server.com/nexus/content/repositories/software/gradle/2.14.1/gradle-2.14.1-bin.zip
Our Nexus was migrated to a different server, therefore I had to update the Gradle wrapper properties file to use the new server address. I have 2 questions:
1. Is there a way I could tell Gradle not to download the distribution from scratch, since (in principle) the distribution is the same and only the server address has changed?
2. Even if the Gradle wrapper downloads the distribution from the new Nexus server, why does it repeat the download for each micro-service project instead of reusing the one downloaded for the first project (after it was built)?

Continuous integration and deployment with jBPM

We are using jBPM with EAP 6.4. We are developing the jBPM workflows and rules using the Business Central console.
We want to implement continuous integration in our project. How can we implement CI if we use the Business Central console for our changes? Normally Jenkins (or another build server) listens to the repository server for changes; as soon as a developer pushes changes to the repository, it triggers the build and deploy.
But in our case we develop everything using the console. How do we achieve CI in this special case, and is there any recommended approach to implementing CI in jBPM?
Can you please suggest?
Thanks
Just a pointer,
The workbench (Business Central console) points to a Git repository by default, so essentially, when you interact with the workbench, you are doing something very similar to the normal scenario in which a developer commits to a Git repository.
User documentation jBPM - VFS repository
Hope this gives a direction.
Repositories in Business Central use a virtual file system based on Git, and each time you save something in one of the editors a commit is made. You can create a Git hook on your repository which triggers some action after each commit.
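A minimal sketch of such a hook, assuming a Jenkins job with the "Trigger builds remotely" option enabled; the Jenkins URL, job name, token and the exact hook location inside the workbench's Git repository are placeholders:

    #!/bin/sh
    # post-commit hook for the Business Central Git repository (the exact repository
    # path depends on your workbench installation). Jenkins URL, job name and token
    # are placeholders for a job with "Trigger builds remotely" enabled.
    curl -fsS "https://jenkins.example.com/job/jbpm-kjar-build/build?token=MY_TRIGGER_TOKEN"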

How to combine 3 standalone Java applications and run them in parallel with Maven in Spring

How can I combine 3 standalone Java applications and run them in parallel with Maven in Spring?
The three other standalone applications run on different databases, and I want to make use of these three in my main standalone app.
What Maven settings do I need, and what are the best Spring components to use?
Any kind of answer is appreciated.
Thanks in Advance.
About Maven
I recommend you use a repository manager (Nexus, Artifactory, ...). You can manually upload your applications to that repository as jars (I assume these three apps are not built with Maven, but it could be worthwhile to migrate them; you should evaluate that).
You have to configure your new app's pom.xml and settings.xml to get access to this repository. Then you can add these dependencies to your new app's pom.xml, and after that you can use your applications' classes in your new app.
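A minimal sketch of that wiring, with a hypothetical repository URL and artifact coordinates:

    <!-- In the new app's pom.xml; URL and coordinates are placeholders for your
         own Nexus/Artifactory repository and the three uploaded jars. -->
    <repositories>
      <repository>
        <id>internal-repo</id>
        <url>https://nexus.example.com/repository/releases/</url>
      </repository>
    </repositories>

    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>standalone-app-one</artifactId>
        <version>1.0.0</version>
      </dependency>
      <!-- ...the same for the other two applications... -->
    </dependencies>

If the repository requires credentials, a <server> entry with the same id goes into settings.xml.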
About Spring
The Spring Framework has a lot of things that could help you in your development (like dependency injection, JdbcTemplate, and much more).
I really recommend you read the documentation (the index alone will give you an idea), and evaluate which pieces could help you.
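For instance, a small sketch of the JdbcTemplate idea mentioned above, with a hypothetical DataSource pointing at one of the three databases (driver, URL, credentials and query are placeholders):

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;

    public class AppOneClient {

        private final JdbcTemplate jdbcTemplate;

        public AppOneClient() {
            // Hypothetical connection details for one of the three databases.
            DriverManagerDataSource dataSource = new DriverManagerDataSource(
                    "jdbc:postgresql://db-one.example.com:5432/appone", "user", "secret");
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        public int countOrders() {
            // Placeholder query; JdbcTemplate handles connections and result mapping.
            return jdbcTemplate.queryForObject("SELECT COUNT(*) FROM orders", Integer.class);
        }
    }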
