Java Spring Boot (Maven) - .jar doesn't package correctly in Jenkins pipeline - spring-boot

Very complex issue here that I cannot figure out... For starters, our corporate Jenkins instance supports both Maven plugin-style jobs and pipeline jobs. There is a heavy push toward pipeline jobs, and I'm creating a pipeline for our Java Spring Boot (Maven) job.
When this job is executed as a Maven plugin-style job, it runs fine and produces a jar that is runnable and has been working for years.
Upon moving this job to a scripted pipeline configuration, it builds, but the resulting jar is about 24 MB smaller than the one from the Maven plugin-style job. Comparing the two with the WinMerge tool, I find that several items are missing from the packaged .jar (see screenshot). I can provide a full list of these if requested, but TL;DR: I've tried several different images for the Jenkins agent, which is a dockerized container agent on the Jenkins instance.
[Screenshot: WinMerge directory comparison of good vs. bad]
I have tried images such as openjdk1_8, maven:3.6.3-openjdk-8, and several others, all with the same outcome: the same missing files. This is beyond my experience level to figure out :/ so I'm at your mercy! Thanks in advance; let me know if you need more info for troubleshooting.
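
For reference, a minimal scripted pipeline of the kind described might look like the sketch below. The stage layout and image tag are illustrative assumptions, not the actual job configuration, and the docker step assumes the Docker Pipeline plugin is installed:

node {
    docker.image('maven:3.6.3-openjdk-8').inside {
        stage('Checkout') {
            checkout scm
        }
        stage('Build') {
            // If the spring-boot-maven-plugin is bound in the pom, the package
            // phase triggers spring-boot:repackage, which produces the executable
            // "fat" jar; invoking only compile or jar:jar would skip that step
            // and yield a much smaller, non-runnable jar.
            sh 'mvn clean package'
        }
        stage('Archive') {
            archiveArtifacts artifacts: 'target/*.jar'
        }
    }
}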

Related

Gradle - Compile submodules in parallel

I have a project with two submodules.
Client - a UI based on Google Web Toolkit
Server - a Spring Boot-based server.
Now, in my Gradle build file for the server, I'm creating a jar file from the client and then including it in the server through the snippet below. Lastly, I create a .war file based on the server config.
dependencies {
    // 'compile' is the pre-Gradle 7 configuration; newer builds use 'implementation'
    compile project(':client')
}
The architecture is similar to Spring Boot's proposed ways of resource handling.
Now, when I run the Gradle build, because the server is dependent on the client, the server compilation doesn't start until the client compilation and tests are done.
I feel that I'm not making use of Gradle's parallelism with this way of compiling client and server.
Is there a way to compile and run the test cases in parallel, and then create the .war file only when both submodules' tasks are complete? How do I access the configurations of the client and server modules and then create a new war file in the rootProject?
You can try adding the --parallel flag to your Gradle command; note that this is still an incubating feature. I noticed a significant improvement in build time when running the Gradle daemon, so you can try that as well.
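For example (the daemon flag is optional on versions where the daemon is already the default):

$ gradle build --parallel --daemon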
No, this level of parallelism is not currently available. I think the team are slowly working towards general parallel task execution, as described in their spec. That should allow for the kind of behaviour you're requesting.
That said, you can run tests in parallel if they're independent, via the maxParallelForks and forkEvery options. MrHaki gives a short how-to for this on his blog. Note that this only applies to a single Test task instance.
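As a sketch, the relevant Test task configuration looks something like this (the fork counts are illustrative, not recommendations):

test {
    // run test classes in up to 4 concurrent JVMs
    maxParallelForks = 4
    // recycle each forked JVM after 100 test classes to cap memory growth
    forkEvery = 100
}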

How to manage Maven settings.xml on a shared Jenkins server?

I have a Jenkins cluster that is shared by several teams. I can configure build jobs on it, but I can't easily make changes to the Jenkins configuration itself.
There is a central Nexus Pro Maven repository manager, but each team/group in this very large multinational has its own repo, and publishing to the repos requires a username/password combination.
This means that I have to configure the Jenkins server with a Maven settings.xml that is unique to the team I am working with, without messing up the Maven configuration of the other users of the Jenkins cluster.
Git is the source control repository.
On a shared Jenkins cluster, how do I configure a Maven settings.xml that is unique to a group of build jobs or to a single job? What are the best practices for handling this type of situation?
I would recommend using the Config File Provider plugin, which provides a UI to edit one or more Maven settings files.
These settings files can be passed into your Maven build using the "-s" option.
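In a pipeline job, that might look like the following sketch (the step comes from the Config File Provider plugin; the fileId is whatever ID you gave the settings file in the plugin's UI, so the value here is an assumption):

configFileProvider([configFile(fileId: 'team-a-maven-settings', variable: 'MAVEN_SETTINGS')]) {
    // -s points Maven at the team-specific settings.xml the plugin provides
    sh 'mvn -s "$MAVEN_SETTINGS" clean deploy'
}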
For each job, you can specify a specific settings.xml path in the Maven Advanced Options section.
We manage all our build nodes using Puppet. It gives you greater control than just settings.xml. Highly recommended.
Puppet is IT automation software that helps system administrators manage infrastructure throughout its lifecycle, from provisioning and configuration to patch management and compliance. Using Puppet, you can easily automate repetitive tasks, quickly deploy critical applications, and proactively manage change, scaling from 10s of servers to 1000s, on-premise or in the cloud.
If your company is using Nexus Pro (as you've already mentioned), then your unique Maven settings.xml can be stored there, and retrieved at build time using the nexus-maven-plugin as described here: http://books.sonatype.com/nexus-book/reference/maven-settings.html
Combined with token-based access (again, Nexus Pro does this), you do not need to store passwords insecurely in the settings.xml (see https://books.sonatype.com/nexus-book/reference/usertoken.html)
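For reference, the book linked above drives this through the nexus-maven-plugin's settings-download goal, invoked roughly as below; the Nexus URL and credentials are supplied as -D properties, whose exact names are documented in the book and may vary by plugin version:

$ mvn nexus:settings-download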
I faced a similar issue when building the project with Jenkins, as the ojdbc jar is not available in the Maven Central repository.
It worked when I placed the ojdbc jar in the WEB-INF/lib folder and removed the Maven dependency from pom.xml.
A good way to automate the provisioning of Maven executors with a specific configuration is to use the ElasticBox Jenkins plugin.
You only need to create a box for the Maven slave that defines all the customization variables and files to be used by it, and choose your preferred cloud provider for deploying it.
ElasticBox also gives you the flexibility to create new slaves only when needed and to automatically destroy them after a specified retention time.
Here is how-to connect your Jenkins with ElasticBox:
https://elasticbox.com/documentation/integrate-with-jenkins/jenkins-elasticbox-setup/#jenkins-configure-plugin
Here is how to automate creation of Jenkins slaves with ElasticBox:
https://elasticbox.com/documentation/integrate-with-jenkins/jenkins-elasticbox-slaves/
There is a blog post about how to easily build and deploy from GitHub pull requests with the ElasticBox Jenkins plugin:
https://elasticbox.com/blog/github-pull-requests-jenkinsplugin/

Jenkins - running jobs based on available packages

I have test packages like
test.regression.packagea
test.regression.packageb
test.regression.packagec
These packages are developed by different people, and I am running them on VMs. I have configured an email notification for successes and failures for each job. People may not be interested in going through the mail of all the failures to find out which of their tests failed. The tests inside the packages extend JUnit, and I am using Maven as the build tool.
So is there a way in which Jenkins can execute each package as a separate job? I could make each package into a different project and configure it as a new job in Jenkins, but that's too tedious, as everyone would have to check in to two locations.
If you're using Maven:
Make a Maven profile for each package you want to test.
Configure the Surefire plugin to only include the packages you want to test in that build using a variable from that profile.
Create a different Jenkins job for each build with one of the three profiles.
Now each Jenkins job will test only the package selected by its profile (see the sketch below).
Note that the include/exclude only works for the test packages; those tests might still exercise code outside of that package.
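As a rough sketch of the steps above: each profile sets the pattern that the Surefire configuration uses in its includes (for example, an include pattern such as test/regression/packagea/**/*.java), and each Jenkins job then activates its own profile. The profile name here is an illustrative assumption:

$ mvn clean test -Ppackagea-tests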

Bamboo doesn't recognize tests in my Spring project

I have a Spring project (Apache CXF, Spring, Hibernate, Maven ...) hosted on Bitbucket, and I'm trying to use Bamboo as my CI server. My idea is to deploy the code directly to Heroku from Bamboo so that deployment is automated.
I made a plan with a couple of tasks to achieve this. First I have a Source Code Checkout task and a builder task. Both of them are working: the code compiles and the tests pass, as I can see in the task log. The problem is that Bamboo doesn't seem to recognize the tests (it marks the task as testless).
I have also tried to create a new JUnit test task, and it's even worse. The log shows that everything is working properly, but Bamboo marks the plan as a failure after the test task is executed.
Any ideas?
Not sure which version of Bamboo you're using, but in the version that we have, you have to turn on unit test result evaluation on the Builder tab. Please see the attached screenshot, and make sure that this is enabled, and the directory setting is pointing to the directory where Maven Surefire creates the test results (in XML format).
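With a standard Maven layout, Surefire writes those XML reports under target/surefire-reports, so the directory setting usually needs to point at a pattern along the lines of **/target/surefire-reports/*.xml (the exact field naming varies by Bamboo version).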

What is the "maven way" for this ant development workflow?

How can Maven be configured to support this type of workflow:
One-time setup: invoke Maven to do one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local PostgreSQL database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration tests: invoke Maven to run integration tests, which should do the following:
Create an integration test db
Set up the db
Run command-line integration tests against the db
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that exercise the RESTful services exposed by the application
Release build: invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the server rather than production
Deposit the end result in a git repo, commit, and push the changes to production
Test build: invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with the test server configuration
The main thing I am struggling with is that Maven has a single build lifecycle with a well-defined sequence of phases; I'm not sure if the workflow I want to build is a good fit for Maven.
Can Maven be configured for this type of workflow? If yes, what are the key features of Maven that allow for the different configurations of the four main ways in which I want to use Maven?
Update: What I mean by this workflow is that I want to be able to do something like
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above examples look like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that
You are following ANT style thinking... if you like that style of thinking then use ANT or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the App deployed, tearing down everything afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way when I am working on a different project (which could be in 10 minutes time) I don't have to worry about background database servers eating up all my laptop's ram.
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to have plugin execution tied to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests.
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I would usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA vs. what is going to Ops.
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean, think "really clean") will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure Java database such as Derby or HSQLDB, and because the artifacts are environment-agnostic it is easy to have the integration-test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.