Force Maven to always fetch artifacts from the target folder

Circumstances
It would be nice to split long-running build jobs, especially in multi-module projects:
1. Compile everything: mvn clean install or mvn clean package
2. Execute unit tests on each module: mvn surefire:test
3. Execute integration tests: Ant and stuff
4. Publish artifacts to a remote repository: mvn my.own.tools:publish-plugin:publish
5. Execute post-build steps (tagging, etc.): build server stuff
In some build environments, like Atlassian Bamboo, each step will likely be executed on a different build agent than the previous one. They can even each have their own local repository. However, it is possible to copy all the files of the working directory to that of a subsequent stage.
Observation
Maven uses the contents of the target folders before it looks in the local repository or a remote repository. This holds when the contents of the target directories were created by an earlier phase of the same run.
Example: mvn clean install surefire:test
If a test has dependencies, Maven will first look in the target directories that were created during the compile phase.
If the command is split into two, it seems that Maven does not recognize the target folders at all.
Example: mvn clean install; mvn surefire:test
Now Maven loads all the dependencies from the local repository or, if they are not there, from a remote repository.
Problem(s)
During step 2, Maven ignores the contents of the target folders that were created during step 1.
Step 2 runs on a different build server than step 1. However, the whole file structure (including the target folders) from step 1 has been copied into the working directory of step 2. The artifacts which the tests depend on are not taken from those directories. Since each build server has its own local repository, they are not found there, and they have never been uploaded to the remote repository.
Making our whole beast more modular and splitting up the project is unfortunately not a possibility.
Question
How can Maven be forced to load dependencies from the existing target folders of a multi-module project under any circumstances? I understand that Maven is not aware of those folders in step 2, since they were not created during that Maven run. But how can I force Maven to check whether those folders exist?
A more elaborate problem ;-)
Our "Publish-Plugin" in step 4 seems to work mostly fine and uploads the artifacts from the target folders. The project information is gathered inside the Mojo via the usual Maven properties. But there are some zip files created during step 1 by org.apache.maven.plugins:maven-assembly-plugin. Those zip files are not found, although they reside inside the target folder. Maven then tries to download them, which is funny, because it does so only to have them at hand for an upload.

Related

Maven deploy current artifact in target folder as it is

For a web application we have a Jenkins pipeline with these steps:
1. Maven build of the back-end (mvn clean install)
2. npm build of the front-end (npm run build)
3. Update the back-end .jar file by including the front-end dist folder inside it (jar -uf ...)
4. Deploy that jar file to our development environment (a Docker container on OpenShift)
This works very nicely for the deployment. The question now is how to keep these artifacts in our repository (Artifactory). If we use the mvn deploy command in step 1, the artifact stored in our repository will be the jar file without the front-end. What I would like is, after step 3, to make a call to Maven that deploys the jar file in the /target folder as it is, without modifying it.
I've seen this other question, but that way I would need to specify many things, such as version, groupId..., which from Jenkins could be difficult; also, all this information is already defined inside the pom.xml file.
Would it be possible to call maven to use the already contained configuration and just perform the upload to Artifactory step?
You can probably just call
mvn deploy:deploy
in the end.
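Note that when deploy:deploy runs in its own Maven invocation like this, outside the package lifecycle, it can fail because no artifact file has been attached yet. If that happens, deploy:deploy-file is a possible fallback; it can read the coordinates from the pom, so nothing has to be repeated on the Jenkins side. A minimal sketch, with an illustrative jar name, repository id, and URL:

# upload the already-modified jar exactly as it sits in target/;
# groupId, artifactId and version are taken from pom.xml via -DpomFile
mvn deploy:deploy-file \
  -DpomFile=pom.xml \
  -Dfile=target/backend.jar \
  -DrepositoryId=artifactory \
  -Durl=https://artifactory.example.com/libs-release-local

The repositoryId has to match a server entry in settings.xml that holds the Artifactory credentials.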

Multi-Module local jar dependencies - Jenkins Pipeline

I need to build a Java project with Maven. I am working on a multi-module Maven project that is built by a Jenkins Pipeline, with dependencies resolved from a Nexus repository. A few libraries are not available in the Nexus repository, and I can't upload them manually. I am building this project on the pipeline.
What I did:
I created a folder named jars in the project root of the GitHub repository and manually put in all the jar files that are not available on Nexus. In the dependencies, I referenced all of these local jars.
In the repositories section, I gave the URL of the jars folder in the GitHub repo. Jenkins was not able to pick up the libraries, and I get the following error for every jar in the jars folder: dependency: dependency version - Build Error - Could not build for non-released dependencies. I tried putting the jars folder in src/main/resources, but I still get the same error.
How can I reference this jars folder so that the Jenkins Pipeline can pick it up? I don't have control over Jenkins or the scripts involved; I am a developer just building on the pipeline.
P.S.: I don't have internet access at my company, so I can't post the POM or the build failure errors.
Adding more details:
It's built on the pipeline. There are two repositories, Nexus 2 and Nexus 3; the particular libraries are not available on Nexus 3, and the pipeline builds only against Nexus 3.
We have raised a request to upload those libraries, but it's not going to happen anytime soon. The Jenkins Pipeline takes its files from the GitHub repository and builds the Java project using Maven. I don't have control over the pipeline or any of the scripts in Jenkins.
We downloaded all the libraries that are not available and put them in a folder on GitHub. There are four cycles in the pipeline: GitHub cycle / Jenkins cycle / Deployment cycle / Release cycle.
GitHub cycle: this cycle has three stages. It takes the code from GitHub and builds it, then builds the snapshot and uploads it to the Nexus repo. In the first two stages it was able to build successfully, taking the code from GitHub, and the artifact was generated. The third stage is really strange: it builds again, and the build fails there, citing non-released dependencies for the jars that were uploaded to GitHub.
What might be the reason for this: when it can build in the first two stages of the GitHub cycle, why does it fail in the third stage with a build failure for non-released dependencies?
The pipeline is designed in such a way that it looks only at Nexus 3 during each phase of the cycle.
"In the repositories, I gave the URL of the git hub repo of the jar folder"
That does not work because your lib folder is not a valid Maven repository.
"How can I reference this jar folder so that the Jenkins Pipeline can take it?"
You have some options:
1. Set up a custom Maven repository manager. You can use Nexus Repository Manager, JFrog Artifactory, or something else. It will give you the greatest flexibility and allow you to do a lot more in the future. The downside is that you will need the infrastructure to run it, which usually comes with some sort of maintenance cost.
2. Install the bundles into Maven's local repo from the jars folder you already have (see the sketch after this list). There are two ways you can do that:
   - Via a script in the Jenkins Pipeline that runs before your build and calls mvn install:install-file for each library in your jars folder. You can find the exact syntax for this command on the Apache Maven Install Plugin site.
   - By changing your build and calling the install-file goal of the maven-install-plugin in an earlier build phase. I've personally never done that, but this answer suggests it's possible.
3. Remove the files from the jars folder and create a wrapper project for each of them which does nothing but install the jar into the local Maven repository. Make sure those are the first modules to run in your multi-module project.
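For the scripted variant of option 2, a minimal sketch of what the pipeline could run for each jar before the build (the coordinates and file name below are made up for illustration):

# install one bundled jar into the agent's local repository
mvn install:install-file \
  -Dfile=jars/acme-core-1.2.3.jar \
  -DgroupId=com.acme \
  -DartifactId=acme-core \
  -Dversion=1.2.3 \
  -Dpackaging=jar

Once this has run, a normal dependency on com.acme:acme-core:1.2.3 resolves from the local repository even though Nexus 3 never sees the artifact.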

Best maven practices for modifying built war-file with generated resources

Currently I'm working on integrating a Grunt build process into our Maven build process.
Here are two options I can think of:
1. Pointing some folder in resources as the target build dir for the Grunt project.
2. Building the Grunt project wherever and listing these non-standard folders as included resources in the pom.xml.
It looks like there is still some room for enhancement. Basically, since the Grunt subproject does not depend on any external resources, it would be nice to learn how not to rebuild a war file that has already been built, but to modify it after rebuilding the Grunt project.
So, the question is:
What are the best practices for adding generated resources to an existing war file?
The approach of adding resources to a war file amounts to modifying a Maven-built artifact after Maven builds it. That runs counter to the Maven philosophy of tightly controlling the entire build of every artifact. You really have three choices:
1. Include the Grunt-generated sources in the war's source and build a single artifact with Maven. In this case you rebuild the war every time the Grunt resources change; or
2. Put the Grunt-generated sources in a second Maven artifact and make that artifact a dependency of the war artifact. Maven will still rebuild the war every time, but you get the separation of builds you seem to be implying in your question; or
3. Make the dependency in (2) a runtime dependency, if possible. You basically make the scope of the Grunt artifact dependency "provided" so you don't have to rebuild the war every time your Grunt artifacts change. You only have to rebuild your Grunt artifact.
It sounds like you want to go with option (3).
A war file is just a zip file with a certain file layout, so you could just add to the archive using a zip tool. For example, the zip command on my Linux platform has the -g option.
-g, --grow
Grow (append to) the specified zip archive, instead of creating a new one. If this operation fails, zip attempts to restore the archive to its original state. If the restoration fails, the archive might become corrupted. This option is ignored when there's no existing archive or when at least one archive member must be updated or deleted.
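As a rough illustration of that approach (the file and folder names are made up), appending Grunt output to an already-built war could look like this:

# run from the directory whose layout mirrors the war's internal structure,
# so the entries are stored at the right paths inside the archive
cd grunt-dist
zip -g -r ../target/mywebapp-1.0.war static/

Keep in mind this happens outside Maven, so the modified war in target/ no longer matches what the install or deploy phases think they built.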

Maven: Change the "test" phase directory from local .m2 to target?

Forgive me if this is remedial, but I am still new to Maven and its functionality.
In my project, when it "builds" and gets to the compile phase, it will create a target directory with the just-compiled libraries and update (or create, if it isn't there) the local .m2 directory.
When I get to the "test" phase, I want it to build against the target directory's library files, and not the local .m2 directory.
Any hints, recommendations, or suggestions would be greatly appreciated. Thanks!
Maven has this concept of “the reactor”, which is just a fancy term for the list of projects being built. At the start of a Maven build, and at the end, Maven prints out this list of projects (using /project/name if defined or groupId:artifactId otherwise).
For each project in the reactor, Maven maintains a list of artifacts that have been attached. By default, each module's pom.xml is attached, and as each plugin runs, they have the option of attaching additional artifacts. Most plugins do not attach artifacts, here are some plugins that do:
jar:jar creates a .jar and attaches it
war:war creates a .war and attaches it
source:jar creates a .jar of the source Java code and attaches it with a classifier of sources
javadoc:jar creates a .jar of the JavaDocs and attaches it with a classifier of javadoc
There is also a default primary artifact (this is the one that gets replaced by jar:jar) which is actually a directory and not a file, as such it will not get installed or deployed to the local repository cache or a remote repository.
So while in the reactor, if a plugin asks for the primary artifact before the plugin that attaches it has run, it will be given the directory ${project.build.outputDirectory}. If it asks after the primary artifact has been attached, it will be given that primary artifact instead.
The test phase happens before the package phase, so it will use the directory and not the .jar. The integration-test phase happens after, so it will always use the .jar.
Things get more complex in a multi-module project (which is where my long intro should help you out).
Maven has to build the test classpath. If one of the dependencies is within the reactor, Maven will use the artifact attached to the reactor. Otherwise it will use the local cache (populating from the remote repositories if necessary).
When you run
mvn test
in a multi-module project from the root, there is no replacement of the default (directory-based) artifact, so the intra-module classpath entries will point to the target/classes directories.
When you run
mvn package
in the same project, however, because each module completes its lifecycle sequentially, all the dependent modules will have swapped in their .jar files as their attached artifacts.
All of this should show you that Maven is doing the sensible thing. Hope this has helped.
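If you want to observe this yourself, the dependency plugin can print the resolved classpath; comparing the two invocations below from the multi-module root should show target/classes directories in the first run and .jar files in the second:

# intra-reactor dependencies resolve to the modules' target/classes
mvn test dependency:build-classpath

# after packaging, the attached .jar files are used instead
mvn package dependency:build-classpath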
The test phase is going to execute tests on your project. The project won't reference itself via the dependency mechanism. Only dependencies will be referenced via your local repository, i.e. .m2/repository
Also, it's not the compile phase that installs the artifact to the local repository; it's the install phase. And then there's a later phase, called deploy, that will deploy the artifact to a remote repository, provided you have a remote repository configured as the deploy target. Note that install and deploy are nearly identical phases, except install is a local-only thing; thus, it's the common build phase to hit when doing dev environment work. Normally the build server will do the deploy stuff.
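To make the phase distinction concrete:

# compiles, tests, packages, then copies the artifact into the local
# cache (~/.m2/repository); nothing leaves the machine
mvn clean install

# everything install does, plus an upload to the remote repository
# configured in distributionManagement
mvn clean deploy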

Maven WAR overlay problems while using Hudson + Artifactory

We have three artifacts:
common.jar: contains common classes.
public.war: depends on common.jar; contains only public site resources.
internal.war: depends on both common.jar and public.war, adding authentication information and security context resource files. Also contains a few administration site classes.
Currently I have structured these in such a way that internal.war overlays itself with public.war.
Building the project locally and installing the artifacts to the local repo works perfectly.
Problems start when trying to get the Hudson builds working with the following sequence:
1. Build all projects in dependency order.
2. Modify common.jar (say, add a new class method).
3. Modify internal.war classes in such a way that they are compile-time dependent on the changes made in step 2.
4. Commit both changes, triggering the Hudson builds.
5. The internal.war build fails because it cannot find the symbols added in step 2.
Somehow the build in step 5 is using an old version of common.jar, and failing because of it.
The common.jar version number does not change; let's say it's 1.0.0-SNAPSHOT for the purposes of this example.
If I DO change the common.jar version number, the build works (presumably because there is then only one release per release version number).
Now, what could cause this use of old artifacts in the Hudson builds?
We are running the Maven builds on Hudson with the command "clean package -e -X -U".
"Deploy artifacts to maven repository" has been checked.
It's hard to definitively answer this without access to the real poms, but here is what I would do:
1) Make sure Hudson is using the exact same version of Maven as you are on your local machine
2) Examine the effective pom.xml of internal.war on the Hudson machine in a terminal via mvn help:effective-pom, making sure you are running the same mvn executable as your Hudson job does. You need to verify the version of common.jar in the effective pom.xml of internal.war. It could be different from what you expect due to profiles or settings.xml differences.
3) Check the settings.xml file for your Hudson install of Maven. In particular you need to verify all is well in your distributionManagement, servers, and repositories stanzas. Another good way to check this is to go to your internal.war project and run mvn help:effective-settings and see if what is there matches what is on your local machine.
Something is awry and it won't take long to find with the right analysis.
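A concrete way to run the comparisons from points 2 and 3 on the Hudson agent (the output file names are arbitrary):

# dump the fully resolved pom and settings the Hudson job actually sees
mvn help:effective-pom -Doutput=effective-pom-hudson.xml
mvn help:effective-settings -Doutput=effective-settings-hudson.xml
# generate the same two files on your local machine, then diff the pairs;
# any difference in the common.jar version or the repositories stanzas
# points at the culprit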
