Upstream Jenkins project is not populating the m2 repository with jars for the downstream projects to use - maven

I have a Jenkins server with a few projects and some dependencies set up between them.
The dependencies trigger correctly (if I trigger the downstream project, the upstream ones build first), but the downstream project fails because it can't find the jar dependencies that the upstream projects should have installed into the .m2 repository.
The upstream projects run 'clean install' and complete successfully, so I'm not sure why the jar files aren't being stored in the repository, or why they aren't being picked up by the downstream project.
Jenkins is set to use the default ~/.m2 repository.
I tried changing the default repository setting, but that didn't help. I would expect the upstream projects to populate the .m2 repository just as they do in my usual workspace, so the downstream project shouldn't complain that it can't find these jar files.

Related

Jenkins: Build whenever a SNAPSHOT dependency is built - with different .m2 location(s)?

In Jenkins, we have configured several Maven jobs. Each job installs its build artifact into one of several local .m2 repositories that differ from the default local .m2 repository (e.g. .m2-dev), specified via a settings.xml that is passed with the -s flag in the job's Maven goals.
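For reference, a minimal sketch of that kind of per-job setup (the file name, path, and repository location are illustrative, not taken from the question):

    <!-- settings-dev.xml: points this job at its own local repository -->
    <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
      <localRepository>/var/jenkins_home/.m2-dev/repository</localRepository>
    </settings>

The job's Maven goals then include the flag, e.g. mvn -s settings-dev.xml clean install.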
Now, when using the "Build whenever a SNAPSHOT dependency is built" option, it seems that dependent builds are only triggered if content changes (.jar files) are detected in the default .m2 repository.
My question: Is it possible to specify an alternative .m2 settings file or repository location for a job that the "Build whenever a SNAPSHOT dependency is built" will be aware of?
Thank you for any hints or alternative suggestions.

Multi-Module local jar dependencies - Jenkins Pipeline

I need to build a Java project with Maven. I am working on a multi-module Maven project that is built by a Jenkins Pipeline and resolves its dependencies from a Nexus repository. A few libraries it needs are not available in the Nexus repository, and I can't upload them manually. I am building this project on a pipeline.
What I did:
I created a folder named jars in the project root of the GitHub repository and manually put into it all the jar files that are not available on Nexus. I then referenced all these local jars in the dependency declarations.
In the repositories section, I gave the URL of the jars folder in the GitHub repo. Jenkins was not able to pick up the libraries, and I get the following error for every jar in the jars folder: dependency: dependency version - Build Error - Could not build for non-released dependencies. I tried putting the jars folder in src/main/resources but still get the same error.
How can I reference this jars folder so that the Jenkins Pipeline can use it? I don't have control over the Jenkins jobs or scripts involved; I am a developer just building on the pipeline.
P.S: I don't have access to the internet at my company to post the POM or Build Failure errors.
Adding more details:
The project is built on the pipeline. There are two repositories, Nexus 2 and Nexus 3. The particular libraries are not available on Nexus 3, and the pipeline resolves builds only against Nexus 3.
We have raised a request to upload those libraries, but it's not going to happen anytime soon. The Jenkins Pipeline takes its files from the GitHub repository and builds the Java project using Maven. I don't have control over the pipeline or any of the scripts in Jenkins.
We downloaded all the unavailable libraries and put them in a folder in GitHub. There are four cycles in the pipeline: GitHub cycle, Jenkins cycle, deployment cycle, and release cycle.
GitHub cycle: this cycle has three stages. In the first two stages it takes the code from GitHub, builds it, generates a snapshot artifact, and uploads it to the Nexus repo; these stages complete successfully. In the third stage, strangely, it builds again and the build fails, citing non-released dependencies for the jars that were uploaded to the GitHub repo.
What might be the reason that it can build in the first two stages of the GitHub cycle but fails in the third stage with a build failure for non-released dependencies?
The pipeline is designed so that it looks only at Nexus 3 during each phase of the cycle.
"In the repositories, I gave the URL of the git hub repo of the jar folder"
That does not work because your lib folder is not a valid Maven repository.
"How can I reference this jar folder so that the Jenkins Pipeline can take it?"
You have some options:
Set up a custom Maven repository manager. You can use Nexus Repository Manager, JFrog Artifactory, or something else. It will give you the greatest flexibility and allow you to do a lot more in the future. The downside is that you will need the infrastructure to run it, which usually comes with some sort of maintenance cost.
Install the bundles into Maven's local repository from the jar folder you already have. There are two ways you can do that:
Via a script in the Jenkins Pipeline that runs before your build and calls mvn install:install-file for each library in your jar folder (see the sketch after this list). You can find the exact syntax for this command on the Apache Maven Install Plugin site.
By changing your build and calling the install-file goal of the maven-install-plugin in an earlier build phase. I've personally never done that, but this answer suggests it's possible.
Remove the files from the jar folder and create a wrapper project for each of them which does nothing but install the jar into the local Maven repository. Make sure those are the first modules to run in your multi-module project.
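As a rough sketch of the scripted approach (the coordinates and file name below are made up; take the exact syntax from the Apache Maven Install Plugin documentation), a pre-build step could install each jar like this:

    # install one local jar into the local Maven repository before the build runs
    mvn install:install-file \
        -Dfile=jars/acme-client-1.2.3.jar \
        -DgroupId=com.acme \
        -DartifactId=acme-client \
        -Dversion=1.2.3 \
        -Dpackaging=jar

Repeat (or loop) once per jar in the folder; after that, the normal build can resolve those dependencies from the local repository.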

How to build dependent projects using Bit Bucket pipeline

I am trying to get my build working with Pipelines using Maven. I have two Bitbucket repositories for two Maven projects: repository1 -> project1 and repository2 -> project2. project2 depends on project1. I don't have a problem building project1, since it doesn't depend on any other project, but when I try to build project2 with Pipelines the build fails because Maven can't find the project1 artifact.
I found out that every pipeline runs within a Docker image, so my guess is that the pipelines for project1 and project2 run in two separate containers. Because of this, when I run the pipeline for project2, Maven doesn't find the project1 artifact in the local repository. One way to fix this is to host a remote Maven repository for my project artifacts and add that repository to project2's POM, but I don't want to host a Maven repo. I want Maven to pick up the artifact from the local repo. How can I get this working?
I'm sorry that no one ever answered this question for you. Setting up BitBucket Pipelines to use private Maven repositories requires you to create a custom settings.xml file in the pipeline and then invoke Maven with that file specified.
You can't just put a settings.xml file in your source code repository as it would put your credentials at risk. Instead, you can create a settings.xml file and set credentials from BitBucket Pipelines Secure environment variables.
It's pretty straightforward once you see it in action. I actually wrote an extensive guide on how to completely setup BitBucket Pipelines with Maven repositories that shows specifically how to do this in a secure manner.
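As a sketch of what that looks like (the server id, repository URL, and variable names are assumptions for illustration), the settings file references the secure variables and the pipeline passes it to Maven:

    <!-- maven-settings.xml committed without any credentials -->
    <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
      <servers>
        <server>
          <id>private-repo</id>
          <!-- resolved from BitBucket Pipelines secure variables at build time -->
          <username>${env.MAVEN_REPO_USERNAME}</username>
          <password>${env.MAVEN_REPO_PASSWORD}</password>
        </server>
      </servers>
    </settings>

The pipeline step then runs something like mvn -s maven-settings.xml clean deploy, with the repository whose id matches private-repo declared in the POM.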

How to remove a maven artifact completely from SpringSource?

I have a local maven repository and installed a custom artifact in it. It works if I reference it in other projects. But now I want to use a server as my own maven repository. If I delete the artifact from the local maven repository, I assumed the project would no longer build when I do a maven clean and force-update dependencies. The artifact cannot be found under .m2/, but SpringSource Tool Suite can still add the artifact to new Java projects (Create New Java Project -> Edit Pom -> the maven artifact is added), even though I deleted it from the local repository under .m2/. How is this possible, and how can I delete it completely, so I can test whether all dependencies are now updated from my server using the .m2/settings.xml configuration?
Your repository is just a directory/file structure. Go to your local repo, find the path (the group id is the path), and delete from the place where you start to see version numbers. When you rebuild, the artifact should be downloaded/replaced from your server/repo.
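For example, assuming a group id of com.example and an artifact id of my-artifact (both hypothetical), the cleanup and rebuild would look roughly like:

    # delete the locally installed artifact (the group id maps to the directory path)
    rm -rf ~/.m2/repository/com/example/my-artifact
    # rebuild and force Maven to re-check the remote repositories
    mvn -U clean install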

Maven local install usage when using a repository manager

I am lacking some basic understanding of using a repository manager for our projects. What I don't understand is why, if I use a repository manager and run a local install command, Maven doesn't deploy the package to something like a shared Nexus instance. I seem to be confused about local repositories versus shared ones when using a repository manager.
Apologies for the naivety and for not testing this myself. We have started versioning our application and using a shared-file-system approach for distributing artifacts, and we are left with a few questions about what, within the scope of what we are currently doing, would be gained by using a repository manager instead. We use TeamCity as a build server, which currently deploys to that file system. I need answers to a few questions before doing a proof of concept with a repository manager.
install is specific to the local repository.
From Maven's point of view whether a remote repository is hosted by your repository manager or is completely external has no relevance - when you're adding your artifact to any kind of remote repository, you need to use the deploy plugin (or release for non-trivial deployments).
Repository managers usually generate instructions on configuring your projects for deployment to a hosted repo.
Maven defines lifecycles with phases (clean, compile, install, deploy, ...). There are default plugin bindings for each phase, so when you execute "mvn install" Maven knows which plugins to run for that phase.
The introduction page gives a good overview of what happens in each phase and what the default plugin bindings are: https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
In your case: mvn install will copy the artifacts into the local maven repository to be shared by other projects you have locally.
If you want to share artifacts with other developers at other locations, "mvn deploy" will copy the artifacts to the remote repository. Note that you need to configure the distributionManagement section in the pom.xml to be able to do that.
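A minimal distributionManagement sketch might look like this (the ids and URLs are placeholders for your own repository manager):

    <distributionManagement>
      <repository>
        <id>nexus-releases</id>
        <url>https://nexus.example.com/repository/maven-releases/</url>
      </repository>
      <snapshotRepository>
        <id>nexus-snapshots</id>
        <url>https://nexus.example.com/repository/maven-snapshots/</url>
      </snapshotRepository>
    </distributionManagement>

The credentials for those ids go into the servers section of settings.xml, not into the POM.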
The normal maven setup should look like this:
project -> local repository -> private remote repository -> public remote repository
Project: in the simplest case your project consists of source files and a configuration file (pom.xml). The project may depend on third-party libraries like JUnit. The jar files of those libraries are not stored in your project directory; only the information about which jars are needed is.
mvn package
This command creates a jar out of your project and places it in the target/ folder of your project.
Local Repository: This is a maven repository stored locally on your machine. It normally resides in ~/.m2/repository/. Every dependency you are using in your project will be stored in this repository. On compiling your project, maven will use the jar files from this location.
mvn install
This command creates a jar file and copies it to your local repository: ~/.m2/repository/groupId/artifactId/version/project.jar. Now you can use this jar as a dependency in different independent projects, but only on your machine.
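For example, another project on the same machine could then declare it like any other dependency (the coordinates here are made up):

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>my-library</artifactId>
      <version>1.0.0</version>
    </dependency>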
Private Remote Repository: Most of the time this is a Nexus server in your company network. This server lets you share built artifacts across developers. Your TeamCity server builds the jar and copies it to your Nexus server. Beyond this, the Nexus server works like a proxy: e.g. a developer needs junit-4.1.1.jar, so the server looks for it on public remote repositories and caches it.
mvn deploy
This command builds a jar and sends it to your Nexus server (your private remote repository). After that, every developer inside your company network can access the jar.
Public Remote Repository: These are repositories available on the internet which contain several jar files, e.g. maven.codehaus.org
Summary:
If you call mvn compile maven looks for the dependencies in your local repository. If maven can't find them, it will ask the (private/public) remote repository, and copy the files to the local repository.
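For resolution to reach the private remote repository, it has to be declared, for example in the POM or in a settings.xml profile (the id and URL below are placeholders):

    <repositories>
      <repository>
        <id>company-nexus</id>
        <url>https://nexus.example.com/repository/maven-public/</url>
      </repository>
    </repositories>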
You should not synchronize a local repository over network, since this type of repository is not targeted at such use and may break in some obscure way.
What you need is mvn deploy - it copies the final package to the remote repository for sharing with other developers and projects.
The mvn install you tried will just build and install the project into your local ~/.m2 repository. It will not publish the artifacts to the Nexus repository you have configured.
Both install and deploy are valid build phases, meaning each executes all previous phases. Please refer to the Maven docs below for more detail.
From maven documentation:
the default Maven lifecycle has the following build phases (for a complete list of the build phases, refer to the Lifecycle Reference):
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR.
integration-test - process and deploy the package if necessary into an environment where integration tests can be run
verify - run any checks to verify the package is valid and meets quality criteria
install - install the package into the local repository, for use as a dependency in other projects locally
deploy - done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
