Maven snapshot dependency

There are two projects, P-m and P-d. They are separate Jenkins jobs that can be built independently. P-m depends on P-d, and the dependency is on a snapshot version.
Recently an issue started occurring when building P-m: it complains that it cannot download the P-d jar from the remote repository, failing with this error: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException
Although the error suggests a possible issue with the JVM's certificates, that does not seem to be the case, because other jars from the same remote repository can be downloaded successfully.
If I manually build P-d first (so the local repository has a P-d snapshot jar and the remote repository has the deployed, timestamped P-d jar) and then build P-m, it works fine, because it does not try to download the P-d jar from the remote repository.
But a few days later, without any change to P-d, when P-m is built (this time without P-d being built manually first), it somehow triggers a packaging of P-d, which I could not figure out. In this case the local repository has the P-d snapshot jar from last time, and the remote repository has a newly deployed timestamped P-d jar. Maven then tries to download this new timestamped P-d jar from the remote repository and fails with the error: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException
Please help with the following questions (not sure if it matters: the Maven version used is 3.6.3):
Is Maven supposed to build the dependency on the fly, or just use the latest one in the repositories? For example, P-m depends on P-d. Will P-m simply use the latest version of P-d in the repositories (and complain that P-d cannot be found if the jar is not available), or will it build P-d each time P-m is built? What decides which behavior applies? I observed the second case (each time P-m is built, a new timestamped P-d jar is deployed to the remote repository without a new P-d snapshot jar being installed in the local repository) and I could not figure out why: what makes it generate a new P-d jar rather than use the latest existing one?
Why can the newly generated P-d jar not be downloaded, failing with "PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException"? Any clue what could be wrong? As mentioned earlier, all other jars can be downloaded successfully from the same remote repository; only this newly generated P-d jar fails, so it does not look like a JVM certificate issue on the build machine.

The error indicates problems with SSL certificates.
I guess the remote repository is accessed via HTTPS, and it may have changed its certificate, so the client no longer trusts the server and refuses to communicate. It need not even be the remote repository server itself: your organization setting up a proxy server that intercepts SSL connections is sufficient for this error to occur.
You describe that the problem does not exist if the projects are built locally; that is exactly when the repository server does not have to be contacted, which supports the theory above.
Check a tutorial like https://resources.weboffice.vertigis.com/Documentation/WebOffice102/EN/howto_install_certs_to_truststore.htm (there are many out there) for how to add your repository's public certificate to the truststore of the JVM that runs Maven.
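As a rough sketch (the host name and alias are placeholders, and the cacerts location differs between Java 8 and newer JDKs), importing the repository's certificate usually boils down to:

# fetch the server certificate (replace the host with your repository's host)
openssl s_client -connect repo.example.com:443 -showcerts </dev/null | openssl x509 -outform PEM > repo.pem
# import it into the JVM truststore Maven runs with (default password is "changeit")
keytool -importcert -alias my-repo -file repo.pem -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit

Alternatively, a separate truststore can be handed to Maven via MAVEN_OPTS="-Djavax.net.ssl.trustStore=/path/to/truststore.jks".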
But to answer your questions:
Maven is not supposed to build dependencies automatically unless they are subprojects of the current project (i.e. a multi-module project). It will try to use the already compiled dependencies from the repositories, preferring the local one (which also acts as a cache) over the remote ones.
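For illustration (coordinates are placeholders), the only setup in which Maven would build P-d as part of a P-m build is a reactor build driven by an aggregator POM along these lines:

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>    <!-- placeholder coordinates -->
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>P-d</module>   <!-- the reactor orders P-d before P-m because P-m depends on it -->
    <module>P-m</module>
  </modules>
</project>

If the Jenkins job for P-m runs Maven against a standalone P-m POM instead, P-d is only resolved from the local or remote repository, never rebuilt.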
Did you verify that Maven still downloads other dependencies? After all, they get cached, so unless you clear the cache or change a dependency (name/version) there is no need to download them again - and you mention this problem only started recently. To clean the cache, check out https://stackoverflow.com/a/22671261/4222206
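If you do need to force things, a couple of standard commands (shown here as a sketch) are safer than deleting ~/.m2/repository wholesale:

# force Maven to re-check the remote repositories for updated snapshots
mvn -U clean install
# remove this project's dependencies from the local repository and re-resolve them
mvn dependency:purge-local-repository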

Related

Force download of same-named artifact from alternative Nexus repository

I have a problem with a Maven artifact from a predecessor.
He modified an external Maven library and uploaded it under the same version to our Nexus releases repository.
When I build my project I obviously get the official version and not the intended one.
Locally I can just overwrite the artifact in my local .m2 repository.
But for our CI/CD server this is not an option, because I have no file system access there, and because the situation might repeat itself.
There is no source code, so rebuilding and re-versioning would be cumbersome.
My question is: how can I force this artifact to be downloaded from the releases repository rather than the default central repository, while all other dependencies still come from the latter?

Multi-Module local jar dependencies - Jenkins Pipeline

I need to build a Java project with Maven. I am working on a multi-module Maven project that is built by a Jenkins Pipeline and resolves its dependencies from a Nexus repository. A few libraries are not available in the Nexus repository, and I can't upload them manually. I am building this project on the pipeline.
What I did:
I created a folder named jars in the project root of the GitHub repository and manually put in it all the jar files that are not available on Nexus. In the POM I referenced all of these local jars in the dependency entries.
In the repositories section I gave the URL of the GitHub repo's jar folder. Jenkins was not able to pick up the libraries, and I get the following error: dependency: dependency version - Build Error - Could not build for non-released dependencies. I get this error for every jar in the jars folder. I tried putting the jars folder in src/main/resources, but I still get the same error.
How can I reference this jar folder so that the Jenkins Pipeline can use it? I don't have control over Jenkins or the scripts involved; I am a developer just building the project on the pipeline.
P.S.: I don't have internet access at my company, so I cannot post the POM or the build failure output.
Adding more details:
It's built on the pipeline. There are two repositories: Nexus 2 and Nexus 3. The particular libraries are not available on Nexus 3, and the pipeline builds only against Nexus 3.
We have raised a request to upload those libraries, but it's not going to happen anytime soon. The Jenkins Pipeline takes its files from the GitHub repository and builds the Java project using Maven. I don't have control over the pipeline or any of the scripts in Jenkins.
We downloaded all the libraries that are not available and put them in a folder on GitHub. There are four cycles in the pipeline: GitHub cycle / Jenkins cycle / deployment cycle / release cycle.
GitHub cycle: this cycle has three stages. It takes the code from GitHub and builds it, then builds the snapshot and uploads it to the Nexus repo. In the first two stages it successfully takes the code from GitHub, builds it, and generates the artifact. The third stage is the strange part: it builds again, and the build fails there, citing non-released dependencies for the jars that were uploaded to GitHub.
What might be the reason that it can build in the first two stages of the GitHub cycle but fails in the third stage with a build failure for non-released dependencies?
The pipeline is designed in such a way that it looks only at Nexus 3 and builds during each phase of the cycle.
"In the repositories, I gave the URL of the GitHub repo of the jar folder."
That does not work because your lib folder is not a valid Maven repository.
"How can I reference this jar folder so that the Jenkins Pipeline can use it?"
You have some options:
1. Set up a custom Maven repository manager. You can use Nexus Repository Manager, JFrog Artifactory, or something else. This gives you the greatest flexibility and allows you to do a lot more in the future. The downside is that you will need the infrastructure to run it, which usually comes with some maintenance cost.
2. Install the bundles into Maven's local repository from the jar folder you already have. There are two ways to do that:
   - via a script in the Jenkins Pipeline that runs before your build and calls mvn install:install-file for each library in your jar folder; you can find the exact syntax of this command on the Apache Maven Install Plugin site, and a sketch of it follows this list;
   - by changing your build and calling the install-file goal of the maven-install-plugin in an earlier build phase (also sketched below). I've personally never done that, but this answer suggests it's possible.
3. Remove the files from the jar folder and create a wrapper project for each of them which does nothing but install the jar into the local Maven repository. Make sure those are the first modules to run in your multi-module project.
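To make the second option concrete, here is a sketch of both variants; the coordinates, file names, and the chosen phase are placeholders that would need to match your actual libraries. The command-line form, run once per jar before the build, is:

mvn install:install-file -Dfile=jars/some-library-1.0.jar \
    -DgroupId=com.example -DartifactId=some-library \
    -Dversion=1.0 -Dpackaging=jar

The in-build form binds the same goal to an early phase in the POM; whether the jars end up installed early enough for dependency resolution in your pipeline is something to verify, as noted above:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <executions>
    <execution>
      <id>install-some-library</id>
      <phase>validate</phase>  <!-- hypothetical choice of an early phase -->
      <goals>
        <goal>install-file</goal>
      </goals>
      <configuration>
        <file>${project.basedir}/jars/some-library-1.0.jar</file>
        <groupId>com.example</groupId>
        <artifactId>some-library</artifactId>
        <version>1.0</version>
        <packaging>jar</packaging>
      </configuration>
    </execution>
  </executions>
</plugin>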

Maven - Why Does it Keep Redownloading Dependencies?

When I add a new Maven dependency that I've never used before, I run a Maven build and see the dependencies being downloaded from Nexus to my local machine. All is good.
I will then create another project, specify the same dependency with the same version, do a Maven build, and I will again see the dependencies being downloaded from Nexus into my local machine.
Why are my dependencies re-downloaded every time? Aren't these dependencies already installed in my local repository?
Maven will NOT download artifacts repeatedly. The only exceptions are if you delete your local repository (in ~/.m2/repository by default), if you configure a different local repository, or if a new SNAPSHOT version is available.
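How often Maven re-checks for new SNAPSHOT versions is governed by the repository's updatePolicy; a sketch of the relevant configuration (id and URL are placeholders, usable in the pom.xml repositories section or a settings.xml profile):

<repository>
  <id>nexus</id>
  <url>https://nexus.example.com/repository/maven-public/</url>
  <snapshots>
    <enabled>true</enabled>
    <updatePolicy>daily</updatePolicy>  <!-- default; other values: always, never, interval:X (minutes) -->
  </snapshots>
</repository>

Running mvn -U forces such a check regardless of the policy.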

Maven repository server for caching

I'm trying to understand some concepts about Maven. This is my scenario:
Almost every time I deploy a project, e.g. CloudStack, I type:
mvn install
I get some failures, such as being unable to connect to some repositories, or some tests just failing. I don't understand why tests can fail if the code has only just been downloaded.
My idea is to set up a local repository server, so Maven won't connect to remote servers but to a server on the same network that holds the packages the application needs.
Is it possible to do that, or are these problems caused by something else?
See Maven, Introduction to Repositories:
There are strictly only two types of repositories: local and remote. The local repository refers to a copy on your own installation that is a cache of the remote downloads, and also contains the temporary build artifacts that you have not yet released.
Remote repositories refer to any other type of repository, [...]. These repositories might be a truly remote repository set up by a third party to provide their artifacts for downloading [...]. Other "remote" repositories may be internal repositories set up on a file or HTTP server within your company, used to share private artifacts between development teams and for releases.
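The caching server the question describes is exactly such an internal remote repository, usually a repository manager (Nexus, Artifactory, ...) acting as a proxy. On the client side it is typically wired in through settings.xml, for example as a mirror (the URL is a placeholder):

<mirrors>
  <mirror>
    <id>internal-repo</id>
    <mirrorOf>*</mirrorOf>  <!-- route all repository requests through the internal server -->
    <url>https://nexus.example.com/repository/maven-public/</url>
  </mirror>
</mirrors>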
mvn install does not deploy your project's artifact (at least not in the sense of Maven's deploy, see Introduction to the Build Lifecycle). It does:
install the package into the local repository, for use as a dependency in other projects locally
whereas mvn deploy is:
done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects
"Unable to connect to some repositories" and "some tests just failed" are two different kinds of errors. It's impossible to tell more without the relevant part(s) of the build output where these occurred.
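If you want to rule the test failures out while investigating the connectivity problem, test execution can be skipped temporarily; this is a standard Maven flag rather than anything project-specific:

# build and install without running the tests (they are still compiled)
mvn install -DskipTests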

Maven install local usage when using a repository manager

I am lacking some basic understanding of using a repository manager for our projects. What I don't know is whether, if I use a repository manager and run a local install command, Maven deploys the package to something like a shared Nexus instance. I seem to have some confusion between local repositories and shared ones when using a repository manager.
Apologies for the naivety and for not testing this myself. We have started versioning our application and using a shared-file-system approach to distributing artifacts, and we are left with a few questions about what, within the scope of what we are currently doing, would be gained by using a repository manager instead. We use TeamCity as a build server, which deploys to that file system. I need answers to a few questions before doing a PoC of a repository manager.
install is specific to the local repository.
From Maven's point of view whether a remote repository is hosted by your repository manager or is completely external has no relevance - when you're adding your artifact to any kind of remote repository, you need to use the deploy plugin (or release for non-trivial deployments).
Repository managers usually generate instructions on configuring your projects for deployment to a hosted repo.
Maven defines a lifecycle (clean, compile, install, deploy, ...). There are default mappings for these, so when you execute "mvn install" Maven knows which plugins to execute for that phase.
The introduction page gives a good overview of what happens in each phase and what the default plugins are: https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
In your case: mvn install will copy the artifacts into the local Maven repository to be shared by other projects you have locally.
If you want to share artifacts with other developers in other locations, "mvn deploy" will copy the artifacts to the remote repository. Note that you need to configure the distributionManagement section in the pom.xml to be able to do that.
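A minimal sketch of that section (ids and URLs are placeholders; each id must match a server entry with credentials in settings.xml):

<distributionManagement>
  <repository>
    <id>nexus-releases</id>
    <url>https://nexus.example.com/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>nexus-snapshots</id>
    <url>https://nexus.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>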
The normal Maven setup looks like this:
project -> local repository -> private remote repository -> public remote repository
Project: In the simplest case your project consists of source files and a configuration file (pom.xml). The project may depend on third-party libraries like JUnit. The jar files of the libraries are not stored in your project directory, only the information about which jars are needed.
mvn package
This command creates a jar out of your project and places it in the target/ folder of your project.
Local Repository: This is a Maven repository stored locally on your machine. It normally resides in ~/.m2/repository/. Every dependency you use in your project will be stored in this repository. When compiling your project, Maven will use the jar files from this location.
mvn install
This command creates a jar file and copies it to your local repository: ~/.m2/repository/groupId/artifactId/version/project.jar. Now you can use this jar as a dependency in different independent projects, but only on your machine.
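For example, another local project would then declare it like any other dependency (the coordinates shown are placeholders matching the installed path):

<dependency>
  <groupId>groupId</groupId>
  <artifactId>artifactId</artifactId>
  <version>version</version>
</dependency>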
Private Remote Repository: Most of the time this is a Nexus instance in your company network. This server allows you to share built projects across developers. Your TeamCity server builds the jar and copies it to your Nexus server. Beyond this, the Nexus server works like a proxy: e.g. a developer needs junit-4.1.1.jar, so the server looks for it in public remote repositories and caches it.
mvn deploy
This command builds a jar and sends it to your Nexus server (your private remote repository). After that, every developer inside your company network can access the jar.
Public Remote Repository: These are repositories available on the internet which contain several jar files, e.g. maven.codehaus.org
Summary:
If you call mvn compile, Maven looks for the dependencies in your local repository. If Maven can't find them, it will ask the (private/public) remote repositories and copy the files to the local repository.
You should not synchronize a local repository over a network, since this type of repository is not intended for such use and may break in some obscure way.
What you need is mvn deploy, which copies the final package to the remote repository for sharing with other developers and projects.
The mvn install you tried will just build and install the project into your local ~/.m2 repository. It will not publish the artifacts to the Nexus repository you have configured.
Both install and deploy are valid build phases, meaning each of them executes all previous phases. Please refer to the Maven docs below for more details.
From the Maven documentation:
the default Maven lifecycle has the following build phases (for a complete list of the build phases, refer to the Lifecycle Reference):
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR.
integration-test - process and deploy the package if necessary into an environment where integration tests can be run
verify - run any checks to verify the package is valid and meets quality criteria
install - install the package into the local repository, for use as a dependency in other projects locally
deploy - done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
