I'm working on a project that imports an internal SNAPSHOT dependency that has been updated without a version change, and I cannot get Maven to pull the latest update.
We have an internal Nexus repository. One of our modules contains much of our data model so that it can be leveraged across multiple applications. This module is at version 1.5.0-SNAPSHOT. It was recently updated, but the version was not incremented; one of the changes moved a method from one class to another. My project calls that method, so this update should break my project's build.
I know for a fact that the module has been updated in Nexus: when I build this project through Jenkins, the build fails because that method was moved. I also pulled the source code from our VCS to verify this directly.
I have searched SO, and here is what I've tried from the solutions provided to similar questions:
mvn -U clean install
Removing the dependency's entries from my .m2/repository folder
mvn dependency:purge-local-repository
Using a new local repository with -Dmaven.repo.local=localrepo
With pretty much all of these, I can see the dependency being downloaded from our Nexus server, and yet my project still builds successfully. I have the feeling there's a setting in my Maven environment that needs to be changed.
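To check which snapshot build Maven actually resolved, I can inspect the dependency tree and the timestamped files in the local repository; a minimal diagnostic, where com.example:datamodel stands in for our actual module coordinates:

mvn -U dependency:tree -Dincludes=com.example:datamodel
ls -l ~/.m2/repository/com/example/datamodel/1.5.0-SNAPSHOT/

If the timestamped jar there is older than the latest deployment in Nexus, Maven is still resolving a stale copy.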
Related
I have a set of applications that all use Maven and the local repository. The applications form a dependency tree via <dependency> entries in their pom.xml files. All of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source changed?
I do not want to create a multi-module project, because:
the projects really are libraries, not modules;
I do not want the additional complexity of a multi-module build just to reproduce a build order that is already precisely defined;
I want the process to be dynamic: once a library is mature enough to be put into a remote repository, it is no longer rebuilt with the main project, and that's fine.
For now there is a lot of refactoring, with code moving from one library to another, and it often happens that substantial parts of the dependency tree need to be rebuilt. I therefore have to run mvn install manually in several projects to ensure there is no stale code.
No, it doesn't work. Even with a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation in Maven 2, but it was not carried over to 3.x; see How to get maven 3.0 to only build modules with local scm changes
I hoped it would be included again in Maven 4, but I haven't seen it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once had a similar setup, but had to use shell scripts with some git magic to get it working, roughly along the lines of the sketch below.
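A minimal sketch of that approach, assuming each library sits in its own git checkout next to the main application (directory names are placeholders):

#!/bin/sh
# Rebuild each library only when its git HEAD moved since the last successful install.
for lib in lib-a lib-b lib-c; do
  head=$(git -C "$lib" rev-parse HEAD)
  marker="$lib/.last-installed"
  if [ ! -f "$marker" ] || [ "$(cat "$marker")" != "$head" ]; then
    (cd "$lib" && mvn -q install) && echo "$head" > "$marker"
  fi
done
# Finally build the main application against the freshly installed SNAPSHOTs.
mvn -q install -f main-app/pom.xml

This only notices committed changes; catching uncommitted edits would need a git status check on top.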
You can also decide to put your libraries in separate repos from the start and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
Once you run mvn install on a particular Maven project, it becomes accessible to all other Maven projects on the same workstation during dependency resolution (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries in the same project (or make it a multi-module project). But once you want to share those libraries with your teammates, you will need either to have them install the libraries locally (as you did) or to store the libraries in an external repository such as Artifactory or Nexus.
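If you go the shared-repository route, each library needs a distributionManagement section so that mvn deploy knows where to publish; a minimal sketch, where the id and URL are placeholders for your own Nexus or Artifactory instance:

<distributionManagement>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://nexus.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>

With that in place, mvn deploy publishes the library for the whole team, while mvn install keeps it local to your workstation.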
My company's admin team has introduced a new restriction on our Nexus repository: it no longer allows updating an existing artifact version. My project has 7 sub-modules (say m1..m7), all with different versions. They are all independent and are kept in the same project to make them easier to manage. Since this restriction was activated, if any module's version already exists in Nexus, the whole build fails with
"Return code is: 400, ReasonPhrase: Repository does not allow updating assets"
I am OK with that artifact not being uploaded to Nexus, but I want the other modules to carry on uploading. If I change one module (say m4), I want m4 to upload with the new version I specified, while the other modules either attempt the upload and fail without breaking the build, or skip the upload automatically after a check that the module's version already exists.
I have searched a lot but couldn't find any way to achieve this short of driving all modules with the same version. I currently use Maven properties to enforce the same version on all submodules, so even when I change only one module, all modules are uploaded to Nexus.
Is there a cleaner way to achieve this, such as ignoring specific errors from Nexus, or checking for an existing version and skipping the upload?
The clean approach is to increase the version (automatically) directly after a release: if you have built version 1.0.0, the version in the POM should be changed to 1.0.1-SNAPSHOT.
This way you avoid most problems arising from accidentally releasing things twice.
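The Versions Maven Plugin can do that bump in one step; a sketch, with the version number purely illustrative:

mvn versions:set -DnewVersion=1.0.1-SNAPSHOT
mvn versions:commit

versions:commit removes the pom.xml.versionsBackup files the plugin leaves behind; the Maven Release Plugin's release:prepare performs the same bump as part of a full release.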
I have an .m2 repository on my Jenkins slave that grows every day; it is currently nearly 40 GB.
Since I have multiple jobs running and picking dependencies from .m2, I cannot simply remove everything, but I can see that each artifact directory in .m2 holds older, unused versions alongside the current one.
Is there any way in Maven so that when a job triggers mvn install, only the latest version of each artifact (with incremental versioning like x.y.z.w) is kept in the .m2 repository?
If you don't mind external dependencies being pulled in on every build, you could use a private Maven repository per job (Maven -> Advanced -> check 'Use private Maven repository') and clean the workspace at the start of your build. The private repository is created as a .repository directory in your workspace, so cleaning the workspace ensures you start with an empty repository.
Should you have many shared external dependencies, however, you may end up using even more disk space, since they will be present multiple times in the different repositories. In that case you could write a script that periodically (via a task scheduler like cron) removes unused files from the shared repository, along the lines of the sketch below; see for example this Stack Overflow answer.
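A minimal sketch of such a cron job, assuming the Jenkins repository lives at /var/lib/jenkins/.m2/repository and the filesystem records access times (both are assumptions to verify):

# Delete artifacts not accessed for 90 days, then prune empty directories.
find /var/lib/jenkins/.m2/repository -type f -atime +90 -delete
find /var/lib/jenkins/.m2/repository -type d -empty -delete

Beware that many servers mount disks with noatime, in which case access times are never refreshed and recently used artifacts could be deleted; there you would have to fall back to -mtime with a generous window.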
However, be cautious with a shared Maven repository! Maven is not thread-safe by default, so concurrent jobs downloading the same artifact might end up using incomplete downloads. Consider using the Takari extensions to make your Maven repository thread-safe.
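For reference, the Takari concurrent-safe local repository ships as an extension; one way to register it is a .mvn/extensions.xml in the project (Maven 3.3+), where the version below is an assumption to be checked against the latest release:

<extensions>
  <extension>
    <groupId>io.takari.aether</groupId>
    <artifactId>takari-local-repository</artifactId>
    <version>0.11.3</version>
  </extension>
</extensions>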
Having been through a similar problem, I came up with a solution and made it open source, as it might help others. The application is available on GitHub and can clean up old dependencies, retaining just the latest:
https://github.com/techpavan/mvn-repo-cleaner
Apart from cleaning old dependencies, it has other features: date-based cleanup by download date or last-accessed date, removing snapshots, sources, and javadocs, and ignoring or forcing deletion of specific groups or artifacts.
It is also cross-platform and runs on both Windows and Unix/Linux environments.
I know how to do this for a remote repository, but not for my local repository, since there is no <repository> entry for the local repository in my settings.xml.
I use snapshot versions for my sub-projects, so when I rebuild the parent project I want Maven to pick up the sub-projects' snapshot versions from my local repository every time, not just once a day (which seems to be the default).
If I'm understanding your comment, I think @FrVaBe may have the correct answer. When you change code for a child project on your development machine, it's up to you to rebuild the snapshot and get it into your local artifact repository (via mvn install) so that it's available to the parent project.
If, however, you want your parent project build to pull in changes made by your teammates and published to the corporate remote repository more often than once per day, read on.
Here is a summary of how Maven Central (and kin), remote repositories (e.g. a company instance of Nexus or Artifactory) and your local repository work together. If you always want the latest snapshots downloaded on every build, go into your settings.xml, find the repository containing the snapshot you want, and set <updatePolicy> to always in its <snapshots> section, as in the sketch below. Personally I rarely do this; I simply add the -U option to my mvn command line when I want to be sure I have the latest version of a snapshot from my remote repo.
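For concreteness, a sketch of such a settings.xml profile, where the ids and URL are placeholders for your corporate repository (remember to activate the profile under <activeProfiles>):

<profile>
  <id>corp</id>
  <repositories>
    <repository>
      <id>company-nexus</id>
      <url>https://nexus.example.com/repository/maven-public/</url>
      <snapshots>
        <enabled>true</enabled>
        <updatePolicy>always</updatePolicy>
      </snapshots>
    </repository>
  </repositories>
</profile>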
There is no update policy for the local repository!
The local repository is just a bunch of files. When you install to your local repository, your local projects reference the artifacts directly. There is no update to be performed, except that your IDE may need a refresh to pick up the newer files.
In this manner you can build local snapshots all day long with no versioning headaches, no updates required, and no old artifacts left hanging around afterwards. Nice and clean, but not so obvious if you're new to Maven and still getting to grips with all these repositories and their fancy update mechanisms.
I think you misunderstood something. Maven will always take the latest/newest SNAPSHOT from your local repository. But with your project setup (Project Inheritance), you need to build the sub-projects on their own if you change something.
An automatic build of the sub-projects only happens with a Project Aggregation layout.
The difference is explained in the Project Inheritance vs Project Aggregation section of the documentation.
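For illustration, aggregation is driven by a <modules> list in a POM of packaging pom, so building it rebuilds the listed sub-projects (names are placeholders):

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>sub-project-a</module>
    <module>sub-project-b</module>
  </modules>
</project>

With pure inheritance (a <parent> element in each child but no <modules> list), building the parent builds only the parent itself.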
I have the following problem. We have a central Maven repository hosted on our company server, and everyone on our team uses it to get the required artifacts. If something needed for the task at hand is missing, the developer installs the artifact manually into the central repository so that his commits don't break the automated builds.
Each developer also has GlassFish v2 installed on his machine for testing and debugging purposes. Before committing changes, the developer builds the project's .ear with Maven's help. However, after deploying the ear to his local GlassFish, errors frequently arise, because the set of GlassFish libraries may not contain all the latest dependencies from the central company repository.
Right now, when an error occurs, the developer simply reads the log to see what exactly is missing, then manually copies the required jar into his local $GLASSFISH_HOME$/lib directory. That is a little frustrating. How can this be done automatically?
We are currently trying the following solution: the developer synchronizes his local Maven repository, gathering all the artifacts from the central one that the project requires, and this local repository is placed on the Java classpath so that GlassFish also sees it. Is that a correct approach? Or maybe there is a way to install all the required artifacts from the central repository directly into $GLASSFISH_HOME$/lib, automatically during deploy?
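One way to script the last idea is the Maven Dependency Plugin, which can copy a project's resolved dependencies into an arbitrary directory; a sketch, treating the scope choice and the GlassFish path as assumptions:

mvn dependency:copy-dependencies -DincludeScope=runtime -DoutputDirectory="$GLASSFISH_HOME/lib"

Run from the project root, this copies every runtime dependency out of the repository into the GlassFish lib directory; binding the goal to a lifecycle phase in the POM would make it happen on every build.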
About having to install dependencies: if the developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies can usually cache public repos; Archiva, for instance, has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying them to your company repo with Maven.
About latest versions: you need to tell Maven which versions of dependencies to use. I would prefer editing my POMs manually, but there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard GlassFish libraries, they should be included, for instance, in your war file as part of your project. If they are not standard but also not part of your project (not the usual approach), consider managing this GlassFish installation as a project of its own (its own git/svn repo, its own POM, its own versions, its own everything).
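A sketch of the packaging-oriented approach: dependencies left at the default compile scope are bundled into the war's WEB-INF/lib, while genuinely server-provided libraries are marked provided so they are not bundled (the first set of coordinates is a placeholder):

<dependency>
  <groupId>com.example</groupId>
  <artifactId>shared-model</artifactId>
  <version>1.5.0-SNAPSHOT</version>
  <!-- default "compile" scope: packaged into WEB-INF/lib -->
</dependency>
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
  <scope>provided</scope>
  <!-- supplied by the container at runtime, so not bundled -->
</dependency>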
Good luck.