I am using a local Maven repository to house some code I am using to develop a project. I have cited this repository in my project.clj file and am now able to rely on local jars this way (how to do this is covered in a previous question of mine).
Since I am actively developing these projects, my project.clj file looks for the LATEST version. But in order to update a dependency, I still have to increment that dependency's version number and then run lein install to publish it to the local Maven repository.
Does Leiningen have a way to do this automatically for me when I build the project that depends on artifacts from the Maven repo? Can lein just look for those dependencies and rebuild them as needed?
If you want to develop two projects in parallel, with one depending on the other, you can use symlinks in a checkouts directory to avoid having to install snapshots all the time.
To quote from the Leiningen README:
Q: I want to hack two projects in parallel, but it's annoying to switch between them.
A: Use a feature called checkout dependencies. If you create a directory called checkouts in your project root and symlink some other project roots into it, Leiningen will allow you to hack on them in parallel. That means changes in the dependency will be visible in the main project without having to go through the whole install/switch-projects/deps/restart-repl cycle. Note that this is not a replacement for listing the project in :dependencies; it simply supplements that for tighter change cycles.
Are your dependencies snapshot versions? Maven should update all *-SNAPSHOT dependencies automatically on each build.
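On the Maven side, how eagerly snapshots are re-checked is governed by the repository's snapshot update policy. As a minimal sketch (the repository id and file URL below are placeholders for a local repository like the one in the question), a pom.xml entry that forces a re-check on every build could look like this:
<repositories>
  <repository>
    <id>local-dev-repo</id>                  <!-- placeholder id -->
    <url>file:///path/to/local/repo</url>    <!-- placeholder URL -->
    <snapshots>
      <enabled>true</enabled>
      <updatePolicy>always</updatePolicy>    <!-- re-resolve -SNAPSHOT artifacts on every build -->
    </snapshots>
  </repository>
</repositories>
Leiningen resolves dependencies through the same repository machinery and exposes a comparable per-repository update setting in project.clj.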
Related
I have a set of applications, all use Maven and the local repository. The applications form a dependency tree using <dependency> in their pom.xml. All of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source changed?
I do not want to create a multi-module project, because:
the projects really are standalone libraries, not modules;
I do not want the additional complexity just to impose a build structure that is already precisely defined;
I want the process to be dynamic: once a library is mature enough to be put into a remote repository, it is no longer rebuilt with the main project, and that's fine.
For now, there is a lot of refactoring, moving code from one library to another, etc., and it often happens that substantial parts of the dependency tree need to be rebuilt. I therefore have to run mvn install manually in several projects to make sure there is no stale code.
No, it doesn't work that way. Even with a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation of this in Maven 2, but it was not continued in 3.x; see How to get maven 3.0 to only build modules with local scm changes
I hoped they would include it again in Maven 4, but I haven't seen it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once did a similar setup, but had to use shell scripts with some git magic to get it working.
You can also decide to put your libraries in separate repos from the start, and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
Once you run mvn install on a particular Maven project, it becomes accessible to all other Maven projects on the same workstation during dependency collection (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries as part of the same project (or make it a multi-module project). But once you want to share those libraries with your teammates, you will need either to have them install the libraries locally (as you did), or to store the libraries in some external repository, like Artifactory or Nexus.
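For the external-repository route, publishing usually comes down to adding a distributionManagement section and running mvn deploy. A rough sketch, where the repository ids and the Nexus-style URLs are placeholders rather than real coordinates:
<distributionManagement>
  <repository>
    <id>company-releases</id>                                    <!-- placeholder id; must match a <server> entry in settings.xml if authentication is needed -->
    <url>https://nexus.example.com/repository/releases/</url>    <!-- placeholder URL -->
  </repository>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://nexus.example.com/repository/snapshots/</url>
  </snapshotRepository>
</distributionManagement>
With that in place, teammates only need the repository listed under <repositories> (or in their settings.xml) to resolve the shared libraries.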
Our project has a requirement where we want to build only the modules that have changed; the remaining ones should be taken from the local or remote Maven repository. Is there any way to do this?
The ideal solution would be for Maven to detect any changes to modules from SCM (such as SVN), build only those, and pick up the rest from the repository.
We want to do this because we have many modules and compiling them takes a lot of time, so this would save us a lot of time.
I have an .m2 repository on my Jenkins slave which is growing every day; currently it is nearly 40 GB.
Since I have multiple jobs running and picking up dependencies from .m2, I cannot remove everything, but I can see that each artifact directory inside .m2 contains older, useless versions of the artifact.
Is there any way in Maven so that when a job triggers mvn install, Maven keeps only the latest version of every artifact in the .m2 repo (for example, with incremental versioning x.y.z.w)?
If you don't care that external dependencies are pulled in every build, you could use a private Maven repository per job (Maven -> Advanced -> Check 'use private Maven repository') and clean the workspace at the start of your build. The private repository creates a .repository in your workspace, so cleaning your workspace will ensure you start with an empty repository.
Should you have many shared external dependencies, then you may end up using even more disk space, since they are present multiple times in the different repositories. In that case you could write a script that periodically (using a task scheduler like cron) removes unused files from the shared repository, see for example this Stack Overflow answer.
However, be cautious with a shared Maven repository! Maven is not thread-safe by default, so concurrent jobs downloading the same artifact might end up using incomplete downloads. Consider using the Takari extensions to make your Maven repository thread-safe.
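If you go the Takari route, one way to wire it in (a sketch only: it assumes Maven 3.3+ core extensions and the io.takari.aether:takari-local-repository artifact, and the version shown is illustrative, so check the current release) is a core extensions file in each project:
<?xml version="1.0" encoding="UTF-8"?>
<!-- .mvn/extensions.xml: load the Takari concurrent-safe local repository -->
<extensions>
  <extension>
    <groupId>io.takari.aether</groupId>
    <artifactId>takari-local-repository</artifactId>
    <version>0.11.3</version> <!-- illustrative version; verify before use -->
  </extension>
</extensions>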
Having been through a similar problem, I came up with a solution and made it open source as it might help others. The application is available on Github and it can clean up old dependencies and retain just the latest.
https://github.com/techpavan/mvn-repo-cleaner
Apart from cleaning old dependencies, it has other features like date based cleanup based on download date / last accessed date, removing snapshots, sources, javadocs, ignoring or enforcing deletion of specific groups or artifacts.
Additionally, this is cross platform and can run on both Windows and Unix / Linux environments.
I have a multi-module project, i.e.
parent
module1
module2
In one dev cycle, I added a class mod1.A to module1. Class mod2.B in module2 depends on it.
I do not have the artifacts in my local .m2/repository. Running this:
$ cd prj/module2
$ mvn -o exec:java -Dexec.mainClass=mod2.B
results in an error along the lines of:
The following artifacts could not be resolved: com.example:module1:jar:1.0-SNAPSHOT
After I install the artifacts via mvn install while in the prj folder, it all works as expected.
However, this presents an issue in at least two ways:
I have to go through the slower install phase instead of the faster compile phase
I have two copies of the same project with conflicting modifications. I cannot run the same Java class against each copy's modifications, only against whatever is currently installed, since both copies share the same SNAPSHOT version
There are workarounds for both (skipping parts of the build for the first, different snapshot versions for the second), but they are far from usable in practice.
Is there a way to make Maven use the local modules instead of artifacts from the local Maven repository?
If I understand your question correctly, it seems like you are living a bit outside the norm here: you have two local "copies" of the project with different modifications, that you want to work with alternately when running "exec:java". And Maven is getting in your way: it expects your local .m2 repository area to be in play, but the version strings in each copy are the same, so you end up with the changes interfering among the copies.
To me, it sounds like what you are trying to do is to test your changes. I suggest you just write an actual JUnit or TestNG test in module2 that tests what you want (it can just call mod2.B's main if you want). Then, from your chosen project directory, you can run mvn test -Dtest=MyTestName. It won't "install" anything and it will find the dependencies the way you want it to.
Otherwise, I can see three options.
Change the version string locally in one of the copies (mvn versions:set -DnewVersion=B-SNAPSHOT can do this for you). That way any "installed" jars from your work on that copy will not be considered by the other copy, and vice-versa. You refer to this as being "far from usable" ... I think it should be fine? These are different versions of the project! They should have different version strings! I strongly recommend this option out of the three. (You can do mvn versions:revert when done if you used :set, or you can rely on version control to undo the change.)
Select a different local repository used by Maven when working on one of the projects, with a command-line flag as per https://stackoverflow.com/a/7071791/58549. I don't really think this is a good solution, since you would have to be very careful about using the right flags every time with both projects. Also you'd end up having to re-download Maven plugins and any other dependencies into your new local repository anyway, which is kind of a waste of time.
Try to avoid using any local repository at all. You seem to be trying to make this option work. I don't think this is a great approach either; you're fighting against Maven's expectations, and it limits your flexibility a lot. Maven will indeed find dependencies from the "reactor" (i.e., the executing mvn process) first, but this means all of the required modules must be available in the reactor to be found, which means you can only run mvn at the top level. So if instead you want to just do "mvn exec:java" inside a single module, mvn needs to find that module's dependencies somewhere ... and that's what the local repo is generally used for.
If you're dead set on going with option 3 (instead of option 1), then I suggest you follow the comments on your question and create a profile that runs your exec selectively against module2 and binds it to a lifecycle phase. But this is in practice very close to just wrapping it with a test.
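For reference, such a profile might look roughly like the sketch below. The profile id, the verify phase, and the plugin version are assumptions; the main class and module come from the question:
<profiles>
  <profile>
    <id>run-b</id>                               <!-- hypothetical profile name -->
    <build>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>exec-maven-plugin</artifactId>
          <version>3.1.0</version>               <!-- illustrative; pin the release you already use -->
          <executions>
            <execution>
              <phase>verify</phase>              <!-- any phase after the classes are built will do -->
              <goals>
                <goal>java</goal>
              </goals>
              <configuration>
                <mainClass>mod2.B</mainClass>    <!-- the class from the question -->
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Running something like mvn verify -Prun-b -pl module2 -am from the parent directory should then build module1 in the same reactor and run mod2.B without installing anything.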
For IntelliJ users:
I solved this problem using IntelliJ's Run configuration. It has the options Resolve workspace artifacts and Add before launch task -> Build. See this picture for clarification:
Run configuration example
The whole point of modules in Maven is to create decoupling between them. You either build each module independently, so that you can work on one module without touching the other, or include both modules as sub-modules in the parent pom and build the parent, which will resolve dependencies between its sub-modules and trigger their builds.
It looks like you have two options here:
Review the structure of your project. Do you really need to split it into two separate modules, if you change code in both of them simultaneously?
Import the project into a Maven-aware IDE (IntelliJ IDEA is very good at working with Maven), and let the IDE handle the compilation. Once you have finished and stabilized the code base, build normally with Maven.
I am attempting to create a way in which hermetic builds can be achieved while still relying on SNAPSHOT dependencies in your project.
For the purposes of example, say I have a project which has a dependency structure like this:
┌ other-1.2-SNAPSHOT
mine-1.2.3 ──┤
└ thing-3.1-SNAPSHOT ── gizmo-6.1.3-SNAPSHOT
What I would like to do is resolve all the SNAPSHOT dependencies locally to something related to my current version and then deploy those as releases to my Nexus release repository. Not all of these dependencies are internal, so I cannot simply make a release of each.
So, in this example, other-1.2-SNAPSHOT would become something like other-1.2-mine-1.2.3 and thing-3.1-SNAPSHOT would become thing-3.1-mine-1.2.3. This is relatively trivial in about 60 lines of python.
The problem, however, is in resolving transitive SNAPSHOTs to concrete versions. So I also need to convert gizmo-6.1.3-SNAPSHOT to gizmo-6.1.3-mine-1.2.3 and have thing-3.1-mine-1.2.3 depend on it.
This is only an example of one way in which to achieve what I want. The goal is that in a year or two down the road I can checkout my release branch for version 1.2.3 and be able to run mvn clean package or the like without having to worry about resolving long-since-gone SNAPSHOT dependencies.
It's important that this branch be compilable and not just retain all dependencies using something like the jar-with-dependencies functionality of the Assembly Plugin. I'd like to potentially be able to modify the source files and make another release build (e.g., applying a hotfix).
So,
Is there anything like this available that will be able to convert SNAPSHOT dependencies in a recursive fashion to be concrete?
Are there any plugins which manage this kind of thing for you? The release plugin had promise with some configuration options on its branch goal but it doesn't resolve external deps to the degree that I want.
Are other techniques available for creating hermetic Maven builds?
This is not a widely used technique, but you can always check your specific SNAPSHOT dependencies into your project as a "project" repository, as described in this blog post: Maven is to Ant as a Nail Gun is to a Hammer
In short, use the Dependency Plugin to create a repository located in your project directory. The below is copied from the linked blog post (which you should read):
1) Run mvn -Dmdep.useRepositoryLayout=true -Dmdep.copyPom=true dependency:copy-dependencies
"This creates /target/dependencies with a repo-like layout of all your projects dependencies"
2) Copy target/dependencies/ to something like libs/
3) Add a repository declaration like the following to your POM:
<repositories>
  <repository>
    <releases />
    <id>snapshots-I-need-forever</id>
    <name>snapshots-I-need-forever</name>
    <url>file:///${basedir}/libs</url>
  </repository>
</repositories>
You can make this an automated part of your build/release process: step 1 by binding the Dependency Plugin to a lifecycle phase, and step 2 by using the AntRun Plugin to move the downloaded dependencies to the right place.
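A rough sketch of what that automation could look like, with the phase, execution id, and plugin version as assumptions (pointing copy-dependencies straight at libs/ may even make the AntRun move unnecessary):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>3.6.1</version>                               <!-- illustrative; use the release you have -->
      <executions>
        <execution>
          <id>freeze-snapshots</id>                          <!-- hypothetical execution id -->
          <phase>generate-resources</phase>                  <!-- assumed phase; any early phase works -->
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <useRepositoryLayout>true</useRepositoryLayout>  <!-- same as -Dmdep.useRepositoryLayout=true -->
            <copyPom>true</copyPom>                          <!-- same as -Dmdep.copyPom=true -->
            <outputDirectory>${basedir}/libs</outputDirectory> <!-- matches the repository URL above -->
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>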
Hope this works for you. I have to go take a shower now...
The maven versions plugin will do most of what you want.
http://mojo.codehaus.org/versions-maven-plugin/
However, you will almost certainly need to run it in a pre-build step in which you resolve all the dependencies and update the pom file accordingly, then re-run Maven (which re-reads the pom) to run the real build. You might be able to configure everything within the pom itself, triggered with a separate goal, thus avoiding a separate script.
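As a rough sketch of keeping that configuration in the pom (the plugin version is illustrative, and versions:use-releases, which swaps -SNAPSHOT dependencies for their released counterparts, is just one candidate goal for the pre-build pass):
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>versions-maven-plugin</artifactId>
      <version>2.16.2</version>                         <!-- illustrative; use the release you have -->
      <configuration>
        <generateBackupPoms>false</generateBackupPoms>  <!-- skip pom.xml.versionsBackup files -->
      </configuration>
    </plugin>
  </plugins>
</build>
The pre-build step is then something like mvn versions:use-releases followed by the real mvn package, which re-reads the updated pom as described above.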
This works better if you use specific versions instead of SNAPSHOT dependencies and let the pre-build step upgrade them if necessary. The only real difference for dependency resolution is that Maven always re-downloads -SNAPSHOT dependencies, whereas it only downloads normal dependencies if a new version is available. However, many plugins (including the versions plugin) treat -SNAPSHOT dependencies differently, causing problems. Since every CI build has a new version number, I never use -SNAPSHOT, preferring a different qualifier like -DEV with more predictable behaviour for things like local developer builds.
I've spent a lot of time getting Maven to do things like this. Most Maven projects I know have some kind of pre-build step to set version numbers or work around other limitations such as this. Trying to do it all in one step usually fails because Maven only reads the pom once, string substitution doesn't work in a few places, and the deployed/installed pom generally doesn't contain the results of string substitution or changes made during the build.