I want to rebuild my project structure from scratch from time to time, and want to purge the built repository in order to do that. However, I don't want to remove downloaded files from Maven Central and other repositories. Is there a simple way to tell Maven to install my built artifacts into a separate repository, i.e. other than the one used to store downloaded, external files?
I am NOT talking about deploy, just mvn install.
UPDATE
I found an alternate solution using only one local repository for both downloaded and self-built artifacts: the self-built ones are accompanied by files called "maven-metadata-local.xml", so I now select the repository directories to purge based on the existence of that file...
You cannot do that with the install goal. maven-install-plugin will install the artifact to the same local repository that is used for downloaded artifacts. By default, this is ${user.home}/.m2/repository. You can change that by setting the system property maven.repo.local to another location (or by telling Maven to use a specific settings.xml). However, at the moment, Maven can't be configured to install specific artifacts to a different local repository than the one it fetches downloaded artifacts into.
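For illustration, redirecting the entire local repository, downloads included, is as simple as this (the path is just a placeholder):

mvn clean install -Dmaven.repo.local=/tmp/scratch-repo

But that is exactly what you said you don't want, because the external dependencies would then be re-downloaded into the new location.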
A possible work-around would be to declare an execution of the install-file goal, bound to the install phase, and configure it to install each of the artifacts you want into the specified local repository.
<plugin>
  <artifactId>maven-install-plugin</artifactId>
  <version>2.5.2</version>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>install-file</goal>
      </goals>
      <configuration>
        <file><!-- path to artifact to install --></file>
        <pomFile><!-- path to POM of artifact --></pomFile>
        <localRepositoryPath><!-- path to repository you want to install to --></localRepositoryPath>
      </configuration>
    </execution>
  </executions>
</plugin>
There is an open enhancement request, MINSTALL-126, about adding this to the maven-install-plugin. In the meantime, see the following workaround, slightly extending what's proposed above, from a blog post I wrote at http://blog2.vorburger.ch/2016/06/maven-install-into-additional.html which gives some background about why this would be useful:
<profiles>
  <profile>
    <activation>
      <property>
        <name>addInstallRepositoryPath</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <executions>
            <execution>
              <id>additional-install</id>
              <phase>install</phase>
              <goals>
                <goal>install-file</goal>
              </goals>
              <configuration>
                <file>${project.build.directory}/${project.build.finalName}.jar</file>
                <localRepositoryPath>${addInstallRepositoryPath}</localRepositoryPath>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
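With that profile in place, the additional install should be triggered simply by setting the property on the command line (the path below is just a placeholder):

mvn clean install -DaddInstallRepositoryPath=/path/to/extra/repo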
To formalize and expand on the “update” in the question (by the way you should not hesitate to answer your own question):
I came to a similar conclusion independently and include
find -L ~/.m2/repository \( -type d -name '*-SNAPSHOT' -prune -o -type f -name maven-metadata-local.xml \) -exec rm -rfv {} \;
in a general “cleanup” script that I run from time to time. Note that this differs from the ideal of install:install always going to a separate location in (at least) three ways:
You have to remember to run this script, so in the meantime you could have a local repository “polluted” with things you built. On occasion this will mean that a build will work locally for you but will not work for others. (Or even fail only for you, or succeed for everyone but behave subtly differently.) This defeats the goal of reproducible builds, unless you have plenty of Internet bandwidth and are willing to run docker run --rm -v "$PWD":/usr/src/mymaven -w /usr/src/mymaven maven mvn clean install!
If someone has intentionally deployed SNAPSHOTs to a shared repository, this script will delete them, so your next build will have to repeat the download.
Local installs of release versions are not deleted. Now if these came from release:perform because you were the one cutting the release, that is not so bad—presumably the remote artifact is identical to your local copy anyway. Where this gets really evil is if, in the course of trying to debug some problem in someone else’s released artifact by rebuilding from sources with some diagnostic patches (say), you forget to edit pom.xml to use a SNAPSHOT or other distinguishing version, and install the result. Maven will never notice that your local copy differs from the official version, and you can get into weird situations months later. Of course this has never happened to me.
The latter two problems could perhaps be addressed with a more complicated script that parsed maven-metadata-*.xml files rather than assuming that all, and only, SNAPSHOTs were local builds. Or as the submitter hints at, just delete the whole version directory if maven-metadata-local.xml is present (distinguishing this somehow from the parent artifact directory, which will also have such a file, and resolver-status.properties too).
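A rough sketch of that refinement (untested, and assuming that a locally installed version directory contains both maven-metadata-local.xml and a .pom file, whereas the group/artifact parent directory holds only metadata):

find ~/.m2/repository -type f -name maven-metadata-local.xml | while read -r meta; do
  dir=$(dirname "$meta")
  # only delete version directories that actually hold an installed artifact (a .pom);
  # the group/artifact parent directories carry only metadata and are left alone
  if ls "$dir"/*.pom >/dev/null 2>&1; then
    rm -rfv "$dir"
  fi
done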
While it is nice that Maven 3 records some information about where artifacts in the local repository came from, it is not good enough. What I for one would really appreciate is if install:install always saved to a distinct location, so that the main local repository could be trusted to be purely a cache of downloads. Local artifact resolution would then prefer one or the other repository in case of conflict based on a command-line switch (after issuing a warning).
Related
I have inherited a POM that attempts to avoid repeating build steps by using a profile
that is only activated when the step output does not exist:
<profile>
  <id>run-once</id>
  <activation>
    <file>
      <missing>target/some-output</missing>
    </file>
  </activation>
  <build>
    <plugins>
      <plugin>
        ...
        <executions>
          <execution>
            ... slow process to produce target/some-output ...
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
However, as Maven experts no doubt realized immediately, this does not work if the developer runs mvn clean install. Maven calculates the active profiles once, before running clean, so if target/some-output was present, the run-once profile is not activated. The result is that target/some-output is removed by the clean phase but not recreated in the install phase, and the ensuing WAR is broken because some-output is missing.
Is there a standard solution to this problem (besides avoiding mvn clean install)? I'm about to make the plugin execution unconditional to prevent the silent creation of a broken WAR.
More generally, is there a standard technique to prevent mvn from recreating artifacts like some-output that are up-to-date? Or is the idea that if make-style dependency management is important, one should use gradle or rake instead of maven?
I don't think there is a standard solution to this problem. There are, though, various options that I can think of (and most likely others as well):
you could obviously activate the profile manually: mvn clean install -Prun-once, but then you have to remember to do that each time, of course
configure the maven-enforcer-plugin together with its requireFilesExist rule to make sure the files exist and fail the build if they don't (at least then you won't get the silent creation of a broken WAR); see the enforcer sketch after this list
modify the profile to have it create the files in a location under your src folder (e.g. src/main/gen) which is excluded from being checked into your source repository (if you are using one), and then configure the maven-resources-plugin and its copy-resources goal to copy these resources to the correct location under your build directory. This way clean won't delete them; see the copy-resources sketch after this list.
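A minimal sketch of the enforcer option, assuming the WAR expects the output under target/some-output (the path is taken from the question):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-output-exists</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireFilesExist>
            <files>
              <file>${project.build.directory}/some-output</file>
            </files>
          </requireFilesExist>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>

And a sketch of the copy-resources option, where src/main/gen is a hypothetical location for the generated files:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-generated</id>
      <phase>process-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- adjust to wherever the WAR expects some-output -->
        <outputDirectory>${project.build.directory}/some-output</outputDirectory>
        <resources>
          <resource>
            <directory>src/main/gen</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>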
How can I easily remove all stale snapshots from my local repo?
Many files in my repo result in poor performance and long artifact fetch times. I'd like to keep the repo trimmed down to the latest snapshot.
I see that dependency:purge-local-repository can clear the local repository, but I want to keep the latest.
I can easily create a script to do this work (and wrap it in a plugin) but don't want to re-invent the wheel if there's already a tool to purge down to latest.
Is there a plugin that can purge my dependencies to latest-snapshot or last X snapshots?
This is similar to how-do-you-deal-with-maven-3-timestamped-snapshots-efficiently
I added the following, and each build's package phase clears out the old snapshots, keeping my local repository trim and ensuring faster builds on Windows, which has issues with a bulky repo.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <id>remove-old-artifacts</id>
      <phase>package</phase>
      <goals>
        <goal>remove-project-artifact</goal>
      </goals>
      <configuration>
        <!-- remove all versions of the built artifact from the local repository; the install phase will regenerate the current one -->
        <removeAll>true</removeAll>
      </configuration>
    </execution>
  </executions>
</plugin>
With Peter's solution, all old versions of the current artifact (or group of artifacts, if applied in a parent pom) are deleted, including releases and all snapshots.
If any other project needs a previous release version, it will have to be downloaded again from your non-local repository (say, Nexus), and the next time you run the package phase it will be deleted again.
I've seen another option for this issue: http://maven.apache.org/plugins/maven-dependency-plugin/purge-local-repository-mojo.html
It has the option of setting 'snapshotsOnly', which will remove only snapshots, not previous stable releases.
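For example, assuming the snapshotsOnly parameter is exposed as a user property of the same name, something like this should purge snapshots while leaving releases in place:

mvn dependency:purge-local-repository -DsnapshotsOnly=true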
I have a working Tycho build that produces a working p2 repository. My current work flow is to manually drag and drop the results of this build from the project's target dir to the web server that hosts the p2 repository. The results of my tycho build look normal:
${projectBaseDir}/target/repository
- features
- com.my.product.feature.201111071414.jar
- plugins
- com.my.product.plugins
- artifacts.jar
- content.jar
So, what is the "industry standard" for taking the results of this build from the repository directory and placing them on a web server?
In this case, I am running the p2 repository's web server on the same machine that is running the build sever, so a simple copy to a directory command would work.
I've tried the maven-resources-plugin with its resources:copy-resources goal with no luck; I kept getting an error about an invalid output directory. I don't really feel like copy-resources is the way to go here, since the general purpose of that goal is to copy files into the target directory of your Maven build, not to copy files out of it.
My task seems simple, and I realize there are a lot of options to copy files, but I'm looking for the "maven way" or better yet, "the tycho way" of doing this. If such a standard exists.
thanks,
TW
You can achieve what you want with the antrun plugin. It lets you use ant tasks/targets (like copying files) to do things during a Maven build.
I expect something like the following would work for you:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <id>configFix</id>
      <phase>package</phase>
      <configuration>
        <target name="configFix">
          <copy file="${project.build.directory}/p2/some.file" todir="C:\My\Directory" overwrite="true"/>
        </target>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
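If you want the whole p2 repository copied rather than a single file, an Ant fileset inside the same target should do it (the destination path is just a placeholder):

<copy todir="/var/www/p2-repo" overwrite="true">
  <fileset dir="${project.build.directory}/repository"/>
</copy>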
There is an example in Minerva of how to publish the repository.
I just finished a plugin for that: http://download.ralph-schuster.eu/eu.ralph-schuster.uploadfiles-maven-plugin/STABLE/. It even provides ways to execute commands on the repository server, e.g. to delete old files before, or merge artifacts after, the deployment.
Hope it resolves the deploy problem. :)
Recently, Apache Maven seems to be having caching issues. Performing clean installs on our projects using Windows Vista or Windows 7 sometimes produce artifacts with the same data as a previous build even though the newer artifact's files should have been updated.
Is there any way to clear this cache to force maven to always trigger a clean build of the local artifact that should be built?
In particular, we're having issues building a webapp with the war plugin. Maven version is 3.0.3. War plugin version is 2.1.1.
Delete the artifacts (or the full local repo) from c:\Users\<username>\.m2\repository by hand.
To clean the local cache, try using the dependency plug-in.
mvn dependency:purge-local-repository: This attempts to delete the local repository files, but it then goes and fills the local repository back up after things have been removed.
mvn dependency:purge-local-repository -DreResolve=false: This avoids the re-resolving of the dependencies but seems to still go to the network at times.
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false: This was added by Paweł Prażak and seems to work well. I'd use the third if you want the local repo emptied, and the first if you just want to throw out the local repo and get the dependencies again.
I would do the following:
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false --fail-at-end
The flags tell Maven not to try to resolve dependencies or hit the network; it just deletes what is already there locally.
And for good measure, ignore errors (--fail-at-end) till the very end. This is sometimes useful for projects that have a somewhat messed up set of dependencies or rely on a somewhat messed up internal repository (it happens.)
Have you checked/changed the updatePolicy settings for your repositories in your settings.xml?
This element specifies how often updates should attempt to occur. Maven will compare the local POM's timestamp (stored in a repository's maven-metadata file) to the remote. The choices are: always, daily (default), interval:X (where X is an integer in minutes) or never.
Try to set it to always.
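As a sketch, in settings.xml that would look something like this (the profile id and repository coordinates are placeholders; remember to activate the profile):

<profile>
  <id>force-updates</id>
  <repositories>
    <repository>
      <id>central</id>
      <url>https://repo.maven.apache.org/maven2</url>
      <releases>
        <updatePolicy>always</updatePolicy>
      </releases>
      <snapshots>
        <updatePolicy>always</updatePolicy>
      </snapshots>
    </repository>
  </repositories>
</profile>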
Use mvn dependency:purge-local-repository -DactTransitively=false -Dskip=true if you have maven plugins as one of the modules. Otherwise Maven will try to recompile them, thus downloading the dependencies again.
This works on the Spring Tool Suite v 3.1.0.RELEASE, but I'm guessing it's also available in Eclipse.
After deleting the artifacts by hand (as stated by palacsint above) in the /username/.m2 directory, re-index the files by doing the following:
Go to:
Windows->Preferences->Maven->User Settings menu.
Click the Reindex button next to the Local Repository text box. Click "Apply" then "OK" and you're done.
As some answers have pointed out, sometimes you really want to delete the local repository entirely; for example, there might be some artifacts that can't be purged because they are no longer referenced by the pom.
If you want to have this deletion embedded in a Maven phase, such as clean, you can use the maven-clean-plugin and access the repository through the settings, for example:
<plugin>
  <inherited>false</inherited>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.1</version>
  <executions>
    <execution>
      <phase>clean</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <echo>Base clean is attached to deleting local maven cache</echo>
          <echo>${settings.localRepository}</echo>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <inherited>false</inherited>
  <artifactId>maven-clean-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <filesets>
      <fileset>
        <directory>${settings.localRepository}</directory>
      </fileset>
    </filesets>
  </configuration>
</plugin>
I've had this same problem, and I wrote a one-liner in shell to do it.
rm -rf $(mvn help:evaluate -Dexpression=settings.localRepository\
-Dorg.slf4j.simpleLogger.defaultLogLevel=WARN -B \
-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn | grep -vF '[INFO]')/*
I did it as a one-liner because I wanted to have a Jenkins-project to simply run this whenever I needed, so I wouldn't have to log on to stuff, etc.
If you allow yourself a shell-script for it, you can write it cleaner:
#!/usr/bin/env bash
REPOSITORY=$(mvn help:evaluate \
-Dexpression=settings.localRepository \
-Dorg.slf4j.simpleLogger.defaultLogLevel=WARN \
-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn \
--batch-mode \
| grep -vF '[INFO]')
rm -rf $REPOSITORY/*
This should work, but I have only tested the first command, not the whole script. The approach has the downside of running a large, complicated command first, but that command is idempotent, and the deletion is a separate command afterwards, so you can try it out and check that it does what you think it does; you shouldn't trust deletion commands without verification. However, it is smart for one good reason: it's portable, because it respects your settings.xml file. If you run this and tell Maven to use a specific settings file (the -s or --settings argument), it will still work. So you don't have to fiddle with making sure everything is the same everywhere.
It's a bit unwieldy, but it's a decent way of doing business, IMO.
So there are some commands which you can use for cleaning:
1. mvn clean cache
2. mvn clean install
3. mvn clean install -Pclean-database
Also, deleting the repository folder from .m2 can help.
I have a question that's probably pretty similar to this. I need to solve what I have to imagine to be a pretty common problem -- how to configure Maven to produce multiple variations on the same artifact -- but I have yet to find a good solution.
I have a multi-module project, that eventually results in the assembly plugin generating an artifact. However, part of the assembly includes libraries that have changed substantially in the recent past, with the result that some consumers of the project need library version N, while others need version N+1. Ideally, we'd just automatically generate multiple artifacts, e.g. theproject-1.2.3.thelib-1.0.tar.gz, theproject-1.2.3.thelib-1.1.tar.gz, etc. (where that's release 1.2.3 of our project, running against either library version 1.0 or 1.1).
Right now, I have a bunch of default properties, which build against the latest version of the library in question, plus a profile to build against the older version. I can deploy one or the other this way, but cannot deploy both in one build. Here's the key wrinkle that differs from the above question: I can't automate build-one-clean-build-the-other inside of the release plugin.
Normally, we'd mvn release:prepare release:perform from the root of the multi-module project to take care of deploying everything to our internal Nexus. However, in that case, we have to pick one -- either run the old-library profile, or run without and get the new one. I need the release plugin to deploy both. Is this just impossible? I have to imagine we're not the first people who want to have our automated builds generate support for different platforms....
You may install additional artifacts with different types/classifiers. Use the attach-artifact goal of the build-helper-maven-plugin to achieve this. Here is a small example - we are deploying Windows and Unix installers of the product as windows/exe and unix/sh files. These files will be installed to the local repo and deployed to the distribution management.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>install-installation</id>
      <phase>install</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>${basedir}/target/${project.artifactId}-${project.version}-windows.exe</file>
            <classifier>windows</classifier>
            <type>exe</type>
          </artifact>
          <artifact>
            <file>${basedir}/target/${project.artifactId}-${project.version}-unix.sh</file>
            <classifier>unix</classifier>
            <type>sh</type>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
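On the consuming side, other projects can then pull in exactly the variant they need by classifier and type; the coordinates below are placeholders:

<dependency>
  <groupId>com.example</groupId>
  <artifactId>theproject</artifactId>
  <version>1.2.3</version>
  <classifier>windows</classifier>
  <type>exe</type>
</dependency>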
Hope this helps.