How do you clear Apache Maven's cache? - maven

Recently, Apache Maven seems to be having caching issues. Performing clean installs on our projects under Windows Vista or Windows 7 sometimes produces artifacts with the same data as a previous build, even though the newer artifact's files should have been updated.
Is there any way to clear this cache to force Maven to always trigger a clean build of the local artifact that should be built?
In particular, we're having issues building a webapp with the war plugin. The Maven version is 3.0.3; the war plugin version is 2.1.1.

Delete the artifacts (or the full local repo) from c:\Users\<username>\.m2\repository by hand.

To clean the local cache, try using the dependency plugin.
mvn dependency:purge-local-repository: This attempts to delete the local repository files, but it then goes and fills the local repository back up again after things have been removed.
mvn dependency:purge-local-repository -DreResolve=false: This avoids re-resolving the dependencies, but it still seems to go to the network at times.
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false: This was added by Paweł Prażak and seems to work well. I'd use the third if you want the local repo emptied, and the first if you just want to throw out the local repo and get the dependencies again.

I would do the following:
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false --fail-at-end
The flags tell Maven not to try to re-resolve dependencies or hit the network; it just deletes what it finds locally.
And for good measure, --fail-at-end defers any errors until the very end. This is sometimes useful for projects that have a somewhat messed-up set of dependencies or rely on a somewhat messed-up internal repository (it happens).

Have you checked/changed the updatePolicy setting for your repositories in your settings.xml?
This element specifies how often updates should attempt to occur. Maven will compare the local POM's timestamp (stored in a repository's maven-metadata file) to the remote. The choices are: always, daily (default), interval:X (where X is an integer in minutes) or never.
Try setting it to always.
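For reference, a minimal sketch of what that could look like in settings.xml (the profile id and repository URL here are placeholders, not taken from the question):
<settings>
  <profiles>
    <profile>
      <id>always-update</id> <!-- hypothetical profile id -->
      <repositories>
        <repository>
          <id>central</id>
          <url>https://repo.maven.apache.org/maven2</url> <!-- placeholder URL -->
          <releases>
            <updatePolicy>always</updatePolicy>
          </releases>
          <snapshots>
            <updatePolicy>always</updatePolicy>
          </snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>always-update</activeProfile>
  </activeProfiles>
</settings>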

Use mvn dependency:purge-local-repository -DactTransitively=false -Dskip=true if you have Maven plugins as one of the modules. Otherwise, Maven will try to recompile them, downloading the dependencies again.

This works in Spring Tool Suite v3.1.0.RELEASE, but I'm guessing it's also available in plain Eclipse.
After deleting the artifacts by hand (as stated by palacsint above) in the <username>/.m2 directory, re-index the files by doing the following:
Go to:
Window->Preferences->Maven->User Settings menu.
Click the Reindex button next to the Local Repository text box. Click "Apply", then "OK", and you're done.

As some answers have pointed out, sometimes you really want to delete the local repository entirely; for example, there might be some artifacts that can't be purged because they are no longer referenced by the POM.
If you want this deletion embedded in a Maven phase, such as clean, you can use the maven-clean-plugin and reference the repository through the settings, for example:
<plugin>
  <inherited>false</inherited>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.1</version>
  <executions>
    <execution>
      <phase>clean</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <echo>Base clean is attached to deleting local maven cache</echo>
          <echo>${settings.localRepository}</echo>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <inherited>false</inherited>
  <artifactId>maven-clean-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <filesets>
      <fileset>
        <directory>${settings.localRepository}</directory>
      </fileset>
    </filesets>
  </configuration>
</plugin>
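With this configuration in place, the project's normal clean invocation also empties the directory that ${settings.localRepository} points to, so every subsequent build re-downloads all dependencies:
mvn clean
(The deletion comes from the fileset above, not from any extra flag.)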

I've had this same problem, and I wrote a one-liner in shell to do it.
rm -rf $(mvn help:evaluate -Dexpression=settings.localRepository \
    -Dorg.slf4j.simpleLogger.defaultLogLevel=WARN -B \
    -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn | grep -vF '[INFO]')/*
I did it as a one-liner because I wanted to have a Jenkins-project to simply run this whenever I needed, so I wouldn't have to log on to stuff, etc.
If you allow yourself a shell-script for it, you can write it cleaner:
#!/usr/bin/env bash
REPOSITORY=$(mvn help:evaluate \
-Dexpression=settings.localRepository \
-Dorg.slf4j.simpleLogger.defaultLogLevel=WARN \
-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn \
--batch-mode \
| grep -vF '[INFO]')
rm -rf "${REPOSITORY}"/*
This should work, but I have not tested the whole script (I've tested the first command, just not the full thing). The approach has the downside of running a large, complicated command first, but that command only reads the configuration, so you can try it out by itself. The deletion is a separate command afterwards, which lets you check that the evaluation does what you think it does before anything is removed; you shouldn't trust deletion commands without verification. However, it is smart for one good reason: it's portable, because it respects your settings.xml file. If you run this command and tell Maven to use a specific settings file (the -s or --settings argument), it will still work, so you don't have to fiddle with keeping everything the same everywhere.
It's a bit unwieldy, but it's a decent way of doing business, IMO.

There are a few commands you can use for cleaning:
1. mvn clean
2. mvn clean install
3. mvn clean install -Pclean-database (if your project defines such a profile)
Deleting the repository folder from .m2 can also help.

Related

Separate local download and install repositories using Maven?

I want to rebuild my project structure from scratch from time to time and want to purge the built repository in order to do that. However, I don't want to remove downloaded files from Maven Central and other repositories. Is there a simple way to tell Maven to install my built artifacts into a separate repository, i.e. other than the one used to store downloaded, external files?
I am NOT talking about deploy, just mvn install.
UPDATE
I found an alternate solution using only one local repository for both downloaded and self-built artifacts: the self-built ones are accompanied by files called "maven-metadata-local.xml", so I now select the repository directories to purge based on the existence of that file...
You cannot do that with the install goal. maven-install-plugin will install the artifact to the same local repository that downloaded artifacts are fetched into. By default, this is ${user.home}/.m2/repository. You can change that by setting the system property maven.repo.local to another location (or by telling Maven to use a specific settings.xml). However, at the moment, Maven can't be configured to install specific artifacts to a different local repository than the one it fetches downloaded artifacts into.
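For example, pointing a single build at a throwaway local repository (the path here is only an illustration):
mvn clean install -Dmaven.repo.local=/tmp/throwaway-repo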
A possible work-around would be to declare an execution of the install-file goal, bound to the install phase, configured to install all of the artifacts you want into the specified local repository.
<plugin>
  <artifactId>maven-install-plugin</artifactId>
  <version>2.5.2</version>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>install-file</goal>
      </goals>
      <configuration>
        <file><!-- path to artifact to install --></file>
        <pomFile><!-- path to POM of artifact --></pomFile>
        <localRepositoryPath><!-- path to repository you want to install to --></localRepositoryPath>
      </configuration>
    </execution>
  </executions>
</plugin>
MINSTALL-126 is an enhancement request about whether this could be added to maven-install-plugin. In the meantime, see the following workaround, slightly extending what's proposed above, from a blog post I wrote at http://blog2.vorburger.ch/2016/06/maven-install-into-additional.html with some background about why this would be useful:
<profiles>
  <profile>
    <activation>
      <property>
        <name>addInstallRepositoryPath</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <executions>
            <execution>
              <id>additional-install</id>
              <phase>install</phase>
              <goals>
                <goal>install-file</goal>
              </goals>
              <configuration>
                <file>${project.build.directory}/${project.build.finalName}.jar</file>
                <localRepositoryPath>${addInstallRepositoryPath}</localRepositoryPath>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
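Since the profile is activated by the presence of the addInstallRepositoryPath property, a build that should also install into the extra repository would be invoked roughly like this (the path is a placeholder):
mvn clean install -DaddInstallRepositoryPath=/path/to/additional/repo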
To formalize and expand on the “update” in the question (by the way you should not hesitate to answer your own question):
I came to a similar conclusion independently and include
find -L ~/.m2/repository \( -type d -name '*-SNAPSHOT' -prune -o -type f -name maven-metadata-local.xml \) -exec rm -rfv {} \;
in a general “cleanup” script that I run from time to time. Note that this differs from the ideal of install:install always going to a separate location in (at least) three ways:
You have to remember to run this script, so in the meantime you could have a local repository "polluted" with things you built. On occasion this will mean that a build will work locally for you but will not work for others. (Or even fail only for you, or succeed for everyone but behave subtly differently.) This defeats the goal of reproducible builds, unless you have plenty of Internet bandwidth and are willing to run docker run --rm -v "$PWD":/usr/src/mymaven -w /usr/src/mymaven maven mvn clean install!
If someone has intentionally deployed SNAPSHOTs to a shared repository, this script will delete them, so your next build will have to repeat the download.
Local installs of release versions are not deleted. Now if these came from release:perform because you were the one cutting the release, that is not so bad—presumably the remote artifact is identical to your local copy anyway. Where this gets really evil is if, in the course of trying to debug some problem in someone else’s released artifact by rebuilding from sources with some diagnostic patches (say), you forget to edit pom.xml to use a SNAPSHOT or other distinguishing version, and install the result. Maven will never notice that your local copy differs from the official version, and you can get into weird situations months later. Of course this has never happened to me.
The latter two problems could perhaps be addressed with a more complicated script that parsed maven-metadata-*.xml files rather than assuming that all, and only, SNAPSHOTs were local builds. Or as the submitter hints at, just delete the whole version directory if maven-metadata-local.xml is present (distinguishing this somehow from the parent artifact directory, which will also have such a file, and resolver-status.properties too).
While it is nice that Maven 3 records some information about where artifacts in the local repository came from, it is not good enough. What I for one would really appreciate is if install:install always saved to a distinct location, so that the main local repository could be trusted to be purely a cache of downloads. Local artifact resolution would then prefer one or the other repository in case of conflict based on a command-line switch (after issuing a warning).

How can I add a 3rd party JAR to my Travis-CI maven build?

I have a project that uses a JAR that is not in any Maven repository; I built it myself.
Before building my project, I run this in my console:
mvn install:install-file -Dfile=myownjar-1.5.jar -DgroupId=com.cmabreu -DartifactId=mylocal-lib -Dversion=1.5 -Dpackaging=jar -DgeneratePom=true
which adds the JAR to my local Maven repository.
Then I add the required dependency tag to my POM file and build my project.
But when I commit to GitHub, I do not push my custom JAR (it is another project).
The question is: how can I tell Travis CI to build my project using this custom JAR without sending it to GitHub?
Not a recommended solution but a very useful workaround:
Make a directory inside the project's home. Let's call it
$projectBaseDir/lib
Put all your external jars in this folder.
In the POM file, add scope and systemPath to your dependency as follows (see the complete dependency sketch after these steps):
<scope>system</scope>
<systemPath>${project.basedir}/lib/yourJar.jar</systemPath>
Push this lib/ directory to your project repo on GitHub.
Travis builds work fine with this.
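For completeness, a minimal sketch of the full dependency entry (the coordinates and jar name are placeholders, not taken from the question):
<dependency>
  <groupId>com.example</groupId> <!-- placeholder coordinates -->
  <artifactId>your-lib</artifactId>
  <version>1.0</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/lib/yourJar.jar</systemPath>
</dependency>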
If the jars are not present locally, then you need to add a before_install script to your repo which basically does this:
mkdir lib/
wget -P "lib/" http://urlForYourJar.jar
and it works great again.
I had a very similar problem and after some research, I did the following.
First of all, add a before_install command such as:
before_install:
- wget -P somewhere/ http://some.url/awesome.jar
- mvn validate
The mvn validate line is very important, since we need to explicitly install the third-party jar via the maven-install-plugin during that phase.
The second step is to modify your pom.xml file as follows:
<build>
  ....
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-install-plugin</artifactId>
      <version>2.5.1</version>
      <configuration>
        <groupId>your.group</groupId>
        <artifactId>your_artifact</artifactId>
        <version>some.version</version>
        <packaging>jar</packaging>
        <file>${basedir}/match_this_with_wget/awesome.jar</file>
      </configuration>
      <executions>
        <execution>
          <id>install-jar-lib</id>
          <goals>
            <goal>install-file</goal>
          </goals>
          <phase>validate</phase>
        </execution>
      </executions>
    </plugin>
    ....
  </plugins>
</build>
In your local checkout, you can put the jar file in the desired place and use .gitignore to avoid committing it to the git repo.
It is also very easy to replace the wget command with other download commands such as git clone. However, you may need to put the jar file within your repository if you install it with maven-install. When you are executing the before_install commands on Travis CI, your pwd should be the repo root.
If your company has a private repository (Artifactory/Nexus), you may want to deploy the jar there (deploy:deploy-file). This would be a one-time manual step; then you wouldn't need to install it into the local repo on every build.
If that's not your case, there is no way to get it into the local repo without checking it into your source control.
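A rough sketch of that one-time upload, reusing the coordinates from the question (the repositoryId and url are placeholders and must match a <server> entry in your settings.xml):
mvn deploy:deploy-file \
    -Dfile=myownjar-1.5.jar \
    -DgroupId=com.cmabreu \
    -DartifactId=mylocal-lib \
    -Dversion=1.5 \
    -Dpackaging=jar \
    -DrepositoryId=my-internal-repo \
    -Durl=https://nexus.example.com/repository/releases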
You can get Travis to run a custom build script. If you can wget the JAR or if you've checked it into your repo, you can run that mvn install:install-file command yourself before running your tests.

How to run a Maven plugin execution only if the resulting output is not already present

I have inherited a POM that attempts to avoid repeating build steps by using a profile
that is only activated when the step output does not exist:
<profile>
  <id>run-once</id>
  <activation>
    <file>
      <missing>target/some-output</missing>
    </file>
  </activation>
  <build>
    <plugins>
      <plugin>
        ...
        <executions>
          <execution>
            ... slow process to produce target/some-output ...
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
However, as Maven experts no doubt realized immediately, this does not work if the developer says mvn clean install. Maven calculates the active profiles once, before running clean, and if target/some-output was present, then the run-once profile is not active. The result is that target/some-output is removed by the clean phase but is not recreated in the install phase, and the ensuing WAR is broken because some-output is missing.
Is there a standard solution to this problem (besides avoiding mvn clean install)? I'm about to make the plugin unconditional to prevent the silent creation of a broken WAR.
More generally, is there a standard technique to prevent Maven from recreating artifacts like some-output that are up to date? Or is the idea that if make-style dependency management is important, one should use Gradle or Rake instead of Maven?
I don't think there is a standard solution to this problem. There are, though, various options that I can think of (there are most likely others as well):
you could obviously activate the profile manually with mvn clean install -Prun-once, but then you have to remember to do that each time, of course
configure the maven-enforcer-plugin together with its requireFilesExist rule to make sure the files exist and fail the build if they don't (at least then you won't get the silent creation of a broken WAR); see the sketch after this list
modify the profile to have it create the files in a location under your src folder (e.g. src/main/gen) which is excluded from being checked into your source repository (if you are using one), and then configure the maven-resources-plugin and its copy-resources goal to copy these resources to the correct location under your build directory. This way clean won't delete them.
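A minimal sketch of the enforcer option, assuming the target/some-output path from the question (the plugin version, execution id, and phase are placeholders to adjust):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version> <!-- placeholder version -->
  <executions>
    <execution>
      <id>check-generated-output</id>
      <phase>prepare-package</phase> <!-- bind late enough that the slow step has had a chance to run -->
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireFilesExist>
            <files>
              <file>${project.build.directory}/some-output</file>
            </files>
          </requireFilesExist>
        </rules>
        <fail>true</fail>
      </configuration>
    </execution>
  </executions>
</plugin>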

Maven build fails at site default-deploy

I have been struggling to fix my Maven build issue for the last 2 days with almost no success. Can you please help me with this?
I have a parent pom.xml which looks like:
<distributionManagement>
  <!-- repository config here... -->
  <!-- snapshot repository config here... -->
  <site>
    <!-- site config here... -->
  </site>
</distributionManagement>
The child pom.xml that I wrote works fine if I do 'mvn install': the tar file is created and appears in the project/target folder. Looks good so far...
The problem comes when I do a release. The good thing is, it goes well until the end - it creates the tar and uploads it into my SVN repository - but after that Maven tries to read the parent pom.xml, an error occurs while running "maven-site-plugin:default-deploy", and then "BUILD FAILURE".
What I'm thinking is: since the tar is created and uploaded into the Subversion repository, creating and deploying the site is not required for us. How can I tell Maven that once the tar is created it shouldn't do anything else, and that's the end point for me? In other words, don't run any 'site'-related stuff for me?
=========================
UPDATE
I have my release plugin config as below
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <tagBase></tagBase>
  </configuration>
</plugin>
We actually do the release from our batch file, which consists of mvn statements like the ones below:
call mvn clean
call mvn install
call mvn -B release:prepare -DdryRun=true -DscmCommentPrefix="somecomment"
call mvn -B release:clean
call mvn -B release:prepare -DscmCommentPrefix="somecomment"
call mvn -B release:perform -DscmCommentPrefix="somecomment"
Can you please advise?
You should change the maven-release-plugin configuration not to do a site-deploy (which is the default), like the following:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.4.1</version>
  <configuration>
    <goals>deploy</goals>
  </configuration>
</plugin>
Or, if creating and deploying the site is really not required (as you say in your question), you could just remove the <site> element from the <distributionManagement> section. According to the docs, the default goals are "either deploy or deploy site-deploy, if the project has a <distributionManagement>/<site> element".
http://maven.apache.org/maven-release/maven-release-plugin/perform-mojo.html

How to build EAR subproject and deploy it with Jenkins?

My Maven project has a bunch of subprojects like this:
proj/
projEAR/
projCommon/
How can I compile and build the EAR project + deploy it to my web server at the same time?
The way I do it now is:
proj$ mvn clean install
[... builds everything ... ]
proj$ cd projEAR
projEAR$ mvn weblogic:deploy
[... deploys the EAR file ... ]
I'd like to do this with one command. Something like
proj$ mvn clean install projEAR/pom.xml weblogic:deploy
This fails of course, but I hope you get the idea...
Update:
The reason for all this is that Jenkins only accepts one POM file and command. So the problem is really how to configure Jenkins to run Maven twice.
How about the weblogic-deployer-plugin for Jenkins? It will deploy your EAR file to a WebLogic instance. See the WebLogic Deployer Plugin.
Quick and easy workaround
As a workaround, I can advise you to use some Jenkins plugins, like "M2 Extra Steps". It allows you to perform extra actions before or after the build; these are often used after a build to perform things like generating docs or deploying something.
I know this works well ... because I often use this trick :)
Suggestion, never tried
At this moment, I don't have a straight answer; I don't really know how to do it in only one Maven command. What I would try is to attach the weblogic deploy goal to the install phase.
ear submodule --> pom.xml
<build>
  [...]
  <plugins>
    [...]
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>weblogic-maven-plugin</artifactId>
      <version>2.9.1</version>
      <executions>
        <execution>
          <phase>install</phase>
          <configuration>
            [...]
          </configuration>
          <goals>
            <goal>deploy</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    [...]
  </plugins>
  [...]
</build>
It should work, but once again, I never tried it.
Don't hesitate to give feedback.
I couldn't get it to work with Maven alone, but the way I solved it (in Jenkins) was:
Create a pre-build step in Jenkins with the command mvn clean install using the parent pom: proj/pom.xml
Configure the main build as weblogic:deploy using projEAR/pom.xml.
This results in two commands being run: first mvn clean install, followed by mvn weblogic:deploy.
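For reference, the equivalent two invocations from a shell (the -f flag points Maven at a specific POM; the paths match the project layout above):
mvn -f proj/pom.xml clean install
mvn -f projEAR/pom.xml weblogic:deploy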
