Can Maven 3 redownload broken files instead of failing the build?

[WARNING] The POM for org.testng:testng:jar:5.14.10 is invalid,
transitive dependencies (if any) will not be available: 1 problem was
encountered while building the effective model for
org.testng:testng:5.14.10
[FATAL] Non-readable POM
/home/teamcity/.m2/repository/org/sonatype/oss/oss-parent/3/oss-parent-3.pom:
input contained no data #
/home/teamcity/.m2/repository/org/sonatype/oss/oss-parent/3/oss-parent-3.pom
Corrupted files in ~/.m2 are a well-known problem. Fixing them is as easy as removing the corrupted files so Maven can redownload them. However, I don't want to manually grep the logs, connect to the build agent and remove those files by hand. Reliable builds should be capable of dealing with such problems.
Is there any way to make Maven redownload corrupted files instead of failing the build? I don't want to remove ~/.m2 before each build, as that would make the builds really slow.
Why does this happen? One of my customers has a broken infrastructure: virtual machines are restarted very often without any notice, and since builds are running most of the time, files in e.g. ~/.m2 get corrupted. There is nothing I can change in this matter; it's their servers and their policy - or just ineptness. But it's me who has to fix the builds by hand.

As of Maven 3.0.4, there is no way to solve this with a single invocation of Maven.
What you could do is write an aggregator plugin that steps through each of the modules in the reactor and resolves their dependencies via API calls (rather than mojo annotations), allowing you to catch failures, purge the offending files, and retry.
It wouldn't catch every case (for example, plugin dependencies), but if you did something like
$ mvn org.mine.maven:resolve-all:resolve-all || rm -rvf ~/.m2/repository
$ mvn clean verify
It would be more reliable.
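A rough shell-level sketch of automating that purge-and-retry (assuming the "[FATAL] Non-readable POM <path>: input contained no data" log format from the question, and GNU grep/xargs on the build agent):
$ mvn clean verify 2>&1 | tee build.log
$ # if the build failed, delete the POMs Maven itself flagged as non-readable, then retry
$ grep -o '/[^ ]*\.m2/repository/[^ :]*\.pom' build.log | sort -u | xargs -r rm -fv
$ mvn clean verify
This only purges corrupted POMs that are reported in the log (not, say, truncated jars), so it is a band-aid rather than a general fix.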
If you are happy to require Maven 3.x, you could write a build extension and drop it into $MAVEN_HOME/lib; the build extension could do the same tricks as the plugin, but because it is in play before plugin resolution, it could also catch the plugin cases.
That is a lot of work, though. Personally, a good MRM makes redownloading silly fast, and in 8 years of using Maven I have had local repo corruption maybe 3-4 times... Of those, all bar one were cases where I had multiple repositories in play and the metadata (POM) was resolved from one while the artifact came from another... Only one case was the "downloaded HTML by mistake" kind... All of them would have been stopped by an MRM.


Maven -T not working with release plugin
I started to write this as an answer because the comment area is too limited.
The mentioned point 2 must have failed with an error, because -T requires a parameter (Missing argument for option: T).
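For reference, -T expects either a thread count or a per-core multiplier, so a valid parallel invocation looks like:
$ mvn -T 4 clean install    # four build threads
$ mvn -T 1C clean install   # one thread per CPU core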
Furthermore, the given call release:prepare release:perform clean install deploy is simply wrong.
Let us begin with some basics. Combining install and deploy shows that there is a misunderstanding of the Maven lifecycle.
Using install only makes sense if you want to install the artifacts into your local repository ($HOME/.m2/repository) to be consumed by other projects on the same machine, which is usually not the case.
Using deploy (which includes install) uploads the created artifacts to a remote repository (like Nexus or Artifactory), which is the usual setup in corporate environments.
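To illustrate the lifecycle ordering (a simplified sketch for a standard jar project):
$ mvn install   # runs validate ... package, verify, install: copies the artifact into $HOME/.m2/repository
$ mvn deploy    # runs everything up to and including install, then uploads to the repository configured in distributionManagement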
Based on the output I can see that you are using extremely old plugin versions like maven-dependency-plugin:2.1; that version is about ten years old. Furthermore, I see the usage of the sources goal, which resolves the sources of the dependencies, and I would ask: why do you need that?
The mentioned point 1:
mvn deploy -U -T 1C -DskipTests -Dmaven.install.skip=true
This shows that you have not understood the purpose of the install and deploy phases: the install phase is needed to install the artifacts locally, and the deploy phase then transfers them to the remote repository, so it does not make sense to skip the install part (I doubt this will work). Furthermore, using -U only makes sense if you have SNAPSHOT dependencies; otherwise it is a waste of time.
The usage of -DskipTests gives me the impression that you have long-running unit tests (or might they be integration tests instead?)...
To make a release with Maven you should go:
mvn release:prepare release:perform
Nothing else. Based on the supplemental parameters you are passing during a release, it looks like your POM files are not in an optimal state.
The given option -DcheckModificationExcludeList=pom.xml looks like a problem from my point of view: usually you don't need it, and during a release the pom.xml will be changed anyway (the version is updated), so excluding it does not make sense. The modification check exists to verify that nothing is left uncommitted before running a release. (The whole setup does not look coherent to me.)
Based on the error message you have given:
[ERROR] Failure executing javac, but could not parse the error:
I bet your maven-compiler-plugin version is also very old. Which version do you use?
I recommend using an up-to-date version of the maven-release-plugin, correctly configured in your POM file (which I can't verify, because you haven't shown the full POM files).
I also recommend using the most recent version of Maven, updating all plugins to their most recent versions, and in particular reviewing whether each plugin's configuration is correct, really needed, and fulfills your needs.
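One way to get a quick overview of how outdated the build is, without changing anything, is the versions-maven-plugin's report goals:
$ mvn versions:display-plugin-updates
$ mvn versions:display-dependency-updates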

Skip the download of artifacts (with Maven) if they are not available in the repository

Is it possible to skip the download of artifacts when none of the repositories defined in the POM file contain them? I realize that this might result in incorrect builds, but is the option there anyway?
I was trying to download the sources for all my dependencies using
mvn dependency:sources
but since some of the sources are not available, the Maven process gets stuck in the middle. I have tried using the -fn option, but that too does not stop it from getting stuck in the middle of the operation.
If you define and need an artifact, your project cannot build without it, and it would not make sense to be able to skip it.
If you do not need the dependency, you can delete it, or use an exclusion if it is only a transitive dependency (i.e. one you get by using another dependency but do not need yourself).
If you do have the dependency locally, you can install it into your local repository, or, if you know your container (application server) will provide it, you can set its scope to provided.

Is the -rf option in Maven not reliable?

I was building a multi-module project with Maven 3. In one module it gave a "build failure" and said that, after fixing my error, I could use -rf :moduleName to continue my build. I did not change anything and ran the same command again, this time with -rf :moduleName as Maven suggested, and it built successfully. What are the possible reasons for this, and is the -rf option in Maven not reliable?
Either you have a non-deterministic test that fails randomly; you need to find out why and fix it.
Or it is just a Maven plugin error; for example, maven-clean-plugin may fail on some operating systems if the target directory is in use (by Explorer, etc.) and may work when re-executed a second time, once the lock has been released.
Or you have SNAPSHOT dependencies, parallel builds, and a Maven artifact repository (like Nexus or Artifactory) shared with other teammates.
For example, if module A depends on B, in your local build the build chain will be "B, then A". If A doesn't compile, B is built and put into the local repository, but the complete build chain fails when building project A.
Then, if you use the -rf flag, the build chain doesn't recompile B and starts building from module A.
But imagine that you have continuous integration, like TeamCity or Jenkins: project B may be rebuilt with the same version number (SNAPSHOT) and put into the shared repository. In this case, module A retrieves the last available snapshot of module B, which may not be the right artifact (if you have local modifications), and A may then compile without errors against that version of B.
You can avoid this problem either by rebuilding the chain entirely or by using the -o flag, which means "offline" mode (i.e. Maven will retrieve artifacts only from the local repository).
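In practice, that means choosing between invocations along these lines (the module name is illustrative):
$ mvn clean install                    # rebuild the whole chain, including B
$ mvn -o clean install -rf :moduleA    # resume from A, resolving B only from the local repository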
To fix it correctly, you should look at the error and investigate what the specific error means. Was it a compile error? A test failure? A Maven plugin error? Start by reading the error message, it may help :)

maven can't find my local artifacts

I can't seem to run mvn -o package because it complains with
The repository system is off line but the artifact
com.liferay.portal:util-bridges:jar:6.1.20 is not available in the
local repository.
But I checked my local repository and that artifact does exist there. I also tried the solution of setting updatePolicy to never in the settings.xml file but that failed to work.
Prior to Maven 3.0.x, Maven did not track the origin of files in the local repository.
This could result in build issues, especially if you were building something that listed the (now dead) very borked java.net2 repository... Not only did that repository change released artifacts (extremely bad and evil practice) but it also published artifacts at the same coordinates as artifacts on central but with different content (unbelievably evil)
So you could have the build work (because you had commons-io:commons-io:2.0 from central), then wipe your local repo and have the build fail (because you now get commons-io:commons-io:2.0 from java.net2, which was a completely different artifact with different dependencies in the pom), or vice versa.
The above situation is one of the drivers for using a maven repository manager, because that allows you to control the subset of a repository that you expose downstream and the order in which artifacts are resolved from multiple repositories (usually referred to as routing rules)
In any case, when maven switched to Aether as the repository access layer, the decision was made to start tracking where artifacts come from.
So with Maven 3.0.x, when an artifact is downloaded from a repository, maven leaves a _maven.repositories file to record where the file was resolved from. If you are building a project and the effective list of repositories does not include the location that the artifact was resolved from, then Maven decides that it is as if the artifact was not in the cache, and will seek to re-resolve the artifact...
There are a number of bugs in 3.0.x though... The most critical being how offline is handled... Namely: when offline, maven 3.0.x thinks there are no repositories, so will always find a mismatch against the _maven.repositories file!!!
The workaround for Maven 3.0.x is to delete these files from your local cache, eg
$ find ~/.m2/repository -name _maven.repositories -exec rm -v {} \;
The side effect is that you lose the protections that Maven 3.0.x is trying to provide.
The good news is that Maven 3.1 will have the required fix (if we can ever get our act together and get a release out the door)
With Maven 3.1 when in offline mode the _maven.repositories file is (semi-)ignored, and there is also an option to ignore that file for online builds (referred to as legacy mode)
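As an aside: if I recall the CLI help correctly, that legacy mode is exposed as the --legacy-local-repository option (short form -llr); check mvn -h on your version before relying on it.
$ mvn -llr clean install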
At this point in time (June 1st 2013) the 4th attempt to cut a release that meets the legal and testing requirements is in progress... So, assuming that the 4th time is lucky, I would hope to see 3.1.0-alpha-1 released in 3-4 days time... But it could be longer, given that we want to give the changes in 3.1 enough time to soak to ensure users' builds don't break (there was a change in an API that was exposed (by accident-ish - the API is needed by the site and dependency plugins) and that plugin authors have depended on (even though they shouldn't have), so there is potential for breakage, though we think we have all the bases covered)
Hope that answers your question (and maybe a few more you didn't know you had ;-) )
I also had to remove _remote.repositories files in the same way as the _maven.repositories files described above. I'm using Maven 3.1.1.
find ~/.m2/repository -name _remote.repositories -exec rm -v {} \;
I had this issue when I was using apache-maven-3.0.4; it went away right after I moved to apache-maven-3.3.1.
I had this issue on Ubuntu Linux after installing local artifacts via a shell script. The solution was to delete the local artifacts and install them again "manually" by calling mvn install:install-file in a terminal.
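A minimal install:install-file invocation (with placeholder coordinates to adapt) looks something like:
$ mvn install:install-file -Dfile=path/to/some-lib.jar \
      -DgroupId=com.example -DartifactId=some-lib -Dversion=1.0 -Dpackaging=jar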

Maven 3 dependency resolution fails until maven-metadata-local.xml files are deleted [maven-invoker-plugin related]

In one of my Maven projects, dependency resolution will succeed once, then fail for later build attempts:
[WARNING] The POM for commons-logging:commons-logging:jar:1.1.1 is missing, no dependency information available
[WARNING] The POM for commons-httpclient:commons-httpclient:jar:3.1 is missing, no dependency information available
[WARNING] The POM for javax.mail:mail:jar:1.4.4 is missing, no dependency information available
…and so on, until I delete the maven-metadata-local.xml files corresponding to the failing artifacts (e.g. ~/.m2/repository/commons-logging/commons-logging/maven-metadata-local.xml). After those files are deleted, the next mvn invocation proceeds properly; the metadata files are restored by that invocation (presumably as part of the process of checking my upstream repositories/mirrors for updated artifacts), and I am again presented with the above errors until I again delete the metadata files.
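For reference, a one-line sketch of that cleanup, following the same find pattern used elsewhere on this page:
$ find ~/.m2/repository -name maven-metadata-local.xml -exec rm -v {} \;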
This impacts multiple projects, though it appears to be limited to a particular set of dependencies. I suppose I could go nuclear and blow away my local repo, but I'd like to understand what the problem is.
Thoughts?
Update: It looks like it's the maven-invoker-plugin (which these builds are using for general-purpose integration testing) that is producing these maven-metadata-local.xml files. I'm not using an integration-testing-only local repo as described here, simply because doing so causes the re-downloading of all transitive dependencies (unless you want to maintain an integration-specific settings.xml file!!!). I've used the invoker plugin with a variety of other projects in this way with good results -- certainly never encountering a wedged local repository in the process like this.
Update 2 OK, this is repeatable, even after starting with a completely fresh local repository. This is on OS X, Java 1.6.0_24 with Maven 3.0.3; note that Maven 2.2.1 does NOT exhibit this problem.
Here's one of the projects in question: the 1.3.0-compat branch of rummage. To reproduce:
> mvn clean test
# no error -- can run this and other builds that don't involve maven-invoker-plugin all day w/o problems
> mvn clean integration-test
# FAIL: "Could not resolve dependencies", with warnings as noted above
> mvn clean test
# FAIL: "Could not resolve dependencies", with warnings as noted above
Once the local repository is borked (by the generation of the maven-metadata-local.xml files, AFAICT), no builds will get past the dependency resolution stage.
Running mvn -X reveals lines like this for each artifact that is later apparently not found:
[DEBUG] Verifying availability of /Users/chas/.m2/repository/javax/mail/mail/1.4.4/mail-1.4.4.jar from []
Of course, /Users/chas/.m2/repository/javax/mail/mail/1.4.4/mail-1.4.4.jar et al. does exist, as does /Users/chas/.m2/repository/javax/mail/mail/1.4.4/mail-1.4.4.pom. Totally puzzled. At this point, I'm assuming this is a bug in Maven 3 (or some underlying library), now that I see that 2.2.1 is clean.
Update 3 Bug report filed with Maven project.
This issue is resolved in aether 1.12, one rev above the aether 1.11 library that ships with Maven 3.0.3. Replacing aether 1.11 with 1.12 in one's Maven install results in expected behaviour (as noted in the bug I filed). Here's hoping Maven 3.0.4 is released with aether 1.12 ASAP. :-)
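As a rough sketch of that swap (exact jar names vary by distribution, so check $M2_HOME/lib first and keep backups):
$ ls $M2_HOME/lib/aether-*.jar    # see which aether 1.11 jars your install ships
$ # replace each aether-*-1.11.jar with the corresponding 1.12 jar downloaded from Maven Central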
You do not mention what you may have tried, so maybe you didn't try this one: adding the -U option to force an update? (Though maybe the -U option is only relevant for SNAPSHOTs...)
I've seen similar errors caused by corrupted files in my local repository. For example, if a download failed partway through, or a file in a remote repository changed after I downloaded it. Deleting the affected directories under ~/.m2 fixed it.
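Using the artifact from the question above as an illustration, that amounts to something like:
$ rm -rf ~/.m2/repository/javax/mail/mail/1.4.4
$ mvn clean test    # Maven re-downloads the artifact on the next run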
