How to enable Maven to see binary in $PATH? - maven

I'm trying to add the buildnumber maven plugin to include the hgchangeset from mercurial. This works great for me when running in the Terminal (I'm on Mac OSX).
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.2</version>
  <executions>
    <execution>
      <phase>validate</phase>
      <goals>
        <goal>create</goal>
        <goal>hgchangeset</goal> <!-- The specific goal I'm interested in -->
        <goal>create-timestamp</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <doCheck>false</doCheck>
    <doUpdate>false</doUpdate>
    <revisionOnScmFailure>true</revisionOnScmFailure>
    <!-- format of the property 'buildNumber' -->
    <format>{0,date,MMdd-HHmm}</format>
    <items>
      <item>timestamp</item>
    </items>
  </configuration>
</plugin>
However, when running in Eclipse and Jenkins, I get this error:
[INFO] EXECUTING: /bin/sh -c cd /Users/Shared/Jenkins/Home/jobs/proj/workspace && hg id -i
[ERROR]
EXECUTION FAILED
Execution of cmd : id failed with exit code: 127.
Working directory was:
/Users/Shared/Jenkins/Home/jobs/proj/workspace
Your Hg installation seems to be valid and complete.
Hg version: NA (OK)
Logging into that server as the user 'jenkins', I can execute: hg id -i just fine and I see the correct output, as the hg binary is on my $PATH and that's recognized in the console.
Similarly, Eclipse gives the same output. I imagine this is because in Terminal the maven build can see my hg binary in /usr/local/bin but Eclipse and Jenkins cannot.
Is there any way I can tell Maven to use hg at that location? I don't care if it has to be hardcoded since the binary location is the same on all our computers.
Thanks in advance
echo $PATH
/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin:/usr/local/git/bin

Looking at the code of the buildnumber-maven-plugin, it seems there's no way to provide a prefix to the hg executable, or to override the actual command line that's built.
Therefore, I'd suggest modifying the Jenkins build agent configuration so that you adjust the PATH used to locate hg. You can do this following the instructions at Jenkins Distributed Builds setup - typically (depending on how your build slaves are set up), you have a couple of options:
Modify the launch-slave shell script
Modify the .profile (or equivalent) for the user running the build slave
Modify the agent configuration in the Jenkins UI, which allows for passing environment variables to the build agent
Any of these should allow you to adjust the contents of the PATH used by the build agent; a sketch of the .profile approach follows.
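For example, a minimal sketch of the .profile route, assuming the agent runs as the jenkins user and hg lives in /usr/local/bin (both assumptions, adjust to your setup):
# In the build-agent user's ~/.profile (or ~/.bash_profile),
# make sure /usr/local/bin is on the PATH the agent inherits:
export PATH="/usr/local/bin:$PATH"
Restart the build agent afterwards so the new environment is picked up.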
Good luck!

I encountered the same issue with Eclipse and Maven - though my terminal reported a $PATH that included the path of the binary, the Maven plugin in Eclipse was not finding it.
I soon realized the problem: Eclipse had been started from the UI and not the terminal, so it did not inherit the environment variables from the terminal, including $PATH. This can be quickly tested by starting Eclipse from the terminal:
$ ./eclipse
Now the Maven build in this instance of Eclipse was able to find the binary just fine.
In order to get the path into Eclipse when it is started from the UI, you will also need to add the path to /etc/paths, as sketched below.
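A minimal sketch of that change, assuming the binary lives in /usr/local/bin (you may need to log out and back in for newly launched apps to see it):
# Append /usr/local/bin to /etc/paths (macOS)
sudo sh -c 'echo /usr/local/bin >> /etc/paths'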

Related

How to run script in mvn as test

I would like to have the following scenario in Maven/Jenkins:
Run test scripts (bash/shell)
When a script exits with a problem (an error), the Maven build on Jenkins should get UNSTABLE status rather than FAILURE
Question: how can I do this?
You can run scripts in Maven using the Exec Maven Plugin and its exec goal.
If you want to run the script during the test phase, you can bind an execution of the plugin to it as follows:
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.4</version>
      <executions>
        <execution>
          <id>run-test-script</id>
          <phase>test</phase>
          <goals>
            <goal>exec</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <executable><!-- configure here your script .sh/.cmd --></executable>
        <arguments>
          <argument><!-- configure here arguments, if any --></argument>
        </arguments>
        <workingDirectory><!-- configure here PWD, if required --></workingDirectory>
      </configuration>
    </plugin>
  </plugins>
</build>
Note that you can also configure different successful exit codes via the successCodes configuration entry.
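For instance, a sketch of that configuration entry; the exit codes and the <successCode> child element name here are placeholders of mine, not taken from the question:
<configuration>
  <!-- exit codes that should NOT fail the Maven build (placeholder values) -->
  <successCodes>
    <successCode>0</successCode>
    <successCode>1</successCode>
  </successCodes>
</configuration>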
If the script fails, the build will fail. However, you can change this behavior on the Jenkins build via the Jenkins Text Finder Plugin, configured as a Post-build Action:
Set up a regular expression that can be found in the Maven build output on Jenkins. For example, the regex .*Script Failed.* would match the string Script Failed printed by the script in such a case. The build will still actually fail, but we can change its status on Jenkins (not on Maven).
Check the option Unstable if found, which will convert the status of the build from FAILED to UNSTABLE.
As per documentation of the Unstable if found option:
Use this option to set build unstable instead of failing the build.
As such, you would have a script executed in the test phase as you desired, the Maven build would fail if the script did so but the Jenkins build would change its status according to your configuration of the Text Finder plugin.
Also note: if you want Maven not to fail in case the script did, you can play with the successCodes as mentioned above and still make the Jenkins build change its status to UNSTABLE according to the same configuration of the Text Finder plugin. Hence different combinations are possible.
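As a hypothetical sketch of that combination (the script name, marker string and exit code are all assumptions), a test script could print a marker for the Text Finder plugin while exiting with a code listed in successCodes:
#!/usr/bin/env bash
# run-tests.sh - hypothetical wrapper around the real test script
if ! ./do-the-actual-tests.sh; then
    # Marker line the Jenkins Text Finder plugin can match on
    echo "Script Failed"
    # Exit code 1 is assumed to be listed in successCodes, so the Maven build still passes
    exit 1
fi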

Is maven-download-plugin not portable, or am I crazy?

TL;DR: I stumbled upon a situation where my pom.xml works fine on Windows, but fails on Linux. Since I'm rather new to maven, I'm not sure whether it's a common situation, or if I messed up somewhere.
More details:
I use the maven-download-plugin like this:
<plugin>
  <groupId>com.googlecode.maven-download-plugin</groupId>
  <artifactId>maven-download-plugin</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <id>get-stuff</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>wget</goal>
      </goals>
      <configuration>
        <url>http://myUrl/my.tar.gz</url>
        <unpack>true</unpack>
        <outputDirectory>${project.build.directory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
On Windows it works like a charm (i.e. it downloads and unpacks).
On Linux, it fails with the following error:
[ERROR] Failed to execute goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget (get-moab)
on project my-project: Execution get-stuff of goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget failed:
An API incompatibility was encountered while executing com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget: java.lang.NoSuchMethodError: org.codehaus.plexus.util.cli.Commandline.createArg()Lorg/codehaus/plexus/util/cli/Arg;
I found a workaround (<unpack>false</unpack>, and then "manually" unpack with antrun), but my pom.xml looked better without those additional 15 lines...
To put it in a nutshell:
Is it actually a portability issue, or have I messed up somewhere?
If it's a portability issue: is it common with Maven, or am I unlucky on this one?
More technical details:
I used the same plugin on both Linux and Windows (same version, same Maven repository)
It failed on CentOS, and on a VM with Ubuntu 12.04
My first troubleshooting step when a build works on one machine and not another is to clean out the local Maven repository on the failing machine, and let Maven re-download all of the artifacts. That's often enough to fix the problem.
If the build fails with the same error, then I clean out the local repository on the working machine and build. Usually then I see that I've missed a dependency in the POM that just happened to exist in my local repository already. Fixing the POM often makes the build work on both systems.
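As a sketch of that first step (assuming a Unix-like shell; -U tells Maven to force update checks against the remote repositories):
# On the failing machine: wipe the local repository...
rm -rf ~/.m2/repository
# ...then rebuild, letting Maven re-download everything
mvn -U clean install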
Did you check the Maven versions (mvn -version)? org.codehaus.plexus.util is a dependency of Maven Core, so if maven-download-plugin is running under a different version of Maven than the one it was compiled for, that would explain the error.

Unable to find javadoc command - maven

I ran this command in my project directory to build and package it:
mvn clean javadoc:jar package
I do have my JAVA_HOME variable set correctly.
Evidently:
$ which java
/usr/bin/java
$ sudo ls -l /usr/bin/java
lrwxr-xr-x 1 root wheel 74 Dec 18 23:42 /usr/bin/java -> /System/Library/Frameworks/JavaVM.framework/Versions/Current/Commands/java
$ which javadoc
/usr/bin/javadoc
Does anyone know why mvn still complains?
Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8:jar (default-cli) on project foo_bar: MavenReportException: Error while creating archive: Unable to find javadoc command: The environment variable JAVA_HOME is not correctly set. -> [Help 1]
A correct which java is not evidence enough, since /usr/bin/ will likely be in your PATH anyway. Check
$ echo $JAVA_HOME
for evidence. Or run
$ JAVA_HOME=/path/to/your/java/home mvn clean javadoc:jar package
On OS X you can set your JAVA_HOME via:
$ export JAVA_HOME=$(/usr/libexec/java_home)
which on my machine points to
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home
You can make it use the java.home system property instead of the JAVA_HOME environment variable:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>3.0.1</version>
  <configuration>
    <javadocExecutable>${java.home}/bin/javadoc</javadocExecutable>
  </configuration>
</plugin>
Source of idea: https://medium.com/@kankvish/fixing-issue-the-environment-variable-java-home-is-not-correctly-set-b5f0b66a84d0
You can add JAVA_HOME as an environment variable in Eclipse.
Go to your Maven build configuration and add the following.
Click New ->
Name: JAVA_HOME
Value: your path here.
This worked for me!
While a lot of answers talk about OS X, for a Debian or Debian-like system (such as Ubuntu), I've decided to abuse the "alternatives" system:
export JAVA_HOME=$(update-alternatives --query javadoc | grep Value: | head -n1 | sed 's/Value: //' | sed 's#bin/javadoc$##')
Rewriting that more cleanly with awk, or using a more correct way to access the value in the "alternatives" database, is left as an exercise for the reader.
Alternatively, given that the point of using "alternatives" system is to maintain symlinks such as /usr/bin/javadoc in this case, we can just query the path pointed to by the symlink:
export JAVA_HOME=$(realpath /usr/bin/javadoc | sed 's#bin/javadoc$##')
While this isn't the only possible "Java home" (you might have numerous JDKs installed), given that I only care about moving the mvn build forward, and the error talks about Javadoc, I chose to refer to the directory containing the javadoc binary.
Don't forget to install a JDK in addition to a JRE. For instance, the JDK I needed was openjdk-11-jdk, to complement the JRE openjdk-11-jre which I previously installed.
After the above, the JAVA_HOME envvar has this value on my system: /usr/lib/jvm/java-11-openjdk-amd64/
After spending 2-3 hours on this, I felt that opening Eclipse via the command line was the easiest solution.
Follow the steps below:
cd <Folder_where_Eclipse.app>
open Eclipse.app
Now your Eclipse is able to find the Terminal's environment variables.
There are 2 options to fix this. Here are the steps:
Make sure JAVA_HOME is configured as an environment variable.
Option 1: Add javadocExecutable to the properties section.
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <project.reporting.outputEncoding>${project.build.sourceEncoding}</project.reporting.outputEncoding>
  <java.version>11</java.version>
  <javadocExecutable>${java.home}/bin/javadoc</javadocExecutable>
</properties>
Option 2: Add javadocExecutable into the build section as below.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>${maven-javadoc-plugin.version}</version>
  <executions>
    <execution>
      <id>attach-javadocs</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <additionalparam>-Xdoclint:none</additionalparam>
      </configuration>
    </execution>
  </executions>
  <configuration>
    <javadocExecutable>${java.home}/bin/javadoc</javadocExecutable>
    <excludePackageNames>com.vu.poc.test.objects</excludePackageNames>
    <overview />
  </configuration>
</plugin>
Upgrade the maven-javadoc-plugin to the latest version (3.3.0 or later).
They fixed this for OS X in Maven 3.1 by adding "export JAVA_HOME" to the bin/mvn shell script, obviating the need to set JAVA_HOME externally yourself just to find javadoc.
I started facing this issue once I switched to using sdkman and removed Ubuntu's Java packages. Everything else works fine, but the javadoc plugin fails when using IntelliJ IDEA's bundled Maven. Thankfully, we can set environment variables at a per-project level under
Build, Execution, Deployment > Build Tools > Maven > Runner
In the Environment variables: text box, add
JAVA_HOME=/home/user/.sdkman/candidates/java/11.0.12-open/

How do you clear Apache Maven's cache?

Recently, Apache Maven seems to be having caching issues. Performing clean installs on our projects using Windows Vista or Windows 7 sometimes produce artifacts with the same data as a previous build even though the newer artifact's files should have been updated.
Is there any way to clear this cache to force maven to always trigger a clean build of the local artifact that should be built?
In particular, we're having issues building a webapp with the war plugin. Maven version is 3.0.3. War plugin version is 2.1.1.
Delete the artifacts (or the full local repo) from c:\Users\<username>\.m2\repository by hand.
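On macOS/Linux the equivalent is a plain delete from ~/.m2; the coordinates below are hypothetical placeholders:
# Remove a single cached artifact (replace the group/artifact path with your own)...
rm -rf ~/.m2/repository/com/example/my-webapp
# ...or wipe the whole local repository
rm -rf ~/.m2/repository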
To clean the local cache try using the dependency plug-in.
mvn dependency:purge-local-repository: This attempts to delete the local repository files, but it then goes and re-fills the local repository after things have been removed.
mvn dependency:purge-local-repository -DreResolve=false: This avoids the re-resolving of the dependencies but seems to still go to the network at times.
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false: This was added by Paweł Prażak and seems to work well. I'd use the third if you want the local repo emptied, and the first if you just want to throw out the local repo and get the dependencies again.
I would do the following:
mvn dependency:purge-local-repository -DactTransitively=false -DreResolve=false --fail-at-end
The flags tell maven not to try to resolve dependencies or hit the network. Delete what you see locally.
And for good measure, ignore errors (--fail-at-end) till the very end. This is sometimes useful for projects that have a somewhat messed up set of dependencies or rely on a somewhat messed up internal repository (it happens.)
Have you checked/changed the updatePolicy setting for your repositories in your settings.xml?
This element specifies how often updates should attempt to occur.
Maven will compare the local POM's timestamp (stored in a repository's
maven-metadata file) to the remote. The choices are: always, daily
(default), interval:X (where X is an integer in minutes) or never.
Try to set it to always.
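For example, a sketch of such a repository entry (inside a profile in settings.xml; the id and url are placeholders):
<repository>
  <id>my-repo</id>
  <url>https://repo.example.com/maven2</url>
  <releases>
    <updatePolicy>always</updatePolicy>
  </releases>
  <snapshots>
    <updatePolicy>always</updatePolicy>
  </snapshots>
</repository>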
Use mvn dependency:purge-local-repository -DactTransitively=false -Dskip=true if you have maven plugins as one of the modules. Otherwise Maven will try to recompile them, thus downloading the dependencies again.
This works on Spring Tool Suite v3.1.0.RELEASE, but I'm guessing it's also available in Eclipse.
After deleting the artifacts by hand (as stated by palacsint above) in the /username/.m2 directory, re-index the files by doing the following:
Go to:
Windows->Preferences->Maven->User Settings menu.
Click the Reindex button next to the Local Repository text box. Click "Apply" then "OK" and you're done.
As some answers have pointed out, sometimes you really want to delete the local repository entirely; for example, there might be some artifacts that can't be purged because they are no longer referenced by the pom.
If you want to have this deletion embedded in a Maven phase, for example clean, you can use the maven-clean-plugin and access the repository through the settings, for example:
<plugin>
  <inherited>false</inherited>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.1</version>
  <executions>
    <execution>
      <phase>clean</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <echo>Base clean is attached to deleting local maven cache</echo>
          <echo>${settings.localRepository}</echo>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <inherited>false</inherited>
  <artifactId>maven-clean-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <filesets>
      <fileset>
        <directory>${settings.localRepository}</directory>
      </fileset>
    </filesets>
  </configuration>
</plugin>
I've had this same problem, and I wrote a one-liner in shell to do it.
rm -rf $(mvn help:evaluate -Dexpression=settings.localRepository \
    -Dorg.slf4j.simpleLogger.defaultLogLevel=WARN -B \
    -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn | grep -vF '[INFO]')/*
I did it as a one-liner because I wanted to have a Jenkins-project to simply run this whenever I needed, so I wouldn't have to log on to stuff, etc.
If you allow yourself a shell-script for it, you can write it cleaner:
#!/usr/bin/env bash
REPOSITORY=$(mvn help:evaluate \
-Dexpression=settings.localRepository \
-Dorg.slf4j.simpleLogger.defaultLogLevel=WARN \
-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn \
--batch-mode \
| grep -vF '[INFO]')
rm -rf $REPOSITORY/*
Should work, but I have not tested the whole script (only the first command). This approach has the downside of running a large, complicated command first, but that command is idempotent, so you can test it out for yourself. The deletion is its own command afterwards, which lets you try everything out and check that it does what you think it does, because you shouldn't trust deletion commands without verification. However, it is smart for one good reason: it's portable. It respects your settings.xml file. If you run this command and tell Maven to use a specific XML file (the -s or --settings argument), it will still work. So you don't have to fiddle with making sure everything is the same everywhere.
It's a bit unwieldy, but it's a decent way of doing business, IMO.
There are some commands you can use for cleaning:
1. mvn clean cache
2. mvn clean install
3. mvn clean install -Pclean-database
Deleting the repository folder from .m2 can also help.

Maven build number plugin, how to save the build number in a file?

I've a Java project using Spring Framework and Git and I wanted to display a build number. I found the Build Number Maven plugin. With Git the build number is a Git hash. I dislike that and I thought a date was much more expressive.
I found this excellent blog article explaining how to use build number plugin with a different profile for SVN and Git. Since I just use Git, instead of creating a new profile, I just copied the plugin part in my build tag.
When I run "mvn package" it tells me:
[INFO] --- buildnumber-maven-plugin:1.0:create (default) # sherd ---
[INFO] Storing buildNumber: 2011-08-04_21-48_stivlo at timestamp: 1312487296631
Which looks OK, but I wonder: where is it stored? "git status" doesn't detect any new file, and it doesn't seem to be in target/ either (target/ is in my .gitignore).
Maybe I've to change the configuration to store the build number in a file? How can I use the build number value?
Thanks to the hint of Michael-O, I read the chapter about how to filter resource files in Maven Getting Started Guide. I've created a file application.properties in src/main/resources/properties/application.properties with the following contents:
# application properties
application.name=${pom.name}
application.version=${pom.version}
application.build=${buildNumber}
I've added the following XML snippet within my build section:
<resources>
  <resource>
    <directory>src/main/resources/properties</directory>
    <filtering>true</filtering>
  </resource>
</resources>
Now when I call from command line "mvn package", this property file gets saved in target/classes/properties/application.properties, for instance with the following contents:
# application properties
application.name=Sherd Control Panel
application.version=1.0.1-SNAPSHOT
application.build=2011-08-05_05-55_stivlo
Everything works fine from command line, but, sigh, m2eclipse gives Build errors:
05/08/11 6.05.03 CEST: Build errors for obliquid-cp;
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal
org.codehaus.mojo:buildnumber-maven-plugin:1.0:create (default) on project
sherd: Cannot get the branch information from the scm repository :
Exception while executing SCM command.
For some reason m2eclipse tries to connect to my repository, but it can't because it's a Git repository accessed with SSH and a private key. I wonder if I can tell m2eclipse to not connect to Git.
After more digging I found about revisionOnScmFailure option, set it to true and now also m2eclipse works. For reference, here is the full configuration of buildnumber maven plugin that I used (it goes in pom.xml in the build / plugins section).
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.0</version>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>create</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <doCheck>false</doCheck>
    <doUpdate>false</doUpdate>
    <revisionOnScmFailure>true</revisionOnScmFailure>
    <format>{0,date,yyyy-MM-dd_HH-mm}_{1}</format>
    <items>
      <item>timestamp</item>
      <item>${user.name}</item>
    </items>
  </configuration>
</plugin>
You shouldn't set the revisionOnScmFailure option to true; it doesn't expect a boolean. Set it to the revision string you want to use when the SCM is unavailable, like na or similar. It doesn't matter in your case since you override the build number format, but it would be more correct.
See buildnumber-maven-plugin docs.
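For example, using na as the placeholder revision suggested above:
<configuration>
  <!-- fallback revision string used when the SCM cannot be reached -->
  <revisionOnScmFailure>na</revisionOnScmFailure>
</configuration>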
Store it in a filtered properties file. See Using maven to output the version number to a text file
I could not reproduce the problem reported by the OP. In my case both the command line and m2eclipse work fine, and the file is generated correctly in the target/classes folder. The answer provided by @KasunBG is incorrect. The buildNumber.properties file is generated only if you use the following:
<format>{0,number}</format>
<items>
<item>buildNumber</item>
</items>
buildNumber.properties is used to store a number which can be incremented. For this reason (I think) the plugin doesn't generate this file if you use timestamp/scmVersion etc.
The documentation page says that the properties file is stored at ${basedir}/buildNumber.properties, which is created when the buildnumber:create goal is run.
