Why isn't mvn resources:resources picking up buildNumber?

I've got a maven project that uses the buildnumber-maven-plugin. If I run mvn validate I see it's working:
[INFO] --- buildnumber-maven-plugin:1.3:create (default) # myproject ---
[INFO] Executing: /bin/sh -c cd /Users/rob/Workspace/myproject && git rev-parse --verify HEAD
[INFO] Storing buildNumber: 5d315d8d1a43c3289fbf114c379fa1a3d3787044 at timestamp: 1477059166424
But if I run mvn resources:resources the filtered file does not pick it up:
[INFO] --- maven-resources-plugin:2.6:resources (default-cli) # myproject ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
The pom.xml has:
<build>
...
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
<includes>
<include>version.txt</include>
</includes>
</resource>
</resources>
...
</build>
version.txt has:
${buildNumber}
But after Maven runs, there is no filtering:
> cat target/classes/version.txt
${buildNumber}
The build number config in pom.xml:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<phase>validate</phase>
<goals><goal>create</goal></goals>
</execution>
</executions>
</plugin>
I don't know enough Maven. Shouldn't running the resources "goal" also get the buildNumber property?

There is a difference in the commands that you execute:
mvn validate executes the Maven phase "validate", together with every phase that comes before it (in this case none, since validate is the first phase).
mvn resources:resources is a shortcut for executing the goal "resources" of the resources plugin. More precisely, it is a shortcut for org.apache.maven.plugins:maven-resources-plugin:3.0.1:resources. These short names are resolved by Maven and are typical for plugins in the Apache namespace.
As you can see on the Maven lifecycle page, what you are looking for is the phase "process-resources" (mvn process-resources). That phase has a default binding to "resources:resources", which runs the resources plugin. Since you execute a phase, all earlier phases run too, including validate and therefore the buildnumber plugin.
On the command line, the ":" is what distinguishes a plugin goal from a lifecycle phase.
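To make the distinction concrete, here is a sketch of the two invocations (based on the POM above, not output from the original project):
# Runs the lifecycle up to process-resources: validate executes first,
# buildnumber-maven-plugin:create stores ${buildNumber}, and then
# resources:resources filters version.txt.
mvn process-resources
# Invoking the goal directly only works if the property has been set in the
# same invocation, e.g. by also running the validate phase first:
mvn validate resources:resources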

Related

Cloud Build fails to build the simple build step with maven

Testing the cloud-build
Part of my cloudbuild.yaml
- name: 'gcr.io/cloud-builders/mvn'
args: ['dockerfile:build']
dockerfile:build works perfectly in a Bitbucket pipeline, no problem. I use:
<plugin>
<groupId>com.spotify</groupId>
<artifactId>dockerfile-maven-plugin</artifactId>
<version>${dockerfile-maven-version}</version>
<executions>
<execution>
<id>default</id>
<goals>
<goal>build</goal>
<goal>push</goal>
</goals>
</execution>
</executions>
<configuration>
<repository>gcr.io/my-project-id/${project.artifactId}</repository>
<tag>${project.version}</tag>
<buildArgs>
<JAR_FILE>${project.build.finalName}.jar</JAR_FILE>
</buildArgs>
</configuration>
</plugin>
But with the cloud-build for this single step I get the error:
[INFO] Step 14/15 : ARG JAR_FILE
[INFO]
[INFO] ---> Using cache
[INFO] ---> 55793de4bb9f
[INFO] [INFO] Step 15/15 : ADD target/${JAR_FILE} /usr/share/$SERVCE_FOLDER_NAME/app.jar
[INFO]
[ERROR] ADD failed: stat /mnt/cache/docker/tmp/docker-builder589658449/target/myappname-0.0.1-SNAPSHOT.jar: no such file or directory
(the JAR_FILE build arg is passed in by the Maven dockerfile plugin)
no such file or directory
Why? At the end of the day I just call dockerfile:build and expect it to behave the same as when I build from the other pipeline.
My Dockerfile:
FROM openjdk:8-jdk
ENV GOOGLE_APPLICATION_CREDENTIALS=/app/credentials.json
ARG ACTIVE_PROFILES=dev
ENV ACTIVE_PROFILES=$ACTIVE_PROFILES
ARG CREDENTIALS
ARG SERVCE_FOLDER_NAME=myappname-service
ENV SERVCE_FOLDER_NAME=$SERVCE_FOLDER_NAME
#ENTRYPOINT ["/usr/bin/java", "-jar", "/usr/share/$SERVCE_FOLDER_NAME/app.jar"]
ENTRYPOINT ["./entrypoint.sh" ]
WORKDIR /app
EXPOSE 8080
COPY ./.gcloud/credentials.json credentials.json
COPY entrypoint.sh .
#Add Maven dependencies (not shaded into the artifact; Docker-cached)
#ADD target/lib /usr/share/$SERVCE_FOLDER_NAME/lib
ARG JAR_FILE
ADD target/${JAR_FILE} /usr/share/$SERVCE_FOLDER_NAME/app.jar
The entrypoint script (entrypoint.sh, which runs the app.jar added in step 15/15 of the log) is:
java -Djava.security.egd=file:/dev/./urandom -jar /usr/share/$SERVCE_FOLDER_NAME/app.jar --spring.profiles.active=$ACTIVE_PROFILES
(I did try passing hard-coded values for $SERVCE_FOLDER_NAME and $ACTIVE_PROFILES; same result. It works in the Bitbucket pipeline.)
A few things come to mind:
How are you triggering the builds? Manually with gcloud or the API, or automatically with build triggers or the GitHub app?
It seems that the target/ directory might not be present in the remote workspace: are you ignoring target/ or .jar files anywhere?
The remote workspace will not receive the target/ directory or .jar files if they are listed in your .gitignore or .gcloudignore.
Try making an empty .gcloudignore or temporarily removing target/ and .jar files from .gitignore (see the sketch after the links below).
Relevant links: https://cloud.google.com/sdk/gcloud/reference/topic/gcloudignore, https://github.com/GoogleCloudPlatform/cloud-builders/issues/40
Have you tried debugging with cloud-build-local? It lets you write out and explore the workspace locally:
https://cloud.google.com/cloud-build/docs/build-debug-locally
https://github.com/GoogleCloudPlatform/cloud-build-local
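Following the .gcloudignore suggestion above, a minimal sketch of such a file (the entries are assumptions; the important part is that target/ and *.jar are not excluded, so the locally built jar actually reaches the remote workspace):
# .gcloudignore (sketch): upload everything except VCS metadata.
# Deliberately do NOT list target/ or *.jar here, otherwise the jar built
# locally never reaches the Cloud Build workspace and ADD target/${JAR_FILE} fails.
.git
.gitignore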

Maven Install plugin: parameter file is missing or invalid

I have a local jar and I want to use it in my project. There are plenty of ways to do it: install it manually into the local repo, do it with a script in the parent POM, use system scope, or use a local repository declaration.
I decided to use the Maven Install plugin to install the jar into the repo. Here is the configuration:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.3.1</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<groupId>jacob</groupId>
<artifactId>jacob</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
<file>${basedir}\lib\jacob.jar</file>
</configuration>
</execution>
</executions>
</plugin>
I tried to run it in many ways:
mvn initialize install:install-file
mvn initialize
mvn install:install-file
But every time I get this error:
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-install-plugin:2.3.1:install-file
(default-cli) on project MYPROJECTNAME: The parameters 'file' for goal
org.apache.maven.plugins:maven-install-plugin:2.3.1:install-file are
missing or invalid ->
I just want to find out why it doesn't work. It looks like I have mixed up phases and goals.
In the end I went another way and placed it into my parent POM, but why does it work in the parent and not in the child?
The best way to deal with a separate jar is to start using a repository manager and install this jar via its UI into the repository manager. From that point on you can use it as a usual dependency. Makes life easier.
If you want to install it as part of the build, as you have tried, you need to call Maven like this:
mvn package
You should not mix up calling standalone goals like install:install-file with lifecycle phases like initialize.
That is why you got the described error message: invoked directly from the command line, the goal runs as the default-cli execution (as the error shows), so the configuration inside your POM execution is not applied and the required file parameter is missing.
A combination like:
mvn install:install-file initialize
rarely makes sense in Maven. You have bound the maven-install-plugin to the lifecycle in your POM, so you should simply invoke the lifecycle:
mvn initialize
And you will get something like the following:
[INFO] ------------------------------------------------------------------------
[INFO] Building test 0.6-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install-file (default) # test ---
[INFO] Installing /home/mvntest/lib/jdom-1.0.jar to /home/.m2/repository/com/soebes/test/jacob/1.1/jacob-1.1.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.780s
[INFO] Finished at: Sat Oct 25 12:19:30 CEST 2014
[INFO] Final Memory: 6M/91M
[INFO] ------------------------------------------------------------------------
BUT: I have to say that installing an artifact like this is bad practice. The best option is to use a repository manager like Nexus, Artifactory or Archiva, or, if you really don't like repository managers (I don't understand why, but that is a different story), you can use Stephen Connolly's non-maven-jar-maven-plugin.
Apart from all the above, you should use a more up-to-date version of the maven-install-plugin; the current version is 2.5.2.
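As a side note, if you really want mvn install:install-file on its own to pick up those parameters, Maven applies the configuration of an execution with the id default-cli to goals invoked directly from the command line. A sketch (coordinates copied from the question, untested against that project):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <version>2.5.2</version>
  <executions>
    <execution>
      <!-- default-cli: this configuration is used when the goal is invoked
           straight from the command line, e.g. mvn install:install-file -->
      <id>default-cli</id>
      <configuration>
        <groupId>jacob</groupId>
        <artifactId>jacob</artifactId>
        <version>1.0</version>
        <packaging>jar</packaging>
        <file>${basedir}/lib/jacob.jar</file>
      </configuration>
    </execution>
  </executions>
</plugin>
The execution bound to initialize in the question remains the better fit for a normal build, since it needs no extra command-line arguments.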

Displaying custom test results in Jenkins Maven job

I just added some Python unit tests to an existing Maven POM but I can't seem to get Jenkins to report the results of the tests when it runs the build.
I'm running nose tests from Maven with the exec-maven-plugin during the "test" phase. The tests run successfully from the Jenkins job and generate an xUnit compatible test report to target/surefire-reports/TEST-nosetests.xml, but Jenkins doesn't pick up on the results.
Interestingly, Maven also reports no tests run before executing the test suite:
-------------------------------------------------------
T E S T S
-------------------------------------------------------
There are no tests to run.
Results :
Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- exec-maven-plugin:1.1.1:exec (nosetests) # server ---
[INFO] ................
[INFO] ----------------------------------------------------------------------
[INFO] Ran 16 tests in 194.799s
[INFO]
[INFO] OK
[INFO] Registering compile source root /Volumes/Data/workspace/myProject/target
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
So is this a problem with Jenkins not seeing the results, or with Maven not treating my test suite as actual tests?
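For reference, the exec-maven-plugin binding described above would look roughly like this (a sketch rather than the actual POM; the version matches the build log below, and --with-xunit / --xunit-file are standard nosetests options for writing an xUnit report):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.1.1</version>
  <executions>
    <execution>
      <id>nosetests</id>
      <phase>test</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>nosetests</executable>
        <arguments>
          <!-- write a JUnit/xUnit-style report where Jenkins expects to find it -->
          <argument>--with-xunit</argument>
          <argument>--xunit-file=${project.build.directory}/surefire-reports/TEST-nosetests.xml</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>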
I worked through this problem by using a "free-style software project" in Jenkins rather than the "maven2/3" project.
Under Build, choose Add build step and configure the project to Invoke top-level Maven targets. I'm using the test target.
Finally, under Post-build Actions choose Add post-build action of Publish JUnit test result report and point it at the xUnit output from your tests. This option is not available for Maven 2/3 jobs for some reason.
To build upon the answer by bpanulla, if you have tests in more than one sub-directory of your project, you can use a wildcard in the "Test report XMLs" field, such as: **/target/surefire-reports/*.xml
If you have a more modular Jenkins setup, using a free-style project will not correctly build submodules. If the execution id of the Maven plugin that generates the reports is e2eTests, add the following snippet to your pom.xml and the Jenkins Maven plugin will take care of the rest:
<properties>
    <jenkins.e2eTests.reportsDirectory>target/protractor-reports</jenkins.e2eTests.reportsDirectory>
</properties>
See https://wiki.jenkins-ci.org/display/JENKINS/Building+a+maven2+project
Try again after adding the maven-compiler-plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<executions>
<execution>
<id>default-testCompile</id>
<phase>compile</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<encoding>ISO-8859-1</encoding>
<source>1.8</source>
<target>1.8</target>
<useIncrementalCompilation>false</useIncrementalCompilation>
<testSource>1.8</testSource>
<testTarget>1.8</testTarget>
</configuration>
</plugin>
As you can see in the "Tests run" output, Maven did not recognize the tests as tests. Furthermore, to see the results in Jenkins you need to check whether nose can create JUnit-compatible result files (which I assume it can), which Jenkins can then read and display.

maven site unpack error

I have a multi module maven project. If I run
mvn clean install
everything works fine. But if I run:
mvn site
I get the following error:
[INFO] --- maven-dependency-plugin:2.4:unpack (copy-war) # module2 ---
[INFO] Configured Artifact: com.example:module1:1.0-SNAPSHOT:war
Unpacking path\2\module1\target\classes to path\2\module2\target\module1 with includes "" and excludes ""
org.codehaus.plexus.archiver.ArchiverException: The source must not be a directory.
With mvn clean install at the same point I get:
[INFO] --- maven-dependency-plugin:2.4:unpack (copy-war) # module2 ---
[INFO] Configured Artifact: com.example:module1:1.0-SNAPSHOT:war
[INFO] Unpacking path\2\module1\target\module1-1.0-SNAPSHOT.war to path\2\module2\target\module1 with includes "" and excludes ""
and everything works fine.
Any idea why the dependency plugin wants to unpack a directory instead of a war?
I found a workaround. I disabled the site plugin for that module:
<plugin>
<artifactId>maven-site-plugin</artifactId>
<configuration>
<generateReports>false</generateReports>
</configuration>
</plugin>
I think the command fails due to a bug in the site plugin.
This is indeed a long standing (since 2007!) bug in maven-dependency-plugin. Have a look at the MDEP-98 JIRA issue for a discussion and possible workarounds.
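If the reports are still wanted for that module, another thing worth trying (a guess, not verified against MDEP-98) is to package the reactor in the same invocation, so module1 resolves to the built war instead of target/classes:
mvn package site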

Netbeans reloads no class when "Apply Code Changes" in a maven project with remote jpda debugging

So I have a maven project which produces a jar package containing some ant tasks.
When I run my Ant build script somewhere else with JPDA enabled and debug the tasks, say MyTask, with NetBeans, the Apply Code Changes button doesn't work. Here is the output of the NetBeans console:
cd /trunks/tasks; JAVA_HOME=/opt/jdk /opt/netbeans-7.0/java/maven/bin/mvn -Djpda.stopclass=com.abc.ant.MyTask compile
Scanning for projects...
------------------------------------------------------------------------
Building tasks 1.0-SNAPSHOT
------------------------------------------------------------------------
[resources:resources]
Using 'UTF-8' encoding to copy filtered resources.
Copying 1 resource to com/abc/ant
[compiler:compile]
Compiling 1 source file to /trunks/tasks/build/classes
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 1.548s
Finished at: Fri Mar 09 17:45:24 CST 2012
Final Memory: 11M/149M
------------------------------------------------------------------------
NetBeans: classes to reload: []
NetBeans: No class to reload
So Netbeans does successfully tell Maven what class needs to be compiled. However, NetBeans won't reload the compiled class. Is it because my ant process is using the jar package produced by the Maven project, or because of other reasons?
Note: I have some custom configurations, like where to output the compiled classes, and where to put the jar package. Could that be a reason?
Update 2:
OK I found the reason by myself.
It's because I added the following line under <build> in the pom.xml:
<directory>${my.custom.work.dir}/build</directory>
So maven will output the compiled class files to this directory, rather than the default ${basedir}/target. However, Netbeans seems to be too stupid to recognize that -- it just tries the default directory.
Now the question could be much easier: is there any way to make the IDE recognize that by adding configuration in the pom?
In [your_home_path]/.netbeans/7.0/maven/conf you will find a settings.xml file.
There you can define ${my.custom.work.dir} inside a <profiles> section.
You can find examples here (in the Properties section).
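A minimal sketch of such a profile in that settings.xml (the profile id is just a placeholder; the path matches the example POM below):
<settings>
  <profiles>
    <profile>
      <id>netbeans-workdir</id>
      <properties>
        <!-- value picked up by ${my.custom.work.dir} in the POM -->
        <my.custom.work.dir>/home/alain/Bureau</my.custom.work.dir>
      </properties>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>netbeans-workdir</activeProfile>
  </activeProfiles>
</settings>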
Edit:
It works for me with this kind of POM (in NetBeans 7.0.1):
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
<directory>${my.custom.work.dir}/build</directory>
</build>
<properties>
<my.custom.work.dir>/home/alain/Bureau</my.custom.work.dir>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
