Is maven-download-plugin not portable, or am I crazy? - maven

TL;DR: I stumbled upon a situation where my pom.xml works fine on Windows, but fails on Linux. Since I'm rather new to maven, I'm not sure whether it's a common situation, or if I messed up somewhere.
More details:
I use the maven-download-plugin like this:
<plugin>
  <groupId>com.googlecode.maven-download-plugin</groupId>
  <artifactId>maven-download-plugin</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <id>get-stuff</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>wget</goal>
      </goals>
      <configuration>
        <url>http://myUrl/my.tar.gz</url>
        <unpack>true</unpack>
        <outputDirectory>${project.build.directory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
On Windows it works like a charm (i.e. it downloads and unpacks).
On Linux, it fails with the following error:
[ERROR] Failed to execute goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget (get-stuff)
on project my-project: Execution get-stuff of goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget failed:
An API incompatibility was encountered while executing com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget: java.lang.NoSuchMethodError: org.codehaus.plexus.util.cli.Commandline.createArg()Lorg/codehaus/plexus/util/cli/Arg;
I found a workaround (<unpack>false</unpack>, and then "manually" unpack with antrun), but my pom.xml looked better without those additional 15 lines...
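For reference, the workaround looks roughly like this - a sketch that assumes the downloaded file keeps its original name under ${project.build.directory}, and that the antrun plugin is declared after the download plugin so it runs later in the same phase:
<!-- download only, no unpacking -->
<configuration>
  <url>http://myUrl/my.tar.gz</url>
  <unpack>false</unpack>
  <outputDirectory>${project.build.directory}</outputDirectory>
</configuration>
...
<!-- then unpack "manually" with antrun -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <id>unpack-stuff</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- untar the archive fetched by the download plugin (path is an assumption) -->
          <untar src="${project.build.directory}/my.tar.gz" dest="${project.build.directory}" compression="gzip"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>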
To put it in a nutshell:
Is it actually a portability issue, or have I messed up somewhere?
If it's a portability issue: is it common with Maven, or am I unlucky on this one?
More technical details:
I used the same plugin on both Linux and Windows (same version, same Maven repository)
It failed on CentOS and on a VM running Ubuntu 12.04

My first troubleshooting step when a build works on one machine and not another is to clean out the local Maven repository on the failing machine, and let Maven re-download all of the artifacts. That's often enough to fix the problem.
If the build fails with the same error, then I clean out the local repository on the working machine and build. Usually then I see that I've missed a dependency in the POM that just happened to exist in my local repository already. Fixing the POM often makes the build work on both systems.
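If you only want to refresh the artifacts involved here rather than wipe the whole repository, one way (the path below simply mirrors the plugin's groupId) is:
rm -rf ~/.m2/repository/com/googlecode/maven-download-plugin
mvn -U clean install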

Did you check the Maven versions (mvn -version)? org.codehaus.plexus.util is a dependency of Maven core, so if maven-download-plugin is running under a different version of Maven than the one it was compiled for, that would explain the error.
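If the versions do turn out to differ, you can make the requirement explicit so the build fails fast with a clear message instead of a NoSuchMethodError. A minimal sketch with the maven-enforcer-plugin (the version range is an assumption; adjust it to whatever the plugin actually needs):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-maven-version</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireMavenVersion>
            <version>[3.0.5,)</version>
          </requireMavenVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>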

Related

Why can't my local environment find generated Proto classes?

I have a project that is set up to compile protobufs specified in my resources directory. To that end, I am using the xolstice plugin, with the following configuration:
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>compile-custom</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <protocArtifact>com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}</protocArtifact>
    <pluginId>grpc-java</pluginId>
    <pluginArtifact>io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier}</pluginArtifact>
  </configuration>
</plugin>
This is roughly the configuration described here. The protos in question include a few object models as well as a GRPC service which I register for use.
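As an aside, the ${os.detected.classifier} property used above normally comes from the os-maven-plugin build extension, which looks roughly like this (the version is an assumption):
<build>
  <extensions>
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
  ...
</build>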
The .jar is packaged easily enough, with a maven-jar-plugin I've inherited from our common root pom. The configuration for that is:
<plugin>
  <artifactId>maven-jar-plugin</artifactId>
  <version>3.2.0</version>
  <configuration>
    <archive>
      <manifest>
        <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
        <addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
      </manifest>
    </archive>
  </configuration>
</plugin>
When I run my project in IntelliJ, everything seems to work fine - I can observe that the required protos are generated correctly and I don't have any issues. However, when I run the .jar with java -jar target/service.jar, I run into the following issue:
[Byte Buddy] ERROR com.artistchooser2.handlers.ChooseArtistsHandler [jdk.internal.loader.ClassLoaders$AppClassLoader#5c29bfd, unnamed module #776b83cc, Thread[main,5,main], loaded=false]
java.lang.IllegalStateException: Cannot resolve type description for com.artistChooser2.v1.ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase
at net.bytebuddy.pool.TypePool$Resolution$Illegal.resolve(TypePool.java:161)
at net.bytebuddy.pool.TypePool$Default$WithLazyResolution$LazyTypeDescription.delegate(TypePool.java:1038)
The class that should have been generated by the protoc compilation step seems to be nowhere to be found. Interestingly, however, I can easily check whether it is really missing by running jar -tvf target/service.jar | grep 'ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase'. If I run that, I can observe that the class IS actually available and correctly packaged in the .jar. I can also verify that easily enough in IntelliJ by perusing the contents of the .jar.
I noticed this issue because I was setting up a test that runs my service in a Docker image and verifies that it starts up correctly, as it would in production. Interestingly, however, although I am locally unable to get mvn verify to run successfully, my build server (which I have confirmed is running mvn verify) runs to completion without issue.
I've checked all of the usual suspects - it has nothing to do with the Maven build profile used on the build server, the Maven versions are the same on the build server and locally, and I've even tried clearing ~/.m2/repository in case there was something fishy there.
So I guess my question is whether anyone has any further leads? Is there something else I should be looking into, some sort of environment variable, or anything else that might cause the above exception locally but not on a build server?
So I'm still not entirely sure how exactly this issue manifested - in my .proto spec, I had:
option java_package = "com.artistChooser2.v1";
Interestingly, on my local machine, the compiled protos in the packaged .jar were appearing under com.artistchooser2.v1. Note the lowercase 'c'. But the test I was running was still looking in the uppercase-'C' package.
For some reason they were compiled into the correct location on the build server, but not in my local jar. I'm running a Mac, while the build server is a Linux box, so it seems to have something to do with the environment (the default macOS filesystem is case-insensitive, which may explain why the two package spellings collide locally but not on Linux).
Either way, the solution was to alter the package name to what my machine was expecting, i.e. the lowercase 'c' - probably a better package name anyway.
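In other words, the fix was simply switching the option to the all-lowercase package:
option java_package = "com.artistchooser2.v1";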

Maven + AspectJ/SpringAOP + Lombok + Surefire = test broken in a specific scenario

I have an interesting problem in a project where all of the technologies mentioned in the title are used. I've been able to track it down to a diagnosis (the test classpath prepared by Surefire), but I don't understand whether it can be fixed, or how. It's not a showstopper - indeed it's a minor issue for me - but I'd like to solve it anyway.
First a rough description.
The problem is related to executing tests in a specific module of the project, and only in a specific way.
Everything works (tests pass) when I run from the master pom level:
cd ${projHome}
mvn install
Everything works (tests pass) when I run:
cd ${projHome}/modules/CoreImplementation/
mvn test
That means I can build and test with no problems; the same goes for my Jenkins, and NetBeans can run the tests from the IDE when I need them.
But that module fails testing when I run from the master pom level:
cd ${projHome}
mvn test
with this error:
java.lang.NoSuchMethodError: it.tidalwave.northernwind.profiling.RequestProfilerAspect.aspectOf()Lit/tidalwave/northernwind/profiling/RequestProfilerAspect;
at it.tidalwave.northernwind.frontend.ui.spi.DefaultSiteViewController.processRequest(DefaultSiteViewController.java:82) ~[classes/:na]
at it.tidalwave.northernwind.frontend.ui.spi.DefaultSiteViewControllerTest.must_call_some_RequestProcessors_when_one_breaks(DefaultSiteViewControllerTest.java:161) ~[test-classes/:na]
Running mvn test as a second pass (after a mvn install -DskipTests) happens to be the way Drone.io and Travis do their job. While I could change their configuration, I'd like to stay with the standard configuration and fix the problem if possible.
The diagnosis in short and my question.
Now, the question in short (details are further below). I was able to track down the problem to different ways in which Surefire prepares the classpath to execute the tests.
When I run mvn install the classpath is:
${repo}/org/apache/maven/surefire/surefire-booter/2.16/surefire-booter-2.16.jar
${repo}/org/apache/maven/surefire/surefire-api/2.16/surefire-api-2.16.jar
${projHome}/modules/CoreImplementation/target/test-classes
${projHome}/modules/CoreImplementation/target/classes
  ${projHome}/modules/Core/target/it-tidalwave-northernwind-core-1.1-ALPHA-37-SNAPSHOT.952b0c8bdc77.jar
${repo}/it/tidalwave/thesefoolishthings/it-tidalwave-role/3.0-ALPHA-1/it-tidalwave-role-3.0-ALPHA-1.jar
  ${projHome}/modules/Profiling/target/it-tidalwave-northernwind-core-profiling-1.1-ALPHA-37-SNAPSHOT.952b0c8bdc77.jar
${repo}/org/apache/commons/commons-math3/3.0/commons-math3-3.0.jar
…
When I run mvn test (from the project home) the classpath is:
${repo}/org/apache/maven/surefire/surefire-booter/2.16/surefire-booter-2.16.jar
${repo}/org/apache/maven/surefire/surefire-api/2.16/surefire-api-2.16.jar
${projHome}/modules/CoreImplementation/target/test-classes
${projHome}/modules/CoreImplementation/target/classes
  ${projHome}/modules/Core/target/unwoven-classes
${repo}/it/tidalwave/thesefoolishthings/it-tidalwave-role/3.0-ALPHA-1/it-tidalwave-role-3.0-ALPHA-1.jar
  ${projHome}/modules/Profiling/target/unwoven-classes
${repo}/org/apache/commons/commons-math3/3.0/commons-math3-3.0.jar
…
The differing portions are the indented ones. In the former case, Surefire uses the classes directory (forget for a moment that in my case they are named unwoven-classes) only for the module under test, and the installed jar files for every dependency. In the latter case, it seems to use the classes directories for all dependencies in the reactor.
The reason why this difference in the classpaths gives me trouble is explained below in the "Gory details" section. In short, unwoven means that those directories contain bytecode not augmented by AspectJ, hence the methods that can't be found at runtime.
I'm running with Surefire 2.16, but I've also tried the latest 2.19 with no change. Being able to force Surefire to always use jar files for dependencies would fix my problem. If you have the answer, you can stop reading my post here.
Gory details (just for curiosity).
The faulty module artifactId is it-tidalwave-northernwind-core-default and it depends on aspects available in it-tidalwave-northernwind-core-profiling - that's where the offending RequestProfilerAspect is. The aspect library dependency is both in the regular dependencies of the faulty module and in the configuration of the aspectj plugin:
<dependency>
  <groupId>it.tidalwave.northernwind</groupId>
  <artifactId>it-tidalwave-northernwind-core-profiling</artifactId>
</dependency>
...
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>aspectj-maven-plugin</artifactId>
      <configuration>
        <aspectLibraries combine.children="append">
          <dependency>
            <groupId>it.tidalwave.northernwind</groupId>
            <artifactId>it-tidalwave-northernwind-core-profiling</artifactId>
          </dependency>
        </aspectLibraries>
      </configuration>
    </plugin>
  </plugins>
</build>
AspectJ integration is provided by the following profile in a Super POM, which is activated in the build; the relevant part is:
<profile>
  <id>it.tidalwave-aspectj-springaop-v1</id>
  ...
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <executions>
          <execution>
            <id>default-compile</id>
            <phase>compile</phase>
            <configuration>
              <outputDirectory>target/unwoven-classes</outputDirectory>
            </configuration>
          </execution>
          <execution>
            <id>default-testCompile</id>
            <phase>test-compile</phase>
            <configuration>
              <outputDirectory>target/unwoven-test-classes</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
      ...
The AspectJ plugin is configured in the profile to statically weave the binaries in the unwoven-classes / unwoven-test-classes directories. The reason for this approach is that, AFAIK, it's the only feasible way to have both Lombok and AspectJ work together.
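The binary-weaving part of that setup looks roughly like this - only a sketch, since the actual Super POM may differ in phase, version, and other details; the woven classes end up in the default ${project.build.outputDirectory}, i.e. target/classes:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>weave-classes</id>
      <phase>process-classes</phase>
      <goals>
        <goal>compile</goal>
      </goals>
      <configuration>
        <!-- weave the bytecode that javac (and Lombok) wrote to the redirected output directory -->
        <weaveDirectories>
          <weaveDirectory>${project.build.directory}/unwoven-classes</weaveDirectory>
        </weaveDirectories>
      </configuration>
    </execution>
  </executions>
</plugin>
The aspectLibraries block shown earlier is then appended to this configuration via combine.children="append".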
Now, back to the two classpaths described above: the fact that Surefire is using unwoven-classes means that it's pointing at bytecode that has not been augmented with the AspectJ-generated methods, hence the error.
References
The project is a FLOSS one and can be found at
https://bitbucket.org/tidalwave/northernwind-src
or
https://github.com/tidalwave-it/northernwind-src
A changeset where the problem can be reproduced is f98e9a89ac70138c1b6bd0d4570a22d59ed71be6. JDK 1.8.0 is required to build the project (even though it doesn't use Java 8 code yet).
The SuperPOM can be found here:
https://bitbucket.org/tidalwave/thesefoolishthings-superpom-src

SoapUI Maven plugin has weird dependencies

I'm trying to use the SoapUI Maven plugin as shown here: http://www.soapui.org/Test-Automation/maven-2x.html
I can't add the eviware repository, since I'm behind a corporate firewall (only the browser can connect to the outside world) and have to use the local Artifactory repo (which mirrors Maven Central).
So I tried downloading the appropriate jars and POMs from the eviware repo and adding them to the local Artifactory. Now it seems I've gone down the rabbit hole: the SoapUI plugin has more and more dependencies that I need to add. At first I didn't mind, but we're looking at tens of jars now, and most of them seem to be things that should be in Maven Central. But then I saw that most of these have altered group IDs - for example, there is a Jetty dependency that uses "jetty" as its group ID instead of the canonical "org.mortbay.jetty". And this seems to be the case for many of these dependencies.
So my question has two sides: What are the SoapUI folks doing here? This seems fishy to me, or am I overlooking something?
And second, can I somehow make the plugin use the canonical jars instead of having to chase all the stuff that's in eviware's repository?
Have a look at this SoapUI forum thread, where I explain why SoapUI uses weird Maven coordinates and what could be done to use regular ones.
I have already complained about the problem in this post, and I am sure the SoapUI devs are aware of it. Sadly, there is no current work to fix it.
I tried relentlessly to use the maven-soapui-plugin, but it didn't work for me. However, the maven-soapui-extension-plugin mentioned by pppeater and developed by redfish above worked fine when I tried it. I used Artifactory as my repo manager.
First, configure the plugin in your POM:
<plugin>
  <groupId>com.github.redfish4ktc.soapui</groupId>
  <artifactId>maven-soapui-extension-plugin</artifactId>
  <version>4.6.3.0</version>
  <executions>
    <execution>
      <id>soapui-tests</id>
      <phase>verify</phase>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <projectFile>${basedir}/src/test/soapui/airline-sample-soapui-project.xml</projectFile>
    <outputFolder>${basedir}/target/soapui</outputFolder>
    <junitReport>true</junitReport>
    <exportAll>false</exportAll>
    <printReport>false</printReport>
  </configuration>
</plugin>
You still need to add the SoapUI repository to your repository manager's remote repositories and virtual repositories list, using the URL http://www.soapui.org/repository/maven2/
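If you cannot touch the repository manager, declaring the repository directly in the POM (or in settings.xml) is roughly equivalent - a sketch; depending on how the plugin's dependencies resolve, a matching <repositories> entry may be needed as well:
<pluginRepositories>
  <pluginRepository>
    <id>soapui</id>
    <url>http://www.soapui.org/repository/maven2/</url>
  </pluginRepository>
</pluginRepositories>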
Just came across this plugin: https://github.com/redfish4ktc/maven-soapui-extension-plugin which notes: "starting from soapui 3.6.1, almost all SmartBear plugin versions have missing dependencies. This is fixed in maven-soapui-extension-plugin"
It also appears to address some Groovy dependencies which others have noted as an issue with the plugin.

How do I write a maven plugin which actually runs?

The instructions here seem very clear:
http://maven.apache.org/guides/plugin/guide-java-plugin-development.html
However, the first problem I run into is that the dependencies are wrong. I also needed to reference the maven-plugin-annotations dependency.
Then, when I attempt to run, I get the "No plugin descriptor found at META-INF/maven/plugin.xml" error. I haven't figured out what to do about that.
I've found lots of pages referencing the maven-plugin-plugin, but I can't figure out how to add it to the pom so that it actually does anything which allows my own plugin to run.
Is there an updated version of the plugin development instructions which actually mentions the need to use maven-plugin-plugin?
If I can't get this to work I'm just going to go back to using exec-maven-plugin. It's uglier, but at least it works easily.
There are actually several terrific resources from Sonatype for learning how to write plugins:
Maven the Complete Reference: Writing Plugins
Maven Cookbook: Creating an Ant Maven Plugin
Maven Cookbook: Writing Plugins in Groovy
If I recall correctly, you need to configure the maven-plugin-plugin this way to avoid the "No plugin descriptor found..." issue.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-plugin-plugin</artifactId>
  <version>3.2</version>
  <configuration>
    <!-- see http://jira.codehaus.org/browse/MNG-5346 -->
    <skipErrorNoDescriptorsFound>true</skipErrorNoDescriptorsFound>
  </configuration>
  <executions>
    <execution>
      <id>mojo-descriptor</id>
      <goals>
        <goal>descriptor</goal>
      </goals>
    </execution>
  </executions>
</plugin>
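For the annotation-based (Java 5) style, the plugin project also needs the plugin API and the annotations on its classpath, roughly like this (versions are assumptions; align them with your Maven and plugin-tools versions), alongside <packaging>maven-plugin</packaging>:
<dependencies>
  <dependency>
    <groupId>org.apache.maven</groupId>
    <artifactId>maven-plugin-api</artifactId>
    <version>3.2.5</version>
  </dependency>
  <dependency>
    <!-- needed at build time only, hence provided scope -->
    <groupId>org.apache.maven.plugin-tools</groupId>
    <artifactId>maven-plugin-annotations</artifactId>
    <version>3.2</version>
    <scope>provided</scope>
  </dependency>
</dependencies>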
I forked a simple GitHub project called maven-wrapper (port of the Gradle wrapper) to make it a Maven plugin.
"It should be easy" for you to figure out pieces that you may eventually be missing:
Maven wrapper plugin(Mojo)
Maven Wrapper full POM

How should I get Maven to deploy artifacts for all supported architectures at the same time?

I have a question that's probably pretty similar to this. I need to solve what I have to imagine to be a pretty common problem -- how to configure Maven to produce multiple variations on the same artifact -- but I have yet to find a good solution.
I have a multi-module project, that eventually results in the assembly plugin generating an artifact. However, part of the assembly includes libraries that have changed substantially in the recent past, with the result that some consumers of the project need library version N, while others need version N+1. Ideally, we'd just automatically generate multiple artifacts, e.g. theproject-1.2.3.thelib-1.0.tar.gz, theproject-1.2.3.thelib-1.1.tar.gz, etc. (where that's release 1.2.3 of our project, running against either library version 1.0 or 1.1).
Right now, I have a bunch of default properties, which build against the latest version of the library in question, plus a profile to build against the older version. I can deploy one or the other this way, but cannot deploy both in one build. Here's the key wrinkle that differs from the above question: I can't automate build-one-clean-build-the-other inside of the release plugin.
Normally, we'd mvn release:prepare release:perform from the root of the multi-module project to take care of deploying everything to our internal Nexus. However, in that case, we have to pick one -- either run the old-library profile, or run without and get the new one. I need the release plugin to deploy both. Is this just impossible? I have to imagine we're not the first people who want to have our automated builds generate support for different platforms....
You can install additional artifacts with different types/classifiers. Use the attach-artifact goal of the build-helper-maven-plugin to achieve this. Here is a small example - we are deploying Windows and Unix installers of the product as windows/exe and unix/sh files. These files will be installed to the local repo and deployed to the distribution management repository.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>install-installation</id>
      <phase>install</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>${basedir}/target/${project.artifactId}-${project.version}-windows.exe</file>
            <classifier>windows</classifier>
            <type>exe</type>
          </artifact>
          <artifact>
            <file>${basedir}/target/${project.artifactId}-${project.version}-unix.sh</file>
            <classifier>unix</classifier>
            <type>sh</type>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
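Consumers can then pull in a specific variant through the classifier and type of their dependency declaration, for example (coordinates are illustrative):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>theproject</artifactId>
  <version>1.2.3</version>
  <classifier>unix</classifier>
  <type>sh</type>
</dependency>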
Hope this helps.
