Can I set DYLD_LIBRARY_PATH with maven-surefire-plugin on macOS?

In a Java project, I depend on a third-party native library that in turn loads dependency dylibs via DYLD_LIBRARY_PATH. I've successfully run tests via Tycho's surefire plugin by setting this with the environmentVariables property of the plugin config, but a similar setup in a non-OSGi project leaves the DYLD_LIBRARY_PATH variable unset.
Here's a snippet of my functioning Tycho configuration:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-surefire-plugin</artifactId>
  <version>0.25.0</version>
  <configuration>
    <argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${dylib-program}"</argLine>
    <environmentVariables>
      <DYLD_LIBRARY_PATH>${dylib-program}</DYLD_LIBRARY_PATH>
    </environmentVariables>
  </configuration>
</plugin>
With that, the tests run correctly, and printing System.getenv("DYLD_LIBRARY_PATH") shows the path that was set.
Here's the equivalent snippet from my non-Tycho config:
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${dylib-program}"</argLine>
    <environmentVariables>
      <DYLD_LIBRARY_PATH>${dylib-program}</DYLD_LIBRARY_PATH>
      <OtherVar>bar</OtherVar>
    </environmentVariables>
  </configuration>
</plugin>
When I run this, however, the dependent library doesn't load properly and System.getenv("DYLD_LIBRARY_PATH") returns null. System.getenv("OtherVar"), on the other hand, returns "bar", so setting environment variables generally seems to work. That makes me suspect there is something peculiar about DYLD_LIBRARY_PATH (the same happens with LD_LIBRARY_PATH, but not with PATH).
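For what it's worth, the check itself is trivial; a minimal standalone equivalent of the diagnostic (the class is hypothetical, but it prints the same values the test sees) would be:

public class EnvCheck {
  public static void main(String[] args) {
    // Under the non-Tycho configuration above, the first line prints null,
    // while the second prints "bar".
    System.out.println("DYLD_LIBRARY_PATH = " + System.getenv("DYLD_LIBRARY_PATH"));
    System.out.println("OtherVar = " + System.getenv("OtherVar"));
  }
}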
The behavior is the same when run in Eclipse (either as-is or with the path also set in the Run Configuration environment) and via the command line (again either as-is or with the environment variable explicitly exported before the run). The Tycho and non-Tycho projects are run on the same machine, with the same tools (other than the test plugins). I'm using macOS 10.12.3, Java 1.8.0_111, and Maven 3.3.9.
Is there a general limitation about setting this property, at least on a Mac, or is there a way I can work around this?

Related

Why can't my local environment find generated Proto classes?

I have a project that is set up to compile protobufs specified in my resources directory. To that end, I am using the xolstice plugin, with the following configuration:
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>compile-custom</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <protocArtifact>com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}</protocArtifact>
    <pluginId>grpc-java</pluginId>
    <pluginArtifact>io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier}</pluginArtifact>
  </configuration>
</plugin>
This is roughly the configuration described here. The protos in question include a few object models as well as a gRPC service, which I register for use.
The .jar is packaged easily enough, with a maven-jar-plugin configuration I've inherited from our common root pom. The configuration for that is:
<plugin>
  <artifactId>maven-jar-plugin</artifactId>
  <version>3.2.0</version>
  <configuration>
    <archive>
      <manifest>
        <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
        <addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
      </manifest>
    </archive>
  </configuration>
</plugin>
When I run my project in IntelliJ, everything seems to work fine - I can observe that the required protos are generated correctly and I don't have any issues. However, when I run the .jar with java -jar target/service.jar, I run into the following issue:
[Byte Buddy] ERROR com.artistchooser2.handlers.ChooseArtistsHandler [jdk.internal.loader.ClassLoaders$AppClassLoader@5c29bfd, unnamed module @776b83cc, Thread[main,5,main], loaded=false]
java.lang.IllegalStateException: Cannot resolve type description for com.artistChooser2.v1.ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase
    at net.bytebuddy.pool.TypePool$Resolution$Illegal.resolve(TypePool.java:161)
    at net.bytebuddy.pool.TypePool$Default$WithLazyResolution$LazyTypeDescription.delegate(TypePool.java:1038)
The class that should have been generated by the protobuf compilation step seems to be nowhere to be found. Interestingly, however, I can easily check whether the class was packaged by running jar -tvf target/service.jar | grep 'ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase'. If I run that, I can see that the class IS actually available and correctly packaged in the .jar. I can also verify that easily enough in IntelliJ by browsing the contents of the .jar.
I noticed this issue because I was setting up a test that runs my service in a Docker image and verifies that it starts up correctly, as it would in production. Interestingly, although I am locally unable to get mvn verify to run successfully, my build server (which I have confirmed is running mvn verify) runs to completion without issue.
I've checked all the usual suspects: it has nothing to do with the Maven build profile used on the build server, the Maven versions are the same on the build server and locally, and I've even tried clearing ~/.m2/repository in case there was something fishy there.
So I guess my question is whether anyone has any further leads? Is there something else I should be looking into, some sort of environment variable, or anything else that might cause the above exception locally but not on a build server?
So I'm still not entirely sure how exactly this issue manifested. In my .proto spec, I had:
option java_package = "com.artistChooser2.v1";
Interestingly, on my local machine, the compiled protos in the packaged .jar appeared under com.artistchooser2.v1. Note the lowercase 'c'. But the test I was running was still looking in the upper-case 'C' package.
For some reason the build server compiled them into the expected upper-case package, while my local jar did not. I'm running a Mac, while the build server is a Linux box, so I'm curious whether it has something to do with the environment.
Either way, the solution was to alter the package name to what my machine was expecting, i.e. the lowercase 'c', which is probably a better package name anyway.
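For reference, the corrected option is simply the line above with an all-lowercase package:

option java_package = "com.artistchooser2.v1";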

Confusion about the maven-compiler-plugin

I am quite confused about the maven-compiler-plugin and what it does. I have a project that has several modules. In my top-level pom.xml I have this section:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>2.5.1</version>
  <inherited>true</inherited>
  <configuration>
    <source>1.7</source>
    <target>1.7</target>
  </configuration>
</plugin>
My understanding is that this specifies the JDK version used to compile the code, and that this section gets inherited by all the modules. What I don't get is that in IntelliJ IDEA I can still specify the project JDK in the settings, and that setting seems to override this one. When I run mvn install in the IDE, I can confirm that it is using javac from JDK 8 to compile. So what does this section do exactly?
You are correct; maven-compiler settings should be inherited by child modules.
I don't know about IntelliJ, but I can tell you that Eclipse picks and chooses whatever it wants from the Maven config and uses its own settings for everything else. I'd expect IntelliJ does something similar.
The simplest way to test this is to run mvn clean install from the command line and see which setting "wins". If you get an artifact compiled for 1.8, you are missing something in the Maven config that causes those settings not to propagate to the child modules. If you get an artifact compiled for 1.7, then it is IntelliJ overriding things, not the maven-compiler-plugin.
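One way to see which version actually won is to inspect the class file version of a compiled class (the class name here is hypothetical; a major version of 51 corresponds to Java 7, 52 to Java 8):

javap -verbose -cp target/classes com.example.SomeClass | grep "major version"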

Configure Maven to look at a local directory while running tests

We are using JBehave, and we need to add a local folder and some jars to the classpath when running the tests.
The problem is that the local folder can vary from system to system. We want the tests to run against the jars installed and the resources defined on each particular system.
How can I add a dependency to Maven that changes from system to system?
You can use environment variables in your pom.xml via ${env.VARIABLE_NAME}. If you have the path to your local folder in the pom, you could replace it with such a variable. If you do so, you have to set that variable on every system you execute the Maven job on; I have found some guides for Linux and Windows on how to do that. Hope this fits your problem.
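For example, a property in the pom could pick up a per-machine variable like this (a sketch; JBEHAVE_LIB_DIR is a hypothetical variable name you would have to export on each system):

<properties>
  <!-- resolves to the value of the JBEHAVE_LIB_DIR environment variable at build time -->
  <jbehave.lib.dir>${env.JBEHAVE_LIB_DIR}</jbehave.lib.dir>
</properties>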
The tests are executed by the maven-surefire-plugin. The plugin has only one goal, surefire:test, and this goal supports the configuration of additionalClasspathElements.
You can find a usage example here. The example configuration on this page looks like this:
<project>
  [...]
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.12.2</version>
        <configuration>
          <additionalClasspathElements>
            <additionalClasspathElement>path/to/additional/resources</additionalClasspathElement>
            <additionalClasspathElement>path/to/additional/jar</additionalClasspathElement>
          </additionalClasspathElements>
        </configuration>
      </plugin>
    </plugins>
  </build>
  [...]
</project>
I would go this way. To change the local folder location for each system, you can use environment variables or different Maven profiles, as sketched below.
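A profile-based variant could look like this (a sketch; the profile ids and paths are made up for illustration, and ${local.lib.dir} would be referenced from the additionalClasspathElement above):

<profiles>
  <profile>
    <id>workstation-linux</id>
    <properties>
      <!-- hypothetical per-machine location of the local jars -->
      <local.lib.dir>/opt/jbehave/lib</local.lib.dir>
    </properties>
  </profile>
  <profile>
    <id>workstation-windows</id>
    <properties>
      <local.lib.dir>C:\tools\jbehave\lib</local.lib.dir>
    </properties>
  </profile>
</profiles>

Each system then selects its profile with mvn test -P workstation-linux, or via an activation rule.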

How can I append to the system path for Maven Surefire tests that use DLLs?

I'd like to be able to append a lib directory to the system PATH to allow Maven to run unit tests that (gah) use DLL native libraries.
So far I have used <argLine>-Djava.library.path=${path.dll}</argLine> to add my DLL directory as a library path. However, Windows still resolves dependent DLLs via the system PATH, and I'm getting:
java.lang.UnsatisfiedLinkError
So, is there a way of providing a modified system path to Surefire?
Thanks in advance for your help.
Turns out the following config was necessary:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    ...
    <environmentVariables>
      <PATH>${basedir}\..;${java.library.path}</PATH>
    </environmentVariables>
    ...
  </configuration>
</plugin>

Specifying Maven memory parameter without setting MAVEN_OPTS environment variable

I was wondering if it is possible to specify Maven memory boundaries with a syntax similar to:
mvn -Dtest=FooTest -DXmx=512M clean test
I tried a couple of variations, so far unsuccessfully.
I am aware of the MAVEN_OPTS environment variable, but I would like to avoid it.
Related to the above: is it also possible to control the memory behavior of the Surefire plugin in a similar manner, so that it forks the JVM with the overridden memory amount (overriding the <argLine> parameter in the pom's plugin configuration, if present)?
To specify the max memory (not via MAVEN_OPTS as originally requested) you can do the following:
mvn clean install -DargLine="-Xmx1524m"
You can configure the surefire-plugin to use more memory. Take a look at Strange Maven out of memory error.
Update:
If you would like to set the parameter from the command line, take a look at mvn.bat (or the mvn shell script) in your Maven installation directory: it passes MAVEN_OPTS, together with any additional options specified on the command line, to the JVM.
Another possibility is to configure the surefire-plugin to use a property specified on the command line, e.g. mvn ... -Dram=512, combined with:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <forkMode>once</forkMode>
    <argLine>-Xms${ram}m -Xmx${ram}m</argLine>
  </configuration>
</plugin>
Since Maven 3.3.1 a default can also be specified in ${maven.projectBasedir}/.mvn/jvm.config
It is documented here: https://maven.apache.org/docs/3.3.1/release-notes.html#JVM_and_Command_Line_Options
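For example, a .mvn/jvm.config containing just the following line (values illustrative) applies those options to every mvn invocation in the project:

-Xms256m -Xmx512m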
