Why can't my local environment find generated Proto classes? - maven

I have a project that is set up to compile protobufs specified in my resources directory. To that end, I am using the xolstice plugin, with the following configuration:
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>compile-custom</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <protocArtifact>com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}</protocArtifact>
    <pluginId>grpc-java</pluginId>
    <pluginArtifact>io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier}</pluginArtifact>
  </configuration>
</plugin>
This is roughly the configuration described here. The protos in question include a few object models as well as a gRPC service, which I register for use.
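Note that the ${os.detected.classifier} property in the snippet above is provided by the os-maven-plugin build extension, which the same guide configures alongside the protobuf plugin (the version shown here is illustrative):
<build>
  <extensions>
    <!-- detects the OS and sets ${os.detected.classifier} for the protoc artifacts -->
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
</build>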
The .jar is packaged easily enough, with a maven-jar-plugin setup I've inherited from our common root pom. The configuration for that is:
<plugin>
  <artifactId>maven-jar-plugin</artifactId>
  <version>3.2.0</version>
  <configuration>
    <archive>
      <manifest>
        <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
        <addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
      </manifest>
    </archive>
  </configuration>
</plugin>
When I run my project in IntelliJ, everything seems to work fine - I can observe that the required protos are generated correctly and I don't have any issues. However, when I run the .jar with java -jar target/service.jar, I run into the following issue:
[Byte Buddy] ERROR com.artistchooser2.handlers.ChooseArtistsHandler [jdk.internal.loader.ClassLoaders$AppClassLoader#5c29bfd, unnamed module #776b83cc, Thread[main,5,main], loaded=false]
java.lang.IllegalStateException: Cannot resolve type description for com.artistChooser2.v1.ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase
at net.bytebuddy.pool.TypePool$Resolution$Illegal.resolve(TypePool.java:161)
at net.bytebuddy.pool.TypePool$Default$WithLazyResolution$LazyTypeDescription.delegate(TypePool.java:1038)
The class that should have been generated by the protobuf compilation step seems to be nowhere on the classpath. Interestingly, I can check whether it was actually packaged by running jar -tvf target/service.jar | grep 'ChooseArtistsServiceGrpc$ChooseArtistsServiceImplBase'. If I run that, I can see that the class IS actually available and correctly packaged with the .jar. I can also verify that in IntelliJ easily enough by browsing through the contents of the .jar.
I noticed this issue because I was setting up a test that runs my service in a Docker image and verifies that it starts up correctly, as it would in production. Although I am unable to get mvn verify to run successfully locally, my build server (which I have confirmed runs mvn verify as well) completes without issue.
I've checked all of the usual suspects: it has nothing to do with the Maven build profile used on the build server, the Maven versions are the same on the build server and locally, and I've even tried clearing .m2/repository in case there was something fishy there.
So I guess my question is: does anyone have any further leads? Is there something else I should be looking into - some sort of environment variable, or anything else that might cause the above exception locally but not on a build server?

So I'm still not entirely sure how exactly this issue manifested - in my .proto spec, I had:
option java_package = "com.artistChooser2.v1";
Interestingly, on my local machine, the compiled protos in the packaged .jar appeared under com.artistchooser2.v1 - note the lowercase 'c' - while the test I was running was still looking in the uppercase-'C' package.
For some reason they compiled into the correct location on the build server but not in my local jar. I'm running a Mac while the build server is a Linux box, and macOS's default filesystem is case-insensitive while Linux's is case-sensitive, so I suspect the environment is the culprit.
Either way, the solution was to change the java_package option to what my machine was actually producing - the all-lowercase option java_package = "com.artistchooser2.v1" - which is probably the better package name anyway.

Related

Spark Maven and Jar Development Workflow with local and remote server

So I have a very basic question about how to work most effectively with a local Spark environment alongside a remote server deployment; despite all of the various pieces of info about this, I still don't find any of them very clear.
I have my IntelliJ environment set up, with the dependencies I need in my pom, so that I can compile, run, and test locally within IntelliJ. I then want to test and run against a remote server by copying my packaged jar file over via scp and running spark-submit there.
But I don't need any of the Maven dependencies inside the jar, since spark-submit will just use the software already on the server; really I just need a jar file with my classes, and keeping it lightweight for the scp would be best. Not sure if I'm misunderstanding this, but now I just need to figure out how to exclude dependencies from the jar during packaging. What is the right way to do that?
Update:
So I managed to create jars with and without dependencies using the configuration below, and I could just upload the dependency-free one to the server after building. But how can I build only the jar without dependencies, rather than waiting for the larger jar with everything in it, which I don't need anyway?
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.0.0</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Two things here:
1. The provided dependency scope lets you compile and test locally while preventing any server-provided libraries from being packaged (see the sketch below).
2. Maven doesn't package external libraries into the default jar at all; only an uber/shaded jar (such as the assembly plugin's jar-with-dependencies) includes them, so you can simply drop that execution if you don't need the fat jar.
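For example, marking the Spark artifacts as provided keeps them available for local compilation and testing but out of the packaged jar (a sketch; the artifact ID's Scala suffix and the version depend on your Spark setup):
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.2.0</version>
  <!-- available at compile/test time, supplied by the cluster at runtime -->
  <scope>provided</scope>
</dependency>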
An example of a good Spark POM is provided by Databricks.
Also worth mentioning: Maven copy local file to remote server using SSH.
See the Maven Wagon SSH plugin.
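A sketch of that approach, assuming an scp:// destination URL and a matching <server> entry with credentials in settings.xml (the host, path, and server ID below are hypothetical):
<build>
  <extensions>
    <!-- enables the scp:// protocol for Wagon -->
    <extension>
      <groupId>org.apache.maven.wagon</groupId>
      <artifactId>wagon-ssh</artifactId>
      <version>3.0.0</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>wagon-maven-plugin</artifactId>
      <version>2.0.0</version>
      <configuration>
        <!-- upload the thin jar produced by maven-jar-plugin -->
        <fromFile>${project.build.directory}/${project.build.finalName}.jar</fromFile>
        <url>scp://spark-server/home/deploy/jars</url>
        <serverId>spark-server</serverId>
      </configuration>
    </plugin>
  </plugins>
</build>
After packaging, mvn wagon:upload-single pushes the jar to the server.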

Can I set DYLD_LIBRARY_PATH with maven-surefire-plugin?

In a Java project, I depend on a third-party native library that in turn loads dependent dylibs via DYLD_LIBRARY_PATH. I've successfully run tests via Tycho's surefire plugin by setting this variable with the environmentVariables property of the plugin config, but a similar setup in a non-OSGi project leaves DYLD_LIBRARY_PATH unset.
Here's a snippet of my functioning Tycho configuration:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-surefire-plugin</artifactId>
  <version>0.25.0</version>
  <configuration>
    <argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${dylib-program}"</argLine>
    <environmentVariables>
      <DYLD_LIBRARY_PATH>${dylib-program}</DYLD_LIBRARY_PATH>
    </environmentVariables>
  </configuration>
</plugin>
With that, the tests run correctly, and outputting System.getenv("DYLD_LIBRARY_PATH") shows the set path.
Here's the equivalent snippet from my non-Tycho config:
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Dfile.encoding=UTF-8 -Djava.library.path="${dylib-program}"</argLine>
    <environmentVariables>
      <DYLD_LIBRARY_PATH>${dylib-program}</DYLD_LIBRARY_PATH>
      <OtherVar>bar</OtherVar>
    </environmentVariables>
  </configuration>
</plugin>
When I run this, however, the dependent library doesn't load properly and System.getenv("DYLD_LIBRARY_PATH") returns null. System.getenv("OtherVar") returns "bar", though, so setting environment variables generally works. That makes me suspect there is something peculiar about DYLD_LIBRARY_PATH (the same happens with LD_LIBRARY_PATH, but not with PATH).
The behavior is the same when run in Eclipse (either as-is or with the path also set in the Run Configuration environment) and via the command line (again either as-is or with the environment variable explicitly exported before the run). The Tycho and non-Tycho projects are run on the same machine, with the same tools (other than the test plugins). I'm using macOS 10.12.3, Java 1.8.0_111, and Maven 3.3.9.
Is there a general limitation about setting this property, at least on a Mac, or is there a way I can work around this?

Is maven-download-plugin not portable, or am I crazy?

TL;DR: I stumbled upon a situation where my pom.xml works fine on Windows but fails on Linux. Since I'm rather new to Maven, I'm not sure whether it's a common situation or whether I messed up somewhere.
More details:
I use the maven-download-plugin like this:
<plugin>
  <groupId>com.googlecode.maven-download-plugin</groupId>
  <artifactId>maven-download-plugin</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <id>get-stuff</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>wget</goal>
      </goals>
      <configuration>
        <url>http://myUrl/my.tar.gz</url>
        <unpack>true</unpack>
        <outputDirectory>${project.build.directory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
On Windows it works like a charm (i.e. it downloads and unpacks).
On Linux, it fails with the following error:
[ERROR] Failed to execute goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget (get-moab)
on project my-project: Execution get-stuff of goal com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget failed:
An API incompatibility was encountered while executing com.googlecode.maven-download-plugin:maven-download-plugin:1.1.0:wget: java.lang.NoSuchMethodError: org.codehaus.plexus.util.cli.Commandline.createArg()Lorg/codehaus/plexus/util/cli/Arg;
I found a workaround (<unpack>false</unpack>, then unpacking "manually" with antrun), but my pom.xml looked better without those additional 15 lines...
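For reference, the workaround looked roughly like this (a sketch of the antrun step; the archive path mirrors the download configuration above):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <id>unpack-stuff</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- unpack the archive fetched by the download plugin -->
          <untar src="${project.build.directory}/my.tar.gz" dest="${project.build.directory}" compression="gzip"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>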
To put it in a nutshell:
Is it actually a portability issue, or have I messed up somewhere?
If it is a portability issue: is it common with Maven, or am I unlucky on this one?
More technical details:
I used the same plugin on both Linux and Windows (same version, same Maven repository).
It failed on CentOS and on a VM with Ubuntu 12.04.
My first troubleshooting step when a build works on one machine and not another is to clean out the local Maven repository on the failing machine, and let Maven re-download all of the artifacts. That's often enough to fix the problem.
If the build fails with the same error, then I clean out the local repository on the working machine and build. Usually then I see that I've missed a dependency in the POM that just happened to exist in my local repository already. Fixing the POM often makes the build work on both systems.
Did you check the Maven versions (mvn -version)? org.codehaus.plexus.util is a dependency of Maven core, so if maven-download-plugin is running under a different version of Maven than the one it was compiled for, that would explain the error.
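If the versions do differ and upgrading Maven isn't an option, a common remedy for this kind of NoSuchMethodError is to force a newer plexus-utils onto the plugin's classpath via a plugin-level dependency (a sketch; the version is illustrative):
<plugin>
  <groupId>com.googlecode.maven-download-plugin</groupId>
  <artifactId>maven-download-plugin</artifactId>
  <version>1.1.0</version>
  <dependencies>
    <!-- overrides the plexus-utils the plugin would otherwise resolve -->
    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-utils</artifactId>
      <version>3.0.15</version>
    </dependency>
  </dependencies>
</plugin>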

intellij not picking up maven project structure

I have a mavenized Java project in IntelliJ 122.327. Unfortunately (due to legacy code), certain code in the src directory uses tests in the test directory. I'm trying to remove these dependencies, but it's a long shot. In the meanwhile, I'm able to compile and deploy by using the build-helper Maven plugin and adding src/test/java as a source directory:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>add-test-dir-source</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/test/java</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
Problem is, whenever I restart IntelliJ it marks the src/test directory as a "test" directory again (if I go to Project Structure -> Modules -> Sources, src/test is marked in green), so every time I have to manually mark test/java as "Sources". Is there a way to permanently mark it as sources? Even better, does IntelliJ have a way to read the pom and infer the project structure from it?
Check the logs for any related exceptions. There can be many reasons for this problem, such as a proxy with a self-signed certificate, invalid VM options, or network issues; see also this answer.
If the issue persists, contact support with the logs attached.

maven can't add files in generated-sources for compilation phase

I use Apache Thrift to generate code in target/generated-sources.
The Thrift compiler produces a directory named gen-java which contains all the Java code. When I execute mvn compile, the code is generated correctly in target/generated-sources/gen-java, but the compile phase complains that it can't find the classes defined in gen-java.
In my understanding, Maven 2 automatically adds generated sources, is that right?
And what if my testing code also depends on the generated sources - do I have to manually specify the compiler includes?
In my understanding, Maven 2 automatically adds generated sources, is that right?
Nothing automatic here; plugins that generate source code typically handle this themselves by adding their output directory (something like target/generated-sources/<tool> by convention) as a source directory in the POM, so that it gets included later during the compile phase.
Some less well-implemented plugins don't do that for you, and you have to add the directory yourself, for example using the Build Helper Maven Plugin.
And since you didn't provide any POM snippet or any link, I can't say anything more.
And what if my testing code also depends on the generated sources - do I have to manually specify the compiler includes?
As I said, generated sources are usually added as a source directory and compiled, so they are available on the test classpath without you having to do anything.
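Since you're generating with Thrift, it's also worth knowing there is a dedicated plugin that runs the Thrift compiler and registers gen-java as a source root for you; a sketch (coordinates and version quoted from memory, so verify them, and adjust thriftExecutable to your install):
<plugin>
  <groupId>org.apache.thrift.tools</groupId>
  <artifactId>maven-thrift-plugin</artifactId>
  <version>0.1.11</version>
  <configuration>
    <!-- path to the locally installed Thrift compiler -->
    <thriftExecutable>/usr/local/bin/thrift</thriftExecutable>
  </configuration>
  <executions>
    <execution>
      <id>thrift-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>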
Generated sources are not compiled or packaged automatically. Some IDEs (e.g. IntelliJ) will nevertheless show them as source folders.
To make generated sources visible to Maven, add an add-source step to the build/plugins node of your pom.xml:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- adjust folder name to your needs -->
          <source>${project.build.directory}/generated-sources/gen-java</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
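If some of your generated code is test-only, the same plugin offers an add-test-source goal that works the same way (a sketch; the folder name is hypothetical):
<execution>
  <id>add-test-source</id>
  <phase>generate-test-sources</phase>
  <goals>
    <goal>add-test-source</goal>
  </goals>
  <configuration>
    <sources>
      <source>${project.build.directory}/generated-test-sources/gen-java</source>
    </sources>
  </configuration>
</execution>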
