Maven compiler plugin issue with custom annotation processor

I have written a custom annotation processor and configured it with the maven-compiler-plugin as shown below. I am facing an issue with the Immutables annotation processor, which is on my application classpath: when I add my annotation processor via the maven-compiler-plugin, Immutables stops working and I get compilation errors. I need Immutables in my project as well.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <source>${java.version}</source>
    <target>${java.version}</target>
    <generatedSourcesDirectory>${project.build.directory}/generated-sources/</generatedSourcesDirectory>
    <annotationProcessors>
      <annotationProcessor>org.smarttechie.TraceAnnotationProcessor</annotationProcessor>
    </annotationProcessors>
  </configuration>
</plugin>
Any hints on using Immutables (or any other annotation processor) along with my custom annotation processor?

Package your annotation processor into its own JAR and include that JAR as a compilation dependency. Be sure to add a META-INF/services/javax.annotation.processing.Processor file to the JAR (its contents are a single line with your processor's fully qualified class name):
org.smarttechie.TraceAnnotationProcessor
With the processor registered for service discovery, you can drop the explicit <annotationProcessors> element: it maps to javac's -processor option, which disables the automatic discovery that Immutables relies on, and that is why Immutables stopped running. If you don't want the processor JAR included as a dependency of your generated artifact, mark it <scope>provided</scope> and/or <optional>true</optional>.
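A minimal sketch of the resulting dependency declaration in the consuming project (the artifactId and version are hypothetical):
<dependency>
  <groupId>org.smarttechie</groupId>
  <artifactId>trace-annotation-processor</artifactId>
  <version>1.0.0</version>
  <!-- keep the processor out of the consuming artifact's own dependency tree -->
  <scope>provided</scope>
  <optional>true</optional>
</dependency>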

Related

Different packaging with Spring Maven Plugin: Spring + Spark application

I have a project which consists of three parts:
Spring Boot application
Spark Application
"Library" used by both of the above (having this library as separate JAR or similar causes quiet a bit of overhead and slowed down the development)
So what I want is a JAR that can be used to run the Spring Boot app (java -jar myapp.jar) as well as the Spark app (java -cp myapp.jar path.to.main.class).
It is also OK to have two JARs - but both would need to be fat JARs (meaning: include dependencies).
What I tried in the pom.xml is this:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
This creates (as expected) a fat JAR that can be used to run the Spring Boot app. But it cannot be used for the Spark app (as the classes and dependencies are somehow repackaged, as far as I understand).
My second try was this:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
      <configuration>
        <classifier>exec</classifier>
      </configuration>
    </execution>
  </executions>
</plugin>
This creates the fat JAR as well plus another JAR just holding the classes that are implemented in my project - but without the dependencies. Therefore the Spark job does not start (as expected).
Any idea how to solve this situation?
Thanks!
I used the same technology stack for an application (Spring for the web part and Apache Spark for the big-data processing). I don't see a case where someone would want a single fat JAR for both sides, Spring + Spark (except the case where your Spark jobs use something from Spring). So the approach we used is to have two separate Maven modules, one for the Spring web part and one for the Apache Spark part. For Spring Boot we did not use the spring-boot-maven-plugin; instead we used the following Maven plugins, something like this:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.5.0</version>
  <configuration>
    <mainClass>com.Application</mainClass>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.5.1</version>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <excludeArtifactIds>integration</excludeArtifactIds>
        <outputDirectory>${project.build.directory}/lib/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <addClasspath>true</addClasspath>
        <classpathPrefix>lib/</classpathPrefix>
        <mainClass>com.Application</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>
This way we have better control over the dependencies (e.g., put them in a lib folder and reference them from the MANIFEST).
For the Spark application you have two options:
run it with spark-submit (personally, I don't prefer it)
use the SparkLauncher class from the spark-launcher dependency (to call a Spark job from the web side); see the dependency sketch below
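For the SparkLauncher route, the dependency looks roughly like this (the Scala suffix and version are assumptions here; they must match your Spark installation):
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-launcher_2.11</artifactId>
  <version>2.4.8</version>
</dependency>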
Building a fat JAR for the Spark application with only the dependencies used in the Spark code is desirable, because you load only what you truly need. We can use the maven-shade-plugin for this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <artifactSet>
          <includes>
            <!-- put what you need to include here -->
          </includes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
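Each include pattern has the form groupId:artifactId (optionally with type and classifier); for example, assuming a hypothetical internal module com.mycompany:spark-helpers:
<artifactSet>
  <includes>
    <include>com.mycompany:spark-helpers</include>
    <include>com.google.code.gson:gson</include>
  </includes>
</artifactSet>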
Using the maven-shade-plugin is not encouraged unless there is no other option. Below is an informative link on why not to use the shade plugin:
Downsides of using Shade plugin relocation feature
From my perspective, for running a Spark application the spring-boot-maven-plugin is the best option for managing everything, including library dependencies, even if your application runs as a scheduled job for which you need spark-submit/SparkLauncher to launch it.
In the other case, where your Java app uses both Spark and Spring and has a controller/API through which the application is used, the spring-boot-maven-plugin is also the best choice.
There are only two kinds of challenges when we use spark-submit/SparkLauncher to launch an application packaged with the spring-boot-maven-plugin.
1. Main Class
When we package a fat/uber JAR using the spring-boot-maven-plugin, it packages the class files and Java libraries the Spring way, not the way Spark expects. Inside the uber JAR generated by the plugin there are BOOT-INF, META-INF, and org folders. So when we pass the main class as a parameter to spark-submit or SparkLauncher, it will not be able to find that class, because the package/path given in the parameter no longer matches the structure of the JAR. Even if you specify the correct location starting with BOOT-INF, it will not work, because Spring launches the application through a different main class.
Below is the link which shows main class that should be used for launching the fat jar generated by spring boot maven plugin.
https://docs.spring.io/spring-boot/docs/current/reference/html/executable-jar.html#appendix.executable-jar.launching
At a high level, the MANIFEST.MF file inside the uber JAR contains the entries below. Main-Class is the actual main class used to initialize the Spring-related machinery, after which your own main class, given by the Start-Class entry, is started.
Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: com.mycompany.project.MyApplication
So, in conclusion, specifying "org.springframework.boot.loader.JarLauncher" as the main class in spark-submit or SparkLauncher (e.g. spark-submit --class org.springframework.boot.loader.JarLauncher myapp.jar) resolves this problem. This only works if you packaged the JAR with the spring-boot-maven-plugin.
2. External common libraries used in pom.xml + Spark installation
Another issue that might occur when using spark-submit or SparkLauncher to launch an uber JAR packaged with the spring-boot-maven-plugin is JAR conflicts. The problem is that when we package the JAR with the spring-boot-maven-plugin, it copies the dependencies into the BOOT-INF/lib folder. Let's say you are using the following dependency in pom.xml:
<dependency>
  <groupId>com.google.code.gson</groupId>
  <artifactId>gson</artifactId>
  <version>2.10</version>
</dependency>
Now let's say this dependency already exists with a different version in the Spark installation; in that case the classes will conflict. Because this is a class-loader issue, it is better not to use libraries that might create such conflicts, as the application might fail. I have faced this kind of issue with logger classes and JSON libraries, since multiple JSON/logger options are available. As a resolution, you can exclude those classes or libraries, or replace the library with an alternative one; a sketch of an exclusion follows.
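A minimal sketch of excluding a conflicting transitive library (here gson, pulled in by a hypothetical internal artifact com.mycompany:data-tools), so that the version shipped with the Spark installation wins:
<dependency>
  <groupId>com.mycompany</groupId>
  <artifactId>data-tools</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
    </exclusion>
  </exclusions>
</dependency>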

How to add test-jar as aspectLibrary in Maven

I have an aspect that I want to use in my test classes. I don't want to add it to the main JAR, as it would pull in test libraries like JUnit and Mockito. While there is a configuration setting to add an aspectLibrary, it always adds the main JAR; there is no way to specify the test-jar.
My aspectj plugin looks like this:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>aspect.test</groupId>
        <artifactId>general</artifactId>
        <type>jar</type>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <phase>process-sources</phase>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I actually want to specify <type>test-jar</type>, but that does not seem possible. Without the <type> element it defaults to jar (obviously).
I also might have to configure aspectj-maven-plugin for the compile and test-compile goal... but first I need to know how to specify the test-jar. Any suggestions are welcome.
Please read the Maven JAR Plugin documentation, chapter How to create a jar containing test classes. There are two options listed:
the easy way: using type "test-jar" (will not work here)
the preferred way: creating a normal JAR containing only test classes, then importing it with scope "test"
We will choose the preferred way because it solves your problem. So basically you do the following:
Create a separate module for the test helper aspects/classes and put everything under src/main/java, not src/test/java. The AspectJ Maven plugin should have an execution with <goal>compile</goal> for that module.
Add that module as a test-scoped dependency wherever you need the test aspects.
Refer to the module as an <aspectLibrary> from the AspectJ Maven plugin, and be careful to use only <goal>test-compile</goal> in the plugin execution of the consuming module, so as to avoid having the aspects woven into production code or getting error messages because the dependency has test scope and is unavailable for normal compilation. A sketch follows after this list.
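A minimal sketch of the consuming module's setup, assuming a hypothetical test-helper module com.example:test-aspects (the full working example is in the linked project below):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>test-aspects</artifactId>
  <version>1.0.0</version>
  <scope>test</scope>
</dependency>
...
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>com.example</groupId>
        <artifactId>test-aspects</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <!-- only test-compile, so the aspects are not woven into production code -->
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>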
Because I do not want to fully quote 3 POMs and several classes here, I have created a little GitHub sample project for you. Just clone it, inspect all the files and their respective locations and - be happy. ;-)

Spring Boot useTestClasspath throws CannotLoadBeanClassException and ClassNotFoundException

I have a maven project with a test class located at src/test/java/MyDevClass which is intended for development/testing purposes only. I would like to use it when I start spring-boot-maven-plugin with the command line mvn spring-boot:run.
So, my pom.xml contains:
<build>
  <plugins>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <!-- TODO: Will create a Maven profile and have useTestClasspath only for development/testing -->
        <useTestClasspath>true</useTestClasspath>
      </configuration>
    </plugin>
  </plugins>
</build>
But, I get the following error:
org.springframework.beans.factory.CannotLoadBeanClassException: Cannot find class [MyDevClass]
Caused by: java.lang.ClassNotFoundException: MyDevClass
Intriguingly enough, I have another project using the tomcat7-maven-plugin, and there it works fine:
<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.0</version>
  <configuration>
    <useTestClasspath>true</useTestClasspath>
  </configuration>
</plugin>
What am I missing?
The spring-boot-maven-plugin does not include the current module's test classes (and resources) in the classpath, even if useTestClasspath is set to true.
I ran a forked execution with verbose logging (-X flag to Maven), and the plugin listed the forked JVM classpath. Test classes were not included.
If you look at the plugin sources (version 1.5.3 at time of writing), the flag useTestClasspath is only used in the addDependencies method.
I see two options as workarounds:
Add the target/test-classes directory to the run classpath via the plugin's <configuration> (see the full sketch after this list):
<directories>
  <directory>${project.build.testOutputDirectory}</directory>
</directories>
For older plugin versions use:
<folders>
  <folder>${project.build.testOutputDirectory}</folder>
</folders>
The problem here is that the directory is added to the beginning of the classpath, while test classes are preferable at its end.
Create another Maven module with test classes and resources, and add it as a <scope>test</scope> dependency.
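For reference, a sketch of the first workaround in context (the <directories> element name is the one assumed above; check your plugin version):
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <useTestClasspath>true</useTestClasspath>
    <directories>
      <directory>${project.build.testOutputDirectory}</directory>
    </directories>
  </configuration>
</plugin>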

Configuring a Maven plugin with a custom packaging

I have written a custom Maven plugin that works in the package phase of Maven's default lifecycle. In addition, I have added a custom packaging type. To support the custom packaging type I have introduced a components.xml, so it overrides the default Maven lifecycle: in its component/configuration/lifecycles/lifecycle/phases/package section I have added my plugin, to be executed in the package phase (a sketch of such a mapping follows).
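For reference, such a lifecycle mapping in META-INF/plexus/components.xml looks roughly like this (the role-hint names the custom packaging type, assumed here as sample-packaging; some Maven versions require the plugin version inside the phase-binding string, e.g. sample:sampleArtifact:1.0:generate):
<component-set>
  <components>
    <component>
      <role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
      <role-hint>sample-packaging</role-hint>
      <implementation>org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping</implementation>
      <configuration>
        <lifecycles>
          <lifecycle>
            <id>default</id>
            <phases>
              <!-- bind the plugin's generate goal to the package phase -->
              <package>sample:sampleArtifact:generate</package>
            </phases>
          </lifecycle>
        </lifecycles>
      </configuration>
    </component>
  </components>
</component-set>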
When I use my plugin, I pass the configuration to the plugin through the pom.xml as follows:
<build>
  ...
  <plugins>
    <plugin>
      <groupId>sample</groupId>
      <artifactId>sampleArtifact</artifactId>
      <extensions>true</extensions>
      ...
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>generate</goal>
          </goals>
          <configuration>
            <!-- Configuration goes here. -->
          </configuration>
        </execution>
      </executions>
      ...
    </plugin>
  </plugins>
  ...
</build>
Problem:
The configuration I pass into the mojo as above is not getting set in the mojo. However, if I set the plugin configuration one level higher, directly under the <plugin> tag (as a sibling of <executions>), then it works. Since this plugin works in the package phase, I need the plugin configuration to be passed in through the <execution> as above. Without the custom packaging, the above configuration works well. Any thoughts on what I am missing here?

Difference between 'plugin' section and 'dependency' section in pom. Which one to use when?

While searching online for information about the maven-checkstyle-plugin, I found that it can be added both as a <dependency>, like this:
<dependency>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <version>2.5</version>
</dependency>
and also under <plugins> tag like this:
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <configuration>
        <configLocation>config/sun_checks.xml</configLocation>
      </configuration>
    </plugin>
  </plugins>
</reporting>
I would like to know the difference between each and which one to use when. Please guide.
As far as I know, plugins are also artifacts, so they can be added as a dependency to a project. However, adding a plugin artifact as a dependency does not bind its execution to any phase of the Maven build, so it will never be executed. To actually run a plugin, declare it under <build><plugins> (or <reporting><plugins> for reports) and bind a goal to a phase, as in the sketch below.
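For example, a minimal sketch binding the checkstyle check goal to the validate phase (version and configuration location taken from the question; adjust as needed):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <version>2.5</version>
      <configuration>
        <configLocation>config/sun_checks.xml</configLocation>
      </configuration>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals>
            <!-- the check goal fails the build on Checkstyle violations -->
            <goal>check</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>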
Here you can find some answers:
https://www.quora.com/In-Maven-what-is-the-difference-between-dependency-and-plugins
"A plugin is an extension to Maven, something used to produce your artifact (maven-jar-plugin for an example, is used to, you guess it, make a jar out of your compiled classes and resources).
A dependency is a library that is needed by the application you are building, at compile and/or test and/or run time." (Olivier Demeijer)
