org.springframework.boot:spring-boot-maven-plugin:1.5.10.RELEASE:repackage failed: Unable to find main class - maven

I have the following project structure :
parent pom
|
- Core module - pom
- Dist - pom
- Integration test - pom
The Spring Boot main class is in the core module. Integration tests will go into the integration-test module.
I am trying to use the spring-boot-maven-plugin for integration testing.
I have added the following plugin configuration in the integration-test module's pom. The start-class property is set in the parent pom:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <mainClass>${start-class}</mainClass>
    <layout>JAR</layout>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
    </execution>
    <execution>
      <id>pre-integration-test</id>
      <goals>
        <goal>start</goal>
      </goals>
    </execution>
    <execution>
      <id>post-integration-test</id>
      <goals>
        <goal>stop</goal>
      </goals>
    </execution>
  </executions>
</plugin>
However I get the following error on mvn clean install on the parent pom :
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:1.5.10.RELEASE:start (pre-integration-test) on project: Spring application did not start before the configured timeout (30000ms) -> [Help 1]
[ERROR]
Questions:
1. Is this project structure the best way to organize the integration tests?
2. How and where do I give the path of the main class to the spring-boot-maven-plugin correctly? It doesn't work even when I set it in the integration module or the parent pom, or when I give the kind of relative path you normally would (e.g. ../core-module/).
3. maven-failsafe-plugin and Spring annotations on the test class itself seem sufficient to run integration tests. So, first of all, is there any relation between integration tests and the spring-boot-maven-plugin? What is the spring-boot-maven-plugin needed for, apart from packaging an executable file? I am confused because it has start and stop goals bound to pre-integration-test and post-integration-test, but I fail to understand why they are needed.
PS: I have checked a few questions with a similar problem, but none of them answer the questions I have asked for this project structure.
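For reference on question 3: the start and stop goals exist for black-box testing — start boots the packaged application before the integration-test phase and stop shuts it down afterwards, while maven-failsafe-plugin runs the *IT classes against the running instance in between. A minimal sketch of that pairing (plugin versions and any shared configuration omitted):

```xml
<!-- Sketch: boot the packaged app around the integration-test phase and
     let Failsafe run the *IT classes against the running instance. -->
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>pre-integration-test</id>
      <goals>
        <goal>start</goal>
      </goals>
    </execution>
    <execution>
      <id>post-integration-test</id>
      <goals>
        <goal>stop</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Note that the start goal runs the application of the module being built, which is one reason it struggles when the main class lives in a sibling module.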

I just ran into the same error as you (repackage failed: Unable to find main class) while trying to pull some common Spring code into a shared Spring Boot project.
It turns out that is not possible, because a repackaged Spring Boot project needs a class with a main method.
I solved my problem by declaring direct dependencies on the Spring Framework modules I need; perhaps you can do the same in your integration-test project.
However, I don't know what kind of tests you are trying to run in the separate project, but tests concerning the core module should live in the core module itself, using Spring Test, not in a separate project (use mocks where needed).
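If the plugin has to stay in a module that has no main class of its own, another option (sketched here; the skip parameter is supported by the 1.5.x plugin) is to skip it in that module:

```xml
<!-- Sketch: keep the spring-boot-maven-plugin declared in the
     integration-test module but skip it there, since this module
     has no main class of its own. -->
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>
```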
Hope this helps.

Related

How should I run front end integration tests in spring boot?

I have a simple spring boot project that has a maven structure like this:
project
- backend (spring boot)
- frontend (angular)
I got the project configured so *IT.java classes run during the integration-test phase of Maven, and they successfully test the Spring Boot REST API with an H2 database.
I'm trying to configure the project so that my cucumber tests, which interact with the browser, run during the integration-test phase. The cucumber tests start up OK but fail for the simple reason that the Angular files don't get served by the Spring Boot application when it starts.
In backend/src/main/resources/static I have a simple index.html file. When the cucumber tests run, they open a browser and I see the content of that file.
In the backend module (war) pom I copy the dist content from the angular build...
<build>
  <plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <version>${spring.boot.version}</version>
    <executions>
      <execution>
        <goals>
          <goal>build-info</goal>
        </goals>
      </execution>
    </executions>
    <configuration>
      <mainClass>some.package.Boot</mainClass>
    </configuration>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>${maven.war.plugin.version}</version>
    <configuration>
      <warName>${root.context}</warName>
      <failOnMissingWebXml>false</failOnMissingWebXml>
    </configuration>
  </plugin>
  <resources>
    <resource>
      <directory>../frontend/<appname>/dist/frontend</directory>
      <!--<targetPath>src/main/resources/static</targetPath>-->
      <targetPath>static</targetPath>
    </resource>
  </resources>
</build>
In the resultant war file the angular files are packaged up into WEB-INF/classes/static/.
You'll note from above I also tried getting the angular code copied into resources/static, but that puts the files in target/classes/src/main/resources/static, so that isn't the right approach!
Yet the cucumber tests still only see the content of the index.html file from src/main/resources/static when they run.
Potentially we could move the angular app into the same src tree as the java/webapp code, but that's not an option.
How can I persuade Spring Boot to use the war in target rather than, as it appears to be doing, serving content from source?
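One pattern that may avoid the targetPath confusion, sketched under the assumption that the tests run against target/classes rather than the packaged war: copy the built frontend into the build output directory before the integration tests start. The <directory> path here is illustrative (the original uses a placeholder):

```xml
<!-- Sketch: copy the Angular build output into target/classes/static
     before tests run, so the app serves it when started from the build
     output. The <directory> path is illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-frontend</id>
      <phase>generate-test-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.outputDirectory}/static</outputDirectory>
        <resources>
          <resource>
            <directory>../frontend/dist/frontend</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
```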
Versions:
Java 11
Maven war plugin 3.2.2
Spring Boot version defined in parent pom as 2.1.3.RELEASE
spring-boot-starter-web
spring-boot-starter-data-rest
spring-boot-starter-data-jpa
spring-boot-starter-data-jdbc
spring-boot-starter-tomcat (not sure I need this one)

Different packaging with Spring Maven Plugin: Spring + Spark application

I have a project which consists of three parts:
Spring Boot application
Spark Application
"Library" used by both of the above (having this library as separate JAR or similar causes quiet a bit of overhead and slowed down the development)
So what I want is a JAR that can be used to run the Spring Boot app (java -jar myapp.jar) as well as the Spark app (java -cp myapp.jar path.to.main.class).
It is also OK to have two JARs - but both would need to be fat JARs (meaning: include dependencies).
What I tried in the pom.xml is this:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
This creates (as expected) a fat JAR that can be used to run the Spring Boot app. But it cannot be used for the Spark app (as the classes and dependencies are somehow repackaged as I understand).
My second try was this:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
      <configuration>
        <classifier>exec</classifier>
      </configuration>
    </execution>
  </executions>
</plugin>
This creates the fat JAR as well, plus another JAR holding only the classes implemented in my project, but without the dependencies. Therefore the Spark job does not start (as expected).
Any idea how to solve this situation?
Thanks!
I used the same technology stack for an application (Spring for the web part and Apache Spark for the big-data processing). I don't see a case where someone wants one fat jar for both sides, Spring + Spark (except where, inside Spark jobs, you use something from Spring). So the approach we use is to have two separate Maven modules, one for the Spring web part and one for the Apache Spark part. For Spring Boot we did not use the spring-boot-maven-plugin; instead we used the following Maven plugins:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.5.0</version>
  <configuration>
    <mainClass>com.Application</mainClass>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.5.1</version>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <excludeArtifactIds>integration</excludeArtifactIds>
        <outputDirectory>${project.build.directory}/lib/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <addClasspath>true</addClasspath>
        <classpathPrefix>lib/</classpathPrefix>
        <mainClass>com.Application</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>
This way we have better control over the dependencies (e.g. put them in a lib folder and list them on the classpath in the MANIFEST).
For the Spark application you have two options:
run it with spark-submit (personally I don't prefer it)
use the SparkLauncher class from the spark_launcher*.jar dependency (calling a Spark job from the web side)
Building a fat jar for the Spark application with only the dependencies used in the Spark code is desirable, because you load only what you truly need. We can use the maven-shade-plugin for this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <artifactSet>
          <includes>
            <!-- put here what you need to include -->
          </includes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
Using the maven-shade-plugin is not encouraged unless there is no other option. Below is an informative link on why not to use the shade plugin.
Downsides of using Shade plugin relocation feature
From my perspective, for running a Spark application the spring-boot-maven-plugin is the best option for managing everything, including library dependencies, even if your application runs as a scheduled job for which you need spark-submit or SparkLauncher to launch it.
In the other case, where your Java app uses both Spark and Spring and also has a controller/API to drive the application, the spring-boot-maven-plugin is again the best choice.
There are only two kinds of challenges when we use spark-submit/SparkLauncher to launch an application packaged with the spring-boot-maven-plugin.
1. Main Class
When we package a fat/uber jar with the spring-boot-maven-plugin, it packages the class files and libraries the Spring way, not the way Spark expects. Inside the uber jar generated by the plugin we have BOOT-INF, META-INF and org folders. So when we pass the main class to spark-submit or SparkLauncher as a parameter, it cannot find that class, because the package path given in the parameter no longer matches the structure of the jar. Even if you specify the real location starting with BOOT-INF, it will not work, because Spring launches the application through a different main class.
Below is the link which shows main class that should be used for launching the fat jar generated by spring boot maven plugin.
https://docs.spring.io/spring-boot/docs/current/reference/html/executable-jar.html#appendix.executable-jar.launching
At a high level, the MANIFEST.MF file inside the uber jar contains the entries below. Main-Class is the actual main class used to initialize the Spring-related machinery, after which your own main class, the Start-Class entry, is started.
Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: com.mycompany.project.MyApplication
So, in conclusion, specifying "org.springframework.boot.loader.JarLauncher" as the main class in spark-submit or SparkLauncher resolves this problem. This only works if you used the spring-boot-maven-plugin to package the jar.
2. External common libraries from pom.xml vs. the Spark installation
Another issue which might occur when launching an uber jar packaged with the spring-boot-maven-plugin via spark-submit or SparkLauncher is jar conflicts. When we package the jar with the spring-boot-maven-plugin, it copies the dependencies into the BOOT-INF/lib folder. Say you use the dependency below in pom.xml:
<dependency>
  <groupId>com.google.code.gson</groupId>
  <artifactId>gson</artifactId>
  <version>2.10</version>
</dependency>
Now suppose this dependency already exists, in a different version, in the Spark installation; in that case the classes will conflict. As this is a class-loader issue, it is better to avoid libraries that might conflict, since the application may fail because of them. I have faced this kind of issue with logger classes and JSON libraries, as there are multiple JSON/logger options available. As a resolution, you can exclude those classes or libraries, or replace the library with an alternative.
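As a sketch of the exclusion route, a conflicting transitive library can be kept out of BOOT-INF/lib like this (the com.example coordinates are illustrative; gson is the example from above):

```xml
<!-- Sketch: exclude a conflicting transitive dependency so it is not
     copied into BOOT-INF/lib; the version from the Spark installation
     is used at runtime instead. com.example coordinates are illustrative. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-library</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```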

How to add test-jar as aspectLibrary in Maven

I have an aspect that I want to use in my test classes. I don't want to add it to the main jar, as it would pull in test libraries like JUnit and Mockito. While there's a configuration setting to add an aspectLibrary, it always adds the main jar; there's no way to specify the test-jar.
My aspectj plugin looks like this:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>aspect.test</groupId>
        <artifactId>general</artifactId>
        <type>jar</type>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <phase>process-sources</phase>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I actually want to specify test-jar, but that doesn't seem possible. Without it, the type defaults to jar (obviously).
I also might have to configure the aspectj-maven-plugin for the compile and test-compile goals... but first I need to know how to specify the test-jar. Any suggestions are welcome.
Please read the Maven JAR Plugin documentation, chapter How to create a jar containing test classes. There are two options listed:
the easy way: using type "test-jar" (will not work here)
the preferred way: creating a normal JAR containing only test classes, then importing it with scope "test"
We will choose the preferred way because it solves your problem. So basically you do the following:
Create a separate module for test helper aspects/classes, put everything under src/main/java, not src/test/java. AspectJ Maven plugin should have an execution with <goal>compile</goal> for that module.
Add that module as a test-scoped dependency wherever you need the test aspects
Refer to the module as an <aspectLibrary> from the AspectJ Maven plugin, and be careful to use only <goal>test-compile</goal> in your plugin execution for that module, so as to avoid weaving the aspects into production code or getting errors because the dependency has test scope and is unavailable for normal compile.
Because I do not want to fully quote 3 POMs and several classes here, I have created a little GitHub sample project for you. Just clone it, inspect all the files and their respective locations and - be happy. ;-)
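A minimal sketch of the consuming module's setup under those steps (the aspect.test:test-aspects coordinates are assumed names for the helper module):

```xml
<!-- Sketch: depend on the helper module with test scope and weave it
     only into test code. Module coordinates are illustrative. -->
<dependency>
  <groupId>aspect.test</groupId>
  <artifactId>test-aspects</artifactId>
  <scope>test</scope>
</dependency>
...
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <configuration>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>aspect.test</groupId>
        <artifactId>test-aspects</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```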

Auto-configured tests as integration tests

I've got a spring boot application that I'd like to automatically test.
I've written a @DataJpaTest unit test like the one here, and it works fine; however, it slows down the build considerably because Spring has to start.
I'd like to run these tests as integration tests using the Maven Failsafe plugin, but I can't figure out how to do this. If I rename the tests so they match *IT.java, Failsafe tries to run them, but Spring doesn't start and I get java.lang.NoClassDefFoundError errors for the injected repositories.
What's the best way to run spring boot tests as integration tests?
Update 18 March:
With a dependency on spring-boot-starter-test:
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-test</artifactId>
</dependency>
And the following plugin configuration:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
</plugin>
The integration tests execute correctly using mvn failsafe:integration-test. However, I'd like the tests to be run when I do mvn install, so I updated the plugin config to:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
And now the spring framework doesn't start when I run the test with mvn verify, so I get java.lang.NoClassDefFoundError for the JpaRepository I'm trying to inject and test.
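One likely cause, hedged: the spring-boot-maven-plugin's repackage goal rewrites the jar during the package phase, so when Failsafe later builds its classpath from that jar it sees the repackaged BOOT-INF layout instead of plain classes, which produces exactly this kind of NoClassDefFoundError. A sketch of the workaround the Spring Boot documentation suggests, pointing Failsafe at the unpacked classes:

```xml
<!-- Sketch: make Failsafe use target/classes rather than the
     repackaged jar, so test classpath resolution works again. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <classesDirectory>${project.build.outputDirectory}</classesDirectory>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

An alternative is to give the repackage goal a classifier, so the original (non-repackaged) jar keeps the default name.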

Changing when maven runs integration tests

We have unit tests (Mockito) and integration tests (in-memory database).
We'd like Maven to not run the integration tests as part of mvn install.
Basically I think this means reconfiguring the lifecycle so that integration-test comes between install and deploy. Is this possible?
The reason is that the integration tests are somewhat time-consuming, and we don't want them to run every time a developer does an install. But we would like them to run before the project can be released, for example.
Check the docs for the plugin you use for running integration tests (possibly Failsafe): you can simply exclude the tests, or skip the plugin execution.
Does the integration-test phase just execute a single plugin (like Surefire)? If so, it is probably easier to bind the plugin execution to a different phase:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        ...
        <executions>
          <execution>
            <id>execution1</id>
            <phase>install</phase>
            <configuration>
              ...
            </configuration>
            <goals>
              <goal>test</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
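An alternative sketch that keeps integration-test in its normal place in the lifecycle but lets developers opt out locally, relying on Failsafe's standard skipITs property:

```xml
<!-- Sketch: bind Failsafe normally; `mvn install -DskipITs` skips the
     slow integration tests locally, while a release build runs plain
     `mvn verify` (or `mvn install`) with them enabled. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```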
