I have a simple spring boot project that has a maven structure like this:
project
- backend (spring boot)
- frontend (angular)
I got the project configured so that *IT.java classes run during the Maven integration-test phase, and they successfully test the Spring Boot REST API against an H2 database.
I'm now trying to configure the project so that my Cucumber tests, which interact with the browser, also run during the integration-test phase. The Cucumber tests start up OK but fail for the simple reason that the Angular files don't get served by the Spring Boot application when it starts.
In backend/src/main/resources/static I have a simple index.html file. When the Cucumber tests run they open a browser and I see the content of that file.
In the backend module (war) pom I copy the dist content from the angular build...
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>${spring.boot.version}</version>
<executions>
<execution>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>some.package.Boot</mainClass>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>${maven.war.plugin.version}</version>
<configuration>
<warName>${root.context}</warName>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
</plugins>
<resources>
<resource>
<directory>../frontend/<appname>/dist/frontend</directory>
<!--<targetPath>src/main/resources/static</targetPath>-->
<targetPath>static</targetPath>
</resource>
</resources>
</build>
In the resultant war file the angular files are packaged up into WEB-INF/classes/static/.
You'll note from above I also tried getting the angular code copied into resources/static, but that puts the files in target/classes/src/main/resources/static, so that isn't the right approach!
Yet the cucumber tests still only see the content of the index.html file from src/main/resources/static when they run.
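For completeness, I also considered declaring both resource directories explicitly, so that the default src/main/resources is still processed alongside the Angular dist (a sketch only; I have not confirmed it changes what the tests see):
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
<resource>
<directory>../frontend/<appname>/dist/frontend</directory>
<targetPath>static</targetPath>
</resource>
</resources>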
Potentially we could move the angular app into the same src tree as the java/webapp code, but that's not an option.
How can I persuade Spring Boot to use the war in target rather than doing what it appears to be doing, which is serving content from source?
Versions:
Java 11
Maven war plugin 3.2.2
Spring Boot version defined in parent pom as 2.1.3.RELEASE
spring-boot-starter-web
spring-boot-starter-data-rest
spring-boot-starter-data-jpa
spring-boot-starter-data-jdbc
spring-boot-starter-tomcat (not sure I need this one)
Related
We have multiple Boot components, as well as a different schema for each component. We have a business requirement where we need to deliver everything as a war, deployed as a single application on different machines.
I am trying to go with a plugin architecture, so that one (new) aggregator microservice is prepared to which every Boot component can be added as a plugin, and the whole thing is then used as a bundled (war) deployable.
By enabling a component with the Maven plugin we have created two jars: an executable jar and the original jar. But how do I add them to the new aggregator component as plugins and access the REST layer of each component (plugin)?
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.3.4.RELEASE</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<classifier>exec</classifier>
</configuration>
</plugin>
</plugins>
Do I need to change the approach? How do I add a Spring Boot jar as a plugin to another project? Any references or sample structure/code would be helpful. Thanks in advance.
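What I have in mind (not verified, and the coordinates below are placeholders) is that, since the exec classifier leaves the original jar as each module's main artifact, the aggregator could declare every Boot component as a normal dependency, something like:
<dependency>
<groupId>com.mycompany</groupId>
<artifactId>boot-component-one</artifactId>
<version>${project.version}</version>
</dependency>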
I see on the web some images of Spring Boot Admin showing the app version on the wallboard page.
I'm using the latest version of SBA, currently 2.1.6, and I can't see the versions in the wallboard.
Reading the documentation it seems that a maven plugin is needed:
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
I added it to the pom.xml of a microservice and restarted all Docker swarm stacks (including SBA), but nothing changed.
I did some searching but can't find any reference.
The 'spring-boot-maven-plugin' is required to generate the build-info in
/target/classes/META-INF/build-info.properties
Spring Boot Admin picks up the build info including the application version from this file. Please check if this file is generated.
You need to execute the maven plugin first or just run
mvn clean install
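To give an idea of what to look for, a generated build-info.properties typically contains entries along these lines (the values here are placeholders):
build.artifact=my-service
build.group=com.example
build.name=my-service
build.time=2019-09-01T10:15:30.123Z
build.version=1.0.0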
For Spring Boot applications
The easiest way 😄 to show the version is to use the build-info goal from the spring-boot-maven-plugin, which generates the META-INF/build-info.properties.
1) Change/add the plugin in the pom.xml as below👇
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
2) Delete 🚮 your target folder and do a mvn clean install🧹
3) Restart your app and check the version is there 👏
I did so and it worked.
Src.➡Show Version in Application List
Dirty fix
If the previous solution does not work...
You can read the properties from the META-INF (in the jar) and concatenate it to the app name (here: myApp-service).
1) Do the previous step 👆 (add goal in maven plugin)
2) Add in the properties:
spring.config.import=classpath:META-INF/build-info.properties
spring.application.name=myApp-service ${build.version}
3) Check the result.
Src.➡spring-boot-maven-plugin build-info.properties
While creating a Maven project using archetype 13, I faced Sling IDE issues as below for the core and test pom.xml files.
This is the error that I get:
"Missing m2e incremental build support for generating the bundle manifest, component descriptions and metatype resources.
Please use the provided Quick Fixes on this issue to resolve this. pom.xml /AEMEditable.core line 1 Bundle Project Not Supporting M2E"
Note: Figured out how to solve this.
We need to manually modify the pom.xml for core and test, under the maven-bundle-plugin.
For both the core and test pom.xml, add the following code:
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<executions>
<!-- Configure extra execution of 'manifest' in process-classes phase
to make sure SCR metadata is generated before unit test runs -->
<execution>
<id>scr-metadata</id>
<goals>
<goal>manifest</goal>
</goals>
<configuration>
<supportIncrementalBuild>true</supportIncrementalBuild>
</configuration>
</execution>
</executions>
<configuration>
<instructions>
<!-- Import any version of javax.inject, to allow running on multiple versions of AEM -->
<Import-Package>javax.inject;version=0.0.0,*</Import-Package>
<Sling-Model-Packages>
AEMEditable.core
</Sling-Model-Packages>
<!-- Enable processing of OSGI DS component annotations -->
<_dsannotations>*</_dsannotations>
<!-- Enable processing of OSGI metatype annotations -->
<_metatypeannotations>*</_metatypeannotations>
</instructions>
<exportScr>true</exportScr>
</configuration>
</plugin>
Once this is done, select the whole project in Eclipse, right-click, and choose Maven --> Update Project.
I have the following project structure :
parent pom
|
- Core module - pom
- Dist - pom
- Integration test - pom
The Spring Boot main class is in the core module. Integration tests will go into the integration-test module.
I am trying to use spring-boot-maven-plugin for integration testing.
I have done the following POM configuration for this in the integration-test module. The start-class property is set in the parent pom:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<mainClass>${start-class}</mainClass>
<layout>JAR</layout>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
<execution>
<id>pre-integration-test</id>
<goals>
<goal>start</goal>
</goals>
</execution>
<execution>
<id>post-integration-test</id>
<goals>
<goal>stop</goal>
</goals>
</execution>
</executions>
</plugin>
However, I get the following error on mvn clean install on the parent pom:
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:1.5.10.RELEASE:start (pre-integration-test) on project: Spring application did not start before the configured timeout (30000ms) -> [Help 1]
[ERROR]
Questions:
1) Is this project structure the best way to organize the integration tests?
2) How and where do I give the path of the main class to the spring-boot-maven-plugin correctly? It doesn't work when I give it in the integration module or the parent pom, even after giving the kind of relative path you would normally use (e.g. ../core-module/).
3) maven-failsafe-plugin and Spring annotations on the test class itself seem sufficient to run integration tests. So, first of all, is there any relation between integration tests and the spring-boot-maven-plugin? What is the spring-boot-maven-plugin needed for apart from packaging into an executable file? I have this confusion because it offers goals like start and stop bound to pre-integration-test and post-integration-test, but I fail to understand why they are needed (failsafe sketch below).
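For context, by "maven-failsafe-plugin and Spring annotations" I mean roughly the following setup (a sketch; the version is illustrative, not taken from my project):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.22.0</version>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>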
PS: I have checked a few questions with a similar problem, but none answer the questions I have asked for the given project structure.
I just ran into the same error as you (repackage failed: Unable to find main class) while trying to generalize some Spring code into a common Spring Boot project.
It turns out that this is not possible, as a Spring Boot project needs a class with a main method.
I solved my problem by using direct dependencies on the Spring Framework modules that I need; perhaps you can do that in your integration test project.
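For illustration only, "direct dependencies" means declaring the Spring modules you actually use rather than relying on the Boot packaging; the artifact ids below are just examples, and the versions are assumed to come from your parent's dependency management:
<dependencies>
<!-- examples only: pick the Spring modules your tests really need -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>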
However, I don't know what kind of tests you are trying to run in the separate project, but tests concerning the core module should be in the core module itself, using Spring Test, not in a separate project (use mocks where needed).
Hope this helps.
I have a project which consists of three parts:
Spring Boot application
Spark Application
"Library" used by both of the above (having this library as separate JAR or similar causes quiet a bit of overhead and slowed down the development)
So what I want is a JAR that can be used to run the Spring Boot app (java -jar myapp.jar) as well as the Spark app (java -cp myapp.jar path.to.main.class).
It is also OK to have two JARs - but both would need to be fat JARs (meaning: include dependencies).
What I tried in the pom.xml is this:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
This creates (as expected) a fat JAR that can be used to run the Spring Boot app. But it cannot be used for the Spark app (as the classes and dependencies are somehow repackaged as I understand).
My second try was this:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
<configuration>
<classifier>exec</classifier>
</configuration>
</execution>
</executions>
</plugin>
This creates the fat JAR as well, plus another JAR holding just the classes implemented in my project - but without the dependencies. Therefore the Spark job does not start (as expected).
Any idea how to solve this situation?
Thanks!
I used the same technology stack for an application (Spring for the web part and Apache Spark for the big data processing). I don't see a case where someone wants to build one fat jar for both sides, Spring + Spark (except where the Spark jobs themselves use something from Spring). So the approach we use is to have two separate Maven modules, one for the Spring web part and one for Apache Spark. For Spring Boot we did not use the spring-boot-maven-plugin; instead we used the following Maven plugins, something like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<configuration>
<mainClass>com.Application</mainClass>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<excludeArtifactIds>integration</excludeArtifactIds>
<outputDirectory>${project.build.directory}/lib/</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>com.Application</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
Like this we have better control over all the dependencies (e.g. put them in a lib folder and reference them from the MANIFEST).
For the Spark application you have two options:
run with spark-submit (personally I don't prefer it)
use the SparkLauncher class from the spark_launcher*.jar dependency to call a Spark job from the web application (dependency sketched below).
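If it helps, the dependency for the second option looks something like this (the Scala suffix and version property are assumptions; adjust them to your Spark build):
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-launcher_2.11</artifactId>
<version>${spark.version}</version>
</dependency>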
Building a fat jar for the Spark application with only the dependencies used in the Spark code is desirable, because you load only what you truly need. We can use the maven-shade-plugin for this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<includes>
<!-- put here what you need to include -->
</includes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
Using the maven-shade-plugin is not encouraged unless there is no other option. Below is an informative link on why not to use the shade plugin.
Downsides of using Shade plugin relocation feature
From my perspective, for running a Spark application the spring-boot-maven-plugin is the best option for managing everything, including library dependencies, even if your application runs as a scheduled job where you need spark-submit or the Spark launcher to start it.
In the other case, where your Java app uses both Spark and Spring and has a controller/API for using the application, the spring-boot-maven-plugin is also the best choice.
There are only two kinds of challenges when we use spark-submit or the Spark launcher to launch an application that was packaged with the spring-boot-maven-plugin.
1. Main Class
When we package a fat/uber jar using the spring-boot-maven-plugin, it packages the class files and Java libraries the Spring way, not the way Spark expects. Inside the uber jar generated by the spring-boot-maven-plugin we have BOOT-INF, META-INF and org folders. So when we pass the main class to spark-submit or the Spark launcher as a parameter, it will not be able to find that class, because the package/path given in the parameter no longer matches the structure of the jar. Even if you specify the correct location starting with BOOT-INF for the main class, it will not work, because Spring launches the application through a different main class.
Below is the link showing the main class that should be used for launching the fat jar generated by the spring-boot-maven-plugin.
https://docs.spring.io/spring-boot/docs/current/reference/html/executable-jar.html#appendix.executable-jar.launching
At a high level, the MANIFEST.MF file inside the uber jar contains the entries below, where Main-Class is the actual main class used to initialize the Spring-related machinery, after which your own main class, given by the Start-Class entry, is started.
Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: com.mycompany.project.MyApplication
So, in conclusion, specifying the main class as "org.springframework.boot.loader.JarLauncher" in spark-submit or the Spark launcher resolves this problem. This will only work if you are using the spring-boot-maven-plugin to package the jar.
2. External common libraries used in the pom.xml + the Spark installation
Another issue that might occur when using spark-submit or the launcher application to launch an uber jar packaged with the spring-boot-maven-plugin is jar conflicts. When we package the jar with the spring-boot-maven-plugin, it copies the dependencies into the BOOT-INF/lib folder. Let's say you are using the dependency below in your pom.xml:
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.10</version>
</dependency>
Now let's say this dependency already exists with a different version in the Spark installation; in that case you get conflicts between classes. As this is a class-loader issue, it is better not to use libraries that might create conflicts, otherwise the application might fail. I have faced this kind of issue with logger classes and JSON libraries, since multiple JSON/logger options are available. As a resolution you can exclude those classes or libraries, or replace the library with an alternative one (a sketch of the exclusion approach follows).
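As one example of the exclusion route, the spring-boot-maven-plugin can keep a conflicting library out of BOOT-INF/lib; gson is used here only because it is the example above:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<excludes>
<exclude>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
</exclude>
</excludes>
</configuration>
</plugin>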