I am trying to figure out the best way to set up a Spring Boot application so that it has its own jar dependencies, but additional jars are added to the classpath at runtime when it is run with the java -jar command. Which approach makes more sense:
Use the original jar (without dependencies added to it), place all jars (application and runtime) in a folder on the file system, and use PropertiesLauncher with loader.path pointing to that jars folder.
Use the fat jar (with application jars), place the additional jars on the filesystem, and somehow include those as additional jars on the classpath. Not sure how this can be done.
Is there another, better way to do this?
The PropertiesLauncher was designed to work with fat jars, so you should be able to keep the fat jar and add as many additional dependencies as you like in an external location, e.g. with loader.path=/opt/app/lib:lib. I guess that's your option 2? If it doesn't work, we can discuss it in a GitHub issue.
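For example, a minimal sketch of such a run command (assuming the jar was repackaged with the ZIP layout so that PropertiesLauncher is used, and that the external jars live in /opt/app/lib; the jar name is a placeholder):
java -Dloader.path=/opt/app/lib -jar myapp.jar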
I resolved this issue using the following spring-boot-maven-plugin configuration. First I built my uber jar without any excluded artifacts in order to populate my external "lib" directory, then I added the exclusions back and packaged my uber jar with my application-specific dependencies only.
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>1.3.1.RELEASE</version>
<configuration>
<layout>ZIP</layout>
<executable>true</executable>
<excludeArtifactIds>
<!-- My libs which will be packaged with my Uber jar-->
<!-- core,data-feeder,engine,lightspeed-tcp-api,order-manager,store,strategies,utils,viewer -->
<!-- Other libs -->
antlr,aopalliance,aspectjrt,aspectjweaver,classmate,commons-lang,
dom4j,h2,hibernate-commons-annotations,hibernate-core,hibernate-entitymanager,
hibernate-jpa-2.1-api,hibernate-validator,jackson-annotations,jackson-core,jackson-databind,
jandex,javassist,javax.transaction-api,jboss-logging,jboss-logging-annotations,jcl-over-slf4j,
jul-to-slf4j,log4j-over-slf4j,logback-classic,logback-core,mysql-connector-java,slf4j-api,
snakeyaml,spring-aop,spring-aspects,spring-beans,spring-boot,spring-boot-autoconfigure,
spring-boot-starter,spring-boot-starter-aop,spring-boot-starter-data-jpa,spring-boot-starter-jdbc,
spring-boot-starter-logging,spring-boot-starter-tomcat,spring-boot-starter-web,
spring-boot-starter-websocket,spring-context,spring-core,spring-data-commons,spring-data-jpa,
spring-expression,spring-jdbc,spring-messaging,spring-orm,spring-tx,spring-web,spring-webmvc,
spring-websocket,tomcat-embed-core,tomcat-embed-el,tomcat-embed-logging-juli,tomcat-embed-websocket,
tomcat-jdbc,tomcat-juli,validation-api,xml-apis
</excludeArtifactIds>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
Then, I added the following property to my "application.properties" (which is inside my jar's "resources/" dir) to point Spring's PropertiesLauncher at my "lib" dir, which I placed next to my jar in the same directory.
loader.path=lib/
Finally, I ran my jar using the following command:
java -jar back-tester-0.0.1-beta-01.jar
Also, you can pass the "loader.path" property on the command line instead of putting it in your "application.properties", as in the following command. However, this didn't work for me because I packaged my jar as an executable one which I run as a Linux service.
java -Dloader.path="lib/" -jar back-tester-0.0.1-beta-01.jar
Now, I have successfully reduced my jar size from 29 MB to a 1 MB jar which contains only my application-specific libs, and it works out of the box.
Thank you @Ashraf Sarhan, you saved me two days :)
I added this in my pom file:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<layout>ZIP</layout>
<executable>true</executable>
<mainClass>vn.com.Mymainclass</mainClass>
<excludes>
<exclude>
<groupId>com.vn.groupId</groupId>
<artifactId>excluded-id-a</artifactId>
</exclude>
<exclude>
<groupId>com.vn.groupId</groupId>
<artifactId>excluded-id-b</artifactId>
</exclude>
</excludes>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
Then I placed a ./lib folder containing the two jars excluded above next to the my-main-spring-boot-app.jar file, and ran:
java -Dloader.path="lib/" -jar my-main-spring-boot-app.jar
It worked perfectly.
Related
I have a project which consists of three parts:
Spring Boot application
Spark Application
"Library" used by both of the above (having this library as separate JAR or similar causes quiet a bit of overhead and slowed down the development)
So what I want is a JAR that can be used to run the Spring Boot app (java -jar myapp.jar) as well as the Spark app (java -cp myapp.jar path.to.main.class).
It is also OK to have two JARs - but both would need to be fat JARs (meaning: include dependencies).
What I tried in the pom.xml is this:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
This creates (as expected) a fat JAR that can be used to run the Spring Boot app. But it cannot be used for the Spark app (as the classes and dependencies are somehow repackaged as I understand).
My second try was this:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
<configuration>
<classifier>exec</classifier>
</configuration>
</execution>
</executions>
</plugin>
This also creates the fat JAR, plus another JAR holding just the classes implemented in my project, but without the dependencies. Therefore the Spark job does not start (as expected).
Any idea how to solve this situation?
Thanks!
I used the same technology stack for an application (Spring for the web part and Apache Spark for the big data processing). I don't see a case where someone would want a single fat jar for both sides, Spring + Spark (except the case where you use something from Spring inside your Spark jobs). So, the approach we use is to have two separate Maven modules, one for the Spring web part and one for Apache Spark. For the Spring Boot module we did not use the spring-boot-maven-plugin; instead we used the following Maven plugins, something like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<configuration>
<mainClass>com.Application</mainClass>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<excludeArtifactIds>integration</excludeArtifactIds>
<outputDirectory>${project.build.directory}/lib/</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>com.Application</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
This way we have better control over all the dependencies (e.g. put them in a lib folder and reference them from the MANIFEST).
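With that setup the dependencies are copied to target/lib and the manifest's Class-Path points at them, so the module can be started like a plain jar, for example (the jar name below is only a placeholder for whatever your artifactId produces):
java -jar target/my-web-module.jar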
For the Spark application you have two options:
run it with spark-submit (personally I don't prefer it)
use the SparkLauncher class from the spark-launcher dependency (to call a Spark job from the web application); see the sketch below.
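For the second option, a minimal sketch of starting a job with SparkLauncher (the jar path, main class and master URL below are placeholders):
import org.apache.spark.launcher.SparkLauncher;

public class SparkJobStarter {
    public static void main(String[] args) throws Exception {
        // Point the launcher at the separately built Spark fat jar and its main class
        Process spark = new SparkLauncher()
                .setAppResource("/opt/app/spark-module-shaded.jar") // placeholder path
                .setMainClass("com.example.spark.MySparkJob")       // placeholder class
                .setMaster("local[*]")                               // or your cluster master URL
                .launch();
        int exitCode = spark.waitFor(); // block until the job finishes
        System.out.println("Spark job finished with exit code " + exitCode);
    }
}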
Building a fat jar for the Spark application with only the dependencies actually used in the Spark code is desirable, because you load only what you truly need. We can use the maven-shade-plugin for this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<includes>
<!-- put here what you need to include, e.g. <include>groupId:artifactId</include> -->
</includes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
Using the maven-shade-plugin is not encouraged unless there is no other option. Below is an informative link on why not to use the shade plugin:
Downsides of using Shade plugin relocation feature
From my perspective, for running a Spark application the spring-boot-maven-plugin is the best option for managing everything, including library dependencies, even if your application runs as a scheduled job where you need spark-submit or the Spark launcher to start it.
In the other case, where your Java app uses both Spark and Spring and also exposes a controller/API, the spring-boot-maven-plugin is again the best choice.
There are only two kinds of challenges when we use spark-submit or the Spark launcher to launch an application packaged with the spring-boot-maven-plugin.
1. Main Class
When we package a fat/uber jar using the spring-boot-maven-plugin, it packages the class files and Java libraries the Spring way, not the way Spark expects. Inside the uber jar generated by the plugin we have BOOT-INF, META-INF and org folders. So when we pass our main class to spark-submit or the Spark launcher as a parameter, it will not be able to find that class, because the package/path given in the parameter no longer matches the structure of the jar. Even if you specify the correct location starting with BOOT-INF, it will not work, because Spring launches the application through a different main class.
Below is the link which shows the main class that should be used for launching the fat jar generated by the spring-boot-maven-plugin:
https://docs.spring.io/spring-boot/docs/current/reference/html/executable-jar.html#appendix.executable-jar.launching
At a high level, the MANIFEST.MF file inside the uber jar contains the entries below. Main-Class is the actual main class used to initialize the Spring-related machinery; it then starts your own main class, which is given by the Start-Class entry.
Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: com.mycompany.project.MyApplication
In conclusion, specifying the main class as "org.springframework.boot.loader.JarLauncher" in spark-submit or the Spark launcher resolves this problem. This will only work if you are using the spring-boot-maven-plugin to package the jar.
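For example, a hedged sketch of such a spark-submit invocation (the jar name and master are placeholders):
spark-submit --class org.springframework.boot.loader.JarLauncher --master local[*] my-spring-boot-spark-app.jar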
2. External common libraries used in pom.xml + the Spark installation
Another issue that might occur when using spark-submit or the launcher application to launch an uber jar packaged with the spring-boot-maven-plugin is jar conflicts. When we package the jar with the spring-boot-maven-plugin, it copies the dependencies into the BOOT-INF/lib folder. Let's say you are using the dependency below in pom.xml:
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.10</version>
</dependency>
Now let's say this dependency already exists with a different version in the Spark installation; in that case it will cause class conflicts. Since this is a class-loader issue, it's better not to use libraries that might conflict, because the application might fail. I have faced this kind of issue with logger classes and JSON libraries, as there are multiple JSON/logger options available. As a resolution you can exclude those classes or libraries, or replace the library with an alternative one.
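For instance, a minimal sketch of excluding a conflicting transitive library in pom.xml (the com.example:some-client dependency and the slf4j-log4j12 binding are only illustrative; substitute whatever actually clashes with your Spark installation):
<dependency>
<groupId>com.example</groupId>
<artifactId>some-client</artifactId>
<version>1.0</version>
<exclusions>
<!-- drop the transitive logging binding that clashes with the one shipped in the Spark installation -->
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>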
Is there a way to exclude files only when calling mvn deploy but have the files included when I call mvn install?
EDIT:
When I run the jar locally I want the logback.xml from src/main/resources, but when I deploy it as a library the logback.xml should not be included.
It is not the "Maven Way" to have an artifact with different content depending on where it's stored. Maven expects artifact-1.0.jar to be exactly the same in the remote repository and any local repositories.
You could have the project create a classified jar alongside the real jar. The classified jar would include the logback.xml.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<!-- default-jar is the ID assigned to the jar:jar execution included automatically by
Maven. -->
<execution>
<id>default-jar</id>
<configuration>
<!-- not exactly sure of the exact syntax for excludes in the jar plugin -->
<excludes>
<exclude>logback.xml</exclude>
</excludes>
</configuration>
</execution>
<execution>
<id>jar-with-logging</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<classifier>logging</classifier> <!-- or whatever -->
</configuration>
</execution>
</executions>
</plugin>
This will create two artifacts, artifact-1.0.jar and artifact-1.0-logging.jar. Both artifacts will end up in both repositories. If you don't want the logging version to be attached (Maven terminology for published to repos), investigate using the maven-assembly-plugin which can create packages in various formats without attaching them.
You could also move the logback.xml into a separate project, package it separately, and add it to the classpath only when you run the jar from the local script.
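For example (a sketch with placeholder artifact, path and class names), the local run script could add the extra jar to the classpath like this:
java -cp "artifact-1.0.jar:logging-config-1.0.jar" com.example.Main
(On Windows the classpath separator is ';' instead of ':'.)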
I have a Maven project which needs to copy webapp/WEB-INF/ resources from another Maven project that is packaged as a war.
How do I do it?
Please suggest.
As Bittrance said, you should use the maven dependency plugin.
The better way is to create a project that includes all your shared resources, probably of type zip, built with the assembly plugin. This is the proper "Maven way". It's a better solution than unpacking a war.
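A minimal sketch of an assembly descriptor for such a shared-resources zip (the descriptor location and directory paths are illustrative):
<!-- src/main/assembly/resources.xml -->
<assembly>
<id>resources</id>
<formats>
<format>zip</format>
</formats>
<includeBaseDirectory>false</includeBaseDirectory>
<fileSets>
<fileSet>
<!-- pick up the shared WEB-INF resources from this module -->
<directory>src/main/webapp/WEB-INF</directory>
<outputDirectory>WEB-INF</outputDirectory>
</fileSet>
</fileSets>
</assembly>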
Then, refer to it as a dependency:
<dependency>
<groupId>com.mygroup</groupId>
<artifactId>my-dependencies</artifactId>
<version>1.0.0</version>
<type>zip</type>
</dependency>
Next, you use the maven dependency plugin to unpack your resources, in the directory of your choice (probably WEB-INF/ ?)
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>unpack-cfg-test-resources</id>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<phase>generate-resources</phase>
<configuration>
<outputDirectory>${project.build.directory}/WEB-INF/</outputDirectory>
<includeArtifactIds>my-resources</includeArtifactIds>
<excludeTypes>pom</excludeTypes>
<excludeTransitive>true</excludeTransitive>
</configuration>
</execution>
</executions>
</plugin>
I'm not really sure about this code snippet (it was written for another purpose), but it gives you an example.
For more information, please follow this link : http://maven.apache.org/plugins/maven-dependency-plugin/
If you can't share a common project containing your files, you can unpack the war including only the ftl files (or whatever you want), but it's not a really clean solution ;)
There are a lot of posts that deal with this subject:
Unzip dependency in maven
...
Just try with the keywords maven-dependency-plugin, unpack :)
Hope that will help you.
I can see some alternatives:
Use external references in your version control system to point all repos to the same files.
The Maven Dependency module can copy and unpack project dependencies. From there, you can use the Maven Assembly plugin (or Ant targets) to include parts of that dependency in your own installation.
At least for the FTL files, perhaps you could package them in a separate Jar file and then load them as resources through the class loader.
If the resources are filtered, you may run into problems with solution 1 if you want the filtered version, and with solutions 2 and 3 if you want the source version.
Hope this helps.
(This assumes your dependent project is Java (jar) and not another web app; if it is a webapp I think the solution is similar.)
I suggest a (slightly) different approach:
Instead of reading resources from the war, add this to your war pom to generate a jar artifact as well as the war:
<!-- maven war plugin config -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<configuration>
...
<attachClasses>true</attachClasses>
<classesClassifier>some-string</classesClassifier>
</configuration>
<artifactId>maven-war-plugin</artifactId>
<version>3.0.0</version>
</plugin>
...
<resources>
<!-- This is for inclusion in the jar, so dependent module can load it -->
<resource>
<targetPath>some-path</targetPath>
<directory>src/main/webapp/path...</directory>
<includes>
<include>your-resource</include>
</includes>
</resource>
</resources>
And this to your consuming pom, so the generated jar will be loaded:
<dependency>
<groupId>com.company</groupId>
<artifactId>...</artifactId>
<classifier>some-string</classifier>
</dependency>
Then you will be able to load the resources the usual way (getResourceAsStream("some-path/your-resource"))
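For example, a minimal sketch of loading it through the class loader ("some-path/your-resource" must match the <targetPath> and <include> configured in the war module above):
import java.io.InputStream;

public class ResourceLoadingExample {
    public static void main(String[] args) throws Exception {
        // The resource is on the classpath because the attached classes jar is a dependency
        try (InputStream in = ResourceLoadingExample.class.getClassLoader()
                .getResourceAsStream("some-path/your-resource")) {
            if (in == null) {
                throw new IllegalStateException("Resource not found on the classpath");
            }
            System.out.println("Loaded resource, " + in.available() + " bytes available");
        }
    }
}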
I have a maven build configuration where I do the following steps
1) Compile and build the jar file (ABC.jar) using the maven assembly plugin
2) Run proguard using maven-proguard plugin to shrink and obfuscate the jar file to get a resultant file as ABC-small.jar
3) Run the maven jarsigner plugin to sign the final jar ABC-small.jar
The problem is that the jarsigner plugin always picks the initial ABC.jar file generated from maven-assembly instead of ABC-small.jar generated from maven-proguard plugin.
How do I tell jarsigner plugin to pick the ABC-small.jar ?
Here is my maven-jarsigner config in pom file
<plugin>
<artifactId>maven-jarsigner-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>sign</id>
<goals>
<goal>sign</goal>
</goals>
</execution>
</executions>
<configuration>
<!-- <storetype>pkcs12</storetype> -->
<keystore>cert\keystore</keystore>
<alias>applet</alias>
<storepass>applet</storepass>
<keypass>applet</keypass>
</configuration>
</plugin>
The plugin docs say that the basic configuration signs the project jars and any attached jars. Is ABC-small.jar attached to the project? If it is not, try including the <archive> element in your plugin configuration; its value should be the path to ABC-small.jar:
<configuration>
<archive>${project.build.directory}/ABC-small.jar</archive>
<keystore>cert\keystore</keystore>
<alias>applet</alias>
<storepass>applet</storepass>
<keypass>applet</keypass>
</configuration>
In your configuration you have to mention the jar file path in the <archive> tag.
Adding to what @user944849 explained: if you want to generate a keystore for your applet, use the keytool-maven-plugin.
I am trying to come up with a plugin to detect and process Java EE application clients.
I created a new packaging type called 'car' through META-INF/plexus/components.xml (http://maven-car-plugin.googlecode.com/svn/trunk/maven-car-plugin/src/main/resources/META-INF/plexus/components.xml) and a corresponding mojo for Java EE app clients. I have pretty much followed the same steps as the maven-ejb-plugin.
The behaviour I want is the same as the maven-ejb-plugin: it defines an ejb packaging type, but the artifact gets installed in the repo as a .jar and gets bundled in the ear as a .jar too.
I believe this must be configurable somehow, because the ejb packaging type gets installed as a .jar while the war packaging type produces a .war.
The problem in my case is that a .car file gets installed in the repo and a .car file gets bundled in the ear.
Does anyone know how to make sure it gets installed in the repo as a .jar file?
I ran into the same issue you have, except I'm building a .war file and wanted a .jar file installed into my local repo. What I did was use the maven-jar-plugin to create a jar file in addition to the war file; it's generated in my /target directory. I also used the maven-install-plugin to install the resulting jar into my local repo.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>make-jar</id>
<phase>compile</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>install-jar</id>
<phase>install</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<packaging>jar</packaging>
<artifactId>${project.artifactId}</artifactId>
<groupId>${project.groupId}</groupId>
<version>${project.version}</version>
<file>${project.build.directory}/${project.artifactId}.jar</file>
</configuration>
</execution>
</executions>
</plugin>
Perhaps you could try using the packaging parameter in maven install plugin to see if that helps in your case?
I would assume you would have to specify
<packaging>jar</packaging>
as well in the component descriptor. Otherwise it looks correct to me.
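For reference, a hedged sketch of the artifact handler part of a components.xml for a custom 'car' packaging; the <extension>jar</extension> entry is what makes the artifact get installed and bundled with a .jar suffix. This mirrors how the built-in ejb handler is defined, but treat the exact element names as an assumption to verify against your Maven/Plexus version:
<component>
<role>org.apache.maven.artifact.handler.ArtifactHandler</role>
<role-hint>car</role-hint>
<implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
<configuration>
<type>car</type>
<!-- extension controls the file suffix used in the local/remote repository and inside the ear -->
<extension>jar</extension>
<packaging>car</packaging>
<language>java</language>
<addedToClasspath>true</addedToClasspath>
</configuration>
</component>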