How to bundle my Maven-based Gatling load test into one JAR?

I created a Gatling load test using the highcharts archetype. I decided against just downloading the latest Gatling ZIP file and creating a simulation within the extracted folder since I rely on a number of dependencies in public and private Maven repositories.
I want to:
1. bundle my simulation and all its dependencies into a single JAR,
2. distribute the JAR to multiple load generators in EC2/GCE, and
3. start the test on all remote load generators.
Maven's assembly plugin looks like an obvious candidate to solve #1. So I added the following to my pom.xml:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>io.gatling.app.Gatling</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
With this configuration, running a JAR file created with mvn clean package assembly:single results in the following NoSuchFileException:
$ java -jar target/myapp-0.1-SNAPSHOT-jar-with-dependencies.jar
Exception in thread "main" java.nio.file.NoSuchFileException: ./target/test-classes
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newDirectoryStream(UnixFileSystemProvider.java:407)
at java.nio.file.Files.newDirectoryStream(Files.java:457)
at io.gatling.core.util.PathHelper$RichPath$.deepListAux$1(PathHelper.scala:99)
at io.gatling.core.util.PathHelper$RichPath$.deepList$extension(PathHelper.scala:105)
at io.gatling.core.util.PathHelper$RichPath$.deepFiles$extension(PathHelper.scala)
at io.gatling.app.classloader.SimulationClassLoader.simulationClasses(SimulationClassLoader.scala:55)
at io.gatling.app.Gatling.loadSimulations(Gatling.scala:92)
at io.gatling.app.Gatling.start(Gatling.scala:70)
at io.gatling.app.Gatling$.fromArgs(Gatling.scala:59)
at io.gatling.app.Gatling$.main(Gatling.scala:44)
at io.gatling.app.Gatling.main(Gatling.scala)
Is this how I should bundle up my Maven-based Gatling project?
Have I misconfigured Gatling's Maven plugin, or the way the JAR file is created?
Update 1:
Creating the target/test-classes directory gets around the NoSuchFileException. However, Gatling then doesn't find any of my simulations: none of the *.scala files were added to the JAR generated by the assembly plugin.
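My current theory: in the default Gatling Maven layout the simulations live under src/test/scala and compile into target/test-classes, which the jar-with-dependencies descriptor never looks at. I'm now experimenting with a custom assembly descriptor along these lines (an untested sketch; the file name and element choices are my own):
<!-- src/main/assembly/gatling-bundle.xml (hypothetical): also packs the compiled test classes -->
<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.0.0">
  <id>gatling-bundle</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <outputDirectory>/</outputDirectory>
      <unpack>true</unpack>
      <scope>test</scope>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <!-- compiled simulations land here, not in target/classes -->
    <fileSet>
      <directory>${project.build.testOutputDirectory}</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>
It would be referenced from the assembly plugin with <descriptors><descriptor>src/main/assembly/gatling-bundle.xml</descriptor></descriptors> in place of <descriptorRefs>.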

Related

Remove timestamp in maven tycho build

I have a multi-module Eclipse RCP application. We are building the application through Maven Tycho, and the build completes successfully.
In the build folder I have the usual plugins folder, which contains all the plugins (both JAR packaging and directory packaging) in the project.
The plugin names contain a timestamp. Is there any way to remove the timestamp from the plugins while building? Currently a plugin is named plugin.name_1.0.0.20200211.jar, but I want it to be plugin.name_1.0.0.jar.
Adding a format tag did the trick for me. A snippet of the POM file is below:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-packaging-plugin</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <format>''</format>
  </configuration>
</plugin>
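For context, that format value is a SimpleDateFormat pattern which Tycho substitutes for the literal qualifier in the bundle version, so an empty pattern drops the suffix entirely. A date pattern along these lines (illustrative; not necessarily your build's actual default) is what produces names like plugin.name_1.0.0.20200211.jar:
<configuration>
  <!-- hypothetical illustration: a date pattern yields a timestamped qualifier -->
  <format>yyyyMMdd</format>
</configuration>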

Intellij artifact tool doesn't create correct executable spark jar

I created a Spark Maven project in IntelliJ IDEA 2018 and tried to export an executable JAR file of my main class. When I submit it to a YARN cluster, it fails with The main class not found!, even though MANIFEST.MF includes it:
Manifest-Version: 1.0
Main-Class: Test
I did the same with other processing engines like Apache Flink, and IntelliJ could create an executable JAR file that runs successfully on the cluster.
So in the Spark case I always have to use the maven-assembly-plugin and export the JAR file with the command mvn clean compile assembly:single:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>Test</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
I guess it's because of the format of the Spark dependencies. I faced the same problem when creating a (non-executable) JAR file from a class of mine written against Spark dependencies. For example, adding the spark-sql dependency to a Maven project ends up pulling in other dependencies like spark-catalyst. Is there any way to export an executable Spark JAR file using IntelliJ IDEA?
The maven-shade-plugin can be an alternative option to create an uber JAR. Here is the detailed pom.xml.
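Since the pom.xml referenced above isn't reproduced here, the following is a minimal sketch of such a shade configuration, assuming the Test main class from the question (the plugin version and transformer choices are illustrative, not the answer's actual file):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- write the Main-Class entry into the shaded manifest -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>Test</mainClass>
          </transformer>
          <!-- merge META-INF/services files from Spark's many modules -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
        <filters>
          <!-- drop signature files that would invalidate the uber JAR -->
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
After mvn clean package, submit the JAR from target/; by default the shade plugin replaces the main artifact with the shaded version.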

Spark Maven and Jar Development Workflow with local and remote server

So I have a very basic question about how to most effectively work with a local Spark environment alongside a remote server deployment, and despite all the various pieces of info about this, I still don't find any of them very clear.
I have my IntelliJ environment, with the dependencies I need in my POM, so that I can compile, run, and test locally within IntelliJ. Then I want to test and run against a remote server by copying over my packaged JAR file via scp and running spark-submit there.
But I don't need any of the Maven dependencies inside the JAR, since spark-submit will just use the software on the server anyway; really I just need a JAR file with my classes, and keeping it lightweight for the scp would be best. Not sure if I'm misunderstanding this, but now I just need to figure out how to exclude dependencies from being added to the JAR during packaging. What is the right way to do that?
Update:
So I managed to create a JAR with and without dependencies using the configuration below, and I can just upload the one without dependencies to the server after building. But how can I build only the JAR without dependencies, rather than waiting for the larger JAR with everything, which I don't need anyway?
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.0.0</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Two things here.
The provided dependency scope will allow you to work locally and prevent any server-provided libraries from being packaged; see the snippet below.
Maven doesn't package external libraries unless you create an uber or shaded JAR.
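A minimal sketch of the first point; the Spark coordinates and version here are placeholders, so match whatever your cluster actually provides:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.4.0</version>
  <!-- provided: on the compile/test classpath locally, but left out of the
       packaged JAR because spark-submit supplies these classes on the server -->
  <scope>provided</scope>
</dependency>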
An example of a good Spark POM is provided by Databricks
Also worth mentioning: Maven can copy a local file to a remote server using SSH.
See the Maven Wagon SSH plugin.
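For that SSH copy, registering the Wagon SSH provider as a build extension (the version shown is just an example) enables scp:// URLs in distributionManagement:
<build>
  <extensions>
    <!-- adds scp/scpexe transport support to Maven deploys -->
    <extension>
      <groupId>org.apache.maven.wagon</groupId>
      <artifactId>wagon-ssh</artifactId>
      <version>3.4.3</version>
    </extension>
  </extensions>
</build>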

What is the best place for JavaDoc files in a Maven project using Tomcat?

I am regularly deploying a Maven project to a Tomcat server, using Travis CI. My project is a web app, so I have configured my pom.xml for building a WAR file, instead of a JAR:
...
<packaging>war</packaging>
...
With Maven, I can generate a directory containing all the JavaDoc files for my project; Maven puts them in the target/site/apidocs directory. But then, when I deploy my project, Travis doesn't perform any mvn site phase so I don't have my JavaDocs on the server.
Should I edit my pom.xml so that Maven puts the JavaDoc files somewhere in the src directory (instead of target), or is there a way to package the JavaDoc files together with the WAR file? I thought I could create a docs/ directory inside src/main/webapp/. Specifically: is it "good practice" to generate my JavaDoc in src instead of target? If not, how can I get a WAR file containing my JavaDoc?
What would you suggest is the best thing to do?
I already know how to generate a standalone JAR containing my JavaDoc files (see here), but this is not what I'm looking for.
Use the site plugin https://maven.apache.org/plugins/maven-site-plugin/ and the javadoc plugin https://maven.apache.org/plugins/maven-javadoc-plugin/usage.html.
Add the following to your pom.xml:
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <configuration>
        <links>
          <link>http://commons.apache.org/lang/api</link>
          <link>http://java.sun.com/j2se/1.5.0/docs/api</link>
          <link>http://this-one-will-not-work</link>
        </links>
      </configuration>
    </plugin>
  </plugins>
</reporting>
Then run mvn site:site; your documentation will be in target/site. You can also deploy it.
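If you do want the apidocs inside the WAR itself, one possible approach (a sketch, not something I've tested against your setup) is to generate them during prepare-package and add them as a web resource, which keeps generated files in target rather than src:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <executions>
    <execution>
      <!-- generate target/site/apidocs before the WAR is assembled -->
      <phase>prepare-package</phase>
      <goals>
        <goal>javadoc</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <webResources>
      <resource>
        <!-- copy the generated apidocs into the WAR under /docs -->
        <directory>${project.build.directory}/site/apidocs</directory>
        <targetPath>docs</targetPath>
      </resource>
    </webResources>
  </configuration>
</plugin>
Generating into src is generally considered bad practice: src should hold only hand-written, version-controlled files.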

How to include workspace artifacts in the assembly using Appassembler maven plug-in?

I'm using Eclipse m2e in my development environment, and I have a Spring Boot Maven project (which, in this context, can be viewed as a standard Maven JAR project with a runnable main class) that depends on another Maven project in the same workspace (a workspace artifact; let's call it moduleB, a sibling of the Spring Boot project). When I run the Maven goal clean package (the appassembler:assemble goal can be omitted because I configured the execution section of the plugin; see the configuration details below), the generated assembly in the target directory seems fine, except that the JAR of moduleB is missing from the repo. It seems that the plugin is trying to copy every file under the classes folder of moduleB, according to the log:
...
[INFO] Installing artifact ...
[INFO] Installing artifact /foo/bar/moduleB/target/classes to /foo/bar/repo/groupid/artifactid/0.0.1-SNAPSHOT/moduleB-0.0.1-SNAPSHOT.jar
[INFO] Installing ...
...
How do I resolve this? Do I have to install moduleB into the Maven local repository before running the assembly? Is there any way to bypass that step? I don't want to clutter the repository with unstable artifacts.
P.S. configuration of the plugin:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>appassembler-maven-plugin</artifactId>
  <version>1.10</version>
  <configuration>
    <configurationDirectory>conf</configurationDirectory>
    <configurationSourceDirectory>src/main/resources</configurationSourceDirectory>
    <copyConfigurationDirectory>true</copyConfigurationDirectory>
    <includeConfigurationDirectoryInClasspath>true</includeConfigurationDirectoryInClasspath>
    <assembleDirectory>${project.build.directory}/someApp</assembleDirectory>
    <extraJvmArguments>-Xms128m</extraJvmArguments>
    <logsDirectory>logs</logsDirectory>
    <repositoryLayout>default</repositoryLayout>
    <repositoryName>repo</repositoryName>
    <showConsoleWindow>true</showConsoleWindow>
    <platforms>
      <platform>windows</platform>
      <platform>unix</platform>
    </platforms>
    <binFileExtensions>
      <unix>.sh</unix>
    </binFileExtensions>
    <programs>
      <program>
        <mainClass>someClass</mainClass>
        <id>app</id>
        <platforms>
          <platform>windows</platform>
          <platform>unix</platform>
        </platforms>
      </program>
    </programs>
  </configuration>
  <executions>
    <execution>
      <id>assemble</id>
      <goals>
        <goal>assemble</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Update #1:
I bypassed the spring-boot:repackage goal because it encapsulates everything in one JAR, including the configuration files, which I want to be conveniently editable in the production environment. Here's the earlier question I asked: Alternatives to distribute spring-boot application using maven (other than spring-boot:repackage)
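One workaround worth trying (an assumption on my part, not a verified fix): run the packaging from the command line across the whole reactor, so moduleB's artifact is its packaged JAR rather than m2e's target/classes folder. Here spring-boot-app is a placeholder for the Spring Boot module's artifactId:
# -pl selects the app module, -am (also-make) builds its workspace siblings first
mvn -pl spring-boot-app -am clean package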
