I'm working on a multi-module Maven project which looks like this:
Parent-project (pom)
+- Module1 (executable-jar)
+- Module2 (executable-jar)
+- Module3 (jar)
+- ...
+- Distribution (pom)
The Distribution module lists dependencies on Module1, Module2, and Module3. I want the Distribution module to test the project and package it up. I'm using this module for testing since the distribution already contains all the necessary configuration files. If I were to start the tests manually from the command line it would look something like this:
$ # Pre-integration-test:
$ java -Djava.rmi.server.codebase=file:module3.jar -jar module1.jar
$ java -Djava.rmi.server.codebase=file:module3.jar -jar module2.jar
I've looked at the exec-maven-plugin for binding these calls to the pre-integration-test phase, but the plugin only puts the JARs on the classpath instead of the directory holding the JAR itself. I think I would benefit most from a plugin that can easily execute JARs produced by a project's dependencies. That way, I could do something like this in the Distribution POM:
...
<plugin>
  <artifactId>some-magic-plugin</artifactId>
  <executions><execution>
    <phase>pre-integration-test</phase>
    <goals><goal>exec-dependency</goal></goals>
    <configuration>
      <artifactId>module1</artifactId>
      <vmArguments>...</vmArguments>
      <arguments>...</arguments>
    </configuration>
  </execution></executions>
</plugin>
Is there already a plugin for this? Ideally, it would be able to:
Execute a JAR artifact given its Maven-coordinates
Put the directories containing the artifacts' JARs on the path (so I can reference the JARs in command-line arguments)
I've also looked at the dependency:copy goal to copy all JARs to a common directory first, but that seems unnecessary since the JARs are already built as part of the project. (I'm also not sure which directory would make the best copy destination.)
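For reference, the dependency:copy approach mentioned above might look roughly like this (the groupId and output directory are placeholders):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-module-jars</id>
      <phase>pre-integration-test</phase>
      <goals><goal>copy</goal></goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- placeholder coordinates for Module1 -->
            <groupId>com.example</groupId>
            <artifactId>module1</artifactId>
            <version>${project.version}</version>
          </artifactItem>
        </artifactItems>
        <!-- all copied JARs end up in one known directory -->
        <outputDirectory>${project.build.directory}/it-lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```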
Related
According to https://maven.apache.org/pom.html#properties it should be possible to use ${settings.x} to access the element <settings><x>value</x></settings> from the settings.xml file in the pom.xml file.
However, when I try something like
<profiles>
  <profile>
    <activation>
      <file>
        <exists>${settings.localRepository}/path/to/file</exists>
      </file>
    </activation>
  </profile>
</profiles>
in my pom.xml it isn't replaced in the effective pom.xml. When I replace ${settings.localRepository} with ${user.home}/.m2/repository it works fine but that's not what I want. Is there anything I can do to fix that? (Tested with Apache Maven 3.6.0.)
Background information:
I have a dependency that isn't available in an online Maven repository, and I can't change that. It must be compiled by the users themselves and can then be installed to the local repository. Instead of doing this manually, I'm trying to do it automatically in my pom.xml. For this I have to ignore the dependency if it's not present in the local repository, hence the profile that checks whether the file exists there. Without the profile, Maven wouldn't even start the lifecycle, because the dependency can't be resolved. Of course the project won't compile the first time the pom.xml is executed, but all dependencies are installed automatically and the project compiles on a second pass. I know this isn't a clean solution, but I think it's better than telling users to compile and install dependency xy manually before this project can be compiled. I also include a build script that first runs mvn clean initialize to install the dependencies and then mvn clean compile.
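A minimal sketch of that profile pattern (coordinates and the repository path are placeholders) might look like:

```xml
<profiles>
  <profile>
    <id>with-local-dep</id>
    <activation>
      <file>
        <!-- ${settings.localRepository} is not interpolated here in Maven 3.6,
             which is exactly the problem described above; ${user.home} works -->
        <exists>${user.home}/.m2/repository/com/example/dep/1.0/dep-1.0.jar</exists>
      </file>
    </activation>
    <dependencies>
      <!-- the dependency is only declared when it already exists locally -->
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>dep</artifactId>
        <version>1.0</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```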
Put the source of the external dependency in a project of its own, like:
+- main
+- pom.xml ... <packaging>pom...<module>external...<module>internal
|
+- external
| +- ... Maven dirs as usual ...
| + pom.xml
|
+- internal
+- ... Maven dirs as usual ...
+- pom.xml ... <dependency>external
That way, when building main, the Maven reactor takes care of the modules' build order (external first, then internal in this case), and you can forget about dealing with settings.xml, repositories, profiles, or properties.
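A minimal main/pom.xml for this layout (groupId and version are placeholders) could be:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>  <!-- placeholder -->
  <artifactId>main</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the reactor orders modules by their dependencies, not by list order -->
    <module>external</module>
    <module>internal</module>
  </modules>
</project>
```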
I have a multi-module Maven project with source directories apart from 'src' where Java files reside.
This is the folder structure
folder1
-pom.xml
pom.xml Contains modules defined like this:
<modules>
<module>module1</module>
<module>module2</module>
<module>module3</module>
<module>module4</module>
<module>module5</module>
<module>module6</module>
<module>module7</module>
<module>module8</module>
<module>module9</module>
<module>module10</module>
</modules>
Different modules are organized like this:
module1
-src
-gen
module2
-src
module3
-gen
module4
module5
-src
-gen
So, as you can see, there are modules/projects that have either src or gen, both, or neither.
When I run the FindBugs analysis, it picks up only the Java classes from 'src' and skips 'gen' (which is natural, as the Maven model makes the analyzer look only at src).
So, in the Jenkins job configuration, I defined sources explicitly like this:
-Dsonar.sources=src,gen
-Dsonar.exclusions=src/test/java/**
When I run with this configuration, the analysis fails for the modules that don't have both src and gen (module2, module3, module4).
So, how do I run the analysis so that it picks up either src or gen, or skips a module if one of them is missing?
Thanks,
Ron
When using the SonarQube Scanner for Maven, you can't specify properties that apply only to some of the modules on the command line.
In the modules where you want to modify the sources, add in the pom.xml a property. For example, in module5/pom.xml add:
<properties>
  <sonar.sources>src,gen</sonar.sources>
</properties>
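With such a property set in each module that actually has a gen directory, the analysis can then be launched from the aggregator as usual (the exclusion flag is kept from the original Jenkins configuration; modules without an override fall back to the default src):

```shell
mvn clean verify sonar:sonar -Dsonar.exclusions=src/test/java/**
```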
So I have the following tar.gz file in my repo, with a structure like:
> A.tar.gz
> |
> |____ a.tar.gz
> |
> |____ b.tar.gz
> |
> |_____ folderA
> |
> |_____ folderB
> |
> |______ jar1.jar
> |
> |______ jar2.jar
Now in my POM file for another project I would like to add the jar1 and jar2 as dependencies. So far I have the following:
<dependency>
  <groupId>com.groupid</groupId>
  <artifactId>master</artifactId>
  <version>18.1</version>
  <type>tar.gz</type>
  <classifier>bin</classifier>
</dependency>
This made the tar file available. I then tried to unpack it as:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack-dependencies</goal>
      </goals>
      <configuration>
        <includeTypes>tar.gz</includeTypes>
        <includeArtifactIds>master</includeArtifactIds>
        <outputDirectory>target/somefolder</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
However, on running the build, I still don't get the jars as dependencies. I'm sure I'm missing something, so any help is appreciated!
So you have JAR dependencies located inside another tar.gz dependency that you want in your project. So far so good; the problem is that you're trying to:
Make the first .tar.gz dependency available (OK)
Launch the build to unpack the jars (OK)
Add the unpacked jars as dependencies during the same build (not possible)
All in one run. That is not possible in Maven by design (to the best of my knowledge). Even if you did find a way to do it, it would overcomplicate your build and break Maven's design, probably leading to other issues.
You mentioned that you don't control the packaging of the other team and it seems you can't upload said dependencies on a Nexus repo either. What you can do is make the JAR dependencies available in your local repo prior to running your build by doing something like:
Download your tar.gz file and unpack it
Install the jar dependencies in your local Maven repo using commands like mvn install:install-file -Dfile=path/to/jar1.jar -DgroupId=com.mycompany -DartifactId=jar1 -Dversion=1.0.0 -Dpackaging=jar with the proper version, groupId, and artifactId for each jar (see the Guide to installing 3rd party JARs for more details)
Now you can run your original project by simply declaring your jars as <dependency> entries
With this you will manually install your jar dependencies into your local Maven repository, making them available to your project without needing a Nexus repository or further unpacking. You can perform steps 1 and 2 manually, or by creating another Maven project that you run once before your main project. To do so, create a new project and use the maven-dependency-plugin as you already did, coupled with the exec-maven-plugin to run the mvn install:install-file command.
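Steps 1 and 2 can also be scripted; a rough sketch (paths, versions, and coordinates are placeholders, adjust them to the actual archive layout):

```shell
# unpack the distributed archive (the nested a.tar.gz/b.tar.gz may need a second pass)
tar -xzf A.tar.gz

# install each bundled jar into the local Maven repository
mvn install:install-file -Dfile=folderA/folderB/jar1.jar \
    -DgroupId=com.mycompany -DartifactId=jar1 -Dversion=1.0.0 -Dpackaging=jar
mvn install:install-file -Dfile=folderA/folderB/jar2.jar \
    -DgroupId=com.mycompany -DartifactId=jar2 -Dversion=1.0.0 -Dpackaging=jar
```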
Note that this process must be repeated on every individual machine on which you will run your project. As @khmarbaise mentioned, it's best to have your dependencies available directly through a repository manager such as Nexus, without having to perform additional steps, but this temporary workaround should work just fine.
I'm new to Maven. I want to package a jar of my Hadoop project with its dependencies, and then use it like:
hadoop jar project.jar com.abc.def.SomeClass1 -params ...
hadoop jar project.jar com.abc.def.AnotherClass -params ...
And I want to have multiple entry points for this jar (different hadoop jobs).
How could I do it?
Thanks!
There are two ways to create a jar with dependencies:
Hadoop supports a jars-in-a-jar format, meaning that your jar can contain a lib folder of jars that will be added to the classpath at job submission and map/reduce task execution
You can unpack the jar dependencies and re-pack them with your classes into a single monolithic jar.
The first requires you to write a Maven assembly descriptor file, but in reality it's more hassle than it's worth. The second also uses Maven assemblies but relies on a built-in descriptor. To use it, just add the following to the project -> build -> plugins section of your pom:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <!-- bind assembly:single to package so mvn package builds the fat jar -->
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Now when you run mvn package you'll get two jars in your target folder:
${project.artifactId}-${project.version}.jar - which will contain just the classes and resources of your project
${project.artifactId}-${project.version}-jar-with-dependencies.jar - which will contain your classes/resources plus everything from your dependency tree with compile scope, unpacked and repacked into a single jar
For multiple entry points you don't need to do anything specific; just make sure you don't define a Main-Class entry in the jar manifest (if you explicitly configure a manifest, that is; the default doesn't name a Main-Class, so you should be good).
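With no Main-Class in the manifest, any entry point can then be named on the command line, as in the original example (the jar name and class names are placeholders):

```shell
hadoop jar target/project-1.0-jar-with-dependencies.jar com.abc.def.SomeClass1 -params ...
hadoop jar target/project-1.0-jar-with-dependencies.jar com.abc.def.AnotherClass -params ...
```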
Short version:
I would like the maven-glassfish-plugin to only be executed in the root project in a hierarchical (inheritance based) maven multiproject setup.
Long version:
Following setup:
project-root
|
+-pom.xml
|
+ ear-module
| |
| +-pom.xml
|
+ jar-module
|
+-pom.xml
All submodules are included in the root project via <modules>...</modules> and all submodules inherit the root project pom.xml.
In my root project pom I include the maven-glassfish-plugin:
...
<plugin>
  <groupId>org.glassfish.maven.plugin</groupId>
  <artifactId>maven-glassfish-plugin</artifactId>
  <version>2.1</version>
  <inherited>false</inherited>
  <configuration>
    <glassfishDirectory>${glassfish.home}</glassfishDirectory>
    <passwordFile>${glassfish.home}/masterpassword.txt</passwordFile>
    <domain>
      <name>${project.name}</name>
      <adminPort>4848</adminPort>
      <httpPort>8080</httpPort>
      <httpsPort>8443</httpsPort>
      <iiopPort>3700</iiopPort>
      <jmsPort>7676</jmsPort>
    </domain>
    <components>
      <component>
        <name>poc.vermittler</name>
        <artifact>${project.basedir}/ear-module/target/ear-project-1.0-SNAPSHOT.ear</artifact>
      </component>
    </components>
  </configuration>
</plugin>
...
(Note: this is just a simplified version of my pom. It may not run.)
I only want to deploy the ear-module to GlassFish, which is why I added the <inherited>false</inherited> element, and I list the modules to be deployed as <components>...</components> in the root pom.
Now the command:
mvn glassfish:deploy
Will deploy the ear to GlassFish; all well so far... but then Maven will descend recursively into all submodules, which will all fail with:
No plugin found for prefix 'glassfish' in the current project and in the plugin groups [org.apache.maven.plugins, org.codehaus.mojo] available from the repositories
I could tell Maven to only run the root project with the -pl option, but for my taste, deploying shouldn't rely on such additional information.
Thanks a lot for your help!
It seems that there is no good solution to this problem:
either the plugin would have to support a no-op/silent-skip mode,
or it will fail in all subprojects.
Another method could be to create a new subproject (which is not included in the root project by <modules>...</modules> but inherits from the root project) and add dependencies to only the projects that have a deployment artifact.
The plugin can then be included in this subproject without Maven trying to run it in every subproject.
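A sketch of such a deployer subproject's pom (all coordinates are placeholders) might look like:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <!-- placeholder: inherits from the root project but is NOT in its <modules> -->
    <groupId>com.example</groupId>
    <artifactId>project-root</artifactId>
    <version>1.0-SNAPSHOT</version>
  </parent>
  <artifactId>deployer</artifactId>
  <packaging>pom</packaging>
  <dependencies>
    <!-- only the modules that actually produce a deployable artifact -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>ear-module</artifactId>
      <version>1.0-SNAPSHOT</version>
      <type>ear</type>
    </dependency>
  </dependencies>
</project>
```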
Or, for anybody who is lazy: mvn clean package glassfish:redeploy -pl . to selectively run only the root project without descending into the child projects.