Maven: Force Jersey to use specific artifact version

I have a Maven project in which I load Jena TDB 0.9.3 (which depends on Jena ARQ 2.9.3), Jersey 1.8 and RMOnto 1.0. The point, as you might expect, is to do some analysis on semantic datasets.
It looks like RMOnto has ARQ 2.8.7 built in, as in "hardwired": there is no explicit dependency in its pom file, yet the jar contains an ARQ.class. It's very tricky, because you won't notice it with the Maven Enforcer Plugin and the like.
It looks like this causes Jersey to use RMOnto's ARQ version instead of the one defined in pom.xml. Here is a minimal example. When you run the test (which checks whether ARQ.VERSION equals 2.9.3), it succeeds. When you build the project and deploy it on Tomcat 7, you should see 2.8.7 as output.
Is this behaviour expected and why?
How could one force Jersey to use ARQ 2.9.3?
In case it's not possible, could one isolate RMOnto to use 2.8.7 while the rest of the source uses 2.9.3?
Thanks in advance!

You should define ARQ 2.9.3 first in the dependencies list. By doing that you force your build to use that specific version: when Maven has to mediate between versions found at the same depth in the dependency tree, the order of declaration decides which artifact is used.
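A minimal sketch of that ordering (the RMOnto coordinates are an assumption based on the repository path mentioned below; adjust them to whatever its pom actually declares):
<dependencies>
  <!-- declared first, so Maven's mediation picks this ARQ version -->
  <dependency>
    <groupId>org.apache.jena</groupId>
    <artifactId>jena-arq</artifactId>
    <version>2.9.3</version>
  </dependency>
  <dependency>
    <groupId>put.semantic</groupId>
    <artifactId>RMOnto</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>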
Update
OK, I understand what the problem is.
The RMOnto jar is obviously shaded according to the pom: http://semantic.cs.put.poznan.pl/maven/put/semantic/RMOnto/1.0/RMOnto-1.0.pom.
Tomcat 7 loads the jars in WEB-INF/lib in an undefined order. This means that even if you define ARQ 2.9.3 first in your dependencies, that order will not hold when the application runs in Tomcat. http://tomcat.apache.org/tomcat-7.0-doc/class-loader-howto.html
The good thing is that Tomcat always looks in WEB-INF/classes before WEB-INF/lib for dependencies.
So what you can do as a workaround is make sure that ARQ 2.9.3 is added to the WEB-INF/classes folder. This can be done using the maven-dependency-plugin:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <phase>prepare-package</phase>
          <goals>
            <goal>unpack</goal>
          </goals>
          <configuration>
            <artifactItems>
              <artifactItem>
                <groupId>org.apache.jena</groupId>
                <artifactId>jena-arq</artifactId>
                <version>2.9.3</version>
                <outputDirectory>${project.build.directory}/classes</outputDirectory>
                <excludes>**/META-INF/</excludes>
              </artifactItem>
            </artifactItems>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Your war, as well as your exploded war, will now contain all the classes from ARQ 2.9.3 in the WEB-INF/classes folder. They will be loaded before any jar file in the WEB-INF/lib folder.
NB: I have not tested this on Tomcat, but I see no reason why it would not work.
NB2: This is a hack. The best thing would be to remove the ARQ packages from the RMOnto jar.

You should file a defect report against RMOnto. Hard-wiring library code into a jar, instead of including it as a dependency you can manage in the POM, is definitely a bad idea that the code maintainer should fix.

If the files have been copied directly into the RMOnto jar, the behaviour is expected.
In that case, I'd say the best bet is to edit the problem away, i.e. remove the ARQ files directly from the package. Opening the RMOnto-1.0.jar (it's just a .zip), one can see the ARQ files in the arq folder. What you'd need to do is open the jar, remove the ARQ files from there, store the edited RMOnto package in your version control / repository, and refer to the edited package from there. You'd also need to add an excludes statement to your pom for the old version of ARQ and keep the dependency on the new version.
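A sketch of what that could look like, assuming the old ARQ is declared somewhere in the dependency tree under these (hypothetical) coordinates:
<dependency>
  <groupId>put.semantic</groupId>
  <artifactId>RMOnto</artifactId>
  <version>1.0</version>
  <exclusions>
    <!-- keep the old ARQ off the classpath (coordinates hypothetical) -->
    <exclusion>
      <groupId>com.hp.hpl.jena</groupId>
      <artifactId>arq</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- and keep the explicit dependency on the new version -->
<dependency>
  <groupId>org.apache.jena</groupId>
  <artifactId>jena-arq</artifactId>
  <version>2.9.3</version>
</dependency>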
If you feel like it, it would also be good practice to remove the other dependencies that aren't mentioned in RMOnto's pom file, then declare them in the RMOnto pom (and rebuild, if you have the source code). That way the Maven mechanism would be aware of them. The jar seems to contain a lot of bundled dependencies like this, which will cause headaches in the future.

Related

How to deal with dependencies with "provided" scope in OSGi

There are lots of tutorials that show how to cope with the dependencies of an OSGi project and how they should be converted into a bundle. After more than a day of research, I still have not found how to deal with dependencies with provided scope.
Let me give an example. I am currently using Dropbox (dropbox-core-sdk 3.0), and it has two dependencies (com.google.android and javax.servlet) with provided scope. When I use techniques such as the maven-bundle-plugin or bnd, they only download the artifacts and their transitive dependencies. However, I also need the provided dependencies in order to be able to import my project into the OSGi container.
I am using maven-bundle-plugin and my pom.xml looks like:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <Bundle-SymbolicName>${project.artifactId};singleton:=true</Bundle-SymbolicName>
          <Bundle-Version>${project.version}</Bundle-Version>
          <Export-Package>*</Export-Package>
          <Embed-Transitive>true</Embed-Transitive>
          <Embed-Dependency>*</Embed-Dependency>
        </instructions>
      </configuration>
    </plugin>
  </plugins>
</build>
Even though Embed-Dependency says to include everything, only the compile dependencies plus their transitive dependencies end up in the jar. However, I want the provided-scope jars to be in the jar as well.
Is there any way to embed dependencies with provided scope? If not, how should I deal with this situation?
I would have to defer to the Maven BND experts out there, but I don't think you can include provided dependencies through a Maven build. Since it is unlikely you will be using the Android components outside of your bundle, couldn't you just manually download the needed jars and place them in your bundle (Bundle-Classpath)?
I think you can specify the scopes of the dependencies you want to embed. Be careful, though: some dependencies, like the OSGi spec jars, should never be deployed.
In general you should only embed dependencies that are hidden inside the bundle. Any packages that are needed to talk to other bundles had better not be embedded.
For example, the servlet API is typically provided by the HttpService bundle you use.
Try this option:
<Embed-Dependency>*;scope=compile|provided</Embed-Dependency>
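In context, and combined with the warning above about not embedding the servlet API, the instructions block could look like this (a sketch; the artifactId in the negation filter is an assumption about what the servlet dependency is called):
<instructions>
  <Bundle-SymbolicName>${project.artifactId};singleton:=true</Bundle-SymbolicName>
  <Bundle-Version>${project.version}</Bundle-Version>
  <Export-Package>*</Export-Package>
  <Embed-Transitive>true</Embed-Transitive>
  <!-- embed compile- and provided-scope dependencies, but leave the servlet API out -->
  <Embed-Dependency>*;scope=compile|provided;artifactId=!servlet-api</Embed-Dependency>
</instructions>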

How to make zip files (produced by a self-made Maven plugin) from the target folder end up in the local repository?

I am creating my own maven-environment-plugin that creates and bundles resources in a predefined folder structure for each environment defined in the configuration. The plugin outputs the folder structure and resources in a zip file and places it in the target folder.
Questions:
How can I make my plugin work like the maven-assembly-plugin, so that my output to the target folder also ends up in my local repository when I run 'mvn install'?
Do I need to mark it or something? It happens automatically when the maven-assembly-plugin is used.
How does the maven-assembly-plugin manage to make sure of this?
I am using mojo for my plugin development.
<plugin>
  <groupId>dk.kmd.devops.maven.plugin</groupId>
  <artifactId>envconfiguration-maven-plugin</artifactId>
  <version>1.0.3</version>
  <configuration>
    <environments>
      <environment>${env.local}</environment>
      <environment>${env.dev}</environment>
      <environment>${env.t1}</environment>
      <environment>${env.t2}</environment>
      <environment>${env.p0}</environment>
    </environments>
    <sourceConfigDir>${basedir}/src/main/config</sourceConfigDir>
    <zipEnvironments>true</zipEnvironments>
  </configuration>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>generateEnv</goal>
      </goals>
    </execution>
  </executions>
</plugin>
You need to attach (that's the correct terminology in this case) the new artifact (the generated zip file) to the build as part of its official artifacts.
This is basically what the attach-artifact goal of the build-helper-maven-plugin does:
Attach additional artifacts to be installed and deployed.
From its official examples, the attach goal:
Typically run after antrun:run, or another plugin, that produces files that you want to attach to the project for install and deploy.
The "another plugin" in this case can be the plugin you developed. Hence there are two solutions to your case:
configure the build-helper-maven-plugin in your pom.xml to attach the generated artifact (see the sketch below), or
add to your plugin the functionality to attach the generated file automatically.
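For the first option, a minimal sketch of the build-helper-maven-plugin configuration (the zip file name and classifier here are hypothetical; point the file element at whatever your plugin writes to the target folder):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.12</version>
  <executions>
    <execution>
      <id>attach-environments-zip</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <!-- hypothetical: the zip produced by your plugin -->
            <file>${project.build.directory}/environments.zip</file>
            <type>zip</type>
            <classifier>environments</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>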
The second case can be covered via Maven API, using the MavenProjectHelper and its attachArtifact method.
In your mojo, you can import it as a component via:
/**
 * Maven ProjectHelper
 */
@Component
private MavenProjectHelper projectHelper;
Then use the aforementioned method:
projectHelper.attachArtifact(project, "zip", outputFile);
You should probably already have the required Maven dependency providing it, but just in case it would be this one:
<dependency>
  <groupId>org.apache.maven</groupId>
  <artifactId>maven-core</artifactId>
  <version>3.3.9</version>
</dependency>
Note that the artifact will be attached to the build as an additional artifact via a classifier, that is, a suffix to the default artifact name differentiating it from the default artifact and making it unique as output of the build.
As a reference to a real example, and to further answer your (last) question, check this query on the GitHub maven-plugins repository (searching for the attachArtifact string): you will see it used in a number of Maven plugins, among which the maven-assembly-plugin, for example here in the AbstractAssemblyMojo class.

How to force Maven to always create a new jar file?

If all classes are up to date ("Nothing to compile - all classes are up to date"), will Maven create the jar again?
As far as I can see in my log, the jar is not created again, so Maven knows that all classes are up to date.
Question: is there any process or mechanism that controls this?
The Maven Jar Plugin will create a jar via its jar goal if none exists, or skip its creation if one exists and nothing has changed.
You can force the creation of the jar via its forceCreation option (since version 2.2). From official documentation:
Require the jar plugin to build a new JAR even if none of the contents appear to have changed. By default, this plugin looks to see if the output jar exists and inputs have not changed. If these conditions are true, the plugin skips creation of the jar. This does not work when other plugins, like the maven-shade-plugin, are configured to post-process the jar. This plugin can not detect the post-processing, and so leaves the post-processed jar in place. This can lead to failures when those plugins do not expect to find their own output as an input. Set this parameter to true to avoid these problems by forcing this plugin to recreate the jar every time.
Its default value is false, which explains the behavior you are seeing.
If you want to force it always, you can add this to your pom file:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <forceCreation>true</forceCreation>
        </configuration>
        ...
      </plugin>
    </plugins>
  </build>
  ...
</project>
Or, to force it just for a single build, invoke it as follows:
mvn package -Djar.forceCreation=true
So, going back to your question:
is there any process or mechanism that controls this?
The answer is: yes, the Maven Jar Plugin works on this, and the option above will change its behavior.

How can the production jar specify its own dependencies when added to other project as a dependency?

If the question title doesn't make it clear, let me explain here in more detail. Suppose the production jar of one of my Maven applications needs to be used in my other Maven web-application. Adding that jar as a Maven dependency of my second application doesn't add its transitive dependencies. Also, the jar is itself an application.
One way is to look at the POM of the first application and add its dependencies to the POM of the other application. But then, how do jars from Maven Central add their own transitive dependencies when added to some project?
In other words, if I add the commons-io.jar Maven dependency to my project, it automatically adds its transitive dependencies. But when I add myjar.jar as a Maven dependency (scope -> system), it doesn't automatically add its transitive dependencies.
I think that I should develop my first application as some other archetype which can be used in such a case. Please advise me how to proceed further.
Sorry for this newbie question. Actually, I'm new to Maven and I've started using Netbeans-embedded-maven to create applications. I really like the way Maven simplifies the job.
Edit
Seems like I should explain in more detail, so here it is.
Suppose I wrote an application that used A.jar, B.jar and C.jar, and my production output was X.jar (which, as per the default Maven build, obviously doesn't contain the other jars within it). A.jar, B.jar and C.jar are present in the Maven Central repository and were added as dependencies to my project; the project's build output is X.jar.
Now I write another application in which I add X.jar as a system dependency. What I want is for A.jar, B.jar and C.jar to be added automatically to the project, since they are transitive dependencies of X.jar.
I hope I've explained it clearly this time. Please forgive my earlier writing style in case you didn't understand.
One solution is to build X.jar containing all dependencies within it, using something like this:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <addClasspath>true</addClasspath>
        <mainClass>com.nitinsurana.mlmmaven.Start</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-my-jar-with-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
But I'm looking for something that automatically adds transitive dependencies of a system dependency.
The system scope is not supposed to be used for actual jar dependencies that will be packaged with another application. Quoting from the official documentation:
Dependencies with the scope system are always available and are not looked up in repository. They are usually used to tell Maven about dependencies which are provided by the JDK or the VM. Thus, system dependencies are especially useful for resolving dependencies on artifacts which are now provided by the JDK, but were available as separate downloads earlier. Typical examples are the JDBC standard extensions or the Java Authentication and Authorization Service (JAAS).
You should use the default compile scope.
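Concretely, once X is installed or deployed to a repository, declaring it like any other dependency is enough (coordinates hypothetical; compile is the default, so no scope element is needed):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>X</artifactId>
  <version>1.0</version>
  <!-- compile scope by default: A, B and C are pulled in transitively -->
</dependency>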
As others have suggested, use the (default) compile scope and add <exclusions> for transitive dependencies you don't want / need.
See: Maven > Optional Dependencies and Dependency Exclusions
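A sketch of such an exclusion (the excluded coordinates are hypothetical):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>X</artifactId>
  <version>1.0</version>
  <exclusions>
    <!-- a transitive dependency of X that you don't need -->
    <exclusion>
      <groupId>com.example</groupId>
      <artifactId>unwanted-lib</artifactId>
    </exclusion>
  </exclusions>
</dependency>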
I had gone through the link provided by @Sean, and it seems like what I want is not possible.
Shall I vote to delete this question?
The answer is IT CAN'T BE DONE, and here's why:
Project-A -> Project-B
The diagram above says that Project-A depends on Project-B. When A declares B as an optional dependency in its POM, this relationship remains unchanged. It's just like a normal build, where Project-B will be added to its classpath.
Project-X -> Project-A
But when another project (Project-X) declares Project-A as a dependency in its POM, the optional dependency takes effect. You'll notice that Project-B is not included in the classpath of Project-X; you will need to declare it directly in your POM in order for B to be included in X's classpath.
Taken from the official documentation.
So, your X module is mavenized? Then you can install it locally with mvn clean install and then use it in other projects with all transitive dependencies and compile scope. This works well as long as you do everything on your own machine. As soon as you want to share the code with others or configure a CI build, you need X and its pom to be available to others. The best way to do this is to have your own artifact repository, accessible from all other machines. You install X there and use it with compile scope as usual; you just need to add the new repository to your pom.
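A sketch of that repository declaration (id and URL are placeholders for your own repository):
<repositories>
  <repository>
    <id>internal-repo</id>
    <url>https://repo.example.com/maven2</url>
  </repository>
</repositories>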

How to create a JarJar'd artifact with Maven, where use of artifact does not pull transitive dependencies?

I currently have a Java testing library which is built with Maven and distributed as a jar. My project depends on a very common library (ObjectWeb ASM), and I've experienced problems where an earlier and incompatible version of ASM is already on the classpath. Thus, I've started using the jarjar-maven-plugin to create the jar, repackaging ASM internally where it cannot conflict with another version of ASM.
This executes fine, and my library can be pulled in as a dependency with no problem.
However, because my project has compile-scope dependencies on ASM, whenever a client project adds my library, the transitive dependencies are all pulled in as well. So, hypothetically, if they use a particular version of ASM and also add the version I depend on to the classpath, they get undefined behaviour. I'd like to avoid this situation and allow clients to depend on the JarJar'd artifact without Maven pulling down the transitive dependencies both unnecessarily and potentially dangerously.
How do I create a JarJar'd artifact which users can depend on without pulling transitive dependencies?
I found a solution to this problem by ditching the jarjar-maven-plugin and reverting to the maven-shade-plugin. This allows repackaging classes within your own namespace, setting the main class of the jar and, crucially, rewriting the generated pom so that it does not include the compile-time dependencies which are now bundled.
The part of my pom.xml which achieved this is:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <shadedArtifactAttached>false</shadedArtifactAttached>
        <createDependencyReducedPom>true</createDependencyReducedPom>
        <relocations>
          <relocation>
            <pattern>org.objectweb.asm</pattern>
            <shadedPattern>${package.base}.org.objectweb.asm</shadedPattern>
          </relocation>
        </relocations>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>${package.base}.my.MainClass</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
The important parts of this configuration are:
shadedArtifactAttached, which when set to false means the shaded jar will replace the main artifact that would normally be produced. This defaults to false, but it's worth pointing out.
createDependencyReducedPom, which when set to true means that when the shaded jar is deployed or installed, the pom.xml that is deployed will not include the compile-scope dependencies that have been repackaged into the jar.
relocation: these elements configure how files within the dependencies are repackaged into the shaded jar. In the above example, any class whose canonical name begins with org.objectweb.asm will be moved to ${package.base}.org.objectweb.asm and thus, when packaged in the jar, will have the equivalent file path within the jar.
With this configuration, when my project is deployed, when clients declare a compile-scope dependency on my project, it only pulls in the shaded jar, and no transitive dependencies.
Consider trying the maven-shade-plugin instead, which allows all sorts of fine control.
Perhaps setting the <optional> attribute will work in your case, by specifying something like the following in your Java testing library pom:
<dependencies>
  <dependency>
    <groupId>asm.group</groupId>
    <artifactId>asm</artifactId>
    <version>x.y</version>
    <optional>true</optional>
  </dependency>
  ...
</dependencies>
