I tried to search both here and on jetbrains.net, but I did not find an answer.
I have created a project (P) using Maven that has two modules (A, B). Module A is dependent on module B, and module B is dependent on libraries R. When I open P using IntelliJ IDEA 10.0.1, everything works smoothly. The only problem I have is dependency handling.
The dependencies were imported transitively: both A and B are now dependent on libraries R. I would expect B to be dependent on R, with R exported, and A to be dependent only on B.
I found some old posts on the JetBrains forum that seem to be related, but they appear to describe the opposite problem: http://devnet.jetbrains.net/thread/286098. Can anybody advise me, please? Did I miss some configuration option?
This is how Maven's dependencies work: each module (i.e. each Maven project) has an isolated classpath. Dependencies imported into IDEA are not marked as 'exported', to prevent interference between transitive dependencies.
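To make the expected behaviour concrete, here is a minimal sketch of the two module POMs (the coordinates are hypothetical placeholders, not taken from the question): B declares R, A declares only B, and Maven still resolves R onto A's classpath transitively, which is exactly what the IDEA import mirrors per module.

    <!-- B/pom.xml (hypothetical coordinates): B depends on library R -->
    <dependencies>
      <dependency>
        <groupId>org.example</groupId>
        <artifactId>R</artifactId>
        <version>1.0</version>
      </dependency>
    </dependencies>

    <!-- A/pom.xml: only B is declared; Maven (and the IDEA import) still
         puts R on A's classpath transitively, without any 'export' flag -->
    <dependencies>
      <dependency>
        <groupId>org.example</groupId>
        <artifactId>B</artifactId>
        <version>1.0</version>
      </dependency>
    </dependencies>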
I have two dependencies imported via Maven that both pull in a common library, but at different versions, and the versions are not compatible with each other. It is essentially the problem described in this post:
But unfortunately for me, the solution is not as simple as the blog post describes, because there isn't a common version of package Z that works for both dependencies.
Skipping the poor design decisions that led to this point as I don't control any of these libraries, I'm looking to repackage one of the top-level dependencies and shade all of its dependencies so it can essentially use its own, isolated version of Z. Is this possible to accomplish with Maven?
One solution I have considered is isolating all the classes that depend on package Y, putting them in a separate application, and shipping that as a shaded jar which X imports; however, I'm wondering whether there's a simpler way to accomplish that.
As everyone has suggested, you can use the maven-shade-plugin. Open-source projects handle this by creating one Maven subproject for each dependency that needs to be shaded. So in your case you would need three Maven projects:
1. One for shading dependency Y
2. One for shading dependency G
3. One for your original project, which includes the artifacts created in 1 and 2 as dependencies.
Your maven project hierarchy would look like this:
project
    pom.xml
    shade-Y
        pom.xml
    shade-G
        pom.xml
    application
        pom.xml
An example of a project which has a Maven subproject for shading a dependency is here. Look at the shaded-ning19 folder to see how to create a dedicated Maven project for shading a dependency.
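For illustration only, here is a minimal sketch of what shade-Y/pom.xml could look like; the coordinates (org.example:Y), the package pattern for Z, and the relocation target are hypothetical placeholders:

    <project>
      <modelVersion>4.0.0</modelVersion>
      <groupId>org.example</groupId>
      <artifactId>shade-Y</artifactId>
      <version>1.0.0</version>

      <dependencies>
        <!-- the top-level dependency whose copy of Z should be isolated -->
        <dependency>
          <groupId>org.example</groupId>
          <artifactId>Y</artifactId>
          <version>1.0.0</version>
        </dependency>
      </dependencies>

      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.4.1</version>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>shade</goal>
                </goals>
                <configuration>
                  <relocations>
                    <!-- move Y's copy of Z into a private package -->
                    <relocation>
                      <pattern>org.example.z</pattern>
                      <shadedPattern>org.example.shaded.y.z</shadedPattern>
                    </relocation>
                  </relocations>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </project>

The application module then depends on org.example:shade-Y (and on the analogous shade-G) instead of depending on Y and G directly, so each one carries its own relocated copy of Z.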
I am using the Hector and Astyanax projects. These projects used to be built with Maven, and now Astyanax requires Gradle.
I would like to statically link one of these projects into my Java project (which is not built using Maven/Gradle). I am not interested in updating the version of Astyanax every time they make a new release, and I am not interested in Maven-izing/Gradle-izing my own project.
So, two problems arise: 1. getting the Astyanax jars; 2. getting the dependency jars.
At first, not having time to thoroughly understand Maven (get off my lawn!), I copied all of the jar files in my global .maven directory into my project and linked to them. The problem is, it's a pretty messy solution.
Is there an easier way to get all the jars needed to use a Gradle/Maven library? (While I don't mind using Gradle to build Astyanax, I don't want to use it to build my own project.)
Getting jars for distribution seems like a very basic use case; am I missing a simple way here?
Astyanax is published to Maven Central as com.netflix.astyanax:astyanax:1.56.42. Any build tool (Grails, Maven, Gradle, Buildr, SBT) that resolves from Maven Central can declare a dependency on Astyanax and have its dependencies transitively downloaded. The fact that it's built with Gradle doesn't change how it's consumed.
From your question, it's unclear how you want to resolve these libraries. If you don't want to use such a tool (Grails, Maven, Gradle, Buildr, SBT), then you'll have to manually track down every dependency and its dependencies on Maven Central. It's quite uncommon for a modern Java project to download dependencies by hand anymore; the complexity of the dependency graph makes it prohibitive.
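That said, if the goal is just to collect the jars once, one low-effort option (my suggestion, not part of the answer above) is a throwaway pom.xml that declares nothing but Astyanax, combined with the maven-dependency-plugin's copy-dependencies goal:

    <!-- throwaway pom.xml, used only to fetch jars; not part of your own build -->
    <project>
      <modelVersion>4.0.0</modelVersion>
      <groupId>local</groupId>
      <artifactId>fetch-astyanax</artifactId>
      <version>1</version>
      <packaging>pom</packaging>
      <dependencies>
        <dependency>
          <groupId>com.netflix.astyanax</groupId>
          <artifactId>astyanax</artifactId>
          <version>1.56.42</version>
        </dependency>
      </dependencies>
    </project>

Running mvn dependency:copy-dependencies next to this file copies Astyanax and all of its transitive dependencies into target/dependency/, from where they can be linked into a non-Maven project. You still have to repeat this for new releases, but at least Maven does the graph walking.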
I have a Maven multimodule project that has one parent pom-project and a bunch of modules. One of these modules is the "main module" that has all the libraries shaded into it. All other modules depend on that module and use the provided libraries.
The main module is a Bukkit plugin that loads the other modules as extensions. These extensions are each loaded with their own classloader, but the loaded classes are shared between the loaders so that they can depend on each other. They are also able to depend on other Bukkit plugins, as their parent classloader is Bukkit's PluginClassLoader, which also shares loaded classes between plugins to allow interaction.
That's where the problems start: Different plugins may use the same library, but the classes of that library might get loaded by different classloaders which causes LinkageErrors and other problems.
My idea to solve that problem was to relocate the libraries in the main module via the maven-shade-plugin. That works as expected for libraries that are only used by the main module. However, relocating libraries used by the other modules causes ClassNotFoundExceptions at runtime, because the modules still look for the original package names instead of the relocated ones.
Then I tried to change the imports to the relocated packages, but my IDE (IntelliJ) doesn't find the classes.
Does anyone have an idea of how to solve this relocation problem? Or maybe a different approach to the classloading issue?
Five years later, in a very similar context (Bukkit -> SpongeAPI), I encountered this problem again, but this time I found the (probably only satisfying) solution:
Originally the main module had its shaded version as its main artifact, so dependents could only see relocated classes and were unaware of the original class names; IntelliJ, which does not care about the relocations, was equally unaware of the new relocated classes. Attaching the shaded version as a secondary artifact instead (the shadedArtifactAttached option set to true) keeps the original classes in the main artifact and makes the dependencies visible to the dependents again. This makes no difference in our case, as the main module is a provided dependency anyway, but it also prevents consumers from accidentally using relocated classes directly.
The dependent modules have to apply the same relocation rules as the main module, so the shade plugin rewrites the class names to the ones that are actually available at runtime.
This way IntelliJ is not aware of the relocations, but it also doesn't need to be. If necessary, the relocations can be configured in a parent POM for consistent rules across all projects.
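A minimal sketch of what that shared maven-shade-plugin configuration might look like (package names and plugin version are placeholders); the same relocations block is used in the main module and in every dependent module, e.g. via pluginManagement in the parent POM:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <!-- keep the original jar as the main artifact and attach the
                 shaded jar as a secondary artifact (classifier "shaded") -->
            <shadedArtifactAttached>true</shadedArtifactAttached>
            <relocations>
              <relocation>
                <pattern>com.example.somelib</pattern>
                <shadedPattern>com.example.myplugin.libs.somelib</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>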
I had almost exactly the same problem you have/had (judging from the age of this question). Although I don't have a cleaner solution for the libraries overriding other plugins' versions, I do have a workaround for IntelliJ not recognizing relocated classes.
To stop it from complaining, I added the shaded jar (with the relocations) as an IntelliJ library to the target module.
You can do this like so:
1. Go to File > Project Structure... > Modules > (target module) > Dependencies.
2. Select the shaded jar using Add (green +) > 1. JARs or directories....
3. You should now see the shaded jar in the libraries list.
Although it seems to work at first glance, this workaround has a few caveats:
Non-relocated classes are still visible to your code through Maven's module dependency, and if you happen to use them, you'll only notice upon compiling with Maven. (You could remove the module dependency, but it gets re-added every time you re-import your POM.)
You'd have to update the jar path every time you change your project's version if you include a version number in your jar file name. (Workaround: specify a static project.build.finalName, as sketched below.)
When you add new methods or change signatures, you need to compile the library module again. (This can be worked around by creating a separate module for shading dependencies, which would actually also resolve the file-name issue.)
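For the file-name caveat above, a small sketch of the static finalName workaround (the name itself is just an example):

    <!-- in the shaded module's pom.xml: always produce target/my-plugin-shaded.jar,
         so the path of the IntelliJ library does not change with the version -->
    <build>
      <finalName>my-plugin-shaded</finalName>
    </build>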
I have a Maven multimodule project. Some of the modules create custom packaging for the libraries produced by the other modules. The packaging being used has its own suite of versioned dependencies that I need to play nice with.
As an example: my parent POM might have an entry for e.g. commons-codec:commons-codec 1.4, my "core-lib" POM includes it as a dependency (sans explicit version), and I want to make sure my packaging module bundles in the right version. However, the specific type of custom packaging that I'm using also needs e.g. log4j:log4j 1.2.15, and I want to make sure that when my packaging module runs, it also bundles the correct log4j version.
Here's the wrinkle: the example POM I'm working from for "project that makes {custom packaging}" uses a parent that's provided by the custom-packaging team. If I use their parent, I lose the version info for commons-codec. If I use my parent, I lose the version info for log4j.
Now, ordinarily if I ask "how do I make A and B depend on the same version", you'd answer "make A and B have the same parent, and include a dependencyManagement section in the parent". My problem is, I need A, B, and C to depend on the same version, but I don't have any control over C.
I think this is what Maven "mixins" are meant to address, but of course they don't exist yet. In the meantime, what I've been doing is picking one parent, then copy-and-pasting the dependencyManagement section from the other POM, with a comment saying "make sure you keep this up to date". Obviously this is an ugly, ugly hack, but I haven't found another way to keep current with both sides.
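For concreteness, a sketch of that copy-and-paste hack with the versions from the example above (assuming the custom-packaging team's parent is the one being kept):

    <!-- in the POM that inherits from the custom-packaging team's parent -->
    <dependencyManagement>
      <dependencies>
        <!-- copied from our own parent POM; make sure you keep this up to date -->
        <dependency>
          <groupId>commons-codec</groupId>
          <artifactId>commons-codec</artifactId>
          <version>1.4</version>
        </dependency>
      </dependencies>
    </dependencyManagement>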
What about using the assembly plugin to pack up your artifact with all its dependencies and having your packaging module run on that instead? Then you're not trying any POM magic; it's just a matter of one project using the artifact from another project, as usual.
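A minimal sketch of that idea, using the assembly plugin's built-in jar-with-dependencies descriptor (the plugin version is just an example):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <version>3.6.0</version>
      <configuration>
        <descriptorRefs>
          <!-- built-in descriptor: bundle the artifact together with all its dependencies -->
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
      <executions>
        <execution>
          <id>make-assembly</id>
          <phase>package</phase>
          <goals>
            <goal>single</goal>
          </goals>
        </execution>
      </executions>
    </plugin>

The packaging module then consumes the resulting *-jar-with-dependencies artifact like any other dependency, with all versions already baked in.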
For now, I'm going to accept the answer of "this is one of the really sucky things about Maven". Maybe this question can get updated when Maven 3.1 finally launches.
Could you not activate multiple profiles, each with its own dependencies section pulling in the required libraries when enabled? This allows some nice flexibility, given the many ways that profiles can be activated.
I'm having a problem reconciling building a project for use within an application server and for use as a stand-alone application.
To give an overall simplified context, say I have three projects: A, B, and C.
Project A depends on Project B which depends on Project C.
Project C has a dependency X which is marked as provided, since it was expected to be available as a JEE library within, say, an application server (e.g. jms.jar).
So if I perform an assembly build of Project A, I get all the transitive dependencies except those marked as provided, as expected.
Now I have a new deployment scenario where Project A needs to be used in a standalone environment i.e. outside an application server.
So now I need the jms jar to be a compile dependency. Does this mean that I should explicitly add a compile dependency on X in Project A? Doesn't this violate the Law of Demeter (don't talk to strangers), in the sense that Project A shouldn't explicitly know about Project C but only about Project B?
This is a simple example, but in reality I have multiple dependencies which were marked as provided but now need to be compile or runtime dependencies so that they end up in the artifact produced by the maven-assembly-plugin.
Is this a fundamental problem with Maven or am I not using the tools correctly?
Thanks in advance for any guidance.
If you need your build to have variations in it for different scenarios, you need to use profiles and keep certain things (such as some of the dependencies) in the various profiles.
http://maven.apache.org/pom.html#Profiles
Different dependencies for different build profiles in maven
answers a similar question - but you can swap in the "release" and "debug" for "Project A" and "Project C"
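As a hedged sketch of that idea applied to the example above (the profile id and the javax.jms coordinates are placeholders for whichever provided artifacts need to end up in the assembly), Project A could declare a standalone profile that re-adds the JEE libraries at compile scope:

    <profiles>
      <!-- activate with: mvn -Pstandalone package -->
      <profile>
        <id>standalone</id>
        <dependencies>
          <!-- add the JMS API at compile scope so the assembly plugin bundles it -->
          <dependency>
            <groupId>javax.jms</groupId>
            <artifactId>jms-api</artifactId>
            <version>1.1-rev-1</version>
          </dependency>
        </dependencies>
      </profile>
    </profiles>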
Provided dependencies are a difficult subject. First of all, provided dependencies are not transitive in the following sense: if your project C has a provided dependency on X, then A will not get that dependency; it is silently ignored. This fits with the following meaning of "provided", which I propose:
Only the artifacts that are actually deployed should mark dependencies as "provided". Libraries or other jars that are not individually deployed to a specific server should not have provided dependencies. Instead, they should declare their dependencies as compile dependencies. In your example: Project C should have a compile dependency on X. If Project A knows that X is provided, it sets X to provided in its "dependencyManagement". As Project A should know the environment in which it runs, it should decide what is provided and what is not, and "dependencyManagement" is the right place to declare this.
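A short sketch of that split (coordinates are placeholders): C declares the real compile dependency, and A, which knows its deployment environment, overrides the scope in dependencyManagement:

    <!-- Project C's pom.xml: declare the real need at compile scope -->
    <dependencies>
      <dependency>
        <groupId>javax.jms</groupId>
        <artifactId>jms-api</artifactId>
        <version>1.1-rev-1</version>
      </dependency>
    </dependencies>

    <!-- Project A's pom.xml (the deployed artifact): the server provides JMS -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>javax.jms</groupId>
          <artifactId>jms-api</artifactId>
          <version>1.1-rev-1</version>
          <scope>provided</scope>
        </dependency>
      </dependencies>
    </dependencyManagement>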
If your Project A should be able to run both with and without a given server, you probably need to make a lot of adjustments, maybe even change the packaging type from ear to jar. So you either use build profiles for this, which then have different dependencyManagement entries, or you split A into two projects which both depend on some other project that contains the common elements.
If some given project C already has a provided dependency on X and you cannot change that, this is effectively the same as a missing dependency in C. This has to be repaired at some point, and this could be project A itself.