I am reusing in my project an open source Maven-based component that includes a bunch of shaded (e.g., via the maven-shade-plugin) direct and transitive dependencies in the component's uber-jar. Unfortunately, some of those dependencies clash with dependencies of my own project. Specifically, the component's dependencies transitively include servlet-api 2.x, whereas I need 3.x in my project - and they appear to live in the same namespace. The component's top-level dependency that pulls in servlet-api (lucene-demo) is actually not needed for the component's functionality, so I'd be happy to remove it if possible. My project is built with Gradle.
What is the recommended way of dealing with this type of situation? Is there any way of removing the offending dependencies from the reused uber-jar when I build my own project? Or should I rebuild the reused component myself, excluding the troublesome dependency? If so, can this be done in an automatic manner, such that I don't need to maintain my own fork of the open source component? The component is presently hosted in GitHub and published via Maven Central.
(As you might gather, I'm a bit of a beginner with both Maven and Gradle, so feel free to dumb things down.)
Related
I want to publish a common build script which I will include across various projects in my application.
It will contain only the common set of dependencies, i.e. dependencies with particular versions that are shared across all the artifacts in my enterprise application.
My applications will refer to this file via its URL.
How can I achieve this?
EDIT 1: My exploration in this direction is based on this answer on SO:
How to share a common build.gradle via a repository?
There are a few different options for this.
One is to publish a project with the dependencies you want to share defined as API dependencies. Projects that depend on this will inherit the dependencies.
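For example, a minimal shared project (the coordinates and versions below are placeholders) could declare the common dependencies with the java-library plugin's api configuration, which is what makes them leak through to consumers:

// build.gradle.kts of the hypothetical shared project "my.company:my-shared-deps"
plugins {
    `java-library`
    `maven-publish`
}

dependencies {
    // api dependencies become visible to every project that depends on this one
    api("com.fasterxml.jackson.core:jackson-databind:2.15.2")
    api("org.apache.commons:commons-lang3:3.13.0")
}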
Or you could write and publish a Gradle plugin that will configure your projects with the common dependencies. Projects can apply the plugin, and will automatically be configured in a certain way. (You don't need to publish a plugin to do this - first try creating a project-local buildSrc convention plugin.)
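A minimal buildSrc convention plugin could look roughly like this (the file name is made up; for precompiled script plugins the plugin ID is derived from it):

// buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

repositories {
    mavenCentral()
}

// buildSrc/src/main/kotlin/my-common-dependencies.gradle.kts
plugins {
    `java-library`
}

dependencies {
    api("org.apache.commons:commons-lang3:3.13.0")
}

A subproject then applies it with plugins { id("my-common-dependencies") } and picks up those dependencies automatically.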
I would actually recommend neither of these approaches.
It's easy to get into a tangled web of dependency hell when transitive dependencies are inherited. It's likely that at some point some dependency will clash, and excluding dependencies can be a big headache that easily cancels out any benefit of reducing a little duplication.
Additionally, it's nice when a project is explicit about its dependencies. Being able to look at a build.gradle.kts and understand exactly what dependencies are set is very convenient.
Instead, what I would recommend is controlling the versions of common dependencies in a central location. This can be achieved with the Java Platform plugin. The plugin is applied in a dedicated project whose single build.gradle.kts lists the versions of all the dependencies it governs. (It can also import existing Maven BOMs, like the Spring Boot BOM.)
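As a sketch (coordinates and versions below are placeholders), the platform project itself is little more than a list of dependency constraints:

// build.gradle.kts of the hypothetical "my.company:my-shared-platform" project
plugins {
    `java-platform`
    `maven-publish`
}

// only needed if you also want to import existing BOMs into the platform
javaPlatform {
    allowDependencies()
}

dependencies {
    // optionally pull in an existing BOM
    api(platform("org.springframework.boot:spring-boot-dependencies:3.2.0"))

    constraints {
        // versions only; consumers still opt in to each dependency explicitly
        api("com.fasterxml.jackson.core:jackson-databind:2.15.2")
        api("org.apache.commons:commons-lang3:3.13.0")
    }
}

publishing {
    publications {
        create<MavenPublication>("platform") {
            from(components["javaPlatform"])
        }
    }
}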
Now, all subprojects can add a platform dependency on the 'Java Platform' project.
dependencies {
    // import the platform from a Maven repository
    implementation(platform("my.company:my-shared-platform:1.2.3"))
    // or import a platform defined in a local project of the same build
    implementation(platform(project(":my-project:version-platform")))
    // no need to define a version if it's defined in the platform
    implementation("com.fasterxml.jackson.core:jackson-databind")
}
This is the best of both worlds: projects stay explicit about their dependencies and retain autonomy, while versions are aligned across independent projects.
I have a Maven war project with submodules. One module uses google-api-client, another uses google-cloud-storage. I've sketched some of their dependencies below:
A
|-google-api-client:jar:1.33.1
|-google-http-client-gson:jar:1.41.1
B
|-google-cloud-storage:jar:2.4.4
|-google-api-client:jar:1.33.1
|-google-http-client-gson:jar:1.41.2
When packaging the wars, both google-http-client-gson 1.41.1 and 1.41.2 get packaged. I know Maven has a nearest-wins rule to determine which version is used when compiling, but when the web server loads my project I have no control over which jar is loaded first. So I want to keep only the newer version of each jar.
I know that I can add <exclusion> tags to the dependencies and add a new dependency to tell Maven to use a specific version. However, I am not sure that is best practice, because it requires me to go through the dependencies of third-party libraries, and there are just too many of them.
Any suggestions on how to handle the multiple versions of jars properly?
A good practice I recommend is to use the Maven Enforcer Plugin with the dependencyConvergence rule. This way you are forced to decide which version ends up on the classpath. Of course it means additional effort, because you have to resolve conflicts (also by setting exclusions), but in the end it is well defined which versions you get.
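In the parent pom that looks roughly like this (the plugin version below is just a recent example):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <id>enforce-dependency-convergence</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <dependencyConvergence/>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

With this in place the build fails whenever two versions of the same artifact (like google-http-client-gson 1.41.1 vs 1.41.2) meet, until you pin one of them via <dependencyManagement> or exclusions.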
I have three Java projects. The first is an application, com.foo:foo-application:1.0.0, and the second is a module used as a dependency to that application, com.foo:foo-framework:1.0.0. The third is a Maven plugin authored by our team, com.foo:foo-plugin:1.0.0.
My intention is that any project, e.g. foo-application, which uses classes available in foo-framework must also validate that it has used those classes correctly, where said validation is enforced by foo-plugin.
Is there a way to enforce this behaviour within foo-framework's pom.xml, whereby any Maven module that declares it as a dependency in its own POM will have foo-plugin executed as part of its build lifecycle?
No (at least no way that I'm aware of).
When you declare a dependency on something, you're declaring a dependency on its output artifacts (and, transitively, on their dependencies as described in that artifact's pom.xml). There's no place in a pom file to force anything on the build importing it - the importing build may not even be a Maven build.
It appears you may be able to do something similar via other tools, though - for example, Checkstyle supports discovering rules from dependencies on the classpath (not exactly what you want, and it depends on users of your library running Checkstyle configured just right).
I have the following Java projects structure:
Util
|
-- Core
|
-- Services
|
-- Tools
The Tools and Services projects reference the Core and Util projects. The thing is, I ended up writing the same dependencies in each project; there must be a better way to inherit the dependencies of the referenced projects and add new ones where needed.
I know about multi-project builds in Gradle, but this is not quite a multi-project build, since I can basically take the Core library, compile it (so it then contains the Core + Util libs) and use it in another project.
I wonder what would be the best way to approach this?
Repeating the same dependencies in every project is usually reasonable, because in a bigger project you never know when they will start to diverge, and you don't want to deal with compilation/runtime problems when someone changes the common dependency list.
I believe it is more pragmatic to add a dependency analysis plugin to your build. It will help you remove unnecessary dependencies and explicitly add the transitive dependencies you actually use. And if you add it to your build chain, it will help you keep your dependencies healthy in the future. Have a look at gradle-dependency-analyze, or there may be a better fork or equivalent somewhere.
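Applying it is essentially a one-liner; note that the plugin ID and version below are assumptions based on the gradle-dependency-analyze project, so double-check them on the Gradle Plugin Portal before copying:

// build.gradle.kts
plugins {
    java
    // ID and version assumed; verify against the plugin's documentation
    id("ca.cutterslade.analyze") version "1.9.0"
}

The analysis tasks it adds then report dependencies that are declared but unused, and transitive dependencies that are used but not declared.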
There are really only two options in your case, because there are only two kinds of dependencies: (1) external (some other jar artifact) or (2) internal (another module in a multi-module build).
(1) When you use an external Maven-style dependency, it comes with its own dependencies (so-called "transitive dependencies"). This means that if you do compile 'yourgroup:Core:1.0' then you will get Util as a transitive dependency. But as I mentioned above, it is better to list transitive dependencies explicitly if they are used during compilation, or to prevent them from being accidentally removed and crashing your application at runtime.
(2) If your projects live in the same version control repository and usually change and build together, then the multi-module layout is your best choice. In this case you will refer to the Core dependency like compile project(':Util:Core'), and it will pull in Util as a transitive dependency as well. And you will be able to do what you asked for and define the dependencies for Services and Tools once, inside a subprojects {} block in Core/build.gradle.
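For illustration, assuming a flat layout where all four projects sit directly under one root (adjust the include paths to your actual nesting), shown in Kotlin DSL:

// settings.gradle.kts
rootProject.name = "my-app"
include(":Util", ":Core", ":Services", ":Tools")

// Services/build.gradle.kts (Tools looks the same)
plugins {
    `java-library`
}

dependencies {
    // Util is only visible here transitively if Core declares it with api rather than implementation
    implementation(project(":Core"))
}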
Having a multi-module build doesn't limit you from using the Core library elsewhere. Whether or not it is a multi-module build, you can always add the maven-publish plugin to Core/build.gradle, execute the publishToMavenLocal task, and reference Core.jar from another project the same way you do for external dependencies.
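For example, a minimal publishing setup in Core's build script (group and version are placeholders, shown in Kotlin DSL) could be:

// Core/build.gradle.kts
plugins {
    `java-library`
    `maven-publish`
}

group = "yourgroup"
version = "1.0"

publishing {
    publications {
        create<MavenPublication>("maven") {
            from(components["java"])
        }
    }
}

After running ./gradlew :Core:publishToMavenLocal, another build can consume yourgroup:Core:1.0 from the mavenLocal() repository.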
You can always put your common code (like the part that adds the common dependencies) in an external Gradle script or a custom plugin and apply it in Services and Tools.
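A sketch of the shared-script variant, with a made-up file name and example dependency:

// gradle/common-dependencies.gradle.kts (shared script; the name is arbitrary)
dependencies {
    // string invocation is needed because applied scripts have no type-safe accessors
    "implementation"("org.slf4j:slf4j-api:2.0.9")
}

// Services/build.gradle.kts and Tools/build.gradle.kts
apply(from = rootProject.file("gradle/common-dependencies.gradle.kts"))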
This is a more conceptual question:
I want to create an application which uses the WALA framework, which itself is packaged as an Eclipse plugin and built with Maven Tycho. When I try to add it as a dependency, no transitive dependencies get resolved, because they are covered by the Tycho build.
This is the pom of the WALA project I need at a minimum: https://github.com/wala/WALA/blob/master/com.ibm.wala.core/pom.xml
Should my application be an OSGi bundle itself, or can I build it as a regular jar without much trouble? Which approach is more practical?
If I have seen it correctly, wala.core has only two dependencies, wala.util and wala.shrike (util has none; shrike depends on util). So you might as well simply include all three as dependencies in your project.
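If you go the plain-jar route, the dependency declarations would look roughly like this in Gradle (the coordinates are the ones WALA publishes to Maven Central; the version is only an example, so pick the current release - the same coordinates work in a Maven pom):

dependencies {
    // version is illustrative; check Maven Central for the latest WALA release
    implementation("com.ibm.wala:com.ibm.wala.core:1.5.4")
    implementation("com.ibm.wala:com.ibm.wala.util:1.5.4")
    implementation("com.ibm.wala:com.ibm.wala.shrike:1.5.4")
}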
In the long run, however, you might indeed want to consider creating an OSGi application instead.