make implementation dependency available in another submodule - gradle

I have a submodule rest that declares a couple of dependencies with implementation, and another module app that uses implementation project(":rest").
However, the dependencies that rest declares with implementation are not available to app. I know implementation does that by design, but how can I make them available without falling back to the legacy compile configuration?

It seems I need to apply the java-library plugin and declare those dependencies with the api configuration.
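A minimal sketch of what that looks like in the rest module's build script (the plugin and configurations are real; the dependency coordinates are just examples):

```groovy
// rest/build.gradle
plugins {
    id 'java-library'   // provides the api configuration
}

dependencies {
    // api: exposed to consumers of :rest (such as :app) at compile time
    api 'com.google.code.gson:gson:2.10.1'

    // implementation: internal detail, hidden from consumers' compile classpath
    implementation 'org.apache.commons:commons-lang3:3.12.0'
}
```

With this, app only needs implementation project(":rest") and will see Gson on its compile classpath, but not Commons Lang.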

Related

kotlin data-classes without kotlin-stdlib maven dependency

We have a Kotlin microservice that needs to expose a Maven artifact defining all the DataTransferObjects it requires/emits (e.g. Kotlin data classes representing events published to the event bus).
The client of that microservice, however, is a pure Java application that should depend on this Kotlin DTO Maven artifact without transitively getting kotlin-stdlib or any other Kotlin-specific dependencies.
Can we provide the Kotlin DTOs without introducing any Kotlin dependency?
The easiest and most reasonable approach, in my view, is to create those DTOs as Java classes, since Kotlin is fully interoperable with them.
You don't need any additional tools or tricks for this. Inside that artifact you can easily use Lombok to avoid writing a lot of boilerplate code.
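As a sketch, a DTO written in plain Java is fully usable from Kotlin while keeping kotlin-stdlib off the consumer's dependency graph. Here a Java record stands in for the boilerplate that Lombok's @Data would otherwise generate; the class and field names are made up for illustration:

```java
// A plain-Java DTO: consumers get no Kotlin dependency.
// OrderCreatedEvent is an illustrative name, not from the original project.
public record OrderCreatedEvent(String orderId, long amountCents) {

    public static void main(String[] args) {
        OrderCreatedEvent event = new OrderCreatedEvent("o-42", 1999);
        // Records generate equals/hashCode/toString automatically,
        // much like Kotlin data classes or Lombok's @Data.
        System.out.println(event);
    }
}
```

On the Kotlin side this is consumed exactly like a data class (`event.orderId`, destructuring aside), with no extra dependencies injected.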

What does it really mean that the api configuration exposes dependencies whereas implementation does not, in Gradle?

I have gone through the official docs and many StackOverflow questions about the difference between the api and implementation configurations. I think I understand the basics, but I really want to understand what it means for dependencies to be exposed or not exposed.
This is what I have got so far. When I publish my Java library (written in Kotlin, but that's not relevant), the dependency scope in the published pom file is either compile when api is used or runtime when implementation is used, i.e.
dependencies {
    api "..."
}

<dependency>
    <groupId>...</groupId>
    <artifactId>...</artifactId>
    <version>...</version>
    <scope>compile</scope>
</dependency>

dependencies {
    implementation "..."
}

<dependency>
    <groupId>...</groupId>
    <artifactId>...</artifactId>
    <version>...</version>
    <scope>runtime</scope>
</dependency>
So does exposing dependencies in this case really just mean adding them to the classpath (compile scope)?
One of the many answers about api vs implementation says it is merely about build optimization. It makes sense that build time would be reduced if we don't add everything to the classpath, maybe?
And a bonus question: the Gradle docs say the api configuration comes with the java-library plugin, but apparently I can use it without applying that plugin. How is this possible?
// Gradle 6.1.1
plugins {
    id 'org.jetbrains.kotlin.jvm' version 'XXX'
}

dependencies {
    api "myLibrary"
}
So does exposing dependencies in this case really just mean adding them to the classpath (compile scope)?
Yes, it's pretty much just a matter of having them on the consumer's compile classpath or not.
One of the many answers about api vs implementation says it is merely about build optimization. It makes sense that build time would be reduced if we don't add everything to the classpath, maybe?
Well, good software design advocates not exposing internal implementation details. This is why you have public and private class members in code. You could argue that this principle is sound when it comes to dependencies as well. I see the following benefits:
A consumer does not implicitly start relying on "internal" transitive dependencies. If they did, it would mean that you can't remove them from the library without breaking the consumers.
A reduced classpath may make compilation slightly faster. I don't think it matters a whole lot for normal projects, though. It may be more impactful if you rely on Java or Kotlin annotation processors or Groovy AST transformations that appear to scan the entire classpath each time.
Not having unnecessary modules on the compilation classpath means a library will not have to be recompiled when those modules change.
The last one is the biggest benefit in my opinion. Let's say you have a big multi-project build where a shared sub-project internally relies on Apache Commons Lang. If you have declared Lang as an api dependency and update it, then all other projects relying on this shared project need to be recompiled. If you declare it as an implementation dependency instead, this will not happen. All those projects will still need to be re-tested of course, as the runtime behaviour might have changed (Gradle handles this correctly by default).
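That scenario can be sketched as follows (project name and version are illustrative):

```groovy
// shared/build.gradle
plugins {
    id 'java-library'
}

dependencies {
    // Internal detail: bumping this version does NOT force projects that
    // depend on :shared to recompile, only to re-run their tests.
    implementation 'org.apache.commons:commons-lang3:3.12.0'

    // Had it been declared like this instead, every project with
    // implementation project(':shared') would recompile on each bump:
    // api 'org.apache.commons:commons-lang3:3.12.0'
}
```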
And a bonus question: the Gradle docs say the api configuration comes with the java-library plugin, but apparently I can use it without applying that plugin. How is this possible?
This is because the Kotlin plugin also declares an api configuration. It has the same semantics as configured by the java-library plugin.
If your project is a multi-project build, you can still add the java-library plugin even if it is using the Kotlin plugin. An additional change this causes is that consumers will see the output directory for the compiled classes instead of the final jar file. This removes the need to build the jar during normal development, which should reduce build time. On the other hand, there is apparently a potential performance problem on Windows if you have a lot of classes in a single project, so the usual "your mileage may vary" disclaimer applies here as well (I don't know how many "a lot" is, though).

Dealing with other dependencies in your own Maven dependency

I want to reuse and centralize the utils I created for my Spring REST API for my future projects. That's why I thought I'd outsource them to my own project and make them available as a Maven dependency.
These util files, e.g. a basic service and basic controllers, also contain Spring annotations, i.e. I need some Spring dependencies in my util dependency. Now I'm a bit unsure whether I'm making a mistake or not.
First of all, I'm not sure whether I should even use Spring dependencies in a utility dependency or try to remove them entirely. If I keep them, I'll have to specify a Spring version, but it might differ from the version I want to use later in the projects it's included in. How am I supposed to solve this?
It is perfectly reasonable to have dependencies for your dependencies (these are called transitive dependencies). Of course, you should keep the number as low as possible, but on the other hand, you do not want to reinvent the wheel.
When somebody uses your dependency, they will automatically draw in the transitive dependency on Spring. Several cases can occur:
If this is the only reference to Spring, the version is used just as you stated it.
If at some other point a different version of Spring is given, Maven dependency mediation kicks in. It decides which version to take by a "nearest wins" rule.
But: You can always set the spring version in <dependencyManagement> and then overwrite all transitively given version numbers.
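For example, a consumer's pom can pin the Spring version via <dependencyManagement>, which overrides any transitively resolved version (artifact and version number here are illustrative):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-context</artifactId>
      <!-- this version wins over whatever the utility artifact declares -->
      <version>5.3.30</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```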
That is the main concept of Maven: your utility module must be shipped together with its Spring dependencies. These are called transitive dependencies.
Imagine the situation where all dependencies were excluded. In that case, nobody would ever know which Spring dependencies, and which versions, are needed.
Maven has very good dependency conflict resolution, based on a "nearest wins" principle. So you can easily override those Spring versions, and your application will use only one of them.
Take a look at these:
[1] Dependency Mechanism
[2] Dependency Mediation and Conflict Resolution

Equivalent of api for test dependency in gradle?

I'm having multi module gradle project. In one of my modules I'm having api dependency:
api('de.flapdoodle.embed:de.flapdoodle.embed.mongo')
I want to change it to a dependency that will be visible in tests across all modules. There is a testImplementation configuration, but there is no testApi.
I cannot have this dependency on the production classpath anymore, since I want to use a real Mongo instance instead of an embedded one. On the other hand, I have tests in other modules that depend on data access; I want to run those tests with embedded Mongo on the test classpath.
How can I make this dependency visible in all modules' tests?
The question (as it appears to me) is about sharing test code across modules in a multi-module project.
Short answer: no, there is no direct test dependency sharing across modules.
To share test code between modules internally via build settings:
Official gradle route https://docs.gradle.org/current/userguide/java_testing.html#sec:java_test_fixtures
A simple hack:
testImplementation files(project(':core-module').sourceSets.test.output.classesDirs)
Add the above line either individually where you need it, or in the root build script inside subprojects { } with an appropriate condition.
There are other possible routes as well, e.g. via configurations:
the child's testImplementation extends the parent's testImplementation (or its runtime counterpart).
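The official java-test-fixtures route linked above can be sketched like this (module names and the flapdoodle version are illustrative):

```groovy
// core-module/build.gradle
plugins {
    id 'java-library'
    id 'java-test-fixtures'   // adds a testFixtures source set
}

dependencies {
    // Exposed to this module's fixtures AND to consumers of the fixtures
    testFixturesApi 'de.flapdoodle.embed:de.flapdoodle.embed.mongo:3.5.3'
}

// other-module/build.gradle
dependencies {
    // Pulls in core-module's fixture classes (and their testFixturesApi
    // dependencies) onto this module's test classpath only
    testImplementation testFixtures(project(':core-module'))
}
```

This keeps embedded Mongo off every production classpath while making it available to any module's tests that opt in.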
testCompileClasspath includes api dependencies, so you are all good here; de.flapdoodle.embed.mongo will be visible in your tests.

Should I create a new spring-boot starter or use optional dependencies?

I'm currently maintaining a fork of the jodconverter project which offers a spring-boot-starter module, allowing a spring-boot based app to use an Open/Libre Office installation (on the same server) to automate document conversions.
Now the project has grown and a new module was born, named jodconverter-online. This module sends conversion requests to a LibreOffice Online server, and I now want to create a spring-boot starter to support this new module.
The current jodconverter-local (on which the current jodconverter-spring-boot-starter depends) does not have the same dependencies as the jodconverter-online module. This is why they are two separate modules in the first place.
So my question is:
Should I create a new jodconverter-online-spring-boot-starter, or is it possible (and how) to just modify the current starter project, making the dependencies optional according to the needs of the user?
For now I've put it all in the current starter project (which is available as a 4.2.0-SNAPSHOT in the OSS snapshot repository), but I'm doing it the wrong way, since it automatically adds the dependencies for both the jodconverter-local and the jodconverter-online modules.
If you want to make the dependencies on jodconverter-local and jodconverter-online optional, you just need to replace the compile configuration with compileOnly in your Gradle build file.
Obviously, when dependencies become optional, the developer will have to choose one of the options and add it to their project's dependencies (in addition to your starter).
If the only additional dependency is either jodconverter-local or jodconverter-online, that is no big deal. But if more dependencies have to be added for each case, then you might consider creating a new starter to encapsulate those dependencies.
As for the AutoConfigurations, I don't see any problem with what you did, since you use @ConditionalOnClass to trigger the AutoConfiguration only when the corresponding class is present on the classpath.
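A sketch of what such a conditional auto-configuration looks like; the class names here are illustrative placeholders, not the actual jodconverter classes:

```java
import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Only activates when the optional local module is on the classpath,
// i.e. when the user added jodconverter-local themselves.
// LocalConverter / LocalConverterAutoConfiguration are made-up names.
@Configuration
@ConditionalOnClass(LocalConverter.class)
public class LocalConverterAutoConfiguration {

    @Bean
    public LocalConverter localConverter() {
        return new LocalConverter();
    }
}
```

A sibling class guarded by the online module's marker class would cover the jodconverter-online case, so each configuration backs off cleanly when its module is absent.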
