What is the recommended way to group dependencies of the same type? - gradle

I'd like to separate the dependencies in my project by type, and am considering doing so in the following way:
// Implementation dependencies
dependencies {
    implementation("foo:bar:1") {
        because("reason 1")
    }
    implementation("foo:bar:2") {
        because("reason 2")
    }
    implementation("foo:bar:3") {
        because("reason 3")
    }
}
// Test implementation dependencies
dependencies {
    testImplementation("foo:bar:4") {
        because("reason 4")
    }
    testImplementation("foo:bar:5") {
        because("reason 5")
    }
}
Questions:
I am able to build the project after structuring the build file in this way, but I don't see any authoritative material stating that specifying multiple dependencies blocks is formally supported. Does such material exist?
Is there a more preferable way of separating dependencies by type than this? Preferably, I'd like to have a dependency-configuration (implementation, testImplementation, etc.) per module in order to document the reason for including each module, like the configuration above does.

I don't see any authoritative material stating that specifying multiple dependencies blocks is formally supported. Does such material exist?
There doesn't need to be any material because the Gradle DSL (Groovy or Kotlin) isn't anything special or magical. It's simply sugar over the Gradle API.
Specifying multiple dependencies blocks is perfectly legal. If you were to de-sugar the Gradle DSL, invoking multiple dependencies blocks is really just doing:
project.getDependencies().add("implementation", "foo:bar:1")
project.getDependencies().add("testImplementation", "foo:bar:4")
It's no different than simply calling the add(...) method on a List multiple times.
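The same de-sugared form can also carry the because(...) reason, since the add(...) method additionally accepts a configuration closure. A minimal sketch in the Groovy DSL:
// De-sugared equivalent of implementation("foo:bar:1") { because("reason 1") }:
// the trailing closure configures the dependency that add(...) creates.
project.getDependencies().add("implementation", "foo:bar:1") {
    because("reason 1")
}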
Is there a more preferable way of separating dependencies by type than this?
Create a library (project or subproject) that bundles dependencies together. This is easily accomplished with the Java Library Plugin. For example, for your test library:
dependencies {
    api("foo:bar:4") {
        because("reason 4")
    }
    api("foo:bar:5") {
        because("reason 5")
    }
}
Then simply consume the library in your main project:
dependencies {
    testImplementation(project(":my-test-library")) {
        because("bundles test libs")
    }
}
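For completeness, the bundling subproject needs the Java Library Plugin applied (so that the api configuration exists) and has to be registered in the build. A minimal sketch, where the name :my-test-library is carried over from the consuming snippet above and is only an assumption:
// settings.gradle -- register the bundling subproject (name is illustrative)
include ':my-test-library'

// my-test-library/build.gradle -- apply the Java Library Plugin so that the
// api(...) dependencies shown above are available
plugins {
    id 'java-library'
}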

There is no such support, and I don't think there needs to be, but to achieve your requirement we can create extension functions just to differentiate the different groups of dependencies. Much of the Kotlin DSL consists of extension functions anyway, so add something like the snippet below. Declare it in your buildSrc Dependencies.kts file, or anywhere else you like, as long as it is globally accessible.
import org.gradle.api.Project
import org.gradle.kotlin.dsl.DependencyHandlerScope

// test
fun Project.dependenciesTest(configuration: DependencyHandlerScope.() -> Unit) =
    DependencyHandlerScope.of(dependencies).configuration()

// app
fun Project.dependenciesApp(configuration: DependencyHandlerScope.() -> Unit) =
    DependencyHandlerScope.of(dependencies).configuration()
Now call them like this at the call site:
dependenciesApp {
    implementation(fileTree(mapOf("dir" to "libs", "include" to listOf("*.jar"))))
}

dependenciesTest {
    testImplementation(AppDependencies.junit)
}

Related

Module replacement when there is no conflict

Module replacement works well in Gradle, however it only applies when there is a conflict.
Although I understand the reason, it breaks my use-case where there is extension of configurations and the conflict happens in some but not others that I need to consume.
I have two special configurations and some module replacement:
configurations {
    lib      // what should be bundled
    provided // what should not be bundled
    implementation.extendsFrom(lib)
    implementation.extendsFrom(provided)
}

dependencies {
    modules {
        module('javax.annotation:javax.annotation-api') {
            replacedBy('jakarta.annotation:jakarta.annotation-api', 'Javax to Jakarta')
        }
    }
}

task collectLibs(type: Copy) {
    // bundle everything from lib which is not provided (not even transitively)
    from configurations.lib - configurations.provided
    into "$buildDir/lib"
}
I also use a company BOM, for example api platform('org.springframework.boot:spring-boot-dependencies:2.5.4'), so I don't want to specify versions anywhere in my project.
Let's assume these dependencies:
dependencies {
    lib 'javax.annotation:javax.annotation-api'
    provided 'jakarta.annotation:jakarta.annotation-api'
}
The dependencies task then correctly resolves compileClasspath and runtimeClasspath to jakarta.annotation-api; however, the collected files in build/lib contain javax.annotation-api-1.3.2.jar, even though it "should have been replaced and subtracted".
If I use module substitution instead, it works:
configurations.all {
    resolutionStrategy.dependencySubstitution {
        substitute module('javax.annotation:javax.annotation-api') using module('jakarta.annotation:jakarta.annotation-api:1.3.5')
    }
}
However, there I must specify a version. Is there any way to force module replacement to always act?
My problem is caused by the subtraction; maybe there is a better way to find all dependencies that come from provided but not lib by looking at runtimeClasspath?
I tried something, but it gets too complicated very quickly.
I found a solution. Instead of subtracting the provided configuration, I can exclude everything from the resolved provided configuration. The tricky part is to exclude neither too much nor too little:
the platform must remain, otherwise version resolution will fail
both the requested and the selected module must be excluded
This is not a general solution; it still requires some fiddling with configurations (provided must declare both javax and jakarta), but it works for me.
private static excludeFromConfiguration(Configuration configuration, Configuration toExclude) {
    toExclude.incoming.resolutionResult.allDependencies.each { dep ->
        if (dep instanceof ResolvedDependencyResult && dep.requested instanceof ModuleComponentSelector) {
            def isPlatform = dep.requested.attributes.keySet().any {
                // asking for org.gradle.api.attributes.Category.CATEGORY_ATTRIBUTE does not work
                def attribute = dep.requested.attributes.getAttribute(it)
                return attribute == org.gradle.api.attributes.Category.ENFORCED_PLATFORM ||
                    attribute == org.gradle.api.attributes.Category.REGULAR_PLATFORM
            }
            if (!isPlatform) {
                // we exclude both - the requested and selected because there could have been some:
                // module replacement, dependency substitution, capability matching
                configuration.exclude(group: dep.requested.group, module: dep.requested.module)
                configuration.exclude(group: dep.selected.moduleVersion.group, module: dep.selected.moduleVersion.name)
            }
        }
    }
}
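For context, a hypothetical call site for this helper, assuming it is reachable from the build script that declares the configurations shown in the question (names are unchanged):
// Apply the excludes to 'lib' based on everything that 'provided' resolves to,
// then copy what remains; the subtraction is no longer needed.
excludeFromConfiguration(configurations.lib, configurations.provided)

task collectLibs(type: Copy) {
    from configurations.lib
    into "$buildDir/lib"
}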

Configuring a custom Gradle sourceSet using a closure

I'm trying to develop a Gradle plugin for a language I use (SystemVerilog). I'm still experimenting and figuring things out. Before I write the entire thing as a plugin, I thought it would be best to try out the different parts I need inside a build script, to get a feel of how things should work.
I'm trying to define a container of source sets, similar to how the Java plugin does it. I'd like to be able to use a closure when configuring a source set. Concretely, I'd like to be able to do the following:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
I defined my own sourceSet class:
class SourceSet implements Named {
    final String name
    final ObjectFactory objectFactory

    @Inject
    SourceSet(String name, ObjectFactory objectFactory) {
        this.name = name
        this.objectFactory = objectFactory
    }

    SourceDirectorySet getSv() {
        SourceDirectorySet sv = objectFactory.sourceDirectorySet('sv',
            'SystemVerilog source')
        sv.srcDir("src/${name}/sv")
        return sv
    }

    SourceDirectorySet sv(@Nullable Closure configureClosure) {
        configure(configureClosure, getSv());
        return this;
    }
}
I'm using org.gradle.api.file.SourceDirectorySet because that already implements PatternFilterable, so it should give me access to include, exclude, etc.
If I understand the concept correctly, the sv(@Nullable Closure configureClosure) method is the one that gives me the ability to write sv { ... } to configure via a closure.
To add the sourceSets property to the project, I did the following:
project.extensions.add("sourceSets",
project.objects.domainObjectContainer(SourceSet.class))
As per the Gradle docs, this should give me the possibility to configure sourceSets using a closure. This site, which details using custom types, states that by using NamedDomainObjectContainer, Gradle will provide a DSL that build scripts can use to define and configure elements. This would be the sourceSets { ... } part. This should also be the sourceSets { main { ... } } part.
If I create a sourceSet for main and use it in a task, then everything works fine:
project.sourceSets.create('main')

task compile(type: Task) {
    println 'Compiling source files'
    println project.sourceSets.main.sv.files
}
If I try to configure the main source set to only include files with the .sv extension, then I get an error:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
I get the following error:
No signature of method: build_47mnuak4y5k86udjcp7v5dkwm.sourceSets() is applicable for argument types: (build_47mnuak4y5k86udjcp7v5dkwm$_run_closure1) values: [build_47mnuak4y5k86udjcp7v5dkwm$_run_closure1@effb286]
I don't know what I'm doing wrong. I'm sure it's just a simple thing that I'm forgetting. Does anyone have an idea of what that might be?
I figured out what was going wrong. It was a combination of poor copy/paste skills and the fact that Groovy is a dynamic language.
First, let's look at the definition of the sv(Closure) function again:
SourceDirectorySet sv(@Nullable Closure configureClosure) {
    configure(configureClosure, getSv());
    return this;
}
Once I moved this code to its own Groovy file and used the IDE to show me what was getting called, I noticed that it wasn't calling the function I expected. I was expecting a call to org.gradle.util.ConfigureUtil.configure. Since this is part of the public API, I expected it to be imported by default in the build script. As this page states, that is not the case.
To solve the issue, it's enough to add the following import:
import static org.gradle.util.ConfigureUtil.configure
This gets rid of the cryptic closure-related error. It is replaced by the following error, though:
Cannot cast object 'SourceSet_Decorated@a6abab9' with class 'SourceSet_Decorated' to class 'org.gradle.api.file.SourceDirectorySet'
This is caused by the copy/paste error I mentioned. When I wrote the SourceSet class, I drew heavily from org.gradle.api.tasks.SourceSet (and org.gradle.api.internal.tasks.DefaultSourceSet). If we look at the java(Closure) method there, we'll see it has the following signature:
SourceSet java(@Nullable Closure configureClosure);
Notice that it returns SourceSet and not SourceDirectorySet like in my code. Using the proper return type fixes the issue:
SourceSet sv(@Nullable Closure configureClosure)
With this new return type, let's look again at the configuration code for the source set:
sourceSets {
    main {
        sv {
            include '*.sv'
        }
    }
}
Initially, I thought it was supposed to work as follows: pass main { ... } as a Closure to sourceSets, pass sv { ... } as a Closure to main, and handle the include ... part inside sourceDirectorySet. I banged my head against the wall for a while, because I couldn't find any code in that class hierarchy that takes closures like this.
Now, I think the flow is slightly different: pass main { ... } as a Closure to sourceSets (as initially thought), but call the sv(Closure) function on main (of type SourceSet), passing it { include ... } as the argument.
Bonus: There was one more issue that wasn't related to the "compile" errors I was having.
Even after getting the code to run without errors, it still wasn't behaving as expected. I had some files with the *.svh extension that were still getting picked up. This is because, when calling getSv(), it was creating a new SourceDirectorySet each time. Any configuration that was done previously was getting thrown away each time that this function was called.
Making the sourceDirectorySet a class member and moving its creation to the constructor fixed the issue:
private SourceDirectorySet sv

SourceSet(String name, ObjectFactory objectFactory) {
    // ...
    sv = objectFactory.sourceDirectorySet('sv',
        'SystemVerilog source')
    sv.srcDir("src/${name}/sv")
}

SourceDirectorySet getSv() {
    return sv
}
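Putting the pieces together, here is a sketch of the corrected class with the static import, the fixed return type, and the field initialised only once; everything else follows the code above:
import javax.inject.Inject
import org.gradle.api.Named
import org.gradle.api.file.SourceDirectorySet
import org.gradle.api.model.ObjectFactory
import static org.gradle.util.ConfigureUtil.configure

class SourceSet implements Named {
    final String name
    private final SourceDirectorySet sv

    @Inject
    SourceSet(String name, ObjectFactory objectFactory) {
        this.name = name
        // create the SourceDirectorySet once, so configuration is not thrown away
        sv = objectFactory.sourceDirectorySet('sv', 'SystemVerilog source')
        sv.srcDir("src/${name}/sv")
    }

    SourceDirectorySet getSv() {
        return sv
    }

    // return the source set itself, mirroring org.gradle.api.tasks.SourceSet#java
    SourceSet sv(Closure configureClosure) {
        configure(configureClosure, getSv())
        return this
    }
}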

How to extract Android Gradle dependencies (implementation, testImplementation, ...) into a method in the root project?

In app-level build.gradle, I often include the following dependencies
dependencies {
    implementation "androidx.appcompat:appcompat:$appCompatVersion"
    implementation "androidx.cardview:cardview:$cardVersion"
    implementation "com.google.android.material:material:$materialVersion"
    implementation "androidx.recyclerview:recyclerview:$recyclerViewVersion"
    implementation "androidx.annotation:annotation:$androidXAnnotations"
    ...
}
That makes the file longer and longer, so I'm thinking about moving all such dependencies into the project-level build.gradle, for example (just an illustration):
ext {
    includeUnitTestDeps() {
        implementation "androidx.appcompat:appcompat:$appCompatVersion"
        implementation "androidx.cardview:cardview:$cardVersion"
        implementation "com.google.android.material:material:$materialVersion"
        implementation "androidx.recyclerview:recyclerview:$recyclerViewVersion"
        implementation "androidx.annotation:annotation:$androidXAnnotations"
    }
}
(The reason for the project level rather than the app level is that we might have multiple modules in the project, so the project level is the best place.)
Then, in the app-level build.gradle, we would call:
dependencies {
    ext.includeUnitTestDeps()
    ...
}
Note: I'm not very familiar with Groovy/Gradle syntax, so I'm not sure this works (in fact I tried it, and it doesn't allow defining such a method in ext). If you know of any solution, please help me. Thanks so much.
There are a couple of ways that I can think of.
1 - Add dependencies to all subprojects (in the parent)
subprojects {
    dependencies {
        implementation 'com.google.guava:guava:23.0'
        testImplementation 'junit:junit:4.12'
    }
    ....
}
2 - Add dependencies to specific subprojects (in the parent)
// this will add dependencies to projects a, b, and c
// when you add a new subproject, you have to add it here as well
// if it needs these dependencies available
configure([project(':a'), project(':b'), project(':c')]) {
    dependencies {
        implementation 'com.google.guava:guava:23.0'
        .....
    }
}
3 - Using a method to add dependencies
// in parent
// define a method to add dependencies
// subprojects that need these dependencies will call this method
def addDependencies(subProject) {
    subProject.dependencies.add("implementation", "com.google.guava:guava:23.0")
    subProject.dependencies.add("implementation", "org.apache.commons:commons-lang3:3.8.1")
    // add others
}

// in child
dependencies {
    addDependencies(this)
    // you can add other dependencies here if this child has any
}
4 - Define dependencies as a list in the parent
// parent
ext.appDependencies = [
    [configuration: "implementation", dependency: "org.apache.commons:commons-lang3:3.8.1"],
    [configuration: "implementation", dependency: "com.google.guava:guava:23.0"]
]
// child
dependencies {
    rootProject.appDependencies.each {
        add(it.configuration, it.dependency)
    }
}
There is a detailed explanation of this method at the following link, which uses an external file to define these dependencies.
https://hackernoon.com/android-how-to-add-gradle-dependencies-using-foreach-c4cbcc070458
You can also combine methods 3 and 4, as sketched below: define a list of dependencies, then call a function that iterates over the list and adds them.
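A sketch of such a combination (the helper name addAppDependencies and the coordinates are illustrative), assuming the parent exposes the closure via ext so that child projects can reach it:
// parent build.gradle
ext.appDependencies = [
    [configuration: "implementation", dependency: "com.google.guava:guava:23.0"],
    [configuration: "testImplementation", dependency: "junit:junit:4.12"]
]

// a closure stored in ext so it is reachable from child projects
ext.addAppDependencies = { subProject ->
    appDependencies.each {
        subProject.dependencies.add(it.configuration, it.dependency)
    }
}

// child build.gradle
rootProject.addAppDependencies(project)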
I would use the first or second method if I could. (There might also be other ways to achieve this.)

How to declare project artifacts in non-Java build?

I have a multi-project Gradle build that also contains non-Java projects.
I want to declare the artifacts created by one such project in a way that lets me use project/configuration dependencies to get them, e.g.:
consumer:
dependencies {
    myConf project(path: ':producer', configuration: 'myConf')
}
What I currently have is this:
producer:
configurations {
    myConf
}

task produceFile {
    //... somehow create the file...
    outputs.file file('path/to/file')
}

artifacts.add('myConf', produceFile.outputs.files.singleFile, { builtBy produceFile })
Is there a better way to declare the artifact than my clumsy version?
I couldn't figure out a way to pass the task dependency from the artifact to the producing task in one go.
According to the documentation article on Legacy publishing and the javadoc on the ArtifactHandler, for your simple example it should be sufficient to just pass the task, as long as the task type extends AbstractArchiveTask (e.g. Zip or Jar):
artifacts.add('myConf', produceFile)
... or in the more Gradle-ish way:
artifacts {
    myConf produceFile
}
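To illustrate the AbstractArchiveTask case, a minimal sketch; the Zip task, its contents, and the archive name are assumptions, not part of the original question:
// An archive task can be handed to the artifacts block directly; Gradle wires
// up the task dependency for the consuming configuration automatically.
task produceArchive(type: Zip) {
    from 'path/to'
    archiveBaseName = 'producer-output'
}

artifacts {
    myConf produceArchive
}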
The article mentioned above has another example, where a File is passed directly to the add method, which requires you to specify the task to build the file in the way you did in your example.
However, let me propose other syntax ideas that may feel more 'lightweight':
artifacts {
    myConf files(produceFile).singleFile { builtBy produceFile }
    // or
    myConf file: files(produceFile).singleFile, builtBy: [produceFile]
}
These two examples use the Project.files(...) method to resolve the output(s) of the task instead of accessing them manually. The second example makes use of the map syntax often provided by Gradle.
If you want to standardize the way you publish your custom artifacts, I would propose creating a custom task type that exposes, as a method or property, any of the argument forms the ArtifactHandler can process:
class MyTaskType extends DefaultTask {
    // ... other stuff ... of course this should be part of a plugin
    def getArtifact() {
        return ... // either a (Configurable)PublishArtifact (if constructor is available) or a map representation
    }
}

task produceFile(type: MyTaskType) {
    // configure somehow
}

artifacts {
    myConf produceFile.artifact
}

How do I compile against local file in gradle?

For each sub-project in our build, we have a structure like this:
apply from: '../dependencies.gradle'
dependencies {
    ... omitting other dependencies ...
    compile libraries.poi
}
These libraries are defined in dependencies.gradle, which looks like this:
ext.libraries = [
    ... omitting other libraries ...
    poi: [
        'poi:poi:3.9.custom.1',
        'poi:poi-ooxml:3.9.custom.1',
        'poi:poi-ooxml-schemas:3.9.custom.0',
        'poi:poi-scratchpad:3.9.custom.0',
    ],
    ... omitting other libraries ...
]
A few days ago I wanted to try something against a nightly build of POI. Nightly builds don't go into their repository, so I'm forced to try and get it to work with local files.
Looking in the docs, you're supposed to use files(...) for this, so I tried this:
poi: [
    files('/path/to/poi-3.14-beta1/poi-3.14-beta1-20151027.jar'),
    files('/path/to/poi-3.14-beta1/poi-3.14-ooxml-20151027.jar'),
    files('/path/to/poi-3.14-beta1/poi-3.14-ooxml-schemas-20151027.jar'),
    files('/path/to/poi-3.14-beta1/poi-3.14-scratchpad-20151027.jar'),
],
When I run this, I get an error:
* What went wrong:
A problem occurred evaluating root project 'product'.
> Cannot convert the provided notation to an object of type ModuleVersionSelector: file collection.
The following types/formats are supported:
- Instances of ModuleVersionSelector.
- String or CharSequence values, for example 'org.gradle:gradle-core:1.0'.
- Maps, for example [group: 'org.gradle', name:'gradle-core', version: '1.0'].
- Collections or arrays of any other supported format. Nested collections/arrays will be flattened.
So really it seems like files() does not actually work, as it doesn't return one of the things listed here.
What is the correct way to do it? (Assuming it's even possible!)
Edit: More information
Now that I updated to Gradle 2.8, I get a line number pointing at the problem. It points at some custom build code which we put in to work around Gradle sucking at dependency resolution:
resolutionStrategy {
    libraries.each { libraryName, libraryList ->
        libraryList.each { library ->
            force library // 👈 this line
        }
    }
    failOnVersionConflict()
}
So I take it the problem is that force doesn't support all the same things that other methods support?
My crap workaround for a workaround is to filter out elements of type FileCollection:
resolutionStrategy {
    libraries.each { libraryName, libraryList ->
        [libraryList].flatten()
            .findAll { library ->
                !(library instanceof FileCollection)
            }
            .each { library -> force library }
    }
    failOnVersionConflict()
}
Maybe there is a better way.
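One possible cleaner arrangement, sketched under the assumption that the nightly jars only ever need to be file dependencies: keep them under a separate key that the resolutionStrategy loop never iterates, so nothing file-based reaches force(...). The key name localLibraries is illustrative, not from the original build.
// dependencies.gradle -- module notations stay in ext.libraries (and keep being
// forced); local jars live in their own map and are only used in dependencies {}.
ext.localLibraries = [
    poi: files(
        '/path/to/poi-3.14-beta1/poi-3.14-beta1-20151027.jar',
        '/path/to/poi-3.14-beta1/poi-3.14-ooxml-20151027.jar',
        '/path/to/poi-3.14-beta1/poi-3.14-ooxml-schemas-20151027.jar',
        '/path/to/poi-3.14-beta1/poi-3.14-scratchpad-20151027.jar'
    )
]

// sub-project build.gradle
dependencies {
    compile localLibraries.poi
}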
