Gradle Task messes with runtime Dependencies - gradle

Another strange behaviour of gradle...
So I've found this post:
Gradle exclude module for Copy task
Totally fine and works like a charm to exclude things from copying.
But here is where it gets interesting. This is how my Copy Task looks:
task copyDependencies(type: Copy) {
    into "$buildDir/libs/dependencies"
    from configurations.runtime {
        exclude module: 'groovy'
        exclude module: 'aws-java-sdk-s3'
        exclude module: 'commons-io'
    }
}
If I try to run the application through Gradle's run task (from the application plugin), it fails with "Main Class xxx couldn't be found or loaded". Digging deeper into the problem, I noticed that Groovy couldn't be resolved.
I don't even run this Task, or depend on it.
But if I comment out the groovy exclusion like this:
task copyDependencies(type: Copy) {
    into "$buildDir/libs/dependencies"
    from configurations.runtime {
        //exclude module: 'groovy'
        exclude module: 'aws-java-sdk-s3'
        exclude module: 'commons-io'
    }
}
The application starts normally, until it reaches a point where it needs Commons IO. I still want to use this copyDependencies task at other times, though, without changing its code.
Can somebody explain this behaviour to me?
I imagine that manipulating configurations.runtime anywhere in the Gradle file changes it for every other task?

In your from configuration block, you are referencing the runtime configuration, but at the same time you are altering this configuration by adding exclusion rules. This alters the original (and unique) runtime configuration, which is used by all other tasks in your build project, as you guessed. It explains the "Main Class xxx couldn't be found or loaded" error you get when trying to execute the run task, since the runtime configuration (classpath) no longer contains the needed library.
If you want to write exclusion rules by group and/or module in your copyDependencies task, one possible way is to work on a copy of the original runtime configuration; you could define a new configuration for this purpose:
configurations {
    runtimeDeps.extendsFrom runtime
}

task copyDependencies(type: Copy) {
    into "$buildDir/libs/dependencies"
    from configurations.runtimeDeps {
        exclude module: 'groovy'
        exclude module: 'aws-java-sdk-s3'
        exclude module: 'commons-io'
    }
}
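An alternative that avoids declaring an extra configuration is to resolve a detached copy of runtime instead, so the exclusion rules never touch the original. A rough sketch; the chained copyRecursive()/exclude calls are my assumption of how this could look, not taken from the original build:

task copyDependencies(type: Copy) {
    into "$buildDir/libs/dependencies"
    // copyRecursive() returns an independent copy of the runtime configuration,
    // so these exclude rules do not affect the configuration other tasks resolve
    from configurations.runtime.copyRecursive()
            .exclude(module: 'groovy')
            .exclude(module: 'aws-java-sdk-s3')
            .exclude(module: 'commons-io')
}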

Related

Use build.finalizedBy on each subproject with Gradle kotlin

I want to run a specific task after EVERY build of my subprojects. I can go into each of my subprojects' build.gradle.kts files and add the following:
tasks.build {
    finalizedBy("afterbuildtask")
}
However, this should be possible to do in my root project's build.gradle.kts file, right? I tried the following:
subprojects {
    this.tasks.findByName("build")?.dependsOn("afterbuildtask")
}
But nothing happens. How can I achieve this?
You can't programmatically execute tasks from other tasks in newer versions of Gradle.
Instead, you are supposed to declare task dependencies, and Gradle will ensure they get executed in the correct order. But I think that's not what you want.
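If the goal really is the finalizer relationship from the question, one way to declare it from the root project is to configure the build tasks lazily, so the rule also applies to tasks that are registered after the root script has run. A sketch in the Groovy DSL (the Kotlin DSL equivalent is analogous), assuming every subproject defines an afterbuildtask:

// root build script - a sketch, not the poster's actual setup
subprojects {
    tasks.matching { it.name == 'build' }.configureEach {
        finalizedBy 'afterbuildtask'
    }
}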
Alternatively, you could move your logic into a doLast block in the build task, e.g.:
build {
    doLast {
        println("Copying...")
        copy {
            from 'source'
            into 'target'
            include '*.war'
        }
        println("completed!")
    }
}
good coding! ¯_(ツ)_/¯

Create separate shadowJars for code and dependencies

Using the Gradle shadowJar plugin version 4.0.2 and Gradle 4.10, I wanted to create two separate shadowJars, one for my source code and another for my dependencies (since the dependencies are large and rarely change, I don't want to repackage them every time I change my source code). What I have in mind is a Gradle plugin that adds two separate tasks, takes the same configuration supplied by the user for shadowJar, and overrides the configurations/sources used to create each shadowJar.
Below is what I have got so far; I'm still trying to figure out a clean way to pass the shadow configs only once, and whether there are other gotchas I need to worry about (e.g. whether having two mergeServiceFiles calls will break things).
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

task dependencyShadowJar(type: ShadowJar) {
    mergeServiceFiles()
    zip64 = true
    relocate 'com.google.common', 'com.shadow.com.google.common'
    classifier = 'dependencies'
    configurations = [project.configurations.runtime]
}

task userCodeShadowJar(type: ShadowJar) {
    mergeServiceFiles()
    zip64 = true
    relocate 'com.google.common', 'com.shadow.com.google.common'
    classifier = 'mycode'
    from sourceSets.main.output
}

task splitShadowJar {
    doLast {
        println "Building separate src and dependency shadowJars"
    }
}
splitShadowJar.dependsOn dependencyShadowJar
dependencyShadowJar.dependsOn userCodeShadowJar
Ideally I would like to have the shadowJar settings specified once and have both tasks copy the same settings; does that require creating a custom plugin task in Groovy?
Can I copy the settings from the existing shadowJar task that the user specifies and just override the from or configurations part for my purpose?
Has anybody attempted something similar?
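One possible way to specify the shared settings only once would be to capture them in a plain Groovy closure and apply it to both tasks with Project.configure. This is only a sketch under that assumption (the closure name is made up), not a verified recipe for the shadow plugin:

import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

// shared shadow settings, defined once
def commonShadowSettings = {
    mergeServiceFiles()
    zip64 = true
    relocate 'com.google.common', 'com.shadow.com.google.common'
}

task dependencyShadowJar(type: ShadowJar) {
    classifier = 'dependencies'
    configurations = [project.configurations.runtime]
}

task userCodeShadowJar(type: ShadowJar) {
    classifier = 'mycode'
    from sourceSets.main.output
}

// apply the shared settings to both tasks
[dependencyShadowJar, userCodeShadowJar].each { t ->
    configure(t, commonShadowSettings)
}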

Gradle `tasks.withType()` strange behavior for tasks added afterwards

I am using Gradle 2.14.1 and the https://github.com/unbroken-dome/gradle-testsets-plugin to add an integration test task. I would like to configure the location of the HTML reports.
The default test task uses:
<project>/build/reports/tests
The testsets plugin configures:
<project>/build/intTest
for the intTest task, and I want:
<project>/build/reports/test
<project>/build/reports/intTest
Here is my configuration:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'org.unbroken-dome.gradle-plugins:gradle-testsets-plugin:1.2.0'
    }
}

apply plugin: 'java'
apply plugin: 'org.unbroken-dome.test-sets'

defaultTasks = ['clean', 'build']

tasks.withType(Test) {
    reports.html.destination = new File(project.reportsDir, name)
    println(it.name + '\t' + reports.html.destination)
}

task wrapper(type: Wrapper) {
    description = 'Defines the common gradle distribution for this project.'
    gradleVersion = '2.14.1'
}

testSets {
    intTest
}

intTest.dependsOn test
check.dependsOn intTest

repositories {
    jcenter()
}

dependencies {
    testCompile 'junit:junit:4.12'
    intTestCompile 'junit:junit:4.12'
}

println('===== final config =====')
println(test.name + '\t' + test.reports.html.destination)
println(intTest.name + '\t' + intTest.reports.html.destination)
(Please ignore the println statements for the time being.)
After a full build, the intTest task's reports are in the wrong place (the default), and the configuration for the standard test task is applied:
$ ls build/
classes dependency-cache intTest intTest-results libs reports test-results tmp
$ ls build/reports/
test
I added some output to see what is going on, and it seems strange (the project's root is 'blob'):
test /home/wujek/blob/build/reports/test
intTest /home/wujek/blob/build/reports/intTest
===== final config =====
test /home/wujek/blob/build/reports/test
intTest /home/wujek/blob/build/intTest
So, in the tasks.withType() block the location is reported to be correct, but in the end, it is not.
Please note that moving the tasks.withType() block after the testSets block works for this project, but my real setup is more complex: I have multiple modules, and the root build.gradle uses the subprojects block with the tasks.withType() block to configure the report locations for all modules; then one of the submodules adds a new test set, and its test task's HTML report ends up in the wrong location. To fix this, I have to repeat the configuration in the submodules that add test sets.
What is going on here? Why does the tasks.withType() block say the config works, but in reality it doesn't?
This is due to the peculiarities of configuration ordering in Gradle. Let's walk through your code as Gradle would process it to see what happens (skipping over lines that aren't relevant):
apply plugin: 'org.unbroken-dome.test-sets'
This executes the apply method of the test-sets plugin, which includes the creation of a class that listens for test sets to be added. It adds a whenObjectAdded action to the testSets container. You haven't added any test sets yet, so let's move back to your build.gradle.
tasks.withType(Test) {
    reports.html.destination = new File(project.reportsDir, name)
    println(it.name + '\t' + reports.html.destination)
}
You've now added an action that will apply to all existing Test tasks, and to new ones as they are created. Now where it all unwinds:
testSets {
    intTest
}
This creates a testSet called intTest. The whenObjectAdded action in the test-sets plugin now fires:
The intTest test set's Test task is created.
Your withType action fires, because there is now a new Test task. This sets the report to go where you want.
The whenObjectAdded action then continues, reaching the line in the plugin that also sets the HTML report location, overriding what you just set.
When you change this to declare the testSet first it goes:
whenObjectAdded - Create the Test task
whenObjectAdded - Set the test task's HTML report location
withType is registered by you
withType fires setting the HTML report location to your desired destination
There aren't hard and fast rules to avoid this, since plugins can and do take wildly different approaches to how/when they register their configuration actions. Generally, I try to break my build.gradle down in this order:
Buildscript block (if needed)
apply plugins
set project level properties (group, version, sourceCompatibility, etc)
Configure extensions (sourceSets, testSets, configurations, dependencies)
Configure tasks (either direct or withType)
Usually this helps by letting plugins that fire configuration register their default values before my own configuration comes in to change things.
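For the multi-module setup described in the question, one workaround along these lines (a sketch, not part of the original answer) is to defer the report tweak until each subproject has been evaluated, so it runs after the test-sets plugin has applied its own defaults:

// root build.gradle - defer the report-location change until the subproject
// is fully evaluated, i.e. after any testSets blocks have been processed
subprojects { sub ->
    sub.afterEvaluate {
        sub.tasks.withType(Test) {
            reports.html.destination = new File(sub.reportsDir, name)
        }
    }
}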
FYI, Gradle's new, and still incubating, model space is intended to help solve this configuration ordering problem. It won't be perfect, but allows the order to be more explicit.

Creating a post build copy task with Gradle

I am struggling with the Gradle build lifecycle; specifically with the split between the configuration and execution phases. I have read a number of sections in the Gradle manual and have seen a number of ideas online, but have not found a solution to the following problem:
I want to run a specific task to produce an artifact at the end of my java-library-distribution build that is a flattened version of the runtime configuration jars. That is, I only want to produce the artifact when I run the specific task to create the artifact.
I have created the following task:
task packageSamplerTask(type: Tar, dependsOn: distTar) {
    description "Packages the build jars including dependencies as a flattened tar file. Artifact: ${distsDir}/${archivesBaseName}-${version}.tar"
    from tarTree("${distsDir}/${archivesBaseName}-${version}.tar").files
    classifier = 'dist'
    into "${distsDir}/${archivesBaseName}-dist-${version}.tar"
}
Although this task does produce the required artifact, the task runs during gradle's configuration phase. This behavior has the following consequences:
Irrespective of which task I run from the command line, this packageSamplerTask task is always run, often unnecessarily; and
If I clean the project, then the build fails on the next run because $distsDir doesn't exist during the configuration phase (obviously).
It appears that if I extend the Copy task in this manner I'm always going to get this kind of premature behavior.
Is there a way to use the << closure / doLast declarations to get what I want? Or is there something else I'm missing / should be doing?
Update
After further work I have clarified my requirements, and resolved my question as follows (specifically):
"I want to package my code and my code's dependencies as a flat archive of jars that can be deployed as a jMeter plugin. The package can then be installed by unpacking into the jMeter lib/ext directory, as is. The package, therefore, must not include the jMeter jars (and their dependencies) which are used for building and testing"
Because Gradle doesn't appear to support the Maven-like provided dependency management, I created a new configuration for my package which excludes the jMeter jars.
configurations {
    jmpackage {
        extendsFrom runtime
        exclude group: 'org.apache.jmeter', name: 'ApacheJMeter_core', version: '2.11'
        exclude group: 'org.apache.jmeter', name: 'ApacheJMeter_java', version: '2.11'
    }
}
And then created the following task (using the closure recommendation from Peter Niederwieser):
task packageSamplerTask(type: Tar, dependsOn: assemble) {
    from { libsDir }
    from { configurations.jmpackage.getAsFileTree() }
    classifier = 'dist'
}
This solution appears to work, and it allows me to use just the Gradle java plugin, too.
The task declaration is fine, but the flattening needs to be deferred too:
...
from { tarTree("${distsDir}/${archivesBaseName}-${version}.tar").files }
Also, the Tar file should be referred to in a more abstract way. For example:
from { tarTree(distTar.archivePath).files }
First, your task isn't executed in the configuration phase; like EVERY task, it is configured in that phase. And your closure is just configuration of your task (a configuration closure, not an action closure). That is why your code is "executed" in the configuration phase.
If you want your code to be executed in the execution phase, you have to write it in a doLast or doFirst closure. But in your case it is better to keep it in a configuration closure, because you are configuring your task.
To make sure your build doesn't fail because of the missing folder, you can create it with distsDir.mkdirs().
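A minimal sketch that combines the deferred flattening from the answer above with this suggestion (putting the mkdirs() call in a doFirst action is my assumption about where it fits best):

task packageSamplerTask(type: Tar, dependsOn: distTar) {
    classifier = 'dist'
    // create the missing folder at execution time, not during configuration
    doFirst { distsDir.mkdirs() }
    // deferred: the tar tree is only resolved when the task actually runs
    from { tarTree(distTar.archivePath).files }
}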

Gradle - can I include task's output in project dependencies

I have a task that generates java sources and a set of jars from these sources (say, project a). I would like to export these jars to dependent projects (say, project b). So here's roughly what I have right now:
//a.gradle
configurations {
    generatedJars
}

task generateJars(type: JavaExec) {
    //generate jars ...
    outputs.files += //append generated jars here
}

dependencies {
    generatedJars generateJars.outputs.files
}

//b.gradle
dependencies {
    project(path: ':a', configuration: 'generatedJars')
}
It works OK, except that adding generateJars.outputs.files as a dependency does not tell Gradle that it has to run the generateJars task when no jars have been generated yet. I have tried adding the task itself as a dependency, hoping that it would work in the same way as adding a jar/zip task to an artifact configuration (e.g. artifacts { myJarTask }), but it throws an error telling me that I cannot do that. Of course I can inject the generateJars task somewhere in the build process before :b starts evaluating, but that's clumsy and brittle, so I would like to avoid it.
I feel like I should be adding the generated jars to artifacts{ ... } of the project, but I am not sure how to make them then visible to dependent projects. Is there a better way of achieving this?
Dependent projects (project b) will need to set up their IntelliJ IDEA module classpath to point to project a's generated jars. Something rather like this (pseudo-code):
//b.gradle
idea {
    module {
        scopes.COMPILE.plus += project(path: ':a', configuration: 'generatedJars').files
    }
}
So far I have tried simply adding a project dependency on :a's generatedJars in :b, but the IDEA plugin simply adds module :a as a module dependency and assumes that it exports its generated jars (which is probably a correct assumption), therefore not adding the generated jars to :b's classpath.
Any help would be greatly appreciated!
First, do you need a separate configuration? That is, do you have clients of a that should not see the generated Jars? If not, you can add the generated Jars to the archives configuration, which will simplify things.
Second, the correct way to add the generated Jars to the configuration is (instead of the dependencies block):
artifacts {
    generatedJars generateJars
}
This should make sure that the generateJars task gets run automatically when needed.
Third, I'd omit the += after outputs.files, although it might not make a difference. You should also add the necessary inputs.
Fourth, why do you need a JavaExec task to generate the Jars? Can you instead add the generated sources to some source set and let Gradle build them?
Fifth, IDEA doesn't have a concept corresponding to Gradle's project configuration dependencies. Either an IDEA module fully depends on another module, or not at all. You have two options: either use a module dependency and make the generated sources a source folder of the depended-on module (preferably both in the Gradle and the IDEA build), or pass the generated Jars as external dependencies to IDEA. In either case, you should probably add a task dependency from ideaModule to the appropriate generation task. If this still doesn't lead to a satisfactory IDEA setup, you could think about moving the generation of the Jars into a separate subproject.
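To make the second point concrete, with the artifacts block in place in project a, the dependent project can take an ordinary project dependency on the configuration; a sketch (the compile configuration and the cross-project task path are assumptions, not from the original build):

// b.gradle
dependencies {
    // depending on the 'generatedJars' configuration is what lets Gradle
    // run :a:generateJars automatically when b needs the jars
    compile project(path: ':a', configuration: 'generatedJars')
}

// per the last suggestion, make the IDEA metadata generation wait for the jars
ideaModule.dependsOn ':a:generateJars'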
For my use case, I had a C++ project which generated some native libraries which my java project needed to load in order to run.
In the project ':native' build.gradle:
task compile(type: Exec, group: 'build') {
    dependsOn ...
    outputs.files(fileTree('/some/build/directory') {
        include 'mylib/libmy.so'
    })
    ...
}
In the Java application project's build.gradle:
configurations {
    nativeDep
}

// Add dependency on the task that produces the library
dependencies {
    nativeDep files(project(':native').tasks.findByPath('compile'))
}

// Unfortunately, we also have to do this because gradle will only
// run the ':native:compile' task if we needed the task's inputs for another
// task
tasks.withType(JavaCompile) {
    dependsOn ':native:compile'
}

run {
    doFirst {
        // Use the configuration to add our library to java.library.path
        def libDirs = files(configurations.nativeDep.files.collect { it.parentFile })
        systemProperty "java.library.path", libDirs.asPath
    }
}
