Generate CLASSPATH from all multiproject gradle build dependencies - gradle

We're using an external testing tool (Squish) which runs after our main Gradle build and requires visibility of the classes under test. Currently the CLASSPATH environment variable is built from various bash scripts and other string manipulation; I'm aiming to get Gradle to do this automatically.
Since the build is done on a very slow source control system (ClearCase), it is preferable for the tests to run against the class files/JARs left by the build rather than against extra copies or everything compressed into a single JAR.
The classpath needs to contain:
Generated JAR files, typically in build/libs/project-x.jar
Test classes, typically in build/classes/test
Test resources, typically in build/resources/test
Any third-party JARs that any of the child projects depend on (log4j, Spring, etc.)
It's a complex multiproject build, but I've simplified it to the following example with a parent and two children.
Parent settings.gradle
include ':child1'
include ':child2'
Parent build.gradle
allprojects {
    apply plugin: 'java'
    repositories {
        mavenCentral()
    }
}
Child 1 build.gradle
dependencies {
    compile 'org.springframework:spring-context:4.1.2.RELEASE'
    compile 'org.springframework:spring-beans:4.1.2.RELEASE'
}
Child 2 build.gradle
dependencies {
    compile project(":child1")
}
Here's what I've got so far. Is this the right approach? Can it be simplified or completely rewritten in a better way?
task createSquishClasspath << {
    def paths = new LinkedHashSet()
    paths.addAll(subprojects.configurations.compile.resolvedConfiguration.resolvedArtifacts.file.flatten())
    paths.addAll(subprojects.jar.outputs.files.asPath)
    paths.addAll(subprojects.sourceSets.test.output.resourcesDir)
    paths.addAll(subprojects.sourceSets.test.output.classesDir)
    paths.each {
        println "${it}"
    }
    println paths.join(File.pathSeparator)
}
Output
C:\so-question\Parent\local-repo\spring\spring-context-4.1.2.RELEASE.jar
C:\so-question\Parent\local-repo\spring\spring-beans-4.1.2.RELEASE.jar
C:\so-question\Parent\child1\build\libs\child1.jar
C:\so-question\Parent\child2\build\libs\child2.jar
C:\so-question\Parent\child1\build\resources\test
C:\so-question\Parent\child2\build\resources\test
C:\so-question\Parent\child1\build\classes\test
C:\so-question\Parent\child2\build\classes\test

In summary, the best answer so far is to join all the paths from each project's dependencies together with every test.output.resourcesDir and test.output.classesDir:
task createSquishClasspath << {
    def paths = new LinkedHashSet()
    paths.addAll(subprojects.configurations.compile.resolvedConfiguration.resolvedArtifacts.file.flatten())
    paths.addAll(subprojects.jar.outputs.files.asPath)
    paths.addAll(subprojects.sourceSets.test.output.resourcesDir)
    paths.addAll(subprojects.sourceSets.test.output.classesDir)
    paths.each {
        println "${it}"
    }
    println paths.join(File.pathSeparator)
}
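If Squish needs the classpath as an environment variable rather than console output, one option is to have a task write the joined path to a file that the launching script can read. Below is a minimal sketch built on the same path collection as above; like the original task it assumes the build has already produced the jars, and the output file name squish-classpath.txt is just an illustrative choice:
task writeSquishClasspath {
    doLast {
        def paths = new LinkedHashSet()
        paths.addAll(subprojects.configurations.compile.resolvedConfiguration.resolvedArtifacts.file.flatten())
        paths.addAll(subprojects.jar.outputs.files.asPath)
        paths.addAll(subprojects.sourceSets.test.output.resourcesDir)
        paths.addAll(subprojects.sourceSets.test.output.classesDir)
        // illustrative location; point the Squish launch script at this file
        def outFile = new File(buildDir, 'squish-classpath.txt')
        outFile.parentFile.mkdirs()
        outFile.text = paths.join(File.pathSeparator)
    }
}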

Related

How to copy files between sub-projects in Gradle

How can I make a sub-project copy a file that is produced by a sibling sub-project? All this with proper dependency management, and without assuming that any language-specific plugins (like the JavaPlugin) are used.
I have looked at the updated Gradle 6 draft Sharing artifacts between projects but it does not really answer that question.
My multi-project structure is something like:
top/
    build.gradle
    settings.gradle
    producer/
        build.gradle
        myFile_template.txt
    consumer/
        build.gradle
I want a Copy-task in producer/build.gradle to copy+transform myFile_template.txt into $buildDir/target/myFile.txt and another Copy-task in consumer/build.gradle should further copy+transform that myFile.txt to a finalFile.txt.
Presumably a proper solution would be able to use task outputs.files or some such so that consumer/build.gradle does not need to explicitly mention the location of $buildDir/target/myFile.txt.
(I'm completely new to Gradle).
Gradle gives you lots of freedom, but I prefer that projects only "share" with each other via configurations and/or artifacts. One project should never concern itself with another project's tasks; tasks are effectively private to each project.
With this principle in mind you could do something like
project(':producer') {
    configurations {
        transformed
    }
    task transformTemplate(type: Copy) {
        from 'src/main/template'
        into "$buildDir/transformed"
        filter(...) // transformation goes here
    }
    dependencies {
        // file collection derived from a task.
        // Any task which uses this as a task input will depend on the transformTemplate task
        transformed files(transformTemplate)
    }
}
project(':consumer') {
    configurations {
        producerTransformed
    }
    dependencies {
        producerTransformed project(path: ':producer', configuration: 'transformed')
    }
    task transformProducer(type: Copy) {
        from configurations.producerTransformed // this will create a task dependency
        into ...
        filter ...
    }
}
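To make the elided pieces concrete, here is one way the two Copy tasks could be filled in, assuming a simple Ant ReplaceTokens transformation; the configurations and dependencies wiring stays exactly as in the answer above, and the token names, rename targets, and output directories below are illustrative only:
project(':producer') {
    task transformTemplate(type: Copy) {
        from 'src/main/template'                 // directory containing myFile_template.txt
        into "$buildDir/transformed"
        rename 'myFile_template.txt', 'myFile.txt'
        // token replacement is just an example transformation
        filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: [version: '1.0'])
    }
}
project(':consumer') {
    task transformProducer(type: Copy) {
        from configurations.producerTransformed // creates the task dependency on :producer
        into "$buildDir/final"
        rename 'myFile.txt', 'finalFile.txt'
        filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: [stage: 'final'])
    }
}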

A code generator task in a multi-project gradle build

I have studied a thousand similar questions on SO and I am still lost. I have a simple multi-project build (settings.gradle):
rootProject.name = 'mwe'
include ":Generator"
include ":projectB"
include ":projectC"
with a top-level build.gradle as follows:
plugins { id "java" }
allprojects { repositories { jcenter() } }
and with two kinds of project build.gradle files. The first one (Generator) exposes a run command that runs the generator taking the command line argument:
plugins {
    id "application"
    id "scala"
}
dependencies { compile "org.scala-lang:scala-library:2.12.3" }
mainClassName = "Main"
ext { cmdlineargs = "" }
run { args cmdlineargs }
The code generator is to be called from projectB (and an analogous projectC, and many others). I am trying to do this as follows (projectB/build.gradle):
task TEST {
    project(":Generator").ext.cmdlineargs = "Hurray!"
    println("Value set: " + project(":Generator").ext.cmdlineargs)
    dependsOn(":Generator:run")
}
Whatever I try to do (a Gradle newbie here), I am not getting what I need. I have two problems:
The property cmdlineargs is not set at the point that task :projectB:TEST is run. The println sees the right value, but the argument passed to the executed main method is the one configured in Generator/build.gradle, not the one in projectB/build.gradle. As pointed out in the responses, this can be worked around using lazy property evaluation, but that does not solve the second problem.
The generator is only run once, even if I build both projectB and projectC. I need to run Generator:run for each of projectB and projectC separately (to generate different sources for each dependent project).
How can I get this to work? I suppose a completely different strategy is needed. I don't have to use the command line and run; I can also run the main class of the generator more directly and pass arguments to it, but I do find the run task quite convenient (the complex classpath is set up automatically, etc.). The generator is a Java/Scala project itself that is compiled within the same multi-project build.
Note: tasks aren't like methods in Java. A task will execute either 0 or 1 times per Gradle invocation; it will never execute twice (or more) in a single build.
I think you want two or more tasks. E.g.:
task run1(type: xxx) {
    args 'foo'
}
task run2(type: xxx) {
    args 'bar'
}
Then you can depend on run1 or run2 in your other projects.
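Applied to the question, that could mean declaring one JavaExec task per consuming project inside Generator/build.gradle. A minimal sketch under that assumption; the task names and arguments below are illustrative:
task runForB(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'Main'            // main class from the question
    args 'Hurray!'           // the argument projectB wants
}
task runForC(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'Main'
    args 'Something else'    // the argument projectC wants
}
projectB can then add dependsOn ':Generator:runForB' to whichever of its tasks consumes the generated sources, and projectC can depend on runForC, so each project triggers its own generator invocation.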

Execute Gradle task after subprojects are configured

I have a multi-project Gradle build where subprojects are assigned version numbers independent of the root project. I'd like to inject this version number into a few resource files in each subproject. Normally, I'd do this by configuring the processResources task for each subproject in the root build. However, the problem is that Gradle appears to be executing the processResources task before loading the subprojects' build files and is injecting "unspecified" as the version.
Currently, my project looks like this:
/settings.gradle
include 'childA' // ... and many others
/build.gradle
subprojects {
    apply plugin: 'java'
    apply plugin: 'com.example.exampleplugin'
}
subprojects {
    // This has to be configured before processResources
    customPlugin {
        baseDir = "../common"
    }
    processResources {
        // PROBLEM: version is "unspecified" here
        inputs.property "version", project.version
        // Inject the version:
        from(sourceSets.main.resources.srcDirs) {
            include 'res1.txt', 'res2.txt', 'res3.txt'
            expand 'version': project.version
        }
        // ...
    }
}
/childA/build.gradle
version = "0.5.424"
I looked into adding evaluationDependsOnChildren() at the beginning of the root's build.gradle, but that causes an error because childA/build.gradle runs before customPlugin { ... }. I've tried using dependsOn, mustRunAfter, and other techniques, but none seem to have the desired effect. (Perhaps I don't fully understand the lifecycle, but it seems like the root project is configured and executed before the subprojects. Shouldn't it configure the root, then configure the subprojects, and then execute?)
How can I inject the version of each subproject into the appropriate resource files without a lot of copy/paste or boilerplate?
You could try using this method, with a hook:
gradle.projectsEvaluated({
    // your code
})
I got this figured out for myself. I'm using an init.gradle file to apply something to the rootProject, but I need data from a subproject.
The first option was to evaluate each subproject before I modified it:
rootProject {
    project.subprojects { sub ->
        sub.evaluate()
        //Put your code here
    }
}
But I wasn't sure what side effects forcing the subproject to evaluate would have, so I did the following instead:
allprojects {
    afterEvaluate { project ->
        //Put your code here
    }
}
Try doing it like this:
subprojects { project ->
    // your code
}
Otherwise project will refer to your root project, where no version has been specified.
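Combining the afterEvaluate suggestion with the question's own processResources block, the root build script can defer that configuration until each child's version has been set. A sketch only; the customPlugin block is left out because its ordering constraints aren't known here:
subprojects {
    afterEvaluate { proj ->
        // By now childA/build.gradle has run, so proj.version is "0.5.424" rather than "unspecified"
        proj.processResources {
            inputs.property "version", proj.version
            from(proj.sourceSets.main.resources.srcDirs) {
                include 'res1.txt', 'res2.txt', 'res3.txt'
                expand 'version': proj.version
            }
        }
    }
}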

Delombok using Gradle

As part of our build process we analyse our source code with SonarQube.
One problem with this is that we use Lombok annotations and SonarQube is not handling this very well -- our code needs to be 'delombok'ed.
Delomboking removes the annotations and replaces the source with the final code that the compiler actually sees.
This can be done in gradle (see here).
Well, in part. Typically an Ant task can be used to delombok source. Code sample below:-
task delombok {
    // delombok task may depend on other projects already being compiled
    dependsOn configurations.compile.getTaskDependencyFromProjectDependency(true, "compileJava")
    // Set up incremental build, must be made in the configuration phase (not doLast)
    inputs.files file(srcJava)
    outputs.dir file(srcDelomboked)
    doLast {
        FileCollection collection = files(configurations.compile)
        FileCollection sumTree = collection + fileTree(dir: 'bin')
        ant.taskdef(name: 'delombok', classname: 'lombok.delombok.ant.DelombokTask', classpath: configurations.compile.asPath)
        ant.delombok(from: srcJava, to: srcDelomboked, classpath: sumTree.asPath)
    }
}
The problem I have with this is that I believe I would need a pre-configured Ant installation (I've yet to get this working).
Another approach would be to use the Maven lombok:delombok plugin (see here), but I don't know how to do this or whether it would also require a pre-configured environment.
I'm not sure which is the best approach. An approach that does not require a pre-configured build system and works entirely from gradle/gradlew would be preferable.
The ultimate aim would be to have a 'delombok' project task which would essentially perform the following:
java -jar lombok.jar delombok src -d src-delomboked
Edit
So I've pretty much got this to work with roughly this snippet:
dependencies {
    compile 'org.projectlombok:lombok:1.14.2'
}
task delombok {
    description 'Delomboks the entire source code tree'
    def srcDelomboked = 'build/src-delomboked'
    def srcJava = 'src'
    inputs.files file( srcJava )
    outputs.dir file( srcDelomboked )
    doLast {
        def collection = files( configurations.compile + configurations.testCompile )
        def sumTree = collection + fileTree( dir: 'bin' )
        ant.taskdef( name: 'delombok', classname: 'lombok.delombok.ant.DelombokTask',
                classpath: ( configurations.compile + configurations.testCompile ).asPath )
        ant.delombok( from: srcJava, to: srcDelomboked, classpath: sumTree.asPath )
        // Replace current src directory with delomboked source
        copy {
            from srcDelomboked
            into srcJava
        }
    }
}
The first bit ensures that the Lombok jar is available to Gradle for use by the delombok Ant task. Then we configure the source files to use, and set Gradle up to use the Ant task. Finally, the copy step replaces the entire source tree with the delomboked version of the code; obviously this could be removed to suit your needs.
I think the easiest way to delombok sources with gradle is:
task delombok {
    description 'Delomboks the source code'
    ant.taskdef(classname: 'lombok.delombok.ant.Tasks$Delombok', classpath: configurations.compile.asPath, name: 'delombok')
    ant.mkdir(dir: 'build/src-delomboked')
    ant.delombok(verbose: 'true', encoding: 'UTF-8', to: 'build/src-delomboked', from: 'src/main/java')
}
Using an Ant task is fine. No "Ant preconfiguration" should be necessary. Alternatively, you could use a JavaExec task to call delombok as in your last snippet. (JavaExec doesn't currently support the -jar option, so you'd have to name the main class.) Using a Maven plugin from Gradle isn't possible (except for executing Maven with an Exec task).
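For reference, the JavaExec variant mentioned above might look roughly like this; the task name, directories, and especially the main-class name are assumptions (check the Main-Class entry in the lombok.jar manifest for your version):
task delombokExec(type: JavaExec) {
    description 'Runs delombok directly, without the Ant task'
    classpath = configurations.compile   // must contain the lombok jar
    main = 'lombok.launch.Main'          // assumed launcher class; verify against the jar manifest
    args 'delombok', 'src', '-d', 'src-delomboked'
}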
https://github.com/franzbecker/gradle-lombok
buildscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        classpath 'io.franzbecker:gradle-lombok:1.6'
    }
}
apply plugin: 'java'
apply plugin: 'io.franzbecker.gradle-lombok'
Update: at the moment, IntelliJ IDEA 2016.3 plus the Lombok plugin is quite enough for me, together with the following in build.gradle:
dependencies {
    compileOnly 'org.projectlombok:lombok:+'
}

Gradle - can I include a task's output in project dependencies

I have a task that generates java sources and a set of jars from these sources (say, project a). I would like to export these jars to dependent projects (say, project b). So here's roughly what I have right now:
//a.gradle
configurations {
    generatedJars
}
task generateJars(type: JavaExec) {
    //generate jars ...
    outputs.files += //append generated jars here
}
dependencies {
    generatedJars generateJars.outputs.files
}
//b.gradle
dependencies {
    compile project(path: ':a', configuration: 'generatedJars')
}
It works OK, except that adding generateJars.outputs.files as a dependency does not tell Gradle that it has to run the generateJars task when no jars have been generated yet. I have tried adding the task itself as a dependency, hoping that it would work the same way as adding a jar/zip task to an artifact configuration (e.g. artifacts { myJarTask }), but it throws an error telling me that I cannot do that. Of course I can inject the generateJars task somewhere in the build process before :b starts evaluating, but that's clumsy and brittle, so I would like to avoid it.
I feel like I should be adding the generated jars to the project's artifacts { ... } block, but I am not sure how to then make them visible to dependent projects. Is there a better way of achieving this?
Dependent projects (project b) will need to set up their IntelliJ IDEA module classpath to point to project a's generated jars. Something rather like this (pseudo-code):
//b.gradle
idea {
    module {
        scopes.COMPILE.plus += project(path: ':a', configuration: 'generatedJars').files
    }
}
So far I have tried simply adding a project dependency on :a's generatedJars in :b, but the IDEA plugin simply adds module :a as a module dependency and assumes that it exports its generated jars (which is probably a correct assumption), therefore not adding the generated jars to :b's classpath.
Any help would be greatly appreciated!
First, do you need a separate configuration? That is, do you have clients of a that should not see the generated Jars? If not, you can add the generated Jars to the archives configuration, which will simplify things.
Second, the correct way to add the generated Jars to the configuration is (instead of the dependencies block):
artifacts {
    generatedJars generateJars
}
This should make sure that the generateJars task gets run automatically when needed.
Third, I'd omit the += after outputs.files, although it might not make a difference. You should also add the necessary inputs.
Fourth, why do you need a JavaExec task to generate the Jars? Can you instead add the generated sources to some source set and let Gradle build them?
Fifth, IDEA doesn't have a concept corresponding to Gradle's project configuration dependencies. Either an IDEA module fully depends on another module, or not at all. You have two options: either use a module dependency and make the generated sources a source folder of the depended-on module (preferably both in the Gradle and the IDEA build), or pass the generated Jars as external dependencies to IDEA. In either case, you should probably add a task dependency from ideaModule to the appropriate generation task. If this still doesn't lead to a satisfactory IDEA setup, you could think about moving the generation of the Jars into a separate subproject.
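One way to spell out the second point is the explicit file-plus-builtBy artifact notation (useful since generateJars is not an archive task); in this sketch the output directory and jar name are placeholders, not taken from the question:
//a.gradle
configurations {
    generatedJars
}
task generateJars(type: JavaExec) {
    // ... main class, args, inputs ...
    outputs.dir "$buildDir/generated-jars"   // placeholder output location
}
artifacts {
    // registering the file with builtBy wires in the task dependency
    generatedJars(file("$buildDir/generated-jars/generated.jar")) {
        builtBy generateJars
    }
}
//b.gradle
dependencies {
    compile project(path: ':a', configuration: 'generatedJars')
}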
For my use case, I had a C++ project which generated some native libraries which my java project needed to load in order to run.
In the project ':native' build.gradle:
task compile(type: Exec, group: 'build') {
    dependsOn ...
    outputs.files(fileTree('/some/build/directory') {
        include 'mylib/libmy.so'
    })
    ...
}
In the Java application project's build.gradle:
configurations {
    nativeDep
}
// Add dependency on the task that produces the library
dependencies {
    nativeDep files(project(':native').tasks.findByPath('compile'))
}
// Unfortunately, we also have to do this because gradle will only
// run the ':native:compile' task if its outputs are needed as inputs for another task
tasks.withType(JavaCompile) {
    dependsOn ':native:compile'
}
run {
    doFirst {
        // Use the configuration to add our library to java.library.path
        def libDirs = files(configurations.nativeDep.files.collect { it.parentFile })
        systemProperty "java.library.path", libDirs.asPath
    }
}
