Gradle with Eclipse - incomplete .classpath with multiple source sets

I have a Gradle build script with a handful of source sets that all have various dependencies defined (some common, some not), and I'm trying to use the Eclipse plugin to let Gradle generate .project and .classpath files for Eclipse. However, I can't figure out how to get all the dependency entries into .classpath: for some reason only a few of the external dependencies are actually added to .classpath, and as a result the Eclipse build fails with 1400 errors (building with Gradle works fine).
I've defined my source sets like so:
sourceSets {
    setOne
    setTwo {
        compileClasspath += setOne.runtimeClasspath
    }
    test {
        compileClasspath += setOne.runtimeClasspath
        compileClasspath += setTwo.runtimeClasspath
    }
}
dependencies {
    setOne 'external:dependency:1.0'
    setTwo 'other:dependency:2.0'
}
Since I'm not using the main source-set, I thought this might have something to do with it, so I added
sourceSets.each { ss ->
    sourceSets.main {
        compileClasspath += ss.runtimeClasspath
    }
}
but that didn't help.
I haven't been able to identify anything that the included libraries have in common, or anything common to the ones that are left out (although of course there must be something). I have a feeling that all of the included libraries are dependencies of the test source set, either directly or indirectly, but I haven't verified that beyond noting that all of test's dependencies are there.
How do I ensure that the dependencies of all source-sets are put in .classpath?

This was solved in a way that was closely related to a similar question I asked yesterday:
// Create a list of all the configuration names for my source sets
def ssConfigNames = sourceSets.findAll { ss -> ss.name != "main" }.collect { ss -> "${ss.name}Compile".toString() }

// Find configurations matching those of my source sets
configurations.findAll { conf -> conf.name in ssConfigNames }.each { conf ->
    // Add matching configurations to the Eclipse classpath
    eclipse.classpath {
        plusConfigurations += conf
    }
}
Update:
I also asked the same question in the Gradle forums, and got an even better solution:
eclipseClasspath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
It is not as precise, in that it adds more than just the configurations from my source sets, but it is guaranteed to work. And it's much easier on the eyes =)

I agree with Tomas Lycken that the second option is better, but it might need a small correction:
eclipse.classpath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }

This is what worked for me with Gradle 2.2.1:
eclipse.classpath.plusConfigurations = [configurations.compile]
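For much newer Gradle versions, where the compile and per-source-set *Compile configurations no longer exist, a rough sketch along the same lines (assuming the java and eclipse plugins are applied, and not taken from the answers above) might be:
apply plugin: 'eclipse'

// Collect each non-main source set's resolvable compile classpath configuration
// and add the whole list to the Eclipse classpath model in one go.
def nonMainCompileClasspaths = sourceSets.findAll { it.name != 'main' }.collect {
    configurations.getByName(it.compileClasspathConfigurationName)
}
eclipse.classpath.plusConfigurations += nonMainCompileClasspaths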

Related

How to add multiple tasks of the same type in Gradle?

I need to generate FlatBuffers files from *.fbs files before the build, so I'm using gradle.plugin.io.netifi:gradle-flatbuffers-plugin:1.0.7 to do it for me.
It works as expected for 1 task:
def generatedSourcePathJava = "$buildDir/generated/source/flatbuffers/java"
def generatedSourcePathCpp = "$buildDir/generated/source/flatbuffers/cpp"
...
task createFlatBuffersJava(type: io.netifi.flatbuffers.plugin.tasks.FlatBuffers) {
    outputDir = file(generatedSourcePathJava)
    language = "kotlin"
}
build.dependsOn createFlatBuffersJava
But if I add a 2nd one (to generate C++ files for JNI):
task createFlatBuffersJava(type: io.netifi.flatbuffers.plugin.tasks.FlatBuffers) {
    outputDir = file(generatedSourcePathJava)
    language = "kotlin"
}
task createFlatBuffersCpp(type: io.netifi.flatbuffers.plugin.tasks.FlatBuffers) {
    outputDir = file(generatedSourcePathCpp)
    language = "cpp"
}
assemble.dependsOn createFlatBuffersJava, createFlatBuffersCpp
Gradle build (../gradlew :engine-flatbuffers:clean :engine-flatbuffers:build) fails with the following:
What went wrong:
A problem occurred configuring project ':engine-flatbuffers'.
java.util.ConcurrentModificationException (no error message)
I think the question can be generalized to "How to add multiple tasks of the same type in Gradle?".
PS. "gradle-5.6-all"
That's a known plugin bug/feature reported at https://github.com/gregwhitaker/gradle-flatbuffers-plugin/issues/7.
It works in 1.0.5, but the kotlin language argument is sadly not supported in that version; java works and is compatible.
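If downgrading is an option, a possible workaround sketch reuses the task definitions from the question, with the plugin pinned to 1.0.5 and java output instead of kotlin (the plugin id here is assumed from the artifact coordinates, so double-check it against the plugin's README):
buildscript {
    repositories { gradlePluginPortal() }
    dependencies {
        // Pin to 1.0.5, where two FlatBuffers tasks in one project reportedly still work.
        classpath 'gradle.plugin.io.netifi:gradle-flatbuffers-plugin:1.0.5'
    }
}
// Plugin id assumed, not confirmed by the question.
apply plugin: 'io.netifi.flatbuffers'

task createFlatBuffersJava(type: io.netifi.flatbuffers.plugin.tasks.FlatBuffers) {
    outputDir = file("$buildDir/generated/source/flatbuffers/java")
    language = "java" // 1.0.5 does not understand "kotlin"; generated Java is still usable from Kotlin
}
task createFlatBuffersCpp(type: io.netifi.flatbuffers.plugin.tasks.FlatBuffers) {
    outputDir = file("$buildDir/generated/source/flatbuffers/cpp")
    language = "cpp"
}
assemble.dependsOn createFlatBuffersJava, createFlatBuffersCpp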

How to construct test dependency graph with gradle

I work on a large project (several hundred modules, each with tests) and would like to construct a test dependency graph using gradle dependencies.
For example, suppose I have the following modules and dependencies:
core <----- thing1 <----- thing1a
     <----- thing2
If I run gradle thing1:dependencies, it tells me that thing1 depends on core. Instead, I would like to know which modules depend on thing1, so I can run the tests for thing1 and all dependent modules whenever I change thing1. In the example above, the dependent modules would be thing1 and thing1a.
Hopefully there is a simple way to do this in gradle (constructing a test dependency graph seems like a pretty common thing to do), but I have not been able to find anything yet.
Using this gist (which I didn't write) as an inspiration, consider this in the root build.gradle:
subprojects { subproject ->
    task dependencyReport {
        doLast {
            def target = subproject.name
            println "-> ${target}"
            rootProject.childProjects.each { item ->
                def from = item.value
                from.configurations
                    .compile
                    .dependencies
                    .matching { it in ProjectDependency }
                    .each { to ->
                        if (to.name == target) {
                            println "-> ${from.name}"
                        }
                    }
            }
        }
    }
}
An example run, using a project structure like the one you describe:
$ gradle thing1:dependencyReport
:thing1:dependencyReport
-> thing1
-> thing1a
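For what it's worth, on newer Gradle versions the compile configuration no longer exists, so a variant of the same report that scans the api/implementation configurations might look like this (the configuration names are an assumption; adjust them to whatever your modules actually declare dependencies on):
subprojects { subproject ->
    task dependencyReport {
        doLast {
            def target = subproject.name
            println "-> ${target}"
            rootProject.childProjects.each { name, from ->
                ['api', 'implementation'].each { confName ->
                    // findByName returns null if the module does not have that configuration
                    def conf = from.configurations.findByName(confName)
                    conf?.dependencies?.matching { it instanceof ProjectDependency }?.each { to ->
                        if (to.name == target) {
                            println "-> ${from.name}"
                        }
                    }
                }
            }
        }
    }
}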

What is the proper way to refer to a resource directory path in one source set in Gradle?

In my build.gradle file, I have defined a separate sourceSet for integration tests:
sourceSets {
    integtest {
        java.srcDir 'src/integtest/java/io/attil/integration'
        resources.srcDir 'src/integtest/resources'
    }
}
I would like to use the path to the integration tests' resources in one of my manually defined tasks (a task that prefills the database for integration tests; the SQL script is located in the mentioned resource folder).
I currently have the following solution:
task prefillDatabase {
    // ... snip!
    String sqlString = new File(sourceSets.integtest.resources.srcDirs.iterator().next().toString() + '/setup_integration_tests.sql').text
    // ... snip!
}
While this works, it is quite cumbersome.
Is there a better, shorter way to achieve the same? (I'm looking for something like sourceSets.integtest.resources.srcDir.)
I'm not sure this is less verbose, but I'd argue it's more correct:
File file = sourceSets.integtest.resources.matching {
    include 'setup_integration_tests.sql'
}.singleFile
String sqlString = file.text
See FileTree.matching(Closure) and FileCollection.getSingleFile()
Also, this looks wrong to me:
java.srcDir 'src/integtest/java/io/attil/integration'
I'd think it would be:
java.srcDir 'src/integtest/java'
java.include 'io/attil/integration/**'
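Putting that together, the prefillDatabase task from the question might look roughly like this; only the file lookup changes, and running the SQL is left as a placeholder:
task prefillDatabase {
    doLast {
        // Resolve the script by name inside the integtest resources, wherever they live.
        def sqlFile = sourceSets.integtest.resources.matching {
            include 'setup_integration_tests.sql'
        }.singleFile
        String sqlString = sqlFile.text
        // ... execute sqlString against the integration-test database here ...
    }
}
Doing the lookup inside doLast also defers the file access to execution time instead of configuration time.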

Calling the same task multiple times in a single build.gradle file

I have a custom Gradle plugin that will generate Java files from a template file. I have several such template files in different locations, and I need to "compile" all of them to generate the Java files I need. Once I have the files, I want to package them into a .jar.
One way I thought I could do this was to call the "compile template" task multiple times from within the same build file. I'd call it once in a task that compiles template files in location A, again from a task that compiles template files from location B... etc., until I have all the Java files I need.
Something like this:
task compileFromLocationA << {
    compileTemplate.execute(A)...
}
task compileFromLocationB << {
    compileTemplate.execute(B)...
}
...
packageJar(depends: compileFromLocationA, compileFromLocationB, ...)
...
However, you can't programmatically call a task from within another task. I suppose I could break each compileFromLocation_ task into its own build.gradle file, but that seems like overkill. What's the "best practice" in a case like this?
The following build.gradle code works for me, using tasks.register() to perform multiple source-code-generating steps. In my case I needed to load different pairs of files (an XML schema and generation options) in two separate steps:
plugins {
    id 'java'
    id "com.plugin" version "1.0"
}

sourceSets.main.java.srcDirs += file("${buildDir}/genSrc")
sourceSets.test.java.srcDirs += file("${buildDir}/testGenSrc")

tasks.compileJava {
    dependsOn tasks.named("genMessage")
}

genMessage {
    codesFile = "${projectDir}/src/main/resources/file.xml"
}

def testGenModel1 = tasks.register("testGenModel1", com.plugin.TestGenModelTask.class) {
    schema = "${projectDir}/src/test/resources/file.xsd"
    options = "${projectDir}/src/test/resources/file.xml"
}

def testGenModel2 = tasks.register("testGenModel2", com.plugin.TestGenModelTask.class) {
    schema = "${projectDir}/src/test/resources/file2.xsd"
    options = "${projectDir}/src/test/resources/file2.xml"
}

tasks.compileTestJava {
    dependsOn tasks.named("testGenModel1"), tasks.named("testGenModel2")
}
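Mapped back onto the original question, the same pattern might look roughly like this; CompileTemplate and its templateDir/outputDir properties are placeholders for whatever task type and properties the custom plugin actually provides:
def compileFromLocationA = tasks.register('compileFromLocationA', CompileTemplate) {
    // Hypothetical properties; use the real names from the plugin's task type.
    templateDir = file('src/templates/locationA')
    outputDir = file("$buildDir/generated/locationA")
}
def compileFromLocationB = tasks.register('compileFromLocationB', CompileTemplate) {
    templateDir = file('src/templates/locationB')
    outputDir = file("$buildDir/generated/locationB")
}
tasks.named('jar') {
    dependsOn compileFromLocationA, compileFromLocationB
}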

Gradle script: what's the difference between a direct DSL "apply from:" and one called from a custom function?

First, there are some common scripts deployed in a private Maven repo:
http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle
http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle
In the target project's build.gradle:
subprojects {
    apply from: 'http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle'
    apply from: 'http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle'
}
it's OK!
but,
ext.applyScript = { script, version ->
    apply from: "http://domain/repo/com/d/build/script/${script}/${version}/${script}-${version}.gradle"
}
subprojects {
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}
it fails with the message:
"Error:Cannot add task ':javadocJar' as a task with that name already exists."
The task ':javadocJar' is defined in the script 'java-project-1.0.gradle', and we have several subprojects.
Why?
BTW: can anyone point me to the source location of "apply from:"? It's hard to locate by myself.
The problem is that in the latter case you are applying the scripts multiple times to the same root project.
How is that possible? It is quite interesting and a little bit tricky:
- you are defining applyScript as a Closure on the extension container ext of the current Gradle project
- in general, apply from: ... is handled as a method call apply(Map) on the org.gradle.api.plugins.PluginAware interface, which is one of the super-interfaces of the org.gradle.api.Project interface
- this means that every time you write apply ... you are calling the apply method on the current Gradle project (the one where the apply ... is specified)
- as you defined the apply ... as part of the closure, the standard delegation applies
- it is semantically the same as this.apply ...
- this by default points to the enclosing class/object, which is the root project (here it cannot be anything else)
So even if it looks like you are applying the 2 scripts to all the subprojects, you are actually applying the 2 scripts N times to the root project (N is the number of subprojects).
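A tiny illustration of that delegation behaviour (not from the question; whereAmI is just a throwaway closure):
// Defined in the root build.gradle: the closure's owner is the root build script.
ext.whereAmI = { println "resolved against: ${project.name}" }

subprojects {
    // Prints the *root* project's name for every subproject, because 'project'
    // inside the closure resolves through its owner, not the calling subproject.
    whereAmI()
}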
What you need to do is change the delegate to the correct Project instance:
you can do it very easily by adding one additional argument to the closure and explicitly calling the apply method on that argument:
ext.applyScript = { project, script, version ->
    project.apply from: "..."
}
subprojects {
    applyScript(it, 'java-project', '1.0')
    applyScript(it, 'maven', '1.0')
}
or you can set the delegate explicitly:
ext.applyScript = { script, version ->
    apply from: "..."
}
subprojects {
    applyScript.resolveStrategy = Closure.DELEGATE_FIRST
    applyScript.delegate = it
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}
