I work on a large project (several hundred modules, each with tests) and would like to construct a test dependency graph using gradle dependencies.
For example, suppose I have the following modules and dependencies:
core <----- thing1 <----- thing1a
     <----- thing2
If I run gradle thing1:dependencies it will tell me that thing1 depends on core. Instead, I would like to know which modules depend on thing1 so I can run the tests for thing1 and all dependent modules whenever I change thing1. In the example above, the dependent modules would be thing1 and thing1a.
Hopefully there is a simple way to do this in gradle (constructing a test dependency graph seems like a pretty common thing to do), but I have not been able to find anything yet.
Using this gist (which I didn't write) as an inspiration, consider this in the root build.gradle:
subprojects { subproject ->
    task dependencyReport {
        doLast {
            def target = subproject.name
            println "-> ${target}"
            rootProject.childProjects.each { item ->
                def from = item.value
                from.configurations
                    .compile
                    .dependencies
                    .matching { it in ProjectDependency }
                    .each { to ->
                        if (to.name == target) {
                            println "-> ${from.name}"
                        }
                    }
            }
        }
    }
}
An example run using a project structure like the one you describe:
$ gradle thing1:dependencyReport
:thing1:dependencyReport
-> thing1
-> thing1a
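To go one step further and actually run the tests of a module plus everything that depends on it (which is what the question asks for), the same lookup can be turned into task wiring. A minimal sketch, assuming each module has a standard test task, declares its project dependencies on the compile configuration as above, and that a task name like testDependents is acceptable; projectsEvaluated is used so every project's dependency list is populated before wiring:
gradle.projectsEvaluated {
    subprojects { target ->
        // every project that declares a project dependency on "target"
        def dependents = rootProject.subprojects.findAll { p ->
            p.configurations.findByName('compile')?.dependencies
                    ?.findAll { it instanceof ProjectDependency }
                    ?.any { it.dependencyProject == target }
        }
        target.task('testDependents') {
            // this module's own tests plus the tests of every dependent module
            dependsOn target.tasks.matching { it.name == 'test' }
            dependents.each { dependsOn "${it.path}:test" }
        }
    }
}
With that in place, gradle thing1:testDependents should run the tests of thing1 and thing1a. Note this only follows direct dependents; fully transitive dependents would need a recursive walk.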
I have a build.gradle file that's fairly long, and I'd like to break some of the logic up into smaller files to make the whole thing more maintainable. After moving some tasks into a new file, I found that none of the variables I had set in the parent script were available in the child script. Below is a pair of source files I reproduced this behavior with:
build.gradle:
apply from: 'repro.gradle'
def foo = "This is a variable"
tasks.register('printFromMainScript') {
    println(foo)
}
repro.gradle:
tasks.register('printFromChildScript') {
    println(foo)
}
In the above example, printFromMainScript works fine, but printFromChildScript fails. Is there a way to access foo from repro.gradle?
def foo creates a variable that exists only in the scope of build.gradle. The Gradle documentation explains this in more detail.
Gradle has an ext block, which is meant for extra properties.
This should work in your case:
build.gradle:
apply from: 'repro.gradle'
ext {
    foo = "This is a variable"
}
repro.gradle:
task printFromChildScript {
    doLast {
        println(project.foo)
    }
}
Note: the doLast block ensures that println is called only when the project is fully configured and printFromChildScript is actually executed. If you put println directly in the task body, it will run during Gradle's configuration phase.
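If you prefer to keep the lazy tasks.register style from the question, the same approach should work; a minimal sketch of the child script under that assumption:
tasks.register('printFromChildScript') {
    doLast {
        // project.foo is still read only at execution time, after the ext block has run
        println(project.foo)
    }
}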
I'm trying to configure a Zip task based on a property defined inside the sub-projects, but the property is not yet accessible at the time the task is configured. For instance, I want to exclude all of my projects that have toexclude = true from my zip file. So, the build.gradle of the sub-projects that I want to exclude starts with this:
ext.toexclude = true;
...
And my main build.gradle has this task:
task zipContent(type: Zip) {
    def excludedProjects = allprojects.findAll { Project p -> p.toexclude == true }.collect { it.name }
    println excludedProjects
    destinationDir = "/some/path"
    baseName = "myFile.zip"
    exclude excludedProjects
    from "/some/other/path"
}
The problem is that excludedProjects is always empty. Indeed, when I execute the task, I can see []. I believe this is because the property I set in the subprojects' build.gradle is not available at the moment the task is configured. As proof, if I replace the first line of the task with this:
def excludedProjects = allprojects.collect{it.name}
The task prints out all of my projects' names, and the zip contains nothing (which means the problem is in the p.toexclude == true).
Also, if I try this:
task zipContent(type: Zip) {
    def excludedProjects = []
    doFirst {
        excludedProjects = allprojects.findAll { Project p -> p.toexclude == true }.collect { it.name }
        println "IN DOFIRST"
        println excludedProjects
    }
    println "IN TASK CONFIG"
    println excludedProjects
    destinationDir = "/some/path"
    baseName = "myFile.zip"
    exclude excludedProjects
    from "/some/other/path"
}
The task prints out IN TASK CONFIG followed by an empty array, then IN DOFIRST with the array containing only the subprojects for which I set ext.toexclude = true.
So, is there a way to get the properties of the sub-projects at configuration time?
Well, the crucial question is: At which point of the build is all necessary information available?
We want to know, for each project in the build, whether the extra property toexclude is set to true, and since the property is (by design) set in the project's build script, every build script needs to be evaluated first.
Now, we have two options:
The first option: by default, subprojects are evaluated after the parent (root) project, so to be sure that every project has been evaluated we need to wait for the point in the build where that has happened. Gradle provides a listener for exactly that point:
gradle.addListener(new BuildAdapter() {
    @Override
    void projectsEvaluated(Gradle gradle) {
        tasks.getByPath('zipContent').with {
            exclude allprojects.findAll { it.toexclude }.collect { it.name }
        }
    }
})
The second option: Gradle provides the method evaluationDependsOnChildren() to turn the evaluation order around. It may be possible to use your original approach by calling this method before querying the excluded projects. Since this method only applies to child projects, you may have to call evaluationDependsOn(String) for each project in the build so that it also covers 'sibling' projects. Since this solution breaks Gradle's default behavior, it may have undesired side effects.
Just define excludedProjects outside the task
def excludedProjects = allprojects.findAll { Project p -> p.toexclude == true }.collect { it.name }

task zipContent(type: Zip) {
    destinationDir = file("/some/path")
    baseName = "myFile.zip"
    exclude excludedProjects
    from "/some/other/path"
}
You can call evaluationDependsOnChildren() in the root project so that child projects are evaluated before the root.
E.g.:
evaluationDependsOnChildren()

task zipContent(type: Zip) { ... }
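Spelled out a little further, a sketch of how that might look; the hasProperty guard is my own defensive addition for projects that never define the flag:
evaluationDependsOnChildren()

def excludedProjects = allprojects
        .findAll { it.hasProperty('toexclude') && it.toexclude }
        .collect { it.name }

task zipContent(type: Zip) {
    destinationDir = file("/some/path")
    baseName = "myFile.zip"
    exclude excludedProjects
    from "/some/other/path"
}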
Another option is to use an afterEvaluate { ... } closure to delay evaluation.
E.g.:
afterEvaluate {
    task zipContent(type: Zip) { ... }
}
I want to make my Gradle build intelligent when building my model.
To achieve this I was planning to read the schema files, determine what is included, and then build the included models first (if they are not already present).
I'm pretty new to Groovy and Gradle, so please take that into account.
What I have:
A build.gradle file in the root directory, covering n subdirectories (subprojects added to settings.gradle). I have only one Gradle build file, because I defined tasks like:
subprojects {
    task init
    task includeDependencies(type: checkDependencies)
    task build
    task dist
    (...)
}
I will return to checkDependencies shortly.
Schema files located externally, which I can see.
Each of them has from 0 to 3 lines that describe dependencies and look like this:
#include "ModelDir/ModelName.idl"
In my build.gradle I created a task that should open the files and read those dependencies, preferably returning them:
class parsingIDL extends DefaultTask {
    String idlFileName = "*def file name*"
    def regex = ~/#include .*\/(\w*).idl/

    @TaskAction
    def checkDependencies() {
        File idlFile = new File(idlFileName)
        if (!idlFile.exists()) {
            logger.error("File not found")
        } else {
            idlFile.eachLine { line ->
                def dep = []
                def matcher = regex.matcher(line)
                (...)*
            }
        }
    }
}
What should I have in (...)* to find all dependencies, and how should I then define that, for example,
subprojectA::build.dependsOn([subprojectB::dist, subprojectC::dist])?
All I could find on the internet created dep, which produced output like this:
[]
[]
[modelName]
[]
[]
(...)
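For what it's worth, a minimal sketch of what the (...)* part could look like, using the regex defined above; the main change is declaring dep outside eachLine so the results accumulate across lines instead of being reset per line:
def dep = []
idlFile.eachLine { line ->
    def matcher = regex.matcher(line)
    if (matcher.find()) {
        // group(1) is the model name captured by (\w*)
        dep << matcher.group(1)
    }
}
return dep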
I have a simple multi-project build:
root
|___ a
|___ b
|___ c
build.gradle in root project:
subprojects {
    task jarSources(type: Jar, dependsOn: classes) {
        classifier = 'source'
        from sourceSets.main.java, sourceSets.main.resources
    }
}
build.gradle in project a:
dependencies {
    compile project(':b')
    compile project(':c')
}

task archiveDependencySources(type: Zip) {
    ...
}
Task archiveDependencySources is intended to collect all source jars from the projects on which project a depends. Is there a standard way to do this?
So far I have found a solution, but it looks a bit ugly:
def allJarSourcesTasks = []
for (def dep : configurations.compile.dependencies)
    if (dep.hasProperty('dependencyProject'))
        allJarSourcesTasks << dep.dependencyProject.tasks['jarSources']

archiveDependencySources.dependsOn allJarSourcesTasks
archiveDependencySources.from allJarSourcesTasks
This might work (not sure about the null argument):
allJarSourcesTasks = configurations.compile.getTaskDependencyFromProjectDependency(true, "jarSources").getDependencies(null)
For dependsOn, .getDependencies(null) could be omitted, but I believe it is needed for from.
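A sketch of how that might be wired into the Zip task; the lazy closure for from is my own addition (to defer resolving the tasks until execution time), and the null argument is simply carried over from above:
task archiveDependencySources(type: Zip) {
    def sourcesTaskDep = configurations.compile
            .getTaskDependencyFromProjectDependency(true, 'jarSources')

    dependsOn sourcesTaskDep
    // resolve the jarSources tasks lazily and copy their outputs into the archive
    from { sourcesTaskDep.getDependencies(null).collect { it.outputs.files } }

    baseName = 'dependency-sources'
    destinationDir = file("$buildDir/dist")
}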
I have a Gradle build script with a handful of source sets that all have various dependencies defined (some common, some not), and I'm trying to use the Eclipse plugin to let Gradle generate .project and .classpath files for Eclipse. However, I can't figure out how to get all the dependency entries into .classpath: for some reason, only a few of the external dependencies are actually added to .classpath, and as a result the Eclipse build fails with 1400 errors (building with Gradle works fine).
I've defined my source sets like so:
sourceSets {
    setOne
    setTwo {
        compileClasspath += setOne.runtimeClasspath
    }
    test {
        compileClasspath += setOne.runtimeClasspath
        compileClasspath += setTwo.runtimeClasspath
    }
}
dependencies {
    setOne 'external:dependency:1.0'
    setTwo 'other:dependency:2.0'
}
Since I'm not using the main source-set, I thought this might have something to do with it, so I added
sourceSets.each { ss ->
    sourceSets.main {
        compileClasspath += ss.runtimeClasspath
    }
}
but that didn't help.
I haven't been able to identify any common property of the libraries that are included, or of those that are not (although of course there must be something). I have a feeling that all included libraries are dependencies of the test source set, either directly or indirectly, but I haven't been able to verify that beyond noting that all of test's dependencies are there.
How do I ensure that the dependencies of all source-sets are put in .classpath?
This was solved in a way that was closely related to a similar question I asked yesterday:
// Create a list of all the configuration names for my source sets
def ssConfigNames = sourceSets.findAll { ss -> ss.name != "main" }.collect { ss -> "${ss.name}Compile".toString() }

// Find configurations matching those of my source sets
configurations.findAll { conf -> "${conf.name}".toString() in ssConfigNames }.each { conf ->
    // Add matching configurations to Eclipse classpath
    eclipse.classpath {
        plusConfigurations += conf
    }
}
Update:
I also asked the same question in the Gradle forums, and got an even better solution:
eclipseClasspath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
It is not as precise, in that it adds more than just the things from my source sets, but it guarantees that it will work. And it's much easier on the eyes =)
I agree with Tomas Lycken that it is better to use the second option, but it might need a small correction:
eclipse.classpath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
This is what worked for me with Gradle 2.2.1:
eclipse.classpath.plusConfigurations = [configurations.compile]
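For newer Gradle versions, where the xxxCompile/xxxRuntime configurations no longer exist, a rough equivalent might be to add each source set's resolvable classpath configurations instead. This is only a sketch; the configuration names are derived from the source sets above:
eclipse {
    classpath {
        // add the compile and runtime classpaths of every non-main source set
        plusConfigurations += sourceSets.findAll { it.name != 'main' }.collectMany {
            [configurations.getByName(it.compileClasspathConfigurationName),
             configurations.getByName(it.runtimeClasspathConfigurationName)]
        }
    }
}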