I want the outputs of one task to be available to an identical task in another submodule.
I'm trying to make yet another plugin for compilation (of C/C++, .hs, .coffee, .js, et al.) and source-code generation.
So I'm making a plugin and task(s) that (so far) generate CMakeLists.txt, Android.mk, .vcxproj, or whatever for each module to build the source code.
I have a multi-module build for this.
I can reach around and find the tasks from "other" submodules, but I can't seem to enforce any execution order.
So, with ...
root project: RootModule
sub project: NativeCommandLine (requires SharedModule)
sub project: NativeGUI (requires SharedModule)
sub project: SharedModule
... I find that the NativeGUI tasks are executed before SharedModule's, which means that the SharedModule results aren't ready.
Bad.
Since the dependencies { ... } block is evaluated after plugins are applied (AFAIK), I'm guessing that the dependency relations are wired up afterwards.
I need my tasks executed in order based on the dependency relations ... right? How can I do that?
I have created a (scala) TaskBag that lazily registers a collection of all participating Task instances.
I add instances of my task to this, along with a handler for when a new task appears.
During configuration, any task can include logic in the lambda to filter and act on other tasks, and it will be executed as soon as both tasks are participating.
package peterlavalle

import java.util

import org.gradle.api.Task

import scala.collection.JavaConverters._

object TaskBag {

  class AnchorExtension extends util.LinkedList[(Task, Task => Unit)]()

  /**
   * connect to the group of participating tasks
   */
  def apply(task: Task)(react: Task => Unit): Unit =
    synchronized {
      // lazily create the central anchor extension on the root project
      val anchor: AnchorExtension =
        task.getProject.getRootProject.getExtensions.findByType(classOf[AnchorExtension]) match {
          case null =>
            task.getProject.getRootProject.getExtensions
              .create(classOf[AnchorExtension].getName, classOf[AnchorExtension])
          case anchor: AnchorExtension =>
            anchor
        }

      // introduce ourselves to the tasks that registered before us
      anchor.asScala.foreach {
        case (otherTask, otherReact) =>
          require(otherTask != task, "Don't double register a task!")
          otherReact(task)
          react(otherTask)
      }

      // add us to the list
      anchor.add(task -> react)
    }
}
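(For comparison: once all projects have been evaluated, the same ordering can be expressed with plain cross-project dependsOn. A minimal sketch for the layout above, where generateBuildFiles is a hypothetical stand-in for whatever task name the plugin registers:)

// root build.gradle, sketch only; "generateBuildFiles" is hypothetical
gradle.projectsEvaluated {
    ['NativeCommandLine', 'NativeGUI'].each { name ->
        project(":$name").tasks.named('generateBuildFiles').configure {
            dependsOn project(':SharedModule').tasks.named('generateBuildFiles')
        }
    }
}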
Related
I have a custom Gradle plugin that will generate Java files from a template file. I have several such template files in different locations, and I need to "compile" all of them to generate the Java files I need. Once I have the files, I want to package them into a .jar.
One way I thought I could do this was to call the "compile template" task multiple times from within the same build file. I'd call it once in a task that compiles template files in location A, again from a task that compiles template files from location B... etc., until I have all the Java files I need.
Something like this:
task compileFromLocationA << {
    compileTemplate.execute(A)...
}
task compileFromLocationB << {
    compileTemplate.execute(B)...
}
...
packageJar(depends: compileFromLocationA, compileFromLocationB, ...)
...
However, you can't programmatically call a task from within another task. I suppose I could break each compileFromLocation_ task into its own build.gradle file, but that seems like overkill. What's the "best practice" in a case like this?
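(In the spirit of the answer below, a minimal sketch: assuming the template plugin exposes a CompileTemplate task type with a hypothetical inputDir property, you would register one task per location and wire them all into the jar:)

// sketch only; CompileTemplate and inputDir are hypothetical stand-ins
def compileFromLocationA = tasks.register('compileFromLocationA', CompileTemplate) {
    inputDir = file('src/templates/locationA')
}
def compileFromLocationB = tasks.register('compileFromLocationB', CompileTemplate) {
    inputDir = file('src/templates/locationB')
}
tasks.named('jar') {
    dependsOn compileFromLocationA, compileFromLocationB
}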
This code seems to work in build.gradle by using tasks.register(), e.g. to perform multiple source code generation steps. In my case I needed to load a different pair of files (XML schema and generation options) in each of two steps:
plugins {
    id 'java'
    id "com.plugin" version "1.0"
}

sourceSets.main.java.srcDirs += file("${buildDir}/genSrc")
sourceSets.test.java.srcDirs += file("${buildDir}/testGenSrc")

tasks.compileJava {
    dependsOn tasks.named("genMessage")
}

genMessage {
    codesFile = "${projectDir}/src/main/resources/file.xml"
}

def testGenModel1 = tasks.register("testGenModel1", com.plugin.TestGenModelTask.class) {
    schema = "${projectDir}/src/test/resources/file.xsd"
    options = "${projectDir}/src/test/resources/file.xml"
}

def testGenModel2 = tasks.register("testGenModel2", com.plugin.TestGenModelTask.class) {
    schema = "${projectDir}/src/test/resources/file2.xsd"
    options = "${projectDir}/src/test/resources/file2.xml"
}

tasks.compileTestJava {
    dependsOn tasks.named("testGenModel1"), tasks.named("testGenModel2")
}
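(Since tasks.register() returns a TaskProvider, the two providers captured above can also be passed to dependsOn directly, avoiding the second lookup by name:)

tasks.compileTestJava {
    dependsOn testGenModel1, testGenModel2
}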
First, there are some common scripts deployed in a private Maven repo:
http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle
http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle
In the target project's build.gradle:
subprojects {
    apply from: 'http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle'
    apply from: 'http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle'
}
it's OK! But with:
ext.applyScript = { script, version ->
    apply from: "http://domain/repo/com/d/build/script/${script}/${version}/${script}-${version}.gradle"
}
subprojects {
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}
it fails with the message:
"Error:Cannot add task ':javadocJar' as a task with that name already exists."
task ':javadocJar' is defined in script 'java-project-1.0.gradle'
and we have several subprojects.
Why?
BTW: can anyone give me a pointer to the source location of "apply from:"? It's hard to locate it by myself.
The problem is that in the latter case you are applying the scripts multiple times to the same root project.
How is that possible? It is quite interesting and a little bit tricky:
- you are defining applyScript as a Closure on the extension container ext of the current Gradle project
- generally, apply from: ... is handled as a method call apply(Map) on the org.gradle.api.plugins.PluginAware interface, which is one of the super-interfaces of the org.gradle.api.Project interface
- this means every time you write apply ... you are calling the apply method on the current Gradle project (the one where the apply ... is specified)
- since you defined the apply ... as part of a closure, the standard Groovy delegation rules apply
- it is semantically the same as this.apply ...
- this by default points to the enclosing class/object, which is the root project (here it cannot be anything else)
So even though it looks like you are applying the two scripts to all the subprojects, you are actually applying them N times to the root project (where N is the number of subprojects).
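(A quick probe makes this visible; inside the closure, project resolves against the closure's owner, the root build script, so the line below always prints the root project's name no matter which subproject "calls" it:)

ext.applyScript = { script, version ->
    // always prints the root project's name, whichever subproject calls this
    println "applying ${script} to ${project.name}"
    apply from: "http://domain/repo/com/d/build/script/${script}/${version}/${script}-${version}.gradle"
}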
What you need to do is change the delegate to the correct Project instance:
You can do this easily by adding one additional argument to the closure and explicitly calling the apply method on that argument:
ext.applyScript = { project, script, version ->
    project.apply from: "..."
}
subprojects {
    applyScript(it, 'java-project', '1.0')
    applyScript(it, 'maven', '1.0')
}
or you can set the delegate explicitly:
ext.applyScript = { script, version ->
    apply from: "..."
}
subprojects {
    applyScript.resolveStrategy = Closure.DELEGATE_FIRST
    applyScript.delegate = it
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}
I want to make my Gradle build intelligent when building my model.
To achieve this I was planning to read the schema files, find out what is included, and then build the included models first (if they are not already present).
I'm pretty new to Groovy and Gradle, so please take that into account.
What I have:
A build.gradle file in the root directory, covering n subdirectories (subprojects added to settings.gradle). I have only one Gradle build file, because I defined tasks like:
subprojects {
    task init
    task includeDependencies(type: checkDependencies)
    task build
    task dist
    (...)
}
I will return to checkDependencies shortly.
Schema files located externally, which I can see.
Each of them has from 0 to 3 lines that declare dependencies and look like this:
#include "ModelDir/ModelName.idl"
In my build.gradle I created a task that should open the file, read those dependencies, and preferably return them:
class parsingIDL extends DefaultTask {
    String idlFileName = "*def file name*"
    def regex = ~/#include .*\/(\w*).idl/

    @TaskAction
    def checkDependencies() {
        File idlFile = new File(idlFileName)
        if (!idlFile.exists()) {
            logger.error("File not found")
        } else {
            idlFile.eachLine { line ->
                def dep = []
                def matcher = regex.matcher(line)
                (...)*
            }
        }
    }
}
What should I have in (...)* to find all the dependencies, and how should I then define that, for example,
subprojectA::build.dependsOn([subprojectB::dist, subprojectC::dist])?
All I could find on the internet created dep, which output the following:
[]
[]
[modelName]
[]
[]
(...)
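(A note on those empty lists: def dep = [] sits inside the eachLine closure, so a fresh list is created for every line, which is exactly the pattern shown above. A hedged sketch using the asker's names, with the list hoisted out of the loop so all matches accumulate:)

// inside checkDependencies(): accumulate across all lines
def deps = []
idlFile.eachLine { line ->
    def matcher = regex.matcher(line)
    if (matcher.find()) {
        deps << matcher.group(1)   // e.g. "ModelName" from "ModelDir/ModelName.idl"
    }
}
return deps

(The cross-project relation could then be declared from the root build file, for example:)

project(':subprojectA').tasks.getByName('build').dependsOn(
        project(':subprojectB').tasks.getByName('dist'),
        project(':subprojectC').tasks.getByName('dist'))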
We have an optional Gradle task docker that depends on task war; if executed, it needs a war file generated with an extra file in it. This extra file can be added to the resources within the processResources task (or potentially directly in the war task). However, the corresponding code block must not run if task docker has not been requested and will not be run.
We need a correct condition in the following block checking if task docker is in the pipeline:
processResources {
    if (/* CONDITION HERE: task docker is requested */) {
        from("${projectDir}/docker") {
            include "app.properties"
        }
    }
}

task docker(type: Dockerfile) {
    dependsOn build
    ...
Clarification: processResources is a standard dependency of the war task, and the latter is a standard dependency of the build task. processResources is always executed on build, with or without the docker task, to collect resources for assembling the war, and may not be fully disabled in this case. One could move the code in question to a separate task that depends on docker and works on the output directory of processResources, yet runs before war; however, such a construct would be much less clear for such a simple thing.
You can simply add an additional dependency to your docker task, making it rely not only on the build task but also on processResources. In that case, your processResources task is guaranteed to run whenever docker is executed.
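(With the task names from the question, that wiring is a one-liner:)

docker.dependsOn processResources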
Another solution is to use the TaskExecutionGraph. This lets you initialize some variable that tells you whether or not some task will be executed. But you have to understand that the graph is prepared only after all the configuration is done, so you can rely on it only during the execution phase. Here is a short example of how it could be used:
//some flag, whether or not some task will be executed
def variable = false

//task with some logic executed during the execution phase
task test1 {
    doLast {
        if (variable) {
            println 'task2 will be executed'
        } else {
            println 'task2 will not be executed'
        }
    }
}

//test1's behavior differs according to whether or not this task is scheduled to run
task test2(dependsOn: test1) {
}

//add some logic according to the task execution graph
gradle.taskGraph.whenReady { taskGraph ->
    //set the flag only if test2 is scheduled to run
    if (taskGraph.hasTask(test2)) {
        variable = true
        println 'task test2 will be executed'
    }
}
Furthermore, you can configure your custom task to be disabled by setting its enabled property to false if the docker task is not in the execution graph. In that case you don't have to provide flags and logic in the execution phase. Like:
task test1 {
    //some logic for execution
    doLast {
        println "execute some logic"
    }
}

task test2(dependsOn: test1) {
}

gradle.taskGraph.whenReady { taskGraph ->
    if (!taskGraph.hasTask(test2)) {
        //disable test1 if test2 will not run
        test1.enabled = false
    }
}
But it'll be impossible to run this custom task separately without some additional configuration.
I'm working on creating a multi-project build file using Gradle. Many subprojects need to execute a task that exists in another subproject, passing in certain parameters. How can this be achieved in Gradle?
For example:
root project
- project B : task X
- project A : task Y (param m, param n)
I need projectB.taskX to call projectA.taskY(m,n)
Update:
Subproject A has a task of type JavaExec which needs, as an input parameter, the location of the properties file:
task generateCode(dependsOn: ['classes', 'build'], type: JavaExec) {
    main = 'jjrom.ObjectGen'
    classpath = sourceSets.main.runtimeClasspath
    args 'arg1', 'arg2', file(propertiesFilePath).path
}
Now, there are 10 subprojects, all of which need to call this task generateCode with a parameter that contains the location of the properties file. Also, this task should be executed before building each subproject, which can be achieved using dependsOn.
My java project code organisation:
trunk/
projA/src/java/../ObjectGen.java
projB/src/java/../properties.xml
projC/src/java/../properties.xml
projD/src/java/../properties.xml
....
A task cannot call another task. Instead, the way to solve this problem is to add a generateCode task to all ten subprojects. You can do this from the root build script with code similar to the following:
subprojects {
    apply plugin: 'java'

    configurations {
        codegen
    }

    dependencies {
        // A contains the code for the code generator
        codegen project(':A')
    }

    task generateCode(type: JavaExec) {
        main = 'jjrom.ObjectGen'
        classpath = configurations.codegen
        args 'arg1', 'arg2'
    }

    compileJava.dependsOn(generateCode)
}
If there is no general pattern as to where the properties file is located, this information can be added in the subprojects' build scripts:
generateCode {
    args file('relative/path/to/properties/file')
}