Dynamically configuring a task in a parent build.gradle

I have a multi-project C++ Gradle build which produces a number of libraries and executables. I'm trying to get the executable subprojects (but not the libraries) to be compiled with a 'fingerprint' object. This works fine if I sprinkle something like this in the individual subprojects' build.gradle:
compileMain.doFirst {
    // code to generate a 'BuildInfo.cpp' from a template.
    // embeds the name of the executable, so it has to be generated anew for each exe
}
Following DRY principles, I'd much rather do this once and for all in a top level build.gradle. This is my attempt, to apply it to just the subprojects that use the cpp-exe plugin, following these instructions:
configure(subprojects.findAll { it.plugins.hasPlugin('cpp-exe') }) {
    compileMain.doFirst {
        // same code as above
    }
}
Alas, this doesn't get triggered. However, if I put something like this in a less restrictive configure block, it demonstrates that the idea of querying the plugin should work:
configure(subprojects.findAll { true }) {
    task mydebug << {
        if (project.plugins.hasPlugin('cpp-exe')) {
            println ">>> $project.name has it!"
        }
    }
}
Could it be that the plugins aren't yet applied to the subprojects at the time the configure closure is evaluated (in the top-level build.gradle)? Or is there a much simpler way of achieving this altogether?

You probably apply the cpp-exe plugin in the child projects' build scripts. By default, a parent build script gets evaluated before its children, which explains why it's not finding any projects that have cpp-exe applied.
There are several ways to solve this problem. One way is to move all configuration that's specific to a cpp-exe project (like applying the plugin and adding the action) to the same spot. Either you do all such configuration from the parent build script (for example by enumerating the cpp-exe subprojects and configuring them with a single configure(cppExeProjects) { ... }), or you move the cpp-exe specific configuration into its own build script (say gradle/cpp-exe.gradle) and apply it from selected subprojects like so: apply from: "$rootDir/gradle/cpp-exe.gradle".
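A rough sketch of the second option (the script name follows the suggestion above; the BuildInfo generation itself stays whatever it is in the question, and this assumes compileMain is available as soon as the plugin is applied, as in the question's per-project snippets):
// gradle/cpp-exe.gradle -- shared configuration for executable subprojects
apply plugin: 'cpp-exe'

compileMain.doFirst {
    // generate BuildInfo.cpp from the template here, as in the question
}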
Another solution is to change the evaluation order of build scripts. But I would only use this as a last resort, and it is certainly not necessary here.
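For reference, one way to change the evaluation order from the parent build script would be something like the following, but again, it should not be needed here:
// forces the child projects to be evaluated before the rest of this (parent) build script
evaluationDependsOnChildren()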

Gradle 1.5 is recently out; I am not sure whether this is a new feature, but it looks like you can solve the issue by using afterEvaluate.
Take a look at section 53.6.1 in http://www.gradle.org/docs/current/userguide/build_lifecycle.html
Something like:
subprojects { subProject ->
    afterEvaluate {
        if (subProject.plugins.hasPlugin('cpp-exe')) {
            println "Project $subProject.name has plugin cpp-exe"
        }
    }
}
would give you a start.
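Building on that, here is a sketch of how the fingerprint action from the question could be attached from the parent build script (assuming the compile task is still called compileMain, as in the question):
subprojects { subProject ->
    afterEvaluate {
        if (subProject.plugins.hasPlugin('cpp-exe')) {
            subProject.tasks.getByName('compileMain').doFirst {
                // generate BuildInfo.cpp from the template, as in the question
            }
        }
    }
}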

Related

Gradle distribution plugin: conditionally copy assets

I'm packaging my Java application using Gradle's Distribution plugin. I want to make two distributions: one that doesn't include a JRE and another that bundles a JRE with the app.
I've set up a copyJre task and wanted the Distribution plugin to include a folder (jre-8 in the example below) only when the copyJre task is in the task graph. Here's my attempt, which doesn't work:
distributions {
    main {
        contents {
            from('/') {
                include 'tools/**'
            }
            // my attempt to conditionally copy the
            // jre-8 directory only when the task graph
            // contains a task named 'copyJre'
            if (tasks.findByName('copyJre') != null) {
                from('../../jre-dist/') {
                    include 'jre-8/**'
                }
            }
        }
    }
}
There is probably a better approach in general; this feels like a kludge.
From a Gradle perspective, you are better off expressing what you need the other way around:
Create a different distribution that will include the JRE, possibly extracting the common part of the copy spec.
And if you really only want a single output, make it replace the default distribution after building it.
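A minimal sketch of the first suggestion, with the common part of the copy spec factored out (the withJre distribution name is made up for illustration):
def commonContents = copySpec {
    from('/') {
        include 'tools/**'
    }
}

distributions {
    main {
        contents {
            with commonContents
        }
    }
    withJre {
        contents {
            with commonContents
            from('../../jre-dist/') {
                include 'jre-8/**'
            }
        }
    }
}
The plugin then adds tasks such as withJreDistZip alongside the default distZip, so either variant can be built on demand.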

Cannot have different system property values for different tasks

I am trying to create two tasks that execute the sonarqube task, and I want to be able to specify different properties depending on which task is run.
task sonarqubePullRequest(type: Test) {
    System.setProperty("sonar.projectName", "sonarqubePullRequest")
    System.setProperty("sonar.projectKey", "sonarqubePullRequest")
    System.setProperty("sonar.projectVersion", serviceVersion)
    System.setProperty("sonar.jacoco.reportPath", "${project.buildDir}/jacoco/test.exec")
    tasks.sonarqube.execute()
}
task sonarqubeFullScan(type: Test) {
    System.setProperty("sonar.projectName", "sonarqubeFullScan")
    System.setProperty("sonar.projectKey", "sonarqubeFullScan")
    System.setProperty("sonar.projectVersion", serviceVersion)
    System.setProperty("sonar.jacoco.reportPath", "${project.buildDir}/jacoco/test.exec")
    tasks.sonarqube.execute()
}
The tasks work, but there seems to be an issue with the properties I am setting.
If I run the first task, sonarqubePullRequest, then everything is fine, but if I run sonarqubeFullScan it uses the values specified in sonarqubePullRequest, so the project name is set to sonarqubePullRequest.
It is as if those properties are set at run time and cannot be updated. I feel like I am missing something obvious; any suggestions gratefully received.
First of all: NEVER use execute() on tasks. The method is not part of the public Gradle API and therefore its behaviour can change or be undefined. Gradle will execute the tasks on its own, either because you specified them (command line or settings.gradle) or as task dependencies.
The reason why your code does not work is the difference between the configuration phase and the execution phase. In the configuration phase, all the (configuration) code in your task closures is executed, but not the tasks themselves. So you'll always overwrite the system properties. Only (internal) task actions and doFirst and doLast closures are executed in the execution phase. Please note that every task is executed only ONCE in a build, so your approach of parametrizing a task twice will never work.
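The difference is easy to see with a toy task (the task name here is made up): the body of the configuration closure runs whenever the build is configured, while only the doLast action runs when the task itself is executed.
task phaseDemo {
    // configuration phase: printed on every build invocation, even for unrelated tasks
    println 'configuring phaseDemo'
    doLast {
        // execution phase: printed only when phaseDemo is actually run
        println 'executing phaseDemo'
    }
}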
Also, I do not understand why you are using system properties to configure your sonarqube task. You can simply configure the task directly via:
sonarqube {
    properties {
        property 'sonar.projectName', 'sonarqubePullRequest'
        // ...
    }
}
Now you can configure the sonarqube task. To distinguish between your two cases, add a condition for different property values; the next example uses a project property as the condition:
sonarqube {
    properties {
        // Same value for both cases
        property 'sonar.projectVersion', serviceVersion
        // Value based on condition
        if (project.findProperty('fullScan')) {
            property 'sonar.projectName', 'sonarqubeFullScan'
        } else {
            property 'sonar.projectName', 'sonarqubePullRequest'
        }
    }
}
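With that in place, the full-scan values can be selected from the command line (fullScan is just the example property name used above):
gradle sonarqube                    # pull request analysis
gradle sonarqube -PfullScan=true    # full scan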
Alternatively, you can add another task of the type SonarQubeTask. This way, you could parametrize both tasks differently and call them (via command line or dependency) whenever you need them:
sonarqube {
    // generated by the plugin, parametrize as described above
}
task sonarqubeFull(type: org.sonarqube.gradle.SonarQubeTask) {
    // added by your build script, parametrize in the same way
}

gradle: how do I list tasks introduced by a certain plugin

Probably a simple question but I can't find a way to list which tasks are introduced by the plugins that get applied in a build.gradle file.
So, say that your build.gradle is simply:
apply plugin: 'java'
is there a simple way to make gradle list all the tasks introduced by that plugin?
PS: that would come in handy in the case of messy and large build files with dozens of applied plugins.
PS2: I'm not asking about the dependencies of the tasks. My question is different and quite clear. Each plugin that I apply introduces some tasks of its own (never mind what depends on what). The question is which are the newly introduced tasks in the first place?
I'm afraid it is not possible, because of the nature of how Gradle plugins are applied.
If you take a look at the Plugin interface, you will see it has a single apply(Project p) method. A plugin's responsibility is to configure a project: it can add specific tasks / configurations / etc. For example, Gradle's JavaPlugin is stateless, so you can't get the tasks back from it.
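For reference, the entire contract a plugin implements is this single-method interface (org.gradle.api.Plugin), which reports nothing back about what it added:
interface Plugin<T> {
    // configures the given target object (usually a Project); nothing is returned
    void apply(T target)
}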
The only solution that comes to mind is to get a difference of tasks after the plugin is applied:
build.gradle
def tasksBefore = [], tasksAfter = []
project.tasks.each { tasksBefore.add(it.name) } // get all tasks
apply(plugin: 'idea') // apply plugin
project.tasks.each { tasksAfter.add(it.name) } // get all tasks
tasksAfter.removeAll(tasksBefore); // get the difference
println 'idea tasks: ' + tasksAfter;
This will print tasks that were added by Idea plugin:
idea tasks: [cleanIdea, cleanIdeaModule, cleanIdeaProject,
cleanIdeaWorkspace, idea, ideaModule, ideaProject, ideaWorkspace]
You can play a bit with this code and build an acceptable solution.
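If you need this for several plugins, the same trick can be wrapped in a small helper; the method name here is made up for illustration:
// hypothetical helper: apply a plugin by id and report which tasks it added
def applyAndListTasks(Project p, String pluginId) {
    def before = new HashSet(p.tasks.names)
    p.apply plugin: pluginId
    def added = new HashSet(p.tasks.names)
    added.removeAll(before)
    println "$pluginId tasks: $added"
}

applyAndListTasks(project, 'idea')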
In some cases the originating plugin can be guessed from a task's group and name, for example:
tasks.findAll { it.group == 'verification' && it.name.startsWith('jacoco') }.each { task ->
    println(task.name)
}

How do I apply a patch file in Gradle?

I have a Gradle build script that successfully builds my project and compiles all the artifacts I need.
However, in a couple of cases I'd like to give other developers the option to patch some of the files. For example, in one of the archives there's an xml file with information about database hooks - some of the devs use other versions (or even engines) and need to change these before they can use the build output.
Instead of having them make changes to a version-controlled file, which they might commit by mistake, I'd like to give them the option to have a local, individual patch file which the build script applies.
In an old ant script, we did something like this
<target name="appcontext-patch" if="applicationContext.patch.file">
    <patch patchfile="${applicationContext.patch.file}" originalfile="${dist.dir}/applicationContext.xml"/>
</target>
but I can't figure out how to do the equivalent in Gradle. Is there a better (i.e. more idiomatic) way of doing this than trying to directly convert this into a call to ant.patch?
Some context
This is how the file ends up in the archive in the first place:
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
    }
}
It would be fantabulous if I could just do something like
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
        patch 'local/applicationContext.xml.patch'
    }
}
and have the patch file applied before the file is put in the archive. I don't mind writing some code to make this possible, but I'm quite new to Gradle and I have no idea where to begin.
You should be able to translate your ant call into gradle pretty directly.
The Gradle documentation describes how to do this generically: basically, attributes become named arguments and child tags become closures. The documentation has a bunch of good examples.
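For instance, a nested Ant call in a Gradle script generally ends up looking something like this (the zip example is only an illustration of the translation pattern, not part of the patching itself):
// Ant attributes become named arguments,
// nested Ant elements become nested closures
ant.zip(destfile: 'build/docs.zip') {
    fileset(dir: 'docs') {
        include(name: '**/*.md')
    }
}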
Once you have your translated Ant task, you can put it in a doFirst or doLast block on an appropriate task.
My first guess would be something like this:
apply plugin: 'java'

assemble.doFirst {
    // assumes the patch file location and dist dir are supplied as project properties,
    // mirroring the Ant properties from the question
    ant.patch(patchfile: project.property('applicationContext.patch.file'),
              originalfile: "${project.property('dist.dir')}/applicationContext.xml")
}
That's untested, but I'm pretty sure it will get you started on the right path. The intent is that just before the Java plugin assembles your archive, you want Gradle to call a closure. In this case the closure performs an Ant action that patches your XML.
Alternatively, you could use the task you have above that performs the copy and tag onto that:
task myCopyTask(type: Copy) {
    ...
} << {
    ant.patch(patchfile: project.property('applicationContext.patch.file'),
              originalfile: "${project.property('dist.dir')}/applicationContext.xml")
}
In this case you are writing the task yourself and the left-shift operator (<<) is equivalent to .doLast but a whole lot cooler. I'm not sure which method you prefer, but if you already have a copy task that gets the file there in the first place, I think doLast keeps the relevant code blocks as close to each other as possible.
RFC 5261 defines an XML patching language that uses XPath to target the location in the document to patch. It's great for tweaking config files.
There is an open source implementation in Java (Disclaimer: I am the author). It includes a filter that can be used from Gradle to patch XML files during any task that implements CopySpec. For example:
buildscript {
    repositories { jcenter() }
    dependencies { classpath "com.github.dnault:xml-patch:0.3.0" }
}

import com.github.dnault.xmlpatch.filter.XmlPatch

task copyAndPatch(type: Copy) {
    // Patch file in RFC 5261 format
    def patchPath = 'local/applicationContext-patch.xml'
    inputs.file patchPath

    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { 'jboss-spring.xml' }
            filter(XmlPatch, patch: patchPath)
        }
    }
}
If you'd like to do this more on the fly I can think of two main techniques. Both involve writing some code, but they may be more appealing to you and I'm pretty confident gradle doesn't have this behavior built-in anywhere.
Personally I think #1 is the better solution, since you don't need to muck around with the internals of the Copy task. A custom filter feels cleaner and more reusable.
1) Write a custom filter that you specify in your copy task. I can't help with the details of how to write a custom filter, but I'd start here. You should be able to put the custom filter in buildSrc (lots of info about that at gradle.org) and then you simply need to import it at the top of your gradle file. If you write it in groovy I think you can even just use ant.patch() again.
task copyAndPatch(type: Copy) {
    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { fn -> "jboss-spring.xml" }
            filter(MyCustomFilterThatDoesAPatch, patchFile: 'local/applicationContext.xml.patch')
        }
    }
}
2) Write a custom task. Again, I'll leave the details to the experts but you can probably get away with subclassing the Copy task, adding a 'patch' property, and then jumping in during execution to do the dirty work.
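Whichever route you take, the patching itself can stay close to the original ant.patch call. Here is a rough sketch of the custom-task flavour (the task and path names are made up, and it assumes a patch command is available on the system, which is what Ant's patch task shells out to):
// stage a patched copy of the file, then let the archive pick it up
task patchAppContext {
    def original = file('deployment/applicationContext.xml')
    def patchFile = file('local/applicationContext.xml.patch') // optional, per developer
    def patched = file("$buildDir/patched/applicationContext.xml")

    inputs.file original
    outputs.file patched

    doLast {
        patched.parentFile.mkdirs()
        patched.text = original.text
        if (patchFile.exists()) {
            ant.patch(patchfile: patchFile, originalfile: patched)
        }
    }
}

// then, in the archive task's copy spec:
// into('META-INF') {
//     from patchAppContext
//     rename { 'jboss-spring.xml' }
// }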

Gradle - how to set up-to-date parameters on a predefined task?

I could really use some help with this!
The Gradle docs say that to make the up-to-date logic function, you just do this:
task transform {
    ext.srcFile = file('mountains.xml')
    ext.destDir = new File(buildDir, 'generated')
    inputs.file srcFile
    outputs.dir destDir
}
This is all well and good for tasks you are defining. However, I am using the eclipse plugin to do some modification to the .classpath file. Up-to-date does not work. That is, it runs the task over and over again out of the box (at least for me). Here is what I have:
eclipse {
    classpath {
        //eclipseClasspath.inputs.file  // something like this??? but what to set it to?
        //eclipseClasspath.outputs.file // here too
        file {
            withXml {
                def node = it.asNode()
                // rest of my stuff here
            }
        }
    }
}
I tried a couple of things where I have the two commented-out lines. Since those didn't work, I realized I didn't really have a clue and could use some help! Thanks in advance!
In my experience, the Eclipse tasks should not rerun every single time. That makes me think you are doing something that causes either the inputs or the outputs to change. If you are modifying your Eclipse project after Gradle generates it, or changing dependencies, etc., the up-to-date checks would naturally see those changes and the tasks would run again.
If you really do need to force the task to be considered up to date, you might be able to get it to work with this. I'm not sure if I've ever tried using it when other outputs are already defined.
eclipseClasspath {
    outputs.upToDateWhen { true } // there isn't an equivalent for inputs
}
One important note is that what you were using is the Eclipse model that describes your project, not the actual task itself:
eclipse { // this is the eclipse model
    classpath {
    }
}

eclipseClasspath {
    // this is a task
}

eclipseProject {
    // this is a task
}
