I'm trying to understand Gradle distributions. In the Gradle documentation, section 7.3.4, there's the following code example:
task dist(type: Zip) {
    dependsOn spiJar
    from 'src/dist'
    into('libs') {
        from spiJar.archivePath
        from configurations.runtime
    }
}
I was trying to find the method dist() to understand what exactly it does. I searched in org.gradle.api.tasks.bundling.Zip but there is no such method. So where is it declared?
Could you please provide a link to the example you mentioned?
It seems that the dist() method is not defined anywhere. The code sample you provided is just a task definition, so in this particular case dist is simply the name of the defined task of type Zip. The from and into methods come from AbstractCopyTask.
EDIT
So, as stated above, dist() is just an ordinary task definition, while, for instance, the next piece of code in the example:
artifacts {
    archives dist
}
has a dedicated method defined in the AbstractProject class:
public void artifacts(Closure configureClosure) {
    ConfigureUtil.configure(configureClosure, getArtifacts());
}
Now, why does this error:
Could not find method dist() for arguments [{type=class org.gradle.api.tasks.bundling.Zip}, txt, build_275gv6pdo8dsig251h253koq9t$_run_closure2#a81512] on project ':MP'.
occur for this input:
task dist(type: Zip, 'txt')
?
During script parsing (which is a dynamic and quite complicated process), the declaration above should be turned into an invocation of one of the create methods on the TaskContainer instance, where dist is the task name (a String) and type: Zip, 'txt' are passed as arguments. As you can see in the docs for TaskContainer, there is no create method that takes a String (dist, the task name), then a Map (type: Zip, the task configuration), and then another String (txt, a redundant/invalid argument). That's why it fails.
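For illustration (my own sketch, not the exact internal call path), the valid declaration roughly maps to one of the existing create overloads, none of which also accepts the extra 'txt' String:

tasks.create(name: 'dist', type: Zip)   // Map-based overload: create(Map)
tasks.create('dist', Zip)               // create(String, Class) overload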
If you're interested in how it works, it's a good idea to put the following piece of code in build.gradle:
task someTask {
    throw new RuntimeException()
}
and investigate the stacktrace. It will tell you a lot about how it works, step by step.
There is no dist method. In this example, you are using the Gradle DSL to create a new task called "dist" whose type is Zip.
This is accomplished through the use of "method missing". See http://groovy.codehaus.org/Using+methodMissing+and+propertyMissing for more info.
Note: if you change "dist" to "foo", this is still a valid example, but with a less self-explanatory task name.
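If you're curious about the mechanism itself, here's a rough, plain-Groovy illustration of methodMissing (this is not Gradle's actual implementation, just a sketch of the idea):

class TaskDsl {
    def tasks = [:]
    // called whenever a method that doesn't exist is invoked on this object
    def methodMissing(String name, args) {
        tasks[name] = args   // treat the "method name" as a task name
    }
}

def dsl = new TaskDsl()
dsl.dist(type: 'Zip')        // there is no dist() method; methodMissing handles it
assert dsl.tasks.containsKey('dist')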
Related
I need some guidance and clarification about how artifacts are declared in Gradle tasks. I would like to upload files to Maven/Artifactory, reading these files from some kind of output container instead of having to hardcode the path for each generated artifact.
This is simpler if you have a JAR, ZIP, File, or a few other types, because you can make use of project.artifacts.add(...), but it is not trivial when you have arbitrary file types.
I'm going to provide an example to specify and clarify exactly what I need:
if I have:
task generateMyFile {
    doLast {
        buildDir.mkdirs()
        ['touch', new File(buildDir, 'myfile.random').absolutePath].execute().waitFor()
    }
}
What's the right way to declare myfile.random as the output of generateMyFile?
How do I declare myfile.random as a valid artifact that will later be used by Maven/Artifactory via the 'publish' task? Currently I assume the file is generated in build/myfile.random, but I'm looking for a smarter solution, and the Gradle documentation (or any other source) is not clear about this.
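To be concrete, I imagine something along these lines (a purely untested sketch; the content written to the file here is just a placeholder), but I don't know whether outputs.file is the right mechanism or how to wire the result into publishing:

task generateMyFile {
    def outFile = file("$buildDir/myfile.random")
    outputs.file outFile          // declare the file as this task's output
    doLast {
        buildDir.mkdirs()
        outFile.text = 'some generated content'
    }
}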
Thanks in advance
I am trying to create 2 tasks to execute the sonarqube task. I want to be able to specify different properties depending on the task.
task sonarqubePullRequest(type: Test) {
    System.setProperty("sonar.projectName", "sonarqubePullRequest")
    System.setProperty("sonar.projectKey", "sonarqubePullRequest")
    System.setProperty("sonar.projectVersion", serviceVersion)
    System.setProperty("sonar.jacoco.reportPath", "${project.buildDir}/jacoco/test.exec")
    tasks.sonarqube.execute()
}
task sonarqubeFullScan(type: Test) {
    System.setProperty("sonar.projectName", "sonarqubeFullScan")
    System.setProperty("sonar.projectKey", "sonarqubeFullScan")
    System.setProperty("sonar.projectVersion", serviceVersion)
    System.setProperty("sonar.jacoco.reportPath", "${project.buildDir}/jacoco/test.exec")
    tasks.sonarqube.execute()
}
The tasks work, but there seems to be an issue with the properties I am setting.
If I run the first task, sonarqubePullRequest, then everything is fine, but if I run sonarqubeFullScan, it uses the values specified in sonarqubePullRequest, so the project name is set to sonarqubePullRequest.
It is as if those properties are set once at runtime and cannot be updated. I feel like I am missing something obvious; any suggestions gratefully received.
First of all: NEVER use execute() on tasks. The method is not part of the public Gradle API and therefore its behaviour can change or be undefined. Gradle will execute the tasks on its own, either because you specified them (command line or settings.gradle) or as task dependencies.
The reason why your code does not work is the difference between the configuration phase and the execution phase. In the configuration phase, all the (configuration) code in your task closures is executed, but the tasks themselves are not. So you will always overwrite the system properties. Only (internal) task actions and doFirst and doLast closures are executed in the execution phase. Please note that every task is executed only ONCE in a build, so your approach of parametrizing a task twice will never work.
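A minimal sketch of that difference (the task name is just an example):

task demo {
    // configuration phase: runs on every build invocation, whatever tasks you request
    println 'configuring demo'
    doLast {
        // execution phase: runs only when the demo task is actually executed
        println 'executing demo'
    }
}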
Also, I do not understand why you are using system properties to configure your sonarqube task. You can simply configure the task directly via:
sonarqube {
    properties {
        property 'sonar.projectName', 'sonarqubePullRequest'
        // ...
    }
}
Now you can configure the sonarqube task. To distinguish between your two cases, you can add a condition for different property values. The next example makes use of a project property as condition:
sonarqube {
    properties {
        // Same value for both cases
        property 'sonar.projectVersion', serviceVersion
        // Value based on condition
        if (project.findProperty('fullScan')) {
            property 'sonar.projectName', 'sonarqubeFullScan'
        } else {
            property 'sonar.projectName', 'sonarqubePullRequest'
        }
    }
}
Alternatively, you can add another task of type SonarQubeTask. This way, you can parametrize both tasks differently and call them (via the command line or a dependency) whenever you need them:
sonarqube {
    // Generated by the plugin, parametrize as described above
}

task sonarqubeFull(type: org.sonarqube.gradle.SonarQubeTask) {
    // Declared in your build script, parametrize in the same way
}
I have a Gradle build script that successfully builds my project and compiles all the artifacts I need.
However, in a couple of cases I'd like to give other developers the option to patch some of the files. For example, in one of the archives there's an xml file with information about database hooks - some of the devs use other versions (or even engines) and need to change these before they can use the build output.
Instead of having them make changes to a version-controlled file, which they might commit by mistake, I'd like to give them the option to have a local, individual patch file which the build script applies.
In an old ant script, we did something like this
<target name="appcontext-patch" if="applicationContext.patch.file">
    <patch patchfile="${applicationContext.patch.file}" originalfile="${dist.dir}/applicationContext.xml"/>
</target>
but I can't figure out how to do the equivalent in Gradle. Is there a better (i.e. more idiomatic) way of doing this than trying to directly convert this into a call to ant.patch?
Some context
This is how the file ends up in the archive in the first place:
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
    }
}
It would be fantabulous if I could just do something like
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
        patch 'local/applicationContext.xml.patch'
    }
}
and have the patch file applied before the file is put in the archive. I don't mind writing some code to make this possible, but I'm quite new to Gradle and I have no idea where to begin.
You should be able to translate your Ant call into Gradle pretty directly.
The Gradle documentation describes how to call Ant tasks generically: basically, attributes become named arguments and child tags become closures. The documentation has a bunch of good examples.
Once you have your translated Ant task, you can put it in a doFirst or doLast block on an appropriate task.
My first guess would be something like this:
apply plugin: 'java'

assemble.doFirst {
    ant.patch(patchfile: applicationContext.patch.file,
              originalfile: "${dist.dir}/applicationContext.xml")
}
That's untested, but I'm pretty sure it will get you started on the right path. The intent is that just before the java plugin assembles your archive, you want Gradle to call a closure. In this case, the closure performs an Ant action that patches your XML.
Alternatively, you could use the task you have above that performs the copy and tag onto that:
task myCopyTask(type: Copy) {
    ...
} << {
    ant.patch(patchfile: applicationContext.patch.file,
              originalfile: "${dist.dir}/applicationContext.xml")
}
In this case you are writing the task yourself and the left-shift operator (<<) is equivalent to .doLast but a whole lot cooler. I'm not sure which method you prefer, but if you already have a copy task that gets the file there in the first place, I think doLast keeps the relevant code blocks as close to each other as possible.
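For comparison, the same thing spelled with doLast instead of << would look roughly like this:

task myCopyTask(type: Copy) {
    ...
}

myCopyTask.doLast {
    ant.patch(patchfile: applicationContext.patch.file,
              originalfile: "${dist.dir}/applicationContext.xml")
}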
RFC 5621 defines an XML patching language that uses XPath to target the location in the document to patch. It's great for tweaking config files.
There is an open source implementation in Java (Disclaimer: I am the author). It includes a filter that can be used from Gradle to patch XML files during any task that implements CopySpec. For example:
buildscript {
    repositories { jcenter() }
    dependencies { classpath "com.github.dnault:xml-patch:0.3.0" }
}

import com.github.dnault.xmlpatch.filter.XmlPatch

task copyAndPatch(type: Copy) {
    // Patch file in RFC 5621 format
    def patchPath = 'local/applicationContext-patch.xml'
    inputs.file patchPath

    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { 'jboss-spring.xml' }
            filter(XmlPatch, patch: patchPath)
        }
    }
}
If you'd like to do this more on the fly I can think of two main techniques. Both involve writing some code, but they may be more appealing to you and I'm pretty confident gradle doesn't have this behavior built-in anywhere.
Personally I think #1 is the better solution, since you don't need to muck around with the internals of the Copy task. A custom filter feels cleaner and more reusable.
1) Write a custom filter that you specify in your copy task. I can't help with the details of how to write a custom filter, but I'd start here. You should be able to put the custom filter in buildSrc (lots of info about that at gradle.org) and then you simply need to import it at the top of your gradle file. If you write it in groovy I think you can even just use ant.patch() again.
task copyAndPatch(type: Copy) {
    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { fn -> "jboss-spring.xml" }
            filter(MyCustomFilterThatDoesAPatch, patchFile: 'local/applicationContext.xml.patch')
        }
    }
}
2) Write a custom task. Again, I'll leave the details to the experts but you can probably get away with subclassing the Copy task, adding a 'patch' property, and then jumping in during execution to do the dirty work.
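A very rough, untested sketch of what I mean (class, property names and paths are made up; it applies ant.patch to the copied file after the copy has run):

class PatchingCopy extends Copy {
    File patchFile
    File originalFile

    PatchingCopy() {
        doLast {
            // patch the file that the copy just produced
            project.ant.patch(patchfile: patchFile, originalfile: originalFile)
        }
    }
}

task copyAndPatch(type: PatchingCopy) {
    into 'build/META-INF'
    from 'deployment'
    patchFile = file('local/applicationContext.xml.patch')
    originalFile = file('build/META-INF/applicationContext.xml')
}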
I'm trying to add an installer builder to my build configuration and I'm having a little trouble getting task inputs set up properly. I have the configuration split into a separate .gradle file and I add it to my project by doing the following.
project.ext.i4jArgs = [ "--verbose" ]
apply from: rootProject.projectDir.absolutePath + "/gradle/install4j.gradle"
To build the installers I'm calling a command line tool via exec. Almost everything is based on convention, but I want to optionally add a couple arguments / switches to the command from my main build file. I do it using the project.ext.i4jArgs property (above).
If I set the project.ext.i4jArgs property before applying my install4j.gradle file, I can use the following for inputs and everything seems to work.
inputs.property("i4jArgs", project.ext.has('i4jArgs') ? project.ext.i4jArgs : null)
However, if I apply my install4j.gradle file first and set the project.ext.i4jArgs property second, the project.ext.i4jArgs property is always null when I'm declaring inputs in my task (obviously). The API for TaskInputs (here) says I can pass a closure as a value. Is there a way I can use a closure to delay the evaluation of project.ext.i4jArgs long enough to guarantee it's been initialized? I thought the following would work, but the closure never gets called:
inputs.property("i4jArgs", {
    project.afterEvaluate {
        println "has args ${project.ext.has('i4jArgs')}"
        project.ext.has('i4jArgs') ? project.ext.i4jArgs : null
    }
})
I know writing a plugin that supports all the configuration I want might be a better option for the specific example I've given, but I'd like to figure out what I'm misunderstanding here anyway.
I would remove project.afterEvaluate from the first closure. afterEvaluate is for adding a closure that gets executed after the project has been configured.
What is actually going on: when Gradle resolves the inputs, it calls the outer closure, which then calls project.afterEvaluate, which adds a closure to the list that will be called when the project is done configuring... which will never happen, because the build is already in the execution phase.
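In other words, something like this should be enough (the same check as in the question, just without the afterEvaluate wrapper); Gradle should call the closure lazily, by which time i4jArgs has been set:

inputs.property("i4jArgs", {
    project.ext.has('i4jArgs') ? project.ext.i4jArgs : null
})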
I have a multi-project C++ Gradle build which produces a number of libraries and executables. I'm trying to get the executable (but not the library) subprojects to be compiled with a 'fingerprint' object. This works fine if I sprinkle something like this in the individual subprojects' build.gradle:
compileMain.doFirst {
    // code to generate a 'BuildInfo.cpp' from a template.
    // It embeds the name of the executable, so it has to be generated anew for each exe.
}
Following DRY principles, I'd much rather do this once and for all in the top-level build.gradle. This is my attempt to apply it to just the subprojects that use the cpp-exe plugin, following these instructions:
configure(subprojects.findAll { it.plugins.hasPlugin('cpp-exe') }) {
    compileMain.doFirst {
        // same code as above
    }
}
Alas, this doesn't get triggered. However, if I put something like this in a less restrictive configure block, it demonstrates that the idea of querying the plugin should work:
configure(subprojects.findAll { true }) {
    task mydebug << {
        if (project.plugins.hasPlugin('cpp-exe')) {
            println ">>> $project.name has it!"
        }
    }
}
Could it be that the plugins haven't yet been applied to the subprojects at the time the configure closure is evaluated (in the top-level build.gradle)? Or is there a much simpler way of achieving this altogether?
You probably apply the cpp-exe plugin in the child projects' build scripts. By default, a parent build script gets evaluated before its children, which explains why it's not finding any projects that have cpp-exe applied.
There are several ways to solve this problem. One way is to move all configuration that's specific to a cpp-exe project (like applying the plugin and adding the action) to the same spot. Either you do all such configuration from the parent build script (for example by enumerating the cpp-exe subprojects and configuring them with a single configure(cppExeProjects) { ... }), or you move the cpp-exe specific configuration into its own build script (say gradle/cpp-exe.gradle) and apply it from selected subprojects like so: apply from: "$rootDir/gradle/cpp-exe.gradle".
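As a sketch of the second option (the file name and the action body are just placeholders), gradle/cpp-exe.gradle would hold the shared configuration, and each executable subproject would apply it:

// gradle/cpp-exe.gradle
apply plugin: 'cpp-exe'

compileMain.doFirst {
    // generate BuildInfo.cpp from the template here
}

// build.gradle of each executable subproject
apply from: "$rootDir/gradle/cpp-exe.gradle"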
Another solution is to change the evaluation order of build scripts. But I would only use this as a last resort, and it is certainly not necessary here.
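For completeness, the evaluation-order approach could look roughly like this in the root build.gradle (again, only if you really need it):

// force child projects to be evaluated before this (root) project
evaluationDependsOnChildren()

configure(subprojects.findAll { it.plugins.hasPlugin('cpp-exe') }) {
    compileMain.doFirst {
        // same code as above
    }
}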
Gradle 1.5 is recently out. I am not sure if this is a new feature, but it looks like you can solve the issue by using afterEvaluate.
Take a look at section 53.6.1 in http://www.gradle.org/docs/current/userguide/build_lifecycle.html
Something like:
subprojects { subProject ->
    afterEvaluate {
        if (subProject.plugins.hasPlugin('cpp-exe')) {
            println "Project $subProject.name has plugin cpp-exe"
        }
    }
}
would give you a start.