How to define a default dependsOn in a Custom Task - gradle

I'm relatively new to gradle.
In order to create an automated deployment script for a cluster, I have created a bunch of custom tasks that depend on each other. For example:
class StartSchedulerTask extends SchedulerTask {
    @TaskAction
    void start() {
        dependsOn env.nodes.name.collect { "startTomcat_$it" }
        println "Starting quartz on node: ${node}"
    }
}
In build.gradle, I have dynamically created the tasks:
project.extensions.environment.nodes.each { TomcatNode n ->
    String name = n.name
    task "nodeInit_$name"(type: DeployToNodeInitTask) {
        node(n)
    }
    task "stopWorker_$name"(type: StopWorkerTask) {
        node(n)
    }
    task "stopTomcat_$name"(type: StopTomcatTask) {
        node(n)
    }
    task "updateAppConfigs_$name"(type: UpdateAppConfigsTask) {
        node(n)
        apps(V3Application.ALL_APPS)
        buildName('develop')
    }
    task "deployWars_$name"(type: DeployWarsTask) {
        node(n)
        apps(V3Application.ALL_APPS)
        buildName('develop')
    }
    task "startTomcat_$name"(type: StartTomcatTask) {
        node(n)
    }
    task "startWorker_$name"(type: StartWorkerTask) {
        node(n)
    }
    task "terminateNode_$name"(type: DeployToNodeTerminationTask) {
        node(n)
    }
}
task stopScheduler(type: StopSchedulerTask) {
    environment(environment)
}
task startScheduler(type: StartSchedulerTask) {
    environment(environment)
}
The default task is configured to be startScheduler, which is the last step of the deployment process, the idea being that the task graph, once it is built, will take care of the correct execution order of my tasks.
However, when I print out the task graph, the only task listed is startScheduler. Am I missing something?

Task dependencies have to be declared at configuration time, not at execution time. In theory you could do so in the task's constructor, but a better approach is to do it in the build script, or in a plugin.
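A minimal sketch of the constructor approach mentioned above, assuming the same hypothetical `env`/`environment` extension and `startTomcat_*` naming convention from the question (wrapping the wiring in afterEvaluate so the extension is populated before the names are collected):

```groovy
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class StartSchedulerTask extends DefaultTask {
    StartSchedulerTask() {
        // Constructor code runs at configuration time, so these
        // dependencies make it into the task graph before execution.
        project.afterEvaluate {
            dependsOn project.extensions.environment.nodes.name.collect { "startTomcat_$it" }
        }
    }

    @TaskAction
    void start() {
        println "Starting quartz"
    }
}
```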

Thanks to the remarks of Peter Niederwieser and Jeffrey, I was able to come up with the full solution I wanted. I did not mark Peter's as the answer because the full answer is below, but his was a necessary hint toward the right solution:
I created an interface, DependencyAware:
public interface DependencyAware {
    void declareDependencies()
}
Every task that knows how to declare its dependencies, implements this interface. For example:
class StartSchedulerTask extends SchedulerTask {
    @TaskAction
    void start() {
        println "Starting quartz on node: ${node}"
    }

    void declareDependencies() {
        dependsOn env.nodes.name.collect { "startTomcat_$it" }
    }
}
In my build script:
tasks.each { Task t ->
    if (t instanceof DependencyAware) {
        t.declareDependencies()
    }
}
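One caveat worth noting (my assumption, not part of the original answer): the loop above runs at the point in the build script where it appears, so any DependencyAware task registered later would be missed. Wrapping the loop in afterEvaluate defers it until all tasks have been created:

```groovy
afterEvaluate {
    // Runs after the whole build script has been evaluated,
    // so every DependencyAware task is already registered.
    tasks.each { t ->
        if (t instanceof DependencyAware) {
            t.declareDependencies()
        }
    }
}
```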
That's it!
Thanks for the pointers Peter and Jeffrey
UPDATE 1
task deploy(dependsOn: ['backupWars', 'startScheduler'])
task stopScheduler(type: StopSchedulerTask)
task backupWars(type: BackupWarsTask)

project.extensions.targetEnvironment.nodes.each { TomcatNode n ->
    String name = n.name
    [
        ("nodeInit_$name")        : DeployToNodeInitTask,
        ("stopWorker_$name")      : StopWorkerTask,
        ("stopTomcat_$name")      : StopTomcatTask,
        ("updateAppConfigs_$name"): UpdateAppConfigsTask,
        ("deployWars_$name")      : DeployWarsTask,
        ("startTomcat_$name")     : StartTomcatTask,
        ("startWorker_$name")     : StartWorkerTask,
        ("terminateNode_$name")   : DeployToNodeTerminationTask,
    ].each { String taskName, Class taskType ->
        task "$taskName"(type: taskType) {
            node(n)
        }
    }
}

task startScheduler(type: StartSchedulerTask) {
    dryRun(testMode)
}
The internal dependencies between the different deployment steps are declared in the tasks themselves, for example:
class StartWorkerTask extends WorkerTask {
    @TaskAction
    void start() {
        println "Starting worker ${node}"
    }

    void declareDependencies() {
        dependsOn tomcatOnThisNodeHasBeenStarted()
    }

    String tomcatOnThisNodeHasBeenStarted() {
        "startTomcat_${node.name}"
    }
}
The declaration of the topology is as follows:
environments {
    prod {
        nodes {
            (1..2).each { int i ->
                "w${i}_prod" {
                    host = "prod-n$i"
                    userName = "xxxxx"
                    password = "xxxxx"
                    loadBalancer = 'lb_prod'
                    frontendJkManagerUrl = 'http://web01/jkmanager'
                }
            }
            scheduler {
                name = "w1_prod"
            }
        }
    }
    rc {
        //rc topology here
    }
}

It's because you're declaring the dependency from inside the @TaskAction method. A @TaskAction only runs once the dependency graph has already been formed.
You could abuse doFirst() inside your @TaskAction methods to call your dependent tasks, but they won't show up on the dependency graph.

Related

Stop execution of finalizedBy task, or only execute follow-up task on a condition

I'm using the com.google.cloud.tools.appengine gradle plugin, which has a task appengineDeploy.
I have two tasks that configure the appengineDeploy task before executing it. My current solution looks something like that:
task deployTest() {
    doFirst {
        appengine.deploy {
            version = 'test'
            ...
        }
    }
    finalizedBy appengineDeploy
}
task deployProduction() {
    doFirst {
        appengine.deploy {
            version = '7'
            ...
        }
    }
    finalizedBy appengineDeploy
}
Now I wanted to add a security question before the deployProduction task is executed, like this:
println "Are you sure? (y/n)"
def response = System.in.newReader().readLine()
if (response != 'y') {
    throw new GradleException('Task execution stopped')
}
The problem is, by definition a finalizedBy task is executed even if my task throws an exception, which is exactly the opposite of what I want.
I can't change it to appengineDeploy dependsOn deployTest and call appengineDeploy as I have two tasks with different configuration.
And I can't change the appengineDeploy task as it comes from a plugin.
Is there any other way I can either stop the execution of appengineDeploy, or use something other than finalizedBy to execute that task after my deploy task?
One option is to leverage onlyIf to decide whether to execute the task, for example by examining a project property.
Here's a little demo, given that appengineDeploy is a task contributed by a plugin (see comment for details):
plugins {
    id 'base'
}

ext {
    set('doDeploy', false)
}

appengineDeploy.configure {
    onlyIf {
        project.ext.doDeploy
    }
}
task deployTest() {
    doFirst {
        println 'deployTest()'
        project.ext.doDeploy = true
    }
    finalizedBy appengineDeploy
}

task deployProduction() {
    doFirst {
        println 'deployProduction()'
        println "Are you sure? (y/n)"
        def response = System.in.newReader().readLine()
        if (response == 'y') {
            project.ext.doDeploy = true
        }
    }
    finalizedBy appengineDeploy
}
Another option is to disable the task, which goes like this:
task deployProduction() {
    doFirst {
        println 'deployProduction()'
        println "Are you sure? (y/n)"
        def response = System.in.newReader().readLine()
        if (response != 'y') {
            project.tasks.appengineDeploy.enabled = false
        }
    }
    finalizedBy appengineDeploy
}
I modified the answer from thokuest a bit (thanks for the help!) to prevent executing any tasks in between.
I created it as an extension method since I needed it more than once:
ext.securityQuestion = { project, task ->
    println "Are you sure you want to execute ${project.name}:${task.name}? (y/n)"
    def response = System.in.newReader().readLine()
    if (response != 'y') {
        project.tasks.each {
            if (it != task)
                it.enabled = false
        }
        throw new GradleException("Execution of ${project.name}:${task.name} aborted")
    }
}
and my task now looks like this:
task deployProduction() { task ->
    doFirst {
        securityQuestion(this, task)
    }
    finalizedBy appengineDeploy
}

How can I reference the outputs of multiple tasks with one name?

Or in other words: How do I gather outputs of multiple tasks?
I have a project with a number of tasks whose output I want to include in the distribution.
I also have a task that depends on all of them.
How can I avoid listing all the tasks?
Example build.gradle:
plugins {
    id 'distribution'
}

task taskA0 {
    ...
}
task taskA1 {
    ...
}
task taskA2 {
    ...
}

task allA {
    dependsOn(taskA0, taskA1, taskA2)
}

distributions {
    main {
        contents {
            // this works but is tedious
            from taskA0
            from taskA1
            from taskA2
            // this doesn't work, as allA doesn't have any output
            from allA
        }
    }
}
How's this?
task allA {
    dependsOn(taskA0, taskA1, taskA2)
    outputs.files(taskA0, taskA1, taskA2)
}

distributions {
    main {
        contents {
            from allA
        }
    }
}
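Since the question also asks how to avoid listing the tasks at all, here is a sketch of an alternative that derives the list from a naming convention (the `taskA` prefix is my assumption from the example; a live tasks.matching collection should also resolve to the tasks' outputs when passed to outputs.files):

```groovy
// Collect every task whose name starts with the assumed 'taskA' prefix.
def aTasks = tasks.matching { it.name.startsWith('taskA') }

task allA {
    dependsOn aTasks
    // Expose the union of their outputs as this task's outputs,
    // so `from allA` works in the distribution spec.
    outputs.files(aTasks)
}
```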

Why gradle runs all JavaExec tasks?

I defined two tasks in my build.gradle
task a(type: JavaExec) {
}
task b(type: JavaExec) {
}
When I execute task a, b also runs. Is this normal?
gradle a
I'm sharing my own experience for others.
I am a newbie with Groovy/Gradle.
What I tried to achieve was using a shared function for getting project properties:
def projectProperty = {
    if (!project.hasProperty(it)) {
        throw new Exception...
    }
    return project.getProperty(it);
}
task a(type: JavaExec) {
    do some with projectProperty(a);
}
task b(type: JavaExec) {
    do some with projectProperty(b);
}
And I changed it like this:
task a(type: JavaExec) {
    if (project.hasProperty('a')) {
        do some with projectProperty('a');
    }
}
task b(type: JavaExec) {
    if (project.hasProperty('b')) {
        do some with projectProperty('b');
    }
}
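The underlying cause (my explanation, not part of the original answer) is that a task's configuration closure runs at configuration time for every task in the build script, no matter which task was requested on the command line; only task actions are restricted to the requested tasks. Moving the property lookup into a doFirst action defers it to execution time, as this sketch shows:

```groovy
task a(type: JavaExec) {
    // Code here runs on EVERY Gradle invocation (configuration time).
    doFirst {
        // Code here runs only when task `a` is actually executed.
        println "running a with property: ${project.getProperty('a')}"
    }
}
```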

How to re-run only failed JUnit test classes using Gradle?

Inspired by this neat TestNG task, and this SO question I thought I'd whip up something quick for re-running of only failed JUnit tests from Gradle.
But after searching around for a while, I couldn't find anything analogous that was quite as convenient.
I came up with the following, which seems to work pretty well and adds a <testTaskName>Rerun task for each task of type Test in my project.
import static groovy.io.FileType.FILES

import java.nio.file.Files
import java.nio.file.Paths

// And add a task for each test task to rerun just the failing tests
subprojects {
    afterEvaluate { subproject ->
        // Need to store tasks in static temp collection, else new tasks will be picked up by live collection leading to StackOverflow
        def testTasks = subproject.tasks.withType(Test)
        testTasks.each { testTask ->
            task "${testTask.name}Rerun"(type: Test) {
                group = 'Verification'
                description = "Re-run ONLY the failing tests from the previous run of ${testTask.name}."
                // Depend on anything the existing test task depended on
                dependsOn testTask.dependsOn
                // Copy runtime setup from existing test task
                testClassesDirs = testTask.testClassesDirs
                classpath = testTask.classpath
                // Check the output directory for failing tests
                File textXMLDir = subproject.file(testTask.reports.junitXml.destination)
                logger.info("Scanning: $textXMLDir for failed tests.")
                // Find all failed classes
                Set<String> allFailedClasses = [] as Set
                if (textXMLDir.exists()) {
                    textXMLDir.eachFileRecurse(FILES) { f ->
                        // See: http://marxsoftware.blogspot.com/2015/02/determining-file-types-in-java.html
                        String fileType
                        try {
                            fileType = Files.probeContentType(f.toPath())
                        } catch (IOException e) {
                            logger.debug("Exception when probing content type of: $f.")
                            logger.debug(e)
                            // Couldn't determine this to be an XML file. That's fine, skip this one.
                            return
                        }
                        logger.debug("Filetype of: $f is $fileType.")
                        if (['text/xml', 'application/xml'].contains(fileType)) {
                            logger.debug("Found testsuite file: $f.")
                            def testSuite = new XmlSlurper().parse(f)
                            def failedTestCases = testSuite.testcase.findAll { testCase ->
                                testCase.children().find { it.name() == 'failure' }
                            }
                            if (!failedTestCases.isEmpty()) {
                                logger.info("Found failures in file: $f.")
                                failedTestCases.each { failedTestCase ->
                                    def className = failedTestCase['@classname']
                                    logger.info("Failure: $className")
                                    allFailedClasses << className.toString()
                                }
                            }
                        }
                    }
                }
                if (!allFailedClasses.isEmpty()) {
                    // Re-run all tests in any class with any failures
                    allFailedClasses.each { c ->
                        def testPath = c.replaceAll('\\.', '/') + '.class'
                        include testPath
                    }
                    doFirst {
                        logger.warn('Re-running the following tests:')
                        allFailedClasses.each { c ->
                            logger.warn(c)
                        }
                    }
                }
                outputs.upToDateWhen { false } // Always attempt to re-run failing tests
                // Only re-run if there were any failing tests, else just print warning
                onlyIf {
                    def shouldRun = !allFailedClasses.isEmpty()
                    if (!shouldRun) {
                        logger.warn("No failed tests found for previous run of task: ${subproject.path}:${testTask.name}.")
                    }
                    return shouldRun
                }
            }
        }
    }
}
Is there any easier way to do this from Gradle? Is there any way to get JUnit to output a consolidated list of failures somehow so I don't have to slurp the XML reports?
I'm using JUnit 4.12 and Gradle 4.5.
Here is one way to do it. The full file will be listed at the end, and is available here.
Part one is to write a small file (called failures) for every failed test:
test {
    // `failures` is defined elsewhere, see below
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
In part two, we use a test filter (doc here) to restrict the tests to any that are present in the failures file:
def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    // ...
}
The full file is:
apply plugin: 'java'

repositories {
    jcenter()
}

dependencies {
    testCompile('junit:junit:4.12')
}

def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
The Test Retry Gradle plugin is designed to do exactly this. It will rerun each failed test a certain number of times, with the option of failing the build if too many failures have occurred overall.
plugins {
    id 'org.gradle.test-retry' version '1.2.1'
}

test {
    retry {
        maxRetries = 3
        maxFailures = 20 // Optional attribute
    }
}

Does the notification order of TaskExecutionListeners match the order of registration in Gradle?

Will the following always print one, two, three in order if I run gradle three?
task one {
    gradle.taskGraph.whenReady { graph ->
        if (graph.hasTask(it)) {
            println "one"
        }
    }
}
task two {
    dependsOn one
    gradle.taskGraph.whenReady { graph ->
        if (graph.hasTask(it)) {
            println "two"
        }
    }
}
task three {
    dependsOn two
    gradle.taskGraph.whenReady { graph ->
        if (graph.hasTask(it)) {
            println "three"
        }
    }
}
Yes. The closures get put into a LinkedHashMap (ultimately they go into a BroadcastDispatch) when you call whenReady, so they are notified in registration order.
There is nothing in the public API that guarantees this, though.
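Since that ordering is an implementation detail, a pattern that does not rely on it (my suggestion, not from the answer above) is to register a single whenReady listener and make the order explicit:

```groovy
gradle.taskGraph.whenReady { graph ->
    // One listener, explicit iteration order; no reliance on
    // the internal dispatch order of multiple listeners.
    [tasks.one, tasks.two, tasks.three].each { t ->
        if (graph.hasTask(t)) {
            println t.name
        }
    }
}
```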
