Task running unnecessarily - gradle

I have written a task to run my project using a main class chosen via user input, but it also prompts me to choose a main class when I run "gradle tasks". Why is this, and how do I prevent it?
task run(dependsOn: "classes", type: JavaExec) {
    description "Executes the project using the selected main class"
    def selection = null
    def mainClasses = []
    // Select the java files with main classes in
    sourceSets.main.allJava.each {
        if(it.text.contains("public static void main")) {
            def pkg = relativePath(it) - 'src/main/java/' - '.java'
            pkg = pkg.tr "/", "."
            println "${mainClasses.size()}. $pkg"
            mainClasses << pkg
        }
    }
    // Now prompt the user to choose a main class to use
    while(selection == null) {
        def input = System.console().readLine "#? "
        if(input?.isInteger()) {
            selection = input as int
            if(selection >= 0 && selection < mainClasses.size()) {
                break
            } else {
                selection = null
            }
        } else if(input?.toLowerCase() == "quit") {
            return
        }
        if(selection == null) {
            println "Unknown option."
        }
    }
    main = mainClasses[selection]
    classpath = sourceSets.main.runtimeClasspath
}

Gradle has a configuration phase and an execution phase.
Your build logic runs when you call "gradle tasks" because it sits in the task's configuration block, and the configuration phase runs for every Gradle invocation. If you want it to run only at execution time, move it into a doFirst or doLast closure.
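For example, here is a minimal sketch of the task with the selection logic moved into a doFirst block, so the prompt only appears when run itself is executed (the input-validation loop is trimmed for brevity):

task run(dependsOn: 'classes', type: JavaExec) {
    description 'Executes the project using the selected main class'
    classpath = sourceSets.main.runtimeClasspath
    doFirst {
        // Runs at execution time only, so "gradle tasks" no longer prompts
        def mainClasses = []
        sourceSets.main.allJava.each {
            if (it.text.contains('public static void main')) {
                def pkg = (relativePath(it) - 'src/main/java/' - '.java').tr('/', '.')
                println "${mainClasses.size()}. $pkg"
                mainClasses << pkg
            }
        }
        def input = System.console().readLine('#? ')
        // ... validate the input as in the original loop ...
        main = mainClasses[input as int]
    }
}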
See the Gradle build script basics documentation for more details, or this post.

Related

Stop execution of finalizedBy task, or only execute follow-up task on a condition

I'm using the com.google.cloud.tools.appengine gradle plugin, which has a task appengineDeploy.
I have two tasks that configure the appengineDeploy task before executing it. My current solution looks something like this:
task deployTest() {
    doFirst {
        appengine.deploy {
            version = 'test'
            ...
        }
    }
    finalizedBy appengineDeploy
}

task deployProduction() {
    doFirst {
        appengine.deploy {
            version = '7'
            ...
        }
    }
    finalizedBy appengineDeploy
}
Now I wanted to add a security question before the deployProduction task is executed, like this:
println "Are you sure? (y/n)"
def response = System.in.newReader().readLine()
if (response != 'y') {
throw new GradleException('Task execution stopped')
}
The problem is, by definition the finalizedBy task is executed even if my task throws an exception, and that is exactly the opposite of what I want.
I can't change it to appengineDeploy dependsOn deployTest and call appengineDeploy directly, because I have two tasks with different configurations.
And I can't change the appengineDeploy task as it comes from a plugin.
Is there any other way I can either stop the execution of appengineDeploy, or use something other than finalizedBy to execute that task after my deploy task?
One option is to leverage onlyIf to decide whether to execute the task, for example by examining a project property.
Here's a little demo, given that appengineDeploy is a task contributed by a plugin (see comment for details):
plugins {
    id 'base'
}

ext {
    set('doDeploy', false)
}

appengineDeploy.configure {
    onlyIf {
        project.ext.doDeploy
    }
}

task deployTest() {
    doFirst {
        println 'deployTest()'
        project.ext.doDeploy = true
    }
    finalizedBy appengineDeploy
}

task deployProduction() {
    doFirst {
        println 'deployProduction()'
        println "Are you sure? (y/n)"
        def response = System.in.newReader().readLine()
        if (response == 'y') {
            project.ext.doDeploy = true
        }
    }
    finalizedBy appengineDeploy
}
Another option is to disable the task, which goes like this:
task deployProduction() {
    doFirst {
        println 'deployProduction()'
        println "Are you sure? (y/n)"
        def response = System.in.newReader().readLine()
        if (response != 'y') {
            project.tasks.appengineDeploy.enabled = false
        }
    }
    finalizedBy appengineDeploy
}
I modified the answer from thokuest a bit (thanks for the help!) to prevent executing any tasks in between.
I created it as an extension method since I needed it more than once:
ext.securityQuestion = { project, task ->
    println "Are you sure you want to execute ${project.name}:${task.name}? (y/n)"
    def response = System.in.newReader().readLine()
    if (response != 'y') {
        project.tasks.each {
            if (it != task)
                it.enabled = false
        }
        throw new GradleException("Execution of ${project.name}:${task.name} aborted")
    }
}
and my task now looks like this:
task deployProduction() { task ->
    doFirst {
        securityQuestion(this, task)
    }
    finalizedBy appengineDeploy
}

Gradle zip task with lazy include property includes itself

Hi, I've got this zip task which works great:
def dir = new File("${projectDir.parentFile}/test/")

task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include 'toast/**'
    archiveFileName = 'test.zip'
}
but then when I make the include property lazy (because I need to in my real case)
def dir = new File("${projectDir.parentFile}/test/")

task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include {
        'toast/**'
    }
    archiveFileName = 'test.zip'
}
then it creates a zip that includes everything in the folder (so the generated archive too). In this test case the inner zip is just corrupted (it doesn't recurse infinitely), but in the real-world case it does produce an infinite zip (not sure why; maybe my test case has too few or too-small files). Either way, the test case shows the problem: the generated zip contains a zip even though it should only contain the toast directory and all of its content.
How do I fix this? I need a lazy include because the directory I want to include is computed by other tasks. I get the exact same problem with Tar except it refuses to create the archive since it includes itself.
Using exclude '*.zip' is a dumb workaround which makes the archive include other folders I don't want. I only want to include a specific folder, lazily.
Here's what the monster looks like in the real-world case. I basically need to retrieve the version of the project from Java and then use that version to name the folders I'm packaging (I'm making a libGDX game and packaging it with a JRE using packr). The problematic tasks are 'makeArchive_' + platform.
String jumpaiVersion;

task fetchVersion(type: JavaExec) {
    outputs.upToDateWhen { jumpaiVersion != null }
    main = 'net.jumpai.Version'
    classpath = sourceSets.main.runtimeClasspath
    standardOutput new ByteArrayOutputStream()
    doLast {
        jumpaiVersion = standardOutput.toString().replaceAll("\\s+", "")
    }
}

def names = [
    'win64' : "Jumpai-%%VERSION%%-Windows-64Bit",
    'win32' : "Jumpai-%%VERSION%%-Windows-32Bit",
    'linux64' : "Jumpai-%%VERSION%%-Linux-64Bit",
    'linux32' : "Jumpai-%%VERSION%%-Linux-32Bit",
    'mac' : "Jumpai-%%VERSION%%-Mac.app"
]
def platforms = names.keySet() as String[]
def jdks = [
    'win64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_x64.zip',
    'win32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_i686.zip',
    'linux64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_x64.tar.gz',
    'linux32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_i686.tar.gz',
    'mac' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-macosx_x64.zip'
]
def formats = [
    'win64' : 'ZIP',
    'win32' : 'ZIP',
    'linux64' : 'TAR_GZ',
    'linux32' : 'TAR_GZ',
    'mac' : 'ZIP'
]

File jdksDir = new File(project.buildscript.sourceFile.parentFile.parentFile, 'out/jdks')
File gameJar = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.jar")
File gameData = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.data")
File packrDir = new File("${projectDir.parentFile}/out/packr/")
File minimalTmpDir = new File("${projectDir.parentFile}/desktop/build/libs/minimal-tmp")

task minimizeGameJar {
    dependsOn ':desktop:dist'
    doFirst {
        minimalTmpDir.mkdirs()
        copy {
            from zipTree(gameJar)
            into minimalTmpDir
        }
        for(file in minimalTmpDir.listFiles())
            if(file.getName().contains("humble"))
                file.delete()
    }
}

task makeMinimal(type: Zip) {
    dependsOn minimizeGameJar
    dependsOn fetchVersion
    from minimalTmpDir
    include '**'
    archiveFileName = provider {
        "Jumpai-${->jumpaiVersion}-Minimal.jar"
    }
    destinationDir packrDir
    doLast {
        minimalTmpDir.deleteDir()
    }
}

task copyGameJar(type: Copy) {
    outputs.upToDateWhen { gameData.exists() }
    dependsOn ':desktop:dist'
    from gameJar.getAbsolutePath()
    into gameData.getParentFile()
    rename("Jumpai.jar", "Jumpai.data")
}

task setWindowsIcons(type: Exec) {
    dependsOn fetchVersion
    workingDir '.'
    commandLine 'cmd', '/c', 'set_windows_icons.bat', "${->jumpaiVersion}"
}

for(platform in platforms) {
    task("getJdk_" + platform) {
        String url = jdks[platform]
        File jdkDir = new File(jdksDir, platform + "-jdk")
        File jdkFile = new File(jdkDir, url.split("/").last())
        outputs.upToDateWhen { jdkFile.exists() }
        doFirst {
            if(!jdkDir.exists())
                jdkDir.mkdirs()
            if(jdkFile.exists())
            {
                println jdkFile.getName() + " is already present"
                return
            }
            else
            {
                println "Downloading " + jdkFile.getName()
                new URL(url).withInputStream {
                    i -> jdkFile.withOutputStream { it << i }
                }
            }
            for(file in jdkDir.listFiles()) {
                if(file.equals(jdkFile))
                    continue
                if(file.isFile()) {
                    if (!file.delete())
                        println "ERROR: could not delete " + file.getAbsoluteFile()
                } else if(!file.deleteDir())
                    println "ERROR: could not delete content of " + file.getAbsoluteFile()
            }
            if(url.endsWith(".tar.gz")) // don't mix up archive type of what we downloaded vs archive type of what we compress (in formats)
            {
                copy {
                    from tarTree(resources.gzip(jdkFile))
                    into jdkDir
                }
            }
            else if(url.endsWith(".zip"))
            {
                copy {
                    from zipTree(jdkFile)
                    into jdkDir
                }
            }
        }
    }

    File packrInDir = new File(packrDir, platform)
    String platformRawName = names[platform]

    task("packr_" + platform, type: JavaExec) {
        outputs.upToDateWhen { new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion)).exists() }
        dependsOn fetchVersion
        dependsOn copyGameJar
        dependsOn 'getJdk_' + platform
        main = 'com.badlogicgames.packr.Packr'
        classpath = sourceSets.main.runtimeClasspath
        args 'tools/res/packr_config/' + platform + '.json'
        workingDir = project.buildscript.sourceFile.parentFile.parentFile
        doLast {
            File packrOutDir = new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion));
            packrOutDir.deleteDir()
            if(packrOutDir.exists())
            {
                println "ERROR Could not delete packr output " + packrOutDir.getAbsolutePath()
                return
            }
            if(!packrInDir.renameTo(packrOutDir))
                println "ERROR Could not rename packr output dir for " + packrInDir.getName()
        }
    }

    if(formats[platform] == 'ZIP')
    {
        task('makeArchive_' + platform, type: Zip) {
            if(platform.contains("win"))
                dependsOn setWindowsIcons
            dependsOn fetchVersion
            dependsOn 'packr_' + platform
            from packrDir
            destinationDirectory = packrDir
            include {
                platformRawName.replace("%%VERSION%%", jumpaiVersion) + "/"
            }
            archiveFileName = provider {
                platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
            }
        }
    }
    else if(formats[platform] == 'TAR_GZ')
    {
        task('makeArchive_' + platform, type: Tar) {
            dependsOn 'packr_' + platform
            from packrDir
            destinationDirectory = packrDir
            include {
                platformRawName.replace("%%VERSION%%", jumpaiVersion) + '/**'
            }
            archiveFileName = provider {
                platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".tar.gz"
            }
            extension 'tar'
            compression = Compression.GZIP
        }
    }
    else
        println 'Unsupported format for ' + platform
}

task deploy {
    dependsOn makeMinimal
    for(platform in platforms)
        dependsOn 'makeArchive_' + platform
}
You can get what you want by using the doFirst method and modifying the task's properties in the passed action. (The closure overload of include is evaluated as a spec against every file, and a non-empty string return value counts as true for each of them, which is presumably why everything, including the archive itself, ended up in the zip.)
task('makeArchive_' + platform, type: Zip) {
    if(platform.contains("win"))
        dependsOn setWindowsIcons
    dependsOn fetchVersion
    dependsOn 'packr_' + platform
    from packrDir
    destinationDirectory = packrDir
    archiveFileName = provider {
        platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
    }
    doFirst {
        def includeDir = platformRawName.replace("%%VERSION%%", jumpaiVersion)
        // Include only files and directories from 'includeDir'
        include {
            it.relativePath.segments[0].equalsIgnoreCase(includeDir)
        }
    }
}
Please also have a look at this answer to a similar question. My solution is just a workaround. If you know your version at configuration time, you can achieve what you want more easily. Writing your own custom tasks or plugins can also help clean up your build script.
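For illustration only, assuming the version were already known at configuration time (for example read from a gradle.properties entry, which the original build does not do), the include could be set eagerly and the archive would no longer pick itself up:

// Hypothetical: version resolved at configuration time instead of by fetchVersion
def jumpaiVersion = project.findProperty('jumpaiVersion') ?: '0.0.0'

task('makeArchive_' + platform, type: Zip) {
    dependsOn 'packr_' + platform
    from packrDir
    destinationDirectory = packrDir
    // Eager include: only the packr output folder for this platform
    include platformRawName.replace('%%VERSION%%', jumpaiVersion) + '/**'
    archiveFileName = platformRawName.replace('%%VERSION%%', jumpaiVersion) + '.zip'
}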

How to re-run only failed JUnit test classes using Gradle?

Inspired by this neat TestNG task and this SO question, I thought I'd whip up something quick for re-running only the failed JUnit tests from Gradle.
But after searching around for a while, I couldn't find anything analogous that was quite as convenient.
I came up with the following, which seems to work pretty well and adds a <testTaskName>Rerun task for each task of type Test in my project.
import static groovy.io.FileType.FILES

import java.nio.file.Files
import java.nio.file.Paths

// And add a task for each test task to rerun just the failing tests
subprojects {
    afterEvaluate { subproject ->
        // Need to store tasks in static temp collection, else new tasks will be picked up by live collection leading to StackOverflow
        def testTasks = subproject.tasks.withType(Test)
        testTasks.each { testTask ->
            task "${testTask.name}Rerun"(type: Test) {
                group = 'Verification'
                description = "Re-run ONLY the failing tests from the previous run of ${testTask.name}."
                // Depend on anything the existing test task depended on
                dependsOn testTask.dependsOn
                // Copy runtime setup from existing test task
                testClassesDirs = testTask.testClassesDirs
                classpath = testTask.classpath
                // Check the output directory for failing tests
                File textXMLDir = subproject.file(testTask.reports.junitXml.destination)
                logger.info("Scanning: $textXMLDir for failed tests.")
                // Find all failed classes
                Set<String> allFailedClasses = [] as Set
                if (textXMLDir.exists()) {
                    textXMLDir.eachFileRecurse(FILES) { f ->
                        // See: http://marxsoftware.blogspot.com/2015/02/determining-file-types-in-java.html
                        String fileType
                        try {
                            fileType = Files.probeContentType(f.toPath())
                        } catch (IOException e) {
                            logger.debug("Exception when probing content type of: $f.")
                            logger.debug(e)
                            // Couldn't determine this to be an XML file. That's fine, skip this one.
                            return
                        }
                        logger.debug("Filetype of: $f is $fileType.")
                        if (['text/xml', 'application/xml'].contains(fileType)) {
                            logger.debug("Found testsuite file: $f.")
                            def testSuite = new XmlSlurper().parse(f)
                            def failedTestCases = testSuite.testcase.findAll { testCase ->
                                testCase.children().find { it.name() == 'failure' }
                            }
                            if (!failedTestCases.isEmpty()) {
                                logger.info("Found failures in file: $f.")
                                failedTestCases.each { failedTestCase ->
                                    def className = failedTestCase['@classname']
                                    logger.info("Failure: $className")
                                    allFailedClasses << className.toString()
                                }
                            }
                        }
                    }
                }
                if (!allFailedClasses.isEmpty()) {
                    // Re-run all tests in any class with any failures
                    allFailedClasses.each { c ->
                        def testPath = c.replaceAll('\\.', '/') + '.class'
                        include testPath
                    }
                    doFirst {
                        logger.warn('Re-running the following tests:')
                        allFailedClasses.each { c ->
                            logger.warn(c)
                        }
                    }
                }
                outputs.upToDateWhen { false } // Always attempt to re-run failing tests
                // Only re-run if there were any failing tests, else just print warning
                onlyIf {
                    def shouldRun = !allFailedClasses.isEmpty()
                    if (!shouldRun) {
                        logger.warn("No failed tests found for previous run of task: ${subproject.path}:${testTask.name}.")
                    }
                    return shouldRun
                }
            }
        }
    }
}
Is there any easier way to do this from Gradle? Is there any way to get JUnit to output a consolidated list of failures somehow so I don't have to slurp the XML reports?
I'm using JUnit 4.12 and Gradle 4.5.
Here is one way to do it. The full file will be listed at the end, and is available here.
Part one is to write a small file (called failures) for every failed test:
test {
    // `failures` is defined elsewhere, see below
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
In part two, we use a test filter (doc here) to restrict the tests to any that are present in the failures file:
def failures = new File("${projectDir}/failures.log")
def failedTests = []

if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    // ...
}
The full file is:
apply plugin: 'java'

repositories {
    jcenter()
}

dependencies {
    testCompile('junit:junit:4.12')
}

def failures = new File("${projectDir}/failures.log")
def failedTests = []

if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }

    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
The Test Retry Gradle plugin is designed to do exactly this. It will rerun each failed test a certain number of times, with the option of failing the build if too many failures have occurred overall.
plugins {
    id 'org.gradle.test-retry' version '1.2.1'
}

test {
    retry {
        maxRetries = 3
        maxFailures = 20 // Optional attribute
    }
}

Gradle task doLast if task fails

I want a clean build output where you can see exactly what happened, but with all information preserved - so essentially, for every task, I want it to write its output to a file and only display it if the task fails.
I've tried to achieve this in Gradle, but am being defeated because doLast doesn't run if the task fails. Here's my "almost" working pattern:
task fakeTask(type: Exec) {
    standardOutput = new ByteArrayOutputStream()
    commandLine 'cmd', '/c', 'silly'
    doLast {
        new File("build.log") << standardOutput.toString()
        if (execResult.exitValue != 0) println standardOutput.toString()
    }
}
Is there any alternative to doLast that will run regardless of whether the task fails? Any other way of doing this, especially as I want to do this for every single task I have?
This is my final solution:
tasks.withType(Exec) {
    standardOutput = new ByteArrayOutputStream()
    ignoreExitValue = true
    doLast {
        new File("gradle.log") << standardOutput.toString()
        if (execResult.exitValue != 0) {
            println standardOutput.toString()
            throw new GradleException(commandLine.join(" ") + " returned exitcode of " + execResult.exitValue)
        }
    }
}
Add ignoreExitValue true to your task definition to suppress the exception that is thrown when the exit value is non-zero.
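For instance, applied to the fakeTask from the question (just a sketch showing where the flag goes):

task fakeTask(type: Exec) {
    standardOutput = new ByteArrayOutputStream()
    commandLine 'cmd', '/c', 'silly'
    ignoreExitValue = true // the task no longer fails on a non-zero exit code
    doLast {
        new File("build.log") << standardOutput.toString()
        if (execResult.exitValue != 0) println standardOutput.toString()
    }
}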
You can use the task graph as documented at https://docs.gradle.org/current/userguide/build_lifecycle.html#sec:task_execution to execute code after a task has run.
gradle.taskGraph.afterTask { task, state ->
    // Runs after every task; dump the captured output if an Exec task failed
    if (task instanceof Exec && state.failure) {
        file('gradle.log') << task.standardOutput.toString()
    }
}

How to use startParameter in a GradleBuild task?

I would like to pass deployDir (with value /my_archive) to the uploadArchives task in my_project:
task build(type: GradleBuild) {
    buildFile = './my_project/build.gradle'
    tasks = ['uploadArchives']
    /* startParameter = [deployDir:"/my_archive"] ??? */
}
I do not know how to declare the start parameters. I have tried different ways, e.g.,
startParameter = [deployDir:"/my_archive"]
Without success.
How to declare startParameter in the GradleBuild task?
I assume you mean to pass the deployDir as a project property. In this case, you'll find there is a setProjectProperties(Map) method you can use:
task build(type: GradleBuild) {
    buildFile = './my_project/build.gradle'
    tasks = ['uploadArchives']
    startParameter.projectProperties = [deployDir: "/my_archive"]
}
This will enable you to access deployDir as a variable from the called build script:
uploadArchives {
    repositories {
        mavenDeployer {
            repository(url: deployDir)
            // --- or, if deployDir can be empty ---
            repository(url: project.properties.get('deployDir', 'file:///default/path'))
        }
    }
}
We can set the project properties and system properties via the API:
setProjectProperties(Map<String,String> projectProperties)
setSystemPropertiesArgs(Map<String,String> systemPropertiesArgs)
Here is a sample from my local build using startParameter:
task startBuild(type: GradleBuild) {
    StartParameter startParameter = project.gradle.startParameter;
    Iterable<String> tasks = new ArrayList<String>();
    Iterable<String> excludedTasks = new ArrayList<String>();
    startParameter.getProjectProperties().each { entry ->
        println entry.key + " = " + entry.value;
        if(entry.key.startsWith('t_')){
            tasks << (entry.key - 't_');
        }
        if(entry.key.startsWith('build_') && "true" == entry.value){
            tasks << (':' + (entry.key - 'build_') + ':build');
        }
        if(entry.key.startsWith('x_') && "true" == entry.value){
            excludedTasks << (entry.key - 'x_');
        }
    }
    startParameter.setTaskNames(tasks);
    startParameter.setExcludedTaskNames(excludedTasks);
    println startParameter.toString();
}
We can reference the API from this link: StartParameter.
StartParameter is really powerful in Gradle when you need to customize your build logic.
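As a rough illustration of combining both setters on a GradleBuild task (the property names and values below are invented for the example):

task buildChild(type: GradleBuild) {
    buildFile = './my_project/build.gradle'
    tasks = ['uploadArchives']
    // Hypothetical names; use whatever the called build actually reads
    startParameter.projectProperties = [deployDir: '/my_archive']
    startParameter.systemPropertiesArgs = ['deploy.env': 'staging']
}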
