How to copy files to more than one location using Gradle?

I'd like to define a Gradle task that copies files to four different directories. It seems that the copy task only allows a single target location.
// https://docs.gradle.org/current/userguide/working_with_files.html#sec:copying_files
task copyAssets(type: Copy) {
    from 'src/docs/asciidoc/assets'
    //into ['build/asciidoc/html5/assets', 'build/asciidoc/pdf/assets']
    into 'build/asciidoc/pdf/assets'
}
task gen(dependsOn: ['copyAssets', 'asciidoctor']) << {
    println "Files are generated."
}
How can I copy the files without defining four different tasks?
My current solution is:
// https://docs.gradle.org/current/userguide/working_with_files.html#sec:copying_files
task copyAssetsPDF(type: Copy) {
    from 'src/docs/asciidoc/assets'
    into 'build/asciidoc/pdf/assets'
}
task copyAssetsHTML5(type: Copy) {
    from 'src/docs/asciidoc/assets'
    into 'build/asciidoc/html5/assets'
}
task copyAssetsDB45(type: Copy) {
    from 'src/docs/asciidoc/assets'
    into 'build/asciidoc/docbook45/assets'
}
task copyAssetsDB5(type: Copy) {
    from 'src/docs/asciidoc/assets'
    into 'build/asciidoc/docbook5/assets'
}
task gen(dependsOn: ['copyAssetsPDF', 'copyAssetsHTML5', 'copyAssetsDB45', 'copyAssetsDB5', 'asciidoctor']) << {
    println "Files are generated."
}

One solution is to make a single task with a number of copy specifications, like this:
task copyAssets << {
    copy {
        from 'src/docs/asciidoc/assets'
        into 'build/asciidoc/pdf/assets'
    }
    copy {
        from 'src/docs/asciidoc/assets'
        into 'build/asciidoc/html5/assets'
    }
}
Or you can do it within a loop:
//an array containing destination paths
def copyDestinations = ['build/asciidoc/pdf/assets', 'build/asciidoc/html5/assets']
//custom task to copy into all the target directories
task copyAssets << {
    //iterate over the array with destination paths
    copyDestinations.each { destination ->
        //for every destination define new CopySpec
        copy {
            from 'src/docs/asciidoc/assets'
            into destination
        }
    }
}
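One caveat about both variants above: copy {} calls inside a task action bypass Gradle's up-to-date checking, and the << task-action operator was removed in Gradle 5. A minimal alternative sketch, assuming the same source, destinations and asciidoctor task as the question, is to generate one Copy task per destination in a loop:
def assetTargets = [
    pdf  : 'build/asciidoc/pdf/assets',
    html5: 'build/asciidoc/html5/assets'
]
// One Copy task per destination; each keeps its own inputs/outputs,
// so unchanged assets are skipped on later builds.
def assetCopyTasks = assetTargets.collect { label, destination ->
    tasks.create("copyAssets${label.capitalize()}", Copy) {
        from 'src/docs/asciidoc/assets'
        into destination
    }
}
task gen(dependsOn: assetCopyTasks + ['asciidoctor']) {
    doLast { println "Files are generated." }
}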

Related

Gradle: how to run a task for specified input files?

I have a Gradle build file which uses the ProtoBuffer plugin and runs some tasks. At some point some tasks are run for some files, which are inputs to those tasks.
I want to modify the set of files which is the input to those tasks. Say, I want the tasks to be run with files which are listed, one per line, in a particular file. How can I do that?
EDIT: Here is part of a rather big build.gradle which provides some context.
configure(protobufProjects) {
apply plugin: 'java'
ext {
protobufVersion = '3.9.1'
}
dependencies {
...
}
protobuf {
generatedFilesBaseDir = "$projectDir/gen"
protoc {
if (project.hasProperty('protocPath')) {
path = "$protocPath"
}
else {
artifact = "com.google.protobuf:protoc:$protobufVersion"
}
}
plugins {
...
}
generateProtoTasks {
all().each { task ->
...
}
}
sourceSets {
main {
java {
srcDirs 'gen/main/java'
}
}
}
}
clean {
delete protobuf.generatedFilesBaseDir
}
compileJava {
File generatedSourceDir = project.file("gen")
project.mkdir(generatedSourceDir)
options.annotationProcessorGeneratedSourcesDirectory = generatedSourceDir
}
}
The question is how to modify the input file set for an existing task (which already does something with its inputs), not how to create a new task.
EDIT 2: According to How do I modify a list of files in a Gradle copy task?, it's a bad idea in general, as Gradle makes assumptions about input and output dependencies, which can be broken by this approach.
If you had added the Gradle file and been more specific, that would have been very helpful. I will try to give an example based on what I have understood:
fun listFiles(fileName: String): List<String> {
    val file = file(fileName).absoluteFile
    val listOfFiles = mutableListOf<String>()
    file.readLines().forEach {
        listOfFiles.add(it)
    }
    return listOfFiles
}
tasks.register("readFiles") {
    val inputFile: String by project
    val listOfFiles = listFiles(inputFile)
    listOfFiles.forEach {
        val file = file(it).absoluteFile
        file.readLines().forEach { println(it) }
    }
}
Then run Gradle like this: gradle -PinputFile=<path_to_the_file_that_contains_list_of_files> readFiles
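The snippet above only prints the listed files. If they instead need to become the inputs of an existing task, the same idea can be applied at configuration time rather than inside a task action. A rough Groovy DSL sketch; proto-files.txt and the choice of processResources as the target task are assumptions for illustration, not part of the original build:
// Hypothetical manifest listing one path per line.
def manifest = file('proto-files.txt')
def listedFiles = manifest.exists() ? manifest.readLines().findAll { it.trim() } : []
tasks.named('processResources') {
    // processResources (a Copy-type task from the java plugin) stands in for
    // whichever existing task should receive the extra input files.
    from listedFiles
}
As EDIT 2 warns, rewiring a task that a plugin already configures can fight its up-to-date tracking, so this is best reserved for tasks you own.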

How can I reference the outputs of multiple tasks with one name?

Or in other words: How do I gather outputs of multiple tasks?
I have a project with a number of tasks whose output I want to include in the distribution.
I also have a task that depends on all of them.
How can I avoid listing all the tasks?
Example build.gradle:
plugins {
    id 'distribution'
}
task taskA0 {
    ...
}
task taskA1 {
    ...
}
task taskA2 {
    ...
}
task allA {
    dependsOn (taskA0, taskA1, taskA2)
}
distributions {
    main {
        contents {
            // this works but is tedious
            from taskA0
            from taskA1
            from taskA2
            // this doesn't work, as allA doesn't have any output
            from allA
        }
    }
}
How's this?
task allA {
    dependsOn (taskA0, taskA1, taskA2)
    outputs.files(taskA0, taskA1, taskA2)
}
distributions {
    main {
        contents {
            from allA
        }
    }
}
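A variation on the same answer, since the question was about not listing every task by hand: the producer tasks can also be collected by name. A sketch, assuming the taskA0/taskA1/taskA2 naming from the example:
// Live collection of every task whose name matches taskA<digits>.
def producers = tasks.matching { it.name ==~ /taskA\d+/ }
task allA {
    dependsOn producers
    // files(...) resolves each task to its declared outputs.
    outputs.files(producers)
}
distributions {
    main {
        contents {
            from allA
        }
    }
}
Because matching returns a live collection, producer tasks registered later in the script are picked up as well.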

Gradle zip task with lazy include property includes itself

Hi, I've got this zip task, which works great:
def dir = new File("${projectDir.parentFile}/test/")
task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include 'toast/**'
    archiveFileName = 'test.zip'
}
but then when I make the include property lazy (because I need to in my real case)
def dir = new File("${projectDir.parentFile}/test/")
task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include {
        'toast/**'
    }
    archiveFileName = 'test.zip'
}
then it creates a zip that includes everything in the folder (so the generated archive too). In this test case the inner zip is just corrupted (it doesn't run infinitely), but in the real-world case it does make an infinite zip. (Not sure why; maybe my test case has too few or too small files.) Either way, the test case shows the problem: the generated zip contains a zip even though it should only contain the toast directory and all of its content.
How do I fix this? I need a lazy include because the directory I want to include is computed by other tasks. I get the exact same problem with Tar except it refuses to create the archive since it includes itself.
Using exclude '*.zip' is a dumb workaround which makes the archive include other folders I don't want. I only want to include a specific folder, lazily.
Here's what the monster looks like in the real-world case. I basically need to retrieve the version of the project from Java and then use that version to name the folders I'm packaging (I'm making a libGDX game and packaging it with a JRE using packr). The problematic tasks are 'makeArchive_' + platform.
String jumpaiVersion;
task fetchVersion(type: JavaExec) {
outputs.upToDateWhen { jumpaiVersion != null }
main = 'net.jumpai.Version'
classpath = sourceSets.main.runtimeClasspath
standardOutput new ByteArrayOutputStream()
doLast {
jumpaiVersion = standardOutput.toString().replaceAll("\\s+", "")
}
}
def names = [
'win64' : "Jumpai-%%VERSION%%-Windows-64Bit",
'win32' : "Jumpai-%%VERSION%%-Windows-32Bit",
'linux64' : "Jumpai-%%VERSION%%-Linux-64Bit",
'linux32' : "Jumpai-%%VERSION%%-Linux-32Bit",
'mac' : "Jumpai-%%VERSION%%-Mac.app"
]
def platforms = names.keySet() as String[]
def jdks = [
'win64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_x64.zip',
'win32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_i686.zip',
'linux64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_x64.tar.gz',
'linux32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_i686.tar.gz',
'mac' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-macosx_x64.zip'
]
def formats = [
'win64' : 'ZIP',
'win32' : 'ZIP',
'linux64' : 'TAR_GZ',
'linux32' : 'TAR_GZ',
'mac' : 'ZIP'
]
File jdksDir = new File(project.buildscript.sourceFile.parentFile.parentFile, 'out/jdks')
File gameJar = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.jar")
File gameData = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.data")
File packrDir = new File("${projectDir.parentFile}/out/packr/")
File minimalTmpDir = new File("${projectDir.parentFile}/desktop/build/libs/minimal-tmp")
task minimizeGameJar {
dependsOn ':desktop:dist'
doFirst {
minimalTmpDir.mkdirs()
copy {
from zipTree(gameJar)
into minimalTmpDir
}
for(file in minimalTmpDir.listFiles())
if(file.getName().contains("humble"))
file.delete()
}
}
task makeMinimal(type: Zip) {
dependsOn minimizeGameJar
dependsOn fetchVersion
from minimalTmpDir
include '**'
archiveFileName = provider {
"Jumpai-${->jumpaiVersion}-Minimal.jar"
}
destinationDir packrDir
doLast {
minimalTmpDir.deleteDir()
}
}
task copyGameJar(type: Copy) {
outputs.upToDateWhen { gameData.exists() }
dependsOn ':desktop:dist'
from gameJar.getAbsolutePath()
into gameData.getParentFile()
rename("Jumpai.jar", "Jumpai.data")
}
task setWindowsIcons(type: Exec) {
dependsOn fetchVersion
workingDir '.'
commandLine 'cmd', '/c', 'set_windows_icons.bat', "${->jumpaiVersion}"
}
for(platform in platforms) {
task("getJdk_" + platform) {
String url = jdks[platform]
File jdkDir = new File(jdksDir, platform + "-jdk")
File jdkFile = new File(jdkDir, url.split("/").last())
outputs.upToDateWhen { jdkFile.exists() }
doFirst {
if(!jdkDir.exists())
jdkDir.mkdirs()
if(jdkFile.exists())
{
println jdkFile.getName() + " is already present"
return
}
else
{
println "Downloading " + jdkFile.getName()
new URL(url).withInputStream {
i -> jdkFile.withOutputStream { it << i }
}
}
for(file in jdkDir.listFiles()) {
if(file.equals(jdkFile))
continue
if(file.isFile()) {
if (!file.delete())
println "ERROR: could not delete " + file.getAbsoluteFile()
} else if(!file.deleteDir())
println "ERROR: could not delete content of " + file.getAbsoluteFile()
}
if(url.endsWith(".tar.gz"))// don't mix up archive type of what we downloaded vs archive type of what we compress (in formats)
{
copy {
from tarTree(resources.gzip(jdkFile))
into jdkDir
}
}
else if(url.endsWith(".zip"))
{
copy {
from zipTree(jdkFile)
into jdkDir
}
}
}
}
File packrInDir = new File(packrDir, platform)
String platformRawName = names[platform]
task("packr_" + platform, type: JavaExec) {
outputs.upToDateWhen { new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion)).exists() }
dependsOn fetchVersion
dependsOn copyGameJar
dependsOn 'getJdk_' + platform
main = 'com.badlogicgames.packr.Packr'
classpath = sourceSets.main.runtimeClasspath
args 'tools/res/packr_config/' + platform + '.json'
workingDir = project.buildscript.sourceFile.parentFile.parentFile
doLast {
File packrOutDir = new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion));
packrOutDir.deleteDir()
if(packrOutDir.exists())
{
println "ERROR Could not delete packr output " + packrOutDir.getAbsolutePath()
return
}
if(!packrInDir.renameTo(packrOutDir))
println "ERROR Could not rename packr output dir for " + packrInDir.getName()
}
}
if(formats[platform] == 'ZIP')
{
task('makeArchive_' + platform, type: Zip) {
if(platform.contains("win"))
dependsOn setWindowsIcons
dependsOn fetchVersion
dependsOn 'packr_' + platform
from packrDir
destinationDirectory = packrDir
include {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + "/"
}
archiveFileName = provider {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
}
}
}
else if(formats[platform] == 'TAR_GZ')
{
task('makeArchive_' + platform, type: Tar) {
dependsOn 'packr_' + platform
from packrDir
destinationDirectory = packrDir
include {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + '/**'
}
archiveFileName = provider {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".tar.gz"
}
extension 'tar'
compression = Compression.GZIP
}
}
else
println 'Unsupported format for ' + platform
}
task deploy {
dependsOn makeMinimal
for(platform in platforms)
dependsOn 'makeArchive_' + platform
}
You can get what you want by using the doFirst method and modifying the task's properties with the passed action.
task('makeArchive_' + platform, type: Zip) {
    if(platform.contains("win"))
        dependsOn setWindowsIcons
    dependsOn fetchVersion
    dependsOn 'packr_' + platform
    from packrDir
    destinationDirectory = packrDir
    archiveFileName = provider {
        platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
    }
    doFirst {
        def includeDir = platformRawName.replace("%%VERSION%%", jumpaiVersion)
        // Include only files and directories from 'includeDir'
        include {
            it.relativePath.segments[ 0 ].equalsIgnoreCase(includeDir)
        }
    }
}
Please also have a look at this answer to a similar question. My solution is just a workaround. If you know your version at configuration phase, you can achieve what you want more easily. Writing your own custom tasks or plugins can also help clean up your build script.
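For completeness, a sketch of that "version at configuration phase" idea: if the version can be read without running a task (here from a hypothetical version.properties; platform, platformRawName and packrDir come from the question's script), the include pattern and archive name can stay plain eager strings and the archive can never match its own include:
// Hypothetical: the version is available at configuration time, e.g. from a file.
def versionProps = new Properties()
file('version.properties').withInputStream { versionProps.load(it) }
def jumpaiVersion = versionProps.getProperty('version')

task('makeArchive_' + platform, type: Zip) {
    dependsOn 'packr_' + platform
    from packrDir
    destinationDirectory = packrDir
    // Only the packr output folder for this platform is included.
    include platformRawName.replace('%%VERSION%%', jumpaiVersion) + '/**'
    archiveFileName = platformRawName.replace('%%VERSION%%', jumpaiVersion) + '.zip'
}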

gradle copy include closure not working

Gradle copySpec include closure not working:
def fileList = ["hello/world.xml"]
task foo(type: Copy) {
    from (zipTree("/path/a.zip")) {
        include { elem ->
            fileList.contains(elem.path)
        }
    }
}
The a.zip contains "hello/world.xml".
Message:
Skipping task 'foo' as it has no source files and no previous output files.
A copySpec closure needs to be used with a copy task.
Your code is just the copy task on its own, and a Copy task requires a destination to copy into.
Your code should look more like this:
def fileList = ["hello/world.xml"]
def filesToCopy = copySpec {
    from (zipTree("/path/a.zip")) {
        include { elem ->
            fileList.contains(elem.path)
        }
    }
}
task foo(type: Copy) {
    into 'build/target/docs'
    with filesToCopy
}
See the API for more detail: https://docs.gradle.org/3.3/dsl/org.gradle.api.tasks.Copy.html
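If the goal is simply to pull a known list of paths out of the archive, a pattern-based include also works and keeps everything in the task itself. A sketch reusing the question's paths:
def fileList = ['hello/world.xml']
task foo(type: Copy) {
    from zipTree('/path/a.zip')
    // include accepts an Iterable of Ant-style patterns; exact paths are valid patterns.
    include fileList
    into 'build/target/docs'
}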

How to define a default dependsOn in a Custom Task

I'm relatively new to Gradle.
In order to create an automated deployment script on a cluster, I have created a bunch of custom tasks that depend on each other. For example:
class StartSchedulerTask extends SchedulerTask {
    @TaskAction
    void start() {
        dependsOn env.nodes.name.collect {"startTomcat_$it"}
        println "Starting quartz on node: ${node}"
    }
}
In build.gradle, I have dynamically created the tasks:
project.extensions.environment.nodes.each { TomcatNode n ->
    String name = n.name
    task "nodeInit_$name"(type: DeployToNodeInitTask) {
        node(n)
    }
    task "stopWorker_$name"(type: StopWorkerTask) {
        node(n)
    }
    task "stopTomcat_$name"(type: StopTomcatTask){
        node(n)
    }
    task "updateAppConfigs_$name"(type: UpdateAppConfigsTask){
        node(n)
        apps(V3Application.ALL_APPS)
        buildName('develop')
    }
    task "deployWars_$name"(type: DeployWarsTask){
        node(n)
        apps(V3Application.ALL_APPS)
        buildName('develop')
    }
    task "startTomcat_$name"(type: StartTomcatTask){
        node(n)
    }
    task "startWorker_$name"(type: StartWorkerTask){
        node(n)
    }
    task "terminateNode_$name"(type: DeployToNodeTerminationTask){
        node(n)
    }
}
task stopScheduler(type: StopSchedulerTask) {
    environment(environment)
}
task startScheduler(type: StartSchedulerTask) {
    environment(environment)
}
The default task is configured to be startScheduler, which is the last step of the deployment process, the idea being that the task graph, once it is built, will take care of the correct execution order of my tasks.
However, when I print out the task graph, the only task listed is startScheduler. Am I missing something?
Task dependencies have to be declared at configuration time, not at execution time. In theory you could do so in the task's constructor, but a better approach is to do it in the build script, or in a plugin.
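A sketch of the constructor variant mentioned above, using the class from the question. The closure form of dependsOn is only resolved when the task graph is built, so env does not have to be populated at task-creation time (whether env is reachable from the task this early is an assumption about SchedulerTask):
class StartSchedulerTask extends SchedulerTask {
    StartSchedulerTask() {
        // Resolved lazily when the task graph is assembled, not at creation time.
        dependsOn { env.nodes.name.collect { "startTomcat_$it" } }
    }

    @TaskAction
    void start() {
        println "Starting quartz on node: ${node}"
    }
}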
Thanks to the remarks from Peter Niederwieser and Jeffrey, I was able to come up with the full solution I wanted. I did not mark Peter's as the answer because the full answer is below, but it was a necessary hint toward the right solution:
I created an interface, DependencyAware:
public interface DependencyAware {
    void declareDependencies()
}
Every task that knows how to declare its dependencies implements this interface. For example:
class StartSchedulerTask extends SchedulerTask {
    @TaskAction
    void start() {
        println "Starting quartz on node: ${node}"
    }
    void declareDependencies() {
        dependsOn env.nodes.name.collect {"startTomcat_$it"}
    }
}
In my build script:
tasks.each { Task t ->
    if (t instanceof DependencyAware) {
        t.declareDependencies()
    }
}
That's it!
Thanks for the pointers, Peter and Jeffrey.
UPDATE 1
task deploy(dependsOn: ['backupWars', 'startScheduler'])
task stopScheduler(type: StopSchedulerTask)
task backupWars(type: BackupWarsTask)
project.extensions.targetEnvironment.nodes.each { TomcatNode n ->
    String name = n.name
    [
        ("nodeInit_$name"): DeployToNodeInitTask,
        ("stopWorker_$name"): StopWorkerTask,
        ("stopTomcat_$name"): StopTomcatTask,
        ("updateAppConfigs_$name"): UpdateAppConfigsTask,
        ("deployWars_$name"): DeployWarsTask,
        ("startTomcat_$name"): StartTomcatTask,
        ("startWorker_$name"): StartWorkerTask,
        ("terminateNode_$name"): DeployToNodeTerminationTask,
    ].each { String taskName, Class taskType ->
        task "$taskName"(type: taskType) {
            node(n)
        }
    }
}
task startScheduler(type: StartSchedulerTask) {
    dryRun(testMode)
}
The internal dependencies between the different deployment steps are in the tasks themselves, for example:
class StartWorkerTask extends WorkerTask {
    @TaskAction
    void start() {
        println "Starting worker ${node}"
    }
    void declareDependencies() {
        dependsOn tomcatOnThisNodeHasBeenStarted()
    }
    String tomcatOnThisNodeHasBeenStarted() {
        "startTomcat_${node.name}"
    }
}
The declaration of the topology is as follows:
environments {
    prod {
        nodes {
            (1..2).each { int i ->
                "w${i}_prod" {
                    host = "prod-n$i"
                    userName = "xxxxx"
                    password = "xxxxx"
                    loadBalancer = 'lb_prod'
                    frontendJkManagerUrl = 'http://web01/jkmanager'
                }
            }
            scheduler {
                name = "w1_prod"
            }
        }
    }
    rc {
        //rc topology here
    }
}
It's because you're declaring the dependency from inside the @TaskAction method. The @TaskAction only runs once the dependency graph has been formed.
You could abuse doFirst() inside your @TaskAction methods to call all your dependencies, but this won't turn up in the dependency graph.
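For reference, the same dependency can instead be declared at configuration time from the build script rather than inside the task class. A sketch using the task types from this question (the node property and the lazy closure are assumptions):
tasks.withType(StartWorkerTask) { t ->
    // The closure passed to dependsOn defers reading node until the
    // task graph is built, after the task has been configured.
    t.dependsOn { "startTomcat_${t.node.name}" }
}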
