Gradle copies all the files if at least one is not up-to-date

For instance, I've got the following task:
task testCopy(type: Copy) {
    from 'folder/copy_from'
    into 'folder/copy_to'
    eachFile { println it.name }
}
As long as the files inside copy_from are untouched, the task works fine. But as soon as I change, say, one file in copy_from, Gradle copies all the files from copy_from into copy_to instead of copying only the changed/added file.
Is this behaviour expected? Is there a way to make Gradle copy only the changed/added files?

Yes, this behaviour is expected, based on this GitHub issue and the Gradle discussion forum:
The build is incremental in the sense that the copy task only executes
when things have changed but it isn’t incremental itself, in that it
only copies inputs that have been modified.
I couldn't find a proper solution, but one workaround is to split your task into smaller ones with specific includes:
task copy1(type: Copy) {
    into 'build/out'
    from('src') {
        include 'docs/*.txt'
    }
    eachFile { println it.name }
}

task copy2(type: Copy) {
    into 'build/out'
    from('src') {
        include 'docs/*.md'
    }
    eachFile { println it.name }
}

task copy3 {
    dependsOn copy1, copy2
}
It's not exactly what you want, but it improves performance by reducing the number of files to copy: when you change a text file and run gradle copy3, only the text files are copied, not the md files.
UPDATE:
The Ant copy task doesn't have this problem. From its documentation:
By default, files are only copied if the source file is newer than the destination file, or when the destination file does not exist. However, you can explicitly overwrite files with the overwrite attribute
So you can use the Ant copy task instead, since Ant tasks can be called from Gradle:
task copyFromAnt {
    doLast {
        ant.copy(todir: 'build/out') {
            fileset(dir: 'src')
        }
    }
}
Ant logs the files it copies, so you can check the log with the help of gradle -d and grep:
gradle copyFromAnt -d | grep "\[ant:copy\]"
To see only the files it actually copies, without the up-to-date entries and so on, use:
gradle copyFromAnt -d | grep "\[ant:copy\] Copying"
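For completeness, and not part of the original answers: since Gradle 5.4 there is an InputChanges API for writing genuinely incremental tasks, which processes only the files Gradle reports as changed. A minimal sketch of a custom copy task using it (the class name IncrementalCopy is my own, and the property names are assumptions, not an established API):

```groovy
import org.gradle.api.DefaultTask
import org.gradle.api.file.DirectoryProperty
import org.gradle.api.tasks.*
import org.gradle.work.*

// Hypothetical custom task: copies only files reported as changed,
// and deletes files that were removed from the source directory.
abstract class IncrementalCopy extends DefaultTask {
    @Incremental
    @InputDirectory
    abstract DirectoryProperty getSource()

    @OutputDirectory
    abstract DirectoryProperty getDestination()

    @TaskAction
    void execute(InputChanges changes) {
        changes.getFileChanges(source).each { change ->
            if (change.fileType == FileType.DIRECTORY) return
            def target = destination.file(change.normalizedPath).get().asFile
            if (change.changeType == ChangeType.REMOVED) {
                target.delete()
            } else {
                target.parentFile.mkdirs()
                target.bytes = change.file.bytes
                println change.file.name  // mirrors the eachFile logging above
            }
        }
    }
}

tasks.register('testCopy', IncrementalCopy) {
    source = layout.projectDirectory.dir('folder/copy_from')
    destination = layout.projectDirectory.dir('folder/copy_to')
}
```

On the first run all files are processed; on subsequent runs only the changed, added, or removed files are passed to the task action.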


Delete directory with all files in it using gradle task

In my project's root I have a directory as follows:
build/exploded-project/WEB-INF/classes
I want to delete all the files in the classes directory using a Gradle task. I tried the following combinations, but none of them worked:
task deleteBuild(type: Delete) {
    project.delete 'build/exploded-project/WEB-INF/classes/'
}

task deleteBuild(type: Delete) {
    delete 'build/exploded-project/WEB-INF/classes/'
}

task deleteBuild(type: Delete) {
    delete '$rootProject.projectDir/build/exploded-project/WEB-INF/classes/'
}

task deleteBuild(type: Delete) {
    delete fileTree('build/exploded-project/WEB-INF/classes').matching {
        include '**/*.class'
    }
}
Your second variant is correct and works fine here.
Though I'd recommend not hardcoding the path: use $buildDir instead of build, or, if the path is the output of another task, use the respective property of that task.
If it doesn't work for you, run with -i or -d to get more information about what is going on and possibly going wrong.
As suggested, a better approach is to use Gradle variables. Try:
task deleteBuild(type: Delete) {
    delete "$buildDir/exploded-project/WEB-INF/classes/"
}
Note that the single quotes must be replaced with double quotes so that $buildDir is interpolated.

how to exclude path information in gradle zip task

Apologies in advance for this simple question; I am relatively new to Gradle.
I would like to run a Zip task during my build process and create a .zip file in my "archive" directory, but without the source files' directory structure. I managed to get the task working, but it retains the directory structure. I read that Linux zip has the -j option, which flattens the directory structure, but I am not sure whether anything similar is available via the Gradle task.
My input file structure looks similar to the below:
./libs/
./libs/file1.jar
./scripts/script1.sh
./scripts/script2.sh
but I would like to end up with a Zip file with directory structure as follows,
./
./file1.jar
./script1.sh
./script2.sh
My current Zip task is as follows,
task makeZipForDeploy(type: Zip) {
    from 'build/'
    include 'libs/*'    // to include all files in the libs folder
    include 'scripts/*' // to include all files in the scripts folder
    archiveName("${archiveFileName}")
    destinationDir file("${archivePath}")
}
The solution was even more trivial than I thought. The following will flatten the structure:
task makeZipForDeploy(type: Zip) {
    from 'build/libs'    // to include all files in the libs folder
    from 'build/scripts' // to include all files in the scripts folder
    archiveName("${archiveFileName}")
    destinationDir file("${archivePath}")
}

Use Gradle to run Yarn in every folder

I have a folder with a lot of directories inside. I want to have a gradle script that will loop through them (not recursively) and run
yarn build
in them.
I have tried two approaches (and started several different things), but alas no luck.
task build(description: "Loops through folders and builds") {
    FileTree tree = fileTree(dir: 'admin', include: '*/package.json')
    tree.each { File file -> println file }
}
task yarnBuild(type: Exec) {
    executable "bash"
    args "find . -type d -exec ls {} \\;"
}
With the build task I wanted to find the directories that contain a package.json (all of them, really), but I don't know how to change into each directory to run yarn build.
With the yarnBuild task I tried to do it with a Unix command, but I couldn't get that to work either.
I would prefer a solution along the lines of the build task, using actual Gradle. Can anybody give me some pointers? How can I change into those directories? I'm guessing that once I'm in the right folder I can just use Exec to run yarn build.
Thanks!
I'd probably create a task per directory and then wire them all into the DAG:
apply plugin: 'base'

FileTree directories = fileTree(projectDir).matching {
    include { FileTreeElement el ->
        return el.directory && el.file.parentFile == projectDir
    }
}

directories.files.each { File dir ->
    // create a task for the directory
    Task yarnTask = tasks.create("yarn${dir.name}", Exec) {
        workingDir dir
        commandLine 'yarn', 'build'
        // TODO: set these so gradle's up-to-date checks can work
        inputs.dir = ???
        outputs.dir = ???
    }
    // wire the task into the DAG
    build.dependsOn yarnTask
}
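For the TODO above, the wiring might look like the following. The src/ and dist/ directory names are assumptions about how the yarn packages are laid out, not something from the question; adjust them to your actual project structure:

```groovy
// Hypothetical input/output wiring, assuming each package builds
// from src/ (plus package.json) into dist/. With these set, Gradle
// can skip the task when nothing in the package has changed.
Task yarnTask = tasks.create("yarn${dir.name}", Exec) {
    workingDir dir
    commandLine 'yarn', 'build'
    inputs.dir new File(dir, 'src')
    inputs.file new File(dir, 'package.json')
    outputs.dir new File(dir, 'dist')
}
```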

How to remove an element from gradle task outputs?

Is it possible to exclude an element from the output files of a task, so that it is not considered for the up-to-date check? In my case I have a copy task that automatically sets the destination directory in the outputs variable, but I'd like to remove it and register only some of the copied files instead.
Or, as alternative, is it possible to overwrite the entire outputs variable?
Incremental tasks create snapshots of the input and output files of a task. If these snapshots are the same for two task executions (based on the hash codes of the file contents), Gradle assumes the task is up-to-date.
You cannot remove some files from the output and expect Gradle to forget about them, simply because the hash codes will then differ.
There is an option that allows you to manually define the logic of up-to-date checks.
You should use a method upToDateWhen(Closure upToDateClosure) in TaskOutputs class.
task myTask {
    outputs.dir files('/home/user/test')
    outputs.upToDateWhen {
        // your logic here
        return true // always up-to-date
    }
}
I've found the solution:
task reduceZip(type: Copy) {
    outputs.files.setFrom(file("C:/temp/unzip/test.properties"))
    outputs.file(file("C:/temp/unzip/test.txt"))
    from zipTree("C:/temp/temp.zip")
    into "C:/temp/unzip"
}
The outputs.files list can only have new elements registered, not removed (as far as I know). So I needed to reset the list and then add the desired files back. The outputs.files.setFrom method resets the outputs.files list, after which other files can be added. In the example above, the up-to-date check is reduced to only the test.properties and test.txt files.

Intelligent up-to-date-check for a gradle task unzipping a file

My goal is to create a gradle task which fulfills the following requirements:
unzip a file into a directory
run as infrequent as possible (good up-to-date-check)
should detect if something has changed in the unzipped directory
The initial thought was to use a task like that:
p.task("unzip", type: Copy) {
    def zipFile = p.file('some-zip-file.zip')
    def outputDir = p.file("$p.buildDir/unpacked")
    from p.zipTree(zipFile)
    into outputDir
}
If I create a file "test.txt" within $p.buildDir/unpacked and rerun the unzip task, "test.txt" remains in the unpacked directory. The unzip task effectively performs something similar to a "merge". Unfortunately, that is not optimal, since the content of /unpacked should be defined entirely by the unzipped zip file.
The following solution unfortunately doesn't detect that the output directory "unpacked" has changed since the last execution of the task:
p.task("deleteUnzippedDirectory", type: Delete) {
    delete "$p.buildDir/unpacked"
    inputs.file p.file('some-zip-file.zip')
    outputs.dir p.file("$p.buildDir/unpacked")
}

p.task("unzip", type: Copy) {
    def zipFile = p.file('some-zip-file.zip')
    def outputDir = p.file("$p.buildDir/unpacked")
    from p.zipTree(zipFile)
    into outputDir
    dependsOn p.deleteUnzippedDirectory
}
Is there a way to achieve the desired result besides running "deleteUnzippedDirectory" every time? (outputs.upToDateWhen {false})
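One approach worth trying, though it is a suggestion of mine rather than an answer from the thread: Gradle's built-in Sync task type. It copies files like Copy, but afterwards the destination directory contains exactly the copied files, deleting anything else. A minimal sketch in the style of the question:

```groovy
// Sync behaves like Copy, but removes any file in the destination
// that is not part of the copied set: a stray test.txt created by
// hand would be cleaned up, and the whole directory participates
// in the up-to-date check.
p.task("unzip", type: Sync) {
    from p.zipTree(p.file('some-zip-file.zip'))
    into p.file("$p.buildDir/unpacked")
}
```

This removes the need for the separate deleteUnzippedDirectory task entirely.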
