Hi, I'm trying to collect plugins from a sub-folder, zip them, and copy the archive to my export folder.
task buildPlugin {
    dependsOn ':empty-plugin:build'
}

task exportPlugin(type: Zip) {
    dependsOn buildPlugin
    // create new export folder as destination for nightly build
    def folder = '/export';
    def file = "${project.name}-sdk-${project.version}";
    // collect all plugins into cwc-sdk zip file
    baseName = file
    fileTree("cwc-plugin").each({
        if (it.name.endsWith(".zip")) {
            from it.absolutePath
        }
    })
    // move cwc-sdk zip file into export destination folder
    copy { into folder from file }
    delete file
}
I run the clean task first. The Gradle log:
:api:compileJava
:api:processResources
:api:classes
:api:jar
:empty-plugin:compileJava
:empty-plugin:processResources
:empty-plugin:classes
:empty-plugin:jar
:empty-plugin:assemble
:empty-plugin:compileTestJava UP-TO-DATE
:empty-plugin:processTestResources UP-TO-DATE
:empty-plugin:testClasses UP-TO-DATE
:empty-plugin:test UP-TO-DATE
:empty-plugin:check UP-TO-DATE
:api:javadoc
:empty-plugin:zip
:empty-plugin:build
:buildPlugin
:exportPlugin UP-TO-DATE
BUILD SUCCESSFUL
Total time: 2.097 secs
On the first run, :exportPlugin is marked as UP-TO-DATE and I don't get the zipped file from the build. When I run :exportPlugin again, everything is fine. It's also fine when I chain the tasks manually (run gradle clean, then run gradle buildPlugin, then run gradle exportPlugin by double-clicking the tasks in IDEA).
I think the order of the tasks is still OK, so I shouldn't need to work with mustRunAfter.
I also played around with copySpec and buildPlugin.outputs.files, but nothing helps.
Can anybody help me solve this issue for the initial build run?
Thanks!
Update:
A Zip task is an abstracted Copy task
AbstractCopyTask is the base class for all copy tasks. (Docu)
I found this comment from Peter Niederwieser:
A Copy task only gets executed if it has something to copy. Telling it what to copy is part of configuring the task, and therefore needs to be done in the configuration phase, rather than the execution phase.
How do I change the from it.absolutePath line inside the fileTree loop so that it runs during the configuration phase?
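Something like the following is what I think the configuration-phase version should look like, but I'm not sure it is correct (untested sketch; it also copies the produced archive via archivePath instead of my original copy/delete lines):

task exportPlugin(type: Zip) {
    dependsOn buildPlugin
    baseName = "${project.name}-sdk-${project.version}"
    // declared during configuration; the tree is only resolved lazily at execution time
    from fileTree('cwc-plugin') {
        include '**/*.zip'
    }
    // move the produced sdk zip into the export folder at execution time
    doLast {
        copy {
            from archivePath
            into '/export'
        }
    }
}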
Related
I have set up some tasks for compiling a libGdx project in Android Studio. What I want is that a dynamic file "version.txt" is created in the assets folder. This file must be available in the final jar output.
Here is how I did the setup:
libGdx provides me with a dist task.
I created two tasks on top of that: buildDebugVersion and buildReleaseVersion.
Both shall create a version.txt, one containing debug information and the other no debug info.
libGdx' original dist task
task dist(type: Jar) {
    manifest {
        attributes 'Main-Class': project.mainClassName
    }
    //processResources.dependsOn tasks.updateVersionFileRelease
    dependsOn configurations.runtimeClasspath
    from {
        configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) }
    }
    with jar
}
my debug/release tasks
task createDebugVersion {
    outputs.upToDateWhen { false }
    doFirst {
        modifyVersionFile(1)
    }
}

task createReleaseVersion {
    outputs.upToDateWhen { false }
    doFirst {
        modifyVersionFile(0)
    }
}
task dependencies
To put that all in the right running order, I created
createDebugVersion.finalizedBy(tasks.dist)
createReleaseVersion.finalizedBy(tasks.dist)
dist.mustRunAfter(tasks.createDebugVersion, tasks.createReleaseVersion)
My expectation was:
createDebugVersion/createReleaseVersion runs and creates the file
THEN dist runs
This seems to happen when I look at the output; you can see which file is generated.
Here is the output when I run it:
> Task :desktop:createReleaseVersion
Generated version file is:
--------------------------
Version=0.1.113
BuildDate=2020-04-24
--------------------------
BUILD SUCCESSFUL in 5s
10 actionable tasks: 9 executed, 1 up-to-date
...but when I run the program, the version information I see is stale: the file contained in the jar is always 1 or 2 versions behind, as if it had been taken out of some cache folder.
What scares me is the question: How many of the other assets are now 1 or 2 versions behind, too?
Where does Gradle take this file from? I have not been able to find out so far.
what I already tried
I run this through a .cmd script. Before starting the Gradle task, I delete all build folders (and I verified that everything is removed before the build starts).
(the variable %TASK% contains either createDebugVersion or createReleaseVersion)
ECHO Forcing resources rebuild
RD /S /Q .\desktop\build
RD /S /Q .\android\build
ECHO Compiling distribution version of %GAMENAME%...
CALL gradlew desktop:clean desktop:build %TASK% --rerun-tasks
But still, out of some "ghost-galaxy-space", a file that is two builds old is taken from somewhere and put into the jar...
Any help greatly appreciated!
Ok, I figured it out - maybe it helps others if they run into a similar issue.
The error was here:
CALL gradlew desktop:clean desktop:build %TASK% --rerun-tasks
If you look closely, you see that FIRST desktop:clean runs, THEN desktop:build runs (which causes a full rebuild because of the deleted caches and build folders), and only THEN does my %TASK% run, third in order!
So my task runs after the build. Even though I set up the build chain with
createDebugVersion.finalizedBy(tasks.dist)
createReleaseVersion.finalizedBy(tasks.dist)
this all only runs after the build is already done. And the dist task sees no clean in this chain, so it reuses the output of the build triggered by the second parameter of my gradlew call!
The solution
I simply changed how I run gradlew:
my command script now calls
call gradlew desktop:clean %TASK% --rerun-tasks
so a clean is still forced, but the build now only happens as a reaction to dist: it can't find any binaries, so it has to build them at that point.
And then the build finally runs after my file has been written to the assets folder.
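For reference, a sketch of an alternative wiring (untested, and the -PdebugBuild property is made up just for illustration) that would make the ordering independent of what is listed on the gradlew command line, by making dist itself depend on the version task:

// untested sketch: wire the version task into dist's dependency graph
// so the file is always written before the jar is assembled
dist.dependsOn(project.hasProperty('debugBuild') ? tasks.createDebugVersion : tasks.createReleaseVersion)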
Hope this helps someone some day!
cheers, gris
I am writing a task to unzip a WAR file, remove some JARs, and then create a WAR from the extracted folder.
task unzipWar(type: Copy) {
    println 'unzipping the war'
    def warFile = file("${buildDir}/temp/libs/webapps/service-app.war")
    def warOutputDir = file("$buildDir/wartemp")
    from zipTree(warFile)
    into warOutputDir
}

task deleteJars(type: Delete) {
    println 'deleting the logging jars'
    file("$buildDir/wartemp/WEB-INF/lib/slf4j-api-1.7.5.jar").delete();
    file("$buildDir/wartemp/WEB-INF/lib/logback-classic-1.1.7.jar").delete();
    file("$buildDir/wartemp/WEB-INF/lib/logback-core-1.1.7.jar").delete();
}

task createWar(type: War) {
    destinationDir = file("$buildDir")
    baseName = "service-app"
    from "$buildDir/wartemp"
    dependsOn deleteJars
}
For some reason, the JARs are not getting deleted, and the WAR file that gets created only includes MANIFEST.MF and nothing else. What am I missing here?
The first thing to note is that your createWar task depends on the deleteJars task, but deleteJars doesn't depend on unzipWar. That means that if you call the createWar task, it won't call the unzipWar task, and there will be nothing to copy or delete. Note that you do get a MANIFEST.MF file because it is generated by the createWar task itself.
The second thing is that you are trying to delete the files in the configuration phase of the build, while unzipWar does its job in the execution phase. So your delete calls run before the files are even unzipped. You can read about the build lifecycle in the official user guide. You need to rewrite your deleteJars task to configure it properly; take a look at the docs, which have an example of how to do it.
So if you write
file("$buildDir/wartemp/WEB-INF/lib/slf4j-api-1.7.5.jar").delete();
it tries to delete your files at the moment it is evaluated, because it is not configuring a task property but executing an action during configuration.
To configure it you have to do something like:
task deleteJars(type: Delete) {
    delete "$buildDir/wartemp/WEB-INF/lib/slf4j-api-1.7.5.jar",
           "$buildDir/wartemp/WEB-INF/lib/logback-classic-1.1.7.jar",
           "$buildDir/wartemp/WEB-INF/lib/logback-core-1.1.7.jar"
}
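To address the first point as well, a small wiring sketch along the lines of the original tasks (not tested) would make the whole chain run when createWar is requested:

// deleteJars can only do its job after unzipWar has extracted the WAR
deleteJars.dependsOn unzipWar
// createWar already declares dependsOn deleteJars, so running createWar
// now executes unzipWar -> deleteJars -> createWar in that order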
I’m getting the following error whenever I attempt to use a Copy task to copy a file into the root of a project (the same folder I’m running gradle from):
Failed to create MD5 hash for file content.
I thought this was related to the artifacts I was pulling from Artifactory, but that seems to be unrelated. I was able to get the same results with a minimal script.
Is there something obviously wrong with what I’m doing, or does Gradle intentionally disallow such things?
task fails(type: Copy) {
    from 'build/someFile.txt'
    into new File('.').absolutePath
}

task works(type: Copy) {
    from 'build/someFile.txt'
    into new File('.').absolutePath + '/output'
}
Short answer: Don't copy into the project directory. You are best off using into "$buildDir/someFolder" so that the folder is isolated to this single task, and also so that it will be cleaned up by gradle clean.
Long answer: At its core, Gradle has the concept of an "UP-TO-DATE" check for every single task. If Gradle sees that nothing has changed since the last time a task was executed, it will reuse the old result instead of executing the task again.
UP-TO-DATE checking is implemented by taking a "hash" of the task inputs and task outputs. Since you are using into '.', the entire contents of the project directory are considered a task output (bad).
Gradle uses the .gradle folder for temp files (e.g. task hashes). It's likely that some of these files are locked for writing while Gradle is also trying to read them (to calculate the "hash" of the task outputs), causing the error you are seeing.
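As a concrete illustration of the short answer, a task along these lines keeps its output isolated from the project directory (the someFile.txt source and the copied folder name are just placeholders):

task safeCopy(type: Copy) {
    from 'build/someFile.txt'   // placeholder source, as in the question
    into "$buildDir/copied"     // isolated output folder, removed by `gradle clean`
}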
* EDIT *
If you need to copy into the project directory for legacy reasons, you might use Project.copy(...) directly instead of a Copy task. You could manually manage the task inputs/outputs in this case.
E.g.:
task customCopy {
    inputs.file "$buildDir/someFile.txt"
    outputs.file 'someFile.txt'
    doLast {
        copy {
            from "$buildDir/someFile.txt"
            into '.'
        }
    }
}
Can you believe it, the following works
task myCopy(type: Copy) {
    from "$rootDir/app1/src/main/resources/db"
    into "$rootDir/app2/src/test/resources/db"
}
test.dependsOn myCopy
and the following doesn't 🤦
task myCopy(type: Copy) {
    from '$rootDir/app1/src/main/resources'
    into '$rootDir/app2/src/test/resources'
}
test.dependsOn myCopy
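The difference is Groovy string interpolation: $rootDir is only expanded inside double-quoted GStrings, while a single-quoted string is taken literally, so the second task points at a path that literally starts with $rootDir and copies nothing. A quick way to see it:

// single quotes: no interpolation, the text stays literal
println '$rootDir/app1/src/main/resources'   // prints: $rootDir/app1/src/main/resources
// double quotes: GString interpolation substitutes the project property
println "$rootDir/app1/src/main/resources"   // prints the resolved absolute path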
I've a simple task in Gradle:
task cleanBuild(type: Delete) {
    def build = ".src/buildfiles"
    FileTree tree = fileTree(dir: build)
    tree.each { File file ->
        println file
    }
}
When I run it, I get this output:
:user:cleanBuild UP-TO-DATE
BUILD SUCCESSFUL
Total time: 1.656 secs
I've read the docs, and they say that task results are cached for performance. I wanted to rerun the task, but I couldn't, even after editing the task code. So, apparently, Gradle is not able to detect that the task has been changed, which kind of sucks.
I've tried what others recommended, like adding this line to the task:
outputs.upToDateWhen { false }
But it doesn't have any effect.
You define a task of type Delete, but you don't define any files to delete. That is why your task is always up-to-date: it has nothing to do. You can define which files will be deleted via the delete method. Everything you pass to this method will be evaluated via Project.files(...):
task myDelete(type: Delete) {
    delete 'path/to/file', 'path/to/other/file'
}
Please note that your code example does not interfere with the up-to-date checks; it doesn't even interfere with the task at all. Since you are not using a doFirst/doLast closure, you are using the configuration closure of the task, which is executed during the configuration phase. And since you are not calling any task methods, your code would mean exactly the same thing if it were placed outside of the task configuration closure.
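To make the configuration-vs-execution distinction concrete, here is a small illustrative sketch (the task name phasesDemo is made up):

task phasesDemo {
    // runs during the configuration phase, on every Gradle invocation,
    // even when phasesDemo itself is never requested
    println 'configuring phasesDemo'
    // runs during the execution phase, only when the task is actually run
    doLast {
        println 'executing phasesDemo'
    }
}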
As a small addition: even if this specific problem is not caused by the Gradle up-to-date checks, there is a way to force Gradle to execute all tasks, ignoring any task optimizations: simply add --rerun-tasks as a command line argument, as described in the docs.
If you are trying to delete some additional files that are not deleted by the default clean task (which deletes only the build directory), you can extend the clean task to delete other things as well.
clean {
    delete += "$buildDir"
    delete += "$rootDir/someDir/someClass.java"
    delete += "$rootDir/otherDir"
}
Or create a new task to delete files and depend on it to hook it into the build lifecycle.
task deleteSomething(type: Delete) {
    // delete individual files and/or folders
    delete 'uglyFolder', 'uglyFile'
    // delete a directory (and its contents)
    delete 'uglyFolder'
    followSymlinks = true
}
By default, symlinks will not be followed when deleting files. To change this behavior, call Delete.setFollowSymlinks(boolean) with true. On systems that do not support symlinks, this will have no effect.
Or you can put the delete action into the execution phase:
task cleanBuild {
    def build = new File("$rootDir/src/buildfiles")
    doLast {
        build.deleteDir()
    }
}
Also, be sure the task has something to do: if there is nothing to delete, it will just print UP-TO-DATE, as #lu.koerfer's answer explains perfectly.
I have a copy task configured like this:
tmp = "$project.buildDir/tmp"
tmpClassesDir = "$tmp/WEB-INF/classes"
task copyFiles(type: Copy) {
    from sourceSets.main.java.srcDirs
    include '*properties'
    into tmpClassesDir
}
After the clean task deletes the build directory, if I run copyFiles again it is reported as UP-TO-DATE.
It works if I add --rerun-tasks or set outputs.dir to tmp, but when outputs.dir is set to tmpClassesDir, Gradle says it's UP-TO-DATE.
Any ideas what causes that strange behavior?
UPDATE: The problem exists only on remote test servers; when running locally it works well. Any ideas?