Sort contents of a property file in a Gradle task

As part of building a number of projects, I would like to sort the contents of some property files that are semi-generated but also checked in to source control. The generation/update step in Gradle leaves them in a different order each time (this happens in third-party plugin code; the changing order is probably due to its internal use of the java.util.Properties class).
What is the simplest way to sort the contents of a file in Gradle?
The files are not large, so reading the file into lines, sorting them, and writing them back out to the same file should suffice.

The following piece of code should do the job:
new File('lol').with { f -> f.text = f.readLines().findAll { it }.sort().join('\n') }
As a Gradle task (using doLast, since the << task operator was removed in Gradle 5) it would be:
task sortLines {
    doLast {
        new File('lol').with { f -> f.text = f.readLines().findAll { it }.sort().join('\n') }
    }
}
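If several semi-generated files need this, a variation could sort every properties file under the generated directory and run right after the generation step. This is only a sketch: the directory and the generateProps task name are placeholders, not from the question:
task sortProperties {
    doLast {
        fileTree(dir: 'src/main/resources', include: '**/*.properties').each { f ->
            // drop blank lines, sort the rest, write back in place
            f.text = f.readLines().findAll { it }.sort().join('\n') + '\n'
        }
    }
}
// hypothetical wiring: run the sort whenever the generating task has run
// generateProps.finalizedBy sortProperties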

Related

Do Gradle `Copy` tasks automatically depend on tasks in their `from` blocks?

Let's say there is a Gradle task that produces an artifact. For example, a Zip task:
tasks.register("myZip", Zip) {
...
}
Would the following task of type Copy automatically gain a dependency on task myZip?
task copyMyZips(type: Copy) {
    from { subprojects.findAll { it.tasks.findByName('myZip') }.myZip }
    into '/tmp'
}
Note the really convoluted way of referring to the task myZip.
Yes. Copy tasks do gain dependencies on the tasks and task outputs mentioned in their from blocks.
Per the documentation of the from method in the Copy class, as of Gradle 7.4:
AbstractCopyTask from(Object sourcePath, Closure c)
Specifies the source files or directories for a copy and creates a child CopySourceSpec. The given source path is evaluated as per Project.files(java.lang.Object[]).
In turn, the documentation of Project.files(java.lang.Object[]) says (emphasis mine):
Returns a ConfigurableFileCollection containing the given files. You can pass any of the following types to this method:
[...most of the list snipped...]
A Task. Converted to the task's output files. The task is executed if the file collection is used as an input to another task.
A TaskOutputs. Converted to the output files of the related task. The task is executed if the file collection is used as an input to another task.
Unfortunately, the documentation of Copy does not refer to what is passed into the from method as "inputs". However, judging by the code of AbstractCopyTask.java, every change to the CopySpec of the Copy task is propagated to the task inputs via a ChildSpecListener, which is added to the CopySpecInternal rootSpec field.
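In practice this also means the simple, non-convoluted form needs no explicit dependsOn. A minimal sketch, assuming a single project with the myZip task registered as above:
tasks.register('copyMyZip', Copy) {
    // passing the task (or its TaskProvider) converts it to the task's output
    // files and adds the dependency, per the Project.files() rules quoted above
    from tasks.named('myZip')
    into layout.buildDirectory.dir('copies')
}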

Get list of files containing string(s) or pattern(s)

Is there a Gradle pattern for retrieving the list of files in a folder or set of folders that contain a given string, set of strings, or pattern?
My project produces RPMs and is using the Nebula RPM type (great package!). There are a couple of different kinds of sets of files that need post-processing. I am trying to generate the list of files that contain the strings that are the indicators for post-processing. For example, files that contain "#doc" need to be processed by the doc generator script. Files that contain "#HOSTNAME#" and "#HOSTFQDN#" need to be processed by sed to replace the strings with the actual host name or host fqdn.
The search root in the package will be src\main\resources. With the result, the build script sets up the post-install script commands, something like:
postInstall('/opt/product/bin/postprocess.sh ' + join(filesContainingDocs, " "))
postInstall('/bin/sed -i -e "s/#HOSTNAME#/$(hostname -s)/" -e "s/#HOSTFQDN#/$(hostname)/" ' + join(filesContainingHostname, " "))
I can figure out the postinstall syntax. I'm having difficulty finding the filter for any of the regular Gradle 'things' (i.e., FileTree) that operate on contents of files rather than names of files. How would I populate filesContainingDocs and filesContainingHostname - something along the lines of:
filesContainingDocs = FileTree('src/main/resources', { contents.matches('#doc') })
filesContainingHostname = FileTree('src/main/resources', { contents.matches('#(HOSTNAME|HOSTFQDN)#') })
While the post-process script could simply do the grep, the several RPMs in our product overlay each other and each RPM should only post-process the files it provides, so a general grep over the final installed folder is not workable - it would catch files provided by other RPMs. It seems to me that I ought to be able to, at build time, produce the correct static list of files from the bigger set of source files that comprise the given RPM's project.
It doesn't have to be FileTree - running a command like findstr /s /m /c:"#doc" src\main\resources\*.conf (alas, the build platform is Windows) produces the answer in stdout but I'm not sure how to get that result into an object Gradle can use to expand the result. (I also suspect there is a 'more Gradle way' to do this.)
The set of files, and the contents of those files, is generally fairly small.
I'm having difficulty finding the filter for any of the regular Gradle 'things' (i.e., FileTree) that operate on contents of files rather than names of files.
You can apply any filter you can imagine on a Gradle file tree, in the end it is just Groovy (or Kotlin) code running in the JVM. Each Gradle FileTree is nothing more than a (lazily evaluated) collection of Java File objects. To filter those File objects, you can read their content, e.g. in the same way you would read them in Java. Groovy even provides a JDK enhancement for the Java class File that includes the simple method getText() for this purpose. Now you can easily filter for files that contain a certain string:
filesContainingDocs = fileTree('src/main/resources').filter { file ->
    file.text.contains('#doc')
}
Using Groovy, you can call getters like .getText() in the same way as accessing fields (.text in this case).
If a simple contains check is not enough, the Groovy JDK enhancements even provide the method matches(Pattern pattern) on CharSequence/String instances to perform a regular expression check:
filesContainingDocs = fileTree('src/main/resources').filter { file ->
    file.text.replace('\r\n', '\n').matches('(?s).*some regex.*')
}
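From there, the matched files can be joined into the post-install commands from the question. A rough sketch, assuming the matched source files end up directly under the install prefix used in the question (/opt/product) and that postInstall comes from the Nebula RPM plugin as above:
def docFiles = fileTree('src/main/resources').filter { it.text.contains('#doc') }
// map each matched source file to its installed path (simplified: file name only,
// no sub-directories) and join them with spaces
def installedDocPaths = docFiles.collect { '/opt/product/' + it.name }.join(' ')
postInstall('/opt/product/bin/postprocess.sh ' + installedDocPaths)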

Gradle copies all the files if at least one is not up-to-date

For instance, I've got the following task:
task testCopy(type: Copy) {
    from("folder/copy_from")
    into("folder/copy_to")
    eachFile { println it.name }
}
Unless the files inside the folder copy_from are touched, the task works fine. But as soon as I change, let's say, one file in the copy_from folder, Gradle copies all the files from copy_from into copy_to instead of copying only the one changed/added file.
Is this behaviour expected? Is there a way to make Gradle copy only changed/added file?
Yes. Based on this GitHub issue and this Gradle discussion thread:
The build is incremental in the sense that the copy task only executes
when things have changed but it isn’t incremental itself, in that it
only copies inputs that have been modified.
I couldn't find a proper solution, but one workaround is to split your task into smaller ones with specific include patterns.
task copy1(type: Copy) {
    into 'build/out'
    from('src') {
        include 'docs/*.txt'
    }
    eachFile { println it.name }
}
task copy2(type: Copy) {
    into 'build/out'
    from('src') {
        include 'docs/*.md'
    }
    eachFile { println it.name }
}
task copy3 {
    dependsOn copy1, copy2
}
It's not exactly what you want, but it improves performance by reducing the number of files to copy: when you change a text file and run gradle copy3, only the text files are copied, not the md files.
UPDATE:
The Ant copy task doesn't have this problem.
From its documentation:
By default, files are only copied if the source file is newer than the destination file, or when the destination file does not exist. However, you can explicitly overwrite files with the overwrite attribute
So you can use the Ant copy task instead, since Ant tasks can be called from Gradle:
task copyFromAnt {
    doLast {
        ant.copy(todir: 'build/out') {
            fileset(dir: 'src')
        }
    }
}
Ant logs the files it copies, so you can check the log with the help of gradle -d and grep:
gradle copyFromAnt -d | grep "\[ant:copy\]"
To see just the files it actually copies (without the up-to-date and other debug messages), use:
gradle copyFromAnt -d | grep "\[ant:copy\] Copying"
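If the copy really has to be incremental inside Gradle itself, another option is a custom task built on the incremental InputChanges API. This is a sketch only, assuming Gradle 5.4+; it follows the incremental-task example from the Gradle user guide and reuses the question's folder names:
import org.gradle.api.file.FileType
import org.gradle.work.ChangeType
import org.gradle.work.Incremental
import org.gradle.work.InputChanges

abstract class IncrementalCopy extends DefaultTask {
    @Incremental
    @PathSensitive(PathSensitivity.NAME_ONLY) // keeps the sketch simple; nested dirs are flattened
    @InputDirectory
    abstract DirectoryProperty getSourceDir()

    @OutputDirectory
    abstract DirectoryProperty getDestDir()

    @TaskAction
    void copyChanged(InputChanges changes) {
        // only the files reported as added, modified or removed are touched
        changes.getFileChanges(sourceDir).each { change ->
            if (change.fileType == FileType.DIRECTORY) return
            def target = destDir.file(change.normalizedPath).get().asFile
            if (change.changeType == ChangeType.REMOVED) {
                target.delete()
            } else {
                target.parentFile.mkdirs()
                target.bytes = change.file.bytes
            }
        }
    }
}

tasks.register('testCopyIncremental', IncrementalCopy) {
    sourceDir = layout.projectDirectory.dir('folder/copy_from')
    destDir = layout.projectDirectory.dir('folder/copy_to')
}
Whether this is worth the extra complexity over the split-copy or Ant approaches above depends on how large the file set is.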

How to remove an element from gradle task outputs?

Is it possible to exclude an element from the output files of a task, so that it is not considered for the up-to-date check? In my case I have a copy task that automatically sets the destination directory in the outputs variable, but I'd like to remove it and register only some of the copied files.
Or, as an alternative, is it possible to overwrite the entire outputs variable?
Thanks,
Michele.
Incremental tasks create snapshots from the input and output files of a task. If these snapshots are the same for two task executions (based on the hash code of the file contents), then Gradle assumes the task is up-to-date.
You cannot remove some files from the output and expect Gradle to forget about them, simply because the hash codes would then be different.
There is an option that allows you to manually define the logic of up-to-date checks.
You should use the upToDateWhen(Closure upToDateClosure) method of the TaskOutputs class.
task myTask {
    outputs.dir '/home/user/test'
    outputs.upToDateWhen {
        // your logic here
        return true // always up-to-date
    }
}
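For the copy case in the question, the closure could, for instance, check that the files you actually care about are still in place. Note that upToDateWhen only adds an extra condition on top of Gradle's normal input/output snapshot checks; it cannot relax them. The paths here just mirror the example below:
task reduceZipCheck(type: Copy) {
    from zipTree("C:/temp/temp.zip")
    into "C:/temp/unzip"
    outputs.upToDateWhen {
        // force a re-run if either of these two files has been deleted
        file("C:/temp/unzip/test.properties").exists() &&
            file("C:/temp/unzip/test.txt").exists()
    }
}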
I've found the solution:
task reduceZip(type: Copy) {
    outputs.files.setFrom(file("C:/temp/unzip/test.properties"))
    outputs.file(file("C:/temp/unzip/test.txt"))
    from zipTree("C:/temp/temp.zip")
    into "C:/temp/unzip"
}
The outputs.files list can only be modified by registering new elements, not by removing them (as far as I know). So I need to reset the list and then add the other files as needed. The outputs.files.setFrom method resets the outputs.files list, so other files can then be added. In the example above I reduce the up-to-date check to only the test.properties and test.txt files.

How can I transform a .properties file during a Gradle build?

As part of a deploy task in Gradle, I want to change the value of a property in foo.properties to point to a production database instead of a development database.
I'd rather not replace the whole file outright, as it's rather large and it means we would have to maintain two separate versions that only differ on a single line.
What is the best way to accomplish this?
You can use the ant.propertyfile task:
ant.propertyfile(file: "myfile.properties") {
    entry(key: "propertyName", value: "propertyValue")
    entry(key: "anotherProperty", operation: "del")
}
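To tie this to the deploy task from the question, the call can be wrapped in a task action so that it only runs on deploy. A sketch with hypothetical task, key, and value names:
tasks.register('pointAtProdDb') {
    doLast {
        ant.propertyfile(file: "foo.properties") {
            // placeholder key/value for the real production settings
            entry(key: "db.url", value: "jdbc:prod-database-url")
        }
    }
}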
You should be able to fire off an ant "replace" task that does what you want: http://ant.apache.org/manual/Tasks/replace.html
ant.replace(file: "blah", token: "wibble", value: "flibble")
Create a Properties object, create a file object pointing at the targeted properties file, load the file into the Properties object with load, set the desired property with setProperty, and save the changes with store.
def props = new Properties()
File myfile = file("foo.properties")
myfile.withInputStream { props.load(it) }
props.setProperty("db", "prod")
myfile.withWriter { props.store(it, null) }
A simple solution is to code a task that uses java.util.Properties to write the file. If you really want to incrementally update the file, you'll have to implement this on your own. Or maybe you find an Ant task that does what you want (all Ant tasks can be used as-is from Gradle). For best results, you should also declare the inputs and outputs of the task, so that Gradle only executes the tasks when the properties file needs to be changed.
You can use ReplaceTokens
Say you have a file called db.properties in src/main/resources/com/stackoverflow (the location within the resources folder doesn't matter) with the following content
database.url=#url#
Note that the # surrounding the url text is required.
You can then define this in your build.gradle file.
import org.apache.tools.ant.filters.ReplaceTokens // at the top of build.gradle

processResources {
    filter ReplaceTokens, tokens: [
        "url": "https://stackoverflow.com"
    ]
}
When you build your code, this would replace #url# with https://stackoverflow.com.
If you are only interested in applying this rule to a specific file, you can add a filesMatching block:
processResources {
    filesMatching('**/db.properties') {
        filter ReplaceTokens, tokens: [
            "url": "https://stackoverflow.com"
        ]
    }
}
See the Gradle documentation for more explanation.
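To cover the deploy scenario from the question, the token value could be selected from a project property. A sketch, assuming Gradle 6.2+ for providers.gradleProperty, a hypothetical dbEnv property, and placeholder URLs:
def dbUrl = providers.gradleProperty('dbEnv').getOrElse('dev') == 'prod' ?
        'jdbc:prod-database-url' : 'jdbc:dev-database-url'

processResources {
    filesMatching('**/db.properties') {
        filter ReplaceTokens, tokens: ["url": dbUrl]
    }
}
Running the build with -PdbEnv=prod would then bake the production value into db.properties.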
