Closure defined in root not visible in child - gradle

I have root project and subproject (:child).
The root build looks like this:
def foo = {
    println("foo")
}
allprojects {
    task bar << {
        println(project.name + ":bar")
    }
    afterEvaluate {
        foo()
    }
}
Running gradle bar prints:
foo
foo
:bar
:child:bar
child:bar
parent:bar
This makes sense. However, IRL I need foo to be called by the child's build file (because I want it to be called only by some of the submodules).
The documentation seems to be clear enough: In a multi-project build, sub-projects inherit the properties and methods of their parent project
However, moving the "afterEvaluate" block above into child/build.gradle results in an error: Could not find method foo() for arguments [] on project ':child' of type org.gradle.api.Project.
Why does this happen and how do I fix this? I have tried a whole bunch of different variations - moving the def around (to buildscript, allprojects, to ext, to allprojects.ext, making it a variable in ext, instead of a method etc.), referring to it differently (as rootProject.foo, rootProject.foo(), ext.foo() etc.) - nothing seems to work.
Any ideas?

Variables need to be declared in the ext namespace for them to be propagated downstream. Try:
ext.foo = {
    println("foo")
}
ref: https://docs.gradle.org/current/dsl/org.gradle.api.plugins.ExtraPropertiesExtension.html
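With foo declared on ext, the child build script can resolve it as an inherited extra property and call it like a method. A minimal sketch (the afterEvaluate placement mirrors the question):

```groovy
// child/build.gradle -- foo is inherited from the root project's extra properties
afterEvaluate {
    foo()                    // resolved dynamically via the parent's ext
    // rootProject.ext.foo() also works as an explicit form
}
```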


How can I create an incremental build to derive output filenames from input filenames?

I must write a plugin to compile files of a certain type not covered by an existing plugin.
My initial requirements are simply that if the input file is changed or if the output is missing, then the task will not be up-to-date. In other words, I want the core "working with files" paradigm to work for me.
My first attempt was to declare both inputs and outputs to see if the project would work as most would expect.
class TestPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.task('compile') {
            doFirst {
                inputs.files.eachWithIndex { inFilename, idx ->
                    def inFile = project.file(inFilename)
                    def outFilename = outputs.files[idx]
                    def outFile = project.file(outFilename)
                    logger.info "converting ${inFile} to ${outFile}"
                    outFile.append "something"
                }
            }
        }
    }
}
apply plugin: TestPlugin
compile {
    inputs.file "test.in"
    outputs.file "test.out"
}
And it does. At this point it does all I need except for one thing: I have to define outputs in correlation with inputs. But defining the outputs is complicated enough to warrant factoring that part of the code into the plugin's task.
My next attempt was to try to get the task to define its outputs but it fails because when the "populate outputs" code executes, inputs is empty and so no outputs are added.
class TestPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.task('compile') {
            inputs.files.each { outputs.files.add(it.replaceFirst(~/\.in$/, '.out')) }
            doFirst {
                inputs.files.eachWithIndex { inFilename, idx ->
                    def inFile = project.file(inFilename)
                    def outFilename = outputs.files[idx]
                    def outFile = project.file(outFilename)
                    logger.info "converting ${inFile} to ${outFile}"
                    outFile.append "something"
                }
            }
        }
    }
}
apply plugin: TestPlugin
compile { inputs.file "test.in" }
The above fails with a "path may not be null..." error caused by indexing an empty outputs list (because inputs is empty at the time the task's outer block iterates over inputs).
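One way around the eager evaluation (a hedged sketch, not necessarily how the core plugins do it) is to pass outputs.files a closure instead of a concrete list; such arguments are resolved lazily, the same way Project.files resolves closures, so the mapping runs only after the build script has registered the inputs:

```groovy
class TestPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.task('compile') {
            // evaluated lazily, after `compile { inputs.file "test.in" }` has run
            outputs.files {
                inputs.files.collect { f ->
                    new File(f.parentFile, f.name.replaceFirst(~/\.in$/, '.out'))
                }
            }
            doFirst {
                // ...conversion as in the question...
            }
        }
    }
}
```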
I tried populating outputs in Project.afterEvaluate and Project.beforeEvaluate but neither worked. The beforeEvaluate closure never executed. The afterEvaluate closure executed, well, after the project was evaluated which means it executed after the task was set as either up-to-date or out-of-date and so it's not useful for what I need.
I also tried Project.configure but that didn't work either.
Then I discovered lazy configuration but since it's incubating I think I should avoid it. I'm assuming that the Java and C plugins don't use incubating features and that leaves me wondering how those plugins are accomplishing the equivalent of lazy configuration.
Then I found Gradle plugin for custom language but it doesn't have an answer. It does, however, have a comment leading me to look at a lot of gradle source code. Again, I don't think I should have to reinvent the wheel.
Lastly, if I have to push the decision to compile into doFirst and not bother with declaring outputs (and thereby abandon task up-to-datedness), then so be it. I just need to know and move on.
For more context, I'm migrating a build away from ant. The two udemy.com classes I took helped me a lot but didn't go far enough to lead me in a confident direction to solve the stated problem. I asked a similar question to one of the instructors and its community to no avail.

Gradle doesn't collect strings starting with -D

Is that a Gradle or Groovy bug?
I want to pass JVM parameters from Gradle to the forked JVM, which unfortunately is not done automatically. This is supposed to work in build.gradle:
...
bootRun {
    jvmArgs = System.properties.iterator().findAll {
        it.key.startsWith('myapp')
    }.collect { "-D${it.key}=${it.value}" }
}
...
It is executed as:
gradle bootRun -Dmyapp.port=34501 -Dmyapp.member.name=server1
The collect call always returns an empty collection if the string starts with -D; if it starts with anything else, it returns the expected two-element String collection. If I put a space before the -D it also works, but that breaks the build further downstream: :findMainClass misinterprets -Dmyapp.port=... as the main class name. The argument simply has to start with -D.
I also tried different ways of concatenating the string, but as long as the result starts with -D, it doesn't work.
Is it a bug, or am I missing something? This is my first Gradle project and I'm not a Groovy developer.
Should I report it as a bug? If so, where: Groovy or Gradle?
Notes:
I'm running Gradle from IntelliJ IDE 2016.1.2
Using Gradle 3.5
Forked JVM runs Spring Boot application
UPDATE
Big apologies, my bad! The truth is, the JVM parameters are passed down by the formula above; the problem was with how I concluded that they weren't. I simply added printouts:
println "jvmArgs: ${jvmArgs}"
println "jvmArgs.size: ${jvmArgs.size}"
println "jvmArgs.class: ${jvmArgs.class}"
..and aborted bootRun if jvmArgs.size == 0 to avoid the slow application start; that is, I wasn't actually checking in the application itself whether the parameters were passed. It turned out they were.
FYI the outputs were:
jvmArgs: []
jvmArgs.size: 0
jvmArgs.class: java.lang.ArrayList
The class of jvmArgs is reported as a standard ArrayList, but it behaves more like an input-stream consumer: whatever list jvmArgs is assigned, that list is scanned for strings starting with "-D", those are consumed (by what?) and passed to some ProcessBuilder (??), and jvmArgs is left with only the remaining elements.
Take this example:
jvmArgs = ["-Daaa=bbb", "foo", "bar"]
jvmArgs = ["stuff", "-Dccc=ddd", "morestuff"]
jvmArgs = ["-Deee=fff"]
println "jvmArgs: ${jvmArgs}"
..it prints jvmArgs: [] and the Spring Boot application is launched with -Daaa=bbb -Dccc=ddd -Deee=fff.
Can someone explain what causes this magic stream-like behavior of jvmArgs, which otherwise claims to be a simple ArrayList?
This works for me, but I don't have an explanation for the observed behavior. Hope it helps anyway.
def array = System.properties.iterator().findAll {
    it.key.startsWith('myapp')
}.collect {
    "-D${it.key}=${it.value}"
}
jvmArgs.addAll(array)
EDIT
jvmArgs = ["value"] calls setJvmArgs which, if I haven't missed something, goes from JavaExec to JavaExecHandleBuilder and later to JvmOptions. There, some parameters get removed: entries beginning with -D get added to systemProperties instead.
setJvmArgs(["-Dtest=1", "xx"])
println getJvmArgs()     // [xx]
println systemProperties // [test:1]
Doesn't your application have access to those properties?
https://github.com/gradle/gradle/blob/master/subprojects/core/src/main/java/org/gradle/process/internal/JvmOptions.java#L183
EDIT: what's happening in the background
In Groovy, a property assignment calls the setter, and a property access calls the getter; they are interchangeable. If you omit the setter/getter pair, it is generated for you and is visible in the bytecode. But you can even omit the property itself, write only the getter and setter pair, and use it as a property.
class Foo {
    def setBar(String foo) { println "no thanks" }
    String getBar() { "test" }
}
f = new Foo()
f.bar = "write Var"             // prints "no thanks"
println f.bar instanceof String // f.getBar() instanceof String -> true
println f.bar                   // test
So you never assigned a List to a variable, but called setJvmArgs(List). You can list all args with getAllJvmArgs() btw.
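To illustrate where the -D entries end up (values below are made up; the exact println output depends on your Gradle version):

```groovy
bootRun {
    jvmArgs = ["-Dmyapp.port=34501", "-Xmx512m"]  // actually calls setJvmArgs(...)
    println jvmArgs          // the -D entry is gone from here...
    println systemProperties // ...because it moved into the system properties map
    println allJvmArgs       // getAllJvmArgs() reassembles both views
}
```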
In combination with delegation strategies and dynamic Properties/Methods, this can be a blessing for DSL programming, but a curse to debug...
http://groovy-lang.org/style-guide.html#_getters_and_setters
and search for "groovy propertyMissing", "groovy metaprogramming", or "groovy resolve strategies" if you'd like to learn more about this topic.

How do extension properties act like methods that take a closure and objects at the same time

Assume your build.gradle is very simple, like
apply plugin: 'groovy'
ext.foo1 = 'bar1'
ext {
    foo2 = 'bar2'
}
assert foo1 == 'bar1'
assert foo2 == 'bar2'
This is legitimate Groovy, but I don't understand why. In the second reference to ext, ext is treated like a method that takes a closure whose delegate is set to the ext instance. Yet in the first reference, it acts like just an ExtraProperties instance. Using something like:
println ext.class.name
Actually causes an error, because "class" doesn't exist on ext. This might be because ext is a regular object with a dynamically added ExtensionAware interface, added by extensions.create(...). But that's a far-fetched, not-quite-reasonable guess.
I don't know how these kinds of properties are set up. The documentation is only clear on how extra properties are intended to be used, not how they work or what they are. Can anyone explain?
(1) How does groovy know to go to project.ext.prop1 when 'prop1' is referenced in the build script?
(2) What is 'ext', really?
The following modified script should give you an understanding:
apply plugin: 'groovy'
ext.foo1 = 'bar1'
ext {
    println "log1:: ${it.getClass()} ${System.identityHashCode(it)}"
    foo2 = 'bar2'
}
println "log2:: ${ext.getClass()} ${System.identityHashCode(ext)}"
assert foo1 == 'bar1'
assert foo2 == 'bar2'
output:
log1:: class org.gradle.api.internal.plugins.DefaultExtraPropertiesExtension 464908575
log2:: class org.gradle.api.internal.plugins.DefaultExtraPropertiesExtension 464908575
This means that ext { ... } is equivalent to ext.with { ... }
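The same mechanics can be shown in plain Groovy, using an Expando as a simplified stand-in for DefaultExtraPropertiesExtension:

```groovy
// Expando accepts arbitrary dynamic properties, like Gradle's ext does
def ext = new Expando()

ext.foo1 = 'bar1'   // property-style access
ext.with {          // with {} sets the closure's delegate to ext...
    foo2 = 'bar2'   // ...so bare names resolve against ext
}

assert ext.foo1 == 'bar1'
assert ext.foo2 == 'bar2'
```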

Gradle looping inside ProcessResources and filter

In a jar task, I want to replace some text in a conf file.
jar {
    ext {
        excludedClasses = ["com.MyClass1", "com.MyClass2"]
    }
    doFirst {
        println 'Jar task started execution'
        println 'Excluded classes ' + excludedClasses
        exclude(excludedClasses)
    }
    doLast {
        println 'Jar task finished execution'
    }
    processResources {
        filesMatching('**/moduleconfiguration/conf.json') { f ->
            excludedClasses.each { c ->
                filter {
                    println it
                    it.replace(c, "com.MyClass3")
                }
            }
        }
    }
}
But the above code tries to replace c in all *.class files as well, resulting in an illegal jar. I want it to make replacements only in the '**/moduleconfiguration/conf.json' file.
How can I achieve that?
UPDATE
Looks like I am suffering from the same problem described here: https://issues.gradle.org/browse/GRADLE-1566. That issue has been resolved, but it recurs if I use an each loop inside processResources.
Meanwhile, I have found 2 solutions to my problem as follows:
Solution 1: Changing the order of filter and the each loop, i.e. looping inside filter:
filesMatching('**/moduleconfiguration/conf.json') { f ->
    filter {
        excludedClasses.each { c ->
            println it
            it = it.replace(c, "com.MyClass3")
        }
        it
    }
}
Solution 2: Using a regex instead of the each loop:
filesMatching('**/moduleconfiguration/conf.json') { f ->
    filter {
        println it
        def regex = excludedClasses.join("|") // watch for .(dot) or other regex chars here
        it.replaceAll(regex, "com.MyClass3")
    }
}
I am still wondering why the scope of filtering changes to all files if I use an each loop within the filesMatching closure. Is this a Groovy thing or a Gradle thing? I would be very thankful if someone could explain what is happening there.
UPDATE 2
println output of the values of delegate, this, and owner at different positions, for the problematic case:
:processResources
Inside filesMatching. delegate:file '.../configuration/conf.json' this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90#6f3e18b8
Problematic Case inside loop before filter. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#4587ec31 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#4587ec31
Problematic Case inside loop before filter. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#4587ec31 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#4587ec31
:classes
:jar
Jar task started execution
Excluded classes [MyClass1.class, MyClass2.class]
Problematic Case inside loop inside filter. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#3a0d0128 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#3a0d0128
...
println output of the values of delegate, this, and owner at different positions, for solution 1:
:processResources
Inside filesMatching. delegate:file '.../configuration/conf.json' this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90#6ece61a3
Solution 1 Inside filter before loop. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#64af2ad7 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91#64af2ad7
Solution 1 Inside filter inside loop. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#22c74276 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#22c74276
Solution 1 Inside filter inside loop. delegate:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#22c74276 this:root project 'projectName' owner:build_95q5jrf5z5ao0hk03tsevn2t0$_run_closure10_closure90_closure91_closure92#22c74276
...
:classes
:jar
Jar task started execution
Excluded classes [MyClass1.class, MyClass2.class]
Update
Based on your second update and some testing on my side, it doesn't seem to be exactly what I originally suggested. It definitely appears to be something with delegation, but I can't pin it down.
You can illustrate the difference in your original (problematic) example by changing the filter { line to f.filter {. This explicitly tries to execute the FileCopyDetails#filter method instead of whichever happens to be in scope.
You should be able to see that when calling f.filter, only the matched file is filtered. When you call filter on its own, you are calling the task-level one. However, since you are already in the middle of copying files, the filter is only applied to files that occur alphabetically after the first one that matches.
For example, if you have this folder structure:
+ resources/main
- abc.json
- configuration.json
- def.json
- efg.json
The files def.json and efg.json will be filtered, but the first two will not.
This still doesn't answer why it doesn't call the correct filter method, but it at least confirms that it is calling the task level one.
Original Answer
I believe this is due to the Groovy closures delegating to different objects. I believe you are actually calling Copy#filter in the original (problematic) case, and FileCopyDetails#filter in the two solutions. The copy task's filter method will apply a filter to everything, whereas the details filter method will be for the specific file in your filesMatching.
You should be able to illustrate the different delegates by printing them out:
Original
processResources {
    filesMatching('**/moduleconfiguration/conf.json') { f ->
        excludedClasses.each { c ->
            // print the class of this closure's delegate
            println delegate.class
            filter {
                it.replace(c, "com.MyClass3")
            }
        }
    }
}
Solution 1
filesMatching('**/moduleconfiguration/conf.json') { f ->
    filter {
        // print the class of this closure's delegate
        println delegate.class
        excludedClasses.each { c ->
            println it
            it = it.replace(c, "com.MyClass3")
        }
        it
    }
}
My expectation is that you will see the original delegating to your Copy task (i.e. processResources) and the solutions delegating to the FileCopyDetails. By default, all closures in Groovy are "owned" by the object in which they were declared, and also "delegate" to the owner. When Gradle has APIs that take a closure as an argument (such as the filter methods), they are usually reconfigured to have a specific delegate (usually the enclosing container object) and to use the "delegate first" strategy for method lookup.
See Groovy's documentation on delegation strategies.
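The lookup order can be reproduced in plain Groovy; the class names below are simplified stand-ins for Gradle's types:

```groovy
class CopyTask {
    String filter() { "task-level filter" }

    Closure makeBody() {
        return { filter() }  // this closure's owner is the CopyTask instance
    }
}
class FileCopyDetails {
    String filter() { "per-file filter" }
}

def body = new CopyTask().makeBody()
body.delegate = new FileCopyDetails()

// Groovy's default strategy (owner first): the task-level method wins
assert body() == "task-level filter"

// what Gradle typically configures on closures it is handed
body.resolveStrategy = Closure.DELEGATE_FIRST
assert body() == "per-file filter"
```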

How do I ensure the running order of Puppet classes?

I am new to Puppet deployment. I have two classes defined:
class taskname {
  exec { 'deploy_script':
    command => "cp ${old_path} ${new_path}",
    user    => 'root',
  }
  cron { 'cron_script2':
    command => "python ${new_path}",
    user    => 'root',
    require => Exec['deploy_script'],
  }
}
class taskname2 {
  exec { 'deploy_script2':
    command => "cp ${old_path} ${new_path}",
    user    => 'root',
  }
  cron { 'cron_script':
    command => "python ${new_path}",
    user    => 'root',
    require => Exec['deploy_script2'],
  }
}
How do I make sure these two classes run in the right order?
I have tried including these two classes in a new manifest file, init.pp:
include taskname
include taskname2
It seems that the second task runs before the first. How do I enforce the running order?
Use one of these metaparameters.
So to sum up: whenever a resource depends on another resource, use the before or require metaparameter or chain the resources with ->. Whenever a resource needs to refresh when another resource changes, use the notify or subscribe metaparameter or chain the resources with ~>. Some resources will autorequire other resources if they see them, which can save you some effort.
This also works for classes declared with resource-like syntax:
When declared with the resource-like syntax, a class may use any metaparameter. In such cases, every resource contained in the class will also have that metaparameter. So if you declare a class with noop => true, every resource in the class will also have noop => true, unless they specifically override it. Metaparameters which can take more than one value (like the relationship metaparameters) will merge the values from the container and any specific values from the individual resource.
Try using the chaining arrow -> to specify a dependency relationship between the classes. In init.pp, where you declare/instantiate these classes, replace the include statements with the resource-like class syntax:
class { 'taskname': } ->
class { 'taskname2': }
This will ensure taskname is invoked before taskname2. For more information, see http://docs.puppetlabs.com/guides/parameterized_classes.html#declaring-a-parameterized-class
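Alternatively, if you prefer to keep the include statements, the same ordering can be expressed afterwards with chained class references (a sketch using the same class names):

```puppet
include taskname
include taskname2

# Class references can be chained without redeclaring the classes
Class['taskname'] -> Class['taskname2']
```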
