custom war tasks and applying custom resources within Gradle

I want to have dynamic WAR tasks based on customer configuration. I created an array with the configuration names and tried to apply custom behavior like so:
ext.customerBuilds = ['customer1', 'customer2', 'customer3']
ext.customerBuilds.eachWithIndex() { obj, i ->
    task "dist_${obj}"(type: War) << {
        from "etc/customers/${obj}/deploy"
        println "I'm task number $i"
    }
};
This creates my three tasks, like dist_customer1, etc. Now I want Gradle to use the normal resources under src/main/webapp AND also my customer-based ones under etc/customers/XXXX/deploy, as stated in the from property.
But it doesn't pick up any file in this folder.
What am I doing wrong here? Thanks.

When setting up your War task, make sure you don't accidentally use the '<<' notation.
'<<' is just a shortcut for 'Task#doLast', so the closure only runs after the WAR content has already been assembled and the from call has no effect. Instead do:
ext.customerBuilds = ['customer1', 'customer2', 'customer3']
ext.customerBuilds.eachWithIndex() { obj, i ->
    task("dist_${obj}", type: War) {
        from "etc/customers/${obj}/deploy"
        println "I'm task number $i"
    }
};
You can just add more from statements to pick up stuff from 'src/main/webapp'.
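If the customer tasks still don't see the standard webapp content, a minimal sketch of a combined task adds that from explicitly (assuming the war plugin is applied and the default src/main/webapp layout):
ext.customerBuilds = ['customer1', 'customer2', 'customer3']
ext.customerBuilds.eachWithIndex() { obj, i ->
    task("dist_${obj}", type: War) {
        // configured at configuration time, so the content is wired up before the task runs
        from 'src/main/webapp'                 // the standard webapp resources
        from "etc/customers/${obj}/deploy"     // the customer-specific overlay
    }
}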

Related

Gradle aggregate task outputs

How do I pass a task's outputs forward as the outputs of a subsequent task?
I want to dynamically register subtasks in a loop and aggregate them. For example, I want the "assemble" task to include both .tar archives created by "assemble_test" and "assemble_prod".
This works but definitely feels wrong. What am I missing?
plugins {
    id "base"
}
["test", "prod"].each { env ->
    def task = tasks.register("assemble_${env}", Tar) {
        archiveFileName = "output-${env}.tar"
        from env
    }
    assemble.outputs.files(task.get().outputs.files)
}
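A lazier variant, offered only as a sketch under the assumption that Gradle resolves a TaskProvider passed to outputs.files into that task's output files (which also wires the task dependency), avoids calling task.get() during configuration:
plugins {
    id "base"
}
["test", "prod"].each { env ->
    def tarTask = tasks.register("assemble_${env}", Tar) {
        archiveFileName = "output-${env}.tar"
        from env
    }
    tasks.named("assemble").configure {
        // the TaskProvider is resolved to the Tar task's outputs and carries the dependency
        outputs.files(tarTask)
    }
}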

Jenkins pipeline - change class name of loaded scripts

I have a Jenkins Pipeline project which loads several Groovy scripts. When I run this pipeline, Jenkins names these scripts' classes Script1, Script2, and so on. These names are displayed when replaying a build. They also appear in exception stack traces. I find this confusing, especially when there are more than a couple of scripts.
Is there any way of setting these names from the pipeline or, preferably, from within the scripts themselves? So far I have tried manipulating the scripts' metaClass:
this.metaClass.name = 'Foo' //fails, doesn't find metaClass property
this.class.metaClass.name = 'Foo' //doesn't fail but has no apparent effect
this.class.metaClass.simpleName = 'Foo' //idem
this.class.metaClass.canonicalName = 'Foo' //idem
NOTE: I am well aware of Jenkins shared libraries. This question is meant to focus on loaded scripts alone.
No, there is currently no way to change the generated class name for a loaded script.
The name generation starts in the load step implementation class, LoadStepExecution:
String clazz = execution.getNextScriptName(step.getPath());
In CpsFlowExecution, the script name comes from calling generateScriptName() on the shell, which is a CpsGroovyShell; this invocation also removes the .groovy suffix:
public String getNextScriptName(String path) {
    return shell.generateScriptName().replaceFirst("[.]groovy$", "");
}
The CpsGroovyShell generates the class name, which is where Script1.groovy, Script2.groovy, etc. come from:
@Override
protected synchronized String generateScriptName() {
    if (execution!=null)
        return "Script" + (execution.loadedScripts.size()+1) + ".groovy";
    else
        return super.generateScriptName();
}
Maybe this will be of some help to some users. By chance I managed to force the name of the script class by explicitly declaring a class inside the script, like:
class MyOwnScriptClass {
    def someClassMember() {
    }
}
return new MyOwnScriptClass()
After loading that file, it shows MyOwnScriptClass as the class name - for the object returned by the script, not for the script itself. However, for my purposes this is sufficient.
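For illustration, a rough sketch of how that pattern is consumed from the pipeline; the file name myOwnScriptClass.groovy is hypothetical:
// the script file ends with: return new MyOwnScriptClass()
def helper = load 'myOwnScriptClass.groovy'
// calls now run against MyOwnScriptClass, so stack traces thrown from
// someClassMember() report that name instead of ScriptN
helper.someClassMember()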

How to run multiple filters on various file types in processResources

I'm trying to do some processing on some source files before moving them to the build directory. Specifically, for files with a .template name, replace instances of #timestamp# with the timeStamp variable I've defined. Additionally, for .jsp files, I would like to remove whitespace between tags.
The first part, replacing the timestamp, works. Replacing the whitespace in the JSPs does not.
processResources {
    def timeStamp = Long.toString(System.currentTimeMillis())
    from('../src/resources/webapp') {
        include '**/*.template'
        filter {
            it.replace('#timestamp#', timeStamp)
        }
        rename '(.*)\\.template', '$1'
    }
    from('../src/resources/webapp') {
        include '**/*.jsp'
        filter {
            it.replace('>\\s+<', '><')
        }
    }
}
Before using processResources, I had done something like this for the minification:
task minifyJSPs(type: Copy) {
    from('../src/resources/webapp') {
        include '**/*.jsp'
        filter {
            it.replace('>\\s+<', '><')
        }
    }
    into 'gbuild'
}
Filtering the files like this using Copy worked; however, I noticed I wasn't able to copy from a location to itself -- the file would end up empty. This meant that I had to copy files from one intermediate directory to another, applying a filter at each step.
I want to be able to apply various transformations to my source in one step. How can I fix my processResources task?
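One likely cause, offered as a sketch rather than a confirmed fix: String.replace() performs a literal substitution, so the pattern '>\s+<' is searched for as plain text and never matches, whereas replaceAll() treats it as a regex. Note that filter processes one line at a time, so only whitespace within a single line is collapsed:
processResources {
    def timeStamp = Long.toString(System.currentTimeMillis())
    from('../src/resources/webapp') {
        include '**/*.template'
        filter { it.replace('#timestamp#', timeStamp) }   // literal replace is fine here
        rename '(.*)\\.template', '$1'
    }
    from('../src/resources/webapp') {
        include '**/*.jsp'
        filter { it.replaceAll(/>\s+</, '><') }           // regex-aware replacement
    }
}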

Rename the filename in build.gradle with version

I am working on a Gradle script where I need to rename the artifact at the end of the build, but I am facing an issue; my code is below. The version comes from the version.properties file and I am able to read it properly in build.gradle, but when I try to change the name of the artifact at the end, e.g. from libmain.so to NativeJNI-4.0.0_15, the name doesn't change to that; instead libmain.so ends up renamed to the literal text filechange. Can someone let me know what the issue is here?
def filechange = file("NativeJNI-${project.version}.so")
//println filechange
task fixartifactname(type: Copy) {
    //def filechange = "NativeJNI-${project.version}.so"
    //println filechange
    from 'build/binaries/mainSharedLibrary'
    into 'build/libs'
    // def filechange = file("NativeJNI-${project.version}.so")
    println filechange
    include('libmain.so')
    rename('libmain.so', '${filechange}')
}
//println fixartifactname
build.dependsOn fixartifactname
I was able to fix my issue in the following way:
def filechange = file("build/libs/NativeJNI-${project.version}.so")
task copyfile(type: Copy) {
from 'build/binaries/mainSharedLibrary'
into 'build/libs'
include('libmain.so')
rename ('libmain.so', filechange.name)
}
From your question I understand that you're building a native binary. Did you try to set the baseName property for your component? That way the shared library would be created with the desired name in the first place (see the sketch at the end of this answer).
Now, regarding the code above, it contains two problems:
1. When calling rename you wrap ${filechange} in single quotes (') rather than double quotes ("), so the GString is never interpolated and the literal text is used as the new name. There is also no need for a closure here.
2. By default, using filechange inside rename maps to its string value, which is the full path rather than just the new file name. To overcome this, simply use the name property of filechange.
To conclude, each of the following options should do the job for you:
// use double quotes so the GString is interpolated
rename('libmain.so', "$filechange.name")
// or pass the value directly, no quotes needed
rename('libmain.so', filechange.name)
// or a Groovy-style call with no parentheses
rename 'libmain.so', filechange.name
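For the baseName suggestion, a rough sketch, assuming the (since-removed) software-model native plugins that produce build/binaries/mainSharedLibrary; the exact DSL depends on the Gradle version in use:
model {
    components {
        main(NativeLibrarySpec) {
            // baseName drives the generated file name, e.g. libNativeJNI-4.0.0_15.so on Linux
            baseName = "NativeJNI-${project.version}"
        }
    }
}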

multiple into sections in gradle zip seems to fail

Very similar to this question: Gradle Zip task to do multiple sub-trees?, which I don't believe is fully answered, merely circumvented.
I have a project with child projects, built with Gradle 1.6, and I need to assemble some results into multiple paths, but I too see only the last path surviving.
task zip(type: Zip) {
    from('src/resources') {
        into '/'
    }
    from('web') {
        into '/'
    }
    from project('A').reference { into('A') }
    from project('B').reference { into('B') }
}
(Essentially the reference task creates a few directories which are named the same in A and B, so the project name needs to be prepended.)
Obviously the references all end up in /B/** in the zip file. When I reverse the order of the two lines, they end up in /A/**.
The other two go correctly into /. If I move the subprojects up before the root resources, they still go in either /A or /B depending on their order, but the normal resources end up in / as expected.
I would essentially like to include the subprojects dynamically, i.e.
project.subprojects.each {
    def pname = it.name
    from project(pname).reference {
        into "$pname"
    }
}
but all my attempts so far have been in vain.
Any pointers welcome.
The syntax doesn't look right. It should be from(project('A').reference) { into ('A') }. (Same for B.) Does this make a difference?
PS: into "/" is redundant and can be omitted.
