Gradle: how can I call a 'def' from an imported script?

I am currently modularizing our Gradle build in order to have a libs/commons.gradle file containing a lot of global stuff. I need this because various branches of the software are being developed in parallel, and we'd like to avoid spreading every script-file change across all branches.
So I created that lib file and used "apply from" to load it:
apply from: 'gradle/glib/commons.gradle'
Inside commons.gradle I define the svnRevision function:
...
def svnRevision = {
    ISVNOptions options = SVNWCUtil.createDefaultOptions(true);
    SVNClientManager clientManager = SVNClientManager.newInstance(options);
    SVNStatusClient statusClient = clientManager.getStatusClient();
    SVNStatus status = statusClient.doStatus(projectDir, false);
    SVNRevision revision = status.getCommittedRevision();
    return revision.getNumber().toString();
}
...
I am calling the function from the build.gradle that includes it:
...
task writeVersionProperties {
    File f = new File(project.webAppDirName+'/WEB-INF/version.properties');
    if (f.exists()) { f.delete(); }
    f = new File(project.webAppDirName+'/WEB-INF/version.properties');
    FileOutputStream os = new FileOutputStream(f);
    os.write(("version="+svnRevision()).getBytes());
    os.flush();
    os.close();
}
...
But I end up with:
...
FAILURE: Build failed with an exception.
* Where:
Build $PATH_TO/build20.gradle
* What went wrong:
A problem occurred evaluating root project 'DEV_7.X.X_GRADLEZATION'.
> Could not find method svnRevision() for arguments [] on root project 'DEV_7.X.X_GRADLEZATION'.
...
So my question is: how can I call a function in Gradle that is defined in an included script?
Any help appreciated!

From http://www.gradle.org/docs/current/userguide/writing_build_scripts.html:
13.4.1. Local variables
Local variables are declared with the def keyword. They are only visible in the scope where they have been declared. Local variables are a feature of the underlying Groovy language.
13.4.2. Extra properties
All enhanced objects in Gradle's domain model can hold extra user-defined properties. This includes, but is not limited to, projects, tasks, and source sets. Extra properties can be added, read and set via the owning object's ext property. Alternatively, an ext block can be used to add multiple properties at once.
If you declare it as:
ext.svnRevision = {
    ...
}
and don't change the call, I expect it will work.
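For illustration, a minimal sketch of the resulting setup (names taken from the question; the SVNKit lookup is elided):

// gradle/glib/commons.gradle
ext.svnRevision = {
    // ... SVNKit status lookup as shown in the question ...
    return revision.getNumber().toString()
}

// build.gradle
apply from: 'gradle/glib/commons.gradle'
// the closure is now an extra property of the project, so the
// unqualified call svnRevision() resolves
println "version=" + svnRevision()

Since an applied script delegates to the project it is applied to, ext.svnRevision lands on the project itself and is therefore visible to build.gradle.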

Related

how to "include" another file as part of a Jenkins Pipeline definition

We have a large project that has multiple separate declarative pipeline file definitions. These are used to build different apps and installers from the single code base.
Right now, all of these files contain a large block of "code" used to generate the email body and JIRA update messages. Examples:
// Get JIRAs to add comments to
// Return a map of JIRA id to comment text from all commits for that JIRA
@NonCPS
def getJiraMap() {
    a bunch of stuff
    return jiraset
}
// Get the body text for the emails
def getMailBody1() {
    return "See: ${BUILD_URL}\n\nChanges:\n" + getChangeString() + "\n" + testStatuses()
}
etc...
What I would like to do is have all these common methods in a separate file that all the other pipeline files can include. This seems like it SHOULD be easy, but all the examples I've found appear to be rather complex, involving a separate SCM, which is NOT what I want.
Updates:
Going through the various suggestions given in that link, I made the following file, BuildTools.groovy. Note that this file is in the same directory as the Jenkins pipeline file that uses it.
import hudson.tasks.test.AbstractTestResultAction
import hudson.model.Actionable
class BuildTools {
    // Get JIRAs to add comments to
    // Return a map of JIRA id to comment text from all commits for that JIRA
    @NonCPS
    def getJiraMap() {
        def jiraset = [:]
        .. whole bunch of stuff ..
Here are the various things I've tried, and the results.
File sourceFile = new File("./AutomatedBuild/BuildTools.groovy");
Class gcl = new GroovyClassLoader(getClass().getClassLoader()).parseClass(sourceFile);
GroovyObject bt = (GroovyObject) gcl.newInstance();
Fails with:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use method java.lang.Class getClassLoader
evaluate(new File("./AutomatedBuild/BuildTools.groovy"))
def bt = new BuildTools()
Fails with:
15:29:07 WorkflowScript: 8: unable to resolve class BuildTools
15:29:07 @ line 8, column 10.
15:29:07 def bt = new BuildTools()
15:29:07 ^
import BuildTools
def bt = new BuildTools()
Fails with:
15:35:58 WorkflowScript: 16: unable to resolve class BuildTools (note that BuildTools.groovy is in the same folder as this script)
15:35:58 @ line 16, column 1.
15:35:58 import BuildTools
15:35:58 ^
GroovyShell shell = new GroovyShell()
def bt = shell.parse(new File("./AutomatedBuild/BuildTools.groovy"))
Fails with:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new groovy.lang.GroovyShell
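For what it's worth, one route that avoids the sandbox-restricted calls above is the built-in load step of Jenkins Pipeline; a minimal sketch, assuming BuildTools.groovy keeps its methods at script level (no class wrapper) and ends with return this:

// BuildTools.groovy -- methods at script level, no class wrapper
@NonCPS
def getJiraMap() {
    def jiraset = [:]
    // ... whole bunch of stuff ...
    return jiraset
}
return this   // hand the loaded script object back to the caller

// In the pipeline that uses it (load needs a workspace, so call it
// inside a node { } block):
def bt = load "${env.WORKSPACE}/AutomatedBuild/BuildTools.groovy"
def jiras = bt.getJiraMap()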

gradle: how to use result of task in configuration phase (plugin ospackage)

I have a gradle script in which I configure a plugin (in my case ospackage but I guess the same would apply to another plugin) using a variable as per:
ospackage {
    ...
    version project.ext.my_version
    ...
}
This variable is first initialized and then is updated using a task that I call first in my build script:
ext {
    ...
    my_version = "XXX"
    ...
}
task init {
    group 'ho'
    description 'get HO Version'
    doLast {
        ...
        project.ext.my_version = getParameter("VERSION")
        ...
    }
}
The problem is that the plugin (in my case ospackage) always considers the initial value "XXX", not the value set by executing the init task.
I know it has something to do with the configuration and execution phases, but I still cannot find a workaround to do what I want.
For info, I also tried to create a task like the one below, but it also fails, as it seems the buildDeb task does not overwrite the ospackage version parameter:
buildDeb {
    doLast {
        ...
        version project.ext.my_version
        link('/usr/bin/aa', '/usr/bin/aa.sh')
        ...
    }
}
I also tried to put something like this at the end of my file:
ospackage.dependsOn("init")
but the problem is that ospackage is not recognized as a task.
Thank you in advance for your help.
It looks to me like the essence of your question revolves around on-demand values. My understanding is that you would like to set a version number during the configuration phase and use that value during the execution phase to set a package version using the ospackage plugin.
The issue is that the ospackage documentation (to date) only provides examples that set up the package constants during the configuration phase. Obviously that won't work here, because that is the same time you are still computing your version. You have the right idea with doLast. I found that some ospackage settings, such as packageName, cannot go in "doLast" blocks (at least if you have more than one package/task of the same type), so I put only the things that require on-demand evaluation in that block (the version, because its evaluation needs to be delayed until the execution phase).
My solution was to create a variable that holds the function that resolves the version.
def versionForRpm = { -> project.version }
Create a configuration block:
configurations.ext {
    version = versionForRpm
    ...
}
This is an example of an on-demand value (aka lazily-evaluated value).
task someRpmBuild(type: Rpm) {
    // all package configs that require evaluation during execution phase (lazy)
    doLast {
        version = configurations.ext.version
        requires("someotherpackageinthisbuild", configurations.ext.version(), 0)
    }
    // all package configs that may be evaluated during the configuration phase
    release = configurations.ext.release
    packageGroup = configurations.ext.packageGroup
    license = configurations.ext.license
    packager = configurations.ext.packager
    user = configurations.ext.user
    distribution = configurations.ext.distribution
    vendor = configurations.ext.vendor
    url = configurations.ext.url
    os = configurations.ext.os
    buildHost = configurations.ext.buildHost
    epoch = configurations.ext.epoch
    arch = configurations.ext.arch
}
Note that configurations.ext.version will be "called" automatically in the execution phase. I needed to explicitly call it when used as an argument in requires, however.
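In other words, assigning the closure defers its evaluation until the property is read, while passing it as an ordinary method argument does not; a small illustration using the versionForRpm closure from above:

doLast {
    // assignment: per the note above, the closure is evaluated when
    // the property is read during the execution phase
    version = versionForRpm
    // plain argument: nothing invokes the closure for you, so call
    // it explicitly
    requires("someotherpackageinthisbuild", versionForRpm(), 0)
}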
According to the documentation, the task type is Deb:
task fooDeb(type: Deb) {
    packageName // Default to project.name
    packageDescription // Defaults to project.description
    version // Version field, defaults to project.version
    arch // Architecture, defaults to "all". E.g. "amd64", "all"
    multiArch // Configure multi-arch behavior: NONE (default), SAME, FOREIGN, ALLOWED (see: https://wiki.ubuntu.com/MultiarchSpec )
    release // DEB Release
    epoch // Epoch, defaults to 0
    user // Default user to permission files to
    permissionGroup // Default group to permission files to, "group" is used by Gradle for the display of tasks
    packageGroup
    buildHost
    license
    packager
    distribution
    vendor
    url
    signingKeyId
    signingKeyPassphrase
    signingKeyRingFile
    sourcePackage
    provides
    uid // Default uid of files
    gid // Default gid of files
    createDirectoryEntry // [Boolean]
    maintainer // Defaults to packager
    uploaders // Defaults to packager
    priority
    summary
    conflicts
    recommends
    suggests
    enhances
    preDepends
    breaks
    replaces
}
where:
version // Version field, defaults to project.version
You might give the RPM plugin a try.
I was able to solve the issue I had (setting the ospackage copy destination to a calculated value) by using:
configurations.ext {
    mydestdir = ""
    rpmVersion = "1"
    releaseNumber = "1"
}
task init {
    group 'ho'
    description 'get HO Version'
    doLast {
        ...
        configurations.ext.mydestdir = "/store/tmp/" + getSubDir()
        configurations.ext.rpmVersion = "123"
        configurations.ext.releaseNumber = "456"
        ...
    }
}
task fooRpm(type: Rpm) {
    dependsOn init
    ...
    doLast {
        version = configurations.rpmVersion
        release = configurations.releaseNumber
    }
    from(project.tempDir) {
        into configurations.mydestdir
        fileMode = 0644
        user = "nobody"
        permissionGroup = "nobody"
    }
}
I think you'll have to use type Deb and make some changes, but this should speed up your build, and you can verify the results by adding --scan before and after making these changes.

Gradle script: what's the difference between using the DSL directly and applying scripts via a custom "apply from:" function?

First, there are some common scripts deployed in a private Maven repo:
http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle
http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle
In the target project, build.gradle
subprojects {
    apply from: 'http://domain/repo/com/d/build/script/java-project/1.0/java-project-1.0.gradle'
    apply from: 'http://domain/repo/com/d/build/script/maven/1.0/maven-1.0.gradle'
}
it's OK!
but,
ext.applyScript = { script, version ->
    apply from: "http://domain/repo/com/d/build/script/${script}/${version}/${script}-${version}.gradle"
}
subprojects {
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}
it will fail with the message:
"Error:Cannot add task ':javadocJar' as a task with that name already exists."
The task ':javadocJar' is defined in the script 'java-project-1.0.gradle', and we have several subprojects.
Why?
BTW: can anyone give me a lead on the source location of "apply from:"?
It's hard to locate it by myself.
The problem is that in the latter case you are applying the scripts multiple times to the same root project.
How is that possible? It is quite interesting and a little bit tricky:
- you are defining applyScript as a Closure on the extension container ext of the current Gradle project
- generally, apply from: ... is handled as a method call apply(Map) on the org.gradle.api.plugins.PluginAware interface, which is one of the super interfaces of the org.gradle.api.Project interface
- this means every time you write apply ... you are calling the apply method on the current Gradle project (the one where the apply ... is specified)
- as you defined the apply ... as part of the closure, the standard delegation applies: it is semantically the same as this.apply ..., and this by default points to the enclosing class/object, which is the root project (here it cannot be anything else)
So even if it looks like you are applying the 2 scripts to all the subprojects, you are actually applying them N times to the root project (where N is the number of subprojects).
What you need to do is change the delegate to the correct Project instance.
You can do it very easily by adding one additional argument to the closure and explicitly calling the apply method on that argument:
ext.applyScript = { project, script, version ->
    project.apply from: "..."
}
subprojects {
    applyScript(it, 'java-project', '1.0')
    applyScript(it, 'maven', '1.0')
}
or you can set the delegate explicitly:
ext.applyScript = { script, version ->
    apply from: "..."
}
subprojects {
    applyScript.resolveStrategy = Closure.DELEGATE_FIRST
    applyScript.delegate = it
    applyScript('java-project', '1.0')
    applyScript('maven', '1.0')
}

Is it possible to stop Gradle from raising an error because of "." and "-" in the name of a property? E.g. name.dir (.dir not found) in a task

The error message:
* What went wrong:
A problem occurred evaluating root project 'telescope-master'.
> Cannot get property 'dir' on null object
gradle.properties file:
classes.dir = WebContent/WEB-INF/classes
webContent.dir = WebContent
template.dir = hdm/template
javascript.dir = hdm/function
javascript4.0.2.dir = hdm/function/4.0.2
datamodel.dir = hdm/datamodel
certificate.dir = certificate
build.gradle file:
Properties extFile = new Properties()
extFile.load(new FileInputStream('gradle.properties'))
task FirmwareMatch(type: Zip) {
    from("${extFile.javascript.dir}")
    include 'factoryResetOnFirmwareMatch.*'
    archiveName 'factoryResetOnFirmwareMatch.zip'
    destinationDir file('dist/hdm/function')
}
So basically, if I remove the "." from .dir in both files, it works. But is there any way to override it?
Also, how can I display the actual date when using ${TODAY} in Gradle?
So your problematic expression is:
extFile.javascript.dir
If we break that into how Groovy will interpret it:
extFile.getProperty('javascript').getProperty('dir')
You want Groovy to interpret it as:
extFile.getProperty('javascript.dir')
Besides directly calling getProperty, here are a couple Groovy options:
extFile.'javascript.dir'
extFile['javascript.dir']
Additionally, assuming your gradle.properties file is either in your project root (generally as a sibling of build.gradle) or in your Gradle user home directory (i.e. ~/.gradle/gradle.properties), it will be loaded automatically by Gradle and all its properties will be available as project properties.
So you can remove all of your properties parsing code and just do the following:
project.getProperty('javascript.dir')
// or
project.'javascript.dir'
// or
project['javascript.dir']
If you want to protect against those properties not being set, and are on Gradle 2.13 or higher, you can use findProperty instead of getProperty which will return null instead of throwing an exception.
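For example, a null-safe lookup with a fallback (the default value here is purely illustrative):

// findProperty returns null for a missing property (Gradle 2.13+),
// so the elvis operator can supply a default
def jsDir = project.findProperty('javascript.dir') ?: 'hdm/function'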

Reading includes from idl file in custom task

I want to make my Gradle build intelligent when building my model.
To achieve this I was planning to read the schema files, determine what is included, and then build the included models first (if they are not already present).
I'm pretty new to Groovy and Gradle, so please take that into account.
What I have:
A build.gradle file in the root directory, with n subdirectories (subprojects added to settings.gradle). I have only one Gradle build file, because I defined the tasks like this:
subprojects {
    task init
    task includeDependencies(type: checkDependencies)
    task build
    task dist
    (...)
}
I will return to checkDependencies shortly.
The schema files are located externally, where I can see them.
Each of them has 0 to 3 lines that declare dependencies and look like this:
#include "ModelDir/ModelName.idl"
In my build.gradle I created a task class that should open and read those dependencies, preferably returning them:
class parsingIDL extends DefaultTask {
    String idlFileName = "*def file name*"
    def regex = ~/#include .*\/(\w*).idl/
    @TaskAction
    def checkDependencies() {
        File idlFile = new File(idlFileName)
        if (!idlFile.exists()) {
            logger.error("File not found")
        } else {
            idlFile.eachLine { line ->
                def dep = []
                def matcher = regex.matcher(line)
                (...)*
            }
        }
    }
}
What should I have in (...)* to find all the dependencies, and how should I define that, for example,
subprojectA::build.dependsOn([subprojectB::dist, subprojectC::dist])?
All I could find on the internet created a dep that output the following:
[]
[]
[modelName]
[]
[]
(...)
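For what it's worth, a minimal sketch of what could go into (...)*, assuming at most one #include per line. Note that dep is declared inside eachLine in the snippet above, so it is reset on every line, which would explain the mostly empty lists in this output; declaring it once outside the loop lets the matches accumulate:

def dep = []                      // declare once so results accumulate
idlFile.eachLine { line ->
    def matcher = regex.matcher(line)
    if (matcher.find()) {
        dep << matcher.group(1)   // the captured model name, e.g. "ModelName"
    }
}
return dep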
