I'm creating pipelines automatically with Job DSL, and I want to add each pipeline to an already existing view.
I'm using this script:
def myJobs = [
    'Test1',
    'Test2'
]

myJobs.each { mj ->
    job(mj) {
        steps {
            shell('echo ')
        }
    }
}
listView('TestView') {
    filterBuildQueue()
    filterExecutors()
    jobs {
        myJobs.each { mj ->
            name(mj)
        }
    }
    columns {
        status()
        weather()
        name()
        lastSuccess()
        lastFailure()
        lastDuration()
        buildButton()
    }
}
With this script, if I want to add two new jobs (for example Test3 and Test4), the old pipelines are deleted and the view then contains only the new pipelines.
How can I add new pipelines without deleting the old ones? Thank you
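One hedged approach (a sketch, not verified against a live Jenkins): the seed script regenerates both the jobs and the view on every run, so either keep every job name, old and new, in myJobs, or let the view match jobs with a regular expression so that new jobs are picked up automatically. Separately, check the seed job's "Action for removed jobs" setting in the Process Job DSLs build step; if it is set to Delete, any job the script no longer references is removed.

```
listView('TestView') {
    filterBuildQueue()
    filterExecutors()
    jobs {
        // matches Test1..Test4 and any future TestN without listing each name
        regex(/Test.*/)
    }
    columns {
        status()
        name()
        buildButton()
    }
}
```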
Suppose I have the following folder structure:
folder
- subfolderA
  - module1.mod
  - module1.a
  - module1.b
  - module1.c
  - module2.mod
  - module2.a
  - module2.b
  - module2.c
  - module1.d
- subfolderB
  - module3.mod
  - module3.a
  - module3.b
  - module3.c
  - module3.d
I'd like to flatten away just the "subfolder" tier of directories, producing the following:
outputFolder
-module1.mod
-module1.a
-module1.b
-module1.c
-module2.mod
-module2.a
-module2.b
-module2.c
-module3.mod
-module3.a
-module3.b
-module3.c
-module1.d
-module3.d
I expected this to be extremely simple, with:
copy {
    from "folder/*/"
    into "outputFolder"
}
But this didn't work. What's the easiest way to flatten away one (or more) layers of subdirectories?
You could probably do it as
copy {
    from 'folder'
    include '*/**/*.*'
    eachFile { FileCopyDetails fcd ->
        int slashIndex = fcd.path.indexOf('/')
        fcd.path = fcd.path.substring(slashIndex + 1)
    }
    into 'outputFolder'
}
Or perhaps
copy {
    from { file('folder').listFiles().findAll { it.directory } }
    into 'outputFolder'
}
I eventually settled on the following as the best combination of clean and configurable. By modifying n, you can flatten as many directory levels as you like:
copy {
    from "folder"
    include "**/*"
    eachFile { fcd ->
        // drop the first n path segments (RelativePath is org.gradle.api.file.RelativePath)
        fcd.relativePath = new RelativePath(true, fcd.relativePath.segments.drop(n))
    }
    includeEmptyDirs = false
    into "outputFolder"
}
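The same logic can also live in a standalone Copy task; here is a sketch with a hypothetical task name, assuming n = 1 to strip exactly the subfolder tier from the question:

```
import org.gradle.api.file.RelativePath

tasks.register('flattenModules', Copy) {
    def n = 1 // number of leading directory levels to drop
    from 'folder'
    include '**/*'
    eachFile { fcd ->
        fcd.relativePath = new RelativePath(true, fcd.relativePath.segments.drop(n))
    }
    includeEmptyDirs = false
    into 'outputFolder'
}
```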
In brief, I want to find all files that end with .sql and copy them if they exist.
There may be zero or more such files in the etc directory.
File sqlfiles = file('etc/' + '*.sql')
logger.info("Looking for SQL files: " + sqlfiles)

if (sqlfiles.exists()) {
    logger.info("Found log SQL file: " + sqlfiles)
    copy {
        from sqlfiles
        into "$rpmStoredir"
    }
} else {
    logger.warn("No SQL file found - skipping")
}
With my code, the wildcard is not working.
Adding an "include" to the copy, as below, does work, but I still want to figure out how to log a warning when no file exists:
copy {
    from "etc/"
    include "*.sql"
    into "$rpmStoredir"
}
file(...) is the wrong method to use here, as it returns a single java.io.File and does not expand wildcards.
You could do something like
FileTree myTree = fileTree('etc') {
    include '*.sql'
}

if (myTree.empty) {
    ...
} else {
    copy {
        from myTree
        ...
    }
}
See Project.fileTree(Object, Closure) and FileTree
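Putting that together with the logging from the question, a sketch of the full intent (assuming rpmStoredir is defined elsewhere, as in the original script) might look like this:

```
def sqlFiles = fileTree('etc') {
    include '*.sql'
}

if (sqlFiles.empty) {
    logger.warn('No SQL file found - skipping')
} else {
    logger.info("Found SQL files: ${sqlFiles.files}")
    copy {
        from sqlFiles
        into "$rpmStoredir"
    }
}
```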
Is there a way in Gradle to do something like this?
task printIsSpecificInputUpToDate() {
    inputs.property("file1", file("file1.log"))
    inputs.property("file2", findProperty("file2.log"))
    outputs.file(file("file3.log"))

    // if one or more inputs is not up to date
    doLast {
        // find out if file1 is actually the input out of date
        // NOTE: pseudo-code!
        if (inputs.get("file1").isUpToDate()) {
            onlyProcessFile2()
        } else {
            processFile1AndFile2()
        }
    }
}
If there is not, does that indicate that Gradle think this would be a bad pattern?
I think what you are looking for are Incremental tasks.
You need to define your own task class for this, but then you can query exactly for the changed files in your inputs:
@TaskAction
void execute(IncrementalTaskInputs inputs) {
    println inputs.incremental ? 'CHANGED inputs considered out of date'
                               : 'ALL inputs considered out of date'
    if (!inputs.incremental)
        project.delete(outputDir.listFiles())

    inputs.outOfDate { change ->
        println "out of date: ${change.file.name}"
        def targetFile = new File(outputDir, change.file.name)
        targetFile.text = change.file.text.reverse()
    }

    inputs.removed { change ->
        println "removed: ${change.file.name}"
        def targetFile = new File(outputDir, change.file.name)
        targetFile.delete()
    }
}
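For context, here is a minimal sketch of the task class around that action, modeled on the incremental-task example from the Gradle docs (note that IncrementalTaskInputs was later replaced by InputChanges in newer Gradle versions):

```
class IncrementalReverseTask extends DefaultTask {
    @InputDirectory
    File inputDir

    @OutputDirectory
    File outputDir

    @TaskAction
    void execute(IncrementalTaskInputs inputs) {
        // on a non-incremental run, rebuild all outputs from scratch
        if (!inputs.incremental)
            project.delete(outputDir.listFiles())

        inputs.outOfDate { change ->
            new File(outputDir, change.file.name).text = change.file.text.reverse()
        }

        inputs.removed { change ->
            new File(outputDir, change.file.name).delete()
        }
    }
}
```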
As part of my project, I need to read files from a directory and perform some operations on them, all within the build script. For each file the operation is the same (reading some SQL queries and executing them). This is a repetitive task, so it seems better to write it as a method. Since I'm new to Gradle, I don't know how this should be done. Please help.
One approach is given below:
ext.myMethod = { param1, param2 ->
    // Method body here
}
Note that this is created at the project scope, i.e. it is globally available for the project and can be invoked anywhere in the build script as myMethod(p1, p2), which is equivalent to project.myMethod(p1, p2).
The method can be defined under different scopes as well, such as within tasks:
task myTask {
    ext.myMethod = { param1, param2 ->
        // Method body here
    }

    doLast {
        myMethod(p1, p2) // This will resolve 'myMethod' defined in the task
    }
}
If you define a method in any other *.gradle file, ext.method() makes it accessible project-wide. For example, here is a versioning.gradle:
// ext makes the method callable project-wide
ext.getVersionName = { ->
    try {
        def branchout = new ByteArrayOutputStream()
        exec {
            commandLine 'git', 'rev-parse', '--abbrev-ref', 'HEAD'
            standardOutput = branchout
        }
        def branch = branchout.toString().trim()

        if (branch.equals("master")) {
            def stdout = new ByteArrayOutputStream()
            exec {
                commandLine 'git', 'describe', '--tags'
                standardOutput = stdout
            }
            return stdout.toString().trim()
        } else {
            return branch
        }
    } catch (ignored) {
        return null
    }
}
build.gradle
task showVersion {
    doLast {
        // Use the inherited method
        println 'VersionName: ' + getVersionName()
    }
}
Without the ext.method() format, the method will only be available within the *.gradle file in which it is declared. The same applies to properties.
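A small sketch of that scoping rule (file and method names are hypothetical):

```
// helpers.gradle
def localHelper() {              // visible only inside helpers.gradle
    return 'local'
}
ext.sharedHelper = { ->          // visible project-wide once the file is applied
    return 'shared'
}

// build.gradle
// apply from: 'helpers.gradle'
// println sharedHelper()        // works
// println localHelper()         // fails: not visible outside helpers.gradle
```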
You can define methods in the following way:
// Define an extra property
ext.srcDirName = 'src/java'

// Define a method
def getSrcDir(project) {
    return project.file(srcDirName)
}
You can find more details in the Gradle documentation, Chapter 62: Organizing Build Logic.
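A quick usage sketch for the method above (the task name is hypothetical):

```
task showSrcDir {
    doLast {
        // resolves 'src/java' against the project directory
        println getSrcDir(project).absolutePath
    }
}
```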
An example with a root object containing methods.
hg.gradle file:
ext.hg = [
    cloneOrPull: { source, dest, branch ->
        if (!dest.isDirectory())
            hg.clone(source, dest, branch)
        else
            hg.pull(dest)
        hg.update(dest, branch)
    },
    clone: { source, dest, branch ->
        dest.mkdirs()
        exec {
            commandLine 'hg', 'clone', '--noupdate', source, dest.absolutePath
        }
    },
    pull: { dest ->
        exec {
            workingDir dest.absolutePath
            commandLine 'hg', 'pull'
        }
    },
    // referenced by cloneOrPull above, so it needs an entry as well
    update: { dest, branch ->
        exec {
            workingDir dest.absolutePath
            commandLine 'hg', 'update', branch
        }
    },
]
build.gradle file:
apply from: 'hg.gradle'

// clone expects (source, dest, branch); dest must be a File
hg.clone('path/to/repo', file('build/repo'), 'default')
Somehow, maybe because it's been five years since the OP, none of the
ext.someMethod = { foo ->
    methodBody
}
approaches are working for me. Instead, a simple function definition gets the job done in my Gradle file:
def retrieveEnvvar(String envvar_name) {
    def value = System.getenv(envvar_name)
    // System.getenv returns null (not "") when the variable is unset
    if (value == null || value.isEmpty()) {
        throw new InvalidUserDataException("\n\n\nPlease specify environment variable ${envvar_name}\n")
    } else {
        return value
    }
}
And I call it elsewhere in my script with no prefix, i.e. retrieveEnvvar("APP_PASSWORD").
This is 2020, so I'm using Gradle 6.1.1.
@ether_joe the top-voted answer by @InvisibleArrow above does work; however, you must define the method before you call it, i.e. earlier in the build.gradle file.
You can see an example here. I have used this approach with Gradle 6.5 and it works.
With Kotlin DSL (build.gradle.kts) you can define regular functions and use them.
It doesn't matter whether you define your function before the call site or after it.
println(generateString())

fun generateString(): String {
    return "Black Forest"
}

tasks.create("MyTask") {
    println(generateString())
}
If you want to import and use a function from another script, see this answer and this answer.
In my React Native project's build.gradle:
def func_abc(y) { return "abc" + y }
then
def x = func_abc("y")
If you want to check the result:
throw new GradleException("x=" + x)
or
println "x=" + x