I have a folder with a lot of projects inside it (too many to write build files for manually).
The projects are mostly in a flat layout:
root
-project 1
-project 2
-project 3
-project 4
-project 5
  ( -project 5.1)
But they can be nested, as with project 5.1 above, and I need to account for this.
Ideally the following should happen:
I can run gradle build from the root directory (user@user:/root$ gradle build) and every project in the directory should be built, as long as it contains a Gradle build file.
If a build fails, just continue with the next one.
How can I make this possible?
How about this one-liner (not tested):
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && gradle build || true" \;
Or, more verbose:
dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
  # run the build in a subshell so we do not have to cd back
  (cd "$dir" && gradle build) || true
done
I came up with a working solution:
def flist = []
// change to your workspace name
new File('./Workspace').eachDir {
    // blacklist any folders you want
    if (it.name != '.gradle' && it.name != 'master' && it.name != 'Build-All') {
        flist << it.name
    }
}
// build task objects
flist.each { folder ->
    task "${folder}"(type: GradleBuild) {
        buildFile = "./Workspace/" + folder + "/build.gradle"
        dir = './Workspace/' + folder
        tasks = ['build']
    }
}
// create super task
task (all, dependsOn: flist) {
}
You need to invoke it from the root directory: gradle :all --continue. This has the benefit that any failing project build will not halt the other builds.
Another bonus is that Gradle gives a neat report about all failing builds.
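Since the question also mentions nested projects (project 5.1), here is a hedged variant of the same idea (untested): walk the workspace recursively and only register folders that actually contain a build file. The allNested task name and the blacklist contents are illustrative, not part of the original solution.
def flist = []
def blacklist = ['.gradle', 'master', 'Build-All']
// walk the tree recursively so nested projects like 'project 5.1' are found;
// only folders that actually contain a build.gradle become tasks
new File('./Workspace').eachDirRecurse { projDir ->
    if (!blacklist.contains(projDir.name) && new File(projDir, 'build.gradle').exists()) {
        flist << projDir
    }
}
// note: task names must be unique, so nested folders may not share a name
flist.each { projDir ->
    task "${projDir.name}"(type: GradleBuild) {
        buildFile = new File(projDir, 'build.gradle')
        dir = projDir
        tasks = ['build']
    }
}
task (allNested, dependsOn: flist.collect { it.name }) {
}
Invoke it the same way: gradle :allNested --continue.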
Within a Jenkinsfile, I am running regression tests in pytest and using the --junitxml flag to produce test output.
Each regression test runs in its own stage and writes an .xml file with the test output. I then stash these XML files (as each stage runs on a different agent). After all the regression tests have run, the stashed files are recovered for reporting.
Please see below:
stage ('Regression 01') {
agent {
label 'rhel1'
}
steps {
sh "cd /directory1/appServer && /home/appServer/py/venvs/*/bin/python -m " +
"pytest -m fast test-dir/regression_test.py -c conf.cfg --junitxml /share/01.xml"
stash includes: '01.xml', name: 'test01'
}
}
stage ('Regression 02') {
agent {
label 'rhel2'
}
steps {
sh "cd /directory1/appServer && /home/appServer/py/venvs/*/bin/python -m " +
"pytest -m fast test-dir/regression_test.py -c conf.cfg --junitxml /share/1.xml"
stash includes: '02.xml', name: 'test02'
}
}
post {
always {
unstash 'test01'
unstash 'test02'
junit "*.xml"
...
}
}
I have a total of 10 regression tests running, each of which stashes a unique .xml file, and I am looking to add more, so hardcoding the XML test names is not feasible.
How can I add some automation or logic to my Jenkinsfile that will handle the XML naming, stashing, and unstashing?
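One possible approach, as a minimal sketch (untested) for a scripted pipeline: derive the stage name, XML file name, and stash name from a single index, and collect the stash names so the reporting step can loop over them. The rhelN labels and the paths are taken from the question; plain for loops are used because they are safe with the pipeline's CPS transform.
// kept in the script binding (no 'def') so it is visible in every block
stashNames = []

// one agent label per regression test; extend this list to add tests
def labels = ['rhel1', 'rhel2' /* , ... */]

for (int i = 0; i < labels.size(); i++) {
    def id = String.format('%02d', i + 1)   // '01', '02', ...
    stage("Regression ${id}") {
        node(labels[i]) {
            sh "cd /directory1/appServer && /home/appServer/py/venvs/*/bin/python -m " +
               "pytest -m fast test-dir/regression_test.py -c conf.cfg --junitxml /share/${id}.xml"
            stash includes: "${id}.xml", name: "test${id}"
            stashNames << "test${id}"
        }
    }
}

// reporting step: recover every stash and publish the results
node {
    for (name in stashNames) { unstash name }
    junit '*.xml'
}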
I have a huge Maven project that a lot of people are using. I'm currently converting it to Gradle. One of the last steps will be to merge the Gradle files and delete the pom.xml files. But I'd like to add a Gradle task to clean the Maven target directories (of all the sub-projects). In shell I would do something like:
find . -type d -name target -exec rm -rf "{}" \;
But I prefer this to be a gradle task. How do I add it? This is what I tried but it doesn't delete anything:
task cleanMaven(type: Delete) {
delete fileTree('.').matching { include '**/target/**' }
}
The task below handles all modules of the root project and prints true if a target directory existed and was deleted:
import java.nio.file.Files
import java.nio.file.Paths

allprojects {
    task mvnClean {
        doFirst {
            def targetPath = project.projectDir.toString() + '/target'
            // note: deleteIfExists only removes an empty directory and
            // throws DirectoryNotEmptyException otherwise
            println "target dir exists: ${Files.deleteIfExists(Paths.get(targetPath))}"
        }
    }
}
Based on @PrasadU's answer, but this also deletes all the contents of the target/ directories:
allprojects {
task mvnClean {
doFirst {
def targetPath = project.projectDir.toString() + '/target'
project.delete(files("${targetPath}"))
}
}
}
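For what it's worth, the Delete task type from the original question can also be made to work; a hedged sketch (untested) follows. The fileTree in the question only matched files, whereas pointing delete at the directory itself removes it and its contents recursively:
allprojects {
    task mvnClean(type: Delete) {
        // a relative path is resolved against each project's directory;
        // deleting a directory also removes everything inside it
        delete 'target'
    }
}
Running gradle mvnClean then removes every module's target directory.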
I am running a shell script within my Jenkins pipeline, and I want to run a find command to get all files with certain extensions and copy them to the scan folder inside the "AppName" folder.
Here's the code:
stage("SCA Check"){
node("default"){
checkout scm
conf.findAll { key, value -> key.contains("token.") }.each { key, value ->
tokens << string(credentialsId: value, variable: key.replace('token.',''))
}
withEnv(vars) {
withCredentials(tokens){
dir(dirpath) {
dir(AppName) {
git url: gitURL,
credentialsId: 'bitbucket-https-url-rdonly',
branch: branchName
}
sh """#!/bin/bash
set
echo '${scaInfo}'
python --version
cp Action.py '${AppName}' && cd '${AppName}' && mkdir scan
"find . -regex '.*\.\(sql\|conf\|py\|csv\|coveragerc\|css\|eot\|etlconf\|hql\|html\|idx\|ini\|js\|json\|log\|map\|md\|pack\|pdf\|sample\|sh\|svg\|ttf\|txt\|woff\|woff2\)$' -exec cp {} scan/ \;"
echo find
cd scan && zip scan.zip * && mv scan.zip .. && cd ..
python Action.py '${scaInfo}'
"""
}
}
}
}}
}
But this gives an error while running the pipeline:
Branch event
Obtained jenkinsfile from 1ace31daa88df82dd21cb4a04251065b78562fdf
Running in Durability level: PERFORMANCE_OPTIMIZED
[Bitbucket] Notifying commit build result
[Bitbucket] Build result notified
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 108: unexpected char: '\' @ line 108, column 22.
find . -regex '.*\.\(sql\|conf\|py\|csv\|coveragerc\|css\|eot\|etlconf\|hql\|html\|idx\|ini\|js\|json\|log\|map\|md\|pack\|pdf\|sample\|sh\|svg\|ttf\|txt\|woff\|woff2\)$' -exec cp {} scan/ \;
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
How do I resolve this issue so that I get all files with these extensions from the directory and its subdirectories, and copy them into the "scan" folder, so that I can then create a zip of the folder and ship it to a different location?
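Not verified against this exact pipeline, but the compiler error points at the backslashes: inside a Groovy triple-double-quoted string, sequences such as \. and \| are illegal escapes, so every backslash intended for the shell has to be doubled, and the literal $ at the end of the regex escaped as \$. The stray quotes wrapping the find line also have to go. A sketch of the corrected sh step:
sh """#!/bin/bash
set
echo '${scaInfo}'
python --version
cp Action.py '${AppName}' && cd '${AppName}' && mkdir scan
find . -regex '.*\\.\\(sql\\|conf\\|py\\|csv\\|coveragerc\\|css\\|eot\\|etlconf\\|hql\\|html\\|idx\\|ini\\|js\\|json\\|log\\|map\\|md\\|pack\\|pdf\\|sample\\|sh\\|svg\\|ttf\\|txt\\|woff\\|woff2\\)\$' -exec cp {} scan/ \\;
cd scan && zip scan.zip * && mv scan.zip .. && cd ..
python Action.py '${scaInfo}'
"""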
I just observed a very weird behaviour from a Gradle Tar task.
Let's take a simple example with 2 files:
/tmp/test$ ls
test1.txt ##test2##
Here is a simple Tar task:
task('testHash', type: Tar) {
from "/tmp/test"
extension = 'tar.gz'
compression = Compression.GZIP
}
After running gradle testHash, the file ##test2## is skipped for some reason:
/path/to/gradle/project/foo$ tar tvf build/distributions/foo-1.0.tar.gz
test1.txt
It seems to happen when the filename contains a # character both at the beginning and at the end.
A regular tar works fine:
/tmp/test$ tar czvf test.tar.gz *
test1.txt
##test2##
/tmp/test$ tar tf test.tar.gz
test1.txt
##test2##
I am using Gradle 4.1. Any explanation?
Thanks to Opal's comments, I adjusted my searches and found a workaround. There may be a cleaner way, but this one works for me:
task('testHash', type: Tar) {
    doFirst {
        // '##test2##' matches Ant's default exclude pattern '**/#*#',
        // so remove all of the default excludes before packing
        org.apache.tools.ant.DirectoryScanner.defaultExcludes.each {
            org.apache.tools.ant.DirectoryScanner.removeDefaultExclude it
        }
    }
    from "/tmp/test"
    extension = 'tar.gz'
    compression = Compression.GZIP
}
FYI, here are the default excludes:
There are a set of definitions that are excluded by default from all
directory-based tasks. As of Ant 1.8.1 they are:
**/*~
**/#*#
**/.#*
**/%*%
**/._*
**/CVS
**/CVS/**
**/.cvsignore
**/SCCS
**/SCCS/**
**/vssver.scc
**/.svn
**/.svn/**
**/.DS_Store
Ant 1.8.2 adds the following default excludes:
**/.git
**/.git/**
**/.gitattributes
**/.gitignore
**/.gitmodules
**/.hg
**/.hg/**
**/.hgignore
**/.hgsub
**/.hgsubstate
**/.hgtags
**/.bzr
**/.bzr/**
**/.bzrignore
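One caveat worth noting: DirectoryScanner's default excludes are static, i.e. global to the JVM, so with the Gradle daemon the workaround above can leak into later tasks and builds. A hedged refinement (untested) restores them once the archive has been built:
task('testHash', type: Tar) {
    doFirst {
        org.apache.tools.ant.DirectoryScanner.defaultExcludes.each {
            org.apache.tools.ant.DirectoryScanner.removeDefaultExclude it
        }
    }
    doLast {
        // restore Ant's built-in default excludes for subsequent tasks
        org.apache.tools.ant.DirectoryScanner.resetDefaultExcludes()
    }
    from "/tmp/test"
    extension = 'tar.gz'
    compression = Compression.GZIP
}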
I have this stage in my Jenkins pipeline:
stage('Build') {
def mvnHome = tool 'M3'
sh '''for f in i7j-*; do
(cd $f && ${mvnHome}/bin/mvn clean package)
done
wait'''
}
In Jenkins » Manage Jenkins » Global Tool Configuration I have a Maven installation called M3, version 3.3.9.
When running this pipeline, mvnHome is evidently empty, because I get this in the log:
+ /bin/mvn clean install -Dmaven.test.skip=true
/var/lib/jenkins/jobs/***SNIP***/script.sh: 3: /var/lib/jenkins/jobs/***SNIP***/script.sh: /bin/mvn: not found
I did find a path /var/lib/jenkins/tools/hudson.tasks.Maven_MavenInstallation/M3 on the Jenkins server, which works, but I would prefer not to use a hard coded path to mvn in this script.
How do I fix this?
EDIT: Summary of the answer, using tool and withEnv.
My working code is now:
stage('Build') {
def mvn_version = 'M3'
withEnv( ["PATH+MAVEN=${tool mvn_version}/bin"] ) {
sh '''for f in i7j-*; do
(cd $f && mvn clean package -Dmaven.test.skip=true -Dadditionalparam=-Xdoclint:none | tee ../jel-mvn-$f.log) &
done
wait'''
}
}
You can use your Tools in Jenkinsfile with the tool and withEnv snippets.
It should look like this:
def mvn_version = 'M3'
withEnv( ["PATH+MAVEN=${tool mvn_version}/bin"] ) {
//sh "mvn clean package"
}
The easiest way should be to use the tools directive:
pipeline {
agent any
tools {
maven 'M3'
}
stages {
stage('Build') {
steps {
sh 'mvn -B -DskipTests clean package'
}
}
}
}
M3 is the name pre-configured in Global Tool Configuration, see the docs: https://jenkins.io/doc/book/pipeline/syntax/#tools
What about using the withMaven construct (provided by the Pipeline Maven Integration plugin; the uppercase values are placeholders defined elsewhere):
withMaven(mavenOpts: MAVEN_OPTS, maven: 'M3', mavenLocalRepo: MAVEN_LOCAL_REPOSITORY, mavenSettingsConfig: MAVEN_SETTINGS) {
sh "mvn ..."
}