I am trying to include a single file in a stash in a Jenkins/CloudBees pipeline. In my understanding, this should work like this:
stage('Stash File') {
    steps {
        stash includes: 'File.jar', name: 'File'
    }
}
However, for some reason it does not, and when it runs in Jenkins, the step fails, but without any error message, like this:
[Pipeline] stage
[Pipeline] { (Stash File)
[Pipeline] node
Running on Jenkins in /var/lib/cloudbees-core-cm/workspace/AutoDeploy/APPS/File/AutoDeploy
[Pipeline] {
[Pipeline] stash
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Copy File to target folder)
Stage "Copy File to target folder" skipped due to earlier failure(s)
I've added an sh "ls -la" to ascertain that the file is in the current folder, and it looks like it is. The output of that command is:
[Pipeline] sh
+ ls -la
total 12
drwxr-xr-x 3 cloudbees-core-cm cloudbees-core-cm 4096 27. Oct 12:46 .
drwxr-xr-x 4 cloudbees-core-cm cloudbees-core-cm 4096 27. Oct 12:46 ..
drwxr-xr-x 2 cloudbees-core-cm cloudbees-core-cm 4096 27. Oct 12:46 File.jar
If I use this syntax instead, it works, and the file is added to the stash on account of this being the only file in the folder:
stash includes: '**', name: 'File'
However, I would really prefer to specify the file I want stashed by name. Is this something that is not possible?
Here are variations of my initial syntax that I've tried, all without success:
stash includes: '**/File.jar', name: 'File'
stash includes: '/File.jar', name: 'File'
stash includes: '.File.jar', name: 'File'
stash includes: '***/File.jar', name: 'File'
stash includes: '*/File.jar', name: 'File'
tl;dr: What am I doing wrong here? What is the correct syntax for including a single file in a stash?
Try using just 'File.jar'. I was having a similar issue; since Jenkins was already in the directory, putting just the file name in 'includes' worked.
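For context, stash includes takes Ant-style patterns, where 'File.jar' matches only at the workspace root and '**/File.jar' matches at any depth. A rough shell analogue of the two patterns (directory names here are made up for illustration):

```shell
# Rough analogue of Ant-style include patterns, as used by stash:
#   'File.jar'    -> matches only at the workspace root
#   '**/File.jar' -> matches at any depth (including the root)
cd "$(mktemp -d)"
mkdir -p sub
touch File.jar sub/File.jar
find . -maxdepth 1 -type f -name 'File.jar'   # like includes: 'File.jar'
find . -type f -name 'File.jar'               # like includes: '**/File.jar'
```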
I'm writing a Dockerfile to run ROS on my Windows rig, and I can't seem to get this COPY command to copy into the container's user root or any subdirectory there. I've tried a few things, including messing with the ownership. I know the file is ugly, but I'm still learning. I'm not really sure what the issue is here.
This file sits next to a /repos dir, which contains a git repo that can be found here (the ros-noetic branch). This is also the location from which I build and run the container.
The overall objective is to get roscore to run (which it has been), then exec in with another terminal and get rosrun ros_essentials_cpp (node name) to actually work.
# ros-noetic with other stuff added
FROM osrf/ros:noetic-desktop-full
SHELL ["/bin/bash", "-c"]
RUN apt update
RUN apt install -y git
RUN apt-get update && apt-get -y install cmake protobuf-compiler
RUN bash
RUN . /opt/ros/noetic/setup.bash && mkdir -p ~/catkin_ws/src && cd ~/catkin_ws/ && chmod 777 src && catkin_make && . devel/setup.bash
RUN cd /
RUN mkdir /repos
COPY /repos ~/catkin_ws/src
RUN echo ". /opt/ros/noetic/setup.bash" >> ~/.bashrc
Expanding tilde to home directory is a shell feature, which apparently isn't supported in Dockerfile's COPY command. You're putting the files into a directory which is literally named ~, i.e. your container image probably contains something like this:
...
dr-xr-xr-x 13 root root 0 Jun 9 00:07 sys
drwxrwxrwt 7 root root 4096 Nov 13 2020 tmp
drwxr-xr-x 13 root root 4096 Nov 13 2020 usr
drwxr-xr-x 18 root root 4096 Nov 13 2020 var
drwxr-xr-x 2 root root 4096 Jun 9 00:07 ~ <--- !!!
Since root's home directory is always /root, you can use this:
COPY /repos /root/catkin_ws/src
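The literal-tilde behaviour is easy to reproduce outside Docker; this sketch (in a throwaway temp directory) shows that an unexpanded ~ is just an ordinary path component:

```shell
# Only a shell expands an unquoted ~ to $HOME. Dockerfile COPY does no
# shell processing, so "~/catkin_ws/src" creates a directory literally
# named "~", exactly like the listing above.
cd "$(mktemp -d)"
mkdir -p './~/catkin_ws/src'   # what COPY effectively did
ls -d ./~                      # a directory literally named ~
[ -d './~/catkin_ws/src' ] && echo 'literal "~" directory exists'
```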
You need to pay attention to the Docker build context.
The path you pass to docker build is the context from which files are copied into your image.
If you are not building from the / folder, your COPY /repos command won't find that directory.
Try changing the build context like this:
docker build /
I have a script test.sh
#!/bin/bash
echo start old file
sleep 20
echo end old file
in the repository which I do execute, and in the mean time I git merge other-branch changes like
#!/bin/bash
echo start new file
sleep 20
echo end new file
into the current branch.
It seems that git on Unix does not overwrite the existing file in place, and instead removes test.sh and creates a new file.
That way it is guaranteed that the running script will keep reading the initial test.sh and terminate with echo end old file.
Note: on my system (Ubuntu 20.04), executing the script while directly overwriting its contents in an editor results in the new code being executed, which is bad...
Is that correct and is it also correct on Windows with git-for-windows?
I can't answer regarding Windows, but on Ubuntu 18.04 I can confirm that a git checkout or git merge will delete and recreate a changed file, rather than editing it in place. This can be seen in strace output, for example:
unlink("test.sh") = 0
followed later by
openat(AT_FDCWD, "test.sh", O_WRONLY|O_CREAT|O_EXCL, 0666) = 4
It can also be seen if you create a hard link to the file before the git command and then look again afterwards, you will see that you have two different inodes, with different contents. This is to be expected following deletion and recreation, whereas an in-place edit would have preserved the hard linking.
$ ls -l test.sh
-rw-r--r-- 1 myuser mygroup 59 Jun 5 17:04 test.sh
$ ln test.sh test.sh.bak
$ ls -li test.sh*
262203 -rw-r--r-- 2 myuser mygroup 59 Jun 5 17:04 test.sh
262203 -rw-r--r-- 2 myuser mygroup 59 Jun 5 17:04 test.sh.bak
$ git merge mybranch
Updating 009b964..d57f33a
Fast-forward
test.sh | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
$ ls -li test.sh*
262219 -rw-r--r-- 1 myuser mygroup 70 Jun 5 17:05 test.sh
262203 -rw-r--r-- 1 myuser mygroup 59 Jun 5 17:04 test.sh.bak
You mentioned in a comment attached to the question that it is related to Overwrite executing bash script files. Although it would seem not to be the best idea to run a git command affecting a script which is currently still being executed, in fact the delete and recreate behaviour should mean that the existing execution will be unaffected. Even if the bash interpreter has not yet read the whole file into memory, it will have an open filehandle on the existing inode and can continue to access its contents even though that inode is no longer accessible via the filename that it had. See for example What happens to an open file handle on Linux if the pointed file gets moved or deleted
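The open-file-handle behaviour can be demonstrated without git at all; this is a small sketch (file name reused from the question) of a reader that keeps seeing the old inode after a delete-and-recreate:

```shell
# A reader that opened the file before the swap keeps reading the old
# inode, even though the name now points at a brand-new file - which is
# why the running script still ends with "end old file".
cd "$(mktemp -d)"
echo 'end old file' > test.sh
exec 3< test.sh        # open a descriptor, like bash reading a script
rm test.sh             # the delete half of git's delete-and-recreate
echo 'end new file' > test.sh
read -r line <&3       # still served from the original inode
echo "reader saw: $line"
exec 3<&-
```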
On Windows with git-for-windows I see the same behavior:
$ mklink /H test.sh.bak test.sh
$ fsutil hardlink list test.sh.bak
test.sh.bak
test.sh
$ git merge test
$ fsutil hardlink list test.sh.bak
test.sh.bak
Meaning the hard link was not preserved, which in turn means a new file was created.
I have a simple gin gonic microservice Golang project that I'm using to learn how to make a Jenkins pipeline. Every stage runs successfully, but the binary is not running after the pipeline finishes. I can also tell the process is not running by using curl to hit the endpoint:
curl http://localhost:9191/users
This is the pipeline in question:
pipeline {
    agent any
    stages {
        stage('git') {
            steps {
                echo "git"
                git 'https://github.com/eduFDiaz/golang-microservices.git'
            }
        }
        stage('clean') {
            steps {
                echo "clean"
                sh "make clean"
            }
        }
        stage('test') {
            steps {
                echo "test"
                sh "make test"
            }
        }
        stage('build') {
            steps {
                echo "build"
                sh "make build"
            }
        }
        stage('run') {
            steps {
                echo "run"
                sh "make run"
            }
        }
    }
}
The Makefile:
executableName=testApi

clean:
	echo "stoping if running and cleaning"
	rm -rf ./bin
	killall $(executableName) || true

test:
	echo "Testing..."
	go test -coverprofile cp.out ./mvc/...
	go tool cover -html=cp.out

build:
	echo "Building..."
	go build -o bin/$(executableName) mvc/main.go

run:
	echo "Running..."
	./bin/$(executableName) &

all: test build run
Everything runs perfectly when I do it by hand. What am I missing here?
Console Output:
Started by user Eduardo fernandez
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /root/.jenkins/workspace/golang pipeline test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (git)
[Pipeline] echo
git
[Pipeline] git
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/eduFDiaz/golang-microservices.git # timeout=10
Fetching upstream changes from https://github.com/eduFDiaz/golang-microservices.git
> git --version # timeout=10
> git fetch --tags --force --progress -- https://github.com/eduFDiaz/golang-microservices.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git rev-parse refs/remotes/origin/master^{commit} # timeout=10
> git rev-parse refs/remotes/origin/origin/master^{commit} # timeout=10
Checking out Revision bfa434ff2aca9ea748182aa2b29094e1b9f442c6 (refs/remotes/origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f bfa434ff2aca9ea748182aa2b29094e1b9f442c6 # timeout=10
> git branch -a -v --no-abbrev # timeout=10
> git branch -D master # timeout=10
> git checkout -b master bfa434ff2aca9ea748182aa2b29094e1b9f442c6 # timeout=10
Commit message: "run reverted to previous state in Makefile"
> git rev-list --no-walk bfa434ff2aca9ea748182aa2b29094e1b9f442c6 # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (clean)
[Pipeline] echo
clean
[Pipeline] sh
+ make clean
echo "stoping if running and cleaning"
stoping if running and cleaning
rm -rf ./bin
killall testApi || true
testApi: no process found
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] echo
test
[Pipeline] sh
+ make test
echo "Testing..."
Testing...
go test -coverprofile cp.out ./mvc/...
? github.com/golang-microservices/mvc [no test files]
? github.com/golang-microservices/mvc/app [no test files]
? github.com/golang-microservices/mvc/controllers [no test files]
ok github.com/golang-microservices/mvc/domain 0.004s coverage: 0.0% of statements
ok github.com/golang-microservices/mvc/services 0.003s coverage: 0.0% of statements [no tests to run]
? github.com/golang-microservices/mvc/utils [no test files]
go tool cover -html=cp.out
HTML output written to /tmp/cover914928629/coverage.html
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (build)
[Pipeline] echo
build
[Pipeline] sh
+ make build
echo "Building..."
Building...
go build -o bin/testApi mvc/main.go
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (run)
[Pipeline] echo
run
[Pipeline] sh
+ make run
echo "Running..."
Running...
./bin/testApi &
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
This problem occurs because Jenkins kills all child processes started during the build, i.e. make run brings up the application, but Jenkins kills the process as part of its cleanup (for more details, search for "ProcessTreeKiller").
To resolve it, update your stage as below:
stage('run') {
    steps {
        echo "run"
        sh "export JENKINS_NODE_COOKIE=dontKillMe; make run"
    }
}
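For illustration only (this is not Jenkins's exact code), the ProcessTreeKiller mechanism can be sketched in plain shell: every build process is tagged with a cookie in its environment, and the cleanup kills whatever still carries that cookie, which is why overriding JENKINS_NODE_COOKIE exempts a process:

```shell
# Sketch of a cookie-based process sweep (Linux /proc; COOKIE stands in
# for Jenkins's real environment cookie).
COOKIE=build-42 sleep 60 &
victim=$!
COOKIE=dontKillMe sleep 60 &
survivor=$!
for pid in "$victim" "$survivor"; do
    # the sweep: kill any process still tagged with the build's cookie
    if tr '\0' '\n' < "/proc/$pid/environ" | grep -qx 'COOKIE=build-42'; then
        kill "$pid"
        wait "$pid" 2>/dev/null
    fi
done
kill -0 "$victim" 2>/dev/null || echo 'tagged process was killed'
kill -0 "$survivor" 2>/dev/null && echo 'untagged process survived'
kill "$survivor" 2>/dev/null
```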
So I'm trying to set up a Jenkins declarative pipeline to run an Xcode build job. I want to use the xcpretty Ruby gem, but will also need several other Ruby gems later for other jobs.
stage('Pre-Build')
{
    steps
    {
        echo "Executing Pre-Build steps ..."
        sh(returnStdout: true, script: "#!/bin/bash -xle && source ~/.rvm/scripts/rvm && rvm use 2.3.1 && cd ${WORKSPACE}/${env.PROJECT_PATH} && gem install xcpretty && set -o pipefail && xcpretty")
    }
}
First of all, I get no echo from the sh in the Pre-Build stage whatsoever. Neither returnStdout: true nor the hashbang seems to have any effect on getting any log output from the shell invocation.
That leaves me blind on what's going on here. When running the job, the Pre-Build stage passes and then it fails at my actual build stage when I want to use xcpretty.
Here's the log output from the Pre-Build stage:
Executing Pre-Build steps ...
[Pipeline] script
[Pipeline] {
[Pipeline] sh
[job] Running shell script
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Build)
[Pipeline] echo
If I run it in bash manually, there's no problem! On Jenkins, something seems to be wrong with RVM, but I've been tapping in the dark trying to fix this for days and it's driving me insane.
Any help is appreciated!
Michał Knapik has a blog post covering one hand-rolled solution that defines a Groovy wrapper replicating rvm use semantics.
See:
https://blog.knapik.me/how-to-use-rvm-with-jenkins-pipeline/
node {
    withRvm('ruby-2.3.1') {
        sh 'ruby --version'
        sh 'gem install rake'
    }
}
I don't want to copy any more than this example - it's his code.
Note the caveats on specifying the version.
Here's my Jenkins 2.x pipeline:
node('master') {
    stage 'Checkout'
    checkout scm
    stage "Build Pex"
    sh('build.sh')
}
When I run this pipeline, the checkout puts the code into the workspace as expected. However, instead of looking for the script in workspace/ (it's really there!), it looks in an unrelated directory: workspace#tmp/durable-d812f509.
Entering stage Build Pex
Proceeding
[Pipeline] sh
[workspace] Running shell script
+ build.sh
/home/conmonsysdev/deployments/jenkins_ci_2016_interns/jenkins_home/jobs/pex/branches/master/workspace#tmp/durable-d812f509/script.sh: line 2: build.sh: command not found
How do I modify this Jenkinsfile so that build.sh is executed in the exact same directory as where I checked out the project source code?
You can enclose your actions in a dir block.
checkout scm
stage "Build Pex"
dir('<your new directory>') {
    sh('./build.sh')
}
... or ...
checkout scm
stage "Build Pex"
sh(""" <path to your new directory>/build.sh""")
...
<your new directory> is a placeholder for your actual directory. By default it is a path relative to the workspace. You can use an absolute path if you are sure it is present on the agent.
The reason that your script doesn't work is because build.sh is not in your PATH.
The Jenkinsfile is running a "sh" script whose entire content is the string build.sh. The parent script is in the "#tmp" directory and will always be there - the "#tmp" directory is, essentially, where Jenkins keeps the generated shell script during a run.
To fix the problem, change your line to sh "./build.sh" or sh "bash build.sh", so that the sh block in the Jenkinsfile can correctly locate the build.sh script that you want to execute.
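The PATH lookup difference is easy to see in isolation; a quick sketch (script contents made up):

```shell
# A bare command name is resolved via $PATH, which normally does not
# include the current directory; an explicit ./ prefix bypasses the
# lookup and runs the file right here.
cd "$(mktemp -d)"
printf '#!/bin/sh\necho hello from build script\n' > build.sh
chmod +x build.sh
build.sh 2>/dev/null || echo 'bare name: command not found'
./build.sh
```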
Jenkins creates a folder when it clones your project, like this:
/var/lib/jenkins/workspace/job-name#script
For this to work, you must make the file executable (in a Linux environment) and then call the shell script.
Something like this:
// Permission to execute
sh "chmod +x -R ${env.WORKSPACE}/../${env.JOB_NAME}#script"
// Call SH
sh "${env.WORKSPACE}/../${env.JOB_NAME}#script/script.sh"
I am having the same issue, and dir is not helping, possibly because I am working inside a subdirectory of the tmp dir itself (for reasons not germane here). My code looks like this:
dir(srcDir) {
    sh 'pwd; ls -l; jenkins.sh "build.sh"'
}
(The pwd and ls -l statements were added just for debugging; the issue exists without them.) With them I get output like:
+ pwd
/jenkins/workspace/aws-perf-test#tmp/repos/2
+ ls -l
total 72
-rw-r--r-- 1 jenkins jenkins 394 May 19 12:20 README.md
drwxr-xr-x 3 jenkins jenkins 4096 May 19 12:20 api-automation
-rwxr-xr-x 1 jenkins jenkins 174 May 19 12:20 build-it.sh
-rwxr-xr-x 1 jenkins jenkins 433 May 19 12:20 build-release.sh
-rwxr-xr-x 1 jenkins jenkins 322 May 19 12:20 build.sh
drwxr-xr-x 3 jenkins jenkins 4096 May 19 12:20 ix-core
drwxr-xr-x 3 jenkins jenkins 4096 May 19 12:20 ix-java-client
drwxr-xr-x 3 jenkins jenkins 4096 May 19 12:20 ix-rest-models
drwxr-xr-x 4 jenkins jenkins 4096 May 19 12:20 ix-service
drwxr-xr-x 7 jenkins jenkins 4096 May 19 12:20 ixternal
drwxr-xr-x 5 jenkins jenkins 4096 May 19 12:20 ixtraneous-stuff
-rwxr-xr-x 1 jenkins jenkins 472 May 19 12:20 jenkins.sh
-rw-r--r-- 1 jenkins jenkins 16847 May 19 12:20 pom.xml
+ jenkins.sh build.sh
/home/jenkins/workspace/aws-perf-test#tmp/repos/2#tmp/durable-a3ec0501/script.sh: line 2: jenkins.sh: command not found
I ultimately did this:
dir(srcDir) {
    sh 'cdr=$(pwd); $cdr/jenkins.sh "build.sh"'
}
I was able to get my script execution working with a simplified derivative of Rafael Manzoni's response. I wondered about the whole "JOB_NAME#script" thing and found it unnecessary, at least for declarative pipelines on our version of Jenkins. Simply set the access permissions on the workspace; no need to go any deeper than that.
stage('My Stage') {
    steps {
        sh "chmod +x -R ${env.WORKSPACE}"
        sh "./my-script.sh"
    }
}
I compiled all the answers above and for me it worked like this:
stage('Run Script') {
    steps {
        script {
            sh('cd relativePathToFolder && chmod +x superscript.sh && ./superscript.sh parameter1 parameter2')
        }
    }
}
Thanks to @Rafael Manzoni, @Keith Mitchell and @Jayan.
Use the GIT_CHECKOUT_DIR environment variable
Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Install dependencies') {
            steps {
                dir(GIT_CHECKOUT_DIR) {
                    // now everything is executed in your checkout directory
                    sh 'yarn'
                }
            }
        }
    }
    post {
        // ...
    }
}
Set your GIT_CHECKOUT_DIR while setting up your pipeline. See this question:
List of more environment variables:
${YOUR_JENKINS_HOST}/env-vars.html
example: http://localhost:8000/env-vars.html
The most upvoted answer is dangerous and brittle. There is no need to use static, operating-system-dependent absolute paths in multiple places; they break your builds once you change machines.