I am trying to execute a shell script from my Jenkins pipeline. I have provided both an absolute and a relative path in the sh command, and I am still facing a "No such file or directory" error when building the pipeline.
It is a simple script, yet it is not working.
Try 1:
stage('Executing shell script') {
    steps {
        sh '/home/patching/shell_script.sh'
    }
}
Try 2:
stage('Executing shell script') {
    steps {
        sh './shell_script.sh'
    }
}
Try 3:
stage('Executing shell script') {
    steps {
        dir('/home/patching/shell_script.sh') {
            sh './shell_script.sh'
        }
    }
}
I really don't know what is wrong with the script. Could someone help me with this?
I found out why it wasn't able to find the file I wanted to run: the pipeline was running on a different slave.
Can you try this:
pipeline {
    agent any
    stages {
        stage('Hello World') {
            steps {
                script {
                    sh '''
                        /home/ubuntu/First.sh
                    '''
                }
            }
        }
    }
}
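Note that agent any lets Jenkins pick any available node, which is exactly how the script ended up missing on a different slave. A sketch that pins the pipeline to the node that actually has the script ('patching-node' is a hypothetical label):
pipeline {
    // run on the node where /home/patching/shell_script.sh actually exists
    agent { label 'patching-node' }
    stages {
        stage('Executing shell script') {
            steps {
                sh '/home/patching/shell_script.sh'
            }
        }
    }
}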
I have a shell script inside my Jenkins pipeline which calls mvn. For that I have to pass a variable's value to mvn. The variable is not being substituted inside the Jenkins pipeline's shell, but when I try it from my local machine's shell it works as expected.
ARTIFACT_NAME="Sample_Artifact"
pipeline {
    agent {
        node {
            label "${AGENT}"
        }
    }
    stages {
        stage("Setting MultiJob Properties") {
            steps {
                sh '''set +x
                export VERSION=$(mvn -B -q -Dexec.executable=echo -Dexec.args=\${${ARTIFACT_NAME}})
                echo $VERSION
                '''
            }
        }
    }
}
Expected process: export VERSION=$(mvn -B -q -Dexec.executable=echo -Dexec.args=${Sample_Artifact})
Expected output: 1.0001
ARTIFACT_NAME - I am passing it from the Jenkins UI.
${${ARTIFACT_NAME}} - this variable is substituted with its value perfectly in Freestyle jobs, but it throws an error in Pipeline jobs.
Error message: script.sh: 3: Bad substitution
Can anyone please help me resolve the issue?
As Ian wrote, you're passing the whole script as a literal (''') instead of an interpolated string ("""), so the variable name doesn't get substituted with its value:
pipeline {
    agent {
        node {
            label AGENT
        }
    }
    stages {
        stage("Setting MultiJob Properties") {
            steps {
                sh """set +x
                export VERSION=\$(mvn -B -q -Dexec.executable=echo -Dexec.args=\${$ARTIFACT_NAME})
                echo \$VERSION"""
            }
        }
    }
}
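If you would rather keep the non-interpolated ''' string (and its safer quoting), the shell can do the indirection itself. A sketch assuming the agent has bash and that ARTIFACT_NAME reaches the shell as an environment variable, which build parameters normally do:
sh '''#!/bin/bash
set +x
# ${!ARTIFACT_NAME} is bash indirect expansion: it expands to the value of the
# variable whose name is stored in ARTIFACT_NAME (bash-only, hence the shebang)
export VERSION=$(mvn -B -q -Dexec.executable=echo -Dexec.args="${!ARTIFACT_NAME}")
echo "$VERSION"
'''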
I'm building several Android apps in a Docker image using Gradle and a bash script. The script is triggered by Jenkins, which runs the Docker image.
In the bash script I gather information about the success of the builds. I want to pass that information to the Groovy script of the Jenkinsfile.
I tried to create a txt file in the Docker container, but the Groovy script in the Jenkinsfile cannot find that file.
This is the Groovy script of my Jenkinsfile:
script {
    try {
        sh script: '''
            #!/bin/bash
            ./jenkins.sh
        '''
    } catch (e) {
        currentBuild.result = "FAILURE"
    } finally {
        String buildResults = null
        try {
            def pathToBuildResults = "[...]/buildResults.txt"
            buildResults = readFile "${pathToBuildResults}"
        } catch (e) {
            buildResults = "error receiving build results. Error: " + e.toString()
        }
    }
}
In my jenkins.sh bash script I do the following:
[...]
buildResults+=" $appName: Build Failed!"   # this is done for several apps
echo "$buildResults" | cat > $pathToBuildResults   # this works; I checked that the file is created
[...]
The file is created, but Groovy cannot find it. I think the reason is that the Jenkins script does not run inside the Docker container.
How can I access the string buildResults of the bash script in my Groovy Jenkins script?
One option that avoids reading the results file altogether: modify your jenkins.sh script to print the results to standard output instead of writing them to a file, then use the sh step to capture that output.
Something like:
script {
    try {
        String buildResults = sh returnStdout: true, script: '''
            #!/bin/bash
            ./jenkins.sh
        '''
        // You now have the output of jenkins.sh in the buildResults variable
    } catch (e) {
        currentBuild.result = "FAILURE"
    }
}
This way you avoid handling output files and directly get the results you need, which you can then parse and use however you like.
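If you do want to keep the file, the likely catch is that it lives inside the container while readFile looks in the agent workspace. A sketch of copying it out first (app_container and the container-side path are hypothetical placeholders):
script {
    // copy the results file from the container into the workspace, then read it
    sh 'docker cp app_container:/some/container/path/buildResults.txt .'
    String buildResults = readFile 'buildResults.txt'
    echo buildResults
}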
I wrote a declarative Jenkins pipeline and would like to track the CLI commands executed by Jenkins. To do this, I added a stage with the step sh 'history -a' in it:
pipeline {
    options {
        ...
    }
    agent {
        node {
            ...
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'hostname'
                sh 'pwd'
                ...
            }
        }
        ...
        stage('History') {
            steps {
                sh 'history -a'
            }
        }
    }
    post {
        ...
    }
}
But that is not working:
Console Output
...
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Tear Down)
[Pipeline] sh
+ history -a
/path/to/project-root#tmp/durable-66ba15cc/script.sh: 1: history: not found
[Pipeline] }
...
Other Linux commands like hostname, ls, or pwd work fine.
Why does history run into an error? How can I store the shell commands called by Jenkins in the context of a pipeline?
That specific error, I think, is only because the shell in which your sh step runs does not have the history command available (hence history: not found); sh steps run non-interactively, typically via /bin/sh, which keeps no command history.
As for storing the sh commands: if you want only those, I think you need to write to a file that you create at the beginning of the build and append to every time you have a sh step, or you can just use the pipeline log file (the console output), which already records every sh invocation.
You can find here a thread about the location of the pipeline or build logs, in case it helps.
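A minimal sketch of the file-based approach (loggedSh is a hypothetical helper; in a declarative pipeline you would call it from a script block):
// defined outside the pipeline block: record the command, then run it
def loggedSh(String cmd) {
    sh "echo '${cmd}' >> commands.log"   // naive quoting; fine for simple commands
    sh cmd
}

// usage inside a stage:
script {
    loggedSh 'hostname'
    loggedSh 'pwd'
}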
I am running a shell script inside a Docker container via a Jenkins Groovy pipeline script. The bash script sets some environment variables and then executes the unit tests; the stdout of the test run is dumped to a text file, which I later copy out of the container for further use.
Here is the shell script:
#!/bin/bash
source /root/venv/bin/activate
export PYTHONPATH=/foo/bar
cd unit_tests
rm -f results.txt
python tests.py >> results.txt
My pipeline script is as follows:
stage('Run Unit Tests') {
    steps {
        sh '''
            docker-compose -f ./dir1/docker-compose-test.yml up -d
            docker cp /supporting_files/run_unit_tests.sh container_1:/foo/bar/
            docker exec container_1 /bin/bash run_unit_tests.sh
            docker cp container_1:/foo/bar/unit_tests/results.txt .
        '''
    }
}
stage('Reporting') {
    steps {
        // steps for reporting
    }
}
The problem is that whenever any test fails, results.txt contains the appropriate text about the failures and their stack traces, but the pipeline stops executing, saying:
[Pipeline] }
ERROR: script returned exit code 1
Because of this I am not able to execute the next steps of parsing the results.txt file and reporting the results.
How do I make the pipeline execute the next stage?
I tried some things like:
1. Using catchError:
stage('Run Unit Tests') {
    steps {
        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
            sh '''
                # running the commands above
            '''
        }
    }
}
2. Using try:
try {
    stage('Run Unit Tests') {
        sh '''
            # executing tests
        '''
    }
} catch (e) {
    echo e.toString()
}
But neither of them helps.
Also, the shell script simply dumps the stdout of the test run into a text file, so I don't understand why exit code 1 is returned when that operation itself does not fail. I checked the text file afterwards; it had the correct failure and error counts with stack traces.
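One way to keep the next stage running is to record the exit status instead of letting sh throw. The exit code 1 most likely comes from python tests.py itself: it is the last command in run_unit_tests.sh, so its nonzero status becomes the script's status, which docker exec propagates. A minimal sketch using returnStatus with the same commands as above (the docker exec is split out so the results.txt copy still runs even when tests fail):
stage('Run Unit Tests') {
    steps {
        script {
            sh '''
                docker-compose -f ./dir1/docker-compose-test.yml up -d
                docker cp /supporting_files/run_unit_tests.sh container_1:/foo/bar/
            '''
            // returnStatus: true makes sh return the exit code instead of throwing,
            // so a failing test run no longer aborts the pipeline
            def rc = sh returnStatus: true, script: 'docker exec container_1 /bin/bash run_unit_tests.sh'
            sh 'docker cp container_1:/foo/bar/unit_tests/results.txt .'
            if (rc != 0) {
                currentBuild.result = 'UNSTABLE'   // mark the build, but let Reporting run
            }
        }
    }
}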
In my Jenkins pipeline I use "Execute shell command" to run my Gradle build script.
Now I want to check if the build has failed, in which case I would like to read the console output, store it in a string, and publish it to a Slack channel.
The code that I have tried goes as follows:
try {
    for (int i = 0; i < noOfComponents; i++) {
        component = compileProjectsWithPriority[i]
        node {
            out = sh script: "cd /home/jenkins/projects/${component} && ${gradleHome}/bin/gradle build", returnStdout: true
        }
    }
} catch (e) {
    def errorSummary = 'Build failed due to compilation error in ' + "${component}" + '\n' + "${out}"
    slackSend(channel: '#my_channel', color: '#FF0000', message: errorSummary)
}
However it does not even execute the shell script, and the console output is null. What is the right approach to do this?
Thanks in advance.
The sh command in Jenkins pipelines may not work with shell built-ins like cd. Perhaps try using dir, as below:
node {
    dir("/home/jenkins/projects/${component}") {
        out = sh script: "${gradleHome}/bin/gradle build", returnStdout: true
    }
}
All commands within { and } for a dir will execute with the specified directory as their working directory. This will overcome any problems that may exist with the cd shell built-in.
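One caveat: with returnStdout: true, a failing Gradle build makes sh throw before out is assigned, which would explain the null output in the catch block. A sketch that captures the log even on failure, reusing the component and gradleHome variables from above and a hypothetical build.log file:
node {
    dir("/home/jenkins/projects/${component}") {
        // redirect all output to a file and check the exit code ourselves
        def rc = sh script: "${gradleHome}/bin/gradle build > build.log 2>&1", returnStatus: true
        def out = readFile 'build.log'
        if (rc != 0) {
            def errorSummary = "Build failed due to compilation error in ${component}\n${out}"
            slackSend(channel: '#my_channel', color: '#FF0000', message: errorSummary)
        }
    }
}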