zsh: no matches found: push[dev1] - terminal

I am running a script like ./ci.sh push[dev1] and I get a response like zsh: no matches found: push[dev1]. I have tried placing aliases into .zshrc, but no joy.
My .zshrc file:
alias push='noglob push'
alias git='noglob git'
alias jake='noglob jake'
alias task='noglob task'
alias branch='noglob branch'
alias gp='git push'
Also, here is the task from the jakefile.js:
desc('Push commits to integration machine for validation.');
task('push', ['status'], (branch) => {
    if (!branch) {
        console.log(
            'This command will push your code to the integration machine. Pass your\n' +
            'branch name as a parameter (e.g., \'push[workstation_name]\').\n'
        );
        fail('No branch provided');
    }
    run([
        'git push origin ' + branch
    ], () => {
        console.log('\nOK. Current branch has been copied to integration machine.');
        complete();
    });
}, { async: true });
and the file ci.sh contains:
#!/bin/sh
. build/scripts/run_jake.sh -f build/scripts/ci.jakefile.js $*
Thanks for your help.

Simply escape the brackets so zsh does not treat them as a glob pattern:
./ci.sh push\[dev1\]
The noglob aliases in .zshrc do not help here because the command word you actually type is ./ci.sh, not push, and aliases only expand for the command word.
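If you would rather not escape the brackets every time, a couple of equivalent options in the same spirit (script and argument names as in the question):
./ci.sh 'push[dev1]'        # quote the argument so zsh does not try to glob it
alias ci='noglob ./ci.sh'   # then run: ci push[dev1]
setopt NO_NOMATCH           # or globally: let zsh pass unmatched patterns through unchanged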

How to return output of shell script into Jenkinsfile [duplicate]

I have something like this in a Jenkinsfile (Groovy), and I want to record the stdout and the exit code in a variable in order to use the information later.
sh "ls -l"
How can I do this, especially as it seems that you cannot really run any kind of groovy code inside the Jenkinsfile?
The latest version of the pipeline sh step allows you to do the following:
// Git committer email
GIT_COMMIT_EMAIL = sh(
    script: 'git --no-pager show -s --format=\'%ae\'',
    returnStdout: true
).trim()
echo "Git committer email: ${GIT_COMMIT_EMAIL}"
Another feature is the returnStatus option.
// Test commit message for flags
BUILD_FULL = sh(
    script: "git log -1 --pretty=%B | grep '\\[jenkins-full]'",
    returnStatus: true
) == 0
echo "Build full flag: ${BUILD_FULL}"
These options were added based on this issue.
See official documentation for the sh command.
For declarative pipelines (see comments), you need to wrap the code in a script step:
script {
    GIT_COMMIT_EMAIL = sh(
        script: 'git --no-pager show -s --format=\'%ae\'',
        returnStdout: true
    ).trim()
    echo "Git committer email: ${GIT_COMMIT_EMAIL}"
}
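For context, a minimal declarative sketch (the stage name and agent are illustrative) showing where such a script block sits:
pipeline {
    agent any
    stages {
        stage('Info') {
            steps {
                script {
                    // capture stdout of a shell command into a Groovy variable
                    def committerEmail = sh(
                        script: "git --no-pager show -s --format='%ae'",
                        returnStdout: true
                    ).trim()
                    echo "Git committer email: ${committerEmail}"
                }
            }
        }
    }
}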
The current Pipeline version natively supports returnStdout and returnStatus, which make it possible to get the output or status from sh/bat steps.
An example:
def ret = sh(script: 'uname', returnStdout: true)
println ret
See the official documentation.
The quick answer is this:
sh "ls -l > commandResult"
result = readFile('commandResult').trim()
I think there exists a feature request for getting the result of the sh step, but as far as I know, there is currently no other option.
EDIT: JENKINS-26133
EDIT2: Not quite sure since which version, but the sh/bat steps can now return the standard output; simply:
def output = sh returnStdout: true, script: 'ls -l'
If you want to get the stdout AND know whether the command succeeded or not, just use returnStdout and wrap it in an exception handler:
scripted pipeline
try {
    // Fails with non-zero exit if dir1 does not exist
    def dir1 = sh(script:'ls -la dir1', returnStdout:true).trim()
} catch (Exception ex) {
    println("Unable to read dir1: ${ex}")
}
output:
[Pipeline] sh
[Test-Pipeline] Running shell script
+ ls -la dir1
ls: cannot access dir1: No such file or directory
[Pipeline] echo
Unable to read dir1: hudson.AbortException: script returned exit code 2
Unfortunately hudson.AbortException is missing any useful method to obtain that exit status, so if the actual value is required you'd need to parse it out of the message (ugh!)
Contrary to the Javadoc https://javadoc.jenkins-ci.org/hudson/AbortException.html the build is not failed when this exception is caught. It fails when it's not caught!
Update:
If you also want the STDERR output from the shell command, Jenkins unfortunately fails to properly support that common use-case. A 2017 ticket JENKINS-44930 is stuck in a state of opinionated ping-pong whilst making no progress towards a solution - please consider adding your upvote to it.
As to a solution now, there could be a couple of possible approaches:
a) Redirect STDERR to STDOUT with 2>&1
- but it is then up to you to parse it out of the main output, and you won't get the output at all if the command failed, because you're in the exception handler.
b) Redirect STDERR to a temporary file (the name of which you prepare earlier) with 2>filename (but remember to clean up the file afterwards), i.e. the main code becomes:
def stderrfile = 'stderr.out'
try {
    def dir1 = sh(script:"ls -la dir1 2>${stderrfile}", returnStdout:true).trim()
} catch (Exception ex) {
    def errmsg = readFile(stderrfile)
    println("Unable to read dir1: ${ex} - ${errmsg}")
}
c) Go the other way: set returnStatus=true instead, dispense with the exception handler, and always capture output to a file, i.e.:
def outfile = 'stdout.out'
def status = sh(script:"ls -la dir1 >${outfile} 2>&1", returnStatus:true)
def output = readFile(outfile).trim()
if (status == 0) {
    // output is the directory listing from stdout
} else {
    // output is the error message from stderr
}
Caveat: the above code is Unix/Linux-specific - Windows requires completely different shell commands.
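For what it's worth, the bat step accepts the same returnStdout/returnStatus options on Windows agents; a rough sketch (the directory and file names are illustrative, and the leading @ keeps cmd.exe from echoing the command into the captured output):
// capture status, with output redirected to a file
def winStatus = bat(script: 'dir dir1 > stdout.out 2>&1', returnStatus: true)
def winOutput = readFile('stdout.out').trim()
// or capture stdout directly
def winListing = bat(script: '@dir /b dir1', returnStdout: true).trim()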
Here is a sample case, which I believe will make sense!
node('master') {
    stage('stage1') {
        def commit = sh(returnStdout: true, script: '''echo hi
echo bye | grep -o "e"
date
echo lol''').split()
        echo "${commit[-1]} "
    }
}
For those who need to use the output in subsequent shell commands rather than in Groovy, something like this example could be done:
stage('Show Files') {
    environment {
        MY_FILES = sh(script: 'cd mydir && ls -l', returnStdout: true)
    }
    steps {
        sh '''
            echo "$MY_FILES"
        '''
    }
}
I found the examples on code maven to be quite useful.
All of the above methods will work, but to use the variable as an environment variable inside your code you need to export it first.
script {
    sh " 'shell command here' > command"
    command_var = readFile('command').trim()
    sh "export command_var=$command_var"
}
Replace the shell command with the command of your choice. Now, if you are using Python code, you can just call os.getenv("command_var"), which will return the output of the shell command executed previously.
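Note that each sh step runs in its own shell, so an export done in one sh step is not visible to a later one. If the goal is for subsequent steps (for example a Python script launched by a later sh step) to see the value, a minimal alternative sketch is to assign it to env in Groovy instead; the command below is just a placeholder, as in the snippet above:
script {
    // capture the command output directly and expose it to subsequent steps
    env.command_var = sh(script: "'shell command here'", returnStdout: true).trim()
}
sh 'echo "$command_var"' // a later sh step sees it; in Python, os.getenv("command_var") works the same way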
How to read a shell variable in Groovy / how to assign a shell return value to a Groovy variable.
Requirement: open a text file, read the lines using the shell, store the values in Groovy, and get the parameters from each line.
Here the delimiter is a comma (,).
Ex: releaseModule.txt
./APP_TSBASE/app/team/i-home/deployments/ip-cc.war/cs_workflowReport.jar,configurable-wf-report,94,23crb1,artifact
./APP_TSBASE/app/team/i-home/deployments/ip.war/cs_workflowReport.jar,configurable-temppweb-report,394,rvu3crb1,artifact
========================
Here we want to get the module name from the 2nd parameter (configurable-wf-report), the build number from the 3rd (94), and the commit id from the 4th (23crb1).
def module = sh(script: """awk -F',' '{ print \$2 "," \$3 "," \$4 }' releaseModules.txt | sort -u """, returnStdout: true).trim()
echo module
List lines = module.split( '\n' ).findAll { !it.startsWith( ',' ) }
def buildid
def Modname
lines.each {
    List det1 = it.split(',')
    buildid = det1[1].trim()
    Modname = det1[0].trim()
    tag = det1[2].trim()
    echo Modname
    echo buildid
    echo tag
}
If you don't have a single sh command but a block of sh commands, returnStdout won't work.
I had a similar issue, where I applied something that is not a clean way of doing this, but it eventually worked and served the purpose.
Solution -
In the shell block, echo the value and write it to a file.
Outside the shell block and inside the script block, read this file, trim it, and assign it to any local/params/environment variable.
Example -
steps {
    script {
        sh '''
            echo $PATH > path.txt
            # using '>' because I want to create a new file every time, to get the newest value of PATH
        '''
        path = readFile(file: 'path.txt')
        path = path.trim() // local groovy variable assignment
        // One can assign these values to env and params as below -
        env.PATH = path    // if you want to assign it to an env var
        params.PATH = path // if you want to assign it to a params var
    }
}
The easiest way is to do it like this:
my_var=`echo 2`
echo $my_var
Output:
2
Note that this is not a simple single quote but a backquote (`).
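Tied back to the original question, the same capture done from Groovy would look roughly like this:
def my_var = sh(script: 'echo 2', returnStdout: true).trim()
echo my_var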

How to capture last part of the git url using shell script?

I am writing a Jenkins pipeline and am trying to capture the last part of the git URL without the .git extension. For instance, given https://github.hhhh.com/aaaaaa-dddd/xxxx-yyyy.git, I want only xxxx-yyyy to be returned. Below is my code:
String getProjectName() {
echo "inside getProjectName +++++++"
# projectName = sh(
# script: "git config --get remote.origin.url",
# returnStdout: true
# ).trim()
def projectName= sh returnStdout:true, script: '''
#!/bin/bash
GIT_LOG = $(env -i git config --get remote.origin.url)
echo $GIT_LOG
basename -s .git "$GIT_LOG"; '''
echo "projectName: ${projectName}"
return projectName
}
PS: Please ignore the commented lines of code.
There is basic Bourne shell functionality that achieves that:
# strip everything up to the last /
projectName=${GIT_LOG##*/}
# strip trailing .git
projectName=${projectName%.git}
This leaves just the requested name in projectName.
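Applied inside the pipeline from the question, a sketch along these lines should work (variable names are illustrative; the triple single quotes keep Groovy from touching the shell expansions):
def projectName = sh(returnStdout: true, script: '''
    url=$(git config --get remote.origin.url)
    url=${url##*/}
    echo "${url%.git}"
''').trim()
echo "projectName: ${projectName}"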
No spaces before or after = in shell assignments:
x='https://github.hhhh.com/aaaaaa-dddd/xxxx-yyyy.git'
basename "$x" .git
Output:
xxxx-yyyy

Git completion for alias as if for Git itself

Background
I have successfully configured Bash completion for various Git aliases. For example:
$ git config alias.subject
!git --no-pager show --quiet --pretty='%s'
$ function _git_subject() { _git_show; }
$ git subject my<TAB>
$ git subject my-branch
Challenge
However, I have a Git alias that I don't know how to set up Bash completion for. The problem is that I want the alias to complete as if for the top-level Git command itself. The alias is this:
$ git config alias.alias
alias = !"f() { if [[ \"$#\" != 1 ]]; then >&2 echo \"Usage: git alias COMMAND\"; return 1; fi; git config alias.\"$1\"; }; f"
# Example
$ git alias s
status
I have tried using _git, __git_main, and __git_wrap__git_main, but none of them work (I think it leads to an infinite loop since it never returns after I press tab).
Is there a way to add completion for a Git alias that completes as if it was the top-level Git command? Or specifically how to have completion for this alias?
Tried but doesn't work
function _git_alias() { _git; }
function _git_alias() { __git_main; }
function _git_alias() { __git_wrap__git_main; }
Desired behavior
$ git alias su<TAB>
subject submodule
$ git alias sub
Alternatively, if there's an easy way to complete for only aliases that would be cool, too. I would like to know how to complete as if for the top-level Git command just for curiosity as well, though.
I was finally able to create a working solution with a bit of hackery around the "magic" Bash completion variables. I changed these variables to "pretend" we were completing the given command as given to git itself.
If anybody has any suggestions to simplify this, I would totally be open to them.
# This is complex because we want to delegate to the completion for Git
# itself without ending up with an infinite loop (which happens if you try
# to just delegate to _git).
_git_alias() {
    if [[ "$COMP_CWORD" -lt 2 ]]; then
        return
    fi
    local old_comp_line_length new_comp_line_length
    COMP_WORDS=(git "${COMP_WORDS[@]:2}")
    ((COMP_CWORD -= 1))
    old_comp_line_length=${#COMP_LINE}
    if [[ "$COMP_LINE" =~ ^[^[:blank:]]+[[:blank:]]+[^[:blank:]]+[[:blank:]]+(.*)$ ]]; then
        COMP_LINE="git ${BASH_REMATCH[1]}"
    fi
    new_comp_line_length=${#COMP_LINE}
    (( COMP_POINT += new_comp_line_length - old_comp_line_length ))
    _git "$@"
    # git alias blah
    #            ^
    # 01234567890123
    # 0         1
    # point: 11
    # length: 13
    #
    # git blah
    #      ^
    # 01234567
    # point: 5
    # length: 7
    #
    # point = point - (old length) + (new length)
    # point = 11 - 13 + 7
    # point = -2 + 7
    # point = 5
}
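For the "complete only aliases" alternative mentioned in the question, a sketch like the following should also work, assuming Git's stock bash completion is loaded so that the __gitcomp helper is available:
_git_alias() {
    local aliases
    # list configured alias names, e.g. "alias.s status" -> "s"
    aliases=$(git config --get-regexp '^alias\.' | sed -e 's/^alias\.//' -e 's/ .*$//')
    __gitcomp "$aliases"
}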

Set environment variables then run script in Jenkins Scripted Pipeline

I am new to Jenkins, Groovy and pipelines. I have created a simple pipeline stage like so:
//working build but not setting env variables
node('build-01') {
    stage('Building') {
        echo "[*] Starting build (id: ${env.BUILD_ID}) on ${env.JENKINS_URL}"
        try {
            sh 'ls -l'
            //ls shows the damn file
            sh '. setup-target'
        } catch(all) {
            sh "echo 'Failed to run setup-target script with error: ' ${all}"
        }
    }
}
This works. But I want to modify/add environment variables to the session running this script (this script is a bash file with the correct shebang line on top). So I did:
node('build-01') {
    withEnv(["CMAKE_INSTALL_DIR=${WORKSPACE}", "SDK_INSTALL_DIR=${WORKSPACE}"]) {
        stage('Building') {
            echo "[*] Starting build (id: ${env.BUILD_ID}) on ${env.JENKINS_URL}"
            try {
                sh 'ls -l'
                //ls shows the damn file
                sh '. setup-target'
            } catch(all) {
                sh "echo 'Failed to run setup-target script with error: ' ${all}"
            }
        }
    }
}
This errors out with:
/home/jenkins-sw/ci/workspace/myWorkSpace#tmp/durable-6d30b48d/script.sh: line 1: .: setup-target: file not found
and
Failed to run setup-target script with error: hudson.AbortException: script returned exit code 1
But the environment variables are set; I check this by running sh 'printenv' right below the ls -l line. Interestingly, ls -l does show the script.
What am I missing?
UPDATE
The following:
node('build-01') {
    withEnv(["CMAKE_INSTALL_DIR=${WORKSPACE}", "SDK_INSTALL_DIR=${WORKSPACE}"]) {
        stage('Building') {
            echo "[*] Starting build (id: ${env.BUILD_ID}) on ${env.JENKINS_URL}"
            try {
                sh 'ls -l'
                //ls shows the damn file
                sh './setup-target'
            } catch(all) {
                sh "echo 'Failed to run setup-target script with error: ' ${all}"
            }
        }
    }
}
results in:
/home/jenkins-sw/ci/workspace/myWorkSpace#tmp/durable-6d30b48d/script.sh: line 1: ./setup-target: Permission denied
Interesting. How is withEnv affecting permissions? What?! And if I chmod that file to be executable, I get a new error, something related to "missing workspace".
I figured it out. I was cloning directly into the workspace and then setting my environment variables to point to the workspace as well. I modified both of those things: I now create a directory in my workspace and clone into it, and I also point my environment variables to directories inside my workspace. Like so:
node('build-01') {
    withEnv(["CMAKE_INSTALL_DIR=${WORKSPACE}/cmake_install", "SDK_INSTALL_DIR=${WORKSPACE}/sdk"]) {
        stage('Building') {
            echo "[*] Starting build (id: ${env.BUILD_ID}) on ${env.JENKINS_URL}"
            try {
                sh 'ls -l'
                //ls shows the damn file
                dir('path/to/checkout/') {
                    sh '. ./setup-target'
                }
            } catch(all) {
                sh "echo 'Failed to run setup-target script with error: ' ${all}"
            }
        }
    }
}
This works.
My guess would be that either CMAKE_INSTALL_DIR or SDK_INSTALL_DIR is on the PATH.
Instead of sh '. setup-target' you should use sh './setup-target'. With plain POSIX sh, the . (dot) command looks the sourced file up on $PATH rather than in the current directory, which is why sourcing it by bare name reports "file not found".

Jenkins: Pipeline sh bad substitution error

A step in my pipeline uploads a .tar to an Artifactory server. I am getting a Bad substitution error when passing in env.BUILD_NUMBER, but the same command works when the number is hard-coded. The script is written in Groovy through Jenkins and is running in the Jenkins workspace.
sh 'curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar"'
returns the errors:
[Pipeline] sh
[Package_Deploy_Pipeline] Running shell script
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: 2:
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: Bad substitution
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
If I hard-code a build number and swap out ${env.BUILD_NUMBER}, I get no errors and the code runs successfully.
sh 'curl -v --user user:password --data-binary ${buildDir}package113.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package113.tar"'
I use ${env.BUILD_NUMBER} within other sh commands within the same script and have no issues in any other places.
This turned out to be a syntax issue. Wrapping the command in single quotes caused ${env.BUILD_NUMBER} to be passed literally instead of its value. I wrapped the whole command in double quotes and escaped the nested double quotes. Works fine now.
sh "curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT \"http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar\""
In order to pass Groovy parameters into bash scripts in Jenkins pipelines (which sometimes causes bad substitutions), you have two options:
The triple double quotes way [ """ ]
OR
the triple single quotes way [ ''' ]
In triple double quotes you can render a normal parameter from Groovy using ${someVariable}; if it's an environment variable, ${env.someVariable}; if it's a parameter injected into your job, ${params.someVariable}.
example:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh """#!/bin/bash
cd ${YOUR_APPLICATION_PATH}
npm install
"""
In triple single quotes things get a little trickier: you can pass the parameter through an environment variable and use it as "\${someVariable}", or concatenate the Groovy parameter using ''' + someVariable + '''.
examples:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh '''#!/bin/bash
cd ''' + YOUR_APPLICATION_PATH + '''
npm install
'''
OR
pipeline{
agent { node { label "test" } }
environment {
YOUR_APPLICATION_PATH = "${WORKSPACE}/myapp/"
}
continue...
continue...
continue...
sh '''#!/bin/bash
cd "\${YOUR_APPLICATION_PATH}"
npm install
'''
//OR
sh '''#!/bin/bash
cd "\${env.YOUR_APPLICATION_PATH}"
npm install
'''
Actually, you seem to have misunderstood the env variable. In your sh block, you should access ${BUILD_NUMBER} directly.
Reason/Explanation: env represents the environment inside the script. This environment is used/available directly to anything that is executed, e.g. shell scripts.
Please also take care not to write anything to env.*; use withEnv{} blocks instead.
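In other words, leave the expansion to the shell (note the single quotes, so Groovy does not interpolate anything) and reach for withEnv only for custom values; a rough sketch, with TARGET_REPO as an illustrative name:
withEnv(['TARGET_REPO=release-packages']) {
    // BUILD_NUMBER is provided by Jenkins; TARGET_REPO comes from withEnv above
    sh 'echo "uploading package${BUILD_NUMBER}.tar to ${TARGET_REPO}"'
}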
Usually the most common cause of the
Bad substitution
error is using sh instead of bash.
Especially when using Jenkins, if you're using an Execute shell build step, make sure your command starts with a shebang, e.g. #!/bin/bash -xe or #!/usr/bin/env bash.
I can definitely tell you it's all about the sh shell vs. the bash shell. I fixed this problem by specifying #!/bin/bash -xe as follows:
node {
    stage("Preparing") {
        sh '''#!/bin/bash -xe
            colls=( col1 col2 col3 )
            for eachCol in "${colls[@]}"
            do
                echo $eachCol
            done
        '''
    }
}
I had this same issue when working on a Jenkins Pipeline for Amazon S3 Application upload.
My script was like this:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
Here's how I fixed it:
Seeing that it's an interpolation of variables, I had to change the single quotation marks (' ') in this line of the script:
sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
to double quotation marks (" "):
sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
So my script looked like this afterwards:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
That's all, I hope this helps.
I was having this issue with {env.MAJOR_VERSION} in the name of a jar artifact. I approached it by defining MAJOR_VERSION in an environment block in the Jenkinsfile.
pipeline {
    agent any
    environment {
        MAJOR_VERSION = 1
    }
    stages {
        stage('build') {
            steps {
                sh 'ant -f build.xml -v'
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'dist/*.jar', fingerprint: true
        }
    }
}
That solved the issue, and the bad substitution error no longer shows up in my Jenkins build output, so the environment block plays an important role in a Jenkinsfile.
The suggestion from @avivamg didn't work for me; here is the syntax that works for me:
sh "python3 ${env.WORKSPACE}/package.py --product productname " +
"--build_dir ${release_build_dir} " +
"--signed_product_dir ${signed_product_dir} " +
"--version ${build_version}"
I got a similar issue, but my use case is a little different.
steps {
    sh '''#!/bin/bash -xe
        VAR=TRIAL
        echo $VAR
        if [ -d /var/lib/jenkins/.m2/'\${params.application_name}' ]
        then
            echo 'working'
            echo ${VAR}
        else
            echo 'not working'
        fi
    '''
}
}
Here I'm trying to declare a variable inside the script and also use a parameter from outside.
After trying multiple ways, the following script worked:
stage('cleaning com/avizva directory') {
    steps {
        sh """#!/bin/bash -xe
            VAR=TRIAL
            echo \$VAR
            if [ -d /var/lib/jenkins/.m2/${params.application_name} ]
            then
                echo 'working'
                echo \${VAR}
            else
                echo 'not working'
            fi
        """
    }
}
Changes made:
Replaced triple single quotes with triple double quotes.
Whenever I want to refer to a local shell variable, I use the escape character:
$VAR --> \$VAR
This caused the error Bad Substitution:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh 'docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}'
            }
        }
    }
}
This fixed it: replace ' with " in the sh command:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh "docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}"
            }
        }
    }
}
In my case the Jenkins script was failing inside the sh command line, e.g.:
sh 'npm run build' <-- fails, referring to package.json
It needed to be changed to:
sh 'npm run ng build....'
because ng is not found on $PATH by the package.json scripts.
