How to capture the last part of a git URL using a shell script? - bash

I am writing a Jenkins pipeline and am trying to capture the last part of the git URL without the .git extension. For instance, given https://github.hhhh.com/aaaaaa-dddd/xxxx-yyyy.git, I want only xxxx-yyyy to be returned. Below is my code:
String getProjectName() {
    echo "inside getProjectName +++++++"
    # projectName = sh(
    #     script: "git config --get remote.origin.url",
    #     returnStdout: true
    # ).trim()
    def projectName = sh returnStdout: true, script: '''
        #!/bin/bash
        GIT_LOG = $(env -i git config --get remote.origin.url)
        echo $GIT_LOG
        basename -s .git "$GIT_LOG"; '''
    echo "projectName: ${projectName}"
    return projectName
}
PS: Please ignore the commented lines of code.

There is basic Bourne shell functionality that achieves that:
# strip everything up to the last /
projectName=${GIT_LOG##*/}
# strip trailing .git
projectName=${projectName%.git}
This leaves just the requested name in projectName.
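Putting that together with the git config call from the question, a minimal plain-shell sketch (assuming the remote is named origin) would be:
GIT_URL=$(git config --get remote.origin.url)
projectName=${GIT_URL##*/}       # e.g. xxxx-yyyy.git
projectName=${projectName%.git}  # e.g. xxxx-yyyy
echo "$projectName"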

Don't put spaces before or after the = in a shell assignment. Then basename can strip the suffix directly:
x='https://github.hhhh.com/aaaaaa-dddd/xxxx-yyyy.git'
basename "$x" .git
Output:
xxxx-yyyy
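Fed straight from git, that becomes a one-liner (again assuming the remote is called origin):
basename -s .git "$(git config --get remote.origin.url)"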

Related

How to return output of shell script into Jenkinsfile [duplicate]

I have something like this in a Jenkinsfile (Groovy) and I want to record the stdout and the exit code in a variable in order to use the information later.
sh "ls -l"
How can I do this, especially as it seems that you cannot really run any kind of groovy code inside the Jenkinsfile?
The latest version of the pipeline sh step allows you to do the following:
// Git committer email
GIT_COMMIT_EMAIL = sh (
script: 'git --no-pager show -s --format=\'%ae\'',
returnStdout: true
).trim()
echo "Git committer email: ${GIT_COMMIT_EMAIL}"
Another feature is the returnStatus option.
// Test commit message for flags
BUILD_FULL = sh (
script: "git log -1 --pretty=%B | grep '\\[jenkins-full]'",
returnStatus: true
) == 0
echo "Build full flag: ${BUILD_FULL}"
These options were added based on this issue.
See the official documentation for the sh command.
For declarative pipelines (see comments), you need to wrap the code in a script step:
script {
    GIT_COMMIT_EMAIL = sh (
        script: 'git --no-pager show -s --format=\'%ae\'',
        returnStdout: true
    ).trim()
    echo "Git committer email: ${GIT_COMMIT_EMAIL}"
}
The current Pipeline version natively supports returnStdout and returnStatus, which make it possible to get the output or status from sh/bat steps.
An example:
def ret = sh(script: 'uname', returnStdout: true)
println ret
See the official documentation.
The quick answer is this:
sh "ls -l > commandResult"
result = readFile('commandResult').trim()
I think there exists a feature request to be able to get the result of the sh step, but as far as I know, there is currently no other option.
EDIT: JENKINS-26133
EDIT2: Not quite sure since which version, but sh/bat steps can now return the standard output; simply:
def output = sh returnStdout: true, script: 'ls -l'
If you want to get the stdout AND know whether the command succeeded or not, just use returnStdout and wrap it in an exception handler:
scripted pipeline
try {
    // Fails with non-zero exit if dir1 does not exist
    def dir1 = sh(script:'ls -la dir1', returnStdout:true).trim()
} catch (Exception ex) {
    println("Unable to read dir1: ${ex}")
}
output:
[Pipeline] sh
[Test-Pipeline] Running shell script
+ ls -la dir1
ls: cannot access dir1: No such file or directory
[Pipeline] echo
unable to read dir1: hudson.AbortException: script returned exit code 2
Unfortunately hudson.AbortException is missing any useful method to obtain that exit status, so if the actual value is required you'd need to parse it out of the message (ugh!)
Contrary to the Javadoc https://javadoc.jenkins-ci.org/hudson/AbortException.html the build is not failed when this exception is caught. It fails when it's not caught!
Update:
If you also want the STDERR output from the shell command, Jenkins unfortunately fails to properly support that common use-case. A 2017 ticket JENKINS-44930 is stuck in a state of opinionated ping-pong whilst making no progress towards a solution - please consider adding your upvote to it.
As to a solution now, there could be a couple of possible approaches:
a) Redirect STDERR to STDOUT with 2>&1
- but it's then up to you to parse that out of the main output, and you won't get the output at all if the command failed, because you're in the exception handler.
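As a plain-shell sketch of option (a), using the same ls example:
# stderr is merged into the captured value, so error lines must be picked apart from normal output
output=$(ls -la dir1 2>&1)
echo "$output"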
b) redirect STDERR to a temporary file (the name of which you prepare earlier) 2>filename (but remember to clean up the file afterwards) - ie. main code becomes:
def stderrfile = 'stderr.out'
try {
    def dir1 = sh(script:"ls -la dir1 2>${stderrfile}", returnStdout:true).trim()
} catch (Exception ex) {
    def errmsg = readFile(stderrfile)
    println("Unable to read dir1: ${ex} - ${errmsg}")
}
c) Go the other way, set returnStatus=true instead, dispense with the exception handler and always capture output to a file, ie:
def outfile = 'stdout.out'
def status = sh(script:"ls -la dir1 >${outfile} 2>&1", returnStatus:true)
def output = readFile(outfile).trim()
if (status == 0) {
    // output is directory listing from stdout
} else {
    // output is error message from stderr
}
Caveat: the above code is Unix/Linux-specific - Windows requires completely different shell commands.
Here is a sample case, which I believe will make sense:
node('master') {
    stage('stage1') {
        def commit = sh (returnStdout: true, script: '''echo hi
echo bye | grep -o "e"
date
echo lol''').split()
        echo "${commit[-1]} "
    }
}
For those who need to use the output in subsequent shell commands, rather than groovy, something like this example could be done:
stage('Show Files') {
    environment {
        MY_FILES = sh(script: 'cd mydir && ls -l', returnStdout: true)
    }
    steps {
        sh '''
            echo "$MY_FILES"
        '''
    }
}
I found the examples on code maven to be quite useful.
All the above methods will work, but to use the variable as an environment variable inside your code you need to export it first.
script {
    sh " 'shell command here' > command"
    command_var = readFile('command').trim()
    sh "export command_var=$command_var"
}
Replace the shell command with the command of your choice. Now, if you are using Python code, you can just call os.getenv("command_var"), which will return the output of the shell command executed previously.
How to read a shell variable in Groovy / how to assign a shell return value to a Groovy variable.
Requirement: open a text file, read the lines using shell, store the values in Groovy, and get the parameters of each line.
Here , is the delimiter.
Ex: releaseModules.txt
./APP_TSBASE/app/team/i-home/deployments/ip-cc.war/cs_workflowReport.jar,configurable-wf-report,94,23crb1,artifact
./APP_TSBASE/app/team/i-home/deployments/ip.war/cs_workflowReport.jar,configurable-temppweb-report,394,rvu3crb1,artifact
========================
Here we want to get the module name (2nd parameter: configurable-wf-report), the build number (3rd parameter: 94), and the commit id (4th parameter: 23crb1).
def module = sh(script: """awk -F',' '{ print \$2 "," \$3 "," \$4 }' releaseModules.txt | sort -u """, returnStdout: true).trim()
echo module
List lines = module.split( '\n' ).findAll { !it.startsWith( ',' ) }
def buildid
def Modname
lines.each {
    List det1 = it.split(',')
    buildid = det1[1].trim()
    Modname = det1[0].trim()
    tag = det1[2].trim()
    echo Modname
    echo buildid
    echo tag
}
If you don't have a single sh command but a block of sh commands, returnStdout won't work.
I had a similar issue where I applied something that is not a clean way of doing this, but eventually it worked and served the purpose.
Solution -
In the shell block, echo the value and write it into some file.
Outside the shell block and inside the script block, read this file, trim it, and assign it to any local/params/environment variable.
example -
steps {
    script {
        sh '''
            echo $PATH > path.txt
            # I am using '>' because I want to create a new file every time to get the newest value of PATH
        '''
        path = readFile(file: 'path.txt')
        path = path.trim() // local groovy variable assignment
        // One can assign these values to env and params as below -
        env.PATH = path    // if you want to assign it to an env var
        params.PATH = path // if you want to assign it to a params var
    }
}
The easiest way is this:
my_var=`echo 2`
echo $my_var
Output:
2
Note that it is not a simple single quote but a backquote ( ` ).
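The same thing with $( ) command substitution, which is equivalent but easier to nest:
my_var=$(echo 2)
echo $my_var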

Git completion for alias as if for Git itself

Background
I have successfully configured Bash completion for various Git aliases. For example:
$ git config alias.subject
!git --no-pager show --quiet --pretty='%s'
$ function _git_subject() { _git_show; }
$ git subject my<TAB>
$ git subject my-branch
Challenge
However, I have a Git alias that I don't know how to set up Bash completion for. The problem is that I want the alias to complete as if for the top-level Git command itself. The alias is this:
$ git config alias.alias
alias = !"f() { if [[ \"$#\" != 1 ]]; then >&2 echo \"Usage: git alias COMMAND\"; return 1; fi; git config alias.\"$1\"; }; f"
# Example
$ git alias s
status
I have tried using _git, __git_main, and __git_wrap__git_main, but none of them work (I think it leads to an infinite loop since it never returns after I press tab).
Is there a way to add completion for a Git alias that completes as if it was the top-level Git command? Or specifically how to have completion for this alias?
Tried but doesn't work
function _git_alias() { _git; }
function _git_alias() { __git_main; }
function _git_alias() { __git_wrap__git_main; }
Desired behavior
$ git alias su<TAB>
subject submodule
$ git alias sub
Alternatively, if there's an easy way to complete for only aliases that would be cool, too. I would like to know how to complete as if for the top-level Git command just for curiosity as well, though.
I was finally able to create a working solution with a bit of hackery around the "magic" Bash completion variables. I changed these variables to "pretend" we were completing the given command as given to git itself.
If anybody has any suggestions to simplify this I would totally be open to suggestions.
# This is complex because we want to delegate to the completion for Git
# itself without ending up with an infinite loop (which happens if you try
# to just delegate to _git).
_git_alias() {
    if [[ "$COMP_CWORD" -lt 2 ]]; then
        return
    fi
    local old_comp_line_length new_comp_line_length
    COMP_WORDS=(git "${COMP_WORDS[@]:2}")
    ((COMP_CWORD -= 1))
    old_comp_line_length=${#COMP_LINE}
    if [[ "$COMP_LINE" =~ ^[^[:blank:]]+[[:blank:]]+[^[:blank:]]+[[:blank:]]+(.*)$ ]]; then
        COMP_LINE="git ${BASH_REMATCH[1]}"
    fi
    new_comp_line_length=${#COMP_LINE}
    (( COMP_POINT += new_comp_line_length - old_comp_line_length ))
    _git "$@"
# git alias blah
# ^
# 01234567890123
# 0 1
# point: 11
# length: 13
#
# git blah
# ^
# 01234567
# point: 5
# length: 7
#
# point = point - (old length) + (new length)
# point = 11 - 13 + 7
# point = -2 + 7
# point = 5
}
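For the alternative mentioned in the question (completing only alias names rather than the whole Git command set), a minimal sketch, assuming the standard git-completion.bash helpers such as __gitcomp are already loaded:
_git_alias() {
    # offer only the names of configured aliases
    __gitcomp "$(git config --get-regexp '^alias\.' | sed -e 's/^alias\.//' -e 's/ .*//')"
}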

Escape double quotes in a Jenkins pipeline file's shell command

Below is a snippet from my Jenkins file -
stage('Configure replication agents') {
    environment {
        AUTHOR_NAME="XX.XX.XX.XX"
        PUBLISHER_NAME="XX.XX.XX.XX"
        REPL_USER="USER"
        REPL_PASSWORD="PASSWORD"
        AUTHOR_PORT="4502"
        PUBLISHER_PORT="4503"
        AUTHOR="http://${AUTHOR_NAME}:${AUTHOR_PORT}"
        PUBLISHER="http://${PUBLISHER_NAME}:${PUBLISHER_PORT}"
        S_URI= "${PUBLISHER}/bin/receive?sling:authRequestLogin=1"
    }
    steps {
        sh 'curl -u XX:XX --data "status=browser&cmd=createPage&label=${PUBLISHER_NAME}&title=${PUBLISHER_NAME}&parentPath =/etc/replication/agents.author&template=/libs/cq/replication/templates/agent" ${AUTHOR}/bin/wcmcommand'
    }
The above command, in Jenkins console, is printed as
curl -u XX:XX --data status=browser&cmd=createPage&label=XXXX&title=XXX&parentPath =/etc/replication/agents.author&template=/libs/cq/replication/templates/agent http://5XXXX:4502/bin/wcmcommand
Note how the double quotes "" are missing.
I need to preserve the double quotes after --data in this command. How do I do it?
I tried using forward slashes but that didn't work.
Cheers
To expand on my comment, a quick test revealed it's the case.
You need to escape twice: once the quote for the shell with a backslash, and once that backslash with another backslash for Groovy itself.
node() {
    sh 'echo "asdf"'
    sh 'echo \"asdf\"'
    sh 'echo \\"asdf\\"'
}
Result
[Pipeline] {
[Pipeline] sh
+ echo asdf
asdf
[Pipeline] sh
+ echo asdf
asdf
[Pipeline] sh
+ echo "asdf"
"asdf"
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
After a long time of struggling and googling, this is what worked for me on a similar use case:
sh("ssh root#my.server.com \"su user -c \\\"mkdir ${newDirName}\\\"\"")
Update: How I think it gets interpreted
1] The sh step strips the first level of escaping (\" becomes " and \\ becomes \; the first and last " are not part of the input):
ssh root@my.server.com "su user -c \"mkdir ${newDirName}\""
2] The ssh command strips the second level of escaping (\" becomes ", while the outer " is also not part of the input):
su user -c "mkdir ${newDirName}"
I had double quotes inside the variable, so escaped single quotes worked for me:
sh "git commit -m \'${ThatMayContainDoubleQuotes}\'"
I needed the output to have a trailing \\, so I had to do something like this:
echo 'key1 = \\\\"__value1__\\\\"' > auto.file
File looks like
cat auto.file
key1 = \\"__value1__\\"
Dependent Script
export value1="some-value"
var=${value1}
# Read in the template one line at a time, and replace variables
tmpfile=$(mktemp)
sed -E 's/__(([^_]|_[^_])*)__/${\\1}/g' auto.file > ${tmpfile}
while read auto
do
eval echo "$auto"
done < "${tmpfile}" > autoRendered.file
rm -f ${tmpfile}
Rendered File looks like
cat autoRendered.file
key1 = "some-value"
For anyone who comes looking for a fix to a similar issue with quoting numbers during helm install/upgrade, you can use --set-string instead of --set
Ref: https://helm.sh/docs/chart_best_practices/values/#consider-how-users-will-use-your-values
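For example, a sketch with a hypothetical chart path and value name, where --set-string keeps the value as a string instead of letting it be parsed as a number:
helm upgrade --install my-release ./my-chart --set-string image.tag="20200824"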

Rename directory in jenkins with shell cmd

I'm trying to run a shell script in Jenkins to rename a directory and add the date to it.
// rename file
sh("mv $file-reports $file-reports-$date")
sh("mv $file-reports-$date jmeter-tests")
date is obtained by this next script:
// Getting date
date = sh(
script: """
(date +%T-"%F")
""",
returnStdout: true
)
output of date : 12:55:39-2018-07-26
I'm getting this error in the log:
[Pipeline] sh
[workspace] Running shell script
+ mv quickquote-belair-appstatic-reports quickquote-belair-appstatic-reports-15:59:27-2018-07-26a
[Pipeline] sh
[workspace] Running shell script
+ mv -T quickquote-belair-appstatic-reports-15:59:27-2018-07-26a
mv: missing destination file operand after ‘quickquote-belair-appstatic-reports-15:59:27-2018-07-26a’
Try 'mv --help' for more information.
I'm confused as to why it's telling me there's a missing destination file..?
The return value of sh() is followed by a newline by default, so you need to use trim() to remove the newline at the end, as follows:
date = sh(
script: """
(date +%T-"%F")
""",
returnStdout: true
).trim()
Because $date has a newline at the end, mv $file-reports-$date jmeter-tests gets broken after mv $file-reports-$date: the destination jmeter-tests moves onto the next line, so mv reports a missing destination file operand.
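An alternative sketch is to do the whole rename inside one shell step, since $( ) command substitution strips the trailing newline by itself (here $file stands for the same variable used in the question):
d=$(date +%T-%F)
mv "${file}-reports" "${file}-reports-${d}"
mv "${file}-reports-${d}" jmeter-tests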

Run a string as a command within a Bash script

I have a Bash script that builds a string to run as a command
Script:
#! /bin/bash
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
teamAComm="`pwd`/a.sh"
teamBComm="`pwd`/b.sh"
include="`pwd`/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
illcommando="$serverbin include='$include' server::team_l_start = '${teamAComm}' server::team_r_start = '${teamBComm}' CSVSaver::save='true' CSVSaver::filename = 'out.csv'"
echo "running: $illcommando"
# $illcommando > server-output.log 2> server-error.log
$illcommando
which does not seem to supply the arguments correctly to the $serverbin.
Script output:
running: /usr/local/bin/rcssserver include='/home/joao/robocup/runner_workdir/server_official.conf' server::team_l_start = '/home/joao/robocup/runner_workdir/a.sh' server::team_r_start = '/home/joao/robocup/runner_workdir/b.sh' CSVSaver::save='true' CSVSaver::filename = 'out.csv'
rcssserver-14.0.1
Copyright (C) 1995, 1996, 1997, 1998, 1999 Electrotechnical Laboratory.
2000 - 2009 RoboCup Soccer Simulator Maintenance Group.
Usage: /usr/local/bin/rcssserver [[-[-]]namespace::option=value]
[[-[-]][namespace::]help]
[[-[-]]include=file]
Options:
help
display generic help
include=file
parse the specified configuration file. Configuration files
have the same format as the command line options. The
configuration file specified will be parsed before all
subsequent options.
server::help
display detailed help for the "server" module
player::help
display detailed help for the "player" module
CSVSaver::help
display detailed help for the "CSVSaver" module
CSVSaver Options:
CSVSaver::save=<on|off|true|false|1|0|>
If save is on/true, then the saver will attempt to save the
results to the database. Otherwise it will do nothing.
current value: false
CSVSaver::filename='<STRING>'
The file to save the results to. If this file does not
exist it will be created. If the file does exist, the results
will be appended to the end.
current value: 'out.csv'
If I just paste the command /usr/local/bin/rcssserver include='/home/joao/robocup/runner_workdir/server_official.conf' server::team_l_start = '/home/joao/robocup/runner_workdir/a.sh' server::team_r_start = '/home/joao/robocup/runner_workdir/b.sh' CSVSaver::save='true' CSVSaver::filename = 'out.csv' (in the output after "running: ") it works fine.
You can use eval to execute a string:
eval $illcommando
your_command_string="..."
output=$(eval "$your_command_string")
echo "$output"
I usually place commands in parentheses $(commandStr); if that doesn't help, I find bash debug mode great: run the script as bash -x script.
Don't put your commands in variables; just run them:
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
PWD=$(pwd)
teamAComm="$PWD/a.sh"
teamBComm="$PWD/b.sh"
include="$PWD/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
$serverbin include=$include server::team_l_start = ${teamAComm} server::team_r_start=${teamBComm} CSVSaver::save='true' CSVSaver::filename = 'out.csv'
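If you do want to build the command up piece by piece, a bash array keeps each argument intact without needing eval; a sketch of the same invocation:
cmd=("$serverbin" include="$include" server::team_l_start="$teamAComm" server::team_r_start="$teamBComm" CSVSaver::save='true' CSVSaver::filename='out.csv')
echo "running: ${cmd[*]}"
"${cmd[@]}" > server-output.log 2> server-error.log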
./me casts raise_dead()
I was looking for something like this, but I also needed to reuse the same string minus two parameters so I ended up with something like:
my_exe ()
{
mysql -sN -e "select $1 from heat.stack where heat.stack.name=\"$2\";"
}
This is something I use to monitor openstack heat stack creation. In this case I expect two conditions, an action 'CREATE' and a status 'COMPLETE' on a stack named "Somestack"
To get those variables I can do something like:
ACTION=$(my_exe action Somestack)
STATUS=$(my_exe status Somestack)
if [[ "$ACTION" == "CREATE" ]] && [[ "$STATUS" == "COMPLETE" ]]
...
Here is my gradle build script that executes strings stored in heredocs:
current_directory=$( realpath "." )
GENERATED=${current_directory}/"GENERATED"
build_gradle=$( realpath build.gradle )
## touch because .gitignore ignores this folder:
touch $GENERATED
COPY_BUILD_FILE=$( cat <<COPY_BUILD_FILE_HEREDOC
cp
$build_gradle
$GENERATED/build.gradle
COPY_BUILD_FILE_HEREDOC
)
$COPY_BUILD_FILE
GRADLE_COMMAND=$( cat <<GRADLE_COMMAND_HEREDOC
gradle run
--build-file
$GENERATED/build.gradle
--gradle-user-home
$GENERATED
--no-daemon
GRADLE_COMMAND_HEREDOC
)
$GRADLE_COMMAND
The lone ")" are kind of ugly. But I have no clue how to fix that asthetic aspect.
To see all commands that are being executed by the script, add the -x flag to your shebang line, and execute the command normally:
#! /bin/bash -x
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
teamAComm="`pwd`/a.sh"
teamBComm="`pwd`/b.sh"
include="`pwd`/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
$serverbin include="$include" server::team_l_start="${teamAComm}" server::team_r_start="${teamBComm}" CSVSaver::save='true' CSVSaver::filename='out.csv'
Then if you sometimes want to ignore the debug output, redirect stderr somewhere.
For me, echo XYZ_20200824.zip | grep -Eo '[[:digit:]]{4}[[:digit:]]{2}[[:digit:]]{2}'
was working fine, but I was unable to store the output of the command in a variable.
I had the same issue; I tried eval but didn't get any output.
Here is the answer to my problem:
cmd=$(echo XYZ_20200824.zip | grep -Eo '[[:digit:]]{4}[[:digit:]]{2}[[:digit:]]{2}')
echo $cmd
My output is now 20200824
