I have a bash shell script executed by Jenkins / Hudson. For some reason when I create a function, the output of that function is not logged back to the console. This is an example of my code:
EDIT!!! - the culprit here is SSH... Is there a way to force its output back to the console?
perform_task()
{
ssh jenkins@192.168.100.5 pwd
}
echo "Starting a task";
perform_task || { echo "The Task Failed"; exit 1; }
The output in the console is:
"Starting a task"
If I move the commands outside of the function, I can see their output.
Help on this would be greatly appreciated!
I partially solved the problem by getting the script to echo the command:
#!/bin/bash -ex
This still hides the output of that command, but at least I have some insight into what's being run.
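If the SSH output still needs to be forced back onto the console, one option (a minimal sketch, reusing the host from the question and assuming the SSH connection itself succeeds) is to capture the remote command's output explicitly and echo it from the function:
perform_task()
{
    # Capture both stdout and stderr of the remote command, then echo it,
    # so whatever ssh prints always ends up on the Jenkins console.
    local output status
    output=$(ssh jenkins@192.168.100.5 pwd 2>&1)
    status=$?
    echo "$output"
    return $status
}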
I just copied your function, added #!/bin/bash -e to the beginning of it, and pasted it into the "Execute Shell" step of Jenkins.
It worked fine!
Is your bash executable actually located at /bin? What OS are you running the script on?
Building remotely on slave_1
[test] $ /bin/bash -xe /tmp/hudson1206345540964396738.sh
+ echo 'Starting a task'
Starting a task
+ perform_task
+ ssh slave_2 pwd
/home/jenkins
Finished: SUCCESS
Not sure about Jenkins or all that, but this worked for me:
perform_task()
{
ls -al
ps aux
pwd
}
echo "Starting a task"
perform_task || ( echo "The Task Failed" ; exit 1 )
Add #!/bin/bash as the first line of the file.
The reason is that the basic Bourne shell (/bin/sh) does not support some of the features you use.
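For reference, a minimal version of the script from the question with the shebang in place (host reused from the original):
#!/bin/bash
perform_task()
{
    ssh jenkins@192.168.100.5 pwd
}

echo "Starting a task"
perform_task || { echo "The Task Failed"; exit 1; }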
Related
I want to execute a Maven command (mvn) through a shell script that is run by cron.
My shell script:
echo "setting the variables"
export M2_HOME=/Users/<XXXXX>/Downloads/apache-maven-3.5.2
export M2=/Users/<XXXXX>/Downloads/apache-maven-3.5.2/bin
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_151.jdk/Contents/Home/jre
export PATH=$PATH:/Users/<XXXXX>/google-cloud-sdk/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Users/<XXXXX>/Documents/Software/neo-java-web-sdk-3.54.23/tools:/Users/<XXXXX>/Downloads/apache-maven-3.5.2/bin:/usr/local/go/bin:/Applications/Privileges.app/Contents/Resources:$JAVA_HOME/bin
echo "variables set"
{
/Library/Java/JavaVirtualMachines/jdk1.8.0_151.jdk/Contents/Home/jre/bin/java -version > /Users/<XXXXX>/git/SimpleTests/org.saurav.simpletests/java.log
} || {
echo "java command failed"
}
{
/Users/<XXXXX>/Downloads/apache-maven-3.5.2/bin/mvn -version
} || {
echo "mvn command failed"
}
echo "Tests Executed"
output.log always prints:
setting the variables
variables set
mvn command failed
Tests Executed
So it seems the mvn command execution fails.
The output of the java command has been redirected to java.log, but that file is empty. Still, the java command does seem to run, since its fallback echo statement is not printed here.
Best Regards,
Saurav
It seems mvn is being treated as a protected file execution.
With macOS Mojave, programs that access protected files have to be given Full Disk Access.
Please check here
https://apple.stackexchange.com/questions/378553/crontab-operation-not-permitted
https://support.intego.com/hc/en-us/articles/360016683471-Enable-Full-Disk-Access-in-macOS
After I followed the steps mentioned in the first link, it started working.
Best Regards,
Saurav
I have a shell script that executes multiple SQL files that apply updates to the database. I am calling the shell script from the Jenkins build's "Execute shell" step. The Jenkins console shows success at all times, irrespective of errors from the SQL files. I want Jenkins to fail the build if any of the SQL files fails to execute, and to send the console output to the developer on failure.
I tried echo $? in the shell script but it shows 0.
#!/bin/bash
walk_dir () {
    shopt -s nullglob dotglob
    for pathname in "$1"/*; do
        if [ -d "$pathname" ]; then
            walk_dir "$pathname"
        else
            case "$pathname" in
                *.sql|*.SQL)
                    printf '%s\n Executing SQL File:' "$pathname"
                    sudo -u postgres psql <DBName> -f $pathname
                    rm $pathname
            esac
        fi
    done
}
DOWNLOADING_DIR=/home/jenkins/DB/
walk_dir "$DOWNLOADING_DIR"
Jenkins Console results
ALTER TABLE
ERROR: cannot change return type of existing
DETAIL: Row type defined by OUT parameters is different.
CREATE FUNCTION
ALTER FUNCTION
CREATE FUNCTION
ALTER FUNCTION
Finished: SUCCESS
Expected results: the build should fail in Jenkins if any of the SQL files fails to execute from the shell script, but it is showing as passed in Jenkins.
Thanks for all the inputs. I was able to fix this issue. I installed the 'Log Parser' plugin in Jenkins, which parses keywords like /Error/ in the console output and makes the build fail.
This Stack Overflow answer will probably address your scenario: Automatic exit from bash shell script on error
This duplicate answer also provides useful guidance: Stop on first error [duplicate]
Essentially, use set -e or #!/bin/bash -e.
If you don't trap every potential error, then the next step in the script will execute and the return code will be that of the last command in the script.
Direct link to www.davidpashley.com - Writing Robust Bash Shell Scripts
** This also assumes that any external commands (e.g. psql) properly trap errors and return status codes.
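A minimal sketch of how that could look for the psql call in the question. Note that psql needs its ON_ERROR_STOP variable set for SQL errors inside the file to be reflected in its exit status; the database name stays a placeholder, as in the original:
#!/bin/bash -e
# -e stops the script on the first failing command, which in turn fails the Jenkins build.
# ON_ERROR_STOP makes psql exit non-zero when a statement in the file fails,
# so the error actually reaches the shell.
sudo -u postgres psql <DBName> -v ON_ERROR_STOP=1 -f "$pathname"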
The code below may help you. This is how I sorted out the issue with mine.
Instead of sudo -u postgres psql <DBName> -f $pathname
use the code below:
# Capture stderr as well, since psql prints its ERROR messages there.
OUTPUT=$(psql -U postgres -d <DBName> -c "\i $pathname;" 2>&1)
echo "$OUTPUT" | grep ERROR
if [[ $? -eq 0 ]]
then
    echo "Error while running sql file $pathname"
    exit 2
else
    echo "$pathname - SQL file executed successfully"
fi
Jenkins will not give you the SQL files' error codes. Jenkins only checks the exit status of the shell script as a whole, and that status is generally zero because the script itself runs to completion.
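If you would rather have the build fail without grepping the output, a rough sketch along the lines of the walk_dir script in the question is to record failures and exit non-zero at the end (the FAILED flag is introduced here just for illustration, and psql's ON_ERROR_STOP is again assumed):
#!/bin/bash
FAILED=0

walk_dir () {
    shopt -s nullglob dotglob
    for pathname in "$1"/*; do
        if [ -d "$pathname" ]; then
            walk_dir "$pathname"
        else
            case "$pathname" in
                *.sql|*.SQL)
                    echo "Executing SQL file: $pathname"
                    # Record the failure but keep processing the remaining files.
                    sudo -u postgres psql <DBName> -v ON_ERROR_STOP=1 -f "$pathname" || FAILED=1
                    rm "$pathname"
                    ;;
            esac
        fi
    done
}

DOWNLOADING_DIR=/home/jenkins/DB/
walk_dir "$DOWNLOADING_DIR"

# Jenkins marks the build as failed only when the script's exit status is non-zero.
exit "$FAILED"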
I made a simple script. The file name is sutest.
#!/bin/bash
cd ~/Downloads/redis-4.0.1/src
./redis-server
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
I ran the script: $ . sutest
But the script stops at ./redis-server.
So I can't see the echo messages.
I want to make this kind of script file. How can I do that?
I would appreciate your help.
Let's consider a more general case.
A myscript1 file executes a process like redis-server above.
Another myscript2 file executes a process like redis-server above.
Another myscript3 file executes a process like redis-server above.
How can I run the three script files simultaneously?
I want to do the job over an SSH connection.
To make matters worse, what if I can't use screen or tmux?
Add an '&' character at the end of the line:
./redis-server &
This character runs the job in the background, and the script continues.
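For the more general case in the question (three scripts started from one SSH session, without screen or tmux), the same idea extends to something like the sketch below; the script names are the placeholders from the question, and nohup is used so the jobs keep running if the SSH session goes away:
#!/bin/bash
# Start all three scripts in the background and detach them from the terminal,
# redirecting their output to log files so it is not lost.
nohup ./myscript1 > myscript1.log 2>&1 &
nohup ./myscript2 > myscript2.log 2>&1 &
nohup ./myscript3 > myscript3.log 2>&1 &

# Optional: block until all three have finished.
wait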
Just do the echos first:
cd ~/Downloads/redis-4.0.1/src
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
exec ./redis-server
The use of exec is a small trick (which you can omit if you prefer): it replaces the shell script with redis-server, so the shell script is no longer running at all. Without exec, you end up with the shell script waiting around for redis-server to finish, which is unnecessary if the script will do nothing further.
If you don't like that for some reason, you can keep the original order:
cd ~/Downloads/redis-4.0.1/src
./redis-server & # run in background
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
wait # optional
I am writing a bash script that modifies some config files, runs "ant ear war" as a different user, outputs the return value, then exits back to root to continue with the rest of the script. The issue is that the script does not continue after exiting, and I don't get any output from "ant ear war".
Thank you for the help.
here is an example
# When running the bash script I don't see the output. Maybe it's because I run it as root and switch to another_user. So I tried outputting the result to a variable and to a text file. Both failed.
su another_user
cd /usr/empi/MMEMPIV741/
echo $(ant ear war) >> /tmp/empi_install.txt
varant="$?"
echo "if zero it's a success, otherwise it's a failure"
cp /usr/accessmgr/AMV741/bin/am/JBoss/AccessManager.war /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/
cp /usr/empi/MMEMPIV741/person_project/working-dir/dist/* /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/
exit
# By this point the script should have exited from another_user and returned to root
echo $varant
echo "http://`hostname`:21080/PersonMasterIndexDQM/flex/login.jsp#"
Put the commands you want to run in a different user context into a separate script and run that script via
su another_user -c /path/to/other.sh
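A rough sketch of how that could look for the build in the question; the path /path/to/other.sh is illustrative, and other.sh would contain the cd and the ant ear war call:
#!/bin/bash
# other.sh (run as another_user) would contain:
#   cd /usr/empi/MMEMPIV741/
#   ant ear war

su another_user -c /path/to/other.sh
varant=$?    # su returns the exit status of the command it ran

# Back in the root context, continue with the rest of the script.
echo "$varant"
cp /usr/empi/MMEMPIV741/person_project/working-dir/dist/* /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/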
I have a bash script which calls another bash script, like so:
#!/bin/bash
echo "Hi"
./script-two.sh
echo "Hello!"
The problem that I have is that it never makes it to printing "Hello!"
I think this is because ./script-two.sh (Which I did not write) is somehow exiting or changing the shell. I have included this script at the end of this post.
Is there a way I can guarantee that my execution will continue after script-two.sh executes?
I have looked into using the trap command, but I don't fully understand how to use it properly.
Thanks,
Casey
Here is the contents of what would be script-two.sh
#!/bin/sh
# This file is part of the DITA Open Toolkit project hosted on
# Sourceforge.net. See the accompanying license.txt file for
# applicable licenses.
# (c) Copyright IBM Corp. 2006 All Rights Reserved.
export DITA_HOME=cwd
if [ "${DITA_HOME:+1}" != "1" ]; then
echo "DITA_HOME environment variable is empty or not set";
exit 127;
fi
echo $DITA_HOME
cd "$DITA_HOME"
# Get the absolute path of DITAOT's home directory
DITA_DIR="`pwd`"
echo $DITA_DIR
if [ -f "$DITA_DIR"/tools/ant/bin/ant ] && [ ! -x "$DITA_DIR"/tools/ant/bin/ant ]; then
chmod +x "$DITA_DIR"/tools/ant/bin/ant
fi
export ANT_OPTS="-Xmx512m $ANT_OPTS"
export ANT_OPTS="$ANT_OPTS -Djavax.xml.transform.TransformerFactory=net.sf.saxon.TransformerFactoryImpl"
export ANT_HOME="$DITA_DIR"/tools/ant
export PATH="$DITA_DIR"/tools/ant/bin:"$PATH"
NEW_CLASSPATH="$DITA_DIR/lib:$DITA_DIR/lib/dost.jar:$DITA_DIR/lib/commons-codec-1.4.jar:$DITA_DIR/lib/resolver.jar:$DITA_DIR/lib/icu4j.jar"
NEW_CLASSPATH="$DITA_DIR/lib/saxon/saxon9.jar:$DITA_DIR/lib/saxon/saxon9-dom.jar:$NEW_CLASSPATH"
NEW_CLASSPATH="$DITA_DIR/lib/saxon/saxon9-dom4j.jar:$DITA_DIR/lib/saxon/saxon9-jdom.jar:$NEW_CLASSPATH"
NEW_CLASSPATH="$DITA_DIR/lib/saxon/saxon9-s9api.jar:$DITA_DIR/lib/saxon/saxon9-sql.jar:$NEW_CLASSPATH"
NEW_CLASSPATH="$DITA_DIR/lib/saxon/saxon9-xom.jar:$DITA_DIR/lib/saxon/saxon9-xpath.jar:$DITA_DIR/lib/saxon/saxon9-xqj.jar:$NEW_CLASSPATH"
if test -n "$CLASSPATH"
then
export CLASSPATH="$NEW_CLASSPATH":"$CLASSPATH"
else
export CLASSPATH="$NEW_CLASSPATH"
fi
"$SHELL"
It looks like script-two.sh is setting up an ant build environment.
I think the author intended that it sets up the build environment, then you type your build commands in manually, then type exit to leave the build environment.
I say this because the bottom line of script-two.sh is:
"$SHELL"
which starts a new shell.
Try running your script, then type exit. I think you will see it print Hello! after you type exit.
I'm guessing you're trying to do something like:
#!/bin/bash
echo "Hi"
./script-two.sh
ant <some args>
To do that, what you really want to do is source it, by changing:
./script-two.sh
to
. script-two.sh
e.g.
#!/bin/bash
echo "Hi"
. script-two.sh
ant <some args>
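As a side note on why sourcing matters here: a sourced script runs in the current shell, so its exports, cd, and PATH changes persist, while an executed script runs in a child process whose environment disappears when it exits. A tiny illustration (setup-env.sh is a made-up file name):
# Create a throwaway script that only exports a variable.
cat > setup-env.sh <<'EOF'
#!/bin/sh
export BUILD_HOME=/tmp/build
EOF
chmod +x setup-env.sh

./setup-env.sh                  # executed: runs in a child process
echo "BUILD_HOME=$BUILD_HOME"   # empty, the export did not survive

. ./setup-env.sh                # sourced: runs in the current shell
echo "BUILD_HOME=$BUILD_HOME"   # prints BUILD_HOME=/tmp/build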
But, you will need to edit script-two.sh and change:
"$SHELL"
to:
case $0 in *script-two.sh)
# executed, start a new shell with the new environment
"$SHELL"
;;
*)
# sourced, don't start a new shell
;;
esac
so that it only starts a shell if the script is being run like ./script-two.sh, but not if it is being sourced like . script-two.sh.
Or if you absolutely can't change script-two.sh, then you could do:
#!/bin/bash
echo "Hi"
. script-two.sh </dev/null
ant <some args>
which will trick "$SHELL" into exiting because it has no input.
Also
export DITA_HOME=cwd
doesn't seem right to me.
It should probably be
export DITA_HOME=$(pwd)
or
export DITA_HOME=`pwd`
(both are equivalent)
I had a similar problem today; upon digging, I finally found the answer.
The script I was calling (from within my script) actually had an exit 0 at the end. Removing that fixed my issue.
Just leaving this here in case someone finds it useful.
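For what it's worth, this ties back to the sourcing discussion above: an exit in a sourced script terminates the calling script as well, while an exit in an executed child only ends the child. A quick way to see the difference (inner.sh is a made-up name):
cat > inner.sh <<'EOF'
#!/bin/sh
echo "inner ran"
exit 0
EOF
chmod +x inner.sh

./inner.sh                       # executed: exit 0 only ends the child process
echo "still running after executing inner.sh"

. ./inner.sh                     # sourced: this exit 0 terminates the caller
echo "this line is never reached"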
Well for starters, you can execute your bash script with the -x switch to see where it is failing:
bash -x script-one.sh
Secondly, if you call the second script like this:
#!/bin/bash
echo "Hi"
var=$(bash script-two.sh)
echo "Hello!"
It will continue, as long as script-two.sh exits cleanly. Again, you can run bash with the -x switch against that script to find any problems.
And as Mikel mentioned, always make sure to have exit at the bottom of your scripts.