SSH commands via bash script - bash

I've made several failed attempts at the following:
Basically, I need to execute a sequence of commands in a remote Unix shell: set environment variables from variables I have in the script, change to a particular directory, run a script there, and so on.
I've tried printf-ing the relevant portion of the script and piping it into the ssh command, but that didn't work well. I've also read about the here-document approach ("ssh ... <<END"), which is great, but since I'm using functions I couldn't get it to work.
Do you have any thoughts?
Here's an excerpt of the code:
deployApp() {
    inputLine=$1;
    APP_SPECIFIC_DEPLOY_SCRIPT="$(echo $inputLine | cut -d ' ' -s -f1)";
    BRANCH="$(echo $inputLine | cut -d ' ' -s -f2)";
    JBOSS_HOME="$(echo $inputLine | cut -d ' ' -s -f3)";
    BASE_PORT="$(echo $inputLine | cut -d ' ' -s -f4)";
    JAVA_HOME_FOR_JBOSS="$(echo $inputLine | cut -d ' ' -s -f5)";
    JAVA_HEAP="$(echo $inputLine | cut -d ' ' -s -f6)";
    echo "DEPLOYING $APP_SPECIFIC_DEPLOY_SCRIPT"
    echo "FROM BRANCH $BRANCH"
    echo "IN JBOSS $JBOSS_HOME"
    echo "WITH BASE PORT $BASE_PORT"
    echo "USING $JAVA_HOME_FOR_JBOSS"
    if [[ -n "$JAVA_HEAP" ]]; then
        echo "WITH $JAVA_HEAP"
    fi
    echo
    echo "Exporting jboss to $JBOSS_HOME"
    ssh me@$SERVER <<END
cleanup() {
    rm -f $JBOSS_SERVER/log/*.log
    rm -Rf $JBOSS_SERVER/deploy/
    rm -Rf $JBOSS_SERVER/tmp/
    mkdir $JBOSS_SERVER/deploy
}
startJboss() {
    cd $JBOSS_SERVER/bin
    ./jbossctl.sh start
    return 0;
}
export JBOSS_HOME
export JBOSS_SERVER=$JBOSS_HOME/server/default
END
    return 0;
}
With that "HERE" approach, I'm getting this error: "syntax error: unexpected end of file"
Thanks a lot in advance!

Just put the functions in your here document, too:
var="Hello World"
ssh user#host <<END
x() {
print "x function with args=$*"
}
x "$var"
END
[EDIT] Some comments:
You say "export JBOSS_HOME" but you never define a value for the variable in the here document. You should use export JBOSS_HOME="$JBOSS_HOME". BASH will take all text between the two END, replace all variables, and send the result to SSH for processing.
That also means the other side will see rm -f /path/to/jboss/server/*.log; the assignment to JBOSS_SERVER in the last line of the here document has no effect (at least not to the code in cleanup()).
If you want to pass $ unmodified to the remote server, you have to escape it with \: rm -f \$JBOSS_SERVER/log/*.log
You never call cleanup()
There is a } missing after return 0 to finish the definition of deployApp()
There may be other problems as well. Run the script with bash -x to see what the shell actually executes. You can also add echo commands in the here document to see what the values of the variables are or you can add set -x before cleanup() to get the same output as with bash -x but from the remote side.
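Putting those comments together, the here document might end up looking something like this (an untested sketch that assumes cleanup should run before startJboss):
ssh me@$SERVER <<END
export JBOSS_HOME="$JBOSS_HOME"
export JBOSS_SERVER="$JBOSS_HOME/server/default"
cleanup() {
    # \$ delays expansion, so these paths are resolved by the remote shell
    rm -f "\$JBOSS_SERVER"/log/*.log
    rm -Rf "\$JBOSS_SERVER"/deploy/
    rm -Rf "\$JBOSS_SERVER"/tmp/
    mkdir "\$JBOSS_SERVER"/deploy
}
startJboss() {
    cd "\$JBOSS_SERVER"/bin || exit 1
    ./jbossctl.sh start
}
cleanup
startJboss
END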

I don't understand why you're using cut to split the arguments to your function. Just do
APP_SPECIFIC_DEPLOY_SCRIPT=$1
BRANCH=$2
JBOSS_HOME=$3
# etc.
If you don't quote your here-document delimiter, the contents are expanded before they're sent to the server. That may be what you want. If not, and you want all expansion to happen on the server side, quote the delimiter like this:
ssh me@$SERVER <<'END'
# etc.
END
If you want a mixture, don't quote the delimiter, but escape the things whose expansion you want delayed:
ssh me@$SERVER <<END
echo $EXPAND_ME_NOW \$EXPAND_ME_LATER
END
What are the export statements supposed to do? I can't see that they would have any effect at all.

Related

Iterating array in declared function of bash shell script

I've been working on a script to move some files from a local machine to a remote server. As part of that process I have a function that can either be called directly or wrapped with 'declare -fp' and sent along to an ssh command. The code I have so far looks like this:
export REMOTE_HOST=myserver
export TMP=eyerep-files
doTest()
{
    echo "Test moving files from $TMP with arg $1"
    declare -A files=(["abc"]="123" ["xyz"]="789")
    echo "Files: ${!files[@]}"
    for key in "${!files[@]}"
    do
        echo "$key => ${files[$key]}"
    done
}
moveTest()
{
    echo "attempting move with wrapped function"
    ssh -t "$REMOTE_HOST" "$(declare -fp doTest|envsubst); doTest ${1@Q}"
}
moveTest $2
If I run the script with something like
./myscript.sh test dev
I get the output
attempting move with wrapped function
Test moving files from eyerep-files with arg dev
Files: abc xyz
bash: line 7: => ${files[]}: bad substitution
It seems like the string expansion for the for loop is not working correctly. Is this expected behaviour? If so, is there an alternative way to loop through an array that would avoid this issue?
If you're confident that your remote account's default shell is bash, this might look like:
moveTest() {
    ssh -t "$REMOTE_HOST" "$(declare -f doTest; declare -p $(compgen -e)); doTest ${1@Q}"
}
If you aren't, it might instead be:
moveTest() {
    ssh -t "$REMOTE_HOST" 'exec bash -s' <<EOF
set -- ${@@Q}
$(declare -f doTest; declare -p $(compgen -e))
doTest "\$@"
EOF
}
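For reference, ${1@Q} and ${@@Q} are the bash 4.4+ "quote" parameter transformations: they expand to the value(s) quoted so they can be reused safely as shell input. A quick illustration with made-up values:
set -- "dev branch" 'literal $dollar'
echo "${1@Q}"    # -> 'dev branch'
echo "${@@Q}"    # -> 'dev branch' 'literal $dollar'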
I managed to find an answer here: https://unix.stackexchange.com/questions/294378/replacing-only-specific-variables-with-envsubst/294400
Since I'm exporting the global variables, I can get a list of them using compgen and use that list with envsubst to specify which variables I want to replace. My finished function ended up looking like:
moveTest()
{
    echo "attempting move with wrapped function"
    ssh -t "$REMOTE_HOST" "$(declare -fp doTest|envsubst "$(compgen -e | awk '$0="${"$0"}"') '${1}'"); doTest ${1@Q}"
}
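For anyone puzzled by the envsubst argument: compgen -e lists the names of all exported variables, and the awk expression wraps each name in ${...}. Passing that list to envsubst as its SHELL-FORMAT means only those variables get substituted, so things like $key and ${files[$key]} inside the function are left untouched. Roughly (exact output depends on your environment):
export REMOTE_HOST=myserver TMP=eyerep-files
compgen -e | awk '$0="${"$0"}"'
# prints one entry per exported variable, e.g.
#   ${REMOTE_HOST}
#   ${TMP}
#   ...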

Shell Script to Replace or Insert Lines of Text in File

I wrote a shell script to replace text in the Jenkinsfiles of a large number of repos. I had used this to add a parameter to a single line of text. Now, however, I need to insert a line of text before and after an existing command in the Jenkinsfiles. I don't have much shell experience and could use some help.
Here is the Jenkinsfile text before:
sh "chmod +x increment.sh"
def result = sh returnstdout:true, script: "./increment.sh '${Version}' '${ReleaseVersion}' '${GitRepoURL}' '${CutRelease}' '${Branch}' '${JiraID}'"
//echo "$result"
I need to add the following before the "def result" line:
sshagent(['gitssh']) {
and then add a closing curly bracket after the "def result" line:
}
I need the end result to be:
sh "chmod +x increment.sh"
sshagent(['gitssh']) {
def result = sh returnstdout:true, script: "./increment.sh '${Version}' '${ReleaseVersion}' '${GitRepoURL}' '${CutRelease}' '${Branch}' '${JiraID}'"
}
//echo "$result"
I actually don't care about keeping the commented out echo command if that makes it more difficult, but it is just to show what I have surrounding the "def result" line.
How can I accomplish this end result?
If it helps, I previously was adding new parameters at the end of the "def result" line with this code:
if [ -e Jenkinsfile ]
then
sed -i -e "s/\${Branch}/\${Branch}\' \'\${JiraID}/g" Jenkinsfile
fi
Note: I am on a Mac.
Code so far:
file=repos_remaining.txt
while IFS="," read -r repoURL repoName; do
    echo $repoURL
    cd ..
    echo $repoName
    cd $(echo $repoName | tr -d '\r')
    file=repos_remaining.txt
    if [ -e Jenkinsfile ]
    then
        # sed -i -e $"s/def result/sshagent([\'gitssh\']) {\
        # def result/g" Jenkinsfile
    fi
    # git add "Jenkinsfile"
    # git commit -m "Added JiraID parameter to Jenkinsfile"
    # git push origin master
done < "$file"
As with most cases where people want to automate the editing of a file, I suggest using ed:
ed -s Jenkinsfile <<'EOF'
/^def result/i
sshagent(['gitssh']) {
.
.+1a
}
.
w
EOF
The commands in the heredoc tell ed to move the cursor to the first line starting with def result, insert a line above it, append a line after it, and finally write the modified file back to disc.
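To apply it across all the repos, the same here document can sit inside the loop from the question; a rough sketch that keeps the existing cd/tr handling and leaves the git steps commented out:
file=repos_remaining.txt
while IFS="," read -r repoURL repoName; do
    cd ..
    cd "$(echo "$repoName" | tr -d '\r')" || continue
    if [ -e Jenkinsfile ]; then
        ed -s Jenkinsfile <<'EOF'
/^def result/i
sshagent(['gitssh']) {
.
.+1a
}
.
w
EOF
        # git add Jenkinsfile && git commit -m "Wrap def result in sshagent" && git push origin master
    fi
done < "$file"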

Bash: Elegant way to run a command and exit if a process exited with error

I have this logic in a Bash script:
Run something;
If it failed, print something and stop;
If not, run something else.
And all this runs in a ssh session.
This would probably be trivial using $? and if/else.
But for the sake of maintainability, I am looking for an elegant two-line solution.
This is what I have so far
ssh ... '
ls attributes/*'$CONF_FILE'.rb || ls -l attributes/ && exit 1;
'$EDITOR' attributes/*'$CONF_FILE'.rb '$PART_VER';'
However, this exits no matter what. So I tried:
ssh ... '
ls attributes/*'$CONF_FILE'.rb || (ls -l attributes/ && exit 1);
'$EDITOR' attributes/*'$CONF_FILE'.rb '$PART_VER';'
However, exit only exits the subshell. And exiting a script from within a subshell is not elegant at all.
Is there a simple two-line solution? Perhaps a different operator precedence?
Written for clarity, correctness, and maintainability -- not terseness:
# store your remote script as a string. Because of the quoted heredoc, no variables are
# evaluated at this time; $1, $2 and $3 are expanded only after the code is sent to the
# remote system.
script_text=$(cat <<'EOF'
CONF_FILE=$1; PART_VER=$2; EDITOR=$3
shopt -s nullglob # Return an empty list for any failed glob
set -- attributes/*"$CONF_FILE".rb # Replace our argument list with a glob result
if (( $# )); then               # Check length of that result...
  "$EDITOR" "$@" "$PART_VER"    # ...if it's nonzero, run the editor w/ that list
else
  ls attributes                 # otherwise, run ls and fail
  exit 1
fi
EOF
)
# generate a single string to pass to the remote shell which passes the script text
# ...and the arguments to place in $0, $1, etc while that script is running
printf -v ssh_cmd_str '%q ' \
bash -c "$script_text" '_' "$CONF_FILE" "$PART_VER" "$EDITOR"
# ...thereafter, that command can be run as follows:
ssh -tt ... "$ssh_cmd_str"
Don't use a subshell; use a command group.
ssh ... "
ls attributes/*'$CONF_FILE'.rb || { ls -l attributes/ && exit 1; };
'$EDITOR' attributes/*'$CONF_FILE'.rb '$PART_VER';"
(Note the change in quotes; this better ensures that the results of the local parameter expansions are properly quoted on the remote end, although there will still be problems if the parameter expansions themselves contain single quotes. A proper solution would run an explicit shell on the remote end, taking your local parameters as arguments, instead of using interpolation to build the script. The following is untested, but I think I quoted everything correctly.
ssh ... sh -c '
ls attributes/*"$1.rb" || { ls -l attributes/ && exit 1; };
"$EDITOR" attributes/*"\$1.rb" "$2";
' _ "$CONF_FILE" "$PART_VER"
)
For now, I resorted to duplicating the condition, like this:
ssh ... '
ls attributes/*'$CONF_FILE'.rb > /dev/null || ls -l --color=always attributes/
ls attributes/*'$CONF_FILE'.rb > /dev/null || exit 1;
'$EDITOR' attributes/*'$CONF_FILE'.rb '$PART_VER';'
(I used &>- as shorthand for redirecting stdout and stderr to /dev/null, but that is probably not correct.)
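For the record, the bash shorthand for sending both streams to /dev/null is &>/dev/null, and the portable spelling is >/dev/null 2>&1:
ls attributes/*"$CONF_FILE".rb > /dev/null 2>&1   # portable
ls attributes/*"$CONF_FILE".rb &> /dev/null       # bash-only shorthand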

use of ssh variable in the shell script

I want to use, in my shell script, variables that were set inside an ssh session.
Suppose I have a variable a whose value I obtained inside the ssh session; now I want to use that variable outside the ssh session, in the shell itself. How can I do this?
ssh my_pc2 <<EOF
<.. do some operations ..>
a=$(ls -lrt | wc -l)
echo \$a
EOF
echo $a
In the above example, the first echo (inside the ssh session) prints 10, but the second echo $a prints nothing.
I would refine the last answer by defining some special syntax for passing the required settings back, e.g. "#SET var=value"
We could put the commands (that we want to run within the ssh session) in a cmdFile file like this:
a=`id`
b=`pwd`
echo "#SET a='$a'"
echo "#SET b='$b'"
And the main script would look like this:
#!/bin/bash
# SSH, run the remote commands, and filter anything they passed back to us
ssh user@host <cmdFile | grep "^#SET " | sed 's/#SET //' >vars.$$
# Source the variable settings that were passed back
. vars.$$
rm -f vars.$$
# Now we have the variables set
echo "a = $a"
echo "b = $b"
If you're doing this for lots of variables, you can add a function to cmdFile, to simplify/encapsulate your special syntax for passing data back:
passvar()
{
    var=$1
    val=$2
    val=${val:-${!var}}
    echo "#SET ${var}='${val}'"
}
a=`id`
passvar a
b=`pwd`
passvar b
You might need to play with quotes when the values include whitespace.
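If whitespace does turn out to be a problem, one option (a sketch, not part of the original answer) is to let printf %q emit the assignment already shell-quoted, so the receiving script can source it unchanged:
passvar()
{
    var=$1
    val=${2:-${!var}}
    # %q escapes spaces, quotes, parentheses, etc., so the sourced line stays intact
    printf '#SET %s=%q\n' "$var" "$val"
}
b="a value with spaces"
passvar b    # emits: #SET b=a\ value\ with\ spaces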
A script like this could be used to store all the output from SSH into a variable:
#!/bin/bash
VAR=$(ssh user@host << _EOF
id
_EOF
)
echo "VAR=$VAR"
it produces the output:
VAR=uid=1000(user) gid=1000(user) groups=1000(user),4(adm),10(wheel)
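As a side note, when only a single value is needed, it can also be captured without a here document at all; the quoted command runs on the remote machine and only its output comes back:
# runs remotely on my_pc2; a is set locally from the command's output
a=$(ssh my_pc2 'ls -lrt | wc -l')
echo "$a"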

bash script not running properly

When I run this by itself on the command line it seems to work fine, but when I have another script execute it, it doesn't work. Any ideas? I'm guessing it has to do with quotes, but I'm not sure.
#!/bin/sh
#Required csvquote from https://github.com/dbro/csvquote
#TODO: Clean CSV File using CSVFix
#Version 3
echo "File Name: $1"
function quit {
    echo "Quitting Script"
    exit 1
}
function fileExists {
    if [ ! -f "$1" ]
    then
        echo "File $1 does not exists"
        quit
    fi
}
function getInfo {
    #Returns website url like: "http://www.website.com/info"
    #Reads last line of a csv file, and gets the 2nd item.
    RETURN=$(tail -n 1 $1 | csvquote | cut -d ',' -f 2 | csvquote -u)
    echo $RETURN
}
function work {
    CURLURL="http://127.0.0.1:9200/cj/_query"
    URL=$(getInfo)
    echo "URL: $URL"
    CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
    #URL shows up as blank...???
    echo "Curl Data: $CURLDATA"
    RESPONSE=$(curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn)
    echo $RESPONSE
    echo "Sleeping Allowing Time To Delete"
    sleep 5s
}
fileExists $1
work $1
I can't see why a simpler version wouldn't work. Functions are useful, but if what you posted is the entire script then, in my opinion, there are too many of them here and they overcomplicate things.
Your script only works by luck, because of a broken pattern: positional parameters like $1 are arguments to shell functions as well as to the main script; think of them as local variables of the function. So when you call $(getInfo) with no argument, the function actually runs tail -n 1 with no file, which falls back to reading stdin instead of the file you meant to pass in $1. You can see this for yourself by putting echo getInfo_arg_1="$1" >&2 inside the function.
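A tiny illustration of that point, using a hypothetical data.csv:
getInfo() {
    echo "getInfo_arg_1='$1'"   # $1 here is the function's first argument
}
set -- data.csv    # the script-level $1
getInfo            # prints getInfo_arg_1=''          (nothing was passed to the function)
getInfo "$1"       # prints getInfo_arg_1='data.csv'  (argument forwarded explicitly)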
Note also that you are not quoting $1 anywhere, so the script is not safe for filenames containing whitespace; this is more likely to be a problem if you have to deal with files sent to you from a Windows computer.
In the absence of other information, the following 'should' work:
#!/bin/bash
test -z "$1" && { echo "Please specify a file." ; exit 1; }
test -f "$1" || { echo "Cant see file '$1'." ; exit 1; }
FILE="$1"
function getInfo() {
    #Returns website url like: "http://www.website.com/info"
    #Reads last line of a csv file, and gets the 2nd item.
    tail -n 1 "$1" | csvquote | cut -d ',' -f 2 | csvquote -u
}
CURLURL="http://127.0.0.1:9200/cj/_query"
URL=$(getInfo "$FILE")
echo "URL: $URL"
CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn
echo "Sleeping Allowing Time To Delete"
sleep 5s
If it still fails you really need to post your error messages.
One other thing: especially if you are calling this from another script, chmod +x the script so it can be run without invoking bash explicitly. If you want to turn on debugging, put set -x somewhere near the start.
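For example, with a hypothetical script name:
chmod +x delete_from_index.sh                     # once; then the caller can run it directly
./delete_from_index.sh "some file.csv"            # quoting protects names containing whitespace
bash -x ./delete_from_index.sh "some file.csv"    # or trace every command while debugging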
