Curl command in a bash script using variables

I have a curl command that looks like this:
curl -X PUT -H "myheader:coca-cola" -d '{ "name":"harrypotter" }' http://mygoogle.com/service/books/123
Running this command as-is in a terminal returns the expected result.
I am trying to incorporate this curl command into my bash script as follows:
#!/bin/bash
MYURL=http://mygoogle.com/service/books/123
# Generate body for curl request
generate_put_data()
{
cat <<EOF
{
"name":"harrypotter"
}
EOF
}
put_data=$(echo "$(generate_put_data)")
put_data_with_single_quotes="'$put_data'"
# Generate headers for curl request
header=myheader:coca-cola
header_with_double_quotes="\"$header\""
# The following function takes two inputs - a simple string variable (with no spaces or quotes) and the curl command string
function run_cmd() {
echo $1
echo $2
#Run the curl command
"$2"
#Check return code of the curl command
if [ "$?" -ne 0 ]; then
#do something with simple string variable
echo "$1"
echo "Job failed"
exit 1
else
#do something with simple string variable
echo "$1"
echo "Job Succeeded"
fi
}
# Run the bash function - run_cmd
run_cmd "mysimplestring" "curl -X PUT -H $header_with_double_quotes -d $put_data_with_single_quotes $MYURL"
However, when I try to run the above bash script, it fails at the point where I call the run_cmd() function with the two inputs. I get the following error:
curl -X PUT -H "myheader:coca-cola" -d '{
"name":"harrypotter"
}' http://mygoogle.com/service/books/123: No such file or directory
Job failed
This error occurs on the line where "$2" is executed inside the run_cmd() function body.
Could someone help me understand where I am going wrong? Thanks!

"$2"
This takes the second argument and tries to run it without doing any word splitting: the entire string is treated as the name of a single command.
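You can reproduce the failure with a much simpler command (a hypothetical illustration, not part of your script):
cmd="ls -l /tmp"
"$cmd"    # bash: ls -l /tmp: No such file or directory
Bash looks for a program literally named ls -l /tmp, which is exactly the "No such file or directory" error you are seeing with the whole curl command line.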
You're going to run into trouble passing in the curl command as one string. You'll do better if you pass it without quotes, just as if you typed it on the command line. You'll want to quote each of the variables but not quote the command as a whole.
run_cmd "mysimplestring" curl -X PUT -H "$header" -d "$put_data" "$MYURL"
Notice that you don't need the "with_quotes" variables any more. You don't have to do anything like that. The original plain values will work.
Now you can access the command using array syntax:
function run_cmd() {
  local name=$1; shift
  local cmd=("$@")
  #Run the curl command
  "${cmd[@]}"
}
By the way, this is a useless use of echo:
put_data=$(echo "$(generate_put_data)")
Make that:
put_data=$(generate_put_data)
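Putting the pieces together, the whole script might look roughly like this (a sketch under the same assumptions as the original; the URL, header, and body are unchanged):
#!/bin/bash
MYURL=http://mygoogle.com/service/books/123
header=myheader:coca-cola
# Generate the request body
generate_put_data()
{
cat <<EOF
{
"name":"harrypotter"
}
EOF
}
put_data=$(generate_put_data)
# $1 is the simple string; everything after it is the command and its arguments
run_cmd() {
  local name=$1; shift
  local cmd=("$@")
  if "${cmd[@]}"; then
    echo "$name"
    echo "Job Succeeded"
  else
    echo "$name"
    echo "Job failed"
    exit 1
  fi
}
run_cmd "mysimplestring" curl -X PUT -H "$header" -d "$put_data" "$MYURL"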

Related

I am getting a parse error with eval with }

Here is my simple Bash script:
curljson() {
eval "curl $* | python -m json.tool"
}
It simply outputs JSON in a readable way.
But when I do:
curljson -X PUT -d '{"settings": {"number_of_shards": 3, "number_of_replicas": 1}}' http://192.168.1.111:9200/blogs
I get this error:
(eval):1: parse error near `}'
But when I run it straight in curl it works, so it seems to be my script.
So how can I make this Bash script accept }?
You don't need eval here at all.
curljson() {
  curl "$@" | python -m json.tool
}
All curljson really does is pass all of its arguments as-is to curl and pipe the output to a Python script. "$@" expands to the same sequence of words that the function received as arguments.
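So the original call should work unchanged (same endpoint as in the question):
curljson -X PUT -d '{"settings": {"number_of_shards": 3, "number_of_replicas": 1}}' http://192.168.1.111:9200/blogs
The single-quoted JSON arrives in the function as one argument, braces and all, and is handed straight to curl; with eval it was re-parsed by the shell a second time, which is where the parse error near } came from.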

Assign variable and redirect in bash

I'm doing ad-hoc profiling on a web service that seems to maintain some state and get slower and slower until eventually things start timing out. I have a simple script that will expose this behavior:
while true
do
RESPONSE_CODE=$( curl --config curl.config )
if [ "$RESPONSE_CODE" -eq "200" ]; then
echo SUCCESS
else
echo FAILURE
fi
done
Along with some headers, cookies, post data, the URL, etc. In particular, curl.config has the lines:
silent
output = /dev/null
write-out = "%{http_code}"
So the only output from curl should be the HTTP status code.
This works fine. What I'd like to do is something like this:
{ time -p RESPONSE_CODE=$(curl --config curl.config) ; } 2>&1 | awk '/real/{print $2;}'
to get a running printout of how long these queries actually take, while still saving curl's output for use in my test. But that doesn't work.
How can I capture the http status from curl AND grab the output of time so I can process both?
As originally written:
RESPONSE_CODE = $( curl --config curl.config )
you had spaces around the assignment, which simply does not work in the shell (it tries to execute a command named RESPONSE_CODE with = as its first argument, and so on). You need:
RESPONSE_CODE=$( curl --config curl.config )
The time built-in is hard to redirect. Since you need both HTTP status and real time, you will have to do something to capture both values. One possibility is:
set -- $( (time -p -- curl --config curl.config ) 2>&1 |
awk '/real/{print $2} /^[0-9]+$/{print}')
which will set $1 and $2. Another is array assignment:
data=( $( (time -p -- curl --config curl.config ) 2>&1 |
awk '/real/{print $2} /^[0-9]+$/{print}') )
The HTTP response code should appear before the time.
(Tested using sh -c 'echo 200; sleep 1' in lieu of curl --config curl.config.)
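Dropped into the original loop, the first form might look like this (a sketch, assuming the same curl.config and that curl prints nothing but the status code):
while true
do
  set -- $( (time -p -- curl --config curl.config) 2>&1 |
            awk '/real/{print $2} /^[0-9]+$/{print}')
  RESPONSE_CODE=$1   # HTTP status code, printed first
  ELAPSED=$2         # "real" time reported by time -p
  if [ "$RESPONSE_CODE" -eq 200 ]; then
    echo "SUCCESS $ELAPSED"
  else
    echo "FAILURE $ELAPSED"
  fi
done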
This should work if Curl's response is only a single line:
#!/bin/bash
RESPONSE_CODE=''
TIME=''
while read -r TYPE DATA; do
case "$TYPE" in
curl)
RESPONSE_CODE=$DATA
;;
real)
TIME=$DATA
;;
esac
done < <(exec 2>&1; time -p R=$(curl --config curl.config); echo "curl $R")
Or use an associative array:
#!/bin/bash
declare -A RESPONSE
while read -r TYPE DATA; do
RESPONSE[$TYPE]=$DATA
done < <(exec 2>&1; time -p R=$(curl ...); echo "code $R")
echo "${RESPONSE[code] ${RESPONSE[real]}"

bash script not running properly

When I run this by itself on the command line it seems to work fine, but when I have another script execute it, it doesn't work. Any ideas? I'm guessing it has to do with quotes, but I'm not sure.
#!/bin/sh
#Required csvquote from https://github.com/dbro/csvquote
#TODO: Clean CSV File using CSVFix
#Version 3
echo "File Name: $1"
function quit {
echo "Quitting Script"
exit 1
}
function fileExists {
if [ ! -f "$1" ]
then
echo "File $1 does not exists"
quit
fi
}
function getInfo {
#Returns website url like: "http://www.website.com/info"
#Reads last line of a csv file, and gets the 2nd item.
RETURN=$(tail -n 1 $1 | csvquote | cut -d ',' -f 2 | csvquote -u)
echo $RETURN
}
function work {
CURLURL="http://127.0.0.1:9200/cj/_query"
URL=$(getInfo)
echo "URL: $URL"
CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
#URL shows up as blank...???
echo "Curl Data: $CURLDATA"
RESPONSE=$(curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn)
echo $RESPONSE
echo "Sleeping Allowing Time To Delete"
sleep 5s
}
fileExists $1
work $1
I can't see why a simpler version won't work. Functions are useful, but if what you posted is the entirety of your script, I think there are too many of them, which overcomplicates things (in my opinion).
Your script is working only through a broken, lucky pattern: the positional parameters ($1, $2, ...) inside a shell function refer to that function's own arguments, not the main script's. Think of them as local variables of the function. So when you call $(getInfo) with no argument, $1 inside getInfo is empty, and it actually runs tail -n 1 with no file, which falls back to reading standard input instead of your file. You can see this for yourself by putting echo getInfo_arg_1="$1" >&2 inside the function.
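For example (a standalone illustration, not part of your script):
getInfo_demo() { echo "inside the function, \$1 is '${1-}'"; }
set -- data.csv      # pretend the script was called with one argument
getInfo_demo         # prints an empty value: the function itself got no arguments
getInfo_demo "$1"    # prints 'data.csv': the script's $1 had to be passed explicitly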
Note also that you are not quoting $1 anywhere, so this script is not safe for file names containing whitespace; that is only likely to be a problem if you have to deal with files sent from a Windows machine, though.
In the absence of other information, the following 'should' work:
#!/bin/bash
test -z "$1" && { echo "Please specify a file." ; exit 1; }
test -f "$1" || { echo "Cant see file '$1'." ; exit 1; }
FILE="$1"
function getInfo() {
#Returns website url like: "http://www.website.com/info"
#Reads last line of a csv file, and gets the 2nd item.
tail -n 1 "$1" | csvquote | cut -d ',' -f 2 | csvquote -u
}
CURLURL="http://127.0.0.1:9200/cj/_query"
URL=$(getInfo "$FILE")
echo "URL: $URL"
CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn
echo "Sleeping Allowing Time To Delete"
sleep 5s
If it still fails you really need to post your error messages.
One other thing: especially if you are calling this from another script, chmod +x the script so you can run it without having to invoke it through bash explicitly. If you want to turn on debugging, put set -x somewhere near the start.

Building command strings using variables with various quote levels and spaces

I have a script that runs curl. I want to be able to optionally add a -H parameter if a string isn't empty. The tricky part is the levels of quoting and the spaces.
caption="Test Caption"
if [ "${caption}" != "" ]; then
CAPT=-H "X-Caption: ${caption}"
fi
curl -A "$UA" -H "Content-MD5: $MD5" -H "X-SessionID: $SID" -H "X-Version: 1" $CAPT http://upload.example.com/$FN
The idea is that the CAPT variable is either empty, or contains the desired -H header in the same form as the others, e.g., -H "X-Caption: Test Caption"
The problem is that, when it runs, the shell performs the assignment CAPT=-H and then tries to execute the rest of the line as a command:
$bash -x -v test.sh
+ '[' 'Test caption' '!=' '' ']'
+ CAPT=-H
+ 'X-Caption: Test caption'
./test.sh: line 273: X-Caption: Test caption: command not found
I've tried resetting IFS before the code, but it didn't make a difference.
The key to making this work is to use an array.
caption="Test Caption"
if [[ $caption ]]; then
CAPT=(-H "X-Caption: $caption")
fi
curl -A "$UA" -H "Content-MD5: $MD5" -H "X-SessionID: $SID" -H "X-Version: 1" "${CAPT[#]}" "http://upload.example.com/$FN"
If you only need to know whether or not the caption is there, you can expand the header in place only when it should be included.
caption="Test Caption"
NOCAPT="yeah, sort of, that would be nice"
if [ "${caption}" != "" ]; then
unset NOCAPT
fi
curl ${NOCAPT--H "X-Caption: ${caption}"} -A "$UA" ...
To recap, the syntax ${var-value} produces value if var is unset.
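A quick illustration of the difference (hypothetical values):
unset NOCAPT
echo "${NOCAPT-fallback}"    # prints: fallback   (NOCAPT is unset)
NOCAPT=
echo "${NOCAPT-fallback}"    # prints an empty line (NOCAPT is set, just empty)
NOCAPT=something
echo "${NOCAPT-fallback}"    # prints: something
Use ${var:-value} instead if you also want the fallback when the variable is set but empty.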
I finally did get it to work. Part of the problem is specific to curl: when using the -H option to set custom headers, it seems to work best when everything after the -H (that is, both the custom header name and its value) is protected by single quotes. Then I needed to pass the constructed string through eval to get it to work.
To make this easier to read, I store a single quote in a variable named TICK.
Example:
TICK=\'
#
HDRS=""
HDRS+=" -H ${TICK}Content-MD5: ${MD5}${TICK}"
HDRS+=" -H ${TICK}X-SessionID: ${SID}${TICK}"
HDRS+=" -H ${TICK}X-Version: 1.1.1${TICK}"
HDRS+=" -H ${TICK}X-ResponseType: REST${TICK}"
HDRS+=" -H ${TICK}X-ID: ${ID}${TICK}"
if [ "${IPTC[1]}" != "" ]; then
HDRS+=" -H ${TICK}X-Caption: ${IPTC[1]}${TICK}"
fi
if [ "${IPTC[2]}" != "" ]; then
HDRS+=" -H ${TICK}X-Keywords: ${IPTC[2]}${TICK}"
fi
#
# Set curl flags
#
CURLFLAGS=""
CURLFLAGS+=" --cookie $COOKIES --cookie-jar $COOKIES"
CURLFLAGS+=" -A \"$UA\" -T ${TICK}${the_file}${TICK} "
eval curl $CURLFLAGS $HDRS -o $OUT http://upload.example.com/$FN
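For comparison, the same header list can be collected in an array and passed without eval, along the lines of the array answer above (a sketch; the variables are assumed to be set as in the snippet):
HDRS=()
HDRS+=(-H "Content-MD5: ${MD5}")
HDRS+=(-H "X-SessionID: ${SID}")
HDRS+=(-H "X-Version: 1.1.1")
HDRS+=(-H "X-ResponseType: REST")
HDRS+=(-H "X-ID: ${ID}")
if [ "${IPTC[1]}" != "" ]; then
  HDRS+=(-H "X-Caption: ${IPTC[1]}")
fi
if [ "${IPTC[2]}" != "" ]; then
  HDRS+=(-H "X-Keywords: ${IPTC[2]}")
fi
curl --cookie "$COOKIES" --cookie-jar "$COOKIES" -A "$UA" -T "$the_file" \
  "${HDRS[@]}" -o "$OUT" "http://upload.example.com/$FN"
No tick variables and no eval are needed; each header stays a single argument no matter what spaces it contains.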

SSH commands via bash script

I've made several failed attempts to do the following:
Basically, what I need is to execute a sequence of commands on a remote Unix shell, such as setting environment variables (from variables I have in the script), moving to a particular directory, running a script there, and so on.
I've tried using printf with that portion of the script and piping it into the ssh command, but it didn't work well. I've also read about the ssh ... << END here-document marker, which is great, but since I'm using functions it doesn't work well either.
Do you have any thoughts?
Here's an excerpt of the code:
deployApp() {
inputLine=$1;
APP_SPECIFIC_DEPLOY_SCRIPT="$(echo $inputLine | cut -d ' ' -s -f1)";
BRANCH="$(echo $inputLine | cut -d ' ' -s -f2)";
JBOSS_HOME="$(echo $inputLine | cut -d ' ' -s -f3)";
BASE_PORT="$(echo $inputLine | cut -d ' ' -s -f4)";
JAVA_HOME_FOR_JBOSS="$(echo $inputLine | cut -d ' ' -s -f5)";
JAVA_HEAP="$(echo $inputLine | cut -d ' ' -s -f6)";
echo "DEPLOYING $APP_SPECIFIC_DEPLOY_SCRIPT"
echo "FROM BRANCH $BRANCH"
echo "IN JBOSS $JBOSS_HOME"
echo "WITH BASE PORT $BASE_PORT"
echo "USING $JAVA_HOME_FOR_JBOSS"
if [[ -n "$JAVA_HEAP" ]]; then
echo "WITH $JAVA_HEAP"
fi
echo
echo "Exporting jboss to $JBOSS_HOME"
ssh me@$SERVER <<END
cleanup() {
rm -f $JBOSS_SERVER/log/*.log
rm -Rf $JBOSS_SERVER/deploy/
rm -Rf $JBOSS_SERVER/tmp/
mkdir $JBOSS_SERVER/deploy
}
startJboss() {
cd $JBOSS_SERVER/bin
./jbossctl.sh start
return 0;
}
export JBOSS_HOME
export JBOSS_SERVER=$JBOSS_HOME/server/default
END
return 0;
}
With that "HERE" approach, I'm getting this error: "syntax error: unexpected end of file"
Thanks a lot in advance!
Just put the functions in your here document, too:
var="Hello World"
ssh user@host <<END
x() {
print "x function with args=$*"
}
x "$var"
END
[EDIT] Some comments:
You say export JBOSS_HOME but you never assign a value to the variable inside the here document. You should use export JBOSS_HOME="$JBOSS_HOME". Bash will take all the text between <<END and the closing END, substitute all the variables, and send the result to ssh for processing.
That also means the other side will see rm -f /path/to/jboss/server/*.log; the assignment to JBOSS_SERVER in the last line of the here document has no effect (at least not on the code in cleanup()).
If you want to pass $ unmodified to the remote server, you have to escape it with \: rm -f \$JBOSS_SERVER/log/*.log
You never call cleanup()
There is a } missing after return 0 to finish the definition of deployApp()
There may be other problems as well. Run the script with bash -x to see what the shell actually executes. You can also add echo commands in the here document to see what the values of the variables are or you can add set -x before cleanup() to get the same output as with bash -x but from the remote side.
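Putting those comments together, the here-document section might look roughly like this (a sketch; it escapes \$JBOSS_SERVER so it is expanded on the remote side, and calls the functions at the end):
ssh me@$SERVER <<END
export JBOSS_HOME="$JBOSS_HOME"
export JBOSS_SERVER="\$JBOSS_HOME/server/default"
cleanup() {
  rm -f "\$JBOSS_SERVER"/log/*.log
  rm -Rf "\$JBOSS_SERVER/deploy/"
  rm -Rf "\$JBOSS_SERVER/tmp/"
  mkdir "\$JBOSS_SERVER/deploy"
}
startJboss() {
  cd "\$JBOSS_SERVER/bin" && ./jbossctl.sh start
}
cleanup
startJboss
END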
I don't understand why you're using cut to split the arguments to your function. Just do
APP_SPECIFIC_DEPLOY_SCRIPT=$1
BRANCH=$2
JBOSS_HOME=$3
# etc.
If you don't quote your here document delimiter, the contents are expanded before they're sent to the server. That may be what you want. If it isn't, and you want all expansion to be done on the server side, quote the delimiter like this:
ssh me@$SERVER <<'END'
# etc.
END
If you want a mixture, don't quote the delimiter, but do escape the things whose expansion you want delayed:
ssh me@$SERVER <<END
echo $EXPAND_ME_NOW \$EXPAND_ME_LATER
END
What are the export statements supposed to do? I can't see that they would have any effect at all.
