Building command strings using variables with various quote levels and spaces - bash

I have a script that runs curl. I want to be able to optionally add a -H parameter, if a string isn't empty. What's complex is the levels of quoting and spaces.
caption="Test Caption"
if [ "${caption}" != "" ]; then
CAPT=-H "X-Caption: ${caption}"
fi
curl -A "$UA" -H "Content-MD5: $MD5" -H "X-SessionID: $SID" -H "X-Version: 1" $CAPT http://upload.example.com/$FN
The idea is that the CAPT variable is either empty, or contains the desired -H header in the same form as the others, e.g., -H "X-Caption: Test Caption"
The problem is when run, it interprets the assignment as a command to be executed:
$ bash -x -v test.sh
+ '[' 'Test caption' '!=' '' ']'
+ CAPT=-H
+ 'X-Caption: Test caption'
./test.sh: line 273: X-Caption: Test caption: command not found
I've tried resetting IFS before the code, but it didn't make a difference.

The key to making this work is to use an array.
caption="Test Caption"
if [[ $caption ]]; then
CAPT=(-H "X-Caption: $caption")
fi
curl -A "$UA" -H "Content-MD5: $MD5" -H "X-SessionID: $SID" -H "X-Version: 1" "${CAPT[@]}" "http://upload.example.com/$FN"
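A minimal sketch of why the array works, with a hypothetical helper (`list_args`) standing in for curl so each resulting word is visible: an empty array quoted as "${CAPT[@]}" expands to zero words, while a populated one keeps the header value as a single word.

```shell
# list_args is illustrative only: it wraps each argument it receives in <>.
list_args() { printf '<%s>' "$@"; }

caption="Test Caption"
CAPT=()
if [[ $caption ]]; then
  CAPT=(-H "X-Caption: $caption")
fi
list_args curl "${CAPT[@]}" url; echo   # <curl><-H><X-Caption: Test Caption><url>

CAPT=()                                 # empty array: "${CAPT[@]}" vanishes cleanly
list_args curl "${CAPT[@]}" url; echo   # <curl><url>
```

Note there is no stray empty argument in the second call; that is what a quoted plain string `"$CAPT"` would have produced.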

If you only need to know whether or not the caption is there, you can interpolate it when it needs to be there.
caption="Test Caption"
NOCAPT="yeah, sort of, that would be nice"
if [ "${caption}" != "" ]; then
unset NOCAPT
fi
curl ${NOCAPT--H "X-Caption: ${caption}"} -A "$UA" ...
To recap, the syntax ${var-value} produces value if var is unset.
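A quick sketch of the distinction, since it matters for this trick: `-` fires only when the variable is unset, while `:-` also fires when it is set but empty.

```shell
unset var
echo "${var-fallback}"    # prints "fallback": var is unset
var=""
echo "${var-fallback}"    # prints an empty line: var is set, even though empty
echo "${var:-fallback}"   # prints "fallback": ":-" also covers the empty case
```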

I finally did get it to work. Part of the problem is specific to curl: when using the -H option to set custom headers, it seems to work best when everything after the -H (that is, both the custom header name and value) is protected by single quotes. Then I needed to pass the constructed string through eval to get it to work.
To make this easier to read, I store a single quote in a variable named TICK.
Example:
TICK=\'
#
HDRS=""
HDRS+=" -H ${TICK}Content-MD5: ${MD5}${TICK}"
HDRS+=" -H ${TICK}X-SessionID: ${SID}${TICK}"
HDRS+=" -H ${TICK}X-Version: 1.1.1${TICK}"
HDRS+=" -H ${TICK}X-ResponseType: REST${TICK}"
HDRS+=" -H ${TICK}X-ID: ${ID}${TICK}"
if [ "${IPTC[1]}" != "" ]; then
HDRS+=" -H ${TICK}X-Caption: ${IPTC[1]}${TICK}"
fi
if [ "${IPTC[2]}" != "" ]; then
HDRS+=" -H ${TICK}X-Keywords: ${IPTC[2]}${TICK}"
fi
#
# Set curl flags
#
CURLFLAGS=""
CURLFLAGS+=" --cookie $COOKIES --cookie-jar $COOKIES"
CURLFLAGS+=" -A \"$UA\" -T ${TICK}${the_file}${TICK} "
eval curl $CURLFLAGS $HDRS -o $OUT http://upload.example.com/$FN
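A small sketch of why the eval is needed here, using printf in place of curl: without eval the embedded single quotes are literal characters and the header value splits on spaces; eval makes the shell re-parse the string, so the quotes do their job and the header stays one word.

```shell
TICK=\'
HDRS=""
HDRS+=" -H ${TICK}X-Caption: Test Caption${TICK}"

# Without eval: the quotes are literal and the value splits into words.
printf '<%s>' $HDRS; echo             # <-H><'X-Caption:><Test><Caption'>

# With eval: the string is re-parsed, so the header is a single argument.
eval "printf '<%s>' $HDRS"; echo      # <-H><X-Caption: Test Caption>
```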

Related

Read json values from curl GET response

How can I get a specific field from the json response?
#!/bin/bash -
status=`curl -sk -H "api-token: $TOKEN" -H "Content-Type: application/json" https://path_to/values`
The response is
{
"cancelled": false,
"percentage": 0.5,
"state": "running"
}
I want to poll the status until the response percentage is 100 and the cancelled field is true. Can this be done without another tool like jq?
EDIT
I'm trying to figure out if I can install jq on the system. Is my approach correct using jq?
while
update_status=`curl -sk -H "api-token: $TOKEN" -H "Content-Type: application/json" https://path_to/values`
cancelled=$(jq -r '.cancelled' <<< "$update_status")
percentage_complete=$(jq -r '.percentage_complete' <<< "$update_status")
state=$(jq -r '.state' <<< "$update_status")
[[ $cancelled -eq 1 || $state == 'running' ]]
do true; done
"cancelled" is a boolean and "state" is a string with the values "running" or "not_running".
How can I add a log message which shows whether the update failed or not? I'm not quite sure about the while loop...
echo "[INFO] Update done" ## depending on the failed var?
Using jq and reading the results into an array:
readarray -t dat <<< "$(curl -sk -H "api-token: $TOKEN" -H "Content-Type: application/json" https://path_to/values | jq -r '.cancelled,.percentage,.state')"
The array can then be used in an if statement:
if [[ "${dat[0]}" == "true" && "${dat[1]}" == "100" ]]
then
echo "There are no issues"
else
echo "There are issues"
fi
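To see the readarray step in isolation, here is a sketch with printf standing in for the `curl ... | jq -r '.cancelled,.percentage,.state'` pipeline above (the three lines mirror what jq -r would emit for the sample response):

```shell
# readarray -t collects one line per array element, without trailing newlines.
readarray -t dat <<< "$(printf '%s\n' true 100 not_running)"

if [[ "${dat[0]}" == "true" && "${dat[1]}" == "100" ]]; then
  echo "There are no issues"
fi
```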
If jq is really not an option and the JSON returned is as posted, you can use awk and return an error code:
if (curl -sk -H "api-token: $TOKEN" -H "Content-Type: application/json" https://path_to/values | awk '/(cancelled)|(percentage)|(state)/ { gsub("[\",:]","",$0);gsub(" ","",$1);map[$1]=$2 } END { if ( map["cancelled"]=="false" && map["percentage"] == 100 ) { exit 0 } else { exit 1 } }');
then
echo "There are no issues"
else
echo "There are issues"
fi
Pipe the output of the curl command into awk. For lines containing "cancelled", "percentage" or "state", strip any commas, double quotes and colons from the line, remove any spaces from the first field with gsub, then add the pair to an array called map, using the first field as the index and the second field as the value. At the end, check the entries of map and exit with 0 if all are as expected; otherwise, exit with 1.
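The awk step can be checked in isolation by running it against a canned response instead of a live curl call; this sketch uses the sample from the question with percentage set to 100:

```shell
response='{
"cancelled": false,
"percentage": 100,
"state": "not_running"
}'

if printf '%s\n' "$response" |
   awk '/(cancelled)|(percentage)|(state)/ { gsub("[\",:]","",$0); gsub(" ","",$1); map[$1]=$2 }
        END { if (map["cancelled"]=="false" && map["percentage"]==100) exit 0; exit 1 }'
then
  echo "There are no issues"
else
  echo "There are issues"
fi
```

Assigning to $0 inside gsub makes awk re-split the line, so $1 and $2 become the bare key and value.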

Bash loop a curl request, output to file and stop until empty response

So I have the following bash file and right now it's looping a curl request based on the for loop. However, I am trying to find out how to continue looping until the response is empty.
Unfortunately the API that I am calling is based on pages with a maximum of 500 results per page. I am trying to pull the data since 2017 so it's a lot of data.
I want to continue counting up the page number until the response is empty.
#!/bin/bash
# Basic while loop
counter=1
for ((i=1;i<=2;i++));
do
curl -o gettext.txt --request GET \
--url "https://api.io/v1/candidates?page=${counter}&per_page=500" \
--header 'Authorization: Basic aklsjdl;fakj;l;kasdflkaj'
((counter++))
done
echo $counter
echo All done
Anyone have an idea?
As stated in the author's comment on their own post, the returned data is in JSON format. The author didn't ask how to append two JSON files, but it is a necessary step to accomplish the job. To append two JSONs, json1 and json2, it might be enough to drop json1's final } and json2's leading { and put a , between them; here I use jq to join the two as a more generic approach.
In the examples shown below, the nextjsonchunk file is the JSON fetched on each request. If it has contents, it is appended to mainjsonfile with jq. If it seems to be empty (inferred from its size) the loop breaks, the result is moved to the current folder and cleanup is done.
Using curl:
#!/usr/bin/env bash
tempfolder=/dev/shm # in-memory tmpfs partition, available on Ubuntu
emptyjsonize=10 # the minimum json file length, used as an "empty" threshold
echo '{}' > $tempfolder/mainjsonfile # seed file so the first jq merge has two inputs
for ((counter=1; 1; counter++))
do
curl "https://api.io/v1/candidates?page=${counter}&per_page=500" \
--header "Authorization: Basic aklsjdl;fakj;l;kasdflkaj" \
--output $tempfolder/nextjsonchunk
if [ $(wc -c <$tempfolder/nextjsonchunk) -le $emptyjsonize ]; then break; fi
jq -s '.[0] * .[1]' $tempfolder/mainjsonfile $tempfolder/nextjsonchunk > $tempfolder/merged
mv $tempfolder/merged $tempfolder/mainjsonfile # never redirect jq's output onto one of its input files
done
rm $tempfolder/nextjsonchunk # cleaning up
mv $tempfolder/mainjsonfile ./jsonresultfile # end result
Alternativelly, using wget:
#!/usr/bin/env bash
tempfolder=/dev/shm # in-memory tmpfs partition, available on Ubuntu
emptyjsonize=10 # the minimum json file length, used as an "empty" threshold
echo '{}' > $tempfolder/mainjsonfile # seed file so the first jq merge has two inputs
for ((counter=1; 1; counter++))
do
wget "https://api.io/v1/candidates?page=${counter}&per_page=500" \
--header="Authorization: Basic aklsjdl;fakj;l;kasdflkaj" \
--output-document $tempfolder/nextjsonchunk
if [ $(wc -c <$tempfolder/nextjsonchunk) -le $emptyjsonize ]; then break; fi
jq -s '.[0] * .[1]' $tempfolder/mainjsonfile $tempfolder/nextjsonchunk > $tempfolder/merged
mv $tempfolder/merged $tempfolder/mainjsonfile # never redirect jq's output onto one of its input files
done
rm $tempfolder/nextjsonchunk # cleaning up
mv $tempfolder/mainjsonfile ./jsonresultfile # end result
It is a good idea to take two sample JSONs and test the merge between them, to check that it is being done properly.
It is also good to verify that the empty-json check works; the 10 bytes was just a guess.
A tmpfs (in-memory) partition, /dev/shm, was used in the examples to avoid many disk writes, but its use is optional.
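A quick sketch of the size check itself, against the 10-byte guess used above: an "empty" page such as [] or {} is well under the threshold, while any real payload exceeds it.

```shell
tmp=$(mktemp)
printf '[]' > "$tmp"                     # an empty JSON array: 2 bytes
if [ "$(wc -c < "$tmp")" -le 10 ]; then
  echo "looks empty: stop paging"
fi
printf '{"name": "harrypotter"}' > "$tmp"  # a real payload: well over 10 bytes
if [ "$(wc -c < "$tmp")" -gt 10 ]; then
  echo "has contents: keep going"
fi
rm -f "$tmp"
```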
You can use break to end the loop at any point:
#!/bin/bash
for ((counter=1; 1; counter++)); do
curl -o gettext.txt --request GET \
--url "https://api.io/v1/candidates?page=${counter}&per_page=500" \
--header 'Authorization: Basic aklsjdl;fakj;l;kasdflkaj'
if [ ! -s gettext.txt ]; then
break;
fi
# do something with gettext.txt
# as in your question, it will be overwritten in the next iteration
done
echo "$counter"
echo "All done"
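The `[ ! -s gettext.txt ]` test used above can be demonstrated on its own: `-s` is true only when the file exists and has a size greater than zero.

```shell
tmp=$(mktemp)
printf 'data' > "$tmp"
[ -s "$tmp" ] && echo "non-empty: keep looping"
: > "$tmp"                        # truncate to zero bytes
[ ! -s "$tmp" ] && echo "empty: would break here"
rm -f "$tmp"
```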
Like this?
#!/bin/bash
# Basic while loop
counter=1
while true; do
data=$(curl --request GET \
--url "https://api.io/v1/candidates?page=${counter}&per_page=500" \
--header 'Authorization: Basic aklsjdl;fakj;l;kasdflkaj')
[[ $data ]] || break
echo "$data" >> gettext.txt
((counter++))
done
echo $counter
echo All done

Bash script into crontab behaving differently

I'm having trouble in finding the reason why my BASH script behaves differently if run alone or inside cron. I have this snippet:
#!/bin/bash
RESPONSE=$(curl -fs -XPOST -H "Content-type: application/json" -d '{"id" : 4}' https://myserver.com)
echo $RESPONSE
if [ -z "$RESPONSE" ]; then
echo "empty response"
return 0
fi
COMMAND=$(echo $RESPONSE | python -c "import sys, json; print json.load(sys.stdin)['command']")
if [ -z "$COMMAND" ]; then
echo "empty command"
elif [ "$COMMAND" = "SYS_INFO" ];
then
#business logic
fi
that prints two different responses in the two environments:
$RESPONSE Running from console:
{"id":"1f78d8d0-e754-4a23-a2f0-448fbeb42995", "key":"\n4RHDFAnTull1Z+aHGbO1zXcAGghuaEUz0w8sT7dlpc80jG6ZaWnbDox4G0f8sKY\ng0WZ80zWf8ftNgX3nes9MWYEq00nM5jJWCSavmGSKCKjoGD2XqBod8W0Z5w/KAHTSitGVMFgMjda91+xozw8uMlzR/t3Y8FP2k/NHj\n"}
$RESPONSE Running from cron:
{"id":"1f78d8d0-e754-4a23-a2f0-448fbeb42995", "key":"
4RHDFAnTull1Z+aHGbO1zXcAGghuaEUz0w8sT7dlpc80jG6ZaWnbDox4G0f8sKYj
g0WZ80zWf8ftNgX3nes9MWYEq00nM5jJWCSavmGSKCKjoGD2XqBod8W0Z5w/KAHTSitGVMFgMjda91+xozw8uMlzR/t3Y8FP2k/NHj
"}
Please notice the \n that the server returns in the key field: they arrive escaped when running from the console and NOT escaped (actually, they are rendered as real newlines) when running from crontab.
What I've tried:
adding source ~/.bashrc as suggested here
changing value of PATH evaluating the differences of the two environments as suggested here
However, nothing seems to work.

Bash - How to get http response body and status code?

I am trying below code to get response body and status:
read -ra result <<< $(curl -i --insecure \
-H "Accept: application/json" \
-H "Content-Type:application/json" \
-X POST --data "$configData" $openingNode"/voice/v1/updateWithPh")
status=${result[1]}
response=${result[@]}
echo $status
Problem here is -
I get both status code and response Body correctly.
But when I create a bash function and send it as an argument, the response body changes to "HTTP/1.1" in the function as shown below.
echo $(validateUpdate $configData $response)
Code for the function -
function validateUpdate(){
echo $1
echo $2
}
$2 prints as "HTTP/1.1"
What is the reason? How to rectify this issue?
You need to enclose your variables in double quotes to prevent bash from splitting them into separate tokens.
Try
echo $(validateUpdate "$configData" "$response")
or even better (the echo is useless, as @tripleee points out; furthermore, curly braces improve readability):
validateUpdate "${configData}" "${response}"
use same thing inside of your function
echo "$2"
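A sketch of the splitting behaviour this answer describes, with a hypothetical `count_args` helper: unquoted, the response splits on whitespace and only its first word ("HTTP/1.1") lands in $2 of the callee; quoted, it stays one argument.

```shell
count_args() { echo "$#"; }

response="HTTP/1.1 200 OK"
count_args $response      # 3: unquoted expansion splits into three words
count_args "$response"    # 1: quoted expansion stays a single argument
```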

Curl command in bash script using variables

I have a curl command that looks like this:
curl -X PUT -H "myheader:coca-cola" -d '{ "name":"harrypotter" }' http://mygoogle.com/service/books/123
Running this command as is via terminal returns the expected results.
I am trying to incorporate this curl command in my bash script as follows:
#!/bin/bash
MYURL=http://mygoogle.com/service/books/123
# Generate body for curl request
generate_put_data()
{
cat <<EOF
{
"name":"harrypotter"
}
EOF
}
put_data=$(echo "$(generate_put_data)")
put_data_with_single_quotes="'$put_data'"
# Generate headers for curl request
header=myheader:coca-cola
header_with_double_quotes="\"$header\""
# The following function takes two inputs - a simple string variable (with no spaces or quotes) and the curl command string
function run_cmd() {
echo $1
echo $2
#Run the curl command
"$2"
#Check return code of the curl command
if [ "$?" -ne 0 ]; then
#do something with simple string variable
echo "$1"
echo "Job failed"
exit 1
else
#do something with simple string variable
echo "$1"
echo "Job Succeeded"
fi
}
# Run the bash function - run_cmd
run_cmd "mysimplestring" "curl -X PUT -H $header_with_double_quotes -d $put_data_with_single_quotes $MYURL"
However, when I try to run the above bash script, it fails at the point where I call run_cmd() function with the two inputs. I get the following error:
curl -X PUT -H "myheader:coca-cola" -d '{
"name":"harrypotter"
}' http://mygoogle.com/service/books/123: No such file or directory
Job failed
This error occurs on the line where "$2" is being executed in the run_cmd() function declaration.
Could someone help me understand where I am going wrong? Thanks!
"$2"
This will take the second argument and try to run it without doing any word splitting. It treats it as one string.
You're going to run into trouble passing in the curl command as one string. You'll do better if you pass it without quotes, just as if you typed it on the command line. You'll want to quote each of the variables but not quote the command as a whole.
run_cmd "mysimplestring" curl -X PUT -H "$header" -d "$put_data" "$MYURL"
Notice that you don't need the "with_quotes" variables any more. You don't have to do anything like that. The original plain values will work.
Now you can access the command using array syntax:
function run_cmd() {
local name=$1; shift
local cmd=("$#")
#Run the curl command
"${cmd[#]}"
}
By the way, this is a useless use of echo:
put_data=$(echo "$(generate_put_data)")
Make that:
put_data=$(generate_put_data)
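Putting the pieces together, here is the array-based run_cmd from the answer exercised end to end, with printf standing in for curl so the preserved word boundaries are visible:

```shell
run_cmd() {
  local name=$1; shift     # first argument is the plain label
  local cmd=("$@")         # the rest is the command and its arguments
  "${cmd[@]}"              # word boundaries are preserved exactly as passed
}

run_cmd "mysimplestring" printf '<%s>' -H "myheader:coca-cola"
echo                       # prints <-H><myheader:coca-cola>
```

The header value travels as one word even though it was never wrapped in extra quotes, which is exactly what the eval/quote gymnastics in the question were trying to achieve.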
