How do I use a variable in the --data section of a curl command? - shell

I am writing a bash script to call curl from the command line. I sadly cannot figure out how to substitute a simple variable into the -d section of the curl request.
Why doesn't this work?
#!/bin/sh
name=$1
test -z $name && echo "Repo name required." 1>&2 && exit 1
curl -u 'metaraine' https://api.github.com/user/repos -d '{"name":"$name"}'
It doesn't actually substitute the value of $name into the data.

What about this?
curl -u 'metaraine' https://api.github.com/user/repos -d "{\"name\":\"$name\"}"
That is, escape the inner quotes and use double quotes around the {} instead of single quotes.
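Another common pattern (not from the original answer, just an alternative worth knowing) is to keep single quotes for the literal JSON and step out of them only around the variable:
# The single-quoted pieces are literal; "$name" is expanded between them
curl -u 'metaraine' https://api.github.com/user/repos -d '{"name":"'"$name"'"}'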

Related

curl works, but won't execute in BASH script

The following curl command works from the command line. I get a valid response from the server.
curl -X POST https://somebaseurl/api/v1/auth/login -H "Content-Type:application/json" -d '{"email": "foo#bar,com", "password": "foo"}'
However I am trying to write a BASH script like this
baseUrl=https://somebaseurl
contentTypeJson="\"Content-Type:application/json\""
credentials="'{\"email\": \"foo#bar.com",\"password\": \"foo\"}'"
login="curl -X POST $baseUrl/api/v1/auth/login -H $contentTypeJson -d $credentials"
echo ${login}
response=`${login}`
echo ${response}
I get a bad request response from the server. However if I copy the echoed curl command directly into my terminal it works. What am I doing wrong?
edit:
As requested I get
Bad Request For request 'POST api/v1/auth/login' [Expected application/json]
Bash and cURL can be quite particular about how quotes are used within a script. If the escaping gets thrown off, everything else can easily fail. Running the script through shellcheck.net is often very helpful in identifying such issues. Below is a revised version of the script after applying the suggested fixes:
#!/bin/bash
baseUrl="https://somebaseurl/api/v1/auth/login"
contentTypeJson="Content-Type:application/json"
credentials="{\"email\": \"foo#bar.com\", \"password\": \"foo\"}"
login="$(curl -X POST "$baseUrl" -H "$contentTypeJson" -d "$credentials")"
echo "${login}"
response="${login}"
echo "${response}"
Executing the stored command with backticks only splits the string into words; the quotes inside it are not treated specially. To have the shell interpret those quotes as if they had been typed interactively, use eval ${login} instead.
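A minimal illustration of the difference (my own example, not from the original answer):
cmd='echo "hello world"'
$cmd          # word-splits only: prints "hello world" with the literal quotes
eval "$cmd"   # re-parses the quotes: prints hello world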
As an aside, bash has a -x option which will show you commands as they are being executed (run your script with bash -x script.sh instead of bash script.sh or ./script.sh). This will show you the commands correctly quoted, and is more helpful than printing them out using echo.

While loop to execute nagios commands not working properly

I wrote a small bash script in this post: How to search for a string in a text file and perform a specific action based on the result
I noticed that when I ran the script and checked the logs, everything appeared to be working, but when I look at the Nagios UI, almost half of the servers listed in my text file did not get their notifications disabled. A revised version of the script is below:
host=/Users/bob/wsus.txt
password="P#assw0rd123"
while read -r host; do
region=$(echo "$host" | cut -f1 -d-)
if [[ $region == *sea1* ]]
then
echo "Disabling host notifications for: $host"
curl -vs -o /dev/null -d "cmd_mod=2&cmd_typ=25&host=$host&btnSubmit=Commit" https://nagios.$region.blah.com/nagios/cgi-bin/cmd.cgi" -u "bob:$password" -k 2>&1
else
echo "Disabling host notifications for: $host"
curl -vs -o /dev/null -d "cmd_mod=2&cmd_typ=25&host=$host&btnSubmit=Commit" https://nagios.$region.blah02.com/nagios/cgi-bin/cmd.cgi" -u "bob:$password" -k 2>&1
fi
done < wsus.txt >> /Users/bob/disable.log 2>&1
If I run the command manually against the servers having the issue, they do get disabled in the Nagios UI, so I'm a bit confused. FYI, I'm not well versed in Bash either, so this was my attempt at automating this process a bit.
1 - There is a missing double-quote before the first https occurrence:
You have:
curl -vs -o /dev/null -d "cmd_mod=2&cmd_typ=25&host=$host&btnSubmit=Commit" https://nagios.$region.blah.com/nagios/cgi-bin/cmd.cgi" -u "bob:$password" -k 2>&1
Should be:
curl -vs -o /dev/null -d "cmd_mod=2&cmd_typ=25&host=$host&btnSubmit=Commit" "https://nagios.$region.blah.com/nagios/cgi-bin/cmd.cgi" -u "bob:$password" -k 2>&1
2 - Your first variable host is never used (overwritten inside the while loop).
I'm guessing what you were trying to do was something like:
hosts_file="/Users/bob/wsus.txt"
log_file="/Users/bob/disable.log"
# ...
while read -r host; do
# Do stuff with $host
done < "$hosts_file" >> "$log_file" 2>&1
3 - This looks suspicious to me:
if [[ $region == *sea1* ]]
Note: I haven't tested it yet, so this is my general feeling about this, might be wrong.
The $region isn't double-quoted; inside a double-bracket test [[ this is safe from word splitting, but it is still worth making sure there are no spaces or other funny stuff in there.
Inside [[ $region == *sea1* ]] the *sea1* is matched as a pattern against the variable (it is not expanded against the files in your current directory), so that part should work. If you prefer a regular expression, you can use the =~ operator or (my favorite for some reason) the grep command:
if grep -q "sea1" <<< "$region"; then
# Your code if match
else
# Your code if no match
fi
The -q keeps grep quiet.
There is no need for a test like [ or [[ because grep's return code is already 0 if there is any match.
The <<< simply feeds the string on the right to the standard input of the command on the left (avoiding a useless pipe like echo "$region" | grep -q "sea1").
If this doesn't solve your problem, please provide a sample of your input file hosts_file as well as some output logs.
You could also see what's really going on under the hood by wrapping the relevant part of your script between set -x and set +x to activate debug/trace mode.
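Putting those three points together, a rough sketch of the corrected loop might look like this (untested; the hostnames, credentials, and paths are the placeholders from the question):
#!/bin/bash
hosts_file="/Users/bob/wsus.txt"
log_file="/Users/bob/disable.log"
password="P#assw0rd123"

while read -r host; do
  region=$(echo "$host" | cut -f1 -d-)
  echo "Disabling host notifications for: $host"
  # pick the Nagios instance based on the region prefix
  if grep -q "sea1" <<< "$region"; then
    url="https://nagios.$region.blah.com/nagios/cgi-bin/cmd.cgi"
  else
    url="https://nagios.$region.blah02.com/nagios/cgi-bin/cmd.cgi"
  fi
  curl -vs -o /dev/null \
    -d "cmd_mod=2&cmd_typ=25&host=$host&btnSubmit=Commit" \
    "$url" -u "bob:$password" -k 2>&1
done < "$hosts_file" >> "$log_file" 2>&1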

Running script through command

I'm trying to decode my password inside the script as it runs, but it seems the password is passed to rdesktop as a literal string and never decoded. Is there a better way of doing this?
#!/bin/bash
MYENC="Tk9UX1RIQVRfU1RVUElEX0xPTAo="
rdesktop -u FOO -d mgmt -p 'echo $(echo $MYENC) | base64 --decode' 192.0.0.0
I also tried to just pass in a variable but that failed as well.
Try this instead:
#!/bin/bash
MYENC="Tk9UX1RIQVRfU1RVUElEX0xPTAo="
rdesktop -u FOO -d mgmt -p $(echo $MYENC | base64 --decode) 192.0.0.0
Note that I wrapped the juicy stuff echo...base64... in $(...). This is called "command substitution" - basically you're telling bash that you want the code inside the $(...) to be executed before the rest of the line, with the result substituted in its place. More info here: http://www.tldp.org/LDP/abs/html/commandsub.html
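For example (a generic illustration of command substitution, not specific to rdesktop):
now=$(date +%Y-%m-%d)   # date runs first; its output replaces the $(...)
echo "Today is $now"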
Alternatively, you can skip echo entirely and feed the variable straight to base64 with a here-string:
#!/bin/bash
MYENC="Tk9UX1RIQVRfU1RVUElEX0xPTAo="
rdesktop -u FOO -d mgmt -p $(base64 --decode <<< "$MYENC") 192.0.0.0

Trouble escaping quotes in a variable held string during a Sub-shell execution call [duplicate]

This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
I'm trying to write a database call from within a bash script and I'm having problems with a sub-shell stripping my quotes away.
This is the bones of what I am doing.
#---------------------------------------------
#! /bin/bash
export COMMAND='psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o ${EXPORT_FILE} 2>&1'
PSQL_RETURN=`${COMMAND}`
#---------------------------------------------
If I use an 'echo' to print out the ${COMMAND} variable the output looks fine:
echo ${COMMAND}
screen output:-
#---------------
psql drupal7 -F , -t --no-align -c "SELECT DISTINCT hostname FROM accesslog;" -o /DRUPAL/INTERFACES/EXPORTS/ip_list.dat 2>&1
#---------------
Also if I cut and paste this screen output it executes just fine.
However, when I try to execute the command as a variable within a sub-shell call, it gives an error message.
The error is from the psql client to the effect that the quotes have been removed from around the ${SQL} string.
The error suggests psql is trying to interpret the terms in the sql string as parameters.
So it seems the string and quotes are composed correctly but the quotes around the ${SQL} variable/string are being interpreted by the sub-shell during the execution call from the main script.
I've tried to escape them using various methods: \", \\", \\\", "", \"" '"', \'"\', ... ...
As you can see from my 'try it all' approach I am no expert and it's driving me mad.
Any help would be greatly appreciated.
Charlie101
Instead of storing the command in a string variable, it is better to use a Bash array here:
cmd=(psql "${DB_NAME}" -F , -t --no-align -c "${SQL}" -o "${EXPORT_FILE}")
PSQL_RETURN=$( "${cmd[@]}" 2>&1 )
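A quick illustration (my own example, not from the answer) of why an array keeps arguments containing spaces intact, unlike expanding an unquoted string:
args=(-c "SELECT DISTINCT hostname FROM accesslog;" -o "ip_list.dat")
printf '<%s>\n' "${args[@]}"   # each array element stays a single argument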
Rather than evaluating the contents of a string, why not use a function?
call_psql() {
  # optional, if variables are already defined in global scope
  DB_NAME="$1"
  SQL="$2"
  EXPORT_FILE="$3"
  psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE" 2>&1
}
then you can just call your function like:
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
It's entirely up to you how elaborate you make the function. You might like to check for the correct number of arguments (using something like (( $# == 3 ))) before calling the psql command.
Alternatively, perhaps you'd prefer just to make it as short as possible:
call_psql() { psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1; }
In order to capture the command that is being executed for debugging purposes, you can use set -x in your script. This will show the contents of the function, including the expanded variables, when the function (or any other command) is called. You can switch this behaviour off using set +x, or if you want it on for the whole duration of the script you can change the shebang to #!/bin/bash -x. This saves you explicitly echoing throughout your script to find out what commands are being run; you can just turn on set -x for a section.
A very simple example script using the shebang method:
#!/bin/bash -x
ec() {
  echo "$1"
}
var=$(ec 2)
Running this script, either directly after making it executable or calling it with bash -x, gives:
++ ec 2
++ echo 2
+ var=2
Removing the -x from the shebang or the invocation results in the script running silently.

Bash and curl, getting custom argument working

I'm trying to make a very small script, but I've run into a problem. I want to call a simple bash script, passing an IP address, like this:
./bashScript 192.111.211.211
the script looks like this:
#!/bin/bash
curl https://www.xxx.com/api_json.html \
-d 'a=ban' \
-d 'tkn=xxxxxx' \
-d 'email=xxx#gmail.com' \
-d 'key=$1' \
but it isn't working: the $1 argument is not being sent, and I get an error from the web service.
What am I doing wrong?
Thanks a lot!
Use double quotes:
-d "key=$1"
Single quotes prevent variable expansion:
~$ foo=bar
~$ echo '$foo'
$foo
~$ echo "$foo"
bar
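So the script from the question should work once the last -d uses double quotes (a sketch, keeping the question's placeholder URL, token, and email):
#!/bin/bash
curl https://www.xxx.com/api_json.html \
  -d 'a=ban' \
  -d 'tkn=xxxxxx' \
  -d 'email=xxx#gmail.com' \
  -d "key=$1"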

Resources