Bash: variable insertion into curl call not working [duplicate]

I have a very simple bash script with three commands.
The first command extracts the first word of the last git commit message, the second attempts a POST call to an API endpoint with that variable as part of the payload, and the third just prints the variable to confirm it was set. See the code below:
SOMETHING=$(git log -1 --pretty=%B | head -n1 | sed -e 's/\s.*$//' | cut -d ' ' -f1)
curl -X POST \
http://www.someurl.com/ \
-H 'Cache-Control: no-cache' \
-d '{"item":"$SOMETHING"}'
echo "variable was $SOMETHING"
When I run that bash script, I get an XML response from the service saying "item was not set properly"; however, the echo prints the correct value, so I know the first line is working. If I copy that curl command into bash and replace $SOMETHING with the actual value, it works fine.

Single quotes do not expand the $variables inside them.
Try
'{"item":"'"$SOMETHING"'"}'
instead. Brief explanation:
'{"item":"' is a string delimited by single quotes that contains double quotes
"$SOMETHING" is a string delimited by double quotes, that expands the variable $SOMETHING
'"}' is again a ''-delimited string that contains double quotes
Simply writing those strings in a row without gaps is string concatenation
In this way, you get your variable expansion, but don't have to insert any backslashes to escape the double quotes.
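Applied to the script in the question, the request would look roughly like this (URL and header kept as the question's placeholders):
# the single quotes end just before the expansion and resume just after it
curl -X POST \
http://www.someurl.com/ \
-H 'Cache-Control: no-cache' \
-d '{"item":"'"$SOMETHING"'"}'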

literal double quotes around function parameter in bash scripting [duplicate]

I want to create a shell script function that runs this:
docker inspect 5914ba2819d6 | jq '.[0].NetworkSettings.Networks["foo"].IPAddress'
I'm trying to get the function to run like:
getcontainerip 5914ba2819d6 foo
I've tried many permutations of this in .bash_profile, for example:
getcontainerip() {
docker inspect "$1" | jq '.[0].NetworkSettings.Networks["$2"].IPAddress'
}
No matter what, the literal double quotes never get added surrounding the second parameter. The output will either be blank (because the double quotes aren't there), or a syntax error:
jq: error: syntax error, unexpected '$' (Unix shell quoting issues?)
How do I get the literal double quotes to surround the second parameter? Thanks.
Try this:
getcontainerip() {
docker inspect "$1" | jq '.[0].NetworkSettings.Networks["'$2'"].IPAddress'
}
The outer '' have the effect that $2 is not replaced with the second parameter but rather passed literally to jq. That's why you need to close the ''s temporarily to let the shell expand $2.
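If you'd rather not splice shell text into the jq program at all, jq's --arg option passes the value in as a jq variable. A sketch using the same function name as the question:
getcontainerip() {
# --arg makes the shell's $2 available inside jq as $net, with no quoting gymnastics
docker inspect "$1" | jq --arg net "$2" '.[0].NetworkSettings.Networks[$net].IPAddress'
}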

bash: treat all arguments as a single string param [duplicate]

This will be better explained with an example. I'm building a little bash script that reruns the passed command line every 1 second. This is my first attempt:
-- file wt
watch -n1 "$@"
If I issue:
wt "df -h | grep sda"
It works ok. I want to be able to do:
wt df -h | grep sda
and have it work just the same.
That is, I want to treat "df -h | grep sda" as a single string parameter.
There's a variable for that! From man bash:
* Expands to the positional parameters, starting from one. When the expansion
  is not within double quotes, each positional parameter expands to a separate
  word. In contexts where it is performed, those words are subject to further
  word splitting and pathname expansion. When the expansion occurs within
  double quotes, it expands to a single word with the value of each parameter
  separated by the first character of the IFS special variable. That is, "$*"
  is equivalent to "$1c$2c...", where c is the first character of the value of
  the IFS variable. If IFS is unset, the parameters are separated by spaces.
  If IFS is null, the parameters are joined without intervening separators.
So change your script to:
#!/bin/bash
watch -n1 "$*"
But that won't work for pipes. You will still need to escape those:
wt df -h \| grep sda
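To make the joining behaviour concrete, here is a small illustration (the show function exists only for this demo):
show() { printf '<%s>\n' "$*"; printf '[%s]\n' "$@"; }
show df -h
# <df -h>    "$*" joins the arguments into a single word
# [df]
# [-h]       "$@" keeps them as separate words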

Passing string to curl without literal double quotes

I am trying to write a shell script that invokes two APIs using curl.
One of the keys of the JSON output of the first curl is passed to the second curl. In the Bash script below, I pass a token as a command line parameter to the first curl, and that part works fine.
The output of the first curl is extracted into client_token and passed to the second curl, which fails.
The reason is that wherever I have $client_token, the value gets substituted as "value" (with quotes) instead of value (without quotes), and curl expects the string without quotes in the second call. How can I get rid of the double quotes?
echo $1
XVaultToken=`curl -X POST "https://sub.domain.tld:8200/login" -d '{"token":"'"$1"'"}'`
client_token=`echo $XVaultToken|jq '.auth.client_token'`
echo $client_token
apiKey=`curl -X GET https://sub.domain.tld:8200/api-key -H 'X-Vault-Token: "'"$client_token"'"'`
#apiKey=`curl -X GET https://sub.domain.tld:8200/api-key -H 'X-Vault-Token: $client_token'`
echo "apikey"
Probably your jq command is outputting the quotes that you don't want. Ask jq for the raw value instead:
client_token=`echo $XVaultToken|jq -r '.auth.client_token'`
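With -r in place, the second request can send the bare token. A rough sketch of the corrected lines, keeping the endpoints from the question:
XVaultToken=$(curl -X POST "https://sub.domain.tld:8200/login" -d '{"token":"'"$1"'"}')
# -r makes jq print the raw string, without the surrounding JSON quotes
client_token=$(echo "$XVaultToken" | jq -r '.auth.client_token')
# the header value no longer needs (or wants) literal double quotes
apiKey=$(curl -X GET https://sub.domain.tld:8200/api-key -H "X-Vault-Token: $client_token")
echo "$apiKey"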

variable in bash not working [duplicate]

I have a simple function in my bashrc file, it takes 2 arguments and makes a curl call. The problem is, the curl request is not getting the variable. Here's the function...
checkUserName() {
cd ~/
echo "checking for username $2"
curl -w "#curl-format.txt" -H "Content-Type: application/json" -X POST -d '{"userName": "$2"}' http://localhost:8080/$1/verify
}
alias unameCheck=checkUserName
Then I call it with something like...
unameCheck users danwguy
and I will see...
checking for username danwguy
in my terminal prompt, however when I look at my logs from my application it shows that it is checking for userName $2
So the variable isn't being replaced in the curl command, even though the $1 is being replaced, since it is calling the correct API on my localhost.
It is replaced in the echo command but not in the curl command. I have even tried creating a local variable, but no matter what I try, the value is never substituted in the curl call, even though it is in the echo call.
Can anyone see why $2 isn't being replaced properly?
Parameter expansions ($var) will not expand in single quotes. Use double quotes instead:
$ curl -w "#curl-format.txt" \
-H "Content-Type: application/json" \
-X POST \
-d '{"userName": "'"$2"'"}' \
"http://localhost:8080/$1/verify"
Also wrap parameter expansions in double quotes to avoid word splitting and
pathname expansion.
Just a note on the previous answer: you can also escape the double quotes with backslashes, telling the shell not to give them their special meaning, so the final code looks like:
... -d "{\"userName\": \"$2\"}" ...
which I find easier to read.
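Either way, the function from the question could be written roughly like this (same logic, only the quoting changed):
checkUserName() {
cd ~/
echo "checking for username $2"
# double quotes around the body let $1 and $2 expand; the inner quotes are escaped
curl -w "@curl-format.txt" -H "Content-Type: application/json" -X POST \
-d "{\"userName\": \"$2\"}" "http://localhost:8080/$1/verify"
}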

Bash: Format environment variables in string

How does one format a string that includes an environment variable, at the command line? For example, I want to curl and pass some variable, i.e.:
curl -X POST --data-urlencode 'payload={"text": "I want to print an environment variable here, eg a path: $PATH"}' https://someapi.com/
The easy answer is to change the kind of quotes you're using:
curl -X POST --data-urlencode \
'payload={"text": "I want to print an environment variable here, eg a path: '"$PATH"'"}' \
https://someapi.com/
Notably, this still uses single quotes on the outside (so the payload itself doesn't need to change); it ends the single quotes, starts double quotes, and embeds the substitution inside those double quotes, then ends them and switches back to single quotes, within which literal (rather than syntactic) double quotes can be embedded.
A variant on this approach, which avoids the need for syntactic quotes mixed into the document content, is to use an unquoted heredoc, as advised by @chepner in the comments:
curl -X POST --data-urlencode @- https://someapi.com/ <<EOF
payload={"text": "I want to print an environment variable here, eg a path: $PATH"}
EOF
The better answer is to use a tool that knows how to format JSON; jq is a widely popular choice. Consider the following example:
text="I want to print an environment variable here, eg a path: \"$PATH\""
curl -X POST --data-urlencode @- https://someapi.com/ <<EOF
payload=$(printf '%s\n' "$text" | jq -R '{text: .}')
EOF
This way you're guaranteed valid output, even if your environment variable contains backslashes, literal quotes, nonprintable characters, or whatever else may come.
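As a quick illustration, jq escapes anything that would otherwise break the JSON (the input value here is made up; -c just compacts the output):
printf '%s\n' 'a "quoted" value with a \ backslash' | jq -c -R '{text: .}'
# {"text":"a \"quoted\" value with a \\ backslash"}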
