I have a curl command in a script. When the script runs, the command fails to fetch the resource (the command itself executes; it's the response that's wrong), but if I copy and paste the same command into the terminal I get the expected response.
After reading this, my script looks like this:
jsess=`awk '/\sJSESSION/ { print "\x27"$6"="$7"\x27" }' cookies.txt`
ARGS=( -k -v -b $jsess $url7)
echo "curl ${ARGS[*]}"
curl "${ARGS[@]}"
and the last echo looks like this:
curl -k -v -b 'JSESSIONID=hexystuff' https://secretstuff.com
The last curl doesn't work, but copy-pasting that echo works. Any ideas what could be wrong? Thanks.
The problem seems to be the two single quotes; try this:
jsess="$(awk '/\sJSESSION/ { print $6"="$7 }' cookies.txt)"
ARGS=( -k -v -b "$jsess" "$url7")
echo "curl ${ARGS[*]}"
curl "${ARGS[@]}"
args="-k -v -b"
jsess=$(awk '/\sJSESSION/ { print $6"="$7 }' cookies.txt)
url7="https://secretstuff.com"
# $args is deliberately left unquoted so it word-splits into three options
curl $args "$jsess" "$url7"
The use of arrays is not my personal preference, and I believe the current situation demonstrates why. As much as possible, every individual item of data should be contained in its own variable. This makes accessing the variables much simpler and greatly increases flexibility: I can choose exactly which pieces of information go into a given command line.
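Whichever style you prefer, the underlying issue in the question is the same: quote characters stored inside a variable are passed to the command literally, not re-parsed by the shell. A minimal sketch (with a made-up cookie value) illustrating this:

```shell
#!/bin/bash
# Embedded single quotes survive expansion as literal characters:
jsess="'JSESSIONID=hexystuff'"
bad=( $jsess )                 # one word, quotes included
printf '%s\n' "${bad[0]}"      # → 'JSESSIONID=hexystuff'

# Without embedded quotes, curl would receive a clean cookie string:
jsess="JSESSIONID=hexystuff"
good=( "$jsess" )
printf '%s\n' "${good[0]}"     # → JSESSIONID=hexystuff
```

This is why the echoed command works when pasted: the terminal re-parses the quotes, but the script never does.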
When writing bash scripts, I want to store my whole curl command in a heredoc to get a better layout. The following works fine:
#!/bin/bash
read -r -d '' command1 <<- MULTI_STRING_SCOPE
curl -v www.stackoverflow.com
MULTI_STRING_SCOPE
But when I add some JSON data with the -d option, the command executes weirdly. For example:
#!/bin/bash
read -r -d '' command2 <<- MULTI_STRING_SCOPE
curl -v www.stackoverflow.com
-d '{
"hello":"world"
}'
MULTI_STRING_SCOPE
response2=$(${command2})
Wrong logs from terminal:
curl: (3) URL using bad/illegal format or missing URL
curl: (3) unmatched close brace/bracket in URL position 1:
}'
It seems that curl treats the line }' as a separate URL, so the JSON data is not sent as a single unit.
How can I solve this problem? Any suggestions would be highly appreciated.
I learned from this post how to make a heredoc work with the curl command.
As @Gordon Davisson commented on this post, we should not mix command with data. The JSON passed to the -d option is data, while the other parts are the command, so I decided to use the heredoc to store only the JSON data and keep the rest as a plain command, rather than storing everything in a string via heredoc.
The resulting bash script should look something like this:
#!/bin/bash
response3=$(curl -v www.stackoverflow.com \
-d @- <<- MULTI_STRING_SCOPE
{
"hello":"world"
}
MULTI_STRING_SCOPE
)
Note: heredoc indentation stripping (the <<- form) only works with tabs, not spaces. Be careful, especially when working with editors like Visual Studio Code, which may be configured to indent with spaces.
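A network-free sketch of the same pattern: feed a heredoc to a command's stdin inside a command substitution. Here plain `cat` stands in for `curl ... -d @-`, since both read the request body from stdin:

```shell
#!/bin/bash
# cat stands in for `curl -v www.stackoverflow.com -d @-`:
# the heredoc becomes the command's stdin, and the whole JSON
# body stays together as a single unit.
response=$(cat << 'EOF'
{
"hello":"world"
}
EOF
)
printf '%s\n' "$response"
```

The quoted delimiter ('EOF') keeps the JSON from undergoing any shell expansion.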
I have a script which needs to pass a string containing a space.
Basically, I want to collect the error return of calling curl "https://my-api.com" -H'X-API-Key: API key here'
The following works, but I'd really rather avoid using eval. I'm sure there is a cleaner way of writing this, but I can't find it.
CURL="curl -s --fail $URL -H\"X-API-Key:$API_KEY\""
return $(eval "$CURL" >> /dev/null)
This is a duplicate of Dynamically building a command in bash, but here is the fix for your code.
Please take the time to read the answers from the parent question.
# Build the curl command with its arguments in an array
curl_cmd=(curl -s --fail "$URL" -H "X-API-Key:$API_KEY")
# Execute the curl command with its arguments from the curl_cmd array
"${curl_cmd[@]}"
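A self-contained sketch (with made-up values for $URL and $API_KEY) showing why the array form works: each element stays a single word even when it contains spaces. `printf` is used in place of running curl so the expansion is visible:

```shell
#!/bin/bash
URL="https://my-api.com"   # made-up values for illustration
API_KEY="API key here"     # note: contains spaces

# Build the command exactly as in the answer above:
curl_cmd=(curl -s --fail "$URL" -H "X-API-Key:$API_KEY")

# Print one argument per line to show how the shell would pass them;
# the header arrives as the single word <X-API-Key:API key here>.
printf '<%s>\n' "${curl_cmd[@]}"
```

Arrays also make it easy to append arguments conditionally, e.g. `curl_cmd+=(-H "X-API-Key:$API_KEY")` only when the key is set.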
I'm trying to run a curl command such as:
curl -Sks -XPOST https://localhost:9999/some/local/service -d 'some text arguments'
The above works fine. However, when I place
'some text arguments'
in a file and call the command as:
curl -Sks -XPOST https://localhost:9999/some/local/service -d `cat file_with_args`
I get many exceptions from the service. Does anyone know why this is?
Thanks!
If the file contains 'some text arguments', then your command is equivalent to this:
curl -Sks -XPOST https://localhost:9999/some/local/service -d \'some text arguments\'
— that is, it's passing 'some, text, and arguments' (three separate words, with the quote characters kept literally) as arguments to curl.
Instead, you should put just some text arguments in the file (no single-quotes), and run this command:
curl -Sks -XPOST https://localhost:9999/some/local/service -d "`cat file_with_args`"
(wrapping the `cat file_with_args` part in double-quotes so that the resulting some text arguments doesn't get split into separate arguments).
Incidentally, I recommend writing $(...) rather than `...`, because it's more robust in general (though in your specific command it doesn't make a difference).
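A self-contained sketch of the difference, using `set --` to count the resulting words instead of invoking curl:

```shell
#!/bin/bash
# Write the data (with no quote characters) to a file, as recommended:
tmpfile=$(mktemp)
printf 'some text arguments' > "$tmpfile"

set -- $(cat "$tmpfile")      # unquoted substitution: splits on whitespace
unquoted_count=$#             # three separate arguments

set -- "$(cat "$tmpfile")"    # quoted substitution: a single argument
quoted_count=$#

echo "$unquoted_count $quoted_count"
rm -f "$tmpfile"
```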
I've got a script in sh under Solaris 5.8 that isn't working as expected and don't really know why...
The script reads a list of URLs from a file, tests them with curl and writes the output to a log file:
#!/bin/sh
# Logs path
LOG_DIR=/somedir/logs
# URLs file path
URL_FILE=/somedir/url
# Actual date
DATE=`date +%Y%m%d%H%M`
# CURL
CURL=/somedir/bin/curl
test_url()
{
cat $URL_FILE | grep -i $1 | while read line
do
NAME=`echo $line | awk '{ print $1 }'`
URL=`echo $line | awk '{ print $2 }'`
TIME=`$CURL -s -o /dev/null -w %{time_total} $URL`
echo "$DATE $TIME" >> $LOG_DIR/${NAME}_${1}.log
done
}
test_url someurl
test_url someotherurl
The URL_FILE has this layout:
somename1 http://someurl/test
somename2 http://someotherurl/test
The script loads the URLs from the file and uses curl to measure the total time each URL takes to load, then prints the date and the time (in ms). The problem is that the TIME variable comes out empty when the script runs from crontab, but not when I run it myself:
# Output when called with the user ./script.sh
201202201018 0.035
# Output when called from crontab.
201202201019
If I redirect all output with * * * * * /path/to/script/script.sh 1&2 > /tmp/output, the output file is blank.
I also haven't seen anything about it in /var/log/syslog. Any clue why the TIME variable isn't set correctly when the script is called from crontab?
Things you should check out:
Is /path/to/script/script.sh 1&2 > /tmp/output valid in your cron? On my machine, it would run the script with argument "1" and try to find a program called "2" to run it. Failing to find it, it creates an empty output file. I think you're looking for something like /path/to/script/script.sh >> /tmp/output 2>&1
Do you set CURL to the full path of curl? Cron normally doesn't have the same path information that you have.
Which user do you use for running cron? Could there be access restrictions to curl or the network?
% is indeed handled differently by cron, if it's written in the crontab. It means newline, and should otherwise be escaped. Have you been running tests with curl directly in cron? As long as you keep this in your script you should be fine.
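Putting those points together, hypothetical crontab entries illustrating the fixes (paths are the question's own):

```
# Redirect both stdout and stderr to the log file:
* * * * * /path/to/script/script.sh >> /tmp/output 2>&1

# If a curl -w format were placed directly in the crontab line, each %
# would need escaping, because cron treats an unescaped % as a newline:
0 * * * * /somedir/bin/curl -s -o /dev/null -w \%{time_total} http://someurl/test
```

Since the question keeps the % inside the script file, only the redirection fix applies here.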
Inside a bash script I am piping the output of a curl request to tee and then duplicating the output to many subshells. At the end of each subshell I want to assign the result to a variable declared inside my script:
#!/bin/bash
token_secret=""
token_value=""
function extractTokenSecret() {
sed -n 's/.*secret":"\([^"]*\)".*/\1/p'
}
function extractTokenValue() {
sed -n 's/.*token":"\([^"]*\)".*/\1/p'
}
function createToken() {
curl -v \
-X POST \
-s http://localhost:8080/token | tee >/dev/null \
>(extractTokenSecret | [ASSIGN THE VARIABLE token_secret HERE]) \
>(extractTokenValue | [ASSIGN THE VARIABLE token_value HERE])
}
Any help appreciated
The commands that consume the output of your curl command appear after the pipe (|) character, so they run in subshells of the current command processor, just as you say in your question. In other words, they are child processes, and cannot directly affect the environment of the parent shell.
You'll need some other way to process the output of the curl command that lets your script assign text to variables in the current shell; i.e., don't try to do the assignment as a second or third command in a pipeline. For this, command substitution with $() is your friend (eval also works, but is best avoided).
Maybe something like:
$ output=$(curl options...)
$ variable1=$(echo "$output" | sed ...)
$ variable2=$(echo "$output" | sed other stuff...)
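A runnable sketch of that approach, using a made-up JSON response in place of the actual curl output (the sed patterns are the question's own):

```shell
#!/bin/bash
# Stand-in for: output=$(curl -v -X POST -s http://localhost:8080/token)
output='{"secret":"s3cr3t","token":"t0k3n"}'   # made-up response

# Each extraction runs via $(), so the assignment happens in this shell,
# not in a pipeline subshell:
token_secret=$(printf '%s' "$output" | sed -n 's/.*secret":"\([^"]*\)".*/\1/p')
token_value=$(printf '%s' "$output" | sed -n 's/.*token":"\([^"]*\)".*/\1/p')

echo "$token_secret"   # → s3cr3t
echo "$token_value"    # → t0k3n
```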
Something along these lines should work (I haven't got a particularly clear idea of how precisely you were trying to split it up, but this should be a basis):
function createToken() {
original=$(curl -v -X POST -s http://localhost:8080/token)
token_secret=$(extractTokenSecret "$original")  # And then get extractTokenSecret to use $1
token_value=$(extractTokenValue "$original")    # Ditto
}
Also, no spaces around =, please.
token_secret=''
token_value=''