I have a cron issue with curl:
curl -w "%{time_total}\n" -o /dev/null -s http://myurl.com >> ~/log
works great and appends a line with the time_total value to the log file.
But the same line run from cron doesn't write anything.
It's not a PATH problem, because curl http://myurl.com >> ~/log works from cron.
% is a special character for crontab. From man 5 crontab:
The "sixth" field (the rest of the line) specifies the command to be
run. The entire command portion of the line, up to a newline or a
"%" character, will be executed by /bin/sh or by the shell specified
in the SHELL variable of the cronfile. A "%" character in the
command, unless escaped with a backslash (\), will be changed into
newline characters, and all data after the first % will be sent to
the command as standard input.
So you need to escape the % character:
curl -w "%{time_total}\n" -o /dev/null -s http://myurl.com >> ~/log
to
curl -w "\%{time_total}\n" -o /dev/null -s http://myurl.com >> ~/log
(note the added backslash before the %)
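For example, the complete crontab entry might look like this (the every-five-minutes schedule is just a placeholder):
*/5 * * * * curl -w "\%{time_total}\n" -o /dev/null -s http://myurl.com >> ~/log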
I would like to use a variable inside a curl/shell command. The command is:
sh """\ oc exec pod-test -c "conttest" -- \\ /bin/bash -c 'curl X POST \\ -u "usr:pass" \\ -H "content-Type:application/json" \\ -d "{\"id\": \"1\"} \\ http://127.0.0.1:8080/tym/api/obj/$profile/add"' """.stripIndent()
The variable profile is not substituted correctly, and I get this error: command terminated with exit 3
Be sure you use the right string syntax, especially when escaping characters like " and \.
You can use variables in strings like this:
"${profile}"
For more information about groovy and strings see https://www.tutorialspoint.com/groovy/groovy_strings.htm
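For illustration, a sketch of the same step with Groovy string interpolation might look like the following (untested; it also adds the dash missing from -X POST and closes the quote after the -d payload):
sh """\
    oc exec pod-test -c "conttest" -- \\
        /bin/bash -c 'curl -X POST \\
            -u "usr:pass" \\
            -H "content-Type:application/json" \\
            -d "{\\"id\\": \\"1\\"}" \\
            "http://127.0.0.1:8080/tym/api/obj/${profile}/add"' """.stripIndent()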
I'm trying to run a script to pull finance history from Yahoo. Boris's answer from this thread
wget can't download yahoo finance data any more
works for me about 2 out of 3 times, but fails if the crumb returned from the cookie contains a "\" character.
Code that sometimes works looks like this:
#!/usr/bin/sh
symbol=$1
today=$(date +%Y%m%d)
tomorrow=$(date --date='1 days' +%Y%m%d)
first_date=$(date -d "$2" '+%s')
last_date=$(date -d "$today" '+%s')
wget --no-check-certificate --save-cookies=cookie.txt https://finance.yahoo.com/quote/$symbol/?p=$symbol -O C:/trip/stocks/stocknamelist/crumb.store
crumb=$(grep 'root.*App' crumb.store | sed 's/,/\n/g' | grep CrumbStore | sed 's/"CrumbStore":{"crumb":"\(.*\)"}/\1/')
echo $crumb
fileloc=$"https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
echo $fileloc
wget --no-check-certificate --load-cookies=cookie.txt $fileloc -O c:/trip/stocks/temphistory/hs$symbol.csv
rm cookie.txt crumb.store
But that doesn't seem to be processed by wget the way I intend either; it seems to be interpreted as described here:
https://askubuntu.com/questions/758080/getting-scheme-missing-error-with-wget
Any suggestions on how to pass the $crumb variable into wget so that wget doesn't error out if $crumb has a "\" character in it?
Edited to show the full script. To clarify, I've got Cygwin installed with the wget package. I call the script from the cmd prompt as follows (the script above is named "stocknamedownload.sh", the stock symbol I'm downloading is "A", and the start date is 19800101):
c:\trip\stocks\StockNameList>bash stocknamedownload.sh A 19800101
This script seems to work fine - unless the crumb returned contains a "\" character in it.
The following implementation appears to work 100% of the time -- I'm unable to reproduce the claimed sporadic failures:
#!/usr/bin/env bash
set -o pipefail
symbol=$1
today=$(date +%Y%m%d)
tomorrow=$(date --date='1 days' +%Y%m%d)
first_date=$(date -d "$2" '+%s')
last_date=$(date -d "$today" '+%s')
# store complete webpage text in a variable
page_text=$(curl --fail --cookie-jar cookies \
    "https://finance.yahoo.com/quote/$symbol/?p=$symbol") || exit
# extract the JSON used by JavaScript in the page
app_json=$(grep -e 'root.App.main = ' <<<"$page_text" \
    | sed -e 's#^root.App.main = ##' \
          -e 's#[;]$##') || exit
# use jq to extract the crumb from that JSON
crumb=$(jq -r \
    '.context.dispatcher.stores.CrumbStore.crumb' \
    <<<"$app_json" | tr -d '\r') || exit
# Perform our actual download
fileloc="https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
curl --fail --cookie cookies "$fileloc" >"hs$symbol.csv"
Note that the tr -d '\r' is only necessary when using a native-Windows jq mixed with an otherwise native-Cygwin set of tools.
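As an aside, part of why this route is more robust: jq decodes JSON escape sequences inside the crumb, which is the likely source of the stray "\" that breaks the sed-based extraction. A quick illustration with a made-up crumb value:
# made-up crumb value containing a JSON escape sequence
echo '{"CrumbStore":{"crumb":"abc\u002Fdef"}}' | jq -r '.CrumbStore.crumb'
# prints: abc/def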
You are adding quotes to the value of the variable instead of quoting the expansion. You are also trying to use tools that don't know what JSON is to process JSON; use jq.
wget --no-check-certificate \
--save-cookies=cookie.txt \
"https://finance.yahoo.com/quote/$symbol/?p=$symbol" \
-O C:/trip/stocks/stocknamelist/crumb.store
# Something like this; it's hard to reverse engineer the structure
# of crumb.store from your pipeline.
crumb=$(jq 'CrumbStore.crumb' crumb.store)
echo "$crumb"
fileloc="https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
echo "$fileloc"
wget --no-check-certificate \
--load-cookies=cookie.txt "$fileloc" \
-O "c:/trip/stocks/temphistory/hs$symbol.csv"
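If crumb.store ends up holding the page's app JSON (as extracted in the other answer above), the filter would presumably need the full path and -r for raw output; something like:
crumb=$(jq -r '.context.dispatcher.stores.CrumbStore.crumb' crumb.store)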
I'm trying to send a notification via pushover using curl in a bash script.
I cannot get curl -F to interpret the line break correctly though.
curl -s \
-F "token=TOKEN" \
-F "user=USER" \
-F "message=Root Shell Access on HOST \n `date` \n `who` " \
https://api.pushover.net/1/messages.json > NUL
I've tried:
\n
\\\n
%A0
I'd rather push the message out directly, not through a file.
curl doesn't interpret backslash escapes, so you have to insert an actual newline into the argument which curl sees. In other words, you have to get the shell (bash in this case) to interpret the \n, or you need to insert a real newline.
A POSIX standard shell does not interpret C escapes like \n, although the standard utility command printf does. However, bash does provide a way to do it: in the quotation form $'...', C-style backslash escapes will be interpreted. Otherwise, $'...' acts just like '...', so that parameter and command substitutions do not take place.
However, any shell -- including bash -- allows newlines to appear inside quotes, and the newline is just passed through as-is. So you could write:
curl -s \
-F "token=$TOKEN" \
-F "user=$USER" \
-F "message=Root Shell Access on $HOST
$(date)
$(who)
" \
https://api.pushover.net/1/messages.json > /dev/null
(Note: I inserted parameter expansions where it seemed like they were missing from the original curl command and changed the deprecated backtick command substitutions to the recommended $(...) form.)
The only problem with including literal newlines, as above, is that it messes up indentation, if you care about appearances. So you might prefer bash's $'...' form:
curl -s \
-F "token=$TOKEN" \
-F "user=$USER" \
-F "message=Root Shell Access on $HOST"$'\n'"$(date)"$'\n'"$(who)" \
https://api.pushover.net/1/messages.json > /dev/null
That's also a little hard to read, but it is completely legal. The shell allows a single argument ("word") to be composed of any number of quoted or unquoted segments, as long as there is no whitespace between the segments. But you can avoid the multiple quote syntax by predefining a variable, which some people find more readable:
NL=$'\n'
curl -s \
-F "token=$TOKEN" \
-F "user=$USER" \
-F "message=Root Shell Access on $HOST$NL$(date)$NL$(who)" \
https://api.pushover.net/1/messages.json > /dev/null
Finally, you could use the standard utility printf, if you are more used to that style:
curl -s \
-F "token=$TOKEN" \
-F "user=$USER" \
-F "$(printf "message=Root Shell Access on %s\n%s\n%s\n" \
"$HOST" "$(date)" "$(who)")" \
https://api.pushover.net/1/messages.json > /dev/null
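For what it's worth, a quick sanity check in bash confirms that the $'...' form and the NL-variable form build the same string:
NL=$'\n'
[ "line1${NL}line2" = $'line1\nline2' ] && echo "identical"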
wget -q -T 60 --retry-connrefused -t 5 --waitretry=60 --user=ftp2.company.com|company2013 --password=!company2013 -N -P data/parser/company/ ftp://ftp2.company.com/Production/somedata.zip
I'm having trouble with this command, because the password contains an exclamation mark. I tried escaping with \, tried single quotes, and it either gives the output:
wget: missing URL
or
bash: !company2013: event not found
This is really demotivating...
Perhaps this part needs to be quoted to prevent it from being seen as a pipe to another command.
--user='ftp2.company.com|company2013'
And this one too to prevent history expansion with !:
--password='!company2013'
Final:
wget -q -T 60 --retry-connrefused -t 5 --waitretry=60 --user='ftp2.company.com|company2013' --password='!company2013' -N -P data/parser/company/ ftp://ftp2.company.com/Production/somedata.zip
And it's also a good idea to quote the other parts in case they contain spaces later:
wget -q -T 60 --retry-connrefused -t 5 --waitretry=60 --user='ftp2.company.com|company2013' --password='!company2013' -N -P "data/parser/company/" "ftp://ftp2.company.com/Production/somedata.zip"
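If you are typing the command at an interactive bash prompt (which is where the event not found error comes from; history expansion is off in non-interactive scripts), another option is to disable history expansion for the session. A sketch, with most flags trimmed for brevity:
set +H    # turn off bash history expansion so ! is no longer special
wget -q --user='ftp2.company.com|company2013' --password='!company2013' -N -P data/parser/company/ ftp://ftp2.company.com/Production/somedata.zip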
I have written a smoke-testing script that uses Bash and curl to test RESTful web services we're working on. The script reads a file and interprets each line as a URL suffix and parameters for a curl REST call.
Unfortunately, the script gives unexpected results when I adapted it to run HTTP POST calls as well as GET calls. Running the command on its own does not give the same results as running it in the script:
The Bash script:
IFS=$'\n' #Don't split an input URL line at spaces
RESTHOST='hostNameAndPath' #Can't give this out
URL="/activation/v2/activationInfo --header 'Content-Type:Application/xml'"
URL2="/activation/v2/activationInfo"
OUTPUT=`curl -sL -m 30 -w "%{http_code}" -o /dev/null $RESTHOST$URL -d @"./activation_post.txt" -X POST`
echo 'out:' $OUTPUT
OUTPUT2=`curl -sL -m 30 -w "%{http_code}" -o /dev/null $RESTHOST$URL2 --header 'Content-Type:Application/xml' -d @'./activation_post.txt' -X POST`
echo 'out2:' $OUTPUT2
Results Out:
out: 505
out2: 200
So, the first call fails (HTTP return code 505, HTTP Version Not Supported), and the second call succeeds (return code "OK").
Why does the first call fail, and how do I fix it? I've verified that they should execute the same command (by echoing them). I am sure there is something basic I'm missing, as I am just now learning Bash scripting.
I think I have found the problem! It is caused by IFS=$'\n'! Because of this, variable expansion does not work as expected: it prevents the arguments specified in the URL string from being split!
As a result the SERVER_PROTOCOL variable on the server side will be set to '--header Content-Type:Application/xml HTTP/1.1' instead of "HTTP/1.1", and the CONTENT_TYPE will be 'application/x-www-form-urlencoded' instead of 'Application/xml'.
To show the root of the problem in detail:
VAR="Solaris East"
printf "+%s+ " $VAR
echo "==="
IFS=$'\n'
printf "+%s+ " $VAR
Output:
+Solaris+ +East+ ===
+Solaris East+
So the $VAR expansion does not work as expected because of IFS=$'\n'!
Solution: do not use IFS=$'\n', and replace spaces with %20 in the URL!
URL=${URL2// /%20}" --header Content-Type:Application/xml"
In this case your first curl call will work properly!
If you still use IFS=$'\n' and give the --header option on the command line, it will not work properly if the URL contains a space, because the server will fail to process it (I tested on Apache)!
You also cannot use HEADER="--header Content-Type:Application/xml", as expanding $HEADER will result in one(!) argument for curl, namely --header Content-Type:Application/xml, instead of splitting it into two.
So I suggest replacing spaces in the URL with %20 anyway!
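Putting that together, a sketch of the corrected first call (reusing the question's variables, with the default IFS) would be:
URL="${URL2// /%20} --header Content-Type:Application/xml"
OUTPUT=`curl -sL -m 30 -w "%{http_code}" -o /dev/null $RESTHOST$URL -d @"./activation_post.txt" -X POST`
echo 'out:' $OUTPUT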
The single quotes surrounding Content-Type:Application/xml, because they are inside the value of URL, are treated as literal characters and not removed when $URL is expanded in that call to curl. As a result, you are passing an invalid HTTP header. Just use
URL="/activation/v2/activationInfo --header Content-Type:Application/xml"
OUTPUT=`curl -sL -m 30 -w "%{http_code}" -o /dev/null $RESTHOST$URL -d @"./activation_post.txt" -X POST`
However, it's not a great idea to rely on word-splitting like this to combine two separate pieces of the call to curl in a single variable. Try something like this instead:
URLPATH="activation/v2/activationInfo"
HEADERS=("--header" "Content-Type:Application/xml")
OUTPUT=$( curl -SL -m 30 -w "%{http_code}" -o /dev/null "$RESTHOST/$URLPATH" "${HEADERS[@]}" -d @'./activation_post.txt' -X POST )
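The array keeps --header and its value as two separate words regardless of IFS; a quick way to see the word boundaries (purely illustrative):
HEADERS=("--header" "Content-Type:Application/xml")
printf '<%s>\n' "${HEADERS[@]}"
# <--header>
# <Content-Type:Application/xml>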