Can you set multiple cURL --write-out variables to bash variables in a single call - bash

I need to capture multiple cURL --write-out variables so I can access them later in a script. For example:
curl -s --write-out "%{http_code} | %{local_ip} | %{time_total}" "http://endpoint.com/payload"
Now how can I access http_code or local_ip to do things like add them to a bash array, etc.? Is the only option to grep them out of the response?

You can pipe your curl command to a read command:
curl -s --write-out "write-out: %{http_code} | %{local_ip} | %{time_total}\n" "http://yahoo.com" | \
sed -n '/^write-out:/ s///p' | \
while IFS='|' read http_code local_ip time_total;
do
printf "http_code: %s\nlocal_ip: %s\ntotal_time: %s\n" $http_code $local_ip $time_total;
# or in an array
curlvars=($http_code $local_ip $time_total)
for data in "${curlvars[#]}"
do
printf "%s | " $data
done
done
I added a \n to the write-out string so it can be processed as a line.
The sed command extracts the write-out line from the curl output.
In the read command you can define a separator and assign all parsed fields to variables.
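Note that because the while loop runs in a pipeline, the variables it sets live in a subshell and vanish after the loop ends. A minimal alternative sketch (using the question's placeholder endpoint) that keeps the values in the current shell by reading from process substitution instead:
IFS='|' read -r http_code local_ip time_total < <(
    # -o /dev/null discards the body so only the write-out string is printed
    curl -s -o /dev/null --write-out '%{http_code}|%{local_ip}|%{time_total}' "http://endpoint.com/payload"
)
echo "code=$http_code ip=$local_ip time=$time_total"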

Related

How to get a command variable inside another command variable?

Example here:
gitrepo=$(jq -r '.gitrepo' 0.json)
releasetag=$(curl --silent ""https://api.github.com/repos/\"$gitrepo\""/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
echo "$releasetag"
Used \" to escape characters.
0.json:
{
"type": "github-releases",
"gitrepo": "ipfs/go-ipfs"
}
How to put $gitrepo to work inside $releasetag?
Thanks in advance!
Bash variables expand inside double-quoted (") strings.
gitrepo="$(jq -r '.gitrepo' 0.json)"
releasetag="$(
curl --silent "https://api.github.com/repos/$gitrepo/releases/latest" \
| grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/'
)"
echo "$releasetag"
Btw, as you are using jq to extract .gitrepo from 0.json, you could also use it in the exact same way to extract .tag_name from curl's output (instead of using grep and sed) like so:
gitrepo="$(jq -r '.gitrepo' 0.json)"
releasetag="$(
curl --silent "https://api.github.com/repos/$gitrepo/releases/latest" \
| jq -r '.tag_name'
)"
echo "$releasetag"
And to simplify it even further (depending on your use case), just write:
curl --silent "https://api.github.com/repos/$(jq -r '.gitrepo' 0.json)/releases/latest" \
| jq -r '.tag_name'
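If this is needed in more than one place, a possible sketch wrapping the one-liner in a function with basic error handling; the function name latest_release is invented here, and --fail makes curl exit non-zero on HTTP errors:
latest_release() {
    local repo
    repo="$(jq -r '.gitrepo' 0.json)" || return 1
    curl --silent --fail "https://api.github.com/repos/$repo/releases/latest" \
        | jq -r '.tag_name'
}
latest_release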

bash variable as a command: echo the command before execution and save the result to a variable

I am executing a chain of curl commands:
I need to echo the command before the execution.
Execute the command and save the result to a bash variable.
Get values from the result of the execution and execute the next curl with those values.
This is what it looks like:
# -----> step 1 <-----
URL="https://web.example.com:8444/hello.html"
CMD="curl \
--insecure \
--dump-header - \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
JSESSIONID=$(echo $OUT | grep JSESSIONID | awk '{ s = ""; for (i = 2; i <= NF; i++) s = s $i " "; print s }' | xargs)
# Location: https://web.example.com:8444/oauth2/authorization/openam
URL=$(echo $OUT | grep Location | awk '{print $2}')
# -----> step 2 <-----
CMD="curl \
--insecure \
--dump-header - \
--cookie \"$JSESSIONID\" \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
...
# -----> step 3 <-----
...
I only have a problem with step 2: saving the full result of the curl command to a variable so that I can parse it.
I have tried it many different ways, none of them works:
OUT="eval \$CMD"
OUT=\$$CMD
OUT=$($CMD)
...
What did I miss?
For very basic commands, OUT=$($CMD) should work. The problem with this is that strings stored in variables are processed differently than strings entered directly. For instance, echo "a" prints a, but var='"a"'; echo $var prints "a" (note the quotes). Because of that and other reasons, you shouldn't store commands in variables.
In bash, you can use arrays instead. By the way: the naming convention for regular variables is NOT ALLCAPS, as such names might accidentally collide with special variables. Also, you can probably drastically simplify your grep | awk | xargs.
url="https://web.example.com:8444/hello.html"
cmd=(curl --insecure --dump-header - "$url")
printf '%q ' "${cmd[#]}"; echo
out=$("${cmd[#]}")
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
jsessionid=$(awk '/JSESSIONID/ {$1=""; printf "%s%s", d, substr($0,2); d=FS}' <<< "$out")
# Location: https://web.example.com:8444/oauth2/authorization/openam
url=$(awk '/Location/ {print $2}' <<< "$out")
# -----> step 2 <-----
cmd=(curl --insecure --dump-header - --cookie "$jsessionid" "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# -----> step 3 <-----
...
If you have more steps than that, wrap the repeating part into a function, as suggested by Charles Duffy.
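For illustration, a minimal sketch of such a function (the name fetch is invented here): it logs the command to stderr, runs it, and prints the output so it can be captured:
fetch() {
    local url=$1; shift
    local cmd=(curl --insecure --dump-header - "$@" "$url")
    printf '%q ' "${cmd[@]}" >&2; echo >&2
    "${cmd[@]}"
}
out=$(fetch "https://web.example.com:8444/hello.html")
jsessionid=$(awk '/JSESSIONID/ {$1=""; print substr($0,2)}' <<< "$out")
url=$(awk '/Location/ {print $2}' <<< "$out")
out=$(fetch "$url" --cookie "$jsessionid")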
Easy Mode: Use set -x
Bash has a built-in feature, xtrace, which tells it to log every command to the file descriptor named in the variable BASH_XTRACEFD (by default, file descriptor 2, stderr).
#!/bin/bash
set -x
url="https://web.example.com:8444/hello.html"
output=$(curl \
--insecure \
--dump-header - \
"$url")
echo "Output of curl follows:"
echo "$output"
...will provide logs having the form of:
+ url=https://web.example.com:8444/hello.html
++ curl --insecure --dump-header - https://web.example.com:8444/hello.html
+ output=Whatever
+ echo 'Output of curl follows:'
+ echo Whatever
...where the + is based on the contents of the variable PS4, which can be modified to have more information. (I often use and suggest PS4=':${BASH_SOURCE}:$LINENO+' to put the source filename and line number in each logged line).
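For example, a short sketch of that suggestion (the exact trace format depends on your bash version):
#!/bin/bash
PS4=':${BASH_SOURCE}:$LINENO+'
set -x
url="https://web.example.com:8444/hello.html"
# each traced line now starts with the script name and line number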
Doing It By Hand
If that's not acceptable, you can write a function.
log_and_run() {
{ printf '%q ' "$#"; echo; } >&2
"$#"
}
output=$(log_and_run curl --insecure --dump-header - "$url")
...will write your curl command line to stderr before storing its output in $output. Note that when printing that output you need to use quotes: echo "$output", not echo $output.
I guess OUT=$(eval $CMD) will do what you want.
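One caveat worth hedging: quoting the expansion, as in OUT=$(eval "$CMD"), hands the stored string to eval intact instead of letting the shell word-split and glob-expand it first. For example:
CMD='curl --insecure --dump-header - "https://web.example.com:8444/hello.html"'
OUT=$(eval "$CMD")
printf '%s\n' "$OUT"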

Bash, loop unexpected stop

I'm having problems with this last part of my bash script. It receives input from 500 web addresses and is supposed to fetch the server information from each. It works for a bit but then just stops at around the 45th element. Any thoughts on my loop at the end?
#initializing variables
timeout=5
headerFile="lab06.output"
dataFile="fortune500.tsv"
dataURL="http://www.tech.mtu.edu/~toarney/sat3310/lab09/"
dataPath="/home/pjvaglic/Documents/labs/lab06/data/"
curlOptions="--fail --connect-timeout $timeout"
#creating the array
declare -a myWebsitesarray
#obtaining the data file
wget $dataURL$dataFile -O $dataPath$dataFile
#getting rid of the crap from dos
sed -n "s/^m//" $dataPath$dataFile
readarray -t myWebsitesarray < <(cut -f3 -d$'\t' $dataPath$dataFile)
myWebsitesarray=("${myWebsitesarray[#]:1}")
websitesCount=${#myWebsitesarray[*]}
echo "There are $websitesCount websites in $dataPath$dataFile"
#echo -e ${myWebsitesarray[200]}
#printing each line in the array
for line in ${myWebsitesarray[*]}
do
echo "$line"
done
#run each website URL and gather header information
for line in "${myWebsitearray[#]}"
do
((count++))
echo -e "\\rPlease wait... $count of $websitesCount"
curl --head "$curlOptions" "$line" | awk '/Server: / {print $2 }' >> $dataPath$headerFile
done
#display results
echo "Results: "
sort $dataPath$headerFile | uniq -c | sort -n
It would certainly help if you actually passed the --connect-timeout option to curl. As written, you are currently passing the single argument --fail --connect-timeout $timeout rather than 3 distinct arguments --fail, --connect-timeout, and $timeout. This is one instance where you should not quote the variable. IOW, use:
curl --head $curlOptions "$line"
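Alternatively, a sketch that keeps the options in a bash array, so each stays a distinct argument even if a value contains spaces, and the quoting pitfall goes away entirely:
curlOptions=(--fail --connect-timeout "$timeout")
curl --head "${curlOptions[@]}" "$line"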

reading parameters into cURL from a text file in bash

I have the following cURL command to execute a SoapUI test in Jenkins. There I am passing two parameters, variable1 and variable2.
curl --form "project=/build.tool/.jenkins/jobs/$variable1" --form "suite=$variable2" host
I have the following .txt file where values for those variables are in two columns separated by a space. How do I loop through all those values in my cURL command?
#file1.txt
project1.xml TestSuite_1
project1.xml TestSuite_2
project2.xml TestSuite_3
project2.xml TestSuite_4
project3.xml TestSuite_5
I used this on your test input:
while read -r p; do
    var1=$(echo "$p" | awk '{print $1}')
    var2=$(echo "$p" | awk '{print $2}')
    echo "$var1"
    echo "$var2"
done < infile
Where infile contains your data.
It output:
project1.xml
TestSuite_1
project1.xml
TestSuite_2
project2.xml
TestSuite_3
project2.xml
TestSuite_4
project3.xml
TestSuite_5
Now you've got them stored in variables, and you can just add them to your curl command
curl --form "project=/build.tool/.jenkins/jobs/$var1" --form "suite=$var2" host

Simple Server using netcat and curl

I am trying to build a simple server with netcat and curl.
netcat gets data coming from a port and then runs a curl command as follows to send the data to a webservice.
nc -l -k 2233 | while read x ; do curl -X POST -H "Content-Type: application/json" -d '{"DATA": `echo $x` }' https://example.com/FEP ; done
For some reason, the echo $x is not being expanded to the value that was read.
You need to move $x out of the '-delimited string, e.g. -d '{"DATA": "'$x'" }'.
Variables and other special characters in single-quoted strings are never interpreted.
See the example:
x=text
echo single: '$x'
echo double: "$x"
prints
single: $x
double: text
There is no need to echo $x; just use
curl [options] -d '{"DATA": "'$x'" }' https://example.com/FEP ; done
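If the incoming lines may contain quotes or backslashes, a hedged variant that builds the JSON with jq (assuming jq is available) keeps the payload valid:
nc -l -k 2233 | while read -r x; do
    curl -X POST -H "Content-Type: application/json" \
        -d "$(jq -n --arg data "$x" '{DATA: $data}')" \
        https://example.com/FEP
done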
