Simple Server using netcat and curl - bash

I am trying to build a simple server with netcat and curl.
netcat gets data coming from a port and then runs a curl command as follows to send the data to a webservice.
nc -l -k 2233 | while read x ; do curl -X POST -H "Content-Type: application/json" -d '{"DATA": `echo $x` }' https://example.com/FEP ; done
For some reason, the echo $x is not evaluated to the value that was read.

You need to move $x out of the ' delimited string, e.g. -d '{"DATA":"'$x'" }'.
Special characters inside single-quoted strings are never expanded.

See the example:
x=text
echo single: '$x'
echo double: "$x"
prints
single: $x
double: text

There is no need to echo $x; just use
curl [options] -d '{"DATA": "'$x'" }' https://example.com/FEP ; done
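
Putting it together, a sketch of the fixed pipeline (assuming the service expects DATA as a JSON string; if $x can contain quotes or backslashes, building the JSON with a tool like jq would be safer):

nc -l -k 2233 | while read -r x ; do
    curl -X POST -H "Content-Type: application/json" \
         -d '{"DATA": "'"$x"'"}' https://example.com/FEP
done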

Related

bash variable as a command: echo the command before execution and save the result to a variable

I am executing a chain of curl commands:
I need to echo the command before the execution.
Execute the command and save the result to a bash variable.
Get values from the result of the execution and execute the next curl with those values.
This is what it looks like:
# -----> step 1 <-----
URL="https://web.example.com:8444/hello.html"
CMD="curl \
--insecure \
--dump-header - \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
JSESSIONID=$(echo $OUT | grep JSESSIONID | awk '{ s = ""; for (i = 2; i <= NF; i++) s = s $i " "; print s }' | xargs)
# Location: https://web.example.com:8444/oauth2/authorization/openam
URL=$(echo $OUT | grep Location | awk '{print $2}')
# -----> step 2 <-----
CMD="curl \
--insecure \
--dump-header - \
--cookie \"$JSESSIONID\" \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
...
# -----> step 3 <-----
...
I only have a problem with step 2: saving the full result of the curl command to a variable so that I can parse it.
I have tried many different ways; none of them works:
OUT="eval \$CMD"
OUT=\$$CMD
OUT=$($CMD)
...
What am I missing?
For very basic commands, OUT=$($CMD) should work. The problem with this is that strings stored in variables are processed differently from strings entered directly. For instance, echo "a" prints a, but var='"a"'; echo $var prints "a" (note the quotes). Because of that and other reasons, you shouldn't store commands in variables.
In bash, you can use arrays instead. By the way: the naming convention for regular variables is NOT ALLCAPS, as such names might accidentally collide with special variables. Also, you can probably drastically simplify your grep | awk | xargs.
url="https://web.example.com:8444/hello.html"
cmd=(curl --insecure --dump-header - "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
jsessionid=$(awk '/JSESSIONID/ {$1=""; printf "%s%s", d, substr($0,2); d=FS}' <<< "$out")
# Location: https://web.example.com:8444/oauth2/authorization/openam
url=$(awk '/Location/ {print $2}' <<< "$out")
# -----> step 2 <-----
cmd=(curl --insecure --dump-header - --cookie "$jsessionid" "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# -----> step 3 <-----
...
If you have more steps than that, wrap the repeating part into a function, as suggested by Charles Duffy.
Easy Mode: Use set -x
Bash has a built-in feature, xtrace, which tells it to log every command to the file descriptor named in the variable BASH_XTRACEFD (by default, file descriptor 2, stderr).
#!/bin/bash
set -x
url="https://web.example.com:8444/hello.html"
output=$(curl \
--insecure \
--dump-header - \
"$url")
echo "Output of curl follows:"
echo "$output"
...will provide logs having the form of:
+ url=https://web.example.com:8444/hello.html
++ curl --insecure --dump-header - https://web.example.com:8444/hello.html
+ output=Whatever
+ echo 'Output of curl follows:'
+ echo Whatever
...where the + is based on the contents of the variable PS4, which can be modified to have more information. (I often use and suggest PS4=':${BASH_SOURCE}:$LINENO+' to put the source filename and line number in each logged line).
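For example, a minimal sketch using that PS4 value:

#!/bin/bash
PS4=':${BASH_SOURCE}:$LINENO+'   # single quotes: expanded when each trace line is printed
set -x
url="https://web.example.com:8444/hello.html"
curl --insecure --dump-header - "$url"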
Doing It By Hand
If that's not acceptable, you can write a function.
log_and_run() {
{ printf '%q ' "$@"; echo; } >&2
"$@"
}
output=$(log_and_run curl --insecure --dump-header - "$url")
...will write your curl command line to stderr before storing its output in $output. Note when writing that output that you need to use quotes: echo "$output", not echo $output.
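A quick illustration of why those quotes matter:

output=$'line1\nline2'
echo $output      # word splitting collapses the newline: line1 line2
echo "$output"    # prints both lines intact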
I guess OUT=$(eval $CMD) will do what you want.

Bash, loop unexpected stop

I'm having problems with this last part of my bash script. It receives input from 500 web addresses and is supposed to fetch the server information from each. It works for a bit but then just stops at around the 45th element. Any thoughts about my loop at the end?
#initializing variables
timeout=5
headerFile="lab06.output"
dataFile="fortune500.tsv"
dataURL="http://www.tech.mtu.edu/~toarney/sat3310/lab09/"
dataPath="/home/pjvaglic/Documents/labs/lab06/data/"
curlOptions="--fail --connect-timeout $timeout"
#creating the array
declare -a myWebsitearray
#obtaining the data file
wget $dataURL$dataFile -O $dataPath$dataFile
#getting rid of the crap from dos
sed -n "s/^m//" $dataPath$dataFile
readarray -t myWebsitesarray < <(cut -f3 -d$'\t' $dataPath$dataFile)
myWebsitesarray=("${myWebsitesarray[@]:1}")
websitesCount=${#myWebsitesarray[*]}
echo "There are $websitesCount websites in $dataPath$dataFile"
#echo -e ${myWebsitesarray[200]}
#printing each line in the array
for line in ${myWebsitesarray[*]}
do
echo "$line"
done
#run each website URL and gather header information
for line in "${myWebsitearray[#]}"
do
((count++))
echo -e "\\rPlease wait... $count of $websitesCount"
curl --head "$curlOptions" "$line" | awk '/Server: / {print $2 }' >> $dataPath$headerFile
done
#display results
echo "Results: "
sort $dataPath$headerFile | uniq -c | sort -n
It would certainly help if you actually passed the --connect-timeout option to curl. As written, you are currently passing the single argument --fail --connect-timeout $timeout rather than 3 distinct arguments --fail, --connect-timeout, and $timeout. This is one instance where you should not quote the variable. In other words, use:
curl --head $curlOptions "$line"
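
A more robust alternative, in the same spirit as the array technique from the earlier answer (a sketch, not tested against the original data): store the options in a bash array, so each option stays a separate word even though $timeout is safely quoted:

curlOptions=(--fail --connect-timeout "$timeout")
curl --head "${curlOptions[@]}" "$line" | awk '/Server: / { print $2 }' >> $dataPath$headerFile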

Can you set multiple cURL --write-out variables to bash variables in a single call

I need to set or access multiple cURL variables so I can access them later in a script. For example:
curl -s --write-out "%{http_code} | %{local_ip} | %{time_total}" "http://endpoint.com/payload"
Now how can I access http_code or local_ip to do things like add them to a bash array, etc.? Is the only option to grep them out of the response?
You can pipe your curl command to a read command:
curl -s --write-out "write-out: %{http_code} | %{local_ip} | %{time_total}\n" "http://yahoo.com" | \
sed -n '/^write-out:/ s///p' | \
while IFS='|' read http_code local_ip time_total;
do
printf "http_code: %s\nlocal_ip: %s\ntotal_time: %s\n" $http_code $local_ip $time_total;
# or in an array
curlvars=($http_code $local_ip $time_total)
for data in "${curlvars[@]}"
do
printf "%s | " $data
done
done
I added a \n to the write-out string so that it can be processed as a line.
The sed command extracts the write-out line from the curl output.
In the read command you can define a separator and assign all the parsed fields to variables.
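
Note that the pipe runs the while loop in a subshell, so the variables disappear when the loop ends. If you need them in the rest of the script, here is a sketch using process substitution instead (same endpoint; -o /dev/null discards the body so only the write-out line remains):

IFS='|' read -r http_code local_ip time_total < <(
    curl -s -o /dev/null --write-out '%{http_code}|%{local_ip}|%{time_total}\n' "http://yahoo.com"
)
printf "http_code: %s\nlocal_ip: %s\ntotal_time: %s\n" $http_code $local_ip $time_total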

curl: pass a named parameter from stdin

I have a web service that expects some parameters. I want to pass one of these parameters (named "data") to curl via stdin.
I tried
echo -n "some data" | curl -d x="foo" -d y="bar" -d data=@- "http://somewhere"
which doesn't work, as the value of data then is "@-" instead of "some data".
Is it possible to use the @ in curl to associate the input from stdin with a specific parameter?
edit: My goal is to chain multiples web services so the data I pass will be the output of another curl call.
Use -d @-
E.g.:
echo '{"text": "Hello **world**!"}' | curl -d #- https://api.github.com/markdown
Output:
<p>Hello <strong>world</strong>!</p>
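That also covers the chaining mentioned in the edit: the output of one curl call can be piped straight into the next one (hypothetical URLs):

curl -s https://service-one.example/produce | curl -d @- https://service-two.example/consume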
Is this not possible?
curl -d x="foo" -d y="bar" -d data="@some data" "http://somewhere"
Or
curl -d x="foo" -d y="bar" -d data="@$(echo "some data")" "http://somewhere"
echo "some data" can be another command or file input: $(<somefile).
You could also save your data in a variable and then use it like this:
VAR="some data"; curl -d x="foo" -d y="bar" -d data="$VAR" "http://somewhere"
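
If you specifically need the stdin data attached to a named parameter, curl's --data-urlencode may help: its name@filename form URL-encodes the file's content and sends it as name=<content>, and recent curl versions accept - as the filename for stdin (check your version's man page):

echo -n "some data" | curl -d x="foo" -d y="bar" --data-urlencode data@- "http://somewhere"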

Shell integer expression expected?

response=$(curl -sL -w \\n%{http_code} "http://<ip_addr>/api/1/app" -X DELETE)
echo response
if [ "$response" -eq 200 ]
then
echo "Got 200 OK"
else
echo "not getting the result"
fi
What I'm trying to do is to get the HTTP response code.
I'm positive that the response should be 200 OK.
When I run the script I'm getting
{
"result":true
}
200
tst.sh: line 302: [: {
200: integer expression expected
I don't even want to display
{
"result":true
}
I just want to print 200 and compare 200.
Just with the curl command:
curl -sL -w '%{http_code}' "http://<ip_addr>/api/1/app" -X DELETE -o /dev/null
Get the last line of the output.
response=$(curl -sL -w \\n%{http_code} "http://<ip_addr>/api/1/app" -X DELETE | tail -1)
Besides tail -1 you can also use:
awk 'END { print }'
sed -n '$p'
Another way if you're using bash is to remove everything before the last line:
shopt -s extglob
response=${response##*[[:space:]]}
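For example:

response=$'{\n"result":true\n}\n200'
shopt -s extglob
echo "${response##*[[:space:]]}"    # prints: 200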
In your code above, you are missing a $ in your first echo, but no matter. You can strip the {"result":true} from the $response string before the if statement:
response=${response#*\}\ }
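
Putting it together, a minimal sketch of the whole check (placeholder URL kept from the question; -o /dev/null discards the body so only the status code is captured):

response=$(curl -sL -o /dev/null -w '%{http_code}' "http://<ip_addr>/api/1/app" -X DELETE)
echo "$response"
if [ "$response" -eq 200 ]
then
    echo "Got 200 OK"
else
    echo "not getting the result"
fi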
