How to get Http Status Code and Verbose logs by Curl function - bash

We are trying to check the API connection and response with several curl commands as follows, but we have two issues.
-- function
function exec_cmd() {
cmd=$@
echo "curl command:" ${cmd}
response=$(${cmd} --max-time ${timeout_sec} -o ${LogFile} -w '%{http_code}\n' -S)
echo status_code:${response}
}
-- curl command#1 ...
exec_cmd curl -v -X PUT <LBFrontIP/ApiEndpoint1> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{"SystemId":"XXXX", "AppNo":"99999999999"}'
-- curl command#2
exec_cmd curl -v -X PUT <LBFrontIP/ApiEndpoint2> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{"sbSystemId":"XXXX", "email":"XXX@example.com",
"PhoneNo":"99999999999"}'
-- curl command#3
exec_cmd curl -v -X PUT <LBFrontIP/ApiEndpoint3> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '
{
"ID1": "XXXXX",
"ID2": "1q2w3e4r5t",
"applyNo": "00db182faef74e779ca958681127bcec", ...
"docLiveScore": "0.1391"
}'
Issue #1 : cmd=$@ can capture the curl command, but if that curl command contains @ or line breaks, it does not work as expected.
curl command#1 works fine, but curl commands #2 and #3 do not, since bash misinterprets the @ and line breaks in the data option as the end of the command line. Each curl command works on its own, though.
We also tried cmd=$* and cmd="$@", but neither works so far.
Issue #2 : We'd like to get the status_code for a quick check, but at the same time we'd like to capture the detailed response in the log.
Is there a better way to get both the status_code and the verbose log with an appropriate function?
The commands above overwrite the verbose logs...

Given the complexity of the commands involved, I am tempted to say that you should avoid command parsing and read the command lines from a job configuration file.
#!/bin/bash
START=`pwd`
echo ${START}
BASE=`basename "$0" ".sh" `
ThisDATE=`date '+%Y%m%d_%H%M%S' `
LogPrefix="${START}/${BASE}.${ThisDATE}" ; rm -f "${LogPrefix}"*
BatchFile="${START}/${BASE}.${ThisDATE}.config" ; rm -f "${BatchFile}"
timeout_sec=30   # request timeout in seconds (assumed value)
cat >$BatchFile <<-!EnDoFbAtCh
curl -v -X PUT <LBFrontIP/ApiEndpoint1> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{"SystemId":"XXXX", "AppNo":"99999999999"}'
curl -v -X PUT <LBFrontIP/ApiEndpoint2> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{"sbSystemId":"XXXX", "email":"XXX@example.com", "PhoneNo":"99999999999"}'
curl -v -X PUT <LBFrontIP/ApiEndpoint3> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{ "ID1": "XXXXX", "ID2": "1q2w3e4r5t", "applyNo": "00db182faef74e779ca958681127bcec", ... "docLiveScore": "0.1391" }'
!EnDoFbAtCh
function exec_cmd() {
echo "curl command: ${cmd}"
${cmd} --max-time ${timeout_sec} -o ${LogFile} -w '%{http_code}\n' -S
RC=$?
echo "curl exit code:${RC}"   # the HTTP status code from -w is printed to stdout above
}
index=1
while read cmd
do
if [ -z "${cmd}" ] ; then break ; fi
LogFile="${LogPrefix}.${index}.log"
exec_cmd
index=`expr ${index} + 1 `
done < "${BatchFile}"
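If changing how the commands are written is acceptable, another way around the quoting problem is to pass the curl arguments to the function directly ("$@") and send the -v trace to its own file, so the status code from -w and the verbose log no longer overwrite each other. A minimal sketch, not the original script: the 30-second timeout and the .body/.verbose file suffixes are assumptions.
function exec_cmd() {
LogFile=$1 ; shift                      # first argument: log prefix for this call
echo "curl command: $*"
status=$("$@" --max-time 30 -sS -v -o "${LogFile}.body" -w '%{http_code}' 2> "${LogFile}.verbose")
echo "status_code:${status}"            # HTTP status from -w; the -v trace is in ${LogFile}.verbose
}
exec_cmd "${LogPrefix}.1" curl -X PUT <LBFrontIP/ApiEndpoint2> -H 'Content-Type:application/json; charset=utf-8' -H 'Host:<HostName>' -d '{"sbSystemId":"XXXX", "email":"XXX@example.com", "PhoneNo":"99999999999"}'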

Related

curl: (3) URL using bad/illegal format or missing URL AND HTTP/1.1 403 Forbidden

I have a problem with this code. It states that there is a URL issue, but I don't see how that can be right. I am not responsible for the server, though. I just wanted to test some IoT properties with basic Linux commands.
#!/bin/bash
a=Client secret
c=Custom domain
b=Client id:
#!/bin/bash
response=$(curl -X POST --user $b:$a "https://${c}.auth.eu-central-1.amazoncognito.com/oauth2/token?grant_type=client_credentials" -H 'Content-Type: application/x-www-form-urlencoded')
token=$(jq -r '.access_token' <<< "$response")
secondsValid=$(jq -r '.expires_in' <<< "$response")
refreshToken=$(jq -r '.refresh_token' <<< "$response")
for i in {1..4}
do
declare -i TIME
TIME=$(date +%s)
TEMP=$(sensors -j | jq '."cpu_thermal-virtual-0"."temp1"."temp1_input"')
curl -i -k -X POST -H "Authorization: $token"-H "Content-Type: application/json" --data '{ "id":"2","timestamp":"'${TIME}'","data":"'${TEMP}'"}' https://dv7knsjzph.execute-api.eu-central-1.amazonaws.com/prod/boxtronic-devices/2/data/
sleep 1
echo "$refreshToken"
echo "$secondsValid"
echo "$TIME"
done
Does anyone know why I am getting the error?
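No answer is recorded here, but two quoting details are likely involved: the unquoted --user $b:$a splits on the spaces in the placeholder values, so curl treats the leftover words as URLs (hence error 3), and the missing space in "Authorization: $token"-H glues the second -H onto the Authorization value, so the server never sees a clean Authorization or Content-Type header, which could explain the 403. A hedged sketch with the variables quoted (placeholders kept from the question):
a='Client secret'
b='Client id'
c='Custom domain'
response=$(curl -s -X POST --user "$b:$a" "https://${c}.auth.eu-central-1.amazoncognito.com/oauth2/token?grant_type=client_credentials" -H 'Content-Type: application/x-www-form-urlencoded')
token=$(jq -r '.access_token' <<< "$response")
TIME=$(date +%s)
TEMP=$(sensors -j | jq '."cpu_thermal-virtual-0"."temp1"."temp1_input"')
curl -i -k -X POST -H "Authorization: $token" -H "Content-Type: application/json" --data '{"id":"2","timestamp":"'"$TIME"'","data":"'"$TEMP"'"}' https://dv7knsjzph.execute-api.eu-central-1.amazonaws.com/prod/boxtronic-devices/2/data/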

cURL using bash script with while read text file

Hi there, is anyone having the same trouble as me?
Whenever I cURL the $list from list.txt, it just displays {}, which is a blank response from the API. Should my code be working properly, or is it just a bug?
I know the $list is working because I can update the database status.
Please help, this is a bit urgent :(
#! /bin/bash
filename=/var/lib/postgresql/Script/list.txt
database='dbname'
refLink='URL'
authorization='Authorization: Basic zxc'
expireDate=$(date -d "+3 days")
body="Message."
while IFS=' ' read -r list
do
wow=$(curl --location --request POST $refLink \
--header 'Authorization: Basic $authorization' \
--header 'Content-Type: application/json' \
--data-raw '{
"title":"Expiration Notice",
"body":"$body",
"min" :[{"mobileNumber" : "$list"}],
"type" : "Notification",
"action_type" : "NotificationActivity"}')
echo "result: '$result'"
RESP=$(echo "$result" | grep -oP "^[^a-zA-Z0-9]")
echo "RESP:'$RESP'"
echo $body
#echo $wow >> logs.txt
psql -d $database -c "UPDATE tblname SET status='hehe' WHERE mobile='$list'"
done < $filename
Your "$list" JSON entry is not populated with the content of the $list variable because it is within single quotes of the --data-raw curl parameter.
What you need is to compose your JSON data for the query beforehand, preferably with the help of jq or some other JSON processor, before sending it as an argument to curl's POST request.
Multiple faults in your script (not exhaustive):
Shebang is wrong with a space #! /bin/bash
expireDate=$(date -d "+3 days") returns the date in locale format, which may not be what you need for your request (see the short example after this list).
The request and response data are not processed with JSON-aware tools; grep is not appropriate for JSON data.
These are some clues, but I cannot fix your script further without knowing more about the API answers and the functions you use.
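For instance, the two forms differ roughly like this (exact output depends on the locale and the current date):
date -d "+3 days"      # e.g. Tue Mar  1 10:15:02 CET 2022
date -R -d "+3 days"   # e.g. Tue, 01 Mar 2022 10:15:02 +0100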
Anyway here is how you can at least compose a proper JSON request:
#!/usr/bin/env bash
filename='/var/lib/postgresql/Script/list.txt'
database='dbname'
refLink='URL'
authorization='zxc'
expireDate=$(date -R -d "+3 days")
body="Message."
while IFS=' ' read -r list; do
raw_json="$(
jq -n --arg bdy "$body" --arg mobN "$list" \
'.action_type="NotificationActivity"|.title="Expiration Notice"|.type="Notification"|.body=$bdy|.min[0].mobileNumber=$mobN|.'
)"
json_reply="$(curl --location --request POST "$refLink" \
--header "Authorization: Basic $authorization" \
--header 'Content-Type: application/json' \
--data-raw "$raw_json")"
echo "json_reply: '$json_reply'"
echo "$body"
# psql -d "$database" -c "UPDATE tblname SET status='hehe' WHERE mobile='$list'"
done <"$filename"
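For reference, with body set to "Message." and a hypothetical list value of 09171234567, the jq filter above emits properly escaped JSON along these lines:
{
  "action_type": "NotificationActivity",
  "title": "Expiration Notice",
  "type": "Notification",
  "body": "Message.",
  "min": [
    {
      "mobileNumber": "09171234567"
    }
  ]
}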

Passing arguments values to shell script

I'm using the following script to purge the cache from the CDN:
#!/bin/bash
## API keys ##
zone_id=""
api_key=""
login_id=""
akamai_crd=""
## URL ##
urls="$1"
[ "$urls" == "" ] && { echo "Usage: $0 url"; exit 1; }
echo "Purging $urls..."
curl -X DELETE "https://api.cloudflare.com/client/v4/zones/${zone_id}/purge_cache" \
-H "X-Auth-Email: ${login_id}" \
-H "X-Auth-Key: ${api_key}" \
-H "Content-Type: application/json" \
--data "{\"files\":[\"${urls}\"]}"
#echo "CF is done now purging from Akamai ..."
echo "..."
curl -v -s https://api.ccu.akamai.com/ccu/v2/queues/default -H "Content-Type:application/json" -d '{"objects":["$urls"]}' -u $akamai_crd
The 1st part, for Cloudflare, is working fine; the 2nd part fails when I pass it to Akamai as
["$urls"]
I keep getting an error: even though the url is passed as an argument, the request contains the variable itself ($urls), not the arg value.
I ran the script as follows:
sh +x script.sh url
Any advice here?
First, I would change this:
SCRIPTNAME=$(basename "$0")
...
if [ $# != 1 ]; then
echo "Usage: $SCRIPTNAME url"
exit
fi
urls="$1"
Change your second curl command to the following (you need to escape the quotes):
--data "{\"files\":[\"${urls}\"]}"
Avoid creating JSON by hand like this; you can't guarantee that the resulting JSON is properly escaped. Use a tool like jq instead.
#!/bin/bash
## API keys ##
zone_id=""
api_key=""
login_id=""
akamai_crd=""
## URL ##
url=${1:?Usage: $0 url}
headers=(
-H "X-Auth-Email: $login_id"
-H "X-Auth-Key: $api_key"
-H "Content-Type: application/json"
)
purge_endpoint="https://api.cloudflare.com/client/v4/zones/${zone_id}/purge_cache"
echo "Purging $url..."
jq -n --arg url "$url" '{files: [$url]}' |
curl -X DELETE "$purge_endpoint" "${headers[@]}" --data @-
#echo "CF is done now purging from Akamai ..."
echo "..."
jq -n --arg url "$url" '{objects: [$url]}' |
curl -v -s https://api.ccu.akamai.com/ccu/v2/queues/default -H "Content-Type:application/json" -d @- -u "$akamai_crd"
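For what it's worth, the jq pipe can be tested on its own (URL purely illustrative); --data @- and -d @- then make curl read that generated body from standard input:
jq -n --arg url "https://www.example.com/style.css" '{files: [$url]}'
# emits a pretty-printed object equivalent to {"files":["https://www.example.com/style.css"]}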

Unable to send large files to elasticsearch using curl: argument too long

This is the script I used to export some documents to Elasticsearch, but no luck:
#!/bin/ksh
set -v
trap read debug
date=$(date +%Y-%m-%d);
echo $date;
config_file="/home/p.sshanm/reports_elastic.cfg";
echo $config_file;
URL="http://p-acqpes-app01.wirecard.sys:9200/reports-"$date"";
echo $URL;
find /transfers/documents/*/done/ -type f -name "ABC-Record*_${date}*.csv"|
while IFS='' read -r -d '' filename
do
echo "filename : ${filename}"
var=$(base64 "$filename"| perl -pe 's/\n//g');
# if I use the line below, it will fail with "argument too long", so I used curl's @ option
# var1= $(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
var1=$(curl -X PUT -H "Content-Type: application/json" -d @- "$URL" >>CURLDATA
{ "data": "$var" }
CURL_DATA)
done;
If I use it as below:
var1= $(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
it will fail as below, so I used curl's @ option:
argument too long
Your syntax for reading from stdin is wrong: the here-doc operator should have been <<, and the delimiters are mismatched; use CURL_DATA in both places.
curl -X PUT -H "Content-Type: application/json" -d @- "$URL" <<CURL_DATA
{ "data": "$var" }
CURL_DATA
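If the base64 payload is very large, another option is to build the JSON in a temporary file and let curl read it from disk with @file; since printf is a shell builtin, the large variable never ends up on an exec'd command line. A sketch only, with illustrative temp-file handling:
tmp_json=$(mktemp)
printf '{ "data": "%s" }' "$var" > "$tmp_json"
curl -X PUT -H "Content-Type: application/json" --data-binary "@$tmp_json" "$URL"
rm -f "$tmp_json"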

if condition in shell scripting based on 404 unauthorized error

Here is my shell script. I want to add a condition to my curl commands based on the status I get. I need help grepping "HTTP/1.1 401 Unauthorized" from the first curl command. After that I need to put it in a condition: if the status is 401, execute the 2nd and 3rd curl commands. Please help.
STATUS="HTTP/1.1 401 Unauthorized"
read $STATUS
for ((i=1; i<=2000; i++)); do
curl -i -vs -X POST -D post.txt -H "$SESSION_TOKEN" -H "$AUTH_TOKEN" -H "Accept:$ACCEPT_HEADER" -H "Content-Type:text/plain" "http://$BASE_URI/api/$PLATFORM//$CHANNEL_ID/subscription" | grep -e "*Unauth*" >> post.txt
if [$STATUS]
then
curl -i -vs -X POST -D tmp.txt -H "Content-Type:text/plain" --data "$SECRET" -H "Accept:$ACCEPT_HEADER" -H "Connection:close" http://$BASE_URI/api/${PLATFORM}/authenticate >> tmp1.txt
SESSION_TOKEN=`grep SessionToken tmp.txt`
curl -i -vs -X POST -D tmp2.txt -H "$SESSION_TOKEN" -H "Content-Type:text/plain" -H "Content-Type:application/json" --data "{ \"username\":\"$USER_NAME\",\"password\":\"$PASSWORD\", \"rememberMe\":\"false\"}" http://$BASE_URI/api/web/users/authenticate >> data.json
AUTH_TOKEN=`grep Authorization tmp2.txt`
continue
fi
done
The proper solution:
res=$(curl -s -o /dev/null -w '%{http_code}\n' http://google.fr)
if ((res == 404)); then
echo "404 spotted"
fi
or
res=$(curl -s -o /dev/null -w '%{http_code}\n' http://google.fr)
if [[ $res == 404 ]]; then
echo "404 spotted"
fi
Check
man curl | less +/'^ *-w'
You have to capture the curl output and examine the status line:
output=$( curl -i ... )
if [[ $output == "$STATUS"* ]]; then
# I was unauthorized
else
# ...
fi
Be aware that [ (and the [[ I use in my answer) is not mere syntax; it is a command, and as such it requires a space between it and its arguments.
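Applied to the question, a sketch of the 401 check could look like this (URL and headers are the asker's placeholders, and -o replaces -D here so the response body lands in post.txt):
status=$(curl -s -o post.txt -w '%{http_code}' -X POST -H "$SESSION_TOKEN" -H "$AUTH_TOKEN" -H "Accept:$ACCEPT_HEADER" -H "Content-Type:text/plain" "http://$BASE_URI/api/$PLATFORM/$CHANNEL_ID/subscription")
if [[ $status == 401 ]]; then
echo "unauthorized, refreshing tokens"
# re-run the two authentication curl calls from the question here
fi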
