Pass jq output to curl - bash

I want to use a number of values from jq, piped to curl. I have JSON data with a structure that looks something like this:
{
  "apps": {
    "code": "200",
    "name": "app1"
  },
  "result": "1"
}
{
  "apps": {
    "code": "200",
    "name": "app2"
  },
  "result": "5"
}
...
...
...
What I want to do is take the value of every apps.name and result and pass them to curl:
curl -XPOST http://localhost --data-raw 'app=$appsName result=$result'
The curl command should then run X number of times, depending on how many objects there are.
In jq I came up with this, but I don't know the best way of passing it to curl in a loop:
jq -r '.result as $result | .apps as $apps | $apps.name + " " + $result' myfile.json

Maybe something like this?
jq -r '[.apps.name, .result] | @tsv' < data.json | while read -r appsName result; do
    curl -XPOST http://localhost --data-raw "$appsName $result"
done

jq -r '"\( .apps.name ) \( .result )"' |
while read -r app result; do
curl -XPOST http://localhost --data-raw "app=$app result=$result"
done
or
jq -r '"app=\( .apps.name ) result=\( .result )"' |
while read -r data; do
curl -XPOST http://localhost --data-raw "$data"
done
The first one assumes the name can't contain whitespace.
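If the name can contain spaces, splitting only on the tab that @tsv emits keeps the two fields apart (@tsv escapes any tabs inside the values themselves); a minimal sketch, reusing the hypothetical endpoint above:
jq -r '[.apps.name, .result] | @tsv' myfile.json |
while IFS=$'\t' read -r app result; do
    curl -XPOST http://localhost --data-raw "app=$app result=$result"
done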

Would this do it?
while read -r line; do
    curl -XPOST http://localhost --data-raw "$line"
done < <(jq -r '"app=\(.apps.name) result=\(.result)"' myfile.json)
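Feeding the loop through process substitution rather than a plain pipe also keeps the while body in the current shell, so any variables it sets are still visible after the loop finishes.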

I think xargs is better than while loops. Maybe just personal preference.
jq -r '"app=\( .apps.name ) result=\( .result )"' \
| xargs -r -I{} curl -XPOST http://localhost --data-raw "{}"
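If the values may contain quotes or other characters that xargs would mangle, a NUL-delimited variant is safer; a sketch, assuming GNU xargs (for -0 combined with -I) and a jq build that emits \u0000 faithfully under -j:
jq -j '"app=\(.apps.name) result=\(.result)\u0000"' myfile.json |
xargs -0 -r -I{} curl -XPOST http://localhost --data-raw "{}"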

Related

Iterate through the json object and run curl command as variable from json output using jq

I would like to filter the JSON object while iterating through it and run a curl command over each item from the output.
The JSON object:
{
  "repo": "releases",
  "path": "/apps/releases",
  "created": "2021-04-01T10:12:23.496-01:00",
  "children": [
    {
      "uri": "/Image1",
      "folder": true,
      "created": "2022-08-09T17.12.22.987.04.000"
    },
    {
      "uri": "/Image2",
      "folder": true,
      "created": "2022-06-10T10.12.22.412.10.000"
    },
    {
      "uri": "/Image3",
      "folder": true,
      "created": "2022-10-10T07.03.14.742.01.000"
    },
    {
      "uri": "/Image4",
      "folder": true,
      "created": "2022-10-10T07.010.11.542.08.000"
    }
  ]
}
I am looking for some logic that will iterate through the uri values under children and pass each one to the curl command as $i, which would be Image1, Image2 and Image3.
curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/${i}"
When I run the command below, the output is as follows.
for i in $(curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/" | jq -c '.children[].uri')
Output: ["/Image1", "/Image2", "/Image3"]
I tried the following command, but in the output it replaces ${i} with only Image3; somehow it is not taking Image1 and Image2.
for i in $(curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/" | jq -r '.children[].uri'); do curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/${i}"; done
curl can read URLs to fetch from a file, which you can generate with jq. Something like
base=https://artifactory.com/api/releases/baseimage
curl -k -s --user user:password -X GET "$base/" |
jq -r --arg b "$base" '.children[].uri | "url = \"\($b)/\(.)\""' |
curl -k -s --user user:password -X GET --config -
Just one curl process to fetch all the individual images, and no shell loop needed.
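For the sample document above, the jq filter writes one url = line per child for curl's --config - to read, roughly:
url = "https://artifactory.com/api/releases/baseimage//Image1"
url = "https://artifactory.com/api/releases/baseimage//Image2"
...
(The doubled slash appears because the uri values already start with /; drop the extra / in the filter if the server objects.)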
You might find it easier to construct the for loop along the following lines:
for uri in $( echo "$json" | jq -r '.children[].uri' ); do
    echo curl ... ${uri}...
done
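A while read loop is another way to sidestep the word-splitting and quoting pitfalls of a for loop over a command substitution; a sketch, assuming the same endpoint and credentials as above:
base=https://artifactory.com/api/releases/baseimage
curl -k -s --user user:password "$base/" |
jq -r '.children[].uri' |
while read -r uri; do
    curl -k -s --user user:password "$base$uri"
done
Since each $uri already begins with a slash, it is appended to $base without an extra /.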

cURL using bash script with while read text file

Hi there, is anyone having the same trouble as me?
Whenever I cURL the $list from list.txt it just displays {}, which is a blank response from the API. Should my code be working properly, or is this a bug? I know the $list is working because I can update the database status.
Please help, this is a bit urgent :(
#! /bin/bash
filename=/var/lib/postgresql/Script/list.txt
database='dbname'
refLink='URL'
authorization='Authorization: Basic zxc'
expireDate=$(date -d "+3 days")
body="Message."
while IFS=' ' read -r list
do
    wow=$(curl --location --request POST $refLink \
        --header 'Authorization: Basic $authorization' \
        --header 'Content-Type: application/json' \
        --data-raw '{
            "title":"Expiration Notice",
            "body":"$body",
            "min" :[{"mobileNumber" : "$list"}],
            "type" : "Notification",
            "action_type" : "NotificationActivity"}')
    echo "result: '$result'"
    RESP=$(echo "$result" | grep -oP "^[^a-zA-Z0-9]")
    echo "RESP:'$RESP'"
    echo $body
    #echo $wow >> logs.txt
    psql -d $database -c "UPDATE tblname SET status='hehe' WHERE mobile='$list'"
done < $filename
Your "$list" JSON entry is not populated with the content of the $list variable because it is within single quotes of the --data-raw curl parameter.
What you need is compose your JSON data for the query before-hand, preferably with the help of jq or some other JSON processor, before sending it as argument to the curl's POST request.
Multiple faults in your scripts (not exhaustive):
Shebang is wrong with a space #! /bin/bash
expireDate=$(date -d "+3 days") return date in locale format and this may not be what you need for your request.
The request and the response data are not processed with JSON grammar aware tools. grep is not appropriate for JSON data.
Some clues but cannot fix your script more without knowing more about the API answers and functions you use.
Anyway here is how you can at least compose a proper JSON request:
#!/usr/bin/env bash

filename='/var/lib/postgresql/Script/list.txt'
database='dbname'
refLink='URL'
authorization='zxc'
expireDate=$(date -R -d "+3 days")
body="Message."

while IFS=' ' read -r list; do
    raw_json="$(
        jq -n --arg bdy "$body" --arg mobN "$list" \
            '.action_type="NotificationActivity" | .title="Expiration Notice" | .type="Notification" | .body=$bdy | .min[0].mobileNumber=$mobN'
    )"
    json_reply="$(curl --location --request POST "$refLink" \
        --header "Authorization: Basic $authorization" \
        --header 'Content-Type: application/json' \
        --data-raw "$raw_json")"
    echo "json_reply: '$json_reply'"
    echo "$body"
    # psql -d "$database" -c "UPDATE tblname SET status='hehe' WHERE mobile='$list'"
done <"$filename"

Parse curl response to variable and use it in curl

I have the following script:
#!/bin/bash
TOKEN=$(curl -isX POST 'http://localhost:3005/auth/tokens' \
    -H 'Content-Type: application/json' \
    -d '{
        "name": "test@test.de",
        "password": "1234"
    }' | grep X-Subject-Token | sed "s/X-Subject-Token: //g")
echo $TOKEN
curl --trace test.txt -X POST "http://localhost:3005/v1/users" \
    -H "Content-Type: application/json" \
    -H "X-Auth-token: $TOKEN" \
    -d '{
        "user": {
            "username": "alice",
            "email": "alice@test.com",
            "password": "test"
        }
    }'
The command echo $TOKEN prints the right result (something like 35be3d05-7f80-4b11-ad20-7a7110e9d3a7). From the last curl request I get the following error:
curl: (52) Empty reply from server
If I write TOKEN="35be3d05-7f80-4b11-ad20-7a7110e9d3a7" above the last curl request, the request works. So I guess there is something wrong with the TOKEN variable.
Kind regards
EDIT:
Output from declare -p TOKEN:
"eclare -- TOKEN="6770806a-1230-4f64-b519-1841e9deb5f1
I had to remove a carriage return. Thanks to @chepner!
Solution:
TOKEN=$(curl -isX POST 'http://localhost:3005/auth/tokens' \
    -H 'Content-Type: application/json' \
    -d '{
        "name": "test@test.de",
        "password": "1234"
    }' | grep X-Subject-Token | sed "s/X-Subject-Token: //g" | tr -d '\r')
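If you prefer, the grep | sed | tr chain can be collapsed into one awk call that strips the trailing carriage return at the same time; a sketch against the same endpoint:
TOKEN=$(curl -isX POST 'http://localhost:3005/auth/tokens' \
    -H 'Content-Type: application/json' \
    -d '{"name": "test@test.de", "password": "1234"}' |
    awk '/^X-Subject-Token:/ { sub(/\r$/, ""); print $2 }')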

Unable to send large files to elasticsearch using curl: argument too long

This is the script I used to export some documents to Elasticsearch, but no luck:
#!/bin/ksh
set -v
trap read debug
date=$(date +%Y-%m-%d);
echo $date;
config_file="/home/p.sshanm/reports_elastic.cfg";
echo $config_file;
URL="http://p-acqpes-app01.wirecard.sys:9200/reports-"$date"";
echo $URL;
find /transfers/documents/*/done/ -type f -name "ABC-Record*_${date}*.csv" |
while IFS='' read -r -d '' filename
do
    echo "filename : ${filename}"
    var=$(base64 "$filename" | perl -pe 's/\n//g');
    # if I use the below it will fail with "argument too long", so I used curl's @ option
    # var1=$(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
    var1=$(curl -X PUT -H "Content-Type: application/json" -d @- "$URL" >>CURLDATA
    { "data": "$var" }
    CURL_DATA)
done;
If I use it as below:
var1=$(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
it fails with "argument too long", so I used curl's @ option instead.
Your syntax to read from stdin is wrong: the here-doc operator should have been <<, and the delimiters are mismatched; use CURL_DATA at both places.
curl -X PUT -H "Content-Type: application/json" -d @- "$URL" <<CURL_DATA
{ "data": "$var" }
CURL_DATA
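As an aside, GNU coreutils base64 can emit a single unwrapped line directly, which avoids the perl step in the loop above (an assumption: the -w flag is GNU-specific and not available in some BSD builds):
var=$(base64 -w 0 "$filename")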

Curl with multiline of JSON

Consider the curl command below: is it possible to allow newlines in the JSON (without minifying it) and execute it directly in bash (Mac/Ubuntu)?
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
-d \
'
{
  "field1": "test",
  "field2": {
    "foo": "bar"
  }
}'
When I run the command above, an error seems to occur at the second {.
How do I fix the command?
Update: I was actually able to run the command without issue previously; I am not sure why the problem started happening recently.
I remembered another way to do this with a "Here Document", as described in the Bash man page and detailed here. The @- means to read the body from stdin, while << EOF means to pipe the document content until "EOF" as stdin to curl. This layout may be easier to read than using separate files or the "echo a variable" approach.
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: application/json; charset=utf-8' \
--data-binary @- << EOF
{
  "field1": "test",
  "field2": {
    "foo": "bar"
  }
}
EOF
NOTE: Use the --trace <outfile> curl option to record exactly what goes over the wire. For some reason, this Here Document approach strips newlines. (Update: Newlines were stripped by curl -d option. Corrected!)
Along the lines of Martin's suggestion of putting the JSON in a variable, you could also put the JSON in a separate file, and then supply the filename to -d using curl's @ syntax:
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
-d @myfile.json
The disadvantage is obvious (2 or more files where you used to have one.) But on the plus side, your script could accept a filename or directory argument and you'd never need to edit it, just run it on different JSON files. Whether that's useful depends on what you are trying to accomplish.
For some reason, this Here Document approach strips newlines
@eric-bolinger the reason the heredoc strips newlines is that you need to tell your heredoc to preserve newlines by quoting the EOF:
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
-d @- <<'EOF'
{
  "field1": "test",
  "field2": {
    "foo": "bar"
  }
}
EOF
Notice the single-ticks surrounding EOF the first time it's defined, but not the second.
You should use outer double quotes, and then escape all inner quotes, like this:
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
-d \
"
{
  \"field1\": \"test\",
  \"field2\": {
    \"foo\": \"bar\"
  }
}"
You could assign your JSON to a variable:
json='
{
  "field1": "test",
  "field2": {
    "foo": "bar"
  }
}'
Now you can forward this to curl using stdin:
echo "$json" | curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
-d @-
I think this can be an answer:
curl -0 -v -X POST http://www.example.com/api/users \
-H "Expect:" \
-H 'Content-Type: text/json; charset=utf-8' \
--data-raw '
{
  "field1": "test",
  "field2": {
    "foo": "bar"
  }
}'
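This works because --data-raw passes its argument to the request body verbatim and gives @ no special treatment, so the embedded newlines survive. By contrast, -d strips newlines from data it reads via @file or @-, which is why the earlier heredoc attempt lost them; --data-binary @- is the variant that reads stdin unmodified.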
