Read JSON objects from a file and pass each of them to cURL - shell

I have a file access.log containing timestamps and JSON objects delimited by a newline:
09:52:11 { "key1": "value", "key2": 2, "key3": true }\n
09:52:13 { "key4": "value2", "key5": 5, "key6": false }\n
09:55:33 { "key7": "value7", "key8": 8, "key9": true }\n
...
I need to read the file line by line, extract JSON objects and pass each of them as a request body to cURL. I tried to solve the problem with xargs but it passes {} to the request body:
cat access.log | cut -c10- | xargs -0 -I {} curl -X POST -H "Content-Type: application/json" http://localhost:8080/api/ -d '{}'
cat access.log | cut -c10- | xargs -0 -I {} curl -X POST -H "Content-Type: application/json" http://localhost:8080/api/ -d {}
I have a mistake somewhere in the syntax. What am I doing wrong?
UPDATE: \n is not a literal value in the file. So by issuing cat access.log the result is:
09:52:11 { "key1": "value", "key2": 2, "key3": true }
09:52:13 { "key4": "value2", "key5": 5, "key6": false }
09:55:33 { "key7": "value7", "key8": 8, "key9": true }
...

You shouldn't pass -0 to xargs: it makes xargs expect the NUL character as the separator. Since your input contains no NUL characters, xargs treats the whole input as one record and passes it all to a single curl invocation. Removing the -0 fixes it.
In addition, the -I option makes xargs use newlines as the delimiter, instead of spaces.
It is easier to see with echo instead of curl. Compare the output from
cat aa.log | cut -c10- | xargs -0 -I {} echo %%'{}'%% which is
%%{ "key1": "value", "key2": 2, "key3": true }
{ "key4": "value2", "key5": 5, "key6": false }
{ "key7": "value7", "key8": 8, "key9": true }
%%
to the output from cat aa.log | cut -c10- | xargs -I {} echo %%'{}'%% which is
%%{ key1: value, key2: 2, key3: true }%%
%%{ key4: value2, key5: 5, key6: false }%%
%%{ key7: value7, key8: 8, key9: true }%%
P.S. You can also use the -t option to see the commands that xargs builds.
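Putting it together, a minimal sketch of the fixed pipeline (assuming GNU xargs: its -d option sets an explicit delimiter and turns off quote processing, so the double quotes inside the JSON survive, unlike in the plain -I run above):
cut -c10- access.log |
xargs -d '\n' -I {} curl -X POST -H "Content-Type: application/json" http://localhost:8080/api/ -d '{}'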

Use:
#!/bin/bash
# strip the literal "\n" sequences; they were breaking the jq command below
sed 's/\\n//g' access.log > access_temp.log
while IFS='' read -r line || [[ -n "$line" ]]; do
    # drop the leading timestamp field, then compact the JSON with jq
    json_string=$(echo "$line" | awk '{$1=""; print $0}' | jq -c .)
    curl -X POST -H "Content-Type: application/json" -d "$json_string" http://localhost:8080/api/
done < "access_temp.log"
rm access_temp.log
The sed at the beginning is there because the literal '\n' was messing up the jq command, so I had to remove it first.
You can add echo statements, or put set -x at the beginning, to check the values being passed.

I think this should work: wrap xargs around a lightweight shell such as dash:
printf '%s' '
09:52:11 { "key1": "value", "key2": 2, "key3": true }\n
09:52:13 { "key4": "value2", "key5": 5, "key6": false }\n
09:55:33 { "key7": "value7", "key8": 8, "key9": true }\n' |
mawk NF=NF FS='^[^{]*' OFS= ORS='\0' |
xargs -0 dash -v -x -c '
for __; do
curl -s -X POST -H '\''Content-Type: application/json'\'' \
--url '\''https://localhost:8080/api'\'' -d "$__"
done ' _
+ curl -s -X POST -H Content-Type: application/json --url https://localhost:8080/api -d { "key1": "value", "key2": 2, "key3": true }
+ curl -s -X POST -H Content-Type: application/json --url https://localhost:8080/api -d { "key4": "value2", "key5": 5, "key6": false }
+ curl -s -X POST -H Content-Type: application/json --url https://localhost:8080/api -d { "key7": "value7", "key8": 8, "key9": true }
The -v(erbose) -x(trace) flags for dash are totally optional, and in this situation they could be a rough proxy for xargs -t.

Related

Iterate through the json object and run curl command as variable from json output using jq

I would like to filter the JSON object while iterating through it and run a curl command over each item of the output.
JSON object:
{
"repo": "releases",
"path": "/apps/releases",
"created": "2021-04-01T10:12:23.496-01:00",
"children": [
{
"uri": "/Image1",
"folder": true,
"created": 2022-08-09T17.12.22.987.04.000
},
{
"uri": "/Image2",
"folder": true,
"created": 2022-06-10T10.12.22.412.10.000
},
{
"uri": "/Image3",
"folder": true,
"created": 2022-10-10T07.03.14.742.01.000
},
{
"uri": "/Image4",
"folder": true,
"created": 2022-10-10T07.010.11.542.08.000
}
]
}
Looking for some logic that will iterate through the uri values under children and pass each one to the curl command as ${i}, which would be Image1, Image2 and Image3.
curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/${i}"
When I run the command below, the output is as follows:
for i in $(curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/" | jq -c '.children[] | .uri')
Output: ["/Image1", "/Image2", "/Image3"]
I tried the following command, but in the output it replaces ${i} with only Image3; somehow it is not taking Image1 and Image2.
for i in $(curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/" | jq -r '.children[] | .uri'); do curl -k -s --user user:password -X GET "https://artifactory.com/api/releases/baseimage/${i}"; done
curl can read URLs to fetch from a file, which you can generate with jq. Something like
base=https://artifactory.com/api/releases/baseimage
curl -k -s --user user:password -X GET "$base/" |
jq -r --arg b "$base" '.children[].uri | "url = \"\($b)/\(.)\""' |
curl -k -s --user user:password -X GET --config -
Just one curl process to fetch all the individual images, and no shell loop needed.
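For reference, the jq filter emits one line per child in curl's config-file syntax, something like (illustrative output, based on the question's data):
url = "https://artifactory.com/api/releases/baseimage//Image1"
url = "https://artifactory.com/api/releases/baseimage//Image2"
Note the doubled slash: the uri values already begin with /, so you may prefer "url = \"\($b)\(.)\"" in the filter.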
You might find it easier to construct the for loop along the following lines:
for uri in $( echo "$json" | jq -r '.children[].uri' ) ; do
echo curl ... ${uri}...
done
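A complete sketch along those lines (assuming the uri values contain no whitespace; note they already begin with a slash):
base=https://artifactory.com/api/releases/baseimage
for i in $(curl -k -s --user user:password -X GET "$base/" | jq -r '.children[].uri'); do
    curl -k -s --user user:password -X GET "$base$i"
done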

Pass jq output to curl

I want to use a number of values from jq, piped to curl. I have JSON data with a structure that looks something like this:
{
"apps": {
"code": "200",
"name": "app1",
},
"result": "1"
}
{
"apps": {
"code": "200",
"name": "app2",
},
"result": "5"
}
...
...
...
What I want to do is to take the values of every apps.name and result and pass that to curl:
curl -XPOST http://localhost --data-raw 'app=$appsName result=$result'
The curl command will then run X amount of times depending on how many objects there are.
In jq I came up with this, but I don't know how, or the best way, to pass it to curl in a loop:
jq -r '.result as $result | .apps as $apps | $apps.name + " " + $result' myfile.json
Maybe something like this?
jq -r '[.apps.name, .result] | @tsv' data.json | while read -r appsName result; do
curl -XPOST http://localhost --data-raw "$appsName $result"
done
jq -r '"\( .apps.name ) \( .result )"' |
while read -r app result; do
curl -XPOST http://localhost --data-raw "app=$app result=$result"
done
or
jq -r '"app=\( .apps.name ) result=\( .result )"' |
while read -r data; do
curl -XPOST http://localhost --data-raw "$data"
done
The first one assumes the name can't contain whitespace.
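If the name can contain whitespace, a TSV-based variant (again a sketch, assuming the data lives in myfile.json) avoids the word-splitting problem by splitting on the tab alone:
jq -r '[.apps.name, .result] | @tsv' myfile.json |
while IFS=$'\t' read -r app result; do
    curl -XPOST http://localhost --data-raw "app=$app result=$result"
done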
Would this do it?
while read -r line; do
curl -XPOST http://localhost --data-raw "$line"
done < <(jq -r '"app=\(.apps.name) result=\(.result)"' myfile.json)
I think xargs is better than while loops. Maybe just personal preference.
jq -r '"app=\( .apps.name ) result=\( .result )"' \
| xargs -r -I{} curl -XPOST http://localhost --data-raw "{}"

Getting rid of Double quotes within a value using JQ

I am trying to create a --data-raw payload with the help of jq. It works fine for the most part, except where I have a list.
#!/bin/bash
TEST="'bfc', 'punch', 'mld', 'extended_mld', 'chargingstation', 'ch'"
BFC_TAG=$BUILD_ID
UPLOAD_TO=s3
COMPILE=$TEST
REGION=world
PROJECT=bfc
JSON_STRING=$( jq -n \
--arg a "$BFC_TAG" \
--arg b "$UPLOAD_TO" \
--arg c "$TEST" \
--arg d "$REGION" \
--arg f "$PROJECT" '
{brfc_tag: $a,
upload_to: $b,
compile: [ $c ],
region: $d,
project: $f
}')
echo ${JSON_STRING}
Output:
{ "bfc_tag": "", "upload_to": "s3", "compile": [ "'bfc', 'punch', 'mld', 'extended_mld', 'chargingstation', 'ch'" ], "region": "world", "project": "bfc" }
If you look at the output, particularly at the compile key, the whole list has become a single string with quotes at the beginning and the end, which breaks the Python code that consumes this. I need to get rid of the stray quotes within compile; any help appreciated.
Those single quotes are already present in your input string:
TEST="'bfc', 'punch', 'mld', 'extended_mld', 'chargingstation', 'ch'"
I would recommend removing those, and creating a simple CSV which jq can split for us:
TEST="bfc,punch,mld,extended_mld,chargingstation,ch"
To create an array from that string, use:
compile: $c | split(",")
So the complete code:
#!/bin/bash
TEST="bfc,punch,mld,extended_mld,chargingstation,ch"
BFC_TAG=$BUILD_ID
UPLOAD_TO=s3
COMPILE=$TEST
REGION=world
PROJECT=bfc
JSON_STRING=$( jq -n \
--arg a "$BFC_TAG" \
--arg b "$UPLOAD_TO" \
--arg c "$TEST" \
--arg d "$REGION" \
--arg f "$PROJECT" '
{brfc_tag: $a,
upload_to: $b,
compile: $c | split(","),
region: $d,
project: $f
}')
echo ${JSON_STRING}
Will produce:
{ "brfc_tag": "", "upload_to": "s3", "compile": [ "bfc", "punch", "mld", "extended_mld", "chargingstation", "ch" ], "region": "world", "project": "bfc" }

How to get curl command output into variable

I want to capture the output of the following command in one variable, and then extract the email id and id into two variables.
Code
curl -s -b ${COOKIE_FILE} -c ${COOKIE_FILE} 'https://api.xxxx.xxxx.com/sso/user?email='${USER_EMAIL} |python -m json.tool
Output
[
{
"createdAt": "2017-12-08T11:07:15.000Z",
"email": "vxxx.sxxx#domain.com",
"gravatarUrl": "https://gravatar.com/avatar/13656",
"id": 937,
"updatedAt": "2017-12-08T11:07:15.000Z",
"username": "339cba4c-d90c-11e7-bc18-005056ba0d15"
}
]
One more thing: if USER_EMAIL is wrong then we get [] as output, and in that case I have to print "Email ID is not present" and exit with a non-zero code.
I am a Python developer; this is my first time shell scripting.
id=$(grep -w "id" output | awk -F ':' '{print $2}' | sed -e 's|[," ]||g'); echo "$id"
Do the same for email.
Or, the second way, as anubhava suggested in a comment: use jq. (You will probably need to install jq first: apt-get install jq.) Then:
jq -r '.[].email' output
Use jq to parse json:
$ cat input
[
{
"createdAt": "2017-12-08T11:07:15.000Z",
"email": "vxxx.sxxx#domain.com",
"gravatarUrl": "https://gravatar.com/avatar/13656",
"id": 937,
"updatedAt": "2017-12-08T11:07:15.000Z",
"username": "339cba4c-d90c-11e7-bc18-005056ba0d15"
}
]
$ jq -r '.[] | (.email,.id)' input
vxxx.sxxx@domain.com
937
$ read email id << EOF
> $(jq -r '.[] | (.email,.id)' input | tr \\n ' ')
> EOF
$ echo $email
vxxx.sxxx@domain.com
$ echo $id
937
To check if email was valid, you can do things like:
echo "${email:?}"
or
test -z "$email" && exit 1
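Putting the pieces together, a sketch of the whole flow (reusing the COOKIE_FILE and USER_EMAIL variables and the endpoint from the question), including the empty-[] check:
response=$(curl -s -b "${COOKIE_FILE}" -c "${COOKIE_FILE}" "https://api.xxxx.xxxx.com/sso/user?email=${USER_EMAIL}")
# an empty array means the email was not found
if [ "$(printf '%s' "$response" | jq 'length')" -eq 0 ]; then
    echo "Email ID is not present" >&2
    exit 1
fi
email=$(printf '%s' "$response" | jq -r '.[0].email')
id=$(printf '%s' "$response" | jq -r '.[0].id')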

jq shell script: aggregate iteration content into json body

I'm creating a shell script.
I need to create a curl JSON body according to the content of a key-value list. This content is split out by an awk command which generates a two-column table:
KVS_VARIABLES=$(awk -F= '!($1 && $2 && NF==2) { print "File failed validation on line " NR | "cat 1>&2"; next } { print $1, $2 }' $f)
Example output:
VAR1 VAL1
VAR2 VAL2
VAR3 VAL3
This table is then iterated over in a while loop, and each key and value are split out:
echo "$KVS_VARIABLES" | while read -r kv
do
key=$(echo $kv | awk '{print $1}')
value=$(echo $kv | awk '{print $2}')
done
So, I need some way to aggregate this content into a json document in order to send it out using curl:
curl -k \
-X PUT \
-d @- \
-H "Authorization: Bearer $TOKEN" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
"$SERVER_URL/api/v1/namespaces/$NAMESPACE/secrets/$SECRET_ID" <<-EOF
{
"kind": "Secret",
"apiVersion": "v1",
"metadata": {
"name": "$SECRET_ID"
},
"stringData": {
"$key": "$value" <<<<<<<<<<<<<(1)>>>>>>>>>>>>>>
}
}
EOF
So, at <<<<<<<<<<<<<(1)>>>>>>>>>>>>>> I need to aggregate each key and value pair.
So, in this case I'd need to generate:
"VAR1": "VAL1",
"VAR2": "VAL2",
"VAR3": "VAL3"
and then insert it inside "stringData":
{
"kind": "Secret",
"apiVersion": "v1",
"metadata": {
"name": "$SECRET_ID"
},
"stringData": {
<<<<<<<<<<<<<(1)>>>>>>>>>>>>>>
}
}
So, the end result:
{
"kind": "Secret",
"apiVersion": "v1",
"metadata": {
"name": "$SECRET_ID"
},
"stringData": {
"VAR1": "VAL1",
"VAR2": "VAL2",
"VAR3": "VAL3"
}
}
jq is installed.
Any ideas?
You don't need an awk statement inside the while loop; just read the key-value pairs in the read command itself.
Also, storing awk output in a variable and parsing it later is an anti-pattern. You can use the shell's process substitution feature: the < <() part presents the output of a command as if it were a file. Alternatively, use here-strings.
json=$(cat <<-EOF
{
"kind": "Secret",
"apiVersion": "v1",
"metadata": {
"name": "$SECRET_ID"
},
"stringData": {
}
}
EOF
)
while read -r key value; do
json=$(echo "$json" | jq ".stringData += { \"$key\" : \"$value\" }")
done< <(awk -F= '!($1 && $2 && NF==2) { print "File failed validation on line " NR | "cat 1>&2"; next } { print $1, $2 }' $f)
You could now use the variable "$json" in the curl call:
curl -k \
-X PUT \
-d @- \
-H "Authorization: Bearer $TOKEN" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
"$SERVER_URL/api/v1/namespaces/$NAMESPACE/secrets/$SECRET_ID" <<<"$json"
