I am hitting a wall trying to build a script that would save me quite a bit of time. I am working in a system where I need to run a curl POST against a list of values. The list is about 400 lines long, so I am hoping to script this in Bash instead of running the call manually for each entry. Below are some details to help explain what I'm trying to accomplish:
If I were doing this task manually, each call would be formatted like this:
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d @json_payload.json
This points to my JSON payload in the listed file, which looks like this:
{
"groupId": "12345678987654321",
"type": "serial",
"memberInfo": "apple"
}
If I run the above, the call works and the expected operation occurs. The issue is that I need to run this against roughly 400 values for the "memberInfo" field in the JSON payload. I'm trying to write a single bash script that runs this curl command over and over, updating the JSON payload to use each row of a file like the one below:
memberList.txt
apple
banana
peach
pear
orange
...
And then maybe have the "memberInfo" field in my JSON point at each value from this file in turn.
Any and all help/suggestions are greatly appreciated!
This will do what you intend. It's a little convoluted, but you can polish it a bit:
#!/bin/bash

# Build the JSON payload for one member; Python handles the JSON escaping.
getString() {
    echo "$1" | python3 -c '
import json
import sys

payload = """
{
  "groupId": "12345678987654321",
  "type": "serial",
  "memberInfo": ""
}
"""
obj = json.loads(payload)
obj["memberInfo"] = sys.stdin.read().strip()
print(json.dumps(obj, indent=" "))
'
}

while read -r member
do
    getString "$member" > json_payload.json
    curl -X POST --header "Content-Type: application/json" -v \
        'http://www.website.com:8081/cc/membership' -d @json_payload.json
done < memberList.txt
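If jq is available, the same payload can be built without Python; a minimal sketch (my assumptions: jq is installed, and I'm reusing the question's memberList.txt):

# jq handles all JSON escaping for us.
while read -r member; do
    jq -n --arg m "$member" \
        '{groupId: "12345678987654321", type: "serial", memberInfo: $m}' > json_payload.json
    curl -X POST --header "Content-Type: application/json" -v \
        'http://www.website.com:8081/cc/membership' -d @json_payload.json
done < memberList.txt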
Hope it helps!
while read -r member; do
    curl -X POST --header "Content-Type: application/json" -v \
        'http://www.website.com:8081/cc/membership' \
        -d '{"groupId": "12345678987654321","type": "serial","memberInfo": "'"$member"'"}'
done < memberList.txt
This will work if you only care about the memberInfo field. Another method is to write your JSON payloads line by line to a payloads.txt file, like this:
payloads.txt
{"groupId": "12345678987455432","type": "stereo","memberInfo": "apple"}
{"groupId": "34532453453453465","type": "serial","memberInfo": "banana"}
...
then use this as the script
while read -r payload; do
    curl -X POST --header "Content-Type: application/json" -v \
        'http://www.website.com:8081/cc/membership' -d "$payload"
done < payloads.txt
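If you need to generate payloads.txt from memberList.txt in the first place, here is a minimal sketch (again assuming jq is installed; the groupId is the one from the question):

# -c prints each object on a single, compact line.
while read -r member; do
    jq -nc --arg m "$member" \
        '{groupId: "12345678987654321", type: "serial", memberInfo: $m}'
done < memberList.txt > payloads.txt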
Here is a collection of common bash scripting patterns I've had to use:
https://github.com/felts94/advanced-bash/blob/master/bash_learn.sh
Related
I have around 1000 contacts to import to Mailchimp. This is my company's old database, which we exported from the CSM system, and we want every contact to confirm their subscription if they want to be on our subscription list.
When I try to import it through Mailchimp, I can't give the contacts the "pending" status.
I have managed to do it for a single contact through bash, but I want to import the whole contact list.
I am not that familiar with this scripting language, so can anybody advise me: is there a way to import the data from a CSV file, and how can I do it?
Or maybe there is some other way to do it?
This is the code that is working for a single contact:
#!/bin/bash
set -euo pipefail
list_id="Add_LIST_ID"
user_email="Add_E_MAIL"
user_fname="Add_F_NAME"
user_lname="Add_L_NAME"
curl -sS --request POST \
--url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
--user "key:$API_KEY" \
--header 'content-type: application/json' \
--data @- \
<<EOF | jq '.id'
{
"email_address": "$user_email",
"status": "pending",
"merge_fields": {
"FNAME": "$user_fname",
"LNAME": "$user_lname"
}
}
EOF
EDIT1
Okay, I have managed to load the data from the CSV file. The code is below.
#!/bin/bash
set -euo pipefail

list_id="LIST_ID"

while IFS=, read -r col1
do
    echo "$col1"
    curl -sS --request POST \
        --url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
        --user "key:$API_KEY" \
        --header 'content-type: application/json' \
        --data @- \
        <<EOF | jq '.id'
{
  "email_address": "$col1",
  "status": "pending",
  "merge_fields": {
    "FNAME": "",
    "LNAME": ""
  }
}
EOF
done < mails.csv
I have put an echo line after list_id to check that the data is read correctly.
The code runs without errors, but I have only managed to add a contact to the list once (the subscriber hash was the response). On other tries I got a "null" value in the response. Does anybody know why?
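In case it is useful for the CSV import itself, here is a sketch of the same loop with the body built by jq instead of a heredoc (my assumptions: jq is installed, API_SERVER and API_KEY are exported, and mails.csv has email,fname,lname columns):

#!/bin/bash
set -euo pipefail

list_id="LIST_ID"

# Read three comma-separated columns per line.
while IFS=, read -r email fname lname; do
    # jq handles quoting/escaping of the values for us.
    jq -n --arg e "$email" --arg f "$fname" --arg l "$lname" \
        '{email_address: $e, status: "pending", merge_fields: {FNAME: $f, LNAME: $l}}' |
    curl -sS --request POST \
        --url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
        --user "key:$API_KEY" \
        --header 'content-type: application/json' \
        --data @- | jq '.id'
done < mails.csv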
#!/bin/bash
curl -v \
--request PATCH \
--data "$(
printf '{"files": {"somefile.json": {"content": " {"field": "value"} "}}}' \
)" \
--user x:x \
https://api.github.com/gists/x
Tried adding --header "Content-Type: application/json", no luck.
I'm using printf because the content is actually a command's output, but right now I'm testing the basics, and this is not working.
I believe it is something related to double-quote escaping in bash; I've tried for a couple of hours with no luck. This is a nightmare.
Any tip is welcome. Thanks.
It looks like you have too many quotation marks. If you want the value of the "content" element to be an object, then instead of this:
"content": " {"field": "value"} "
try this:
"content": {"field": "value"}
On the off chance that you want it to be a string, then try this:
"content": " {\"field\": \"value\"} "
Instead of fighting against quote escaping, you could write your payload to a file and tell curl to use that file as data like so:
curl -v \
--request PATCH \
--data @/tmp/some/file \
--user x:x \
https://api.github.com/gists/x
Note the @ sign in the --data argument, which tells curl that the rest of the argument is a file name to read data from.
Depending on how you create your payload, you could also pipe it to curl, using - as the filename (echo payload | curl --data @- ...).
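Putting those two ideas together, a small sketch (assuming jq is available and that "content" should hold the inner JSON as a string):

# jq -Rs reads stdin as one raw string and escapes it for embedding.
printf '%s' '{"field": "value"}' |
    jq -Rs '{files: {"somefile.json": {content: .}}}' |
    curl -v \
        --request PATCH \
        --header "Content-Type: application/json" \
        --data @- \
        --user x:x \
        https://api.github.com/gists/x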
I want to send an email with an attached PDF file through the SparkPost API with a curl POST.
To insert the PDF I use the following (my test.pdf is ~200KB):
"data":"'$(cat test.pdf | base64 --wrap=0)'"
But somehow this doesn't work, and curl shows the following error:
/usr/bin/curl: Argument list is too long
EDIT:
curl command
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions -H 'Authorization: <APIKEY>' -H 'Content-Type: application/json' -d '{
"options":{
"open_tracking":false,
"click_tracking":false,
"inline_css":false
},
"recipients":[
{
"address":{
"email":"user#domain.tld",
"name":"user"
}
}
],
"content":{
"from":{
"name":"sender",
"email":"sender#domain.tld"
},
"reply_to":"replyto#domain.tld",
"subject":"subject",
"text":"textbody",
"attachments":[
{
"name":"attachmentname.pdf",
"type":"application/pdf",
"data":"'$(cat test.pdf | base64 --wrap=0)'"
}
]
}
}'
This is coming up because you are trying to pass the entirety of the base64'd content on the command line. curl has the ability to load the data to POST from a file, which I'd recommend doing. More information can be found in the man page, but the basic format is this:
curl -X POST -d @filename.txt https://website.com/path
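For this particular payload, a sketch of that approach (assuming jq is installed; the addresses and <APIKEY> are the placeholders from the question) pipes the JSON straight into curl, so the base64 data never appears on the command line:

# Encode the attachment, let jq build/escape the body, and read it via stdin.
base64 --wrap=0 test.pdf |
jq -R '{
  options: {open_tracking: false, click_tracking: false, inline_css: false},
  recipients: [{address: {email: "user@domain.tld", name: "user"}}],
  content: {
    from: {name: "sender", email: "sender@domain.tld"},
    reply_to: "replyto@domain.tld",
    subject: "subject",
    text: "textbody",
    attachments: [{name: "attachmentname.pdf", type: "application/pdf", data: .}]
  }
}' |
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions \
    -H 'Authorization: <APIKEY>' \
    -H 'Content-Type: application/json' \
    -d @-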
According to the curl manual, the -F option can base64-encode a file for you, splitting the encoded output into lines no longer than 76 characters.
Ex:
-F '=@localfile;encoder=base64'
I have a curl request in the below format:
curl -v -H "Content-Type:application/json" -H "x-user-id:xxx" -H "x-api-key:yyy" --data '{"logs":"'"${TEST_OUTPUT}"'","pass":"true | false"}' https://razeedash.one.qqq.cloud.com/api/v1/clusters/zzz/api/test_results
This works fine when I run it from my Mac terminal, but the same command throws:
13:49:26 {
13:49:26 "status": "error",
13:49:26 "message": "Invalid credentials"
13:49:26 }
I saw this post, but I'm not sure how else I would send a JSON body without curly braces. I know that we can save it as a file.json and use the file as the body, but for some reason that cannot be implemented in my scenario.
In general, you should avoid trying to build JSON using string interpolation. Use a tool like jq to handle any necessary quoting.
jq -n --arg o "$TEST_OUTPUT" '{logs: $o, pass: "true | false"}' |
curl -v -H "Content-Type:application/json" \
-H "x-user-id:xxx" \
-H "x-api-key:yyy" \
--data @- \
https://razeedash.one.qqq.cloud.com/api/v1/clusters/zzz/api/test_results
However, if you can manage to correctly generate your JSON as you are now, you can just replace the jq command with echo:
echo '{"logs": ...' | curl ...
The @- argument to --data says to read the data from standard input.
I have a curl command:
curl -u ${USER_ID}:${PASSWORD} -X GET 'http://blah.gso.woo.com:8080/rest/job-execution/job-details/${job_id}'
The variable job_id has a value in it, say, 1160. When I execute the curl command in shell it gives me the following error:
{"message":"Sorry. An unexpected error occured.", "stacktrace":"Bad Request. The request could not be understood by the server due to malformed syntax."}
If I pass the number '1160' directly in the command, as shown below, the curl command works.
curl -u ${USER_ID}:${PASSWORD} -X GET 'http://blah.gso.woo.com:8080/rest/job-execution/job-details/1160'
I want to be able to pass the value of the variable in the curl command.
When using variables in the shell, you must use double quotes, not single quotes: variables inside single quotes are not expanded.
Learn the difference between ' and " and `. See http://mywiki.wooledge.org/Quotes and http://wiki.bash-hackers.org/syntax/words
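For example, the command from the question works once the URL is in double quotes (same placeholder host as above):

# Double quotes let the shell expand ${job_id} before curl sees the URL.
curl -u "${USER_ID}:${PASSWORD}" -X GET "http://blah.gso.woo.com:8080/rest/job-execution/job-details/${job_id}"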
I ran into this problem when passing a value as well; it was solved by using '"$1"' (close the single quote, wrap the variable in double quotes, then reopen the single quote).
See connection.uri below.
curl -X POST -H "Content-Type: application/json" --data '
{"name": "mysql-atlas-sink",
"config": {
"connector.class":"com.mongodb.kafka.connect.MongoSinkConnector",
"tasks.max":"1",
"topics":"mysqlstock.Stocks.StockData",
"connection.uri":"'"$1"'",
"database":"Stocks",
"collection":"StockData",
"key.converter":"io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url":"http://schema-registry:8081",
"value.converter":"io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url":"http://schema-registry:8081",
"transforms": "ExtractField",
"transforms.ExtractField.type":"org.apache.kafka.connect.transforms.ExtractField$Value",
"transforms.ExtractField.field":"after"
}}' http://localhost:8083/connectors -w "\n"
How to pass json to curl with shell variable(s):
myvar=foobar
curl -H "Content-Type: application/json" --data #/dev/stdin<<EOF
{ "xkey": "$myvar" }
EOF
With the -d or --data switch, the request is implicitly a POST, so -X POST is not needed.
Use the variable in a double-quote/single-quote sequence, "'$variable'": the double quote stays inside the form value while the single quotes close and reopen the shell string:
#!/usr/bin/bash
token=xxxxxx
curl --location --request POST 'http://127.0.0.1:8009/submit/expense/' \
--form 'token="'$token'"' \
--form 'text="'$1'"' \
--form 'amount="'$2'"'
userdetails="$username:$apppassword"
base_url_part='https://api.XXX.org/2.0/repositories'
path="/$teamName/$repoName/downloads/$filename"
base_url="$base_url_part$path"**strong text**
curl -L -u "$userdetails" "$base_url" -o "$downloadfilename"