Adding Mailchimp members from a .csv file through Bash

I have around 1000 contacts to import into Mailchimp. This is my company's old database, which we exported from our CRM system, and we want every contact to confirm their subscription if they want to be on our mailing list.
When I try to import it through Mailchimp, I can't set the contacts' status to pending.
I have managed to do it for a single contact through Bash, but I want to import the whole contact list.
I am not that familiar with this scripting language, so can anybody advise me whether there is a way to import the data from the CSV file, and how to do it?
Or maybe there is some other way to do it?
This is the code that is working for a single contact:
#!/bin/bash
set -euo pipefail

list_id="Add_LIST_ID"
user_email="Add_E_MAIL"
user_fname="Add_F_NAME"
user_lname="Add_L_NAME"

curl -sS --request POST \
  --url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
  --user "key:$API_KEY" \
  --header 'content-type: application/json' \
  --data @- \
  <<EOF | jq '.id'
{
  "email_address": "$user_email",
  "status": "pending",
  "merge_fields": {
    "FNAME": "$user_fname",
    "LNAME": "$user_lname"
  }
}
EOF
EDIT1
Okay, I have managed to load the data from the CSV file. The code is below.
#!/bin/bash
set -euo pipefail

list_id="LIST_ID"

while IFS=, read -r col1
do
  echo "$col1"
  curl -sS --request POST \
    --url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
    --user "key:$API_KEY" \
    --header 'content-type: application/json' \
    --data @- \
    <<EOF | jq '.id'
{
  "email_address": "$col1",
  "status": "pending",
  "merge_fields": {
    "FNAME": "",
    "LNAME": ""
  }
}
EOF
done < mails.csv
I have put an echo line in the loop to check that the data is read correctly.
The code runs without errors, but I have only managed to add a contact to the list once (the subscriber hash came back in the response). On the other attempts I got "null" back. Does anybody know why?
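For reference, a minimal sketch of one way to loop over the CSV and let jq build each payload (this assumes mails.csv has email,fname,lname columns and the same API_SERVER and API_KEY variables as above; the tr -d '\r' guards against Windows line endings in the export):
#!/bin/bash
set -euo pipefail

list_id="LIST_ID"

# Read email, first name and last name from each CSV row.
while IFS=, read -r email fname lname; do
  # Build the JSON payload with jq so quoting and escaping are handled.
  jq -n \
    --arg email "$email" \
    --arg fname "$fname" \
    --arg lname "$lname" \
    '{email_address: $email, status: "pending", merge_fields: {FNAME: $fname, LNAME: $lname}}' |
  curl -sS --request POST \
    --url "https://$API_SERVER.api.mailchimp.com/3.0/lists/$list_id/members" \
    --user "key:$API_KEY" \
    --header 'content-type: application/json' \
    --data @- |
  jq '.id'
done < <(tr -d '\r' < mails.csv)
Because jq builds the JSON, addresses or names containing quotes or other special characters will not break the payload.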

Related

Sending complex HTML as the body of an email from the command line via the SendGrid API

I need to send an HTML file as the body of an email to several customers. Our company will be using SendGrid for this, and I need to be able to send the email via a curl API call.
The way I'm doing it so far works for simple HTML or plain text:
curl -s --request POST \
  --url https://api.sendgrid.com/v3/mail/send \
  --header "Authorization: Bearer SECRET_API_KEY" \
  --header 'Content-Type: application/json' \
  --data '{"personalizations":[{"to":[{"email":"my1@email.com"},{"email":"my2@email.com"}]}],"from":{"email":"info@somewhere.com"},"subject":"Testing sending emails via SendgridAPI","content":[{"type":"text\/html","value":"Test API Email From ME"}]}'
Now this works just fine. The problem is when I want to replace 'Test API Email From ME' with the contents of a rather large, complex HTML file. This has all the usual CLI nightmares, such as a mix of ' and " and newlines everywhere. I need to sanitize the HTML in order to accomplish three things:
The final result needs to be a valid command line string
The --data switch argument needs to remain a valid JSON enconded string
The HTML should not break.
What I do is create the actual command string and then execute it using a scripting language, so I can perform any operation I want on the HTML before inserting it into the value field of the content field. So my question is: what string operations should I perform on the HTML so that I can send the email using this methodology?
Using jq and bash
I'll do it with static data; you can improve upon it.
Define a JSON template for the API:
IFS='' read -r -d '' json_template <<'EOF'
{
  "personalizations": [
    {
      "to": [
        { "email": "my1@email.com" },
        { "email": "my2@email.com" }
      ]
    }
  ],
  "from": { "email": "info@somewhere.com" },
  "subject": "Testing sending emails via SendgridAPI",
  "content": [
    {
      "type": "text/html",
      "value": "Test API Email From ME"
    }
  ]
}
EOF
Define the HTML content:
IFS='' read -r -d '' html_email <<'EOF'
<!doctype html>
<html>
  <head>
    <title>Simple Email</title>
  </head>
  <body>
    Test API Email From ME
  </body>
</html>
EOF
Replace the email content in the JSON with the HTML:
json_data=$(
  jq -c -n \
    --arg html "$html_email" \
    --argjson template "$json_template" \
    '$template | .content[0].value = $html'
)
Send the request:
curl -s --request POST \
  --url https://api.sendgrid.com/v3/mail/send \
  --header "Authorization: Bearer SECRET_API_KEY" \
  --header 'Content-Type: application/json' \
  --data "$json_data"
Here is how you can compose a proper JSON payload with jq so it can be sent to the API.
jq will ensure that every value (the recipients, from, subject, and the HTML body) is encoded into proper JSON objects, arrays, and strings before the payload is submitted to curl via --data @-.
I have added comments everywhere, so it is very clear what is done at every step:
#!/usr/bin/env bash

recipients=(
  'my1@email.com'
  'my2@email.com'
)
from='info@somewhere.com'
subject='Testing sending emails via SendgridAPI'

# Streams null-delimited recipients array entries
printf '%s\0' "${recipients[@]}" |
  # jq slurps the null-delimited recipients,
  # reads the raw html content into the jq $contentHTML variable
  # and integrates it all into proper JSON
  jq --slurp --raw-input --rawfile contentHTML example.html \
    --arg from "$from" \
    --arg subject "$subject" \
    '
      # Fills the jq $recipients JSON array variable
      # by splitting the null-delimited entries
      # from the incoming stream and wrapping each
      # address in the {"email": ...} object the API expects
      (split("\u0000") | map(select(length > 0) | {"email": .})) as $recipients |
      {
        "personalizations": [
          {
            # Uses the $recipients array that has been
            # slurped from the input stream
            "to": $recipients
          }
        ],
        "from": {
          # Uses the $from that has been passed as --arg
          "email": $from
        },
        # Uses the $subject that has been passed as --arg
        "subject": $subject,
        "content": [
          {
            "type": "text/html",
            "value": $contentHTML
          }
        ]
      }
    ' |
  # Pipes the resulting JSON into curl,
  # which reads the data from standard input
  # using --data @-
  # rather than passing it as an argument, because
  # the payload could exceed the maximum argument length
  curl -s --request POST \
    --url https://api.sendgrid.com/v3/mail/send \
    --header "Authorization: Bearer SECRET_API_KEY" \
    --header 'Content-Type: application/json' \
    --data @-

Running Curl POST w/ JSON Payload Sequentially Against File

I am hitting a wall trying to build a script to save myself quite a good bit of time. I am working in a system in which I need to run a curl POST against a list of values. The list is about 400 lines long, so I am hoping to find a way of scripting this in Bash instead of running that call manually for each entry. Below are some details to help understand what I'm trying to accomplish:
If I were to be doing this task manually, each call would be formatted like the below:
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d @json_payload.json
This points to my JSON in the listed file, which looks like this:
{
  "groupId": "12345678987654321",
  "type": "serial",
  "memberInfo": "apple"
}
If I run the above, the call works and the expected operation occurs. The issue is that I need to run this against roughly 400 values for that "memberInfo" field in the JSON payload. I'm trying to identify a way to run a single bash script that will run this curl command over and over, updating the JSON payload to use each row of a file like the below:
memberList.txt
apple
banana
peach
pear
orange
.
.
And then maybe insert a pointer in my JSON for the "memberInfo" field over to this file.
Any and all help/suggestions are greatly appreciated!
This will do as you intend. It's a little convoluted, but you might polish it a bit.
#!/bin/bash

function getString(){
  echo "$1" | python3 -c '
import json
import sys

payload = """
{
  "groupId": "12345678987654321",
  "type": "serial",
  "memberInfo": ""
}
"""

obj = json.loads(payload)
obj["memberInfo"] = sys.stdin.read().strip()
print(json.dumps(obj, indent=" "))
'
}
while read -r member
do
  getString "$member" > json_payload.json
  curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d @json_payload.json
done < fruits.txt
Hope it helps!
while read -r member; do
  curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' \
    -d "{\"groupId\": \"12345678987654321\",\"type\": \"serial\",\"memberInfo\": \"$member\"}"
done < members.txt
This will work if you only care about the memberInfo field. Another method could be writing your JSON, line by line, to a payloads.txt file.
payloads.txt
{"groupId": "12345678987455432","type": "stereo","memberInfo": "apple"}
{"groupId": "34532453453453465","type": "serial","memberInfo": "banana"}
...
then use this as the script
while read -r payload; do
  curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d "$payload"
done < payloads.txt
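If the member values might contain quotes or other characters that break hand-built JSON, a variant worth considering (a sketch using jq against the same hypothetical endpoint and fields as above) is to let jq construct each payload:
#!/bin/bash
while IFS= read -r member; do
  # jq handles all JSON quoting/escaping for the member value.
  jq -n --arg member "$member" \
    '{groupId: "12345678987654321", type: "serial", memberInfo: $member}' |
  curl -X POST --header "Content-Type: application/json" -v \
    'http://www.website.com:8081/cc/membership' --data @-
done < memberList.txt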
Here is a collection of common bash scripting patterns I've had to use:
https://github.com/felts94/advanced-bash/blob/master/bash_learn.sh

Editing Gist with cURL: "Problems parsing JSON"

#!/bin/bash
curl -v \
--request PATCH \
--data "$(
printf '{"files": {"somefile.json": {"content": " {"field": "value"} "}}}' \
)" \
--user x:x \
https://api.github.com/gists/x
Tried adding --header "Content-Type: application/json", no luck.
I'm using this approach because the content is actually the output of a command, but right now I'm testing the basics because this is not working.
I believe it is something related to double-quote escaping in bash; I've tried for a couple of hours with no luck. This is a nightmare.
Any tip is welcome. Thanks.
It looks like you have too many quotation marks. If you want the value of the "content" element to be an object, then instead of this:
"content": " {"field": "value"} "
try this:
"content": {"field": "value"}
On the off chance that you want it to be a string, then try this:
"content": " {\"field\": \"value\"} "
Instead of fighting against quote escaping, you could write your payload to a file and tell curl to use that file as data like so:
curl -v \
  --request PATCH \
  --data @/tmp/some/file \
  --user x:x \
  https://api.github.com/gists/x
Note the @ sign in the --data argument, which tells curl that the rest of the argument is a file name to read data from.
Depending on how you create your payload, you could also pipe it to curl, using - as the filename (echo payload | curl --data @- ...).
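For example, since the content is the output of a command, a sketch along those lines (assuming a hypothetical some_command and the same placeholder gist URL and credentials) could build the payload with jq and pipe it straight to curl:
some_command | jq -Rs '{files: {"somefile.json": {content: .}}}' |
curl -v \
  --request PATCH \
  --data @- \
  --header "Content-Type: application/json" \
  --user x:x \
  https://api.github.com/gists/x
Here jq -Rs reads the command's entire output as one raw string, so all the quotes and newlines inside it are escaped for you.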

Replacing IP in curl command with bash variable

I'm currently trying to make a DDNS script that interacts with the Cloudflare API to catch changes in my IP address and automatically update the DNS record for my web server. Everything is working correctly so far, except that I can't get $IP to be substituted properly into the curl statement. I first run a Python script from within the bash script to get the IP address, then run the curl statement in the bash script. Here's what the Python script looks like (it returns an IP address like "1.1.1.1" with the quotation marks included, because the curl command requires them):
#!/usr/bin/python3
import subprocess as sp

def main():
    command = "dig +short myip.opendns.com @resolver1.opendns.com"
    ip = sp.check_output(command, shell=True).decode('utf-8').strip('\n')
    ip = '"' + ip + '"'
    print(ip)

if __name__ == "__main__":
    main()
And the bash script looks like this:
#!/bin/bash
IP=$("./getIP.py")
curl -X PUT "https://api.cloudflare.com/client/v4/zones/zone_id/dns_records/dns_id" \
-H "X-Auth-Email: example.com" \
-H "X-Auth-Key: authkey" \
-H "Content-Type: application/json" \
--data '{"type":"A","name":"example.com","content":$IP,"ttl":120,"proxied":true}'
I've tried having the Python script return only the bare numbers and adding the quotation marks in the bash script, and vice versa, and I can't seem to get it to work. The last line should end up looking like this once the variable is replaced, with quotations around the IP address:
'{"type":"A","name":"example.com","content":"127.0.0.1","ttl":120,"proxied":true}'
The single quotes around your json structure prevent the variable from expanding.
You have a few options that are readily available.
Ugly quote escaping inside/around your json.
"{\"type\":\"A\",\"name\":\"example.com\",\"content\":$IP,\"ttl\":120,\"proxied\":true}"
Having the python write this data to a file and telling curl to use that file for the source of the post data.
curl -X PUT "https://api.cloudflare.com/client/v4/zones/zone_id/dns_records/dns_id" \
-H "X-Auth-Email: example.com" \
-H "X-Auth-Key: authkey" \
-H "Content-Type: application/json" \
--data #file_you_wrote_your_json_to.json
Using the Python requests or urllib modules to issue the request to Cloudflare.
Update your main() function to return the IP instead of printing it.
import requests

my_ip = main()

url = "https://api.cloudflare.com/client/v4/zones/zone_id/dns_records/dns_id"
myheaders = {
    "X-Auth-Email": "example.com",
    "X-Auth-Key": "authkey",
    "Content-Type": "application/json"
}
myjson = {
    "type": "A",
    "name": "example.com",
    "content": my_ip,
    "ttl": 120,
    "proxied": True
}

requests.put(url, headers=myheaders, json=myjson)
Better yet, just do it in bash (there are Cloudflare DDNS examples on GitHub).
One shot to fetch the dynamic A-record ID:
curl -X GET "https://api.cloudflare.com/client/v4/zones/**Zone ID** \
/dns_records?type=A&name=dynamic" \
-H "Host: api.cloudflare.com" \
-H "User-Agent: ddclient/3.9.0" \
-H "Connection: close" \
-H "X-Auth-Email: example#example.com" \
-H "X-Auth-Key: "**Authorization key**" \
-H "Content-Type: application/json"
Cron job (* * * * *) to set the dynamic A-record:
#!/usr/bin/env sh

AUTH_EMAIL=example@example.com
AUTH_KEY="** CF Authorization key **"
ZONE_ID="** CF Zone ID **"
A_RECORD_NAME="dynamic"
A_RECORD_ID="** CF A-record ID from cloudflare-dns-id.sh **"
IP_RECORD="/tmp/ip-record"

RECORDED_IP=$(cat "$IP_RECORD" 2>/dev/null)
PUBLIC_IP=$(curl --silent https://api.ipify.org) || exit 1

if [ "$PUBLIC_IP" = "$RECORDED_IP" ]; then
  exit 0
fi

echo "$PUBLIC_IP" > "$IP_RECORD"

RECORD=$(cat <<EOF
{ "type": "A",
  "name": "$A_RECORD_NAME",
  "content": "$PUBLIC_IP",
  "ttl": 180,
  "proxied": false }
EOF
)

curl "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records/$A_RECORD_ID" \
  -X PUT \
  -H "Content-Type: application/json" \
  -H "X-Auth-Email: $AUTH_EMAIL" \
  -H "X-Auth-Key: $AUTH_KEY" \
  -d "$RECORD"

Can I define an object as a variable in a shell script?

I know how to store a string as a variable, for example: API="http://localhost:4741"
However, for the sake of a curl request I would like to be able to store an object as a variable whose values I can access, something like OBJ="{name : Joe}". Is this possible?
Right now my curl request looks like this:
curl --include --request POST localhost:3000/scrape \
--header "Content-Type: application/json" \
--data '{
"url": "http://www.oddsshark.com/stats/gamelog/basketball/nba/20736",
"team": "LA Clippers"
}'
I would like to be able to do something like this, using a dictionary or an object:
TEAM=( ["Clippers"]="http://www.oddsshark.com/stats/gamelog/basketball/nba/20736" )
curl --include --request POST localhost:3000/scrape \
--header "Content-Type: application/json" \
--data '{
"url": "http://www.oddsshark.com/stats/gamelog/basketball/nba/20736",
"team": "${TEAM[Clippers]}"
}'
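One way to get close to this (a sketch, assuming bash 4+ for associative arrays and jq available for the JSON quoting) is to declare the array explicitly and let jq expand its values into the payload:
#!/bin/bash
# Associative arrays need an explicit declaration in bash.
declare -A TEAM=(
  ["Clippers"]="http://www.oddsshark.com/stats/gamelog/basketball/nba/20736"
)

# Build the JSON with jq so the values are quoted correctly.
jq -n \
  --arg url "${TEAM[Clippers]}" \
  --arg team "LA Clippers" \
  '{url: $url, team: $team}' |
curl --include --request POST localhost:3000/scrape \
  --header "Content-Type: application/json" \
  --data @-
Passing ${TEAM[Clippers]} as a jq --arg avoids the quoting problem in the single-quoted --data string: the shell expands the array value, and jq turns it into a valid JSON string.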
