I am using the Mailchimp API in an application to send weekly emails to users.
The flow is the following: at the beginning the list of subscribers is empty, so I make a request to fill it. After filling it, I make another request to create a campaign associated with the list. But when I try to send it, it says the campaign is not ready to be sent because the list of subscribers is empty. Indeed, in the Mailchimp web interface the number of subscribers in the list is zero, but when I open the list, the subscribers are there. After a while (several minutes), the number of subscribers gets updated and the campaign can be sent. And it is not about email confirmation, because I'm not using double opt-in.
Does anybody know what I am missing, or how long I must wait for the number of subscribers to be updated before making the request to create the campaign?
In case it matters, I'm using Ruby on Rails and the Gibbon gem.
Here is the method for adding users to the list:
def add_subscriber_to_list(data, list_id = default_list_id)
  Gibbon::Request.new.lists(list_id).members.create(body: data)
end
where data has the following format:
{
  email_address: 'email@test.com',
  status: 'subscribed',
  double_option: false,
  merge_fields: {
    FNAME: 'First name',
    LNAME: 'Last name'
  }
}
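For now, I'm considering working around it by polling the list's member count through Gibbon before creating the campaign, roughly like the sketch below (the retry count and delay are arbitrary, subscribers stands in for whatever collection I just pushed to the list, and I'm assuming the readiness check relies on the same stats the list resource reports):
# Rough workaround sketch: wait until Mailchimp reports the expected member count
# before creating the campaign. Retry count and delay are arbitrary.
def wait_for_subscribers(list_id, expected_count, max_attempts: 30, delay: 10)
  gibbon = Gibbon::Request.new
  max_attempts.times do
    count = gibbon.lists(list_id).retrieve.body["stats"]["member_count"]
    return true if count >= expected_count
    sleep delay
  end
  false
end

if wait_for_subscribers(default_list_id, subscribers.size)
  Gibbon::Request.new.campaigns.create(
    body: {
      type: "regular",
      recipients: { list_id: default_list_id },
      # placeholder campaign settings for illustration only
      settings: { subject_line: "Weekly email", from_name: "My App", reply_to: "me@example.com" }
    }
  )
end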
I am currently implementing a new feature at work. The app wants to give sellers an admin where they can see various things. An example of one of the things they can see on their dashboard is their last 10 orders.
The orders API only returns an array of various IDs (customer ID, product ID, seller ID, etc.). To populate the orders page, I have to make three different API calls for each order to get the data to render on the order list page.
Now that I have to create a dashboard that is different from the order list page, I do not want to make such a tedious set of API calls again. I want to create a dashboard$ observable that holds the last emitted value of orderList$, but I do not want anything subscribed to orderList$ to be served from a cache.
So when orderList$ is subscribed to, it gets the latest order list data from the server (I do not feel something as sensitive as the order list should be cached); when dashboard$ is subscribed to, it gets the last emitted value of orderList$, and if orderList$ has not emitted any values yet, dashboard$ can make a request to retrieve the order list.
When working with a reactive state of mind, I like to define what should happen based on events. By this I mean that I'll have a Subject into which I can push values to signal an event, and from there we can react to those events.
In your case, here's an idea for what you want:
import { merge, Observable, Subject } from 'rxjs';
import { shareReplay, switchMap } from 'rxjs/operators';

const navigationToOrdersPage$$ = new Subject<void>();
const refreshOrdersButtonClicked$$ = new Subject<void>();

const orders$: Observable<Order[]> = merge(
  navigationToOrdersPage$$,
  refreshOrdersButtonClicked$$
).pipe(
  switchMap(() => yourOrderService.getOrders()),
  shareReplay({
    bufferSize: 1,
    // keep the latest result cached even when no one is subscribed;
    // navigating to the orders page or clicking the refresh button
    // refreshes it anyway, so you can leave the orders page and the
    // dashboard and still get an instant result when you come back
    refCount: false,
  })
);
So here it'd be safe to reuse this observable on both the admin page and the dashboard page. Of course, you'll need to call next on the two subjects when appropriate so that the orders are refreshed when they need to be.
I have noticed that a forwarded email from a different sender, when a case already exists in the system, gets associated with that existing case instead of creating a new case in the queue.
Explaining the Scenario:
1. I have configured a mailbox and a 'Dummy' queue for an email address, say dummy@mycompanydomain.com.
2. Case Creation rule is configured for any email coming to this queue.
Sending Emails in 2 different scenarios:
Scenario 1:
Sent an email from ABC@outlook.com
TO: DEF@outlook.com & XYZ@outlook.com
"Replied to All" from DEF@outlook.com to ABC@outlook.com and XYZ@outlook.com, and added the dummy@mycompanydomain.com email address as well.
New Ticket got created in the 'Dummy' queue. – Working as expected.
"Replied to All" from XYZ to ABC and DEF, and added the dummy@mycompanydomain.com mailbox as well.
New Ticket got created in the 'Dummy' queue. – Working as expected.
Scenario 2:
Sent an email from ABC@outlook.com
TO: DEF@outlook.com & XYZ@outlook.com
"Replied to All" from DEF to ABC@outlook.com and XYZ@outlook.com, and added the dummy@mycompanydomain.com email address as well.
New Ticket got created in the 'Dummy' queue. – Working as expected.
Forwarded from XYZ to the dummy@mycompanydomain.com mailbox.
The incoming email got associated with the existing Ticket in the 'Dummy' queue. -- Is this expected behavior or a product bug?
Also be aware of the type of matching you are doing in your SSS. If the matching is done via regular expression or token, a forwarded email will result in a matching Subject.
I am successfully retrieving user emails from our Exchange Online server, via REST requests and the ADAL library. We have been retrieving and processing calendar-event emails, and their associated calendar events, which are generated by Outlook, GMail/Google-Calendar, iPad, iPhone and Android devices.
We have been looking in the ClassName property for "meeting.request" or "meeting.cancelled", but those values were removed a week ago and have not returned. We have now been looking for a non-Null MeetingMessageType property (MeetingRequest or MeetingCancelled), but as of today, those properties have also been removed. This is incredibly valuable interop data but I don't know where to look next.
How can I associate a retrieved JSON Message object from a user's mailbox or a shared mailbox with an associated (Exchange) calendar event? We can process meeting creations, invitations, acceptances, etc. with message items, which we then purge; whereas querying the calendars for new and updated events is much more intensive, since we certainly can't purge calendar events off the calendar as we process them!
Can I query the calendar for associated message Ids? I can't imagine this would be possible to do for every message.
Thanks!
Edit: @Venkat Thanks. Mail items are infinitely more processable than emergent calendar-event standards. As an Exchange dev, I have to ask: do you really need an example of how I can process a mail-bound event better as a mail item rather than as a calendar event item? OK, that's cool, here is one:
One thing we are doing is cc/bcc-ing mail/mtg-requests to specific mailboxes for processing (or using client and server rules to accomplish the same thing). We can then poll individual mailboxes, shared mailboxes, and/or collections of mailboxes to auto-respond or not... and to move to specific calendars or not, and to redirect to specific users or not, and to change header information during routing for further category classification or not, and to even replace recipients/attendees or not etc. etc. To do the same thing with REST calendar requests, we'd lose all server rules automation, all client rules automation, procedural auto-respond, all headers manipulation (data-insertion/extraction), etc. We're just trying to push events to a cloud app, for certain collections of users, using shared mailboxes which redirect to specific daemon accounts, which hold calendars for specific subsets of our users/clients.
Like everyone else, we are trying to integrate with cloud apps, so we need procedural parsing, data manipulation, and pushing of both mail and calendar items. For one thing, we have the massive advantages of server mail-processing rules, client/user mail rules, mail header modifications (easy item data modification), mail auto-respond control, and blind recipients. Calendar events don't get any of those things. For a second thing, we have a much more robust mail folder taxonomy than calendar taxonomy (which is almost non-existent). For a third thing, calendar-event mail items are user-specific and have less persistent value than shared calendar events. Finally, if we're processing mail items anyway, why not at least have an eventId for events? Why take out ALL interop information? Having an eventId completely eliminates the need for a query against a calendar endpoint returning multiple items, and adds no additional queries against a mail endpoint.
Google includes an attached .ics. Even if you eliminate the event item attachment from the API mail item, I don't see why you have to remove the eventId. Processing calendar events by mail is nothing new, but we have to have a data binding between the two objects to do it. That is all.
My Exchange server still knows when a mail item is a calendar event. It just won't tell me any more if I ask it over REST. So, as a brutish workaround, I can set up a mail rule to add a category of "api_calendarEvent" to all incoming messages that are of type "Meeting Request". Then, after making a REST call for mail items, I can parse categories to manually repopulate a class property. But why remove the attachment, ClassName, MeetingMessageType, and EventId from the mail item altogether? Even if I made a server rule to re-categorize certain mail items in certain mailboxes as calendar events, and was able to know when to poll a calendar to get event details, would I always know which calendar to poll to find that event? All we'd need to avoid blind polling across multiple calendars is for you to retain the EventId and/or ClassName. Then we'd also have massive automation of calendar processing again, which has currently been removed from the API.
Thanks!
Thanks for the detailed response to my comment! Your scenario is something we wish to support. As part of schema clean-up, we removed the event ID and meeting message type from Message, as they were being included for every message. For calendar invites and responses, we plan to add back two properties:
1. A navigation link to the related event, so you can navigate to the event and take actions on it, if you have calendar permissions.
2. A calendar response type, e.g. Meeting Accepted, Meeting Declined, etc., so you know what type of response you have received.
We are currently working on the design and we don't have an exact timeline to share, but we will update our documentation as soon as we have this API available.
[UPDATE] We now return calendar event invites and responses as EventMessage, which is a subclass of Message. This entity includes a property called MeetingMessageType and a navigation link to the corresponding Event on the user's calendar. See below for an example:
{
  "@odata.context": "https://outlook.office365.com/api/v1.0/$metadata#Users('<snipped>')/Messages/$entity",
  "@odata.type": "#Microsoft.OutlookServices.EventMessage",
  "@odata.id": "https://outlook.office365.com/api/v1.0/Users('<snipped>')/Messages('<snipped>')",
  "@odata.etag": "<snipped>",
  "Id": "<snipped>",
  "ChangeKey": "<snipped>",
  "Categories": [ ],
  "DateTimeCreated": "2015-04-08T14:37:55Z",
  "DateTimeLastModified": "2015-04-08T14:37:55Z",
  "Subject": "<snipped>",
  "BodyPreview": "",
  "Body": {
    "ContentType": "HTML",
    "Content": "<snipped>"
  },
  "Importance": "Normal",
  "HasAttachments": false,
  "ParentFolderId": "<snipped>",
  "From": {
    "EmailAddress": {
      "Address": "<snipped>",
      "Name": "<snipped>"
    }
  },
  "Sender": {
    "EmailAddress": {
      "Address": "<snipped>",
      "Name": "<snipped>"
    }
  },
  "ToRecipients": [{
    "EmailAddress": {
      "Address": "<snipped>",
      "Name": "<snipped>"
    }
  }],
  "CcRecipients": [ ],
  "BccRecipients": [ ],
  "ReplyTo": [ ],
  "ConversationId": "<snipped>",
  "DateTimeReceived": "2015-04-08T14:37:55Z",
  "DateTimeSent": "2015-04-08T14:37:48Z",
  "IsDeliveryReceiptRequested": null,
  "IsReadReceiptRequested": false,
  "IsDraft": false,
  "IsRead": false,
  "WebLink": "<snipped>",
  "MeetingMessageType": "MeetingRequest",
  "Event@odata.navigationLink": "https://outlook.office365.com/api/v1.0/Users('<snipped>')/Events('<snipped>')"
}
Please let me know if our proposed changes meet your requirements, or if you have any questions or need more info.
Thanks,
Venkat
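Follow-up: with the EventMessage shape above, fetching the linked event is just an authenticated GET on the Event@odata.navigationLink. Here is a minimal Ruby sketch (assuming an access token already acquired through ADAL, and that message is the parsed JSON hash for an EventMessage like the example above):
require "json"
require "net/http"
require "uri"

# `message` is a Hash parsed from the Messages endpoint response;
# `access_token` is a bearer token already obtained through ADAL.
def fetch_linked_event(message, access_token)
  return nil unless message["@odata.type"] == "#Microsoft.OutlookServices.EventMessage"

  link = message["Event@odata.navigationLink"]
  return nil unless link

  uri = URI(link)
  request = Net::HTTP::Get.new(uri)
  request["Authorization"] = "Bearer #{access_token}"
  request["Accept"] = "application/json"

  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
  JSON.parse(response.body)
end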
I am building an integration between Salesforce and Twilio that sends/receives SMS using the TwilioForce REST API. The main issue is getting around the 10-callout API limit in Salesforce, as well as the prohibition on HTTP callouts from a trigger.
I am basing the design on Dan Appleman's Asynchronous Request processes, but whether in Batch mode or with RequestAsync(), ASync(), Sync(), repeat... I'm still hitting the limits.
I'd like to know how other developers have done this successfully; the integrations have been there for a while, but the examples are few and far between.
Are you sending unique messages for each record that has been updated? If not, then why not send one message to multiple recipients to save on your API limits?
Unfortunately, if you do actually need to send more than 10 unique messages, there is no way to send messages in bulk with the Twilio API. You could instead write a simple application that runs on Heroku or some other application platform, which you can call out to and which will handle the SMS functionality for you.
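For instance, a bare-bones version of that external app could be a small Sinatra service using the twilio-ruby gem. This is only a sketch: the /send route, environment variable names, and JSON payload shape below are all illustrative, not an existing API.
require "json"
require "sinatra"
require "twilio-ruby"

# Salesforce (or anything else) POSTs a JSON array of messages here,
# e.g. [{"to": "+15555551234", "body": "Hello"}, ...]. Twilio credentials
# and the sending number come from environment variables on Heroku.
post "/send" do
  client = Twilio::REST::Client.new(ENV["TWILIO_ACCOUNT_SID"], ENV["TWILIO_AUTH_TOKEN"])
  messages = JSON.parse(request.body.read)

  messages.each do |msg|
    client.messages.create(
      from: ENV["TWILIO_FROM_NUMBER"],
      to: msg["to"],
      body: msg["body"]
    )
  end

  status 202
end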
I have it working now using the following structure (I apologize for the formatting - it's mostly pseudocode):
ASyncRequest object:
AsyncType (picklist: 'SMS to Twilio' is the only value for now),
Params (long text area: comma-separated list of Ids)
Message object:
To (phone), From (phone), Message (text), Sent (boolean), smsId (string), Error (text)
Message trigger: passes trigger details to CreateAsyncRequests() method.
CreateAsyncRequests: evaluate each new/updated Message__c; if Sent == false for any messages, we create an AsyncRequest, type=SMS to Twilio, Params += ',' + message.Id.
// Create a list to be inserted after all the Messages have been processed
List<AsyncRequest__c> requests = new List<AsyncRequest__c>();
Once we reach 5 message.Ids in a single AsyncRequest.Params list, add it to requests.
If all the messages have been processed and there's a request with < 5 Ids in Params, add it to requests as well.
If requests.size() > 0 {
  insert requests;
  AsyncProcessor.StartBatch();
}
AsyncProcessor implements .Batchable and .AllowsCallouts, and queries ASyncRequest__c for any requests that need to be processed, which in this case will be our Messages list.
The execute() method takes the list of ASyncRequests, splits each Params value into its component Message Ids, and then queries the Message object for those particular Messages.
StartBatch() calls execute() with 1 record at a time, so that each execute() process will still contain fewer than the maximum 10 callouts.
Each Message is processed in a try/catch block that calls SendMessage(), sets Message.smsId = Twilio.smsId and sets Message.Sent = true.
If no smsId is returned, then the message was not sent, and I set a boolean bSidIsNull = true indicating that (at least) one message was not sent.
** If any message failed, no smsIds are returned EVEN FOR MESSAGES THAT WERE SUCCESSFUL **
After each batch of messages is processed, I check bSidIsNull; if true, then I go back over the list of messages and put any that do not have an smsId into a map indexed by the Twilio number I'm trying to send them From.
Since I limited each AsyncRequest to 5 messages, I still have a callout available to retrieve all of the messages sent from that Twilio From number for the current date, using
client.getAccount().getMessages('From' => fromNumber, 'DateSent' => currentDate)
Then I can update the Message.smsIds for all of the messages that were successful, and add an error message to Message.Error_on_Send__c for any that failed.
I'm working on a YRS 2013 project and would like to use Twilio. I already have a Twilio account set up with over $100 worth of funds on it. I am working on a project which uses an external API and finds events near a location and date. The project is written in Ruby using Sinatra (which is going to be deployed to Heroku).
I am wondering whether you could guide me on how to approach this scenario: a user texts my Twilio number (the message would contain the location and date data), we process the body of that SMS, and we send the results back to the number that asked for them. I'm not sure where to start; for example, whether Twilio would handle some of that task, or whether I would just use Twilio's API to check for SMSs and return the results. I am thinking about not using a database.
Could you guide me on how to approach this task?
I need to present the project on Friday, so I'm on a tight deadline! Thanks for your help.
They have some great documentation on how to do most of this.
When you receive a text, you should parse it into the format you need.
Pass it into your existing project, and when it returns the event or events in the area, check how long the resulting string is, because Twilio restricts messages to 160 characters or less.
Ensure that you split the message cleanly and not in the middle of an event. If you were returned "Boston Celtics Game" and "The Nut Cracker Play" and both events cannot fit in one message, make sure the first message says "Boston Celtics Game, another text coming in 1 second" or something similar.
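A rough sketch of that splitting in plain Ruby (the 160-character limit is Twilio's; the "more to follow" notice wording is arbitrary, and this assumes no single event name exceeds the limit):
# Split a list of event names into SMS-sized chunks without cutting an event in half,
# reserving room for a short notice on every chunk except the last one.
def build_sms_chunks(events, limit: 160, notice: " (more to follow)")
  budget = limit - notice.length
  chunks = []
  current = ""

  events.each do |event|
    candidate = current.empty? ? event : "#{current}, #{event}"
    if candidate.length <= budget
      current = candidate
    else
      chunks << current unless current.empty?
      current = event
    end
  end
  chunks << current unless current.empty?

  # Warn the user that more texts are on the way, except on the final chunk.
  chunks.each_with_index.map do |chunk, i|
    i < chunks.size - 1 ? "#{chunk}#{notice}" : chunk
  end
end

build_sms_chunks(["Boston Celtics Game", "The Nut Cracker Play"])
# => ["Boston Celtics Game, The Nut Cracker Play"] (both fit in one message here)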
In order to receive a text message from a mobile device, you'll have to expose an endpoint that is reachable by Twilio. Here is an example:
class ReceiveTextController < ApplicationController
  def index
    # let's pretend that we've mapped this action to
    # http://localhost:3000/sms in the routes.rb file
    message_body = params["Body"]
    from_number = params["From"]
    SMSLogger.log_text_message from_number, message_body
  end
end
In this example, the index action receives a POST from Twilio. It grabs the message body and the phone number of the sender, and logs them. Retrieving the information from the Twilio POST is as simple as looking at the params hash:
{
  "AccountSid" => "asdf876a87f87a6sdf876876asd8f76a8sdf595asdD",
  "Body" => body,
  "ToZip" => "94949",
  "FromState" => "MI",
  "ToCity" => "NOVATO",
  "SmsSid" => "asd8676585a78sd5f548a64sd4f64a467sg4g858",
  "ToState" => "CA",
  "To" => "5555992673",
  "ToCountry" => "US",
  "FromCountry" => "US",
  "SmsMessageSid" => "hjk87h9j8k79hj8k7h97j7k9hj8k7",
  "ApiVersion" => "2008-08-01",
  "FromCity" => "GRAND RAPIDS",
  "SmsStatus" => "received",
  "From" => "5555992673",
  "FromZip" => "49507"
}
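Since the question mentions Sinatra rather than Rails, the equivalent webhook can be a very small sketch like the one below (the /sms route name and the find_events helper are placeholders for your existing lookup code; the reply is plain TwiML, so no extra gem is needed):
require "sinatra"

# Point your Twilio number's incoming-SMS webhook at this route.
post "/sms" do
  query = params["Body"]        # e.g. "London 2013-08-02"
  events = find_events(query)   # placeholder for your existing event lookup

  reply = events.empty? ? "No events found." : events.join(", ")

  # Respond with TwiML so Twilio texts the reply back to the sender.
  content_type "text/xml"
  "<Response><Message>#{Rack::Utils.escape_html(reply)}</Message></Response>"
end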