I am working on a Mailchimp integration.
I need to pull campaign stats (opens and clicks) and store them in my local database.
Using the Mailchimp API I am getting the list of all the users along with the actions they have taken.
But my issue is how to keep the data in sync at all times.
Is there any way to skip data from the Mailchimp API that I have already synced?
The problem is that the entire data set can change between calls and there is no 'since' parameter... The only way to get an updated picture is to query all records and update your copy....
Keeping the stats data "synced at all times" really would just depend on your solution (have it query for updates when you/your users access that section...)
You could expedite the update process by keeping track of previous calls/updates with a timestamp (keep track of the last sync's timestamp and only update/add records that are newer than it), as in the sketch below.
As I said, there is currently no "since" parameter for the campaignEmailStatsAIMAll method (and no direct equivalent in the export API...)
A 'since' parameter would actually be a good feature... So if coding your own solution to track updates via the timestamp is undesirable, you may want to ask the question in the Google group or post a feature request in the Google Code project:
http://code.google.com/p/mailchimp-api/
EDIT: I just opened the feature request, as it may solve a similar issue for an upcoming project:
http://code.google.com/p/mailchimp-api/issues/detail?id=60
I have an application where I need to keep local copies of some of my user's Google Spreadsheets, and this copy should be in sync with the Google Drive version. I've been testing two methods of tracking changes to a Google Spreadsheet: (1) the file version polling method and (2) the files.watch method.
1) The file version polling method
In this method, whenever I need the most recent version of a Spreadsheet (for instance, when the user wants to download the file from my application), I retrieve the file version from Google using:
GET https://www.googleapis.com/drive/v3/files/FILE_ID?fields=version
If the version is greater than the version I have stored on my end, I know that changes have been made and the file on my end is outdated. So I download the file and update my copy.
The problem is that it takes a while for the file version number to be updated on Google's end. Ideally, after editing a Google Spreadsheet cell, my application should be able to detect this change within less than 10 seconds. However, after editing a cell and seeing the Saved to Drive confirmation at the top, sometimes it takes seconds, other times it takes minutes before the version number gets updated, so it is very inconsistent.
Aside from the version number, I've also tried polling the modifiedTime value to see if it changed sooner, but it didn't. So I tried another method.
2) The files.watch method
In this method, I keep track of the file changes by registering a webhook to receive change notifications from Google:
POST https://www.googleapis.com/drive/v3/files/FILE_ID/watch
Whenever I receive a change notification, I know that I need to update my local copy.
Unfortunately, the change notifications also don't happen as quickly as I would like. They also have very inconsistent delays: sometimes a few seconds, sometimes more than a minute.
UPDATE: 3) The 'always export' (never cache) method
To complicate matters, it seems that even if I ignore my local copy and always try to download the latest version of the file directly from Google, the downloaded file will not necessarily be the absolute latest version that the user sees in the Spreadsheets editor. I tried that using
GET https://www.googleapis.com/drive/v3/files/FILE_ID/export?mimeType=application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
but it would often return the outdated version, and sometimes would only return the latest version after a few minutes.
Is there something else that I can try? The above methods use the Google Drive Files API, but if there is a way to detect changes sooner using the Google Spreadsheets API, I would like to know.
1. How to detect file changes as soon as possible?
After a file change (in my case, a change in a Google Spreadsheet), the version does not get updated immediately, and when you watch for file changes with the files.watch API you will also not get a notification immediately.
What does get updated immediately is the list of revisions of the file, which can be retrieved with the revisions.list API:
GET https://www.googleapis.com/drive/v3/files/FILE_ID/revisions
This returns a list of all revisions of the file FILE_ID. The last item in the list is the most recent revision (the "head" revision). In order to know whether a file has changed, I retrieve this list. If the ID of the head revision is different from the ID stored on my end, my local copy is outdated, so I update the file and its revision ID.
However, if you call files.export, the file version returned will not necessarily be the absolute most recent version (i.e., the Google Sheet the user is seeing in the browser). And in the case of Google editor documents, it is not possible to retrieve the most recent revision using the revisions.get API. What can you do then?
2. How to retrieve the most recent revision of a Google Sheet?
(I bet it works for other Google editor documents as well).
Before calling files.export, you have to "touch" the file using the files.update API, updating its modifiedTime:
PATCH https://www.googleapis.com/drive/v3/files/FILE_ID
{
"modifiedTime": "TIMESTAMP"
}
Where TIMESTAMP is an RFC 3339 date-time such as 2022-04-16T22:00:00Z.
For some reason, touching the file like this "forces" Google to return the head revision of the file the next time you call files.export:
GET https://www.googleapis.com/drive/v3/files/FILE_ID/export?mimeType=MIMETYPE
In my case, MIMETYPE is application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.
That's it. So far, this has been working for me.
I would like to ask for help with the Google Admin SDK User Usage Reports. I have the following questions:
One of the parameters used to retrieve a usage report is a date. Unfortunately, for the current date the reports are not yet available; I must use a date a few days in the past. Is there any official rule, recommendation, or advice on how to know the most recent date for which the reports are already available? Should I query reports for one day in the past, 3 days, a week, or...? I just want to have the latest results without worrying about the date...
Even more: sometimes the returned reports are OK for one of the Google apps, but for another one, like Docs, I get a warning that it's not ready. So my question is: how do I get the most recent data for all of the apps? (mainly: accounts, docs, gmail)
I am using the "all" parameter to retrieve all accounts at once.
The second question is related to the first one: assuming I receive partial data (for example, no reports for Docs), is it possible that when I retrieve reports for all accounts (emails), some of them will have reports ready and some only partial data? Or are the reports always either ready or partial, so that for all accounts (emails) I should expect the same, full or partial, data?
Thanks in advance!
Is there any official rule, recommendation, or advice on how to know the most recent date for which the reports are already available? Should I query reports for one day in the past, 3 days, a week, or...? I just want to have the latest results without worrying about the date...
UserUsageReport returns Google Apps service activity across your account's users. The get() method retrieves a report for a set of account users. You can query for any date value, provided its timestamp is represented in ISO 8601 format and uses your account's time zone.
How do I get the most recent data for all of the apps? (mainly: accounts, docs, gmail)
I'm guessing you're using CustomerUsageReport; it may take a while to process a report, so you may have to retry the call again after a few hours.
Assuming I receive partial data (for example, no reports for Docs), is it possible that when I retrieve reports for all accounts (emails), some of them will have reports ready and some only partial data? Or are the reports always either ready or partial, so that for all accounts (emails) I should expect the same, full or partial, data?
I think this will depend on whether the data is available yet or not. You can try out various parameter values when accessing the API to see how it behaves.
I need to sync entities from MS Dynamics CRM 2015 (on-premise) to my 3rd-party application. For this I have set a JavaScript function on the OnSave event of the entities (e.g. Account). I can access all of the attributes and send them to my web service, except the Id (GUID) of the entity!
How can I access the Id (or set it manually) in this event?
Xrm.Page.getAttribute("accountid") and Xrm.Page.getAttribute("id") both return null, so I cannot setValue using them.
Also, Xrm.Page.data.entity.getId() returns "", which is probably logical, since the object has not been inserted in the DB yet; this is what makes inserting a runtime-generated GUID for the entity seem doable.
P.S.
I know I can do the same thing with plugins, which I have gone through, but the problem there is that when I register my plugin for the Update message it gets called a lot of times (mostly when it has been set for Invoice). This is what made me go with JScript, since the OnSave event seems more logical than the Update message of the plugin.
As you already found out, records which have not yet been saved have no ID. That's by design (and obvious).
Also, IDs being PKs in the database, they are handled by the system and cannot be touched or hand-crafted.
Your best bet to keep a similar behavior would be a Post-Operation Create plugin living outside the sandbox (Isolation mode: None).
Another good option would be to pull data instead of pushing it: the 3rd party application can periodically fetch new records through any of the exposed APIs (REST, SOAP, SDK ... there are many options).
We are using the push notifications provided by the Google Drive API. The change ID is pushed to the service by Google, and we use the GET API to retrieve the change object. Our requirement is to determine from the change object whether it represents a file upload, an update, or a trash operation. We tried using the created date, modified date, and modification date to tell the difference, but the modification date does not reflect the time the change originally happened, and there is sometimes a difference of 2 seconds. Also, when a folder is shared with multiple users, for the same change, the modification date each user receives differs from the others', and the difference is in seconds.
Please suggest whether there is a way to distinguish a file upload/update/trash from the change object.
Also, please provide input on why the modification date is different for each user and different from the file's modified date.
NOTE: We cannot watch individual files, as per our requirement, because there are a lot of files in the drive; we can only watch for changes in the entire drive.
I have been caching Google Calendar event data in a local database using the Python API for v3.
I have this ID cached locally: 16...hk_20140530T010000Z. The event was deleted by a user using Google Calendar on the web, and was not deleted by any scripting. The user reported that the event was still appearing in her reports from our local cache.
I investigated, assuming there had been a problem at some point in time and the scripts had simply missed cancelling this event (I use updatedMin, so I thought maybe something could theoretically have been missed). However, even when I ran a full query of all data for this calendar, this instance of the recurrence was nowhere to be seen. There were plenty of other instances in the recurrence that appeared, but this one did not exist in the results from Google, cancelled or otherwise.
I pull these IDs directly from Google's results when I save them in the database, and the fact that this one exists tells me it had to have existed at some point. The fact that it is no longer listed has me puzzled, as from my understanding it should be there with a status of cancelled.
Does anyone have any suggestions on why this may be the case? We have 200+ calendars that I sync, and deleting and re-importing on a regular basis would be a very time-consuming process.
Thanks.
Event deletions can disappear after a while. Make sure to correctly handle the 410 response code when using old sync tokens / modified since values (http://developers.google.com/google-apps/calendar/v3/sync).