I need a suitable caching approach for use with an enterprise portal showing data from the Google Calendar API. What algorithms or design patterns are best applicable?
The Google Calendar API is limited by the number of requests per day (the default is 10,000 requests/day; I have requested more) and by rate of access (5 requests/second/user).
There are two core API methods that I expect to use, one to get a list of user calendars (1 API hit) and one to download the events of an individual calendar (1 API hit per calendar).
Both the calendar list and individual calendars contain etag values which can be used to help avoid unnecessary API requests. If you have a list of individual calendar etag values, then you can see whether any of them have changed by querying just the calendar list. (Unfortunately, an HTTP 304 Not Modified response still counts as an API hit.)
Also I don’t really want to download and cache the entire calendar contents (so maybe just a few days or weeks at a time).
I need to find an approach which tries to minimize the number of API calls but doesn't try to store everything. It also needs to be able to cope with occasionally fetching data from unchanged calendars because the "time sliding window" on the calendar data has moved on. I would like the system to be backed by data storage so that multiple portal instances could share the same data.
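One possible sketch of that design (all class and field names here are hypothetical, not from any Google library): keep each calendar's etag plus the time window it was cached for, refresh by fetching the calendar list once (1 API hit), and re-download only calendars whose etag changed or whose cached window no longer covers the range you need. A shared database behind the map would let multiple portal instances reuse the same entries.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of an etag-aware cache for calendar data. A real implementation
// would back this map with shared storage (e.g. a database) so several
// portal instances can reuse the same entries.
class CalendarCache {
    static final class Entry {
        String etag;          // etag of the cached calendar
        Instant windowStart;  // start of the cached "sliding window"
        Instant windowEnd;    // end of the cached window
        Entry(String etag, Instant start, Instant end) {
            this.etag = etag; this.windowStart = start; this.windowEnd = end;
        }
    }

    private final Map<String, Entry> entries = new HashMap<>();

    void put(String calendarId, String etag, Instant start, Instant end) {
        entries.put(calendarId, new Entry(etag, start, end));
    }

    // Given the etags from a single calendar-list request (1 API hit),
    // return the calendars that must be re-downloaded: unknown, changed
    // etag, or cached window no longer covering the requested range.
    List<String> staleCalendars(Map<String, String> listedEtags,
                                Instant wantStart, Instant wantEnd) {
        List<String> stale = new ArrayList<>();
        for (Map.Entry<String, String> e : listedEtags.entrySet()) {
            Entry cached = entries.get(e.getKey());
            if (cached == null
                    || !cached.etag.equals(e.getValue())
                    || cached.windowStart.isAfter(wantStart)
                    || cached.windowEnd.isBefore(wantEnd)) {
                stale.add(e.getKey());
            }
        }
        return stale;
    }
}
```

This way the steady-state cost is one API hit per refresh cycle, plus one hit per calendar that actually changed or whose window slid forward.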
Related
My team and I are currently in the process of creating a website listing a number of local businesses, and as part of our service we are going to be administering their Google My Business. Basically each business will have a page on our website containing their information, such as phone numbers, address and so on, including reviews, opening hours and a Google Maps integration.
My question is: is it possible to use the GMB APIs to integrate reviews, opening hours and Google Maps for each of our customers on the same website (on different pages), and would we need to request unique API keys for each customer?
You only need one key. Make sure to restrict the key (for example, by HTTP referrer) so you can safely use it across multiple instances of your Google Maps integration.
When using the LUIS programmatic API, we get frequent 429 errors ("too many requests") when doing a half-dozen GET and POST requests. We've inserted a pause in our code to deal with this.
We have a paid subscription key to LUIS, which indicates we should get 50 requests/second (see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/language-understanding-intelligent-services/). However, it seems the paid subscription key can only be used for hitting the application endpoint. For Ocp-Apim-Subscription-Key in request headers, we must use our "programmatic key", which is associated with the Starter_Key, which is (apparently) rate-limited.
Am I missing something here? How do we get more throughput on the LUIS programmatic API?
One of our engineers arrived at the following answer, so I'm going to post his answer here.
The programmatic API is limited to 5 requests per second, 100K requests per month for everyone. Your paid subscription key is only for the endpoint API, not for the programmatic API.
If you need more throughput:
Put your API requests in a queue. It is unlikely that you need to update your LUIS model non-stop, 5 times per second; you probably just have a burst of updates. Put them in a queue to stay within the limit.
Don't try using the same user account to manage multiple LUIS models. Set up additional accounts for each model, which gives you additional programmatic keys. Each programmatic key gives you 5 requests per second.
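The queue idea can be sketched as a pacer that spaces outgoing requests at the allowed rate (class and method names here are illustrative, not part of any LUIS SDK). The clock is injected so the pacing logic is testable without real sleeping; a production caller would sleep until the returned timestamp before issuing the actual HTTP request.

```java
import java.util.function.LongSupplier;

// Sketch of a fixed-rate pacer for API calls: before each request, ask
// for a send time at least `intervalMillis` after the previous one.
class RequestPacer {
    private final long intervalMillis;
    private final LongSupplier clock;  // current time in milliseconds
    private long nextAllowed = Long.MIN_VALUE;

    RequestPacer(int requestsPerSecond, LongSupplier clock) {
        this.intervalMillis = 1000L / requestsPerSecond;
        this.clock = clock;
    }

    // Returns the earliest millisecond timestamp at which the next
    // request may be sent, and reserves that slot.
    long reserveSendTime() {
        long now = clock.getAsLong();
        long sendAt = Math.max(now, nextAllowed);
        nextAllowed = sendAt + intervalMillis;
        return sendAt;
    }
}
```

A worker thread draining a queue would call `reserveSendTime()`, sleep until that time, then make the request; bursts get smoothed out automatically while idle periods carry no delay.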
With a simple Java program, I send GET requests using the YouTube Data API (specifically videos.list) in order to get the public metadata of a video and store it as .json files.
For my university's research, we have to do this for all available YouTube video IDs provided in the YouTube-8M dataset.
Therefore, I would like to know if there is a way to extend the available quota for requests (I already know about the billing option, but I am a student and my university is small).
I have read the YouTube API terms, which states that only one project per client may be used to send such requests with the necessary API Key.
If I understand it correctly, even my simple Java code counts as such a client.
In some other Stack Overflow questions about extending one's daily quota with API keys, some suggested creating multiple accounts or projects.
Is this a legal option or not? Or is there another possibility to get a higher quota for simple requests used in research like I do right now?
If you go to the Google Developer Console where you enabled the YouTube API, the second tab is called Quota.
Click the pencil next to whichever quota it is that you are exceeding. A new window will pop up with a link called "apply for higher quota".
Fill out the form to apply. To my knowledge you do not have to pay for additional YouTube quota but it can take time to get approved. Make sure you comply with everything on the form.
I have never heard of the one-project-per-client term. Technically, you can run your application using different API keys and it should work fine. There is also nothing wrong with creating additional projects on the Google Developer Console; you don't need to go as far as creating another Google account.
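Whatever quota you end up with, a long crawl goes more smoothly if the program tracks how much of the daily budget it has spent instead of discovering the limit through quota errors mid-run. A small sketch (the class is hypothetical; the per-request unit cost and the daily cap depend on your project's actual quota):

```java
// Sketch of a daily quota budget for a crawling job: stop issuing
// requests once the configured unit budget is spent, rather than
// failing on quota errors partway through a batch.
class QuotaBudget {
    private final long dailyUnits;
    private long spent = 0;

    QuotaBudget(long dailyUnits) { this.dailyUnits = dailyUnits; }

    // Try to reserve `units` for the next request; false means the
    // caller should stop (or wait for the daily reset) instead.
    boolean tryReserve(long units) {
        if (spent + units > dailyUnits) return false;
        spent += units;
        return true;
    }

    long remaining() { return dailyUnits - spent; }
}
```

The crawler checkpoints which video IDs it has already fetched, so a run that stops at the budget boundary can resume cleanly after the quota resets.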
The Google Maps API provides an Autocomplete service.
According to this blog post (official?), the only limitation on this service is that you must display the "powered by Google" logo.
When I'm using the JS library (http://maps.google.com/maps/api/js?sensor=false&libraries=places) I'm not sending any key information. But in a sniffer I can see a token GET parameter, which seems to be generated by the library.
Which limitation information is correct?
How can Google track usage without a key (in case it is limited by requests per day)?
Is it possible to retrieve the autosuggestion by JS (from google.maps.places.Autocomplete), but then use the reference (without storing it) on the backend to load the place details (similar to the getPlace() functionality of an Autocomplete object)? If this is not limited, how do I generate the token?
Google Places API Web Service
The Google Places API Web Service enforces a default limit of 1,000
requests per 24 hour period, which you can increase free of charge. If
your app exceeds the limit, the app will start failing. Verify your
identity to get up to 150,000 requests per 24 hour period, by enabling
billing on the Google Developers Console.
Now check at the very top of that page:
Note: These limits do not apply to the Places Library in the Google
Maps JavaScript API, which is covered by the Google Maps JavaScript
API limits. If you are developing a web based application that only
needs to search for places, and does not submit new places, you should
use the Places Library of the Google Maps Javascript API rather than
the Google Places API Web Service. The Places library assigns a quota
to each end user rather than to each key. This means that your
available quota increases with your user base rather than being capped
at a fixed amount.
They are probably using IP addresses to identify individual users.
Because of the change of Terms, Parse now limits the number of requests per second, which is a good thing, but do Parse Push and Parse Analytics count as requests?
Anytime you make a network call to Parse on behalf of your app via the iOS, Android, JavaScript, Windows, Xamarin, Unity, or REST API, it counts as an API request.
This does include things like finds, saves, logins, amongst other kinds of requests. It also includes requests to send push notifications, although this is seen as a single request regardless of how many recipients are targeted. Serving Parse files counts as an API request, including static assets served from Parse Hosting.
Analytics requests do have a special exemption. You can send us your analytics events any time without being limited by your app's request limit, as noted on Parse's Plans page.
Here is the link to official FAQ - https://parse.com/plans/faq
What is considered an API request?
In general, anytime you make a network call to Parse on behalf of your
app using one of the Parse SDKs or REST API, it counts as an API
request.
What types of operations are counted as API requests?
Queries, saves, logins, amongst other kinds of operations will be taken into account
when determining the number of requests generated by your app. A
request to send a push notification sent by a SDK will count as a
single request regardless of how many installations are targeted.
Batched requests will be counted based on the number of operations
performed in each batch. Serving Parse files counts as an API request,
including static assets served from Parse Hosting. Analytics requests
do not count towards your app's request limit.
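The counting rules from the FAQ above can be summed up in a few lines (the class is a hypothetical illustration, not part of any Parse SDK): batches count per contained operation, a push counts as one request no matter how many installations it targets, and analytics events are exempt.

```java
// Sketch of Parse request accounting as described in the FAQ.
class ParseRequestCounter {
    long total = 0;

    // Batched requests are counted by the operations in each batch.
    void recordBatch(int operationsInBatch) { total += operationsInBatch; }

    // A push counts as a single request regardless of recipients.
    void recordPush(int installationsTargeted) { total += 1; }

    // Queries, saves, logins, file serves, etc. count as one each.
    void recordSingle() { total += 1; }

    // Analytics events do not count toward the request limit.
    void recordAnalyticsEvent() { /* exempt */ }
}
```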