I know there are plenty of questions on this topic, but I couldn't find a proper explanation for my case. On my website the user manually clicks on the Google map and then the geocode function is called. While testing "realistically", nearly all calls were more than 1 second apart (6 calls were around 0.9 s); see the logging of my geocode calls:
The 94th call hit the OVER_QUERY_LIMIT
In total I had 424 geocoding calls today; see the Google metrics:
As far as I understand I'm far from the usage limit. (I am in the "Premium Plan"). So I don't understand why I'm exceeding the quota, any ideas? Or how high are the "initial quota of requests" and the "additional requests on a per-second basis"?
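For what it's worth, while the quota question itself stands, spacing the user-triggered calls out client-side avoids accidental bursts. A minimal throttle sketch (in Python for illustration; the same idea ports to the browser JS that actually calls the geocoder, and the 1-second interval is an assumption, not a documented limit):

```python
import time

class MinIntervalThrottle:
    """Guarantee at least `interval` seconds between successive calls.

    `clock` and `sleep` are injectable so the behaviour can be tested
    without real waiting.
    """

    def __init__(self, interval=1.0, clock=time.monotonic, sleep=time.sleep):
        self.interval = interval
        self._clock = clock
        self._sleep = sleep
        self._last = None

    def wait(self):
        """Block until `interval` seconds have passed since the last call."""
        now = self._clock()
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                self._sleep(remaining)
        self._last = self._clock()
```

`throttle.wait()` would be called immediately before each geocode request.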
Related
Does someone know what "sf_max_daily_api_calls" parameter in Heroku mappings does? I do not want to assume it is a daily limit for write operations per object and I cannot find an explanation.
I tried to open a ticket with Heroku, but in their support ticket form the "Which application?" drop-down is required, and none of the support categories have anything to choose from there; the only option is "Please choose...".
I tried to find any reference to this field and can't - I can only see it used in Heroku's Quick Start guide, but without an explanation. I have a very busy object I'm working on, read/write, and want to understand any limitations I need to account for.
Salesforce orgs have a rolling 24-hour limit on daily API calls. Generally the limit is very generous in test orgs (sandboxes), 5M calls, because you can make stupid mistakes there. In production it's lower. It's a bit counterintuitive, but it protects their resources and forces you to write optimised code/integrations...
You can see your limit in Setup -> Company information. There's a formula in documentation, roughly speaking you gain more of that limit with every user license you purchased (more for "real" internal users, less for community users), same as with data storage limits.
Also, every API call is supposed to return the current usage (in a special tag in the SOAP API, in a header in the REST API), so I'm not sure why you'd have to hardcode anything...
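For the REST case, the usage comes back in the `Sforce-Limit-Info` response header in the form `api-usage=used/limit`. A small parsing sketch (the header name and format are as I recall them from Salesforce's REST API docs; verify against your org's actual responses):

```python
import re

def parse_limit_info(header_value):
    """Parse a Salesforce `Sforce-Limit-Info` header, e.g.
    'api-usage=18/15000', into a (used, limit) tuple.
    Returns None if the header is missing or malformed."""
    m = re.search(r"api-usage=(\d+)/(\d+)", header_value or "")
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))

def usage_fraction(header_value):
    """Fraction of the rolling 24-hour API quota consumed, or None."""
    parsed = parse_limit_info(header_value)
    if parsed is None:
        return None
    used, limit = parsed
    return used / limit
```

`usage_fraction` is handy for triggering a warning at a chosen threshold instead of hardcoding a limit.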
If you write your operations right the limit can be very generous. No idea how that Heroku Connect works. Ideally you'd spot some "bulk api 2.0" in the documentation or try to find synchronous vs async in there.
A normal old-school synchronous update via the SOAP API lets you process 200 records at a time, using 1 API call. The REST Bulk API accepts CSV/JSON/XML of up to 10K records and processes them asynchronously; you poll for an "is it done yet" result... So starting a job, uploading files, committing the job and then checking only, say, once a minute can easily be 4 API calls, and you can process millions of records before hitting the limit.
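Roughly, that four-call flow looks like this. A sketch of the requests involved, expressed as pure (method, path, body) builders rather than live HTTP calls (the `v59.0` version path and the exact payload shapes are assumptions; check the Bulk API 2.0 docs for your org's API version):

```python
import json

API = "/services/data/v59.0"  # assumed API version; adjust to your org

def create_job(object_name, operation):
    """Step 1: open a Bulk API 2.0 ingest job (1 API call)."""
    body = {"object": object_name, "operation": operation,
            "contentType": "CSV"}
    return ("POST", f"{API}/jobs/ingest", json.dumps(body))

def upload_csv(job_id, csv_text):
    """Step 2: upload the record data for the job (1 API call)."""
    return ("PUT", f"{API}/jobs/ingest/{job_id}/batches", csv_text)

def close_job(job_id):
    """Step 3: mark the upload complete so processing starts (1 API call)."""
    return ("PATCH", f"{API}/jobs/ingest/{job_id}",
            json.dumps({"state": "UploadComplete"}))

def poll_job(job_id):
    """Step 4: poll for completion, e.g. once a minute (1 call per poll)."""
    return ("GET", f"{API}/jobs/ingest/{job_id}", None)
```

Each tuple is (HTTP method, path relative to the instance URL, request body); a real client would send these with its HTTP library plus an `Authorization: Bearer ...` header.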
When all else fails, you've exhausted your options, can't optimise any more, can't purchase more user licenses... I think they sell "packets" of additional API call limit; contact your account representative. But there are lots of things you can try before that, not the least of them being setting up a warning for when you hit, say, a 30% threshold.
I keep getting errors from the Google API telling me that I have exceeded the limit of 100 reads per second per user when trying to read from Google Sheets using the API. I'm using CozyRoc's REST Connection and REST Source connected to Google Sheets then connecting to a SQL Server Destination table and trying to populate the table from the Google Sheet.
Please don't try to offer any suggestions about other programs or other ways of accessing the data. I have scores of SSIS packages that use the same set up to import data. In all of those that cycle through about 10 Google sheets with anywhere from one to 20 tabs and as many as 400 rows, I've had to set the Reads per Second to one to avoid the error. That makes my uploads incredibly slow. And YES, I contacted CozyRoc who tells me it's a Google API problem, not theirs, and there's nothing they can or will do about it.
So . . . I made sure we have a billing account . . .
I was able to sign in and look at the Quotas screen. 500 reads per Second and 100 Reads per Second per user.
I was able to request an increased limit for the Reads Per Second to 500 which doesn't require a billing account, but that doesn't change the Reads Per Second Per User.
CozyRoc uses my user id (OAuth token) to access Google Sheets, so every read comes from a single user.
A popup displays when I try to edit the Reads Per Second Per User from the Quotas screen asking me to set the limit to a maximum of 100 or to request a quota limit increase.
I click on that request button and I am immediately returned to the Quotas screen. I'm never asked to set a new limit, I'm never told that the request was sent and received, and I see no notice telling me how long it may take to process the request. It's been about a week now since I first tried. It would be nice if they would tell you SOMETHING!
My thought is that there may be something wrong with their website, perhaps with that popup. I've tried calling Cloud Support, but they refuse to help, saying basically "Not my job to help you with setting quota limits," even though my question is really whether there is a known issue with the quota limit increase website/popup.
SO . . . Is anyone else having a similar problem?
Can anyone tell me if what's happening is normal and that it just takes WEEKS for Google to process a quota limit increase or how long it normally takes for them to process a quota limit rate increase?
Is there anywhere I can reach out to at Google where I can do a screen sharing session and show them what's happening to get an answer or find out whether or not my request was even received?
Any ideas or thoughts as to how I can find out what's going on?
Please let me know.
Thanks,
How to edit API specific quota
The per-user quota can be edited per API and per project:
Go to the GCP console
Choose the project for which you want to increase the quota limit
Go to IAM & Admin -> Quotas
Choose the Google Sheets API and Read requests
Click on the pencil next to Read requests per 100 seconds per user
Choose a quota limit of up to 100 - the maximum allowed by the Sheets API without a quota increase request
Click on Save
If you want to increase your limit above 100 requests per 100 seconds per user:
Mind that this is above the normally allowed quota
Click on apply for higher quota
Click on ALL QUOTA for Read requests per 100 seconds per user
Check the tickbox next to GLOBAL
Confirm your contact details and click on Next
Enter the desired new limit and click on Done
KEEP IN MIND
If you do not have a billing account, you are not eligible to choose a limit of more than 100 read requests per user per 100 seconds, and you will get the error message:
You can't request more quota because your project is not linked to a billing account.
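Independently of raising the quota, a client-side limiter can keep a single OAuth user under the published 100-reads-per-100-seconds figure. A sketch, assuming the quota behaves like a rolling window (Google doesn't document the exact enforcement granularity):

```python
import collections
import time

class SlidingWindowLimiter:
    """Allow at most `limit` calls in any rolling `window` seconds,
    e.g. 100 reads per 100 seconds per user for the Sheets API."""

    def __init__(self, limit=100, window=100.0,
                 clock=time.monotonic, sleep=time.sleep):
        self.limit = limit
        self.window = window
        self._clock = clock
        self._sleep = sleep
        self._times = collections.deque()  # timestamps of recent calls

    def acquire(self):
        """Block until another call is allowed, then record it."""
        now = self._clock()
        self._evict(now)
        if len(self._times) >= self.limit:
            # wait until the oldest recorded call falls out of the window
            self._sleep(self.window - (now - self._times[0]))
            self._evict(self._clock())
        self._times.append(self._clock())

    def _evict(self, now):
        while self._times and now - self._times[0] >= self.window:
            self._times.popleft()
```

Calling `limiter.acquire()` before every read spreads requests out only when needed, instead of slowing every single call down to one per second.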
In the quota calculation for YouTube, there is neither a currency nor a volume that the price refers to. Where do I find the pricing per API call?
Thank you!
The YouTube API quota cost calculation can be complicated; that's why they created the calculator:
YouTube Data API (v3) - Quota Calculator
This tool lets you estimate the quota cost for an API query. All API requests, including invalid requests, incur a quota cost of at least one point.
To use the tool, select the appropriate resource, method, and part parameter values for your request, and the approximate quota cost will display in the table. Please remember that quota costs can change without warning, and the values shown here may not be exact.
The cost is counted against your quota, which you can find in the developer console.
I am not sure I understand what you mean by currency. The YouTube API is a free API; it doesn't cost any money to use. The quota is measured in units, not in a currency.
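To make the "units, not currency" point concrete: each method has a unit cost that counts against a daily allocation of units. A rough estimator sketch (the costs and the 10,000-unit default below are assumptions based on commonly cited values; always confirm them in the Quota Calculator, since they can change without warning):

```python
# Assumed per-call unit costs; verify in the official Quota Calculator.
QUOTA_COSTS = {
    "search.list": 100,    # searches are expensive
    "videos.list": 1,      # simple reads are cheap
    "channels.list": 1,
    "videos.insert": 1600, # uploads are the most expensive
}

DAILY_QUOTA = 10_000  # assumed default daily allocation, in units

def estimate_units(calls):
    """Total quota units for a dict of {method_name: call_count}."""
    return sum(QUOTA_COSTS[method] * count for method, count in calls.items())

def fits_in_daily_quota(calls, quota=DAILY_QUOTA):
    """Whether the planned calls fit within one day's allocation."""
    return estimate_units(calls) <= quota
```

So 50 searches plus 200 video lookups consume 5,200 units, well within a 10,000-unit day, while even a handful of uploads can exhaust it.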
Does the Google Analytics API throttle requests?
We have a batch script that I have just moved from v2 to v3 of the API and the requests go through quite well for the first bit (50 queries or so) and then they start taking 4s or so each. Is this Google throttling us?
While Matthew is correct, I have another possibility for you: the Google Analytics API caches your requests to some extent. Let me try to explain.
I have a customer / site that I request data from. While testing I noticed some strange things.
the first million rows' results would come back within an acceptable amount of time.
after a million rows things started to slow down; we were seeing results take around five times as long. Instead of 5 minutes we were waiting 20 minutes or more for the results to return.
Example:
Request URL :
https://www.googleapis.com/analytics/v3/data/ga?ids=ga:34896748&dimensions=ga:date,ga:sourceMedium,ga:country,ga:networkDomain,ga:pagePath,ga:exitPagePath,ga:landingPagePath&metrics=ga:entrances,ga:pageviews,ga:exits,ga:bounces,ga:timeOnPage,ga:uniquePageviews&filters=ga:userType%3D%3DReturning+Visitor;ga:deviceCategory%3D%3Ddesktop&start-date=2014-05-12&end-date=2014-05-22&start-index=236001&max-results=2000&oauth_token={OauthToken}
Request time (seconds:milliseconds): 0:484
Request URL :
https://www.googleapis.com/analytics/v3/data/ga?ids=ga:34896748&dimensions=ga:date,ga:sourceMedium,ga:country,ga:networkDomain,ga:pagePath,ga:exitPagePath,ga:landingPagePath&metrics=ga:entrances,ga:pageviews,ga:exits,ga:bounces,ga:timeOnPage,ga:uniquePageviews&filters=ga:userType%3D%3DReturning+Visitor;ga:deviceCategory%3D%3Ddesktop&start-date=2014-05-12&end-date=2014-05-22&start-index=238001&max-results=2000&oauth_token={OauthToken}
Request time (seconds:milliseconds): 7:968
I did a lot of testing stopping and starting my application. I couldn't figure out why the data was so fast in the beginning then slow later.
Now, I have some contacts on the Google Analytics development team, the guys in charge of the API. So I made a nice test app, logged some results showing my issue and sent it off to them, with the question: "Are you throttling me?"
They were also perplexed, and told me there is no throttle on the API; there is the flood protection limit that Matthew speaks of. My developer contact forwarded it to the guys in charge of the traffic.
Fast forward a few weeks. It seems that when we make a request for a bunch of data, Google caches the data for us. It's saved on the server in case we request it again. By restarting my application I was accessing the cached data, and it would return fast. When I let the application run longer, I would suddenly reach non-cached data and it would take longer for the requests to return.
I asked how long data is cached for; the answer was that there is no set time. So I don't think you are being throttled. I think your initial speedy requests hit cached data and your slower requests hit non-cached data.
Email back from google:
Hi Linda,
I talked to the engineers and they had a look. The response was
basically that they thinks it's because of caching. The response is
below. If you could do some additional queries to confirm the behavior
it might be helpful. However, what they need to determine is if it's
because you are querying and hitting cached results (because you've
already asked for that data). Anyway, take a look at the comments
below and let me know if you have additional questions or results that
you can share.
Summary from talking to engineer: "Items not already in our cache will
exhibit a slower retrieval processing time than items already present
in the cache. The first query loads the response into our cache and
typical query times without using the cache is about 7 seconds and
with using the cache is a few milliseconds. We can also confirm that
you are not hitting any rate limits on our end, as far as we can tell.
To confirm if this is indeed what's happening in your case, you might
want to rerun verified slow queries a second time to see if the next
query speeds up considerably (this could be what you're seeing when
you say you paste the request URL into a browser and results return
instantly)."
-- IMBA Google Analytics API Developer --
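The caching explanation above is easy to check empirically, as the engineers suggest: time the same query twice and see whether the second run is dramatically faster. A small sketch (the 5x speedup threshold is an arbitrary heuristic of mine, not anything Google documents):

```python
import time

def timed(fn):
    """Run `fn` once and return (elapsed_seconds, result)."""
    t0 = time.perf_counter()
    result = fn()
    return time.perf_counter() - t0, result

def looks_cached(first_s, second_s, speedup=5.0):
    """Heuristic: if an identical repeated query returns `speedup` times
    faster than the first run, it was probably served from the cache."""
    return second_s > 0 and first_s / second_s >= speedup
```

Here `fn` would be a closure performing the actual Analytics request for a fixed URL; compare `timed(fn)[0]` across two identical back-to-back runs of a verified-slow query.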
Google's Analytics API does have a rate limit per their docs: https://developers.google.com/analytics/devguides/reporting/core/v3/coreErrors
However, this should not cause delayed requests; rather, the request should be returned with a response of: 403 userRateLimitExceeded
Description of that error:
Indicates that the user rate limit has been exceeded. The maximum rate limit is 10 qps per IP address. The default value set in Google Developers Console is 1 qps per IP address. You can increase this limit in the Google Developers Console to a maximum of 10 qps.
Google's recommended course of action:
Retry using exponential back-off. You need to slow down the rate at which you are sending the requests.
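Google's "retry using exponential back-off" advice can be sketched like this (the 64-second cap and the random jitter are conventional choices, not mandated values; `is_rate_limited` is a hypothetical predicate you'd implement for your HTTP client's 403 userRateLimitExceeded errors):

```python
import random
import time

def call_with_backoff(request, is_rate_limited, max_retries=5,
                      sleep=time.sleep, rng=random.random):
    """Retry `request` with exponential backoff plus jitter.

    `request` is a zero-argument callable performing the API call;
    `is_rate_limited` decides whether a raised exception is a
    rate-limit error worth retrying. Non-rate-limit errors, and the
    final failed attempt, are re-raised to the caller.
    """
    for attempt in range(max_retries):
        try:
            return request()
        except Exception as exc:
            if not is_rate_limited(exc) or attempt == max_retries - 1:
                raise
            # delays of 1 s, 2 s, 4 s, ... capped at 64 s, plus jitter
            sleep(min(64.0, 2.0 ** attempt) + rng())
```

The jitter matters when several workers share an IP: without it they all retry in lockstep and hit the limit again together.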
I have developed a Google Places application to get info about places. I have verified my identity with Google, and as per the limits I should be allowed up to 100,000 requests per day. However, after under 300 requests (different numbers go through each day), I get the message back: OVER_QUERY_LIMIT. Any similar experiences or ideas how to enable the requests to go through?
Thank you.
D Lax
You can track your requests at
https://code.google.com/apis/console/?noredirect#:stats
I ran into this issue as well. I was able to find 3 throttle limits for the Places API, given by Google:
10 api calls per every 1 second
100 api calls per every 100 seconds
50,000 api calls per 1 day
If I were to go over any of these limits, I would receive the OVER_QUERY_LIMIT error and it would return no results for that given address.
I found a way to have my program sleep for 11 seconds after calling the Places API with a dataset of 10 addresses. Then the program would call the Places API with a new dataset of 10 addresses. This solution gets around the 10 calls/second and the 100 calls/100 seconds throttle limits. However, I did run into the OVER_QUERY_LIMIT error once I tried my 25th dataset of 10 addresses (after 240 API calls). So it is clear that there are other, unpublished throttles that help protect the Google Maps platform.
But I did see that the limits mentioned above may be changed if you get in contact with the Google API help team and sort it out with them.
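The batch-and-sleep workaround described above can be sketched as follows (`geocode` stands in for whatever single-address Places call you make; the batch size of 10 and the 11-second pause come straight from the answer, and as noted they still hit an unpublished limit eventually):

```python
import time

def chunked(items, size=10):
    """Yield consecutive slices of `items`, `size` elements each."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def geocode_in_batches(addresses, geocode, batch_size=10,
                       pause=11.0, sleep=time.sleep):
    """Call `geocode` on each address in batches, sleeping `pause`
    seconds between batches to stay under the 10 calls/second and
    100 calls/100 seconds throttles."""
    results = []
    for i, batch in enumerate(chunked(addresses, batch_size)):
        if i:  # no pause needed before the very first batch
            sleep(pause)
        results.extend(geocode(address) for address in batch)
    return results
```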