When using the LUIS programmatic API, we get frequent 429 ("Too Many Requests") errors after doing only a half-dozen GET and POST requests. We've inserted a pause in our code to work around this.
We have a paid subscription key to LUIS, which indicates we should get 50 requests/second (see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/language-understanding-intelligent-services/). However, it seems the paid subscription key can only be used for hitting the application endpoint. For the Ocp-Apim-Subscription-Key request header, we must use our "programmatic key", which is associated with the Starter_Key and is (apparently) rate-limited.
Am I missing something here? How do we get more throughput on the LUIS programmatic API?
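For context, the pause mentioned above is essentially a retry loop that backs off whenever the service answers 429. A rough Python sketch; the region in the base URL and the key are placeholders, and the Retry-After header is only used if the service happens to send it:

    import time
    import requests

    AUTHORING_KEY = "<your programmatic key>"                          # placeholder
    BASE = "https://westus.api.cognitive.microsoft.com/luis/api/v2.0"  # region may differ

    def luis_get(path, max_retries=5):
        """GET from the LUIS programmatic API, pausing and retrying on 429."""
        headers = {"Ocp-Apim-Subscription-Key": AUTHORING_KEY}
        for attempt in range(max_retries):
            resp = requests.get(BASE + path, headers=headers)
            if resp.status_code != 429:
                resp.raise_for_status()
                return resp.json()
            # Honour Retry-After if present, otherwise back off exponentially.
            time.sleep(float(resp.headers.get("Retry-After", 2 ** attempt)))
        raise RuntimeError("Still throttled after {} retries".format(max_retries))

    # e.g. luis_get("/apps/")   # list the applications on the account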
One of our engineers arrived at the following answer, so I'm posting it here.
The programmatic API is limited to 5 requests per second, 100K requests per month for everyone. Your paid subscription key is only for the endpoint API, not for the programmatic API.
If you need more throughput:
Put your API requests in a queue. It is unlikely that you need to update your LUIS model non-stop, 5 times per second -- you probably just have occasional bursts of updates. Put them in a queue and drain it slowly enough to stay within the limit (see the sketch after this list).
Don't use the same user account to manage multiple LUIS models. Set up a separate account for each model; each account gets its own programmatic key, and each programmatic key gives you 5 requests per second.
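A minimal Python sketch of the queue idea, assuming the 5 requests/second figure above; send_request (and my_send_fn in the usage comments) are placeholders for whatever function performs your actual GET/POST against the programmatic API:

    import queue
    import threading
    import time

    MIN_INTERVAL = 1.0 / 5          # stay at or below 5 requests per second
    work = queue.Queue()

    def worker(send_request):
        """Drain queued model updates, spacing them out instead of bursting."""
        while True:
            item = work.get()
            started = time.monotonic()
            send_request(item)                   # your call to the programmatic API
            elapsed = time.monotonic() - started
            time.sleep(max(0.0, MIN_INTERVAL - elapsed))
            work.task_done()

    # threading.Thread(target=worker, args=(my_send_fn,), daemon=True).start()
    # work.put({"op": "add-utterance", "text": "book a flight"})   # producer side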
I have some simple websites (not Laravel applications) with forms where people can enter their postal code and house number, and the street and city fields are automatically filled in with the associated information. To accomplish this I make an AJAX request to my Laravel application, which returns the associated street and city. My Laravel application then calls a third-party API, which costs me around €0.01 per request.
Now I want to avoid unwanted and unauthorized access to my Laravel API calls, because each call costs me money. At the moment it is very easy to replicate such calls, and someone with bad intentions could write a script that performs thousands of calls per minute.
So my question is how I can prevent unwanted and unauthorized API calls. I have already read about Sanctum and Passport, but from what I read they only apply to authenticated users. And using a token in the request header seems pointless, because anybody with a little knowledge can find the token and reuse it.
Note that the people who fill in the forms are random visitors and don't have an account.
There are probably many approaches. A simple but effective one is sessions. You can store the user in a session and count their API accesses there. As soon as the count exceeds what you allow, you block further requests and record the block in the session as well. Pay attention to the session lifetime, though; it must be long enough.
A malicious user can simply start a new session, though. To counter this, you can also put their IP on an internal blacklist for a day.
Note: an open API is always a point of attack.
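A framework-agnostic Python sketch of the counting idea; in Laravel you would put this in middleware and back it with the session and a cache store. The limit, the day-long block, and the in-memory dicts are just placeholders:

    import time

    MAX_CALLS_PER_SESSION = 20          # assumed limit, tune to your form's real usage
    BLACKLIST_SECONDS = 24 * 3600       # block abusive IPs for a day

    call_counts = {}    # session_id -> number of lookups made
    blacklist = {}      # ip -> unix time when the block expires

    def allow_request(session_id, ip):
        """Return True if this lookup may hit the paid third-party API."""
        now = time.time()
        if blacklist.get(ip, 0) > now:
            return False
        count = call_counts.get(session_id, 0) + 1
        call_counts[session_id] = count
        if count > MAX_CALLS_PER_SESSION:
            blacklist[ip] = now + BLACKLIST_SECONDS   # session exhausted: block the IP too
            return False
        return True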
I have a couple of questions about the Google Pay Passes API:
How many loyalty classes can we create?
What is the daily quota for this REST API, i.e. how many calls can we make per user per day?
Can we send the pass token links in an email with a self-generated QR code? Or does it have to be the "Add to Google Pay" button? Is a QR code allowed according to Google's design guidelines and use cases?
Thank you for answering this question.
How many loyalty classes can we create?
Unlimited
What is the daily quota for this REST API, i.e. how many calls can we make per user per day?
"We might limit the rate at which you can call our API. We recommend you keep your requests to no more than 20 requests per second."
Source: https://developers.google.com/pay/passes/guides/get-started/api-guidelines/performance-tips#limit
Can we send the pass token links in an email with a self-generated QR code? Or does it have to be the "Add to Google Pay" button? Is a QR code allowed according to Google's design guidelines and use cases?
It would be a shame to use a QR code, especially if the user is on Gmail, which supports Go-To-Action markup (https://developers.google.com/gmail/markup/reference/go-to-action).
Since Trustpilot does not allow us to query just the reviews published after a specific date, I will need to make loads of requests to get the product reviews.
Do you have any limit on the number of requests we can make in a minute?
Thanks #neisantos for your question.
Unfortunately our product reviews API doesn't support a query based on published datetime.
Fortunately for you, you don't need to worry about rate limits on that endpoint. However, we can see which apiKey is accessing our endpoints, and if necessary we can throttle or revoke access.
BTW. Fetching all product reviews via the API will probably not affect our API that much :)
I am planning to use Zoho CRM for my business. On one side I have clients who pay my business; on the other I have online customers to whom I assign the work given to me by clients. So basically my business is a kind of mediator.
Now I want to use Zoho CRM workflow automation, e.g. when a lead is created, a signup mail should be sent. I want to increase the lead score when a client performs a particular activity, and I want to use a web form to capture leads.
My issue is that Zoho CRM allows very few API calls, e.g. 500 per user per day. How do I capture leads directly into the CRM and increase lead scores within that limit?
How do you manage such scenarios?
The number of API calls per day depends on your subscription plan:
Standard: starts with 2000 calls per day
Professional: starts with 3000 calls per day
Enterprise: starts with 4000 calls per day
Reference link
In many cases this will be enough; however, there are a few tricks for saving API calls, such as using API v4, where you can insert or update multiple records per call (up to 100 per request), as in the sketch below.
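For example, a single bulk insert can replace up to 100 individual lead inserts. A Python sketch; the host, the v4 path, and the field names are assumptions to check against your data centre and module configuration:

    import requests

    ACCESS_TOKEN = "<oauth access token>"             # placeholder
    URL = "https://www.zohoapis.com/crm/v4/Leads"     # host varies by data centre

    def insert_leads(leads):
        """Insert up to 100 leads with a single API call instead of one call each."""
        assert len(leads) <= 100
        resp = requests.post(
            URL,
            headers={"Authorization": "Zoho-oauthtoken " + ACCESS_TOKEN},
            json={"data": leads},
        )
        resp.raise_for_status()
        return resp.json()

    # insert_leads([{"Last_Name": "Doe", "Company": "Acme", "Email": "jane@example.com"}])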
Also, you can use custom functions (Zoho Deluge) in the CRM and set your workflow rules like:
Each time a new lead is created with the status "Not contacted" then:
Send a welcome email
Create a case (Zoho Deluge)
Create a task (Zoho Deluge)
etc.
The rate limit for Zoho custom functions is not the same as for Zoho API calls (Integration Tasks: 25,000 Zoho API calls/day using Deluge), so you can use both of them.
Reference link
I need a suitable caching approach for use with an enterprise portal showing data from the Google Calendar API. What algorithms or design patterns are most applicable?
The Google Calendar API is limited by number of requests per day (defaults to 10,000 requests/day - I have requested more) and rate of access (5 requests/second/user).
There are two core API methods that I expect to use, one to get a list of user calendars (1 API hit) and one to download the events of an individual calendar (1 API hit per calendar).
Both the calendar list and the individual calendars contain etag values which can be used to help avoid unnecessary API requests. If you have a list of individual calendar etag values, then you can see whether any of these have changed just by querying the calendar list. (Unfortunately an HTTP 304 Not Modified response is still counted as an API hit.)
Also I don’t really want to download and cache the entire calendar contents (so maybe just a few days or weeks at a time).
I need to find an approach which tries to minimize the number of API calls but doesn't try to store everything. It also needs to be able to cope with occasionally fetching data from unchanged calendars because the "time sliding window" on the calendar data has moved on. I would like the system to be backed by data storage so that multiple portal instances could share the same data.
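To make the etag idea above concrete, here is a rough Python sketch of the check I have in mind, assuming (as described above) that a calendar's etag in the calendar list changes when its contents change; the dict is a stand-in for whatever shared backing store the portal instances would use:

    import requests

    API = "https://www.googleapis.com/calendar/v3"

    # Stand-in for the shared backing store: calendar id -> (etag, cached events)
    cache = {}

    def refresh(token, time_min, time_max):
        """Re-fetch events only for calendars whose etag changed since the last run."""
        headers = {"Authorization": "Bearer " + token}
        # 1 API hit: the calendar list carries a current etag for every calendar.
        cal_list = requests.get(API + "/users/me/calendarList", headers=headers).json()
        for cal in cal_list.get("items", []):
            cal_id, etag = cal["id"], cal["etag"]
            cached = cache.get(cal_id)
            if cached and cached[0] == etag:
                continue                          # unchanged: serve events from the cache
            # 1 API hit per changed calendar, limited to the current time window.
            events = requests.get(
                API + "/calendars/" + cal_id + "/events",
                headers=headers,
                params={"timeMin": time_min, "timeMax": time_max, "singleEvents": "true"},
            ).json()
            cache[cal_id] = (etag, events.get("items", []))
        return cache

The same comparison could handle the sliding time window: a calendar would also be treated as "changed" whenever its cached window no longer covers the window being requested.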