Why do I get in_use_customer_managed_encryption_keys quota exceeds quota limits creating Vertex AI endpoint with CMEK? - google-cloud-vertex-ai

When attempting to create a Vertex AI endpoint with CMEK, I get:
Failed to create endpoint...The following quota metrics exceed quota limits: aiplatform.googleapis.com/in_use_customer_managed_encryption_keys'
Status: 429 Error code: 429
In the Console, under "IAM and admin" > "Quotas", the limit for the quota "In use customer managed encryption keys per region" for the region corresponding to my endpoint is listed as 1, and the current usage as 0. Perhaps the current usage must stay strictly below the limit?
I requested an increase to that quota limit and in the request description asked whether a limit greater than 1 was required to create a single CMEK endpoint; however, the response did not address this question and instead only asked for payment to increase the quota.

Yes, it appears that the current usage must stay strictly below the limit, i.e. a limit of 1 does not allow any usage.
I successfully applied for a quota increase to raise the limit to 2 and was subsequently able to create the endpoint with the CMEK key.
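For reference, once the quota limit is above your intended usage, creating the endpoint itself is straightforward. Below is a minimal sketch using the Vertex AI Python SDK; the project, region, and key names are placeholders, not values from this question.

from google.cloud import aiplatform

# Placeholders: substitute your own project, region, and key.
aiplatform.init(project="my-project", location="us-central1")

# The key must be in the same region as the endpoint, and the
# "In use customer managed encryption keys per region" quota must be
# greater than the number of CMEK-protected resources you keep in use.
endpoint = aiplatform.Endpoint.create(
    display_name="my-cmek-endpoint",
    encryption_spec_key_name=(
        "projects/my-project/locations/us-central1/"
        "keyRings/my-ring/cryptoKeys/my-key"
    ),
)
print(endpoint.resource_name)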

Related

I'm getting a Quota exceeded response from the YouTube API but my quota limits are not reached yet

At least I think that's what this error message means:
HttpError: <HttpError 429 when requesting https://youtube.googleapis.com/youtube/v3/playlists?part=snippet%2Cstatus&alt=json returned "Resource has been exhausted (e.g. check quota).". Details: "Resource has been exhausted (e.g. check quota).">
But my YouTube Data API v3 quotas have not been fully used up:
I have used 31k of my 112k limit, and as I understand it the Queries per minute quota is higher than my daily limit, so that should also be no problem.
Am I right in understanding that this error happened because of quota limits, or does it come from another source?
My request basically was creating a new playlist.

How to Resolve a 403 error: User rate limit exceeded in Google Drive API?

I am getting
"code": 403,
"message": "User Rate Limit Exceeded"
while using Google Drive API in my web app
Although the quota is 10,000 requests per 100 seconds and my average is less than 2:
How can I resolve this error? How to implement exponential backoff as the documents say?
There are several types of quotas with Google APIs.
Project-based quotas affect your project itself. These quotas can be extended: if, for example, your project can make 10,000 requests per 100 seconds, you could request that this be extended.
Then there are the user-based quotas. These quotas limit how much each user can send.
User Rate Limit Exceeded
means that you are hitting a user rate quota. User rate quotas are flood protection; they ensure that a single user of your application cannot make too many requests at once.
These quotas cannot be extended.
If you are hitting a user rate quota, then you need to slow down your application and implement exponential backoff.
How you implement exponential backoff is up to you and the language you are using, but it basically involves retrying the same request, adding a wait time each time it fails.
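A minimal sketch in Python, assuming the wrapped call is whatever request is being rate limited (the function name and retry count are only illustrative):

import random
import time

def call_with_backoff(make_request, max_retries=5):
    # Retry make_request with exponentially growing, jittered waits.
    # make_request is a placeholder for the rate-limited call; it is
    # expected to raise an exception on a 403/429 rate-limit response.
    for attempt in range(max_retries):
        try:
            return make_request()
        except Exception:
            if attempt == max_retries - 1:
                raise
            # Wait 1s, 2s, 4s, ... plus up to a second of random jitter.
            time.sleep(2 ** attempt + random.random())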
The graph
The graph in the Google Cloud console is a guesstimate and is not by any means accurate. If you are getting the error message, you should go by that and not by what the graph says.
After hours of searching and thinking, I found out that
'User Rate Limit Exceeded' has a spam protection which allows a maximum of 10 requests per second.
Thus I found a lazy trick: delaying the calls using
usleep(rand(1000000, 2000000));
It simply delays the call by a random duration between one and two seconds.

YouTube API Quotas, please explain. Quota exceeded. How to solve?

Can someone please explain me in simple language how these quotas work?
I know there is a similar question, but I want an explanation related to the screenshot below.
First, I opened the quotas page in Google Dev Console for YouTube API.
But I don't understand what these lines are, how they work, or why there are several lines.
For example, I was trying to make a simple request like this
https://www.googleapis.com/youtube/v3/search?part=snippet&q=welcome&type=playlist&key=[MY_API-KEY]
Which returns me a json response:
{
  "error": {
    "code": 403,
    "message": "The request cannot be completed because you have exceeded your \u003ca href=\"/youtube/v3/getting-started#quota\"\u003equota\u003c/a\u003e.",
    "errors": [
      {
        "message": "The request cannot be completed because you have exceeded your \u003ca href=\"/youtube/v3/getting-started#quota\"\u003equota\u003c/a\u003e.",
        "domain": "youtube.quota",
        "reason": "quotaExceeded"
      }
    ]
  }
}
So I assume it gives me an error because some quota somewhere is zero, since I only tried to make this request once.
What should I do to get rid of this error and be able to use the API?
Thank you.
Project-based quotas
The YouTube Data API uses a cost-based quota as opposed to a request-based quota.
With a request-based quota you are given a quota of, say, 10,000 requests, and each request you make removes one from your quota.
With a cost-based quota you are instead given a quota of, say, 10,000 points which you can spend on requests, and each request has a different cost.
Uploading a video costs around 1600 points against your quota, so you can upload only a limited number of videos, whereas a list call costs only 50, so you could do more lists than uploads before running out of quota.
I recommend having a look at the quota calculator, which will help you understand the cost of each request against your quota allotment.
This video may also help you understand cost-based quotas: YouTube API and cost based quota demystified.
As for the error you are getting from the following request:
https://www.googleapis.com/youtube/v3/search
As the search.list method costs 100 quota points each time you call it, this and the error message suggest that you have exceeded your quota. You need to either apply for an extension or make fewer requests.
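To make that concrete, here is a rough back-of-the-envelope check, assuming the default 10,000-unit daily quota and the 100-unit cost of search.list (check the quota calculator for the costs of the methods you actually use):

# Assumed values: 10,000 daily units, 100 units per search.list call.
daily_quota_units = 10_000
search_list_cost = 100

max_searches_per_day = daily_quota_units // search_list_cost
print(max_searches_per_day)  # 100 searches use up the entire day's quota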
How to check your current quota allotment:
Go to https://console.cloud.google.com/ -> library -> search for youtube data api -> quota
User-based quotas
Besides that, there are also user-based quotas, which limit the number of requests a user can make per second; these are flood-protection quotas.

Limit on number of AWS Lambdas per region/account

I would like to know if there is a limit on the number of Lambda functions one can define (either per region or per account). To be clear I am not talking about a limit on the number of instances of a single lambda function, but instead on the number of function definitions that are allowed.
Looking at http://docs.aws.amazon.com/lambda/latest/dg/limits.html I did not find any explicit limit. There is a limit on the "Total size of all the deployment packages that can be uploaded per region" that is set to 75 GB, which indirectly limits the number of defined functions (for instance, if the average deployment package size across all of my Lambda functions is 40 MB, then there is a limit of 75 GB / 40 MB = 1875 functions per region).
As you found on the AWS Lambda Limits page, there is no current limit on the number of AWS Lambda functions per region or account.
You are correct that there is a limit on function and layer storage. That is described as "The amount of storage that's available for deployment packages and layer archives in the current Region." By default, this is set to 75 GB per region. However, this limit is adjustable. If you're getting close to this limit, you can request a quota increase and expand your indirect limit on the number of functions or layers.
Check out the Service Quotas service to see your current hard and soft limits for AWS services and request quota increases.
To request an increase in function storage:
In the Service Quotas console, select the "AWS Lambda" service.
Then, select "Function and layer storage" and click "Request quota increase".
On the request form, enter your requested storage amount and submit the request.
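If you prefer to do this programmatically, here is a sketch using the Service Quotas API via boto3; the quota is looked up by name rather than hard-coded, and the 400 GB target is only an example value:

import boto3

client = boto3.client("service-quotas")

# Find the "Function and layer storage" quota for Lambda by name,
# so we don't have to hard-code its quota code.
storage_quota = None
paginator = client.get_paginator("list_service_quotas")
for page in paginator.paginate(ServiceCode="lambda"):
    for quota in page["Quotas"]:
        if quota["QuotaName"] == "Function and layer storage":
            storage_quota = quota

if storage_quota and storage_quota["Adjustable"]:
    client.request_service_quota_increase(
        ServiceCode="lambda",
        QuotaCode=storage_quota["QuotaCode"],
        DesiredValue=400.0,  # example target, in GB
    )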

Error 429: Insufficient tokens (DefaultGroupUSER-100s) What defines a user?

tl;dr: do 100 devices all using the same Client ID count as 100 users, each with their own limits, or as one user sharing limits?
I have a webpage which reads and writes to a Google Sheet.
Because the webpage needs to know if a cell has changed, it polls the server once every 1000ms:
var pollProcId = window.setInterval(pollForInput, 1000);
where pollForInput does a single:
gapi.client.sheets.spreadsheets.values.get(request).then(callback);
When I tried to use this app with a class of 100 students I got many 429 error codes (more than I got successful reads) in response to google.apps.sheets.v4.SpreadsheetsService.GetValues requests:
Many of my users never got as far as seeing even the first request come back.
As far as I can make out, these are AnalyticsDefaultGroupUSER-100s errors which, according to the error responses page:
Indicates that the requests per 100 seconds per user per project quota has been exhausted.
But with my app only requesting once per 1000 milliseconds, I wouldn't expect to see this many 429s: I have a limit of 100 requests per 100 seconds (1 per second), so only users whose application didn't complete within 100 seconds should have received a 429.
I know I should implement Exponential Backoff (which I'll do, I promise) but I'm worried I'm misunderstanding what a "user" in this context is.
Each user is using their own device (so presumably has a different IP address) but they are all using my "Client ID".
Does this scenario count as many users making one request per second, or a single user making a hundred requests per second?
Well, the "user" in the per-user quota refers to a single user making requests. Take the Sheets API: it has a quota of 100 read requests per 100 seconds per user, so a single user can make roughly one read request per second. Note that read requests have the same kind of per-user quota as write requests, but the two quotas are tracked separately and do not share the same limit.
If you want a higher quota than the default, you can apply for one by using this form or by visiting your developer console and clicking the pencil icon next to the quota that you want to increase.
I also suggest you implement exponential backoff as soon as possible, because it can help you avoid this kind of error.
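For what it's worth, here is a sketch of that retry-with-backoff idea applied to the read call from this question, written with the Python client (the question uses the JavaScript client, but the logic is the same); the spreadsheet ID, range, and credentials are placeholders:

import random
import time

from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

def get_values_with_backoff(service, spreadsheet_id, range_name, max_retries=5):
    for attempt in range(max_retries):
        try:
            return (
                service.spreadsheets()
                .values()
                .get(spreadsheetId=spreadsheet_id, range=range_name)
                .execute()
            )
        except HttpError as err:
            # Back off only on rate-limit errors; re-raise everything else.
            if err.resp.status not in (403, 429) or attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt + random.random())

service = build("sheets", "v4")  # supply credentials or a developerKey as needed
values = get_values_with_backoff(service, "SPREADSHEET_ID", "Sheet1!A1:D10")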
Hope it helps you.
