Limit Requests by IP on CodeIgniter REST API Server (chriskacerguis) - codeigniter

Regarding this GitHub issue from chriskacerguis (https://github.com/chriskacerguis/codeigniter-restserver/issues/710),
does anyone know how to limit requests by IP?
Currently the API only limits requests per user/key:
$this->methods['user_get']['limit'] = 500; // 500 requests per hour per user/key
$this->methods['user_post']['limit'] = 100; // 100 requests per hour per user/key
$this->methods['user_delete']['limit'] = 50; // 50 requests per hour per user/key
Is it possible to create a limit by IP?

Finally, Chris has updated his CodeIgniter REST server, so I can now limit requests by IP.
Here is the relevant commit: https://github.com/chriskacerguis/codeigniter-restserver/commit/706f3b8375a0f6d1c65224dc112429f02037a572
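For reference, the limiting strategy is chosen in the library's rest.php config. A minimal sketch, assuming the option names introduced around that commit (verify against your installed version):

// application/config/rest.php
// Select what the per-hour limits in $this->methods are counted against.
// Documented options include 'IP_ADDRESS', 'API_KEY', 'METHOD_NAME',
// and 'ROUTED_URL'; 'IP_ADDRESS' counts requests per client IP.
$config['rest_limits_method'] = 'IP_ADDRESS';

With that set, the existing per-method limits (e.g. $this->methods['user_get']['limit'] = 500;) apply per IP instead of per user/key.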

Related

Multiple directions requests via MKDirections

I'm coding an application that uses MapKit features. When I request directions for many locations in a for loop using MapKit's MKDirections, it fails with "Directions Not Available" and the following details:
Error Domain=MKErrorDomain Code=3 "Directions Not Available"
UserInfo={NSLocalizedFailureReason=Route information is not available at this moment.,
MKErrorGEOError=-3, MKErrorGEOErrorUserInfo={
    details = (
        {
            intervalType = short;
            maxRequests = 50;
            "throttler.keyPath" = "app:lszlp.nobetciEczane/0x20200/short(default/any)";
            timeUntilReset = 54;
            windowSize = 60;
        }
    );
    timeUntilReset = 54;
}}
What are the possible causes?
I've realized that two factors must be taken into account to avoid this type of error.
First, Apple's Maps server does not allow more than one directions request per location within 60 seconds, so you have to check the time between two consecutive location requests.
Second, the maximum request count is set to 50 on Apple's Maps server, as written in the error details (maxRequests = 50), so you have to cap your for loop at 50 iterations. I could not find any documentation on why this limitation exists.
With these two approaches the problem went away; a generic pacing sketch follows below.
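The error fields describe the throttle precisely: at most maxRequests = 50 calls per windowSize = 60 seconds. The pacing idea is platform-independent; here is a minimal sketch in PHP (to match the other snippets on this page), where $requestDirections is a hypothetical stand-in for the actual MKDirections call:

// Hypothetical sliding-window throttle: at most $maxRequests calls
// per $windowSize seconds, matching the limits in the error above.
function throttled_requests(array $locations, callable $requestDirections,
                            int $maxRequests = 50, int $windowSize = 60): void {
    $sent = array(); // timestamps of requests inside the current window
    foreach ($locations as $location) {
        $now = microtime(true);
        // Keep only timestamps that are still inside the window.
        $sent = array_filter($sent, function ($t) use ($now, $windowSize) {
            return $now - $t < $windowSize;
        });
        if (count($sent) >= $maxRequests) {
            // Window is full: wait until the oldest request falls out of it.
            usleep((int) ((min($sent) + $windowSize - $now) * 1e6));
        }
        $sent[] = microtime(true);
        $requestDirections($location); // issue the actual directions request
    }
}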

How can I send a single email using SMTP (Gmail) within 100 to 200 milliseconds?

The Java code below takes about 2 seconds:
// Getting and connecting an IMAP store is unrelated to sending; note it
// is also pointed at the SMTP host rather than an IMAP host.
Store store = session.getStore("imaps");
store.connect("smtp.gmail.com", "******#gmail.com", "*******");
// Each connect() performs a fresh TCP/TLS handshake plus login, which is
// where most of the ~2 seconds goes; reusing one connected Transport
// across messages avoids paying that cost every time.
Transport t = session.getTransport("smtp");
t.connect("******#gmail.com", "******");
t.sendMessage(replyMessage, replyMessage.getAllRecipients());

Why and how is the quota "critical read requests" exceeded when using batchCreateContacts

I'm programming a contacts export from our database to Google Contacts using the Google People API, making the requests over URL from Google Apps Script.
The code below - using https://people.googleapis.com/v1/people:batchCreateContacts - works for 13 to about 15 single requests, but then Google returns this error message:
Quota exceeded for quota metric 'Critical read requests (Contact and Profile Reads)' and limit 'Critical read requests (Contact and Profile Reads) per minute per user' of service 'people.googleapis.com' for consumer 'project_number:***'.
For speed I send the requests in batches of 10 parallel requests.
I have the following two questions regarding this problem:
Why, for creating contacts, would I hit a quota on read requests?
Given the picture linked below, why would sending 2 batches of 10 simultaneous requests (more precisely: 13 to 15 single requests) hit that quota limit anyway?
[Image: quota limit of 90 read requests per user per minute, as displayed on console.cloud.google.com]
Thank you for any clarification!
Further reading: https://developers.google.com/people/api/rest/v1/people/batchCreateContacts
let payloads = [];
let lengthPayloads;
let limitPayload = 200;
/* Break the contacts up into payload-sized chunks */
contacts.forEach(function (contact, index) { /* contacts is an array of objects for the API */
    if (!(index % limitPayload)) {
        lengthPayloads = payloads.push({
            'readMask': "userDefined",
            'sources': ["READ_SOURCE_TYPE_CONTACT"],
            'contacts': []
        });
    }
    payloads[lengthPayloads - 1]['contacts'].push(contact);
});
Logger.log("which makes " + payloads.length + " payloads");

let parallelRequests = [];
let lengthParallelRequests;
let limitParallelRequest = 10;
/* Break the payloads up into groups of parallel requests */
payloads.forEach(function (payload, index) {
    if (!(index % limitParallelRequest))
        lengthParallelRequests = parallelRequests.push([]);
    parallelRequests[lengthParallelRequests - 1].push({
        'url': "https://people.googleapis.com/v1/people:batchCreateContacts",
        'method': "post",
        'contentType': "application/json",
        'payload': JSON.stringify(payload),
        'headers': { 'Authorization': "Bearer " + token }, /* token is a token of a single user */
        'muteHttpExceptions': true
    });
});
Logger.log("which makes " + parallelRequests.length + " parallelrequests");

let responses;
parallelRequests.forEach(function (parallelRequest) {
    responses = UrlFetchApp.fetchAll(parallelRequest); /* error occurs here */
    responses = responses.map(function (response) { return JSON.parse(response.getContentText()); });
    responses.forEach(function (response) {
        if (response.error) {
            Logger.log(JSON.stringify(response));
            throw response;
        }
        else Logger.log("ok");
    });
});
Output of logs:
which makes 22 payloads
which makes 3 parallelrequests
ok (15 times)
(the error message)
I had raised the same issue in Google's issue tracker.
It seems that a single batchCreateContacts or batchUpdateContacts call consumes six (6) "Critical Read Requests" quota units per request. I still did not get an answer as to why, for creating/updating contacts, we are hitting the limit on critical read requests.
Quota exceeded for quota metric 'Critical read requests (Contact and Profile Reads)' and limit 'Critical read requests (Contact and Profile Reads) per minute per user' of service 'people.googleapis.com' for consumer 'project_number:***'.
There are two types of quotas: project-based quotas and user-based quotas. Project-based quotas are limits placed on your project itself. User-based quotas are more like flood protection; they limit the number of requests a single user can make over a period of time.
When you send a batch request with 10 requests in it, it counts as ten requests, not as a single batch request. If you try to run these in parallel, then you are definitely going to overflow the requests-per-minute-per-user quota.
Slow down; this is not a race.
Why, for creating contacts, would I hit a quota regarding read requests?
I would chalk it up to a bad error message.
Given the picture linked below, why would sending 13 to 15 requests hit that quota limit anyway? (There are 3 read requests before this code.) Quota limit of 90 read requests per user per minute as displayed on console.cloud.google.com.
Well, you are sending 13 * 10 = 130 requests per minute, which would exceed the requests per minute. There is also no way of knowing exactly how fast your system is running; it could be going faster than you think, and it depends on what else the server is doing when it gets your requests and which minute they are actually recorded in.
My advice is to just respect the quota limits and not try to understand why; there are too many variables on Google's servers to pin down what exactly a minute is. You could send 100 requests in 10 seconds and then try to send another 100 at 55 seconds and get the error; you could also get the error after 65 seconds, depending on when the requests hit the server and when the server finished processing your initial 100 requests.
Again: slow down.
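Concretely, "slow down" just means pacing the calls instead of firing them in parallel. A minimal sketch using the numbers from this thread (6 quota units per batch call, 90 units per user per minute), written in PHP to match the other snippets on this page; send_batch is a hypothetical stand-in for the HTTP call (in Apps Script itself, Utilities.sleep(milliseconds) between sequential fetches plays the same role):

// Pace batchCreateContacts-style calls to stay under the per-minute quota.
$unitsPerCall   = 6;   // observed cost of one batch call (see above)
$limitPerMinute = 90;  // per-user critical-read quota from the console
$callsPerMinute = intdiv($limitPerMinute, $unitsPerCall); // 15 calls/minute

foreach ($batches as $batch) {
    send_batch($batch); // hypothetical: one batchCreateContacts HTTP request
    // Spread the calls evenly across the minute instead of bursting.
    sleep((int) ceil(60 / $callsPerMinute)); // ~4 seconds between calls
}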

Why did I get a timed out error after queuing up 7k jobs?

I'm sending around 7k emails using Laravel and SES. Because I have a limit of 10 emails per second, I need to delay Laravel's sending so the emails go out in batches of 10 at a time.
Controller
public function queue(){
    $invites = Subscriber::all();
    $send_at = now();
    foreach ($invites as $i => $invite){
        if($i % 10 == 0){
            $send_at = $send_at->addSeconds(1);
        }
        SendEmailJob::dispatch($invite)->delay($send_at);
    }
    dd('sent!');
}
And the Job
public function handle()
{
    Mail::to($this->user->email)->send(new InviteMail($this->user));
}
This gave me a timed-out error, but weirdly it still queued all 7k emails and sent them. I'm just curious why I got the error.
Put this function at the start of your controller method:
set_time_limit() // in seconds
It will increase the max execution time.
Check the max_execution_time value in your php.ini file, or call set_time_limit(700); inside your queue() function, as in the sketch below.
The 700 comes from 7000 invites / 10 = 700 seconds.
The max_execution_time default is 300 seconds.
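A minimal sketch of where that call goes, using the asker's own controller (the 700 is derived from the math above):

public function queue(){
    // Let the dispatch loop run past the default max_execution_time.
    set_time_limit(700); // 7000 invites / 10 per second = 700 seconds

    $invites = Subscriber::all();
    $send_at = now();
    foreach ($invites as $i => $invite){
        if($i % 10 == 0){
            $send_at = $send_at->addSeconds(1);
        }
        SendEmailJob::dispatch($invite)->delay($send_at);
    }
    dd('sent!');
}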

BigQuery: 403 User Rate Limit Exceeded but error not shown in joblist

I'm receiving 403 User Rate Limit Exceeded errors when making queries, but I'm sure I'm not exceeding the limit.
In the past I've reached the rate limit doing inserts, and it was reflected in the job list as
[errorResult] => Array
(
[reason] => rateLimitExceeded
[message] => Exceeded rate limits: too many imports for this project
)
But in this case the job list doesn't reflect the query (neither an error nor a done entry), and studying the job list I haven't reached the limits or come close to them (no more than 4 concurrent queries, each processing 692297 bytes).
I have billing active, and I've made only 2.5K queries in the last 28 days.
Edit: The user limit is set to 500.0 requests/second/user.
Edit: Error code received:
User Rate Limit Exceeded User Rate Limit Exceeded
Error 403
Edit: The code I use to create the query jobs and fetch their results:
function query_data($project,$dataset,$query,$jobid=null){
    $query_object   = new QueryRequest();
    $dataset_object = new DatasetReference();
    $dataset_object->setProjectId($project);
    $dataset_object->setDatasetId($dataset);
    $query_object->setQuery($query);
    $query_object->setDefaultDataset($dataset_object);
    $query_object->setMaxResults(16000);
    $query_object->setKind('bigquery#queryRequest');
    $query_object->setTimeoutMs(0); // return immediately; completion is polled below
    $ok = false;
    $sleep = 1;
    while(!$ok){
        try{
            $response_data = $this->bq->jobs->query($project, $query_object);
            $ok = true;
        }catch(Exception $e){ // sleep when the BQ API is not available
            sleep($sleep);
            $sleep += rand(0,60);
        }
    }
    try{
        $response = $this->bq->jobs->getQueryResults($project, $response_data['jobReference']['jobId']);
    }catch(Exception $e){
        // do nothing; the loop below retries
    }
    $tries = 0;
    while(!$response['jobComplete'] && $tries<10){
        sleep(rand(5,10));
        try{
            $response = $this->bq->jobs->getQueryResults($project, $response_data['jobReference']['jobId']);
        }catch(Exception $e){
            // do nothing; the loop retries
        }
        $tries++;
    }
    // Flatten the response rows into name => value arrays.
    $result = array();
    foreach($response['rows'] as $k => $row){
        $tmp_row = array();
        foreach($row['f'] as $field => $value){
            $tmp_row[$response['schema']['fields'][$field]['name']] = $value['v'];
        }
        $result[] = $tmp_row;
        unset($response['rows'][$k]);
    }
    return $result;
}
Are there any other rate limits, or is this a bug?
Thanks!
You get this error when trying to import CSV files, right?
It could be one of these reasons:
Import Requests
Rate limit: 2 imports per minute
Daily limit: 1,000 import requests per day (including failures)
Maximum number of files to import per request: 500
Maximum import size per file: 4GB
Maximum import size per job: 100GB
The query() call is, in fact, limited by the 20-concurrent-queries limit. The 500 requests/second/user limit in the developer console is somewhat misleading -- it is just the number of total API calls (get, list, etc.) that can be made.
Are you saying that your query fails immediately and never shows up in the job list?
Do you have the full error that is being returned? I.e. does the 403 message contain any additional information?
Thanks
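As an aside, when a 403 rateLimitExceeded does come back, the usual remedy is truncated exponential backoff with jitter rather than the fixed-increment sleep in the function above. A minimal sketch (with_backoff is a hypothetical helper, not part of the BigQuery client):

// Retry a callable with truncated exponential backoff plus jitter.
function with_backoff(callable $call, $maxTries = 5) {
    $delay = 1; // seconds
    for ($try = 1; ; $try++) {
        try {
            return $call();
        } catch (Exception $e) {
            if ($try >= $maxTries) throw $e; // give up after $maxTries attempts
            sleep($delay + rand(0, 1));      // jitter de-synchronizes clients
            $delay = min($delay * 2, 60);    // double the wait, capped at 60 s
        }
    }
}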
I've solved the problem by using only one server to make the requests.
Looking at what I was doing differently in the night cronjobs (which never fail), the only difference was that I was using a single client on one server instead of different clients on 4 different servers.
Now I have a single script on one server managing the same number of queries, and it never gets the User Rate Limit Exceeded error.
I think there is a bug in handling many clients or many active IPs at a time, although the total number of threads never exceeds 20.
