Request Entity Too Large on GoDaddy server in CodeIgniter

I have a CodeIgniter application in which I have to upload 11 images as base64 strings. It works properly on localhost but not on the server; if the images are small, it works. The server returns:
The requested resource
KnSD67KDqyt9QdeatGg--
does not allow request data with GET requests, or the amount of data provided in the request exceeds the capacity limit.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
I have increased post_max_size to 128 MB
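On shared hosts like GoDaddy, several limits apply at once, and the quoted error even suggests the request may be going out as a GET, so first confirm the upload actually uses POST. If it does, these PHP directives (real directive names; the values are only examples matching the 128 MB figure above) are worth checking together, in php.ini or a per-directory override:

```ini
; php.ini, or .user.ini / php5.ini on many shared hosts
post_max_size = 128M        ; total size of the POST body
upload_max_filesize = 128M  ; size of each uploaded file
max_input_vars = 5000       ; number of POST fields (base64 strings count as fields, not files)
```

Apache can also cap the request body via LimitRequestBody in .htaccess (134217728 bytes = 128 MB; 0 means unlimited), which is a common cause when the PHP limits look correct but large uploads still fail.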

Related

Twig\Error\RuntimeError status 500 with Mailjet API

In a Symfony project, I use the Mailjet API to send emails. When sending a large batch (over 290 emails), I get this error:
Status Code: 500 Internal Server Error
Runtime exception
After some searching, I found that the error is due to a size limit on the POST request. I used AJAX to send an array of 400 items in the data, but the request fails once the array exceeds roughly 294 items.
Do you have a solution to increase this limit? Thanks!
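Rather than raising the server's limit, a common workaround is to split the payload into smaller batches and send several requests, so no single POST exceeds the cap. A minimal sketch in Python (the batch size of 100 and the `send_batch` call are hypothetical, not part of any Mailjet client):

```python
def chunked(items, size):
    """Yield successive slices of at most `size` elements."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical usage: send 400 recipients in batches of 100
recipients = [f"user{i}@example.com" for i in range(400)]
batches = list(chunked(recipients, 100))
# each batch would then be POSTed separately, e.g. send_batch(batch)
```

Each POST then carries a bounded payload, so the approach keeps working no matter how large the full list grows.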

User Session lost after number of cookies exceeds in IE11/Edge - Browser Cookie limit

We are experiencing an issue and would appreciate any help with it.
In our application, AJAX requests (a simple $.ajax with POST) fetch HTML content from the server. Everything works fine in Chrome/Firefox/Safari, but in IE11 and Edge the user session is lost if the size of the returned content exceeds 30 KB; below 30 KB it works fine in IE as well. The AJAX request itself returns successfully, but the session cookie is missing from subsequent requests.
Please note that same ajax handling mechanism is working fine on a number of other servers.
(Snapshots of the request/response headers from the working and non-working environments were attached.)
Please note the difference in response headers. Kindly let us know if you need any more information on this.
Well, it turned out to be the browser cookie limit. IE11+ imposes a limit of 50 cookies per domain (increased from 20), and we were exceeding it. IE simply knocks out the older cookies when this limit is exceeded, including the session cookie.
http://browsercookielimits.iain.guru/
https://support.microsoft.com/en-us/help/941495/internet-explorer-increases-the-per-domain-cookie-limit-from-20-to-50

Turn rate limit (throttle) down for a specific origin in Laravel 5.4

I have a Laravel-built API running on one server and an Angular application running on another. My Angular app loads a huge JSON file (more than 500 lines) and tries to insert each line into a database through the API. A request is sent for each line, so I get a 429 error (Too Many Requests).
I know this is a security measure, so I don't want to remove the throttle middleware from the $middlewareGroups array in Kernel.php. I'd like to know, however, whether I can relax this rate limit for one specific origin address only (e.g. http://www.myangularapp.com/), while keeping the limit for all other origins.
Thanks in advance!
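Until the limit can be relaxed per origin on the server, the client can pace its own requests so it never trips the throttle (Laravel's default for the api group is throttle:60,1, i.e. 60 requests per minute). A minimal pacing sketch in Python; `post_row` and the rate figure are assumptions for illustration:

```python
import time

class RequestPacer:
    """Spaces calls so at most `rate` requests go out per `per` seconds."""
    def __init__(self, rate, per=60.0):
        self.interval = per / rate
        self.next_allowed = 0.0

    def wait(self, now=None, sleep=time.sleep):
        """Block until the next request is allowed; returns the effective send time."""
        now = time.monotonic() if now is None else now
        if now < self.next_allowed:
            sleep(self.next_allowed - now)
            now = self.next_allowed
        self.next_allowed = now + self.interval
        return now

# Hypothetical usage: 500 rows, at most 60 requests per minute
# pacer = RequestPacer(rate=60, per=60.0)
# for row in rows:
#     pacer.wait()
#     post_row(row)  # one POST per JSON line
```

A batch endpoint that accepts many rows per request would of course be faster, but pacing requires no server-side change at all.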

Intermittent Http Error 403 When Invoking Google Custom Search API

I'm getting the following error intermittently when invoking the custom search api from a server side setup:
HttpError 403 when requesting https://www.googleapis.com/customsearch/v1?q=John+Doe+john%40simpler.com&alt=json&cx=&key= returned "There is a per-IP or per-Referer restriction configured on your API key and the request does not match these restrictions. Please use the Google Developers Console to update your API key configuration if request from this IP or referer should be allowed."
I'm using a server API key and have confirmed that the configured server IP address is correct. About 50% of the time my requests come back fine, too. I'm issuing the request from the server like this:
from googleapiclient.discovery import build  # google-api-python-client

service = build("customsearch", "v1", developerKey=api_key)
custom_search_context = <my_context>  # placeholder from the question
res = service.cse().list(
    q=search_query_string,
    cx=custom_search_context,
).execute()
My requests per second are well within the configured limit of 10/sec and the purchased daily limit of 5000 requests.
One more thing I noticed is that Google counts a forbidden request towards the daily limit, too.
Any pointers on why I'm seeing this error only intermittently would be very helpful.
The error can be raised when you're exceeding a request/second limit. Can you confirm that your request rate is below your configured user rate limit? It might be worth noting that the limit is enforced even if you don't explicitly provide a user value in your requests.
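Since the 403s are intermittent, a common mitigation is to retry with exponential backoff whenever a rate-limit response comes back. A minimal sketch in Python; `with_backoff`, the retry parameters, and the wrapped call are illustrative assumptions, not part of the Google client:

```python
import random
import time

def backoff_delays(retries, base=1.0, cap=32.0):
    """Exponential backoff schedule: base, 2*base, 4*base, ... capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(retries)]

def with_backoff(call, retries=5, sleep=time.sleep):
    """Retry `call` on exceptions, sleeping between attempts; final attempt re-raises."""
    for delay in backoff_delays(retries):
        try:
            return call()
        except Exception:
            sleep(delay + random.random() * 0.1)  # small jitter to avoid thundering herd
    return call()

# Hypothetical usage:
# res = with_backoff(lambda: service.cse().list(q=query, cx=cx).execute())
```

In real use you would catch googleapiclient.errors.HttpError specifically and retry only on 403/429 responses, so genuine configuration errors fail fast instead of being retried.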

Getting 400 Bad Request error when sending a huge amount of data with POST

I am using the Yahoo YUI library's Ajax call to POST a request.
The page is developed in JSP, the server is Tomcat 6, and we use Struts 2.x.
When I send a small amount of data with the Ajax POST request, it works fine. But when I send a huge amount of data, I get a 400 Bad Request error.
The 400 error occurs when the web server considers the data stream sent by the client (e.g. your browser) 'malformed', i.e. it did not fully respect the HTTP protocol, so the server could not understand and process the request.
It is possible your data is too large, so you should encode it with a JavaScript built-in function such as encodeURIComponent() (the older escape() is deprecated).
Also check the maxPostSize attribute of the Connector in conf/server.xml; see the Tomcat configuration reference for details.
In Tomcat, when the POST data exceeds the maximum specified by maxPostSize in server.xml, it returns a 400 error code.
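For reference, maxPostSize is set on the HTTP Connector element in conf/server.xml; a sketch follows (the 20 MB value is only an example; the default is 2 MB):

```xml
<!-- conf/server.xml: raise the POST body limit on the HTTP connector -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443"
           maxPostSize="20971520" />
```

Tomcat must be restarted after editing server.xml for the change to take effect.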
