Increase upload_max_filesize via Forge - laravel

Error: 413 Request Entity Too Large
I have attempted to increase upload_max_filesize to 20M using the Edit PHP FPM Configuration and Edit PHP CLI Configuration tools in Laravel Forge. It saves my settings successfully, but the changes don't seem to take effect. I have tried restarting nginx and the server.
Environment:
AWS EC2
nginx

Updating this answer, since it is the first search engine result for this topic:
Forge now has a built-in setting you can update by going to the server details page and then clicking on PHP from the menu on the left. You'll see a form to change the max file size.

As #dave-alvarez mentioned, there is a setting in Laravel Forge to do this.
Select your server & choose PHP from the left menu.
Set Max File Upload Size as a megabyte integer (with no trailing unit).
Confirm your change by going to the bottom of the page and clicking the Files pull-up, then the Edit PHP FPM Configuration option. You can search that php.ini for upload_max_filesize to verify the new value.
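If you also want to confirm what the running PHP-FPM service actually picked up, rather than only reading the file, a minimal sketch like this can help; the file name check.php is an assumption, and the script should be deleted once you are done:
<?php
// check.php -- temporary page to confirm the limits the running PHP-FPM worker loaded.
// Request it through the site (not via the CLI), then delete it.
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
echo 'post_max_size: ' . ini_get('post_max_size') . PHP_EOL;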

I was missing a piece. Here's the whole answer.
Nginx Configuration
Add the following line to the http, server, or location context in nginx.conf to increase the size limit:
# set client body size to 20M #
client_max_body_size 20M;
PHP Configuration
Edit php.ini and set the following directives:
;This sets the maximum amount of memory in bytes that a script is allowed to allocate
memory_limit = 256M
;The maximum size of an uploaded file.
upload_max_filesize = 20M
;Sets max size of post data allowed. This setting also affects file upload. To upload large files, this value must be larger than upload_max_filesize
post_max_size = 30M
Source: http://www.cyberciti.biz/faq/linux-unix-bsd-nginx-413-request-entity-too-large/
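As a rough illustration of why post_max_size has to stay larger than upload_max_filesize, here is a small PHP sketch that converts the ini shorthand values to bytes and flags a bad combination; the shorthand parser is a simplified assumption (plain numbers and K/M/G suffixes only), not a full reimplementation of PHP's parsing:
<?php
// Simplified shorthand parser for values such as "20M" or "256M".
function shorthandToBytes(string $value): int
{
    $number = (int) $value;
    switch (strtoupper(substr($value, -1))) {
        case 'G': return $number * 1024 * 1024 * 1024;
        case 'M': return $number * 1024 * 1024;
        case 'K': return $number * 1024;
        default:  return $number;
    }
}

$upload = shorthandToBytes(ini_get('upload_max_filesize'));
$post   = shorthandToBytes(ini_get('post_max_size'));

if ($post <= $upload) {
    echo "post_max_size should be larger than upload_max_filesize\n";
}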

Related

20 images get uploaded instead of 30

I'm using Laravel with the league/flysystem-aws-s3-v3 package to upload files to AWS S3.
I'm having an issue where:
I have an API call with a method in a controller that receives an array of files.
The method reads all the files and uploads them to S3.
For some reason, if I send more than 20 files, only 20 files get uploaded to AWS S3.
Since the plugin for AWS S3 uses Guzzle under the hood, I was thinking it could be related to a timeout or maximum number of calls to be made within a certain period.
Any ideas of what might be causing this?
Looks like a limitation in your php.ini file.
When you install PHP, this is the default configuration:
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 20
Try increasing this limit and then restarting your web server (Apache, nginx, etc.).
Verify the following two values in your php.ini file.
First, try increasing upload_max_filesize, which defaults to only 2M:
; Maximum allowed size for uploaded files.
upload_max_filesize = 2M
Also check that max_file_uploads is greater than 20, since 20 is the default cap on files per request:
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 20
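For the Laravel side, here is a sketch of how you might detect the silent cap in the controller; the field name files, the controller name, and the s3 disk are assumptions based on the question, not the asker's actual code:
<?php
// Hypothetical controller sketch: PHP discards any file past max_file_uploads
// before Laravel ever sees it, so compare what arrived against the ini limit.
namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;

class UploadController extends Controller
{
    public function store(Request $request)
    {
        $files    = $request->file('files', []);
        $received = count($files);
        $limit    = (int) ini_get('max_file_uploads');

        if ($received >= $limit) {
            // Receiving exactly the limit usually means extra files were dropped.
            Log::warning("Received {$received} files; max_file_uploads is {$limit}.");
        }

        foreach ($files as $file) {
            $file->storeAs('uploads', $file->getClientOriginalName(), 's3');
        }

        return response()->json(['stored' => $received]);
    }
}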

I'm getting a 413 Request Entity Too Large error when submitting a file in Laravel

I'm using Laravel and FilePond to upload some files. It works fine with files smaller than 100MB, but if I try to upload bigger files (400MB) I get a 413 error.
I have already increased post_max_size and upload_max_filesize in php.ini and changed client_max_body_size in nginx, but it still does not work.
Am I missing something?
Best regards
It's PHP and nginx related.
Check which php.ini is in use: php --ini
Sometimes there are multiple ini files, and one of them overrides the value you changed.
Search each php.ini file for the directive: grep upload_max_filesize /path/php.ini
Finally, make sure you restart nginx and PHP-FPM after the changes.
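To double-check from inside PHP which configuration files the web process actually loaded (the CLI and FPM often read different ones), a small sketch like this can be run both through the site and on the command line:
<?php
// Show which configuration files this particular PHP process actually read.
echo 'Loaded php.ini: ' . (php_ini_loaded_file() ?: 'none') . PHP_EOL;
echo 'Additional .ini files: ' . (php_ini_scanned_files() ?: 'none') . PHP_EOL;
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;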
Check that the PHP interpreter is reading from the php.ini that you are editing.
Check whether your server resources are sufficient, i.e. that memory is not exhausted while the upload is being processed. htop can show you resource consumption on your server. If the values you set for post_max_size, upload_max_filesize, or memory_limit in php.ini exceed what your server can actually provide, you are likely to hit the "413 Content Too Large" error.
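If you suspect memory rather than the body-size limits, one low-effort option is to log the peak memory of the upload request itself; this is only a sketch, and error_log is used here simply because it works without any extra setup:
<?php
// Log peak memory for the current request so you can see how close the worker
// gets to memory_limit while the upload is being processed.
register_shutdown_function(function () {
    $peakMb = round(memory_get_peak_usage(true) / 1048576, 1);
    error_log('Upload request peak memory: ' . $peakMb . ' MB (memory_limit=' . ini_get('memory_limit') . ')');
});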

Over the memory limit when changing Magento template

I'm trying to run Magento community edition 1.7.0.2
using NGINX, PHP FPM
on a 512MB RAM VPS running Ubuntu 12.04.3 32-bit.
Whenever I try to change the default template by setting all of the options under
System->Configuration->Design->Themes, i.e.
Templates, Skin (Images / CSS), Layout, Default,
to the provided modern template (or any other template), I go over the PHP memory limit,
even when I set the limit to 256MB.
I find it strange, because I was able to do this on shared hosting with less RAM, though that was on Apache, I believe.
Each time I attempt it, the change fails and it becomes impossible to reach either the admin or the front end; I just get a white screen. I recover by restoring the machine from a snapshot.
Can anyone help me debug this?
Update:
Actually, I'm not even able to refresh the configuration cache. One of the php-fpm processes keeps increasing its memory use until it reaches the maximum RAM...
2014/01/06 16:58:09 [error] 892#0: *27 FastCGI sent in stderr:
"PHP message: PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 32 bytes) in /usr/share/nginx/www/spaparts/app/code/core/Mage/Core/Model/Config.php on line 622"
while reading response header from upstream, client: 66.249.66.xxx, server: domain.com, request: "GET /index.php/apparel/shoes.html?cat=16 HTTP/1.1", upstream: "fastcgi://unix:/tmp/php5-fpm.sock:", host: "domain.com"
I thought it would be good to write up the details, in case anyone else is having similar trouble.
The over-the-limit PHP memory errors were caused by:
unsecure_base_url being set to "{{unsecure_base_url}}"
secure_base_url being set to "{{secure_base_url}}"
This was suggested somewhere as a way to make it easier to change the domain of a Magento install, and the site does run as usual with it, but it seems to cause configuration loops and over-the-limit RAM consumption.
After changing the settings in System->Configuration->Web, everything is back to normal: I was able to clear the cache, change the theme, and so on.
Thanks everyone for all your suggestions!

max_execution_time in php.ini won't update, but other settings will

So I've edited my php.ini file to allow for a longer max_execution_time, among some other settings. When I recycle the application pool in IIS 6 on Windows Server 2003 and check the phpinfo page I've created, the other settings I've changed stick, but max_execution_time stays at its default setting (300). What's up? It is not commented out and looks like this:
max_execution_time = 1800
Like I said, I've changed max_input_time to have the same value, and it works.
max_input_time = 1800
The phpinfo output shows that max_input_time is 1800 seconds, but max_execution_time still says 300. Thoughts on this?
Edit: The Loaded Configuration File listed within the phpinfo file is the file I'm working with. As I mentioned, other settings are taking effect, however, this specific setting is not. This means that it is indeed reading the file I'm editing, it just doesn't want to change the max_execution_time. I've also restarted the server.
First check which php.ini file is actually loaded, then make the change in the correct file.
If you are already editing the correct file, restart the server after the change.
You can also try using set_time_limit(0); in the script itself.
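For completeness, here is a small sketch of the runtime approach; note that on IIS the FastCGI handler may impose its own timeout on top of PHP's, so this alone is not guaranteed to be enough:
<?php
// Raise the execution limit from inside the long-running script itself.
// 0 means "no limit"; a finite value such as 1800 seconds is usually safer.
set_time_limit(1800);
// Equivalent alternative:
// ini_set('max_execution_time', '1800');

// ... long-running work goes here ...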

Upload 6 MB image in magento

I want to upload a 6MB image for products in my Magento store. Please help me find where I have to change the maximum limit. These settings did not work in the php.ini file:
upload_max_filesize = 10M
post_max_size = 10M
Any suggestion would be appreciated.
Typically the server process (Apache, httpd, or php-cgi) needs to be restarted after making changes to php.ini. This might be why you are not seeing any difference.
Another way is to put your upload_max_filesize and post_max_size settings in a .htaccess file in the root of your Magento directory. Apache re-reads .htaccess on every request, so no restart is needed.
Are you getting an error when you upload the file, or does it just time out? It might be that the dimensions of the image (say 4,000 x 5,000 pixels) are too large for scaling/cropping.
Place a file in your web root containing only a phpinfo() call and name it phpinfo.php. Now go to http://www.yoursite.com/phpinfo.php and see what the maximum upload size is.
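For reference, the entire file can be just this (remember to delete it afterwards, since it exposes server configuration details):
<?php
// phpinfo.php -- temporary diagnostic page; remove it when you are done.
phpinfo();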
If you are on shared hosting it may not be possible to increase your php settings beyond what your hosting provider allows. This could be the reason why your settings are not taking hold. Run phpinfo.php and take things from there.
