In the Laravel Livewire upload system, when we choose a file, the upload to the temp folder begins immediately, and this is a problem: the tmp folder may fill up before it gets cleared. How can we validate the file size before the file is even moved to the tmp folder, and cancel the upload if the file is not valid? I tried real-time validation as well, but it's not working.
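For reference, this is the real-time validation pattern I tried (a sketch based on the documented Livewire 2 WithFileUploads example); as far as I can tell, it only runs after the temporary upload has already finished:

```php
use Livewire\Component;
use Livewire\WithFileUploads;

class UploadFile extends Component
{
    use WithFileUploads;

    public $file;

    // Runs whenever $file changes - but only after the temporary
    // upload to the livewire-tmp folder has already completed.
    public function updatedFile()
    {
        $this->validate([
            'file' => 'file|max:10240', // reject files over 10 MB
        ]);
    }

    public function render()
    {
        return view('livewire.upload-file');
    }
}
```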
Thanks a lot
I have a simple form with just one field to upload a file on a Drupal 8 site using the Webform module 6.0.1. If I test the form using Webform's test tool, the file uploads fine: it displays the loading icon for a couple of seconds and is then ready to submit. But if I try the same in a page with the form embedded as a block, I cannot upload the file; it always shows me the following message when I try to submit the form, no matter how long I wait:
"File upload in progress. Uploaded file may be lost. Do you want to
continue?"
Just for testing, I increased the maximum upload file size to 2 GB, and the file I'm uploading is just 5 KB.
All the .js libraries I can think of are in my template.info.yml file (core/drupal, core/jquery, core/drupal.ajax).
The private folder has been set up in my settings.php and has its permissions set correctly; the .htaccess file is also set up as recommended for Drupal.
I have tried uploading the files to the public folder out of sheer desperation, but I get the same error.
Any help will be appreciated.
I know this is more than a year old, but just in case anyone else has this problem:
There are four files where you might have to make changes to raise the memory and upload limits. The first two are separate php.ini files: one is your server's php.ini, and the other is the Drupal installation's php.ini in the Drupal root folder. The server's php.ini applies the settings to all Drupal installs (and anything else using PHP) on the server, while the Drupal root php.ini only applies to that Drupal installation. In the php.ini file(s), change memory_limit, upload_max_filesize, and post_max_size. You can often edit the server's php.ini through cPanel's MultiPHP INI Editor.
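For example, the relevant php.ini directives could look like this (illustrative values; choose limits that suit your site):

```ini
; Illustrative values - adjust to your needs
memory_limit = 256M
upload_max_filesize = 64M
post_max_size = 64M
```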
The next file is the .htaccess file, where you would change the value under php_value memory_limit. Changing this only affects the Drupal install the .htaccess file belongs to. This file can be found in the Drupal root directory but is sometimes hard to spot because it is a hidden file. The FTP client Forklift has an option to show hidden files under View > Show View Options > Show Hidden Files; your FTP client of choice probably has a similar option.
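The .htaccess equivalent would look something like this (a sketch; note that php_value directives are only honored when PHP runs as an Apache module, not under CGI/FPM):

```apache
# Illustrative values - only honored under mod_php
php_value memory_limit 256M
php_value upload_max_filesize 64M
php_value post_max_size 64M
```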
The last file is the settings.php file (sites/default/settings.php), where the value to change is under ini_set('memory_limit', ''), for example: ini_set('memory_limit', '64M');. Again, this only affects the Drupal install the settings.php file belongs to.
memory_limit: This sets the maximum amount of memory, in bytes, that a script is allowed to allocate. This helps prevent poorly written scripts from eating up all available memory on a server. Note that to have no memory limit, set this directive to -1.
post_max_size: Sets the maximum size of POST data allowed. This setting also affects file uploads; to upload large files, this value must be larger than upload_max_filesize. Generally speaking, memory_limit should be larger than post_max_size.
upload_max_filesize: The maximum size of an uploaded file.
I'm working on a Laravel API project; let's say when you upload an image, I change its colors with a shell script. The API accepts URLs, which means I have to save the image in a temp folder so that I can edit it before saving it to my S3 filesystem. Is it better to save the temp image on the S3 filesystem or locally?
It will likely be much faster to save the image locally in a temp directory to make the changes before storing it on S3. You can use sys_get_temp_dir() to get a path used for temporary files.
https://secure.php.net/manual/en/function.sys-get-temp-dir.php
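As a rough sketch (assuming Laravel's Storage facade with an s3 disk configured; recolor.sh is a stand-in for your actual shell script):

```php
use Illuminate\Support\Facades\Storage;

// $url is the image URL submitted to the API.
$tmpPath = tempnam(sys_get_temp_dir(), 'img_');
file_put_contents($tmpPath, file_get_contents($url));

// Run the color-changing shell script on the local copy
// (recolor.sh stands in for your actual script).
exec('./recolor.sh ' . escapeshellarg($tmpPath));

// Push the edited image to the s3 disk, then clean up.
Storage::disk('s3')->put('images/' . uniqid() . '.jpg', fopen($tmpPath, 'r'));
unlink($tmpPath);
```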
I am currently using carrierwave-aws to upload to my S3 bucket.
One issue I am having: after the image is uploaded and saved, if the user, let's say, changes something about the image locally and re-submits it for upload with the same file name, the application does not reflect the newly uploaded file.
The user has to change some part of the file name for it to show the correct one in my application.
I am assuming this is a caching issue but am not sure where to begin to address this matter.
Has anyone else experienced this?
If your S3 bucket is set with a long or infinite expiry (which is a good idea for performance), you'll need to change the filename each time the image changes. See the Carrierwave wiki page on how to do this.
My client has a website with millions of users. I had to implement a custom file upload using AJAX/jQuery.
The file upload field just needs to provide a clickable and drag-and-droppable upload mechanism. Everything is working fine, except that when I submit the form it uploads the file again, even though it has already been uploaded via AJAX.
How can I prevent this duplicate upload?
Move the file input outside the form, or replace the file input with a fresh one once the AJAX upload has completed.
I have made an image upload page, and it all works smoothly.
My problem is that when I upload an image called filename.jpg and then upload it again, the same image uploads but the name changes to filename1.jpg.
How can I prevent this and instead show a message like 'the image you uploaded was already uploaded'?
Thanks
The change in filename is made on the server when the file is saved. To produce the prompt, you have to check the file name for duplicates in your PHP script while handling the upload. If you have a database that stores the filenames, this is very easy.
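A minimal sketch of that duplicate check (assumes a PDO connection in $pdo and an uploads table with a filename column; adjust the names to your schema):

```php
$filename = basename($_FILES['image']['name']);

// Look the name up before saving the file.
$stmt = $pdo->prepare('SELECT COUNT(*) FROM uploads WHERE filename = ?');
$stmt->execute([$filename]);

if ($stmt->fetchColumn() > 0) {
    // Duplicate found: reject instead of renaming.
    echo 'The image you uploaded was already uploaded.';
} else {
    move_uploaded_file($_FILES['image']['tmp_name'], __DIR__ . '/uploads/' . $filename);
    $pdo->prepare('INSERT INTO uploads (filename) VALUES (?)')
        ->execute([$filename]);
}
```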