CodeIgniter is not allowing me to upload raw image files (CR2, NEF, DNG, etc.) with the following configuration.
I'm using dropzone.js to upload image/video files. When files are dragged into the dropzone window, they are uploaded to a temp folder, and on submitting they are moved to the desired storage location.
I'm able to upload files with MIME types image/jpeg|image/png|image/gif|video/*. However, raw image files with extensions .CR2, .DNG, etc. are not getting uploaded. The upload always fails with the error "You did not select a file to upload."
On analysing the issue, I found that the default upload library only accepts jpeg|png|gif and other common image formats.
$config['upload_path'] = './upload_files/temp/';
$config['allowed_types'] = 'jpeg|png|jpg|cr2|dng|srf|mp4|mov|mpg|mpeg|wmv|mkv';
$config['encrypt_name'] = TRUE;
$this->load->library('upload');
$this->upload->initialize($config);
if (!$this->upload->do_upload('file')) {
    $msg = $this->upload->display_errors('', '');
    header("HTTP/1.1 404 Not Found");
    return $msg;
} else {
    ....
}
Is updating the upload library to allow the required file types the only option, or do other options exist?
Secondly, if the files are to be uploaded to a different storage server, such as AWS S3, is it always wise to upload to a local temp folder first and then push them to the remote server?
TIA
Have a look at your MIME type configuration file, config/mimes.php, and add:
'dng' => array('image/x-adobe-dng'),
'cr2' => array('image/x-dcraw', 'image/x-canon-cr2'),
You can find a list of MIME types for raw images here.
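If you're not sure which MIME type your browser actually sends for these files, a quick debugging sketch (assuming the same 'file' field name as in the question) is to dump what PHP receives:
// Dump the MIME type reported for an uploaded CR2/DNG file, so you know
// exactly which entry to add to config/mimes.php.
var_dump($_FILES['file']['type']);                         // type reported by the browser
var_dump(mime_content_type($_FILES['file']['tmp_name'])); // type detected by PHP's fileinfo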
Update, following up on the comments:
RAW files are rather large, so make sure the size limits in your php.ini settings are high enough: increase post_max_size (and upload_max_filesize) accordingly.
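For example (illustrative php.ini values; pick limits that fit your largest files):
; Allow individual uploads up to 64 MB and the whole POST body up to 80 MB
upload_max_filesize = 64M
post_max_size = 80M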
This works on PHP 7.2.20.
Hi, I am building a Laravel Livewire app on Heroku.
The app requires file uploads and I'm using Livewire's FileUpload. It works fine locally, but on Heroku it reports the upload as successful, and then when I download the file I get a "No file" message. I don't know where the error lies.
Here is my source code.
In the controller:
public function updatedFile()
{
    // $this->validate();
    $fileUpload = new File();
    $fileUpload->url = $this->file->storeAs('public/files/' . auth()->id(), $this->file->getFilename());
    $fileUpload->size_file = $this->getFileSize($this->file);
    $fileUpload->file_name = $this->file->getClientOriginalName();
    $fileUpload->model_name = $this->model_name;
    $fileUpload->model_id = $this->model_id;
    $fileUpload->admin_id = auth()->check() ? auth()->id() : null;
    $fileUpload->save();
    if ($this->model_id == null) {
        $this->list[] = $fileUpload->id;
    }
}
In the view:
<a href="{{ $canDownload ? asset('storage/' . substr($val['url'], 7, strlen($val['url']) - 7)) : '#' }}" download>
<span class="d-block mb-0" style="word-break: break-all;">{{ $val['file_name'] }}</span>
<small class="kb">{{ $val['size_file'] }}</small>
</a>
The immediate issue may just be with relative vs. absolute paths.
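For example, one way to sidestep the public/ prefix is to name the disk explicitly (a sketch against your controller, not a confirmed fix; storeAs accepts the disk name as its third argument):
// Sketch: store on the "public" disk explicitly instead of embedding
// "public/" in the path, so the view can build the URL without substr().
$fileUpload->url = $this->file->storeAs(
    'files/' . auth()->id(),      // directory on the disk
    $this->file->getFilename(),   // keep the generated file name
    'public'                      // disk defined in config/filesystems.php
);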
But even once you resolve that you'll find that your uploads disappear frequently and unpredictably. This is due to Heroku's ephemeral filesystem.
To store uploads long-term you'll need to use a third-party service like Amazon S3 or Azure Blob Storage. It looks like Livewire supports this directly:
The previous example demonstrates the most basic storage scenario: Moving the temporarily uploaded file to the "photos" directory on the app's default filesystem disk.
However, you may want to customize the file name of the stored file, or even specify a specific storage "disk" to store the file on (maybe in an S3 bucket for example).
It also provides the following example:
// Store in the "photos" directory in a configured "s3" bucket.
$this->photo->store('photos', 's3');
And links to the relevant Laravel documentation, saying that Livewire uses the same API. Just make sure to configure an S3 bucket.
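Configuring the bucket typically means installing the Flysystem S3 adapter (league/flysystem-aws-s3-v3) and filling in the standard S3 environment variables (placeholder values below):
AWS_ACCESS_KEY_ID=your-key-id
AWS_SECRET_ACCESS_KEY=your-secret
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET=your-bucket-name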
You can also completely bypass your server, having uploads go directly from the user's browser to your S3 bucket. This is particularly useful with large uploads.
Make sure to use the correct disk when building your download URL.
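For example (a sketch, assuming the $fileUpload model from your question):
use Illuminate\Support\Facades\Storage;

// Build the download URL from the same disk the file was stored on
$url = Storage::disk('s3')->url($fileUpload->url);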
I'm using Laravel Vapor to host a site. Up until now, I've not had a problem with the lack of a filesystem, but now I've hit a brick wall.
I'm trying to optimize .png and .jpeg files, and the libraries I found require a filesystem to write the compressed files:
Image Optimizer (https://github.com/spatie/image-optimizer)
PHP Image Cache (https://nielse63.github.io/php-image-cache/)
I'm guessing that I can set up an external service that runs on an additional traditional server... But I'd prefer to make it work with Vapor.
Any ideas?
Have you tried using GD Library or Imagick directly?
Using Imagick with files on S3, you can do something like this:
$s3 = \Storage::disk('s3');
$file = $s3->get('tmp/'.$uuid); // assuming Vapor uploaded to S3 here
$imagick = new \Imagick(); // the extension can be added to Vapor
$imagick->readImageBlob($file);
$imagick->thumbnailImage(200, 200); // whatever size you are looking for
$s3->put('path/on/s3/for/your/optimized/file', $imagick->getImageBlob(), ['CacheControl' => 'max-age=10000000, public', 'ACL' => 'public-read']); // whatever options you need
Note that both the read and the write go directly to/from S3; there is no need to write to the local disk.
I'm having a problem loading images in my HTML dynamically after storing them successfully with Laravel Vapor.
I have followed this documentation provided by Laravel Vapor to store files, and it works like a charm. I copy my uploaded files from the tmp directory into the root of my S3 bucket and then store the path of that file in my database's images table, so that later I can return the file path to my front end and display the image in the browser.
Unfortunately this is always returning a 403 status code from AWS S3.
I could fix this by making my generated S3 bucket public, but that would raise a security issue. I believe this should work out of the box, not sure where I could have gone wrong... any ideas?
I am returning the uploaded image url using the Storage facade.
use Illuminate\Support\Facades\Storage;
return Storage::url($image->path);
Where $image->path is the file path in my S3 bucket.
I'm sure that the Storage facade is working correctly because it returns the correct URL with the file's path.
I got the solution to this problem. I contacted Laravel Vapor support and was told to set the file's visibility to public when copying it to the permanent location, as stated in Laravel's official documentation here.
So after you upload your file using the JS vapor.store method, you should copy it to a permanent directory, then set its visibility to public.
Storage::copy($request->path, str_replace('tmp/', '', $request->path));
Storage::setVisibility(str_replace('tmp/', '', $request->path), 'public');
I also noticed that you can set the visibility of the file directly in the vapor.store method by passing a visibility attribute with the respective value.
vapor.store(file, { visibility: 'public-read' });
As a side note: just 'public' will return a 400 Bad Request; it must be set to 'public-read'.
I was trying to store an image from a Dropbox URL to a local folder with Laravel Intervention Image, but I am getting error after error.
Can anyone please tell me how I can do this?
My code is this:
$path = 'https://www.dropbox.com/s/vwswp91fiz0m1wd/1200px-Good_Food_Display_-_NCI_Visuals_Online.jpg?dl=0';
$filename = explode('?',basename($path))[0];
Image::make($path)->save('images/'.$filename);
The error I am getting for this is:
Unable to init from given binary data.
So I tried the solution from a Stack Overflow post:
$path = 'https://www.dropbox.com/s/vwswp91fiz0m1wd/1200px-Good_Food_Display_-_NCI_Visuals_Online.jpg?dl=0';
$filename = explode('?',basename($path))[0];
$path = base64_decode($path);
Image::make($path)->save('images/'.$filename);
But that gave me another error.
I tried looking on Google but I didn't find any solid answer that works for my case.
Can anyone please help me with how to download an image from a Dropbox URL and save it to local storage? Or do I have to add the Dropbox API or something?
The Dropbox link that you used, https://www.dropbox.com/s/vwswp91fiz0m1wd/1200px-Good_Food_Display_-_NCI_Visuals_Online.jpg?dl=0, is an image preview page, not valid image content. You can fetch the raw image content from Dropbox by changing the query parameter from ?dl=0 to ?raw=1.
$path = 'https://www.dropbox.com/s/vwswp91fiz0m1wd/1200px-Good_Food_Display_-_NCI_Visuals_Online.jpg?raw=1';
Image::make($path)->save('images/'.$filename);
See also: Force a file or folder to download, or to render on dropbox.com
I am developing an API using CodeIgniter and I want to let users upload images to my Amazon S3 account. I am using Phil Sturgeon's REST server and Donovan Schönknecht's S3 library (for CI).
It works perfectly for uploading a local file to Amazon, but how can I get the image file sent via a normal external form?
Using the built-in CI upload library it works fine, but then I have to store the files locally on my own server, and I want them on S3. Can the two be combined?
I guess what I am asking is: how can I "get" the image file that is sent to the controller, resize it, and then upload it to S3?
Do I perhaps need to temporarily save it on the local server, upload it to S3, and then remove it from the local server?
This is my "upload" modal:
// Load the s3 library
$this->load->library('S3');
// Make the upload
if ($this->s3->putObjectFile($args['local'], "siticdev", $args['remote'], S3::ACL_PUBLIC_READ)) {
// Handle success
return TRUE;
} else {
// Handle failure
return FALSE;
}
Thankful for all input!
If I understand you correctly, you want a user to upload an image via a form, resize that image, then transfer that to Amazon S3.
You'll have to store the file locally (at least for a few seconds) to resize it with CI. After you resize it, then you can transfer it to Amazon S3. In your success callback from the transfer, you can delete the image from your server.
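A rough sketch of that flow, assuming the S3 library from your model and a hypothetical form field named 'userfile':
// Sketch: receive the upload, resize it locally, push it to S3, clean up.
$this->load->library('upload', array(
    'upload_path'   => './upload_files/temp/',
    'allowed_types' => 'jpg|jpeg|png|gif',
));

if ($this->upload->do_upload('userfile')) {
    $data  = $this->upload->data();
    $local = $data['full_path'];

    // Resize in place with CodeIgniter's image manipulation library
    $this->load->library('image_lib', array(
        'source_image'   => $local,
        'width'          => 800,
        'height'         => 600,
        'maintain_ratio' => TRUE,
    ));
    $this->image_lib->resize();

    // Transfer to S3, then delete the local copy
    $this->load->library('S3');
    if ($this->s3->putObjectFile($local, 'siticdev', $data['file_name'], S3::ACL_PUBLIC_READ)) {
        unlink($local);
    }
}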
You should definitely check out the CI S3 library. The "spark" is available here - http://getsparks.org/packages/amazon-s3/versions/HEAD/show