Migrating WordPress Uploads To Amazon S3

I currently use a plugin on WordPress that creates a carbon copy of the uploads in an S3 bucket whenever a new picture is added. The problem is that on the same site there are about 700 pictures, uploaded before we started using this plugin, that aren't on S3, and we need to free up some space. The plugin doesn't copy these old pictures over.
Does anyone know of a way to redirect all the image URLs to the S3 bucket if we manually copy the files over? Could this be done with an .htaccess file in the uploads folder?
Thanks for your help; I'm pretty bad with redirects and need to improve.
Jozef

Using:
RedirectPermanent /wp-content/uploads http://xxxx.s3.amazonaws.com/wp-content/uploads
in your main .htaccess file should do the trick. Note that this will redirect all uploads (including potentially non-image uploads).
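If you only want image files sent to S3, a narrower rule is possible. This is just a sketch, assuming the same bucket URL and that mod_alias is available:
# redirect only common image types under /wp-content/uploads to the S3 copy
RedirectMatch 301 ^/wp-content/uploads/(.+\.(jpe?g|png|gif))$ http://xxxx.s3.amazonaws.com/wp-content/uploads/$1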

No need to install any plugin. I did it by mapping wp-content/uploads to the S3 bucket using S3FS. Refer to Moving wordpress uploads to Amazon s3 using S3FS.
If you want to save space on your server, don't use the use_cache option when mounting S3FS. Good luck!
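For reference, a minimal s3fs mount might look like this (a sketch; the bucket name, mount point, and credentials file are placeholders):
# mount the bucket over the uploads directory; omitting use_cache keeps files off the local disk
s3fs your-bucket /var/www/html/wp-content/uploads -o passwd_file=/etc/passwd-s3fs -o allow_other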

Try using s3cmd: install it, configure it with your AWS access and secret keys, then sync your uploads or image folder with the S3 bucket, and you're done.
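Roughly, those steps look like this (the bucket name is a placeholder):
s3cmd --configure                                                     # prompts for the AWS access key and secret key
s3cmd sync wp-content/uploads/ s3://your-bucket/wp-content/uploads/   # copies anything not already in the bucket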

Related

Validating file extensions

I am working on a Laravel/VueJS Serverless project.
I am trying to whitelist/blacklist some file extensions for my AWS S3 bucket to avoid people uploading files I don't want.
I use Vapor in VueJS to temporarily upload files and get a key, which I send to my controller in a POST request to fetch the file from the temporary path and move it to the final path.
I have tried adding policies to my S3 bucket, but I am still able to upload file extensions I should not.
If you have any clue, feel free to help me!
Thanks a lot!
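For context, the controller-side step described above might look roughly like this in Laravel; the key and filename field names and the extension whitelist are assumptions, not part of the original question:
// check the extension server-side before moving the file out of the temporary path
$ext = strtolower(pathinfo($request->input('filename'), PATHINFO_EXTENSION));   // "filename" is a hypothetical field
abort_unless(in_array($ext, ['jpg', 'jpeg', 'png', 'pdf']), 422, 'File type not allowed');
\Storage::disk('s3')->copy($request->input('key'), 'uploads/' . uniqid() . '.' . $ext);  // move tmp object to the final path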

How to fetch newly added images from VM in google cloud platform?

When I add images, they are stored inside the Google Cloud Platform VM correctly.
But I am not able to fetch the newly added images on my website.
If I redeploy the project with the newly added images in the assets folder, they show up correctly.
I have verified there is no mistake on the frontend or backend side.
Is it not possible to get live image updates with a VM?
Edit:
I have used Vue.js.
I am storing images inside the src/assets folder.
When I save images from my website, they are saved in the src/assets folder.
I think the site can only access things in the dist folder after the build.
Can you suggest where I should save my files?
I'm making an educated guess that it's some cache issue.
Instead of putting your image files inside the VM, you can try storing them in a bucket that's publicly accessible. The downside is that it can serve only static files (no PHP or anything).
You have to configure your load balancer to forward all your.domain.com/images/ requests to the bucket, but that's actually quite easy. Have a look at my answer asking for such a configuration.
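Creating such a bucket and making its objects publicly readable takes only a couple of gsutil commands; this is a sketch with a placeholder bucket name:
gsutil mb gs://my-images-bucket                              # create the bucket
gsutil iam ch allUsers:objectViewer gs://my-images-bucket    # make its objects publicly readable
gsutil cp local-image.png gs://my-images-bucket/images/      # upload an image outside the VM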

Laravel API temp image via url

I'm working on a Laravel API project: when you upload an image, I change the colors with a shell script. The API accepts URLs, so that means I have to save the image in a temp folder so that I can edit it and then save it to my S3 filesystem. Is it better to save the temp image on the S3 filesystem or locally?
It will likely be much faster to save the image locally in a temp directory to make the changes before storing it on S3. You can use sys_get_temp_dir() to get a path used for temporary files.
https://secure.php.net/manual/en/function.sys-get-temp-dir.php
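As a rough sketch of that flow (the recolor.sh script name and the url request field are made up for illustration):
// download the source image into the system temp dir, run the shell script on it, then push the result to S3
$tmpPath = tempnam(sys_get_temp_dir(), 'img_');
file_put_contents($tmpPath, file_get_contents($request->input('url')));
exec('sh recolor.sh ' . escapeshellarg($tmpPath));
\Storage::disk('s3')->put('processed/' . basename($tmpPath) . '.jpg', fopen($tmpPath, 'r'));
unlink($tmpPath);   // remove the local temp copy once it is on S3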

Uploading files to a Heroku website connected to GitHub?

I'm making a pretty simple website and I have a feature that allows users to upload an image for their profile which is then saved in an uploads directory.
It seems to work fine; however, when I push a new local version of the site out, I lose all the uploaded files. I'm not exactly sure where Heroku stores them. Is there a way I can push those to GitHub, or is there another solution?
It seems like you want data from the Heroku app in your repository, if I understood your question correctly. Then try this:
https://blog.heroku.com/push_and_pull_databases_to_and_from_heroku
Heroku's filesystem is ephemeral. If you want to store user uploaded content, you need to use something like S3 to store the files.

Secure upload files in Laravel

I have a Laravel 5 project in which I am uploading files to the database in Medium Blob format.
But uploading files to the database takes some extra time to execute.
Uploading files to the database is a secure way to keep files safe from crawlers or bots.
I have tried uploading files to the public folder, but crawlers can open those files.
Is there any possible way to upload files to the file system
so that crawlers cannot open them?
I want these files to be secured.
You can upload them outside of the public scope. For example, the storage/ folder is a good place. You can then grab them using the filesystem manager. Take a look:
$image = \Storage::get('file.jpg');
Edit
A correct Laravel installation only allows the content of public/ to be accessible via the web browser. If other directories such as storage/ or resources/ are publicly reachable too, then your installation is really incorrect.
That said, once you upload the files to the storage/ folder, nobody can access them except you, via the \Storage facade. When you call, for example, \Storage::get('file.jpg'); it returns a stream of bytes that you can write to a temporary location and then display on the website. Once the request has finished, the image disappears from public reach again.
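A minimal sketch of serving such a file only to logged-in users (the uploads/ subfolder and route path are assumptions):
// stream a private file from storage/app to authenticated users only
Route::get('/files/{name}', function ($name) {
    if (! auth()->check()) {                                  // reject guests and crawlers
        abort(403);
    }
    $path = storage_path('app/uploads/' . basename($name));   // basename() blocks path traversal
    if (! file_exists($path)) {
        abort(404);
    }
    return response()->file($path);                           // send the file without exposing storage/
});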
No need to change the directory; this can be achieved in two ways:
LazyOne's answer using .htaccess
AND
using robots.txt.
I suggest implementing both .htaccess and robots.txt, as some cheap crawlers ignore robots.txt but they can't ignore .htaccess.
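For illustration only (LazyOne's actual rules aren't reproduced here), a robots.txt entry and an .htaccess user-agent block could look like this:
# robots.txt — ask well-behaved crawlers to skip the uploads folder
User-agent: *
Disallow: /uploads/
# .htaccess — refuse requests from obvious bot user agents (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]
RewriteRule .* - [F,L]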
You can follow this method:
image-accessibility-for-authenticated-users-only
This only allows authorized users to view the images.
