Uploading files to a Heroku website connected to GitHub?

I'm making a pretty simple website with a feature that allows users to upload an image for their profile, which is then saved in an uploads directory.
It seems to work fine; however, when I push a new local version of the site, I lose all the uploaded files. I'm not exactly sure where Heroku stores them. Is there a way I can push those to GitHub, or is there another solution?

If I understood your question correctly, it seems like you want to get data from your Heroku app into your repository. If so, try this:
https://blog.heroku.com/push_and_pull_databases_to_and_from_heroku

Heroku's filesystem is ephemeral. If you want to store user-uploaded content, you need to use something like S3 to store the files.
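For example, here is a minimal sketch of what that could look like with boto3; the bucket name, config var, and key layout are assumptions for illustration, not anything from the question:

import os
import boto3

s3 = boto3.client("s3")  # reads AWS credentials from the environment / Heroku config vars
BUCKET = os.environ["S3_BUCKET"]  # hypothetical config var holding the bucket name

def save_profile_image(user_id, file_obj, filename):
    # Stream the upload straight to S3 instead of writing to the dyno's uploads/ directory
    key = f"uploads/{user_id}/{filename}"
    s3.upload_fileobj(file_obj, BUCKET, key)
    # Serve the image from S3 (or a CDN in front of it) rather than from the dyno
    return f"https://{BUCKET}.s3.amazonaws.com/{key}"

Anything written to the dyno's local uploads directory disappears on the next deploy or restart, so the object store should be treated as the only copy.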

Related

one-click Strapi deployment but no empty .git folder locally

I have watched a demo video on how to use one-click Strapi deployment to set up a Strapi API. I follow all the steps in the video and get no messages until I get the message: "you appear to have cloned an empty repository". This also happens in the video, but the tutor has an empty .git folder in the project folder. He says it is important that this is there, but for me there is no .git folder. I am very confused, since I do the exact same steps. Has anyone had this issue? I can run the Strapi API locally on my machine, but I want to create a page with Netlify that fetches data from that API and displays it on the web for my portfolio. Am I right that if I do this locally and not via Heroku, I can't display products etc. on the final website?

Validating file extensions

I am working on a Laravel/VueJS serverless project.
I am trying to whitelist/blacklist file extensions for my AWS S3 bucket to keep people from uploading files I don't want.
I use Vapor in VueJS to temporarily upload files and get back a key, which I send to my controller in a POST request so it can take the file from the temporary path and move it to the final path.
I have tried adding policies to my S3 bucket, but I am still able to upload file extensions I should not be able to.
If you have any clue, feel free to help me!
Thanks a lot!
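One common approach is to enforce the whitelist on the server before the file is moved out of the temporary path, rather than relying only on bucket policies. Below is a minimal sketch of that check, written in Python with boto3 purely for illustration (the bucket name, prefixes, and whitelist are assumptions; in a Laravel controller the equivalent check would run before the move):

import boto3

ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".pdf"}  # example whitelist, adjust as needed
BUCKET = "my-app-uploads"  # placeholder bucket name

s3 = boto3.client("s3")

def promote_upload(tmp_key, original_filename, final_key):
    # Reject anything whose original filename lacks a whitelisted extension
    if "." not in original_filename:
        raise ValueError("File has no extension")
    ext = "." + original_filename.rsplit(".", 1)[-1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"File type {ext} is not allowed")
    # Only then copy the object from the temporary path to its final path and clean up
    s3.copy_object(Bucket=BUCKET,
                   CopySource={"Bucket": BUCKET, "Key": tmp_key},
                   Key=final_key)
    s3.delete_object(Bucket=BUCKET, Key=tmp_key)

Doing the check in the controller that receives the key keeps the rule in one place, whatever the bucket policy allows.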

How to fetch newly added images from VM in google cloud platform?

When I add images, they are stored inside the Google Cloud Platform VM correctly.
But I am not able to fetch newly added images on my website.
If I redeploy the project with the newly added images in the assets folder, they show up correctly.
I have verified there is no mistake on the frontend or backend side.
Is it not possible to get live image updates with a VM?
Edit:
I have used Vue.js.
I am storing images inside the src/assets folder.
When I save images from my website, they are saved to the src/assets folder.
I think the site can only access things in the dist folder after the build.
Can you suggest where I should save my files?
I'm making an educated guess that it's some cache issue.
Instead of putting your image files inside the VM, you can try storing them in a bucket that's publicly accessible. The downside is that it can serve only static files (no PHP or anything).
You have to configure your load balancer to forward all your.domain.com/images/ requests to the bucket, but that's actually quite easy. Have a look at my answer to a question asking for exactly that configuration.
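If you go the bucket route, a minimal sketch of writing uploads to Cloud Storage instead of src/assets might look like this; the bucket name and object path are placeholders, and the VM's default service account is assumed to have write access:

from google.cloud import storage

client = storage.Client()  # uses the VM's default service-account credentials
bucket = client.bucket("my-site-images")  # placeholder bucket name

def save_image(local_path, object_name):
    blob = bucket.blob(f"images/{object_name}")
    blob.upload_from_filename(local_path)
    # The new image is served from the bucket immediately, with no rebuild or redeploy
    return f"https://storage.googleapis.com/my-site-images/images/{object_name}"

The frontend then references the bucket (or load-balancer) URL instead of an asset that was bundled into dist at build time.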

Heroku file upload

I'm new to Heroku and I'm looking for a way to upload a bunch of images to my application's directory. Is there a proper way of doing this, like SSH, filesystem access, or something similar?
Thanks in advance
I'm not sure exactly what you mean, but I would just add and commit them to git like all other files and then deploy:
git push heroku master
Hope it helps!

Migrating Wordpress Uploads To Amazon S3

I currently use a plugin on WordPress that creates a carbon copy of the uploads in an S3 bucket whenever a new picture is added. The problem is that on the same site there are about 700 pictures, uploaded before we started using this plugin, that aren't on S3, and we need to free up some space. The plugin doesn't copy these old pictures over.
Does anyone know of a way to redirect all the image URLs to the S3 bucket if we manually copy the files over? Could this be done with an .htaccess in the uploads folder?
Thanks for your help; I am very bad with redirects and things and need to improve.
Jozef
Using:
RedirectPermanent /wp-content/uploads http://xxxx.s3.amazonaws.com/wp-content/uploads
in your main .htaccess file should do the trick. Note that this will redirect all uploads (including potentially non-image uploads).
No need to install any plugin. I did it by mapping wp-content/uploads to an S3 bucket using S3FS. Refer to Moving wordpress uploads to Amazon s3 using S3FS.
If you want to save space on your server, don't use the use_cache option while mounting S3FS. Good luck!
Try using s3cmd:
install it, then configure it by providing your AWS access and secret keys,
then sync your uploads or image folder with the S3 bucket,
and you're done.
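If you'd rather script the one-off copy of the existing pictures than use s3cmd, a minimal sketch with boto3 does the same thing; the bucket name and uploads path are placeholders:

import os
import boto3

s3 = boto3.client("s3")
BUCKET = "xxxx"  # the bucket the plugin already writes to
UPLOADS_DIR = "wp-content/uploads"  # path relative to the WordPress root

for root, _dirs, files in os.walk(UPLOADS_DIR):
    for name in files:
        local_path = os.path.join(root, name)
        # Keep the same wp-content/uploads/... key layout the redirect expects
        key = local_path.replace(os.sep, "/")
        s3.upload_file(local_path, BUCKET, key)
        print(f"uploaded {key}")

Once the objects exist under the same paths, the RedirectPermanent rule above (or any other rewrite) can point the old image URLs at the bucket.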
