How to update a Laravel project on AWS

I'm new to AWS. I uploaded a Laravel project to AWS using Elastic Beanstalk, and users uploaded videos, which I'm saving inside the public folder. I have now made a few updates and want to upload a newer version of the Laravel project. If I upload a new version, does that replace the public folder on AWS with the one in the uploaded version? I don't want to lose the videos that the users uploaded. How can I do it?

Does that replace the public folder on AWS with the one in the uploaded version? I don't want to lose the videos that the users uploaded.
Yes, it does, and you will probably lose data after an update. Even if not, you will lose data due to auto-scaling events (automated termination by AWS and creation of a replacement) on your EB instance.
Therefore, it's not good practice to keep your videos on the instance itself. Instead, they should be stored outside the instance. Usually, S3 or EFS is used for this purpose. This way you store the data outside of your EB environment, and you will not lose it due to updates or auto-scaling events.
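As a minimal sketch of the S3 approach (assuming the league/flysystem-aws-s3-v3 package is installed and the s3 disk in config/filesystems.php points at your bucket through the usual AWS_* variables in .env; the controller and field names below are hypothetical), the upload code writes to the s3 disk instead of the public folder:

    <?php

    namespace App\Http\Controllers;

    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Storage;

    class VideoUploadController extends Controller
    {
        public function store(Request $request)
        {
            // Make sure a video file was actually submitted.
            $request->validate([
                'video' => 'required|file|mimetypes:video/mp4,video/quicktime',
            ]);

            // Write the file to the S3 bucket under "videos/" instead of public/.
            $path = $request->file('video')->store('videos', 's3');

            // Persist only the S3 key in the database; build URLs from it when rendering.
            return response()->json([
                'path' => $path,
                'url'  => Storage::disk('s3')->url($path),
            ]);
        }
    }

With the videos in S3, deploying a new application version or having Elastic Beanstalk replace the instance no longer touches the uploaded files.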

Related

Directus images disappear after a while

I created a Next.js project that is deployed on Vercel and uses a MySQL database. I then deployed a Directus instance on Heroku that is tied to that same database. In my Next.js project I want to fetch and render images that I uploaded to Directus. At first this works, but after a while all the images disappear from the Directus media library. The folders and references to the images are still there, but I don't see the pictures anymore; I see a JPG logo instead. When I try to fetch the images I get a 502 "Bad Gateway" error. I don't know what causes the images to disappear or how to fix this.
By default, Directus stores uploaded files locally on disk.
All Heroku applications run in a collection of lightweight Linux containers called dynos. Be aware that a Heroku dyno's filesystem is ephemeral.
This means that if your Heroku app doesn't receive any traffic for 30 minutes, or is redeployed, the dyno it lives on is destroyed and its filesystem goes along with it. So this filesystem should not be used for any permanent storage (Directus file storage, in your case).
You can configure Directus to use S3, Google Cloud Storage, Azure, or Cloudinary.
For more details, check the File Storage section of the Directus docs.
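As a rough sketch, assuming Directus 9 on Heroku (the variable names follow the File Storage docs mentioned above and may differ between Directus versions; the values are placeholders), switching the storage driver to S3 is done through the app's config vars:

    STORAGE_LOCATIONS="s3"
    STORAGE_S3_DRIVER="s3"
    STORAGE_S3_KEY="<aws-access-key-id>"
    STORAGE_S3_SECRET="<aws-secret-access-key>"
    STORAGE_S3_BUCKET="<bucket-name>"
    STORAGE_S3_REGION="<aws-region>"

Note that files already written to the dyno's local disk are gone once the dyno cycles; only uploads made after the switch end up in the bucket.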

Parse Server: Transferring files hosted on Heroku to an AWS S3 bucket

I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room, and I have set up an AWS S3 bucket to store my files. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku, then uploading each of those files to the S3 bucket. I'm just hoping that there may be some tool out there, or that someone has an easier process for doing this.
Thanks.
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still use the previous file adapter (on your Heroku instance) for existing files and store anything new on the new adapter (S3 storage):
parse-server-migrating-adapter

How to persist Bluemix files between app restarts

I am using a Rails 4 application on Bluemix and attaching files using the Paperclip gem. As we all know, Paperclip saves a reference to the file in the actual database and saves the physical file to a /public location.
I am submitting a file to this database, which is getting saved here:
/home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf
and the file retrieval then works perfectly fine. But once I restart my app, I get:
Errno::ENOENT (No such file or directory # rb_file_s_lstat - /home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf):
And this is because Bluemix does not persist this information. How can I get hold of those files between app restarts?
Bluemix is built on top of Cloud Foundry and has an ephemeral filesystem, i.e., once your application stops, the platform reclaims that filesystem and creates a brand new one when you restart your application.
Writing to the local filesystem is not recommended for cloud applications, and you may need to redesign your application to work with Bluemix. One solution is to save your files in your database, and not only the reference.
You can find more details at this link.
Each application instance on Bluemix (which is based on Cloud Foundry) has ephemeral storage. This storage is only available for the lifetime of that particular instance. When you redeploy your app, you'll get a new app instance, and any data on the previous app instance will be inaccessible.
There's a good explanation of why it's best to avoid writing to the local file system when designing an application for Bluemix / Cloud Foundry.
You may want to take a look at a gem like CarrierWave to store the files on Amazon S3 or another persistent store. There's also Paperclip, which offers similar functionality.

Store, fetch and map images from Amazon Web Services in an MVC .NET application

I'm new to Amazon Web Services. I created an instance in AWS EC2 to publish my website. Now I have a requirement.
I have resources, and each resource must be able to choose an image (as a profile picture) at runtime. I want to fetch the images from Amazon storage and map them in the already developed MVC .NET application. I had the idea of storing the images in Amazon S3 (in a bucket), but I need to know how to fetch them at runtime so that resources can choose their profile picture from the images uploaded to the bucket.
Please let me know if there is any other way to store and fetch profile pictures from Amazon in my MVC .NET application.
Store the original image file using the S3 Standard storage option. Store reproducible images like thumbnails in the S3 Reduced Redundancy Storage (RRS) option to save costs. Store the metadata about the images, including the S3 URL mapping, in Amazon RDS and query it from EC2 whenever needed.

How to upload images to the Heroku server using the attachment_fu plugin

I have an app on Heroku, and I need simple file storage for uploaded images; for this I used send_data with the attachment_fu plugin.
After that I used the tmp/ directory to write these files and wanted to display them in the browser, but the files are not displayed in the browser.
How can I display these images in the browser?
What is an alternative solution to store and retrieve images?
Thanks!
You cannot persistently store uploaded files on Heroku's filesystem.
You must use an alternative strategy. A couple alternative strategies:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.)
