I created a Next.js project that is deployed on Vercel and uses a MySQL database. I then deployed a Directus instance on Heroku that is tied to that same database. In my Next.js project I want to fetch and render images that I uploaded to Directus. At first this works, but after a while all the images disappear from the Directus media library. The folders and references to the images are still there, but I no longer see the pictures themselves, only a JPG placeholder icon. When I try to fetch the images I get a 502 "Bad Gateway" error. I don't know what causes the images to disappear or how to fix this.
By default, Directus stores uploaded files locally on disk.
All Heroku applications run in a collection of lightweight Linux containers called dynos. Be aware that a Heroku dyno's filesystem is ephemeral.
That means whenever your app is redeployed or restarted (Heroku cycles dynos at least once a day), or a free dyno goes to sleep after 30 minutes without traffic, the dyno is destroyed and its filesystem goes along with it. So this filesystem should not be used for any permanent storage (Directus file storage, in your case). It also explains your symptoms: the folders and file metadata live in your MySQL database and survive, while the actual image files on the dyno's disk are lost, which is why requests for them fail.
You can configure Directus to use S3, Google Cloud Storage, Azure, or Cloudinary.
For more details, check the File Storage section of the Directus docs.
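For example, switching the storage driver to S3 is mostly a matter of environment variables, which on Heroku you can set as config vars. This is only a rough sketch following the STORAGE_* convention from the File Storage docs; the credentials, bucket, and region below are placeholders, so double-check the exact variable names for your Directus version:

heroku config:set \
  STORAGE_LOCATIONS="s3" \
  STORAGE_S3_DRIVER="s3" \
  STORAGE_S3_KEY="<your-aws-access-key-id>" \
  STORAGE_S3_SECRET="<your-aws-secret-access-key>" \
  STORAGE_S3_BUCKET="<your-bucket-name>" \
  STORAGE_S3_REGION="<your-bucket-region>"

Note that any files that only ever existed on the dyno's disk are already gone and will need to be re-uploaded once the new storage location is in place.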
Related
I've deployed Strapi on Heroku and have set up the content fine. When I uploaded images and videos to Strapi using the CMS interface and saved the update, it saved successfully, but the file URL returns a 404. Has anyone experienced this before? Am I missing something?
Thanks guys.
https://strapi.io/documentation/3.0.0-beta.x/guides/deployment.html#file-uploads
File Uploads
Like with project updates on Heroku, the file system doesn't support local uploading of files as they will be wiped when Heroku "Cycles" the dyno. This type of file system is called ephemeral, which means the file system only lasts until the dyno is restarted (with Heroku this happens any time you redeploy or during their regular restart which can happen every few hours or every day).
Due to Heroku's filesystem you will need to use an upload provider such as AWS S3, Cloudinary, or Rackspace. You can view the documentation for installing providers here and you can see a list of providers from both Strapi and the community on npmjs.com.
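For what it's worth, with Strapi v3 stable the S3 provider is configured in config/plugins.js after installing strapi-provider-upload-aws-s3 (the 3.0.0-beta.x releases linked above configured it in extensions/upload/config/settings.json instead). The environment variable names here are placeholders, so treat this as a sketch rather than a drop-in config:

// config/plugins.js
module.exports = ({ env }) => ({
  upload: {
    provider: 'aws-s3',
    providerOptions: {
      accessKeyId: env('AWS_ACCESS_KEY_ID'),
      secretAccessKey: env('AWS_ACCESS_SECRET'),
      region: env('AWS_REGION'),
      params: {
        Bucket: env('AWS_BUCKET_NAME'), // placeholder bucket variable
      },
    },
  },
});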
I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room, and I have set up an AWS S3 bucket to store my files in. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku, then uploading each of those files to the S3 bucket. I'm just hoping that there may be some tool out there, or that someone has an easier process for doing this.
thanks
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still use the previous file adapter (on your Heroku instance) and only stores anything new on the new adapter (S3 storage).
parse-server-migrating-adapter
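If you do end up writing the loop yourself, a rough sketch with the Parse JS SDK could look like the following. This is untested and full of assumptions — a class named Submission, a file column named attachment, and old file URLs that still resolve — so adapt it to your own schema before running it:

// migrate-files.js — re-save each old file so it goes through the new S3 adapter
const Parse = require('parse/node');
const fetch = require('node-fetch');

Parse.initialize(process.env.APP_ID, null, process.env.MASTER_KEY);
Parse.serverURL = process.env.SERVER_URL;

async function migrate() {
  const query = new Parse.Query('Submission'); // hypothetical class name
  query.exists('attachment');                  // hypothetical file column
  await query.each(async (obj) => {
    const oldFile = obj.get('attachment');
    // Download the old file while its URL still resolves...
    const res = await fetch(oldFile.url());
    const buffer = await res.buffer();
    // ...then save it again, which routes it through the currently configured
    // files adapter (your S3 bucket), and repoint the object at the new copy.
    const newFile = new Parse.File(oldFile.name(), { base64: buffer.toString('base64') });
    await newFile.save({ useMasterKey: true });
    obj.set('attachment', newFile);
    await obj.save(null, { useMasterKey: true });
  }, { useMasterKey: true });
}

migrate().catch(console.error);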
I am using a Rails 4 application on Bluemix and attaching files with the Paperclip gem. As we all know, Paperclip saves a reference to the file in the actual DB and saves the physical file to a /public location.
I am submitting a file, which is getting saved here:
/home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf
and then the file retrieval is working perfectly fine. Once I restart my app, I get:
Errno::ENOENT (No such file or directory # rb_file_s_lstat - /home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf):
And this is because Bluemix is not persisting this information. How can I get hold of those files between app restarts?
Bluemix is built on top of Cloud Foundry, and it has an ephemeral filesystem, i.e., once your application stops, the platform reclaims that filesystem and creates a brand-new one when you restart your application.
Writing to the local filesystem is not recommended for cloud applications, and you may need to redesign your application to work with Bluemix. One solution is to save your files in your database, not just a reference to them.
You can find more details at this link.
Each application instance on Bluemix (which is based on Cloud Foundry) has ephemeral storage. This storage is only available for the lifetime of that particular instance. When you redeploy your app then you'll get a new app instance and any data on the previous app instance will be inaccessible.
There's a good explanation of why it's best to avoid writing to the local file system when designing an application for Bluemix / Cloud Foundry.
You may want to take a look at a gem like CarrierWave to store the files on Amazon S3 or another persistent store. There's also Paperclip, which offers similar functionality.
I'm using App Engine's high performance image serving on my site, and I'm able to get everything working properly on both my local machine and in production i.e. I can upload an image and successfully display the images using get_serving_url on the blob key. However, these images don't seem to persist on my development server, i.e. after I come back from a computer restart, the images no longer show up. The development server spits out:
images_service_pb.ImagesServiceError.BAD_IMAGE_DATA
which I'm guessing is actually because the underlying blobs are no longer there (although this is just a hunch). The rest of my datastore is still intact though, as I'm using the launch setting "--datastore_path" to ensure my data persists. Is there a separate flag I need to be using to persist the blobs as well? Or is there a separate problem here that I'm missing?
You must use --blobstore_path=DIR:
--blobstore_path=DIR    Path to directory to use for storing Blobstore file stub data.
You can see all the options by typing dev_appserver.py --help on the command line.
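For example, a dev server invocation that persists both the datastore and the blobstore between restarts would look something like this (the paths are just placeholders; point them anywhere outside a temporary directory):

dev_appserver.py --datastore_path=/path/to/datastore.db \
                 --blobstore_path=/path/to/blobstore_dir \
                 /path/to/your/app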
I have an app on Heroku, and I need simple file storage for uploaded images. For this I used send_data with the attachment_fu plugin.
After that I used the tmp/ directory to write the file, and I want to display it in the browser, but these files are not displayed.
How can I display these images in the browser?
What is an alternative solution for storing and retrieving images?
Thanks!
You cannot persistently store uploaded files on Heroku's filesystem.
You must use an alternative strategy. A couple of options:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.)