Phoenix file copying on Heroku

I'm trying to upload images to my Phoenix app on Heroku. I have a simple app that follows the instructions for file uploading from Phoenix's docs.
I have a simple form and the controller uses File.cp_r() to copy the file from the temp directory.
def create(conn, %{"user" => user_params}) do
  if upload = user_params["photo"] do
    File.cp_r(upload.path, "priv/static/images/foo.jpg") # non-dynamic name, doesn't matter
  end
  ...
end
Works just fine locally. However, when I push this to Heroku, test the form, and run heroku run find on the directory, I don't see anything.
I have noticed that this directory on Heroku has seemingly forbidding permissions:
drwx------ 2 u25619 dyno 4096 Apr 23 05:14 images
I tried slipping in a nice little File.chmod("priv/static/images", 0o777), but to no avail; that directory seems locked away from me, so I think this is a Heroku issue.
Any idea how to handle this?
EDIT: Resolved by using the ex_aws dependency to upload to an Amazon S3 bucket.
ex_aws dependency
partial explanation (note: you also need to add poison and hackney to make this work; they are not mentioned there)
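For reference, a minimal sketch of what this ends up looking like, assuming ex_aws 2.x (where S3 support lives in the separate ex_aws_s3 package); the bucket name and key below are placeholders, and credentials are read from the standard AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables, which map neatly onto Heroku config vars:

# mix.exs -- ex_aws defaults to hackney for HTTP and poison for JSON,
# which is why both have to be added as dependencies:
defp deps do
  [
    {:ex_aws, "~> 2.0"},
    {:ex_aws_s3, "~> 2.0"},
    {:hackney, "~> 1.9"},
    {:poison, "~> 3.0"}
  ]
end

# In the controller, instead of copying into priv/static:
if upload = user_params["photo"] do
  body = File.read!(upload.path)

  "my-bucket"
  |> ExAws.S3.put_object("images/foo.jpg", body)
  |> ExAws.request!()
end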

The file system on Heroku is ephemeral: you won't have access to any files you save on it across deploys or when new instances start.
Also, when you use heroku run, you're not connecting to the instance that's currently running your app; it launches a new one-off instance, so the uploaded files won't exist there.
A better approach is to save uploaded files to S3 or similar, where you can still access them across deploys.

Related

Is there a way to access files that are created by code on Heroku

I have a Discord bot that saves JSON files in the same directory it runs from, so it can work on more than one server without collisions (it saves variables that are not important to the question).
I finished my code and uploaded it to Heroku for hosting. The thing is, when I ran the code from my PC I could see the files being created for each server while testing, but now I don't know how to reach them.
Is there a way to check all the files I have on Heroku?
(by all, I also mean the JSON files that were created by the bot itself)
Side note:
I can do heroku run bash -a APPNAME, but it still doesn't show me the files that were created in the dyno's directory.
On top of that, if someone knows another good hosting site (preferably free) that doesn't use an ephemeral filesystem, it would be great if you could mention it in the comments below.
(or if you have a way to save the files before the dyno deletes them)
What you are searching for on Heroku is called Heroku Exec (SSH Tunneling), which you can use to SSH into running dynos for debugging purposes.
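For example (the dyno and app names are placeholders, and the first run may prompt Heroku to provision the Exec feature and restart the app):

heroku ps:exec --dyno=web.1 -a your-app-name

Keep in mind this only lets you inspect a dyno while it's running; anything on its filesystem is still lost on restart, so it's a debugging aid rather than a way to persist the bot's JSON files.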

Heroku and Nuxt file uploader not working

I have a PWA made with NuxtJS correctly deployed and working on Heroku.
I would like to implement a file uploader and manager so that I can manage some files in a directory (~/static/files) from my front-end through some APIs.
On localhost it works fine: I have my directory, and when I add or delete a file, it is created or deleted on the file system (as it should be).
My question is: why can't I do the same on Heroku? I tried uploading a file and then deleting it, and that works, but the problem comes when I restart the app (through heroku ps:restart -a appname): the file is gone, as if it had been saved in RAM rather than on the file system.
If I look in the directory where the files should be, through heroku run bash -a appname, no file is shown.
How can I fix this?
The Heroku filesystem is ephemeral - that means that any changes to the filesystem whilst the dyno is running only last until that dyno is shut down or restarted. Each dyno boots with a clean copy of the filesystem from the most recent deploy. This is similar to how many container based systems, such as Docker, operate.
In addition, under normal operations dynos will restart every day in a process known as "Cycling".
These two facts mean that the filesystem on Heroku is not suitable for persistent storage of data. In cases where you need to store data we recommend using a database addon such as Postgres (for data) or a dedicated file storage service such as AWS S3 (for static files).
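For example, provisioning a Postgres add-on is a one-liner (the app name is a placeholder; Heroku picks the default plan when none is specified):

heroku addons:create heroku-postgresql -a your-app-name

Uploaded files themselves are better sent to something like S3, with only their URLs or metadata stored in the database.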

Parse Server: transferring files hosted on Heroku to an AWS S3 bucket

I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room, and I have set up an AWS S3 bucket to store my files. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku, then uploading those files to the S3 bucket. I'm just hoping there may be some tool out there, or that someone has an easier process for doing this.
thanks
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still use the previous file adapter (on your Heroku instance) for existing files and store anything new on the new adapter (S3 storage):
parse-server-migrating-adapter

Saved images not displayed properly on Heroku

I'm using Rails 3.2.6 with CarrierWave to upload images. When I upload an image it displays fine and its URL works well. But when I push the next git commit to Heroku:
git push staging master
all the images I had already uploaded no longer display, and their paths no longer work.
Why?
Can anyone tell me what the problem is?
Thanks
Whilst your uploads will work, the moment you push new code or your application is restarted, you will lose them.
Heroku uses an ephemeral file system: each dyno receives a separate copy (slug) of the originally deployed code, so uploads only exist on the dyno that handled the upload (https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem).
You need to use an external persistent data store like Amazon S3, Rackspace Files, etc. Fortunately, that's trivial with CarrierWave, as it supports S3 out of the box.
Did you set up CarrierWave with S3 (https://github.com/jnicklas/carrierwave#using-amazon-s3)?
Heroku's file system is ephemeral, so you can't rely on files written at runtime (https://devcenter.heroku.com/articles/s3).

How to migrate Heroku file storage to S3

I am an idiot, and very new to Heroku. I used the Heroku file system to store Paperclip-attached files for my models.
Have I lost these files? And can I upload them to S3 somehow and get better access?
It's a low-traffic site, but having it set up to store files locally on the server is causing problems, as it should.
You can assume you've lost the files: if the app has been restarted, scaled, or deployed since, they'll be gone.
You'll want to set it up to save files to S3 going forward.
