Heroku PlayFramework - create thumbnail

I already have a PlayFramework app running, but I am in the process of migrating it to Heroku. Because on Heroku I cannot use the local filesystem like I did in my app, I am forced to use Amazon S3, but I do not know how to rewrite the thumbnail creation. For that I am using:
https://github.com/coobird/thumbnailator
Thumbnails.of(picture.getFile()).size(300,300).toFile(new File("public/images/data", thumb));
The problem is that I cannot do this on Heroku, because the file won't be saved.
How do I create thumbnails then, if I do not want to use yet another service that generates the thumbnails for me and somehow saves them to S3?
Honestly, if I had known how many different services I would need for a simple page in Java, I would have stayed with PHP forever...

On Heroku (as on many PaaS platforms) there is no persistent filesystem. However, you do have access to a temp directory, so you could save the thumbnail into a temp file:
File temp = File.createTempFile("thumb-", ".png");
Thumbnails.of(picture.getFile()).size(300, 300).toFile(temp);
Then take that file and upload it to S3.
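For the upload step, a minimal sketch using the AWS SDK for Java (v1) might look like the following; the bucket name, region and key prefix are placeholders, and the credentials are assumed to come from environment variables (Heroku config vars):

import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class ThumbnailUploader {

    // Uploads a local (temp) file to S3. "my-bucket", the region and the
    // "thumbnails/" key prefix are placeholders - adjust them to your setup.
    public static void uploadThumbnail(File temp, String thumbName) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion("eu-west-1") // your bucket's region
                .build(); // credentials are read from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY

        s3.putObject("my-bucket", "thumbnails/" + thumbName, temp);
    }
}

Call something like uploadThumbnail(temp, thumb) right after the Thumbnailator call above, and delete the temp file once the upload succeeds.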
If you strictly insist on not using S3 for storing binary files, then you could Base64-encode the file content and save it into the DB (see some pros/cons of such an approach here).
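A minimal sketch of that Base64 step in plain Java (assuming the encoded string is persisted in a text/CLOB column and decoded again when the image is served):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.Base64;

public class ThumbnailCodec {

    // Read the temp thumbnail and return a Base64 string suitable for a text column.
    public static String encode(File temp) throws IOException {
        byte[] bytes = Files.readAllBytes(temp.toPath());
        return Base64.getEncoder().encodeToString(bytes);
    }

    // Decode the stored string back into raw image bytes when serving it.
    public static byte[] decode(String stored) {
        return Base64.getDecoder().decode(stored);
    }
}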

Related

upload file to google bucket from remote url in ruby

We have found various solutions for uploading a file to a Google Cloud bucket from the local system. However, I am wondering whether there is a way we can upload a file to the bucket using a public URL or link.
https://googleapis.dev/ruby/google-cloud-storage/latest/index.html
I want to upload a file from a remote URL to a GCS bucket via Ruby code. Any suggestion here would be really appreciated.
Your code sits between the remote URL and the Google Cloud Storage (GCS) Bucket.
You have two alternatives:
(As you describe) Download the file behind the remote URL to a filesystem accessible to your code and then upload it to GCS;
Stream the file from the remote location into memory (you'll need to write this) and then, using the GCS client library, stream it into a GCS object.
You tagged the question with ruby-on-rails-3.
Old Rails versions use uploaders like CarrierWave, and it's possible to use it to upload files to GCS.
With this gem you can upload not only local files but also files from a remote URL; just use the special remote_*_url attribute.

Parse-Server: Transferring files hosted on Heroku to AWS S3 bucket

I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room, and I have set up an AWS S3 bucket to store my files. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku, then uploading those files to the S3 bucket. I am just hoping that there may be some tool out there, or that someone has an easier process for doing this.
Thanks
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still utilize the previous file adapter (on your Heroku instance) and then store anything new on the new adapter (S3 storage):
parse-server-migrating-adapter

Saved images not displayed properly on Heroku

I'm using Rails 3.2.6 and CarrierWave to upload images. When I upload an image it displays fine and its image URL works as well. But when I push the next git commit to Heroku:
git push staging master
then all the images that I had already uploaded are no longer displayed and their image paths no longer work.
Why? Can anyone tell me what the problem is?
Thanks
Whilst your uploads will work, the moment you push new code or your application is restarted you will lose any uploads.
Heroku uses an ephemeral filesystem: each dyno receives a separate copy (slug) of the originally deployed code, so uploads would only exist on the dyno that handled the upload (https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem).
You need to use an external persistent data store like Amazon S3, Rackspace Files, etc. Fortunately it's trivial to update CarrierWave to use one, as it supports this out of the box.
Did you set up CarrierWave with S3 (https://github.com/jnicklas/carrierwave#using-amazon-s3)?
Heroku's filesystem is ephemeral, so uploads don't persist (https://devcenter.heroku.com/articles/s3).

How to migrate Heroku file storage to S3

I am an idiot, and very new to Heroku. I used the Heroku filesystem to store Paperclip-attached files for my models.
Have I lost these files? And can I upload them to S3 somehow and have better access?
It's a low-traffic site, but storing files locally on the server is causing problems, as it should.
You can assume you've lost the files: if the app has been restarted, scaled, or deployed to since, they'll be gone.
You'll want to get it set up to save the files to S3 in the future.

How to upload images on heroku server using attachment_fu plugin

I have an app on Heroku and I need simple file storage for uploaded images. For this I used send_data with the attachment_fu plugin.
After that I used the tmp/ directory to write the file and want to display it in the browser, but these files are not displayed in the browser.
How can I display these images in the browser?
What is the alternative solution for storing and retrieving images?
Thanks!
You cannot store uploaded files on Heroku.
You must use an alternative strategy. A couple of alternative strategies:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.)
