Avoid making the backup DB public when importing into Heroku?

According to the Heroku docs, a database backup should be placed in a publicly accessible location on S3 so it can be imported.
In order for PG Backups to access and import your dump file you will
need to upload it somewhere with an HTTP-accessible URL. We recommend
using Amazon S3. Wherever you place the file, ensure that the file is
publicly accessible. This may require changing the permissions on the
uploaded file. For security, we recommend obscuring the filename and
also removing the file once the import has completed.
Source: https://devcenter.heroku.com/articles/heroku-postgres-import-export
This seems unreasonable from a security standpoint, so I'm wondering if there is a more secure way to import a db.
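The docs' own mitigation is obscurity plus deletion; a stronger option is a presigned URL, which grants time-limited access to a private S3 object. The AWS CLI can produce one (`aws s3 presign s3://my-bucket/db.dump --expires-in 3600`), and `heroku pg:backups:restore '<url>' DATABASE_URL` accepts any HTTP-accessible URL, presigned ones included. To show what a presigned URL actually is, here is a minimal SigV4 query-string signing sketch using only the Python standard library; the bucket, key, and credential values are hypothetical, and in real code you would use boto3's `generate_presigned_url` instead:

```python
import datetime, hashlib, hmac, urllib.parse

def presign_s3_get(bucket, key, region, access_key, secret_key, expires=3600):
    """Build a time-limited SigV4 presigned GET URL for a private S3 object."""
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime('%Y%m%dT%H%M%SZ')
    datestamp = now.strftime('%Y%m%d')
    host = f'{bucket}.s3.{region}.amazonaws.com'
    scope = f'{datestamp}/{region}/s3/aws4_request'
    params = {
        'X-Amz-Algorithm': 'AWS4-HMAC-SHA256',
        'X-Amz-Credential': f'{access_key}/{scope}',
        'X-Amz-Date': amz_date,
        'X-Amz-Expires': str(expires),
        'X-Amz-SignedHeaders': 'host',
    }
    # Query parameters must be sorted and fully URL-encoded.
    query = '&'.join(
        f'{urllib.parse.quote(k, safe="")}={urllib.parse.quote(v, safe="")}'
        for k, v in sorted(params.items()))
    canonical_request = '\n'.join([
        'GET',
        '/' + urllib.parse.quote(key, safe='/'),
        query,
        f'host:{host}\n',   # canonical headers block (host only)
        'host',             # signed headers
        'UNSIGNED-PAYLOAD', # payload hash is not signed for presigned GETs
    ])
    string_to_sign = '\n'.join([
        'AWS4-HMAC-SHA256',
        amz_date,
        scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key: date -> region -> service -> "aws4_request".
    def sign(k, msg):
        return hmac.new(k, msg.encode(), hashlib.sha256).digest()
    k = sign(('AWS4' + secret_key).encode(), datestamp)
    for part in (region, 's3', 'aws4_request'):
        k = sign(k, part)
    signature = hmac.new(k, string_to_sign.encode(), hashlib.sha256).hexdigest()
    return (f'https://{host}/{urllib.parse.quote(key, safe="/")}'
            f'?{query}&X-Amz-Signature={signature}')
```

The resulting URL is valid for `expires` seconds and then stops working, so the dump never has to be world-readable.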

Related

How to import Redis rdb file to Heroku

I need to import Redis data from a private server to a Heroku Application. The problem is that the server is not accessible from outside, and the only way to import the data to Heroku app is via data.rdb.
Some Heroku addons, like "Heroku Redis", "Redis Enterprise Cloud" and others offer migration solution, but only by forking the data from another server/Redis, which has to be accessible from the outside.
TL;DR: Is there a way to import Redis data from an rdb file on my machine to my Heroku app?
To anybody having the same issue: I previously used "Heroku Redis" addon, which does not have an easy data import solution. At the time of writing this, they offer only importing data from another Redis Server (which I did not have).
How I solved this was to use another Redis addon, Redis Enterprise Cloud, which offers more data import options, one of them being import via Amazon S3. So I created a free S3 bucket, uploaded my Redis dump.rdb to it, and imported it via the addon's dashboard. The whole import process is documented nicely in the Redis Enterprise Cloud docs.

Heroku PlayFramework - create thumbnail

I already have a Play Framework app running, but I am in the process of migrating it to Heroku. Because I cannot use the local filesystem on Heroku like I did before, I am forced to use Amazon S3, but I do not know how to rewrite the thumbnail creation. For that I am using:
https://github.com/coobird/thumbnailator
Thumbnails.of(picture.getFile()).size(300,300).toFile(new File("public/images/data", thumb));
The problem is that I cannot do this on Heroku, because the file won't be persisted.
How do I create thumbnails then? If I do not want to use another service which will generate thumbnails for me and save them to s3 somehow...
Honestly, if I had known how many different services I would need for a simple page with Java, I would have stayed with PHP forever...
In Heroku (as in many PaaS environments) there's no persistent filesystem. However, you do have access to a temp directory.
So you could save it to a temp file first:
File temp = File.createTempFile("thumb-", ".png");
Thumbnails.of(picture.getFile()).size(300, 300).toFile(temp);
Then take that file and upload it to S3.
If you strictly insist on not using S3 for storing binary files, then you could base64 the file content and save it into the DB. (see some pros/cons for such approach here)
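The base64-into-the-database fallback can be sketched in a few lines. The example below is Python rather than Java purely for brevity; the thumbnails table is hypothetical, and an in-memory SQLite database stands in for your real one:

```python
import base64, sqlite3

# In-memory DB stands in for the application's real database.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE thumbnails (name TEXT PRIMARY KEY, data_b64 TEXT)')

def save_thumbnail(name: str, raw: bytes) -> None:
    """Encode binary image data as base64 text and store it in the DB."""
    conn.execute('INSERT INTO thumbnails VALUES (?, ?)',
                 (name, base64.b64encode(raw).decode('ascii')))

def load_thumbnail(name: str) -> bytes:
    """Fetch the text column and decode it back to the original bytes."""
    (b64,) = conn.execute('SELECT data_b64 FROM thumbnails WHERE name = ?',
                          (name,)).fetchone()
    return base64.b64decode(b64)
```

Keep in mind that base64 inflates storage by roughly 33% and turns your database into a binary store, which is exactly the trade-off the pros/cons discussion covers.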

Heroku: Can I commit remotely

We have a CMS on heroku, some files were generated by the CMS, how can I pull those changes down? Can I commit the changes remotely and pull them down? Is there an FTP option of some kind?
See: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
It's not designed for persistent file generation and usage.
In practice, it works like this: a user puts some code into a repository. That code is dynamically pulled onto temporary Amazon EC2 instances and executed. The code can be moved from virtual machine to virtual machine, node to node, without disruption, across data centers. There is no real "place" to get the products of your code from the environment, because anything generated by the checked-out code can (and will) be destroyed as your deploy skips around between the temporary machines.
That being said, there are some workarounds:
If your deployed code includes something like a file browser, you can grab the (entirely temporary) generated files through it and commit them back to your persistent code trunk.
Another option is using something like S3 for your persistent storage, with your application reading from, and writing to, a data storage service, knowing that while Heroku will rewrite and destroy your local data on a frequent basis, the external service will maintain the files.
Similarly, you can change your application to use Heroku's Postgres for persistent data storage, or use Amazon's RDS, etc.
Alternately, you can edit your application in such a way as to ensure that any files generated by it will be regenerated every time the code is refreshed, redeployed, and moved around.
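The "external storage service" workaround above is easiest when the app codes against a small storage interface, so the backend can be swapped without touching call sites. Everything in this sketch is illustrative: the class names are made up, and a real S3 backend (only hinted at in a comment) would use a library like boto3:

```python
import abc, os, tempfile

class Storage(abc.ABC):
    """Minimal storage interface: swap backends without changing app code."""
    @abc.abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abc.abstractmethod
    def get(self, key: str) -> bytes: ...

class TempDirStorage(Storage):
    """Dev/test backend using the temp dir; contents do NOT survive restarts."""
    def __init__(self):
        self.root = tempfile.mkdtemp()
    def put(self, key: str, data: bytes) -> None:
        with open(os.path.join(self.root, key), 'wb') as f:
            f.write(data)
    def get(self, key: str) -> bytes:
        with open(os.path.join(self.root, key), 'rb') as f:
            return f.read()

# A production S3Storage would implement the same two methods with an S3
# client (e.g. put_object / get_object), giving durable storage that
# outlives any individual dyno.
```

The app only ever calls `put`/`get`, so moving from a dyno-local directory to S3 becomes a configuration change rather than a rewrite.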

Access to filesystem on AppHarbor

I want to try AppHarbor, but I have an application which stores uploaded files in certain place on a filesystem. Is it compatible with AppHarbor? Can I store files in the file system and access them later?
(what kind of path can I expect, like c:\blabla something or what?)
Thank you.
You can store files on the local filesystem, but the application directory is wiped on each new deployment so it's not recommended to rely on for file storage.
Instead we recommend that you use a cloud storage service such as Amazon S3, Google Cloud Storage or similar. There are .NET libraries for both services.
We recently wrote a blog post about uploading files directly to S3 and GCS from the browser that you might want to read.
If you are using a background worker, you need to 'Enable File System Write Access' in the settings of your application.
Then, you are permitted access to write to: Path.GetTempPath()
Sourced from this support question: http://support.appharbor.com/discussions/problems/5868-create-directory-in-background-worker

how to migrate heroku file storage to S3

I am an idiot, and very new to Heroku. I used the Heroku filesystem to store Paperclip-attached files for my models.
Have I lost these files? And can I offload them to S3 somehow and get more reliable access?
It's a low-traffic site, but storing files locally on the server is causing problems, as it should.
You can assume you've lost the files: if the app has been restarted, scaled, or deployed to since, they'll be gone.
You'll want to set it up to save files to S3 in the future.
