I need to import Redis data from a private server to a Heroku application. The problem is that the server is not accessible from outside, so the only way to get the data into the Heroku app is via a data.rdb dump file.
Some Heroku addons, like "Heroku Redis", "Redis Enterprise Cloud" and others, offer a migration solution, but only by forking the data from another server/Redis instance, which has to be accessible from the outside.
TL;DR: Is there a way to import Redis data from an .rdb file on my machine into my Heroku app?
To anybody having the same issue: I previously used the "Heroku Redis" addon, which does not have an easy data import solution. At the time of writing, it only offers importing data from another Redis server (which I did not have).
How I solved this was to switch to another Redis addon, Redis Enterprise Cloud, which offers more data import options, one of them being import via Amazon S3. So I created an S3 bucket (for free), uploaded my Redis dump.rdb to that bucket, and imported it via the addon's dashboard. The whole import process is documented nicely in the Redis Enterprise Cloud docs.
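If it helps anyone scripting the upload step, getting the dump into S3 takes only a few lines of boto3. This is just a sketch: the bucket name is made up, and you still need to give the Redis Enterprise Cloud importer read access to the object as described in their docs.

import boto3

# Assumes AWS credentials are already configured (environment variables or ~/.aws).
s3 = boto3.client("s3")

s3.upload_file(
    Filename="dump.rdb",            # the RDB dump exported from the private server
    Bucket="my-redis-dump-bucket",  # hypothetical bucket name
    Key="dump.rdb",
)

Once the object is in the bucket, its S3 location (s3://my-redis-dump-bucket/dump.rdb in this example) is what you point the dashboard import at.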
I have deployed the Strapi headless CMS on the Heroku free tier and tried to use it with both MongoDB and Postgres databases. Whenever I restart the dyno, e.g. during deployment, all the data created thus far is lost. Why is it not persisted?
I tried rebuilding Strapi locally and I cannot reproduce the behaviour.
I am using free tier for hosting of Strapi as well as free tier of Heroku Postgres.
Most likely you created your project with --quickstart, which uses SQLite, not Postgres. Can you please check your config/environments/*/database.json files and ensure you have PostgreSQL set up?
All model configs are stored in files, meaning you will not be able to create, edit, or delete models, fields, or components while running on Heroku. All data (content) is saved to the database.
https://strapi.io/documentation/3.0.0-beta.x/guides/deployment.html#heroku
Most tutorials only show how to deploy a simple Flask hello-world app on Heroku. But I have a Flask app which contains URLs with both GET and POST requests, and they use the MySQLdb library to fetch data from a database.
How do I set up such an app on Heroku? Currently I have a MySQL database on my local machine which the code uses to fetch data. The Flask code contains many functions which are invoked by API calls. For example:
@app.route('/display_table', methods=['POST'])
def display_webstats():
    db = MySQLdb.connect("localhost", "root", "root", "db_name")
    cursor = db.cursor()
    cursor.execute("select * from table_name")
    ws = cursor.fetchall()
    return jsonify(ws), 200
How do I deploy such an app on Heroku?
Make sure your dependencies are listed in your requirements.txt or Pipfile and Pipfile.lock.
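For this particular app, a minimal requirements.txt might look something like the following (versions omitted; the MySQLdb library is published on PyPI as mysqlclient, and the last two entries only apply if you follow the Flask-SQLAlchemy / Flask-Migrate suggestions further down):

Flask
mysqlclient
Flask-SQLAlchemy
Flask-Migrate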
Select a MySQL addon and provision it, e.g.
heroku addons:create cleardb:ignite
Update your code to connect to the database using whatever environment variable the addon provides, e.g. CLEARDB_DATABASE_URL. You can use os.getenv() with a default argument to fall back to your local development database:
import os

database_url = os.getenv(
    'CLEARDB_DATABASE_URL',
    default='mysql://root:root@localhost/db_name',  # For local development
)
It's probably also a good idea to centralize your database connection logic so it's not done in every controller. Something like Flask-SQLAlchemy might be helpful to simplify connecting and querying your database. It also provides an ORM if you want one of those.
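As a rough sketch of what that centralization could look like with Flask-SQLAlchemy (the model and column names here are made up to mirror the table_name example above, not taken from your schema):

import os

from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)

database_url = os.getenv(
    'CLEARDB_DATABASE_URL',
    default='mysql+pymysql://root:root@localhost/db_name',  # local development fallback
)
# ClearDB gives a plain mysql:// URL; SQLAlchemy needs to know which driver to use,
# e.g. PyMySQL as assumed here.
if database_url.startswith('mysql://'):
    database_url = database_url.replace('mysql://', 'mysql+pymysql://', 1)

app.config['SQLALCHEMY_DATABASE_URI'] = database_url
db = SQLAlchemy(app)

class WebStat(db.Model):
    __tablename__ = 'table_name'       # hypothetical table
    id = db.Column(db.Integer, primary_key=True)
    value = db.Column(db.String(255))  # hypothetical column

@app.route('/display_table', methods=['POST'])
def display_webstats():
    rows = WebStat.query.all()
    return jsonify([{'id': r.id, 'value': r.value} for r in rows]), 200

Note that the ClearDB URL typically ends with a ?reconnect=true query parameter, which SQLAlchemy's MySQL dialect doesn't understand, so you may need to strip it before handing the URL over.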
Deploy.
Assuming you have them, run your migrations on Heroku via heroku run. If you're not using a migrations library I urge you to start. Flask-Migrate might be a good fit.
The alternative is manually creating and maintaining your schema across environments, and that's time-consuming, error-prone, and frustrating.
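Wiring Flask-Migrate onto the app and db objects sketched above is only a couple of lines (again a sketch, not a drop-in):

from flask_migrate import Migrate

migrate = Migrate(app, db)  # registers the `flask db` command group

You would then generate migrations locally with flask db init and flask db migrate, commit them, and apply them on Heroku with heroku run flask db upgrade.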
I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room and have set up an AWS S3 bucket to store my files in. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku and uploading each of those files to the S3 bucket. I'm just hoping that there may be some tool out there, or that someone has an easier process for doing this.
thanks
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still use the previous file adapter (on your Heroku instance) for existing files and only store anything new on the new adapter (S3 storage).
parse-server-migrating-adapter
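If you do end up writing the loop yourself, the Parse REST API is probably the simplest way to do it: query each class, download the old file, re-upload it (which will now go through the S3 adapter), and repoint the object at the new file. Below is a rough Python sketch; the server URL, keys, class name Photo, and file field image are placeholders for whatever your schema actually uses, pagination and error handling are omitted, and you should try it against a copy of your data first.

import requests

SERVER = 'https://your-app.herokuapp.com/parse'  # hypothetical mount URL
HEADERS = {
    'X-Parse-Application-Id': 'YOUR_APP_ID',
    'X-Parse-Master-Key': 'YOUR_MASTER_KEY',
}

# Fetch objects of a class that has a file field (first 100 only; add paging for more).
objects = requests.get(SERVER + '/classes/Photo', headers=HEADERS).json()['results']

for obj in objects:
    old_file = obj.get('image')
    if not old_file:
        continue
    # Download the file from the old adapter, then re-upload it so it lands on S3.
    data = requests.get(old_file['url']).content
    uploaded = requests.post(
        SERVER + '/files/' + old_file['name'],
        headers={**HEADERS, 'Content-Type': 'application/octet-stream'},
        data=data,
    ).json()
    # Point the object at the newly stored file.
    requests.put(
        SERVER + '/classes/Photo/' + obj['objectId'],
        headers=HEADERS,
        json={'image': {'__type': 'File', 'name': uploaded['name']}},
    )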
I am running a Rails 4 application on Bluemix and attaching files using the Paperclip gem. As we all know, Paperclip saves a reference to the file in the actual db and saves the physical file into a /public location.
I am submitting a file to this app, and it is getting saved here
/home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf
and then the file retrieval is working perfectly fine. Once I restart my app, I get:
Errno::ENOENT (No such file or directory @ rb_file_s_lstat - /home/vcap/app/public/files/submissions/files/140/original/Successful_Submission.pdf):
And this is because Bluemix is not persisting this information. How can I get hold of those files between app restarts?
Bluemix is built on top of Cloud Foundry, and it has an ephemeral filesystem, i.e., once your application stops, the platform reclaims that filesystem and creates a brand new one when you restart your application.
Writing to the local filesystem is not recommended for cloud applications and you may need to redesign your application to work with Bluemix. One solution is to save your files in your database and not only the reference.
You can find more details on this link.
Each application instance on Bluemix (which is based on Cloud Foundry) has ephemeral storage. This storage is only available for the lifetime of that particular instance. When you redeploy your app then you'll get a new app instance and any data on the previous app instance will be inaccessible.
There's a good explanation of why it's best to avoid writing to the local file system when designing an application for Bluemix / Cloud Foundry.
You may want to take a look at a gem like CarrierWave to store the files on Amazon S3 or another persistent store. There's also Paperclip which offers similar functionality.
I have an app on Heroku and need simple file storage for uploaded images. For this I used send_data with the attachment_fu plugin.
After that I wrote the files to the tmp/ directory and tried to display them in the browser, but they are not displayed.
How can I display these images in the browser?
What is an alternative solution for storing and retrieving images?
Thanks!
You cannot persist uploaded files on Heroku; the dyno filesystem is ephemeral, and anything written to it is lost when the dyno restarts.
You must use an alternative strategy. Here are a couple:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.)