How do I migrate data between Heroku databases?

I have a 'staging' database on Heroku and would like to migrate it into my 'production' database. How can I do this? I looked into their Taps app, but it wasn't clear how it works.

Two options:
PGBackups - https://devcenter.heroku.com/articles/pgbackups - use it against your staging database to back it up, then restore the backup to your production database. Look at the Transfers subheading on that page.
Taps, via heroku db:pull and heroku db:push - use this to pull your staging database to your local machine (it doesn't matter what DB you're using locally) and then push it to your production application.
If you're dealing with a large dataset then option 1 is the better one to use. Option 2 also lets you push only specific tables if you use the --tables <tablenames> argument, which is useful on some occasions.
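For example, a minimal sketch of option 1 with PGBackups (app and table names below are placeholders, not taken from the question):
# capture a backup of the staging database
heroku pgbackups:capture --app my-staging-app
# print the public URL of that backup
heroku pgbackups:url --app my-staging-app
# restore it into the production app's database (DATABASE, or the HEROKU_POSTGRESQL_COLOR attachment name)
heroku pgbackups:restore DATABASE 'backup-url-from-previous-step' --app my-production-app
And option 2 with Taps, optionally limited to specific tables:
heroku db:pull --app my-staging-app
heroku db:push --app my-production-app --tables products,categories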

Related

Why do I lose the contents of my database after a Heroku dyno restart?

Anytime my app goes to sleep and comes back on, I lose data in my database.
And I'm not storing any media, it's just form data (text)... I built the app on Strapi and I've followed all their guidelines, but it keeps happening. I'd be happy if anyone could help.
Local data (files, DB) is cleared after a dyno restart because the Heroku filesystem is ephemeral. A dyno is restarted (at least) every 24 hours.
In your case Strapi uses SQLite, where data is saved in a local file.
Strapi suggests configuring Postgres on Heroku; alternatively, you can use an external DB storage service.
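A minimal sketch of provisioning Heroku Postgres for the app (the plan and app names are only examples):
# provision a Postgres database; Heroku sets DATABASE_URL automatically
heroku addons:create heroku-postgresql:hobby-dev --app my-strapi-app
# check the connection string that Strapi's database config should read
heroku config:get DATABASE_URL --app my-strapi-app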
First of all:
As you create content types with Strapi, it generates the code (= new files) for the corresponding controllers/routes/services.
Heroku does not persist data after a restart.
After a restart, Strapi checks which content types exist in the code and deletes the tables of non-existent types from the database.
Therefore, on Heroku you have to set up all your content types locally and connect to an external DB (e.g. Heroku Postgres), never Strapi's default file-based DB.
Then push the generated files and finally deploy.
Thus, on Heroku you should always run in production mode. This way the option to alter content types is completely blocked, and you will not run into the issue of data loss after a restart.
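A minimal sketch of forcing production mode on Heroku (the app name is a placeholder):
# Strapi blocks content-type editing when NODE_ENV is production
heroku config:set NODE_ENV=production --app my-strapi-app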

A couple of heroku postgres questions (just started, am lost)

I have provisioned Postgres on my Heroku app and also installed Postgres locally to maintain parity with the online database (as the documentation recommends), but I'm not understanding how this will work. Am I supposed to be accessing a local copy of a database when running on my own computer (while building and before deploying) and then using Heroku's separate Postgres database once it is deployed?
In other words, will my local app (during development) and my Heroku app (deployed and live) be using the same online Postgres database?
Thanks.
Am I supposed to be accessing a local copy of a database when running on my own computer (while building and before deploying) and then using heroku's separate postgres database once it is deployed?
Yes, that's exactly it. Without seeing what bit of documentation you're referencing it's hard to say what they mean but perhaps there's another way to explain it.
In your local development environment, you may find that you need to test database schema changes (this is just one example; there are many). If you only had the one Heroku Postgres database, you'd be forced to test these changes in production, which might result in poor usability for your users, and that doesn't even account for the possibility of making a mistake and accidentally destroying your production data. There are a number of other shortcomings and challenges with this single-database configuration.
For these reasons and more, it's best to keep your production data completely separated from your development/staging/test environment by creating a local/staging database. You might reasonably ask, "What about the data? I need data to test!". There are many ways to put together your test database and which you choose will likely depend on your needs. A shortlist of possibilities:
Use a seed file to generate mock data in your db
Use a model factory (usually runs in conjunction with your testing framework)
Take a dump of your production database, anonymize and redact sensitive information, and use that for local testing.
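For the last option, a minimal sketch using the PG Backups add-on (app and database names here are just examples):
# capture and download a backup of the production database
heroku pgbackups:capture --app my-production-app
curl -o latest.dump "$(heroku pgbackups:url --app my-production-app)"
# load it into a local Postgres database used for development
createdb myapp_development
pg_restore --verbose --clean --no-acl --no-owner -d myapp_development latest.dump
Remember to anonymize or redact any sensitive columns once the data is loaded locally.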

Heroku db:push when my app contains two databases

I have a Rails 3.2 application hosted on Heroku. The application contains two databases (one for my models, the second a kind of dictionary with static data).
I need to push the second database (the dictionary) to Heroku, but when I try db:push Heroku assumes that I'm pushing the first database (with the Rails models).
The question is: how can I specify that I want to push my local dictionary.sqlite database to the dictionary.pg database on Heroku?
You could use the Heroku pg:transfer plugin, which will let you set the target destination by its URL.
https://github.com/ddollar/heroku-pg-transfer
Alternatively, use the psql client locally but restore to the Heroku Postgres instance.
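A minimal sketch of that second approach, assuming the dictionary data has already been exported as a Postgres-format dump (app, attachment, and file names are placeholders; recent pg_restore builds accept a connection URL in place of a database name):
# list the databases attached to the app and their config vars
heroku pg:info --app my-app
# restore the dump straight into the target database via its connection URL
pg_restore --verbose --clean --no-acl --no-owner -d "$(heroku config:get HEROKU_POSTGRESQL_ROSE_URL --app my-app)" dictionary.dump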
Don't use db:push/pull; those methods are deprecated. Use pgbackups:capture/restore for things like this. It accepts the HEROKU_POSTGRESQL_COLOR as part of the command:
$ heroku pgbackups:restore HEROKU_POSTGRESQL_COLOR 'https://example.com/data.dump' --app app-name
See Importing and Exporting Heroku Postgres Databases with PG Backups for a more detailed explanation.
Also, heroku-pg-transfer has been integrated into pg-extras; check that out here: https://github.com/heroku/heroku-pg-extras

Upgrading Postgres on Heroku

What is the recommended way to upgrade a Heroku Postgres production database to 9.2 with minimal downtime? Is it possible to use a follower, or should we take the pgbackups/snapshots route?
Until logical followers arrive in 9.4, you'll have to dump and restore (for the reasons Craig describes). You can simplify this with pgbackups:transfer. The direct transfer is faster than a dump and restore, but know that you won't have a snapshot to keep.
The script below is basically Heroku's Using PG Backups to Upgrade Heroku Postgres Databases, with modifications for pgbackups:transfer. (If you have multiple instances, say a staging server, add "-a" or "--remote" to each heroku line to specify which app you're targeting.)
# get the heroku-pg-extras plugin (provides pgbackups:transfer)
heroku plugins:install git://github.com/heroku/heroku-pg-extras.git
# provision new db
heroku addons:add heroku-postgresql:crane --version=9.2
# wait for it to come online, make note of new color
heroku pg:wait
# prevent new data from arriving during dump
heroku ps:scale worker=0 web=0
heroku maintenance:on
# copy over the DB. could take a while.
heroku pgbackups:transfer OLDCOLOR NEWCOLOR
# promote new database as default for DATABASE_URL
heroku pg:promote NEWCOLOR
# start everything back up and test
heroku ps:scale worker=N web=N
heroku maintenance:off
heroku open
# remove old database
heroku addons:remove HEROKU_POSTGRESQL_OLDCOLOR
Note that if you compare your data size between them, the new one may be much smaller because of efficiencies in 9.2. (My 9.2 was about 70% of the 9.1.)
Heroku followers are, AFAIK, just PostgreSQL streaming replica servers. This means you can't use them across versions; you must have binary-compatible databases.
The same techniques apply as for ordinary PostgreSQL, except that you may not be able to use pg_upgrade on Heroku. It requires shell (ssh, etc.) access as the postgres user on the system that hosts the database, so I doubt it's possible on Heroku unless they've provided a tool to run pg_upgrade for you. I can't find much information on this.
You will probably have to look at using Slony-I, Bucardo, or another trigger-based replication solution to do the upgrade unless you can find a way to run pg_upgrade on a Heroku database instance. The general idea is that you set up a new 9.2 instance, use Slony to clone the data from the 9.1 instance into it, then once they're fully in sync you stop the 9.1 instance, remove the Slony triggers, and switch clients over to the 9.2 instance.
Search for more information on "postgresql low downtime upgrade slony" etc., and see how you go.

Heroku: Migration issues when pulling production database to testing and running rake db:migrate

I have 3 instances of my Rails app on Heroku (test, stage, and production). When I want to test an issue that is happening with real users' data, I would like to heroku db:pull --app production and then heroku db:push --app test. The problem is that at this point heroku rake db:migrate --app test throws an error, because the columns the migration is trying to create have already been created.
My understanding is that heroku db:push pushes data into an existing database schema rather than literally pushing the entire database (schema included). This means the schema we are pushing to may be more advanced than the schema_migrations table we are pushing, since that table will be missing records for migrations that have not run on the database we pulled from but have obviously run on the database we are pushing to.
My first question is: am I correct in my understanding of how this works? My second question is: how do I fix this so that I can pull production data, load it into test, and run migrations without getting this error? Ideally, I would want to copy the production database into test wholesale and then migrate it fully, since then I wouldn't have to worry about the existing schema on test. Is there a way to do this?
If not, is there a way to fake that migrations have already run by populating the new schema_migrations table with records for each migration that has already run on my test database?
No, db:push pushes the local schema and data. You can push your local DB into an empty DB on Heroku; this is how I put sites live - when you run it you see it creating the schema and then pushing the data in.
I work like this: a test environment on Heroku running the same code as live, i.e. a branch of master (i.e. what's live, pushed to test). Pull the DB from live. Fix the issue on my local system. Push to test and run migrations. Test the release against the DB on Heroku. When I'm happy, merge the test code into master, then deploy and run migrations. Rinse and repeat for future bugs. The production DB should never have a more advanced schema version than test. You can always check this by looking in the schema_migrations table - this is how Rails knows which migrations have run so far, so you can compare it against the files in db/migrate.
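A minimal sketch of that check (the app name is a placeholder):
# open a psql session against the test app's database
heroku pg:psql --app my-test-app
# then, inside psql:
#   SELECT version FROM schema_migrations ORDER BY version;
# and compare the versions against the migration files in the repo
ls db/migrate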
