How do I create backups of my entire cluster? Can they only be imported back into CockroachDB or can they also be imported into other systems?
CockroachDB supports generating a snapshot of your entire cluster with the cockroach dump command. This creates human-readable output of SQL statements that can easily be imported into other databases, if so desired. Here's an example of dumping two tables from the same database:
cockroach dump db1 tbl1 tbl2 > db1_backup.sql
We're also working on a much more performant and efficient backup-and-restore feature for our upcoming 1.0 release. The files from the new backup functionality will only be restorable into CockroachDB, however.
the "more performant efficient backup-and-restore functionality" Alex mentioned has been available for awhile now. documentation is here:
https://www.cockroachlabs.com/docs/stable/backup.html
Cockroach Labs calls the open source version of the database "Cockroach Core", so open source users should note the caveat at the top of the page:
"Core users can only take full backups. To use the other backup features, you need an enterprise license."
Related
I am a Java person and not very familiar with Oracle's available features. Please help me.
The requirement is that we are looking to create some virtual (replica/mirror/view) database from the production database just for testing purposes. Once we are done executing all the automation test cases, we would delete the virtual database. Are there any such concepts in Oracle?
We are on Oracle 12c.
Many apps use the same DB (it's huge).
PS: We also use Docker for deployment, and AWS.
Use RMAN DUPLICATE to create the test database from production.
https://oracle-base.com/articles/11g/duplicate-database-using-rman-11gr2
You can duplicate from backups or from an active database.
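Here's a minimal sketch of an active-database duplicate, assuming hypothetical TNS aliases PROD and TESTDB and an auxiliary instance already started NOMOUNT (the linked article walks through the full setup):

rman TARGET sys@PROD AUXILIARY sys@TESTDB

DUPLICATE TARGET DATABASE TO testdb
  FROM ACTIVE DATABASE
  NOFILENAMECHECK;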
You can probably ask your database admin to export the tablespace to a new test machine which has the same Oracle version installed. If there are only very few tables, you can instead spool your tables out and use SQL*Loader to load them into a test database (you will need to manually create the structure of the tables in the test environment beforehand); see the sketch after the next paragraph.
In both cases, you might want to scrub out the sensitive information as per your requirements and standards.
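As a rough sketch of the SQL*Loader route (the table name, columns, and file names here are all hypothetical):

load_tbl1.ctl:
LOAD DATA
INFILE 'tbl1.csv'
INTO TABLE tbl1
FIELDS TERMINATED BY ','
(col1, col2)

sqlldr userid=testuser@TESTDB control=load_tbl1.ctl log=load_tbl1.log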
For example: let's say I have two databases, DB1 and DB2. My requirement is to refresh data from DB1 to DB2 every night. DB1 is the live database and DB2 is for non-business users doing data analysis.
My questions:
1) What tool should I use for my requirement? I need a solution that is fast, since the database copy has to be done every day.
2) Does AWS have any tool to automate the backup and restore of the data?
There are loads of ways to do this, and the answer comes down to what storage you're using, whether the databases are on the same server, and ultimately the size of the database.
RMAN is more of a backup/recovery tool, but it's definitely a runner for cloning. If you're not sure what RMAN does, I wouldn't even start to implement it, as it's very tricky if you aren't super comfortable with Oracle DBs.
My recommendation is to just use Oracle Data Pump: export the schemas you need to a dump file, then ship it over and import them into the other database, making sure to overwrite/drop the existing schemas.
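A sketch of that nightly refresh, assuming a schema named APPUSER and the default DATA_PUMP_DIR directory object on both hosts (all names here are placeholders):

expdp system@DB1 schemas=APPUSER directory=DATA_PUMP_DIR dumpfile=appuser.dmp reuse_dumpfiles=y

impdp system@DB2 schemas=APPUSER directory=DATA_PUMP_DIR dumpfile=appuser.dmp table_exists_action=replace

Note that table_exists_action=replace overwrites existing tables; if you want a completely clean slate, drop the target schema first.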
Other than doing a differential clone at the SAN level, this is probably the quickest and definitely the easiest way to get it done.
My colleague is running Oracle Database (11g) on AIX, and they would like to move this DB to RHEL. I already found Link. However, I would like to check whether someone has already migrated or used any other good tools.
You have several options. As pointed out before, Oracle Data Pump is the easiest approach. It will lift you from every version >= 10g upwards (or even to an older version when you use the VERSION= parameter).
The caveat: the size of the database, and your downtime requirements.
For larger databases, Transportable Tablespaces is the usual choice. It's more work, as you will have to rebuild metadata such as synonyms, views, PL/SQL, sequences, etc. And in your case you'll have to CONVERT the tablespaces, as you are coming from a Big Endian platform and going to a Little Endian one. DBMS_FILE_TRANSFER could assist you here, as it can restore and convert at the same time, whereas RMAN needs a two-phase operation with staging space for it.
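For the endian conversion step, a minimal RMAN sketch run on the AIX source (the tablespace name and staging path are hypothetical, and the tablespace must be made read-only first):

CONVERT TABLESPACE users TO PLATFORM 'Linux x86 64-bit' FORMAT '/stage/%U';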
You can speed up transportable tablespaces with RMAN incremental backups to avoid most of the copy/convert time. And you can ease it with Full Transportable Export/Import (minimum source: 11.2.0.3 - minimum destination: 12.1.0.1), where Data Pump does the manual work of transportable tablespaces.
And of course there are other techniques, such as Create-Table-As-Select or Insert-Append-Select via Database Links.
Just see the big slide deck "Upgrade / Migrate / Consolidate to 12.2" for customer examples - and the "Migrate >230Tb in <24 hours" decks on my page: https://mikedietrichde.com/slides/
Cheers,
Mike
Is there some reason you can't just use Oracle Data Pump?
Create the database on RHEL, making sure you use a compatible character set.
https://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
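A rough full-database sketch with Data Pump (the directory object and file names are placeholders; add the VERSION= parameter on export if the target is an older release):

expdp system full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_exp.log

impdp system full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_imp.log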
Does MonetDB support online schema changes? For example, adding/changing a column on the fly while the tables are loaded in memory. Many in-memory databases have to be restarted for schema changes to be reflected. So, I was wondering whether MonetDB takes care of this issue.
Yes. MonetDB supports SQL-99, which includes DDL statements (they are immediately reflected in the schema).
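For example, a standard DDL statement like the following takes effect immediately, without a restart (the table and column names are hypothetical):

ALTER TABLE sensors ADD COLUMN note VARCHAR(100);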
I am working for a company that has two Oracle databases; let's call them LIVE and TEST. An export is performed every night to take a snapshot of the database for each day. TEST is then dropped and recreated using existing table creation scripts, with the import finally putting the exported data from LIVE into the new TEST environment.
My questions are:
Is this really the best way to do this?
What better way is there?
Any URLs to demonstrate these ways would be great.
Instead of the old export/import, use Data Pump.
Check out Oracle GoldenGate.
Check out Oracle Streams.
If you are using Enterprise Edition, then you can look into transportable tablespaces as well, which have the advantage of exactly preserving the physical state of the data files, so performance testing is more realistic.
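A rough transportable-tablespace sketch, assuming a single hypothetical tablespace USERS_TS and the default DATA_PUMP_DIR directory object:

ALTER TABLESPACE users_ts READ ONLY;
expdp system directory=DATA_PUMP_DIR dumpfile=tts.dmp transport_tablespaces=users_ts

Then copy the data files (and tts.dmp) to the test host and plug them in (the data file path is a placeholder):

impdp system directory=DATA_PUMP_DIR dumpfile=tts.dmp transport_datafiles='/u02/oradata/test/users_ts01.dbf'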