Is there a method to import data from a single file into multiple Oracle tables while maintaining the referential integrity?
Yes.
Without a lot more detail, I'll just say that you should look at external tables to get the data from the file into the database, then select from the external table and use the INSERT ALL feature to insert into multiple tables from the single input.
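As a minimal sketch of that approach (the directory object, file name, column layout, and target tables below are all assumptions, not something from your environment):

-- External table over the flat file; data_dir is a pre-created DIRECTORY object.
CREATE TABLE staging_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  order_id      NUMBER,
  order_total   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('customers_orders.csv')
);

-- Multi-table insert: each staged row fans out to the parent and child tables.
-- Non-deferred constraints are checked at the end of the statement, so parent and
-- child can be loaded together; if the file repeats parent rows, load the parents
-- first with a separate INSERT ... SELECT DISTINCT instead.
INSERT ALL
  INTO customers (customer_id, customer_name)         VALUES (customer_id, customer_name)
  INTO orders    (order_id, customer_id, order_total) VALUES (order_id, customer_id, order_total)
SELECT customer_id, customer_name, order_id, order_total
  FROM staging_ext;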
Hope that helps.
There are a couple of alternatives (not an exhaustive list):
Walk the dependency graph of FOREIGN KEYs and make sure you insert data to "parents" before inserting it to "children".
Defer all the FOREIGN KEYs, so the order of insertion does not matter. This is OK if you can perform the whole import in a single transaction (a sketch of this follows the list).
Temporarily disable FOREIGN KEY constraints, import the data in any order, then re-enable them.
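A sketch of the second option, with placeholder table, column, and constraint names (note that an existing FK can't simply be altered to DEFERRABLE; it has to be dropped and re-created that way):

ALTER TABLE child_table DROP CONSTRAINT child_parent_fk;
ALTER TABLE child_table ADD CONSTRAINT child_parent_fk
  FOREIGN KEY (parent_id) REFERENCES parent_table (parent_id)
  DEFERRABLE INITIALLY IMMEDIATE;

-- inside the import transaction, insertion order no longer matters:
SET CONSTRAINTS ALL DEFERRED;
INSERT INTO child_table (child_id, parent_id) VALUES (10, 1);  -- parent not loaded yet
INSERT INTO parent_table (parent_id) VALUES (1);
COMMIT;  -- the deferred FKs are validated here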
Related
I have an Oracle table, and I export the data from my Oracle server and then import it into another Oracle server.
My question is: for every row in the table, will the rowid stay unchanged after importing into the other Oracle server?
I guess the answer is NO, but I have no idea how rowid is generated.
No, the row IDs will almost certainly change. Even within the same database, from the docs:
If you delete and reinsert a row with the Import and Export utilities, for example, then its rowid may change.
The row ID represents the location of the row within a block, within a data file, within a tablespace. (That documentation explains that more.) Even if the target database has the same tablespaces and data files, the import will load data into files and blocks as efficiently as it can, and will not make any attempt to preserve old row IDs - which it won't know anyway as they are not part of the exported data. Even if it could try, that would involve writing each row to a specific place on disk, which would slow things down quite a bit, and existing data in the target DB might already be using the same row ID.
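If you're curious what a given ROWID actually encodes, the DBMS_ROWID package can decode it; a small illustration (my_table is a placeholder name):

SELECT t.rowid                                AS row_id,
       DBMS_ROWID.ROWID_OBJECT(t.rowid)       AS object_id,
       DBMS_ROWID.ROWID_RELATIVE_FNO(t.rowid) AS file_no,
       DBMS_ROWID.ROWID_BLOCK_NUMBER(t.rowid) AS block_no,
       DBMS_ROWID.ROWID_ROW_NUMBER(t.rowid)   AS row_in_block
  FROM my_table t
 WHERE ROWNUM <= 5;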
ROWID is a pseudocolumn, not part of the actual row, and it would be meaningless to include it in the exported data.
Although you can use the ROWID pseudocolumn in the SELECT and WHERE clause of a query, these pseudocolumn values are not actually stored in the database.
It isn't even necessarily unique.
Also, you shouldn't really be using it directly, except possibly within a single query/statement (here is one use) or maybe procedure, as they can change even within an existing database if Oracle decides it needs to reorganize things. That's partly why the documentation also says:
You should not use ROWID as the primary key of a table.
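For what it's worth, the classic legitimate single-statement use alluded to above is removing duplicate rows; a sketch with placeholder table and column names:

-- keep one row per key value, delete the rest
DELETE FROM my_table t
 WHERE t.rowid NOT IN (SELECT MIN(s.rowid)
                         FROM my_table s
                        GROUP BY s.key_col);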
I was wondering what the approach would be to get rid of a lot of records from an Oracle database in order to create a lighter database for developers' laptops.
We aim to reduce the size of the exports from different production environments NOT by excluding entities, but by reducing the number of records in each table while maintaining referential integrity.
Is there a tool/script around?
I was also wondering if transforming all the FKs on a replica DB to "on delete cascade" and deleting a subset of records from the entities at the top of the relational hierarchy would do the job.
Any suggestion?
With Jailer you can export data to an SQL script which can traverse foreign key constraints to include all data needed to maintain referential integrity.
http://jailer.sourceforge.net
If you want to export/import a limited set of objects from/to a database, you can EXCLUDE the objects that you don't want to be part of your dump.
You can exclude any specific table from being exported/imported by specifying the object type and object name.
EXCLUDE=TABLE:"='<TABLE_NAME>'"
UPDATE:
AFAIK, Oracle doesn't provide that kind of flexibility to export a subset of the data, but it does have an option to export partitioned data via the TABLES parameter:
TABLES=[schema_name.]table_name[:partition_name] [, ...]
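For illustration only, a couple of hypothetical Data Pump invocations using those parameters (schema, table, partition, directory, and dump file names are placeholders; the EXCLUDE quoting usually needs escaping on the shell, or a parameter file):

expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=no_audit.dmp EXCLUDE=TABLE:\"='AUDIT_LOG'\"
expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=q1.dmp TABLES=scott.sales:sales_q1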
I have one table in a database, and I want to export it and then import it into a new table in a different database.
Should I set up the table with the same fields in database two, or is there a way to create an empty table so the import will work?
If you have a dblink established, a quick way to copy a table without intermediate files would be to execute this from the target database (the one where you want the new table to be copied):
create table my_new_table as
select *
from my_original_table@my_original_database
This presupposes the dblink, of course, and also that there is sufficient redo space to allow that much data to be copied in one fell swoop.
If not, you could also build the table this way and then do a bunch of insert into transactions to move the data in chunks.
If you only want the structure (your question sort of implied that, but I wasn't sure), you can always add a where 1 = 3 to copy only the structure.
This won't copy constraints or indexes, but I'm not sure if that matters for what you seek.
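For example (same placeholder names as above, and assuming a numeric id column to chunk on), the structure-only copy followed by a chunked load might look like:

create table my_new_table as
select *
from my_original_table@my_original_database
where 1 = 3;

-- repeat per key range, committing between batches
insert into my_new_table
select *
from my_original_table@my_original_database
where id between 1 and 100000;
commit;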
I have a dump of a huge Oracle database, so it is impossible to import it all. I want to import a specific table called X. The problem is that X has foreign keys. If I import just X, I will get the following error:
imp user/pass@dbName tables=X rows=Y ignore=Y
ORA-02291: integrity constraint violated - parent key not found
I already have the whole DB locally (but without data); I want to import all the tables that are associated with X. How can I achieve that? I have plsql installed. I also need to know the order of these tables, so I know which to import first.
You can disable all DB constraints before the import, and re-enable them afterwards. See:
disable-all-table-constraints-in-oracle or
oracle_disable_constraints
There are a few questions, so I will try to answer them one by one.
ORA-02291: integrity constraint violated - parent key not found
This is a no-brainer to guess, because as you know you don't have the parent records for table X. By the way, you may also want to use the flag CONSTRAINTS=N, because you already have the DB, as you said.
"I want to import all tables that are associated to X. How can I achieve that?"
Well, there is no option but to find all the dependencies manually (or look them up in data dictionary views such as user_cons_columns and user_constraints) and import those tables as well. Think about what happens if you don't do that: you will break your data integrity. If you still want the data in table X without its dependencies, then disable the constraints and then import. But you won't be able to enable your constraints again, and I don't know what you want to do with broken data.
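As a sketch of that dictionary lookup (run as the owning schema, with 'X' as your table name), this lists the parent tables referenced by the foreign keys on X:

select c.constraint_name,
       p.table_name as parent_table
  from user_constraints c
  join user_constraints p
    on p.constraint_name = c.r_constraint_name
 where c.table_name      = 'X'
   and c.constraint_type = 'R';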
"I also need to know the order of these tables to know which to import at first."
Disable the constraints before import and then enable them after import. You don't have to worry about order in that case.
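One way (a sketch) to generate those disable statements for all foreign keys in the current schema; after the import, run the same query with 'enable' in place of 'disable':

select 'alter table ' || table_name ||
       ' disable constraint ' || constraint_name || ';'
  from user_constraints
 where constraint_type = 'R';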
I have two databases with identical table layouts. There are a dozen or so tables of interest. There are a number of FKs between them.
I have been asked to write a stored procedure to copy data from database A to database B based on the PK of the parent table at the top of the hierarchy. I may receive just one value, or a list of values. I'm supposed to select all records from database A that match the value(s) and insert/update them into database B. This includes all the records in the child tables too.
My question is: what's the best (most efficient / best-practice) way to do this?
Should I write a dozen select from... insert into... statements?
Should I join the tables together and try to insert into all the tables at the same time?
Thanks!
Additional info:
The record should be inserted if it is not already there. (based on the PK of the respective table). Otherwise it should be updated.
Obviously I need to traverse down to all the child tables, so there would only be one record to copy in the parent table, but the child table might have 10, and the child's child table might have 500. I would of course need to update the record if it already exists, and insert it if it does not, for the child tables too...
UPDATE:
I think it would make the solution simpler if I just deleted all records related to the top-level key, and then inserted all the new records rather than trying to do updates.
So I guess the question is: is it best to just do a dozen of these:
delete from ... where ... in ...
select from ... where ... in ...
insert into...
or is it better to do some kind of fancy joins to do all the inserts in one SQL statement?
I would do this by disabling all the foreign key constraints, then doing a set of MERGE statements to deal with the updates and inserts, then enabling all the constraints.
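A hedged sketch of one such MERGE, assuming a database link named db_a back to database A, a single parent-key bind variable, and placeholder table/column names:

merge into child_table tgt
using (select child_id, parent_id, amount
         from child_table@db_a
        where parent_id = :p_parent_id) src
   on (tgt.child_id = src.child_id)
 when matched then
   update set tgt.parent_id = src.parent_id,
              tgt.amount    = src.amount
 when not matched then
   insert (child_id, parent_id, amount)
   values (src.child_id, src.parent_id, src.amount);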
Think about logging. How much redo do you want to generate?
You might find that it's quicker and better to truncate all the target tables and then do inserts of everything with NOLOGGING. It could be simpler than the merges.
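That approach might look like this (placeholder names and the same assumed db link; NOLOGGING plus a direct-path insert keeps redo to a minimum, at the cost of recoverability until the next backup):

alter table child_table nologging;
truncate table child_table;

insert /*+ append */ into child_table
select * from child_table@db_a;
commit;

alter table child_table logging;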
One major alternative would be to drop all the target tables and use export and import. It might be a lot faster.
A second alternative would be to use materialized views, particularly if you don't need to do updates on the target tables. That way, Oracle does all the heavy lifting for you. You can force integrity by choosing refresh groups carefully.
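If you go the materialized view route, a rough sketch (the db link and all names are assumptions, and fast refresh requires a materialized view log on the source table):

create materialized view child_table_mv
  refresh fast on demand with primary key
  as select * from child_table@db_a;

-- a refresh group refreshes its members to a single consistent point in time
begin
  dbms_refresh.make(
    name      => 'copy_group',
    list      => 'parent_table_mv, child_table_mv',  -- assumes parent_table_mv built the same way
    next_date => sysdate,
    interval  => 'sysdate + 1');
end;
/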
There are several ways to deal with this business problem. A PL/SQL program may not be the best.