I have a dump of a huge Oracle database, so it is impossible to import it all. I want to import a specific table called X. The problem is that X has foreign keys. If I import just X, I get the following error:
imp user/pass@dbName tables=X rows=Y ignore=Y
ORA-02291: integrity constraint violated - parent key not found
I already have the whole DB locally (but without data), and I want to import all tables that are associated with X. How can I achieve that? I have plsql installed. I also need to know the order of these tables, so I know which to import first.
You can disable all DB constraints before the import, and re-enable them afterwards. See:
disable-all-table-constraints-in-oracle or
oracle_disable_constraints
There are a few questions here, so I will try to answer them one by one.
ORA-02291: integrity constraint violated - parent key not found
This one is easy to explain: as the message says, you don't have the parent records for table X. By the way, you may also want to use the flag CONSTRAINTS=N, because, as you said, you already have the schema.
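For example, building on the command from the question:
imp user/pass@dbName tables=X rows=Y ignore=Y constraints=N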
"I want to import all tables that are associated to X. How can I achieve that?"
Well, there is no option but to find all the dependencies manually (or use the data dictionary views user_cons_columns, user_constraints, etc. to look them up) and import those tables as well. Think about what happens if you don't: you will break your data integrity. If you still want the data in table X without its dependencies, then disable the constraints and import. But you won't be able to enable your constraints again, and I don't know what you would do with broken data.
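For example, a data dictionary lookup like this (a sketch; substitute your table name for 'X') lists the parent tables that X references, and you can repeat it for each parent to walk further up the chain:

SELECT c.constraint_name, p.table_name AS parent_table
FROM   user_constraints c
JOIN   user_constraints p ON p.constraint_name = c.r_constraint_name
WHERE  c.table_name = 'X'
AND    c.constraint_type = 'R';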
"I also need to know the order of these tables to know which to import at first."
Disable the constraints before import and then enable them after import. You don't have to worry about order in that case.
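The pattern, per constraint, is just this (the constraint name here is made up):

ALTER TABLE X DISABLE CONSTRAINT fk_x_parent;
-- run the import here
ALTER TABLE X ENABLE CONSTRAINT fk_x_parent;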
Related
Situation:
I'm exporting data from the Oracle DB A, and importing it into an empty Oracle DB B.
When exporting from A, I deliberately export only part of the data in different tables. That results in a dump file that certainly does not guarantee FK integrity, due to the ignored data. Then, when I import it into DB B, it fails when creating the FKs.
Goal:
I want to have the same effect as NOVALIDATE has for FK constraints, but at import time. That is: it should not validate the old data, and just check new data. I didn't find a way to do it on the export or the import. Does anyone know how I can do it?
Extra information:
I can't change the FK on the DB A before dumping it.
The data is so intertwined with old and new data, that there is no way to create the DB B while maintaining the consistency of FKs.
I need to create the FKs, because the new data will rely on them.
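For reference, the NOVALIDATE behavior referred to above is declared per constraint. One possible (untested) workaround would be to import with constraints excluded and then create the FKs yourself like this, with made-up table and column names:

ALTER TABLE child_table ADD CONSTRAINT fk_child_parent
  FOREIGN KEY (parent_id) REFERENCES parent_table (id)
  ENABLE NOVALIDATE;

Existing rows are then not checked, but new DML is.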
I'm a total novice in terms of Oracle DB knowledge, and I am trying to understand the IMPDP command and its scope.
Issue: Suppose there are 500 tables in a particular DB. Many of them (60%-70% or more) come through with zero records when we import the data into a fresh Oracle DB (we get the data from a vendor who owns the DB). The doubt is: how can most of the tables have zero records in a DB (why were they created in the first place, then)? We're also assuming that maybe the vendor is using a specific user to generate the .DMP files who has no access to those tables, hence the 0 count. When we asked the vendor, they said that's not how Oracle works; they've provided a user export dump and said, "A schema is a collection of database objects owned by a specific user. Those objects include tables, indexes, views, functions, stored procedures, etc."
When asked about the zero-records issue, they said they're pulling the data correctly and have no idea why so many tables are empty. The SO community has great Oracle DB experts; can anyone shed some light on:
1. What might be the issue?
2. Is our assumption correct (i.e., that the user doesn't have access to the tables which came through with zero records)?
3. What's the right way forward?
4. Anything else you want to add?
The vendor is correct - the utility used to generate the export, EXPDP (the complement to IMPDP), can create a full dump of all of the database objects of a specific user. However, the parameters used to generate the export can vary greatly, and it's absolutely possible for an export to not include table data IF the EXPDP command/parameters used to create the export are specified that way. For example, let's imagine that someone wants to export a specific schema using the following command:
expdp [USER]@[DATABASE] schemas=test directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log query=TEST.TABLE:'"WHERE row_date>sysdate"'
While the export is being generated, every row in that specific table is evaluated against the WHERE condition. Rows dated on or before sysdate are skipped; only rows with a date in the future will be exported. If a condition like that is applied to the entire export, you'll have tables with 0 rows in the dump file.
That is just an example - it might also be the case that the tables really do have 0 rows. This is possible for a lot of reasons - perhaps it is an older schema with tables that were truncated at some point. Perhaps that particular database isn't used often, and the tables within the schema are empty because rows were never added to them. Maybe a developer or another DBA created a bunch of unnecessary tables and they simply were never dropped. There could be a plethora of reasons for a schema to have empty tables, and that doesn't mean there is something wrong with the database or the export file being generated. Applications and their technical requirements change all the time, and it's possible that the schema simply wasn't updated when those tables were no longer needed.
The first thing I would recommend is:
Ask the vendor to provide record counts of each table in that schema from their end, for validation purposes. This will tell you whether the tables are empty in the source database; if they are empty there, they will be empty in your export. This is very simple and can be achieved with a query like
select owner, table_name, num_rows, sample_size, last_analyzed from all_tables where owner = '[SCHEMA]';
provided that their table statistics are up to date.
If this is a big concern for you, you can always ask them to exclude those tables in the export with a command like:
expdp [USER]@[DATABASE] schemas=test exclude=TABLE:"IN ('Table1', 'Table2')" directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log
Or simply exclude them during your import with a command like:
impdp [USER]@[DATABASE] schemas=test exclude=TABLE:"IN ('Table1', 'Table2')" directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log
Either way should work, but be careful and ensure that there will be no issues from a constraint/child-record perspective. You can also exclude the constraints; there are many ways to work around it.
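For instance, skipping all constraints during the import would look something like:
impdp [USER]@[DATABASE] schemas=test exclude=CONSTRAINT directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log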
IF THERE ARE INCONSISTENCIES BETWEEN THE COUNTS AND THE ROWS IMPORTED, I would recommend asking the vendor for the specific EXPDP command or parameter file that was used to generate the export. This will tell you whether the empty tables are being caused by a clause in the export command.
It's impossible to know if your assumption is correct without knowing more about the database the export is coming from, or seeing the commands being used to generate the export. I would ask the vendor to verify record counts before assuming that it's a permission issue. Empty tables are created all the time.
I have an Oracle database with almost 300 tables; about 200 of them don't have any primary key, and a few have composite primary keys. My requirement is to import all table data into HDFS incrementally. Can you please let me know how this can be achieved using Sqoop? It would be a great help if any other option were suggested as well.
Unfortunately, being unable to recognize updated rows (you indicate that you do not track update timestamps) makes it practically impossible to use incremental loads to capture changes.
Some possibilities:
Add timestamps
Do a full load
Use a monotonically increasing column (such as a sequence-based ID) to identify new records, and don't process updated records - a sketch follows this list
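For the last option, a minimal Sqoop sketch (connection details, table, column, and paths are all assumptions) would be:

sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott \
  --password-file /user/scott/.pw \
  --table MYTABLE \
  --incremental append \
  --check-column ID \
  --last-value 12345 \
  --target-dir /data/mytable

Note that tables without a primary key also need --split-by <column> or -m 1, otherwise Sqoop will refuse to parallelize the import.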
Is there a method to import data from a single file into multiple Oracle tables while maintaining the referential integrity?
Yes.
Without a lot more detail, I'll just say that you should look at external tables to get the data from the file into the database, then select from the external table and use the INSERT ALL feature to insert into multiple tables from the single input. Since non-deferred constraints are checked at the end of the statement, a single INSERT ALL that fills parent and child tables together does not trip the FK checks.
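A minimal sketch, assuming a CSV file where each row carries one new customer and one of their orders (all table/column names, the data_dir directory object, and the file layout are made up):

CREATE TABLE orders_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  order_id      NUMBER,
  order_total   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('orders.csv')
);

INSERT ALL
  INTO customers (id, name)               VALUES (customer_id, customer_name)
  INTO orders    (id, customer_id, total) VALUES (order_id, customer_id, order_total)
SELECT customer_id, customer_name, order_id, order_total
FROM   orders_ext;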
Hope that helps.
There are a couple of alternatives (not an exhaustive list):
Walk the dependency graph of FOREIGN KEYs and make sure you insert data to "parents" before inserting it to "children".
Defer all the FOREIGN KEYs, so the order of insertion does not matter. This is OK if you can perform the whole import in a single transaction - see the sketch after this list.
Temporarily disable FOREIGN KEY constraints, import the data in any order, then re-enable them.
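A sketch of the deferred-constraint approach (names are made up; note the FK must have been created DEFERRABLE, which is not the default):

ALTER TABLE child ADD CONSTRAINT fk_child_parent
  FOREIGN KEY (parent_id) REFERENCES parent (id)
  DEFERRABLE INITIALLY IMMEDIATE;

SET CONSTRAINTS ALL DEFERRED;
-- ... inserts in any order ...
COMMIT;  -- the deferred constraints are checked here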
We get *.dmp files from a client which contain some masked table data, including indexes and constraints.
I do have those table structures (including indexes and constraints) at my end.
I want to import just the data, without the indexes and constraints (present in the .dmp file), into Oracle 10g using the 'imp' command.
I am aware of the 'imp' command; please help me by pointing out the options available in 'imp' to import only the data.
I have tried using rows=yes indexes=no, but this does not help.
You should be able to specify indexes=N and constraints=N.
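For example (the connect string and file names are placeholders):
imp user/pass@db file=export.dmp log=imp.log tables=X ignore=Y rows=Y indexes=N constraints=N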
For more info:
%> imp help=y
Here is a link with some good info on the options:
Oracle imp information
I am assuming from your post that you already have the tables and ancillary structures in your database, and you just want to suppress the error messages. If that is indeed the case, the option you want is IGNORE=Y.
The Oracle documentation is available online. You don't say what version you're on, but as you're using IMP, I'd say 9i is a good fit. Find out more... (On later versions you should check out Data Pump instead.)
Import the dump with the show=y option. This will extract the object-creation scripts from the dmp file into the log. Now you can remove the index and constraint scripts from the log and execute the rest against the database.
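For example (placeholders again):
imp user/pass@db file=export.dmp full=Y show=Y log=extract.log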
Here you can see a lot of exp/Data Pump related examples:
http://www.acehints.com/p/site-map.html
You need to disable all triggers and then import your data with the CONSTRAINTS=N argument. Consider importing a table G3E_COMPONENT with constraints, foreign keys and triggers:
SQL> alter table G3E_COMPONENT disable all triggers;
import your data:
imp userid=USER/XXXXX@ORCL CONSTRAINTS=N DATA_ONLY=Y STATISTICS=NONE file=export.exp log=imp.log TABLES=G3E_COMPONENT
That should do the trick.
IMHO, IMP can't prevent constraints being applied and triggers being fired; ignore=y only ignores errors that arise. Maybe Data Pump allows it, I don't know.
So you have to:
manually disable all triggers and constraints on the imported table (a dictionary-driven sketch follows this list)
do an import with tables=<table names> rows=Y indexes=N constraints=N
enable triggers
enable and validate constraints, and resolve any errors (find and edit/remove the offending values).
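For the first step, here is a dictionary-driven sketch (the table name is a placeholder) that disables every FK constraint on one table:

BEGIN
  FOR c IN (SELECT table_name, constraint_name
            FROM   user_constraints
            WHERE  table_name = 'MY_TABLE'
            AND    constraint_type = 'R') LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE ' || c.table_name ||
                      ' DISABLE CONSTRAINT ' || c.constraint_name;
  END LOOP;
END;
/

After the import, run the same block with DISABLE replaced by ENABLE VALIDATE.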
Be careful to use an imp version that exactly matches your DB version. I had trouble with this...
Use IGNORE=Y. It will ignore the creation errors, since you already have the schema.