I'm trying to import specific rows from a full '.dmp' file using the parfile parameter.
Import command:
IMP userid=user/password@db parfile=parfile.dat
parfile.dat file:
But I'm receiving the error below when executing the IMP command:
What can be the problem?
Is it possible to use a condition using the old IMP command?
If yes, why is it not working?
Thank you for your help,
While PARFILE is a valid parameter for the original IMP utility, QUERY is not - which answers your questions:
is it possible to use a condition using the old IMP command?
No, it is not.
If yes, why is it not working?
Because it is not supported.
As you're on 12c, here's the Original Import documentation. Have a look at its Parameters section - you won't find QUERY in there (to see the list of all parameters, expand the tree node on the left-hand side of the screen).
So, what to do?
Use Data Pump instead, if possible (see the sketch below these options)
If all you have is the DMP file created with the original EXP (and you can't obtain a new, Data Pump one), import the whole table and write a query which selects data from it, using the WHERE clause you meant to use in the PARFILE.
Alternatively, delete all rows that don't satisfy that condition.
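For the first option, a minimal sketch of re-exporting a subset with Data Pump and importing it - the directory object, dump file, table name and WHERE clause are all placeholders here, and the quotes usually need escaping on the command line:
expdp user/password@db directory=DATA_PUMP_DIR dumpfile=subset.dmp tables=some_table query=some_table:\"WHERE some_col > 100\"
impdp user/password@db directory=DATA_PUMP_DIR dumpfile=subset.dmp
Note that impdp can only read dump files created by expdp, not by the original EXP.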
I am trying to import the dump file to a .sql file using the SQLFILE parameter.
I used the command "impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp SQLFILE=sample.sql LOGFILE=sample.log"
I expected this to produce a SQL file with the table contents inside, but it created a SQL file with only DDL statements.
For export I used, "expdp username/password DIRECTORY=dir DUMPFILE=sample.dmp LOGFILE=sample.log FULL=y"
Dump file size is 130 GB. So, I believe the dump has been exported correctly.
Am I missing something in the import command? Is there any other parameter I should use to get the contents?
Thanks in advance!
Your expectation was wrong, I'm afraid. You're asking it to do something it isn't designed for.
The documentation for SQLFILE says:
Purpose
Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
So it will only ever contain DDL.
There isn't a mechanism to turn a .dmp file into a .sql containing insert statements. If you need to put the data into a table, just use the native import.
Individual insert statements - if you could generate them, which SQL Developer will do as a separate task unrelated to your data pump export - would be slower, would have problems with LOBs, and would have to be run in a careful order unless integrity constraints were disabled. Data pump takes care of all of that for you.
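To actually get the data into tables, the import of the same dump file would look something like this (reusing the directory and file names from the question; FULL=Y loads everything in the file):
impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp FULL=Y LOGFILE=import.log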
I want to export data, something like this:
exp xxx/xxx file=d:\xxx.dmp owner=xxx query=\"where rownum < 1000\"
But I get an error "QUERY parameter is only use in table mode"
Oracle version 10g
As @Thilo says, with exp you can only use the query parameter in table mode. If you're able to use the newer Data Pump functionality, via the expdp command, you can apply a similar query parameter to the whole export.
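A sketch of what that could look like - the directory object DATA_PUMP_DIR is an assumption here, and with no table prefix the clause is applied to every table in the job separately:
expdp xxx/xxx schemas=xxx directory=DATA_PUMP_DIR dumpfile=xxx.dmp query=\"where rownum < 1000\"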
@Thilo is right: you can export a single table or a SUBSET of a single table.
I also recommend reading Tom's advice with regard to using a parfile.
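A parfile sidesteps the shell's quote-escaping problems. A sketch for the original exp, which has to name the tables since query only works in table mode (the parfile name and table name are placeholders):
exp xxx/xxx parfile=query.par
where query.par contains:
file=d:\xxx.dmp
tables=(some_table)
query="where rownum < 1000"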
What is the best way to copy a schema from one user/instance/server:
jdbc:oracle:thin:@deeb02:1535:DH, user pov
to another user/instance/server:
jdbc:oracle:thin:@123.456.789.123:1523:orcl, user vrs_development?
If you're using Oracle 10g or later, you should be able to do this with Data Pump:
expdp user1/pass1@db1 directory=dp_out schemas=user1 dumpfile=user1.dmp logfile=user1.log
And to import:
impdp user2/pass2@db2 directory=dp_out remap_schema=user1:user2 dumpfile=user1.dmp logfile=user2.log
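Note that dp_out has to exist as a directory object on each database first; a minimal sketch, assuming an OS path of your choosing (run by a suitably privileged user):
create directory dp_out as '/u01/app/oracle/dp_out';
grant read, write on directory dp_out to user1;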
Use the Oracle exp utility to take a dump of the schema from the first database:
exp user1/pass1@db1 owner=user1 file=user1.dmp log=user1.log
Then use the imp utility to populate the other schema in the other database:
imp user2/pass2@db2 fromuser=user1 touser=user2 file=user1.dmp log=user2.log
You can copy a schema directly over the network (without moving files from one server to another) using the Data Pump parameter NETWORK_LINK, as described here:
http://vishwanath-dbahelp.blogspot.com/2011/09/network-link-in-datapump.html
For example:
impdp user/pass@destination_server LOGFILE=log.txt NETWORK_LINK=dblink_from_dest_to_source SCHEMAS=schema1 DIRECTORY=DATA_PUMP_DIR
Check that the directory DATA_PUMP_DIR exists:
select *
from dba_directories;
and that it points to the correct place in the destination_server file system.
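The database link itself has to be created in the destination database, pointing back at the source; a sketch in which the user, password and TNS alias are placeholders:
create database link dblink_from_dest_to_source
connect to source_user identified by source_password
using 'source_tns_alias';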
cmd:
exp bla/bla file=c:\bla.bkp
My bla schema contains these objects:
Tables
T_1
T_2
T_3
T_4
Functions
F_1
F_2
Procedures
P_1
P_2
I need all the objects except one table (T_4). How can I do that?
If you are using the deprecated export utility, you cannot exclude a single object. You would have to specify every table that you wanted in a TABLES clause, i.e.
exp username/password file=c:\bla.dmp tables=(T_1,T_2,T_3)
Obviously, that gets unwieldy rather quickly. You can potentially write a query that generates the tables list for you and then copy & paste from a SQL*Plus window. But that is also rather unwieldy.
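For example, something along these lines, run as the schema owner, would generate the list (assuming only T_4 is to be left out):
select table_name from user_tables where table_name <> 'T_4';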
Assuming you are using a reasonably new version of Oracle, however, you should be able to use expdp, the Data Pump version of the export utility, which has an EXCLUDE parameter. Note that expdp writes to a directory object rather than a client-side path:
expdp username/password schemas=bla directory=DATA_PUMP_DIR dumpfile=bla.dmp exclude=TABLE:\"= 'T_4'\"
You can specify the tables of interest on the command line, something like:
exp bla/bla file=c:\bla.bkp TABLES=(T_1,T_2,T_3)
OK, that only gets the tables; for the rest of the stuff you are going to have to use or write something else. Look at the dbms_metadata.GET_DDL function.
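For example, a sketch of pulling the DDL for one of the procedures (GET_DDL defaults to the current schema; in SQL*Plus you may need SET LONG to see the whole output):
select dbms_metadata.get_ddl('PROCEDURE', 'P_1') from dual;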
I have an entire DB dump to be imported into my own. I want to exclude the data from certain tables (mostly because they are huge and not useful). I cannot exclude those tables entirely, since I need the table objects themselves (minus the data) and would have to recreate them in my schema if I did. Also, in the absence of those table objects, various foreign key constraints defined on other tables would fail to import and would need to be redefined. So I need to exclude just the data from certain tables; I want the data from all the other tables, though.
Is there a set of parameters for impdp that can help me do so?
I would make two runs at it: The first I would import metadata only:
impdp ... CONTENT=METADATA_ONLY
The second would include the data only for the tables I was interested in:
impdp ... CONTENT=DATA_ONLY TABLES=table1,table2...
Definitely make two runs: one to create all the table objects; but instead of using TABLES in the second impdp run, use EXCLUDE:
impdp ... CONTENT=DATA_ONLY EXCLUDE=TABLE:"IN ('table1', 'table2')"
The other way works, but this way you only have to list the tables you don't want versus all that you want.
If the table is too big to export in full, you can use the SAMPLE parameter in the expdp command to export whatever percentage of the rows you want.
$ expdp tables=T100test DIRECTORY=expimp1 DUMPFILE=test12.dmp SAMPLE=10
This command will export only about 10% of the T100test table's data (SAMPLE takes a block sample, so the percentage is approximate).
Syntax:
EXCLUDE=[object_type]:[name_clause],[object_type]:[name_clause]
INCLUDE=[object_type]:[name_clause],[object_type]:[name_clause]
Examples of operator usage:
EXCLUDE=SEQUENCE
or EXCLUDE=TABLE:"IN ('EMP','DEPT')"
or EXCLUDE=INDEX:"= 'MY_INDX'"
or INCLUDE=PROCEDURE:"LIKE 'MY_PROC_%'"
or INCLUDE=TABLE:"> 'E'"
The parameters can also be stored in a parameter file, for example exp.par:
DIRECTORY = my_dir
DUMPFILE = exp_tab.dmp
LOGFILE = exp_tab.log
SCHEMAS = scott
INCLUDE = TABLE:"IN ('EMP', 'DEPT')"
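You would then run the export with something like:
expdp scott/password parfile=exp.par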
It seems you can also exclude the data directly when importing, using the impdp QUERY parameter:
impdp [...] QUERY='TABLE_NAME:"WHERE rownum = 0"'
cf. community.oracle.com