I'd like to know if there is an option in the "exp" command which disables partitioning when exporting an Oracle 11g database. As a matter of fact I'm looking for a simple way to import this data into an Oracle XE database which does not support partitioning.
Thanks
Use the Data Pump import and export tools, unless you have very specific reasons not to (and with Oracle 11g, I can't think of a reason).
impdp has an option for doing exactly what you want: PARTITION_OPTIONS. Set it to MERGE and partitioned tables will be transformed into non-partitioned ones during the import.
See the examples to get started with these tools.
You may want to look up the Oracle Database Utilities Guide for more options on how to use expdp/impdp.
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#CIHCAFIG
Note that the PARTITION_OPTIONS parameter can only be used during import.
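For example, assuming the schema was exported normally with expdp, the merge happens on the import side (credentials, schema and file names here are placeholders):

expdp myuser/passwd schemas=myschema dumpfile=myschema.dmp logfile=exp.log
impdp myuser/passwd schemas=myschema dumpfile=myschema.dmp partition_options=merge logfile=imp.log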
Can someone give me a pointer on how to export and import table statistics in Oracle? Thank you
As always, it depends on what you want.
In Oracle, exporting table statistics means reading them from the live data dictionary and putting them into a separate statistics table. This can be done at various levels and multiple times.
That separate stats table can then be exported using expdp and imported using impdp. This can be helpful for testing with production statistics in a test environment, or for analyzing all kinds of oddities.
First create a stat table using
DBMS_STATS.CREATE_STAT_TABLE
Next, use the export procedures you need from DBMS_STATS, such as
DBMS_STATS.EXPORT_TABLE_STATS.
Importing can be done using the IMPORT_TABLE_STATS procedure.
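Putting those together, a minimal PL/SQL sketch (the schema, table and statistics-table names are placeholders):

-- On the source database: create a holding table and copy the live stats into it
BEGIN
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'MYSCHEMA', stattab => 'MY_STATS');
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'MY_TABLE',
                                stattab => 'MY_STATS');
END;
/

-- After moving MYSCHEMA.MY_STATS to the target database (e.g. via expdp/impdp):
BEGIN
  DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'MY_TABLE',
                                stattab => 'MY_STATS');
END;
/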
Full documentation: https://docs.oracle.com/database/121/ARPLS/d_stats.htm#ARPLS059
I created a dump of a local oracle database like this:
expdp mydb/passwd schemas=myschema dumpfile=mydumpfile.dmp logfile=oralog.log
I sent the dump to someone who is supposed to import it into his Oracle server. Now he tells me the import fails due to some errors related to tablespaces (like "tablespace XYZ is not available"; the tablespace XYZ does not exist in his database). Besides, he asks me for some information about the tablespaces referenced in the dump.
Since I usually work with MySQL and have limited knowledge about these Oracle tablespace things, I would really appreciate some advice.
Use the REMAP_TABLESPACE parameter.
For example,
REMAP_TABLESPACE=source1:destination1,source2:destination1,source3:destination1,source4:destination1
Go through the documentation about Data Pump Import. A small quote -
Multiple REMAP_TABLESPACE parameters can be specified, but no two can have the same source tablespace. The target schema must have sufficient quota in the target tablespace.

Note that use of the REMAP_TABLESPACE parameter is the only way to remap a tablespace in Data Pump Import. This is a simpler and cleaner method than the one provided in the original Import utility. That method was subject to many restrictions (including the number of tablespace subclauses) which sometimes resulted in the failure of some DDL commands.

By contrast, the Data Pump Import method of using the REMAP_TABLESPACE parameter works for all objects, including the user, and it works regardless of how many tablespace subclauses are in the DDL statement.
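So, assuming the dump references a tablespace XYZ that does not exist on the target database, the receiver could run something like this (credentials, schema and tablespace names are placeholders):

impdp hisuser/passwd schemas=myschema dumpfile=mydumpfile.dmp logfile=imp.log remap_tablespace=XYZ:USERS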
I need to export the tables for a given schema into DDL scripts and INSERT statements, scripted such that the order of dependencies/constraints is maintained.
I came across this article suggesting how to archive the database with data - http://www.dba-oracle.com/t_archiving_data_in_file_structures.htm - but I am not sure if the article is applicable to Oracle 10g/11g.
I have seen "export table with data" features in SQL Developer, Toad for Oracle, DreamCoder for Oracle, etc., but I would need to do this one table at a time, and would still need to figure out the right order of script execution manually.
Are there any tools/scripts that can utilize oracle metadata and generate DDL script with data?
Note that some of the tables have CLOB datatype columns - so the tool/script would need to be able to handle these columns.
P.S. I need something similar to the "Generate Scripts" feature in SQL Server 2008, where one can specify the "script data" option and get back a self-sufficient script with DDL and data, generated in the order of table constraints. Please see: http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx
Thanks for your help!
Firstly, recognise that this isn't necessarily possible. A view can use a function in a package that also selects from the view. Another issue is that you might need to load data into tables and then apply constraints, even though this might be slower than the other way round.
In short, you will need to do some work here.
Work out the dependencies in your system. ALL_DEPENDENCIES is the primary mechanism.
Then use DBMS_METADATA.GET_DDL to extract the DDL statements. For small data volumes, I'd extract the constraints separately for applying after the data load.
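A minimal sketch of that approach, run as the schema owner (the transform parameters suppress constraints in the generated table DDL so that they can be extracted and applied separately after the data load):

BEGIN
  -- Omit constraints from the generated CREATE TABLE statements
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'REF_CONSTRAINTS', FALSE);
END;
/

SELECT DBMS_METADATA.GET_DDL('TABLE', table_name) FROM user_tables;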
In current versions you can create external tables to unload data from regular tables into OS files (and obviously go the other way round). But if you've got exotic datatypes (BLOB, RAW, XMLTYPEs, User Defined Types....) it will be more challenging.
I suggest that you use Oracle standard export and import (exp/imp) here; is there a reason why you won't consider it? Note that, in addition, you can use the INDEXFILE option on the import to output the SQL statements (unfortunately this doesn't include the inserts) to a file instead of actually executing them.
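A sketch of the INDEXFILE approach (credentials and file names are placeholders; the generated file contains the CREATE statements, with the table DDL written as comments):

imp myuser/passwd file=export.dmp full=y indexfile=create_ddl.sql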
I would like to export data from an Oracle table into a *.dbf file (like Excel) through PL/SQL scripts. Is there any code available?
There are a number of different ways to do this. The easiest way is to use an IDE like SQL Developer or TOAD, which offer this functionality.
If you want to call it from PL/SQL, then there are no built-in Oracle functions. However, it is relatively straightforward to build something using UTL_FILE which can write out value-separated records. These can be picked up in Excel.
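For example, a minimal sketch (assumes a directory object exists, e.g. created with CREATE DIRECTORY exp_dir AS '/tmp'; the table and column names are placeholders):

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('EXP_DIR', 'my_table.csv', 'w');
  FOR r IN (SELECT col1, col2 FROM my_table) LOOP
    -- one value-separated record per row
    UTL_FILE.PUT_LINE(f, r.col1 || ',' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/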
Note that the default separator - the comma, the "C" in .CSV - will cause problems if your exported data contains commas, so you will need to use Excel's data import wizard rather than a right-click Open With ...
Incidentally, it is probably a bad idea to use the .dbf suffix. In an Oracle file system the presumed meaning is database file - i.e. part of the database's infrastructure. This is just a convention, but there is no point in needlessly confusing people. Preferred alternatives include .csv, .dmp or .exp.
edit
If your interest is just to export data for transferring to another Oracle database then you should look at using the Data Pump utility. This comes with an API, so it can be used from PL/SQL. Alternatively, you can unload data through external tables declared with a Data Pump driver.
You could also consider using the External Tables feature of Oracle. This essentially allows you to map a flat file to a 'virtual' table; with the ORACLE_DATAPUMP driver you can also write to it (and therefore the file) via CREATE TABLE ... AS SELECT.
Oracle External Tables Concept Guide
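For unloading, a sketch using the ORACLE_DATAPUMP driver (the directory object, table and file names are placeholders; note the resulting file is in Data Pump format, not CSV):

CREATE TABLE my_table_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY exp_dir
    LOCATION ('my_table.dmp')
  )
AS SELECT * FROM my_table;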
We get *.dmp files from a client which contain some masked table data, including indexes and constraints.
I do have those table structures (including indexes and constraints) at my end.
I want to import just the data, without the indexes and constraints (present in the .dmp file), into Oracle 10g using the 'imp' command.
I am aware of the 'imp' command. Please help me by letting me know the options available in the 'imp' command to import only the data.
I have tried using rows=yes indexes=no but this does not help.
You should be able to specify indexes=N and constraints=N.
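For example (credentials and file name are placeholders; IGNORE=Y skips the "already exists" errors since your tables are pre-created):

imp myuser/passwd file=client.dmp full=y ignore=y rows=y indexes=n constraints=n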
For more info:
%> imp help=y
Here is a link with some good info on the options:
Oracle imp information
I am assuming from your post that you already have the tables and ancillary structures in your database, and you just want to suppress the error messages. If that is indeed the case, the option you want is IGNORE=Y.
The Oracle documentation is available online. You don't say what version you're on, but as you're using IMP, I'd say 9i is a good fit. Find out more... (On later versions you should check out Data Pump instead.)
Import the dump with the SHOW=Y option. This extracts the DDL from the dmp file into the log without executing it. You can then remove the index and constraint statements from the log and execute the rest against the database.
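For example (credentials and file names are placeholders):

imp myuser/passwd file=export.dmp full=y show=y log=extracted_ddl.log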
Here you can see a lot of exp/Data Pump related examples.
http://www.acehints.com/p/site-map.html
You need to disable all triggers and then import your data with the CONSTRAINTS=N argument. Consider importing a table G3E_COMPONENT with constraints, foreign keys and triggers:
SQL> alter table G3E_COMPONENT disable all triggers;
import your data:
imp userid=USER/XXXXX@ORCL CONSTRAINTS=N DATA_ONLY=Y STATISTICS=NONE file=export.exp log=imp.log TABLES=G3E_COMPONENT
Should do the trick
IMHO, IMP can't prevent constraints from being applied and triggers from being fired; IGNORE=Y only ignores errors that arise. Maybe Data Pump allows it, I don't know.
So you have to:
manually disable all triggers and constraints on the imported table (see the sketch after this list)
do an import with tables=<table names> rows=Y indexes=N constraints=N
enable triggers
enable and validate constraints, and resolve any errors (find and edit/remove offending values).
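A sketch for generating the disable statements in step 1 (the table name is a placeholder; spool and run the output, then flip DISABLE CONSTRAINT to ENABLE VALIDATE CONSTRAINT when re-enabling):

SELECT 'ALTER TABLE ' || table_name || ' DISABLE CONSTRAINT ' || constraint_name || ';'
  FROM user_constraints
 WHERE table_name = 'MY_TABLE';

SELECT 'ALTER TRIGGER ' || trigger_name || ' DISABLE;'
  FROM user_triggers
 WHERE table_name = 'MY_TABLE';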
Be careful to use an imp version that exactly matches your DB version. I had trouble with this...
Use IGNORE=Y. It will ignore the creation errors, since you already have the schema.