Can someone give me a pointer on how to export and import table statistics in Oracle? Thank you
As always, it depends on what you want.
In Oracle, exporting table statistics means reading them from the live data dictionary and putting them into a separate statistics table. This can be done at various levels and multiple times.
That separate stats table can then be exported using expdp and imported using impdp. This can be helpful for testing with production statistics in a test environment, or for analyzing all kinds of oddities.
First create a stat table using
DBMS_STATS.CREATE_STAT_TABLE
Next, use the export procedures you need from dbms_stats, such as
DBMS_STATS.EXPORT_TABLE_STATS.
Importing can be done using the IMPORT_TABLE_STATS procedure.
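As a minimal sketch of the whole cycle (the schema, table, and stats-table names here are made up for illustration):
BEGIN
  -- 1. Create the staging table that will hold the exported statistics
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'MYSCHEMA', stattab => 'MYSTATS');
  -- 2. Copy the live optimizer statistics of one table into the staging table
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'EMP', stattab => 'MYSTATS');
END;
/
-- After moving MYSTATS to the other database (e.g. with expdp/impdp), load the stats there:
BEGIN
  DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'EMP', stattab => 'MYSTATS');
END;
/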
Full documentation: https://docs.oracle.com/database/121/ARPLS/d_stats.htm#ARPLS059
I created a dump of a local oracle database like this:
expdp mydb/passwd -schemas=myschema -dumpfile=mydumpfile.dmp -logfile=oralog.log
I sent the dump to someone who is supposed to import it into his Oracle server. Now he tells me the import fails due to errors related to tablespaces (like "tablespace XYZ is not available"; the tablespace XYZ has no counterpart in the target database). He also asks me for information about the tablespaces used in the dump.
Since I usually work with MySQL and have limited knowledge of these Oracle tablespace matters, I would really appreciate some advice.
Use the REMAP_TABLESPACE parameter.
For example,
REMAP_TABLESPACE=(source1:destination1,source2:destination1,source3:destination1,source4:destination1)
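For instance, an import command for a dump like the one above might look roughly like this (the user, directory, file, and tablespace names are placeholders; the target tablespace must already exist in the destination database):
impdp targetuser/passwd schemas=myschema directory=DATA_PUMP_DIR dumpfile=mydumpfile.dmp logfile=imp.log remap_tablespace=XYZ:USERS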
Go through the documentation about Data Pump Import. A small quote -
Multiple REMAP_TABLESPACE parameters can be specified, but no two can have the same source tablespace. The target schema must have sufficient quota in the target tablespace.
Note that use of the REMAP_TABLESPACE parameter is the only way to remap a tablespace in Data Pump Import. This is a simpler and cleaner method than the one provided in the original Import utility. That method was subject to many restrictions (including the number of tablespace subclauses) which sometimes resulted in the failure of some DDL commands.
By contrast, the Data Pump Import method of using the REMAP_TABLESPACE parameter works for all objects, including the user, and it works regardless of how many tablespace subclauses are in the DDL statement.
I'd like to know if there is an option in the "exp" command which disables partitioning when exporting an Oracle 11g database. As a matter of fact I'm looking for a simple way to import this data into an Oracle XE database which does not support partitioning.
Thanks
Use the Data Pump import and export tools, unless you have very specific reasons not to (and with Oracle 11g, I can't think of a reason).
impdp has an option for doing exactly what you want: partition_options. Set that to merge and partitioned tables will be transformed into non-partitioned ones during import.
See the examples to get started with these tools.
You may want to look up the Oracle Database Utilities Guide for more options on how to use expdp/impdp.
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#CIHCAFIG
Note, the PARTITION_OPTIONS parameter can only be used during import.
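A minimal import invocation along those lines might look like this (the user, schema, directory, and file names are only placeholders):
impdp system/passwd schemas=myschema directory=DATA_PUMP_DIR dumpfile=mydump.dmp logfile=imp.log partition_options=merge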
I use Oracle 11, and use the exp/imp tools to migrate data between databases.
It works fine IF all empty tables and sequences already exist in the target database.
But if the tables don't exist in the target DB, then a few bad things happen:
It still creates tables, but only the ones with data; I couldn't find a way to force it to create empty tables in the target DB.
It does not create the sequences.
This is how I enter my values into the export tool:
Users or Tables -> Tables
Export table data -> yes
Compress -> yes
Table or Partition to be exported -> I enter table names here one by one,
But it does not accept the names of tables without data. It says the table does not exist, so no surprise they are not imported later.
Import Data only > no
Import File > Full path to Dump file.
List contents of import file > no
Ignore create error > no
import grants > yes
import table data > yes
import entire export > yes
Sequences are not exported in table mode. The documentation lists the objects exported in each mode, and that shows that sequences are only exported in user and full database modes.
Export is deprecated in 11g, as the documentation also states:
Original Export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in Oracle Database 11g is backward migration of XMLType data to Oracle Database 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities
The empty tables are not being exported if you have deferred segment creation. This AskTom article refers to it, and it's also mentioned in the documentation:
The original Export utility does not export any table that was created with deferred segment creation and has not had a segment created for it.
You can either use dbms_metadata.get_ddl() to get the table creation statements for all the tables (or just the empty ones) and build them manually from that; or force an extent to be allocated (as mentioned in the docs too); or use the supported and current Data Pump export and import. Based on previous questions, you should only be using exp/imp if your customer refuses to handle Data Pump files, and I can't really think of a good justification for that.
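As a rough sketch of the first two workarounds (the schema and table names are invented for illustration):
-- Generate CREATE TABLE statements for tables that have no segment yet
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
FROM   all_tables
WHERE  owner = 'MYSCHEMA'
AND    segment_created = 'NO';

-- Or force a segment to be allocated so original exp will include the empty table
ALTER TABLE myschema.empty_table ALLOCATE EXTENT;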
I want to load data from one table in one schema into another schema on a daily basis.
The tables are in different databases, so creating a database link is not an option for security reasons.
About a million records will get processed.
The databases are on different servers. From database "A" I am fetching employee presence details by combining the emp details and emp presence tables for a period of a month, and loading this data into another table in database "B". This activity needs to run on a daily basis.
I need to run a job daily during off-peak hours to get a complete copy of the table into the other DB.
Should I use Import/Export, or load the data with the help of sqlldr?
Please let me know the correct way.
Thanks in advance.
What are my best options?
Well, it seems that using a database link would best fit your situation. If you want to read a table from another database, you need the read privilege on it. Perhaps you can ask the DBA to create an account (user) which only has the read privilege for the specific table. Then you can use a database link that connects as the new user.
You can't update or delete data in the table because the user you connect as doesn't have write privileges. This can address the security concern.
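A minimal sketch of that setup, with all account, link, and table names invented for illustration: create the read-only account on database "A", then create the link and run the daily copy on database "B".
-- On database "A": an account that can only read the source tables
CREATE USER readonly_user IDENTIFIED BY some_password;
GRANT CREATE SESSION TO readonly_user;
GRANT SELECT ON hr.emp TO readonly_user;
GRANT SELECT ON hr.emp_presence TO readonly_user;

-- On database "B": a link that connects as that account, plus the daily load
CREATE DATABASE LINK dblink_a
  CONNECT TO readonly_user IDENTIFIED BY some_password
  USING 'tns_alias_for_a';

INSERT INTO presence_report
SELECT e.empno, e.ename, p.presence_date
FROM   hr.emp@dblink_a e
JOIN   hr.emp_presence@dblink_a p ON p.empno = e.empno
WHERE  p.presence_date >= TRUNC(SYSDATE, 'MM');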
exp/imp and sqlldr are different tools; they don't work together. You can only import data from an export file, and you can't load an export file with sqlldr.
If you want to run this periodically, it sounds like you might want to take a look at the Oracle Scheduler.
Overview: http://docs.oracle.com/cd/B28359_01/server.111/b28310/schedover001.htm
To export the data and add it into the new database, you might want to use Oracle DataPump, which can do both the export and import for you, securely.
Data Pump Export: http://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_export.htm
So your best bet might be creating a shell script that uses Data Pump to create an export file from database number 2, and then uses Data Pump again to import that file into database number 1.
Once you have that script, you can schedule it to run during nights or at any time you have low traffic.
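A rough sketch of such a script, assuming the dump file can be made visible to both databases and with all connect strings, directories, and object names invented:
#!/bin/sh
# Export the table from the source database
expdp system/passwd@db2 tables=myschema.presence_report directory=DATA_PUMP_DIR dumpfile=presence.dmp logfile=presence_exp.log reuse_dumpfiles=y
# Copy presence.dmp to the target server's DATA_PUMP_DIR (scp, shared storage, etc.), then import it
impdp system/passwd@db1 tables=myschema.presence_report directory=DATA_PUMP_DIR dumpfile=presence.dmp logfile=presence_imp.log table_exists_action=truncate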
Regards
Is it possible to export multiple tables in Oracle using the QUERY parameter when the WHERE clauses are different for each table?
If you're using the old export (exp) then no, you'd need to do a separate export for each table. The restrictions are shown in the documentation.
If you're using Data Pump (expdp) then yes, you can specify multiple QUERY clauses and specify which table each applies to, again as described in the documentation.
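With Data Pump, a parameter file keeps the per-table quoting manageable. A hedged example (the table names and predicates are purely illustrative), run as expdp myuser/passwd parfile=exp_query.par:
TABLES=employees,departments
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=filtered.dmp
LOGFILE=filtered_exp.log
QUERY=employees:"WHERE hire_date > DATE '2010-01-01'"
QUERY=departments:"WHERE location_id = 1700"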