Is it possible to export multiple tables in Oracle using the QUERY parameter when the WHERE clauses are different for each table?
If you're using the old export (exp) then no, you'd need to do a separate export for each table. The restrictions are shown in the documentation.
If you're using Data Pump (expdp) then yes, you can specify multiple QUERY clauses and specify which table each applies to, again as described in the documentation.
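For example, a minimal sketch with two hypothetical tables, either on the expdp command line or in a parameter file:

    QUERY=scott.emp:"WHERE deptno = 10"
    QUERY=scott.dept:"WHERE loc = 'DALLAS'"

Each QUERY clause is prefixed with the table it applies to.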
Can someone give me a pointer on how to export and import table statistics in Oracle? Thank you.
As always, it depends on what you want.
In Oracle, exporting table statistics means reading them from the live data dictionary and putting them in a separate statistics table. This can be done at various levels and multiple times.
That separate stats table can then be exported using expdp and imported using impdp. This can be helpful for testing with production statistics in a test environment, or for analyzing all kinds of oddities.
First create a stat table using
DBMS_STATS.CREATE_STAT_TABLE
Next use the export functions you need from DBMS_STATS, such as
DBMS_STATS.EXPORT_TABLE_STATS
Importing can be done using the IMPORT_TABLE_STATS procedure.
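A minimal PL/SQL sketch, assuming a schema SCOTT and a table EMP (both hypothetical):

    -- 1. Create a table to hold the exported statistics
    BEGIN
      DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'MY_STATS');
    END;
    /
    -- 2. Copy the live statistics for EMP into MY_STATS
    BEGIN
      DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'EMP',
                                    stattab => 'MY_STATS');
    END;
    /
    -- 3. On the target database, after moving MY_STATS across with expdp/impdp
    BEGIN
      DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'EMP',
                                    stattab => 'MY_STATS');
    END;
    /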
Full documentation: https://docs.oracle.com/database/121/ARPLS/d_stats.htm#ARPLS059
If I have a SCHEMA_SEARCH_PATH set and I wish to create a bunch of tables using a common script by setting the schema, rather than spelling out the schema in each CREATE TABLE (so the common script can be used in multiple schemas), this also sets the schema search path to just the specified schema.
This seems like an undesirable side effect.
The value set by SET SCHEMA_SEARCH_PATH is not affected by any other command.
But this value is only used when an identifier is not qualified with a schema and an object with this name doesn't exist in the current schema (which is affected by the SET SCHEMA command).
For example, tables referenced by non-qualified names are searched in the following order:
1. Tables of the current schema.
2. Local temporary tables. (Currently these also include query aliases from WITH clauses, but this may change if someone implements a separate scope of identifiers for these views.)
3. Tables of each schema from SCHEMA_SEARCH_PATH, if any. When multiple schemas are specified, their order is significant: they are processed in that order.
4. Legacy or compatibility tables, such as DUAL or SYSDUMMY1 in DB2 and Derby compatibility modes.
The first table matched by its name will be used.
This is a complex case; for most types of database objects only steps (1) and (3) are performed.
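A minimal sketch of the behaviour in H2 (schema and table names hypothetical):

    CREATE SCHEMA S1;
    CREATE SCHEMA S2;
    SET SCHEMA_SEARCH_PATH S1, S2;

    SET SCHEMA S1;            -- does not change SCHEMA_SEARCH_PATH
    CREATE TABLE T(ID INT);   -- created as S1.T, in the current schema

    SET SCHEMA S2;            -- current schema is now S2
    SELECT * FROM T;          -- not found in S2, so resolved via the search path as S1.T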
If you think that something is not going as described here and you can create a standalone test case (Java / JDBC / SQL only, no third-party libraries), you can create a bug report on GitHub:
https://github.com/h2database/h2database/issues
I need to export the full schema restricted to one month's data range. I have a lot of partitioned tables inside the schema, so I need a query to export the schema without any constraint issues or partitioned table issues.
Usually I used to export the partitioned tables separately, mentioning all the partitioned table names inside the query.
I'm not sure you can do that (at least, not in a simple manner).
In order to export only part of every table, you'd have to use the QUERY parameter and include a WHERE clause to fetch data within the last month.
If there are many tables (actually, more than one or two), I'd suggest not putting everything on the command line, but creating a parameter file, putting everything you use with EXPDP in there, and using that. It is easier to maintain.
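A parameter file sketch, assuming a schema MYSCHEMA whose tables have a DATE column CREATED (all names hypothetical), saved as last_month.par:

    SCHEMAS=MYSCHEMA
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=myschema_last_month.dmp
    LOGFILE=myschema_last_month.log
    QUERY=MYSCHEMA.ORDERS:"WHERE created >= ADD_MONTHS(SYSDATE, -1)"
    QUERY=MYSCHEMA.ORDER_ITEMS:"WHERE created >= ADD_MONTHS(SYSDATE, -1)"

Run it with:

    expdp system/password PARFILE=last_month.par

If every table really has that column, a single QUERY clause with no table prefix applies to all tables in the job, which saves a lot of typing.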
I use Oracle 11g, and use the exp/imp tools to migrate data between databases.
It works fine IF all empty tables and sequences are already created in the target database.
But if the tables don't exist in the target DB, then a few bad things happen:
It still creates tables, but only the ones with data; I couldn't find a way to force it to create the empty tables in the target DB.
It does not create the sequences.
This is how I enter my values into the export tool:
Users or Tables -> Tables
Export table data -> yes
Compress -> yes
Table or Partition to be exported -> I enter table names here one by one,
But it does not accept the names of tables without data. It says the table does not exist, so no surprise they are not imported later.
Import Data only > no
Import File > Full path to Dump file.
List contents of import file > no
Ignore create error > no
import grants > yes
import table data > yes
import entire export > yes
Sequences are not exported in table mode. The documentation lists the objects exported in each mode, and that shows that sequences are only exported in user and full database modes.
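For example, a user-mode export (credentials and schema hypothetical) would pick up the schema's sequences as well as its tables:

    exp system/password OWNER=scott FILE=scott.dmp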
Export is deprecated in 11g, as the documentation also states:
Original Export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in Oracle Database 11g is backward migration of XMLType data to Oracle Database 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities.
The empty tables are not being exported because you have deferred segment creation. This AskTom article refers to it, and it's also mentioned in the documentation:
The original Export utility does not export any table that was created with deferred segment creation and has not had a segment created for it.
You can either use DBMS_METADATA.GET_DDL() to get the table creation statements for all the tables (or just the empty ones) and build them manually from that; or force an extent to be allocated (as mentioned in the docs too); or use the supported and current Data Pump export and import. Based on previous questions, you should only be using exp/imp if your customer refuses to handle Data Pump files, and I can't really think of a good justification for that.
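A sketch of the first two options, assuming a schema SCOTT and a table EMP (both hypothetical):

    -- Find the empty tables that exp will skip (11gR2 and later)
    SELECT table_name
      FROM all_tables
     WHERE owner = 'SCOTT'
       AND segment_created = 'NO';

    -- Option 1: extract the CREATE TABLE statement to run on the target
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMP', 'SCOTT') FROM dual;

    -- Option 2: force a segment to be allocated so exp picks the table up
    ALTER TABLE scott.emp ALLOCATE EXTENT;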
I need to export the tables for a given schema into DDL scripts and INSERT statements, scripted such that the order of dependencies/constraints is maintained.
I came across this article suggesting how to archive the database with data - http://www.dba-oracle.com/t_archiving_data_in_file_structures.htm - but I'm not sure if the article is applicable to Oracle 10g/11g.
I have seen "export table with data" features in SQL Developer, Toad for Oracle, DreamCoder for Oracle, etc., but I would need to do this one table at a time, and would still need to figure out the right order of script execution manually.
Are there any tools/scripts that can utilize oracle metadata and generate DDL script with data?
Note that some of the tables have CLOB datatype columns - so the tool/script would need to be able to handle these columns.
P.S. I need something similar to the "Generate Scripts" feature in SQL Server 2008, where one can specify the "script data" option and get back a self-sufficient script with DDL and data, generated in the order of table constraints. Please see: http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx
Thanks for your help!
Firstly, recognise that this isn't necessarily possible: a view can use a function in a package that also selects from the view, so the dependencies can be circular. Another issue is that you might need to load data into tables and then apply constraints, even though this might be slower than the other way round.
In short, you will need to do some work here.
Work out the dependencies in your system. ALL_DEPENDENCIES is the primary mechanism.
Then use DBMS_METADATA.GET_DDL to extract the DDL statements. For small data volumes, I'd extract the constraints separately for applying after the data load.
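A sketch of those two steps, assuming the objects live in a schema SCOTT (all names hypothetical):

    -- Step 1: the dependency graph
    SELECT name, type, referenced_name, referenced_type
      FROM all_dependencies
     WHERE owner = 'SCOTT';

    -- Step 2: table DDL with constraints suppressed, so the
    -- constraints can be extracted and applied after the data load
    BEGIN
      DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);
      DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'REF_CONSTRAINTS', FALSE);
    END;
    /
    SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
      FROM all_tables
     WHERE owner = 'SCOTT';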
In current versions you can create external tables to unload data from regular tables into OS files (and obviously go the other way round). But if you've got exotic datatypes (BLOB, RAW, XMLTYPEs, User Defined Types....) it will be more challenging.
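For example, an ORACLE_DATAPUMP external table can unload a regular table into an OS file (directory object and all names hypothetical):

    CREATE TABLE emp_unload
      ORGANIZATION EXTERNAL (
        TYPE ORACLE_DATAPUMP
        DEFAULT DIRECTORY data_pump_dir
        LOCATION ('emp_unload.dmp')
      )
      AS SELECT * FROM emp;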
I suggest that you use Oracle standard export and import (exp/imp) here; is there a reason why you won't consider it? Note in addition that you can use the INDEXFILE option on the import to output the SQL statements (unfortunately this doesn't include the inserts) to a file instead of actually executing them.
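For example (credentials and file names hypothetical):

    imp system/password FILE=expdat.dmp FULL=y INDEXFILE=create_objects.sql

This writes the CREATE statements to create_objects.sql instead of running them against the database.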