Why do regular Oracle tables support DML statements but external tables do not?

It is known that Oracle regular tables support all DML statements, but the same is not true for external tables. I tried the following:
SQL> INSERT INTO xtern_empl_rpt VALUES ('70','Rakshit','Nantu','4587966214','natu.rakshit#ge.com','55');
INSERT INTO xtern_empl_rpt VALUES ('70','Rakshit','Nantu','4587966214','natu.rakshit#ge.com','55')
*
ERROR at line 1:
ORA-30657: operation not supported on external organized table
SQL> update xtern_empl_rpt set FIRST_NAME='Arup' where SSN='896743856';
update xtern_empl_rpt set FIRST_NAME='Arup' where SSN='896743856'
*
ERROR at line 1:
ORA-30657: operation not supported on external organized table
SQL>
So it seems external tables do not support this. But my question is: what is the logical reason behind this design?

There is no mechanism in Oracle for locking rows in external tables, and none of the concurrency controls which we get with regular heap tables. So updating is not allowed.
External tables created with the Oracle Loader driver are read-only; the Data Pump driver allows us to write to external table files, but only in CTAS mode, i.e. the file is populated once, when the table is created.
The problem is that external tables are basically windows onto OS files, without the layer of abstraction and control that internal tables offer. Fundamentally, there is no way for the database to lock a record in an OS file, because the notion of a "record" is a database thing, not an OS file thing.
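For reference, a minimal sketch of the one write path that does exist, using the Data Pump driver (the directory object and table names are assumptions): the file is populated exactly once, at creation time, and any subsequent DML still raises ORA-30657.

CREATE TABLE empl_unload
ORGANIZATION EXTERNAL
( TYPE oracle_datapump
  DEFAULT DIRECTORY xtern_data_dir    -- assumed, pre-created directory object
  LOCATION ('empl_unload.dmp')
) AS SELECT * FROM employees;         -- employees: assumed source table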

External tables are designed for one thing only: data loading and unloading. They are simply not meant to be used with normal DML, and they're not really meant for normal selects either. That works, but if you need to run a lot of queries against an external table, you're "doing it wrong": load the data into proper tables, gather statistics and add indexes as necessary.
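For instance, a minimal sketch of that recipe, with assumed table and column names:

CREATE TABLE empl_rpt AS
  SELECT * FROM xtern_empl_rpt;                    -- one scan of the flat file
CREATE INDEX empl_rpt_ssn_ix ON empl_rpt (ssn);    -- ssn: assumed column
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'EMPL_RPT')

From then on, queries go against empl_rpt, not the external table.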
Having external tables behave like normal tables would require implementing all of the transactional machinery for them, which is very complex and not worth it, since that's not what they are meant for.
If you need normal tables and want to move them from one Oracle database to another, you should also evaluate transportable tablespaces.

The limitations of external tables are an obvious consequence of their being read-only: they are an adapter that exposes to SQL queries either arbitrary record-organized files (ORACLE_LOADER type) or exported copies of tables from another database (ORACLE_DATAPUMP type).
As already mentioned, external tables are only good for full-table-scan queries; if you need indexes for heavy-duty queries, or need to modify foreign data sets that have been imported from files, populate regular tables instead, for example with the SQL*Loader tool.

Related

When can we use Oracle external tables

I have read many posts comparing external tables with SQL*Loader, and the main advantage cited is optimizing the SELECT query with the many options SQL offers on an external table. But I am finding it difficult to run selects on large files (1.5 GB); even a simple select count(*) takes minutes.
My plan is to generate a report based on this data by running a number of select statements against it. I wonder whether this is a better idea than loading the data into an internal table.
I assume the ideal use of an external table is to SELECT from the file to perform cleanup and load it into an internal table more efficiently, and that it is not meant to be used as a table for a longer duration, especially for large files. Please correct me if I am wrong.
If you're going to run multiple selects over data from a big file, it is much better to load it into an internal staging table (either with SQL*Loader, or with an external table and an INSERT ... SELECT) and then run your queries against that.
You should probably also consider creating some indexes on that table to speed up your queries.
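A rough sketch of that staging approach, assuming an external table named big_file_xt and a report column named report_date:

CREATE TABLE staging_rpt AS
  SELECT * FROM big_file_xt WHERE 1 = 0;   -- empty copy of the external table's shape
INSERT /*+ APPEND */ INTO staging_rpt
  SELECT * FROM big_file_xt;               -- one full scan of the 1.5 GB file
COMMIT;
CREATE INDEX staging_rpt_ix ON staging_rpt (report_date);

All subsequent report queries then hit the indexed staging table instead of rescanning the file.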

Can EF6 use oracle database links?

We have a legacy app that I am rewriting in .NET. All of our databases are Oracle and make use of database links. Is there any way for Entity Framework 6 to generate models based on tables located in a different database?
Currently the legacy app gets data from the table like this:
SELECT * FROM emp@foo2;
where its DB connection is to database foo, which has a database link to the database foo2.
I would like to reproduce this using EF6. So far, this question is all I have found on the subject.
You can do one of two things that EF 4 or higher will work with:
CREATE VIEW EMP AS SELECT * FROM emp@foo2;
CREATE MATERIALIZED VIEW EMP AS SELECT * FROM emp@foo2;
LOBs are not accessible across a database link without some contorted PL/SQL processing to read the LOB piece by piece.
I believe fast refresh does not work across database links, so you must consider the size of the table on the linked database. If you are refreshing a million rows, you may find the time this takes is an issue. Most large tables are full of tombstone data that never changes, so a timestamp column holding the last-modified date could help you build a package that picks out only the changed data.
If you are doing complicated joins with either approach, ensure that Oracle considers the column that would be the primary key to be NOT NULL.
You can add a primary key on views and materialized views, but it must be disabled. See here for details.
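A hedged sketch of the view option with a disabled key (the key column empno is an assumption):

CREATE OR REPLACE VIEW emp AS SELECT * FROM emp@foo2;
ALTER VIEW emp ADD CONSTRAINT emp_pk
  PRIMARY KEY (empno) RELY DISABLE NOVALIDATE;   -- empno: assumed key column

Oracle never enforces the constraint (view constraints must be disabled), but it gives EF a key to build the entity model around.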

External table limitations

I am now working with external tables, and while I do like their flexibility, I would like to know these things about them:
In SQL*Loader we can append data to the table. Can we do that with an external table?
With an external table, we cannot create indexes, nor can we perform DML operations. Is this a kind of virtual table, or does it occupy space in the database?
Also, with SQL*Loader we can access data from any server, whereas for an external table we define a default directory. Can we access the data from any server with an external table as well?
External tables allow Oracle to query data that is stored outside the database in flat files as though the file were an Oracle table.
The ORACLE_LOADER driver can be used to access any data stored in any format that can be loaded by SQL*Loader. No DML can be performed on external tables but they can be used for query, join and sort operations. Views and synonyms can be created against external tables. They are useful in the ETL process of data warehouses since the data doesn't need to be staged and can be queried in parallel. They should not be used for frequently queried tables.
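As a small illustration of the view/synonym point, with assumed names:

CREATE VIEW clean_empl_v AS
  SELECT * FROM xtern_empl_rpt
  WHERE  ssn IS NOT NULL;                  -- filter bad rows at query time
CREATE SYNONYM empl_src FOR xtern_empl_rpt;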
You asked:
In SQL*Loader we can append data to the table. Can we do that with an external table?
Yes.
With an external table, we cannot create indexes, nor can we perform DML operations. Is this a kind of virtual table, or does it occupy space in the database?
As the name suggests, it is external to the database, so the data itself takes no space inside it. You use the ORGANIZATION EXTERNAL syntax, and the directory that holds the data files is created at the OS level.
Also, with SQL*Loader we can access data from any server, whereas for an external table we define a default directory. Can we access the data from any server with an external table as well?
This one is wrong: SQL*Loader is a client-side tool, while external tables are a server-side facility. An external table can only load a file that is accessible from the database server; you can't load an external table from a file residing on your client. You need to save the files to a filesystem available to the Oracle server.
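A sketch of the server-side setup (the path and grantee are assumptions):

CREATE OR REPLACE DIRECTORY xtern_data_dir AS '/u01/app/oracle/ext_data';
GRANT READ, WRITE ON DIRECTORY xtern_data_dir TO scott;   -- scott: assumed user

The path names a filesystem on the database server; files on your client machine are invisible to it.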
Prior to version 10g, external tables were READ ONLY and no DML could be performed. Starting with Oracle Database 10g, external tables can be written to as well as read from (the writing being done through the ORACLE_DATAPUMP driver).
From the documentation, also read Behavior Differences Between SQL*Loader and External Tables.

Why was the external table concept established in Oracle?

SQL*Loader: Oracle uses this functionality, through the ORACLE_LOADER access driver, to move data from a flat file into the database;
Data Pump: it uses a Data Pump access driver to move data out of the database into a file in a proprietary Oracle format, and back into the database from files of that format.
Given that a data load can be done by either the SQL*Loader or Data Pump utilities, and a data unload can also be done by the Data Pump utility:
Are there any extra benefits that can be achieved by using external tables, that none of the previously mentioned utilities can do by themselves?
The Oracle table creation command below creates a table that looks like an ordinary Oracle table. Why, then, does Oracle tell us to call it an external table?
create table export_empl_info organization external
( type oracle_datapump
  default directory xtern_data_dir
  location ('empl_info_rpt.dmp')
) as select * from empl_info;
"Are there any extra benefits that can be achieved by using external
tables, that none of the previously mentioned utilities can do by
themselves?"
SQL*Loader and Data Pump both require us to load the data into tables before we can access it in the database, whereas we access external tables directly through SELECT statements. It's a much more flexible mechanism.
"Why are then Oracle telling us to call it as an external table?"
Umm, because it is external. The data resides in a file (or files) which is controlled by the OS. We can change the data in an external table by running an OS command like
$> cp whatever.csv external_table_data.csv
There's no redo, rollback, flashback query or any of the other appurtenances of an internal database table.
I think that the primary benefits of external tables, for me, have been:
i) Not having to execute a host command to import data, which supports a trend in Oracle of controlling the entire code base from inside the database. Preprocessing in 11g allows access to remote files through FTP, use of compressed files, combining multiple files into one, etc. (see the sketch after this list).
ii) More efficient loads, by means of applying complex data transformations during the load process: aggregations, merges, multi-table inserts, and so on.
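As an illustration of point i), a hedged sketch of the 11g PREPROCESSOR clause reading a compressed file on the fly; the table, columns, directory objects and file name are all assumptions:

CREATE TABLE empl_gz_xt
( empl_id    NUMBER,
  first_name VARCHAR2(30)
)
ORGANIZATION EXTERNAL
( TYPE oracle_loader
  DEFAULT DIRECTORY xtern_data_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR exec_dir:'zcat'          -- exec_dir: assumed directory for executables
    FIELDS TERMINATED BY ','
  )
  LOCATION ('empl_info.csv.gz')
);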
I've used them for data warehouse loads, but any scenario requiring loading of, or access to, standard data files is a candidate for external tables. SQL*Loader still has its place as a tool for loading into an Oracle database from a client or other host system. Data Pump is for transferring data between Oracle databases, so it's rather different.
One limitation of external tables is that they won't process stream data: records have to be delimited. This was true in 10.2; I'm not sure whether that has changed since.
Use the data dictionary views ALL_/DBA_/USER_EXTERNAL_TABLES for information on them.
RE: why external tables vs sqlldr for loading data? Mainly to get server-managed parallelism instead of client-managed parallelism.
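A hedged sketch of what that looks like in practice (the degree and names are assumptions): the database itself fans parallel readers out over the file, whereas with sqlldr you would have to run multiple client processes by hand.

ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(t, 4) */ INTO empl_info t
  SELECT /*+ PARALLEL(x, 4) */ * FROM xtern_empl_rpt x;
COMMIT;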

Script Oracle tables (DDL) with data insert statements into single/multiple sql files

I need to export the tables for a given schema into DDL scripts and INSERT statements, scripted so that the order of dependencies/constraints is maintained.
I came across this article suggesting how to archive the database with data - http://www.dba-oracle.com/t_archiving_data_in_file_structures.htm - but I am not sure whether it applies to Oracle 10g/11g.
I have seen "export table with data" features in SQL Developer, Toad for Oracle, DreamCoder for Oracle, etc., but I would need to do this one table at a time and would still have to figure out the right order of script execution manually.
Are there any tools/scripts that can utilize oracle metadata and generate DDL script with data?
Note that some of the tables have CLOB columns, so the tool/script would need to be able to handle those.
P.S. I need something similar to the "Generate Scripts" feature in SQL Server 2008, where one can tick the "script data" option and get back a self-sufficient script with DDL and data, generated in the order of table constraints. See: http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx
Thanks for your help!
Firstly, recognise that this isn't necessarily possible. A view can use a function in a package that also selects from the view. Another issue is that you might need to load data into tables and then apply constraints, even though this might be slower than the other way round.
In short, you will need to do some work here.
Work out the dependencies in your system. ALL_DEPENDENCIES is the primary mechanism.
Then use DBMS_METADATA.GET_DDL to extract the DDL statements. For small data volumes, I'd extract the constraints separately, for applying after the data load.
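A rough sketch of that extraction in SQL*Plus (the settings and the decision to pull referential constraints separately are assumptions):

SET LONG 100000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name) FROM user_tables;
-- foreign keys pulled separately, to run after the data load;
-- note GET_DEPENDENT_DDL raises an error for tables that have none
SELECT DBMS_METADATA.GET_DEPENDENT_DDL('REF_CONSTRAINT', table_name)
FROM   user_tables;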
In current versions you can create external tables to unload data from regular tables into OS files (and obviously go the other way round). But if you've got exotic datatypes (BLOB, RAW, XMLType, user-defined types...) it will be more challenging.
I suggest you evaluate the standard Oracle export and import utilities (exp/imp) here; is there a reason you won't consider them? Note that in addition you can use the "indexfile" option on the import to write the SQL statements to a file instead of actually executing them (unfortunately this doesn't include the inserts).
