Oracle Data Modeler - How to Commit DDL changes back to Database?

Initial note: I created the model by choosing to import the data dictionary using one of my connections, then choosing the schema, and lastly the tables I want to model.
After making changes within Oracle SQL Developer Data Modeler, how can I commit the changes made in the newly created relational model back to the database?
I could manually pick through the generated DDL, but that seems like unnecessary work. I tried the 'Synchronize with Data Dictionary' option; however, when I went back to my tables within my schema they had not been altered or updated in any way. None of the primary keys, foreign keys, indexes, or other DDL changes I created in the model appeared in my database. What am I missing here?
I really thought the Synchronize options were what I should be using.

We will never commit changes to the database.
You'll do the compare, review the delta DDL, and then, if you think it's good, load it up in SQLcl, SQL Developer, or SQL*Plus to run.
It's not that we don't trust you to do the review part first; it's that it would be just too easy to muck up a database if you hit the wrong button, especially as some table structural changes could result in data loss.
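For example, the compare might emit an incremental script along these lines (all object names here are invented for illustration), which you would save as, say, delta.sql, review, and then run yourself from SQLcl with @delta.sql:

-- hypothetical delta DDL produced by the compare; review before running
ALTER TABLE employees
  ADD CONSTRAINT employees_pk PRIMARY KEY (employee_id);
ALTER TABLE employees
  ADD CONSTRAINT employees_dept_fk FOREIGN KEY (department_id)
  REFERENCES departments (department_id);
CREATE INDEX employees_dept_ix ON employees (department_id);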

Related

Dynamic Audit Trigger

I want to keep logs of all tables in one single log table. Suppose a DML operation runs on any table inside the DB; that should then be logged to the one log table.
But there should be a dynamic trigger which does not hard-code the column names for every table.
Is there any solution for this?
Regards,
Somdutt Harihar
"Is there any solution for this"
No. This is not how databases work. Strongly enforced data structures are what they do, and that applies to audit tables just as much as to transaction tables.
The reason is quite clear: the time you save not writing audit code specific to each transactional table is the time you will spend writing a query to retrieve the audit records. The difference is, when you're trying to get the audit records out you will have your boss standing over your shoulder demanding to know when you can tell them what happened to the payroll records last month. Or asking how long it will take you to produce that report for the regulators, are you trying to make the company look like a bunch of clowns? You get the picture. This is not where you want to be.
Also, the performance of a single table to store all the changes to all the tables in the database? That is going to be so slow, you have no idea.
The point is, we can generate the auditing code. It is easy to write some SQL which interrogates the data dictionary and produces DDL for the target tables and triggers to populate those tables.
In fact it gets even easier in 11.2.0.4 and later, because we can use FLASHBACK DATA ARCHIVE (formerly Oracle Total Recall) to build and maintain such journalling functionality automatically, and to query it with the AS OF syntax. Find out more.
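A minimal sketch of that approach (the archive and table names are made up, and Flashback Data Archive has its own licensing and privilege requirements):

CREATE FLASHBACK ARCHIVE audit_fda TABLESPACE users RETENTION 1 YEAR;
ALTER TABLE payroll FLASHBACK ARCHIVE audit_fda;
-- query the table as it stood a day ago, no trigger code required
SELECT * FROM payroll AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' DAY);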
Okay, so technically there is a solution. You could put a trigger on each table which executes some dynamic PL/SQL to interrogate the data dictionary and assemble a piece of JSON which you stuff into your single table. The single table could be partitioned by day range and sub-partitioned by table name (assuming you have licensed the Partitioning option) to mitigate the cost of querying it.
But that is extremely complex. Running dynamic PL/SQL for every DML statement will have a bad effect on performance, which the users will notice. And this still doesn't solve the fundamental problem of retrieving the audit trail when you need it.
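For scale, here is a static sketch of the single-table JSON idea; the truly dynamic version would assemble the column list from the data dictionary at run time. All names are invented, and JSON_OBJECT needs 12.2 or later:

CREATE TABLE change_log (
  log_ts     TIMESTAMP DEFAULT SYSTIMESTAMP,
  table_name VARCHAR2(128),
  operation  VARCHAR2(6),
  payload    CLOB CHECK (payload IS JSON)
);

CREATE OR REPLACE TRIGGER emp_audit_trg
AFTER INSERT OR UPDATE OR DELETE ON emp
FOR EACH ROW
BEGIN
  INSERT INTO change_log (table_name, operation, payload)
  VALUES ('EMP',
          CASE WHEN INSERTING THEN 'INSERT'
               WHEN UPDATING  THEN 'UPDATE'
               ELSE 'DELETE' END,
          -- a real implementation would use :OLD values for deletes
          JSON_OBJECT('empno' VALUE :NEW.empno, 'ename' VALUE :NEW.ename));
END;
/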
To audit DML actions on any table, just enable such auditing with the following:
audit insert table, update table, delete table;
All such actions on tables will then be logged in the SYS.DBA_AUDIT_OBJECT view.
Auditing only records the timestamp, user, host, and other parameters, not copies of the new or old rows.
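Assuming the AUDIT_TRAIL initialization parameter is set to DB, the trail can then be queried like this:

SELECT username, obj_name, action_name, timestamp
FROM   sys.dba_audit_object
ORDER  BY timestamp DESC;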

Can EF6 use oracle database links?

We have a legacy app that I am rewriting in .NET. All of our databases are Oracle and make use of database links. Is there any way for Entity Framework 6 to generate models based on tables located on a different database?
Currently the legacy app gets data from a table like this:
SELECT * FROM emp@foo2;
where its DB connection is to database foo, which has a database link to the database foo2.
I would like to reproduce this using EF6. So far all I have found regarding this is this question.
You can do two things that EF 4 or higher will work with:
CREATE VIEW EMP AS SELECT * FROM emp@foo2;
CREATE MATERIALIZED VIEW EMP AS SELECT * FROM emp@foo2;
LOBs are not accessible across a database link without some contorted PL/SQL processing to read the LOB piece by piece.
I believe fast refresh does not work across database links, so you must consider the size of the table on the linked database. If you are refreshing a million rows, you may find the time this takes is an issue. Most large tables are full of tombstone data that never changes, so a timestamp column with the last-modified date could help you create a package that picks out only the changed data.
If you are doing complicated joins with either approach, ensure that Oracle considers the column that would be the primary key to be not null.
You can add a primary key to views and materialized views, but it must be disabled. See here for details.
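As a sketch (the view and column names are assumptions), the disabled key declaration looks like this:

CREATE OR REPLACE VIEW emp AS SELECT * FROM emp@foo2;
-- view constraints must be DISABLE NOVALIDATE; RELY lets tools trust the key
ALTER VIEW emp ADD CONSTRAINT emp_pk PRIMARY KEY (empno) RELY DISABLE NOVALIDATE;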

Oracle 11g Replication - Using refresh on commit with remote database (database links)

Good afternoon,
I have 3 databases; the SIDs are config, prod1 and prod2.
I am using Materialized Views to replicate data on 11 tables in the config database onto the other two databases. The Materialized Views are currently refreshing every five seconds but it would be ideal if they were updated on commit.
I came across this website that explains that, when replicating from a remote database, ON COMMIT is not supported.
This is what I was expecting to work:
CREATE MATERIALIZED VIEW "schema"."table" USING INDEX REFRESH FORCE ON COMMIT AS SELECT column1 FROM schema.table@config;
The method "refresh fast on demand with primary key" is suggested in the link, but obviously this is on demand. I am wondering what ideas anyone may have for getting a refresh-on-commit environment running, if that is possible?
Thanks
You can't create a materialized view refreshed on commit from a remote table. From the documentation:
Restrictions on Refreshing ON COMMIT
This clause is not supported for materialized views containing object types or Oracle-supplied types.
This clause is not supported for materialized views with remote tables.
The reason is that the database link is defined in the "child" database, not in the "parent" database. Therefore, the parent database can't possibly trigger or modify anything in the child database on its own.
If you want a 100% real-time copy of a table, I suggest a view.
If you want to replicate the data on commit, you could modify your DML procedures so that they also update the remote child tables at the same time, as sketched below.
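A sketch of both options, keeping the placeholder names from the question and assuming, for the second, that database links from config out to prod1 and prod2 exist along with an id key column:

-- 100% real-time: a plain view over the existing database link
CREATE OR REPLACE VIEW config_table_v AS
  SELECT column1 FROM schema.table@config;

-- replicate on commit: the parent's DML code pushes to the children itself
CREATE OR REPLACE PROCEDURE set_column1 (p_id NUMBER, p_val VARCHAR2) AS
BEGIN
  UPDATE schema.table       SET column1 = p_val WHERE id = p_id;
  UPDATE schema.table@prod1 SET column1 = p_val WHERE id = p_id;
  UPDATE schema.table@prod2 SET column1 = p_val WHERE id = p_id;
END;
/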

Oracle Data Modeler: add initial data to tables

I've created a relational model in Oracle SQL Developer Data Modeler.
I want to add predefined/static data that should exist in the initial clean database (enum values, fixed lists such as countries) using the modeler. My goal is to get a script via the "DDL File Editor" tool which contains not only the "create table" commands and so on, but also the "inserts" with the initial data.
Is there any way to do this?
Probably the easiest way is to put the DML into the AFTER CREATE tab under Scripts for each table, and to make sure it's included in the DDL script.
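For instance, for the countries list mentioned in the question, the AFTER CREATE script could contain something like this (table and column names assumed):

INSERT INTO countries (country_code, country_name) VALUES ('US', 'United States');
INSERT INTO countries (country_code, country_name) VALUES ('FR', 'France');
COMMIT;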

Script Oracle tables (DDL) with data insert statements into single/multiple sql files

I need to export the tables for a given schema into DDL scripts and insert statements, scripted so that the order of dependencies/constraints is maintained.
I came across this article suggesting how to archive the database with data - http://www.dba-oracle.com/t_archiving_data_in_file_structures.htm - though I am not sure whether the article is applicable to Oracle 10g/11g.
I have seen "export table with data" features in SQL Developer, Toad for Oracle, DreamCoder for Oracle, and so on, but I would need to do this one table at a time, and would still need to figure out the right order of script execution manually.
Are there any tools/scripts that can utilize oracle metadata and generate DDL script with data?
Note that some of the tables have CLOB datatype columns - so the tool/script would need to be able to handle these columns.
P.S. I need something similar to the "Generate Scripts" feature in SQL Server 2008, where one can specify the "script data" option and get back a self-sufficient script with DDL and data, generated in the order of table constraints. Please see: http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx
Thanks for your help!
Firstly, recognise that this isn't necessarily possible. A view can use a function in a package that also selects from the view. Another issue is that you might need to load data into tables and then apply constraints, even though this might be slower than the other way round.
In short, you will need to do some work here.
Work out the dependencies in your system. ALL_DEPENDENCIES is the primary mechanism.
Then use DBMS_METADATA.GET_DDL to extract the DDL statements. For small data volumes, I'd extract the constraints separately for applying after the data load.
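A sketch of that extraction, run as the schema owner; the transform call strips constraints from the table DDL so they can be applied after the load:

BEGIN
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);
END;
/
-- table DDL without constraints
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name) FROM user_tables;
-- constraints extracted separately, to run after the data load
SELECT DBMS_METADATA.GET_DDL('CONSTRAINT', constraint_name)
FROM   user_constraints WHERE constraint_type IN ('P', 'U');
SELECT DBMS_METADATA.GET_DDL('REF_CONSTRAINT', constraint_name)
FROM   user_constraints WHERE constraint_type = 'R';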
In current versions you can create external tables to unload data from regular tables into OS files (and obviously go the other way round). But if you've got exotic datatypes (BLOB, RAW, XMLTYPEs, User Defined Types....) it will be more challenging.
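For example, an ORACLE_DATAPUMP external table can unload a table into an OS file (the directory object and names here are assumptions):

CREATE OR REPLACE DIRECTORY unload_dir AS '/tmp/unload';
CREATE TABLE emp_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY unload_dir
    LOCATION ('emp.dmp'))
  AS SELECT * FROM emp;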
I suggest that you use the standard Oracle export and import utilities (exp/imp) here; is there a reason you won't consider them? Note in addition that you can use the "indexfile" option on import to write the SQL statements (unfortunately this doesn't include the inserts) to a file instead of actually executing them.
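A sketch of that flow, with placeholder credentials and file names:

exp scott/tiger owner=scott file=scott.dmp
imp scott/tiger file=scott.dmp full=y indexfile=scott_ddl.sql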
