Dropping Constraints using ONLINE - Oracle

I want to understand what happens to ongoing transactions if we drop a PRIMARY KEY using the ONLINE option. Will they still maintain data integrity? Ideally not, because we have dropped the PK. But what is the realistic scenario when dropping constraints ONLINE, and how can we prevent data conflicts while doing so?

In Oracle, you can drop the primary key but keep its index. That way you can prevent data conflicts:
ALTER TABLE TBL DROP PRIMARY KEY KEEP INDEX;
After that you can add a new primary key or unique constraint (whichever you need) and drop the former unique constraint.
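A minimal sketch of that sequence, assuming a hypothetical table TBL with primary key column ID whose surviving index is named TBL_PK_IDX:
ALTER TABLE tbl DROP PRIMARY KEY KEEP INDEX;
-- the surviving unique index continues to reject duplicate keys from ongoing transactions
ALTER TABLE tbl ADD CONSTRAINT tbl_pk_new PRIMARY KEY (id) USING INDEX tbl_pk_idx;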

Related

How to apply both ON DELETE and ON UPDATE CASCADE simultaneously in Oracle 12c?

I'm a beginner working on an Oracle 12c database. In my database project I want to apply ON DELETE and ON UPDATE CASCADE simultaneously, as I did in MySQL, but when I apply the same technique in Oracle it shows me an error. How can I do that?
There is no ON UPDATE CASCADE in Oracle. While you can argue that updating a table's primary key is valid SQL, you probably should not do it, hence Oracle's decision not to implement it.
More info here:
https://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:5773459616034
EDIT: As discussed in the comments below, think of this limitation as Oracle's way of preventing people from doing something wrong (updating primary keys).
The correct way to handle the case of a primary key that might be updated is to create a separate field that will act as the surrogate primary key. The surrogate key, of course, is immutable.
The danger of using a natural key as primary key is discussed there.
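As an illustration of the surrogate-key approach, here is a sketch only, with hypothetical names (a CUSTOMERS table currently keyed on a natural key) and assuming an Oracle 12c identity column is acceptable:
ALTER TABLE customers ADD (customer_id NUMBER GENERATED ALWAYS AS IDENTITY);  -- 12c; existing rows should be populated automatically
ALTER TABLE customers ADD CONSTRAINT customers_id_uq UNIQUE (customer_id);
-- child tables can then reference CUSTOMER_ID, while the natural key remains free to change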

How to update a column which is also a primary key?

There is a name field in the UI which is also the primary key column in the underlying table. There is a requirement to make that field editable in the UI. There should be an ID column serving as the primary key, but there isn't one, and it is no longer feasible to introduce an ID column.
Is there any alternative design that can be used in such a scenario?
The UI is in Swing and DB is Oracle.
First of all, I don't know who thought a name field could be the primary key; that is poor database design.
Yes, you had better change it to an ID column as the primary key, one that will not be updated in the future. Since you can't have multiple primary keys, you need to perform a bit of a circus here:
Drop the existing primary key first, since you can't have more than one primary key on a single table.
Create your ID column and allow NULLs.
Then update this column from a sequence.
Once your ID column is populated, create the primary key on this column.
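A minimal sketch of those steps, with hypothetical names (a PERSONS table whose current primary key PERSONS_PK sits on NAME, and a PERSON_SEQ sequence):
ALTER TABLE persons DROP CONSTRAINT persons_pk;                  -- drop the existing primary key
ALTER TABLE persons ADD (id NUMBER);                             -- new, initially nullable, ID column
CREATE SEQUENCE person_seq;
UPDATE persons SET id = person_seq.NEXTVAL;                      -- populate every row from the sequence
ALTER TABLE persons MODIFY (id NOT NULL);
ALTER TABLE persons ADD CONSTRAINT persons_pk PRIMARY KEY (id);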
You can only have one primary key, but you can have any number of unique indexes on a table. So let the existing primary key remain the immutable primary key and have the application use it internally for everything. Add another column to the table, create a unique index on it, and let the users modify that other field.
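A short sketch of that approach, with hypothetical names (a MEMBERS table keyed on NAME, plus a new DISPLAY_NAME column for the users to edit):
ALTER TABLE members ADD (display_name VARCHAR2(100));
UPDATE members SET display_name = name;                          -- seed the editable field from the current key
CREATE UNIQUE INDEX members_display_name_uq ON members (display_name);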
Another alternative would be to declare all child tables with foreign keys ON UPDATE CASCADE, so that any update to the primary key cascades to the child tables. Once implemented in production, quit the company, run fast in the other direction, and write an article about how you were the first person ever to use ON UPDATE CASCADE in a production setting.

Why does dropping a primary key not drop its unique index?

I have a table with Col1 and Col2 as a composite primary key pk_composit_key and a unique index that was automatically created for the constraint.
I then altered the table to add new column Col3.
I dropped the pk_composit_key constraint:
ALTER TABLE table_name DROP CONSTRAINT pk_composit_key;
Now, when I tried to insert records I got ORA-00001: unique constraint pk_composit_key violated.
Why am I getting that error?
When the key was dropped why wasn't the unique index dropped automatically?
You mentioned exporting and importing the schema, and if that happened in the environment that showed this behaviour it would explain what you're seeing; at least if you used legacy imp rather than the data pump impdp.
The original import documentation states the order objects are imported:
Table objects are imported as they are read from the export dump file. The dump file contains objects in the following order:
Type definitions
Table definitions
Table data
Table indexes
Integrity constraints, views, procedures, and triggers
Bitmap, function-based, and domain indexes
So the unique index would have been imported, then the constraint would have been created.
When you drop a primary key constraint:
If the primary key was created using an existing index, then the index is not dropped.
If the primary key was created using a system-generated index, then the index is dropped.
Because of the import order, the constraint is using an existing index, so the first case applies, and the index is retained when the constraint is dropped.
You can use the drop index clause to drop the index even if it wasn't created automatically:
ALTER TABLE table_name DROP CONSTRAINT pk_composit_key DROP INDEX;
See also My Oracle Support note 370633.1; and 1455492.1 suggests similar behaviour will occur with data pump import as well. I'm not aware of any way to check if an index is associated with a constraint at this level; there is no difference in the dba_constraints or dba_indexes views when you create the index manually or automatically. Including drop index will make it consistent though.
It depends on how the unique index was created. Below are the various scenarios and their behaviour:
1) You first create a unique index (on the columns on which the primary key is to be defined) and then add the primary key constraint. In this situation the DDL that adds the primary key will use the existing unique index, so when you drop the primary key only the constraint is dropped, not the index. This is probably your situation.
2) You define the primary key while creating the table, or you add the primary key when there is no existing unique index on the column(s) concerned. The system then creates a unique index and uses it for the primary key, so in this case dropping the primary key also drops the unique index.
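A sketch of case 1, with hypothetical names (table T, index T_PK_IDX, constraint T_PK):
CREATE UNIQUE INDEX t_pk_idx ON t (col1, col2);
ALTER TABLE t ADD CONSTRAINT t_pk PRIMARY KEY (col1, col2);      -- reuses the existing t_pk_idx
ALTER TABLE t DROP CONSTRAINT t_pk;                              -- t_pk_idx survives and still raises ORA-00001
-- to remove the index at the same time, use instead:
-- ALTER TABLE t DROP CONSTRAINT t_pk DROP INDEX;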

Oracle SQL Data Modeler missing a PRIMARY KEY on DDL script export

The diagram has over 40 tables, most of them have a primary key defined.
For some reason there is this one table, which has a primary key defined, but that's being ignored when I export the model to a DDL script.
This is the "offending" key (even though it's checked, it is nowhere to be found in the generated DDL script):
Has anybody had the same problem? Any ideas on how to solve it?
[EDIT] This is where the key is defined:
And this is the DDL preview (yes, the primary key shows up there):
This is what happens if I try to generate the DDL for just that table (primary key still not generated):
I was finally able to identify and reproduce the problem.
It was a simple conflict of constraints.
Table MIEMBROS had a mandatory 1 to n relationship (foreign key) from another table on its primary key column and vice-versa (there was a foreign key on MIEMBROS against the other table's primary key).
This kind of relationship between two tables makes it impossible to add a record to either of them: the insert operation will return an error complaining about the foreign key restriction pointing to the other table.
Anyway, I realized that one of the relationships was 0 to n, so I simply unchecked the "mandatory" checkbox on the foreign key definition and everything went fine.
So, in a nutshell: the Data Modeler "fails" silently, by not generating the primary key of one of the tables, if you define a mutual relationship (two foreign keys, one on each table against the other table) on non-nullable unique columns.
Such an odd behavior, if you ask me!
"This kind of relationship between two tables makes it impossible to add a record to either of them: the insert operation will return an error complaining about the foreign key restriction pointing to the other table."
Actually, if you have deferrable constraints, this is not impossible. The constraints can be enforced, for example, at commit time rather than immediately at insert time.
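A sketch of that, keeping MIEMBROS from the question and using a hypothetical second table GRUPOS:
ALTER TABLE miembros ADD CONSTRAINT miembros_grupos_fk
  FOREIGN KEY (grupo_id) REFERENCES grupos (id)
  DEFERRABLE INITIALLY DEFERRED;
-- within a single transaction, rows can now be inserted into both tables in either order;
-- the foreign key is only checked at COMMIT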
From the Data Modeler menu under File, I used Export -> DDL File. The keys appeared in the DDL, then when I went back to the diagram and did DDL Preview, it showed all the missing stuff.

Create constraint in alter table without checking existing data

I'm trying to create a constraint on the OE.PRODUCT_INFORMATION table which is delivered with Oracle 11g R2.
The constraint should make the PRODUCT_NAME unique.
I've tried it with the following statement:
ALTER TABLE PRODUCT_INFORMATION
ADD CONSTRAINT PRINF_NAME_UNIQUE UNIQUE (PRODUCT_NAME);
The problem is that in OE.PRODUCT_INFORMATION there are already product names that exist more than once.
Executing the code above throws the following error:
an alter table validating constraint failed because the table has duplicate key values.
Is there a way to make a newly created constraint not apply to existing table data?
I've already tried the DISABLE keyword, but when I enable the constraint I receive the same error message.
You can certainly create a constraint which will validate any newly inserted or updated records, but which will not be validated against old existing data, using the NOVALIDATE keyword, e.g.:
ALTER TABLE PRODUCT_INFORMATION
ADD CONSTRAINT PRINF_NAME_UNIQUE UNIQUE (PRODUCT_NAME)
NOVALIDATE;
If there is no index on the column, this command will create a non-unique index on the column.
If you are looking to enforce some sort of uniqueness for all future entries whilst keeping your current duplicates, you cannot use a UNIQUE constraint.
You could use a trigger on the table to check the value to be inserted against the current table values and if it already exists, prevent the insert.
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14251/adfns_triggers.htm
Or you could just remove the duplicate values and then enforce your UNIQUE constraint.
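For the second option, here is a sketch of one common way to remove the duplicates, keeping the row with the lowest ROWID per product name:
DELETE FROM product_information p
 WHERE p.ROWID > (SELECT MIN(p2.ROWID)
                    FROM product_information p2
                   WHERE p2.product_name = p.product_name);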
EDIT: After Jonearles and Jeffrey Kemp's comments, I'll add that you can actually enable a unique constraint on a table with duplicate values present using the NOVALIDATE clause but you'd not be able to have a unique index on that constrained column.
See Tom Kyte's explanation here.
However, I would still worry about how obvious the intent was to future people who have to support the database. From a support perspective, it'd be more obvious to either remove the duplicates or use the trigger to make your intent clear.
YMMV
You can use a DEFERRABLE constraint:
ALTER TABLE PRODUCT_INFORMATION
ADD CONSTRAINT PRINF_NAME_UNIQUE UNIQUE (PRODUCT_NAME)
DEFERRABLE INITIALLY DEFERRED NOVALIDATE;
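Once the existing duplicates have eventually been cleaned up, the constraint could later be validated so that all rows are checked (a hypothetical follow-up step):
ALTER TABLE PRODUCT_INFORMATION MODIFY CONSTRAINT PRINF_NAME_UNIQUE VALIDATE;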
