Oracle backup with SEQUENCE-ID Column

Good morning,
I already know how to do Oracle backups with expdp and impdp. It is also no problem for me to remap backups from one schema to another.
But now I have something where I can find no solution.
I have a table with an ID column. This column has a default value based on a sequence.
CREATE TABLE "TABLE_NAME"
( "ID" NUMBER(*,0) DEFAULT "TABLE_NAME_ID_SEQ"."NEXTVAL",
"KEY" VARCHAR2(200 BYTE),
"VALUE" CLOB
)
After executing this statement, Oracle defines this default as "MY_USER"."TABLE_NAME_ID_SEQ"."NEXTVAL".
This is not an issue until I try to use the backup with a different schema name. The REMAP_SCHEMA parameter of impdp seems to change the schema only in the table names but not in the default values. So the imported backup creates a table with a default value that references the wrong schema.
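For illustration (OTHER_USER here is just a placeholder for the target schema), the imported table then looks roughly like this:
CREATE TABLE "OTHER_USER"."TABLE_NAME"
( "ID" NUMBER(*,0) DEFAULT "MY_USER"."TABLE_NAME_ID_SEQ"."NEXTVAL",
"KEY" VARCHAR2(200 BYTE),
"VALUE" CLOB
)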
How could I solve this?
Best regards

Related

Impossible to create trigger in a table with a reserved name in Oracle

I have a table named BLOB, and I can select, update and insert normally via SQL, but it is impossible to define a trigger for it; obviously the cause is the name.
In PL/SQL procedures I work around the problem by creating a view that selects all the columns of the table and using the view in the body, but that doesn't work for the table itself.
Does anyone have a solution? I cannot rename the table, and using a synonym is also out.
Structure of the table:
CREATE TABLE BLOB
(
BLOBID CHAR(7 BYTE) NOT NULL,
OGGETTO BLOB
)
The problem is related to the presence of a column of type BLOB in a table named BLOB, but only in PL/SQL environment (procs, triggers, ...)
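For illustration, a trigger of roughly this shape is the kind of thing the question is about (the trigger name and body are made up; only the table and columns come from the structure above):
CREATE OR REPLACE TRIGGER BLOB_BI
BEFORE INSERT ON BLOB
FOR EACH ROW
BEGIN
  -- any per-row logic; the point is only that the trigger is defined on table BLOB
  IF :NEW.OGGETTO IS NULL THEN
    :NEW.BLOBID := UPPER(:NEW.BLOBID);
  END IF;
END;
/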

Oracle 12c, export table with sequence as default value, schema is attached

I'm using Oracle 12c with a user u1. With user u1 I create a table T1 like this:
create SEQUENCE T1_SEQ START WITH 1;
CREATE table T1(
id number(11) DEFAULT T1_SEQ.nextval PRIMARY KEY ,
name varchar2(255)
);
Now I want to export the schema for others, and I use the command below to export:
expdp u1/password dumpfile=u1.dmp schemas=u1
But when others use u1.dmp to import the schema into a new user named u2,
impdp u2/password remap_schema=u1:u2 DUMPFILE=u1.DMP TABLE_EXISTS_ACTION=REPLACE
an error happens, because the default value of table T1's id column gets U1 added as a prefix:
U1.T1_SEQ.nextval
Here is the resulting CREATE TABLE statement:
CREATE table T1(
id number(11) DEFAULT U1.T1_SEQ.nextval PRIMARY KEY ,
name varchar2(255)
);
How can I remove U1's influence when moving one schema to another?
Could you give me some suggestions? Thanks in advance!
This is a known and ongoing issue according to Ask Tom.
So you can:
exclude the affected objects from the import
use the SQLFILE option to import the affected objects (a sketch of this is shown below)
amend the SQLFILE script output to point to the correct objects before running it
Please see the relevant topic.
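To make the SQLFILE route concrete, here is a rough sketch (the sqlfile name is made up, and the exact steps are an assumption about how you would apply the idea, not a definitive recipe):
impdp u2/password remap_schema=u1:u2 dumpfile=u1.dmp sqlfile=u1_ddl.sql
This writes the remapped DDL into u1_ddl.sql instead of importing anything. Edit that file so the default reads U2.T1_SEQ.NEXTVAL rather than U1.T1_SEQ.NEXTVAL, run the edited script as U2, and then import the data into the pre-created table (for example with TABLE_EXISTS_ACTION=APPEND).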

Oracle 11g - "DEFAULT ON NULL"?

I am running Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production, and I am trying to find an equivalent to 12c's "DEFAULT ON NULL" for a table. Basically I have to create a table where the requirement is that whenever someone intentionally or inadvertently passes a NULL value, it is replaced with a DEFAULT value (in this case a NUMBER equal to 1). Is there any easy way to do this in 11g? I know I could put a trigger on the table, but I would have to put in logic for every single column, and that seems ridiculous.
My table definition currently looks like this:
CREATE TABLE MYTABLE
(
FLAG NUMBER(1) DEFAULT 0
)
If I explicitly pass in null it WILL get stored. In that situation I was expecting the default value to be placed instead.
You can rely on the simple DEFAULT behavior for the columns and just not provide anything for them when inserting.
CREATE TABLE table1 (
id NUMBER(8,0) NOT NULL,
col1 NUMBER(8,0) DEFAULT 10 NOT NULL,
col2 NUMBER(8,0) DEFAULT 20 NOT NULL,
PRIMARY KEY (id)
);
INSERT INTO table1 (id) VALUES (123); -- results in a row with the default values
Also, if you're using some kind of ORM, you can use its dynamic insert and dynamic update options. That way, columns you don't supply values for on insert are set to their defaults, and on update the unchanged columns are not included in the query.
This is how you can keep explicit NULL values out of the query so the defaults apply.
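For the case the question actually asks about (an explicit NULL being passed), here is a minimal sketch of the trigger approach the asker mentions, written against the MYTABLE/FLAG definition above (the trigger name is made up):
CREATE OR REPLACE TRIGGER mytable_flag_default
BEFORE INSERT OR UPDATE OF FLAG ON MYTABLE
FOR EACH ROW
BEGIN
  -- replace an explicitly passed NULL with the intended default
  IF :NEW.FLAG IS NULL THEN
    :NEW.FLAG := 0;
  END IF;
END;
/
As the asker notes, this logic has to be repeated (or generated) for every column that should behave this way, which is exactly what DEFAULT ON NULL removes in 12c.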

Moving dependencies (PK, FK and indexes) from one table to another within the same database in Oracle

How can I move dependencies (such as PKs, FKs and indexes) from one table to another within the same database in Oracle? The second table is a copy of the first, created later for partitioning reasons. Thank you in advance! :)
You could look at using the dictionary views in Oracle, specifically the USER_CONSTRAINTS view. Then either construct a SQL statement dynamically or use the DBMS_METADATA.GET_DDL function to get the DDL for each constraint. You could then do a REPLACE on that SQL to swap the original table name and constraint name for the name of the new table and a new constraint name.
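A rough sketch of that idea (OLD_TABLE and NEW_TABLE are illustrative; note that foreign keys need the REF_CONSTRAINT object type and indexes the INDEX type in DBMS_METADATA.GET_DDL):
SELECT REPLACE(
         REPLACE(DBMS_METADATA.GET_DDL('CONSTRAINT', constraint_name),
                 '"OLD_TABLE"', '"NEW_TABLE"'),
         constraint_name, constraint_name || '_2') AS new_ddl
FROM   user_constraints
WHERE  table_name = 'OLD_TABLE'
  AND  constraint_type IN ('P', 'U', 'C');
You would spool or copy the generated statements, review them, and run them against the new table.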

Why does Char(1) change to Char(3) when copying over an Oracle DBLINK?

I have 2 databases, and I want to transport an existing table containing a CHAR column from database A to database B.
Database A is Oracle 9i, has encoding WE8ISO8859P1, and contains a table "foo" with at least 1 column of type CHAR(1 char). I can not change the table on database A because it is part of a third party setup.
Database B is my own Oracle 10g database, using encoding AL32UTF8 for all kinds of reasons, and I want to copy foo into this database.
I setup a database link from database B to database A. Then I issue the following command:
create table bar as select * from #link#.foo;
The data gets copied over nicely, but when I check the types of the columns, I notice that CHAR(1 char) has been converted into CHAR(3 char), and when querying the data in database B, it is all padded with spaces.
I think somewhere under the hood, Oracle confuses its own bytes and chars. CHAR(1 byte) is different from CHAR(1 char), etc. I've read about all that.
Why does the datatype change into a padded CHAR(3 char) and how do I stop Oracle from doing this?
Edit: It seems to have to do with transferring CHARs between two specific patch levels of Oracle 9 and 10. It looks like it is really a bug. As soon as I find out more, I'll post an update. Meanwhile: don't try to move CHARs between databases like I described. VARCHAR2 works fine (tested).
Edit 2: I found the answer and posted it here: Why does Char(1) change to Char(3) when copying over an Oracle DBLINK?
Too bad I cannot accept my own answer, even though my problem is solved.
This problem is caused by the way Oracle (mis)handles character conversions between different character sets based on the original column length definition. When you define the size of a character type column in bytes, Oracle does not know how to do a conversion and bodges it. The solution is to always define the length of a character type in characters.
For a more in-depth explanation of the problem and how I figured this out have a look at
http://www.rolfje.com/2008/11/04/transporting-oracle-chars-over-a-dblink/
You need to learn the difference between the WE8ISO8859P1 character set (which stores characters in one byte) and AL32UTF8 (which stores characters in up to four bytes). You will need to spend some quality time with the Oracle National Language Support (NLS) documentation. Oracle automatically does the conversion through the database link, in an attempt to be helpful.
Try the following from your SQL prompt:
ALTER SESSION NLS_NCHAR WE8ISO8859P1
create table bar as select * from #link#.foo;
The first thing I would try is creating the table not as a CTAS but with an explicit list of column definitions, and then trying to insert the first few thousand rows. If that didn't succeed, it would be very clear why... and you'd have quick confirmation that Thomas Low is dead-on accurate.
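A minimal sketch of that approach, combined with the character-length advice above (the db link name and column names are illustrative):
CREATE TABLE bar (
  flag_col  CHAR(1 CHAR),        -- length declared in characters, not bytes
  other_col VARCHAR2(100 CHAR)
);
INSERT INTO bar
SELECT flag_col, other_col FROM foo@my_link;
If the insert works and the column stays CHAR(1 CHAR), that points to CTAS deriving the column definition over the link, rather than to the data itself.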

Resources