I've got an Oracle DB with all the character columns defined as NVARCHAR2, NCHAR, or NCLOB, using the UTF-16 character set.
Now I want to migrate to a new DB whose character set is UTF-8. Since that can also store Unicode characters, I'm wondering whether I will be able to import the data while converting the column types.
The reason I'm doubtful is that I know I cannot convert an NVARCHAR2 column to VARCHAR2 unless it is empty.
What is the best option to perform the import? Will Data Pump complain if I import the schema, modify the column types, and only then import the data?
Thanks
Yes, Data Pump would be unimpressed with your proposed solution. You need to use TO_CHAR() or CAST(mynvarchar2 AS VARCHAR2) to convert the datatypes after the import or before the export. As far as I know, any characters that can't be converted will be turned into '?', so you have to be careful not to convert any data that contains non-English-alphabet characters.
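For example, here is a rough sketch of the before-the-export route, with hypothetical table and column names; it copies the data into a plain VARCHAR2 column via CAST so you can export the copy instead:
-- hypothetical names: converts the national-charset column to VARCHAR2
-- in a copy of the table, which you would then export with expdp
create table mytable_converted as
select myprimarykey,
       cast(mynvarchar2 as varchar2(4000)) as myvarchar2
from   mytable;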
It will probably be easiest to just import the data as is into a different schema, then convert the data while copying it to the correct, modified schema, i.e.:
impdp remap_schema=myschema:tempschema
insert into myschema.mytable select myprimarykey, to_char(mynvarchar2) from tempschema.mytable;
But this will obviously use up twice the space in your database.
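If you want to find rows that would actually lose characters, one rough check (hypothetical names again, not a definitive recipe) is to compare each value against a round trip through the database character set; rows that come back different contain characters the target character set cannot represent:
-- values that do not survive the round trip would be mangled by the conversion
select myprimarykey
from   tempschema.mytable
where  to_nchar(to_char(mynvarchar2)) <> mynvarchar2;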
Related
Does an import from a WE8ISO8859P1 or IW8ISO8859P8 dmp file into an AL32UTF8 Oracle database avoid the truncation problem of string fields when changing the character set of a database?
If a table field was defined as varchar2(10) in the source database, will it be imported as varchar2(10 CHAR) or as it was originally defined?
Thanks in advance
If you do it properly, Oracle will translate from whatever character set your file is in to the database one. SQL*Loader, for example, lets you specify the character set of the file. If it is a dmp file (i.e. coming from exp or expdp), then the importing tool, imp or impdp, will do the conversion. Problems occur only if the character sets are not compatible, e.g. importing WE8ISO data into an ASCII database.
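For instance, a minimal SQL*Loader control file (hypothetical file, table, and column names) that declares the character set of the input file, so the conversion to the database character set happens on load:
-- sketch of a control file; CHARACTERSET names the encoding of mydata.dat
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE 'mydata.dat'
APPEND
INTO TABLE mytable
FIELDS TERMINATED BY ','
(id, name)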
We are using Oracle, and we have a requirement to allow Greek characters to be stored in the DB. Currently, our DB instance doesn't let us insert Greek characters such as 'ϕ'. On googling, I found that this has to do with the character set. Our NLS_CHARACTERSET is WE8MSWIN1252, which doesn't support Greek characters. I would have to change the character set to one of AL32UTF8, UTF8, AL16UTF16 or WE8ISO8859P7 for this to work. Now that we already have so much data in the DB, it would be a risk to change the character set.
The other option I have is to change the column type (the one used to store the Greek text) from CLOB or VARCHAR2 to NVARCHAR2, and that works fine.
Before changing the column type, I want to know what risks are involved in changing a column from CLOB to NVARCHAR2 and what I need to keep in mind before making the change.
Also, I would like to know the pros and cons of changing my existing character set to AL32UTF8.
EDIT:
There is also the option of changing CLOB to NCLOB, and this seems less risky as the two are closely related (almost the same) types. Please let me know the pros and cons of changing CLOB to NCLOB as well.
OK, I was googling and posting questions in other forums, and I got the much-needed answer here:
https://www.toolbox.com/tech/oracle/question/migrating-clob-to-nclob-010917/
So I just encountered this myself, and I had issues with the above solution, as it didn't copy across the foreign keys and constraints of my other columns. To get around this, I did the following:
1) I created a new column for my NCLOB data:
ALTER TABLE table_name
ADD new_table_column NCLOB;
2) I then copied my CLOB data into my new NCLOB column using the TO_NCLOB() function:
UPDATE table_name
SET new_table_column = TO_NCLOB(old_table_column);
3) Finally, I dropped the old column and renamed the new column to the old column name:
ALTER TABLE table_name
DROP COLUMN old_table_column;
ALTER TABLE table_name
RENAME COLUMN new_table_column TO old_table_column;
Please make sure you back up your data if you do this, though, as dropping the column will get rid of it, and the DDL will commit any open transactions.
I did this in Oracle, so the syntax may differ slightly in other SQL dialects.
I am trying to import an Excel file into an Oracle table via SQL Developer. One of the Oracle columns is of type CLOB, and during the verification step of the import wizard, I get the following message in the information column: "Data Types CLOB, not supported for import." The data fields I am attempting to import into the CLOB column are empty. Does anybody have any idea what might be wrong? Thanks.
If it is not possible, how can I import/export CLOB data in Oracle?
You just need to use a more recent copy of SQL Developer. We support importing into a CLOB field from Excel now.
And then, when it's over, check the data...
If I change an existing column's type from VARCHAR2 to NVARCHAR2 in Oracle, will Oracle automatically convert the existing column data between the character sets, or should I do it myself?
I'm using Oracle 11g; the VARCHAR2 character set is WE8MSWIN1252 and the NVARCHAR2 character set is AL16UTF16.
You can use the DBMS_REDEFINITION package to change a VARCHAR2 column to NVARCHAR2 in a table online; a rough sketch follows the links below.
These links might be helpful:
Using Online Table Redefinition to Migrate a Large Table to Unicode
Also see the documentation on General Character Set Migration.
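For illustration only, a rough sketch of that flow with hypothetical schema, table, and column names (SCOTT.MYTAB with an ID primary key and a NAME column to convert); the linked documentation covers the full procedure and its caveats:
-- 1) check the table can be redefined online (using its primary key)
BEGIN
  DBMS_REDEFINITION.CAN_REDEF_TABLE('SCOTT', 'MYTAB',
                                    DBMS_REDEFINITION.CONS_USE_PK);
END;
/
-- 2) create an interim table with the target NVARCHAR2 column
CREATE TABLE SCOTT.MYTAB_INT (
  ID   NUMBER,
  NAME NVARCHAR2(100));
-- 3) start the redefinition, converting the column with TO_NCHAR
BEGIN
  DBMS_REDEFINITION.START_REDEF_TABLE(
    uname       => 'SCOTT',
    orig_table  => 'MYTAB',
    int_table   => 'MYTAB_INT',
    col_mapping => 'ID ID, TO_NCHAR(NAME) NAME');
END;
/
-- 4) copy indexes, triggers, and constraints, then swap the tables
DECLARE
  l_errors PLS_INTEGER;
BEGIN
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS(
    uname      => 'SCOTT',
    orig_table => 'MYTAB',
    int_table  => 'MYTAB_INT',
    num_errors => l_errors);
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('SCOTT', 'MYTAB', 'MYTAB_INT');
END;
/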
We have a table in Oracle 11g with a VARCHAR2 column. We use a proprietary programming language where this column is defined as a string. At most we can store 2000 characters (4000 bytes) in this column. Now the requirement is that the column needs to store more than 2000 characters (in fact, an unlimited number of characters). The DBAs don't like the BLOB or LONG datatypes for maintenance reasons.
The solution that I can think of is to remove this column from the original table and have a separate table for this column, storing the text across multiple rows in order to hold an unlimited number of characters. This table will be joined with the original table for queries.
Is there any better solution to this problem?
UPDATE: The proprietary programming language allows defining variables of type string and blob; there is no CLOB option. I understand the responses given, but I cannot take on the DBAs. I understand that deviating from BLOB or LONG will be a developers' nightmare, but I still cannot help it.
UPDATE 2: If the maximum I need is 8000 characters, can I just add 3 more columns, so that I will have 4 columns with 2000 characters each to get 8000 characters? When the first column is full, values would spill over to the next column, and so on. Will this design have any bad side effects? Please suggest.
If a BLOB is what you need, convince your DBA that it's what you need. Those datatypes are there for a reason, and any roll-your-own implementation will be worse than the built-in type.
Also, you might want to look at the CLOB type, as it will meet your needs quite well.
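If a CLOB is allowed, a minimal sketch with hypothetical names; a VARCHAR2 column can't be altered to CLOB in place, so the usual route is to add a new column, copy the data, and swap the names:
ALTER TABLE my_table ADD (required_text_clob CLOB);
UPDATE my_table SET required_text_clob = TO_CLOB(required_text);
ALTER TABLE my_table DROP COLUMN required_text;
ALTER TABLE my_table RENAME COLUMN required_text_clob TO required_text;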
You could follow the way Oracle stores stored procedure source in the data dictionary. Define a table to hold the text in numbered lines:
CREATE TABLE MY_TEXT (
    IDENTIFIER INT,
    LINE       INT,
    TEXT       VARCHAR2(4000),
    PRIMARY KEY (IDENTIFIER, LINE));
The IDENTIFIER column is the foreign key to the original table. LINE is a simple integer (not a sequence) that keeps the text chunks in order. This allows storing arbitrarily large amounts of text.
Yes, this is not as efficient as a BLOB, CLOB, or LONG (I would avoid LONG fields if at all possible). Yes, this requires more maintenance. But if your DBAs are dead set against managing CLOB fields in the database, this is option two.
EDIT:
MY_TABLE below is where you currently have the VARCHAR2 column you are looking to expand. I would keep it in the table for the short text fields.
CREATE TABLE MY_TABLE (
    IDENTIFIER    INT,
    OTHER_FIELD   VARCHAR2(10),
    REQUIRED_TEXT VARCHAR2(4000),
    PRIMARY KEY (IDENTIFIER));
Then write the query that pulls the data by joining the two tables, ordering by LINE in MY_TEXT. Your application will need to split the string into 2000-character chunks and insert them in line order.
I would do this in a PL/SQL procedure, for both insert and select. PL/SQL VARCHAR2 strings can be up to 32K characters, which may or may not be large enough for your needs.
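A rough sketch of that, assuming the MY_TABLE/MY_TEXT layout above and hypothetical procedure names; it splits the incoming string into 2000-character chunks on insert and reassembles them in line order on read:
CREATE OR REPLACE PROCEDURE SAVE_TEXT (
    P_IDENTIFIER IN INT,
    P_TEXT       IN VARCHAR2) AS   -- up to 32K characters in PL/SQL
BEGIN
    DELETE FROM MY_TEXT WHERE IDENTIFIER = P_IDENTIFIER;
    IF P_TEXT IS NOT NULL THEN
        -- one row per 2000-character chunk, numbered to preserve order
        FOR I IN 0 .. TRUNC((LENGTH(P_TEXT) - 1) / 2000) LOOP
            INSERT INTO MY_TEXT (IDENTIFIER, LINE, TEXT)
            VALUES (P_IDENTIFIER, I + 1, SUBSTR(P_TEXT, I * 2000 + 1, 2000));
        END LOOP;
    END IF;
END;
/
CREATE OR REPLACE FUNCTION LOAD_TEXT (P_IDENTIFIER IN INT)
    RETURN VARCHAR2 AS
    L_RESULT VARCHAR2(32767);
BEGIN
    -- reassemble the chunks in line order
    FOR R IN (SELECT TEXT FROM MY_TEXT
              WHERE IDENTIFIER = P_IDENTIFIER
              ORDER BY LINE) LOOP
        L_RESULT := L_RESULT || R.TEXT;
    END LOOP;
    RETURN L_RESULT;
END;
/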
But like every other person answering this question, I would strongly suggest making a case to the DBA to make the column a CLOB. From the program perspective this will be a BLOB and therefore simple to manage.
You said no BLOB or LONG... but what about CLOB? It holds up to 4 GB of character data.
BLOB is the best solution. Anything else will be less convenient and a bigger maintenance annoyance.
Is BFILE a viable alternative datatype for your DBAs?
I don't get it. A CLOB is the appropriate database datatype. If your weird programming language can deal with strings of 8000 (or whatever) characters, what stops it from writing those to a CLOB?
More specifically, what error do you get (from Oracle or your programming language) when you try to insert an 8000-character string into a column defined as a CLOB?