How to store unlimited characters in Oracle 11g?

We have a table in Oracle 11g with a VARCHAR2 column. We use a proprietary programming language where this column is defined as a string. At most we can store 2000 characters (4000 bytes) in this column. Now the requirement is that the column must store more than 2000 characters (in fact, an unlimited number of characters). The DBAs don't like the BLOB or LONG datatypes for maintenance reasons.
The solution I can think of is to remove this column from the original table, have a separate table for it, and store the characters across multiple rows to get unlimited length. This table would be joined with the original table for queries.
Is there any better solution to this problem?
UPDATE: The proprietary programming language allows defining variables of type string and blob; there is no option for CLOB. I understand the responses given, but I cannot take on the DBAs. I understand that deviating from BLOB or LONG will be a nightmare for developers, but I still cannot help it.
UPDATE 2: If the maximum I need is 8000 characters, can I just add 3 more columns, so that I have 4 columns of 2000 characters each to get 8000 characters? When the first column is full, values would spill over to the next column, and so on. Will this design have any bad side effects? Please suggest.

If a BLOB is what you need, convince your DBA that it's what you need. Those data types are there for a reason, and any roll-your-own implementation will be worse than the built-in type.
Also, you might want to look at the CLOB type, as it will meet your needs quite well.
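For illustration, a minimal sketch of what that could look like (the table and column names are made up):

CREATE TABLE DOCUMENTS (
  DOC_ID   INT PRIMARY KEY,
  DOC_TEXT CLOB);   -- holds up to 4 GB of character data
INSERT INTO DOCUMENTS (DOC_ID, DOC_TEXT)
  VALUES (1, 'text of effectively unlimited length');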

You could follow the way Oracle stores stored-procedure source in its data dictionary (for example, the USER_SOURCE view). Define a text table like this:
CREATE TABLE MY_TEXT (
IDENTIFIER INT,
LINE INT,
TEXT VARCHAR2(4000),
PRIMARY KEY (IDENTIFIER, LINE));
The IDENTIFIER column is the foreign key to the original table. LINE is a simple integer (not a sequence) that keeps the text chunks in order. This allows storing arbitrarily large amounts of text.
Yes, this is not as efficient as a BLOB, CLOB, or LONG (I would avoid LONG fields if at all possible). Yes, it requires more maintenance. But if your DBAs are dead set against managing CLOB fields in the database, this is option two.
EDIT:
MY_TABLE below is where you currently have the VARCHAR2 column you are looking to expand. I would keep it in the table for the short text values.
CREATE TABLE MY_TABLE (
IDENTIFIER INT,
OTHER_FIELD VARCHAR2(10),
REQUIRED_TEXT VARCHAR2(4000),
PRIMARY KEY (IDENTIFIER));
Then write the query that pulls the data by joining the two tables, ordering by LINE in MY_TEXT. Your application will need to split the string into 2000-character chunks and insert them in line order.
I would do this in a PL/SQL procedure, both insert and select. PL/SQL VARCHAR2 strings can be up to 32K characters, which may or may not be large enough for your needs. A minimal sketch of the insert side follows below.
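For example, a sketch of the insert side, assuming the MY_TEXT table above (the procedure name and the 2000-character chunk size are illustrative):

CREATE OR REPLACE PROCEDURE SAVE_TEXT (
  P_IDENTIFIER IN INT,
  P_TEXT       IN VARCHAR2)  -- PL/SQL allows up to 32767 characters here
IS
  C_CHUNK CONSTANT PLS_INTEGER := 2000;
  V_LINE  PLS_INTEGER := 1;
  V_POS   PLS_INTEGER := 1;
BEGIN
  -- Replace any existing text for this identifier.
  DELETE FROM MY_TEXT WHERE IDENTIFIER = P_IDENTIFIER;
  -- Store one 2000-character chunk per row, in line order.
  WHILE V_POS <= LENGTH(P_TEXT) LOOP
    INSERT INTO MY_TEXT (IDENTIFIER, LINE, TEXT)
      VALUES (P_IDENTIFIER, V_LINE, SUBSTR(P_TEXT, V_POS, C_CHUNK));
    V_POS  := V_POS + C_CHUNK;
    V_LINE := V_LINE + 1;
  END LOOP;
END SAVE_TEXT;
/

Reading the text back is the reverse: select the rows ordered by LINE and concatenate them in the application (or in PL/SQL).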
But like every other person answering this question, I would strongly suggest making a case to the DBAs to make the column a CLOB. From the program's perspective this will look like a BLOB and will therefore be simple to manage.

You said no BLOB or LONG... but what about CLOB? It holds up to 4 GB of character data.

BLOB is the best solution. Anything else will be less convenient and a bigger maintenance annoyance.

Is BFILE a viable alternative datatype for your DBAs?

I don't get it. A CLOB is the appropriate database datatype. If your unusual programming language can deal with strings of 8000 (or whatever) characters, what stops it from writing those to a CLOB?
More specifically, what error do you get (from Oracle or your programming language) when you try to insert an 8000-character string into a column defined as a CLOB?
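A quick way to check, as a sketch (the table name is made up; RPAD builds the test strings):

CREATE TABLE CLOB_TEST (ID INT PRIMARY KEY, BODY CLOB);
-- Two 4000-character halves concatenated as CLOBs give an 8000-character value.
INSERT INTO CLOB_TEST (ID, BODY)
  VALUES (1, TO_CLOB(RPAD('x', 4000, 'x')) || TO_CLOB(RPAD('y', 4000, 'y')));
SELECT DBMS_LOB.GETLENGTH(BODY) FROM CLOB_TEST;  -- should report 8000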

Related

Oracle change column type from CLOB to NCLOB

We are using Oracle, and we have a requirement to allow Greek characters to be stored in the DB. Currently, our DB instance doesn't let us insert Greek characters such as 'ϕ'. On googling, I found that this has to do with the character set. My instance uses NLS_CHARACTERSET WE8MSWIN1252, which doesn't support Greek characters. I would have to change the character set to one of AL32UTF8, UTF8, AL16UTF16, or WE8ISO8859P7 for it to work. Now that we already have so much data in the DB, it would be a risk to change the character set.
The other option I have is to change the type of the column (used to store Greek text) from CLOB or VARCHAR2 to NVARCHAR2, and that works fine.
Before changing the column type, I want to know the risks involved in changing a column from CLOB to NVARCHAR2 and the things I need to keep in mind before changing.
Also, I would like to know the pros and cons of changing my existing character set to AL32UTF8.
EDIT:
There is also the option of changing CLOB to NCLOB, which seems less risky, as the two are closely related (almost the same) types. Please let me know the pros and cons of changing CLOB to NCLOB.
OK, I was googling and posting questions in other forums and got the much-needed answer here:
https://www.toolbox.com/tech/oracle/question/migrating-clob-to-nclob-010917/
So I just encountered this myself, and I had issues with the above solution, as it didn't copy across the foreign keys and constraints of my other columns. To get around this, I did the following:
1) Created a new column for my NCLOB data:
ALTER TABLE table_name
ADD new_table_column NCLOB;
2) Copied my CLOB data into the new NCLOB column using the TO_NCLOB() function:
UPDATE table_name
SET new_table_column = TO_NCLOB(old_table_column);
3) Finally, dropped the old column and renamed the new column to the old column name:
ALTER TABLE table_name
DROP COLUMN old_table_column;
ALTER TABLE table_name
RENAME COLUMN new_table_column TO old_table_column;
Please make sure you back up your data before doing this, though: dropping the column gets rid of it, and DDL statements implicitly commit any pending transactions.
I also did this on Oracle, so the syntax may differ slightly in other SQL dialects.

Is it ok to define multiple NCLOB columns in an oracle table?

I've got to store multiple text fields of variable length in an Oracle database. I need to define them as columns of the same table to be able to order the results when I query it.
I can't know the maximum size of the field contents; most will be less than 100 characters, but some could be thousands of characters long. Furthermore, the number of fields changes dynamically.
I was thinking of defining a table with multiple NCLOB columns, which would let me store anything in them (very short and very long texts), but I wonder if this is the right design.
Summary:
Variable number of fields (metadata of the same object)
Variable length of the content
I need to order the results
When you need a variable number of fields, it's better to split the table into a parent and a child table; then you can effectively have any number of fields. You can add an order column to store ordering information, query by joining the two tables, and use an ORDER BY clause to order the result. You can also add a foreign key constraint to ensure the relationship and data integrity. A sketch follows below.
For variable-length content, you can use VARCHAR2 (or NVARCHAR2) to store the text data. VARCHAR2 can hold up to 4000 bytes. If you know the content can be longer than 4000 bytes, you should use CLOB (or NCLOB).
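A minimal sketch of that layout (all names are illustrative):

CREATE TABLE OBJECTS (
  OBJECT_ID INT PRIMARY KEY);
CREATE TABLE OBJECT_FIELDS (
  OBJECT_ID INT REFERENCES OBJECTS (OBJECT_ID),  -- relationship and integrity
  FIELD_SEQ INT,                                 -- ordering information
  CONTENT   NVARCHAR2(2000),                     -- or NCLOB for very long texts
  PRIMARY KEY (OBJECT_ID, FIELD_SEQ));
-- Query in order by joining the two tables:
SELECT F.CONTENT
  FROM OBJECTS O
  JOIN OBJECT_FIELDS F ON F.OBJECT_ID = O.OBJECT_ID
 ORDER BY O.OBJECT_ID, F.FIELD_SEQ;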

What is space allocated for NULL values of an Oracle column (especially CLOB)

We have a report (simple text) to store in Oracle. The average case will be less than 4K, but some cases exceed that, so one option is to use a CLOB.
It is for logging purposes only, not used in queries or updates: insert once, retrieve a few times.
Space and the performance of the overall schema (other tables) are the main concerns.
I have read about the CLOB storage allocation format.
We are contemplating using two columns, msgV VARCHAR2(4000) and msgC CLOB. When the text exceeds 4K we store it in the CLOB; otherwise we use the usual VARCHAR2 and the CLOB remains NULL.
So my questions are:
Is this scheme better w.r.t. the above performance considerations, or should we simply use a CLOB (apart from the extra coding work of maintaining this condition everywhere)?
And what space is consumed by NULL and empty CLOBs (or any other datatype)?
Use a CLOB. If the data in the CLOB is under 4000 bytes, it will actually be stored inline. See the section comparing LOBs to LONG in the link below.
Oracle Lob
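As a sketch, the inline behavior can be made explicit with the (default) ENABLE STORAGE IN ROW option; the names here are illustrative:

CREATE TABLE REPORT_LOG (
  LOG_ID INT PRIMARY KEY,
  MSG    CLOB)
  LOB (MSG) STORE AS (ENABLE STORAGE IN ROW);
-- Values up to roughly 4000 bytes are stored in the row itself;
-- larger values move to the LOB segment automatically.
-- A NULL CLOB consumes essentially no space.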

Characters spilled over multiple columns in Oracle 11g?

This is related to the question: How to store unlimited characters in Oracle 11g?
If the maximum I need is 8000 characters, can I just add 3 more VARCHAR2 columns, so that I have 4 columns of 2000 characters each to get 8000 characters? When the first column is full, values would spill over to the next column, and so on. Will this design have any bad side effects? Please suggest.
Why not just use a CLOB column instead? I read your other question, and I don't understand why your DBAs don't like these types of columns. CLOB is an important feature of Oracle for this exact purpose. Your company paid good money for that feature when buying Oracle, so why not use Oracle to its fullest instead of coming up with hacks to do something the DB is not designed to do?
Maybe instead of spending time devising hacks to overcome limitations created by your DBAs, you should spend some time educating your DBAs on why CLOBs are the right feature to solve your problem.
I would never be satisfied with a bad design when the DB has the feature I need to make a good design. If the DBAs are the problem, then they need to change their viewpoint, or you should go to senior management, in my opinion.
I agree with dcp that you should be using a CLOB. But if, against all sense, you are forced to "roll your own" unlimited text using just VARCHAR2 columns, then I would not do it by adding more and more VARCHAR2 columns to the table like this:
create table mytable
( id integer primary key
, text varchar2(2000)
, more_text varchar2(2000)
, and_still_more_text varchar2(2000)
);
Instead I would move the text to a separate child table like this:
create table mytable
( id integer primary key
);
create table mytable_text
( id integer references mytable(id)
, seqno integer
, text varchar2(2000)
, primary key (id, seqno)
);
Then you can insert as much text as you like for each mytable row, using many rows in mytable_text.
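For example, reading one row's full text back in order (a sketch using the tables above):

select t.seqno, t.text
  from mytable m
  join mytable_text t on t.id = m.id
 where m.id = :given_id   -- bind variable for the row you want
 order by t.seqno;

The application then concatenates the pieces in seqno order.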
To add to DCP's and Tony's excellent answers:
You ask if the approach you are proposing will have any bad side effects. Here are a few things to consider:
Suppose you want to search your text data for a particular string. Your approach requires repeating the search on each column containing your text, which results in a convoluted and inefficient WHERE clause; see the sketch after this list.
Every time you want to expand your text field, you have to add another column, which means modifying every place you coded to do (1).
Everyone who has to maintain this structure after you will curse your name ;-)
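For illustration, point (1) in practice: a search across the spill-over columns from the earlier example would look something like this.

select id
  from mytable
 where text like '%needle%'
    or more_text like '%needle%'
    or and_still_more_text like '%needle%';
-- ...plus a new predicate every time a column is added,
-- and a match split across two columns is never found at all.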

Why does Char(1) change to Char(3) when copying over an Oracle DBLINK?

I have 2 databases, and I want to transport an existing table containing a CHAR column from database A to database B.
Database A is Oracle 9i, uses the encoding WE8ISO8859P1, and contains a table "foo" with at least one column of type CHAR(1 char). I cannot change the table on database A because it is part of a third-party setup.
Database B is my own Oracle 10g database, using the encoding AL32UTF8 for all kinds of reasons, and I want to copy foo into this database.
I setup a database link from database B to database A. Then I issue the following command:
create table bar as select * from #link#.foo;
The data gets copied over nicely, but when I check the column types, I notice that CHAR(1 char) has been converted to CHAR(3 char), and when querying the data in database B, it is all padded with spaces.
I think that somewhere under the hood, Oracle confuses its own bytes and chars. CHAR(1 byte) is different from CHAR(1 char), etc.; I've read about all that.
Why does the datatype change into a padded CHAR(3 char), and how do I stop Oracle from doing this?
Edit: It seems to have to do with transferring CHARs between two specific patch levels of Oracle 9 and 10. It really looks like a bug; as soon as I find out more, I'll post an update. Meanwhile: don't try to move CHARs between databases as I described. VARCHAR2 works fine (tested).
Edit 2: I found the answer and posted it here: Why does Char(1) change to Char(3) when copying over an Oracle DBLINK?
Too bad I cannot accept my own answer, even though my problem is solved.
This problem is caused by the way Oracle (mis)handles character conversions between different character sets, based on the original column length definition. When you define the size of a character-type column in bytes, Oracle does not know how to do the conversion and bodges it. The solution is to always define the length of a character-type column in characters.
For a more in-depth explanation of the problem and how I figured this out, have a look at
http://www.rolfje.com/2008/11/04/transporting-oracle-chars-over-a-dblink/
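A minimal sketch of the difference (the table and column names are illustrative):

-- Byte semantics: one byte, which breaks when the target
-- character set needs more bytes per character.
create table foo_bytes (code char(1 byte));
-- Character semantics: one character, however many bytes it takes.
create table foo_chars (code char(1 char));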
You need to understand the difference between the WE8ISO8859P1 character set (which stores characters in one byte) and AL32UTF8, which stores characters in up to four bytes. You will need to spend some quality time with the Oracle National Language Support (NLS) documentation. Oracle automatically does the conversion through the database link, in an attempt to be helpful.
Try the following from your SQL prompt:
ALTER SESSION SET NLS_NCHAR = 'WE8ISO8859P1';
create table bar as select * from #link#.foo;
The first thing I would try is creating the table not with a CTAS but with an explicit list of column definitions, then performing an insert of the first few thousand rows. If that didn't succeed, it would be very clear why... and you'd have quick confirmation that Thomas Low is dead-on accurate. A sketch of that test follows below.
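As a sketch, assuming hypothetical column names and keeping the #link# placeholder from the question:

-- Define the columns explicitly instead of letting CTAS infer them.
create table bar (
  id   number,
  flag char(1 char));
-- Copy a first batch and see whether the CHAR data arrives intact.
insert into bar (id, flag)
select id, flag from #link#.foo where rownum <= 1000;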
