Oracle: query with 'N' function - which datatype should I use?

I have created an Oracle table with an indexed VARCHAR2 column called 'ID'.
A piece of software I'm using reads this table, but instead of running queries like
select * from table_name where ID='something'
it runs (note the extra "N" before the value)
select * from table_name where ID=N'something'
which causes some kind of character conversion.
The problem is that while the first query performs an index range scan, the second performs a full table scan.
Since I cannot modify the queries that this software runs, which datatype should I use instead of VARCHAR2 so that the conversion caused by the 'N' prefix does not force a full table scan?

The N prefix before a string literal specifies an NVARCHAR2 or NCHAR value.
When comparing an NVARCHAR2 to a VARCHAR2, Oracle converts the VARCHAR2 side to NVARCHAR2; since that conversion is applied to your indexed column, this is why you are getting a full table scan.
Use an NVARCHAR2 column instead of a VARCHAR2 in your table if you can't modify the query.
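As a rough sketch (the table name and column size here are illustrative, not taken from the question), the setup could look like this; once ID is stored as NVARCHAR2, the N'...' literal already matches the column's datatype, so no conversion is applied to the indexed column and a range scan is possible again:
CREATE TABLE table_name (
  ID NVARCHAR2(100)   -- size is an assumption; pick what fits your data
);
CREATE INDEX table_name_id_ix ON table_name (ID);
-- The software's unmodified query now compares NVARCHAR2 to NVARCHAR2:
SELECT * FROM table_name WHERE ID = N'something';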

Related

SQLPLUS displaying of tables with CLOB types

I have a table which has 3 columns: a NUMBER column, a CLOB column, and a BLOB column. How can I use some sort of SELECT * statement to display what I have entered into this table, not just a partial piece of the long character strings I have in there? The only way I know of to display a long string from a CLOB is the DBMS_LOB.SUBSTR technique. My BLOB column is currently all NULL so I'm not too worried about displaying that section, just the NUMBER column with its associated CLOB. Thanks!
See here: How to query a CLOB column in Oracle
When getting the substring of a CLOB column and using a query tool that has size/buffer restrictions,
you sometimes need to set the buffer to a larger size.
For example, in SQL*Plus use SET BUFFER 10000 to set it to 10000, as the default is 4000.
When running the DBMS_LOB.SUBSTR command you can also specify the number of characters you want returned and the offset to start from.
So using DBMS_LOB.SUBSTR(column, 3000) might restrict it to a small enough amount for the buffer.
See the Oracle documentation for more info on the SUBSTR command:
DBMS_LOB.SUBSTR (
lob_loc IN CLOB CHARACTER SET ANY_CS,
amount IN INTEGER := 32767,
offset IN INTEGER := 1)
RETURN VARCHAR2 CHARACTER SET lob_loc%CHARSET;
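A minimal usage sketch (the table and column names are made up for illustration, assuming a table MY_TABLE with NUM_COL NUMBER and CLOB_COL CLOB standing in for the ones in the question):
SELECT num_col,
       DBMS_LOB.SUBSTR(clob_col, 3000, 1) AS clob_text   -- first 3000 characters, starting at offset 1
FROM   my_table;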

Oracle CHAR Comparison Not Working in Function

Could someone please explain the difference between the two Oracle queries below? I know they look very similar, but the first one returns results and the second one does not. My implementation of the function can be seen below as well.
--Returns results
SELECT *
FROM <TABLE_NAME>
WHERE ID = CAST(<UserID> AS CHAR(2000)); --ID is defined as CHAR(8) in the DB.
--Does not return results
SELECT *
FROM <TABLE_NAME>
WHERE ID = CAST_TO_CHAR(<UserID>); --ID is defined as CHAR(8) in the DB.
--Function definition
CREATE OR REPLACE FUNCTION CAST_TO_CHAR(varToPad IN VARCHAR2)
RETURN CHAR IS returnVal CHAR(2000);
BEGIN
SELECT CAST(varToPad AS CHAR(2000))
INTO returnVal
FROM DUAL;
RETURN returnVal;
END;
/
It almost seems to me that the type is not persisting when the value is retrieved from the database. From what I understand of CHAR comparisons in Oracle, it will take the smaller of the two fields and truncate the larger one so that the sizes match (that is why I am casting the second variable to length 2000).
The reason that I need to achieve something like this is because a vendor tool that we are upgrading from DB2 to Oracle defined all of the columns in the Oracle database as CHAR instead of VARCHAR2. They did this to make their legacy code more easily portable to a distributed environment. This is causing big issues in our web applications because compares are now being done against fixed length CHAR fields.
I thought about using TRIM() but these queries will be accessed a lot and I do not want them to do a full table scan each time. I also considered RPAD(, ) but I don't really want to hard code lengths in the application as these may change in the future.
Does anyone have any thoughts about this? Thank you in advance for your help!
I had a similar problem. It turned out to be caused by the rules of implicit data conversion: Oracle Database automatically converts a value from one datatype to another when such a conversion makes sense.
If you change your select to:
SELECT *
FROM <TABLE_NAME>
WHERE CAST(ID as CHAR(2000)) = CAST_TO_CHAR(<UserID>);
you will see that it works properly.
And here's another test script showing that the function works correctly:
SET SERVEROUTPUT ON --for DBMS_OUTPUT.PUT_LINE.
DECLARE
test_string_c CHAR(8);
test_string_v VARCHAR2(8);
BEGIN
--Assign the same value to each string.
test_string_c := 'string';
test_string_v := 'string';
--Test the strings for equality.
IF test_string_c = CAST_TO_CHAR(test_string_v) THEN
DBMS_OUTPUT.PUT_LINE('The names are the same');
ELSE
DBMS_OUTPUT.PUT_LINE('The names are NOT the same');
END IF;
END;
/
anonymous block completed
The names are the same
Here are some rules that govern the direction in which Oracle Database makes implicit datatype conversions:
During INSERT and UPDATE operations, Oracle converts the value to the datatype of the affected column.
During SELECT FROM operations, Oracle converts the data from the column to the type of the target variable.
When comparing a character value with a numeric value, Oracle converts the character data to a numeric value.
When comparing a character value with a DATE value, Oracle converts the character data to DATE.
When making assignments, Oracle converts the value on the right side of the equal sign (=) to the datatype of the target of the assignment on the left side.
When you use a SQL function or operator with an argument of a datatype other than the one it accepts, Oracle converts the argument to the accepted datatype.
You can explore the complete list of datatype comparison rules here.
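As a small, hedged illustration of why the explicit CAST matters (this example is not part of the original post): when either side of a comparison is a VARCHAR2, Oracle uses nonpadded comparison semantics, so a blank-padded CHAR(8) value no longer equals the unpadded string; casting both sides to the same CHAR length restores blank-padded semantics.
SET SERVEROUTPUT ON
DECLARE
  test_c CHAR(8)     := 'string';   -- stored blank-padded as 'string  '
  test_v VARCHAR2(8) := 'string';
BEGIN
  IF test_c = test_v THEN
    DBMS_OUTPUT.PUT_LINE('equal');
  ELSE
    DBMS_OUTPUT.PUT_LINE('NOT equal');   -- this branch runs: nonpadded semantics apply
  END IF;
END;
/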

Oracle: Coercing VARCHAR2 and CLOB to the same type without truncation

In an app that supports MS SQL Server, MySQL, and Oracle, there's a table with the following relevant columns (types shown here are for Oracle):
ShortText VARCHAR2(1700) indexed
LongText CLOB
The app stores values 850 characters or less in ShortText, and longer ones in LongText. I need to create a view that returns that data, whichever column it's in. This works for SQL Server and MySQL:
SELECT
CASE
WHEN ShortText IS NOT NULL THEN ShortText
ELSE LongText
END AS TheValue
FROM MyTable
However, on Oracle, it generates this error:
ORA-00932: inconsistent datatypes: expected CHAR got CLOB
...meaning that Oracle won't implicitly convert the two columns to the same type, so the query has to do it explicitly. I don't want data to get truncated, so the type used has to be able to hold as much data as a CLOB, which as I understand it (I'm not an Oracle expert) means CLOB only; no other choices are available.
This works on Oracle:
SELECT
CASE
WHEN ShortText IS NOT NULL THEN TO_CLOB(ShortText)
ELSE LongText
END AS TheValue
FROM MyTable
However, performance is amazingly awful. A query that returns LongText directly took 70-80 ms for about 9k rows, but the above construct took between 30 and 60 seconds, unacceptable.
So:
Are there any other Oracle types I could coerce both columns to that can hold as much data as a CLOB? Ideally something more text-oriented, like MySQL's LONGTEXT, or SQL Server's NTEXT (or even better, NVARCHAR(MAX))?
Any other approaches I should be looking at?
Some specifics, in particular ones requested by @Guido Leenders:
Oracle version: Oracle Database 11g 11.2.0.1.0 64bit Production
Not certain if I was the only user, but the relative times are still striking.
Stats for the small table where I saw the performance I posted earlier:
rowcount: 9,237
varchar column total length: 148,516
clob column total length: 227,020
The to_clob is pretty expensive, so try to avoid it. But I think it should perform reasonably well for 9K rows. The following test case is based upon one of the applications we develop, which has similar data model behaviour:
create table bubs_projecten_sample
( id            number
, toelichting   varchar2(1700)
, toelichting_l clob
);

begin
  for i in 1..10000
  loop
    insert into bubs_projecten_sample
    ( id
    , toelichting
    , toelichting_l
    )
    values
    ( i
    , case when mod(i, 2) = 0 then 'short' else null end
    , case when mod(i, 2) = 0 then rpad('long', i, '*') else null end
    );
  end loop;
  commit;
end;
/
Now make sure everything is in the cache and dirty blocks have been written out:
select *
from bubs_projecten_sample;
Test performance:
create table bubs_projecten_flat
as
select id
     , to_clob(toelichting) toelichting_any
from   bubs_projecten_sample
where  toelichting is not null
union all
select id
     , toelichting_l
from   bubs_projecten_sample
where  toelichting_l is not null;
The create table takes less than 1 second on a normal entry-level server, including writing out the data: 17K consistent gets, 4K physical reads. Stored on disk (note the rpad) is 25K for toelichting and 16M for toelichting_l.
Can you further elaborate on the problem?
Please check that large CLOBs are not stored inline. Normally, large CLOBs are stored in a separate, system-maintained segment. Storing large CLOBs inside the table itself can make going through the table with a full table scan expensive.
Also, I can imagine populating both columns always. You still get the benefit of indexing on the first so many characters; you just need to record in the table, with an indicator column, whether the CLOB or the ShortText column is leading.
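A rough sketch of that idea (the id and IsLong columns and the search predicate are illustrative additions, not part of the original schema): ShortText always holds the first 850 characters and stays indexed, LongText is only populated for long values, and the flag records which column is leading:
SELECT id,
       CASE
         WHEN IsLong = 'Y' THEN LongText
         ELSE TO_CLOB(ShortText)
       END AS TheValue
FROM   MyTable
WHERE  ShortText LIKE 'some search prefix%';   -- the filter can still use the index on ShortText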
As a side note: I see a difference between 850 and 1700. I would recommend making them equal, but remember to check that you are creating the table using character semantics. That can be done at the statement level by using "varchar2(850 char)". Please note that Oracle will actually create a column that fits 850 * 4 bytes (in AL32UTF8 at least, where the "32" stands for "at most 4 bytes per character"). Good luck!

Datatype difference in procedure parameter and sql query inside it

In my backend procedure I have a VARCHAR2 parameter and I am using it in the SQL query to search on a NUMBER column. Will this cause any kind of performance issue?
For example:
proc (a varchar2)
is
  select * from table where deptno = a;
end;
Here deptno is a NUMBER column in the table and a is a VARCHAR2.
It might do. The database will resolve the differences in datatype by casting DEPTNO to a VARCHAR2. This will prevent the optimizer from using any (normal) index you have on that column. Depending on the data volumes and distribution, an indexed read may not always be the most efficient access path, in which case the data conversion doesn't matter.
So it does depend. But what are your options if it does matter (you have a highly selective index on that column)?
One solution would be to apply an explicit data conversion in your query:
select * from table
where deptno = to_number(a);
This will cause the query to fail if A contains a value which won't convert to a number.
A better solution would be to change the datatype of A so that the calling program can only pass a numeric value. This throws the responsibility for duff data where it properly belongs.
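A minimal sketch of that changed signature (the DEPT table and the loop body are placeholders; the NUMBER parameter is the point):
create or replace procedure proc (a in number)
is
begin
  for r in (select * from dept where deptno = a)   -- DEPT stands in for the real table
  loop
    null;   -- process each row here
  end loop;
end;
/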
The least attractive solution is to keep the procedure's signature and the query as is, and build a function-based index on the column:
create index emp_deptchar_fbi on emp(to_char(deptno));
Read the documentation to find out more about function-based indexes.

Inserting data into Oracle

I am trying to migrate a DB from Informix to Oracle. Informix had an option whereby, when inserting into a table, if the size of the value exceeds the column length then Informix automatically trims the data. But Oracle does not support this and always throws an exception. Is there a way to prevent this and allow trimming, or do we have to respect the column lengths religiously?
There is no automatic trimming of data in Oracle, you have to trim it explicitly yourself e.g.
insert into mytable (id, text) values (123, substr(var,1,4000));
Oracle does support a variety of SQL functions which trim variables. I suspect the one you'll want is SUBSTR(). The problem is that you will need to specify the desired length explicitly. In this example T23.WHATEVER is presumed to be VARCHAR2(30) and T42.TOO_LONG_COL is, er, longer:
insert into t23
(id
, whatever)
select pk_col
, substr(too_long_col, 1, 30)
from t42
/
As well as Tony's suggestion, you can use a CAST
select cast ('1234' as varchar2(3)) a
from dual
If you are doing a data migration, look into DML Error Logging.
Having all your non-conformant data put into a corresponding table, along with the failure reason, is positively dreamy.
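A hedged sketch of what that can look like (the table names are illustrative): DBMS_ERRLOG.CREATE_ERROR_LOG creates an error table (ERR$_MYTABLE by default), and the LOG ERRORS clause diverts failing rows, such as values too long for their columns, into it instead of failing the whole statement:
begin
  dbms_errlog.create_error_log(dml_table_name => 'MYTABLE');   -- one-time setup
end;
/
insert into mytable (id, text)
select id, text
from   staging_table
log errors into err$_mytable ('migration run') reject limit unlimited;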
