How to change character set of the XMLTYPE variable? - oracle

I currently have a non-UTF-8 database, but I need to produce an XMLType variable with UTF-8 encoding. I have a workaround, but there seems to be a bug in Oracle; see the following link:
https://forums.oracle.com/forums/thread.jspa?messageID=10238641
...and Oracle Support bug 7698684.
The bug causes random errors:
ORA-1482: unsupported character set
ORA-6512: at "SYS.XMLTYPE", line 107
First of all, I get an XMLType with the dbms_xmlgen package. That XMLType is encoded in the database character set.
To convert it to the UTF-8 character set I do the following:
1. I convert the XMLType variable to a BLOB variable with the getBlobVal method, using NLS_CHARSET_ID('UTF8') as the parameter.
2. I convert the BLOB variable back to an XMLType with the XMLType constructor, using the BLOB variable as the first parameter and NLS_CHARSET_ID('UTF8') as the second. This causes the random error :(
Does anybody know any alternative solution for this?
l_xml := dbms_xmlgen.getXMLType(l_ctx);
l_xml_b := l_xml.getBlobVal(C_UTF8_CHARSET_ID);
l_xml := XMLType(l_xml_b, C_UTF8_CHARSET_ID);

I managed to do this with the CONVERT function. It was not possible to convert the whole XML document (not even its CLOB value), only individual element values.
This was not working (the XMLType constructor fails):
l_xml := XMLType(convert(l_xml.getClobVal, 'UTF8'));
So I had to put CONVERT into the query string (this is just an example):
select dbms_xmlgen.getXMLType(
q'{select convert('ä', 'UTF8') myValue from dual}')
from dual
Finally I wrote a function which reads the data dictionary, loops through all columns of the given table/view, and generates a SELECT statement string in which every column is converted separately to UTF8. This string can then be passed as a parameter to the dbms_xmlgen.newContext function.
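A minimal sketch of such a generator, based on the description above (the function name is illustrative, and it naively wraps every column in CONVERT; in practice non-character columns would need to be excluded or cast first):

```sql
-- Hypothetical sketch: build a SELECT in which every column of p_table
-- is wrapped in CONVERT(..., 'UTF8'), for use with dbms_xmlgen.newContext.
create or replace function utf8_query (p_table in varchar2)
  return varchar2
is
  l_cols varchar2(32767);
begin
  for col in (select column_name
                from user_tab_columns
               where table_name = upper(p_table)
               order by column_id)
  loop
    l_cols := l_cols
           || case when l_cols is not null then ', ' end
           || 'convert(' || col.column_name || ', ''UTF8'') as '
           || col.column_name;
  end loop;
  return 'select ' || l_cols || ' from ' || p_table;
end;
/
```

The returned string would then be passed to dbms_xmlgen.newContext in place of a plain `select * from ...`.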


Converting Big Clob into Table using XMLTable

I'm trying to convert an input CLOB variable from a stored procedure into an XMLType and then, using XMLTable, join it with other tables in my DB. Here is my code:
with clientes_data as (SELECT clientes.identificacion,clientes.tipoDoc,clientes.cuentas
from xmltable('/Clientes/Cliente'
passing xmltype(to_clob('<Clientes>
<Cliente><NumeroIdentificacion>94406495</NumeroIdentificacion><TipoIdentificacion>CC</TipoIdentificacion></Cliente>
<Cliente><NumeroIdentificacion>1136881480</NumeroIdentificacion><TipoIdentificacion>CC</TipoIdentificacion></Cliente>
</Clientes>'))
columns
identificacion varchar2(10) path 'NumeroIdentificacion',
tipoDoc varchar2(2) path 'TipoIdentificacion',
cuentas xmltype path 'Cuentas') clientes)
,cuentas_Data as (
SELECT cl.identificacion,cl.tipoDoc,cuentasT.*
from clientes_data cl
LEFT JOIN
xmltable('/Cuentas/Cuenta'
passing cl.cuentas
columns
numCta varchar2(10) path 'Numero',
tipoCta varchar2(3) path 'Tipo') cuentasT ON 1=1)
select * from cuentas_Data;
--select count(*) from cuentas_Data;
But I'm getting this error: "String literal too long... The string literal is longer than 4000 characters" when the input (the passing section) is longer than 4000 characters. So I'm a little bit confused, because XMLTable is supposed to accept an XMLTYPE parameter (something CLOB-sized), but I'm assuming it is actually a VARCHAR2(4000)?
Thanks for any "light" on this.
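The 4,000-character limit applies to the string literal in the SQL text, not to XMLTable itself. One way around it (an untested sketch; the variable names are illustrative) is to bind the document through a CLOB variable rather than embedding it as a literal:

```sql
-- Sketch: pass the XML through a PL/SQL variable (a bind, not a literal),
-- so the 4000-character SQL string-literal limit no longer applies.
declare
  p_clientes clob := '<Clientes>...</Clientes>';  -- the stored procedure's input CLOB
  l_count    number;
begin
  select count(*)
    into l_count
    from xmltable('/Clientes/Cliente'
           passing xmltype(p_clientes)
           columns identificacion varchar2(10) path 'NumeroIdentificacion',
                   tipoDoc        varchar2(2)  path 'TipoIdentificacion');
end;
/
```

Since the question already receives the CLOB as a procedure parameter, passing that parameter directly into xmltype() avoids building any literal at all.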

Oracle user defined records

In Oracle PL/SQL records we can use anchor datatypes (including %TYPE and %ROWTYPE) to define the fields.
When I populate a record from a query, in my select clause I want type conversion. Is that possible using an Oracle built-in function or some other approach?
In this example scenario I am using a simple decode function to perform a conversion:
DECLARE
TYPE TEST_RECORD IS RECORD(
FIRST_NAME EMPLOYEE_MT.FIRST_NAME%TYPE,
LAST_NAME EMPLOYEE_MT.LAST_NAME%TYPE,
MARITIAL_STATUS EMPLOYEE_MT.MARITAL_STATUS%TYPE);
EMPLOYEE_NAME TEST_RECORD;
BEGIN
SELECT EMP.FIRST_NAME,
EMP.LAST_NAME,
DECODE(EMP.MARITAL_STATUS, 1, 'MARRIED', 0, 'UN-MARRIED')
INTO EMPLOYEE_NAME
FROM EMPLOYEE_MT EMP
WHERE EMP.EMPLOYEE_ID = 1;
DBMS_OUTPUT.put_line(EMPLOYEE_NAME.MARITIAL_STATUS);
END;
which gets error:
ORA-06502: PL/SQL: numeric or value error: character to number conversion error
ORA-06512: at line 9
You have defined your record type with the maritial_status (shouldn't that be marital_status?) field using the same data type as the table column. From your decode that appears to be a number data type. You're then trying to set the record's field value to a string, either 'MARRIED' or 'UN-MARRIED', when that field is expecting a number. Clearly neither of those strings can be converted to a number, hence the error you're getting.
If you want the record to store the string value, you'll have to define it like that - explicitly as a string, rather than using %TYPE:
DECLARE
TYPE TEST_RECORD IS RECORD(
FIRST_NAME EMPLOYEE_MT.FIRST_NAME%TYPE,
LAST_NAME EMPLOYEE_MT.LAST_NAME%TYPE,
MARITAL_STATUS VARCHAR2(10));
...
DBMS_OUTPUT.put_line(EMPLOYEE_NAME.MARITAL_STATUS);
...
You can't do that automatically using %TYPE as the data type just doesn't match. You've explicitly told Oracle that you want the field's data type to be a number, so Oracle isn't going to let you put a string in that field instead. It isn't about there not being a built-in function, it just doesn't make sense.
This also means you can't use %ROWTYPE if you're changing the data type either (unless your modified value can be implicitly converted back to the column data type).

Postgres: After converting from bytea to varchar '\r' remains

I have a table which contains XML files as binary data. The XML contains "\r\n" characters stored as "\015\012" in the bytea. I need to change the column type from bytea to varchar.
I run:
ALTER TABLE my_table ALTER COLUMN xml_data TYPE VARCHAR;
UPDATE my_table SET xml_data = convert_from(xml_data::bytea, 'UTF8');
And it works on Linux. But on Windows it converts '\015' to "\r" (two characters). So I have something like this in the result:
<field>...</field>\r
<field>...</field>
Is there a proper method to convert the binary data to UTF-8?
You'll have to strip the carriage returns in a separate step.
If you are ok with getting rid of them wholesale, I suggest something like:
ALTER TABLE my_table
ALTER xml_data TYPE text
USING replace(
convert_from(xml_data, 'UTF8'),
E'\r',
''
);
Is there a good reason for using data type varchar (or text, which is the same) rather than xml?
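If the column really does hold well-formed XML, a variant of the same statement (a sketch under that assumption, not a drop-in fix) could target the xml type directly; XMLPARSE validates the document as it converts:

```sql
-- Sketch: convert the bytea column straight to the xml type,
-- stripping carriage returns along the way.
ALTER TABLE my_table
  ALTER xml_data TYPE xml
  USING XMLPARSE(DOCUMENT replace(
          convert_from(xml_data, 'UTF8'),
          E'\r',
          ''));
```

This has the side benefit that any row containing malformed XML would make the ALTER fail loudly instead of silently storing broken markup as text.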

PLSQL Invalid Month

Why am I getting invalid month when I test this code? How does PLSQL and XML handle data types?
CURSOR c_DATA_INF is
select * from xmltable ('/' PASSING i_XML COLUMNS READING_DT DATE PATH 'DATE',
Actual NUMBER PATH 'ACTUAL',
Eligible NUMBER PATH 'ELIGIBLE'
);
begin
for d in c_DATA_INF loop
insert into table_name(READING_DT, actual, eligible)
values (to_date('d.READING_DT', 'MM/DD/YYYY'), d.ACTUAL, d.ELIGIBLE);
end loop;
end;
I'm not sure if it's incorrect in my insert statement or in my cursor.
Thanks!
You are passing d.READING_DT as a string, within quotes. The parameter types are invalid in to_date('d.READING_DT', 'MM/DD/YYYY').
Instead of passing 'd.READING_DT', remove the quotes and pass: d.READING_DT
The error is thrown because Oracle does not recognize the literal 'd.READING_DT' as a valid date or date string.
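Putting that together, the loop body would pass the column straight through. Note that to_date is not needed at all here: the XMLTable COLUMNS clause already projected READING_DT as a DATE.

```sql
-- Corrected loop body: no quotes, no to_date - READING_DT is already a DATE.
for d in c_DATA_INF loop
  insert into table_name (reading_dt, actual, eligible)
  values (d.READING_DT, d.ACTUAL, d.ELIGIBLE);
end loop;
```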

sqlloader date conversion: inconsistent data type

I am trying to load some data into an Oracle database using SQL*Loader, but I keep getting the errors ORA-00932: inconsistent datatypes: expected DATE got NUMBER, or "expecting char got date".
The date comes in the format of e.g. 20130815 and it is to be stored into a database with a column of type Date.
The logic should be that if someone passes an invalid date (a null, or say an invalid month in the date string, e.g. 20131301), then I want to insert a default date, e.g. 19990101.
Here is the SQLLoader code I have so far:
COLUMN_DOB DATE 'YYYYMMDD' "CASE:COLUMN_DOB WHEN 'isdate(:COLUMN_DOB)=1' THEN :COLUMN_DOB ELSE to_date('19000101', 'YYYYMMDD') END",,
The immediate issue is that you're calling to_date() on the fixed value. The then part is returning the original string value of the date from the file; the else is returning a date; which leads to the 'expecting char got date' message.
If you remove the to_date() part then it will load all the values as 1900-01-01, because you have the case statement wrong. You're comparing the :COLUMN_DOB value with the string 'isdate(:COLUMN_DOB)=1'. It isn't calling the function, it's a fixed string, and your date field is never going to exactly match that text. So the case always goes into the else and gets the fixed value. You also seem to be mixing up the two forms of case statement.
So it should be:
... "CASE WHEN isdate(:COLUMN_DOB)=1 THEN :COLUMN_DOB ELSE '19000101' END"
Which, assuming you've built an isdate() function - since that is not an Oracle built-in - with a default format mask, something like this one based on an AskTom version:
create or replace function isdate (p_string in varchar2,
                                   p_fmt in varchar2 := 'YYYYMMDD')
return number as
  l_date date;
begin
  -- null check must come before the conversion attempt
  if p_string is null then
    return 0;
  end if;
  l_date := to_date(p_string, p_fmt);
  return 1;
exception
  when others then
    return 0;
end;
/
... will put in valid dates as supplied, and invalid dates as 1900-01-01. The null check also means nulls will be inserted as 1900-01-01; it's perhaps simpler to have it here than to try to handle that separately in the control file.
You could maybe simplify it further by having a function that tries to convert the string and returns a date, and just calling that in the control file, without the date mask or the case statement. That approach is covered in that AskTom link too.
Personally I'd probably prefer to see the column left null rather than giving it a magic number. It isn't impossible to have someone with a valid DOB of 1900-01-01.
If you are creating a new function anyway, you could do this instead:
create or replace function my2date(p_str in varchar2) return date is
begin
return to_date(nvl(p_str, '19000101'), 'YYYYMMDD');
exception
when others then -- just about acceptable here
return date '1900-01-01';
end;
/
which you can execute from SQL*Plus, SQL Developer, Toad or whatever client you're using. And then your control file would have:
COLUMN_DOB "my2date(:COLUMN_DOB)"
I don't think there's a way of doing this without using a function. If you used an external table instead then you could use an anonymous block to do the conversion I suppose, but if you're stuck with SQL*Loader then I believe a function is the only way to stop a bad date causing the whole row to be rejected.
