ORA-12899 "value too large" for the same string: success on one table, error on the other - Oracle

I am updating a 35-character string into a column of length 35 in two tables.
The update on the first table succeeds, but the second table raises ORA-12899 (value too large for column).
select length('Andres Peñalver D1 Palmar Sani salt') bytes from dual;
BYTES
----------
35
select lengthb('Andres Peñalver D1 Palmar Sani salt') bytes from dual;
BYTES
----------
36
Both tables declare the colm1 field as VARCHAR(35); the update on the first table succeeds while the one on the second table fails.
update t
set colm1='Andres Peñalver D1 Palmar Sani Salt'
where value1='123456';
update t2
set colm1='Andres Peñalver D1 Palmar Sani Salt'
where value1='123456';
ORA-12899
select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
----------------------------------------------------------------
AL32UTF8
Please let me know why this behaviour occurs for these tables, which have the same column type.

Check the actual column sizes for both tables in ALL_TAB_COLUMNS.
A column defined as 35 CHAR can occupy several times 35 bytes (up to 4 bytes per character in AL32UTF8), so if one table's column was defined with CHAR semantics and the other with BYTE semantics in the DDL, their effective sizes differ.
Plain ASCII characters like A-Z and a-z take 1 byte to store, but accented and other language-specific characters take 2 to 4 bytes (ñ takes 2, which is why LENGTHB returns 36 above).
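A quick way to compare the two columns' semantics is the dictionary view mentioned above; a minimal sketch, using the table and column names from the question (adjust if yours differ):
select table_name, column_name, data_length, char_length, char_used
from   all_tab_columns
where  table_name in ('T', 'T2')
and    column_name = 'COLM1';
CHAR_USED is 'B' for byte semantics and 'C' for character semantics; with AL32UTF8, DATA_LENGTH (in bytes) will be larger than CHAR_LENGTH for a CHAR-semantics column.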

The full error message as described in the error message documentation
should give you the answer:
$ oerr ora 12899
12899, 00000, "value too large for column %s (actual: %s, maximum: %s)"
// *Cause: An attempt was made to insert or update a column with a value
// which is too wide for the width of the destination column.
// The name of the column is given, along with the actual width
// of the value, and the maximum allowed width of the column.
// Note that widths are reported in characters if character length
// semantics are in effect for the column, otherwise widths are
// reported in bytes.
// *Action: Examine the SQL statement for correctness. Check source
// and destination column data types.
// Either make the destination column wider, or use a subset
// of the source column (i.e. use substring).
This is likely linked to character length semantics.
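If the dictionary confirms that the failing column uses byte semantics, one option - a minimal sketch, assuming t2 is the table raising ORA-12899 and you are allowed to alter it - is to redefine the column with character semantics:
alter table t2 modify (colm1 varchar2(35 char));
After this, the 35-character / 36-byte value from the question fits, because the limit is now counted in characters (still subject to the VARCHAR2 byte limit, 4000 bytes by default).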

Related

Migrate tables with special characters in Talend studio

I am migrating from table A (DB A) to table B (DB B). An error occurs on one specific field that contains French characters (é, à, ...) and special characters (&, ', ...):
Exception in component tOracleOutput_1
java.sql.SQLException: ORA-12899: value too large for column "DB1"."COLUMN1"."COMMENT" (actual: 121, maximum: 118)
When querying the table from a SQL editor, the maximum length of the values is 100.
How can I insert these values into the new table without losing the special and the French characters?
This is not due to the special characters. Your column is too small.
You have three possibilities (a pre-check query for spotting the offending rows follows this list):
Increase the size of your column directly in the table schema.
See here:
how to modify the size of a column
Delete blank characters before and after the value with the TRIM function in the tMap: StringHandling.TRIM(row1.yourcolumn)
Truncate the value to fit the column in your tMap: StringHandling.LEFT(row1.yourcolumn,118) (your column holds 118 characters max)
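Before re-running the job, it can also help to find which source rows will overflow the 118-byte target; a minimal sketch, where src_table and comment_col are hypothetical stand-ins for the real source table and column, and assuming the source database uses the same character set as the target:
select comment_col,
       length(comment_col)  as chars_used,
       lengthb(comment_col) as bytes_used
from   src_table
where  lengthb(comment_col) > 118;
Rows returned here need more than 118 bytes once multi-byte characters such as é and à are counted, even though their character length may be only 100 or so; these are the rows that need option 1 (a wider column) or option 3 (truncation).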

How to check the size of input to avoid exceeding the DB column limit

I have an input field of my page with size=8.
And in the DB, the corresponding column is VARCHAR2(8).
But if I input a string of length 8 containing a non-ASCII character in the field, I get the following exception.
ORA-12899: value too large for column xxxx (actual: 10, maximum: 8)
I'm trying to catch this in the validator: I check myString.getBytes().length, which is also 8.
I know one solution is on DB side that change the column to VARCHAR2(8 CHAR).
Is there another solution so that I can check this in the controller?
The error is telling you that you've given 10 bytes but the column only allows 8. I am assuming it's bytes because of your use of the Chinese character set. So, I believe that the column was created as if it were VARCHAR2(8 byte).
If you describe the table, you'll see what's going on. Compare that describe with a describe of this one:
create table x (a varchar2(30), b varchar2(30 byte), c varchar2(30 char));
The code you are executing to obtain the number of bytes is almost correct. Instead of:
myString.getBytes().length /* this probably returns 8 */
you need to execute this:
myString.getBytes("UTF-8").length /* this probably returns 10 */
This should help you; it will return the actual size in bytes.
SELECT LENGTHB ('é')
FROM DUAL;
The above returns 2. So, whatever characters you are using, you can size the column in bytes accordingly, e.g. MY_VARCHAR_FIELD VARCHAR2(2 BYTE).
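To see the difference between byte and character semantics that both answers rely on, here is a minimal sketch (the table name is hypothetical and an AL32UTF8 database is assumed):
create table semantics_demo (
  col_byte varchar2(8 byte),
  col_char varchar2(8 char)
);
-- 'abcdefgé' is 8 characters but 9 bytes in AL32UTF8, because é takes 2 bytes
insert into semantics_demo (col_char) values ('abcdefgé');  -- succeeds: 8 characters allowed
insert into semantics_demo (col_byte) values ('abcdefgé');  -- fails with ORA-12899: needs 9 bytes
This is why changing the column to VARCHAR2(8 CHAR), as mentioned in the question, removes the need for an application-side byte check.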

What can I do to ensure fields longer than column width go to the BAD File?

When creating Oracle external tables, how should I phrase the reject rows clause to ensure that any field which exceeds its column width is rejected and goes into the BADFILE?
This is my current design. I don't want records longer than 20 characters; I want them to go to the BADFILE instead. Yet they still appear when I select * from foobar.
DROP TABLE FOOBAR CASCADE CONSTRAINTS;
CREATE TABLE FOOBAR
(
FOO_MAX20 VARCHAR2(20 CHAR)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY FOOBAR
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE 'foobar_bad_rec.txt'
DISCARDFILE 'foobar_discard_rec.txt'
LOGFILE 'foobar_logfile.txt'
FIELDS
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
FOO_MAX20 POSITION(1:20)
)
)
LOCATION (foobar:'foobar.txt') )
REJECT LIMIT UNLIMITED
PARALLEL ( DEGREE DEFAULT INSTANCES DEFAULT )
NOMONITORING;
Here is my external file foobar.txt
1234567
1234567890123456
126464843750476074218751012345678901234567890
7135009765625
048669433593
7
527
You can't do this with the reject rows clause, as it only accepts one form.
You have a variable-length (delimited) record, but a fixed-length field. Everything after the last position you specify, which is 20 in this case, is seen as filler that you want to ignore. That isn't an error condition; you might have rubbish at the end that isn't relevant to your table. There is nothing that says chars 21-45 in your third record shouldn't be there - just that you aren't interested in them.
It would be nice if you could discard them with the load when clause, but you don't seem to be able to compare, say, (21:21) to null or an empty string - the former isn't recognised and the latter causes an internal error, which isn't good.
You can make the longer records be sent to the bad file by forcing an SQL error when it tries to put a longer parsed value from the file into the field, by changing:
FOO_MAX20 POSITION(1:20)
to
FOO_MAX20 POSITION(1:21)
Values that are up to 20 characters are still loaded:
select * from foobar;
FOO_MAX20
--------------------
1234567
1234567890123456
7135009765625
048669433593
7
527
6 rows selected
but for anything longer than 20 characters it'll try to put 21 characters into the database's 20-character field, which gets this in the log:
error processing column FOO_MAX20 in row 3 for datafile /path/to/dir/foobar.txt
ORA-12899: value too large for column FOO_MAX20 (actual: 21, maximum: 20)
And the bad file gets that record:
126464843750476074218751012345678901234567890
Have a CHECK constraint on the column so that it does not allow any value exceeding the required length (a sketch follows).
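If the rows read through the external table are then copied into a regular heap table, the check suggested above can be put there; a minimal sketch, with FOOBAR_TARGET as a hypothetical destination table:
alter table foobar_target
  add constraint foobar_target_len_chk
  check (length(foo_max20) <= 20);
Any insert from the external table that carries a value longer than 20 characters will then fail the constraint rather than silently loading.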

OCI: Determine length of text representation of query columns

My goal is to execute a query (SELECT), fetch the results and output them as text. The query is given as a parameter and can be, for example, select * from t.
I use OCIStmtPrepare and OCIStmtExecute, then I can describe the columns of the query with OCIParamGet and a series of OCIAttrGet calls. Suppose I get OCI_ATTR_DATA_TYPE = 12 (DATE) for one of the columns. Then OCI_ATTR_DATA_SIZE = 7 -- this is the size of the internal DATE representation.
I want to get this DATE as text, with respect to the current NLS settings. For that I use OCIDefineByPos with dty = SQLT_STR. It works alright, but I also need to supply a buffer for fetching. The question is: what size of buffer do I need?
Evidently it depends on NLS_DATE_FORMAT. I believe that Oracle knows this value:
SQL> create table x as select to_char(sysdate) d from dual;
Table created.
SQL> select value from nls_session_parameters where parameter='NLS_DATE_FORMAT';
VALUE
----------------------------------------
DD.MM.RR
SQL> select data_length from dba_tab_columns where table_name='X';
DATA_LENGTH
-----------
8
This is the exact length. Only when the date format is hidden from Oracle (by a function, for example) does it use the absolute maximum (?) value of 75:
SQL> create or replace function get_date_format return varchar2 is
2 begin
3 return 'DD.MM.RR';
4 end;
5 /
Function created.
SQL> create table x as select to_char(sysdate,get_date_format) d from dual;
Table created.
SQL> select data_length from dba_tab_columns where table_name='X';
DATA_LENGTH
-----------
75
Everything said above applies to NUMBER as well.
So, is it possible to get length of text representation of a column in OCI?
The maximum buffer size for any date is 75. The maximum buffer size for any number is 42.
I hope that helps.
You can determine the needed buffer size by calling OCIAttrGet for the OCI_ATTR_DISP_SIZE attribute. It returns 40 for NUMBER, 75 for DATE, and N for VARCHAR2(N). Add 1 byte for null-termination and you're good to go.
Yes - the trick is that in C, a string is really a pointer to a character array, so you would say char* mystring = OCIStringPtr(envhp, x); where x is a pointer to an OCIString, which you can get back by connecting with OCI_OBJECT set and asking for a SQLT_VST instead of an SQLT_STR. The actual memory for the string is allocated for you in the global env by OCI behind the scenes.

Problem with Oracle Sql Loader control file

I'm trying to load some data using SQL*Loader. Here is the top of my control/data file:
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
VALUE FLOAT EXTERNAL,
STATE,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009
... lots of other data lines.
The problem is that SQL*Loader won't recognize the data types I'm specifying. This is the log file:
Table ECONOMIC_INDICATORS, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
ASOF_DATE FIRST * , DATE DD-MON-YY
VALUE NEXT * , CHARACTER
STATE NEXT * , CHARACTER
SERIES_ID NEXT * , CHARACTER
CREATE_DATE NEXT * , DATE DD-MON-YYYY
value used for ROWS parameter changed from 10000 to 198
Record 1: Rejected - Error on table ECONOMIC_INDICATORS, column VALUE.
ORA-01722: invalid number
... lots of similar errors, expected if trying to insert character data into a numeric column.
I've tried omitting the datatype spec and tried all the other numeric specs, and it's always the same issue. Any ideas?
Also, any ideas on why it's changing the Rows parameter?
From your example, SQL*Loader will try to evaluate the string "AL" as a number value, which results in the error message you gave. The sample data has what looks like a decimal number in the third position, not the second as specified in the column list - in other words, STATE and VALUE are swapped relative to the data (see the corrected field list below).
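Based on the sample data line (date, state, value, series id, create date), a hedged correction is simply to swap STATE and VALUE in the field list so the order matches the data:
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
STATE,
VALUE FLOAT EXTERNAL,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009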
