I'm getting an "ORA-01843: not a valid month" error while loading data from a tab-delimited text file with SQL*Loader 11.1.0.6.0 into Oracle 12c.
control file:
options (skip=1)
load data
truncate
into table test_table
fields terminated by '\t' TRAILING NULLCOLS
(
type,
rvw_date "case when :rvw_date = 'NULL' then null WHEN REGEXP_LIKE(:rvw_date, '\d{4}/\d{2}/\d{2}') THEN to_date(:rvw_date,'yyyy/mm/dd') else to_date(:rvw_date,'mm-dd-yy') end"
)
Data:
type rvw_date
Phone 2014/01/29
Phone 2014/02/13
Field NULL
Phone 01/26/15
Field 02/25/12
Schema:
create table test_table
(
type varchar2(20),
rvw_date date
)
The SQL*Loader control file seems to interpret a backslash as an escape character. The SQL operators section of the documentation shows double quotes being escaped that way too; it isn't obvious that this would apply to anything else, but it makes sense that a single backslash would always be assumed to be escaping something.
Your regular expression pattern therefore needs to double-escape the \d so that the pattern works in the control file as it does in plain SQL. At the moment the pattern is not matched, so all the values fall through to the else branch, which uses the wrong format mask for them (even once both masks are corrected).
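For comparison, the single-backslash pattern behaves as expected in plain SQL; a quick illustrative check, separate from the load itself:
select case
         when regexp_like('2014/01/29', '\d{4}/\d{2}/\d{2}')
         then 'matched' else 'not matched'
       end as result
from dual;
RESULT
-----------
matched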
This works:
options (skip=1)
load data
truncate
into table test_table
fields terminated by '\t' TRAILING NULLCOLS
(
type,
rvw_date "case when :rvw_date = 'NULL' then null when REGEXP_LIKE(:rvw_date,'\\d{4}/\\d{2}/\\d{2}') then to_date(:rvw_date,'yyyy/mm/dd') else to_date(:rvw_date,'mm/dd/rr') end"
)
With your original data, that creates these rows:
alter session set nls_date_format = 'YYYY-MM-DD';
select * from test_table;
TYPE RVW_DATE
-------------------- ----------
Phone 2014-01-29
Phone 2014-02-13
Field
Phone 2015-01-26
Field 2012-02-25
We have a flat file which we are trying to load into an Oracle 19c table using SQL*Loader, but it fails with "Multibyte character error" for one of the CHAR(2) fields. We know it's a junk value, but we still have to load it into the database. The database character set is AL32UTF8.
The value we are trying to load is a block element: U+2592 ▒ MEDIUM SHADE.
We tried specifying UTF-8 in the SQL*Loader control file but are still facing the same issue. Any advice on how to proceed?
Command:
sqlldr $connection parfile=SCHEMA.TABLE.par
Error:
Record 1: Rejected - Error on table SCHEMA.TABLE, column COL2.
Multibyte character error.
COLUMN info :COL2 CHAR(2) NOT NULL ENABLE
Parameter file contents:
data=filename.dat
control=SCHEMA.TABLE.ctl
log=SCHEMA.TABLE.log
bad=SCHEMA.TABLE.bad
Control file contents:
OPTIONS (BINDSIZE=20000000,READSIZE=10485760,ROWS=10000,DIRECT=FALSE,ERRORS=50)
LOAD DATA
CHARACTERSET 'AL32UTF8'
DISCARDMAX 100
REPLACE PRESERVE BLANKS INTO TABLE SCHEMA.TABLE
TRAILING NULLCOLS
(
COL1 POSITION(1:15) "NVL(:COL1,' ')",
COL2 POSITION(16:17) "NVL(:COL2,' ')"
)
File characterset:
filename.dat: text/plain; charset=utf-8
OS: GNU/Linux
From the documentation:
The start and end arguments to the POSITION parameter are interpreted in bytes, even if character-length semantics are in use in a data file.
So POSITION(16:17) refers to the 16th and 17th bytes of the line, not the 16th (and, in your example, only) character. The U+2592 character is three bytes in UTF-8 - 0xE2 0x96 0x92 (e29692) - and you're only looking at the first two bytes; on their own, those don't represent a valid character.
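You can confirm the byte representation from SQL; a quick illustrative check, assuming the database character set is AL32UTF8 as described:
select dump(to_char(unistr('\2592')), 1016) as bytes from dual;
BYTES
-------------------------------------------
Typ=1 Len=3 CharacterSet=AL32UTF8: e2,96,92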
You can change from using POSITION to using CHAR with the fixed length of each field, and specify LENGTH SEMANTICS CHARACTER:
OPTIONS (BINDSIZE=20000000,READSIZE=10485760,ROWS=10000,DIRECT=FALSE,ERRORS=50)
LOAD DATA
CHARACTERSET 'AL32UTF8'
LENGTH SEMANTICS CHARACTER
DISCARDMAX 100
REPLACE PRESERVE BLANKS INTO TABLE SCHEMA.TABLE
TRAILING NULLCOLS
(
COL1 CHAR(15) "NVL(:COL1,' ')",
COL2 CHAR(2) "NVL(:COL2,' ')"
)
I'm loading data into my table through SQL*Loader.
The data loads successfully, but I'm getting a garbage (repetitive) value in a particular column for all rows.
After inserting:
the column TERM_AGREEMENT gets the value '806158336' for every record.
My CSV file contains at most 3-digit values for that column, but I'm forced to set my column definition to NUMBER(10).
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER
)
create table LOAN_BALANCE_MASTER_INT
(
ACCOUNT_NO NUMBER(30),
CUSTOMER_NAME VARCHAR2(70),
LIMIT NUMBER(30),
PRODUCT_DESC VARCHAR2(30),
SUBPRODUCT_CODE NUMBER,
ARREARS_INT NUMBER(20,2),
IRREGULARITY NUMBER(20,2),
PRINCIPLE_IRREGULARITY NUMBER(20,2),
TERM_AGREEMENT NUMBER(10)
)
INTEGER is for a binary data type. If you're importing a CSV file, I suppose the numbers are stored as plain text, so you should use INTEGER EXTERNAL. The EXTERNAL clause specifies character data that represents a number.
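Applied to your control file, that would look something like this (only the last field changes):
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER EXTERNAL
)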
Edit:
The issue seems to be the record termination character of the file. You should be able to solve it by editing the INFILE line this way:
INFILE '/ipoapplication/utl_file/LBR_HE_Mar16.csv' "STR X'5E204D'"
Where '5E204D' is the hexadecimal for '^ M'. To get the hexadecimal value you can use the following query:
SELECT utl_raw.cast_to_raw ('^ M') AS hexadecimal FROM dual;
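Running that query returns the value used above:
HEXADECIMAL
-----------
5E204D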
Hope this helps.
I actually solved this issue on my own.
Firstly, thanks to @Gary_W and @Alessandro for their inputs. I really appreciate your help, guys; I learned some new things in the process.
Here's the new fragment which worked; with it I got the correct data for the last column:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER TERMINATED BY WHITESPACE
)
'Terminated by whitespace' - I went through some SQL*Loader threads and used 'terminated by whitespace' on the last column of my ctl file. It worked, and this time I didn't even have to use 'INTEGER', 'EXTERNAL', or an expression '..' for the conversion.
Just one thing: can you guys now let me know what could possibly have been creating the issue? What was there in my CSV file in that column, and how did adding this solve it?
Thanks.
As mentioned in the title, I wish to have a control file to handle this case. The scenario is that I have to insert records into different tables. For example, when (1:3) is 'HEA', the record needs to be appended into the header table; when (1:3) is 'DTL', it needs to replace into the detail table. Is it possible to do this?
I have a situation where data from one file goes to three tables depending on the first field in the file. The WHEN clause looks at the first field and takes action based on that. Notice that when a 'WHEN' is met, the first field is then skipped by declaring it a filler. To answer your question, I believe you can put the APPEND or REPLACE after the INTO TABLE clause. Give it a try and let us know.
OPTIONS (DIRECT=TRUE)
UNRECOVERABLE
LOAD DATA
APPEND
INTO TABLE TABLE_A
WHEN (01) = 'CLM'
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
( rec_skip filler POSITION(1)
,CLM_CLAIM_ID CHAR NULLIF(CLM_CLAIM_ID=BLANKS)
...
)
INTO TABLE TABLE_B
WHEN (01) = 'SLN'
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
( rec_skip filler POSITION(1)
,SL_CLAIM_ID CHAR NULLIF(SL_CLAIM_ID=BLANKS)
...
)
INTO TABLE TABLE_C
WHEN (01) = 'COB'
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
( rec_skip filler POSITION(1)
,COB_CLAIM
...
)
More info: http://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_control_file.htm#i1005657
I have a text file containing fields in the following manner:
"64252368","7489040","305762",
"64285217","12132108","787341",
I am using a below control file.
OPTIONS (SKIP=1)
LOAD DATA
TRUNCATE INTO TABLE test_table
FIELDS TERMINATED BY '",'
(
LEARNEVENT_ID,
ORGANIZATION,
COURSE_ID
)
But, I am getting the error:
Record 1: Rejected - Error on table test_table, column LEARNEVENT_ID
ORA-01722: invalid number
Kindly help me on it.
You need to change your ctl file to include the OPTIONALLY ENCLOSED BY option.
OPTIONS (SKIP=1)
LOAD DATA
TRUNCATE INTO TABLE test_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(
LEARNEVENT_ID,
ORGANIZATION,
COURSE_ID
)
I'd recommend reading up on SQL*Loader.
The problem lies with the encapsulation of the numbers in double quotes; your FIELDS TERMINATED BY '",' simply does not strip them.
Try this
OPTIONS(SKIP=1)
LOAD DATA
TRUNCATE INTO TABLE test_table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
LEARNEVENT_ID "replace ( :LEARNEVENT_ID ,'"', '')",
ORGINAZATION "replace ( :ORGINAZATION ,'"', '')",
COURSE_ID "replace ( :COURSE_ID ,'"', '')"
)
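(chr(34) is the double-quote character; referring to it that way avoids having to embed a literal double quote inside the double-quoted SQL string in the control file.)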
In order to load data (from a CSV file) into an Oracle database, I use SQL*Loader.
In the table that receives these data, there is a varchar2(500) column, called COMMENTS.
For various reasons, I want to ignore this information from the CSV file.
Thus, I wrote this control file:
Options (BindSize=10000000,Readsize=10000000,Rows=5000,Errors=100)
Load Data
Infile 'XXX.txt'
Append into table T_XXX
Fields Terminated By ';'
TRAILING NULLCOLS
(
...
COMMENTS FILLER,
...
)
This control file seems to work correctly, as the COMMENTS column in the database is always set to null.
However, if my CSV file has a record where the corresponding COMMENTS field exceeds the 500-character limit, I get an error from SQL*Loader:
Record 2: Rejected - Error on table T_XXX, column COMMENTS.
Field in data file exceeds maximum length
Is there a way to really exclude the processing of my COMMENTS fields?
I can't reproduce your problem. I'm using Oracle 10.2.0.3.0 with SQL*Loader 10.2.0.1.
Here is my test case:
SQL> CREATE TABLE test_sqlldr (
2 ID NUMBER,
3 comments VARCHAR2(20),
4 id2 NUMBER
5 );
Table created
Control file:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments filler,
id2
)
data file:
1;aaa;2
3;abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz;4
5;bbb;6
I'm using the command sqlldr userid=xxx/yyy@zzz control=test.ctl and I'm getting all the rows without errors:
SQL> select * from test_sqlldr;
ID COMMENTS ID2
---------- -------------------- ----------
1 2
3 4
5 6
You may try another approach; I get the same desired result with the following control file:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments "substr(:comments,1,0)",
id2
)
Update following Romaintaz's comment: I looked into it again and managed to get the same error as you when the size of the column exceeded 255 characters. This is because the default datatype of SQL*Loader is char(255). If you have a column with more data you will have to specify the length. The following control file solved the problem for a column with 300 characters:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments filler char(4000),
id2
)
Hope this Helps,
--
Vincent
Just to suggest a tiny improvement, you might try something like:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
id,
comments char(4000) "substr(:comments, 1, 200)",
id2)
Now you'll grab the first 200 characters (or any number you specify in its place) of all comments - unless some of your input records have comments values exceeding 4000 characters, in which case they'll be rejected by the loader with the 'exceeds maximum length' error noted earlier. But assuming that's rare or not the case, all the records will load, with some of the comments truncated to 200 characters.
If you go over char(4000) you'll get a SQL*Loader error - there's a limit to how far you can push the beast.
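As a quick sanity check after the load, you can confirm that nothing longer than the truncation limit made it into the table (using the T_XXX table and COMMENTS column from the original question):
select max(length(comments)) as max_comments_len from t_xxx;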