I'm loading values from a CSV file into an Oracle table using a SQL*Loader script. The table has columns with NOT NULL constraints. In my CSV file the corresponding field is "" and I would like to put a blank string into the Oracle table when that happens.
This is my control file:
LOAD DATA
infile 'F:\tar.csv'
REPLACE
INTO TABLE tar
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(IDTAR ,
DATABACKUP DATE "YYYY-MM-DD",
PAESE ,
R_ELEM NULLIF (R_ELEM=BLANKS))
and this is the error in the log file:
ORA-01400: cannot insert NULL into ("MY_SCHEMA"."TAR"."PAESE")
How can I avoid the error by supplying a different value?
You can apply a SQL operator, such as NVL(:PAESE, 'XXX'). Notice the colon before the reference to the field name. In context:
LOAD DATA
infile 'gian.csv'
REPLACE
INTO TABLE tar
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
IDTAR,
DATABACKUP DATE "YYYY-MM-DD",
PAESE "NVL(:PAESE, 'XXX')",
R_ELEM NULLIF (R_ELEM=BLANKS)
)
With a dummy table:
create table tar (
idtar number,
databackup date,
paese varchar2(10) not null,
r_elem varchar2(10)
);
and this CSV, where the 3rd and 4th lines have trailing spaces to exercise the nullif() clause:
1,2017-08-01,A,B
2,2017-08-02,C,
3,2017-08-03,,
4,2017-08-04,"",
then running with that control file gets:
SQL*Loader: Release 11.2.0.4.0 - Production on Fri Aug 4 19:39:23 2017
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 4
and the log says:
...
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
IDTAR FIRST * , O(") CHARACTER
DATABACKUP NEXT * , O(") DATE YYYY-MM-DD
PAESE NEXT * , O(") CHARACTER
SQL string for column : "NVL(:PAESE, 'XXX')"
R_ELEM NEXT * , O(") CHARACTER
NULL if R_ELEM = BLANKS
Table TAR:
4 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
...
Querying the table shows all four rows were loaded:
set null "<null>"
select * from tar;
IDTAR DATABACKU PAESE R_ELEM
---------- --------- ---------- ----------
1 01-AUG-17 A B
2 02-AUG-17 C <null>
3 03-AUG-17 XXX <null>
4 04-AUG-17 XXX <null>
Obviously, replace 'XXX' with the actual default string you want to use. You said 'a blank string', so you could use "NVL(:PAESE, ' ')" to insert a single space character, for instance. You can't use an empty string, though, as Oracle treats that as NULL.
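If the empty-string-is-null behaviour is surprising, a quick sanity check against the same dummy TAR table shows it (a sketch; the ORA-01400 message will name your own schema):

insert into tar (idtar, databackup, paese) values (5, sysdate, '');
-- fails with ORA-01400: cannot insert NULL, because '' is NULL to Oracle

insert into tar (idtar, databackup, paese) values (5, sysdate, ' ');
-- succeeds: a single space is a real, non-null value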
Related
I am using the control file below to load data:
LOAD DATA
INFILE '/c/Transaction.txt'
INTO TABLE tab1 APPEND WHEN (1:1) = 'D'
(RUN_ID "RUN_ID_SEQ.NEXTVAL"
,RUN_DATE_TIME "SYSDATE"
)
INTO TABLE tab2 TRUNCATE WHEN(1:1) <> 'D'
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(
DEALER_NUMBER
,TRAN_CODE
,TRAN_AMOUNT "TO_NUMBER(:TRAN_AMOUNT,'999999.99')"
,TRAN_DATE "TO_DATE(:TRAN_DATE,'DD-MM-YYYY')"
)
This is the data set
DEALER_ID|TRAN_TYPE_CODE|TRAN_AMOUNT|TRAN_DATE
203113|34|1000.50|12-07-2022
No errors occur during the load. Data loads correctly into the first table, but the first two characters are skipped when loading into the second table; the table data looks like this. What could be causing this?
3113 34 1000.5 12-JUL-22
Skips "20"
Sample tables:
SQL> desc tab1
Name Null? Type
----------------------------------------- -------- ----------------------------
RUN_ID NUMBER
RUN_DATE_TIME DATE
SQL> desc tab2
Name Null? Type
----------------------------------------- -------- ----------------------------
DEALER_NUMBER NUMBER
TRAN_CODE NUMBER
TRAN_AMOUNT NUMBER
TRAN_DATE DATE
SQL>
Control file; note position(1) when loading into tab2. You must use it when loading data into several tables from the same control file, because without it the second INTO TABLE clause continues scanning each record from where the previous one stopped instead of restarting at column 1, which is exactly why your leading characters went missing:
LOAD DATA
INFILE *
INTO TABLE tab1 APPEND WHEN (1:1) = 'D'
(RUN_ID "RUN_ID_SEQ.NEXTVAL"
,RUN_DATE_TIME "SYSDATE"
)
INTO TABLE tab2 TRUNCATE WHEN(1:1) <> 'D'
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(
DEALER_NUMBER position(1) --> here
,TRAN_CODE
,TRAN_AMOUNT "TO_NUMBER(:TRAN_AMOUNT,'999999.99')"
,TRAN_DATE "TO_DATE(:TRAN_DATE,'DD-MM-YYYY')"
)
BEGINDATA
203113|34|1000.50|12-07-2022
Loading session and the result:
SQL> $sqlldr scott/tiger@orcl control=test42.ctl log=test42.log
SQL*Loader: Release 18.0.0.0.0 - Production on Sri Srp 13 08:28:19 2022
Version 18.5.0.0.0
Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Table TAB1:
0 Rows successfully loaded.
Table TAB2:
1 Row successfully loaded.
Check the log file:
test42.log
for more information about the load.
SQL> select * from tab2;
DEALER_NUMBER TRAN_CODE TRAN_AMOUNT TRAN_DATE
------------- ---------- ----------- ----------
203113 34 1000,5 12-07-2022
-- here's your missing "20"
SQL>
I am loading CSV file data into a table called EMPLOYEE using SQL*Loader.
My CSV file data is separated by pipe (|):
EMPID|EMPNAME_ADDRESS|SALARY|GRADE
123|Rams Hyd|1000|A1
124|Sand MUM|2,000|A2
125|"PRASANNA qwer trasf\"501 advv vvd, qee ggg\trfe \411005 THE|3,00,000|A3
and my control file is:
LOAD DATA
Insert INTO TABLE EMPLOYEE
Fields terminated by "|" Optionally enclosed by '"' TRAILING NULLCOLS
(
EMPID,
EMPNAME,
SALARY,
GRADE
)
When I load the data using the above control file, the first two records load fine, but for the third record I get the error mentioned below.
no terminator found after TERMINATED and ENCLOSED field
Please suggest the changes needed to load the data properly.
Here's a way to load such data.
First, a target table (I hope it makes sense; you should have provided it):
SQL> create table employee
2 (empid number,
3 empname varchar2(200),
4 salary number,
5 grade varchar2(10)
6 );
Table created.
SQL>
Control file: note the SKIP option (which skips the header line) and the REPLACE function call for the SALARY column (to strip the superfluous formatting commas; add them back later, in the presentation layer). Also note that 'optionally enclosed by' has been dropped, so the stray double quote in the third record is loaded as ordinary character data instead of being treated as an unbalanced enclosure.
options (skip = 1)
load data
infile *
replace
into table employee
fields terminated by "|" TRAILING NULLCOLS
(
empid,
empname,
salary "replace(:salary, ',', null)",
grade)
begindata
EMPID|EMPNAME_ADDRESS|SALARY|GRADE
123|Rams Hyd|1000|A1
124|Sand MUM|2,000|A2
125|"PRASANNA qwer trasf\"501 advv vvd, qee ggg\trfe \411005 THE|3,00,000|A3
Loading session:
SQL> $sqlldr scott/tiger control=test06.ctl log=test06.log
SQL*Loader: Release 11.2.0.2.0 - Production on Uto Tra 9 19:37:46 2019
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 2
Commit point reached - logical record count 3
SQL> select * From employee;
EMPID EMPNAME SALARY GRADE
---------- ---------------------------------------- ---------- ----------
123 Rams Hyd 1000 A1
124 Sand MUM 2000 A2
125 "PRASANNA qwer trasf\"501 advv vvd, qee 300000 A3
ggg\trfe \411005 THE
SQL>
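As a side note, the REPLACE call used for SALARY simply strips the grouping commas before the implicit conversion to NUMBER; a quick check in SQL*Plus (a sketch using the value from the third record) gives something like:

select replace('3,00,000', ',', null) as salary from dual;

SALARY
------
300000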
I am running the following to load data into a table
OPTIONS (Skip=1)
LOAD DATA
INFILE 'D:\EPM\import\test.txt'
APPEND
INTO TABLE HYP.HS_MEMBER_D
FIELDS TERMINATED BY "|"
TRAILING NULLCOLS
(
DIMENSION,
PARENT,
CHILD,
ALIAS,
ATTRB01
)
This is working fine, but I want the ATTRB01 field to be a static value: I want to load "Alloc" for all records. Is there a way in this script to load a static value even though I am loading the other fields from the file?
You'd load a constant, such as
OPTIONS (Skip=1)
LOAD DATA
INFILE 'D:\EPM\import\test.txt'
APPEND
INTO TABLE HYP.HS_MEMBER_D
FIELDS TERMINATED BY "|"
TRAILING NULLCOLS
(
DIMENSION constant Account,
PARENT,
CHILD,
ALIAS,
ATTRB01 constant 'Alloc' --> this
)
Here's an example. Test table:
SQL> desc test
Name Null? Type
----------------------------------------- -------- -------------
ID NUMBER
ATTRB01 VARCHAR2(20)
Control file:
load data
infile *
replace
into table test
fields terminated by ","
trailing nullcols
(
id,
attrb01 constant 'Alloc'
)
begindata
1,xxx
2,yyy
3,zzz
Loading session & the result:
SQL> $sqlldr scott/tiger control=test01.ctl log=test01.log
SQL*Loader: Release 11.2.0.2.0 - Production on Sri Kol 15 21:08:59 2018
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 2
Commit point reached - logical record count 3
SQL> select * From test;
ID ATTRB01
---------- --------------------
1 Alloc
2 Alloc
3 Alloc
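If the data file also contains a value in that position which you want to ignore rather than load, one option is to read it into a FILLER field and still load the constant. A sketch against the same test table, using a hypothetical attrb01_file name for the discarded field:

load data
infile *
replace
into table test
fields terminated by ","
trailing nullcols
(
id,
attrb01_file FILLER,        -- value read from the file and discarded
attrb01 constant 'Alloc'    -- static value loaded for every record
)
begindata
1,xxx
2,yyy
3,zzz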
Try using SET at the end:
OPTIONS (Skip=1)
LOAD DATA
INFILE 'D:\EPM\import\test.txt'
APPEND
INTO TABLE HYP.HS_MEMBER_D
FIELDS TERMINATED BY "|"
TRAILING NULLCOLS
(
DIMENSION constant Account,
PARENT,
CHILD,
ALIAS,
ATTRB01
)
set ATTRB01='default_value'
I am trying to import some data with SQL*Loader, but I can't import a latitude/longitude. In the table those columns are FLOAT(126), and in the data file they are just text. I tried FLOAT EXTERNAL in the sqlldr control file, but it doesn't work; I get ORA-01722: invalid number.
Describe my_table;
Name Null Type
------------------------- -------- ------------
PRE_ID NOT NULL NUMBER(38)
PRE_DH NOT NULL TIMESTAMP(6)
PRE_PRO NOT NULL NUMBER(38)
PRE_INF NOT NULL NUMBER(38)
PRE_TPL NOT NULL NUMBER(38)
PRE_LAT NOT NULL FLOAT(126)
PRE_LNG NOT NULL FLOAT(126)
data file:
55831;08/12/2016 16:48:07;1;-128;2;-22.4741249084473;-50.55194854736336
55831;09/12/2016 08:02:06;1;-128;2;-22.5002975463867;-50.8194427490234
55831;09/12/2016 19:12:06;1;-128;2;-22.5002975463867;-50.8194427490234
and sqlldr control file:
load data
infile 'my_file.csv' "str '\r\n'"
append
into table my_table
fields terminated by ';'
trailing nullcols
( PRE_ID CHAR(4000),
PRE_DH TIMESTAMP "DD/MM/YYYY HH24:MI:SS",
PRE_PRO CHAR(4000),
PRE_INF CHAR(4000),
PRE_TPL CHAR(4000),
PRE_LAT FLOAT EXTERNAL,
PRE_LNG FLOAT EXTERNAL
)
log file:
Table MY_TABLE, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
PRE_ID FIRST 4000 ; CHARACTER
PRE_DH NEXT * ; DATETIME DD/MM/YYYY HH24:MI:SS
PRE_PRO NEXT 4000 ; CHARACTER
PRE_INF NEXT 4000 ; CHARACTER
PRE_TPL NEXT 4000 ; CHARACTER
PRE_LAT NEXT * ; CHARACTER
PRE_LNG NEXT * ; CHARACTER
value used for ROWS parameter changed from 64 to 1
Record 1: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
Record 2: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
Record 3: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
You're seeing this because the operating system environment is set up in a way that causes Oracle to treat a comma as a decimal separator and a period as a group separator. Your error messages are in English, interestingly, so I'm not sure exactly what you have it set to, but you can see the same thing with something like NLS_LANG="FRENCH_FRANCE.WE8ISO8859P1".
From the log you can see that the field in your CSV file is being read as character data. The target column is a float (any numeric column type would have the same issue), which means an implicit conversion is happening, and that conversion uses your NLS settings. You can see the same thing more simply with:
alter session set NLS_NUMERIC_CHARACTERS='.,';
select to_number('-22.4741249084473') from dual;
TO_NUMBER('-22.4741249084473')
------------------------------
-22.4741249084473
alter session set NLS_NUMERIC_CHARACTERS=',.';
select to_number('-22.4741249084473') from dual;
Error report -
ORA-01722: invalid number
Same conversion, but the alter session is swapping the meaning of the comma and period.
You can either explicitly set your environment to something with the right NLS numeric characters via NLS_LANG:
export NLS_LANG="ENGLISH_UNITED KINGDOM.WE8ISO8859P1"
or just that specific setting:
export NLS_NUMERIC_CHARACTERS='.,'
... before running SQL*Loader.
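If you are not sure what a session is currently using, you can check from SQL before deciding what to export (NLS_SESSION_PARAMETERS is a standard data dictionary view):

select parameter, value
from   nls_session_parameters
where  parameter in ('NLS_NUMERIC_CHARACTERS', 'NLS_TERRITORY', 'NLS_LANGUAGE');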
I am using the Oracle SQL*Loader utility from a Linux shell to load CSV data into an Oracle DB.
I have noticed that if the source CSV file's line endings are '\r\n' (Windows format), sqlldr fails to load data for the last column.
For example, if the last column is of FLOAT type (defined in the ctl file as 'FLOAT EXTERNAL'), sqlldr fails with 'ORA-01722: invalid number':
Sqlldr ctl file:
OPTIONS(silent=(HEADER))
load data
replace
into table fp_basic_bd
fields terminated by "|" optionally enclosed by '"'
TRAILING NULLCOLS
(
FS_PERM_SEC_ID CHAR(20),
"DATE" DATE "YYYY-MM-DD",
ADJDATE DATE "YYYY-MM-DD",
CURRENCY CHAR(3),
P_PRICE FLOAT EXTERNAL,
P_PRICE_OPEN FLOAT EXTERNAL,
P_PRICE_HIGH FLOAT EXTERNAL,
P_PRICE_LOW FLOAT EXTERNAL,
P_VOLUME FLOAT EXTERNAL
)
sqlldr execution command:
sqlldr -userid XXX -data ./test.data -log ./test.log -bad ./test.errors -control test.ctl -errors 3 -skip_unusable_indexes -skip_index_maintenance
sqlldr error log:
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
FS_PERM_SEC_ID FIRST 20 | O(") CHARACTER
"DATE" NEXT * | O(") DATE YYYY-MM-DD
ADJDATE NEXT * | O(") DATE YYYY-MM-DD
CURRENCY NEXT 3 | O(") CHARACTER
P_PRICE NEXT * | O(") CHARACTER
P_PRICE_OPEN NEXT * | O(") CHARACTER
P_PRICE_HIGH NEXT * | O(") CHARACTER
P_PRICE_LOW NEXT * | O(") CHARACTER
P_VOLUME NEXT * | O(") CHARACTER
value used for ROWS parameter changed from 300000 to 65534
Record 1: Rejected - Error on table FP_BASIC_BD, column P_VOLUME.
ORA-01722: invalid number
Record 2: Rejected - Error on table FP_BASIC_BD, column P_VOLUME.
ORA-01722: invalid number
When I replaced the Windows line endings with Unix ones, all the errors went away and all the data loaded correctly.
My question is: how can I specify the line terminator character in the sqlldr control file while still keeping the source file name in the shell command?
I've seen some examples of how to do that with stream record format (http://docs.oracle.com/cd/E11882_01/server.112/e16536/ldr_control_file.htm#SUTIL1087), but those examples are not applicable in my case, as I need to keep the name of the data file in the shell command and not inside the ctl file.
I recently encountered the same issue while loading data into my table via a CSV file.
My control file looked like this:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER EXTERNAL
)
As you mentioned, I kept getting the same 'invalid number' error.
It turns out this usually occurs:
- when your column datatype is NUMBER but the data coming from your CSV file is a string, so the loader fails to convert the string to a number;
- when your field in the CSV file is terminated by some delimiter, say spaces, tabs, etc.
This is how I altered my ctl file:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER Terminated by Whitespace
)
Try using stream record format and specifying the terminator string. From the docs:
On UNIX-based platforms, if no terminator_string is specified, SQL*Loader defaults to the line feed character, \n.
The terminator string should allow you to specify a combination of characters.
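For reference, this is roughly how the str clause from the linked documentation looks when applied to the control file above. A sketch only: in this form the data file name moves into the INFILE clause, which is exactly the trade-off the question is trying to avoid, and the terminator can also be written in hex, e.g. "str x'0d0a'".

OPTIONS(silent=(HEADER))
load data
infile './test.data' "str '\r\n'"
replace
into table fp_basic_bd
fields terminated by "|" optionally enclosed by '"'
TRAILING NULLCOLS
(
FS_PERM_SEC_ID CHAR(20),
"DATE" DATE "YYYY-MM-DD",
ADJDATE DATE "YYYY-MM-DD",
CURRENCY CHAR(3),
P_PRICE FLOAT EXTERNAL,
P_PRICE_OPEN FLOAT EXTERNAL,
P_PRICE_HIGH FLOAT EXTERNAL,
P_PRICE_LOW FLOAT EXTERNAL,
P_VOLUME FLOAT EXTERNAL
)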