I'm trying to load some data using SQL*Loader. Here is the top of my control/data file:
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
VALUE FLOAT EXTERNAL,
STATE,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009
... lots of other data lines.
The problem is that SQL*Loader won't recognize the datatypes I'm specifying. This is the log file:
Table ECONOMIC_INDICATORS, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
ASOF_DATE FIRST * , DATE DD-MON-YY
VALUE NEXT * , CHARACTER
STATE NEXT * , CHARACTER
SERIES_ID NEXT * , CHARACTER
CREATE_DATE NEXT * , DATE DD-MON-YYYY
value used for ROWS parameter changed from 10000 to 198
Record 1: Rejected - Error on table ECONOMIC_INDICATORS, column VALUE.
ORA-01722: invalid number
... lots of similar errors, expected if trying to insert character data into a numeric column.
I've tried no datatype spec and all the other numeric specs, and it's always the same issue. Any ideas?
Also, any ideas on why it's changing the Rows parameter?
From your example, SQL*Loader will try to evaluate the string "AL" as a number, which results in the error message you quoted. In the sample data, the value that looks like a decimal number is in the third position, not the second as specified in the column list.
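A field list reordered to match the sample data line should load cleanly (a sketch, using the column names and sample row from your own control file):
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
STATE,
VALUE FLOAT EXTERNAL,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009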
I am updating a 35-character string into a column of length 35 in two tables.
The first table's update succeeds, but the second table gives error ORA-12899: value too large for column.
select length('Andres Peñalver D1 Palmar Sani salt') bytes from dual;
BYTES
----------
35
select lengthb('Andres Peñalver D1 Palmar Sani salt') bytes from dual;
BYTES
----------
36
Both tables declare the colm1 field as VARCHAR2(35); the first update succeeds and the second one fails.
update t
set colm1='Andres Peñalver D1 Palmar Sani Salt'
where value1='123456';
update t2
set colm1='Andres Peñalver D1 Palmar Sani Salt'
where value1='123456';
ORA-12899
select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
----------------------------------------------------------------
AL32UTF8
Let me know why this behaviour occurs for these tables, which have the same column type.
Check the actual column sizes for both tables in all_tab_columns.
A column defined as 35 CHAR can occupy up to four times 35 bytes in AL32UTF8; if one table's column was defined with CHAR semantics and the other with BYTE semantics (in the DDL), the effective sizes are different.
Plain characters like A-Z and a-z take 1 byte to store, but language-specific characters like 'ñ' take 2 bytes in AL32UTF8, and some characters take up to 4.
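A quick way to compare the two definitions, using the table names from your updates:
select table_name, column_name, data_length, char_length, char_used
from all_tab_columns
where column_name = 'COLM1'
and table_name in ('T', 'T2');
CHAR_USED shows 'C' for character semantics and 'B' for byte semantics, and DATA_LENGTH is the byte limit actually enforced.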
The full error message, as described in the error message documentation, should give you the answer:
$ oerr ora 12899
12899, 00000, "value too large for column %s (actual: %s, maximum: %s)"
// *Cause: An attempt was made to insert or update a column with a value
// which is too wide for the width of the destination column.
// The name of the column is given, along with the actual width
// of the value, and the maximum allowed width of the column.
// Note that widths are reported in characters if character length
// semantics are in effect for the column, otherwise widths are
// reported in bytes.
// *Action: Examine the SQL statement for correctness. Check source
// and destination column data types.
// Either make the destination column wider, or use a subset
// of the source column (i.e. use substring).
This is likely linked to character length semantics.
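If the failing table's column turns out to use byte semantics (an assumption, since the DDL isn't shown), one possible fix is to switch it to character semantics:
alter table t2 modify colm1 varchar2(35 char);
The limit is then counted in characters, so the 35-character, 36-byte string fits.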
I would like to load data into Oracle via sqlldr; however, it always loads the data in a mangled form.
This is what the data in my file looks like:
2018-11-27 13 Vienna 1 66.90 1
This is the result after having loaded the data:
27-Nov-17 1443443505 ienna 909510961 0.9 3377
All columns except the date column are wrong
This is my table structure:
BOOKINGDATE DATE
CUSTOMERID NUMBER(38,0)
LOCATIONID VARCHAR(255 BYTE)
NUMBEROFPARKINGTICKET NUMBER(38,0)
CHARGETICKET NUMBER(18,2)
DURATIONINMINUTES NUMBER(38)
This is my table definition in my control file:
LOAD DATA
APPEND
INTO TABLE ROTH.PARKSCHEIN_ROTH
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '"'
(
BOOKINGDATE DATE 'YYYY-MM-DD',
CUSTOMERID INTEGER,
LOCATIONID CHAR(255),
NUMBEROFPARKINGTICKET INTEGER,
CHARGETICKET DECIMAL EXTERNAL,
DURATIONINMINUTES INTEGER
)
Can someone please tell me which datatypes I have to use?
I thought Oracle would figure out all the types by itself, except for the date?
Thank you very much in advance for your help.
It's generally easiest to leave it to the default and let conversion happen in the database:
load data
replace
into table parkschein_roth
fields terminated by '\t'
optionally enclosed by '"'
( bookingdate date 'YYYY-MM-DD'
, customerid
, locationid
, numberofparkingticket
, chargeticket
, durationinminutes )
The log then shows it did this:
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
BOOKINGDATE FIRST * WHT O(") DATE YYYY-MM-DD
CUSTOMERID NEXT * WHT O(") CHARACTER
LOCATIONID NEXT * WHT O(") CHARACTER
NUMBEROFPARKINGTICKET NEXT * WHT O(") CHARACTER
CHARGETICKET NEXT * WHT O(") CHARACTER
DURATIONINMINUTES NEXT * WHT O(") CHARACTER
Note that specifying a numeric datatype without the EXTERNAL keyword instructs SQL*Loader to read the binary data in the file directly, not its character representation: for example, what's displayed as 1 in a text editor is character 49 (that is, the symbol representing the bit sequence 00110001), and not an actual numeric 1. (I have never seen a data file formatted with binary encoded numbers but I suppose they must exist.) See Numeric EXTERNAL in the SQL*Loader Field List Reference.
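If you do want explicit types, the character-based forms are the EXTERNAL variants. A sketch of the same field list with them spelled out (this should behave like the defaults above):
( bookingdate date 'YYYY-MM-DD'
, customerid integer external
, locationid char(255)
, numberofparkingticket integer external
, chargeticket decimal external
, durationinminutes integer external )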
Note following comments: it seems the line actually ended with 00110001 (the character '1') followed by 00001101 (Windows carriage return) before the linefeed. Looking at the result you got, it must have read those two bytes and interpreted them as 0000110100110001 to get decimal 3377.
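Given that, if the file really has Windows-style line endings, it can also help to declare the record terminator explicitly, using the same "str" clause shown in other answers here (the file name is an assumption):
load data
infile 'parkschein.dat' "str '\r\n'"
...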
I am trying to import some data with SQL*Loader but I can't import a latitude/longitude. In the table those columns are FLOAT(126), and in the data file they are just text. I tried FLOAT EXTERNAL in the sqlldr control file but it doesn't work; I get ORA-01722: invalid number.
Describe my_table;
Name Null Type
------------------------- -------- ------------
PRE_ID NOT NULL NUMBER(38)
PRE_DH NOT NULL TIMESTAMP(6)
PRE_PRO NOT NULL NUMBER(38)
PRE_INF NOT NULL NUMBER(38)
PRE_TPL NOT NULL NUMBER(38)
PRE_LAT NOT NULL FLOAT(126)
PRE_LNG NOT NULL FLOAT(126)
data file:
55831;08/12/2016 16:48:07;1;-128;2;-22.4741249084473;-50.55194854736336
55831;09/12/2016 08:02:06;1;-128;2;-22.5002975463867;-50.8194427490234
55831;09/12/2016 19:12:06;1;-128;2;-22.5002975463867;-50.8194427490234
and sqlldr control file:
load data
infile 'my_file.csv' "str '\r\n'"
append
into table my_table
fields terminated by ';'
trailing nullcols
( PRE_ID CHAR(4000),
PRE_DH TIMESTAMP "DD/MM/YYYY HH24:MI:SS",
PRE_PRO CHAR(4000),
PRE_INF CHAR(4000),
PRE_TPL CHAR(4000),
PRE_LAT FLOAT EXTERNAL,
PRE_LNG FLOAT EXTERNAL
)
log file:
Table MY_TABLE, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
PRE_ID FIRST 4000 ; CHARACTER
PRE_DH NEXT * ; DATETIME DD/MM/YYYY HH24:MI:SS
PRE_PRO NEXT 4000 ; CHARACTER
PRE_INF NEXT 4000 ; CHARACTER
PRE_TPL NEXT 4000 ; CHARACTER
PRE_LAT NEXT * ; CHARACTER
PRE_LNG NEXT * ; CHARACTER
value used for ROWS parameter changed from 64 to 1
Record 1: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
Record 2: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
Record 3: Rejected - Error on table MY_TABLE, column PRE_LAT.
ORA-01722: invalid number
You're seeing this because the operating system environment is set up in a way that causes Oracle to treat a comma as a decimal separator and a period as a group separator. Your error messages are in English, interestingly, so not sure exactly what you have that set to, but you can see the same thing with something like NLS_LANG="FRENCH_FRANCE.WE8ISO8859P1".
From the log you can see that the field in your CSV file is being read as character data. The target column is a float (any type of number column would have the same issue), which means an implicit conversion is happening, and is using your NLS settings. You can see the same thing more simply with:
alter session set NLS_NUMERIC_CHARACTERS='.,';
select to_number('-22.4741249084473') from dual;
TO_NUMBER('-22.4741249084473')
------------------------------
-22.4741249084473
alter session set NLS_NUMERIC_CHARACTERS=',.';
select to_number('-22.4741249084473') from dual;
Error report -
ORA-01722: invalid number
Same conversion, but the alter session is swapping the meaning of the comma and period.
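As a quick check, you can see what your session is currently using:
select value from nls_session_parameters where parameter = 'NLS_NUMERIC_CHARACTERS';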
You can either explicitly set your environment to something with the right NLS numeric characters via NLS_LANG:
export NLS_LANG="ENGLISH_UNITED KINGDOM.WE8ISO8859P1"
or just that specific setting:
export NLS_NUMERIC_CHARACTERS='.,'
... before running SQL*Loader.
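Alternatively (a sketch, not from the original answer), you can make the conversion explicit per field in the control file so it no longer depends on the environment; the format mask here is an assumption sized to the sample coordinates:
PRE_LAT CHAR "to_number(:PRE_LAT, '999D99999999999999', 'NLS_NUMERIC_CHARACTERS=''.,''')",
PRE_LNG CHAR "to_number(:PRE_LNG, '999D99999999999999', 'NLS_NUMERIC_CHARACTERS=''.,''')"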
New to SQL loader and am a bit confused about the POSITION.
Let's use the following sample data as reference:
Munising 49862 MI
Shingleton49884 MI
Seney 49883 MI
And here is the load statement:
LOAD DATA
INFILE 'zipcodes.dat'
REPLACE INTO TABLE zipcodes (
city_name POSITION(1) CHAR(10),
zip_code POSITION(*) CHAR(5),
state_abbr POSITION(*+1) CHAR(2)
)
In the load statement, the city_name POSITION is 1. How does SQLLDR know where it ends? Is CHAR(10) the trick here? Counting the two trailing spaces after 'Munising', it has 10 characters.
Also why is zip_code assigned with CHAR even though it contains nothing but numbers?
Thank You
Yes, when end position is not specified, it is derived from the datatype. This documentation explains the POSITION clause.
city_name POSITION(1) CHAR(10)
Here the starting position of the data field is 1. The ending position is not specified, but is derived from the datatype; here the field ends at position 10.
zip_code POSITION(*) CHAR(5)
Here * specifies that the data field immediately follows the previous field and should be 5 bytes long.
state_abbr POSITION(*+1) CHAR(2)
Here +1 specifies an offset of 1 from the previous field. SQL*Loader skips 1 byte and reads the next 2 bytes, as derived from the CHAR(2) datatype.
As to why zip_code is CHAR: a zip code is treated as a simple fixed-length string. You are not going to do any arithmetic on it, so CHAR is appropriate.
Also, have a look at the SQL*Loader datatypes. In the control file you are telling SQL*Loader how to interpret the data, and that can differ from the table structure. In this example you could also specify INTEGER EXTERNAL for the zip code, as shown below.
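For instance (a sketch of just that field; the rest of the control file stays the same):
zip_code POSITION(*) INTEGER EXTERNAL(5),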
We need three text files and one batch file to load the data.
Suppose your file location is 'D:\loaddata' and the input file is 'D:\loaddata\abc.CSV'.
1. D:\loaddata\abc.bad -- empty
2. D:\loaddata\abc.log -- empty
3. D:\loaddata\abc.ctl -- write the code below
OPTIONS ( SKIP=1, DIRECT=TRUE, ERRORS=10000000, ROWS=5000000)
load data
infile 'D:\loaddata\abc.CSV'
TRUNCATE
into table Your_table
(
a_column POSITION (1:7) char,
b_column POSITION (8:10) char,
c_column POSITION (11:12) char,
d_column POSITION (13:13) char,
f_column POSITION (14:20) char
)
D:\loaddata\abc.bat --- for execution:
sqlldr db_user/db_password@your_tns control=D:\loaddata\abc.ctl log=D:\loaddata\abc.log
After double-clicking the 'D:\loaddata\abc.bat' file, your data will be loaded into the desired Oracle table. If anything goes wrong, check your 'D:\loaddata\abc.bad' and 'D:\loaddata\abc.log' files.
In order to load data (from a CSV file) into an Oracle database, I use SQL*Loader.
In the table that receives these data, there is a varchar2(500) column, called COMMENTS.
For some reason, I want to ignore this information from the CSV file.
Thus, I wrote this control file:
Options (BindSize=10000000,Readsize=10000000,Rows=5000,Errors=100)
Load Data
Infile 'XXX.txt'
Append into table T_XXX
Fields Terminated By ';'
TRAILING NULLCOLS
(
...
COMMENTS FILLER,
...
)
This code seems to work correctly, as the COMMENTS field in the database is always set to null.
However, if in my CSV file I have a record where the corresponding COMMENTS field exceeds the 500 characters limit, I get an error from SQL*Loader:
Record 2: Rejected - Error on table T_XXX, column COMMENTS.
Field in data file exceeds maximum length
Is there a way to really exclude the processing of my COMMENTS fields?
I can't reproduce your problem. I'm using Oracle 10.2.0.3.0 with SQL*Loader 10.2.0.1.
Here is my test case:
SQL> CREATE TABLE test_sqlldr (
2 ID NUMBER,
3 comments VARCHAR2(20),
4 id2 NUMBER
5 );
Table created
Control file:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments filler,
id2
)
data file:
1;aaa;2
3;abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz;4
5;bbb;6
I'm using the command sqlldr userid=xxx/yyy@zzz control=test.ctl and I'm getting all the rows without errors:
SQL> select * from test_sqlldr;
ID COMMENTS ID2
---------- -------------------- ----------
1 2
3 4
5 6
You may try another approach, I'm getting the same desired result with the following control file:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments "substr(:comments,1,0)",
id2
)
Update following Romaintaz's comment: I looked into it again and managed to get the same error as you when the size of the field exceeded 255 characters. This is because the default datatype of SQL*Loader is CHAR(255). If you have a field with more data, you will have to specify its length. The following control file solved the problem for a field of 300 characters:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( id,
comments filler char(4000),
id2
)
Hope this Helps,
--
Vincent
Just to suggest a tiny improvement, you might try something like:
LOAD DATA
INFILE test.data
INTO TABLE test_sqlldr
APPEND
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
id,
comments char(4000) "substr(:comments, 1, 200)",
id2)
Now you'll grab the first 200 characters (or any number you specify in its place) of all comments - unless some of your input records have comments values exceeding 4000 characters, in which case they'll be rejected by the loader with the 'exceeds maximum length' error noted earlier. But assuming that's rare or not the case, all the records will load, with longer comments truncated to 200 chars.
If you go over char(4000) you'll get a SQL*Loader error - there's a limit to how far you can push the beast.