How to insert a string of more than 4000 bytes into a table column of type CLOB using SQL*Loader - Oracle

I am trying to load a string of length more than 4000 into a table column of type CLOB. I know we can do this using an anonymous block, but how can I use such a block in the control file?

You need to provide the maximum size of the CLOB field, as follows:
LOAD DATA
INFILE <your_filename>
INTO TABLE <your_table_name>
FIELDS TERMINATED BY '<your_separator>'
TRAILING NULLCOLS
(
id,
<your_clob_column> CHAR (6000), -- max length of your clob data; the default is 255
other_fields
)
The default field size is 255 bytes, so SQL*Loader will throw an error if you do not specify a size and the data is longer than 255 bytes. It is better to always specify the size, as shown above.
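For instance, a complete control file for a two-column table could look like this (the table, file, and column names are hypothetical):
LOAD DATA
INFILE 'documents.dat'
INTO TABLE documents
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
id,
doc_text CHAR(6000) -- raises the 255-byte default so strings longer than 4000 bytes fit
)
No anonymous block is needed; SQL*Loader writes the value straight into the CLOB column.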
Cheers!!

Related

Why can't I trim a column of type CHAR?

Like the title says, if I create a table in my DB:
CREATE TABLE TEST
(
FIELD CHAR(20 CHAR) NULL
)
NOLOGGING
NOCOMPRESS
NOCACHE;
Insert this :
Insert into TEST
(FIELD)
Values
('TEST            '); -- note the trailing blank spaces
COMMIT;
Then I run the following statement:
UPDATE TEST SET FIELD = TRIM(FIELD);
COMMIT;
but the field still has blank spaces. Notice that if I change the data type to varchar2, it works... does anyone know why?
Thanks!
char is a fixed-width data type: a char(20) will always and forever have a length of 20. If you try to insert a shorter string, it is padded with spaces to the fixed width of the field. So
UPDATE TEST SET FIELD = TRIM(FIELD);
removes the spaces with the trim function, then adds them back, because the string that gets written has to be exactly 20 characters long.
Practically, there is almost never a reason to use char; you are almost always better off with varchar2. varchar2 is a variable-length data type, so the database has no need to append spaces to the end.
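To see the padding in action (a quick check using the TEST table from the question):
SELECT FIELD, LENGTH(FIELD) FROM TEST;
-- LENGTH is 20 both before and after the UPDATE, because
-- CHAR(20 CHAR) re-pads the trimmed value to 20 characters.
If you can change the schema, converting the column makes the TRIM stick:
ALTER TABLE TEST MODIFY (FIELD VARCHAR2(20 CHAR));
UPDATE TEST SET FIELD = TRIM(FIELD);
COMMIT;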

I want to change the size of a CLOB in my Oracle tables

I have a table with a CLOB column, and I want to increase the length of the CLOB.
I executed this command: Alter table TableName (flddata CLOB length 100 Gb);
but I got the error "Invalid alter table option".
How can I increase the size of the field?
Thanks in advance
A CLOB column has no size limit in the type specification. When you define a column as CLOB, no size is associated with it, unlike the other string types (char, varchar2).
However, the limit of what you can store there is defined by this:
CLOB Maximum size: (4 GB - 1) * DB_BLOCK_SIZE initialization parameter (8 TB to 128 TB)
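So there is nothing to resize in the column definition. As a quick sketch (the table name is made up), a CLOB is declared without any length, and the practical limit depends on your block size:
CREATE TABLE t (flddata CLOB); -- no length specification is allowed here
SELECT value FROM v$parameter WHERE name = 'db_block_size'; -- the block size used in the formula above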

Loading data into a table using SQL Loader

I'm loading data into my table through SQL*Loader.
The data loads successfully, but I'm getting a garbage (repetitive) value in a particular column for all rows.
After inserting:
the column TERM_AGREEMENT gets the value '806158336' for every record.
My csv file contains at most 3-digit data for that column, but I'm forced to set my column definition to NUMBER(10).
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER -- the problem field
)
create table LOAN_BALANCE_MASTER_INT
(
ACCOUNT_NO NUMBER(30),
CUSTOMER_NAME VARCHAR2(70),
LIMIT NUMBER(30),
PRODUCT_DESC VARCHAR2(30),
SUBPRODUCT_CODE NUMBER,
ARREARS_INT NUMBER(20,2),
IRREGULARITY NUMBER(20,2),
PRINCIPLE_IRREGULARITY NUMBER(20,2),
TERM_AGREEMENT NUMBER(10) -- the problem column
)
INTEGER is a binary data type. If you're importing a csv file, the numbers are stored as plain text, so you should use INTEGER EXTERNAL instead. The EXTERNAL clause specifies character data that represents a number.
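As a rough illustration of why reading text bytes as binary yields a large, constant-looking number (the sample string is made up, and the byte order shown is big-endian for simplicity; SQL*Loader uses the platform's native order):
-- '123' followed by a carriage return, read as a 4-byte integer:
SELECT UTL_RAW.CAST_TO_BINARY_INTEGER(UTL_RAW.CAST_TO_RAW('123' || CHR(13))) AS n FROM dual;
-- returns 825373453: ordinary digit bytes plus a stray CR, interpreted as raw binary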
Edit:
The issue seems to be the termination character of the file. You should be able to solve this issue by editing the INFILE line this way:
INFILE '/ipoapplication/utl_file/LBR_HE_Mar16.csv' "STR X'5E204D'"
Where '5E204D' is the hexadecimal for '^ M'. To get the hexadecimal value you can use the following query:
SELECT utl_raw.cast_to_raw ('^ M') AS hexadecimal FROM dual;
Hope this helps.
I actually solved this issue on my own.
Firstly, thanks to #Gary_W and #Alessandro for their inputs. I really appreciate your help, guys; I learned some new things in the process.
Here's the new fragment which worked, and I got the correct data for the last column:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER TERMINATED BY WHITESPACE -- the fix
)
'Terminated by whitespace': I went through some SQL*Loader threads and saw someone use 'terminated by whitespace' in the last column of his ctl file. It worked, and this time I didn't even have to use 'INTEGER', 'EXTERNAL', or an EXPRESSION '..' for conversion.
Just one thing: can you guys let me know what could possibly have been causing the issue? What was in my csv file in that column, and how did adding this clause solve it?
Thanks.

SQL loader position

I'm new to SQL*Loader and a bit confused about POSITION.
Let's use the following sample data as reference:
Munising  49862 MI
Shingleton49884 MI
Seney     49883 MI
And here is the load statement:
LOAD DATA
INFILE 'zipcodes.dat'
REPLACE INTO TABLE zipcodes (
city_name POSITION(1) CHAR(10),
zip_code POSITION(*) CHAR(5),
state_abbr POSITION(*+1) CHAR(2)
)
In the load statement, the city_name POSITION is 1. How does SQLLDR know where it ends? Is CHAR(10) the trick here? Counting the two spaces behind 'Munising', it has 10 characters.
Also, why is zip_code assigned CHAR even though it contains nothing but numbers?
Thank You
Yes, when the end position is not specified, it is derived from the datatype. This documentation explains the POSITION clause.
city_name POSITION(1) CHAR(10)
Here the starting position of the data field is 1. The ending position is not specified, but is derived from the datatype; that is, it ends at position 10.
zip_code POSITION(*) CHAR(5)
Here * specifies that the data field immediately follows the previous field and should be 5 bytes long.
state_abbr POSITION(*+1) CHAR(2)
Here +1 specifies the offset from the previous field. SQL*Loader skips 1 byte and reads the next 2 bytes, as derived from the char(2) datatype.
As to why zipcode is CHAR: a zip code is considered simply a fixed-length string. You are not going to do any arithmetic operations on it, so CHAR is appropriate.
Also, have a look at the SQL*Loader datatypes. In the control file you are telling SQL*Loader how to interpret the data; it can differ from the table structure. In this example you could also specify INTEGER EXTERNAL for the zip code, as sketched below.
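For example, that alternative declaration would be (a sketch; the rest of the control file stays the same):
zip_code POSITION(*) INTEGER EXTERNAL(5)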
You need three text files and one batch file to load the data.
Suppose your file location is 'D:\loaddata' and your input file is 'D:\loaddata\abc.CSV'. Create:
1. D:\loaddata\abc.bad -- empty
2. D:\loaddata\abc.log -- empty
3. D:\loaddata\abc.ctl -- write the code below
OPTIONS ( SKIP=1, DIRECT=TRUE, ERRORS=10000000, ROWS=5000000)
load data
infile 'D:\loaddata\abc.CSV'
TRUNCATE
into table Your_table
(
a_column POSITION (1:7) char,
b_column POSITION (8:10) char,
c_column POSITION (11:12) char,
d_column POSITION (13:13) char,
f_column POSITION (14:20) char
)
D:\loaddata\abc.bat -- for execution:
sqlldr db_user/db_password@your_tns control=D:\loaddata\abc.ctl log=D:\loaddata\abc.log
After double-clicking the 'D:\loaddata\abc.bat' file, your data will be loaded into the desired Oracle table. If anything goes wrong, check your 'D:\loaddata\abc.bad' and 'D:\loaddata\abc.log' files.

How to make a varchar2 field shorter in Oracle?

I have a field in a table that is varchar2, 4000 bytes. There are over 50000 rows, and not all rows have data in this field. Few of the values are over 255 bytes, but some reach 4000. To place the table in a new application, I need to shorten the field to 255 bytes.
Is there a SQL statement that will reduce the length to 255? I realize data will be lost, that is part of the cost of the new application. The cut should be arbitrary, just stopping the data at 255 no matter the circumstance.
update b set text2 = substr(text2,1,255);
then alter the table to set the column length to 255:
alter table b MODIFY "TEXT2" varchar2(255 byte);
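Before running the update, it may be worth checking how many rows will actually be truncated (same table and column names as above):
select count(*) from b where length(text2) > 255;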
