SQL loader that inserts data from CSV file into a table - oracle

I'm currently inserting some columns from a CSV file into a table using SQL*Loader; that data is validated, and the remaining columns are filled automatically based on the inserted data.
But if there is unnecessary extra data beyond my required columns, it gets inserted into the other columns of the table, which are supposed to stay null when the data passes validation.
I want to take only certain columns from the CSV file and insert them into the table; none of the extra data from the other CSV columns should be loaded.
What should I do?
I'm wondering if there is anything I need to include in this:
OPTIONS (ERRORS=100000, SKIP=1)
LOAD DATA
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(emp_id CHAR,
dept CHAR,
class INTEGER,
subclass INTEGER
)

You don't have to do anything special: simply don't mention the unwanted CSV fields in the control file, i.e. list only the columns you want to load:
(emp_id CHAR, dept CHAR, class INTEGER, subclass INTEGER)
This works because the extra fields come after the ones you need, so SQL*Loader just ignores them. To illustrate: the sample table has only two columns:
SQL> create table test
2 (emp_id number,
3 dept_name varchar2(10));
Table created.
The control file's data contains some extra fields; I mean these:
1,A,some more data,123
2,B,not important,553
------------------
no columns for this in the table
Control file itself:
LOAD DATA
INFILE *
REPLACE
INTO TABLE test
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
emp_id,
dept_name
)
BEGINDATA
1,A,some more data,123
2,B,not important,553
Loading session:
SQL> $sqlldr scott/tiger@orcl control=test45.ctl log=test45.log
SQL*Loader: Release 18.0.0.0.0 - Production on Thu Oct 27 12:55:34 2022
Version 18.5.0.0.0
Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Commit point reached - logical record count 2
Table TEST:
2 Rows successfully loaded.
Check the log file:
test45.log
for more information about the load.
Result:
SQL> select * from test;
    EMP_ID DEPT_NAME
---------- ----------
         1 A
         2 B
SQL>
As you can see, everything is just fine.
If it were the other way round, i.e. the unwanted fields sat in the middle of the record rather than at the end, you'd declare them as FILLER so SQL*Loader reads them but never loads them (but - as that's not your issue - forget it. Actually, remember it, maybe you'll need it later).
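A minimal FILLER sketch against the same two-column test table (the field name extra_stuff is hypothetical): a field declared FILLER is read from the data file but never loaded:

```
LOAD DATA
INFILE *
REPLACE
INTO TABLE test
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
emp_id,
extra_stuff FILLER,    -- present in the file, skipped at load time
dept_name
)
BEGINDATA
1,some noise,A
2,more noise,B
```

Here only emp_id and dept_name reach the table; the middle field is consumed and discarded.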


Trouble with Oracle SQL Loader and a date field

I have a pipe-delimited CSV file and I'm trying to use SQL*Loader to import the data. The data type in the table is DATE. I'd like to import just the MM/DD/YYYY part, but I'm getting errors.
My control file code for this field is:
field_a char(1024),
field_in_question DATE'MM/DD/RRRR',
field_c,
Dates in Sample File:
5/28/2019 0:00
3/30/2020 0:00
12/16/2019 0:00
The error I'm currently receiving is:
ORA-01858: a non-numeric character was found where a numeric was expected
Any help would be greatly appreciated.
An Oracle DATE type includes a time component. Your input data also has a time component, so just adjust your input date mask to account for it:
field_in_question DATE 'MM/DD/YYYY hh:mi'
Notice I've also changed your mask for years to 'YYYY'. The 'RR' and 'RRRR' constructs were meant as a temporary band-aid to buy time in solving the Y2K bug, and that was over twenty years ago. It's long past time to stop needing temporary fixes.
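To see what RR actually does with two-digit years, here is a quick check you can run in any Oracle session (per the RR rules, while the current year is 20xx, values '00'-'49' map to the 2000s and '50'-'99' to the 1900s):

```
select to_char(to_date('49', 'RR'), 'YYYY') as rr_49,   -- 2049
       to_char(to_date('50', 'RR'), 'YYYY') as rr_50    -- 1950
from dual;
```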
Here's how.
Sample table:
SQL> create table test (name varchar2(10), datum date, colc number);
Table created.
Control file (sample data included):
load data
infile *
replace
into table test
fields terminated by '|'
trailing nullcols
(
name,
datum "to_date(:datum, 'mm/dd/yyyy hh24:mi')",
colc
)
begindata
Little|5/28/2019 0:00|1
Foot|3/30/2020 0:00|2
Bigfoot|12/16/2019 0:00|3
Loading session and the result:
SQL> $sqlldr scott/tiger control=test23.ctl log=test23.log
SQL*Loader: Release 11.2.0.2.0 - Production on Mon Nov 16 22:35:58 2020
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 2
Commit point reached - logical record count 3
SQL> select * from test;
NAME       DATUM                     COLC
---------- ------------------- ----------
Little     28.05.2019 00:00:00          1
Foot       30.03.2020 00:00:00          2
Bigfoot    16.12.2019 00:00:00          3
SQL>

How to read data from text file with comma separated values and insert into temp table using in stored procedure

File name: emp.txt - the text file contains data like this:
emp_no,emp_EXPIRY_DATE,STATUS
a123456,2020-07-12,A
a123457,2020-07-12,A
I want to insert data into a temp table using a stored procedure.
Which database do you use? Oracle SQL Developer suggests Oracle (of course), but the code you posted as a comment isn't Oracle.
Anyway, if it were, doing what you plan would require the UTL_FILE package. The CSV file should be put into a directory (usually on the database server) that backs a DIRECTORY object in Oracle; the user who is supposed to load the data should have read (and possibly write) privileges on it.
Alternatively, you could use the CSV file as an external table. That option might be simpler, as it lets you write ordinary SELECT statements against it, i.e. read data from it and insert it into the target table that resides in the Oracle database. This option also requires the "directory" setup.
Or, if you want to do it locally, consider using SQL*Loader: create a control file and load the data. This option can be extremely fast, way faster than the previous options. You won't see any difference for small files, but for a lot of data this might be your choice.
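For completeness, here is a sketch of the external-table option against the same target columns (the DIRECTORY name ext_dir is an assumption - you'd create the directory object and grant privileges first - and the date mask assumes the file's dates are ISO yyyy-mm-dd):

```
create table emp_ext (
  emp_no          varchar2(10),
  emp_expiry_date varchar2(10),
  status          varchar2(1)
)
organization external (
  type oracle_loader
  default directory ext_dir          -- hypothetical DIRECTORY object
  access parameters (
    records delimited by newline
    skip 1                           -- skip the header line
    fields terminated by ','
    missing field values are null
  )
  location ('emp.txt')
);

-- then just SELECT from it and INSERT into the target table:
insert into test (emp_no, emp_expiry_date, status)
select emp_no, to_date(emp_expiry_date, 'yyyy-mm-dd'), status
from emp_ext;
```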
A SQL*Loader example:
Test table:
SQL> create table test
2 (emp_no varchar2(10),
3 emp_expiry_date date,
4 status varchar2(1));
Table created.
Control file:
options (skip=1)
LOAD DATA
infile emp.txt
replace
INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
emp_no,
emp_expiry_date "to_date(:emp_expiry_date, 'yyyy-mm-dd')",
status
)
Loading session & the result:
SQL> alter session set nls_date_format = 'yyyy-mm-dd';
Session altered.
SQL> $sqlldr scott/tiger control=test13.ctl log=test13.log
SQL*Loader: Release 11.2.0.2.0 - Production on Wed Dec 11 21:02:44 2019
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from test;
EMP_NO     EMP_EXPIRY S
---------- ---------- -
a123456    2020-07-12 A
a123457    2020-07-12 A
SQL>
SQL>

read and insert data from text file to database table using oracle SQL Plus

I really need your help.
I always work on SQL Server, but now I'm working on something else, and that's why I need your help.
I'm working with Oracle SQL*Plus. I have a text file, let's say test.txt, and I just want to upload data from this file into a database table using SQL*Plus.
Let's say the text file data is:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill 25-5-2018
How do I write PL/SQL code in SQL*Plus to upload the data from the text file to the table in my database?
On SQL Server I usually use BULK INSERT; what is the method here?
I tried many things from the internet, but nothing solved it.
Please help me. Thanks a lot.
If the text file is on the same machine you're running SQL*Plus from, you can use the SQL*Loader utility.
As a simple example, lets say your table is:
create table your_table (id number, name varchar2(10), some_date date);
And you have a text file data.txt containing what you showed, but with a comma added on the third line:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill,25-5-2018
You can create a basic SQL*Loader control file in the same directory, called say your_table.ctl, with something like:
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID,
NAME,
SOME_DATE DATE "DD-MM-YYYY"
)
Look at the documentation to see what all those mean, particularly what APPEND means; you may want to TRUNCATE instead - but be careful with that.
Then run SQL*Loader from the command line (not from within SQL*Plus), using the same credentials and connect string you normally use to connect to the database:
sqlldr userid=usr/pwd@tns control=your_table.ctl
Once that has completed - assuming there are no errors reported on the console or in the log file it creates - querying your table will show:
select * from your_table;
        ID NAME       SOME_DATE
---------- ---------- ----------
         1 mike       2018-01-01
         2 jon        2017-12-20
         3 bill       2018-05-25
There are lots of other options and capabilities, but that might cover what you need at the moment.

sqlldr issue with copying a certain table

I have 4 tables I am copying from one schema to another using sqlldr. 3 of the tables gave me no issues and I was able to copy them all over successfully. The 4th is where my problem arises, and I can't quite understand why; there is nothing special about this 4th table as far as data types or anything else goes. When I run the sqlldr command, all rows end up in the .bad file and none are copied over. I will list the code I'm using for better understanding.
> pico deptbb02.csv
UW PICO(tm) 4.10 File: deptbb02.csv
10,infield,Jade,Clairmont,Lets play two
20,outfield,House of Pasta,Santee,Alea iacta est
30,pitcher,Crab Shack,Pacific Beach,Semper paratus
40,staff,Burger King,Lakeside,Experientia docet
50,catchers,Pinnacle Peak,Santee,Non Bastardi Carborundum
UW PICO(tm) 4.10 File: deptload.ctl
LOAD DATA
infile 'deptbb02.csv'
replace into table deptbb02
fields terminated by ','
(DEPTNO,DNAME,RESTAURANT,LOCATION,MOTTO)
> sqlldr username/password@database control=deptload.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Tue Jul 28 01:27:38 2015
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 6
TABLE deptbb02 defined as...
SQL> desc deptbb02
Name                                      Null?    Type
----------------------------------------- -------- ----------------------------
DEPTNO                                             NUMBER(3)
DNAME                                              VARCHAR2(8)
RESTAURANT                                         VARCHAR2(15)
LOCATION                                           VARCHAR2(15)
MOTTO                                              VARCHAR2(30)
I think this should be everything needed to understand my question, but don't hesitate to ask if I missed something. Thanks!
I suspect the data has spaces or control characters at the end of the MOTTO column, or the sample data you posted is not the same as what was actually loaded according to the .log file, i.e.:
Record 3: Rejected - Error on table DEPTBB02, column MOTTO. ORA-12899: value too large for column "ST101"."DEPTBB02"."MOTTO" (actual: 44, maximum: 30)
The MOTTO column is defined in the table as varchar2(30) but sqlldr sees 44 characters. The data as shown for the 3rd record is 14 characters. Open the data file in an editor that can show control characters and spaces.
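One quick way to do that from the command line is cat -A (the GNU coreutils flag; on other systems try cat -et), which makes trailing spaces and DOS carriage returns visible. A small self-contained sketch with fabricated sample data:

```shell
# Build a sample line with trailing spaces and a DOS carriage return,
# then display it with invisible characters marked: tabs show as ^I,
# carriage returns as ^M, and each line end as $.
printf '30,pitcher,Crab Shack,Pacific Beach,Semper paratus   \r\n' > sample.csv
cat -A sample.csv
```

If your real file shows `^M$` or a run of trailing spaces before the line end, that is where the extra characters counted by ORA-12899 are coming from.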
Try putting a TRIM() call around the fields in the control file to remove leading and trailing spaces maybe:
LOAD DATA
infile 'deptbb02.csv'
replace into table deptbb02
fields terminated by ','
(DEPTNO,
DNAME,
RESTAURANT,
LOCATION,
MOTTO CHAR "TRIM(:MOTTO)"
)

bulk load UDT columns in Oracle

I have a table with the following structure:
create table my_table (
id integer,
point Point -- UDT made of two integers (x, y)
)
and I have a CSV file with the following data:
#id, point
1|(3, 5)
2|(7, 2)
3|(6, 2)
Now I want to bulk load this CSV into my table, but I can't find any information about how to handle the UDT in the Oracle sqlldr utility. Is it possible to use the bulk load utility with UDT columns?
I don't know if sqlldr can do this, but personally I would use an external table.
Attach the file as an external table (the file must be on the database server), and then insert the contents of the external table into the destination table transforming the UDT into two values as you go. The following select from dual should help you with the translation:
select
regexp_substr('(5, 678)', '[[:digit:]]+', 1, 1) x_point,
regexp_substr('(5, 678)', '[[:digit:]]+', 1, 2) y_point
from dual;
UPDATE
In sqlldr, you can transform fields using standard SQL expressions:
LOAD DATA
INFILE 'data.dat'
BADFILE 'bad_orders.txt'
APPEND
INTO TABLE test_tab
FIELDS TERMINATED BY "|"
( info,
  x_cord "regexp_substr(:x_cord, '[[:digit:]]+', 1, 1)"
)
The control file above will extract the first number from fields like (3, 4), but I cannot find a way to extract the second one - i.e. I am not sure if it is possible to have the same field in the input file inserted into two columns.
If external tables are not an option for you, I would suggest either (1) transforming the file before loading, using sed, awk, Perl etc., or (2) loading the file with SQL*Loader into a temporary table and then having a second process transform the data and insert it into your final table. Another option is to look at how the file is generated - could you generate it so that the field you need to transform is repeated in two fields in the file, e.g.:
data|(1, 2)|(1, 2)
Maybe someone else will chip in with a way to get sqlldr to do what you want.
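Option (1), transforming the file before loading, can be sketched with sed (file names and sample data here are made up; the pattern splits the "(x, y)" field into two pipe-delimited values so each coordinate becomes its own loadable field):

```shell
# Flatten "one|(3, 5)" into "one|3|5" before running sqlldr
printf 'one|(3, 5)\ntwo|(7, 2)\n' > data.dat          # sample input
sed -E 's/\(([0-9]+), *([0-9]+)\)/\1|\2/' data.dat > data_flat.dat
cat data_flat.dat
```

After this, a plain control file with three fields terminated by '|' can load the id, x and y columns directly.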
Solved the problem after more research: Oracle SQL*Loader does have this feature, and it is used by specifying a column object. The following was the solution:
LOAD DATA
INFILE *
INTO TABLE my_table
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
id,
point column object
(
x,
y
)
)
BEGINDATA
1,3,5
2,7,2
3,6,2
