How to read data from a text file with comma-separated values and insert it into a temp table using a stored procedure - Oracle

File name emp.txt - the text file contains data like this:
emp_no,emp_EXPIRY_DATE,STATUS
a123456,2020-07-12,A
a123457,2020-07-12,A
I want to insert data into a temp table using a stored procedure.

Which database do you use? "Oracle SQL Developer" looks like Oracle (of course), but the code you posted as a comment isn't Oracle.
Anyway, if it were, then doing what you plan to do would require the UTL_FILE package. The CSV file should be put into a directory (usually on the database server) which is the source for a directory (as an Oracle object); the user that is supposed to load data should have read (and write?) privileges on it.
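To sketch that UTL_FILE approach for the original question: a minimal stored procedure might look like the following. This is only a sketch under assumptions - it presumes a directory object named EMP_DIR already exists and is readable, that the target table is called temp_emp with the three columns from the file, and it parses fields with REGEXP_SUBSTR (all names here are illustrative, not from the original post):

```sql
create or replace procedure load_emp_csv as
  l_file  utl_file.file_type;
  l_line  varchar2(32767);
begin
  -- EMP_DIR is an assumed directory object pointing at the folder with emp.txt
  l_file := utl_file.fopen('EMP_DIR', 'emp.txt', 'R');
  begin
    utl_file.get_line(l_file, l_line);              -- skip the header row
    loop
      utl_file.get_line(l_file, l_line);            -- raises NO_DATA_FOUND at EOF
      insert into temp_emp (emp_no, emp_expiry_date, status)
      values (regexp_substr(l_line, '[^,]+', 1, 1),
              to_date(regexp_substr(l_line, '[^,]+', 1, 2), 'yyyy-mm-dd'),
              regexp_substr(l_line, '[^,]+', 1, 3));
    end loop;
  exception
    when no_data_found then null;                   -- end of file reached
  end;
  utl_file.fclose(l_file);
  commit;
end;
/
```

Note that `[^,]+` skips empty fields, so this simple parse only holds up while every field is populated.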
Alternatively, you could use the CSV file as an external table. That option might be simpler, as it allows you to write ordinary SELECT statements against it, i.e. read data from it and insert it into the target table that resides in the Oracle database. This option also requires the "directory" stuff.
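A sketch of that external-table variant, assuming a directory object named EXT_DIR points at the folder holding emp.txt and temp_emp is the target table (both names are illustrative):

```sql
create table emp_ext
( emp_no          varchar2(10),
  emp_expiry_date date,
  status          varchar2(1)
)
organization external
( type oracle_loader
  default directory ext_dir
  access parameters
  ( records delimited by newline
    skip 1                                   -- skip the header row
    fields terminated by ','
    missing field values are null
    ( emp_no,
      emp_expiry_date char(10) date_format date mask "yyyy-mm-dd",
      status
    )
  )
  location ('emp.txt')
)
reject limit unlimited;

-- then read it like any other table
insert into temp_emp
select emp_no, emp_expiry_date, status from emp_ext;
```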
Or, if you want to do that locally, consider using SQL*Loader: create a control file and load the data. This option can be extremely fast, way faster than the previous options. You won't see any difference for small files, but for a lot of data this might be your choice.
A SQL*Loader example:
Test table:
SQL> create table test
2 (emp_no varchar2(10),
3 emp_expiry_date date,
4 status varchar2(1));
Table created.
Control file:
options (skip=1)
LOAD DATA
infile emp.txt
replace
INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
emp_no,
emp_expiry_date "to_date(:emp_expiry_date, 'yyyy-mm-dd')",
status
)
Loading session & the result:
SQL> alter session set nls_date_format = 'yyyy-mm-dd';
Session altered.
SQL> $sqlldr scott/tiger control=test13.ctl log=test13.log
SQL*Loader: Release 11.2.0.2.0 - Production on Wed Dec 11 21:02:44 2019
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from test;
EMP_NO     EMP_EXPIRY S
---------- ---------- -
a123456    2020-07-12 A
a123457    2020-07-12 A
SQL>
SQL>

Related

SQL loader that inserts data from CSV file into a table

I'm currently inserting some columns from a CSV file into a table using SQL*Loader; that data is validated, and the remaining columns are filled automatically based on the inserted data.
But if there is unnecessary extra data beyond my required columns, it gets inserted into the other columns of the table, which are supposed to stay null when the data passes validation.
I want to take only certain columns from the CSV file and insert them into the table; no extra data from the other columns in the CSV file should be loaded.
What should I do? I'm wondering if there is anything I need to include in this:
Options (ERRORS=100000,SKIP=1)
Load data
Append
Into table emp
Fields terminated by ',' optionally enclosed by '"'
Trailing Nullcols
(Emp_id char,
Dept char,
Class integer,
Subclass integer
)
You don't have to do anything special - just omit the unwanted columns from this list in the control file:
(Emp_id char, Dept char, Class integer, Subclass integer )
To illustrate it: sample table has only two columns:
SQL> create table test
2 (emp_id number,
3 dept_name varchar2(10));
Table created.
The control file's data contains some extra fields - I'm talking about these:
1,A,some more data,123
2,B,not important,553
    ------------------
    no columns for these in the table
Control file itself:
LOAD DATA
INFILE *
REPLACE
INTO TABLE test
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
emp_id,
dept_name
)
BEGINDATA
1,A,some more data,123
2,B,not important,553
Loading session:
SQL> $sqlldr scott/tiger@orcl control=test45.ctl log=test45.log
SQL*Loader: Release 18.0.0.0.0 - Production on Thu Oct 27 12:55:34 2022
Version 18.5.0.0.0
Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Commit point reached - logical record count 2
Table TEST:
2 Rows successfully loaded.
Check the log file:
test45.log
for more information about the load.
Result:
SQL> select * from test;
    EMP_ID DEPT_NAME
---------- ----------
         1 A
         2 B
SQL>
As you can see, everything is just fine.
If it were vice versa - i.e. the CSV file contained fields in the middle of the record that you don't want to load - you'd declare them as FILLER so SQL*Loader reads but skips them (but - as that's not your issue - forget it. Actually, remember it; maybe you'll need it later).
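For reference, a sketch of FILLER in a control file - it tells SQL*Loader to read a field from the data file but not load it into the table (the field name extra_info and the data are illustrative, following the same two-column test table as above):

```sql
LOAD DATA
INFILE *
REPLACE
INTO TABLE test
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
  emp_id,
  extra_info FILLER,      -- read from the file, but never loaded
  dept_name
)
BEGINDATA
1,ignore me,A
2,ignore me too,B
```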

Oracle SQL Loader control file to ignore ellipsis

I have an Oracle SQL Loader control file based on position in a text file. One particular field periodically gets an ellipsis '...' from the source, which causes a carriage return in the loading table. No matter how many times I request '...' to NOT be used by these users, there is eventually someone who forgets, or due to staff turnover, etc. Here is the current control file line for that field:
TRAN_DESC POSITION(153 : 202) Char,
Is there any command that can be added to this line in order to ignore special characters such as an ellipsis?
I'd think of REPLACE. Here's an example.
Sample table:
SQL> create table test (id number, tran_desc varchar2(10));
Table created.
Control file:
load data
infile *
into table test
(id position(1:2),
tran_desc position(3:12) char "replace(:tran_desc, '...', '')"
)
begindata
10LittleFoot
11Big...foot
Loading session and result:
SQL> $sqlldr scott/tiger control=test2.ctl log=test2.log
SQL*Loader: Release 11.2.0.2.0 - Production on Mon Apr 5 17:03:39 2021
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 2
SQL> select * from test;
        ID TRAN_DESC
---------- ----------
        10 LittleFoot
        11 Bigfoot
SQL>

Oracle SQL: Running insert statements from a large text file

I have a large text file (around 50mb). This text file has thousands of insert statements. I tried to open the text file in Oracle SQL Developer, but it is too large. How do I insert the data into my tables without opening the file in SQL Developer?
I tried to loop through the insert statements one by one and insert them into my table like this:
DECLARE
V1 VARCHAR2(32767);
fileVariable UTL_FILE.FILE_TYPE;
BEGIN
fileVariable := UTL_FILE.FOPEN('h:/Documents',
'clob_export.sql',
'R',
32760);
UTL_FILE.GET_LINE(fileVariable,V1,32767);
UTL_FILE.FCLOSE(fileVariable);
END;
But this doesn't seem to work. I can't create directories on the machine and, anyway, the text file is on the computer where I am running SQL Developer, while SQL Developer is connected remotely to the database.
The simplest way - from my point of view - is to run it from SQL*Plus, such as:
c:\Temp>sqlplus scott/tiger
SQL*Plus: Release 11.2.0.2.0 Production on Tue Jan 26 22:20:18 2021
Copyright (c) 1982, 2014, Oracle. All rights reserved.
Connected to:
Oracle Database 11g Express Edition Release 11.2.0.2.0 - 64bit Production
SQL> @insert_data.sql
1 row created.
1 row created.
<snip>
presuming that insert_data.sql contains something like
insert into dept values (1, 'Dept 1', 'NY');
insert into dept values (2, 'Dept 2', 'London');
...
Use SQL*Plus, and if there is too much output, use options to log only to a file, not to the screen:
SET TERMOUT OFF;
spool M:\Documents\test.log;
Call the file with @ instead of trying to open it. You may also want to disable feedback, to avoid many thousands of "1 row created." messages:
set feedback off;
@c:\users\jon\Desktop\test.sql
The above commands are SQL*Plus syntax, but Oracle SQL Developer worksheets understand basic SQL*Plus commands. If you need to frequently run large scripts then you might want to learn the command line SQL*Plus, but if this is just a one-time task then stick with SQL Developer.

read and insert data from text file to database table using oracle SQL Plus

I really need your help.
I always work on SQL Server, but now I am working on something else, and that is why I need your help.
I am working in Oracle SQL*Plus. I have a text file, let's say named test.txt, and I want to upload data from this file to a database table using SQL*Plus.
Let's say the text file data is:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill 25-5-2018
How do I write PL/SQL code in SQL*Plus to upload the data from the text file to the table in my database?
Usually on SQL Server I use BULK INSERT; what is the method here?
I tried many things from the internet but nothing solved it.
Please help me. Thanks a lot.
If the text file is on the same machine you're running SQL*Plus from, you can use the SQL*Loader utility.
As a simple example, lets say your table is:
create table your_table (id number, name varchar2(10), some_date date);
And you have a text file data.txt containing what you showed, but with a comma added on the third line:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill,25-5-2018
You can create a basic SQL*Loader control file in the same directory, called say your_table.ctl, with something like:
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID,
NAME,
SOME_DATE DATE "DD-MM-YYYY"
)
Look at the documentation to see what all those mean, particularly what APPEND means; you may want to TRUNCATE instead - but be careful with that.
Then run SQL*Loader from the command line (not from within SQL*Plus), using the same credentials and connect string you normally use to connect to the database:
sqlldr userid=usr/pwd@tns control=your_table.ctl
Once that has completed - assuming there are no errors reported on the console or in the log file it creates - querying your table will show:
select * from your_table;
        ID NAME       SOME_DATE
---------- ---------- ----------
         1 mike       2018-01-01
         2 jon        2017-12-20
         3 bill       2018-05-25
There are lots of other options and capabilities, but that might cover what you need at the moment.

sqlldr issue with copying a certain table

I have 4 tables I am copying from one schema to another using sqlldr. 3 of the tables gave me no issues and I was able to copy them all over successfully. The 4th is where my problem arises. I can't quite understand why; there is nothing special about this 4th table as far as data types or anything else go. When I run the sqlldr command, all rows end up in the .bad file and none are copied over. I will list the code I'm using for better understanding.
> pico deptbb02.csv
UW PICO(tm) 4.10 File: deptbb02.csv
10,infield,Jade,Clairmont,Lets play two
20,outfield,House of Pasta,Santee,Alea iacta est
30,pitcher,Crab Shack,Pacific Beach,Semper paratus
40,staff,Burger King,Lakeside,Experientia docet
50,catchers,Pinnacle Peak,Santee,Non Bastardi Carborundum
UW PICO(tm) 4.10 File: deptload.ctl
LOAD DATA
infile 'deptbb02.csv'
replace into table deptbb02
fields terminated by ','
(DEPTNO,DNAME,RESTAURANT,LOCATION,MOTTO)
> sqlldr username/password@database
control = deptload.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Tue Jul 28 01:27:38 2015
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 6
TABLE deptbb02 defined as...
SQL> desc deptbb02
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 DEPTNO                                             NUMBER(3)
 DNAME                                              VARCHAR2(8)
 RESTAURANT                                         VARCHAR2(15)
 LOCATION                                           VARCHAR2(15)
 MOTTO                                              VARCHAR2(30)
I think this should be everything needed to understand my question, but don't hesitate to ask if I missed something. Thanks!
I suspect the data has spaces or control characters at the end of the MOTTO column, or the sample data you posted is not the same as what was actually attempted to load, per the .log file, i.e.:
Record 3: Rejected - Error on table DEPTBB02, column MOTTO. ORA-12899: value too large for column "ST101"."DEPTBB02"."MOTTO" (actual: 44, maximum: 30)
The MOTTO column is defined in the table as varchar2(30) but sqlldr sees 44 characters. The data as shown for the 3rd record is 14 characters. Open the data file in an editor that can show control characters and spaces.
Try putting a TRIM() call around the fields in the control file to remove leading and trailing spaces maybe:
LOAD DATA
infile 'deptbb02.csv'
replace into table deptbb02
fields terminated by ','
(DEPTNO,
DNAME,
RESTAURANT,
LOCATION,
MOTTO CHAR "TRIM(:MOTTO)"
)
