Read data from a text file and, instead of storing it in a table, can we add the data directly to a cursor and fetch that cursor in a procedure? - oracle

Context: I have a text file that may contain data such as:
Employee Salary
name:start_salary:current_salary
emp1:30000:40000
emp2:35000:40000
.
.
Emp details
name:role:experience
emp1:Analyst:2
emp2:DBA:1
emp3:Developer:3
I know I can read this text file from PL/SQL, load the data into a table, and then use a cursor to work with that data in my PL/SQL code.
But I want to skip the step of creating a table and use the data on the fly. Can we read the data directly into a cursor?
Can someone please tell me if that is possible?

You can do that using the UTL_FILE package. This allows PL/SQL to read and write operating system text files.
Once you open the file, you can read its contents into a PL/SQL collection and then perform all the necessary processing directly on that in-memory data.
Note that you have to know how your file is formatted and the structure of the data you are reading.
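A minimal sketch, assuming a directory object called DATA_DIR already points at the folder holding the file and that the file is named emp_data.txt with the name:start_salary:current_salary layout shown in the question (both names are hypothetical):
DECLARE
  TYPE t_emp_rec IS RECORD (
    name           VARCHAR2(30),
    start_salary   NUMBER,
    current_salary NUMBER
  );
  TYPE t_emp_tab IS TABLE OF t_emp_rec;

  l_file UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(4000);
  l_emps t_emp_tab := t_emp_tab();
BEGIN
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'emp_data.txt', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    -- keep only name:number:number rows, skipping headers and blank lines
    IF REGEXP_LIKE(l_line, '^[^:]+:[0-9]+:[0-9]+$') THEN
      l_emps.EXTEND;
      l_emps(l_emps.COUNT).name           := REGEXP_SUBSTR(l_line, '[^:]+', 1, 1);
      l_emps(l_emps.COUNT).start_salary   := TO_NUMBER(REGEXP_SUBSTR(l_line, '[^:]+', 1, 2));
      l_emps(l_emps.COUNT).current_salary := TO_NUMBER(REGEXP_SUBSTR(l_line, '[^:]+', 1, 3));
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);

  -- use the in-memory data directly; no staging table involved
  FOR i IN 1 .. l_emps.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE(l_emps(i).name || ' raise: ' ||
      (l_emps(i).current_salary - l_emps(i).start_salary));
  END LOOP;
END;
/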

Check out the implicit EXTERNAL TABLE syntax (inline external tables, available from Oracle 18c), which lets you query a flat file directly from within a SELECT statement, e.g.:
SQL> select * from external (
       ( empno   number(4),
         ename   varchar2(10),
         ...
       )
       type oracle_loader
       default directory TMP
       access parameters
       ( records delimited by newline
         fields terminated by ','
         missing field values are null
         ( empno, ename, job, mgr, ... )
       )
       location ('emp20161001.dat')
     );
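As a rough sketch of the same idea applied to the salary file from the original question, assuming Oracle 18c or later and a directory object DATA_DIR pointing at a folder containing a file called emp_data.txt (both names hypothetical); header lines such as "Employee Salary" fail the numeric conversion and are simply rejected thanks to the reject limit:
select *
from   external (
         ( name           varchar2(30),
           start_salary   number,
           current_salary number
         )
         type oracle_loader
         default directory DATA_DIR
         access parameters
         ( records delimited by newline
           fields terminated by ':'
           missing field values are null
           ( name, start_salary, current_salary )
         )
         location ('emp_data.txt')
         reject limit unlimited
       ) emp_salaries;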

Related

read and insert data from text file to database table using oracle SQL Plus

I really need your help.
I have always worked with SQL Server, but now I am working on something else, and that is why I need your help.
I am working with Oracle SQL*Plus. I have a text file, let's say named test.txt, and I just want to upload the data from this file into a database table using SQL*Plus.
Let's say the text file contains:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill 25-5-2018
How do I write PL/SQL code in SQL*Plus to upload the data from the text file into the table in my database?
On SQL Server I usually use BULK INSERT; what are the equivalent methods here?
I tried many suggestions from the internet but have not solved it.
Please help me.
Thanks a lot.
If the text file is on the same machine you're running SQL*Plus from, you can use the SQL*Loader utility.
As a simple example, let's say your table is:
create table your_table (id number, name varchar2(10), some_date date);
And you have a text file data.txt containing what you showed, but with a comma added on the third line:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill,25-5-2018
You can create a basic SQL*Loader control file in the same directory, called say your_table.ctl, with something like:
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID,
NAME,
SOME_DATE DATE "DD-MM-YYYY"
)
Look at the documentation to see what all those mean, particularly what APPEND means; you may want to TRUNCATE instead - but be careful with that.
Then run SQL*Loader from the command line (not from within SQL*Plus), using the same credentials and connect string you normally use to connect to the database:
sqlldr userid=usr/pwd@tns control=your_table.ctl
Once that has completed - assuming no errors are reported on the console or in the log file it creates - querying your table will show:
select * from your_table;
        ID NAME       SOME_DATE
---------- ---------- ----------
         1 mike       2018-01-01
         2 jon        2017-12-20
         3 bill       2018-05-25
There are lots of other options and capabilities, but that might cover what you need at the moment.

Suggestions for loading 2 million records into the DB

Users upload a data file with 2 million records through a JSF application, and I have to load it into the DB. Loading it through an asynchronous Java call uses too much memory (out-of-memory exceptions) and it also times out most of the time.
So what I did is store the uploaded file as a CLOB in table1. A UNIX shell script runs every 15 minutes to check whether table1 has unprocessed records; if so, it reads that CLOB and loads the data into table2 using SQLLDR from within the same shell script. This works fine, but there is a 15-minute delay in processing the records.
So I think the same SQLLDR process could be run through a PL/SQL package or procedure, and that package could be called from Java via a JDBC call, right? Any examples?
If it's a one-time export/import you can use SQL Developer. It enables you to export the displayed rows in SQL*Loader format. BLOBs/CLOBs are exported as separate files.
Following Oracle's blog:
LOAD DATA
INFILE 'loader.txt'
INTO TABLE my_table
FIELDS TERMINATED BY ','
( id CHAR(10),
author CHAR(30),
created DATE "YYYY-MM-DD" ":created",
fname FILLER CHAR(80),
text LOBFILE(fname) TERMINATED BY EOF
)
"fname" is an arbitrary label, we could have used "fred" and it would
have worked exactly the same. It just needs to be the same on the two
lines where it is used.
loader.txt:
1,John Smith,2015-04-29,file1.txt
2,Pete Jones,2013-01-31,file2.txt
If you want to know how to dump a CLOB column into a file, please refer to Dumping CLOB fields into files?.
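On the "run SQLLDR from a PL/SQL package that Java calls over JDBC" part of the question, one option worth checking is a DBMS_SCHEDULER external job that wraps the shell script you already have. A rough sketch, assuming the CREATE EXTERNAL JOB privilege and a hypothetical script path /home/oracle/load_table2.sh:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'RUN_SQLLDR_LOAD',
    job_type   => 'EXECUTABLE',
    job_action => '/home/oracle/load_table2.sh',   -- hypothetical: your existing SQLLDR script
    enabled    => FALSE);
END;
/

-- Thin wrapper that Java can call over JDBC to start the load immediately,
-- instead of waiting for the 15-minute cycle.
CREATE OR REPLACE PROCEDURE start_table2_load AS
BEGIN
  DBMS_SCHEDULER.RUN_JOB('RUN_SQLLDR_LOAD', use_current_session => FALSE);
END;
/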

How do I load XMLTYPE from file?

I have a big (1 MB+) XML file stored in a local folder (for example: c:\temp\data.xml) that should be loaded into an XMLTYPE variable.
How can I do that?
The size limit of an Oracle XMLTYPE column is 4 GB, so you will not have problems loading a file of 1 MB.
You have to create an Oracle directory (on the database server), put your XML file into that directory, and then execute your insert as follows:
oracle@server> mkdir yourdirectory
oracle@server> chown youroracleaccount:youroraclegroup yourdirectory
SQL> CREATE OR REPLACE DIRECTORY XMLDIR AS 'YOURDESIREDPATH';
SQL> GRANT read, write ON DIRECTORY XMLDIR TO <DESIREDORACLESCHEMA>;
SQL> INSERT INTO YOURTABLE VALUES (...., XMLType(bfilename('XMLDIR', 'yourfilename.xml'), nls_charset_id('YOURCHARSETID')));
SQL> commit;
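If the goal is just to get the file into a PL/SQL XMLTYPE variable, as the question asks, a sketch using the same XMLDIR directory object could look like this (AL32UTF8 is an assumption; use whatever character set matches your file):
DECLARE
  l_xml XMLTYPE;
BEGIN
  SELECT XMLTYPE(BFILENAME('XMLDIR', 'yourfilename.xml'),
                 NLS_CHARSET_ID('AL32UTF8'))
    INTO l_xml
    FROM dual;
  -- l_xml now holds the document and can be queried, e.g. with XMLQuery/XMLTable
END;
/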
Alternatively, if you want to get the XML into a variable by querying it, you can create an external table that exposes the document as a CLOB column. Note that the LOCATION file here should be a small list file (say xmlfile_list.txt) containing the name of the XML file to load, one per line; adjust this sample as you need:
CREATE TABLE YOURXMLTABLE (doc CLOB)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY xmlfile_dir
ACCESS PARAMETERS
(
FIELDS (lobfn CHAR TERMINATED BY ',')
COLUMN TRANSFORMS (doc FROM lobfile (lobfn))
)
LOCATION ('xmlfile_list.txt')
)
REJECT LIMIT UNLIMITED;
and then, inside your PL/SQL block, execute something like:
SELECT XMLTYPE(doc) INTO your_xmltype_variable FROM YOURXMLTABLE;
Regards
Giova

How to validate dates in a txt file using Oracle SQL?

I have written an SQL script that processes the dates that were loaded into a holding table (via sqlldr) from a comma-separated, double-quoted text file that comes from an external source. As such, the columns in the holding table are all VARCHAR2, regardless of the intended data type in the txt file. This is so that I can perform and log validation on the data from the external source, and report problems, before loading the data from the holding table into the main table (which has appropriately typed columns, such as DATE, NUMBER, etc.).
I have regex validation to check that the date is in the format MM/DD/YYYY, but what I need help with is on how to use SQL to validate the logical validity of a syntactically correct date, such as for leap years and whether a certain month has 30 or 31 days. Is there a way to do this using plain SQL? The database is Oracle 11g.
I looked into Oracle's to_date function, and it seems ideal for what I am trying to do, but I cannot find a way to use its result in a query without it raising an error when it encounters an invalid date.
Thanks for any assistance in this topic.
You can create your own function that returns NULL if something is wrong with the date:
create or replace function my_to_date(dt varchar2) return date as
  ldt date;
begin
  ldt := to_date(dt, 'MM/DD/YYYY');
  return ldt;
exception
  when others then return null;
end;
/
select my_to_date('01/01/2012') from dual
union all
select my_to_date('33/01/2012') from dual
MY_TO_DATE('01/01/2012')
1 01/01/2012
2
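A sketch of how this could be used to report the problem rows in the holding table (holding_table and date_col are hypothetical placeholders for your own names):
select rowid, date_col
from   holding_table
where  date_col is not null
and    my_to_date(date_col) is null;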
Or you can create a script that checks your data after the import.
But maybe you can perform the check during the import process itself, via a to_char(to_date(...)) conversion? If something is wrong with a date, sqlldr rejects the record.
Sample record:
7782, "Clark", "Manager", 7839,06/09/1981, 2572.50,, 10:101
Sample control file:
LOAD DATA
CHARACTERSET utf16
BYTEORDER little
INFILE ulcase11.dat
REPLACE
INTO TABLE EMP
FIELDS TERMINATED BY X'002c' OPTIONALLY ENCLOSED BY X'0022'
(empno integer external (5), ename, job, mgr,
hiredate DATE(20) "to_char(to_date(:HIREDATE, 'MM/DD/YYYY'),'MM/DD/YYYY')",
sal, comm,
deptno CHAR(5) TERMINATED BY ":",
projno,
loadseq SEQUENCE(MAX,1) )

How to copy the data from Excel to oracle? [duplicate]

This question already has answers here:
Load Excel data sheet to Oracle database
(6 answers)
Closed 8 years ago.
There are many different methods, depending upon the amount of data, the repetitiveness of the process, and the amount of programming I am willing to invest.
First, create the Oracle table, using the SQL CREATE TABLE statement to define the table's column lengths and types. Here's an example of a sqlplus 'CREATE TABLE' statement:
CREATE TABLE SPECIES_RATINGS
(SPECIES VARCHAR2(10),
COUNT NUMBER,
RATING VARCHAR2(1));
Then load the data using any of the following methods or an entirely new method you invent:
--------------------------------------------
First load method:
I use the SQL*Loader method. You will need to save a copy of your spreadsheet in a text format like CSV or PRN.
SQL*Loader control file for CSV file:
load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )
There are some GUIs that have wizards to walk you through the process (Enterprise Manager -> Maintenance -> Data Movement -> Move Row Data -> Load Data from User Files) for ad-hoc imports. Toad for Oracle has a SQL*Loader Wizard as well (DBA -> Data Import/Export -> SQL*Loader Wizard).
You can save your Excel data in PRN format if you are planning to use positional (fixed-length) data in your control file.
SQL*Loader control file for PRN file:
load data
infile 'c:\data\mydata.prn'
replace
into table departments
( dept position (02:05) char(4),
deptname position (08:27) char(20) )
Position(02:05) will give the 2nd to the 5th character.
Once I've gone through the EM or Toad wizard, I save the control file, tweak it as needed in a text editor, and reuse it in SQL*Plus scripts.
SQL*Loader is handy also since it allows you to skip certain data and call filter functions (i.e. native functions such as DECODE() or TO_DATE(), or user-defined functions) in your control .ctl file.
You can load from multiple input files, provided they use the same record format, by repeating the INFILE clause. Here is an example:
LOAD DATA
INFILE file1.prn
INFILE file2.prn
INFILE file3.prn
APPEND
INTO TABLE emp
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
You can also specify multiple "INTO TABLE" clauses in the SQL*Loader control file to load into multiple tables.
LOAD DATA
INFILE 'mydata.dat'
REPLACE
INTO TABLE emp
WHEN empno != ' '
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
INTO TABLE proj
WHEN projno != ' '
( projno POSITION(25:27) INTEGER EXTERNAL,
empno POSITION(1:4) INTEGER EXTERNAL
)
With SQL*Loader, you can selectively load only the records you need (see the WHEN clause), skip certain columns while loading data (see FILLER columns), and load multi-line records (see CONCATENATE and CONTINUEIF).
Once you've created the control file, you need to start SQL*Loader from the command line like this:
sqlldr username/password@connect_string control=ctl_file.ctl log=log.log
You can create a batch file to call sqlldr.
For more examples, see http://examples.oreilly.com/orsqlloader/
That's it for the versatile SQL*Loader.
--------------------------------------------
Second load method:
In this scenario, I have full control of the spreadsheet, but less control of the data, because users send me the spreadsheets back with data.
I create another worksheet within the same Excel file, which has locked-down INSERT statements referring back to the sheet with the data. When I receive the spreadsheet, I copy and paste the INSERT statements directly into SQL*Plus, or indirectly stage them in a SQL script.
Excel is a great tool for composing SQL statements dynamically (see Excel functions).
--------------------------------------------
Third load method:
If you need a utility to load Excel data into Oracle, download quickload from SourceForge at http://sourceforge.net/projects/quickload
--------------------------------------------
Fourth load method:
In theory, this should work:
Configure Generic Database Connectivity (Heterogeneous Services, HS).
Connect to the Excel spreadsheet from Oracle through ODBC.
Describe it (see the DESC command), or use CREATE TABLE AS SELECT col1, col2 FROM ExcelTable to make a copy and see what data types Oracle assigns the columns by default.
http://www.e-ammar.com/Oracle_TIPS/HS/configuring_generic_database_con.htm
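As a rough sketch of that last step, assuming a database link named excel_link has already been configured against the ODBC DSN for the spreadsheet and that the worksheet is exposed as Sheet1$ (all of these names are assumptions):
-- copy the worksheet into a local Oracle table and let Oracle pick the column types
CREATE TABLE excel_copy AS SELECT * FROM "Sheet1$"@excel_link;
DESC excel_copy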
--------------------------------------------
References:
http://www.orafaq.com/wiki/SQL*Loader_FAQ
http://forums.oracle.com/forums/thread.jspa?threadID=305918&tstart=0
http://techrepublic.com.com/5208-6230-0.html?forumID=101&threadID=223797&messageID=2245485
http://examples.oreilly.com/orsqlloader/
A DBA once showed me an easy trick:
Somewhere, such as in another sheet, create a formula like:
="INSERT INTO my_table (name, age, monkey) VALUES ('" & A1 & "', " & B1 & ", '" & C1 & "');"
Copy/paste it into the appropriate rows (Excel automatically changes your formula to A2, A3, etc.)
Then copy/paste the result into sqlplus.
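For illustration, if A1:C1 held mike, 30 and banana (made-up values), the formula above would produce a statement like:
INSERT INTO my_table (name, age, monkey) VALUES ('mike', 30, 'banana');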
The simplest way I can think of is to put Access in the middle. Attach to Excel (or import the data into Access); then attach to the destination Oracle tables and copy. The Access Export facility also works pretty well.
Use external tables
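A minimal sketch of that approach, assuming the spreadsheet has been saved as species_ratings.csv into a folder covered by a directory object called DATA_DIR (both names hypothetical), reusing the SPECIES_RATINGS example from earlier in this thread:
CREATE TABLE species_ratings_ext (
  species VARCHAR2(10),
  cnt     NUMBER,
  rating  VARCHAR2(1)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY DATA_DIR
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('species_ratings.csv')
)
REJECT LIMIT UNLIMITED;

-- the external table can then be queried directly or copied into the real table
INSERT INTO species_ratings (species, count, rating)
SELECT species, cnt, rating FROM species_ratings_ext;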
Perhaps some combination of DBD::Oracle, DBD::Excel and DBIx::Copy? But surely there's an easier way...
If it's a one-off or rare thing, and you can export to CSV, then the Application Express or SQL*Loader facilities would work fine. If it's a regular thing, then Chris's suggestion is what I'd go with.
