How to validate dates in a txt file using Oracle SQL?

I have written an SQL script that processes the dates that were loaded into a holding table (via sqlldr) from a comma-separated, double-quoted text file that comes from an external source. As such, the columns in the holding table are all VARCHAR2, regardless of the intended data type in the txt file. This is so that I can perform and log validation on the data from the external source, and report problems back, before loading the data from the holding table into the main table (which has appropriately typed columns, such as DATE, NUMBER, etc.)
I have regex validation to check that the date is in the format MM/DD/YYYY, but what I need help with is how to use SQL to validate the logical validity of a syntactically correct date, such as leap years and whether a given month has 30 or 31 days. Is there a way to do this using plain SQL? The database is Oracle 11g.
I looked into Oracle's to_date function, and it seems ideal for what I am trying to do, but I cannot find a way to use its result in a query without it raising an error when it encounters an invalid date.
Thanks for any assistance on this topic.

You can create your own function; if something is wrong with the date, it returns NULL:
create or replace function my_to_date(dt varchar2) return date as
  ldt date;
begin
  -- to_date validates the calendar too (leap years, 30- vs 31-day months);
  -- use 'FXMM/DD/YYYY' if you also want to enforce the exact format
  ldt := to_date(dt, 'MM/DD/YYYY');
  return ldt;
exception
  when others then
    return null;  -- any conversion error means the date is invalid
end;
/
select my_to_date('01/01/2012') from dual
union all
select my_to_date('33/01/2012') from dual;
MY_TO_DATE('01/01/2012')
------------------------
01/01/2012
(null)
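To validate a whole staging table in one pass, use the function in a set-based query. A minimal sketch, with holding_table and hire_date as hypothetical stand-ins for your own table and column names:

select hire_date
from holding_table
where hire_date is not null
and my_to_date(hire_date) is null;  -- loaded fine, but not a real date (e.g. 02/30/2012)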
Alternatively, you can create a script that checks your data after import. But maybe you can make the check during the import process itself, via a to_char(to_date(...)) conversion? If something is wrong with a date, sqlldr rejects the record.
sample record
7782, "Clark", "Manager", 7839,06/09/1981, 2572.50,, 10:101
sample control file
LOAD DATA
CHARACTERSET utf16
BYTEORDER little
INFILE ulcase11.dat
REPLACE
INTO TABLE EMP
FIELDS TERMINATED BY X'002c' OPTIONALLY ENCLOSED BY X'0022'
(empno integer external (5), ename, job, mgr,
hiredate DATE(20) "to_char(to_date(:HIREDATE, 'MM/DD/YYYY'),'MM/DD/YYYY')",
sal, comm,
deptno CHAR(5) TERMINATED BY ":",
projno,
loadseq SEQUENCE(MAX,1) )

Related

How to load an extracted ORACLE CLOB into only 1 TEXT column in Postgres?

I'm currently looking at migrating CLOB data from ORACLE into Postgres from an external file. I have created my table in Postgres, and the data type I'm using is TEXT, which replicates ORACLE's CLOB; now I just need to get my data in.
So far what I've done is extract a CLOB column from ORACLE into a file as per the below. It is only 1 CLOB from 1 COLUMN, so I'm trying to load the contents of this entire CLOB into 1 column in Postgres.
CREATE TABLE clob_test (
id number,
clob_col CLOB);
DECLARE
  c CLOB;
  CURSOR scur IS
    SELECT text
    FROM dba_source
    WHERE rownum < 200001;
BEGIN
  EXECUTE IMMEDIATE 'truncate table clob_test';
  -- concatenate 200,000 source lines into a single large CLOB
  FOR srec IN scur LOOP
    c := c || srec.text;
  END LOOP;
  INSERT INTO clob_test VALUES (1, c);
  COMMIT;
END;
/
DECLARE
  buf CLOB;
BEGIN
  SELECT clob_col
  INTO buf
  FROM clob_test
  WHERE id = 1;
  -- write the CLOB out to a file in the TEST_DIR directory object
  dbms_advisor.create_file(buf, 'TEST_DIR', 'clob_1.txt');
END;
/
This works fine and generates the clob_1.txt file containing all the contents of the ORACLE CLOB column CLOB_COL. Below is an example of the file output, it seems to contain every possible character you can think of including "~"...
/********** Types and subtypes, do not reorder **********/
type BOOLEAN is (FALSE, TRUE);
type DATE is DATE_BASE;
type NUMBER is NUMBER_BASE;
subtype FLOAT is NUMBER; -- NUMBER(126)
subtype REAL is FLOAT; -- FLOAT(63)
...
...
...
END;
/
My problem now is how do I get the entire contents of this 1 file into 1 record in Postgres so it simulates exactly how the data was originally stored in 1 record in ORACLE?
Effectively what I'm trying to achieve is similar to this, it works but the formatting is awful and doesn't really mirror how the data was originally stored.
POSTGRES> insert into clob_test select pg_read_file('/home/oracle/clob_1.txt');
I have tried using the COPY command but I'm having 2 issues. Firstly, if there is a carriage return it will see that as another record and split the file up; secondly, I can't find a delimiter which isn't already used in the file. Is there some way I can bypass the delimiter and just tell Postgres to COPY everything from this file without delimiters, as it's only 1 column?
Any help would be great 😊
Note for other answerers: This is incomplete and will still put the data into multiple records; the question also wants all the data in a single field.
Use COPY ... FROM ... CSV DELIMITER e'\x01' QUOTE e'\x02'. The only thing this can't handle is actual binary blobs, which, as I understand it, are not permitted in a CLOB (I have never used Oracle myself). This only avoids the delimiter issue; it will still insert the data as one row per line of the input.
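A concrete sketch of that COPY, assuming the target has a single TEXT column named clob_col (the column list is my assumption; server-side COPY also needs file-read privileges on the server, or use psql's \copy equivalent):

COPY clob_test (clob_col)
FROM '/home/oracle/clob_1.txt'
WITH (FORMAT csv, DELIMITER E'\x01', QUOTE E'\x02');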
I'm not sure how to go about fixing that issue, but you should be aware that it's probably not possible to do this correctly in all cases. The largest field value PG supports is 1 GB, while a CLOB supports up to 4 GB. If you need to correctly import CLOBs larger than 1 GB, the only route available is PG's large object interface.
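If the 1 GB ceiling matters for your data, a small sketch of that large-object route from psql (\lo_import runs client-side and sets the LASTOID variable; the table and column names here are hypothetical):

\lo_import '/home/oracle/clob_1.txt'
INSERT INTO clob_lo_test (id, clob_oid) VALUES (1, :LASTOID);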

Import hour, minute, second from csv to timestamp column in external table

I'm trying to convert a csv data file into an Oracle database table. To do so I'm using an external table, as described here.
The timestamp in my csv is separated into the date (yyyy-mm-dd) on one side and the time (hh24:mi:ss) on the other.
My table has 3 columns:
create table backup_ext
(
"user" NVARCHAR2(20),
"date" DATE,
"hour" TIMESTAMP
)
Here is what the csv looks like:
john,2018-05-28,10:17:57
I need those three values to be in three separate columns in my table.
The problem I have been encountering is that the user and date appear in the expected format, but the hour has date and time in it, like so:
user | date      | hour
-----|-----------|------------------------------
john | 28-MAY-18 | 01-OCT-19 10.17.57.000000000
What I would like is something like this:
user | date      | hour
-----|-----------|---------
john | 28-MAY-18 | 10.17.57
Other particularities:
I want to avoid changing the column type as much as possible since it is used as is in many other areas of the program, and I don't want to break anything
The table was created like this for use with MSSQL and I have been tasked with adapting it to work with Oracle, this may explain the choice of column types
I can potentially run a second bit of sql code afterwards to format the column, although I wouldn't know exactly how to do so
I can only work with SQL statements, since this is to be done by C++ code creating statements and querying the database with them
Any help would be greatly appreciated
Full code :
create table backup_ext
(
"user" NVARCHAR2(20),
"date" DATE,
"hour" TIMESTAMP
)
organization external
(
type oracle_loader
default directory csvdir
access parameters
(
records delimited by newline
skip 1
fields terminated by ';' lrtrim
missing field values are null
(
"user",
"date" date 'yyyy-mm-dd',
"hour" Char Date_Format Timestamp Mask 'hh24:mi:ss',
)
)
location ('backup.csv')
)
reject limit unlimited;
You say you don't want to change anything, but you're already migrating it from SQL Server to Oracle... NOW is the time to fix it, or you're going to hate life for the entire time you support this application/database going forward.
You only need two columns for your data
DROP TABLE BACKUP_EXT;
CREATE TABLE BACKUP_EXT (
USERNAME VARCHAR2(20),
OCCURRENCE DATE
);
INSERT INTO BACKUP_EXT VALUES (
'john',
TO_DATE('28-MAY-18 10.17.57', 'DD-MON-RR HH.MI.SS')
);
COMMIT;
SELECT USERNAME "user",
TO_CHAR(OCCURRENCE, 'DD-MON-RR') "date",
TO_CHAR(OCCURRENCE, 'HH.MI.SS') "hour"
FROM BACKUP_EXT;
Execute that...and we get back...
Table BACKUP_EXT dropped.
Table BACKUP_EXT created.
1 row inserted.
Commit complete.
user  date       hour
john  28-MAY-18  10.17.57
Use the proper data type -> DATE. A DATE contains a point in time, so it includes a time portion, not just the month, day, and year.
Do not use NVARCHAR2 - most modern Oracle databases already use a Unicode-based character set, so it's unnecessary, especially for the test data you've provided.
Do not use reserved words like DATE or USER for table or column names - forcing them with quotes will cause many more problems than it solves.
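If other parts of the program really do expect the three quoted columns, one option (my own hedged sketch, not part of the answer above) is a compatibility view over the two-column table; note that "date" and "hour" come back as formatted strings here, not DATE/TIMESTAMP values:

CREATE OR REPLACE VIEW BACKUP_COMPAT AS
SELECT USERNAME AS "user",
       TO_CHAR(OCCURRENCE, 'DD-MON-RR') AS "date",
       TO_CHAR(OCCURRENCE, 'HH24.MI.SS') AS "hour"
FROM BACKUP_EXT;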

read and insert data from text file to database table using oracle SQL Plus

I really need your help.
I usually work on SQL Server, but now I am working on something else, and that is why I need your help.
I am working with Oracle SQL*Plus. I have a text file, let's say named test.txt, and I want to upload data from this file to a database table using SQL*Plus.
Let's say the text file data is:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill 25-5-2018
How do I write PL/SQL code in SQL*Plus to upload the data from the text file to the table in my database?
On SQL Server I usually use BULK INSERT; what is the method here?
I tried many things from the internet but nothing solved it.
Please help me.
Thanks a lot.
If the text file is on the same machine you're running SQL*Plus from, you can use the SQL*Loader utility.
As a simple example, lets say your table is:
create table your_table (id number, name varchar2(10), some_date date);
And you have a text file data.txt containing what you showed, but with a comma added on the third line:
001,mike,1-1-2018
002,jon,20-12-2017
003,bill,25-5-2018
You can create a basic SQL*Loader control file in the same directory, called say your_table.ctl, with something like:
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ID,
NAME,
SOME_DATE DATE "DD-MM-YYYY"
)
Look at the documentation to see what all those mean, particularly what APPEND means; you may want to TRUNCATE instead - but be careful with that.
Then run SQL*Loader from the command line (not from within SQL*Plus), using the same credentials and connect string you normally use to connect to the database:
sqlldr userid=usr/pwd@tns control=your_table.ctl
Once that has completed - assuming there are no errors reported on the console or in the log file it creates - then querying your table will show:
select * from your_table;
ID NAME SOME_DATE
---------- ---------- ----------
1 mike 2018-01-01
2 jon 2017-12-20
3 bill 2018-05-25
There are lots of other options and capabilities, but that might cover what you need at the moment.
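One of those other options is an external table, which lets you query the file with plain SQL and INSERT from it. A sketch under the assumption that a directory object (csvdir here, a placeholder) already points at the folder containing data.txt; the field syntax mirrors the external-table example elsewhere on this page:

create table your_table_ext (
  id number,
  name varchar2(10),
  some_date date
)
organization external (
  type oracle_loader
  default directory csvdir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
    (
      id,
      name,
      some_date char date_format date mask "dd-mm-yyyy"
    )
  )
  location ('data.txt')
)
reject limit unlimited;

insert into your_table select * from your_table_ext;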

How do I run a procedure in Oracle with some date manipulation SQL in it

I have created a procedure in oracle as follows:
create or replace PROCEDURE SP_X_AVERAGE
(
profile out SYS_REFCURSOR,
rx out SYS_REFCURSOR
)
as
BEGIN
open profile for
select
avg(to_number(profile_netassets)) AS netassets
from
fgp;
open rx for
select
avg(to_number(a_price)) as twr
from
r_x
where
gq_date <= add_months(to_date(sysdate, 'mm/dd/yyyy'), -12);
END SP_X_AVERAGE;
It doesn't run, giving the following error:
ORA-01843: not a valid month
If I remove the where condition from the second query, it runs successfully.
Altering the session via SQL in the same procedure does not work either.
Please help.
I am running this procedure in SQL Developer (Ubuntu Oneiric 11).
SYSDATE is already a DATE so you don't need to apply TO_DATE() to it. However, more recent versions of Oracle are tolerant of such things and handle them gracefully.
So that leaves the matter of r_x.gq_date: what data type is that? If it is a string then the chances are you have values in there which will not cast to a date, or at least don't match your default NLS_DATE_FORMAT.
"we have to keep it as "VARCHAR2(40 BYTE)" it is having date in it like this : '1/2/2003'"
Bingo. Is that the same as your NLS_DATE_FORMAT? If not you will need to cast the column:
to_date(gq_date, 'mm/dd/yyyy') <= add_months(sysdate, -12);
This may not solve your problem if the column contains strings which aren't in that format. This is a common side-effect of using strings to hold things which aren't strings.
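Applied to the procedure, the second cursor would then be opened like this (assuming the gq_date strings are consistently MM/DD/YYYY; any stray value will still raise ORA-01843):

open rx for
select
avg(to_number(a_price)) as twr
from
r_x
where
to_date(gq_date, 'mm/dd/yyyy') <= add_months(sysdate, -12);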

How to copy the data from Excel to oracle? [duplicate]

This question already has answers here:
Load Excel data sheet to Oracle database
(6 answers)
Closed 8 years ago.
How to copy the data from Excel to oracle?
There are many different methods, depending upon the amount of data, the repetitiveness of the process, and the amount of programming I am willing to invest.
First, create the Oracle table, using the SQL CREATE TABLE statement to define the table's column lengths and types. Here's an example of a sqlplus 'CREATE TABLE' statement:
CREATE TABLE SPECIES_RATINGS
(SPECIES VARCHAR2(10),
COUNT NUMBER,
RATING VARCHAR2(1));
Then load the data using any of the following methods, or an entirely new method you invent:
--------------------------------------------
First load method:
I use the SQL*Loader method. You will need to save a copy of your spreadsheet in a text format like CSV or PRN.
SQL*Loader Control file for CSV file:
load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )
There are some GUIs that have wizards to walk you through the process (Enterprise Manager -> Maintenance -> Data Movement -> Move Row Data -> Load Data from User Files) for ad-hoc imports. Toad for Oracle has a SQL*Loader Wizard as well (DBA -> Data Import/Export -> SQL*Loader Wizard).
You can save your Excel data in PRN format if you are planning to use positional data (fixed length) in your control file.
SQL*Loader Control file for PRN file:
load data
infile 'c:\data\mydata.prn'
replace
into table departments
( dept position (02:05) char(4),
deptname position (08:27) char(20) )
Position(02:05) will give the 2nd to the 5th character
Once I've gone through the EM or Toad wizard, I save the control file, tweak it as needed in a text editor, and reuse it in SQL*Plus scripts.
SQL*Loader is also handy since it allows you to skip certain data and call filter functions (i.e. native functions such as DECODE() or TO_DATE(), or user-defined functions) in your control .ctl file.
You can load from multiple input files, provided they use the same record format, by repeating the INFILE clause. Here is an example:
LOAD DATA
INFILE file1.prn
INFILE file2.prn
INFILE file3.prn
APPEND
INTO TABLE emp
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
You can also specify multiple "INTO TABLE" clauses in the SQL*Loader control file to load into multiple tables.
LOAD DATA
INFILE 'mydata.dat'
REPLACE
INTO TABLE emp
WHEN empno != ' '
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
INTO TABLE proj
WHEN projno != ' '
( projno POSITION(25:27) INTEGER EXTERNAL,
empno POSITION(1:4) INTEGER EXTERNAL
)
With SQL*Loader, you can selectively load only the records you need (see the WHEN clause), skip certain columns while loading data (see FILLER columns), and load multi-line records (see CONCATENATE and CONTINUEIF).
Once you've created the control file, you need to start SQL*Loader from the command line like this:
sqlldr username/password@connect_string control=ctl_file.ctl log=log.log
You can create a batch file to call sqlldr.
For more examples, see
http://examples.oreilly.com/orsqlloader/
That's it for the versatile SQL*Loader.
--------------------------------------------
Second load method:
In this scenario, I have full control of the spreadsheet, but less control of the data, because users send the spreadsheets back to me with data.
I create another worksheet within the same Excel file, which has locked-down INSERT statements referring back to the sheet with the data. When I receive the spreadsheet, I copy and paste the INSERT statements directly into SQL*Plus, or stage them in a SQL script.
Excel is a great tool for composing SQL statements dynamically. (see Excel functions)
--------------------------------------------
Third load method:
If you need a utility to load Excel data into Oracle, download quickload from SourceForge at http://sourceforge.net/projects/quickload
--------------------------------------------
Fourth load method:
In theory, this should work:
Configure Generic Database Connectivity (Heterogeneous Services) and connect to the Excel spreadsheet from Oracle through ODBC. Describe it (see the DESC command) or CREATE TABLE AS SELECT col1, col2 FROM ExcelTable to make a copy and see what data types Oracle assigns the columns by default.
http://www.e-ammar.com/Oracle_TIPS/HS/configuring_generic_database_con.htm
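A hypothetical sketch of that last step, once an HS agent and a tnsnames.ora entry (called excel_dsn here) exist; ODBC drivers typically expose each worksheet under a name like Sheet1$, and every name and credential below is a placeholder:

CREATE DATABASE LINK excel_link
  CONNECT TO "user" IDENTIFIED BY "password" USING 'excel_dsn';

SELECT * FROM "Sheet1$"@excel_link;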
--------------------------------------------
References:
http://209.85.173.132/search?q=cache:GJN388WiXTwJ:www.orafaq.com/wiki/SQL*Loader_FAQ+Oracle+control+file+columns&cd=3&hl=en&ct=clnk&gl=us
http://forums.oracle.com/forums/thread.jspa?threadID=305918&tstart=0
http://techrepublic.com.com/5208-6230-0.html?forumID=101&threadID=223797&messageID=2245485
http://examples.oreilly.com/orsqlloader/
A DBA once showed me an easy trick:
In some place like another sheet, create a formula like:
="INSERT INTO my_table (name, age, monkey) VALUES ('" & A1 & "', " & B1 & ", '" & C1 & "');"
Copy/paste it into the appropriate rows (Excel automatically changes your formula to A2, A3, etc.)
Then copy/paste the result into sqlplus.
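For example, with A1 = Bob, B1 = 7 and C1 = Y, the formula produces a ready-to-run statement:

INSERT INTO my_table (name, age, monkey) VALUES ('Bob', 7, 'Y');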
The simplest way I can think of is to put Access in the middle. Attach to Excel (or import the data into Access); then attach to the destination Oracle tables and copy. The Access Export facility also works pretty well.
Use external tables
Perhaps some combination of DBD::Oracle, DBD::Excel and DBIx::Copy? But surely there's an easier way...
If it's a once-off, or a rare thing, and you can export to CSV, then the Application Express or SQL*Loader facilities would work fine. If it's a regular thing, then Chris's suggestion is what I'd go with.
