How to insert a bunch of data into an Oracle DB? - oracle

I have a requirement to insert 12,600 rows into one table. The data is in a doc file and I need to upload all of it into a particular table in one shot.
Please give me a suggestion on how to upload the data.
Thanks in advance.

Check out SQL*Loader, which allows relatively easy and fast import into the database from text files.

More information would be useful, but the following steps are a very general description of how to accomplish your task:
Open file
LOOP
Read line from file
If file at EOF, break out of LOOP
Parse line into variables
Insert data into table using the variables parsed in the previous step
END LOOP
Close file.
COMMIT
If errors occur, ROLLBACK and exit
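As a rough illustration only, here is how those steps might look in PL/SQL using UTL_FILE; the directory object, file name, table, and two-column layout are placeholders, not anything from the question:
DECLARE
  v_file UTL_FILE.FILE_TYPE;
  v_line VARCHAR2(4000);
  v_col1 VARCHAR2(100);
  v_col2 VARCHAR2(100);
BEGIN
  -- DATA_DIR is an assumed Oracle directory object pointing at the file's location
  v_file := UTL_FILE.FOPEN('DATA_DIR', 'my_data.txt', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(v_file, v_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    -- parse the line into variables (two tab-separated columns, purely illustrative)
    v_col1 := REGEXP_SUBSTR(v_line, '[^' || CHR(9) || ']+', 1, 1);
    v_col2 := REGEXP_SUBSTR(v_line, '[^' || CHR(9) || ']+', 1, 2);
    INSERT INTO my_table (col1, col2) VALUES (v_col1, v_col2);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(v_file) THEN
      UTL_FILE.FCLOSE(v_file);
    END IF;
    ROLLBACK;
    RAISE;
END;
/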
Share and enjoy.

Related

Read data from a flat file (Datastore) in an ODI procedure

I'm trying to read a file from a PL/SQL procedure, but I'm getting an ORA-00942 table or view does not exist error.
Caused by: Error : 942, Position : 21, Sql =
SELECT UBIC_ID FROM LIST_UBICS
, Error Msg = ORA-00942: table or view does not exist
I have a file with an id per line. This file is called list_ubics.csv. I have a File model and a datastore pointing to the file called LIST_UBIC with the UBIC_ID Field.
I created a Task in a new Procedure with this SQL:
SELECT UBIC_ID From LIST_UBICS
LIST_UBICS is my datastore; I don't have any table with that name.
I want to read this file and do some processing for each line, but I don't see any way in the docs to read a text file that works for me.
How can I read this file?
Thanks in advance for any help.
An ODI procedure written in PL/SQL (Oracle technology) will be pushed down to the database. The database executing it doesn't know about the File datastore and cannot execute SQL statements against it.
If the goal is to load the file with ODI it can be done using an interface (11g) or a mapping (12c) with LKM File to SQL. That will copy the content of the file into a table in the database and any SQL statement can then be executed against it.
Alternatively, it is possible to create a directory in the database, land the file there, and create an external table on top of it. Queries can be run against external tables, but not DML operations. More information here: https://oracle-base.com/articles/9i/external-tables-9i
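As an illustration (the directory path and table name are assumptions; the file and column names come from the question), an external table over the CSV could look roughly like this:
CREATE DIRECTORY data_dir AS '/path/to/files';

CREATE TABLE list_ubics_ext (
  ubic_id VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('list_ubics.csv')
)
REJECT LIMIT UNLIMITED;

-- ordinary SELECTs (but no DML) then work against it
SELECT ubic_id FROM list_ubics_ext;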
I just found out last week that there is one more solution and probably the best!
The solution is written here-part-1 and here-part-2 and if you follow the steps exactly, it will work (I implemented it).
Anyway, I will summarize the main idea and steps.
We can use a variable in a package. There is a piece of code (see the view code at the end) that reads a column from the given file. Putting a for loop in the package lets us read every row, by changing the value of the "CRFILE_FIRST_ROW" variable in the code below to a sequential number starting with 1.
So, everything is as easy as that. Besides "CRFILE_FIRST_ROW", there are more variables that can be changed, such as CRFILE_FORMAT=D (decimal format), CRFILE_SEP_FIELD=0x0009 (the field separator in hexadecimal, here a tab) and so on.
Also, as you can see in the original posts (links above), you can generate your view code; you don't need to copy-paste it from below.
View code:
select TES.C1 C1
from location_of_file/objects_to_import.txt TES
/*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=TESTSNP$CRLOAD_FILE=location_of_file/objects_to_import.txtSNP$CRFILE_FORMAT=DSNP$CRFILE_SEP_FIELD=0x0009SNP$CRFILE_SEP_LINE=0x000ASNP$CRFILE_FIRST_ROW=#UTILS.IMPORT_OBJ_READ_INCRSNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=C1SNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRACTION_ON_ERROR=NULLSNP$CR$$SNPS_END_KEY*/

Quick way to run a large SQL file containing INSERT statements - oracle

I have a SQL script that contains 700,000 INSERT statements. I tried to run it through Oracle SQL Developer, but it failed to load the file itself. I tried to run it from SQL*Plus, but it takes quite a long time to execute such a large file.
To speed it up, I have deleted all the constraints on the table, but there is no improvement.
I have looked for information, and people have been suggesting:
Split the file into manageable sizes - which I will fall back on as my last resort.
SQL*Loader - as far as I understand, SQL*Loader exports data from the DB in a specific format and loads it back into the DB with a CTL file.
Is there any better way to handle this scenario?
Convert the file from many SQL statements to a smaller number of PL/SQL blocks to reduce the round-trip overhead. This only requires a few minutes with a text editor and can improve performance by orders of magnitude, especially over slow networks.
Every 10,000 lines, add a begin and an end; to the file.
Change this:
insert into ...
insert into ...
insert into ...
...
To this:
begin
insert into ...
insert into ...
insert into ...
...
end;
/
Don't convert the entire file into one large PL/SQL block though. There is a limit to the size of anonymous PL/SQL blocks, and you might get a parser error.
Agreed with the previous answer: 700,000 inserts via script are going to be slow no matter what you do - load your data as an external table or use SQL*Loader.
However
If you want to execute a large script with SQL Developer, don't OPEN the file - we have to open and parse and display the contents of that file. Ow.
Just do this in the worksheet
@script_name
That will execute the script.
To speed it up even further, hide or minimize the output area of the worksheet.
It's still not going to be super-fast with 700,000 inserts though.
AFAIK option 2 (SQL*Loader) is the correct one.
Use sed/awk/perl to convert the file into a CSV (or fixed-width) input file.
Disable constraints and indexes (possibly drop unique indexes).
create control file for your input file
exec sqlldr (turn direct path load on)
And this should finish within a few seconds.
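For illustration, a control file for such a load might look roughly like the following; the table name, column list, and file name are assumptions, not taken from the question:
-- loader.ctl
OPTIONS (DIRECT=TRUE)
LOAD DATA
INFILE 'converted_inserts.csv'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3)
It is then run with something like sqlldr control=loader.ctl (plus your credentials), with the direct path load enabled via the OPTIONS clause above or direct=true on the command line.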
You also have to check whether you have any triggers on the table you are inserting into. They can slow down the process if a lot of logic is coded behind them.

How to insert name of file and modified time using batch/shell script and sql loader

I have a requirement to insert bulk data into an Oracle database from a CSV file. The table's columns match the CSV file's header, except for three additional fields in the database:
A Primary Key field (for which a simple SEQUENCE.NEXTVAL is called)
A field for the name of the CSV file
A field for the last modified date+time of the file
The following Stack Overflow question addresses an extra-column issue, but the solution there is easy because it uses Oracle's sysdate, which is available internally. I need to pass a parameter in from a batch script or shell script.
Insert actual date time in a row with SQL*loader
Can PARFILE help here somehow?
My other alternative would be to do the whole task in two steps by writing a small Java program:
Use SQL*Loader for the bulk upload, leaving out the data for the filename and modified time
Then run a separate update statement to populate the newly created rows
But I'm looking for something which will get the job done in one shot. Any advice??
I'm afraid it's not possible with sqlldr alone.
There are no tools for this in sqlldr.
You'd need some sort of script or a program to dynamically create a .ctl file for each load.
Here is a bash script to help you get started:
#!/bin/bash -xv
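# Builds a one-off SQL*Loader control file that embeds the data file's name as a
# constant column, then runs sqlldr against that file.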
readonly MY_FILENAME=$1
readonly DB_BUF_TABLE=$2
readonly SQLLDR_CTL="LOAD DATA
CHARACTERSET UTF8
APPEND INTO TABLE $DB_BUF_TABLE
FIELDS TERMINATED BY ';'(
filename \"$MY_FILE_NAME\",
col_foo,
col_bar
)"
echo "$SQLLDR_CTL" > "loader.ctl"
sqlldr control=loader.ctl parfile=loader.par data="$MY_FILENAME"
sqlldrReturnValue=$?
You'd need some locking with this, or path separation for concurrent loads, to be sure sqlldr starts with the proper ctl file.

Oracle, save/map csv string to a table using utl_file and external tables

I use a PL/SQL procedure that calls a web service. This web service returns a large CSV string which I hold in a CLOB. Since I do not want to parse the CSV by hand, I thought of using external tables. So what I need to do is store the CSV data in a corresponding table.
What I am doing at the moment is storing the CLOB to a file using UTL_FILE; the stored file is the data source of an external table. When I am the only user this works very well, but since databases are multiuser I have to watch out for someone else calling the procedure and overwriting the external table's data source file. What is the best way to avoid a mess in the table's data source? Or what is the best way to store a CSV string into a table?
Thanks
Chris
You want to make sure that the procedure is run by at most one session. There are several ways to achieve this goal:
The easiest way would be to lock a specific row at the beginning of your procedure (SELECT ... FOR UPDATE NOWAIT; there is a small sketch of this after the list below). If the lock succeeds, go on with your batch. If it fails it means the procedure is already being executed by another session. When the procedure ends, either by success or failure, the lock will be released. This method will only work if your procedure doesn't perform intermediate commits (which would release the lock before the end of the procedure).
You could also use the DBMS_LOCK package to request a lock specific to your procedure. Use the DBMS_LOCK.request procedure to request a lock. You can ask for a lock that will only be released at the end of your session (this would allow intermediate commits to take place).
You could also use AQ (Oracle queuing system), I have little experience with AQ though so I have no idea if it would be a sensible method.
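A minimal sketch of the first approach, assuming a one-row control table created just for this purpose (the table, row, and error message are hypothetical):
DECLARE
  v_dummy VARCHAR2(30);
  row_locked EXCEPTION;
  PRAGMA EXCEPTION_INIT(row_locked, -54);  -- ORA-00054: resource busy, NOWAIT specified
BEGIN
  -- try to take the lock; this fails immediately if another session already holds it
  SELECT lock_name INTO v_dummy
    FROM app_locks
   WHERE lock_name = 'CSV_LOAD'
     FOR UPDATE NOWAIT;

  -- ... write the CLOB to the file and query the external table here ...

  COMMIT;  -- the commit releases the lock
EXCEPTION
  WHEN row_locked THEN
    RAISE_APPLICATION_ERROR(-20001, 'Another session is already running this load.');
END;
/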
Maybe you should generate a temporary filename for each CSV? Something like:
SELECT TO_CHAR(systimestamp, 'YYYYMMDDHH24MISSFF') filename FROM dual
You can use UTL_FILE.FRENAME.
In similar situations, I have the external table pointing to a file (e.g. "fred.txt").
When I get a new source file in, I use UTL_FILE.FRENAME to try to rename it to fred.txt. If the rename fails, then another process is running, so you return a busy error or wait or whatever.
When the file has finished processing, I rename it again (normally with some date_timestamp).
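A rough sketch of that rename-as-lock idea; the directory object and file names are placeholders, and it is worth verifying on your version exactly which UTL_FILE exception is raised when the target name is already taken:
BEGIN
  -- claim the file: only one session can succeed in renaming it to fred.txt
  UTL_FILE.FRENAME('DATA_DIR', 'incoming.csv', 'DATA_DIR', 'fred.txt', FALSE);

  -- ... query the external table defined on fred.txt here ...

  -- when done, move the file aside with a timestamp suffix
  UTL_FILE.FRENAME('DATA_DIR', 'fred.txt',
                   'DATA_DIR', 'fred_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS') || '.txt');
EXCEPTION
  WHEN UTL_FILE.RENAME_FAILED THEN
    RAISE_APPLICATION_ERROR(-20002, 'Another load appears to be in progress.');
END;
/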

How to find out when an Oracle table was updated the last time

Can I find out when the last INSERT, UPDATE or DELETE statement was performed on a table in an Oracle database and if so, how?
A little background: The Oracle version is 10g. I have a batch application that runs regularly, reads data from a single Oracle table and writes it into a file. I would like to skip this if the data hasn't changed since the last time the job ran.
The application is written in C++ and communicates with Oracle via OCI. It logs into Oracle with a "normal" user, so I can't use any special admin stuff.
Edit: Okay, "Special Admin Stuff" wasn't exactly a good description. What I mean is: I can't do anything besides SELECTing from tables and calling stored procedures. Changing anything about the database itself (like adding triggers) is sadly not an option if I want to get it done before 2010.
I'm really late to this party but here's how I did it:
SELECT SCN_TO_TIMESTAMP(MAX(ora_rowscn)) from myTable;
It's close enough for my purposes.
Since you are on 10g, you could potentially use the ORA_ROWSCN pseudocolumn. That gives you an upper bound of the last SCN (system change number) that caused a change in the row. Since this is an increasing sequence, you could store off the maximum ORA_ROWSCN that you've seen and then look only for data with an SCN greater than that.
By default, ORA_ROWSCN is actually maintained at the block level, so a change to any row in a block will change the ORA_ROWSCN for all rows in the block. This is probably quite sufficient if the intention is to minimize the number of rows you process multiple times with no changes, assuming we're talking about "normal" data access patterns. You can rebuild the table with ROWDEPENDENCIES, which will cause ORA_ROWSCN to be tracked at the row level; that gives you more granular information but requires a one-time effort to rebuild the table.
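A small sketch of that pattern (the table name and bind variable are illustrative):
-- run once and remember the result somewhere on the application side
SELECT MAX(ora_rowscn) AS last_seen_scn FROM my_table;

-- on the next run, fetch only rows changed since the remembered SCN
SELECT *
  FROM my_table
 WHERE ora_rowscn > :last_seen_scn;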
Another option would be to configure something like Change Data Capture (CDC) and to make your OCI application a subscriber to changes to the table, but that also requires a one-time effort to configure CDC.
Ask your DBA about auditing. He can start an audit with a simple command like:
AUDIT INSERT ON user.table
Then you can query the table USER_AUDIT_OBJECT to determine if there has been an insert on your table since the last export.
google for Oracle auditing for more info...
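For example, something along these lines (hedged: verify the audit trail view's exact columns on your version before relying on it):
SELECT obj_name, action_name, timestamp
  FROM user_audit_object
 WHERE obj_name = 'MY_TABLE'
   AND action_name = 'INSERT'
 ORDER BY timestamp DESC;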
SELECT * FROM all_tab_modifications;
Could you run a checksum of some sort on the result and store that locally? Then when your application queries the database, you can compare its checksum and determine if you should import it?
It looks like you may be able to use the ORA_HASH function to accomplish this.
Update: Another good resource: 10g’s ORA_HASH function to determine if two Oracle tables’ data are equal
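A rough sketch of such a checksum; the table and column names are placeholders, and note that summing per-row hashes is order-independent but can in principle collide:
SELECT SUM(ORA_HASH(col1 || '|' || col2 || '|' || col3)) AS table_checksum
  FROM my_table;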
Oracle can watch tables for changes and, when a change occurs, execute a callback function in PL/SQL or OCI. The callback gets an object that is a collection of the tables which changed, each with a collection of the rowids which changed and the type of action (insert, update, delete).
So you don't even go to the table, you sit and wait to be called. You'll only go if there are changes to write.
It's called Database Change Notification. It's much simpler than the CDC that Justin mentioned, but both require some fancy admin stuff. The good part is that neither of these requires changes to the APPLICATION.
The caveat is that CDC is fine for high-volume tables; DCN is not.
If table monitoring is enabled on the server (it is by default from 10g onwards), simply use
SELECT *
FROM ALL_TAB_MODIFICATIONS
WHERE TABLE_NAME IN ()
You would need to add a trigger on insert, update, delete that sets a value in another table to sysdate.
When you run the application, it would read that value and save it somewhere so that the next time it runs it has a reference to compare against.
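A minimal sketch of that idea; the tracking table, trigger, and names are made up for illustration:
CREATE TABLE table_change_log (
  table_name   VARCHAR2(30) PRIMARY KEY,
  last_changed DATE
);

CREATE OR REPLACE TRIGGER my_table_change_trg
AFTER INSERT OR UPDATE OR DELETE ON my_table
BEGIN
  -- record the time of the latest DML on my_table
  MERGE INTO table_change_log t
  USING (SELECT 'MY_TABLE' AS table_name FROM dual) s
  ON (t.table_name = s.table_name)
  WHEN MATCHED THEN UPDATE SET t.last_changed = SYSDATE
  WHEN NOT MATCHED THEN INSERT (table_name, last_changed) VALUES (s.table_name, SYSDATE);
END;
/
The application would then SELECT last_changed FROM table_change_log WHERE table_name = 'MY_TABLE' and compare it with the value saved on the previous run.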
Would you consider that "Special Admin Stuff"?
It would be better to describe what you're actually doing so you get clearer answers.
How long does the batch process take to write the file? It may be easiest to let it go ahead and then compare the file against a copy of the file from the previous run to see if they are identical.
If anyone is still looking for an answer, they can use the Oracle Database Change Notification feature introduced in Oracle 10g. It requires the CHANGE NOTIFICATION system privilege. You can register listeners that trigger a notification back to the application when changes occur.
Please use the statement below:
select * from all_objects ao where ao.OBJECT_TYPE = 'TABLE' and ao.OWNER = 'YOUR_SCHEMA_NAME'
