Oracle: importing records from a tab-delimited text file into a database using PL/SQL

I have never worked with Oracle. This is my first time and the job is quite tricky. I have a text file with records delimited by tabs. These records need to be imported into a database using PL/SQL. I have searched online, but the solutions suggest using the SQL*Loader utility. The requirement, however, is to do this with SQL statements only, with no command-line utility. Preferably the stored procedure or function will take the file path and database name as input parameters and import the records when executed. Is this possible? Can someone provide sample SQL statements, or a link that explains the process step by step? Also note that the file can contain blank records. Thanks in advance.

External tables seem like the best approach.
http://docs.oracle.com/cd/B19306_01/server.102/b14215/et_concepts.htm
UTL_FILE is also possible, but you would have to write the code to parse the tab-delimited text yourself.
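For illustration, here is a minimal sketch of the external-table approach, assuming a tab-delimited file named records.txt, a directory object data_dir, and a target table emp; the names and column layout are assumptions, so adjust them to your environment. The directory must point at a folder on the database server.

-- One-time setup: a directory object pointing at the folder holding the file
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/data';

-- Map the tab-delimited file as a read-only external table
CREATE TABLE emp_ext (
  emp_id    NUMBER,
  emp_name  VARCHAR2(100),
  dept_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY 0x'09'          -- tab character
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('records.txt')
)
REJECT LIMIT UNLIMITED;

-- Load the real table, skipping blank records from the file
INSERT INTO emp (emp_id, emp_name, dept_name)
SELECT emp_id, emp_name, dept_name
FROM   emp_ext
WHERE  emp_id IS NOT NULL;

The file path cannot be bound into plain DDL, so if the path really has to be a parameter, a stored procedure could build the CREATE TABLE statement as a string and run it with EXECUTE IMMEDIATE.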

http://www.allroundautomations.nl/download/NewFeatures7.pdf
Have a look at that document; the tool it describes makes it easy to upload a CSV file into a table.

Related

SQL Loader in Oracle

I am inserting data from a CSV file into an Oracle table using SQL*Loader, and it is working fine.
LOAD DATA
INFILE DataOut.txt
BADFILE dataFile.bad
APPEND INTO TABLE ASP_Net_C_SHARP_Articles
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(ID,Name,Category)
The settings above are being used to do that, but I do not want to specify any of the column names, e.g. (ID,Name,Category).
Is this possible? If yes, can anybody tell me how?
In SQL*Loader you need to specify the column names. If you still want to avoid listing the column names in the control file, then I would suggest using SQL to "discover" the names of the columns, dynamically generating the control file, and wrapping it in a shell script to make it more automated.
Alternatively, you can consider external tables, which use the SQL*Loader engine, so you will still have to perform some dynamic generation for your input file as suggested above. But you can create a script that scans the input file and dynamically generates the CREATE TABLE ... ORGANIZATION EXTERNAL command for you. The data then becomes available as if it were a table in your database.
You can also partially skip columns if that helps, by using FILLER. BOUNDFILLER (available in Oracle 9i and above) can be used if a skipped column's value will be needed again later.
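As a rough illustration of the "discover the columns with SQL" idea above, a query like the one below can generate the column list for the control file from the data dictionary. The table name is taken from the question; LISTAGG needs 11gR2 or later, so treat this as a sketch rather than a drop-in solution.

-- Generate "(COL1,COL2,...)" for the control file from the data dictionary
SELECT '(' || LISTAGG(column_name, ',') WITHIN GROUP (ORDER BY column_id) || ')' AS column_list
FROM   user_tab_columns
WHERE  table_name = 'ASP_NET_C_SHARP_ARTICLES';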

Spring Batch SQL dump parsing issue

Problem statement
I need to replace certain dates in my SQL dump file and create another dump file. First I need to parse the CREATE TABLE definition and store information about the date-type columns; I may need to skip certain columns. After that I need to parse the "INSERT INTO table" statements, break each statement into rows and then into columns, and then replace the dates.
Solution: I am using Spring Batch, where the reader is a composite reader. I read the entire table definition (DROP statement, CREATE statement and INSERT statements) into memory and then pass it to a processor to replace the dates. During reading I also split the INSERT statements into rows and columns.
Problem: This solution works fine for a small dump, but it runs out of memory for large dumps, e.g. one table contains long BLOBs and is 2 GB in size.
Any idea how I can fix this, or is Spring Batch not the right solution here? Any help will be highly appreciated.

How to take input from file in Oracle and update the Oracle database table

I want to update a table in Oracle by taking input from a file. That is, I have some input fields in the file and want to update the table with those values.
Can I do this by creating a directory and using UTL_FILE, which is provided in Oracle?
Yes, you need to create a DIRECTORY object and then use UTL_FILE or something similar to open and read the file, then INSERT into your table. I can't be more specific since you didn't really tell us anything about what you're trying to do.
Perhaps the SQL*Loader or import utilities would work for you too:
SQL*Loader: http://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_concepts.htm#g1013706
import: http://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_import.htm#g1025464
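As a minimal sketch of the UTL_FILE route suggested above: the directory path, file name, comma delimiter and table/column names are all assumptions for illustration, and a real version would need proper validation and error handling.

-- One-time setup (requires the CREATE ANY DIRECTORY privilege); the path is an assumption
CREATE OR REPLACE DIRECTORY in_dir AS '/u01/app/data';

DECLARE
  l_file UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(4000);
  l_id   VARCHAR2(100);
  l_name VARCHAR2(4000);
BEGIN
  l_file := UTL_FILE.FOPEN('IN_DIR', 'input.txt', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;      -- end of file
    END;
    IF TRIM(l_line) IS NOT NULL THEN     -- ignore blank lines
      -- assume two comma-separated fields per line: id,name
      l_id   := REGEXP_SUBSTR(l_line, '[^,]+', 1, 1);
      l_name := REGEXP_SUBSTR(l_line, '[^,]+', 1, 2);
      UPDATE my_table
      SET    name = l_name
      WHERE  id   = TO_NUMBER(l_id);
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
  COMMIT;
END;
/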

oracle to flat file

I need to create a flat file and push information into it from an Oracle database using JSP.
I would appreciate some sample code.
If you're looking for an easy way to write different SQL statements to a file, use this procedure: http://www.oracle-developer.net/content/utilities/data_dump.sql
Also you might want to look into DBMS_XSLPROCESSOR.CLOB2FILE.
I think you need to look into Oracle external tables. These are flat files that appear as tables in the Oracle database. You would simply insert data into them using SQL (as with any other database table). Google "Oracle External Tables" for more information.
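As a hedged sketch of the DBMS_XSLPROCESSOR.CLOB2FILE suggestion above: the directory object, file name and query are assumptions for illustration only.

DECLARE
  l_out CLOB;
BEGIN
  -- Build a small CSV payload in a CLOB; the query is just an example
  FOR r IN (SELECT employee_id, last_name FROM employees) LOOP
    l_out := l_out || r.employee_id || ',' || r.last_name || CHR(10);
  END LOOP;
  -- DATA_DIR must be an existing DIRECTORY object the session can write to
  DBMS_XSLPROCESSOR.CLOB2FILE(l_out, 'DATA_DIR', 'employees.csv');
END;
/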

Oracle sql result to DBF file export

I would like to export data from an Oracle table into a *.dbf file (like Excel) through PL/SQL scripts. Is there any sample code available?
There are a number of different ways to do this. The easiest way is to use an IDE like SQL Developer or TOAD, which offer this functionality.
If you want to call it from PL/SQL, then there are no built-in Oracle functions. However, it is relatively straightforward to build something using UTL_FILE which can write out value-separated records. These can be picked up in Excel.
Note that the default separator, the comma (the "C" in .CSV), will cause problems if your exported data contains commas; in that case you will need to bring the file into Excel via the Data Import wizard rather than a right-click Open With ...
Incidentally, it is probably a bad idea to use the .dbf suffix. In an Oracle file system the presumed meaning is database file, i.e. part of the database's infrastructure. This is just a convention, but there is no point in needlessly confusing people. Preferred alternatives include .csv, .dmp or .exp.
edit
If your interest is just to export data for transferring to another Oracle database, then you should look at using the Data Pump utility. This comes with an API, so it can be used from PL/SQL. Alternatively, you can unload data through external tables declared with the DataPump driver.
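A minimal sketch of the UTL_FILE approach mentioned above; the directory object, file name and query are assumptions, and a real version would also handle quoting of values that contain the separator.

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- OUT_DIR must be an existing DIRECTORY object on the database server
  l_file := UTL_FILE.FOPEN('OUT_DIR', 'export.csv', 'w', 32767);
  FOR r IN (SELECT department_id, department_name FROM departments) LOOP
    UTL_FILE.PUT_LINE(l_file, r.department_id || ',' || r.department_name);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/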
You could also consider using the External Tables feature of Oracle. This essentially allows you to map a CSV file to a 'virtual' table, and then you can insert into it (and therefore the file).
Oracle External Tables Concept Guide
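For completeness, a hedged sketch of the DataPump-driver unload mentioned in the edit above; the directory, file name and source table are assumptions, and the resulting file is a binary dump readable by another Oracle database rather than a CSV.

-- Creating the external table writes emp_unload.dmp into DATA_DIR
CREATE TABLE emp_unload
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY data_dir
  LOCATION ('emp_unload.dmp')
)
AS
SELECT employee_id, last_name, hire_date
FROM   employees;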
