SQL Loader in Oracle

I am inserting data from a CSV file into an Oracle table using SQL*Loader and it is working fine.
LOAD DATA
INFILE DataOut.txt
BADFILE dataFile.bad
APPEND INTO TABLE ASP_Net_C_SHARP_Articles
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(ID,Name,Category)
The above settings are being used to do that, but I do not want to specify any of the column names, e.g. (ID,Name,Category).
Is this possible or not? If yes, can anybody tell me how?

In SQL*Loader you need to specify the column names. If you still want to avoid listing them in the control file, then I would suggest using SQL to "discover" the names of the columns, dynamically generating the control file, and wrapping the whole thing in a shell script to make it more automated.
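A minimal sketch of the "discover" step, assuming the table from your control file and an 11gR2-or-later database (for LISTAGG); the result is the column list you would splice into the generated control file:
SELECT LISTAGG(column_name, ',') WITHIN GROUP (ORDER BY column_id) AS ctl_columns
FROM   user_tab_columns
WHERE  table_name = 'ASP_NET_C_SHARP_ARTICLES';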
Meanwhile, you can consider external tables, which use the SQL*Loader engine, so you will still have to perform some dynamic creation here for your input file as suggested above. But you can create a script to scan the input file and dynamically generate the CREATE TABLE..ORGANIZATION EXTERNAL command for you. Then the data becomes available as if it were a table in your database.
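The generated command could look roughly like this (the _ext table name and the data_dir directory object are assumptions for the sketch):
CREATE TABLE asp_net_c_sharp_articles_ext (
  id       NUMBER,
  name     VARCHAR2(100),
  category VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('DataOut.txt')
)
REJECT LIMIT UNLIMITED;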
You can also partially skip columns if that would help, by using FILLER. BOUNDFILLER (available in Oracle 9i and above) can be used if the skipped column's value will be needed again later.
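For example, a control file fragment along these lines (the Remarks column and its EXPRESSION are purely hypothetical, just to show a BOUNDFILLER value being reused):
LOAD DATA
INFILE DataOut.txt
APPEND INTO TABLE ASP_Net_C_SHARP_Articles
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  ID,
  -- Name is read from the file but never loaded
  Name     FILLER,
  -- Category is not loaded either, but its value stays bound for reuse
  Category BOUNDFILLER,
  -- hypothetical extra column populated from the bound value
  Remarks  EXPRESSION ":Category"
)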

Related

SQL LOADER Control File without fields

I'm working on a task to load a database table from a flat file. My database table has 60 columns.
Now, in the SQL*Loader control file, is it mandatory to mention all 60 fields?
Is there a way to tell SQL*Loader that all 60 columns should be treated as required, without mentioning the fields in the control file?
Oracle 12c (and higher versions) offer express mode.
In a few words (quoting the document):
The SQL*Loader TABLE parameter triggers express mode. The value of the TABLE parameter is the name of the table that SQL*Loader will load. If TABLE is the only parameter specified, then SQL*Loader will do the following:
Looks for a data file in the current directory with the same name as the table being loaded that has an extension of ".dat". The upper/lower case used in the name of the data file is the same as the case for the table name specified in the TABLE parameter
Assumes the order of the fields in the data file matches the order of the columns in the table
Assumes the fields are terminated by commas, but there is no enclosure character
(...) order of the fields in the data file matches the order of the columns in the table. The following SQL*Loader command will load the table from the data file.
sqlldr userid=scott table=emp
Notice that no control file is used. After executing the SQL*Loader command, a SELECT from the table will return (...)
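For the 60-column table in the question, that could look something like the following; the names here are placeholders, and the DATA parameter is only needed if your file does not follow the table_name.dat convention:
sqlldr userid=scott table=my_60_column_table data=my_flat_file.csv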
I guess that's what you're after.

Location and filename of most recent BADFILE when using external tables

Is there a way to determine the location/filename of the latest BADFILE?
When I select from an external table with BADFILE 'mytable_%a_%p.bad', how do I find out what specific values %a and %p were replaced with?
Or am I stuck with having mytable.bad, which I can reliably query, and hoping that there will be no race conditions?
As the documentation states:
%p is replaced by the process ID of the current process. For example, if the process ID of the access driver is 12345, then exttab_%p.log becomes exttab_12345.log.
%a is replaced by the agent number of the current process. The agent number is the unique number assigned to each parallel process accessing the external table. This number is padded to the left with zeros to fill three characters. For example, if the third parallel agent is creating a file and bad_data_%a.bad was specified as the file name, then the agent would create a file named bad_data_003.bad.
If %p or %a is not used to create unique file names for output files and an external table is being accessed in parallel, then output files may be corrupted or agents may be unable to write to the files.
Having said that, you must remember the purpose of the badfile in the first place.
The BADFILE clause names the file to which records are written when they cannot be loaded because of errors. For example, a record would be written to the bad file if a field in the data file could not be converted to the data type of a column in the external table. The purpose of the bad file is to have one file where all rejected data can be examined and fixed so that it can be loaded. If you do not intend to fix the data, then you can use the NOBADFILE option to prevent creation of a bad file, even if there are bad records.
So the idea (for either SQL*Loader or external tables with the oracle_loader access driver) is to have a file that stores those records, the bad records, not to trace anything regarding them.
Normally you have external tables associated with text files that you receive on a daily/weekly/monthly basis. You store in the badfile those records that can't be read/loaded according to your own table specification.
You then use the LOGFILE to find out what happened. Those files are generated in the database directory where the external table is created, and you will get one each time a bad file needs to be generated.
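As a rough sketch (the table, columns and the ext_dir directory object are assumptions), the relevant clauses sit in the access parameters; the LOGFILE is then where you look to see what happened, as described above:
CREATE TABLE mytable_ext (
  id  NUMBER,
  val VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    BADFILE ext_dir:'mytable_%a_%p.bad'
    LOGFILE ext_dir:'mytable_%a_%p.log'
    FIELDS TERMINATED BY ','
  )
  LOCATION ('mytable.csv')
)
REJECT LIMIT UNLIMITED;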

How to skip first two line of CSV in Oracle Loader

I have a requirement where I am supposed to load a CSV file using Oracle Loader (SQL*Loader). But this CSV file has its first two lines as a header, and we want to exclude them from the load.
How can we skip them?
The 11g SQL*Loader documentation states that in your control file you just need to make sure you have an OPTIONS clause. Here is an example of how you could implement this:
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'my_new_records.csv'
BADFILE 'my_new_records.bad'
DISCARDFILE 'my_new_records.dsc'
APPEND
...
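Alternatively, SKIP can be passed on the sqlldr command line instead of in the control file; something like this, where the credentials and control file name are placeholders:
sqlldr userid=scott control=my_new_records.ctl skip=2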

Oracle: importing records from tab delimited text file to database using pl-sql

I have never worked with Oracle. This is the first time and the job is quite tricky. I have a text file with records delimited by tabs. These records are to be imported into a database using PL/SQL. I have searched online, but the solutions suggest using the SQL*Loader utility. However, the requirement is to do it using SQL statements, with no command-line utility. Preferably the stored procedure or function will take the file path and database name as input parameters and import the records when executed. Is this possible? Can someone provide me sample SQL statements or any link that explains this process step by step? Also note that there can be blank records in the file. Thanks in advance.
External tables seem like the best approach.
http://docs.oracle.com/cd/B19306_01/server.102/b14215/et_concepts.htm
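A minimal sketch under assumed names (data_dir is an Oracle directory object pointing at the folder holding the file, and the columns are made up); X'09' is the tab character, and MISSING FIELD VALUES ARE NULL helps tolerate short or blank records:
CREATE TABLE my_tab_records_ext (
  col1 VARCHAR2(100),
  col2 VARCHAR2(100),
  col3 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY X'09'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('records.txt')
)
REJECT LIMIT UNLIMITED;
Once the external table exists, a plain INSERT INTO your_table SELECT ... FROM my_tab_records_ext does the import entirely in SQL.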
UTL_FILE is also possible, but then you would have to write the code to parse the tab-delimited text yourself.
http://www.allroundautomations.nl/download/NewFeatures7.pdf
Check that document; it shows an easy way to upload a CSV file to a table.

Magento: How to load large tables into custom db table?

I have a sql installer file for my custom Magento module. It attempts to insert many thousands of rows into a custom database table but it runs out of memory and the module doesn't install.
Everything works fine if I populate the table manually with normal MySQL, and there is no 'memory balloon' doing it that way.
I would like my module to work as a module, without having to do anything manually on the command line. Is there any way I can break down my installer file or call some external routine to get the data in?
You could distribute a CSV file containing the data with your module and use MySQL's LOAD DATA command to load the data into the table you create in your upgrade script.
Maybe something like:
// Get a write connection from Magento's resource singleton
$db = Mage::getSingleton('core/resource')->getConnection('core_write');
// Path to the CSV file shipped inside the module
$filename = Mage::getBaseDir('code').'/local/Your/Module/sql/your_module_setup/foo.csv';
// Bulk-load the whole file into the table in one statement instead of thousands of INSERTs
$sql = "LOAD DATA LOCAL INFILE '".$filename."' INTO TABLE foo FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\r\n'";
$db->query($sql);
You can, of course, run further queries if you need to process the data somehow.
You can also split the table creation and the inserts into separate upgrade scripts.
