Accessing the Created date of a CSV file using an Oracle external table - Windows

Situation
I have a CSV file called inventory.csv located on an Oracle database server (Windows Server 2008 R2, Enterprise Edition). This CSV file is used as an Oracle external table.
Every hour, a scheduled task (Windows Task Scheduler) executes a .bat file that copies over an updated version of inventory.csv, overwriting the original.
The data is then used by a reporting application.
Problem
The application that uses the data in inventory.csv has no way of knowing when the data was last updated.
Ideally, I'd like the "last updated date" to be accessible as a column in the table.
One possible solution is to trigger a logging of the current date/time in a separate file, and then reference that as an external table as well. However, this solution has too many moving parts, and I'd prefer something simpler, if possible.
I know that the CSV file itself knows when it was created... I'm wondering if there is any way for the Oracle external table to read the "Created" date from the CSV file properties?
Or any other ideas?

What version of Oracle?
If you are using 11.2 or later, you can use the preprocessor feature of external tables to run a shell script or batch file on the file before it is loaded. My bias would be to go for simplicity: have the preprocessing script grab the date, store it in a separate file, and have a separate external table that loads and exposes that data. That's likely easier than adding the date to every row.
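On a Unix-style server, that preprocessing script could look something like the sketch below (all file names are assumptions; on the Windows server in the question, the same idea would be a .bat file that writes the timestamp and then `type`s the CSV):

```shell
#!/bin/sh
# Minimal sketch of an 11.2+ PREPROCESSOR script.  Oracle passes the data
# file path as $1 and reads the external table's rows from the script's
# stdout.  File names here are made up for the demo.

# Stand-alone demo input so the sketch is runnable as-is:
printf 'widget,10\ngadget,25\n' > inventory.csv

DATA_FILE="${1:-inventory.csv}"

# Grab the file's modification time and store it in a one-line side file;
# a second, single-column external table can then expose this timestamp.
date -r "$DATA_FILE" '+%Y-%m-%d %H:%M:%S' > inventory_updated.txt

# Emit the CSV unchanged so the main external table still sees its data.
cat "$DATA_FILE"
```

The `date -r FILE` form is GNU coreutils; on other platforms the equivalent would differ.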

Related

How to handle bad files generated by external tables

I've always developed shell scripts on Unix servers where the script first runs SQL*Loader to load the file into an Oracle table, and afterwards checks whether any BAD file was generated; in that case it, for example, sends me an e-mail with a warning.
With an external table, the main advantage is that I don't have to maintain any shell scripts. But since a BAD file may only be generated on the server at the moment I run a SELECT against my external table, how can I automatically check for its existence and handle it from within Oracle?
Oracle version 10g
Thanks!
With external tables, everything you do, you do in Oracle (i.e. within the database).
Therefore, you could
create a PL/SQL program (anonymous PL/SQL block or a stored procedure)
access the external table
do whatever you do
after it is finished, use UTL_FILE to check the log/bad file
use UTL_MAIL to send an e-mail if there's something "wrong"

Load multiple CSV files into different tables using SQL*Loader at once

Just started playing around with sql loader.
I have multiple csv files which I want to load into respective tables.
I used sql loader and created multiple .ctl files one for each csv and I was able to run them one at a time to upload the data to my tables.
But instead of running multiple commands, I want to create a script that will run all these commands at once. Is there a way to do this in a shell script?
Edit: I will be using Linux Rh7
Thanks.
Sure, why not ... sqlldr is an operating-system executable, so you can call it from a batch or shell script. What the script looks like depends on the operating system; per your edit, that's Linux.
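A minimal sketch for Linux (the connect string and control-file names are placeholders, not from the question):

```shell
#!/bin/sh
# Run SQL*Loader once per control file in the current directory.
CONNECT="scott/tiger@ORCL"   # placeholder credentials

# Demo control files so the loop has something to iterate over:
touch emp.ctl dept.ctl

for ctl in *.ctl; do
  # Derive a per-file log name: emp.ctl -> emp.log
  log="${ctl%.ctl}.log"
  echo "loading $ctl (log: $log)"
  # Guarded so the sketch runs even where sqlldr is not installed:
  if command -v sqlldr >/dev/null 2>&1; then
    sqlldr "$CONNECT" control="$ctl" log="$log"
  fi
done
```

Each invocation runs sequentially; backgrounding the sqlldr calls with `&` and a final `wait` would run them in parallel instead.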

Is there a way to load an Oracle .dmp file into SQL Server 2012?

I tried loading the .dmp file as a .bak file and restoring the database. It did not work; it throws the error below:
The media family on device is incorrectly formed (error 3241).
The only way I know to extract from a dmp is to use the "INDEXFILE" parameter of IMP; this generates a readable SQL script with the DDL and DML.
However, this script is often not 100% usable as it (usually) wraps statements across lines, so some pre-processing may be required: for example, parse the file into discrete statements (INSERT, CREATE), join each statement onto a single line, then squirt it into the target database. Having said that, you would probably need to pre-process anyway to translate the Oracle dialect into SQL Server's.
Also, might not be so good for BLOB/binary type data.
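The statement-joining step could be sketched with awk, assuming the INDEXFILE output terminates each statement with a semicolon (the sample input here is made up):

```shell
#!/bin/sh
# Demo input mimicking wrapped statements from an IMP INDEXFILE script:
printf 'INSERT INTO T\n  (A, B)\nVALUES (1, 2);\nCREATE TABLE U\n  (C NUMBER);\n' > indexfile.sql

# Treat ';' as the record separator, collapse internal newlines and runs
# of spaces, and print one complete statement per line.
awk 'BEGIN { RS=";" }
     NF   { gsub(/\n/, " "); gsub(/  +/, " "); sub(/^ /, "");
            printf "%s;\n", $0 }' indexfile.sql > flat.sql
```

For the demo input, flat.sql contains one statement per line: `INSERT INTO T (A, B) VALUES (1, 2);` and `CREATE TABLE U (C NUMBER);`. Note this naive split breaks if a string literal contains a semicolon.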
The other indirect way to do this would be to create a bridge Oracle database, import the file into there, then use the normal extract and load tools to push the data into SQL server.
A *.dmp file in Oracle is nothing but a backup file, so you are really asking about restoring an Oracle DB backup file in SQL Server.
AFAIK, the answer is NO; you can't do that. You could check whether a third-party utility exists that can perform such a DB migration.
The dmp file comes in an Oracle specific format that cannot be parsed/interpreted by anything other than Oracle's imp tool. So, that means you cannot import the dmp file into SQL Server.
Of course there are ways to transfer data from Oracle to SQL Server but which one is optimal depends on your needs, amount of data, number of tables, number of Oracle schemas, datatypes etc etc.

Loading data from a web-hosted CSV into Oracle?

I don't have time to write a Perl script or anything like that, nor do I have admin access to the back end. How can I get data from a file on the intranet (http://) and parse it to create a table? Maybe somehow via PL/SQL? Keep in mind I don't have much admin access.
If you want it to be completely automated
You can use the UTL_HTTP package to retrieve the data from the HTTP server
You can either parse the CSV response yourself using INSTR and SUBSTR or you can use the UTL_FILE package to write the data to a file on the database server file system and then create an external table that parses the CSV file you just created.
You can then insert the parsed data into a table you've already created (I'm assuming that the CSV data is in the same format each time).
You can use the DBMS_SCHEDULER or DBMS_JOB package to schedule the job
The Oracle database account you're using will need to be granted access to all of these packages.
You could download the file to the database server and then use SQL*Loader to populate a table.
Alternatively, there are import wizards that may be easier than SQL*Loader if you are using an IDE such as PL/SQL Developer (Tools -> Text Importer) or SQL Developer (right-click on Tables -> Import Data).
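The download-then-SQL*Loader route might be sketched like this (the URL, credentials, and file names are placeholders, not taken from the question):

```shell
#!/bin/sh
# Fetch a CSV over HTTP, then load it with SQL*Loader.
URL="http://intranet.example.com/data.csv"   # placeholder URL
CSV="data.csv"

# curl -f fails on HTTP errors; fall back to a tiny demo CSV when there
# is no network, so the sketch stays runnable:
curl -fsS "$URL" -o "$CSV" 2>/dev/null || printf 'a,1\nb,2\n' > "$CSV"

# Then hand the file to SQL*Loader with a matching control file, e.g.:
#   sqlldr scott/tiger@ORCL control=data.ctl data=data.csv log=data.log
```

Scheduling this with cron (or Task Scheduler on Windows) would automate the hourly refresh outside the database, at the cost of needing OS access to the server.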
Create an external table that references your CSV file. That means you can run select statements on the file from within Oracle.

Oracle 11g Database Synchronization

I have a WPF application with Oracle 11gR2 as the back end. We need to enable our application to work in both online and offline (disconnected) modes. We are using Oracle Standard Edition (single instance) as the client database. I am using sequence numbers for primary key columns. Is there any way to sync my client and server databases without any issues in the sequence-number columns? Please note that we will restrict creation of basic (master) data to the server only.
There are a couple of approaches to take here.
1- Write the sync process to rebuild the server tables (on the client) each time with a SELECT INTO. Once complete, RENAME the current table to a "temp" table, and RENAME the newly created table with the proper name. The sync process should DROP the temp table as one of its first steps. Finally, recreate the indexes and you should be good-to-go.
2- Create a backup of the server-side database, write a shell script to copy it down and restore it on the client.
Each of these options will preserve your sequence numbers. Which one you choose really depends on your skills. If you're more of a developer, you can make #1 work. If you've got some Oracle DBA skills you should be able to make #2 work.
Since you're on 11g, there might be a cleaner way to do this using Data Pump.
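A rough sketch of the Data Pump route (every schema name, password, TNS alias, and the DP_DIR directory object below is a placeholder):

```shell
#!/bin/sh
# Export the schema on the server, copy the dump, import on the client.
SCHEMA="APP"
DUMP="app.dmp"

EXPORT_CMD="expdp system/***@SERVER schemas=$SCHEMA directory=DP_DIR dumpfile=$DUMP reuse_dumpfiles=y"
IMPORT_CMD="impdp system/***@CLIENT schemas=$SCHEMA directory=DP_DIR dumpfile=$DUMP table_exists_action=replace"

# Printed rather than executed, since both commands need live databases:
echo "1) on the server: $EXPORT_CMD"
echo "2) copy $DUMP from the server's DP_DIR to the client's DP_DIR"
echo "3) on the client: $IMPORT_CMD"
```

Because the import recreates the sequences along with the tables, the client's sequence values stay consistent with the server's at each sync.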

Resources