Loading data from a web-hosted CSV into Oracle?

I don't have time to write a Perl script or anything like that, and I don't have admin access to the back end. How can I get data from a file on the intranet (over http://) and parse it to create a table? Maybe somehow via PL/SQL? Keep in mind that my privileges are limited.

If you want it to be completely automated:
You can use the UTL_HTTP package to retrieve the data from the HTTP server (see the sketch after this list).
You can either parse the CSV response yourself using INSTR and SUBSTR, or you can use the UTL_FILE package to write the data to a file on the database server file system and then create an external table that parses the CSV file you just created.
You can then insert the parsed data into a table you've already created (I'm assuming that the CSV data is in the same format each time).
You can use the DBMS_SCHEDULER or DBMS_JOB package to schedule the job.
The Oracle database account you're using will need to be granted access to all of these packages.
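
A minimal anonymous-block sketch of that flow, assuming a hypothetical two-column target table my_csv_data and a placeholder URL (on 11g and later you would also need a network ACL):

DECLARE
  l_req   UTL_HTTP.req;
  l_resp  UTL_HTTP.resp;
  l_line  VARCHAR2(32767);
  l_comma PLS_INTEGER;
BEGIN
  -- Fetch the CSV over HTTP (URL is a placeholder).
  l_req  := UTL_HTTP.begin_request('http://intranet.example.com/data.csv');
  l_resp := UTL_HTTP.get_response(l_req);
  BEGIN
    LOOP
      UTL_HTTP.read_line(l_resp, l_line, TRUE);  -- TRUE strips the CR/LF
      -- Naive split on the first comma; real CSV (quoted fields etc.) needs more care.
      l_comma := INSTR(l_line, ',');
      INSERT INTO my_csv_data (col1, col2)
      VALUES (SUBSTR(l_line, 1, l_comma - 1), SUBSTR(l_line, l_comma + 1));
    END LOOP;
  EXCEPTION
    WHEN UTL_HTTP.end_of_body THEN
      UTL_HTTP.end_response(l_resp);
  END;
  COMMIT;
END;
/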

You may download the file onto your host machine and then use SQL*Loader to populate a table.
Alternatively, there are wizards that may be easier than SQL*Loader if you are using an IDE: PL/SQL Developer (Tools -> Text Importer) or SQL Developer (right-click on Tables -> Import Data).
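
For the SQL*Loader route, a minimal control file sketch; the table my_csv_data, its columns, and the file data.csv are all placeholders, and it assumes a simple two-column CSV:

-- load.ctl (all names hypothetical)
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE my_csv_data
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1, col2)

You would then run something like sqlldr scott/tiger control=load.ctl log=load.log from the client machine.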

Create an external table that references your CSV file. That means you can run select statements on the file from within Oracle.
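
A minimal DDL sketch, assuming a directory object DATA_DIR already exists and points at the folder holding a hypothetical two-column data.csv:

-- External table over the CSV; all names are placeholders.
CREATE TABLE my_csv_ext (
  col1 VARCHAR2(100),
  col2 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;

SELECT * FROM my_csv_ext;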

How to handle bad files generated by external tables

I've always developed shell scripts on a Unix server, where the script first runs SQL*Loader to load the file into an Oracle table, and afterwards checks whether any BAD file has been generated; in that case it sends me an e-mail with a warning.
With an external table instead, the main advantage is that I don't have to maintain any shell scripts. But since a BAD file may only be generated on the server at the moment I run a SELECT against my external table, how can I automatically check for its existence and handle it from within Oracle?
Oracle version 10g
Thanks!
With external tables, everything you do, you do in Oracle (i.e. within the database).
Therefore, you could:
create a PL/SQL program (an anonymous PL/SQL block or a stored procedure)
access the external table
do whatever you need to do
after it is finished, use UTL_FILE to check the log/bad file
use UTL_MAIL to send an e-mail if there's something "wrong"
A sketch of that flow follows.
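
A minimal anonymous-block sketch, assuming a directory object DATA_DIR, a hypothetical external table ext_orders whose bad file is named ext_orders.bad, a target table orders, and that UTL_MAIL has been configured (SMTP_OUT_SERVER set):

DECLARE
  l_exists BOOLEAN;
  l_length NUMBER;
  l_bsize  BINARY_INTEGER;
BEGIN
  -- Work with the external table (this is when a BAD file may appear).
  INSERT INTO orders SELECT * FROM ext_orders;
  -- Check whether the BAD file now exists in the directory.
  UTL_FILE.fgetattr('DATA_DIR', 'ext_orders.bad', l_exists, l_length, l_bsize);
  IF l_exists THEN
    UTL_MAIL.send(sender     => 'db@example.com',
                  recipients => 'me@example.com',
                  subject    => 'External table load produced a BAD file',
                  message    => 'ext_orders.bad is ' || l_length || ' bytes.');
  END IF;
  COMMIT;
END;
/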

How to use the bcp utility with an Oracle DB, or are there better options?

I have a CSV file which has to be bulk imported into an Oracle DB. I was working with a Sybase DB engine before, so I have a sample script with the environment set up for that. Now I have to do the same process against an Oracle DB. I know about the rest of the parameters, but what should the first line be, i.e. the path that has to be defined when I write
path/bcp dbtable in data.txt
If anyone could help: what should the path be for an Oracle DB?
bcp is a Sybase/SQL Server utility; Oracle does not ship an equivalent binary. The primary Oracle tools for bulk or flat-file loading are:
SQL*Loader
External Tables
GUI tools like SQL Developer
It is much more cumbersome, but if necessary you can roll your own solution with the UTL_FILE PL/SQL package, as sketched below.
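
A rough sketch of that UTL_FILE fallback, assuming a directory object DATA_DIR pointing at the folder holding data.txt and a hypothetical two-column target table my_csv_data:

DECLARE
  l_file  UTL_FILE.file_type;
  l_line  VARCHAR2(32767);
  l_comma PLS_INTEGER;
BEGIN
  l_file := UTL_FILE.fopen('DATA_DIR', 'data.txt', 'r', 32767);
  BEGIN
    LOOP
      UTL_FILE.get_line(l_file, l_line);
      -- Naive split on the first comma; real CSV may need more care.
      l_comma := INSTR(l_line, ',');
      INSERT INTO my_csv_data (col1, col2)
      VALUES (SUBSTR(l_line, 1, l_comma - 1), SUBSTR(l_line, l_comma + 1));
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN  -- raised by GET_LINE at end of file
      UTL_FILE.fclose(l_file);
  END;
  COMMIT;
END;
/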

How to connect to an Oracle database from Snowflake?

I have to pull some data from Oracle and update the data in Snowflake, and of course the size of the data is about 5 GB.
Is there any procedure to connect to an Oracle database from Snowflake? Or
do I need to connect them using a programming language such as Python?
You'll need to unload the data from Oracle and load it into Snowflake, as there are no "direct connect" options I've ever heard of.
I'd unload with SQL*Plus (spooling query output to CSV files; note that SQL*Loader itself only loads data), push the files to AWS S3 (or your cloud vendor's storage), and issue Snowflake COPY INTO <table> commands; it should be fairly straightforward. A sketch of the Snowflake side follows.
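
A minimal sketch of those steps in Snowflake SQL, with hypothetical names (my_stage, my_table, the S3 bucket) and placeholder credentials:

-- One-time setup: external stage pointing at the bucket holding the extracts.
CREATE OR REPLACE STAGE my_stage
  URL = 's3://my-bucket/oracle-extracts/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- Load every staged CSV into the target table.
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);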
There is no equivalent to Oracle database links in Snowflake. You would need an external process to move the data from Oracle to S3. Then you can configure a Snowpipe task to load from S3 into Snowflake. See Loading Continuously Using Snowpipe for more information.
I would suggest using Python to extract and load the data from Oracle to Snowflake. Since your Oracle table is updated daily, write a Python program that dynamically generates a MERGE statement to load your incremental data from Oracle to Snowflake.
Snowflake supports JavaScript-based stored procedures, so you can use a stored procedure to generate the MERGE statement dynamically by passing the table name as a parameter, and call it via Python. The shape of such a MERGE is sketched below.
The initial load from Oracle to Snowflake may take time, as you have 5 GB of data in your source system.
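
For reference, a hedged sketch of the kind of MERGE such a program would generate; my_table, my_staging_table, and the columns are all hypothetical:

-- Upsert staged incremental rows into the target table.
MERGE INTO my_table t
USING my_staging_table s
  ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.val = s.val
WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);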

Is there a way to load an Oracle .dmp file into SQL Server 2012?

I tried loading the .dmp file as a .bak file and restoring the DB, but it throws the error below and does not work:
The media family on device is incorrectly formed (Msg 3241).
The only way I know to extract anything from a dmp file is to use the INDEXFILE parameter of IMP, which generates a readable SQL script with the DDL.
However, this script is often not 100% usable as it (usually) wraps the statements, so some pre-processing may be required: for example, parse the file into discrete statements (CREATE TABLE, CREATE INDEX), join each statement onto a single line, then feed it into the target database. Having said that, you would probably need to pre-process anyway to translate the Oracle dialect into the SQL Server dialect.
Also, it might not be so good for BLOB/binary type data.
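
For reference, a sketch of that IMP invocation (user, password, and file names are placeholders):

imp scott/tiger FILE=export.dmp INDEXFILE=ddl.sql FULL=Y

The table DDL in the generated script comes out commented with REM, so uncommenting it is part of the pre-processing mentioned above.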
The other, indirect way to do this would be to create a bridge Oracle database, import the file there, and then use the normal extract-and-load tools to push the data into SQL Server.
A *.dmp file in Oracle is nothing but an export dump file. You are effectively asking about restoring an Oracle backup file in SQL Server.
AFAIK, the answer is no, you can't do that. You could check whether there is any third-party utility with which you can perform a DB migration.
The dmp file comes in an Oracle-specific format that cannot be parsed/interpreted by anything other than Oracle's imp tool, which means you cannot import the dmp file into SQL Server.
Of course there are ways to transfer data from Oracle to SQL Server, but which one is optimal depends on your needs: the amount of data, the number of tables, the number of Oracle schemas, datatypes, and so on.

Load CLOB into Column from Local File (not Oracle server)

I would like to load a txt file into a CLOB field. The catch is that the file resides on the local disk (not on the Oracle server). Is it possible to do this with PL/SQL on Windows, perhaps from TOAD or SQL*Plus?
If so, could someone share the pl/sql?
I have seen several posts on loading a CLOB file from the server disk, Example1 and Example2, but I can't seem to find anything on loading the file from the local disk.
Thank you!
PS: it would be great if the routine supported multi-byte text (as in the examples).
No, you can't use a PL/SQL script to load a local file into a table. But there is an alternative: load the local file(s) into a CLOB field with SQL*Loader running on your local machine. Install the Oracle Client on your machine if you haven't already, and use the linked article to create your own SQL*Loader control file and a script to run it. A sketch of such a control file follows.
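
A hedged sketch of that control file; the table my_docs(id NUMBER, body CLOB), the listing file files.txt (lines like 1,doc1.txt), and the character set are all assumptions:

-- load_clob.ctl (all names hypothetical)
-- CHARACTERSET UTF8 covers the multi-byte requirement in the PS
LOAD DATA CHARACTERSET UTF8
INFILE 'files.txt'
APPEND INTO TABLE my_docs
FIELDS TERMINATED BY ','
(
  id    INTEGER EXTERNAL,
  fname FILLER CHAR(255),
  body  LOBFILE(fname) TERMINATED BY EOF
)

Run it from the client with sqlldr scott/tiger control=load_clob.ctl; SQL*Loader reads each listed document from the local disk and streams it into the CLOB column.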
