I would like to load a txt file into a CLOB field. The catch is that the file resides on the local disk (not on the Oracle server). Is it possible to do this with PL/SQL on Windows, perhaps from TOAD or SQL*Plus?
If so, could someone share the PL/SQL?
I have seen several posts on loading a CLOB from a file on the server disk (Example1 and Example2), but I can't seem to find anything on loading the file from the local disk.
Thank you!
P.S. It would be great if the routine supported multi-byte text (as in the examples).
No, you can't use a PL/SQL script to load a local file into a table. But there is an alternative: loading local file(s) into a CLOB field with a local Oracle SQL*Loader. Install the Oracle Client on your machine if you haven't already, and use the article to create your own SQL*Loader config and a script to run it.
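For illustration, a minimal SQL*Loader sketch; the table, column and file names here are assumptions, not from the original posts. The control file uses a LOBFILE clause, and CHARACTERSET UTF8 covers your multi-byte requirement:

    -- load_clob.ctl (hypothetical names throughout)
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'filelist.txt'
    INTO TABLE my_docs
    FIELDS TERMINATED BY ','
    (
      doc_id    CHAR(10),
      fname     FILLER CHAR(260),
      doc_text  LOBFILE(fname) TERMINATED BY EOF
    )

Here filelist.txt lists one "id,path" pair per line, e.g. 1,C:\temp\mydoc.txt. Because sqlldr runs on the client, both the list file and the LOB files are read from your local disk:

    sqlldr userid=scott/tiger@mydb control=load_clob.ctl log=load_clob.log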
Related
I have a CSV file which has to be bulk imported into an Oracle DB. I was working with a Sybase DB engine before, so I had a sample script with the environment set up for it. Right now I have to do the same process in an Oracle DB, so what should the first line be? I know about the rest of the parameters, but I want to know the path which has to be defined when I write:
path/bcp dbtable in data.txt
Could anyone help with what the path should be for an Oracle DB?
The primary tools for bulk or flat file loading are:
SQL*Loader
External Tables
GUI Tools like SQL*Developer
It is much more cumbersome, but if necessary you can roll your own solution with the UTL_FILE PL/SQL package.
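If it helps, the rough SQL*Loader equivalent of your bcp one-liner would be something along these lines (connect string, control file and table names are hypothetical):

    sqlldr userid=scott/tiger@orcl control=load.ctl data=data.txt log=load.log

with a minimal control file such as:

    LOAD DATA
    INFILE 'data.txt'
    APPEND INTO TABLE dbtable
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1, col2, col3)

Unlike with bcp, the path to the sqlldr binary usually doesn't matter, as it is on the PATH once the Oracle Client is installed.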
This throws me the error below:
The media family on device is incorrectly formed (error 3241).
I tried loading the .dmp file as a .bak file and restoring the DB. It did not work.
The only way I know to extract anything from a dmp file is to use the INDEXFILE parameter of imp; this generates a readable SQL script with the DDL.
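The invocation looks something like this (file names are hypothetical):

    imp scott/tiger file=export.dmp full=y indexfile=extract_ddl.sql

Note that imp writes the CREATE TABLE statements into that file commented out with REM, leaving only the index DDL active, which adds to the pre-processing described below.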
However, this script is often not 100% usable as it (usually) wraps the statements, so some pre-processing may be required: for example, split the file into discrete statements (CREATE TABLE, CREATE INDEX), then join each statement back onto a single line before squirting it into the target database. Having said that, you would probably need to pre-process anyway to translate the Oracle dialect into the SQL Server dialect.
Also, this might not work so well for BLOB/binary type data.
The other, indirect way to do this would be to create a bridge Oracle database, import the file into that, then use the normal extract and load tools to push the data into SQL Server.
A *.dmp file in Oracle is nothing but a backup file; what you are describing is restoring an Oracle DB backup file into SQL Server.
AFAIK, the answer is NO, you can't do that. But you could check whether there is a third-party utility with which you can perform a DB migration.
The dmp file comes in an Oracle-specific format that cannot be parsed/interpreted by anything other than Oracle's imp tool. That means you cannot import the dmp file into SQL Server.
Of course there are ways to transfer data from Oracle to SQL Server, but which one is optimal depends on your needs: the amount of data, number of tables, number of Oracle schemas, datatypes, etc.
I don't have time to write a Perl script or anything like that, nor do I have admin access to the back end, so how can I get data from a file on the intranet (http://) and parse it to create a table? Maybe somehow via PL/SQL? Keep in mind I don't have much admin access.
If you want it to be completely automated:
You can use the UTL_HTTP package to retrieve the data from the HTTP server (see the sketch after these steps).
You can either parse the CSV response yourself using INSTR and SUBSTR, or use the UTL_FILE package to write the data to a file on the database server file system and then create an external table that parses the CSV file you just created.
You can then insert the parsed data into a table you've already created (I'm assuming that the CSV data is in the same format each time).
You can use the DBMS_SCHEDULER or DBMS_JOB package to schedule the job.
The Oracle database account you're using will need to be granted access to all of these packages.
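As a rough illustration of the fetch-and-parse part, a minimal PL/SQL sketch; the URL, the staging table STAGE_CSV and its two-column layout are assumptions for illustration only:

    DECLARE
      l_req   UTL_HTTP.REQ;
      l_resp  UTL_HTTP.RESP;
      l_line  VARCHAR2(32767);
      l_comma PLS_INTEGER;
    BEGIN
      -- Fetch the (hypothetical) CSV from the intranet server
      l_req  := UTL_HTTP.BEGIN_REQUEST('http://intranet.example.com/data.csv');
      l_resp := UTL_HTTP.GET_RESPONSE(l_req);
      BEGIN
        LOOP
          UTL_HTTP.READ_LINE(l_resp, l_line, TRUE);  -- TRUE strips the CR/LF
          -- Naive two-column split with INSTR/SUBSTR
          l_comma := INSTR(l_line, ',');
          INSERT INTO stage_csv (col1, col2)
          VALUES (SUBSTR(l_line, 1, l_comma - 1), SUBSTR(l_line, l_comma + 1));
        END LOOP;
      EXCEPTION
        WHEN UTL_HTTP.END_OF_BODY THEN
          UTL_HTTP.END_RESPONSE(l_resp);
      END;
      COMMIT;
    END;
    /

Wrap that in a stored procedure and hand it to DBMS_SCHEDULER.CREATE_JOB for the scheduling step; on 11g and later your account will also need a network ACL in addition to the package grants.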
You may download the file onto your host machine and then use SQL*Loader to populate a table.
Alternatively, there are wizards that may be easier than SQL*Loader if you are using an IDE such as PL/SQL Developer (Tools -> Text Importer) or SQL Developer (right-click on Tables -> Import Data).
Create an external table that references your CSV file. That means you can run SELECT statements on the file from within Oracle.
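A minimal sketch, assuming you can create a directory object; the directory, file name and columns here are illustrative only, and note the file must live on the database server, not the client:

    CREATE TABLE ext_csv (
      id   NUMBER,
      name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('data.csv')
    );

    SELECT * FROM ext_csv;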
We get a problem while trying to export an Oracle DB. OS: CentOS ~5.2; DB: Oracle 10g.
The exp command exports DB files only from this location:
/home/oracle/OraHome_1/oradata/master/xxx.dbf
but the tool can't export files in a different location (we know about these files after getting a trace), like these:
'/disk1/dblog06.dbf',
'/home/disk2/system01.dbf',
Please advise me how to get a dump file, or how to back it up.
Thanks.
You appear to have misunderstood what exp does, and particularly what the file parameter is for. The file is the output dump file, normally given a .dmp extension. Export takes data out of the database instance; it does not work under the hood on the datafiles. You have to tell it which data you want (full, user, tables, or tablespaces) and where to put it, not where it comes from.
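For example, to dump one schema out of the instance, the call looks something like this (user and file names are hypothetical):

    exp system/password file=master.dmp log=master.log owner=master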
If you really did try to exp file=/home/disk2/system01.dbf then what you actually asked it to do was trash your database; you're lucky that it did not overwrite the datafile and cause a catastrophic failure. Oracle seems to have saved you from yourself there, though possibly only thanks to having exclusive locks on the files at the time.
You need to read up on how it works and see if it actually does what you want; as APC notes, it's not a backup tool. Look at the Oracle documentation for your version, or somewhere like http://www.orafaq.com/wiki/Import_Export_FAQ, and also consider using Data Pump instead of exp.
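The Data Pump equivalent, which exists on your 10g version, would be along these lines (again, the names are hypothetical; DATA_PUMP_DIR is the usual default directory object):

    expdp system/password schemas=master directory=DATA_PUMP_DIR dumpfile=master.dmp logfile=master.log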
I am not sure if that is the question, but the exp command will export database objects according to their logical schema (user name, table name). It does not matter which physical database file the data is coming from.
exp works through an Oracle instance, which needs to have the datafiles mounted.
Are these other files part of the Oracle database? Maybe another database? You need to find out which Oracle server uses them, and then run exp against that instance.
EXPORT is not a backup tool. It is meant for transferring data from one database to another, or perhaps from one schema to another.
If you want to recover your data in the event of a database crash or corruption, then you need to use the appropriate tool. There are OS-level solutions for this, but Oracle comes with a sophisticated backup and recovery tool: RMAN.
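A minimal RMAN session, assuming you can connect on the server as a SYSDBA-privileged user and the database runs in ARCHIVELOG mode, looks like:

    rman target /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;
    RMAN> LIST BACKUP SUMMARY;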
Is there any easy way to save a BLOB as a binary file to the client-side file system using only standard Oracle utilities (such as sqlplus or sqlldr, for example)?
I've already looked into the UTL_FILE package, but I actually have two problems with it:
I have doubts that it can work with the client-side file system.
I have no privilege to CREATE DIRECTORY in the schema where the BLOBs are stored, so I can't work with UTL_FILE at all.
Also, I know that I can just write some home-brewed utility in any language (Java, for example): connect to Oracle, select my BLOB and save it in binary format. But I'd look for some easier way before doing that.
Would you really want a database writing a BLOB, for example winword.exe, to your local PC? This is the sort of thing that is intentionally quite protected.
It is very client-driven, so the best place to start is with whatever is running on your local machine. I'd go with a Java routine, or, if you've got APEX running, a simple procedure that pushes the BLOB out through the browser and lets the browser prompt you for where to save it (a sketch of such a procedure follows).
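If you do go the APEX/mod_plsql route, a minimal sketch of such a procedure; the table MY_BLOBS and its columns are assumptions for illustration:

    CREATE OR REPLACE PROCEDURE download_blob (p_id IN NUMBER) AS
      l_blob BLOB;
    BEGIN
      SELECT content INTO l_blob FROM my_blobs WHERE id = p_id;

      -- Standard OWA headers so the browser offers a save dialog
      owa_util.mime_header('application/octet-stream', FALSE);
      htp.p('Content-Length: ' || dbms_lob.getlength(l_blob));
      htp.p('Content-Disposition: attachment; filename="blob_' || p_id || '.bin"');
      owa_util.http_header_close;

      -- Stream the BLOB out through the gateway
      wpg_docload.download_file(l_blob);
    END;
    /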