How to take input from a file in Oracle and update an Oracle database table - oracle

I want to update a table in an Oracle database by taking input from a file. That is, the file contains some input fields, and I want to update the table with those values.
Can I do this by creating a directory object and using the UTL_FILE package that Oracle provides?

Yes, you need to create a DIRECTORY object and then use UTL_FILE or something similar to open and read the file, then INSERT into your table. I can't be more specific since you didn't really tell us anything about what you're trying to do.
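For illustration, here is a minimal sketch of that approach. The directory path, the file layout (a comma-separated id and name) and the table MY_TABLE are assumptions for the example, not anything from your system:
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/input_files';

DECLARE
  l_file  UTL_FILE.FILE_TYPE;
  l_line  VARCHAR2(4000);
  l_id    NUMBER;
  l_name  VARCHAR2(100);
BEGIN
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'input.txt', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    -- split a hypothetical "id,name" record on the first comma
    l_id   := TO_NUMBER(SUBSTR(l_line, 1, INSTR(l_line, ',') - 1));
    l_name := SUBSTR(l_line, INSTR(l_line, ',') + 1);
    -- INSERT here; use UPDATE or MERGE instead if you need to modify existing rows
    INSERT INTO my_table (id, name) VALUES (l_id, l_name);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
  COMMIT;
END;
/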

Perhaps the SQL*Loader or import utilities would work for you too:
SQL*Loader: http://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_concepts.htm#g1013706
import: http://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_import.htm#g1025464
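As a rough sketch of the SQL*Loader route, a control file for a comma-separated file might look something like this (the file, table and column names are made up for the example):
-- load_my_table.ctl (hypothetical names throughout)
LOAD DATA
INFILE 'input.txt'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name)
You would then run it with something like: sqlldr userid=scott/tiger control=load_my_table.ctl log=load_my_table.log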

Related

Generate DDL for Oracle Stored Procedure Dependency Graph

With TOAD I know I can view the dependency (uses) graph of a stored procedure using the schema browser. And, the Oracle utility procedure deptree_fill can do something similar. What I want to do is script out all of the stored procedures, functions and table definition DDLs into a file that I can use to recreate those objects in another database. Is there a tool or an existing script for this purpose? My own searching has not found a solution. In my particular case the stored procedure uses a dozen other procedures, a few functions and twenty tables.
Edit 1
Maybe my original question was not clear. What I am looking for is something that will take the stored procedure I am interested in and script it and all of its dependency graph into one or more files.
The schema I am dealing with has hundreds of objects in it and the dependency graph has ~50 objects in it. So I'd rather not dig through large lists in TOAD or write an Oracle script myself if I can avoid it.
All sources can be extracted using the dbms_metadata package.
To get the source of a table:
select dbms_metadata.get_ddl('TABLE', 'SOME_TABLE')
from dual;
To get the source of a stored procedure:
select dbms_metadata.get_ddl('PROCEDURE', 'SOME_PROC')
from dual;
Using that you can create a SQL script that extracts everything and then spool the result to a file.
More details about the various functions in dbms_metadata can be found in the manual:
http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_metada.htm#i1015856
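For example, a minimal SQL*Plus sketch of that spool-to-file idea; SOME_PROC and the spool file name are placeholders, and user_dependencies only lists direct dependencies (for the full graph you would need to recurse, e.g. with a hierarchical query over all_dependencies):
SET LONG 100000 PAGESIZE 0 LINESIZE 1000 TRIMSPOOL ON
SPOOL recreate_objects.sql

-- DDL for the procedure itself
SELECT dbms_metadata.get_ddl('PROCEDURE', 'SOME_PROC') FROM dual;

-- DDL for its direct dependencies in the same schema
SELECT dbms_metadata.get_ddl(referenced_type, referenced_name)
FROM   user_dependencies
WHERE  name = 'SOME_PROC'
AND    referenced_owner = USER
AND    referenced_type IN ('TABLE', 'PROCEDURE', 'FUNCTION');

SPOOL OFF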
Hmm, it is quite easy to find with Google.
Get table DDL: How to get Oracle create table statement in SQL*Plus
The source code of stored procedures can be found in the USER_SOURCE view.
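For example (the procedure name is a placeholder):
SELECT text
FROM   user_source
WHERE  name = 'SOME_PROC'
AND    type = 'PROCEDURE'
ORDER  BY line;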
Also, for exporting a schema to another DB you can use the Oracle export/import utilities: http://docs.oracle.com/cd/B28359_01/server.111/b28319/exp_imp.htm#g1070082
In Toad see the Generate Schema Script window. You can get to it from the Database|Export menu. There are many options there to include/exclude what you want.

Attempting to use SQL-Developer to analyze a system table dump created with 'exp'

I'm attempting to recover the data from a specific table that exists in a system table dump I performed earlier. I would like to append the rows existing in the dump to any rows that may exist in the active table. The problem is, it's likely that the name of the table in the dump is not the same as what exists in the database currently (they're dynamically created with a prefix of ARC_TREND_). In addition, I don't know the name of the table as it exists in the dump. I was hoping to use SQL Developer to analyze the dump file, as I can recognize the correct table by its columns and its existing rows.
While I'm going on blind faith that SQL Developer can work with my dump file, when attempting to open it I'm getting a Java heap OutOfMemory exception. I've adjusted the maximum heap size from 640m to 1024m in both sqldeveloper.bat and sqldeveloper.conf, but to no avail.
Can someone recommend a course of action for me to take to recover the data from a table which exists in an exp-created dump file? A graphical tool would be nice, but I'm no stranger to the command line. I need to analyze the tables that exist in the dump in order to pick the correct one out. Then I assume I can use imp TABLES= to bring it back into the active instance. It likely won't match the existing table name, so I will use SQL Developer to copy the rows from the imported table to the table where I need them to be.
The dump was taken from a Linux server running 10g, and will be imported to (the same server & database instance, upgraded) an 11g instance of the same database.
Thanks
Since you're referring to imp rather than impdp, I assume this wasn't exported with data pump. Either way, I doubt you'll get anything useful through SQL Developer.
Fortunately most of what you're trying to do is quite easy from the command line; just run imp with the INDEXFILE parameter, which will give you a text file containing all the table (commented out with REM) and index creation commands. From that you should be able to spot the table from its column names.
You can't really see any row data though, so if there's more than one possible match you might need to import several tables and inspect the data in them in the database to see which one you really want.
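For reference, the commands might look roughly like this (the credentials, file names and table name are placeholders):
imp scott/tiger FILE=expdat.dmp FULL=Y INDEXFILE=ddl_listing.sql
imp scott/tiger FILE=expdat.dmp TABLES=ARC_TREND_SOMETHING
The first call only writes the table and index creation statements to ddl_listing.sql; the second imports just the table you identified.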

Dynamic SQL-Loader control file

I have 20 tables that are temp-tables where we load and validate data constantly and I have a control file for each table.
How can I have a unique control file that just changes the table the data is loaded into?
Any suggestion?
Thanks in advance!
---Oracle info---
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bi
Suggest you write your control file to load the data into a synonym rather than into a specific table. Begin each load run by redefining the synonym to point at the table you want.
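A rough sketch of that synonym approach (the synonym, table, file and column names are all placeholders). Before each run, repoint the synonym at the table you want:
CREATE OR REPLACE SYNONYM load_target FOR temp_table_07;
The single control file then always references the synonym:
LOAD DATA
INFILE 'data.dat'
APPEND
INTO TABLE load_target
FIELDS TERMINATED BY ','
(col1, col2, col3)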
Maybe you can use multiple INTO TABLE clauses and distinguish between them with the WHEN clause.
See the SQL*Loader documentation for more details.
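A sketch of that multi-table variant, assuming each record carries a one-character record type in its first field (the names, positions and layout are invented for illustration; the POSITION(1) on the filler field resets field scanning for each INTO TABLE clause):
LOAD DATA
INFILE 'data.dat'
APPEND
INTO TABLE temp_table_a
  WHEN (1:1) = 'A'
  FIELDS TERMINATED BY ','
  (rec_type FILLER POSITION(1), col1, col2)
INTO TABLE temp_table_b
  WHEN (1:1) = 'B'
  FIELDS TERMINATED BY ','
  (rec_type FILLER POSITION(1), col1, col2)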

oracle to flat file

I need to create a flat file and push information into it from an Oracle database using JSP.
I require some sample code; help will be appreciated.
If you're looking for an easy way to write the results of SQL queries to a file, this procedure may be useful: http://www.oracle-developer.net/content/utilities/data_dump.sql
Also you might want to look into DBMS_XSLPROCESSOR.CLOB2FILE.
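A minimal sketch of the CLOB2FILE idea, assuming a directory object DATA_DIR and the classic EMP demo table (both placeholders):
DECLARE
  l_out CLOB;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_out, TRUE);
  FOR r IN (SELECT empno, ename FROM emp) LOOP
    l_out := l_out || r.empno || ',' || r.ename || CHR(10);
  END LOOP;
  -- write the CLOB out as DATA_DIR/emp.csv
  DBMS_XSLPROCESSOR.CLOB2FILE(l_out, 'DATA_DIR', 'emp.csv');
END;
/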
I think you need to look into Oracle external tables. These are flat files that appear as tables in the Oracle database. Note that a standard (ORACLE_LOADER) external table is read-only; to write data out to the file you create the external table with the ORACLE_DATAPUMP access driver using CREATE TABLE ... AS SELECT. Google "Oracle External Tables" for more information.

Oracle sql result to DBF file export

I would like to export data from an Oracle table into a *.dbf file (like Excel) through PL/SQL scripts. Is there any code available for this?
There are a number of different ways to do this. The easiest way is to use an IDE like SQL Developer or TOAD, which offer this functionality.
If you want to call it from PL/SQL, then there are no built-in Oracle functions. However, it is relatively straightforward to build something using UTL_FILE which can write out value-separated records. These can be picked up in Excel.
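As a rough sketch of that UTL_FILE approach (the directory object, query and file name are assumptions):
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'export.csv', 'W', 32767);
  FOR r IN (SELECT empno, ename, sal FROM emp) LOOP
    UTL_FILE.PUT_LINE(l_file, r.empno || ',' || r.ename || ',' || r.sal);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/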
Note that the default separator, a comma (the "C" in .CSV), will cause problems if your exported data contains commas. In that case you will need to use Excel's Data Import wizard rather than a right-click Open With ...
Incidentally, it is probably a bad idea to use the .dbf suffix. In an Oracle file system the presumed meaning is database file - i.e. part of the database's infrastructure. This is just a convention, but there is no point in needlessly confusing people. Preferred alternatives include .csv, .dmp or .exp.
edit
If your interest is just to export data for transferring to another Oracle database then you should look at using the Data Pump utility. This comes with an API so it can be used from PL/SQL. Alternatively we unload data through external tables declared with a DataPump driver.
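A sketch of that unload-via-external-table technique (directory, file and table names are placeholders); the CREATE TABLE ... AS SELECT writes the dump file, which can then be attached as an external table on the target database:
CREATE TABLE emp_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY data_dir
    LOCATION ('emp_unload.dmp')
  )
AS SELECT * FROM emp;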
You could also consider using the External Tables feature of Oracle. This essentially allows you to map a CSV file to a 'virtual' table which you can then query; to write data out to the file you would use the ORACLE_DATAPUMP driver mentioned above.
Oracle External Tables Concept Guide
