Taking dump of tables in oracle 10g using PL/SQL procedure - oracle

Hi, I need a quick response.
I want to take a dump of some selected tables from a schema. Can anybody tell me whether this is possible, and provide a procedure that takes the dump when executed?
For example, I have a schema, testuser, with tables (T1, T2, T3, T5, T9), and I want to take a dump of T1 & T5 only.
Thanks in advance

As you are on 10g you can do this with the Data Pump API. You need to have read and write access on a directory object which maps to the destination OS directory.
In the following example I am exporting two tables, EMP and DEPT, to a file called EMP.DMP in a directory identified by DATA_PUMP_DIR.
declare
  dp_handle number;
begin
  dp_handle := dbms_datapump.open(
    operation => 'EXPORT',
    job_mode  => 'TABLE');

  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.dmp',
    directory => 'DATA_PUMP_DIR');

  dbms_datapump.add_file(
    handle    => dp_handle,
    filename  => 'emp.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  dbms_datapump.metadata_filter(
    handle => dp_handle,
    name   => 'NAME_LIST',
    value  => '''EMP'',''DEPT''');

  dbms_datapump.start_job(dp_handle);

  dbms_datapump.detach(dp_handle);
end;
/

PL/SQL procedure successfully completed.
#DerekMahar asks: "Is there a similar data pump tool or API available for execution from the client side?"
Data Pump, both the PL/SQL API and the OS utility, writes to Oracle directory objects. An Oracle directory must represent an OS directory which is visible to the database. Usually that is a directory on the server, although I suppose it is theoretically possible to map a PC drive onto the network. You'd have to persuade your network admin that this is a good idea; it's a tough sell, because it isn't one...
The older IMP and EXP utilities read and wrote from client directories, so it is theoretically possible to IMP a local dump file into a remote database. But I don't think this is a practical approach. By their nature dump files tend to be big, so importing across a network is slow and prone to failure. It is a much better solution to zip the dump file, copy it to the server, and import it from there.
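For completeness, the one-off setup Data Pump needs is a directory object, created and granted by a privileged user. A minimal sketch, where the directory name, path, and grantee are illustrative only:

```sql
-- Run as a DBA; the path must already exist on the database server.
-- Names and path here are examples, not required values.
CREATE OR REPLACE DIRECTORY data_pump_dir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO testuser;
```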

You should try the Data Pump utilities (expdp/impdp). They have many more capabilities than the old tools and also offer a PL/SQL API. Data Pump is the replacement for exp and imp and is supported in 10g.
http://www.orafaq.com/wiki/Datapump#Invoking_from_PL.2FSQL

With this command you'll get a binary Oracle dump:
exp scott/tiger file=mydump.dmp tables=(T1,T5)
I recommend this link: http://www.orafaq.com/wiki/Import_Export_FAQ

If you must use PL/SQL, and you're trying to create a file, then you'll need to have a directory defined with write access granted to your user. That's something your DBA can do. See the "create directory" command.
At that point, you can (1) call UTL_FILE to open a file and write rows to it or (2) create an "EXTERNAL TABLE" and copy the information to it or (3) use DBMS_XMLGEN or (4) use any of several other ways to actually write the data from the database to the file. All of these are in the Oracle docs. The PL/SQL Packages and Types manual is your friend for things like this.
Note that the actual file system directory has to be on the server where the database is located. So you may need to get access to that server to copy your file, or have somebody set up a mount or whatever.
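For option (1), a minimal UTL_FILE sketch might look like the following. The directory object MY_DIR, the table T1, and its columns COL1/COL2 are all illustrative assumptions, not names from the question:

```sql
-- Sketch only: assumes a directory object MY_DIR exists with write
-- access granted, and a table T1 with columns COL1 and COL2.
DECLARE
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  v_file := UTL_FILE.FOPEN('MY_DIR', 't1.csv', 'w');
  FOR r IN (SELECT col1, col2 FROM t1) LOOP
    -- write one comma-separated line per row
    UTL_FILE.PUT_LINE(v_file, r.col1 || ',' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
END;
/
```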
Alternatively, you could set up a plsql web service that you could call to get your data.
But, personally, I'd just use exp. Or, if that's not available, Toad or some other front end tool (even SQL*Plus) where you can just write a simple SQL script and save the results.
If you're doing this for a homework assignment, my guess is they'll want a UTL_FILE solution.

Related

Using UTL_FILE in ADW

Is there any way I can use UTL_FILE in Autonomous Database to write to Object Storage or another cloud storage? I ask because we can't use OS files in ADB.
Here is an example of the line, but in an on-premise environment:
UTL_FILE.PUT_LINE(v_file, 'col1,col2,col3');
Filesystem directories are supported, and documented here. Just so you are aware: any file you put in the filesystem will count against your storage.
You can use UTL_FILE to write to the database file system, followed by DBMS_CLOUD.PUT_OBJECT to copy that file to a bucket in object store. Use DBMS_CLOUD.DELETE_FILE to delete the file from the database file system if needed. I do it all the time and it works well.
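A sketch of the copy-to-bucket step; the credential name, bucket URI, and file name below are placeholders for illustration, not real values:

```sql
-- Assumes a credential MY_CRED was created with DBMS_CLOUD.CREATE_CREDENTIAL,
-- and that my_file.csv was already written to DATA_PUMP_DIR via UTL_FILE.
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'MY_CRED',
    object_uri      => 'https://objectstorage.region.oraclecloud.com/n/mynamespace/b/mybucket/o/my_file.csv',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'my_file.csv');
  -- optional cleanup of the staging copy on the database file system
  DBMS_CLOUD.DELETE_FILE('DATA_PUMP_DIR', 'my_file.csv');
END;
/
```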
My guess is that p_dir is a parameter to a PL/SQL procedure. If that's the case, p_dir should be set to the name of a valid database directory object. DATA_PUMP_DIR exists by default in ADW, so it could be something like this:
BEGIN
my_file_proc(
p_file => 'my_file.csv',
p_dir => 'DATA_PUMP_DIR');
END;

Altering Stored Procedures in ASE Sybase 15.7

I am new to ASE Sybase 15.7 but do have some background in other RDBMS systems, so I assumed there would be an equivalent of CREATE OR REPLACE for stored procedures in ASE Sybase 15.7.
But I don't seem to see any way to do this. Most people I have asked suggest dropping and recreating with the newer version of the stored procedure, but that gives me the challenge of managing the permissions on the stored procedure, which differ across environments depending on the users in each.
So my ask is below:
Suppose I have a stored procedure as so:
ENV1
CREATE Procedure test
as
begin
SELECT getdate()
end
go
grant execute on test to group1
go
grant execute on test to group2
go
ENV2 has:
CREATE Procedure test
as
begin
SELECT getdate()
end
go
grant execute on test to group1
go
grant execute on test to group2
go
grant execute on test to group3
go
I want to update this stored proc to give me 2 dates instead of 1, so the new proc should be:
ENV1:
CREATE Procedure test
as
begin
SELECT getdate(), getdate()
end
go
grant execute on test to group1
go
grant execute on test to group2
go
ENV2:
CREATE Procedure test
as
begin
SELECT getdate(), getdate()
end
go
grant execute on test to group1
go
grant execute on test to group2
go
grant execute on test to group3
go
The above is a very simplistic example, of course. Is there a way to deploy the changes so as to modify just the stored procedure body, preserving the permissions?
CREATE OR REPLACE and ALTER PROCEDURE don't seem to work, and dropping and creating the stored procedure would mean additional logic for each environment to figure out the permissions to be granted.
Is there a way to do this kind of deployment in an optimal way, considering we have 20-plus different user environments?
Thanks!
While ASE does support create or replace, this is only available with ASE 16.x (ie, you'd need to upgrade to ASE 16.x).
Assuming you're looking to build some sort of scripted solution, I'd recommend taking a look at the ddlgen utility to assist with extracting the current permissions for a stored proc.
One (very simple) example of using ddlgen to pull the DDL for a stored proc:
$ ddlgen -SmyASE -Umylogin -Pmypassword -TP -Nsybsystemprocs.dbo.sp_help -Osp_help.ddl.out
$ cat sp_help.ddl.out
-- Sybase Adaptive Server Enterprise DDL Generator Utility/1 ...snip...
...snip...
use sybsystemprocs
go
...snip...
create procedure sp_help
...snip...
Grant Execute on dbo.sp_help to public Granted by dbo
go
sp_procxmode 'sp_help', anymode
go
From here you could grep out the desired grant, revoke and/or sp_procxmode lines, to be (re)executed once you've dropped/created the replacement stored proc.
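As a hedged illustration of that grep step (the sample ddlgen output below is fabricated for the demo; real output will have more lines):

```shell
# Fabricated stand-in for a ddlgen output file
cat > proc.ddl.out <<'EOF'
create procedure sp_help
as begin select 1 end
Grant Execute on dbo.sp_help to public Granted by dbo
go
sp_procxmode 'sp_help', anymode
go
EOF

# Keep only the permission-related lines (and the 'go' batch separators)
grep -Ei '^(grant|revoke|sp_procxmode|go)' proc.ddl.out > proc.perms.sql
cat proc.perms.sql
```

The resulting proc.perms.sql can then be rerun after the proc is dropped and recreated.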
If you don't have access to ddlgen (I know it's included in the ASE installation software but don't recall if it's provided in the SDK/client software installation) you have a few alternatives:
have the DBA run the ddlgen commands for you and provide you with the results (yeah, I'm sure the DBA will love that idea)
get ddlgen installed on your 'client' machine (eg, install the ASE installation package; or copy over just the needed files from an ASE installation - easier said than done, and would be a PITA when it comes to upgrading the software)
run sp_helprotect <proc_name> (and sp_procxmode <proc_name>) and parse the output for the desired grant, revoke and/or sp_procxmode commands
And one alternative on the 'run-and-parse sp_helprotect/sp_procxmode output' option: look at the source code for these procs and roll your own SQL code to extract the desired data in a format that's easier for your process to handle.

Oracle: How to efficiently copy a table from one schema to another on a different database and server

I have a large table (3.5MM records) that I need to copy from one schema/database to another schema/database. I tried TOAD's copy data from table feature, but got errors and it never fully copied, in part because the connection keeps getting dropped. I'm trying the object copy feature of SQL Developer, and after 11 minutes, it's still copying. I tried the SQL*Plus COPY command but got a syntax error (help needed). I'm still open to extracting the data as INSERT statements that I can just run directly.
1) SQLPLUS Copy as follows:
copy from report_new/mypassword#(DESCRIPTION= (ADDRESS=(PROTOCOL=TCP)(HOST=10.15.15.20)(PORT=1541))(CONNECT_DATA=(SERVICE_NAME=STAGE))) to report/mypassword#(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=10.18.22.25)(PORT=1550))(CONNECT_DATA=(SERVICE_NAME=DEV))) CREATE USER_USAGE_COUNT USING SELECT * FROM _USER_USAGE_COUNT
The above gives me
SQL> start copy_user_count_table.sql
SP2-0758: FROM clause missing username
2) I tried TOAD
The TOAD "Copy data to another schema" fails due to the connection getting
dropped. I set the commit threshold first to 5000 then to 500.
3) I'm trying SQLDeveloper's copy function, but I think it's not going to finish anytime soon and it gives me no real progress indications. For all I know, it could be hung but that it just doesn't want to tell me.
4) I thought about creating a database link, but I don't have the authority to create one, and it's in a corporate environment where the DBAs don't respond in under 3 days.
Todo: Should I write my own Java code to just do this one record at a time?? I shouldn't have to do this, but somehow it's easier to send a man to the moon than to copy data from one schema to another.
You can use the copy command of SQLcl, which ships with newer SQL Developer releases. SQLcl is found in the sqldeveloper\bin directory and is named sql.exe (Windows) or sql (Unix/Linux/Mac). The steps to follow are:
Connect to the destination database with SQLcl
sql username/password@destinationdb
Use the copy command

How do I access the AST (abstract syntax tree) for a PL/SQL stored procedure?

When Oracle compiles a stored procedure, it stores the AST for the procedure in DIANA format.
how can I access this AST?
are there built-in tools for processing this AST?
There is an undocumented package DUMPDIANA that is meant to dump the Diana in a human-readable format.
The file $ORACLE_HOME\rdbms\admin\dumpdian.sql says "Documentation is available in /vobs/plsql/notes/dumpdiana.txt". I cannot find that file, and without it we can only guess at the meaning of some parameters. Basic usage of DUMPDIANA is as follows:
SQL> show user
USER is "SYS"
SQL> @?\rdbms\admin\dumpdian
Library created.
Package created.
Package body created.
SQL> create or replace procedure hello_world
as
begin
  dbms_output.put_line('hello world');
end;
/

Procedure created.
SQL> set serveroutput on
SQL> execute sys.DUMPDIANA.dump('HELLO_WORLD');
user: SYS
PL/SQL procedure successfully completed.
At this point a pair of files should have been created in the folder
$ORACLE_BASE/diag/rdbms/orcl12c/orcl12c/trace. The two files seem to follow the naming convention:
orcl12c_ora_{PROCESS}.trc
orcl12c_ora_{PROCESS}.trm
Where the trc file is a human readable version of the corresponding trm file, and {PROCESS} is the operating system process ID. To find this use the following query from the same session:
select p.spid from v$session s,v$process p
where s.paddr = p.addr
and s.sid = sys_context('USERENV','SID');
For example, if the process ID was 8861, then from a bash shell you can view the results using:
vim $ORACLE_BASE/diag/rdbms/orcl12c/orcl12c/trace/orcl12c_ora_8861.trc
The result is interesting... if not particularly intuitive! For example here is a snippet of the file produced. Note the HELLO_WORLD string literal.
PD1(2):D_COMP_U [
L_SRCPOS : row 1 col 1
A_CONTEX :
PD2(2): D_CONTEX [
L_SRCPOS : row 1 col 1
AS_LIST : < >
]
A_UNIT_B :
PD3(2): D_S_BODY [
L_SRCPOS : row 1 col 1
A_D_ :
PD4(2): DI_PROC [
L_SRCPOS : row 1 col 11
L_SYMREP : HELLO_WORLD,
S_SPEC : PD5^(2),
S_BODY : PD8^(2),
A couple of notes. I've run this as SYS which, as we know, is not good practice, but there is no reason I know of why you shouldn't grant privileges on DUMPDIANA to a normal user. All the procedures you dump go into the same file; if you delete that file, dumping stops working and you'll need to start a new session. If it stops working for some other reason, starting a new session sometimes seems to fix the problem too.
Here is an excellent tutorial on DIANA and IDL in the PDF How to unwrap PL/SQL by Pete Finnigan, principal consultant at Siemens at the time of writing, specializing in researching and securing Oracle databases.
Among other very interesting things you will learn that:
DIANA is written down as IDL (Interface Definition Language).
The 4 tables the IDL is stored in (IDL_CHAR$, IDL_SB4$, IDL_UB1$ and IDL_UB2$)
Wrapped PL/SQL is simply DIANA written down as IDL.
Dumpdiana is not installed by default; you need to ensure the DIANA, PIDL, and DIUTIL PL/SQL packages are installed as well, and you need to run it as SYS.
How to dump the DIANA tree and understand it.
How to reconstruct the PL/SQL source from DIANA.
How to write a PL/SQL un-wrapper.
Limitations of a PL/SQL API based un-wrapper.
Limitations of the PL/SQL API itself.
How to enumerate DIANA nodes and attributes.
A proof of concept un-wrapper.
You can find his website here. There is so much content there. You will find awesome papers about Oracle security and also a lot of useful security tools developed not only by him but other authors as well.
Best of all, you can get in touch with him if after the reading you still have questions.

Procedure to export table to multiple csv files

I am working with an Oracle database - the only way I have to access it is through SQL Developer. Part of my work involves exporting large tables to csv files to pass to another group. Since this is mostly babysitting the system, I've been looking for a way to automate the export process.
What I would like is to have a procedure like so:
PROCEDURE_EXAMPLE(table_in in VARCHAR2, file_out in VARCHAR2)
where table_in is the table I need to export, and it exports the table to a series of csv files titled "file_out_1.csv" "file_out_2.csv", etc.. each with no more than 5 million rows.
Is it possible to create a procedure like this?
You can, using the UTL_FILE package. Note you can only read and write files that are accessible from the server on which your database instance is running.
See http://www.devshed.com/c/a/Oracle/Reading-Text-Files-using-Oracle-PLSQL-and-UTLFILE/
and Oracle write to file
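A hedged sketch of what such a procedure could look like. The directory object EXPORT_DIR, the fixed two-column CSV projection, and the use of a 5,000,000-row rollover are all assumptions for illustration; a real generic version would need DBMS_SQL to describe arbitrary columns:

```sql
CREATE OR REPLACE PROCEDURE procedure_example (
  table_in IN VARCHAR2,
  file_out IN VARCHAR2)
AS
  c_max_rows CONSTANT PLS_INTEGER := 5000000;  -- rows per output file
  v_file     UTL_FILE.FILE_TYPE;
  v_cur      SYS_REFCURSOR;
  v_line     VARCHAR2(32767);
  v_rows     PLS_INTEGER := 0;
  v_part     PLS_INTEGER := 1;
BEGIN
  -- Sketch assumes the exported table has columns COL1 and COL2;
  -- DBMS_ASSERT guards the dynamically concatenated table name.
  OPEN v_cur FOR 'SELECT col1 || '','' || col2 FROM ' ||
                 DBMS_ASSERT.SIMPLE_SQL_NAME(table_in);
  v_file := UTL_FILE.FOPEN('EXPORT_DIR', file_out || '_' || v_part || '.csv', 'w');
  LOOP
    FETCH v_cur INTO v_line;
    EXIT WHEN v_cur%NOTFOUND;
    v_rows := v_rows + 1;
    IF v_rows > c_max_rows THEN
      -- roll over to the next numbered file
      UTL_FILE.FCLOSE(v_file);
      v_part := v_part + 1;
      v_rows := 1;
      v_file := UTL_FILE.FOPEN('EXPORT_DIR', file_out || '_' || v_part || '.csv', 'w');
    END IF;
    UTL_FILE.PUT_LINE(v_file, v_line);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
  CLOSE v_cur;
END;
/
```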
I was just posting an answer here: Procedure to create .csv output
Using the UTL_FILE package is often not an option, because it can only create files on the server and is rather limited.
If you can only use SQL Developer, you can change the window to a command window and then you can run the commands just as I described in the other thread.
In the SQL Window right click -> Change Window to -> Command Window
set term off ...
spool c:\tmp\t.txt
select ...
spool off
