UTL_FILE server-side usage / client-side usage - Oracle

I've used the TEXT_IO package for creating files on the local (client) machine. From the documentation http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABBBABB I see that it is "available for both client-side and server-side PL/SQL". What does this mean? Does it mean I can use it to create files on both the client and the server side? If so, which method/option should I use to create a file on the client side? Thanks.

UTL_FILE is a PL/SQL database package. It can read from or write to any directory on which the oracle OS account has the matching privileges. In practice this means directories on the database server, although directories on other servers - or even your local PC - can be shared with that server through the good graces of your network administrator, with the DBA creating the appropriate DIRECTORY object.
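For illustration, a minimal sketch of that server-side pattern; the directory name, path and grantee (app_out, scott) are placeholders, and the DIRECTORY must be created by a suitably privileged user:
-- as a privileged user: map a server file system path and grant access
CREATE DIRECTORY app_out AS '/u01/app/oracle/out';
GRANT READ, WRITE ON DIRECTORY app_out TO scott;
-- as scott: the file lands on the database server, not on the client
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('APP_OUT', 'hello.txt', 'w');
  UTL_FILE.PUT_LINE(f, 'written on the server');
  UTL_FILE.FCLOSE(f);
END;
/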
TEXT_IO is an Oracle Forms package for writing to the client. Naturally it only works in client/server versions of the product, although the WebUtil library provides an implementation which works in webform deployments.
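By contrast, a minimal TEXT_IO sketch (the path and message are placeholders), run inside a Forms trigger, creates the file on the client PC:
DECLARE
  f TEXT_IO.FILE_TYPE;
BEGIN
  f := TEXT_IO.FOPEN('C:\temp\hello.txt', 'w');
  TEXT_IO.PUT_LINE(f, 'written on the client PC');
  TEXT_IO.FCLOSE(f);
END;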
The oracle OS account is the user which installed the Oracle software; we create that account before running the OUI (Oracle Universal Installer). The oracle user has no direct relationship to any database accounts.
Processes inside the database can only read or write files in directories which that OS account can access. These processes include UTL_FILE, Data Pump, external tables, Java stored procedures running OS commands, and extprocs, as well as background things like the alert log, dumps and trace files.

No, I think it means that UTL_FILE and TEXT_IO provide equivalent functionality for server and client respectively.

Related

How can I make local connection in Oracle SQL Developer?

I have downloaded SQL Developer. Currently I am using my school's database, but that is only for temporary use; I want to keep using Oracle after finishing college. I do not know how to make a local connection in SQL Developer. Can you please help me with this?
Oracle SQL Developer is a tool you use to access an Oracle database.
So, if you want to use Oracle on your own computer, regardless of whether there is (or is not) a connection to your school network, you'll have to install the database as well. I'd suggest Oracle 11g Express Edition. The installation process is simple; more or less, clicking NEXT a few times does the job. I'd recommend, though, that you follow the Installation Guide and pay attention to what the Installer asks (for example, write down the passwords you choose).
Furthermore, in order to "copy" the database (actually, I believe you mean "schema" in this case) to your database, the right way to do that is to use Data Pump: you'd use Export at school to export the schema, and Import on your computer to import it.
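As a rough sketch (connect strings, schema and file names are placeholders; DATA_PUMP_DIR must exist and be accessible on each server):
expdp scott/tiger@school_db SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp LOGFILE=exp_scott.log
impdp scott/tiger@XE SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp LOGFILE=imp_scott.log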
However, Data Pump requires access to a directory (an Oracle object which points to a file system directory, usually on the database server; it is created by SYS, and other users are granted read and/or write privileges on it). If you can't get access to one, you can use the original EXP and IMP utilities instead. EXP creates the DMP file locally; you'd put it onto a memory stick (or, if you're on the network, copy it directly to your PC) and import it.
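A minimal sketch of that fallback (again, connect strings and file names are placeholders):
exp scott/tiger@school_db OWNER=scott FILE=scott.dmp LOG=exp_scott.log
imp scott/tiger@XE FROMUSER=scott TOUSER=scott FILE=scott.dmp LOG=imp_scott.log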
If you're unsure of whether you can (or can not) do that, ask your teacher.
Once the schema is imported into your database, use SQL Developer to access it. Should be no problem to do that.

Oracle Data Pump Transfer Between Databases

I have a specific need for data pump and I am having a hard time searching for a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database and imports that same data into another database. This program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
Due to various reasons, I must migrate this program to use data pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run data pump.
The process will be run from the same common server, but the exported files will now reside on the database server for the source database. No issue there. I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it is expecting a data_pump_dir in the destination area (not source area). Again, this is automated, and I don't have the luxury of being able to transfer dmp files using scp or ftp or anything like that.
What can I use to overcome this problem using datapump?
No reason you cannot configure an external directory on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then pass mydumpdir as the DIRECTORY argument to both expdp and impdp.
Make sure you grant the Oracle schemas/users read/write privileges on the directory, AND the oracle OS account must have OS-level rights to read and write that location as well. The server running expdp should also have write access, as it might be trying to write reports to the location, or you might be using it for file cleanup.
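For example, on each database (path and grantee are placeholders):
CREATE OR REPLACE DIRECTORY mydumpdir AS '/u01/app/dumps';
GRANT READ, WRITE ON DIRECTORY mydumpdir TO myuser;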

What is EXTPROC in Oracle?

For security reasons I asked the DB team to add EXTPROC_DLLS:ONLY, but they said this:
"Please be informed that the KEY = EXTPROC1526 doesn’t refer to any
external process at all. This is just a key used by any process needs
to call Oraxxx via IPC protocol. The key can be any value and the same
key value should be passed via the tnsnames.ora"
To me, it seems wrong. Could you please help me with this? What is the exact use of EXTPROC, and what happens if we don't add EXTPROC_DLLS:ONLY?
For a program to be called as an external procedure from the Oracle database, you need the extproc agent.
PL/SQL, for example, needs extproc to call out to external C libraries.
You can find more information about the security implications here.
I'll paste some of the linked text:
Description
***********
The Oracle database server supports PL/SQL, a programming language. PL/SQL can execute external procedures via extproc. Over the past few years there have been a number of vulnerabilities in this area.
Extproc is intended only to accept requests from the Oracle database server, but local users can still execute commands, bypassing this restriction.
Details
*******
No authentication takes place when extproc is asked to load a library and execute a function. This allows local users to run commands as the Oracle user (oracle on Unix, SYSTEM on Windows). If configured properly under 10g, extproc runs as nobody on *nix systems, so the risk posed here is minimal, but still present.
and an example here
Contrary to other databases, Oracle does NOT allow plugins to access its own memory address space. In the case of MySQL/PostgreSQL, a .dll plugin (C stored procedure) is loaded by the main database process.
Oracle instead lets the listener spawn a new process by calling extproc (or extproc32). This process loads the shared library, and the rest of the database talks to it via IPC.
This approach is safer, because the external library cannot crash the database or corrupt data. On the other hand, C stored procedures can sometimes be slower than Java ones.
This option restricts the paths of the .dlls that extproc may load, i.e. those created by the CREATE LIBRARY statement.
PS: usage of C stored procedures is VERY rare; if you do not use them, you can freely remove the whole extproc stanza from listener.ora.
PS1: there is a possible scenario for exploiting the extproc feature:
The user must have CREATE LIBRARY, which is usually NOT granted
extproc is not configured to run with nobody's privileges, but runs as oracle:dba
The user creates a malicious .so library, which performs something "evil" during its initialization
The user puts this lib into the /tmp directory
The user creates an Oracle LIBRARY pointing into /tmp by using the CREATE LIBRARY statement
The user forces extproc to dlopen this library
extproc then executes the evil code with the OS privileges of oracle:dba
When the EXTPROC_DLLS:ONLY restriction is used, developers have to cooperate with DBAs, and only white-listed libraries can be used and loaded.
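As an illustration only (the ORACLE_HOME, SID name and library path are placeholders), the white-list is set through the ENVS clause of the extproc entry in listener.ora:
SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = PLSExtProc)
      (ORACLE_HOME = /u01/app/oracle/product/11.2.0/dbhome_1)
      (PROGRAM = extproc)
      (ENVS = "EXTPROC_DLLS=ONLY:/u01/app/oracle/safe_libs/mylib.so")
    )
  )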

How to use Oracle data pump export utility to create dump file in local machine?

The Oracle Data Pump export utility expects a DIRECTORY parameter (see DBA_DIRECTORIES) which must exist on the DB server. Is it possible to map this directory to the local machine, or is there any other way to export multiple tables from an Oracle database to the local machine?
If you are using Data Pump, there is no direct way to store a dump file on your local machine; that is how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
Run expdp as usual, which creates a dump file on the server
Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want to).
ocp stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth from/to a database server. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on the client side, because I never tried to compile it there)
That is a simple command-line tool with a simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a database connection and a minimum set of database privileges.
Update:
Finally, I was able to adjust the source code and build the ocp tool for Windows 32-bit:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just watch the bitness; it should be 32-bit), although I did not check that myself.
Unfortunately, the Windows build of ocp does not have a fancy progress meter during file transfer; that piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully the lack of a 64-bit version is not mission critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server; otherwise (with a SHARED server) the downloaded copy of the file will be corrupted, with no error messages!
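One way to force that, as a sketch (host, port and service name are placeholders), is a tnsnames.ora entry with SERVER = DEDICATED:
MYDB_DEDICATED =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = orcl)(SERVER = DEDICATED))
  )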
With a bit of a hack you can get data pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local machine to the remote machine.
Then, in the Data Pump options, log in to the local database as the db link owner and set the network_link option to the name of the database link you created. That way it will export from the remote database through the local database and create the file on your local instance. For example:
expdp directory=<local_dir_object> network_link=<dblinkname on local instance> dumpfile=.. logfile=.. tables/schema=...
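A filled-in sketch, assuming a local directory object DUMP_DIR, a database link REMOTE_DB and a remote schema SCOTT (all placeholders):
expdp system/password@localdb DIRECTORY=DUMP_DIR NETWORK_LINK=REMOTE_DB SCHEMAS=scott DUMPFILE=scott.dmp LOGFILE=scott_exp.log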
No, Data Pump sucks that way, but Oracle can get faster throughput using the same server the db sits on, so that's the tradeoff. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database - the most secure place for my data, where backup, restore and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area where they can reach the dump files through their directory objects, and use Data Pump.
If you still want to export data to client side you can use the good old tools exp and imp.

Selective tables/objects Oracle Backup

I need to automate a selective table / user object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: manually select the tables / objects, set the options, choose a destination and export. I do this from a Windows laptop, and the database is located on a SUSE Linux server; both are on the same LAN. The DB is running 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep doing the backup process on the Windows laptop, but I would also consider a server-side script solution, retrieving the .sql files from the server afterwards.
Thanks in advance
I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
As Gary implies, exp/imp really isn't a backup solution. If this database is important to you or others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host, and for non-tape destinations must write its files to a filesystem attached to the host, it can be launched remotely.
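Just for flavour, and not a tuned backup strategy (it assumes the database runs in archivelog mode), an online full backup from the database host can be as simple as:
rman target /
RMAN> BACKUP DATABASE PLUS ARCHIVELOG;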
RMAN is aimed at restoring/recovering the entire database, so if what you're looking for is only the ability to recover isolated objects, it may not be for you.
