I am using Oracle Data Pump export (expdp) for my application. My Oracle client and server are on different machines. When I run my application on the Oracle client machine, the exported dump is always created on the Oracle server machine. Is this a limitation of Data Pump export, or is there a workaround?
That's the way Data Pump works by design.
If you need the dump file on the client side, you have to use the older exp and imp utilities. Otherwise, use Data Pump and FTP the file to your local machine afterwards.
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_overview.htm#i1010293
Note: All Data Pump Export and Import processing, including the reading and writing of dump files, is done on the system (server) selected by the specified database connect string. This means that for unprivileged users, the database administrator (DBA) must create directory objects for the Data Pump files that are read and written on that server file system.
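For completeness, here is what the classic client-side approach looks like; the snippet below is only a sketch, and the credentials, connect string, and table names are placeholders:
exp scott/tiger@remotedb file=emp_dept.dmp tables=(emp,dept)
Because exp is a client tool, the dump file is written on the machine where exp runs, not on the database server.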
I am new to Oracle databases. I have installed DBeaver (never used this before too) to connect to the database.
I have created a connection (which I believe is called database) and now I am able to see the database tables and everything. How do I take the backup of the Oracle Database in DBeaver so I can use it locally for test purposes before making any change on live database?
I can't find any option to take the backup of connection/database.
To do a proper backup of your Oracle database, you should use the Oracle-provided utility Recovery Manager (RMAN). It's a command-line interface that is invoked from your DB server shell prompt via the rman command.
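For example, a minimal whole-database backup taken from the DB server shell could look like this (a sketch, assuming the database runs in ARCHIVELOG mode):
rman target /
RMAN> BACKUP DATABASE PLUS ARCHIVELOG;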
You can also use Data Pump to export all or part of a database, which can then be imported into another database; it's not really intended for recovery of an existing database.
I'm not aware of your tool having interfaces for either of these Oracle features.
You might not need a backup at all for your needs; take a look at Oracle Flashback Technology.
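For instance, undoing a bad change to a single table can be as simple as the following sketch (the table name is a placeholder, and it assumes row movement is enabled on the table and enough undo is retained):
FLASHBACK TABLE my_table TO TIMESTAMP SYSTIMESTAMP - INTERVAL '1' HOUR;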
DBeaver does not support Oracle database export/import. See details here:
https://dbeaver.com/docs/wiki/Backup-Restore/
You need to run the sqlplus tool to create a directory object pointing to the folder where Oracle will read and write database dumps. Log in as sys as sysdba, using the password you entered during the database server installation. Example:
sqlplus sys/[your password] as sysdba
After you have successfully logged into sqlplus, run the following command (adjust the path to a folder you prefer to use):
create or replace directory DATA_PUMP_DIR as 'D:\Database Backups';
Once this is done, exit from sqlplus and enter the following command at the operating system command line (again, not inside sqlplus):
expdp sys/[your password]@localhost:1521/[service name] directory=DATA_PUMP_DIR dumpfile=your-database-dump-file.dmp schemas=[your schema]
Once this has finished, you can zip your database dump if you would like to upload it somewhere else. (I had a 9 GB dump and the zipped size was 1.6 GB.)
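If you later want to load that dump into a local test database, the corresponding import would look roughly like this (a sketch; service name, directory, and schema are placeholders):
impdp sys/[your password]@localhost:1521/[service name] directory=DATA_PUMP_DIR dumpfile=your-database-dump-file.dmp schemas=[your schema]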
I have a specific need for data pump and I am having a hard time searching for a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database and imports that same data into another database. This program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
Due to various reasons, I must migrate this program to use data pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run data pump.
The process will be run from the same common server, but the exported files will now reside on the database server for the source database. No issue there. I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it expects a data_pump_dir in the destination area (not the source area). Again, this is automated, and I don't have the luxury of being able to transfer dmp files using scp or ftp or anything like that.
What can I use to overcome this problem using Data Pump?
There is no reason you cannot configure a directory object on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then both impdp and expdp can use mydumpdir as the value of the DIRECTORY argument.
Make sure you grant the relevant Oracle schemas/users permission to read and write to the directory object, AND the Oracle OS process account must also have OS-level rights to read and write to that location. The server running expdp should also have write access, since it may try to write log files to that location, or you might use it for file cleanup.
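For example, granting a schema read/write access to the directory object might look like this (the user name app_user is just a placeholder):
GRANT READ, WRITE ON DIRECTORY mydumpdir TO app_user;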
I am going to unload the data from my table in an Oracle DB to a file using Oracle Data Pump. I am aware that I can import the file into a different DB using Data Pump, but would like to know if there is any other way to import or open the file.
i.e. after the file is generated, I am going to send it to my client, who knows nothing about Oracle DB etc. So is there any software or UI with which my client can open the file, rather than using Oracle Data Pump to import it (if there is no other way, I will educate the client)? The reason I am going for external tables etc. is that I have 50 million records that the client wants to view.
I'm also keen to know how I can export a dmp file with Oracle jobs. I'm very new to Oracle and don't know how to back up Oracle with scheduled jobs, like an MS SQL backup with a schedule. That's why I'm asking.
You can fairly easily set up a backup scheduled by the database. The best approach for this is to install the Oracle Scheduler Remote Job Agent - local to the database - and configure that agent in the database that holds your backup schedule. This can be the database itself, or it can be a central backup schedule database; it's all a matter of taste.
Oracle Scheduler is very powerful and can execute tasks in the local database, in remote databases, on the local server, and on remote servers. If using OS-type jobs, it's best to use the 11g Remote Scheduler Agent. Don't use the old-fashioned 10g-style external jobs. Use remote jobs with defined credentials.
For help, take a look at my blog, where you will also find pointers to the documentation.
After you have installed and configured the job agent to be a valid target for the database that performs the scheduling, the easiest way is to use dbconsole to define the jobs. When you configure dbconsole, it also offers an option to generate automatic backup jobs. Maybe this is already enough. You asked about export, and expdp driven by the Oracle Scheduler does a wonderful job there.
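If you prefer to define the job by hand instead of through dbconsole, a sketch along these lines should work; the job name, script path, and schedule below are placeholders, and the wrapper script is assumed to call expdp with your parameters:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_EXPDP',
    job_type        => 'EXECUTABLE',
    job_action      => '/u01/scripts/run_expdp.sh',
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',
    enabled         => TRUE);
END;
/
For a remote external job you would additionally supply the destination and credential_name arguments, as described in the Scheduler documentation.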
You can run an OS process from an Oracle job using a Java stored procedure or a C program.
See this blog entry.
Instead of exporting a dump using the old exp/imp utilities to generate dmp files, look at Oracle Data Pump, especially since from the tags I infer you're using Oracle 11g.
Data Pump supports table-level, tablespace-level, schema-level and full export modes, and is known to be considerably faster than the previous exp/imp tools.
Further reading:
Export/import with Oracle Data pump
Oracle Documentation on Data pump
"know how can I export dmp file with Oracle jobs?"
That's not possible. The exp tool runs outside the database, while jobs run within it. If you want scheduled exports, perhaps you could use a cron job/scheduled task.
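For example, a nightly cron entry on the database server might look like this (credentials, schema, and directory object are placeholders):
0 2 * * * expdp system/[password] schemas=HR directory=DATA_PUMP_DIR dumpfile=hr_nightly.dmp logfile=hr_nightly.log reuse_dumpfiles=y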
The Oracle Data Pump export utility expects a DIRECTORY parameter (see DBA_DIRECTORIES) that exists on the DB server. Is it possible to map this directory to the local machine, or is there any other way to export multiple tables from an Oracle database to the local machine?
If using Data Pump, there is no direct way to store a dump file on your local machine. That is how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
Run expdp as usual, which creates a dump file on the server
Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want to).
The ocp tool stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth from/to a database server. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on the client side, because I have never tried to compile it there.)
It is a simple command-line tool with a simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a database connection and a minimum set of database privileges.
Update:
Finally, I was able to adjust the source code and build the ocp tool for Windows 32-bit:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just watch the bitness; it should be 32-bit), however I did not check that myself.
Unfortunately, the Windows build of ocp does not have a fancy progress meter during file transfer. That piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully the lack of a 64-bit version is not mission-critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server; otherwise (with a SHARED server) the downloaded copy of the file will be corrupted with no error messages!
With a bit of a hack you can get data pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local machine to the remote machine.
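A sketch of such a link (the link name, credentials, host, and service name are placeholders):
CREATE DATABASE LINK remote_src CONNECT TO remote_user IDENTIFIED BY remote_password USING '//remotehost:1521/remoteservice';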
Then, in the Data Pump options, log in to the local database as the db link owner and specify the 'network_link' option as the name of the database link you created. That way it will export from the remote database through the local database and create the file on your local instance. For example:
expdp <local_user>/<password> directory=<local_dir_object> network_link=<db_link_name on local instance> dumpfile=.. logfile=.. tables=.../schemas=...
No, Data Pump just doesn't support that, but Oracle can get faster throughput by using the same server the DB sits on, so that's the tradeoff. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database - the most secure place for my data, where backup, restore, and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area where they can reach the dump files through their directory objects, and use Data Pump.
If you still want to export data to the client side, you can use the good old exp and imp tools.