Oracle Database 11g. What is the easiest way to set up a full nightly database backup to a network drive (i.e. a drive on another computer)?
Read the backup and recovery guide.
Don't just back up... make sure you test your backups regularly!
If you have Grid Control set up, the easiest way would be to use the web interface. The interface allows you to specify a UNC path (\\computer\sharename). You will have to create a share on the remote computer.
If you don't have Grid Control, you can create a script that uses RMAN. If you need script specifics, it would be helpful to know which OS your database is on.
You need a backup strategy. Once you have defined that you will see what options are available to suit your needs.
Mount the network drive, then use RMAN to fully back up the database to it. RMAN can do it hot. Another method is a full export, but to get a consistent backup you'll have to flush all writes and put the database into restricted mode so no user can make changes while you export.
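For example, a minimal nightly setup on Windows might look like the sketch below, assuming an SID of ORCL, a share \\backupserver\oracle_backups, and an Oracle service account with write access to that share (all placeholder names):

nightly_backup.cmd (scheduled via Windows Task Scheduler):
set ORACLE_SID=ORCL
rman target / cmdfile=D:\scripts\nightly_backup.rman log=D:\scripts\nightly.log

nightly_backup.rman:
# the backup pieces are written by the database server process, so the share
# must be reachable by the Oracle service account, not just this script
BACKUP DATABASE FORMAT '\\backupserver\oracle_backups\db_%d_%U.bkp'
  PLUS ARCHIVELOG FORMAT '\\backupserver\oracle_backups\arch_%d_%U.bkp';
DELETE NOPROMPT OBSOLETE;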
My server was attacked by the .rapid ransomware and all my data was encrypted. Luckily for me, the Oracle home folder is not encrypted - yet - and most of the files, including the datafiles folder and tablespaces, are still accessible.
Can anyone please tell me how to recover my database objects?
No backup is available, only the Oracle home folder (most of it).
EDIT:
The system is broken. I am trying to find out which files to collect and copy so that I can recover my database files on another system.
When I try to log in to sqlplus through cmd I get the following error:
'sqlplus' is not recognized as an internal or external command,
operable program or batch file.
EDIT:
FILES THAT I STILL HAVE ACCESS TO (NOT ENCRYPTED):
Okay. If you can find an init.ora file on your server, that's the PFILE (initialization parameter file), and it's the last thing you need to easily copy your database to a new server. If you can't find it, that's OK - it'll just be a little harder. As long as you have the datafiles, you can eventually get your database back.
Basically, you'll want to follow steps 2-8 in the link I posted. You can also find some helpful info in the Oracle guide to manually creating a database in Windows. I'll walk you through them.
Shut down your old database (if it's still running). This will make sure your datafiles are in a consistent state for copying. If you can't access sqlplus, stopping the Windows service is probably the easiest way to do that.
Copy the data to your new server. I'm assuming it'll be in the same location, D:\app\Administrator\oradata\VTC\
Make a copy of the control file CONTROL01.CTL and name it create_db.sql (EDIT: I was assuming this was a 'backup controlfile to trace' ASCII version of the control file, but it sounds like this is the binary file)
Edit create_db.sql. Where it says CREATE CONTROLFILE REUSE DATABASE "MY_DB" NORESETLOGS, change it to CREATE CONTROLFILE SET DATABASE "MY_DB" RESETLOGS. Make note of whatever "MY_DB" is - this is your database name. Most people make it the same as the SID. I normally do RESETLOGS which throws out the old redo logs, but you could try keeping them with NORESETLOGS if that works for you.
Remove or comment out the lines that say RECOVER DATABASE and ALTER DATABASE OPEN;. Make sure the paths for the datafiles and logfiles look correct. Save the file.
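After those edits, the top of create_db.sql would look something like this (just a sketch; your log/datafile names, sizes and character set will differ):

CREATE CONTROLFILE SET DATABASE "MY_DB" RESETLOGS NOARCHIVELOG
    MAXLOGFILES 16
    MAXDATAFILES 100
LOGFILE
  GROUP 1 'D:\app\Administrator\oradata\VTC\REDO01.LOG' SIZE 50M,
  GROUP 2 'D:\app\Administrator\oradata\VTC\REDO02.LOG' SIZE 50M
DATAFILE
  'D:\app\Administrator\oradata\VTC\SYSTEM01.DBF',
  'D:\app\Administrator\oradata\VTC\SYSAUX01.DBF',
  'D:\app\Administrator\oradata\VTC\USERS01.DBF'
CHARACTER SET WE8MSWIN1252;
-- RECOVER DATABASE
-- ALTER DATABASE OPEN;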
If you couldn't find your init.ora to copy, I think this very minimal one will work for you, although you'll want to fix your memory settings later. Create it in the same folder.
DB_NAME=MY_DB
INSTANCE_NAME=MY_DB
SERVICE_NAMES=MY_DB
CONTROL_FILES = ("D:\app\Administrator\oradata\VTC\CONTROL01.CTL")
DB_FILES=100
Create an Oracle Database Windows Service. Afterwards check Services to make sure it's running.
oradim -NEW -SID MY_DB -STARTMODE manual -PFILE "D:\app\Administrator\oradata\VTC\init.ora"
Log in to your new Oracle instance as SYSDBA. There's no database yet.
cd D:\app\Administrator\oradata\VTC\
set ORACLE_SID=MY_DB
sqlplus / as sysdba
Create the database, using the control file from the old server as a script.
@create_db.sql
If everything comes back OK, run (use a plain ALTER DATABASE OPEN if you kept NORESETLOGS):
alter database open resetlogs;
So I have this task: export a full database. There is a remote machine on which an Oracle 11g server is running. It is low on disk space, so exporting using expdp won't work.
Also, I do not have an Oracle server on my local computer, so exporting over a network link will not work for me either. I used exp instead, but it has already been 4 days since I started the export to my local disk (~380 GB so far), and I still need the dump file of the database.
P.S. I can connect to the remote machine using RDP, so if there are options that would let me export the database dump over RDP, I would appreciate it if you could point me to where to look.
I have tried searching everywhere, even in different languages.
You can create an NFS/CIFS share on your system, mount it on the server, and use Data Pump to export the data directly onto the mounted share. You will still be limited by the throughput of the network link, however.
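A sketch of that approach, assuming a Linux database server and a CIFS share exported from your workstation (every name below is a placeholder):

# on the database server, as root: mount the workstation share
mount -t cifs //my_workstation/dumps /mnt/dumps -o username=me

-- in the database: point a directory object at the mount
CREATE OR REPLACE DIRECTORY dump_share AS '/mnt/dumps';
GRANT READ, WRITE ON DIRECTORY dump_share TO system;

# back in the shell: export straight onto the share
expdp system/password FULL=Y DIRECTORY=dump_share DUMPFILE=full.dmp LOGFILE=full.log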
You should also check the network connectivity between your machine and the server - there is no fast way to transfer 1TB through a 100Mbps network, for example.
Also, when using Windows Remote Desktop there is a way for the remote machine to mount and access your local drives, so you can also point a Data Pump export at a mounted drive. Just make sure not to disconnect the Remote Desktop session. It will most probably be very slow, though.
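For example, with drive redirection enabled in the RDP client your local drives appear on the server as \\tsclient\<drive>. One cautious variant is to export server-side as usual and then copy the dump across the redirected drive, since the Oracle service may not see session-mapped drives (a sketch; paths are placeholders, and this still needs temporary space on the server):

REM inside the RDP session on the server
expdp system/password FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=full.dmp LOGFILE=full.log
copy D:\app\oracle\admin\ORCL\dpdump\full.dmp \\tsclient\C\dumps\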
Since you need to move a lot of data, I suggest that you consider using some kind of NAS or SAN storage, if available. And again - the most likely bottleneck will be the network.
Also, to speed up the export, consider excluding the statistics and synonyms. In earlier versions of 11g, Data Pump had issues exporting statistics, and excluding them sped up the process significantly.
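For example (EXCLUDE is a standard expdp parameter; everything else below is a placeholder):

expdp system/password FULL=Y DIRECTORY=dump_share DUMPFILE=full.dmp EXCLUDE=STATISTICS EXCLUDE=SYNONYM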
The Oracle Data Pump export utility expects a DIRECTORY parameter (see DBA_DIRECTORIES) that refers to a directory on the database server. Is it possible to map this directory to the local machine, or is there any other way to export multiple tables from an Oracle database to a local machine?
If you're using Data Pump, there is no direct way to store a dump file on your local machine; that is just how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
Run expdp as usual, which creates a dump file on the server.
Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want).
ocp stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth between a database server and a client. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on a client side, because I never tried to compile it there)
That is a simple command-line tool with a simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a plain database connection and requires only a minimal set of database privileges.
Update:
Finally, I was able to adjust the source code and build the ocp tool for 32-bit Windows:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just watch the bitness - it should be 32-bit), but I have not checked that myself.
Unfortunately, the Windows build of ocp does not have the fancy progress meter during file transfer. That piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully the lack of a 64-bit version is not mission-critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server; otherwise (with a SHARED server) the downloaded copy of the file will be corrupted, with no error messages!
With a bit of a hack you can get data pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local machine to the remote machine.
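Creating the link on the local database might look like this (user, password and connect string are placeholders):

-- run on the LOCAL database, as the user who will own the link
CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING '//remote-host:1521/REMOTE_SERVICE';

-- quick sanity check
SELECT * FROM dual@remote_db;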
Then, in the Data Pump options, log in to the local database as the database link owner and set the network_link option to the name of the database link you created. That way it will export from the remote database through the local database and create the file on your local instance. For example:
expdp <user>/<password> directory=<local_dir_object> network_link=<dblink name on local instance> dumpfile=... logfile=... tables=... (or schemas=...)
No, data pump sucks that way, but Oracle gets faster throughput by running on the same server the DB sits on, so that's the tradeoff. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database - the most secure place for my data, where backup, restore and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area where they can reach the dump files through their directory objects, and use Data Pump.
If you still want to export data to the client side, you can use the good old exp and imp tools.
I need to automate a selective table / user-object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: manually select the tables/objects, set the options, choose a destination and export. I do this from a Windows laptop, and the database is on a SUSE Linux server; both are on the same LAN. The DB runs 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep doing the backup process from the Windows laptop, but I would also consider a server-side script solution and then retrieving the .sql files from the server.
Thanks in advance
I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
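For the selective part, expdp accepts a TABLES (or INCLUDE) parameter, so a server-side cron job could run something like this sketch (schema, table and file names are placeholders; DATA_PUMP_DIR is the default server-side directory object):

expdp backup_user/password DIRECTORY=DATA_PUMP_DIR \
      TABLES=app.customers,app.orders \
      DUMPFILE=selective.dmp LOGFILE=selective.log REUSE_DUMPFILES=Y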
As Gary implies, exp/imp isn't really a backup solution. If this database is important to you or others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host, and for non-tape destinations must write its files to a filesystem attached to the host, it can be launched remotely.
RMAN is aimed at restoring/recovering the entire database, so if what you're looking for is only the ability to recover isolated objects it may not be for you.
How can I duplicate an Oracle instance? Does anyone have any idea how to do so?
Assuming you want the schema and data duplicated, use the exp and imp commands to export your database, then import it as another user using the FROMUSER and TOUSER parameters.
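A sketch of that, assuming the source schema is SCOTT and the copy should land in SCOTT_COPY (placeholders; the target user must already exist):

exp system/password OWNER=scott FILE=scott.dmp LOG=exp_scott.log
imp system/password FROMUSER=scott TOUSER=scott_copy FILE=scott.dmp LOG=imp_scott.log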
Well, presumably you have a backup (surely!), so just test your backup recovery on your test server.
To be slightly more serious, it depends on what version you are using. Newer versions of RMAN make it pretty easy, I believe; with older versions you basically do it as a backup/recover.
How I've done it in the past is basically:
copy the backup datafiles
create an init file
create a new controlfile with the command CREATE CONTROLFILE SET DATABASE "TEST" RESETLOGS ARCHIVELOG
apply the archivelogs and then open with resetlogs (sketched below)
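The last two steps translate to something like this in SQL*Plus (a sketch; the CREATE CONTROLFILE step leaves the database mounted):

RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL;
-- feed it the copied archived logs as it asks for them, then type CANCEL
ALTER DATABASE OPEN RESETLOGS;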
Here is an article which explains the process in a bit more detail.
A minor comment on your terminology - "instance" is actually the set of processes running on the database server host and you want to duplicate the "database".
As someone else mentioned, the best way is to start with an RMAN backup of the original database. However, since Oracle 9, RMAN has had the DUPLICATE DATABASE command, which takes care of a lot of the housekeeping that used to be necessary when you made a copy by restoring a production backup (e.g. resetting the DBID, changing data and log file locations in the control file, setting the database GLOBAL_NAME, etc.).
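In outline, a backup-based DUPLICATE run can be as short as this (a sketch; connect strings, the new name and the file-name mappings are placeholders, and the auxiliary instance must already be started NOMOUNT):

rman TARGET sys@prod AUXILIARY sys@testdb

RUN {
  DUPLICATE TARGET DATABASE TO testdb
    DB_FILE_NAME_CONVERT ('/u01/oradata/prod/', '/u01/oradata/testdb/')
    LOGFILE
      GROUP 1 ('/u01/oradata/testdb/redo01.log') SIZE 50M,
      GROUP 2 ('/u01/oradata/testdb/redo02.log') SIZE 50M;
}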
If you're not using RMAN, and the database is on the small side, you can script something that puts each tablespace in hot backup mode, copies the datafiles for that tablespace to a backup location, and then takes the tablespace out of hot backup mode. You now have a recoverable backup that can be moved to another host for archive log application. This definitely has a performance impact on the original database and should be your last resort.
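Per tablespace, that script boils down to the following (names and paths are placeholders):

ALTER TABLESPACE users BEGIN BACKUP;
-- copy that tablespace's datafiles with OS tools while it's in backup mode,
-- e.g. cp /u01/oradata/prod/users01.dbf /backup/prod/
ALTER TABLESPACE users END BACKUP;

-- repeat for each tablespace, then capture the control file as well
ALTER DATABASE BACKUP CONTROLFILE TO '/backup/prod/control.bkp';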
Create a template based on your existing instance (DBCA can generate one from an existing database). You can then create other instances from it.