Selective tables/objects Oracle Backup

I need to automate a selective table / user object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: manually select the tables/objects, set the options, choose the destination and export. I do this from a Windows laptop, and the database is on a SUSE Linux server; both are on the same LAN. The DB runs 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep doing the backup process on the Windows laptop, but I would also consider a server-side script solution and then retrieving the .sql files from the server.
Thanks in advance

I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
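For instance, a minimal sketch of a selective table export with Data Pump (all names here are placeholders; the DATA_PUMP_DIR directory object must exist on the server and your account needs privileges on it):

expdp scott/tiger@orcl tables=EMP,DEPT directory=DATA_PUMP_DIR dumpfile=selective.dmp logfile=selective.log

Note that the dump file lands on the database server, not on the client; the data pump question further down covers ways of getting it to a local machine.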

As Gary implies, exp/imp isn't really a backup solution. If this database is important to you or others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host and, for non-tape destinations, must write its files to a filesystem attached to the host, it can be launched remotely.
RMAN is aimed at restoring/recovering the entire database, so if all you're looking for is the ability to recover isolated objects, it may not be for you.
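As an illustration, an online backup can be kicked off from a client machine roughly like this (placeholder credentials; assumes the database runs in ARCHIVELOG mode, and the backup pieces are still written on the server):

echo "BACKUP DATABASE PLUS ARCHIVELOG;" | rman TARGET sys/password@prod_db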

Related

How can I make a local connection in Oracle SQL Developer?

I have downloaded SQL Developer. Currently I am using my school's database, but that is only for temporary use; I want to keep using it after finishing college. I do not know how to make a local connection in SQL Developer. Can you please help me with this?
Oracle SQL Developer is a tool you use to access an Oracle database.
So, if you want to use Oracle on your own computer, regardless of whether there is a connection to your school network, you'll have to install the database as well. I'd suggest Oracle 11g Express Edition. The installation process is simple; more or less, clicking NEXT a few times does the job. I'd still recommend that you follow the Installation Guide and pay attention to what the installer asks (for example, write down the passwords you choose).
Furthermore, in order to "copy" the database (actually, I believe you mean "schema" in this case) to your own database, the right way is to use Data Pump: you'd use Export at school to export the schema, and Import on your computer to import it.
However, Data Pump requires access to a directory: an Oracle object that points to a file system directory, usually on the database server; it is created by SYS, and other users are granted read and/or write privileges on it. If you can't get access to one, you can use the original EXP and IMP utilities instead. EXP creates the DMP file locally; you'd put it onto a memory stick (or, if you're on the network, copy it directly to your PC) and import it.
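A minimal sketch of that route (placeholder names; exp runs wherever you can reach the school database, imp runs against your local XE instance):

exp school_user/pwd@school_db OWNER=school_user FILE=school.dmp LOG=exp.log
imp system/pwd@XE FROMUSER=school_user TOUSER=school_user FILE=school.dmp LOG=imp.log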
If you're unsure of whether you can (or can not) do that, ask your teacher.
Once the schema is imported into your database, use SQL Developer to access it. Should be no problem to do that.

Oracle application - migration to Exadata server

We have an upcoming migration of our Oracle database to an Exadata server. I want to clarify some issues I have thought of:
Will there be any issues with the code, e.g. performance issues? Exadata has another type of optimizer: it doesn't use indexes and has a columnar optimizer, if I'm not mistaken.
Currently some import/export files are generated on the database server (accessed via FileZilla). I understand that on Exadata the database server is inaccessible, and I suspect that either:
• we will have to move those files to another server - Oracle only knows FTP (whose ports are closed at our client) -> how do we write to / read from another server? (as far as I understand, they would like to put all the files on the WAS server)
• or we will need to import the files into tables via the Java application and process them from there (and the same for the exported files).
Can files that come automatically from other applications still be written to the database server? Or do we have the same problems as for the manual part?
We have plenty of database jobs that run KSH scripts on the database server - is there a problem with them? I understand they should also be moved to the WAS server, but I do not know how Oracle would call them from there.
Will there be any problems with Jenkins deployments? Does anything change? Here we save the SQL/PLSQL sources in XML files, from which the whole application is restored (packages, configuration tables, nomenclatures ...), with the exception of the working data; the XML files are read through a procedure from an Oracle directory.
If you can think of any other issues concerning this migration, any problems you have encountered during or after the migration to Exadata, please share!
Thank you,
Step by step:
On Exadata you will get the same optimizer behaviour, with some improvements, because Exadata can speed up full table scans thanks to Smart Scans: the storage layer can avoid retrieving data blocks during a full scan when it knows in advance that they do not contain the needed data.
On Exadata you can use DBFS file systems to exchange files with external servers; that can be useful for external tables, imports/exports and so on.
You can write your files to a DBFS file system you configure.
You could use that same DBFS if you want the KSH scripts to be accessible from outside your Exadata.
Point your Oracle directory at a directory in the DBFS file system where you put your XML files, and you are done.
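For that last step, a minimal sketch (names and mount point are illustrative, assuming a DBFS file system mounted at /dbfs/staging):

CREATE OR REPLACE DIRECTORY xml_dir AS '/dbfs/staging/xml';
GRANT READ, WRITE ON DIRECTORY xml_dir TO app_user;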

Oracle daily db backup - without Enterprise Manager

I'm a SQL Server DBA, but we have an Oracle 10g database that I need to start performing daily backups on. We do not have Enterprise Manager. Is there a way to schedule a daily backup in Oracle like in SQL Server?
I apologize if this question is severely elementary for Oracle people, but I've had a very tough time trying to research this and coming up with an answer other than "Use EM".
Easiest in your case is to make a simple Windows batch script that sets ORACLE_HOME and PATH and uses rman to make the backup. Schedule the script in the Windows Task Scheduler. Assuming your database is production, and because of this runs in archive log mode, your script could be something like this:
(I am not a Windows expert so subtle errors might be easy to spot for you)
rman_backup.bat:
set ORACLE_SID=your_oracle_sid
set ORACLE_HOME=d:\where\your\installation\is
set PATH=%ORACLE_HOME%\bin;%PATH%
rman cmdfile=your_rman_actions_script.rman log=your_log_file.log
your_rman_actions_script.rman looks like:
connect target /
BACKUP DATABASE PLUS ARCHIVELOG;
For documentation, look at the Oracle 10g database documentation and start with 2 Day DBA. After that, check out the backup documentation found under Administration.
I would (but then, my background is more on Unix, less on Windows) do the scheduling from outside the database, using the OS scheduler to run a backup script, assuming that no real backup system is available.
At the beginning of the backup, you would run a SQL script to place the tablespaces in backup mode (ALTER TABLESPACE x BEGIN BACKUP), then back up the tablespace data files, and after that restore normal mode (ALTER TABLESPACE x END BACKUP). PL/SQL can be used here to loop over all tablespaces.
After that, you'd back up the control file (ALTER DATABASE BACKUP CONTROLFILE TO ...), and finally you would rotate the redo logs enough times that all relevant log data has been archived, and back up the archive logs.
And as for incremental backups, e.g. throughout the working week, just do the log rotation and archive log copy part.
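A minimal sketch of that tablespace loop (assumes ARCHIVELOG mode; the control file path and the OS-level file copy are placeholders you would fill in):

-- put all online permanent tablespaces into backup mode
BEGIN
  FOR ts IN (SELECT tablespace_name FROM dba_tablespaces
             WHERE contents = 'PERMANENT' AND status = 'ONLINE') LOOP
    EXECUTE IMMEDIATE 'ALTER TABLESPACE ' || ts.tablespace_name || ' BEGIN BACKUP';
  END LOOP;
END;
/
-- ... copy the data files at OS level here ...
-- then take the tablespaces out of backup mode again
BEGIN
  FOR ts IN (SELECT tablespace_name FROM dba_tablespaces
             WHERE contents = 'PERMANENT' AND status = 'ONLINE') LOOP
    EXECUTE IMMEDIATE 'ALTER TABLESPACE ' || ts.tablespace_name || ' END BACKUP';
  END LOOP;
END;
/
-- finally, back up the control file (path is illustrative)
ALTER DATABASE BACKUP CONTROLFILE TO '/backup/control.bkp' REUSE;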

How to use the Oracle Data Pump export utility to create a dump file on the local machine?

The Oracle Data Pump export utility expects a DIRECTORY parameter (see DBA_DIRECTORIES) that must exist on the DB server. Is it possible to map this directory to the local machine, or is there any other way to export multiple tables from an Oracle database to a local machine?
If you are using Data Pump, there is no direct way to store a dump file on your local machine; that is simply how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
Run expdp as usual, which creates a dump file on the server
Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want).
ocp stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth from/to a database server. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on the client side, because I never tried to compile it there.)
That is a simple command-line tool with a simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a database connection and a minimum set of database privileges.
Update:
Finally I was able to adjust the source code and build the ocp tool for Windows 32-bit:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just watch the bitness, it should be 32), however I did not check myself.
Unfortunately, the Windows build of ocp does not have a fancy progress meter during file transfer. That piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully the lack of a 64-bit version is not mission critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server, otherwise (with a SHARED server) the downloaded copy of the file will be corrupted with no error messages!
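One way to make sure of that, sketched with illustrative names, is to request a dedicated server in the connect descriptor in tnsnames.ora:

mydb_dedicated =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost)(PORT = 1521))
    (CONNECT_DATA = (SERVER = DEDICATED)(SERVICE_NAME = mydb))
  )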
With a bit of a hack you can get Data Pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local machine pointing to the remote machine.
Then, in the Data Pump options, log in to the local database as the db link owner and set the network_link option to the name of the database link you created. That way it exports from the remote database through the local database and creates the file on your local instance. For example:
expdp directory=<local_dir_object> network_link=<dblinkname on local instance> dumpfile=.. logfile=.. tables/schema=...
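The database link that command assumes could be created like this (hypothetical names; run it on your local database as the link owner):

CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_pwd
  USING 'remote_tns_alias';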
No, Data Pump sucks that way, but Oracle gets faster throughput by working on the same server the DB sits on, so that's the tradeoff. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database, the most secure place for my data, where backup, restore and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area where they can reach the dump files through their directory objects, and use Data Pump.
If you still want to export data to the client side, you can use the good old exp and imp tools.

Running RMAN Scripts with the job scheduler (Oracle)

Here's a good one for any Oracle gurus out there. I'm working on a web page that dynamically configures Oracle DB backup settings in a closed environment. Right now, I have everything set up to generate scheduled jobs that run pre-determined RMAN scripts that already exist on the Database server's disk. This works, but I want to go a step further.
Is there any way to create jobs with the scheduler that will run RMAN scripts which haven't first been written to disk? For example, is it possible to fire off an RMAN backup script directly from the scheduler by using a pipe of some sort? I've found some vague information on the RMAN Pipe Interface, but I can't see how I could create a private pipe, pack it with RMAN commands, and then feed it to RMAN all in one job run... Any thoughts would be very much appreciated.
For anything related to backup/restore of the database, I advise you to prefer the OS's own means of executing scheduled jobs (cron/at on Unix, Scheduled Tasks on Windows). The advantage is that they are independent of the Oracle instance, so you can better handle cases where the instance is down or malfunctioning. The "RMAN pipe interface" is meant to be used together with the operating system's shell as well.
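For instance, a crontab entry along these lines (paths are illustrative) runs a full backup every night at 01:30:

30 1 * * * /home/oracle/scripts/rman_backup.sh >> /home/oracle/logs/rman_cron.log 2>&1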
However, executing scripts directly from the database is also possible: AskTom
If you want to use DBMS_SCHEDULER then the script has to reside on the database server.
But if you install an Oracle client on the web server, you can run RMAN from there and connect to the TARGET database. E.g.:
rman TARGET usr/pwd@conn_str CMDFILE /home/www/db/backup-full.rman
In this case the script can reside on the web server.
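If you would rather stay inside DBMS_SCHEDULER and avoid a script file entirely, here is a hedged sketch of one approach (it assumes the external-job infrastructure is configured and allowed to run as a suitable OS user; all paths and names are illustrative): pipe the RMAN commands in from an EXECUTABLE job, so no .rman file ever touches the disk.

BEGIN
  DBMS_SCHEDULER.create_job(
    job_name            => 'RMAN_FULL_BACKUP',
    job_type            => 'EXECUTABLE',
    job_action          => '/bin/sh',
    number_of_arguments => 2,
    enabled             => FALSE);  -- must stay disabled until arguments are set
  -- argument 1: tell the shell to run a command string
  DBMS_SCHEDULER.set_job_argument_value('RMAN_FULL_BACKUP', 1, '-c');
  -- argument 2: the command itself; rman reads its commands from the pipe
  DBMS_SCHEDULER.set_job_argument_value('RMAN_FULL_BACKUP', 2,
    'export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1; ' ||
    'export ORACLE_SID=orcl; ' ||
    'echo "BACKUP DATABASE PLUS ARCHIVELOG;" | $ORACLE_HOME/bin/rman TARGET /');
  DBMS_SCHEDULER.enable('RMAN_FULL_BACKUP');
END;
/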
Hope this helps.
