How to take a backup of the RPD and Catalog in OBIEE 12c?

I would like to take a backup of the RPD (i.e., the OBIEE 12c metadata) and the Catalog (i.e., the BI analytical reports and dashboards saved in both My Folders and Shared Folders) in order to restore them on a test server for research purposes.
OBIEE12c Version: 12.2.1.4
Operating System: AIX 7.1

You may take a backup of the RPD and Catalog using the exportarchive utility.
Syntax:
[DOMAIN_HOME]/bitools/bin/exportarchive.sh ssi <Bar file name>.bar encryptionpassword=<password>
Example:
[DOMAIN_HOME]/bitools/bin/exportarchive.sh ssi /backupdir/obitest.bar encryptionpassword='Test123'
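To restore on the test server, the matching importarchive utility can be used. A minimal sketch, assuming the BAR file has been copied to the test server and the default ssi service instance is in use:
[DOMAIN_HOME]/bitools/bin/importarchive.sh ssi /backupdir/obitest.bar encryptionpassword='Test123'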

Related

How to update RPD file in OBIEE analytics

I'm new to OBIEE and looking for a way to update my repository file (RPD), i.e. I initially had an RPD configured and have now made a change to a view in the Oracle database (added one column). What can I do to get my RPD file updated with this change so that I can see it? (I need to work on the RPD file through the Administration Tool.)
Thanks.
All you have to do is:
1. Modify the RPD.
2. Stop the BI Server: opmnctl stopproc ias-component=coreapplication_obis1
3. Copy the modified RPD over to .../bifoundation/OracleBIServerComponent/coreapplication_obis1/repository.
4. Start the BI Server: opmnctl startproc ias-component=coreapplication_obis1
That's it, you are done!
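Scripted, the sequence might look like the sketch below. Note that opmnctl applies to OPMN-managed (11g-style) deployments, and the RPD file name and instance path are illustrative assumptions:
#!/bin/sh
# Stop the BI Server component before swapping the repository
opmnctl stopproc ias-component=coreapplication_obis1
# Copy the modified RPD into the repository directory (path and file name are illustrative)
cp /tmp/myrepo.rpd $ORACLE_INSTANCE/bifoundation/OracleBIServerComponent/coreapplication_obis1/repository/
# Start the BI Server again so it picks up the new RPD
opmnctl startproc ias-component=coreapplication_obis1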
You can move the RPD between Windows and Linux without a problem, as long as it is the same OBIEE version, patch level, etc. If that is not the case, then you have to run the RPD through the upgrade process documented on OTN.
hth,

How to use the Oracle Data Pump export utility to create a dump file on a local machine?

The Oracle Data Pump export utility expects a DIRECTORY parameter (see DBA_DIRECTORIES) that exists on the DB server. Is it possible to map this directory to a local machine, or is there any other way to export multiple tables from an Oracle database to a local machine?
If using Data Pump, there is no direct way to store a dump file on your local machine; that is how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
1. Run expdp as usual, which creates a dump file on the server.
2. Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want).
ocp stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth between a client and a database server. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on the client side, because I have never tried to compile it there.)
That is a simple command-line tool with a simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a database connection and a minimum set of database privileges.
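Putting the two steps together, a minimal sketch; the scott/tiger@orcl connection, schema, and file names are all assumptions:
# Step 1: create the dump file on the database server
expdp scott/tiger@orcl schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_exp.log
# Step 2: pull the dump file from the server down to the local machine
ocp scott/tiger@orcl DATA_PUMP_DIR:scott.dmp scott.dmp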
Update:
Finally I was able to adjust the source code and build the ocp tool for Windows 32-bit:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just watch the bitness; it should be 32), however I did not check it myself.
Unfortunately, the Windows build of ocp does not have a fancy progress meter during file transfer. That piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully, the lack of a 64-bit version is not mission-critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server; otherwise (with a SHARED server) the downloaded copy of the file will be corrupted, with no error messages!
With a bit of a hack you can get data pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local machine to the remote machine.
Then, in the Data Pump options, log in to the local database as the DB link owner and specify the network_link option as the name of the database link you created. That way it exports from the remote database through the local database and creates the file on your local instance. For example:
expdp directory=<local_dir_object> network_link=<dblinkname on local instance> dumpfile=.. logfile=.. tables/schema=...
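A more concrete sketch of the whole sequence; the user, password, TNS alias, and link name are all assumptions:
# Create the database link on the local instance (run once, via SQL*Plus)
sqlplus scott/tiger <<EOF
CREATE DATABASE LINK remote_db CONNECT TO scott IDENTIFIED BY tiger USING 'REMOTE_TNS';
EXIT
EOF
# Export the remote schema through the link; the dump file lands on the local machine
expdp scott/tiger directory=DATA_PUMP_DIR network_link=remote_db schemas=SCOTT dumpfile=remote_scott.dmp logfile=remote_scott.log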
No, Data Pump sucks that way, but Oracle can get faster throughput using the same server the DB sits on, so that's the tradeoff. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database, the most secure place for my data, where backup, restore, and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area that they can reach through their directory objects, and use Data Pump.
If you still want to export data to the client side, you can use the good old exp and imp tools.

Selective tables/objects Oracle Backup

I need to automate a selective table / user object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: I manually select the tables/objects, set the options, choose a destination, and export. I do this from a Windows laptop, and the database is located on a SUSE Linux server; both are on the same LAN. The DB runs 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep doing the backup process on the Windows laptop, but I would also consider a server-side script solution and then retrieving the .sql files from the server.
Thanks in advance
I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
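For the selective-tables case, a minimal expdp sketch; the connection, schema, and table names are assumptions, and the dump file is written on the server, from where it would then need to be fetched to the laptop:
expdp hr/hr@orcl tables=HR.EMPLOYEES,HR.DEPARTMENTS directory=DATA_PUMP_DIR dumpfile=selected_tables.dmp logfile=selected_tables.log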
As Gary implies, exp/imp really isn't a backup solution. If this database is important to you or others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host and, for non-tape destinations, must write its files to a filesystem attached to the host, it can be launched remotely.
RMAN is aimed at restoring/recovering the entire database, so if what you're looking for is only the ability to recover isolated objects it may not be for you.

How can I replicate an Oracle 11g database (data + structure) on my local machine for development?

I am working on a test server with Oracle 11g installed. I was wondering if there is any way I can replicate the database (environment + data) on my local Linux machine. I am running CentOS 5.3 on Windows XP with Sun VirtualBox. On Windows I am using the SQL Developer client to connect to the 11g database.
There are a number of ways to move the data over:
Restore an RMAN backup on your test server
Export and import the data using exp/expdp/imp/impdp
Export and import using a transportable tablespace
Use database links to duplicate the data using SQL (see the sketch below)
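As an illustration of the last option, data can be pulled across a database link one table at a time; the user, link, TNS alias, and table names here are all assumptions:
# Run against the local database via SQL*Plus; pulls a table across the link
sqlplus app_user/app_pass <<EOF
CREATE DATABASE LINK test_server CONNECT TO app_user IDENTIFIED BY app_pass USING 'TEST_TNS';
CREATE TABLE employees AS SELECT * FROM employees@test_server;
EXIT
EOF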
You can use the Database Configuration Assistant (DBCA) to generate a template from your production database. This will give you all the parameters and tablespaces, among other things. You will need to tweak the configuration somewhat; for instance, the file paths may be wrong and some parameters may need downsizing. You can then feed that template into DBCA to clone the database on your Linux machine.
To get the schemas and data you should use Data Pump (rather than the older Import/Export utilities). It can be run from the command line or from PL/SQL.
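A minimal Data Pump sketch for this; the schema name, connections, and file names are assumptions:
# On the test server: export the application schema
expdp system@testdb schemas=APP directory=DATA_PUMP_DIR dumpfile=app.dmp logfile=app_exp.log
# Copy app.dmp into the local database's directory object path, then import it
impdp system@localdb schemas=APP directory=DATA_PUMP_DIR dumpfile=app.dmp logfile=app_imp.log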
Bear in mind that using production data in a development or test environment can cause you to run afoul of data protection laws and other compliance issues. It depends on what your application does and what jurisdiction you operate under, but if your production system contains citizens' personal data you need to be very careful. There are products out there which will apply masking as part of a data import process (Oracle sells one), but they tend to be expensive. Rolling your own masking process can be tricky: if this applies to your situation, be sure to get your compliance staff (legal team) involved early.
I would suggest you install Oracle XE, which is free to use, on your local machine if your development is not related to core database features. You can then use the methods given above to pump data into Oracle XE and compile your code against it, though for development I don't think you would need as much data as there is in production.

Oracle: How to set up a database backup to a network drive

Oracle Database 11g. What is the easiest way to set up a full nightly database backup to a network drive (i.e., a drive on another computer)?
Read the backup and recovery guide.
Don't just back up... make sure you test your backups, regularly!
If you have Grid Control set up, the easiest way would be to use the web interface. The interface allows you to specify a UNC path (\\computer\share). You will have to create a share on the remote computer.
If you don't have Grid Control, you can create a script that uses RMAN. If you need script specifics, it would be helpful to know what OS your database is on.
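For a Linux host, such a script might look like the sketch below; the mount point is an assumption, the network share must already be mounted there, and the script assumes OS authentication as the Oracle owner:
#!/bin/sh
# Nightly full backup to a mounted network share (e.g. run from cron)
rman target / <<EOF
RUN {
  ALLOCATE CHANNEL c1 DEVICE TYPE DISK FORMAT '/mnt/backup_share/%U';
  BACKUP DATABASE PLUS ARCHIVELOG;
  RELEASE CHANNEL c1;
}
EOF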
You need a backup strategy. Once you have defined it, you will see which options are available to suit your needs.
Mount the network drive and use RMAN to fully back up the DB to it; RMAN can do it hot. Another method is a full export, but you'll have to flush all writes and put the DB into restricted mode, so no user can make changes while you export, to get a fully consistent backup.
