Oracle .dmp dumps containing files

I have received a file named STP.dmp containing a database backup, which I have successfully restored into Oracle.
I have also received multiple additional .dmp files, named sequentially (e.g. STP-DOCS01.dmp, STP-DOCS02.dmp), which I expect to contain image and document files (perhaps in BLOB format, I don't know).
However, upon trying to restore this database using the following command:
impdp bkup_user/password directory=exp_table dumpfile=STP_DOCS01.DMP TABLE_EXISTS_ACTION=TRUNCATE
I get an error: ORA-39246, cannot locate master table within provided dump files. I get the same when I try to extract the database schema. The third party who provided the files swears that the DOCS backup completed correctly and that they've sent us all the files.
I've tried renaming the DOCS .dmp files so the names are in line with the main database file, in case they were part of the same export set, but that didn't change anything. I'm completely out of ideas about what else to try. Can anyone help?

If the files are sequentially named, it could mean that whoever did the export used parallelism and wrote to multiple files at once; that's why no single file contains a complete, self-contained part of the dump (including the master table). Try importing with the %U substitution variable so the whole set is read:
impdp bkup_user/password directory=exp_table dumpfile=STP_DOCS%U.DMP TABLE_EXISTS_ACTION=TRUNCATE
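For context, a parallel export that produces such a numbered set would have looked something like this (a sketch with hypothetical schema and table names; %U expands to 01, 02, ... as each worker opens a file):

expdp source_user/password directory=exp_table dumpfile=STP_DOCS%U.DMP logfile=stp_docs_exp.log parallel=4 tables=DOCS

With PARALLEL greater than 1, each worker writes to its own file and the master table lands in only one of them, which is why the import must be handed the entire set.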

Related

Open .dbf file from Oracle database

My instructor gave me a username, a password, and a .dbf file, and told me to open it and try to retrieve the data with SQL*Plus and an Oracle database.
I tried to open the .dbf file with Excel, MySQL, and MS SQL Server, but each gave me an error.
Speaking as a DBA: As Littlefoot stated, you can't just read a data file from an Oracle DB. At best they are proprietary binary file formats, assuming it isn't encrypted on top of that. Nor can you take a data file from one database instance and just plug it in to another database instance. You also can't import it to mySQL or any other database engine: as a stand-alone data file it can only be properly read by its original database installation (i.e. the specific database instance that created it).
Oracle has specific tools available to copy data and/or files from one database to another, but those would generally use the RMAN backup manager (used to make physical backups) or (more likely in your case) the Datapump "Transportable Tablespace" feature.
To restore it from an RMAN backup you would need a complete full backup of the entire source database instance: RMAN backup sets including all data files, redo logs (and perhaps archived logs), control files, parameter files, encryption keys, and possibly more.
To restore a transportable tablespace dump you would need your own running Oracle database instance, the correct parameters to run the impdp import utility, and the assistance/cooperation of the DBA.
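For illustration, a transportable tablespace import is driven by a small metadata dump plus the datafile itself, roughly like this (a sketch with hypothetical names and paths):

impdp system/password directory=dp_dir dumpfile=tts_meta.dmp transport_datafiles='/u01/oradata/ORCL/users01.dbf'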
You need to confirm if the file you were given is such an export dump (though the .dbf file extension would suggest not), and how you are expected to access the data. You won't be able to just "open the file".
The .DBF extension probably represents a datafile; I don't think you can read it with any tool (at least, I don't know of any).
You should find an Oracle DBA who might try to help; in order to restore a database (which is contained in that file), they might need the control file(s), redo log files and ... I can't name what other files (I'm not a DBA).
Then, if everything goes OK, the database might be started up so that you'd be able to connect to it using the credentials you were given.

Oracle Data Pump Transfer Between Databases

I have a specific need for data pump and I am having a hard time searching for a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database, and imports that same data into another database. This program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
Due to various reasons, I must migrate this program to use data pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run data pump.
The process will be run from the same common server, but the exported files will now reside on the database server for the source database. No issue there. I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it expects a data pump directory on the destination side (not the source side). Again, this is automated, and I don't have the luxury of being able to transfer dmp files using scp or ftp or anything like that.
What can I use to overcome this problem using Data Pump?
No reason you cannot configure an external directory on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then impdp and expdp will take mydumpdir as the DIRECTORY argument.
Make sure you grant the Oracle schemas/users permission to read/write to the directory, AND the oracle process account needs OS-level rights to read/write to that location as well. The expdp side should also have write access, as it may need to write log files to that location, or you might use it for file cleanup.
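A minimal sketch of the setup, assuming hypothetical names (mydumpdir, app_user) and that the path is visible to both servers (e.g. a shared mount):

-- as a privileged user, on both the source and destination databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
GRANT READ, WRITE ON DIRECTORY mydumpdir TO app_user;

-- export on the source, then import on the destination:
expdp app_user/password@source_db directory=mydumpdir dumpfile=transfer.dmp logfile=transfer_exp.log tables=MY_TABLE
impdp app_user/password@dest_db directory=mydumpdir dumpfile=transfer.dmp logfile=transfer_imp.log TABLE_EXISTS_ACTION=APPEND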

Accessing Created date of a CSV file using an Oracle External table

Situation
I have a CSV file called inventory.csv located on an Oracle database server (Windows Server 2008 R2 Enterprise Edition).
Every hour, a scheduled task (Windows Task Scheduler) executes a .bat file that copies over an updated version of inventory.csv, overwriting the original.
The data is then used by a reporting application.
Problem
The application that uses the data in inventory.csv has no way of knowing when the data was last updated.
Ideally, I'd like the "last updated date" to be accessible as a column in the table.
One possible solution is to trigger a logging of the current date/time in a separate file, and then reference that as an external table as well. However, this solution has too many moving parts, and I'd prefer something simpler, if possible.
I know that the CSV file itself knows when it was created...I'm wondering if there is any way for the Oracle external table to read the "Created" date from the CSV file properties?
Or any other ideas?
What version of Oracle?
If you are using 11.2 or later, you can use the preprocessor feature of external tables to run a shell script/batch file on the file before it is loaded. My bias would be to go for simplicity: have the preprocessor script grab the date, store it in a separate file, and have a separate external table that loads and exposes that data. That's likely easier than adding the date to every row.
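A rough sketch of that approach on Windows, assuming hypothetical names (directory objects data_dir and exec_dir, script get_date.bat). The preprocessor's standard output is what the external table loads, so the script records the timestamp as a side effect and then streams the CSV through unchanged:

@echo off
rem get_date.bat: save inventory.csv's last-modified timestamp, then emit the CSV
for %%F in (C:\data\inventory.csv) do echo %%~tF > C:\data\last_updated.txt
type C:\data\inventory.csv

Reference the script with PREPROCESSOR exec_dir:'get_date.bat' in the inventory table's ACCESS PARAMETERS, and expose the side file through a second, one-column external table:

CREATE TABLE inventory_last_updated (
  updated_at VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('last_updated.txt')
);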

Is there a possible way to load an Oracle .dmp file into SQL Server 2012?

I tried loading the .dmp file as a .bak file and restoring the database. It did not work; it throws the error below:
The media family on device is incorrectly formed (error 3241).
The only way I know to extract anything from a dmp file is to use the "INDEXFILE" parameter of IMP; this will generate a readable SQL script containing the DDL (it does not include the row data).
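For example (hypothetical file names; this is the legacy imp client, not impdp):

imp bkup_user/password file=STP.dmp indexfile=stp_ddl.sql full=y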
However, this script is often not 100% usable as it (usually) wraps the statements, so some pre-processing may be required: for example, parse the file into discrete statements (CREATE TABLE, CREATE INDEX, ...), join each statement onto a single line, then squirt it into the target database. Having said that, you would probably need to pre-process anyway to translate the Oracle dialect into SQL Server's.
Also, this approach might not be much good for BLOB/binary data.
The other, indirect way to do this would be to create a bridge Oracle database, import the file into it, and then use the normal extract-and-load tools to push the data into SQL Server.
A *.dmp file in Oracle is an export file, not a SQL Server backup; what you are describing is restoring an Oracle DB dump in SQL Server.
AFAIK, the answer is no: you can't do that directly. You could check whether there is any third-party utility with which you can perform such a DB migration.
The dmp file comes in an Oracle-specific format that cannot be parsed/interpreted by anything other than Oracle's imp tool. So that means you cannot import the dmp file into SQL Server.
Of course there are ways to transfer data from Oracle to SQL Server, but which one is optimal depends on your needs: the amount of data, the number of tables, the number of Oracle schemas, datatypes, etc.

How to use the exp command to export an Oracle DB with files in different disk locations

We hit a problem while trying to export an Oracle DB. OS: CentOS 5.2. DB: Oracle 10g.
The exp command exports DB files only in this location:
/home/oracle/OraHome_1/oradata/master/xxx.dbf
but the tool can't export files in a different location (we know about these files after getting a trace), like these:
'/disk1/dblog06.dbf',
'/home/disk2/system01.dbf',
Please advise me how to get a dump file of these, or how to back them up.
Thanks.
You appear to have misunderstood what exp does, and particularly what the file parameter is for. The file is the output dump file, normally given a .dmp extension. Export takes data out of the database instance, it does not work under the hood on the datafiles - you have to tell it which data you want (full, user, tables, or tablespaces) and where to put it, not where it comes from.
If you really did try to exp file=/home/disk2/system01.dbf then what you actually asked it to do was trash your database; you're lucky that it did not overwrite the datafile and cause a catastrophic failure. Oracle seems to have saved you from yourself there, though possibly only thanks to having exclusive locks on the files at the time.
You need to read up on how it works and see if it actually does what you want - as APC notes, it's not a backup tool. Look at the Oracle documentation for your version, or somewhere like http://www.orafaq.com/wiki/Import_Export_FAQ, and also look at using Data Pump instead of exp.
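For reference, a correct invocation names the output dump file rather than any datafile (a sketch with hypothetical credentials and paths):

exp system/password file=/home/oracle/exports/master_full.dmp log=/home/oracle/exports/master_full.log full=y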
I am not sure if that is the question, but the exp command will export database objects according to their logical schema (user name, table name). It does not matter which physical database file the data is coming from.
exp works through an Oracle instance, which needs to have mounted the datafiles.
Are these other files part of the Oracle database? Maybe another database? You need to find out which Oracle server uses them, and then run exp against that instance.
EXPORT is not a backup tool. It is meant for transferring data from one database to another, or perhaps from one schema to another.
If you want to recover your data in the event of a database crash or corruption then you need to use the appropriate tool. There are OS solutions to this, but Oracle comes with a sophisticated backup and recovery tool: RMAN. Find out more.
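As a rough illustration (assuming the database runs in archivelog mode and RMAN is configured with a backup destination), a basic full backup looks like this:

rman target /
RMAN> BACKUP DATABASE PLUS ARCHIVELOG;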
