is it possible to recover the Oracle Data Pump master table? - oracle

I'm trying to import a few files with a published Oracle Data Pump perl script: dumpinfo.pl
After successfully importing several dump files from the same export process, another file failed with:
# impdp system/****** DIRECTORY=RESTORE_DIR DUMPFILE=exp_%u.dmp PARALLEL=8
Import: Release 11.2.0.2.0 - Production on Mon Jul 7 11:40:37 2014
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39059: dump file set is incomplete
ORA-39246: cannot locate master table within provided dump files
The import reports that it can't find the master table. Assuming the master table is lost, is there any way to recover it?
Thanks...

ORA-39246 cannot locate master table within provided dump files
Cause: A Data Pump IMPORT or SQL_FILE operation was being performed but not
all of the files from the Data Pump export dump file set were included. In
particular, the dump file containing the export job's master table was not
provided.
Action: Check the export log file and make sure all of the files that were
exported are included in the current job.
Check that you have all the dump files in that directory and that they are all accessible to the impdp utility. I got this error when our backup team had restored only the first five dmp files and the last two were missing.
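If you want to sanity-check the file set before retrying, something along these lines can help (the OS path is a placeholder; look it up for your own RESTORE_DIR directory object, and compare the file count and sizes against the export log):
SQL> SELECT directory_path FROM dba_directories WHERE directory_name = 'RESTORE_DIR';
# then, on the database server, confirm every piece of the set is present and readable
ls -l /path/to/restore_dir/exp_*.dmp
chmod 644 /path/to/restore_dir/exp_*.dmp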

Related

Connecting to oracle database using tnsname.ora with liquibase

I want to use Liquibase to automate database deployments. In our setup the tnsnames.ora file is located on a shared drive: \\shared\drive\path\tnsnames.ora
As per the Liquibase docs, the liquibase.properties file should look something like this:
--driver=oracle.jdbc.OracleDriver
--classpath=ojdbc14.jar
--url="jdbc:oracle:thin:#<IP OR HOSTNAME>:<PORT>/<SERVICE NAME OR SID>"
--changeLogFile=db.changelog-1.0.xml
--username=<USERNAME>
--password=<PASSWORD>
Is there any way we can specify just the <SERVICE NAME OR SID>, have it resolved against the tnsnames.ora file on the network share, and connect to the required database?
Have you tried the Liquibase that is built into SQLcl? SQLcl version 20.4 has Liquibase 4.1.1 embedded, so you can connect to the database using the same syntax you would use for SQL*Plus and then run your Liquibase commands from there (see the note on TNS_ADMIN after the help output below for picking up the shared tnsnames.ora).
C:\Users\ej>sql scott/tiger@tnsalias
SQLcl: Release 20.4 Production on Fri Mar 05 11:07:13 2021
Copyright (c) 1982, 2021, Oracle. All rights reserved.
Last Successful login time: Fri Mar 05 2021 11:07:14 -05:00
Connected to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.6.0.0.0
SQL> lb version
Liquibase version: 4.1.1
Extension Version: 20.4.1.0
SQL> lb help
usage: lb COMMAND ...
Commands:
The following commands are available within the liquibase feature.
lb help COMMAND for command specific help
COMMAND
genobject Generate change log for a specific database object
genschema Generate changelogs and controller for connected schema
data Generate changelog for the data in tables
gencontrolfile Generate a blank controller.xml as a sample
update Updates database to current version
updatesql Generates SQL to update database to current version
rollback Rolls back the state requested
rollbacksql Writes SQL to roll back the database to the state requested
diff Writes description of differences between two databases to standard out.
dbdoc Generates Javadoc-like documentation based on current database and change log.
changelogsync Mark all changes as executed in the database.
clearchecksums Removes current checksums from database. On next update changesets that have already been
deployed will have their checksums recomputed, and changesets that have not been deployed will
be deployed.
listlocks Lists who currently has locks on the database changelog.
releaselocks Releases all locks on the database changelog.
status Outputs list of unrun change sets.
validate Checks the changelog for errors.
version Display product version information
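To pick up the tnsnames.ora on the shared drive, you can point SQLcl at it with the TNS_ADMIN environment variable before connecting. A rough sketch, assuming a hypothetical alias MYDB defined in that file:
C:\Users\ej>set TNS_ADMIN=\\shared\drive\path
C:\Users\ej>sql <USERNAME>/<PASSWORD>@MYDB
SQL> lb help update
Then run lb update against your db.changelog-1.0.xml with the arguments shown for your SQLcl version.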

Import dmp file to Oracle

My customer has provided a dmp file (10 GB), and I tried the following:
Create a user:
create user USERNAME identified by PASSWORD;
Grant read write access
Import the dump file (using imp and impdp)
impdp or imp system/password@db dumpfile=EXPDAT.DMP FULL=Y logfile=dice.log
and here's the error message:
Import: Release 18.0.0.0.0 - Production on Tue Feb 23 11:46:07 2021
Version 18.4.0.0.0
Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 18c Express Edition Release 18.0.0.0.0 - Production
ORA-39002: invalid operation
ORA-39059: dump file set is incomplete
ORA-39246: cannot locate master table within provided dump files
Can anyone help on that?
First, imp and impdp are not interchangeable; they have different file formats and options. You need to know exactly which was used to create the file you have.
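One rough way to check which tool produced the file (a heuristic, not an official interface) is to look at the start of it on a Linux/Unix box: a classic exp dump starts with a readable EXPORT:Vxx.xx.xx header, while an expdp (Data Pump) dump does not.
head -c 100 EXPDAT.DMP | strings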
Second, assuming the file was created with expdp (aka Data Pump, the more modern choice) and you should be using impdp to load it, the error indicates that the dump file set you were given is incomplete:
ORA-39246 cannot locate master table within provided dump files
Cause: A Data Pump IMPORT or SQL_FILE operation was being performed
but not all of the files from the Data Pump export dump file set were
included. In particular, the dump file containing the export job's
master table was not provided.
Action: Check the export log file and make sure all of the files that
were exported are included in the current job.
It seems likely that your customer has not provided you with the complete dump file set and you should have received additional files. This is possible if either the PARALLEL or FILESIZE option was used during the export. Confirm with them the number and size of the files you should have.
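If the set does turn out to have multiple pieces (for example because the export used PARALLEL with a %U substitution variable in the file name), the whole set is imported in one go. A sketch, where the file naming and the DATA_PUMP_DIR directory object are assumptions to adjust for your environment:
impdp system/<password>@db DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXPDAT%U.DMP FULL=Y LOGFILE=dice.log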

expdp issue ORA-39001: invalid argument value

Originally, the script was like:
C:\Windows\system32>expdp anna/12356@inst2
PARFILE=E:\expdp\db001\_db001.par
Export: Release 11.2.0.3.0 - Production on Mon Sep 3 22:59:30 2018
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 1002 ORA-39001: invalid argument value
Then I searched online and read all the similar cases, but none of them was exactly like mine. I had the directory, user privileges and everything else ready, but it still showed the error.
Then I tried to narrow down the root cause. When I entered just the following:
C:\Windows\system32>expdp anna/12356@inst2
the error suggested that the expdp utility could not recognize the username/password "anna/12356".
Does anyone know where this comes from and how to solve it?
I solved this issue by doing:
run catalog.sql (re-creates all the data dictionary views)
run catproc.sql
run utlrp.sql
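For reference, those scripts live under ORACLE_HOME/rdbms/admin and are normally run as SYSDBA; a sketch (in SQL*Plus, ? expands to ORACLE_HOME):
sqlplus / as sysdba
SQL> @?/rdbms/admin/catalog.sql
SQL> @?/rdbms/admin/catproc.sql
SQL> @?/rdbms/admin/utlrp.sql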

Errors on importing my oracle 10g database

I am running Oracle on a Debian machine. Today I decided to create a new user and a new tablespace. Then I exported a database as the SYSTEM user on the same machine. I got an error when I tried to import it into my new user account.
Here is what I've done:
./imp mynewuser/passwrdb@orcl file=newdump_sept.dmp system/tomynewuser
Import: Release 10.2.0.1.0 - Production on Thu Sep 29 18:06:23 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release
10.2.0.1.0 - Production With the Partitioning, OLAP and Data Mining options
Export file created by EXPORT:V10.02.01 via conventional path
Warning: the objects were exported by SYSTEM, not by you
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
IMP-00085: multiple input files specified for unbounded export file
IMP-00000: Import terminated unsuccessfully
Any suggestion to my problem?
From http://www.error-code.org.uk/view.asp?e=ORACLE-IMP-00085 :
Oracle Error :: IMP-00085
multiple input files specified for unbounded export file
Cause: You specified multiple file names for the FILE parameter when doing an
import, but the header in the export file indicates that the export operation
could create only one file. Specifying multiple file names is valid for an
import operation only if the export files were created by an export operation
in which the user specified a non-zero value for the FILESIZE parameter.
Action: If you believe the export contains multiple files, verify that you
have specified the correct files. If you believe the export should be in only
one file then try the import operation again, but specify only one value for
the FILE parameter.
You should probably use:
./imp mynewuser/passwrdb@orcl file=newdump_sept.dmp fromuser=system touser=tomynewuser
For help: imp help=y
In my case I solved the error by importing the dump file using the SYS user instead of the SYSTEM user, and the import worked without errors.
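If you go that route, note that SYS has to connect AS SYSDBA, so the connect string must be quoted (the exact escaping depends on your shell). A sketch based on the command above, with the password and target user as placeholders:
./imp \"sys/<password>@orcl as sysdba\" file=newdump_sept.dmp fromuser=system touser=<your new user>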

Error importing oracle dump

I am trying to import a dump into two schemas in the same Oracle DB, following a workaround to do this.
I am running the imp command with the INDEXFILE option so that I can modify the tablespace names in the generated SQL. This is what I get:
E:\oracle_10_2\BIN>imp atlantis/atlantis@orcl file=ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp indexfile=index.sql full=y log=imp.log
Import: Release 10.2.0.1.0 - Production on Thu Mar 12 15:31:44 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
IMP-00002: failed to open ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp for read
Import file: EXPDAT.DMP >
Looked like a file permission issue to me so I tried changing it.
E:\oracle_10_2\BIN>cacls E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp /p atlantis:F
Are you sure (Y/N)?y
processed file: E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp
E:\oracle_10_2\BIN>cacls E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp
E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp CORP\atlantis:F
But the problem persists.
If you are using Oracle 10g, consider using the newer export/import tool, Oracle Data Pump, which lets you use the REMAP_TABLESPACE parameter.
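A rough sketch of that approach (the dump would have to be produced with expdp; the dump file name, the DATA_PUMP_DIR directory object and the OLD_TS/NEW_TS tablespace names are placeholders):
impdp atlantis/atlantis@orcl DIRECTORY=DATA_PUMP_DIR DUMPFILE=ABCD1.dmp REMAP_TABLESPACE=OLD_TS:NEW_TS LOGFILE=imp.log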
Anyway, you missed the path of the export file (based on the commands you used to set the permissions). You wrote:
file=ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp
instead of
file=E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp
So the resulting import command should be:
imp atlantis/atlantis@orcl file=E:\ABCD1_EXCLUDE_CLOB_TABS_BAK.dmp indexfile=index.sql full=y log=imp.log
Open the Windows command line as administrator:
imp user/pass@databasename(listenername) file='DMP path' log=indexfile.log full=y
Don't forget to add your database or listener name.
