I am trying to import data from a dump file created by Oracle 10g data pump utility. The command that I am issuing is
impdp \"username/password#DB as sysdba\" remap_schema=SRC_SCHEMA:TARGET_SCHEMA remap_tablespace=source_tablespace:target_tablespace DUMPFILE=db.dmp
I am getting the following error message:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
What is the cause of this error?
From the documentation:
ORA-39088: file name cannot contain a path specification
Cause: The name of a dump file, log file, or sql file contains a path specification.
Action: Use the name of a directory object to indicate where the file should be stored.
This suggests that the parameter you've shown as DUMPFILE=db.dmp is really something like DUMPFILE=C:\some\dir\path\db.dmp, which is not allowed. You have to use a directory that is recognised by the database and specify it with a DIRECTORY parameter.
As @ruffin notes from that DIRECTORY parameter link, you can put the dump file in the default DATA_PUMP_DIR directory, which you can find from the dba_directories view (or, if you have permission to use that object, the all_directories view). The user you're importing as has to have been granted read and write privileges on that directory object for you to be able to use it. You also need to be able to move your dump file into the underlying operating-system directory, so permissions may be an issue there too.
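For example, you can check where DATA_PUMP_DIR points on the server with a quick query (assuming you can read dba_directories):
SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';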
If you don't have a suitable directory object that you have database privileges for and operating-system access to, you'll need to create one and grant suitable privileges. This needs to be done by someone with the appropriate privileges, usually as SYS:
create directory my_data_pump_dir as 'C:\some\dir\path';
grant read, write on directory my_data_pump_dir to <username>;
Then the import is modified to have:
... DUMPFILE=db.dmp DIRECTORY=my_data_pump_dir
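So the full command, reusing the original connect string and the example directory object created above, would be something like:
impdp \"username/password#DB as sysdba\" remap_schema=SRC_SCHEMA:TARGET_SCHEMA remap_tablespace=source_tablespace:target_tablespace DUMPFILE=db.dmp DIRECTORY=my_data_pump_dir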
Note that the operating system directory has to be available to the Oracle user account (whoever is running the database processes, pmon etc.) on the database server. You cannot import to a remote database using a local file, unless the local directory is somehow mounted on the remote server. The old imp command was a client-side application that often ran on the server but didn't have to; impdp is a server-side application.
Related
In Informatica PowerCenter I got an error like: Writer initialization failed. Error opening output file. The system cannot find the path specified.
I checked the directories and file names, but I'm confused about what exactly is wrong.
It's exactly as it says: the Writer failed to initialize, as it was not able to locate the path and file specified.
Note that PowerCenter Workflows and Mappings are executed on the Server. So while you develop on your local laptop (for example) and place a file in the C:\Temp folder, and you are able to see the file, once you run the process, it will be executed on the Server. And the Server will not refer to your laptop; it will look for the C:\Temp location on its own local disk. And if that's a Unix box, there won't even be a C: path!
Hence, the process will fail with exactly the message you've seen: initialization failed, error opening output file. You need to place the file in the location accessible by Server.
In the case of the Writer, you name the target location where the file will be created; make sure the user PowerCenter runs as has write access to that location.
So, I'm currently working on a Spring Boot application that should import a database dump based on a config file. For that I build an impdp command as a String, which then gets executed on the command line. Since this application is based around directories and should work by just moving the dump file into a specific directory, I would like to use the current directory my application is running in. As far as I know, the DIRECTORY parameter only accepts Oracle directory objects (created in SQL with CREATE DIRECTORY name AS 'C:\path'). Is there any way I could use the directory from which the command gets executed, or just a Windows path?
I have lost my database on the server because the computer's C: drive has been formatted. But the Oracle folders, like oradata, were located on drive E:. Can I restore the Oracle database as it was before?
Well, first you need to install the Oracle server, using the same version that was there before formatting.
Follow these steps:
1) Install the same Oracle database version with the starter database.
2) Copy all control files and datafiles to the oradata directory.
3) Copy the init.ora file to the ADMIN folder.
4) Revise the init.ora file for the changed control file, archive, and dump locations (keep the instance, SID, and global db name the same).
5) Create the instance by running oradim with the pfile (init.ora) location.
6) Connect to the DB using internal and mount it.
7) Check the existing datafile and logfile locations by typing:
SELECT name FROM V$DATAFILE;
SELECT member FROM V$LOGFILE;
8) Change the file locations that come up above by using (see the example after these steps):
ALTER DATABASE RENAME FILE <old file loc list> TO <new file loc list>;
9) After renaming them, open the database with ALTER DATABASE OPEN.
And you should be set.
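For illustration, steps 5) through 9) might look like this on Windows (the SID and file paths here are purely hypothetical; "connect internal" is the old syntax, and / as sysdba is its modern equivalent):
oradim -NEW -SID orcl -PFILE E:\oracle\admin\orcl\pfile\init.ora
sqlplus / as sysdba
STARTUP MOUNT
SELECT name FROM V$DATAFILE;
ALTER DATABASE RENAME FILE 'C:\oradata\orcl\users01.dbf' TO 'E:\oradata\orcl\users01.dbf';
ALTER DATABASE OPEN;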
I understand this is a cliché, but while doing an export dump in Oracle I am getting these errors.
I have followed pretty much all of the documentation online and did the following steps:
CREATE OR REPLACE DIRECTORY export_meta as '/C:/oracle/'; (where C:/oracle is my local path)
GRANT READ,WRITE ON export_meta to HR
expdp username/password DIRECTORY=export_meta DUMPFILE=hr.dmp
But getting these errors:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE"
Can someone please tell me if this directory needs to be created on local and any idea as to why I am getting this error?
Most probably your path '/C:/oracle/' is wrong (it starts with "/"); try 'C:/oracle/' instead. If this is just a copy/paste error, then:
1) See which user started the oracle.exe process; most probably that user does not have access to write to that directory (to test this, try creating the Oracle DIRECTORY object in the $ORACLE_HOME folder, or pointing it at the datafile folder).
2) The file hr.dmp should not already exist in that directory when you start expdp.
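A minimal corrected sketch (assuming the folder C:\oracle exists on the database server and the account running the Oracle service can write to it):
CREATE OR REPLACE DIRECTORY export_meta AS 'C:\oracle';
GRANT READ, WRITE ON DIRECTORY export_meta TO hr;
expdp username/password DIRECTORY=export_meta DUMPFILE=hr.dmp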
I have a CGI Perl script that runs a select statement against an Oracle database to get records. The script runs on Apache with cgi-bin linked. It was running fine.
Due to a failover, we moved the hard disk to the backup server, which has a similar setup. When we run the script, however, the following error is shown:
install_driver(Oracle) failed: Can't load '/u02/system/perl/usr/local/lib64/perl5/auto/DBD/Oracle/Oracle.so' for module DBD::Oracle: libclntsh.so.11.1: cannot open shared object file: No such file or directory at /u02/system/perl/usr/lib64/perl5/DynaLoader.pm line 200.
I have checked my ORACLE_HOME and LD_LIBRARY_PATH variables and they are both pointing to the correct Oracle client. I also tried locate libclntsh.so.11.1 and managed to find the file in the correct directory, with permission granted.
I also added an oracle.conf file that has the path to the Oracle lib directory in /etc/ld.so.conf.d.
All this, and it's still showing the same error. I am running out of ideas....
Any pointers or suggestion are appreciated. Thank you.
You've covered the basics already, which is good: making sure the libraries exist, making sure your program has read/execute permissions, and setting the ORACLE_HOME and LD_LIBRARY_PATH environment variables.
You may need to check the dependencies recursively.
ldd /u02/system/perl/usr/local/lib64/perl5/auto/DBD/Oracle/Oracle.so
should show you any dependencies of the .so that can't be loaded. DBD::Oracle is also version sensitive to the Oracle client. If the version or location of the Oracle client has changed (or when all else fails), you may need to recompile DBD::Oracle.
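If it does come to rebuilding, a hedged sketch (the client path below is illustrative; point it at wherever your Oracle client actually lives):
export ORACLE_HOME=/u02/oracle/client    # hypothetical client location
export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
cpan -f DBD::Oracle    # force a rebuild against the client found via ORACLE_HOME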