Oracle 11g directory objects to remote path

We're in the process of migrating a system from Oracle 10g (Windows 2003, 32-bit) to 11g (Windows 2008 R2, 64-bit). Currently, our backup process uses a directory object that points to a remote (UNC) path on our storage box, so we don't have to run the expdp locally and then move the file; that works without issue. On the new Windows 2008 box with 11g, however, I can create the directory object and test it through the EM console, but whenever I try to run my import, I get the following:
Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
With the Automatic Storage Management option
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Any ideas? I'm just trying to test with CONTENT=METADATA_ONLY right now to troubleshoot this, but still no luck.
E:\>impdp xxxxx/xxxxx@prod CONTENT=METADATA_ONLY directory=Restoreloc dumpfile=XXXXX_MAY172011.DMP logfile=XXXXXIMPDP.log exclude=grant

Ensure that the Oracle process has write access to the directory.
or
Manually create the directory at the specified path, then retry your operation.
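For example, a minimal sanity check from SQL*Plus might look like the following (the UNC path is illustrative; the directory and user names are taken from the impdp command above). On Windows, the accounts running the OracleService and listener services must also be able to reach the share at the OS level, since Data Pump reads and writes files as those processes:
SQL> create or replace directory restoreloc as '\\storagebox\backups\oracle';
SQL> grant read, write on directory restoreloc to xxxxx;
SQL> select directory_name, directory_path from dba_directories where directory_name = 'RESTORELOC';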

Related

How to backup Oracle database from server

I connect to an Oracle database server to back up the database (a .dmp file) to my computer from the Windows command line.
I use this syntax:
expdp myuser/1234@[server_ip]:[serverport]/listener directory=my_data_pump_dir dumpfile=my_pump_backup.dmp nologfile=Y full=y
But I get this error message:
Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\app\User\admin\orcl\dpdump_DIR\my_pump_backup.dmp"
ORA-27040: file create error, unable to create file
OSD-04002: unable to open file
O/S-Error: (OS 3) The system cannot find the path specified.
What should I do?
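One thing worth checking first (a diagnostic sketch, not taken from the original post): Data Pump creates its dump and log files on the database server, not on the client you run expdp from, and OS error 3 means the server cannot find the path behind the directory object. The path can be confirmed from SQL*Plus:
SQL> select directory_name, directory_path from dba_directories where directory_name = 'MY_DATA_PUMP_DIR';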

Has anyone tried Oracle export (EXPDP) on a remote machine?

I am trying to write a dump file and log file to a remote machine using Oracle expdp.
However, I am getting the following error:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
The commands run on the remote machine (hostname 'Local') using the Oracle client are:
SQL> create directory expdp_dir as '/vault2/expdp_dir';
SQL> grant read,write on directory expdp_dir to dbuser;
expdp dbuser/dbpwd@SID SCHEMAS=dbuser DIRECTORY=expdp_dir DUMPFILE=testDB24NOV17.dmp logfile=testDB24NOV17.log EXCLUDE=STATISTICS
Note: /vault2 is mounted on the remote machine with hostname 'Local'. The database is on a machine with hostname TestDB.
The OS is RHEL6.
Any thoughts /ideas on making this operation successful would be appreciated.
Please check this, as per Oracle Doc ID 1305166.1:
The errors can have multiple causes. Known causes are listed below.
One of the usual reasons for this problem is that the listener process was not started under the same account as the database instance service. The listener forks the new server process, and when that process runs under a different security context than the database, access to directories and files is likely impacted.
Please verify the following information:
1) the output of:
ps -ef | grep SMON
2) the output of:
ps -ef | grep tnslsnr
3) the output of:
ps -ef|grep LIST
4) the output of:
ls -ld <directory_path>
Note:
When using ASM, the listener may have been started from the ASM Home instead of the RDBMS Home. Depending on your security settings, this may lead to this issue.
One more:
4. The directory path/folder exists, but CREATE DIRECTORY was executed by one database user and the import is run by a different user.
Solution:
1. Make sure the listener and instance services are started from the same account.
2. Make sure the directory is shared between nodes so that it can be accessed from any instance, or create a matching folder locally on each node; if a folder with the same directory path structure already exists locally on all nodes, check that the permissions are correct.
3. Make sure the folder exists as specified during creation in the "CREATE DIRECTORY" command.
4. Grant the required permission on the directory to the importing user:
grant read, write on directory <directory_name> to <user_name>;
If the above four possible causes and solutions do not apply in your environment, check whether the user has the required permissions to run the UTL_FILE package for the export.
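As a concrete illustration of points 3 and 4, using the object names from the question above (treat this as a sketch rather than text from the note):
ls -ld /vault2/expdp_dir
SQL> select directory_name, directory_path from dba_directories where directory_name = 'EXPDP_DIR';
SQL> grant read, write on directory expdp_dir to dbuser;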
Hope it helps.

Failure while exporting table from Oracle XE Database

We have XE running in a Docker container. When trying to export a table, I got the error below:
expdp test/test@XE tables=UserProfile directory=/tmp dumpfile=profile.dmp logfile=logger
ORA-39006: internal error
ORA-39213: Metadata processing is not available
I googled a bit and found that I needed to execute the command below, but that failed too:
execute dbms_metadata_util.load_stylesheets
ERROR at line 1:
ORA-31609: error loading file "kucolumn.xsl" from file system directory
"/u01/app/oracle/product/11.2.0/xe/rdbms/xml/xsl"
ORA-06512: at "SYS.DBMS_METADATA_UTIL", line 2397
ORA-06512: at line 1
I traversed to the directory and found that the "xsl" directory was missing. Is this directory created by default with an XE installation, or is a specific setting required to get the "xsl" folder?
XE was installed using the rpm oracle-xe-11.2.0-1.0.x86_64.rpm. Any idea what could be the issue?
The value you specify as directory needs to be an Oracle database directory object, not a path on your file system.
create directory export_directory as '/tmp';
expdp test/test@XE tables=UserProfile directory=export_directory dumpfile=profile.dmp logfile=logger
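If the directory object is created by a privileged account rather than by TEST itself, the exporting user also needs a grant on it (this is an assumption about the setup, not something stated in the question):
SQL> grant read, write on directory export_directory to test;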
There is a 2011 discussion about this at https://community.oracle.com/thread/2278841. It says you have to copy the directory $ORACLE_HOME/rdbms/xml/xsl from a working installation. So the problem seems to be a known one, and if you do not have a working installation you are out of luck.
The problem does not seem to be limited to Linux (I used the same rpm as the OP), as the discussion says the working installation can be “even a Linux one.”
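A sketch of that workaround, assuming you have a working installation to copy from (the source path is illustrative; the target path is the one from the error message):
cp -r /path/to/working_home/rdbms/xml/xsl /u01/app/oracle/product/11.2.0/xe/rdbms/xml/
sqlplus / as sysdba
SQL> execute dbms_metadata_util.load_stylesheets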

Errors on importing Oracle backup

I have created a directory:
SQL> create or replace directory dir as '/home/oracle12c/Desktop/latest_exp.dmp';
I am trying to run an import, as sysdba:
impdp \'/as sysdba\' full=Y directory=dir dumpfile=importDump.dmp logfile=import.log;
I am getting these errors:
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics, and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
As far as I can see from the results of my search, it has to do with permissions: the Oracle user should own the files.
The permissions on the file I am trying to import look like this:
-rw-r--r-- 1 oracle12c root 2201247744 Feb 21 16:51 latest_exp.dmp
What is my problem here, and how do I overcome it? Is there a problem when the owner of the file is oracle12c? Some solutions on the internet say that the files must be owned by the user trying to access them, which in my case is sysdba. But I cannot change the owner of my file to sysdba, as that user does not exist on my Ubuntu machine.
Also, we specify the dumpfile and logfile, but we do not specify their location. Where will it look for and create these files? I assumed it would look for them under /u01/app/oracle/oradata/orcl, where orcl is my SID.
I have not been able to solve my problem with the help of Google and Oracle Doc.
I am running on Ubuntu.
The Oracle directory object should point to an operating system directory, not a file; instead of:
create or replace directory dir as '/home/oracle12c/Desktop/latest_exp.dmp'
it should just be:
create or replace directory dir as '/home/oracle12c/Desktop'
The dump file should sit in that directory, and by default the log and bad files will be created in there too.
At the moment you're asking it to create the log file under the "directory" /home/oracle12c/Desktop/latest_exp.dmp, i.e. /home/oracle12c/Desktop/latest_exp.dmp/import.log - which clearly isn't a valid path; and also to look for the file importDump.dmp under that not-a-directory.
Your command line also looks like it has the wrong dump file name; it should be:
impdp ... directory=dir dumpfile=latest_exp.dmp logfile=import.log
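Putting the two fixes together, the corrected sequence would be something like this (a sketch based on the paths and names shown above):
SQL> create or replace directory dir as '/home/oracle12c/Desktop';
impdp \'/as sysdba\' full=Y directory=dir dumpfile=latest_exp.dmp logfile=import.log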

Oracle tns listener error

I've just installed Oracle 10g. When I try to connect to the Oracle DB, I get an error:
could not start OracleOraHome92TNSListener
When I go to Services and try to start it, it says that the file doesn't exist. The service file is C:\oracle\ora92\BIN\TNSLSNR (TNSLSNR is a file, not a directory).
C:\oracle\ora92\BIN\TNSLSNR doesn't exist on my machine at all. Do you know how to get it?
Could not start the Oracle Ora92 Listener service on Local Computer. Error 2: The system cannot find the file specified
Here's a couple of issues I see. You say you installed 10g, but the error is a 9.2 error. It could be that your computer already had an Oracle 9i installation on it that was misconfigured or uninstalled, and that is leading to the error.
You need to check your disk and find the ORACLE_HOME directory where Oracle 10g was installed. Once you find it, you can adjust the PATH, ORACLE_HOME, and TNS_ADMIN environment variables to point to the right place. This should allow you to start the database and the listener.
If you need to install the Oracle Client for 10g, then the information below will be helpful as well.
The Oracle client can be installed separately. Just go to this address, download the client, unzip it into a subdirectory, and then run the Oracle Universal Installer by running setup.exe from that directory.
Oracle Downloads Page
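For example, pointing the environment at a 10g home from a Windows command prompt might look like this (the home path is illustrative; substitute the directory you actually find on disk):
set ORACLE_HOME=C:\oracle\product\10.2.0\db_1
set PATH=%ORACLE_HOME%\bin;%PATH%
set TNS_ADMIN=%ORACLE_HOME%\network\admin
lsnrctl start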
