Oracle LOB Operation FILEOPEN Failed I/O Error

I have an issue where my DBMS_LOB open operation falls over and I receive the FILEOPEN failed I/O error in the title.
I have double-checked that the file exists, that the permissions on the file are open to everyone on the destination server, and that the database user has full access to the Oracle directory object.
I have seen similar issues where it states that the file does not exist or that the directory is invalid, but I have not seen an I/O error before.
Any guidance is appreciated. I have been on the server where the file is located and I can't see what might be amiss based on the error message, as the file definitely exists.
The Oracle version is 11.2, trying to read from a Windows Server 2019 machine.
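For illustration, here is a minimal sketch of the kind of DBMS_LOB call involved (MY_DIR and server_file.txt are placeholder names, not the real directory object or file):

DECLARE
  -- Placeholder directory object and file name, for illustration only
  l_bfile BFILE := BFILENAME('MY_DIR', 'server_file.txt');
BEGIN
  -- FILEEXISTS returns 1 when the database itself can see the file
  IF DBMS_LOB.FILEEXISTS(l_bfile) = 1 THEN
    DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
    DBMS_OUTPUT.PUT_LINE('Length: ' || DBMS_LOB.GETLENGTH(l_bfile));
    DBMS_LOB.FILECLOSE(l_bfile);
  ELSE
    DBMS_OUTPUT.PUT_LINE('File is not visible to the database');
  END IF;
END;
/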
Many Thanks,

Related

Data Pump export using cloud shell

I am trying to export a schema using Data Pump on an Oracle Cloud Autonomous Database.
I am using Cloud Shell to run the export.
When I tried to do the final step:
expdp admin/password#DB_HIGH schemas=SCHEMA_NAME directory=data_pump_dir dumpfile=exp%U.dmp filesize=1G logfile=expot.log
I got
UDE-12154: operation generated ORACLE error 12154 ORA-12154:
TNS:could not resolve the connect identifier specified
Do I need Oracle instant client to do export?
The Oracle client code uses one of three ways to look up connect data:
A flat file named tnsnames.ora
Oracle Names service
LDAP
When the complete ORA-12154 error appears with its text, your program has found a working Oracle client install; however, the specified Oracle service is not listed in tnsnames.ora, Oracle Names, or LDAP.
The first step in troubleshooting is to determine which name resolution method is deployed at your site. Most sites use tnsnames.ora, but enough use Oracle Names or LDAP that it's best to confirm this.
If you are not the database administrator, get in touch with the people managing your Oracle systems and find out which method you should be using. They may be able to guide you in fixing the problem in accordance with your site’s standards.
The client code decides which mechanism to use based on the file sqlnet.ora. This file and tnsnames.ora can usually both be found in the Oracle install directory ("ORACLE_HOME"), under network/admin/. This location may be overridden with the environment variable TNS_ADMIN.
If the sqlnet.ora file does not exist or does not specify a resolution method, then Oracle Net uses tnsnames.ora.
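For illustration, a name-resolution setup based on tnsnames.ora typically looks like the following (MYDB, the host, and the service name are placeholders, not values from the question; an Autonomous Database wallet ships its own pre-built tnsnames.ora entries):

sqlnet.ora:
NAMES.DIRECTORY_PATH = (TNSNAMES, EZCONNECT)

tnsnames.ora:
MYDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = db.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = mydb.example.com))
  )

With such an entry in place, a connect string of the form admin/password@MYDB should resolve.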
Example locations of Oracle networking files include:
Windows
ORANT\NET80\ADMIN
ORACLE\ORA81\NETWORK\ADMIN
ORAWIN95\NETWORK\ADMIN
ORAWIN\NETWORK\ADMIN
UNIX / Linux
$ORACLE_HOME/network/admin/
/etc/
/var/opt/oracle/
If you fix the naming issues, but you still see the ORA-12154 error, check the Oracle service to confirm that it’s available for connections. A power outage, server failure, or network connectivity issue will make this resource inaccessible. It’s also possible that scheduled maintenance or repairs of an unrelated Oracle issue may take that resource temporarily offline.
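As a quick check (MYDB again being a placeholder alias), tnsping confirms that the alias resolves and the listener answers, and a sqlplus login confirms the service actually accepts connections:

tnsping MYDB
sqlplus admin@MYDB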
Thanks

Experiencing issues using DBMS_CLOUD.GET_OBJECT in Oracle Cloud Infrastructure Autonomous Database Serverless

I am trying to create a DB link between two Autonomous Databases (Serverless) in OCI.
These are the steps I followed:
I created the necessary credentials for the user using dbms_cloud.create_credential.
Now I try to upload the wallet file (which I have stored in Object Storage) using "dbms_cloud.get_object". It produces the following error:
ORA-20000: ORA-29283: invalid file operation: nonexistent file or path [29434]
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 983
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 2622
ORA-06512: at line 2
If I use the wrong credential or change the URI, the errors the system produces are different. I believe Oracle is able to get to the object, yet it still produces this error.
Any ideas?
DBMS_CLOUD.GET_OBJECT can read data from an object store file and either return the contents as a BLOB or save them to a file in the given directory object in your Autonomous Database.
https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/dbms-cloud-subprograms.html#GUID-3DB888C9-18C7-4A26-8DA8-EDFB260E2B14
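For reference, a minimal sketch of the documented call shape (the credential name, object URI, and file name below are placeholders, not the poster's actual values):

BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'MY_CRED',        -- placeholder credential created via DBMS_CLOUD.CREATE_CREDENTIAL
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/cwallet.sso',
    directory_name  => 'DATA_PUMP_DIR',  -- directory object the file is written into
    file_name       => 'cwallet.sso');
END;
/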
It seems that you are trying to download a wallet file to a directory object in order to create a database link. Autonomous Database automatically provisions a database file system to store such files. Although the exact SQL syntax is not posted, the error indicates that the syntax itself is correct; it looks like the database file system is not accessible, which is an internal error in the service.
You could work around the issue by restarting the Autonomous Database. As this is an old question, the issue may already have been addressed by the automatic maintenance of Autonomous Database.
Out of curiosity, what region are you experiencing this in? Free tier or paid?
Ultimately, there is nothing wrong with the syntax you used, and this is not a usage error. Unfortunately, the issue you're experiencing is likely an internal error/bug that can only be fixed by OCI operations. I highly recommend submitting a service request.
If you have not submitted one in the past, you can read up on how to here - https://docs.cloud.oracle.com/en-us/iaas/Content/GSG/Tasks/contactingsupport.htm#3Openasupportservicerequest

SSIS project not working when reading from Oracle

I've been trying to create an SSIS project to read from an Oracle 11.x database to an SQL Server database.
When I set this up in Visual Studio 10 Shell, I do not receive any logs. It gives me a success message, but nothing happens.
I tried to connect to an Oracle 12c database and the same happened.
I tried to get data from an Oracle 11.x database and dump it into an Excel file. I also tried to get data from an Oracle 11.x table and dump it into a new Oracle 11.x table (in the same database), and in both cases I got the following error:
> TITLE: Microsoft Visual Studio
Failed to start project
------------------------------ ADDITIONAL INFORMATION:
Exception deserializing the package "The package failed to load due to
error 0xC0011008 "Error loading from XML. No further detailed error
information can be specified for this problem because no Events object
was passed where detailed error information can be stored.". This
occurs when CPackage::LoadFromXML fails. ".
(Microsoft.DataTransformationServices.VsIntegration)
The package failed to load due to error 0xC0011008 "Error loading from
XML. No further detailed error information can be specified for this
problem because no Events object was passed where detailed error
information can be stored.". This occurs when CPackage::LoadFromXML
fails. (Package)
------------------------------ BUTTONS:
OK
Can anyone help me please?
Thank you
You haven't posted exactly how you are trying to get data from Oracle, so I can't say much about the error. I can only give the solution I used with SQL Server 2008 R2:
Create an Oracle linked server in your SQL Server and then use OPENQUERY in the SSIS package to pull anything you need.
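A rough sketch of that approach (the linked server name ORA_LINK, the TNS alias, the login, and the table are placeholders, not details from the question):

-- Register the Oracle linked server (run on the SQL Server instance)
EXEC master.dbo.sp_addlinkedserver
     @server = N'ORA_LINK',
     @srvproduct = N'Oracle',
     @provider = N'OraOLEDB.Oracle',
     @datasrc = N'ORCL';            -- TNS alias known to the SQL Server machine

-- Map a local login to an Oracle account
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname = N'ORA_LINK',
     @useself = N'False',
     @rmtuser = N'scott',
     @rmtpassword = N'********';

-- In the SSIS package (e.g. an OLE DB source against SQL Server), pull rows through the link
SELECT *
FROM OPENQUERY(ORA_LINK, 'SELECT * FROM scott.emp');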

Oracle Database Copy Failed Using SQL Developer

A few days ago, while I was performing a database copy from a remote server to my local server, I got some warnings, and one of them was like this:
"Error occured executing DDL for TABLE:MASTER_DATA".
I then clicked Yes, but the result of the database copy was unexpected: only a few tables had been copied.
When I tried to see the DDL from the SQL section/tab of one of the tables, I got this kind of information:
-- Unable to render TABLE DDL for object COMPANY_DB_PROD.MASTER_DATA with DBMS_METADATA attempting internal generator.
I also got the message below, and I believe it showed up because there's something wrong with the DDL on my database, so the tables won't be created.
ORA-00942: table or view does not exist
I've never encountered this problem before, and I have performed this database copy every day for the past two years.
For the record, before this problem occurred I removed old .arch files manually rather than with RMAN (I have never used any RMAN commands). I also removed old .xml log files, because these two types of files had filled up my remote server's storage.
How can I trace and fix this kind of problem? Is there any corruption in my Oracle database?
Thanks in advance.
The problem was caused by a datafile that had reached its maximum size. I resolved it by following the answer in this discussion: ORA-01652: unable to extend temp segment by 128 in tablespace SYSTEM: How to extend?
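For anyone hitting the same thing, a rough sketch of the kind of check and fix involved (the file path below is a placeholder; adjust it to your own datafiles):

-- See how full each datafile is and whether it can autoextend
SELECT tablespace_name, file_name,
       bytes/1024/1024    AS size_mb,
       autoextensible,
       maxbytes/1024/1024 AS max_mb
FROM   dba_data_files;

-- Let a full datafile grow (or resize it explicitly)
ALTER DATABASE DATAFILE '/u01/app/oracle/oradata/ORCL/system01.dbf'
  AUTOEXTEND ON NEXT 128M MAXSIZE UNLIMITED;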
Anyway, thank you everyone for the help.

Can't import dump from mapped net drive using data pump

I'm trying to import a few users from a .dmp file on a network drive. Unfortunately, it seems that I lack some rights to do so, since I get:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "\\net\drive\directory\placeholder\my_dump.dmp" for read
ORA-27041: unable to open file
OSD-04002: unable to open file
O/S-Error: (OS 5) Access is denied.
I'm not sure why, because I can access that directory myself and, for example, save a .txt file there.
The directory is saved in the database as '\\net\drive\directory\placeholder'. The log file has a different directory specified (not on the network drive).
Is there any workaround to import this dump without actually moving it to a local drive? The dump is really big, I don't have nearly enough space for it, and I (probably) can't change my rights on this mapped drive either.
I also can't really make the dump any smaller.
On one site I found this advice: "Remember, your OS user ID may not be the ID that is running a submitted RMAN job, in an operating system, UNIX, Linux or Windows."
The suggested solution was:
"In the Control Panel services:
Right-click on the service
Select 'Properties'
Select 'Log On'
Change the default user ID to an Oracle user with Windows administrator privileges"
But I'm not sure what changing this would actually do to the server/database, and I'm working on a client's server, so I don't want to act rashly. I also don't want to reset the database or the server itself.
Any advice on what I should do?
The problem is that your Oracle instance is running under a different user account, one which doesn't have access to the network drive.
Unless you want to run Oracle under a different account, you can grant read access on your network share to the account the Oracle instance currently runs under (usually LocalSystem on Windows). Another option is to import the data directly from the source database over a database link, as sketched below (in that case you won't need the dump file at all).
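A rough sketch of the database-link route (the link name, credentials, TNS aliases, and schema are placeholders, not details from the question):

-- On the target database: create a link that points at the source database
CREATE DATABASE LINK source_link
  CONNECT TO source_user IDENTIFIED BY source_password
  USING 'SOURCE_TNS_ALIAS';

Then run the import against the target database, pulling the data over the link instead of reading a dump file:

impdp target_user/target_password@TARGET_TNS_ALIAS schemas=SCHEMA_NAME network_link=SOURCE_LINK directory=DATA_PUMP_DIR logfile=imp_from_link.log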
