Experiencing issues using DBMS_CLOUD.GET_OBJECT in Oracle Cloud Infrastructure Autonomous Database Serverless

I am trying to create a DB link between two Autonomous Databases (Serverless) in OCI.
These are the steps I followed:
I created the necessary credentials for the user using DBMS_CLOUD.CREATE_CREDENTIAL.
Now, when I try to download the wallet file (which I have stored in Object Storage) using DBMS_CLOUD.GET_OBJECT, it produces the following error:
ORA-20000: ORA-29283: invalid file operation: nonexistent file or path [29434]
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 983
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 2622
ORA-06512: at line 2
If I use the wrong credential or change the URI, the errors the system produces are different, so I believe Oracle is able to reach the object, yet it still produces this error.
Any ideas?

DBMS_CLOUD.GET_OBJECT can read data from an object store file and either return the contents as a BLOB or save the contents to a file in a given directory object in your Autonomous Database.
https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/dbms-cloud-subprograms.html#GUID-3DB888C9-18C7-4A26-8DA8-EDFB260E2B14
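For reference, a typical call to pull a wallet file from Object Storage into a directory object looks like this (a minimal sketch; the credential name, object URI, and directory object below are placeholders, not values from the question):

BEGIN
  -- Credential created earlier with DBMS_CLOUD.CREATE_CREDENTIAL (name is a placeholder)
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'OBJ_STORE_CRED',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/cwallet.sso',
    directory_name  => 'DATA_PUMP_DIR');
END;
/

When a call like this fails with ORA-29283 even though the credential and URI check out, the problem is on the database file system side rather than the object store side, which matches the behaviour described in the question.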
It seems that you are trying to download a wallet file to a directory object in order to create a database link. Autonomous Database automatically provisions a database file system to store such files. Although the exact SQL syntax is not posted, the error indicates that the syntax itself is correct: it appears instead that the database file system is not accessible, which is an internal error in the service.
You could work around the issue by restarting the Autonomous Database. As this is an old question, the issue may have been addressed by now through the automatic maintenance of Autonomous Database.

Out of curiosity, what region are you experiencing this in? Free tier or paid?
Ultimately, there is nothing wrong with your syntax or your usage. Unfortunately, the issue you're experiencing is likely an internal error/bug that can only be fixed by OCI operations. I highly recommend submitting a service request.
If you have not submitted one in the past, you can read up on how to do so here - https://docs.cloud.oracle.com/en-us/iaas/Content/GSG/Tasks/contactingsupport.htm#3Openasupportservicerequest

Related

Blob container doesn't allow the creation / reading of an external table. Is there any way to trace the exact problem?

About a month ago, all the external tables built on Parquet files (ADLS Gen2, Synapse) stopped working with the following error message:
Unexpected error encountered checking whether directory exists or not:
AbfsRestOperationException: Operation failed: "Server failed to
authenticate the request. Please refer to the information in the
www-authenticate header.", 401
The access key wasn't rotated, and even though I tried recreating the database-scoped credentials and data sources, that didn't change anything.
Then I tried creating a new blob container with the same data, and I was able to create external tables and run SELECT statements over them.
Does anyone have a clue what the problem could be? At first I thought it was something on Azure's side, because coincidentally they were having problems with Synapse at the time. It may seem like it's the SAS token, but if so, why am I not allowed to create other external tables over new SAS tokens? Plus, when a SAS token expires, it throws a 403, not the 401 seen here.
My guess is that it is something in the configuration of this specific blob container, or maybe the login that I'm using (the admin login on the SQL dedicated pool).
From the error, this appears to be an authorization issue. If you are accessing the storage account from your Synapse Studio, the managed identity of your Synapse workspace should have Storage Blob Data Contributor access on the storage account and container you are trying to access. Granting access using a SAS key is not the best option; rather, use the managed identity of your Synapse workspace, as sketched below.
You can refer to step 4 of this link to achieve the same.
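As a rough illustration, moving from a SAS-based credential to the workspace managed identity looks something like this in a dedicated SQL pool (a sketch; the credential, data source, container, and account names are placeholders, and it assumes the managed identity already holds Storage Blob Data Contributor on the container):

-- Recreate the credential to use the workspace managed identity (names are placeholders)
CREATE DATABASE SCOPED CREDENTIAL msi_cred
WITH IDENTITY = 'Managed Service Identity';

-- Point the external data source at the same container, now authenticated via the credential
CREATE EXTERNAL DATA SOURCE parquet_src
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net',
    CREDENTIAL = msi_cred
);

External tables recreated over a data source like this no longer depend on SAS token lifetimes at all.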

Oracle Lob Operation FILEOPEN Failed I/O Error

I have an issue where my DBMS_LOB FILEOPEN operation falls over with the I/O error in the title.
I have double-checked that the file exists, that the permissions on the file are open to everyone on the destination server, and that the database user has full access to the Oracle directory object.
I have seen similar issues where the error states that the file does not exist or the directory is invalid, but I have not seen an I/O error before.
Any guidance is appreciated. I have been on the server where the file is located and, based on the error message, I can't see what might be amiss, as the file definitely exists.
The Oracle version is 11.2, trying to read from a Windows Server 2019 machine.
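For reference, the failing operation is essentially the following (a simplified sketch rather than my exact code; the directory object and file name are placeholders):

DECLARE
  -- MY_DIR and data.txt stand in for the real directory object and file
  l_bfile BFILE := BFILENAME('MY_DIR', 'data.txt');
BEGIN
  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);  -- this is the call that fails
  -- ... reads via DBMS_LOB.READ would follow here ...
  DBMS_LOB.FILECLOSE(l_bfile);
END;
/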
Many Thanks,

Data Pump export using cloud shell

I am trying to export a schema using Data Pump on an Oracle Cloud Autonomous Database.
I am using Cloud Shell to export the schema.
When I tried to do the final step:
expdp admin/password@DB_HIGH schemas=SCHEMA_NAME directory=data_pump_dir dumpfile=exp%U.dmp filesize=1G logfile=export.log
I got
UDE-12154: operation generated ORACLE error 12154 ORA-12154:
TNS:could not resolve the connect identifier specified
Do I need the Oracle Instant Client to do the export?
The Oracle client code uses one of three ways to look up connect data:
A flat file named tnsnames.ora
Oracle Names service
LDAP
When the complete ORA-12154 error appears with the text line, your program has found a working Oracle client installation. However, the specified Oracle service is not listed in tnsnames.ora, Oracle Names, or LDAP.
The first step in the troubleshooting process is to determine which name resolution method is deployed at your site. Most sites use tnsnames.ora, but enough use Oracle Names and LDAP that it's best to confirm this.
If you are not the database administrator, get in touch with the people managing your Oracle systems and find out which method you should be using. They may be able to guide you in fixing the problem in accordance with your site's standards.
The client code decides which mechanism to use based on the file sqlnet.ora. This file and tnsnames.ora can usually both be found in the Oracle install directory ("ORACLE_HOME"), under network/admin/. This location may be overridden with the environment variable TNS_ADMIN.
If the sqlnet.ora file does not exist or does not specify a resolution method, then Oracle Net uses tnsnames.ora.
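As an illustration, a minimal pair of these files might look like the following (a sketch; the host and service name are placeholders, not values taken from the question):

# sqlnet.ora -- resolve names via tnsnames.ora first, then LDAP
NAMES.DIRECTORY_PATH = (TNSNAMES, LDAP)

# tnsnames.ora -- a hypothetical entry matching the DB_HIGH alias used above
DB_HIGH =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCPS)(HOST = adb.example.oraclecloud.com)(PORT = 1522))
    (CONNECT_DATA = (SERVICE_NAME = mydb_high.adb.oraclecloud.com))
  )

If the alias after the @ in your expdp command has no matching entry in whichever source is configured, you get exactly the ORA-12154 above. For Autonomous Database, these files ship inside the downloadable wallet ZIP, so pointing TNS_ADMIN at the unzipped wallet directory is the usual fix in Cloud Shell; Cloud Shell already bundles the Oracle client tools, so a separate Instant Client install should not be needed.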
Example locations of Oracle networking files include:
Windows
ORANT\NET80\ADMIN
ORACLE\ORA81\NETWORK\ADMIN
ORAWIN95\NETWORK\ADMIN
ORAWIN\NETWORK\ADMIN
UNIX / Linux
$ORACLE_HOME/network/admin/
/etc/
/var/opt/oracle/
If you fix the naming issues but still see the ORA-12154 error, check the Oracle service to confirm that it's available for connections. A power outage, server failure, or network connectivity issue can make this resource inaccessible. It's also possible that scheduled maintenance or repair of an unrelated Oracle issue may take that resource temporarily offline.
Thanks

Oracle Database Copy Failed Using SQL Developer

A few days ago, while I was performing a database copy from a remote server to a local server, I got some warnings, and one of them was like this:
"Error occured executing DDL for TABLE:MASTER_DATA".
I then clicked Yes, but the result of the database copy was unexpected: only a few tables had been copied.
When I tried to view the DDL from the SQL section/tab of one of the tables, I got this kind of information:
-- Unable to render TABLE DDL for object COMPANY_DB_PROD.MASTER_DATA with DBMS_METADATA attempting internal generator.
I also got the message below, and I believe it showed up because there's something wrong with the DDL in my database, so the tables won't be created:
ORA-00942: table or view does not exist
I've never encountered this problem before, and I have performed a database copy every day for the past two years.
For the record, before this problem occurred I had removed old .arch files manually, not via RMAN (I never use any RMAN commands). I had also removed old .xml log files, because these two types of files had filled up my remote server's storage.
How can I trace and fix this kind of problem? Is there any corruption in my Oracle database?
Thanks in advance.
It turned out the problem was caused by a datafile that had reached its maximum size. I resolved it by following the answer in this discussion: ORA-01652: unable to extend temp segment by 128 in tablespace SYSTEM: How to extend?
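For reference, the kind of check and fix described in that answer looks like this (a sketch; the tablespace name, datafile path, and sizes are placeholders):

-- How full is each datafile, and can it grow?
SELECT file_name,
       bytes / 1024 / 1024    AS size_mb,
       autoextensible,
       maxbytes / 1024 / 1024 AS max_mb
FROM   dba_data_files
WHERE  tablespace_name = 'SYSTEM';

-- Allow the affected datafile to grow (path and sizes are placeholders)
ALTER DATABASE DATAFILE '/u01/app/oracle/oradata/ORCL/system01.dbf'
  AUTOEXTEND ON NEXT 128M MAXSIZE UNLIMITED;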
Anyway, thank you everyone for the help.

Invalid privilege error?

Two related applications use a function in a package in several queries to return some data as CSV. The column being selected and concatenated is a CLOB field and can contain HTML, special characters, etc. The applications have few users and so are not heavily used. One is a Flex application that consumes Oracle HTTP services, and the other is an ASP.NET application that uses ODP.NET. The two are really one integrated application, with hyperlinks between them.
Yesterday, I received several notifications of a strange error:
ORA-01031: insufficient privileges ORA-06512
The line number in the package in the error details indicates that the error was raised by the function being used in the SELECT clause. It occurred about 75% of the time when called by either application.
Am I correct that an ORA-06512 occurred in the function, which then surfaced as an ORA-01031 insufficient privileges error? Unfortunately, ORA-06512 is a very generic error and doesn't tell me anything. And why would it cause an insufficient privileges error? The Oracle user accounts used by both applications have the EXECUTE privilege on the package that contains the function.
Regarding the function: it has been used for about two years in production without any issue. Also, when I imported the data into QA yesterday and tested it, no error occurred, no matter how many times I hammered the server with requests. But in production, the error occurred about 75% of the time with exactly the same parameters.
The DBA tried to help me with a trace, but we could not find the error message in the trace files.
Today, everything is back to normal in production. Even if I hammer the server with requests, the error stubbornly refuses to occur.
What caused this very strange behaviour yesterday? Do any of the gurus here have any ideas?
EDIT: I just realized one important detail. The column in the table that is being selected and concatenated into CSV by the function is a CLOB.
If the client applications were running "SELECT clob_to_csv(clob_col) FROM ..." and it returned an insufficient privileges error only SOMETIMES, then it is probably something the function itself is trying to do, rather than the SELECT statement lacking the privilege to execute the function.
It's not quite clear what the function might do that would require a privilege. Does it use a file (UTL_FILE) or a network connection / web service?
It could also be some sort of odd data (a very large CLOB, perhaps). One way to narrow it down when the error recurs is to log the full error stack inside the function, as sketched below.
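A minimal sketch of that logging idea, assuming the function is called clob_to_csv as in the hypothetical query above (the body shown is a placeholder):

-- Wrap the existing body so the exact failing line is captured when the error recurs.
CREATE OR REPLACE FUNCTION clob_to_csv (p_clob IN CLOB) RETURN CLOB IS
  l_result CLOB;
BEGIN
  -- ... existing concatenation/escaping logic goes here ...
  RETURN l_result;
EXCEPTION
  WHEN OTHERS THEN
    -- FORMAT_ERROR_BACKTRACE reports the exact line that raised ORA-01031
    DBMS_OUTPUT.PUT_LINE(DBMS_UTILITY.FORMAT_ERROR_STACK);
    DBMS_OUTPUT.PUT_LINE(DBMS_UTILITY.FORMAT_ERROR_BACKTRACE);
    RAISE;
END;
/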
