Blob container doesn't allow the creation/reading of an external table. Is there any way to trace the exact problem?

About a month ago, all the external tables built on Parquet files (ADLS Gen2, Synapse) stopped working with the following error message:
Unexpected error encountered checking whether directory exists or not:
AbfsRestOperationException: Operation failed: "Server failed to
authenticate the request. Please refer to the information in the
www-authenticate header.", 401
The access key wasn't rotated, and even though I tried recreating the database scoped credentials and data sources, it didn't help.
Then I tried creating a new blob container with the same data and I was able to create external tables and run select statements over them.
Does anyone have a clue what the problem could be? At first I thought it was something on Azure's side, since it coincided with their reported Synapse problems. It may look like a SAS token issue, but if so, why can't I create other external tables over new SAS tokens? Besides, when a SAS token expires, it throws a 403, not a 401.
My guess is that it's something in the configuration of this specific blob container, or maybe the login I'm using (the admin login on the SQL dedicated pool).

From the error, it seems to be an authorization issue. If you are accessing the storage account from Synapse Studio, the managed identity of your Synapse workspace should have Storage Blob Data Contributor access on the storage account and container you are trying to access. Granting access using a SAS key is not the best option; use the managed identity of your Synapse workspace instead.
You can refer to step 4 of this link to achieve the same.
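A minimal sketch of what that looks like on a dedicated SQL pool, assuming the role assignment is already in place (the credential name, data source name, and abfss:// location below are placeholders):

-- Requires an existing database master key (CREATE MASTER KEY ...).
-- Credential backed by the workspace managed identity instead of a SAS token.
CREATE DATABASE SCOPED CREDENTIAL msi_cred
WITH IDENTITY = 'Managed Service Identity';

-- External data source pointing at the ADLS Gen2 container.
CREATE EXTERNAL DATA SOURCE parquet_source
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://mycontainer@myaccount.dfs.core.windows.net',
    CREDENTIAL = msi_cred
);

External tables recreated against a data source like this authenticate via the managed identity, so SAS rotation and expiry stop mattering.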

Related

Experiencing issues using DBMS_CLOUD.GET_OBJECT in Oracle Cloud Infrastructure Autonomous Database Serverless

I am trying to create a DB link between two Autonomous Databases (Serverless) in OCI.
List of steps I followed:
I created the necessary credentials for the user using DBMS_CLOUD.CREATE_CREDENTIAL.
Then I tried to download the wallet file (which I have stored in Object Storage) using DBMS_CLOUD.GET_OBJECT. It produces the following error:
ORA-20000: ORA-29283: invalid file operation: nonexistent file or path [29434]
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 983
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 2622
ORA-06512: at line 2
If I use a wrong credential or change the URI, the system produces different errors, so I believe Oracle is able to reach the object; yet it still produces this error.
Any ideas?
DBMS_CLOUD.GET_OBJECT can read data from an object store file and either return the contents as a BLOB or save the contents to a file in a given directory object in your Autonomous Database.
https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/dbms-cloud-subprograms.html#GUID-3DB888C9-18C7-4A26-8DA8-EDFB260E2B14
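For reference, a typical call that saves an object-store file into a directory object looks roughly like this (the credential name, URI, and file names below are made-up placeholders):

-- Hypothetical names throughout; the region, namespace, and bucket in the
-- URI must match your own tenancy. DATA_PUMP_DIR exists by default in ADB.
BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'OBJ_STORE_CRED',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/cwallet.sso',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'cwallet.sso'
  );
END;
/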
It seems that you are trying to download a wallet file to a directory object in order to create a database link. Autonomous Database automatically provisions a database file system to store such files. Although the exact SQL syntax is not posted, the error indicates that the syntax itself is correct; it looks as if the database file system is not accessible, which is an internal error in the service.
You could work around the issue by restarting the Autonomous Database. As this is an old question, the issue may also have been addressed by now through the automatic maintenance of Autonomous Database.
Out of curiosity, what region are you experiencing this in? Free tier or paid?
Ultimately, there is nothing wrong with your syntax or your usage. Unfortunately, the issue you're experiencing is likely an internal error/bug that can only be fixed by OCI operations, so I highly recommend submitting a service request.
If you have not submitted one in the past, you can read up on how to do so here: https://docs.cloud.oracle.com/en-us/iaas/Content/GSG/Tasks/contactingsupport.htm#3Openasupportservicerequest

Is there any way to set up permissions for an entire schema (all tables) in Hasura GraphQL instead of granting permissions to each table?

I want to grant permissions on an entire schema in Hasura GraphQL (all tables). Is it possible to grant permissions on all tables at once instead of table by table? There are currently no access control rules on the tables.
Please help.
You can't do it from the web console. If you have a large number of tables, it's probably worth scripting: introspect the tables from Postgres and then use Hasura's metadata API to set up the permissions, as sketched below.
Metadata API
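As a sketch of the introspection half, a query like this lists the tables whose names a script would then feed into metadata API calls such as pg_create_select_permission (the public schema is an assumption; adjust to yours):

-- Each row returned becomes one permission call against the metadata API.
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
  AND table_type = 'BASE TABLE';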

Issue with setting up Oracle REST Data Services

I have successfully installed Oracle REST Data Services version 3.0.4.60.12.48. I am able to access Oracle APEX, but when I create a web service and click the test button I get this error:
Error during evaluation of resource template: GET hr/employees/,
SQL Error Code: 28000,
SQL Error Message: ORA-28000: the account is locked
I have tried connecting as APEX_PUBLIC_USER and every other relevant user to check whether any account is locked, but I am able to connect to all of these accounts with SQL Developer.
I have also tried reinstalling and changing the default tablespace in the ORDS configuration files, but it's still not working.
If anyone has an idea what is going wrong, please help.
Thank you for the response.
I managed to solve the issue: the cause was that I was using the same password for APEX_PUBLIC_USER and SYS; changing the passwords fixed it.
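For anyone hitting the same ORA-28000, a quick way to see which account ORDS is tripping over is to check and unlock it directly; APEX_PUBLIC_USER below is an assumption, so substitute whichever account turns up locked:

-- Find locked accounts (requires DBA privileges).
SELECT username, account_status, lock_date
FROM dba_users
WHERE account_status LIKE '%LOCKED%';

-- Unlock the account and, if needed, reset its password to match the
-- one stored in the ORDS configuration.
ALTER USER APEX_PUBLIC_USER ACCOUNT UNLOCK;
ALTER USER APEX_PUBLIC_USER IDENTIFIED BY "new_password";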

OBIEE 12C authorization failed based on database view

In our environment (OBIEE 12C installed on Linux) we use an external authentication provider based on a view in our database. After logging in, authorization happens via a session variable initialised in an initialisation block in the RPD with a query based on the same view as for authentication.
What happened a few days ago is that people could log in (so the password saved in the view was being checked correctly), but authorization failed and they got the default value provided in the initialisation block.
Looking at the logs we found this error:
[nQSError: 17001] Oracle Error code: 3135, message: ORA-03135: connection lost contact
and
Query for Initialization Block 'AUTHORISATION' has failed.
Looking at the authorization logs, nothing bad was found, but we can't figure out how authentication worked while authorization didn't, since they are based on the same database view. Has anyone had this issue before?
We've faced this issue multiple times, but found that some users had correct authorization and authentication in between the errors.
Any help would be great, as well as pointers on where to look further to troubleshoot this. Thanks!
You're having connectivity issues with the database, so the initialisation block's query can't run: ORA-03135 means the connection was lost. It has nothing to do with OBIEE or the query itself; look instead at the network, connectivity, firewall, and so on.

Accessing unencrypted H2 database without credential knowledge

We are cleaning up servers for a customer and have stumbled upon an old application using an H2 database. While the accessing applications have credentials in their configuration files, none of them seem to work.
Even the "sa" user access is not known. As far as I can see, the password for "sa" defaults to an empty string, but access with "sa"/"" is denied (Wrong user name or password [28000-182] 28000/28000 (Help)).
As said, the database is not encrypted. Looking at the file, I can see the SQL statements for the tables, even some table contents.
Is there any way to gain access to that database? As far as my searches have shown, it's only possible using the "sa" user. I'm looking for something along the lines of MySQL's --skip-grant-tables.
The easiest solution is probably the following; a sketch of steps 2-4 is shown after the list.
1. Try to log in to the database without a password. This will fail (wrong user name or password), but it will run transaction log recovery so that the database is in a consistent state.
2. Use the Recover tool (org.h2.tools.Recover) to generate a SQL script.
3. Edit the script: change the password for the default user.
4. Run the script. That way you get a new database you can access.
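A rough sketch of those steps, assuming the database file is named mydb and the H2 jar is at hand (file names and the new password are placeholders):

-- Step 2 (run from the shell, not SQL):
--   java -cp h2-*.jar org.h2.tools.Recover
-- This writes mydb.h2.sql next to the database file.

-- Step 3: in mydb.h2.sql, replace the recovered user definition, e.g.
--   CREATE USER IF NOT EXISTS "SA" SALT '...' HASH '...' ADMIN;
-- with a known password:
CREATE USER IF NOT EXISTS "SA" PASSWORD 'newpassword' ADMIN;

-- Step 4: connect to a fresh database URL (e.g. jdbc:h2:./mydb_recovered)
-- and rebuild it from the edited script:
RUNSCRIPT FROM 'mydb.h2.sql';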
