SQL Server Managed Backup to Azure Blob Storage

I'm trying to back up a local SQL Server instance to Azure Blob Storage. I have created the SAS credential:
CREATE CREDENTIAL [https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy]
WITH IDENTITY = 'Shared Access Signature',
SECRET = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'
In SQL Server, while trying to enable managed backup for the server using this command:
USE msdb;
GO
EXEC msdb.managed_backup.sp_backup_config_basic
@enable_backup = 1,
@database_name = 'TempDB',
@container_url = 'https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy',
@retention_days = 30
GO
I get this error:
Msg 45207, Level 17, State 17, Procedure managed_backup.sp_add_task_command, Line 102 [Batch Start Line 19]
The operation failed because of an internal error. The argument must not be empty string.
Parameter name: sasToken Please retry later.
at Microsoft.WindowsAzure.Storage.Core.Util.CommonUtility.AssertNotNullOrEmpty(String paramName, String value)
at Microsoft.WindowsAzure.Storage.Auth.StorageCredentials..ctor(String sasToken)
at Microsoft.SqlServer.SmartAdmin.SmartBackupAgent.FileService.VerifyContainerURL(String containerURL, SqlConnection conn)
at Microsoft.SqlServer.SmartAdmin.SmartBackupAgent.SmartBackup.ConfigureDbOrInstance(SmartBackupConfigParameters config, LogBaseService jobLogger, SqlConnection conn)
If we change @enable_backup to 0, the query executes correctly.
I have tried re-creating the SAS key, but it doesn't work.
PS: I'm currently using SQL Server 2017.

Try these steps:
1) Create a SQL Server credential using a shared access signature.
To create a SQL Server credential, follow these steps:
Connect to SQL Server Management Studio.
Open a new query window and connect to the SQL Server instance of the
database engine in your on-premises environment.
In the new query window, execute the following script:
USE master
CREATE CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
-- this name must match the container path, start with https, and must not contain a forward slash at the end
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'   -- this is a mandatory string and should not be changed
, SECRET = 'sharedaccesssignature'          -- this is the shared access signature key
GO
Example:
USE master
CREATE CREDENTIAL [https://msfttutorial.blob.core.windows.net/containername]
WITH IDENTITY='SHARED ACCESS SIGNATURE'
, SECRET = 'sharedaccesssignature'
GO
To see all available credentials, you can run the following statement
in a query window connected to your instance:
SELECT * from sys.credentials
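If a credential with the same name already exists but holds a bad secret, one possible cause of the empty sasToken error is a malformed SAS value (for example, a token pasted with its leading '?'). As a hedged sketch only (the container URL and token below are placeholders, not your real values), you could drop and re-create the credential with just the SAS token as the secret:
USE master;
IF EXISTS (SELECT 1 FROM sys.credentials
           WHERE name = 'https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy')
    DROP CREDENTIAL [https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy];
CREATE CREDENTIAL [https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     -- the SAS token itself; it typically should not include the leading '?'
     SECRET = 'sv=2021-06-08&sr=c&sp=rwdl&se=2025-01-01T00:00:00Z&sig=placeholder';
GO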
2) Create the backup. Modify the URL appropriately for your storage account name and the container, then execute this script.
-- To permit log backups, before the full database backup, modify the database to use the full recovery model.
USE master;
ALTER DATABASE AdventureWorks2016
SET RECOVERY FULL;
BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2016_onprem.bak';
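Once the database is in the full recovery model and a full backup exists, log backups can follow the same pattern; a minimal sketch (the container and file name are placeholders):
BACKUP LOG AdventureWorks2016
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2016_onprem.trn';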
3) Open Object Explorer and connect to Azure Storage using your storage account and account key. Expand Containers and verify that the backup appears in the container.
For more details, refer to these links: link1 and link2.

If anyone has experienced a similar issue with SQL Server, an update may solve your problem.
Symptoms
Assume that you disable "Allow Blob Public Access" on a storage account. In this situation, when you enable a managed or automated backup in Microsoft SQL Server 2016 or 2017, you receive the following error message:
"SQL Server Managed Backup to Microsoft Azure cannot configure the database 'DatabaseName' because a container URL was either not provided or invalid. It is also possible that your SAS credential is invalid."
Resolution
This problem is fixed in the following cumulative updates for SQL Server:
Cumulative Update 23 for SQL Server 2017
Cumulative Update 16 for SQL Server 2016 SP2
https://support.microsoft.com/en-us/topic/kb4589360-fix-error-when-you-enable-managed-or-automated-backup-in-sql-server-2016-and-2017-if-allow-blob-public-access-is-disabled-on-a-storage-account-d2315f8d-6a38-5d6c-6f02-3ab14c4b34cf
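To check whether your instance already includes that fix, you can inspect the build and CU level; a small sketch (the exact values returned depend on your installation):
SELECT SERVERPROPERTY('ProductVersion')     AS BuildNumber,       -- full build number of the instance
       SERVERPROPERTY('ProductUpdateLevel') AS CumulativeUpdate;  -- e.g. 'CU23' or later for SQL Server 2017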

Related

How do I create DSN of Azure databricks?

I'm trying to connect to Azure Databricks from another application. I need to create a Databricks DSN as per the connection steps mentioned in the application, so I'm creating the Databricks DSN following the steps given here, but I'm getting the following error message: [Simba][ThriftExtension] (14) Unexpected response from server during a HTTP connection: Unauthorized/Forbidden error response returned, but no token expired message received.
Sorry, I can't share the parameters I'm using, as they connect to the client's data. Can you suggest what the possible reason for this error could be?
I downloaded and installed the ODBC driver from this page.
Searched for ODBC Data Sources in the Start menu, opened it, and selected Simba Spark under System DSN.
Filled in the required details from Azure Databricks. To get them, we need to follow these steps:
Navigate to the Azure Databricks Compute page and select the cluster.
In the cluster configuration, open Advanced options and select the JDBC/ODBC tab.
Copy the Server Hostname, Port, and HTTP Path.
Go to the username menu at the top of the Databricks page and select User Settings.
Generate an access token.
Copy and save the access token.
Clicking the Simba Spark System DSN opens the configuration page.
Enter a data source name and description, enter the Databricks Server Hostname as the Host, enter the Databricks port, select the Username and Password option, enter token as the username, and enter the Databricks access token as the password.
Select HTTP and enter the Databricks HTTP Path as the HTTP path, then click OK.
After the above settings I tested the connection and got an error, so I opened the SSL options and enabled SSL.
I tested the connection again and it was successful.
This is how I created the DSN for Databricks.

ORA-12545: Network Transport: Unable to resolve connect hostname

Getting the following exception about once every two weeks in PROD when calling an Azure Function from a .NET Core 3.1 API.
The issue occurs when establishing the connection to the Oracle DB using ADO.NET, specifically when conn.Open() executes.
OracleConnection conn = new OracleConnection(connStr);
conn.Open();
OracleCommand cmd = new OracleCommand(strSQLQuery, conn)
{
CommandType = CommandType.Text
};
OracleDataReader odr = cmd.ExecuteReader();
Once the App Service is restarted from the Azure portal, the issue is resolved.
Error - ORA-12545: Network Transport: Unable to resolve connect hostname.
Please check whether the following can be worked around.
If you are using tnsnames.ora, make sure it is configured correctly. Check the tnsnames.ora file for prod and verify the server name. Also see whether the prod listener exists in listener.ora; it may be missing, so try to add a listener for the prod environment similar to the dev one, if present.
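For reference, a minimal tnsnames.ora entry generally looks like the following; the alias, host, port, and service name here are placeholders, not values from this environment:
PROD_DB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = prod-db.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = prodservice)
    )
  )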
If the connection string uses a DNS name, try replacing it with the IP address. Try to use the service name, since prod and dev usually have separate names, which can help even when the SID is the same (in some cases). Always ensure that the host field of local_listener is set to a name that the Oracle client can resolve.
Or
Try updating the hosts file to contain the required server name and port: go to \etc\hosts, insert the server name and server IP there, and save.
Also check the .NET version; it may be incompatible in some cases, which could be why the problem resolves itself after the Azure App Service is restarted. Sometimes the issue may also be due to firewall restrictions in the Azure portal.
If the issue still remains, you can raise a support request from the troubleshoot blade on the overview page.
Reference:
oracle - Stack Overflow

On Prem to Cloud with Data Factory

I have one On Prem Oracle database and one Azure SQL Database and want to use a Copy Data activity to transfer this data.
I have now created a Self Hosted IR for the Oracle database, and I am able to connect to it and preview data from Data Factory editor:
I have an Azure SQL Database that I want to receive the data, and it is set up with the AutoResolveIntegrationRuntime, with a successful connection. I am also able to preview data from this database:
When I try to run this Copy Data activity I get following error message:
ErrorCode=SqlFailedToConnect,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Database: 'sqlsrv', Database: 'database', User: 'user'. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access.
Based on all the docs/tutorials I have read, this should not be failing. I have tried opening up the SQL Server firewall rules to allow all IP addresses.
Any ideas what I might be doing wrong here?
Since the integration runtime is a bridge between on-prem and cloud, you need to check whether you can access both the on-prem database and the Azure SQL database from the VM on which the IR is installed.
When either the source or the sink uses the self-hosted runtime, the VM hosting the IR must be able to reach both the source and the sink for the copy activity to succeed.
So the issue is not with the Azure SQL database itself but with the VM hosting the IR.
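One common reason the IR VM cannot reach Azure SQL Database is that its outbound public IP is not allowed by the server firewall, which matches the hint in the error message. A hedged sketch of adding a server-level rule from T-SQL, run in the master database of the logical server (the rule name and IP below are placeholders for the IR VM's outbound IP):
EXECUTE sp_set_firewall_rule
    @name             = N'Allow_SelfHostedIR_VM',
    @start_ip_address = '203.0.113.10',
    @end_ip_address   = '203.0.113.10';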

Error connecting Oracle - ORA-12638: Credential retrieval failed

I am getting the following error while using a linked server in SQL Server to connect to an external Oracle DB:
Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "xxxx".
OLE DB provider "OraOLEDB.Oracle" for linked server "xxx" returned message "ORA-12638: Credential retrieval failed".
I am having this issue around 4 out of 5 times, so it works only sometimes.
In my case, the test and live Oracle DBs (external) are on the same physical server with different SIDs/DB instances.
The test connection using OraOLEDB works consistently; it's the live linked server that's the problem.
Also, to take the network out of the equation, I tried connecting to Oracle live from our test environment and Oracle test from our live environment. The connection to Oracle test works fine either way, and live doesn't.
I can connect fine to the application fronting the external Oracle DB using the live login credentials that are used for the live linked server, so to me that takes the login account out of the question.
Question 1: Is there any other way to connect to the external Oracle DB, either via SQL or C#?
I used openrowset as below and get an error:
SELECT *
FROM OPENROWSET('OraOLEDB.Oracle', 'Data Source=external_Oracle_serverIP;Initial Catalog=bbinstance;User id=xxx; Password=xx;',
'SELECT * FROM dbname')
I get the following error
OLE DB provider "OraOLEDB.Oracle" for linked server "(null)" returned message "ORA-12560: TNS:protocol adapter error". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "(null)".
Question 2: What am I doing wrong above?
When I've encountered this issue, it's because of the following line in sqlnet.ora:
SQLNET.AUTHENTICATION_SERVICES= (NTS)
This causes Oracle to attempt to use Windows Native Authentication Services.
If Oracle cannot authenticate via this method, you'll get the 12638 error. To troubleshoot, change this line to
SQLNET.AUTHENTICATION_SERVICES= (NONE)
and repeat your test to the live database.
Oracle 12c & Above:
Sometimes two Oracle pluggable database (PDB) services running under different root databases can be registered with the same listener, creating this chaos. Check and stop the unwanted PDB (if it is not prod) and try connecting to the required DB.
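As a hedged sketch of that check (the PDB name below is a placeholder, and closing a PDB is disruptive, so only do it on a non-production database), you can list the PDBs on the instance and close the unwanted one as a privileged user on the container database:
SELECT name, open_mode FROM v$pdbs;
ALTER PLUGGABLE DATABASE UNWANTED_PDB CLOSE IMMEDIATE;
The services actually registered with the listener can also be checked with lsnrctl status on the database server.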

How to connect to Redshift from RDS(sqlserver) via Linkedserver

I have created my RDS (SQL Server) instance and it works fine when I connect from SQL Server Express. Now I want to connect to the Redshift cluster via the client. Both clusters are in a VPC, and I have already set up the security groups for both.
I have created the linked server in my SQL Server Express as follows:
EXEC sp_addlinkedserver
@server=N'Staging64', -- Linked Server Display Name
@srvproduct=N'test', -- Default to SQL Server
@provider=N'SQLNCLI', -- Use SQLNCLI
@datasrc=N'52.167.514.101', -- FQDN of remote server running redshift
@catalog = 'test';
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'Staging64' -- Linked Server Display Name
, @useself = 'false' -- Do not masquerade
-- , @locallogin = '' -- Commented out to force all local users to use linked login
, @rmtuser = 'dev' -- Remote user name
, @rmtpassword = 'dev2015'; -- Remote password
The linked server is created without any issues, but when I execute this query:
SELECT TOP 10 * FROM Staging64.test.stage.dw_viasatsubscription;
it throws this error message:
OLE DB provider "SQLNCLI11" for linked server "Staging64" returned message "Login timeout expired".
OLE DB provider "SQLNCLI11" for linked server "Staging64" returned message "A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".
Msg 53, Level 16, State 1, Line 25
Named Pipes Provider: Could not open a connection to SQL Server [53].
I have googled a lot, but in vain.
Any feedback will be appreciated.
Amazon Redshift can export data to Amazon S3 with the UNLOAD command:
Unloading Data to Amazon S3
UNLOAD command
This data can then be imported into SQL Server, as with any CSV data.
There are some systems that can assist with this process (e.g. Amazon Elastic MapReduce, Amazon Data Pipeline), but they are probably overkill for your situation.
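As a rough sketch of that flow (the S3 bucket, IAM role, and file path below are placeholders, not values from the question): first unload the table from Redshift to S3 as CSV, then load the downloaded file into SQL Server.
-- On Redshift: export the table to S3 (placeholder bucket and IAM role).
UNLOAD ('SELECT * FROM stage.dw_viasatsubscription')
TO 's3://my-bucket/exports/dw_viasatsubscription_'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftUnloadRole'
FORMAT AS CSV
PARALLEL OFF;
-- On SQL Server: import the downloaded CSV file (placeholder local path and target table).
BULK INSERT dbo.dw_viasatsubscription
FROM 'C:\imports\dw_viasatsubscription_000.csv'
WITH (FORMAT = 'CSV');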
