How to connect to Redshift from RDS (SQL Server) via Linked Server - insert

I have created my RDS (SQL Server) instance and it works fine when I connect from SQL Server Express. Now I want to connect to the Redshift cluster via the client. Both clusters are in a VPC and I have already set up the security groups for both.
I have created the linked server in my SQL Server Express as follows:
EXEC sp_addlinkedserver
    @server = N'Staging64',        -- Linked Server Display Name
    @srvproduct = N'test',         -- Default to SQL Server
    @provider = N'SQLNCLI',        -- Use SQLNCLI
    @datasrc = N'52.167.514.101',  -- FQDN of remote server running Redshift
    @catalog = 'test';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'Staging64',     -- Linked Server Display Name
    @useself = 'false',            -- Do not masquerade
    -- @locallogin = '',           -- Commented out to force all local users to use linked login
    @rmtuser = 'dev',              -- Remote user name
    @rmtpassword = 'dev2015';      -- Remote password
The linked server is created without any issues, but when I execute this query:
SELECT TOP 10 * FROM Staging64.test.stage.dw_viasatsubscription;
it throws this error message:
OLE DB provider "SQLNCLI11" for linked server "Staging64" returned message "Login timeout expired".
OLE DB provider "SQLNCLI11" for linked server "Staging64" returned message "A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".
Msg 53, Level 16, State 1, Line 25
Named Pipes Provider: Could not open a connection to SQL Server [53].
I have googled a lot, but in vain.
Any feedback will be appreciated.

The SQLNCLI provider is the SQL Server Native Client, which only speaks the SQL Server protocol; it cannot log in to Redshift (which uses the PostgreSQL protocol), hence the login timeout. An alternative is to move the data through Amazon S3: Amazon Redshift can export data to Amazon S3 with the UNLOAD command:
Unloading Data to Amazon S3
UNLOAD command
This data can then be imported into SQL Server, as with any CSV data.
There are some systems that can assist with this process (e.g. Amazon Elastic MapReduce, Amazon Data Pipeline), but they are probably overkill for your situation.
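As a rough sketch of that approach (the S3 bucket, IAM role, local file path, and target table below are placeholders, not values from the question):

-- In Redshift: export the table to S3 as pipe-delimited text
UNLOAD ('SELECT * FROM stage.dw_viasatsubscription')
TO 's3://my-staging-bucket/dw_viasatsubscription_'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftUnloadRole'
DELIMITER '|'
ALLOWOVERWRITE
PARALLEL OFF;

-- In SQL Server: after copying the unloaded file down from S3, load it into an existing table
BULK INSERT dbo.dw_viasatsubscription
FROM 'C:\staging\dw_viasatsubscription_000'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');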

Related

SQL Server Managed Backup to Azure

I'm trying to back up SQL Server (local) to Azure Blob Storage. I have created the SAS credential:
CREATE CREDENTIAL [https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy]
WITH IDENTITY = 'Shared Access Signature',
SECRET = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'
In SQL Server, while trying to enable managed backup for the server using this command:
USE msdb;
GO
EXEC msdb.managed_backup.sp_backup_config_basic
    @enable_backup = 1,
    @database_name = 'TempDB',
    @container_url = 'https://xxxxxxxxx.blob.core.windows.net/yyyyyyyy',
    @retention_days = 30;
GO
I get this error:
Msg 45207, Level 17, State 17, Procedure managed_backup.sp_add_task_command, Line 102 [Batch Start Line 19]
The operation failed because of an internal error. The argument must not be empty string.
Parameter name: sasToken Please retry later.
at Microsoft.WindowsAzure.Storage.Core.Util.CommonUtility.AssertNotNullOrEmpty(String paramName, String value)
at Microsoft.WindowsAzure.Storage.Auth.StorageCredentials..ctor(String sasToken)
at Microsoft.SqlServer.SmartAdmin.SmartBackupAgent.FileService.VerifyContainerURL(String containerURL, SqlConnection conn)
at Microsoft.SqlServer.SmartAdmin.SmartBackupAgent.SmartBackup.ConfigureDbOrInstance(SmartBackupConfigParameters config, LogBaseService jobLogger, SqlConnection conn)
If we change @enable_backup to 0, the query executes correctly.
I have tried re-creating the SAS key, it doesn't work.
PS: I'm currently using SQL Server 2017.
Try these steps:
1) Create a SQL Server credential using a shared access signature.
To create a SQL Server credential, follow these steps:
Connect to SQL Server Management Studio.
Open a new query window and connect to the SQL Server instance of the database engine in your on-premises environment.
In the new query window, execute the following script:
USE master
CREATE CREDENTIAL [https://<storageaccountname>.blob.core.windows.net/<containername>]
-- the credential name must match the container path, start with https,
-- and must not contain a forward slash at the end
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'  -- this is a mandatory string and should not be changed
, SECRET = 'sharedaccesssignature'         -- this is the shared access signature key
GO
Example:
USE master
CREATE CREDENTIAL [https://msfttutorial.blob.core.windows.net/containername]
WITH IDENTITY='SHARED ACCESS SIGNATURE'
, SECRET = 'sharedaccesssignature'
GO
To see all available credentials, you can run the following statement
in a query window connected to your instance:
SELECT * from sys.credentials
2) Create the backup. Modify the URL appropriately for your storage account name and the container, and then execute this script.
-- To permit log backups, modify the database to use the full recovery model before the full database backup.
USE master;
ALTER DATABASE AdventureWorks2016
SET RECOVERY FULL;
BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2016_onprem.bak';
3) Open Object Explorer and connect to Azure Storage using your storage account and account key. Expand Containers and verify that the backup appears in the container (you can also check from T-SQL, as sketched below).
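A quick T-SQL check is to read the backup header straight from the URL. This is just a sketch; the storage account, container, and file name are the same placeholders as above, and it assumes the SAS credential from step 1 exists:

-- Reads the backup header from blob storage using the matching SAS credential
RESTORE HEADERONLY
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2016_onprem.bak';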
For more details, refer to these links: link1 and link2.
If anyone has experienced a similar issue with SQL Server, an update may solve your problem.
Symptoms
Assume that you disable "Allow Blob Public Access" on a storage account. In this situation, when you enable a managed or automated backup in Microsoft SQL Server 2016 or 2017, you receive the following error message:
"SQL Server Managed Backup to Microsoft Azure cannot configure the database 'DatabaseName' because a container URL was either not provided or invalid. It is also possible that your SAS credential is invalid."
Resolution
This problem is fixed in the following cumulative updates for SQL Server:
Cumulative Update 23 for SQL Server 2017
Cumulative Update 16 for SQL Server 2016 SP2
https://support.microsoft.com/en-us/topic/kb4589360-fix-error-when-you-enable-managed-or-automated-backup-in-sql-server-2016-and-2017-if-allow-blob-public-access-is-disabled-on-a-storage-account-d2315f8d-6a38-5d6c-6f02-3ab14c4b34cf
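Before installing the update, you can confirm the patch level of your instance from T-SQL. A small sketch; the commented values are examples of what the properties can return, not output from the poster's server:

SELECT SERVERPROPERTY('ProductVersion')     AS ProductVersion,     -- build number, 14.0.x for SQL Server 2017
       SERVERPROPERTY('ProductLevel')       AS ProductLevel,       -- e.g. RTM or SP2
       SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel; -- e.g. CU23 once the update is applied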

On Prem to Cloud with Data Factory

I have one on-prem Oracle database and one Azure SQL Database, and I want to use a Copy Data activity to transfer this data.
I have created a Self-Hosted IR for the Oracle database, and I am able to connect to it and preview data from the Data Factory editor.
I have an Azure SQL Database that should receive the data; it is set up with the AutoResolveIntegrationRuntime and its connection test succeeds. I am also able to preview data from this database.
When I try to run the Copy Data activity, I get the following error message:
ErrorCode=SqlFailedToConnect,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Database: 'sqlsrv', Database: 'database', User: 'user'. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access.
Based on all the docs/tutorials I have read, this should not be failing. I have tried to open up the SQL Server firewall rules to allow all IP addresses.
Any ideas what I might be doing wrong here?
Since the integration runtime is a bridge between on-prem and cloud, you need to check whether you can access both the on-prem database and the Azure SQL database from the VM on which the IR is installed.
The VM hosting the IR must be able to reach both the source and the sink for the copy activity to succeed whenever either of them uses the self-hosted runtime.
So the issue is not with the Azure SQL database itself but with the VM hosting the IR.

Connect Aurora Serverless from Tableau Desktop and Tableau Server

It's a follow-up to a question I asked earlier here on Stack Overflow:
Not able connect Amazon Aurora Serverless from SQL client
I found a cool hack that works perfectly for my development purposes with some tweaks, and I know I should not use this in my production environment.
As we know, Aurora Serverless works only inside a VPC, so make sure you are attempting to connect to Aurora from within the VPC and that the security group assigned to the Aurora cluster has the appropriate rules to allow access. As I mentioned earlier, I already have an EC2 instance, an Aurora Serverless cluster, and a VPC around both, so I can access the cluster from my EC2 instance but not from my local PC / local SQL client. To fix that, I did the two steps below.
1. To access from any client (Navicat in my case):
a. First, add the GENERAL DB configuration: Aurora endpoint host, username, password, etc.
b. Then, add the SSH configuration: EC2 machine username, host IP, and .pem file path.
2. To access from a project:
First I create an SSH tunnel from my terminal like this:
ssh ubuntu@my_ec2_ip_goes_here -i rnd-vrs.pem -L 5555:database-1.my_aurora_cluster_url_goes_here.us-west-2.rds.amazonaws.com:5432
Then I run my project with a DB configuration like this (test.php):
$conn = pg_connect("host=127.0.0.1 port=5555 dbname=postgres user=postgres password=password_goes_here");
if (!$conn) {
    echo "An error occurred.\n";
    exit;
}

// other code goes here to get data from your database
$result = pg_query($conn, "SELECT * FROM brands");
if (!$result) {
    echo "An error occurred.\n";
    exit;
}

while ($row = pg_fetch_row($result)) {
    echo "Brand Id: $row[0] Brand Name: $row[1]";
    echo "<br />\n";
}
So what is my question now?
I need to connect to my Aurora Serverless cluster from Tableau Desktop and Tableau Server. For Tableau Desktop I used the same SSH tunneling and it works, but how do I do it for Tableau Server?
Tableau Server does not support centralized data sources. You need to connect Tableau Desktop to your data source, and publish it to Tableau Server when you want to expose the data through the UI.
https://community.tableau.com/thread/111282

Error connecting Oracle - ORA-12638: Credential retrieval failed

I am getting the following error while using a linked server in SQL Server to connect to an external Oracle DB:
Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "xxxx".
OLE DB provider "OraOLEDB.Oracle" for linked server "xxx" returned message "ORA-12638: Credential retrieval failed".
I am having this issue around 4 out of 5 times, so it only works sometimes.
In my case, the test and live Oracle DBs (external) are on the same physical server with different SIDs/DB instances.
The test connection using OraOLEDB works consistently; it's the live linked server that's the problem.
Also, to take the network out of the equation, I tried connecting to Oracle live from our test environment and to Oracle test from our live environment. The connection to Oracle test works fine either way, and live doesn't.
I can connect fine to the application fronting the external Oracle DB using the live login credentials that are used for the live linked server, so to me that takes the login account out of the question.
Question 1: Is there any other way to connect to the external Oracle DB, either via SQL or C#?
I used OPENROWSET as below and get an error:
SELECT *
FROM OPENROWSET('OraOLEDB.Oracle', 'Data Source=external_Oracle_serverIP;Initial Catalog=bbinstance;User id=xxx; Password=xx;',
'SELECT * FROM dbname')
I get the following error:
OLE DB provider "OraOLEDB.Oracle" for linked server "(null)" returned message "ORA-12560: TNS:protocol adapter error". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "(null)".
Question 2: What am I doing wrong above?
When I've encountered this issue, it's because of the following line in sqlnet.ora:
SQLNET.AUTHENTICATION_SERVICES= (NTS)
This causes Oracle to attempt to use Windows Native Authentication Services.
If Oracle cannot authenticate via this method, you'll get the 12638 error. To troubleshoot, change this line to
SQLNET.AUTHENTICATION_SERVICES= (NONE)
and repeat your test to the live database.
Oracle 12c and above:
Sometimes two Oracle pluggable database (PDB) services running under different root databases can register with the same listener, creating this chaos. Check and stop the unwanted PDB (if it is not production) and try connecting to the required DB.
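A rough sketch of that check from the root container (the PDB name sales_pdb is just a placeholder):

-- Run as SYSDBA in the root container (CDB$ROOT)
SELECT name, open_mode FROM v$pdbs;   -- which PDBs are open
SELECT name, pdb FROM v$services;     -- which services (and owning PDBs) get registered with the listener

-- Close the conflicting PDB (placeholder name; non-production only)
ALTER PLUGGABLE DATABASE sales_pdb CLOSE IMMEDIATE;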

Unable to start SQL Server Agents

I have been trying to configure an ADO.NET connection for my Visual Studio application, but I am running into issues and having no luck at all troubleshooting them. The major error that I run into is:
A network-related or instance-specific error occurred while establishing a connection to SQL server. The server was not found or not accessible. Verify the instance name is correct and that SQL server is configured to allow remote connections (Error 40: Could not open connection to the SQL server)
The steps I undertook to troubleshoot this are:
1. Open SQL Server Configuration Manager and, under SQL Server Network Configuration > Protocols for MSSQLSERVER / SQLEXPRESS / SQL Server 2008, enable each of the protocols (Shared Memory, Named Pipes, TCP/IP, VIA).
As instructed, I stop the SQL Server services first under the SQL Server Services node.
Now when I click Start, the MSSQLSERVER service starts but both the SQL Server 2008 and SQLEXPRESS Agent services do not, stating the following:
The request failed or the service did not respond in a timely fashion. Consult the event log or other applicable error logs for details
This might just be the problem: when I attempt to create the connection string, the server name specified is MYWORLD/SQL SERVER 2008. Since the SQL Server 2008 Agent service refuses to start, the connection fails, leading to the first error message.
Does anybody have any leads on this, and can you let me know the necessary steps to mitigate it?
The SQL Browser service must be running to connect to a named instance.
Use SQL Server Configuration Manager under Configuration Tools under Microsoft SQL Server 2008 to assign the logon accounts for each service you want to run. Do NOT use any other method to assign user accounts, because the correct rights will not be enabled.
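If you want to check the current state and logon account of each service from T-SQL rather than Configuration Manager, one option is the sys.dm_server_services DMV (available from SQL Server 2008 R2 SP1 onwards, so it may not apply to plain 2008 instances):

-- Shows each SQL Server service, its startup type, current status, and logon account
SELECT servicename, startup_type_desc, status_desc, service_account
FROM sys.dm_server_services;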
