I've set the DefaultEndpointsProtocol to "http" in the service configuration file, but I still get the ERROR_HTTPS_NOT_SUPPORTED exception during cloud drive creation. Is there anything else that I need to look for?
Thanks for the help.
Cloud drives can currently use only an HTTP storage connection string:
http://msdn.microsoft.com/en-us/library/windowsazure/microsoft.windowsazure.storageclient.clouddrive.delete.aspx
"When defining the connection string for connecting to the storage account, you must use the HTTP protocol. The HTTPS protocol is not supported for creating a drive."
http://blogs.msdn.com/b/farida/archive/2012/04/23/cloud-drive-can-currently-use-only-http-storage-connection-string.aspx
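For reference, a drive-compatible storage connection string simply uses http as the endpoints protocol (the account name and key below are placeholders):
DefaultEndpointsProtocol=http;AccountName=myaccount;AccountKey=mykey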
I am getting a 404 when trying to access Database Actions on an Autonomous Database with a private endpoint from my internal environment, which is connected through VPN. Does anyone know how to fix this?
All Autonomous Database tools are supported in databases configured with private endpoints, but additional configuration is required. To connect from your data center and resolve the Autonomous Database private endpoint, you need a Fully Qualified Domain Name (FQDN), with a mapping of the private endpoint IP to that FQDN. For that, you can either add an entry to your client's hosts file (e.g. /etc/hosts on Linux) or use hybrid DNS in Oracle Cloud Infrastructure.
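For example, a hosts file entry would look something like this (the IP and FQDN below are placeholders; use your own private endpoint IP and your database's FQDN):
10.0.1.5  myadb.adb.us-ashburn-1.oraclecloud.com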
In addition to the name resolution, your dynamic routing gateway must allow the traffic to and from the Autonomous Database.
For what it's worth, if you want to learn more about the private endpoint setup, check the official doc and specifically the connection example.
I cannot connect to Google services from a client application if it tries to communicate with oauth2.googleapis.com (which is probably blocked in my corporate network; I don't know how to test that for sure).
I tried BigQuery with the JDBC driver in DBeaver, with basic settings.
User-based login does this:
It generates a link for OAuth. I open the browser and log in with the right Google account. Then I paste the generated code into DBeaver and receive a message that auth has failed.
Service-based login does this:
It does not want me to visit any webpage. It just tells me:
[Simba][BigQueryJDBCDriver](100004) HttpTransport IO error : oauth2.googleapis.com.
I also tried to use ODBC, where a proxy can be filled in. But no luck.
When I take a look into 'Proxy Options', the proxy port is always overwritten by the proxy host. Weird.
This is what happens when I click on the 'catalog' or 'dataset' drop-down field. I can't do any further steps.
BUT!
When I set my HTTP proxy in the gcloud CLI app, communication works, and I can call BQ from it.
Does that mean that gcloud communicates through the HTTP proxy and DBeaver or ODBC does not? Or does it mean that gcloud does not need oauth2.googleapis.com but ODBC and JDBC do, and it is blacklisted? I am confused.
We need to migrate from our internal environment to GCP, and we would love to use various applications. I would ask for whitelisting of oauth2.googleapis.com, but I am not sure this is the only problem, as the gcloud app works without any flaws.
I am not experienced with networking, so I am more than happy to update/correct this question or add any info (if you need it) to help me understand this issue. Thank you.
According to your description, your corporate network uses a proxy to reach the Internet. This is the reason why gcloud is able to reach the BigQuery service when proxy settings are configured on your system, either through the Cloud SDK proxy settings or the HTTP_PROXY environment variable.
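For reference, the Cloud SDK proxy settings mentioned above can be set like this (the host and port below are placeholders):
gcloud config set proxy/type http
gcloud config set proxy/address myproxyhost
gcloud config set proxy/port 3128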
You need to set up the proxy settings within the JDBC connection string, as described in the Simba JDBC driver documentation, e.g.:
jdbc:bigquery:DataSetId=MyDataSetId;ProjectId=MyProjectId;OAuthType=1;ProxyHost=MyProxyHost;ProxyPort=MyProxyPort;ProxyUID=MyProxyUsername;ProxyPWD=MyProxyPassword
This connection string passes the proxy settings to the Simba JDBC driver.
I'm trying to connect to an S/4HANA system using the S/4 SDK. While executing calls via the .execute() method in the Cloud Foundry environment, I see the error logs below:
Caused by: com.sap.cloud.sdk.cloudplatform.connectivity.exception.DestinationAccessException: Failed to get authentication headers. Destination service returned error: Missing private and public key for subaccount ******-****-****-***-*******.
Note: I've already configured trust between the subaccount and the S/4HANA system and created the respective communication and business users. The authentication method used in the destination is OAuth2SAMLBearerAssertion. Note: the call executes fine in both the local and Cloud Foundry environments with basic authentication.
Can someone please suggest what is wrong here?
As correctly pointed out by @Dennis H, there was a problem in the trust configuration between my subaccount and the S/4HANA system. The configuration was wrong in my case in two ways:
-> The certificate I downloaded for trust was using this URL:
https://<subaccount>.authentication.eu10.hana.ondemand.com/saml/metadata
This is incorrect; we need to get the certificate from the Download Trust button on the Destinations tab at the subaccount level.
-> The provider name was incorrect in the communication system.
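For anyone comparing their own setup, a destination of this type typically carries properties along these lines; all values below are placeholders and this is only a rough sketch, since the exact audience and token service URL come from your own S/4HANA system:
Type=HTTP
Authentication=OAuth2SAMLBearerAssertion
URL=https://my-s4-system.example.com
audience=https://my-s4-system.example.com
tokenServiceURL=https://my-s4-system.example.com/sap/bc/sec/oauth2/token
clientKey=MY_OAUTH_CLIENT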
We are developing a side-by-side extension app and deploying it to CF. Our app is trying to connect to an S/4HANA Cloud system using OAuth2SAMLBearerAssertion, but we are facing issues while doing so. We are getting the error below in the logs. Please note, we are able to connect to S/4HANA Cloud using basic auth.
com.sap.cloud.sdk.cloudplatform.connectivity.exception.DestinationAccessException: Failed to access the configuration of destination
Our destination parameters are shown in the attached screenshot.
Thank you.
I have created a MySQL database in a Google Cloud project to use in conjunction with the Jdbc service provided by Google Apps Script. Everything went as planned with the connection. I am basically connecting as shown in the docs.
var conn = Jdbc.getCloudSqlConnection(dbUrl, user, userPwd);
I shared the file with another account, and all of a sudden I am seeing a red error saying:
'Failed to establish a database connection. Check connection string, username and password.'
Nothing changed in the code, but there is an error. When I go back to my original account and run the same bit of code, there is no error. What is happening here? Any ideas?
Jdbc.getConnection works from both my account and the other account:
var conn = Jdbc.getConnection('jdbc:mysql://' + IP + ':3306/' + database_name, user, password)
I'm really confused because the recommended method did not work.
There are two ways of establishing a connection with a Google Cloud SQL database using Apps Script's JDBC service:
(Recommended) Connecting using Jdbc.getCloudSqlConnection(url)
Connecting using Jdbc.getConnection(url)
Notes:
IP is the public IP address from the OVERVIEW tab in your database console.
I've allowed any host when creating the user.
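Here's a minimal sketch of both approaches side by side (the instance connection name, database, IP, and credentials below are all placeholders):
function connectBothWays() {
  // (Recommended) Uses the instance connection name; no IP whitelisting needed.
  var conn1 = Jdbc.getCloudSqlConnection(
      'jdbc:google:mysql://my-project:us-central1:my-instance/my_database',
      'user', 'password');
  // Uses the public IP from the OVERVIEW tab; requires whitelisted IP ranges.
  var conn2 = Jdbc.getConnection(
      'jdbc:mysql://203.0.113.10:3306/my_database', 'user', 'password');
}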
I am not sure whether this question has been resolved or not, but let me add this answer.
I also faced the same problem, but I found the resolution. What I did is:
First, go to the console.
https://console.cloud.google.com
Then, open IAM.
and add the account as a member, granting it the "Cloud SQL Client" role.
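If you prefer the command line, the same grant can be made with gcloud (the project ID and email below are placeholders):
gcloud projects add-iam-policy-binding my-project --member=user:second.account@example.com --role=roles/cloudsql.client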
I think this is a permission issue with your second account. Necessary information is missing from your question. But the second account, if run as another user, won't necessarily have your sqlservice authorization. The permission,
https://www.googleapis.com/auth/sqlservice
Manage the data in your Google SQL Service instances
is required to use Jdbc.getCloudSqlConnection(url), while Jdbc.getConnection(url) just requires the external request permission:
https://www.googleapis.com/auth/script.external_request
I believe that you can only connect to SQL instances owned by you with getCloudSqlConnection(), which doesn't even require the external request permission. This method probably calls your SQL instance internally.
References:
Jdbc#getCloudSqlConnection
Jdbc#getConnection
Conclusion
To connect to any external service, you need the external_request permission. But you don't need that permission to connect to your own documents, say, spreadsheets owned by you or that you have edit access to, through SpreadsheetApp.openByUrl(). I believe it's the same with Jdbc.getCloudSqlConnection(): it calls your Google Cloud SQL instance internally, so even if you grant the external request permission, it won't work. What will work for this method is:
Installable triggers (which run as you).
Adding the second account as an owner in GCP IAM (may not work though). See this answer.
I'd double-check once again all the IP ranges which should be whitelisted. According to your description it worked fine in the first account; probably in the second account Apps Script uses another IP for the connection, one which was not whitelisted or was whitelisted with a typo. Could you share a screenshot of how exactly you whitelisted the ranges from this article?
I have a GAS add-on that uses a Google Cloud DB. I initially set this up by:
Whitelisting Google Cloud IP ranges in my SQL instance
Getting the script.external_request scope approved for OAuth Consent screen
This all works great from GAS for the add-on, but I suspect that if this whitelist is not comprehensive and is volatile (which I expect it is), I will see intermittent connectivity issues.
I recently added a Firebase web app that needs access to the same DB. I had issues because Firebase does not conform to those Google IP ranges and does not expose its IP for whitelisting, so I had to create a socket-layer connection as if Firebase were an external service.
Which got me thinking: should I put a socket layer in my GAS add-on? But nothing in the GAS Jdbc class documentation indicates a socket parameter.
Which leads me to a question that was not really answered in this thread:
Does anyone know why Jdbc.getCloudSqlConnection(url) is the "recommended" approach? The documentation seems to imply that, because IP whitelisting is not required, Jdbc.getCloudSqlConnection(url) uses a socket (or some other secure method) to connect to the DB.
It also seems silly, if that is the case, that I would need two sensitive scopes to manage a DB connection. I would rather not go through another OAuth consent audit and require my users to accept another scope unless there is a benefit to doing so.
I need to put a crossdomain.xml file in my Windows Azure web role. But where?
I tried to put it in F:\sitesroot\0.
But my Unity3D web app says: Exception: Unable to connect, as no valid crossdomain policy was found.
I don't know what I am missing. Unity uses port 843 by default.
Where to put the crossdomain.xml?
Any help is welcome!
crossdomain.xml must be at the root of your application.
If you are using a single web role, just add crossdomain.xml at the root of your application and set it up correctly, as below:
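A minimal, permissive policy looks like this (the wildcard domain is convenient for testing, but you should restrict it for production):
<?xml version="1.0"?>
<cross-domain-policy>
  <allow-access-from domain="*"/>
</cross-domain-policy>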
Depending on how many "Sites" sections you have in your role's ServiceDefinition.csdef, you will get that many \sitesroot\0, \sitesroot\1, and so on, and crossdomain.xml will be distributed to all of them depending on your role solution settings.
I discussed this once in the following blog post:
Silverlight front end calling to WCF Service, all in one Windows Azure Web Role Sample
You mentioned port 843, which sounds like you'd need the Flash policy protocol: a TCP socket listener on port 843 that responds with the cross-domain policy when it receives the text <policy-file-request/>. Do you need to be doing that? Does your app use sockets?
Avkash's answer is correct about where the XML file should go if you just need to serve it via port 80 from your web app, but if you need to do raw sockets, you'll need to run something on the server that handles that.
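For illustration only, here is a minimal sketch of such a policy server in Node.js; it just shows the wire protocol (the client sends the request string, the server answers with the NUL-terminated policy XML). Inside an Azure worker role you would host an equivalent TCP listener in whatever stack your role runs:
var net = require('net');

// Socket policy files use to-ports to say which ports clients may connect to.
var policy = '<?xml version="1.0"?>' +
    '<cross-domain-policy>' +
    '<allow-access-from domain="*" to-ports="*"/>' +
    '</cross-domain-policy>\0';

net.createServer(function (socket) {
  socket.on('data', function (data) {
    // Unity/Flash sends "<policy-file-request/>" followed by a NUL byte.
    if (data.toString().indexOf('<policy-file-request/>') !== -1) {
      socket.end(policy); // reply with the policy and close the connection
    }
  });
}).listen(843); // ports below 1024 may require elevated privileges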