Why am I able to access a hobby-dev Postgres in Heroku without SSL?
This is my Node code:
const { Client } = require('pg');
const connectionString = process.env.DATABASE_URL;
const client = new Client({
  connectionString: connectionString,
  ssl: false
});
heroku pg:credentials:url DATABASE returns a connection string with sslmode=require, yet I am also able to connect remotely with psql DATABASE_URL without passing sslmode=require as a query parameter.
According to the Heroku docs and code samples this shouldn't be the case. The only thing that could explain it is that Heroku does not support encryption at rest on the Hobby tier, so why should it in transit?
I contacted Heroku support, and the response I got is that SSL is currently not enforced on the server side.
Their data team confirms this is a "known-but-undocumented" behavior. The docs are unfortunately misleading, e.g. here and here:
If you leave off sslmode=require you will get a connection error
Encryption at rest has little to do with secure connections to the Postgres instance. After all, 'at rest' is quite the opposite of moving bytes back and forth between the client and the database host.
This is almost certainly due to the configuration of the database host itself, but it's impossible to say for certain. At a guess, the reasoning is to lower the barrier to using the service, since the hobby tier is free and popular with hobbyists and newer developers.
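Even though the server does not currently enforce it, you can still require SSL from the client side. Here is a minimal sketch with node-postgres, assuming the usual DATABASE_URL environment variable; rejectUnauthorized: false is the commonly used workaround for the certificates these plans present, which the client cannot verify:

const { Client } = require('pg');

// Require TLS for the connection, but skip certificate verification,
// since the hobby-tier database presents a cert the client can't validate.
const client = new Client({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: false }
});

client.connect()
  .then(() => client.query('SELECT now()'))
  .then(res => console.log(res.rows[0]))
  .catch(err => console.error('connection failed', err))
  .finally(() => client.end());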
I have a Heroku Postgres database that I want to share with certain external systems.
The docs say (link):
Heroku Postgres databases are designed to be used with a Heroku app. However, except for private and shield tier databases, they are accessible from anywhere and may be used from any application using standard Postgres clients. For private databases, outside access can be enabled using trusted IP ranges.
Currently the external systems can't use the Postgres JDBC driver because of some security constraints. How can I give them access to the Postgres database in a secure way?
All Heroku Postgres databases are public, with the exception of their private and shield tier databases which have additional network restrictions.
As long as you use an appropriate Postgres driver and require SSL on connections, there is nothing preventing your external clients from connecting to a Heroku Postgres instance.
One additional thing to consider is the fact that your Heroku Postgres instance may be moved to a new host during maintenance operations. You can stay abreast of any changes by polling the Platform API.
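If the external systems cache the connection details, it can help to re-read them periodically. Here is a minimal sketch of that, assuming a Heroku API token in HEROKU_API_TOKEN and the app name in HEROKU_APP (both hypothetical variable names), using the config-vars endpoint of the Platform API (requires Node 18+ for the built-in fetch):

// Fetch the current DATABASE_URL from the Heroku Platform API so an
// external client can pick up host changes after maintenance.
async function currentDatabaseUrl() {
  const res = await fetch(
    'https://api.heroku.com/apps/' + process.env.HEROKU_APP + '/config-vars',
    {
      headers: {
        Accept: 'application/vnd.heroku+json; version=3',
        Authorization: 'Bearer ' + process.env.HEROKU_API_TOKEN
      }
    }
  );
  if (!res.ok) throw new Error('Heroku API returned ' + res.status);
  const vars = await res.json();
  return vars.DATABASE_URL;
}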
I have created a Cloud SQL MySQL database in a Google Cloud project to use in conjunction with the Jdbc service provided by Google Apps Script. Everything went as planned with the connection. I am basically connecting as shown in the docs.
var conn = Jdbc.getCloudSqlConnection(dbUrl, user, userPwd);
I shared the file with another account and all of a sudden I am seeing a red error saying:
'Failed to establish a database connection. Check connection string, username and password.'
Nothing changed in the code, but there is an error. When I go back to my original account and run the same bit of code, there is no error. What is happening here? Any ideas?
Jdbc.getConnection works from both my account and the other account:
var conn = Jdbc.getConnection('jdbc:mysql://' + IP + ':3306/' + database_name, user, password)
I'm really confused because the recommended method did not work.
There are two ways of establishing a connection with a Google Cloud SQL database using Apps Script's JDBC service:
(Recommended) Connecting using Jdbc.getCloudSqlConnection(url)
Connecting using Jdbc.getConnection(url)
Notes:
IP is the public IP address from the OVERVIEW tab in your database console:
I allowed any host when creating the user:
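For reference, the two URL formats look roughly like this (the project, region, instance, and database names are placeholders):

// Recommended: connect via the instance connection name (no IP whitelisting required)
var cloudUrl = 'jdbc:google:mysql://my-project:us-central1:my-instance/my_database';
var conn = Jdbc.getCloudSqlConnection(cloudUrl, user, userPwd);

// Alternative: connect directly to the instance's public IP (requires whitelisting)
var directUrl = 'jdbc:mysql://' + IP + ':3306/my_database';
var conn2 = Jdbc.getConnection(directUrl, user, userPwd);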
I am not sure whether this question has been resolved or not, but let me add this answer.
I also faced the same problem, but I found the resolution. What I did is:
First, go to the console:
https://console.cloud.google.com
Then, open IAM.
Finally, add the account as a member and grant it the "Cloud SQL Client" role (the equivalent gcloud command is shown below).
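If you prefer the command line, the same grant can be done roughly like this (the project ID and account e-mail are placeholders):

gcloud projects add-iam-policy-binding my-project-id --member="user:second.account@example.com" --role="roles/cloudsql.client"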
I think this is a permission issue with your second account. Necessary information is missing in your question, but the second account, if the script runs as another user, won't necessarily have your sqlservice authorization. The permission,
https://www.googleapis.com/auth/sqlservice
Manage the data in your Google SQL Service instances
is required to use Jdbc.getCloudSqlConnection(url), while Jdbc.getConnection(url) just requires the external request permission
https://www.googleapis.com/auth/script.external_request
I believe that you can only connect to SQL instances owned by you with getCloudSqlConnection(), which doesn't even require the external connection permission. This method probably calls your SQL instance internally.
References:
Jdbc#getCloudSqlConnection
Jdbc#getConnection
Conclusion
To connect to any external service, you need the external_request permission. But you don't need that permission to connect to your own documents (say, Spreadsheets you own or have edit access to) through SpreadsheetApp.openByUrl(). I believe it's the same thing with Jdbc.getCloudSqlConnection(): it calls your Cloud SQL instance internally, so even if you grant the external request permission, it won't work. What will work for this method is
Installable triggers (which run as you); see the sketch below.
Adding the second account as an owner in GCP IAM (may not work though). See this answer.
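A minimal sketch of the installable-trigger option, assuming the function that opens the JDBC connection is called queryDatabase (a hypothetical name); the trigger runs with the authorization of the account that installs it:

// Create a time-driven installable trigger that runs queryDatabase
// every hour under the authorization of the account that installs it.
function installTrigger() {
  ScriptApp.newTrigger('queryDatabase')
    .timeBased()
    .everyHours(1)
    .create();
}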
I'd double-check once again all the IP ranges that should be whitelisted. According to your description it worked fine in the first account; probably in the second account Apps Script uses another IP for the connection, which was not whitelisted or was whitelisted with a typo. Could you share a screenshot of how exactly you whitelisted the ranges from this article?
I have a GAS add-on that uses a Google Cloud SQL database. I initially set this up by:
Whitelisting Google Cloud IP ranges in my SQL instance
Getting the script.external_request scope approved for OAuth Consent screen
This all works great from GAS for the add-on, but I suspect that if this whitelist is not comprehensive, or is volatile (which I expect it is), I will see intermittent connectivity issues.
I recently added a Firebase web app that needs access to the same database. I had issues because Firebase does not conform to those Google IP ranges and does not expose its IP for whitelisting, so I had to create a socket-layer connection as if Firebase were an external service.
Which got me thinking: should I put a socket layer in my GAS add-on? But nothing in the GAS Jdbc class documentation indicates a socket parameter.
Which leads me to a question that was not really answered in this thread:
Does anyone know why Jdbc.getCloudSqlConnection(url) is the "recommended" approach? The documentation seems to imply that, because IP whitelisting is not required, Jdbc.getCloudSqlConnection(url) is using a socket (or some other secure method) to connect to the database.
It also seems silly that, if that is the case, I would need to have two sensitive scopes to manage a database connection. I would rather not go through another OAuth consent audit and require my users to accept another scope unless there is a benefit to doing so.
Since Parse's announcement of Cloud Code Webhooks, I've built my own server environment instead of using Parse Cloud Code.
I'm running my server locally, changing the Parse webhook to point at my ngrok tunnel, as described here.
This all works great, but one problem arises when several people from my team have to work on the server at the same time. Since the webhook in Parse can (AFAIK) only point to one single URL, we can only have one server running, pointing to its localhost address through ngrok.
Is there any way to create a workaround for this problem?
Feel free to ask me to elaborate on any of my points.
I am knocking together a short program, mainly to try and learn some basics. I am finding an issue with this part of the code:
{
    MySqlConnection connection = new MySqlConnection("server=serverT;User Id=user;password=password;database=healer");
    connection.Open();
    MySqlDataReader reader = new MySqlCommand("SELECT version FROM version", connection).ExecuteReader();
    reader.Read();
    // etc etc etc
}
Now when I build and run this, it works perfectly, no problems whatsoever. It connects and I can read without any issue. (The server is hosted online, btw.)
When I give out the exe and mysql.data.dll to anyone else to run, they are thrown an exception about being unable to connect to any SQL host.
I just can't see why I can connect and they can't. I have tried this now on 3 other remote machines and they all fail to connect, but mine works OK.
There are no access rules on the hosted SQL database; I am allowing access from all IPs.
Can anyone shed any light?
You are hardcoding the connection string into your code. Your clients have a different server, therefore they need a different connection string.
You need to create a form or a config file from which the connection parameters are retrieved.
Please read this:
http://msdn.microsoft.com/en-us/library/ms254494.aspx
Good morning,
I have found that many of my customers have MS Access already installed on their PCs. Although Access is very limited as a data store, I have found that it is great for deploying low-cost front-ends for entry level customers.
I want to start renting a VPS, so I can host customer databases using Microsoft SQL Server 2008, which they can access using a locally stored Access front-end. I do have a few questions though:
In order to access the remotely hosted databases, and use the security features, would the VPS need to be set up as a domain controller, using AD DS? If I am hosting multiple customer databases, this is not an option.
What I envisage is being able to set up a simple MS Access front end, to access a MS SQL Server database on my VPS. For security, I would want the database to use the Windows account on the client machine to authenticate, and also to provide basic data change tracking.
Is this possible? Or, will I need to set up a server for each client and have it configured as a domain controller, etc?
You can have many databases on the same server, so you do not need to set up a separate domain controller for each client. Only the connection strings will be different.
You can use SSL for establishing the connection with the remote server to make the process more secure. You can also build a few web services to work with the data (CRUD operations); this would also make things more manageable, as sketched below.
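A minimal sketch of such a web service, using Node.js with the express and mssql packages; the server name, database, credentials, and the Orders table are placeholders, and encrypt: true turns on TLS to the remote SQL Server:

const express = require('express');
const sql = require('mssql');

const app = express();

// One connection pool per customer database; all connection details here are placeholders.
const poolPromise = new sql.ConnectionPool({
  server: 'vps.example.com',
  database: 'customer_a',
  user: 'customer_a_user',
  password: 'secret',
  options: { encrypt: true } // use TLS to the remote SQL Server
}).connect();

// Read a single row; similar handlers would cover create, update and delete.
app.get('/orders/:id', async (req, res) => {
  try {
    const pool = await poolPromise;
    const result = await pool.request()
      .input('id', sql.Int, req.params.id)
      .query('SELECT * FROM Orders WHERE OrderId = @id');
    res.json(result.recordset);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000);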
take care :)