Does Heroku provide any VPN client like AWS Client VPN?

I want to host a database on Heroku along with a Django application. The problem is: to transfer data to my Heroku database I would need to be connected to a VPN. Does Heroku provide a way to connect to a VPN in order to access another database, like AWS Client VPN does?
My infra would be like this:
Airflow running DAGs to pull data from an AWS database that requires a VPN connection to source from it. I would transfer the data from this AWS database to my Heroku database.
Is it possible?
Thank you
Another thing I'm wondering is whether it's possible to connect Heroku to AWS Client VPN, in case Heroku does not have something similar or another way to do this step.

Yes, Heroku does provide VPN connectivity, available as a feature of Heroku Private Spaces and Shield Private Spaces.
Here is the link
https://devcenter.heroku.com/articles/private-space-vpn-connection
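Once your apps run in a Private Space, the VPN connection itself is created with the Heroku CLI. A minimal sketch following the linked article; the connection name, space name, gateway IP and CIDR below are placeholders, not values from the question (check heroku spaces:vpn:connect --help for the exact flags):

# Create an IPsec connection from the space to your AWS-side VPN gateway
heroku spaces:vpn:connect office --ip 52.44.146.197 --cidrs 10.0.0.0/16 --space my-space
# Wait for the tunnels to come up, then read the details (including pre-shared keys)
# needed to configure the AWS side of the tunnel
heroku spaces:vpn:wait office --space my-space
heroku spaces:vpn:info office --space my-space

Dynos in the space can then reach hosts in 10.0.0.0/16 directly, which would cover the Airflow-to-AWS-database step in your setup. Note that Private Spaces are part of Heroku Enterprise plans.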

Related

Local database docker to AWS database in VPN

I'm a beginner with Docker; hope everyone can help, much appreciated.
I downloaded a Docker image from my company repository and managed to create a container on my local machine from the image; let's name it mydb. It was created with the command below:
docker run --name mydb -p 1521:1521 -d mycompany.com:5000/docker-db:20.0.04
I am able to access the database through SQL Developer with the following connection string: system/abc123@127.0.0.1:1521/ORCL
Our company has a database server in AWS; let's name it awsdb. I can access it after VPN login.
I am able to access that database in SQL Developer with the following connection string:
system/abc123@awsdb.amazonaws.com:1521/awsdb
Question:
How can I create a database link in mydb to awsdb with the name "my_dblink"? e.g. select sysdate from dual@my_dblink.
I tried with the following command:
CREATE PUBLIC DATABASE LINK my_dblink
CONNECT TO system
IDENTIFIED BY abc123
USING 'awsdb.amazonaws.com:1521/awsdb';
but it returns error ORA-12543: TNS:destination host unreachable.
I tried removing the container and recreating it with --net=host:
docker run --name mydb -p 1521:1521 -d --net=host mycompany.com:5000/docker-db:20.0.04
but now I can't even connect with system/abc123@127.0.0.1:1521/ORCL;
error ORA-12541 is returned: no listener.
How can I open a connection from inside Docker to the AWS database server? Thank you.
First of all, I believe you need to understand what you are trying to accomplish.
When you create a database link between two databases, the main requirement you must fulfil is network connectivity between both of them on the ports you are using. As one of them lives in a public cloud, at a minimum you would need:
A network connection between the network where the Docker host sits and the public cloud in AWS.
But, as your Docker container runs on your local laptop, the AWS database would have to be opened to the Internet, which is a security issue and probably not enabled.
Moreover, you would need firewall rules for all the ports this connectivity might use.
Your VPN login gives you access to AWS cloud resources because you connect through it (probably using Active Directory and/or a certificate, perhaps even SSO federation between your company AD and the resources in AWS); the database itself cannot authenticate that way.
Summarizing: that is not possible, and if I were in Security I would never allow it. The only option for you would be to run the database container in AWS and create the database link there.
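If you want to verify this yourself, a quick diagnostic (assuming nc is available in the image; the hostname is the one from the question) is to test raw TCP reachability from inside the container before blaming the database link:

# Open a shell inside the running container
docker exec -it mydb bash
# From inside the container, check whether the Oracle listener port is reachable at all
nc -zv -w 5 awsdb.amazonaws.com 1521
# A timeout or "host unreachable" here reproduces ORA-12543 at the network level:
# the VPN session on the laptop does not extend to the container's network path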

Containerized Laravel application that connects to a remote database

Good day everyone. I have a Laravel application that is supposed to connect to a remote MySQL database in production, and to ease deployment I am using Docker. I have set up a GitHub Actions workflow that is triggered when I push to the master branch; the workflow essentially runs a couple of tests, then builds my app into an image and pushes it to Docker Hub.
To avoid database connection issues when composer dump-autoload runs during the build, I allowed connections from any host (changed bind-address to 0.0.0.0 in the MySQL config) and also set up the MySQL user to connect from any host. This seems to do the trick, but my concern is obviously exposing my database service to the entire world. Fortunately it's possible to set up my own dedicated runner for GitHub Actions, which means I can easily restrict my DB service to that host. Would that be the ideal solution, or is there a way to run the workflow without needing to connect to a database?
Try connecting to the remote database through an SSH tunnel:
ssh -N -L 3336:127.0.0.1:3306 [USER]@[REMOTE_SERVER_IP]
With this you do not need to publish MySQL to the world and can bind it to 127.0.0.1 on the remote host.
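For instance, with that tunnel running in one terminal, the client connects to the forwarded local port; the user and database names below are placeholders:

# Local port 3336 now forwards to 127.0.0.1:3306 on the remote server
mysql -h 127.0.0.1 -P 3336 -u app_user -p mydb

In the GitHub Actions scenario the same ssh command can be started in the background (authenticated with a deploy key) before the composer step runs, so the database never has to listen publicly.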

How to configure JDBC for Cloud Fusion to connect MySQL installed on localhost:3306

I'm trying to connect my local standalone MySQL with Cloud Fusion to create and test a data pipeline. I have deployed the driver successfully.
Also, I have configured the pipeline properties with the correct JDBC connection string, user name and password, but connectivity is not getting established.
Connection String: jdbc:mysql://localhost:3306/test_database
I have also tried to test the connectivity via the data wrangling option, but that is not succeeding either.
Do I need to bring both environments onto the same network by setting up a VPC and tunneling?
In your example, I see that you specified localhost in your connection string. localhost is only reachable from services running locally on your machine, so Cloud Data Fusion (running in GCP) will not be able to reach the MySQL instance (running on your machine). Hence the connectivity issue you're seeing.
I highly recommend looking at this answer on SO that will help you set up a quick proof of concept.
I think your question is really about how to connect an on-premises environment to GCP networking, i.e. to the VPC that gathers your Google Cloud instances and resources.
Given that GCP offers several connection methods under its hybrid-cloud model, I would encourage you to learn the fundamentals of Cloud VPN, which establishes a secure connection between your VPN peer gateway and a Cloud VPN gateway by creating a VPN tunnel between the two parties.
There is even a dedicated chapter in the GCP documentation about Data Fusion VPC peering that might be helpful in your use case.
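If you do go the Cloud VPN route, a Classic VPN setup looks roughly like the gcloud sequence below. Everything here (region, network, names, addresses, on-premises CIDR) is a placeholder sketch, not taken from the question, and your on-premises side needs a matching IPsec gateway:

# Reserve a static IP and create a Classic VPN gateway in your VPC
gcloud compute addresses create vpn-ip --region us-central1
gcloud compute target-vpn-gateways create my-gw --network my-vpc --region us-central1
# ESP and IKE forwarding rules the gateway needs
gcloud compute forwarding-rules create fr-esp --region us-central1 \
    --ip-protocol ESP --address vpn-ip --target-vpn-gateway my-gw
gcloud compute forwarding-rules create fr-udp500 --region us-central1 \
    --ip-protocol UDP --ports 500 --address vpn-ip --target-vpn-gateway my-gw
gcloud compute forwarding-rules create fr-udp4500 --region us-central1 \
    --ip-protocol UDP --ports 4500 --address vpn-ip --target-vpn-gateway my-gw
# The tunnel to your on-premises gateway, plus a route for the local subnet
gcloud compute vpn-tunnels create my-tunnel --region us-central1 \
    --peer-address 203.0.113.10 --shared-secret MY_SECRET \
    --target-vpn-gateway my-gw --ike-version 2 \
    --local-traffic-selector 0.0.0.0/0 --remote-traffic-selector 192.168.1.0/24
gcloud compute routes create route-to-onprem --network my-vpc \
    --destination-range 192.168.1.0/24 \
    --next-hop-vpn-tunnel my-tunnel --next-hop-vpn-tunnel-region us-central1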

Can a Heroku app access remote database?

I want to host an application on Heroku but don't want to use a Heroku database. Can I connect to an existing remote database from my Heroku app?
You can use whatever database you like from Heroku as long as it is accessible from Heroku's platform.
Just set and use the proper env var in your app with your DB's address and credentials and it should just work.
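For example (all names and credentials below are placeholders, assuming a Postgres database reachable at db.example.com):

heroku config:set DATABASE_URL=postgres://app_user:s3cret@db.example.com:5432/mydb
# Verify what the dyno sees; the app should read this env var rather than hard-code credentials
heroku run 'echo $DATABASE_URL'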
Where is your current database hosted? In short, the answer is likely yes. Heroku makes it really easy to connect to an Amazon RDS instance through a plugin. This is the same way many other database hosts provide connectivity, such as ClearDB or MongoLab. If you share where your current remote database is, it will be easier to give you more information.

How to use your own mysql database server with heroku?

I want to use a MySQL database hosted on my own server.
I've changed the DATABASE_URL and SHARED_DATABASE_URL config vars to point to my server, but it's still trying to connect to Heroku's amazonaws servers. How do I fix that?
According to the Heroku documentation, changing DATABASE_URL is the correct way to go.
If you would like to have your rails application connect to a non-Heroku provided database, you can take advantage of this same mechanism. Simply set your DATABASE_URL config var to point to any cloud-accessible database, and Heroku will automatically create your database.yml file to point to your chosen server. The Amazon RDS Add-on does this for you automatically, though you can also use this same method to connect to non-RDS databases as well.
Here's an example that should work:
heroku config:add DATABASE_URL=mysql://user:password@host/db
You may need to redeploy by making a change and running git push heroku master
By the way, the host is XXXX.amazonaws.com, where XXXX is a long host name that probably changes. If you can add a wildcard, the easiest is %.amazonaws.com.
I had this exact same problem with my Dreamhost MySQL database. Turns out the solution was to tell Dreamhost it was OK to accept connections from this foreign host. Otherwise, Dreamhost blocks all requests to MySQL that don't originate from their systems.
It seems that if Heroku is falling back to Amazon AWS despite your DATABASE_URL, it's because it's being denied access to your MySQL database.
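For a MySQL server you administer yourself, the equivalent of the Dreamhost fix is granting the user access from remote hosts. A sketch with placeholder names, run via the mysql CLI on the database server:

# Create a user that may connect from remote hosts and grant it access to one schema.
# '%' means any host; narrow it to your app's outbound IPs if your setup allows that.
mysql -u root -p -e "CREATE USER 'app_user'@'%' IDENTIFIED BY 's3cret';
GRANT ALL PRIVILEGES ON mydb.* TO 'app_user'@'%';
FLUSH PRIVILEGES;"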
