Can't connect to MySQL server on 'db' on Heroku

I am using Docker Compose for my app and database. My app connects to the database using the hostname db, as shown in my docker-compose.yml:
db:
  image: mysql
  # container_name: mysql_db
  environment:
    - MYSQL_USER=test
    - MYSQL_ROOT_PASSWORD=test
    - MYSQL_PASSWORD=test
    - MYSQL_DATABASE=test
    - MYSQL_ALLOW_EMPTY_PASSWORD=yes
  volumes:
    - ./db:/docker-entrypoint-initdb.d
  restart: always
  ports:
    - 3306:3306
But when I open my app on Heroku I get a connection error:
sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'db' ([Errno -2] Name or service not known)")
I suppose this error appears because Heroku assigns hosts dynamically.
How can I set my database container up on Heroku?

How can I set my database container up on Heroku?
Don't.
One database container and one code container might make sense in development, but in production these things shouldn't be coupled. Use a hosted database provider instead; there are several options, including MySQL services, most of which have a free starter tier.
This matches Heroku's recommendation:
Use Heroku add-ons in production
For local development: use official Docker images, such as Postgres and Redis.
For staging and production: use Heroku add-ons, such as Heroku Postgres and Heroku Redis.
Using official Docker images locally and Heroku add-ons in production provides you with the best of both worlds:
Parity: You get parity by using the same services on your local machine as you do in production.
Reduced ops burden: By using add-ons, Heroku – or the add-on provider – takes on the ops burden of replication, availability, and backups.
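Once the app points at a hosted database, the only change is reading the connection string from a config var instead of hard-coding the db hostname. Here is a minimal sketch, assuming the add-on exposes a URL in an environment variable (the name DATABASE_URL is an assumption; JawsDB, for example, sets JAWSDB_URL) and that you are using SQLAlchemy with the pymysql driver, as in the traceback above:

```python
import os
from urllib.parse import urlparse, urlunparse

def database_url_from_env(var="DATABASE_URL"):
    """Read the provider-supplied MySQL URL from the environment and
    rewrite its scheme so SQLAlchemy picks the pymysql driver
    (mysql:// -> mysql+pymysql://)."""
    raw = os.environ[var]  # e.g. mysql://user:pass@host:3306/dbname
    parts = urlparse(raw)
    if parts.scheme == "mysql":
        parts = parts._replace(scheme="mysql+pymysql")
    return urlunparse(parts)
```

The returned string can be passed straight to sqlalchemy.create_engine(), so no hostname is baked into the code.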

Related

How to host a dockerized Laravel website on Heroku?

I have a dockerized Laravel application that uses a MySQL database and Redis.
I want to deploy it and test it on the internet.
Is there any way to deploy it to an environment such as Heroku?

Why is @nestjs/bull not able to connect to Heroku Redis?

This works with a local redis-server:
BullModule.forRoot({
  redis: {
    host: "localhost",
    port: 6379,
    db: 0,
    password: ""
  }
})
But if I use the DataStore credentials from Heroku Redis, the bull board does not load and the Heroku logs show an H12 (request timeout) error.
How can I get the BullModule to properly connect to Heroku Data for Redis?
Thanks!
You must specify where Redis is accessible. localhost:6379 is the default when running Redis locally, but to deploy an application that uses Redis to Heroku, you need to add the Heroku Data for Redis add-on. Then pass the location of your Redis service, available in process.env.REDIS_URL, to the BullModule.forRoot() call.
Be aware that TLS issues are common when connecting to Redis like this. When I tried connecting using the format from PedroPovedaQ's answer, I ran into one.
There's a discussion of that here.
I suggest trying
BullModule.forRoot({
  redis: "<redis url given by Heroku in env variable>"
})
This fixed the issue for me.

How to send logs to the Papertrail Heroku add-on from a job not running on Heroku?

I have an application running on Heroku with the Papertrail add-on. Everything works perfectly and logs from that app show up there. My problem comes when I want to log to the same Papertrail account from jobs running on other servers, such as AWS. In particular, I'm trying to set up a Docker container for jobs that are only run on certain occasions (and due to some technical limitations they cannot run on Heroku).
I have been checking how to set up the Unix loggers to send logs to Papertrail automatically. My problem is that I do not have a Papertrail DNS name or port; Heroku only gives me a Papertrail token.
Any idea how to send logs to papertrail using my add-on credentials from a different unix server?
You can use logspout, which can route all container logs (from the host) to a different location, for example to Papertrail.
You can find a docker run example on the website; below is a docker-compose example, which is quite convenient when you run multiple containers and want to gather all logs together.
The destination logs2.papertrailapp.com:55555 is provided by Papertrail in Settings -> Log Destinations.
logspout:
  image: gliderlabs/logspout:latest
  container_name: logspout
  restart: always
  volumes:
    - "/var/run/docker.sock:/var/run/docker.sock:ro"
  command: syslog+tls://logs2.papertrailapp.com:55555
  ports:
    - 8082:80
  networks:
    - my_network
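If the job itself is a Python process, another option is to skip logspout and ship records straight to the syslog endpoint from the application. A sketch, assuming the host and port from Settings -> Log Destinations (the values in the usage comment are placeholders); note that SysLogHandler speaks plain syslog over UDP by default, while the logspout example above uses syslog+tls, so check which transports your Papertrail destination accepts:

```python
import logging
from logging.handlers import SysLogHandler

def make_papertrail_logger(host, port, name="batch"):
    """Return a logger that ships records to a Papertrail syslog endpoint.

    host/port come from Papertrail's Settings -> Log Destinations page.
    Default transport is UDP syslog; nothing is buffered locally.
    """
    handler = SysLogHandler(address=(host, port))
    handler.setFormatter(
        logging.Formatter("my-job: %(levelname)s %(message)s"))
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    return logger

# Usage on the job server (placeholder destination):
# log = make_papertrail_logger("logs2.papertrailapp.com", 55555)
# log.info("job started")
```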

Connecting Google Compute Engine with Google App Engine

I am trying to connect a Google Compute Engine instance running a MySQL database to Google App Engine using Laravel. I have already connected my Google App Engine app to a Cloud SQL instance; I don't have a problem with that, but I need an additional connection to the database located on Google Compute Engine.
Google Compute Engine instance is on a different project. This is my scheme:
Project A -> Compute Engine -> Instance -> MySQL database
Project B -> App Engine -> Laravel
Project B -> Cloud SQL -> DB-instance -> MySQL database
This is my app.yaml file:
runtime: php
env: flex

runtime_config:
  document_root: public

# Ensure we skip ".env", which is only for local development
skip_files:
  - .env

env_variables:
  # Put production environment variables here.
  APP_LOG: "errorlog"
  APP_KEY: "[KEY]"
  STORAGE_DIR: "/tmp"
  CACHE_DRIVER: "database"
  SESSION_DRIVER: "database"
  APP_DEBUG: "true"
  # CLOUD SQL database connection
  DB_CONNECTION: "[DATABASE_1_NAME]"
  DB_HOST: "[CLOUD_SQL_INSTANCE_NAME]"
  DB_DATABASE: "[DATABASE_1_NAME]"
  DB_USERNAME: "root"
  DB_PASSWORD: "[PASSWORD]"
  DB_SOCKET: "/cloudsql/[PROJECTB]:[REGION]:[CLOUD_SQL_INSTANCE_NAME]"
  # COMPUTE ENGINE database connection
  DB_HOST_2: "[COMPUTE_ENGINE_INSTANCE_NAME]"
  DB_DATABASE_2: "[DATABASE_2_NAME]"
  DB_USERNAME_2: "root"
  DB_PASSWORD_2: "[PASSWORD]"

beta_settings:
  cloud_sql_instances: "[PROJECTB]:[REGION]:[CLOUD_SQL_INSTANCE_NAME]"
I assume you're using the Google App Engine flexible environment with PHP (your app.yaml sets env: flex).
When you connect a GAE Application to a DB hosted on a Google Compute Engine instance, it is treated as a connection to an "external DB". Thus, you need to consider GAE as an external client of your GCE instance.
Therefore, you must configure your VPC firewall to accept incoming connections on port 3306, and use the GCE instance's external IP address as the host in your connection string. Because App Engine's outbound IP addresses are dynamic, the rule may need to allow a wide source range (e.g. 0.0.0.0/0); if so, make sure MySQL itself enforces strong authentication.
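A quick way to verify the firewall rule from outside the VPC is to test whether the port accepts TCP connections at all. A minimal sketch using only the standard library (any host you pass in would be your instance's external IP; none is assumed here):

```python
import socket

def mysql_port_open(host, port=3306, timeout=3.0):
    """Quick reachability check: can we open a TCP connection to MySQL?

    `host` should be the GCE instance's *external* IP address. A False
    result usually means the VPC firewall rule is missing or wrong, or
    mysqld is bound only to localhost.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns True but Laravel still cannot connect, the problem is at the MySQL layer (user grants, bind-address) rather than the firewall.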
Hope this helps.

Communication between linked docker containers over http for api gateway

I'm currently working on a golang web app which is currently one application consisting of numerous packages and is deployed in an individual docker container. I have a redis instance and a mysql instance deployed and linked as separate containers. In order to get their addresses, I pull them from the environment variables set by docker. I would like to implement an api gateway pattern wherein I have one service which exposes the HTTP port (either 80 for http or 443 for https) called 'api' which proxies requests to other services. The other services ideally do not expose any ports publicly but rather are linked directly with the services they depend on.
So, api will be linked with all the services except for mysql and redis. Any service that needs to validate a user's session information will be linked with the user service, and so on. My question is: how can I make my HTTP servers listen for requests on the ports that Docker links between my containers?
The simplest way to do this is Docker Compose. You simply define which services you want, and Docker Compose automatically links them in a dedicated network. Suppose you have your goapp, a redis instance, and a mysql instance, and want to use nginx as your reverse proxy. Your docker-compose.yml file looks as follows:
services:
  redis:
    image: redis
  mysql:
    image: mysql
  goapp:
    image: myrepo/goapp
  nginx:
    image: nginx
    volumes:
      - /PATH/TO/MY/CONF/api.conf:/etc/nginx/conf.d/api.conf
    ports:
      - "443:443"
      - "80:80"
The advantage is that any service can reference any other service by its name. So from your goapp you can reach your MySQL server under the hostname mysql, and so on. The only ports exposed to the host machine are 443 and 80 of the nginx container.
You can start the whole system with docker-compose up!
