Can docker containers be run on live web servers? - heroku

I read that Heroku uses what they call Cedar containers in their infrastructure, which allows developers to use containerisation in their apps hosted on Heroku. If I'm not mistaken, that is; I'm new to all this.
Is it possible to run Docker containers on web servers and integrate them as part of your website? Or, at least, to come up with a method of converting Docker containers into Cedar containers or something similar that is compatible with the web server?
On your own private server I see no reason why you couldn't do this, but when it comes to commercial web hosting services, where does this stand?

You are not running "Docker on a web server", but rather "Docker with a web server".
That is, you are supposed to package your app into a Docker image together with some kind of web server.
After that, you can reach the app in its container like a regular web site. You can also host the container with a Docker hosting provider (for example, Docker Cloud, sloppy.io, ...).
As for Heroku, maybe you'll find this helpful.
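As a minimal sketch of what "packaging your app into Docker together with a web server" can look like, assuming a static site in ./site served by nginx (both the directory name and the choice of nginx are just illustrative):
# write a two-line Dockerfile: start from the stock nginx image and copy the site in
cat > Dockerfile <<'EOF'
FROM nginx:alpine
COPY ./site /usr/share/nginx/html
EOF
# build the image and run the container, exposing it on host port 8080
docker build -t my-site .
docker run -d -p 8080:80 my-site
After that, the app answers on http://<docker-host>:8080 like any other web site, and the same image can be pushed to a registry and run on whichever Docker host you pick.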

Related

How to determine if an asp.net app is running on an azure vm

We have an ASP.NET app that gets deployed both on-prem and on Azure VMs. We are trying to figure out how to configure the app so that, when deployed on an Azure VM, it will use Azure App Configuration Service, but when deployed on-prem it will continue to use the settings in the config files.
How can we know on app start up whether or not we are deployed on an Azure VM?
If you can, I would recommend adding a special environment variable when you provision your Azure VM or deploy your application. If not, you can use the Azure Instance Metadata Service to tell whether the code is running on an Azure VM.
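As a rough sketch of the Instance Metadata Service check (the endpoint and the Metadata: true header are the documented ones; the api-version value is just one known-good example):
# IMDS is only reachable from inside an Azure VM at this link-local address,
# and the request must not be sent through a proxy
curl -s -H "Metadata: true" --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2021-02-01"
On an Azure VM this returns JSON describing the instance; on-prem the call simply fails or times out, which your start-up code can treat as "not running on Azure".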

Azure App Service Docker Linux Deployment - Where are my files?

I have an Azure CI pipeline that deploys a .NET Core API to a Linux Docker image and pushes it to our Azure Container Registry. The files are deployed to /var/lib/mycompany/app using docker-compose and a Dockerfile. This image is then used by an App Service which provides our API. The app starts fine and works, but if I go to Advanced Tools in the App Service and run a bash session, I can see all the log files generated by Docker, yet I can't see any of the files I deployed in the locations I deployed them to. Why is this, and where can I find them? Is it an additional volume somewhere, a symbolic link, a layer in Docker I need to access by some mechanism, a host of some sort, or black magic?
Apologies for my ignorance.
All the best,
Stu.
Opening a bash session using the Advanced Tools will open the session in the underlying VM running your container. If you want to reach your container, you need to install an ssh server in it and use the SSH tab in the Advanced Tools or the Azure CLI.
az webapp create-remote-connection --subscription <subscription-id> --resource-group <resource-group-name> -n <app-name> &
How to configure your container
How to open an SSH session
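Putting the two together, a hedged sketch of the full flow (assuming your image already runs an SSH server on port 2222 as described in the container-configuration link above; <local-port> is whatever port the az command reports when it opens the tunnel):
# open a tunnel from a local port to the container's SSH endpoint
az webapp create-remote-connection --subscription <subscription-id> --resource-group <resource-group-name> -n <app-name> &
# connect through the tunnel; once inside the container you can inspect /var/lib/mycompany/app directly
ssh root@127.0.0.1 -p <local-port>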

Nginx Docker Web Server behind Corporate proxy

I've gone through the various questions here and I can't seem to get this working.
I have an Ubuntu server running Docker.
I have laradock which has a lot of options on running a web server, sql server, php, etc.
This ubuntu server is behind a corporate network.
The nginx, php-fpm and mysql containers are hosting a laravel app.
When the nginx docker container needs to access the internet, I need it to go through the corporate proxy server.
Can someone please point me in the right direction as to where to configure this? On the Docker host, on the containers themselves, or on all of the containers?
Thanks!
See https://docs.docker.com/network/proxy/, section Configure the Docker client.
Note: You will need to restart your containers after updating the ~/.docker/config.json file.
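For reference, the relevant fragment of ~/.docker/config.json from that page looks like this (the proxy host, port and no-proxy list are placeholders for your corporate values; merge it into your existing file rather than overwriting it):
{
  "proxies": {
    "default": {
      "httpProxy": "http://proxy.example.com:3128",
      "httpsProxy": "http://proxy.example.com:3128",
      "noProxy": "localhost,127.0.0.1"
    }
  }
}
Docker injects these as HTTP_PROXY/HTTPS_PROXY/NO_PROXY environment variables into containers created after the change, which is why the existing nginx, php-fpm and mysql containers need to be recreated or restarted to pick them up.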

Docker for Windows Swarm IIS Service with Win10 Insider running but unreachable

I'm currently experimenting with Swarm services on Docker for Windows. The new Windows 10 Insider build supports overlay networking for Windows containers, and I was pleased to see my IIS service actually starting. The only issue I came across is that I cannot reach the service in the browser, despite trying multiple things such as different ports and networks. The command issued is as follows:
docker service create --name webfarm -p 80:80 microsoft/iis
I have also tried to use the --network flag to try different networks and I have made sure to test all IP addresses visible in the docker service inspect webfarm command.
docker service ps webfarm does indicate that my service is in state RUNNING and does not have any errors, so I don't know what else I can try, especially since these commands worked fine on Linux with Apache.
I was wondering if anyone has been able to successfully create a service using Windows Containers on the Windows Insider build (15046), and if so, how?
Never mind, I found that this actually is not supported yet.
The following source states:
"At the moment only DNS round robin is implemented as described in the Microsoft blog post. You cannot use to publish ports externally right now. More to come in the near future." (https://stefanscherer.github.io/docker-swarm-mode-windows10/)
And indeed, the blog post states the following:
"Currently, Windows supports DNS Round-Robin load balancing between services. The routing mesh for Windows Docker hosts is not yet supported, but will be coming soon. Users seeking an alternative load balancing strategy today can setup an external load balancer (e.g. NGINX) and use Swarm’s publish-port mode to expose container host ports over which to load balance." (https://blogs.technet.microsoft.com/virtualization/2017/02/09/overlay-network-driver-with-support-for-docker-swarm-mode-now-available-to-windows-insiders-on-windows-10/)
I guess I'll have to wait for this feature; in the meantime I will use the alternative.
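For anyone landing here later, the publish-port alternative mentioned in the quotes looks roughly like this (a sketch only; I have not verified it on Insider build 15046):
# publish the port in host mode on each node instead of through the (unsupported) routing mesh
docker service create --name webfarm --publish mode=host,target=80,published=80 microsoft/iis
# an external load balancer such as NGINX would then balance across the node IPs on port 80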

How can I setup and deploy a database with Deis (PaaS)

I'm trying to set up a database with Deis. I know this is possible, but there doesn't seem to be any documentation about how to do it other than setting an ENV variable. How could I set up, say, a MongoDB or Cassandra Docker container, deploy it, and have my Deis app use it?
If you're trying to deploy now, a possible solution is to set up a Docker container, make it publicly routable, and then configure your application to use that container through an environment variable, following Heroku's twelve-factor app best practices. There is a feature request for a Deis service gateway that would act like Heroku's Add-on Marketplace, but it's not there yet.
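A rough sketch of that workaround, assuming a MongoDB container on a host you control (the hostname, database name and app name are placeholders) and the stock deis config:set command:
# run the database in its own container, published on the host so it is reachable from outside
docker run -d --name mongo -p 27017:27017 mongo
# point the Deis app at it through an environment variable, 12-factor style
deis config:set DATABASE_URL=mongodb://db.example.com:27017/myapp -a myapp
The app then reads DATABASE_URL from its environment instead of hard-coding connection details.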
