How can I set up and deploy a database with Deis (PaaS) - Heroku

I'm trying to set up a database with Deis. I know this is possible, but there doesn't seem to be any documentation about how to do it other than setting an ENV variable. How could I set up, say, a MongoDB or Cassandra Docker container, deploy it, and have my Deis app use it?

If you're trying to deploy now, a possible solution is to set up a Docker container, make it publicly routable, and then configure your application to use that container through an environment variable, following Heroku's twelve-factor app best practices. There is a feature request for a Deis service gateway that would act like Heroku's Add-on Marketplace, but it isn't there yet.
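For example, here is a minimal sketch of that approach, assuming you run MongoDB in a Docker container on a host your Deis cluster can reach; the host name, variable name, and app name are placeholders, not anything prescribed by Deis:

```bash
# On a publicly routable host (placeholder: db.example.com), start MongoDB in a container.
docker run -d --name mongo -p 27017:27017 mongo

# Point the Deis app at it through an environment variable, twelve-factor style.
# MONGO_URL is just a conventional name; your app reads whatever variable you choose.
deis config:set MONGO_URL=mongodb://db.example.com:27017/myapp -a myapp
```

Your application then reads that variable at startup, exactly as it would read the URL a Heroku add-on exposes.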

Related

Is it possible to deploy more than one Strapi.io app on the same Heroku dyno?

I have read several tutorials about deploying Strapi to Heroku, and all of them cover deploying a single Strapi app. In addition, I would like to understand whether it is possible to deploy, for example, two different Strapi applications to the same Heroku dyno, and whether doing so has any disadvantages.
Yes, it's possible.
You need to use an Nginx virtual host, as described in Strapi's Nginx proxying documentation; it's mentioned in the Amazon AWS part of the documentation. A sketch of the virtual-host setup is shown after the link below.
https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-with-http-2-support-on-ubuntu-18-04
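As a rough illustration of the virtual-host idea on a server you control (the linked guides target DigitalOcean/AWS rather than a Heroku dyno): the two Strapi apps listen on different local ports and nginx routes by host name. The domains, ports, and file path below are placeholders.

```bash
# Write two nginx server blocks, one per Strapi app (hypothetical domains and ports).
sudo tee /etc/nginx/conf.d/strapi-apps.conf > /dev/null <<'EOF'
server {
    listen 80;
    server_name app1.example.com;
    location / {
        proxy_pass http://127.0.0.1:1337;   # first Strapi instance
        proxy_set_header Host $host;
    }
}
server {
    listen 80;
    server_name app2.example.com;
    location / {
        proxy_pass http://127.0.0.1:1338;   # second Strapi instance
        proxy_set_header Host $host;
    }
}
EOF

# Validate and reload the configuration.
sudo nginx -t && sudo systemctl reload nginx
```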

How to host Moqui on AWS EC2

Is there a way to host Moqui on AWS? I was trying to host Moqui using an EC2 instance but couldn't figure out how to connect them.
The Run and Deploy document on moqui.org has a section for a simple recommended deployment using ElasticBeanstalk and RDS:
https://www.moqui.org/m/docs/framework/Run+and+Deploy#AWSElasticBeanstalkandRDS
With more details about how you want to set things up on AWS, the answer might vary from this.
For clustered setups things get more involved: you need the right settings for Hazelcast AWS discovery, and it is best to use an external ElasticSearch server (such as an AWS Elasticsearch instance) and configure Moqui, through environment variables, to use the Java REST client mode instead of the embedded node mode. The settings for the moqui-hazelcast and moqui-elasticsearch components can be seen in the MoquiConf.xml file in each component.
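As one possible way to apply such settings in the Elastic Beanstalk deployment, environment variables can be set from the AWS CLI; the environment name and the setting name below are placeholders (check MoquiConf.xml in the relevant component for the actual variable names):

```bash
# Set an environment variable on an Elastic Beanstalk environment.
# "moqui-prod" and "elasticsearch_url" are hypothetical names used for illustration only.
aws elasticbeanstalk update-environment \
    --environment-name moqui-prod \
    --option-settings \
        Namespace=aws:elasticbeanstalk:application:environment,OptionName=elasticsearch_url,Value=https://my-es-domain.us-east-1.es.amazonaws.com
```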

Continuously deploying features to Spring Boot application hosted by AWS

I am looking for advice/ideas on how to continuously deploy new features to a Spring Boot web application that is hosted on an AWS EC2 instance. My current workflow:
1. bootRepackage my application to create a war file.
2. Upload that file to AWS.
3. Add a new feature to my application.
4. bootRepackage again.
5. Remove the current war from AWS, and upload the new one.
This is obviously not a good workflow, as the application needs to be restarted, which could result in (1) downtime and (2) entries in the database being lost (if I were using Spring's default H2 database; I'm actually using a standalone SQL server, but I mention it to make the point), so I want to streamline it.
Is there any way to add a new feature to the current instance of the service on AWS? Is it possible to recompile the code "on the fly" to avoid having to restart the application?
Is there any way of creating a better setup that would allow me to simply merge a new branch into master locally and push it, keeping the same instance in production but with the new feature?
Thank you in advance!
Update: is this really the correct answer?
If you are using a single AWS EC2 instance and deploying the application to it, assign an Elastic IP to that instance.
An Elastic IP address is a static IPv4 address designed for dynamic cloud computing. An Elastic IP address is associated with your AWS account. With an Elastic IP address, you can mask the failure of an instance or software by rapidly remapping the address to another instance in your account.
Deploy the new version of the application on another AWS EC2 instance.
When the application is ready, reassign the Elastic IP from the existing EC2 instance to the new EC2 instance.
Elastic IPs are the simplest way to implement the blue-green switch.
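A minimal sketch of that switch with the AWS CLI, using placeholder allocation and instance IDs:

```bash
# After the new application version is running on the second EC2 instance,
# move the Elastic IP to it. The IDs below are placeholders.
aws ec2 associate-address \
    --allocation-id eipalloc-0123456789abcdef0 \
    --instance-id i-0fedcba9876543210 \
    --allow-reassociation
```

Traffic aimed at the Elastic IP now reaches the new instance; the old instance can be terminated once its in-flight requests drain.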

Can docker containers be run on live web servers?

I read that Heroku uses what they call Cedar containers in their infrastructure, which allows developers to use containerisation in their apps hosted on Heroku. If I'm not mistaken, that is; I'm new to all this.
Is it possible to run Docker containers on web servers and integrate them as part of your website? Or, at least, is there a method of converting Docker containers into Cedar containers, or something similar that is compatible with the web server?
On your own private server I see no reason why you couldn't do this, but where does this stand with commercial web hosting services?
You are not running "Docker on a web server"; you are running "Docker with a web server".
That is, you are supposed to package your app into a Docker image along with some kind of web server.
After that, you can reach the app in this container like a regular web site. You can also host the container on a Docker host (for example, Docker Cloud, sloppy.io, ...).
As for Heroku, you may find this helpful.
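A small sketch of that idea, with a static site behind nginx as the simplest case of "app packaged with its web server"; the image name and content path are placeholders:

```bash
# Build an image that bundles the app with a web server.
cat > Dockerfile <<'EOF'
FROM nginx:alpine
# Copy the site into the directory nginx serves by default (source path is a placeholder).
COPY ./site/ /usr/share/nginx/html/
EOF
docker build -t my-site .

# Publish container port 80 on the host so the container answers like a regular web site.
docker run -d --name my-site -p 80:80 my-site
```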

cloudfoundry NoHostAvailableException while deploying app

I have deployed my local Cloud Foundry instance. When I try to deploy my application, which requires Cassandra to be up and running, Cloud Foundry throws com.datastax.driver.core.exceptions.NoHostAvailableException. The Cassandra host is set up on an independent server.
However, when I ping that host from the machine on which CF is installed, the ping succeeds. The Cassandra host is also accessible from my local computer and works fine with my Eclipse deployment.
How can I make cloudfoundry recognize this host?
You will need to make sure that (a) your application has access to the address and credentials needed to reach the Cassandra server, and that (b) networking (and maybe DNS) is set up so that your application instances can actually reach the Cassandra server.
For (a), you will want to bind your application to a "user-provided service instance". For (b), you need to make sure your application's running security groups allow it to reach your Cassandra server.
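Roughly, with the cf CLI it looks like the sketch below; the service name, credentials, CIDR, org, and space are placeholders, and the exact bind-security-group arguments should be checked against your CLI version.

```bash
# (a) Expose the Cassandra address/credentials to the app as a user-provided service.
cf create-user-provided-service my-cassandra \
    -p '{"contact_points":"10.0.1.20","port":"9042","username":"app","password":"secret"}'
cf bind-service my-app my-cassandra
cf restage my-app

# (b) Allow outbound traffic from app containers to the Cassandra server (CQL port 9042).
cat > cassandra-sg.json <<'EOF'
[{"protocol": "tcp", "destination": "10.0.1.20/32", "ports": "9042"}]
EOF
cf create-security-group cassandra-sg cassandra-sg.json
cf bind-security-group cassandra-sg my-org my-space
```

The application then reads the contact points and credentials from VCAP_SERVICES rather than hard-coding them.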
