Where to host Meteor apps? - heroku

I want to build a Meteor app to run a little side project/business. With this in mind:
I want a cheap environment (online) to test and share my progress
I want the option to scale it up for production in a few months
I want to use some standard command line tools to push to this service
The database options also have to scale up if I need more
I've started looking into Heroku, but are there any "good practices" anybody can recommend? I have never hosted a Meteor app, and I want to avoid a private server because of the administration overhead.

Meteor apps are immediately ready to deploy to Heroku. Your question is very broad, but Heroku fits the bill for every parameter you specified.
Here's a flow for creating an example meteor app and deploying it:
$ meteor create --example leaderboard
$ cd leaderboard
$ git init . && git add .
$ git commit -m "First commit"
$ heroku create --buildpack https://github.com/jordansissel/heroku-buildpack-meteor
$ git push heroku master
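One thing the flow above does not show: in production a Meteor app typically needs MONGO_URL and ROOT_URL set. Treat the following as a hedged sketch, since the exact variable names depend on the buildpack you chose, and the MongoDB URL and app URL below are placeholders:
$ heroku config:set MONGO_URL=mongodb://user:pass@host:27017/mydb
$ heroku config:set ROOT_URL=https://my-app.herokuapp.com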

There are a few options:
Meteor.com -- the easiest option, free hosting from Meteor. Super easy deployments, but limited resources until the Galaxy platform is complete. Not suitable for very high load/traffic sites (yet).
PaaS providers -- cloud hosting where you are responsible for your app, and the provider manages the infrastructure for you. Generally small virtual machines which host node apps, with great deployment tools.
Own Infrastructure -- info on how to host on your own servers, or via IaaS providers like Amazon AWS, Digital Ocean, Rackspace, etc. Most complicated option.
Deployment Services - services to manage deployment to your own servers (or IaaS servers). Advantages of the above but with the management taken care of for you (managed deployment, monitoring, etc).
Source: Meteorpedia

I have been hosting some Meteor apps on Webfaction and it has been working fine so far.
Here is a tutorial:
http://racingtadpole.com/blog/meteor-mongodb-webfaction/

Related

Migrating From Digital Ocean to Heroku

I have an app that uses Flash, MariaDB, and Python which I currently host on Digital Ocean. I am planning to move to Heroku for scalability purposes, but I am unable to find any resources online talking about the process of migrating data from Digital Ocean to Heroku. Can anyone explain the process? Thanks
Heroku provides Dynos (application runtime) and Add-Ons (databases and other tools).
First create a Dyno (it can also be free) and deploy your Python application (you can push the code from Git or create a Docker image for the Heroku Container Registry).
Then enable a MariaDB add-on to set up your storage (there is a free plan here too; check if it is suitable in your case).
You need to provide the MariaDB connection string as an environment variable; this is explained here.
Finally, it is a good idea to use Config Vars for all environment variables that your application might need (e.g. tokens, secrets).
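As a hedged sketch of that setup (the add-on name, the config var it exposes, and the app name are assumptions; Heroku's MariaDB options are third-party add-ons such as JawsDB):
$ heroku addons:create jawsdb-maria -a my-app        # provision a MariaDB add-on
$ heroku config:get JAWSDB_MARIA_URL -a my-app       # the connection string exposed as a config var
$ heroku config:set APP_SECRET=change-me -a my-app   # other tokens/secrets go in Config Vars too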

Is it possible to deploy a raw Java web app to Heroku?

Very inexperienced user here...please be patient!
I inherited maintenance of a Heroku app from someone no longer with the company. Having to re-deploy an app update is probably a once-a-year event, and here we are.
The instructions I have include building a standalone jar file containing my app and then deploying it to Heroku. Specifically the procedure for this is to use the Heroku CLI with the following command:
heroku deploy:jar webapp.jar -a my-app
Easy enough. Except he had his own instance of the Heroku CLI, and when I went to download my own copy, it appears that the deploy command no longer exists! Is this the case? Is this a deprecated command? Do I need to go through the process of figuring out how to set up a git repository to deploy this? (We are in fact using git to manage the source for this app, but it's behind our company firewall, so I'm not sure how practical/difficult it will be to set this up for Heroku). I just want to make sure I'm not missing something simple before investing a significant amount of time re-inventing the deployment process. Thanks.
The most popular mechanism is indeed to push the code from git to Heroku, providing the necessary files (i.e. a Procfile) to deploy the runtime.
An alternative is to create a Docker image and push it to the Heroku Container Registry (which in your case would require more reworking).
Refer to Deploy with Git. The firewall should not be a problem, as Heroku does not pull your code; you perform the push yourself (git push heroku master).
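For a jar-based app deployed via git, the Procfile could look something like the line below (the jar path is an assumption and depends on how your build produces it; the process must listen on the port given in $PORT):
web: java -jar target/webapp.jar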
I have to answer my own question because I was able to find the solution.
It turns out there is a plugin available for the heroku CLI that provides the deploy command. Running heroku plugins:install java will install the plugin that provides the deploy command in the heroku CLI.
See https://devcenter.heroku.com/articles/deploying-executable-jar-files for more information.
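Putting the two together, the occasional redeploy stays as simple as before:
$ heroku plugins:install java
$ heroku deploy:jar webapp.jar -a my-app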

Is Heroku Release Phase one-off dyno exposed to the public network?

I want to run GhostInspector (or similar) as part of CI/CD. The tests would execute as part of the release phase for staging, then if successful, production will be deployed. Does this work? Is the dyno publicly accessible with the usual staging url?
Short answer: one-off dynos are not publicly accessible from the internet.
For your use case, the way to do this on Heroku would be Pipelines:
deploy your app to a testing-stage app
run Ghost Inspector there
promote to production/staging afterwards
(you can control all of this via the CLI or API requests)
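As a sketch of the promotion step with the CLI (the app name is an assumption):
$ heroku pipelines:promote -a my-app-staging   # promotes the tested staging release to the downstream production app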
Alternative: find a tool that runs headless (inside the dynos) or can be called from the dynos (BrowserStack can be used like this in a CI/CD pipeline).

How to run InfluxDB on Heroku?

Is it possible, and if so, how? I'd like to be able to reach it from my existing Heroku infrastructure.
Will I need a Procfile? From what I understand it's just a standalone binary written in Go, so it shouldn't be that hard to deploy. I'm just curious how to do it, because I don't think I understand the ins and outs of Heroku deployment.
Heroku Dynos should not be used to deploy a database application like InfluxDB.
Dynos are ephemeral servers. Data does not persist between dyno restarts and cannot be shared with other dynos. Practically speaking, any database application deployed on a dyno is essentially useless. This is why databases on Heroku (e.g. Postgres) are all Add-ons. InfluxDB should be set up on a different platform (like AWS EC2 or a VPS) since a Heroku Add-on is not available.
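For comparison, a Heroku-managed database is provisioned as an add-on rather than run on a dyno (the app name below is an assumption):
$ heroku addons:create heroku-postgresql -a my-app   # provisions Postgres and sets the DATABASE_URL config var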
That said, it is possible to deploy InfluxDB to a Heroku dyno.
To get started, it is important to understand the concept of a 'slug'. Slugs are containers (similar to Docker images) which hold everything needed to run a program on Heroku's infrastructure. To deploy InfluxDB, an InfluxDB slug needs to be created.* There are two ways to create a slug for Go programs:
Create a slug directly from a Go executable as described here.**
Build the slug from source using the Heroku Go buildpack (explained below).
To build the slug from source using a buildpack, first clone the InfluxDB Github repo. Then add a Procfile at the root of the repo, which tells Heroku the command to run when the dyno starts up.
echo 'web: ./influxd' > Procfile
The Go buildpack requires all dependencies be included in the directory. Use the godep dependency tool to vendor all dependencies into the directory.
go get github.com/tools/godep
godep save
Next, commit the changes made above to the git repo.
git add -A .
git commit -m dependencies
Finally, create a new app and tell it to compile with the Go buildpack.
heroku create -b https://github.com/kr/heroku-buildpack-go.git
git push heroku master
heroku open   # Open the newly created InfluxDB instance in the browser.
Heroku will show an error page, because Heroku's 'web' process type requires the app to listen for incoming requests on the port given by the $PORT environment variable; otherwise it kills the dyno. InfluxDB's API and admin panel run on ports 8086 and 8083, respectively.
Unfortunately, InfluxDB does not allow those ports to be set from environment variables, only through the config file (/etc/config.toml). A small bash script, executed before InfluxDB starts up, could set the correct port in the config file.
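A minimal sketch of such a wrapper, assuming a config.toml checked into the repo whose API port line reads port = 8086 (the script name, file name, and the -config flag spelling are assumptions to verify against your InfluxDB version):
$ cat start.sh
#!/bin/bash
# Rewrite the API port to Heroku's $PORT, then start InfluxDB in the foreground
sed -i "s/port = 8086/port = ${PORT}/" config.toml
exec ./influxd -config config.toml
Make it executable (chmod +x start.sh), commit it, and change the Procfile to web: ./start.sh.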
Another problem: Heroku only exposes one port per dyno, so the API and the admin panel cannot be exposed to the internet at the same time. A smart reverse proxy could work around that issue using Heroku's X-Forwarded-Port request header.
Bottom line, do not use Heroku dynos to run InfluxDB.
* This means the benefits of a standalone Go executable are lost when deploying to Heroku, since it needs to be recompiled for Heroku's stack.
** Creating a slug directly from the InfluxDB executable does not work because there is no built-in way to listen to the right port given by Heroku in the $PORT environment variable.
I like to think anything is possible on a Heroku node when using a custom buildpack, but there are some considerations when hosting with Heroku:
ops, e.g. backup and monitoring (does it entail installing extra services, opening extra ports, etc.? Heroku might get in the way here)
performance, considering dyno size
and if you need a larger dyno, cost becomes an issue. You'll get more bang for your buck when you go the IaaS route.
other "features" of a dyno, e.g. disk ephemerality
I highly recommend hosted InfluxDB or spinning up your own on a VPS, all of which you can point your existing Heroku-based apps to. It will then help to get those instances as close together as possible (i.e. same region, or co-located if possible), presuming a need for low latency between DB and app stack.

Setting up a collaborative environment for web application development

My office is growing and I've been tasked with building out the IT for our web development.
What's the best tool/setup for doing web development in a group setting? The requirements are a centralized code repository, a location to test development code, and finally a way to push tagged code out to a staging server. What I'm thinking is SVN/Redmine for the code repo; each user has an account on a central development machine to allow for SSH access (Eclipse over SSH) and their own virtual host on the dev server, which gives everyone a centralized development sandbox. Code is written and tested on this dev box, then checked back into SVN and later tagged and pushed out to the staging server. Thoughts, comments, or recommendations?
*Also, in a dev environment, what is the best way to handle databases? Is it wise to pull from the production database? Should each developer have their own DB or work off a master DB?
**We are building a Magento application and also have some custom back-office tools that run on CakePHP.
Although this subject is off-topic on Stack Overflow and has been flagged as such, these are the areas you need to concentrate on:
VERSION-CONTROL
Git has all the glory, and you don't need your own box for this, as https://bitbucket.org/ offers unlimited data and private/public repos, so you can keep your codebase there. http://github.com is also powerful and the de facto most popular version-control service out there, although it comes at a small price.
So your master branches live in your version control, and your devs will check out from there and commit to it as well.
Your deployment tools will deploy to your live and staging environments from your master.
ENVIRONMENTS
Usually three are used: LIVE, STAGE, DEV.
LIVE is, well, live, and only approved code gets deployed there.
STAGE is the pre-live environment and should be an exact replica of LIVE, so everything can be tested there by the merchant.
DEV is nice to have as an exact replica too, but it can just as well be the developer's local environment; it is meant for loose testing and experimenting.
DATABASES AND DEPLOYMENT
MySQL databases are a pain to sync, so you had better have a script for it that syncs from LIVE to the others and prevents syncing from other environments to LIVE. This limitation also means that all configuration and content should be added on LIVE only and then synced down the line. Every change to the schema or a permanent setting should be handled by update scripts (as we are talking Magento CE; Magento EE has migrations built in).
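A minimal sketch of such a one-way sync (host, user, and database names are all hypothetical, and passwords are assumed to come from the environment):
$ mysqldump -h live-db-host -u deploy -p"$LIVE_DB_PASS" live_magento \
    | mysql -h stage-db-host -u deploy -p"$STAGE_DB_PASS" stage_magento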
For deployment I also suggest you build a Fabric or Capistrano script that resets the dev and staging environments, handles the database reset and pull from the LIVE DB, and imports code from the central repository.
It's also a good idea to target the following everyday tasks:
Clients need to reset the stage for their tests.
Project managers, developers, or testers need to test, so spawning a test clone should be a one-click action (take the current DB and code and make it live in some subfolder for that specific test only), as should deleting the test; see the sketch after this list.
Third-party devs might need access to a specific test or dev environment (this is common with Magento, as on average there are at least 10 external extensions installed in every Magento store).
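A very rough sketch of the one-click test clone mentioned above (every path, host, and database name is a made-up assumption, and MySQL credentials are assumed to come from ~/.my.cnf):
$ cat spawn-test.sh
#!/bin/bash
# Create a throwaway clone of the current stage code and database for one specific test
TEST="test_$(date +%Y%m%d%H%M%S)"
cp -r /var/www/stage "/var/www/$TEST"          # copy the current code into its own subfolder
mysqladmin create "$TEST"                      # fresh database for this clone
mysqldump stage_magento | mysql "$TEST"        # seed it from the stage database
echo "Test clone at /var/www/$TEST using database $TEST"
For Magento specifically you would still need to point the clone's configuration (e.g. app/etc/local.xml in Magento 1) and base URLs at the new database and subfolder, which this sketch leaves out.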
