I'm following the tutorial to deploy a Ruby app to Google Compute Engine. Everything works; however, I now want to SSH into the app to run migrations etc. After some searching I was able to find my files under a Docker instance here: /var/lib/docker/aufs/diff/e2972171505a931749490e13d21e4f8c0bb076245ef4b592aff6667c45b2dd13/app
Is there a simpler way to access my files? Perhaps a symlinked folder?
Ruby apps on Google App Engine run via Docker. Because App Engine is a PaaS, running commands on production machines is discouraged (though possible). If you'd like to run database migrations, please run them locally and point your configuration at your production database.
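For example, a minimal sketch assuming a Rails-style app and a production database reachable from your machine (the DATABASE_URL value is a placeholder for your own connection string):

# Run the migration locally, but against the production database.
# The connection string below is a placeholder -- substitute your own.
DATABASE_URL="postgres://USER:PASSWORD@PRODUCTION_HOST:5432/APP_DB" \
  RAILS_ENV=production bundle exec rake db:migrate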
Related
I have just started learning Ruby. I have set up a basic Sinatra project and am wondering where I can deploy it for free, without credit card details, since Heroku no longer has a free tier. Also, I generate random data with seeds.rb; would that data still be usable after deployment?
I have tried a couple of other platforms (Render, Railway, AWS…), but they are either not free or hard to deploy to.
Currently, Fly.io and DigitalOcean are popular replacements for Heroku.
Yes, you can seed the database used in the deployment.
Then you can either:
Run rake db:seed on the deployed application, or
Connect to the database from your development machine and run rake db:seed.
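For example, a rough sketch of both options (the DATABASE_URL value is a placeholder for the deployed database's connection string, and it assumes your app and Rakefile read it):

# Option 1: run the seed task on the deployed application itself
bundle exec rake db:seed

# Option 2: run it from your development machine against the deployed database
DATABASE_URL="postgres://USER:PASSWORD@HOST:5432/APP_DB" bundle exec rake db:seed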
I have just started developing a Go app and have deployed it on Google App Engine. But when I try to connect my local server to a Cloud SQL instance through the proxy, I can connect only through TCP.
However, when connecting to the same Cloud SQL instance from App Engine, I can connect only through a Unix socket.
To cope with this, I have made changes to my local environment handler file so that it can adapt to both the local and the GCloud configuration, but I'm not sure how to skip the update for just this file when deploying to GCloud. To be clear, I don't want App Engine to delete this file; I just want the CLI to avoid uploading the new version of the handler file.
I use this command for deploying: gcloud app deploy
Currently, I deploy directly to App Engine instead of pushing through VCS. Also, if there is a way to detect whether the app is running on App Engine, that would be really great.
TIA
Got it. In case anyone gets stuck in a similar situation, we can make use of the environment variables that App Engine sets. Although the documentation lists these environment variables, I would still recommend checking them in the Cloud Console.
Documentation link for the Go 1.12+ runtime environment:
https://cloud.google.com/appengine/docs/standard/go/runtime
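A minimal sketch of how this check can look in Go, assuming the Go 1.12+ standard runtime, where GAE_ENV is set to "standard" (locally it is normally unset):

package main

import (
	"fmt"
	"os"
)

// onAppEngine reports whether the process appears to be running on
// App Engine standard, based on the GAE_ENV variable the runtime sets.
func onAppEngine() bool {
	return os.Getenv("GAE_ENV") == "standard"
}

func main() {
	if onAppEngine() {
		fmt.Println("Running on App Engine: connect to Cloud SQL via the Unix socket")
	} else {
		fmt.Println("Running locally: connect via TCP through the Cloud SQL proxy")
	}
}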
I am looking for help to containerize a Laravel application with Docker, run it locally, and make it deployable to Cloud Run, connected to a Google Cloud database.
My application is an API built with Laravel, and so far I have just used the docker-compose/Sail setup that comes with Laravel 8 for development.
Here is what I want to achieve:
Laravel app running on Cloud Run.
Database in Google Cloud: MySQL, PostgreSQL, or SQL Server (preferably MySQL).
Environment variables stored in Google Cloud.
My problem is that I can't find any info on whether or how to use/rewrite the docker-compose file in Laravel 8, create a Dockerfile or Cloud Build file, and build it for Google Cloud.
Maybe I could add something like this in a cloudbuild.yml file:
# cloudbuild.yml
steps:
  # running docker-compose
  - name: 'docker/compose:1.26.2'
    args: ['up', '-d']
Any help/guidance is appreciated.
As mentioned in the comments to this question, you can check this video, which explains how to use docker-compose and Laravel to deploy an app to Cloud Run in a step-by-step tutorial.
As for connecting said app to a database, the Connecting from Cloud Run (fully managed) to Cloud SQL documentation is quite complete on that matter, and for secret management I found this article that explains how to integrate Secret Manager with Cloud Run.
I know this answer is basically just links to documentation and articles, but I believe all the information you need to get your app onto Cloud Run is in those.
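To give a rough idea of the usual shape, here is a minimal cloudbuild.yaml sketch that builds the image from a Dockerfile in the repository root and deploys it to Cloud Run; the service name laravel-api and the region are placeholders to adjust for your project:

# cloudbuild.yaml -- build the container image and deploy it to Cloud Run
steps:
  # Build the image from the Dockerfile in the repository root
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/laravel-api', '.']
  # Push the image to the registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/laravel-api']
  # Deploy the pushed image to Cloud Run
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'gcloud'
    args: ['run', 'deploy', 'laravel-api', '--image', 'gcr.io/$PROJECT_ID/laravel-api', '--region', 'us-central1', '--platform', 'managed']
images:
  - 'gcr.io/$PROJECT_ID/laravel-api'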
I have been following this guide:
https://deliciousbrains.com/scaling-laravel-using-aws-elastic-beanstalk-part-3-setting-elastic-beanstalk/
However, I am stuck at this point.
Not in the sense that something isn't working, but in terms of how it should be done properly. Which app should I deploy?
Is it the development app that is tested and then deployed? Do I create another instance in AWS that will only be used to deploy finished apps? What is the pattern to follow?
At the moment I have a local development server which runs on my PC, and also one development EC2 instance on AWS. Do I need more than that on top of Elastic Beanstalk?
Please advise me! Thanks!
You're not just looking for a pattern here, but an architecture. I'll try to help with the information you provided and describe the approach that best fits your needs.
First, it is important that you really understand what Elastic Beanstalk is and how it works. See: http://docs.aws.amazon.com/en/elasticbeanstalk/latest/dg/Welcome.html
To answer your question: applications are typically deployed to Elastic Beanstalk for scalable production use, but nothing prevents you from setting up development environments for testing, too.
You do not need to create an instance to deploy; you can deploy from your own local machine using the console, the CLI, or the API. See:
Console: https://sa-east-1.console.aws.amazon.com/elasticbeanstalk/home
EB CLI: http://docs.aws.amazon.com/en/elasticbeanstalk/latest/dg/eb-cli3.html
API: http://docs.aws.amazon.com/en/elasticbeanstalk/latest/api/Welcome.html
Having said that, I will describe a scenario that is useful in several cases:
You create an Elastic Beanstalk application from the console or CLI and configure integration with AWS CodeCommit. CodeCommit saves you from having to upload the whole project on every deploy.
You create an EC2 instance to perform deployments. This instance holds a Git repository of your project together with the Beanstalk environment settings (environment variables, for example), and deploys to Beanstalk via CodeCommit.
This scenario is very useful for a team working with Beanstalk, because you can use the deployment instance to hide sensitive details and standardize how deployments are done.
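For reference, a bare-bones flow with the EB CLI from your local machine looks roughly like this; the application name, environment names, platform, and region are placeholders:

# one-time setup in the project directory
eb init my-laravel-app --platform php --region us-east-1
# create separate environments, e.g. staging and production
eb create my-app-staging
eb create my-app-production
# deploy the current code to a chosen environment
eb deploy my-app-staging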
I'm currently working on a RESTful API in Go, using Windows and goclipse.
The testing environment consists of a few VMs managed by Vagrant. These machines contain nginx, PostgreSQL, etc. The app should be deployed into Docker on a separate VM.
There is no problem deploying the app the first time using a guide like this one: https://blog.golang.org/docker. I've read a lot of information and guides, but I'm still totally confused about how to automate the deployment process and update the Go app in Docker after changes to the code. At the current stage, code changes happen very often, so deployment should be fast.
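For concreteness, the manual cycle I currently repeat after every code change looks roughly like this (the image and container names are just placeholders):

# rebuild the image from the Dockerfile in the project root
docker build -t myapp:latest .
# replace the running container with the new image
docker stop myapp && docker rm myapp
docker run -d --name myapp -p 8080:8080 myapp:latest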
Could you please advise me on the correct way to set up some kind of local CI for this case? Which approach would be better?
Thanks a lot.