I am looking for help to containerize a Laravel application with Docker, run it locally, and make it deployable to Cloud Run, connected to a Google Cloud database.
My application is an API built with Laravel, and so far I have just used the docker-compose/Sail setup that comes with Laravel 8 during development.
Here is what I want to achieve:
Laravel app running on Cloud Run.
Database in Google Cloud: MySQL, PostgreSQL or SQL Server (MySQL preferred).
Environment stored in Google Cloud.
My problem is that I can't find any info on whether or how to use/rewrite the docker-compose file from Laravel 8, create a Dockerfile or cloudbuild file, and build it for Google Cloud.
Maybe I could add something like this in a cloudbuild.yml file:
#cloudbuild.yml
steps:
# running docker-compose
- name: 'docker/compose:1.26.2'
  args: ['up', '-d']
Any help/guidance is appreciated.
As mentioned in the comments to this question, you can check this video that explains how to use docker-compose and Laravel to deploy an app to Cloud Run, with a step-by-step tutorial.
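To make that concrete, here is a minimal sketch of a Dockerfile for a Laravel API on Cloud Run. Treat it as a starting point rather than a hardened build: the PHP image tag is one choice among many, and the only hard requirement from Cloud Run is that the container listens on the port given in $PORT (8080 by default).
# Dockerfile - minimal sketch, not a hardened production build
FROM composer:2 AS vendor
WORKDIR /app
COPY composer.json composer.lock ./
RUN composer install --no-dev --prefer-dist --no-scripts

FROM php:8.0-apache
RUN docker-php-ext-install pdo_mysql
# Cloud Run routes traffic to $PORT (8080 by default)
RUN sed -i 's/Listen 80/Listen 8080/' /etc/apache2/ports.conf \
    && sed -i 's/:80/:8080/' /etc/apache2/sites-available/000-default.conf
# serve Laravel's public/ directory and allow .htaccess rewrites
RUN sed -i 's!/var/www/html!/var/www/html/public!g' /etc/apache2/sites-available/000-default.conf \
    && a2enmod rewrite
COPY . /var/www/html
COPY --from=vendor /app/vendor /var/www/html/vendor
And a cloudbuild.yaml that builds, pushes, and deploys that image (the service name laravel-api and the region are placeholders):
# cloudbuild.yaml - build the image, push it, deploy to Cloud Run
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/laravel-api', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/laravel-api']
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: gcloud
  args: ['run', 'deploy', 'laravel-api',
         '--image', 'gcr.io/$PROJECT_ID/laravel-api',
         '--region', 'europe-west1', '--platform', 'managed']
images:
- 'gcr.io/$PROJECT_ID/laravel-api'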
As for the database connection to said app, the Connecting from Cloud Run (fully managed) to Cloud SQL documentation is quite complete on that matter, and for secret management I found this article that explains how to implement Secret Manager in Cloud Run.
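For a concrete picture of that connection: when you deploy with --add-cloudsql-instances, Cloud Run mounts a Unix socket at /cloudsql/PROJECT:REGION:INSTANCE, and Laravel's stock config/database.php already reads DB_SOCKET, so the deploy could look something like this sketch (all names and values are placeholders, and the password should come from Secret Manager rather than a plain env var):
gcloud run deploy laravel-api \
  --image gcr.io/$PROJECT_ID/laravel-api \
  --add-cloudsql-instances PROJECT:REGION:INSTANCE \
  --set-env-vars DB_CONNECTION=mysql,DB_SOCKET=/cloudsql/PROJECT:REGION:INSTANCE,DB_DATABASE=laravel,DB_USERNAME=laravel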
I know this answer is basically just links to the documentation and articles, but I believe all the information you need to get your app onto Cloud Run is in those.
Related
My Laravel app is running on Docker, using Linux commands on a Windows machine. This means that instead of using 'php artisan' commands, I use 'sail artisan' commands. I am able to migrate locally onto the SQL server on Docker (sail artisan migrate) and am able to deploy the app onto Google Cloud (gcloud app deploy). The final piece of the puzzle is migrating the database onto the Google Cloud SQL server.
When I was first setting the app up, I had problems with this so I exported the SQL server, uploaded it to Google Cloud and then deployed it manually which was fine for a one-off but I now have things I would like to change about the database structure without losing all the data. I also figured it was about time I learnt to do it properly anyway.
I have attempted to use the instructions on Google's community tutorial, however, this guide presupposes the project does not yet exist whereas I am working with a pre-existing application and a pre-existing database. I tried to jump into the tutorial halfway through but I couldn't get the Cloud SQL proxy to work.
After a bit more research, I found this article, which I got partway through; however, once I got to the part needing TCP or Unix sockets, neither set of commands would run without error on my Ubuntu terminal.
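For reference, the commands in question were the standard Cloud SQL proxy invocations, something like this (the instance connection name is a placeholder):
# TCP: the proxy listens on 127.0.0.1:3306
./cloud_sql_proxy -instances=PROJECT:REGION:INSTANCE=tcp:3306
# Unix socket: sockets are created under /cloudsql
sudo mkdir -p /cloudsql && sudo chmod 777 /cloudsql
./cloud_sql_proxy -dir=/cloudsql -instances=PROJECT:REGION:INSTANCE
The idea being that sail artisan migrate could then point at the proxy via DB_HOST=127.0.0.1.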
If anyone knows of any useful articles or has had this problem themselves, I would greatly appreciate your help.
Additional Info:
Laravel Framework 8.69.0
Vue version 3.0.5
Docker Engine Community 20.10.8
SQL server 'europe-west2'
I'm not aware of any way to run sail/artisan command-line commands on GCP.
However, in ./vendor/facade/ignition/src/Solutions/RunMigrationsSolution.php there is the following function:
public function run(array $parameters = [])
{
    Artisan::call('migrate');
}
which can be used to programmatically run migrations.
So in ./routes/web.php make a route like
use App\Http\Controllers\DataController; // at the top of routes/web.php

Route::get('/run_migrate',
    [DataController::class, 'runMigrate']
)->middleware(['auth', 'verified'])->name('run_migrate');
and in ./app/Http/Controllers/DataController.php add the function
use Illuminate\Support\Facades\Artisan; // at the top of the controller

public function runMigrate() {
    // --force skips the confirmation prompt migrate shows in production
    Artisan::call('migrate', ['--force' => true]);
}
Now, after signing in as an authorised user (or, if you remove the auth middleware, without signing in), you can go to https://your-app-url.com/run_migrate and it will run any outstanding migrations.
To be able to do this, the migrations table has to exist, so you will have to import your local database to start.
I have just started developing a Golang app and have deployed it on Google App Engine. But when I try to connect my local server to the Cloud SQL instance through the proxy, I am able to connect only through TCP.
However, when connecting to the same Cloud SQL instance from App Engine, I am able to connect only through a Unix socket.
To cope with this, I have made changes in my local environment handler file so that it can adapt to local and Google Cloud config, but I'm not sure how I can skip the upload of just this file for Google Cloud. Again, I don't want App Engine to delete this file; I just want the CLI to avoid uploading the new version of the handler file.
I use this command for deploying: gcloud app deploy
Currently, I deploy directly to App Engine instead of pushing it through VCS. Also, if there is a way to detect whether the app is running on App Engine, that would be really helpful.
TIA
Got it. In case anyone gets stuck in a similar situation, you can make use of the environment variables that App Engine sets. Although there is documentation listing the environment variables, I would still recommend checking them in the Cloud Console.
Documentation link for Go 1.12+ Runtime env:
https://cloud.google.com/appengine/docs/standard/go/runtime
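For anyone looking for the concrete check: that page documents GAE_ENV, which is set to "standard" on App Engine standard, so a handler file can branch on it like this sketch (the DSNs and the INSTANCE_CONNECTION_NAME variable are placeholders you would define yourself, e.g. in app.yaml):
package main

import (
	"fmt"
	"os"
)

// dsn returns a connection string suited to where the app is running.
func dsn() string {
	if os.Getenv("GAE_ENV") == "standard" {
		// On App Engine, Cloud SQL is reached through a Unix socket.
		return fmt.Sprintf("user:pass@unix(/cloudsql/%s)/dbname",
			os.Getenv("INSTANCE_CONNECTION_NAME"))
	}
	// Locally, connect over TCP to the Cloud SQL proxy.
	return "user:pass@tcp(127.0.0.1:3306)/dbname"
}

func main() {
	fmt.Println(dsn())
}
This also removes the need to keep a separate local copy of the file out of the upload.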
I have installed the google-cloud-sdk on my Matillion instance hosted on EC2. I am able to access the gcloud command in the SSH session and also by using a Bash component in Matillion.
However, I am not able to run bq commands. I can see that bq has been installed as part of the Cloud SDK, and I was able to configure my account and everything, but it doesn't work.
Can someone help me with this?
As per the documentation, it's necessary to activate the BigQuery API in order to use the bq command-line tool.
These are all the steps that you need to follow:
In the Cloud Console, on the project selector page, select or create a Cloud project.
Install and initialize the Cloud SDK.
BigQuery is automatically enabled in new projects. To activate BigQuery in a preexisting project, go to Enable the BigQuery API.
I was also getting the same error as you, and activating the API was the solution.
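For example, with the project selected, something like this (the project ID is a placeholder) enables the API and verifies that bq works:
gcloud services enable bigquery.googleapis.com --project=my-project-id
bq ls   # should now list datasets instead of erroring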
I have 2 repositories residing in Bitbucket: Backend (a Laravel app as the API and entry point) and Frontend (the main application front end, a Vue.js app). My goal is to set up continuous deployment so that whenever something is pushed to the master branch (or another branch I select) in either repo, it triggers a build and the whole app reaches the AWS EC2 server.
I have considered/tried the following:
AWS CodePipeline and/or CodeDeploy. This looked like a great option since the servers are in AWS as well. However, there is no support for Bitbucket out of the box, so it would have to go Bitbucket Pipelines -> AWS Lambda -> AWS S3 -> AWS CodePipeline/CodeDeploy -> AWS EC2. This seems like a very lengthy journey, and I am not sure if that's good practice at all.
Using Laravel Forge to deploy the Laravel app, and adding extra steps to build the Vue.js app. This seemed like a very simple solution; however, the build process fails there, as it just takes a long time and crashes with no errors (whereas I can run the exact same process on my local machine or a different server hosted elsewhere). I am not sure whether this is an issue with the way the server is provisioned, the way Forge runs the deployment script, or whether the server is too weak to handle it.
My main question is: what are the best practices for deploying an app made of such components? I have read many tutorials/articles about deploying a Node.js app or a Laravel app, but haven't found good information about a scenario like this.
Would it be better to build the front-end app locally and version-control the built JS files? Or should I create a pipeline in Bitbucket that builds the app and then deploys it? Or is it best to version-control and deploy just the source files, leaving the build as the last step of the deployment process, done by the server hosting the app itself? There are also some articles suggesting hosting the whole front-end app in an S3 bucket; would that be bad practice as well?
Any help and resources would be appreciated!
From the sounds of things, you have two types of deployments you might want to run.
Laravel API: If you're using Laravel Forge already, then this is a great way to go about deploying your Laravel app; it takes care of most of the process and makes server management easy.
Vue.js App: There are a few things you can do here. I personally prefer using a provider like Vercel or Netlify, which let you deploy your static sites/frontends for free or at low cost. You can write custom build steps, but they have great presets that should work out of the box.
If you really want to keep everything on AWS, then look into how to host static sites on AWS.
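If you do end up building in Bitbucket, a pipeline for the frontend repo could look roughly like this sketch (the Node image tag, the pipe version, and the server details are all placeholders to adapt):
# bitbucket-pipelines.yml - rough sketch for the Vue repo
image: node:16
pipelines:
  branches:
    master:
      - step:
          name: Build and deploy frontend
          caches:
            - node
          script:
            - npm ci
            - npm run build
            # copy the built assets over to the EC2 server
            - pipe: atlassian/scp-deploy:1.2.1
              variables:
                USER: 'deploy'
                SERVER: 'your-ec2-host'
                REMOTE_PATH: '/var/www/frontend'
                LOCAL_PATH: 'dist/*'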
I'm following the tutorial to deploy a Ruby app to Google Compute Engine. Everything works; however, I now want to SSH into the app to run migrations etc. After some searching I was able to find my files under a Docker instance here: /var/lib/docker/aufs/diff/e2972171505a931749490e13d21e4f8c0bb076245ef4b592aff6667c45b2dd13/app
Is there a simpler way to access my files? Perhaps a symlinked folder?
Ruby apps on Google App Engine run via Docker. Because App Engine is a PaaS, it's discouraged (though possible) to run commands on production machines. If you'd like to run database migrations, run them locally and point your configuration at your production database.
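That said, if you just want a simpler way to reach the files than digging through /var/lib/docker/aufs, you can SSH into the VM and open a shell inside the running container, for example:
docker ps                          # find the app container's ID
docker exec -it CONTAINER_ID bash  # then browse /app inside the container
Just treat anything you change there as throwaway; the container is replaced on the next deploy.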