My Laravel app runs on Docker (via Linux containers) on a Windows machine, which means I use 'sail artisan' commands instead of 'php artisan' commands. I can migrate locally onto the SQL server running in Docker (sail artisan migrate) and I can deploy the app to Google Cloud (gcloud app deploy). The final piece of the puzzle is migrating the database onto the Google Cloud SQL server.
When I first set the app up I had problems with this, so I exported the database, uploaded it to Google Cloud and deployed it manually. That was fine as a one-off, but I now have changes I'd like to make to the database structure without losing all the data, and I figured it was about time I learnt to do this properly anyway.
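To be clear, the kind of structural change I mean is an ordinary incremental migration rather than a rebuild; something like the following (table and column names are just illustrative), generated with sail artisan make:migration add_tracking_code_to_orders_table --table=orders:

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class AddTrackingCodeToOrdersTable extends Migration
{
    // Adds a nullable column, so existing rows are preserved
    public function up()
    {
        Schema::table('orders', function (Blueprint $table) {
            $table->string('tracking_code')->nullable();
        });
    }

    // Reverses the change if the migration is rolled back
    public function down()
    {
        Schema::table('orders', function (Blueprint $table) {
            $table->dropColumn('tracking_code');
        });
    }
}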
I have attempted to follow the instructions in Google's community tutorial; however, that guide presupposes the project does not yet exist, whereas I am working with a pre-existing application and a pre-existing database. I tried to jump into the tutorial halfway through, but I couldn't get the Cloud SQL proxy to work.
After a bit more research I found this article, which I got partway through; however, once I reached the part requiring TCP or Unix sockets, neither set of commands would run without error in my Ubuntu terminal.
If anyone knows of any useful articles or has had this problem themselves, I would greatly appreciate your help.
Additional Info:
Laravel Framework 8.69.0
Vue version 3.0.5
Docker Engine Community 20.10.8
Cloud SQL server in region 'europe-west2'
I'm not aware of any way to run sail/artisan command-line commands on GCP.
However, in ./vendor/facade/ignition/src/Solutions/RunMigrationsSolution.php there is the following function:
public function run(array $parameters = [])
{
    Artisan::call('migrate');
}
which can be used to programmatically run migrations.
So in ./routes/web.php make a route like:
use App\Http\Controllers\DataController; // at the top of the file

Route::get('/run_migrate',
    [DataController::class, 'runMigrate']
)->middleware(['auth', 'verified'])->name('run_migrate');
and in ./app/Http/Controllers/DataController.php add the function:
use Illuminate\Support\Facades\Artisan; // at the top of the file

public function runMigrate() {
    Artisan::call('migrate');
}
Now, after signing in as an authorised user (or without signing in, if you remove the auth middleware), you can go to https://your-app-url.com/run_migrate and it will run any outstanding migrations.
For this to work, Laravel's migrations table has to exist, so you will have to import your local database to start with.
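One refinement worth considering (a sketch, but both calls are standard Laravel APIs): when APP_ENV is production, the migrate command asks for confirmation and, invoked non-interactively like this, simply aborts, so pass --force; and return Artisan::output() so the browser shows what actually ran:

use Illuminate\Support\Facades\Artisan; // at the top of the file

public function runMigrate() {
    // --force skips the production confirmation prompt, which would
    // otherwise abort the command when run non-interactively
    Artisan::call('migrate', ['--force' => true]);

    // Show the console output so you can see which migrations ran
    return '<pre>' . Artisan::output() . '</pre>';
}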
I am looking for help containerizing a Laravel application with Docker, running it locally, and making it deployable to Cloud Run, connected to a Google Cloud database.
My application is an API built with Laravel, and so far I have just used the docker-compose/Sail package that comes with Laravel 8 during development.
Here is what I want to achieve:
Laravel app running on Cloud Run.
Database in Google Cloud: MySQL, PostgreSQL or SQL Server (MySQL preferred).
Environment stored in Google Cloud.
My problem is that I can't find any info on whether or how to use/rewrite the docker-compose file in Laravel 8, create a Dockerfile or cloudbuild file, and build it for Google Cloud.
Maybe I could add something like this in a cloudbuild.yml file:
# cloudbuild.yml
steps:
  # running docker-compose
  - name: 'docker/compose:1.26.2'
    args: ['up', '-d']
Any help/guidance is appreciated.
As mentioned in the comments to this question, you can check this video, which explains how to use docker-compose and Laravel to deploy an app to Cloud Run with a step-by-step tutorial.
As for connecting a database to said app, the Connecting from Cloud Run (fully managed) to Cloud SQL documentation is quite complete on that matter, and for secret management I found this article that explains how to implement Secret Manager in Cloud Run.
I know this answer is basically just links to the documentation and articles, but I believe all the information you need to deploy your app to Cloud Run is in those.
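One concrete detail from that documentation that often trips people up: fully managed Cloud Run reaches Cloud SQL over a Unix socket at /cloudsql/PROJECT:REGION:INSTANCE, and Laravel's stock MySQL connection supports this via the unix_socket option. A rough sketch of the relevant excerpt of config/database.php (the instance name is invented for illustration):

<?php
// config/database.php (excerpt: just the mysql connection)
return [
    'connections' => [
        'mysql' => [
            'driver' => 'mysql',
            // On Cloud Run, set DB_SOCKET to something like
            // '/cloudsql/my-project:europe-west1:my-instance' (illustrative)
            'unix_socket' => env('DB_SOCKET', ''),
            // host/port are used locally when DB_SOCKET is empty
            'host' => env('DB_HOST', '127.0.0.1'),
            'port' => env('DB_PORT', '3306'),
            'database' => env('DB_DATABASE', 'forge'),
            'username' => env('DB_USERNAME', 'forge'),
            'password' => env('DB_PASSWORD', ''),
        ],
    ],
];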
I'm having a hard time deploying a Laravel app for test purposes on AWS Elastic Beanstalk.
I followed all the sources I could find on the web, including the AWS documentation.
Creating an Elastic Beanstalk environment and uploading an application is straightforward as long as I do not include .ebextensions and the .yaml file in it.
Based on Maximilian's tutorial, I created an init.config file inside .ebextensions with the contents:
container_commands:
  01initdb:
    command: "php artisan migrate"
The environment reaches a degraded state as the update finishes, and I get the following logs:
[2018-11-20T23:14:08.485Z] INFO [7969] : Command processor returning results:
{"status":"FAILURE","api_version":"1.0","results":[{"status":"FAILURE","msg":"(TRUNCATED)...y exists\")\n/var/app/ondeck/vendor/laravel/framework/src/Illuminate/Database/Connection.php:458\n\n2 PDOStatement::execute()\n/var/app/ondeck/vendor/laravel/framework/src/Illuminate/Database/Connection.php:458\n\nPlease use the argument -v to see more details. \ncontainer_command 01initdb in .ebextensions/init.config failed. For more detail, check /var/log/eb-activity.log using console or EB CLI","returncode":1,"events":[]}],"truncated":"true"}
I have been trying different .config files from other instructional resources, but none of them seems to work.
I'm running:
Laravel Framework 5.7.5
EB Platform uses PHP 7.2 running on 64bit Amazon Linux/2.8.4
RDS uses MySQL 5.6.40
I really do not know what is going on and would appreciate any suggestions.
I finally found my way out. Here is some documentation for anyone who hits the same issue.
What I was trying to do...
My main objective was to test a Laravel 5.7 application on a live AWS Elastic Beanstalk (EB) server. I also needed a way to visualize data, and phpMyAdmin fits my needs. This is a very simple CRUD app, just for learning the basics of both technologies.
What I did (and what worked)
I followed the normal workflow of creating an EB application, mainly using the web console:
Name the application.
Choose PHP as the platform.
Start off with a base application (do not upload code yet).
Hit 'Configure more options'.
In the security card, select your key pair and save. (This is valuable for SSH'ing into your server.)
In the database card, create an RDS instance. Select whatever options fit your needs and set a username/password.
Create the environment.
After a while, you should have all the resources created by EB (EC2 and RDS instances, security group, EIP, buckets, etc.) in the app environment.
Preparing your Laravel application is a straightforward process. You must not forget to change config/database.php to read the server's environment variables. My approach was to define them at the start of the file.
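For reference, Elastic Beanstalk exposes an attached RDS instance through RDS_* environment variables, so the top of config/database.php can map those onto the names Laravel expects. Roughly (a sketch of the idea, not a complete file):

<?php
// config/database.php (top of file)
// Elastic Beanstalk injects the attached RDS instance's details as
// RDS_* environment variables; fall back to local .env values otherwise.
$host     = getenv('RDS_HOSTNAME') ?: env('DB_HOST', '127.0.0.1');
$port     = getenv('RDS_PORT')     ?: env('DB_PORT', '3306');
$database = getenv('RDS_DB_NAME')  ?: env('DB_DATABASE', 'forge');
$username = getenv('RDS_USERNAME') ?: env('DB_USERNAME', 'forge');
$password = getenv('RDS_PASSWORD') ?: env('DB_PASSWORD', '');

return [
    // ... rest of the stock file, with the mysql connection using
    // the variables defined above:
    'connections' => [
        'mysql' => [
            'driver'   => 'mysql',
            'host'     => $host,
            'port'     => $port,
            'database' => $database,
            'username' => $username,
            'password' => $password,
            // ... remaining stock options unchanged
        ],
    ],
];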
The main source of trouble resides in configuring your server instance to include all the software and configuration your app needs. This is done by including a .yaml file inside the .ebextensions folder, which should reside in the root directory of your Laravel application. It's also a good idea to check your syntax before submitting another app version to EB. For my needs, I used this script, which basically installs phpMyAdmin as I deploy a new version. Specifically for this startup script, the environment variables $PMA_VER, $PMA_USERNAME and $PMA_PASSWORD must be defined for phpMyAdmin to work. You can create more environment variables in the software tab of your EB configuration page. Read the docs.
Another detail that might cause issues when running commands at startup via the YAML script (specifically migrations) is the combination of Laravel and MySQL versions. For example, I am using Laravel 5.7, and the default MySQL version in the EB RDS creation wizard is something like 5.6.x. This will throw errors of the type:
Illuminate\Database\QueryException : SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes (SQL: alter table `users` add unique `users_email_unique`(`email`))
If this is your scenario, you have probably already googled it and found that adding the line Schema::defaultStringLength(191); to the boot method of your app/Providers/AppServiceProvider.php file will do the trick.
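For completeness, the whole provider ends up looking like this (the standard fix for the 767-byte index limit on older MySQL):

<?php
// app/Providers/AppServiceProvider.php

namespace App\Providers;

use Illuminate\Support\Facades\Schema;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Cap the default string/index length so utf8mb4 unique keys
        // fit within MySQL 5.6's 767-byte index limit
        Schema::defaultStringLength(191);
    }

    public function register()
    {
        //
    }
}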
You can do a typical migration by passing this script:
container_commands:
  01_drop_tables:
    command: "php artisan migrate:fresh"
  02_initdb:
    command: "php artisan migrate"
This will drop the existing tables, avoiding conflicts, and create new ones based on your code (note that migrate:fresh discards all data in those tables). You can read more logs from your server by SSH'ing in and reading /var/log/eb-activity.log.
I'm working to revive an app that was originally hosted on Parse. I have access to a Bitbucket repo with the app code, but the database itself was not migrated before Parse.com shut down. I would like to run the app through Parse Server (using mLab and Heroku), but all the documentation I've found online requires the Parse migration tool (which is no longer available).
I understand that I can use the Parse Server example project on GitHub and paste in my own app code to set up my app. Do I paste in my code before or after deploying to Heroku/mLab? Also, which files should I keep from parse-server-example and which should I delete? Are there other steps I should be aware of that become necessary without access to the Parse migration tool?
Unfortunately you can't migrate your database off of the Parse.com hosted service after January 30th, 2017.
Since you don't have a database to migrate, you can start a new Parse Server project from scratch. You can just follow the Getting Started With Heroku + mLab Development steps on the parse-server-example project, and add any existing cloud functions to the /cloud directory once you've cloned the project.
I'm following the tutorial to deploy a Ruby app to Google Compute Engine. Everything works; however, I now want to SSH into the app to run migrations etc. After some searching, I was able to find my files under a Docker instance here: /var/lib/docker/aufs/diff/e2972171505a931749490e13d21e4f8c0bb076245ef4b592aff6667c45b2dd13/app
Is there a simpler way to access my files? Perhaps a symlinked folder?
Ruby apps on Google App Engine run via Docker. Because App Engine is a PaaS, it's discouraged (though possible) to run commands on production machines. If you'd like to run database migrations, please run them locally and point your configuration at your production database.
Is it possible to run Symfony (1.4) on the Windows Azure cloud?
The two things I'm wondering are how to execute Symfony tasks and where Symfony will save its cache files (blob storage?).
Thanks for your answers.
PHP is something that Microsoft is taking very seriously these days, so yes, Symfony can run on top of Azure, although documentation is sparse as most people stick to Linux servers.
Regarding tasks, there is a tool for running command-line tasks on Windows Azure, although I have not yet tried it myself.
http://azurephptools.codeplex.com/
In the meantime, I got Symfony 1.4 running inside the Windows Azure cloud. It was not as hard as expected. I was also able to write blob storage caching for Symfony. Session handling works OK, but you need to modify the Symfony session handler to work correctly with more than one server instance.