Symfony framework on Windows Azure cloud

Is it possible to run symfony (1.4) on the Windows Azure cloud?
The two things I'm wondering about are how to execute symfony tasks and where symfony will save its cache files (blob storage?).
Thanks for your answers.

PHP is something Microsoft is taking very seriously these days, so yes, Symfony can run on top of Azure, although documentation is sparse as most people stick to Linux servers.
Regarding tasks, there is a tool for running command-line tasks on Windows Azure, although I have not yet tried it myself:
http://azurephptools.codeplex.com/

In the meantime I got symfony 1.4 running inside the Windows Azure cloud. It was not as hard as expected. I was also able to write a blob storage cache backend for symfony. Session handling works OK, but you need to modify the symfony session handler to work correctly with more than one server instance.
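For the record, here is roughly what such a cache backend can look like. This is a minimal sketch, assuming the Windows Azure SDK for PHP (phpazure); the blob client class and its method names (putBlobData(), getBlobData(), blobExists(), deleteBlob(), createContainerIfNotExists(), deleteContainer()) are assumptions that may differ in your SDK version:

class sfAzureBlobCache extends sfCache
{
  protected $blob;
  protected $container;

  public function initialize($options = array())
  {
    parent::initialize($options);
    $this->container = isset($options['container']) ? $options['container'] : 'symfonycache';
    $this->blob = new Microsoft_WindowsAzure_Storage_Blob(); // assumed client class
    $this->blob->createContainerIfNotExists($this->container);
  }

  public function set($key, $data, $lifetime = null)
  {
    // blob storage has no native TTL, so store the expiry alongside the value;
    // note that cache keys are used directly as blob names here, so you may
    // need to escape characters that are invalid in blob names
    $payload = serialize(array(
      'data'    => $data,
      'expires' => time() + $this->getLifetime($lifetime),
      'created' => time(),
    ));
    $this->blob->putBlobData($this->container, $key, $payload);
    return true;
  }

  public function get($key, $default = null)
  {
    if (!$this->has($key))
    {
      return $default;
    }
    $payload = unserialize($this->blob->getBlobData($this->container, $key));
    return $payload['data'];
  }

  public function has($key)
  {
    if (!$this->blob->blobExists($this->container, $key))
    {
      return false;
    }
    $payload = unserialize($this->blob->getBlobData($this->container, $key));
    return $payload['expires'] > time();
  }

  public function remove($key)
  {
    $this->blob->deleteBlob($this->container, $key);
    return true;
  }

  public function removePattern($pattern)
  {
    // a real implementation would list the container's blobs and match names
    throw new sfCacheException('removePattern() is not implemented in this sketch.');
  }

  public function clean($mode = self::ALL)
  {
    // crude but effective: drop and recreate the whole container
    $this->blob->deleteContainer($this->container);
    $this->blob->createContainerIfNotExists($this->container);
    return true;
  }

  public function getTimeout($key)
  {
    $payload = unserialize($this->blob->getBlobData($this->container, $key));
    return $payload['expires'];
  }

  public function getLastModified($key)
  {
    $payload = unserialize($this->blob->getBlobData($this->container, $key));
    return $payload['created'];
  }
}

For sessions, the simplest fix is a shared store rather than local files; symfony 1.4 already ships sfPDOSessionStorage and sfMySQLSessionStorage for exactly that, which may save you from patching the session handler yourself.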

Related

How to migrate Laravel database on Docker onto Google Cloud

My Laravel app is running on Docker, using Linux commands on a Windows machine. This means that instead of 'php artisan' commands, I use 'sail artisan' commands. I am able to migrate locally onto the SQL server on Docker (sail artisan migrate) and am able to deploy the app onto Google Cloud (gcloud app deploy). The final piece of my puzzle is migrating the database onto the Google Cloud SQL server.
When I was first setting the app up, I had problems with this, so I exported the SQL server, uploaded it to Google Cloud, and then deployed it manually, which was fine as a one-off, but I now have things I would like to change about the database structure without losing all the data. I also figured it was about time I learnt to do it properly anyway.
I have attempted to use the instructions in Google's community tutorial; however, this guide presupposes the project does not yet exist, whereas I am working with a pre-existing application and a pre-existing database. I tried to jump into the tutorial halfway through, but I couldn't get the Cloud SQL proxy to work.
After a bit more research, I found this article, which I got partway through; however, once I got to the part needing TCP or Unix sockets, neither set of commands would run without error on my Ubuntu terminal.
If anyone knows of any useful articles or has had this problem themselves, I would greatly appreciate your help.
Additional Info:
Laravel Framework 8.69.0
Vue version 3.0.5
Docker Engine Community 20.10.8
Cloud SQL server region 'europe-west2'
I'm not aware of any way to run sail/artisan command-line commands on GCP.
However, in ./vendor/facade/ignition/src/Solutions/RunMigrationsSolution.php there is the following function:
public function run(array $parameters = [])
{
    Artisan::call('migrate');
}
which can be used to programmatically run migrations.
So in ./routes/web.php make a route like this (importing the controller at the top of the file with use App\Http\Controllers\DataController;):
Route::get('/run_migrate',
    [DataController::class, 'runMigrate']
)->middleware(['auth', 'verified'])->name('run_migrate');
and in ./app/Http/Controllers/DataController.php add the function (with use Illuminate\Support\Facades\Artisan; among the imports):
public function runMigrate() {
    // --force is needed because migrate refuses to run
    // non-interactively when the app environment is production
    Artisan::call('migrate', ['--force' => true]);
    return Artisan::output(); // show which migrations ran
}
Now, after signing in as an authorised user (or without signing in, if you remove the auth middleware), you can go to https://your-app-url.com/run_migrate and it will run any outstanding migrations.
For this to work the migrations table has to exist, so you will have to import your local database to start.
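On the Cloud SQL proxy problem from the question: when the app is deployed to App Engine, it usually talks to Cloud SQL over a Unix socket rather than TCP, so no proxy is needed at runtime. A minimal sketch of the relevant wiring (the instance connection name below is a placeholder):

// config/database.php - Laravel's stock 'mysql' connection already contains:
'unix_socket' => env('DB_SOCKET', ''),

// so it is usually enough to set, in the App Engine environment variables:
// DB_SOCKET=/cloudsql/my-project:europe-west2:my-instance   (placeholder)
// along with the usual DB_DATABASE, DB_USERNAME and DB_PASSWORD values

With that in place, the /run_migrate route above runs against the Cloud SQL database directly.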

Automatically set up new Digital Ocean server for Laravel app

I know that https://forge.laravel.com/auth/register is available for $12/month*, but I'd like to understand how to accomplish the same thing myself.
What I assume is possible (and what I'm looking for): I create a server that has only Ubuntu 18.04.3 installed and nothing else, and I upload a script that installs all the appropriate software and sets up MySQL with the correct passwords, etc. (without manual intervention).
I've tried Laradock and had tons of problems with Docker and don't want to do that anymore.
I see that https://cloud.digitalocean.com/droplets/new lets me create a LEMP droplet (Ubuntu, Nginx, MySQL, PHP-FPM) with one click. But it lacks Redis, and its versions are outdated (e.g. PHP 7.2).
I've heard people mention Chef (maybe this?), but that seems to be more complicated than what I'm imagining.
Unfortunately, I'm not even sure how to search for what I'm trying to do (or how to tag this question); is this called "server provisioning"? I've been searching phrases like "automatic install script redis mysql server for laravel".
Thanks in advance for pointing me in the right direction.
* I also just found https://getcleaver.com/ and https://runcloud.io/server-management, which each look like Forge + Envoyer (and RunCloud offers a free plan).
It is called server provisioning, and Chef would be a good fit for this; check out Ansible too. Another thing you could do is set up the server yourself and create an image from that server, then base your new servers on that image; that way you'll have all your services installed from the start.
This sounds like a job for something like Puppet (or Chef/Ansible); however, Laravel Envoy may be another tool to look at, if you haven't already, for the second part of your problem.
I highly recommend Heroku (or a similar service), as this is all done out of the box, and it has a ton of other great features that make developing a pipeline a breeze.

Critical Caching issue in Laravel Application (AWS Server)

I am facing a critical issue in my application, which is developed in Laravel and Angular. The issue is that I am getting the old email templates on the live site, while on my local server I get the latest updated ones. The deployment process is automatic: I just commit the code to Bitbucket, and a Bitbucket Pipeline pushes the code directly to the AWS server.
I have already run the cache commands for Laravel and restarted the jobs, but I am still getting the same issue. If anyone has experienced the same issue or knows how to resolve it, please guide me!
I think you can try one of the following ways to overcome the issue; I faced a similar issue and resolved it in these ways:
Try deleting the cached view files manually from storage/framework/views
Upload the code for the particular module directly to AWS, without using the pipeline
Restart your server
This will surely resolve your issue!
Since you are using a Laravel and Angular application deployed on AWS, I assume that Bitbucket is pushing the code and build commands are fired on every push. There are a few things which can help you:
1. Rebuild the Angular side on every push, since an Angular build hashes all the file names in the dist folder
2. Delete the Laravel cached view files, which are stored in storage/framework/views
3. Check that your server is pointing to the right project folder
If point 1 or 2 works, you can automate the process by running the corresponding CLI command after every push; points 1 and 2 are both achievable with CLI commands (point 2 is sketched below).
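For point 2, rather than deleting the files by hand, you can let Laravel do the cleanup. A minimal sketch using the Artisan facade, which you could hook into whatever deploy step already runs after a push:

use Illuminate\Support\Facades\Artisan;

// equivalent to running the artisan commands on the server after a deploy
Artisan::call('view:clear');   // deletes the compiled Blade views in storage/framework/views
Artisan::call('cache:clear');  // flushes the application cache
Artisan::call('config:clear'); // discards any cached configuration

The same three commands can of course be run directly as php artisan view:clear etc. in the pipeline itself.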

Move OctoberCMS website from Ubuntu VM to a CentOS 7 VM

Our web developer picked OctoberCMS to develop our new website (his skill set). Unfortunately, before completion he abruptly left us due to health reasons and is no longer available. His Ubuntu environment has some problems, and we need it on CentOS 7 anyway. The rest of us are OctoberCMS newbies, but we want to learn it.
We built a CentOS 7 VM, installed OctoberCMS, and want to move his work over.
We cannot find any instructions on how to "export" the work he has done thus far and import it into our new OctoberCMS install.
He is using 10 plugins, and 3 he developed himself. (I don't know if that is relevant.)
Is there an easy way to do this, or at least instructions?
We have been googling, YouTubing, and IRC'ing for a week and are still at a loss.
Any help would be most appreciated.
There really isn't anything special you need to know about moving an OctoberCMS install to a new server compared to moving over any other PHP application.
I am assuming you know how to do the basics of setting up a LAMP stack, such as setting up a virtual host for the domain you want to host the site on and setting up a MySQL database and user/password to access the database. There are of course many variants on how you could accomplish this such as using a management tool like Plesk or cPanel, or just configuring the services manually via the command line.
1) Ensure your new server is running at least roughly the same version of Apache, MySQL, and PHP.
2) Copy over the directory that contains all of the web files from the old server into the document root for your domain on the new server.
3) Do a database dump from the old server and copy it to the new server. If possible, use the same database name, username, and password as on the old server; this way you don't have to worry about updating the configuration of the website (if you can't reuse them, see the sketch after this list).
4) Pull up the site and troubleshoot any errors that come up. It is helpful if OctoberCMS debug mode is on.
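If you cannot reuse the old credentials in step 3, the database settings of an OctoberCMS site (it is built on Laravel) live in config/database.php. A minimal sketch of the block to update, with placeholder values:

// config/database.php (values below are placeholders)
'connections' => [
    'mysql' => [
        'driver'   => 'mysql',
        'host'     => 'localhost',
        'database' => 'october_db',
        'username' => 'october_user',
        'password' => 'secret',
    ],
],

For step 4, debug mode is typically toggled the same way, via the 'debug' setting in config/app.php.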
Following the above method will ensure that you have the exact same setup on your new server that the old server had. This will copy over all of the plugins, data, etc.
There are of course many complexities that can come up during a switch over like this, but this should at least get you started and you can come back to StackOverflow with some more specific hurdles.
Hope that helps.

How to actually configure debugging in CFBuilder

I have ColdFusion Builder 2.0.0 installed and I am trying out the much-vaunted step debugging. However, I cannot seem to get it to work, as I don't have my site / JRun install set up in the naive way the examples show.
I am using version 9,0,1,274733 of ColdFusion and my configuration is as follows:
Installed as the multi-server version with JRun here: c:\Apps\JRun4
application files are here: d:\websites\my.website.com
web root is here: d:\websites\my.website.com\www
core library of CFCs is here: d:\websites\frameworks\core, which is mapped in CF as core
I have read this http://help.adobe.com/en_US/ColdFusionBuilder/Using/WS0ef8c004658c1089-31c11ef1121cdfd6aa0-7fff.html and this http://forta.com/blog/index.cfm/2007/5/30/CF8-Debugger-Getting-Started, and watched this https://experts.adobeconnect.com/_a204547676/p33029638/?launcher=false&fcsContent=true&pbMode=normal, but I get stuck at the point after you have configured RDS and are setting up the server for your project.
Now, I am pretty sure the above is correct; when I move to the next page in the wizard I get a dialog (screenshot not preserved here). As I understand it, my Server Home should be c:\Apps\JRun4 and my Document root should be d:\websites\my.website.com.
This all looks like it is going to be fine until you actually try to debug, at which point I get one error followed by another (screenshots not preserved here).
I can confirm that the server is running and RDS is enabled, since I can see all my databases in the RDS Dataview.
Any help would be gratefully received as this is very frustrating and the documentation is very lacking.
There is a video tutorial as well that you may want to check and see if that helps. http://blogs.adobe.com/anand/2011/01/learn-how-to-debug-coldfusion-applications-using-coldfusion-builder-2.html
You need to specify the RDS username/password and the "application server name". If you are using the base instance that was installed when you set up the multi-server install of CF, that is "cfusion"; otherwise, it's the name of the instance you are using.
The RDS username is most likely "admin" unless you set up custom users for RDS. The password is the RDS password you specified when you installed CF.
