Running multiple queues on Laravel Homestead

I use Laravel Homestead to provision my server in a VirtualBox Vagrant box. I currently use the default queue to run my jobs, but after dispatching a job to a new queue it never gets picked up, probably because I haven't set that queue up on Homestead yet. How do I set up multiple queues on Laravel Homestead?

As mentioned in the documentation :
Since queue workers are long-lived processes, they will not pick up
changes to your code without being restarted. So, the simplest way to
deploy an application using queue workers is to restart the workers
during your deployment process.
To pick up your new queue, connect to your server via SSH and restart the queue worker with:
php artisan queue:restart
For more information, check the queue workers section of the documentation.
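Note that a worker only watches the queues it was started with, so restarting alone will not make it pick up a queue it was never told about. One way to have a single worker process several queues is to list them in the --queue option ("emails" below is an example queue name; queues listed earlier take priority):

```shell
# Process jobs from the "emails" queue first, then fall back to "default".
# Restart this worker (php artisan queue:restart) after each deploy.
php artisan queue:work --queue=emails,default
```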

Related

Can multiple clients connect to a Laravel dev server simultaneously?

I have a Laravel app running locally using ./vendor/bin/sail up. I also have a trivial NodeJS server (running locally as well) that waits 60 seconds on each request and returns dummy data. The Laravel app makes a request to the Node app and becomes unresponsive to client requests until the 60 seconds are up.
Is this a limitation of the Laravel dev server? Is there a setting I'm missing?
Answering my own question.
Laravel uses php artisan serve underneath sail, which in turn uses the built-in server, which by default "runs only one single-threaded process."
However, "You can configure the built-in webserver to fork multiple workers in order to test code that requires multiple concurrent requests to the built-in webserver. Set the PHP_CLI_SERVER_WORKERS environment variable to the number of desired workers before starting the server. This is not supported on Windows."
Adding PHP_CLI_SERVER_WORKERS=5 to my .env file fixed the issue.
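For reference, the fix is a one-line environment setting; it can live in .env or be exported in the shell before starting the server (the worker count of 5 is just an example):

```shell
# .env (or export in the shell before `php artisan serve` / `sail up`)
# Forks 5 workers in PHP's built-in dev server; not supported on Windows.
PHP_CLI_SERVER_WORKERS=5
```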

Containerized Laravel application that connects to a remote database

Good day everyone, I have a Laravel application that is supposed to connect to a remote MySQL database in production, and to ease deployment I am using Docker. I have set up a GitHub Actions workflow that is triggered when I push to the master branch; the workflow essentially runs a couple of tests, builds my app into an image, and pushes it to Docker Hub.
To avoid database connection issues when composer dump-autoload is run during the build process, I allowed connections from any host (changed bind-address to 0.0.0.0 in the MySQL config) and also set up the MySQL user to connect from any host. This seems to do the trick, but my concern is obviously exposing my database service to the entire world. Fortunately, it's possible to set up my own dedicated runner for GitHub Actions, which means I can easily restrict my DB service to that host. Would that be the ideal solution, or is there a way to run the workflow without needing to connect to a database?
Try connecting to the remote database through an SSH tunnel:
ssh -N -L 3336:127.0.0.1:3306 [USER]@[REMOTE_SERVER_IP]
With this you do not need to publish MySQL to the world, and you can bind it to 127.0.0.1 on the remote host.
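With the tunnel open, the app just points at the local end of it. A minimal .env sketch (database name and credentials below are placeholders):

```shell
# .env — connect through the SSH tunnel's local port (3336)
# instead of the remote MySQL port directly
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3336
DB_DATABASE=app
DB_USERNAME=app_user
DB_PASSWORD=secret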

Run two Laravel Echo Server (Socket.io) instances on one Server?

So I want to run two Socket.io / Laravel Echo Server instances on one physical server. On this server there is a virtual host for production and another one for testing purposes. Both need a connection to a WebSocket server running on the same machine.
Is this possible without broadcasting events to the wrong instance? Or should I run a second instance of Laravel Echo Server on another port and let both environments connect to different Socket.io servers?
Is there a common approach?
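Running a second instance on its own port is a workable approach: each environment gets its own Echo Server, so events cannot cross over. A minimal sketch, assuming the standard laravel-echo-server.json fields and one config file per instance (each started from its own directory so it reads its own config; host names and ports below are examples):

```shell
# production instance — laravel-echo-server.json in /srv/prod contains
#   { "authHost": "https://example.com", "port": "6001" }
cd /srv/prod && laravel-echo-server start

# testing instance — laravel-echo-server.json in /srv/test contains
#   { "authHost": "https://test.example.com", "port": "6002" }
cd /srv/test && laravel-echo-server start
```

Each virtual host then points its Echo client at its own port.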

Does Laravel Homestead come with Nginx or Apache installed?

I am very new to Laravel and I love coding locally, so of course I am using Homestead. Does Homestead use Nginx or Apache? A question before I send my site online when I get it completed.
I am using Laravel 5.6, the current and latest release, along with the latest version of Homestead as of 5/7/2018.
By default Homestead comes with Nginx, but it's also possible to use Apache.
You can read more about it here: https://laravel.com/docs/5.6/homestead :
Homestead uses the Nginx web server by default. However, it can install Apache if apache is specified as a site type. While both web servers can be installed at the same time, they cannot both be running at the same time. The flip shell command is available to ease the process of switching between web servers. The flip command automatically determines which web server is running, shuts it off, and then starts the other server. To use this command, SSH into your Homestead machine and run the command in your terminal:
flip
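In Homestead.yaml, selecting Apache is a per-site setting. A sketch (the site name and path below are the Homestead defaults, used as examples):

```yaml
# Homestead.yaml — serve this site with Apache instead of the default Nginx
sites:
    - map: homestead.test
      to: /home/vagrant/code/public
      type: apache
```

After editing Homestead.yaml, re-provision with vagrant reload --provision.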

Beanstalkd support for multiple servers and load balancing

I am using beanstalkd in a Laravel project to handle jobs on a queue. Beanstalkd is running locally. What I want to do is add one or more remote servers to handle some jobs when the queue gets bigger. I know that with Laravel I can send a job to a specific remote connection but in this way I don't know the load in each server prior to sending the job.
I was wondering if beanstalkd supports load balancing between servers and error handling when a remote job fails for example.
Thank you
Beanstalkd doesn't have features for load balancing.
You could set up HAProxy as your balancer and register multiple servers with beanstalkd installed behind it. Then, when you send jobs from your Laravel code, you send them to HAProxy, and HAProxy decides which back-end server gets the job, since it knows each server's load and whether any back end is down.
In the code you just need to change the IP.
In your infrastructure you need a balancer (HAProxy) set up in front of a pool of beanstalkd servers.
We usually have 2 machines, and they are configured like this:
- Machine 1: HAProxy, Apache, MySQL, Laravel, Beanstalkd
- Machine 2: MySQL, Laravel, Beanstalkd
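The HAProxy side of this setup can be sketched as a plain TCP listener in front of the beanstalkd pool; Laravel's beanstalkd connection then points at the balancer's address on port 11300 (IPs and server names below are examples):

```shell
# haproxy.cfg fragment — TCP load balancing across two beanstalkd servers
listen beanstalkd
    bind *:11300
    mode tcp
    balance leastconn
    server beanstalk1 10.0.0.11:11300 check
    server beanstalk2 10.0.0.12:11300 check
```

leastconn sends each new connection to the back end with the fewest open connections, and the check option removes a server from rotation if it stops responding, which covers the load-awareness and failure handling asked about above.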
