Laravel Queue Timeout Error

I am creating Laravel jobs for sending emails and adding them to a Laravel queue. Everything works fine, but the queue timeout is 300 seconds. How can I extend this time? Alternatively, I want the queue listener to run forever, because mails can be sent at any time due to user interaction. Can anyone help?

To run a queue listener in the background, configure it via Supervisor, a process monitor for Linux. It also lets you set how many worker processes to run.
To configure the timeout, use the --timeout option of the queue:listen command. The command would be:
php artisan queue:listen --timeout=500
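If you go down the Supervisor route, a minimal program definition could look like the following sketch (the program name, paths, user, and worker count are assumptions you would adapt to your server):

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/your-app/artisan queue:listen --timeout=500 --tries=3
autostart=true
autorestart=true
numprocs=2
user=www-data
redirect_stderr=true
stdout_logfile=/var/www/your-app/storage/logs/worker.log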

A better approach: split the data into smaller chunks and push one job per chunk, instead of one job with a large payload. With many small jobs waiting to run in the background, you can also increase throughput by running multiple workers to consume the queue.
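As a rough sketch of that idea (the model, chunk size, and job class below are made-up placeholders), you could dispatch one small job per chunk of recipients instead of a single huge job:

// Queue one job per chunk of 100 users instead of one job for everything.
User::where('subscribed', true)->chunk(100, function ($users) {
    foreach ($users as $user) {
        dispatch(new SendEmailJob($user)); // hypothetical job class
    }
});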

Related

Laravel Jobs and Events processing in same queue worker

I'm trying to run a time-intensive process in my jobs queue. I have an event being triggered which sends out a socket command to connected users, and I have a job that gets queued after that which can run in the background. Until now, I had never run php artisan queue:work and my events system was working flawlessly. Now that I'm trying to process my jobs, my events AND jobs are being processed by the same queue worker.
Here is the code that I use to trigger these:
event(new ActivateItemAndUpdateRotation($id));
ChangeStatus::dispatch($id, $status);
This is really bad for performance, because users can change statuses very quickly, which needs to be reflected rapidly, while the jobs can just do their thing in the background as they're able to. I've tried adding the jobs to a specific queue and only running that queue worker (see the sketch after these questions), but then the events don't get processed at all. So really I have two questions:
how have events been getting processed without running a worker up until now?
is it necessary to have two workers running now to process the events and the jobs asynchronously?
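A sketch of the "specific queue" approach described above, assuming hypothetical queue names: push the slow job onto its own queue and run one worker per queue, so listeners on the default queue are not blocked by long-running jobs.

// Send the long-running job to a dedicated queue (queue name is an assumption).
ChangeStatus::dispatch($id, $status)->onQueue('long-running');

# one worker per queue
php artisan queue:work --queue=default
php artisan queue:work --queue=long-running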

Jobs not firing but configured SQS

I am new to Laravel and I have configured SQS. I have an email send event which is handled in a controller. I can see that it is called, but it is not sending the email. What is the reason?
This is a common case when you are not running a queue worker. Check the documentation, then start your worker with php artisan queue:work
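A minimal sketch, assuming your app is actually configured to queue through SQS (the env key name differs by Laravel version):

# .env — older releases use QUEUE_DRIVER, newer ones use QUEUE_CONNECTION
QUEUE_CONNECTION=sqs

# start a worker that processes jobs from the SQS connection
php artisan queue:work sqs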

How reliable is delaying a Mail in Laravel?

I want to inform the seller via mail that the buyer is coming soon (about 2 hours before pickup time).
I would normally do it the hard way with cron and a database table: checking hourly whether an order's pickup time is two hours away, and only then sending the mail out.
Now, I would like to know if you would recommend using Queueing Jobs for sending Mails out.
With
$when = now()->addDays(10); //I would dynamically set the date
Mail::to($order->seller())
->later($when, new BuyerIsComing($order));
I can delay the delivery of a queued email message.
But how safe would this be? Especially if someone orders something but picks it up in, to exaggerate, two months?
Is the Laravel queueing system rigid enough to behave correctly after long delays (i.e. 2 months)?
Edit
I'm using Redis for Queueing
You actually have nothing to worry about. Sending mail usually increases the response time of your application, so it's a good thing you want to delay the sending.
Queues are the way to go, and they're pretty easy to set up in Laravel, which supports a couple of drivers out of the box. I would advise you to start with the database driver and then try Beanstalkd etc.
Lastly, and perhaps more importantly, use a process manager like Supervisor to monitor and maintain your queue workers...
Take a look at https://laravel.com/docs/5.7/queues for more insight.
Cheers.
If by safe you mean reliable, then it would be little different from sending an email immediately. If there's ever a possibility that your server "hiccups" and doesn't send an email, that possibility is the same now as it is ten minutes from now. Once the job is in the queue, it is persisted until completion (unless you use a memory-based driver, like Redis, which could get reset if the server reboots).
If you are using the database queue driver or a remote queue backend, the record of queued jobs will remain even if the server is unavailable for a short period of time. Your queue will be honored even if the exact timestamp at which you wanted the job to run has passed. For instance, if you schedule an email for 1:00pm but your server is down at that exact moment, when it comes back online the job will still be there: it is stored as incomplete and its scheduled time is in the past, so it will be executed the next time your queue worker checks the job list.
Of course, this assumes that you have your queue worker set up to always check jobs and automatically restart, even after a server failure, but that's a different discussion with lots of solutions...such as those shown here.
If you're using the database driver with Laravel queues to process your email, then you don't need to worry about anything.
Jobs are only removed from the jobs table once they complete successfully; otherwise their next attempt time is set a few minutes in the future and they are executed again (as long as your queue worker is online).
So it's completely safe to use Laravel queues.
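One related setting worth checking with the database driver is retry_after in config/queue.php (named expire in very old releases), which controls how long a worker waits before assuming a job has stalled and releasing it for another attempt; the value below is just an example:

'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        'retry_after' => 90, // seconds before an unfinished job is retried
    ],
],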

How can I figure out why a Laravel Queue Worker is running?

A Laravel queue worker was producing a lot of error log entries because the DB server crashed; in turn, Laravel's log grew to 150 GB within just two hours, filling up the entire hard drive so that several web apps stopped working.
But actually there is only a queue worker for sending emails in our system and no emails have been sent during the past days. So why is there still a queue worker running?
Are there other reasons why a queue worker might be accessing the DB in a Laravel system besides starting it "manually" (i.e. in our case - by the command that sends mails)?
We're currently using Laravel 5.1.
First, the Laravel worker uses the DB to store its job details (when the database queue driver is configured).
Also, you should set the maximum number of times a job should be attempted using:
php artisan queue:listen connection-name --tries=3
Then register a handler for failed queued jobs, for example in a service provider.
For your main question, you should install Supervisor; it gives you a way to see and manage exactly which workers are running.
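A sketch of such a failure handler for Laravel 5.1, registered in a service provider's boot() method (newer versions pass a JobFailed event object to the closure instead):

use Illuminate\Support\Facades\Queue;

public function boot()
{
    // Called whenever a queued job fails on this connection.
    Queue::failing(function ($connection, $job, $data) {
        // e.g. notify the team instead of letting the log grow unbounded
    });
}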

Beanstalkd running all jobs in the queue simultaneously on forge queue workers

I'm running an API built on Laravel Lumen 5.1, but I can't seem to get the Forge Queue Worker to work properly when using beanstalkd as a driver. It seems to run all the jobs in the queue simultaneously.
I'm using the Forge UI to set up the driver:
[screenshot: Queue Worker setup]
And the .env drivers:
[screenshot: .env drivers]
The queue system works fine when running it manually without any worker processing it.
If you need any more informations to help me, please just ask!
The purpose of a message queue is to allow parallel processing. If you have more workers (i.e. more processes), that many jobs will run simultaneously.
Wanting jobs not to run simultaneously is counter-intuitive and goes against the message queue principle. You can achieve it with a single worker, but it's not recommended, because you don't leverage the queue's power and scalability.
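If you really do want jobs handled one at a time, the simplest sketch is to run exactly one worker process (in Forge, set the number of worker processes to 1); a single worker pulls a job, finishes it, and only then takes the next one:

php artisan queue:work beanstalkd --sleep=3 --tries=3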
