Pause Queue in a Job - laravel

I have the following problem to solve:
Multiple users can submit jobs to a queue via a web interface.
These jobs are then stored in the database via the database queue driver.
Now my problem is: I want the queue to run all jobs until, inside a job, I say something like $queue->pause(), because the next job needs confirmation from the user before it can run.
How would I do something like this?
Run jobs.
Inside one job, the job determines that it needs confirmation from the user.
Halt the queue and keep the job that needs the confirmation in the queue.
Any user on the website can press a button, which deletes this confirmation job and starts the queue again.
My current "solution" which didn't work was this:
create 2 different job types:
ImageProcessingJob
UserNotificationJob
The queue worked through all ImageProcessingJobs until it hit a UserNotificationJob.
Inside UserNotificationJob->handle() I called Artisan::call("queue:restart"); which stopped the queue.
The problem with this solution is: the UserNotificationJob also got deleted. So if I started the queue again, it would immediately continue with the remaining ImageProcessingJobs without waiting for the actual confirmation.
I'm also open to other architectural solutions without a Queue system.

One approach which avoids pausing the queue is to have the UserNotificationJob wait on a SyncEvent (the SyncEvent is set when the confirmation comes back from the user). You can have this wait time out if you like, but then you need to repost the job to the queue. If you decide to time out and repost, you can use job chaining to set up dependencies between jobs so that nothing can run until the UserNotificationJob completes.
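A minimal sketch of the time-out-and-repost variant using Laravel job chaining (Bus::chain is Laravel 8+; older versions use withChain). The cache key, $batchId property and confirmation flag are assumptions, not code from the question:

use Illuminate\Support\Facades\Bus;
use Illuminate\Support\Facades\Cache;

// Nothing after UserNotificationJob runs until that job completes.
Bus::chain([
    new ImageProcessingJob($imageA),
    new UserNotificationJob($batchId),
    new ImageProcessingJob($imageB),
])->dispatch();

// Inside UserNotificationJob (which uses the InteractsWithQueue trait):
public function handle()
{
    if (! Cache::has("confirmed:{$this->batchId}")) {
        // No confirmation yet: put the job back on the queue and check again in 30 seconds.
        // Make sure $tries / $maxExceptions allow enough attempts for the expected wait.
        $this->release(30);
        return;
    }
    // Confirmation received; the chain continues with the remaining jobs.
}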
Another approach might be to simply avoid posting the remaining jobs until the confirmation is sent from the user.
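And a rough sketch of the second approach, where the confirmation button dispatches the jobs that were held back (the PendingJob model and its table are hypothetical):

// routes/web.php — pressing the button releases the held-back work.
Route::post('/confirm/{batchId}', function (string $batchId) {
    PendingJob::where('batch_id', $batchId)->get()->each(function ($pending) {
        ImageProcessingJob::dispatch($pending->payload);
        $pending->delete();
    });
    return back();
});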

Related

Laravel Jobs and Events processing in same queue worker

I'm trying to run a time-intensive process in my jobs queue. I have an event being triggered which sends out a socket command to connected users, and I have a job that gets queued after that which can run in the background. Until now I had never run php artisan queue:work and my events system was working flawlessly. Now that I'm trying to process my jobs, both my events and my jobs end up being processed by the same queue worker.
Here is the code that I use to trigger these:
event(new ActivateItemAndUpdateRotation($id));
ChangeStatus::dispatch($id, $status);
This is really bad for my performance, because the users can change statuses very quickly, which needs to be reflected rapidly, while the jobs can just do their thing in the background as they're able to. I've tried adding the jobs to a specific queue and only running a worker for that queue (see the sketch after the questions below), but then the events don't get processed at all. So really I have two questions:
How have events been getting processed without running a worker up until now?
Is it necessary to have two workers running now to process the events and the jobs asynchronously?
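For reference, a sketch of the dedicated-queue setup described above (the queue name is an assumption):

// Push the background job onto its own queue...
ChangeStatus::dispatch($id, $status)->onQueue('background');

// ...and run a worker that only consumes that queue:
// php artisan queue:work --queue=background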

Hangfire: Can we run multiple queues in parallel?

I'm creating queues dynamically, i.e. 1 2 3 4 5. These queues are created based on the users' requests; each request creates a new queue.
Right now all these queues run one by one only. I would like to run them in parallel, so that each user can see their jobs running rather than waiting for another user's task to complete.
I have resolved the issue by creating multiple servers, each handling certain queues.
For example, a couple of queues on one server and a few on another server.
code:
app.UseHangfireServer(options1); // e.g. options1.Queues lists the queues handled by this server instance
app.UseHangfireServer(options2); // a second server instance handles the remaining queues

Spring Batch - how to submit jobs without executing them, to be able to run them later via a scheduler?

I'm starting a project in Spring Batch; my plan is the following:
Create a Spring Boot app.
Expose an API to submit a job (without executing it) that returns the job execution id, so other clients can track the progress of the job later.
Create a scheduler for running jobs - I want logic that decides how many jobs can run at any moment.
The issue is that my batch service may receive many requests to start jobs, and I want to put the job execution in a pending status first; a scheduler will later check the jobs in pending status and, depending on my logic, decide whether it should run another set of jobs.
Is that possible to do in Spring Batch, or do I need to implement it from scratch?
The common way of addressing such a use case is to decouple job submission from job execution using a queue. This is described in detail in the Launching Batch Jobs through Messages section. Your controller can accept job requests and put them in a queue. The scheduler can then control how many requests to read from the queue and launch jobs accordingly.
Spring Batch provides all the building blocks (JobLaunchRequest, JobLaunchingMessageHandler, etc.) to implement this pattern.

Service synchronization issue

I've created two services.
One of them (scheduler) only sends requests to the other (backoffice) to perform some "large" operations.
When backoffice receives a request:
it first creates a mark (a key in Redis) to record that the process has started.
Each time a request arrives:
backoffice checks whether the mark exists.
If it exists, it means that the previous process has not yet finished, so the request is skipped.
Otherwise, it performs the large process.
When the process is finished, the key in Redis is removed.
It would be something like this:
if (key exists)
    return;
make long process...;
remove key;
The problem arises when the service is destroyed before the process has finished: it then never removes the mark in Redis, which means the process will never run again.
Is there any way to solve this kind of problem?
The way to solve this problem is to use an existing engine, as building a custom scalable and robust solution for reliable service orchestration is really hard.
I recommend looking at Uber Cadence Workflow, which would allow you to convert your pseudocode into a real production application with minor changes.
You can fire a background job that updates a timestamp under the key, e.g. every minute.
When the service attempts to start the process, it must verify the key's existence (as it does now) plus the timestamp under the key. If the timestamp is more than 1 minute old, the previous attempt is stale and you can start over.
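A minimal PHP sketch of that staleness check, assuming Laravel's Redis facade and an invented key name (the question doesn't specify a language):

use Illuminate\Support\Facades\Redis;

$key = 'backoffice:large-process'; // hypothetical mark
$heartbeat = Redis::get($key);

// Skip only if another run refreshed the heartbeat within the last minute.
if ($heartbeat !== null && time() - (int) $heartbeat < 60) {
    return;
}

Redis::set($key, time()); // claim the (possibly stale) mark
// ... run the long process, refreshing the timestamp every minute: Redis::set($key, time()) ...
Redis::del($key); // remove the mark when finished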
Sounds like you should be using a messaging queue to schedule tasks for the back office service. Queuing solutions like RabbitMQ allow you to manually acknowledge (or "ack") that the process is complete. Whenever a subscriber crashes, the queue detects that the connection dropped without acknowledgement and re-enqueues the same task, which will be picked up by the next available subscriber. Here's another thread talking about this problem, specifically focused on messaging queues:
What happens to fetched messages when RabbitMQ consumer crashes?
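To illustrate the manual-acknowledgement idea, a rough php-amqplib sketch (queue name and connection details are assumptions); because the message is only acked after the work finishes, a crash mid-process makes RabbitMQ redeliver it:

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('backoffice-tasks', false, true, false, false); // durable queue

// no_ack = false: we acknowledge manually once the long process completes.
$channel->basic_consume('backoffice-tasks', '', false, false, false, false,
    function (AMQPMessage $msg) {
        // ... perform the large back-office process ...
        $msg->ack();
    });

while ($channel->is_consuming()) {
    $channel->wait();
}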

How reliable is delaying a Mail in Laravel?

I want to inform the seller via mail that the buyer is coming soon (about 2 hours before pickup time).
I would normally do it the hard way with CRON and a database table: checking hourly whether I find an order whose pickup time is 2 hours away, and only then sending the mail out.
Now, I would like to know if you would recommend using queued jobs for sending the mails out.
With
$when = now()->addDays(10); //I would dynamically set the date
Mail::to($order->seller())
->later($when, new BuyerIsComing($order));
I can delay the delivery of a queued email message.
But how safe would this be? Especially if someone orders something but picks it up in, let us exaggerate, two months?
Is the Laravel queueing system rigid enough to behave correctly after long delays (i.e. 2 months)?
Edit
I'm using Redis for Queueing
You actually have nothing to worry about. Sending mail usually increases the response time of your application, so it's a good thing you want to delay the sending.
Queues are the way to go and they're pretty easy to set up in Laravel. Laravel supports a couple of queue drivers out of the box. I would advise you to start with the database driver and then try Beanstalkd, etc.
Lastly, and somewhat more importantly, use a process manager like Supervisor to monitor and maintain your queue workers...
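A typical Supervisor program entry for Laravel workers looks roughly like this (paths, connection and process count are assumptions; adjust for your project):

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/your-app/artisan queue:work redis --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=2
redirect_stderr=true
stdout_logfile=/var/www/your-app/storage/logs/worker.log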
Take a look at https://laravel.com/docs/5.7/queues for more insight.
Cheers.
If by safe, you mean reliable, then it would be little different than sending an email immediately. If there's ever a possibility that your server "hiccups" and doesn't send an email, that possibility would be the same now as 10 minutes from now. Once the job is in the queue, it is persisted until completion (unless you use a memory-based driver, like Redis, which could get reset if the server reboots).
If you are using the database queue driver or a remote queue, the log of queued jobs will remain even if the server is unavailable for a short period of time. Your queue will be honored even if the exact timestamp for when you wanted to send the job has passed. For instance, if you schedule an email to be sent at 1:00pm but your server is down at that exact moment, when it comes back online it will still see the job, because it is stored as incomplete and the time for the job is in the past, which will trigger the execution of the job the next time your queue worker checks the job list.
Of course, this assumes that you have your queue worker set up to always check jobs and automatically restart, even after a server failure, but that's a different discussion with lots of solutions...such as those shown here.
If you're using the database driver with Laravel queues to process your email, then you don't need to worry about anything.
Jobs are only removed from the jobs table if they complete successfully; otherwise their next attempt time is set to a few minutes in the future and they are executed again (provided your queue worker is online).
So it's completely safe to use Laravel queues.
