Is there a way to implement a user-specific queue in Laravel 8, with the ability to fire an event when the queue is empty?

I currently have an import process, initiated by user input via an API, that dispatches a series of jobs to the default queue.
If I add the user id to the queue name when dispatching, the jobs will go to a user-specific queue, but I have no way of starting a queue worker for that specific queue. Is there any way to programmatically start a queue:work command to get around this?
Furthermore, I'd like to send a broadcast signal to the individual user once the queue has finished its jobs. My initial thought was to send the signal from an event subscriber that monitors that user-specific queue, assuming I can solve the initial question.
I found a partial approach here: Polling Laravel Queue after All Jobs are Complete
This doesn't fully work because it keeps triggering while the queue is empty, so I'd have to find some way to unsubscribe the event subscriber after it runs once. I'd also have to find a way to register the subscriber at runtime, once the import process has started, rather than in the EventServiceProvider as described in the official Laravel documentation.
https://laravel.com/docs/8.x/events#event-subscribers
One approach could be to create a custom table that tracks the active user queues: add a row when an import starts, have the Looping event subscriber iterate through that table, check each tracked queue's size, and, if it is 0, send the broadcast signal and remove the row.
Here are the events that already exist for queues: https://laravel.com/api/8.x/Illuminate/Queue/Events/Looping.html
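For reference, a rough sketch of that table-based idea, assuming a hypothetical pending_user_queues table and a hypothetical ImportCompleted broadcast event:

use Illuminate\Queue\Events\Looping;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Event;
use Illuminate\Support\Facades\Queue;

Event::listen(Looping::class, function (Looping $event) {
    // Check every tracked user queue; when one has drained,
    // notify its user and stop tracking it.
    foreach (DB::table('pending_user_queues')->get() as $row) {
        if (Queue::size($row->queue) === 0) {
            event(new \App\Events\ImportCompleted($row->user_id)); // hypothetical event
            DB::table('pending_user_queues')->where('id', $row->id)->delete();
        }
    }
});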
What's the best way to approach this?
Start to End:
The user provides a file to import; I interpret the file and dispatch jobs that process the data; once the jobs are finished, a broadcast signal should be sent to that user saying the import is complete.

You might want to use the Job Batches functionality.
It will let you dispatch jobs and run callbacks at the end. Here is an example from the docs:
use App\Jobs\ImportCsv;
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$batch = Bus::batch([
    new ImportCsv(1, 100),
    new ImportCsv(101, 200),
    new ImportCsv(201, 300),
    new ImportCsv(301, 400),
    new ImportCsv(401, 500),
])->then(function (Batch $batch) {
    // All jobs completed successfully...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();
You can send the broadcast event to the user at the end, in the then() or finally() callback.
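For example, a minimal sketch, assuming a hypothetical ImportCompleted event that implements ShouldBroadcast and a $userId captured from the import request:

use App\Events\ImportCompleted; // hypothetical broadcast event
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;

Bus::batch($importJobs) // the jobs built from the user's file
    ->finally(function (Batch $batch) use ($userId) {
        // Runs once every job in the batch has executed,
        // whether it succeeded or failed.
        event(new ImportCompleted($userId));
    })
    ->dispatch();

Remember that batchable jobs need the Illuminate\Bus\Batchable trait and the job_batches table (php artisan queue:batches-table && php artisan migrate).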

Related

How to handle 2 event sources using 2 separate threads in golang

This is a design-related question. I have a situation where my application receives events from 2 different sources it is registered with, and the application should handle events from these 2 sources in parallel. My application already handles events from one source using a buffered channel (where events are queued up and processed one after another). Now the application needs to handle events from a different source, and I cannot use the same channel here because the application may have to handle events from these 2 sources in parallel. I am thinking of using another buffered channel for the events from the second source, but I am concerned about the same resource being used to process 2 events in parallel. Even though we use channels, we still need to apply synchronization while processing these events.
Could you please suggest a better way, any patterns I can use, or a design to handle this situation?
This is the code I have now to handle events from one source:
for event := range thisObj.channel {
    log.Printf("Found a new event '%s' to process at the state %s", event, thisObj.currentState)
    stateins := thisObj.statesMap[thisObj.currentState]
    // This runs in a separate goroutine, so acquire the lock before asking a state to process the event.
    thisObj.mu.Lock()
    stateins.ProcessState(event.EventType, event.EventData)
    thisObj.mu.Unlock()
}
Here, thisObj.channel is created at startup and events are added in a separate method. This loop reads events from the channel and processes them.
You can use the for-select pattern with a separate channel per source, so events from both sources are handled as they arrive:

var (
    eventsA chan EventA // channel fed by the first source
    eventsB chan EventB // channel fed by the second source
)

for {
    select {
    case a := <-eventsA:
        handleA(a) // process an event from source A
    case b := <-eventsB:
        handleB(b) // process an event from source B
    }
}

A default case would fire whenever neither channel has an event ready and make the loop spin, so it is omitted here. If handleA and handleB touch shared state, you still need the mutex around that state, just as in your existing loop.

Running nested batches inside a batched chain of jobs

I have a batched series of chained jobs, and inside those chains I need to be able to batch other jobs.
Say I have 3 clients.
For each client I need to:
Sync their details with an external API
Create 0 or more new cases and sync them individually
Update 0 or more existing cases and sync them individually
And I need the wrapping batch to keep track of when this is all finished.
I currently have the following structure:
$jobs = $clients->map(fn (Client $client) => [
    new SyncClientJob(...),
    new CreateMultipleCasesJob(...),
    new UpdateMultipleCasesJob(...),
]);

Bus::batch($jobs)->name('BatchA')->etc()
In CreateMultipleCasesJob, something along the lines of:
public function handle()
{
    $jobs = $collection_of_new_cases->map(fn (Case $case) => new CreateSingleCaseJob($case));

    Bus::batch($jobs)->dispatch();
}
CreateMultipleCasesJob and UpdateMultipleCasesJob should both dispatch their own batch of jobs, since each case needs to be synced individually.
The problem is of course that the Create/Update jobs are "complete" in the chain as soon as they're dispatched, not when all their internal jobs are completed. So the BatchA batch will be marked as completed when it hasn't actually synced any cases yet.
I solved this by having each batch of jobs dispatch an event in the ->finally() callback. The listener for that event would then build and start the next batch.
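A sketch of that approach, using hypothetical names (a CasesBatchFinished event and whatever listener you register for it):

use App\Events\CasesBatchFinished; // hypothetical event
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;

// Inside CreateMultipleCasesJob::handle(): dispatch the nested
// batch and announce when it has actually finished, rather than
// letting the outer chain advance on dispatch.
Bus::batch($caseJobs)
    ->finally(function (Batch $batch) {
        event(new CasesBatchFinished($batch->id));
    })
    ->dispatch();

The listener for CasesBatchFinished then builds and dispatches the next batch, so the overall flow only advances once the nested jobs have actually run.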

How to perform a heavy database-related task in Laravel that consumes more than 30 seconds

I'm developing a binary multi-level marketing system in Laravel. At registration time, we have to create entries for many types of bonuses for each parent node of a new user, and this task is time-consuming.
No user wants to watch a spinner while a task takes more than 30 seconds; that is not the right way.
I want to run this mechanism in the background and immediately send a success message saying the account was created.
You could use observers that trigger queued jobs.
After the user performs an action on a model, the observer dispatches queued jobs in the background. While the queue is being processed, the user can continue working.
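A minimal sketch, assuming a hypothetical UserObserver and a hypothetical queued ProcessBonuses job:

use App\Jobs\ProcessBonuses; // hypothetical queued job
use App\Models\User;

class UserObserver
{
    public function created(User $user): void
    {
        // Runs right after the user row is created; the heavy
        // bonus calculation is pushed onto the queue instead of
        // blocking the registration request.
        ProcessBonuses::dispatch($user);
    }
}

Register the observer, for example with User::observe(UserObserver::class); in a service provider, and make ProcessBonuses implement ShouldQueue so it runs on a worker.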
Either implement Laravel jobs and queues, or use https://github.com/spatie/async.
You can spawn child processes to run your task:
use Spatie\Async\Pool;
use Throwable;

$pool = Pool::create();

foreach ($things as $thing) {
    $pool->add(function () use ($thing) {
        // Do a thing
    })->then(function ($output) {
        // Handle success
    })->catch(function (Throwable $exception) {
        // Handle exception
    });
}

// Block until all child processes have finished.
$pool->wait();

How do I return a message back to SQS from a Lambda trigger

I have a Lambda trigger that reads messages from an SQS queue. In some conditions, the message may not be ready for processing, so I'd like to put the message back in the queue for 1 minute and try again. Currently, I create another copy of the customer record and post this new copy to the queue. Is there a reason/way for me to keep the original record in the queue as opposed to creating a new one?
import datetime
import json

import boto3

def postToQueue(customer):
    if 'attemptCount' in customer.keys():
        attemptCount = int(customer["attemptCount"]) + 1
    else:
        attemptCount = 2
    customer["attemptCount"] = attemptCount
    # Get the service resource
    sqs = boto3.resource('sqs')
    # Get the queue
    queue = sqs.get_queue_by_name(QueueName='testCustomerQueue')
    response = queue.send_message(MessageBody=json.dumps(customer), DelaySeconds=60)
    print('customer postback: ', customer)
    print('response from writing to the queue is: ', response)

# main function
for record in event['Records']:
    if 'body' in record.keys():
        customer = json.loads(record['body'])
        print("attempting to process customer", customer, " at: ", datetime.datetime.now())
        if not ifReadyToProcess(customer):
            postToQueue(customer)
        else:
            processCustomer(customer)
This is not an ideal setup for SQS triggering Lambda functions.
My testing shows that messages sent to SQS will immediately trigger the Lambda function, even if a Delay setting is provided. Therefore, putting a message back onto the SQS queue will cause Lambda to fire again straight after.
To avoid a situation where Lambda is continually checking whether a message is ready for processing, I would recommend:
Use Amazon CloudWatch Events to trigger a Lambda function on a schedule (e.g. every 2 minutes).
The Lambda function should pull messages from the queue and check whether they are ready to process.
If they are ready, process them and delete them.
If they are not ready, push them back onto the queue with a Delay setting and delete the original message.
Note that this is different from having SQS directly trigger Lambda. Instead, the Lambda function calls ReceiveMessage() to obtain the message(s) itself, which allows the Delay setting to add some time between checks.
Another option: instead of re-inserting a message into the queue, you could simply take advantage of the Default Visibility Timeout setting by not deleting the message. A message that is read from the queue but not deleted will automatically "reappear" on the queue. You could use this as the "retry" time period. However, this means you will need to handle dead-letter processing yourself (e.g. if a message fails to be processed after n tries).

How to get job data from reserved jobs in Laravel using Pheanstalk?

I am working on a feature where I need to check the job status in beanstalkd queues. I have tried a few things, but I am not getting the jobs that are reserved for queues other than the default queue.
$pheanstalk = \Illuminate\Support\Facades\Queue::getPheanstalk();
$pheanstalk->useTube('import-live');
$pheanstalk->watch('import-live');

while ($job = $pheanstalk->reserve(0)) {
    var_dump(json_decode($job->getData(), true));
}
This is what I have tried, but I still get the data for the default queue. Does anyone have an idea how to get the data for the import-live queue as well, or for any other queues I have running in my system? Basically, I want to get the data on all the queues in the system.
First of all, make sure that there are jobs in the other queues.
Then, if you don't want to get jobs from the 'default' queue for a particular run, you can ignore it:
$job = $pheanstalk
    ->watch('import-live')
    ->watch('import-other')
    ->ignore('default')
    ->reserve();
->useTube('..') is only used when put()-ing messages into a queue.
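Building on that, a rough sketch for inspecting the data on several tubes without consuming the jobs, assuming a Pheanstalk v3-style reserve(0) that returns false when nothing is ready (tube names are examples):

$pheanstalk = \Illuminate\Support\Facades\Queue::getPheanstalk();
$pheanstalk->watch('import-live')->watch('import-other')->ignore('default');

$jobs = [];
while ($job = $pheanstalk->reserve(0)) {
    // Reserved jobs are invisible to other workers, so the loop
    // drains each tube instead of re-reading the same job.
    $jobs[] = $job;
    var_dump(json_decode($job->getData(), true));
}

// Put everything back so inspection does not eat the jobs.
foreach ($jobs as $job) {
    $pheanstalk->release($job);
}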
