How to prevent Laravel jobs from running at the same time - laravel

I have a job as below:
use Exception;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

class ProcessActions implements ShouldQueue
{
    use Dispatchable;
    use InteractsWithQueue;
    use Queueable;
    use SerializesModels;

    protected $user_id;

    public $uniqueFor = 4;

    /**
     * Create a new job instance.
     *
     * @param mixed $user_id
     */
    public function __construct($user_id)
    {
        $this->user_id = $user_id;
    }

    public function uniqueId()
    {
        return $this->user_id;
    }

    /**
     * Handle a job failure.
     */
    public function failed()
    {
    }

    /**
     * Execute the job.
     */
    public function handle()
    {
        Log::debug('');
        Log::debug('Started Time: '.date('Y-m-d H:i:s'));

        try {
            // ...
        } catch (Exception $e) {
            Log::critical('Error occurred.');
            Log::critical($e);

            // mark the job as failed
            $this->job->fail($e);
        }
    }

    public function middleware()
    {
        return [(new WithoutOverlapping($this->user_id))->releaseAfter(4)->expireAfter(4)];
    }
}
How should I prevent jobs from running at the same time? Unique jobs and WithoutOverlapping didn't work; I still have jobs running simultaneously.
What I actually want is for jobs with the same user_id to run 4 seconds apart.
Also, when I compared the jobs' available_at values with the times I logged in handle, there was a difference of a few seconds.

You can use the rate limiting feature in Laravel.
You can read more about it here: https://laravel.com/docs/8.x/queues#rate-limiting
In your job class, add the following code inside the handle method. Make sure the key is a unique value; here is a sample.
Basically, it limits the worker to processing one job for a given user at a time (see $uuid); the next job will be processed 4 seconds later.
// at the top of the job file:
use Illuminate\Support\Facades\Redis;

$uuid = 'process_action_'.$this->user_id;

Redis::funnel($uuid)->limit(1)->then(function () {
    // job logic
}, function () {
    // reattempt the job after 4 seconds
    return $this->release(4);
});
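A side note, not from the original answer: the $uniqueFor/uniqueId() shown in the question only take effect when the job also implements Laravel's ShouldBeUnique contract (Laravel 8+); without it they are silently ignored. A minimal sketch:
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

// Sketch: uniqueId()/uniqueFor are only honored when the job
// implements ShouldBeUnique in addition to ShouldQueue.
class ProcessActions implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable;

    protected $user_id;

    /** Seconds the unique lock is held before it expires. */
    public $uniqueFor = 4;

    public function __construct($user_id)
    {
        $this->user_id = $user_id;
    }

    /** Allow only one pending job per user at a time. */
    public function uniqueId()
    {
        return $this->user_id;
    }

    public function handle()
    {
        // job logic...
    }
}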

Related

Laravel Nested Jobs

I created a job that has a foreach loop that dispatches another job. Is there a way to fire an event when all the nested jobs are completed?
When triggered, here is what happens.
Step 1. First I trigger the batch job:
GenerateBatchReports::dispatch($orderable);
Step 2. We then run a loop and queue the other jobs:
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $dir = storage_path('reports/tmp/'.str_slug($this->event->company) . '-event');

    if (file_exists($dir)) {
        File::deleteDirectory($dir);
    }

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatch($model);
    }
}
I just need to know when all the nested jobs are done so I can zip the reports up and email them to a user. When the batch job is done queueing all the nested jobs, it is removed from the list. Is there a way to keep the job around until the nested jobs are done, then fire an event?
Any help would be appreciated.
Update: Laravel 8 (release planned for 8th September 2020) will provide job batching. This feature is already documented, is probably perfect for the nested jobs scenario, and looks like this:
$batch = Bus::batch([
    new ProcessPodcast(Podcast::find(1)),
    new ProcessPodcast(Podcast::find(2)),
    new ProcessPodcast(Podcast::find(3)),
    new ProcessPodcast(Podcast::find(4)),
    new ProcessPodcast(Podcast::find(5)),
])->then(function (Batch $batch) {
    // All jobs completed successfully...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();
We will also be able to add additional batched jobs on the fly:
$this->batch()->add(Collection::times(1000, function () {
    return new ImportContacts;
}));
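Note that batching stores its meta information in a database table; Laravel 8 provides the php artisan queue:batches-table command to generate that migration before you run php artisan migrate.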
Original answer 👇
I came up with a different solution, because I have a queue running several worker processes. So, for me:
No dispatchNow, because I want to keep jobs running in parallel.
With several processes, I need to make sure the final job will not run before the last nested job has finished, and simple chaining doesn't guarantee that.
So my inelegant solution that fulfils the requirements is to dispatch all the nested jobs and, in the last one, dispatch the final job with a couple of seconds of delay, to make sure any other nested jobs that may still be running in parallel will have terminated.
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $last_participant_id = $this->event->participants->last()->id;

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        $is_last = $participant->id === $last_participant_id;
        GenerateSingleReport::dispatch($model, $is_last);
    }
}
and in GenerateSingleReport.php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class GenerateSingleReport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $model;
    protected $runFinalJob;

    public function __construct($model, $run_final_job = false)
    {
        $this->model = $model;
        $this->runFinalJob = $run_final_job;
    }

    public function handle()
    {
        // job normal stuff…

        if ($this->runFinalJob) {
            FinalJob::dispatch()->delay(30);
        }
    }
}
Alternatively
I'm throwing out another idea, so the code is not flawless. Maybe a wrapper job could be created, dedicated to running the last nested job chained with the final job.
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $last_participant_id = $this->event->participants->last()->id;

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        $is_last = $participant->id === $last_participant_id;

        if ($is_last) {
            ChainWithDelay::dispatch(
                new GenerateSingleReport($model), // last nested job
                new FinalJob(),                   // final job
                30                                // delay
            );
        } else {
            GenerateSingleReport::dispatch($model, $is_last);
        }
    }
}
And in ChainWithDelay.php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ChainWithDelay implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $job;
    protected $finalJob;
    protected $delay;

    public function __construct($job, $final_job, $delay = 0)
    {
        $this->job = $job;
        $this->finalJob = $final_job;
        $this->delay = $delay;
    }

    public function handle()
    {
        $this->job
            ->withChain([$this->finalJob->delay($this->delay)])
            ->dispatchNow();
    }
}
For Laravel >= 5.7
You can use the dispatchNow method. That will keep the parent job alive while the child jobs are processing:
https://laravel.com/docs/5.8/queues#synchronous-dispatching
Parent job:
public function handle()
{
    // ...

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatchNow($model);
    }

    // then do something else...
}
For Laravel 5.2 - 5.6
You could use the sync connection:
https://laravel.com/docs/5.5/queues#customizing-the-queue-and-connection
Make sure the connection is defined in your config/queue.php:
https://github.com/laravel/laravel/blob/5.5/config/queue.php#L31
Parent job (NOTE: This syntax is for 5.5. The docs are a little different for 5.2):
public function handle()
{
    // ...

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatch($model)->onConnection('sync');
    }

    // then do something else...
}
You could use Laravel's job chaining. It allows you to run a bunch of jobs in sequence and if one fails, the rest in the chain will not be run.
The basic syntax looks like this:
FirstJob::withChain([
    new SecondJob($param),
    new ThirdJob($param)
])->dispatch($param_for_first_job);
In your case you could add all of your GenerateSingleReport jobs to an array except the first one, then add the final job that you want to run to the end of the array. Then you can pass that array to the withChain method on the first job.
$jobs = [];
$first_model = null;

foreach ($this->event->participants as $participant) {
    $model = $participant->exercise;

    if ($first_model === null) {
        $first_model = $model;
    } else {
        $jobs[] = new GenerateSingleReport($model);
    }
}

$jobs[] = new FinalJob();

GenerateSingleReport::withChain($jobs)->dispatch($first_model);
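One trade-off to keep in mind: a chain runs its jobs sequentially on a single worker, so this approach gives up the parallelism of the original dispatch loop in exchange for a guaranteed completion point.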

Laravel 5.5. Re-send data, replacing existing values

there is a task:
1) the user is retrieved and stored in a variable: $user = User::find(1);
2) then a function displays the experience;
3) in parallel with that function, an asynchronous method changes the experience by a random number every few seconds.
The first function then displays the user's experience once again. What will that output be?
How can I implement displaying deferred calculations on the page?
Do I understand correctly that the sequence should be the following:
- the page shows the experience;
- in parallel, every 3 seconds, an experience update is launched;
- after a minute (for example), the experience value is updated on the page?
There are 2 operations happening essentially independently of each other:
- fetching and showing the information on the frontend
- updating the data in the background
Each of these has some nitty-gritty details, like updates every X seconds, which can be handled if we look at them as separate operations.
Frontend:
You can have a simple route which gets the user using $user = User::find(1); as you mentioned in the question, and shows the information. That data will be whatever the respective user record contains at the moment the query is performed. It will have nothing to do with the background updates happening.
Then, to fetch the updates, depending on whichever frontend JavaScript library you are using, you can have an Ajax call happening at an interval of X minutes, depending on your refresh rate. This Ajax call will get the updated information.
Handling background updates:
You can create an artisan command which has the code to update the records.
Then you can schedule it to run every 3 minutes using Laravel scheduling, like:
$schedule->command('update:userexperience')->cron('*/3 * * * *');
And add * * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1 to your server's crontab.
All of the above details are in the documentation.
Tips while scheduling the command:
When you are updating users, if you are running just one query to update all of them, that's great. But if you have logic that assigns a new value to each user in an individual row update, use chunking to load a limited number of records at a time, as sketched below. This will help you keep memory utilisation under a limit.
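A minimal sketch of chunked updates (the chunk size and the random experience value are illustrative, not from the original answer):
use App\User;

// Process users 500 at a time instead of loading the whole table;
// chunkById keeps the paging stable while rows are being updated.
User::chunkById(500, function ($users) {
    foreach ($users as $user) {
        $user->update(['experience' => rand(1, 100)]);
    }
});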
Testing before scheduling:
Running it every 3 seconds is a very, very small interval. I would suggest first running the command manually and checking how much time it takes. If the background process takes 2 seconds to complete one run, a 3-second interval is too small.
Also, if the number of records in the users table is increasing quickly, you will need to revisit this later and increase the cron interval. It is good to keep that in mind.
Update:
The smallest unit in cron is 1 minute. So to schedule it for every 3 seconds you can do the following inside the schedule function of app/Console/Kernel.php:
$secondInterval = 3;
$totalJobs = 60 / $secondInterval;

for ($i = 0; $i < $totalJobs; $i++) {
    // stagger the 20 commands scheduled each minute by 0, 3, 6, ... seconds
    $schedule->command('update:userexperience', ['--delay' => $i * $secondInterval])->everyMinute();
}
And then, inside your command, you apply the delay:
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class UpdateUserExperienceCommand extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'update:userexperience {--delay= : Number of seconds to delay command}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Update user experience';

    /**
     * Create a new command instance.
     *
     * @return void
     */
    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the command.
     *
     * @return void
     */
    public function handle()
    {
        sleep($this->option('delay'));

        // update users

        return;
    }
}
Done.
Create a job designed to perform 30 updates within one and a half minutes, one every 3 seconds.
use App\User;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class UpdateUserExperienceJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $user;

    /**
     * Create a new job instance.
     *
     * @param User $user
     */
    public function __construct(User $user)
    {
        $this->user = $user;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        for ($i = 3; $i <= 90; $i += 3) {
            sleep(3);
            $this->user->update(['experience' => rand()]);
        }
    }
}
Create a method in the controller. Whether the job is dispatched depends on the repeatRequest parameter.
public function getExperience(): array
{
    $user = User::find(request()->get('user_id'));

    // only dispatch the job on the initial request (repeatRequest=false)
    if (request()->get('repeatRequest') !== 'true') {
        UpdateUserExperienceJob::dispatch($user);
    }

    return ['experience' => $user->experience];
}
The frontend is implemented with Vue.js. In the component, on load, we first get the user's current experience, and then every 5 seconds we run a repeated request with repeatRequest set to true. Thanks to computed properties, the value changes dynamically.
<template>
    <div class="container">
        <p>experience is updated every 5 seconds</p>
        <p v-html="computedExperience"></p>
    </div>
</template>
<script>
export default {
    data: function () {
        return {
            experience: 0,
            user_id: parseInt(document.URL.replace(/\D+/g, ""))
        }
    },
    mounted() {
        this.firstExperience();
        setInterval(this.repeatedExperience.bind(this), 5000);
    },
    computed: {
        computedExperience() {
            return this.experience;
        },
    },
    methods: {
        firstExperience: function () {
            axios.get(document.URL + '/api', {
                params: {
                    user_id: this.user_id,
                    repeatRequest: false
                }
            }).then((response) => {
                this.experience = response.data.experience;
            });
        },
        repeatedExperience: function () {
            axios.get(document.URL + '/api', {
                params: {
                    user_id: this.user_id,
                    repeatRequest: true
                }
            }).then((response) => {
                this.experience = response.data.experience;
            });
        }
    },
}
</script>

How to Make a Scheduler Task for Every New Record After 20 Minutes in Laravel?

I have a parking system that uses Angular 6 with Laravel for the backend, and I have a specific problem that I don't know the right approach for.
The park has two totems that send an entry to my server. I check whether the client is invalid only when they go out through the exit totem or go to the payment area.
This is the code that runs when the ticket's EAN-13 barcode is read by the exit totem:
public function getEntrysDataByEan(Request $request)
{
    if (isset($request)) {
        $entryean = $request->input('entryean13');
        $entry = $this->mdMovEntry->getEntrysDataByEan($entryean);

        if (empty($entry)) {
            $response["success"] = 0;
            $response["message"] = "Não existe nenhuma entrada correspondente";
        } else {
            $nowHour = Carbon::now();
            $enterHour = Carbon::parse($entry[0]->updated_at);
            $difmin = $enterHour->diffInMinutes($nowHour);
            $dif = $enterHour->diffInHours($nowHour);

            if ($difmin <= 20) {
                $this->mdMovEntry->validatedEntryByEan($entryean, Carbon::parse($entry[0]->updated_at), $nowHour);
                $entry[0]->validated = 'S';
            } else {
                $this->mdMovEntry->devaluedEntryByEan($entryean, Carbon::parse($entry[0]->updated_at));
                $entry[0]->validated = 'N';
            }

            $response["success"] = 1;
            $response["message"] = "Entrada retornada com sucesso";
            $response["entry"] = $entry;
        }
    } else {
        $response["success"] = 0;
        $response["message"] = "Nenhum dado enviado";
    }

    return $response;
}
The problem is that I think this is too much processing just to decide whether the client can go out or not, so I looked into Laravel's task scheduling and job approaches and didn't find anything that fits my problem.
What I want is: when the entry totem inserts a record into my database, create some job or task that marks every new record as invalidated after 20 minutes.
Then, when the client goes out, the system just fetches the entry from the database and checks whether it is validated or not.
How can I achieve this? I ask because every example for task scheduling is a global execution, but I want a task per new record. Is that possible?
Consider using model events in conjunction with delayed dispatching of queued jobs.
The following listens for the created event when new entries are inserted into the database and dispatches a job, delayed by 20 minutes, to later invalidate the entry.
InvalidateEntry Job
class InvalidateEntry implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $entry;

    /**
     * Create a new job instance.
     *
     * @param Entry $entry
     * @return void
     */
    public function __construct(Entry $entry)
    {
        $this->entry = $entry;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        // you'll probably need to add some conditional logic
        // before updating the entry to determine if the
        // action is still required.
        $isEntryInvalid = ...;

        if ($isEntryInvalid) {
            $this->entry->update([
                'validated' => 'N'
            ]);
        } else {
            // delete the job from the queue
            $this->delete();
        }
    }
}
Entry Model
class Entry extends Model
{
public static function boot()
{
parent::boot();
static::created(function (Entry $entry) {
InvalidateEntry::dispatch($entry)->delay(now()->addMinutes(20));
});
}
}
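One caveat, as a general note: delayed dispatching requires a queue driver that supports delays. The sync driver executes jobs immediately and ignores the delay, and the Amazon SQS driver caps delays at 15 minutes, so a 20-minute delay needs a driver such as database or redis.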

No query results for model at boot created event

I have a model with a boot function which registers a created event, as shown below.
However, I sometimes (not all the time) get No query results for model in ProcessAddressRefine, which is a job. As far as I understand, the created event should fire after the record is created, so there should be no way to get no query results unless the record is deleted right after it has been created. Looking at the DB records, though, the ProcessAddressRefine job appears to have executed properly.
What could be the problem in this case?
Any advice or suggestion would be appreciated. Thank you.
Model
public static function boot()
{
    parent::boot();

    self::created(function ($model) {
        if (!$model->lat || !$model->lng) {
            ProcessAddressRefine::dispatch($model);
        }
    });
}
Job
class ProcessAddressRefine implements ShouldQueue
{
    use Dispatchable, SerializesModels;

    private $place;

    public function __construct($place)
    {
        $this->place = $place;
    }

    public function handle()
    {
        if ($this->place->addressRefine()) {
            $this->place->save();
        }
    }
}
Extra
public function addressRefine()
{
    $helper = new MapHelper();
    $coordinate = $helper->addressToCoordinate($geo_code_address);

    if ($coordinate !== false) {
        $this->lat = $coordinate['lat'];
        $this->lng = $coordinate['lng'];

        return true;
    } else {
        return false;
    }
}
Assuming the job is queued, it's quite possible that the model is created, then you dispatch the job, then the model is deleted, and only then is the job actually executed; you get this message because the model no longer exists.
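If the record may legitimately be deleted before the worker gets to the job, Laravel's deleteWhenMissingModels property (available in recent Laravel versions) makes the worker discard the job quietly instead of failing. A minimal sketch under that assumption:
class ProcessAddressRefine implements ShouldQueue
{
    use Dispatchable, SerializesModels;

    // When the serialized model no longer exists at execution time,
    // delete the job silently instead of throwing
    // "No query results for model".
    public $deleteWhenMissingModels = true;

    // ...
}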
This was because of DB::transaction when the Order record is created: the job was dispatched from the created event while the transaction was still open, so the queue worker could pick it up before the record was actually committed.
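A minimal sketch of one common fix, assuming Laravel 8+ where dispatched jobs support afterCommit (on older versions, move the dispatch outside the transaction instead):
self::created(function ($model) {
    if (!$model->lat || !$model->lng) {
        // Only push the job onto the queue once the surrounding
        // database transaction has committed, so the worker can
        // actually find the record.
        ProcessAddressRefine::dispatch($model)->afterCommit();
    }
});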

Silex, MongoDB and updating a document: Unique constraint validation doesn't work properly

I'm testing Silex, MongoDB and form validation, and I've run into a validation issue when using the MongoDB ODM and a before middleware, whether it is security related or not.
I have a document and have added a Unique constraint on one of its fields.
If the document is not requested before the validation actually executes, everything works fine.
But say I have a before middleware that requests the document before the controller with validation is executed. In that case the Unique constraint does not work properly and displays an error message saying This value is already used. But it should not, since the value used is unique to the object being validated.
To sum up: when validating a document with a Unique constraint on a field, if the document is requested through the document manager (with no modification performed on the document) before the validation code actually executes, the Unique constraint does not seem to work properly.
I'm going to set up a test case, but in the meantime here is an example of how I set up the validation on the document; if anybody has come across this issue, please let me know.
namespace Document;

//use Doctrine\Bundle\MongoDBBundle\Validator\Constraints\Unique as MongoDBUnique;
use Doctrine\Bundle\MongoDBBundle\Validator\Constraints\Unique;
use Doctrine\ODM\MongoDB\Mapping\Annotations as ODM;
use Symfony\Component\Validator\Constraints\Length;
use Symfony\Component\Validator\Mapping\ClassMetadata;

/**
 * @ODM\Document(collection="blog_post")
 */
class Post
{
    /**
     * @ODM\Id
     */
    protected $id;

    /**
     * @ODM\String
     * @ODM\UniqueIndex
     */
    protected $title;

    /**
     * @ODM\String
     */
    protected $body;

    /**
     * @ODM\Date
     */
    protected $createdAt;

    /**
     * @ODM\ReferenceOne(targetDocument="Document\User",cascade="update,merge",inversedBy="posts")
     */
    protected $user;

    function __construct() {
    }

    public function getTitle() {
        return $this->title;
    }

    public function setTitle($title) {
        $this->title = $title;
    }

    public function getBody() {
        return $this->body;
    }

    public function setBody($body) {
        $this->body = $body;
    }

    public function getCreatedAt() {
        return $this->createdAt;
    }

    public function setCreatedAt($createdAt) {
        $this->createdAt = $createdAt;
    }

    public function getId() {
        return $this->id;
    }

    function getUser() {
        return $this->user;
    }

    function setUser(User $user) {
        $this->user = $user;
        $this->user->addPost($this);
    }

    function __toString() {
        return $this->title;
    }

    static function loadValidatorMetadata(ClassMetadata $metadatas) {
        $metadatas->addPropertyConstraint("body", new Length(array('min' => 10, 'max' => 1000)));
        $metadatas->addPropertyConstraint("title", new Length(array('min' => 5, 'max' => 255)));
        $metadatas->addConstraint(new Unique(array('fields' => 'title')));
    }
}
The app is in this GitHub repository (work in progress).
Edit 1: An example of a before middleware that would trigger the erratic behavior of the Unique constraint (in the demo/mongoddb/blog/app/Config.php file in the repo linked above):
$app['must_be_post_owner'] = $app->protect(function (Request $req) use ($app) {
    $postId = $req->attributes->get('id');
    $user = $app['security']->getToken()->getUser();
    $post = $app['odm.dm']->getRepository('Document\Post')->findOneBy(array('id' => $postId));

    if ($post->getUser() !== $user) {
        $app['logger']->alert("Access denied for user $user to post with id $postId ");
        $app->abort(500, 'You cant access this resource !');
    }
});
Edit 2: I tried to debug the Symfony/Doctrine-Bridge UniqueEntityValidator class with XDebug, but I get socket errors every time a MongoDB-related function is executed during step debugging.
