Laravel 5.5: Re-send data, replacing the existing values

There is a task:
1) A user is retrieved and stored in a variable: $user = User::find(1);
2) A function then displays the user's experience;
3) In parallel, an asynchronous method changes the experience by a random number every few seconds.
The first function then displays the user's experience once again. What will that output be?
How can I implement displaying these deferred calculations on the page?
Do I understand correctly that the sequence should be the following:
- the page shows the experience;
- in parallel, an experience update runs every 3 seconds;
- after a minute (for example), the experience value on the page is updated?

There are two operations happening essentially independently of each other:
- fetching and showing the information on the frontend;
- updating the data in the background.
Each of these has its own details, such as updating every X seconds, which are easier to handle if we treat them as separate operations.
Frontend:
You can have a simple route that fetches the user with $user = User::find(1);, as you mentioned in the question, and shows the information. That data is whatever the user record contains at the moment the query runs; it has nothing to do with the background updates happening at the same time.
To then fetch the updates, depending on which frontend JavaScript library you are using, you can make an Ajax call at an interval of X seconds matching your refresh rate. This Ajax call fetches the updated information.
Handling background updates:
You can create an artisan command that updates the records.
Then you can schedule it to run every 3 minutes using Laravel's scheduler:
$schedule->command('update:userexperience')->cron('*/3 * * * *');
And add * * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1 to your server's crontab.
All of the above is covered in the documentation.
Tips when scheduling the command:
If you are updating all users with a single query, great. But if your logic assigns a new value to each user with an individual row update, use chunk to load a limited number of records at a time, as sketched below. This keeps memory utilisation under control.
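A minimal sketch of that chunked approach, assuming the default App\User model, an experience column, and placeholder update logic inside the command's handle() method:
use App\User;

// Inside handle(): process users 500 at a time so only one chunk of models
// is held in memory. The experience calculation below is a placeholder for
// your real per-user logic.
User::chunk(500, function ($users) {
    foreach ($users as $user) {
        $user->update(['experience' => $user->experience + rand(1, 10)]);
    }
});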
Testing before scheduling:
Every 3 seconds is a very small interval. I would suggest first running the command manually and checking how long it takes. If the background process takes 2 seconds to complete one run, a 3-second interval is too small.
Also, if the number of records in the users table grows quickly, you will need to revisit this later and increase the cron interval. It is good to keep that in mind.
Update:
The smallest unit in cron is 1 minute. To effectively run every 3 seconds, you can do the following inside the schedule() function of app/Console/Kernel.php:
$secondInterval = 3;
$totalJobs = 60 / $secondInterval;

// Schedule 20 copies per minute, each delayed by a growing number of seconds.
for ($i = 0; $i < $totalJobs; $i++) {
    $schedule->command('update:userexperience', ['--delay' => $i * $secondInterval])->everyMinute();
}
And then inside your command you apply the delay:
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class UpdateUserExperienceCommand extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'update:userexperience {--delay= : Number of seconds to delay the command}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Update user experience';

    /**
     * Create a new command instance.
     *
     * @return void
     */
    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the command.
     *
     * @return void
     */
    public function handle()
    {
        // Wait for the offset passed by the scheduler before doing the work.
        sleep((int) $this->option('delay'));

        // Update users here.
        return;
    }
}
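In the default Laravel 5.5 skeleton, commands placed in app/Console/Commands are picked up automatically by the $this->load(__DIR__.'/Commands') call in app/Console/Kernel.php; if your Kernel does not do that, register the command explicitly (a sketch):
protected $commands = [
    \App\Console\Commands\UpdateUserExperienceCommand::class,
];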

Done. Create a Job designed to make 30 updates within one and a half minutes, one every 3 seconds.
class UpdateUserExperienceJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $user;

    /**
     * Create a new job instance.
     *
     * @param User $user
     */
    public function __construct(User $user)
    {
        $this->user = $user;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        // 30 iterations over 90 seconds: update the experience every 3 seconds.
        for ($i = 3; $i <= 90; $i += 3) {
            sleep(3);
            $this->user->update(['experience' => rand()]);
        }
    }
}
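Note that this only runs in the background if a real queue driver is configured and a worker is running; with the default sync driver the dispatch would block the HTTP request for the whole 90 seconds. Assuming the database or redis driver is set (QUEUE_DRIVER in .env on Laravel 5.5), start a worker with:
php artisan queue:work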
Create a method in the controller. Whether the Job is dispatched depends on the repeatRequest parameter: the job is only started on the initial request.
public function getExperience(): array
{
    $user = User::find(request()->get('user_id'));

    // Only kick off the background job on the initial request (repeatRequest = false).
    if (request()->get('repeatRequest') !== 'true') {
        UpdateUserExperienceJob::dispatch($user);
    }

    return ['experience' => $user->experience];
}
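For completeness, the method needs a route the front end can hit; a sketch assuming it lives on a UserExperienceController and that the page URL ends with the user id, so document.URL + '/api' resolves to something like /users/1/api (the names and path here are assumptions):
// routes/web.php
Route::get('/users/{id}/api', 'UserExperienceController@getExperience');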
The front end is implemented with Vue.js. In the component, on mount we first fetch the user's current experience, and then every 5 seconds we make a repeated request with repeatRequest set to true. Thanks to the computed property, the displayed value updates dynamically.
<template>
  <div class="container">
    <p>experience is updated every 5 seconds</p>
    <p v-html="computedExperience"></p>
  </div>
</template>
<script>
export default {
  data: function () {
    return {
      experience: 0,
      user_id: parseInt(document.URL.replace(/\D+/g, ""))
    }
  },
  mounted() {
    this.firstExperience();
    setInterval(this.repeatedExperience.bind(this), 5000);
  },
  computed: {
    computedExperience() {
      return this.experience;
    },
  },
  methods: {
    firstExperience: function () {
      axios.get(document.URL + '/api', {
        params: {
          user_id: this.user_id,
          repeatRequest: false
        }
      }).then((response) => {
        this.experience = response.data.experience;
      });
    },
    repeatedExperience: function () {
      axios.get(document.URL + '/api', {
        params: {
          user_id: this.user_id,
          repeatRequest: true
        }
      }).then((response) => {
        this.experience = response.data.experience;
      });
    }
  },
}
</script>

Related

How to prevent Laravel jobs from running at the same time

I have a job as below:
class ProcessActions implements ShouldQueue
{
    use Dispatchable;
    use InteractsWithQueue;
    use Queueable;
    use SerializesModels;

    protected $user_id;

    public $uniqueFor = 4;

    /**
     * Create a new job instance.
     *
     * @param mixed $user_id
     */
    public function __construct($user_id)
    {
        $this->user_id = $user_id;
    }

    public function uniqueId()
    {
        return $this->user_id;
    }

    /**
     * Handle a job failure.
     */
    public function failed()
    {
    }

    /**
     * Execute the job.
     */
    public function handle()
    {
        Log::debug('');
        Log::debug('Started Time: '.date('Y-m-d H:i:s'));
        try {
        } catch (Exception $e) {
            Log::critical('Error occurred.');
            Log::critical($e);
            // make the job failed
            $this->job->fail($e);
        }
    }

    public function middleware()
    {
        return [(new WithoutOverlapping($this->user_id))->releaseAfter(4)->expireAfter(4)];
    }
}
How should I prevent jobs from running at the same time? Unique jobs and WithoutOverlapping didn't work, and I still have jobs that run at the same time.
What I actually want is for jobs with the same user_id to run 4 seconds apart.
Also, as I checked, the jobs' available_at and the time I log inside handle differ by a few seconds.
You can use the rate limiting feature in Laravel.
You can read more about it here: https://laravel.com/docs/8.x/queues#rate-limiting
In your job class, add the following code inside the handle method. Make sure the key is a unique value; here is a sample.
Basically, it limits the worker to processing one job per user at a given time (see the $uuid key); the next job will be processed 4 seconds later.
$uuid = 'process_action_'.$this->user_id;

Redis::funnel($uuid)->limit(1)->then(function () {
    // job logic
}, function () {
    // could not get the lock; reattempt the job after 4 seconds
    return $this->release(4);
});
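As an aside, the $uniqueFor / uniqueId() pair shown in the question only takes effect when the job also implements the ShouldBeUnique contract (Laravel 8+), and the WithoutOverlapping middleware needs a cache driver that supports atomic locks. A minimal sketch of the unique-job declaration:
use Illuminate\Contracts\Queue\ShouldBeUnique;

class ProcessActions implements ShouldQueue, ShouldBeUnique
{
    // $uniqueFor and uniqueId() are only honoured once ShouldBeUnique is implemented.
}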

Scheduled Command to update records

Morning all,
I am trying to create a command that I can schedule to check whether a certification date has expired and, if it has, update a boolean from 0 to 1. I have never used commands before, and I have read the OctoberCMS documentation, but I found it confusing.
If anyone could help me, that would be perfect.
Here is what I have so far.
<?php

namespace Bitpixlimited\Concert\Console;

use Illuminate\Console\Command;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputArgument;
use BitpixLimited\ConCert\Models\Certification;
use Carbon\Carbon;

/**
 * CheckExpiredCertifications Command
 */
class CheckExpiredCertifications extends Command
{
    /**
     * @var string name is the console command name
     */
    protected $name = 'concert:checkexpiredcertifications';

    /**
     * @var string description is the console command description
     */
    protected $description = 'No description provided yet...';

    /**
     * handle executes the console command
     */
    public function handle()
    {
        $certifications = Certification::all();
        $date = now();
        $expiredValue = '1';
        foreach ($certifications as $certification) {
            if ($certification->expiry_date < $date) {
                $certification->status = $expiredValue;
            }
            $certification->save();
        }
    }

    /**
     * getArguments get the console command arguments
     */
    protected function getArguments()
    {
        return [];
    }

    /**
     * getOptions get the console command options
     */
    protected function getOptions()
    {
        return [];
    }
}
Take a look at this code:
public function handle()
{
    Certification::query()
        ->where('expiry_date', '<', now())
        ->update(['status' => '1']);
}
It does what you are trying to achieve, it's a simplified version of your code, and it is more performant:
- We don't actually retrieve the records, we update them directly.
- We update all records that have an expiry_date before now().
- All these records now have their status set to 1.
Since we don't store the records in memory and we don't need to build the Collection, it's far better performance-wise.
The drawback is that you lose model events (if you declared any) and mutators, but I assume that's not the case here.
If you need access to all the model mutators, methods, and events (now or in the future), then use the following code:
public function handle()
{
    Certification::query()
        ->where('expiry_date', '<', now())
        ->each(function (Certification $certification) {
            $certification->status = '1';
            $certification->save();
        });
}
The main difference is that we actually retrieve the records and build all the Certification instances. It gives you more capabilities, but performance will take a hit.
There are more optimized ways to do this, especially if you have a large number of rows; one such approach is sketched below.
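For reference, one such optimisation is switching to chunkById(), which pages through the table by primary key instead of offsets and therefore stays fast on large tables; a sketch, otherwise equivalent to the each() version above:
public function handle()
{
    Certification::query()
        ->where('expiry_date', '<', now())
        ->chunkById(200, function ($certifications) {
            foreach ($certifications as $certification) {
                $certification->status = '1';
                $certification->save();
            }
        });
}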
You should run this command in your scheduler at the frequency you wish, for instance every minute:
protected function schedule(Schedule $schedule)
{
    $schedule->command('concert:checkexpiredcertifications')->everyMinute();
}
Every minute, this will update all records whose expiry_date is in the past and set their status to '1'.
Of course, you must have a working scheduler for this, but that's a little bit off topic here (docs: https://laravel.com/docs/8.x/scheduling#running-the-scheduler).
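Typically that just means one crontab entry on the server, exactly as in the docs (the path below is a placeholder):
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1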

Laravel Nested Jobs

I created a job that has a foreach loop that dispatches another job. Is there a way to fire an event when all the nested jobs are completed?
When triggered, here is what happens:
Step 1. First I trigger the batch job:
GenerateBatchReports::dispatch($orderable);
Step 2. We then run a loop and queue the other jobs:
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $dir = storage_path('reports/tmp/'.str_slug($this->event->company).'-event');

    if (file_exists($dir)) {
        File::deleteDirectory($dir);
    }

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatch($model);
    }
}
I just need to know when all the nested jobs are done so I can zip the reports up and email them to a user. When the batch job is done queueing all the nested jobs, it is removed from the list. Is there a way to keep the job around until the nested jobs are done, and then fire an event?
Any help would be appreciated.
Update: Laravel 8 (release planned for 8th September 2020) will provide job batching. This feature is already documented, is probably perfect for the nested-jobs scenario, and looks like this:
$batch = Bus::batch([
    new ProcessPodcast(Podcast::find(1)),
    new ProcessPodcast(Podcast::find(2)),
    new ProcessPodcast(Podcast::find(3)),
    new ProcessPodcast(Podcast::find(4)),
    new ProcessPodcast(Podcast::find(5)),
])->then(function (Batch $batch) {
    // All jobs completed successfully...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();
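Note that batching relies on a database table to track batch progress; in Laravel 8 it is created with the following commands (assuming a standard setup):
php artisan queue:batches-table
php artisan migrate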
We will also be able to add additional batched jobs on the fly:
$this->batch()->add(Collection::times(1000, function () {
    return new ImportContacts;
}));
Original answer 👇
I came up with a different solution, because my queue runs with several worker processes. So, for me:
- No dispatchNow, because I want to keep jobs running in parallel.
- With several processes, I need to make sure no nested job is still running when the final one starts, and simple chaining alone doesn't guarantee that.
So my not-so-elegant solution that meets those requirements is to dispatch all the nested jobs and, in the last one, dispatch the final job with a delay of a few seconds, to make sure any other nested jobs still running in parallel have finished.
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $last_participant_id = $this->event->participants->last()->id;

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        $is_last = $participant->id === $last_participant_id;
        GenerateSingleReport::dispatch($model, $is_last);
    }
}
and in GenerateSingleReport.php
class GenerateSingleReport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $model;
    protected $runFinalJob;

    public function __construct($model, $run_final_job = false)
    {
        $this->model = $model;
        $this->runFinalJob = $run_final_job;
    }

    public function handle()
    {
        // job normal stuff…

        if ($this->runFinalJob) {
            // Give sibling jobs still running on other workers time to finish.
            FinalJob::dispatch()->delay(30);
        }
    }
}
Alternatively
I'm throwing out another idea, so the code is not flawless. A wrapper job could be created that is dedicated to running the last nested job chained with the final job.
/**
 * Execute the job.
 *
 * @return void
 */
public function handle()
{
    $last_participant_id = $this->event->participants->last()->id;

    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        $is_last = $participant->id === $last_participant_id;

        if ($is_last) {
            ChainWithDelay::dispatch(
                new GenerateSingleReport($model), // last nested job
                new FinalJob(),                   // final job
                30                                // delay
            );
        } else {
            GenerateSingleReport::dispatch($model, $is_last);
        }
    }
}
And in ChainWithDelay.php
class ChainWithDelay implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $job;
    protected $finalJob;
    protected $delay;

    public function __construct($job, $final_job, $delay = 0)
    {
        $this->job = $job;
        $this->finalJob = $final_job;
        $this->delay = $delay;
    }

    public function handle()
    {
        // Run the last nested job inline, chained with the delayed final job.
        $this->job
            ->withChain([$this->finalJob->delay($this->delay)])
            ->dispatchNow();
    }
}
For Laravel >= 5.7
You can use the dispatchNow method, which keeps the parent job alive while the child jobs are processed:
https://laravel.com/docs/5.8/queues#synchronous-dispatching
Parent job:
public function handle()
{
    // ...
    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatchNow($model);
    }
    // then do something else...
}
For Laravel 5.2 - 5.6
You could use the sync connection:
https://laravel.com/docs/5.5/queues#customizing-the-queue-and-connection
Make sure the connection is defined in your config/queue.php:
https://github.com/laravel/laravel/blob/5.5/config/queue.php#L31
Parent job (NOTE: This syntax is for 5.5. The docs are a little different for 5.2):
public function handle()
{
    // ...
    foreach ($this->event->participants as $participant) {
        $model = $participant->exercise;
        GenerateSingleReport::dispatch($model)->onConnection('sync');
    }
    // then do something else...
}
You could use Laravel's job chaining. It allows you to run a bunch of jobs in sequence and if one fails, the rest in the chain will not be run.
The basic syntax looks like this:
FirstJob::withChain([
    new SecondJob($param),
    new ThirdJob($param)
])->dispatch($param_for_first_job);
In your case you could add all of your GenerateSingleReport jobs to an array except the first one, then add the final job that you want to run to the end of the array. You can then pass that array to the withChain method on the first job.
$jobs = [];
$first_job = null;
$first_parameter = null;

foreach ($this->event->participants as $participant) {
    $model = $participant->exercise;
    if (empty($first_job)) {
        // Remember the first job's class and parameter; it anchors the chain.
        $first_job = GenerateSingleReport::class;
        $first_parameter = $model;
    } else {
        $jobs[] = new GenerateSingleReport($model);
    }
}

$jobs[] = new FinalJob();

$first_job::withChain($jobs)->dispatch($first_parameter);

How to Make a Scheduler Task for Every New Record After 20 Minutes in Laravel?

I have a parking system where I use Angular 6 with Laravel for the backend, and I have a specific problem that I don't know the right approach for.
The park has two totems that send an entry to my server. I only check whether the client's ticket is still valid when they leave through the exit totem or go to the payment area.
This is the code that runs when a ticket's EAN-13 barcode is read by the exit totem:
public function getEntrysDataByEan(Request $request)
{
    if (isset($request)) {
        $entryean = $request->input('entryean13');
        $entry = $this->mdMovEntry->getEntrysDataByEan($entryean);
        if (empty($entry)) {
            $response["success"] = 0;
            $response["message"] = "Não existe nenhuma entrada correspondente";
        } else {
            $nowHour = Carbon::now();
            $enterHour = Carbon::parse($entry[0]->updated_at);
            $difmin = $enterHour->diffInMinutes($nowHour);
            $dif = $enterHour->diffInHours($nowHour);
            if ($difmin <= 20) {
                $this->mdMovEntry->validatedEntryByEan($entryean, Carbon::parse($entry[0]->updated_at), $nowHour);
                $entry[0]->validated = 'S';
            } else {
                $this->mdMovEntry->devaluedEntryByEan($entryean, Carbon::parse($entry[0]->updated_at));
                $entry[0]->validated = 'N';
            }
            $response["success"] = 1;
            $response["message"] = "Entrada retornada com sucesso";
            $response["entry"] = $entry;
        }
    } else {
        $response["success"] = 0;
        $response["message"] = "Nenhum dado enviado";
    }
    return $response;
}
The problem is that this feels like a lot of processing just to decide whether the client can leave or not, so I looked into Laravel's task scheduling and job approaches and didn't find anything that fits my problem.
What I want is: when the entry totem inserts a record into my database, some job or task marks that record as invalidated 20 minutes later.
Then, when the client leaves, the system just fetches the entry from the database and checks whether it is validated or not.
How can I achieve this? I ask because every task example I found is a global execution, but I want a task per new record. Is that possible?
Consider using model events in conjunction with delayed dispatching of queued jobs.
The following listens for the created event fired when new entries are inserted into the database and dispatches a job, delayed by 20 minutes, that later invalidates the entry.
InvalidateEntry Job
class InvalidateEntry implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $entry;

    /**
     * Create a new job instance.
     *
     * @param Entry $entry
     * @return void
     */
    public function __construct(Entry $entry)
    {
        $this->entry = $entry;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        // you'll probably need to add some conditional logic
        // before updating the entry to determine if the
        // action is still required.
        $isEntryInvalid = ...;

        if ($isEntryInvalid) {
            $this->entry->update([
                'validated' => 'N'
            ]);
        } else {
            // delete the job from the queue
            $this->delete();
        }
    }
}
Entry Model
class Entry extends Model
{
    public static function boot()
    {
        parent::boot();

        static::created(function (Entry $entry) {
            InvalidateEntry::dispatch($entry)->delay(now()->addMinutes(20));
        });
    }
}
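One caveat worth adding: the 20-minute delay only works on a real queue connection with a worker running; the sync driver executes jobs immediately and ignores delay(), and Amazon SQS caps delays at 15 minutes, so a database or redis queue is the safer choice here. A rough setup sketch for the database driver (commands as in a standard Laravel install):
php artisan queue:table
php artisan migrate
php artisan queue:work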

Cache Eloquent query for response

In one of my applications I have a property that is needed throughout the app.
Multiple different parts of the application need access to it: requests, local and global scopes, but also commands.
I would like to "cache" this property for the duration of a request.
My current solution in my Game class looks like this:
/**
 * Get current game set in the .env file.
 * @return Game
 */
public static function current()
{
    return Cache::remember('current_game', 1, function () {
        static $game = null;
        $id = config('app.current_game_id');
        if ($game === null || $game->id !== $id) {
            $game = Game::find($id);
        }
        return $game;
    });
}
I can successfully call this using Game::current(), but this solution feels "hacky" and it stays cached across multiple requests.
I tried placing a property on the current request object, but that won't be usable for the commands and seems inaccessible in the Blade views and the other objects (without passing the $request variable around).
Another example of its usage is described below:
class Job extends Model
{
    /**
     * The "booting" method of the model.
     *
     * @return void
     */
    protected static function boot()
    {
        parent::boot();

        static::addGlobalScope('game_scope', function (Builder $builder) {
            $builder->whereHas('post', function ($query) {
                $query->where('game_id', Game::current()->id);
            });
        });
    }
}
I do not believe I could easily access a request property in this boot method.
Another idea of mine was to store the variable on a Game facade, but I failed to find any documentation on that practice.
Could you help me find a method of "caching" the Game::current() property that is accessible in most if not all of these cases, without using a "hacky" approach?
Use the global session helper like this:
// Retrieve a piece of data from the session...
$value = session('key');
// Store a piece of data in the session...
session(['key' => 'value']);
For configuration info and more options: https://laravel.com/docs/5.7/session
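Applied to the Game example from the question, that would look roughly like this (a sketch only; keep in mind that session data persists across requests for the same user and is not available in artisan commands, so this covers the web side of the problem):
public static function current()
{
    // Hypothetical session-backed variant of Game::current().
    if (! session()->has('current_game')) {
        session(['current_game' => Game::find(config('app.current_game_id'))]);
    }

    return session('current_game');
}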
