How to check if queue is working - laravel-5

I need to start a game every 30 seconds. A cron job's minimum interval is one minute, so I use a queue delay to do it instead.
app/Jobs/StartGame.php
public function handle()
{
    // Start a new issue
    $this->gameService->gameStart();
    // Start the next issue after 30 seconds
    $job = (new \App\Jobs\StartGame)->onQueue('start-game')->delay(30);
    dispatch($job);
}
And I start the first game from the console:
app/Console/Commands/StartGame.php
public function handle()
{
    $job = (new \App\Jobs\StartGame)->onQueue('start-game');
    dispatch($job);
}
The question is: I want to use a cron job to check whether the start-game queue is still running and, if not, dispatch the job again (for example after the server has been stopped for maintenance). Is that possible?

You can write a log of the "start date" to a file, for example data_start.log:
1- Every time gameStart() is called, the date stored in the file is read and checked to be between 29 and 31 seconds old.
2- The content of the file is updated.
Example of the write:
<?php
$file = "data_start.log";
$f = fopen($file, 'w');
fwrite($f, date('Y-m-d H:i:s') . "\n");
fclose($f);
?>
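Based on that, a minimal sketch of the cron-side check; the command name, file location, and 60-second threshold are assumptions, not from the question:
<?php
// app/Console/Commands/EnsureGameRunning.php (hypothetical command)
namespace App\Console\Commands;

use Illuminate\Console\Command;

class EnsureGameRunning extends Command
{
    protected $signature = 'game:ensure-running';
    protected $description = 'Re-dispatch the start-game job if the last start looks stale';

    public function handle()
    {
        $file = storage_path('data_start.log'); // wherever the timestamp above is written
        $lastStart = is_file($file) ? strtotime(trim(file_get_contents($file))) : 0;

        // No game started within the last 60 seconds: the delayed-job chain has probably died.
        if (time() - $lastStart > 60) {
            dispatch((new \App\Jobs\StartGame)->onQueue('start-game'));
            $this->info('start-game chain re-dispatched');
        }
    }
}
Scheduled with ->everyMinute() (or called from cron directly), this restarts the chain after a maintenance stop without touching a chain that is still alive.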

Related

Save User Activity in json file

I am trying to save user activities in a JSON file, but whenever the file gets bigger and multiple users are working at the same time, the JSON file loses the old records.
This is my trait:
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Storage;

trait CustomLogActivity
{
    protected static function bootCustomLogActivity()
    {
        foreach (static::getModelEvents() as $event) {
            static::$event(function ($model) use ($event) {
                $model->recordActivity($event);
            });
        }
    }

    protected static function getModelEvents()
    {
        return ['created', 'updated', 'deleted'];
    }

    protected function recordActivity($event)
    {
        $activity = [
            'user_id' => Auth::id(),
            'type' => $event,
            'subject' => (new \ReflectionClass($this))->getShortName(),
            'timestamp' => now()
        ];
        if ($event === 'updated') {
            $activity['old_properties'] = $this->getOriginal();
            $activity['new_properties'] = $this->getAttributes();
        } else {
            $activity['properties'] = $this->getAttributes();
        }
        $this->appendToLog($activity);
    }

    protected function appendToLog($activity)
    {
        $logFile = 'activity.json';
        $log = json_encode($activity);
        Storage::append($logFile, $log);
    }

    protected function getActivityType($event)
    {
        $type = strtolower((new \ReflectionClass($this))->getShortName());
        return "{$event}_{$type}";
    }
}
As I mentioned in the comments, I will post this as an answer so it is clear for anyone running into this type of issue:
The problem you are having is called concurrency.
I am assuming two processes use the file at the same time: both read the current content, then one of them writes; the other process still has the old data in memory (it never sees the content just written), so when it writes it overwrites the file...
First of all, use a queue (events) to send the data, and store it in Redis, a database, or something else that is very fast for this, but not literally a file; a file can be lost in an instant, a database far less easily...
You could still use a file, but I would not recommend it, because it depends a lot on your infrastructure:
If you have a load balancer with 10 machines, are you going to have 10 different files (one per machine)?
How do you combine them?
So what I would do is have a queue (fed by an event) and let a single worker handle this very specific task. Keep the throughput in mind, though: if events arrive in the queue faster than the single worker can resolve them, you will have to find a solution for that.
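A minimal sketch of that queue-plus-single-worker idea, reusing the $activity array built in the trait; the job class and queue name are illustrative, not from the question:
<?php
// app/Jobs/AppendActivityLog.php (hypothetical job)
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Storage;

class AppendActivityLog implements ShouldQueue
{
    use Dispatchable, Queueable;

    protected $activity;

    public function __construct(array $activity)
    {
        $this->activity = $activity;
    }

    public function handle()
    {
        // A single worker consumes this queue, so appends never interleave.
        Storage::append('activity.json', json_encode($this->activity));
    }
}
The trait's appendToLog() would then dispatch the job instead of writing directly, e.g. AppendActivityLog::dispatch($activity)->onQueue('activity-log'), and exactly one worker consumes that queue: php artisan queue:work --queue=activity-log.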

How to call $this->emitSelf() multiple times on function

I have the following example function:
public function backupService()
{
    $job = Job::find($this->job_id);
    sleep(5);
    $job->status = 'in_progress';
    $job->update();
    $this->emitSelf('refreshComponent');
    sleep(10);
    $job->status = 'finished';
    $job->update();
    $this->emitSelf('refreshComponent');
}
When I change the status to 'in_progress' it changes in my database but doesn't update the component. Apparently $this->emitSelf() only takes effect when the backupService() function finishes, i.e. the status will never appear as 'in_progress', only as 'finished'.
I don't want to use the wire:poll directive because I don't want to keep updating the page all the time, only when I specifically call it. How can I resolve this?
The event will only be emitted once the entire backupService() method has finished executing, when the response from that method is sent back to the browser. Livewire events are sent to the browser with the response, and any components listening for those events then trigger actions on the client, making secondary requests.
This means that the refresh event you emit will only fire after everything is completed.
If you don't want to use polling, another alternative is websockets. That can also be a bit much for such a simple task, so a third option is to restructure your method into two methods: one that starts the process, with events driving the rest from there. Something like this, where the first method is only responsible for setting the new status and emitting the event that starts the job, and the second method is responsible for the execution.
protected $listeners = [
    'refreshComponent' => '$refresh',
    'runJob'
];

public function backupService()
{
    $job = Job::find($this->job_id);
    $job->status = 'in_progress';
    $job->update();
    $this->emitSelf('runJob', $job);
}

public function runJob(Job $job)
{
    sleep(10);
    $job->status = 'finished';
    $job->update();
    $this->emitSelf('refreshComponent');
}

Laravel Job Batching unable to cancel

I have a simple Laravel job batch. My problem is that when one of the jobs inside the batch fails and throws an exception, the batch doesn't stop or cancel, even though I call the cancel method; it still processes the next job.
These are my handle and failed methods:
public function handle()
{
    if ($this->batch()->cancelled()) {
        return;
    }
    $csv_data = array_map('str_getcsv', file($this->chunk_directory));
    foreach ($csv_data as $key => $row) {
        if (count($this->header) == count($row)) {
            $data = array_combine($this->header, $row);
        } else {
            $this->batch()->cancel();
            throw new \Exception("Your file doesn't match the number of headers like your product header");
        }
    }
}
public function failed(\Exception $e = null)
{
    broadcast(new QueueProcessing("failed", BatchHelpers::getBatch($this->batch()->id)));
}
Here is my command-line output:
[2021-01-11 01:17:57][637] Processing: App\Jobs\ImportItemFile
[2021-01-11 01:17:57][637] Failed: App\Jobs\ImportItemFile
[2021-01-11 01:17:58][638] Processing: App\Jobs\ImportItemFile
[2021-01-11 01:17:58][638] Processed: App\Jobs\ImportItemFile
From the Laravel 8 Queue documentation:
When a job within a batch fails, Laravel will automatically mark the batch as "cancelled"
So the default behavior is that the whole batch is marked as "cancelled" and stops executing (note that jobs that are already running will not be stopped).
In your case, if the batch execution keeps going, maybe you turned on the allowFailures() option when creating the batch?
By the way, you don't need to call the cancel() method. When an exception is thrown, the given job is already "failed" and the whole batch is cancelled.
Either remove the cancel() line, or return after the cancellation call (without throwing an exception); see Cancelling batches.
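For reference, a minimal sketch of dispatching such a batch so the default cancel-on-failure behaviour applies, i.e. without allowFailures(); the constructor arguments and variables passed to ImportItemFile are assumptions based on the question's properties:
use App\Jobs\ImportItemFile;
use Illuminate\Support\Facades\Bus;

$batch = Bus::batch([
        new ImportItemFile($chunkPath1, $header),
        new ImportItemFile($chunkPath2, $header),
    ])
    // no ->allowFailures() here: the first failing job marks the batch as cancelled
    ->catch(function ($batch, \Throwable $e) {
        // reached as soon as one job fails
    })
    ->dispatch();
With this setup the remaining queued jobs see $this->batch()->cancelled() as true and return early, which is exactly the guard at the top of your handle() method.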

Foreach and sleep function to broadcast

The main objective of this question is to have a system running permanently on the server to broadcast music.
I would like to create an artisan console command that automatically launches a track from its parent game every 32 seconds. Unfortunately the sleep() function blocks the parent loop.
Two questions (or maybe three):
What would be the solution to prevent the parent loop from stopping?
Does this method consume too many resources? Is there a better way?
public function fetch($games) {
    foreach ($games as $game) {
        $tracks = Track::inRandomOrder()->where('game_id', $game->id)->limit($game->tracks_number)->get();
        foreach ($tracks as $key => $track) {
            broadcast(new NewTrack($track));
            if ($key + 1 == count($tracks)) {
                $this->fetch($this->games);
            }
            sleep(32);
        }
    }
}
Many thanks for your feedback
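One possible direction, sketched under the assumption that a queue worker is running (as in the first question above) and that the Laravel version supports queued closures: dispatch each broadcast as a delayed queued job instead of sleeping in the loop, keeping the 32-second spacing from the code above:
public function fetch($games)
{
    $delay = 0;
    foreach ($games as $game) {
        $tracks = Track::inRandomOrder()
            ->where('game_id', $game->id)
            ->limit($game->tracks_number)
            ->get();

        foreach ($tracks as $track) {
            // Queue the broadcast instead of blocking the console process.
            dispatch(function () use ($track) {
                broadcast(new NewTrack($track));
            })->delay(now()->addSeconds($delay));

            $delay += 32;
        }
    }
}
The console command then only queues the jobs and returns immediately; the queue worker handles the timing, so nothing blocks the parent loop.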

Run an asynchronous PHP task using Symfony Process

For time-consuming tasks (email sending, image manipulation… you get the point), I want to run asynchronous PHP tasks.
It is quite easy on Linux, but I'm looking for a method that works on Windows too.
I want it to be simple, as it should be. No artillery, no SQL queueing, no installing stuff again and again… I just want to run a goddamn asynchronous task.
So I tried the Symfony Process Component.
The problem is that running the task synchronously works fine, but when running it asynchronously it exits along with the main script.
Is there a way to fix this?
composer require symfony/process
index.php
<?php
require './bootstrap.php';
$logFile = './log.txt';
file_put_contents($logFile, '');
append($logFile, 'script (A) : '.timestamp());
$process = new Process('php subscript.php');
$process->start(); // async, subscript exits prematurely…
//$process->run(); // sync, works fine
append($logFile, 'script (B) : '.timestamp());
subscript.php
<?php
require './bootstrap.php';
$logFile = './log.txt';
//ignore_user_abort(true); // doesn't solve issue…
append($logFile, 'subscript (A) : '.timestamp());
sleep(2);
append($logFile, 'subscript (B) : '.timestamp());
bootstrap.php
<?php
require './vendor/autoload.php';
class_alias('Symfony\Component\Process\Process', 'Process');
function append($file, $content) {
    file_put_contents($file, $content."\n", FILE_APPEND);
}
function timestamp() {
    list($usec, $sec) = explode(' ', microtime());
    return date('H:i:s', $sec) . ' ' . sprintf('%03d', floor($usec * 1000));
}
result
script (A) : 02:36:10 491
script (B) : 02:36:10 511
subscript (A) : 02:36:10 581
// subscript (B) is missing
The main script has to keep waiting until the async process has completed; otherwise PHP exits and the Process object stops the child with it. Try this code:
$process = new Process('php subscript.php');
$process->start();
do {
    $process->checkTimeout();
} while ($process->isRunning() && (sleep(1) !== false));
if (!$process->isSuccessful()) {
    throw new \Exception($process->getErrorOutput());
}
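As a side note, Symfony Process also has a blocking wait(), which keeps the parent script alive in a single call instead of the manual polling loop (at the cost of, again, giving up most of the asynchrony):
$process = new Process('php subscript.php');
$process->start();
// do other work here while subscript.php runs
$process->wait(); // blocks until the sub-process finishes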
If PHP-FPM is available on Windows, you can listen to the kernel.terminate event to run all expensive tasks after the response has been sent.
Service:
app.some_listener:
    class: SomeBundle\EventListener\SomeListener
    tags:
        - { name: kernel.event_listener, event: kernel.terminate, method: onKernelTerminate }
Listener:
<?php
namespace SomeBundle\EventListener;

use Symfony\Component\HttpKernel\Event\PostResponseEvent;

class SomeListener
{
    public function onKernelTerminate(PostResponseEvent $event)
    {
        // run time-consuming tasks here
    }
}
Not the best solution, but:
$process = new Process('nohup php subscript.php &');
$process->start();
