Foreach and sleep function to broadcast - Laravel

The main objective is to have a system running permanently on the server that broadcasts music.
I would like to create an artisan console command that automatically launches a track from its parent game every 32 seconds. Unfortunately, the sleep() function blocks the parent loop.
Two questions (or maybe three):
What would be the solution to prevent the parent loop from stopping?
Does this method consume too many resources? Is there a better way?
public function fetch($games) {
    foreach ($games as $game) {
        $tracks = Track::inRandomOrder()->where('game_id', $game->id)->limit($game->tracks_number)->get();
        foreach ($tracks as $key => $track) {
            broadcast(new NewTrack($track));
            if ($key + 1 == count($tracks)) {
                $this->fetch($this->games);
            }
            sleep(32);
        }
    }
}
Many thanks for your feedback
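A minimal sketch of one non-blocking alternative, assuming a queue connection (database, Redis, etc.) is configured; the NewTrackJob class is a hypothetical name, not part of the question. Instead of sleeping inside the loop, each broadcast is dispatched as a delayed queued job, so the command returns immediately and the queue worker handles the timing:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\SerializesModels;

// Hypothetical job: broadcasts a single track when the worker runs it.
class NewTrackJob implements ShouldQueue
{
    use Dispatchable, Queueable, SerializesModels;

    protected $track;

    public function __construct(Track $track)
    {
        $this->track = $track;
    }

    public function handle()
    {
        broadcast(new NewTrack($this->track));
    }
}

// In the console command: queue all tracks up front with growing delays.
public function fetch($games)
{
    $offset = 0;
    foreach ($games as $game) {
        $tracks = Track::inRandomOrder()
            ->where('game_id', $game->id)
            ->limit($game->tracks_number)
            ->get();
        foreach ($tracks as $track) {
            NewTrackJob::dispatch($track)->delay(now()->addSeconds($offset));
            $offset += 32;
        }
    }
}

The scheduler can re-run the command once the last delay has elapsed, and no process stays alive just to sleep.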

Related

Save User Activity in JSON file

I am trying to save user activities in a JSON file, but whenever the file gets bigger and multiple users are working at the same time, the JSON file loses its old records.
This is my trait:
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Storage;

trait CustomLogActivity
{
    protected static function bootCustomLogActivity()
    {
        // Register a listener for each model event we care about.
        foreach (static::getModelEvents() as $event) {
            static::$event(function ($model) use ($event) {
                $model->recordActivity($event);
            });
        }
    }

    protected static function getModelEvents()
    {
        return ['created', 'updated', 'deleted'];
    }

    protected function recordActivity($event)
    {
        $activity = [
            'user_id' => Auth::id(),
            'type' => $event,
            'subject' => (new \ReflectionClass($this))->getShortName(),
            'timestamp' => now(),
        ];

        if ($event === 'updated') {
            $activity['old_properties'] = $this->getOriginal();
            $activity['new_properties'] = $this->getAttributes();
        } else {
            $activity['properties'] = $this->getAttributes();
        }

        $this->appendToLog($activity);
    }

    protected function appendToLog($activity)
    {
        $logFile = 'activity.json';
        $log = json_encode($activity);
        Storage::append($logFile, $log);
    }

    protected function getActivityType($event)
    {
        $type = strtolower((new \ReflectionClass($this))->getShortName());
        return "{$event}_{$type}";
    }
}
As I mentioned in the comments, I will post this as an answer so it is useful for anyone running into this type of issue:
The problem you are having is called concurrency.
I am assuming two processes use the file at the same time: both read the current content, then one of them writes. The other process still has the old content in memory (it never sees the newly written data), so when it writes, it overwrites the file...
First of all, use a queue (events) to send the data, and store it in Redis, a database, or something similarly fast, but not literally a file; a file can be lost instantly, a database cannot...
You can still use a file, but I would not recommend it, because it depends a lot on your infrastructure:
If you have a load balancer with 10 machines, are you going to have 10 different files (one per machine)?
How do you combine them?
So what I would do is have a queue (triggered by an event) and let that queue, with a single worker, handle this very specific task. But keep the speed in mind: if events arrive in the queue faster than the single worker can resolve them, you will have to find a solution for that.
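A minimal sketch of that suggestion (the event, listener, and table names are hypothetical, not from the original answer): the trait fires an event, and a queued listener persists the activity to a database table; running exactly one worker serializes the writes, which removes the race.

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\DB;

// Hypothetical event carrying one activity record.
class UserActivityRecorded
{
    public $activity;

    public function __construct(array $activity)
    {
        $this->activity = $activity;
    }
}

// Implementing ShouldQueue makes Laravel push this listener onto the queue;
// with a single worker, writes happen one at a time.
class StoreUserActivity implements ShouldQueue
{
    public function handle(UserActivityRecorded $event)
    {
        DB::table('user_activities')->insert([
            'user_id'    => $event->activity['user_id'],
            'type'       => $event->activity['type'],
            'payload'    => json_encode($event->activity),
            'created_at' => now(),
        ]);
    }
}

In the trait, appendToLog() would then become event(new UserActivityRecorded($activity));, and the worker runs with php artisan queue:work.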

How to best implement a Promise semaphore?

I use a semaphore for two processes that share a resource (a REST API endpoint) that can't be called concurrently. I do:
let tokenSemaphore = null;

class restApi {
    async getAccessToken() {
        let tokenResolve;
        if (tokenSemaphore) {
            await tokenSemaphore;
        }
        tokenSemaphore = new Promise((resolve) => tokenResolve = resolve);
        return new Promise(async (resolve, reject) => {
            // ...
            resolve(accessToken);
            tokenResolve();
            tokenSemaphore = null;
        });
    }
}
But this looks too complicated. Is there a simpler way to achieve the same thing?
And how would one do it for more concurrent processes?
Note: this is not a server-side semaphore. Locking processes that run independently in different threads requires interprocess communication; in that case, the API must support something like that on the server side, and what follows is not for you.
As this was the first hit when googling for "JavaScript Promise Semaphore", here is what I came up with:
function Semaphore(max, fn, ...a1)
{
    let run = 0;          // number of currently running invocations
    const waits = [];     // resolvers of queued invocations

    // Start the next queued invocation if a slot is free; passes x through.
    function next(x)
    {
        if (run < max && waits.length)
            waits.shift()(++run);
        return x;
    }

    // Each call waits in the queue, runs fn when released,
    // then frees its slot and wakes the next waiter.
    return (...a2) => next(
        new Promise(ok => waits.push(ok))
            .then(() => fn(...a1, ...a2))
            .finally(_ => run--)
            .finally(next)
    );
}
Example use (the above is (nearly) copied from my code; the following was typed in directly and hence is not tested):
// do not execute more than 20 fetches in parallel:
const fetch20 = Semaphore(20, fetch);

async function retry(...a)
{
    for (let retries = 0;; retries++)
    {
        if (retries)
            await new Promise(ok => setTimeout(ok, 100 * retries));
        try {
            return await fetch20(...a);
        } catch (e) {
            console.log(`retry ${retries}`, a[0], e);   // a[0] is the URL passed to fetch
        }
    }
}
and then
for (let i=0; ++i<10000000; ) retry(`https://example.com/?${i}`);
My browser handles thousands of asynchronous parallel calls to retry very well. However, when using fetch directly, the tab crashes almost instantly.
For your usage you probably need something like:
async function access_token_api_call()
{
    // assume this takes 10s and must not be called in parallel for setting the Cookie
    return fetch('https://api.example.com/nonce').then(r => r.json());
}

const get_access_token = Semaphore(1, access_token_api_call);

// both processes need to use the same(!) Semaphore, of course
async function process(...args)
{
    const token = await get_access_token();
    // processing args here
    return; // something
}

proc1 = process(1);
proc2 = process(2);
Promise.all([proc1, proc2]).then( /* etc. */ );
YMMV.
Notes:
This assumes that your two processes are just asynchronous functions of the same single JS script (i.e. running in the same tab).
A browser usually does not open more than 5 concurrent connections to a backend and pipelines excess requests. fetch20 is my workaround for a real-world problem where a JS frontend needs to queue, say, 5000 fetches in parallel, which crashes my browser (for an unknown reason). It is 2021, and that should not be a problem, right?
But this looks too complicated.
Not complicated enough, I'm afraid. Currently, if multiple code paths call getAccessToken when the semaphore is taken, they'll all block on the same tokenSemaphore instance, and when the semaphore is released, they'll all be released and resolve roughly at the same time, allowing concurrent access to the API.
In order to write an asynchronous lock (or semaphore), you'll need a collection of futures (tokenResolvers). When one is released, it should only remove and resolve a single future from that collection.
I played around with it a bit in TypeScript a few years ago, but never tested or used the code. My Gist is also C#-ish (using "disposables" and whatnot); it needs some updating to use more natural JS patterns.
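For illustration (my own sketch, not the Gist mentioned above), an asynchronous lock along those lines keeps a queue of resolver functions and, on release, wakes exactly one waiter:

// Async mutex: acquire() resolves for one caller at a time;
// release() hands the lock to exactly one queued waiter.
class AsyncLock {
    constructor() {
        this.locked = false;
        this.waiters = []; // queue of resolve functions (the "futures")
    }

    acquire() {
        if (!this.locked) {
            this.locked = true;
            return Promise.resolve();
        }
        return new Promise(resolve => this.waiters.push(resolve));
    }

    release() {
        const next = this.waiters.shift();
        if (next)
            next();              // pass the lock directly to one waiter
        else
            this.locked = false; // nobody waiting: free the lock
    }
}

getAccessToken() would then wrap the API call in await lock.acquire(); try { ... } finally { lock.release(); }, so exactly one caller holds the lock at a time.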

Laravel daily email send to multiple users with schedule

I need to send an email to multiple users every day.
My code is like this.
It works, but there is something I don't understand.
foreach ($advisors as $advisor) {
    $receivers = [];
    foreach ($advisor->clients as $client) {
        array_push($receivers, $client);
    }
    array_push($receivers, $advisor);

    if (count($receivers) > 0) {
        Notification::send($receivers, new DailyEmail($advisor));
    }
}
Before, I coded it like below.
foreach ($advisors as $advisor) {
    $receivers = [];
    foreach ($advisor->clients as $client) {
        array_push($receivers, $client);
    }
    if (count($receivers) > 0) {
        Notification::send($receivers, new DailyEmail($advisor));
    }
    Notification::send($advisor, new DailyEmail($advisor));
}
But when I coded it like that, only one user got the email.
I can't understand why these behave differently.
Please explain if you can.
The "old" code was firing the Notification::send event twice, once for the receivers and once for the advisor.
Your "new" code only fires it once for the receivers, thus the advisor is not getting an email notification.
Now i may be understanding your code wrong for lack of more information, but if you want to send the notification to the $advisor->clients you dont need to loop them over and make a new array, in fact Notification::send expects a collection
Just do:
foreach ($advisors as $advisor) {
    if (count($advisor->clients) > 0) {
        Notification::send($advisor->clients, new DailyEmail($advisor));
    }
}
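Note that this snippet drops the advisor from the recipients. If the advisor should keep getting the daily email as well, one option (assuming a Laravel version where Collection::concat() is available) is:

foreach ($advisors as $advisor) {
    // concat() returns a new collection: the clients plus the advisor
    $receivers = $advisor->clients->concat([$advisor]);

    if ($receivers->isNotEmpty()) {
        Notification::send($receivers, new DailyEmail($advisor));
    }
}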

Sequelize correctly executing multiple creates + updates

I have a cron job that scrapes a list of items on a website and then inserts or updates records in a database. When I scrape the page, I want to create records for new ones that haven't been created yet, otherwise update any existing ones. Currently I'm doing something like this:
// pretend there is a "Widget" model defined
function createOrUpdateWidget(widgetConfig) {
    return Widget.find(widgetConfig.id)
        .then(function(widget) {
            if (widget === null) {
                return Widget.create(widgetConfig);
            }
            else {
                // return the update promise so callers wait for it to finish
                return widget.updateAttributes(widgetConfig);
            }
        });
}

function createOrUpdateWidgets(widgetConfigObjects) {
    var promises = [];
    widgetConfigObjects.forEach(function(widgetConfig) {
        promises.push(createOrUpdateWidget(widgetConfig));
    });
    return Sequelize.Promise.all(promises);
}

createOrUpdateWidgets([...])
    .done(function() {
        console.log('Done!');
    });
This seems to work fine, but I'm not sure if I'm doing this "correctly" or not. Do all promises that perform DB interactions need to run serially, or is how I have them defined ok? Is there a better way to do this kind of thing?
What you're doing is pretty idiomatic and perfectly fine. The only room for improvement is to utilize the fact that Sequelize uses Bluebird for promises, so you get .map for free, which lets you convert:
function createOrUpdateWidgets(widgetConfigObjects) {
    var promises = [];
    widgetConfigObjects.forEach(function(widgetConfig) {
        promises.push(createOrUpdateWidget(widgetConfig));
    });
    return Sequelize.Promise.all(promises);
}
Into:
function createOrUpdateWidgets(widgetConfigObjects) {
    return Sequelize.Promise.map(widgetConfigObjects, createOrUpdateWidget);
}
Other than that minor improvement, you're chaining promises correctly and seem to have gotten the hang of it.
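Depending on your Sequelize version and SQL dialect (an assumption on my part, not something the answer above relies on), Model.upsert can collapse the find-then-create-or-update round trip into a single call:

// Sketch: let the database decide between INSERT and UPDATE.
function createOrUpdateWidgets(widgetConfigObjects) {
    return Sequelize.Promise.map(widgetConfigObjects, function(widgetConfig) {
        return Widget.upsert(widgetConfig);
    });
}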

How to check if player has already been selected using CodeIgniter Callback

I wonder if anyone can help me out. I am trying to ensure that when a manager selects his players in his team to confirm a football result, he can only choose a player once.
So my validation callback starts here:
$this->form_validation->set_rules('P1', 'The Home Team cannot play with less than 7 players', 'trim|required|callback_player1_check');
I then have this callback function:
function callback_player1_check()
{
    if ($this->fixtures_model->callback_player1_check() == TRUE)
    {
        $this->form_validation->set_message('P1', 'Player already selected');
        return FALSE;
    }
    else
    {
        return TRUE;
    }
}
This callback function then links to this model function:
function callback_player1_check() {
    $player_id1 = $this->input->post('P1');
    $player_id2 = $this->input->post('P2');
    if ($player_id1 == $player_id2)
    {
        return TRUE;
    }
}
So all I'm trying to do at the moment is check whether Player 1 (P1) and Player 2 (P2) are the same player, which isn't working. If I can sort this out, I then need to check all players against each other to ensure each player is only selected once.
Any help would be great! Thanks a lot.
You shouldn't prefix the callback function itself with the word callback_. The prefix belongs only in the rule string; CodeIgniter strips callback_ from the rule and invokes the remaining method name, so a method actually named callback_player1_check() is never called.
function player1_check() {
    $player_id1 = $this->input->post('P1');
    // etc
should do fine.
Edit: see the documentation about callbacks with regard to their naming convention.
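To then validate every player against every other, one approach (a sketch; it assumes seven select fields named P1 through P7, matching the seven-player minimum in the question) is a single callback that collects all submitted IDs and rejects duplicates. Note that set_message is keyed by the rule name, not the field name:

function players_unique_check()
{
    // Collect all submitted player IDs.
    $players = array();
    for ($i = 1; $i <= 7; $i++) {
        $players[] = $this->input->post('P' . $i);
    }
    $players = array_filter($players); // ignore empty selections

    // Reject the form if any ID appears more than once.
    if (count($players) !== count(array_unique($players))) {
        $this->form_validation->set_message('players_unique_check', 'Each player may only be selected once.');
        return FALSE;
    }
    return TRUE;
}

Attach it to a single field with trim|required|callback_players_unique_check.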
