How to ping a server using Ajax in Laravel every 5 minutes?

I have an HTML table full of server IP addresses, and I want to ping each of them every 5 minutes to check whether the server is alive (and eventually highlight table rows depending on whether the server is dead or alive).
Currently I'm using Ajax with a 5 minute interval which calls a method in my controller:
var checkSims = function() {
    $.ajax({
        type: "GET",
        url: '/checkSimStatus',
        success: function(msg) {
            onlineSims = msg['online'];
            offlineSims = msg['offline'];
            console.log(onlineSims);
            console.log(offlineSims);
        },
        error: function() {
            console.log('false');
        }
    });
}
var interval = 1000 * 60 * 5; // every 5 minutes
setInterval(checkSims, interval);
However, this is not asynchronous: while the controller method is pinging the IPs, the webserver cannot serve other requests.
I've read about Laravel's queue system, but I'm not sure it would suit me, as I need one specific page to trigger the job, and I would need to use JS to highlight the table rows.
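For reference, a queued job could take the pinging off the request path: the job pings the hosts and caches the result, and the /checkSimStatus endpoint just dispatches the job and returns the cached data, so the existing JS polling keeps working. Below is a minimal sketch assuming a recent Laravel version; the Sim model, cache key and ping command are placeholders, not code from the question.

<?php

// Hypothetical sketch: a queued job pings the hosts in the background and
// caches the result, so the Ajax endpoint never blocks while pinging.
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Cache;

class CheckSimStatus implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        $online = [];
        $offline = [];

        // Sim is an assumed Eloquent model holding the IP addresses.
        foreach (Sim::pluck('ip_address') as $ip) {
            // One packet, one-second timeout; exit status 0 means the host replied.
            $output = [];
            exec('ping -c 1 -W 1 ' . escapeshellarg($ip), $output, $status);
            if ($status === 0) {
                $online[] = $ip;
            } else {
                $offline[] = $ip;
            }
        }

        // Cache for 5 minutes; the /checkSimStatus controller can dispatch this
        // job and immediately return Cache::get('sim_status').
        Cache::put('sim_status', ['online' => $online, 'offline' => $offline], now()->addMinutes(5));
    }
}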

#f7n if you have done it with Ajax, how will it work when the page with the HTML table of IP addresses is not open in a browser?
I think you should use a cron job on the server. If you use a Linux VPS (or similar), you can also write a simple bash shell script and run it as a daemon. For example, with the loop below you would create a PHP script that parses (grabs) the list of IP addresses and pings each server.
#!/bin/bash
echo "Press [CTRL+C] to stop.."
while true
do
    php parse_and_ping.php
    sleep 300
done
sleep 300 means it will run every 5 minutes. Just save it as a .sh file (run_shell.sh) and run it in a terminal or as a daemon on the Linux server.
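As a rough illustration, parse_and_ping.php could look something like the sketch below. It reads the IP addresses straight from the database rather than scraping the HTML page; the table name, columns, connection details and ping options are assumptions, not code from the question.

<?php
// parse_and_ping.php -- hypothetical sketch of the script the shell loop calls.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

$rows = $pdo->query('SELECT id, ip_address FROM sims')->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    // One packet, one-second timeout; exit status 0 means the host replied.
    $output = [];
    exec('ping -c 1 -W 1 ' . escapeshellarg($row['ip_address']), $output, $status);
    $online = ($status === 0) ? 1 : 0;

    $stmt = $pdo->prepare('UPDATE sims SET is_online = ? WHERE id = ?');
    $stmt->execute([$online, $row['id']]);
}

The page (or the Ajax endpoint from the question) then only reads the stored status instead of pinging anything itself.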

Related

Redis Throttling / Jobs MaxAttemptsExceededException

I need to make 2 calls to a 3rd-party API for about 350K products (EANs). The limit for this API is 1200 requests per hour. I currently use the following code to throttle the jobs according to the 3rd-party API limit:
Redis::throttle('bol_import_product_offers')
    ->allow(1)
    ->every(3)
    ->then(function () {
        $this->process();
    }, function () {
        $this->release(10);
    });
This works fine until, after some time, jobs start to fail with the exception:
MaxAttemptsExceededException: Job has been attempted too many times or
run too long. The job may have previously timed out.
I assume this has something to do with the timeout/wait/retry settings in the Horizon config file, but I can't seem to find the issue. This is my Horizon config file: https://gist.github.com/liamseys/40538aab2cf0425d83ca4e5feac4d2ff.
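One thing worth checking alongside the Horizon timeouts: every $this->release(10) counts as a new attempt, so a fixed attempt limit can be exhausted long before 350K products are processed. Laravel's time-based attempts let a throttled job keep being released until a deadline instead. A minimal sketch, assuming the job looks roughly like this (the class name and the 24-hour window are assumptions):

<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Redis;

class ImportProductOffers implements ShouldQueue
{
    use InteractsWithQueue, Queueable;

    // Time-based attempts: retry (and release) until this deadline instead of
    // failing after a fixed number of attempts.
    public function retryUntil()
    {
        return now()->addHours(24);
    }

    public function handle()
    {
        Redis::throttle('bol_import_product_offers')
            ->allow(1)
            ->every(3)
            ->then(function () {
                $this->process();
            }, function () {
                // Released jobs are retried until retryUntil() expires.
                $this->release(10);
            });
    }
}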

DISCORD.JS SendPing to Host

I'm developing a bot with Discord.js. Because I use Lavalink, I host the Lavalink server on a free host, and to keep it online I need to ping it constantly. Is there any way to make my bot (which currently runs on my VPS) send a ping at a set interval to the URL/host where my Lavalink server is? If you have any solution I will be grateful!
You have two ways:
Using Uptime Robot (fastest way)
Uptime Robot is an online service that can make HTTP requests every 5 minutes.
Very simple and fast to use, see more here.
Making the request from your bot's VPS
Installing node-fetch
Type this in your terminal:
npm i node-fetch
Making the request
Insert this wherever you want in the bot code.
const fetch = require('node-fetch');

const intervalTime = 300000; // interval between requests in milliseconds (300000 = 5 minutes)
const lavalinkURL = 'insert here the lavalink process url';

setInterval(() => {
    fetch(lavalinkURL);
}, intervalTime);

How to parallelize downloads across hostnames on WordPress?

I'm getting this message "Parallelize downloads across hostnames" when checking my WordPress site on GTmetrix > https://gtmetrix.com
Here are the details > https://gtmetrix.com/parallelize-downloads-across-hostnames.html
How do I fix that?
Details
Web browsers put a limit on the number of concurrent connections they will make to a host. When there are many resources to download, a backlog forms: the browser opens as many simultaneous connections to the server as it allows in order to download these resources, then queues the rest and waits for the open requests to finish.
The time spent waiting for a connection to become free is referred to as blocking, and reducing this blocking time can result in a faster-loading page. The waterfall diagram on the GTmetrix page linked above shows a page which loads 45 resources from the same host; the resources are blocked (the brown segments) for a long time before they are downloaded (the purple segments) while they wait for a free connection.
So here is a hack to implement it on WordPress.
In order to work properly, all subdomains/hostnames MUST have the same structure/path. Ex:
example.com/wp-content/uploads/2015/11/myimage.jpg
media1.example.com/wp-content/uploads/2015/11/myimage.jpg
media2.example.com/wp-content/uploads/2015/11/myimage.jpg
Add to functions.php
function parallelize_hostnames($url, $id) {
    $hostname = par_get_hostname($url);
    $url = str_replace(parse_url(get_bloginfo('url'), PHP_URL_HOST), $hostname, $url);
    return $url;
}

function par_get_hostname($name) {
    // add your subdomains below, as many as you want.
    $subdomains = array('media1.mydomain.com','media2.mydomain.com');
    $host = abs(crc32(basename($name)) % count($subdomains));
    $hostname = $subdomains[$host];
    return $hostname;
}

add_filter('wp_get_attachment_url', 'parallelize_hostnames', 10, 2);
This is mainly due to HTTP/1.1, under which browsers open roughly 6 concurrent connections per hostname.
If you are running over HTTPS with a provider that supports HTTP/2, this warning can usually be safely ignored now. With HTTP/2 multiple resources can now be loaded in parallel over a single connection.
--
However, if you need to fix it, you can follow the below steps:
Create additional subdomains such as:
domain.com
static1.domain.com
static2.domain.com
Simply add the following code to your WordPress theme’s functions.php file, and replace the $subdomains values with your own subdomains.
All subdomains/hostnames MUST have the same structure/path.
function parallelize_hostnames($url, $id) {
    $hostname = par_get_hostname($url); // call supplemental function
    $url = str_replace(parse_url(get_bloginfo('url'), PHP_URL_HOST), $hostname, $url);
    return $url;
}

function par_get_hostname($name) {
    $subdomains = array('static1.domain.com','static2.domain.com');
    $host = abs(crc32(basename($name)) % count($subdomains));
    $hostname = $subdomains[$host];
    return $hostname;
}

add_filter('wp_get_attachment_url', 'parallelize_hostnames', 10, 2);
Read more about the parallelize downloads across hostnames warning and why you probably don't need to worry about this anymore.

Laravel liebig/cron executes the cronjob twice for same time

I am using Laravel 4.2.
I have a project requirement to send an analysis report email to all the users every Monday at 6 am.
Obviously it's a scheduled task, hence I've decided to use a cron job.
For this I've installed the liebig/cron package; it installed successfully. To test the email, I've added the following code in app/start/global.php:
Event::listen('cron.collectJobs', function() {
    Cron::setEnablePreventOverlapping();
    // to test the email, I am setting the day of week to today i.e. Tuesday
    Cron::add('send analytical data', '* * * * 2', function() {
        $maildata = array('email' => 'somedomain#some.com');
        Mail::send('emails.analytics', $maildata, function($message) {
            $message->to('some_email#gmail.com', 'name of user')->subject('somedomain.com analytic report');
        });
        return null;
    }, true);
    Cron::run();
});
Also in app\config\packages\liebig\cron\config.php the key preventOverlapping is set to true.
Now, if I run it with php artisan cron:run, it sends the same email twice at the same time.
I've deployed the same code on my DigitalOcean development server (Ubuntu) and set its crontab to execute this command every minute, but it still sends the same email twice.
Also, it is not generating a lock file in the app/storage directory; from some search results I've learned that it creates a lock file to prevent overlapping, and the directory has full permissions granted.
Does anybody know how to solve this?
Remove Cron::run().
Here's what's happening:
1. Your Cron route or cron:run command is invoked.
2. Cron fires the cron.collectJobs event to collect the list of jobs.
3. You call Cron::run() and run all the jobs.
4. Cron then calls Cron::run() itself and runs all the jobs again.
In the cron.collectJobs event you should only be making a list of jobs using Cron::add().
The reason you're not seeing a lock file is either that preventOverlapping is set to false (it's true by default), or that the jobs are running so fast you don't see it being created and deleted. The lock file only exists for the time the jobs run, which may only be milliseconds.
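Applied to the snippet from the question, the listener would only register the job and leave running it to the package (this is just the question's code with the Cron::run() call removed):

Event::listen('cron.collectJobs', function() {
    Cron::setEnablePreventOverlapping();
    Cron::add('send analytical data', '* * * * 2', function() {
        $maildata = array('email' => 'somedomain#some.com');
        Mail::send('emails.analytics', $maildata, function($message) {
            $message->to('some_email#gmail.com', 'name of user')->subject('somedomain.com analytic report');
        });
        return null;
    }, true);
    // No Cron::run() here -- cron:run executes the collected jobs itself.
});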

PHP cron job every 10 minutes to log out user on browser close

I read a lot but cannot find something that works.
I have a PHP query that sets the user's status to 1 when logged in and 0 when logged out.
At the moment, I have a timer for inactivity, but this will not work if the user closes the browser.
I know the solution is somehow to use JavaScript to periodically call the server using Ajax, but I am not sure how to do that.
Any help is much appreciated!
On the page:
if ($_SESSION['last_action'] + (10*60) < time()) // Check whether the last action was more than 10 minutes ago.
{
    // Log user out.
}
else {
    $_SESSION['last_action'] = time();
}
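To cover the browser-close case, the page can send a periodic Ajax heartbeat that refreshes $_SESSION['last_action'] (and a last_action column in the database), while a cron job sweeps the table and marks stale users as logged out. A rough sketch of such a sweep script, where the users table, the status and last_action columns, and the connection details are all assumptions:

<?php
// logout_stale_users.php -- hypothetical script run by cron every 10 minutes, e.g.:
//   */10 * * * * php /path/to/logout_stale_users.php

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Mark as logged out everyone whose last recorded action is older than 10 minutes.
$stmt = $pdo->prepare('UPDATE users SET status = 0 WHERE status = 1 AND last_action < ?');
$stmt->execute([time() - 10 * 60]);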
