How to Track the Online Status of Users of my Website? - session

I want to track users that are online at the moment.
The definition of being online is that they are on the index page of the website, which has the chat function.
So far, all I can think of is setting a cookie for the user and, when the cookie is found on the next visit, making an AJAX call to update a table with their username, their online status, and the time.
Now my actual question is: how can I reliably turn their status to off when they leave the website? The only thing I can think of is to set a predetermined amount of time of no user interaction and then set the status to off.
But what I really want is to keep the status on as long as they are on the site, with or without interaction, and only set it to off when they leave the site.

Full Solution. Start-to-finish.
If you only want this working on the index.php page, you could send updates to the server asynchronously (AJAX-style) alerting the server that $_SESSION["userid"] is still online.
setInterval("update()", 10000); // Update every 10 seconds
function update() {
$.post("update.php"); // Sends request to update.php
}
Your update.php file would have a bit of code like this:
session_start();

if (isset($_SESSION["userid"])) {
    updateUserStatus($_SESSION["userid"]);
}
This all assumes that you store your userid as a session variable when users log in to your website. The updateUserStatus() function is just a simple query, like the following:
UPDATE users
SET lastActiveTime = NOW()
WHERE userid = $userid
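In real code you would run that through a prepared statement rather than interpolating $userid into the SQL. A minimal sketch of updateUserStatus(), assuming a PDO connection is available in $pdo:

function updateUserStatus($userid)
{
    global $pdo; // assumes a PDO connection created during app bootstrap
    $stmt = $pdo->prepare("UPDATE users SET lastActiveTime = NOW() WHERE userid = :id");
    $stmt->execute([":id" => $userid]);
}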
So that takes care of your storage. Now to retrieve the list of users who are "online." For this, you'll want another jQuery call and another setInterval():
setInterval("getList()", 10000) // Get users-online every 10 seconds
function getList() {
$.post("getList.php", function(list) {
$("listBox").html(list);
});
}
This function requests a bit of HTML from the server every 10 seconds. The getList.php page would look like this:
session_start();

if (!isset($_SESSION["userid"])) {
    die(); // Don't give the list to anybody who isn't logged in
}

$users = getOnlineUsers(); // Gets all users with lastActiveTime within the last minute

$output = "<ul>";
foreach ($users as $user) {
    $output .= "<li>" . htmlspecialchars($user["userName"]) . "</li>"; // escape, just in case
}
$output .= "</ul>";

print $output;
That would output the following HTML:
<ul>
<li>Jonathan Sampson</li>
<li>Paolo Bergantino</li>
<li>John Skeet</li>
</ul>
That list arrives in the jQuery callback's list variable; look back at the previous jQuery block and you'll see it there.
jQuery will take this list and place it within a div having the class name "listBox":
<div class="listBox"></div>
Hope this gets you going.

In the general case, there's no way to know when a user leaves your page.
But you can do things behind the scenes so that they load something from your server frequently while they're on the page, e.g. by loading an <iframe> with some content that reloads every minute:
<meta http-equiv="refresh" content="60">
That will cause some small extra server load, but it will do what you want (if not to the second).

Well, how does the chat function work? Is it an ajax-based chat system?
Ajax-based chat systems work by having the clients constantly poll the chat server to see if there are any new messages in the queue. If this is the case, you can update the user's online status either in a cookie or a PHP session (assuming you are using PHP, of course). Then you can set the online timeout to be something slightly longer than the update frequency.
That is, if your chat system typically requests new messages from the server every 5 seconds, then you can assume that any user who hasn't sent a request for 10-15 seconds is no longer on the chat page.
If you are not using an ajax-based chat system (maybe Java or something), then you can still accomplish the same thing by adding an ajax request that goes out to the server periodically to establish whether or not the user is online.
I would not suggest storing this online-status information in a database. Querying the database every couple of seconds to see who is online and who isn't is very resource-intensive, especially on a large site. You should cache this information and operate on the cache (very fast) rather than the database (very slow by comparison).
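In PHP, for example, this could be done with the APCu extension instead of a database table (a sketch; it assumes APCu is enabled, and it only works on a single web server since APCu is per-machine):

// Record activity; the key silently expires after 60 seconds without updates.
apcu_store("online_" . $userid, time(), 60);

// Check one user without touching the database.
$isOnline = apcu_exists("online_" . $userid);

// Enumerate everyone currently online.
$online = [];
foreach (new APCUIterator('/^online_/') as $entry) {
    $online[] = substr($entry["key"], strlen("online_"));
}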

The question is tagged as "jquery" - what about a JavaScript solution? Instead of a meta refresh you could use window.setInterval(), perform an AJAX request, and provide something "useful" like an updated "who's online" list (if you consider that useful ;-))

I have not tried this, so take it with a grain of salt: Set an event handler for window.onunload that notifies the server when the user leaves the page. Some problems with this are 1.) the event won't fire if the browser or computer crashes, and 2.) if the user has two instances of the index page open and closes one, they will appear to logout unless you implement reference counting. On its own this is not robust, but combined with Jonathan's polling method, should allow you to have pretty good response time and larger intervals between updates.
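The server-side half of such a notification can be tiny. A sketch reusing the PHP session setup from the first answer; markUserOffline() is a hypothetical helper that would expire the user's lastActiveTime:

// offline.php - a hypothetical endpoint the onunload handler would POST to
session_start();
if (isset($_SESSION["userid"])) {
    markUserOffline($_SESSION["userid"]); // hypothetical helper, e.g. sets lastActiveTime to NULL
}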

The ultimate solution would be implementing something with WebSockets.

Related

Is there a way to delay cache revalidation in service worker?

I am currently working on performance improvements for a React-based SPA. Most of the more basic stuff is already done so I started looking into more advanced stuff such as service workers.
The app makes quite a lot of requests on each page (most of the calls are not to REST endpoints but to an endpoint that basically makes different SQL queries to the database, hence the amount of calls). The data in the DB is not updated too often so we have a local cache for the responses, but it's obviously getting lost when a user refreshes a page. This is where I wanted to use the service worker - to keep the responses either in cache store or in IndexedDB (I went with the second option). And, of course, the cache-first approach does not fit here too well as there is still a chance that the data may become stale. So I tried to implement the stale-while-revalidate strategy: fetch the data once, then if the response for a given request is already in cache, return it, but make a real request and update the cache just in case.
I tried the approach from Jake Archibald's offline cookbook, but it seems like the app is still waiting for the real requests to resolve even when there is a cache entry to return (I see those responses in the Network tab).
Basically the sequence seems to be the following: request > cache entry found! > need to update the cache > only then show the data. Doing the update immediately is unnecessary in my case, so I was wondering if there is any way to delay it? Or, alternatively, not to wait for the "real" response to resolve?
Here's the code that I currently have (serializeRequest, cachePut and cacheMatch are helper functions that I have to communicate with IndexedDB):
self.addEventListener('fetch', (event) => {
    // Some checks to get out of the event handler if certain conditions don't match...
    const request = event.request;

    event.respondWith(
        serializeRequest(request).then((serializedRequest) => {
            return cacheMatch(serializedRequest, db.post_cache).then((response) => {
                const fetchPromise = fetch(request).then((networkResponse) => {
                    // Cache the fresh network response (networkResponse, not the
                    // cached one, which is undefined on a cache miss anyway)
                    cachePut(serializedRequest, networkResponse.clone(), db.post_cache);
                    return networkResponse;
                });
                return response || fetchPromise;
            });
        })
    );
});
Thanks in advance!
EDIT: Can this be due to the fact that I put stuff into IndexedDB instead of the cache? I am sort of forced to use IndexedDB because those "magic endpoints" use POST instead of GET (they require a request body), and POST requests cannot be inserted into the Cache API...

Why should $time in $lock = Cache::lock('name', $time) be greater than the time the code inside the lock takes to run?

I placed this code inside a Route::get() method just to test it more quickly. So this is how it looks:
use Illuminate\Support\Facades\Cache;

Route::get('/cache', function () {
    $lock = Cache::lock('test', 4);

    if ($lock->get()) {
        Cache::put('name', 'SomeName'.now());
        dump(Cache::get('name'));
        sleep(5);
        // dump('inside get');
    } else {
        dump('locked');
    }
    // $lock->release();
});
If you reach this route from two browsers at (almost) the same time, both will respond with the result of dump(Cache::get('name')). Shouldn't the second browser respond with "locked"? When it calls $lock->get(), that call is supposed to return false, because the lock should still be held when the second browser reaches the route.
The same code works just fine if the code after $lock = Cache::lock('test', 4) takes less than 4 seconds to execute. If you use sleep($sec) with $sec < 4, the first browser reaching this route responds with the result of Cache::get('name') and the second responds with "locked", as expected.
Can anyone explain why this is happening? Isn't every get() call on that lock, except the first one, supposed to return false for as long as the lock is held? I used 2 different browsers, but it behaves the same with 2 tabs of the same browser.
Quote from the 5.6 docs https://laravel.com/docs/5.6/cache#atomic-locks:
To utilize this feature, your application must be using the memcached or redis cache driver as your application's default cache driver. In addition, all servers must be communicating with the same central cache server.
Quote from the 5.8 docs https://laravel.com/docs/5.8/cache#atomic-locks:
To utilize this feature, your application must be using the memcached, dynamodb, or redis cache driver as your application's default cache driver. In addition, all servers must be communicating with the same central cache server.
Quote from the 8.0 docs https://laravel.com/docs/8.x/cache#atomic-locks:
To utilize this feature, your application must be using the memcached, redis, dynamodb, database, file, or array cache driver as your application's default cache driver. In addition, all servers must be communicating with the same central cache server.
Apparently, they have been adding support for more drivers over time. Check which cache driver you are using and whether it is in the supported list for your Laravel version.
There is likely an atomicity issue here, where the cache driver you are using is not able to lock a file atomically. What should happen is that when a process (i.e. a PHP request) is writing to the lock file, all other processes requiring the lock file should at least wait until the lock file is available to be read again. If not, they read the lock file before it has been written to, which obviously causes a race condition.
Coming back to this question I asked: I can now say that the problem I was trying to solve was not caused by the atomic lock. The problem is the sleep() call. If the time given to sleep() is longer than the lock's lifetime, the lock will already have expired (been released) by the time the next request is able to hit the route. To see why, say you have defined a route like this:
Route::get('case/{value}', function ($value) {
    if ($value) {
        dump('hit-1');
    } else {
        sleep(5);
        dump('hit-0');
    }
});
And you open two browser tabs with URLs that hit this route, something like:
127.0.0.1:8000/case/0
and
127.0.0.1:8000/case/1
It will show you that the first request takes 5 seconds to finish execution, and even if the second request is sent at almost the same time as the first, it still waits for the first one to finish before running. This means the second request lasts the 5 seconds of the first request plus its own run time.
Back to the original question: the lock's time will have expired by the time the second request gets to run, i.e. by the time it executes the $lock->get() statement.
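In short, the lock's TTL has to outlive the slowest code path it guards, and it is cleaner to release the lock explicitly than to let it expire. A sketch of the original route with those two changes (TTL raised from 4 to 10 seconds):

use Illuminate\Support\Facades\Cache;

Route::get('/cache', function () {
    // A TTL of 10s comfortably exceeds the 5s sleep, so the lock
    // cannot expire while the request is still running.
    $lock = Cache::lock('test', 10);

    if ($lock->get()) {
        try {
            Cache::put('name', 'SomeName'.now());
            dump(Cache::get('name'));
            sleep(5); // simulated slow work
        } finally {
            $lock->release(); // free the lock as soon as the work is done
        }
    } else {
        dump('locked');
    }
});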

keeping laravel session variable till user logs out

I have simple authentication for users. In UserController I have a function called postLogin():
public function postLogin()
{
    // $credentials gathered from the request (not shown here)
    if (Auth::attempt($credentials)) {
        return Redirect::intended('desk')->with('stream', "SomeData");
    }
}
With the above code I am able to log in successfully with the "SomeData" value, which I am retrieving with:
<?php
$class = Session::get('stream');
var_dump($class);
?>
The first time it goes to the "/desk" URL it dumps the value perfectly fine, that is, "SomeData", but once I refresh the page the session value is reset and turns to null.
How do I keep this value until the user logs out?
From the official Laravel documentation:
Flash Data
Sometimes you may wish to store items in the session only for the next
request. You may do so using the flash method. Data stored in the
session using this method will only be available during the subsequent
HTTP request, and then will be deleted. Flash data is primarily useful
for short-lived status messages:
$request->session()->flash('status', 'Task was successful!');
If you need to keep your flash data around for even more requests, you
may use the reflash method, which will keep all of the flash data
around for an additional request. If you only need to keep specific
flash data around, you may use the keep method:
$request->session()->reflash();
$request->session()->keep(['username', 'email']);
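That is exactly what happens in the question: ->with('stream', ...) on the redirect flashes the data, so it survives only the next request. To keep the value until logout, put it in the session instead and forget it when the user logs out. A minimal sketch using the Session facade:

// In postLogin(), after a successful Auth::attempt():
Session::put('stream', 'SomeData'); // persists for the whole session

// On any later request:
$class = Session::get('stream');

// In your logout action:
Session::forget('stream');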

Programmatically change database for heroku dataclips

We just upgraded our Heroku postgres database using the follower changeover method. We have over 50 dataclips attached to the old database, and now we need to move them over to the new database. However, doing them one by one will take a lot of time.
Is there a programmatic way to update the database a dataclip is attached to, perhaps with the CLI tools?
At least once the old database has been deprovisioned, you can now (as of March 2016) reattach them to another database:
Go to https://dataclips.heroku.com/clips/recoverable. It will display your old database and a set of 'orphaned' dataclips and you can choose to transfer them to another database (in my case the promoted follower from the changeover).
Note that this only affects the dataclips that you created, it does not affect the dataclips one of your team members created and that you only had access to. So they will have to go through this process as well.
Official devcenter article: https://devcenter.heroku.com/articles/dataclips#dataclip-recovery
Thanks to Heroku CSRF measures, programmatically updating data clips is much more difficult than you might expect. You'll need to suck it up and start clicking buttons by hand, or beg their support team to do it for you, which is just as difficult.
There is no official support for programmatically moving the dataclips. That being said, you can script it out against their HTTP API.
The base URL is https://dataclips.heroku.com/api/v1/. There are three relevant endpoints:
clips /clips
resources (databases) /heroku_resources
move clip /clips/:slug/move
Find the slug of the clip you want to move, find the resource id of the new database, and make a post to the move clip endpoint:
POST /api/v1/clips/fjhwieufysdufnjqqueyuiewsr/move
Content-Type: application/json
{"heroku_resource_id":"resource123456789#heroku.com"}
I had over 300 dataclips to move. I used the following technique to update them all (essentially reverse engineering the dataclips API).
Open Chrome with Web Developer tools, Network tab.
Log into Heroku Dataclips
Observe the network call which returns all the dataclips, in JSON (https://dataclips.heroku.com/api/v1/clips). Take this response and extract out all dataclip slugs.
Update the database for one dataclip. Observe the network call which does this (https://dataclips.heroku.com/api/v1/clips/:slug/move). Right click, Copy as cURL. This is the easiest way to get all the correct parameters, since the API uses cookies for authentication.
Write a script that loops through each dataclip slug, and shells out to curl. In Ruby, this looks like:
slugs = <paste ids here>.split("\n")

slugs.each do |slug|
  command = %Q(curl -v 'https://dataclips.heroku.com/api/v1/clips/#{slug}/move' -H 'Cookie: ...' --data '{"heroku_resource_id":"resource1234567@heroku.com"}')
  puts command
  system(command)
end
You can contact Heroku support, and they will bulk transfer the dataclips to your new database for you.
Batch working on dataclips
I've finally found a way to work on my dataclips as a batch, using the JavaScript console and some scraping techniques. I needed it to retrieve every dataclip, but I guess it can be adapted to make updates as well:
// Go to the dataclip listing (https://data.heroku.com/dataclips),
// then execute this script in your console.
// Be careful: this will focus a new window every 4 seconds, preventing
// you from working for 4 seconds times the number of dataclips you have.

// Retrieve urls and titles
let dataclips = Array
  .from(document.querySelectorAll('.rt-td:first-child a'))
  .map(el => ({ url: el.href, title: el.innerText }))

/**
 * Allows waiting for a given timeout before execution.
 * @param {number} ms - timeout in milliseconds
 */
const timeout = function (ms) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve()
    }, ms)
  })
}

/**
 * Here are all the changes you want to apply to every single dataclip.
 * @param {object} window
 */
const applyChanges = function (window) {
}

// With a fast connection, 4 seconds is OK. Dial it up if you have errors.
const expectedLoadTime = 4000 // ms

// This is the main loop; windows are opened one by one to ensure focus and a
// correct loading time.
for (const dataclip of dataclips) {
  // This opens another window from the script, having access to its DOM.
  // See https://github.com/buonomo/kazoo for a funnier example usage!
  // And don't be shy to star and share :D
  const externWindow = window.open(dataclip.url)
  // A hack to wait for loading; this could be improved for sure.
  await timeout(expectedLoadTime)
  applyChanges(externWindow)
  externWindow.close()
}
You'd still have to implement applyChanges yourself, which I concede is a bit tedious, and I don't have time to do it now (if someone does, please share!). But at least it can be done on all of your dataclips in a single function.
For an example usage of this script, you can take a look at the gist I made to scrape every dataclip and its related errors.

How to prevent AJAX polling keeping Asp.Net sessions alive

We have an ASP.NET application that has been given a 20-minute sliding expiry for the session (and cookie).
However, we have some AJAX polling the server for new information. The downside, of course, is that the session will continue indefinitely, because the polling calls keep it alive and refresh the expiry time. Is there a way of preventing this - i.e. to only allow the session to be refreshed on non-AJAX calls?
Turning off sliding expiry is not really an option as this is an application that business users will be using for most of their day between telephone calls.
Other Stack Overflow discussions on this talk about maintaining 2 separate applications (one for authenticated calls, one for unauthenticated). I'm not sure this will be an option, as all calls need to be authenticated.
Any ideas?
As this question is old, I am assuming it has been resolved or a workaround implemented. However, I wanted to mention that instead of AJAX polling the server, we have utilized SignalR, which lets the client communicate with the server via jQuery and also lets the server push notifications to the client.
Check it out: Learn About ASP.NET SignalR
Add the code below to the controller action you reference for polling. Convert it into an attribute so it can be used everywhere. Removing the cookie this way keeps the request from extending the session timeout:
[HttpPost]
public ActionResult Run()
{
    Response.Cookies.Remove(FormsAuthentication.FormsCookieName);
    return Json("");
}
There is no way to stop the AJAX calls from keeping the session and cookies alive!
However, there is a way to achieve what you want, if the process described below is acceptable to you.
I think what you really want is to keep refreshing the page with AJAX so that some processes stay active and running, while also knowing when the user has stopped operating the program.
If that is what you want, there is a simple way to achieve it:
Keep your AJAX polling running for the things you want to run.
Stop relying on the session to detect that the user has stopped operating the page, and manage that with a variable instead.
The variable can be a global variable or a class variable that is reset to its initial value whenever the user clicks an element on the page (hook the click events of elements and reset the variable there).
Increment the variable at a regular interval (say, every time your AJAX poll runs).
Also have a function/method that checks whether the variable has grown past the limit you set. This can run every time your AJAX poll runs, or on its own timed event.
If the value of the variable exceeds the limit, invalidate/clear the session and log the user out.
This way, if the user stops operating (clicking elements), the system will eventually log out the current user and stop running the program on any page where this is active.
I have done this by creating a hidden page in an iframe. Then, using JavaScript, it posts back every 18 minutes to keep the session alive. This works really well.
This example is from a ASP.NET Forms project but could be tweaked for MVC.
Create a page called KeepSessionAlive and add a meta refresh tag (give it an id and runat="server" so the code-behind can update it):

<meta id="MetaRefresh" runat="server" http-equiv="refresh" content="21600;url=KeepSessionAlive.aspx">
In the code behind
protected string WindowStatusText = "";

protected void Page_Load(object sender, EventArgs e)
{
    //string RefreshValue = Convert.ToString((Session.Timeout * 60) - 60);
    string RefreshValue = Convert.ToString((Session.Timeout * 60) - 90);

    // Refresh this page 90 seconds before session timeout, effectively
    // resetting the session timeout counter.
    MetaRefresh.Attributes["content"] = RefreshValue + ";url=KeepSessionAlive.aspx?q=" + DateTime.Now.Ticks;

    WindowStatusText = "Last refresh " + DateTime.Now.ToShortDateString() + " " + DateTime.Now.ToShortTimeString();
}
Add the hidden iframe in a master page:

<iframe ID="KeepAliveFrame" src="KeepSessionAlive.aspx" frameBorder="0" width="0" height="0"></iframe>
Download example
