socket.io rejoin rooms on reconnect

I'm writing a new SPA application that will subscribe to several rooms for several types of information updates.
In my production setup I'll use two servers behind a load balancer for reliability.
In the event of a disconnect, does the client have to resend the room subscription requests in the reconnect event callback, or is there a way to have the server resubscribe the client automatically (even when the client reconnects to a different server after a server failure)?

Socket.io will unsubscribe your users from all rooms on a disconnect; the unsubscribe happens on the server side. I played around with this a little. The server can store each user's rooms in Redis or a database under the user ID and, upon connection, check whether that user should be in any of those rooms. The user can then be joined to them from the server side without the client ever having to do anything.
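A rough sketch of that server-side approach, assuming Socket.io 1.x and the node_redis client, with each user's rooms kept in a Redis set keyed by user ID (the key name and the way the user ID is obtained are placeholders):
var redis = require('redis');
var redisClient = redis.createClient();

io.on('connection', function(socket) {
  // however you identify the user (auth token, session, query param, ...)
  var userId = socket.handshake.query.userId;

  // rejoin every room previously recorded for this user
  redisClient.smembers('rooms:' + userId, function(err, rooms) {
    if (err || !rooms) return;
    rooms.forEach(function(room) {
      socket.join(room);
    });
  });

  socket.on('subscribe', function(room) {
    socket.join(room);
    // remember the subscription so it survives reconnects, even to another server
    redisClient.sadd('rooms:' + userId, room);
  });
});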
The problem is that this list of rooms must be constantly stored and updated. It's just another thing that has to work seamlessly on the backend, and it takes a lot of tests to cover all the possibilities that could get it out of sync. For example, if the user logs in on another device you have to clear the old rooms and store new ones, but if the user then opens his laptop again and it reconnects, now he has to get back into those rooms from the laptop. It's totally doable/solvable, but I only did this on the front end:
// rejoin if there's a disconnect
mySocket.on('reconnect', () => {
  mySocket.emit('subscribe', 'theRoom')
})
...and no further hassle. Could you add some more detail about why it's necessary to do this from the server side?

From my experience, I found this to be the easiest and most useful solution:
Client side:
// The next three handlers will fire automatically around a disconnect.
// The disconnect handler (the first one) is not required, but you know,
// you can use it to do some other useful stuff.
socket.on("disconnect", function() {
  console.log("Disconnected");
});

socket.on("reconnect", function() {
  // Do not rejoin from here, since the socket.id and/or rooms are still
  // not available at this point.
  console.log("Reconnecting");
});

socket.on("connect", function() {
  // That's the key line: register to the room you want here.
  // Info about the required rooms (if it's not as simple as in my
  // example) can easily be fetched via a DB connection. It's worth it.
  socket.emit("registerToRoom", $scope.user.phone);
});
Server side:
io.on('connection', function(socket) {
  socket.on("registerToRoom", function(userPhone) {
    socket.join(userPhone);
  });
});
And that's it. Very simple and straightforward.
You can also add some more user-facing updates in the connect handler (the last function), such as refreshing a view or anything else that needs to happen after a reconnect.
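If the required rooms are more involved than a single phone number, the server-side handler can look them up before joining. A rough sketch, where getRoomsForUser is a hypothetical async DB lookup:
io.on('connection', function(socket) {
  socket.on("registerToRoom", function(userPhone) {
    // hypothetical lookup of all rooms this user should be in
    getRoomsForUser(userPhone, function(err, rooms) {
      if (err) return;
      rooms.forEach(function(room) {
        socket.join(room);
      });
    });
  });
});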

Socket.io does have a reconnect event (see the client API docs).
Something like the below should work:
socket.on('reconnect', () => attemptReconnection())
The attemptReconnection callback would look something like this:
const attemptReconnection = () => socket.emit('joinRoom', roomId)

Related

Send socket data to a specific socket id socket.io

So I'm trying to send data to a specific user (a specific socket ID). I've tried doing:
io.to(users[steamid].socket).emit('message', {
  type: 'balance',
  balance: row[0].balance
});
users[steamid].socket is where I store all the socket IDs; it just fetches the specific ID I want. However, when I do this it works the first time, but when it fires again it doesn't work. I don't know what I'm doing wrong.
The way I'm firing the call is with a function: the website has a countdown, and when the countdown is over it calls a function that gets the socket ID of the winner and then sends the winner the data it needs. I want to know why this only works the first time.
Thanks in advance.
A way to fix your problem would be to save sockets, not socket IDs.
If user.socket is the socket for a user, you can then do
user.socket.emit('message', { ... });
Or in your example
users[steamid].socket.emit('message', { ... });
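For example, a minimal sketch of keeping the live socket object around (the 'identify' event and the shape of users are assumptions, mirroring the question):
io.on('connection', function(socket) {
  // assume the client announces its steamid right after connecting
  socket.on('identify', function(steamid) {
    users[steamid] = { socket: socket };
  });

  socket.on('disconnect', function() {
    // drop stale entries so we never emit to a dead socket
    for (var id in users) {
      if (users[id].socket === socket) delete users[id];
    }
  });
});

// later, when the countdown ends:
// users[winnerSteamId].socket.emit('message', { type: 'balance', balance: row[0].balance });

Note that socket IDs change on every reconnect, which is why a stored ID stops working after the first round.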

Parse Background Job can send a message to all users currently logged in?

I am new to Parse and I want to know if there is a way to schedule a background job that starts every 3 minutes and sends a message (an integer or something) to all users who are logged in at that moment. I could not find any help on this while reading the guide. I hope someone can help me here.
I needed to push information to all logged-in users in several apps built with Parse.com.
None of the solutions introduced earlier by Emilio worked for us, because we needed to trigger live events for logged-in users only.
So we decided to work with PubNub from Cloud Code in Parse: http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
Our strategy is to open a "channel" available to all users; if a user is active (logged in), we push to this dedicated "channel" information triggered by the app, which creates new events or calls to action.
This is sample code to send information to a dedicated channel:
Parse.Cloud.define("sendPubNubMessage", function(request, response) {
var message = JSON.stringify(request.params.message);
var PubNubUrl;
var PubNubKeys;
Parse.Config.get().then(function(config) {
PubNubKeys = config.get('PubNubkeys');
}).then(function() {
PubNubUrl = 'https://pubsub.pubnub.com/publish/';
PubNubUrl+= PubNubKeys.Publish_Key + '/';
PubNubUrl+= PubNubKeys.Subscribe_Key + '/0/';
PubNubUrl+= request.params.channel +'/0/';
PubNubUrl+= message;
return Parse.Cloud.httpRequest({
url: PubNubUrl,
headers: {
'Content-Type': 'application/json;charset=utf-8'
}
}).then(function(httpResponse) {
return httpResponse;
});
}).then(function(httpResponse) {
response.success(httpResponse.text);
}, function(error) {
response.error(error);
});
});
This is another sample, used to send a message to a dedicated channel once something changes on a specific class:
Parse.Cloud.afterSave("your_class", function(request, response) {
if (!request.object.existed()) {
Parse.Cloud.run('sendPubNubMessage', {
'message': JSON.stringify({
'collection': 'sample',
'objectId': request.object.id
}),
'channel' : 'all' // could be request.object.get('user').id
});
}
});
@Toucouleur is right in suggesting PubNub for your Parse project. PubNub acts essentially like an open socket between client and server, so that the server can send messages to clients and vice versa. There are 70+ supported SDKs, including one for Windows Phone.
One approach for your problem would be to subscribe all users to a channel when they log in, and unsubscribe them from that channel when they exit the app or time out.
When you want to send a message you can publish to a Channel and all users Subscribed will receive that message in < 1/4 second. PubNub makes sending those messages as Push Notifications really simple as well.
Another feature you may find useful is "Presence" which can give you realtime information about who is currently Subscribed to your "Channel".
If you think a code sample would help let me know!
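For reference, a rough sketch of the classic PubNub JavaScript (v3) publish/subscribe calls; the keys and channel name are placeholders:
// initialise with your own keys
var pubnub = PUBNUB.init({
  publish_key: 'your-pub-key',
  subscribe_key: 'your-sub-key'
});

// every logged-in client subscribes to a shared channel
pubnub.subscribe({
  channel: 'logged_in_users',
  message: function(msg) {
    console.log('server says:', msg);
  }
});

// publishing (from a client or a server-side SDK) reaches every subscriber
pubnub.publish({
  channel: 'logged_in_users',
  message: { type: 'refresh', value: 42 }
});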
Here are a few ideas I came up with.
Send a push notification to all users, but don't add an alert text. No alert will show for users who have the app closed and you can handle the alert in the App Delegate. Disadvantage: Uses a lot of push notifications, and not all of them are going to be used.
When the app comes to foreground, add a flag to the PFInstallation object that specifies the user is online, when it goes to the background, set the flag to false. Send a push notification to the installations that have the flag set to true. Disadvantages: If the app crashes, you would be sending notifications to users that are not online. Updating the user twice per session can increase your Parse request count.
Add a new property to the PFInstallation object where you store the last time a user did something, you can also set it on a timer of 30s/1m while the app is open. Send a push notification to users that have been active in the last 30s/1m. Disadvantage: Updating the PFInstallation every 30 seconds might cause an increase on your Parse request count. More accuracy (smaller interval) means more requests. The longer the session length of your users, the more requests you will use.
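As an illustration of the second idea, a rough sketch of a Cloud Code function that pushes only to installations flagged as online (the isOnline field name is an assumption):
Parse.Cloud.define('notifyOnlineUsers', function(request, response) {
  var query = new Parse.Query(Parse.Installation);
  query.equalTo('isOnline', true);

  Parse.Push.send({
    where: query,
    data: {
      // no "alert" key, so nothing visible pops up; handle it in the app delegate
      customValue: request.params.value
    }
  }, {
    success: function() { response.success('sent'); },
    error: function(error) { response.error(error); }
  });
});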

Why does socket.disconnect disconnect the wrong user?

My website has an IM with several users connected. From my client I wish to disconnect a particular user. Here is the code I am trying:
// client side
function deleteUser(delCallsign) {
  delCallsign = delCallsign.toUpperCase();
  socket.emit('deleteuser', delCallsign); // send it to the server for deletion
}

// server side
socket.on('deleteuser', function(callsign) {
  socket.disconnect(usernames[callsign]);
  io.sockets.emit('updateusers', usernames);
});
Using an alert, I have verified that I'm calling the server-side function with the username I wish to disconnect. But what happens is that I get disconnected, not the specified user. What am I doing wrong here?
On user connection you should record the socket.id, which you can then look up when deleting:
io.sockets.on('connection', function(socket) {
  // assign socket.id to the user's entry here, e.g. usernames[callsign].id = socket.id;

  socket.on('deleteuser', function(callsign) {
    // disconnect the target user's socket, not the caller's
    io.sockets.connected[usernames[callsign].id].disconnect();
    io.sockets.emit('updateusers', usernames);
  });
});
This is roughly the idea.
Based on these posts:
SocketIO: disconnect client by socket id?
Get the client id of the message sender in socket.io?
And, more relevant (a little bit old, but the same principles apply):
how do I store socket resources from specific users with socket.io?

Sending events from server to client(s) in Meteor

Is there a way to send events from the server to all or some clients without using collections?
I want to send events with some custom data to clients. While Meteor is very good at doing this with collections, in this case the added complexity and storage are not needed.
On the server there is no need for Mongo storage or local collections.
The client only needs to be alerted that it has received an event from the server and act according to the data.
I know this is fairly easy with sockjs, but it's very difficult to access sockjs from the server.
Meteor.Error does something similar to this.
The package below is now deprecated and does not work for Meteor versions > 0.9.
You can use the following package, which was originally aimed at broadcasting messages from clients to server to clients:
http://arunoda.github.io/meteor-streams/
No collection, no MongoDB behind it; usage is as follows (not tested):
stream = new Meteor.Stream('streamName'); // defined on both client and server

if (Meteor.isClient) {
  stream.on("channelName", function(message) {
    console.log("message:" + message);
  });
}

if (Meteor.isServer) {
  setInterval(function() {
    stream.emit("channelName", 'This is my message!');
  }, 1000);
}
You should use Collections.
The "added complexity and storage" isn't a factor if all you do is create a collection, add a single property to it and update that.
Collections are just a shape for data communication between server and client, and they happen to build on mongo, which is really nice if you want to use them like a database. But at their most basic, they're just a way of saying "I want to store some information known as X", which hooks into the publish/subscribe architecture that you should want to take advantage of.
In the future, other databases will be exposed in addition to Mongo. I could see there being a smart package at some stage that strips Collections down to their most basic functionality like you're proposing. Maybe you could write it!
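To illustrate how small that can be, here is a rough sketch (untested, names are illustrative) of a collection used purely as an event channel:
// shared between client and server: a collection used only as an event channel
Events = new Meteor.Collection('events');

if (Meteor.isServer) {
  Meteor.publish('events', function() {
    // only the latest event is ever published
    return Events.find({}, { sort: { createdAt: -1 }, limit: 1 });
  });

  // "fire" an event by inserting a document
  Meteor.setInterval(function() {
    Events.insert({ type: 'ping', payload: 'hello', createdAt: new Date() });
  }, 3000);
}

if (Meteor.isClient) {
  Meteor.subscribe('events');
  Events.find().observe({
    added: function(doc) {
      console.log('event from server:', doc.type, doc.payload);
    }
  });
}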
I feel for @Rui; using a Collection just to send a message does feel cumbersome.
At the same time, once you have several such messages to send around, it is convenient to have a Collection named something like settings where you keep them.
The best package I have found is Streamy. It allows you to send to everybody, or to just one specific user:
https://github.com/YuukanOO/streamy
meteor add yuukan:streamy
Send message to everybody:
Streamy.broadcast('ddpEvent', { data: 'something happened for all' });
Listen for the message on the client:
// Attach a handler for a specific message
Streamy.on('ddpEvent', function(d, s) {
  console.log(d.data);
});
Send a message to one user (by id):
var socket = Streamy.socketsForUsers(["nJyQvECmkBSXDZEN2"])._sockets[0]
Streamy.emit('ddpEvent', { data: 'something happened for you' }, socket);

ExpressJS backend hanging with too many requests

I have an express app running with Sequelize.js as an ORM. My express app receives requests from my main Rails app, and because of the cross-domain policy, these requests are performed with getJSON.
On the client, the request is fired when the user hits a key.
Everything goes fine and Express logs the queries being performed (and the JSON being served) each time the user hits the key. Even when hitting it quickly it performs OK. But whenever I leave the key pressed (or maybe several clients hit the key very quickly), as it starts firing lots of requests, at some moment the server just hangs: all the requests from that point on are left pending (I see that in the Network tab of Chrome Dev Tools), and they slowly start to time out. I have to reboot the server to make it respond again.
The server code for my request is:
models.Comment.findAllPublic(req.params.pId, req.params.sId, function(comments) {
  var json = comments.map(function(comment) {
    var com = {};
    ['user_id', 'user_avatar', 'user_slug', 'user_name', 'created_at', 'text', 'private', 'is_speaker_note'].forEach(function(key) {
      com[key] = comment[key];
    });
    return com;
  });
  res.json({comments: json});
});
And the findAllPublic method from the Comment model (this is a Sequelize model) is:
findAllPublicAndMyNotes: function(current_user, presentationId, slideId, cb) {
  db.query("SELECT * FROM `comments` WHERE commentable_type='Slide' AND commentable_id=(SELECT id FROM `slides` WHERE `order_in_presentation`=" + slideId + " AND `presentation_id`=" + presentationId + ") AND (`private` IS FALSE OR (`private` IS TRUE AND `user_id`=" + current_user + " AND `is_speaker_note` IS FALSE))", self.Comment)
    .on('success', cb)
    .on('failure', function(err) { console.log(err); });
}
How can I keep the server from getting stuck? Is there some blocking code in the request handler that may slowly hang the server as new requests come in?
At first I thought it could be a problem with the "forEach" when composing the JSON object from the Sequelize model, but I also tried leaving the callback for the MySQL query empty, just responding with empty JSON, and it still froze.
Maybe it is a problem with the MySQL connector? When the server gets stuck I can still open the MySQL console and perform queries against my database, and it responds, so I don't know if that's the problem.
I know I could just control the key event to prevent it from firing too many requests when the key is pressed for a long time, but the problem also seems to appear when several clients hit the key repeatedly and concurrently.
Any thoughts? Thanks in advance for the help :D
Two things:
It seems like you have some code path where the response is never sent (res.json is never called). It could be that the database you're connecting to is dropping the connection to your Express server after the absurd number of requests, so the callback is never fired, and there's no database.on('close', function() { /* handle the disconnect from the DB, perhaps reconnecting automatically */ }) code to catch it.
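For that first point, if you are on the plain node mysql driver (Sequelize sits on top of a similar connection), a common reconnect pattern looks roughly like this; the connection options are placeholders:
var mysql = require('mysql');
var connection;

function connect() {
  connection = mysql.createConnection({ host: 'localhost', user: 'app', database: 'app' });
  connection.connect();
  connection.on('error', function(err) {
    // re-establish the connection if the server dropped it, otherwise rethrow
    if (err.code === 'PROTOCOL_CONNECTION_LOST') connect();
    else throw err;
  });
}

connect();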
Your client-side code should detect when an AJAX request fired on keypress is still pending while a new one is being started, and cancel the old one. I'm guessing getJSON is a jQuery method? Assuming it's jQuery's, you need something like the following:
var currKeyRequest = null;

function callOnKeyUp() {
  var searchText = $('#myInputBox').val();
  if (currKeyRequest) {
    // abort the still-pending request before starting a new one
    currKeyRequest.abort();
    currKeyRequest = null;
  }
  currKeyRequest = $.getJSON('path/to/server', function(json) {
    currKeyRequest = null;
    // Use JSON code
  });
}
This way, you reduce the load on the client and the latency of the autocomplete functionality (but why not use the jQuery UI autocomplete widget, if that's what you're after?), and you can save the server from some of the load as well if the keypresses come in faster than the round trips to the server (possible with a good touch-typist a few hours' flight away).
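If you also want to cut down on how many requests get fired in the first place (the held-down-key case from the question), a simple client-side debounce is one option; the 250 ms delay and the #myInputBox selector are just examples, reusing the callOnKeyUp function above:
var keyTimer = null;

$('#myInputBox').on('keyup', function() {
  clearTimeout(keyTimer);
  // only fire the request once the user has paused typing for 250 ms
  keyTimer = setTimeout(callOnKeyUp, 250);
});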
