I'm trying to have clients publish an A/V stream, stop publishing, and then publish again. The first time I tell them to publish and then unpublish, it works fine. However, the next time I tell them to publish (using the same session ID and token), I get the error "Cannot Connect, the session is already undefined".
Why is the "session" getting destroyed? Is it the unpublish? My code is pretty much taken from the tutorials:
clientSession = OT.initSession(apiKey, sessionId);

clientSession.connect(token, function (error) {
    if (error) {
        handleError(error);
    } else {
        // Once connected, create the publisher
        clientPublisher = OT.initPublisher(container, {
            insertMode: 'append',
            width: '100%',
            height: '100%'
        }, handleError);
    }
});
To unpublish:
clientSession.unpublish(clientPublisher);
There are two ways you could do this. You could initialise a single publisher object once and keep reusing it every time you republish, or you could destroy and reinitialise a new publisher each time. I've written up an example of both approaches for you:
Reuse same publisher: https://jsbin.com/tobabos/edit?html
Create new publisher each time: https://jsbin.com/jawuxez/edit?html
Note: Please provide your own API key, session ID and token to run the above JSbins
The key difference is that to reuse a publisher you need to do this:
pub.on('streamDestroyed', e => e.preventDefault());
This is documented here: https://tokbox.com/developer/sdks/js/reference/Publisher.html#.event:streamDestroyed
It makes sure that when you unpublish, the publisher object is not destroyed so it can be reused.
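Put together, a minimal sketch of the reuse approach might look like this (it borrows clientSession, container and handleError from your code above; toggleButton is just a placeholder for whatever UI triggers the change):

var publishing = false;
var publisher = OT.initPublisher(container, { insertMode: 'append' }, handleError);

// Keep the publisher alive across unpublish calls so it can be reused
publisher.on('streamDestroyed', function (event) {
    event.preventDefault();
});

toggleButton.addEventListener('click', function () {
    if (publishing) {
        clientSession.unpublish(publisher);
    } else {
        clientSession.publish(publisher, handleError);
    }
    publishing = !publishing;
});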
What also happens is that if you reuse a publisher, the publisher element remains on the page and the user can still see themselves, even while it is not streaming to the session. You could use CSS or DOM manipulation to hide the publisher, but the webcam light will remain on.
However, if you destroy and recreate the publisher each time, the publisher disappears from the page and the webcam light turns off while unpublished. Depending on the browser and the user's settings, they may be asked to permit access to their webcam again.
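A comparable sketch of the destroy-and-recreate approach, under the same assumptions:

toggleButton.addEventListener('click', function () {
    if (publisher) {
        // destroy() unpublishes, removes the element from the page and turns the camera off
        publisher.destroy();
        publisher = null;
    } else {
        publisher = OT.initPublisher(container, { insertMode: 'append' }, handleError);
        clientSession.publish(publisher, handleError);
    }
});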
I'm trying to integrate the Schedule component from Syncfusion. The component has a URL adaptor to connect to the controller: GetData() and Batch() for CRUD operations. Batch has a payload indicating what actions to perform; at the end, the Batch method requeries the database and sends back data identical to GetData().
Unfortunately, there is no built-in method to notify clients of anything going wrong - whether there is an exception, server-side validation kicks in or similar.
What I'd like to do is add a placeholder outside the component to receive and display server messages (be it a notification popup or whatever).
Since I can't influence the Ajax call itself, I was wondering if I had to get started with SignalR (still in beta for .Net Core 2 as far as I know), or if I may have missed something more obvious? I have read a lot about push notifications etc - but these are not quite what I'm after, it'd be slightly over the top I think.
To summarise, let's say I have
<div id="messages"></div>
<div id="component">HereGoesTheScheduleWhichICantDoMuchWith</div>
Now in the Batch() method, it would be great to call a SendMessage("Sorry, you can't do this") - the text of which would ideally then appear in the messages div.
How would you go about this?
I have now solved this, using SignalR (currently 1.0.0-alpha2-final) and for a nice view on the Client, PNotify.
Presently, it only works if the client is authenticated, if it needs to work anonymously you'd need to figure out a way to track SignalR's connection id.
On the page with the Syncfusion Schedule component, I connect to SignalR.
let connection = new signalR.HubConnection("/signalr", { transport: signalR.TransportType.ServerSentEvents });

// Show every server-sent notification as a PNotify popup
connection.on("Notify", (title, message) => {
    new PNotify({
        title: title,
        text: message
    });
});

connection.start();
The Hub (SignalRHub : Hub) creates a notification group for the user connecting:
public override Task OnConnectedAsync()
{
    // Put the connecting client into a group named after the user,
    // so the controller can target that user's connections
    Groups.AddAsync(Context.ConnectionId, Context.User.Identity.Name);
    return base.OnConnectedAsync();
}
The associated controller gets IHubContext<SignalRHub> signalRHub injected.
Now in the Batch-Method for the Syncfusion component, which returns Json and can't itself carry messages or notifications, you can notify the user:
_signalRHub.Clients.Group(User.Identity.Name).InvokeAsync("Notify", "A title", "A message");
In my particular case, I'm sending over an object to control layout, animation and popup duration for PNotify (e.g. longer for an exception to allow copy/paste etc) - as you please. Sending an object could be done using:
_signalRHub.Clients.Group(User.Identity.Name).InvokeAsync("Notify", JsonConvert.SerializeObject(new { title = "Some Title", message = "notification", type = "notice" }));
Obviously, connection.on("Notify"... needs to be changed accordingly.
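For example, the handler might become something like this (a sketch, assuming the serialized object arrives as a single string argument):

connection.on("Notify", payload => {
    const data = JSON.parse(payload);
    new PNotify({
        title: data.title,
        text: data.message,
        type: data.type
    });
});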
I hope this is clear enough and might help someone else.
I am new to Parse and I want to know if there is a way to schedule a background job that starts every 3 minutes and sends a message (an integer or something) to all users that are logged in at that moment. I could not find any help reading the guide. I hope someone can help me here.
I needed to push information to all logged-in users in several apps which were built with Parse.com.
None of the solutions introduced earlier by Emilio worked for us, because we needed to trigger live events for logged-in users only.
So we decided to work with PubNub within Cloud Code in Parse: http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
Our strategy is to open a "channel" available to all users; if a user is active (logged in), we push to this dedicated "channel" the information triggered by the app, creating new events or calls to action.
This is sample code to send information to a dedicated channel:
Parse.Cloud.define("sendPubNubMessage", function(request, response) {
    var message = JSON.stringify(request.params.message);
    var PubNubUrl;
    var PubNubKeys;

    Parse.Config.get().then(function(config) {
        PubNubKeys = config.get('PubNubkeys');
    }).then(function() {
        // Build the PubNub publish REST URL: /publish/pub_key/sub_key/0/channel/0/message
        PubNubUrl  = 'https://pubsub.pubnub.com/publish/';
        PubNubUrl += PubNubKeys.Publish_Key + '/';
        PubNubUrl += PubNubKeys.Subscribe_Key + '/0/';
        PubNubUrl += request.params.channel + '/0/';
        PubNubUrl += message;

        return Parse.Cloud.httpRequest({
            url: PubNubUrl,
            headers: {
                'Content-Type': 'application/json;charset=utf-8'
            }
        }).then(function(httpResponse) {
            return httpResponse;
        });
    }).then(function(httpResponse) {
        response.success(httpResponse.text);
    }, function(error) {
        response.error(error);
    });
});
This is another sample, used to send a message to a dedicated channel once something has changed on a specific class:
Parse.Cloud.afterSave("your_class", function(request) {
    if (!request.object.existed()) {
        // Only notify when a new object has been created
        Parse.Cloud.run('sendPubNubMessage', {
            'message': JSON.stringify({
                'collection': 'sample',
                'objectId': request.object.id
            }),
            'channel': 'all' // could be request.object.get('user').id
        });
    }
});
@Toucouleur is right in suggesting PubNub for your Parse project. PubNub acts essentially like an open socket between client and server so that the server can send messages to clients and vice versa. There are 70+ SDKs supported, including one for Windows Phone.
One approach for your problem would be to Subscribe all users to a Channel when they log in, and Unsubscribe from that Channel when they exit the app or time out.
When you want to send a message you can publish to a Channel and all users Subscribed will receive that message in < 1/4 second. PubNub makes sending those messages as Push Notifications really simple as well.
Another feature you may find useful is "Presence" which can give you realtime information about who is currently Subscribed to your "Channel".
If you think a code sample would help let me know!
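As a starting point, here is a rough sketch of that subscribe/publish flow with the PubNub JavaScript SDK (v3-style API; the key values and the channel name are placeholders, not something from your project):

var pubnub = PUBNUB({
    publish_key: 'your_pub_key',
    subscribe_key: 'your_sub_key'
});

// On login: subscribe the user to the shared "logged-in" channel
pubnub.subscribe({
    channel: 'logged_in_users',
    message: function (msg) {
        // Handle the broadcast, e.g. show it in the UI
        console.log('broadcast received:', msg);
    }
});

// From your scheduled job or server code: broadcast to everyone currently subscribed
pubnub.publish({
    channel: 'logged_in_users',
    message: { value: 42 }
});

// On logout or app exit: stop receiving broadcasts
pubnub.unsubscribe({ channel: 'logged_in_users' });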
Here's a few ideas I came up with.
Send a push notification to all users, but don't include alert text. No alert will be shown for users who have the app closed, and you can handle the notification in the App Delegate. Disadvantage: uses a lot of push notifications, and not all of them will be used.
When the app comes to the foreground, set a flag on the PFInstallation object marking the user as online; when it goes to the background, set the flag to false. Send the push notification to the installations that have the flag set to true. Disadvantages: if the app crashes, you would be sending notifications to users who are not online, and updating the installation twice per session increases your Parse request count.
Add a new property to the PFInstallation object where you store the last time the user did something; you can also update it on a 30s/1m timer while the app is open. Send the push notification to users that have been active in the last 30s/1m. Disadvantage: updating the PFInstallation every 30 seconds increases your Parse request count. More accuracy (a smaller interval) means more requests, and the longer your users' sessions, the more requests you will use. A rough sketch of this approach follows.
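For the third idea, a Cloud Code sketch could look like this (lastActiveAt is a hypothetical field the client keeps up to date on its installation, and the function name is also just illustrative):

Parse.Cloud.define("notifyActiveUsers", function(request, response) {
    // Installations active in the last minute
    var cutoff = new Date(Date.now() - 60 * 1000);

    var query = new Parse.Query(Parse.Installation);
    query.greaterThan("lastActiveAt", cutoff);

    Parse.Push.send({
        where: query,
        data: { value: request.params.value } // silent payload, no "alert" key
    }, {
        success: function() { response.success("sent"); },
        error: function(error) { response.error(error); }
    });
});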
I'm looking to develop a chat application with PubNub where I want to make sure all the chat messages that are sent are stored in the database, and I also want to be able to send messages in the chat.
I found out that I can use Parse with PubNub to provide storage options, but I'm not sure how to set those two up so that the messages and images sent in the chat are stored in the database.
Has anyone done this before with PubNub and Parse? Are there any other easy options available to use with PubNub instead of Parse?
Sutha,
What you are seeking is not a trivial solution unless you are talking about a limited number of end users. So I wouldn't say there are "easy" solutions, but there are solutions.
The reason is your server would need to listen (subscribe) to every chat channel that is active and store the messages being sent into your database. Imagine your app scaling to 1 million users (doesn't even need to get that big, but that number should help you realize how this can get tricky to scale where several server instances are listening to channels in a non-overlapping manner or with overlap but using a server queue implementation and de-duping messages).
That said, yes, there are PubNub customers that have implemented such a solution - Parse not being the key to making this happen, by the way.
You have three basic options for implementing this:
Implement a solution that will allow many instances of your server to subscribe to all of the channels as they become active and store the messages as they come in. There are a lot of details to making this happen so if you are not up to this then this is not likely where you want to go.
There is a way to monitor all channels that become active or inactive with PubNub Presence webhooks (enable Presence on your keys). You would use this to keep a list of all channels that your server would use to pull history (enable Storage & Playback on your keys) from in an on-demand (not completely realtime) fashion.
For every channel that goes active or inactive, your server will receive these events via a REST call (an endpoint that you implement on your server - your Parse server in this case):
channel active: record "start chat" timetoken in your Parse db
channel inactive: record "end chat" timetoken in your Parse db
the inactive event kicks off a process that uses the start/end timetokens you recorded for that channel to fetch its history from PubNub: pubnub.history({channel: channelName, start: startTT, end: endTT})
you will need to iterate on this history call until you receive fewer than 100 messages (100 is the max number of messages you can retrieve at a time); a sketch of this paging loop appears below
as you retrieve these messages you will save them to your Parse db
New Presence Webhooks have been added:
We now have webhooks for all presence events: join, leave, timeout, state-change.
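A minimal sketch of that history paging loop (v3 JavaScript SDK; channelName, startTT and endTT are the values recorded from the Presence events, and onPage is whatever function saves a batch to your Parse db):

function fetchChatHistory(channelName, startTT, endTT, onPage) {
    pubnub.history({
        channel: channelName,
        start: startTT,
        end: endTT,
        count: 100,
        callback: function (payload) {
            var messages = payload[0]; // the messages in this page
            var pageTT   = payload[1]; // timetoken to continue from

            onPage(messages); // e.g. save this batch to your Parse db

            if (messages.length === 100) {
                // A full page means there may be more: continue from the returned timetoken
                fetchChatHistory(channelName, pageTT, endTT, onPage);
            }
        }
    });
}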
Finally, you could just save each message to your Parse db on success of every pubnub.publish call. I am not a Parse expert and barely know all of its capabilities, but I believe they have some sort of store-locally-then-sync-to-cloud-db option (like StackMob when that was a product); even if not, you can save the message to the Parse cloud db directly.
The code would look something like this (not complete, likely with errors; figure it out or ask PubNub support for details) in your JavaScript client (in the browser).
var pubnub = PUBNUB({
    publish_key   : your_pub_key,
    subscribe_key : your_sub_key
});

var msg = ... // get the message from your UI text box or whatever

pubnub.publish({
    // this is some variable you set up when you enter a chat room
    channel: chat_channel,
    message: msg,
    callback: function(event){
        // DISCLAIMER: code pulled from a Parse example,
        // but there are some object creation details
        // left out here and the msg object is not
        // fully fleshed out in this sample code
        var ChatMessage = Parse.Object.extend("ChatMessage");
        var chatMsg = new ChatMessage();
        chatMsg.set("message", msg);
        chatMsg.set("user", uuid);
        chatMsg.set("channel", chat_channel);
        chatMsg.set("timetoken", event[2]);
        // this ChatMessage object can be
        // whatever you want it to be
        chatMsg.save();
    },
    error: function (error) {
        // Handle error here, like retry until success, for example
        console.log(JSON.stringify(error));
    }
});
You might even just store the entire set of publishes (on both ends of the conversation) based on a time interval, number of publishes or size of total data, but be careful: either user could exit the chat and the browser without notice and you would fail to save. So the per-publish save is probably best practice, if a bit noisy.
I hope you find one of these techniques as a means to get started in the right direction. There are details left out so I expect you will have follow up questions.
Just some other links that might be helpful:
http://blog.parse.com/learn/building-a-killer-webrtc-video-chat-app-using-pubnub-parse/
http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
https://www.pubnub.com/knowledge-base/discussion/293/how-do-i-publish-a-message-from-parse
And we have a PubNub Parse SDK, too. :)
We just upgraded our Heroku postgres database using the follower changeover method. We have over 50 dataclips attached to the old database, and now we need to move them over to the new database. However, doing them one by one will take a lot of time.
Is there a programmatic way to update the database a dataclip is attached to, perhaps with the CLI tools?
At least, once the old database has been deprovisioned, you can now (as of March 2016) reattach them to another database:
Go to https://dataclips.heroku.com/clips/recoverable. It will display your old database and a set of 'orphaned' dataclips and you can choose to transfer them to another database (in my case the promoted follower from the changeover).
Note that this only affects the dataclips that you created, it does not affect the dataclips one of your team members created and that you only had access to. So they will have to go through this process as well.
Official devcenter article: https://devcenter.heroku.com/articles/dataclips#dataclip-recovery
Thanks to Heroku CSRF measures, programmatically updating data clips is much more difficult than you might expect. You'll need to suck it up and start clicking buttons by hand, or beg their support team to do it for you, which is just as difficult.
There is no official support for programmatically moving the dataclips. That being said, you can script it out against their HTTP API.
The base URL is https://dataclips.heroku.com/api/v1/. There are three relevant endpoints:
clips /clips
resources (databases) /heroku_resources
move clip /clips/:slug/move
Find the slug of the clip you want to move, find the resource id of the new database, and make a post to the move clip endpoint:
POST /api/v1/clips/fjhwieufysdufnjqqueyuiewsr/move
Content-Type: application/json
{"heroku_resource_id":"resource123456789#heroku.com"}
I had over 300 dataclips to move. I used the following technique to update them all (essentially reverse engineering the dataclips API).
Open Chrome with Web Developer tools, Network tab.
Log into Heroku Dataclips
Observe the network call which returns all the dataclips, in JSON (https://dataclips.heroku.com/api/v1/clips). Take this response and extract out all dataclip slugs.
Update the database for one dataclip. Observe the network call which does this (https://dataclips.heroku.com/api/v1/clips/:slug/move). Right click, Copy as cURL. This is the easiest way to get all the correct parameters, since the API uses cookies for authentication.
Write a script that loops through each dataclip slug, and shells out to curl. In Ruby, this looks like:
slugs = <paste ids here>.split("\n")

slugs.each do |slug|
  command = %Q(curl -v 'https://dataclips.heroku.com/api/v1/clips/#{slug}/move' -H 'Cookie: ...' --data '{"heroku_resource_id":"resource1234567@heroku.com"}')
  puts command
  system(command)
end
You can contact Heroku support, and they will bulk transfer the dataclips to your new database for you.
Batch working on dataclips
I've finally found a solution to work on my dataclips as a batch, using the JavaScript console and some scraping techniques. I needed it to retrieve every dataclip, but I guess it can be adapted as such:
// Go to the dataclip listing (https://data.heroku.com/dataclips).
// Then execute this script in your console.
// Be careful, this will focus a new window every 4 seconds, preventing
// you from working 4 seconds times the number of dataclips you have.

// Retrieve urls and titles
let dataclips = Array
    .from(document.querySelectorAll('.rt-td:first-child a'))
    .map(el => ({ url: el.href, title: el.innerText }))

/**
 * Allows waiting for a given timeout before execution.
 * @param {number} milliseconds
 */
const timeout = function(milliseconds) {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve()
        }, milliseconds)
    })
}

/**
 * Here are all the changes you want to apply to every single
 * dataclip.
 * @param {object} window
 */
const applyChanges = function(window) {
}

// With a fast connection, 4 seconds is OK. Dial it up if you
// have errors.
const expectedLoadTime = 4000 // ms

// This is the main loop, windows are opened one by one to ensure focus and a
// correct loading time.
for (const dataclip of dataclips) {
    // This opens another window from the script, having access to its DOM.
    // See https://github.com/buonomo/kazoo for a funnier example usage!
    // And don't be shy to star and share :D
    const externWindow = window.open(dataclip.url)
    // A hack to wait for loading, this could be improved for sure.
    await timeout(expectedLoadTime)
    applyChanges(externWindow)
    externWindow.close()
}
You'd still have to implement applyChanges yourself, which I concede is a bit tedious and I don't have time to do it now (if someone does, please share!). But at least it can be done on all of your dataclips in a single function.
For an example usage of this script, you can take a look at the gist I made to scrape every dataclip and its related errors.
I have an express app running with Sequelize.js as an ORM. My express app receives requests from my main Rails app, and because of the cross-domain policy, these requests are performed with getJSON.
On the client, the request is fired when the user hits a key.
Everything goes fine and express logs the queries being performed (and the JSON being served) each time the user hits the key. Even hitting it quickly, it performs fine. But whenever I leave the key pressed (or maybe several clients hit the key very quickly), as it starts firing lots of requests, at some point the server just hangs: all requests from that point on are left pending (I see that in the Network tab of Chrome Dev Tools) and slowly start to time out. I have to reboot the server to make it respond again.
The server code for my request is:
models.Comment.findAllPublic(req.params.pId, req.params.sId, function(comments){
    var json = comments.map(function(comment){
        var com = {};
        ['user_id', 'user_avatar', 'user_slug', 'user_name', 'created_at', 'text', 'private', 'is_speaker_note'].forEach(function(key){
            com[key] = comment[key];
        });
        return com;
    });
    res.json({comments: json});
});
And the findAllPublic method from the Comment model (this is a Sequelize model) is:
findAllPublicAndMyNotes: function(current_user, presentationId, slideId, cb){
    db.query("SELECT * FROM `comments` WHERE commentable_type='Slide' AND commentable_id=" +
        "(SELECT id from `slides` where `order_in_presentation`=" + slideId +
        " AND `presentation_id`=" + presentationId +
        ") AND (`private` IS FALSE OR (`private` IS TRUE AND `user_id`=" + current_user +
        " AND `is_speaker_note` IS FALSE))", self.Comment)
        .on('success', cb)
        .on('failure', function(err){ console.log(err); });
}
How can I avoid the server getting stuck? Am I leaving some blocking code in the request that may slowly hang the server as new requests are made?
At first I thought it could be a problem because of the "forEach" when composing the json object from the Sequelize model, but I also tried leaving the callback for the mysql query empty, just responding empty json and it also got frozen.
Maybe it is a problem of the mysql connector? When the server gets stuck I can normally run the mysql console and perform queries on my database and it also responds, so I don't know if that's the problem.
I know I could just control the key event to prevent it from firing too many requests when the key gets pressed for a long time, but the problem seems to appear also when several clients hit the key repeatedly and concurrently.
Any thoughts? Thanks in advance for the help :D
Two things:
It seems like you have some path where res.json is never called. It could be that the database you're connecting to is dropping the connection to your Express server after the absurd number of requests, so the callback is never fired, and there's no handler (e.g. database.on('close', function() { /* handle the disconnect from the DB, perhaps auto-restarting */ })) to catch it; see the sketch at the end of this answer.
Your client-side code should detect when an AJAX request fired on keypress is still pending while a new one is being started, and cancel the old one. I'm guessing getJSON is a jQuery method? Assuming it's jQuery's, then you need something like the following:
var currKeyRequest = null;

function callOnKeyUp() {
    var searchText = $('#myInputBox').val();
    if (currKeyRequest) {
        // Cancel the previous in-flight request before starting a new one
        currKeyRequest.abort();
        currKeyRequest = null;
    }
    currKeyRequest = $.getJSON('path/to/server', function(json) {
        currKeyRequest = null;
        // Use JSON code
    });
}
This way, you reduce the load on the client, the latency of the autocomplete functionality (but why not use the jQuery UI autocomplete if that's what you're after?), and you can save the server from some of the load as well if the keypresses are faster than handshaking with the server (possible with a good touch-typist a few hours flight away).
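For the first point, here's a rough sketch of what a disconnect handler could look like. It assumes the underlying connection is the node mysql driver, which emits an 'error' event with code 'PROTOCOL_CONNECTION_LOST' when the server drops the connection; adapt it to whatever your version of Sequelize exposes:

var mysql = require('mysql');
var db;

function handleDisconnect() {
    db = mysql.createConnection({
        host: 'localhost',
        user: 'app',
        password: 'secret',
        database: 'app_db'
    });

    db.connect(function (err) {
        if (err) {
            // Retry after a short delay if the initial connect fails
            setTimeout(handleDisconnect, 2000);
        }
    });

    db.on('error', function (err) {
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            // The server closed the connection: reconnect instead of leaving
            // every later request hanging with no callback.
            handleDisconnect();
        } else {
            throw err;
        }
    });
}

handleDisconnect();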