How is the WebSocket pingInterval actually used in Dart?

I have been testing with a WebSocket and want to use pingInterval to determine whether the client has closed the connection, but I have been unable to detect that. I send a message over the WebSocket every 4 seconds. When I quit the browser, no error condition seems to be generated, and I have not been able to figure out how to detect that the WebSocket connection was closed. How can I detect that? I'm new to Dart and to web applications in general.

I tested it with SDK 1.5.0.dev:
Server code:
import 'dart:io';

main() {
  HttpServer.bind('127.0.0.1', 4040).then((server) {
    server.listen((HttpRequest request) {
      WebSocketTransformer.upgrade(request).then((socket) {
        socket.listen((msg) {
          // With pingInterval set, the server pings the client; if no pong
          // arrives within the interval, the socket is considered disconnected
          // and is closed.
          socket.pingInterval = new Duration(seconds: 1);
          print('server received message: $msg');
          socket.add('server received message: $msg');
        });
        socket.done.then((e) {
          print("WebSocket closed with: "
              "socket.closeReason: ${socket.closeReason}, "
              "socket.closeCode: ${socket.closeCode}");
        });
      });
    });
  });
}
Client code:
import 'dart:html';
import 'dart:async';

void main() {
  querySelector('button').onClick.first.then((e) {
    // Deliberate infinite loop: pressing "Hang" makes the client stop
    // responding, so the server's pings go unanswered.
    for (int i = 0; i > -1; i++) {
      print("epic code");
    }
  });
  WebSocket ws = new WebSocket('ws://127.0.0.1:4040');
  ws.onMessage.listen((MessageEvent e) {
    querySelector('#response').appendHtml('<p>${e.data}</p>');
  });
  Timer t = new Timer.periodic(new Duration(seconds: 1), (t) {
    ws.sendString('timer fired');
  });
}
html:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>ClientTest</title>
    <link rel="stylesheet" href="clienttest.css">
  </head>
  <body>
    <button type="button">Hang</button>
    <p>Response:</p>
    <div id="response"></div>
    <script type="application/dart" src="clienttest.dart"></script>
    <script src="packages/browser/dart.js"></script>
  </body>
</html>
For example, if you close the browser window, the client and server will close the socket with socket.closeReason: (an empty string) and socket.closeCode: 1005. Of course, you can provide your own reason and code if it wasn't a "sudden death" (see the CloseEvent codes).
But if you set pingInterval and press the Hang button, the server will close the socket on timeout, although with socket.closeReason: null and socket.closeCode: null. Without pingInterval it would keep waiting indefinitely.
Arguably the Dart team should provide something more informative than null, but you can also ping it yourself with Stream timeout:
Stream timeout(Duration timeLimit, {void onTimeout(EventSink sink)})
Creates a new stream with the same events as this stream.
Whenever more than timeLimit passes between two events from this
stream, the onTimeout function is called.
The countdown doesn't start until the returned stream is listened to.
The countdown is reset every time an event is forwarded from this
stream, or when the stream is paused and resumed.
The onTimeout function is called with one argument: an EventSink that
allows putting events into the returned stream. This EventSink is only
valid during the call to onTimeout.
If onTimeout is omitted, a timeout will just put a TimeoutException
into the error channel of the returned stream.
The returned stream is not a broadcast stream, even if this stream is.

Related

Is there a way to display a live webcam ArrayBuffer stream, sent to a WebSocket server, on a WebSocket client?

I am not interested in WebRTC; it doesn't work for my use case.
Here is my basic flow:
a. Open the web page -> stream live video frames from the webcam (MediaRecorder with the video/webm; codecs="vp8" codec) ->
b. Send the video frames to a WebSocket server (a Java EE WebSocket server in this case) ->
c. Broadcast the video frames to the subscribed WebSocket clients ->
d. Receive the video frames on the client web page ->
e. Play back the live video frames.
My primary issue is playing back the live video frames on the client side.
I have tried MSE (Media Source Extensions), but it didn't work smoothly: I ended up with flickering video, and if a client connects to the WebSocket after the stream has already started on the streamer page, I have to restart the streamer page to reinitialize and redisplay the stream on the client. I can log continuous ArrayBuffer data of video frames in the client-side browser console. (A minimal MSE playback sketch is shown after the streamer code below.)
Below is the streamer.html code:
<html>
    <head>
        <title>TODO supply a title</title>
        <meta charset="UTF-8">
        <meta name="viewport" content="width=device-width, initial-scale=1.0">
    </head>
    <body>
        <div>TODO write content</div>
        <script>
            navigator.mediaDevices.getUserMedia({video: true, audio: true}).then(stream => {
                var ws, mediaRecorder;
                var options = {
                    mimeType: 'video/webm; codecs="vp8"',
                    bitsPerSecond: 5000 // quality
                };
                var stateVal = '0';
                var socket;
                var counter = 0;

                function handleVideo() {
                    try {
                        mediaRecorder.stop();
                    } catch (e) {}
                    mediaRecorder = null;
                    mediaRecorder = new MediaRecorder(stream, options);
                    mediaRecorder.ondataavailable = function (e) {
                        if (e.data && e.data.size > 0) {
                            e.data.arrayBuffer().then(buffer => {
                                //const data = new Uint8Array(buffer);
                                //console.log(data);
                                socket.send(buffer);
                                //console.log('counter still counting to 100');
                                //ws.send(buffer)
                            });
                        }
                    };
                    mediaRecorder.start(200);
                }

                function connect() {
                    socket = new WebSocket("ws://localhost:8813/mainws/actions");
                    socket.binaryType = "arraybuffer";
                    socket.onopen = function (evt) { stateVal = 'open'; handleVideo(); console.log("Socket opened"); };
                    socket.onclose = function (evt) { console.log("Socket closed"); connect(); };
                    socket.onerror = function (evt) { console.log("Error: " + evt.data); };
                    socket.onmessage = function (evt) {
                        //console.log("new client connected");
                    };
                    //ws = new WebSocket("ws://localhost:8813/zyconnectws/actions")
                    //ws.binaryType = "arraybuffer"
                    //ws.onopen = handleVideo
                    //ws.onmessage = handleVideo
                    //ws.onclose = connect
                }
                connect();
            });
            // The onmessage handler is needed for when someone joins the stream again,
            // because the WebM format needs an EBML header at the start of the stream.
        </script>
    </body>
</html>
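For reference, here is a minimal sketch of the kind of MSE playback page that the WebM chunks above would feed into. This is only an illustration under assumptions: the WebSocket endpoint is the same one the streamer uses, the element id is made up, and it still suffers from the EBML-header issue mentioned above (the recorder has to be restarted so a new subscriber receives chunks that start with a header):

<video id="live" autoplay muted></video>
<script>
    // Minimal MSE playback sketch: feed ArrayBuffer chunks from the WebSocket
    // into a SourceBuffer. Assumes the incoming chunks begin with a WebM/EBML header.
    var mimeCodec = 'video/webm; codecs="vp8"';
    var video = document.getElementById('live');
    var mediaSource = new MediaSource();
    var queue = [];
    var sourceBuffer;

    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', function () {
        sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);

        // appendBuffer() is asynchronous, so drain queued chunks one at a time.
        sourceBuffer.addEventListener('updateend', function () {
            if (queue.length > 0 && !sourceBuffer.updating) {
                sourceBuffer.appendBuffer(queue.shift());
            }
        });

        var socket = new WebSocket("ws://localhost:8813/mainws/actions");
        socket.binaryType = "arraybuffer";
        socket.onmessage = function (evt) {
            if (sourceBuffer.updating || queue.length > 0) {
                queue.push(evt.data); // buffer while a previous append is in flight
            } else {
                sourceBuffer.appendBuffer(evt.data);
            }
        };
    });
</script>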
I recently got a recommendation from a colleague to try playing the stream on the client side with the jsmpeg.js library.
I don't know if I'm on the right path; any push in the right direction would be highly appreciated, thanks.
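For the jsmpeg route, a minimal client sketch would look roughly like the following. Note the assumptions: JSMpeg decodes MPEG-TS with MPEG1 video (and MP2 audio), not the WebM chunks produced above, so the server (or an ffmpeg relay) would have to provide such a stream; the endpoint path and canvas id here are hypothetical:

<canvas id="video-canvas"></canvas>
<script src="jsmpeg.min.js"></script>
<script>
    // Hypothetical WebSocket endpoint that relays an MPEG-TS stream.
    var player = new JSMpeg.Player('ws://localhost:8813/mainws/mpegts', {
        canvas: document.getElementById('video-canvas') // decoded frames are drawn here
    });
</script>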

Received an error when uploading a file to DoubleClick

Does anyone know what the error "You uploaded the wrong number of assets with Enabler components for this creative. The creative must have exactly 2 asset(s) with Enabler." means?
I'm assuming DoubleClick changed something on their end. I tried uploading an old creative and received the same error.
I am using Hype 3 to create my ad. Here is the script in the head of the file. I wonder if something has changed with the Enabler.
<head>
    <script src="https://s0.2mdn.net/ads/studio/Enabler.js"></script>
    <meta name="ad.size" content="width=1000,height=90">
    <script>
        // If true, start function. If false, listen for INIT.
        window.onload = function() {
            if (Enabler.isInitialized()) {
                enablerInitHandler();
            } else {
                Enabler.addEventListener(studio.events.StudioEvent.INIT, enablerInitHandler);
            }
        }
        function enablerInitHandler() {
            // Start ad, initialize animation,
            // load in your image assets, call Enabler methods,
            // and/or include other Studio modules.
            // Also, you can start the polite load here.
        }
        // If true, start function. If false, listen for VISIBLE.
        // So your pageLoadedHandler function will look like the following:
        function pageLoadedHandler() {
            if (Enabler.isVisible()) {
                adVisibilityHandler();
            } else {
                Enabler.addEventListener(studio.events.StudioEvent.VISIBLE,
                    adVisibilityHandler);
            }
        }
        function bgExitHandler1(e) {
            Enabler.exitOverride('Background Exit1', 'URL');
        }
        function exitClose(e) {
            Enabler.reportManualClose();
            Enabler.close();
        }
        document.getElementById('exit').addEventListener('click', bgExitHandler1, false);
        document.getElementById('close_btn').addEventListener('click', exitClose, false);
    </script>
</head>
I realized that the issue was caused by the fact that I chose the wrong format. I needed to choose 'interstitial' in order for it to work with my files.

How do you stop listening to a socket.io channel?

In socket.io, you can bind to a socket event/channel like this:
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost');
    socket.on('news', function (data) {
        console.log(data);
        socket.emit('my other event', { my: 'data' });
    });
</script>
But how do you stop listening to the "news" event?
Try socket.removeAllListeners("news");
The off function can also be used.
socket.off('news'); // stops listening to the "news" event
socket.off('news', myFunction); // useful if you have multiple listeners for the same event
socket.off(); // stops listening to all events
Sources
According to the Socket.io Client Documentation "the socket actually inherits every method of the Emitter class, like hasListeners, once or off (to remove an event listener)."
Emitter documentation:
Pass event and fn to remove a listener.
Pass event to remove all listeners on that event.
Pass nothing to remove all listeners on all events.
If you want to remove the listeners for only a specific event, you can try socket.removeListener('your_event');
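As a minimal sketch (the handler names are just for illustration), registering listeners as named functions makes it possible to remove one of them while keeping the others:

function newsLogger(data) {
    console.log('news:', data);
}
function newsNotifier(data) {
    // another listener on the same event
}

socket.on('news', newsLogger);
socket.on('news', newsNotifier);

// Later: remove only the logger; newsNotifier keeps receiving 'news' events.
socket.off('news', newsLogger);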

Firefox doesn't fire error event on <audio> tag or display fallback text

I'm using the <audio> tag to play audio files across a number of browsers.
var audioTag = document.createElement("audio"),
    sourceTag = document.createElement("source"),
    sorryTag = document.createElement("div");

sorryTag.innerHTML = "This filetype not supported";

audioTag.onerror = function() {
    //some error handling code
};
sourceTag.onerror = function() {
    //some error handling code
};

sourceTag.src = "myfile.mp3";
audioTag.appendChild(sourceTag);
audioTag.appendChild(sorryTag);
//add audioTag to DOM
This leads to:
<audio>
    <source src='myfile.mp3' />
    <div>This filetype not supported</div>
</audio>
Firefox can't play MP3 files, and I'm OK with that. Mozilla also promises that an error event will be dispatched if the <audio> or <video> tag can't play the media, and that the browser will go through the tags nested inside the media tag one by one (<source> or others, the last presumably being an error message) until it finds one it can work with. Neither seems to happen for me: the error event is never fired on the elements, nor is the error message displayed. What am I doing wrong?
The workaround I found was:
var audioTag = document.createElement("audio"),
    sourceTag = document.createElement("source");

//Add error event listeners for browsers other than FF
audioTag.onerror = function() {
    console.log("file can't be played. error from audio tag");
};
sourceTag.onerror = function() {
    console.log("file can't be played. error from source tag");
};

//The only way to tell that the file failed to play on FF.
//The timeout is because audioTag.networkState === audioTag.NETWORK_NO_SOURCE
//on IE till it starts downloading the file.
setTimeout(function() {
    if (audioTag.networkState === audioTag.NETWORK_NO_SOURCE) {
        console.log("this hack is only for <audio> on FF.");
        console.log("Not for <video> and on no other browsers");
    }
}, 3000);

sourceTag.src = "<file_url>";
audioTag.appendChild(sourceTag);
Basically: create the media and source tags, add error handlers, then append the source tag to the media tag; if the error event fires, you know the file is unplayable.
On FF, the error event doesn't fire, so you have to rely on the networkState flag of the <audio> element and compare it to NETWORK_NO_SOURCE. You can't inspect it immediately after setting the src attribute of the <source> element, because on IE networkState === NETWORK_NO_SOURCE until the browser actually starts downloading the file. For this reason, set a timeout of about 3 seconds (it's not an exact science) before checking the flag value; that gives IE a good chance to determine whether it can play the file.
UPDATE
Wrote a test case for this: http://jogjayr.github.com/FF-Audio-Test-Case/ but the error event fires OK there. Guess I was wrong; either that, or it was broken on FF14 (which I was using at the time), because the error event fires OK in my application too. Thanks @BorisZbarsky

IE Ajax streaming/long-polling without flooding the server

I'm trying to work out how to do streaming for IE, and long-polling, without flooding the server. Here is what I had in mind.
I'll have a servlet called TimeServlet.
In doGet() or doPost() I'll suspend the request and send the time every second:
....
suspend()
while (!stopped) {
    request.writeln(new Date().toString());
}
or with a Scheduler and Runnable, but you get the point.
On the client, in JavaScript, I'll create an Ajax connection.
My big questions are:
1 - How do I do streaming with IE? With Firefox and Chrome I read the data when readyState == 3, but in IE the data is only available at readyState == 4.
2 - How can I do long-polling in this example? Long-polling blocks until the server has data to push, but in this example the server will always have something to push, so the client would end up doing something like while(true) and flood the server. I suppose I have to do something like this: the server pushes null, the client reads on readyState == 4, and after that setTimeout(resendRequest, 1000); // 1 sec? (See the sketch below.)
Does someone have a sample like that?
My code works fine on FF and Chrome, but now I'm looking at IE and Opera.
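As a rough sketch of the polling loop described in question 2 (the servlet URL and the handleData function are assumptions, and the server is expected to hold the request open until it has something to send):

// Hypothetical long-polling loop: only one request is outstanding at a time,
// and a new one is issued shortly after the previous one completes.
function handleData(text) {
    console.log(text); // placeholder for real processing
}

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/TimeServlet", true); // assumed servlet URL
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {           // wait for completion, which also works on IE
            if (xhr.status === 200) {
                handleData(xhr.responseText);
            }
            setTimeout(poll, 1000);           // small delay before re-polling (1 sec)
        }
    };
    xhr.send();
}
poll();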
EDIT
I found that I could use XDomainRequest in IE for streaming. You have to have this in your server code:
response.setHeader("Access-Control-Allow-Origin","*");
I won't accept an answer yet, because I don't know how to detect that the connection has completed.
With Ajax it was easy: readyState == 4. But I don't know how to do it with XDomainRequest.
I need to be able to trigger a JavaScript callback when the connection is closed. Any ideas?
I found out how to detect the close event: you have to use the onload handler.
So the code will look like this:
var ajaxRequest = new XDomainRequest();
ajaxRequest.onload = function() {
    //alert("[XDR-onload]. responseText: " + ajaxRequest.responseText + "");
};
ajaxRequest.onerror = function() {
    alert("[XDR-onerror] Fatal Error.");
};
ajaxRequest.ontimeout = function() {
    alert("[XDR-ontimeout] Timeout Error.");
};
ajaxRequest.onprogress = function() {
    //alert("[XDR-onprogress] responseText so far: " + ajaxRequest.responseText + "");
};
And don't forget to add the header to the response (server side):
response.setHeader("Access-Control-Allow-Origin","*");
