Unicode receiving with ASP.NET Core WebSocket

I'm trying to build a WebSocket server using the ASP.NET Core 1.1 WebSocket middleware that can handle text messages. My strategy is to use a fixed-size buffer and keep reading and decoding until the WebSocket message ends.
The middleware setup goes like this:
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole(Configuration.GetSection("Logging"));
    loggerFactory.AddDebug();
    var logger = loggerFactory.CreateLogger("websocket");
    app.UseWebSockets();
    app.Use(async (http, next) =>
    {
        if (!http.WebSockets.IsWebSocketRequest)
        {
            await next();
            return;
        }
        var websocket = await http.WebSockets.AcceptWebSocketAsync();
        while (websocket.State == WebSocketState.Open)
        {
            var buffer = new ArraySegment<byte>(new byte[32]);
            var charbuffer = new char[32];
            try
            {
                var sb = new StringBuilder();
                var decoder = Encoding.UTF8.GetDecoder();
                //aha, we got a message
                var detectResult = await websocket.ReceiveAsync(buffer, CancellationToken.None);
                var receiveResult = detectResult;
                while (!receiveResult.EndOfMessage)
                {
                    var charLen = decoder.GetChars(buffer.Array, 0, receiveResult.Count, charbuffer, 0);
                    logger.LogInformation($"Decoded {charLen} byte(s) from wire");
                    sb.Append(charbuffer, 0, charLen);
                    receiveResult = await websocket.ReceiveAsync(buffer, CancellationToken.None);
                }
                var charLenFinal = decoder.GetChars(buffer.Array, 0, receiveResult.Count, charbuffer, 0);
                logger.LogInformation($"Decoded {charLenFinal} byte(s) from wire");
                sb.Append(charbuffer, 0, charLenFinal);
                var message = sb.ToString();
                logger.LogInformation($"decoded message: {message}");
                await websocket.SendAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes("got it")), WebSocketMessageType.Text, true, CancellationToken.None);
            }
            catch (Exception ex)
            {
                logger.LogError(ex.Message);
                logger.LogError(ex.InnerException?.Message ?? string.Empty);
            }
        }
    });
}
Now the code works well for text that includes only ASCII characters. But when I try to send a Unicode text message (Vietnamese) whose length is longer than the buffer size, an exception occurs:
fail: websocket[0]
The remote party closed the WebSocket connection without completing the close handshake.
fail: websocket[0]
at System.Net.WebSockets.ManagedWebSocket.<ReceiveAsyncPrivate>d__60.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at MvcApp.Startup.<>c__DisplayClass5_0.<<Configure>b__0>d.MoveNext()
The exception occurs at the line
var detectResult = await websocket.ReceiveAsync(buffer, CancellationToken.None);
What could be the reason for this error?

I think this:
var charLenFinal = decoder.GetChars(buffer.Array, 0, receiveResult.Count, charbuffer, 0);
should be:
var charLenFinal = decoder.GetChars(buffer.Array, 0, receiveResult.Count, charbuffer, 0, true);
since:
https://msdn.microsoft.com/en-us/library/125z2etb(v=vs.110).aspx
Remember that the Decoder object saves state between calls to
GetChars. When the application is done with a stream of data, it
should set the flush parameter to true to make sure that the state
information is flushed. With this setting, the decoder ignores invalid
bytes at the end of the data block and clears the internal buffer.
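For reference, here is a minimal sketch of the same receive loop with the flush parameter applied only on the final frame (it reuses the question's websocket variable and 32-byte buffer; this illustrates the quoted guidance and is not tested against the original setup):
var buffer = new ArraySegment<byte>(new byte[32]);
// GetMaxCharCount sizes the char buffer for the worst case, including a
// character carried over from the previous frame.
var charbuffer = new char[Encoding.UTF8.GetMaxCharCount(32)];
var sb = new StringBuilder();
var decoder = Encoding.UTF8.GetDecoder();
WebSocketReceiveResult receiveResult;
do
{
    receiveResult = await websocket.ReceiveAsync(buffer, CancellationToken.None);
    // Flush only on the last frame; otherwise the decoder keeps any partial
    // UTF-8 sequence and completes it with the bytes of the next frame.
    var charLen = decoder.GetChars(buffer.Array, 0, receiveResult.Count, charbuffer, 0, flush: receiveResult.EndOfMessage);
    sb.Append(charbuffer, 0, charLen);
} while (!receiveResult.EndOfMessage);
var message = sb.ToString();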

Related

Custom google cast receiver stuck in "Load is in progress"

My custom v3 CAF receiver app is successfully playing the first few live & VOD assets. After that, it gets into a state where media commands are being queued because "Load is in progress". It is still (successfully) fetching manifests, but MEDIA_STATUS remains "buffering". The log then shows:
[ 4.537s] [cast.receiver.MediaManager] Load is in progress, media command is being queued.
[ 5.893s] [cast.receiver.MediaManager] Buffering state changed, isPlayerBuffering: true old time: 0 current time: 0
[ 5.897s] [cast.receiver.MediaManager] Sending broadcast status message
CastContext Core event: {"type":"MEDIA_STATUS","mediaStatus":{"mediaSessionId":1,"playbackRate":1,"playerState":"BUFFERING","currentTime":0,"supportedMediaCommands":12303,"volume":{"level":1,"muted":false},"currentItemId":1,"repeatMode":"REPEAT_OFF","liveSeekableRange":{"start":0,"end":20.000999927520752,"isMovingWindow":true,"isLiveDone":false}}}
CastContext MEDIA_STATUS event: {"type":"MEDIA_STATUS","mediaStatus":{"mediaSessionId":1,"playbackRate":1,"playerState":"BUFFERING","currentTime":0,"supportedMediaCommands":12303,"volume":{"level":1,"muted":false},"currentItemId":1,"repeatMode":"REPEAT_OFF","liveSeekableRange":{"start":0,"end":20.000999927520752,"isMovingWindow":true,"isLiveDone":false}}}
Fetch finished loading: GET "(manifest url)".
No errors are shown.
Even after closing and restarting the cast session, the issue remains. The cast device itself has to be rebooted to resolve it. It looks like data is kept between sessions.
It could be important to note that the cast receiver app is not published yet. It is hosted on a local network.
My questions are:
What could be the cause of this stuck behavior?
Is there any session data kept between sessions?
How can I fully reset the cast receiver app without having to restart the cast device?
The receiver app itself is very basic. Other than license wrapping it resembles the vanilla example app:
const { cast } = window;
const TAG = "CastContext";
class CastStore {
static instance = null;
error = observable.box();
framerate = observable.box();
static getInstance() {
if (!CastStore.instance) {
CastStore.instance = new CastStore();
}
return CastStore.instance;
}
get debugLog() {
return this.framerate.get();
}
get errorLog() {
return this.error.get();
}
init() {
const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();
playerManager.addEventListener(
cast.framework.events.category.CORE,
event => {
console.log(TAG, "Core event: " + JSON.stringify(event));
}
);
playerManager.addEventListener(
cast.framework.events.EventType.MEDIA_STATUS,
event => {
console.log(TAG, "MEDIA_STATUS event: " + JSON.stringify(event));
}
);
playerManager.addEventListener(
cast.framework.events.EventType.BITRATE_CHANGED,
event => {
console.log(TAG, "BITRATE_CHANGED event: " + JSON.stringify(event));
runInAction(() => {
this.framerate.set(`bitrate: ${event.totalBitrate}`);
});
}
);
playerManager.addEventListener(
cast.framework.events.EventType.ERROR,
event => {
console.log(TAG, "ERROR event: " + JSON.stringify(event));
runInAction(() => {
this.error.set(`Error detailedErrorCode: ${event.detailedErrorCode}`);
});
}
);
// intercept the LOAD request to be able to read in a contentId and get data.
this.loadHandler = new LoadHandler();
playerManager.setMessageInterceptor(
cast.framework.messages.MessageType.LOAD,
loadRequestData => {
this.framerate.set(null);
this.error.set(null);
console.log(TAG, "LOAD message: " + JSON.stringify(loadRequestData));
if (!loadRequestData.media) {
const error = new cast.framework.messages.ErrorData(
cast.framework.messages.ErrorType.LOAD_CANCELLED
);
error.reason = cast.framework.messages.ErrorReason.INVALID_PARAM;
return error;
}
if (!loadRequestData.media.entity) {
// Copy the value from contentId for legacy reasons if needed
loadRequestData.media.entity = loadRequestData.media.contentId;
}
// notify loadMedia
this.loadHandler.onLoadMedia(loadRequestData, playerManager);
return loadRequestData;
}
);
const playbackConfig = new cast.framework.PlaybackConfig();
// intercept license requests & responses
playbackConfig.licenseRequestHandler = requestInfo => {
const challenge = requestInfo.content;
const { castToken } = this.loadHandler;
const wrappedRequest = DrmLicenseHelper.wrapLicenseRequest(
challenge,
castToken
);
requestInfo.content = wrappedRequest;
return requestInfo;
};
playbackConfig.licenseHandler = license => {
const unwrappedLicense = DrmLicenseHelper.unwrapLicenseResponse(license);
return unwrappedLicense;
};
// Duration of buffered media in seconds to start/resume playback after auto-paused due to buffering; default is 10.
playbackConfig.autoResumeDuration = 4;
// Minimum number of buffered segments to start/resume playback.
playbackConfig.initialBandwidth = 1200000;
context.start({
touchScreenOptimizedApp: true,
playbackConfig: playbackConfig,
supportedCommands: cast.framework.messages.Command.ALL_BASIC_MEDIA
});
}
}
The LoadHandler optionally adds a proxy (I'm using a cors-anywhere proxy to remove the origin header), and stores the castToken for licenseRequests:
class LoadHandler {
CORS_USE_PROXY = true;
CORS_PROXY = "http://192.168.0.127:8003";
castToken = null;
onLoadMedia(loadRequestData, playerManager) {
if (!loadRequestData) {
return;
}
const { media } = loadRequestData;
// disable cors for local testing
if (this.CORS_USE_PROXY) {
media.contentId = `${this.CORS_PROXY}/${media.contentId}`;
}
const { customData } = media;
if (customData) {
const { licenseUrl, castToken } = customData;
// install cast token
this.castToken = castToken;
// handle license URL
if (licenseUrl) {
const playbackConfig = playerManager.getPlaybackConfig();
playbackConfig.licenseUrl = licenseUrl;
const { contentType } = loadRequestData.media;
// Dash: "application/dash+xml"
playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;
// disable cors for local testing
if (this.CORS_USE_PROXY) {
playbackConfig.licenseUrl = `${this.CORS_PROXY}/${licenseUrl}`;
}
}
}
}
}
The DrmHelper wraps the license request to add the castToken and base64-encodes the result. The license response is base64-decoded and unwrapped later on:
export default class DrmLicenseHelper {
static wrapLicenseRequest(challenge, castToken) {
const wrapped = {};
wrapped.AuthToken = castToken;
wrapped.Payload = fromByteArray(new Uint8Array(challenge));
const wrappedJson = JSON.stringify(wrapped);
const wrappedLicenseRequest = fromByteArray(
new TextEncoder().encode(wrappedJson)
);
return wrappedLicenseRequest;
}
static unwrapLicenseResponse(license) {
try {
const responseString = String.fromCharCode.apply(String, license);
const responseJson = JSON.parse(responseString);
const rawLicenseBase64 = responseJson.license;
const decodedLicense = toByteArray(rawLicenseBase64);
return decodedLicense;
} catch (e) {
return license;
}
}
}
The handler for cast.framework.messages.MessageType.LOAD should always return one of:
the (possibly modified) loadRequestData,
a promise for the (possibly modified) loadRequestData, or
null to discard the load request (I'm not 100% sure this works for load requests).
If you do not do this, the load request stays in the queue and any new request is queued after the initial one.
In your handler, you return an error if !loadRequestData.media, which will get you into that state. Another possibility is an exception thrown in the load request handler, which will also get you into that state.
I guess we have a different approach and send everything possible through sendMessage; when we load something we create a new cast.framework.messages.LoadRequestData() which we load with playerManager.load(loadRequest).
But I guess that you might be testing this on an integrated Chromecast; we see these problems as well!?
I suggest that you do one or more of the following:
Enable gzip compression on all responses!!!
Stop playback with playerManager.stop() (maybe in the interceptor?)
Change how the licenseUrl is set
How we set licenseUrl
playerManager.setMediaPlaybackInfoHandler((loadRequestData, playbackConfig) => {
    playbackConfig.licenseUrl = loadRequestData.customData.licenseUrl;
    return playbackConfig;
});

Create a TCP server in a Windows Universal App (JavaScript) with an Android client

I want to create a TCP server in C# and use it in a Universal App JavaScript-based project, so I created the following code (Server):
//C# Windows Runtime Component
public sealed class Server
{
    public Server()
    {
        Debug.WriteLine("Server...");
    }
    public async void Connection()
    {
        IPAddress ip = IPAddress.Parse("192.168.0.10");
        TcpListener server = new TcpListener(ip, portNumber);
        TcpClient client = default(TcpClient);
        try
        {
            server.Start();
            Debug.WriteLine("Server started ... " + ip.ToString());
        }
        catch (Exception e)
        {
            Debug.WriteLine(e.ToString());
        }
        while (true)
        {
            client = await server.AcceptTcpClientAsync();
            byte[] recievedBuffer = new byte[100];
            NetworkStream stream = client.GetStream();
            stream.Read(recievedBuffer, 0, recievedBuffer.Length);
            string msg = Encoding.UTF8.GetString(recievedBuffer, 0, recievedBuffer.Length);
            Debug.WriteLine(msg);
        }
    }
}
//in HTML
<script>
    console.log("test");
    var server = new Server.Server();
    server.connection();
    console.log("msg");
</script>
I don't know why the Debug.WriteLine and console.log calls don't work; nothing is printed in the output window or in the JavaScript console.
The server code works with the Android client if the server is a "Console App" project, but in the "Universal App JavaScript" project nothing happens, and I get no warning or error.
So I don't know whether I'm doing something wrong, because console.log and Debug.WriteLine don't work.
I have a solution that works with the Windows Universal App: I removed Connection and added the following methods:
public async void StartServer()
{
    try
    {
        var streamSocketListener = new Windows.Networking.Sockets.StreamSocketListener();
        streamSocketListener.ConnectionReceived += this.StreamSocketListener_ConnectionReceived;
        await streamSocketListener.BindEndpointAsync(new HostName("192.168.0.10"), PortNumber);
    }
    catch (Exception ex) { }
}
private async void StreamSocketListener_ConnectionReceived(Windows.Networking.Sockets.StreamSocketListener sender, Windows.Networking.Sockets.StreamSocketListenerConnectionReceivedEventArgs args)
{
    string request;
    using (var streamReader = new StreamReader(args.Socket.InputStream.AsStreamForRead()))
    {
        request = await streamReader.ReadLineAsync();
    }
    sender.Dispose();
}
//in main.js just call this method
new Server.Server().startServer();
But I still don't know why Debug.WriteLine() in C# and console.log() in JavaScript don't work.
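As an aside, one way to confirm the listener is actually receiving data without relying on Debug.WriteLine is to echo the line back over the socket and check it on the Android client. A rough sketch of such a handler, assuming the same StreamSocketListener setup as above (the echo behaviour is an illustration, not part of the original post):
private async void StreamSocketListener_ConnectionReceived(
    Windows.Networking.Sockets.StreamSocketListener sender,
    Windows.Networking.Sockets.StreamSocketListenerConnectionReceivedEventArgs args)
{
    using (var reader = new StreamReader(args.Socket.InputStream.AsStreamForRead()))
    using (var writer = new StreamWriter(args.Socket.OutputStream.AsStreamForWrite()))
    {
        var request = await reader.ReadLineAsync();
        // Echo the request back so the Android client can confirm delivery
        // even when nothing shows up in the debug output.
        await writer.WriteLineAsync("echo: " + request);
        await writer.FlushAsync();
    }
}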

Dart - WebSocket.readyState always returns WebSocket.OPEN

Basically, I'm trying to check the status of my WebSocket Server.ws. However, when I query Server.ws.readyState, the only response I ever get is WebSocket.OPEN. How do I check if a WebSocket is disconnected if it always returns WebSocket.OPEN?
For example, I've tried turning off the WiFi on the device used to test the Flutter app. Normally, after one second, the WebSocket is assumed disconnected and the connection is closed with a WebSocketStatus.GOING_AWAY close code. I assumed it would also change WebSocket.readyState, but that doesn't seem to be the case.
So, how do I properly check the status of my WebSocket?
How I'm currently checking:
/// Connection status
IconButton _status() {
IconData iconData;
switch (Server.ws?.readyState) {
case WebSocket.CONNECTING:
print("readyState : CONNECTING");
iconData = Icons.wifi;
break;
case WebSocket.OPEN:
print("readyState : OPEN");
iconData = Icons.signal_wifi_4_bar;
break;
case WebSocket.CLOSING:
print("readyState : CLOSING");
iconData = Icons.signal_wifi_4_bar_lock;
break;
case WebSocket.CLOSED:
print("readyState : CLOSED");
iconData = Icons.warning;
break;
default:
print("readyState : " + Server.ws.readyState.toString());
break;
}
return new IconButton(
icon: new Icon(iconData),
tooltip: 'Connection Status', // TODO:Localize
onPressed: () {
setState(() {
Server.ws.close();
});
},
);
}
Additional info about the WebSocket:
/// Should be called when the IP is validated
void startSocket() {
try {
WebSocket.connect(Server.qr).then((socket) {
// Build WebSocket
Server.ws = socket;
Server.ws.listen(
handleData,
onError: handleError,
onDone: handleDone,
cancelOnError: true,
);
Server.ws.pingInterval = new Duration(
seconds: Globals.map["PingInterval"],
);
send(
"CONNECTION",
{
"deviceID": Globals.map["UUID"],
},
);
});
} catch (e) {
print("Error opening a WebSocket : $e");
}
}
/// Handles the closing of the connection.
void handleDone() {
print("WebSocket closed.");
new Timer(new Duration(seconds: Globals.map["PingInterval"]), startSocket);
}
/// Handles the WebSocket's errors.
void handleError(Error e) {
print("WebSocket error.");
print(e);
Server.ws.close();
}
I've gone ahead and taken a look at the source code for the WebSocket implementation. It appears that when the WebSocket is being closed with the status GOING_AWAY, the internal socket stream is being closed. However, it is possible that this event does not propagate to the transformed stream which handles the readyState of the instance. I would recommend filing a bug report at dartbug.com.
Try setting the pingInterval; this checks the connection status every said interval, and then the closeCode will update.

Await call to web service is stopping the execution flow of the program

I have the following code:
public async Task IntiateDataFetchingProcess(string[] args)
{
    try
    {
        ProcessArgs(args);
        Log.Information("Run Mode: {RunModeID}", RunModeID);
        switch (RunModeID)
        {
            case RunModeType.A:
                await MethodAAsync();
                break;
            case RunModeType.B:
                await MethodBAsync();
                break;
            case RunModeType.C:
                TestMethod();
                break;
            default:
                break;
        }
    }
    catch (Exception ex)
    {
        throw;
    }
}
private async Task MethodBAsync()
{
    Console.WriteLine(DateTime.Now.ToLongTimeString());
    // Call to webservice to get the data
    var response = await _service.GetDataAsync(input1, request);
    Console.WriteLine(DateTime.Now.ToLongTimeString());
}
While debugging, I found that execution reaches the following line (in MethodBAsync) and stops there:
var response = await _service.GetDataAsync(input1, request);
Can anyone help me figure out what I am missing here?
Ah, your code is getting deadlocked!
You just need to add .ConfigureAwait(false) to each call that you await.
Example:
var response = await _service.GetDataAsync(input1, request);
becomes
var response = await _service.GetDataAsync(input1, request).ConfigureAwait(false);
For more information on .ConfigureAwait(), Stephen Cleary wrote an awesome post on it.
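For illustration, here is the pattern this deadlock advice is aimed at; the blocking caller shown in the comment is hypothetical and not taken from the question:
// Hypothetical caller: blocking on the returned Task from a context that has a
// SynchronizationContext (UI thread, classic ASP.NET request) holds that
// context's only thread, e.g.:
//     fetcher.IntiateDataFetchingProcess(args).Wait();
//
// Inside MethodBAsync, the continuation after the await tries to resume on that
// same captured context, which is still blocked by .Wait() above -> deadlock.
// ConfigureAwait(false) lets the continuation run on a thread-pool thread instead:
var response = await _service.GetDataAsync(input1, request).ConfigureAwait(false);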

PushStreamContent stream does not flush under load

I am using PushStreamContent to keep a persistent connection to each client. Pushing short heartbeat messages to each client stream every 20 seconds works great with 100 clients, but at about 200 clients the heartbeats first arrive a few seconds late, and then stop showing up at all.
My controller code is
// Based loosely on https://aspnetwebstack.codeplex.com/discussions/359056
// and http://blogs.msdn.com/b/henrikn/archive/2012/04/23/using-cookies-with-asp-net-web-api.aspx
public class LiveController : ApiController
{
public HttpResponseMessage Get(HttpRequestMessage request)
{
if (_timer == null)
{
// 20 second timer
_timer = new Timer(TimerCallback, this, 20000, 20000);
}
// Get '?clientid=xxx'
HttpResponseMessage response = request.CreateResponse();
var kvp = request.GetQueryNameValuePairs().Where(q => q.Key.ToLower() == "clientid").FirstOrDefault();
string clientId = kvp.Value;
HttpContext.Current.Response.ClientDisconnectedToken.Register(
delegate(object obj)
{
// Client has cleanly disconnected
var disconnectedClientId = (string)obj;
CloseStreamFor(disconnectedClientId);
}
, clientId);
response.Content = new PushStreamContent(
delegate(Stream stream, HttpContent content, TransportContext context)
{
SaveStreamFor(clientId, stream);
}
, "text/event-stream");
return response;
}
private static void CloseStreamFor(string clientId)
{
Stream oldStream;
_streams.TryRemove(clientId, out oldStream);
if (oldStream != null)
oldStream.Close();
}
private static void SaveStreamFor(string clientId, Stream stream)
{
_streams.TryAdd(clientId, stream);
}
private static void TimerCallback(object obj)
{
DateTime start = DateTime.Now;
// Disable timer
_timer.Change(Timeout.Infinite, Timeout.Infinite);
// Every 20 seconds, send a heartbeat to each client
var recipients = _streams.ToArray();
foreach (var kvp in recipients)
{
string clientId = kvp.Key;
var stream = kvp.Value;
try
{
// ***
// Adding this Trace statement and running in debugger caused
// heartbeats to be reliably flushed!
// ***
Trace.WriteLine(string.Format("** {0}: Timercallback: {1}", DateTime.Now.ToString("G"), clientId));
WriteHeartBeat(stream);
}
catch (Exception ex)
{
CloseStreamFor(clientId);
}
}
// Trace... (this trace statement had no effect)
_timer.Change(20000, 20000); // re-enable timer
}
private static void WriteHeartBeat(Stream stream)
{
WriteStream(stream, "event:heartbeat\ndata:-\n\n");
}
private static void WriteStream(Stream stream, string data)
{
byte[] arr = Encoding.ASCII.GetBytes(data);
stream.Write(arr, 0, arr.Length);
stream.Flush();
}
private static readonly ConcurrentDictionary<string, Stream> _streams = new ConcurrentDictionary<string, Stream>();
private static Timer _timer;
}
Could there be some ASP.NET or IIS setting that affects this? I am running on Windows Server 2008 R2.
UPDATE:
Heartbeats are reliably sent if 1) the Trace.WriteLine statement is added, and 2) the Visual Studio 2013 debugger is attached, debugging and capturing the Trace.WriteLines.
Both of these are necessary: if the Trace.WriteLine is removed, running under the debugger has no effect, and if the Trace.WriteLine is there but the program is not running under the debugger (with SysInternals' DbgView showing the trace messages instead), the heartbeats are unreliable.
UPDATE 2:
Two support incidents with Microsoft later, here are the conclusions:
1) The delays with 200 clients were resolved by using a business-class Internet connection instead of a home connection.
2) Whether the debugger is attached or not really doesn't make any difference.
3) The following two additions to web.config are required to ensure heartbeats are sent in a timely manner, and to make failed heartbeats caused by a client disconnecting "uncleanly" (e.g. by unplugging the computer rather than closing the program normally, which cleanly issues a TCP RST) trigger a timely ClientDisconnected callback as well:
<httpRuntime executionTimeout="5" />
<serverRuntime appConcurrentRequestLimit="50000" uploadReadAheadSize="1" frequentHitThreshold="2147483647" />
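For context, the two elements belong to different configuration sections; a sketch of where they would typically sit in web.config (the attribute values are the ones from the conclusions above):
<configuration>
  <system.web>
    <!-- executionTimeout is only enforced when compilation debug="false" -->
    <httpRuntime executionTimeout="5" />
  </system.web>
  <system.webServer>
    <!-- serverRuntime may need to be unlocked in applicationHost.config before it can be set here -->
    <serverRuntime appConcurrentRequestLimit="50000" uploadReadAheadSize="1" frequentHitThreshold="2147483647" />
  </system.webServer>
</configuration>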
