Can you provide information about WebSocket?

I am doing research on WebSocket in the world of IoT, but the information I have is quite limited. I would welcome suggestions; if you can share information about WebSocket, thank you.
I have read several papers about IoT, including one on the application of WebSocket in a queuing system and a comparative analysis of the performance of XBee and WebSocket.

WebSocket is a communications protocol that provides a full-duplex communication channel over a single TCP connection. WebSocket is a suitable protocol for IoT environments, since it offers lightweight communication between server and client, and bundles of data can be transmitted continually between multiple devices. For this we need a server with a WebSocket library installed, and a WebSocket client or a web browser that supports WebSocket on the device.
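As a minimal sketch of that setup (assuming Node.js with the 'ws' package on the server; the port and message shape are illustrative), a device can push readings over a single TCP connection and the server can relay them to any other connected client:

```typescript
import { WebSocketServer, WebSocket } from "ws";

// Server: accepts WebSocket connections from devices and dashboards.
const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    // Relay each reading to every other connected client.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});

// Client: any device or browser that supports WebSocket uses the same API.
const ws = new WebSocket("ws://localhost:8080");
ws.onopen = () => ws.send(JSON.stringify({ deviceId: "sensor-1", temperature: 21.5 }));
ws.onmessage = (event) => console.log("received:", event.data);
```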
There are both advantages and caveats to using WebSockets with IoT.
Please refer to the links below for more information:
1) https://www.hcltech.com/blogs/unleashing-power-html5-websocket-internet-things-iot
2) https://medium.com/@krishna.thokala2010/websocket-fever-for-iot-f662498ff1d2
3) https://webofthings.org/tag/websockets/
4) https://readwrite.com/2017/10/31/websockets-iot-two-dont-go-together/
Hope this information helps. Please comment if you need more assistance on specific details.

Related

What exactly does a XMPP server do?

Besides federation (talking to other XMPP servers), what's the role of an XMPP server in the communication between two peers?
Wikipedia says that
The XMPP network uses a client–server architecture; clients do not
talk directly to one another.
In that case, they must talk through the server, so messages must go through the server, correct?
Does its role change if we're using XMPP over WebSockets, BOSH, or bare TCP?
For instance, if we use XMPP over WebSockets, is there a WebSocket between client1 and the server, and another WebSocket between client2 and the server?
An XMPP server provides basic messaging, presence, and XML routing features. There is a range of Jabber/XMPP server software that you can use to run your own XMPP service, either over the Internet or on a local area network. Wikipedia is also right: XMPP uses a client-server architecture, a system in which a computing system is composed of two logical parts: a server, which provides information or services, and a client, which requests them. On a network, for example, users can access server resources from their personal computers using client software. Client-server systems are widely used for communication and also in DBMSs.
Actually, XMPP is a communication protocol for message-oriented middleware (which sits between software and applications) based on XML (Extensible Markup Language). It enables the near-real-time exchange of structured yet extensible data between any two or more network entities.
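To answer the last part of the question directly: with XMPP over WebSockets (RFC 7395), each client keeps its own WebSocket connection to the server, and stanzas are still routed through the server. A rough sketch, with the hostname, JIDs, and the omitted stream handshake purely illustrative:

```typescript
// Each client opens its own WebSocket to the XMPP server; there is no socket
// between the two clients themselves.
const client1 = new WebSocket("wss://xmpp.example.com/ws", "xmpp"); // "xmpp" subprotocol per RFC 7395
const client2 = new WebSocket("wss://xmpp.example.com/ws", "xmpp");

client1.onopen = () => {
  // After the XMPP stream handshake and authentication (omitted here),
  // client1 sends a stanza addressed to client2's JID...
  client1.send('<message to="client2@xmpp.example.com" type="chat"><body>hello</body></message>');
};

client2.onmessage = (event) => {
  // ...and the server routes it onto client2's own connection.
  console.log("stanza routed via the server:", event.data);
};
```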

twilio <stream> websocket streaming to signalR asp.net core 3.0

Sorry if this is a silly question. Has anyone managed to get the Twilio stream-to-WebSocket working with SignalR (https://www.twilio.com/docs/voice/twiml/stream)?
I have been trying for a while now, and although I can see it's hitting the server, I never see it hit any of the methods.
Any help would be greatly appreciated
Thanks
Steve
I am just taking a look at Twilio Streams now that they've announced bidirectional support, and I noticed that the docs indicate you have to use WebSockets. SignalR uses WebSockets but isn't just WebSockets. It's summed up succinctly in Jim McG's blog post "How to save an audio file from Twilio Media Streams to Azure storage":
"SignalR is terrific, but it solves a different problem. SignalR can be thought of as a wrapper of several technologies - of which WebSockets represent a major component. It primarily serves the purpose of connecting web-browser clients to a back-end service. Other problems it solves include the maintenance of robust connections and the use of fallback techniques to enable browsers that don’t natively support WebSockets, to still benefit from real-time connections.
For Twilio Media Streams, we need to use WebSocket connections in a server-to-server configuration. SignalR isn’t the right tool for that job."
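For reference, here is a rough sketch of such a server-to-server WebSocket endpoint without SignalR. It uses Node.js and the 'ws' package rather than ASP.NET Core, purely to illustrate the shape of the handler; the port, path, and message fields follow the Stream docs but should be checked against the current schema, and the TwiML <Stream> url would point at this endpoint.

```typescript
import { WebSocketServer } from "ws";

// Plain WebSocket endpoint that Twilio connects to (no SignalR hub involved).
const wss = new WebSocketServer({ port: 8080, path: "/media" });

wss.on("connection", (socket) => {
  socket.on("message", (raw) => {
    const msg = JSON.parse(raw.toString());
    switch (msg.event) {
      case "start":
        console.log("stream started:", msg.start?.streamSid);
        break;
      case "media":
        // Base64-encoded audio frames from the call.
        handleAudioChunk(Buffer.from(msg.media.payload, "base64"));
        break;
      case "stop":
        console.log("stream stopped");
        break;
    }
  });
});

// Placeholder for whatever the app does with the audio (transcribe, store, etc.).
function handleAudioChunk(chunk: Buffer): void {
  console.log(`received ${chunk.length} bytes of audio`);
}
```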

How to use Pusher API for bi-directional communication?

When taking a look at the Pusher service and their client/server API, I am having some problems figuring out how Pusher will help me allow bi-directional communication between devices/apps.
I have multiple smaller devices/apps in the field that should report their status to a server or another client, which acts as a dashboard to browse all those devices and monitor their status, etc.
In my understanding this can be done using traditional WebSockets and a cloud server in between which manages all connections between those clients - something I thought Pusher would be.
But after reading through the docs I can't really see a concept of bi-directional data communication. Here's why:
To push data to the clients I have to use one of Pusher's server libraries.
To receive that data I have to use one of Pusher's client libraries.
This concept, however, does not fit what I need. I want to:
Broadcast to Clients.
Clients can send Data directly to Clients (Server acting as Gateway / Routing).
Clients can send Data to Server.
Server can send / respond to a unique Client.
When reading about Pusher, they advertise "Bi-Directional Communication", which I currently cannot see. So how do I implement that advertised bi-directional communication?
Pusher does PubSub only. Using this, you can simulate bi-directional communication: each side of the conversation needs a topic dedicated to it, and you then publish to the other side's topic.
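A minimal sketch of that pattern, assuming the pusher-js client library and the 'pusher' Node.js server library (keys, cluster, and the per-party channel names are placeholders):

```typescript
// Device/dashboard side (pusher-js): each party subscribes to its own dedicated channel.
import Pusher from "pusher-js";

const socket = new Pusher("APP_KEY", { cluster: "eu" });
socket.subscribe("device-42").bind("command", (data: unknown) => {
  console.log("command for this device:", data);
});

// Server side ('pusher' package): to "send" to another party, publish on that party's channel.
// Clients ask the server to do this over a normal HTTP request, since publishing is server-only.
import PusherServer from "pusher";

const pusher = new PusherServer({
  appId: "APP_ID",
  key: "APP_KEY",
  secret: "APP_SECRET",
  cluster: "eu",
});

pusher.trigger("dashboard-1", "status-update", { deviceId: 42, battery: 87 });
```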
This is not ideal. For something which is probably closer to what you seem to want, take a look at WAMP (Web Application Messaging Protocol), which has more than just PubSub. There is a list of implementations at http://wamp-proto.org/implementations. For a router I would recommend Crossbar.io (http://crossbar.io), which has the most documentation to help you get started. Full disclosure: I am involved both with WAMP and Crossbar.io - but it's all open source and may just be what you need.

Does PubNub use WebSockets and/or XMPP under the hood?

Couldn't find a clear answer to either:
WebSockets: There is support for WebSockets (http://www.pubnub.com/websockets/) and socket.io; however, do the other SDKs use WebSockets?
XMPP: Does PubNub use it as a communication protocol?
PubNub WebSockets and/or XMPP
Update 2019 🌟 PubNub is planning to add additional protocols. MQTT is supported today (mqtt.pubnub.com); additionally, we will be adding WebSockets, SSE, and connectionless push with UDP.
At PubNub we use many protocols in our client SDKs, starting with an always-on, forever-lived TCP socket. Our TTL policy on TCP sockets is unlimited. We provide the best protocol and we roll in updates under the covers so developers don't have to sweat the details of how messages are delivered.
The PubNub Data Stream Network believes in a protocol-independent open mobile web, meaning that we will use the best protocol to get connectivity through any environment. Protocols like WebSockets can get tripped up by cell-tower switching, double-NAT environments, and even some anti-virus software or proxy border authorities.
PubNub provides client libraries specifically so we can auto-switch the protocol and remove socket-level complexities, making it easy for developers to build apps that can communicate in realtime.
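To give a sense of what that abstraction looks like in application code, here is a minimal sketch with the PubNub JavaScript SDK (keys, userId, and channel name are placeholders; option names follow recent versions of the SDK):

```typescript
import PubNub from "pubnub";

// The SDK picks and manages the underlying transport; application code
// only sees publish and subscribe.
const pubnub = new PubNub({
  publishKey: "pub-key",
  subscribeKey: "sub-key",
  userId: "sensor-1", // called `uuid` in older SDK versions
});

pubnub.addListener({
  message: (event) => console.log("received:", event.message),
});

pubnub.subscribe({ channels: ["telemetry"] });

pubnub.publish({ channel: "telemetry", message: { temperature: 21.5 } });
```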
PubNub has deployed a variety of protocols over time, like WebSockets, MQTT, COMET, BOSH, long polling, and others. We are currently prototyping future designs using SPDY, HTTP 2.0, and others. The bottom line is that PubNub will work in every network environment, and has very low network bandwidth overhead, as well as low battery drain on mobile devices compared to connection-based push implementations.

WebRTC vs Websockets: If WebRTC can do Video, Audio, and Data, why do I need Websockets? [closed]

So I'm looking to build a chat app that will allow video, audio, and text. I spent some time researching WebSockets and WebRTC to decide which to use. Since there are plenty of video and audio apps built with WebRTC, it sounds like a reasonable choice, but are there other things I should consider?
Feel free to share your thoughts.
Things like:
Due to being new, WebRTC is available only in some browsers, while WebSockets seem to be available in more browsers.
Scalability - WebSockets use a server for the session, whereas WebRTC seems to be p2p.
Multiplexing/multiple chatrooms - used in Google+ Hangouts, and I'm still viewing demo apps on how to implement this.
Server - WebSockets need RedisSessionStore or RabbitMQ to scale across multiple machines.
WebRTC is designed for high-performance, high quality communication of video, audio and arbitrary data. In other words, for apps exactly like what you describe.
WebRTC apps need a service via which they can exchange network and media metadata, a process known as signaling. However, once signaling has taken place, video/audio/data is streamed directly between clients, avoiding the performance cost of streaming via an intermediary server.
WebSocket, on the other hand, is designed for bi-directional communication between client and server. It is possible to stream audio and video over WebSocket, but the technology and APIs are not inherently designed for efficient, robust streaming in the way that WebRTC is.
As other replies have said, WebSocket can be used for signaling.
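A rough sketch of that division of labour (the signaling server address and message format are illustrative assumptions): the WebSocket carries only offers, answers, and ICE candidates, while media and data then flow peer to peer.

```typescript
// WebSocket used only for signaling.
const signaling = new WebSocket("wss://signaling.example.com");

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// A data channel must be created before the offer so it is included in the SDP.
const chat = pc.createDataChannel("chat");
chat.onmessage = (event) => console.log("peer says:", event.data);

// Send our network metadata (ICE candidates) to the other peer via the server.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) signaling.send(JSON.stringify({ type: "candidate", candidate }));
};

signaling.onopen = async () => {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify(offer)); // { type: "offer", sdp: ... }
};

signaling.onmessage = async (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "answer") await pc.setRemoteDescription(msg);
  if (msg.type === "candidate") await pc.addIceCandidate(msg.candidate);
};
```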
I maintain a list of WebRTC resources; I strongly recommend you start by looking at the 2013 Google I/O presentation about WebRTC.
WebSockets use the TCP protocol.
WebRTC is mainly UDP.
Thus the main reason for using WebRTC instead of WebSocket is latency.
With WebSocket streaming you will have either high latency, or choppy playback with low latency. With WebRTC you may achieve low latency and smooth playback, which is crucial for VoIP communications.
Just try testing these technologies with some network loss, e.g. 2%. You will see high delays in the WebSocket stream.
WebSockets:
Ratified IETF standard (6455) with support across all modern browsers and even legacy browsers using web-socket-js polyfill.
Uses HTTP compatible handshake and default ports making it much easier to use with existing firewall, proxy and web server infrastructure.
Much simpler browser API. Basically one constructor with a couple of callbacks.
Client/browser to server only.
Only supports reliable, in-order transport because it is built on TCP. This means packet drops can delay all subsequent packets.
WebRTC:
Just beginning to be supported by Chrome and Firefox. MS has proposed an incompatible variant. The DataChannel component is not yet compatible between Firefox and Chrome.
WebRTC is browser-to-browser in ideal circumstances, but even then it almost always requires a signaling server to set up the connections. The most common signaling server solutions right now use WebSockets.
The transport layer is configurable: the application can choose whether a connection is in-order and/or reliable (see the sketch after this list).
Complex and multilayered browser API. There are JS libs to provide a simpler API but these are young and rapidly changing (just like WebRTC itself).
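As a small illustration of the configurable transport mentioned above, the same RTCPeerConnection can carry both an unordered, unreliable channel and the default reliable one (channel labels are illustrative):

```typescript
const pc = new RTCPeerConnection();

// UDP-like semantics: out-of-order delivery allowed, no retransmissions.
// Useful for frequently refreshed data such as game state or telemetry.
const lossy = pc.createDataChannel("telemetry", { ordered: false, maxRetransmits: 0 });

// Default: ordered and reliable, the only mode WebSocket offers (it runs over TCP).
const reliable = pc.createDataChannel("chat");
```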
WebRTC or WebSockets? Why not use both.
When building a video/audio/text chat, WebRTC is definitely a good choice since it uses peer-to-peer technology, and once the connection is up and running you do not need to pass the communication via a server (unless using TURN).
When setting up the WebRTC communication you have to involve some sort of signaling mechanism. WebSockets could be a good choice here, but WebRTC is the way to go for the video/audio/text data. Chat rooms are handled in the signaling.
But, as you mention, not every browser supports WebRTC, so WebSockets can sometimes be a good fallback for those browsers.
Security is one aspect you missed.
With Websockets the data has to go via a central webserver which typically sees all the traffic and can access it.
With WebRTC the data is end-to-end encrypted and does not pass through a server (except sometimes TURN servers are needed, but they have no access to the body of the messages they forward).
Depending on your application this may or may not matter.
If you are sending large amounts of data, the saving in cloud bandwidth costs due to webRTC's P2P architecture may be worth considering too.
Comparing WebSocket and WebRTC is unfair.
WebSocket runs on top of TCP. Unlike raw TCP, message boundaries can be detected from the header information of a WebSocket frame.
Typically, WebRTC makes use of WebSocket. The signalling for WebRTC is not defined; it is up to the service provider what kind of signalling to use. It may be SIP, HTTP, JSON, or any text/binary message.
The signalling messages can be sent/received using WebSocket.
WebRTC is a peer-to-peer connection.
Before a peer-to-peer connection can be created, a handshaking process is required to establish it.
And WebSockets play the role of that handshaking process.
WebSocket and WebRTC can be used together: WebSocket as a signaling channel for WebRTC, while WebRTC provides the video/audio/text channel. WebRTC can run over UDP or through a TURN relay, and TURN relays also support TCP, including the standard HTTP and HTTPS ports.
Many projects use Websocket and WebRTC together.
