Difference between Web Socket in Raspberry Pi and Arduino - websocket

I use a WebSocket between a Raspberry Pi and a server and it works perfectly, just as WebSockets are supposed to.
But when I searched for a WebSocket library for Arduino I found a lot of libraries, and it seems all of them behave like plain HTTP POST rather than like a real WebSocket.
They just send POSTs and receive GETs, whereas a WebSocket opens a connection between the two endpoints so that each side can send messages and wait for messages from the other side.
In my Arduino application I constantly have to send HTTP GET requests to learn the state of the LED from the server, which produces far too much traffic; with a WebSocket the server would only send a message to the Arduino when the state changes, and the Arduino would simply wait for that message.
Does anyone know a real WebSocket library for Arduino?

There is no "real" WebSocket library for Arduino.
I recommend using a Raspberry Pi rather than an Arduino, because it is more stable: it has an operating system and supports multiple processes (unlike the Arduino, which runs only one process at a time).
If you compare the price of an RPi with that of any Arduino with built-in WiFi, you will find they cost roughly the same.
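To make the recommendation concrete, here is a minimal sketch of the push pattern on the Raspberry Pi side, using Python's websockets package; the server URL and the message format are placeholders, not anything from the question. The client opens one long-lived connection and simply waits for state-change messages instead of polling with HTTP GETs.

# Minimal sketch of the push-based pattern on the Raspberry Pi.
# Assumes a WebSocket server (hypothetical URL) that sends a message
# such as "LED:ON" only when the LED state actually changes.
import asyncio
import websockets

async def watch_led_state():
    # One long-lived connection instead of repeated HTTP GET polling.
    async with websockets.connect("ws://example.local:8765/led") as ws:
        async for message in ws:      # waits until the server pushes something
            print("LED state changed:", message)

asyncio.run(watch_led_state())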

Related

MicroPython timers blocked by MicroWebSrv

I am using an ESP32 and MicroPython to create electric blinds. I am also using MicroWebSrv with WebSockets to control the blinds from my phone through a web page.
I want to be able to control it using voice commands to Google Assistant.
I was able to make it work with IFTTT and an Adafruit.io feed. The problem is that after some time my ESP32 loses its connection to Adafruit and I was not able to make it reconnect reliably. It happens all the time, even when the MQTT library is the only thing running on the ESP32. I tried checking messages at intervals from 1 s up to 10 s (to avoid hitting the free Adafruit.io limit of 30 items per minute); it made no difference.
I'd like to ask whether there is some other way to control the ESP32, ideally without Adafruit.io or any other third party (IFTTT is fine), since that is what is causing the trouble. I even have a public IP address, so I could, for example, send HTTP requests directly to my ESP32 from the internet.
So is there some other way for Google Assistant to tell my ESP32 to OPEN or CLOSE the blinds?
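For context on the polling and reconnection attempts described above, a check-and-reconnect loop around MicroPython's umqtt.simple typically looks something like the sketch below; the broker, credentials, feed name and 5-second interval are placeholders rather than the asker's actual values, and this only illustrates the pattern, not a fix for the disconnects.

# Rough MicroPython sketch of the poll-and-reconnect loop described above.
# Broker, credentials and feed name are placeholders.
import time
from umqtt.simple import MQTTClient

def on_message(topic, msg):
    print("command:", msg)            # e.g. b"OPEN" / b"CLOSE" -> drive the blinds here

def connect():
    c = MQTTClient("esp32-blinds", "io.adafruit.com",
                   user="USERNAME", password="AIO_KEY")
    c.set_callback(on_message)
    c.connect()
    c.subscribe(b"USERNAME/feeds/blinds")
    return c

client = connect()
while True:
    try:
        client.check_msg()            # non-blocking: handle a pending command, if any
    except OSError:                   # connection dropped -> rebuild the MQTT session
        time.sleep(5)
        try:
            client = connect()
        except OSError:
            pass                      # still offline; retry on the next pass
    time.sleep(5)                     # poll interval in the 1-10 s range mentioned above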

Stream real-time video from a local IP to a browser on an external network using WebSocket/WebRTC with a Raspberry Pi 3B+

Anybody here with some experience in websockets and webRTC using TURN/STUN servers?
Requirement:
Send a real-time video feed from a local IP to a browser on an external network; I need some help implementing this with a Raspberry Pi 3B+. My camera source is an Android device, and using third-party apps I am able to generate the video feed over the local network. Using the same app I can stream via YouTube Live, but I get a latency of about 2 seconds in ultra-low-latency mode with DVR enabled, and I am trying to reduce the latency of the stream.
Q1. Do semi-public TURN servers provide a one-to-one peer connection, or can anyone simply open the URL, view, and override what I am streaming? Please list a few service providers.
Just for information, there would be at most 1-2 browser users connected.
Q2. Do I need a Janus gateway to send WebRTC/WebSocket data to the TURN/STUN server? My Raspberry Pi is connected to a different network and I cannot port forward due to carrier constraints.
Q3. Do I need both STUN and TURN servers, and do I even need WebRTC instead of WebSockets to send my video stream over the internet? Are WebSockets not sufficient?
Q4. Since we are not implementing this over the local network, do we also need to install coTURN on the Raspberry Pi?
Q5. Is there any Android app that can publish the camera data to a WebSocket/WebRTC server with a public ws URL?
Any help would be much appreciated.
Q1. TURN servers relay media. They do this by allocating, for every connecting peer, a relay port in the range 49152-65535. This relay port is then used to transmit the media to the second peer. The peers know which relay ports to use automatically, since this is part of the ICE gathering process. To get back to your question: other peers cannot write to that relay port; it is one-to-one with handshakes, so there is no chance of someone else overwriting your stream.
Q2. You definitely do not need a Janus gateway to use TURN. TURN and STUN will probably work fine for NAT traversal without port forwarding.
Q3. You need at least a TURN server (but you ideally want to use one STUN server and one TURN server). STUN will work in most cases, but will fail if there are firewalls or complicated NATs that block inbound UDP connections. TURN is just the fallback for those cases.
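For reference, this is roughly how a peer is pointed at one STUN and one TURN server, shown here with Python's aiortc package as a stand-in for the browser-side RTCPeerConnection configuration; the hostnames and credentials are placeholders, not real servers.

# Sketch: configuring a peer with one STUN and one TURN server, using the
# aiortc package as a Python analogue of the browser's RTCPeerConnection.
# Hostnames and credentials below are placeholders.
from aiortc import RTCConfiguration, RTCIceServer, RTCPeerConnection

config = RTCConfiguration(iceServers=[
    RTCIceServer(urls="stun:stun.example.org:3478"),
    RTCIceServer(urls="turn:turn.example.org:3478",
                 username="demo", credential="secret"),
])
pc = RTCPeerConnection(configuration=config)
# ICE gathering will now try server-reflexive (STUN) candidates first and
# fall back to relayed (TURN) candidates when direct paths are blocked.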
Do you need WebRTC? For just streaming video, it depends on the use case. A sequence of images can be transmitted over WebSockets; they handle Blobs fine. But you won't get a very fluid, high-fps and high-resolution video stream this way. And of course, I know of no usable way to transmit audio over a WebSocket.
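As a rough illustration of that image-sequence approach (not a recommendation over WebRTC), the sending side can be sketched in Python with the websockets package; the server URL and the capture_jpeg() frame source are hypothetical stand-ins for whatever actually produces the camera frames.

# Sketch of pushing a sequence of JPEG frames over a WebSocket.
# "ws://example.org:8765/stream" and capture_jpeg() are placeholders.
import asyncio
import websockets

def capture_jpeg() -> bytes:
    # Placeholder: return one JPEG-encoded frame from the camera source.
    with open("frame.jpg", "rb") as f:
        return f.read()

async def stream_frames():
    async with websockets.connect("ws://example.org:8765/stream") as ws:
        while True:
            await ws.send(capture_jpeg())   # binary frame; a browser receives it as a Blob
            await asyncio.sleep(1 / 15)     # ~15 fps; higher rates quickly hit bandwidth limits

asyncio.run(stream_frames())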
Q4. The Raspberry Pi is a peer that transmits media. Peers do not need a local TURN server installation; you only need one TURN server (which should not be behind a NAT, and will probably run on some web server). The TURN server is a separate instance.
EDIT
For private testing and development purposes, you may use https://numb.viagenie.ca/. I don't know much about commercial TURN server hosts, except that some exist. For someone who owns a v-server or root server, installing coTURN may be an option; this tutorial might be helpful. To check whether the server is working, I also found this snippet to be very useful.
END EDIT
Q5. There is no Android app that publishes WebRTC streams to a ws URL, since WebSocket messages are used by WebRTC only for signalling (that is, telling peers their host candidates: the IP addresses and ports learned during the ICE gathering process, including the TURN and STUN IP/port combinations).

Client-Server Communication in ns3

I want to simulate a UDP request-response application in ns3 using 802.11n as the communication technology, where a client/server A, an access point AP, and another server/client B exchange messages (each message has a particular size, e.g. 100, 130, 235 bytes and so on):
*A*         AP          *B*
A------>AP
AP------>A
AP------>B
B------>AP
AP------>B
AP------>A
A------>B
B------->A
I followed this tutorial (https://www.nsnam.org/wiki/HOWTO_make_and_use_a_new_application) and I'm able to send and receive data with a custom size, but only for the first interaction. I have also spent many days trying to understand how to modify the behaviour of this application to simulate the scenario above. Are there any suggestions?
You don't need to create a new application. ns-3 already has applications that send traffic according to your specification (e.g. the BulkSend application, OnOff, and so on). Here's the link.
Regarding your application, it is very close to this existing example of a client/server over wireless LAN (link).
All you have to do is tweak the parameters. Let me know if you have any problems.

Can you poll a TCP socket created by another program?

Is it possible on Windows 7 to write a C++ or .NET program that finds out whether an existing, connected TCP socket created by another program has any data in its send or receive buffer?
Use case: There's a 16-bit legacy application doing TCP communication with some .NET applications. To work around a concurrency issue in the legacy app, it would be helpful if we could inspect either of the two sockets that are connected to each other and tell whether there is data that has been sent on one end but not yet received on the other.
The connection is TCP and the sockets are on the loopback interface (127.0.0.1).
Approach: WSADuplicateSocket() + WSAPoll() could be the solution, but I don't know how to get hold of the socket handle programmatically, because the socket is created by another program.

Connecting to MindWave through COM port?

Has anyone here tried connecting to the MindWave device through the COM port associated with the Bluetooth device? Is it better to just connect through the ThinkGear Connector? My target language is Ruby and I was thinking of using the serial-port library.
If you connect to the serial port directly you will have to parse binary data, which is exactly what the ThinkGear Connect socket protocol is meant to spare you from (everything related is here). TGSP is a JSON-based protocol, so dealing with it in Ruby is a piece of cake (below is an extract from the documentation at the link above):
ThinkGear Socket Protocol (TGSP) is a JSON-based protocol for the
transmission and receipt of ThinkGear brainwave data between a client
and a server. TGSP was designed to allow languages and/or frameworks
without a standard serial port API (e.g. Flash and most scripting
languages) to easily integrate brainwave-sensing functionality through
socket APIs.
So I'd suggest you use the ThinkGear socket protocol rather than parsing the data on your own (I assume you don't run Ruby on an embedded device; if you do, you will definitely need to parse the data from the TGAM chip by hand), but don't forget that you will need the ThinkGear Connector software up and running everywhere you want your code to run.
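For illustration only, sketched in Python rather than Ruby (the same socket-plus-JSON flow applies from Ruby's TCPSocket and JSON libraries), connecting to a locally running ThinkGear Connector over TGSP looks roughly like this. The port 13854 and the configuration payload follow the TGSP documentation linked above, but treat them as assumptions and check the docs for your version.

import json
import socket

# Port 13854 and the configuration payload are taken from the TGSP docs
# referenced above; verify them against your ThinkGear Connector version.
with socket.create_connection(("127.0.0.1", 13854)) as sock:
    # Ask the connector for JSON output without raw samples.
    sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())

    buffer = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:                      # connector closed the connection
            break
        buffer += chunk
        # TGSP emits a stream of JSON objects, one per line.
        *lines, buffer = buffer.replace(b"\r", b"\n").split(b"\n")
        for line in lines:
            if line.strip():
                print(json.loads(line))    # e.g. eSense and poorSignalLevel readings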
