What decides whether an FTP file is downloaded or streamed? - ftp

While trying to set up a streaming server with my Raspberry Pi, the instructions seemed to consist of just installing an FTP server.
This made me wonder: what decides whether a file stored on the FTP server is downloaded or streamed?
In other words, is the choice between downloading and streaming made on the client side rather than the server side?

If you are using FTP, streaming is implemented client-side using the REST command (which sets the restart position), as explained at How does an FTP server resume a download? and (in more detail) at http://cr.yp.to/ftp/retr.html .
Your server therefore needs to allow the REST verb (most do by default). Throttling (flow control) is also managed client-side.
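As a minimal sketch of the client side in Python (the host, credentials, file name and offset below are placeholders), ftplib exposes the restart position through the rest parameter of retrbinary:

```python
from ftplib import FTP

ftp = FTP("media.example.com")        # placeholder host
ftp.login("user", "password")         # placeholder credentials

offset = 1_048_576                    # resume/stream from the 1 MiB mark

with open("video.mp4", "ab") as out:
    # rest= makes ftplib send "REST <offset>" before "RETR", so the server
    # starts the transfer at that byte position instead of at the beginning.
    ftp.retrbinary("RETR video.mp4", out.write, blocksize=8192, rest=offset)

ftp.quit()
```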
Long story:
This mechanism is similar to the strategy used by HTTP. Streaming, however, is a broad subject, and there are other approaches to it. Some protocols provide extra verbs to signal events such as changes of bandwidth/resolution to account for unstable connections (as in videoconference / desktop-sharing protocols). Some are more suitable for live broadcasting and others for buffered/stored video.
Nowadays, most streaming players, like YouTube, are web-based and built on top of the HTTP protocol. Streaming is achieved using the HTTP Range header and by dividing the media into chunks that can be retrieved separately, as explained in this excellent video: https://www.youtube.com/watch?v=OqQk7kLuaK4 .
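For illustration, a ranged HTTP request looks roughly like this in Python (the URL is a placeholder; any server that advertises Accept-Ranges: bytes will answer with 206 Partial Content):

```python
import requests

url = "https://media.example.com/video.mp4"   # placeholder URL

# Ask for the first 1 MiB only; a range-aware server returns just that slice.
resp = requests.get(url, headers={"Range": "bytes=0-1048575"})

print(resp.status_code)                   # 206 if byte ranges are supported
print(resp.headers.get("Content-Range"))  # e.g. "bytes 0-1048575/734003200"
chunk = resp.content                      # the requested slice of the file
```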

Related

Integration of Shenzhen Concox Information Technology Tracker GT06 with EC2

I have a Concox GT06 device from which I want to send tracking data to my AWS server.
The coding protocol manual that comes with it only explains the data structure and protocol.
How does my server receive the GPS data collected by my tracker?
Verify whether your server allows you to open sockets, which most low-cost solutions do NOT allow for security reasons (I recommend using an Amazon EC2 virtual machine as your platform).
Choose a port on which your application will listen for incoming data, verify that it is open (if not, open it) and code your application (I use C++) to listen on that port; a minimal sketch follows below.
Compile and run your application on the server (and make sure that it stays alive).
Configure your tracker (usually by sending an SMS to it) to send data to your server's IP and to the port on which your application is listening.
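As a rough sketch of such a listener (in Python rather than C++, with a placeholder port), the server-side loop just accepts the tracker's TCP connection and reads whatever frames it sends; decoding them is still up to you and the protocol manual:

```python
import socket

HOST = "0.0.0.0"   # listen on all interfaces of the EC2 instance
PORT = 5023        # placeholder; use whichever port you opened for the tracker

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    print(f"Waiting for tracker connections on port {PORT}...")
    while True:
        conn, addr = srv.accept()
        with conn:
            print("Tracker connected from", addr)
            while True:
                data = conn.recv(1024)
                if not data:
                    break
                # GT06 frames arrive as raw binary; parse them according to
                # the protocol manual before storing or forwarding them.
                print("Received:", data.hex())
```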
If you are, as I suspect, just beginning, expect to invest two to three weeks to develop this solution from scratch. You might also consider looking for a pre-developed tracking platform, which may or may not be acceptable in terms of data security.
You can find examples and tutorials online. I am usually very open with my code and would gladly send a copy of the socket server, but in this case, for security reasons, I cannot do so.
Instead of parsing TCP or UDP packets directly, you can use a simpler solution and put a middleware backend specialized in data parsing in between, e.g. flespi.
With this approach you can use an HTTP REST API to fetch each new portion of data that the trackers send to your dedicated IP:port (called a channel), or even send standardized commands to the connected devices over the same REST API.
It is also possible to open an MQTT connection using standard libraries and receive the device messages, already converted into JSON, in real time, which is even better than REST because the latency is close to zero.
If you are using Python, take a look at the open-source flespi_receiver library. With this approach, about ten lines of code are enough to receive the Concox GT06 messages on your EC2 instance fully parsed into JSON.
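As a hedged sketch of the MQTT path (the broker host, credentials and topic below are placeholders, not the middleware's actual endpoints), subscribing to the parsed messages with the paho-mqtt library could look like this:

```python
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Placeholder topic; subscribe to whatever topic your middleware
    # publishes the parsed tracker messages on.
    client.subscribe("trackers/gt06/+/messages")

def on_message(client, userdata, msg):
    # Each message is assumed to be a tracker record already converted to JSON.
    record = json.loads(msg.payload)
    print("position:", record.get("latitude"), record.get("longitude"))

client = mqtt.Client()
client.username_pw_set("ACCESS-TOKEN")        # placeholder credentials
client.on_connect = on_connect
client.on_message = on_message
client.connect("mqtt.example.com", 1883)      # placeholder broker host/port
client.loop_forever()
```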

Windows WebDAV client with streaming/chunked transfer

I have implemented a very minimal proof of concept supporting a portion of the WebDAV protocol. This includes the OPTIONS, PROPFIND and GET HTTP verbs. The built-in Windows WebDAV client (on Windows 8.1) can therefore open the WebDAV share, list files and directories, and navigate through them.
The GET implementation provides the Accept-Ranges (bytes), Content-Length, Content-Type and Transfer-Encoding (chunked) headers. When a large video file is opened in a browser, it begins to play immediately while the remaining contents are still downloading. The built-in Windows WebDAV client, however, seems to download the entire file to a temporary location before having a media player play it. When a file is 10 GB, this is going to suck.
Is there any way to provide support so that the built-in WebDAV client can read ranges of bytes for streaming purposes (I would imagine it just needs to translate to use Range somehow...)?
It sounds like you did all the correct things to indicate to the client that streaming is possible and that range requests are supported. So if the client doesn't respond to that, I think you can conclude that it simply doesn't support those features (which is a total bummer).
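For reference, the server-side behaviour the question describes, advertising Accept-Ranges and answering Range requests with 206 Partial Content, boils down to something like this minimal plain-HTTP sketch with Python's http.server (the file name and port are placeholders, and a real WebDAV server would layer OPTIONS/PROPFIND on top):

```python
import os
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

FILE = "video.mp4"  # placeholder file to serve

class RangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        size = os.path.getsize(FILE)
        match = re.match(r"bytes=(\d+)-(\d*)", self.headers.get("Range") or "")
        with open(FILE, "rb") as f:
            if match:
                # Serve only the requested slice with a 206 Partial Content reply.
                start = int(match.group(1))
                end = int(match.group(2)) if match.group(2) else size - 1
                f.seek(start)
                body = f.read(end - start + 1)
                self.send_response(206)
                self.send_header("Content-Range", f"bytes {start}-{end}/{size}")
            else:
                # No Range header: fall back to a normal full response.
                body = f.read()
                self.send_response(200)
            self.send_header("Accept-Ranges", "bytes")
            self.send_header("Content-Type", "video/mp4")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

HTTPServer(("", 8080), RangeHandler).serve_forever()
```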

Is socket.io WebRTC, WebSocket, or something else?

I'm new to socket.io. In real-time (web) applications, we used to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case?
WebSockets
socket.io is a popular open-source library with both server-side and client-side implementations. It is based on the WebSocket API, which allows communication between a server and a client.
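As a small illustration of that server-client model (in Python via the python-socketio package rather than the Node.js original; the event name and port are placeholders), a minimal socket.io server looks roughly like this:

```python
import eventlet
import socketio

sio = socketio.Server()
app = socketio.WSGIApp(sio)

@sio.event
def connect(sid, environ):
    print("client connected:", sid)

@sio.on("chat message")                 # placeholder event name
def chat_message(sid, data):
    # Re-broadcast the payload to every connected client (pub/sub style).
    sio.emit("chat message", data)

@sio.event
def disconnect(sid):
    print("client disconnected:", sid)

if __name__ == "__main__":
    # Placeholder port; the browser connects with the socket.io JS client.
    eventlet.wsgi.server(eventlet.listen(("", 5000)), app)
```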
WebRTC
On the other hand, WebRTC is a Web API which comes with basically three things:
Real-time communication between two browsers (no server needed): a peer-to-peer (P2P) connection
Media streaming (audio and video)
A real-time data channel (stream any data over the P2P connection)
The main difference is that WebSockets needs a server and is based on a publish/subscribe pattern where you can send raw data back and forth, without any special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, and it can also carry raw data through the data channel.
For more info I recommend reading the MDN links above, and also check these very cool slides on sockets and WebRTC.
If you want to build video or audio communication services, use WebRTC for its built-in browser support, and write the discovery and signaling yourself. WebRTC has awesome features like P2P connections and data encryption.
Client-side (browser) WebRTC features, such as capturing video and audio data, have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
socket.io is good for building centralized pub/sub apps like text chat.
You can make WebRTC connections without socket.io, but the two work well together if you use socket.io to help with signaling.
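As a sketch of that signaling role (again with python-socketio; the "signal" event name and the message shape are assumptions, not part of any standard), the server only relays SDP offers/answers and ICE candidates between the two peers, while the actual media or data then flows peer-to-peer:

```python
import eventlet
import socketio

sio = socketio.Server(cors_allowed_origins="*")
app = socketio.WSGIApp(sio)

@sio.on("signal")
def relay(sid, data):
    # data is assumed to look like {"to": "<peer sid>", "payload": <sdp or ice>}.
    # The server never inspects the payload; it just forwards it to the peer.
    sio.emit("signal", {"from": sid, "payload": data["payload"]}, to=data["to"])

if __name__ == "__main__":
    eventlet.wsgi.server(eventlet.listen(("", 5000)), app)  # placeholder port
```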

What is the difference between WebRTC and WebSockets for low level data communication

I am trying to understand the difference between WebRTC and WebSockets so that I can better understand which scenario calls for which. I am curious about the broad idea of two parties (mainly web-based, but potentially one being a dedicated server application) talking to each other.
Assumption:
Clearly, with regard to ad-hoc networks, WebRTC wins, as it natively supports the ICE protocol/method.
Questions:
Regarding direct communication between two known parties in-browser, if I am not relying on sending multimedia data and am only interested in sending integer data, does WebRTC give me any advantages over WebSockets other than data encryption?
Regarding a dedicated server speaking to a browser-based client, which platform gives me an advantage? I would need to code a WebRTC server (is this possible outside the browser?), or I would need to code a WebSocket server (a quick Google search makes me think this is possible).
There is one significant difference: WebSockets works over TCP, while WebRTC works over UDP.
In fact, WebRTC is the SRTP protocol with some additional features like STUN, ICE and DTLS, plus internal VoIP features such as an adaptive jitter buffer, AEC and AGC.
So, WebSockets is designed for reliable communication. It is a good choice if you want to send any data that must be delivered reliably.
When you use WebRTC, the transmitted stream is unreliable: some packets can get lost in the network. That is bad if you send critical data, for example for financial processing, but it is perfectly acceptable when you send an audio or video stream, where some frames can be lost without any noticeable quality issues.
If you want to send data over a WebRTC data channel, you should have some forward error correction algorithm in place to restore the data if a frame is lost in the network.
WebRTC specifies media transport over RTP, which can work P2P under certain circumstances. In any case, to establish a WebRTC session you will also need a signaling protocol, and for that WebSocket is a likely choice. In other words: unless you want to stream real-time media, WebSocket is probably a better fit.
Question 1: Yes. The DataChannel part of WebRTC gives you advantages in this case, because it allows you to create a peer-to-peer channel between browsers to send and receive any raw data you want. WebSockets forces you to use a server to connect both parties.
Question 2: As I said in the previous answer, WebSockets are better if you want server-client communication, and there are many implementations for this (e.g. jWebSocket). Adding support in a server for establishing a connection over a WebRTC DataChannel may cost you some days of life and health. :)
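For question 2, a WebRTC endpoint outside the browser is possible. As a rough sketch (using the aiortc library, which mirrors the browser API in Python; the signaling exchange is left out, the channel label is a placeholder, and the two send/receive helpers are hypothetical stand-ins for whatever transport you use for signaling), creating a data channel on the server side looks roughly like this:

```python
import asyncio
from aiortc import RTCPeerConnection, RTCSessionDescription

async def run():
    pc = RTCPeerConnection()
    channel = pc.createDataChannel("data")   # placeholder channel label

    @channel.on("open")
    def on_open():
        channel.send("hello from the server-side peer")

    @channel.on("message")
    def on_message(message):
        print("received:", message)

    # Create an offer and hand its SDP to the remote peer over your own
    # signaling transport (a WebSocket, socket.io, plain HTTP, ...).
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)
    send_to_peer_somehow(pc.localDescription.sdp)       # hypothetical helper

    # Apply the answer SDP received back from the peer the same way.
    answer_sdp = await receive_from_peer_somehow()      # hypothetical helper
    await pc.setRemoteDescription(RTCSessionDescription(sdp=answer_sdp, type="answer"))

asyncio.run(run())
```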

Peer-to-peer file sharing with Web Sockets

This is sort of a theoretical question; however, I need to add file-sharing capabilities to my WebSocket-powered chat application. I could use a service like Amazon S3 and share a file by posting a link to it, but that involves uploading a file that may already be accessible over the local network (sharing a file between co-workers, for example).
So I had the idea that it might be possible to somehow tunnel the upload/download/transfer through the already existing WebSocket connection. However, I don't know enough about HTTP file transfer to know the next step of how to implement it. Is there a limitation of WebSockets that would prevent this from being possible?
I'm using Ruby and EventMachine for my current WebSocket implementation. If you were able to provide a high-level overview to get me started, that would be very much appreciated.
Here's an example of a project that uses only WebSockets and the JavaScript File API for transferring files: http://www.github.com/thirtysixthspan/waterunderice
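As a rough sketch of the tunnelling idea itself (in Python rather than Ruby/EventMachine, with a placeholder URL and an ad-hoc message format), the sending side can push a file through the existing WebSocket connection in base64-encoded chunks and let the server or the other clients reassemble them:

```python
import asyncio
import base64
import json
import websockets  # pip install websockets

CHUNK_SIZE = 64 * 1024  # 64 KiB per WebSocket message

async def send_file(uri, path):
    async with websockets.connect(uri) as ws:
        seq = 0
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                await ws.send(json.dumps({
                    "type": "file-chunk",        # ad-hoc message format
                    "name": path,
                    "seq": seq,
                    "data": base64.b64encode(chunk).decode("ascii"),
                }))
                seq += 1
        # Tell the receivers the transfer is complete so they can reassemble.
        await ws.send(json.dumps({"type": "file-end", "name": path, "chunks": seq}))

asyncio.run(send_file("ws://chat.example.com/ws", "report.pdf"))  # placeholders
```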
To allow files to be shared without uploading them to the server (e.g. between co-workers), you can now use the WebRTC DataChannel API to create a peer-to-peer connection.

Resources