From the Sonos documentation I saw that Sonos only supports MP3 streaming over RTSP, or SHOUTcast over HTTP.
Is it possible to stream tracks in MP3 format over HTTP using progressive download instead of SHOUTcast on Sonos?
Yes, but could you expand on what you mean by progressive download? Sonos does support byte-range seeking: http://musicpartners.sonos.com/node/148
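Progressive download usually just means serving the whole MP3 file over plain HTTP and letting the client seek with byte-range requests. As a rough illustration of the server side of that (not Sonos-specific; the helper name is made up), a minimal Range-header parser might look like:

```python
def parse_range_header(range_header, file_size):
    """Parse an HTTP Range header like 'bytes=0-1023' into (start, end).

    Returns an inclusive byte span, or None if the header is absent or
    unsatisfiable. A sketch covering only the common 'bytes=start-end'
    and 'bytes=start-' forms, not multi-range requests.
    """
    if not range_header or not range_header.startswith("bytes="):
        return None
    spec = range_header[len("bytes="):]
    start_s, _, end_s = spec.partition("-")
    try:
        start = int(start_s)
        end = int(end_s) if end_s else file_size - 1   # open-ended: to EOF
    except ValueError:
        return None
    if start > end or end >= file_size:
        return None
    return (start, end)
```

The server would then answer `206 Partial Content` with a `Content-Range` header and the requested slice of the file.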
I am trying to make a peer-to-peer game streaming platform. At this point I have managed to capture the OpenGL frames, and I have a functional Java WebSockets server: two clients can establish a peer-to-peer connection (I have solved the STUN/TURN server part) and transfer text.
I do not quite understand how I could stream a video made out of the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result (stdin/stdout redirection for ffmpeg?), and somehow pass that to the host's JS API (maybe a local WebSocket that the host's JS connects to).
I tried several FFmpeg arguments/commands with stdin and stdout pipes and they did not work.
What WebRTC client are you using? What is the H.264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try constrained baseline, and use a very small keyframe interval (one keyframe per second is usually good for a prototype!)
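The stdin/stdout piping asked about above can be sketched roughly like this. It is only a sketch: it assumes `ffmpeg` is on PATH and that the captured OpenGL frames are raw RGBA (a common readback format, but check yours), and the dimensions are placeholders.

```python
import subprocess

def encoder_cmd(width, height, fps=30):
    """FFmpeg arguments for low-latency H.264 from raw RGBA frames on stdin."""
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgba",   # headerless raw frames
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "pipe:0",                          # frames arrive on stdin
        "-c:v", "libx264",
        "-profile:v", "baseline",                # constrained baseline suits browser decoders
        "-tune", "zerolatency",                  # no B-frames / lookahead
        "-g", str(fps),                          # one keyframe per second
        "-f", "h264", "pipe:1",                  # Annex-B bitstream on stdout
    ]

def start_encoder(width, height, fps=30):
    """Launch the encoder; write frames to .stdin, read H.264 from .stdout."""
    return subprocess.Popen(encoder_cmd(width, height, fps),
                            stdin=subprocess.PIPE, stdout=subprocess.PIPE)
```

Each captured frame is then written to the process's stdin as exactly `width * height * 4` bytes, and the encoded bitstream is read from stdout on a separate thread to avoid pipe deadlock.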
If you don't have a WebRTC client you can do something like webrtc-remote-screen
I am trying to stream content from my browser's webcam implementation to a random RTMP server. I got it working to the point where it sends blobs of WebM (VP8, I believe) encoded bits of video to my server every 2 seconds, but the tricky part is getting it from there to an RTMP server.
A bit of fiddling with FFmpeg showed that it can successfully stream to the server I want to stream to, but so far I have only managed to get it working with regular files. Attempting to stream the blobs is unsuccessful; it simply does not upload anything. The server also only seems to accept MP4 encoded with the H.264 codec.
The question: what is the best way to get the raw video data from my web browser's webcam implementation, encode it with the H.264 codec, and send it to an RTMP server?
Without using a server to convert your blobs to an RTMP stream, the only way is to use Flash. RTMP is an Adobe protocol that no browser supports natively. Another option is WebRTC, which uses the RTP protocol.
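For completeness, the server-side conversion route looks roughly like this: the browser keeps posting MediaRecorder blobs over the WebSocket, and the server feeds each blob into a long-running FFmpeg process that transcodes to H.264 in FLV for RTMP. A sketch only; the ingest URL is a placeholder and `ffmpeg` is assumed to be on PATH.

```python
import subprocess

def relay_cmd(rtmp_url):
    """FFmpeg arguments that transcode WebM piped on stdin to an RTMP stream."""
    return [
        "ffmpeg",
        "-i", "pipe:0",        # concatenated WebM blobs from the browser
        "-c:v", "libx264",     # RTMP servers generally expect H.264
        "-preset", "veryfast",
        "-c:a", "aac",
        "-f", "flv",           # FLV is the container RTMP carries
        rtmp_url,
    ]

def start_relay(rtmp_url):
    """Launch the relay; write each received blob to .stdin as it arrives."""
    return subprocess.Popen(relay_cmd(rtmp_url), stdin=subprocess.PIPE)
```

Note this only works if the blobs come from a single MediaRecorder session, so that concatenating them yields one continuous WebM stream.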
I have installed a Red5 server on Ubuntu 12.04 LTS for live and VOD video streaming. I want to convert my RTMP protocol stream to RTSP and HTTP protocol streams. I have read and searched about FFmpeg, but I did not understand it correctly, so please can anyone guide me. Thanks in advance.
Sample URL:
rtmp://xxxxx.com/live
to
rtsp://xxxxx.com/live and http://xxxxx.com/live
There are Red5 plugins for HLS (HTTP Live Streaming) and RTSP. I don't know how stable they are, so you can try. Here are the links.
https://github.com/Red5/red5-plugins/tree/master/rtspplugin
https://github.com/Red5/red5-hls-plugin
You could write a transcoding application that uses Xuggler and converts your streams on the fly to RTSP (RTMP -> RTSP). That isn't meant to sound simple, but it can be done, as can the other direction (RTSP -> RTMP).
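If you'd rather not write Xuggler code, FFmpeg can do the same repackaging from the command line. Since both sides carry H.264, the bitstream can usually be copied without transcoding. A sketch, assuming your RTSP server accepts ANNOUNCE/RECORD publishing (not all do) and using the sample URLs from the question:

```python
import subprocess

def repackage_cmd(rtmp_in, rtsp_out):
    """FFmpeg arguments that pull an RTMP stream and re-publish it over RTSP."""
    return [
        "ffmpeg",
        "-i", rtmp_in,     # e.g. rtmp://xxxxx.com/live
        "-c", "copy",      # no re-encode, just change the transport/container
        "-f", "rtsp",
        rtsp_out,          # e.g. rtsp://xxxxx.com/live
    ]

def start_repackage(rtmp_in, rtsp_out):
    return subprocess.Popen(repackage_cmd(rtmp_in, rtsp_out))
```

For the HTTP side, the same pull can be segmented to HLS with `-f hls` instead, which is what the red5-hls-plugin does inside the server.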
I want to establish a video stream between a C# application and a Browser.
I'm using WebSockets for the communication.
The Video source is a webcam.
I am able to request single PNG frames but it is slow as hell.
The WebSocket server (Ratchet) is message-based, but is it possible to use VP9 compression or something similar by using some kind of buffer?
WebSockets implement a messaging protocol over sockets, which is not well suited to video. I think a better-suited technology for this is WebRTC.
I have a proprietary library that generates JPEG images at 10-20Hz. I'd like to stream the images as a video stream over the network, so that a remote client (VLC for example) will be able to view it.
The clients are all in a LAN and there are no restrictions on the streaming protocol and the video format. The environment is Windows 7/XP and the library DLL exports a C-only API.
Is there a recommended library that allows streaming image frames injected in real time? The streaming libraries I know of (VLC and Live555) do not allow this, AFAIK.
M-JPEG defines streaming over HTTP by sending individual images. This protocol is understood by VLC.
From Wikipedia:
M-JPEG over HTTP

HTTP streaming separates each image into individual HTTP replies on a specified marker. RTP streaming creates packets of a sequence of JPEG images that can be received by clients such as QuickTime or VLC. The server software mentioned above streams the sequence of JPEGs over HTTP. A special MIME content type, multipart/x-mixed-replace;boundary=, informs the browser to expect several parts as the answer, separated by a special boundary. This boundary is defined within the MIME type. For M-JPEG streams the JPEG data is sent to the client with a correct HTTP header. The TCP connection is not closed as long as the client wants to receive new frames and the server wants to provide new frames. Two basic implementations of such a server are the test server "cambozola" and the webcam server "MJPG-Streamer".

Client software

Browsers such as Safari, Google Chrome and Opera stream M-JPEG natively.
See: http://en.wikipedia.org/wiki/Motion_JPEG#M-JPEG_over_HTTP
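The framing described in the quote is simple enough to sketch directly: one response header announcing the multipart stream, then one part per JPEG frame over the same open connection. The boundary token below is arbitrary; it just has to match between the header and the body.

```python
BOUNDARY = "frame"  # any token, as long as header and body agree

def mjpeg_headers():
    """Response headers announcing an M-JPEG stream."""
    return {
        "Content-Type": f"multipart/x-mixed-replace;boundary={BOUNDARY}",
        "Cache-Control": "no-cache",
    }

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG frame as a multipart part.

    The server sends one of these per new frame and keeps the TCP
    connection open for as long as frames keep coming.
    """
    head = (f"--{BOUNDARY}\r\n"
            "Content-Type: image/jpeg\r\n"
            f"Content-Length: {len(jpeg_bytes)}\r\n\r\n").encode("ascii")
    return head + jpeg_bytes + b"\r\n"
```

Hooked up to the proprietary library's C API via ctypes, each new JPEG at 10-20 Hz becomes one `mjpeg_part` written to every connected client, which VLC and the browsers listed above can display directly.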