I want to run JFugue on an Amazon cloud instance to create a music stream from data on that instance, and then listen to the music stream on my local laptop via some audio-casting system. Is there any way to get JFugue to output a music stream in a form that any audio-casting system can accept as input?
Thanks.
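To make the question concrete, the furthest I can see how to get is rendering a pattern to MIDI bytes and pushing them over a socket. This is only a sketch: it assumes JFugue 5's Player.getSequence(...) plus the standard javax.sound.midi API, and the host/port and the idea that a casting system would accept raw MIDI bytes are placeholders.

    import java.io.OutputStream;
    import java.net.Socket;
    import javax.sound.midi.MidiSystem;
    import javax.sound.midi.Sequence;
    import org.jfugue.pattern.Pattern;
    import org.jfugue.player.Player;

    public class JFugueStreamSketch {
        public static void main(String[] args) throws Exception {
            // Build a pattern from the data on the cloud instance (hypothetical Staccato string)
            Pattern pattern = new Pattern("C D E F G A B");

            // Render the pattern to a javax.sound.midi Sequence without playing it locally
            Sequence sequence = new Player().getSequence(pattern);

            // Push the MIDI bytes over a plain TCP socket; "listener.example.com:9000"
            // stands in for whatever transport the casting setup actually expects.
            try (Socket socket = new Socket("listener.example.com", 9000);
                 OutputStream out = socket.getOutputStream()) {
                MidiSystem.write(sequence, 1, out); // type-1 Standard MIDI File bytes
            }
        }
    }

I realise most casting systems want PCM/MP3 rather than MIDI, so presumably the Sequence would still have to go through a software synthesizer before being cast; that is the part I am unsure about.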
I'm trying to stream the video of my C++ 3D application (similar to streaming a game).
I have encoded an H.264 video stream with the ffmpeg library (i.e. internally to my application) and can push it to a local address, e.g. rtp://127.0.0.1:6666, which can be played by VLC or other player (locally).
I'm not particularly wedded to h.264 at this point, or rtp. I could send as srtp if that would help.
I'd like to use WebRTC to set up a connection across different machines, but can't see in the examples how to make use of this pre-existing stream - the video and audio examples are understandably focused on getting data from devices like connected web cams, or the display.
Is what I'm thinking feasible? Ideally I'd just point WebRTC at my rtp://127.0.0.1:6666 address and that would be the video stream source.
I am writing out an sdp file as well which can be read by VLC, could I use this in a similar way?
As noted in the comment below, there is an example out there that uses Go to weave some magic so that an RTP stream can be shown in a browser via WebRTC.
I am trying to find a more "standard" way to be able to set the source of a video track in webRTC to be the URL of an encoded stream. If there isn't one, that is valuable information to me too, as I can change tack and use a webrtc library to send frames directly.
Unfortunately FFmpeg doesn't support WebRTC output; it lacks support for ICE and DTLS-SRTP.
You will need an RTP -> WebRTC bridge. I wrote rtp-to-webrtc, which can do this, and you can do the same with lots of different WebRTC clients/servers!
If you have a particular language/paradigm you prefer, I'm happy to provide examples for those.
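To illustrate the idea, the ingest half of such a bridge is just reading RTP off the local port your application already publishes to; everything else (ICE, DTLS-SRTP, the actual WebRTC track) is the part the bridge handles for you. A rough sketch in plain Java, assuming the 127.0.0.1:6666 port from the question and packets with no CSRCs or header extensions:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.util.Arrays;

    public class RtpIngestSketch {
        public static void main(String[] args) throws Exception {
            // Listen where the application is already pushing its RTP stream
            try (DatagramSocket socket = new DatagramSocket(6666)) {
                byte[] buf = new byte[1500];
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet);

                    // Minimal RTP parsing: the fixed header is 12 bytes
                    int seq = ((buf[2] & 0xFF) << 8) | (buf[3] & 0xFF);
                    byte[] payload = Arrays.copyOfRange(buf, 12, packet.getLength());

                    // A real bridge would now hand the packet to a WebRTC video track
                    // after the ICE/DTLS-SRTP handshake has completed.
                    System.out.println("seq=" + seq + " payloadBytes=" + payload.length);
                }
            }
        }
    }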
I have a Dahua IP camera with a microphone. I would like to get the audio stream playing on a website like a live radio.
I have a raspberry pi which I was planning to use with ffmpeg, but I haven't had much success in bridging the gap between that and my website to form an audio stream.
Can this be done via SFTP/FTP with MP3 files and some fancy PHP/JavaScript to play like a live radio?
Do I need to use another service? (would like to minimise costs as much as possible!)
Thanks!
Peter
Browsers can't play an RTSP source directly, so you need a streaming server to pull in the RTSP stream and convert it to something suitable for HTML5 playback (HLS/DASH).
This can be done with Wowza SE (the free developer license could be used for 10 subscribers).
You can see how this works with the VideoNow.Live IP Camera Restreaming feature, and you can see how it is done by taking a peek at the open source code of the Live Streaming HTML5 RTSP WP plugin that powers that solution.
I'm really dumb and new to RTP/SIP. Is there a stack that's recommended for uploading video to the cloud from a camera attached to a microprocessor? What's the difference between all the things I'm seeing - MPEG DASH, Live555, ffmpeg, and so on...?
How does WhatsApp or Dropcam transmit live video?
Uploading a video on its own is fairly trivial. If the video has already been captured, you just need to give your app access to the local media store, list the files, and then upload using any standard technology, e.g. HTTP PUT, HTTP POST, FTP, S3, etc. (a minimal upload sketch is included at the end of this answer).
If you want to process the video first, you would be wise to do so via a library such as ffmpeg compiled for Android, e.g. https://trac.ffmpeg.org/wiki/How%20to%20compile%20FFmpeg%20for%20Android
If you want to live stream the video (hence your reference to RTP/SIP, etc.), you will need to access the camera. You could start with something like the Kickflip SDK, which includes a bundled ffmpeg (https://github.com/Kickflip/kickflip-android-sdk), or libstreaming: https://github.com/fyhertz/libstreaming
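As a minimal sketch of the first option (the endpoint URL and file path are placeholders, and a real Android app would run this off the main thread and add authentication):

    import java.io.FileInputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class VideoUploadSketch {
        public static void upload(String filePath) throws Exception {
            URL url = new URL("https://example.com/upload/video.mp4"); // placeholder endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("PUT");
            conn.setRequestProperty("Content-Type", "video/mp4");
            conn.setChunkedStreamingMode(0); // stream the file instead of buffering it all in memory

            try (FileInputStream in = new FileInputStream(filePath);
                 OutputStream out = conn.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
            System.out.println("Server responded: " + conn.getResponseCode());
        }
    }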
The goal is that a user sends a stream of images to a server, and the server forwards those to a media server so they can be shown to clients as a continuous live video.
Following are my thoughts on implementing this; kindly tell me if they are OK.
Use a lightweight RTMP server to accept the stream of images from a user (please suggest whether this is even possible via RTMP, and whether it can be done easily and efficiently some other way).
Use ffmpeg to take the RTMP (or other) URL as input and send those images to ffserver for streaming. (I am also confused here: if ffserver is fed images continuously, can it show them as video for as long as the images keep coming? A rough ffmpeg-piping sketch of what I imagine follows below.)
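To make the second step concrete, this is roughly what I imagine: piping the incoming images straight into an ffmpeg process that publishes an RTMP stream. It assumes ffmpeg is on the PATH, that the images arrive as JPEG byte arrays, and that rtmp://localhost/live/stream is a reachable publishing point; nextImageFromUser() is just a placeholder for however the images actually arrive.

    import java.io.OutputStream;
    import java.util.List;

    public class ImagesToRtmpSketch {
        public static void main(String[] args) throws Exception {
            // Launch ffmpeg reading JPEG frames from stdin and publishing them as an RTMP stream.
            // The RTMP URL and the 2 fps input rate are assumptions for illustration.
            Process ffmpeg = new ProcessBuilder(List.of(
                    "ffmpeg",
                    "-f", "image2pipe", "-c:v", "mjpeg", "-framerate", "2", "-i", "-",
                    "-c:v", "libx264", "-pix_fmt", "yuv420p",
                    "-f", "flv", "rtmp://localhost/live/stream"))
                .redirectErrorStream(true)
                .start();

            try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
                while (true) {
                    byte[] jpegBytes = nextImageFromUser(); // placeholder for the incoming image stream
                    if (jpegBytes == null) break;
                    toFfmpeg.write(jpegBytes); // ffmpeg keeps encoding for as long as frames keep coming
                }
            }
            ffmpeg.waitFor();
        }

        // Placeholder: however the user's images actually reach the server.
        private static byte[] nextImageFromUser() {
            return null;
        }
    }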
I think an easier solution would be to use an RTMP live encoder that is capable of streaming a slideshow. This can be achieved by playing the slideshow in Windows Photo Viewer and then setting the live encoder to capture the Windows Photo Viewer output and stream it as live. Two RTMP encoders that should be able to do this are OBS (Open Broadcaster Software) and XSplit. Another (free) solution is to use Adobe Flash Media Live Encoder in combination with a piece of software called ManyCam. ManyCam can capture a feed from video, images, etc. and feed it to FMLE through a driver. Install ManyCam and create a slideshow of images in the playlist option, then start FMLE and select the ManyCam driver. You can now stream the slideshow as live to an RTMP server.
Is it possible to create a live audio stream based on other audio streams? I'm thinking of a proxy that gets two audio streams (e.g. Shoutcast streams) and, based on time, switches to one of them. And, if possible, to leave some time for analysis, I would implement some kind of caching so that I can stream the newly created stream time-shifted.
I already had a look at the Shoutcast server but couldn't figure out how to configure the input source to be another stream. Maybe there are other projects that can handle this through an interface.
The programming language doesn't really matter, but Ruby is preferred.
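To make the switching idea concrete, here is a very rough sketch (shown in Java purely for illustration; the two URLs, the 30-second interval, and writing to System.out as the outgoing stream are placeholders, and a real relay would also need to handle ICY metadata and cut on MP3 frame boundaries):

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URL;

    public class StreamSwitchSketch {
        public static void main(String[] args) throws Exception {
            // Two upstream streams to switch between (placeholder URLs)
            String[] sources = {
                "http://source-one.example.com:8000/stream",
                "http://source-two.example.com:8000/stream"
            };
            OutputStream listener = System.out; // stand-in for the re-streamed output
            long switchEveryMs = 30_000;        // assumed switching interval

            int current = 0;
            while (true) {
                long switchAt = System.currentTimeMillis() + switchEveryMs;
                try (InputStream in = new URL(sources[current]).openStream()) {
                    byte[] buf = new byte[4096];
                    int n;
                    // Relay the active source until it is time to switch
                    while (System.currentTimeMillis() < switchAt && (n = in.read(buf)) != -1) {
                        listener.write(buf, 0, n);
                    }
                }
                current = (current + 1) % sources.length; // flip to the other source
            }
        }
    }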
I made my own solution with JavaScript based on Node.js. You can clone the repo (a personal radio app) at https://github.com/23tux/personal_node_radio.