Streaming a stream from website to Twitch with FFMPEG - ffmpeg

Is it possible to stream a livestream from a website (e.g. an IP camera) to Twitch via FFmpeg? If yes, does anybody know how to achieve this?

Yes. FFmpeg has built-in support for RTMP (the protocol you'll use to send your video data to Twitch), FLV (the container that wraps your audio and video data), H.264 (the video codec), and AAC (the audio codec).
First, find your RTMP ingest URL: https://stream.twitch.tv/ingests/
Now, just run FFmpeg as you normally would to ingest your input, but set some additional parameters for the output:
ffmpeg [your input parameters] -vcodec libx264 -b:v 5M -acodec aac -b:a 256k -f flv [your RTMP URL]
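For example, a sketch of restreaming an IP camera's RTSP feed to Twitch might look like the line below (the camera URL, the ingest host, and YOUR_STREAM_KEY are placeholders to replace with your own values):
ffmpeg -rtsp_transport tcp -i rtsp://your-camera-address/stream -c:v libx264 -preset veryfast -g 60 -b:v 3M -c:a aac -b:a 160k -ar 44100 -f flv rtmp://live.twitch.tv/app/YOUR_STREAM_KEY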

Related

FFmpeg efficient capture from rtsp ipcamera

I need to capture an audio/video RTSP stream from an IP camera to a file without re-encoding. Audio (pcm_alaw) and video (h264) must stay synchronized. It is also necessary that the file does not get corrupted if the camera loses the connection for a few moments (as happens with mp4).
At the moment I use the command below, but the MPEG-TS muxer does not support pcm_alaw and therefore the audio is not heard:
ffmpeg -stimeout 2000000 -rtsp_transport tcp -i rtsp://admin:1234@192.168.5.22/h264 -c:v copy -c:a copy -f mpegts -y main.ts
I use the MPEG-TS container because I need to check the duration of the capture in real time with this command:
ffprobe -i /home/pi/NAS/main.mov -show_entries format=duration -v quiet -of csv="p=0"
If I use mkv or avi, its output would be:
N/A
Checking the duration is important because I capture files of about 3 hours and, at my choice, process some of the data while the capture is still in progress. I prefer not to compress the audio because I have often noticed it drifting out of sync with the video when cutting.
Thank you.
Instead of -c:a copy you can use -c:a aac or -c:a mp3 to convert the audio stream before you save it.
MPEG-TS with h264 video is only compatible with mp3 or aac audio, not pcm_alaw (source).
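For instance, a sketch of the capture command from the question with the audio re-encoded to AAC (the 128k bitrate is an assumption) would be:
ffmpeg -stimeout 2000000 -rtsp_transport tcp -i rtsp://admin:1234@192.168.5.22/h264 -c:v copy -c:a aac -b:a 128k -f mpegts -y main.ts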

How can we transcode live rtmp stream to live hls stream using ffmpeg?

I am trying to convert a live RTMP stream to an HLS stream in real time.
I got some ideas after reading
http://sonnati.wordpress.com/2011/08/30/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-iv/
I am able to convert the live RTMP stream to HLS, but not at run time: while the command is running I cannot see any HLS files (.m3u8 and .ts), yet when I interrupt the command and check again, the HLS files are there as required.
I searched Google for a solution but could not find a proper answer.
This is a short guide for HLS streaming with any input file or stream:
I am following user1390208's approach, so I use FFmpeg only to produce the RTMP stream, which my server then receives to provide HLS. Instead of Unreal/Wowza/Adobe, I use the free nginx server with the RTMP module, which is quite easy to set up. This is how I do it in short: any input file or stream -> ffmpeg -> rtmp -> nginx server -> HLS -> client. In more detail:
input video file or stream (http, rtmp, whatever) --> ffmpeg transcodes live to H.264 + AAC and outputs to RTMP --> nginx takes the RTMP and serves HLS to the user (client).
So on the client side you can use VLC or whatever and connect to the .m3u8 file which is provided by nginx.
I followed this setup guide for nginx.
This is my nginx config file.
This is how I use ffmpeg to transcode my input file to rtmp:
ffmpeg -re -i mydirectory/myfile.mkv -c:v libx264 -b:v 5M -pix_fmt yuv420p -c:a:0 libfdk_aac -b:a:0 480k -f flv rtmp://localhost:12345/hls/mystream;
(the .mkv is 1080p with 5.1 sound; depending on your input, you should use lower bitrates!)
Where do you get the rtmp stream from?
A file? Then you can use exactly my approach.
Any server X with a stream Y? Then you have to change the ffmpeg command to:
ffmpeg -re -i rtmp://theServerX/yourStreamY -c:v libx264 -b:v 5M -pix_fmt yuv420p -c:a:0 libfdk_aac -b:a:0 480k -f flv rtmp://localhost:12345/hls/mystream;
or if your rtmp stream is already h.264/aac encoded, you could try to use the copy option in ffmpeg to stream the content directly to nginx.
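For example, a minimal sketch of that copy variant (assuming the incoming stream really is H.264/AAC already) could be:
ffmpeg -i rtmp://theServerX/yourStreamY -c:v copy -c:a copy -f flv rtmp://localhost:12345/hls/mystream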
As you see in my nginx config file:
My rtmp server has an "application" called "hls". That's the part that describes where nginx listens to ffmpeg's rtmp stream and that's why ffmpeg streams to rtmp://localhost:12345/hls/mystream;
My http server has the location /hls. This means in VLC I can connect to http://myServer:80/hls/mystream.m3u8 to access the HLS stream.
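For reference, a minimal sketch of what such an nginx.conf fragment might contain (the port, HLS path, and fragment length are assumptions; the nginx RTMP module must be installed):
rtmp {
    server {
        listen 12345;
        application hls {
            live on;
            hls on;                # package the incoming RTMP stream as HLS
            hls_path /tmp/hls;     # where the .m3u8 playlist and .ts segments are written
            hls_fragment 5s;
        }
    }
}
http {
    server {
        listen 80;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;             # so /hls/mystream.m3u8 maps to /tmp/hls/mystream.m3u8
            add_header Cache-Control no-cache;
        }
    }
}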
Is everything clear? Happy streaming!
Try these RTMP-to-HLS command-line settings:
ffmpeg -v verbose -i rtmp://<host>:<port>/<stream> -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 <pathToFolderYouWantTo>/<streamName>.m3u8
There might be some delay in the HLS feed. However, it'll work.
As an update to this question: I've managed to get the live transcoding from RTMP to HLS working without using ffmpeg at all. How?
Just by using the exact same nginx config file shared by user3069376 and being very careful about the paths where the .m3u8 manifest is generated; the hls option within the RTMP module takes care of it.
As for the video player, Video.js worked like a charm.
If you already have the RTMP live stream ready and being served as HLS, then you can simply append .m3u8 to the stream name and change the rtmp link to http. For example, if you have an RTMP link like this:
rtmp://XY.Y.ZX.Z/hls/chid
you just have to make the URL like this:
http://XY.Y.ZX.Z/hls/chid.m3u8
and it will play smoothly in iOS. I have tried the following code and it works fine.
// Inside a UIViewController that imports AVFoundation and has a `player: AVPlayer?` property.
func setPlayer() {
    // The RTMP URL rtmp://XY.Y.ZX.Z/hls/chid is rewritten as http://XY.Y.ZX.Z/hls/chid.m3u8 so it plays as HLS.
    let videoURL = URL(string: "http://XY.Y.ZX.Z/hls/chid.m3u8")
    let playerItem = AVPlayerItem(url: videoURL!)

    // Optionally collect timed metadata (HLS date-range tags) from the stream.
    let adID = AVMetadataItem.identifier(forKey: "X-TITLE", keySpace: .hlsDateRange)
    let metadataCollector = AVPlayerItemMetadataCollector(identifiers: [adID!.rawValue], classifyingLabels: nil)
    //metadataCollector.setDelegate(self, queue: DispatchQueue.main)
    playerItem.add(metadataCollector)

    // Attach the player to the view and start playback.
    let player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    self.player = player
    player.play()
}
But it may be slow and laggy if the uploaded video stream has a high resolution. If you lower the resolution when uploading the video stream, it will play smoothly even on a low-bandwidth network.
Please note: this is not done with FFmpeg; we already have the RTMP stream being produced by FFmpeg, so I handled playback this way.
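As an illustration of the lower-resolution idea, a sketch of pushing a downscaled, lower-bitrate stream (the input name, scale, bitrates, and RTMP URL here are assumptions) might be:
ffmpeg -re -i input.mp4 -c:v libx264 -vf scale=-2:480 -b:v 800k -c:a aac -b:a 96k -f flv rtmp://XY.Y.ZX.Z/hls/chid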

using ffmpeg for live streaming MPEG-TS with windows media services

I'm trying to live stream MPEG-TS source to Windows Media service.
I found how to live stream using RTMP with this code:
ffmpeg -y -f mpegts -i udp://@:1234 -re -vcodec libx264 -maxrate 700k -r 25 -s 640x360 -deinterlace -acodec libvo_aacenc -ab 64k -ac 1 -ar 44100 -f flv "rtmp://rtmp1.youtube.com/videolive?sparams=<STREAM PARAMETERS HERE>"
How can I convert it to support WM9/VC1 format?
Does ffmpeg support pulling of the stream or only pushing to Windows Media services?
Windows Media Services can receive data only from Windows Media Encoder.
It can't work with RTMP, which is Flash technology.
If your input stream is a file or a capture device, then you can use "Windows Live Media Encoder".
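Note that FFmpeg has no WMV9/VC-1 encoder, so it cannot produce WM9/VC1 output directly. As a rough sketch (the bitrates and output name are assumptions), the closest it gets is transcoding the MPEG-TS input to the older WMV2/WMA codecs in an ASF file, which Windows Media tools can then ingest:
ffmpeg -f mpegts -i udp://@:1234 -c:v wmv2 -b:v 700k -s 640x360 -c:a wmav2 -b:a 64k -ac 1 -ar 44100 -f asf output.asf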

FMS FLV to mp3/aac/wav

How can I decode an FLV's audio if it was recorded from a live stream using Flash Media Server and uses the Nellymoser codec?
I'm writing a script that processes several FLVs using FFmpeg, so I need a command-line solution.
Any ideas?
This should work for you, since NellyMoser is supported by FFmpeg.
1. Using MP3:
ffmpeg -i yourinput.flv -vn -acodec libmp3lame output.mp3
2. Using AAC (swap libfaac for aac depending on which encoder your build includes):
ffmpeg -i yourinput.flv -vn -acodec libfaac output.mp4
I'm assuming, of course, that you don't care about the video.
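If you also need the WAV option from the title, a sketch along the same lines (decoding the audio to uncompressed 16-bit PCM) would be:
ffmpeg -i yourinput.flv -vn -acodec pcm_s16le output.wav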

DSS won't stream 3GP's encoded with FFMPEG

I have setup Darwin Streaming Server which streams the included sample 3GP files pretty well. However when I try to encode my own 3GPs using FFMPEG and stream them, it doesn't work. (I test the streaming using VLC Player).
Here is the command I use for encoding:
ffmpeg -i test.flv -acodec aac -ar 22050 -ab 128kb -vcodec mpeg4 -b 256k -mbd 2 -flags +4mv+aic -trellis 1 -cmp 1 -y test.3gp
Do I need a special way to encode the 3gp's for streaming?
DSS has no idea about file content, so you have to "hint" the media file (i.e. give the server information about how to stream it). You can use MP4Box for hinting - http://gpac.sourceforge.net/doc_mp4box.php
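For example, a minimal hinting step for the file encoded in the question (assuming MP4Box from GPAC is installed) might be:
MP4Box -hint test.3gp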
On the following link you can find how to configure a DSS server
http://www.howtoforge.com/apples-darwin-streaming-server-on-centos-5.2
The second page describes how to convert files with FFMPEG and MP4Creator and hint the video so it can be streamed. The link can be found at the bottom of the page, before the comments.
