How to get continuous live streaming without buffering in Azure Media Player using FFmpeg (latency is not an issue)? - ffmpeg

I am streaming from an IP camera which uses the RTSP protocol, and ingesting the feed over RTMP (to Azure Media Services) using the following command:
ffmpeg -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://CloudAppUser:admin@192.168.8.145/MediaInput/h264/stream_1 -vcodec libx264 -t 12:00:00 -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv rtmp://channel1-cloudstream-inso.channel.media.azure.net:1934/live/980b582afc12e421b85b4jifd8e8662b/df
I am able to watch the stream, but it buffers once every 30 seconds, and I want to know the reason behind this buffering.
Could anyone please adjust this command so that it does not buffer?
I am executing this command from my terminal.
I would like to watch my live stream in Azure Media Player without any buffering; latency below 1 minute is not an issue.

As documented here, when on-premises encoders are set up to push a contribution feed into a Channel, we recommend that these encoders use fixed 2-second GOPs. If your IP camera is not sending 2-second GOPs, you'd have to modify the ffmpeg command line to re-encode the input video bitstream rather than just copy it. If that doesn't help, we recommend contacting us via amshelp@microsoft.com with the (output) stream URL and other details such as the Media Services account name, region used, and the date/time/timezone at which you attempted to stream the feed.
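As a sketch (not a verified command; the camera URL, credentials, and ingest URL below are placeholders), re-encoding the feed with fixed 2-second GOPs at 25 fps might look like:
ffmpeg -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://user:password@camera-address/MediaInput/h264/stream_1 -c:v libx264 -preset veryfast -r 25 -g 50 -keyint_min 50 -sc_threshold 0 -pix_fmt yuv420p -c:a aac -f flv rtmp://your-channel-ingest-url/live/your-stream-id
Here -g 50 at -r 25 yields a keyframe every 2 seconds, and -sc_threshold 0 disables scene-cut keyframes so the GOP size stays fixed.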

Related

FFmpeg and ffplay can access an RTSP stream from one IP, but not from another

The situation is somewhat complex. I was archiving several CCTV camera feeds (RTSP, H.264, no audio) through OpenCV, which worked, but CPU utilization was too high and it started to lose frames from time to time.
To reduce CPU utilization, I started using FFmpeg to skip the decoding and encoding steps, which worked perfectly on my home machine. However, when I connected to my university VPN and tried to deploy it on our lab server, FFmpeg couldn't read any frames and ffplay couldn't get anything either. OpenCV, VLC and IINA, on the other hand, could still read and display the feed.
In summary:
1. FFmpeg/ffplay
1.1 can only read the feed from my home network (Wi-Fi, Optimum);
1.2 from the other two networks, the error message says: "Could not find codec parameters for stream 0 (Video: h264, none): unspecified size. Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options"
2. IINA/VLC and OpenCV can get the video all the time.
I'm wondering whether it's related to some specific port access that ffmpeg requires but the others don't. I'd appreciate it if anyone can provide any suggestions.
For reference, the tested ffplay command is simply:
ffplay 'the rtsp address'
Thanks
Update
More tests have been performed.
By specifying rtsp_transport as TCP, ffplay can play the video, but FFmpeg still can't access it. (In the beginning, when both FFmpeg and ffplay worked through my home network, the transport was UDP.)
The FFmpeg command is as follows:
ffmpeg -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 -rtsp_transport tcp "%Y-%m-%d-%H-%M-%S_Test.mp4"
Please help...
Solved by forcing it to use "-rtsp_transport tcp" right before -i. FFmpeg applies options placed before -i to that input; in the command above the flag came after the input, so it was treated as an output option and the RTSP connection still used UDP.
ffmpeg -rtsp_transport tcp -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 "%Y-%m-%d-%H-%M-%S_Test.mp4"

FFmpeg CLI - swap RTMP source using ZMQ (zmqsend)

My setup is as follows:
Nginx with the RTMP module
Multiple RTMP stream pairs, each one with a primary and backup RTMP endpoint (so streaming to rtmp://localhost/main/$STREAM_NAME and rtmp://localhost/backup/$STREAM_NAME)
Using the Nginx RTMP module exec_publish and exec_publish_done hooks, I push either main or backup to an FFmpeg CLI proc that restreams it to a remote RTMP endpoint (Wowza server in this case, though it's not very relevant to my question)
My problem is that currently, if the main stream is stopped, I have to stop the FFmpeg CLI process that restreams to Wowza and start another one with a new input source (the backup stream). This often causes issues on the Wowza side, so I'm looking for a way to avoid that.
After some research, I found that FFmpeg has ZMQ support, but the documentation seems very sparse. Is it possible to send a message to the running FFmpeg process to alert it that it must change its source to a different RTMP stream?
Thanks a lot,
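(For reference: FFmpeg's ZMQ support lives at the filter level, not the input level, so it can change filter parameters at runtime but cannot swap inputs. A minimal sketch of what it can do, assuming an FFmpeg build with libzmq; the URLs are placeholders and drawtext is used purely for illustration:
ffmpeg -i rtmp://localhost/main/stream1 -vf "zmq,drawtext=text='main'" -c:v libx264 -c:a aac -f flv rtmp://remote-host/app/stream1
echo "Parsed_drawtext_1 reinit text='backup'" | tools/zmqsend
The second line uses the zmqsend tool from FFmpeg's tools/ directory to update the drawtext parameters while the process runs.)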
In case it's of interest to anyone, I solved my problem in a different way.
I now use named pipes, like so:
PIPE_FILE=/path/to/pipe/file
mkfifo $PIPE_FILE
exec 7<>$PIPE_FILE
ffmpeg -nostdin -i /path/to/source -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > $PIPE_FILE
/path/to/source can be a media file on the FS or an RTMP stream, for that matter.
I then re-stream from the pipe to the final RTMP endpoint:
ffmpeg -re -i $PIPE_FILE -c:v libx264 -preset veryfast -r 25 -g 50 -f flv $RTMP_ENDPOINT
When $PIPE_FILE stops receiving data (i.e., when streaming stops or, in the case of sending data from a local media file, when EOF is reached), I immediately launch a different FFmpeg CLI proc and feed the pipe data from the backup media file/stream.
That keeps the re-streaming FFmpeg CLI proc continuously up and running.
Interesting approach. I've got something similar, except that instead of the pipe I'm using another local RTMP destination.
I've got an nginx RTMP setup with 3 apps: the main app, the backup app, and the distribute app.
So I send the main stream to the main app from my streaming software.
I have a ffmpeg process running:
ffmpeg -i rtmp://127.0.0.1/main/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
If this process breaks due to the input shutting down, I run a similar command to pull input from the backup:
ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
From my distribute app I stream to my external outputs.
The only issue here is that I get the non-monotonous DTS error after the switch, so I've had to add a few flags when streaming from distribute to my outputs. The command is:
ffmpeg -fflags +genpts+igndts+ignidx -avoid_negative_ts make_zero -use_wallclock_as_timestamps 1 -i rtmp://127.0.0.1/distribute/stream1 -c:v libx264 -preset veryfast -r 25 -g 50 -c:a aac -b:a 128k -f flv $RTMP_ENDPOINT
I've noticed that I get some warnings in the ffmpeg process when I switch if the main and backup streams come in with different H.264 profiles, say one on High and the other on Baseline or Main.
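One possible mitigation (my suggestion, not something from this thread): normalize the backup ingest so the distribute app always sees a single profile, e.g. by re-encoding it instead of copying:
ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c:v libx264 -profile:v high -preset veryfast -c:a copy -f flv rtmp://127.0.0.1/distribute/stream1
This trades some CPU for consistent stream parameters across the switch.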

Wowza RTMP to RTSP

We're using a Raspicam to stream live video to a client using Wowza Streaming Engine. We use FFmpeg to encode and send to Wowza, and we can successfully watch the RTMP stream using JWPlayer. This is the FFmpeg command used:
ffmpeg -t 0 -s 320x240 -f video4linux2 -i /dev/video0 -b:v 250k -tune zerolatency -preset ultrafast -f flv -r 15 rtmp://wowzaaddress:1935/live/live
To be able to watch the stream on mobile devices, we want to use RTSP or HLS, which Wowza provides links for. However, when we use those links (provided on the Test Player) nothing happens, and we can't even open them to test in VLC, for example. We have already opened the extra ports on our web server to see if that was the problem, and it still didn't work.
Does anybody know what the problem is or what could we be doing wrong?

How can we transcode a live RTMP stream to a live HLS stream using ffmpeg?

I am trying to convert a live RTMP stream to an HLS stream in real time.
I got some ideas after reading
http://sonnati.wordpress.com/2011/08/30/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-iv/
I am able to convert the live RTMP stream to HLS, but not at run time: while the command is running I can't see any HLS files (.m3u8 and .ts), but when I interrupt the command and check, the HLS files are there as required.
I searched Google for a solution but couldn't find a proper answer.
This is a short guide for HLS streaming with any input file or stream:
I am following user1390208's approach, so I use FFmpeg only to produce the RTMP stream, which my server then receives to provide HLS. Instead of Unreal/Wowza/Adobe, I use the free nginx server with the RTMP module, which is quite easy to set up. In short, this is how I do it: any input file or stream -> ffmpeg -> rtmp -> nginx server -> HLS -> client, or in more detail:
input video file or stream (http, rtmp, whatever) --> ffmpeg transcodes live to x.264 + aac, outputs to rtmp --> nginx takes the rtmp and serves a HLS to the user (client).
So on the client side you can use VLC or whatever and connect to the .m3u8 file which is provided by nginx.
I followed this setup guide for nginx.
This is my nginx config file.
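That config isn't reproduced here, but a minimal sketch of the relevant nginx-rtmp parts (the port, paths, and fragment length are my assumptions) would look like:
rtmp {
    server {
        listen 12345;
        application hls {
            live on;
            hls on;
            hls_path /tmp/hls;
            hls_fragment 5s;
        }
    }
}
http {
    server {
        listen 80;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
        }
    }
}
With this layout, /hls/mystream.m3u8 over HTTP resolves to the segments that the rtmp application writes under /tmp/hls.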
This is how I use ffmpeg to transcode my input file to rtmp:
ffmpeg -re -i mydirectory/myfile.mkv -c:v libx264 -b:v 5M -pix_fmt yuv420p -c:a:0 libfdk_aac -b:a:0 480k -f flv rtmp://localhost:12345/hls/mystream;
(the .mkv is 1080p with 5.1 sound; depending on your input, you should use lower bitrates!)
Where do you get the rtmp stream from?
A file? Then you can use exactly my approach.
Any server X with a stream Y? Then you have to change the ffmpeg command to:
ffmpeg -re -i rtmp://theServerX/yourStreamY -c:v libx264 -b:v 5M -pix_fmt yuv420p -c:a:0 libfdk_aac -b:a:0 480k -f flv rtmp://localhost:12345/hls/mystream;
or, if your RTMP stream is already H.264/AAC encoded, you could try to use the copy option in ffmpeg to stream the content directly to nginx.
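A copy variant might look like this (a sketch; it assumes the source really is H.264/AAC, otherwise the HLS segmenter can't handle it):
ffmpeg -i rtmp://theServerX/yourStreamY -c:v copy -c:a copy -f flv rtmp://localhost:12345/hls/mystream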
As you see in my nginx config file:
My RTMP server has an "application" called "hls". That's the part that describes where nginx listens for ffmpeg's RTMP stream, and that's why ffmpeg streams to rtmp://localhost:12345/hls/mystream;
My http server has the location /hls. This means in VLC I can connect to http://myServer:80/hls/mystream.m3u8 to access the HLS stream.
Is everything clear? Happy streaming!
Try this RTMP to HLS command line settings:
ffmpeg -v verbose -i rtmp://<host>:<port>/<stream> -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 <pathToFolderYouWantTo>/<streamName>.m3u8
There might be some delay in the HLS feed. However, it'll work.
As an update to this question, I've managed to complete the live transcoding from RTMP to HLS without using ffmpeg. How?
Simply by using the exact same nginx config file shared by user3069376 and being very careful about the paths where the .m3u8 manifest is generated; the hls option within the RTMP module takes care of it.
As for the video player, Video.js worked like a charm.
If you already have the RTMP live stream ready and playing as HLS, then you can simply append .m3u8 to the stream name and turn the RTMP link into an HTTP one. For example, if you have an RTMP link like this:
rtmp://XY.Y.ZX.Z/hls/chid
you just have to make the URL look like this:
http://XY.Y.ZX.Z/hls/chid.m3u8
and it will play smoothly in iOS. I have tried the following code and it works fine.
func setPlayer() {
    // Runs inside a UIViewController; requires AVFoundation.
    // The RTMP URL rtmp://XY.Y.ZX.Z/hls/chid, rewritten as http://XY.Y.ZX.Z/hls/chid.m3u8, plays normally.
    let videoURL = URL(string: "http://XY.Y.ZX.Z/hls/chid.m3u8")
    let playerItem = AVPlayerItem(url: videoURL!)
    let adID = AVMetadataItem.identifier(forKey: "X-TITLE", keySpace: .hlsDateRange)
    let metadataCollector = AVPlayerItemMetadataCollector(identifiers: [adID!.rawValue], classifyingLabels: nil)
    //metadataCollector.setDelegate(self, queue: DispatchQueue.main)
    playerItem.add(metadataCollector)
    let player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    self.player = player
    player.play()
}
But it may be slow and laggy because a high-resolution video stream is being uploaded. If you lower the resolution when uploading the video stream, it will work smoothly on low-bandwidth networks as well.
Please note: this is not done via FFmpeg; we already have the RTMP stream running through FFmpeg, which is why I did it this way.

Wowza error: Failed to play myStream; stream not found.

I am using ffmpeg to encode a video which will then be restreamed using Wowza. I am new to streaming. First I started Wowza using the command:
/etc/init.d/WowzaMediaServer start
After that I started streaming an MP4 file using the RTSP protocol, with the command:
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
The video starts streaming. Before all of this, I changed admin.password and added the username myuser with the password mypassword. When I run the above command it streams, but then the instructions say to go to
WowzaMediaServer/examples/LiveVideoStreaming/FlashRTMPPlayer/Player.html
fill the Server field with rtmp://localhost:1935/live
and the Stream field with myStream.
When I click Connect, it gives me the status
"Failed to play myStream; stream not found."
I am following this article: http://www.wowza.com/forums/content.php?354-How-to-set-up-live-streaming-using-an-RTSP-RTP-based-encoder
Where I am going wrong, I don't know; I am unable to figure it out, and I am not getting satisfactory answers from the Wowza support team. So if anyone works with Wowza, please help! Why am I not able to connect my video stream to Wowza? Please respond, I'm stuck badly.
So it appears there are some basic issues with the RTSP feed from ffmpeg, and then nothing matches the play request.
You have
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
You need to make sure your ffmpeg build has the libx264 and libfdk_aac libraries available. You should be able to determine this with just
ffmpeg
which prints out the build configuration and available libraries.
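A more targeted check (my suggestion, beyond the original answer) is to list the encoders and filter for the two you need:
ffmpeg -encoders | grep -E 'libx264|libfdk_aac'
If either line is missing, your build was compiled without that library.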
If you have all the libraries, then you are publishing a stream called
myStream.sdp
You then have instructions that say
and fill server with rtmp://localhost:1935/live
and Stream field with myStream
So you should either change your ffmpeg command to
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream
(note there is no .sdp in the stream name any more), or use a Stream field in the player of
myStream.sdp
When publishing a stream and then attempting to play it back, the names must match; otherwise you get back Stream Not Found.
One way to successfully do this is to specify only a port number (65000 in this example, making sure it isn't 1935) and the server in your ffmpeg command, then create a mystream.stream file in the content directory of your Wowza server with ONLY the following line:
udp://0.0.0.0:65000
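On the publishing side, a sketch of a matching contribution command (wowza-host is a placeholder; this assumes an MPEG-TS-over-UDP feed, which the rtp MediaCaster can ingest):
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -vcodec libx264 -acodec libfdk_aac -f mpegts udp://wowza-host:65000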
Then, in Wowza/conf/startupstreams.xml, add the following:
<!-- Native RTP example (SDP file is myStream.sdp) -->
<StartupStream>
    <Application>live/_definst_</Application>
    <MediaCasterType>rtp</MediaCasterType>
    <StreamName>mystream.stream</StreamName>
</StartupStream>
Restart Wowza and ffmpeg, and then retry your URL with the stream name mystream.stream.
