I'm using ffmpeg to convert an RTSP stream (from a security camera) into an HLS stream, which I then play on a website using hls.js.
I start the transmuxing with: ffmpeg -i rtsp:<stream> -fflags flush_packets -max_delay 1 -an -flags -global_header -hls_time 1 -hls_list_size 3 -hls_wrap 3 -vcodec copy -y <file>.m3u8
I can get the stream to play, but the quality isn't good at all: sometimes the stream jumps in time or freezes for a while. If I open it in VLC I get the same kind of problems.
Any idea why? Or how can I stabilize it?
I've had a similar issue once, and it turned out to be insufficient bandwidth — whether in the camera's uplink, the connection to the server, or elsewhere. In my case, I had a bandwidth limit set as an FFmpeg argument that I simply had to increase. I also know that very low frame rates set on the camera can cause oddities, where you may have to add the "-framerate (Frames Per Second)" argument (no quotes) depending on how the page is set up.
If it is a bandwidth issue, the only fix I'm aware of is to increase the bandwidth somehow, or to make sure you aren't limiting it yourself somewhere, which comes down to exactly how you are hosting the website/server and verifying speeds at each hop as best you can. If you can't find the bottleneck yourself or need additional help, comment and I will help further.
This is an old question, so I don't know if the OP will see this, but I'll leave it here as something to troubleshoot for anyone else hitting the same or a similar issue, since this is what helped me on a very similar problem.
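As a hedged sketch of the kind of adjustments described above (the stream URL, frame rate, and segment values are placeholders, not tested against the OP's camera), forcing TCP transport and declaring the input frame rate explicitly might look like:

```shell
# Placeholders throughout: stream URL, frame rate, and segment length are assumptions.
# -rtsp_transport tcp avoids UDP packet loss on a constrained link;
# -r before -i tells ffmpeg the input frame rate, for cameras that report it badly.
ffmpeg -rtsp_transport tcp -r 15 -i rtsp://camera.example/stream \
  -an -vcodec copy -hls_time 2 -hls_list_size 5 out.m3u8
```

Whether this helps depends on where the bandwidth bottleneck actually is; it only addresses the camera-to-ffmpeg leg, not the server-to-browser leg.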
Alright, real simple here. I'm rendering some fractal flames I've created over the years, which makes the math on all of this really simple... lol.
I'm trying to generate a 5 second video at 60fps that when played continuously makes a perfect loop.
So I sequence and render exactly 300 frames, numbered 000.png through 299.png, for one loop. I then feed this into FFmpeg with the following command:
ffmpeg -f image2 -framerate 60 -start_number 0 -i '%03d.png' -r 60 -crf 10 output.webm
No matter what, it drops the last 12-18 frames (depending on the run) and creates a video that players recognize as only 4 seconds long.
Here is a snippet of the processing output (take note that the result comes out at 04.66 seconds no matter what you do, even though 300 frames at 60 fps should be exactly 5 seconds — and it does claim there are exactly 5 seconds on the input side):
I have tried replacing the -crf setting with just -quality good. I have tried moving around where I state the frame rate. I have tried removing the -r from the output and putting it on the input instead. I have tried building out this call to be as specific as possible, strictly specifying the encoder and its options. I have tried other encoders and get the same result. I have even tried -hwaccel with NVENC and CUVID respectively.
Nothing I do works..
Any thoughts here? Maybe alternatives to FFmpeg? Maybe different versions of FFmpeg? I don't know what I should do next and thought I would ask.
Diagnostic output on a finished file, for reference; this one actually got close, with 294 frames and a 4.9-second runtime (it is much higher res, though):
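One way to narrow this down is to check whether frames are actually missing from the file, as opposed to players mis-reporting the duration. Counting decoded frames with ffprobe does that (the filename here matches the command above, but adjust as needed):

```shell
# Decode the whole video stream and count the frames (slow but exact).
ffprobe -v error -count_frames -select_streams v:0 \
  -show_entries stream=nb_read_frames -of csv=p=0 output.webm
```

If this prints 300, the frames are all present and only the container's duration metadata is off; if it prints fewer, frames really were dropped during encoding and the problem is on the input/encode side.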
Here's the setup: I have an Acer Aspire 5 (mobile Ryzen 3, integrated graphics) running Manjaro (don't ask), a 4Kp24/1080p60 mirrorless camera, and an Elgato Cam Link 4K USB dongle.
Understandably, the Acer cannot encode very well at all... so to avoid the mess of crummy compression... I'm not compressing it! (Well I sort of want to, but yeah...)
Right now, my laptop can handle writing the stream somewhat, but it often skips around...
What I need to know:
How do I drop every other frame? (Using -c copy doesn't allow filters, and -r 30 doesn't do anything.)
How do I limit the bitrate? The Cam Link stream is 200 Mbit/s and I need to record for at least 50 minutes, so I'm concerned about file size — but if it will fit reasonably in 80 GiB of space, then whatever. (Just setting -b:v 25M seemed like it was working? Though I'm not sure of the implications of that.)
Is it at all possible to use FFmpeg to stream the capture across Ethernet to a different computer, using FFmpeg there to do the encoding? (Additionally, I don't know if I'd try it, but if I can get Wi-Fi, would it be efficient to port-forward a port on my router and upload over the internet?)
Lastly, what file type should I save the stream as? Trying .raw didn't work, but if sending it over the network is a reasonable option, then this won't matter, right?
Thanks in advance for your help. Here's the most functional set of command options I've tried...
ffmpeg -vaapi_device /dev/dri/renderD128 -f v4l2 -i /dev/video2 -c copy -b:v 20M -r:v 30 capture.mkv
Edit: I forgot to mention that I have a MUCH better computer at home (Ryzen 9 3900X, RTX 3080); this is so I can record lectures and bring the footage back to edit and finalize.
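On the third question, one possible sketch — the host address, port, and encoder settings are assumptions, untested on this hardware — is to pass the capture over TCP in a lightweight container and encode on the receiving machine. 200 Mbit/s fits comfortably within gigabit Ethernet:

```shell
# On the desktop (receiver) - start this first; it listens and does the encoding:
ffmpeg -i 'tcp://0.0.0.0:9000?listen' -c:v libx264 -preset medium -crf 20 lecture.mkv

# On the laptop (sender) - copy the stream unmodified into a NUT container over TCP
# (NUT is ffmpeg's own container and accepts nearly any codec, including raw video):
ffmpeg -f v4l2 -i /dev/video2 -c copy -f nut 'tcp://192.168.1.50:9000'
```

This also sidesteps the "what file type" question: the laptop never writes a file at all, and the receiver saves a normal .mkv.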
Can anyone help me? I have been trying to record video from an RTSP server using FFmpeg, but somehow the resulting video has many frozen images (it couldn't be used for any people detection) — the people look similar to this:
Here is the code I used:
ffmpeg -i rtsp://10.10.10.10/encoder1 -b:v 1024k -s 640x480 -an -t 60 -r 12.5 output.mp4
What I have done so far:
- Recorded the video at a smaller resolution instead of the original one
- Disabled audio and lowered the FPS
- Even recorded from only two IP sources on one machine
But I still haven't had any luck. Has anyone else experienced this?
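Frozen or smeared frames from an RTSP source are often caused by lost UDP packets rather than by the encoding settings. A hedged variant of the command above that forces TCP transport (everything else unchanged) would be:

```shell
# -rtsp_transport tcp must come before -i so it applies to the input;
# TCP retransmits lost packets instead of dropping them, which is usually
# what eliminates the corrupted/frozen frames.
ffmpeg -rtsp_transport tcp -i rtsp://10.10.10.10/encoder1 \
  -b:v 1024k -s 640x480 -an -t 60 -r 12.5 output.mp4
```

If the camera or network doesn't support RTSP over TCP, increasing the UDP receive buffer is another avenue, but TCP is the simpler first test.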
I am working on re-encoding some footage (x264), including some grainy footage. I am interested in CRF-only bitrate management (I want to avoid artifacts during demanding scenes).
What are recommended parameters to be set instead of leaving them at their defaults?
Here is what I got so far, pretty simple:
ffmpeg -i in.mkv -vf unsharp=3:3:1 -c:v libx265 -tune:v grain -crf 24 -c:a copy out.mkv
(This example uses the grain tune because many of the files are grainy; without it the encode washes them out and all the "detail by noise" is lost. I am also applying a light sharpening filter — I find there is always room to sharpen a bit without causing noticeable sharpening artifacts.)
If I am not mistaken, all the parameters one would normally consider are the ones contained in the presets, but is there some other parameter — or one of those — that is good practice to adjust manually to achieve a better result? I was wondering specifically about P-, I-, and B-frames and AQ (but I guess there are others as well).
The defaults are what the developers recommend. But every video is different, and could be improved with custom settings. There is no "better default", because it could be worse on a different file. Nobody can know without the video file and the preferences of the viewer.
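That said, if you do want to experiment with the frame-type and AQ settings asked about above, libx265 exposes them through -x265-params. The values below are illustrative starting points to test against your own files, not recommendations:

```shell
# bframes: max consecutive B-frames (default 4); aq-mode 3 adds a bias toward
# dark, flat areas; aq-strength scales how aggressively AQ redistributes bits.
ffmpeg -i in.mkv -c:v libx265 -tune:v grain -crf 24 \
  -x265-params bframes=8:aq-mode=3:aq-strength=1.0 \
  -c:a copy out.mkv
```

Compare the result against the plain-defaults encode on the same clip; with grainy sources, AQ changes in particular can cut both ways.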
I'm trying to do a live restream an RTSP feed from a webcam using ffmpeg, but the stream repeatedly stops with the error:
"No more output streams to write to, finishing."
The problem seems to get worse at higher bitrates (256kbps is mostly reliable) and is pretty random in its occurrence. At 1mbps, sometimes the stream will run for several hours without any trouble, on other occasions the stream will fail every few minutes. I've got a cron job running which restarts the stream automatically when it fails, but I would prefer to avoid the continued interruptions.
I have seen this problem reported in a handful of other forums, so this is not a unique problem, but not one of those reports had a solution attached to it. My ffmpeg command looks like this:
ffmpeg -loglevel verbose -r 25 -rtsp_transport tcp -i rtsp://user:password@camera.url/live/ch0 -reset_timestamps 1 -movflags frag_keyframe+empty_moov -bufsize 7168k -stimeout 60000 -hls_flags temp_file -hls_time 5 -hls_wrap 180 -acodec copy -vcodec copy streaming.m3u8 > encode.log 2>&1
What gets me is that the error makes no sense: this is a live stream, so output is always wanted until I shut the stream off. Having it shut down because output isn't wanted is downright odd. If ffmpeg were complaining about a problem with the input, it would make more sense.
I'm running version 3.3.4, which I believe is the latest.
Update 13 Oct 17:
After extensive testing I've established that the "No more outputs" error message generated by FFmpeg is very misleading. The error seems to be generated if the data coming in from RTSP is delayed, e.g. by other activity on the router the camera is connected through. I've got a large buffer and timeout set which should be sufficient for 60 seconds, but I can still deliberately trigger this error with far shorter interruptions, so clearly the buffer and timeout aren't having the desired effect. This might be fixed by setting a QoS policy on the router and by checking that the TCP packets from the camera have a suitably high priority set; it's possible this isn't the case.
However, I would still like to improve the robustness of the input stream if it is briefly interrupted. Is there any way to persuade FFMPEG to tolerate this or to actually make use of the buffer it seems to be ignoring? Can FFMPEG be persuaded to simply stop writing output and wait for input to become available rather than bailing out? Or could I get FFMPEG to duplicate the last complete frame until it's able to get more data? I can live with the stream stuttering a bit, but I've got to significantly reduce the current behaviour where the stream drops at the slightest hint of a problem.
Further update 13 Oct 2017:
After more tests, I've found that the problem actually seems to be that HLS is incapable of coping with a discontinuity in the incoming video stream. If I deliberately cut the network connection between the camera and FFMPEG, FFMPEG will wait for the connection to be re-established for quite a long time. If the interruption was long (>10 seconds) the stream will immediately drop with the "No More Outputs" error the instant that the connection is re-established. If the interruption is short, then RTSP will actually start pulling data from the camera again, but the stream will then drop with the same error a few seconds later. So it seems clear that the gap in the input data is causing the HLS encoder to have a fit and give up once the stream is resumed, but the size of the gap has an impact on whether the drop is instant or not.
I had a similar problem. In my case the stream stopped without any errors after a few minutes. I fixed this by switching from FreeBSD to Linux. Maybe the problem is bad package dependencies or the ffmpeg version. So my suggestion is to try an older or newer version of ffmpeg, or another OS.
Update: actually this doesn't solve the problem. I've tested a bit more and the stream stopped after 15 minutes.
I've been facing the same problem. After extended trial and error I found that the problem resided in my CCTV camera's parameters. More exactly, I adjusted the key frame interval parameter to match the frame rate of the recording camera.
My syntax (windows)
SET cam1_rtsp="rtsp://192.168.0.93:554/11?timeout=30000000&listen_timeout=30000000&recv_buffer_size=30000000"
ffmpeg -rtsp_transport tcp -vsync -1 -re -i %cam1_rtsp% -vcodec copy -af apad -shortest -async 1 -strftime 1 -reset_timestamps 1 -metadata title="Cam" -map 0 -f segment -segment_time 300 -segment_atclocktime 1 -segment_format mp4 CCTV\%%Y-%%m-%%d_%%H-%%M-%%S.mp4 -loglevel verbose
After this correction I got a 120-hour smooth input stream with no errors.
Hope this helps anyone.