FFmpeg efficient capture from RTSP IP camera

I need to capture an audio/video RTSP stream from an IP camera to a file without re-encoding. Audio (pcm_alaw) and video (h264) must stay synchronized. The file must not become corrupted if the camera loses the connection for a few moments (which rules out MP4).
At the moment I use the command below, but the MPEG-TS container does not support pcm_alaw, so the audio is not heard:
ffmpeg -stimeout 2000000 -rtsp_transport tcp -i rtsp://admin:1234@192.168.5.22/h264 -c:v copy -c:a copy -f mpegts -y main.ts
I use the MPEG-TS container because I need to check the duration of the capture in real time with the command:
ffprobe -i /home/pi/NAS/main.mov -show_entries format=duration -v quiet -of csv="p=0"
If I use MKV or AVI, its output is:
N/A
Checking the duration is important because I capture files of about 3 hours and, at my choice, make some cuts while the capture is in progress. I prefer not to compress the audio because I have often noticed some asynchrony with respect to the video when cutting.
Thank you.

Instead of -c:a copy you can use -c:a aac or -c:a mp3 to convert the audio stream before you save it.
H.264 in MPEG-TS is only compatible with MP3 or AAC audio (source).
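A minimal sketch of that change, reusing the question's command and re-encoding only the audio (the 128k AAC bitrate is an assumption, adjust as needed):
ffmpeg -stimeout 2000000 -rtsp_transport tcp -i rtsp://admin:1234@192.168.5.22/h264 -c:v copy -c:a aac -b:a 128k -f mpegts -y main.ts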

Related

FFMPEG reduce fps for live h264 stream with direct copy

I found different articles on changing the fps with ffmpeg, but none of them matches my exact purpose.
There is an ffmpeg command like below:
ffmpeg -i RTSPCAMERAPRODUCEH264 -c:v copy -an -movflags +frag_keyframe+empty_moov -f mp4
This will remux my camerastream to fragmented mp4 perfectly.
Is there a way to force ffmpeg to lower the FPS to save bandwidth?
E.g. the camera streams 30 fps and needs 1 Mbps for fMP4 (sample numbers!).
I'd like to know if it's possible to lower the FPS and get an output stream of about 500 kbps (50% of the original is enough) without re-encoding.
ffmpeg -r 1 -i RTSPCAMERAPRODUCEH264 -c:v copy -an -movflags +frag_keyframe+empty_moov -f mp4
and
ffmpeg -i RTSPCAMERAPRODUCEH264 -c:v copy -an -movflags +frag_keyframe+empty_moov -r 1 -f mp4
do not seem to work.
A temporally coded video stream (like one using the H264 codec) cannot have arbitrary intermediate packets dropped, because later frames reference earlier ones within a GOP, so this is not possible. Only whole GOPs, or the trailing part of a GOP, may be dropped.
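If re-encoding is acceptable, lowering the frame rate does work; a rough sketch based on the question's command (the libx264 settings and output name here are assumptions, not from the original answer):
ffmpeg -i RTSPCAMERAPRODUCEH264 -r 1 -c:v libx264 -preset veryfast -an -movflags +frag_keyframe+empty_moov -f mp4 output.mp4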

ffmpeg RTP audio multicast stream choppy

I'm trying to stream .wav audio files via RTP multicast. I'm using the following command:
ffmpeg -re -i Melody_file.wav -f rtp rtp://224.0.1.211:5001
It successfully initiates the stream. However, the audio comes out very choppy. Any ideas how I can make the audio stream clean? I do not need any video at all. Below is a screenshot of my output:
Here are some examples expanding upon the useful comments between Ralf and Ahmed about setting asetnsamples and aresample, and also those mentioned in the Snom wiki. Basically, one can get smoother multicast transmission/playback for G.711/mulaw audio using these approaches:
ffmpeg -re -i Melody_file.wav -filter_complex 'aresample=8000,asetnsamples=n=160' -acodec pcm_mulaw -ac 1 -f rtp rtp://224.0.1.211:5001
Or using higher quality G722 audio codec:
ffmpeg -re -i Melody_file.wav -filter_complex 'aresample=16000,asetnsamples=n=160' -acodec g722 -ac 1 -f rtp rtp://224.0.1.211:5001
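To verify playback of the first (pcm_mulaw) stream, one option not from the original answer is to describe the session in an SDP file and open it with ffplay; a sketch, assuming the multicast address and port above (payload type 0 is the static RTP type for 8 kHz mono PCMU):
v=0
o=- 0 0 IN IP4 224.0.1.211
s=G711 multicast test
c=IN IP4 224.0.1.211
t=0 0
m=audio 5001 RTP/AVP 0
a=rtpmap:0 PCMU/8000
Save that as stream.sdp and play it with:
ffplay -protocol_whitelist file,udp,rtp -i stream.sdp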

FFMPEG: how to save input camera stream into the file with the SAME codec format?

I have a camera-like device that produces a video stream and passes it to my Windows-based machine via a USB port.
Using the command:
ffmpeg -y -f vfwcap -i list
I see that (as expected) FFmpeg finds the input stream as stream #0.
Using the command:
ffmpeg -y -f vfwcap -r 25 -i 0 c:\out.mp4
I can successfully save the input stream into the file.
From the log I see:
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 240x320, 25 tbr, 1k tbn, 25 tbc
No pixel format specified, yuv422p for H.264 encoding chosen.
So, my input format is transcoded to yuv422p.
My question:
How can I cause FFmpeg to save my input video stream into out.mp4 WITHOUT transcoding - actually, to copy input stream to output file as close as possible, with the same format?
How can I cause ffmpeg to save my input video stream into out.mp4 WITHOUT transcoding
You cannot. You can stream copy the rawvideo from vfwcap, but the MP4 container format does not support rawvideo. You have several options:
Use a different output container format.
Stream copy to rawvideo then encode.
Use a lossless encoder (and optionally re-encode it after capturing).
Use a different output container format
This meets your requirement of saving your input without re-encoding.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
rawvideo creates huge file sizes.
Stream copy to rawvideo then encode
This is the same as above, but the rawvideo is then encoded to a more common format.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
ffmpeg -i rawvideo.nut -codec:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4
See the FFmpeg and x264 Encoding Guide for more information about -crf, -preset, and additional detailed information on creating H.264 video.
-pix_fmt yuv420p will use a pixel format that is compatible with dumb players like QuickTime. Refer to colorspace and chroma subsampling for more info.
-movflags +faststart relocates the moov atom which allows the video to begin playback before it is completely downloaded by the client. Useful if you are hosting the video and users will view it in their browser.
Use a lossless encoder
Using huffyuv:
ffmpeg -f vfwcap -i 0 -codec:v huffyuv lossless.mkv
Using lossless H.264:
ffmpeg -f vfwcap -i 0 -codec:v libx264 -qp 0 lossless.mp4
Lossless files can be huge, but not as big as rawvideo.
Re-encoding the lossless output is the same as re-encoding the rawvideo.
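For completeness, a sketch of that second pass applied to the huffyuv capture above, reusing the x264 settings from the earlier example:
ffmpeg -i lossless.mkv -codec:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4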

FFmpeg Stream Transcoding

I have got a streaming application that displays the stream sent from a Flash Media Server.
I want to grab that stream and transcode it to a output stream with a different bitrate using ffmpeg.
Could such kind of thing be done using ffmpeg?
This will get input from a feed, and transcode it to an MKV file with default audio and video codecs, and 1024k bitrate for the video stream (audio bitrate is specified with '-ab'):
ffmpeg -i "http://my_server/video_feed" -b 1024k output.mkv
For a live feed try this (not sure if it'll work, I don't have ffmpeg to test it right now):
ffmpeg -i "http://my_server/input_video_feed" -b 1024 -f flv "http://my_server/output_video_feed"
This should create a FLV feed.

Stream H264 To Android Using FFMPEG

I'm trying to stream a .ts file containing H.264 and AAC as an RTP stream to an Android device.
I tried:
.\ffmpeg -fflags +genpts -re -i 1.ts -vcodec copy -an -f rtp rtp://127.0.0.1:10000 -vn -acodec copy -f rtp rtp://127.0.0.1:20000 -newaudio
FFmpeg displays what should be in your SDP file; I copied this into an SDP file and tried playing it from VLC and FFplay. VLC plays the audio but just gives errors about bad NAL unit types for the video. FFplay doesn't play anything.
My best guess is that the FFmpeg H.264 RTP implementation is broken, or at least that it doesn't work in video passthrough mode (i.e. using -vcodec copy).
I need a fix for FFMPEG or an alternate simple open-source solution. I don't want to install FFMPEG in my Android client.
Thanks.
Have you tried VLC? I once used VLC for streaming. You can have a look here.
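For reference, a sketch of streaming a TS file over RTP with VLC from the command line (the multicast address and port are assumptions):
cvlc 1.ts --sout '#rtp{dst=239.0.0.1,port=5004,mux=ts}'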
