Extract frames from live youtube stream - ffmpeg

I want to extract frames from a YouTube live event, say one frame every 5 minutes, ideally without saving the stream to my local machine. Is there a simple way to do this, possibly a combination of youtube-dl and ffmpeg that I am not figuring out? I found a similar question for UDP streams but don't know how to adapt it to a YouTube stream: FFMPEG: extract a frame from a live stream once every 5 seconds
Thanks a lot in advance!

I figured it out based on another answer and the ffmpeg wiki.
You need to resolve your YouTube URL to a direct stream URL as follows:
youtube-dl -g "youtube URL"
Copy the output URL into the following command:
ffmpeg -i "output URL" -vf fps=1/5 out%d.png
fps=1/5 extracts one frame every 5 seconds; for one frame every 5 minutes, as asked above, use fps=1/300 instead.
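If you prefer a single step, the two commands can be combined (a sketch, assuming a POSIX shell; note that for some formats youtube-dl -g prints separate video and audio URLs, in which case pass the video URL on its own):
ffmpeg -i "$(youtube-dl -g 'youtube URL')" -vf fps=1/300 out%d.png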

Related

Extract frames from video to a file with their timestamps as the output name

I want to extract specific frames from a video with ffmpeg and save them to an external disk, using the video's timestamp as the output name, preferably with milliseconds so that I can save more than one frame per second.
As a first approach I tried extracting the frames to the external disk with the following command, which is meant to extract 1 frame per second from a specific time interval and save the frames as a sequence of images.
ffmpeg -ss 00:12:25 -to 00:12:35 -i 220718-124513_CAM0bc99448_30.mp4 -r 1 E:/images/img_%04d.png
Once I tried a different time interval, it began overwriting the images already on the disk because the sequence numbering restarted, and the whole point is that I want to keep as many images as possible.
Then I tried the following code
ffmpeg -ss 00:00:00 -to 00:00:04 -i 220718-124513_CAM0bc99448_30.mp4 -vframes 1 -f image2 -strftime 1 E:/images/"img_%Y-%m-%d_%H-%M-%S.png"
thinking that it would give me the timestamp of the video, but it saved the images under the local time instead. That solves the overwriting problem, but I would like the file names to carry the timestamp of the frame within the video, preferably including milliseconds, so that I can save more than one frame per second (with second-only names, asking for 2 frames per second produces a single image because the names collide).
Finally I tried this code:
ffmpeg -ss 00:00:00 -to 00:00:04 -i 220718-124513_CAM0bc99448_30.mp4 -copyts -f image2 -frame_pts true -r 2 E:/images/img_%04d.png
and it seems to solve the issue; if there is no solution to the problem I described above, this is the approach I will use.
This is the first time I am posting on Stack Overflow, so if something is missing from my question, please tell me and I will fix it.
Thanks in advance!
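One possible approach (a rough sketch, not a tested answer; it reuses the input file and E:/images folder from the question and assumes a Bash-like shell) is to list the frame timestamps with ffprobe and then grab each frame by its time, so the file name carries the in-video timestamp with millisecond precision:
# list presentation timestamps (seconds, with decimals) of all frames in the interval
ffprobe -v error -select_streams v:0 -read_intervals 00:12:25%00:12:35 -show_entries frame=pts_time -of csv=p=0 220718-124513_CAM0bc99448_30.mp4 > times.txt
# extract one frame per listed timestamp, named by that timestamp
while read -r t; do
  ffmpeg -nostdin -loglevel error -ss "$t" -i 220718-124513_CAM0bc99448_30.mp4 -frames:v 1 "E:/images/img_${t}.png"
done < times.txt
This is slower than a single ffmpeg run because it seeks once per frame, and it extracts every frame in the interval, so thin the timestamp list first if you only want a few frames per second.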

keyframe frequency of four seconds or less error when trying to stream a VOD file to ant media server

I am trying to stream a VOD file to YouTube using Ant Media Server. I have uploaded the file to AMS and added the RTMP URL, but after starting the broadcast I am getting an error from YouTube saying "Please use a keyframe frequency of four seconds or less. Currently, keyframes are not being sent often enough, which can cause buffering. The current keyframe frequency is 6.8 seconds. Note that ingestion errors can cause incorrect GOP (group of pictures) sizes."
How do I configure the keyframe frequency for my file?
You can use FFmpeg to change the keyframe frequency of your MP4 file.
The -g option sets the keyframe (GOP) interval in frames, not seconds, so multiply the desired interval by the frame rate; for a 30 fps file, a 2-second keyframe interval is -g 60:
ffmpeg -i myvideo.mp4 -qscale 0 -g 60 outputFile.mp4
You can alter other parameters in addition to key frame frequency. For more details, please visit https://trac.ffmpeg.org/wiki
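Alternatively (a sketch rather than part of the original answer), you can force a keyframe at a fixed time interval regardless of the source frame rate by re-encoding the video with an expression-based keyframe rule, here every 2 seconds with libx264 while copying the audio:
ffmpeg -i myvideo.mp4 -c:v libx264 -force_key_frames "expr:gte(t,n_forced*2)" -c:a copy outputFile.mp4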

Extract frames as images from an RTMP stream in real-time

I am streaming short videos (4 or 5 seconds), encoded in H264 at 15 fps in VGA quality, from different clients to a server using RTMP, which produces an FLV file. I need to analyse the frames of the video as soon as possible, so I need them written out as PNG images as they are received.
Currently I use Wowza to receive the streams and I have tried using the transcoder API to access the individual frames and write them to PNGs. This partially works, but there is about a one-second delay before the transcoder starts processing, and when the stream ends Wowza flushes its buffers, so the last second does not get transcoded and I can lose the last 25% of the video frames. I have tried to find a workaround, but Wowza says it is not possible to prevent the buffer from being flushed. It is also not the ideal solution, because of the one-second delay before I start getting frames and because the transcoder re-encodes the video, which is computationally expensive and unnecessary for my needs.
I have also tried piping a video in real-time to FFmpeg and getting it to produce the PNG images but unfortunately it waits until it receives the entire video before producing the PNG frames.
How can I extract all of the frames from the stream as close to real-time as possible? I don’t mind what language or technology is used as long as it can run on a Linux server. I would be happy to use FFmpeg if I can find a way to get it to write the images while it is still receiving the video or even Wowza if I can find a way not to lose frames and not to re-encode.
Thanks for any help or suggestions.
Since you linked this question from the Red5 user list, I'll add my two cents. You may certainly grab the video frames on the server side, but the issue you'll run into is transcoding from H.264 into PNG. The easiest way would be to use ffmpeg / avconv after getting the VideoData object. Here is a post that gives some details about getting the VideoData: http://red5.5842.n7.nabble.com/Snapshot-Image-from-VideoData-td44603.html
Another option is on the player side using one of Dan Rossi's FlowPlayer plugins: http://flowplayer.electroteque.org/snapshot
I finally found a way to do this with FFmpeg. The trick was to disable audio, use a different FLV metadata analyser, and reduce the duration that FFmpeg waits before it starts processing. My FFmpeg command now starts like this:
ffmpeg -an -flv_metadata 1 -analyzeduration 1 ...
This starts producing frames within a second of receiving input from a pipe, so it writes the streamed frames pretty close to real time.
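For reference, a fuller version of that command might look like the line below; this is only a sketch, and the pipe input, the small probesize and the output pattern are assumptions added on top of the flags quoted above:
ffmpeg -an -flv_metadata 1 -analyzeduration 1 -probesize 32 -f flv -i pipe:0 -f image2 frame_%05d.png
Reading from pipe:0 (stdin) lets the streaming process feed FLV data straight into ffmpeg, and each decoded frame is written out as a numbered PNG as soon as it is available.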

How to get keyframes of video ffmpeg for videos created by Red5?

I had a task of getting keyframe information for a video, i.e. getting the timestamps of all keyframes (for seeking the video via an RTMP URL).
To get information on all the frames of a video, use the following command:
ffprobe -show_frames testVideo.mp4 > data.txt
The information I get from the data.txt file:
"key_frame=1" marks a frame that the decoder flags as a keyframe
"pict_type=I" signifies an intra frame (I-frame)
pkt_pts_time gives the exact time position of each keyframe
The actual fps of the video can be estimated as fps = coded_picture_number / pkt_pts_time, taken from the last frame that reports a coded_picture_number.
The above worked well for most videos, but some videos created by Red5 showed abnormal results.
List of URLs I have searched:
http://sinclairmediatech.com/using-ffprobe-to-evaluate-keyframes/
http://ffmpeg-users.933282.n4.nabble.com/How-can-I-find-the-keyframe-information-for-a-mp4-video-td4349687.html
Please tell me if I am doing something wrong or if I need a different command.
Thanks in advance.
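A more direct way to pull only the keyframe timestamps (a sketch; -skip_frame nokey tells the decoder to drop non-keyframes, so only keyframe times are printed, one per line):
ffprobe -v error -select_streams v:0 -skip_frame nokey -show_entries frame=pts_time -of csv=p=0 testVideo.mp4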

Recording a mp3 stream with FFMPEG and drop-outs

I hope someone can give me a pointer. I have a PHP script that runs the command below to record a live radio MP3 stream into hour-long MP3 recordings. It works very well for my purpose. The only issue is that occasionally no recording is made; as far as I can tell it's because the stream has dropped out and ffmpeg just aborts.
/usr/local/bin/ffmpeg -i http://www.mystream.com:8000/radiostream.mp3 -t 60:00 -acodec copy /var/www/mydomain/audio/".$recorded_audio_title;
So my question: is there any way to tell ffmpeg to keep recording for the full 60:00 minutes even if there are drop-outs? I'd be happy with the odd bit of silence provided the recording completes.
I hope this makes sense and I'd appreciate even a pointer to an FFmpeg option or flag. Having Googled, I haven't seen anything that would fit the bill.
Many thanks in advance
rob
Assuming you need to record files of exactly 60 minutes, padding out dropped streaming with silence:
FFmpeg doesn't have such an explicit option, but you can simulate it. Prepare a 60-minute MP3 of silence. Record your stream. When recording has finished, check its duration; if it's shorter than 60 minutes, join your recording with the silence file and specify that the final duration should be 60 minutes. Joining is described here.
EDIT:
To continue recording, you just need to check the duration of your previous recording and, if it's too short, run the same FFmpeg command again with a different file name, then join the two files. Loop this until you end up with a 60-minute file.
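A rough sketch of that padding step (the file names recording.mp3, silence.mp3 and padded.mp3 are assumptions, and generating the silence file requires an ffmpeg build with an MP3 encoder):
ffmpeg -f lavfi -t 3600 -i anullsrc=r=44100:cl=stereo silence.mp3
ffmpeg -i "concat:recording.mp3|silence.mp3" -acodec copy -t 3600 padded.mp3
The second command stream-copies the audio, so nothing is re-encoded, and -t 3600 trims the joined file back to exactly one hour; raw concatenation of MP3 files generally plays fine, though ID3 tags can introduce a tiny glitch at the join.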
Try looking at ffmpeg's segment muxer (-f segment) to split your audio recording into 60-minute chunks; see the ffmpeg documentation.
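Something like the following might work for that (a sketch; the -reconnect flags are an extra assumption that tells ffmpeg's HTTP client to retry after a drop-out instead of aborting, and the strftime pattern keeps successive segments from overwriting each other):
ffmpeg -reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 5 -i http://www.mystream.com:8000/radiostream.mp3 -acodec copy -f segment -segment_time 3600 -strftime 1 /var/www/mydomain/audio/rec_%Y-%m-%d_%H-%M.mp3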
