create mp4 slide show on raspbian - ffmpeg

I have an RPi running Raspbian.
I want a solution to convert a folder of image files to an mp4 slide show video that can be played with omxplayer.
I did it with ffmpeg and the following command:
ffmpeg -y -framerate .1 -pattern_type glob -i '*.jpg' -c:v libx264 -pix_fmt yuv420p out.mp4
It works with the mpv media player, but playing it with the following omxplayer command does nothing:
omxplayer --loop --no-osd --win 0,0,128,224 --orientation 90 out.mp4
I must use omxplayer to output to an exact window and stay compatible with older programs.
Not sure what would be the right way to do this. I have already a node js server running on Pi that I can use if needed.
Thanks

So the problem was that I should have forced both the input and output frame rates (-r .2 and -r 30).
Here is my final command:
ffmpeg -y -r .2 -pattern_type glob -i '*.jpg' -vcodec libx264 -pix_fmt yuv420p -preset fast -crf 18 -b-pyramid none -acodec ac3 -ab 1536k -scodec copy -r 30 out.mp4
Thank you Gyan for your comments.
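For reference, the essential change is forcing both rates; a minimal equivalent of the command above (assuming the same glob input) would be:
ffmpeg -y -r .2 -pattern_type glob -i '*.jpg' -c:v libx264 -pix_fmt yuv420p -r 30 out.mp4
The very low output frame rate of the original command (no forced output rate, so roughly one frame every ten seconds) is most likely what omxplayer could not handle.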

Related

How to replace the video track in a video file with a still image?

I am trying to use ffmpeg to replace the video track in a video file with a still image. I tried some commands I got from other questions such as the one here
ffmpeg -i x.png -i orig.mp4 final.mp4
ffmpeg -r 1/5 -i x.png -r 30 -i orig.mp4 final.mp4
But these didn't work. I'm not sure which of these arguments are required or not. The output should be accepted by YouTube as a valid video - I was able to simply remove the video track, but apparently they don't let you upload a video without a video track.
You can try looping the still image like this:
ffmpeg -loop 1 -i x.png -i orig.mp4 final.mp4
Then you can tweak the encoding process by introducing the following quality parameters:
ffmpeg -loop 1 -i x.png -i orig.mp4 -crf 22 -preset slow final.mp4
They are described here.
If your colorspace gets rejected by YouTube you can try adding: -pix_fmt yuv420p.
Solution: A final solution is something like this:
ffmpeg -loop 1 -i x.png -i orig.mp4 -map 0 -map 1:a -c:v libx264 -pix_fmt yuv420p -crf 22 -preset slow -c:a copy -shortest final.mp4
Using -c:a copy directly copies the original audio without re-encoding it (which is faster), and -shortest ends the output with the shorter input. Alternatively, -t 30 would set an example duration of 30 seconds.
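For instance, a fixed-length variant (a sketch using a hypothetical 30-second cap in place of -shortest) could look like:
ffmpeg -loop 1 -i x.png -i orig.mp4 -map 0 -map 1:a -c:v libx264 -pix_fmt yuv420p -crf 22 -preset slow -c:a copy -t 30 final.mp4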

FFMpeg command output shows 'green pixelated' video

I am trying to take a single image and add audio resulting in a video playing the entire song with that single image; much like you see for YouTube videos for songs. The command I am using is from this link: https://askubuntu.com/questions/868283/image-audio-mp4-how-to-make-video-smaller
This is the command:
ffmpeg -loop 1 -framerate 1 -i image.png -i song.aac -c:v libx264 -preset veryslow -crf 0 -c:a copy -shortest output.mp4
It works as intended in keeping the video file small, and the song plays as well, but depending on the image used, the video sometimes appears 'green' during playback.
However, this command works for any image used:
ffmpeg -loop 1 -framerate 1 -i image.jpg -i music.mp3 -c copy -shortest output.mp4
But the result is a very big file whereas I would like it to be smaller. Any help would be greatly appreciated! Thank you!
FFmpeg version: 4.3.1
The command you used from the answer you linked to is for YouTube specifically and even says, "your player probably won't like it but YouTube will". I'm assuming you're using a player and not YouTube.
In the same answer is another command under Widest compatibility for any player:
ffmpeg -loop 1 -i image.png -i music.mp3 -vf "scale='min(1280,iw)':-2,format=yuv420p" -c:v libx264 -preset medium -profile:v main -c:a aac -shortest -movflags +faststart output.mp4
Try that instead.
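If small file size is also important, another option (a sketch, not taken from the linked answer) is to keep the 1 fps input trick but drop the lossless -crf 0 setting, which typically produces a High 4:4:4 stream that many players render as green, and force a widely supported pixel format instead:
ffmpeg -loop 1 -framerate 1 -i image.png -i song.aac -c:v libx264 -preset veryslow -crf 18 -pix_fmt yuv420p -c:a copy -shortest output.mp4
-crf 18 is visually near-lossless but far smaller than -crf 0; if some players still complain about the 1 fps video, fall back to the compatibility command above.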

Convert PNG sequence to MP4 from directory

On a mac I would normally convert a folder of pngs to mp4 using the following:
ffmpeg -framerate 25 -pattern_type glob -i "*" -c:v libx264 -pix_fmt yuv420p -b:v 10M output.mp4
Now I'm trying to accomplish the same using windows 10, but globbing is not supported.
Not knowing what the filenames might be, is there a decent way of getting a complete file list of the directory and implementing it with ffmpeg?
Have you tried the %d sequence pattern? It requires numerically named files (e.g. 001.png, 002.png, ...):
ffmpeg -r 30 -i %03d.png -vcodec libx264 -crf 25 -pix_fmt yuv420p test.mp4
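If the filenames really are arbitrary, another option (a sketch, not from the answer above) is the concat demuxer with a generated list file. From PowerShell, run inside the image folder:
Get-ChildItem *.png | ForEach-Object { "file '$($_.Name)'"; "duration 0.04" } | Set-Content -Encoding ascii list.txt
ffmpeg -f concat -safe 0 -i list.txt -c:v libx264 -pix_fmt yuv420p -r 25 output.mp4
The 0.04-second duration per image matches the 25 fps of the original macOS command.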

Is there a way to pipe input video into ffmpeg?

ffmpeg -f avfoundation -i "1:0" -vf "crop=1920:1080:0:0" -pix_fmt yuv420p -y -r 30 -c:a aac -b:a 128k -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY
Hello guys, the above command works pretty well. It records the audio/video of the computer. But what I want to do is pipe a repeating video or image (png/jpeg/gif), so that there is no live video feed from the computer, just the image on the stream with the audio.
How would you go about doing this?
Also, if you know any programming interfaces that can do this same thing, please give suggestions. Because I would rather not use a CLI.
I think you should be able to achieve this by using -loop and some -map options. I can't test with avfoundation myself, but something like this works for me:
ffmpeg -loop 1 -i image.png -i file_to_take_audio_from.mp4 -vf "scale=1920:1080" -pix_fmt yuv420p -r 30 -c:a aac -b:a 128k -map 0:v -map 1:a output.mp4
Replace -i file_to_take_audio_from.mp4 with -f avfoundation -i "1:0" and output.mp4 with -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY.
Also you might be able to skip -vf if the image has correct resolution.
Hope that helps!
Use none or no value at all (:0) for the video device index and provide a secondary input:
ffmpeg -f avfoundation -i :0 -i image.png ...
There's a -loop option for images such as animated GIFs, and -stream_loop for input streams.
You can use the FFmpeg APIs directly instead of CLI.
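Combining those two hints, a rough, untested sketch (with clip.mp4 standing in for whatever video you want to loop) might look like:
ffmpeg -re -stream_loop -1 -i clip.mp4 -f avfoundation -i :0 -map 0:v -map 1:a -c:v libx264 -pix_fmt yuv420p -r 30 -c:a aac -b:a 128k -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY
Here -re reads the looped file at its native frame rate so the stream stays realtime.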

animation between images using FFmpeg

Hi, I am new to FFmpeg.
I have made a video from a slideshow of sequential images (img001.jpg, img002.jpg, img003.jpg, ...) using the following command on Ubuntu 14.04:
ffmpeg -framerate 1/5 -i img%03d.jpg -c:v libx264 -r 30 -pix_fmt yuv420p -vf scale=320:240 out.mp4
But now I want to add animation, such as a fade-in and fade-out, between each of the sequential images. Can anybody help me with how to do this? I have searched a lot but could not find a solution.
The best way to do this is to create intermediate MPEGs for each image and then concatenate them all into a video. For example, say you have 5 images; you would run this for each of the images to create the intermediate MPEGs with a fade-in at the beginning and a fade-out at the end.
ffmpeg -y -loop 1 -i image -vf "fade=t=in:st=0:d=0.5,fade=t=out:st=4.5:d=0.5" -c:v mpeg2video -t 5 -q:v 1 image-1.mpeg
where -t is the duration, in seconds, of each image. Once you have all of these MPEGs, you use ffmpeg's concat filter to combine them all into an mp4.
ffmpeg -y -i image-1.mpeg -i image-2.mpeg -i image-3.mpeg -i image-4.mpeg -i image-5.mpeg -filter_complex '[0:v][1:v][2:v][3:v][4:v] concat=n=5:v=1 [v]' -map '[v]' -c:v libx264 -s 1280x720 -aspect 16:9 -q:v 1 -pix_fmt yuv420p output.mp4
This gives you the desired video and is the simplest and highest quality solution with ffmpeg. Let me know if you have any questions about how the above command works.
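Since the per-image step has to be repeated, a small shell loop (a sketch assuming bash and the img###.jpg naming from the question) can generate all the intermediates in one go:
for f in img*.jpg; do ffmpeg -y -loop 1 -i "$f" -vf "fade=t=in:st=0:d=0.5,fade=t=out:st=4.5:d=0.5" -c:v mpeg2video -t 5 -q:v 1 "${f%.jpg}.mpeg"; done
The resulting img*.mpeg files are then listed as the inputs of the concat command.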
