Is there a way to pipe input video into ffmpeg? - ffmpeg

ffmpeg -f avfoundation -i "1:0" -vf "crop=1920:1080:0:0" -pix_fmt yuv420p -y -r 30 -c:a aac -b:a 128k -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY
Hello guys, the above command works pretty well. It records the audio/video of the computer. But what I want to do is pipe in a repeating video or image (PNG/JPEG/GIF), so that there is no live video feed from the computer, just the image on the stream together with the audio.
How would you go about doing this?
Also, if you know of any programming interfaces that can do the same thing, please give suggestions, because I would rather not use a CLI.

I think you should be able to achieve this by using -loop and some -map options. I can't test with avfoundation myself, but something like this works for me:
ffmpeg -loop 1 -i image.png -i file_to_take_audio_from.mp4 -vf "scale=1920:1080" -pix_fmt yuv420p -r 30 -c:a aac -b:a 128k -map 0:v -map 1:a -shortest output.mp4
Replace -i file_to_take_audio_from.mp4 with -f avfoundation -i "1:0" and output.mp4 with -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY.
Also you might be able to skip -vf if the image has correct resolution.
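Put together, the substituted command might look like this (untested with avfoundation; the RTMP URL is a placeholder):
ffmpeg -loop 1 -i image.png -f avfoundation -i "1:0" -vf "scale=1920:1080" -pix_fmt yuv420p -r 30 -c:a aac -b:a 128k -map 0:v -map 1:a -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY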
Hope that helps!

Use none or no value at all (:0) for the video device index and provide a secondary input:
ffmpeg -f avfoundation -i :0 -i image.png ...
There's a -loop option for images (including animated GIFs) and -stream_loop for looping input streams.
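For example, a sketch that loops a still image over the captured audio (the stream URL is a placeholder) could look like:
ffmpeg -f avfoundation -i :0 -loop 1 -i image.png -map 1:v -map 0:a -c:v libx264 -pix_fmt yuv420p -r 30 -c:a aac -b:a 128k -f flv rtmp://RTMP_SERVER:RTMP_PORT/STREAM_KEY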
You can also use the FFmpeg APIs directly instead of the CLI.

Related

How to get playing audio file data in ffmpeg stream?

Dear experts of the wonderful ffmpeg utility! Please tell me, if anyone knows:
I want to make a 24/7 stream on YouTube of music from looped video and audio tracks.
I do it like this:
ffmpeg -loglevel info -stream_loop -1 -y -re \
-i video.mp4 \
-f concat -safe 0 -i playlist.txt \
-c:v libx264 -preset veryfast -b:v 3000k -maxrate 3000k -bufsize 6000k \
-framerate 25 -video_size 1280x720 -vf "format=yuv420p" -g 50 -shortest -strict experimental \
-c:a aac -b:a 128k -ar 44100 \
-f flv rtmp://localhost/live/my-stream
i.e. video.mp4 loops continuously, and the MP3s listed in playlist.txt play one after another.
With this everything works fine. But I also want to show the title of the currently playing track,
as you see, for example, on some YouTube radio streams. Showing the cover art as well would be perfect!
Any ideas how this can be implemented?
I know that it is possible to display text with drawtext, and that the text can be read from a file which you update yourself. But how do I get the data of the currently playing file? ffmpeg does not expose such information, only stream parameters (fps, frame rate, ...). Or is it still possible to get it?
Or are there better and easier ways?
Thanks in advance for your help!
You can use ffprobe to extract metadata from files.
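For example (a sketch; file names are placeholders), you could dump the title tag of the track that is about to play into a text file, and have drawtext re-read that file on every frame:
ffprobe -v quiet -show_entries format_tags=title -of default=noprint_wrappers=1:nokey=1 current_track.mp3 > nowplaying.txt
Then point drawtext at the file inside your video filter, e.g. -vf "format=yuv420p,drawtext=textfile=nowplaying.txt:reload=1:x=20:y=20:fontsize=28:fontcolor=white", and rewrite nowplaying.txt whenever the track changes.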

How to replace the video track in a video file with a still image?

I am trying to use ffmpeg to replace the video track in a video file with a still image. I tried some commands I got from other questions such as the one here
ffmpeg -i x.png -i orig.mp4 final.mp4
ffmpeg -r 1/5 -i x.png -r 30 -i orig.mp4 final.mp4
But these didn't work. I'm not sure which of these arguments are required or not. The output should be accepted by YouTube as a valid video - I was able to simply remove the video track, but apparently they don't let you upload a video without a video track.
You can try looping the still image like this:
ffmpeg -loop 1 -i x.png -i orig.mp4 final.mp4
Then you can tweak the encoding process by introducing the following quality parameters:
ffmpeg -loop 1 -i x.png -i orig.mp4 -crf 22 -preset slow final.mp4
They are described here.
If your colorspace gets rejected by YouTube you can try adding: -pix_fmt yuv420p.
Solution: A final command looks something like this:
ffmpeg -loop 1 -i x.png -i orig.mp4 -map 0 -map 1:a -c:v libx264 -pix_fmt yuv420p -crf 22 -preset slow -c:a copy -shortest final.mp4
Using -c:a copy directly copies the original audio without re-encoding, which is faster. Instead of -shortest you can set an explicit output duration, where -t 30 is an example duration of 30 seconds.
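That fixed-duration variant would look like:
ffmpeg -loop 1 -i x.png -i orig.mp4 -map 0 -map 1:a -c:v libx264 -pix_fmt yuv420p -crf 22 -preset slow -c:a copy -t 30 final.mp4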

FFMpeg command output shows 'green pixelated' video

I am trying to take a single image and add audio resulting in a video playing the entire song with that single image; much like you see for YouTube videos for songs. The command I am using is from this link: https://askubuntu.com/questions/868283/image-audio-mp4-how-to-make-video-smaller
This is the command:
ffmpeg -loop 1 -framerate 1 -i image.png -i song.aac -c:v libx264 -preset veryslow -crf 0 -c:a copy -shortest output.mp4
It works as intended in that the video file is small and the song plays, but depending on the image I used, some images appear green when playing the video.
However, this command works with any image:
ffmpeg -loop 1 -framerate 1 -i image.jpg -i music.mp3 -c copy -shortest output.mp4
But the result is a very big file whereas I would like it to be smaller. Any help would be greatly appreciated! Thank you!
FFmpeg version: 4.3.1
The command you used from the answer you linked to is for YouTube specifically and even says, "your player probably won't like it but YouTube will". I'm assuming you're using a player and not YouTube.
In the same answer is another command under Widest compatibility for any player:
ffmpeg -loop 1 -i image.png -i music.mp3 -vf "scale='min(1280,iw)':-2,format=yuv420p" -c:v libx264 -preset medium -profile:v main -c:a aac -shortest -movflags +faststart output.mp4
Try that instead.

animation between images using FFmpeg

Hi, I am new to FFmpeg.
I have made a video from a slideshow of sequential images (img001.jpg, img002.jpg, img003.jpg, ...), using the following command on Ubuntu 14.04:
ffmpeg -framerate 1/5 -i img%03d.jpg -c:v libx264 -r 30 -pix_fmt yuv420p -vf scale=320:240 out.mp4
But now I want to add animations like fade-in and fade-out between the sequential images when generating the video.
Can anybody help me with how to do this? I have searched a lot but could not find a solution.
The best way to do this is to create an intermediate MPEG for each image and then concatenate them all into a video. For example, say you have 5 images; you would run this for each image to create the intermediate MPEGs, with a fade-in at the beginning and a fade-out at the end:
ffmpeg -y -loop 1 -i image -vf "fade=t=in:st=0:d=0.5,fade=t=out:st=4.5:d=0.5" -c:v mpeg2video -t 5 -q:v 1 image-1.mpeg
where -t is the duration, in seconds, of each image. Once you have all of these MPEGs, use ffmpeg's concat filter to combine them into an MP4.
ffmpeg -y -i image-1.mpeg -i image-2.mpeg -i image-3.mpeg -i image-4.mpeg -i image-5.mpeg -filter_complex '[0:v][1:v][2:v][3:v][4:v] concat=n=5:v=1 [v]' -map '[v]' -c:v libx264 -s 1280x720 -aspect 16:9 -q:v 1 -pix_fmt yuv420p output.mp4
This gives you the desired video and is the simplest and highest quality solution with ffmpeg. Let me know if you have any questions about how the above command works.
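Assuming the img%03d.jpg naming from the question and five images (a sketch; adjust the names and count to your files), a small shell loop can generate all the intermediates:
for i in 1 2 3 4 5; do
  ffmpeg -y -loop 1 -i img00$i.jpg -vf "fade=t=in:st=0:d=0.5,fade=t=out:st=4.5:d=0.5" -c:v mpeg2video -t 5 -q:v 1 image-$i.mpeg
done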

ffmpeg rtmp and local file output

I am having trouble with ffmpeg.
I receive an RTSP stream from a capture device (camera) and restream it to RTMP (YouTube Live).
I want to keep a copy of the stream on my computer, so I write to a local file at the same time.
I use this command:
ffmpeg -y -i 'RTSP_SOURCE' -c:v copy -c:a libvo_aacenc -map 0:v -bsf:v dump_extra -fflags +genpts -flags +global_header -movflags +faststart \
-map_metadata 0 -metadata title= -f tee -filter_complex aevalsrc=0 '[f=mp4]/tmp/backup.mp4|[f=mpegts]/tmp/backup.ts|[f=flv]rtmp://a.rtmp.youtube.com/live2/STREAM_ID'
The problem is that when there is a disconnection, ffmpeg exits and stops recording.
Is there any flag or option that tells ffmpeg to continue recording to the local files even when there is no internet connection?
Thank you very much for your help =)
You can try:
ffmpeg -f tee "[onfail=ignore] ...
More description is available here.
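Applied to the command from the question, a trimmed sketch (same placeholder RTSP_SOURCE and STREAM_ID; the stock aac encoder stands in for libvo_aacenc) would mark only the RTMP output as ignorable on failure:
ffmpeg -y -i 'RTSP_SOURCE' -c:v copy -c:a aac -map 0:v -map 0:a -flags +global_header \
-f tee '[f=mp4]/tmp/backup.mp4|[f=mpegts]/tmp/backup.ts|[f=flv:onfail=ignore]rtmp://a.rtmp.youtube.com/live2/STREAM_ID'
With onfail=ignore, the tee muxer keeps writing the remaining outputs if that particular output fails.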