Unable to create video (.mp4) from images (.jpeg) - android-ffmpeg

When I execute the ffmpeg command I get this error:
Unable to find a suitable output format for ffmpeg
String strCommand = "ffmpeg -loop 1 -t 3 -i " + list.get(0) + " -loop 1 -t 3 -i " + list.get(1) + " -loop 1 -t 3 -i " + list.get(2) + " -loop 1 -t 3 -i " + list.get(3) + " -filter_complex [0:v]trim=duration=3,fade=t=out:st=2.5:d=0.5[v0];[1:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v1];[2:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v2];[3:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v3];[v0][v1][v2][v3]concat=n=4:v=1:a=0,format=yuv420p[v] -map [v] -preset ultrafast " + getVideoFilePath();

Your command seems to be OK; make sure that getVideoFilePath() returns a path with a supported extension, for example /path/.../slide.mp4. Here is a working example ffmpeg command:
ffmpeg \
-loop 1 -t 3 -i pic001.jpg \
-loop 1 -t 3 -i pic002.jpg \
-loop 1 -t 3 -i pic003.jpg \
-loop 1 -t 3 -i pic004.jpg \
-filter_complex "\
[0:v]trim=duration=3,fade=t=out:st=2.5:d=0.5[v0];\
[1:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v1];\
[2:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v2];\
[3:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v3];\
[v0][v1][v2][v3]concat=n=4:v=1:a=0,format=yuv420p[v]" \
-map "[v]" -preset ultrafast slide.mp4

Related

FFmpeg command plays all the audio clips at once

I want each audio clip to start at its designated time using this command, but they all play at the beginning of the video. Here is the command:
ffmpeg -y -i ./temp/background.mp4 -itsoffset 1 -i ./temp/tts/0.mp3 -itsoffset 3 -i ./temp/tts/1.mp3 -itsoffset 4 -i ./temp/tts/10.mp3 -itsoffset 7 -i ./temp/tts/11.mp3 -itsoffset 8 -i ./temp/tts/12.mp3 -itsoffset 10 -i ./temp/tts/13.mp3 -itsoffset 12 -i ./temp/tts/14.mp3 -itsoffset 14 -i ./temp/tts/2.mp3 -itsoffset 15 -i ./temp/tts/3.mp3 -itsoffset 18 -i ./temp/tts/4.mp3 -itsoffset 20 -i ./temp/tts/5.mp3 -itsoffset 22 -i ./temp/tts/6.mp3 -itsoffset 23 -i ./temp/tts/7.mp3 -itsoffset 25 -i ./temp/tts/8.mp3 -itsoffset 27 -i ./temp/tts/9.mp3 -filter_complex amix=inputs=5[a] -map 0:v -map [a] -c:v copy -async 1 -c:a aac ./temp/withoutsubs.mp4
I was expecting each audio clip to play at its designated time.
amix doesn't sync by timestamp. You have to pad the head of each mp3 with silence so that it effectively starts from timestamp 0, like this:
ffmpeg -y \
-i ./temp/background.mp4 \
-itsoffset 1 -i ./temp/tts/0.mp3 \
-itsoffset 3 -i ./temp/tts/1.mp3 \
-itsoffset 4 -i ./temp/tts/2.mp3 \
-filter_complex \
"[1]aresample=async=1:first_pts=0[1a];
[2]aresample=async=1:first_pts=0[2a];
[3]aresample=async=1:first_pts=0[3a];
[1a][2a][3a]amix=inputs=3[a]" \
-map 0:v -map [a] \
-c:v copy -c:a aac \
./temp/withoutsubs.mp4
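For what it's worth, an equivalent approach (a sketch that assumes an ffmpeg recent enough to have adelay's all option) drops -itsoffset and delays each stream inside the filter graph instead; adelay takes milliseconds:
ffmpeg -y \
-i ./temp/background.mp4 \
-i ./temp/tts/0.mp3 \
-i ./temp/tts/1.mp3 \
-i ./temp/tts/2.mp3 \
-filter_complex \
"[1]adelay=delays=1000:all=1[1a];
[2]adelay=delays=3000:all=1[2a];
[3]adelay=delays=4000:all=1[3a];
[1a][2a][3a]amix=inputs=3[a]" \
-map 0:v -map "[a]" \
-c:v copy -c:a aac \
./temp/withoutsubs.mp4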

If I use 1920x1080 images, why does it no longer work?

Hello everyone, I have a problem: this code works well with 1080x1080 images, but if I use 1920x1080 images it no longer works. Can someone tell me why, or point me in the right direction? Thank you.
ffmpeg/ffmpeg \
-loop 1 -t 3 -i agence_quatre_img/1669992356.png \
-loop 1 -t 3 -i agence_quatre_img/1669992343.png \
-loop 1 -t 3 -i agence_quatre_img/1669992317.png \
-loop 1 -t 3 -i agence_quatre_img/1669992290.png \
-loop 1 -t 3 -i agence_quatre_img/1669992290.png \
-filter_complex \
"[1]fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+4/TB[f0]; \
[2]fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+8/TB[f1]; \
[3]fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+10/TB[f2]; \
[4]fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+12/TB[f3]; \
[0][f0]overlay[bg1];[bg1][f1]overlay[bg2];[bg2][f2]overlay[bg3]; \
[bg3][f3]overlay,format=yuv420p[v]" -map "[v]" agence_quatre_img/output.mov
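No answer was recorded for this question. As a hedged suggestion rather than a confirmed fix, one common thing to check with mixed frame sizes is that every input is scaled and padded to the same canvas (and given the same sample aspect ratio) before being overlaid, for example:
ffmpeg \
-loop 1 -t 3 -i agence_quatre_img/1669992356.png \
-loop 1 -t 3 -i agence_quatre_img/1669992343.png \
-filter_complex \
"[0]scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2,setsar=1[base]; \
[1]scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+1/TB[f0]; \
[base][f0]overlay,format=yuv420p[v]" \
-map "[v]" agence_quatre_img/output_normalized.mov
(output_normalized.mov is a hypothetical output name.)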

rtmp lavfi & image2 queue blocking

[image2 @ 0x7fb9ffc105e0] Thread message queue blocking; consider raising the thread_queue_size option (current value: 1024)
How do I resolve this? After adding the lavfi input and raising the queue value to 1024, the message is still present.
ffmpeg -f image2 -loop 1 -framerate 1 -thread_queue_size 512 -i bg.png \
-f lavfi -thread_queue_size 512 -i "amovie=audio.m4a:loop=0,asetpts=N/SR/TB" \
-vf "realtime" \
-f flv rtmp://server.dev:1935/live/mystream
There are no errors without the lavfi input:
ffmpeg -f image2 -loop 1 -framerate 1 -i bg.png \
-vf "realtime" \
-f flv rtmp://server.dev:1935/live/mystream
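No fix was posted for this one either. The message is a warning that one input's packet queue filled up while ffmpeg was blocked on another input, not a hard error; a hedged variant that simply raises the queue further on both inputs (4096 is an arbitrary choice) would look like:
ffmpeg -f image2 -loop 1 -framerate 1 -thread_queue_size 4096 -i bg.png \
-f lavfi -thread_queue_size 4096 -i "amovie=audio.m4a:loop=0,asetpts=N/SR/TB" \
-vf "realtime" \
-f flv rtmp://server.dev:1935/live/mystream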

FFmpeg: Add audio stream via -filter_complex

In reference to https://superuser.com/questions/833232/create-video-with-5-images-with-fadein-out-effect-in-ffmpeg
I have a bunch of images, and I want to make a video from them.
Below is my FFmpeg command. The problem is: how can I add an audio file to the -filter_complex graph (with a specified bitrate)?
ffmpeg \
-loop 1 -i input0.png \
-loop 1 -i input1.png \
-loop 1 -i input2.png \
-loop 1 -i input3.png \
-loop 1 -i input4.png \
-filter_complex \
"[0:v]trim=duration=15,fade=t=out:st=14.5:d=0.5[v0]; \
[1:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v1]; \
[2:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v2]; \
[3:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v3]; \
[4:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v4]; \
[v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4
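No answer is attached above, but a common pattern (audio.mp3 and the 192k bitrate are assumptions, not taken from the question) is to add the audio file as one more input, trim it inside the same -filter_complex so it matches the slideshow length, and set the bitrate with -b:a. A shortened two-image sketch:
ffmpeg \
-loop 1 -i input0.png \
-loop 1 -i input1.png \
-i audio.mp3 \
-filter_complex \
"[0:v]trim=duration=15,fade=t=out:st=14.5:d=0.5[v0]; \
[1:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v1]; \
[v0][v1]concat=n=2:v=1:a=0,format=yuv420p[v]; \
[2:a]atrim=duration=30[a]" \
-map "[v]" -map "[a]" -c:a aac -b:a 192k out.mp4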

Inserting variables as command arguments in a bash script

I have a script for screencasting on Linux, and I want to modify parts of it based on user input. At the moment I can't even get it to insert an argument into a command from a variable.
#!/bin/bash -x
fps="30"
capturesize="hd1080"
outputsize="720"
filter=-vf 'scale=${outputsize}'
avconv \
-f x11grab -r $fps -s $capturesize -i :0.0 \
-f alsa -ac 2 -i pulse \
-f alsa -ac 2 -i pulse \
-f alsa -ac 1 -i pulse \
-map 0:0 -map 1:0 -map 2:0 -map 3:0 \
-vcodec libx264 \
$filter \
-pre:v lossless_ultrafast \
-acodec libmp3lame \
-threads 4 \
-y $@
$fps and $capturesize are expanded properly, but the $filter assignment gives a nice little:
+ filter=-vf
+ 'scale=${outputsize}'
~/bin/screencap: line 9: scale=${outputsize}: command not found
Changing the line to:
filter="-vf 'scale=$outputsize'"
Gives an even less pleasant:
+ filter='-vf '\''scale=720'\'''
+ avconv -f x11grab [...] -vf ''\''scale=720'\''' [...]
[...]
No such filter: 'scale=720'
Error opening filters!
Use an array. It has the added benefit of protecting items that contain spaces, something you can't do if you try to store them in a space-separated string.
#!/bin/bash -x
fps="30"
capturesize="hd1080"
outputsize="720"
filter=( -vf "scale=${outputsize}" )
avconv \
-f x11grab -r "$fps" -s "$capturesize" -i :0.0 \
-f alsa -ac 2 -i pulse \
-f alsa -ac 2 -i pulse \
-f alsa -ac 1 -i pulse \
-map 0:0 -map 1:0 -map 2:0 -map 3:0 \
-vcodec libx264 \
"${filter[#]}" \
-pre:v lossless_ultrafast \
-acodec libmp3lame \
-threads 4 \
-y "$#"
You can put all the options in a single array; a small example:
options=( -f x11grab -r "$fps" -s "$capturesize" )
options+=( -i :0.0 )
# etc
avconv "${options[#]}" -y "$#"
filter="-vf scale=${outputsize}"
