ffmpeg: 1 image with many frames

Is it possible to create thumbnails of a video using ffmpeg in this format?
I need to output a single image with vertical shots taken every 10 seconds.
I only know how to create one image with one frame:
<?php
$ffmpeg = '/usr/local/bin/ffmpeg';
$video = '1.mp4';
$image = '1.png';
$interval = 1;      // seek position in seconds
$size = '300x210';  // thumbnail dimensions
$cmd = "$ffmpeg -i $video -deinterlace -an -ss $interval -f mjpeg -t 1 -r 1 -y -s $size $image 2>&1";
$return = `$cmd`;
?>

You can do this with one ffmpeg command.
Example
ffmpeg -i alone_in_the_wilderness.mp4 -filter_complex \
"select='isnan(prev_selected_t)+gte(t-prev_selected_t\,10)',yadif,scale=240:-1,tile=1x3" \
-vframes 1 -t 30 -q:v 4 strip.jpg
Example with borders
tile=1x3:margin=10:padding=10
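For example, the full command with borders might look like this (the only change is the tile options; strip_bordered.jpg is an arbitrary output name):
ffmpeg -i alone_in_the_wilderness.mp4 -filter_complex \
"select='isnan(prev_selected_t)+gte(t-prev_selected_t\,10)',yadif,scale=240:-1,tile=1x3:margin=10:padding=10" \
-vframes 1 -t 30 -q:v 4 strip_bordered.jpg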
Also see
select, yadif, scale, tile filters documentation
Combine multiple images to form a strip of images ffmpeg
FFmpeg output screenshot gallery

You can get a single image with one frame every 10 seconds with ffmpeg (e.g. 1.png, 2.png, 3.png) in a for loop and then stack the images vertically using ImageMagick:
convert 1.png 2.png 3.png -append vertical.png
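A minimal sketch of that loop, assuming a 30-second video and the file names above:
#!/bin/bash
# Grab one frame at 0 s, 10 s and 20 s.
i=1
for t in 0 10 20; do
    ffmpeg -ss "$t" -i video.mp4 -frames:v 1 -s 300x210 -y "$i.png"
    i=$((i+1))
done
# Stack the frames vertically into one strip.
convert 1.png 2.png 3.png -append vertical.png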

Related

ffmpeg - create a morph video between 2 images (solid hex color PNGs) of 0.55 seconds each, for a total length of 10 seconds

Currently I create two PNG images with ImageMagick (convert -size 640x480 xc:#FF0000 hex1.png, and likewise hex2.png) and save both files.
Now I need the following (but I have no idea how to do it; maybe with ffmpeg?):
create a 640x480 video, for example 10 seconds long, following this pattern:
0.00s (hex1) > 0.55s (hex2) > 1.10s (hex1) > 1.65s (hex2) > 2.2s (hex1) ... until the 10 seconds are reached.
The hex1 and hex2 images should continuously morph/fade from hex1 => hex2 => hex1, and so on.
But the timing is critical: each step must be exactly 0.55 seconds.
Maybe I can generate the hex colors directly the same way, without first creating a PNG image for this purpose.
Can anybody help me find the best way to do this?
Thank you so much and many greets, iceget
Currently I have only created a single image with this command:
ffmpeg -loop 1 -i hex1.png -c:v libx264 -t 10 -pix_fmt yuv420p video.mp4
Here is one approach to achieving your goal without pre-creating any images:
ffmpeg -hide_banner -y \
-f lavfi -i color=c=0x0000ff:size=640x480:duration=0.55:rate=20 \
-filter_complex \
"[0]fade=type=out:duration=0.55:color=0xffff00[fade_first_color]; \
[fade_first_color]split[fade_first_color1][fade_first_color2]; \
[fade_first_color1]reverse[fade_second_color]; \
[fade_first_color2][fade_second_color]concat[fade_cycle]; \
[fade_cycle]loop=loop=10/(0.55*2):size=0.55*2*20,trim=duration=10" \
flicker.mp4
Since the loop filter operates on frames, not seconds, and you have exact-time constraints, you may choose only frame rates that yield an integer number of frames per 0.55-second period (e.g. 20, 40 or 60 fps; at 20 fps, 0.55 s is exactly 11 frames).
The filtergraph is self-explanatory.
The result of such a command is a video that flickers between the two colors every 0.55 seconds.
Almost universal way (added in response to the OP's new questions)
#!/bin/bash
# Input parameters
color_1=0x0000ff
color_2=0xffff00
segment_duration=0.55
total_duration=10
# Magic calculations
sd_numerator=${segment_duration#*.}
sd_denominator=$(( 10**${#sd_numerator} ))
FPS=$(ffprobe -v error -f lavfi "aevalsrc=print('$sd_denominator/gcd($sd_numerator,$sd_denominator)'\,16):s=1:d=1" 2>&1)
FPS=${FPS%.*}
# Prepare an output slightly longer than total_duration
# and mark the cut point with a forced keyframe
ffmpeg -hide_banner -y \
-f lavfi -i color=c=$color_1:size=640x480:duration=$segment_duration:rate=$FPS \
-filter_complex \
"[0]fade=type=out:duration=$segment_duration:color=$color_2[fade_first_color]; \
[fade_first_color]split[fade_first_color1][fade_first_color2]; \
[fade_first_color1]reverse[fade_second_color]; \
[fade_first_color2][fade_second_color]concat[fade_cycle]; \
[fade_cycle]loop=loop=ceil($total_duration/($segment_duration*2))+1: \
size=$segment_duration*2*$FPS,fps=fps=25" \
-force_key_frames $total_duration \
flicker_temp.mp4
# Fine cut of total_duration
ffmpeg -hide_banner -y -i flicker_temp.mp4 -to $total_duration flicker_${total_duration}s.mp4
# Clean up
rm flicker_temp.mp4
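To make the magic calculations concrete: for segment_duration=0.55 the fractional part is 55, so sd_denominator is 10^2 = 100; gcd(55, 100) = 5, which gives FPS = 100/5 = 20, and each fade segment is then exactly 0.55 * 20 = 11 frames.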

Lossless compression of 10bit images (stored as 16bit pngs) with ffmpeg - HEVC preferably

I am trying to encode 10-bit images losslessly in a video format, preferably using HEVC. The images are stored as 16-bit PNG files (but only use 10 bits) and I have been working with ffmpeg to create and read back the video files.
My best attempt so far is based on https://stackoverflow.com/a/66180140/17261462 but, as mentioned there, I get some pixel intensity differences, which may be due to rounding when converting between the 10- and 16-bit representations. I tried a few different mappings (bit shifting, left bit replication, floating-point scaling) but haven't yet figured out how to get a truly lossless reconstruction.
Below is a small piece of code to reproduce my issue. I am probably doing something wrong there, so feedback would be appreciated.
import subprocess
import numpy as np
import matplotlib.pyplot as plt
import tempfile
import imageio
# Create simple image
bitdepth = 10
hbd = int(bitdepth/2)
im0 = np.zeros((1<<hbd,1<<hbd),dtype=np.uint16)
im0[:] = np.arange(0,1<<bitdepth).reshape(im0.shape)
print('im0',np.min(im0),np.max(im0),im0.shape,im0.dtype)
# tile it to be at least 64 pix
im0 = np.tile(im0, (2, 2))
print('im0',np.min(im0),np.max(im0),im0.shape,im0.dtype)
im0ref = im0
# bitshift it or rescale intensities
#im0 = (im0<<6)
#im0 = (im0<<6) + (im0>>4)
im0 = np.uint16(np.round(im0 * np.float64((1<<16)-1)/np.float64((1<<10)-1)))
print('im0',np.min(im0),np.max(im0),im0.shape,im0.dtype)
# Save it as png
tmp0 = tempfile.NamedTemporaryFile(suffix='.png', delete=False)
print(f'Using tmp file: {tmp0.name}')
imageio.imwrite(tmp0.name,im0)
# Encode with ffmpeg
tmp1 = tempfile.NamedTemporaryFile(suffix='.mkv', delete=False)
# note that adding the following doesn't seem to impact the results
# + ' -bsf:v hevc_metadata=video_full_range_flag=1' \
mycmd = f'ffmpeg -y -i {tmp0.name}' \
+ ' -c:v libx265 -x265-params lossless=1' \
+ ' -pix_fmt gray10be' \
+ f' {tmp1.name}'
print(mycmd)
p = subprocess.run(mycmd.split(), capture_output=True)
print( 'stdout:', p.stdout.decode() )
print( 'stderr:', p.stderr.decode() )
tmp2 = tempfile.NamedTemporaryFile(suffix='.png', delete=False)
mycmd = f'ffmpeg -y -i {tmp1.name}' \
+ ' -pix_fmt gray16be' \
+ f' {tmp2.name}'
print(mycmd)
p = subprocess.run(mycmd.split(), capture_output=True)
print( 'stdout:', p.stdout.decode() )
print( 'stderr:', p.stderr.decode() )
# Read back with ffmpeg
im1 = imageio.imread(tmp2.name)
print('im1',np.min(im1),np.max(im1),im1.shape,im1.dtype)
# Bitshift or scale back
im1pre = im1
#im1 = (im1>>6)
im1 = np.uint16(np.round(im1 * np.float64((1<<10)-1)/np.float64((1<<16)-1)))
# check the result
plt.figure()
plt.imshow(im0ref)
plt.colorbar()
plt.figure()
plt.imshow(im1)
plt.colorbar()
plt.figure()
plt.imshow(np.int32(im1)-np.int32(im0ref))
plt.colorbar()
print('err: ',np.linalg.norm((np.float32(im1)-np.float32(im0ref)).ravel()))
plt.show()
EDIT: I have now also posted my question on the FFmpeg-user list:
http://ffmpeg.org/pipermail/ffmpeg-user/2021-November/053761.html
Also, for convenience, a simple script is provided below to generate the different variants of packing the 10-bit data into 16 bits:
import numpy as np
import imageio
# Create simple image with gradient from
# 0 to (2^bitdepth - 1)
bitdepth = 10
unusedbitdepth = 16-bitdepth
hbd = int(bitdepth/2)
im0 = np.zeros((1<<hbd,1<<hbd),dtype=np.uint16)
im0[:] = np.arange(0,1<<bitdepth).reshape(im0.shape)
# Tile it to be at least 64 pix as ffmpeg encoder may only work
# with image of size 64 and up
im0 = np.tile(im0, (2, 2))
print('im0',np.min(im0),np.max(im0),im0.shape,im0.dtype)
# Save it
imageio.imwrite('gradient10bit-lsb.png',im0)
# Bitshift the values to use most significant bits
im1 = (im0<<unusedbitdepth)
print('im1',np.min(im1),np.max(im1),im1.shape,im1.dtype)
imageio.imwrite('gradient10bit-msb.png',im1)
# Scale the values use all 16 bits
im2 = np.uint16(np.round(im0 * np.float64((1<<16)-1)/np.float64((1<<bitdepth)-1)))
print('im2',np.min(im2),np.max(im2),im2.shape,im2.dtype)
imageio.imwrite('gradient10bit-scaledto16bits.png',im2)
# Left bit replication as a cost-effective approximation of scaling
# See http://www.libpng.org/pub/png/spec/1.1/PNG-Encoders.html
im3 = (im0<<unusedbitdepth) + (im0>>(bitdepth-unusedbitdepth))
print('im3',np.min(im3),np.max(im3),im3.shape,im3.dtype)
imageio.imwrite('gradient10bit-leftbitreplication.png',im3)
And the corresponding raw ffmpeg / ImageMagick commands.
Encoding:
ffmpeg -y -i gradient10bit-scaledto16bits.png -c:v libx265 -x265-params lossless=1 -pix_fmt gray10be gradient10bit-scaledto16bits.mkv
Decoding back to png:
ffmpeg -y -i gradient10bit-scaledto16bits.mkv -pix_fmt gray16be recons-gradient10bit-scaledto16bits.png
Comparison:
magick compare -verbose -metric mae gradient10bit-scaledto16bits.png recons-gradient10bit-scaledto16bits.png diff-scaledto16bits.png
Many thanks,
Tom
If, as you say, you want to encode 10-bit images to video losslessly, you would surely do better to use a lossless format that is capable of storing such things - such as ffv1 - then you can store the full 16 bits without any shifting or scaling.
#!/bin/bash
# Generate 16-bit greyscale PNG
magick -size 1920x1080 xc:gray +noise random 1.png
magick 1.png -format "File: %f Unique colours: %k, Min: %[min], Max: %[max]\n" info:
# Encode to video
ffmpeg -v warning -y -i 1.png -c:v ffv1 -pix_fmt gray16le video.mkv
# Decode back to PNG
ffmpeg -v warning -y -i video.mkv 2.png
magick 2.png -format "File: %f Unique colours: %k, Min: %[min], Max: %[max]\n" info:
# Compare
magick compare -verbose -metric ae {1,2}.png null:
Output
File: 1.png Unique colours: 65536, Min: 0, Max: 65535
File: 2.png Unique colours: 65536, Min: 0, Max: 65535
1.png PNG 1920x1080 1920x1080+0+0 16-bit Gray 3.96256MiB 0.030u 0:00.029
2.png PNG 1920x1080 1920x1080+0+0 16-bit Gray 4161790B 0.020u 0:00.017
Image: 1.png
Channel distortion: AE
gray: 0
all: 0
1.png=> PNG 1920x1080 16-bit Gray 3.96256MiB 0.760u 0:00.064
Thanks to Paul B Mahol on the ffmpeg-user mailing list, I have been able to solve this using temporary rawvideo files. A solution without temporaries would nonetheless be preferable.
# convert png to rawvideo in 16 bits
ffmpeg -y -i gradient10bit-lsb.png -f rawvideo -pix_fmt gray16le gradient10bit-lsb.raw
# convert rawvideo to hevc-mkv in 10 bits by tricking the rawvideo demuxer
# into thinking the input is a 10 bit video
ffmpeg -y -f rawvideo -pixel_format gray10le -video_size 64x64 -i gradient10bit-lsb.raw -c:v libx265 -x265-params lossless=1 -pix_fmt gray10le gradient10bit-lsb.mkv
# delete tmp file
rm -f gradient10bit-lsb.raw
# convert hevc-mkv to rawvideo 10 bit
ffmpeg -y -i gradient10bit-lsb.mkv -f rawvideo -pix_fmt gray10le gradient10bit-lsb-postmkv.raw
# convert rawvideo back to png 16bits by tricking the rawvideo demuxer
# into thinking the input is 16 bits
ffmpeg -y -f rawvideo -pixel_format gray16le -video_size 64x64 -i gradient10bit-lsb-postmkv.raw -pix_fmt gray16be recons-gradient10bit-lsb.png
# delete tmp file
rm -f gradient10bit-lsb-postmkv.raw
# compare
magick compare -verbose -metric mae gradient10bit-lsb.png recons-gradient10bit-lsb.png diff-lsb.png
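An untested sketch of the same round trip without temporary files, piping the rawvideo over stdout/stdin between two ffmpeg processes (file names as in the commands above):
# png -> 16-bit rawvideo on stdout, reinterpreted by the second ffmpeg
# as 10-bit rawvideo, then encoded losslessly
ffmpeg -y -i gradient10bit-lsb.png -f rawvideo -pix_fmt gray16le - | ffmpeg -y -f rawvideo -pixel_format gray10le -video_size 64x64 -i - -c:v libx265 -x265-params lossless=1 -pix_fmt gray10le gradient10bit-lsb.mkv
# and back: 10-bit rawvideo on stdout, reinterpreted as 16 bits
ffmpeg -y -i gradient10bit-lsb.mkv -f rawvideo -pix_fmt gray10le - | ffmpeg -y -f rawvideo -pixel_format gray16le -video_size 64x64 -i - -pix_fmt gray16be recons-gradient10bit-lsb.png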

ffmpeg choose exact frame from long film strip

I'm working with ffmpeg to choose a better thumbnail for my video, and the selection will be based on a slider.
As per the requirement, multiple thumbnails are not needed; just a single long film-strip image, from which the thumbnail is selected with the slider and saved.
I used the command below to get the long strip thumbnail:
ffmpeg -loglevel panic -y -i "video.mp4" -frames 1 -q:v 1 -vf "select=not(mod(n\,40)),scale=-1:120,tile=100x1" video_preview.jpg
I followed the instructions from this tutorial.
I'm able to get the long film-strip image, and moving the slider across it works fine.
My question is: how can I select a particular frame from that slider / film strip? How can I calculate the exact timestamp from the slider position and then run a command to extract that frame?
In one of my projects I implemented the scenario below. I get the video duration from an ffprobe command; if the duration is less than 0.5 seconds I set a shorter thumbnail interval. In your case you should set the time interval for the thumbnail creation accordingly. Hope this helps.
$dur = 'ffprobe -i '.$video.' -show_entries format=duration -v quiet -of csv="p=0"';
$duration = exec($dur);
if($duration < 0.5) {
    $interval = 0.1;
} else {
    $interval = 0.5;
}
// screenshot size
$size = '320x240';
// ffmpeg command
$cmd = "ffmpeg -i $video -deinterlace -an -ss $interval -f mjpeg -t 1 -r 1 -y -s $size $image";
exec($cmd);
You can try this:
ffmpeg -vsync 0 -ss duration -t 0.0001 -noaccurate_seek -i filename -ss 00:30 -t 0.0001 -noaccurate_seek -i filename -filter_complex "[0:v][1:v]concat=n=2[con];[con]scale=80:60:force_original_aspect_ratio=decrease[sc];[sc]tile=2x1[out]" -map "[out]:v" -frames 1 -f image2 filmStrip.jpg
This produces a strip of 2 frames.
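A note on the timestamp calculation, assuming the select=not(mod(n\,40)) command from the question (i.e. every 40th frame is kept): if the slider lands on the i-th tile of the strip (0-based), the source frame number is i * 40 and the timestamp is i * 40 / fps. A minimal shell sketch (video.mp4 and the index are placeholders):
#!/bin/bash
i=12  # 0-based tile index chosen with the slider
# frame rate of the source video, reported as a fraction, e.g. "25/1"
fps=$(ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate -of csv=p=0 video.mp4)
# timestamp of the i-th tile in seconds
t=$(echo "scale=3; $i * 40 / ($fps)" | bc -l)
# extract that exact frame
ffmpeg -ss "$t" -i video.mp4 -frames:v 1 -q:v 1 -y chosen_thumbnail.jpg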

How to Create video from selected images of a folder using FFMPEG?

For the time being I am doing
ProcessStartInfo ffmpeg = new ProcessStartInfo();
ffmpeg.CreateNoWindow = false;
ffmpeg.UseShellExecute = false;
ffmpeg.FileName = @"e:\ffmpeg\ffmpeg.exe";
ffmpeg.Arguments = "for file in (D:\\Day\\*.jpg); do ffmpeg -i \"$file\" -vf fps=1/60 -q:v 3 \"D:\\images\\out.mp4\"; done;";
ffmpeg.RedirectStandardOutput = true;
Process x = Process.Start(ffmpeg);
Here I'm getting an exception saying the system cannot find the specified file.
For the time being I'm considering all the files in D:\Day\*.jpg, but actually I need to query individual files from a list.
Where am I going wrong in the above scenario?
You need to create a separate text file with the image names and use that file to create your video.
inside frameList.txt :
file 'D:\20180205_054616_831.jpg'
file 'D:\20180205_054616_911.jpg'
file 'D:\20180205_054617_31.jpg'
file 'D:\20180205_054617_111.jpg'
and in the Arguments of the process use:
"-report -y -r 15/1 -f concat -safe 0 -i frameList.txt -c:v libx264 -s 1920x1080 -b:v 2000k -vf fps=15,format=yuv420p out.mp4"

When converting .mov to .flv video plays horizontally

When I record a video (.mov) on my iPhone it displays vertically, which is right.
But after converting the .mov to .flv (using ffmpeg) it displays horizontally.
My code:
function convert_flv($vidtime,$infile, $outfile, $w = 0, $h = 0, $extra_infile = '', $extra_outfile = '') {
$parms = '';
if($w == 0 && $h == 0) {
//$parms .= '-sameq ';
} else {
$parms = "-s {$w}x{$h} ";
}
if($vidtime==60) {
$cmd = ffmpeg($infile, $outfile, $parms.' '.$extra_infile, '-t 00:01:00 -ar 22050 -r 15 -f flv '.$extra_outfile);
} else {
$cmd = ffmpeg($infile, $outfile, $parms.' '.$extra_infile, '-t 00:04:00 -ar 22050 -r 15 -f flv '.$extra_outfile);
}
print_r($cmd);
return $cmd;
}
iPhones store orientation information in the .mov metadata, which ffmpeg ignores, leading to rotated output. Correctly parsing that metadata is the hard part.
If you're recording movies in a consistent orientation, you can rotate them by adding -vf "transpose=1" to your ffmpeg command. See the transpose filter documentation.
The orientation is a metadata field in the video file - the actual frames are not recorded in an alternate orientation. You need to apply a transform in ffmpeg to rotate the video.
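A sketch of the conversion with the rotation applied (transpose=1 rotates the video 90 degrees clockwise; the file names are placeholders):
ffmpeg -i input.mov -vf "transpose=1" -ar 22050 -r 15 -f flv output.flv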
