ffmpeg scale pixels differ from Android UI pixels - android-ffmpeg

I have the following simple command that overlays an image on top of a video:
String command = '-y -i $videoPath -i $imagePath -filter_complex "[1]scale=300:300[logo1];[0:v][logo1] overlay=300:300" -qscale 0 $outPutPath';
I need to understand why the width and height in scale=300:300 come out completely different from UI pixels, such as those on an Android phone screen.
In other words: I am building an app and I set the width and height of a Container (or any other widget) to, say, 300 and 300, but the result is bigger than that in ffmpeg pixels, although the pixel values are the same!
As shown in the image, I use the same pixel values, yet the result is different!
My scenario is to take values from the UI and build the ffmpeg scale command from those values, but this makes the width and height inconsistent.
Is there more than one type of pixel, or what causes this difference?
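A likely explanation, sketched below: Android and Flutter widget sizes are logical (density-independent) pixels, while ffmpeg's scale and overlay work in physical video pixels, so the UI value usually has to be multiplied by the device pixel ratio (MediaQuery devicePixelRatio in Flutter, DisplayMetrics.density on Android) before it goes into the command. The following is a minimal Python sketch of that conversion; the helper names and the 2.75 ratio are illustrative, not from any library.

# Minimal sketch, assuming the UI reports logical (density-independent) pixels
# and the device pixel ratio is known. Names and values here are illustrative only.
def logical_to_physical(logical_px: float, device_pixel_ratio: float) -> int:
    return round(logical_px * device_pixel_ratio)

def build_overlay_command(video_path, image_path, out_path,
                          ui_size=300, ui_x=300, ui_y=300,
                          device_pixel_ratio=2.75):
    size = logical_to_physical(ui_size, device_pixel_ratio)
    x = logical_to_physical(ui_x, device_pixel_ratio)
    y = logical_to_physical(ui_y, device_pixel_ratio)
    return (f'-y -i {video_path} -i {image_path} '
            f'-filter_complex "[1]scale={size}:{size}[logo1];'
            f'[0:v][logo1]overlay={x}:{y}" -qscale 0 {out_path}')

# On a 2.75x screen a 300-logical-pixel widget is 825 physical pixels,
# which is why scale=300:300 looks smaller than the widget.
print(build_overlay_command("in.mp4", "logo.png", "out.mp4"))

If the command should match the video's own resolution rather than the screen, the factor to use would instead be the ratio of the video width to the width of the preview widget.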

Related

ffmpeg - zoompan causing stretching

I have two images as input, both 1600x1066. I am vertically stacking them. Then I am drawing a box and vertically stacking that box under both of the images. Inside the box I write text, then I output a video that is 1080x1920. Everything works well until I use zoompan to zoom in on the images; then I get a weird behavior (see images linked below): all input images, including the box, stretch (shrink) vertically and no longer fit the entire height of the video, which is 1920.
The command (removed some drawtext commands from it):
-filter_complex
"color=s=1600x1066:color=blue, drawtext=fontfile=font.otf: text='My Text':fontcolor=white: fontsize=30: x=50: y=50[box];
[0]scale=4000x4000,zoompan=z='min(zoom+0.0015,1.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=125:s=1600x1066[z0];
[1]scale=4000x4000,zoompan=z='min(zoom+0.0015,1.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=125:s=1600x1066[z1];
[z0][z1][box]vstack=inputs=3"
How do I fix this? I want to zoom in without stretching the images.
Video before using zoompan: https://i.stack.imgur.com/kTBto.jpg
Video after using zoompan: https://i.stack.imgur.com/7faNn.png
The problem was the scaling done before zoompan ("scale=4000x4000") to remove the jiggly zoom effect: the scaling ratio was not equal to the aspect ratio of the image.
Ratio of image: 1600/1066 ≈ 1.5
Ratio of scaling: 4000/4000 = 1
So the scaling had to be changed to 6000x4000 (6000/4000 = 1.5), which solved the problem.
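A small sketch of that fix in general terms: pick the upscale height and derive the width from the input's aspect ratio, so the pre-zoompan scale never distorts the frame. The 4000-pixel target and the helper name below are illustrative, based on the values used in this thread.

# Minimal sketch: choose a pre-zoompan upscale size that keeps the input's
# aspect ratio, so zoompan's output (s=WxH) matches the source shape.
def upscale_size(in_w: int, in_h: int, target_h: int = 4000) -> tuple[int, int]:
    w = round(target_h * in_w / in_h)
    w += w % 2                      # keep dimensions even for common pixel formats
    return w, target_h

w, h = upscale_size(1600, 1066)     # -> (6004, 4000), close to the 6000x4000 used above
print(f"scale={w}x{h},zoompan=z='min(zoom+0.0015,1.5)'"
      f":x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=125:s=1600x1066")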

How can I pan, right to left, in a "wide" video output to 1080p? [duplicate]

What would be the most efficient way to create a video from a panoramic image that is, for example, 5000 px wide and 600 px high?
I created this GIF image that would explain things a bit better. Imagine that the video would be inside the red border. So the video would potentially be panning from left to right.
A moving crop is the most convenient way to achieve this in ffmpeg.
ffmpeg -loop 1 -i in.jpg -vf "crop=500:ih:'min((iw/10)*t,9*iw/10)':0" -t 10 pan.mp4
The crop filter crops to a size of 500 x ih i.e. 500x600. The top-left co-ordinate of the cropping window is fixed to Y=0. For X, the expression is min((iw/10)*t,9*iw/10) i.e. in each second, the cropping window will slide across 10% of the image width. So, at t=9, the cropping window covers (4500,0) to (5000,600) for the example image. From that time, the min function returns the other value 9*iw/10 = 4500 and the sliding stops.
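A sketch that generalizes this: given the output width and a pan duration, build the crop x-expression so the window traverses the whole image and then holds at the right edge. This is a slight variant of the expression above (it moves (iw-out_w)/duration pixels per second rather than 10% of iw per second); the helper name is illustrative.

# Minimal sketch: build a crop filter that pans left-to-right across the full
# image over `duration` seconds, then holds at the right edge.
def pan_crop(out_w: int, duration: int) -> str:
    x_expr = f"min((iw-{out_w})/{duration}*t,iw-{out_w})"
    return f"crop={out_w}:ih:'{x_expr}':0"

# For the 5000x600 example: ffmpeg -loop 1 -i in.jpg -vf "<this filter>" -t 10 pan.mp4
print(pan_crop(500, 10))   # crop=500:ih:'min((iw-500)/10*t,iw-500)':0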
Dynamically crop a panorama video to 1280x720 using timestamps from a detector:
ffmpeg script: crop=1280:ih:'func_cropstartx':0
func_cropstartx:
    cropstartx[0] = location[0].x
    for every location 0 < i <= N:
        cropstartx[i] = if(gte(t\, location[i].time)\, location[i].x\, cropstartx[i-1]);
Use a sorted list of setpoints
Example:
crop=1280:ih:'if(gte(t\,10)\,600\,if(gte(t\,8)\,400\,if(gte(t\,6)\,300\,if(gte(t\,4)\,200\,100))))':0
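As a concrete version of that pseudocode, here is a sketch that turns a sorted list of (time, x) setpoints into the nested if() expression used for crop's x parameter; the setpoints below are the ones from the example.

# Minimal sketch: build the nested if(gte(t,...)) expression for crop's x
# parameter from a list of (time, x) setpoints sorted by time (earliest first).
def crop_x_expr(setpoints, initial_x):
    expr = str(initial_x)
    for t, x in setpoints:          # later times end up as the outermost if()
        expr = f"if(gte(t\\,{t})\\,{x}\\,{expr})"
    return expr

expr = crop_x_expr([(4, 200), (6, 300), (8, 400), (10, 600)], initial_x=100)
print(f"crop=1280:ih:'{expr}':0")   # reproduces the example above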

FFMPEG set ZxY width and height thumbnail while keeping aspect ratio with padding

I have been searching Stack Overflow for code to create thumbnails using FFmpeg.
But the options I've come across either produce distorted thumbnails when using -s 100x100 on a specific frame, or only scale one side correctly when using -vf scale=100:-1. This is an issue for me, as all thumbnails need to be the same size.
Is there a way to achieve both a set height/width and maintain a consistent aspect ratio, such as filling in blank spaces with black boxes?
https://trac.ffmpeg.org/wiki/Scaling%20(resizing)%20with%20ffmpeg
Sometimes there is a need to scale the input image in such a way that it fits into a specified rectangle, i.e. if you have a placeholder (an empty rectangle) into which you want to fit any given image. This is a little bit tricky, since you need to check the original aspect ratio in order to decide which component to specify, and set the other component to -1 (to keep the aspect ratio). For example, if we would like to scale our input image into a rectangle with dimensions of 320x240, we could use something like this:
ffmpeg -i input.jpg -vf scale="'if(gt(a,4/3),320,-1)':'if(gt(a,4/3),-1,240)'" output_320x240_boxed.png
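A quick sketch of the decision that expression encodes (4/3 is the target box's aspect ratio): compare the input's aspect ratio with the box's and fix whichever side would otherwise overflow. The function name is illustrative.

# Minimal sketch of the logic behind if(gt(a,4/3),320,-1).
def fit_scale_args(in_w: int, in_h: int, box_w: int = 320, box_h: int = 240) -> str:
    if in_w / in_h > box_w / box_h:     # wider than the box: pin the width
        return f"scale={box_w}:-1"
    return f"scale=-1:{box_h}"          # taller than the box: pin the height

print(fit_scale_args(1920, 1080))   # scale=320:-1  -> 320x180, fits 320x240
print(fit_scale_args(600, 800))     # scale=-1:240  -> 180x240, fits 320x240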

generate video containing scrolling image

I want to generate a video [let's say 800x600] from an 800x10000 still image.
The image has to scroll, from top to bottom as if someone was actually scrolling a page.
If it could scroll faster over some portions and slower over others, that would be great, if not I think I could just make a few separate videos and then just stitch them up.
I cannot find any documentation on this subject; could anyone give me a hint? Thanks for your time!
Use the scroll filter. The crop filter is optional and will output a reasonable width and height for large image inputs. You can consider using the scale filter too. The format filter outputs a widely compatible pixel format / chroma subsampling scheme.
Vertical
ffmpeg -loop 1 -i input.png -vf "scroll=vertical=0.01,crop=iw:600:0:0,format=yuv420p" -t 10 output.mp4
Horizontal
ffmpeg -loop 1 -i input.png -vf "scroll=horizontal=0.01,crop=800:600:0:0,format=yuv420p" -t 10 output.mp4
Scroll filter options
horizontal, h Set the horizontal scrolling speed. Default is 0. Allowed range is from -1 to 1. Negative values change the scrolling direction.
vertical, v Set the vertical scrolling speed. Default is 0. Allowed range is from -1 to 1. Negative values change the scrolling direction.
hpos Set the initial horizontal scrolling position. Default is 0. Allowed range is from 0 to 1.
vpos Set the initial vertical scrolling position. Default is 0. Allowed range is from 0 to 1.
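For completeness, a sketch of how one might pick the speed value (0.01 above). This assumes the scroll speed is the fraction of the image advanced per output frame, which is my reading of the 0-to-1 range; the helper name is illustrative, not from the ffmpeg docs quoted above.

# Minimal sketch, assuming scroll's speed means the fraction of the image
# scrolled per output frame. For an 800x10000 image cropped to a 600-tall
# window, scrolling the remaining 9400 px over 20 s at 25 fps:
def scroll_speed(image_len: int, window_len: int, duration_s: float, fps: float = 25.0) -> float:
    travel_fraction = (image_len - window_len) / image_len
    return travel_fraction / (duration_s * fps)

v = scroll_speed(10000, 600, duration_s=20)
print(f"scroll=vertical={v:.5f},crop=iw:600:0:0,format=yuv420p")   # about 0.00188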

ffmpeg resize down larger video to fit desired size and add padding

I'm trying to resize a larger video to fit an area that I have. To achieve this, I first calculate the dimensions of the resized video so that it fits my area, and then I try to add padding so that the final result has the desired dimensions while keeping the aspect ratio.
So let's say the original video dimensions are 1280x720; to fit my area of 405x320 I first need to resize the video to 405x227. I do that, and everything is fine at this point. I do some math and find out that I have to add 46 pixels of padding at the top and the bottom.
So the padding parameter of the command for that would be -vf "pad=405:320:0:46:black". But each time I run the command I get an error like Input area 0:46:405:273 not within the padded area 0:0:404:226.
The only docs for padding that I found is this http://ffmpeg.org/libavfilter.html#pad.
I don't know what I'm doing wrong. Anyone had this problem before? Do you have any suggestions?
try -vf "scale=iw*min(405/iw\,320/ih):ih*min(405/iw\,320/ih),pad=405:320:(405-iw)/2:(320-ih)/2"
Edit to clarify what's going on in that line: you are asking how to scale one box to fit inside another box. The boxes might have different aspect ratios. If they do, you want to fill one dimension, and center along the other dimension.
# you defined the max width and max height in your original question
max_width = 405
max_height = 320
# first, scale the image to fit along one dimension
scale = min(max_width/input_width, max_height/input_height)
scaled_width = input_width * scale
scaled_height = input_height * scale
# then, position the scaled image on the padded background
# (in the pad expression above, iw/ih already refer to the scaled frame)
padding_ofs_x = (max_width - scaled_width) / 2
padding_ofs_y = (max_height - scaled_height) / 2
Here is a generic filter expression for scaling (maintaining aspect ratio) and padding any source size to any target size:
-vf "scale=min(iw*TARGET_HEIGHT/ih\,TARGET_WIDTH):min(TARGET_HEIGHT\,ih*TARGET_WIDTH/iw),
pad=TARGET_WIDTH:TARGET_HEIGHT:(TARGET_WIDTH-iw)/2:(TARGET_HEIGHT-ih)/2"
Replace TARGET_WIDTH and TARGET_HEIGHT with your desired values. I use this to pull a 200x120 padded thumbnail from any video. Props to davin for his nice overview of the math.
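A small sketch of filling those placeholders in code (the 200x120 thumbnail case mentioned above); the helper name is illustrative.

# Minimal sketch: substitute target dimensions into the generic
# scale-then-pad filter shown above.
def fit_and_pad_filter(target_w: int, target_h: int) -> str:
    return (f"scale=min(iw*{target_h}/ih\\,{target_w}):min({target_h}\\,ih*{target_w}/iw),"
            f"pad={target_w}:{target_h}:({target_w}-iw)/2:({target_h}-ih)/2")

print(f'-vf "{fit_and_pad_filter(200, 120)}"')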
Try this:
-vf 'scale=640:480:force_original_aspect_ratio=decrease,pad=640:480:x=(640-iw)/2:y=(480-ih)/2:color=black'
According to FFmpeg documentation, the force_original_aspect_ratio option is useful to keep the original aspect ratio when scaling:
force_original_aspect_ratio
Enable decreasing or increasing output video width or height if necessary to keep the original aspect ratio. Possible values:
disable
Scale the video as specified and disable this feature.
decrease
The output video dimensions will automatically be decreased if needed.
increase
The output video dimensions will automatically be increased if needed.
