Avisynth total frames does not equal VirtualDub total frames - ffmpeg

It appears that Dissolve and/or Fade change the total number of frames in .avs scripts. When I add up the frame counts in the avs script and then load the script in VirtualDub, the totals are different. My real-world example below shows 822 frames versus 1368 frames for what should be the same video. I have run some basic tests which appear to support this hypothesis. Of course, I may be doing something stupid; any guidance would be greatly appreciated, and please let me know if I can clarify anything. FFmpeg also borks on the same script, which leads me to think this is an AviSynth issue. Or my lack of avs coding skills.
System specs:
Win7,
FFmpeg version: 20170223-dcd3418 win32 shared,
AVISynth version: 2.6
Test1.avs = 200 frames long = Expected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24, start=1, end=100)
v2=ImageReader("2.png", fps=24, start=1, end=100)
video = v1 + v2
return video
Test2.avs with return Dissolve = 195 frames long = Unexpected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24, start=1, end=100)
v2=ImageReader("2.png", fps=24, start=1, end=100)
return Dissolve(v1, v2, 5)
Test3.avs with FadeOut(FadeIn(...)) = 202 frames long = Unexpected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24, start=1, end=100)
v2=ImageReader("2.png", fps=24, start=1, end=100)
fadeOut(fadeIn(v1 + v2, 60), 60)
Test4.avs with dissolve and fade = 197 frames long = Unexpected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24, start=1, end=100)
v2=ImageReader("2.png", fps=24, start=1, end=100)
v3 = Dissolve(v1, v2, 5)
fadeOut(fadeIn(v3, 60), 60)
Test5.avs explicitly specifying frame rates on Dissolve and Fade = 197 frames = Unexpected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24, start=1, end=100)
v2=ImageReader("2.png", fps=24, start=1, end=100)
v3 = Dissolve(v1, v2, 5, 24)
fadeOut(fadeIn(v3, 60, $000000, 24), 60, $000000, 24)
realExample = 822 frames long = Expected behaviour (this is what I want)
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24).trim(1,106)
v3=ImageReader("3.png", fps=24).trim(1,471)
v9=ImageReader("9.png", fps=24).trim(1,58)
v10=ImageReader("10.png", fps=24).trim(1,35)
v11=ImageReader("11.png", fps=24).trim(1,152)
video = v1 + v3 + v9 + v10 + v11
return video
realExample with Dissolve and Fade = 1368 frames long = Unexpected behaviour
LoadPlugin("C:\Program Files (x86)\AviSynth 2.5\plugins\VSFilter.dll")
v1=ImageReader("1.png", fps=24).trim(1,106)
v3=ImageReader("3.png", fps=24).trim(1,471)
v9=ImageReader("9.png", fps=24).trim(1,58)
v10=ImageReader("10.png", fps=24).trim(1,35)
v11=ImageReader("11.png", fps=24).trim(1,152)
d1 = Dissolve(v1, v3, 5)
d3 = Dissolve(v3, v9, 5)
d9 = Dissolve(v9, v10, 5)
d10 = Dissolve(v10, v11, 5)
fadeOut(fadeIn(d1 + d3 + d9 + d10,60),60)

You stated that some of your results gave "unexpected behavior", but you didn't specify what you expected them to be, so it's unclear what you think is wrong and where your misunderstanding lies. (When discussing problems, you should always state what results you got and what results you expected instead.)
In your Dissolve example (Test2.avs), you say that 195 frames is unexpected, but that sounds correct to me. "Dissolving" two clips together means that the end of one clip overlaps with the beginning of a second clip as one gradually fades into the other; this is not the same as fading out the first clip and then fading in the second clip. The overlap means that the result must be shorter than the sum of the clips' individual lengths. You combined two 100-frame clips and specified a 5-frame overlap, so 100 + 100 - 5 = 195.
In your FadeOut example (Test3.avs), you say that 202 frames is unexpected, but that also sounds correct to me. The documentation for FadeIn/FadeOut states:
An additional color frame is added at the start/end, thus increasing the total frame count by one (or for FadeIO, by two).
Since you made one call to FadeIn and one call to FadeOut in Test3.avs, two extra frames were added. If you do not want this, you can use FadeIn0/FadeOut0 (or FadeIO0, since you're using both), although note that with those functions the first/last frame will not be exactly black. If you want exactness, simply trim off the first and/or last frame before using the normal FadeIn/FadeOut/FadeIO functions.
Your "real examples" are comparing apples to oranges. The version with Dissolve dramatically increases the frame count because it's combining the same clips multiple times:
d1 = Dissolve(v1, v3, 5)
d3 = Dissolve(v3, v9, 5)
...
fadeOut(fadeIn(d1 + d3 + d9 + d10,60),60)
d1 and d3 each include a copy of the v3 clip, and then you spliced d1 and d3 together at the end, meaning that v3 is included twice. (This is also true for v9 and v10.)
You probably intended to do something like:
video = Dissolve(v1, v3, 5)
video = Dissolve(video, v9, 5)
video = Dissolve(video, v10, 5)
video = Dissolve(video, v11, 5)
video = FadeOut(FadeIn(video, 60), 60)
or more succinctly:
video = FadeIO(Dissolve(v1, v3, v9, v10, v11, 5), 60)
The result should be 804 frames long: (822 frames from the original clips) - (4 dissolve points) * (5 frames of overlap per dissolve) + (2 frames from FadeIO).
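As a sanity check, both frame counts can be reproduced with a little arithmetic (a plain-Python sketch; `dissolve_len` is my own name, and the clip lengths come from the trim calls in the real examples):

```python
# Frame-count arithmetic for the examples above.
def dissolve_len(a, b, overlap=5):
    # Dissolve overlaps the end of clip `a` with the start of clip `b`,
    # so the result is shorter than the sum of the two lengths.
    return a + b - overlap

clips = [106, 471, 58, 35, 152]  # v1, v3, v9, v10, v11 from the trim calls

# Chained correctly: one running clip, four 5-frame overlaps, +2 for FadeIO
total = clips[0]
for c in clips[1:]:
    total = dissolve_len(total, c)
print(total + 2)  # 804

# As posted: pairwise dissolves spliced together duplicate the middle clips
pairs = [dissolve_len(a, b) for a, b in zip(clips, clips[1:])]
print(sum(pairs) + 2)  # 1368 -- matches the "unexpected" real example
```

The same function also reproduces Test2.avs: `dissolve_len(100, 100)` gives 195.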
If you actually wanted to combine the clips by fading out and then fading in, then you could preserve the original frame count by doing:
video = FadeIO0(v1, 60) \
+ FadeIO0(v3, 60) \
+ FadeIO0(v9, 60) \
+ FadeIO0(v10, 60) \
+ FadeIO0(v11, 60)

Related

How to extract an image per minute from video with a predefined starting frame in cv2 VideoCapture

I would like to extract one image per minute from an mp4 video recording and I have a code for that already, however, I also would like to add a starting frame from where the extraction should start and either set an ending frame or the number of images that should be extracted.
This is because I would like to avoid trimming the videos as it takes a very long time.
My existing code to extract the images is the following:
cap = cv2.VideoCapture(r'C:/my_folder/my_video.mp4')
i = 1
while (cap.isOpened()):
ret, frame = cap.read()
if ret == False:
break
if i % 1800 == 0:
cv2.imwrite(r'C:/my_folder/pictures/' + str(i) + '.jpg', frame)
i += 1
cap.release()
cv2.destroyAllWindows()
Let's say I would like to start the extraction from frame 2700 and from that every 1800th frame until I have 30 frames.
Is there a way to do this? Every help would be much appreciated!
Change the if condition to the following:
if i >= 2700 and (i - 2700) % 1800 == 0:
(The simpler test i > 2700 and i % 1800 == 0 would make the first capture happen at frame 3600, since 2700 is not a multiple of 1800.) To stop after 30 images, count the writes and break out of the loop once 30 have been saved.
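A single modulo condition on its own never stops after 30 images, and depending on how it is written the first capture may not land on frame 2700 itself. The selection logic can be sketched separately before wiring it into the cv2 loop (plain Python; `should_save`, `start`, `step`, and `max_images` are my own names):

```python
def should_save(i, start=2700, step=1800):
    """True for frame `start` and every `step`-th frame after it."""
    return i >= start and (i - start) % step == 0

# Stand-in for the frame-reading loop: replace the modulo test with
# should_save(i) and break once 30 images have been written.
saved, max_images = 0, 30
for i in range(100000):
    if should_save(i):
        saved += 1          # cv2.imwrite(...) would go here
        if saved >= max_images:
            break
print(saved, i)  # 30 54900
```

The last saved frame is 2700 + 29 * 1800 = 54900. With cv2 you could also seek directly via `cap.set(cv2.CAP_PROP_POS_FRAMES, 2700)` instead of reading and discarding the first 2700 frames.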

How can I learn the starting time of each frame in a video?

It is very important for me to learn the start time of each frame of a video.
Currently I need to determine the starting point manually (for example 848 here) using the MATLAB code below:
v = VideoReader('video1.avi','CurrentTime',848);
while hasFrame(v)
video_frame = readFrame(v);
counter=counter+1;
if counter==1
imshow(video_frame)
imhist(video_frame(:,:,1))
end
end
What I want is to distinguish some video frame from the others by using histogram. At the end my aim is to reach the exact showing time of the distinguished frames.
After editing:
These are the frame histogram outputs:
The histogram size of some frames is different from the previous one; do you know the reason?
difference=[difference sum(abs(histcounts(video_frame)-histcounts(lastframe)))];
Because taking the difference requires histograms of equal length, I had to remove the frames whose histogram size differed, but that causes some frames to be missed.
I haven't found a video example that looks like what you describe; please consider always including an example.
This example code calculates the differences in the histcounts. Please note that waitforbuttonpress is in the loop, so you have to click for each frame while testing, or remove it when the video is too long. Does this work on your file?
v = VideoReader('sample.avi','CurrentTime',1);
figure1=figure('unit','normalized','Position',[0.2 0.2 0.4 0.6]);
axes1=subplot(3,1,1);
axes2=subplot(3,1,2);
axes3 = subplot(3,1,3);
counter=0;
difference=[];
video_frame=readFrame(v);
while hasFrame(v)
lastframe=video_frame;
video_frame = readFrame(v);
counter=counter+1;
imshow(video_frame,'Parent',axes1);
[a,b]=histcounts(video_frame(:,:,1));
plot(b(1:end-1),a,'Parent',axes2);
difference=[difference sum(abs(histcounts(video_frame,0:255)-histcounts(lastframe,0:255)))];
bar(1:counter,difference,'Parent',axes3);
waitforbuttonpress
end
[~,onedistinguished]=max(difference);
%defining a threshold like every value that is bigger 4000
multidistinguished=find(difference>4000);
disp(['majorly changed at: ' num2str(multidistinguished)]);
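The same histogram-difference idea can be sketched in Python/NumPy on synthetic frames (my own toy data, not the asker's video; fixed bin edges play the role of histcounts(...,0:255) and keep every histogram the same length, which avoids the size-mismatch problem mentioned above):

```python
import numpy as np

def hist_diff(frame_a, frame_b, bins=256):
    """Sum of absolute differences between the two frames' grey histograms.
    Fixed bin edges keep the histograms the same length for every frame."""
    edges = np.arange(bins + 1)
    ha, _ = np.histogram(frame_a, bins=edges)
    hb, _ = np.histogram(frame_b, bins=edges)
    return int(np.abs(ha - hb).sum())

rng = np.random.default_rng(0)
dark = rng.integers(0, 64, (48, 64), dtype=np.uint8)       # frames 0-2
bright = rng.integers(192, 256, (48, 64), dtype=np.uint8)  # frames 3-4
frames = [dark, dark, dark, bright, bright]
diffs = [hist_diff(frames[k - 1], frames[k]) for k in range(1, len(frames))]
print(int(np.argmax(diffs)) + 1)  # the cut sits between frames 2 and 3 -> 3
```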

How to extract frames at particular intervals from video using matlab

I am using matlab 2013a software for my project.
I face a problem while splitting video into individual frames.
I want to know how to get frames at specific intervals from a video, i.e. I want to grab frames at the rate of one frame per second. My input video has 50 frames/sec. In the code I have used step() to slice the video into frames.
The following is my code, basically a face-detection code (it detects multiple faces in a video). This code captures every frame in the video (i.e. ~50 fps) and processes it. I want to process frames at the rate of 1 fps. Please help me.
clear classes;
videoFileReader = vision.VideoFileReader('C:\Users\Desktop\project\05.mp4');
videoFrame = step(videoFileReader);
frameSize = size(videoFrame); % needed by vision.VideoPlayer below
faceDetector = vision.CascadeObjectDetector(); % Finds faces by default
tracker = MultiObjectTrackerKLT;
videoPlayer = vision.VideoPlayer('Position',[200 100 fliplr(frameSize(1:2)+30)]);
bboxes = [];
while isempty(bboxes)
framergb = step(videoFileReader);
frame = rgb2gray(framergb);
bboxes = faceDetector.step(frame);
end
tracker.addDetections(frame, bboxes);
frameNumber = 0;
keepRunning = true;
while keepRunning
framergb = step(videoFileReader);
frame = rgb2gray(framergb);
if mod(frameNumber, 10) == 0
bboxes = 2 * faceDetector.step(imresize(frame, 0.5));
if ~isempty(bboxes)
tracker.addDetections(frame, bboxes);
end
else
% Track faces
tracker.track(frame);
end
end
%% Clean up
release(videoPlayer);
But this actually considers every frame. I want to grab 1fps.
It cannot be done directly in MATLAB 2013a, because the video access library does not provide the feature you want. Writing an efficient frame-skipping routine is not really possible using just MATLAB code; you would need to look inside the video libraries.
Working around it, you have two basic options:
Do as little work as possible on frames that you do not want to process.
Where you currently have
framergb = step(videoFileReader);
Instead do something like
for i=1:49,
step(videoFileReader);
end
framergb = step(videoFileReader);
(NB this does not allow for going beyond end of input)
Pre-process your file with a tool like ffmpeg, and reduce the frame-rate before you use Matlab.
The ffmpeg command might look something like this:
ffmpeg -i 05.mp4 -r 1 05_at_1fps.mp4
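Reducing a 50 fps source to 1 fps effectively keeps one frame in every fifty. A rough sketch of which source frames survive (plain Python; `kept_frames` is my own name, and real ffmpeg picks frames by nearest timestamp, so this is only the uniform-timestamp approximation):

```python
def kept_frames(total, src_fps=50, dst_fps=1):
    """Indices of the source frames that survive a src_fps -> dst_fps drop,
    assuming uniform timestamps and an integer fps ratio."""
    step = src_fps // dst_fps
    return list(range(0, total, step))

print(kept_frames(250))  # [0, 50, 100, 150, 200]
```

This is the same arithmetic as the MATLAB skip-49-frames loop in option 1 above.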

Creating Movie for each Generation of Data [duplicate]

This question already has an answer here:
How to create movies on each generation of a for loop in Matlab plot
(1 answer)
Closed 9 years ago.
I have the following code:
figure;
contour(X1,X2,f);
hold on
plot(top(1:size(top,1)), 'rx');
EDIT
figure;
for i = 1: G
contour(X1,X2,f);
hold on
plot(top(1:size(top,1)), 'rx');
end
NB: G is the maximum generation.
This is supposed to plot contours of the sphere function superimposed with the selected individuals. In each iteration the best individuals are selected, and this goes on until the global optimum is reached. I need to show this in movie form, as shown below:
When you run it, each stage of the iteration is indicated, as in the slides attached. This is what I am trying to do. Any ideas, please?
OK, I am just copying and pasting now, from here.
However I added FrameRate (per second) since you might want to use (or ask) it later.
writerObj = VideoWriter('Your_video.avi');
writerObj.FrameRate = 1; % 1 frame per second animation.
open(writerObj);
fig_h = figure;
for i = 1: G
contour(X1,X2,f);
hold on
plot(top(1:size(top,1)), 'rx');
frame = getframe(fig_h); % or frame = getframe; since getframe gets gcf.
writeVideo(writerObj, frame);
end
close(writerObj);
Now you will have a Your_video.avi file in your working directory.
If VideoWriter is not supported by your MATLAB, you could use avifile, as mentioned in this answer (or in the MathWorks documentation example here), like this:
aviobj = avifile('Your_video.avi','compression','None', 'fps', 1);
fig_h = figure;
for i = 1:G
contour(X1,X2,f);
hold on
plot(top(1:size(top,1)), 'rx');
frame = getframe(fig_h); % or frame = getframe; since getframe gets gcf.
aviobj = addframe(aviobj, frame);
end
aviobj = close(aviobj);
EDIT
A problem may occur, as pointed out by this question also, in which the captured frame is a constant image. If you are running MATLAB on Windows, this problem may be caused by Windows in conjunction with certain graphics drivers, and may be solved as mentioned in this answer.

Cocoa QTMovie - How can i change the duration of each frame being added to my movie?

I am creating a movie by adding image frames to my QTMovie; every frame is supposed to show up for about 0.2 seconds, but the closest I have got is 1 second per frame.
I tried entering amounts less than 1 into my QTTime, but that way my movie length would be 0 seconds, and the documentation doesn't describe what the parameters of QTMakeTime are.
Any idea how to achieve this?
QTTime frameDuration = QTMakeTime(1, 1);
for (//here goes my loop to read each frame)
{
[movie addImage:img forDuration:frameDuration withAttributes:dict];
}
The second parameter is the timescale, i.e. the number of time units per second, so the duration is timeValue/timeScale seconds:
QTTime frameDuration = QTMakeTime(1, 7);
This means 7 frames per second, which worked fine.
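The QTMakeTime arithmetic is easy to check (plain Python; `qt_duration_seconds` is my own name):

```python
from fractions import Fraction

def qt_duration_seconds(time_value, time_scale):
    """Duration described by QTMakeTime(time_value, time_scale), in seconds."""
    return Fraction(time_value, time_scale)

print(qt_duration_seconds(1, 1))  # 1   -> one second per frame (too slow)
print(qt_duration_seconds(1, 7))  # 1/7 -> ~0.14 s per frame
print(qt_duration_seconds(1, 5))  # 1/5 -> exactly the 0.2 s per frame asked for
```

So QTMakeTime(1, 5) (or equivalently QTMakeTime(2, 10)) would give exactly 0.2 seconds per frame.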
