How do I blur a segment in a video using ffmpeg?

I am looking through the docs for ffmpeg and am having a hard time finding my way around.
Is there a way to blur a specific time range of a video?
Example: in a 1-minute video, I want to blur seconds 30-35.
Is this possible, and if so, what does the command look like?
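One approach (a sketch, assuming a reasonably recent ffmpeg build with timeline editing support, and placeholder file names) is to apply boxblur only while the timestamp falls in the target range:

```shell
# Blur the whole frame from t=30s to t=35s; audio is copied untouched.
# "input.mp4" and "output.mp4" are placeholder names.
ffmpeg -i input.mp4 \
  -vf "boxblur=luma_radius=10:luma_power=2:enable='between(t,30,35)'" \
  -c:a copy output.mp4
```

To blur only a region rather than the whole frame, the usual pattern is to crop a copy, blur it, and overlay it back at the same position.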

Related

How can I add a screen shake effect to a video using ffmpeg?

I have a video I'm trying to make for a dumb joke. The input video stream could be anything, but in my case it's an image file that I'm adding audio to. I want to apply a screen shake effect/earthquake effect to the video to make it a bit more dramatic. Is there a simple way to do this with ffmpeg? I know ffmpeg can do a lot, but at a certain point it gets a bit impractical to do everything in ffmpeg.
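One simple sketch (not from this thread; it assumes the crop filter's per-frame expression evaluation and placeholder file names) is to crop a slightly smaller window whose offset jitters randomly every frame:

```shell
# Hypothetical shake effect: crop a window 40 px smaller than the input
# and jitter its position each frame. random(n) returns a pseudo-random
# value in [0,1); the slack cropped off gives the window room to move.
ffmpeg -i input.mp4 \
  -vf "crop=w=in_w-40:h=in_h-40:x='20+20*(random(1)-0.5)':y='20+20*(random(2)-0.5)'" \
  -c:a copy shaken.mp4
```

Scaling the `20*` multipliers changes the shake amplitude; note the output is 40 px smaller in each dimension than the input.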

FFmpeg: blur part of a video interactively

I want to conditionally blur part of a live stream in real time with ffmpeg.
I read about the zeromq filter, but the output of ffmpeg -filters says that the boxblur filter doesn't support commands. For example, I want to blur the bottom half of the video, then after some time disable the blur, then blur the whole video.
How can I achieve this?
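One workaround sketch (an assumption on my part, not a tested recipe; it requires a build with --enable-libzmq and uses placeholder names): even though boxblur itself ignores commands, the overlay filter does accept runtime commands for x and y. So you can blur a copy of the region, overlay it on the original, and slide the blurred layer on or off screen over zmq:

```shell
# Blur the bottom half and overlay it; "ol" is a hypothetical instance name.
# The trailing zmq filter opens a socket that accepts filter commands.
ffmpeg -i input.mp4 -filter_complex \
  "split[a][b];[b]crop=iw:ih/2:0:ih/2,boxblur=10[blur];[a][blur]overlay@ol=x=0:y=H/2,zmq" \
  out.mp4

# From another terminal, push the blurred layer off screen to "disable" it,
# e.g. with the zmqsend tool shipped in ffmpeg's tools/ directory:
#   echo overlay@ol y 10000 | tools/zmqsend
```

Some blur filters in newer builds (gblur, for instance) do advertise command support, so checking `ffmpeg -filters` on your exact build is worthwhile before resorting to this.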

How to get perspective coordinates from file for image overlay with ffmpeg?

Is it possible to do something like this purely with ffmpeg?
Let's say we have a text file with frame-by-frame coordinates for the 4 corners where the image should go. ffmpeg has a perspective filter, but how would one get ffmpeg to pull the frame coordinates from the text file? I'm guessing with a pipe of some sort?
The perspective filter corrects the input's perspective, it doesn't apply a perspective effect. Applied to an overlay it results in a rectangular overlay with a corrected perspective.
The closest you can get with the already implemented filters is via the frei0r perspective module.
You can write your own filter for ffmpeg or a frei0r module.
Update: using @Mulvya's tip, you can use timeline editing with perspective:
perspective=enable='eq(n,0)':x0=...,perspective=enable='eq(n,1)':x0=...
where n is the current frame number.
This will result in an impossibly long command line which may exceed the system limit. You're still better off writing your own filter.
Alternatively, you can process one frame at a time with a separate command, save each output as an image, and re-assemble the video at the end.
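The command-length problem can also be sidestepped with -filter_complex_script, which reads the filtergraph from a file instead of the command line. A sketch, assuming a hypothetical coords.txt with one line per frame holding the eight corner values:

```shell
# Hypothetical coordinate file: one line per frame, eight numbers
# (x0 y0 x1 y1 x2 y2 x3 y3). Two sample frames:
printf '0 0 100 0 100 100 0 100\n1 0 101 0 101 100 1 100\n' > coords.txt

# Build one comma-chained perspective entry per frame into graph.txt.
awk -v q="'" '{
  if (NR > 1) printf ",";
  printf "perspective=enable=%seq(n,%d)%s:x0=%s:y0=%s:x1=%s:y1=%s:x2=%s:y2=%s:x3=%s:y3=%s", q, NR-1, q, $1, $2, $3, $4, $5, $6, $7, $8
}' coords.txt > graph.txt

# Read the graph from the file instead of the command line:
#   ffmpeg -i input.mp4 -filter_complex_script graph.txt out.mp4
```

This only lifts the shell's argument-length limit; a graph with one filter instance per frame will still be very slow to parse, so the advice about writing a dedicated filter stands for long videos.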

Still images to video for storage - But back to still images for viewing

Using ffmpeg I can take a number of still images and turn them into a video. I would like to do this to decrease the total size of all my timelapse photos, but I would also like to extract the still images for use at a later date.
In order to use this method:
- I will need to correlate each original still image with a frame number in the video.
- And I will need to extract a thumbnail for a given frame number in the video.
But before I go down this rabbit hole, I want to know if these requirements are possible with ffmpeg, and if so, any hints on how to accomplish the task.
Note: the still images are a timelapse from a single camera over a day, so temporal compression should give measurable savings compared to a stack of JPEGs.
When you use ffmpeg to create a video from a sequence of images, the source images aren't modified in any way. You should still be able to use them for what you're trying to do, unless I'm misunderstanding your question.
Edit: you can also use ffmpeg to extract images from an existing video. The extracted frames are re-encoded, so they can be high quality but won't be bit-identical to the originals. You'd have to experiment to make sure the extracted images match the input images in sequential order and naming, and you'll need to take the frame rate (fps) into account.
The command to do this (from the ffmpeg documentation) is as follows:
ffmpeg -i movie.mpg movie%d.jpg
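For the thumbnail-by-frame-number requirement, one sketch (assuming the select filter and a 0-based frame index; file names are placeholders) is:

```shell
# Extract the single frame with index 100 as an image.
# The comma inside eq() must be escaped for the filter parser.
ffmpeg -i movie.mpg -vf "select=eq(n\,100)" -frames:v 1 frame100.png
```

Correlating originals to frame numbers then reduces to bookkeeping: if the images are fed in sorted order via a %d pattern, frame n corresponds to the (n+1)-th input image.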

ffmpeg: overlay a series of images at a specific interval?

I am generating a series of images for scrolling text, and I need to overlay those images on a video during a specific interval (for example, from 10 to 15 seconds). How can I do that using ffmpeg?
According to FFmpeg developer Stefano Sabatini, writing in 2012, this could not be done at the time.
Right now you can't do that by using the overlay, you need a multi-steps process, where you need to split the file, overlay just in the interested segments and merge again the modified segments, or create an ad-hoc video to overlay which shows the image only in the interested intervals.
ffmpeg.org/pipermail/ffmpeg-user/2012-January/004062.html
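That answer predates timeline editing. In current ffmpeg builds the overlay filter supports the enable option, so a single pass works; a sketch with placeholder file names:

```shell
# Show text.png on top of the video only between t=10s and t=15s.
# "video.mp4" and "text.png" are placeholder names.
ffmpeg -i video.mp4 -i text.png -filter_complex \
  "[0:v][1:v]overlay=x=0:y=0:enable='between(t,10,15)'" \
  -c:a copy output.mp4
```

For a series of images, they can be fed as a second input with an image-sequence pattern (e.g. -framerate and text%d.png) and overlaid the same way.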
