how to merge segmented m4s or find the init file with ffmpeg - ffmpeg

I have this locked video that I can watch, but looking at the responses I see that the video is split into 3 .m4s segments.
The video seems to be hosted on Vimeo, but I can't figure out where to find the init.mp4, nor do I know how to merge the segments and turn them into an mp4 with ffmpeg.
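A rough sketch of the usual approach, assuming the segments come from a DASH-style stream: the URL of the init segment is normally listed in the manifest/playlist the player requests, so look for it in the same responses as the .m4s files. Once you have downloaded everything, concatenate the init segment and the segments in order, then let ffmpeg remux the result into a clean mp4. The filenames below are placeholders:
cat init.mp4 segment-1.m4s segment-2.m4s segment-3.m4s > joined.mp4
ffmpeg -i joined.mp4 -c copy output.mp4
If audio and video are delivered as separate segment sets, repeat this for each and mux them together with ffmpeg -i video.mp4 -i audio.mp4 -c copy output.mp4.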

Related

FFMPEG images sequence normal and reverse to single video

I have a bunch of images
img01.jpg
img02.jpg
img03.jpg .... (up to img_45.jpg)
I would like to produce a video that plays the images in this order:
img01.jpg
img02.jpg
img03.jpg
img02.jpg
img01.jpg
Possible?
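One possible approach, as a sketch and assuming the files really follow a two-digit pattern img01.jpg through img45.jpg: decode the sequence once, split it, reverse one copy, and concatenate the two with filters (this simple version shows the last frame twice at the turnaround; the framerate of 12 is just an example value):
ffmpeg -framerate 12 -start_number 1 -i img%02d.jpg -filter_complex "[0:v]split[fwd][tmp];[tmp]reverse[rev];[fwd][rev]concat=n=2:v=1[v]" -map "[v]" -pix_fmt yuv420p pingpong.mp4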

Scalable solution for converting an image sequence to video

We are developing a stop motion app for kids and schools.
So what we have is:
A sequence of images and audio files (no overlapping audio in v1, but there can be gaps between them)
What we need to do:
Combine the images to a video with a frame rate between 1-12 fps
Add multiple audio files at given start times
Encode with H.265 to mp4 format
I would really like to avoid maintaining a VM or Azure batch jobs running ffmpeg jobs if possible.
Are there any good frameworks or third-party APIs?
I have only found Transloadit as the closest match, but they don't have the option to add multiple audio files.
Any suggestions or experience in this area is very appreciated.
You've mentioned FFmpeg in your tags, and it is a tool that checks all the boxes.
For the first part of your project (making a video from images) you should check this link. To sum up, you'll use this kind of command:
ffmpeg -r 12 -f image2 -i PATH_TO_FOLDER/frame%d.png PATH_TO_FOLDER/output.mp4
-r 12 being your framerate, here 12. You control the output format with the file extension. To control the video codec, check out this link; you'll need to use the option -c:v libx265 before the filename of your output.
With FFmpeg you add audio the same way you add video, with the -i option followed by your filename. If you want to cut audio, you can seek within it using the -ss and -t options. If you want an audio file to start at a certain point, check out -itsoffset; you can find a lot of examples.
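To place several audio files at different start times in one output, here is a sketch using the adelay and amix filters (delays are in milliseconds; the filenames, offsets and the stereo assumption are made up for the example):
ffmpeg -framerate 12 -i PATH_TO_FOLDER/frame%d.png -i voice1.mp3 -i voice2.mp3 -filter_complex "[1:a]adelay=2000|2000[a1];[2:a]adelay=9000|9000[a2];[a1][a2]amix=inputs=2[a]" -map 0:v -map "[a]" -c:v libx265 -pix_fmt yuv420p output.mp4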

Stitching 6 video files into one 360 video in Ubuntu Linux

I used 2 Raspberry Pis to record 2 different videos in sync using an OSC server, so the videos are perfectly synchronized. I send them to a Linux server so I can stitch them and produce one 360 video file. So far I have tried doing that on 2 videos, just for testing purposes, using two methods:
1- only ffmpeg:
By simply concatenating the two videos into one, but this doesn't produce a 360 video, as seen here.
2- using ffmpeg and hugin following this tutorial (https://medium.com/#xorgol/stitching-multi-camera-360-video-an-open-source-workflow-bb8b1e72925):
The problem here is that I needed to apply this method to each video file I have and then concatenate both videos to produce this result.
The original video can be seen here; it was captured using the RPi Camera Module V2 with a lens, at a resolution of 3280x2464.
I don't mind a bit of overlapping or anything; I just need to produce a 360 video from 2 videos recorded with two cameras with 180-degree FOV, and to be able to view it in a 360 video player, so your help is appreciated.
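If optical stitching with hugin turns out to be unnecessary, a rough, untested sketch using ffmpeg's v360 filter (available in ffmpeg 4.3+) can map two side-by-side views to an equirectangular frame, assuming the lenses produce roughly fisheye images; the filenames and FOV values are assumptions, and it won't blend the overlap the way hugin does:
ffmpeg -i cam_left.mp4 -i cam_right.mp4 -filter_complex "[0:v][1:v]hstack=inputs=2[sbs];[sbs]v360=input=dfisheye:output=equirect:ih_fov=180:iv_fov=180[v]" -map "[v]" -c:v libx264 equirect_360.mp4
Most 360 players also expect spatial metadata to be injected into the file (for example with Google's spatial-media tool) before they treat it as a 360 video.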

ffmpeg picture slideshow with audio

I'm trying to make a batch of videos for uploading to YouTube. My emphasis is on the audio (mostly MP3, with some WMA). I have several image files that need to be picked at random to go with the audio, i.e. display an image for 5 seconds before showing the next. I want the video to stop when the audio stream ends. How should I use ffmpeg to achieve this?
Ref:
http://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
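Along the lines of that wiki page, a sketch using the concat demuxer: generate a list file (images.txt here) with your randomly picked images and make it at least as long as the audio; -shortest then ends the video when the audio ends. Filenames are placeholders.
images.txt:
file 'img1.jpg'
duration 5
file 'img2.jpg'
duration 5
ffmpeg -f concat -safe 0 -i images.txt -i audio.mp3 -c:v libx264 -pix_fmt yuv420p -shortest out.mp4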

Overlaying video with ffmpeg

I'm attempting to write a script that will merge 2 separate video files into 1 wider one, in which both videos play back simultaneously. I have it mostly figured out, but when I view the final output, the video that I'm overlaying is extremely slow.
Here's what I'm doing:
Expand the left video to the final video dimensions
ffmpeg -i left.avi -vf "pad=640:240:0:0:black" left_wide.avi
Overlay the right video on top of the left one
ffmpeg -i left_wide.avi -vf "movie=right.avi [mv]; [in][mv] overlay=320:0" combined_video.avi
In the resulting video, the playback on the right video is about half the speed of the left video. Any idea how I can get these files to sync up?
Like the user 65Fbef05 said, both videos must have the same framerate.
Use the -r option to set the framerate; it must be the same in both videos.
To find the framerate use:
ffmpeg -i video1
ffmpeg -i video2
and look for the line which contains "Stream #0.0: Video:"
on that line you'll find the fps of the movie.
Also, I don't know what problems you'll encounter by mixing 2 audio tracks.
For my part, I would use the audio from the movie being overlaid and discard the rest.
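As a sketch, the same idea in a single command, normalizing both inputs to one framerate (25 here is just a placeholder) and, as suggested, keeping only the audio of the overlaid right.avi:
ffmpeg -i left.avi -i right.avi -filter_complex "[0:v]fps=25,pad=640:240:0:0:black[l];[1:v]fps=25[r];[l][r]overlay=320:0[v]" -map "[v]" -map 1:a? combined_video.avi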
