Streaming video playlist from collection of identical mp4 files - ffmpeg

I am looking for a way to play/stream a list of mp4 files (same size, bitrate, etc.) to a browser video tag without hiccups between the files. I am hoping the following approach would work:
* convert the mp4 files to m4s/m4v files
* generate an MPEG-DASH MPD file (XML)
* stream the MPD to a DASH player in the browser
Is this in any way possible? I am aware the m4s/m4v files need special headers and that an entry file must be made somehow, and that is my roadblock.
The bottom line is that I want to avoid concatenating the separate videos into one big video file, and also avoid the hiccups you see when sequencing the files via a straightforward 'ended' event handler in JS.
Any suggestion much appreciated!

If you want a basic client-side solution, you can use two separate players or video tags in your web page, showing one and hiding the other.
The one that is visible plays the current video.
The other player loads, starts, and immediately pauses the next video.
When the first video ends, you hide that player and make the other one visible, un-pausing the video at the same time.
You then preload the next video into the original player and continue.
This technique is used successfully in some sites where ad breaks are mixed with the main video, as an example.
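If you do want to pursue the DASH route sketched in the question, ffmpeg's dash muxer can produce both the segments and the MPD in one pass. A minimal sketch, assuming a reasonably recent ffmpeg build and H.264/AAC input that can be stream-copied (file names are just examples, and option names have shifted a little between versions):

    # one mp4 in, fragmented segments plus manifest out
    ffmpeg -i input.mp4 -c copy -f dash \
        -seg_duration 4 -use_template 1 -use_timeline 1 \
        manifest.mpd

The manifest can then be fed to a browser-side DASH player such as dash.js. Note this packages a single file; chaining several files into one seamless presentation still needs either a multi-period MPD stitched from the per-file manifests or a concat step before packaging, which is roughly where the 'entry file' mentioned above comes in.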

Related

Is it possible to set up ffmpeg as a repeater?

I am using the PyLivestream library to stream files to YouTube. The problem is that once each video finishes, the screen goes down for a second until the next video starts, because the library simply builds an ffmpeg command and runs it directly in a subprocess for each media file.
Is it possible to configure an instance of ffmpeg that is always streaming to the destination? It could just be a blank screen or an image when nothing is playing. It would also have an input, so I can point PyLivestream at the repeater.
This way the repeater would create one long, uninterrupted stream, but I could still use PyLivestream to stream the individual files.
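One hedged alternative that avoids a separate repeater process: if the files can be listed up front, a single long-running ffmpeg can play them back-to-back from a concat playlist, so the connection to YouTube is never torn down between files. A rough sketch, assuming H.264/AAC-friendly inputs and a placeholder stream key:

    # playlist.txt contains one line per file, e.g.
    #   file 'video1.mp4'
    #   file 'video2.mp4'
    ffmpeg -re -f concat -safe 0 -i playlist.txt \
        -c:v libx264 -preset veryfast -c:a aac -ar 44100 \
        -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY

If the list has to change while the stream is live, a real relay (for example a local RTMP server that PyLivestream publishes to, with one persistent ffmpeg forwarding to YouTube) is the more flexible route, but the single-process approach above is the simplest way to remove the per-file gap.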

Set specific frame as thumbnail for video?

I just want some confirmation, because I have the sneaking suspicion that I won't be able to do what I want to do, given that I already ran into some errors about ffmpeg not being able to overwrite the input file. I still have some hope that what I want to do is some kind of exception, but I doubt it.
I have already used ffmpeg to extract a specific frame into its own image file, and I have set the thumbnail of a video with an existing image file, but I can't seem to figure out how to set a specific frame from the video as the thumbnail. I want to do this without having to extract the frame into a separate file, and I don't want to create an output file; I want to edit the video directly and change the thumbnail using a frame from the video itself. Is that possible?
You're probably better off asking in IRC on freenode, #ffmpeg-devel.
I'd look at "-ss 33.5", or a more precise filter, "-vf 'select=gte(n,1000)'"; both will give the same or a very similar result for a 30 fps video.
You can of course pipe the image out to your own process without saving it: "ffmpeg ... -f image2pipe -c:v mjpeg - | ..."
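For completeness, a sketch of the usual two-step route; ffmpeg always writes a new output file, so editing the mp4 in place, as the question hopes, is not something it supports (file names and the timestamp are just examples):

    # 1. grab the frame near 33.5 s as an image
    ffmpeg -ss 33.5 -i input.mp4 -frames:v 1 thumb.jpg
    # 2. attach it as cover art while stream-copying everything else
    ffmpeg -i input.mp4 -i thumb.jpg -map 0 -map 1 -c copy \
        -disposition:v:1 attached_pic output.mp4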

How can I overlay an image onto a video

How can I overlay an image onto a video without changing the video file?
I have many videos and I want to be able to open them, overlay a ruler onto them, and then visually measure the distance an individual moved. All I want is to play a video, open up an image with some transparency, and position the image over the video. This way I would be able to look at the video and see how far the individual moved.
I would like to do this without having to embed the image like a watermark, because that is computationally expensive: I would need to copy the video, embed it with the ruler, watch the video, and then delete that video file. This seems unnecessary. I would like to just watch the video and have a transparent image over it while I am watching.
Is there a program that does this all together?
Alternatively, is there a program which I can use to open an image and make it transparent and then move it over the video that is playing?
Note: I am using Windows.
It sounds from your requirements that simply overlaying a separate image layer over the video will meet your needs.
Implementing this approach will depend on the video player client you are using, but you could implement an HTML5 based solution and play the videos locally with this (or even from a URL on the web if you have them there).
There is a nice answer with a working fiddle which shows how to do this with HTML5 here: https://stackoverflow.com/a/31175193/334402
One thing to note - you have not mentioned scale in your question. If you need to measure how far the person has moved in real distance, rather than just in centimetres across the video screen, then you will need to somehow work out the scale of the video. This makes things considerably harder, as the video may zoom in and out during the sequence you want to measure, so you would need some reference to calculate the scale for each frame. One approach would be to use the individual as a reference, assuming they appear in all the frames you are interested in.
What about using good old VLC for that?
Open VLC, go to Tools→Effects and Filters→Video Effects→Overlay, and tick the Add logo checkbox.
Then add your transparent overlay image and play any video with VLC; the image will be drawn on top of the video.
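If a command-line preview is acceptable, ffplay (which ships with ffmpeg) can do the same thing with the overlay filter. A rough sketch, assuming a transparent PNG named ruler.png; the file name and the 10:10 offset are just examples, and filter syntax can vary slightly between versions:

    # draw ruler.png 10 px from the top-left corner while the video plays
    ffplay -vf "movie=ruler.png[logo];[in][logo]overlay=10:10" video.mp4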

JPEG to Video stream

We are getting images from a third party and want software that compresses the images and streams them. We are wondering if anyone knows of any software/API that does this.
I saw these online but am unsure whether they are what I want.
http://www.aforgenet.com/framework/
http://splicer.codeplex.com/
Again, we are getting images from a third party, and we want to stream these images as a video feed on a website (we don't want to display them as individual images).
avifile wrapper should be one of the best options.
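If running ffmpeg on the server is an option, another approach is to pipe the incoming JPEGs straight into ffmpeg and publish the result as a live stream, which the website can then play like any other video feed. A rough sketch, assuming the images arrive on stdin (your_image_source stands in for whatever process emits them) and that you have an RTMP or similar endpoint to publish to; the URL is a placeholder:

    # read a stream of JPEGs from stdin, encode to H.264, publish over RTMP
    your_image_source | ffmpeg -f image2pipe -framerate 5 -c:v mjpeg -i - \
        -c:v libx264 -pix_fmt yuv420p \
        -f flv rtmp://example.com/live/stream

Reading numbered files from a folder instead (-f image2 -i img%04d.jpg) works the same way.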

Is there a open-source video codec which can "play" .exe files?

Well, I would like to utilize Windows Media Player to run .exe applications in its video window. The application would be, for example, a full-screen DirectX or OpenGL application, which you can execute on the OS.
I would like to know if there is such a codec so that I can tweak it for my needs. Or maybe there is one which has very good tweaking abilities but is not (fully) open source?
(I am asking this question because of this question: https://superuser.com/questions/533730/how-to-run-an-directx-or-opengl-application-as-desktop-background)
This is probably the weirdest request I've read in a long time. First the bad news: no, there's no open-source codec to play the output generated by a ".exe" in the video window of Windows Media Player. ".exe"s, or more accurately PE files (Portable Executables), contain program code, i.e. data that is interpreted as a program by your CPU. Videos, however, are not programs but image data.
A video codec is a program that translates video data between formats. For example it can decode compressed h.264 into raw RGB data suitable for displaying. There are certain constraints on video codecs, for example that they decode a sequence of frames.
Now the good news: technically it is possible to write such a codec. It won't be possible to open a .exe with WMP directly, though, as those can't be interpreted by WMP. But you could introduce a new FOURCC, a 4-character code identifying a particular video encoding format, and register a special-purpose codec with that FOURCC. Then you create a special AVI file using that FOURCC, containing a reference to your target .exe instead of video data in the frames. When WMP tries to play this file it will launch this "codec", which in turn can launch the .exe. You need to establish a communication protocol between the launched application and the "codec". An off-screen rendering surface must be created, and I'd say a PBuffer DC shared between the processes serves this best, i.e. the "codec" creates the PBuffer and the .exe creates an OpenGL context on top of it. Then the codec passes the contents of the PBuffer as the decoded video frames to WMP.
So yes, such a hack can be done. But it'd be ugly and weird.
Why not simply write a visualization plugin for WMP? Those run in the video window as well, and they don't require such an ugly hack.
Simple answer: NO.
Complex answer: your title makes zero sense, because further down you do not talk about playing an exe file but about trying to intercept "all sorts of APIs" and magically transform them into a video.
