I have been able to stream video, but what I would like to do is use cvLine to draw a line on the video as it is being recorded. Is there any way to do this? cvLine seems to only work on static images. Any help would be greatly appreciated.
You will have to capture each frame, draw whatever you like on it, perform any additional processing you need, and save the resulting processed frame.
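That loop is only a few lines of code. Here is a minimal pure-Python sketch of the pattern; the synthetic frame source, the `draw_line` helper, and the `processed` list are stand-ins for `cv2.VideoCapture`, `cv2.line`, and `cv2.VideoWriter.write` (kept dependency-free here so the shape of the loop is clear):

```python
# Sketch of the capture -> draw -> save loop. Frames here are synthetic
# 2D grayscale buffers; with OpenCV you would capture with cv2.VideoCapture,
# draw with cv2.line, and save with cv2.VideoWriter.

def draw_line(frame, y, x0, x1, value=255):
    """Draw a horizontal line into a frame (list of rows) in place."""
    for x in range(x0, x1 + 1):
        frame[y][x] = value
    return frame

def capture_frames(n, height, width):
    """Stand-in for a camera: yields blank frames."""
    for _ in range(n):
        yield [[0] * width for _ in range(height)]

processed = []
for frame in capture_frames(3, 4, 8):
    draw_line(frame, y=2, x0=1, x1=6)   # overlay drawn on the live frame
    processed.append(frame)             # stand-in for VideoWriter.write()

print(len(processed), processed[0][2][1], processed[0][0][0])  # 3 255 0
```

The key point is that the drawing happens on each decoded frame before it is written back out, so the overlay ends up baked into the recording.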
I just want some confirmation, because I have a sneaking suspicion that I won't be able to do what I want, given that I already ran into errors about ffmpeg not being able to overwrite the input file. I still have some hope that what I want to do is some kind of exception, but I doubt it.
I have already used ffmpeg to extract a specific frame into its own image file, and I have set the thumbnail of a video from an existing image file, but I can't figure out how to set a specific frame from the video as its thumbnail. I want to do this without extracting the frame into a separate file, and I don't want to create an output file; I want to edit the video directly and change the thumbnail using a frame from the video itself. Is that possible?
You're probably better off asking in IRC: #ffmpeg-devel on freenode.
I'd look at "-ss 33.5", or the more precise filter "-vf 'select=gte(n,1000)'"; both will give the same or a very similar result for a 30 fps video.
Of course, you can also pipe the image out to your own process without saving it: "ffmpeg ... -f image2pipe -c:v mjpeg - | ...".
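Putting the two suggestions together, here is a sketch in Python that only assembles the argument lists (run them with subprocess when ffmpeg is actually installed; "-frames:v 1" is the standard flag to stop after a single frame and is an addition not quoted in the answer above):

```python
# Sketch: build the two equivalent ffmpeg invocations from the answer.
# Only command assembly runs here; pass a list to subprocess.run to execute.

def seek_cmd(src, seconds, out):
    """Grab one frame by timestamp with -ss."""
    return ["ffmpeg", "-ss", str(seconds), "-i", src,
            "-frames:v", "1", out]

def select_cmd(src, frame_no, out):
    """Grab one frame by index with the select filter."""
    return ["ffmpeg", "-i", src,
            "-vf", f"select=gte(n,{frame_no})",
            "-frames:v", "1", out]

print(" ".join(seek_cmd("in.mp4", 33.5, "frame.jpg")))
```

Note that when the arguments are passed as a list (rather than through a shell), the single quotes around the filter expression are not needed.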
I wish to add a frame to the end of a video just after it has been captured, so I can build a timelapse video as the images are acquired.
So the idea is to take an image and use ffmpeg to build the video by appending each image just after it is acquired.
I've seen many questions about overlaying a logo-type image for a set length of time, or about compiling a whole batch of single images into a video, but not this.
Anyone got a good idea of what to try?
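One approach worth trying: keep a single ffmpeg process open and feed it each image over stdin as it is acquired, using the image2pipe demuxer. A hedged sketch (assumes ffmpeg is installed; only the command assembly runs here, and acquire_images is a hypothetical placeholder for your acquisition loop):

```python
# Sketch: stream acquired images into one long-running ffmpeg process.
import subprocess

def timelapse_cmd(out, fps=2):
    """Assemble the ffmpeg command reading piped images from stdin."""
    return ["ffmpeg", "-y",
            "-f", "image2pipe", "-framerate", str(fps),
            "-i", "-",                        # read images from stdin
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out]

def start_timelapse(out, fps=2):
    """Launch ffmpeg; write JPEG bytes to proc.stdin per acquired image."""
    return subprocess.Popen(timelapse_cmd(out, fps), stdin=subprocess.PIPE)

# Usage (with ffmpeg available):
#   proc = start_timelapse("timelapse.mp4")
#   for jpeg_bytes in acquire_images():   # hypothetical acquisition loop
#       proc.stdin.write(jpeg_bytes)
#   proc.stdin.close(); proc.wait()
print(timelapse_cmd("timelapse.mp4")[0])  # ffmpeg
```

This avoids re-encoding the whole video after every new image, since ffmpeg only sees a growing stream of inputs.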
I want to draw video overlays such as lines, circles, and rectangles, and to print text, on an AVFrame using ffmpeg's library functions, not the command-line utility.
Does anyone know how to do this?
Do you have the source code for this function?
I asked this same question when I started working on video streaming issues.
Since an AVFrame holds decoded video data, adding overlays to the video data means transcoding it. Instead, I add overlays while rendering: I used the SDL library to render AVFrames and drew the overlays at that stage.
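If you do want to draw into the decoded data itself, an AVFrame exposes its luma plane as data[0] with stride linesize[0], and you can write pixels directly at y * linesize + x. Here is a minimal pure-Python stand-in for that buffer layout (in C you would do the same arithmetic on frame->data[0]; 235 is the maximum limited-range luma value):

```python
# Sketch of drawing directly into a Y (luma) plane, addressed the way an
# AVFrame exposes it: a flat buffer indexed as y * linesize + x.

def draw_rect(plane, linesize, x0, y0, w, h, value=235):
    """Draw a white rectangle outline into a Y plane in place."""
    for x in range(x0, x0 + w):
        plane[y0 * linesize + x] = value            # top edge
        plane[(y0 + h - 1) * linesize + x] = value  # bottom edge
    for y in range(y0, y0 + h):
        plane[y * linesize + x0] = value            # left edge
        plane[y * linesize + x0 + w - 1] = value    # right edge

linesize, height = 16, 8
plane = bytearray(linesize * height)  # black frame (Y = 0)
draw_rect(plane, linesize, x0=2, y0=1, w=6, h=4)
print(plane[1 * linesize + 2], plane[0])  # 235 0
```

For a color overlay you would also touch the subsampled chroma planes (data[1] and data[2]), whose coordinates are halved for the common yuv420p format.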
I am using the ani = anim.FuncAnimation(fig, animate, interval=1000, frames=20) syntax with an animate function, and I then want to save the animation to a file for later viewing. I have tried saving with ani.save('plan_evap.mp4', fps=1), but this saving step seems to be the bottleneck of my whole workflow. The resulting mp4 movie, which is obviously problematic, can be downloaded here (http://s000.tinyupload.com/?file_id=00603708182366713933) for viewing. I would appreciate any help people can offer to resolve this.
I am working on an application and I have a problem I just can't seem to find a solution for. The application is written in VC++. What I need to do is display a YUV video feed with text on top of it.
Right now it works correctly by drawing the text in the OnPaint method using GDI, with the video on a DirectDraw overlay. I need to get rid of the overlay because it causes too many problems: it won't work on some video cards, on Vista, on 7, etc.
I can't figure out a way to accomplish the same thing in a more compatible manner. I can draw the video using DirectDraw with a back buffer and copy it to the primary buffer just fine. The issue is that the text drawn with GDI flickers because of how often the video is refreshed. I would really like to keep the text-drawing code intact if possible, since it works well.
Is there a way to draw the text directly to a DirectDraw buffer or a memory buffer and then blt it to the back buffer? Or should I be looking at another method altogether? The two important OSes are XP and 7. If anyone has any ideas, let me know and I will test them out. Thanks.
Try to look into DirectShow and the Ticker sample on microsoft.com:
DirectShow Ticker sample
This sample uses the Video Mixing Renderer to blend video and text. It uses the IVMRMixerBitmap9 interface to blend text onto the bottom portion of the video window.
DirectShow is for building filter graphs that play back audio or video streams, with different filters added for effects and for manipulating video and audio samples.
Instead of using DirectShow's Video Mixing Renderer, you can also use the ISampleGrabber interface. The advantage is that it is a filter which can be used with other renderers as well, for example when the video is not shown on screen but streamed over the network or dumped to a file.