I have a camera with a "normal" RGB sensor and an IR sensor (for Windows Hello).
I can open and access the RGB stream (and show it in my app), but how do I access the IR stream?
I am working in C/C++/Win32 but even something in C#/.NET would be useful.
Thanks,
Cyrille
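For reference, on Windows 10 the IR (Windows Hello) sensor is normally exposed as a separate "sensor camera" device that Media Foundation can enumerate alongside the regular RGB capture device. The C++ sketch below only shows that enumeration step; whether your particular camera's IR stream shows up this way depends on the driver, so treat it as a starting point rather than a definitive answer.

#include <windows.h>
#include <initguid.h>
#include <mfapi.h>
#include <mfidl.h>
#include <ks.h>
#include <ksmedia.h>     // KSCATEGORY_SENSOR_CAMERA (Windows 10 SDK)
#include <stdio.h>
#pragma comment(lib, "mfplat.lib")
#pragma comment(lib, "mf.lib")
#pragma comment(lib, "mfuuid.lib")
#pragma comment(lib, "ole32.lib")

int main()
{
    CoInitializeEx(NULL, COINIT_MULTITHREADED);
    MFStartup(MF_VERSION);

    // Ask for video capture devices, but restricted to the sensor-camera
    // category that IR cameras are registered under.
    IMFAttributes* pAttr = NULL;
    MFCreateAttributes(&pAttr, 2);
    pAttr->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
                   MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
    pAttr->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_CATEGORY,
                   KSCATEGORY_SENSOR_CAMERA);

    IMFActivate** ppDevices = NULL;
    UINT32 count = 0;
    if (SUCCEEDED(MFEnumDeviceSources(pAttr, &ppDevices, &count)))
    {
        for (UINT32 i = 0; i < count; ++i)
        {
            WCHAR* name = NULL;
            UINT32 len = 0;
            ppDevices[i]->GetAllocatedString(
                MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME, &name, &len);
            wprintf(L"IR/sensor camera %u: %s\n", i, name ? name : L"(unnamed)");
            CoTaskMemFree(name);

            // Calling ActivateObject(IID_IMFMediaSource, ...) on one of these
            // entries gives an IMFMediaSource whose streams can be read with
            // an IMFSourceReader, just like the RGB camera.
            ppDevices[i]->Release();
        }
        CoTaskMemFree(ppDevices);
    }

    pAttr->Release();
    MFShutdown();
    CoUninitialize();
    return 0;
}

On the C#/.NET side, the WinRT MediaCapture and MediaFrameSourceGroup APIs cover the same ground: you pick the frame source whose SourceKind reports Infrared.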
Related
For my M100 I want to control several external devices, such as a laser pointer or a thermal camera. I think that for full control I have to integrate them via the OSDK. I have looked through the DJI SDK documentation, but I have no idea where to start and what I actually have to do. Could somebody give me a hint on how to start and what is really necessary to fulfill the requirements described below?
Connect a FLIR Boson (3.3 V) and display the thermal image on the remote controller's mobile device.
Control the thermal camera (switch it on/off) from the remote controller.
Connect a laser pointer to a standard port of the M100 and switch it on/off with a button on the remote controller.
I am working on a WPF scoreboard project and have created my dashboard. I am capturing my WPF window with the chroma key technique in OBS (Open Broadcaster Software), and I can stream it over the internet, e.g. to YouTube.
I would like to use the same WPF project as a TV source. That is, I would like to capture my WPF window and send it to a TV source over the PC's HDMI output with an alpha channel.
Is this possible without using any extra hardware? If it is, where should I start learning?
I am trying to take 15 Megapixel images with a Logitech c920.
The camera supports full-HD (2.1 MP) video, and 15MP static images.
Using the supplied Logitech Software, I can indeed save 15MP images. But if I access the camera directly via DirectShow, I can at most get the full-HD resolution. It seems like the camera is stuck in "video" mode.
I tried various examples for accessing webcams, namely RobotEyez, VLC, NirSoft WebCamImageSave.
I am beginning to suspect that the default interfaces for webcams under Windows are only designed for video, and that there is no official way to tell the camera to take still images at a higher resolution than the video stream.
How can I access such a device under Windows in order to capture the full 15 MP resolution supported for still images?
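For what it's worth, DirectShow does define a dedicated pin category for stills (PIN_CATEGORY_STILL), and a webcam driver that supports higher-resolution snapshots through DirectShow would expose such a pin next to the capture/preview pins. The sketch below (assuming you already have the c920's capture filter, here called pCaptureFilter) merely checks for that pin; if FindPin fails, the driver does not offer still capture through DirectShow at all and the 15 MP mode is presumably only reachable through Logitech's own software path.

#include <dshow.h>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "ole32.lib")

// Checks whether the webcam's DirectShow filter exposes a still-image pin.
// pCaptureFilter is the already-created capture filter for the camera.
HRESULT FindStillPin(IBaseFilter* pCaptureFilter, IPin** ppStillPin)
{
    ICaptureGraphBuilder2* pBuilder = NULL;
    HRESULT hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
                                  CLSCTX_INPROC_SERVER,
                                  IID_ICaptureGraphBuilder2, (void**)&pBuilder);
    if (FAILED(hr))
        return hr;

    // Look for an output pin in the STILL category; S_OK means the driver
    // offers stills, a failure code means it is video-only under DirectShow.
    hr = pBuilder->FindPin(pCaptureFilter, PINDIR_OUTPUT, &PIN_CATEGORY_STILL,
                           &MEDIATYPE_Video, FALSE, 0, ppStillPin);
    pBuilder->Release();
    return hr;
}

If a still pin does exist, its supported resolutions can be enumerated with IAMStreamConfig, and a snapshot can be triggered through IAMVideoControl (VideoControlFlag_Trigger); if it does not, that would confirm the suspicion that the 15 MP mode is not reachable through the standard video interfaces.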
I am looking for different ways to capture a video stream from the monitor screen and send it to a video streaming server for broadcasting on the web. It has to happen live.
I'd prefer not to use external services like Procaster for the broadcast.
OS: Windows.
It would be great to hear the ideas and experience people have for accomplishing this.
Thanks all.
Recently I built a Go project called ScreenStreamer. It is a tool to stream the currently active window or the full screen (on Linux or Windows) to another device, such as a phone or another PC, as MJPEG over HTTP or FLV over RTMP. It is very close to real time (delay < 100 ms) and works on Windows and Linux.
After building it, you can run it as:
# enter the project root directory
cd ./src/ScreenStreamer
# run it
./mjpeg or .\mjpeg.exe
# use a web browser or other video player, open http://host:port/mjpeg
./rtmp or .\rtmp.exe
# use a video player, open rtmp://host:port/live/screen
The Windows SDK includes the Push Source Filters sample, which in turn contains the CPushSourceDesktop filter/class.
CPushSourceDesktop: Copy of current desktop image (GDI only)
It captures the desktop image and pushes it into a DirectShow pipeline. From there you can process it with a video compression codec and stream it to a remote location. A decent screen-image compression codec is included with the Windows Media subsystem; the network streaming will have to be a custom or third-party component. Alternatively, it is possible to make the capture class a virtual camera and have Windows Media Encoder broadcast it (Windows Media Encoder already has a similar screen-capture feature built in).
Another option is to check the VNC source code (or one of its clones) to see how it hooks windows and captures image updates, then compresses them and makes them available to remote applications.
Note that you will have to capture non-GDI images separately (such as those coming from video/gaming applications, which use hardware acceleration and non-RGB surfaces).
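To illustrate the GDI part of that approach, here is a minimal sketch of the desktop grab itself, essentially what CPushSourceDesktop does before handing the frame to the DirectShow pipeline; compressing and streaming the resulting bitmap is a separate step:

#include <windows.h>
#pragma comment(lib, "gdi32.lib")
#pragma comment(lib, "user32.lib")

// Grab one frame of the primary monitor with plain GDI.
// The caller owns the returned HBITMAP and must DeleteObject() it.
HBITMAP CaptureDesktopFrame()
{
    HDC hdcScreen = GetDC(NULL);                    // DC for the whole screen
    int width  = GetSystemMetrics(SM_CXSCREEN);
    int height = GetSystemMetrics(SM_CYSCREEN);

    HDC hdcMem = CreateCompatibleDC(hdcScreen);
    HBITMAP hBitmap = CreateCompatibleBitmap(hdcScreen, width, height);
    HGDIOBJ hOld = SelectObject(hdcMem, hBitmap);

    // Copy the current desktop contents into our bitmap;
    // CAPTUREBLT also picks up layered (transparent) windows.
    BitBlt(hdcMem, 0, 0, width, height, hdcScreen, 0, 0, SRCCOPY | CAPTUREBLT);

    SelectObject(hdcMem, hOld);
    DeleteDC(hdcMem);
    ReleaseDC(NULL, hdcScreen);
    return hBitmap;   // hand the pixel data to FillBuffer / the encoder
}

In a push source filter this would run once per frame inside FillBuffer(), with GetDIBits() copying the pixels into the outgoing media sample.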
What's the best and easiest way to play an incoming live video stream in a C++ Windows application (Visual Studio 2010) and write some notes (e.g. "this is a blue ball") on the stream display? ActiveX? DirectX? Flash?
I have Windows SDK 7.1 installed. Do I need to install any other software?
Appreciate any pointers.
In the simplest case, you can do everything you ask with just DirectShow. There is the DirectShow.NET managed library that wraps it for you if you prefer .NET.
So, try to find an example that just gets video from a capture device to the renderer. Then insert a SampleGrabber filter between those two and modify the frame data as needed. I use this technique to draw a timestamp on the recorded video in my recorder, and I even draw it with simple GDI+ calls.
One thing to consider: you'll have to watch out for the picture format. Some webcams have YUY2 as the default, or even the only, format. You'll want the RGB24 format so that you can wrap a Bitmap and then a Graphics object around the frame buffer, as in the sketch below.
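To make the SampleGrabber step concrete, here is a rough C++ sketch of a BufferCB callback that draws a caption onto each RGB24 frame with GDI+. It assumes the graph is already built, the SampleGrabber's media type is forced to RGB24, SetCallback(&cb, 1) selected buffer callbacks, GDI+ was initialized with GdiplusStartup, and frameWidth/frameHeight are placeholders you would fill in from the connected media type:

#include <dshow.h>
#include <qedit.h>       // ISampleGrabberCB (deprecated header, still in SDK 7.1)
#include <gdiplus.h>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "gdiplus.lib")

// Receives every RGB24 frame from the SampleGrabber and draws a caption on it.
class CaptionCB : public ISampleGrabberCB
{
public:
    long frameWidth, frameHeight;    // fill in from the connected AM_MEDIA_TYPE
    CaptionCB() : frameWidth(0), frameHeight(0) {}

    // Minimal IUnknown for an object whose lifetime you manage yourself.
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == IID_IUnknown || riid == IID_ISampleGrabberCB) { *ppv = this; return S_OK; }
        *ppv = NULL;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return 2; }
    STDMETHODIMP_(ULONG) Release() { return 1; }

    // Called for every frame when SetCallback(&cb, 1) was used on the grabber.
    STDMETHODIMP BufferCB(double /*sampleTime*/, BYTE* pBuffer, long /*bufferLen*/)
    {
        // RGB24 rows are DWORD-aligned and the DIB is bottom-up, so text drawn
        // at (10,10) lands near the bottom and vertically mirrored unless you
        // compensate for the flip.
        int stride = (frameWidth * 3 + 3) & ~3;
        Gdiplus::Bitmap frame(frameWidth, frameHeight, stride,
                              PixelFormat24bppRGB, pBuffer);
        Gdiplus::Graphics g(&frame);
        Gdiplus::Font font(L"Arial", 16.0f);
        Gdiplus::SolidBrush brush(Gdiplus::Color(255, 255, 255, 0));
        g.DrawString(L"this is a blue ball", -1, &font,
                     Gdiplus::PointF(10.0f, 10.0f), &brush);
        return S_OK;
    }
    STDMETHODIMP SampleCB(double, IMediaSample*) { return E_NOTIMPL; }
};

In C# with DirectShow.NET the same ISampleGrabberCB shape is available, and the drawing can be done with System.Drawing.Graphics on a Bitmap wrapped around the buffer.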