Capturing 15 MP images from a Logitech C920 webcam - Windows

I am trying to take 15-megapixel images with a Logitech C920.
The camera supports full-HD (2.1 MP) video and 15 MP still images.
Using the supplied Logitech software, I can indeed save 15 MP images. But if I access the camera directly via DirectShow, I can at most get the full-HD resolution. It seems like the camera is stuck in "video" mode.
I tried various tools for accessing webcams, namely RobotEyez, VLC, and NirSoft WebCamImageSave.
I am beginning to suspect that the default interfaces for webcams under Windows are only designed for video, and that there is no official way to tell the camera to take still images at a higher resolution than the video stream.
How can I access such a device under Windows in order to capture the full 15 MP resolution supported for still images?
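For reference, DirectShow models still capture as a separate pin category (PIN_CATEGORY_STILL) on the capture filter, so whether the 15 MP mode is reachable depends on the driver exposing such a pin. Below is a minimal sketch of checking for it; error handling and COM cleanup are mostly omitted, and the first video device found is simply assumed to be the C920.

// Minimal sketch: look for a dedicated still-image pin (PIN_CATEGORY_STILL)
// on the first video capture device. Error handling is mostly omitted.
#include <dshow.h>
#include <cstdio>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "ole32.lib")

int main()
{
    CoInitialize(NULL);

    // Enumerate video input devices and take the first one (assumed to be the C920).
    ICreateDevEnum *devEnum = NULL;
    CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
                     IID_ICreateDevEnum, (void **)&devEnum);

    IEnumMoniker *enumMoniker = NULL;
    if (devEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &enumMoniker, 0) != S_OK)
        return 1;

    IMoniker *moniker = NULL;
    if (enumMoniker->Next(1, &moniker, NULL) != S_OK)
        return 1;

    IBaseFilter *capFilter = NULL;
    moniker->BindToObject(NULL, NULL, IID_IBaseFilter, (void **)&capFilter);

    // The capture graph builder can locate pins by category.
    ICaptureGraphBuilder2 *builder = NULL;
    CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC_SERVER,
                     IID_ICaptureGraphBuilder2, (void **)&builder);

    IPin *stillPin = NULL;
    HRESULT hr = builder->FindPin(capFilter, PINDIR_OUTPUT, &PIN_CATEGORY_STILL,
                                  NULL, FALSE, 0, &stillPin);
    printf(hr == S_OK ? "Still-image pin found\n"
                      : "No still-image pin exposed by the driver\n");

    // Cleanup omitted for brevity.
    CoUninitialize();
    return 0;
}

If a still pin is present, IAMVideoControl::SetMode with the VideoControlFlag_Trigger flag is the documented way to fire a capture on it; if the driver exposes only the video pin, that would match the behaviour described above.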

Related

getUserMedia video quality

I am working on a tablet (HP) with Windows 8.1. We developed a web application, accessed from the tablet with the Chrome browser, which accesses the tablet's webcam using the getUserMedia API (the implementation is simple, based on JavaScript, similar to the one here for example: https://davidwalsh.name/demo/camera.php).
Our application will be used to take photos of identity cards, and then submit them to a servlet.
The quality of the picture taken inside the browser, using the getUserMedia API, is quite poor, and the letters on the identity cards are sometimes not easily readable in the image.
If I use the "Camera" application from Windows 8.1 on the same tablet, and take pictures of the same identity cards, in the same light conditions and from the same distance, the resulting images (JPEG) are very clear.
Why is there this difference in quality? I have read all about the getUserMedia API, and I tried all the available parameters (constraints, width, height, JPEG quality), but I cannot obtain a good-quality image.
Why does the same camera on the same tablet produce such different quality in the browser than in the Windows camera application, and is there a way to obtain better quality in the browser (for example, by developing a custom plugin)?
To answer your question "Why is there this difference in quality?": in short, it is because the browser emulates the camera feed and does image transformation under the hood so that it can send different streams to different clients. WebRTC's main focus is P2P media streaming, not taking high-quality photos.
You can use ImageCapture to access more camera properties (and to get the frame as an ImageBitmap), but support right now is still very weak.
Please read the answers linked below, which go into more depth on MediaCapture and ImageCapture and their use cases.
Why the difference in native camera resolution -vs- getUserMedia on iPad / iOS?
Take photo when the camera is automatically focused

Mac OS: get full control of web-camera (USB connected)

The task:
OS: Mac OS X 10.9 +
Description:
There is a webcam connected to a Mac via USB. I need to find a way to get access to its brightness, pan, color temperature, focus, etc.
I also need a way to apply image filters to the camera's video stream.
I need to be able to control the camera while it is being used by other programs like Skype, so that I can, for example, transmit a video stream with increased contrast during a Skype video call.
Reference app: https://itunes.apple.com/app/webcam-settings/id533696630?mt=12
Solution:
This is the question.
As far as I understand, I have to find a custom kext (driver) in order to perform all this magic.
Could you please point me in the right direction: libraries, drivers, etc.
You can use the OpenCV library to capture camera frames, apply filters, etc.
http://docs.opencv.org/2.4/doc/tutorials/introduction/display_image/display_image.html
Then you can feed a virtual webcam, which in turn can feed into Skype, etc.
http://download.cnet.com/Virtual-Webcam/3000-2348_4-75754338.html
There are also many open-source virtual webcams available.
I hope this helps.
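For illustration, a minimal OpenCV sketch of the capture-and-filter part (the camera index and the simple contrast adjustment are arbitrary examples; feeding the result into a virtual webcam is a separate step that is not shown):

// Minimal OpenCV sketch: grab frames from the first camera, boost contrast,
// and show the result. A virtual-camera backend would consume "filtered"
// instead of (or in addition to) displaying it.
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);          // first connected camera
    if (!cap.isOpened())
        return 1;

    cv::Mat frame, filtered;
    while (cap.read(frame))
    {
        // Simple contrast/brightness adjustment: out = frame * 1.3 + 10
        frame.convertTo(filtered, -1, 1.3, 10);

        cv::imshow("filtered", filtered);
        if (cv::waitKey(1) == 27)     // Esc to quit
            break;
    }
    return 0;
}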

Capture video from screen for streaming

I am looking for different solutions to capture the video stream from the monitor screen and send it to a video streaming server to broadcast on the web. It must happen "live".
I'd rather not use external services like "Procaster" for broadcasting.
OS: Windows.
It would be great to hear the ideas and experience people have in accomplishing this.
Thanks all.
Recently I built a Go project called ScreenStreamer. It is a tool to stream the currently active window or the full screen (on Linux or Windows) to another device, like a phone or another PC, as MJPEG over HTTP or FLV over RTMP, and it is very close to real time (delay < 100 ms). It works on Windows and Linux.
After building it, you can run it as:
# enter the project root directory
cd ./src/ScreenStreamer
# run it
./mjpeg or .\mjpeg.exe
# use a web browser or other video player, open http://host:port/mjpeg
./rtmp or .\rtmp.exe
# use a video player, open rtmp://host:port/live/screen
The Windows SDK includes the Push Source Filters sample, which in turn contains the CPushSourceDesktop filter/class.
CPushSourceDesktop: Copy of current desktop image (GDI only)
It captures the desktop image and pushes it into a DirectShow pipeline. From there you can process it with a video compression codec and stream it to a remote location. A decent screen-image compression codec is included with the Windows Media subsystem; network streaming will have to be a custom or third-party component. Alternatively, it is possible to make the capture class a virtual camera and have Windows Media Encoder broadcast it (or it already has a similar feature built in).
Alternatively, you can check the VNC source code (or one of its clones) and see how it hooks windows and captures image updates, then compresses them and makes them available to remote applications.
Note that you will have to specifically capture non-GDI images (such as those coming from video/gaming applications, which use hardware acceleration and non-RGB surfaces).
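For reference, a rough sketch of the per-frame GDI capture step that a CPushSourceDesktop-style filter performs; only the BitBlt into a DIB is shown, while wrapping the buffer into DirectShow media samples, timestamping, and compression are not.

// Rough sketch of the GDI capture step performed once per frame: BitBlt the
// whole desktop into a top-down 32-bit DIB section. The caller owns the
// returned HBITMAP and must DeleteObject() it when done.
#include <windows.h>

HBITMAP CaptureDesktopFrame(void **pixels, int *width, int *height)
{
    HDC screenDC = GetDC(NULL);                       // DC for the whole desktop
    *width  = GetSystemMetrics(SM_CXSCREEN);
    *height = GetSystemMetrics(SM_CYSCREEN);

    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = *width;
    bmi.bmiHeader.biHeight      = -(*height);         // negative height = top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    HDC     memDC  = CreateCompatibleDC(screenDC);
    HBITMAP dib    = CreateDIBSection(memDC, &bmi, DIB_RGB_COLORS, pixels, NULL, 0);
    HGDIOBJ oldBmp = SelectObject(memDC, dib);

    // The actual capture: one BitBlt per frame.
    BitBlt(memDC, 0, 0, *width, *height, screenDC, 0, 0, SRCCOPY);

    SelectObject(memDC, oldBmp);
    DeleteDC(memDC);
    ReleaseDC(NULL, screenDC);
    return dib;                                       // *pixels points at the BGRA data
}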

Can I make QTKit Capture support an IP camera?

I've got an IP camera - an Axis M1114 - that I'd like to use as a QTKit Capture device. Only DV cameras and the iSight usually appear on this capture menu. Is there any way of getting the IP camera to appear in this list and work with QTKit Capture?
This might not be possible, but if it is, I'd appreciate any pointers at ways of going about it. Thanks!
I asked Apple engineers the same question at WWDC 2010. The answer was "No, you cannot make your own QTCaptureDevice". To record video from AXIS IP cameras we have developed our own framework. You can get frames from the camera and write them to a QuickTime movie. You can do it with QTKit, but it works better with the 32-bit QuickTime framework.

Windows app live video

What's the best and easiest way to play an incoming live video stream in a C++ Windows application (Visual Studio 2010) and write some notes (e.g. "this is a blue ball") on the stream display? ActiveX? DirectX? Flash?
I have Windows SDK 7.1 installed. Do I need to install any other software?
Appreciate any pointers.
At its simplest, you can do everything you ask with just DirectShow. There is the DirectShow.NET managed library that wraps it for you.
So, try to find an example that just gets video from the capture device to the renderer. Then insert a SampleGrabber filter between those two and modify the frame data accordingly. I am using this technique to draw a timestamp on the recorded video in my recorder; I am even drawing it with simple GDI+ calls.
One thing to consider: you'll have to watch out for the picture format. Some webcams have YUY2 as the default or even the only format. You'll want the RGB24 format to be able to wrap a Bitmap and then Graphics around it.
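As a rough sketch of such a callback, assuming the SampleGrabber is connected with RGB24 and registered via SetCallback(..., 0) so SampleCB can modify the actual frame buffer in place; the overlay text, font, and the bare-bones IUnknown handling are just placeholders.

// Rough sketch of an ISampleGrabberCB that draws text onto each RGB24 frame
// with GDI+. Assumes the grabber's media type is MEDIASUBTYPE_RGB24 and that
// GDI+ has been initialised elsewhere with GdiplusStartup. ISampleGrabberCB
// comes from qedit.h in older SDKs; newer SDKs dropped that header, so the
// interface is often re-declared by hand.
#include <windows.h>
#include <gdiplus.h>
#include <dshow.h>
#include <qedit.h>
#pragma comment(lib, "gdiplus.lib")
#pragma comment(lib, "strmiids.lib")

class OverlayCB : public ISampleGrabberCB
{
public:
    long width  = 0;   // fill in from the connected VIDEOINFOHEADER
    long height = 0;

    // Called for every frame when registered with SetCallback(pCB, 0);
    // the sample's buffer is modified in place before it reaches the renderer.
    STDMETHODIMP SampleCB(double sampleTime, IMediaSample *sample)
    {
        BYTE *buffer = NULL;
        if (FAILED(sample->GetPointer(&buffer)))
            return S_OK;

        const int stride = (width * 3 + 3) & ~3;   // RGB24 rows are DWORD-aligned

        // DirectShow RGB24 frames are bottom-up, so give GDI+ a negative
        // stride and point it at the last row in memory.
        Gdiplus::Bitmap bmp(width, height, -stride, PixelFormat24bppRGB,
                            buffer + (height - 1) * stride);
        Gdiplus::Graphics g(&bmp);

        Gdiplus::Font       font(L"Arial", 16);
        Gdiplus::SolidBrush brush(Gdiplus::Color(255, 255, 255, 0));
        g.DrawString(L"this is a blue ball", -1, &font,
                     Gdiplus::PointF(10.0f, 10.0f), &brush);
        return S_OK;
    }

    STDMETHODIMP BufferCB(double, BYTE *, long) { return E_NOTIMPL; }

    // Bare-bones IUnknown: the callback object is assumed to outlive the graph.
    STDMETHODIMP QueryInterface(REFIID riid, void **ppv)
    {
        if (riid == __uuidof(ISampleGrabberCB) || riid == IID_IUnknown)
        {
            *ppv = this;
            return S_OK;
        }
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return 1; }
    STDMETHODIMP_(ULONG) Release() { return 1; }
};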

Resources