OpenCV Mac OS X suggested webcam - macos

Does anyone have a recommendation for a webcam to use on a Mac with OpenCV?
Thank you!

The internal webcam that comes with Apple laptops works quite well (rather high quality). You can also get a Logitech Orbit (I personally have had rather good experience with those).

While OpenCV will work with essentially any USB webcam, there is a great deal of variety in image quality. I have personally had excellent luck with the PS3 Eye Cam in Ubuntu after removing the casing. Even though it is only $25-50, it can run at up to 125 Hz and is less susceptible to motion blur. I am not sure how the open-source Mac OS X drivers compare to the Linux drivers, but it is worth considering.

Related

How to get an I420 or RGB24 formatted camera stream in Windows?

A camera will display RGB24 and I420 video on some Windows machines, but not on others. I've tested several cameras with the same result.
I love BeatHarness, my cool old (no longer supported) audio visualizer software. It has served me (and my bands) faithfully for almost a decade. It's still kicking on some (un-networked) WinXP boxes at my practice spot.
Now, on some replacement machines (for live use and such), the camera functions die. I've tested several otherwise functional USB cameras on several Windows machines. When BeatHarness reports the cameras as YUY2/YUYV, the camera settings and preview are black or distorted. The I420 and RGB24 formats work like a charm.
It doesn't appear to be hardware specific, as the same camera that works on a Windows 7 machine doesn't work on my Windows 10 laptop. A couple of times the same camera has worked on one XP box and not on another.
Some searching suggests it might have to do with DirectShow, but I'm not sure, and I've yet to find a comprehensible fix that I could test. Maybe you have some ideas? Can I provide more info to make this question clearer?

FireMonkey scrambles graphics on laptop

I have an application that plots quarter-degree blocks on a map using TImage components stacked on each other. I then add records by drawing them on a separate layer.
The problem I have is that Firemonkey (or Windows) scrambles the graphics, but only on some computers, and I think all the affected computers are laptops. See the following links for screenshots:
The correct image should look like this:
On laptops this scrambling may take three repaints of the layers, but sometimes (with exactly the same code) it happens after one or two. While it is inconsistent in exactly how many repaints it takes, it is guaranteed to happen after no more than three paints.
So I have come to the conclusion that it must be a graphics driver issue. I have an NVIDIA GeForce 950M in my laptop (Asus NJ551 with Windows 10), but if I understand the code correctly I am using the Windows Direct2D acceleration, so the NVIDIA drivers shouldn't affect things?
I set the following flag by default: GlobalUseDX10Software := true; // Use DirectX to generate graphics. This does not seem to make any difference, as it still scrambles even when set to false.
I would prefer the Windows acceleration, as my users may not all have a graphics card installed. A friend using an HP laptop (not sure of the model, but running Windows 8) does not experience the issue, yet another friend with a brand new HP laptop (low spec, but with Windows 10) is also experiencing it.
Can someone please help out here? I am out of ideas, and I'm not even sure what to Google. Is it Windows 10, is it the Graphics driver, etc? Is there a way I can force my laptop to use the Graphics card for testing? While this will not help other users without proper graphics cards, it may help isolate the issue.
Any advice is appreciated!
From the EDN forum, I got a number of other graphics-related global variables to set. The one that sorted out the issue is:
GlobalUseDXSoftware := True;
It now makes sense, as the issue started happening after moving from XE5 to XE8, and the GlobalUseDX10Software flag is now deprecated.

Alternatives to libvlc for receiving HTTP based MJPEG video (from Axis IP Camera)?

I need to display a video stream from an AXIS IP camera, which streams MJPEG video over HTTP. I have tried working with libvlc, but it has some buffering issues, so please suggest alternatives.
System config: Ubuntu 11.10 running on an Atom-based Atmel tablet.
Thanks in advance
BK
PS: I read a bit about GStreamer, but I'm not sure if it's overkill here.
After some research, I found the following alternatives (for C++ on Linux) for receiving/displaying video from an IP camera:
libvlc - a nice framework with good documentation, but it has buffering issues
OpenCV - overkill for this scenario, but otherwise a very good choice
GStreamer - an excellent framework for working with streams, but the documentation is poor and it consumes more CPU than libvlc
For now, I have narrowed it down to GStreamer and have some code working. I can share it if someone is interested. Any more suggestions/alternatives are welcome.

OpenCV 2.2.0 VS2010 C++ webcam grey image only

I'm trying to do a simple OpenCV project to track colour, and I'm testing out my webcam at home, as the school is crowded whenever I have time to work on my project. However, OpenCV is not using my webcam correctly.
This is an image of the code I used and the result.
I really need any help I can get.. The webcam I have is this: http://www.logitech.com/en-us/product/hd-webcam-c525
Would really appreciate any help :)
Does having a 64-bit computer make any difference, since OpenCV is 32-bit?
Try opening another camera; you may have virtual webcams installed on your system. To do this, change the argument of cvCaptureFromCam to something other than 0.

Exposure Lock in iSight

I am creating object-detection program on Mac.
I want to use iSight in manual exposure mode to improve detection quality.
I tried iGlasses & QTKit Capture to do that, and it worked, but the program runs very slowly and is unstable.
So I want to try other solution.
In PhotoBooth.app, iSight seems to run in a fixed-exposure mode, so there might be a way to do it.
I read QTKit Capture documents and OpenCV documents but I couldn't find the answer.
If you have any ideas, please tell me.
Thank you.
QTKit Capture, as easy as it is to use, lacks the ability to set manual camera parameters like gain, brightness, focus, etc. If you were using a Firewire camera, I'd suggest looking into the libdc1394 library, which gives you control over all these values and more if you're using an IIDC Firewire camera (like the old external iSight). I use this library for video capture from, and control of, CCD cameras on a robotics platform.
However, I'm guessing that you're interested in the internal iSight camera, which is USB. Wil Shipley briefly mentions control of parameters on internal USB iSights in his post "Frozen in Carbonite", but most of the Carbon code he lays out controls those values in IIDC Firewire cameras.
Unfortunately, according to this message on the QuickTime mailing list by Brad Ford, it sounds like you can't programmatically control anything but saturation and sharpness on built-in iSights through the exposed interfaces. He speculates that iGlasses is post-processing the image in software, which is something you could do using Core Image filters.
I finally managed to lock my iSight's autoexposure/autowhitebalance from my Cocoa App.
Check out www.paranoid-media.de/blog for more info.
I tried and googled a lot these days, but I couldn't find a good solution.
I think OpenCV + Cocoa + iGlasses is the fastest combination, but it is still unstable.
If you have a good idea, please reply.
Thank you.
The UVC Camera Control for Mac OS X by phoboslab uses basic USB commands and documented USB interfaces to access the webcam controls. The paranoid-media.de/blog post listed above links to PhobosLab and provides a few additional tweaks to that method for the iSight. (Those tweaks can now also be found in the comments at phoboslab.)