How to expose a virtual camera on macOS?

I want to write my own camera filters for videochat, and ideally apply them in any/all of the popular videochat applications (Zoom, Hangouts, Skype, etc.). The way I imagine this working is to write a macOS application that reads the camera feed, applies my filters, and exposes an additional virtual camera. This virtual camera could then be selected in whichever videochat application.
I've spent many hours researching how to do this and I'm still not clear if it's even possible with modern macOS APIs. There are a few similar questions on StackOverflow (e.g. here, here), but they are either unanswered or very old. I'm hoping this question will collect advice/links/ideas in the right direction for how to do this as of 2020.
Here's what I've got so far:
There's a popular tool in the live streaming community called OBS Studio. It captures input from different sources (camera, desktop, etc.), has a plugin system for applying effects, and then streams the output to popular services (e.g. Twitch). However, there is no functionality to expose the stream as a virtual camera on macOS. In discussions about this (thread, thread), folks talk about a tool called Syphon and a tool called CamTwist.
Unfortunately, Syphon doesn't expose a virtual camera anymore: "SyphonInject NO LONGER WORKS IN macOS 10.14 (Mojave). Apple closed up the loophole that allows scripting additions in global directories to load into any process. Trying to inject into any process will silently fail. It will work if SIP is disabled, but that's a terrible idea and I'm not going to suggest or help anyone do that."
Fortunately, CamTwist works. I got it running on my macOS Catalina, applied some of its builtin effects on my camera stream, and saw it show up as a new camera in my Hangouts settings (after restarting Chrome). This was encouraging.
Unfortunately, CamTwist is rather old and not well maintained. It uses Quartz Composer for implementing effects, but Quartz Composer was deprecated by Apple and it's probably living its last days in Catalina.
The macOS SDK used to have an API called CoreMediaIO, which might have been the way to expose a virtual camera, but this API was also deprecated. It's not clear whether there is a modern alternative, or what it would be.
I guess another way of asking this whole question is: how is CamTwist implemented, how come it still works in macOS Catalina, and how would you implement the same thing in 2020?
Anything that sheds some light on all of this would be highly appreciated!

I also want to create my own camera filter, like Snap Camera, so I researched CoreMediaIO and Syphon.
Have you checked out this GitHub project?
https://github.com/lvsti/CoreMediaIO-DAL-Example
This repository started off as a fork of the official CoreMediaIO sample code by Apple. The original code didn't age well (it was last updated in 2012), so the repository owner changed it so that it compiles on modern systems. You can confirm that the code works on macOS 10.14 (Mojave) from the following issue:
https://github.com/lvsti/CoreMediaIO-DAL-Example/issues/4
I haven't actually created a camera filter yet, because I don't know how to send images to the virtual camera built with CoreMediaIO. I'd like to know more; if you do, please tell me.
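This doesn't answer the frame-pushing part, but once the sample DAL plugin is installed (under /Library/CoreMediaIO/Plug-Ins/DAL), you can at least verify that the virtual camera is visible to AVFoundation. A minimal Swift sketch for macOS 10.15, where DAL devices generally surface as .externalUnknown:

```swift
import AVFoundation

// Enumerate video capture devices; a CoreMediaIO DAL virtual camera
// should appear here alongside the physical cameras.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    print("\(device.localizedName): \(device.uniqueID)")
}
```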

CamTwist uses CoreMediaIO. What makes you think that's deprecated? Looking at the headers in the 10.15 SDK, I see no indication that it's deprecated. There were updates as recently as 10.14.


Where can I find an example of creating a FaceTime comparable camera in OSX

Many of us are working from home more, and I envy the Windows guys who have a virtual-webcam plugin for OBS (Open Broadcaster Software). OBS and the Windows plugin are open-source projects. As a competent software engineer I should be able to create a plugin that works on OSX, but I am not a hardened OSX dev.
I am sure I am not googling for the correct APIs and subsystems. If someone could help me with the Apple concept map to this obscure topic, I would be grateful for a set of crumbs that leads to the OSX API call(s) to create a camera. I know it can be done, since SnapCam does it, but that is a closed-source app.
I am aware of the workaround for OBS that:
1) uses code injection and requires disabling security features
2) doesn't even work in the current versions of OSX
3) requires yet another app running, with video previews etc.
I like the challenge of creating this plugin. I am also wise enough to try and ask for a road map if one is available.
Someone beat me to it. https://github.com/johnboiles/obs-mac-virtualcam
I thought I would search just github.com directly with the query "virtual camera macos site:github.com". Constraining the search to just GitHub was quite useful.

App rejected because of using global hotkeys

A few days ago I submitted a new build of my app to Apple to fix some minor bugs under macOS Catalina. This evening Apple called me and explained that they are rejecting the new version from the App Store because the application monitors keystrokes. I use the Clipy/Magnet library (github.com/Clipy/Magnet) to manage hotkeys; this library uses a Carbon API. Admittedly, I'm a little surprised: on the one hand that Apple is calling me directly about this, and on the other that using this library has never caused any problems before. The only thing I noticed about the new build was that Catalina asks for "Input Monitoring" permission at the first start of the program. I've installed several other programs that also respond to global hotkeys, and none of them require the "Input Monitoring" permission.
Has anyone seen the same phenomenon under Catalina, and how do you solve the problem of monitoring global hotkeys?
I know of numerous applications using the MASShortcut framework, and I don't believe they've had issues with the store. It also uses a Carbon API, but I think the issue in Magnet is the call to CGEvent.tapCreate, which can globally monitor all keyboard input and which MASShortcut doesn't use.
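For anyone comparing the two approaches, here is a minimal Swift sketch of the RegisterEventHotKey route (the Carbon call that, as far as I know, MASShortcut wraps). It only delivers the hotkeys you explicitly registered, which is presumably why it doesn't trip Catalina's Input Monitoring prompt the way a CGEvent.tapCreate tap does. The ⌘⇧H binding and the "MYAP" signature are arbitrary examples:

```swift
import Carbon.HIToolbox

// Keep the registration alive for as long as the binding should work.
var hotKeyRef: EventHotKeyRef?
let hotKeyID = EventHotKeyID(signature: OSType(0x4D594150), id: 1) // "MYAP"

var pressed = EventTypeSpec(eventClass: OSType(kEventClassKeyboard),
                            eventKind: UInt32(kEventHotKeyPressed))

// The handler fires only for hotkeys this app registered, not all keystrokes.
InstallEventHandler(GetApplicationEventTarget(), { _, event, _ in
    var id = EventHotKeyID()
    GetEventParameter(event, EventParamName(kEventParamDirectObject),
                      EventParamType(typeEventHotKeyID), nil,
                      MemoryLayout<EventHotKeyID>.size, nil, &id)
    print("hotkey \(id.id) pressed")
    return noErr
}, 1, &pressed, nil, nil)

// Register ⌘⇧H as a system-wide hotkey; events arrive via the app's run loop.
RegisterEventHotKey(UInt32(kVK_ANSI_H), UInt32(cmdKey | shiftKey),
                    hotKeyID, GetApplicationEventTarget(), 0, &hotKeyRef)
```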

How to host audio units in OSX

I've been looking through the audio unit documentation for OSX, and I haven't found any good resources about how to host an audio unit in OSX. There are lots of resources on how to build audio units, and some about hosting on iOS. Has anyone seen a good document to this effect?
Thanks
I'm not sure how the samples have changed over the years... it looks like the current related samples don't address the issue in depth.
In the past, there were a few very basic examples which shipped with the DevTools distributions. IIRC, these were distributed in DEVTOOLS/Examples/ or DEVTOOLS/Extras/.
The AU APIs haven't changed a whole lot since they were written (fast dispatch was added along the way...), so the samples should help with the backend. They were written in the era when Cocoa UIs were still very new to AUs, so the frontend will be the aspect that's changed the most.
You may want to look for them in your Xcode 3.0 or Xcode 2.5 installs.
This [1] doc shows how to open audio units from within an OSX app. It doesn't cover the general case of hosting audio units though.
[1] http://developer.apple.com/library/mac/technotes/tn2091/_index.html
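To make the backend part concrete, here is a minimal Swift sketch of hosting with the AudioComponent API (the modern replacement for the old Component Manager calls in those early samples). It opens Apple's default output unit because that's the simplest unit to host; an effect unit would use kAudioUnitType_Effect and a render callback instead:

```swift
import AudioToolbox

// Describe the unit we want to host: Apple's default output unit.
var desc = AudioComponentDescription(componentType: kAudioUnitType_Output,
                                     componentSubType: kAudioUnitSubType_DefaultOutput,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

// Find a component matching the description, then instantiate it.
guard let component = AudioComponentFindNext(nil, &desc) else {
    fatalError("no matching audio unit found")
}
var unit: AudioUnit?
AudioComponentInstanceNew(component, &unit)

if let unit = unit {
    AudioUnitInitialize(unit)
    // Output units pull audio from a render callback you would install
    // via kAudioUnitProperty_SetRenderCallback before starting.
    AudioOutputUnitStart(unit)
    // ... later, tear down in reverse order:
    AudioOutputUnitStop(unit)
    AudioUnitUninitialize(unit)
    AudioComponentInstanceDispose(unit)
}
```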

How the new Mac OS AirDrop works

I am wondering what technologies are used by the new Mac OS AirDrop, and if there is a way to use it on Windows.
You know that AirDrop is a feature that will be introduced as part of Mac OS X Lion (version 10.7), right? That version of the OS is not even out yet, and it won't be until later this summer.
Furthermore, I assume that the handful of lucky developers who have a pre-release copy are under a strict non-disclosure agreement (this is Apple, and that's pretty standard policy in the industry anyway), which would keep them from giving any details about the feature in a public forum such as this one.
But, since I am not one of those lucky developers, I suppose I'm free to do a little speculating about how it might work. Presumably, it takes advantage of Apple's existing Bonjour network service discovery protocol (formerly known as Rendezvous) to locate other users nearby whose devices support AirDrop. The rest of the pieces have been part of Mac OS X for years, they just haven't been wrapped up in a fancy, easy to use interface (really, that's about all that software development is about nowadays). There's always been rich support for peer-to-peer networking, you've always been able to share files with other users, users have always had a public folder, etc. (This is UNIX we're talking about, after all.)
Will it work on Windows? Maybe. Apple has been surprisingly good in recent history about including its Windows brethren in on the fun—iTunes, Safari, MobileMe, etc. But it doesn't always happen right away. Rolling your own solution for Windows (or any other platform) would be pretty simple, but there's no guarantee that it will be compatible with Apple's.
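To make the Bonjour speculation above concrete, here is a purely illustrative Swift sketch of browsing for peers advertising a service type on the local network. The "_airdrop._tcp." type is a guess at what such a feature might advertise, not a documented Apple name:

```swift
import Foundation

// Browse the local domain for a (speculative) AirDrop-like service type.
class PeerBrowser: NSObject, NetServiceBrowserDelegate {
    let browser = NetServiceBrowser()

    func start() {
        browser.delegate = self
        browser.searchForServices(ofType: "_airdrop._tcp.", inDomain: "local.")
    }

    // Called once per peer that advertises the service type.
    func netServiceBrowser(_ browser: NetServiceBrowser,
                           didFind service: NetService,
                           moreComing: Bool) {
        print("found peer: \(service.name)")
    }
}

let peerBrowser = PeerBrowser()
peerBrowser.start()
RunLoop.main.run() // NetServiceBrowser delivers results via the run loop
```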
Bonjour happens at layer 3, so it may be a small part of it.
The real question is how does AirDrop work at layer 2.
AirDrop was reverse-engineered by the https://owlink.org/ folks. They also implemented a free Python version called opendrop. The implementation is (unsurprisingly) quite hairy, as you need to set up a special Wi-Fi link alongside some Bluetooth voodoo, but it apparently works. Or at least it works better than whatever we had before, which was these few questions around SE:
Implementing the AirDrop protocol
Is it possible to listen on the Airdrop protocol with my Ubuntu machine?

Exposure Lock in iSight

I am creating an object-detection program on the Mac.
I want to use the iSight in manual exposure mode to improve detection quality.
I tried iGlasses & QTKit Capture to do that, and it worked, but the program runs very slowly and is unstable.
So I want to try another solution.
In PhotoBooth.app, the iSight seems to run in a fixed exposure mode, so there might be a way to do that.
I read the QTKit Capture documents and the OpenCV documents, but I couldn't find the answer.
If you have any ideas, please tell me.
Thank you.
QTKit Capture, as easy as it is to use, lacks the ability to set manual camera parameters like gain, brightness, focus, etc. If you have an IIDC FireWire camera (like the old external iSight), I'd suggest looking into the libdc1394 library, which gives you control over all of these values and more. I use this library for video capture from, and control of, CCD cameras on a robotics platform.
However, I'm guessing that you're interested in the internal iSight camera, which is USB. Wil Shipley briefly mentions control of parameters on internal USB iSights in his post "Frozen in Carbonite", but most of the Carbon code he lays out controls those values in IIDC Firewire cameras.
Unfortunately, according to this message in the QuickTime mailing list by Brad Ford, it sounds like you can't programmatically control anything but saturation and sharpness on builtin iSights through the exposed interfaces. He speculates that iGlasses is post-processing the image in software, which is something you could do using Core Image filters.
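To illustrate that last point, post-processing a captured frame with Core Image is only a few lines. A minimal sketch: adjustExposure is a hypothetical helper, and cameraImage stands in for a CIImage you'd create from the capture buffer (e.g. with CIImage(cvImageBuffer:)):

```swift
import CoreImage

// Simulate an exposure change in software by applying CIExposureAdjust
// to a captured frame; ev is the exposure shift in f-stops.
func adjustExposure(_ cameraImage: CIImage, ev: Float) -> CIImage? {
    let filter = CIFilter(name: "CIExposureAdjust")
    filter?.setValue(cameraImage, forKey: kCIInputImageKey)
    filter?.setValue(ev, forKey: kCIInputEVKey)
    return filter?.outputImage
}
```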
I finally managed to lock my iSight's autoexposure/autowhitebalance from my Cocoa App.
Check out www.paranoid-media.de/blog for more info.
Hmmm, I tried and googled a lot these past few days, but I couldn't find a good solution.
I think OpenCV + Cocoa + iGlasses is the fastest option, but it's still unstable.
If you have a good idea, please reply.
Thank you.
The UVC Camera Control for Mac OSX by phoboslab uses basic USB commands and documented USB interfaces to access the webcam controls. The paranoid-media.de/blog post listed above links to PhobosLab and provides a few additional tweaks to that method for the iSight. (Those tweaks can now also be found in the comments at PhobosLab.)
