How to host Audio Units in OS X / macOS

I've been looking through the Audio Unit documentation for OS X, and I haven't found any good resources about how to host an audio unit on OS X. There are lots of resources for how to build audio units, and some about hosting on iOS. Has anyone seen a good document to this effect?
Thanks

I'm not sure how the samples have changed over the years... it looks like the current related samples don't address the issue in depth.
In the past, there were a few very basic examples which shipped with the DevTools distributions. If I recall correctly, these were distributed in DEVTOOLS/Examples/ or DEVTOOLS/Extras/.
The AU APIs haven't changed a whole lot since those samples were written (fast dispatch was added along the way...), so they should still help with the backend. They were written in the era when Cocoa UIs were still very new to AUs, so the frontend will be the aspect that's changed the most.
You may want to look for them in your Xcode 3.0 or Xcode 2.5 installs.

This [1] doc shows how to open audio units from within an OS X app. It doesn't cover the general case of hosting audio units, though.
[1] http://developer.apple.com/library/mac/technotes/tn2091/_index.html
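
For the plumbing itself, the modern AudioComponent API (OS X 10.6 and later) is the usual route. Below is a minimal sketch of a host that finds Apple's default output unit, feeds it from a render callback, and starts it. Treat it as an outline, not a definitive implementation: RenderSilence is a placeholder for a real audio engine, and error checking is omitted.

    // clang++ host.cpp -framework AudioToolbox   (or -framework AudioUnit on older systems)
    #include <AudioUnit/AudioUnit.h>
    #include <cstring>
    #include <unistd.h>

    // Placeholder render callback: fills the output buffers with silence.
    // A real host would pull samples from its engine here.
    static OSStatus RenderSilence(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData) {
        for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        return noErr;
    }

    int main() {
        // Describe the unit to host: here, Apple's default output unit.
        AudioComponentDescription desc = {};
        desc.componentType = kAudioUnitType_Output;
        desc.componentSubType = kAudioUnitSubType_DefaultOutput;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        // Find a matching component and create an instance of it.
        AudioComponent comp = AudioComponentFindNext(NULL, &desc);
        AudioUnit unit = NULL;
        AudioComponentInstanceNew(comp, &unit);

        // Install a render callback that supplies the unit with audio.
        AURenderCallbackStruct cb = { RenderSilence, NULL };
        AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input, 0, &cb, sizeof(cb));

        AudioUnitInitialize(unit);
        AudioOutputUnitStart(unit);
        sleep(2);                      // let it render for a moment
        AudioOutputUnitStop(unit);
        AudioUnitUninitialize(unit);
        AudioComponentInstanceDispose(unit);
        return 0;
    }

Hosting an arbitrary effect or instrument unit follows the same pattern: change the componentType/componentSubType, then connect the unit into a graph (or call AudioUnitRender on it directly).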

Related

Where can I find an example of creating a FaceTime-comparable camera in OS X

Many of us are working from home more, and I envy the Windows guys, who have a virtual webcam plugin for OBS (Open Broadcast Studio). OBS and the Windows plugin are open-source projects. As a competent software engineer I should be able to create a plugin that works on OS X, but I am not a hardened OS X dev.
I am sure I am not googling for the correct APIs and subsystems. If someone could help me with the Apple concept map to this obscure topic, I would be grateful for a set of crumbs that leads to the OS X API call(s) to create a camera. I know it can be done, as SnapCam does it, but that is a closed-source app.
I am aware of the workaround for OBS, which
1) uses code injection and requires disabling security features,
2) doesn't even work in current versions of OS X, and
3) requires yet another app running, with video previews etc.
I like the challenge of creating this plugin. I am also wise enough to try and ask for a road map if one is available.
Someone beat me to it. https://github.com/johnboiles/obs-mac-virtualcam
I thought I would search just github.com with the query "virtual camera macos site:github.com". Constraining the search to GitHub was quite useful.

How to expose a virtual camera on macOS?

I want to write my own camera filters for videochat, and ideally apply them in any/all of the popular videochat applications (Zoom, Hangouts, Skype, etc.). The way I imagine this working is to write a macOS application that reads the camera feed, applies my filters, and exposes an additional virtual camera. This virtual camera could then be selected in whichever videochat application.
I've spent many hours researching how to do this and I'm still not clear if it's even possible with modern macOS APIs. There are a few similar questions on StackOverflow (e.g. here, here), but they are either unanswered or very old. I'm hoping this question will collect advice/links/ideas in the right direction for how to do this as of 2020.
Here's what I've got so far:
There's a popular tool in the live streaming community called OBS Studio. It captures input from different sources (camera, desktop, etc.), has a plugin system for applying effects, and then streams the output to popular services (e.g. Twitch). However, there is no functionality to expose the stream as a virtual camera on macOS. In discussions about this (thread, thread), folks talk about a tool called Syphon and a tool called CamTwist.
Unfortunately, Syphon doesn't expose a virtual camera anymore: "SyphonInject NO LONGER WORKS IN macOS 10.14 (Mojave). Apple closed up the loophole that allows scripting additions in global directories to load into any process. Trying to inject into any process will silently fail. It will work if SIP is disabled, but that's a terrible idea and I'm not going to suggest or help anyone do that."
Fortunately, CamTwist works. I got it running on macOS Catalina, applied some of its built-in effects to my camera stream, and saw it show up as a new camera in my Hangouts settings (after restarting Chrome). This was encouraging.
Unfortunately, CamTwist is rather old and not well maintained. It uses Quartz Composer for implementing effects, but Quartz Composer was deprecated by Apple and it's probably living its last days in Catalina.
The macOS SDK used to have an API called CoreMediaIO, which might have been the way to expose a virtual camera, but this API also appears to be deprecated. It's not clear whether there is a modern alternative, or what it would be.
I guess another way of asking this whole question is: how is CamTwist implemented, how come it still works in macOS Catalina, and how would you implement the same thing in 2020?
Anything that sheds some light on all of this would be highly appreciated!
I also want to create my own camera filter, like Snap Camera, so I researched CoreMediaIO and Syphon.
Did you check this GitHub project?
https://github.com/lvsti/CoreMediaIO-DAL-Example
This repository started as a fork of Apple's official CoreMediaIO sample code. The original code didn't age well (it was last updated in 2012), so the repository's owner changed it to compile on modern systems. You can confirm that the code works on macOS 10.14 (Mojave) from the following issue:
https://github.com/lvsti/CoreMediaIO-DAL-Example/issues/4
Actually, I have not created the camera filter yet, because I don't know how to send images to a virtual camera built with CoreMediaIO. If you know more, please share.
CamTwist uses CoreMediaIO. What makes you think that's deprecated? Looking at the headers in the 10.15 SDK, I see no indication that it's deprecated. There were updates as recently as 10.14.
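For what it's worth, the discovery side of CoreMediaIO is easy to exercise from code. Here is a minimal sketch, assuming you link against the CoreMediaIO framework, that enumerates the devices the DAL knows about; a working virtual camera should appear in this list alongside the physical ones.

    // clang++ list_dal.cpp -framework CoreMediaIO -framework CoreFoundation
    #include <CoreMediaIO/CMIOHardware.h>
    #include <cstdio>
    #include <vector>

    int main() {
        // Ask the DAL system object for its list of devices.
        CMIOObjectPropertyAddress addr = {
            kCMIOHardwarePropertyDevices,
            kCMIOObjectPropertyScopeGlobal,
            kCMIOObjectPropertyElementMaster
        };

        UInt32 dataSize = 0;
        CMIOObjectGetPropertyDataSize(kCMIOObjectSystemObject, &addr,
                                      0, NULL, &dataSize);

        std::vector<CMIODeviceID> devices(dataSize / sizeof(CMIODeviceID));
        UInt32 dataUsed = 0;
        CMIOObjectGetPropertyData(kCMIOObjectSystemObject, &addr, 0, NULL,
                                  dataSize, &dataUsed, devices.data());

        printf("Found %zu DAL devices\n", devices.size());
        return 0;
    }

Publishing a device (rather than listing them) is the hard part, and that is what the CoreMediaIO-DAL-Example plugin linked above demonstrates.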

Is there an official Apple API for streaming from a Mac app to Apple TV?

I have been searching high and low, but have so far been unsuccessful, so I am now turning to the Stack Overflow community for advice.
My goal is to build a Mac app in Xcode that will allow me to send MP4 content from my Mac (or from a public URL on the web) to my Apple TV.
I have located numerous classes within the iOS frameworks that enable this (quite easily, it seems), but the trail ends there. It just seems like there is no API to do the same from OS X, but I am hoping that I have overlooked something :-) There seem to be well-established methods for sending audio to AirPlay-enabled devices, but not video.
I am aware of the third-party specification of the protocol at http://nto.github.io/AirPlay.html, and it looks like a tangible plan B for my needs, but I would appreciate any pointers if anyone knows of a more official way.

Low-latency audio playback from Ruby

I'm building an audio application in Ruby which needs low-latency audio playback. So far I'm using SDL, which is great for a prototype, but it doesn't offer nearly the performance I need.
I've tried using the ruby-jack gem, but it doesn't seem complete enough to inject any audio into a playback port (and the documentation is wildly incomplete).
In case it matters, I'm on OS X (but I'd like something decently cross-platform), and I'm (currently) playing back small WAV files, though support for more formats would be better. I especially don't want to call out to a system application to do this.
My application's full source is available on Github; the salient features of it are in a gist, for those who want to have a look.
I'm not certain I have the correct answer for you, but I believe it may be worth your time to look into rbSFML. It is a binding for SFML, a multimedia library which has been growing in popularity.
Go here for rbSFML
http://groogy.se/mainsite/rbsfml/
SFML main page
http://www.sfml-dev.org/
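For reference, rbSFML mirrors SFML's C++ audio API fairly closely, so a minimal playback loop looks roughly like the sketch below (SFML 2.x, shown in C++; "beep.wav" is a placeholder path). SFML loads short samples entirely into memory, which is what you want for low latency.

    #include <SFML/Audio.hpp>
    #include <SFML/System.hpp>

    int main() {
        // Load a short sample entirely into memory.
        sf::SoundBuffer buffer;
        if (!buffer.loadFromFile("beep.wav"))    // placeholder path
            return 1;

        sf::Sound sound;
        sound.setBuffer(buffer);
        sound.play();                            // non-blocking playback

        // Keep the process alive until playback finishes.
        while (sound.getStatus() == sf::Sound::Playing)
            sf::sleep(sf::milliseconds(10));
        return 0;
    }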
Wish I had more information for you!

Differences in multimedia frameworks

I've recently been investigating different multimedia frameworks for adding audio and video capabilities to my applications.
I've been looking at Phonon, GStreamer, FFmpeg, and libVLC/VLC.
However, I cannot find a good resource that answers some of my general questions.
Are these interchangeable?
Do they work at the same level?
Do you have any experience using some, and can you give feedback on why you chose one over the other?
Thanks
Are these interchangeable?
Generally not. Phonon is a high-level API that wraps actual multimedia frameworks, which allows you to change the backend but, on the other hand, limits what you can do.
Do they work at the same level?
No. Some of the ones you mentioned are high-level, some are low-level.
Do you have any experience using some, and can you give feedback on why you chose one over the other?
You should really say what you want to do; then people can advise which framework might be suitable. Lower-level frameworks such as GStreamer cover quite a large variety of use cases.
There is a 'GStreamer SDK' for Windows and OS X which should get you started easily on those platforms (on Linux you can just install your distro's -dev packages). The SDK ships with snappy, a small media player built on Clutter, but you can easily build your own player using some other toolkit or API, of course.
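As a rough illustration of how small such a player can be, here is a minimal playbin example (GStreamer 1.x; in the older 0.10-era SDK the element was named playbin2, and the URI below is a placeholder):

    // g++ player.cpp $(pkg-config --cflags --libs gstreamer-1.0)
    #include <gst/gst.h>

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);

        // playbin assembles a full decode-and-output pipeline by itself.
        GstElement *pipeline = gst_element_factory_make("playbin", "player");
        g_object_set(pipeline, "uri", "file:///path/to/media.mp4", NULL); // placeholder URI

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // Block until an error occurs or the stream ends.
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(
            bus, GST_CLOCK_TIME_NONE,
            (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

        if (msg) gst_message_unref(msg);
        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }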