kVTCompressionPropertyKey_DataRateLimits property not being respected by hardware encoder on macOS

I am working on a macOS project that uses hardware-accelerated video encoding via VideoToolbox. I have tried controlling the bitrate using kVTCompressionPropertyKey_AverageBitRate and kVTCompressionPropertyKey_DataRateLimits, but after many attempts the encoder still frequently overshoots the target bitrate.
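For reference, here is a simplified sketch of how I'm setting the rate-control properties (session creation and error handling are omitted, and the numeric limits below are placeholders):
#include <VideoToolbox/VideoToolbox.h>

// Average bit rate target (placeholder value).
int32_t averageBitRate = 2000000;
CFNumberRef avgNum = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &averageBitRate);
VTSessionSetProperty(m_compressionSessionRef, kVTCompressionPropertyKey_AverageBitRate, avgNum);
CFRelease(avgNum);

// DataRateLimits takes a CFArray of [byte count, seconds] pairs;
// this caps the output at roughly the average rate over any one-second window.
int64_t bytesPerSecond = averageBitRate / 8;
double windowSeconds = 1.0;
CFNumberRef bytesNum = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt64Type, &bytesPerSecond);
CFNumberRef secondsNum = CFNumberCreate(kCFAllocatorDefault, kCFNumberFloat64Type, &windowSeconds);
const void *limits[2] = { bytesNum, secondsNum };
CFArrayRef dataRateLimits = CFArrayCreate(kCFAllocatorDefault, limits, 2, &kCFTypeArrayCallBacks);
OSStatus status = VTSessionSetProperty(m_compressionSessionRef, kVTCompressionPropertyKey_DataRateLimits, dataRateLimits);
// status is noErr here, yet reading the property back later fails on macOS (see below).
CFRelease(bytesNum);
CFRelease(secondsNum);
CFRelease(dataRateLimits);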
Is kVTCompressionPropertyKey_DataRateLimits respected on macOS?
When I use VTSessionCopySupportedPropertyDictionary to get the supported properties for my compression session, DataRateLimits shows up in the dictionary, and I have set it on the compression session without error. But if I then try to read the value back using
CFTypeRef value = nullptr;
result = VTSessionCopyProperty(m_compressionSessionRef, (__bridge CFStringRef)key, kCFAllocatorDefault, &value);
where key is DataRateLimits, as taken from the dictionary, it returns an error that the property is unsupported. But on iOS, it returns successfully.
This seems pretty serious, and does not appear limited to my machine. If it's truly not supported, that would help explain the volatility of my encoding bitrate. However, in months of working on this project and researching VideoToolbox, I have not seen anything about this not being supported by the hardware encoder on macOS. I would think such a defect would be a big issue. Does anyone have any insight here? Thank you!
Update: 8/5
Note: the SDKs targeted are recent (macOS 10.14.x, iOS 12.x). I see this issue on multiple hardware models with varying versions of macOS (10.11-10.14), though all are MacBook Pros, going as far back as 2013/2014, as well as recent Touch Bar models (2017/2018). For iOS I don't have a lot of data, but I see DataRateLimits appearing as expected on 12.3.1, on an iPhone 7 and an iPhone X. Bottom line: the OS version and hardware don't seem to matter; it works on phones, but not on laptops.

Related

How to expose a virtual camera on macOS?

I want to write my own camera filters for videochat, and ideally apply them in any/all of the popular videochat applications (Zoom, Hangouts, Skype, etc.). The way I imagine this working is to write a macOS application that reads the camera feed, applies my filters, and exposes an additional virtual camera. This virtual camera could then be selected in whichever videochat application.
I've spent many hours researching how to do this and I'm still not clear if it's even possible with modern macOS APIs. There are a few similar questions on StackOverflow (e.g. here, here), but they are either unanswered or very old. I'm hoping this question will collect advice/links/ideas in the right direction for how to do this as of 2020.
Here's what I got so far:
There's a popular tool in the live streaming community called OBS Studio. It captures input from different sources (camera, desktop, etc.), has a plugin system for applying effects, and then streams the output to popular services (e.g. Twitch). However, there is no functionality to expose the stream as a virtual camera on macOS. In discussions about this (thread, thread), folks talk about a tool called Syphon and a tool called CamTwist.
Unfortunately, Syphon doesn't expose a virtual camera anymore: "SyphonInject NO LONGER WORKS IN macOS 10.14 (Mojave). Apple closed up the loophole that allows scripting additions in global directories to load into any process. Trying to inject into any process will silently fail. It will work if SIP is disabled, but that's a terrible idea and I'm not going to suggest or help anyone do that."
Fortunately, CamTwist works. I got it running on my macOS Catalina, applied some of its builtin effects on my camera stream, and saw it show up as a new camera in my Hangouts settings (after restarting Chrome). This was encouraging.
Unfortunately, CamTwist is rather old and not well maintained. It uses Quartz Composer for implementing effects, but Quartz Composer was deprecated by Apple and it's probably living its last days in Catalina.
The macOS SDK used to have an API called CoreMediaIO, which might have been the way to expose a virtual camera, but this API was also deprecated. It's not clear if/what is a modern alternative.
I guess another way of asking this whole question is: how is CamTwist implemented, how come it still works in macOS Catalina, and how would you implement the same thing in 2020?
Anything that sheds some light on all of this would be highly appreciated!
I also want to create my own camera filter, like Snap Camera.
So I have been researching CoreMediaIO and Syphon.
Have you checked out this GitHub project?
https://github.com/lvsti/CoreMediaIO-DAL-Example
This repository started off as a fork of the official CoreMediaIO sample code by Apple.
The original code hasn't aged well, since it was last updated in 2012, so the owner of the repository modified it to compile on modern systems.
You can see from the following issue that the code works on macOS 10.14 (Mojave):
https://github.com/lvsti/CoreMediaIO-DAL-Example/issues/4
I haven't actually created the camera filter yet, because I don't know how to send images to a virtual camera built with CoreMediaIO.
If you have more information, please let me know.
CamTwist uses CoreMediaIO. What makes you think that's deprecated? Looking at the headers in the 10.15 SDK, I see no indication that it's deprecated. There were updates as recently as 10.14.

macOS High Sierra Edit Kext

A while ago I was digging around in the kexts provided with macOS High Sierra, specifically AppleHDA.kext.
The audio driver appears to implement a DSP chain with some rather complicated signal processing going on. In particular, there is a great deal of dynamic range compression applied to the output as well as other things I'd like to experiment with removing.
There are a large number of configuration plists inside, some of which I suppose may apply to my model of MacBook Pro (MacBookPro13,2). I have narrowed it down to four that likely include the one controlling the sound system on my machine, and I know how I would edit it to get what I want.
However, after much effort editing the kext, it will not load properly once repacked. This is supposedly due to kext signing; I have tried removing the signature altogether and disabling SIP, but neither option works. I'd welcome suggestions for further avenues to explore.
Thanks!

Any way to programmatically disable hardware-accelerated H.264 video decoding in Silverlight 5?

We are getting reports from our Mac users that some of their video is playing back garbled. This only started happening when Silverlight 5 was released. This release included hardware video decoding acceleration for H.264, which is the codec we use. We have found that disabling the hardware acceleration through the Silverlight Preferences solves the problem.
Does anyone know of a programmatic way of disabling the hardware acceleration? We have thousands of users on OS X, and would like to preemptively fix this issue for them. Other ways of solving the issue from our end would also be welcome.
Relevant details:
H.264 codec in MP4 container, sometimes with AAC audio
Video is hosted on Amazon S3 and fed through a CDN
Using the Silverlight MediaElement
I have tried turning off enableGPUAcceleration in the object params
I have tried turning off the CacheMode since it affects GPU acceleration as detailed on this page about Silverlight hardware acceleration.
Update
It doesn't happen consistently, which is making this problem harder to solve. Some videos will play OK, and others will not. All the videos are encoded the same way.
It happens in Google Chrome, Safari, and Firefox.
This is Mac OS X only, it doesn't happen at all in Windows.
It happens on several different models and revisions of Macs. Mac Mini, MacBook Air, MacBook Pro, etc. We haven't found a particular model that never has the issue so far.
Update 2
Reproduces with Silverlight 5.0 and 5.1
Update 3
This is in-browser (hence the browsers listed above)
Here is an example of the garbled video
The preferences setting that fixes the issue
You mentioned several browsers, so I assume the plug-in is hosted in a web page. See if this does it for you:
<param name="enableGPUAcceleration" value="false"/>
in the <object> tag on the HTML page hosting the plug-in.
If it is an out-of-browser application, you can turn off GPU acceleration in the OOB settings.

WebGL on older Mac OS X versions (say 10.4)

Not really a programming related question but...
I'd very much like to experiment with WebGL in my spare time. My current "spare time" machine is a MacBook running Mac OS X Tiger (10.4.x), and I'm unable to find a current browser that supports this OS. Firefox dropped support, Chrome too, and Safari likewise.
I read somewhere that this is due to a Quicktime bug that Apple won't fix.
Does anyone have more information on this issue?
Does anyone have a clue or lead on finding a working implementation of WebGL for Mac OS X 10.4?
Cheers,
I know a fellow who is maintaining a Firefox 4 port to OS X 10.4.
Check out http://www.floodgap.com/software/tenfourfox/
Edit: Unfortunately I've just found out that this doesn't quite fulfil your main reason for wanting Firefox 4.
From the dev's site:
OpenGL support is presently disabled in 10.4Fx. This is Apple's fault, as Mozilla requires non-power-of-two texture sizes, which require OpenGL 2. Unfortunately, PPC Tiger does not support OpenGL 2 at all, and only a subset of cards support it in PPC Leopard (the really irritating part is that Intel Tiger does have OpenGL 2, and OpenGL 2 came out in 2004!). It may be enabled in the future for those handful of configurations on Leopard, but this won't benefit the majority of users. Note that many graphics features will work just fine; they just won't be hardware-accelerated.
If you can get a build of Firefox which has WebGL to run, but don't have a GPU that supports OpenGL ES 2.0, you might want to try setting the "webgl.osmesalib" about:config option. Even simple programs will probably run at a flip-book frame rate however.

Same QtOpenGL code runs about 15 times slower when built with Carbon (vs Cocoa)

I'm developing a very simple application for Mac OS X using Qt and OpenGL (and QtOpenGL) so that going cross-platform is easier.
The application receives a variable number of video streams that have to be rendered to the screen. Each frame of these video streams is used as a texture mapped onto a rectangle in 3D space (very similar to a video wall).
Apart from things such as receiving, locking, and uploading video data and synchronizing threads, I think it's clear that this is quite a simple application.
Everything behaves fine when using the Cocoa-based Qt 4.7 binaries (the default ones) on a 10.5 Mac.
But my code has to run on all OS X versions from 10.4 upward (inclusive). So I tried the code on a 10.4 machine and it crashed right at startup. After a few hours of reading online, I discovered that for a Qt application to target 10.4, the Carbon-based Qt has to be used, so I rebuilt the whole project against that framework.
When the resulting binary runs, everything works well except that the application's frame rate falls to about 2 fps! And it behaves the same on both machines (the 10.5 computer has noticeably better hardware).
I've spent quite some time working on this but have not reached a solution. Any suggestions?
More information about the application and things I've tried:
The code was not modified when recompiling against Carbon.
Only two videos (256x256 textures) are used, to make sure it's not a bandwidth problem (although I know it shouldn't be, because the original build worked).
The two video streams arrive over the (local) network.
When a video frame arrives, a signal is emitted and the data is uploaded to an OpenGL texture (glTexSubImage2D); see the stripped-down sketch after this list.
A timer triggers rendering (paintGL) about every 20 ms (~50 fps).
The render code uses the textures (updated or not) to draw the rectangles.
Rendering only when a frame arrives won't work, because there are two (asynchronous) video streams, and other things have to be drawn on screen as well.
Only basic OpenGL commands are used (no PBOs, FBOs, VBOs, ...). The only potentially problematic thing could be the shaders (available only from Qt 4.7 on), but their code is trivial.
I've used OpenGL Profiler and Instruments; nothing special or strange was observed.
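To make the setup concrete, here is a stripped-down sketch of the upload and render path (class and member names are made up for illustration; the real code also handles locking and multiple streams):
// Minimal sketch of the upload + timed render path (Qt 4 / QGLWidget).
// Names are illustrative; the real code handles locking and several streams.
#include <QGLWidget>
#include <QTimer>

class VideoWallWidget : public QGLWidget {
    Q_OBJECT
public:
    explicit VideoWallWidget(QWidget *parent = 0) : QGLWidget(parent), m_texture(0) {
        QTimer *timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(updateGL()));  // repaint roughly every 20 ms (~50 fps)
        timer->start(20);
    }

public slots:
    // Connected to the stream's frame signal; called whenever a 256x256 RGBA frame arrives.
    void onFrame(const uchar *pixels) {
        makeCurrent();
        glBindTexture(GL_TEXTURE_2D, m_texture);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 256, 256, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

protected:
    void initializeGL() {
        glEnable(GL_TEXTURE_2D);
        glGenTextures(1, &m_texture);
        glBindTexture(GL_TEXTURE_2D, m_texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }

    void paintGL() {
        glClear(GL_COLOR_BUFFER_BIT);
        glBindTexture(GL_TEXTURE_2D, m_texture);
        glBegin(GL_QUADS);                       // one textured rectangle per stream
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
        glEnd();
    }

private:
    GLuint m_texture;
};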
Some things I suspect (conclusions):
It's clearly not a hardware issue: the same computer behaves differently.
It feels like a threading/locking problem, but why?
Carbon is 32-bit, while the 10.5 build was 64-bit; it isn't possible to build 64-bit with Carbon.
To rule out 32-bit as the cause, I also rebuilt the first (Cocoa) project as 32-bit. It performed practically the same.
I've read something about Carbon having problems (more than usual) with context switching.
Maybe the OpenGL implementation is multithreaded and the code is not (or the other way around)? That could cause a lot of stalls.
Maybe Carbon handles events differently from Cocoa (I mean signal/event dispatching, the main loop...)?
OK, this (sorry for the long write-up) is my current headache. Any suggestion or idea would be much appreciated.
Thanks in advance.
May I ask a diagnostic question? Can you ensure that it's not being passed to the software renderer?
I remember that when 10.4 was released, there was some confusion about Quartz Extreme, Quartz, and Carbon, with some of it disabled, and hardware renderers disabled by default in some configurations, which required the end user to change settings to get things working correctly. I'm not sure whether this information is pertinent, because you say that, having targeted 10.4, the problem shows up on both the 10.4 and the 10.5 machines, yes?
It's possible (though admittedly I'm grasping at straws here) that even in 10.5 carbon doesn't use the hardware renderers by default. I'd like to think though that OSX prefers hardware renderers to software renderers in all scenarios, but it may be worth spending a little time looking into, given how thoroughly you're already looking into other options.
Good luck.
Since you are using Qt, I assume your code would also build for Windows or Linux. Have you tried your application on those platforms?
That would quickly reveal whether the problem comes from Qt or from the Mac OS X side.
