I need to write a display driver on macOS High Sierra for an external display. I found an IOKit sample for a device driver and basic documentation about IOVideoDevice, but I cannot find detailed documentation or sample code for IOVideoDevice.
I joined the Apple Developer Program for $99/year. Do I have to join a special Apple program to write a video driver? I wonder how graphics card vendors, DisplayLink, and AirParrot got the information.
By "video driver" do you mean a video capture device or a graphics card (GPU)?
IOVideoDevice implies a video capture device, e.g. a webcam or video capture card. However, this API is old; nowadays drivers for video capture devices should be written as CoreMediaIO plugins. (Though since the prevalence of the Library Validation code signing flag, this route also has issues, such as 3rd party capture drivers not working with FaceTime and similar apps; that goes beyond the scope of this question.)
"Graphics card" suggests you have a device you want to use as a display for the Mac. This is not officially supported by Apple. It used to be that you could create an IOFramebuffer subclass. As of macOS 10.13 this no longer works as expected (blank screen), and it does not work at all as of 10.13.4-10.13.6. The GPU manufacturers (Intel, AMD, and Nvidia) are suppliers to Apple, so they get deep access to the graphics pipeline. The APIs they use to implement their drivers are not public.
Update: As of 10.14 and 10.15, IOFramebuffer subclasses sort of work again. At least, sufficiently so that the OS extends the display area to such a virtual screen, although the "VRAM" is never actually filled with the image data. You need to capture that in user space via Core Graphics APIs or similar.
I'm facing some issues when I record my screen with audio while using a Bluetooth microphone on macOS. The voice is sometimes crackly, sped up, robotic-sounding, or just loud noise.
Here are a few pieces of information:
I built a custom app to record screen/audio using Electron framework.
navigator.mediaDevices.getUserMedia is used without any specific configuration.
I can sometimes replicate the problem with Microsoft Teams, or with other meeting/recording tools.
The majority of users who have this issue are on macOS (a few are on Windows), and every affected user was using a Bluetooth device.
Has anyone seen something like this before?
I am new to the Apple DriverKit SDK and I am not clear about how to register my device driver so it would be available as a camera in the OS. Do I have to register a streaming function in the Start function of the IOService? I searched all over the internet for an answer but I could not find one.
I need to read data from a custom USB Camera and then make it available via a custom driver.
Can any of you guys help me?
Support for cameras and video capture devices is not implemented as special I/O Kit classes in macOS (and therefore also not in DriverKit), but rather entirely in user space, via the Core Media I/O framework. Depending on the type of device, a DriverKit component may still be necessary (e.g. for PCI/Thunderbolt, which is not accessible directly from user space, or for USB devices where the camera functionality is not cleanly isolated to a USB interface descriptor). This dext would expose an entirely custom API that can then be used from the user-space CoreMediaIO-based driver.
From macOS 13 (Ventura) onwards, the Core Media I/O extensions API should be used to implement a driver as this will run in its own standalone process and can be used from all apps using Core Media.
Prior to that (macOS 12 and earlier), only the so-called Device Abstraction Layer (DAL) plug-in API existed, which involved writing a dynamic library in a bundle that would be loaded on demand by any application wishing to use the device in question. Unfortunately, this raised code signing issues: applications built with the hardened runtime and library validation flags can only load libraries signed by Apple or by the same team that signed the application itself. This means such apps can't load third-party camera drivers. Examples of such apps include all of Apple's own, such as FaceTime.
I am trying to port a desktop app to Android Automotive OS (AAOS). I am using OpenCV DNN for object tracking and OpenGL to render the contents. The rendering output (2x full HD) must be displayed full screen on two monitors. I also need to send some data over serial communication. I don't have any experience with AAOS, so I cannot decide whether this app is doable on AAOS. If you have any experience with AAOS, can you give me feedback on this project? AAOS runs on a Snapdragon SA8155.
Dev board link:
https://www.lantronix.com/products/sa8155p-automotive-development-platform/#tab-features
Android Automotive supports multiple screens, and this platform specifically provides multiple video outputs.
You should check whether the mentioned features are supported by the provided Android distribution. The distribution is most likely supplied by Qualcomm, in which case you will need access to Qualcomm's documentation.
I want to write a Windows application which accesses the joystick. It is just a very simple application which reads the values and sends them to a server, so I am not using any game programming framework. However, I am confused about which API to use.
I looked at the Multimedia Joystick API, but this is described as superseded by DirectInput. So I looked at DirectInput, but this is also deprecated in favour of XInput.
However the XInput documentation talks only about Xbox360 controllers, and says it does not support "legacy DirectInput devices".
Have Microsoft consigned the entire HID Joystick usage type to the dustbin and given up on supporting them in favour of their own proprietary controller products, or am I missing something?
The most common solution is to use a combination of XInput and DirectInput so your application can properly access both types of devices. Microsoft even provides instructions on how to do this.
Please note that DirectInput is not available for Windows Store apps so if you intend to distribute through there, that's not an option.
XInput devices like the Xbox 360 controller will also work with DirectInput, but with some limitations. Notably, the left and right triggers will be mapped to the same axis instead of being independent, and vibration effects will not be available.
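Microsoft's instructions for mixing the two APIs boil down to enumerating devices with DirectInput, then skipping any device whose PnP device ID contains the substring "IG_" and handling it through XInput instead. A minimal sketch of that classification step, with the WMI/DirectInput enumeration plumbing omitted (the device ID strings below are illustrative examples):

```cpp
#include <string>

// Heuristic from Microsoft's "XInput and DirectInput" guidance: during
// DirectInput enumeration, a device whose PnP device ID contains "IG_"
// is an XInput-capable controller and should be routed to XInput;
// everything else is handled as a legacy DirectInput device.
bool IsXInputDeviceId(const std::wstring& pnpDeviceId) {
    return pnpDeviceId.find(L"IG_") != std::wstring::npos;
}

// Hypothetical routing helper showing how the check would be used while
// iterating over enumerated devices.
const wchar_t* ChooseApiFor(const std::wstring& pnpDeviceId) {
    return IsXInputDeviceId(pnpDeviceId) ? L"XInput" : L"DirectInput";
}
```

In a real application, the device ID is obtained via WMI (`Win32_PNPEntity`) and compared against the VID/PID reported by DirectInput's `EnumDevices` callback; the string check itself is the part Microsoft's sample relies on.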
I wrote a video playback application based on Carbon on Mac OS X. Is there any API to turn on the DXVA-style hardware acceleration supported by the graphics card? Is it supported by the QuickTime SDK or the Carbon API?
DirectX is part of Windows. It doesn't exist in Mac OS X.
If you're doing hardware-accelerated video playback: Why are you worrying about it? If the hardware supports it, chances are, the APIs will use it. So just play your movie and let the library take care of doing it through the graphics card or not.
If you're doing video capture: Core Video will let you do that through the graphics card.
I believe that QuickTime will use hardware video acceleration for the decoding of some types of video stream, but only on very recent hardware.
Note that this is NOT specifically related to the capabilities of the graphics card; for example, the 8800GT's PureVideo feature works fine under Windows but is unused in OS X.