Apple DriverKit SDK camera driver registration - macOS

I am new to the Apple DriverKit SDK and I am not clear about how to register my device driver so it would be available as a camera in the OS. Do I have to register a streaming function in the Start function of the IOService? I searched all over the internet for an answer but I could not find one.
I need to read data from a custom USB Camera and then make it available via a custom driver.
Can any of you guys help me?

Support for cameras and video capture devices is not implemented as special I/O Kit classes in macOS (and therefore also not in DriverKit), but rather entirely in user space, via the Core Media I/O framework. Depending on the type of device, a DriverKit component may still be necessary, e.g. for PCI/Thunderbolt, which is not accessible directly from user space, or for USB devices where the camera functionality is not cleanly isolated to a USB interface descriptor. Such a dext would expose an entirely custom API that in turn can be used from the user space CoreMediaIO based driver.
From macOS 13 (Ventura) onwards, the Core Media I/O extensions API should be used to implement a driver as this will run in its own standalone process and can be used from all apps using Core Media.
Prior to that (macOS 12 and earlier), only a so-called Device Abstraction Layer (DAL) plug-in API existed, which involved writing a dynamic library in a bundle, which would be loaded on-demand by any application wishing to use the device in question. Unfortunately, this raised code signing issues: applications built with the hardened runtime and library-validation flags can only load libraries signed by Apple or by the same team as signed the application itself. This means such apps can't load third party camera drivers. Examples of such apps are all of Apple's own, including FaceTime.
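To make the extension route a bit more concrete, here is a minimal Swift sketch of a camera extension's provider source and entry point. The class name and manufacturer string are placeholders, and this is only an outline of the CMIOExtensionProviderSource protocol, not a working camera; a real extension also has to publish a CMIOExtensionDevice and a CMIOExtensionStream and push sample buffers into it.

    import Foundation
    import CoreMediaIO

    // Hypothetical provider source; a real extension would also create
    // CMIOExtensionDevice/CMIOExtensionStream objects and deliver sample buffers.
    class MyCameraProviderSource: NSObject, CMIOExtensionProviderSource {
        private(set) var provider: CMIOExtensionProvider!

        init(clientQueue: DispatchQueue?) {
            super.init()
            provider = CMIOExtensionProvider(source: self, clientQueue: clientQueue)
            // A real provider would build a CMIOExtensionDevice here and call
            // provider.addDevice(...) so the virtual camera appears in client apps.
        }

        func connect(to client: CMIOExtensionClient) throws {
            // Accept every client; throw here to reject one.
        }

        func disconnect(from client: CMIOExtensionClient) {}

        var availableProperties: Set<CMIOExtensionProperty> {
            [.providerManufacturer]
        }

        func providerProperties(forProperties properties: Set<CMIOExtensionProperty>) throws -> CMIOExtensionProviderProperties {
            let providerProperties = CMIOExtensionProviderProperties(dictionary: [:])
            if properties.contains(.providerManufacturer) {
                providerProperties.manufacturer = "Example Manufacturer"
            }
            return providerProperties
        }

        func setProviderProperties(_ providerProperties: CMIOExtensionProviderProperties) throws {
            // No settable provider properties in this sketch.
        }
    }

    // main.swift of the extension target: hand control to the CMIO machinery.
    let providerSource = MyCameraProviderSource(clientQueue: nil)
    CMIOExtensionProvider.startService(provider: providerSource.provider)
    CFRunLoopRun()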

Related

How do I go about writing a driver using IOKit/DriverKit for MT6320 on an Azure Sphere Kit from MSFT?

Where are the step-by-step instructions to write a generic driver to a USB-connected developer board for MacOS using IOKit/DriverKit (publicly shown in WWDC 2019) in Xcode?
The only documentation I'm aware of is:
DriverKit reference
WWDC 2019 Session 702
The DriverKit version of IOKit is intended to have a similar API to the in-kernel IOKit, so I guess they expect you to be familiar with that.
Note that in many cases when writing drivers for USB devices, you don't need to use either DriverKit or a kext, and instead can use the userspace IOUSB libraries directly. You only really need to use DriverKit or a kext if the kernel is the consumer of your driver. You haven't said what your driver will do, so I can't say which is best in your case. DriverKit is still extremely limited, so unless you want to write a HID or serial port driver, there are few reasons to choose it at the moment.
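If the user-space route fits your device, the starting point is plain IOKit matching from your own process, with no dext or kext involved. Here is a rough Swift sketch; the vendor/product IDs are placeholders, and a real client would go on to open the matched service with the IOUSBHost framework or the classic IOUSBLib plug-in interfaces to issue USB requests.

    import Foundation
    import IOKit

    let targetVendorID = 0x1234   // placeholder vendor ID
    let targetProductID = 0x5678  // placeholder product ID

    // Match every USB device published by the modern USB stack.
    var iterator: io_iterator_t = 0
    let matching = IOServiceMatching("IOUSBHostDevice")
    guard IOServiceGetMatchingServices(kIOMasterPortDefault, matching, &iterator) == KERN_SUCCESS else {
        fatalError("IOServiceGetMatchingServices failed")
    }

    var device = IOIteratorNext(iterator)
    while device != 0 {
        let vendor = IORegistryEntryCreateCFProperty(device, "idVendor" as CFString, kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? Int
        let product = IORegistryEntryCreateCFProperty(device, "idProduct" as CFString, kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? Int
        if vendor == targetVendorID && product == targetProductID {
            print("Found the device, io_service_t: \(device)")
            // Next step: open it with IOUSBHost (IOUSBHostDevice/IOUSBHostInterface)
            // or the IOUSBLib plug-in interfaces and start transferring data.
        }
        IOObjectRelease(device)
        device = IOIteratorNext(iterator)
    }
    IOObjectRelease(iterator)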

How to write a macOS display driver

I need to write a display driver on macOS High Sierra for an external display. I found an IOKit sample for a device driver and a basic document about IOVideoDevice, but I cannot find detailed documentation or sample code for IOVideoDevice.
I joined the Apple Developer Program for $99/year. Do I have to join a special Apple program for writing a video driver? I wonder how graphics card vendors, DisplayLink and AirParrot got the information.
By "video driver" do you mean a video capture device or a graphics card (GPU)?
IOVideoDevice implies a video capture device, e.g. a webcam or video capture card. However, this API is old; nowadays drivers for video capture devices should be written as CoreMediaIO plug-ins. (Though since the Library Validation code signing flag became prevalent, this route also has issues, such as 3rd party capture drivers not working with FaceTime and similar apps; that goes beyond the scope of this question.)
"Graphics card" suggests you have a device you want to use as a display for the Mac. This is not officially supported by Apple. It used to be that you could create an IOFramebuffer subclass. As of macOS 10.13 this no longer works as expected (blank screen), and does not work at all as of 10.13.4-10.13.6. The GPU manufacturers (Intel, AMD, and NVidia) are suppliers to Apple, so they get deep access to the graphics pipeline. The APIs they use to implement their drivers are not public.
Update: As of 10.14 and 10.15, IOFramebuffer subclasses sort of work again. At least, sufficiently so that the OS extends the display area to such a virtual screen, although the "vram" is never actually filled with the image data. You need to capture that in user space via Core Graphics APIs or similar.
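For the user-space capture part, a sketch using CGDisplayStream (one of the Core Graphics options) could look like the following. The main display is used here for brevity, whereas a virtual display driver would look up its own CGDirectDisplayID, and on recent macOS versions this also requires the screen recording permission.

    import Foundation
    import CoreGraphics
    import CoreVideo

    // Capture a display's pixels from user space, since the virtual framebuffer's
    // "vram" is never filled by the OS itself.
    let displayID = CGMainDisplayID()
    let width = CGDisplayPixelsWide(displayID)
    let height = CGDisplayPixelsHigh(displayID)

    let stream = CGDisplayStream(
        dispatchQueueDisplay: displayID,
        outputWidth: width,
        outputHeight: height,
        pixelFormat: Int32(kCVPixelFormatType_32BGRA),
        properties: nil,
        queue: DispatchQueue(label: "display.capture"),
        handler: { status, displayTime, frameSurface, update in
            guard status == .frameComplete, let surface = frameSurface else { return }
            // Forward the IOSurface to the device (USB/network transfer, encode, etc.).
            print("Got frame at host time \(displayTime): \(surface)")
        }
    )
    _ = stream?.start()
    CFRunLoopRun()  // keep the process alive while frames arrive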

How do I access the Joystick on windows in a non-deprecated way?

I want to write a Windows application which accesses the joystick. It is just a very simple application which reads the values and sends them to a server, so I am not using any game programming framework. However, I am confused about which API to use.
I looked at the Multimedia Joystick API, but this is described as superseded by DirectInput. So I looked at DirectInput, but this is also deprecated in favour of XInput.
However the XInput documentation talks only about Xbox360 controllers, and says it does not support "legacy DirectInput devices".
Have Microsoft consigned the entire HID Joystick usage type to the dustbin and given up on supporting them in favour of their own proprietary controller products, or am I missing something?
The most common solution is to use a combination of XInput and DirectInput so your application can properly access both types of devices. Microsoft even provides instructions on how to do this.
Please note that DirectInput is not available for Windows Store apps so if you intend to distribute through there, that's not an option.
XInput devices like the Xbox 360 controller will also work with DirectInput, but with some limitations. Notably, the left and right triggers will be mapped to the same axis instead of being independent, and vibration effects will not be available.

Third party devices with ios8 HomeKit Support?

I already have a home automation iOS app. I am able to control devices that are configured in my home, and I can access my devices over both the local and remote network.
I just read about Apple's new iOS 8 HomeKit support and I want to integrate HomeKit compatibility into my app. I heard that only HAP (HomeKit Accessory Protocol) devices can communicate with the HomeKit framework. Apple also said there is a bridge for third party devices to communicate with HomeKit. There is not much information about the hardware protocols or procedures, so how do I use a bridge between a third party device and HomeKit?
Is a HomeKit bridge real hardware?
I also have doubts about communicating with configured accessories. The HomeKit framework has commands like "startExecutingActionSet" to perform one or multiple tasks, but how do these commands work with the existing command protocols defined in our iOS app?
I am new to hardware engineering, so please give me a simple example of communication between Apple's HomeKit and my hardware device via a bridge.
Thanks in advance...
A HomeKit bridge is a piece of hardware that receives HomeKit style commands from an iDevice and translates them into the specific protocol for the target devices in your home. Philips Hue has one of these. Apple has a protocol that hardware manufacturers need to conform to, and you need to be signed up to their MFi program to get that protocol. However, someone seems to have reverse engineered the specs and you can use their code to write your own software bridge. That's what I'm doing.
In HomeKit you do not talk directly to the devices. That's pretty much the main point of HomeKit: so that each developer doesn't need to know each device's specific protocol, you just trigger iOS to do the talking for a predefined action. I believe you can also add triggers and action sets by building up a group of actions that you want to happen and firing off the event, e.g. turn off all accessories in the garage when I go inside. You don't need to know how to turn off each one; you just tell iOS to run the Off command on each device and it knows the rest. Or at least it does for the ones that have signed up to the MFi program and can listen to HomeKit commands.
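To show what "telling iOS to do the talking" looks like in code, here is a rough Swift sketch using the HomeKit framework. It assumes the user has already set up a home with a "Goodnight" action set and at least one accessory with a power state characteristic; those names and the controller class are placeholders, and your app also needs the HomeKit entitlement and usage description.

    import HomeKit

    // HomeKit (not the app) talks to the accessory or bridge over HAP;
    // the app only names what should happen.
    class HomeController: NSObject, HMHomeManagerDelegate {
        let homeManager = HMHomeManager()

        override init() {
            super.init()
            homeManager.delegate = self
        }

        func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
            guard let home = manager.primaryHome else { return }

            // Run a predefined action set; HomeKit executes every action inside it.
            if let goodnight = home.actionSets.first(where: { $0.name == "Goodnight" }) {
                home.executeActionSet(goodnight) { error in
                    print(error.map { "Action set failed: \($0)" } ?? "Action set done")
                }
            }

            // Or drive a single characteristic directly, still without knowing
            // the accessory's own wire protocol.
            let powerStates = home.accessories
                .flatMap { $0.services }
                .flatMap { $0.characteristics }
                .filter { $0.characteristicType == HMCharacteristicTypePowerState }
            powerStates.first?.writeValue(false) { error in
                print(error.map { "Write failed: \($0)" } ?? "Turned off")
            }
        }
    }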

Is Serial Port Profile (SPP) supported on iOS 7 over Bluetooth Low Energy (v4.0)?

Can I use Serial Port Profile (SPP) to communicate with iOS devices over Bluetooth Low Energy (v4.0) without the need for MFi Chip?
If you're designing something from scratch (rather than trying to interface to an existing SPP-enabled device), there is a possible solution.
Laird Technologies make a Bluetooth Low Energy Module (BL600), which can be loaded with a virtual serial port application. This creates a service which is similar to the SPP; at the remote end it can just be treated as a plain serial port (albeit rather low speed). You could roll your own service to do something similar on other devices.
It's not the most elegant solution, but seems to work okay, and far easier than trying to get MFi certification.
If you cannot control the peripheral's protocol choice:
The Serial Port Profile (SPP) is still supported by Bluetooth 4.0. However, Bluetooth 4.0 Low Energy uses different physical and link layer protocols that are not backwards compatible with older Bluetooth standards. Current iOS and Android devices use "dual mode" interfaces that support both the backwards compatible part of BT 4.0 and the Low Energy standard.
Bluetooth 4.0 Low Energy does not support SPP whereas regular Bluetooth 4.0 does!
I found a Cordova/Phonegap plugin on GitHub that might serve as a source of inspiration for you. They advertise support for SPP on iOS and Android alike.
If you are in control of the peripheral, i.e. you implement the peripheral's software:
Bluetooth 4.0 Low Energy communication makes use of the Generic ATTribute Protocol. Based on GATT there exist a number of profiles but no serial port profile.
The good news is that implementing your own proprietary serial port profile on iOS, Android and your device is fairly simple. The API instructions for your BTLE module/SoC should provide some examples for existing profiles.
As soon as you see how simple implementing your own profile is, you will probably choose to go for a more use case specific profile which will save you lots of power on your (battery powered?) peripheral.
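As a sketch of how small the central (iOS) side of such a home-made "serial over GATT" profile can be, here is roughly what it looks like with Core Bluetooth: one characteristic is written to send bytes and another notifies to receive them. The service and characteristic UUIDs are made-up placeholders for whatever your peripheral firmware defines.

    import CoreBluetooth

    final class SerialOverBLE: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
        // Placeholder UUIDs; use the ones your peripheral advertises.
        static let serviceUUID = CBUUID(string: "0000FFE0-0000-1000-8000-00805F9B34FB")
        static let rxUUID = CBUUID(string: "0000FFE1-0000-1000-8000-00805F9B34FB") // notify
        static let txUUID = CBUUID(string: "0000FFE2-0000-1000-8000-00805F9B34FB") // write

        private var central: CBCentralManager!
        private var peripheral: CBPeripheral?
        private var txCharacteristic: CBCharacteristic?

        override init() {
            super.init()
            central = CBCentralManager(delegate: self, queue: nil)
        }

        func centralManagerDidUpdateState(_ central: CBCentralManager) {
            guard central.state == .poweredOn else { return }
            central.scanForPeripherals(withServices: [Self.serviceUUID], options: nil)
        }

        func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                            advertisementData: [String: Any], rssi RSSI: NSNumber) {
            self.peripheral = peripheral        // keep a strong reference
            central.stopScan()
            central.connect(peripheral, options: nil)
        }

        func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
            peripheral.delegate = self
            peripheral.discoverServices([Self.serviceUUID])
        }

        func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
            guard let service = peripheral.services?.first(where: { $0.uuid == Self.serviceUUID }) else { return }
            peripheral.discoverCharacteristics([Self.rxUUID, Self.txUUID], for: service)
        }

        func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService,
                        error: Error?) {
            for characteristic in service.characteristics ?? [] {
                if characteristic.uuid == Self.rxUUID {
                    peripheral.setNotifyValue(true, for: characteristic)  // start "receiving"
                } else if characteristic.uuid == Self.txUUID {
                    txCharacteristic = characteristic
                }
            }
        }

        func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic,
                        error: Error?) {
            if let data = characteristic.value {
                print("RX \(data.count) bytes")
            }
        }

        func send(_ data: Data) {
            guard let peripheral = peripheral, let tx = txCharacteristic else { return }
            peripheral.writeValue(data, for: tx, type: .withResponse)
        }
    }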
Just to clear up John Parsons' comment from Feb 16th - the BL600 is definitely not discontinued whatsoever.
vSP works well for low level, low throughput data connectivity using BLE with iOS devices, as well as Android. A video showing the solution working with an iPad is at this link, and full source code is available for the iOS application as well: http://www.lairdtech.com/Support-Center/Technical-Library/Videos/VSP-Bridge-Command/#.UwYvzGJ_s1w
There are no MFi requirements for BLE connectivity on iOS.
MFi is only relevant to Classic Bluetooth data connections to / from iOS devices, where you need to use Apple's iAP protocol, be a MFi licensee, use an external Apple Authentication IC and pay a royalty to Apple.
No, you can't. BLE does not support SPP.
No, you can't. In general, it's important to remember that any Bluetooth Classic profile isn't necessarily applicable for Bluetooth Low Energy. With BLE however, you can easily create your own custom service/profile, specially tailored towards your particular application. As far as I know, all BLE communication with iOS is currently allowed without participating in the MFi. You can also take a look at this page for further information on SPP and BLE.
I'm searching for SPP for iOS myself and found a German supplier, lintech.de, that has products for "Bluetooth meets Apple" claiming to support/emulate SPP, apparently using their own embedded software layer combined with iAP. "BlueMFI software communicate with APPLE devices using the iAP (iPod Accessory Protocol) and manage the data communication with the Apple authentication chip...BlueMFI software is designed to run on a variety of hardware platforms (Bluetooth modules), and interested users can obtain the relevant evaluation kits. LinTech’s Bluetooth modules with BlueMFI software not only support the APPLE iAP protocol via Bluetooth, but they are also able to communicate with standard Bluetooth devices." Haven't tried this yet, just exploring and sharing.
I won't say SPP is directly supported under iOS 7, Apple says no. Won't argue :)
But...
I use connectblue modules OBS421 and OBS425 on a data collection project.
The BLE modules have the SPP profile enabled, and I transmit data from my sensors to the iOS devices using the BTLE module in SPP mode.
It works pretty well under iOS 6 and 7.
That said, I was having trouble with MFi Bluetooth devices on the iPhone 5S, which is why I moved to BTLE.
The drawback with BTLE is that it's limited to 20 bytes at a time.
I had to adjust hardware and software, but it was easy.
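For what it's worth, the 20-byte figure comes from the default ATT MTU of 23 bytes minus 3 bytes of protocol overhead, and the software-side adjustment can be as simple as chunking the payload before writing. In this Swift sketch, peripheral and characteristic stand in for your own connection state, and the peripheral firmware is assumed to reassemble the chunks.

    import CoreBluetooth

    // Split a payload into <= 20-byte writes to stay under the default ATT MTU.
    func sendChunked(_ payload: Data, to peripheral: CBPeripheral,
                     characteristic: CBCharacteristic, chunkSize: Int = 20) {
        var remaining = payload
        while !remaining.isEmpty {
            let chunk = Data(remaining.prefix(chunkSize))
            // .withResponse paces the transfer; the peripheral reassembles the stream.
            peripheral.writeValue(chunk, for: characteristic, type: .withResponse)
            remaining = remaining.dropFirst(chunkSize)
        }
    }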
There are programmable chips, such as the Bluegiga BL112, that do the job; it's the cable replacement code.
I'm actually integrating it for both iOS and Android 4.3. It works, at least on the demo board.
