Is there any API to turn on DXVA in Mac OS X?

I wrote a video playback application based on Carbon in Mac OS X. Is there any API to turn on the DXVA feature supported by the graphics card? Is it supported by the QuickTime SDK or the Carbon API?

DirectX is part of Windows. It doesn't exist in Mac OS X.
If you're doing hardware-accelerated video playback: why are you worrying about it? If the hardware supports it, chances are the APIs will use it. So just play your movie and let the library decide whether to do it through the graphics card or not.
If you're doing video capture: Core Video will let you do that through the graphics card.

I believe that on very recent hardware only, QuickTime will use hardware video acceleration for decoding some types of video stream.
Note that this is NOT specifically related to the capabilities of the graphics card; for example, the 8800GT's PureVideo feature works fine under Windows but is unused in OS X.

Related

How to write MacOS display driver

I need to write a display driver for an external display on macOS High Sierra. I found an IOKit sample for a device driver and basic documentation about IOVideoDevice, but I cannot find detailed documentation or sample code for IOVideoDevice.
I joined the Apple Developer Program for $99/year. Do I have to join a special Apple program to write a video driver? I wonder how the graphics card vendors, DisplayLink, and AirParrot got the information.
By "video driver" do you mean a video capture device or a graphics card (GPU)?
IOVideoDevice implies a video capture device, e.g. a webcam or video capture card. However, this API is old; nowadays drivers for video capture devices should be written as CoreMediaIO plugins. (Though since the prevalence of the Library Validation code-signing flag, this route also has issues, such as third-party capture drivers not working with FaceTime and similar apps; that goes beyond the scope of this question.)
"Graphics card" suggests you have a device you want to use as a display for the Mac. This is not officially supported by Apple. It used to be that you could create an IOFramebuffer subclass. As of macOS 10.13 this no longer works as expected (blank screen), and does not work at all as of 10.13.4-10.13.6. The GPU manufacturers (Intel, AMD, and NVidia) are suppliers to Apple, so they get deep access to the graphics pipeline. The APIs they use to implement their drivers are not public.
Update: As of 10.14 and 10.15, IOFramebuffer subclasses sort of work again. At least, sufficiently so that the OS extends the display area to such a virtual screen, although the "VRAM" is never actually filled with the image data. You need to capture that in user space via Core Graphics APIs or similar.
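A minimal sketch of that user-space capture route, assuming a Core Graphics approach (the function name here is illustrative, not from the answer):

```objc
// Sketch: pull the current contents of a display (including a virtual
// IOFramebuffer-backed screen) in user space via Core Graphics.
#include <ApplicationServices/ApplicationServices.h>

CGImageRef CopyFrameOfDisplay(CGDirectDisplayID displayID) {
    // CGDisplayCreateImage captures the display's current frame;
    // the caller must CGImageRelease() the result when done.
    return CGDisplayCreateImage(displayID);
}
```

Real display IDs, including those of virtual screens, can be enumerated with CGGetActiveDisplayList().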

Deprecated API Usage - Apple no longer accepts submissions of apps that use QuickTime APIs

The application uses the QTKit framework.
When I try to upload it to iTunes, an error message appears: invalid binary.
I received an email with: Deprecated API Usage - Apple no longer accepts submissions of apps that use QuickTime APIs.
source : https://developer.apple.com/quicktime/
QTKit
QTKit is a Cocoa framework for manipulating time-based media providing a set of easy to use classes and methods to handle capture, playback, editing, and export. Use these resources for integrating media into your app.
QuickTime
QuickTime provides a powerful C based API for manipulating time-based media, allowing low-level media export, editing, encoding and decoding. While QTKit is the preferred API for use with time-based media, a good understanding of QuickTime is essential for all developers.
As a result: QuickTime and QTKit are different APIs.
Please confirm that Apple no longer accepts submissions of apps that use either the QuickTime API or QTKit.
It certainly appears so. QTKit isn't the same as QuickTime, but it's built on top of it, and QuickTime isn't moving into the future very gracefully. Apple doesn't want to keep piling hacks on top of hacks to keep QuickTime working on newer machines and operating systems.
The good news is AVFoundation is kind of awesome.
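For anyone migrating, basic playback in AVFoundation is short; a minimal sketch (the file path is a placeholder):

```objc
// Sketch: AVFoundation equivalent of simple QTKit movie playback.
#import <AVFoundation/AVFoundation.h>

AVPlayer *player = [AVPlayer playerWithURL:
    [NSURL fileURLWithPath:@"/path/to/movie.mov"]];  // placeholder path
[player play];
```

For on-screen video you would attach the player to an AVPlayerLayer in your view hierarchy.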

QT movie processing on GPU

I'm working on a QTKit project in cocoa where the QT movie is taking up ~70% of the CPU load. I would like to move some of the processing load onto the GPU, if possible. Does anyone know if that is possible?
On Snow Leopard, use initWithAttributes:error: to create the QTMovie object, and include the QTMovieOpenForPlaybackAttribute attribute with a value of YES. This will use QuickTime X to decode/play the movie, which, if possible, will play it with GPU acceleration.
Be aware that this locks out a lot of functionality. You really do need to only be using the movie for playback. The QuickTime Kit Application Programming Guide has more information.
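A sketch of what that looks like (the file path is a placeholder; the attribute keys are QTKit's):

```objc
// Sketch: open a movie for playback-only so QuickTime X can use
// GPU-accelerated decoding where available (Snow Leopard and later).
#import <QTKit/QTKit.h>

NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
    @"/path/to/movie.mov", QTMovieFileNameAttribute,            // placeholder path
    [NSNumber numberWithBool:YES], QTMovieOpenForPlaybackAttribute,
    nil];
NSError *error = nil;
QTMovie *movie = [[QTMovie alloc] initWithAttributes:attrs error:&error];
```

With QTMovieOpenForPlaybackAttribute set, editing and export methods on the movie are unavailable, as the answer notes.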
Alternatively, install a GPU-accelerated QuickTime codec for the movie format you want to support; apparently such things exist, e.g. CoreAVC.

Streaming from webcam on OS X - what technology to use?

I'm building a videoconferencing application in OS X.
What technology would be best to use for real-time streaming video/audio captured from webcam/microphone in OS X?
So far I was unsuccessful with these methods:
using QTKit I captured the media, but there isn't a way to stream it (without using QTSS, which is too bloated and hard to control programmatically)
using QT Java I got everything (almost) working, but the library is deprecated, it crashes every once in a while, shows signs of memory leaks, and there isn't a way to save preferences from a settings dialog
I installed GStreamer using MacPorts, but there isn't a working osxvideosrc (or an audio source, for that matter)
My next target is VLC because it can access the webcam in OS X, but I'm not sure whether it will give me what I need: can I control it fully through an API, and can I display the stream inside a Cocoa application (using QTKit's player)?
Couple of points:
Consider Flex/Flash and possible Adobe Air. Many people have written videoconferencing applications this way.
QT for Java is dated and not going anywhere.
VLC is a solid option. Stable, well known, powerful, and very mature.

Video capture on MacOS

I'm writing a C++ application with the Trolltech Qt library and I need to capture a video stream from a camera and some medical instruments.
What kind of hardware can I use to do this? I've tried OpenCV but it doesn't recognize my EyeTV 250.
Can I use Pinnacle Video capture for Mac?
thanks,
Andrea
I believe that Qt delegates to QuickTime for media on OS X. I'd therefore expect that any hardware supported by QuickTime is in play. If you're willing to be locked to OS X, using the native API will likely be much easier. QTKit, the Objective-C API for QuickTime, is new with Leopard (OS X 10.5) and is very good. You'll likely want to start with QTKit's Capture API. Since you're working with C++, you'll also want to learn about Objective-C++ for building the bridge between QTKit and your code.
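A minimal sketch of that QTKit Capture starting point (error handling trimmed; wiring up outputs or a preview view is omitted):

```objc
// Sketch: open the default camera and start a QTKit capture session.
#import <QTKit/QTKit.h>

QTCaptureSession *session = [[QTCaptureSession alloc] init];
QTCaptureDevice *camera =
    [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
NSError *error = nil;
if ([camera open:&error]) {
    QTCaptureDeviceInput *input =
        [[QTCaptureDeviceInput alloc] initWithDevice:camera];
    [session addInput:input error:&error];
    [session startRunning];  // frames now flow to any attached outputs
}
```

From there you would add a QTCaptureDecompressedVideoOutput (or similar) to receive frames in your own code.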
Try OpenCV: there is a good project at http://code.google.com/p/opencv-cocoa/ that provides C++ classes.
