I'm working on a QTKit project in Cocoa where the QuickTime movie is taking up ~70% of the CPU load. I would like to move some of the processing onto the GPU, if possible. Does anyone know whether that is possible?
On Snow Leopard, use initWithAttributes:error: to create the QTMovie object, and include the QTMovieOpenForPlaybackAttribute attribute with a value of YES. This will use QuickTime X to decode/play the movie, which, if possible, will play it with GPU acceleration.
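A minimal sketch of what that might look like, assuming QTKit on Snow Leopard and a placeholder local file path:

    // Open the movie for playback only, so QuickTime X can use
    // GPU-accelerated decoding where the hardware/codec allows it.
    NSURL *movieURL = [NSURL fileURLWithPath:@"/path/to/movie.mov"]; // placeholder path
    NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
        movieURL, QTMovieURLAttribute,
        [NSNumber numberWithBool:YES], QTMovieOpenForPlaybackAttribute,
        nil];
    NSError *error = nil;
    QTMovie *movie = [[QTMovie alloc] initWithAttributes:attrs error:&error];
    if (!movie) {
        NSLog(@"Could not open movie: %@", error);
    }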
Be aware that this locks out a lot of functionality. You really do need to only be using the movie for playback. The QuickTime Kit Application Programming Guide has more information.
Install a GPU-accelerated QuickTime codec for the movie format you want to support? Apparently such things exist, e.g. CoreAVC.
I need to export an MP4 file containing several streams of audio. There is no video at all at this point, though it might be requested at some point in the future.
All this is on the Mac, running a decently recent version of Mac OS X. QTKit is not an option. Portability to iOS would be a bonus.
Where should I look? A very casual look at AV Foundation suggests this might be the way, but it doesn't look simple at all.
Or should I look instead at a third-party library? ffmpeg? mp4v2?
Thanks for any suggestion.
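For what it's worth, a minimal sketch of what the AV Foundation route might look like, assuming an OS recent enough to have AVAssetWriter; the output path and audio settings are placeholders, and the sample-buffer plumbing is omitted:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreAudio/CoreAudioTypes.h>

    // Write an .mp4 container with two AAC audio tracks (no video).
    NSError *error = nil;
    NSURL *outURL = [NSURL fileURLWithPath:@"/tmp/multitrack.mp4"]; // placeholder path
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outURL
                                                     fileType:AVFileTypeMPEG4
                                                        error:&error];

    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
        [NSNumber numberWithInt:2],                    AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100.0],            AVSampleRateKey,
        nil];

    AVAssetWriterInput *trackOne = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                      outputSettings:audioSettings];
    AVAssetWriterInput *trackTwo = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                      outputSettings:audioSettings];
    [writer addInput:trackOne];
    [writer addInput:trackTwo];

    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    // ...append CMSampleBuffers to each input, then -markAsFinished and -finishWriting...

The same classes exist on iOS, which may help with the portability requirement.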
I'm building a videoconferencing application in OS X.
What technology would be best to use for real-time streaming video/audio captured from webcam/microphone in OS X?
So far I've been unsuccessful with these methods:
Using QTKit I captured the media, but there isn't a way to stream it (short of using QTSS, which is too bloated and hard to control programmatically).
Using QuickTime for Java I got (almost) everything working, but the library is deprecated, it crashes every once in a while, it shows signs of memory leaks, and there isn't a way to save preferences from a settings dialog.
I installed GStreamer via MacPorts, but there isn't a working osxvideosrc (or an audio source, for that matter).
My next target is VLC because it can access the webcam in OS X, but I'm not sure it will give me what I need: can I control it fully through an API, and can I display the stream inside a Cocoa application (using QTKit's player)?
Couple of points:
Consider Flex/Flash and possibly Adobe AIR. Many people have written videoconferencing applications this way.
QuickTime for Java is dated and not going anywhere.
VLC is a solid option. Stable, well known, powerful, and very mature.
Could you advise a combination of server and client technologies, tools, and frameworks to implement a solution that meets the following requirements?
A file server on the network has a huge library of MP3/AAC/AIFF/WAV music files.
A desktop Cocoa application accesses the audio files via URLs: RTMP, HTTP, RTSP+RTP, FTP — how do I choose?
Audio content should be streamed and played with seeking (this is crucial) without downloading the entire file: QuickTime, AudioQueue, AudioFile, AudioStream, CFHTTP, all of them? — how do I develop the client?
After solid research I've ended up with myriad options and articles. But it looks like half of them are quite out of date (2001–2005), and the other half are about universal code (pure C) for Mac OS X and iPhone OS.
However, the main goal here is to write a desktop music player for Mac OS X 10.5.
I cannot believe that all this raw C coding is really required.
No wrappers? No handy libraries? No components?
P.S. Research has resulted in the following combination: qt_tools for hinting + DSS for RTSP streaming + QTMovie for playback + setCurrentTime: for seeking. This setup requires double the storage, for hinted .mov versions of every music file, but it works.
I am not sure, but I believe you can use [QTMovie movieWithURL:url error:err] to stream a movie from a URL, then pass it to a QTMovieView object. QuickTime treats audio like movies, so it may work. Or it may try to load the entire file.
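A minimal sketch of that idea, assuming a placeholder stream URL and a QTMovieView outlet named movieView:

    // Open a movie from a URL, hand it to a QTMovieView, seek, and play.
    NSURL *url = [NSURL URLWithString:@"http://server.local/music/track.m4a"]; // placeholder URL
    NSError *err = nil;
    QTMovie *movie = [QTMovie movieWithURL:url error:&err];
    if (movie) {
        [movieView setMovie:movie];                 // movieView: a QTMovieView in the nib
        [movie setCurrentTime:QTMakeTime(30, 1)];   // seek 30 seconds in
        [movie play];
    } else {
        NSLog(@"Could not open movie: %@", err);
    }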
Have a look at the QuickTime Streaming Guide.
Did you look at VLC as a streaming solution?
I wrote a video playback application based on Carbon in Mac OS X. Is there any API to turn on the DXVA-style hardware acceleration supported by the graphics card? Is this supported by the QuickTime SDK or the Carbon API?
DirectX is part of Windows. It doesn't exist in Mac OS X.
If you're doing hardware-accelerated video playback: Why are you worrying about it? If the hardware supports it, chances are, the APIs will use it. So just play your movie and let the library take care of doing it through the graphics card or not.
If you're doing video capture: Core Video will let you do that through the graphics card.
I believe that only on very recent hardware will QuickTime use hardware video acceleration for decoding some types of video stream.
Note that this is NOT specifically tied to the capabilities of the graphics card; for example, the 8800 GT's PureVideo feature works fine under Windows but is unused in OS X.
I'm writing a C++ application with the Trolltech Qt library, and I need to capture a video stream from a camera and some medical instruments.
What kind of hardware can I use to do this? I've tried with OpenCV but it doesn't recognize my EyeTV 250.
Can I use Pinnacle Video capture for Mac?
thanks,
Andrea
I believe that Qt delegates to QuickTime for media on OS X, so I'd expect that any hardware supported by QuickTime is in play. If you're willing to be locked to OS X, using the native API will likely be much easier. QTKit, the Objective-C API for QuickTime, is new with Leopard (OS X 10.5) and is very good. You'll likely want to start with QTKit's Capture API. Since you're working with C++, you'll also want to learn about Objective-C++ for building the connection between QTKit and your code.
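A minimal sketch of the QTKit Capture route, assuming the default video device and a QTCaptureView outlet named captureView; this would live in an Objective-C++ (.mm) file if it has to talk to the Qt/C++ side:

    #import <QTKit/QTKit.h>

    // Set up a capture session on the default video device and show a live preview.
    QTCaptureSession *session = [[QTCaptureSession alloc] init];

    QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    NSError *error = nil;
    if (![device open:&error]) {
        NSLog(@"Could not open device: %@", error);
    }

    QTCaptureDeviceInput *input = [[QTCaptureDeviceInput alloc] initWithDevice:device];
    if (![session addInput:input error:&error]) {
        NSLog(@"Could not add input: %@", error);
    }

    [captureView setCaptureSession:session];   // captureView: a QTCaptureView in the window
    [session startRunning];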
Try OpenCV; there is a good project here: http://code.google.com/p/opencv-cocoa/ that provides C++ classes.