Can I use OS X's AUGraph from MonoMac? - macos

I have an MP3-playing application written in C# which I would like to port to OS X.
As it uses DirectShow to play MP3s, I realise that I will need to recode the audio playback part. I found Apple's PlayFile sample, which uses AUGraph.
The Binding Cocoa section of http://www.mono-project.com/MonoMac mentions the "much simpler AudioToolBox API".
Can anyone point me at sample code for using AudioToolbox from C#, or preferably for using AUGraph from C#?
Is porting my code to MonoMac the best approach, or would I be better off taking the plunge and recoding in Objective-C?

Here are some samples of using AudioUnit through the same API that MonoMac has, except that these samples target the iPhone using MonoTouch:
https://github.com/migueldeicaza/MonoTouch.AudioUnit
Setting up AudioUnit is a little cumbersome. If all you want is to play MP3 files without doing any low-level processing or applying effects, you can use the MonoMac.AppKit.NSSound API instead.

The page you linked does say that AudioToolbox (i.e. Core Audio) is fully bound. I don't know of any samples, but it shouldn't be hard to port C code.
Alternatively, you could ask on the mono-osx mailing list and request a binding of QTKit, or even do the binding yourself. I hear that the MonoMac binding generator makes it quite easy to bind Objective-C APIs.
It's going to be much quicker and easier to use your existing C# code and knowledge, even if you do have to do some bindings yourself.

AUGraph is part of Core Audio. It is used to assemble a graph of audio units and can be used for audio playback. Core Audio is a low-level framework with a C API.
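To give a feel for that C API, here is a minimal, untested sketch of the kind of graph the PlayFile sample builds: an AUFilePlayer generator node feeding the default output unit. Error handling, and the AudioFileOpenURL/file-scheduling calls that actually hand the MP3 to the player unit, are omitted for brevity.

    /* Build: cc playgraph.c -framework AudioToolbox -framework CoreFoundation */
    #include <AudioToolbox/AudioToolbox.h>
    #include <CoreFoundation/CoreFoundation.h>

    int main(void) {
        AUGraph graph;
        AUNode playerNode, outputNode;
        AudioComponentDescription desc = { 0 };
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        NewAUGraph(&graph);

        /* Generator node that reads and decodes the audio file. */
        desc.componentType = kAudioUnitType_Generator;
        desc.componentSubType = kAudioUnitSubType_AudioFilePlayer;
        AUGraphAddNode(graph, &desc, &playerNode);

        /* Default hardware output node. */
        desc.componentType = kAudioUnitType_Output;
        desc.componentSubType = kAudioUnitSubType_DefaultOutput;
        AUGraphAddNode(graph, &desc, &outputNode);

        /* Wire player output 0 into output input 0, then start everything.
           A real player would now open the file with AudioFileOpenURL and
           schedule it on the player unit, as the PlayFile sample does. */
        AUGraphOpen(graph);
        AUGraphConnectNodeInput(graph, playerNode, 0, outputNode, 0);
        AUGraphInitialize(graph);
        AUGraphStart(graph);

        CFRunLoopRunInMode(kCFRunLoopDefaultMode, 10 /* seconds */, false);

        AUGraphStop(graph);
        DisposeAUGraph(graph);
        return 0;
    }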
Maybe you can use QTKit (a Cocoa wrapper around QuickTime) from Mono.
In my opinion it is always best to work with a platform's "native" technology (which, for Mac OS X, means Objective-C and Cocoa).
Apple has a nice sample that shows how to create a media player using QTKit:
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/QTKitApplicationTutorial/Introduction/Introduction.html#//apple_ref/doc/uid/TP40008155-CH1-SW1

Related

Porting a Win32 MIDI SysEx application to Mac OS X

What is the easiest way to port a Win32 MIDI SysEx application (a configuration program) to Mac OS X?
The application itself is written in Qt, but I have no experience with the OS X MIDI APIs. Are there good enough drop-in replacements for calls like:
midiInOpen
midiOutOpen
midiOutPrepareHeader
midiOutLongMsg
and a couple more? Is there a decent source of information for someone who has never programmed on Mac OS X and wants to develop MIDI SysEx applications? CoreAudio?
I found a great little (just one .cpp file + headers) MIDI library, cross-platform and all :)
It's also a great source to analyse and to learn from... a little nugget in a largely undocumented field.
http://www.music.mcgill.ca/~gary/rtmidi/index.html
What is your development platform? If you're writing a native Cocoa application for the Mac, Apple provides a complete framework for dealing with MIDI traffic, named CoreMIDI. The CoreMIDI framework covers the whole MIDI package (including SysEx) and even extends it with network support.
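As a rough, untested sketch of how the Win32 calls map over (assuming the first MIDI destination is your device, and using a standard identity-request SysEx message): MIDIClientCreate/MIDIOutputPortCreate play the role of midiOutOpen, while MIDISendSysex replaces the midiOutPrepareHeader/midiOutLongMsg pair.

    /* Build: cc sysex.c -framework CoreMIDI -framework CoreFoundation */
    #include <CoreMIDI/CoreMIDI.h>
    #include <unistd.h>

    /* Invoked by CoreMIDI when the asynchronous SysEx transfer completes. */
    static void sysexDone(MIDISysexSendRequest *req) { }

    int main(void) {
        MIDIClientRef client;
        MIDIPortRef outPort;
        MIDIClientCreate(CFSTR("SysExDemo"), NULL, NULL, &client);
        MIDIOutputPortCreate(client, CFSTR("out"), &outPort);
        /* outPort + MIDISend cover short messages (the midiOutShortMsg side);
           SysEx goes straight to an endpoint via MIDISendSysex. */

        static Byte identityRequest[] = { 0xF0, 0x7E, 0x7F, 0x06, 0x01, 0xF7 };
        static MIDISysexSendRequest req;
        req.destination      = MIDIGetDestination(0); /* first attached device */
        req.data             = identityRequest;
        req.bytesToSend      = sizeof identityRequest;
        req.complete         = false;
        req.completionProc   = sysexDone;
        req.completionRefCon = NULL;
        MIDISendSysex(&req);                /* ~ midiOutLongMsg */

        while (!req.complete)               /* wait for the async send */
            usleep(10000);
        MIDIClientDispose(client);
        return 0;
    }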
I recommend having a look at Pete Goodliffe's blog post on using CoreMIDI on iOS devices. Although you're not developing for iOS, there is a lot of CoreMIDI-related information there.
There is a simple yet brilliant application that I use a lot in my studio called MIDI Monitor, which is open source. I recommend having a look at it too.

Streaming from webcam on OS X - what technology to use?

I'm building a videoconferencing application on OS X.
What technology would be best to use for real-time streaming video/audio captured from webcam/microphone in OS X?
So far I have been unsuccessful with these approaches:
using QTKit I captured the media, but there isn't a way to stream it (without using QTSS, which is too bloated and hard to control programmatically)
using QT Java I got everything (almost) working, but the library is deprecated, it crashes every once in a while, shows signs of leaking memory, and there isn't a way to save preferences from a settings dialog
I installed GStreamer using MacPorts, but there isn't a working osxvideosrc (or audio source, for that matter)
My next target is VLC, because it can access the webcam on OS X, but I'm not sure it will give me what I need: can I control it fully over an API, and can I display the stream inside a Cocoa application (using QTKit's player)?
Couple of points:
Consider Flex/Flash and possibly Adobe AIR. Many people have written videoconferencing applications this way.
QuickTime for Java is dated and not going anywhere.
VLC is a solid option. Stable, well known, powerful, and very mature.

Video capture on Mac OS X

I'm writing a C++ application with Trolltech's Qt library, and I need to capture a video stream from a camera and some medical instrumentation.
What kind of hardware can I use to do this? I've tried OpenCV, but it doesn't recognize my EyeTV 250.
Can I use Pinnacle Video capture for Mac?
thanks,
Andrea
I believe that Qt delegates to QuickTime for media on OS X. I'd therefore expect that any hardware supported by QuickTime is in play. If you're willing to be locked to OS X, using the native API will likely be much easier. QTKit, the Objective-C API for QuickTime, is new with Leopard (OS X 10.5) and is very good. You'll likely want to start with QTKit's Capture API. Since you're working with C++, you'll also want to learn about Objective-C++ for building the connection between QTKit and your code.
Try OpenCV; there is a good project here: http://code.google.com/p/opencv-cocoa/ with C++ classes.

Programmatically stream audio in Cocoa on the Mac

How do I go about programmatically creating audio streams using Cocoa on the Mac? Say I want to make a white-noise generator using the core frameworks on Mac OS X in a Cocoa app.
One way is to use Core Audio's DefaultOutputUnit.
You can configure it with parameters such as output sampling rate, resolution, and output sample format. Then you can programmatically create a raw sound wave and provide it to the output unit.
Take a look at the example on your machine at /Developer/Examples/CoreAudio/SimpleSDK/DefaultOutputUnit/, which uses the default output unit to play a programmatically rendered sine wave. Using that as a starting point, you can write a routine to render anything else to the output.
The /Developer/Examples/CoreAudio/ directory also contains tons of other Core Audio examples.
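For a flavour of the technique, here is a minimal, untested sketch in the same spirit: the default output unit pulling white noise from a render callback (assuming the unit's default Float32 sample format; error checking omitted).

    /* Build: cc noise.c -framework AudioUnit -framework AudioToolbox */
    #include <AudioUnit/AudioUnit.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Called by the output unit whenever it needs more samples. */
    static OSStatus renderNoise(void *refCon, AudioUnitRenderActionFlags *flags,
                                const AudioTimeStamp *ts, UInt32 bus,
                                UInt32 nFrames, AudioBufferList *ioData) {
        for (UInt32 b = 0; b < ioData->mNumberBuffers; b++) {
            Float32 *out = (Float32 *)ioData->mBuffers[b].mData;
            UInt32 n = ioData->mBuffers[b].mDataByteSize / sizeof(Float32);
            for (UInt32 i = 0; i < n; i++)      /* quiet white noise */
                out[i] = ((Float32)rand() / RAND_MAX) * 0.2f - 0.1f;
        }
        return noErr;
    }

    int main(void) {
        AudioComponentDescription desc = { 0 };
        desc.componentType = kAudioUnitType_Output;
        desc.componentSubType = kAudioUnitSubType_DefaultOutput;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        AudioComponent comp = AudioComponentFindNext(NULL, &desc);
        AudioUnit unit;
        AudioComponentInstanceNew(comp, &unit);

        AURenderCallbackStruct cb = { renderNoise, NULL };
        AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input, 0, &cb, sizeof cb);

        AudioUnitInitialize(unit);
        AudioOutputUnitStart(unit);
        sleep(5);                       /* play for five seconds */
        AudioOutputUnitStop(unit);
        AudioComponentInstanceDispose(unit);
        return 0;
    }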
Look at Audio Queue Services.

Simple audio input API on a Mac?

I'd like to pull a stream of PCM samples from a Mac's line-in or built-in mic and do a little live analysis (the exact nature doesn't pertain to this question, but it could be an FFT every so often, or some basic statistics on the sample levels, or what have you).
What's a good fit for this? Writing an AudioUnit that just passes the sound through and incidentally hands it off somewhere for analysis? Writing a JACK-aware app and figuring out how to get it to play with the JACK server? Ecasound?
This is a cheesy proof-of-concept hobby project, so simplicity of API is the driving factor (followed by reasonable choice of programming language).
The principal framework for audio development in Mac OS X is Core Audio; it's the basis for all audio I/O. There are layers on top of it like Audio Toolbox, Audio Queue Services, QuickTime, and QTKit that you can use if you want a simplified API for common tasks.
To just pull a stream of samples, you'd probably want to use Audio Queue Services; the AudioQueueNewInput function will set up recording of PCM data and pass it to a callback you supply.
On your Mac there's a set of Core Audio examples in /Developer/Examples/CoreAudio/SimpleSDK that includes an example (AQRecord, in AudioQueueTools) of the Audio Queue Services recording APIs.
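Schematically, an untested sketch of that setup might look like the following (16-bit mono PCM assumed; error checking omitted):

    /* Build: cc record.c -framework AudioToolbox -framework CoreFoundation */
    #include <AudioToolbox/AudioToolbox.h>

    /* The queue hands each filled buffer of PCM data to this callback. */
    static void inputCallback(void *user, AudioQueueRef q, AudioQueueBufferRef buf,
                              const AudioTimeStamp *ts, UInt32 nPackets,
                              const AudioStreamPacketDescription *descs) {
        /* buf->mAudioData holds the samples: run your FFT or level
           statistics here, then hand the buffer back to the queue. */
        AudioQueueEnqueueBuffer(q, buf, 0, NULL);
    }

    int main(void) {
        AudioStreamBasicDescription fmt = { 0 };
        fmt.mSampleRate       = 44100.0;
        fmt.mFormatID         = kAudioFormatLinearPCM;
        fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                              | kLinearPCMFormatFlagIsPacked;
        fmt.mChannelsPerFrame = 1;
        fmt.mBitsPerChannel   = 16;
        fmt.mBytesPerFrame    = 2;
        fmt.mFramesPerPacket  = 1;
        fmt.mBytesPerPacket   = 2;

        AudioQueueRef queue;
        AudioQueueNewInput(&fmt, inputCallback, NULL, NULL, NULL, 0, &queue);

        /* Prime a few buffers, then start pulling from the default input. */
        for (int i = 0; i < 3; i++) {
            AudioQueueBufferRef buf;
            AudioQueueAllocateBuffer(queue, 8192, &buf);
            AudioQueueEnqueueBuffer(queue, buf, 0, NULL);
        }
        AudioQueueStart(queue, NULL);
        CFRunLoopRun();   /* keep the process alive; callbacks arrive on
                             one of the queue's internal threads */
        return 0;
    }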
I think PortAudio is what you need.
Reading from the mic in a console app is a 10-line C file (see the patests in the PortAudio distribution).
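In that spirit, an untested sketch that opens the default input and tracks the peak sample level in the callback:

    /* Build: cc peak.c -lportaudio */
    #include <portaudio.h>
    #include <math.h>
    #include <stdio.h>

    /* PortAudio calls this with each captured buffer of float samples. */
    static int onAudio(const void *input, void *output, unsigned long nFrames,
                       const PaStreamCallbackTimeInfo *time,
                       PaStreamCallbackFlags status, void *user) {
        const float *in = (const float *)input;
        float *peak = (float *)user;
        for (unsigned long i = 0; i < nFrames; i++)
            if (fabsf(in[i]) > *peak) *peak = fabsf(in[i]);
        return paContinue;
    }

    int main(void) {
        float peak = 0.0f;
        PaStream *stream;
        Pa_Initialize();
        Pa_OpenDefaultStream(&stream, 1 /* in */, 0 /* out */, paFloat32,
                             44100, 256, onAudio, &peak);
        Pa_StartStream(stream);
        Pa_Sleep(3000);                 /* capture for three seconds */
        Pa_StopStream(stream);
        Pa_CloseStream(stream);
        Pa_Terminate();
        printf("peak level: %f\n", peak);
        return 0;
    }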
Apple provides sample code for reading and writing audio data. Additionally there is a lot of good information in the Audio section of the Apple Developer site.
