Streaming from webcam on OS X - what technology to use? - macos

I'm building a videoconferencing application in OS X.
What technology would be best to use for real-time streaming video/audio captured from webcam/microphone in OS X?
So far I was unsuccessful with these methods:
Using QTKit I captured the media, but there is no way to stream it (short of using QTSS, which is too bloated and hard to control programmatically).
Using QuickTime for Java I got (almost) everything working, but the library is deprecated, it crashes every once in a while, shows signs of memory leaks, and there is no way to save preferences from a settings dialog.
I installed GStreamer using MacPorts, but there is no working osxvideosrc (or audio source, for that matter).
My next target is VLC because it can access the webcam on OS X, but I'm not sure whether it will give me what I need: can I control it fully through an API, and can I display the stream inside a Cocoa application (using QTKit's player)?
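On the API-control question: VLC can be driven in-process through the libVLC C API instead of being shelled out to. A minimal sketch, assuming the `qtcapture://` access module that VLC's Mac builds use for webcam input (check your build's input modules before relying on it); error handling omitted:

```c
#include <vlc/vlc.h>
#include <unistd.h>

int main(void) {
    /* Create a libVLC instance; extra argv entries could add a --sout
       chain here to transcode and stream the capture over the network. */
    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    if (!vlc) return 1;

    /* "qtcapture://" is assumed to be the QuickTime webcam input MRL
       on OS X builds of VLC. */
    libvlc_media_t *media = libvlc_media_new_location(vlc, "qtcapture://");
    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_release(media);

    libvlc_media_player_play(player);
    sleep(10);                           /* run the capture for ten seconds */

    libvlc_media_player_stop(player);
    libvlc_media_player_release(player);
    libvlc_release(vlc);
    return 0;
}
```

For the Cocoa-display part, libvlc_media_player_set_nsobject() can attach the player's video output to an NSView, which avoids going through QTKit's player entirely.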

A couple of points:
Consider Flex/Flash and possibly Adobe AIR. Many people have written videoconferencing applications this way.
QuickTime for Java is dated and not going anywhere.
VLC is a solid option. Stable, well known, powerful, and very mature.

Related

How to write into a MP4 container with Cocoa?

I need to export an MP4 file containing several streams of audio. There is no video at all at this point in time, though it might be requested sometime in the future.
All this is on the Mac, running a decently recent version of Mac OS X. QTKit is not an option. Portability to iOS would be a bonus.
Where should I look? A very casual look at AV Foundation suggests it might be the way, but it doesn't look simple at all.
Or should I rather look for a third-party library? ffmpeg? mp4v2?
Thanks for any suggestion.
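Of the third-party options mentioned, mp4v2 is the lighter one for pure muxing. A sketch of what writing audio tracks looks like with libmp4v2's C API; getNextAacFrame() is a hypothetical source of already-encoded frames, since mp4v2 only muxes and the encoding has to come from elsewhere (ffmpeg's libavcodec, for instance):

```c
#include <stdbool.h>
#include <stdint.h>
#include <mp4v2/mp4v2.h>

/* Hypothetical callback supplying encoded AAC frames one at a time. */
bool getNextAacFrame(const uint8_t **frame, uint32_t *size);

void writeAudioMp4(const char *path) {
    MP4FileHandle file = MP4Create(path, 0);

    /* One MP4AddAudioTrack() call per desired stream;
       1024 samples per AAC frame at 44.1 kHz. */
    MP4TrackId track = MP4AddAudioTrack(file, 44100, 1024,
                                        MP4_MPEG4_AUDIO_TYPE);

    const uint8_t *frame;
    uint32_t size;
    while (getNextAacFrame(&frame, &size))
        MP4WriteSample(file, track, frame, size,
                       MP4_INVALID_DURATION, 0, true);

    MP4Close(file, 0);  /* mp4v2 2.x takes a flags argument here;
                           1.x takes only the handle */
}
```

The same header-only C API also builds for iOS, which covers the portability bonus.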

Can I use OSX AUGraph from monomac?

I have an MP3-playing application written in C# which I would like to port to OS X.
As it uses DirectShow to play MP3s, I realize that I will need to recode the audio playback part. I found Apple's PlayFile sample, which uses AUGraph.
The Binding Cocoa section of http://www.mono-project.com/MonoMac mentions the "much simpler AudioToolBox API".
Can anyone point me at sample code for using AudioToolbox from C#, or preferably using AUGraph from C#?
Is porting my code to MonoMac the best approach, or would I be better off taking the plunge and recoding in Objective-C?
Here are some samples of using AudioUnit through the same API that MonoMac has, except that these samples target the iPhone using MonoTouch:
https://github.com/migueldeicaza/MonoTouch.AudioUnit
Setting up AudioUnit is a little cumbersome; if all you want is to play MP3 files without doing any low-level processing or applying effects, you can use the MonoMac.AppKit.NSSound API instead.
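For reference, the underlying Cocoa call is tiny; in Objective-C it is just:

```objc
#import <AppKit/AppKit.h>

NSURL *url = [NSURL fileURLWithPath:@"/path/to/song.mp3"];  // placeholder path
NSSound *sound = [[NSSound alloc] initWithContentsOfURL:url byReference:NO];
[sound play];   // asynchronous; set a delegate to get sound:didFinishPlaying:
```

The MonoMac binding should mirror those same selectors, so the C# version will be equally short.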
The page you linked does say that AudioToolbox (i.e. CoreAudio) is fully bound. I don't know of any samples but it shouldn't be hard to port C code.
Alternatively you could go onto the mono-osx mailing list and request a binding of QTKit, or even do this binding yourself. I hear that the MonoMac binding generator makes it quite easy to bind Objective-C APIs.
It's going to be much quicker and easier to use your existing C# code and knowledge, even if you do have to do some bindings yourself.
AUGraph is part of Core Audio. It is used to assemble a graph of audio units and can be used for audio playback. Core Audio is a low-level framework with a C API.
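To make that concrete, here is the C skeleton that graph setup boils down to: a sketch with one default-output unit and no error checking, assuming the 10.6-era AudioComponentDescription names (10.5 used the older ComponentDescription). Apple's PlayFile sample adds a file-player unit (kAudioUnitSubType_AudioFilePlayer) the same way and wires it to the output with AUGraphConnectNodeInput().

```c
#include <AudioToolbox/AudioToolbox.h>

static void startOutputGraph(void) {
    AUGraph graph;
    NewAUGraph(&graph);

    /* Describe the unit we want: Apple's default hardware output. */
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_DefaultOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AUNode outputNode;
    AUGraphAddNode(graph, &desc, &outputNode);

    AUGraphOpen(graph);        /* instantiate the audio units            */
    AUGraphInitialize(graph);  /* allocate rendering resources           */
    AUGraphStart(graph);       /* the output unit begins pulling samples */
}
```

Whatever the binding looks like from C#, it will be a one-to-one mapping of these calls, which is why porting the C sample should not be hard.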
Maybe you can use QTKit (a Cocoa wrapper around QuickTime) from Mono.
In my opinion, it is always best to work with a platform's "native" technology (which, for Mac OS X, means Objective-C and Cocoa).
Apple has a nice sample that shows how to create a media player using QTKit:
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/QTKitApplicationTutorial/Introduction/Introduction.html%23//apple_ref/doc/uid/TP40008155-CH1-SW1

I need to develop a project involving hardware which should work the same on Windows as well as Macs. What's the way forward?

What's the best (read: most painless) approach I could take?
Primarily, the application needs to record webcam video plus microphone audio to disk and compress the video using ffmpeg (or something similar).
So there is hardware involved, plus running a separate process for encoding.
I was seriously considering Adobe AIR, but I read on the Adobe site that it does not have permission to run other applications, which is problematic if I want to encode the video using ffmpeg.
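On the Mac side at least, running ffmpeg as a separate encoder process is only a few lines with NSTask. A sketch; the ffmpeg path and flags here are illustrative, not prescribed:

```objc
#import <Foundation/Foundation.h>

// Compress a finished capture by launching an external ffmpeg process.
NSTask *encode = [[NSTask alloc] init];
[encode setLaunchPath:@"/usr/local/bin/ffmpeg"];   // wherever ffmpeg is installed
[encode setArguments:[NSArray arrayWithObjects:
    @"-i", @"capture.mov", @"-vcodec", @"libx264", @"out.mp4", nil]];
[encode launch];
[encode waitUntilExit];   // or observe NSTaskDidTerminateNotification instead
```

A Java port would do the same thing with ProcessBuilder, which keeps the capture-then-encode two-process design portable across both platforms.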
Did you consider developing it in Java? In that case you should take a look at the Eclipse Rich Client Platform. I have developed a couple of programs using Eclipse RCP and would never develop a Java app without it. It uses SWT and JFace and provides options for exporting the app to run on OS X, Linux, and Windows.
You should give it a try.
If you can develop it under Mono, much of it will work on both platforms.
Qt. Simple as pie.

User-friendly approach for network streaming, playing and seeking of audio files in Mac OS X 10.5

Please advise on a combination of server and client technologies, tools, and frameworks for a solution that meets the following requirements:
A file server on the network has a huge library of mp3/aac/aiff/wav music files.
A desktop Cocoa application accesses the audio files via URLs (RTMP, HTTP, RTSP+RTP, FTP): how do I make a choice?
Audio content should be streamed and played with seeking (this is crucial) without downloading the entire file (QuickTime, AudioQueue, AudioFile, AudioStream, CFHTTP, all of them?): how do I develop the client?
After solid research I've ended up with myriad options and articles, but it looks like half of them are quite out of date (2001-2005) and the other half is about universal code (pure C) for Mac OS X and iPhone OS.
However, the main goal here is to write a desktop music player for Mac OS X 10.5.
I cannot believe that all this raw C-coding is just required.
No wrappers? No handy libraries? No components?
P.S. Research has resulted in the following combination: qt_tools for hinting + DSS for RTSP streaming + QTMovie for playback + setCurrentTime: for seeking. This selection requires double the space (a hinted .MOV version of every music file must be stored), but it works.
I am not sure, but I believe you can use [QTMovie movieWithURL:url error:&err] to stream a movie from a URL, then pass it to a QTMovieView object. QuickTime treats audio like movies, so it may work. Or it may try to load the entire file.
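In Objective-C that amounts to something like the following sketch; the URL is a placeholder, movieView is assumed to be an existing QTMovieView outlet, and whether QuickTime streams or downloads depends on the server and protocol:

```objc
#import <QTKit/QTKit.h>

NSError *err = nil;
NSURL *url = [NSURL URLWithString:@"rtsp://server.example/song.mov"];  // placeholder
QTMovie *movie = [QTMovie movieWithURL:url error:&err];

[movieView setMovie:movie];                // hand the movie to the view
[movie play];
[movie setCurrentTime:QTMakeTime(30, 1)];  // seek to 30 s (time value, timescale)
```

Since seeking is the crucial requirement, note that setCurrentTime: works here exactly as in the qt_tools + DSS setup from the P.S.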
Have a look at the QuickTime Streaming Guide.
Did you look at VLC as a streaming solution?

Video capture on MacOS

I'm writing a C++ application with the Trolltech Qt library, and I need to capture a video stream from a camera and some medical instrumentation.
What kind of hardware can I use to do this? I've tried OpenCV, but it doesn't recognize my EyeTV 250.
Can I use Pinnacle Video Capture for Mac?
Thanks,
Andrea
I believe that Qt delegates to QuickTime for media on OS X, so I'd expect any hardware supported by QuickTime to be in play. If you're willing to be locked to OS X, using the native API will likely be much easier. QTKit, the Objective-C API for QuickTime, is new with Leopard (OS X 10.5) and is very good. You'll likely want to start with QTKit's Capture API. Since you're working with C++, you'll also want to learn about Objective-C++ for building the bridge between QTKit and your code.
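Getting frames flowing with QTKit's Capture API is only a handful of calls. A sketch with error handling trimmed for brevity:

```objc
#import <QTKit/QTKit.h>

NSError *error = nil;
QTCaptureSession *session = [[QTCaptureSession alloc] init];

// Grab the default camera and open it for capture.
QTCaptureDevice *camera =
    [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
[camera open:&error];

QTCaptureDeviceInput *input =
    [[QTCaptureDeviceInput alloc] initWithDevice:camera];
[session addInput:input error:&error];

[session startRunning];   // frames now flow to any attached outputs/views
```

From there, a QTCaptureView gives you an on-screen preview, and a QTCaptureDecompressedVideoOutput hands you raw frames that Objective-C++ can forward into your C++ processing code.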
Try OpenCV; there is a good project here: http://code.google.com/p/opencv-cocoa/ with C++ classes.
