Capturing Audio buffer in Cocoa

Hello All
In my application I need to enable voice communication over IP, i.e. capture the audio buffer and send it over the Internet through a secure socket.
At the lower layers everything is ready; I need the entry point to start the voice communication, but I'm not finding any pointers in the Apple documentation. So far I have done the following:
1 -- In the Apple documentation I am going through the Core Audio programming guide; is this the right place to start?
2 -- If yes, somewhere it says I need to download the Core Audio SDK. Won't that come along with the standard Xcode and Cocoa frameworks?
Kind Regards
Rohan

1: The Core Audio programming guide is the right place to start.
There are also very good samples in the Mac Dev Center, and some iPhone examples may be portable/useful as well (since Core Audio on the iPhone is basically a subset of Core Audio on the Mac).
2: You don't need to download anything else. If you have Xcode, just add CoreAudio.framework to your project and add #import <CoreAudio/CoreAudio.h>.
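Once the framework is in place, one common entry point for grabbing microphone buffers is an Audio Queue, which lives in AudioToolbox.framework (so add that as well). A minimal capture sketch in C, assuming 16 kHz mono 16-bit PCM; the secure-socket send is left as a comment since that layer is your own:

    #include <AudioToolbox/AudioToolbox.h>

    /* Called by the audio queue each time a buffer of captured samples is
       ready. In a VoIP app this is where you would hand the bytes in
       inBuffer->mAudioData to your secure-socket layer. */
    static void InputCallback(void *userData, AudioQueueRef queue,
                              AudioQueueBufferRef inBuffer,
                              const AudioTimeStamp *startTime,
                              UInt32 numPackets,
                              const AudioStreamPacketDescription *packetDescs)
    {
        /* sendOverSocket(inBuffer->mAudioData, inBuffer->mAudioDataByteSize); */
        AudioQueueEnqueueBuffer(queue, inBuffer, 0, NULL); /* recycle buffer */
    }

    int main(void)
    {
        /* 16 kHz, mono, 16-bit signed linear PCM -- a typical VoIP format. */
        AudioStreamBasicDescription fmt = {0};
        fmt.mSampleRate       = 16000.0;
        fmt.mFormatID         = kAudioFormatLinearPCM;
        fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                              | kLinearPCMFormatFlagIsPacked;
        fmt.mChannelsPerFrame = 1;
        fmt.mBitsPerChannel   = 16;
        fmt.mBytesPerFrame    = 2;
        fmt.mFramesPerPacket  = 1;
        fmt.mBytesPerPacket   = 2;

        AudioQueueRef queue;
        AudioQueueNewInput(&fmt, InputCallback, NULL, NULL, NULL, 0, &queue);

        /* Give the queue a few buffers to fill. */
        for (int i = 0; i < 3; i++) {
            AudioQueueBufferRef buf;
            AudioQueueAllocateBuffer(queue, 4096, &buf);
            AudioQueueEnqueueBuffer(queue, buf, 0, NULL);
        }

        AudioQueueStart(queue, NULL);
        CFRunLoopRun(); /* keep the process alive while capturing */
        return 0;
    }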

Nacho4d is right. You don't need to download anything. The CoreAudio.framework is included with Xcode.
As for learning how to work with Core Audio in general, Apple's documentation may be painful to use if you are not a seasoned programmer. Only one book has been written on the subject so far: it's called Core Audio (Rough Cuts), written by Kevin Avila and Chris Adamson for Addison-Wesley Professional. It should help you with the basics.

Related

How do I change the Hot Word (wake word) on Google Assistant Raspberry Pi 3 (Non-AIY Kit)

I've found some great tutorials around the internet on how to change the wake word on the AIY Kit for the Raspberry Pi (Here's a good video of that), but cannot find one anywhere on how to change the hotword for a regular RPi 3 with the API setup.
I'm in the process of writing a beginners' tutorial on this and would love any and all advice on how to do this without the AIY kit. For reference and comparison, the AIY kit strategy is on that linked tutorial above in Part 5 using Snowboy's hotword creator.
Thanks in advance everyone!
You should be able to use Snowboy as long as you have some way of recording raw WAV audio. I have done some work on hotword and voice activation for the AIY Voice Kit, which may be of help. Basically, get your audio into a queue in one thread, then take the audio off the other end and pass it to Snowboy; a sketch of that pattern is below. The activation is all done locally. Once activated, you can carry on sending to Google. Hope this helps.
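A minimal sketch of that two-thread queue arrangement, in C with POSIX threads. Both read_mic_frame() and hotword_detect() here are hypothetical stubs: the first stands in for whatever recording API you use (ALSA, PortAudio, ...), the second for Snowboy's real detection call (Snowboy actually ships a C++ class, snowboy::SnowboyDetect, plus Python bindings):

    #include <pthread.h>
    #include <stdio.h>
    #include <string.h>

    #define FRAME_SAMPLES 160   /* 10 ms of 16 kHz mono 16-bit audio */
    #define QUEUE_FRAMES  64

    static short  ring[QUEUE_FRAMES][FRAME_SAMPLES];
    static int    head = 0, tail = 0, count = 0;
    static pthread_mutex_t lock     = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  nonempty = PTHREAD_COND_INITIALIZER;

    /* Stub: replace with your recording API. */
    static void read_mic_frame(short *frame) {
        memset(frame, 0, FRAME_SAMPLES * sizeof(short));
    }

    /* Stub: replace with Snowboy's detector (RunDetection() in the real
       C++/Python API). Returns 1 when the hotword is heard. */
    static int hotword_detect(const short *frame, int samples) {
        (void)frame; (void)samples;
        return 0;
    }

    /* Producer: the capture thread pushes raw frames into the queue. */
    static void *capture_thread(void *arg)
    {
        short frame[FRAME_SAMPLES];
        for (;;) {
            read_mic_frame(frame);
            pthread_mutex_lock(&lock);
            if (count < QUEUE_FRAMES) {          /* drop frames when full */
                memcpy(ring[head], frame, sizeof(frame));
                head = (head + 1) % QUEUE_FRAMES;
                count++;
                pthread_cond_signal(&nonempty);
            }
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    /* Consumer: the detection thread pops frames and feeds the detector. */
    static void *detect_thread(void *arg)
    {
        short frame[FRAME_SAMPLES];
        for (;;) {
            pthread_mutex_lock(&lock);
            while (count == 0)
                pthread_cond_wait(&nonempty, &lock);
            memcpy(frame, ring[tail], sizeof(frame));
            tail = (tail + 1) % QUEUE_FRAMES;
            count--;
            pthread_mutex_unlock(&lock);

            if (hotword_detect(frame, FRAME_SAMPLES))
                printf("hotword! start streaming to Google\n");
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t cap, det;
        pthread_create(&cap, NULL, capture_thread, NULL);
        pthread_create(&det, NULL, detect_thread, NULL);
        pthread_join(cap, NULL);   /* threads run until killed */
        return 0;
    }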

Is there an official Apple API for streaming from a Mac app to Apple TV?

I have been searching high and low, but have so far been unsuccessful. I am now turning to the Stack Overflow community for advice.
My goal is to build a Mac app in Xcode that will allow me to send MP4 content from my Mac (or from a public URL on the web) to my Apple TV.
I have located numerous classes within the iOS frameworks that enable this (quite easily, it seems), but the trail ends there. It just seems like there is no API to do the same from OS X, but I am hoping that I have overlooked something :-) There seem to be well-established methods for sending audio to AirPlay-enabled devices, but not video?
I am aware of the third-party specification of the protocol at http://nto.github.io/AirPlay.html, and it looks like a tangible plan B for my needs, but I would appreciate any pointers if anyone knows of a more official way.

A proper way of handling audio I/O device attach and detach on Mac

I have been looking for something similar to
AVCaptureDeviceWasConnectedNotification or AVCaptureDeviceWasDisconnectedNotification
but for audio playback and recording devices. I have done a quick Google search, but I haven't found any relevant answer to this question. There is a sample from Apple, AVCaptureToAudioUnitOSX, but it does not handle multiple audio routes.
I wonder if anyone has an idea about it?
You can do this at the IOKit level by adding IOKit matching notifications for audio devices being hot-plugged or unplugged. [Caveat: I've not done this in about a decade; higher-level APIs might exist.]
Apple's documentation for this is here, particularly the section
"Getting Notifications of Device Arrival and Departure".
This is quite hard work. I advise you to get the IORegistryExplorer tool, which I believe is now in the Hardware IO Tools for Xcode download package (Xcode->Open Developer Tool->More Developer Tools). You can use it to work out the matching criteria; a sketch follows below.
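A sketch of the IOKit route in C, matching on the IOAudioDevice class; treat the class name and matching criteria as assumptions to verify in IORegistryExplorer:

    #include <stdio.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <IOKit/IOKitLib.h>

    static void DevicesAdded(void *refcon, io_iterator_t iter)
    {
        io_object_t dev;
        while ((dev = IOIteratorNext(iter)) != IO_OBJECT_NULL) {
            io_name_t name;
            IORegistryEntryGetName(dev, name);
            printf("audio device arrived: %s\n", name);
            IOObjectRelease(dev);
        }
    }

    static void DevicesRemoved(void *refcon, io_iterator_t iter)
    {
        io_object_t dev;
        while ((dev = IOIteratorNext(iter)) != IO_OBJECT_NULL) {
            printf("audio device departed\n");
            IOObjectRelease(dev);
        }
    }

    int main(void)
    {
        IONotificationPortRef port = IONotificationPortCreate(kIOMasterPortDefault);
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           IONotificationPortGetRunLoopSource(port),
                           kCFRunLoopDefaultMode);

        /* Each call to IOServiceAddMatchingNotification consumes one
           reference to the matching dictionary, so retain it once for
           the second call. */
        CFMutableDictionaryRef match = IOServiceMatching("IOAudioDevice");
        CFRetain(match);

        io_iterator_t added, removed;
        IOServiceAddMatchingNotification(port, kIOFirstMatchNotification, match,
                                         DevicesAdded, NULL, &added);
        DevicesAdded(NULL, added);       /* drain the iterator to arm it */

        IOServiceAddMatchingNotification(port, kIOTerminatedNotification, match,
                                         DevicesRemoved, NULL, &removed);
        DevicesRemoved(NULL, removed);

        CFRunLoopRun();
        return 0;
    }

The higher-level route alluded to in the caveat does exist: CoreAudio fires a property listener (AudioObjectAddPropertyListener with the kAudioHardwarePropertyDevices selector on kAudioObjectSystemObject) whenever the device list changes.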

How to host audio units in OS X

I've been looking through the Audio Unit documentation for OS X, and I haven't found any good resources about how to host an audio unit on OS X. There are lots of resources for how to build audio units, and some about hosting on iOS. Has anyone seen a good document to this effect?
Thanks
I'm not sure how the samples have changed over the years... it looks like the current related samples don't address the issue in depth.
In the past, there were a few very basic examples which shipped with the DevTools distributions. IIRC, these were distributed in DEVTOOLS/Examples/ or DEVTOOLS/Extras/.
The AU APIs haven't changed a whole lot since they were written (fast dispatch was added along the way...), so the samples should help with the backend. They were written in the era when Cocoa UIs were still very new to AUs, so the frontend will be the aspect that's changed the most.
You may want to look for them in your Xcode 3.0 or Xcode 2.5 installs.
This doc [1] shows how to open audio units from within an OS X app. It doesn't cover the general case of hosting audio units, though.
[1] http://developer.apple.com/library/mac/technotes/tn2091/_index.html
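For the backend basics, a minimal hosting sketch in C using the AudioComponent API (the 10.6+ replacement for the older Component Manager calls that TN2091 and the early samples use); it enumerates the installed effect units, then opens and starts Apple's default output unit:

    #include <AudioUnit/AudioUnit.h>

    int main(void)
    {
        /* Enumerate installed effect units: zeroed fields act as wildcards. */
        AudioComponentDescription wild = { .componentType = kAudioUnitType_Effect };
        AudioComponent comp = NULL;
        while ((comp = AudioComponentFindNext(comp, &wild)) != NULL) {
            CFStringRef name = NULL;
            if (AudioComponentCopyName(comp, &name) == noErr && name) {
                CFShow(name);       /* prints e.g. "Apple: AUDelay" */
                CFRelease(name);
            }
        }

        /* Open, initialize and start one concrete unit: the default output. */
        AudioComponentDescription desc = {
            .componentType         = kAudioUnitType_Output,
            .componentSubType      = kAudioUnitSubType_DefaultOutput,
            .componentManufacturer = kAudioUnitManufacturer_Apple,
        };
        AudioUnit unit;
        AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), &unit);
        AudioUnitInitialize(unit);
        /* ...install a render callback or connect units here... */
        AudioOutputUnitStart(unit);
        CFRunLoopRunInMode(kCFRunLoopDefaultMode, 2.0, false); /* run briefly */
        AudioOutputUnitStop(unit);
        AudioUnitUninitialize(unit);
        AudioComponentInstanceDispose(unit);
        return 0;
    }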

Playing Sound With OpenAL

What's the best way to play a sound using OpenAL in Xcode 3.2.2 on the 3.1.2 SDK?
I'm pulling my brains out at the moment. I've followed Ben Britten and Mike Daley's tutorials on OpenAL and I've implemented all the things needed to play sound. Basically I created a sound manager class with the help of their fantastic tutorials.
The only problem is I get a SIGABRT error. The app doesn't even load when I try to initialise the sounds.
I'm making a drum application. The app works fine until I try to play the sound,
so I've decided I may need to start fresh again (since before I was playing sound using AudioServicesPlaySystemSound, but that is very slow and not ideal for games programming).
Can someone please help and tell me the best way to play sound using OpenAL?
Thanks
I need OpenAL so I can use stuff like pitch control.
You can use Finch, a simple iOS SFX engine written atop OpenAL. It has very low latency, so it should be a good fit for a drumming application, and it also offers pitch control.
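If you'd rather stay on raw OpenAL, here is a minimal playback sketch in C; the PCM data is assumed to be mono 16-bit and already decoded by your own loader, and device/context cleanup is omitted for brevity:

    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>

    /* Play an already-decoded mono 16-bit PCM buffer at a given pitch. */
    void PlayWithPitch(const short *samples, ALsizei sampleCount, float pitch)
    {
        /* One-time setup in a real app; done inline here for brevity. */
        ALCdevice  *device  = alcOpenDevice(NULL);           /* default device */
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        ALuint buffer, source;
        alGenBuffers(1, &buffer);
        alBufferData(buffer, AL_FORMAT_MONO16, samples,
                     sampleCount * (ALsizei)sizeof(short), 44100);

        alGenSources(1, &source);
        alSourcei(source, AL_BUFFER, (ALint)buffer);
        alSourcef(source, AL_PITCH, pitch);   /* 1.0f = original pitch */
        alSourcePlay(source);
    }

The alSourcef(source, AL_PITCH, pitch) call is what gives you the pitch control mentioned above: 0.5 plays an octave down, 2.0 an octave up.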
