Playing audio while dictation is active in Mavericks - core-audio

In Mavericks Apple introduced "Enhanced Dictation" -- the ability to transcribe speech into text locally, in offline mode. Unfortunately, they also introduced another feature: while dictation is active, all sound output is muted. A bit of digging revealed that the "muted" sound is actually still being played. For example, Audio Hijack captures the sound as it should be played and saves it to a file.
I'm making an application that requires sound output during dictation (I'm assuming the user is wearing headphones). It does not look like the volume settings are being changed: querying the master volume level on the headphone device shows the same value before and during dictation, and the sound volume indicator in the menu bar does not change either. As far as the rest of the system is concerned, the sound is playing.
I'm a CoreAudio noob. I can do basic things with recording and playback, but not much more. Is it possible to get the "muted" sound back? Is there a switch, a flag, or a feature in CoreAudio that would let the sound from my application reach the headphones while dictation is active?
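For reference, the volume check mentioned above can be done with a few CoreAudio property calls. This is only a minimal sketch: it assumes the headphones are the current default output device and queries channel 1, since not every device publishes a master volume element.

import CoreAudio

// Minimal sketch: read the default output device, then its volume scalar.
// Assumes the headphones are the current default output; checks are minimal.
var deviceID = AudioDeviceID(0)
var size = UInt32(MemoryLayout<AudioDeviceID>.size)
var address = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultOutputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMaster)
_ = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                               &address, 0, nil, &size, &deviceID)

var volume = Float32(0)
size = UInt32(MemoryLayout<Float32>.size)
address.mSelector = kAudioDevicePropertyVolumeScalar
address.mScope = kAudioDevicePropertyScopeOutput
address.mElement = 1   // channel 1; some devices do not publish a master element
let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &volume)
if status == noErr {
    print("Output volume: \(volume)")   // same value before and during dictation
}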

For anyone who stumbles on this page: I did eventually find an answer. You can disable audio ducking by setting the following user defaults:
defaults write com.apple.SpeechRecognitionCore AllowAudioDucking -bool NO
defaults write com.apple.speech.recognition.AppleSpeechRecognition.prefs DictationIMAllowAudioDucking -bool NO
See the detailed explanation on YouTube.
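If you need to flip the same switches from inside an app rather than from Terminal, a minimal sketch using CFPreferences (assuming a non-sandboxed process that is allowed to write to these preference domains) could look like this; the domains and keys are the ones from the defaults commands above.

import Foundation

// Minimal sketch: write the same two defaults from code instead of Terminal.
// Assumes a non-sandboxed app; domains and keys are the ones shown above.
let duckingDefaults: [(domain: CFString, key: CFString)] = [
    (domain: "com.apple.SpeechRecognitionCore" as CFString,
     key: "AllowAudioDucking" as CFString),
    (domain: "com.apple.speech.recognition.AppleSpeechRecognition.prefs" as CFString,
     key: "DictationIMAllowAudioDucking" as CFString)
]

for entry in duckingDefaults {
    CFPreferencesSetValue(entry.key, kCFBooleanFalse, entry.domain,
                          kCFPreferencesCurrentUser, kCFPreferencesAnyHost)
    _ = CFPreferencesSynchronize(entry.domain,
                                 kCFPreferencesCurrentUser, kCFPreferencesAnyHost)
}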

Related

Manually delay audio in Chrome or system-wide (Mac OS X)

I got new Bluetooth speakers, and they seem to have an inherent delay of about 0.5 to 1 s when I watch streams or videos online. I already emailed the manufacturer and was told that this has to do with how they use the Bluetooth protocol (a pair of speakers in master-slave mode for stereo sound) and with how the respective video player does its encoding/decoding. iTunes, for instance, seems to be just fine, while VLC and all streams in browsers have this delay.
So I was wondering whether there is a way to manually delay audio, either just in the browser (Chrome) or even system-wide on Mac OS X. It would be great if the solution were temporary (easy to switch off), since I do not want the delay when I am not using these speakers.
Additionally, it would be perfect if somebody even knew how to do this on iOS, although I don't think it is possible there; that's why I did not include iOS in the title.
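There is no stock system-wide delay setting to point to here. For audio that your own code plays, though, a fixed offset can be added with an AVAudioUnitDelay node -- a minimal sketch only, which does not help with Chrome or other apps; "music.mp3" is a placeholder path.

import AVFoundation

// Minimal sketch: play one file through a fixed delay.
// This only affects audio played through this engine, not other apps.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let delay = AVAudioUnitDelay()
delay.delayTime = 0.75   // seconds; tune to match the Bluetooth lag
delay.feedback = 0       // no echo, just a fixed offset
delay.wetDryMix = 100    // output 100% delayed signal

engine.attach(player)
engine.attach(delay)
engine.connect(player, to: delay, format: nil)
engine.connect(delay, to: engine.mainMixerNode, format: nil)

let file = try! AVAudioFile(forReading: URL(fileURLWithPath: "music.mp3"))
try! engine.start()
player.scheduleFile(file, at: nil)
player.play()
// In a command-line sketch, keep the process alive (e.g. RunLoop.main.run())
// long enough to hear the output.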

Lower security setting dialog after changing sandbox settings

I am developing an app which plays MIDI files on the Mac.
When I activated sandboxing I couldn't hear any MIDI playback; after googling a bit I found out that I need to add some entries to my entitlements.plist file.
So, following a forum post I saw, I added
com.apple.security.temporary-exception.audio-unit-host
Now I get a dialog box asking me to lower security settings every time I run the app, on any Mac. This is not desirable; how can I disable it?
The behavior you see is described by Apple in TN2247. It looks like you have at least one Audio Component installed that isn't suitable for the sandbox -- though probably not the one playing MIDI.
https://developer.apple.com/library/mac/technotes/tn2247/_index.html
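A quick way to work out which component might be the culprit is to enumerate all installed Audio Components and print their names -- a minimal sketch only; it lists everything, and you still have to spot the third-party entries yourself.

import AudioToolbox

// Minimal sketch: list every installed Audio Component so that third-party,
// possibly non-sandbox-safe units stand out. An all-zero description is a wildcard.
var wildcard = AudioComponentDescription(componentType: 0,
                                         componentSubType: 0,
                                         componentManufacturer: 0,
                                         componentFlags: 0,
                                         componentFlagsMask: 0)
var component = AudioComponentFindNext(nil, &wildcard)
while let current = component {
    var unmanagedName: Unmanaged<CFString>?
    if AudioComponentCopyName(current, &unmanagedName) == noErr,
       let name = unmanagedName?.takeRetainedValue() {
        print(name)
    }
    component = AudioComponentFindNext(current, &wildcard)
}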

Redirect microphone input to headphone output (soft playthru)

If I use my in-ear headphones with my MacBook Pro, it takes me a few minutes until they fit perfectly (due to the foam tips on the headphones) :)
My idea is to use the internal MacBook Pro microphone to be able to talk to somebody coming to my desk without having to remove the headphones every time -- a kind of 'intercom' that can be enabled by hitting a hotkey.
My first thought was AppleScript, which can easily pause iTunes, but I could not find any information on how to forward the microphone signal to my headphones.
My next try was Xcode and a Cocoa app. Starting from an example provided by Apple, I think it would be achievable for me to extend it to fit my needs.
So my question is:
Do you have a better idea/approach to solve my problem?
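One possible approach, sketched here with AVAudioEngine and assuming the built-in microphone is the default input and the headphones are the default output: connect the input node to the main mixer and start or stop the engine from your hotkey handler. Pausing iTunes on toggle can still be handled with AppleScript alongside this.

import AVFoundation

// Minimal play-through sketch: default input (built-in mic) to default
// output (the headphones). With headphones on, feedback is not a problem.
let engine = AVAudioEngine()
let micFormat = engine.inputNode.inputFormat(forBus: 0)
engine.connect(engine.inputNode, to: engine.mainMixerNode, format: micFormat)

func setIntercom(enabled: Bool) {
    if enabled {
        try? engine.start()   // errors ignored for brevity
    } else {
        engine.stop()
    }
}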

Mute other applications' sound when my application is playing a sound

I would like to mute other applications' sounds when my application is playing a sound. I know that this is possible in Windows 7, because it allows sound control on a per-application basis.
The specific scenario is that my app needs its sound to play exclusively; if other applications (e.g. Winamp, Media Player Classic, etc.) are playing a sound, they should be muted for the duration of the sound played by my application.
I would like to know how this can be done in Delphi. Which library/system call?
I doubt this is easily achieved.
What if the other apps took the same view? Suppose another app decided that it wanted its sound to play and mute all other apps. Which app would win?
On Vista and above you can do this with the Windows Core Audio APIs (WASAPI) and an exclusive-mode stream.
Mumble does this; you can look at its source code.

Recording audio on Mac

I am developing an application which accesses my audio input device and records audio from my microphone.
When I press the START button it has to record audio from the microphone, and it has to stop recording when I press the STOP button.
My device is a Lynx AES16 and I got the driver from their site.
On Windows I access the device (the Lynx) using the DirectShow SDK (GraphEdit).
Is there any similar tool to DirectShow available? I checked AUAudio and IOKit but didn't understand them that much.
Can anyone provide some samples or some useful links? The Apple link is not good for a starter.
Look at the CoreAudio sample code at developer.apple.com. There is plenty of useful code there.
In particular look at the RecordAudioToFile example.
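If the older sample code is hard to follow, a shorter route is to tap the input node of an AVAudioEngine and write the buffers to a file; START/STOP map directly onto installing the tap and removing it. A minimal sketch, assuming the Lynx is the system's default input device and using "capture.caf" as a placeholder path:

import AVFoundation

// Minimal sketch: record the default input device to a CAF file via an
// input-node tap. "capture.caf" is a placeholder path.
let engine = AVAudioEngine()
let format = engine.inputNode.outputFormat(forBus: 0)
let outputURL = URL(fileURLWithPath: "capture.caf")

var file: AVAudioFile?

func startRecording() throws {
    file = try AVAudioFile(forWriting: outputURL, settings: format.settings)
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        try? file?.write(from: buffer)   // append each captured buffer
    }
    try engine.start()
}

func stopRecording() {
    engine.inputNode.removeTap(onBus: 0)
    engine.stop()
    file = nil                           // releases (and closes) the file
}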
