Distortion in Core Audio's NewTimePitch audio unit

I am attempting to incorporate the NewTimePitch audio unit in my iOS app. I have constructed an AUGraph with a filePlayer unit -> subType_NewTimePitch unit -> remoteIO unit. I have found that the output of the NewTimePitch unit is distorted. This distortion persists even at the default rate (1.0) and pitch (0) parameters.
I have performed a simple test for confirmation: in my AUGraph, I swapped the subType_NewTimePitch for subType_Varispeed, keeping all other parameters the same, and the distortion disappears. I also tried placing a mixer unit upstream of the NewTimePitch in order to reduce the input gain to the NewTimePitch unit. Of course, this reduced the overall output of the unit but did nothing to ameliorate the distortion. I have found that, although the distortion is not particularly noticeable in the iOS Simulator, it is definitely present on an actual device (iPad 2).
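For reference, the graph is assembled roughly like this (a simplified sketch, not my exact code: error checking is omitted and the variable names are illustrative):

#import <AudioToolbox/AudioToolbox.h>

AUGraph graph;
AUNode playerNode, pitchNode, ioNode;
NewAUGraph(&graph);

// File player -> NewTimePitch -> RemoteIO
AudioComponentDescription playerDesc = {
    .componentType = kAudioUnitType_Generator,
    .componentSubType = kAudioUnitSubType_AudioFilePlayer,
    .componentManufacturer = kAudioUnitManufacturer_Apple };
AudioComponentDescription pitchDesc = {
    .componentType = kAudioUnitType_FormatConverter,
    .componentSubType = kAudioUnitSubType_NewTimePitch,   // swap in kAudioUnitSubType_Varispeed to reproduce the comparison
    .componentManufacturer = kAudioUnitManufacturer_Apple };
AudioComponentDescription ioDesc = {
    .componentType = kAudioUnitType_Output,
    .componentSubType = kAudioUnitSubType_RemoteIO,
    .componentManufacturer = kAudioUnitManufacturer_Apple };

AUGraphAddNode(graph, &playerDesc, &playerNode);
AUGraphAddNode(graph, &pitchDesc, &pitchNode);
AUGraphAddNode(graph, &ioDesc, &ioNode);

AUGraphOpen(graph);
AUGraphConnectNodeInput(graph, playerNode, 0, pitchNode, 0);
AUGraphConnectNodeInput(graph, pitchNode, 0, ioNode, 0);
AUGraphInitialize(graph);
AUGraphStart(graph);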
I would be very interested in others' experience and/or solutions.

I also ran into this and just went back to the old time-pitch unit.

Greg, I've also used the NewTimePitch audio unit in my app. I haven't found any distortion or noise in the simulator. As for the device: unfortunately, I have no sound at all with the NewTimePitch node (iOS Sounds with AudioUnit works on simulator, not on real device iPad). If I use an AUConverter node instead of NewTimePitch, the sound plays fine. I've also tried Varispeed, but with no success either. Do you know a possible reason for the missing sound on the iPad?
I have several suggestions for your issue:
1) Is your NewTimePitch the first node in your graph, with the input render callback attached to it? If so, did you try adding, for example, a Mixer node as the first node with the NewTimePitch after it? This can help if the issue is related to the stream format.
2) Did you try adding an AUConverter node between the NewTimePitch node and the remoteIO node (see the sketch after this list)? This can also help if the issue is related to the stream format.
3) Did you try launching the compiled app on the device without starting it from Xcode? This matters if you have logging in the render callback (a large volume of log output can cause bad sound or no sound at all).
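For example, suggestion 2 would look roughly like this (a sketch only; the graph and node variables are illustrative and assumed to already exist):

// Insert an AUConverter so any stream-format mismatch between the
// NewTimePitch output and the remoteIO input is handled by the converter.
AudioComponentDescription convDesc = {
    .componentType = kAudioUnitType_FormatConverter,
    .componentSubType = kAudioUnitSubType_AUConverter,
    .componentManufacturer = kAudioUnitManufacturer_Apple };
AUNode converterNode;
AUGraphAddNode(graph, &convDesc, &converterNode);

// Re-route: ... -> NewTimePitch -> AUConverter -> remoteIO
AUGraphDisconnectNodeInput(graph, ioNode, 0);
AUGraphConnectNodeInput(graph, pitchNode, 0, converterNode, 0);
AUGraphConnectNodeInput(graph, converterNode, 0, ioNode, 0);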

There is a BypassEffect property that can be set on the NewTimePitch converter Audio Unit if you don't want it to affect the sound at the default pitch and rate (no change).
I've only tested this on an iOS 6 device, but both NewTimePitch and the bypass appear to work there.
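Setting the bypass looks roughly like this (a sketch: timePitchUnit is assumed to be the AudioUnit pulled out of the NewTimePitch node, e.g. with AUGraphNodeInfo):

// Bypass the effect so the unit passes audio through unchanged.
UInt32 bypass = 1;
AudioUnitSetProperty(timePitchUnit,
                     kAudioUnitProperty_BypassEffect,
                     kAudioUnitScope_Global,
                     0,
                     &bypass,
                     sizeof(bypass));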

I ran into this issue using a render callback -> Mixer unit -> NewTimePitch unit -> Mixer unit -> RemoteIO unit. Two workarounds I found:
1) Use the Varispeed unit and set the kNewTimePitchParam_Pitch parameter on it (undocumented, but works).
2) Use an AUFilePlayer instead of a render callback.
Result: Player unit -> NewTimePitch unit -> Mixer unit -> RemoteIO unit
Note on workaround 2: I needed to get the kAudioUnitProperty_StreamFormat from the pitch unit and set the player unit's output format and the mixer unit's input format to it (see the sketch below).
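The format matching in workaround 2 was roughly this (a sketch; the unit variables and bus numbers are illustrative):

// Read the NewTimePitch unit's stream format...
AudioStreamBasicDescription pitchFormat;
UInt32 size = sizeof(pitchFormat);
AudioUnitGetProperty(pitchUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &pitchFormat, &size);

// ...and apply it to the player's output and the mixer's input.
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &pitchFormat, sizeof(pitchFormat));
AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, 0, &pitchFormat, sizeof(pitchFormat));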

Related

Does Cobalt support YouTube 360 video (spherical video)?

Does Cobalt support YouTube 360 video (spherical video)? If yes, how has it been implemented, and is there any documentation for it? Does the platform need to do extra things to support it?
Almost. There is still some small remaining work that prevents this from being a simple yes, but the vast bulk of the work is done and has been shown to function.
A document will soon be appearing in the source tree explaining all this, but here is a preview...
In order to support spherical video, a platform will have to support decode-to-texture, introduced in Starboard API version 4. Cobalt will choose between punch-out and decode-to-texture when creating an SbPlayer, based on whether a mesh transform has been applied to the video tag. In decode-to-texture mode, the video texture is then queried from the player every frame and rendered into the UI graphics plane with the current transform applied.

How to control the speed of an animation in Unity from a .txt file?

I am trying to control the flow of a 3D animation in Unity through sound.
Using a synthesizer, I get the BPM of a song and store it in a .txt file, recording the time and the BPM value for each second. In Unity, I have a predefined animation and I already load the file with the BPM information, but now I do not know how to make the animation speed be controlled by the information in that file.
Any idea?
You can always change AnimationState.speed to serve that purpose.
http://docs.unity3d.com/ScriptReference/AnimationState-speed.html

Delay in AUGraph callback

We are developing a music player app for OS X Lion (10.7) that applies different audio effects to the selected music file.
We have used the Audio Unit and AUGraph APIs to achieve this.
However, after connecting all the audio unit nodes, when we call AUGraphStart(mGraph) the graph takes around 1 second to invoke the first I/O callback.
Because of this, there is a slight delay at the beginning of playback.
How can we avoid this delay? Could anyone provide any input to help us solve this issue?
One solution is to start the audio graph running before displaying any UI that the user could use to start playback. Since the audio units will then be running, you could fill any audio output buffers with silence before the appropriate UI event. If the buffers are small/short, the latency from any UI event till an output buffer is filled may be small enough to be below normal human perception.
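A rough sketch of that idea (the MyPlayerState struct and the playbackRequested flag are hypothetical; the flag would be set by whatever UI event starts playback):

#import <AudioToolbox/AudioToolbox.h>
#include <stdbool.h>
#include <string.h>

typedef struct {
    volatile bool playbackRequested;   // hypothetical flag toggled by the UI
} MyPlayerState;

static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    MyPlayerState *state = (MyPlayerState *)inRefCon;
    if (!state->playbackRequested) {
        // The graph is already running; output silence until the user hits play.
        for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
        *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;
        return noErr;
    }
    // ...otherwise fill ioData with real audio as usual...
    return noErr;
}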

Should I use NSOperation or NSRunLoop?

I am trying to monitor a stream of video output from a FireWire camera. I have created an Interface Builder interface with buttons and an NSImageView. While image monitoring is occurring within an endless loop, I want to:
change some camera parameters on the fly (gain, gamma, etc.)
tell the monitoring to stop so I can save an image to a file (set a flag that stops the while loop)
Using the button features, I have been unable to loop the video frame monitor while still looking for a button press (much like using the keypressed feature from C). Two options present themselves:
Initiate a new run loop (for which I cannot get an autoreleasepool to function ...)
Initiate an NSOperation - how do I do this in a way which allows me to connect with an Xcode button push?
The documentation is very obtuse about the creation of such objects. If I create an NSOperation as per the examples I've found, there seems to be no way to communicate with it with an object from Interface Builder. When I create an NSRunLoop, I get an object leak error, and I can find no example of how to create an autoreleasepool that actually responds to the RunLoop I've created. Never mind that I haven't even attempted to choose which objects get sampled by the secondary run loop ...
Because Objective C is (obviously!) not my native tongue, I am looking for solutions with baby steps, sorry to say ...
Thanks in advance
I've needed to do almost exactly the same as you, only with a continuous video display from the FireWire camera. In my case, I used the libdc1394 library to perform the frame capture and camera property adjustment for our FireWire cameras. I know you can also do this using some of the Carbon Quicktime functions, but I found libdc1394 to be a little easier to understand.
For the video capture loop, I tried a number of different approaches, from a separate thread that polls the camera and has locks around shared resources, to using one NSOperationQueue for interaction with the camera, and finally settled on using a CVDisplayLink to poll the camera in a way that matches the refresh rate of the screen.
The CVDisplayLink is configured using the following code:
CGDirectDisplayID displayID = CGMainDisplayID();
CVReturn error = kCVReturnSuccess;
error = CVDisplayLinkCreateWithCGDisplay(displayID, &displayLink);
if (error)
{
    NSLog(@"DisplayLink created with error:%d", error);
    displayLink = NULL;
}
CVDisplayLinkSetOutputCallback(displayLink, renderCallback, self);
and it calls the following function to trigger the retrieval of a new camera frame:
static CVReturn renderCallback(CVDisplayLinkRef displayLink,
                               const CVTimeStamp *inNow,
                               const CVTimeStamp *inOutputTime,
                               CVOptionFlags flagsIn,
                               CVOptionFlags *flagsOut,
                               void *displayLinkContext)
{
    return [(SPVideoView *)displayLinkContext renderTime:inOutputTime];
}
The CVDisplayLink is started and stopped using the following:
- (void)startRequestingFrames;
{
    CVDisplayLinkStart(displayLink);
}

- (void)stopRequestingFrames;
{
    CVDisplayLinkStop(displayLink);
}
Rather than using a lock on the FireWire camera communications, whenever I need to adjust the exposure, gain, etc. I change corresponding instance variables and set the appropriate bits within a flag variable to indicate which settings to change. On the next retrieval of a frame, the callback method from the CVDisplayLink changes the appropriate settings on the camera to match the locally stored instance variables and clears that flag.
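In rough outline, that approach looks something like this (a sketch only: the flag bits, the pendingGain/pendingSettingsFlags/camera ivars, and the libdc1394 call are illustrative, not the exact code from our app):

enum {
    kPendingGainChange  = 1 << 0,
    kPendingGammaChange = 1 << 1,
};

// UI thread: record the new value and mark it as pending.
// (pendingSettingsFlags is an assumed uint32_t ivar; real code should make this thread-safe.)
- (void)setCameraGain:(NSInteger)newGain
{
    pendingGain = newGain;
    pendingSettingsFlags |= kPendingGainChange;
}

// Called from the CVDisplayLink callback before grabbing the next frame.
- (void)applyPendingCameraSettings
{
    if (pendingSettingsFlags & kPendingGainChange) {
        // e.g. via libdc1394:
        dc1394_feature_set_value(camera, DC1394_FEATURE_GAIN, (uint32_t)pendingGain);
        pendingSettingsFlags &= ~kPendingGainChange;
    }
    // ...handle the other flag bits similarly...
}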
Display to the screen is handled through an NSOpenGLView (CAOpenGLLayer introduced too many visual artifacts when updating at this rate, and its update callbacks ran on the main thread). Apple has some extensions you can use to provide these frames as textures using DMA for better performance.
Unfortunately, nothing that I've described here is introductory-level stuff. I have about 2,000 lines of code for these camera-handling functions in our software and this took a long time to puzzle out. If Apple could add the manual camera settings adjustments to the QTKit Capture APIs, I could remove almost all of this.
If all you're trying to do is see/grab the output of a connected camera, the answer is probably neither.
Use QTKit's QTCaptureView. Problem solved. Want to grab a frame? Also no problem. Don't try to roll your own - QTKit's stuff is optimized and part of the OS. I'm pretty sure you can affect camera properties as you wanted, but if not, Plan B should work.
Plan B: use a scheduled, recurring NSTimer to ask QTKit to grab a frame every so often ("how" linked above) and apply your image manipulations to the frame (maybe with Core Image) before displaying it in your NSImageView.
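A rough sketch of that timer setup (grabCurrentFrame, frameTimer, and imageView are hypothetical stand-ins for whatever QTKit grab mechanism and outlets you use):

// Start a repeating ~30 fps timer on the main run loop.
- (void)startMonitoring
{
    frameTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                                  target:self
                                                selector:@selector(updateFrame:)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)updateFrame:(NSTimer *)timer
{
    NSImage *frame = [self grabCurrentFrame];   // hypothetical QTKit-backed grab
    // Optional Core Image processing could happen here before display.
    [imageView setImage:frame];
}

- (void)stopMonitoring
{
    [frameTimer invalidate];
    frameTimer = nil;
}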

How to get the FPS from a JavaFX scene?

I am currently writing a small graphical performance test benchmark for JavaFX.
Thus, I need to get the current FPS at which the JavaFX scene is being refreshed.
So far, I haven't found a solution how to accomplish this.
Does anyone know if there is some kind of event that I could use in order to get the FPS?
I don't think there is a specific event that gives the frame rate. This reference/example might help: it shows the frame rate while running -- JavaFX FPS Meter. It has a link to the source code.
