completionHandler of AVAudioPlayerNode.scheduleFile() is called too early - iOS 8

I am trying to use the new AVAudioEngine in iOS 8.
It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.
I am using a sound file with a length of 5 seconds, and the println() message appears roughly 1 second before the end of the sound.
Am I doing something wrong or do I misunderstand the idea of a completionHandler?
Thanks!
Here is some code:
import AVFoundation

class SoundHandler {
    let engine: AVAudioEngine
    let player: AVAudioPlayerNode
    let mainMixer: AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error: NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        var soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })
        player.play()
    }
}

I see the same behavior.
From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it is finished playing.
Although the docs explicitly state:
"Called after the buffer has completely played or the player is stopped. May be nil."
So I think it's either a bug or incorrect documentation. No idea which.
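One way to see this for yourself is to check how much audio is still waiting to be rendered when the handler fires. A minimal sketch in current Swift syntax (an illustration, not from the original answer), assuming player is an attached AVAudioPlayerNode on a running engine and file is the 5-second AVAudioFile:
player.scheduleFile(file, at: nil) {
    // The handler runs on a background thread and fires slightly before playback actually ends.
    if let nodeTime = player.lastRenderTime,
       let playerTime = player.playerTime(forNodeTime: nodeTime) {
        let remaining = Double(file.length - playerTime.sampleTime) / file.processingFormat.sampleRate
        print("completion fired with \(remaining) seconds of audio still to play")
    }
}
player.play()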

You can always compute the future time at which audio playback will complete, using AVAudioTime. The current behavior is useful because it lets you schedule additional buffers/segments/files from the callback before the current buffer/segment/file finishes, avoiding a gap in audio playback. This lets you create a simple loop player without a lot of work. Here's an example:
class Latch {
    var value: Bool = true
}

func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length
    let sampleRate = file.processingFormat.sampleRate

    var segmentTime: AVAudioFramePosition = 0
    var segmentCompletion: AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }

    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}
The code above schedules the entire file twice before calling player.play(). As each segment gets close to finishing, it schedules another whole file in the future, to avoid gaps in playback. To stop looping, you use the return value, a Latch, like this:
let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()

The AVAudioEngine docs from back in the iOS 8 days must have just been wrong. In the meantime, as a workaround, I noticed that if you instead use scheduleBuffer:atTime:options:completionHandler:, the callback is fired as expected (after playback finishes).
Example code:
NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:nil];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];

My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when the playback is completed:
[self.audioUnitPlayer
    scheduleSegment:self.audioUnitFile
    startingFrame:sampleTime
    frameCount:(int)sampleLength
    atTime:0
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
        // do something here
    }];
The documentation doesn't say anything about how this works, but I tested it and it works for me.
I've been using this workaround for iOS 8-10:
- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}

Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not play to completion. However, if you call [myEngine stop], the file (or buffer) will play through to the end.

// audioFile here is our original audio
audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
    print("scheduleFile Complete")

    var delayInSeconds: Double = 0
    if let lastRenderTime = self.audioPlayerNode.lastRenderTime,
       let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {
        if let rate = rate {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate) / Double(rate)
        } else {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate)
        }
    }

    // schedule a stop timer for when audio finishes playing
    DispatchTime.executeAfter(seconds: delayInSeconds) {
        audioEngine.mainMixerNode.removeTap(onBus: 0)
        // Playback has completed
    }
})
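Note that DispatchTime.executeAfter(seconds:) is not a standard API; the answer presumably relies on a small helper along these lines (a sketch, assuming it simply dispatches to the main queue after a delay):
extension DispatchTime {
    // Hypothetical helper matching the call above: run a closure on the main queue
    // after the given number of seconds.
    static func executeAfter(seconds: Double, _ work: @escaping () -> Void) {
        DispatchQueue.main.asyncAfter(deadline: .now() + seconds, execute: work)
    }
}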

As of today, in a project with deployment target 12.4, on a device running 12.4.1, here's the way we found to successfully stop the nodes upon playback completion:
// audioFile and playerNode created here ...

playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    os_log(.debug, log: self.log, "%@", "Completing playing sound effect: \(filePath) ...")

    DispatchQueue.main.async {
        os_log(.debug, log: self.log, "%@", "... now actually completed: \(filePath)")
        self.engine.disconnectNodeOutput(playerNode)
        self.engine.detach(playerNode)
    }
}
The main difference with respect to previous answers is to postpone node detaching to the main thread (which I guess is also the audio render thread?) instead of performing it on the callback thread.

Related

Swift 2 Spritekit: Issue with background music not resuming when the game is unpaused

First of all I would like to thank anyone in advance for any help I get.
I have searched far and wide across the net and cannot find a solution to my issue. My issue is with an iOS game I am building using the SpriteKit framework. I have added a background song using an SKAudioNode and it works fine initially, but when I pause and play the game within a few seconds, the music does not begin playing again. I have tried lots of things like removing the SKAudioNode when the game is paused and adding it again when the game is resumed, but nothing has worked. I have posted a snippet of my code below keeping it as relevant as possible:
class GameScene: SKScene, SKPhysicsContactDelegate {
    var backgroundMusic = SKAudioNode(fileNamed: "bg.mp3")
    let pauseImage = SKTexture(imageNamed: "pause.png")
    var pauseButton: SKSpriteNode!
    let playImage = SKTexture(imageNamed: "play2.png")
    var playButton: SKSpriteNode!

    override func didMoveToView(view: SKView) {
        self.addChild(backgroundMusic)

        // create pause button
        pauseButton = SKSpriteNode(texture: pauseImage)
        pauseButton.position = CGPoint(x: self.size.width - pauseButton.size.width, y: pauseButton.size.height)
        pauseButton.zPosition = 1
        pauseButton.name = "pauseButton"
        self.addChild(pauseButton)

        // create play button
        playButton = SKSpriteNode(texture: playImage)
        playButton.position = CGPoint(x: self.size.width - playButton.size.width, y: -playButton.size.height)
        playButton.zPosition = 1
        playButton.name = "playButton"
        self.addChild(playButton)
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            if pauseButton.containsPoint(touch.locationInNode(self)) {
                let blockOne = SKAction.runBlock({
                    self.pauseButton.position.y = -self.pauseButton.size.height
                    self.playButton.position.y = self.playButton.size.height
                })
                let blockTwo = SKAction.runBlock({
                    self.view?.paused = true
                })
                self.runAction(SKAction.sequence([blockOne, blockTwo]))
            }
            else if playButton.containsPoint(touch.locationInNode(self)) {
                self.playButton.position.y = -self.playButton.size.height
                self.pauseButton.position.y = self.pauseButton.size.height
                self.view?.paused = false
            }
        }
    }
}
You'll want to look into using AVAudioPlayer and NSNotificationCenter to pass data around the game.
Start the background audio player in your actual GameViewController class.
It's better to do it this way than to use SKAudioNode, which is better suited to sound effects tied to something that happened in gameplay.
The advantage of using AVAudioPlayer is that when the music is paused, it stays cued up at its previous spot.
This is one of the few things that will be running regardless of what's going on. So we put it in the GameViewController.
So here's an example of the GameViewController code we'd need to start with:
import AVFoundation

var bgSoundPlayer: AVAudioPlayer?
Then, in the GameViewController, we add these functions:
override func viewDidLoad() {
    super.viewDidLoad()

    // PlayBackgroundSound, PauseBackgroundSound will be your code to send from other "scenes" to use the audio player //
    // NSNotificationCenter to pass data throughout the game //
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.playBackgroundSound(_:)), name: "PlayBackgroundSound", object: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.pauseBackgroundSound), name: "PauseBackgroundSound", object: nil)
}

func playBackgroundSound(notification: NSNotification) {
    let name = notification.userInfo!["fileToPlay"] as! String

    if (bgSoundPlayer != nil) {
        bgSoundPlayer!.stop()
        bgSoundPlayer = nil
    }

    if (name != "") {
        let fileURL: NSURL = NSBundle.mainBundle().URLForResource(name, withExtension: "mp3")!
        do {
            bgSoundPlayer = try AVAudioPlayer(contentsOfURL: fileURL)
        } catch _ {
            bgSoundPlayer = nil
        }

        bgSoundPlayer!.volume = 1
        bgSoundPlayer!.numberOfLoops = -1 // -1 will loop it forever //
        bgSoundPlayer!.prepareToPlay()
        bgSoundPlayer!.play()
    }
}

func pauseBackgroundSound() {
    if (bgSoundPlayer != nil) {
        bgSoundPlayer!.pause()
    }
}
Then, when you want to use the audio player from your pause or resume button functions, remember that each scene that uses the player needs this setup:
import AVFoundation.AVAudioSession

override func didMoveToView(view: SKView) {
    try! AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
}
Then if you want to pause or play something just use NSNotificationCenter.
// To pause, use this line of code in the function where you need to pause the music
NSNotificationCenter.defaultCenter().postNotificationName("PauseBackgroundSound", object: self)

// To play initially, or to play a new song, use this line of code in the function where you need to play the music
NSNotificationCenter.defaultCenter().postNotificationName("PlayBackgroundSound", object: self, userInfo: ["fileToPlay": "FILE NAME OF BACKGROUND MUSIC"])

How to play a sound just one time SWIFT

I'm having a little problem with my code, which plays a sound when I start my app. But here's the problem: every time I go back to the first screen, the sound plays again, and I want it to play just one time, when the menu screen pops up for the first time.
Here's my code
var bubbleSound: SystemSoundID!
bubbleSound = createBubbleSound()
AudioServicesPlaySystemSound(bubbleSound)
(...)
And here's the function:
func createBubbleSound() -> SystemSoundID {
    var soundID: SystemSoundID = 0
    let soundURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(), "bubble", "wav", nil)
    AudioServicesCreateSystemSoundID(soundURL, &soundID)
    return soundID
}
You can define a struct like this:
struct MyViewState {
    static var hasPlayedSound = false
}
Then in your viewDidLoad:
if !MyViewState.hasPlayedSound {
    var bubbleSound: SystemSoundID!
    bubbleSound = createBubbleSound()
    AudioServicesPlaySystemSound(bubbleSound)
    MyViewState.hasPlayedSound = true
}
You can then modify MyViewState.hasPlayedSound and allow the UIViewController to play the sound again if desired.
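For example (hypothetical, placed wherever you reset the flow):
// Hypothetical reset, e.g. called when the user explicitly restarts the intro
func resetIntroSound() {
    MyViewState.hasPlayedSound = false
}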

How to use NSWindowOcclusionState.Visible in Swift

I am trying to implement window toggling (something I've done many times in Objective-C), but now in Swift. It seems that I am using NSWindowOcclusionState.Visible incorrectly, but I really cannot see my problem. Only the line w.makeKeyAndOrderFront(self) is called after the initial window creation.
Any suggestions?
var fileArchiveListWindow: NSWindow? = nil

@IBAction func tougleFileArchiveList(sender: NSMenuItem) {
    if let w = fileArchiveListWindow {
        if w.occlusionState == NSWindowOcclusionState.Visible {
            w.orderOut(self)
        }
        else {
            w.makeKeyAndOrderFront(self)
        }
    }
    else {
        let sb = NSStoryboard(name: "FileArchiveOverview", bundle: nil)
        let controller: FileArchiveOverviewWindowController = sb?.instantiateControllerWithIdentifier("FileArchiveOverviewController") as FileArchiveOverviewWindowController
        fileArchiveListWindow = controller.window
        fileArchiveListWindow?.makeKeyAndOrderFront(self)
    }
}
Old question, but I just ran into the same problem. Checking the occlusionState is done a bit differently in Swift, using the bitwise AND operator:
if (window.occlusionState & NSWindowOcclusionState.Visible != nil) {
    // visible
}
else {
    // not visible
}
In recent SDKs, the NSWindowOcclusionState bitmask is imported into Swift as an OptionSet. You can use window.occlusionState.contains(.visible) to check if a window is visible or not (fully occluded).
Example:
observerToken = NotificationCenter.default.addObserver(forName: NSWindow.didChangeOcclusionStateNotification, object: window, queue: nil) { note in
    let window = note.object as! NSWindow
    if window.occlusionState.contains(.visible) {
        // window at least partially visible, resume power-hungry calculations
    } else {
        // window completely occluded, throttle down timers, CPU, etc.
    }
}

AVPlayerItem videoComposition freeze IOS8

I am playing an AVMutableVideoComposition with AVPlayer, and before iOS 8 everything was perfectly fine.
But now the video starts playing, and after 4 or 5 seconds it stops, as if it were buffering or something like that; the sound keeps playing, and when the video ends the AVPlayer loops and plays it fine, without stops.
I have no clue how to fix this issue.
Any help would be appreciated.
Thank you
I had the same issue, but I found the solution.
You should call play after the playerItem's status has changed to .ReadyToPlay.
I also answered a similar question; please see below.
func startVideoPlayer() {
    let playerItem = AVPlayerItem(asset: self.composition!)
    playerItem.videoComposition = self.videoComposition!

    let player = AVPlayer(playerItem: playerItem)
    player.actionAtItemEnd = .None

    videoPlayerLayer = AVPlayerLayer(player: player)
    videoPlayerLayer!.frame = self.bounds

    /* add playerItem's observer */
    player.addObserver(self, forKeyPath: "player.currentItem.status", options: .New, context: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    self.layer.addSublayer(videoPlayerLayer!)
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String: AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath != nil && keyPath! == "player.currentItem.status" {
        if let newValue = change?[NSKeyValueChangeNewKey] {
            if AVPlayerStatus(rawValue: newValue as! Int) == .ReadyToPlay {
                playVideo() /* play after status is changed to .ReadyToPlay */
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}

func playerItemDidReachEnd(notification: NSNotification) {
    let playerItem = notification.object as! AVPlayerItem
    playerItem.seekToTime(kCMTimeZero)
    playVideo()
}

func playVideo() {
    videoPlayerLayer?.player!.play()
}
Same here; I don't know if this counts as an answer, but anyway, just use [AVPlayer seekToTime:AVPlayer.currentItem.duration]; to do the first loop yourself and avoid the AVPlayer stall. That's the only way I found.
I was having the same problem and I solved it this way:
Instead of applying the videoComposition to the AVPlayerItem directly, I exported my video with an AVAssetExportSession to apply the videoComposition, which gives me a URL for a video that already has the composition applied; I can then use that URL to play the video in AVPlayer.
This works like a charm.
Exporting the video takes time, but since the video composition is no longer being rendered on the fly, playback is smooth.
The following code can be used to export the video:
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1280x720];
export.videoComposition = videoComposition;
export.outputURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[NSUUID new].UUIDString] stringByAppendingPathExtension:@"MOV"]];
export.outputFileType = AVFileTypeQuickTimeMovie;
export.shouldOptimizeForNetworkUse = YES;

[export exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            completionHandler(export.outputURL, nil);
        } else {
            completionHandler(nil, export.error);
        }
    });
}];

AVURLAsset getting video size

This is pretty frustrating. I'm trying to get the size of an AVURLAsset, but I want to avoid naturalSize since Xcode tells me it is deprecated in iOS 5.
But what's the replacement?
I can't find any clue on how to get the video dimensions without using naturalSize...
Resolution in Swift 3:
func resolutionSizeForLocalVideo(url: NSURL) -> CGSize? {
    guard let track = AVAsset(URL: url).tracksWithMediaType(AVMediaTypeVideo).first else { return nil }
    let size = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform)
    return CGSize(width: fabs(size.width), height: fabs(size.height))
}
For Swift 4:
func resolutionSizeForLocalVideo(url: NSURL) -> CGSize? {
    guard let track = AVAsset(url: url as URL).tracks(withMediaType: AVMediaType.video).first else { return nil }
    let size = track.naturalSize.applying(track.preferredTransform)
    return CGSize(width: fabs(size.width), height: fabs(size.height))
}
Solutions without preferredTransform do not return correct values for some videos on the latest devices!
I just checked the documentation online, and the naturalSize method is deprecated for the AVAsset object. However, there should always be an AVAssetTrack which refers to the AVAsset, and the AVAssetTrack has a naturalSize method that you can call.
naturalSize
The natural dimensions of the media data referenced by the track. (read-only)
@property(nonatomic, readonly) CGSize naturalSize
Availability: Available in iOS 4.0 and later. Declared in AVAssetTrack.h
Via: AVAssetTrack Reference for iOS
The deprecation warning on the official documentation suggests, "Use the naturalSize and preferredTransform, as appropriate, of the asset’s video tracks instead (see also tracksWithMediaType:)."
I changed my code from:
CGSize size = [movieAsset naturalSize];
to
CGSize size = [[[movieAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
It's less pretty and less safe now but won't break when they drop that method.
The deprecation warning says:
Use the naturalSize and preferredTransform, as appropriate,
of the asset’s video tracks instead (see also tracksWithMediaType:).
So we need an AVAssetTrack, and we want its naturalSize and preferredTransform. This can be accessed with the following:
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGSize dimensions = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
asset is obviously your AVAsset.
This is a fairly simple extension for AVAsset in Swift 4 to get the size of the video, if available:
extension AVAsset {
    var screenSize: CGSize? {
        if let track = tracks(withMediaType: .video).first {
            let size = __CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform)
            return CGSize(width: fabs(size.width), height: fabs(size.height))
        }
        return nil
    }
}
To derive the dimensions of an AVAsset, you should calculate the union of all the visual track rects (after applying their corresponding preferred transforms):
CGRect unionRect = CGRectZero;
for (AVAssetTrack *track in [asset tracksWithMediaCharacteristic:AVMediaCharacteristicVisual]) {
    CGRect trackRect = CGRectApplyAffineTransform(CGRectMake(0.f, 0.f, track.naturalSize.width, track.naturalSize.height),
                                                  track.preferredTransform);
    unionRect = CGRectUnion(unionRect, trackRect);
}
CGSize naturalSize = unionRect.size;
Methods that rely on CGSizeApplyAffineTransform fail when your asset contains tracks with non-trivial affine transformation (e.g., 45 degree rotations) or if your asset contains tracks with different origins (e.g., two tracks playing side-by-side with the second track's origin augmented by the width of the first track).
See MediaPlayerPrivateAVFoundationCF::sizeChanged() at https://opensource.apple.com/source/WebCore/WebCore-7536.30.2/platform/graphics/avfoundation/cf/MediaPlayerPrivateAVFoundationCF.cpp
For Swift 5
let assetSize = asset.tracks(withMediaType: .video)[0].naturalSize
Swift version of @David_H's answer.
extension AVAsset {
    func resolutionSizeForLocalVideo() -> CGSize? {
        var unionRect = CGRect.zero
        for track in self.tracks(withMediaCharacteristic: .visual) {
            let trackRect = CGRect(x: 0, y: 0, width: track.naturalSize.width, height: track.naturalSize.height)
                .applying(track.preferredTransform)
            unionRect = unionRect.union(trackRect)
        }
        return unionRect.size
    }
}
For iOS versions 15.0 and above:
extension AVAsset {
    func naturalSize() async -> CGSize? {
        guard let tracks = try? await loadTracks(withMediaType: .video) else { return nil }
        guard let track = tracks.first else { return nil }
        guard let size = try? await track.load(.naturalSize) else { return nil }
        return size
    }
}
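Since the earlier answers note that ignoring preferredTransform gives wrong values for rotated videos, here is a sketch that uses the same iOS 15+ async loading calls but also applies the transform (displaySize() is a made-up name, not Apple API):
extension AVAsset {
    func displaySize() async -> CGSize? {
        guard let tracks = try? await loadTracks(withMediaType: .video) else { return nil }
        guard let track = tracks.first else { return nil }
        // Load naturalSize and preferredTransform together, then apply the transform
        // so rotated videos report the dimensions you actually see on screen.
        guard let (size, transform) = try? await track.load(.naturalSize, .preferredTransform) else { return nil }
        let transformed = size.applying(transform)
        return CGSize(width: abs(transformed.width), height: abs(transformed.height))
    }
}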
