I am playing an AVMutableVideoComposition with an AVPlayer, and up to iOS 8 everything worked perfectly.
But now the video starts playing and stops after 4 or 5 seconds, as if it were buffering or something like that. The sound keeps playing, and when the video ends the AVPlayer loops and plays it again without any stops.
I have no clue how to fix this issue.
Any help would be appreciated.
Thank you
I had the same issue, and I found a solution.
You should start playback only after the playerItem's status has changed to .ReadyToPlay.
I also answered a similar issue here.
Please see the code below.
func startVideoPlayer() {
    let playerItem = AVPlayerItem(asset: self.composition!)
    playerItem.videoComposition = self.videoComposition!

    let player = AVPlayer(playerItem: playerItem)
    player.actionAtItemEnd = .None

    videoPlayerLayer = AVPlayerLayer(player: player)
    videoPlayerLayer!.frame = self.bounds

    /* observe the player item's status so we only start playing once it is ready */
    player.addObserver(self, forKeyPath: "currentItem.status", options: .New, context: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    self.layer.addSublayer(videoPlayerLayer!)
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath != nil && keyPath! == "currentItem.status" {
        if let newValue = change?[NSKeyValueChangeNewKey] {
            if AVPlayerStatus(rawValue: newValue as! Int) == .ReadyToPlay {
                playVideo() /* play only after the status has changed to .ReadyToPlay */
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}

func playerItemDidReachEnd(notification: NSNotification) {
    let playerItem = notification.object as! AVPlayerItem
    playerItem.seekToTime(kCMTimeZero)
    playVideo()
}

func playVideo() {
    videoPlayerLayer?.player!.play()
}
Same here. I don't know if it counts as an answer, but anyway, just use
[player seekToTime:player.currentItem.duration];
to perform the first loop yourself and avoid the AVPlayer stall. That's the only way I found.
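For reference, a rough Swift spelling of the same trick might look like this (a sketch only; player is assumed to be an already configured AVPlayer whose item is ready):

// jump straight to the end so the "first loop" happens manually;
// the did-play-to-end notification / looping path then takes over
if let item = player.currentItem {
    player.seekToTime(item.duration)
}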
I was having the same problem and I solved it this way:
Instead of applying the videoComposition to the AVPlayerItem directly, I exported my video using AVAssetExportSession with the videoComposition applied. That gives me a URL for a video file with the whole composition already rendered, and I can use that content URL to play the video in AVPlayer.
This works like a charm.
Exporting the video takes time, but since the video composition is no longer being rendered on the fly, playback is smooth.
The following code can be used to export the video:
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1280x720];
export.videoComposition = videoComposition;
export.outputURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[NSUUID new].UUIDString] stringByAppendingPathExtension:@"MOV"]];
export.outputFileType = AVFileTypeQuickTimeMovie;
export.shouldOptimizeForNetworkUse = YES;

[export exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            completionHandler(export.outputURL, nil);
        } else {
            completionHandler(nil, export.error);
        }
    });
}];
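Once the export completes, the returned URL can be handed straight to AVPlayer; since the composition is baked into the exported file, nothing extra is set on the player item. A minimal sketch in Swift (exportedURL stands for the export.outputURL delivered by the completion handler above; the view names are assumptions):

let playerItem = AVPlayerItem(URL: exportedURL)   // no videoComposition needed anymore
let player = AVPlayer(playerItem: playerItem)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = view.bounds
view.layer.addSublayer(playerLayer)
player.play()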
First of all, I would like to thank anyone in advance for any help I get.
I have searched far and wide across the net and cannot find a solution to my issue. My issue is with an iOS game I am building using the SpriteKit framework. I have added a background song using an SKAudioNode, and it works fine initially, but when I pause and resume the game within a few seconds, the music does not begin playing again. I have tried lots of things, like removing the SKAudioNode when the game is paused and adding it again when the game is resumed, but nothing has worked. I have posted a snippet of my code below, keeping it as relevant as possible:
class GameScene: SKScene, SKPhysicsContactDelegate {
    var backgroundMusic = SKAudioNode(fileNamed: "bg.mp3")
    let pauseImage = SKTexture(imageNamed: "pause.png")
    var pauseButton: SKSpriteNode!
    let playImage = SKTexture(imageNamed: "play2.png")
    var playButton: SKSpriteNode!

    override func didMoveToView(view: SKView) {
        self.addChild(backgroundMusic)

        // create pause button
        pauseButton = SKSpriteNode(texture: pauseImage)
        pauseButton.position = CGPoint(x: self.size.width - pauseButton.size.width, y: pauseButton.size.height)
        pauseButton.zPosition = 1
        pauseButton.name = "pauseButton"
        self.addChild(pauseButton)

        // create play button
        playButton = SKSpriteNode(texture: playImage)
        playButton.position = CGPoint(x: self.size.width - playButton.size.width, y: -playButton.size.height)
        playButton.zPosition = 1
        playButton.name = "playButton"
        self.addChild(playButton)
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            if pauseButton.containsPoint(touch.locationInNode(self)) {
                let blockOne = SKAction.runBlock({
                    self.pauseButton.position.y = -self.pauseButton.size.height
                    self.playButton.position.y = self.playButton.size.height
                })
                let blockTwo = SKAction.runBlock({
                    self.view?.paused = true
                })
                self.runAction(SKAction.sequence([blockOne, blockTwo]))
            }
            else if playButton.containsPoint(touch.locationInNode(self)) {
                self.playButton.position.y = -self.playButton.size.height
                self.pauseButton.position.y = self.pauseButton.size.height
                self.view?.paused = false
            }
        }
    }
}
You'll want to look into using AVAudioPlayer and NSNotificationCenter to pass data around the game.
Start the background audio player in your actual GameViewController class.
It's better to do it this way than to use SKAudioNode, which is more suited to sound effects tied to something that happened in gameplay.
The advantage of AVAudioPlayer is that when the music is paused it stays cued up at its previous spot.
The background music is one of the few things that should keep running regardless of what's going on, so we put it in the GameViewController.
So here's an example of the GameViewController code we'd need to start:

import AVFoundation

var bgSoundPlayer: AVAudioPlayer?

Then in the GameViewController we add these functions:

override func viewDidLoad() {
    super.viewDidLoad()
    // "PlayBackgroundSound" and "PauseBackgroundSound" are the notification names
    // other scenes post to drive the audio player.
    // NSNotificationCenter is used to pass data throughout the game.
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.playBackgroundSound(_:)), name: "PlayBackgroundSound", object: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.pauseBackgroundSound), name: "PauseBackgroundSound", object: nil)
}
func playBackgroundSound(notification: NSNotification) {
    let name = notification.userInfo!["fileToPlay"] as! String

    // stop and throw away any player that is already running
    if bgSoundPlayer != nil {
        bgSoundPlayer!.stop()
        bgSoundPlayer = nil
    }

    if name != "" {
        let fileURL: NSURL = NSBundle.mainBundle().URLForResource(name, withExtension: "mp3")!
        do {
            bgSoundPlayer = try AVAudioPlayer(contentsOfURL: fileURL)
        } catch _ {
            bgSoundPlayer = nil
        }
        // only configure and start the player if it was created successfully
        if let player = bgSoundPlayer {
            player.volume = 1
            player.numberOfLoops = -1 // -1 will loop it forever
            player.prepareToPlay()
            player.play()
        }
    }
}

func pauseBackgroundSound() {
    if bgSoundPlayer != nil {
        bgSoundPlayer!.pause()
    }
}
Then, when you want to use the audio player from your pause or resume button functions, remember that each scene that controls the music needs its audio session configured:
import AVFoundation.AVAudioSession

override func didMoveToView(view: SKView) {
    try! AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
}
Then if you want to pause or play something, just use NSNotificationCenter.

// To pause, use this line of code in the function where you need to pause the music
NSNotificationCenter.defaultCenter().postNotificationName("PauseBackgroundSound", object: self)

// To play initially, or to play a new song, use this line of code in the function where you need to play the music
NSNotificationCenter.defaultCenter().postNotificationName("PlayBackgroundSound", object: self, userInfo: ["fileToPlay": "FILE NAME OF BACKGROUND MUSIC"])
Whenever I go back to my home screen, it replays the music over the already playing music. I tried writing an if statement that 'obviously' doesn't work (because I barely know any Swift!). Here is my home-screen code:
var myAudioPlayer = AVAudioPlayer()

override func viewDidLoad() {
    let myFilePathString = NSBundle.mainBundle().pathForResource("16 March of the Resistance", ofType: "m4a")
    if let myFilePathString = myFilePathString {
        let myFilePathURL = NSURL(fileURLWithPath: myFilePathString)
        do {
            try myAudioPlayer = AVAudioPlayer(contentsOfURL: myFilePathURL)
            myAudioPlayer.play()
        } catch {
            print("error")
        }
    }
}
How can I stop it from playing on top of itself?
Here is a picture of my storyboard. The music player is on the StartScreen, but when I click the back button, it starts another music player over the current one.
I need code which says: is myAudioPlayer playing? If yes, do not play the song again; else play "16 March of the Resistance".
You need to check whether you are already playing. You can do this via the playing property of the AVAudioPlayer class.
Guard your viewDidLoad with, say:

if !myAudioPlayer.playing

Note that I don't know any Swift.
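Spelled out, that guard might look like this (a minimal sketch, given the caveat above, using the Swift 2 era playing property):

override func viewDidLoad() {
    super.viewDidLoad()
    if myAudioPlayer.playing {
        return // already playing; don't create and start another player
    }
    // ... existing setup code that creates the player and calls play() ...
}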
I'm not sure what you mean by "my home screen": is it the device's home screen or your main VC? And why are you overriding viewDidLoad()? A previous player could never be playing during the viewDidLoad() call, unless there is a leak, which is what would need fixing.
Otherwise, you just need to declare your player as an optional rather than creating it in the declaration. As for the replaying or overlapping, just check whether your player exists, although you wouldn't do this in viewDidLoad(). viewDidLoad() is called when the VC's view is created and loaded, which doesn't relate to the lifecycle of your audio.
Depending on your requirements, you might want to create the player somewhere else so that it plays throughout the app, rather than restarting whenever you visit that VC. Otherwise, the code below will avoid the duplication. The problem was the line var myAudioPlayer = AVAudioPlayer(), which would duplicate players, and one of them wasn't being released.
var myAudioPlayer: AVAudioPlayer?

override func viewDidLoad() {
    super.viewDidLoad()

    // this should never be needed with correct player setup
    if let player = myAudioPlayer where player.playing {
        return
    }

    if let pathString = NSBundle.mainBundle().pathForResource("16 March of the Resistance", ofType: "m4a") {
        let url = NSURL(fileURLWithPath: pathString)
        do {
            try myAudioPlayer = AVAudioPlayer(contentsOfURL: url)
            myAudioPlayer?.play()
        } catch {
            print("error")
        }
    }
}
Update
There is no need to make the VC do this. Create a class, e.g. AudioController, which could even be a singleton. Initialise it in your AppDelegate and trigger this player setup code after initialisation. Your view controllers don't need to know about the audio controller. If, for example, you want to modify the volume from a settings page, you just access the singleton object and set its volume.
This way you keep all your audio controller code nicely separated. Don't just think of functionality; think of the overall software architecture, with objects having clear responsibilities. Keep your view controllers light; they shouldn't be doing much processing.
Take a look at this: Singletons. It's really easy to set up a singleton in Swift nowadays.
class AudioController {
    static let sharedInstance = AudioController()

    var audioPlayer: AVAudioPlayer?
    let kVolumeKey = "VolumeKey"
    let kHasSavedInitialVolumeKey = "HasSavedInitialVolumeKey"
    var volume: Float = 0.5

    func setup() {
        self.loadVolume()
        self.setupPlayer()
    }

    func updatePlayerVolume(volume: Float) {
        self.audioPlayer?.volume = volume
        self.volume = volume
        self.saveVolume()
    }

    func saveVolume() {
        NSUserDefaults.standardUserDefaults().setBool(true, forKey: kHasSavedInitialVolumeKey)
        NSUserDefaults.standardUserDefaults().setFloat(self.volume, forKey: kVolumeKey)
        NSUserDefaults.standardUserDefaults().synchronize()
    }

    func loadVolume() {
        if NSUserDefaults.standardUserDefaults().boolForKey(kHasSavedInitialVolumeKey) {
            self.volume = NSUserDefaults.standardUserDefaults().floatForKey(kVolumeKey)
        }
    }

    func setupPlayer() {
        if let pathString = NSBundle.mainBundle().pathForResource("16 March of the Resistance", ofType: "m4a") {
            let url = NSURL(fileURLWithPath: pathString)
            do {
                try audioPlayer = AVAudioPlayer(contentsOfURL: url)
                audioPlayer?.volume = self.volume
                audioPlayer?.play()
            } catch {
                print("error")
            }
        }
    }
}
So you can just set up the player from your AppDelegate's didFinishLaunching:

AudioController.sharedInstance.setupPlayer()

Then you can access the AudioController using AudioController.sharedInstance.whatever whenever you need to get or set something. You will also need to implement the player's delegate methods to handle the end of a song, possibly triggering a new song from a playlist (create a new object for that), as sketched below. Your player delegate code is also nicely separated now.
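A minimal sketch of such a delegate (this assumes AudioController is made to inherit from NSObject, which AVAudioPlayerDelegate requires, and that setupPlayer() also sets audioPlayer?.delegate = self):

extension AudioController: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
        // called when the current track finishes:
        // advance a hypothetical playlist here, or restart the same track
        self.setupPlayer()
    }
}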
Looking at your storyboard, it seems that when you click the Back button on the second view controller, you allocate another start screen and push it onto the screen.
It seems that you're using a UINavigationController; if so, when the back button is tapped you should call:

self.navigationController?.popViewControllerAnimated(true)

If you're showing it as a modal, you should dismiss it like this:

self.dismissViewControllerAnimated(true, completion: nil)
I'm having a little problem with my code, which plays a sound when I start my app. The problem is that every time I go back to the first screen the sound plays again, and I want it to play just once, when the menu screen pops up for the first time.
Here's my code
var bubbleSound: SystemSoundID!
bubbleSound = createBubbleSound()
AudioServicesPlaySystemSound(bubbleSound)
(...)
And the function:

func createBubbleSound() -> SystemSoundID {
    var soundID: SystemSoundID = 0
    let soundURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(), "bubble", "wav", nil)
    AudioServicesCreateSystemSoundID(soundURL, &soundID)
    return soundID
}
You can define a struct like this (source):
struct MyViewState {
    static var hasPlayedSound = false
}
Then in your viewDidLoad:
if !MyViewState.hasPlayedSound {
    var bubbleSound: SystemSoundID!
    bubbleSound = createBubbleSound()
    AudioServicesPlaySystemSound(bubbleSound)
    MyViewState.hasPlayedSound = true
}
You can then modify MyViewState.hasPlayedSound and allow the UIViewController to play the sound again if desired.
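For instance, if some other screen decides the sound should be allowed once more, it just flips the flag back:

MyViewState.hasPlayedSound = false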
I am using the following code to play music. It plays well :)
var backgroundMusicPlayer = AVAudioPlayer()

func playBackgroundMusic(filename: String) {
    let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
    guard let newURL = url else {
        print("Could not find file: \(filename)")
        return
    }
    do {
        backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: newURL)
        backgroundMusicPlayer.numberOfLoops = -1
        backgroundMusicPlayer.prepareToPlay()
        backgroundMusicPlayer.play()
    } catch let error as NSError {
        print(error.description)
    }
}

playBackgroundMusic("music.wav")
However, I'd like to be able to keep the music playing only through certain transitions. Is this possible?
Cheers :)
I am trying to use the new AVAudioEngine in iOS 8.
It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.
I am using a sound file that is 5 seconds long, and the println() message appears about 1 second before the end of the sound.
Am I doing something wrong or do I misunderstand the idea of a completionHandler?
Thanks!
Here is some code:
class SoundHandler {
    let engine: AVAudioEngine
    let player: AVAudioPlayerNode
    let mainMixer: AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error: NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        var soundFile = AVAudioFile(forReading: soundUrl, error: nil)
        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })
        player.play()
    }
}
I see the same behavior.
From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it has finished playing.
The docs, however, explicitly state:
"Called after the buffer has completely played or the player is stopped. May be nil."
So I think it's either a bug or incorrect documentation; I have no idea which.
You can always compute the future time when audio playback will complete, using AVAudioTime. The current behavior is useful because it supports scheduling additional buffers/segments/files to play from the callback before the end of the current buffer/segment/file finishes, avoiding a gap in audio playback. This lets you create a simple loop player without a lot of work. Here's an example:
class Latch {
    var value: Bool = true
}

func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length
    let sampleRate = file.processingFormat.sampleRate

    var segmentTime: AVAudioFramePosition = 0
    var segmentCompletion: AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }

    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}
The code above schedules the entire file twice before calling player.play(). As each segment gets close to finishing, it schedules another whole file in the future, to avoid gaps in playback. To stop looping, you use the return value, a Latch, like this:
let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()
The AVAudioEngine docs from back in the iOS 8 days must have just been wrong. In the meantime, as a workaround, I noticed that if you instead use scheduleBuffer:atTime:options:completionHandler:, the callback is fired as expected (after playback finishes).
Example code:
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:nil];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when the playback is completed:
[self.audioUnitPlayer
    scheduleSegment:self.audioUnitFile
    startingFrame:sampleTime
    frameCount:(int)sampleLength
    atTime:0
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
        // do something here
    }];
The documentation doesn't say anything about how this works, but I tested it and it works for me.
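For reference, a rough Swift spelling of the same iOS 11+ call might look like this (a sketch only; playerNode, file, sampleTime, and sampleLength are assumed from the surrounding context):

playerNode.scheduleSegment(file,
                           startingFrame: sampleTime,
                           frameCount: AVAudioFrameCount(sampleLength),
                           at: nil,
                           completionCallbackType: .dataPlayedBack) { callbackType in
    // fires only after the segment's audio data has actually been played back
}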
I've been using this workaround for iOS 8-10:
- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not fully complete. However, if you call [myEngine stop], the file (or buffer) will complete to the end.
// audioFile here is our original audio
audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
    print("scheduleFile Complete")

    // compute how much of the file is still left to play at this point
    var delayInSeconds: Double = 0
    if let lastRenderTime = self.audioPlayerNode.lastRenderTime, let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {
        if let rate = rate {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate) / Double(rate)
        } else {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate)
        }
    }

    // schedule a stop timer for when the audio finishes playing
    // (DispatchTime.executeAfter appears to be a custom helper; DispatchQueue.main.asyncAfter would serve the same purpose)
    DispatchTime.executeAfter(seconds: delayInSeconds) {
        audioEngine.mainMixerNode.removeTap(onBus: 0)
        // playback has completed
    }
})
As of today, in a project with deployment target 12.4, on a device running 12.4.1, here's the way we found to successfully stop the nodes upon playback completion:
// audioFile and playerNode created here ...

playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    os_log(.debug, log: self.log, "%@", "Completing playing sound effect: \(filePath) ...")
    DispatchQueue.main.async {
        os_log(.debug, log: self.log, "%@", "... now actually completed: \(filePath)")
        self.engine.disconnectNodeOutput(playerNode)
        self.engine.detach(playerNode)
    }
}
The main difference with respect to the previous answers is postponing the node detaching to the main thread, instead of performing it on the callback thread (which I guess is the audio render thread?).