Swift 2 SpriteKit: Issue with background music not resuming when the game is unpaused - Xcode

First of all I would like to thank anyone in advance for any help I get.
I have searched far and wide across the net and cannot find a solution to my issue. My issue is with an iOS game I am building using the SpriteKit framework. I have added a background song using an SKAudioNode and it works fine initially, but when I pause and play the game within a few seconds, the music does not begin playing again. I have tried lots of things like removing the SKAudioNode when the game is paused and adding it again when the game is resumed, but nothing has worked. I have posted a snippet of my code below keeping it as relevant as possible:
class GameScene: SKScene, SKPhysicsContactDelegate {
    var backgroundMusic = SKAudioNode(fileNamed: "bg.mp3")
    let pauseImage = SKTexture(imageNamed: "pause.png")
    var pauseButton: SKSpriteNode!
    let playImage = SKTexture(imageNamed: "play2.png")
    var playButton: SKSpriteNode!

    override func didMoveToView(view: SKView) {
        self.addChild(backgroundMusic)

        // create pause button
        pauseButton = SKSpriteNode(texture: pauseImage)
        pauseButton.position = CGPoint(x: self.size.width - pauseButton.size.width, y: pauseButton.size.height)
        pauseButton.zPosition = 1
        pauseButton.name = "pauseButton"
        self.addChild(pauseButton)

        // create play button
        playButton = SKSpriteNode(texture: playImage)
        playButton.position = CGPoint(x: self.size.width - playButton.size.width, y: -playButton.size.height)
        playButton.zPosition = 1
        playButton.name = "playButton"
        self.addChild(playButton)
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            if pauseButton.containsPoint(touch.locationInNode(self)) {
                let blockOne = SKAction.runBlock({
                    self.pauseButton.position.y = -self.pauseButton.size.height
                    self.playButton.position.y = self.playButton.size.height
                })
                let blockTwo = SKAction.runBlock({
                    self.view?.paused = true
                })
                self.runAction(SKAction.sequence([blockOne, blockTwo]))
            }
            else if playButton.containsPoint(touch.locationInNode(self)) {
                self.playButton.position.y = -self.playButton.size.height
                self.pauseButton.position.y = self.pauseButton.size.height
                self.view?.paused = false
            }
        }
    }
}

You'll want to look into using AVAudioPlayer and NSNotificationCenter to pass data around the game.
Start the background audio player in your actual GameViewController class.
It's better to do it this way than to use SKAudioNode, which is better suited to sound effects tied to something that happened in gameplay.
The advantage of AVAudioPlayer is that when the music is paused it stays cued up at its previous spot.
The background music is one of the few things that should keep running regardless of what's going on, so we put it in the GameViewController.
So here's an example of the GameViewController code we'd need to start:
import AVFoundation
var bgSoundPlayer: AVAudioPlayer?
Then, in the GameViewController, we add these functions:
override func viewDidLoad() {
    super.viewDidLoad()

    // "PlayBackgroundSound" and "PauseBackgroundSound" are the notifications other scenes post to drive the audio player //
    // NSNotificationCenter is used to pass data throughout the game //
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.playBackgroundSound(_:)), name: "PlayBackgroundSound", object: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(GameViewController.pauseBackgroundSound), name: "PauseBackgroundSound", object: nil)
}
func playBackgroundSound(notification: NSNotification) {
    let name = notification.userInfo!["fileToPlay"] as! String
    if (bgSoundPlayer != nil) {
        bgSoundPlayer!.stop()
        bgSoundPlayer = nil
    }
    if (name != "") {
        let fileURL: NSURL = NSBundle.mainBundle().URLForResource(name, withExtension: "mp3")!
        do {
            bgSoundPlayer = try AVAudioPlayer(contentsOfURL: fileURL)
        } catch _ {
            bgSoundPlayer = nil
        }
        bgSoundPlayer!.volume = 1
        bgSoundPlayer!.numberOfLoops = -1 // -1 will loop it forever //
        bgSoundPlayer!.prepareToPlay()
        bgSoundPlayer!.play()
    }
}

func pauseBackgroundSound() {
    if (bgSoundPlayer != nil) {
        bgSoundPlayer!.pause()
    }
}
Then, when you want to use the audio player from your pause or resume button functions, remember that every scene that uses the player needs the audio session set up, for example in didMoveToView:
import AVFoundation.AVAudioSession

override func didMoveToView(view: SKView) {
    try! AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
}
Then, whenever you want to pause or play something, just use NSNotificationCenter.
// To pause, use this line of code in the function where you need to pause the music
NSNotificationCenter.defaultCenter().postNotificationName("PauseBackgroundSound", object: self)
// To play initially, or to start a new song, use this line of code in the function where you need to play the music
NSNotificationCenter.defaultCenter().postNotificationName("PlayBackgroundSound", object: self, userInfo: ["fileToPlay": "FILE NAME OF BACKGROUND MUSIC"])

Related

How to get AVPlayer to redraw current AVItem videoComposition when paused

I'm building a simple video editor for macOS: a movie file is loaded as an AVAsset, transformed by a series of CIFilters in an AVVideoComposition, and played by an AVPlayer. I present UI controls for some of the parameters of the CIFilters.
When the video is playing everything works great: I slide sliders and the effects change! But when the video is paused, the AVPlayerView doesn't redraw after the controls in the UI are changed.
How can I encourage the AVPlayerView to redraw the contents of the videoComposition of its current item when it's paused?
class ViewController: NSViewController {
    @objc @IBAction func openDocument(_ file: Any) { ... }

    @IBOutlet weak var moviePlayerView: AVPlayerView!

    var ciContext: CIContext? = CIContext(mtlDevice: MTLCreateSystemDefaultDevice()!)
    var sliderValue: Double = 0.0

    @IBAction func sliderMoved(_ sender: NSSlider) {
        self.sliderValue = sender.doubleValue
        // need to update the view here when paused
        self.moviePlayerView.setNeedsDisplay(self.moviePlayerView.bounds)
        // setNeedsDisplay has no effect.
    }

    func loadMovie(file: URL) {
        let avMovie = AVMovie(url: file)
        let avPlayerItem = AVPlayerItem(asset: avMovie)
        avPlayerItem.videoComposition = AVVideoComposition(asset: avMovie) { request in
            let output = request.sourceImage.applyingGaussianBlur(sigma: self.sliderValue)
            request.finish(with: output, context: self.ciContext)
        }
        self.moviePlayerView.player = AVPlayer(playerItem: avPlayerItem)
    }
}
It turns out that re-setting the AVPlayerItem's videoComposition property will trigger the item to redraw. This works even if you set the property to its current value: item.videoComposition = item.videoComposition. The property setter appears to have undocumented side effects.
To fix the sample code above, do this:
@IBAction func sliderMoved(_ sender: NSSlider) {
    self.sliderValue = sender.doubleValue
    // update the view when paused
    if self.moviePlayerView.player?.rate == 0.0 {
        self.moviePlayerView.player?.currentItem?.videoComposition = self.moviePlayerView.player?.currentItem?.videoComposition
    }
}
Hopefully someone finds this useful!

Unable to detect QR Code

I have tried to make an app that scans QR codes with iOS 10 and Swift 3. However, my QRScannerController does not detect the QR code; it only shows the camera view.
I don't understand what's wrong with the code. Here is the implementation of the controller:
import UIKit
import AVFoundation

class QRScannerController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    @IBOutlet var messageLabel: UILabel!
    @IBOutlet var topbar: UIView!

    //TESTING
    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var qrCodeFrameView: UIView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        // Get an instance of the AVCaptureDevice class to initialize a device object and provide the video as the media type parameter
        let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        do {
            view.bringSubview(toFront: messageLabel)
            view.bringSubview(toFront: topbar)
            // Get an instance of the AVCaptureDeviceInput class using the previous device object
            let input = try AVCaptureDeviceInput(device: captureDevice)
            // Initialize the captureSession object
            captureSession = AVCaptureSession()
            // Set the input device on the capture session
            captureSession?.addInput(input)
            // Initialize an AVCaptureMetadataOutput object and set it as the output of the capture session
            let captureMetadataOutput = AVCaptureMetadataOutput()
            captureSession?.addOutput(captureMetadataOutput)
            // Set delegate and use the default dispatch queue to execute the callback
            captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            captureMetadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode]
            // Initialise the video preview layer and add it as a sublayer to the viewPreview view's layer
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            videoPreviewLayer?.frame = view.layer.bounds
            view.layer.addSublayer(videoPreviewLayer!)
            // Start video capture
            captureSession?.startRunning()
            // Initialize QR Code Frame to highlight the QR code
            qrCodeFrameView = UIView()
            if let qrCodeFrameView = qrCodeFrameView {
                qrCodeFrameView.layer.borderColor = UIColor.green.cgColor
                qrCodeFrameView.layer.borderWidth = 2
                view.addSubview(qrCodeFrameView)
                view.bringSubview(toFront: qrCodeFrameView)
            }

            func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
                // Check if the metadataObjects array is not nil and it contains at least one object.
                if metadataObjects == nil || metadataObjects.count == 0 {
                    qrCodeFrameView?.frame = CGRect.zero
                    messageLabel.text = "No QR code is detected"
                    return
                }
                // Get the metadata object.
                let metadataObj = metadataObjects[0] as! AVMetadataMachineReadableCodeObject
                if metadataObj.type == AVMetadataObjectTypeQRCode {
                    // If the found metadata is equal to the QR code metadata then update the status label's text and set the bounds
                    let barCodeObject = videoPreviewLayer?.transformedMetadataObject(for: metadataObj)
                    qrCodeFrameView?.frame = barCodeObject!.bounds
                    if metadataObj.stringValue != nil {
                        messageLabel.text = metadataObj.stringValue
                    }
                }
            }
        } catch {
            // If any error occurs, simply print it out
            print(error)
            return
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Your message label is ending up underneath the other content in your view controller's view. Just bringing the message label to the front at the end of viewDidLoad should help.
I have created a sample project and it was working fine on an iPhone; please take a look here.
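For illustration, here's a minimal sketch of that reordering, using the same outlets and preview layer from the question; the key is that the bring-to-front calls run after the preview layer has been added:
view.layer.addSublayer(videoPreviewLayer!)
// bring the overlay views back above the camera preview
view.bringSubview(toFront: messageLabel)
view.bringSubview(toFront: topbar)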

How to add an SKNode on a loop that can be toggled on and off

I am creating a game and I keep getting an error from my bullet spawn method, which is linked to a joystick. I want to spawn the bullet node repeatedly while the joystick is active. Here is how I am creating the firing method:
class GameScene: SKScene, SKPhysicsContactDelegate {
    let bullet1 = SKSpriteNode(imageNamed: "bullet.png")

    override func didMoveToView(view: SKView) {
        if fireWeapon == true {
            NSTimer.scheduledTimerWithTimeInterval(0.25, target: self,
                selector: Selector("spawnBullet1"), userInfo: nil, repeats: true)
        }
    }

    func spawnBullet1() {
        self.addChild(bullet1)
        bullet1.position = CGPoint(x: hero.position.x, y: hero.position.y)
        bullet1.xScale = 0.5
        bullet1.yScale = 0.5
        bullet1.physicsBody = SKPhysicsBody(rectangleOfSize: bullet1.size)
        bullet1.physicsBody?.categoryBitMask = PhysicsCategory.bullet1
        bullet1.physicsBody?.contactTestBitMask = PhysicsCategory.enemy1
        bullet1.physicsBody?.affectedByGravity = false
        bullet1.physicsBody?.dynamic = false
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            let location = touch.locationInNode(self)
            let node = nodeAtPoint(location)
            if (CGRectContainsPoint(joystick.frame, location)) {
                stickActive = true
                if stickActive == true {
                    fireWeapon = true
                }
            }
        }
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        fireWeapon = false
    }
}
The first bullet launches as planned and works great; however, when the second bullet launches the app crashes and I get the error "Attemped to add a SKNode which already has a parent". Can someone suggest an alternative method?
Your problem is exactly what the error says: you need to remove the bullet from its parent before adding it again, or make bullet1 a local constant inside the spawnBullet1 function, so that each time you call the function a new bullet gets created and added as a child to the scene instead of re-adding the same one.
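As a minimal sketch of the second approach (Swift 2, reusing the question's names such as hero and PhysicsCategory), each call creates and adds a fresh node, so nothing is ever re-added:
func spawnBullet1() {
    let bullet = SKSpriteNode(imageNamed: "bullet.png") // new node on every call
    bullet.position = CGPoint(x: hero.position.x, y: hero.position.y)
    bullet.xScale = 0.5
    bullet.yScale = 0.5
    bullet.physicsBody = SKPhysicsBody(rectangleOfSize: bullet.size)
    bullet.physicsBody?.categoryBitMask = PhysicsCategory.bullet1
    bullet.physicsBody?.contactTestBitMask = PhysicsCategory.enemy1
    bullet.physicsBody?.affectedByGravity = false
    bullet.physicsBody?.dynamic = false
    self.addChild(bullet)
}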

How to play a sound just one time in Swift

I'm having a little problem with my code, which is supposed to play a sound when I start my app. But here's the problem: every time I go back to the first screen the sound plays again, and I want it to play just one time, when the menu screen pops up for the first time.
Here's my code
var bubbleSound: SystemSoundID!
bubbleSound = createBubbleSound()
AudioServicesPlaySystemSound(bubbleSound)
(...)
The function:
func createBubbleSound() -> SystemSoundID {
    var soundID: SystemSoundID = 0
    let soundURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(), "bubble", "wav", nil)
    AudioServicesCreateSystemSoundID(soundURL, &soundID)
    return soundID
}
You can define a struct like this (source):
struct MyViewState {
    static var hasPlayedSound = false
}
Then in your viewDidLoad:
if (!MyViewState.hasPlayedSound) {
    var bubbleSound: SystemSoundID!
    bubbleSound = createBubbleSound()
    AudioServicesPlaySystemSound(bubbleSound)
    MyViewState.hasPlayedSound = true
}
You can then modify MyViewState.hasPlayedSound and allow the UIViewController to play the sound again if desired.
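For instance, if some later screen should re-arm the sound, resetting the flag is enough (illustrative only):
// e.g. when leaving the game, if you want the menu sound to play once more
MyViewState.hasPlayedSound = false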

AVPlayerItem videoComposition freeze on iOS 8

I am playing an AVMutableVideoComposition with AVPlayer, and since iOS 8 everything had been perfectly fine.
But now the video starts playing and, after 4 or 5 seconds, it stops, as if it were buffering; the sound keeps playing, and when the video ends the AVPlayer loops and plays it again without stopping.
I have no clue how to fix this issue.
Any help would be appreciated.
Thank you
I had the same issue, but I found the solution.
You should start playback only after the playerItem's status has changed to .ReadyToPlay.
I also answered a similar issue here; please see below.
func startVideoPlayer() {
    let playerItem = AVPlayerItem(asset: self.composition!)
    playerItem.videoComposition = self.videoComposition!

    let player = AVPlayer(playerItem: playerItem)
    player.actionAtItemEnd = .None

    videoPlayerLayer = AVPlayerLayer(player: player)
    videoPlayerLayer!.frame = self.bounds

    /* add playerItem's observer */
    player.addObserver(self, forKeyPath: "player.currentItem.status", options: .New, context: nil)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    self.layer.addSublayer(videoPlayerLayer!)
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath != nil && keyPath! == "player.currentItem.status" {
        if let newValue = change?[NSKeyValueChangeNewKey] {
            if AVPlayerStatus(rawValue: newValue as! Int) == .ReadyToPlay {
                playVideo() /* play after status is changed to .ReadyToPlay */
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}

func playerItemDidReachEnd(notification: NSNotification) {
    let playerItem = notification.object as! AVPlayerItem
    playerItem.seekToTime(kCMTimeZero)
    playVideo()
}

func playVideo() {
    videoPlayerLayer?.player!.play()
}
Same here; I don't know if it counts as an answer, but anyway: just use
[AVPlayer seekToTime:AVPlayer.currentItem.duration]; to perform the first loop yourself and avoid the AVPlayer stopping. That's the only way I found.
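In Swift 2 that same workaround would look roughly like this, assuming player is the AVPlayer in question (a hypothetical property name):
// seek to the end once so the player performs its first loop itself and avoids the stall
player.seekToTime(player.currentItem!.duration)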
I was having the same problem and I solved it this way:
Instead of applying the videoComposition to the AVPlayerItem directly, I exported my video using AVAssetExportSession to apply the videoComposition, which in turn gives me a URL to a video with all of my videoComposition applied, and now I can use this URL to play the video with AVPlayer.
This works like a charm.
Exporting the video takes time, but since rendering the video composition no longer happens on the fly, playback is smooth.
The following code can be used to export the video:
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1280x720];
export.videoComposition = videoComposition;
export.outputURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[NSUUID new].UUIDString] stringByAppendingPathExtension:@"MOV"]];
export.outputFileType = AVFileTypeQuickTimeMovie;
export.shouldOptimizeForNetworkUse = YES;

[export exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            completionHander(export.outputURL, nil);
        } else {
            completionHander(nil, export.error);
        }
    });
}];
