Xcode 6 audio problems - xcode

I recently updated Xcode from version 5 to 6.4, and now the simulator produces no sound, only error messages. Even the simplest code is affected: the app itself starts in the simulator, but there is no audio. The problem might be some setting, because all of this code ran fine before. I am using a brand new Mac mini, so I imported these projects from a different, older machine.
If anyone has run into a problem like this, please help!
The error is this:
2015-08-01 09:03:56.730 Testbutton[496:11659] 09:03:56.730 ERROR: 98: Error '!obj' trying to fetch default input device's sample rate
2015-08-01 09:03:56.731 Testbutton[496:11659] 09:03:56.731 ERROR: 100: Error getting audio input device sample rate: '!obj'
2015-08-01 09:03:56.731 Testbutton[496:11659] 09:03:56.731 WARNING: 230: The input device is 0x0; '(null)'
2015-08-01 09:03:56.731 Testbutton[496:11659] 09:03:56.731 WARNING: 234: The output device is 0x26; 'AppleHDAEngineOutput:1B,0,1,2:0'
2015-08-01 09:03:56.732 Testbutton[496:11659] 09:03:56.732 ERROR: 296: error '!obj'
2015-08-01 09:03:56.732 Testbutton[496:11659] 09:03:56.732 ERROR: 113: * * * NULL AQIONode object
2015-08-01 09:03:56.732 Testbutton[496:11403] 09:03:56.732 ERROR: 296: error -66680
2015-08-01 09:03:56.732 Testbutton[496:11403] 09:03:56.732 ERROR: >aq> 1595: failed (-66680); will stop (11025/0 frames)
2015-08-01 09:03:56.736 Testbutton[496:11659] 09:03:56.736 ERROR: 703: Can't make UISound Renderer
The code is this:
// ViewController.m
// Testbutton

#import "ViewController.h"

@interface ViewController ()
{
    AVAudioPlayer *_audioPlayer;
}
@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Construct URL to sound file
    NSString *path = [NSString stringWithFormat:@"%@/horn.wav", [[NSBundle mainBundle] resourcePath]];
    NSURL *soundUrl = [NSURL fileURLWithPath:path];

    // Create audio player object and initialize with URL to sound
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundUrl error:nil];
}

- (IBAction)buttonPressed:(id)sender {
    [_audioPlayer play];
}

- (void)dealloc {
    [audioPlayerPointer release];
    [super dealloc];
}

@end
and there is a header file too with this:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
    AVAudioPlayer* audioPlayerPointer;
}

- (IBAction)buttonPressed:(id)sender;

@end

AVAudioPlayer needs both a valid input device and a valid output device in order to set itself up. This isn't a problem on a real device (which has both built in), but it is a problem in the simulator on a Mac mini, which has no built-in microphone.
If you have a microphone, try plugging it into the "line in" port on your Mac mini (I found this probable solution in the comments on the answer to this very related question). Make sure to select "line in" as the input in System Preferences, too.
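As a diagnostic aid (my addition, not part of the answer above), a minimal sketch of the viewDidLoad setup that surfaces the initialization error instead of passing nil can show whether AVAudioPlayer itself fails to initialize or whether the failure lies entirely in the simulator's sound renderer:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Assumes horn.wav is bundled exactly as in the question.
    NSURL *soundUrl = [[NSBundle mainBundle] URLForResource:@"horn" withExtension:@"wav"];
    NSError *error = nil;
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundUrl error:&error];
    if (_audioPlayer == nil) {
        // If this logs, the player itself failed to initialize; if it stays silent,
        // the console errors come from the simulator's missing input device.
        NSLog(@"AVAudioPlayer init failed: %@", error);
    }
}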

Related

Failed attempt using Related Items to create backup file in sandboxed app

The App Sandbox design guide says:
The related items feature of App Sandbox lets your app access files that have the same name as a user-chosen file, but a different extension. This feature consists of two parts: a list of related extensions in the application’s Info.plist file and code to tell the sandbox what you’re doing.
My Info.plist defines a document type for .pnd files (the user-chosen file), as well as a document type for .bak files. The entry for the .bak files has, among other properties, the property NSIsRelatedItemType = YES.
I am trying to use Related Items to move an existing file to a backup file (change .pnd suffix to .bak suffix) when the user writes a new version of the .pnd file. The application is sandboxed. I am not proficient with sandboxing.
I am using PasteurOrgManager as the NSFilePresenter class for both the original and backup files:
@interface PasteurOrgData : NSObject <NSFilePresenter>
. . . .
@property (readonly, copy) NSURL *primaryPresentedItemURL;
@property (readonly, copy) NSURL *presentedItemURL;
@property (readwrite) NSOperationQueue *presentedItemOperationQueue;
@property (readwrite) NSFileCoordinator *fileCoordinator;
. . . .
- (void) doBackupOf: (NSString*) path;
. . . .
@end
The doBackupOf: method is as follows. Notice that it also sets the NSFilePresenter properties:
- (void) doBackupOf: (NSString*) path
{
    NSError *error = nil;
    NSString *appSuffix = @".pnd";
    NSURL *const pathAsURL = [NSURL URLWithString: [NSString stringWithFormat: @"file://%@", path]];
    NSString *const baseName = [pathAsURL lastPathComponent];
    NSString *const prefixToBasename = [path substringToIndex: [path length] - [baseName length] - 1];
    NSString *const baseNameWithoutExtension = [baseName substringToIndex: [baseName length] - [appSuffix length]];
    NSString *backupPath = [NSString stringWithFormat: @"%@/%@.bak", prefixToBasename, baseNameWithoutExtension];
    NSURL *const backupURL = [NSURL URLWithString: [NSString stringWithFormat: @"file://%@", backupPath]];

    // Move backup to trash — I am sure this will be my next challenge
    // (it's a no-op now because there is no pre-existing .bak file)
    [[NSFileManager defaultManager] trashItemAtURL: backupURL
                                  resultingItemURL: nil
                                             error: &error];

    // Move file to backup
    primaryPresentedItemURL = pathAsURL;
    presentedItemURL = backupURL;
    presentedItemOperationQueue = [NSOperationQueue mainQueue];
    [NSFileCoordinator addFilePresenter: self];
    fileCoordinator = [[NSFileCoordinator alloc] initWithFilePresenter: self]; // error here
    [self backupItemWithCoordinationFrom: pathAsURL
                                      to: backupURL];
    [NSFileCoordinator removeFilePresenter: self];
    fileCoordinator = nil;
}
The backupItemWithCoordinationFrom: method does the heavy lifting, basically:
[fileCoordinator coordinateWritingItemAtURL: from
                                    options: NSFileCoordinatorWritingForMoving
                                      error: &error
                                 byAccessor: ^(NSURL *oldURL) {
    [self.fileCoordinator itemAtURL: oldURL willMoveToURL: to];
    [[NSFileManager defaultManager] moveItemAtURL: oldURL
                                            toURL: to
                                            error: &error];
    [self.fileCoordinator itemAtURL: oldURL didMoveToURL: to];
}];
but the code doesn't make it that far. I have traced through it, and the URL variables are what I expect and look reasonable. At the point marked "error here" in the code above, where I allocate the NSFileCoordinator with the file presenter, I get:
NSFileSandboxingRequestRelatedItemExtension: an error was received from pboxd instead of a token. Domain: NSPOSIXErrorDomain, code: 1
[presenter] +[NSFileCoordinator addFilePresenter:] could not get a sandbox extension. primaryPresentedItemURL: file:///Users/cope/Me.pnd, presentedItemURL: file:///Users/cope/Me.bak
Any help is appreciated.
(I have read related posts Where can a sandboxed Mac app save files? and Why do NSFilePresenter protocol methods never get called?. I have taken note of several other sandboxing-related posts that don't seem relevant to this issue.)
MacBook Pro, macOS 10.13.5, Xcode Version 9.3 (9E145)
Don't read too much into advice about avoiding sandboxing. Most explanations wander away from the most obvious problem: instead of explaining the pitfalls that rightfully trigger the sandbox, they mostly explain how to bypass the sandbox altogether, which is not a solution, it is a threat.
The most obvious problem here is handing over a URL string that still needs properly escaped characters before you transform it into an NSURL.
So your NSString beginning with "file://" should be run through something like
NSString *encodeStringForURL = [yourstring stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
before you transform it into an NSURL with
NSURL *fileurl = [NSURL URLWithString:encodeStringForURL];
NSString *output = fileurl.absoluteString;
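Alternatively (my addition, a common pattern rather than part of the answer above), building the URLs with fileURLWithPath: sidesteps the percent-encoding problem entirely, because NSURL escapes filesystem paths for you. A minimal sketch, assuming the same path and backupPath strings as in the question:
// Construct file URLs directly from plain filesystem paths
// instead of formatting "file://" strings by hand.
NSURL *const pathAsURL = [NSURL fileURLWithPath: path];
NSURL *const backupURL = [NSURL fileURLWithPath: backupPath];
// fileURLWithPath: percent-escapes spaces and other special characters,
// so the URLs handed to NSFileCoordinator and the sandbox are well-formed.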

How do I add background music to my SpriteKit file?

Could someone give me a quick, easy, step-by-step guide to adding background music once my app has loaded? It is a SpriteKit Xcode project, and the music is in .m4a format. Thanks.
Try with this:
@import AVFoundation;
...
AVAudioPlayer *backgroundMusicPlayer;
NSError *error;
NSURL *backgroundMusicURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"m4a"];
backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
backgroundMusicPlayer.numberOfLoops = -1; // -1 = infinite loop
[backgroundMusicPlayer prepareToPlay];
[backgroundMusicPlayer play];
and to stop simply
[backgroundMusicPlayer stop];
Note: I don't use SKAction to play background music because you can't stop it whenever you want.
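To tie this into the SpriteKit side (my addition, not part of the answer above; the scene class name and the strong instance variable are assumptions), the player can be created when the scene appears, for example in -didMoveToView:, and kept in an instance variable so it is not deallocated:
// GameScene.m (sketch; assumes song.m4a is in the app bundle)
@import AVFoundation;
#import <SpriteKit/SpriteKit.h>

@interface GameScene : SKScene
@end

@implementation GameScene
{
    AVAudioPlayer *_backgroundMusicPlayer; // strong ivar keeps the player alive
}

- (void)didMoveToView:(SKView *)view
{
    NSError *error = nil;
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"m4a"];
    _backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    _backgroundMusicPlayer.numberOfLoops = -1; // loop until explicitly stopped
    [_backgroundMusicPlayer prepareToPlay];
    [_backgroundMusicPlayer play];
}

@end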
You can use AVAudioPlayer for this purpose:
In your .h:
#import <AVFoundation/AVFoundation.h>
and add the following to the interface:
AVAudioPlayer *player;
In your .m, initialize the player with the audio path URL:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"bg_music"
                                                                    ofType:@"mp3"]];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
player.numberOfLoops = -1;
and when you need to play the audio, you can call:
[player play];
Note:
"numberOfLoops" is the number of times that the sound will return to the beginning upon reaching the end.
A value of zero means to play the sound just once.
A value of one will result in playing the sound twice, and so on...
Any negative number will loop indefinitely until stopped.
Keep Coding................ :)

Play audio from app, even when it is in the background

I am having difficulty working out how to play audio in the background in my iOS app. I have an NSTimer and want a sound to play when it reaches 5 minutes, even when the app is in the background. I already have the audio background mode enabled, but I am unsure how to accomplish this.
Under Capabilities, turn on Background Modes and check the Audio option.
In your Info.plist, make sure "Application does not run in background" is set to NO.
Then add the following to your AppDelegate.m file.
NSError *sessionError = nil;
NSError *activationError = nil;
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive: YES error: &activationError];
In the following method:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
return YES;
}
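Putting those two pieces together (a sketch of the answer's suggestion with error logging added, not verbatim from the answer):
// AppDelegate.m (sketch)
#import <AVFoundation/AVFoundation.h>

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    NSError *sessionError = nil;
    NSError *activationError = nil;

    // The Playback category keeps audio running when the app moves to the background
    // (it requires the "audio" entry under UIBackgroundModes in Info.plist).
    if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError]) {
        NSLog(@"Could not set audio session category: %@", sessionError);
    }
    if (![[AVAudioSession sharedInstance] setActive:YES error:&activationError]) {
        NSLog(@"Could not activate audio session: %@", activationError);
    }

    return YES;
}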

MPMoviePlayerController doesn't work after upgrading to iOS 5

This code works perfectly on iPad 4.3 Simulator:
NSString *source = [mediaObject objectForKey:@"source"];
NSString *videoPath = [NSString stringWithFormat:@"%@/%@", path, source];
NSURL *videoUrl = [NSURL fileURLWithPath:videoPath];

MPMoviePlayerController *videoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoUrl];
videoPlayer.shouldAutoplay = NO;
videoPlayer.view.frame = CGRectMake(xPos, yPos, width, height);
[backgroundImageView addSubview:videoPlayer.view];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(videoPlaybackStateDidChange:)
                                             name:MPMoviePlayerPlaybackStateDidChangeNotification
                                           object:videoPlayer];
but it doesn't work on the iPad 5.0 Simulator. I get a black frame with no movie and no playback controls.
I read Apple's changelog for MPMoviePlayerController, but I didn't find anything about this problem. Can you help me?
I solved the problem in this way: in my header file I wrote:
MPMoviePlayerController *moviePlayer;
with this property:
@property(nonatomic, strong) MPMoviePlayerController *moviePlayer;
and in the method in which I init the moviePlayer:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieUrl];
self.moviePlayer = player;
It seems that assigning the player to a property "saves" the player. But don't ask me why...
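The likely reason (my reading, not stated in the answer): under ARC a locally scoped MPMoviePlayerController is released as soon as the method returns, and adding its view to the hierarchy does not retain the controller itself, so playback silently stops. Keeping a strong reference avoids that; a minimal sketch, assuming a setup method like the one described in the question (the method name is hypothetical):
// Strong property keeps the controller alive after the setup method returns.
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;

- (void)setUpPlayerWithURL:(NSURL *)movieUrl   // hypothetical method name
{
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieUrl];
    self.moviePlayer.shouldAutoplay = NO;
    [self.view addSubview:self.moviePlayer.view];
    // With only a local variable, the view would survive this method
    // but the controller would not, leaving a black frame with no controls.
}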
You don't mention what type of URL you are trying to play, however, if it's an HTTP Live Streaming resource (.m3u8 file), then be aware that iOS 5.0 seems to have tightened up on validating the contents of the m3u8 index file.
Specifically, I've discovered that:
No individual segment can be more than twice as long as the #EXT-X-TARGETDURATION value;
The #EXTINF value (segment length in seconds) can now only be an integer value.
If one of these is your problem, running your application under the iOS 5.0 simulator should show a warning in the debugger console.
For HLS on iOS 5, the TARGETDURATION value is not really a target duration; it must be the maximum duration, so set it to the length of the longest segment in the playlist.
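As an illustration (a constructed example, not taken from either answer), an index file that satisfies both rules declares a target duration at least as long as the longest segment and uses whole-second EXTINF values:
#EXTM3U
# Target duration is the maximum segment length; no segment may exceed twice this value.
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
# Per the answers above, iOS 5.0 only accepts integer EXTINF durations.
#EXTINF:10,
segment0.ts
#EXTINF:10,
segment1.ts
#EXT-X-ENDLIST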

&error error - iOS dev

I am trying to create an AVCaptureSession. I based my code on the WWDC 2011 video, number 419.
I have the following line, which is exactly the same as the code in the WWDC 2011 video and also identical to the code here: http://www.bardecode.com/en/knowledge-base/214-detailed-description-of-work-around-for-avcapturevideopreviewlayer-problem-in-ios-41.html
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
But Xcode says that the &error is the use of an undeclared identifier.
This is because you haven't defined the NSError variable whose address you're providing when you write &error.
If you define the variable via...
NSError *error = nil;
...on the line before, all should be well.
As a bit of an explanation, if you look at the signature for the AVCaptureDeviceInput deviceInputWithDevice:error: method you'll see the following:
+ (id)deviceInputWithDevice:(AVCaptureDevice *)device error:(NSError **)outError
In other words, this method expects the address of an NSError pointer variable to be provided as the outError parameter.
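Pulling that together (a sketch of the corrected snippet with an added failure check; the device lookup line is an assumption based on the usual capture-session setup, not something shown in the question):
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;   // declare the variable before taking its address

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
if (input == nil) {
    // deviceInputWithDevice:error: fills in `error` when it fails
    NSLog(@"Could not create capture input: %@", error);
}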

Resources