AVAudioPlayer memory leak - Xcode

I'm stuck on a weird memory-leak problem related to AVAudioPlayer, and I need help after trying everything that came to mind.
Here is a short description of the problem; the code appears right after.
I initialize my player and start playing a sound track in an endless loop (an endless loop or a one-time play did not change the problem).
Several seconds after the music starts, I switch to another sound track: I create a new player, initialize it, release the old one (which is still playing), then set the new one in its place and play it.
At that point (right after I call [Player play] on the new player) I get a memory leak of 3.5 KB.
I tried the following:
Stopping the old player and then releasing it - no effect.
Releasing the player right after the play instruction - it did not start playing.
Releasing the old player twice - crash.
The memory leak DOES NOT happen when I create and play the first player!
Also, the reference does say that play is asynchronous, so it probably increases the retain count by 1 - but in that case, why didn't [Player stop] help?
Thanks,
Here are the relevant parts of the code showing how I use it:
- (void)loadAndActivateAudioFunction {
    NSBundle *mainBundle = [NSBundle mainBundle];
    NSError *error = nil;
    NSURL *audioURL = [NSURL fileURLWithPath:[mainBundle pathForResource:Name ofType:Type]];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:&error];
    if (!player) {
        DebugLog(@"Audio Load Error: no Player: %@", [error localizedDescription]);
        DuringAudioPrep = false;
        return;
    }
    [self lock];
    [self setAudioPlayer:player];
    [self ActivateAudioFunction];
    [self unlock];
}
- (void)setAudioPlayer:(AVAudioPlayer *)player {
    if (Player) {
        if ([Player isPlaying] || Repeat) // The indication was off???
            [Player stop];
        [Player release];
    }
    Player = player;
}
- (void)ActivateAudioFunction {
    [Player setVolume:Volume];
    [Player setNumberOfLoops:Repeat];
    [Player play];
    DuringAudioPrep = false;
}

Here is a method to create an AVAudioPlayer without causing memory leaks (see this page for an explanation).
I have confirmed in my app that this removed my AVAudioPlayer leaks 100%.
- (AVAudioPlayer *)audioPlayerWithContentsOfFile:(NSString *)path {
    NSData *audioData = [NSData dataWithContentsOfFile:path];
    // alloc and init are split deliberately, so a failed init can still be balanced with a release
    AVAudioPlayer *player = [AVAudioPlayer alloc];
    if ([player initWithData:audioData error:NULL]) {
        [player autorelease];
    } else {
        [player release];
        player = nil;
    }
    return player;
}
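For instance (the resource name here is just an illustration), a caller would do something like:
NSString *path = [[NSBundle mainBundle] pathForResource:@"track" ofType:@"mp3"];
AVAudioPlayer *player = [[self audioPlayerWithContentsOfFile:path] retain]; // retain it if it must outlive the current autorelease pool
[player play];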

Implement the AVAudioPlayerDelegate protocol and its audioPlayerDidFinishPlaying:successfully: method, then release the audio player object there, e.g.:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [player release]; // releases the player object
}
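For this callback to fire, the class has to adopt <AVAudioPlayerDelegate> and be set as the player's delegate where the player is created - a minimal sketch:
player.delegate = self; // self must conform to <AVAudioPlayerDelegate>
[player play]; // the release in the callback above then balances the player's alloc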

Your code looks OK to me as far as I've seen, so maybe there is code elsewhere that is causing the problem.
I will say that you're using a somewhat odd idiom. Rather than retaining on create and releasing on set, I'd do something like this:
// New players will always be created autoreleased.
AVAudioPlayer *player = [[[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:&error] autorelease];

- (void)setAudioPlayer:(AVAudioPlayer *)player
{
    if (Player)
    {
        if ([Player isPlaying] || Repeat)
            [Player stop];
        [Player release];
    }
    Player = [player retain];
}
This way, you only retain player objects when they actually come into your setAudioPlayer: method, which might make it easier to track down.
Also, verify that it is actually an AVAudioPlayer object that is leaking; Instruments should be able to confirm this for you.

Try adding MediaPlayer.framework to your project
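For what it's worth, AVAudioPlayer itself is declared in AVFoundation, so at minimum that framework needs to be linked and imported for the class to be available:
#import <AVFoundation/AVFoundation.h>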

Related

Passing video frames to Core Image on OS X

Hi all you awesome coders! I've put this together from various helpful sources over the last couple of weeks (including a lot of posts from Stack Overflow), trying to create something that will take a webcam feed and detect smiles when they occur (I might as well draw boxes around the faces and the smiles as well; that doesn't seem like it would be hard once they are detected). Please give me some leeway if the code is messy, because I'm still very much learning.
Currently I'm stuck trying to pass each frame to a CIImage so it can be analysed for faces (I plan to deal with smiles after the face hurdle is overcome). As it is, the build succeeds if I comment out the block after (5), and it brings up a simple AVCaptureVideoPreviewLayer in a window. I think this is what I've called "rootLayer": it's the first layer of the displayed output, and after I detect faces in the video frames I'll show a rectangle following the "bounds" of any detected face in a new layer overlaid on top of it, which I've called "previewLayer"... correct?
But with the block after (5) in place, the build fails with three linker errors:
Undefined symbols for architecture x86_64:
"_CMCopyDictionaryOfAttachments", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
"_CMSampleBufferGetImageBuffer", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Can anyone tell me where I'm going wrong and what my next steps are?
Thanks for any help; I've been stuck at this point for a couple of days and can't figure it out. All the examples I can find are for iOS and don't work on OS X.
- (id)init
{
    self = [super init];
    if (self) {
        // Create a capture session
        session = [[AVCaptureSession alloc] init];
        // Set a session preset (resolution)
        self.session.sessionPreset = AVCaptureSessionPreset640x480;
        // The output part is moved to another function; note it must run after the
        // session exists, otherwise canAddOutput: below is sent to nil
        [self addVideoDataOutput];
        // Select devices if any exist
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (videoDevice) {
            [self setSelectedVideoDevice:videoDevice];
        } else {
            [self setSelectedVideoDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed]];
        }
        NSError *error = nil;
        // Add an input
        videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        [self.session addInput:self.videoDeviceInput];
        // Start the session (the app opens slower with it here, but I think it is needed in order to send the frames for processing)
        [[self session] startRunning];
        // Initial refresh of the device list
        [self refreshDevices];
    }
    return self;
}
- (void)addVideoDataOutput {
    // (1) Instantiate a new video data output object
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    // Discard late frames if the data output queue is blocked (while CI processes the still image)
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    // (2) The sample buffer delegate requires a serial dispatch queue
    dispatch_queue_t captureOutputQueue;
    captureOutputQueue = dispatch_queue_create("CaptureOutputQueue", DISPATCH_QUEUE_SERIAL);
    [captureOutput setSampleBufferDelegate:self queue:captureOutputQueue];
    // Balances the dispatch_queue_create above; the output retains the queue for as
    // long as it needs it, so dropping our reference here is fine.
    dispatch_release(captureOutputQueue);
    // (3) Define the pixel format for the video data output (this repeats the setting from (1))
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = @{ key : value };
    [captureOutput setVideoSettings:settings];
    // (4) Configure the output port on the captureSession property
    if ([self.session canAddOutput:captureOutput])
        [session addOutput:captureOutput];
}
// Implement the sample buffer delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // I *think* I have a video frame now in some sort of image format... so it has to be converted into a CIImage before it can be processed:
    // (5) Convert the CMSampleBufferRef to a CVImageBufferRef, then to a CIImage (per weichsel's answer in July '13)
    CVImageBufferRef cvFrameImage = CMSampleBufferGetImageBuffer(sampleBuffer); // Having trouble here: the build stops and won't recognise CMSampleBufferGetImageBuffer.
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage options:(__bridge NSDictionary *)attachments];
    if (attachments) CFRelease(attachments); // balances the Copy above
    //self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage];
    // OK, so it is a CIImage. Find some way to send it to a separate function to find the faces, then smiles. Then send it somewhere else to be displayed on top of the AVCaptureVideoPreviewLayer.
    //TBW
}
- (NSString *)windowNibName
{
    return @"AVRecorderDocument";
}
- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    // Attach the preview to the session
    CALayer *rootLayer = self.previewView.layer;
    [rootLayer setMasksToBounds:YES]; // aaron added
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    [self.previewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    [self.previewLayer setFrame:[rootLayer bounds]];
    //[newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; // don't think I need this for OS X?
    [rootLayer addSublayer:previewLayer];
    // [newPreviewLayer release]; // an MRC leftover; under ARC no manual release is needed
}
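As an aside, for the face-finding step marked //TBW in (5), a minimal sketch using CIDetector might look like the following (in a real app the detector would be created once and reused rather than built per frame; smile detection via the CIDetectorSmile option and CIFaceFeature's hasSmile needs OS X 10.9 or later):
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
NSArray *features = [detector featuresInImage:self.ciFrameImage];
for (CIFaceFeature *face in features) {
    NSLog(@"face bounds: %@", NSStringFromRect(NSRectFromCGRect(face.bounds)));
}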
(Moved from the comments section.)
Wow. I guess two days and one Stack Overflow post is what it takes to figure out that I hadn't added CoreMedia.framework to my project.
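For anyone hitting the same linker errors: CMSampleBufferGetImageBuffer and CMCopyDictionaryOfAttachments are CoreMedia symbols, so CoreMedia.framework has to be linked into the target and its header imported:
#import <CoreMedia/CoreMedia.h>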

UIView layer renderInContext Memory not getting released

We are trying to create multiple PDF files using the UIView layer's renderInContext: method.
The code below runs inside an autorelease pool:
---loop
UIGraphicsBeginPDFContextToFile(documentDirectoryFilename, CGRectZero, nil);
UIGraphicsBeginPDFPage();
CGContextRef pdfContext = UIGraphicsGetCurrentContext();
[view.layer renderInContext:pdfContext];
view = nil;
pdfContext = nil;
UIGraphicsEndPDFContext();
---loop ends
After a couple of iterations the resident memory increases to 20 MB, and each subsequent iteration adds to the total memory used.
Somehow ARC is not releasing the memory used in rendering the PDF, and so the application crashes with low memory.
The application is using ARC.
Any help or pointers to resolve the issue would be much appreciated.
Thanks.
UPDATE:
Thanks for the quick reply.
My apologies for the typo in the pseudocode.
Here is sample code from a test project.
- (BOOL)addUIViewToPDFFile:(UIView *)view newBounds:(CGRect)newBounds pdfFile:(NSString *)pdfFile {
    BOOL retFlag = YES;
    NSArray *documentDirectories = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentDirectory = [documentDirectories objectAtIndex:0];
    NSString *documentDirectoryFilename = [documentDirectory stringByAppendingPathComponent:pdfFile];
    UIGraphicsBeginPDFContextToFile(documentDirectoryFilename, CGRectZero, nil);
    CGContextRef pdfContext = UIGraphicsGetCurrentContext();
    if ([NSThread isMainThread]) {
        NSLog(@"Main Thread");
    } else {
        NSLog(@"Not in Main Thread");
    }
    UIGraphicsBeginPDFPage();
    logMemUsage1();
    [view.layer renderInContext:pdfContext];
    UIGraphicsEndPDFContext();
    pdfContext = nil;
    logMemUsage1();
    return retFlag;
}
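Incidentally, logMemUsage1 isn't shown in the question; a minimal sketch of such a helper, reporting the task's resident size via Mach, could look like this:
#import <mach/mach.h>

static void logMemUsage1(void) {
    struct task_basic_info info;
    mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;
    // Query the kernel for this task's basic accounting info
    if (task_info(mach_task_self(), TASK_BASIC_INFO, (task_info_t)&info, &count) == KERN_SUCCESS) {
        NSLog(@"resident size: %.1f MB", info.resident_size / (1024.0 * 1024.0));
    }
}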
- (IBAction)createPdf:(id)sender {
    for (int i = 0; i < 10; i++) {
        @autoreleasepool {
            PDFViewController *vc = [[PDFViewController alloc] init];
            [vc loadView];
            NSString *pdfFileName = [NSString stringWithFormat:@"TestPdf%d.pdf", i];
            [self addUIViewToPDFFile:vc.view newBounds:self.view.frame pdfFile:pdfFileName];
            vc = nil;
        }
    }
}
One more point to add: when the application goes into the background on press of the hardware home button and then comes back to the foreground, the memory seems to get cleared and released.
I am expecting the memory used for creating each PDF to be released after each iteration.
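One thing that might be worth trying (a sketch under the assumption that the UIKit PDF wrappers are caching; not a confirmed fix): drive CGPDFContext directly, so the context's lifetime is explicit, inside the same autorelease pool:
CGRect mediaBox = view.bounds;
NSURL *pdfURL = [NSURL fileURLWithPath:documentDirectoryFilename];
CGContextRef ctx = CGPDFContextCreateWithURL((__bridge CFURLRef)pdfURL, &mediaBox, NULL);
CGPDFContextBeginPage(ctx, NULL);
// CoreGraphics' origin is bottom-left while UIKit's is top-left, so flip the CTM:
CGContextTranslateCTM(ctx, 0, mediaBox.size.height);
CGContextScaleCTM(ctx, 1, -1);
[view.layer renderInContext:ctx];
CGPDFContextEndPage(ctx);
CGPDFContextClose(ctx);
CGContextRelease(ctx); // release the context deterministically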

How do I dismiss a subview in MPMoviePlayer?

I have a video that auto-plays at launch. When the short clip finishes it shows a black screen. I would like to dismiss the subview to show an image, or auto-load another controller.
Below is my code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                         pathForResource:@"cover" ofType:@"mp4"]];
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
                                       initWithContentURL:url];
    player.movieSourceType = MPMovieSourceTypeFile;
    [player setControlStyle:MPMovieControlStyleNone];
    player.view.frame = CGRectMake(0, 0, 768, 960);
    [self.view addSubview:player.view];
    [player play];
    player = nil;
}
Thanks for any help... I'm a rookie at this.
I figured out a lot of this a couple of weeks ago. Check out the notifications available: http://developer.apple.com/library/ios/#documentation/mediaplayer/reference/MPMoviePlayerController_Class/Reference/Reference.html
Add something like this to viewDidLoad:
// Remove the movie player view controller from the "playback did finish" notification observers
[[NSNotificationCenter defaultCenter] removeObserver:_moviePlayer
                                                name:MPMoviePlayerPlaybackDidFinishNotification
                                              object:_moviePlayer];
// Register this class as an observer instead
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieFinishedCallback:)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:_moviePlayer];
Now you have a method where you can re-add the view, or a thumbnail, or whatever:
- (void)movieFinishedCallback:(NSNotification *)aNotification
{
    // Obtain the reason why the movie playback finished
    NSNumber *finishReason = [aNotification userInfo][MPMoviePlayerPlaybackDidFinishReasonUserInfoKey];
    if ([finishReason intValue] == MPMovieFinishReasonPlaybackEnded) {
        [self.view addSubview:self.moviePlayer.view];
    }
}
(This code was adapted on the fly from my project, so it may need some adjustment!)
The MPMoviePlayerController fires a lot of notifications when the movie is playing, paused, stopped, finished, etc. You can add your code to those handlers to get very fine-grained control of your presentation.
In my case it took about a day of research and hacking (and maybe another half day of cleanup and tuning), but I managed to get a very nice play/pause transparent button, with a "play icon" image overlay when paused or stopped, all loading or unloading based on the player state. It's a simple custom player control that does exactly what I want. Totally doable: just start with one player state, get what you want, and move on to the next state.

MPMoviePlayerController keeps playing after view did unload?

I have a detail view, and in the detail view controller's viewDidLoad an MPMoviePlayerController is alloc'd and plays audio; but even when I navigate back to the main table, the audio keeps playing.
How can I stop the MPMoviePlayerController when I navigate back to the main table? This is my MPMoviePlayerController code:
.h
MPMoviePlayerController *player;
.m
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Get the movie
    NSURL *movieURL = [NSURL URLWithString:@"some link"];
    player = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    // Place it in a subview, else it won't work
    player.view.frame = CGRectMake(20, 20, 280, 25);
    player.backgroundView.backgroundColor = [UIColor clearColor];
    [self.view addSubview:player.view];
    // Play the movie.
    [player play];
}
I even added the following code to viewDidUnload, but it didn't work:
- (void)viewDidUnload {
    [player stop];
    player.initialPlaybackTime = -1;
    [player release];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}
What do you guys suggest?
Thanks in advance.
I liked the user experience of viewDidDisappear better than viewWillDisappear: the animation starts and the movie stops afterwards, so the audio flows better for me this way.
- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [_moviePlayer stop];
}
I am having a similar issue. I am unable to use viewDidDisappear or viewWillDisappear, because I have a "config"-type view that can be opened while the content is playing, and opening it triggers both of those methods.
EDIT: I found that viewDidUnload and viewWillUnload are no longer getting called (I'm currently on an iOS 6+ device)...
From the documentation:
Availability: iOS 3.0 and later. Deprecated: views are no longer purged under low-memory conditions, so this method is never called.
I just created a simple function called unload, and inside it I set any objects I needed to nil (I'm using ARC). When I make the call to remove the view, I call the unload function as well (sketched below). Hope it helps someone.
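A minimal sketch of that approach (the names are illustrative, not from the original code):
- (void)unload {
    [_moviePlayer stop];
    _moviePlayer = nil; // under ARC, dropping the last strong reference releases the player
}
Call it wherever the view is actually torn down, right before removing it from its superview.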

Changing NSApplicationIcon across a running application?

I'd like to adjust the NSApplicationIcon image that gets shown automatically in all alerts to be something different from what is in the app bundle.
I know that it's possible to set the dock icon with [NSApplication setApplicationIconImage:], but this only affects the Dock and nothing else.
I'm able to work around the issue some of the time: when I have an NSAlert *, I can call setIcon: to display my alternate image.
Unfortunately, I have a lot of nibs with NSImageViews using NSApplicationIcon that I would like to affect, and it would be a hassle to create outlets and add code to change each icon. And for any alerts I'm bringing up with the BeginAlert... style calls (which don't give me an NSAlert object to muck with), I'm completely out of luck.
Can anybody think of a reasonable way to globally (for the life of a running application) override the NSApplicationIcon used by AppKit with my own image, so that I can get 100% of the alerts replaced (and make my code simpler)?
Swizzle the [NSImage imageNamed:] method? This works at least on Snow Leopard; YMMV.
In an NSImage category:
@implementation NSImage (Magic)

+ (void)load {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // have to call imageNamed: once prior to swizzling to avoid an infinite loop
    [[NSApplication sharedApplication] applicationIconImage];
    // swizzle!
    NSError *error = nil;
    if (![NSImage jr_swizzleClassMethod:@selector(imageNamed:) withClassMethod:@selector(_sensible_imageNamed:) error:&error])
        NSLog(@"couldn't swizzle imageNamed: application icons will not update: %@", error);
    [pool release];
}

+ (id)_sensible_imageNamed:(NSString *)name {
    if ([name isEqualToString:@"NSApplicationIcon"])
        return [[NSApplication sharedApplication] applicationIconImage];
    return [self _sensible_imageNamed:name];
}

@end
With this hacked-up (untested, just wrote it) jr_swizzleClassMethod:withClassMethod:error: implementation:
+ (BOOL)jr_swizzleClassMethod:(SEL)origSel_ withClassMethod:(SEL)altSel_ error:(NSError **)error_ {
#if OBJC_API_VERSION >= 2
    Method origMethod = class_getClassMethod(self, origSel_);
    if (!origMethod) {
        SetNSError(error_, @"original method %@ not found for class %@", NSStringFromSelector(origSel_), [self className]);
        return NO;
    }
    Method altMethod = class_getClassMethod(self, altSel_);
    if (!altMethod) {
        SetNSError(error_, @"alternate method %@ not found for class %@", NSStringFromSelector(altSel_), [self className]);
        return NO;
    }
    id metaClass = objc_getMetaClass(class_getName(self));
    class_addMethod(metaClass,
                    origSel_,
                    class_getMethodImplementation(metaClass, origSel_),
                    method_getTypeEncoding(origMethod));
    class_addMethod(metaClass,
                    altSel_,
                    class_getMethodImplementation(metaClass, altSel_),
                    method_getTypeEncoding(altMethod));
    method_exchangeImplementations(class_getClassMethod(self, origSel_), class_getClassMethod(self, altSel_));
    return YES;
#else
    assert(0);
    return NO;
#endif
}
Then, this method to illustrate the point:
- (void)doMagic:(id)sender {
    static int i = 0;
    i = (i + 1) % 2;
    if (i)
        [[NSApplication sharedApplication] setApplicationIconImage:[NSImage imageNamed:NSImageNameBonjour]];
    else
        [[NSApplication sharedApplication] setApplicationIconImage:[NSImage imageNamed:NSImageNameDotMac]];
    // any pre-populated image views have to be set to nil first, otherwise their icon won't change
    // [imageView setImage:nil];
    // [imageView setImage:[NSImage imageNamed:NSImageNameApplicationIcon]];
    NSAlert *alert = [[[NSAlert alloc] init] autorelease];
    [alert setMessageText:@"Shazam!"];
    [alert runModal];
}
A couple of caveats:
Any image view already created must have setImage: called twice, as seen above, to register the image change. I don't know why.
There may be a better way to force the initial imageNamed: call with @"NSApplicationIcon" than how I've done it.
Try [myImage setName:@"NSApplicationIcon"] (after setting it as the application icon image on NSApp).
Note: on 10.6 and later, you can and should use NSImageNameApplicationIcon instead of the string literal @"NSApplicationIcon".
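Combining the two suggestions above, a sketch (untested; setName: returns NO if another image already owns the name, so the bundle icon may need to be unregistered first):
NSImage *newIcon = [[[NSImage imageNamed:NSImageNameBonjour] copy] autorelease]; // copied so the original named image keeps its own name
[NSApp setApplicationIconImage:newIcon];
[newIcon setName:@"NSApplicationIcon"]; // later imageNamed: lookups should now return it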
