AVAudioEngine incorrect time management and callback for AVAudioPlayerNode - ios8

I have a serious issue with the new audio engine in iOS 8. I have an application built with AVAudioPlayer, and I am trying to figure out a way to migrate it to the new architecture; however, I have bumped into the following problem (which, I'm sure you will agree, is a serious and basic obstacle):
My header file:
AVAudioEngine *engine;
AVAudioMixerNode *mainMixer;
AVAudioPlayerNode *player;
My .m file (inside viewDidLoad):
engine = [[AVAudioEngine alloc] init];
player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];

NSURL *fileUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileUrl error:nil];
NSLog(@"duration: %.2f", file.length / file.fileFormat.sampleRate);

[player scheduleFile:file atTime:nil completionHandler:^{
    AVAudioTime *nodeTime = player.lastRenderTime;
    AVAudioTime *playerTime = [player playerTimeForNodeTime:nodeTime];
    float secs = (float)playerTime.sampleTime / file.fileFormat.sampleRate;
    NSLog(@"finished at: %.2f", secs);
}];

mainMixer = [engine mainMixerNode];
[engine connect:player to:mainMixer format:file.processingFormat];
[engine startAndReturnError:nil];
[player play];
The above code initializes the engine and a node, then starts playing back the file I'm using. First it prints the duration of the music file; then, after playback finishes, the completion handler prints the player's current time. These two values should be the same, or in the worst case very close to each other, but this is not the case; the difference between them is large, e.g.:
duration: 148.51
finished at: 147.61
Am I doing something wrong? This should be fairly straightforward. I've tried different file formats, file lengths, and tens of music files, but the difference is always around, or just under, 1 second.

Update:
As of iOS 11 you can specify the completion callback type (AVAudioPlayerNodeCompletionCallbackType) when scheduling:
dataConsumed:
A completion handler indicating that the buffer or file data has been consumed by the player.
dataRendered:
A completion handler indicating that the buffer or file data has been rendered by the player.
dataPlayedBack:
A completion handler indicating that the buffer or file has finished playing.
More info: Documentation
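A minimal sketch of the iOS 11+ call, reusing the player and file from the question; with AVAudioPlayerNodeCompletionDataPlayedBack the handler fires only once the scheduled audio has actually finished playing:
[player scheduleFile:file
              atTime:nil
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
   completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
       // Called when playback of the file is actually complete (iOS 11+).
       AVAudioTime *playerTime = [player playerTimeForNodeTime:player.lastRenderTime];
       NSLog(@"finished playing at: %.2f", (double)playerTime.sampleTime / playerTime.sampleRate);
}];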
Original:
According to Apple's documentation for scheduleFile:atTime:completionHandler:
It is possible for the completionHandler to be called before rendering
begins or before the file is played completely.
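Before iOS 11, a rough workaround (my sketch, not part of the quoted documentation) is to treat the handler as a "data consumed" signal and wait out the remaining frames before declaring playback finished:
[player scheduleFile:file atTime:nil completionHandler:^{
    // At this point the data has been consumed, but playback may still be running.
    AVAudioTime *playerTime = [player playerTimeForNodeTime:player.lastRenderTime];
    NSTimeInterval elapsed   = (double)playerTime.sampleTime / playerTime.sampleRate;
    NSTimeInterval duration  = (double)file.length / file.fileFormat.sampleRate;
    NSTimeInterval remaining = MAX(0.0, duration - elapsed);
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(remaining * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        NSLog(@"playback actually finished");
    });
}];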

Related

AFNetworking and multiple UIImageViews pulling from same URL

I have an issue where I'm loading 3, sometimes 4, copies of the same image using
[imageFile setImageWithURL:[NSURL URLWithString:friendAvatar] placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
I'm trying to see if there's a way to load this into some kind of NSData and use it later on, kind of like what I'm doing below, but using AFNetworking.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
    NSURL *url3 = [NSURL URLWithString:friendAvatar];
    NSData *data = [NSData dataWithContentsOfURL:url3];
    UIImage *img = [[UIImage alloc] initWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^(void) {
        imageFile.image = img;
        bgImageFile.image = img;
    });
});
Also, I'm not calling all the image loads in the same method: two are called in cellForRowAtIndexPath once the user's friends list has been populated; the 3rd is loaded when I swipe over a cell to reveal its hidden (under) layer; and the 4th repeat of the image is requested when pressing a button that appears once the cell has been swiped, which leads to a chat room view between me and that friend.
Hopefully I've gotten my point across about what I'm trying to achieve. Any help pointing me in the right direction is very much appreciated.
Update:
This is my current code; this is what I mean by pulling the same image several times.
Inside the cellForRowAtIndexPath
[imageFile setImageWithURLRequest:request placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
[bgImageFile setImageWithURL:[NSURL URLWithString:friendAvatar] placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]];
The method bottomDrawerWillAppear that is called contains
UIImageView *drawerBGImg = [[UIImageView alloc] initWithFrame:CGRectMake(0,0,320,75)];
NSString *friendAvatar = [NSString stringWithFormat:@"%@%@%@", @"http://v9a2a7.com/user_photos/", [MyClass friendID], @".jpg"];
[drawerBGImg setImageWithURL:[NSURL URLWithString:friendAvatar]];
And in a separate class and separate view, viewMessageViewController:
NSString *friendAvatar = [NSString stringWithFormat:@"%@%@%@", @"http://v9a2a7.com/user_photos/", email, @".jpg"];
[bgImage setImageWithURL:[NSURL URLWithString:friendAvatar]];
I have not confirmed that viewMessageViewController forces the image to be pulled from the server, but I know for a fact that cellForRowAtIndexPath makes 3 requests for the same image, which results in using 3x the data.
Hope this clears things up.
It sounds like you want to cache the image so you don't have to keep loading it. Assuming that's what you mean...
The AFNetworking method [UIImageView -setImageWithURL:placeholderImage:] already caches this image for you. The second time you call it, the image will be loaded from the cache.
The only reason it would get reloaded from the server a second time is if your app received a low memory warning since the last download (AFImageCache, an NSCache subclass, will then automatically evict some or all of the cached images).
It uses the URL as the key, so as long as the URL is identical, the image will only get loaded from the server once.
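If you also want to hold on to the downloaded UIImage yourself and reuse it in the drawer and chat views, here is a sketch using AFNetworking's success-block variant; cachedAvatar is a hypothetical strong UIImage property added just for the example:
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:friendAvatar]];
__weak typeof(self) weakSelf = self;
[imageFile setImageWithURLRequest:request
                 placeholderImage:[UIImage imageNamed:@"defaultProfileImage.png"]
                          success:^(NSURLRequest *req, NSHTTPURLResponse *response, UIImage *image) {
                              // When a success block is supplied, AFNetworking leaves setting
                              // the image to you, so set every view that needs it here.
                              weakSelf.cachedAvatar = image; // hypothetical property, reuse later
                              imageFile.image = image;
                              bgImageFile.image = image;
                          }
                          failure:nil];
Because the cache is keyed by URL, any later setImageWithURL: call with the same friendAvatar string should also be served from AFImageCache rather than the network.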

How to update an image from a URL in an OS X app?

I have a problem. I have code which updates a song name and a picture from PHP. The song name works and updates, but the picture does not; in the PHP file everything works, but not in my project. How can I make the picture update from the URL, say every 10 seconds? Thanks.
- (void)viewWillDraw {
    NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
    NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
    [dj setImage:artistImage];

    dispatch_queue_t queue = dispatch_get_global_queue(0, 0);
    dispatch_async(queue, ^{
        NSError *error = nil;
        NSString *text = [NSString stringWithContentsOfURL:[NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?artist"]
                                                  encoding:NSASCIIStringEncoding
                                                     error:&error];
        dispatch_async(dispatch_get_main_queue(), ^{
            [labelName setStringValue:text];
        });
    });
}
You should really consider placing this code someplace other than -viewWillDraw. This routine can be called multiple times for the same NSView under some circumstances and, more importantly, you need to call [super viewWillDraw] to make sure that things will actually draw correctly (if anything is drawn in the view itself).
For periodic updates (such as every 10 seconds), you should consider using NSTimer to trigger the retrieval of the next object.
As for the general question of why your image isn't being drawn correctly, you should probably consider putting the image retrieval and drawing code into the same structure as your label retrieval and drawing code. This will get the [dj setImage: artistImage] method call outside of the viewWillDraw chain which is likely causing some difficulty here.
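Putting that together, a minimal sketch (the method names startArtworkTimer and refreshArtwork are made up for the example; dj is the NSImageView outlet from the question):
// Start a repeating timer once, e.g. from -awakeFromNib, instead of loading in -viewWillDraw.
- (void)startArtworkTimer {
    [NSTimer scheduledTimerWithTimeInterval:10.0
                                     target:self
                                   selector:@selector(refreshArtwork)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)refreshArtwork {
    // Fetch off the main thread, then update the image view on the main thread.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
        NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
        dispatch_async(dispatch_get_main_queue(), ^{
            [dj setImage:artistImage];
        });
    });
}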

AV Foundation: Difference between currentItem being ready to play, and -[AVPlayerLayer readyForDisplay] property?

I'm running into a weird situation with my video player, the core code of which hasn't changed much from what worked in an earlier app I made. Here's the problem: I'm inserting a "_loadingLayer" (a CATextLayer that says the video is loading), and then observing the AVPlayer's currentItem's status property to figure out when to remove the "_loadingLayer" and replace it with my actual "_playerLayer". Here's my KVO code for that:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ((object == _playerLayer) && (_playerLayer.player.currentItem.status == AVPlayerItemStatusReadyToPlay)) {
        [CATransaction setAnimationDuration:1.8];
        _loadingLayer.opaque = NO;
        if (_playerLayer.readyForDisplay) {
            NSLog(@"Should be ready now.");
        }
        [self addPlayerLayerToLayerTree];
    }
}
My problem is that the video is starting, but only the audio is playing -- the layer stays black. When I inserted the NSLog statement above, I found out why: Apparently although the currentItem's status is "AVPlayerItemStatusReadyToPlay", the player layer isn't actually readyForDisplay. This makes no sense to me -- it seems counterintuitive. Can someone please give me some guidance on this?
I was able to verify that _playerLayer is being added to the layer tree by setting its background color to red.
One other weird thing that I think might be related.... I've been seeing these messages in the debugger console:
PSsetwindowlevel, error setting window level (1000)
CGSSetIgnoresCycle: error 1000 setting or clearing window tags
Thanks in advance. This is a crosspost from the Apple Dev Forums.
We had a similar problem and traced it to what I believe is a bug in iOS 5.1 (and maybe earlier versions). It is fixed in iOS 6.0. Since I couldn't find a solution to this anywhere, I'm writing a long writeup for future people that have this problem.
If the AVPlayerItem reports a status of AVPlayerStatusReadyToPlay before the AVPlayerLayer has been obtained, then the AVPlayerLayer will never report that it is readyForDisplay.
So when you do:
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
make sure that it's followed with:
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
and that you don't have much if any code in between the two.
I built a test rig to make it work 100% of the time or fail 100% of the time. Note that it can be tricky to see what's going on in your actual app since you will have different load times on the video and that will affect how quickly the playerItem reports AVPlayerStatusReadyToPlay.
If you want to test in your app, put this into a simple view. The below will not work (i.e. you'll hear audio but not see video) on iOS 5.1. If you switch loadPlayerLayer to instead be invoked at the end of loadPlayer, it will always work.
A follow-on for future readers: a couple of player events can switch up this order and make you think it's working. They're red herrings, though, since they inadvertently reverse the load order so that the playerLayer is grabbed before AVPlayerStatusReadyToPlay. The events are: seeking the video; going to the home screen and then reactivating the app; and the player switching to a different video/audio track inside an HLS video. These actions trigger AVPlayerStatusReadyToPlay again and thus make the playerLayer happen before AVPlayerStatusReadyToPlay.
Here's the test harness that uses Apple's test HLS video:
- (void)loadPlayer
{
    NSLog(@"loadPlayer invoked");
    NSURL *url = [NSURL URLWithString:@"https://devimages.apple.com.edgekey.net/resources/http-streaming/examples/bipbop_4x3/bipbop_4x3_variant.m3u8"];
    self.playerItem = [AVPlayerItem playerItemWithURL:url];
    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:&kPlayerContext];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

- (void)loadPlayerLayer
{
    NSLog(@"starting player layer");
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.playerLayer addObserver:self forKeyPath:@"readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:&kPlayerLayerContext];
    [self.playerLayer setFrame:[[self view] bounds]];
    [[[self view] layer] addSublayer:self.playerLayer];
}

- (void)observeValueForKeyPath:(NSString *)path ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == &kPlayerContext) {
        if ([self.player status] == AVPlayerStatusReadyToPlay) {
            NSLog(@"Player is ready to play");
            // Robert: Never works if after AVPlayerItem reports AVPlayerStatusReadyToPlay
            if (!self.startedPlayerLayer) {
                self.startedPlayerLayer = YES;
                [self loadPlayerLayer];
            }
        }
    }
    if (context == &kPlayerLayerContext) {
        if ([self.playerLayer isReadyForDisplay] == YES) {
            NSLog(@"PlayerLayer says it's ready to display now");
            [self playTheVideoIfReady];
        }
    }
}
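For completeness, a sketch of the supporting declarations the harness assumes; the names come from the code above, but the body of playTheVideoIfReady is my assumption:
// KVO context tokens (their addresses are what get passed and compared).
static int kPlayerContext;
static int kPlayerLayerContext;

// In the class extension / @interface:
// @property (nonatomic, strong) AVPlayerItem *playerItem;
// @property (nonatomic, strong) AVPlayer *player;
// @property (nonatomic, strong) AVPlayerLayer *playerLayer;
// @property (nonatomic, assign) BOOL startedPlayerLayer;

// Hypothetical implementation of the method the harness calls:
// start playback only once the layer can actually display video.
- (void)playTheVideoIfReady
{
    if (self.playerLayer.isReadyForDisplay) {
        [self.player play];
    }
}
Kick everything off by calling [self loadPlayer] from -viewDidLoad (or wherever the view is set up).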

QTKit, capture video for live streaming

I am trying to create a Mac application that does live video streaming. I know about VLC and other solutions, but still.
To that end I am trying to record video from the iSight using QTKit and save it continuously as a series of tiny video files. However, the recording turns out not to be quite continuous; there are gaps between the files.
Basically, I am just setting up a timer that starts recording to a new file at certain intervals, thereby stopping the old recording. I have also tried setting the maximum recorded length and using the delegate methods ...didFinishRecording... and ...willFinishRecording..., but with the same result (I can't really tell the difference between the gaps in these cases).
Please help me if you know how this should be done.
Here is my current code:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    QTCaptureSession *session = [[QTCaptureSession alloc] init];
    QTCaptureDevice *iSight = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    [iSight open:nil];
    QTCaptureDeviceInput *myInput = [QTCaptureDeviceInput deviceInputWithDevice:iSight];

    output = [[QTCaptureMovieFileOutput alloc] init]; // ivar, QTCaptureFileOutput
    [output setDelegate:self];

    a = 0; // ivar, int
    fileName = @"/Users/dtv/filerecording_"; // ivar, NSString

    [session addOutput:output error:nil];
    [session addInput:myInput error:nil];
    [capview setCaptureSession:session]; // IBOutlet
    [session startRunning];

    [output setCompressionOptions:[QTCompressionOptions compressionOptionsWithIdentifier:@"QTCompressionOptionsSD480SizeH264Video"] forConnection:[[output connections] objectAtIndex:0]];
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];

    NSTimer *tmr = [NSTimer timerWithTimeInterval:5 target:self selector:@selector(getMovieLength:) userInfo:nil repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:tmr forMode:NSDefaultRunLoopMode];
}

- (void)getMovieLength:(NSTimer *)t {
    a++;
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];
}
There is a native mechanism to break the captured movie into pieces. Use
[QTCaptureFileOutput setMaximumRecordedDuration:]
to specify the duration of the piece or
[QTCaptureFileOutput setMaximumRecordedFileSize:]
to specify the file size limit.
When the limit is reached the delegate method will be called:
[QTCaptureFileOutput_Delegate captureOutput: shouldChangeOutputFileAtURL: forConnections: dueToError:]
In this method you can set the new file name:
[QTCaptureFileOutput recordToOutputFileURL:]
This will allow you to cut the pieces of the recorded movie pretty precisely.
Note that [QTCaptureFileOutput_Delegate captureOutput: didFinishRecordingToOutputFileAtURL: forConnections: dueToError:] will be called a bit later, after the recording into the file has actually finished. If you use this method to set the new file, you will have gaps in the final video. That does not mean you don't need this method, though: it indicates when a piece of the movie is ready to be used.
If you need even more precise cutting you can use
[QTCaptureFileOutput captureOutput: didOutputSampleBuffer: fromConnection:]
to specify the exact movie frame when to start recording into a new piece. However, you will need more specific knowledge to work with the method.
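Putting that together, a minimal sketch reusing the output ivar from the question (the delegate callbacks are the ones named above; only their roles are shown here):
// Let QTKit split the recording itself instead of a timer.
// setMaximumRecordedDuration: and QTMakeTimeWithTimeInterval() are standard QTKit calls.
[output setMaximumRecordedDuration:QTMakeTimeWithTimeInterval(5.0)];

// Then, in the captureOutput:shouldChangeOutputFileAtURL:forConnections:dueToError:
// delegate callback named above, hand the output its next file URL with
// [output recordToOutputFileURL:...], and treat
// captureOutput:didFinishRecordingToOutputFileAtURL:forConnections:dueToError:
// purely as "this piece is now finished and safe to use".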

MPMoviePlayerController doesn't work after upgrading to iOS 5

This code works perfectly on iPad 4.3 Simulator:
NSString *source = [mediaObject objectForKey:@"source"];
NSString *videoPath = [NSString stringWithFormat:@"%@/%@", path, source];
NSURL *videoUrl = [NSURL fileURLWithPath:videoPath];

MPMoviePlayerController *videoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoUrl];
videoPlayer.shouldAutoplay = NO;
videoPlayer.view.frame = CGRectMake(xPos, yPos, width, height);
[backgroundImageView addSubview:videoPlayer.view];

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoPlaybackStateDidChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:videoPlayer];
but it doesn't work on the iPad 5.0 Simulator: I get a black frame with no movie and no playback controls.
I read Apple's changelog for MPMoviePlayerController, but I didn't find anything about this problem. Can you help me?
I solved the problem in this way: in my header file I wrote:
MPMoviePlayerController *moviePlayer;
with this property:
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;
and in the method in which I init the moviePlayer:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieUrl];
self.moviePlayer = player;
It seems that assigning the player to a property "saves" the player. But don't ask me why... (Under ARC, a player held only in a local variable is deallocated when the method returns; its view stays in the hierarchy but no longer has a player behind it, hence the black frame.)
You don't mention what type of URL you are trying to play; however, if it's an HTTP Live Streaming resource (an .m3u8 file), be aware that iOS 5.0 seems to have tightened up validation of the contents of the m3u8 index file.
Specifically, I've discovered that:
No individual segment can be more than twice as long as the #EXT-X-TARGETDURATION value;
The #EXTINF value (segment length in seconds) can, now, only be an integer value.
If one of these is your problem, running your application under the iOS 5.0 simulator should show a warning in the debugger console.
For HLS on iOS 5, the TARGETDURATION value is really not a target but effectively a maximum duration, so it should be set to the length of the largest segment in the file.
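As an illustration of those constraints, a hypothetical index file that passes the stricter iOS 5 validation (integer #EXTINF values, no segment longer than twice the declared target; segment names are made up):
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment0.ts
#EXTINF:9,
segment1.ts
#EXTINF:10,
segment2.ts
#EXT-X-ENDLIST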
