MPMoviePlayerController doesn't work after upgrading to iOS 5

This code works perfectly on iPad 4.3 Simulator:
NSString *source = [mediaObject objectForKey:@"source"];
NSString *videoPath = [NSString stringWithFormat:@"%@/%@", path, source];
NSURL *videoUrl = [NSURL fileURLWithPath:videoPath];
MPMoviePlayerController *videoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoUrl];
videoPlayer.shouldAutoplay = NO;
videoPlayer.view.frame = CGRectMake(xPos, yPos, width, height);
[backgroundImageView addSubview:videoPlayer.view];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoPlaybackStateDidChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:videoPlayer];
but it doesn't work on the iPad 5.0 Simulator: I get a black frame with no movie and no playback controls.
I read Apple's changelog for MPMoviePlayerController, but I didn't find anything about this problem. Can you help me?

I solved the problem in this way: in my header file I wrote:
MPMoviePlayerController *moviePlayer;
with this property:
@property(nonatomic, strong) MPMoviePlayerController *moviePlayer;
and in the method in which I init the moviePlayer:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieUrl];
self.moviePlayer = player;
It seems that assigning the player to a strong property "saves" the player: under ARC the movie player's view does not retain its controller, so a player held only in a local variable is deallocated as soon as the method returns, leaving just an empty black view.

You don't mention what type of URL you are trying to play; however, if it's an HTTP Live Streaming resource (an .m3u8 file), be aware that iOS 5.0 seems to have tightened up on validating the contents of the m3u8 index file.
Specifically, I've discovered that:
No individual segment can be more than twice as long as the #EXT-X-TARGETDURATION value;
The #EXTINF value (the segment length in seconds) can now only be an integer value.
If one of these is your problem, running your application under the iOS 5.0 simulator should show a warning in the debugger console.

For HLS on iOS 5, the TARGETDURATION value is really not a target duration but needs to be the maximum duration, so it should be set to the length of the largest segment in the file.
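For illustration, a compliant index file might look like this (the segment names and durations here are made up): the longest segment is no longer than #EXT-X-TARGETDURATION, and every #EXTINF duration is an integer.
#EXTM3U
#EXT-X-VERSION:2
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment0.ts
#EXTINF:9,
segment1.ts
#EXT-X-ENDLIST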

Related

AVAudioEngine incorrect time management and callback for AVAudioPlayerNode

I have a serious issue with the new audio engine in iOS 8. I have an application built with AVAudioPlayer, and I am trying to figure out a way to migrate to the new architecture; however, I bumped into the following problem (which, I'm sure you will agree, is a serious and basic obstacle):
My header file:
AVAudioEngine *engine;
AVAudioMixerNode *mainMixer;
AVAudioPlayerNode *player;
My .m file (inside viewDidLoad):
engine = [[AVAudioEngine alloc] init];
player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
NSURL *fileUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileUrl error:nil];
NSLog(@"duration: %.2f", file.length / file.fileFormat.sampleRate);
[player scheduleFile:file atTime:nil completionHandler:^{
AVAudioTime *nodeTime = player.lastRenderTime;
AVAudioTime *playerTime = [player playerTimeForNodeTime:nodeTime];
float secs = (float)playerTime.sampleTime / file.fileFormat.sampleRate;
NSLog(@"finished at: %.2f", secs);
}];
mainMixer = [engine mainMixerNode];
[engine connect:player to:mainMixer format:file.processingFormat];
[engine startAndReturnError:nil];
[player play];
The above code initializes the engine and a node, then starts playing back the file. First it prints out the duration of the music file; then, after playback finishes, the completion handler prints the current time of the player. These two values should be the same, or at worst very close to each other, but that is not the case; the difference between them is quite big, e.g.
duration: 148.51
finished at: 147.61
Am I doing something wrong? This should be fairly straightforward. I've tried different file formats, file lengths, and tens of music files, but the difference is always around, or just under, one second.
Update:
As of iOS 11 you can specify the completionCallbackType (an AVAudioPlayerNodeCompletionCallbackType):
dataConsumed: the buffer or file data has been consumed by the player.
dataRendered: the buffer or file data has been rendered (i.e. output) by the player.
dataPlayedBack: the buffer or file has finished playing.
More info: Documentation
Original:
According to Apple's documentation for scheduleFile:atTime:completionHandler:
It is possible for the completionHandler to be called before rendering
begins or before the file is played completely.
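For completeness, here is a minimal sketch of the iOS 11+ variant, assuming the same engine, player and file setup as in the question; the dataPlayedBack callback type fires only once the data has actually been played out, so the reported time should line up with the file duration:
[player scheduleFile:file
              atTime:nil
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
   completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // Called after the scheduled file has actually been played back,
    // not merely consumed or rendered by the engine.
    AVAudioTime *playerTime = [player playerTimeForNodeTime:player.lastRenderTime];
    NSLog(@"played back at: %.2f", (double)playerTime.sampleTime / file.fileFormat.sampleRate);
}];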

UITextView setText should not jump to top in iOS 8

The following iOS 8 code is called every second:
- (void)appendString:(NSString *)newString toTextView:(UITextView *)textView {
textView.scrollEnabled = NO;
textView.text = [NSString stringWithFormat:@"%@%@%@", textView.text, newString, @"\n"];
textView.scrollEnabled = YES;
[textView scrollRangeToVisible:NSMakeRange(textView.text.length, 0)];
}
The goal is to have the same scrolling-down behaviour as the Xcode console when the text starts running off the bottom. Unfortunately, setText causes the view to reset to the top before I can scroll down again with scrollRangeToVisible.
The above code solved this in iOS 7, but after upgrading last week to iOS 8 that solution no longer works.
How can I get this to scroll fluently, without the jumping behaviour?
I ran into this problem too. You can try this:
textView.layoutManager.allowsNonContiguousLayout = NO;
Reference: http://hayatomo.com/2014/09/26/1307
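As an illustration only, here is a minimal sketch of how that workaround might slot into the original appendString method (assuming it is acceptable to disable non-contiguous layout for the whole text view):
- (void)appendString:(NSString *)newString toTextView:(UITextView *)textView {
    // Disabling non-contiguous layout avoids the iOS 8 jump-to-top on setText;
    // it can also be set just once, e.g. in viewDidLoad.
    textView.layoutManager.allowsNonContiguousLayout = NO;
    textView.text = [textView.text stringByAppendingFormat:@"%@\n", newString];
    [textView scrollRangeToVisible:NSMakeRange(textView.text.length, 0)];
}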
The following two solutions don't work for me on iOS 8.0.
textView.scrollEnabled = NO;
[textView setText:text];
textView.scrollEnabled = YES;
and
CGPoint offset = textView.contentOffset;
[textView setText:text];
[textView setContentOffset:offset];
I set up a delegate on the text view to monitor the scroll events and noticed that, after my operation to restore the offset, the offset was reset to 0 again. So instead I use the main operation queue to make sure my restore operation happens after the "reset to 0" operation.
Here's my solution that works for iOS 8.0.
CGPoint offset = self.textView.contentOffset;
self.textView.attributedText = replace;
[[NSOperationQueue mainQueue] addOperationWithBlock: ^{
[self.textView setContentOffset: offset];
}];
Try just adding the text to the UITextView (without scrollRangeToVisible/scrollEnabled). It seems the hack with enabling/disabling scrolling is no longer needed with the iOS 8 SDK; UITextView scrolls automatically.

How do I add background music to my SpriteKit file

Could someone give me a quick, easy, step-by-step guide to adding background music once my app has loaded? It is a SpriteKit Xcode project, and the music is in m4a format. Thanks.
Try with this:
@import AVFoundation;
...
AVAudioPlayer * backgroundMusicPlayer;
NSError *error;
NSURL * backgroundMusicURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"m4a"];
backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
backgroundMusicPlayer.numberOfLoops = -1; //-1 = infinite loop
[backgroundMusicPlayer prepareToPlay];
[backgroundMusicPlayer play];
and to stop simply
[backgroundMusicPlayer stop];
Note: I don't use SKAction to play background music because you can't stop it when you want to.
You can use AVAudioPlayer for this purpose:
In your .h:
#import <AVFoundation/AVFoundation.h>
and add the following to the interface:
AVAudioPlayer *player;
In your .m, initialize the player with the audio path URL:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:@"bg_music"
ofType:@"mp3"]];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
player.numberOfLoops = -1;
and when you need to play the audio, you can call:
[player play];
Note:
"numberOfLoops" is the number of times that the sound will return to the beginning upon reaching the end.
A value of zero means to play the sound just once.
A value of one will result in playing the sound twice, and so on...
Any negative number will loop indefinitely until stopped.
Keep Coding................ :)

How do I update an image from a URL in an OS X app?

I have a problem. I have code that updates a song name and a picture from a PHP script. The song name works and is updated, but the picture does not; in the PHP file everything works, but not in my project. How can I make the picture update from the URL, for example every 10 seconds? Thanks.
-(void)viewWillDraw {
NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
[dj setImage:artistImage];
dispatch_queue_t queue = dispatch_get_global_queue(0,0);
dispatch_async(queue, ^{
NSError* error = nil;
NSString* text = [NSString stringWithContentsOfURL:[NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?artist"]
encoding:NSASCIIStringEncoding
error:&error];
dispatch_async(dispatch_get_main_queue(), ^{
[labelName setStringValue:text];
});
});
}
You should really consider placing this code someplace other than -viewWillDraw. This routine can be called multiple times for the same NSView under some circumstances and, more importantly, you need to call [super viewWillDraw] to make sure that things will actually draw correctly (if anything is drawn in the view itself).
For periodic updates (such as every 10 seconds), you should consider using NSTimer to trigger the retrieval of the next object.
As for the general question of why your image isn't being drawn correctly, you should probably consider putting the image retrieval and drawing code into the same structure as your label retrieval and drawing code. This will get the [dj setImage: artistImage] method call outside of the viewWillDraw chain which is likely causing some difficulty here.
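As a rough sketch of that suggestion (the method names startImageUpdates and refreshArtistImage: are hypothetical, not from your project), the periodic refresh could fetch on a background queue and update the image view on the main thread:
// Call once, e.g. from windowDidLoad or awakeFromNib.
- (void)startImageUpdates {
    [NSTimer scheduledTimerWithTimeInterval:10.0
                                     target:self
                                   selector:@selector(refreshArtistImage:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)refreshArtistImage:(NSTimer *)timer {
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        // Download off the main thread, then apply the result on the main thread.
        NSURL *url = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
        NSImage *image = [[NSImage alloc] initWithContentsOfURL:url];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (image) {
                [dj setImage:image]; // dj is the image view outlet from the question
            }
        });
    });
}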

How to get the frame (origin, size) of every visible window on the active space?

I'm trying to figure out how to get the frame of all visible windows.
I tried the following code, but it only works for the app itself; other windows report {0,0,0,0}.
NSArray *windowArray = [NSWindow windowNumbersWithOptions:NSWindowNumberListAllApplications | NSWindowNumberListAllSpaces];
for(NSNumber *number in windowArray){
NSLog(@"Window number: %@", number);
NSWindow *window = [[NSApplication sharedApplication] windowWithWindowNumber:[number intValue]];
NSLog(@"Window: %@", NSStringFromRect([[window contentView] frame]));
}
Sample code is appreciated.
I figured it out:
NSArray *windows = CFBridgingRelease(CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements, kCGNullWindowID)); // CFBridgingRelease hands ownership of the copied list to ARC
for (NSDictionary *window in windows) {
NSString *name = [window objectForKey:@"kCGWindowName"];
CGRect bounds;
CGRectMakeWithDictionaryRepresentation((__bridge CFDictionaryRef)[window objectForKey:@"kCGWindowBounds"], &bounds);
NSLog(@"%@: %@", name, NSStringFromRect(bounds));
}
You can't create an NSWindow for a window of another application. In general, you can't access the objects of other applications except through an interface that they cooperate with, like scripting.
You can get what you're looking for using the Quartz Window Services (a.k.a. CGWindowList) API.
I'm not at all sure that the window numbers returned by Cocoa are the same as the window numbers used by that API. In fact, the docs for -[NSWindow windowNumber] specifically say "note that this isn't the same as the global window number assigned by the window server". I'm not sure what use you can make of the window numbers returned by +[NSWindow windowNumbersWithOptions:] that are not for your own application's windows.
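If you want to check that yourself, each dictionary returned by CGWindowListCopyWindowInfo also carries the window server's ID under kCGWindowNumber; a minimal sketch (illustration only, not from the answer above):
NSArray *info = CFBridgingRelease(CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID));
for (NSDictionary *entry in info) {
    // Compare these IDs against -[NSWindow windowNumber] for your own windows.
    NSNumber *windowID = [entry objectForKey:(__bridge NSString *)kCGWindowNumber];
    NSString *owner = [entry objectForKey:(__bridge NSString *)kCGWindowOwnerName];
    NSLog(@"%@ -> window ID %@", owner, windowID);
}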
