Having trouble with NSSound currentTime - cocoa

I'm using NSSound and I can get sounds to play and pause correctly, but for some reason I can't get the currentTime method to return anything other than zero.
Here's the code that I'm having trouble with:
NSSound *sound = [[NSSound alloc] initWithContentsOfFile:@"path_to_sound.mp3" byReference:NO];
[sound play];
sleep(1);
NSLog(@"Current time: %f", [sound currentTime]);
The sound plays, but the NSLog always prints zero. Any ideas?

NSSound plays through the default sound device by default. You can change the output device using NSSound's setPlaybackDeviceIdentifier: method.
Take a look at this and at how to get Audio Device UID to pass into NSSound's setPlaybackDeviceIdentifier: for obtaining the unique identifier of a sound output device.

I do not have a machine to test on, but I expect you can learn what you need from:
NSLog(@"Playback on %@, current time: %f / %f",
      [sound playbackDeviceIdentifier], [sound currentTime], [sound duration]);
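If you need to discover the device UIDs to pass in, a minimal Core Audio sketch along these lines should list them (an assumption on my part that this fits your setup; link CoreAudio.framework, error handling omitted):

#import <CoreAudio/CoreAudio.h>
...
AudioObjectPropertyAddress addr = {
    kAudioHardwarePropertyDevices,
    kAudioObjectPropertyScopeGlobal,
    kAudioObjectPropertyElementMaster
};
UInt32 size = 0;
AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &addr, 0, NULL, &size);
AudioDeviceID *devices = malloc(size);
AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, devices);
UInt32 count = size / sizeof(AudioDeviceID);
for (UInt32 i = 0; i < count; i++) {
    AudioObjectPropertyAddress uidAddr = {
        kAudioDevicePropertyDeviceUID,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    CFStringRef uid = NULL;
    UInt32 uidSize = sizeof(uid);
    // Each UID is the identifier that setPlaybackDeviceIdentifier: expects.
    AudioObjectGetPropertyData(devices[i], &uidAddr, 0, NULL, &uidSize, &uid);
    NSLog(@"Device UID: %@", uid);
    if (uid) CFRelease(uid);
}
free(devices);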

Related

AVAudioEngine incorrect time management and callback for AVAudioPlayerNode

I have a serious issue with the new audio engine in iOS 8. I have an application built with AVAudioPlayer, and I am trying to figure out how to migrate it to the new architecture; however, I bumped into the following problem (which, I'm sure you would agree, is a serious and basic obstacle):
My header file:
AVAudioEngine *engine;
AVAudioMixerNode *mainMixer;
AVAudioPlayerNode *player;
My m file (inside the viewDidLoad):
engine = [[AVAudioEngine alloc] init];
player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
NSURL *fileUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileUrl error:nil];
NSLog(@"duration: %.2f", file.length / file.fileFormat.sampleRate);
[player scheduleFile:file atTime:nil completionHandler:^{
    AVAudioTime *nodeTime = player.lastRenderTime;
    AVAudioTime *playerTime = [player playerTimeForNodeTime:nodeTime];
    float secs = (float)playerTime.sampleTime / file.fileFormat.sampleRate;
    NSLog(@"finished at: %.2f", secs);
}];
mainMixer = [engine mainMixerNode];
[engine connect:player to:mainMixer format:file.processingFormat];
[engine startAndReturnError:nil];
[player play];
The above code initializes the engine and a node, then starts playing back whatever file I'm using. First it prints out the duration of the music file; then, after playback finishes, the callback prints the player's current time. These two values should be the same, or at worst very close to each other, but that is not the case: the difference between them is quite big, e.g.
duration: 148.51
finished at: 147.61
Am I doing something wrong? This should be fairly straightforward; I've tried different file formats and file lengths across tens of music files, but the difference is always around, or just under, 1 second.
Update:
As of iOS 11 you can specify the completionCallbackType: AVAudioPlayerNodeCompletionCallbackType
dataConsumed:
A completion handler indicating that the buffer or file data has been consumed by the player.
dataRendered:
A completion handler indicating that the buffer or file data has been rendered by the player.
dataPlayedBack:
A completion handler indicating that the buffer or file has finished playing.
More info: Documentation
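A minimal sketch of the iOS 11 variant, using AVAudioPlayerNodeCompletionDataPlayedBack so the handler fires only once the audio has actually been heard:

[player scheduleFile:file
              atTime:nil
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
   completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // Unlike the plain completionHandler, this runs after playback,
    // not merely once the data has been consumed by the engine.
    NSLog(@"finished playing");
}];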
Original:
According to Apple's documentation for scheduleFile:atTime:completionHandler:
It is possible for the completionHandler to be called before rendering
begins or before the file is played completely.

How do I add background music to my SpriteKit file

Could someone give me a quick, easy, step-by-step guide to adding background music once my app has loaded? It is a Sprite Kit Xcode project, and the music is in m4a format. Thanks.
Try with this:
@import AVFoundation;
...
AVAudioPlayer *backgroundMusicPlayer;
NSError *error;
NSURL *backgroundMusicURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"m4a"];
backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
backgroundMusicPlayer.numberOfLoops = -1; // -1 = infinite loop
[backgroundMusicPlayer prepareToPlay];
[backgroundMusicPlayer play];
and to stop simply
[backgroundMusicPlayer stop];
note: I don't use SKAction to play background music because you can't stop it when you want to (see the sketch below)
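For comparison, the SKAction route that the note refers to would look something like this (a sketch, assuming self is your SKScene); on older SDKs there is no reliable way to stop a sound once playSoundFileNamed: has started it:

// SKAction-based playback: simple, but offers little playback control.
SKAction *music = [SKAction playSoundFileNamed:@"song.m4a" waitForCompletion:YES];
[self runAction:[SKAction repeatActionForever:music] withKey:@"bgMusic"];
// [self removeActionForKey:@"bgMusic"] cancels the loop, but a sound
// already in flight may keep playing to its end.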
You can use AVAudioPlayer for this purpose:
In your .h:
#import <AVFoundation/AVFoundation.h>
and add the following to interface
AVAudioPlayer *player;
In .m, initialize the player with the audio path URL:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                     pathForResource:@"bg_music"
                                     ofType:@"mp3"]];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
player.numberOfLoops = -1;
and when you need to play the audio, you can call:
[player play];
Note:
"numberOfLoops" is the number of times that the sound will return to the beginning upon reaching the end.
A value of zero means to play the sound just once.
A value of one will result in playing the sound twice, and so on...
Any negative number will loop indefinitely until stopped.
Keep coding... :)

initWithContentsOfURL often returns nil

NSError *error;
NSString *string = [[NSString alloc]
initWithContentsOfURL:URL
encoding:NSUTF8StringEncoding
error:&error];
When I test this on my iPhone it always works when wifi is turned on. However, when I'm on 3G I often get nil; if I try perhaps 15 times in a row (I have an update button for this), I finally get the desired result.
My question is: is this problem on the server side, or is my code unreliable? Should I use a different approach to get a more reliable fetch of the data?
You haven't provided enough information to give anything but a vague answer, but you do have some options here.
Most importantly, you have an error parameter whose result you should be printing out. There's also a slightly better API in the NSString class that you could be using.
Change your code to something like this:
NSError *error = NULL;
NSStringEncoding actualEncoding;
// variable names in Objective-C should usually start with lower-case letters, so change
// URL in your code to "url", or even something more descriptive, like "urlOfOurString"
NSString *string = [[NSString alloc] initWithContentsOfURL:urlOfOurString usedEncoding:&actualEncoding error:&error];
if(string)
{
    NSLog( @"hey, I actually got a result of %@", string);
    if(actualEncoding != NSUTF8StringEncoding)
    {
        // I also suspect the string you're trying to load really isn't UTF8
        NSLog( @"and look at that, the actual encoding wasn't NSUTF8StringEncoding");
    }
} else {
    NSLog( @"error when trying to fetch from URL %@ - %@", [urlOfOurString absoluteString], [error localizedDescription]);
}
I'm now using STHTTPRequest instead. I highly recommend this library: it's easy to use yet powerful.
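If you would rather stay within Foundation, an asynchronous fetch avoids blocking on a flaky 3G connection and hands you the error explicitly. A minimal sketch with NSURLSession (iOS 7+), reusing urlOfOurString from above:

NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithURL:urlOfOurString
      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (data) {
        // Decode explicitly, since we fetched raw bytes rather than a string.
        NSString *string = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
        NSLog(@"fetched: %@", string);
    } else {
        NSLog(@"fetch failed: %@", [error localizedDescription]);
    }
}];
[task resume];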

QTKit, capture video for live streaming

I am trying to create a Mac application that does live video streaming. I know about VLC and other solutions, but still.
To that end I am trying to record video from the iSight using QTKit and save it continuously as a series of tiny video files. However, the recording turns out not quite continuous, with gaps between the files.
Basically, I am just setting up a timer that starts recording to a new file at certain intervals, thus stopping the old recording. I also tried setting the maximum recorded length and using the delegate methods ...didFinishRecording... and ...willFinishRecording..., but with the same result (I can't really estimate the difference between the gaps in these cases).
Please help me if you know how these things should be done.
Here is my current code:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    QTCaptureSession *session = [[QTCaptureSession alloc] init];
    QTCaptureDevice *iSight = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    [iSight open:nil];
    QTCaptureDeviceInput *myInput = [QTCaptureDeviceInput deviceInputWithDevice:iSight];
    output = [[QTCaptureMovieFileOutput alloc] init]; // ivar, QTCaptureFileOutput
    [output setDelegate:self];
    a = 0; // ivar, int
    fileName = @"/Users/dtv/filerecording_"; // ivar, NSString
    [session addOutput:output error:nil];
    [session addInput:myInput error:nil];
    [capview setCaptureSession:session]; // IBOutlet
    [session startRunning];
    [output setCompressionOptions:[QTCompressionOptions compressionOptionsWithIdentifier:@"QTCompressionOptionsSD480SizeH264Video"] forConnection:[[output connections] objectAtIndex:0]];
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];
    NSTimer *tmr = [NSTimer timerWithTimeInterval:5 target:self selector:@selector(getMovieLength:) userInfo:nil repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:tmr forMode:NSDefaultRunLoopMode];
}

- (void)getMovieLength:(NSTimer *)t {
    a++;
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];
}
There is a native mechanism to break the captured movie into pieces. Use
[QTCaptureFileOutput setMaximumRecordedDuration:]
to specify the duration of each piece, or
[QTCaptureFileOutput setMaximumRecordedFileSize:]
to specify a file size limit.
When the limit is reached the delegate method will be called:
[QTCaptureFileOutput_Delegate captureOutput: shouldChangeOutputFileAtURL: forConnections: dueToError:]
In this method you can set the new file name:
[QTCaptureFileOutput recordToOutputFileURL:]
This will allow you to cut the pieces of the recorded movie pretty precisely.
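For example, the timer-based rotation above might be replaced with the native limit like this (a sketch; QTMakeTime(5, 1) builds a 5-second QTTime):

// Ask QTKit to split the recording into ~5-second pieces natively.
[output setMaximumRecordedDuration:QTMakeTime(5, 1)];
// When the limit is reached, QTKit calls the delegate's
// captureOutput:shouldChangeOutputFileAtURL:forConnections:dueToError:,
// where you hand the next file URL to recordToOutputFileURL:.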
Note that [QTCaptureFileOutput_Delegate captureOutput: didFinishRecordingToOutputFileAtURL: forConnections: dueToError:] will be called a little after the recording into the file has actually finished. If you use this method to set the new file, you will have gaps in the final video. That does not mean you should not use this method, though: it indicates when a piece of the movie is ready to be used.
If you need even more precise cutting you can use
[QTCaptureFileOutput captureOutput: didOutputSampleBuffer: fromConnection:]
to pick the exact movie frame at which to start recording into a new piece. However, you will need more specific knowledge of sample buffers to work with this method.

MPMoviePlayerController doesn't work after upgrading to iOS 5

This code works perfectly on iPad 4.3 Simulator:
NSString *source = [mediaObject objectForKey:@"source"];
NSString *videoPath = [NSString stringWithFormat:@"%@/%@", path, source];
NSURL *videoUrl = [NSURL fileURLWithPath:videoPath];
MPMoviePlayerController *videoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoUrl];
videoPlayer.shouldAutoplay = NO;
videoPlayer.view.frame = CGRectMake(xPos, yPos, width, height);
[backgroundImageView addSubview:videoPlayer.view];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoPlaybackStateDidChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:videoPlayer];
but it doesn't work on the iPad 5 Simulator: I get a black frame with no movie or playback controls.
I read the Apple changelog for MPMoviePlayerController, but I didn't find anything about this problem. Can you help me?
I solved the problem in this way: in my header file I wrote:
MPMoviePlayerController *moviePlayer;
with this property:
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;
and in the method in which I init the moviePlayer:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieUrl];
self.moviePlayer = player;
It seems that assigning the player to a strong property "saves" the player: under ARC, a player held only in a local variable is deallocated as soon as the method returns, which leaves nothing but a black, inert view.
You don't mention what type of URL you are trying to play, however, if it's an HTTP Live Streaming resource (.m3u8 file), then be aware that iOS 5.0 seems to have tightened up on validating the contents of the m3u8 index file.
Specifically, I've discovered that:
No individual segment can be more than twice as long as the #EXT-X-TARGETDURATION value;
The #EXTINF value (segment length in seconds) can now only be an integer value.
If one of these is your problem, running your application under the iOS 5.0 simulator should show a warning in the debugger console.
For HLS on iOS 5, the TARGETDURATION value is really not the target duration but the maximum duration, so it should be set to the duration of the longest segment in the file.
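For example, an index file that satisfies both rules could look like this (integer #EXTINF values, and #EXT-X-TARGETDURATION no smaller than the longest segment; segment names are hypothetical):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment0.ts
#EXTINF:9,
segment1.ts
#EXT-X-ENDLIST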
