QTKit, capture video for live streaming - cocoa

I am trying to create a Mac application that does live video streaming. I know about VLC and other solutions, but still.
To that end I am trying to record video from the iSight using QTKit and save it continuously as a series of tiny video files. However, the recording turns out not quite continuous: there are gaps between the files.
Basically, I am just setting up a timer that starts recording to a new file at certain time intervals, thus stopping the old recording. I also tried setting the maximum recorded length and using the delegate methods ...didFinishRecording... and ...willFinishRecording..., but with the same result (I can't really measure any difference in the gaps between these cases).
Please help me if you know how these things should be done.
Here is my current code:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    QTCaptureSession *session = [[QTCaptureSession alloc] init];
    QTCaptureDevice *iSight = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    [iSight open:nil];
    QTCaptureDeviceInput *myInput = [QTCaptureDeviceInput deviceInputWithDevice:iSight];
    output = [[QTCaptureMovieFileOutput alloc] init]; // ivar, QTCaptureFileOutput
    [output setDelegate:self];
    a = 0; // ivar, int
    fileName = @"/Users/dtv/filerecording_"; // ivar, NSString
    [session addOutput:output error:nil];
    [session addInput:myInput error:nil];
    [capview setCaptureSession:session]; // IBOutlet
    [session startRunning];
    [output setCompressionOptions:[QTCompressionOptions compressionOptionsWithIdentifier:@"QTCompressionOptionsSD480SizeH264Video"] forConnection:[[output connections] objectAtIndex:0]];
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];
    NSTimer *tmr = [NSTimer timerWithTimeInterval:5 target:self selector:@selector(getMovieLength:) userInfo:nil repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:tmr forMode:NSDefaultRunLoopMode];
}

- (void)getMovieLength:(NSTimer *)t {
    a++;
    [output recordToOutputFileURL:[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%i.mov", fileName, a]] bufferDestination:QTCaptureFileOutputBufferDestinationOldFile];
}

There is a native mechanism to break the captured movie into pieces. Use
[QTCaptureFileOutput setMaximumRecordedDuration:]
to specify the duration of each piece, or
[QTCaptureFileOutput setMaximumRecordedFileSize:]
to specify a file size limit.
When the limit is reached, the delegate method
[QTCaptureFileOutput_Delegate captureOutput: shouldChangeOutputFileAtURL: forConnections: dueToError:]
will be called. In this method you can set the new file name with
[QTCaptureFileOutput recordToOutputFileURL:]
This will allow you to cut the pieces of the recorded movie quite precisely.
Note that [QTCaptureFileOutput_Delegate captureOutput: didFinishRecordingToOutputFileAtURL: forConnections: dueToError:] is called a bit after the recording into the file has actually finished. If you use this method to set the new file, you will have gaps in the final video. That does not mean you should not use it, though: it indicates when a piece of the movie is ready to be used.
If you need even more precise cutting, you can use
[QTCaptureFileOutput_Delegate captureOutput: didOutputSampleBuffer: fromConnection:]
to pick the exact frame at which to start recording into a new piece. However, working with that method requires more specialized knowledge.
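For illustration, here is a minimal sketch of that delegate flow, reusing the a and fileName ivars from the question. Treat it as a starting point rather than a verified implementation; in particular, check the return-value semantics of the should-change callback against the QTKit headers.

- (BOOL)captureOutput:(QTCaptureFileOutput *)captureOutput
        shouldChangeOutputFileAtURL:(NSURL *)outputFileURL
        forConnections:(NSArray *)connections
        dueToError:(NSError *)error
{
    // Called when maximumRecordedDuration (or maximumRecordedFileSize) is hit;
    // immediately point the output at the next file in the series.
    a++;
    NSString *nextPath = [NSString stringWithFormat:@"%@%i.mov", fileName, a];
    [captureOutput recordToOutputFileURL:[NSURL fileURLWithPath:nextPath]];
    return YES; // allow recording to the current file to stop
}

- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
        didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
        forConnections:(NSArray *)connections
        dueToError:(NSError *)error
{
    // Fires a bit after the piece is fully written; the file at
    // outputFileURL is now safe to hand to the streaming side.
}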

Related

AVAudioEngine incorrect time management and callback for AVAudioPlayerNode

I have a serious issue with the new audio engine in iOS 8. I have an application built with AVAudioPlayer, and I am trying to figure out a way to migrate to the new architecture; however, I bumped into the following problem (which, I'm sure you will agree, is a serious and basic obstacle):
My header file:
AVAudioEngine *engine;
AVAudioMixerNode *mainMixer;
AVAudioPlayerNode *player;
My m file (inside the viewDidLoad):
engine = [[AVAudioEngine alloc] init];
player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
NSURL *fileUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileUrl error:nil];
NSLog(@"duration: %.2f", file.length / file.fileFormat.sampleRate);
[player scheduleFile:file atTime:nil completionHandler:^{
    AVAudioTime *nodeTime = player.lastRenderTime;
    AVAudioTime *playerTime = [player playerTimeForNodeTime:nodeTime];
    float secs = (float)playerTime.sampleTime / file.fileFormat.sampleRate;
    NSLog(@"finished at: %.2f", secs);
}];
mainMixer = [engine mainMixerNode];
[engine connect:player to:mainMixer format:file.processingFormat];
[engine startAndReturnError:nil];
[player play];
The above code initializes the engine and a node, then starts playing back whatever file I'm using. First it prints out the duration of the music file; then, after playback finishes, the callback function prints the player's current time. These two should be the same, or in the worst case very, very close to each other, but this is not the case; the difference between the two values is very big, e.g.
duration: 148.51
finished at: 147.61
Am I doing something wrong? This should be fairly straightforward; I've tried different file formats, file lengths, and tens of music files, but the difference is always around or just under 1 second.
Update:
As of iOS 11 you can specify a completionCallbackType: (AVAudioPlayerNodeCompletionCallbackType) when scheduling:
dataConsumed:
A completion handler indicating that the buffer or file data has been consumed by the player.
dataRendered:
A completion handler indicating that the buffer or file data has been rendered by the player.
dataPlayedBack:
A completion handler indicating that the buffer or file has finished playing.
More info: Documentation
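A minimal sketch of the newer call, assuming the same player and file objects as in the question; AVAudioPlayerNodeCompletionDataPlayedBack is the variant that fires once the audio has actually finished playing:

[player scheduleFile:file
              atTime:nil
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
   completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // Unlike dataConsumed, this fires after the sound has been heard.
    NSLog(@"playback actually finished");
}];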
Original:
According to Apple's documentation for scheduleFile:atTime:completionHandler:
It is possible for the completionHandler to be called before rendering
begins or before the file is played completely.

How to update an image from a URL in an OS X app?

I have a problem. I have code that updates a song name and a picture from PHP. The song name works and updates, but the picture does not; in the PHP file everything works, but in my project it doesn't. How can I make the picture update from the URL every 10 seconds, for example? Thanks.
- (void)viewWillDraw {
    NSURL *artistImageURL = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
    NSImage *artistImage = [[NSImage alloc] initWithContentsOfURL:artistImageURL];
    [dj setImage:artistImage];
    dispatch_queue_t queue = dispatch_get_global_queue(0, 0);
    dispatch_async(queue, ^{
        NSError *error = nil;
        NSString *text = [NSString stringWithContentsOfURL:[NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?artist"]
                                                  encoding:NSASCIIStringEncoding
                                                     error:&error];
        dispatch_async(dispatch_get_main_queue(), ^{
            [labelName setStringValue:text];
        });
    });
}
You should really consider placing this code someplace other than -viewWillDraw. This routine can be called multiple times for the same NSView under some circumstances and, more importantly, you need to call [super viewWillDraw] to make sure that things will actually draw correctly (if anything is drawn in the view itself).
For periodic updates (such as every 10 seconds), you should consider using NSTimer to trigger the retrieval of the next object.
As for the general question of why your image isn't being drawn correctly, you should probably consider putting the image retrieval and drawing code into the same structure as your label retrieval and drawing code. This will get the [dj setImage: artistImage] method call outside of the viewWillDraw chain which is likely causing some difficulty here.
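A minimal sketch of that structure, reusing the dj and labelName outlets and the URL from the question (the method name refreshArtwork: is just for illustration):

- (void)awakeFromNib {
    // Fire every 10 seconds; a repeating NSTimer retains its target.
    [NSTimer scheduledTimerWithTimeInterval:10.0
                                     target:self
                                   selector:@selector(refreshArtwork:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)refreshArtwork:(NSTimer *)timer {
    // Fetch off the main thread, then update the view on the main thread.
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        NSURL *url = [NSURL URLWithString:@"http://site.ru/ParseDataField/kiss.php?image"];
        NSImage *image = [[NSImage alloc] initWithContentsOfURL:url];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (image) [dj setImage:image];
        });
    });
}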

Scaling a QTMovie before appending

Using the QTKit framework, I'm developing a little app.
In the app, I'm trying to append one movie after another, which in essence is already working (most of the time), but I'm having a little trouble with the appended movie. The movie I'm appending to is quite big, 1920x1080, and the appended movie is usually much smaller, but I never know exactly what size it is. The appended movie keeps its own size inside the previous 1920x1080 frame.
Is there anyone familiar with this? Is there a way I can scale the movie I'm appending to, to the size of the appended movie? There is no reference to such a thing in the documentation.
Here are some of the relevant methods:
QTMovie *segmentTwo = [QTMovie movieWithURL:finishedMovie error:nil];
QTTimeRange range = { .time = QTZeroTime, .duration = [segmentTwo duration] };
[segmentTwo setSelection:range];
[leader appendSelectionFromMovie:segmentTwo];
while ([[leader attributeForKey:QTMovieLoadStateAttribute] longValue] != 100000L)
{
    // wait until QTMovieLoadStateComplete
}
NSDictionary *exportAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithBool:YES], QTMovieExport,
                                  [NSNumber numberWithLong:kQTFileTypeMovie], QTMovieExportType, nil];
NSString *outputFile = [NSString stringWithFormat:@"%@.mov", onderwerp];
NSString *filepath = [[@"~/Desktop" stringByExpandingTildeInPath] stringByAppendingFormat:@"/%@", outputFile];
BOOL succes = [leader writeToFile:filepath withAttributes:exportAttributes error:&theError];
Leader is initialized like this:
NSDictionary *movieAttributes = [NSDictionary dictionaryWithObjectsAndKeys:path, QTMovieFileNameAttribute, [NSNumber numberWithBool:YES], QTMovieEditableAttribute, nil];
leader = [QTMovie movieWithAttributes: movieAttributes error:&error];
This contained all the information I needed, although it doesn't use the QTKit framework: QTKit - Merge two videos with different width and height?
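If you want to stay within QTKit, one possible approach is to normalize the appended movie's video track dimensions before appending. This is an untested sketch: QTMovieNaturalSizeAttribute and QTTrackDimensionsAttribute are standard QTKit keys, but whether appendSelectionFromMovie: honors them in this exact situation is an assumption.

// Force segmentTwo's video tracks to the leader's natural size so both
// movies share one frame size before the append.
NSSize leaderSize = [[leader attributeForKey:QTMovieNaturalSizeAttribute] sizeValue];
for (QTTrack *track in [segmentTwo tracksOfMediaType:QTMediaTypeVideo]) {
    [track setAttribute:[NSValue valueWithSize:leaderSize]
                 forKey:QTTrackDimensionsAttribute];
}
[leader appendSelectionFromMovie:segmentTwo];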

Having trouble with NSSound currentTime

I'm using NSSound and I can get sounds to play / pause correctly, but for some reason I can't get the currentTime method to return anything else than zero.
Here's the code that I'm having trouble with:
NSSound *sound = [[NSSound alloc] initWithContentsOfFile:@"path_to_sound.mp3" byReference:NO];
[sound play];
sleep(1);
NSLog(@"Current time: %f", [sound currentTime]);
The sound plays but the NSLog always returns zero. Any ideas?
By default, NSSound uses the default sound device; you can change the output device using the setPlaybackDeviceIdentifier: method of NSSound.
Take a look at this, and at how to get an Audio Device UID to pass into NSSound's setPlaybackDeviceIdentifier: for obtaining the unique identifier of a sound output device.
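Something along these lines should print the available device UIDs; this is a hypothetical sketch using plain CoreAudio (the property constants are real; error handling is omitted):

#import <CoreAudio/CoreAudio.h>

static void ListAudioDeviceUIDs(void) {
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDevices,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    UInt32 size = 0;
    AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &addr, 0, NULL, &size);
    UInt32 count = size / sizeof(AudioDeviceID);
    AudioDeviceID *devices = (AudioDeviceID *)malloc(size);
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, devices);
    for (UInt32 i = 0; i < count; i++) {
        // Each UID is a string you can pass to -setPlaybackDeviceIdentifier:.
        CFStringRef uid = NULL;
        UInt32 uidSize = sizeof(uid);
        AudioObjectPropertyAddress uidAddr = {
            kAudioDevicePropertyDeviceUID,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        AudioObjectGetPropertyData(devices[i], &uidAddr, 0, NULL, &uidSize, &uid);
        NSLog(@"Device UID: %@", (__bridge NSString *)uid);
        if (uid) CFRelease(uid);
    }
    free(devices);
}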
I do not have a machine to test things, but I expect that you may learn what you need from:
NSLog(#"Playback on %#, current time: %# / %#",
[sound playbackDeviceIdentifier], [sound currentTime], [sound duration]);

Placing an NSTimer in a separate thread

Note: It's probably worth scrolling down to read my edit.
I'm trying to setup an NSTimer in a separate thread so that it continues to fire when users interact with the UI of my application. This seems to work, but Leaks reports a number of issues - and I believe I've narrowed it down to my timer code.
Currently what's happening is that updateTimer tries to access an NSArrayController (timersController) which is bound to an NSTableView in my applications interface. From there, I grab the first selected row and alter its timeSpent column. Note: the contents of timersController is a collection of managed objects generated via Core Data.
From reading around, I believe what I should be trying to do is execute the updateTimer function on the main thread, rather than in my timer's secondary thread.
I'm posting here in the hopes that someone with more experience can tell me if that's the only thing I'm doing wrong. Having read Apple's documentation on Threading, I've found it an overwhelmingly large subject area.
NSThread *timerThread = [[[NSThread alloc] initWithTarget:self selector:@selector(startTimerThread) object:nil] autorelease];
[timerThread start];

- (void)startTimerThread
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSRunLoop *runLoop = [NSRunLoop currentRunLoop];
    activeTimer = [[NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(updateTimer:) userInfo:nil repeats:YES] retain];
    [runLoop run];
    [pool release];
}

- (void)updateTimer:(NSTimer *)timer
{
    NSArray *selectedTimers = [timersController selectedObjects];
    id selectedTimer = [selectedTimers objectAtIndex:0];
    NSNumber *currentTimeSpent = [selectedTimer timeSpent];
    [selectedTimer setValue:[NSNumber numberWithInt:[currentTimeSpent intValue] + 1] forKey:@"timeSpent"];
}

- (void)stopTimer
{
    [activeTimer invalidate];
    [activeTimer release];
}
UPDATE
I'm still totally lost with regard to this leak. I know I'm obviously doing something wrong, but I've stripped my application down to its bare bones and still can't seem to find it. For simplicity's sake, I've uploaded my application's controller code to a small pastebin. Note that I've now removed the timer thread code and instead opted to run the timer in a separate runloop (as suggested here).
If I set the Leaks Call Tree to hide both Missing Symbols and System Libraries, I'm shown the following output:
EDIT: Links to screenshots broken and therefore removed.
If the only reason you are spawning a new thread is to allow your timer to run while the user is interacting with the UI, you can just add it to the main run loop in a different mode:
NSTimer *uiTimer = [NSTimer timerWithTimeInterval:(1.0 / 5.0) target:self selector:@selector(uiTimerFired:) userInfo:nil repeats:YES];
[[NSRunLoop mainRunLoop] addTimer:uiTimer forMode:NSRunLoopCommonModes];
As an addendum to this answer it is now possible to schedule timers using Grand Central Dispatch and blocks:
// Update the UI 5 times per second on the main queue
// Keep a strong reference to _timer in ARC
_timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_main_queue());
dispatch_source_set_timer(_timer, DISPATCH_TIME_NOW, (1.0 / 5.0) * NSEC_PER_SEC, 0.25 * NSEC_PER_SEC);
dispatch_source_set_event_handler(_timer, ^{
    // Perform a periodic action
});
// Start the timer
dispatch_resume(_timer);
Later when the timer is no longer needed:
dispatch_source_cancel(_timer);
