Unexpected behaviour of AVCaptureMovieFileOutput - Xcode

I am recording the screen using AVCaptureMovieFileOutput, but it is showing unexpected behaviour.
If I set the cropRect parameter, the captured movie is fine, but if I capture the whole screen, I get a folder at the destination path instead of a movie file. How can I get rid of that?
The code is:
// Create a capture session
mSession = [[AVCaptureSession alloc] init];
// If you're on a multi-display system and you want to capture a secondary display,
// you can call CGGetActiveDisplayList() to get the list of all active displays.
// For this example, we just specify the main display.
CGDirectDisplayID displayId = kCGDirectMainDisplay;
// Create a ScreenInput with the display and add it to the session
input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
[input setCropRect:rect];
if (!input) {
mSession = nil;
return;
}
if ([mSession canAddInput:input])
[mSession addInput:input];
// Create a MovieFileOutput and add it to the session
mMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([mSession canAddOutput:mMovieFileOutput])
[mSession addOutput:mMovieFileOutput];
// Start running the session
[mSession startRunning];
// Delete any existing movie file first
if ([[NSFileManager defaultManager] fileExistsAtPath:[destPath path]])
{
NSError *err;
if (![[NSFileManager defaultManager] removeItemAtPath:[destPath path] error:&err])
{
NSLog(#"Error deleting existing movie %#",[err localizedDescription]);
}
}
// Start recording to the destination movie file
// The destination path is assumed to end with ".mov", for example, @"/users/master/desktop/capture.mov"
// Set the recording delegate to self
[mMovieFileOutput startRecordingToOutputFileURL:destPath recordingDelegate:self];

You have to use the setSessionPreset property.
The default value of sessionPreset is AVCaptureSessionPresetHigh, and it does not work with full-screen capture. You have to provide a different preset for that.
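For example, a minimal sketch of setting an explicit preset before adding the screen input (AVCaptureSessionPreset1280x720 is just one possibility; pick whichever resolution suits):
mSession = [[AVCaptureSession alloc] init];
if ([mSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    [mSession setSessionPreset:AVCaptureSessionPreset1280x720];
}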

Related

Core Data Exporting all tables

I have an app that creates individual events and stores them in Core Data. What I need to do is load one individually and then export it by email. The code below works, except it exports every event where I need it to export only the one at the selected index path. The code does load the appropriate record, because NSLog(@"My record is: %@", currentItem); displays only the settings for that event, but when the data is exported to email, all events are sent. I need to export just the selected event with its event name. Any thoughts?
NSInteger index = exportevent.tag;
NSIndexPath *indexPath = [NSIndexPath indexPathForRow:index inSection:0];
CDBaseItem *rawRecord = [self.fetchedResultsController objectAtIndexPath:indexPath];
CDSurveyItem *surveyItem = [CDSurveyItem castObject:rawRecord];
self.recordEditID = [rawRecord.objectID URIRepresentation];
NSManagedObjectID *objectId = [self.managedObjectContext.persistentStoreCoordinator managedObjectIDForURIRepresentation:self.recordEditID];
TSPItem *currentItem = [self.managedObjectContext objectWithID:objectId];
NSString *eventName = nil;
if (currentItem.eventname) {
eventName = currentItem.eventname;
}
else if (surveyItem.eventname) {
eventName = surveyItem.eventname;
}
[self setSelection:indexPath];
if (self.selection)
{
if (currentItem)
{
NSLog (#"My record is: %#", currentItem);
NSData *export = [CDJSONExporter exportContext:currentItem.managedObjectContext auxiliaryInfo:nil];
MFMailComposeViewController *composeVC1 = [[MFMailComposeViewController alloc] init];
composeVC1.mailComposeDelegate = self;
[composeVC1 setSubject:[NSString stringWithFormat:@"Settings From %@ Event", eventName]];
[composeVC1 setMessageBody:[NSString stringWithFormat:@"Here is the event settings. Simply press on the attachment and then choose Open in iPIX"] isHTML:NO];
[composeVC1 addAttachmentData:export mimeType:@"application/octet-stream" fileName:[NSString stringWithFormat:@"%@.ipix", eventName]];
[self presentViewController:composeVC1 animated:NO completion:^(void){}];
}
[self setSelection:nil];
}
Your NSLog may be correct, but you're not exporting the thing that you're printing. In this line (which I assume is a reference to this project):
NSData *export = [CDJSONExporter exportContext:currentItem.managedObjectContext auxiliaryInfo:nil];
You're telling CDJSONExporter to export the context, not a single object. You get every object because that is what CDJSONExporter does. It gets everything it can find in the context and gives you a data object. It's not designed to do what you're asking it to do.
If you want to convert a single object to JSON, you could:
1. Roll your own JSON conversion code. Since you know what the object looks like, this would be easy (see the sketch after this list). Or...
2. Implement Encodable on your model object and then use JSONEncoder to convert to JSON. Or...
3. Find some other open source project that does what you want, instead of this one, which does not.
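A minimal sketch of option 1, walking only the entity's attributes (values are crudely stringified for illustration; relationships and proper type handling are left out):
NSMutableDictionary *json = [NSMutableDictionary dictionary];
NSEntityDescription *entity = [currentItem entity];
for (NSString *key in [entity attributesByName]) {
    id value = [currentItem valueForKey:key];
    if (value) {
        json[key] = [value description]; // crude: turns dates/numbers into strings
    }
}
NSError *jsonError = nil;
NSData *export = [NSJSONSerialization dataWithJSONObject:json options:0 error:&jsonError];
The resulting data can then be attached to the mail composer exactly as before.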

passing video frame to Core Image on osx

Hi all you awesome coders! I've put together this thing from various helpful sources over the last couple of weeks (including a lot of posts from Stack Overflow), trying to create something that will take a webcam feed and detect smiles when they occur (might as well draw boxes around the faces and the smiles as well; that doesn't seem like it would be hard once they are detected). Please give me some leeway if it's messy code, because I'm still very much learning.
Currently I'm stuck at trying to pass the frame to a CIImage so it can be analysed for faces (I plan to deal with smiles after the face hurdle is overcome). As it is, the build succeeds if I comment out the block after (5) - it brings up a simple AVCaptureVideoPreviewLayer in a window. I think this is what I've called "rootLayer", so it's like the first layer of the displayed output, and after I detect faces in the video frames I'll show a rectangle following the "bounds" of any detected face in a new layer overlaid on top of this one, and I've called that layer "previewLayer"... correct?
But with the block after (5) there, the compiler throws out three errors -
Undefined symbols for architecture x86_64:
"_CMCopyDictionaryOfAttachments", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
"_CMSampleBufferGetImageBuffer", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Can anyone tell me where I'm going wrong and what my next steps are?
Thanks for any help. I've been stuck at this point for a couple of days and I can't figure it out; all the examples I can find are for iOS and don't work on OS X.
- (id)init
{
self = [super init];
if (self) {
// Move the output part to another function
[self addVideoDataOutput];
// Create a capture session
session = [[AVCaptureSession alloc] init];
// Set a session preset (resolution)
self.session.sessionPreset = AVCaptureSessionPreset640x480;
// Select devices if any exist
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (videoDevice) {
[self setSelectedVideoDevice:videoDevice];
} else {
[self setSelectedVideoDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed]];
}
NSError *error = nil;
// Add an input
videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
[self.session addInput:self.videoDeviceInput];
// Start the session (app opens slower if it is here but I think it is needed in order to send the frames for processing)
[[self session] startRunning];
// Initial refresh of device list
[self refreshDevices];
}
return self;
}
-(void) addVideoDataOutput {
// (1) Instantiate a new video data output object
AVCaptureVideoDataOutput * captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
// discard if the data output queue is blocked (while CI processes the still image)
captureOutput.alwaysDiscardsLateVideoFrames = YES;
// (2) The sample buffer delegate requires a serial dispatch queue
dispatch_queue_t captureOutputQueue;
captureOutputQueue = dispatch_queue_create("CaptureOutputQueue", DISPATCH_QUEUE_SERIAL);
[captureOutput setSampleBufferDelegate:self queue:captureOutputQueue];
dispatch_release(captureOutputQueue); //what does this do and should it be here or after we receive the processed image back?
// (3) Define the pixel format for the video data output
NSString * key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber * value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary * settings = @{key: value};
[captureOutput setVideoSettings:settings];
// (4) Configure the output port on the captureSession property
if ( [self.session canAddOutput:captureOutput] )
[session addOutput:captureOutput];
}
// Implement the Sample Buffer Delegate Method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
// I *think* I have a video frame now in some sort of image format... so have to convert it into a CIImage before I can process it:
// (5) Convert CMSampleBufferRef to CVImageBufferRef, then to a CI Image (per weichsel's answer in July '13)
CVImageBufferRef cvFrameImage = CMSampleBufferGetImageBuffer(sampleBuffer); // Having trouble here, prog. stops and won't recognise CMSampleBufferGetImageBuffer.
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage options:(__bridge NSDictionary *)attachments];
//self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage];
//OK so it is a CIImage. Find some way to send it to a separate CIImage function to find the faces, then smiles. Then send it somewhere else to be displayed on top of AVCaptureVideoPreviewLayer
//TBW
}
- (NSString *)windowNibName
{
return #"AVRecorderDocument";
}
- (void)windowControllerDidLoadNib:(NSWindowController *) aController
{
[super windowControllerDidLoadNib:aController];
// Attach preview to session
CALayer *rootLayer = self.previewView.layer;
[rootLayer setMasksToBounds:YES]; //aaron added
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
[self.previewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
[self.previewLayer setFrame:[rootLayer bounds]];
//[newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; //don't think I need this for OSX?
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
[rootLayer addSublayer:previewLayer];
// [newPreviewLayer release]; //what's this for?
}
(moved from the comments section)
Wow. I guess two days and one StackOverflow post is what it takes to figure out that I haven't added CoreMedia.framework to my project.
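For the face-detection step mentioned above, a minimal sketch using CIDetector on the CIImage built in step (5) - assuming the frames arrive as in the question's delegate method, and noting that the CIDetectorSmile option requires OS X 10.9 or later:
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }];
NSArray *features = [faceDetector featuresInImage:self.ciFrameImage
                                          options:@{ CIDetectorSmile : @YES }];
for (CIFaceFeature *face in features) {
    // face.bounds is in image coordinates; it must be mapped before drawing over the preview layer
    NSLog(@"Face at %@, smiling: %d", NSStringFromRect(NSRectFromCGRect(face.bounds)), face.hasSmile);
}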

NSArray count +1 from current item & loop

I am writing an audio player app using AVAudioPlayer to play MP3s from the documents directory, loading them with an ALAssetsLibrary subclass and AVURLAsset. When I select an MP3 from the table view it pushes to a player and plays, but I'm struggling with the skip & previous buttons.
I can skip to the last track with the code below, although I'm unsure if this is the correct approach - sending one AVURLAsset to play rather than a queue array.
I'm using Apple's sample project AVPlayerDemo as the foundation of the ALAssetsLibrary subclass.
- (IBAction)nextButtonTapped:(id)sender {
NSArray *sourceitems = [[finalSources objectAtIndex:_selectedPath.section] items];
[sourceitems enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
id next = nil;
next = [next objectAtIndex:_selectedPath.row];
if (idx + 1 < sourceitems.count) {
next = [sourceitems objectAtIndex:idx + 1];
AssetBrowserItem *item = next;
AVURLAsset *asset = (AVURLAsset*)item.asset;
[self setURL:asset.URL];
NSLog(#"next %#", next);
//here it continues to the last object and plays,
//but i just want to skip one track and play, then click again and move
//to the next one
}
}];
}
- (void)setURL:(NSURL*)URL
{
if (mURL != URL)
{
mURL = [URL copy];
/*
Create an asset for inspection of a resource referenced by a given URL.
Load the values for the asset keys "tracks", "playable".
*/
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:mURL options:nil];
NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
/* Tells the asset to load the values of any of the specified keys that are not already loaded. */
[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
^{
dispatch_async( dispatch_get_main_queue(),
^{
/* IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem. */
[self prepareToPlayAsset:asset withKeys:requestedKeys];
[self syncLabels:mURL];
});
}];
}
}
Any pointers would be greatly appreciated.
Thanks in advance Gav.
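One way to skip exactly one track is to compute the next index from the currently selected row instead of enumerating the whole array - a minimal sketch, assuming _selectedPath tracks the item that is currently playing:
- (IBAction)nextButtonTapped:(id)sender {
    NSArray *sourceitems = [[finalSources objectAtIndex:_selectedPath.section] items];
    NSUInteger nextRow = _selectedPath.row + 1;
    if (nextRow < sourceitems.count) {
        _selectedPath = [NSIndexPath indexPathForRow:nextRow inSection:_selectedPath.section];
        AssetBrowserItem *item = [sourceitems objectAtIndex:nextRow];
        AVURLAsset *asset = (AVURLAsset *)item.asset;
        [self setURL:asset.URL];
    }
}
A previous button would mirror this with _selectedPath.row - 1, guarding against row 0.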

Cocoa: AVFoundation - no active/enabled connections

I'm trying to build a simple app that captures video off the built-in iSight camera of a MacBook. I've looked at a couple of example projects on the developer site and am following the tutorial here: Apple's AVFoundation Guide.
Each time, I break on the AVCaptureMovieFileOutput with an uncaught exception - no active/enabled connections. I'm new to the AV framework, so I'm not sure why it recognizes the iSight, allows me to input it to the session, allows me to make a movie output for the session, but then tells me there are no connections. What connections is it looking for? (Note: I do not have a QTMovieView in my view controller yet, but thought I would only need that for playback, not recording.)
I know the iSight is working, as I just used it recently with Skype.
Here's my relevant code:
thisSession = [[AVCaptureSession alloc] init];
//set presets for this session
if ([thisSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
thisSession.sessionPreset = AVCaptureSessionPreset640x480;
NSLog(#"Session Preset: OK");
//capture a device - captures all the devices, microphone, camera, etc.
NSArray *devices = [AVCaptureDevice devices];
//this will hold our decvice
AVCaptureDevice* iSightCamera;
for (AVCaptureDevice *device in devices) {
//we only want to work with the internal camera
if ([[device localizedName] isEqualToString:@"Built-in iSight"]) {
iSightCamera = device;
//creating an input of the device for the session
NSError *error = nil;
AVCaptureDeviceInput* iSightCameraInput =
[AVCaptureDeviceInput deviceInputWithDevice:iSightCamera error:&error];
if (!iSightCameraInput) {
NSLog(#"Error creating device input: %#", error);
} else {
NSLog(#"iSight device input created!");
//adding the device input to the session
if ([thisSession canAddInput:iSightCameraInput]) {
[thisSession addInput:iSightCameraInput];
NSLog(#"iSight input added to session!");
//add the output to the session
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([thisSession canAddOutput:movieOutput]) {
[thisSession beginConfiguration];
[thisSession addOutput:movieOutput];
[thisSession commitConfiguration];
NSLog(#"Movie output added to the session!");
//start writing the movie
NSURL *movieFolder = [NSURL fileURLWithPath:[@"~/Movies" stringByExpandingTildeInPath]];
[movieOutput startRecordingToOutputFileURL:movieFolder recordingDelegate:self];
}
else {
NSLog(#"Error: Could not add movie output to the session.");
}
}
else {
NSLog(#"Error: Could not add iSight to session.");
}
}
}
}
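Two things stand out in the code as quoted: the session is never started with startRunning, and the URL handed to startRecordingToOutputFileURL: points at the ~/Movies directory rather than at a movie file. A minimal sketch of the recording step under those assumptions (the file name is made up):
[thisSession startRunning];
NSString *moviePath = [@"~/Movies/capture.mov" stringByExpandingTildeInPath]; // hypothetical file name
NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
[movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self];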

Cocoa: Getting image file metadata

I am building an app that allows users to select a file and/or folder, either locally or across the network, and list the contents of that selection in an NSTableView after some filtering (no hidden files, only accepting .tif and .eps). The user can then select a file name from the list and have the file's metadata shown to them. At least that is what I want to happen. Right now I am getting null returned for the metadata. Here's my code:
- (void)tableViewSelectionDidChange:(NSNotification *)notif {
NSDictionary* metadata = [[NSDictionary alloc] init];
//get selected item
NSString* rowData = [fileList objectAtIndex:[tblFileList selectedRow]];
//set path to file selected
NSString* filePath = [NSString stringWithFormat:@"%@/%@", objPath, rowData];
//declare a file manager
NSFileManager* fileManager = [[NSFileManager alloc] init];
//check to see if the file exists
if ([fileManager fileExistsAtPath:filePath] == YES) {
//escape all the garbage in the string
NSString *percentEscapedString = (NSString *)CFURLCreateStringByAddingPercentEscapes(NULL, (CFStringRef)filePath, NULL, NULL, kCFStringEncodingUTF8);
//convert path to NSURL
NSURL* filePathURL = [[NSURL alloc] initFileURLWithPath:percentEscapedString];
NSError* error = nil;
NSLog(@"%d", [filePathURL checkResourceIsReachableAndReturnError:&error]);
//declare a cg source reference
CGImageSourceRef sourceRef;
//set the cg source reference to the image by passing its URL path
sourceRef = CGImageSourceCreateWithURL((CFURLRef)filePathURL, NULL);
//set a dictionary with the image metadata from the source reference
metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(sourceRef,0,NULL);
NSLog(#"%#", metadata);
[filePathURL release];
} else {
[self showAlert:#"I cannot find this file."];
}
[fileManager release];
}
I'm guessing the problem here is the CFURLRef in CGImageSourceCreateWithURL. Instead of NSURL, should I be using something else?
Thanks
Here's the path I am passing (logged from filePathURL): file://localhost/Volumes/STORAGE%20SVR/Illustration-Wofford/Illustration%20Pickup/Archive/AL013111_IL_Communication.eps
I can't tell where the "localhost" part in your file URL comes from, but I think that's the culprit. A file URL doesn't usually contain a "localhost" part. In your example, it should look like this:
file:///Volumes/STORAGE%20SVR/Illustration-Wofford/Illustration%20Pickup/Archive/AL013111_IL_Communication.eps
But I'm pretty sure you've figured this out by now :)
Update: I stand corrected by Mike's comment: file://localhost/... is the same thing as file:///...!
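As an aside, fileURLWithPath: expects a plain (unescaped) path and does its own percent-escaping, so the manual CFURLCreateStringByAddingPercentEscapes step can probably be dropped, and it is worth checking that Image I/O actually returned a source. A sketch in the question's pre-ARC style:
NSURL* fileURL = [NSURL fileURLWithPath:filePath];
CGImageSourceRef sourceRef = CGImageSourceCreateWithURL((CFURLRef)fileURL, NULL);
if (sourceRef != NULL) {
    NSDictionary* metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
    NSLog(@"%@", metadata);
    [metadata release]; // the Copy rule: we own this dictionary (pre-ARC, as in the question)
    CFRelease(sourceRef);
} else {
    NSLog(@"Could not create an image source for %@", fileURL);
}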
