I'm trying to build a simple app that captures video from the built-in iSight camera of a MacBook. I've looked at a couple of example projects on the developer site and am following the tutorial here: Apple's AVFoundation Guide.
Each time, I break on the AVCaptureMovieFileOutput with an uncaught exception: no active/enabled connections. I'm new to the AV framework, so I'm not sure why it recognizes the iSight, lets me add it as an input to the session, and lets me create a movie output for the session, but then tells me there are no connections. What connections is it looking for? (Note: I do not have a QTMovieView in my view controller yet, but I thought I would only need that for playback, not recording.)
I know the iSight is working as I just used it recently with Skype.
Here's my relevant code:
thisSession = [[AVCaptureSession alloc] init];

//set presets for this session
if ([thisSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    thisSession.sessionPreset = AVCaptureSessionPreset640x480;
    NSLog(@"Session Preset: OK");

    //capture a device - captures all the devices, microphone, camera, etc.
    NSArray *devices = [AVCaptureDevice devices];

    //this will hold our device
    AVCaptureDevice *iSightCamera;
    for (AVCaptureDevice *device in devices) {
        //we only want to work with the internal camera
        if ([[device localizedName] isEqualToString:@"Built-in iSight"]) {
            iSightCamera = device;

            //creating an input of the device for the session
            NSError *error = nil;
            AVCaptureDeviceInput *iSightCameraInput =
                [AVCaptureDeviceInput deviceInputWithDevice:iSightCamera error:&error];

            if (!iSightCameraInput) {
                NSLog(@"Error creating device input: %@", error);
            } else {
                NSLog(@"iSight device input created!");

                //adding the device input to the session
                if ([thisSession canAddInput:iSightCameraInput]) {
                    [thisSession addInput:iSightCameraInput];
                    NSLog(@"iSight input added to session!");

                    //add the output to the session
                    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
                    if ([thisSession canAddOutput:movieOutput]) {
                        [thisSession beginConfiguration];
                        [thisSession addOutput:movieOutput];
                        [thisSession commitConfiguration];
                        NSLog(@"Movie output added to the session!");

                        //start writing the movie
                        NSURL *movieFolder = [NSURL fileURLWithPath:[@"~/Movies" stringByExpandingTildeInPath]];
                        [movieOutput startRecordingToOutputFileURL:movieFolder recordingDelegate:self];
                    }
                    else {
                        NSLog(@"Error: Could not add movie output to the session.");
                    }
                }
                else {
                    NSLog(@"Error: Could not add iSight to session.");
                }
            }
        }
    }
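For reference, the recordingDelegate:self call above implies the class adopts AVCaptureFileOutputRecordingDelegate; a minimal sketch of the required callback (illustrative, not from the original post) looks something like this:
// Minimal sketch of the AVCaptureFileOutputRecordingDelegate callback
// (illustrative, not part of the original code); it fires when recording
// finishes or fails.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        NSLog(@"Recording failed: %@", error);
    } else {
        NSLog(@"Movie written to %@", outputFileURL);
    }
}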
I had a working share routine and now it is broken. I hadn't checked or modified it for some time, and now I find that it is inoperable. When I call
[sharingService performWithItems:[NSArray arrayWithObject:itemProvider]];
a share sheet is displayed. It shows the current members of the share, but the form is inoperable and will not accept any input or clicks. I cannot add participants, remove participants, or stop sharing altogether. When I close the form, my app hangs and will not respond or take focus; I have to kill the app and reopen it to get it working again.
This used to work fine within the last few months. I haven't changed anything, so I am very surprised by this new problem.
Here is my code for creating the share:
NSString *shareOption = [[NSUserDefaults standardUserDefaults] objectForKey:kSet_CLOUD_SERVICE_USER_DEFAULT];
if ([shareOption isEqualToString:TTICloudKitShareOwnerService]) {
    CDEZipCloudFileSystem *zipFile = (CDEZipCloudFileSystem *)_cloudFileSystem;
    CDECloudKitFileSystem *fileSystem = (CDECloudKitFileSystem *)zipFile.cloudFileSystem;
    NSItemProvider *itemProvider = [[NSItemProvider alloc] init];
    [itemProvider registerCloudKitShare:fileSystem.share container:fileSystem.container];
    NSSharingService *sharingService = [NSSharingService sharingServiceNamed:NSSharingServiceNameCloudSharing];
    sharingService.subject = @"Share Workforce Data";
    sharingService.delegate = self;
    if ([sharingService canPerformWithItems:[NSArray arrayWithObject:itemProvider]]) {
        [sharingService performWithItems:[NSArray arrayWithObject:itemProvider]];
        // This is the point at which the Apple UI is presented but inoperable.
        // No changes can be made to the share.
        // The only way to dismiss the dialog is to quit or press escape.
        // Upon dismissal the app is either crashed or hung up.
        // Quitting the app and restarting is the only way to use the app again.
        // If not run from Xcode, requires force quit.
    }
} else {
    NSLog(@"Is Shared Ensemble");
    NSAlert *alert = [[NSAlert alloc] init];
    [alert addButtonWithTitle:@"Stop Share"];
    [alert addButtonWithTitle:@"Cancel"];
    [alert setMessageText:@"Shared Data Options"];
    [alert setInformativeText:@"You are participating in a shared file. Stop sharing will remove your participation and reset your data. You will no longer participate or have access to the shared information."];
    [alert setAlertStyle:NSAlertStyleWarning];
    if ([alert runModal] == NSAlertFirstButtonReturn) {
        [alert setInformativeText:@"Are you sure? You will no longer have access to shared data. You will need the owner of the share to resend an invitation to join the share."];
        if ([alert runModal] == NSAlertFirstButtonReturn) {
            // This actually does not remove the user from sharing as intended.
            // I am sure that is my own incomplete implementation though.
            NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
            [defaults setNilValueForKey:kSet_CLOUDKIT_SHARE_OWNER_DEFAULT];
            [defaults setObject:TTICloudKitShareOwnerService forKey:kSet_CLOUD_SERVICE_USER_DEFAULT];
            [defaults synchronize];
            [self disconnectFromSyncServiceWithCompletion:^{
                // TODO: Need to wipe the existing Core Data info here. Leave them with no access to shared data.
                // Also need to remove self from the share?
                [self reset];
                [self setupEnsemble];
            }];
        }
    }
}
Creating the share and sending it worked flawlessly, and I had been developing the app and testing it live. Currently my test data is shared with two other users and still works. In fact, I can't seem to find a way to stop sharing with those users or to alter the current share in any way.
This is the NSCloudSharingServiceDelegate code:
-(NSCloudKitSharingServiceOptions)optionsForSharingService:(NSSharingService *)cloudKitSharingService shareProvider:(NSItemProvider *)provider
{
    return NSCloudKitSharingServiceAllowPrivate | NSCloudKitSharingServiceAllowReadWrite;
}

-(void)sharingService:(NSSharingService *)sharingService willShareItems:(NSArray *)items
{
    DLog(@"Will Share Called with items: %@", items);
}

-(void)sharingService:(NSSharingService *)sharingService didShareItems:(NSArray *)items
{
    DLog(@"Did share called");
}

-(void)sharingService:(NSSharingService *)sharingService didFailToShareItems:(NSArray *)items error:(NSError *)error
{
    DLog(@"Sharing service failed to share items, %@-", error);
    if (error.code == NSUserCancelledError) return;
    DLog(@"Failed to share, error- %@", error.userInfo);
    [self disconnectFromSyncServiceWithCompletion:^{
        NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
        [defaults setObject:kSet_CLOUDKIT_SHARE_OWNER_DEFAULT forKey:kSet_CLOUD_SERVICE_USER_DEFAULT];
        [defaults setNilValueForKey:kSet_CLOUDKIT_SHARE_OWNER_DEFAULT];
        [defaults synchronize];
    }];
}
It seems I am one of very few people who find this important: I have scoured the web and found almost nobody discussing it, and Apple's documentation is close to nonexistent.
This is a screenshot of the Apple UI which is not working:
I am posting this as an answer, though it is more of a workaround that I have managed to get working. There is still no answer as to why the Apple UI does not respond.
Below is the code for inviting participants without the Apple UI.
-(void)addParticipantWithEmail:(NSString *)email toShare:(CKShare *)share inContainer:(CKContainer *)container
{
    [container discoverUserIdentityWithEmailAddress:email completionHandler:^(CKUserIdentity * _Nullable userInfo, NSError * _Nullable error) {
        if (!userInfo || error) {
            NSLog(@"Participant was not found for email %@", email);
            if (error) {
                NSLog(@"Error: %@", error.userInfo);
            } else {
                NSLog(@"No error was provided");
            }
            // abort
            return;
        }
        CKFetchShareMetadataOperation *fetchMetaDataOperation = [[CKFetchShareMetadataOperation alloc] initWithShareURLs:[NSArray arrayWithObject:share.URL]];
        fetchMetaDataOperation.shouldFetchRootRecord = YES;
        [fetchMetaDataOperation setPerShareMetadataBlock:^(NSURL * _Nonnull shareURL, CKShareMetadata * _Nullable shareMetadata, NSError * _Nullable error) {
            CKRecord *root = shareMetadata.rootRecord;
            if (!root) {
                NSLog(@"There was an error retrieving the root record- %@", error);
            } else {
                NSLog(@"Root is %@", root);
                NSLog(@"\n");
            }
            CKUserIdentityLookupInfo *info = userInfo.lookupInfo;
            CKFetchShareParticipantsOperation *fetchOperation = [[CKFetchShareParticipantsOperation alloc] initWithUserIdentityLookupInfos:[NSArray arrayWithObject:info]];
            [fetchOperation setShareParticipantFetchedBlock:^(CKShareParticipant * _Nonnull participant) {
                participant.permission = CKShareParticipantPermissionReadWrite;
                [share addParticipant:participant];
                CKModifyRecordsOperation *modifyOperation = [[CKModifyRecordsOperation alloc] initWithRecordsToSave:[NSArray arrayWithObjects:root, share, nil] recordIDsToDelete:nil];
                modifyOperation.savePolicy = CKRecordSaveIfServerRecordUnchanged;
                [modifyOperation setPerRecordCompletionBlock:^(CKRecord * _Nonnull record, NSError * _Nullable error) {
                    if (error) {
                        DLog(@"Error modifying record %@. UserInfo: %@", record, error.userInfo);
                    } else {
                        DLog(@"No Error Reported in Modify Operation");
                    }
                }];
                [container.privateCloudDatabase addOperation:modifyOperation];
            }];
            [fetchOperation setFetchShareParticipantsCompletionBlock:^(NSError * _Nullable operationError) {
                if (operationError) {
                    NSLog(@"There was an error in the fetch operation- %@", operationError.userInfo);
                    // The error may be a network issue; should implement a retry, and possibly a limit on how many times to run it
                }
            }];
            [container addOperation:fetchOperation];
        }];
        [container addOperation:fetchMetaDataOperation];
    }];
}
It seems now that if I pass an email address to this function, that user is successfully invited to the share, provided they are in my contacts and have allowed discoverability.
At this point I send the user the link to the share manually via iMessage, copying the URL from the console. My intent is to provide my own forms to handle that now.
On receiving the link, I use the Ensembles method:
CDECloudKitFileSystem acceptInvitationToShareWithMetadata:metadata completion:^(NSError *error)
Initially this didn't seem to work and accepting invites was failing. Without my having changed anything, accepting shares then started working. I am not sure why it failed at first.
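For completeness, here is a hedged sketch of fetching the CKShareMetadata for a received share URL before handing it to the Ensembles method above. The helper name acceptShareFromURL:inContainer: is illustrative, and the Ensembles class-method signature is assumed from the call quoted above.
// Hedged sketch, not from the original post: fetch CKShareMetadata for the
// received URL, then pass it to the Ensembles accept method shown above.
- (void)acceptShareFromURL:(NSURL *)shareURL inContainer:(CKContainer *)container
{
    CKFetchShareMetadataOperation *fetchOperation =
        [[CKFetchShareMetadataOperation alloc] initWithShareURLs:@[shareURL]];
    [fetchOperation setPerShareMetadataBlock:^(NSURL * _Nonnull url, CKShareMetadata * _Nullable metadata, NSError * _Nullable error) {
        if (!metadata || error) {
            NSLog(@"Could not fetch share metadata: %@", error);
            return;
        }
        // Assumed Ensembles signature, based on the call quoted above.
        [CDECloudKitFileSystem acceptInvitationToShareWithMetadata:metadata completion:^(NSError *acceptError) {
            if (acceptError) {
                NSLog(@"Failed to accept invitation: %@", acceptError);
            }
        }];
    }];
    [container addOperation:fetchOperation];
}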
Hi all you awesome coders! I've put together this thing from various helpful sources over the last couple of weeks (including a lot of posts from Stack Overflow), trying to create something that will take a webcam feed and detect smiles when they occur (I might as well draw boxes around the faces and the smiles as well; that doesn't seem like it would be hard once they are detected). Please give me some leeway if the code is messy, because I'm still very much learning.
Currently I'm stuck trying to get each frame into a CIImage so it can be analysed for faces (I plan to deal with smiles after the face hurdle is overcome). As it is, the project builds if I comment out the block after (5) - it brings up a simple AVCaptureVideoPreviewLayer in a window. I think this is what I've called "rootLayer", so it's like the first layer of the displayed output, and after I detect faces in the video frames I'll show a rectangle following the "bounds" of any detected face in a new layer overlaid on top of this one, and I've called that layer "previewLayer"... correct?
But with the block after (5) in place, the build fails with three errors:
Undefined symbols for architecture x86_64:
"_CMCopyDictionaryOfAttachments", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
"_CMSampleBufferGetImageBuffer", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Can anyone tell me where I'm going wrong and what my next steps are?
Thanks for any help. I've been stuck at this point for a couple of days and can't figure it out; all the examples I can find are for iOS and don't work on OS X.
- (id)init
{
    self = [super init];
    if (self) {
        // Move the output part to another function
        [self addVideoDataOutput];

        // Create a capture session
        session = [[AVCaptureSession alloc] init];

        // Set a session preset (resolution)
        self.session.sessionPreset = AVCaptureSessionPreset640x480;

        // Select devices if any exist
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (videoDevice) {
            [self setSelectedVideoDevice:videoDevice];
        } else {
            [self setSelectedVideoDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed]];
        }
        NSError *error = nil;

        // Add an input
        videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        [self.session addInput:self.videoDeviceInput];

        // Start the session (app opens slower if it is here but I think it is needed in order to send the frames for processing)
        [[self session] startRunning];

        // Initial refresh of device list
        [self refreshDevices];
    }
    return self;
}

-(void) addVideoDataOutput {
    // (1) Instantiate a new video data output object
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    // discard if the data output queue is blocked (while CI processes the still image)
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    // (2) The sample buffer delegate requires a serial dispatch queue
    dispatch_queue_t captureOutputQueue;
    captureOutputQueue = dispatch_queue_create("CaptureOutputQueue", DISPATCH_QUEUE_SERIAL);
    [captureOutput setSampleBufferDelegate:self queue:captureOutputQueue];
    dispatch_release(captureOutputQueue); //what does this do and should it be here or after we receive the processed image back?

    // (3) Define the pixel format for the video data output
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = @{key : value};
    [captureOutput setVideoSettings:settings];

    // (4) Configure the output port on the captureSession property
    if ([self.session canAddOutput:captureOutput])
        [session addOutput:captureOutput];
}
// Implement the Sample Buffer Delegate Method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // I *think* I have a video frame now in some sort of image format... so have to convert it into a CIImage before I can process it:
    // (5) Convert CMSampleBufferRef to CVImageBufferRef, then to a CIImage (per weichsel's answer in July '13)
    CVImageBufferRef cvFrameImage = CMSampleBufferGetImageBuffer(sampleBuffer); // Having trouble here, prog. stops and won't recognise CMSampleBufferGetImageBuffer.
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage options:(__bridge NSDictionary *)attachments];
    //self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage];

    //OK so it is a CIImage. Find some way to send it to a separate CIImage function to find the faces, then smiles. Then send it somewhere else to be displayed on top of AVCaptureVideoPreviewLayer
    //TBW
}
- (NSString *)windowNibName
{
    return @"AVRecorderDocument";
}

- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];

    // Attach preview to session
    CALayer *rootLayer = self.previewView.layer;
    [rootLayer setMasksToBounds:YES]; //aaron added
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    [self.previewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    [self.previewLayer setFrame:[rootLayer bounds]];
    //[newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; //don't think I need this for OSX?
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    [rootLayer addSublayer:previewLayer];
    // [newPreviewLayer release]; //what's this for?
}
(moved from the comments section)
Wow. I guess two days and one Stack Overflow post is what it takes to figure out that I hadn't added CoreMedia.framework to my project.
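With the missing framework in place, the face and smile detection the question was building toward can be sketched with Core Image. The helper name detectFacesInImage: below is made up, but CIDetector, CIDetectorTypeFace, CIDetectorSmile and CIFaceFeature are standard Core Image API:
// Hedged sketch of the next step (detectFacesInImage: is an illustrative
// helper name): run Core Image face detection, with smile detection
// enabled, on the CIImage produced in the captureOutput: method above.
- (void)detectFacesInImage:(CIImage *)frameImage
{
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
    NSArray *features = [detector featuresInImage:frameImage
                                          options:@{ CIDetectorSmile : @YES }];
    for (CIFaceFeature *face in features) {
        // face.bounds is in image coordinates; it would need converting to
        // previewLayer coordinates before drawing a rectangle sublayer there.
        NSLog(@"Face at %@, smiling: %d", NSStringFromRect(NSRectFromCGRect(face.bounds)), face.hasSmile);
    }
}
Creating the detector once and reusing it per frame would be cheaper than allocating it on every callback.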
I am trying to mirror video received from the webcam on OS X. I would like to avoid doing a manual flip/transform after receiving the video buffer, so I want to set up the AVCaptureSession such that the video buffer received in the captureOutput method of AVCaptureVideoDataOutputSampleBufferDelegate is mirrored by AVFoundation itself. I don't want to use the preview layer.
On an iMac (10.8.5), isVideoMirroringSupported on the AVCaptureConnection tests successfully before I set the videoMirrored property, but the video buffer received in the captureOutput delegate isn't mirrored.
Note: video mirroring on iOS was successful when I followed this SO answer, but it isn't helping on OS X.
The code used is below; error checking is left out for this post.
//create session
_session = [[AVCaptureSession alloc] init];

//get capture device
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

//create session input
NSError *error;
_sessionInput = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];

//create session output
_sessionOutput = [[AVCaptureVideoDataOutput alloc] init];
[_sessionOutput setAlwaysDiscardsLateVideoFrames:YES];
[[_sessionOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[_sessionOutput setVideoSettings:videoSettings];

//serial queue to process video frames
dispatch_queue_t videoOutputQueue = dispatch_queue_create("deviceeraQueue", DISPATCH_QUEUE_SERIAL);
[_sessionOutput setSampleBufferDelegate:self queue:videoOutputQueue];

//begin session configuration
[_session beginConfiguration];

//input and output for session
if ([_session canAddInput:_sessionInput]) {
    [_session addInput:_sessionInput];
}
if ([_session canAddOutput:_sessionOutput]) {
    [_session addOutput:_sessionOutput];
}

//set video mirroring
AVCaptureConnection *avConnection = [_sessionOutput connectionWithMediaType:AVMediaTypeVideo];
if ([avConnection isVideoMirroringSupported]) {
    avConnection.videoMirrored = YES;
    NSLog(@"Video mirroring Support: YES"); // this line is printed
} else {
    NSLog(@"Video mirroring Support: NO");
}

//set session preset
[_session setSessionPreset:AVCaptureSessionPreset640x480];
[_session commitConfiguration];
...........
...........
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
.........
//sampleBuffer is not mirrored video
........
Of lesser importance 1: though it is C++, I also looked into OpenCV's VideoCapture implementation for a way to mirror video, but OpenCV doesn't mirror video from the Mac camera (it uses flip). That leaves libVLC/V4L.
Of lesser importance 2: slide 73 of this 2010 WWDC Apple presentation (3 MB PDF) mentions that setVideoOrientation is not supported on an AVCaptureVideoDataOutput connection, but the 2013 Apple docs were updated and list the method as supported.
You can add a transform to the preview layer to flip the x value of the frames before they reach the preview window.
[[self previewLayer] setTransform:CATransform3DMakeScale(-1, 1, 1)];
Then you can run the recorded video through an export session and apply the same transformation. That way the video preview will match the final recorded video. It's a bit of a hack, but it gets the same result.
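A rough sketch of the export-session half of that idea (the helper name mirrorVideoAtURL:toURL: and the preset choice are illustrative, not from the original answer):
// Hedged sketch: re-export a recorded movie with a horizontal flip applied
// through a video composition, mirroring the preview-layer transform above.
- (void)mirrorVideoAtURL:(NSURL *)sourceURL toURL:(NSURL *)destinationURL
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (!videoTrack) return;

    // Flip around the vertical axis, then translate back into the frame.
    CGAffineTransform flip = CGAffineTransformMakeScale(-1.0, 1.0);
    CGAffineTransform shift = CGAffineTransformMakeTranslation(videoTrack.naturalSize.width, 0);
    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInstruction setTransform:CGAffineTransformConcat(flip, shift) atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    instruction.layerInstructions = @[layerInstruction];

    AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
    composition.instructions = @[instruction];
    composition.renderSize = videoTrack.naturalSize;
    composition.frameDuration = CMTimeMake(1, 30);

    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = composition;
    exportSession.outputURL = destinationURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Export finished with status %ld", (long)exportSession.status);
    }];
}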
Why hack it when it's very easy? Just disable automaticallyAdjustsVideoMirroring on your AVCaptureConnection, then set the mirroring manually:
aPreviewLayer.connection.automaticallyAdjustsVideoMirroring = NO;
aPreviewLayer.connection.videoMirrored = YES;
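Presumably the same two properties can be set on the data-output connection from the question, since they belong to AVCaptureConnection rather than to the preview layer; a hedged variant (not verified on 10.8):
// Hedged variant for the buffer path, using the question's variable names.
AVCaptureConnection *avConnection = [_sessionOutput connectionWithMediaType:AVMediaTypeVideo];
avConnection.automaticallyAdjustsVideoMirroring = NO;
avConnection.videoMirrored = YES;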
Swift 5 version of Þorvaldur Rúnarsson's answer:
previewLayer.connection?.automaticallyAdjustsVideoMirroring = false
previewLayer.connection?.isVideoMirrored = true
I am making a movie of the screen using AVCaptureMovieFileOutput, but it is showing unexpected behaviour.
If I pass the cropRect parameter, the captured movie is fine, but if I make a movie of the whole screen, a folder shows up instead of a movie file. How can I get rid of that?
The code is:
// Create a capture session
mSession = [[AVCaptureSession alloc] init];

// If you're on a multi-display system and you want to capture a secondary display,
// you can call CGGetActiveDisplayList() to get the list of all active displays.
// For this example, we just specify the main display.
CGDirectDisplayID displayId = kCGDirectMainDisplay;

// Create a ScreenInput with the display and add it to the session
input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
[input setCropRect:rect];
if (!input) {
    mSession = nil;
    return;
}
if ([mSession canAddInput:input])
    [mSession addInput:input];

// Create a MovieFileOutput and add it to the session
mMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([mSession canAddOutput:mMovieFileOutput])
    [mSession addOutput:mMovieFileOutput];

// Start running the session
[mSession startRunning];

// Delete any existing movie file first
if ([[NSFileManager defaultManager] fileExistsAtPath:[destPath path]])
{
    NSError *err;
    if (![[NSFileManager defaultManager] removeItemAtPath:[destPath path] error:&err])
    {
        NSLog(@"Error deleting existing movie %@", [err localizedDescription]);
    }
}

// Start recording to the destination movie file
// The destination path is assumed to end with ".mov", for example, @"/users/master/desktop/capture.mov"
// Set the recording delegate to self
[mMovieFileOutput startRecordingToOutputFileURL:destPath recordingDelegate:self];
You have to set the sessionPreset property.
The default value of sessionPreset is AVCaptureSessionPresetHigh, and it does not work with full-screen capture. You have to provide another preset for that.
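For example, a hedged snippet (the 1280x720 preset is just an illustration; any preset the session accepts should do):
// Hedged example: choose an explicit preset instead of the default
// AVCaptureSessionPresetHigh before starting the session.
if ([mSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    [mSession setSessionPreset:AVCaptureSessionPreset1280x720];
}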
I have an app that has record and playback capabilities. These work fine. I can also email the audio file.
If I read the email on a computer, I can play the audio file.
I can read the email on an iDevice, but if I try to play the audio file, something like QuickTime pops up for a moment and then the message returns to the screen.
If I hold down the audio file icon, I get a list of apps that can be used - but they are all document apps, not audio apps.
The soundfile has a .caf suffix (Core Audio Format).
My audio file is created like this:
NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] init];
[recordSettings setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSettings setValue:[NSNumber numberWithInt:16] forKey:AVEncoderBitRateKey];
[recordSettings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSettings setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc]
                 initWithURL:soundFileURL settings:recordSettings error:&error];
if (error)
{
    NSLog(@"error: %@", [error localizedDescription]);
} else {
    // ALL IS OK, START RECORDING
    //NSLog(@"DetailVC - recordAudio - soundFilePath is %@", soundFile);
    [audioRecorder prepareToRecord];
    recordToggle = 1;
    [autoCog startAnimating];
    [audioRecorder record];
    recordingTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                      target:self
                                                    selector:@selector(recordingOn)
                                                    userInfo:nil
                                                     repeats:YES];
}
Thanks for any advice.
The problem might not be in the audio creation but in how you attach it to the email. Are you setting the MIME type correctly so the mail client knows how to read the data? For .caf, you should use the audio/x-caf MIME type.
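A minimal sketch of attaching the recording with that MIME type, assuming an MFMailComposeViewController named mailer and the soundFileURL used when recording:
// Hedged sketch: attach the .caf recording with the audio/x-caf MIME type.
NSData *soundData = [NSData dataWithContentsOfURL:soundFileURL];
[mailer addAttachmentData:soundData mimeType:@"audio/x-caf" fileName:@"Recording.caf"];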
The code to solve the problem was:
NSData *soundData = [NSData dataWithContentsOfFile:soundFile];
[mailer addAttachmentData:soundData mimeType:@"audio/mpeg" fileName:@"YourFile.mp3"];