I'm trying to implement multiple shots in an iOS 8 app.
I'm using LLSimpleCamera (https://github.com/omergul123/LLSimpleCamera), a wrapper around AVFoundation that works as well as I'd like.
When I press the shutter button, captureStillImageAsynchronouslyFromConnection:completionHandler: (a method on an AVCaptureStillImageOutput instance) is called, everything works, and I get a photo.
But if I put captureStillImageAsynchronouslyFromConnection:completionHandler: in a loop to capture multiple shots, I get no photos.
I tried a semaphore-based technique:
if ([self.captureDevice lockForConfiguration:nil]) {
    if ([self.captureDevice isFocusModeSupported:AVCaptureFocusModeLocked])
        [self.captureDevice setFocusMode:AVCaptureFocusModeLocked];
    if ([self.captureDevice isExposureModeSupported:AVCaptureExposureModeLocked])
        [self.captureDevice setExposureMode:AVCaptureExposureModeLocked];
    if ([self.captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked])
        [self.captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];
}

AVCaptureConnection *videoConnection = [self captureConnection];
videoConnection.videoOrientation = [self orientationForConnection];

dispatch_semaphore_t sync = dispatch_semaphore_create(0);
while (1)
{
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            NSLog(@"image %@", image);
        }
        dispatch_semaphore_signal(sync);
    }];
    dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
}
return nil;
But I still get no photo.
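For comparison, a version that chains the captures from the completion handler instead of blocking in a loop would look something like this (only a sketch; shotsRemaining is a hypothetical counter property, everything else is from the code above):

- (void)captureNextShot
{
    // Sketch: take one shot at a time; the completion handler
    // schedules the next capture, so nothing ever blocks.
    if (self.shotsRemaining == 0) return; // hypothetical counter property
    AVCaptureConnection *videoConnection = [self captureConnection];
    videoConnection.videoOrientation = [self orientationForConnection];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            NSLog(@"image %@", image);
        }
        self.shotsRemaining -= 1;
        [self captureNextShot]; // kick off the next capture
    }];
}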
What am I doing wrong?
Thanks in advance
I'm no longer able to render collection view cells after switching to data from CloudKit via CKAssets. I was previously using images loaded from a folder on my desktop just for initial testing. I'm now using CloudKit, and I've created some test records via the CK dashboard using those same images. I was able to query the CK database and retrieve the expected records. I then changed my code to populate the model data for the cells from the CK data instead of the locally loaded images. I can see from logging that I am getting the data from CK successfully, including the images. I can also see from logging that my custom CV cells are no longer getting initialized. From what I can tell, my code looks right based on examples I've seen online.
Can anyone help me with this? Thank you!
Designated initializer in the model...
- (instancetype)initImagesForSelection:(NSString *)selectionType {
    self = [super init];
    if (self) {
        CKDatabase *publicDatabase = [[CKContainer defaultContainer] publicCloudDatabase];
        NSPredicate *predicate = [NSPredicate predicateWithFormat:@"ImageDescription = 'description'"];
        CKQuery *query = [[CKQuery alloc] initWithRecordType:@"ImageData" predicate:predicate];
        [publicDatabase performQuery:query inZoneWithID:nil completionHandler:^(NSArray *results, NSError *error) {
            // handle the error
            if (error) {
                NSLog(@"Error: there was an error querying the cloud... %@", error);
            } else {
                // any results?
                if ([results count] > 0) {
                    NSLog(@"Success querying the cloud for %lu results!!!", (unsigned long)[results count]);
                    for (CKRecord *record in results) {
                        ImageData *imageData = [[ImageData alloc] init];
                        CKAsset *imageAsset = record[@"Image"];
                        imageData.imageURL = imageAsset.fileURL;
                        NSLog(@"asset URL: %@", imageData.imageURL);
                        imageData.imageName = record[@"ImageName"];
                        //imageData.image = [UIImage imageWithData:[NSData dataWithContentsOfURL:imageAsset.fileURL]];
                        imageData.image = [UIImage imageWithContentsOfFile:imageAsset.fileURL.path];
                        NSLog(@"image size height:%f, width:%f", imageData.image.size.height, imageData.image.size.width);
                        [self.imageDataArray addObject:imageData];
                    }
                    NSLog(@"imageDataArray size %lu", (unsigned long)[self.imageDataArray count]);
                }
            }
        }];
    }
    return self;
}
Collection view controller code, which worked perfectly before pulling the data from CloudKit...
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *CellIdentifier = @"Cell"; // string value identifier for cell reuse
    ImageViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:CellIdentifier forIndexPath:indexPath];
    NSLog(@"cellForItemAtIndexPath: section:%ld row:%ld", (long)indexPath.section, (long)indexPath.row);
    cell.layer.borderWidth = 1.0;
    cell.layer.borderColor = [UIColor grayColor].CGColor;
    ImageData *imageData = [self.imageLoadManager imageDataForCell:indexPath.row];
    cell.imageView.image = imageData.image;
    cell.imageView.contentMode = UIViewContentModeScaleAspectFit;
    return cell;
}
OK, I figured this out. My code was actually working; the collection view was just not displaying because of a multithreading issue with the asynchronous download of the data from CloudKit. When I hit the camera button to take a pic, the collection view refreshed and everything appeared. I just need to dispatch the UI updates properly so things start rendering while the images are downloading.
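Concretely, the fix amounts to hopping back onto the main queue once the query completes and reloading from there, something like this (a sketch; how you reach the collection view from the model's completion handler depends on your setup):

[publicDatabase performQuery:query inZoneWithID:nil completionHandler:^(NSArray *results, NSError *error) {
    // ... populate self.imageDataArray exactly as before ...
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit work must happen on the main thread
        [self.collectionView reloadData];
    });
}];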
I'm having a little trouble switching scenes (or even displaying sprites) in my CCScene UserLevelCreation, but it only happens after I use UIImagePickerController to take/accept a picture. When I then try to switch scenes, it crashes with the error:
"Could not attach texture to framebuffer"
This is the code I use to take a picture:
-(void)takePhoto{
    AppController *appdel = (AppController *)[[UIApplication sharedApplication] delegate];
    @try {
        uip = [[UIImagePickerController alloc] init];
        uip.sourceType = UIImagePickerControllerSourceTypeCamera;
        uip.allowsEditing = YES;
        uip.delegate = self;
    }
    @catch (NSException *e) {
        uip = nil;
    }
    @finally {
        if (uip) {
            [appdel.navController presentModalViewController:uip animated:NO];
        }
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
    // the original image is a UIImage, not an NSString
    UIImage *LevelImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    AppController *appdel = (AppController *)[[UIApplication sharedApplication] delegate];
    [appdel.navController dismissModalViewControllerAnimated:YES];
    [NSThread detachNewThreadSelector:@selector(writeImgToPath:) toTarget:self withObject:LevelImage];
}
And the writeImgToPath: method:
-(void)writeImgToPath:(id)sender
{
    UIImage *image = sender;
    NSArray *pathArr = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                           NSUserDomainMask,
                                                           YES);
    CGSize size;
    int currentProfileIndex = 1;
    NSString *path = [[pathArr objectAtIndex:0]
        stringByAppendingPathComponent:[NSString stringWithFormat:@"CustomLevel_%d.png", currentProfileIndex]];
    size = CGSizeMake(1136, 640);
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, 1136, 640)];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *data = UIImagePNGRepresentation(image);
    [data writeToFile:path atomically:YES];
    CCLOG(@"saved...");
    CGRect r = CGRectMake(0, 0, 1136, 640);
    UIGraphicsBeginImageContext(r.size);
    UIImage *img1;
    [image drawInRect:r];
    img1 = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(img1, nil, nil, nil);
    currentProfileIndex++;
    [[CCDirector sharedDirector] replaceScene:[LevelEditor Scene]
                               withTransition:[CCTransition transitionPushWithDirection:CCTransitionDirectionLeft duration:1.0f]];
}
--UPDATE--
I just tried switching scenes at different places in the code, and I can switch scenes freely until this method runs:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
I think that function might be the problem.
So far, I have tried to:
1. Switch scenes before taking the picture, and that works fine.
2. I read that it might be because I was resizing the image, but commenting that out made no difference.
3. I've also added breakpoints before the scene switch, and the scene isn't nil.
4. Finally, I tried dismissing the uip UIImagePickerController, but that made no difference either.
Sorry for the long post; if anyone knows what's going on, any help would be really appreciated.
OK, so I found the problem to be this line:
[NSThread detachNewThreadSelector:@selector(writeImgToPath:) toTarget:self withObject:LevelImage];
Apparently some parts of UIKit are not thread-safe, so I'm now calling it directly on the main thread:
[self writeImgToPath:LevelImage];
And it works. Hope this helps anyone with the same problem.
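Put together, the delegate method ends up looking like this (the same code as before, just without the background thread; the delegate callback already arrives on the main thread):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
    UIImage *LevelImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    AppController *appdel = (AppController *)[[UIApplication sharedApplication] delegate];
    [appdel.navController dismissModalViewControllerAnimated:YES];
    [self writeImgToPath:LevelImage]; // UIKit drawing stays on the main thread
}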
I want to obtain the reference URL to an image that I saved into the camera roll using UIImageWriteToSavedPhotosAlbum().
On iOS 4.1 or later this is easy with the AssetsLibrary framework:
ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *url, NSError *error) {
    if (error == nil) {
        savedURL = url;
    }
};
UIImage *originalImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSMutableDictionary *metadata = (NSMutableDictionary *)[info objectForKey:UIImagePickerControllerMediaMetadata];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:originalImage.CGImage
                             metadata:metadata
                      completionBlock:completionBlock];
But I can't figure out a clean way to do this on earlier iOS versions, where the only way to save an image to the camera roll is UIImageWriteToSavedPhotosAlbum(). The one approach I can think of is searching for the saved image by enumerating via ALAssetsGroup and the like, which isn't elegant, and it only helps on iOS 4.0.
Thank you in advance,
Kiyo
Use writeImageToSavedPhotosAlbum:orientation:completionBlock: instead:
[library writeImageToSavedPhotosAlbum:[originalImage CGImage]
                          orientation:(ALAssetOrientation)[originalImage imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"error"); // oops, error!
    } else {
        NSLog(@"url %@", assetURL); // assetURL is the URL you're looking for
    }
}];
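Note that the cast from UIImageOrientation to ALAssetOrientation works because the two enums declare corresponding values in the same order.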
I play a video with AVPlayer and it works fine.
Now I want to grab a UIImage from the playing video (when I push a button, for the moment).
Attached to my AVPlayer there is a CALayer that is used to display the video in my UIView.
My idea is to grab a UIImage from that CALayer while the video is playing.
I tried the code from this other question:
UIImage from CALayer - iPhone SDK
However, my UIImage is empty. The resolution is right, but the image is completely white!
It seems the video never writes into the contents of my CALayer.
Can someone help me?
Thanks
I couldn't get Meet's solution to work for me, but it got me thinking in the right direction.
Below is the code that I ended up using in my project. The method screenshotFromPlayer:maximumSize: accepts an instance of an AVPlayer from which to take a screenshot, and a CGSize that will be the returned image's maximum size.
- (UIImage *)screenshotFromPlayer:(AVPlayer *)player maximumSize:(CGSize)maxSize {
    CMTime actualTime;
    NSError *error;
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:player.currentItem.asset];
    // Setting a maximum size is not necessary for this code to
    // successfully get a screenshot, but it was useful for my project.
    generator.maximumSize = maxSize;
    CGImageRef cgIm = [generator copyCGImageAtTime:player.currentTime
                                        actualTime:&actualTime
                                             error:&error];
    UIImage *image = [UIImage imageWithCGImage:cgIm];
    CFRelease(cgIm);
    if (nil != error) {
        NSLog(@"Error making screenshot: %@", [error localizedDescription]);
        NSLog(@"Actual screenshot time: %f Requested screenshot time: %f",
              CMTimeGetSeconds(actualTime), CMTimeGetSeconds(player.currentTime));
        return nil;
    }
    return image;
}
Note also that one could use the method generateCGImagesAsynchronouslyForTimes:completionHandler: instead of copyCGImageAtTime:actualTime:error: (on the instance of AVAssetImageGenerator) to perform the image generation asynchronously.
This code sample generates a screenshot at the AVPlayer's currentTime, but any time could be used instead.
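For instance, the asynchronous variant mentioned above would look roughly like this (a sketch; the handler is called on a background queue, so dispatch back to the main queue before touching UIKit):

AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:player.currentItem.asset];
NSValue *time = [NSValue valueWithCMTime:player.currentTime];
[generator generateCGImagesAsynchronouslyForTimes:@[time]
                                completionHandler:^(CMTime requestedTime, CGImageRef cgIm, CMTime actualTime,
                                                    AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *screenshot = [UIImage imageWithCGImage:cgIm]; // UIImage retains the CGImage
        dispatch_async(dispatch_get_main_queue(), ^{
            // use screenshot on the main thread, e.g. imageView.image = screenshot;
        });
    }
}];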
The code to get an image from an AVPlayer:
- (UIImage *)currentItemScreenShot
{
    AVPlayer *abovePlayer = [abovePlayerView player];
    CMTime time = [[abovePlayer currentItem] currentTime];
    AVAsset *asset = [[[abovePlayerView player] currentItem] asset];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    if ([imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceBefore:)] &&
        [imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceAfter:)]) {
        [imageGenerator setRequestedTimeToleranceBefore:kCMTimeZero];
        [imageGenerator setRequestedTimeToleranceAfter:kCMTimeZero];
    }
    CGImageRef imgRef = [imageGenerator copyCGImageAtTime:time
                                               actualTime:NULL
                                                    error:NULL];
    if (imgRef == nil) {
        // fall back to the default (infinite) tolerances if the exact frame could not be generated
        if ([imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceBefore:)] &&
            [imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceAfter:)]) {
            [imageGenerator setRequestedTimeToleranceBefore:kCMTimePositiveInfinity];
            [imageGenerator setRequestedTimeToleranceAfter:kCMTimePositiveInfinity];
        }
        imgRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    }
    UIImage *image = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef);
    [imageGenerator release];
    return [image autorelease];
}
Set requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero if you need the frame at the precise time; by default the generator is allowed to return a frame near, rather than exactly at, the requested time.
Try getting the image from the video file at a specified time using AVAssetImageGenerator:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:[info objectForKey:@"UIImagePickerControllerReferenceURL"]] options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(0, 30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result != AVAssetImageGeneratorSucceeded) {
        // handle the failure here
    }
    UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
    [generator release];
};
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]]
                                completionHandler:handler];
I have been trying to minimize my memory footprint with UIImagePickerController, but I'm starting to think that the memory problems I am having result from poor memory management rather than from any particular way of handling the UIImagePickerController object.
My workflow is this: the "Edit Image" button is clicked, which presents a UIActionSheet. This action sheet allows you to delete the image, take a picture, choose from the library, or cancel. If you select "Choose from the library" or "Take Picture", I alloc an instance of UIImagePickerController, present it, and then release it:
-(void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex
{
    if (actionSheet.tag != 999) {
        UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
        imagePicker.delegate = self;
        BOOL pickImage = NO; // a BOOL is initialized with NO, not nil
        if (actionSheet.tag == iPhoneWithDelete) {
            switch (buttonIndex) {
                case 0:
                    object.objectImage = nil;
                    pickImage = NO;
                    break;
                case 1:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
                    pickImage = YES;
                    break;
                case 2:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
                    pickImage = YES;
                    break;
                default:
                    pickImage = NO;
                    break;
            }
        } else if (actionSheet.tag == iPhoneNoDelete) {
            switch (buttonIndex) {
                case 0:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
                    pickImage = YES;
                    break;
                case 1:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
                    pickImage = YES;
                    break;
                default:
                    pickImage = NO;
                    break;
            }
        } else if (actionSheet.tag == iPodWithDelete) {
            switch (buttonIndex) {
                case 0:
                    object.objectImage = nil;
                    pickImage = NO;
                    break;
                case 1:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
                    pickImage = YES;
                    break;
                default:
                    pickImage = NO;
                    break;
            }
        } else if (actionSheet.tag == iPodNoDelete) {
            switch (buttonIndex) {
                case 0:
                    imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
                    pickImage = YES;
                    break;
                default:
                    pickImage = NO;
                    break;
            }
        }
        if (pickImage) {
            imagePicker.allowsEditing = YES;
            [self presentModalViewController:imagePicker animated:YES];
        } else {
            [self setupImageButton];
            [self setupChooseImageButton];
        }
        [imagePicker release];
    }
}
Once I get a selection back from the UIImagePickerController, I save two images: a resized version of the edited image to use as a thumbnail, and an 800x600 version of the original unedited image, stored in a transformable relationship attribute (using the same UIImage-to-PNG transformations found in the Recipes demo code) for display use (the resize methods are based on the one demoed in this SO post):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissModalViewControllerAnimated:YES];
    NSManagedObject *oldImage = object.imageFull;
    if (oldImage != nil)
    {
        [object.managedObjectContext deleteObject:oldImage];
    }
    NSManagedObject *image = [NSEntityDescription insertNewObjectForEntityForName:@"Image" inManagedObjectContext:object.managedObjectContext];
    object.imageFull = image;
    UIImage *rawImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    CGSize size = CGSizeMake(800, 600);
    UIImage *fullImage = [UIImageManipulator scaleImage:rawImage toSize:size];
    [image setValue:fullImage forKey:@"imageFull"];
    UIImage *processedImage = [UIImageManipulator scaleImage:[info objectForKey:@"UIImagePickerControllerEditedImage"] toSize:CGSizeMake(75, 75)];
    object.objectImage = processedImage;
    [self setupImageButton];
    [self setupChooseImageButton];
    rawImage = nil;
    fullImage = nil;
    processedImage = nil;
}
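For reference, the scaleImage:toSize: helper in UIImageManipulator (my own class) is essentially the standard graphics-context resize from that post, along these lines:

+ (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)size {
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext(); // autoreleased
    UIGraphicsEndImageContext();
    return scaledImage;
}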
In viewDidUnload I set self.object = nil, and I release object in dealloc, but I'm still getting memory warnings after about 10 image changes, with a crash at around 20. That leads me to believe I'm not getting the full-size image out of memory correctly. What am I missing here?
And on a second note: does the Camera source use significantly more memory than the Photo Albums source? I tend to get more crashes when using the camera.
--EDIT--
Starting a bounty for any information about what I may be handling wrong. I will update this post with any answers to anything I have been unclear about. Just at my wit's end on this.
--EDIT 2--
Reworked the code to take chrissr's suggestions into account, and implemented GCD to improve usability. Is this as efficient as this process gets? I'm still getting memory warnings, and a crash around 20 images in. I'm sure that the combination of expensive UIImage resizing and UIImagePickerController is hammering the CPU, but I can't imagine that every app has to deal with this much uncertainty around UIImagePickerController. My memory footprint is around 2 MB, which I assumed left plenty of headroom. Should I reduce that footprint further?
Here is the modified code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissModalViewControllerAnimated:YES];
    if (object.imagePath != nil) {
        [self deleteImages];
    }
    dispatch_queue_t image_queue;
    image_queue = dispatch_queue_create("com.gordonfontenot.app", NULL);
    dispatch_async(image_queue, ^{
        NSDate *now = [NSDate date];
        NSDateFormatter *f = [[NSDateFormatter alloc] init];
        [f setDateFormat:@"yyyyMMddHHmmss"]; // dd, not DD: day of month, not day of year
        NSString *imageName = [NSString stringWithFormat:@"Image-%@-%i", [f stringFromDate:now], arc4random() % 100];
        NSString *thumbName = [NSString stringWithFormat:@"%@-thumb", imageName];
        [f release];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:imageName];
        NSString *thumbPath = [documentsDirectory stringByAppendingPathComponent:thumbName];
        NSData *thumbImageData = UIImagePNGRepresentation([UIImageManipulator scaleImage:[info objectForKey:@"UIImagePickerControllerEditedImage"] toSize:CGSizeMake(120, 120)]);
        [thumbImageData writeToFile:thumbPath atomically:NO];
        dispatch_async(dispatch_get_main_queue(), ^{
            object.thumbPath = thumbPath;
            [self setupImageButton];
            imageButton.enabled = NO;
            [self setupChooseImageButton];
        });
        NSData *fullImageData = UIImagePNGRepresentation([UIImageManipulator scaleImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"] toSize:CGSizeMake(800, 600)]);
        [fullImageData writeToFile:fullPath atomically:NO];
        dispatch_async(dispatch_get_main_queue(), ^{
            imageButton.enabled = YES;
            object.imagePath = fullPath;
        });
        if (picker.sourceType == UIImagePickerControllerSourceTypeCamera) {
            UIImageWriteToSavedPhotosAlbum([info objectForKey:@"UIImagePickerControllerOriginalImage"], self, nil, nil);
        }
    });
    dispatch_release(image_queue);
}
Memory warnings are extremely common when dealing with UIImagePickerController, especially when using the camera. Keep in mind that while a JPG or PNG on disk may only amount to a few MB, the uncompressed in-memory bitmap used to draw the image is considerably larger: at 4 bytes per pixel, a 2592x1936 camera image takes roughly 19 MB uncompressed.
There's nothing that you're doing wrong necessarily, but some improvements can be made:
Rather than storing the image bytes in Core Data, why not write the image to disk and store the path to the file in your database?
Rather than using so many autoreleased images, can you find a way to manage their lifecycle directly and release them sooner?
Your best bet may be to write the images to disk as soon after processing as possible and free up the memory they're using. Then store their location using Core Data rather than the raw data.
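For example, something along these lines keeps each intermediate bitmap alive only for the duration of the resize (a sketch; scaleImage:toSize: is your existing helper, and the file name is illustrative):

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"image-full.png"];
@autoreleasepool {
    UIImage *scaled = [UIImageManipulator scaleImage:rawImage toSize:CGSizeMake(800, 600)];
    NSData *pngData = UIImagePNGRepresentation(scaled);
    [pngData writeToFile:path atomically:YES];
} // scaled and pngData are flushed here, before the next image is touched
// Store `path` in Core Data, and reload on demand with
// [UIImage imageWithContentsOfFile:path] only when the image is displayed.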