Obtain ReferenceURL after saving an image using UIImageWriteToSavedPhotosAlbum() - assetslibrary

I want to obtain the reference URL to an image that I have just saved to the camera roll with UIImageWriteToSavedPhotosAlbum().
On iOS 4.1 and later this is easy with the AssetsLibrary framework:
ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *url, NSError *error) {
    if (error == nil) {
        savedURL = url;
    }
};
UIImage *originalImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSMutableDictionary *metadata = (NSMutableDictionary *)[info objectForKey:UIImagePickerControllerMediaMetadata];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:originalImage.CGImage
                             metadata:metadata
                      completionBlock:completionBlock];
But I cannot figure out a clean way to do this on earlier iOS versions, where the only way to save an image to the camera roll is UIImageWriteToSavedPhotosAlbum(). The one approach I can think of is to search for the saved image using ALAssetsGroup and friends (sketched below), but that is not elegant, and it only helps on iOS 4.0.
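A minimal sketch of that enumeration idea, for reference (iOS 4.0 only, untested; it assumes the Saved Photos group lists assets in save order, which is not guaranteed):
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group) {
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];
        NSInteger lastIndex = [group numberOfAssets] - 1;
        if (lastIndex >= 0) {
            [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:lastIndex]
                                    options:0
                                 usingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
                if (asset) {
                    // Presumably the most recently saved image.
                    NSURL *savedURL = [[asset defaultRepresentation] url];
                    NSLog(@"last saved asset URL: %@", savedURL);
                }
            }];
        }
        *stop = YES;
    }
} failureBlock:^(NSError *error) {
    NSLog(@"enumeration failed: %@", error);
}];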
Thank you in advance,
Kiyo

Use writeImageToSavedPhotosAlbum:orientation:completionBlock: instead (unlike the metadata variant, it goes back to iOS 4.0):
[library writeImageToSavedPhotosAlbum:[originalImage CGImage]
                          orientation:(ALAssetOrientation)[originalImage imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"error"); // oops, error!
    } else {
        NSLog(@"url %@", assetURL); // assetURL is the URL you are looking for
    }
}];

Related

How to capture multiple shots with captureStillImageAsynchronouslyFromConnection

I'm trying to implement multiple shots in an iOS 8 app.
I'm using LLSimpleCamera (https://github.com/omergul123/LLSimpleCamera), a wrapper around AVFoundation that works just as I'd like.
When I press the shutter button, captureStillImageAsynchronouslyFromConnection: (a method of an AVCaptureStillImageOutput instance) is called, everything works well, and I obtain a photo.
But if I put captureStillImageAsynchronouslyFromConnection: in a loop to capture multiple shots, I get no photo.
I tried the semaphore technique:
if ([self.captureDevice lockForConfiguration:nil]) {
    if ([self.captureDevice isFocusModeSupported:AVCaptureFocusModeLocked])
        [self.captureDevice setFocusMode:AVCaptureFocusModeLocked];
    if ([self.captureDevice isExposureModeSupported:AVCaptureExposureModeLocked])
        [self.captureDevice setExposureMode:AVCaptureExposureModeLocked];
    if ([self.captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked])
        [self.captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];
}
AVCaptureConnection *videoConnection = [self captureConnection];
videoConnection.videoOrientation = [self orientationForConnection];

dispatch_semaphore_t sync = dispatch_semaphore_create(0);
while (1)
{
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            NSLog(@"image %@", image);
        }
        dispatch_semaphore_signal(sync);
    }];
    dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
}
return nil;
but I still get no photo.
What am I doing wrong?
Thanks in advance.
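One possible culprit (an assumption, not something verified here): if the loop above runs on the main thread, dispatch_semaphore_wait blocks it between shots, which can starve both the UI and the capture pipeline. A sketch that keeps the same semaphore idea but runs a bounded loop on a background queue, reusing the names from the question's code:
// Run the capture loop off the main thread so the semaphore wait
// never blocks anything the capture session needs.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    dispatch_semaphore_t sync = dispatch_semaphore_create(0);
    for (int shot = 0; shot < 5; shot++) { // bounded, instead of while(1)
        [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                           completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
        {
            if (imageSampleBuffer != NULL) {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                UIImage *image = [UIImage imageWithData:imageData];
                NSLog(@"image %@", image);
            }
            dispatch_semaphore_signal(sync);
        }];
        dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
    }
});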

non-persistence of object written to documentsDirectory

This is a question about how to make an object saved to the documents directory persist on disk and remain recoverable after the device is rebooted.
Here's my problem. I make a data object with NSCoding and fill it with data. I write it to the documents directory each time the data in the object is updated. I stop the app and start it again, and my data object persists, with all of its data. But if I reboot the iPhone, the code I wrote to recover and read the data object fails.
The code I wrote originally used only an NSString for the file path. It worked well under iOS 7 but fails under iOS 8.
Reading up on things, I found this clue from the Apple documentation:
"Important: Although they are safe to use while your app is running, file reference URLs are not safe to store and reuse between launches of your app because a file’s ID may change if the system is rebooted. If you want to store the location of a file persistently between launches of your app, create a bookmark as described in Locating Files Using Bookmarks."
So I rewrote my iOS 7 file-open and file-close methods so that they no longer use stored strings or URLs but instead derive them from a bookmark saved in NSUserDefaults. Same problem: everything works fine so long as I do not power off the phone, but all is lost once I do. I am not able to solve this.
Here is my current series of steps. First I either determine (or, if it already exists in NSUserDefaults, recover) the absolute path to the documents directory, using a bookmark:
+ (NSString *)getGeoModelAbsolutePath
{
    NSString *path;
    NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
    NSURL *documentsDirectoryBookmarkURL;
    NSData *documentsDirectoryBookmark = [userDefaults objectForKey:@"documentDirectoryBookmark"];
    if (documentsDirectoryBookmark == nil)
    {
        documentsDirectoryBookmarkURL = [self getDocumentsDirectoryURL];
        documentsDirectoryBookmark = [self bookmarkForURL:documentsDirectoryBookmarkURL];
    }
    documentsDirectoryBookmarkURL = [self urlForBookmark:documentsDirectoryBookmark];
    path = documentsDirectoryBookmarkURL.path;
    path = [path stringByAppendingString:@"/Model.mod"];
    return path;
}
using methods modified from my iOS 7 code (which used only the getDocumentsDirectory method):
+ (NSString *)getDocumentsDirectory
{
    NSURL *directory = [self getDocumentsDirectoryURL];
    NSString *documentsDirectory = directory.path;
    return documentsDirectory;
}
And
+ (NSURL *)getDocumentsDirectoryURL
{
    NSURL *directory = [[[NSFileManager defaultManager]
                         URLsForDirectory:NSDocumentDirectory
                         inDomains:NSUserDomainMask]
                        lastObject];
    return directory;
}
And
+ (NSData *)bookmarkForURL:(NSURL *)url {
    NSError *theError = nil;
    NSData *bookmark = [url bookmarkDataWithOptions:NSURLBookmarkCreationSuitableForBookmarkFile
                     includingResourceValuesForKeys:nil
                                      relativeToURL:nil
                                              error:&theError];
    if (theError || (bookmark == nil)) {
        // Handle any errors.
        return nil;
    }
    return bookmark;
}
So now I have an NSString path, with the model filename, that I can use to get to the GeoModel:
- (GeoModel *)openGeoModel
{
    GeoModel *geoModel;
    NSString *documentsDirectoryGeoModel = [FileManager getGeoModelAbsolutePath];
    if ([FileManager fileExistsAtAbsolutePath:documentsDirectoryGeoModel])
    {
        NSData *data = [NSData dataWithContentsOfFile:documentsDirectoryGeoModel];
        geoModel = [NSKeyedUnarchiver unarchiveObjectWithData:data];
        NSString *unarchivedGeoModelVersion = geoModel.geoModel_VersionID;
        if (![unarchivedGeoModelVersion isEqual:currentGeoModelVersion])
        {
            [FileManager deleteFile:documentsDirectoryGeoModel];
            geoModel = [GeoModel geoModelInit];
            [Utilities setGeoProjectCounter:0];
        }
    }
    else
    {
        geoModel = [GeoModel geoModelInit];
    }
    [FileManager saveGeoModel];
    return geoModel;
}
Which I then can save to the documentsDirectory as follows:
+ (BOOL)saveGeoModel
{
    NSError *error = nil;
    NSString *path = [self getGeoModelAbsolutePath];
    [NSKeyedArchiver archiveRootObject:appDelegate.currentGeoModel toFile:path];
    NSData *encodedData = [NSKeyedArchiver archivedDataWithRootObject:appDelegate.currentGeoModel];
    BOOL success = [encodedData writeToFile:path options:NSDataWritingAtomic error:&error];
    return success;
}
Which always succeeds -- but the file persists only if I do not turn off the device! I am not making any progress with this: any help would be much appreciated!
Thanks in advance,
Tim Redfield
There. I think it is answered -- unless someone else has a comment on how to improve the above listings, they DO work as they ought to!
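For anyone hitting the same wall: the pattern Apple's note points toward is to never persist an absolute path (or a URL resolved at an earlier launch), because under iOS 8 the app's sandbox container path can change. Recompute the Documents directory on each launch and append only the stable file name. A minimal sketch along the lines of the listings above (the method name is illustrative):
+ (NSString *)geoModelPath
{
    // Recomputed on every call; only the file name "Model.mod" is assumed stable.
    NSURL *documentsURL = [[[NSFileManager defaultManager]
                            URLsForDirectory:NSDocumentDirectory
                                   inDomains:NSUserDomainMask] lastObject];
    return [documentsURL.path stringByAppendingPathComponent:@"Model.mod"];
}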

How to upload in an iOS app extension

How do I upload using AFNetworking in an iOS app extension?
Apple's example uses NSURLSession; can you explain to me how this works?
- (void)didSelectPost {
    NSExtensionItem *imageItem = [self.extensionContext.inputItems lastObject];
    // Verify that we have a valid NSExtensionItem
    if (!imageItem) {
        return;
    }
    // Verify that we have a valid NSItemProvider
    NSItemProvider *imageItemProvider = [[imageItem attachments] firstObject];
    if (!imageItemProvider) {
        return;
    }
    // Look for an image inside the NSItemProvider
    if ([imageItemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeImage]) {
        [imageItemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeImage options:nil completionHandler:^(id item, NSError *error) {
            if (item)
            {
                NSData *data = [NSData dataWithContentsOfURL:item];
                [self method:data];
            }
            [self.extensionContext completeRequestReturningItems:nil completionHandler:nil];
        }];
    }
}
How do I upload this data, using this method, AFNetworking, or my containing app?
- (void)method:(NSData *)data
{
    NSString *confName = @"com.example.photoblog.backgroundconfiguration";
    NSURLSessionConfiguration *conf = [NSURLSessionConfiguration backgroundSessionConfiguration:confName];
    NSURLSession *session = [NSURLSession sessionWithConfiguration:conf delegate:self delegateQueue:nil];
    NSURLRequest *request = [self requestForExtensionItems];
    NSURLSessionUploadTask *upload = [session uploadTaskWithStreamedRequest:request];
    [upload resume];
}
You should set up an App Group for both your extension and the containing app, then configure the session like this:
config.sharedContainerIdentifier = @"group.xxxxx";
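Putting it together, a minimal sketch (the App Group identifier is a placeholder and must match a group enabled on both targets; note that on iOS 8 backgroundSessionConfigurationWithIdentifier: replaces the deprecated backgroundSessionConfiguration: used above):
// Background session shared between the extension and the containing app.
NSURLSessionConfiguration *conf = [NSURLSessionConfiguration
    backgroundSessionConfigurationWithIdentifier:@"com.example.photoblog.backgroundconfiguration"];
conf.sharedContainerIdentifier = @"group.com.example.photoblog"; // placeholder App Group
NSURLSession *session = [NSURLSession sessionWithConfiguration:conf
                                                      delegate:self
                                                 delegateQueue:nil];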
You can find more info in this tutorial:
http://www.shinobicontrols.com/blog/posts/2014/07/21/ios8-day-by-day-day-2-sharing-extension

thumbnailImageAtTime: now deprecated - What's the alternative?

Until the iOS 7 update I was using...
UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
...with great success, so that my app could show a still of the video the user had just taken.
I understand this method is deprecated as of iOS 7, and I need an alternative. I see there's a method
- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option
though how do I get the image back from it, so I can place it within the videoReview button image?
Thanks in advance, Jim.
**** Edited question, after trying the notification centre method ****
I used the following code -
[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
I made the NSArray times of two NSNumber objects, 1 and 2.
I then tried to capture the notification in the following method:
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSDictionary*)info{
    UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];
I then proceeded to use this thumbnail image as the button image as a preview... but it didn't work.
If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers
Managed to find a great way using AVAssetImageGenerator; please see the code below...
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;
Within header file, please import
#import <AVFoundation/AVFoundation.h>
It works perfectly, and I've been able to call it from viewDidLoad, which was quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.
Hope this helps anyone else who had the same problem.
**** Update for Swift 5.1 ****
Useful function...
func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
    do {
        let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        let thumbnail = UIImage(cgImage: img)
        return thumbnail
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
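A minimal sketch of that subscribe-then-handle pattern (the selector name and the videoReviewButton outlet are illustrative, not from the original post):
// Subscribe before issuing the request.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(thumbnailReady:)
                                             name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                           object:moviePlayer];
[moviePlayer requestThumbnailImagesAtTimes:@[@1.f] timeOption:MPMovieTimeOptionNearestKeyFrame];

// The thumbnail arrives in the notification's userInfo dictionary.
- (void)thumbnailReady:(NSNotification *)note
{
    UIImage *image = [note.userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image) {
        [self.videoReviewButton setImage:image forState:UIControlStateNormal];
    }
}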
The problem is that you have to specify float values in requestThumbnailImagesAtTimes.
For example, this will work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];
but this won't:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
The way to do it, at least in iOS 7, is to use floats for your times:
NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];
Hope this helps.
Jeely provides a good workaround, but it requires an additional library that isn't necessary when MPMoviePlayer already provides functions for this task. I noticed a syntax error in the original poster's code: the thumbnail notification handler expects an object of type NSNotification, not a dictionary object. Here's a corrected example:
- (void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSNotification *)note
{
    NSDictionary *userInfo = [note userInfo];
    UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image != NULL)
        [thumbView setImage:image];
}
I've just looked for a solution to this problem myself and got good help from your question.
I got your code above to work with one small change: I removed a colon...
Change
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
to
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
This works nicely. Also, I've found that you can't call a method that relies on NotificationCenter if you're already inside a notification selector. It's something I tried at first: I called requestThumbnailImagesAtTimes inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification, and it won't work, I think because the notification won't fire.
The code in Swift 2.1 would look like this:
do {
    let asset1 = AVURLAsset(URL: url)
    let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
    generate1.appliesPreferredTrackTransform = true
    let time: CMTime = CMTimeMake(3, 1) // to catch the third second of the video
    let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
    let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError {
    print(error)
}

Taking a screenshot of MKMapView

I'm trying to get a screenshot of an MKMapView, and I'm using the following code:
UIGraphicsBeginImageContext(myMapView.frame.size);
[myMapView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return screenShot;
And I'm getting an almost blank image with the map's current-location icon and a Google logo in it.
What could be causing that?
I should tell you that myMapView is actually on another view controller's view, but since I'm getting the blue dot showing the location and the Google logo, I assume the reference I have is the correct one.
Thank you.
iOS 7 introduced a new method to generate screenshots of an MKMapView. It is now possible to use the new MKMapSnapshotter API as follows:
MKMapView *mapView = [..your mapview..];
MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = mapView.region;
options.mapType = MKMapTypeStandard;
options.showsBuildings = NO;
options.showsPointsOfInterest = NO;
options.size = CGSizeMake(1000, 500);
MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithQueue:dispatch_get_main_queue() completionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
    if (error) {
        NSLog(@"An error occurred: %@", error);
    } else {
        [UIImagePNGRepresentation(snapshot.image) writeToFile:@"/Users/<yourAccountName>/map.png" atomically:YES];
    }
}];
Note that overlays and annotations are not rendered onto the snapshot; you have to draw them onto the resulting image yourself afterwards (a sketch follows below). The provided MKMapSnapshot object has a handy helper method to do the mapping between coordinates and points:
CGPoint point = [snapshot pointForCoordinate:locationCoordinate2D];
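For example, a rough sketch of stamping pin images onto the snapshot (it assumes the snapshot and mapView from above; the stock pin image and its anchor offset are approximations):
// Draw the snapshot, then a stock pin image at each annotation's point.
UIImage *pinImage = [[[MKPinAnnotationView alloc] initWithAnnotation:nil
                                                     reuseIdentifier:nil] image];
UIGraphicsBeginImageContextWithOptions(snapshot.image.size, YES, snapshot.image.scale);
[snapshot.image drawAtPoint:CGPointZero];
for (id<MKAnnotation> annotation in mapView.annotations) {
    CGPoint point = [snapshot pointForCoordinate:annotation.coordinate];
    // Offset so the pin's tip (roughly bottom-centre) sits on the coordinate.
    [pinImage drawAtPoint:CGPointMake(point.x - pinImage.size.width / 2.0,
                                      point.y - pinImage.size.height)];
}
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();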
As mentioned here, you can try this
- (UIImage*) renderToImage
{
UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
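Assuming this method lives in a UIView category (so that self is the view being captured), usage would look like:
// Hypothetical usage if renderToImage is declared in a UIView category.
UIImage *screenshot = [myMapView renderToImage];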
