ABPerson uses old picture data - cocoa

I have an address book in my app which gets filled from a website via JSON. If I add a new person on my website, the person gets added to the app's address book on the next launch. If I update a person's name on my website it works perfectly, but if I add a new image to an existing person it still shows the old default Apple image after the reload. If I recompile, the picture appears. Is there a cache for address book pictures?
I tried the following:
BOOL imgBool = ABPersonHasImageData(aContact); // -> false
and
ABPersonRemoveImageData(aContact, &anError); // -> false, because there is no picture in the AB
I add a Persons picture like this:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[[[self.profsInSection objectAtIndex:indexPath.section] objectAtIndex:indexPath.row] bildUrl]]]];
NSData *picData = UIImageJPEGRepresentation(image, 0.9f);
[...]
ABPersonSetImageData(aContact, (CFDataRef)picData, nil);
There are no errors; it just keeps the last picture and ignores new pictures. Any ideas?

Haven't you forgotten to call the save function after applying your changes to the address book?
ABAddressBookSave(addrBook, nil);
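For reference, a minimal sketch of the set-then-save flow (aContact and picData as in the question; addrBook is the address book the record belongs to):
CFErrorRef error = NULL;
// Set the new image, then persist the change; both calls report errors.
if (ABPersonSetImageData(aContact, (CFDataRef)picData, &error)) {
if (!ABAddressBookSave(addrBook, &error)) {
NSLog(@"ABAddressBookSave failed: %@", error);
}
} else {
NSLog(@"ABPersonSetImageData failed: %@", error);
}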
UPDATE: since you're using ABUnknownPersonViewController, I've tried the following code myself and it works on both the simulator and a device:
ABRecordRef newPerson = ABPersonCreate();
ABRecordSetValue(newPerson, kABPersonLastNameProperty, @"Smith", nil);
UIImage *personImage = [UIImage imageNamed:@"image.jpg"];
NSData *dataRef = UIImageJPEGRepresentation(personImage, 0.9f);
CFErrorRef error = nil;
bool result = ABPersonSetImageData(newPerson, (CFDataRef)dataRef, &error);
ABUnknownPersonViewController *controller = [ABUnknownPersonViewController new];
//controller.addressBook = addressBook;
controller.allowsAddingToAddressBook = YES;
controller.displayedPerson = newPerson;
[self.navigationController pushViewController:controller animated:YES];
[controller release];
The issue I met while testing this code was a typo in the image name, so the UIImage instance was nil and the NSData was empty; ABPersonSetImageData gave no error for this. So double-check that your UIImage instance is not nil.
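A guard along those lines (a sketch; imageURL stands in for the bildUrl from the question):
NSData *rawData = [NSData dataWithContentsOfURL:imageURL];
UIImage *image = [UIImage imageWithData:rawData];
if (image == nil) {
// Nothing usable was downloaded - don't write empty data into the address book.
NSLog(@"No usable image at %@", imageURL);
} else {
NSData *picData = UIImageJPEGRepresentation(image, 0.9f);
ABPersonSetImageData(aContact, (CFDataRef)picData, NULL);
}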


How do I play a video on tvOS for Apple TV?

I started a blank tvOS project and created the following code:
- (void)viewDidLoad
{
[super viewDidLoad];
AVPlayer *avPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://www.myurl.com/myvideo.mp4"]];
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
[self.view.layer addSublayer:avPlayerLayer];
[avPlayer play];
}
Nothing happens in the simulator though once the app loads. No video, nothing, just a blank translucent screen in my Apple TV simulator.
What's the proper way to play a sample video on app launch for an Apple TV app from an HTTP source?
I just pasted your code in my tvOS sample project, replaced the URL and ran it.
Nothing happened. Well, except for the fact that there's a log entry telling me that App Transport Security has blocked my URL request.
So I headed to the Info.plist, disabled ATS and upon next launch the video showed up just fine.
So if you're also using a non-HTTPS URL, you're very likely running into this issue, which is easily fixed by using an HTTPS URL, disabling ATS completely, or allowing specific non-HTTPS URLs in your Info.plist.
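For testing, the blanket opt-out looks like this in the Info.plist source (prefer per-domain exception keys for production):
<key>NSAppTransportSecurity</key>
<dict>
<key>NSAllowsArbitraryLoads</key>
<true/>
</dict>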
P.S.: I used this video for testing.
You could also use TVML and TVMLJS
https://developer.apple.com/library/prerelease/tvos/documentation/TVMLJS/Reference/TVJSFrameworkReference/
Adhere to the 'TVApplicationControllerDelegate' protocol and add some properties.
AppDelegate.h
@interface AppDelegate : UIResponder <UIApplicationDelegate, TVApplicationControllerDelegate>
...
@property (strong, nonatomic) TVApplicationController *appController;
@property (strong, nonatomic) TVApplicationControllerContext *appControllerContext;
Then add the following to 'didFinishLaunchingWithOptions'
AppDelegate.m
#define url @"http://localhost:8000/main.js"
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
// Override point for customization after application launch.
self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
self.appControllerContext = [[TVApplicationControllerContext alloc] init];
NSURL *javascriptURL = [NSURL URLWithString:url];
self.appControllerContext.javaScriptApplicationURL= javascriptURL;
for (id key in launchOptions) {
id val = [launchOptions objectForKey:key];
NSLog(@"key=%@ value=%@", key, val);
}
// Create the controller once, outside the loop.
self.appController = [[TVApplicationController alloc] initWithContext:self.appControllerContext window:self.window delegate:self];
return YES;
}
Create a folder and add the following files:
main.js
index.tvml
main.js
function launchPlayer() {
var player = new Player();
var playlist = new Playlist();
var mediaItem = new MediaItem("video", "http://trailers.apple.com/movies/focus_features/9/9-clip_480p.mov");
player.playlist = playlist;
player.playlist.push(mediaItem);
player.present();
//player.play()
}
//in application.js
App.onLaunch = function(options) {
launchPlayer();
}
Be careful with the URL in the mediaItem.
Set up a template of your choice.
index.tvml
<document>
<alertTemplate>
<title>…</title>
<description>…</description>
<button>
<text>…</text>
</button>
<text>…</text>
</alertTemplate>
</document>
Open Terminal, navigate to this folder, and run:
python -m SimpleHTTPServer 8000
Make sure the port here matches the port in your Objective-C url. The Apple examples use 9001.
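Note: on Python 3 the module was renamed, so the equivalent command is
python3 -m http.server 8000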
See these tutorials for more info
http://jamesonquave.com/blog/developing-tvos-apps-for-apple-tv-with-swift/
http://jamesonquave.com/blog/developing-tvos-apps-for-apple-tv-part-2/
One issue I ran into was trying to play a local video file. It wouldn't work and there were constraint issues, etc.
It looks like you can't use Python's SimpleHTTPServer to serve the videos (likely because it doesn't support the byte-range requests the player needs), so either try Apache or link to a video on the web.
This SO answer pointed me there.
The best way to play video in your app on Apple TV is going to be AVKit's AVPlayerViewController. If you use AVKit, you get a lot of stuff for free.
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayerViewController_Class/index.html
You simply add that player to the viewController's player property:
// instantiate here or in storyboard
AVPlayerViewController *viewController = [[AVPlayerViewController alloc] initWithNibName:nil bundle:nil];
viewController.player = player;
[self addChildViewController:viewController];
[self.view addSubview:viewController.view];
[viewController didMoveToParentViewController:self];
// setup constraints, etc.
// play the video
[player play];
Also, as mentioned in the other answer, make sure the video you're trying to play comes from an HTTPS connection, or that you've disabled App Transport Security by setting the proper flags in the plist.
I didn't like the answers which messed about with subviews etc.
For full-screen playback, I use the following (Non-ARC) code:
// Play the stream
NSString *wifiStreamAddress = @"http://yourmoviefile.m3u8";
AVPlayer *player = [[AVPlayer alloc] initWithURL: [NSURL URLWithString: wifiStreamAddress] ];
AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
playerViewController.player = player;
// Keep pointers to player and controller using retained properties:
self.player = player;
self.playerViewController = playerViewController;
[player release];
[playerViewController release];
[self presentViewController: playerViewController animated: true completion: ^{
[self.player play];
}];
This works really neatly, animating the presentation and fading back to the previous view when you tap the MENU button. It also works great with the remote control, supporting all the standard functions.
It's working for me and may be helpful for you:
-(void)playAction
{
AVPlayerViewController *viewController = [[AVPlayerViewController alloc] initWithNibName:nil bundle:nil];
viewController.player = player;
[self addChildViewController:viewController];
[self.view addSubview:viewController.view];
[viewController didMoveToParentViewController:self];
// play the video
[player play];
}
Swift version
Make a PlayerViewController which inherits from AVPlayerViewController.
In the view controller which has the play button, add a function like this:
@IBAction func onClickPlay(sender: AnyObject) {
let playerVC = PlayerViewController()
playerVC.playVimeoVideo(urlString)
self.presentViewController(playerVC, animated: true, completion: nil)
}
In the PlayerViewController
func playVimeoVideo(link : String) {
player = AVPlayer(URL: NSURL(string: link)!)
player?.play()
}
Notice
The question and some answers may be a little misleading, suggesting that only a URL ending in ".mp4" can be played by the Apple TV. I believed so the first time I saw the post, but it is not true. In fact, with AVPlayerViewController you can play Vimeo streaming video! The link to the streaming video is not of the form https://vimeo.com/92655878; it is possible to get it from the Vimeo site by extracting it from a JSON file, which can be downloaded from this link:
let link = "https://vimeo.com/api/oembed.json?url=https%3A//vimeo.com/" + videoId
To be able to get the correct URL for the video, you need Vimeo Pro user access to get the stream link for a specific video.

thumbnailImageAtTime: now deprecated - What's the alternative?

Until the iOS 7 update I was using...
UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
...with great success, so that my app could show a still of the video that the user had just taken.
I understand this method is deprecated as of iOS 7 and I need an alternative. I see there's a method
- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option
though how do I return the image from it so I can place it within the videoReview button image?
Thanks in advance, Jim.
*** Edited question, after trying the notification centre method ***
I used the following code:
[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
[[NSNotificationCenter defaultCenter] addObserver:self selector:#selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
I made the NSArray times out of two NSNumber objects, 1 and 2.
I then tried to capture the notification in the following method
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSDictionary*)info{
UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];
I then proceeded to use this thumbnail image as the button image as a preview, but it didn't work.
If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers
I managed to find a great way using AVAssetImageGenerator; please see the code below...
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;
Within header file, please import
#import <AVFoundation/AVFoundation.h>
It works perfectly, and I've been able to call it from viewDidLoad, which was quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.
Hope this helps anyone else who had the same problem.
*** Update for Swift 5.1 ***
Useful function...
func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
let asset = AVAsset(url: url)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
return thumbnail
} catch {
print(error.localizedDescription)
return nil
}
}
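Usage is then a one-liner (videoURL here is an assumed local file URL):
let thumbnail = createThumbnailOfVideoUrl(url: videoURL)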
The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
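A sketch of that wiring (the handler name and button outlet are illustrative, not from the question):
// Subscribe before kicking off the request.
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(thumbnailReady:)
name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
object:moviePlayer];
[moviePlayer requestThumbnailImagesAtTimes:@[@1.f] timeOption:MPMovieTimeOptionNearestKeyFrame];

// Later, pull the image out of the notification's userInfo.
- (void)thumbnailReady:(NSNotification *)note
{
UIImage *image = note.userInfo[MPMoviePlayerThumbnailImageKey];
if (image) [self.videoReviewButton setImage:image forState:UIControlStateNormal];
}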
The problem is that you have to specify float values in requestThumbnailImagesAtTimes.
For example, this will work
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];
but this won't work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
The way to do it, at least in iOS 7, is to use floats for your times (and note the method takes an array):
NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];
Hope this helps
Jeely provides a good workaround, but it requires an additional library that isn't necessary when MPMoviePlayer already provides functions for this task. I noticed a syntax error in the original poster's code: the thumbnail notification handler expects an object of type NSNotification, not a dictionary. Here's a corrected example:
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSNotification*)note
{
NSDictionary * userInfo = [note userInfo];
UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
if (image != nil)
[thumbView setImage:image];
}
I've just looked for a solution for this problem myself and got good help from your question.
I got your code above to work with one small change: removing a colon.
Change
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
to
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
Got this to work nicely. Also, I've found that you can't call a method that relies on NotificationCenter if you're already in a notification selector. It's something I tried at first: calling requestThumbnailImagesAtTimes inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification won't work; I think the notification won't fire.
The code in Swift 2.1 would look like this:
do{
let asset1 = AVURLAsset(URL: url)
let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
generate1.appliesPreferredTrackTransform = true
let time: CMTime = CMTimeMake(3, 1) // capture the third second of the video
let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError{
print(error)
}

Storing a PDF generated 'on the fly' for iPad on iOS 6.1

I am trying to create a PDF report from an iPad app using Xcode 4.6. I know a valid PDF file is being created when run on the simulator, because I can dig it out and preview it; the commented-out code does this. The problem is that I can't write it somewhere I can get at it on the iPad.
I've tried using UIGraphicsBeginPDFContextToData instead, writing the image out to the photo album. The problem here is that when I convert the NSMutableData into an image, it returns nil.
Here is the code. Thanks for any help you can give me.
- (IBAction)makePDF:(UIButton *)sender
{
CFAttributedStringRef currentText = CFAttributedStringCreate(NULL, (CFStringRef)self.labelCopyright.text, NULL);
if (currentText)
{
CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(currentText);
if (framesetter)
{
// NSString *rootPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
// NSString *pdfPath = [rootPath stringByAppendingPathComponent:@"Nick.pdf"];
// NSLog(@"pdf is at %@", pdfPath);
// UIGraphicsBeginPDFContextToFile(pdfPath, CGRectZero, nil);
NSMutableData *data = [[NSMutableData alloc] initWithCapacity:100000];
UIGraphicsBeginPDFContextToData(data, CGRectZero, nil);
CFRange currentRange = CFRangeMake(0, 0);
NSInteger currentPage = 0;
BOOL done = NO;
do
{
UIGraphicsBeginPDFPageWithInfo(CGRectMake(0, 0, 612, 792), nil);
currentPage++;
// [self drawPageNumber:currentPage];
currentRange = [self renderPage:currentPage withTextRange:currentRange andFramesetter:framesetter];
if (currentRange.location == CFAttributedStringGetLength((CFAttributedStringRef)currentText)) done = YES;
}
while (!done);
UIGraphicsEndPDFContext();
UIImage* image = [UIImage imageWithData:data];
assert(image);
UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
CFRelease(framesetter);
}
else NSLog(#"Could not create the framesetter needed to lay out the atrributed string.");
CFRelease(currentText);
}
else NSLog(#"Could not create the attributed string for the framesetter");
}
- (CFRange)renderPage:(NSInteger)pageNum withTextRange:(CFRange)currentRange andFramesetter:(CTFramesetterRef)framesetter
{
CGContextRef currentContext = UIGraphicsGetCurrentContext();
CGContextSetTextMatrix(currentContext, CGAffineTransformIdentity);
CGRect frameRect = CGRectMake(72, 72, 468, 648);
CGMutablePathRef framePath = CGPathCreateMutable();
CGPathAddRect(framePath, NULL, frameRect);
CTFrameRef frameRef = CTFramesetterCreateFrame(framesetter, currentRange, framePath, NULL);
CGPathRelease(framePath);
CGContextTranslateCTM(currentContext, 0, 792);
CGContextScaleCTM(currentContext, 1.0, -1.0);
CTFrameDraw(frameRef, currentContext);
currentRange = CTFrameGetVisibleStringRange(frameRef);
currentRange.location += currentRange.length;
currentRange.length = 0;
CFRelease(frameRef);
return currentRange;
}
The NSMutableData contains PDF data, not bitmap image data, which is why imageWithData: returns nil. Save the mutable data to your documents directory instead:
[data writeToFile:filePath atomically:YES];
Here's an example:
+(void) saveData: (NSData*) data ToFileName: (NSString*) filename {
// Retrieves the document directories from the iOS device
NSArray* documentDirectories = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask,YES);
NSString* documentDirectory = [documentDirectories objectAtIndex:0];
NSString* documentDirectoryFilename = [documentDirectory stringByAppendingPathComponent: filename];
// instructs the mutable data object to write its context to a file on disk
[data writeToFile:documentDirectoryFilename atomically:YES];
//NSLog(@"documentDirectoryFilename: %@", documentDirectoryFilename);
}
As for displaying the generated PDF on the device, the UIWebView object supports loading PDF files from NSData. Here is an example:
[self.webView loadData:pdfData MIMEType:@"application/pdf" textEncodingName:@"utf-8" baseURL:nil];
It is possible to attach an NSData object to an email as well. Here is an example:
//Check if we can send e-mails
if ([MFMailComposeViewController canSendMail]) {
//Create the Email view controller
MFMailComposeViewController *controller = [[MFMailComposeViewController alloc] init];
controller.mailComposeDelegate = self;
//Set the subject and body
[controller setSubject:@"Email Subject"];
[controller setMessageBody:@"Email body" isHTML:NO];
//Set the email address
[controller setToRecipients:@[@"test@test.com"]];
//Add the current PDF as an attachment
NSString *fileName = @"file.pdf";
[controller addAttachmentData:self.retrievedPDF mimeType:@"application/pdf" fileName:fileName];
// show the email controller modally
[self.navigationController presentModalViewController: controller animated: YES];
}
Instead of writing the PDF to an NSMutableData object, write it to a file using UIGraphicsBeginPDFContextToFile (see the sketch after the list below).
The first argument is the file path. The best place would be the Documents directory. There are then many different ways to get the file out of the app:
iTunes file sharing
Email
iCloud
Sending to a 3rd party server (Dropbox, Box, Google Drive, etc.)
Open in another iOS app using UIDocumentInteractionController.
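A minimal sketch of that file-based variant, reusing the question's page loop (Report.pdf is an arbitrary name; framesetter, currentText, and renderPage: are from the question):
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *pdfPath = [docsDir stringByAppendingPathComponent:@"Report.pdf"];
// Render straight into a file instead of an NSMutableData buffer.
UIGraphicsBeginPDFContextToFile(pdfPath, CGRectZero, nil);
CFRange currentRange = CFRangeMake(0, 0);
NSInteger currentPage = 0;
BOOL done = NO;
do
{
UIGraphicsBeginPDFPageWithInfo(CGRectMake(0, 0, 612, 792), nil);
currentPage++;
currentRange = [self renderPage:currentPage withTextRange:currentRange andFramesetter:framesetter];
if (currentRange.location == CFAttributedStringGetLength(currentText)) done = YES;
}
while (!done);
UIGraphicsEndPDFContext();
// The finished PDF now lives at pdfPath, ready for sharing or email.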

MWPhotoBrowser used in button (Xcode)

I'm trying to add your implementation to my project as a button called 'Gallery', but I receive an error when trying to run the project.
My implementation file looks like this.
-(IBAction) gallery{
NSMutableArray *photos = [[NSMutableArray alloc] init];
MWPhoto *photo;
{
photo = [MWPhoto photoWithFilePath:[[NSBundle mainBundle] pathForResource:@"fans1" ofType:@"jpg"]];
photo.caption = @"My fans 1";
[photos addObject:photo];
photo = [MWPhoto photoWithFilePath:[[NSBundle mainBundle] pathForResource:@"fans2" ofType:@"jpg"]];
photo.caption = @"My fans 2";
[photos addObject:photo];
photo = [MWPhoto photoWithFilePath:[[NSBundle mainBundle] pathForResource:@"fans3" ofType:@"jpg"]];
photo.caption = @"Fans3";
[photos addObject:photo];
photo = [MWPhoto photoWithFilePath:[[NSBundle mainBundle] pathForResource:@"fans4" ofType:@"jpg"]];
photo.caption = @"Fans4";
[photos addObject:photo];
}
self.photos = photos;
// Create browser
MWPhotoBrowser *browser = [[MWPhotoBrowser alloc] initWithDelegate:self];
browser.displayActionButton = YES;
//browser.wantsFullScreenLayout = NO;
//[browser setInitialPageIndex:2];
// Show
if (_segmentedControl.selectedSegmentIndex == 0) {
// Push
[self.navigationController pushViewController:browser animated:YES];
} else {
// Modal
UINavigationController *nc = [[UINavigationController alloc] initWithRootViewController:browser];
nc.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
[self presentModalViewController:nc animated:YES];
}
}
#pragma mark - MWPhotoBrowserDelegate
- (NSUInteger)numberOfPhotosInPhotoBrowser:(MWPhotoBrowser *)photoBrowser {
return _photos.count;
}
- (MWPhoto *)photoBrowser:(MWPhotoBrowser *)photoBrowser photoAtIndex: (NSUInteger)index {
if (index < _photos.count)
return [_photos objectAtIndex:index];
return nil;
}
When I run the project, the image gallery is not displayed. I'm fairly new to iOS development, so if you can point me in the right direction I will deeply appreciate it. The main goal is to have the image gallery displayed when the user touches the Gallery button.
I copied and edited the code from the MWPhotoBrowser demo project. I don't receive any errors, but I can't get the gallery to appear once the button is touched. (BTW, I assigned the IBAction to the button.) If there is another source or an alternative framework I can use, please advise. Thanks!
Some ideas:
1) Try using 'initWithPhotos'.
2) Put a breakpoint at the place you are pushing and check that the push is called.
3) Check that the navigationController is not null (0x0) while debugging (see the check after this list).
4) Remember to release 'browser'.
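On point 3: messaging nil is a silent no-op in Objective-C, so if the view controller isn't embedded in a UINavigationController, the push simply does nothing. A quick check (a sketch):
if (self.navigationController == nil) {
NSLog(@"No UINavigationController - the push would silently do nothing");
} else {
[self.navigationController pushViewController:browser animated:YES];
}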

AVAssetImageGeneratorCompletionHandler - how to set or return variables?

I'm using AVAssetImageGenerator to get images from a movie clip without playing it first. Now I've got a question: how do you assign to variables inside a completion handler?
Is it possible?
I'm getting this error message and have no idea what it means (Google turned up no results):
"Variable is not assignable (missing __block type specifier)"
So I have to ask the pros here.
Here's the code. I want to save or return my generated image data, so I can delete the "setImage" message within the following handler.
UIImage* thumbImg = [[UIImage alloc] init];
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error)
{
if (result != AVAssetImageGeneratorSucceeded)
{
NSLog(#"couldn't generate thumbnail, error:%#", error);
}
[button setImage:[UIImage imageWithCGImage:im] forState:UIControlStateNormal];
thumbImg = [[UIImage imageWithCGImage:im] retain];
[generator release];
};
Would be great to learn about that.
Thanks for your time.
First of all, it seems you don't need to init your thumbImg when it's declared; the UIImage object created on that line will be overwritten inside the block and will leak. Just initialize it with nil.
The actual problem in your code is that a variable you're going to change inside a block must be declared with the __block specifier (as the error message says). So your first line should be:
__block UIImage* thumbImg = nil;
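Putting it together, a minimal sketch of the corrected declaration and handler (non-ARC, matching the question; note the handler runs asynchronously, so thumbImg stays nil until the generator actually calls it):
__block UIImage *thumbImg = nil; // __block makes the variable assignable inside the block
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error)
{
if (result == AVAssetImageGeneratorSucceeded) {
thumbImg = [[UIImage imageWithCGImage:im] retain]; // retain: the image must outlive the block (non-ARC)
} else {
NSLog(@"couldn't generate thumbnail, error: %@", error);
}
};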
