Until the iOS 7 update I was using...
UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
...with great success, so that my app could show a still of the video that the user had just taken.
I understand this method has been deprecated as of iOS 7 and I need an alternative. I see there's a method
- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option
but how do I get the image back from it so I can place it within the videoReview button image?
Thanks in advance, Jim.
**Edited question, after trying notification centre method**
I used the following code -
[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
I made the NSArray times out of two NSNumber objects, 1 and 2.
I then tried to capture the notification in the following method
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSDictionary*)info{
UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];
I then proceeded to use this thumbnail image as the button's preview image... but it didn't work.
If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers
I managed to find a great way using AVAssetImageGenerator; please see the code below...
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = nil;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
CGImageRelease(oneRef); // copyCGImageAtTime: returns an owned CGImageRef, so release it
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;
Within the header file, please import:
#import <AVFoundation/AVFoundation.h>
It works perfectly, and I've been able to call it from viewDidLoad, which was quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.
Hope this helps anyone else who had the same problem.
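A side note: since copyCGImageAtTime:actualTime:error: blocks the calling thread, AVAssetImageGenerator also offers an asynchronous variant that may be worth considering for larger assets. A rough sketch, reusing the generate1 and _firstImage names from above:

// Asynchronous alternative to copyCGImageAtTime: (sketch, assuming the same generator as above)
CMTime halfSecond = CMTimeMake(1, 2);
[generate1 generateCGImagesAsynchronouslyForTimes:@[[NSValue valueWithCMTime:halfSecond]]
                                completionHandler:^(CMTime requestedTime, CGImageRef imageRef, CMTime actualTime,
                                                    AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumb = [UIImage imageWithCGImage:imageRef]; // UIImage retains the CGImage
        dispatch_async(dispatch_get_main_queue(), ^{
            [_firstImage setImage:thumb]; // touch UIKit on the main queue
        });
    }
}];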
**Update for Swift 5.1**
Useful function...
func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
    do {
        let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        let thumbnail = UIImage(cgImage: img)
        return thumbnail
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
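Roughly, that flow looks like this (a sketch only; the selector and button names are placeholders):

// Subscribe before (or right after) kicking off the request - sketch, names are illustrative.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(thumbnailRequestDidFinish:)
                                             name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                           object:moviePlayer];
[moviePlayer requestThumbnailImagesAtTimes:@[@1.f] timeOption:MPMovieTimeOptionNearestKeyFrame];

// ...

- (void)thumbnailRequestDidFinish:(NSNotification *)notification
{
    // The generated image arrives in the notification's userInfo dictionary.
    UIImage *thumbnail = notification.userInfo[MPMoviePlayerThumbnailImageKey];
    if (thumbnail) {
        [self.videoReviewButton setImage:thumbnail forState:UIControlStateNormal]; // videoReviewButton is a placeholder
    }
}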
The problem is that you have to specify float values in requestThumbnailImagesAtTimes.
For example, this will work
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];
but this won't work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
The way to do it, at least in iOS 7, is to use floats for your times:
NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];
Hope this helps
Jeely provides a good workaround, but it requires an additional framework that isn't necessary when MPMoviePlayerController already provides functions for this task. I noticed a syntax error in the original poster's code: the thumbnail notification handler expects an object of type NSNotification, not a dictionary. Here's a corrected example:
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSNotification *)note
{
    NSDictionary *userInfo = [note userInfo];
    UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image != nil)
        [thumbView setImage:image];
}
I've just looked for a solution to this problem myself and got good help from your question.
I got your code above to work with one small change: I removed a colon.
Change
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
to
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
Got this to work nicely. Also, I've found that you can't call a method that relies on NotificationCenter if you're already in a notification selector. It's something I tried at first: I called requestThumbnailImagesAtTimes inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification, which won't work, I think because the notification won't fire.
The code in Swift 2.1 would look like this:
do {
    let asset1 = AVURLAsset(URL: url)
    let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
    generate1.appliesPreferredTrackTransform = true
    let time: CMTime = CMTimeMake(3, 1) // to grab the third second of the video
    let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
    let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError {
    print(error)
}
Related
I started a blank tvOS project and created the following code:
- (void)viewDidLoad
{
    [super viewDidLoad];

    AVPlayer *avPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://www.myurl.com/myvideo.mp4"]];
    AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    avPlayerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    [self.view.layer addSublayer:avPlayerLayer];
    [avPlayer play];
}
Nothing happens in the simulator though once the app loads. No video, nothing, just a blank translucent screen in my Apple TV simulator.
What's the proper way to play a sample video on app launch for an Apple TV app from an HTTP source?
I just pasted your code in my tvOS sample project, replaced the URL and ran it.
Nothing happened. Well, except for the fact that there's a log entry telling me that App Transport Security has blocked my URL request.
So I headed to the Info.plist, disabled ATS and upon next launch the video showed up just fine.
So if you're also using a non-HTTPS URL, you're very likely running into this issue, which is easily fixed by using an HTTPS URL, disabling ATS completely, or allowing specific non-HTTPS URLs in your Info.plist.
P.S.: I used this video for testing.
You could also use TVML and TVMLJS
https://developer.apple.com/library/prerelease/tvos/documentation/TVMLJS/Reference/TVJSFrameworkReference/
Adhere to the 'TVApplicationControllerDelegate' protocol and add some properties.
AppDelegate.h
@interface AppDelegate : UIResponder <UIApplicationDelegate, TVApplicationControllerDelegate>
...
@property (strong, nonatomic) TVApplicationController *appController;
@property (strong, nonatomic) TVApplicationControllerContext *appControllerContext;
Then add the following to 'didFinishLaunchingWithOptions'
AppDelegate.m
#define url @"http://localhost:8000/main.js"

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Override point for customization after application launch.
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.appControllerContext = [[TVApplicationControllerContext alloc] init];

    NSURL *javascriptURL = [NSURL URLWithString:url];
    self.appControllerContext.javaScriptApplicationURL = javascriptURL;

    // Log any launch options
    for (id key in launchOptions) {
        id val = [launchOptions objectForKey:key];
        NSLog(@"key=%@ value=%@", key, val);
    }

    // Create the TVApplicationController once, outside the loop
    self.appController = [[TVApplicationController alloc] initWithContext:self.appControllerContext window:self.window delegate:self];
    return YES;
}
Create a folder and add the following files:
main.js
index.tvml
main.js
function launchPlayer() {
var player = new Player();
var playlist = new Playlist();
var mediaItem = new MediaItem("video", "http://trailers.apple.com/movies/focus_features/9/9-clip_480p.mov");
player.playlist = playlist;
player.playlist.push(mediaItem);
player.present();
//player.play()
}
//in application.js
App.onLaunch = function(options) {
launchPlayer();
}
Be careful with this URL in the mediaItem.
Set up a template of your choice.
index.tvml
<document>
<alertTemplate>
<title>…</title>
<description>…</description>
<button>
<text>…</text>
</button>
<text>…</text>
</alertTemplate>
</document>
Open Terminal, navigate to this folder, then run:
python -m SimpleHTTPServer 8000
Make sure the port here matches the port in your Objective-C url. The Apple examples use 9001.
See these tutorials for more info
http://jamesonquave.com/blog/developing-tvos-apps-for-apple-tv-with-swift/
http://jamesonquave.com/blog/developing-tvos-apps-for-apple-tv-part-2/
One issue I ran into was trying to play a local video file. It wouldn't work and there were constraint issues etc.
It looks like you can't use Python's simple server to serve the videos, so either try Apache or link to a video on the web.
This SO answer pointed me there.
The best way to play video in your app on AppleTV is going to be AVKit's AVPlayerViewController. If you use AVKit, you get a lot of stuff for free.
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayerViewController_Class/index.html
You simply add that player to the viewController's player property:
// instantiate here or in storyboard
AVPlayerViewController *viewController = [[AVPlayerViewController alloc] initWithNibName:nil bundle:nil];
viewController.player = player;
[self addChildViewController:viewController];
[self.view addSubview:viewController.view];
[viewController didMoveToParentViewController:self];
// setup constraints, etc.
// play the video
[player play];
Also, as mentioned in another answer, make sure the video you're trying to play comes from an HTTPS connection, or that you've disabled App Transport Security by setting the proper flags in the plist.
I didn't like the answers which messed about with subviews etc.
For full-screen playback, I use the following (Non-ARC) code:
// Play the stream
NSString *wifiStreamAddress = @"http://yourmoviefile.m3u8";
AVPlayer *player = [[AVPlayer alloc] initWithURL: [NSURL URLWithString: wifiStreamAddress] ];
AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
playerViewController.player = player;
// Keep pointers to player and controller using retained properties:
self.player = player;
self.playerViewController = playerViewController;
[player release];
[playerViewController release];
[self presentViewController: playerViewController animated: true completion: ^{
[self.player play];
}];
This works really neatly, animating the presentation and fading back to the previous view when you tap the MENU button. It also works great with the remote control, using all the standard functions.
It's working for me. It may be helpful for you:
-(void)playAction
{
AVPlayerViewController *viewController = [[AVPlayerViewController alloc] initWithNibName:nil bundle:nil];
viewController.player = player;
[self addChildViewController:viewController];
[self.view addSubview:viewController.view];
[viewController didMoveToParentViewController:self];
// play the video
[player play];
}
Swift version
Make a PlayerViewController that inherits from AVPlayerViewController.
In the view controller that has the play button, add a function like this:
@IBAction func onClickPlay(sender: AnyObject) {
    let playerVC = PlayerViewController()
    playerVC.playVimeoVideo(urlString)
    self.presentViewController(playerVC, animated: true, completion: nil)
}
In the PlayerViewController
func playVimeoVideo(link: String) {
    player = AVPlayer(URL: NSURL(string: link)!)
    player?.play()
}
Notice
The question and some answers may be a little misleading: you might think that only a URL ending in ".mp4" can be played by the Apple TV. I believed so the first time I saw the post, but it is not true. In fact, with AVPlayerViewController you can play Vimeo streaming video! The link to the streaming video is not of the form https://vimeo.com/92655878; it is possible to get it from the Vimeo site by extracting it from a JSON file, which can be downloaded from this link
let link = "https://vimeo.com/api/oembed.json?url=https%3A//vimeo.com/" + videoId
To be able to get the correct URL for a specific video, you need Vimeo Pro user access to obtain its stream link.
So I have been at it all day with no luck, and it has been, needless to say, quite frustrating. I have looked up many examples and downloadable categories, all of which tout being able to crop images flawlessly. Which they do; however, the minute I try to do it with an image generated via AVCaptureSession it does not work as well. I consulted both of these sources
http://codefuel.wordpress.com/2011/04/22/image-cropping-from-a-uiscrollview/
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
and the project from the first link seems to work exactly as advertised, but as soon as I hack it to do the same magic on an AV capture image... nope...
Does anyone have insight into this? Also, here is my code for reference.
- (IBAction)TakePhotoPressed:(id)sender
{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
//NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
//NSLog(#"attachements: %#", exifAttachments);
}
else
NSLog(#"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSLog(#"%f",image.size.width);
NSLog(#"%f",image.size.height);
float scale = 1.0f/_scrollView.zoomScale;
CGRect visibleRect;
visibleRect.origin.x = _scrollView.contentOffset.x * scale;
visibleRect.origin.y = _scrollView.contentOffset.x * scale;
visibleRect.size.width = _scrollView.bounds.size.width * scale;
visibleRect.size.height = _scrollView.bounds.size.height * scale;
UIImage* cropped = [self cropImage:image withRect:visibleRect];
[croppedImage setImage:cropped];
[image release];
}
];
[croppedImage setHidden:NO];
}
The cropImage function used above:
-(UIImage*)cropImage :(UIImage*)originalImage withRect :(CGRect) rect
{
CGRect transformedRect=rect;
if(originalImage.imageOrientation==UIImageOrientationRight)
{
transformedRect.origin.x = rect.origin.y;
transformedRect.origin.y = originalImage.size.width-(rect.origin.x+rect.size.width);
transformedRect.size.width = rect.size.height;
transformedRect.size.height = rect.size.width;
}
CGImageRef cr = CGImageCreateWithImageInRect(originalImage.CGImage, transformedRect);
UIImage* cropped = [UIImage imageWithCGImage:cr scale:originalImage.scale orientation:originalImage.imageOrientation];
[croppedImage setFrame:CGRectMake(croppedImage.frame.origin.x,
croppedImage.frame.origin.y,
cropped.size.width,
cropped.size.height)];
CGImageRelease(cr);
return cropped;
}
In the interest of verbosity, and to arm whoever might help me in my plight with as much information as possible, I'm also tempted to post the init of my scrollView and AVCapture session. However, that may be a bit too much, so if you want to see it, just ask.
Now, as for what the code actually does:
What it looks like before I take the picture
And after...
EDIT:
Well, I have a few views now and no comments, so either no one has figured it out or it's so simple they thought I would have figured it out by now... In any case, I have not made any progress. So for anyone interested, here is a small sample app with the code all set up so you can see what I am doing:
https://docs.google.com/open?id=0Bxr4V3a9QFM_NnoxMkhzZTVNVEE
It seems that this little conundrum didn't have only me stumped, as after nearly a week the scant few who viewed my question had no suggestions either. I must say that for this particular problem I could not get it to work that way; I pondered, tinkered, and mused for a while to no avail. Until I did this:
[self HideElements];
UIGraphicsBeginImageContext(chosenPhotoView.frame.size);
[chosenPhotoView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self ShowElements];
And that's it: less code, and it worked pretty much instantly. So instead of trying to crop an image via the scroll view, I take a screenshot of the screen at that moment and then crop the image using the scroll view's frame variables. The hide/show element functions hide any elements overlapping the picture I want.
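For anyone who wants the cropping half of that spelled out, a sketch of cropping the screenshot to the scroll view's frame might look like this (assuming viewImage from the snippet above, that _scrollView shares the captured view's coordinate space, and croppedImage is the UIImageView from the earlier code):

// Crop the screenshot down to the area the scroll view occupies (sketch).
CGFloat captureScale = viewImage.scale; // 1.0 here, since UIGraphicsBeginImageContext was used
CGRect cropRect = CGRectMake(_scrollView.frame.origin.x * captureScale,
                             _scrollView.frame.origin.y * captureScale,
                             _scrollView.frame.size.width * captureScale,
                             _scrollView.frame.size.height * captureScale);
CGImageRef croppedRef = CGImageCreateWithImageInRect(viewImage.CGImage, cropRect);
UIImage *croppedShot = [UIImage imageWithCGImage:croppedRef
                                           scale:viewImage.scale
                                     orientation:viewImage.imageOrientation];
CGImageRelease(croppedRef);
[croppedImage setImage:croppedShot];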
I'm trying to get a screenshot of an MKMapView, and I'm using the following code:
UIGraphicsBeginImageContext(myMapView.frame.size);
[myMapView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShot=UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return screenShot;
And I'm getting an almost blank image, with just the map's current-location icon and a Google logo in it.
What could be causing that?
I should tell you that myMapView is actually on another viewController's view, but since I'm getting the blue spot showing the location and the Google logo, I assume the reference I have is the correct one.
Thank you.
iOS 7 introduced a new method to generate screenshots of an MKMapView. It is now possible to use the new MKMapSnapshot API as follows:
MKMapView *mapView = [..your mapview..];

MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = mapView.region;
options.mapType = MKMapTypeStandard;
options.showsBuildings = NO;
options.showsPointsOfInterest = NO;
options.size = CGSizeMake(1000, 500);

MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithQueue:dispatch_get_main_queue() completionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
    if (error) {
        NSLog(@"An error occurred: %@", error);
    } else {
        [UIImagePNGRepresentation(snapshot.image) writeToFile:@"/Users/<yourAccountName>/map.png" atomically:YES];
    }
}];
Currently, overlays and annotations are not rendered in the snapshot. You have to render them onto the resulting snapshot image yourself afterwards. The provided MKMapSnapshot object has a handy helper method to do the mapping between coordinates and points:
CGPoint point = [snapshot pointForCoordinate:locationCoordinate2D];
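For example, a rough sketch of compositing the annotations on top of the snapshot could look like this (reusing mapView from above; the pin image name and offset are assumptions, not part of the MapKit API):

// Draw the snapshot first, then each annotation on top of it (sketch).
UIImage *pinImage = [UIImage imageNamed:@"pin"]; // assumed asset name
UIGraphicsBeginImageContextWithOptions(snapshot.image.size, YES, snapshot.image.scale);
[snapshot.image drawAtPoint:CGPointZero];

for (id<MKAnnotation> annotation in mapView.annotations) {
    CGPoint point = [snapshot pointForCoordinate:annotation.coordinate];
    // Offset so the pin's tip sits on the coordinate (approximate).
    [pinImage drawAtPoint:CGPointMake(point.x - pinImage.size.width / 2.0,
                                      point.y - pinImage.size.height)];
}

UIImage *compositeImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();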
As mentioned here, you can try this
- (UIImage*) renderToImage
{
UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
I have an address book in my app which gets filled from a website via JSON. If I add a new person on my website, the person gets added to the app's address book on the next launch. If I update a person's name on my website, it works perfectly... but if I add a new image to an existing person, it still shows the old default Apple image after the reload. If I recompile, the picture appears. Is there a cache for address book pictures?
I tried the following:
BOOL imgBool = ABPersonHasImageData(aContact); -> false
and
ABPersonRemoveImageData(aContact, &anError); -> false because no picture in AB
I add a person's picture like this:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[[[self.profsInSection objectAtIndex:indexPath.section] objectAtIndex:indexPath.row] bildUrl]]]];
NSData *picData = UIImageJPEGRepresentation(image, 0.9f);
[...]
ABPersonSetImageData(aContact, (CFDataRef)picData, nil);
There are no errors; it just keeps the last picture and ignores new pictures... :( Any ideas?
Did you forget to call the save function after applying your changes to the address book?
ABAddressBookSave(addrBook, nil);
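Put together, the update-and-save flow might look roughly like this (a sketch assuming you already have the ABAddressBookRef addrBook and the aContact record from the question):

// Replace the contact's picture, then persist the change (sketch).
CFErrorRef error = NULL;
ABPersonRemoveImageData(aContact, &error);                 // optional: clear any previous image
ABPersonSetImageData(aContact, (CFDataRef)picData, &error);
if (!ABAddressBookSave(addrBook, &error)) {
    NSLog(@"Saving the address book failed: %@", error);
}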
UPDATE: since you're using ABUnknownPersonViewController, I've tried the following code myself and it works on both the simulator and a device:
ABRecordRef newPerson = ABPersonCreate();
ABRecordSetValue(newPerson, kABPersonLastNameProperty, @"Smith", nil);
UIImage *personImage = [UIImage imageNamed:@"image.jpg"];
NSData *dataRef = UIImageJPEGRepresentation(personImage, 0.9f);
CFErrorRef error = nil;
bool result = ABPersonSetImageData(newPerson, (CFDataRef)dataRef, &error);
ABUnknownPersonViewController *controller = [ABUnknownPersonViewController new];
//controller.addressBook = addressBook;
controller.allowsAddingToAddressBook = YES;
controller.displayedPerson = newPerson;
[self.navigationController pushViewController:controller animated:YES];
[controller release];
The issue I met when testing this code was a typo in the image name, so the UIImage instance was nil and the NSData was empty, and ABPersonSetImageData gave no error about that. So double-check that your UIImage instance is not nil.
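A small guard along these lines (just a sketch) makes that failure visible instead of silent:

// Bail out early if the image could not be loaded or encoded (sketch).
UIImage *personImage = [UIImage imageNamed:@"image.jpg"];
NSData *dataRef = personImage ? UIImageJPEGRepresentation(personImage, 0.9f) : nil;
if (dataRef.length == 0) {
    NSLog(@"No image data - check the file name and that the image is in the bundle");
    return;
}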
I wanted to find an "easy" way to use the pinch/zoom function in my app, so I decided to use a UIScrollView.
So far so good.
I load my image from an SQLite DB like so:
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    imageView.image = entity.Aattribute;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.title = @"Title";

    imageView = [[UIImageView alloc] initWithFrame:[UIScreen mainScreen].applicationFrame];
    imageView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
    imageView.contentMode = UIViewContentModeScaleAspectFit;
    imageView.backgroundColor = [UIColor blackColor];

    myScrollView.contentSize = CGSizeMake(imageView.frame.size.width, imageView.frame.size.height);
    myScrollView.maximumZoomScale = 4.0;
    myScrollView.minimumZoomScale = 0.75;
    myScrollView.clipsToBounds = YES;
    myScrollView.delegate = self;

    [myScrollView addSubview:imageView];
    self.view = myScrollView;
}

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return imageView;
}
Any help would be appreciated!
Thank you for your time!
EDIT:
I'm just going to answer my own question here! (I'm getting better at this!!! LOL)
Up above is the working code. I have edited it so that anyone who needs it can refer to it!
Thanks for the answers!
This looks mostly correct. A couple of notes:
you've misspelled "viewDidLoad". This could be a side effect of typing it into the browser window. If it's wrong in your code, then it's certainly not helping things be better.
viewForZoomingInScrollView: is expecting you to return a UIView. myImage sounds suspiciously unlike a UIView. You should be returning the imageView instance variable. That's what's getting moved around.
Edit
From the documentation:
The UIScrollView class can have a delegate that must adopt the UIScrollViewDelegate protocol. For zooming and panning to work, the delegate must implement both viewForZoomingInScrollView: and scrollViewDidEndZooming:withView:atScale:; in addition, the maximum (maximumZoomScale) and minimum (minimumZoomScale) zoom scale must be different.
So it looks like you still need to implement another delegate method.
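For completeness, a minimal implementation of that remaining delegate method could be as simple as this (a sketch; for basic zooming it doesn't need to do anything):

- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
    // Nothing required here for plain pinch/zoom; hook in if you need to react to the final zoom scale.
}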