MPMediaPickerController for in-house app videos - MPMoviePlayerController

I have an app that has stored video files. I would like the user to be able to pick and play ANY video stored in my app's files, NOT a predetermined movie like this:
NSString *movieFile = [[NSBundle mainBundle]
    pathForResource:@"somemoviename" ofType:@"m4v"];
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL fileURLWithPath:movieFile]];
so it would work just like picking a video from the photo library on my iPhone, but it's doing it with the video files in the application itself.
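MPMediaPickerController only browses the device's media library, not your bundle, so for bundled files you generally roll your own picker. A minimal sketch of one approach (the method names are hypothetical and error handling is omitted): enumerate the bundle's .m4v files with NSFileManager, list them in a table view, and play whichever one the user taps.
// Hypothetical helper: list every .m4v file shipped in the app bundle.
- (NSArray *)bundledVideoPaths
{
    NSString *resourcePath = [[NSBundle mainBundle] resourcePath];
    NSArray *contents = [[NSFileManager defaultManager]
        contentsOfDirectoryAtPath:resourcePath error:NULL];
    NSMutableArray *videos = [NSMutableArray array];
    for (NSString *name in contents) {
        if ([[name pathExtension] isEqualToString:@"m4v"]) {
            [videos addObject:[resourcePath stringByAppendingPathComponent:name]];
        }
    }
    return videos;
}

// Play the file the user picked from your table view.
- (void)playVideoAtPath:(NSString *)path
{
    MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc]
        initWithContentURL:[NSURL fileURLWithPath:path]];
    [self presentMoviePlayerViewControllerAnimated:player];
}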

Related

How to fetch file icons for any file in OS X using Cocoa?

I am writing a Mac app in which I list directory contents, i.e. the files and folders in that directory.
Now, I want to show thumbnails for the various file types in that directory. Files can be images, PDFs, PSDs, etc.
How do I fetch icons of any file type in Cocoa?
Thanks!
An application bundle usually ships its icon at several sizes, so we can specify which size we require. For example:
const NSInteger size = 64;
NSString *filePath = @"pathToFileOrAppBundle";
// NSWorkspace returns the Finder icon for any file, folder, or bundle.
NSImage *image = [[NSWorkspace sharedWorkspace] iconForFile:filePath];
// Ask for the 64x64 version of the icon.
[image setSize:NSMakeSize(size, size)];
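If you only know the file type and not a concrete path, NSWorkspace can also return the generic icon for an extension; a small sketch (the extension here is arbitrary):
// Generic Finder icon for a file type rather than a specific file.
NSImage *pdfIcon = [[NSWorkspace sharedWorkspace] iconForFileType:@"pdf"];
[pdfIcon setSize:NSMakeSize(64, 64)];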

Re-render video using the new Photos Framework in iOS8

I need to be able to take a video from Photos and re-render it: clip it in time, change the width and height, and change the frame rate. Certainly I need to start with:
PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
[self.asset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
// Get full image
NSURL *url = [contentEditingInput fullSizeImageURL];
}];
And I should be able to adjust width, height, and duration, grab an NSData from that, and write it out to the file system.
But the url is nil, which implies to me that I can't edit videos with the new Photos framework. (ALAsset didn't have a problem with this using AVAssetExportSession.) This makes sense since the Apple Dev sample code can't edit videos either.
Now, to make life easier I could just pass that url to an AVAssetExportSession but I can't, because it is nil. If I just modified width, height and duration I'd still need to grab an NSData from it, write that out to the file system.
I do not need to write the modified video back to Photos, I actually need the video on the file system since I'll be uploading it to our servers.
fullSizeImageURL is for working with Photo assets. You want the avAsset property when working with a video. Modify the actual video, not the metadata, by writing a new video file.
To do that, you could use that avAsset in an AVMutableComposition:
Insert the appropriate time range of the avAsset's video track (AVAssetTrack) into an AVMutableCompositionTrack. That'll do your trimming.
Place/size it appropriately using layer instructions (AVMutableVideoCompositionLayerInstruction) to do your cropping and scaling. A sketch of both steps follows.
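Putting those steps together, a minimal sketch (the time range, scale factor, render size, frame rate, and output path are placeholder values; error handling is omitted):
AVAsset *avAsset = contentEditingInput.avAsset;
AVAssetTrack *videoTrack =
    [[avAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

// 1. Trim: copy only the wanted range into a mutable composition.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
// Keep 8 seconds starting at 0:02 (start, duration).
CMTimeRange range = CMTimeRangeMake(CMTimeMakeWithSeconds(2, 600),
                                    CMTimeMakeWithSeconds(8, 600));
[compTrack insertTimeRange:range ofTrack:videoTrack
                    atTime:kCMTimeZero error:NULL];

// 2. Crop/scale: a video composition with a layer instruction and transform.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:compTrack];
[layerInstruction setTransform:CGAffineTransformMakeScale(0.5, 0.5)
                        atTime:kCMTimeZero];

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
instruction.layerInstructions = @[layerInstruction];

AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[instruction];
videoComposition.renderSize = CGSizeMake(640, 360);
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps

// 3. Export straight to the file system (not back to Photos).
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetMediumQuality];
export.videoComposition = videoComposition;
export.outputFileType = AVFileTypeQuickTimeMovie;
export.outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"trimmed.mov"]];
[export exportAsynchronouslyWithCompletionHandler:^{
    // Upload export.outputURL to your server here.
}];
AVAssetExportSession writes directly to the file system, which matches the requirement of uploading the result rather than saving it back to Photos.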

Upload JPG image to Twitter

I'm trying to upload a JPG image to Twitter. I can upload @"icon.png" as many times as I want, but I can't seem to get a JPG to upload; I always get error 403 back from Twitter. This is my code:
ACAccount *twitterAccount = [arrayOfAccounts objectAtIndex:0];
UIImage *image = [UIImage imageNamed:file]; // file is the name of a jpg image. If replaced with @"icon.png" the upload works.
TWRequest *postRequest = [[TWRequest alloc] initWithURL:[NSURL URLWithString:@"https://upload.twitter.com/1/statuses/update_with_media.json"]
    parameters:[NSDictionary dictionaryWithObject:@"Hello. This is a tweet." forKey:@"status"] requestMethod:TWRequestMethodPOST];
[postRequest addMultiPartData:UIImagePNGRepresentation(image) withName:@"media" type:@"multipart/png"]; // I've tried this with UIImageJPEGRepresentation(image, 80) and multipart/jpg and jpeg to no avail.
// Set the account used to post the tweet.
[postRequest setAccount:twitterAccount];
[postRequest performRequestWithHandler:^(NSData *responseData, NSHTTPURLResponse *urlResponse, NSError *error)
{
    NSLog(@"Twitter response, HTTP response: %ld", (long)[urlResponse statusCode]);
}];
Please help!
If I use the code below, it works:
UIImage *image = [UIImage imageNamed:@"background.png"]; // This file is copied to the device when the app is built.
But if I use this, it does not work, even though I have confirmed that this path and file name do in fact pull up the image:
UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"%@/%@", dir, file]];
What am I doing wrong? Thanks
/// SOLUTION
I may be an idiot, please don't crucify me, but here is what I needed to change the UIImage call to:
UIImage *image = [UIImage imageWithContentsOfFile:[NSString stringWithFormat:@"%@/%@", dir, file]];
This allows both PNG and JPG to upload without issue. Stupid mistake but at least it's working now. Thanks for the replies!!
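The reason: imageNamed: looks the name up inside the main bundle (and caches it), so passing a full filesystem path returns nil, and the multipart body ends up empty. imageWithContentsOfFile: loads from an arbitrary path. If the file is already a JPEG on disk, you can also skip the PNG re-encode and send its bytes directly; a small sketch using the conventional image/jpeg MIME type (an assumption, not from the original post):
// Read the JPEG bytes straight from disk instead of round-tripping through UIImage.
NSString *path = [NSString stringWithFormat:@"%@/%@", dir, file];
NSData *jpegData = [NSData dataWithContentsOfFile:path];
[postRequest addMultiPartData:jpegData withName:@"media" type:@"image/jpeg"];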
A 403 means the rejection comes from Twitter itself: you cannot post the same content twice from the same Twitter account. If you want to upload the same picture again, first delete the earlier tweet from the account, then try the upload again; otherwise, upload a different picture.

HTTP Live Streaming of static file to iOS device

I'm trying to understand the "chunked" aspect of HTTP Live Streaming a static video file to an iOS device. Where does the chunking of the video file happen?
Edit: from reading HTTP Live Streaming and a bit more of https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming-07, it sounds like the video file is split into .ts segments on the server. Alternatively, the m3u8 playlists can specify byte offsets into a single file (apparently using EXT-X-BYTERANGE).
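For concreteness, a segmented playlist on the server might look something like this (segment names and durations are made up):
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST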
Here's what I understand of this process after reading Apple's HLS description and https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming-07:
A static file lives on my server. It has the proper audio/video encoding (H.264 and AAC).
I'll pass an m3u8 playlist to the media player (MPMoviePlayer or similar) in my app.
The app will "reload the index" during media playback; in other words, the app will request additional segments to play.
Each 10-second segment is in an MPEG Transport Stream container.
My understanding of this process is incomplete (and perhaps incorrect). Any additional info is much appreciated.
What are you asking for, exactly? More info?
The app is not so much reloading the index as playing it: the M3U8 file points the player at the correctly encoded segments. That way you only have to make a connection between the media player and the manifest file. For example:
NSURL *fileURL = [NSURL URLWithString:@"http://techxvweb.fr/html5/AppleOutput/2012-03-10-j23-dax-smr-mt1-m3u8-aapl.ism/manifest(format=m3u8-aapl)"];
moviePlayerController = [[MPMoviePlayerController alloc] initWithContentURL:fileURL];
/* Inset the movie frame in the parent view frame. */
CGRect viewInsetRect = CGRectInset([self.view bounds], 0.0, 0.0);
[[moviePlayerController view] setFrame:viewInsetRect];
[self.view addSubview:moviePlayerController.view];
[moviePlayerController play];
where the NSURL is the URL of your manifest file. Note that I'm appending
/manifest(format=m3u8-aapl)
to the original manifest URL, which tells the server to translate the ISM file into the correct M3U8 syntax.

Comprehensive Image processing example using Cocoa API

Can someone point out a comprehensive example of image processing using the Cocoa API? I am developing an application for the Mac, not for the iPhone. I usually come across UIImage manipulation, which provides an intuitive set of methods for tasks such as per-pixel manipulation and saving to file in different formats. With AppKit's NSImage, I find it hard to manipulate the per-pixel data of images and to save to file formats other than TIFF, such as PNG.
If you want to work with pixels, CGImage, CGImageSource, and CGImageDestination are the way to go. Unlike AppKit's NSImage, which is designed to handle any kind of image, the CGImage types are designed specifically for raster images.
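A minimal round trip with those types might look like this (the paths are placeholders; in real code, check every call's result for NULL):
#import <Cocoa/Cocoa.h>
#import <ImageIO/ImageIO.h>

// Load a raster image from disk.
NSURL *inURL = [NSURL fileURLWithPath:@"/path/to/input.jpg"];
CGImageSourceRef source =
    CGImageSourceCreateWithURL((__bridge CFURLRef)inURL, NULL);
CGImageRef image = CGImageSourceCreateImageAtIndex(source, 0, NULL);

// ... manipulate pixels via a CGBitmapContext here if needed ...

// Write it back out as PNG.
NSURL *outURL = [NSURL fileURLWithPath:@"/path/to/output.png"];
CGImageDestinationRef dest = CGImageDestinationCreateWithURL(
    (__bridge CFURLRef)outURL, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(dest, image, NULL);
CGImageDestinationFinalize(dest);

CGImageRelease(image);
CFRelease(source);
CFRelease(dest);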
You can retrieve a bitmap representation of your image object and modify its data:
NSBitmapImageRep *rep = [[image representations] objectAtIndex:0];
unsigned char *bmpData = [rep bitmapData];
(Note that the first representation is not guaranteed to be an NSBitmapImageRep; check with isKindOfClass: in real code.)
To save the modified representation in PNG format, do the following:
NSData *data = [rep representationUsingType:NSPNGFileType properties:nil];
[data writeToFile:@"/path-to-your-file/image.png" atomically:NO];
