With file access in a sandboxed macOS app in Swift in mind: does it work the same for URLs provided via drops from Finder or other apps?
Since there's no NSOpenPanel call to grant folder access as in this example, just URLs, I think the folder access is implicit: the user dragged the file from the source / desktop "folder", much the same as implicit selection via the open dialog.
I have not begun the sandbox migration yet, but I wanted to verify that my thinking is accurate. Here's a candidate routine that does not work in sandbox mode:
func performDragOperation(_ sender: NSDraggingInfo) -> Bool {
    let pboard = sender.draggingPasteboard()
    guard let items = pboard.pasteboardItems, let types = pboard.types else { return false }

    if types.contains(NSURLPboardType) {
        for item in items {
            if let urlString = item.string(forType: kUTTypeURL as String) {
                self.webViewController.loadURL(text: urlString)
            } else if let urlString = item.string(forType: kUTTypeFileURL as String), // "public.file-url"
                      let fileURL = NSURL(string: urlString)?.filePathURL {
                self.webViewController.loadURL(url: fileURL)
            } else {
                Swift.print("items has \(item.types)")
            }
        }
    } else if types.contains(NSPasteboardURLReadingFileURLsOnlyKey) {
        Swift.print("we have NSPasteboardURLReadingFileURLsOnlyKey")
    }
    return true
}
In sandbox mode no URL is acted upon, and no error is thrown.
Yes, the file access is implicit. Since the sandbox implementation is poorly documented and had (and still has) many bugs, you'll want to handle both URLs and filenames. The view should register itself for both types at initialisation. The code is in Objective-C, but the API should be the same.
[self registerForDraggedTypes:[NSArray arrayWithObjects:NSFilenamesPboardType, NSURLPboardType, nil]];
Then on performDragOperation:
- (BOOL)performDragOperation:(id <NSDraggingInfo>)sender
{
    BOOL dragPerformed = NO;
    NSPasteboard *paste = [sender draggingPasteboard];
    // The list of types we can accept.
    NSArray *typesWeRead = [NSArray arrayWithObjects:NSFilenamesPboardType, NSURLPboardType, nil];
    NSString *typeInPasteboard = [paste availableTypeFromArray:typesWeRead];
    if ([typeInPasteboard isEqualToString:NSFilenamesPboardType]) {
        // Careful: -propertyListForType: returns id.
        // We just happen to know it will be an array, and that it contains path strings.
        NSArray *fileArray = [paste propertyListForType:NSFilenamesPboardType];
        NSMutableArray *urlArray = [NSMutableArray arrayWithCapacity:[fileArray count]];
        for (NSString *path in fileArray) {
            [urlArray addObject:[NSURL fileURLWithPath:path]];
        }
        dragPerformed = // ... do your stuff with the files;
    } else if ([typeInPasteboard isEqualToString:NSURLPboardType]) {
        NSURL *droppedURL = [NSURL URLFromPasteboard:paste];
        if ([droppedURL isFileURL]) {
            dragPerformed = // ... do your stuff with the file;
        }
    }
    return dragPerformed;
}
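For reference, a rough Swift 3 translation of the same approach, inside your drop-target NSView subclass. This is a sketch, not drop-in code: handleFileURLs(_:) is a hypothetical hook standing in for whatever your app does with the dropped files.
override func awakeFromNib() {
    super.awakeFromNib()
    // Register for both filenames and URLs, as in the Objective-C version above.
    register(forDraggedTypes: [NSFilenamesPboardType, NSURLPboardType])
}

override func performDragOperation(_ sender: NSDraggingInfo) -> Bool {
    let pboard = sender.draggingPasteboard()
    guard let type = pboard.availableType(from: [NSFilenamesPboardType, NSURLPboardType]) else {
        return false
    }
    if type == NSFilenamesPboardType,
       let paths = pboard.propertyList(forType: NSFilenamesPboardType) as? [String] {
        // The sandbox extends read access to each dragged file, like the open panel does.
        return handleFileURLs(paths.map { URL(fileURLWithPath: $0) })
    }
    if type == NSURLPboardType, let url = NSURL(from: pboard) as URL?, url.isFileURL {
        return handleFileURLs([url])
    }
    return false
}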
I'm trying to use an NSFetchedResultsController with an additional NSManagedObjectContext, different from the main MOC I use in my app. Although I'm able to save and retrieve data with that additional MOC, every time I try to create an NSFetchedResultsController it returns an object with nil fetchedObjects.
I'm using MagicalRecord, and this is the code that I'm using:
- (id)init
{
    if (SINGLETON) {
        return SINGLETON;
    }
    if (isFirstAccess) {
        [self doesNotRecognizeSelector:_cmd];
    }
    self = [super init];
    if (self) {
        NSManagedObjectModel *model = [NSManagedObjectModel MR_newManagedObjectModelNamed:@"ConnectorCache.momd"];
        [NSManagedObjectModel MR_setDefaultManagedObjectModel:model];
        storeCoordinator = [NSPersistentStoreCoordinator MR_coordinatorWithAutoMigratingSqliteStoreNamed:@"ConnectorCache.sqlite"];
        cacheContext = [NSManagedObjectContext MR_contextWithStoreCoordinator:storeCoordinator];
        [MagicalRecord setShouldAutoCreateManagedObjectModel:YES];
    }
    return self;
}
and the code for getting the fetchedResultsController is:
- (NSFetchedResultsController *)fetchedResultsController {
    if (_fetchedResultController != nil) {
        return _fetchedResultController;
    }
    NSFetchRequest *request = [Appointment MR_requestAllSortedBy:@"start" ascending:YES inContext:cacheContext];
    _fetchedResultController = [[NSFetchedResultsController alloc] initWithFetchRequest:request managedObjectContext:cacheContext sectionNameKeyPath:@"day" cacheName:nil];
    return _fetchedResultController;
}
If I use the request for fetching the data, it works correctly:
(lldb) po [cacheContext executeFetchRequest:request error:nil];
<_PFBatchFaultingArray 0x7fd9ca576710>(
<Appointment: 0x7fd9ca4c7220> (entity: Appointment; id: 0xd00000000004000a <x-coredata://36BACA76-7C2B-413C-8782-F92BBC7C1AA7/Appointment/p1> ; data: {
attendees = "<relationship fault: 0x7fd9ca518400 'attendees'>";
attendeesOmitted = nil;
created = "2015-03-10 16:41:31 +0000";
creator = nil;
end = "2015-03-18 18:45:00 +0000";
eventDescription = "Doctor Garc\U00eda Villaran";
eventID = "_8gs3ecpi8h0jab9i6op3ib9k6gp3iba18kojiba68d1j8c2260pj4e1o64";
lastUpdated = "2015-03-10 16:42:13 +0000";
location = "Centro M\U00e9dico Quir\U00f3n Sevilla Este, Sevilla, ES";
start = "2015-03-18 17:45:00 +0000";
status = 0;
title = "Cita otorrino";
})
)
I cannot see what I'm missing…
Kind regards
It was my fault; what I missed is the fact that I'm using MagicalRecord, which actually provides a method for getting an NSFetchedResultsController.
NSManagedObject's MR_fetchAll… methods or friends will do the job. (Unlike a bare alloc/init, they also call performFetch: for you, and fetchedObjects stays nil until that fetch happens, which is exactly what I was seeing.) In my case I used:
_fetchedResultController = [Appointment MR_fetchAllSortedBy:@"start"
                                                  ascending:YES
                                              withPredicate:nil
                                                    groupBy:@"day"
                                                   delegate:nil
                                                  inContext:cacheContext];
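For comparison, here's a minimal sketch of the plain Core Data route without MagicalRecord (assuming the same Appointment entity and the cacheContext from above): the step a hand-built controller needs is an explicit performFetch().
import CoreData

// Sketch: a plain NSFetchedResultsController has nil fetchedObjects
// until performFetch() is called.
let request = NSFetchRequest<NSFetchRequestResult>(entityName: "Appointment")
request.sortDescriptors = [NSSortDescriptor(key: "start", ascending: true)]

let frc = NSFetchedResultsController(fetchRequest: request,
                                     managedObjectContext: cacheContext,
                                     sectionNameKeyPath: "day",
                                     cacheName: nil)
do {
    try frc.performFetch()           // without this, fetchedObjects stays nil
    print(frc.fetchedObjects ?? [])  // now populated
} catch {
    print("Fetch failed: \(error)")
}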
I'm converting our app over to use the Photos framework of iOS 8; the ALAsset framework is clearly a second-class citizen under iOS 8.
The problem I'm having is that our architecture really wants an NSURL that represents the location of the media on "disk." We use this to upload the media to our servers for further processing.
This was easy with ALAsset:
ALAssetRepresentation *rep = [asset defaultRepresentation];
self.originalVideo = rep.url;
But I'm just not seeing this ability in PHAsset. I guess I could call:
imageManager.requestImageDataForAsset
and then write the data out to a temp spot in the file system, but that seems awfully heavyweight and wasteful, not to mention potentially slow.
Is there a way to get this, or am I going to have to refactor more of my app to use NSURLs only on iOS 7 and some other method on iOS 8?
If you use [imageManager requestAVAssetForVideo...], it'll return an AVAsset. That AVAsset is actually an AVURLAsset, so if you cast it, you can access its -url property.
I'm not sure if you can create a new asset out of this, but it does give you the location.
SWIFT 2.0 version
This function returns an NSURL for a PHAsset (image or video):
func getAssetUrl(mPhasset: PHAsset, completionHandler: ((responseURL: NSURL?) -> Void)) {
    if mPhasset.mediaType == .Image {
        let options = PHContentEditingInputRequestOptions()
        options.canHandleAdjustmentData = { (adjustmeta: PHAdjustmentData) -> Bool in
            return true
        }
        mPhasset.requestContentEditingInputWithOptions(options, completionHandler: { (contentEditingInput: PHContentEditingInput?, info: [NSObject : AnyObject]) -> Void in
            completionHandler(responseURL: contentEditingInput?.fullSizeImageURL)
        })
    } else if mPhasset.mediaType == .Video {
        let options = PHVideoRequestOptions()
        options.version = .Original
        PHImageManager.defaultManager().requestAVAssetForVideo(mPhasset, options: options, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [NSObject : AnyObject]?) -> Void in
            if let urlAsset = asset as? AVURLAsset {
                completionHandler(responseURL: urlAsset.URL)
            } else {
                completionHandler(responseURL: nil)
            }
        })
    }
}
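Hypothetical call site (someAsset stands in for a PHAsset you've already fetched):
getAssetUrl(someAsset) { responseURL in
    // Upload or inspect the file at responseURL here.
    print(responseURL)
}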
If you have a PHAsset, you can get the url for said asset like this:
[asset requestContentEditingInputWithOptions:editOptions
completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
NSURL *imageURL = contentEditingInput.fullSizeImageURL;
}];
Use the new localIdentifier property of PHObject (PHAsset inherits from this).
It provides similar functionality to an ALAsset URL, namely that you can load assets back by calling the method
+[PHAsset fetchAssetsWithLocalIdentifiers:options:]
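A minimal sketch of that round trip in Swift 2 (savedIdentifier is a placeholder for wherever you persisted the string):
// Store asset.localIdentifier somewhere (defaults, database, ...), then later:
let result = PHAsset.fetchAssetsWithLocalIdentifiers([savedIdentifier], options: nil)
if let sameAsset = result.firstObject as? PHAsset {
    // sameAsset refers to the same photo/video as before.
}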
None of the above solutions works for slow-motion videos. A solution that I found handles all video asset types is this:
func createFileURLFromVideoPHAsset(asset: PHAsset, destinationURL: NSURL) {
    PHCachingImageManager().requestAVAssetForVideo(asset, options: nil) { avAsset, _, _ in
        let exportSession = AVAssetExportSession(asset: avAsset!, presetName: AVAssetExportPresetHighestQuality)!
        exportSession.outputFileType = AVFileTypeMPEG4
        exportSession.outputURL = destinationURL
        exportSession.exportAsynchronouslyWithCompletionHandler {
            guard exportSession.error == nil else {
                print("Error exporting video asset: \(exportSession.error)")
                return
            }
            // It worked! You can find your file at destinationURL.
        }
    }
}
See this answer here.
And this one here.
In my experience you'll need to first export the asset to disk in order to get a fully accessible / reliable URL.
The answers linked above describe how to do this.
Just want to post this hidden gem from a comment by @jlw:
@rishu1992 For slo-mo videos, grab the AVComposition's
AVCompositionTrack (of mediaType AVMediaTypeVideo), grab its first
segment (of type AVCompositionTrackSegment), and then access its
sourceURL property. – jlw Aug 25 '15 at 11:52
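A sketch of what that comment describes (untested; assumes the slow-mo request handed back an AVComposition):
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil) { avAsset, _, _ in
    if let composition = avAsset as? AVComposition {
        if let track = composition.tracksWithMediaType(AVMediaTypeVideo).first {
            if let segment = track.segments.first {
                // sourceURL points at the original recording on disk.
                print(segment.sourceURL)
            }
        }
    }
}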
Speaking of URLs from PHAsset, I once prepared a utility function on Swift 2 (although only for playing videos from a PHAsset). Sharing it in this answer; it might help someone.
static func playVideo(view: UIViewController, asset: PHAsset)
Please check this answer.
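Since only the signature survived here, a rough sketch of what such a helper might look like (not the original implementation; it assumes AVKit's AVPlayerViewController and the Swift 2-era Photos APIs):
static func playVideo(view: UIViewController, asset: PHAsset) {
    guard asset.mediaType == .Video else { return }
    PHCachingImageManager().requestPlayerItemForVideo(asset, options: nil) { playerItem, _ in
        dispatch_async(dispatch_get_main_queue()) {
            // Present a standard player UI for the asset's video.
            let playerVC = AVPlayerViewController()
            playerVC.player = AVPlayer(playerItem: playerItem!)
            view.presentViewController(playerVC, animated: true, completion: nil)
        }
    }
}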
Here's a handy PHAsset category:
@implementation PHAsset (Utils)

- (NSURL *)fileURL {
    __block NSURL *url = nil;
    switch (self.mediaType) {
        case PHAssetMediaTypeImage: {
            PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
            options.synchronous = YES;
            [PHImageManager.defaultManager requestImageDataForAsset:self
                                                            options:options
                                                      resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
                url = info[@"PHImageFileURLKey"];
            }];
            break;
        }
        case PHAssetMediaTypeVideo: {
            dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
            [PHImageManager.defaultManager requestAVAssetForVideo:self
                                                          options:nil
                                                    resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
                if ([asset isKindOfClass:AVURLAsset.class]) {
                    url = [(AVURLAsset *)asset URL];
                }
                dispatch_semaphore_signal(semaphore);
            }];
            dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
            break;
        }
        default:
            break;
    }
    return url;
}

@end
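Note that both branches are deliberately synchronous: the image case sets options.synchronous = YES and the video case blocks on a semaphore until the result handler fires, so this category method is best called off the main thread. Also, PHImageFileURLKey is an undocumented key in the info dictionary, so the image branch may break in future OS versions.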
I had a similar problem with video files. What worked for me was:
NSString *assetID = [asset.localIdentifier substringToIndex:(asset.localIdentifier.length - 7)];
NSURL *videoURL = [NSURL URLWithString:[NSString stringWithFormat:@"assets-library://asset/asset.mov?id=%@&ext=mov", assetID]];
where asset is a PHAsset.
I want to customize the text for the same information, but when I'm sharing it on Facebook I don't want to use the Twitter hashtags or @username scheme.
How can I vary the text for sharing based on which sharing service will be used?
Of course I'm using UIActivityViewController:
UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:@[shareText, shareURL] applicationActivities:nil];
I took this answer and made a simple class out of it. The default message will be seen by sharing outlets other than Twitter; for Twitter, words in the hashWords array will appear with hashes if they are present in the default message. I thought I would share it for anyone else who needs it. Thanks Christopher!
Usage:
TwitterHashActivityItemProvider *twit = [[TwitterHashActivityItemProvider alloc] initWithDefaultText:@"I really like stackoverflow and code"
                                                                                            hashWords:@[@"stackoverflow", @"code"]];
NSArray *items = @[twit];
UIActivityViewController *act = [[UIActivityViewController alloc] initWithActivityItems:items applicationActivities:nil];
Header:
@interface TwitterHashActivityItemProvider : UIActivityItemProvider

- (id)initWithDefaultText:(NSString *)text hashWords:(NSArray *)hashItems;

@property (nonatomic, strong) NSArray *hashItems;

@end
Implementation:
#import "TwitterHashActivityItemProvider.h"

@implementation TwitterHashActivityItemProvider

- (id)initWithDefaultText:(NSString *)text hashWords:(NSArray *)hashItems
{
    self = [super initWithPlaceholderItem:text];
    if (self)
    {
        self.hashItems = hashItems;
    }
    return self;
}

- (id)item
{
    if ([self.placeholderItem isKindOfClass:[NSString class]])
    {
        NSString *outputString = [self.placeholderItem copy];
        // Twitter gets some hashtags!
        if ([self.activityType isEqualToString:UIActivityTypePostToTwitter])
        {
            // Go through each potential hash item and augment the main string.
            for (NSString *hashItem in self.hashItems)
            {
                NSString *hashed = [@"#" stringByAppendingString:hashItem];
                outputString = [outputString stringByReplacingOccurrencesOfString:hashItem withString:hashed];
            }
        }
        return outputString;
    }
    // Else we didn't actually provide a string...oops...just return the placeholder.
    return self.placeholderItem;
}

@end
Instead of passing the text strings into the initWithActivityItems: call, pass in your own subclass of UIActivityItemProvider; when you implement the itemForActivityType method, it will receive the sharing service as the activityType parameter.
You can then return the customized content from this method.
Swift implementation example of a UIActivityItemProvider subclass. The copy option will use only the password; other activity types will use the full share text. It should be easy to customize for different use cases. Credit to Christopher & NickNack for their answers.
class PasswordShareItemsProvider: UIActivityItemProvider {

    private let password: String

    private var shareText: String {
        return "This is my password: " + password
    }

    init(password: String) {
        self.password = password
        // The type of the placeholder item is used by UIActivityViewController
        // to decide which activity types to display.
        super.init(placeholderItem: password)
    }

    override var item: Any {
        guard let activityType = activityType else {
            return shareText
        }
        // Return the desired item depending on the activity type.
        switch activityType {
        case .copyToPasteboard: return password
        default: return shareText
        }
    }
}
Usage:
let itemProvider = PasswordShareItemsProvider(password: password)
let activityViewController = UIActivityViewController(activityItems: [itemProvider], applicationActivities: nil)
Hi all, I have developed an application that searches a table view backed by an SQLite database.
A search bar is added on top of the application and the search is working fine, but when I type, I need only the items STARTING with the typed text to appear.
This is my code so far:
- (void)searchBar:(UISearchBar *)searchBar textDidChange:(NSString *)text
{
    if (text.length == 0)
    {
        isFiltered = FALSE;
    }
    else
    {
        isFiltered = true;
        filteredTableData = [[NSMutableArray alloc] init];
        for (Author *author in theauthors)
        {
            //[NSPredicate predicateWithFormat:@"SELECT * from books where title LIKE %@", searchBar.text];
            NSRange nameRange = [author.name rangeOfString:text options:NSCaseInsensitiveSearch];
            NSRange descriptionRange = [author.genre rangeOfString:text options:NSCaseInsensitiveSearch];
            if (nameRange.location != NSNotFound || descriptionRange.location != NSNotFound)
            {
                [filteredTableData addObject:author];
            }
        }
    }
    [self.tableView reloadData];
}
It was actually pretty simple. I just had to change the search options from NSCaseInsensitiveSearch to NSCaseInsensitiveSearch|NSAnchoredSearch, so the match stays case-insensitive but only succeeds at the start of the string.
It worked like a charm.
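In Swift terms, the equivalent prefix-only check looks like this (a sketch; name and text stand in for the strings being compared):
// Anchored + case-insensitive: matches only when `text` is a prefix of `name`.
let matchesPrefix = name.range(of: text, options: [.caseInsensitive, .anchored]) != nil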