NSDocument isLocked implementation for 10.7? - cocoa

How can I check if a document isLocked in 10.7?
NSDocument has an isLocked method, but it is only available on 10.8 and later.

Here is my implementation:
+ (BOOL)isDocumentLocked:(NSDocument *)doc
{
    if (doc == nil)
    {
        return NO;
    }
    else if ([doc respondsToSelector:@selector(isLocked)]) // 10.8 and later
    {
        return [doc isLocked];
    }
    else // OS X version < 10.8
    {
        NSError *error = nil;
        BOOL isAutosavingSafe = [doc checkAutosavingSafetyAndReturnError:&error];
        if (!isAutosavingSafe)
        {
            return YES;
        }
        if (doc.fileURL == nil)
            return NO; // Unsaved document; nothing on disk to be locked
        NSFileManager *fm = [NSFileManager defaultManager];
        NSString *path = doc.fileURL.absoluteURL.path;
        if (![fm isWritableFileAtPath:path])
            return YES; // No write permission
        NSDictionary *attributes = [fm attributesOfItemAtPath:path error:&error];
        BOOL isLocked = [[attributes objectForKey:NSFileImmutable] boolValue];
        if (isLocked)
        {
            return YES; // Finder "Locked" (immutable) flag is set
        }
    }
    return NO;
}
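
For context, a call site might look like the sketch below. The MyDocumentUtils class name and the UI update are hypothetical, added only to show the intended use:

    // Hypothetical call site: reflect the lock state in the UI.
    NSDocument *doc = [[NSDocumentController sharedDocumentController] currentDocument];
    if ([MyDocumentUtils isDocumentLocked:doc]) {
        // e.g. disable editing controls for this document
    }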

Related

macOS SearchKit CoreService found nothing

I'm working on an app that searches the contents of a lot of files. I planned to use SearchKit, but I can't figure out how to make Apple's sample code work, and I can't find any other resources (the NSHipster code didn't work either). Here's my code:
#define kSearchMax 1000

@interface ViewController ()
@property (nonatomic) SKIndexRef mySKIndex;
@end

@implementation ViewController
@synthesize mySKIndex;

- (void)viewDidLoad {
    [super viewDidLoad];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        [self openIndex];
        [self addDoc];
        SKIndexFlush(self.mySKIndex);
        // I thought the indexing might need some time ...
        sleep(2);
        dispatch_async(dispatch_get_main_queue(), ^{
            [self searchterm:@"var"];
        });
    });
}

- (void)openIndex {
    NSString *path = [[NSHomeDirectory() stringByAppendingPathComponent:@"index"] stringByAppendingPathExtension:@"txt"]; // 1
    NSURL *url = [NSURL fileURLWithPath:path];
    NSString *name = @"extension_index";
    if ([name length] == 0) name = nil;
    SKIndexType type = kSKIndexInverted;
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        mySKIndex = SKIndexOpenWithURL((__bridge CFURLRef)url,
                                       (__bridge CFStringRef)name,
                                       true);
    } else {
        self.mySKIndex = SKIndexCreateWithURL((__bridge CFURLRef)url,
                                              (__bridge CFStringRef)name,
                                              (SKIndexType)type,
                                              (CFDictionaryRef)NULL);
    }
}

- (void)addDoc {
    SKLoadDefaultExtractorPlugIns();
    NSString *path = [NSBundle.mainBundle pathForResource:@"Products" ofType:@"rtf"]; // 1
    NSURL *url = [NSURL fileURLWithPath:path]; // 2
    SKDocumentRef doc = SKDocumentCreateWithURL((__bridge CFURLRef)url);
    NSString *mimeTypeHint = @"text/rtf";
    BOOL added = SKIndexAddDocument((SKIndexRef)mySKIndex,
                                    (SKDocumentRef)doc,
                                    (__bridge CFStringRef)mimeTypeHint,
                                    (Boolean)true);
    NSLog(added ? @"added" : @"not added");
}

- (void)searchterm:(NSString *)query {
    SKSearchOptions options = kSKSearchOptionDefault;
    BOOL more = YES;
    UInt32 totalCount = 0;
    SKSearchRef search = SKSearchCreate(mySKIndex,
                                        (__bridge CFStringRef)query,
                                        options);
    while (more) {
        SKDocumentID foundDocIDs[kSearchMax];
        float foundScores[kSearchMax];
        float *scores;
        Boolean unranked = options & kSKSearchOptionNoRelevanceScores;
        if (unranked) {
            scores = NULL;
        } else {
            scores = foundScores;
        }
        CFIndex foundCount = 0;
        more = SKSearchFindMatches(search,
                                   kSearchMax,
                                   foundDocIDs,
                                   scores,
                                   100,
                                   &foundCount);
        NSLog(@"%@", [NSString stringWithFormat:@"current count = %i", totalCount]);
        totalCount += foundCount;
    }
}

@end
It always prints "current count = 0", and the loop executes only once.

saveInBackgroundWithBlock not calling the block when it finished working

I'm trying to build a simple chat in iOS with Objective-C. When I press the "send" button, I save the object to Parse and then retrieve all the messages.
When the save succeeds, I retrieve all the messages, so the last one (mine) should be retrieved as well, but it never comes back. What should I do?
This is my code:
NSMutableArray *messagesArray;

- (IBAction)sendButtonTapped:(id)sender {
    self.sendButton.enabled = false;
    self.textField.enabled = false;
    PFObject *objectChat = [PFObject objectWithClassName:@"ChatClass"];
    objectChat[@"text"] = self.textField.text;
    [objectChat saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        self.sendButton.enabled = true;
        self.textField.enabled = true;
        if (succeeded) {
            self.textField.text = @"";
            [self retrieveMessages];
            NSLog(@"success!");
        } else {
            NSLog(@"%@", error.description);
        }
    }];
}

- (void)retrieveMessages {
    NSDate *now = [NSDate date];
    PFQuery *query = [PFQuery queryWithClassName:@"ChatClass"];
    [query whereKey:@"createdAt" lessThanOrEqualTo:now];
    [query findObjectsInBackgroundWithBlock:^(NSArray *objects, NSError *error) {
        if (error == nil) {
            messagesArray = [[NSMutableArray alloc] init];
            for (int i = 0; i < [objects count]; i++) {
                [messagesArray addObject:[[objects objectAtIndex:i] objectForKey:@"text"]];
            }
            runOnMainQueueWithoutDeadlocking(^{
                [self.messageTableView reloadData];
            });
        }
    }];
}

void runOnMainQueueWithoutDeadlocking(void (^block)(void))
{
    if ([NSThread isMainThread])
    {
        block();
    }
    else
    {
        dispatch_async(dispatch_get_main_queue(), block);
    }
}
Thank you very much in advance.
Regards.
Perhaps there's enough of a clock difference with the Parse server that the createdAt qualification misses the object you just created. Removing that line should do the trick.
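A minimal sketch of retrieveMessages with that line removed, reusing the question's names; the orderByAscending: call is an assumption added here so the chat displays oldest-first, it is not in the original code:

    - (void)retrieveMessages {
        PFQuery *query = [PFQuery queryWithClassName:@"ChatClass"];
        [query orderByAscending:@"createdAt"]; // assumption: stable oldest-first ordering
        [query findObjectsInBackgroundWithBlock:^(NSArray *objects, NSError *error) {
            if (error == nil) {
                messagesArray = [[NSMutableArray alloc] init];
                for (PFObject *object in objects) {
                    [messagesArray addObject:object[@"text"]];
                }
                runOnMainQueueWithoutDeadlocking(^{
                    [self.messageTableView reloadData];
                });
            }
        }];
    }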

setExposureMode Xcode iOS 8

I am using the code below to capture an image. Everything works fine, but my commands to set the exposure and white balance in setCameraSettings are ignored: they get executed but have no effect. My command to set the session image resolution works fine.
#import "CaptureSessionManager.h"
#import <ImageIO/ImageIO.h>
// based on https://github.com/jj0b/AROverlayImageCapture
#implementation CaptureSessionManager
#synthesize captureSession;
#synthesize previewLayer;
#synthesize stillImageOutput;
#synthesize stillImage;
#synthesize imageWidth;
#synthesize imageHeight;
#synthesize imageBrightnessValue;
#synthesize imageExposureTime;
#synthesize imageApertureValue;
#synthesize imageISOSpeedRatings;
#synthesize playShutterSound;
/*************************************************************************************/
- (id)init {
if ((self = [super init])) {
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// [session beginConfiguration];
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
session.sessionPreset = AVCaptureSessionPresetHigh; // AVCaptureSessionPresetHigh; // AVCaptureSessionPresetLow;
}
// [session commitConfiguration];
[self setCaptureSession:session];
}
return self;
}
/*************************************************************************************/
- (void)addVideoPreviewLayer {
[self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession: [self captureSession]]];
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}
/*************************************************************************************/
- (void)addVideoInputFrontCamera:(BOOL)front {
NSArray *devices = [AVCaptureDevice devices];
AVCaptureDevice *frontCamera;
AVCaptureDevice *backCamera;
for (AVCaptureDevice *device in devices) {
NSLog(#"Device name: %#", [device localizedName]);
if ([device hasMediaType:AVMediaTypeVideo]) {
if ([device position] == AVCaptureDevicePositionBack) {
NSLog(#"Device position : back");
backCamera = device;
}
else {
NSLog(#"Device position : front");
frontCamera = device;
}
}
}
NSError *error = nil;
if (front) {
AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
if (!error) {
if ([[self captureSession] canAddInput:frontFacingCameraDeviceInput]) {
[[self captureSession] addInput:frontFacingCameraDeviceInput];
currentCaptureDevice = frontCamera;
} else {
NSLog(#"Couldn't add front facing video input");
}
}
} else {
AVCaptureDeviceInput *backFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (!error) {
if ([[self captureSession] canAddInput:backFacingCameraDeviceInput]) {
[[self captureSession] addInput:backFacingCameraDeviceInput];
currentCaptureDevice = backCamera;
} else {
NSLog(#"Couldn't add back facing video input");
}
}
}
}
/*************************************************************************************/
- (void)addStillImageOutput
{
[self setStillImageOutput:[[AVCaptureStillImageOutput alloc] init]];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil];
[[self stillImageOutput] setOutputSettings:outputSettings];
[[self captureSession] addOutput:[self stillImageOutput]];
for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
[self setCameraSettings];
return;
}
}
}
}
/*************************************************************************************/
- (void)setCameraSettings:(long)expTime1000thSec iso:(int)isoValue
{
if ( currentCaptureDevice ) {
[captureSession beginConfiguration];
NSError *error = nil;
if ([currentCaptureDevice lockForConfiguration:&error]) {
if ([currentCaptureDevice isExposureModeSupported:AVCaptureExposureModeLocked]) {
CMTime minTime, maxTime, exposureTime;
if ( isoValue < minISO ) {
isoValue = minISO;
} else if ( isoValue > maxISO ) {
isoValue = maxISO;
}
exposureTime = CMTimeMake(expTime1000thSec, EXP_TIME_UNIT); // in 1/EXP_TIME_UNIT of a second
minTime = currentCaptureDevice.activeFormat.minExposureDuration;
maxTime = currentCaptureDevice.activeFormat.maxExposureDuration;
if ( CMTimeCompare(exposureTime, minTime) < 0 ) {
exposureTime = minTime;
} else if ( CMTimeCompare(exposureTime, maxTime) > 0 ) {
exposureTime = maxTime;
}
NSLog(#"setting exp time to %lld/%d s (want %ld) iso=%d", exposureTime.value, exposureTime.timescale, expTime1000thSec, isoValue);
[currentCaptureDevice setExposureModeCustomWithDuration:exposureTime ISO:isoValue completionHandler:nil];
}
if (currentCaptureDevice.lowLightBoostSupported) {
currentCaptureDevice.automaticallyEnablesLowLightBoostWhenAvailable = NO;
NSLog(#"setting automaticallyEnablesLowLightBoostWhenAvailable = NO");
}
// lock the gains
if ([currentCaptureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
currentCaptureDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
NSLog(#"setting AVCaptureWhiteBalanceModeLocked");
}
// set the gains
AVCaptureWhiteBalanceGains gains;
gains.redGain = 1.0;
gains.greenGain = 1.0;
gains.blueGain = 1.0;
AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains];
[currentCaptureDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
NSLog(#"setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains g.red=%.2lf g.green=%.2lf g.blue=%.2lf",
normalizedGains.redGain, normalizedGains.greenGain, normalizedGains.blueGain);
[currentCaptureDevice unlockForConfiguration];
}
[captureSession commitConfiguration];
}
}
/*************************************************************************************/
- (void)captureStillImage
{
NSLog(#"about to request a capture from: %#", [self stillImageOutput]);
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSLog(#"name: %#", [currentCaptureDevice localizedName]);
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self setStillImage:image];
NSDictionary *dict = (__bridge NSDictionary*)exifAttachments;
NSString *value = [dict objectForKey:#"PixelXDimension"];
[self setImageWidth:[NSNumber numberWithInt:[value intValue]]];
NSString *value1 = [dict objectForKey:#"PixelYDimension"];
[self setImageHeight:[NSNumber numberWithInt:[value1 intValue]]];
NSString *value2 = [dict objectForKey:#"BrightnessValue"];
[self setImageBrightnessValue:[NSNumber numberWithFloat:[value2 floatValue]]];
NSString *value3 = [dict objectForKey:#"ExposureTime"];
[self setImageExposureTime:[NSNumber numberWithFloat:[value3 floatValue]]];
NSString *value4 = [dict objectForKey:#"ApertureValue"];
[self setImageApertureValue:[NSNumber numberWithFloat:[value4 floatValue]]];
NSArray *values = [dict objectForKey:#"ISOSpeedRatings"];
[self setImageISOSpeedRatings:[NSNumber numberWithFloat:[ [values objectAtIndex:0] floatValue]]];
// must be at end
[[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
}];
}
/********************************************************************************/
- (void)dealloc {
[[self captureSession] stopRunning];
// [super dealloc];
}
/************************************************************************************/
#end
You need to tell the device that you want to use custom settings, like this:

if ([device isExposureModeSupported:AVCaptureExposureModeCustom])
{
    [device setExposureMode:AVCaptureExposureModeCustom];
    [device setExposureModeCustomWithDuration:exposureTime ISO:exposureISO completionHandler:^(CMTime syncTime) {}];
    [device setExposureTargetBias:exposureBIAS completionHandler:^(CMTime syncTime) {}];
}

You are skipping the setExposureMode: call.
Hope this works.
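Folded into the question's setCameraSettings: method, the exposure part might look like the sketch below. This is just a sketch under the question's assumptions: exposureTime, isoValue, and currentCaptureDevice are the values already computed there, and the lockForConfiguration: wrapper mirrors the question's own code:

    NSError *error = nil;
    if ([currentCaptureDevice lockForConfiguration:&error]) {
        if ([currentCaptureDevice isExposureModeSupported:AVCaptureExposureModeCustom]) {
            // Check for and switch to *custom* exposure, not AVCaptureExposureModeLocked.
            [currentCaptureDevice setExposureMode:AVCaptureExposureModeCustom];
            [currentCaptureDevice setExposureModeCustomWithDuration:exposureTime
                                                                ISO:isoValue
                                                  completionHandler:nil];
        }
        [currentCaptureDevice unlockForConfiguration];
    }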

Default Save location of document in Cocoa document based application

I have created a Cocoa document-based picture drawing application. I want the default location for a new document created with my app in the Save/Save As dialog to be the ~/Pictures/MyAppName/ directory.
How can I achieve this?
I tried more or less what Ole suggested below, but it doesn't work. Here is my implementation of prepareSavePanel. What am I doing wrong?
- (BOOL)prepareSavePanel:(NSSavePanel *)savePanel
{
    if ([self fileURL] == nil) {
        // new, not saved yet
        [savePanel setExtensionHidden:NO];
        // set default save location
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSPicturesDirectory, NSUserDomainMask, YES);
        if ([paths count] > 0) {
            NSString *userPicturesPath = [paths objectAtIndex:0];
            NSString *myDirPath = [userPicturesPath stringByAppendingPathComponent:@"MyAppName"];
            // create directory if it doesn't already exist
            NSFileManager *fileManager = [NSFileManager defaultManager];
            BOOL isDir;
            BOOL useMyAppDir = NO;
            if ([fileManager fileExistsAtPath:myDirPath isDirectory:&isDir]) {
                if (isDir) {
                    useMyAppDir = YES;
                }
            } else {
                // create the directory
                if ([fileManager createDirectoryAtPath:myDirPath withIntermediateDirectories:YES attributes:nil error:nil]) {
                    useMyAppDir = YES;
                }
            }
            if (useMyAppDir) {
                NSURL *myAppDirectoryURL = [NSURL URLWithString:myDirPath];
                [savePanel setDirectoryURL:myAppDirectoryURL];
            }
        }
    } else {
        [savePanel setExtensionHidden:[self fileNameExtensionWasHiddenInLastRunSavePanel]];
    }
    return YES;
}
In your NSDocument subclass, override -prepareSavePanel:
- (BOOL)prepareSavePanel:(NSSavePanel *)savePanel
{
    // Set default folder if no default preference is present
    NSDictionary *userDefaults = [[NSUserDefaults standardUserDefaults] persistentDomainForName:[[NSBundle mainBundle] bundleIdentifier]];
    if ([userDefaults objectForKey:@"NSNavLastRootDirectory"] == nil) {
        NSArray *picturesFolderURLs = [[NSFileManager defaultManager] URLsForDirectory:NSPicturesDirectory inDomains:NSUserDomainMask];
        if ([picturesFolderURLs count] > 0) {
            NSURL *picturesFolderURL = [[picturesFolderURLs objectAtIndex:0] URLByAppendingPathComponent:@"MyAppName"];
            [savePanel setDirectoryURL:picturesFolderURL];
        }
    }
    return YES;
}
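Two details worth checking against the question's own attempt above: setDirectoryURL: expects a file URL, so a plain path must be wrapped with fileURLWithPath: rather than URLWithString: (the latter needs a scheme such as file:// and otherwise produces a URL the panel ignores), and the target folder should exist before the panel is shown. A minimal sketch combining both points:

    // Sketch: make sure the folder exists, then hand the panel a *file* URL.
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSPicturesDirectory, NSUserDomainMask, YES);
    if ([paths count] > 0) {
        NSString *myDirPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"MyAppName"];
        [[NSFileManager defaultManager] createDirectoryAtPath:myDirPath
                                  withIntermediateDirectories:YES
                                                   attributes:nil
                                                        error:NULL];
        [savePanel setDirectoryURL:[NSURL fileURLWithPath:myDirPath isDirectory:YES]];
    }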

I just want to turn the flash on for stitching photos; I'm trying to use AVCaptureDevice and AVCaptureFlashModeOn

- (IBAction)turningFlashOn:(id)sender
{
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
    if (videoInput) {
        [captureSession addInput:videoInput];
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoOutput setSampleBufferDelegate:self queue:dispatch_get_current_queue()];
        [captureSession addOutput:videoOutput];
        [captureSession startRunning];
        videoCaptureDevice.torchMode = AVCaptureFlashModeOn;
    }
}
I am being asked to use lockForConfiguration, but it doesn't work, or maybe I'm using it wrong. Can anyone please tell me what I'm doing wrong?
if ([videoCaptureDevice lockForConfiguration]) {
    [videoCaptureDevice setTorchMode:AVCaptureTorchModeOn];
    [videoCaptureDevice unlockForConfiguration];
}

- (void)flashLightOn {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device hasFlash] == YES) {
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device unlockForConfiguration];
        }
    }
}

- (void)flashLightOff {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device hasFlash] == YES) {
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOff];
            [device unlockForConfiguration];
        }
    }
}
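
Applying the same idea directly inside the question's turningFlashOn: action would look like the sketch below. Two assumptions drawn from the AVFoundation API rather than the original post: torchMode takes the torch constant AVCaptureTorchModeOn (not AVCaptureFlashModeOn, which is a flash setting), and lockForConfiguration: takes an NSError ** argument:

    // Sketch: replaces the bare torchMode assignment in turningFlashOn:.
    NSError *lockError = nil;
    if ([videoCaptureDevice lockForConfiguration:&lockError]) {
        if ([videoCaptureDevice hasTorch]) {
            videoCaptureDevice.torchMode = AVCaptureTorchModeOn; // torch constant, not the flash one
        }
        [videoCaptureDevice unlockForConfiguration];
    }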
