Close and launch with different command line parameters/arguments - macos

I am running a .app and I need to "restart" it, so to speak. Basically I need to tell it to close, and then, after closing, it should relaunch from the path I give it (which points to itself) with some command line arguments. Is this possible with Cocoa? I'm getting stuck at the part where my app is closing or already closed, and then I can't get it back.
My code is in js-ctypes, but here is the Objective-C pseudocode:
default_center = [NSDistributedNotificationCenter defaultCenter];
shared_workspace = [NSWorkspace sharedWorkspace];
notification_center = [[NSWorkspace sharedWorkspace] notificationCenter];
[notification_center addObserver:*** selector:*** name:NSWorkspaceDidLaunchApplicationNotification object:nil];
In my observer, the handler that responds to the completion of the quit has code to launch again. But since my app is already closed by then, it never reaches the observer's handler.
Thanks

You didn't mention any reason that you cannot launch a second instance of your app from the first instance, rather than the chicken & egg approach of trying to restart after you've quit... I have this code in my AppWillTerminate function where I have a situation like yours:
[[NSWorkspace sharedWorkspace] launchApplicationAtURL:appUrl options:NSWorkspaceLaunchNewInstance configuration:nil error:&error];
(To get AppWillTerminate to be called in the first place, I had to call disableSuddenTermination before calling [app quit].)
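For what it's worth, a minimal sketch of how that might look in the app delegate, assuming the relaunch should target your own bundle; the "--relaunched" argument is just a placeholder for whatever command-line arguments you need to pass (via NSWorkspaceLaunchConfigurationArguments):
// Sketch only: relaunch a new instance of this app right before quitting.
// Note: as mentioned above, call [[NSProcessInfo processInfo] disableSuddenTermination]
// earlier (e.g. at launch) so this delegate method actually fires.
- (void)applicationWillTerminate:(NSNotification *)notification {
    NSURL *appUrl = [[NSBundle mainBundle] bundleURL]; // relaunch ourselves
    NSDictionary *config = [NSDictionary dictionaryWithObject:[NSArray arrayWithObject:@"--relaunched"] // placeholder argument
                                                       forKey:NSWorkspaceLaunchConfigurationArguments];
    NSError *error = nil;
    [[NSWorkspace sharedWorkspace] launchApplicationAtURL:appUrl
                                                  options:NSWorkspaceLaunchNewInstance
                                            configuration:config
                                                    error:&error];
    if (error) {
        NSLog(@"Relaunch failed: %@", error);
    }
}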
There's also a flag in the app's Info.plist that controls whether multiple instances are allowed (LSMultipleInstancesProhibited, if I remember right), so check that too.
Also, KNOW THIS: if your app is sandboxed, this will not work UNLESS it is code signed with a Mac App Store identity or with a Developer ID Application identity. And it won't work on OS X 10.7 no matter what, when sandboxed.
METHOD TWO is to create a "Helper App". Your KillerApp goes through the Quit process and, right before it dies, launches "HelperApp", a tiny command-line tool that waits for KillerApp to really die and then relaunches it.
In Xcode, the code for the HelperApp, a "Command Line Tool" target, looks like this:
#import <Cocoa/Cocoa.h>

int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // argv[2] is the PID of the app that launched us; wait for it to really exit
    pid_t parentPID = atoi(argv[2]);
    ProcessSerialNumber psn;
    while (GetProcessForPID(parentPID, &psn) != procNotFound)
        sleep(1);

    // argv[1] is the path of the app to relaunch
    NSString *appPath = [NSString stringWithCString:argv[1] encoding:NSUTF8StringEncoding];
    BOOL success = [[NSWorkspace sharedWorkspace] openFile:[appPath stringByExpandingTildeInPath]];
    if (!success)
        NSLog(@"Error: could not relaunch application at %@", appPath);

    [pool drain];
    return (success) ? 0 : 1;
}
As you can see, you call the HelperApp with a couple parameters from your KillerApp... And in the case where you don't need to be sandboxed, that's about it.
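For illustration, here is roughly how the KillerApp side might launch the helper with those parameters, assuming the helper binary is bundled in the app's Resources folder; the "RelaunchHelper" name is hypothetical:
// Sketch only: launch the helper, passing our own path and PID as arguments.
NSString *helperPath = [[NSBundle mainBundle] pathForResource:@"RelaunchHelper" ofType:nil]; // hypothetical helper name
NSString *appPath = [[NSBundle mainBundle] bundlePath];
NSString *pidString = [NSString stringWithFormat:@"%d", [[NSProcessInfo processInfo] processIdentifier]];
[NSTask launchedTaskWithLaunchPath:helperPath
                         arguments:[NSArray arrayWithObjects:appPath, pidString, nil]];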
If you DO need sandboxing, then it gets more complicated, of course. You need to create a "privileged helper tool", and thank goodness there is sample code for it.
"SMJobBless" is the Apple sample code project which outlines how to do this, but it's kind of weird-- it doesn't actually DO anything. Thankfully, somebody took that project and created "SMJobBlessXPC" from it, which really does finish the job, ( and when you get it working, your KillerApp can actually communicate with your HelperApp ). The downside is that you need to exactly maintain the plist files of the two apps in terms of code signing.

Related

Register for global file drag events in Cocoa

I'm trying to be notified when an OS X user is dragging any file anywhere in OS X, not only in my app.
My current approach was using addGlobalMonitorForEventsMatchingMask:handler: on NSEvent, as follows:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSLeftMouseDraggedMask
                                       handler:^(NSEvent *event) {
    NSPasteboard *pb = [NSPasteboard pasteboardWithName:NSDragPboard];
    NSLog(@"%@", [pb propertyListForType:NSFilenamesPboardType]);
}];
This works partially: the handler is called when I start dragging a file from my desktop or Finder, but it is also called for every other operation that contains a left-mouse drag, e.g. moving a window. The issue is that NSDragPboard still seems to contain the most recently dragged file URL, e.g. when I release the file and start moving a window, which makes it hard to distinguish between these operations.
TL;DR - I am interested in file drag operations system-wide. I do not need any information about the dragged file itself, just the information that a file drag operation has been started or stopped. I would appreciate any hint to a possible solution for this question.
After having talked to Apple DTS, this is most likely a bug. I have filed rdar://25892115 for this issue. There currently seems to be no way to solve my original question with the given API.
To solve my problem, I am now using the Accessibility API to figure out if the item below the cursor is a file (kAXFilenameAttribute is not NULL).
Another workaround is to use the pasteboard's changeCount to tell a real file drag apart from, say, a window move:
NSPasteboard *pb = [NSPasteboard pasteboardWithName:NSDragPboard];
NSArray *filenames = [pb propertyListForType:NSFilenamesPboardType];
NSInteger changeCount = pb.changeCount;
// When moving a window the changeCount does not change, so use it to distinguish.
if (filenames.count > 0 && self.lastChangeCount != changeCount) {
    self.lastChangeCount = changeCount;
    // your code here
}
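For reference, a rough sketch of the Accessibility-API check described in the accepted answer above; it assumes the process has been granted access for assistive devices, and the helper function name is hypothetical:
#import <ApplicationServices/ApplicationServices.h>

// Sketch only: returns YES if the UI element under the given screen point exposes a filename.
static BOOL ElementUnderPointHasFilename(CGPoint point) {
    AXUIElementRef systemWide = AXUIElementCreateSystemWide();
    AXUIElementRef element = NULL;
    BOOL hasFilename = NO;
    if (AXUIElementCopyElementAtPosition(systemWide, point.x, point.y, &element) == kAXErrorSuccess) {
        CFTypeRef filename = NULL;
        if (AXUIElementCopyAttributeValue(element, kAXFilenameAttribute, &filename) == kAXErrorSuccess && filename != NULL) {
            hasFilename = YES;
            CFRelease(filename);
        }
        CFRelease(element);
    }
    CFRelease(systemWide);
    return hasFilename;
}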

Weird Xcode behavior

Lately something weird has been happening to my projects in Xcode. I've been trying to learn a lot of new stuff, and doing so by testing things out in simple Cocoa apps (written by me, from scratch). Sometimes I will get code that doesn't have any error messages, but when I run it, it stops at some breakpoint. I then conclude that I have probably done something wrong and restore the code back to the form it was in before the error, but from then on it is impossible to get the code to run. Even if I restore the code to a state that I am 100% sure has worked before, it just stops at the same breakpoint. In order to fix this problem, I have to copy my code from the class this has happened to, delete the class, make a new one with the exact same name, paste the exact same code back into the class, and voilà, it works again. What on earth is happening? My newest problem code goes like this:
- (IBAction)openFile:(id)sender {
    NSOpenPanel *openPanel = [[NSOpenPanel alloc] init];
    NSURL *fileURL;
    [openPanel setCanChooseFiles:YES];
    [openPanel setCanChooseDirectories:NO];
    [openPanel setAllowsMultipleSelection:NO];
    [openPanel setAllowedFileTypes:[NSArray arrayWithObject:@"txt"]];
    if ([openPanel runModal] == NSOKButton) {
        fileURL = [openPanel URL];
    }
    [openPanel release];
}
I know this code has worked before. It is currently my only method, and it activates when I press Open in the menu. If I delete everything inside the method, so that pressing Open should do nothing, it still stops at a breakpoint inside the method. I have had exactly the same kind of problem before with OpenGL code, and with a method that used C syntax to do file reading. Does anybody know what kind of horrible mistake I'm making over and over again?
A Breakpoint is something you yourself set explicitly to tell Xcode to pause the program at this exact line. Superficially it might look like the program crashed, but in reality it's just waiting for you to tell it to go on.
This page looks like it has a nice explanation of the breakpoint interface in Xcode. (This is from a framework called Cocos2D, but ignore that. You should stick to ordinary Cocoa until you know what you're doing.)

Cocoa - Programmatically adding an application to all spaces

Is there a way to add an application to all Spaces programmatically? I'd like my application to be on all Spaces by default.
The methods you need are in NSWindow.
For Lion use:
- (void)setCollectionBehavior:(NSWindowCollectionBehavior)behavior
For pre-Lion override the following to return YES:
- (BOOL)canBeVisibleOnAllSpaces
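For the pre-Lion case, a minimal sketch of that override, assuming you can point your window at a custom NSWindow subclass (the subclass name is hypothetical):
// Sketch only: pre-Lion, override canBeVisibleOnAllSpaces in an NSWindow subclass.
@interface AllSpacesWindow : NSWindow
@end

@implementation AllSpacesWindow
- (BOOL)canBeVisibleOnAllSpaces {
    return YES;
}
@end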
This piece of code works for me (at least on 10.6.8 in a little project I recently worked on):
- (void)windowDidLoad {
    // Make the window visible on all Spaces
    if ([[self window] respondsToSelector:@selector(setCollectionBehavior:)]) {
        [[self window] setCollectionBehavior:NSWindowCollectionBehaviorCanJoinAllSpaces];
    }
    else if ([[self window] respondsToSelector:@selector(canBeVisibleOnAllSpaces)]) {
        // Note: this only reads the value; to take effect pre-Lion you must
        // override canBeVisibleOnAllSpaces in an NSWindow subclass to return YES.
        [[self window] canBeVisibleOnAllSpaces]; // AVAILABLE_MAC_OS_X_VERSION_10_5_AND_LATER_BUT_DEPRECATED
    }
}
I put this code in a (custom subclass of a) WindowController for the main app window.
Ok. Just setting the workspaces-app-bindings programmatically didn't work. I tried:
1) Verified no entries were in System Preferences->Spaces
2) defaults write com.apple.dock workspaces-app-bindings -dict-add com.apple.mail 65544
3) killall Dock (also needed to kill System Preferences )
4) Opened System Preferences->Spaces to verify the Mail app entry appeared and was set to Every Space
5) Launched Mail, but it was still stuck to Space 1
6) Only when I went back into System Preferences->Spaces and changed the Mail app *from* Every Space and then *back* to Every Space did the Mail app stick to every space
So clearly System Preferences is doing something extra to activate the setting. Does anyone know what this could be? Thanks!
Update: I was able to get this working by using the AppleScript API instead of the user defaults API. The following post tells how to append an entry using AppleScript; then just kill the Dock.
Applescript; opening an app in Space number N
Use the defaults-command that ships with OS X, like so:
defaults write com.apple.dock workspaces-app-bindings -dict-add com.apple.mail 65544
By issuing the above command, you set the application identified by “com.apple.mail” to appear on every space. 65544 is a magic value saying “every space”. If the key-value pair (identifier + settings) exists it will be overwritten.
Note that you have to reload the Dock (killall Dock) and somehow execute these commands from within your application. From Objective-C you can use the following snippet to quit the Dock:
NSArray *docks = [NSRunningApplication runningApplicationsWithBundleIdentifier:@"com.apple.dock"];
NSRunningApplication *dock = [docks count] > 0 ? [docks objectAtIndex:0] : nil;
[dock terminate];
From within AppleScript use the following:
quit application "Dock"
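If you would rather execute that AppleScript from Objective-C than shell out, NSAppleScript can run it; a minimal sketch, assuming manual reference counting as in the rest of this answer:
// Sketch only: run the one-line AppleScript that quits the Dock.
NSAppleScript *quitDock = [[NSAppleScript alloc] initWithSource:@"quit application \"Dock\""];
NSDictionary *errorInfo = nil;
[quitDock executeAndReturnError:&errorInfo];
[quitDock release];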
Your app delegate should look like this...
#import "alwaysOnTopAppDelegate.h"
#implementation alwaysOnTopAppDelegate
#synthesize window;
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
[window setCollectionBehavior:NSWindowCollectionBehaviorCanJoinAllSpaces];
}
#end

Removing app from background while debugging

I am debugging an issue in my app on iOS 4 and above where it doesn't appear to save progress when it's closed. I'm using Xcode 4.0 and running it in the simulator, and when I close the app in the simulator, remove it from the background apps bar, then relaunch it from the simulator, it appears to break on the retVal line below:
int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    int retVal = UIApplicationMain(argc, argv, nil, nil);
    [pool release];
    return retVal;
}
It cites "Thread 1: Program received signal: "SIGKILL" and I'm not quite sure what to make of it (also I'm just minutes new to using Xcode 4).
Can someone explain what's going on here, whether I simply can't debug once I stick an app in background (and/or remove it), or whether this potentially points to my issue with saving progress? I basically trigger the save when my main delegate receives:
- (void)applicationWillTerminate:(UIApplication *)application
You should save in applicationDidEnterBackground:, not applicationWillTerminate:. When an app in the background is closed, it is killed without sending applicationWillTerminate: (this is the SIGKILL you are getting). However, if you are supporting devices or versions without multitasking, you will need to save in applicationWillTerminate: also, since it is used in those circumstances.
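As a rough sketch of where those hooks live in the app delegate (saveProgress is a hypothetical method standing in for whatever persistence you do):
- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Called on multitasking devices when the app moves to the background.
    [self saveProgress]; // hypothetical save routine
}

- (void)applicationWillTerminate:(UIApplication *)application {
    // Still called on devices/OS versions without multitasking support.
    [self saveProgress]; // hypothetical save routine
}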

OS X: Detect system-wide keyDown events?

I'm working on a typing-tutor application for Mac OS X that needs to have keystrokes forwarded to it, even when the application is not in focus.
Is there a way to have the system forward keystrokes to the app, possibly through NSDistributedNotificationCenter? I've googled myself silly, and haven't been able to find an answer...
EDIT: Sample code below.
Thanks @NSGod for pointing me in the right direction -- I ended up adding a global events monitor using the method addGlobalMonitorForEventsMatchingMask:handler:, which works beautifully. For completeness, my implementation looks like this:
// register for keys throughout the device...
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                       handler:^(NSEvent *event) {
    NSString *chars = [[event characters] lowercaseString];
    unichar character = [chars characterAtIndex:0];
    NSLog(@"keydown globally! Which key? This key: %C", character);
}];
For me, the tricky part was using blocks, so I'll give a little description in case it helps anyone:
The thing to notice about the above code is that it's all one single method call on NSEvent. The block is supplied as an argument, directly to the function. You could think of it kind of like an inline delegate method. Just because this took a while to sink in for me, I'm going to work through it step by step here:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
This first bit is no problem. You're calling a class method on NSEvent, and telling it which event you're looking to monitor, in this case NSKeyDownMask. A list of masks for supported event types can be found here.
Now, we come to the tricky part: handler, which expects a block:
handler:^(NSEvent *event){
It took me a few compile errors to get this right, but (thank you Apple) they were very constructive error messages. The first thing to notice is the caret ^. That signals the start of the block. After that, within the parentheses,
NSEvent *event
Which declares the variable that you'll be using within the block to capture the event. You could call it
NSEvent *someCustomNameForAnEvent
doesn't matter, you'll just be using that name within the block. Then, that's just about all there is to it. Make sure to close your curly brace, and bracket to finish the method call:
}];
And you're done! This really is kind of a 'one-liner'. It doesn't matter where you execute this call within your app -- I do it in the AppDelegate's applicationDidFinishLaunching method. Then, within the block, you can call other methods from within your app.
If you are okay with a minimum requirement of OS X 10.6+, and can suffice with "read-only" access to the stream of events, you can install a global event monitor in Cocoa:
Cocoa Event-Handling Guide: Monitoring Events.
If you need to support OS X 10.5 and earlier, and read-only access is okay, and don't mind working with the Carbon Event Manager, you can basically do the Carbon-equivalent using GetEventMonitorTarget(). (You will be hard-pressed to find any (official) documentation on that method though). That API was first available in OS X 10.3, I believe.
If you need read-write access to the event stream, then you will need to look at a slightly lower-level API that is part of ApplicationServices > CoreGraphics: CGEventTapCreate() and friends. This was first available in 10.4.
Note that all 3 methods will require that the user have "Enable access for assistive devices" enabled in the System Preferences > Universal Access preference pane (at least for key events).
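On that note, a small sketch of how an app could check for that access up front; AXIsProcessTrusted() is the relevant Accessibility call, and where (and whether) you surface the warning is up to you:
#import <ApplicationServices/ApplicationServices.h>

// Sketch only: warn if access for assistive devices has not been enabled.
if (!AXIsProcessTrusted()) {
    NSLog(@"Enable access for assistive devices in System Preferences > Universal Access, "
          @"or global key events will not be delivered.");
}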
I'm posting the code that worked for my case.
I'm adding the global event handler after the app launches. My shortcut makes ctrl+alt+cmd+T open my app.
- (void)applicationWillFinishLaunching:(NSNotification *)aNotification
{
    // Register global key handler, passing a block as a callback function
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                           handler:^(NSEvent *event) {
        // Activate app when pressing cmd+ctrl+alt+T
        if ([event modifierFlags] == 1835305 && [[event charactersIgnoringModifiers] compare:@"t"] == 0) {
            [NSApp activateIgnoringOtherApps:YES];
        }
    }];
}
The issue I find with this is that any key registered globally by another app will not be caught... or at least not in my case; perhaps I am doing something wrong.
If your program needs to display all keys, like "Command-Shift-3" for example, then it will not see that go by to display it, since it is taken up by the OS.
Or did someone figure that out? I'd love to know...
As NSGod already pointed out you can also use CoreGraphics.
In your class (e.g. in -init):
CFRunLoopRef runloop = CFRunLoopGetCurrent();
// build a proper CGEventMask for key-down events
CGEventMask interestedEvents = CGEventMaskBit(kCGEventKeyDown);
CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                          0, interestedEvents, myCGEventCallback, self);
// by passing self as the last argument, you can later send events to this class instance
CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault,
                                                          eventTap, 0);
CFRunLoopAddSource(runloop, source, kCFRunLoopCommonModes);
CFRunLoopRun();
Outside of the class, but in the same .m file:
CGEventRef myCGEventCallback(CGEventTapProxy proxy,
                             CGEventType type,
                             CGEventRef event,
                             void *refcon)
{
    if (type == kCGEventKeyDown)
    {
        // we convert our event into plain unicode
        UniChar myUnichar[2];
        UniCharCount actualLength;
        UniCharCount outputLength = 1;
        CGEventKeyboardGetUnicodeString(event, outputLength, &actualLength, myUnichar);

        // do something with the key
        NSLog(@"Character: %C", *myUnichar);
        NSLog(@"Int Value: %i", *myUnichar);

        // you can now also call your class instance with refcon
        [(id)refcon sendUniChar:*myUnichar];
    }
    // send event to next application
    return event;
}
