OS X - App doesn't show up in force quit. How do I fix that?

I'm writing a Cocoa application that installs itself as a menulet in the menu bar (i.e. like the volume or battery icons). When the program crashes, it isn't possible to use the Force Quit dialog, because it doesn't show up in the list. Of course, I can still kill it from the command line, but my users don't know how to do that. Is there any way to fix this, say by making the program show up in the Force Quit dialog?
(Note: the app is Leopard only).

To be honest, the proper solution is to make sure your app never hangs or crashes for users. This should be your #1 priority, rather than figuring out how to let users deal with crashes and hangs. Obviously it isn't always possible to make sure your app never breaks in these ways, but it should definitely be the exception rather than the rule.
On another note, MenuExtras is a private API which I hope you aren't using to create your "menulet". Rather, the public class NSStatusItem (part of Cocoa) is the Apple-approved, recommended way to install icons into your menu bar.
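For reference, a minimal sketch of the NSStatusItem route (the title and menu below are placeholders, not from your app):

NSStatusItem *item = [[NSStatusBar systemStatusBar]
                         statusItemWithLength:NSSquareStatusItemLength];
[item setTitle:@"MyApp"];      // or setImage: with a small menu bar icon
[item setHighlightMode:YES];
[item setMenu:statusMenu];     // statusMenu: an NSMenu you've built elsewhere
[item retain];                 // keep a strong reference; the status bar doesn't retain it for you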

Not really an answer, but hopefully helpful still ...
I think that most people who know how to force quit also know they can kill a process in Activity Monitor. Just make sure it's not named '93AZkZ' or something.

You could provide a PreferencePane for your application that can send the proper signal to it, if you want to allow users an easy way to shut it down or restart it. This is the pattern that MySQL uses on OS X.
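A rough sketch of what that preference-pane action could look like, assuming the menulet writes its pid to a file at launch (the path and method name here are made up):

#include <signal.h>

- (IBAction)stopMenulet:(id)sender
{
    // Read the pid the menulet wrote at startup (hypothetical location).
    NSString *pidString = [NSString stringWithContentsOfFile:@"/tmp/com.example.menulet.pid"
                                                    encoding:NSUTF8StringEncoding
                                                       error:NULL];
    if ([pidString length] > 0)
        kill((pid_t)[pidString intValue], SIGTERM);   // SIGKILL if it's wedged
}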

Related

Intercepting a window's attempt to steal global focus on Windows

I'm a developer and a long-time Windows user with an obsession about making my system as convenient to use as possible.
Yesterday I thought about something that has always annoyed me in Windows and that I've taken for granted, and I realized that I have a better idea for how it could work, and I'm now wondering whether it's possible to tweak Windows to work like that.
The thing that annoys me is when windows steal focus. For example, I could be running an installer for some program. While it's working, I'll switch to my browser and browse, maybe entering some text into an email in my browser. Then suddenly the installer finishes and its window steals the focus. Now I'm in the middle of writing an email, so I might press a key that happens to be bound to a button on that installer, and then that button gets invoked, doing some action that I never intended to happen!
This is doubly annoying to me because I'm using a multiple-desktop program called DexPot, and when a window steals focus, it also brings itself to the desktop I'm currently on, which can be really annoying, because then I have to put it back into its original desktop.
How my ideal solution to this problem would work: Every time a window tries to steal focus, we intercept that, and don't let it. We show something like a toaster message saying "Foobar installer wants focus, press Win-Whatever to switch to it". If and when you press the key combo, it switches to the window.
The question is: Is there an easy way to tweak Windows to make this happen? I know very little about Windows programming. I do know AHK and if it's possible with that, that'd be great.
No, there isn't an easy way to add this behavior, but Windows tries to do this automatically.
In theory apps shouldn't be able to steal the foreground while you're actively using another app. Unfortunately there are some scenarios where Windows can't tell the difference between legitimate user actions that should change the foreground and unwanted foreground theft. The window manager generally tightens up the holes a bit with each new version of Windows, but it also needs to make sure that apps can come to the foreground when the user wants them to, even if that desire is expressed indirectly.
For example, a process launched by the current foreground process is allowed to put a window into the foreground. This is necessary so that when a user launches an application from Explorer, the newly launched process can open its main window. This permission only lasts until the next user input, so if an application is slow to launch and you start working on an email, the app may lose its foreground permission before it can use it.
See the SetForegroundWindow function documentation for a list of requirements for a process to be able to set a window into the foreground.
There are also apps which specifically exploit these rules to steal the permission (by joining the foreground queue or synthesising user input to themselves), but I suspect that in your installer scenario it is accidental.
I'm not sure what exactly is going on, but I suspect that the problem comes from the installer running as a service and accidentally stealing the foreground permission when it tries to launch the app on your current desktop.
It would be theoretically possible for an external process to hook into the foreground system to override this and show your confirmation toast, but it would be tricky to get right and would require significant low-level code (I'd probably start with a CBT hook via SetWindowsHookEx). It would not be possible in a scripting package like AHK (assuming you mean AutoHotkey); it would need native C/C++ code injected into every running process.

How to disable the Help key and context sensitive help mode in OSX, especially with Qt

I have a cross platform Qt application that's running into some trouble in OSX. There's a feature that OSX has that I didn't even know existed - the 'Help' key. My MBP doesn't have one, and neither does my Apple wired keyboard purchased a year ago. It seems that this is mostly something that older Macs have. Apparently it generates the same scan code as the Insert key on PC keyboards.
Anyway, when the Help key is pressed, the cursor over our application (or any application that receives the Help key event) turns into a little question mark. This seems to be part of what's called 'context-sensitive help mode', as documented in NSHelpManager's setContextHelpModeActive: method and NSApplication's activateContextHelpMode: method docs. From the docs:
In this mode, the cursor becomes a question mark, and help appears for any user interface item the user clicks.
Most applications don't use this method. Instead, applications enter context-sensitive mode when the user presses the Help key. Applications exit context-sensitive help mode upon the first event after a help window is displayed.
How many Cocoa developers actually know about this? I'm assuming that clicking on something in the application with this question mark cursor should do something like bring up a help message, but I haven't found a single Cocoa application where it actually does anything at all - not even Apple's apps do anything. In fact, it even seems to put a lot of applications into a strange mode where the cursor text selection is enabled.
The problem is that when we change the application cursor programmatically in Qt when we're in this help-question-cursor-mode, bad things happen. Specifically, our application actually crashes. The crash happens deep inside Cocoa in the NSApplication's NSHelpManager. I'd like to find out why we're seeing this crash, but I'm actually more interested in how we can suppress this 'help' mode. There's nothing in Qt or Cocoa that I can see that would stop it, other than perhaps intercepting and squashing an event, which I haven't tried yet.
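For what it's worth, the event-squashing idea would probably look something like the sketch below on the Cocoa side (untested; the class name is made up, and in a Qt app you'd still have to get this subclass installed as the principal class via NSPrincipalClass in Info.plist):

#import <Cocoa/Cocoa.h>

@interface NoHelpApplication : NSApplication
@end

@implementation NoHelpApplication
- (void)sendEvent:(NSEvent *)event
{
    // 0x72 is the Help key's virtual key code (kVK_Help in Carbon's Events.h).
    // Dropping the event keeps AppKit from entering context-sensitive help mode.
    if (([event type] == NSKeyDown || [event type] == NSKeyUp) && [event keyCode] == 0x72)
        return;
    [super sendEvent:event];
}
@end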
Does anyone know any more about this?

Start and manage an Application.app's view from another cocoa app?

I love the Mac's Terminal.app, but I would love to add some further enhancements, like split views, terminal sets, etc.
I tried to rebuild Terminal.app with an NSTask/pseudo-TTY approach; it basically works, but it just doesn't feel and behave like the beloved Terminal.app itself. There's also no need to reinvent the wheel, I think.
So is there any way to start a Cocoa application (B) from another Cocoa application (A) and manage B's window or view from A? For example, a ManageTerminals.app that starts six Terminals and lays their views out full-screen in a grid, with every instance being a fully working Terminal.app?
I found SIMBL, which apparently allows you to do something like that; at least the website says so. But there is no manual or documentation available.
Does anybody have an idea how to accomplish this? I don't want to change the app itself; I just want to manage the size and appearance of its window/view on the screen.
Thanks for any ideas or concepts!
-- EDIT
I have now tried Apple's Scripting Bridge, which almost does the job. There's just one last little step missing that might be a show-stopper. Right now I have the following:
terminal = [SBApplication applicationWithBundleIdentifier:@"com.apple.Terminal"];
[terminal activate];
if ([terminal isRunning]) {
    TerminalWindow *terminalWindow = [[[terminal windows] get] objectAtIndex:0];
    view = (NSView *)[terminalWindow contentView];
}
Of course it's giving me an unrecognized selector, because there's no method to retrieve the view from the terminalWindow in the Terminal header. But if that was possible I could create x instances of my application and replug the view of the terminals to an own window that manages only the views.
Does someone know how to accomplish this, or do you think it's totally encapsulated away?
You can launch an application with [[NSWorkspace sharedWorkspace] launchApplication:@"iChat"]. However, you can't manage its views; you're only allowed to change the window's frame. AppleScript might help you out here. I've never used SIMBL before, but there's a wiki page: code.google.com/p/simbl/w/list
You should probably take a look at iTerm which is an open-source terminal emulator for Mac OS X. You might be able to modify it to your needs, or at least see how a terminal emulator works via Cocoa.
Otherwise, you can use the Accessibility framework to control the position of other apps' windows. The user has to specifically allow this via the "Allow access for assistive devices" preference in the Accessibility pane of System Preferences.
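A rough sketch of the Accessibility approach (the helper name is made up, and how you obtain the target's pid is up to you):

#import <ApplicationServices/ApplicationServices.h>

// Move another application's focused window. Fails silently unless
// access for assistive devices is enabled in System Preferences.
static void MoveFocusedWindowOfApp(pid_t pid, CGPoint newOrigin)
{
    AXUIElementRef app = AXUIElementCreateApplication(pid);
    AXUIElementRef window = NULL;
    if (AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute,
                                      (CFTypeRef *)&window) == kAXErrorSuccess) {
        AXValueRef position = AXValueCreate(kAXValueCGPointType, &newOrigin);
        AXUIElementSetAttributeValue(window, kAXPositionAttribute, position);
        CFRelease(position);
        CFRelease(window);
    }
    CFRelease(app);
}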
Doing much more than that becomes more complex. Apple Events/AppleScript may give you the tools you need. I know Terminal has an AppleScript interface but I'm not sure how complete it is. I really don't recommend using SIMBL. This does allow you to inject your code into another app's memory space but since you would need to reverse engineer the other app you cannot guarantee stability.

Create a Program that Sits in The Windows Taskbar and, When Activated, Stops the Screensaver From Starting

I don't really know where to begin. Let's start with the stupid questions:
What language should I use for this? What is suited for the task at hand?
Next, the real ones:
Is there a way to stop the screensaver from starting, short of changing the cursor position? If not, will changing the cursor position even work?
SetThreadExecutionState will prevent the screensaver from coming on or the machine from automatically going to sleep if you pass the ES_CONTINUOUS and ES_DISPLAY_REQUIRED flags.
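A minimal sketch of that call (standard Win32 usage; wire it up to your tray icon however you like):

#include <windows.h>

// While enabled, the display is marked as required, so the screensaver and
// display sleep are suppressed; ES_CONTINUOUS makes the state stick until cleared.
void SetScreensaverBlocked(BOOL blocked)
{
    if (blocked)
        SetThreadExecutionState(ES_CONTINUOUS | ES_DISPLAY_REQUIRED);
    else
        SetThreadExecutionState(ES_CONTINUOUS);   /* restore normal idle behaviour */
}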
I wrote an app a while ago that does exactly what you are asking for. It runs as an icon in the system tray, not the taskbar, and uses a global message hook to stop the WM_SYSCOMMAND/SC_SCREENSAVE notification from reaching any application. If that notification never reaches the DefWindowProc() function, the screen saver will never run.
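For a single window (not the global hook described above), the same idea looks roughly like this; a sketch, not the actual app's code:

#include <windows.h>

// Swallow SC_SCREENSAVE before DefWindowProc() sees it. The low four bits of
// wParam are reserved by the system, so mask them off before comparing.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_SYSCOMMAND && (wParam & 0xFFF0) == SC_SCREENSAVE)
        return 0;   /* handled: the screensaver request goes nowhere */
    return DefWindowProc(hwnd, msg, wParam, lParam);
}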
Your program does not need to be visible in the task bar at all.
You don't even need a program at all, if you can disable the screensaver in the registry.
What you want to do can perhaps be achieved by sending a MOUSE_MOVE event to the desktop window. If you want to use C# (the only language I am current with right now), you can look at this article, but maybe a simple C program using the WinAPI is better suited for this task.
.NET will easily allow you to put an application in the system tray (check out the NotifyIcon class in System.Windows.Forms).
I believe you can use the SetCursorPos (http://msdn.microsoft.com/en-us/library/ms648394(VS.85).aspx) API call to prevent the screen saver; just make sure you set it to the current position so you don't actually move the mouse.

How can I post a Cocoa "sheet" on another program's window?

Using the Apple OS X Cocoa framework, how can I post a sheet (slide-down modal dialog) on the window of another process?
Edit: Clarified a bit:
My application is a Finder extension to do Subversion version control (http://scplugin.tigris.org/). Part of my application is a plug-in (a contextual menu item for Finder); the bulk of my application, however, is in a separate daemon process. For several reasons, we've chosen to put virtually all the code into the daemon; the plug-in only defines the menu itself and sends Apple events over to the daemon.
Sometimes, the daemon needs to prompt the user for further information. It can toss a window on-screen for this, but that's disruptive (randomly positioned), and it seems to me the work flow here is legitimately modal, for example "select a file, pick 'commit' from the menu, provide commit comments, do the operation."
Interprocess cooperation (such as passing a reference of some kind) is acceptable: both processes are mine, but I want to avoid binding the sheet's code into the primary process.
Really, it sounds like you're trying to have your inter-process communication happen at the view level, which isn't really how Cocoa generally works. Things will be much easier if you separate your layers a bit more than that.
Why don't you want to put the sheet code into the other process? It's view code, and view code is inherently process-specific. The right thing to do here is probably to add somewhat generic modal-sheet support to your plugin code, and an IPC call that your daemon can make to summon that code. Trying to ship view objects over to the remote process is going to be nightmarish if you can make it work at all.
You're fighting the frameworks with this approach.
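If you do go with the "generic sheet support in the plugin" suggestion, the mechanics are just the standard sheet API; a rough sketch (commitSheet and parentWindow are placeholders, and getting hold of a suitable parent window from inside a Finder plug-in is the hard part):

- (void)showCommitSheetForWindow:(NSWindow *)parentWindow
{
    [NSApp beginSheet:commitSheet            // an NSWindow/NSPanel loaded from a nib
       modalForWindow:parentWindow
        modalDelegate:self
       didEndSelector:@selector(commitSheetDidEnd:returnCode:contextInfo:)
          contextInfo:NULL];
}

- (void)commitSheetDidEnd:(NSWindow *)sheet returnCode:(NSInteger)code contextInfo:(void *)info
{
    [sheet orderOut:self];
    // hand the collected commit message back to the daemon here (IPC of your choice)
}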
You can't add a sheet to a window in another process, because you have only the most restricted access to the windows of another process.
Please don't do this. Make the interaction nonmodal if at all possible. Especially in something like a commit, it's much nicer to be able to browse around your files while you're writing commit comments.
OS X does have window groups, but I don't think they can (easily) span applications.
Another thing to consider is that in OS X it's possible to have many Finder windows open on the same folder (unlike in OS 9). Even if you did have sufficient privileges/APIs to add a sheet to a Finder window, it's not like the modality of that window would prevent the user from being able to continue working with the files.
(My personal opinion as a long-time Mac user is that this kind of interaction would drive me right up the wall.)
