Intercepting a click on Send in Mail.app - Cocoa

Is it possible to intercept a click on Send in the Mac OS X Mail application?
Thanks

You can, but it won't necessarily be easy. You should be able to inject some code into the program (see for example SIMBL and F-Script Anywhere) and then change the button's target.
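For illustration, once your injected code can reach the compose window's Send control (finding it depends on Mail's private view hierarchy and may break between versions), retargeting it takes only a few lines. sendButton, myInterceptor, and interceptedSend: below are hypothetical names, not Mail API:

// Hypothetical sketch of retargeting the Send control after code injection (e.g. via SIMBL).
NSButton *sendButton = nil;               // locate Mail's Send control yourself
id originalTarget = [sendButton target];  // remember the original behavior
SEL originalAction = [sendButton action];
[sendButton setTarget:myInterceptor];                 // your injected object
[sendButton setAction:@selector(interceptedSend:)];   // your own action
// In -interceptedSend:, do your work, then forward the click:
// [NSApp sendAction:originalAction to:originalTarget from:sender];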

Basically, it can be done using the Accessibility APIs. I listen to mouse click events in my application, so when I get a click I check whether Mail.app is the frontmost application and whether the click came from the Send button UIElement.
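Roughly, that check can look like the sketch below. The tap or monitor that delivers the click is omitted, error handling is minimal, and matching the title "Send" assumes an English localization:

#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

// Sketch: given a global click location (e.g. from a CGEventTap), decide
// whether it landed on Mail's Send button.
static BOOL ClickHitMailSendButton(CGPoint clickLocation) {
    // frontmostApplication requires 10.7+; use -activeApplication on older systems.
    NSRunningApplication *front = [[NSWorkspace sharedWorkspace] frontmostApplication];
    if (![[front bundleIdentifier] isEqualToString:@"com.apple.mail"])
        return NO;

    AXUIElementRef systemWide = AXUIElementCreateSystemWide();
    AXUIElementRef element = NULL;
    BOOL hit = NO;
    if (AXUIElementCopyElementAtPosition(systemWide, clickLocation.x, clickLocation.y,
                                         &element) == kAXErrorSuccess && element) {
        CFStringRef role = NULL, title = NULL;
        AXUIElementCopyAttributeValue(element, kAXRoleAttribute, (CFTypeRef *)&role);
        AXUIElementCopyAttributeValue(element, kAXTitleAttribute, (CFTypeRef *)&title);
        hit = role && title && CFEqual(role, kAXButtonRole) && CFEqual(title, CFSTR("Send"));
        if (role) CFRelease(role);
        if (title) CFRelease(title);
        CFRelease(element);
    }
    CFRelease(systemWide);
    return hit;
}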

Possible to tell if NSApplicationActivationPolicyProhibited application is active?

Using JUCE with TUIO, I'm developing a multi-touch utility to send "hot key" commands to other applications (I am using a USB touch frame that sends TUIO messages). For instance, I provide an interface through which users can touch-and-hold to program a key combo and then tap that button to send the programmed key combo to another app. The way I accomplish this on OS X is by running my utility as a "background only" application (NSApplicationActivationPolicyProhibited). I use [NSWindow setCanHide:NO] so the GUI of my utility is visible even though it runs as a background app.
It works well except when a window from another application is on top of mine. What happens is that touches get passed through that other app into mine, causing unintentional button presses in my app. Normally, I could have my app only listen to the TUIO touch callback whenever it is the active application, [NSApp isActive]. But since my app is background only, it is never active, and I have no way to tell if another window is covering it to prevent touches.
So, is there any way for a "background only" app to be able to tell if it is on top of all other windows? Or, is there a way from within my app to get a list of all Cocoa windows from other applications and be able to tell if they are appearing on top of my "background only" app?
Also, does anyone know how I would go about all of the above in Windows? In other words, what is the Windows equivalent of NSApplicationActivationPolicyProhibited and would I be able to tell if it is covered by other applications' windows?
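For reference, the background-only setup described above boils down to something like this (a sketch; the window frame and style mask are placeholders):

// Sketch of a background-only app whose window stays visible.
[NSApp setActivationPolicy:NSApplicationActivationPolicyProhibited];

NSWindow *touchWindow = [[NSWindow alloc]
    initWithContentRect:NSMakeRect(100, 100, 400, 300)
              styleMask:NSBorderlessWindowMask
                backing:NSBackingStoreBuffered
                  defer:NO];
[touchWindow setCanHide:NO];      // keep the GUI visible although the app never activates
[touchWindow orderFront:nil];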

Can I know what called applicationShouldHandleReopen?

I'm looking to differentiate a Dock click from a click on the app icon in the Finder.
Can I know what called applicationShouldHandleReopen, or is there another way to do it?
applicationShouldHandleReopen:hasVisibleWindows: is sent to the application's delegate. Delegate messages are normally sent by the delegating object, which in this case would be the application object.
The application object sends that message to its delegate in order to handle the reopen-application Apple Event. So, to find the sender, install your own Apple Event handler for that event and get the sender from the event. (The sample code is in Pascal and uses Apple Event Manager, but you can translate it to Objective-C and NSAppleEventDescriptor.)
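For illustration, a rough Objective-C sketch of installing such a handler; reading keySenderPIDAttr is an assumption about which attribute of the event you want, so adapt it to your needs:

// Install a handler for the reopen-application Apple event and inspect the sender.
- (void)applicationWillFinishLaunching:(NSNotification *)notification {
    [[NSAppleEventManager sharedAppleEventManager]
        setEventHandler:self
            andSelector:@selector(handleReopen:withReplyEvent:)
          forEventClass:kCoreEventClass
             andEventID:kAEReopenApplication];
}

- (void)handleReopen:(NSAppleEventDescriptor *)event
      withReplyEvent:(NSAppleEventDescriptor *)reply {
    NSAppleEventDescriptor *senderPID =
        [event attributeDescriptorForKeyword:keySenderPIDAttr];
    NSLog(@"Reopen event from pid %d", [senderPID int32Value]);
}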
That said, what you're doing is very dubious from a UI perspective. Reopening is meant to do the same thing no matter which application is reopening you—and it is not limited to the Finder or the Dock. In the common case, it is literally the user trying to launch the app while it is already open.
It may make more sense to only perform your “reopen” behavior when no windows are open. Cocoa's built-in document-based-app support does this automatically; if you don't respond to applicationShouldHandleReopen:hasVisibleWindows: or you return YES, the application tries to open a new document. You can perform the same check (the method even tells you whether you have any windows open) and perform your desired behavior under the same condition.

Cocoa: Report click on dock icon

Is there a way to report every mouse click on the application dock icon?
Not completely safe (it is also triggered by a double-click on the application itself),
but definitely the easiest way to implement:
- (BOOL)applicationShouldHandleReopen:(NSApplication *)theApplication hasVisibleWindows:(BOOL)flag
Quote from NSApplicationDelegate Protocol Reference:
These events are sent whenever the Finder reactivates an already running application because someone double-clicked it again or used the dock to activate it.
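For illustration, a minimal implementation in the application delegate (the log call stands in for whatever reporting you need):

- (BOOL)applicationShouldHandleReopen:(NSApplication *)theApplication
                    hasVisibleWindows:(BOOL)flag
{
    // Called when the running app is reactivated from the dock or the Finder.
    NSLog(@"Dock (or application) icon clicked; visible windows: %d", flag);
    return YES;   // keep the default reopen behavior
}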
I would like to suggest an alternative to the answer provided by Anne, one that avoids conflicting with the case in which the user double-clicks the application icon instead of the dock icon.
Thus, I suggest using
- (BOOL)applicationShouldOpenUntitledFile:(NSApplication *)sender;
See also Apple's documentation.
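For illustration, a minimal sketch of that delegate method; returning NO is just one choice and suppresses the untitled document:

- (BOOL)applicationShouldOpenUntitledFile:(NSApplication *)sender
{
    // Reached when the app is asked to open an untitled file, e.g. after a dock-icon click.
    NSLog(@"Dock icon clicked");
    return NO;
}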

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
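For example, sending AXPress to a button element you have already located looks roughly like this (element discovery and most error handling omitted):

// buttonElement would come from walking the target app's accessibility hierarchy,
// e.g. via kAXChildrenAttribute.
AXUIElementRef buttonElement = NULL;
AXError err = AXUIElementPerformAction(buttonElement, kAXPressAction);
if (err != kAXErrorSuccess) {
    NSLog(@"AXPress failed: %d", err);
}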
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
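A rough sketch of that approach follows; the target process, the forged click, and its coordinates are placeholders, and the callback only ever runs while that process is already handling an event:

// Callback for a per-process tap created with CGEventTapCreateForPSN().
// It forges a mouse-down and posts it at the tap point; a matching mouse-up
// would normally follow.
static CGEventRef ForgingTapCallback(CGEventTapProxy proxy, CGEventType type,
                                     CGEventRef event, void *refcon)
{
    CGEventRef click = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                               CGPointMake(100.0, 100.0),
                                               kCGMouseButtonLeft);
    CGEventTapPostEvent(proxy, click);
    CFRelease(click);
    return event;   // pass the original event through unchanged
}
// Install with CGEventTapCreateForPSN(&targetPSN, kCGHeadInsertEventTap,
//     kCGEventTapOptionDefault, CGEventMaskBit(kCGEventLeftMouseDown),
//     ForgingTapCallback, NULL) and add the resulting CFMachPortRef to your run loop.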

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that there are only keyboard events when my application is the active application while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on CocoaDev.
Register an event tap with CGEventTapCreate (a rough sketch follows below). Sample code can be found in a thread on the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
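For reference, a minimal sketch of the second option, a listen-only tap for key-down events (printing the keycode is a placeholder for your own handling):

static CGEventRef KeyTapCallback(CGEventTapProxy proxy, CGEventType type,
                                 CGEventRef event, void *refcon)
{
    if (type == kCGEventKeyDown) {
        int64_t keycode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        NSLog(@"key down: %lld", keycode);
    }
    return event;
}

// Somewhere during startup:
CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                     kCGEventTapOptionListenOnly,
                                     CGEventMaskBit(kCGEventKeyDown),
                                     KeyTapCallback, NULL);
CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);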
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
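If you create the panel in code rather than Interface Builder, the same checkbox corresponds to the NSNonactivatingPanelMask style-mask bit; a minimal sketch (frame and style are placeholders):

NSPanel *panel = [[NSPanel alloc]
    initWithContentRect:NSMakeRect(200, 200, 300, 120)
              styleMask:(NSTitledWindowMask | NSNonactivatingPanelMask)
                backing:NSBackingStoreBuffered
                  defer:NO];
[panel setFloatingPanel:YES];            // float above normal windows
[panel setBecomesKeyOnlyIfNeeded:YES];   // don't steal key status unless needed
[panel orderFront:nil];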
Update:
Apple actually seems to have changed everything again starting with 10.5 (I recently upgraded, and my sample code no longer worked as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of the level at which you plan to capture and regardless of whether you created the tap to actively capture (which allows you to modify and even discard events) or to listen only. You can still be told when modifier flags change (and can even change them) and receive other events, but key-down events under no other circumstances.
However, using a Carbon event handler and the RegisterEventHotKey() function allows you to register a hotkey, and you'll get notified when it is pressed; you neither need to be root for that, nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
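For illustration, a minimal sketch of the RegisterEventHotKey() approach; the Cmd-Shift-Space combination and the 'htk1' signature are arbitrary placeholders:

#import <Cocoa/Cocoa.h>
#import <Carbon/Carbon.h>

// Carbon handler invoked whenever the registered hotkey is pressed.
static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler,
                              EventRef event, void *userData)
{
    NSLog(@"hotkey pressed");
    return noErr;
}

static void InstallMyHotKey(void)
{
    EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(&HotKeyHandler, 1, &spec, NULL, NULL);

    EventHotKeyID hotKeyID = { 'htk1', 1 };
    EventHotKeyRef hotKeyRef;
    RegisterEventHotKey(kVK_Space, cmdKey | shiftKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);
}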
