I'm looking to differentiate a Dock click from a click on the app icon in the Finder.
Can I find out what called applicationShouldHandleReopen, or is there another way to do it?
applicationShouldHandleReopen:hasVisibleWindows: is sent to the application's delegate. Delegate messages are normally sent by the delegating object, which in this case would be the application object.
The application object sends that message to its delegate in order to handle the reopen-application Apple Event. So, to find the sender, install your own Apple Event handler for that event and get the sender from the event. (The sample code is in Pascal and uses Apple Event Manager, but you can translate it to Objective-C and NSAppleEventDescriptor.)
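Translated into those terms, a minimal sketch (assuming the handler is installed from the app delegate; note that registering your own handler for kAEReopenApplication will likely replace the one NSApplication installs, so the applicationShouldHandleReopen:hasVisibleWindows: delegate message may no longer arrive):

    // In the app delegate. kCoreEventClass/kAEReopenApplication come from the
    // Apple Event headers; NSRunningApplication requires 10.6 or later.
    #import <Cocoa/Cocoa.h>
    #import <Carbon/Carbon.h>

    - (void)applicationWillFinishLaunching:(NSNotification *)notification {
        [[NSAppleEventManager sharedAppleEventManager]
            setEventHandler:self
                andSelector:@selector(handleReopen:withReplyEvent:)
              forEventClass:kCoreEventClass
                 andEventID:kAEReopenApplication];
    }

    - (void)handleReopen:(NSAppleEventDescriptor *)event
          withReplyEvent:(NSAppleEventDescriptor *)reply {
        // The sending process's PID travels as an attribute of the event.
        pid_t senderPID = [[event attributeDescriptorForKeyword:keySenderPIDAttr]
            int32Value];
        NSRunningApplication *sender = [NSRunningApplication
            runningApplicationWithProcessIdentifier:senderPID];
        NSLog(@"Reopen requested by %@", [sender bundleIdentifier]);
    }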
That said, what you're doing is very dubious from a UI perspective. Reopening is meant to do the same thing no matter which application is reopening you—and it is not limited to the Finder or the Dock. In the common case, it is literally the user trying to launch the app while it is already open.
It may make more sense to perform your “reopen” behavior only when no windows are open. Cocoa's built-in support for document-based apps does this automatically: if you don't respond to applicationShouldHandleReopen:hasVisibleWindows:, or you do and return YES, the application tries to open a new document. You can perform the same check (the message even tells you whether any windows are visible) and perform your desired behavior under the same condition.
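For example, a minimal sketch of that check in the delegate (`mainWindow` is a hypothetical outlet to your main window):

    - (BOOL)applicationShouldHandleReopen:(NSApplication *)sender
                        hasVisibleWindows:(BOOL)hasVisibleWindows {
        if (!hasVisibleWindows) {
            [self.mainWindow makeKeyAndOrderFront:self];
        }
        return NO; // we've handled the reopen ourselves
    }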
On Windows, drag and drop can be done via the COM DoDragDrop API; see http://msdn.microsoft.com/zh-cn/library/windows/desktop/ms678486%28v=vs.85%29.aspx. It performs D&D operations perfectly and has the best system integration.
Recently, I found that it is not a touch-friendly API: it cannot handle touch events very well. On Windows 7/8, a Win32 window created by the CreateWindow API can handle touch events the same way it handles mouse events. Actually, it seems touch events are converted into similar mouse events, e.g. a mouse-down event is triggered when a finger taps down, and a mouse-move event is triggered when a finger moves.
However, the DoDragDrop COM API doesn't convert touch events into mouse events; the COM service doesn't seem to know about touch events at all. Yet when I tried dragging a file from one folder to another on Win8, it worked. If that D&D operation is also implemented on top of the same COM API, that's a contradictory result.
Did I miss something when using DoDragDrop for touch support? Thanks.
DoDragDrop() does support touch on Win7/8 (and yes, D&D of files is implemented by Windows Explorer using DoDragDrop()), so your problem is related to something else. Did you check whether DoDragDrop() is returning an error code that your code may be ignoring?
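For example, a sketch of that check in plain C (`pDataObject` and `pDropSource` stand in for whatever your code already creates):

    // DRAGDROP_S_DROP and DRAGDROP_S_CANCEL are success codes; anything
    // FAILED() points to a real problem, often in the IDataObject or
    // IDropSource implementation being passed in.
    DWORD effect = DROPEFFECT_NONE;
    HRESULT hr = DoDragDrop(pDataObject, pDropSource,
                            DROPEFFECT_COPY | DROPEFFECT_MOVE, &effect);
    if (FAILED(hr)) {
        // Log hr here instead of discarding it.
    }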
I'm writing an application with a main window that's displayed when the app starts. When the window is closed, I'd like the app to remain running (with a menu-bar menu), and if the user clicks on the dock icon again, I'd like the main window to be presented again.
I'm about 90% of the way there: my app properly keeps running after the main window is closed with Cmd-W, and since "Release When Closed" is unchecked, the window could be [makeKeyAndOrderFront:]-ed to show it again when the dock icon is clicked.
The only missing piece of this puzzle is intercepting the actual dock-icon click.
The other threads about this topic recommend implementing either applicationShouldHandleReopen:hasVisibleWindows: or applicationShouldOpenUntitledFile: in the window controller. I've done both, and neither one ever gets called.
Any other ideas?
The other threads about this topic recommend implementing either applicationShouldHandleReopen:hasVisibleWindows: or applicationShouldOpenUntitledFile: in the window controller.
That's only true if the window controller is the application's delegate. That is the object to which the application sends those messages.
I would not make a window controller the application's delegate, though. I typically make them two separate objects. Make one object specifically to be the application's delegate, and when that object receives the relevant delegate messages, send a message to your window controller telling it to do whatever it needs to do.
Actually, what I usually do in single-window apps is make the application's delegate create and own the window controller. You can respond to window closure by throwing away the WC, and respond to reopen by checking whether you have a WC and creating one (and thereby reopening the window) if you don't.
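A sketch of that pattern, assuming ARC, a hypothetical MyWindowController class, and a strong `windowController` property on the delegate:

    // Create the window controller lazily; throw it away when its window closes.
    - (void)showMainWindow {
        if (!self.windowController) {
            self.windowController = [[MyWindowController alloc]
                initWithWindowNibName:@"MainWindow"];
            [[NSNotificationCenter defaultCenter]
                addObserver:self
                   selector:@selector(mainWindowWillClose:)
                       name:NSWindowWillCloseNotification
                     object:self.windowController.window];
        }
        [self.windowController showWindow:self];
    }

    - (void)mainWindowWillClose:(NSNotification *)notification {
        [[NSNotificationCenter defaultCenter]
            removeObserver:self
                      name:NSWindowWillCloseNotification
                    object:self.windowController.window];
        self.windowController = nil;
    }

    - (BOOL)applicationShouldHandleReopen:(NSApplication *)sender
                        hasVisibleWindows:(BOOL)hasVisibleWindows {
        [self showMainWindow]; // recreates the WC (and window) if it's gone
        return NO;
    }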
Use [NSApp setDelegate:self]; in awakeFromNib.
I'm using Lion, and the applicationWillUnhide/applicationDidUnhide application delegate methods are not being called when I expect them to be.
I'm miniaturizing the app to the Dock and then clicking its Dock icon again; the methods are never called, even though the application deminiaturizes correctly.
Maybe this does not count as hiding? How can I catch this event?
Your help is greatly appreciated,
Jose.
You can’t minimise an application on OS X, only hide it. To observe your NSApplication’s hidden state use NSApplicationDidHideNotification and NSApplicationWillUnhideNotification. If these notifications don’t appear to be sent correctly, you’ll need to show us some code.
Or do you actually mean minimising windows? You’ll have to observe NSWindowWillMiniaturizeNotification/NSWindowDidMiniaturizeNotification and NSWindowDidDeminiaturizeNotification for that (as per the NSWindow class reference). Remember that you can pass nil for the object parameter of -[NSNotificationCenter addObserver:selector:name:object:] to observe the minimisation state of all your application’s windows.
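For example, a minimal sketch of observing miniaturization for every window in the app:

    // e.g. in the object that cares; passing nil as the object means
    // "any window in this application".
    - (void)awakeFromNib {
        [[NSNotificationCenter defaultCenter]
            addObserver:self
               selector:@selector(windowDidMiniaturize:)
                   name:NSWindowDidMiniaturizeNotification
                 object:nil];
    }

    - (void)windowDidMiniaturize:(NSNotification *)notification {
        NSLog(@"Miniaturized: %@", [[notification object] title]);
    }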
I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
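For reference, the high-level action itself looks like this (a sketch, assuming you know the target's PID and that an AXButton sits among the first window's direct children; real element trees usually need deeper walking):

    #import <ApplicationServices/ApplicationServices.h>

    static void PressFirstButton(pid_t targetPID) {
        AXUIElementRef app = AXUIElementCreateApplication(targetPID);
        CFArrayRef windows = NULL;
        AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                      (CFTypeRef *)&windows);
        if (windows && CFArrayGetCount(windows) > 0) {
            AXUIElementRef window =
                (AXUIElementRef)CFArrayGetValueAtIndex(windows, 0);
            CFArrayRef children = NULL;
            AXUIElementCopyAttributeValue(window, kAXChildrenAttribute,
                                          (CFTypeRef *)&children);
            for (CFIndex i = 0; children && i < CFArrayGetCount(children); i++) {
                AXUIElementRef child =
                    (AXUIElementRef)CFArrayGetValueAtIndex(children, i);
                CFStringRef role = NULL;
                AXUIElementCopyAttributeValue(child, kAXRoleAttribute,
                                              (CFTypeRef *)&role);
                Boolean isButton = role && CFEqual(role, kAXButtonRole);
                if (role) CFRelease(role);
                if (isButton) {
                    // "Do your thing" -- the target decides what that means.
                    AXUIElementPerformAction(child, kAXPressAction);
                    break;
                }
            }
            if (children) CFRelease(children);
        }
        if (windows) CFRelease(windows);
        CFRelease(app);
    }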
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
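A sketch of that mechanism (GetProcessForPID is from the Carbon Process Manager; the coordinates are placeholders, and the forged events are posted from inside the callback because that is the only place CGEventTapPostEvent works):

    #import <ApplicationServices/ApplicationServices.h>
    #import <Carbon/Carbon.h> // GetProcessForPID

    static CGEventRef MyTapCallback(CGEventTapProxy proxy, CGEventType type,
                                    CGEventRef event, void *refcon) {
        CGPoint where = CGPointMake(100, 100); // placeholder coordinates
        CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                                  where, kCGMouseButtonLeft);
        CGEventRef up = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp,
                                                where, kCGMouseButtonLeft);
        CGEventTapPostEvent(proxy, down);
        CGEventTapPostEvent(proxy, up);
        CFRelease(down);
        CFRelease(up);
        return event; // pass the original event through unchanged
    }

    static void InstallTapForProcess(pid_t targetPID) {
        ProcessSerialNumber psn;
        GetProcessForPID(targetPID, &psn);
        CFMachPortRef tap = CGEventTapCreateForPSN(&psn, kCGHeadInsertEventTap,
                                                   kCGEventTapOptionDefault,
                                                   kCGEventMaskForAllEvents,
                                                   MyTapCallback, NULL);
        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(NULL, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        // Releases omitted in this sketch; the tap lives for the app's lifetime.
    }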
After creating a translucent window (based on example code by Matt Gemmell), I want to get keyboard events in this window. It seems that I only receive keyboard events when my application is the active application, while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate(). Sample code can be found in this thread from the Apple developer mailing list; a minimal sketch also follows the note below.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
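A minimal sketch of the second option, a listen-only session-wide keyboard tap (subject to the assistive-devices caveat above):

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    static CGEventRef KeyDownCallback(CGEventTapProxy proxy, CGEventType type,
                                      CGEventRef event, void *refcon) {
        if (type == kCGEventKeyDown) {
            int64_t keycode = CGEventGetIntegerValueField(event,
                                                          kCGKeyboardEventKeycode);
            NSLog(@"key down, keycode %lld", (long long)keycode);
        }
        return event; // listen-only taps can't modify events anyway
    }

    static void InstallKeyTap(void) {
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                             kCGHeadInsertEventTap,
                                             kCGEventTapOptionListenOnly,
                                             CGEventMaskBit(kCGEventKeyDown),
                                             KeyDownCallback, NULL);
        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(NULL, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
    }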
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
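Concretely, that's LSUIElement set to 1 in your Info.plist plus something like this (a sketch, assuming ARC, a strong `statusItem` property, and a hypothetical `statusMenu` outlet):

    // Put an entry point in the menu bar, since LSUIElement apps have no
    // Dock icon or menu bar of their own.
    self.statusItem = [[NSStatusBar systemStatusBar]
        statusItemWithLength:NSVariableStatusItemLength];
    [self.statusItem setTitle:@"MyApp"];
    [self.statusItem setHighlightMode:YES];
    [self.statusItem setMenu:self.statusMenu];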
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
Update:
Apple actually seems to have changed everything again starting with 10.5, BTW (I recently upgraded, and my sample code no longer worked as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of the level at which you tap and regardless of whether you create the tap as capturing (which allows you to modify and even discard events) or listen-only. You can still be notified when modifier flags change (and actually even change them) and receive other events, but key-down events under no other circumstances.
However, using a Carbon event handler and the RegisterEventHotKey() function, you can register a hotkey and be notified when it is pressed; you need neither root privileges nor assistive devices enabled for that. I think Quicksilver is probably doing it that way.
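A sketch of that route, with Cmd-Opt-Q as a placeholder combination:

    #import <Foundation/Foundation.h>
    #import <Carbon/Carbon.h>

    static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler,
                                  EventRef event, void *userData) {
        // Called whenever the registered hotkey is pressed; no root
        // privileges or assistive-devices access required.
        NSLog(@"hotkey pressed");
        return noErr;
    }

    static void InstallHotKey(void) {
        EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
        InstallApplicationEventHandler(NewEventHandlerUPP(HotKeyHandler),
                                       1, &spec, NULL, NULL);
        EventHotKeyID hotKeyID = { 'htk1', 1 }; // arbitrary signature and id
        EventHotKeyRef hotKeyRef;
        RegisterEventHotKey(kVK_ANSI_Q, cmdKey | optionKey, hotKeyID,
                            GetApplicationEventTarget(), 0, &hotKeyRef);
    }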