I want to make a Cocoa application that takes keyboard input and performs specific actions depending on which key is pressed. What function (or anything else) should I use?
Check the NSResponder class documentation, in particular the keyDown: method.
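For example, a minimal keyDown: override in a custom view might look like this (MyKeyView is just a placeholder name; the view has to be in the key window's responder chain and accept first responder status):

#import <Cocoa/Cocoa.h>

@interface MyKeyView : NSView
@end

@implementation MyKeyView

// The view must agree to become first responder before it receives key events.
- (BOOL)acceptsFirstResponder {
    return YES;
}

- (void)keyDown:(NSEvent *)event {
    NSString *characters = [event charactersIgnoringModifiers];
    if ([characters isEqualToString:@"a"]) {
        NSLog(@"The 'a' key was pressed");
    } else {
        // Let unhandled keys travel up the responder chain.
        [super keyDown:event];
    }
}

@end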
Newbie OS X developer here, although fairly experienced with iOS.
I am missing something basic about the way the top-level NSMenu interacts with the application. I want the File->Save command to go to the current window. So far I have only been able to receive NSMenu actions in the app delegate. Am I supposed to keep track of the active window there and invoke methods from the app delegate?
Firstly, it sounds like you need to read up on Mac menu handling, because there are a lot of things you need to know about in order to deal with menus correctly.
To answer your specific question, if a menu item has a target of nil, such as the Save menu item, then the menu handling system walks up the responder chain, starting from the currently active control or view (first responder), looking for an object that implements the action selector for that menu item.
If you don't understand how the responder chain works, you should read about that too, because it's fundamental to understanding how Mac apps work.
If you want your window controller to handle the ‑save: action when its window is the main window, then all you need to do is implement the ‑save: action in your window controller. Because the window controller is in the responder chain before the application delegate, its implementation of the method will be used.
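For instance, assuming the Save menu item's action is wired to a nil-targeted ‑save: selector as described (the stock Xcode template actually uses ‑saveDocument:, but the mechanism is identical), the window controller only needs something like:

#import <Cocoa/Cocoa.h>

@interface MyWindowController : NSWindowController
@end

@implementation MyWindowController

// Reached via the responder chain when the nil-targeted Save item is chosen
// while this controller's window is the key/main window.
- (IBAction)save:(id)sender {
    NSLog(@"Save invoked for %@", [self window]);
    // ... perform the actual save here ...
}

@end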
I'm building a little app which needs to recognize if certain keys on the keyboard were pressed. In this case the arrow keys. The app must take action when these keys get pressed, even if it's not the frontmost and has no focus.
Is this possible to do? What would I have to do to receive these keyboard events no matter where they happen?
You do this by registering a hotkey using Carbon's RegisterEventHotKey function. There are also open source libraries available that make this easier, for example SGHotKeysLib.
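As a rough sketch of the RegisterEventHotKey route (the Command+Option+Up combination, the 'htk1' signature, and the hotkey ID are arbitrary choices for illustration; registering a bare arrow key would steal it from every application, so you normally pair arrows with modifiers):

#import <Carbon/Carbon.h>
#import <Foundation/Foundation.h>

// Called by the Carbon event system whenever one of our registered hotkeys fires.
static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler, EventRef event, void *userData) {
    EventHotKeyID hotKeyID;
    GetEventParameter(event, kEventParamDirectObject, typeEventHotKeyID, NULL,
                      sizeof(hotKeyID), NULL, &hotKeyID);
    NSLog(@"Hotkey with id %u was pressed", (unsigned int)hotKeyID.id);
    return noErr;
}

static void RegisterArrowHotKey(void) {
    // Install a handler for hotkey-pressed events on the application target.
    EventTypeSpec eventType = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(&HotKeyHandler, 1, &eventType, NULL, NULL);

    // Register Command+Option+Up as a system-wide hotkey.
    EventHotKeyID hotKeyID;
    hotKeyID.signature = 'htk1';
    hotKeyID.id = 1;
    EventHotKeyRef hotKeyRef;
    RegisterEventHotKey(kVK_UpArrow, cmdKey | optionKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);
}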
I need to simulate a button press on my Cocoa button programmatically, and I am trying to do this on Cocotron, which unfortunately does not implement the NSEvent method mouseEventWithType:location:modifierFlags:timestamp:windowNumber:context:eventNumber:clickCount:pressure:.
Is there any way to programmatically simulate a button press without having to create an event?
Don't forget to look in superclasses when you're looking for something. All NSControls, including all NSButtons, respond to a performClick: message.
That said, is it really appropriate for you to simulate a button press? If you just want something done, it's generally better to directly tell the controller to do it.
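For example (myButton here is a hypothetical outlet):

// Simulate the click: the button highlights briefly and fires its action.
[myButton performClick:nil];

// Often cleaner: skip the button entirely and send its action message directly.
[NSApp sendAction:[myButton action] to:[myButton target] from:myButton];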
I have created an NSStatusBar Cocoa application which sits in the system status bar.
I want to assign a hotkey so that, when it is pressed, it toggles my application and shows the menu.
Is this possible? In my searching and experimenting I have found a few different ways of assigning global hotkeys that can be pressed while the application is in the background, but I can't find any way to programmatically make the menu show.
If anyone has a preferred way of assigning a global hotkey, please post that as well.
Thanks.
One of the hotkey tutorials I found was on http://dbachrach.com/blog/2005/11/program-global-hotkeys-in-cocoa-easily/ for anyone interested.
If you're targeting 10.6+, there's some new API for NSEvent that can do global hotkeys. For more information, check out this awesome blog post: http://cocoakids.net/global-hotkeys-in-cocoa-on-snow-leopard
EDIT (a long time later)
Tooting my own horn a bit: I could never get things like PTHotKey and other libraries to work the way I was expecting, so I eventually gave up and wrote my own HotKey wrapper. It has a very simple API (you give it a key code, modifiers, a target, and an action) and even supports fun things like 10.6's blocks. You can download the source here: http://github.com/davedelong/DDHotKey
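As a rough sketch of the 10.6 NSEvent route applied to the status-item question above (the statusItem and statusMenu properties and the Command+Shift+M combination are assumptions for illustration, not anything from the posts above; note that a global monitor can only observe key events, never consume them):

#import <Cocoa/Cocoa.h>

@interface StatusMenuController : NSObject
// Hypothetical outlets: the status item and the menu it should show.
@property (strong) NSStatusItem *statusItem;
@property (strong) NSMenu *statusMenu;
@end

@implementation StatusMenuController

// 10.6+: observe key-down events that happen in other applications.
// (Key events are only delivered if access for assistive devices is enabled.)
- (void)installGlobalHotkeyMonitor {
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                           handler:^(NSEvent *event) {
        NSUInteger wanted = NSCommandKeyMask | NSShiftKeyMask;
        BOOL modifiersMatch = (([event modifierFlags] & wanted) == wanted);
        // Hypothetical hotkey: Command+Shift+M pops up the status item's menu.
        if (modifiersMatch &&
            [[event charactersIgnoringModifiers] isEqualToString:@"m"]) {
            [self.statusItem popUpStatusItemMenu:self.statusMenu];
        }
    }];
}

@end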
There is an actual hotkey API, which still exists in Snow Leopard and is available in 64-bit. It's designed specifically for this purpose, unlike the NSEvent methods, which are essentially just a block-based wrapper around CGEventTaps.
The difference is that the NSEvent methods (or CGEventTaps directly) make you look at every event that comes in, whereas the hotkey API only calls your function when the user presses your hotkey.
After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that there are only keyboard events when my application is the active application while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate. Sample code can be found in this thread from the Apple developer mailing list, and a rough sketch follows below.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
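For reference, a minimal sketch of the second option, a listen-only CGEventTapCreate tap (the function names here are made up, and as noted above this only receives key-down events when assistive device access is enabled or the process runs as root):

#import <ApplicationServices/ApplicationServices.h>
#import <Foundation/Foundation.h>

// Called for every tapped event; here we just log key-downs and pass them on.
static CGEventRef KeyTapCallback(CGEventTapProxy proxy, CGEventType type,
                                 CGEventRef event, void *refcon) {
    if (type == kCGEventKeyDown) {
        int64_t keyCode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        NSLog(@"Key down, virtual key code: %lld", (long long)keyCode);
    }
    // Returning NULL would swallow the event, but only an active tap
    // (not a listen-only one) can actually do that.
    return event;
}

static void InstallKeyTap(void) {
    CGEventMask mask = CGEventMaskBit(kCGEventKeyDown);
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly,
                                         mask,
                                         KeyTapCallback,
                                         NULL);
    if (!tap) {
        NSLog(@"Could not create event tap (is access for assistive devices enabled?)");
        return;
    }
    // Feed the tap into the current run loop so the callback gets called.
    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRelease(source);
}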
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
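For completeness, a rough sketch of creating such a non-activating panel in code rather than in Interface Builder (the frame, style mask, and panel behaviors here are arbitrary illustrations):

#import <Cocoa/Cocoa.h>

// A panel created with the non-activating style mask accepts key and mouse
// input without activating the rest of the application; this is the
// programmatic equivalent of the "Non-Activating Panel" checkbox.
static NSPanel *MakeNonActivatingPanel(void) {
    NSPanel *panel = [[NSPanel alloc] initWithContentRect:NSMakeRect(200, 200, 400, 120)
                                                styleMask:(NSTitledWindowMask |
                                                           NSNonactivatingPanelMask)
                                                  backing:NSBackingStoreBuffered
                                                    defer:NO];
    [panel setFloatingPanel:YES];           // stay above normal windows
    [panel setBecomesKeyOnlyIfNeeded:YES];  // take key status only when needed
    [panel makeKeyAndOrderFront:nil];
    return panel;
}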
Update:
Apple seems to have changed things again starting with 10.5 (I recently upgraded, and my sample code no longer worked as before).
Now you can only capture key-down events with an event tap if you are running as root or access for assistive devices is enabled, regardless of the level at which you tap and regardless of whether you create the tap as an active tap (which allows you to modify and even discard events) or as listen-only. You can still be notified when modifier flags change (and even alter them) and receive other event types, but key-down events under no other circumstances.
However, using a Carbon event handler together with RegisterEventHotKey() lets you register a hotkey and be notified when it is pressed; for that you neither need to run as root nor need assistive devices enabled. I think Quicksilver is probably doing it that way.