No, really. What is the proper way to handle keyboard input in a game using Cocoa?

Let's say you're creating a game for Mac OS X. In fact, let's say you're creating Quake, only it's 2011 and you'd prefer to only use modern, non-deprecated frameworks.
You want your game to be notified when the user presses (or releases) a key, any key, on the keyboard. This includes modifier keys, like shift and control. Edited to add: Also, you want to know whether the left or right version of a modifier key was pressed.
You also want your game to have a config screen, where the user can inspect and modify the keyboard config. It should contain things like:
Move forward: W
Jump: SPACE
Fire: LCTRL
What do you do? I've been trying to find a good answer to this for a day or so now, but haven't succeeded.
This is what I've come up with:
Subclass NSResponder and implement keyUp: and keyDown:, as in this answer. A problem with this approach is that keyUp: and keyDown: won't get called when the user presses only a modifier key. To work around that, you can implement flagsChanged:, but that feels like a hack.
Use a Quartz Event Tap. This only works if the app runs as root, or the user has enabled access for assistive devices. Also, modifier key events still do not count as regular key events.
Use the HIToolbox. There is virtually no mention at all of it in the 10.6 developer docs. It appears to be very, very deprecated.
So, what's the proper way to do this? This really feels like a problem that should have a well-known, well-documented solution. It's not like games are incredibly niche.

As others have said, there’s nothing wrong with using -flagsChanged:. There is another option: use the IOKit HID API. You should be using this anyway for joystick/gamepad input, and arguably mouse input; it may or may not be convenient for keyboard input too, depending on what you’re doing.
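For concreteness, here is a minimal sketch of the AppKit route (my own illustration, not code from any of the answers): a view that forwards keyDown:/keyUp:/flagsChanged: to a hypothetical handleKeyCode:down: hook, using the event's keyCode to tell left from right modifiers. The kVK_* constants come from Carbon's Events.h.

    #import <Cocoa/Cocoa.h>
    #import <Carbon/Carbon.h>   // kVK_Shift, kVK_RightShift, kVK_Control, kVK_RightControl

    @interface GameView : NSView
    @end

    @implementation GameView
    - (BOOL)acceptsFirstResponder { return YES; }

    - (void)keyDown:(NSEvent *)event { [self handleKeyCode:event.keyCode down:YES]; }
    - (void)keyUp:(NSEvent *)event   { [self handleKeyCode:event.keyCode down:NO];  }

    // Modifier keys arrive here instead of keyDown:/keyUp:.
    - (void)flagsChanged:(NSEvent *)event {
        NSEventModifierFlags flags = event.modifierFlags;
        switch (event.keyCode) {
            case kVK_Shift:            // 0x38
            case kVK_RightShift:       // 0x3C
                [self handleKeyCode:event.keyCode down:(flags & NSEventModifierFlagShift) != 0];
                break;
            case kVK_Control:          // 0x3B -- the "Fire: LCTRL" binding
            case kVK_RightControl:     // 0x3E
                [self handleKeyCode:event.keyCode down:(flags & NSEventModifierFlagControl) != 0];
                break;
            default:
                break;
        }
        // Caveat: if both shift keys are held, releasing one leaves the flag set,
        // so this simple test misreports that release; the device-dependent bits
        // in <IOKit/hidsystem/IOLLEvent.h> can disambiguate further.
    }

    // Hypothetical hook into the game's key-config table.
    - (void)handleKeyCode:(unsigned short)keyCode down:(BOOL)down {
        NSLog(@"key 0x%02X %@", (unsigned)keyCode, down ? @"down" : @"up");
    }
    @end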

This looks promising:
+[NSEvent addLocalMonitorForEventsMatchingMask:handler:]
Seems to be new in 10.6 and sounds just like what you're looking for. More here:
http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/ApplicationKit/Classes/NSEvent_Class/Reference/Reference.html%23//apple_ref/occ/clm/NSEvent/addLocalMonitorForEventsMatchingMask:handler:
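A minimal sketch of how that might look (the GameInput class and its logging are my own assumptions, not from the linked docs):

    #import <Cocoa/Cocoa.h>

    @interface GameInput : NSObject
    @property (strong) id keyMonitor;
    - (void)install;
    @end

    @implementation GameInput
    - (void)install {
        NSEventMask mask = NSEventMaskKeyDown | NSEventMaskKeyUp | NSEventMaskFlagsChanged;
        self.keyMonitor = [NSEvent addLocalMonitorForEventsMatchingMask:mask
                                                                handler:^NSEvent *(NSEvent *event) {
            // keyCode distinguishes left/right modifiers (e.g. 0x38 vs 0x3C for shift).
            NSLog(@"type=%lu keyCode=0x%02X", (unsigned long)event.type, (unsigned)event.keyCode);
            return nil;   // return nil to swallow the event, or return event to pass it on
        }];
    }
    @end

A local monitor only sees events dispatched to your own application, which is usually what you want for an in-game key-config screen.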

Related

Swipe Gesture Recogniser using VoiceOver

I have a view with a few gesture recognisers (à la Clear). Should I add buttons only for VoiceOver users instead?
I thought about using the hint to say something like "3-finger swipe right to edit, left to delete, up to create a new one", but it seems like Apple discourages that. Even Apple uses "Double tap to edit" on text fields and such, and I have no idea why they discourage that.
Apple's guidance is that a hint does not include the name of the action or gesture. A hint does not tell users how to perform the action; it tells users what will happen when that action occurs. Therefore, do not create hints such as "Tap to play the song," "Tapping purchases the item," or "Swipe to delete the item."
This is especially important because VoiceOver users can use VoiceOver-specific gestures to interact with elements in your application. If you name a different gesture in a hint, it would be very confusing.
Yes, you should include alternate buttons.
You're misunderstanding the Apple disclaimer. The disclaimer refers to the fact that VoiceOver takes over the touch screen. Once VoiceOver takes over the screen, it decides how to pass gestures to your application. As it works right now, to activate a button a user would highlight it and then double-tap. But VoiceOver doesn't need to stick to this (though it is highly likely that it will for some time). However, it is not a developer's job to inform users of this. VoiceOver informs users of it through earcons, traits, and other instructions that are dependent on the AT. If a developer were to include this information in the hint, it could be invalidated by a change in the AT, and then be inconsistent across device versions or across other ATs such as braille boards.
Not only might you be describing gestures that VoiceOver doesn't allow (given that it captures screen gestures); even if you were to apply the "allows direct interaction" trait, you may be describing gestures that people with disabilities cannot perform. Either way, including another method of achieving the given interaction is the better solution.
Use custom actions defined on accessible elements instead of using specific buttons for your purpose.
Moreover, I don't think it's a good idea to add VoiceOver gestures dedicated to an application, as you suggested with your hints: try to build your app with the VoiceOver standards that users are used to.
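To illustrate the custom-actions suggestion above, here is a minimal UIKit sketch (the TaskCell class and its edit/delete handlers are hypothetical):

    #import <UIKit/UIKit.h>

    @interface TaskCell : UITableViewCell
    @end

    @implementation TaskCell
    - (void)awakeFromNib {
        [super awakeFromNib];
        // VoiceOver users pick these with the standard rotor gesture (swipe up/down,
        // then double-tap), so no dedicated on-screen buttons are required.
        self.accessibilityCustomActions = @[
            [[UIAccessibilityCustomAction alloc] initWithName:@"Edit"
                                                       target:self selector:@selector(editItem)],
            [[UIAccessibilityCustomAction alloc] initWithName:@"Delete"
                                                       target:self selector:@selector(deleteItem)]
        ];
    }

    // Hypothetical handlers; return YES if the action succeeded.
    - (BOOL)editItem   { /* begin editing this row */ return YES; }
    - (BOOL)deleteItem { /* delete this row */ return YES; }
    @end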

How to hook/remap an arbitrary keyboard event on OSX?

I would like to map:
CAPS-LOCK to Fn
Fn to Left-MOUSE
LSHIFT+3 to #
RSHIFT+3 to something else
etc.
I have searched exhaustively for any tool that offers complete freedom for remapping keyboard input, but cannot find one. (Windows has AutoHotkey).
I'm planning to write my own tool that parses a config file.
But how to actually dig in and do this?
Solving this problem will require a thorough understanding of the journey of a keystroke through the operating system, so as to intercept at the appropriate point.
I think I need to eat the event at a low level, where it is still a virtual key code, then provide my own customised mapping system and inject an appropriate event further up the system.
But where (and how)?
Edit: I am detailing the results of my research in an answer below (which maybe should be migrated back into the question at some point).
I'm making this community wiki; please feel welcome to improve it.
Sub-Questions I've asked:
Make SHIFT+3 produce `#` not `£` on OSX by code
How to tap (hook) F7 through F12 and Power/Eject on a MacBook keyboard
How to tap/hook keyboard events in OSX and record which keyboard fires each event
-> https://github.com/Daij-Djan/DDHidLib
Trap each SHIFT key independently on OS X
In OSX, how to determine which keyboard generated an NSEvent?
I can intercept almost all key-down/key-up events at the bottom of the middle tier, except for Power, and also the CAPS LOCK key-up when it is transitioning from on to off.
Pretty nasty!
Working at the bottom tier, I can get all key up/down events except for the Power key.
If it were not for that awkward 75% success rate for CapsLock I would have a good solution. It is vexing that handling a key in such a useful location massively escalates the required complexity.
I have found DDHidLib and am currently looking through it to figure out whether it smooths out that problem.
Research
Googling "keyEventWithType CGEventTapCreate" seems like a good idea, as those are the essential ingredients for tapping the event and re-emitting it.
Yay! Modify keyDown output -- that code compiles, and with minor tweaking (CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventFlagsChanged)) I can pick up modifier keys also. I get different keycodes for LSHIFT and RSHIFT. Brilliant!
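A minimal sketch of such a tap (my own reconstruction, not the code from the linked answer; it assumes the process has Accessibility access):

    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon) {
        int64_t code = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        if (type == kCGEventFlagsChanged) {
            printf("modifier keycode 0x%02llx (0x38 = LSHIFT, 0x3C = RSHIFT)\n", code);
        } else {
            printf("%s keycode 0x%02llx\n", type == kCGEventKeyDown ? "down" : "up", code);
        }
        return event;   // pass the event through unchanged
    }

    int main(void) {
        CGEventMask mask = CGEventMaskBit(kCGEventKeyDown) |
                           CGEventMaskBit(kCGEventKeyUp)   |
                           CGEventMaskBit(kCGEventFlagsChanged);
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                             kCGEventTapOptionDefault, mask, tapCallback, NULL);
        if (!tap) return 1;   // creation fails without the required permissions
        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();
        return 0;
    }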
Problems with the above:
Tapping kCGEventKeyDown works for some function keys but not others. It looks as though Apple have only overloaded certain keys, and the overloaded ones seem to get caught at a lower level.
Power/Eject key doesn't get caught.
I don't see any way to disambiguate which device the keystroke is coming from.
How to tap (hook) F7 through F12 and Power/Eject on a MacBook keyboard
http://blog.tklee.org/2008/09/modifiers-across-keyboards.html
-> http://forums.macrumors.com/showthread.php?t=778484
-> https://github.com/maravillas/dual-keyboards
https://github.com/pkamb/PowerKey may provide some insight
-> https://github.com/pkamb/PowerKey/blob/master/PowerKey/PKPowerKeyEventListener.m
-> Apple Keyboard Media Key Event Handling -- Rogue Amoeba
... system wide shortcut for Mac OS X
-> http://snippets.aktagon.com/snippets/361-registering-global-hot-keys-with-cocoa-and-objective-c
Another problem: with the sequence LSHIFT-down, RSHIFT-down-and-up, LSHIFT-up, the RSHIFT events wouldn't get caught.
Looks like I need to dip down into IOKit
Using IOHIDManager to Get Modifier Key Events
-> https://github.com/candera/khordr/blob/master/src/c/keygrab/hid-scratch.c
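A minimal sketch of the IOHIDManager route (my own assumption of how the pieces fit together, not the code from hid-scratch.c); HID usages 0xE1/0xE5 are left/right shift, and the element's device reference tells you which keyboard fired the event:

    // build: clang -fobjc-arc sketch.m -framework Foundation -framework IOKit
    #import <Foundation/Foundation.h>
    #import <IOKit/hid/IOHIDLib.h>
    #import <IOKit/hid/IOHIDUsageTables.h>

    static void handleInput(void *context, IOReturn result, void *sender, IOHIDValueRef value) {
        IOHIDElementRef element = IOHIDValueGetElement(value);
        if (IOHIDElementGetUsagePage(element) != kHIDPage_KeyboardOrKeypad) return;
        uint32_t usage = IOHIDElementGetUsage(element);          // 0xE1 = left shift, 0xE5 = right shift
        long pressed   = IOHIDValueGetIntegerValue(value);
        IOHIDDeviceRef device = IOHIDElementGetDevice(element);  // which keyboard fired this?
        NSLog(@"device %p usage 0x%02X %s", device, usage, pressed ? "down" : "up");
    }

    int main(void) {
        @autoreleasepool {
            IOHIDManagerRef manager = IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);
            // Match keyboards only: Generic Desktop usage page, Keyboard usage.
            NSDictionary *matching = @{ @kIOHIDDeviceUsagePageKey : @(kHIDPage_GenericDesktop),
                                        @kIOHIDDeviceUsageKey     : @(kHIDUsage_GD_Keyboard) };
            IOHIDManagerSetDeviceMatching(manager, (__bridge CFDictionaryRef)matching);
            IOHIDManagerRegisterInputValueCallback(manager, handleInput, NULL);
            IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
            IOHIDManagerOpen(manager, kIOHIDOptionsTypeNone);    // may need input-monitoring permission
            CFRunLoopRun();
        }
        return 0;
    }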
kEventRawKeyDown in:
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/
Developer/SDKs/MacOSX10.10.sdk/System/Library/
Frameworks//Carbon.framework/Frameworks/HIToolbox.framework/
Headers/CarbonEvents.h
Resources
3-tier:
Cocoa/AppKit is a higher-level wrapper
Quartz takes the events from IOKit and routes them to apps
Note: NSEvent is built on top of Quartz events
IOKit -- talks to the hardware
Top Tier (Cocoa/AppKit)
https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/EventOverview/EventArchitecture/EventArchitecture.html -- this document is a must-see, and shows the basic three-tier architecture diagram above. However, it appears to focus only on the top tier (Cocoa/AppKit).
http://www.hcs.harvard.edu/~jrus/Site/Cocoa%20Text%20System.html <-- this article shows you 3 OSX config files that operate at an even higher level, letting you script your own mappings. This is documented here.
^ KeyBindingsEditor lets you make the above modifications.
Middle Tier (Quartz)
QuartzEventServicesRef
NSEvent -- specifically Creating Events
There are a couple of working code examples at this level (links below, plus a minimal sketch after them). However, they all perform the same basic trick of receiving one virtual key code and emitting another, so you could use this technique, for example, to swap 'a' and 'z'.
Intercept keyboard input in OSX -- leads to Quartz Event Taps
Modify NSEvent to send a different key than the one that was pressed -- Dave De Long provide a working example, also using QET.
https://gist.github.com/wewearglasses/8313195 <-- "Global keyboard hook for OSX" -- another short working demo using QET.
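Here is that basic trick in sketch form (assumptions of mine: the key codes are the ANSI-layout kVK_ANSI_A = 0x00 and kVK_ANSI_Z = 0x06, and the tap setup is the same as in the CGEventTapCreate sketch higher up on this page):

    static CGEventRef swapCallback(CGEventTapProxy proxy, CGEventType type,
                                   CGEventRef event, void *refcon) {
        if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
            int64_t code = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
            if (code == 0x00)        // 'a' -> 'z'
                CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, 0x06);
            else if (code == 0x06)   // 'z' -> 'a'
                CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, 0x00);
        }
        return event;   // the rewritten event is what gets delivered
    }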
Ukelele lets you choose which Unicode character gets associated with a particular key, or modifier+key. It doesn't allow remapping of modifiers, nor does it disambiguate left from right shift keys, etc.
Keyboard input on OSX -- answer points towards addGlobalMonitorForEventsMatchingMask in NSEvent (in AppKit)
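A minimal sketch of that global (read-only) variant; note that, unlike a local monitor, it requires the app to be trusted for Accessibility, it cannot modify or swallow events, and it does not see events sent to your own app:

    #import <Cocoa/Cocoa.h>

    static id gGlobalKeyMonitor;   // keep a reference so the monitor stays alive

    static void installGlobalKeyMonitor(void) {
        gGlobalKeyMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSEventMaskKeyDown
                                                                   handler:^(NSEvent *event) {
            NSLog(@"global keyDown, keyCode 0x%02X", (unsigned)event.keyCode);
        }];
    }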
Base Tier (IOKit)
IOKitFundamentals <-- Also IOKit ("Interrupt Handling in the I/O Kit... Two types of events trigger an interrupt: ... Asynchronous events, such as keyboard presses")
https://developer.apple.com/library/mac/documentation/DeviceDrivers/Conceptual/AccessingHardware/AH_Other_APIs/AH_Other_APIs.html
How can Mac OS X games receive low-level keyboard input events? <-- gamers are interested in this!
http://phrack.org/issues/66/16.html#article -- sometimes the hackers present things most clearly, haven't read through this yet. IOKit again, it seems.
More Links...
How do you implement global keyboard hooks in Mac OS X? <-- links to an article.
OS X: Detect system-wide keyDown events? <-- slightly OT as this is just to do with global monitoring, i.e. read-only.
http://www.codeitive.com/7QHSeVjPWq/where-can-i-find-kcgkeyboardeventkeycode-list-of-key-codes-for-cyrillic-language.html
Have you checked out Karabiner? It does all that you want to do, up to OS X 10.11. macOS 10.12 changed the keyboard driver model, and the authors (mainly Tekezo) are still rewriting Karabiner to take account of the new model (this is as of February 2017).
Karabiner is open source, and you can download the code from GitHub and twiddle with it.
As part of the rewrite they have released Karabiner-Elements, which works on 10.12 Sierra but cannot yet do everything that Karabiner did.
Karabiner is very powerful, and I miss it greatly on 10.12

how do i enable screen zoom/magnification programmatically under os x?

I'm talking about the "zoom" functionality in the Universal Access system preference panel. Normally this is toggled with command–option–8; then the zoom controls are command–option–+ (magnify) and command–option–- (minify).
My most recent attempt involved sending the keypresses for the shortcuts as events. However, this approach has serious bugs. On top of that, I don't know whether the user already has zoom enabled. I'm looking for something cleaner; like, the way you're supposed to do it.
Of course, there is always using AppleScript to open the System Preferences pane and toggle the radio buttons, but that is not really what I would call "clean."
Even if you don't know exactly how to accomplish what I'm asking, some pointers as to where this kind of thing (programmatic toggling of OS functionality) might be documented would be helpful. The language doesn't matter. Thanks.
It's not quite what I wanted, but UAZoomEnabled() in /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/Headers/UniversalAccess.h lets me know whether zoom is currently enabled. Then I know whether to send the command-option-8 keystrokes using CGEventCreateKeyboardEvent(), CGEventSetFlags() and CGEventPost(). To make sure the zoom level ends up at 10 ticks, I zoom out 100 ticks, then zoom in 10 ticks.
source: http://lists.apple.com/archives/accessibility-dev/2011/Mar/msg00013.html
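A minimal sketch of that approach (assumptions: a US layout where virtual key code 0x1C is the '8' key, the ApplicationServices umbrella header pulls in UniversalAccess.h, and the process is allowed to post keyboard events):

    #include <ApplicationServices/ApplicationServices.h>   // CGEvent* plus, via HIServices, UAZoomEnabled()

    static void sendZoomToggle(void) {
        // Synthesize command-option-8 (key code 0x1C is '8' on a US layout).
        CGEventRef down = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0x1C, true);
        CGEventRef up   = CGEventCreateKeyboardEvent(NULL, (CGKeyCode)0x1C, false);
        CGEventSetFlags(down, kCGEventFlagMaskCommand | kCGEventFlagMaskAlternate);
        CGEventSetFlags(up,   kCGEventFlagMaskCommand | kCGEventFlagMaskAlternate);
        CGEventPost(kCGHIDEventTap, down);
        CGEventPost(kCGHIDEventTap, up);
        CFRelease(down);
        CFRelease(up);
    }

    int main(void) {
        if (!UAZoomEnabled()) {   // is the Universal Access zoom feature currently on?
            sendZoomToggle();     // if not, toggle it on via the keyboard shortcut
        }
        return 0;
    }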

Get notification on Ctrl+Alt+Del

I know there is no way to block or ignore Ctrl+Alt+Del within a program, but that's not what I want. Is there a way to only be notified if it WAS pressed? No interaction required, only notification.
Thank you!
I'm not sure why you'd want to do this, and I have a suspicion there's probably a better and cleaner way to accomplish your ultimate goal, but...
Off the top of my head, I would run a timer in the background of your application and, each time the timer fires, check whether the Ctrl, Alt, and Delete keys are pressed. To do that, you'll have to use GetAsyncKeyState from user32.dll. I'd give you a code sample, but I'm not sure what language you're using. Play around with the timer interval to find a value that balances performance but still works.
It doesn't seem that there is an easy way to get a SAS notification; all the articles I've found dealt with replacing GINA.
You might want to take a look at these:
Customizing GINA, Part 1
Customizing GINA, Part 2
C++ Q&A Typename, Disabling Keys in Windows XP with TrapKeys
If you only want to find out whether the user has locked their workstation, you should take a look at WTSRegisterSessionNotification in conjunction with WM_WTSSESSION_CHANGE.

Find out which keyboard layout is used, using ruby

How can I find out which keyboard layout the user of my Ruby application is using?
My aim is to have a game where you can move the player on a map. To go one step down and one step left, you press "Y" on a German keyboard; on an American keyboard, you would press "Z". We optimized the game for Windows and Mac, so I would like a solution for both platforms (and we don't use any command/shift/control keys).
For Windows, you probably have to use the Windows API GetKeyboardLayout(), unless Ruby provides a wrapper for that.
There are a lot of useful I18n resources for Windows on the MSDN web site.
It might be easier to simply allow them to configure it themselves as a preference if you don't have a good portable way of detecting it.
I think it'll be much easier and more natural to allow users to define the keys themselves.
As Alexander recommended, let the user define the keys themselves.
But, if you really want to recognize the layout, you could always ask the user to press keys in certain positions, particular to some layouts.
"Press the second key to the left of the return key. If your return key is two rows high, press the lower one"
[presses ä]
"Looks like you have a scandinavian keyboard"
That, however, is a horrible kludge, and in the game context I would recommend the custom key-mapping method.
