OSX: detect SPACEBAR down and up events

I need to record several thousand short soundbites of my own speech (I'm training an acoustic model for a speech recognition engine).
For each one, a line of text is presented on the screen; I have to speak it and capture the audio into a .WAV file.
I found a sample project for recording the audio; now I am trying to figure out how to do the keyboard input.
I would like to push the SPACEBAR down to start recording and release it to stop the recording.
Can anyone get me started? (An example would be ideal!)
Sorry, this is probably really basic -- I haven't done any coding on OS X before (though I have done a lot of iOS work, so I am no stranger to Xcode and some of the frameworks).

If you create a basic Cocoa application, you can use the following methods of NSResponder, of which NSView is a subclass, to capture your desired key up/down events:
-(void)keyDown:(NSEvent*)event;
-(void)keyUp:(NSEvent*)event;
Use [event keyCode] to get the key pressed.
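For the spacebar use case specifically, a minimal sketch might look like the following (assuming a custom NSView subclass; kVK_Space comes from Carbon's HIToolbox headers, and the NSLog lines stand in for whatever start/stop calls your audio code exposes):

    #import <Cocoa/Cocoa.h>
    #import <Carbon/Carbon.h>   // for kVK_Space

    @interface RecorderView : NSView
    @end

    @implementation RecorderView

    // The view must be first responder to receive key events.
    - (BOOL)acceptsFirstResponder {
        return YES;
    }

    - (void)keyDown:(NSEvent *)event {
        // Ignore auto-repeat so holding the bar doesn't restart the recording.
        if ([event keyCode] == kVK_Space && ![event isARepeat]) {
            NSLog(@"spacebar down -- start recording here");
        } else {
            [super keyDown:event];
        }
    }

    - (void)keyUp:(NSEvent *)event {
        if ([event keyCode] == kVK_Space) {
            NSLog(@"spacebar up -- stop recording here");
        } else {
            [super keyUp:event];
        }
    }

    @end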

Related

Programmatically stop recording a screen capture on mac

I start screen capturing on Mac using cmd-shift-5. I would like a way to programmatically stop the screen capture (or alternatively, to flush to screen capture so far to disk).
The reason I'd like this is that I want to start processing the first minute of screen capture while I continue recording additional footage. I already have ways of programmatically starting a screen capture (1. sending synthetic keyboard input to mimic Cmd-Shift-5 followed by Enter; or 2. the screencapture command line tool, though I can't seem to get it to record audio, only video, and the audio is important too), but not of programmatically stopping the capture.
There doesn't even seem to be a keyboard shortcut for stopping the capture; only a button on the Touch Bar. Some options I've considered are 1) trying to find a way to drive the Touch Bar programmatically, or 2) a combination of programmatic keyboard and mouse input (Cmd-Shift-5, then clicking the stop-recording button).
A better approach would be appreciated!
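Not a full answer, but since the question already drives the screencapture tool programmatically, one avenue is to launch it yourself and send it SIGINT to stop. Whether screencapture finalizes the movie cleanly on SIGINT is an assumption worth verifying, and the -v flag, output path, and fixed 60-second wait below are purely illustrative. A rough sketch:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Launch the built-in screencapture tool in video mode (-v).
            NSTask *task = [[NSTask alloc] init];
            task.launchPath = @"/usr/sbin/screencapture";
            task.arguments  = @[ @"-v", @"/tmp/capture.mov" ];
            [task launch];

            // ...record for a while (here: a fixed 60 seconds)...
            [NSThread sleepForTimeInterval:60.0];

            // Ask it to stop. ASSUMPTION: screencapture flushes and closes the
            // movie file when it receives SIGINT -- verify before relying on it.
            [task interrupt];
            [task waitUntilExit];
        }
        return 0;
    }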

How can I detect the position of touch input on the track pad in OSX?

I have a Magic Trackpad 2 and I'm hoping to write an app that can read the raw touch data and re-position the mouse on touch start, to make it behave more like a tablet than a trackpad.
Step 1 is reading raw touch input. I need to hook the "user put their finger on the trackpad" event and get the absolute x/y.
In googling I've found several solutions in Swift and Objective-C, but they're all 6+ years old and I can't get any of them to build. Honestly, if there is an even lower-level API I could use (something akin to the Win32 APIs), that would be even better.
Any advice on where to start on this would be highly appreciated.

How to hook/remap an arbitrary keyboard event on OSX?

I would like to map:
CAPS-LOCK to Fn
Fn to Left-MOUSE
LSHIFT+3 to #
RSHIFT+3 to something else
etc.
I have searched exhaustively for any tool that offers complete freedom for remapping keyboard input, but cannot find one. (Windows has AutoHotkey).
I'm planning to write my own tool that parses a config file.
But how to actually dig in and do this?
Solving this problem will require a thorough understanding of the journey of a keystroke through the operating system, so as to intercept at the appropriate point.
I think I need to eat the event at a low level, where it is still a virtual key code, then provide my own customised mapping system and inject an appropriate event further up the system.
But where (and how)?
Edit: I am detailing the results of my research in an answer below (which maybe should be migrated back into the question at some point).
I'm making this community wiki, please feel welcome to improve it.
Sub-Questions I've asked:
Make SHIFT+3 produce `#` not `£` on OSX by code
How to tap (hook) F7 through F12 and Power/Eject on a MacBook keyboard
How to tap/hook keyboard events in OSX and record which keyboard fires each event
-> https://github.com/Daij-Djan/DDHidLib
Trap each SHIFT key independently on OS X
In OSX, how to determine which keyboard generated an NSEvent?
At the bottom of the middle tier (Quartz) I can intercept almost all key-down/key-up events -- everything except the Power key, and the CAPS LOCK key-up when it is transitioning from ON to OFF.
Pretty nasty!
Working at the bottom tier (IOKit) I can get every key up/down except the Power key.
If it were not for that awkward 75% success rate for CAPS LOCK, I would have a good solution. It is vexing that handling a key in such a useful location massively escalates the required complexity.
I have found DDHidLib and am currently looking through it to figure out whether it smooths over that problem.
Research
Googling "keyEventWithType CGEventTapCreate" seems like a good idea, as those are essential ingredients for tapping the event and re-emitting it.
Yay! Modify keyDown output -- that code compiles, and with a minor tweak to the event mask (CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventFlagsChanged)) I can pick up modifier keys as well. I get different keycodes for LSHIFT and RSHIFT. Brilliant!
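For reference, here is a condensed sketch of that tap-and-re-emit approach (it assumes the process has been granted Accessibility/input-monitoring access; the a/z swap is only a placeholder for a real mapping table):

    #import <ApplicationServices/ApplicationServices.h>

    // Called for every tapped event; rewrite the virtual keycode and pass it on.
    static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon) {
        if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
            int64_t keycode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
            if (keycode == 0)        // 'a'
                CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, 6);  // -> 'z'
            else if (keycode == 6)   // 'z'
                CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, 0);  // -> 'a'
        } else if (type == kCGEventFlagsChanged) {
            // Modifier keys (LSHIFT and RSHIFT give distinct keycodes) arrive here.
        }
        return event;   // return NULL to swallow the event entirely
    }

    int main(void) {
        CGEventMask mask = CGEventMaskBit(kCGEventKeyDown) |
                           CGEventMaskBit(kCGEventKeyUp)   |
                           CGEventMaskBit(kCGEventFlagsChanged);
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                             kCGEventTapOptionDefault, mask,
                                             tapCallback, NULL);
        if (!tap) return 1;   // typically means the Accessibility permission is missing
        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();
        return 0;
    }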
Problems with the above:
Tapping kCGEventKeyDown works for some function keys but not others. It looks as though Apple have only overloaded certain keys, and the overloaded ones seem to get caught at a lower level.
Power/Eject key doesn't get caught.
I don't see any way to disambiguate which device the keystroke is coming from.
How to tap (hook) F7 through F12 and Power/Eject on a MacBook keyboard
http://blog.tklee.org/2008/09/modifiers-across-keyboards.html
-> http://forums.macrumors.com/showthread.php?t=778484
-> https://github.com/maravillas/dual-keyboards
https://github.com/pkamb/PowerKey may provide some insight
-> https://github.com/pkamb/PowerKey/blob/master/PowerKey/PKPowerKeyEventListener.m
-> Apple Keyboard Media Key Event Handling -- Rogue Amoeba
... system wide shortcut for Mac OS X
-> http://snippets.aktagon.com/snippets/361-registering-global-hot-keys-with-cocoa-and-objective-c
Another problem: with the sequence LSHIFT-down, RSHIFT-down, RSHIFT-up, LSHIFT-up, the RSHIFT events wouldn't get caught.
Looks like I need to dip down into IOKit
Using IOHIDManager to Get Modifier Key Events
-> https://github.com/candera/khordr/blob/master/src/c/keygrab/hid-scratch.c
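A minimal sketch of the IOHIDManager route (assuming the process is allowed to monitor input; newer macOS releases gate this behind an Input Monitoring permission). Each value callback carries the element's usage page/usage, so left and right shift are distinct, and the sender pointer identifies which device fired the event:

    #import <Foundation/Foundation.h>
    #import <IOKit/hid/IOHIDLib.h>
    #import <IOKit/hid/IOHIDUsageTables.h>

    // Fired for every HID input value from every matching keyboard.
    static void inputCallback(void *context, IOReturn result, void *sender,
                              IOHIDValueRef value) {
        IOHIDElementRef element = IOHIDValueGetElement(value);
        uint32_t usagePage = IOHIDElementGetUsagePage(element);
        uint32_t usage     = IOHIDElementGetUsage(element);
        CFIndex  pressed   = IOHIDValueGetIntegerValue(value);   // 1 = down, 0 = up
        if (usagePage == kHIDPage_KeyboardOrKeypad) {
            // e.g. usage 0xE1 = left shift, 0xE5 = right shift
            NSLog(@"device %p  usage 0x%02x  %s", sender, usage, pressed ? "down" : "up");
        }
    }

    int main(void) {
        IOHIDManagerRef manager = IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);
        // Match every keyboard-class device.
        NSDictionary *match = @{ @kIOHIDDeviceUsagePageKey : @(kHIDPage_GenericDesktop),
                                 @kIOHIDDeviceUsageKey     : @(kHIDUsage_GD_Keyboard) };
        IOHIDManagerSetDeviceMatching(manager, (__bridge CFDictionaryRef)match);
        IOHIDManagerRegisterInputValueCallback(manager, inputCallback, NULL);
        IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
        IOHIDManagerOpen(manager, kIOHIDOptionsTypeNone);
        CFRunLoopRun();
        return 0;
    }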
kEventRawKeyDown in:
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.10.sdk/System/Library/Frameworks/Carbon.framework/Frameworks/HIToolbox.framework/Headers/CarbonEvents.h
Resources
3-tier:
Cocoa/AppKit is a higher-level wrapper
Quartz takes the events from IOKit and routes them to apps
Note: NSEvent is built over Quartz Event
IOKit -- talks to the hardware
Top Tier (Cocoa/AppKit)
https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/EventOverview/EventArchitecture/EventArchitecture.html -- this document is a must-see, and shows the above basic 3-tier architecture pic. However, it appears to only focus on the top tier (Cocoa/AppKit).
http://www.hcs.harvard.edu/~jrus/Site/Cocoa%20Text%20System.html <-- this article shows 3 OS X config files that operate at an even higher level, letting you script your own mappings (Apple documents this key-bindings mechanism as well).
^ KeyBindingsEditor lets you make the above modifications.
Middle Tier (Quartz)
Quartz Event Services Reference
NSEvent -- specifically Creating Events
There are a couple of working code examples at this level; however, they all perform the same basic trick of receiving one virtual key code and emitting another (as in the sketch above). So you could use this technique, for example, to swap 'a' and 'z'.
Intercept keyboard input in OSX -- leads to Quartz Event Taps
Modify NSEvent to send a different key than the one that was pressed -- Dave DeLong provides a working example, also using QET.
https://gist.github.com/wewearglasses/8313195 <-- "Global keyboard hook for OSX" -- another short working demo using QET.
Ukelele lets you choose which Unicode character gets associated with a particular key, or modifier+key. It doesn't allow remapping of modifiers, nor does it disambiguate left from right shift keys, etc.
Keyboard input on OSX -- answer points towards addGlobalMonitorForEventsMatchingMask in NSEvent (in AppKit)
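For completeness, a short sketch of that global-monitor API (modern mask/type names shown; note it is strictly read-only, and key events are only delivered if the app is trusted for Accessibility):

    #import <Cocoa/Cocoa.h>

    @interface KeyWatcher : NSObject
    @end

    @implementation KeyWatcher {
        id _monitor;
    }

    - (void)start {
        // Observe key-downs in *other* applications; the events cannot be modified.
        _monitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSEventMaskKeyDown
                                                          handler:^(NSEvent *event) {
            NSLog(@"global key down, keyCode %d", (int)[event keyCode]);
        }];
    }

    - (void)stop {
        if (_monitor) {
            [NSEvent removeMonitor:_monitor];
            _monitor = nil;
        }
    }

    @end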
Base Tier (IOKit)
IOKitFundamentals <-- Also IOKit ("Interrupt Handling in the I/O Kit... Two types of events trigger an interrupt: ... Asynchronous events, such as keyboard presses")
https://developer.apple.com/library/mac/documentation/DeviceDrivers/Conceptual/AccessingHardware/AH_Other_APIs/AH_Other_APIs.html
How can Mac OS X games receive low-level keyboard input events? <-- gamers are interested in this!
http://phrack.org/issues/66/16.html#article -- sometimes the hackers present things most clearly; I haven't read through this yet. IOKit again, it seems.
More Links...
How do you implement global keyboard hooks in Mac OS X? <-- links to an article.
OS X: Detect system-wide keyDown events? <-- slightly OT as this is just to do with global monitoring, i.e. read-only.
http://www.codeitive.com/7QHSeVjPWq/where-can-i-find-kcgkeyboardeventkeycode-list-of-key-codes-for-cyrillic-language.html
Have you checked out Karabiner? It does everything you want to do, up to OS X 10.11. macOS 10.12 changed the keyboard driver model, and the authors (mainly Tekezo) are still rewriting Karabiner to take account of the new model (this is as of Feb 2017).
Karabiner is open source, and you can download the code from GitHub and twiddle with it.
As part of the rewrite they have released Karabiner-Elements, which works on 10.12 Sierra but cannot yet do everything that Karabiner did.
Karabiner is very powerful, and I miss it greatly on 10.12.

Can AppleScript read mouse position / action in one application and replicate it in another application?

I've been searching for a mouse broadcaster for Mac for a while and it seems there are no solutions for doing this, so I must look for alternative solutions now. I'm wondering if AppleScript is capable of performing such a task. Basically, what I would like to do is read mouse position and action when performed in one application for as long as the script is active, and broadcast/replicate it in one or more other applications. Is AppleScript capable of this?
Just to clarify, I'd need to simulate mouse movement in the other applications... for example, if I opened up several instances of a drawing program, assuming that the program had the same resolution, anything I drew in the main program, would replicate on the other programs.
Really, AppleScript cannot do what you need; it's not made for that. AppleScript is made to run the commands in an application's AppleScript dictionary, and I assume the dictionaries of the applications you want to control give you no way to read and control the mouse.
You do have an AppleScript alternative, though. I have made a command line tool to read the mouse position and also to move the mouse, so theoretically you can do what you want with AppleScript and my tool. I do not believe you will get the results you expect, though. Anyway, you can try. Here's a link to the web page for my tool. I hope it helps.
Get it here.
Your basic approach could be: 1) activate the application whose mouse position you want to read, 2) run my tool in a repeat loop and record the mouse positions, 3) activate the second application in which you want to duplicate the mouse movements, 4) use a repeat loop with my tool to move the mouse according to what you recorded.
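The linked tool itself isn't shown here, but for the record, reading and replaying the mouse position from native code is only a few lines of Quartz Event Services (posting synthetic events needs Accessibility permission on current systems; the +100 pixel offset below is just an illustration):

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    int main(void) {
        // Read the current mouse position.
        CGEventRef sample = CGEventCreate(NULL);
        CGPoint where = CGEventGetLocation(sample);
        CFRelease(sample);
        NSLog(@"mouse is at %.0f, %.0f", where.x, where.y);

        // Replay a position by posting a synthetic mouse-moved event.
        CGEventRef move = CGEventCreateMouseEvent(NULL, kCGEventMouseMoved,
                                                  CGPointMake(where.x + 100.0, where.y),
                                                  kCGMouseButtonLeft);
        CGEventPost(kCGHIDEventTap, move);
        CFRelease(move);
        return 0;
    }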

No, really. What is the proper way to handle keyboard input in a game using Cocoa?

Let's say you're creating a game for Mac OS X. In fact, let's say you're creating Quake, only it's 2011 and you'd prefer to only use modern, non-deprecated frameworks.
You want your game to be notified when the user presses (or releases) a key, any key, on the keyboard. This includes modifier keys, like shift and control. Edited to add: Also, you want to know if the left or right version of a modifier key was pressed.
You also want your game to have a config screen, where the user can inspect and modify the keyboard config. It should contain things like:
Move forward: W
Jump: SPACE
Fire: LCTRL
What do you do? I've been trying to find a good answer to this for a day or so now, but haven't succeeded.
This is what I've come up with:
Subclass NSResponder, implement keyUp: and keyDown:, like in this answer. A problem with this approach, is that keyUp: and keyDown: won't get called when the user presses only a modifier key. To work around that, you can implement flagsChanged:, but that feels like a hack.
Use a Quartz Event Tap. This only works if the app runs as root, or the user has enabled access for assistive devices. Also, modifier key events still do not count as regular key events.
Use the HIToolbox. There is virtually no mention at all of it in the 10.6 developer docs. It appears to be very, very deprecated.
So, what's the proper way to do this? This really feels like a problem that should have a well-known, well-documented solution. It's not like games are incredibly niche.
As others have said, there’s nothing wrong with using -flagsChanged:. There is another option: use the IOKit HID API. You should be using this anyway for joystick/gamepad input, and arguably mouse input; it may or may not be convenient for keyboard input too, depending on what you’re doing.
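To make the -flagsChanged: option concrete, here is a sketch that distinguishes left from right shift by keycode (modern flag names shown; the kVK_* constants come from Carbon's HIToolbox headers, and deciding down vs. up from the coarse shift flag is imperfect when both shift keys are held at once):

    #import <Cocoa/Cocoa.h>
    #import <Carbon/Carbon.h>   // for kVK_Shift, kVK_RightShift, kVK_Control

    @interface GameView : NSView
    @end

    @implementation GameView

    - (BOOL)acceptsFirstResponder {
        return YES;
    }

    // Modifier keys never reach keyDown:/keyUp:; they arrive here instead.
    // The keycode still identifies the physical key, so left/right variants differ.
    - (void)flagsChanged:(NSEvent *)event {
        NSEventModifierFlags flags = [event modifierFlags];
        switch ([event keyCode]) {
            case kVK_Shift:        // left shift
                NSLog(@"left shift %@",   (flags & NSEventModifierFlagShift)   ? @"down" : @"up");
                break;
            case kVK_RightShift:
                NSLog(@"right shift %@",  (flags & NSEventModifierFlagShift)   ? @"down" : @"up");
                break;
            case kVK_Control:      // left control
                NSLog(@"left control %@", (flags & NSEventModifierFlagControl) ? @"down" : @"up");
                break;
            default:
                break;
        }
    }

    @end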
This looks promising:
+[NSEvent addLocalMonitorForEventsMatchingMask:handler:]
Seems to be new in 10.6 and sounds just like what you're looking for. More here:
http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/ApplicationKit/Classes/NSEvent_Class/Reference/Reference.html%23//apple_ref/occ/clm/NSEvent/addLocalMonitorForEventsMatchingMask:handler:
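A short sketch of that monitor in use (modern NSEventMask/NSEventType names shown; the 10.6-era docs spell them NSKeyDownMask and so on):

    #import <Cocoa/Cocoa.h>

    @interface InputMonitor : NSObject
    @end

    @implementation InputMonitor {
        id _monitor;
    }

    - (void)start {
        NSEventMask mask = NSEventMaskKeyDown | NSEventMaskKeyUp | NSEventMaskFlagsChanged;
        _monitor = [NSEvent addLocalMonitorForEventsMatchingMask:mask
                                                         handler:^NSEvent *(NSEvent *event) {
            switch ([event type]) {
                case NSEventTypeKeyDown:
                    NSLog(@"key down %d", (int)[event keyCode]);
                    break;
                case NSEventTypeKeyUp:
                    NSLog(@"key up %d", (int)[event keyCode]);
                    break;
                case NSEventTypeFlagsChanged:
                    // Modifiers (including left vs. right shift) show up here.
                    NSLog(@"flags changed, keyCode %d", (int)[event keyCode]);
                    break;
                default:
                    break;
            }
            return event;   // return nil to swallow the event within your app
        }];
    }

    @end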
