I am about to implement simple keyboard and mouse input on OS X for my engine. I want to abstract the implementation behind more generic C++ classes like Keyboard and Mouse, plus appropriate Listeners, for portability. I came across the Leopard HID API (http://developer.apple.com/mac/library/technotes/tn2007/tn2187.html#TNTAG9000-SAMPLE_CODE_), which seems to be the right way to go for the OS X implementation of these classes. However, the HID examples are very complex and I can't wrap my head around them as fast as I wished I could, so I was wondering if anybody has already used it to get some basic mouse and keyboard input, or knows about some good examples/resources online.
Or maybe even a totally different way to go?
thanks
I want to port a Windows program I'm working on to Linux. It uses D3D11 exclusively to draw its output, so I'll need to write an OpenGL alternative. That's fine. But I also need to create a window and handle a few basic operations: resizing, setting fullscreen, getting notified of user keyboard and mouse input, and receiving close notifications.
I won't require any child windows, or controls as everything is drawn by opengl.
So what is an appropriate way to do this? I looked at raw Xlib, but it seems quite low level. I'm prepared to learn it, but all the examples seem really old, so I'm not sure if it's still the best way. Plus, will doing that work with whatever desktop environment the user has (KDE, GNOME, etc.)?
I could use Qt, KDE, etc., but they are far, far more sophisticated than I need for this, so they'd introduce a large dependency that I'd rather avoid if possible.
So, is Xlib an appropriate technology for this, or is there some other lightweight library I'm not aware of?
Look at SDL. It's pretty much the standard choice for applications like yours.
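To make that suggestion concrete, here is a minimal sketch of what the window-plus-input portion looks like with SDL2 (assuming the SDL2 development headers and an OpenGL driver are installed; the window title and size are placeholders). It covers everything the question asks for: window creation, resize and close notifications, keyboard input, and fullscreen toggling.

```c
/* Minimal SDL2 sketch: an OpenGL-capable, resizable window with a
 * basic event loop. Requires the SDL2 development package and a
 * display; not runnable headless. */
#include <SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_Window *win = SDL_CreateWindow("Engine Window",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1280, 720, SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
    SDL_GLContext gl = SDL_GL_CreateContext(win);

    int running = 1;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e)) {
            switch (e.type) {
            case SDL_QUIT:              /* close notification */
                running = 0;
                break;
            case SDL_KEYDOWN:           /* keyboard input */
                if (e.key.keysym.sym == SDLK_ESCAPE)
                    running = 0;
                else if (e.key.keysym.sym == SDLK_f)
                    SDL_SetWindowFullscreen(win,
                        SDL_WINDOW_FULLSCREEN_DESKTOP);
                break;
            case SDL_WINDOWEVENT:       /* resize notification */
                if (e.window.event == SDL_WINDOWEVENT_RESIZED) {
                    /* e.window.data1 / e.window.data2 = new size */
                }
                break;
            }
        }
        /* ... OpenGL drawing goes here ... */
        SDL_GL_SwapWindow(win);
    }

    SDL_GL_DeleteContext(gl);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

Mouse input arrives the same way, as SDL_MOUSEMOTION and SDL_MOUSEBUTTONDOWN/UP events in the same loop.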
It looks like GLUT could suit your needs.
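For comparison, the same shape of program in GLUT is callback-driven rather than loop-driven. A minimal sketch (freeglut or classic GLUT; the header path is the Linux one, and the window title is a placeholder):

```c
/* Minimal GLUT sketch: one window with keyboard, mouse, and resize
 * callbacks; everything else is drawn with plain OpenGL. Requires a
 * GLUT implementation and a display. */
#include <GL/glut.h>
#include <stdlib.h>

static void display(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    /* ... OpenGL drawing goes here ... */
    glutSwapBuffers();
}

static void keyboard(unsigned char key, int x, int y) {
    if (key == 27)      /* Escape quits */
        exit(0);
}

static void mouse(int button, int state, int x, int y) {
    /* button clicks arrive here; glutMotionFunc reports drags */
}

static void reshape(int w, int h) {
    glViewport(0, 0, w, h);   /* resize notification */
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(1280, 720);
    glutCreateWindow("GL window");
    glutDisplayFunc(display);
    glutKeyboardFunc(keyboard);
    glutMouseFunc(mouse);
    glutReshapeFunc(reshape);
    glutMainLoop();   /* never returns */
    return 0;
}
```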
I'm working on a couple of audio plugins. Right now, they are Audio Units. And while the "DSP" code won't change for the most part between implementations/ports, I'm not sure how to go about the GUI.
For instance, I was looking at the Apple-supplied AUs in Lion. Does anyone know how they went about the UI? Are the knobs and controls just subclasses of Cocoa controls? Are they using some separate framework, or coding these knobs and such from scratch?
And then, the plugins I'm working on are also going to be available as VSTs for Windows. I already have them up and running with generic interfaces. But I'm wondering if I should just get over it and recreate all my interfaces with the VSTGUI code provided by Steinberg, or if there's a more practical approach to making the interfaces cross-platform.
VSTGUI is not very much fun to work with, especially as your interface gets to be more complicated. The source is a mess and you end up with a very hardcoded GUI, which becomes difficult to refactor.
I'd recommend checking out Juce, which includes a nice GUI builder. If your DSP code is well modularized, switching to its architecture won't be so painful. As an added bonus, it will make the x-platform (where "platform" means both OS and underlying plugin platform) jumps a bit easier for you.
I am currently trying to get simple keyboard input on OS X. Right now I am doing it through the Leopard HID Manager, and that generally works, but since that is pretty low level, I am wondering if there is an API available that has some extra functionality built in, like key repetition or Unicode support (when I catch the events at the HID I/O level, I think I have to write all these fancy extras from scratch). I know Carbon event handlers (NewEventHandlerUPP) are capable of that, but I am pretty sure they are deprecated, since you can't find anything about them in the current OS X reference, and I don't want to use anything deprecated. So I am wondering if there is any alternative I didn't come across during my search!
Thanks!
No.
At the Unicode level, the official API for receiving input is the NSTextInputClient protocol in Objective-C, and the official API for processing input between the keyboard and the program is the Input Method Kit.
And you can never write a sufficiently fancy extra correctly from scratch: you need to read the user's international keyboard settings and modify the key obtained accordingly, and you can never write an input method from scratch that turns raw key input into Chinese or Japanese text.
So, I think the sane choices are either
Just get the raw ASCII data from the keyboard and don't aim for more, or
Use Cocoa at least around the key input handling, to get additional features.
I'm looking for application-wide access to raw keyboard events in OS X, using either the Cocoa or Carbon frameworks (or any of the underlying APIs, for that matter). I know that I can override NSApplication's sendEvent: to get raw keyboard information, but the meta keys (Command, Control, Alternate, Shift, etc.) don't show up as keystroke events. I'm looking for something analogous to Microsoft's DirectInput framework.
Thanks!
I think the equivalent to DirectInput is HID Manager. HID stands for "human interface device" and HID Manager (sometimes called HIDLib) is the low-level API to HIDs: keyboards, mice, and joysticks.
Leopard's got a new HID Manager API, documented in Technical Note TN2187. The pre-Leopard API is documented in the HID Class Device Interface Guide. I wrote an Objective-C wrapper around the older APIs, DDHidLib, which you may find useful. The Leopard API is much nicer; I'd use that directly, if you can.
The Core Graphics framework also has some useful functionality buried in it as part of the remote operation system. Look for CGRemoteOperation.h, and check out the Quartz Events reference.
You can use the Quartz Events system to install application-specific or system-wide "event taps", which let you monitor and inject keyboard and mouse events at a pretty low level. A few years ago there were some bugs with application-specific event taps, but they've hopefully been worked out by now.
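As a sketch of what an event tap looks like (macOS-only code against the Quartz Events API; it must be linked against ApplicationServices, and modern systems additionally require accessibility permission for the process):

```c
/* macOS-only sketch: a session-wide, listen-only keyboard event tap.
 * Note that modifier (meta) key changes arrive as kCGEventFlagsChanged,
 * not as key-down events -- which addresses the question above. */
#include <ApplicationServices/ApplicationServices.h>

static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                              CGEventRef event, void *refcon) {
    if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
        int64_t keycode =
            CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        (void)keycode;              /* handle the key here */
    } else if (type == kCGEventFlagsChanged) {
        CGEventFlags flags = CGEventGetFlags(event);
        (void)flags;                /* current modifier state */
    }
    return event;                   /* pass the event through */
}

int main(void) {
    CGEventMask mask = CGEventMaskBit(kCGEventKeyDown) |
                       CGEventMaskBit(kCGEventKeyUp)   |
                       CGEventMaskBit(kCGEventFlagsChanged);
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
        kCGHeadInsertEventTap, kCGEventTapOptionListenOnly,
        mask, tapCallback, NULL);
    if (!tap)
        return 1;                   /* likely a permissions problem */

    CFRunLoopSourceRef src =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRunLoopRun();
    return 0;
}
```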
I think the HID stuff is mostly for driver development, so if you're just looking for a tool for your application, HID is probably overkill.
NSResponder-derived classes have a -(void)flagsChanged: method that gets called whenever the modifier (meta) keys change state.
You can use the RegisterEventHotKey Carbon function; I'm not sure if you can register for any of the meta keys explicitly, though.
The blog post Program Global Hotkeys in Cocoa Easily explains how to do this.
I thought I was a decent programmer until I tried writing gamepad code for OS X. Now I feel deeply useless.
Does anyone know of any code that I can legally use in my (non-free) game?
Is it really this hard to talk to a gamepad on OS X? What am I missing?
Check out the HID Manager, especially the new HID Manager APIs in Leopard. It's somewhat verbose, but the essence of it is that you can get callbacks when devices are attached and detached, and get callbacks when events from those devices are enqueued.
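The callback flow described above looks roughly like this (a macOS-only sketch of the Leopard IOHIDManager API from TN2187; a real engine would pass a matching dictionary for specific usage pages instead of NULL):

```c
/* macOS-only sketch: create an IOHIDManager, match every HID device,
 * and receive a callback for each input value (button press, axis
 * motion, key, etc.). Link against the IOKit framework. */
#include <IOKit/hid/IOHIDLib.h>

static void inputCallback(void *context, IOReturn result,
                          void *sender, IOHIDValueRef value) {
    IOHIDElementRef elem = IOHIDValueGetElement(value);
    uint32_t usagePage = IOHIDElementGetUsagePage(elem);
    uint32_t usage     = IOHIDElementGetUsage(elem);
    CFIndex  intValue  = IOHIDValueGetIntegerValue(value);
    /* usagePage 0x09 = buttons, 0x01 = generic desktop (axes),
     * 0x07 = keyboard; usage identifies the element within the page */
    (void)usagePage; (void)usage; (void)intValue;
}

int main(void) {
    IOHIDManagerRef mgr =
        IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);

    /* NULL matching dictionary == match every HID device */
    IOHIDManagerSetDeviceMatching(mgr, NULL);
    IOHIDManagerRegisterInputValueCallback(mgr, inputCallback, NULL);
    IOHIDManagerScheduleWithRunLoop(mgr, CFRunLoopGetCurrent(),
                                    kCFRunLoopDefaultMode);
    IOHIDManagerOpen(mgr, kIOHIDOptionsTypeNone);
    CFRunLoopRun();
    return 0;
}
```

Device attach/detach notifications come the same way, via IOHIDManagerRegisterDeviceMatchingCallback and IOHIDManagerRegisterDeviceRemovalCallback.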
If you're working with Cocoa, Dave Dribin has DDHidLib which provides a nicer Objective-C API atop the HID Manager, and runs on Tiger as well.
Turns out the answer was Apple's HID_Utilities, which (somewhat) simplifies the job of talking to HID Manager.
John Carmack really hit the nail on the head when he said that Apple don't care about games...
The quickest way to get gamepad events on OS X is to use SDL, the game library.
You don't have to use the whole library, you can just init the joystick subsystem
and then poll or wait for SDL_JOYAXISMOTION and SDL_JOYBUTTONUP/DOWN events.
SDL has an LGPL license, so you can dynamically link to it in your non-free game.
Easy!
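The joystick-only usage described above can be sketched like this (SDL2 API; it assumes a gamepad is attached as device 0 and a terminal for the printouts):

```c
/* SDL2 sketch: initialize only the joystick subsystem and wait for
 * the axis/button events named in the answer. Requires the SDL2
 * development package and an attached gamepad. */
#include <SDL.h>
#include <stdio.h>

int main(void) {
    if (SDL_Init(SDL_INIT_JOYSTICK) != 0)
        return 1;
    SDL_JoystickEventState(SDL_ENABLE);

    SDL_Joystick *pad = SDL_JoystickOpen(0);  /* first attached pad */
    if (!pad) {
        SDL_Quit();
        return 1;
    }

    SDL_Event e;
    while (SDL_WaitEvent(&e)) {
        switch (e.type) {
        case SDL_JOYAXISMOTION:
            printf("axis %d -> %d\n", e.jaxis.axis, e.jaxis.value);
            break;
        case SDL_JOYBUTTONDOWN:
        case SDL_JOYBUTTONUP:
            printf("button %d %s\n", e.jbutton.button,
                   e.type == SDL_JOYBUTTONDOWN ? "down" : "up");
            break;
        case SDL_QUIT:
            SDL_JoystickClose(pad);
            SDL_Quit();
            return 0;
        }
    }
    return 0;
}
```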
No code, but communicating with gamepads and the like is pretty straightforward with the InputSprocket mechanism. What was the precise problem you had?