Background:
I have a Bluetooth gamepad which I can discover and connect to through Xcode. I can communicate with it through the Control and Interrupt channels, so I can find out which buttons on the gamepad are being pressed and then map each one to a keyboard button.
However, most recent games look for an actual gamepad or joystick to enable multiplayer, since only one player is allowed to play on the keyboard and mouse. Because my gamepad currently only simulates keyboard buttons, the game does not recognize a gamepad and will not unlock multiplayer mode.
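The button-reading half of this (decoding the Interrupt-channel report and mapping buttons to keyboard codes) can be sketched as plain logic. The report layout and button names below are assumptions for illustration; a real device's layout comes from its HID descriptor. The key codes are the real Mac virtual-key codes from Carbon's Events.h.

```python
# Sketch: decode a (hypothetical) interrupt-channel report and map
# gamepad buttons to Mac virtual-key codes. The byte-1 bitmask layout
# is an assumption -- real layouts come from the device's HID descriptor.

# Mac virtual-key codes (from Carbon's Events.h): A, S, D, Space
KEY_MAP = {
    "A_BUTTON": 0x00,   # kVK_ANSI_A
    "B_BUTTON": 0x01,   # kVK_ANSI_S
    "X_BUTTON": 0x02,   # kVK_ANSI_D
    "START":    0x31,   # kVK_Space
}

BUTTON_BITS = ["A_BUTTON", "B_BUTTON", "X_BUTTON", "START"]  # assumed bit order

def pressed_buttons(report):
    """Return the buttons set in the (assumed) bitmask at byte 1."""
    mask = report[1]
    return [name for bit, name in enumerate(BUTTON_BITS) if mask & (1 << bit)]

def keycodes_for(report):
    """Translate pressed buttons into the virtual-key codes to post."""
    return [KEY_MAP[b] for b in pressed_buttons(report)]

# Example: report with A_BUTTON (bit 0) and START (bit 3) held down
print(keycodes_for(bytes([0xA1, 0b1001])))  # [0, 49]
```

Posting the resulting key codes is what the game sees as keyboard input, which is exactly why it never detects a gamepad; making the system see a joystick requires presenting a gamepad-class HID device instead, which is the question below.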
Question:
I have no idea where to look, but is there a way I can simulate a gamepad so the Mac actually recognizes it as a joystick/gamepad and I can map the buttons accordingly?
I am coding in Xcode 4.6.1 for a Mac OS X "Mountain Lion" app.
Related
I'm looking for the Mac OS API that virtual machine or remote desktop type programs would use to "capture" the mouse and keyboard. That is to say, I want to write a GUI program where when the user clicks in my window, the normal mouse cursor disappears, and all keyboard and mouse input is diverted to my program, including global shortcuts like cmd-tab. What's the name of this API?
Found it: CGEventTapCreate can tap into the low level event stream to receive, filter, or insert HID events.
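The contract of that API is a callback that receives each event and either returns it (pass through), returns a modified event, or returns NULL to swallow it. Here is that control flow sketched in plain Python; the event dicts and helper names are illustrative stand-ins, not the real CGEvent API.

```python
# Sketch of the CGEventTapCreate callback contract: the callback gets
# each event and returns it to pass it on, or None (NULL in C) to
# swallow it. Dicts stand in for CGEventRef values.

CAPTURING = True  # i.e. the user has clicked inside our window

diverted = []
def handle_in_app(event):
    diverted.append(event)  # our program consumes the input

def tap_callback(event):
    """Swallow everything while capturing, like a VM grabbing input."""
    if CAPTURING:
        handle_in_app(event)   # divert to our program
        return None            # returning NULL removes it from the stream
    return event               # pass through untouched

stream = [
    {"type": "keyDown", "keycode": 48, "flags": ["cmd"]},  # cmd-tab
    {"type": "mouseMoved", "dx": 3, "dy": -1},
]
delivered = [e for e in (tap_callback(e) for e in stream) if e is not None]
print(len(delivered), len(diverted))  # 0 2
```

Note that swallowing cmd-tab requires the tap to be installed at a session level with sufficient privileges; an ordinary in-app key handler never sees it.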
I have a Windows tablet in my car. It is connected to the car via Bluetooth, so I can skip tracks, change volume, play/pause, and so on using my steering wheel. I want to fire an event depending on which button was pressed.
So far I've tried intercepting inputs and keypresses globally using:
NodeJS (using a library called iohook)
Listens for keypresses on the system, so I can always listen for keypresses (basically a keylogger)
Works great if I hit play/pause on the keyboard
Does not work if I hit play/pause over Bluetooth using a speaker or my car's steering wheel
.NET Windows Forms
"Keylogger" works great, but same problem with NodeJS: It does not listen for "keypresses" from a Bluetooth device
I don't care which language I have to use. I know that it's possible, because if I have Spotify running on my Windows machine and I hit play/pause on my speaker or on my car's steering wheel, it will pause Spotify. So something is definitely listening to the inputs, but I haven't figured out what.
One very important thing: It has to listen globally for the Bluetooth events. I do not want to have an application focused.
What language can I use, that can listen for these events? Are there frameworks or libraries out there, that can do this easily already?
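For context on why the keyloggers miss these: Bluetooth AVRCP buttons on Windows generally don't arrive as ordinary key events but as app commands (the same path media keyboards use, e.g. WM_APPCOMMAND), which is plausibly how Spotify reacts without focus. The dispatch side can be sketched like this; the APPCOMMAND_* constants and the lParam extraction are the real winuser.h definitions, while the handler wiring is illustrative.

```python
# APPCOMMAND_* values from winuser.h; WM_APPCOMMAND packs the command
# into the high word of lParam alongside FAC_* device flags.
APPCOMMAND_MEDIA_NEXTTRACK     = 11
APPCOMMAND_MEDIA_PREVIOUSTRACK = 12
APPCOMMAND_MEDIA_STOP          = 13
APPCOMMAND_MEDIA_PLAY_PAUSE    = 14

def command_from_lparam(lparam):
    """GET_APPCOMMAND_LPARAM: high word of lParam with FAPPCOMMAND_MASK
    (0xF000, the device flags) stripped off."""
    return (lparam >> 16) & 0x0FFF

HANDLERS = {
    APPCOMMAND_MEDIA_PLAY_PAUSE:    lambda: "toggle playback",
    APPCOMMAND_MEDIA_NEXTTRACK:     lambda: "next track",
    APPCOMMAND_MEDIA_PREVIOUSTRACK: lambda: "previous track",
}

def on_appcommand(lparam):
    """Dispatch a WM_APPCOMMAND to the matching action, if any."""
    handler = HANDLERS.get(command_from_lparam(lparam))
    return handler() if handler else None

# Simulate the steering wheel's play/pause: command 14 in the high word
print(on_appcommand(14 << 16))  # toggle playback
```

To actually receive these in any language with Win32 access, one route worth trying is a (hidden) window whose window procedure handles WM_APPCOMMAND (0x0319), or RegisterHotKey with VK_MEDIA_PLAY_PAUSE for a focus-free registration; both are real Win32 APIs, though which one a given Bluetooth stack feeds is something to verify on your setup.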
Currently I'm working on my first project with Xamarin.Forms and Android. I have a Bluetooth barcode scanner paired with the Android device, so the soft keyboard doesn't appear while it's connected. I have tried many options that I found on the internet, such as forcing the soft keyboard from a CustomEntryRenderer and other places. On some phones this can be handled in the Language/Input Methods menu in Android, but not on all of them. Maybe I'm missing something and am calling the code from the wrong place. The question is: how do I show the soft keyboard even when a Bluetooth keyboard is connected?
// Grab the InputMethodManager from the native control's context
InputMethodManager inputMethodManager = this.Control.Context.GetSystemService(Context.InputMethodService) as InputMethodManager;
// Request the keyboard for this control, then toggle it in case the
// system suppressed it because a hardware keyboard (the scanner) is attached
inputMethodManager.ShowSoftInput(this.Control, ShowFlags.Forced);
inputMethodManager.ToggleSoftInput(ShowFlags.Forced, HideSoftInputFlags.ImplicitOnly);
On my device, there are keyboard-related settings, one of which is "always show onscreen keyboard while a physical keyboard is connected". Switch this to "ON".
Apparently the default is to treat the scanner not as "one of the keyboards" but as "the only one", so an on-screen keyboard isn't needed. That makes sense for a real keyboard, but not for a scanner.
I'm currently building a kiosk program with AS3 + airkinect, so the whole interface is driven by the Kinect plus one external button that arrives as keyboard input. Somehow the .exe I've built loses window focus after about an hour, which means the keyboard listener stops listening. I can detect an Event.DEACTIVATE; how can I make sure the keyboard listener always works?
I currently have a program written in Cocoa and I would like it to have an onscreen keyboard as I am thinking of using a touch-screen monitor and would like to not have a keyboard for this particular piece of software.
I know there is an onscreen keyboard in Cocoa Touch, but as far as I am aware that can only be used on the iPhone, iPod touch, and iPad.
Is there any way I can use it in a regular Cocoa application?
Thanks
UIKit isn't part of Mac OS X, unfortunately, so you can't use anything from it. There is an onscreen keyboard that you can enable in the system's Language & Text preference pane, but I don't know how you'd do it programmatically. If this is a major requirement for your system, though, it would probably be better to roll your own. OS X is not really designed for touchscreens and has just recently gotten the most rudimentary support for touch-y interaction (thanks to the Magic Mouse and the MacBooks' trackpad).
Unfortunately, the on-screen keyboard is part of UIKit, which is only available on iOS. The only (hacky) option I'm aware of would be to run your app in the iPad simulator. Not ideal, for sure, and I'm not sure how the simulator would handle a multi-touch-capable display.