RAWINPUT provides two flags (RI_KEY_E0 and RI_KEY_E1) to check whether the left or right version of a key is pressed. This works great for CTRL, but not for left and right shift. In fact, the flags are the same for both, and the VKey is also the same (VK_SHIFT). How can I find out which shift was pressed? I'm working on Windows 7. Interestingly, the flags/vkey values are exactly the same no matter which shift key I'm pressing.
Windows 7, and I only get VK_SHIFT, never the L/R variants
Which is part of the explanation why this doesn't work the way you think it should. There's ancient history behind this. The keyboard controller was redesigned for the IBM AT, and again for the Enhanced keyboard. It started sending out 0xE0 and 0xE1 prefixes for keys that were added to the keyboard layout, like the right Ctrl and Alt keys.
But keyboards have always had two shift keys. The original IBM PC didn't consider them special keys; they simply have different scan codes, and that was maintained in later updates. Accordingly, you don't get the RI_KEY_E0 or RI_KEY_E1 flags for them. You have to distinguish them by the RAWKEYBOARD.MakeCode value: the left shift key has make code 0x2A, the right one 0x36.
Note that the left Ctrl and Alt keys don't have the flags either. They match the corresponding keys on the old PC keyboard layout. The description of the flags in the MSDN Library article is not very accurate.
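For illustration, a minimal sketch of that check inside a WM_INPUT handler, assuming `keyboard` is the RAWKEYBOARD structure filled in by GetRawInputData:
// Distinguish the two shift keys by make code (0x2A vs 0x36)
USHORT vk = keyboard.VKey;
if (vk == VK_SHIFT)
    vk = (keyboard.MakeCode == 0x36) ? VK_RSHIFT : VK_LSHIFT;   // 0x2A = left shift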
You can distinguish left-right SHIFT/CONTROL/ALT VK codes like this:
case WM_INPUT:
{
    HRAWINPUT dataHandle = reinterpret_cast<HRAWINPUT>(lParam);
    RAWINPUT input;
    UINT size = sizeof(input);
    ::GetRawInputData(dataHandle, RID_INPUT, &input, &size, sizeof(RAWINPUTHEADER));
    if (input.header.dwType != RIM_TYPEKEYBOARD)
        break;

    const RAWKEYBOARD& keyboard = input.data.keyboard;

    // Ignore key overrun state
    if (keyboard.MakeCode == KEYBOARD_OVERRUN_MAKE_CODE)
        return 0;

    // Ignore keys not mapped to any VK code.
    // This effectively filters out the scan code pre/postfix for some keys like PrintScreen.
    if (keyboard.VKey >= 0xff /*VK__none_*/)
        return 0;

    uint16_t scanCode = keyboard.MakeCode;
    // Scan codes can carry a one-byte 0xE0 or 0xE1 prefix.
    // See https://download.microsoft.com/download/1/6/1/161ba512-40e2-4cc9-843a-923143f3456c/translate.pdf
    scanCode |= (keyboard.Flags & RI_KEY_E0) ? 0xe000 : 0;
    scanCode |= (keyboard.Flags & RI_KEY_E1) ? 0xe100 : 0;

    uint16_t vkCode = keyboard.VKey;
    switch (vkCode)
    {
    case VK_SHIFT:   // -> VK_LSHIFT or VK_RSHIFT
    case VK_CONTROL: // -> VK_LCONTROL or VK_RCONTROL
    case VK_MENU:    // -> VK_LMENU or VK_RMENU
        vkCode = LOWORD(MapVirtualKeyW(scanCode, MAPVK_VSC_TO_VK_EX));
        break;
    }
    //...
    return 0;
}
This code should work on Windows Vista and later.
But please note that gamedev programmers usually map scan codes manually to engine-specific keycodes, because VK codes tend to change with the keyboard layout. For example, if you use the usual VK_W/VK_A/VK_S/VK_D for movement on a QWERTY layout, it turns into VK_Z/VK_Q/VK_S/VK_D on an AZERTY layout. VK codes are primarily handy in Win32 GUI programming.
You can grab a decent scancode <-> USB HID Usage conversion table here: https://source.chromium.org/chromium/chromium/src/+/main:ui/events/keycodes/dom/dom_code_data.inc
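To make that concrete, here is an illustrative slice of such a table; EngineKey and kScanCodeToKey are made-up names, and the make codes are the standard set-1 values for the physical W/A/S/D positions:
#include <cstdint>

// Illustrative only: map hardware scan codes to engine-specific actions,
// independent of the active keyboard layout.
enum class EngineKey { Unknown, MoveForward, MoveLeft, MoveBack, MoveRight };

struct ScanMapping { uint16_t scanCode; EngineKey key; };

constexpr ScanMapping kScanCodeToKey[] = {
    { 0x11, EngineKey::MoveForward },   // physical "W" position
    { 0x1E, EngineKey::MoveLeft    },   // physical "A" position
    { 0x1F, EngineKey::MoveBack    },   // physical "S" position
    { 0x20, EngineKey::MoveRight   },   // physical "D" position
};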
Related
I am developing an application to detect the keyboard type on macOS.
I have seen several functions which, according to the documentation, are supposed to return a keyboard id.
However, when I test them on my laptop they always print 59.
Can someone tell me where this 59 value comes from and what it means?
So far I have tried with the on-screen keyboard and the built-in keyboard. I have also tried different layouts, but I keep getting 59.
This is my code:
- (CGEventRef)processEvent:(CGEventRef)cgEvent
{
    uint32_t kbdType = LMGetKbdType();
    NSLog(@"Testing LMGetKbdType ----------> %d", kbdType);

    NSEvent* event = [NSEvent eventWithCGEvent:cgEvent];
    NSEventType type = [event type];
    if (type == NSKeyDown || type == NSKeyUp) {
        int64_t val = CGEventGetIntegerValueField(cgEvent, kCGKeyboardEventKeyboardType);
        NSLog(@"CGEventGetIntegerValueField: %lld", val);

        EventRef ce = (EventRef)[event eventRef];
        if (ce) {
            unsigned kbt;
            GetEventParameter(
                ce,
                kEventParamKeyboardType,
                typeUInt32, NULL,
                sizeof kbt, NULL,
                &kbt
            );
            NSLog(@"CARBON Keyboard type: %d", kbt);
        }

        CGEventSourceRef evSrc = CGEventCreateSourceFromEvent(cgEvent);
        if (evSrc) {
            unsigned kbt = (NSUInteger)CGEventSourceGetKeyboardType(evSrc);
            CFRelease(evSrc);
            NSLog(@"COCOA: %d", kbt);
        }
    }
    return cgEvent; // pass the event through unchanged
}
I think these are undocumented values with no external meaning. They are only useful for passing back into other APIs that need a keyboard type (e.g. UCKeyTranslate()).
I think that they are of the same kind that used to be documented in <CoreServices/CarbonCore/Gestalt.h>, under gestaltKeyboardType. However, that header is no longer being updated and doesn't list a type 59.
What exactly are you trying to figure out about the keyboard? If it's general layout, you can use KBGetLayoutType() to learn if it's ANSI, JIS, or ISO. You pass in the keyboard type, like the one you're getting from LMGetKbdType().
The active keyboard layout (e.g. U.S. vs. French vs. Dvorak) should not affect the keyboard type. The keyboard type is an aspect of the hardware and doesn't change as the layout (the interpretation of keys into characters) changes.
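If the general physical layout is all you need, a small sketch of that KBGetLayoutType() call might look like this; DescribeKeyboard is just an illustrative helper name:
#include <Carbon/Carbon.h>   // KBGetLayoutType, LMGetKbdType

// Illustrative helper: classify the physical keyboard as ANSI / ISO / JIS.
const char *DescribeKeyboard(void)
{
    OSType layout = KBGetLayoutType((SInt16)LMGetKbdType());
    switch (layout) {
        case kKeyboardANSI: return "ANSI (US-style)";
        case kKeyboardISO:  return "ISO (European-style)";
        case kKeyboardJIS:  return "JIS (Japanese)";
        default:            return "unknown";
    }
}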
In an MFC application, inside PreTranslateMessage(MSG *pMsg) of a class derived from CView, I have this:
if (pMsg->message == WM_KEYDOWN) ...
The fields in a WM_KEYDOWN message are documented here. The virtual key VK_ value is in pMsg->wParam, and pMsg->lParam contains several fields, of which bits 16-23 are the keyboard scan code.
So in my code I use:
const int virtualKey = pMsg->wParam;
const int hardwareScanCode = (pMsg->lParam >> 16) & 0x00ff; // bits 16-23
On my non-US keyboard for example, when I press the "#" character, I get the following:
virtualKey == 0xde --> VK_OEM_7 "Used for miscellaneous characters; it can vary by keyboard."
hardwareScanCode == 0x29 (41 decimal)
The character I'd like to "capture" or process differently is ASCII "#", 0x23 (35 decimal).
MY QUESTION
How do I translate the WM_KEYDOWN information to get something I can compare against, regardless of language or keyboard layout? I need to identify the # key whether the user has a standard US keyboard, or something different.
For example, I've been looking at functions such as:
MapVirtualKey(virtualkey, MAPVK_VSC_TO_VK);
// previous line is useless, the key VK_OEM_7 doesn't map to anything without the scan code
ToAscii(virtualKey, hardwareScanCode, nullptr, &word, 0);
// previous line returns zero, and zero is written to `word`
Edit:
Long story short: On a U.S. keyboard, SHIFT+3 = #, while on a French keyboard SHIFT+3 = /. So I don't want to look at individual keys, instead I want to know about the character.
When handling WM_KEYDOWN, how do I translate lParam and wParam (the "keys") to find out the character which the keyboard and Windows is about to generate?
I believe this is a better solution. This one was tested with both the standard U.S. keyboard layout and the Canadian-French keyboard layout.
const int wParam = pMsg->wParam;
const int lParam = pMsg->lParam;
const int keyboardScanCode = (lParam >> 16) & 0x00ff;
const int virtualKey = wParam;
BYTE keyboardState[256];
GetKeyboardState(keyboardState);
WORD ascii = 0;
const int len = ToAscii(virtualKey, keyboardScanCode, keyboardState, &ascii, 0);
if (len == 1 && ascii == '#')
{
// ...etc...
}
Even though the help page seems to hint that keyboardState is optional for the call to ToAscii(), I found that it was required with the character I was trying to detect.
Found the magic API call that gets me what I need: GetKeyNameText()
if (pMsg->message == WM_KEYDOWN)
{
char buffer[20];
const int len = GetKeyNameTextA(pMsg->lParam, buffer, sizeof(buffer));
if (len == 1 && buffer[0] == '#')
{
// ...etc...
}
}
Nope, that code only works on keyboard layouts that have an explicit '#' key. Doesn't work on layouts like the standard U.S. layout where '#' is a combination of other keys like SHIFT + 3.
I'm not an MFC expert, but here's roughly what I believe its message loop looks like:
while (::GetMessage(&msg, NULL, 0, 0) > 0) {
if (!app->PreTranslateMessage(&msg)) { // the hook you want to use
TranslateMessage(&msg); // where WM_CHAR messages are generated
DispatchMessage(&msg); // where the original message is dispatched
}
}
Suppose a U.S. user (for whom 3 and # are on the same key) presses that key.
The PreTranslateMessage hook will see the WM_KEYDOWN message.
If it allows the message to pass through, then TranslateMessage will generate a WM_CHAR message (or something from that family of messages) and dispatch it directly. PreTranslateMessage will never see the WM_CHAR.
Whether that WM_CHAR is a '3' or a '#' depends on the keyboard state, specifically whether a Shift key is currently pressed. But the WM_KEYDOWN message doesn't contain all the keyboard state. TranslateMessage keeps track of the state by taking notes on the keyboard messages that pass through it, so it knows whether the Shift (or Ctrl or Alt) is already down.
Then DispatchMessage will dispatch the original WM_KEYDOWN message.
If you want to catch only the '#' and not the '3's, then you have two options:
Make your PreTranslateMessage hook keep track of all the keyboard state (like TranslateMessage would normally do). It would have to watch for all of the keyboard messages to track the keyboard state and use that in conjunction with the keyboard layout to figure whether the current message would normally generate a '#'. You'd then have to manually dispatch the WM_KEYDOWN message and return TRUE (so that the normal translate/dispatch doesn't happen). You'd also have to be careful to also filter the corresponding WM_KEYUP messages so that you don't confuse TranslateMessage's internal state. That's a lot of work and lots to test.
Find a place to intercept the WM_CHAR messages that TranslateMessage generates.
For that second option, you could subclass the destination window, have it intercept WM_CHAR messages when the character is '#' and pass everything else through. That seems a lot simpler and well targeted.
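As a rough sketch of that second option using the Win32 SetWindowSubclass helper (HashCharSubclassProc and hTargetWnd are illustrative names; an MFC OnChar override on the target window would work just as well):
#include <windows.h>
#include <commctrl.h>
#pragma comment(lib, "comctl32.lib")

// Subclass procedure: swallow '#' WM_CHAR messages, pass everything else through.
LRESULT CALLBACK HashCharSubclassProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam,
                                      UINT_PTR /*idSubclass*/, DWORD_PTR /*refData*/)
{
    if (msg == WM_CHAR && wParam == '#')
    {
        // ...handle the '#' character here...
        return 0;   // eat it so the target window never sees it
    }
    return ::DefSubclassProc(hWnd, msg, wParam, lParam);
}

// Installed once on the destination window, e.g.:
//   ::SetWindowSubclass(hTargetWnd, HashCharSubclassProc, 1, 0);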
I'm a beginner using Arduino with a Teensy 3.2 board and programming it as a USB keyboard.
I have two 4-button membrane switches. Their button contacts are on pins 1-8, and the 9th pin holds a soldered-together wire of both membrane switches' "ground" line, or whatever its true name is; the line that completes the circuit.
Basically, when you press the buttons they are supposed to simply type "a, b, c..." respectively. I've been told I need to use a matrix for this.
I'm looking for an example of how to code a keyboard matrix that effectively supports a one row/9 column line (or vice versa?). I've been unable to find that solution online.
All I have so far is this code which, when the button on the second pin is pressed, sends tons of "AAAAAAAAAAAAAAAA" keystrokes.
void setup() {
// make pin 2 an input and turn on the
// pullup resistor so it goes high unless
// connected to ground:
pinMode(2, INPUT_PULLUP);
Keyboard.begin();
}
void loop() {
//if the button is pressed
if(digitalRead(2)==LOW){
//Send an ASCII 'A',
Keyboard.write(65);
}
}
Would anyone be able to help?
First of all, a 1-row keypad is NOT a matrix. Or rather, technically it can be considered a matrix, but... A real matrix keypad wires its switches into rows and columns, with one switch at each row/column intersection. In order to scan such a matrix you have to:
Pull Row1 to ground, while leaving rows 2-4 floating
Read the values of Col1-4. These are the values of switches 1-4
Pull Row2 to ground, while leaving rows 1 and 3-4 floating
Read the values of Col1-4. These are the values of switches 5-8
And so on, for all the rows (see the sketch after this list).
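Here is a rough sketch of that scan loop for a hypothetical 2x4 matrix; the pin numbers and key map are made up for illustration, and it still needs the edge detection and debouncing discussed next:
// Illustrative 2x4 matrix: row pins 2-3, column pins 4-7 (adjust to your wiring).
const byte rowPins[2] = {2, 3};
const byte colPins[4] = {4, 5, 6, 7};
const char keys[2][4] = { {'a', 'b', 'c', 'd'},
                          {'e', 'f', 'g', 'h'} };

void setup() {
  for (byte r = 0; r < 2; r++) pinMode(rowPins[r], INPUT);        // rows floating (inactive)
  for (byte c = 0; c < 4; c++) pinMode(colPins[c], INPUT_PULLUP); // columns read with pullups
  Keyboard.begin();
}

void loop() {
  for (byte r = 0; r < 2; r++) {
    pinMode(rowPins[r], OUTPUT);      // drive only the active row...
    digitalWrite(rowPins[r], LOW);    // ...to ground
    for (byte c = 0; c < 4; c++) {
      if (digitalRead(colPins[c]) == LOW) {
        Keyboard.write(keys[r][c]);   // repeats while held: still needs edge detection/debouncing
        delay(5);
      }
    }
    pinMode(rowPins[r], INPUT);       // leave the row floating again
  }
}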
As for the other problem: you are printing an A for as long as the button reads low. What you want to achieve is to print A only on the falling edge of the pin (ideally once per press), so:
char currValue = digitalRead(2);
if((currValue==LOW) && (oldValue==HIGH))
{
//Send an ASCII 'A',
Keyboard.write(65);
}
oldValue = currValue;
Of course you need to declare oldValue outside the loop function and initialize it to HIGH (for example at global scope).
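Putting the pieces together, a minimal complete version of that edge-detection sketch might look like this (same pin 2 / INPUT_PULLUP wiring as in the question):
char oldValue = HIGH;   // last observed pin level, starts released (pullup)

void setup() {
  pinMode(2, INPUT_PULLUP);
  Keyboard.begin();
}

void loop() {
  char currValue = digitalRead(2);
  // Only act on the HIGH -> LOW transition (the moment the button goes down)
  if ((currValue == LOW) && (oldValue == HIGH)) {
    Keyboard.write(65);   // 'A'
  }
  oldValue = currValue;
}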
With this code you won't receive tons of 'A's, but you will still see something like 5-10 'A's every time you press the button. Why? Because of the bouncing of the button. That's what debouncing techniques are for!
I suggest you look at the Bounce2 class to get an easy-to-use wrapper for your button. If you prefer plain code, I wrote this small snippet for another question:
#define CHECK_EVERY_MS 20
#define MIN_STABLE_VALS 5
unsigned long previousMillis;
char stableVals;
char buttonPressed;
...
void loop() {
  if ((millis() - previousMillis) > CHECK_EVERY_MS)
  {
    previousMillis += CHECK_EVERY_MS;
    // With INPUT_PULLUP the pin reads LOW while the button is held down
    char pressedNow = (digitalRead(2) == LOW);
    if (pressedNow != buttonPressed)
    {
      stableVals++;
      if (stableVals >= MIN_STABLE_VALS)
      {
        buttonPressed = pressedNow;
        stableVals = 0;
        if (buttonPressed)
        {
          // Send an ASCII 'A'
          Keyboard.write(65);
        }
      }
    }
    else
      stableVals = 0;
  }
}
In this case there is no need to check for the previous value, since the code already has a branch that is reached only when the state changes.
If you have to use this for more buttons, however, you will have to duplicate the whole block (and also use separate stableVals variables). That's why I suggested the Bounce2 class: it does something like this, but since it is all wrapped inside a class, you won't need to bother with the bookkeeping variables.
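For reference, a Bounce2-based version of the same button might look roughly like this, assuming the library's usual attach/interval/update/fell API:
#include <Bounce2.h>

Bounce button = Bounce();

void setup() {
  button.attach(2, INPUT_PULLUP);   // same pin/wiring as above
  button.interval(20);              // debounce interval in ms
  Keyboard.begin();
}

void loop() {
  button.update();
  if (button.fell()) {              // HIGH -> LOW edge: the button was just pressed
    Keyboard.write(65);             // 'A'
  }
}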
I have made an application in Delphi that handles some defined system-wide hotkeys, and it works perfectly. However, for some hotkey functionality I have to trigger/simulate keyboard strokes such as ALT+ENTER. This works great when the user releases the hotkey keys promptly, but when the keys are still pressed by the user the keyboard simulation fails.
Is there a way (with the Windows API) to check if all keys are released before I process a keyboard simulation?
Use GetAsyncKeyState, as this API reflects the true current state of the keyboard, not the state as of when your app last called GetMessage. Just write a loop that calls it for each value between 0 and 0xFF.
If the most significant bit is set, the key is down
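In C terms that loop is just the following (a Delphi version appears below):
#include <windows.h>

// Sketch of that loop: true if any virtual key (including the mouse buttons)
// is currently down according to GetAsyncKeyState.
bool AnyInputKeyDown()
{
    for (int vk = 1; vk < 256; ++vk)
    {
        if (::GetAsyncKeyState(vk) & 0x8000)   // most significant bit set -> key is down
            return true;
    }
    return false;
}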
Thanks go to @David Ching and @David Heffernan (two Davids!). The solution is to test not only the keyboard but also the mouse, or better, the state of all input devices.
The mouse is included because of:
{ Virtual Keys, Standard Set }
VK_LBUTTON = 1;
VK_RBUTTON = 2;
VK_MBUTTON = 4; { NOT contiguous with L & RBUTTON }
So, if you don't want to test the mouse buttons you have to exclude them from the loop. It's better to check them as well, though, because there are hotkeys that must be used with the mouse; it's safest to check everything and only simulate input when all input devices are idle.
function isUserInputDevicesInUse() : Boolean; // Keyboard pressed / mouse pressed?
var
i : LongInt;
begin
i:=256;
Result:=FALSE;
while( i > 0 ) and ( NOT Result ) do
begin
Dec( i );
Result:=( GetAsyncKeyState(i) < 0 );
end;
end;
function isUserInputDevicesIdle() : Boolean;
begin
Result:=NOT isUserInputDevicesInUse();
end;
I'm trying to implement a keyboard class in my game that has two modes. The game mode takes input that uses lowercase, unmodified keys (unmodified meaning that if I type a '0' with Shift held it still returns '0' instead of ')'). I have tracked it down as far as the charactersIgnoringModifiers method of the NSEvent class, but this method excludes all the modifier keys except for the Shift key.
You can use -[NSEvent keyCode] and then translate the key code to a character without using any modifiers. Doing the latter is easier said than done. Here's a long mailing list thread on the techniques and gotchas.
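For what it's worth, here is a rough C-level sketch of that translation using TISCopyCurrentKeyboardLayoutInputSource and UCKeyTranslate with an empty modifier state; keyCodeToNoModifierString is a hypothetical helper name, and dead keys are deliberately ignored:
#include <Carbon/Carbon.h>   // TIS* and UCKeyTranslate live in HIToolbox

// Illustrative helper: translate a virtual key code with no modifiers applied.
static CFStringRef keyCodeToNoModifierString(UInt16 keyCode)
{
    TISInputSourceRef source = TISCopyCurrentKeyboardLayoutInputSource();
    CFDataRef layoutData =
        (CFDataRef)TISGetInputSourceProperty(source, kTISPropertyUnicodeKeyLayoutData);
    const UCKeyboardLayout *layout =
        (const UCKeyboardLayout *)CFDataGetBytePtr(layoutData);

    UInt32 deadKeyState = 0;
    UniChar chars[4];
    UniCharCount length = 0;
    OSStatus status = UCKeyTranslate(layout,
                                     keyCode,
                                     kUCKeyActionDown,
                                     0,                            // no modifiers
                                     LMGetKbdType(),
                                     kUCKeyTranslateNoDeadKeysMask,
                                     &deadKeyState,
                                     4, &length, chars);
    CFRelease(source);
    if (status != noErr)
        return NULL;
    return CFStringCreateWithCharacters(kCFAllocatorDefault, chars, length);
}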
The best option I could find so far for ignoring the <Shift> modifier is by using NSEvent.characters(byApplyingModifiers:) with a modifier that doesn't change the key glyph, i.e. .numericPad:
func onKeyDown(event: NSEvent) {
let characters = event.characters(byApplyingModifiers: .numericPad)
print("Key pressed: \(characters)")
}
Ideally you'd be able to pass in a mask that represents no modifiers at all, but the API doesn't seem to support it.
For completeness, here's how you could start writing a function that takes a UInt16 (CGKeyCode) and returns a string representation according to the user's keyboard:
func keyCodeToString(code: UInt16) -> String {
switch code {
// Keys that are the same across keyboards
// TODO: Fill in the rest
case 0x7A: return "<F1>"
case 0x24: return "<Enter>"
case 0x35: return "<Escape>"
// Keys that change between keyboards
default:
let cgEvent = CGEvent(keyboardEventSource: nil, virtualKey: code, keyDown: true)!
let nsEvent = NSEvent(cgEvent: cgEvent)!
let characters = nsEvent.characters(byApplyingModifiers: .numericPad)
return String(characters?.uppercased() ?? "<KeyCode: \(code)>")
}
}
The goal is for the F1 key to display <F1>, while the ";" key displays ";" on US keyboards but "Ñ" on Spanish keyboards.