Sending KeyPress events in X11

I have a program where, for various reasons, I need to send keypress events to various windows. This is what I am using at the moment:
XEvent event;
/* set some other stuff*/
event.type = KeyPress;
event.xkey.keycode = XKeysymToKeycode(display,XStringToKeysym(curr_key));
This works for lowercase letters and numbers, but I need to modify it so that it can also send the Enter key and uppercase letters.

From the XStringToKeysym man page:
void XConvertCase(KeySym keysym, KeySym *lower_return, KeySym *upper_return);
The XConvertCase function returns the uppercase and lowercase forms of the specified KeySym, if the KeySym is subject to case conversion; otherwise, the specified KeySym is returned in both lower_return and upper_return. Support for conversion of other than Latin and Cyrillic KeySyms is implementation-dependent.
All the keysyms are listed in /usr/include/X11/keysymdef.h; e.g. the Enter key is XK_Return. The letters are there too, e.g. XK_a and XK_A.
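Putting that together, here is a sketch (untested here, and assuming `display` and `window` are already obtained as in the question) of sending Enter and an uppercase letter; the trick for capitals is setting ShiftMask in the event state:

```c
#include <X11/Xlib.h>
#include <X11/keysym.h>

/* Send a key press + release for `keysym` to `window`.
   Pass ShiftMask in `modifiers` for capitals, 0 otherwise. */
void send_key(Display *display, Window window,
              KeySym keysym, unsigned int modifiers)
{
    XKeyEvent event = {0};
    event.type = KeyPress;
    event.display = display;
    event.window = window;
    event.root = DefaultRootWindow(display);
    event.subwindow = None;
    event.time = CurrentTime;
    event.same_screen = True;
    event.keycode = XKeysymToKeycode(display, keysym);
    event.state = modifiers;

    XSendEvent(display, window, True, KeyPressMask, (XEvent *)&event);
    event.type = KeyRelease;
    XSendEvent(display, window, True, KeyReleaseMask, (XEvent *)&event);
    XFlush(display);
}

/* Usage:
   send_key(display, window, XK_Return, 0);       -- Enter
   send_key(display, window, XK_A, ShiftMask);    -- uppercase A */
```

Note that some applications ignore events synthesized with XSendEvent; if that bites, XTestFakeKeyEvent from the XTest extension is the more reliable route.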

Related

How to simulate keystrokes using AutoIt for an IME other than English

I am trying to figure out how to simulate keystrokes with AutoIt using the default IME (Input Method Engine) for Indic (or any other) input methods.
For example, if I send the key 'a' via Send("a") to Notepad, I should get whatever character is on key 'a' in the currently set IME.
Similarly, if I send the key M, I should get the letter स when the Hindi/Marathi IME with the InScript keyboard is set.
Currently I am getting the same English letter M instead of स.
My AutoIt script is as follows:
$myvar = "`1234567890-=\qwertyuiop[]asdfghjkl;'©zxcvbnm,./"
$charArray = StringSplit($myvar,"",2)
For $char in $charArray
Send( $char & "{ENTER}")
Next
Can someone tell me how to achieve this in the above script?
I suggest using the clipboard:
call ClipPut('string with national characters')
and then simulate Ctrl+V with Send('^v').
If the clipboard is not suitable (as in the case of menu accelerators or searching through a listview), please tell us what exactly you are trying to send keys to; there may be a way.
Finally, I found a clue to this problem on AutoIt's forum, in How Can I Find Non-English Characters' Key Code?, where I came across the DllCall function.
So now I have the following function call:
DllCall('user32.dll', 'int', 'keybd_event', 'int',$hChar, 'int', 0, 'int', 0, 'ptr', 0)
I can replace value of $hChar with required key code.
For example, if I want to type स with the InScript keyboard and the Devanagari language, I should send 0x4D, which is the hex code for the keyboard key m.
To convert the character m to the key code 0x4D, we will have to write a conversion for all keyboard keys.
We can get श with the same keycode if we send {SHIFTDOWN} before calling this function.
$hChar = "0x4D"
Send("{SHIFTDOWN}")
DllCall('user32.dll', 'int', 'keybd_event', 'int', $hChar, 'int', 0, 'int', 0, 'ptr', 0)
Send("{SHIFTUP}")
(To use DllCall we must include WinAPI.au3 in our AutoIt script.)
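The character-to-key-code conversion mentioned above can be sketched in plain C (the function name is mine). For the basic Latin letters and digits no table is needed, because the Windows virtual-key codes VK_A..VK_Z and VK_0..VK_9 are simply the uppercase ASCII values; other keys (punctuation, OEM keys) would still need a real lookup table:

```c
#include <ctype.h>

/* Map a basic Latin letter or digit to its Windows virtual-key code.
   For these keys the VK code equals the uppercase ASCII value;
   returns 0 for anything that needs a real lookup table. */
int char_to_vk(char c)
{
    if (isalnum((unsigned char)c))
        return toupper((unsigned char)c);
    return 0;   /* punctuation, OEM keys: not handled here */
}
```

For example, char_to_vk('m') is 0x4D, the key code sent in the DllCall above.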

How to capture combination key event?

In C#, it was written like this:
private void Form1_KeyDown(object sender, KeyEventArgs e)
{
if (e.Control && e.Shift && e.KeyCode == Keys.P)
{
MessageBox.Show("Hello");
}
}
Capturing Ctrl + Shift + P key stroke in a C# Windows Forms application [duplicate]
I tried to translate the C# approach to Delphi XE8, but it doesn't seem to work:
procedure TForm12.FormKeyDown(Sender: TObject; var Key: Word; Shift: TShiftState);
begin
if (GetKeyState(VK_CONTROL) < 0) and (Key = 53) then
ShowMessage('You pressed "ctrl + s"');
end;
Note: the TForm.KeyPreview property is set to true.
How can I capture that combination key event?
The key you are looking for, S, has code $53. The key you specified has code 53 and is the number 5. The difference is the $ which signifies hexadecimal.
You'd avoid such silly mistakes, and make the code much clearer, if you let the compiler do the work:
Key = ord('S')
You really don't want to use magic constants in your program. That is very important.
Note that Key is a virtual key code and the convention is that for the 26 keys of the Latin alphabet, they are represented by the ordinal value of the uppercase letter.
The message already passes the state of the modifier keys in the Shift argument, so it is idiomatic to write the test as follows:
if (ssCtrl in Shift) and (Key = ord('S')) then
Your test using GetKeyState does work well, but it's just not idiomatic.
Note that this test, which matches the one in the question, ignores the state of the other modifier keys. Indeed, the C# code in the question also ignores the state of the ALT modifier.
So if you want a strict test for CTRL + S, you must also check that the other modifiers are up:
if ([ssCtrl] = Shift*[ssCtrl, ssShift, ssAlt]) and (Key = ord('S')) then
All this said, it's usually much easier to manage your shortcuts using actions. This will allow you to specify shortcuts directly, and let the framework detect the low level key events that make up a shortcut. What's more actions allow you to centralise handling of the actions behind buttons and menus without you repeating yourself.
You can use actions to automate shortcuts. Drop in a TActionManager and add a TAction to it. On that action, assign a Name, Caption, and an OnExecute event handler, and most importantly a value for ShortCut. This can be a string representing the keystrokes, in your case Ctrl+Shift+P. Then, you can either assign that action to various controls, or call it like MyAction.Execute.
In Delphi you use the Shift: TShiftState to check which 'shift'-keys are pressed.
As pointed out in comments, your error is that the key value for the letter S is not decimal 53 but hexadecimal 53, in other words $53 in Delphi syntax.
I first thought you also wanted to check for the shift key as in the referenced source of your inspiration, in which case you can test for the exclusive combination as follows:
procedure TForm15.FormKeyDown(Sender: TObject; var Key: Word;
Shift: TShiftState);
begin
if (Key = $53) and ([ssCtrl, ssShift] = Shift) then
begin
ShowMessage('You pressed "ctrl + shift + s"');
Key := 0; // optional
end;
end;
You may or may not want to clear the Key parameter to prevent further action by the control with focus.
Rereading your question after another comment, you seem to want to detect only the Ctrl + s combination, in which case the exclusive condition test becomes
if (Key = $53) and ([ssCtrl] = Shift) then
I recommend being precise (exclusive) in the shift state test, because the shift state includes not only the Shift, Ctrl and Alt keys but also mouse buttons and some gestures.
The documentation on TShiftState provides other possible values of Shift to check for.
Finally, as @David Heffernan points out in his answer, instead of a magic constant ($53) for the key test, use Ord('S').

Obtaining modifier key pressed in CGEvent tap

Having setup an event tap, I'm not able to identify what modifier key was pressed given a CGEvent.
CGEventFlags flagsP;
flagsP=CGEventGetFlags(event);
NSLog(@"flags: 0x%llX",flagsP);
NSLog(@"stored: 0x%llX",kCGEventFlagMaskCommand);
if (flagsP==kCGEventFlagMaskCommand) {
NSLog(@"command pressed");
}
Given the above snippet, the first NSLog returns a different value from the second NSLog. No surprise that the conditional is never triggered when the command modifier key is pressed.
I need to identify whether command, alternate, option, control or shift are pressed for a given CGEvent. First though, I need help to understand why the above isn't working.
Thanks!
These are bit masks, which will be bitwise-ORed together into the value you receive from CGEventGetFlags (or pass when creating an event yourself).
You can't test equality here because no single bit mask will be equal to a combination of multiple bit masks. You need to test equality of a single bit.
To extract a single bit mask's value from a combined bit mask, use the bitwise-AND (&) operator. Then, compare that to the single bit mask you're interested in:
BOOL commandKeyIsPressed = (flagsP & kCGEventFlagMaskCommand) == kCGEventFlagMaskCommand;
Why both?
The & expression evaluates to the same type as its operands, which is CGEventFlags in this case, which may not fit in the size of a BOOL, which is a signed char. The == expression resolves that to 1 or 0, which is all that will fit in a BOOL.
Other solutions to that problem include negating the value twice (!!) and declaring the variable as bool or _Bool rather than Boolean or BOOL. C99's _Bool type (synonymized to bool when you include stdbool.h) forces its value to be either 1 or 0, just as the == and !! solutions do.
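The masking pattern is easy to demonstrate in plain C. The flag values below are illustrative placeholders, not the real kCGEventFlagMask* constants from CoreGraphics; only the technique matters:

```c
#include <stdint.h>

typedef uint64_t EventFlags;

/* Placeholder single-bit masks (the real kCGEventFlagMask* values differ). */
enum {
    MaskShift   = 1 << 1,
    MaskControl = 1 << 2,
    MaskCommand = 1 << 4
};

/* Extract the one bit we care about, then compare against that mask.
   A plain == against the combined flags would fail whenever any other
   modifier is also down. */
int command_is_pressed(EventFlags flags)
{
    return (flags & MaskCommand) == MaskCommand;
}
```

With Command and Shift both down, `flags == MaskCommand` is false, yet command_is_pressed(flags) is still 1.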

How do I use CGEventKeyboardSetUnicodeString with multiple characters?

I'm trying to use event taps to create an OS X program that will listen for Yiddish typed in transliteration and post the result in Hebrew characters. I made a very short program to test one of the things I'd have to do: http://pastie.org/791398
As is, the program successfully replaces every typed 'q' with 'w':
if(inputString[0] == 'q') { inputString[0] = 'w'; }
But how does one post a string of more than one character? For instance, if someone types 'sh' you'd presumably have to post a backspace (to delete the character that was posted for 's' alone) and then post the character that corresponds to 'sh'. However, this code results in only a backspace being posted:
else if(inputString[0] == 'm') { inputString[0] = '\b'; inputString[1] = 'n'; }
I apologize if these are basic questions; I have read all the documentation I could find, but I might not have understood it all. It's also possible that I'm going about this entirely the wrong way.
Ideally you should be using an input method instead of a program with event taps, most likely using Input Method Kit if you don't need to support pre-10.5. Using event taps for this purpose is inherently a bad idea because the user can change where he/she is typing with the mouse as well as the keyboard. So if the user typed a "s" in one text field followed by a "h" in another, you wouldn't be able to tell the difference.
That said, here's a direct answer to your question.
The string is length-counted, so if you just provide the incoming length (1), the second character will be ignored. However, most applications also don't like to get more than a single character per event, so they'll just discard the remaining characters. (Terminal is a notable exception.)
So what you can do is simply post a second event with the second character in it.
else if(inputString[0] == 'm') {
inputString[0] = 'n';
CGEventKeyboardSetUnicodeString(event, 1, inputString);
CGEventPost(kCGSessionEventTap, event);
inputString[0] = '\b';
}
In the general case (simulating > 2 keypresses) you'll need to create an event for each character you want to insert. This mailing list post includes a simple example.
This is how I send a string to the first responder (the foreground application):
// 1 - Get the string length in bytes.
NSUInteger l = [string lengthOfBytesUsingEncoding:NSUTF16StringEncoding];
// 2 - Get bytes for unicode characters
UniChar *uc = malloc(l);
[string getBytes:uc maxLength:l usedLength:NULL encoding:NSUTF16StringEncoding options:0 range:NSMakeRange(0, string.length) remainingRange:NULL];
// 3 - create an empty tap event, and set unicode string
CGEventRef tap = CGEventCreateKeyboardEvent(NULL,0, YES);
CGEventKeyboardSetUnicodeString(tap, string.length, uc);
// 4 - Send event and tear down
CGEventPost(kCGSessionEventTap, tap);
CFRelease(tap);
free(uc);

Visual Studio C++ 2008 Manipulating Bytes?

I'm trying to write strictly binary data to files (no encoding). The problem is, when I hex dump the files, I notice rather weird behavior. Using either one of the methods below to construct a file results in the same behavior. I even tested with System::Text::Encoding::Default for the streams as well.
StreamWriter^ binWriter = gcnew StreamWriter(gcnew FileStream("test.bin",FileMode::Create));
(Also used this method)
FileStream^ tempBin = gcnew FileStream("test.bin",FileMode::Create);
BinaryWriter^ binWriter = gcnew BinaryWriter(tempBin);
binWriter->Write(0x80);
binWriter->Write(0x81);
.
.
binWriter->Write(0x8F);
binWriter->Write(0x90);
binWriter->Write(0x91);
.
.
binWriter->Write(0x9F);
Writing that sequence of bytes, I noticed the only bytes that weren't converted to 0x3F in the hex dump were 0x81,0x8D,0x90,0x9D, ... and I have no idea why.
I also tried making character arrays, and a similar situation happens. i.e.,
array<wchar_t,1>^ OT_Random_Delta_Limits = {0x00,0x00,0x03,0x79,0x00,0x00,0x04,0x88};
binWriter->Write(OT_Random_Delta_Limits);
0x88 would be written as 0x3F.
If you want to stick to binary files then don't use StreamWriter. Just use a FileStream and Write/WriteByte. StreamWriters (and TextWriters generally) are expressly designed for text. Whether you want an encoding or not, one will be applied, because when you're calling StreamWriter.Write, that's writing a char, not a byte.
Don't create arrays of wchar_t values either - again, those are for characters, i.e. text.
BinaryWriter.Write should have worked for you unless it was promoting the values to char in which case you'd have exactly the same problem.
By the way, without specifying any encoding, I'd expect you not to get 0x3F values, but rather the bytes representing the UTF-8 encoded forms of those characters.
When you specified Encoding.Default, you'd have seen 0x3F for any Unicode values not in that encoding.
Anyway, the basic lesson is to stick to Stream when you want to deal with binary data rather than text.
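The same lesson holds outside .NET: keep binary I/O at the byte level the whole way. A minimal C sketch (write_raw and the file name are mine) that writes bytes verbatim, with no encoding layer to turn 0x80 through 0x9F into 0x3F:

```c
#include <stdio.h>
#include <stddef.h>

/* Write a buffer verbatim. "wb" suppresses text-mode translation,
   and fwrite never runs bytes through a character encoder. */
int write_raw(const char *path, const unsigned char *data, size_t n)
{
    FILE *f = fopen(path, "wb");
    if (!f)
        return -1;
    size_t written = fwrite(data, 1, n, f);
    fclose(f);
    return written == n ? 0 : -1;
}
```

A hex dump of the resulting file shows exactly the bytes handed in, 0x81 and 0x9D included.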
EDIT: Okay, it would be something like:
public static void ConvertHex(TextReader input, Stream output)
{
while (true)
{
int firstNybble = input.Read();
if (firstNybble == -1)
{
return;
}
int secondNybble = input.Read();
if (secondNybble == -1)
{
throw new IOException("Reader finished half way through a byte");
}
int value = (ParseNybble(firstNybble) << 4) + ParseNybble(secondNybble);
output.WriteByte((byte) value);
}
}
// value would actually be a char, but as we've got an int in the above code,
// it just makes things a bit easier
private static int ParseNybble(int value)
{
if (value >= '0' && value <= '9') return value - '0';
if (value >= 'A' && value <= 'F') return value - 'A' + 10;
if (value >= 'a' && value <= 'f') return value - 'a' + 10;
throw new ArgumentException("Invalid nybble: " + (char) value);
}
This is very inefficient in terms of buffering etc, but should get you started.
A BinaryWriter() class initialized with a stream will use a default encoding of UTF8 for any chars or strings that are written. I'm guessing that the
binWriter->Write(0x80);
binWriter->Write(0x81);
.
.
binWriter->Write(0x8F);
binWriter->Write(0x90);
binWriter->Write(0x91);
calls are binding to the Write(char) overload, so they're going through the character encoder. I'm not very familiar with C++/CLI, but it seems to me that these calls should be binding to Write(Int32), which shouldn't have this problem (maybe your code is really calling Write() with a char variable that's set to the values in your example; that would account for this behavior).
0x3F is commonly known as the ASCII character '?'; the characters that are mapping to it are control characters with no printable representation. As Jon points out, use a binary stream rather than a text-oriented output mechanism for raw binary data.
EDIT -- actually your results look like the inverse of what I would expect. In the default code page 1252, the non-printable characters (i.e. ones likely to map to '?') in that range are 0x81, 0x8D, 0x8F, 0x90 and 0x9D
