Keyboard hook in Windows C++, or what?

I wish to build my own application which can send keyboard commands (messages) to the Windows OS.
For example, when I press the combination Ctrl+Shift+N, I want to launch notepad.exe. How can I do that? Do you have any advice about the concepts involved?
I've read that this is possible with keyboard hooks. Is that the only way? Do you know of a free, open-source application that does this as simply as possible?

Your particular example can be done without any programming at all, by right clicking on Notepad, selecting Properties, and setting the "hot key" (various Windows versions might call it by a different name) to Ctrl+Shift+N.
If you still would like to write a program to do this, have a look at the RegisterHotKey Win32 API function.
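A minimal sketch of that approach (my own illustration, untested; error handling omitted): register Ctrl+Shift+N as a global hot key and launch Notepad when the WM_HOTKEY message arrives.

#include <windows.h>
#include <shellapi.h>   // ShellExecuteW (link with shell32.lib)

int main()
{
    // Id 1 is an arbitrary identifier; with a NULL window the WM_HOTKEY
    // message is posted to this thread's message queue.
    if (!RegisterHotKey(nullptr, 1, MOD_CONTROL | MOD_SHIFT, 'N'))
        return 1;   // another application may already own this combination

    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        if (msg.message == WM_HOTKEY && msg.wParam == 1)
            ShellExecuteW(nullptr, L"open", L"notepad.exe",
                          nullptr, nullptr, SW_SHOWNORMAL);
    }

    UnregisterHotKey(nullptr, 1);
    return 0;
}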

AutoHotkey is a free, open-source utility for Windows.
You can automate many tasks with it; check it out.

Things to bear in mind:
A system-wide keyboard hook requires writing a DLL (see the low-level hook sketch after this list for a variant that does not). There's example keyboard hook code on my website here.
Hooks cannot be installed from a low- to a high-integrity-level application on Vista and Windows 7/8/10, so there's no guarantee your hook will work, depending on what the foreground application is when the key gets hit.
As Greg pointed out, a lot of the time RegisterHotKey is a much simpler solution to this problem.
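If you do go the hook route, here is a minimal sketch (my own, untested) of a low-level keyboard hook. Note that WH_KEYBOARD_LL, unlike the classic system-wide WH_KEYBOARD hook mentioned above, does not require a separate DLL, but the installing thread must run a message loop, and it is still subject to the same integrity-level restrictions.

#include <windows.h>
#include <shellapi.h>

// Low-level keyboard hook: launches Notepad when Ctrl+Shift+N is pressed.
static LRESULT CALLBACK LowLevelKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION && wParam == WM_KEYDOWN)
    {
        const KBDLLHOOKSTRUCT* kb = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
        const bool ctrl  = (GetAsyncKeyState(VK_CONTROL) & 0x8000) != 0;
        const bool shift = (GetAsyncKeyState(VK_SHIFT) & 0x8000) != 0;
        if (ctrl && shift && kb->vkCode == 'N')
            ShellExecuteW(nullptr, L"open", L"notepad.exe", nullptr, nullptr, SW_SHOWNORMAL);
    }
    return CallNextHookEx(nullptr, nCode, wParam, lParam);
}

int main()
{
    HHOOK hook = SetWindowsHookExW(WH_KEYBOARD_LL, LowLevelKeyboardProc,
                                   GetModuleHandleW(nullptr), 0);
    if (!hook) return 1;

    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)   // the message loop keeps the hook responsive
        DispatchMessage(&msg);

    UnhookWindowsHookEx(hook);
    return 0;
}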

Related

A general solution to injecting keys to different applications on windows?

After receiving some feedback on this question: How to create lParam for WM_CHAR or WM_KEYUP/WM_KEYDOWN?, I've started looking for a broader answer, a general solution. One thing I realized is that using the Windows APIs might not work for every app and in every case.
My first step in the follow-up research was to make an Arduino-powered servo to press the keys (yeah, the concept is horrible, I know).
But that prompted yet another idea: a small hardware-augmented numpad keyboard, also operated by an Arduino, which was controlled via another USB connection. This was at least somewhat usable, but still not very.
Then I tried a Digispark ATtiny85 microcontroller, which in turn used the DigiKeyboard library. This solution was much better, but the necessity of having a Digispark stuck in your USB port was a bit frustrating.
This made me curious whether there are ways to emulate a keyboard or any other HID device using software only. Some brief googling pointed me to kernel drivers and virtual COM ports, but that seemed a bit over the top for me to process.
So can that task indeed be achieved by writing a kernel driver? Can it be done in any other manner? In either case, are there any pointers you can give me on the topic?
The SendInput function can be used to generate keyboard and mouse input. This input goes to the foreground window as if it had been generated by real hardware (though low-level hooks can tell that it was software-generated). It might not let you generate Ctrl+Alt+Delete or control a UAC prompt, but other than that it should be good enough in most cases. Writing a driver to overcome these limitations is normally not worth it.
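A minimal sketch of a SendInput call (my own, untested; the sleep is just to give you time to put the keyboard focus where you want the character to land):

#include <windows.h>

// Inject a single Unicode character into whatever window has the keyboard focus.
void SendChar(wchar_t ch)
{
    INPUT in[2] = {};
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wScan = ch;
    in[0].ki.dwFlags = KEYEVENTF_UNICODE;                     // key down, by character
    in[1] = in[0];
    in[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;   // matching key up
    SendInput(2, in, sizeof(INPUT));
}

int main()
{
    Sleep(3000);       // time to click into the target window
    SendChar(L'a');
    return 0;
}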
There is no general way to generate input to a specific application/window if it is not the foreground window.
If you want to control a specific application you should use UI Automation.
Faking key up/down/char messages with PostMessage is not uncommon but it does not always work (the application might be using RAW input, input is not synchronized with real hardware etc.). If you are determined to use this method anyway, make sure you send it to the correct window (the HWND with the keyboard focus, not just the top-level window). Use the Spy++ tool to view the messages to make sure they are going to the correct window.
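To illustrate that last point, here is a hedged sketch (my own, untested) that uses GetGUIThreadInfo to find the child HWND holding the keyboard focus in the foreground application before posting the messages:

#include <windows.h>

// Returns the HWND that actually has the keyboard focus in the foreground app.
HWND FindFocusedControl()
{
    HWND fg = GetForegroundWindow();
    if (!fg) return nullptr;

    GUITHREADINFO gti = {};
    gti.cbSize = sizeof(gti);
    DWORD tid = GetWindowThreadProcessId(fg, nullptr);
    if (!GetGUIThreadInfo(tid, &gti)) return nullptr;

    return gti.hwndFocus ? gti.hwndFocus : fg;   // fall back to the top-level window
}

int main()
{
    Sleep(3000);                                 // time to focus the target manually
    HWND target = FindFocusedControl();
    if (target)
    {
        // Many apps ignore lParam, but a more faithful pair sets the repeat count
        // and, for key-up, the previous-state and transition bits.
        PostMessage(target, WM_KEYDOWN, VK_RETURN, 0x00000001);
        PostMessage(target, WM_KEYUP,   VK_RETURN, 0xC0000001);
    }
    return 0;
}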

Typing simulator — sending keystrokes to other application

I've been challenged to write an app simulating keystrokes. After pressing a shortcut, the app should be able to send a predefined key combination to the currently active application. This is functionality provided by many existing applications, but I want to write it on my own. The app should be usable on Windows.
Could you provide me with suggestions about:
Which programming language should I choose?
Are there any libraries providing such functionality?
EDIT: To be precise: both applications are standalone Windows apps.
The native Win32 API for doing this is SendInput (a small sketch of sending a key combination follows the answers below).
To answer your questions:
Which programming language? This is up to you. Because this is just a simple native API call, use any language that allows you to call native API.
Don't bother with libraries if this is all you need, because it's very simple.
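Since the question is about a predefined key combination, here is a small sketch (my own, untested; Ctrl+V is just a placeholder combination) of wrapping the key press in the modifier's down/up events with SendInput:

#include <windows.h>

// Send Ctrl+V to the currently active window.
void SendCtrlV()
{
    INPUT in[4] = {};
    for (auto& i : in) i.type = INPUT_KEYBOARD;

    in[0].ki.wVk = VK_CONTROL;                   // Ctrl down
    in[1].ki.wVk = 'V';                          // V down
    in[2].ki.wVk = 'V';
    in[2].ki.dwFlags = KEYEVENTF_KEYUP;          // V up
    in[3].ki.wVk = VK_CONTROL;
    in[3].ki.dwFlags = KEYEVENTF_KEYUP;          // Ctrl up

    SendInput(4, in, sizeof(INPUT));
}

int main()
{
    Sleep(3000);     // switch to the target application before this fires
    SendCtrlV();
    return 0;
}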
Now, to go further: I know you didn't ask this, but many people continue by asking how to send keystrokes to windows that don't have keyboard focus, i.e. sending keystrokes to a specific app. This is much more difficult and error-prone, and since it goes outside the behavior of actual real keystrokes, it can behave unpredictably. Here is one such question.
You can easily do this in C#, using
SendKeys.Send("key here");
or
SendKeys.SendWait("key here");
Some keys use other key codes; you can see them here on MSDN.

Automate any software

A quick question: is there any method to control or automate any Windows application using the command line? I've tried AutoIt. Any other methods? I'm targeting the WinCE Test Kit (CETK), so I can run the tests without having to go to the GUI, click the menus, connect, etc., manually.
Thanks in advance!
We use Rational Robot for this, but keep in mind it's not cheap. It's also probably been renamed 27 times since we started using it, so you may want to just search for Rational testing products in general.
It's fully scriptable, allowing you to monitor the screen and send key presses and whatnot.
Look at SCAR; it is great for all sorts of automation and screen reading: http://freddy1990.com/index.php?page=product&name=scar
I've always used a hybrid approach. First purchase SOTI Pocket Controller Pro, then just use the normal AutoIt automation tools. It's a little different because you can't actually capture popup windows like you may be used to, but it can automate clicks, and then loop and wait for things using the GetPixel methods to check if the screen is what you expect.
Being able to connect to multiple CE devices at the same time visually is also a nice touch.

How to control the mouse pointer outside our application

I want to control the mouse pointer with my application and be able to interact with other programs using my program.
For example, I want my application to be able to click a button in another application.
How should I go about solving this problem?
(Any programming language would work; also, if you have any suggestions, please let me know.)
Afterthoughts:
I want to do it on the Windows operating system, and I want to test my GUI to see if it works in different scenarios. Any language would work for me since this is not part of the final product, but I prefer one of these languages (Python, Java, C#, or MATLAB).
Thanks
There are many ways of doing this, and you didn't mention any details of your application (system, target goal, etc.).
If your goal is menial automation, I'd recommend whipping together a quick AutoIt script on Windows. http://www.autoitscript.com/autoit3/index.shtml
If this isn't what you're looking for, please give more details.
Okay, this one is really operating-system and windowing-system specific. But the phrase you're looking for is "mouse grabbing".
As @Mitch suggests, unless you've got a really good reason (like maybe a GUI testing app?), grabbing the mouse and messing with it in that way is very bad form.
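As a concrete illustration (a hedged sketch of my own, untested, with placeholder coordinates), moving the cursor and synthesizing a click with SetCursorPos and SendInput is enough to press a button in another application when you know where it sits on screen:

#include <windows.h>

// Move the real cursor to an absolute screen position and left-click there.
void ClickAt(int x, int y)
{
    SetCursorPos(x, y);

    INPUT in[2] = {};
    in[0].type = INPUT_MOUSE;
    in[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;     // press at the current position
    in[1].type = INPUT_MOUSE;
    in[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;       // release

    SendInput(2, in, sizeof(INPUT));
}

int main()
{
    ClickAt(200, 300);   // placeholder: a button of another app at (200, 300)
    return 0;
}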

Windows Systems Programming: Can a keystroke be sent to an open application that is not the currently active one?

I'm a bit rusty on my Windows system programming...
Is it possible for a program to send a keystroke (I'm guessing via a SendMessage() API call) to another application if the (open) target application does not currently have the focus? If it is possible, does it then make the target application become the active application, or does it remain in the background?
Thanks in advance for any info you may be able to provide!
No, it will not change the focus unless subsequent calls do a SetFocus. The window order will remain the same.
PostMessage(hwndOther, WM_KEYDOWN, VK_RETURN, 0);
This works for me, but only under Windows XP.
On Vista and Windows 7 I've run into problems too, probably with UIPI.
I am trying to send a message to a process from a DLL injected into that process.
How can I fix it?
From memory: Yes, No.
You are looking for WM_KEYDOWN:
PostMessage(hwndOther, WM_KEYDOWN, VK_RETURN, 0);
For a directed send of keystrokes, SendInput is the native method of choice, though it is subject to UIPI (integrity level) checks on Vista/2008/W7. You can't send keystrokes to an app that has an IL > yours.
A more general solution for capturing and redirecting input is a global keyboard hook (see the help for SetWindowsHookEx). But this is fairly hairy stuff: you have to cope with how you send the keystrokes on, you affect every process in the system because your code is effectively inserted into the input stream, and it involves writing a native DLL... you have to know exactly what you're doing.
We use a global keyboard hook in our system (I wrote it), but we're a special case - a single function emergency call handling system. I wouldn't advise a global hook as a solution in general purpose Windows computing.
You don't need SendInput() or hooks.
The answer with PostMessage is wrong.
You must attach your thread to the target window's thread. See the Win32 API group, where this is a very classic question (C code, the proper MS method).
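For what it's worth, my reading of that advice is the AttachThreadInput approach; the sketch below is my own interpretation (untested), not the answerer's code, and the exact sequence (attach, SetFocus, post, detach) is an assumption:

#include <windows.h>

// Attach to the target window's input thread so SetFocus can move the keyboard
// focus there, post an Enter key press, then detach again.
bool SendReturnTo(HWND target)
{
    DWORD targetThread = GetWindowThreadProcessId(target, nullptr);
    DWORD ourThread    = GetCurrentThreadId();

    if (!AttachThreadInput(ourThread, targetThread, TRUE))
        return false;

    SetFocus(target);                                   // legal once attached

    PostMessage(target, WM_KEYDOWN, VK_RETURN, 0x00000001);
    PostMessage(target, WM_KEYUP,   VK_RETURN, 0xC0000001);

    AttachThreadInput(ourThread, targetThread, FALSE);  // detach again
    return true;
}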
Yes, you can send keystrokes; no, it won't bring the other window to the top.
One way is to use the SendInput API function - here's an example of how to use it in VB6 (ulp!).
It's probably easier to use GUI Automation, which is supported directly from .NET Framework 3.0 - for instance, read this.
