Typing simulator: sending keystrokes to other application windows on Windows

I've been challenged to write an app that simulates keystrokes. After pressing a shortcut, the app should send a predefined key combination to the currently active application. This functionality is provided by many existing applications, but I want to write it on my own. The app should run on Windows.
Could you provide me with suggestions about:
which programming language should I choose?
Are there any libraries providing such functionality?
EDIT: To be precise: both applications are standalone Windows apps.

The native Windows API for doing this is SendInput.
To answer your questions:
Which programming language? That's up to you. Because this is just a simple native API call, use any language that lets you call the native API.
Don't bother with libraries if this is all you need, because it's very simple.
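For illustration, here's a minimal C++ sketch of calling SendInput, sending Ctrl+V to whichever window currently has keyboard focus (the combination is just an example; substitute whatever predefined combination you need):

    #include <windows.h>

    // Minimal sketch: send Ctrl+V to whichever window currently has
    // keyboard focus. Events are delivered in array order.
    int main()
    {
        INPUT inputs[4] = {};

        inputs[0].type = INPUT_KEYBOARD;
        inputs[0].ki.wVk = VK_CONTROL;            // Ctrl down

        inputs[1].type = INPUT_KEYBOARD;
        inputs[1].ki.wVk = 'V';                   // V down

        inputs[2].type = INPUT_KEYBOARD;
        inputs[2].ki.wVk = 'V';
        inputs[2].ki.dwFlags = KEYEVENTF_KEYUP;   // V up

        inputs[3].type = INPUT_KEYBOARD;
        inputs[3].ki.wVk = VK_CONTROL;
        inputs[3].ki.dwFlags = KEYEVENTF_KEYUP;   // Ctrl up

        SendInput(4, inputs, sizeof(INPUT));
        return 0;
    }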
Now, to go further: I know you didn't ask this, but many people continue by asking how to send keystrokes to windows that don't have keyboard focus, like sending keystrokes to a specific app. That is much more difficult and error-prone, and since it goes outside the behavior of actual real keystrokes, it can behave unpredictably. Here is one such question.

You can do this easily in C#, using

    SendKeys.Send("key here");

or

    SendKeys.SendWait("key here");

Some keys use special codes, for example "^" for Ctrl, "+" for Shift, "%" for Alt, and "{ENTER}" for the Enter key; the full list is in the SendKeys documentation on MSDN.

Related

Programming language for capturing keyboard input on a specific USB port

I'm developing a web app, and my only challenge is to capture keyboard input from a SPECIFIC USB port. I'm targeting a barcode reader that emulates a keyboard, and I want my web app to react only to input coming from the barcode reader, not the actual keyboard.
I know this can't be done without the help of a Win32 application with some sort of keyboard hook, so I'm trying to pursue this and perhaps learn a bit of whatever language I need just to achieve this small part. I just don't know where to start.
I know there's VB, .NET, C, etc., but for my purpose, what's the easiest language to learn for this? I don't plan to learn more than what I need to achieve this, really.
Thanks all.
If you want to start somewhere, I recommend this link:
http://www.codeguru.com/cpp/w-p/system/keyboard/article.php/c5699/Hooking-the-Keyboard.htm
If that doesn't help, try this one:
http://msdn.microsoft.com/en-us/library/windows/hardware/ff540174(v=vs.85).aspx
The first link covers hooking the Windows keyboard (and it also works here at my company, where my keyboard is USB-attached to my laptop). Perhaps the first one is enough; you should check it out.
(I also recommend disassembling some of the C code sections to see which system calls are involved.)
Whether VB or C# can do the same, sorry, I never tried. But C definitely can.
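For reference, here's a minimal C++ sketch of a low-level keyboard hook (WH_KEYBOARD_LL), along the lines of what the first link describes. One caveat worth flagging: a hook alone cannot tell which physical device generated a keystroke, so per-device filtering (like the barcode-reader case) would need the Raw Input API instead.

    #include <windows.h>

    // Minimal sketch: a system-wide low-level keyboard hook (WH_KEYBOARD_LL).
    // Unlike the classic WH_KEYBOARD hook, it does not require a separate DLL.
    HHOOK g_hook = nullptr;

    LRESULT CALLBACK LowLevelKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
        if (nCode == HC_ACTION && wParam == WM_KEYDOWN)
        {
            const KBDLLHOOKSTRUCT* kb = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
            // kb->vkCode is the virtual-key code of the pressed key.
            // Caveat: the hook cannot tell you WHICH keyboard sent the event;
            // per-device filtering needs the Raw Input API instead.
            if (kb->vkCode == VK_ESCAPE)
                PostQuitMessage(0);   // press Esc to exit this sketch
        }
        return CallNextHookEx(g_hook, nCode, wParam, lParam);
    }

    int main()
    {
        g_hook = SetWindowsHookExW(WH_KEYBOARD_LL, LowLevelKeyboardProc,
                                   GetModuleHandleW(nullptr), 0);

        MSG msg;   // a message loop is required for the hook to be called
        while (GetMessageW(&msg, nullptr, 0, 0))
        {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }

        UnhookWindowsHookEx(g_hook);
        return 0;
    }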

Intercepting mouse events in Windows

Well, the title seems pretty clear about what I want to do.
More precisely: I want to create a program (C++ or Java preferred) that manipulates the mouse in two ways: changing its position and performing clicks.
I was thinking about using Allegro (it has mouse routines for the things cited above) or SDL (which I don't know has that kind of routine). I tried with Allegro, unsuccessfully. My problem was that I couldn't virtually "do" clicks. I also couldn't redirect the stuff changed by my program to some other window.
Any tips?
There are a couple of ways to try automating other applications on Windows...
At the simplest level, one can use PostMessage to post keyboard and mouse messages to another application's windows. This has the advantage that it can work even if the other application is minimized. Unfortunately, this approach skips the majority of the input-processing logic, so applications that directly read key state using GetAsyncKeyState will not see (for example) the Control key being 'down' no matter how many WM_KEYDOWN, vk=VK_CONTROL messages you send.
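To make that concrete, here's a minimal C++ sketch that posts a character to classic Notepad's edit control without Notepad having focus. The class names are placeholders for the pre-Windows 11 Notepad; use Spy++ or a similar tool to find the right ones for your target. WM_CHAR is used here because a full WM_KEYDOWN/WM_KEYUP pair would also need the key-state bits of lParam filled in:

    #include <windows.h>

    // Minimal sketch: post a character to classic Notepad's edit control
    // without Notepad having keyboard focus. "Notepad" and "Edit" are the
    // class names of the pre-Windows 11 Notepad; adjust for your target.
    int main()
    {
        HWND notepad = FindWindowW(L"Notepad", nullptr);
        if (!notepad)
            return 1;

        // The keystrokes must go to the child edit control, not the frame.
        HWND edit = FindWindowExW(notepad, nullptr, L"Edit", nullptr);
        if (!edit)
            return 1;

        PostMessageW(edit, WM_CHAR, L'a', 0);   // types an 'a'
        return 0;
    }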
As Hans Passant commented, SendInput places input events in a lower-level input event queue, and so can fully simulate modifier keys. These input events are not posted to windows, however, so to get them delivered successfully, the normal Windows rules of activation and focus need to be followed. That said, this is the approach used by most test-automation software (and is why most test-automation software requires that the application being tested be the active application).
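Here's a minimal C++ sketch of the SendInput approach for the mouse: move the pointer to a screen coordinate and click there. The click lands on whichever window is visible at that point:

    #include <windows.h>

    // Minimal sketch: move the pointer to screen coordinates (x, y) and
    // simulate a left click there with SendInput.
    void ClickAt(int x, int y)
    {
        SetCursorPos(x, y);

        INPUT inputs[2] = {};
        inputs[0].type = INPUT_MOUSE;
        inputs[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;   // button down
        inputs[1].type = INPUT_MOUSE;
        inputs[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;     // button up
        SendInput(2, inputs, sizeof(INPUT));
    }

    int main()
    {
        ClickAt(100, 200);   // example coordinates
        return 0;
    }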
The last of the automation methods I'll mention, and sadly the least likely to work, is the Microsoft UI Automation framework. This framework is intended to allow applications to be used by disabled and/or special-needs users. Sadly, very few software vendors bother to implement this API in their products.

Keyboard hook in Windows C++, or what?

I want to build an application that can send keyboard commands (messages) to Windows.
For example, when I press the combination Ctrl+Shift+N, I want to launch notepad.exe. How can I do that? Do you have any advice about the concepts involved?
I've read that this is possible using keyboard hooks. Is that the only way? Do you know a free, open-source application that does this as simply as possible?
Your particular example can be done without any programming at all, by right clicking on Notepad, selecting Properties, and setting the "hot key" (various Windows versions might call it by a different name) to Ctrl+Shift+N.
If you still would like to write a program to do this, have a look at the RegisterHotKey Win32 API function.
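Here's a minimal C++ sketch of that approach, registering Ctrl+Shift+N as a global hotkey and launching Notepad when it fires. No hook or DLL is needed:

    #include <windows.h>

    // Minimal sketch: register Ctrl+Shift+N as a global hotkey and launch
    // Notepad whenever it is pressed.
    int main()
    {
        if (!RegisterHotKey(nullptr, 1, MOD_CONTROL | MOD_SHIFT, 'N'))
            return 1;   // the combination may already be taken

        MSG msg;
        while (GetMessageW(&msg, nullptr, 0, 0))   // runs until WM_QUIT
        {
            if (msg.message == WM_HOTKEY && msg.wParam == 1)
                ShellExecuteW(nullptr, L"open", L"notepad.exe",
                              nullptr, nullptr, SW_SHOWNORMAL);
        }

        UnregisterHotKey(nullptr, 1);
        return 0;
    }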
AutoHotkey is a free, open-source utility for Windows.
You can automate many tasks with it; check it out.
Things to bear in mind:
A system-wide keyboard hook requires writing a DLL. There's example keyboard hook code on my website here.
Hooks cannot be installed from a low-integrity to a high-integrity application on Vista and Windows 7/8/10, so there's no guarantee your hook will work, depending on what the foreground application is when the key gets hit.
As Greg pointed out, a lot of the time, RegisterHotKey is a much simpler solution for this problem.

How to control the mouse pointer outside our application

I want to control the mouse pointer from my application and be able to interact with other programs using it.
For example, I want my application to be able to click a button in another application.
How should I go about solving this problem?
(Any programming language would work; if you have any suggestions, please let me know.)
Afterthoughts:
I want to do this on Windows, and I want to test my GUI to see if it works in different scenarios. Any language would work for me, since this is not part of the final product, but I prefer one of these: Python, Java, C#, or MATLAB.
Thanks
There are many ways of doing this, and you didn't mention any details of your application (system, target goal, etc.).
If your goal is menial automation, I'd recommend whipping together a quick AutoIt script on Windows. http://www.autoitscript.com/autoit3/index.shtml
If this isn't what you're looking for, please give more details.
Okay, this one is really specific to the operating system and windowing system. But the phrase you're looking for is "mouse grabbing".
As @Mitch suggests, unless you've got a really good reason (like maybe a GUI-testing app?), grabbing the mouse and messing with it that way is very bad form.
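That said, if you do need to press a button in another Windows application programmatically, one common trick avoids moving the real pointer at all: find the button's window handle and send it BM_CLICK. A minimal C++ sketch, where the window title and button caption are placeholders you'd discover with Spy++ or a similar tool (this works for classic Win32 buttons, not UWP or custom-drawn controls):

    #include <windows.h>

    // Minimal sketch: click a button in another application without touching
    // the real mouse pointer, by sending BM_CLICK to the button's handle.
    // "Target Window Title" and "OK" are placeholders.
    int main()
    {
        HWND app = FindWindowW(nullptr, L"Target Window Title");
        if (!app)
            return 1;

        HWND button = FindWindowExW(app, nullptr, L"Button", L"OK");
        if (!button)
            return 1;

        SendMessageW(button, BM_CLICK, 0, 0);   // simulate a full click
        return 0;
    }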

Call another program's functions?

So I have this program that I really like, and it doesn't support AppleScript. I'd like to automate it a little bit. Now, I know that I could use AppleScript to tell the program to tell the menu to tell the submenu to tell the menu item to activate, or whatever, but frankly I don't like AppleScript very much anyway.
When I open the NIB file in IB, I can see the messages that are being sent to FirstResponder; for example, the Copy menu item sends "copy:". Is there any way for me to invoke this directly from another program?
No. It's called protected memory for a reason, you know. The other program is completely insulated from your application. There are ways to put code into other apps, but (a) it's very inadvisable, (b) it requires root privileges, which means the rest of your app needs to be ROCK SOLID AND IMPREGNABLE, and (c) writing such code is a black art requiring knowledge of the operating system's kernel interfaces, virtual memory management, the ABI, the internals of the linker/loader, assembler programming, and the operational parameters and other specifics of the particular processor your app happens to be running on.
Really, Apple Events and other such IPC mechanisms are there for a reason.
Your other alternatives for accessing the data you're looking for (all of which are a bit hacky, to be honest, and give you the fairly significant burden of ensuring the target app is in the state you want/expect) are:
The Accessibility APIs from the ApplicationServices framework, through which you can traverse the UI tree to grab the text from wherever you need it directly, or can activate the menu item. Access for your app has to be explicitly granted by the user, however (although this is much the same as the requirement for UI scripting).
You can use the CoreGraphics APIs (within the ApplicationServices framework again) to send keyboard events to the target application (or just to the system) directly. This would mean sending four events: Command-down, C-down, C-up, Command-up.
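For concreteness, here's a minimal sketch of that CoreGraphics approach, written as a small C++ program (the Quartz event APIs are plain C). It collapses the four-event sequence by setting the Command modifier as a flag on the two 'C' key events, which is the more reliable way to express modifiers with this API; keycode 8 is kVK_ANSI_C on ANSI keyboard layouts, and on modern macOS the calling app needs Accessibility permission:

    // Minimal sketch (macOS): simulate Command-C with the Quartz event APIs.
    // Build with: clang++ cmdc.cpp -framework ApplicationServices
    #include <ApplicationServices/ApplicationServices.h>

    int main()
    {
        const CGKeyCode kKeyC = 8;   // kVK_ANSI_C on ANSI keyboard layouts

        // Set the Command modifier as a flag on the 'C' key events rather
        // than posting separate Command-down/up events.
        CGEventRef cDown = CGEventCreateKeyboardEvent(nullptr, kKeyC, true);
        CGEventRef cUp   = CGEventCreateKeyboardEvent(nullptr, kKeyC, false);
        CGEventSetFlags(cDown, kCGEventFlagMaskCommand);
        CGEventSetFlags(cUp, kCGEventFlagMaskCommand);

        CGEventPost(kCGHIDEventTap, cDown);
        CGEventPost(kCGHIDEventTap, cUp);

        CFRelease(cDown);
        CFRelease(cUp);
        return 0;
    }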
None of these are ideal. To be honest, your best approach would be to look at your requirements and figure out how you can best engineer around the problem by changing those requirements in some way, i.e. instead of grabbing something directly, ask the user to provide some input, etc.
You might be interested in SIMBL or in mach_inject. SIMBL is a daemon (in my fork based on mach_inject; in the original version based on injection via a ScriptingAdditions hack) which does the injection for you, so you just need to put a bundle with your code into the SIMBL directory and SIMBL will inject it into the target application for you. Or you can do it yourself via mach_inject. Or, probably more convenient, mach_inject_framework, which injects and runs code that just loads some framework.
I think Jim may overstate the point a bit; he's not wrong, but it seems misleading. There are lots of ways to cause a Cocoa program to execute its own code under your control (Carbon is harder). The Accessibility API is very commonly used this way (so commonly that I expect it to be repurposed eventually). F-Script can give you all kinds of access to the innards of another Cocoa program. While Input Managers may well exit the scene at some point, SIMBL is still out there today to do this kind of stuff.
Whether you like AppleScript or not, Apple Events are the primary way Apple provides for inter-program control. Have you double-checked Script Editor's Open Library function to find out if the program really does have any AppleScript support? You can code Apple Events entirely in Objective-C these days using Leopard's Scripting Bridge. I wrote up a tutorial, if you like (it's still under-documented by Apple).
Cocoa is a reverse-engineer's dream. The same guys who host SIMBL have a nice intro to the subject. "Wolf" also writes a lot of useful information on this.
Jim's right. Many of these approaches can completely destabilize the system if done incorrectly (sometimes even if done correctly). I don't do much of this stuff on my production systems; I need them to work. But there are a lot of things you can make a Mac app do, and it's a good part of a Mac developer's training to understand how all the pieces really work.
