I know that I can use
SendMessage(buttonHandle, BM_CLICK, 0, 0);
to get my program (program 1) to click a button on another program (program 2), but I was wondering if there was a way of doing something else. The button I wish to click on program 2 is going to pop up a window.
Question: Is there any message I can send with SendMessage() to bring that window up directly, or is there some other function I can use to do that?
Possible ways to automate another application include:
Faking input.
Sending messages to specific windows.
Using a system-wide automation framework such as UI Automation.
Using an application-specific automation framework.
You are asking whether the last of these options is possible. That depends on the application: if it publishes such an automation interface, it is possible. Otherwise you are looking at one of the other options.
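As a minimal sketch of the first option (faking input), the snippet below brings a hypothetical target window to the foreground and synthesizes a spacebar press, which activates whichever button currently has keyboard focus. The window title "Program 2" is an assumption for illustration only.

    #include <windows.h>

    int main() {
        // Hypothetical target: a top-level window titled "Program 2".
        HWND target = FindWindowW(nullptr, L"Program 2");
        if (!target) return 1;

        SetForegroundWindow(target);   // synthesized input goes to the foreground window
        Sleep(100);                    // give it a moment to take focus

        // Press and release the spacebar, which "clicks" whichever
        // button currently has the keyboard focus in that window.
        INPUT in[2] = {};
        in[0].type = INPUT_KEYBOARD;
        in[0].ki.wVk = VK_SPACE;
        in[1].type = INPUT_KEYBOARD;
        in[1].ki.wVk = VK_SPACE;
        in[1].ki.dwFlags = KEYEVENTF_KEYUP;
        SendInput(2, in, sizeof(INPUT));
        return 0;
    }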
Related
I would like to know whether there is a way to drive an existing Windows application. I want to execute operations in an application, such as filling out text fields in a form, hitting Next and Submit buttons, etc. Basically, I want to automate the operations a user would perform. What would be the best way to achieve this?
It is possible (with limitations and quirks) if that particular Windows application uses native Windows controls (UI elements). Qt, for example, paints its UI elements "by hand", while MFC applications use the native (and extended) Win32 controls. So, it depends.
You can explore the application and its UI elements using the Spy++ tool that ships with Visual Studio (free alternatives are also available). Using these tools, you can look up the target window's class name, ID, and other attributes that will help you find and identify the elements of interest with the Windows API functions.
You can use EnumDesktopWindows, FindWindowEx, FindWindow, and others to find the window and the inner controls you are interested in. Then, using SendMessage, you can send various messages to set focus, emulate mouse clicks, set the text of an Edit control, simulate button clicks, and so on.
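To make that concrete, here is a minimal sketch of the FindWindow/FindWindowEx/SendMessage approach. The window title "Login", the Edit control, and the "OK" button caption are assumptions for illustration; in practice you would look up the real class names and captions with Spy++.

    #include <windows.h>

    int main() {
        // Hypothetical target: a dialog titled "Login" that contains an
        // Edit control and a button captioned "OK" (found with Spy++).
        HWND dialog = FindWindowW(nullptr, L"Login");
        if (!dialog) return 1;

        HWND edit  = FindWindowExW(dialog, nullptr, L"Edit", nullptr);   // first Edit child
        HWND okBtn = FindWindowExW(dialog, nullptr, L"Button", L"OK");   // button captioned "OK"
        if (!edit || !okBtn) return 1;

        // Fill in the text field, then simulate a click on the button.
        SendMessageW(edit, WM_SETTEXT, 0, (LPARAM)L"some text");
        SendMessageW(okBtn, BM_CLICK, 0, 0);
        return 0;
    }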
You can write such a program using UI Automation, which allows a program to discover and use the GUI of another application. It's how accessibility tools like screen readers interact with your applications.
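Below is a rough sketch of that idea using the native IUIAutomation COM API: it locates a descendant element named "OK" in another application's window and invokes it. The window title "Program 2" and the element name are assumptions, and error handling is mostly omitted.

    #include <windows.h>
    #include <uiautomation.h>
    // link with ole32.lib and oleaut32.lib

    int main() {
        CoInitializeEx(nullptr, COINIT_MULTITHREADED);

        IUIAutomation* ua = nullptr;
        CoCreateInstance(__uuidof(CUIAutomation), nullptr, CLSCTX_INPROC_SERVER,
                         __uuidof(IUIAutomation), (void**)&ua);
        if (!ua) return 1;

        // Start the search at the other application's top-level window
        // ("Program 2" is a hypothetical title).
        HWND hwnd = FindWindowW(nullptr, L"Program 2");
        IUIAutomationElement* window = nullptr;
        if (!hwnd || FAILED(ua->ElementFromHandle(hwnd, &window))) return 1;

        // Find a descendant whose Name property is "OK".
        VARIANT name;
        name.vt = VT_BSTR;
        name.bstrVal = SysAllocString(L"OK");
        IUIAutomationCondition* cond = nullptr;
        ua->CreatePropertyCondition(UIA_NamePropertyId, name, &cond);

        IUIAutomationElement* button = nullptr;
        window->FindFirst(TreeScope_Descendants, cond, &button);

        // Buttons expose the Invoke pattern; invoking it presses the button.
        if (button) {
            IUIAutomationInvokePattern* invoke = nullptr;
            button->GetCurrentPatternAs(UIA_InvokePatternId,
                                        __uuidof(IUIAutomationInvokePattern),
                                        (void**)&invoke);
            if (invoke) {
                invoke->Invoke();
                invoke->Release();
            }
            button->Release();
        }

        VariantClear(&name);
        if (cond) cond->Release();
        window->Release();
        ua->Release();
        CoUninitialize();
        return 0;
    }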
I am not sure how to ask the question, so here is a picture of an idea that came to mind.
So for example, when you run my "custom launcher" it displays a window with a couple buttons on the side which you can assign values to. When you click on a button, the appropriate program will run in the big panel on the right (in window mode).
This is all from the user's perspective of course. They will just see that the program they want to run appears in that panel. The actual implementation may have nothing to do with "one program running inside another program"
My own use case is limited to windows desktop platforms only, but if it is possible to generalize it that would be nice as well.
Is this actually possible? Can I write such a program that will run another program inside a panel? The program that is launched may be someone else's, such as MS Paint or Calculator.
Just to expand on my comment above, here is an approach that may work for you: Fake it :)
When you launch the program, intercept all window messages to it that control its position on screen. That way it "appears" to be fixed in place, but in reality it is still attached to the normal Windows desktop.
Here's some light reading for you:
Windows Event Hooks
A hook is a mechanism by which an application can intercept events, such as messages, mouse actions, and keystrokes. A function that intercepts a particular type of event is known as a hook procedure. A hook procedure can act on each event it receives, and then modify or discard the event.
I would recommend against it in a commercial application, because you are modifying the behavior of software you don't own and that software may make assumptions about what its parent window is; but for experimentation there is the SetParent Win32 function.
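For experimentation, a sketch of the SetParent approach might look like the following: it re-parents Notepad's top-level window into a panel of your own application. The "Notepad" window class and the panel handle are assumptions, and as noted above the hosted application may not expect this.

    #include <windows.h>

    // Re-parent Notepad's top-level window into `panel`, a child window
    // of our own application (for experimentation only).
    void EmbedNotepad(HWND panel) {
        HWND notepad = FindWindowW(L"Notepad", nullptr);
        if (!notepad || !panel) return;

        // Make it a child window and strip the caption/border so it
        // behaves like part of the panel.
        LONG_PTR style = GetWindowLongPtrW(notepad, GWL_STYLE);
        style = (style & ~(WS_POPUP | WS_CAPTION | WS_THICKFRAME)) | WS_CHILD;
        SetWindowLongPtrW(notepad, GWL_STYLE, style);

        SetParent(notepad, panel);

        // Resize the embedded window to fill the panel's client area.
        RECT rc;
        GetClientRect(panel, &rc);
        SetWindowPos(notepad, nullptr, 0, 0, rc.right, rc.bottom,
                     SWP_NOZORDER | SWP_FRAMECHANGED | SWP_SHOWWINDOW);
    }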
I need to make minor modifications to a legacy Win32 application, but have no access to the original developer or source code. The application was not designed to be extended by third parties. What are my options for doing this?
The changes required are not extensive: I need to launch an external application when a specific text label is clicked.
Is it possible to access and modify the controls in the target application from an outside application?
What you are asking for can be accomplished either by using a SetWindowsHookEx() hook or by subclassing the label directly to detect when the label is clicked. Your hook/subclass can then launch the external process.
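A sketch of the SetWindowsHookEx() route is shown below. Because the hook watches another process's UI thread, the hook procedure has to live in a DLL that the system maps into the target process; the control ID and the launched path are hypothetical, and per-process globals would need more care in a real implementation.

    // hookdll.cpp - build as a DLL; the system maps it into the target
    // process when a thread-specific WH_MOUSE hook needs to run there.
    #include <windows.h>

    // Hypothetical: the label's control ID, as found with Spy++.
    static const int kLabelCtrlId = 1234;
    static HHOOK g_hook;

    LRESULT CALLBACK MouseProc(int code, WPARAM wParam, LPARAM lParam) {
        if (code == HC_ACTION && wParam == WM_LBUTTONUP) {
            const MOUSEHOOKSTRUCT* mhs = (const MOUSEHOOKSTRUCT*)lParam;
            if (GetDlgCtrlID(mhs->hwnd) == kLabelCtrlId) {
                // The label was clicked: launch the external application.
                ShellExecuteW(nullptr, L"open", L"C:\\Tools\\other.exe",
                              nullptr, nullptr, SW_SHOWNORMAL);
            }
        }
        return CallNextHookEx(g_hook, code, wParam, lParam);
    }

    // Called from the automating process; dllInstance is this DLL's module
    // handle, threadId the target window's UI thread (GetWindowThreadProcessId).
    extern "C" __declspec(dllexport)
    BOOL Install(HINSTANCE dllInstance, DWORD threadId) {
        g_hook = SetWindowsHookExW(WH_MOUSE, MouseProc, dllInstance, threadId);
        return g_hook != nullptr;
    }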
If you need to react when a text label is clicked, you could try to use the Microsoft UI Automation technology, and in this case, UI Automation Events.
Note that depending on how the application is written, it may or may not work.
You can try the cool Inspect and Accessible Event Watcher tools, at least to check whether it seems feasible.
Normally the user does this by right-clicking the console title bar, then selecting "Edit" and finally "Mark". -> http://www.megaleecher.net/Copy_Paste_Text_Dos_Window
So, is there a way of doing this from the console application itself, by sending a message, an API call, or a keyboard sequence to its own window?
If this is your own application and you want the richer behaviour and flexibility of a windowed app rather than a console app, then use a windowed app. Otherwise, you can try to automate the steps by simulating the input via SendInput. I would advise against doing this, because it requires two steps (one for the right-click, one to select "Mark"), which means that if someone clicks something else between those two events your sequence will be broken. Furthermore, you are really relying on automating an implementation detail, which is prone to change at any point.
Looking through the Console Functions, it doesn't appear as though anything exists for setting the selection. The closest is going the other way with GetConsoleSelectionInfo.
If you want to process the information that is within a console application, a better alternative is to pipe it to your own process and deal with it there.
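A minimal sketch of that alternative: launch the console program yourself with its standard output redirected to a pipe, and read the text in your own process. The child command line here ("cmd.exe /c dir") is just a placeholder.

    #include <windows.h>
    #include <stdio.h>

    int main() {
        // Inheritable pipe: the child writes to hWrite, we read from hRead.
        SECURITY_ATTRIBUTES sa = { sizeof(sa), nullptr, TRUE };
        HANDLE hRead, hWrite;
        if (!CreatePipe(&hRead, &hWrite, &sa, 0)) return 1;
        SetHandleInformation(hRead, HANDLE_FLAG_INHERIT, 0);  // keep our end private

        STARTUPINFOW si = { sizeof(si) };
        si.dwFlags = STARTF_USESTDHANDLES;
        si.hStdOutput = hWrite;
        si.hStdError  = hWrite;
        PROCESS_INFORMATION pi = {};

        // Placeholder child: any console program whose output we want.
        wchar_t cmd[] = L"cmd.exe /c dir";
        if (!CreateProcessW(nullptr, cmd, nullptr, nullptr, TRUE,
                            0, nullptr, nullptr, &si, &pi)) return 1;
        CloseHandle(hWrite);  // so ReadFile reports EOF once the child exits

        char buf[4096];
        DWORD n;
        while (ReadFile(hRead, buf, sizeof(buf), &n, nullptr) && n > 0) {
            fwrite(buf, 1, n, stdout);   // process the captured text here
        }

        CloseHandle(hRead);
        CloseHandle(pi.hProcess);
        CloseHandle(pi.hThread);
        return 0;
    }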
Found: PostMessage(GetConsoleWindow(), WM_COMMAND, 65522, 0); (this posts the console's "Mark" system-menu command to its own window).
I have an application, written in Delphi, that launches several other .exe components. The question I am asking is: is it possible to close the Delphi app along with all the applications it opened (when clicking the '[x]' button)?
Also, obviously, I have learned how to open and close external applications, but in several cases, like Windows Media Player, it just doesn't seem to work. Can anyone give me a solution to this?
You can use Job Objects; read the documentation for the CreateJobObject and AssignProcessToJobObject functions.
A job object allows groups of processes to be managed as a unit.... Examples include enforcing limits such as working set size and process priority or terminating all processes associated with a job.
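The original question is about Delphi, but the Win32 calls are the same; here is a rough C-style sketch of a job object configured so that every process assigned to it is killed when the job handle is closed (for example, when your process exits). The child path is a placeholder.

    #include <windows.h>

    int main() {
        // Create a job whose processes are all terminated when the last
        // handle to the job is closed (i.e. when this process exits).
        HANDLE job = CreateJobObjectW(nullptr, nullptr);
        JOBOBJECT_EXTENDED_LIMIT_INFORMATION limits = {};
        limits.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
        SetInformationJobObject(job, JobObjectExtendedLimitInformation,
                                &limits, sizeof(limits));

        // Launch a child (placeholder path) and put it in the job.
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};
        wchar_t cmd[] = L"C:\\MyApp\\component.exe";
        if (CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE,
                           CREATE_SUSPENDED, nullptr, nullptr, &si, &pi)) {
            AssignProcessToJobObject(job, pi.hProcess);
            ResumeThread(pi.hThread);   // started suspended so it joins the job first
        }

        // ... when this process ends (or CloseHandle(job) is called),
        // every process assigned to the job is terminated.
        return 0;
    }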
If you keep track of the applications you open, you can post a WM_CLOSE message to each one's main window handle in the OnClose event of your Delphi app's main form, asking each of them to shut down.
The same should work for Media Player, but it's hard to say when you don't give any information about how you opened it.
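In Win32 terms (the Delphi equivalent is a direct translation), the idea sketched above might look like this, assuming you kept the top-level window handle of each application you launched:

    #include <windows.h>
    #include <vector>

    // Hypothetical list of top-level window handles recorded when
    // each child application was launched.
    std::vector<HWND> g_childWindows;

    // Call this from your main form's close handler: ask each child
    // application to close itself before your own process exits.
    void CloseChildren() {
        for (HWND hwnd : g_childWindows) {
            if (IsWindow(hwnd)) {
                PostMessageW(hwnd, WM_CLOSE, 0, 0);
            }
        }
    }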