My application is a fullscreen window that renders a designated other window (via DWM), for example Google Chrome. I would like to know whether it's possible to send events (such as mouse and keyboard events) to that specified window.
Of course, the designated window has to stay in the background, with my current application in the foreground.
My application is written in C++. I'm working on Windows 7/8.
Just to put it into an answer.
Based on the question "Does any program/language/library that interacts with windows do it via the WIN32 API?", you should be able to use the Windows API to send a window message to any window. All you need is that window's handle, or you could broadcast to all windows.
The specific function is SendMessage: http://msdn.microsoft.com/en-us/library/windows/desktop/ms644950(v=vs.85).aspx
Note that this function blocks until the window responds and processes the message, which could hurt GUI performance. If you notice issues, try http://msdn.microsoft.com/en-us/library/windows/desktop/ms644951(v=vs.85).aspx instead.
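As a minimal sketch of that approach (the window title here is just a placeholder for however you identify the target window), finding the window and posting a message to it might look like this:

    #include <windows.h>

    int main()
    {
        // Find the target window by its title bar text. "Untitled - Notepad"
        // is only a placeholder; use the real title, or FindWindowEx /
        // EnumWindows for something more robust.
        HWND hTarget = FindWindowW(nullptr, L"Untitled - Notepad");
        if (hTarget == nullptr)
            return 1;

        // Post a character message to the window. PostMessage returns
        // immediately; SendMessage would wait for the target to process it.
        PostMessageW(hTarget, WM_CHAR, L'A', 0);
        return 0;
    }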
Related
Using JUCE with TUIO, I'm developing a multi-touch utility to send "hot key" commands to other applications (I am using a USB touch frame that sends TUIO messages). For instance, I provide an interface through which users can touch-and-hold to program a key combo and then tap that button to send the programmed key combo to another app. The way I accomplish this on OS X is by running my utility as a "background only" application (NSApplicationActivationPolicyProhibited). I use [NSWindow setCanHide: NO] so the GUI of my utility is visible even though it runs as a background app.
It works well except in the case that a window from another application is on top of mine. What happens is that touches get passed through that other app into mine, causing unintentional button pushes in my app. Normally, I could have my app only listen to the TUIO touch callback whenever it is the active application, [NSApp isActive]. But since my app is background only, it is never active, and I have no way to tell if another window is covering it to prevent touches.
So, is there any way for a "background only" app to be able to tell if it is on top of all other windows? Or, is there a way from within my app to get a list of all Cocoa windows from other applications and be able to tell if they are appearing on top of my "background only" app?
Also, does anyone know how I would go about all of the above in Windows? In other words, what is the Windows equivalent of NSApplicationActivationPolicyProhibited and would I be able to tell if it is covered by other applications' windows?
I have been working on Windows automation and monitoring.
What exactly happens when I lock the screen of a Windows machine?
I am working with Windows 7 at the moment; are there big differences in behavior if I switch to Vista or the server versions?
Is there still a desktop that can be accessed via APIs?
I know that I can still send keystrokes and mouse clicks to specific windows (via ControlSend and ControlClick), but there seems to be no "desktop" itself.
Could someone shed some light on this whole thing or point me at a readable source where I could get an overview over the topic?
Basically, what happens is that Windows switches to the secure desktop and makes it the current one, so input is now associated with it.
The old desktop remains where it was: all the HWNDs on the desktop are still there, and any thread attached to that desktop can still access those HWNDs, get their location, and so on. You can still send messages to windows on this desktop, so long as the thread sending the message is also on that desktop.
However, since the desktop is now inactive, it cannot receive input. GetForegroundWindow will return NULL (IIRC), and you can't use SendInput any longer, since input now belongs to [a thread on] a different desktop; no controls on that inactive desktop can receive focus.
Note that sending keypress messages to a control that doesn't have focus can sometimes cause unexpected behavior, since the app or control generally never expects to receive keyboard input without getting the focus first. (This can be problematic for controls that set up some sort of input context in WM_SETFOCUS and clear it up in WM_KILLFOCUS, for example.)
In short, the UI is still there: you can do certain queries against it, but you can no longer automate it as you could on a regular desktop by sending input, and some other functions that relate to focus or input may fail.
I'm not super familiar with AutoHotkey, but the name and description of its functionality suggest that it relies heavily on the underlying Win32 SendInput API. This won't work at all for keyboard input when the desktop is inactive.
For a reasonable overview of how desktops work and how they relate to winstations, the locked desktop, and so on, check out the Desktop article on MSDN.
One issue that I've run into in the past with desktops and automation is: how do I leave a long-running test running that uses some form of user-input automation (mouse, keyboard simulation), but still lock my PC so that someone can't just walk by and interfere with it? Once you lock the PC, the desktop is inactive, and so the automation stops working. A similar issue happens if the screensaver kicks in: the desktop switches, and the automation fails.
One solution is to use two PCs; let's call them Main and Test. From Main, open a remote terminal services client onto the Test machine, and then run the automated test on the Test machine, but from a terminal services client window on the Main machine. Now the cool part: you can minimize that TSC window, or even lock the Main machine (or let the screensaver kick in), and that virtual session will continue working, thinking that it is still active - it's just that nobody is paying it any attention. This is one way to create a "connected" session with an active desktop, but one that no one can interfere with, because it's protected behind the locked desktop of the Main machine.
I don't know the details, but I believe the lock screen constitutes a separate "desktop" and maybe also a separate "window station" (as I understand it a window station is merely a container for desktops). The MSDN section on window stations should hopefully be useful: http://msdn.microsoft.com/en-us/library/windows/desktop/ms687098%28v=vs.85%29.aspx
In order to access a desktop, you will need to use the regular Windows APIs from a thread that is on that desktop. SetThreadDesktop would probably be the easiest way to do that in C, as long as the desktop isn't in a different window station.
Unfortunately, this is already difficult for a regular privileged application, and using AutoHotkey complicates it even more. Since you don't have control over threads or over process initialization, you will probably have to create a new process in the other desktop (you can do this using the CreateProcess API, which appears to have a wrapper available for AHK to which you can supply a desktop name: http://www.autohotkey.com/forum/topic1952.html). Your process will need special privileges to do this; I'm not sure that even running as Administrator is enough.
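For reference, a minimal C++ sketch of the SetThreadDesktop route mentioned above (the desktop name is an assumption; the secure/lock-screen desktop is normally only accessible to highly privileged processes, so this is illustrative rather than a working recipe for the locked case):

    #include <windows.h>

    // Attach the calling thread to another desktop in the current window
    // station so it can enumerate and message the windows that live there.
    bool SwitchThreadToDesktop(const wchar_t* desktopName)  // e.g. L"Default"
    {
        HDESK hDesk = OpenDesktopW(desktopName, 0, FALSE, GENERIC_ALL);
        if (hDesk == nullptr)
            return false;

        // Fails if the calling thread already has windows or hooks
        // on its current desktop.
        if (!SetThreadDesktop(hDesk))
        {
            CloseDesktop(hDesk);
            return false;
        }
        // On success, keep hDesk open for as long as the thread uses it.
        return true;
    }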
There is an application written in Java using AWT, and I want to resize its windows from an external program. My OS is Windows XP. Actually, this application is an online poker client.
The windows are of "SunAwtFrame" class, so I look for those windows and call MoveWindow/SetWindowPos on them. The result is not the one I expect:
(Screenshot of the problem: http://savepic.net/1331700.png)
As you see, the window did resize, but the content did not. When resizing manually, the content scales, and I want the same behavior here.
Probably some additional action is needed to make the AWT window understand it was resized.
How can I fix the problem?
I recommend doing this:
1. Use Spy++ (available as a tool in Microsoft Visual Studio) to filter messages sent to the SunAwtFrame window.
2. Resize the window manually.
3. Figure out which messages are sent to the window during resizing.
4. Use MoveWindow/SetWindowPos and/or send those messages after you resize.
Take a look at the functions InvalidateRect and UpdateWindow.
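As a rough sketch of steps 3-4 above (the target size is arbitrary, and SetWindowPos itself already generates WM_SIZE, so posting extra messages is only needed if Spy++ shows the manual resize produces something more):

    #include <windows.h>

    int main()
    {
        // Find the AWT frame by its window class (first match only; use
        // EnumWindows to pick a specific window among several).
        HWND hFrame = FindWindowW(L"SunAwtFrame", nullptr);
        if (hFrame == nullptr)
            return 1;

        // Resize to 800x600, keeping the current position and Z order.
        // SWP_FRAMECHANGED makes the window recalculate its frame.
        SetWindowPos(hFrame, nullptr, 0, 0, 800, 600,
                     SWP_NOMOVE | SWP_NOZORDER | SWP_FRAMECHANGED);

        // Force a repaint of the whole client area.
        InvalidateRect(hFrame, nullptr, TRUE);
        UpdateWindow(hFrame);
        return 0;
    }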
I have a hidden process that waits for messages from a non-standard hardware button and then runs an application (with CreateProcess). There's no problem with disturbing the user; it's an action the user approved himself. Everything is fine with the usual layout, with the taskbar shown and a mix of captioned and non-captioned windows. But the situation is different in XP and 7 when the current application is full-screen. A full-screen application in this case means a window without borders that has exactly the same dimensions as the screen. Windows hides the taskbar for such an application even if it's set to always on top.
In XP it's OK: the taskbar is shown in this case, and the launched application (for example Calculator) as well; the full-screen app remains visible in the areas not covered by the launched app and the taskbar. But in Windows 7 nothing happens visually: the full-screen app stays on top, and if I switch to the taskbar, the launched application is there. I tried to solve it with SetForegroundWindow, BringWindowToTop, and even an AllowSetForegroundWindow(GetCurrentProcessId()) call for a window handle found via CreateProcess-WaitForInputIdle-EnumThreadWindows, with no change. So did something change since XP, related to full-screen windows, that is officially documented?
Thanks,
Max
I would imagine that, if you have your own hardware device, there is some API for generating "real" user input. Clearly the legacy keyboard and mouse drivers, and now USB HID drivers (many of which are user mode, I think?), have access to an API to do so.
Synergy+, for example, can generate fake keyboard and mouse events on connected PCs, and as a consequence of the faked input, windows switch activation normally.
So my initial idea is for your user-mode "device" application to synthesize actual keyboard input - SendInput seems a likely candidate for the API that can "fake" real user input events.
Then, use an API like RegisterHotKey in your "UI" app to respond to the hotkey combination your device app generates.
Now, (assuming that SendInput IS generating user input events at the correct level), you should (from within the WM_HOTKEY handler in your UI app) have permission (because everything was "user initiated") to change the foreground window (to yourself).
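A rough sketch under those assumptions (the Ctrl+Alt+F9 combination is an arbitrary choice): the device process injects the combo with SendInput, and the UI process registers the same combo with RegisterHotKey and promotes itself in its WM_HOTKEY handler.

    #include <windows.h>

    // Device process: synthesize Ctrl+Alt+F9 as if the user typed it.
    void SendHotkeyCombo()
    {
        INPUT in[6] = {};
        const WORD keys[3] = { VK_CONTROL, VK_MENU, VK_F9 };

        for (int i = 0; i < 3; ++i)            // key-down events
        {
            in[i].type = INPUT_KEYBOARD;
            in[i].ki.wVk = keys[i];
        }
        for (int i = 0; i < 3; ++i)            // key-up events, reverse order
        {
            in[3 + i].type = INPUT_KEYBOARD;
            in[3 + i].ki.wVk = keys[2 - i];
            in[3 + i].ki.dwFlags = KEYEVENTF_KEYUP;
        }
        SendInput(6, in, sizeof(INPUT));
    }

    // UI process, during window creation:
    //     RegisterHotKey(hwnd, 1, MOD_CONTROL | MOD_ALT, VK_F9);
    // and in its window procedure:
    //     case WM_HOTKEY:
    //         SetForegroundWindow(hwnd);  // permitted: triggered by "user" input
    //         return 0;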
Vista introduced the desktop composition feature. In short, all windows draw to memory bitmaps, and the Desktop Window Manager then composes these bitmaps and draws them on a full-screen Direct3D surface. Full-screen windows do not participate in desktop composition and get to draw directly on the screen (mostly because the majority of full-screen apps are games that need real-time screen updates).
In particular, this means that when a full screen app is up and running, it is covering the DWM composed image and the user needs to switch to a DWM-managed window for the DWM to start drawing on top of the full-screen app.
I don't have a good solution for your problem, unfortunately. One way to solve it would be to add the WS_CAPTION style to your app and then handle WM_NCPAINT/WM_NCCALCSIZE/WM_NCHITTEST yourself. This would allow you to lie to the DWM that you are a regular windowed application, while visually changing your non-client area to look like you have no title bar. However, this does require a certain amount of additional code and might be more effort than you want to invest.
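A minimal sketch of that idea, assuming a standard Win32 window procedure (this only shows the core trick; a polished version would also handle WM_NCPAINT and activation):

    // Window was created WITH WS_CAPTION so the DWM treats it as a normal,
    // composed window. Inside its window procedure:
    case WM_NCCALCSIZE:
        if (wParam == TRUE)
            return 0;          // client area covers the entire window rectangle
        break;

    case WM_NCHITTEST:
        return HTCLIENT;       // everything behaves as client area, no caption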
Another way you can try to solve your problem is to explicitly minimize your full-screen application window when launching the new process. However, you will then have to solve the problem of when to maximize it back again.
Btw, you might find the comments on this post from Raymond Chen interesting.
Windows supports multiple desktops, and my guess would be that the full-screen app is using a different desktop than the default one (where your application will be shown). A desktop object in Windows is "a logical display surface and contains user interface objects such as windows, menus, and hooks". For example, screen savers are normally started on a separate desktop.
You can find out which desktop an application is running on using Process Explorer:
Set Process Explorer to replace Task Manager and to run always on top.
When your full-screen app is shown, launch Process Explorer by pressing Ctrl + Shift + Esc
Within Process Explorer, select the full screen process and press Ctrl + H to display the handles of this process
See the value of the Desktop item in the list. Usually this would be set to Default.
If you know which desktop this app is running on, you can start your process on the same desktop by putting that desktop's name into the lpDesktop field of the STARTUPINFO you pass to your CreateProcess call.
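A minimal sketch of that (the desktop name and executable path are placeholders; "winsta0\default" is the normal interactive desktop):

    #include <windows.h>

    int main()
    {
        STARTUPINFOW si = {};
        si.cb = sizeof(si);

        // Desktop in "window station\desktop" form. Replace "default" with
        // the desktop name shown in Process Explorer for the full-screen app.
        wchar_t desktop[] = L"winsta0\\default";
        si.lpDesktop = desktop;

        wchar_t cmd[] = L"C:\\Windows\\System32\\calc.exe";  // placeholder
        PROCESS_INFORMATION pi = {};
        if (CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE,
                           0, nullptr, nullptr, &si, &pi))
        {
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        }
        return 0;
    }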
I have an application that manages patient demographic information. Along with this data, a user can scan a picture of a patient and assign that picture to the patient. When the user clicks the scan button, a separate application is opened as a dialog in order to scan the image. When running this on XP, everything worked fine: the imaging application loaded up and gained focus. On Vista, however, occasionally the imaging application will not gain focus and will pop up behind the main application. When running full-screen or through a 2008 Application Server you cannot see the application; you only get a locked screen, and it appears nothing has happened. Is there any way to change the window focus management on Vista to work the way XP did? I'm looking for a way to solve this without making changes to the actual application, if possible.
I think you will have to make changes to your application to allow the imaging application to take the focus. I'm going to assume that your application launches the imaging application through ShellExecuteEx or CreateProcess. If so, you can get the process handle of the launched process either through SHELLEXECUTEINFO.hProcess (for ShellExecuteEx) or PROCESS_INFORMATION.hProcess (for CreateProcess). Immediately after launching the imaging application, call the AllowSetForegroundWindow API:
AllowSetForegroundWindow(GetProcessId(hProcess));
This will allow the imaging application to place its main window/dialog in the foreground when it's starting up.
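For example, a sketch of the CreateProcess path (the imaging application's path is a placeholder):

    #include <windows.h>

    // Launch the imaging application and let it bring itself to the foreground.
    bool LaunchImagingApp()
    {
        STARTUPINFOW si = {};
        si.cb = sizeof(si);
        PROCESS_INFORMATION pi = {};
        wchar_t cmd[] = L"C:\\Path\\To\\ImagingApp.exe";   // placeholder path

        if (!CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE,
                            0, nullptr, nullptr, &si, &pi))
            return false;

        // Hand our foreground rights to the new process so its main
        // window/dialog can come to the front while it starts up.
        AllowSetForegroundWindow(GetProcessId(pi.hProcess));

        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return true;
    }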
You could try the following steps:
1. Right Click on the exe
2. Select Properties
3. Select the Compatibility Tab
4. Check "Run this program in compatibility mode for:"
5. Select Windows XP (Service Pack 2)
You could iterate through all top level HWNDs and identify the scanning application via its window class, then send an appropriate message to raise the window.
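A rough sketch of that approach (the window class name is a placeholder; use Spy++ or similar to find the real one, and SetForegroundWindow is just one candidate for the "appropriate message"):

    #include <windows.h>
    #include <cwchar>

    // Called for every top-level window; raises the first one whose class matches.
    static BOOL CALLBACK RaiseScanner(HWND hwnd, LPARAM)
    {
        wchar_t cls[256] = {};
        GetClassNameW(hwnd, cls, 256);
        if (wcscmp(cls, L"ScannerAppClass") == 0)   // placeholder class name
        {
            SetForegroundWindow(hwnd);
            return FALSE;   // stop enumerating
        }
        return TRUE;        // keep looking
    }

    // Usage: EnumWindows(RaiseScanner, 0);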
I don't believe this is Vista vs. XP related. I think it's simply that this imaging app takes longer to start on Vista.
Since Windows 2000, the window manager has prevented background applications from stealing the foreground. When an application is launched, it has a window of opportunity to create and show a window that will take the foreground. If it takes too long, the window manager decides that the current window should keep the foreground and inhibits the other app from taking it when it does finally show up.
I can't think of any specific way to avoid this... other than using FindWindow to search for the other app's window after launching it. When you eventually find it, call SetForegroundWindow on it to bring it to the foreground.
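A small sketch of that workaround (the window title and retry timing are placeholders):

    #include <windows.h>

    // Poll for the imaging application's window for up to ~5 seconds,
    // then force it to the foreground once it appears.
    void BringImagingAppForward()
    {
        for (int attempt = 0; attempt < 50; ++attempt)
        {
            // Placeholder title; matching by window class may be more reliable.
            HWND hwnd = FindWindowW(nullptr, L"Imaging Application");
            if (hwnd != nullptr)
            {
                SetForegroundWindow(hwnd);
                break;
            }
            Sleep(100);
        }
    }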