win32 PostMessage WM_APPCOMMAND sends multiple messages instead of one - winapi

I'm writing a small accessibility app which simulates certain keyboard gestures, such as volume up/down.
The goal is to send a single command.
In practice, the volume goes all the way up to 100%, as if the user had pressed the button for a couple of seconds or as if the message had been dispatched multiple times.
This behavior is the same with both PostMessage and SendMessage, in both C and C# (using P/Invoke).
C:
PostMessage((HWND)0xFFFF, 0x0319, 0, 0xA0000);
C#:
PostMessage(new IntPtr(0xffff), WindowMessage.WM_APPCOMMAND, (void*)0, (void*)0xa0000);
The meaning of the parameters: send to all windows, the message, no source, volume up.
Question: How do I issue a command which results in Windows adjusting the volume by the smallest increment?
Additionally, I attempted using WM_KEYUP and WM_KEYDOWN, without success:
// dispatch to all apps, message, wparam: virtual key, lparam: repeat count = 1
User32.PostMessage(new IntPtr(0xffff), User32.WindowMessage.WM_KEYDOWN, new IntPtr(0xaf000), new IntPtr(1));
User32.PostMessage(new IntPtr(0xffff), User32.WindowMessage.WM_KEYUP, new IntPtr(0xaf000), new IntPtr(1));

The reason the command was handled multiple times is, as Hans pointed out in the comments, that I broadcast it to all windows by using 0xFFFF (HWND_BROADCAST) as the first parameter. Every window that handled it raised the volume by a notch.
The solution is to send the message to a single window instead, either:
the shell window handle, GetShellWindow(), or
the foreground window handle, GetForegroundWindow().
Both handles adjusted the volume by exactly one notch. GetDesktopWindow() did not work, though.
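For illustration, a minimal Win32/C++ sketch of that fix: target the shell window (falling back to the foreground window) instead of broadcasting. APPCOMMAND_VOLUME_UP sits in the high word of lParam, which is where the 0xA0000 in the question comes from.

#include <windows.h>

int main()
{
    // Pick a single target window instead of broadcasting with HWND_BROADCAST.
    HWND target = GetShellWindow();
    if (target == NULL)
        target = GetForegroundWindow();

    // APPCOMMAND_VOLUME_UP (10) goes in the high word of lParam (10 << 16 == 0xA0000).
    SendMessage(target, WM_APPCOMMAND, (WPARAM)target,
                MAKELPARAM(0, APPCOMMAND_VOLUME_UP));
    return 0;
}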

Related

How exactly does GetKeyState work?

I have been struggling to understand how GetKeyState operates. I have done endless Google searching and still haven't managed to understand exactly how it works.
According to MSDN:
The key status returned from this function changes as a thread reads key messages from its message queue.
Take a look at the following code. I didn't create a message processing loop. 65 represents the virtual key of the character 'A'.
while (true) {
    printf("the character %c, the vkey_state is %x\n",
           MapVirtualKey(65, MAPVK_VK_TO_CHAR), GetKeyState(65) & 0x8000);
    Sleep(150);
}
I pressed "A" on the keyboard, while being at the window console of my program.
sometimes, the vkey_state value is 0x8000 as expected, sometimes not.
What exactly is happening under the hood? I didn't write any message-processing code, so i assume it is created automatically. When I press 'A', a WM_KEYDOWN is sent to my thread message queue. When I release the key 'A', a WM_KEYUP is sent to my thread message queue. other key-related messages might be sent in between. What happens when I call GetKeyState? when exactly it will set the MSB of its return value to '1'? When will it change back to 0? Is it related to the calls to GetMessage?
In Addition - what confused me the most - is when I switched to another program (cmd.exe), and I typed 'A', my program was able to monitor it while being in the background - but cmd.exe thread has another message queue - why does it work? However - it didn't work If I started cmd.exe in elevated mode (high integrity).
this contradicts the information I found here.
If the user has switched to another program, then the GetKeyState function will not see the input that the user typed into that other program, since that input was not sent to your input queue.
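As an aside (this is not an answer from the original thread, just an illustrative C++ sketch): polling both GetKeyState, whose result only changes as the calling thread reads key messages from its queue, and GetAsyncKeyState, which reports the physical key state at the moment of the call, makes the difference the question is asking about visible.

#include <windows.h>
#include <stdio.h>

int main()
{
    while (true) {
        SHORT sync  = GetKeyState(0x41);       // 0x41 == 'A'; updated as this thread reads key messages
        SHORT async = GetAsyncKeyState(0x41);  // physical state at the time of the call
        printf("GetKeyState: %d  GetAsyncKeyState: %d\n",
               (sync & 0x8000) != 0, (async & 0x8000) != 0);
        Sleep(150);
    }
    return 0;
}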

OSX Cocoa input source detect change

Does anyone know how to detect when the user changes the current input source in OSX?
I can call TISCopyCurrentKeyboardInputSource() to find out which input source ID is being used like this:
TISInputSourceRef isource = TISCopyCurrentKeyboardInputSource();
if (isource == NULL)
{
    cerr << "Couldn't get the current input source.\n";
    return -1;
}
CFStringRef id = (CFStringRef)TISGetInputSourceProperty(
    isource,
    kTISPropertyInputSourceID);
CFRelease(isource);
If my input source is "German", then id ends up being "com.apple.keylayout.German", which is mostly what I want. Except:
The result of TISCopyCurrentKeyboardInputSource() doesn't change once my process starts. In particular, I can call TISCopyCurrentKeyboardInputSource() in a loop and switch my input source, but it keeps returning the input source that my process started with.
I'd really like to be notified when the input source changes. Is there any way of doing this? To get a notification or an event of some kind telling me that the input source has been changed?
You can observe the NSTextInputContextKeyboardSelectionDidChangeNotification notification posted by NSTextInputContext to the default Cocoa notification center. Alternatively, you can observe the kTISNotifySelectedKeyboardInputSourceChanged notification delivered via the Core Foundation distributed notification center.
However, any such change starts in a system process external to your app. The system then notifies the frameworks in each app process. The frameworks can only receive such notifications when they are allowed to run their event loop. Likewise, if you're observing the distributed notification yourself, that can only happen when the event loop (or at least the main thread's run loop) is allowed to run.
So, that explains why running a loop which repeatedly checks the result of TISCopyCurrentKeyboardInputSource() doesn't work. You're not allowing the frameworks to monitor the channel over which they would be informed of the change. If, rather than a loop, you were to use a repeating timer with a low enough frequency that other stuff has a chance to run, and you returned control to the app's event loop, you would see the result of TISCopyCurrentKeyboardInputSource() changing.
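A minimal C++ sketch of the distributed-notification route described above, assuming a command-line tool linked against the Carbon framework (the callback name InputSourceChanged is my own):

#include <Carbon/Carbon.h>
#include <iostream>

// Called on the run loop whenever the selected keyboard input source changes.
static void InputSourceChanged(CFNotificationCenterRef, void*, CFStringRef,
                               const void*, CFDictionaryRef)
{
    TISInputSourceRef source = TISCopyCurrentKeyboardInputSource();
    CFStringRef sourceID = (CFStringRef)TISGetInputSourceProperty(
        source, kTISPropertyInputSourceID);
    char buffer[256];
    if (CFStringGetCString(sourceID, buffer, sizeof(buffer), kCFStringEncodingUTF8))
        std::cout << "Input source changed to " << buffer << std::endl;
    CFRelease(source);
}

int main()
{
    CFNotificationCenterAddObserver(
        CFNotificationCenterGetDistributedCenter(),
        NULL,                                          // observer token
        InputSourceChanged,
        kTISNotifySelectedKeyboardInputSourceChanged,  // notification named above
        NULL,                                          // any object
        CFNotificationSuspensionBehaviorDeliverImmediately);

    CFRunLoopRun();  // the run loop must be running for the notification to arrive
    return 0;
}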

Problem with Boost Asio asynchronous connection using C++ in Windows

Using MS Visual Studio 2008 C++ for 32-bit Windows (XP), I am trying to construct a POP3 client managed from a modeless dialog box.
The first step is to create a persistent object, say pop3, with all the Boost.Asio machinery for asynchronous connections, in the WM_INITDIALOG handler of the dialog box procedure. Something like:
case WM_INITDIALOG:
    return (iniPop3Dlg(hDlg, lParam));
Here we assume that iniPop3Dlg() creates the pop3 heap object, say pointed to by pop3p. It then connects to the remote server, and a session is initiated with the client's ID and password (the USER and PASS commands). At this point the server is in the TRANSACTION state.
Then, in response to user input, the dialog box procedure calls the appropriate function. Say:
case IDS_TOTAL:  // get how many emails are on the server
    total(pop3p);
    return FALSE;
case IDS_DETAIL: // get the date, sender and subject of each email on the server
    detail(pop3p);
    return FALSE;
Note that total() uses the POP3 STAT command to get how many emails are on the server, while detail() uses two commands consecutively: first STAT to get the total, and then a loop with the GET command to retrieve the content of each message.
As an aside: detail() and total() share the same subroutines (the STAT handling routine), and when finished, both leave the session as-is; that is, without closing the connection. The socket remains open and the server stays in the TRANSACTION state.
The first time either option is selected, things run as expected and the desired results are obtained. But on the second attempt, the connection hangs.
A closer inspection shows that the first time the statement
socket_.get_io_service().run();
is used, it never returns.
Note that all asynchronous write and read routines use the same io_service, and each routine calls socket_.get_io_service().reset() prior to any run().
Note also that all read/write operations use the same timer, which is reset to a zero wait after each operation completes:
dTimer_.expires_from_now (boost::posix_time::seconds(0));
I suspect that the problem lies in the io_service or in the timer, and in the fact that subsequent executions occur in a different invocation of the routine.
As a first approach to my problem, I hope someone can shed some light on it, before I give a more detailed exposition of the (very few and simple) routines involved.
Have you looked at the asio examples and studied them? There are several asynchronous examples that should help you understand the basic control flow. Pay particular attention to the main event loop started by invoking io_service::run; it's important to understand that control is not expected to return to the caller until the io_service has no more work to do.
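A minimal sketch of one common way to structure this, assuming the same pre-Boost-1.66 io_service API as the question: keep a single io_service alive with an io_service::work object and run it on one dedicated thread, instead of calling reset()/run() around every operation. The names run_service, keep_alive and pump are mine, purely for illustration.

#include <boost/asio.hpp>
#include <boost/thread.hpp>

// Pump completion handlers until io_service::stop() is called.
void run_service(boost::asio::io_service* io)
{
    io->run();
}

int main()
{
    boost::asio::io_service io;

    // The work object keeps run() from returning while no handlers are pending,
    // so the same background thread can serve every request in the POP3 session.
    boost::asio::io_service::work keep_alive(io);
    boost::thread pump(run_service, &io);

    // ... create the socket and timer bound to 'io' here, and start
    //     async_connect / async_write / async_read; their completion
    //     handlers run on the 'pump' thread ...

    io.stop();   // e.g. when the dialog closes
    pump.join();
    return 0;
}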

How to detect inactive user

How do you detect an inactive (idle) user in a Windows application? I'd like to shut down the application when there hasn't been any input (keyboard, mouse) from the user for a certain period of time.
To track a user's idle time you could hook keyboard and mouse activity. Note, however, that installing a system-wide message hook is a very invasive thing to do and should be avoided if possible, since it will require your hook DLL to be loaded into all processes.
Another solution is to use the GetLastInputInfo API function (if your application is running on Win2000 (and up) machines).
GetLastInputInfo retrieves the time (in milliseconds) of the last input event (when the last detected user activity has been received, be it from keyboard or mouse).
Here's a simple example. The SecondsIdle function returns the number of seconds with no user activity; it is called from the OnTimer event of a TTimer component.
function SecondsIdle: DWord;
var
  liInfo: TLastInputInfo;
begin
  liInfo.cbSize := SizeOf(TLastInputInfo);
  GetLastInputInfo(liInfo);
  Result := (GetTickCount - liInfo.dwTime) div 1000;
end;

procedure TForm1.Timer1Timer(Sender: TObject);
begin
  Caption := Format('System IDLE last %d seconds', [SecondsIdle]);
end;
http://delphi.about.com/od/adptips2004/a/bltip1104_4.htm
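For reference, a minimal C++ sketch of the same GetLastInputInfo idea (the function name SecondsIdle simply mirrors the Delphi example above):

#include <windows.h>

DWORD SecondsIdle()
{
    LASTINPUTINFO lii = { sizeof(LASTINPUTINFO) };
    GetLastInputInfo(&lii);                       // tick count of the last input event
    return (GetTickCount() - lii.dwTime) / 1000;  // seconds elapsed with no input
}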
You might want to see the answer to this question: How to tell when Windows is inactive [1]. It is basically the same question, and the suggested solution is to use the GetLastInputInfo [2] API call.
This post explains some aspects as well: (The Code Project) How to check for user inactivity with and without platform invokes in C# [3]
[1] How to tell when Windows is inactive
[2] http://msdn.microsoft.com/en-us/library/ms646302%28VS.85%29.aspx
[3] http://www.codeproject.com/KB/cs/uim.aspx
Your application will get a WM_SYSCOMMAND message with SC_SCREENSAVE as the command id when the screen saver is about to kick in. Would that do? There's also the SC_MONITORPOWER command id (also delivered via WM_SYSCOMMAND) when the monitor is about to blank.
Edit: looking at the comments, it appears that you don't care about whether the user is inactive, but rather whether your application is inactive.
This is easy. If your app is minimized, then the user isn't interacting with it. If your app is not the foreground application, that's a good indicator as well.
You could also pay attention to the messages in your pump to notice whether there have been any user-input messages to your app. In C++, adding code to the pump is trivial; in Delphi you can use a WH_GETMESSAGE hook to monitor the pump, or hook into the message loop that TApplication implements. Or use GetLastInputInfo.
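A hypothetical C++ window procedure fragment illustrating the WM_SYSCOMMAND suggestion above (the function name IdleAwareWndProc is mine):

#include <windows.h>

LRESULT CALLBACK IdleAwareWndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_SYSCOMMAND)
    {
        switch (wParam & 0xFFF0)      // the low four bits are reserved by the system
        {
        case SC_SCREENSAVE:           // the screen saver is about to start
        case SC_MONITORPOWER:         // the display is about to blank / power down
            // Treat either event as "the user has gone idle".
            break;
        }
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}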
This SecondsIdle doesn't work at all.
The way to do it is to use a TTimer combined with a seconds counter that is reset every time the user provides mouse or keyboard input.

Modal operation using IMessageFilter and DoEvents

This is a Windows Forms application. I have a function which captures some mouse events modally until a condition is met. For example, I would like to wait for the user to select a point in the window's client area (or optionally cancel the operation using the Escape key) before the function returns. I am using the following structure:
Application::AddMessageFilter(someFilter);
while (someFilter->HasUserSelectedAPoint_Or_HitEscapeKey()) {
    Application::DoEvents();
}
Application::RemoveMessageFilter(someFilter);
This works quite nicely except for taking up nearly 100% CPU usage when control enters the while loop. I am looking for an alternative similar to what is shown below:
Application::AddMessageFilter(someFilter);
while (someFilter->HasUserSelectedAPoint_Or_HitEscapeKey()) {
    // Assuming that ManagedGetMessage() below is a blocking
    // call which yields control to the OS
    if (ManagedGetMessage())
        Application::DoEvents();
}
Application::RemoveMessageFilter(someFilter);
What is the right way to use IMessageFilter and DoEvents? How do I surrender control to the OS until a message is received? Is there any GetMessage equivalent in the managed world?
You could sleep the thread for 500ms or so between DoEvents() calls. Experiment with different values to see what feels right.
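A sketch of that suggestion applied to the loop from the question, keeping the question's someFilter object and loop condition as-is (the 500 ms interval is the answer's starting point; experiment with the value):

Application::AddMessageFilter(someFilter);
while (someFilter->HasUserSelectedAPoint_Or_HitEscapeKey()) {
    Application::DoEvents();                  // pump any pending messages
    System::Threading::Thread::Sleep(500);    // yield the CPU between passes
}
Application::RemoveMessageFilter(someFilter);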

Resources