Hook Windows Mobile 6.5 phone on/off button

Recently this post inspired me; I want to track my own life, too.
I look at my cellphone's clock every time I go to sleep and after I wake up, so I need a program that hooks my cellphone's on/off button and logs a timestamp when I press it.
I am using WM 6.5 on an HTC TyTN II. If there is existing software that can do this with a few settings and tweaks, that would be nice, but I can also write code myself. Any suggestions?

You can use the device power notifications by P/Invoking RequestPowerNotifications to find out when the power state changes (there's a CodeProject article that covers this as well). Be aware that the power-down notification doesn't wait for subscribers to run code, so in most cases your application's handler doesn't actually run until the device wakes back up (meaning that if you're writing a timestamp, you're going to get the wake time, not the sleep time).
Also be aware that different devices handle power management differently, so YMMV.
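For reference, the native Windows CE pattern looks roughly like this (a sketch with error handling omitted; the same flow applies if you P/Invoke these functions from the .NET Compact Framework):

```cpp
#include <windows.h>
#include <msgqueue.h>   // CreateMsgQueue / ReadMsgQueue
#include <pm.h>         // RequestPowerNotifications, POWER_BROADCAST

void LogPowerTransitions()
{
    // A point-to-point message queue receives the notifications.
    MSGQUEUEOPTIONS opt = { 0 };
    opt.dwSize        = sizeof(opt);
    opt.dwMaxMessages = 16;
    opt.cbMaxMessage  = sizeof(POWER_BROADCAST) + MAX_PATH;
    opt.bReadAccess   = TRUE;
    HANDLE hQueue = CreateMsgQueue(NULL, &opt);
    HANDLE hNotif = RequestPowerNotifications(hQueue, POWER_NOTIFY_ALL);

    BYTE buf[sizeof(POWER_BROADCAST) + MAX_PATH];
    DWORD read, flags;
    while (ReadMsgQueue(hQueue, buf, sizeof(buf), &read, INFINITE, &flags))
    {
        POWER_BROADCAST* pb = (POWER_BROADCAST*)buf;
        if (pb->Message == PBT_TRANSITION)
        {
            SYSTEMTIME st;
            GetLocalTime(&st);
            // Append st to your log file here; note this may run
            // only after the device has woken back up (see above).
        }
    }
    StopPowerNotifications(hNotif);
    CloseMsgQueue(hQueue);
}
```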

This is more tricky than you could possibly imagine!
The 'on/off' button isn't directly available to the OS (i.e. it's not visible in the keypad driver), as it is hooked into a rather complex system of power collapse (basically, the Apps CPU is completely powered down when the phone is 'off' - the modem wakes up according to the network-configured paging cycle).
The wake-sleep interactions of the two CPUs are extremely complex, and prone to race conditions. Even if you managed to hook it (which would require deep kernel-level programming and a number of security hacks), you would probably make your phone very unstable.

Related

A general solution for injecting keys into different applications on Windows?

After receiving some feedback on this question: How to create lParam for WM_CHAR or WM_KEYUP/WM_KEYDOWN?, I've started looking for a broader answer, a general solution. One thing I realized is that using Windows APIs might not work for every app and in every case.
My first step in the follow-up research was to make an Arduino-powered servo to press the keys (yeah, the concept is horrible, I know).
But that prompted yet another idea: a small hardware-augmented numpad keyboard, also operated by an Arduino, which was controlled via another USB port. This was at least somewhat usable - but still not very.
Then I tried a Digispark ATtiny85 microcontroller, which in turn used the DigiKeyboard library. This solution was much better - but the necessity of having a Digispark stuck in your USB port was a bit frustrating.
This made me curious: are there ways to emulate a keyboard or any other HID device using software only? Some brief googling pointed me to kernel drivers and virtual COM ports, but those seem a bit over the top for me to process.
So can that task indeed be achieved by writing a kernel driver? Can it be done in any other manner? In either case, are there any pointers you can give me on the topic?
The SendInput function can be used to generate keyboard and mouse input. This input goes to the foreground window as if it were generated by real hardware (although low-level hooks can tell that it was software-generated). It might not let you generate Ctrl+Alt+Delete or control a UAC prompt, but other than that it should be good enough in most cases. Writing a driver to overcome these limitations is normally not worth it.
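For example, a minimal sketch that sends a single 'A' keystroke (down and up) to whatever window currently has keyboard focus:

```cpp
#include <windows.h>

void SendKeyA()
{
    INPUT in[2] = {};

    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wVk = 'A';                  // virtual-key code

    in[1].type = INPUT_KEYBOARD;
    in[1].ki.wVk = 'A';
    in[1].ki.dwFlags = KEYEVENTF_KEYUP;  // release the key

    SendInput(2, in, sizeof(INPUT));
}
```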
There is no general way to generate input to a specific application/window if it is not the foreground window.
If you want to control a specific application you should use UI Automation.
Faking key up/down/char messages with PostMessage is not uncommon, but it does not always work (the application might be using Raw Input, the input is not synchronized with real hardware, etc.). If you are determined to use this method anyway, make sure you send the messages to the correct window (the HWND with the keyboard focus, not just the top-level window). Use the Spy++ tool to view the messages and make sure they are going to the correct window.
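If you do go down that road, a sketch of what faking a keystroke looks like (hwndFocus is assumed to be the focused window, found e.g. with Spy++):

```cpp
#include <windows.h>

void PostFakeKeyA(HWND hwndFocus)
{
    UINT scan = MapVirtualKey('A', MAPVK_VK_TO_VSC);

    // lParam layout: bits 0-15 repeat count, bits 16-23 scan code,
    // bit 30 previous key state, bit 31 transition state.
    LPARAM down = 1 | ((LPARAM)scan << 16);
    LPARAM up   = down | ((LPARAM)1 << 30) | ((LPARAM)1 << 31);

    PostMessage(hwndFocus, WM_KEYDOWN, 'A', down);
    PostMessage(hwndFocus, WM_KEYUP,   'A', up);
}
```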

Windows Timer Resolution vs Application Priority vs Processor scheduling

Please make the technical difference between these three things on MS Windows systems clear once more. First is the timer resolution, which you can set and get via the undocumented ntdll.dll functions NtSetTimerResolution and NtQueryTimerResolution, or inspect with Sysinternals' clockres.exe tool.
This is the somewhat scandalous trick the Chrome browser used some time ago to perform better across the web (at the moment they have kept the high-resolution trick only for the Flash plugin): https://bugs.chromium.org/p/chromium/issues/detail?id=153139
https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/
In fact, Visual Studio and SQL Server use the trick in some cases as well. I personally feel it makes the whole system perform better and feel crisper, rather than slowing it down as many people out there warn.
What is the difference between the timer resolution and the application, I/O, and memory priorities (realtime/high/above normal/normal/low/background/etc.) you can set via Task Manager, apart from the fact that the timer resolution applies to the whole system rather than a single application?
And what is the difference between those and the processor scheduling option you can adjust from CMD > SystemPropertiesPerformance.exe -> Advanced tab? By default, the client OS versions (XP/Vista/7/8/8.1/10) favor the performance of programs, while the server versions (2k3/2k8/2k12/2k16) favor background services. How does this option interact with the two above?
timeBeginPeriod() is the documented API to do this. It is documented to affect the accuracy of Sleep(). Dave Cutler probably did not enjoy implementing it, but allowing Win 3.1 code to port made it necessary. The multimedia API back then was necessary to keep anemic hardware with small buffers going without stuttering.
Very crude, but there is no other good way to do it in the kernel. The normal state for a processor core is to be stopped on a HLT instruction, consuming (almost) no power; the only way to revive it is with a hardware interrupt. Which is what this does: it cranks up the clock interrupt rate. The clock normally ticks 64 times per second; you can jack it up to 1000 with timeBeginPeriod, or 2000 with the native API.
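A quick sketch of both flavors (the NtSetTimerResolution prototype is the commonly quoted undocumented one, so treat it as an assumption; its resolution argument is in 100-nanosecond units):

```cpp
#include <windows.h>
#include <mmsystem.h>                 // timeBeginPeriod / timeEndPeriod
#pragma comment(lib, "winmm.lib")

// Undocumented ntdll export; resolution in units of 100 ns.
typedef LONG (NTAPI *NtSetTimerResolution_t)(ULONG DesiredResolution,
                                             BOOLEAN SetResolution,
                                             PULONG CurrentResolution);

int main()
{
    // Documented route: raise the clock interrupt rate to ~1000 Hz.
    timeBeginPeriod(1);
    Sleep(1);                         // wakes after ~1 ms instead of ~15.6 ms
    timeEndPeriod(1);                 // always pair begin/end

    // Native route: 5000 * 100 ns = 0.5 ms, the ~2000 Hz mentioned above.
    NtSetTimerResolution_t pNtSetTimerResolution =
        (NtSetTimerResolution_t)GetProcAddress(
            GetModuleHandleW(L"ntdll.dll"), "NtSetTimerResolution");
    ULONG actual = 0;
    if (pNtSetTimerResolution)
        pNtSetTimerResolution(5000, TRUE, &actual);
    return 0;
}
```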
And yes, pretty bad for power consumption. The clock interrupt handler also activates the thread scheduler, a fairly unsubtle chunk of code. That is the reason a Sleep() call can now wake up at (almost) the clock interrupt rate. This was tinkered with in Win8.1, by the way; the only thing I noticed about the changes is that it is not quite as responsive anymore, and a 1 msec rate can cause up to 2 msec delays.
Chrome is indeed notorious for ab/using the heck out of it. I always assumed that it provided a competitive edge for a company that does big business in mobile operating systems and battery-powered devices. The guy that started this web site noticed something was wrong. The more responsible thing to do for a browser is to bump up the rate to 10 msec, necessary to get accurate GIF animation. Multi-media playback does not need it anymore.
This otherwise has no effect at all on scheduling priorities. One detail I did not check is if the thread quantum changes correspondingly (the number of ticks a thread may own a core before being evicted, 3 for a workstation). I suspect it does.

Mouse cursor freezes in Windows LabView

I'm developing an application in LabView on Windows. Starting a week ago, one test machine (a ToughBook, no less) was freezing up completely once every couple days: no mouse cursor, taskbar clock frozen. So yesterday it was retired. But just now, I've seen it on another machine, also a laptop.
This is a pretty uncommon failure mode for PCs. I don't know much about Windows, but I'd expect it to indicate that the software stopped running so completely and suddenly that the kernel was unable to panic.
Is this an accurate assessment? Where do I begin to debug this problem? What controls the cursor in the Windows architecture — is it all kernel mode or is there a window server that might be getting choked by something? Would an unstable third-party hardware driver cause this, rather than a blue screen?
EDIT: I should add that the freezes don't necessarily happen while the code is running.
I'd certainly consider hardware and/or drivers as a possibility - perhaps you could say what hardware is involved?
You could test this by adding a 'debug mode' for each piece of hardware your LabVIEW code talks to, where you would use e.g. a case structure to skip the actual I/O calls and return dummy data to the rest of the application. Make sure it's a similar amount of data to what the real device returns. You'll find this much easier if you've modularised your code into subVIs with clearly defined functions! If disabling I/O calls to a particular piece of hardware stops the freezes, it would suggest the problem might be with that hardware or its driver.
Hard to say what the problem is. Based on the symptoms, I would check for a possible memory leak (see if your LabVIEW app's memory usage is growing over time using the Windows Task Manager).

How can I programmatically stop a notebook battery from charging?

There is some easily available information on finding the status of a battery, or whether it's charging or not. (GetSystemPowerStatus API or System.Windows.Forms.SystemInformation.PowerStatus).
I want to be able to stop a battery from charging based on some criteria, e.g. battery power > 20%.
Is there an API to do this?
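For reference, here's roughly how the status-reading part I mentioned looks (Win32 sketch):

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    SYSTEM_POWER_STATUS sps;
    if (GetSystemPowerStatus(&sps))
    {
        printf("AC line:  %u\n", sps.ACLineStatus);      // 1 = plugged in
        printf("Charge:   %u%%\n", sps.BatteryLifePercent);
        printf("Charging: %s\n",
               (sps.BatteryFlag & 8) ? "yes" : "no");    // bit 3 = charging
    }
    return 0;
}
```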
I think it's impossible, because you would need an API for the battery or battery charger, and such an API could only be provided by the manufacturer of the notebook, battery, or charger, if they support it at all.
I honestly don't know, but I'd have a look at the APM or ACPI APIs.
Other than that, the only option I can think of right now is a USB controlled robotic arm that ejects the battery when you need to stop charging, but that's probably not what you are looking for, and borders on the complicator's glove in terms of level of over-engineering. :)
I would just get a UPS and programmatically tell it to cut all power... most should have an interface for doing this. Otherwise, as someone already said, a computer-controlled power strip would do it ^^
I've actually played with this idea when I was testing/writing about way too many new laptop models a while ago and the battery testing was annoying to set up, monitor and analyze.
I wrote an app that would do exactly everything (setup, listening, measuring, reporting) except unplugging the power and then replugging it and starting the computer again...
One of the options is to get hold of the battery device (Microsoft ACPI-Compliant Control Method Battery).
Listen for power notification events forever. On each notification, check the PowerStatus of the battery.
There are APIs for all of the above purposes in .NET and Win32.
Keep the device disabled as long as the power status is above your threshold. Enable it as soon as it goes below that, or when you are not on AC power (i.e. before removing AC power, your continuously monitoring software should enable the battery device - or you enable it manually).
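A rough sketch of the disable/enable step using the SetupAPI (assumes administrator rights; the monitoring loop around it is omitted):

```cpp
#include <windows.h>
#include <setupapi.h>
#include <devguid.h>
#pragma comment(lib, "setupapi.lib")

// Enables/disables every present device in the Battery class
// ("Microsoft ACPI-Compliant Control Method Battery").
bool SetBatteryDeviceEnabled(bool enable)
{
    HDEVINFO devs = SetupDiGetClassDevs(&GUID_DEVCLASS_BATTERY, NULL, NULL,
                                        DIGCF_PRESENT);
    if (devs == INVALID_HANDLE_VALUE) return false;

    SP_DEVINFO_DATA dev = { sizeof(SP_DEVINFO_DATA) };
    bool ok = true;
    for (DWORD i = 0; SetupDiEnumDeviceInfo(devs, i, &dev); ++i)
    {
        SP_PROPCHANGE_PARAMS pcp = {};
        pcp.ClassInstallHeader.cbSize = sizeof(SP_CLASSINSTALL_HEADER);
        pcp.ClassInstallHeader.InstallFunction = DIF_PROPERTYCHANGE;
        pcp.StateChange = enable ? DICS_ENABLE : DICS_DISABLE;
        pcp.Scope       = DICS_FLAG_GLOBAL;
        pcp.HwProfile   = 0;

        if (!SetupDiSetClassInstallParams(devs, &dev,
                &pcp.ClassInstallHeader, sizeof(pcp)) ||
            !SetupDiCallClassInstaller(DIF_PROPERTYCHANGE, devs, &dev))
            ok = false;
    }
    SetupDiDestroyDeviceInfoList(devs);
    return ok;
}
```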
Hmm... this is a very buggy solution, but it can achieve what you want, although you have to be very careful.
Actually, I use such a charge limiter. The control software is a Python script that monitors the battery level (psutil module) and controls external hardware, i.e. a switch that can be software-controlled. I have Energenie and TP-Link homeplugs plus my own hardware contraption.
As it is for home use, the software is not polished at all, but with minimal effort it can be adapted to any OS or hardware.
Let me know if interested. The software lives here: CCC

Best practices for Alt-Tab support in a DirectX app?

When writing DirectX applications, obviously it's desirable to support the user suspending the application via Alt-Tab in a way that's fast and error-free. What is the best set of practices for ensuring this? Things that need to be addressed include:
The best methods of detecting when your application has been alt-tabbed out of and when it has been returned to.
What DirectX resources are lost when the user alt-tabs, and the best ways to cope with this.
Major things to do and things to avoid in application architecture for purposes of alt-tab support.
Any significant differences between major DirectX versions as they apply to the above.
Interesting tricks and gotchas are also good to hear about.
I will assume you are using C++ for the purposes of my answers, but if you can afford to use C#, XNA (http://creators.xna.com/) is an excellent game platform that handles all of these issues for you.
1]
This article is helpful for handling Windows events in the window procedure to detect when a window loses or gains focus; you could handle this in your main window: http://www.functionx.com/win32/Lesson05.htm. Also, check out the WM_ACTIVATEAPP message here: http://msdn.microsoft.com/en-us/library/ms632614(VS.85).aspx
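A minimal sketch of the WM_ACTIVATEAPP handling (g_appActive is an assumed flag that your render loop checks):

```cpp
#include <windows.h>

static bool g_appActive = true;   // checked by the render loop

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_ACTIVATEAPP:
        // wParam is TRUE when we gain focus, FALSE when Alt-Tabbed away.
        g_appActive = (wParam != FALSE);
        return 0;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}

// In the main loop:
//   if (!g_appActive) { Sleep(50); continue; }   // pause/throttle rendering
```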
2]
The graphics device is lost when the application loses focus from full screen mode. Microsoft offers an article on how to handle this: http://msdn.microsoft.com/en-us/library/bb174717(VS.85).aspx This article also has a lost device tutorial: http://www.codesampler.com/dx9src/dx9src_6.htm
DirectInput can also have a device-lost error state; here is a link about that: http://www.toymaker.info/Games/html/directinput.html
DirectSound can also have a device-lost error state; this article has code that handles it: http://www.eastcoastgames.com/directx/chapter2.html
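For the graphics side, the usual DX9 recovery loop from those articles looks roughly like this (a sketch; ReleaseNonManagedResources, RecreateNonManagedResources, and RenderFrame are placeholder helpers for your own code, and g_d3dpp is the original present parameters):

```cpp
void PumpOneFrame()
{
    HRESULT hr = g_pd3dDevice->TestCooperativeLevel();
    if (hr == D3DERR_DEVICELOST)
    {
        Sleep(50);                        // still lost; cannot reset yet
    }
    else if (hr == D3DERR_DEVICENOTRESET)
    {
        ReleaseNonManagedResources();     // everything in D3DPOOL_DEFAULT
        if (SUCCEEDED(g_pd3dDevice->Reset(&g_d3dpp)))
            RecreateNonManagedResources();
    }
    else if (SUCCEEDED(hr))
    {
        RenderFrame();                    // normal path
    }
}
```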
3]
I would make sure never to disable Alt-Tab. You probably want minimal CPU load while the application is not active, because the user probably Alt-Tabbed because they want to do something else; you could completely pause the application, or reduce the frames rendered per second. If the application is minimized, you of course don't need to render anything either. After thinking about a networked game, my best solution is that you should still reduce the frames rendered per second as well as the amount of network packets handled, possibly even throwing away many of the packets that come in until the game is re-activated.
4]
Honestly I would just stick to DirectX 9.0c (or DirectX 10 if you want to limit your target operating system to Vista and newer) if at all possible :)
Finally, the DirectX sdk has numerous tutorials and samples: http://www.microsoft.com/downloads/details.aspx?FamilyID=24a541d6-0486-4453-8641-1eee9e21b282&displaylang=en
We solved it by not using a fullscreen DirectX device at all - instead we used a full-screen window with the top-most flag to make it hide the task bar. If you Alt-Tab out of that, you can remove the flag and minimize the window. The texture resources are kept alive by the window.
However, this approach doesn't handle the device-lost event happening due to 'lock screen', Ctrl+Alt+Delete, remote desktop connections, user switching, or similar. But those don't need to be handled extremely fast or efficiently (at least that was the case in our application).
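A sketch of the windowed setup described above ("MyWindowClass" is assumed to be registered already, and hInst is your HINSTANCE):

```cpp
// Create a borderless, topmost window covering the whole screen
// instead of a true exclusive-mode fullscreen device.
int cx = GetSystemMetrics(SM_CXSCREEN);
int cy = GetSystemMetrics(SM_CYSCREEN);
HWND hWnd = CreateWindowExW(WS_EX_TOPMOST, L"MyWindowClass", L"Game",
                            WS_POPUP | WS_VISIBLE, 0, 0, cx, cy,
                            NULL, NULL, hInst, NULL);

// On Alt-Tab (e.g. WM_ACTIVATEAPP with wParam == FALSE):
//   SetWindowPos(hWnd, HWND_NOTOPMOST, 0, 0, 0, 0, SWP_NOMOVE | SWP_NOSIZE);
//   ShowWindow(hWnd, SW_MINIMIZE);
```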
All serious D3D apps should be able to handle lost devices as this is something that can happen for a variety of reasons.
In DX10 under Vista there is a new "Timeout Detection and Recovery" feature that, in my experience, makes it common for graphics devices to be reset, which causes a lost device for your app. This seems to be improving as drivers mature, but you need to handle it anyway.
In DX8 and 9 (and 10?) if you create your resources (vertex and index buffers and textures, mainly) using D3DPOOL_MANAGED, they will persist across lost devices and will not need reloading. This is because they are stored in system memory and the DX runtime copies them to video memory automatically. However, there is a performance cost due to the copying, and this is not recommended for rapidly changing vertex data. Of course you would profile first to determine if there is a speed issue :-)
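For example, creating a vertex buffer in the managed pool (a sketch; the Vertex layout, numVerts, and g_pd3dDevice are assumed):

```cpp
struct Vertex { float x, y, z; DWORD color; };
const UINT numVerts = 3;

IDirect3DVertexBuffer9* pVB = NULL;
g_pd3dDevice->CreateVertexBuffer(
    numVerts * sizeof(Vertex),
    0,                                  // usage
    D3DFVF_XYZ | D3DFVF_DIFFUSE,        // must match the Vertex layout
    D3DPOOL_MANAGED,                    // survives device loss, auto-restored
    &pVB, NULL);
```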
