How to assume/steal another process's windows as my own? - windows-7

I'd like to show another app's windows under my app's taskbar button. Mine would be a background app that reports another process's windows as its own. Is there any universal way to do this, so that each "new" window, alert glow, progress meter, and the other taskbar features show under my own app's button?
For example, Winfox runs under its own process and steals Firefox's windows. It also adds features, but that's irrelevant -- I just want to surface another app's existing taskbar features under my own app's button: multiple windows, progress meter, alert flashing, error flashing, mini-icons, etc. Is there a near-universal way to steal an app, or is it largely app-specific? Thanks!

You should be able to use SetParent() to take ownership of a window, but I'm not sure how much this will help you in your attempt to add taskbar features to the legacy app.
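If SetParent() turns out to be the right tool, a minimal sketch might look like the following; the title-based lookup and the function name are just illustrative, and error handling is elided:

```cpp
// Minimal sketch: re-parent a window found by title. The title string
// passed in is a placeholder for however you locate the target.
#include <windows.h>

bool AdoptWindow(HWND myHwnd, const wchar_t* targetTitle) {
    HWND target = FindWindowW(nullptr, targetTitle);  // find by title only
    if (!target)
        return false;
    // SetParent() makes the target a child of our window. Note that a
    // child window loses its own taskbar button, so this alone does not
    // re-group its taskbar features under ours.
    return SetParent(target, myHwnd) != nullptr;
}
```

Note that SetParent() strictly makes the target a child window rather than an owned window; an alternative worth exploring is changing the owner instead, via SetWindowLongPtr() with GWLP_HWNDPARENT.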

Related

How to show the notify icon in the taskbar in Windows 7?

I implemented a notify icon for my application by calling Shell_NotifyIcon.
By default, on Windows 7 the icon is displayed in the hidden-icons overflow area rather than directly in the taskbar's notification area.
If users want the icon shown on the taskbar, they need to open the Notification Area Icons control panel item, find the application, and set it to "Show icon and notifications".
I think that will be difficult for users with little Windows knowledge. I want to show the notify icon on the taskbar directly from my VC++ code or from the installer. Is it possible? If yes, what should I do?
I'd appreciate any help.
No, this is not possible.
Windows 7 introduces a feature where notification icons can be hidden. It is an attempt to reduce the noise created by decades of developers dumping notification icons in the taskbar for no good reason.
In order for that feature to work effectively, there can't be a loophole for applications to get around it, because everyone thinks their application is the most important and the most deserving of prime real estate. Eventually, nothing is sacred anymore.
Raymond Chen has blogged about this very request, and provides some additional background info.
You just create the notification icon and provide the user with instructions in the documentation on how to show it permanently, if they so desire.
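For reference, a minimal sketch of creating the icon in the first place with Shell_NotifyIcon; the callback message constant is an arbitrary app-chosen value:

```cpp
#include <windows.h>
#include <shellapi.h>

// Hypothetical callback message; any value in the WM_APP range works.
const UINT WMAPP_NOTIFYICON = WM_APP + 1;

void AddNotifyIcon(HWND hwnd, HICON icon) {
    NOTIFYICONDATAW nid = {};
    nid.cbSize = sizeof(nid);
    nid.hWnd = hwnd;                       // window that receives callbacks
    nid.uID = 1;                           // app-chosen icon id
    nid.uFlags = NIF_ICON | NIF_MESSAGE | NIF_TIP;
    nid.uCallbackMessage = WMAPP_NOTIFYICON;
    nid.hIcon = icon;
    wcscpy_s(nid.szTip, ARRAYSIZE(nid.szTip), L"My App");  // tooltip text
    // Whether the icon is promoted to the visible area is, as described
    // above, entirely the user's choice on Windows 7.
    Shell_NotifyIconW(NIM_ADD, &nid);
}
```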
Why not just pin your application to the taskbar? The user can then click it to launch the application.

App modal File dialogs on windows

How do I make the common file dialogs app-modal using the Common File Dialog API? The dialogs come up modal only with respect to the owner window, but I want to block all of the process's windows while a file dialog is up. In my current code, I disable all windows belonging to the app except the dialog's parent, and when the dialog is closed I enable them again. There should be a better/easier way of achieving application-wide modality with the common file dialogs. Please let me know if there is a standard solution for this.
Manually disabling and re-enabling is the only way I know of in Windows.
The traditional model for Windows applications is to have a single top-level window per instance. (Remember MDI apps?) Of course, there are exceptions, and many apps have always had floating tool palette windows. Nevertheless, the disable-the-parent model works for the lion's share of applications, and it's possible for many-window apps to do what you're doing with manually disabling the extra windows. Thus there isn't much demand for a more general solution.
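A sketch of that manual disable/re-enable approach, assuming a single UI thread (a multi-threaded UI would need EnumWindows plus a process-id filter instead):

```cpp
#include <windows.h>
#include <utility>
#include <vector>

// Disable all of this thread's enabled top-level windows except `keep`,
// remembering which ones we disabled so we restore exactly those.
static BOOL CALLBACK DisableProc(HWND hwnd, LPARAM param) {
    auto* state = reinterpret_cast<std::pair<HWND, std::vector<HWND>>*>(param);
    if (hwnd != state->first && IsWindowEnabled(hwnd)) {
        EnableWindow(hwnd, FALSE);
        state->second.push_back(hwnd);
    }
    return TRUE;  // continue enumeration
}

std::vector<HWND> DisableOtherWindows(HWND keep) {
    std::pair<HWND, std::vector<HWND>> state{keep, {}};
    EnumThreadWindows(GetCurrentThreadId(), DisableProc,
                      reinterpret_cast<LPARAM>(&state));
    return state.second;  // disable before showing the dialog...
}

void ReenableWindows(const std::vector<HWND>& windows) {
    for (HWND hwnd : windows)
        EnableWindow(hwnd, TRUE);  // ...and restore after it closes
}
```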
If you wanted to re-architect things, you could have a master window that owns all the other top-level windows, and make the modal window use the master as a parent, but then you'd have to solve other problems related to the task bar, z-order, and positioning of the modal window.

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
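For the record, driving a button through the C-level Accessibility API looks roughly like the sketch below; the assumption that the first child of the first window happens to be a button is purely illustrative:

```cpp
// Compile with: -framework ApplicationServices
// Purely illustrative: assumes the target pid is known and that the
// first child of the first window is a pressable element.
#include <ApplicationServices/ApplicationServices.h>

void PressFirstChildOfFirstWindow(pid_t pid) {
    AXUIElementRef app = AXUIElementCreateApplication(pid);

    CFTypeRef windowsValue = nullptr;
    if (AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                      &windowsValue) == kAXErrorSuccess) {
        CFArrayRef windows = (CFArrayRef)windowsValue;
        if (CFArrayGetCount(windows) > 0) {
            AXUIElementRef window =
                (AXUIElementRef)CFArrayGetValueAtIndex(windows, 0);
            CFTypeRef childrenValue = nullptr;
            if (AXUIElementCopyAttributeValue(window, kAXChildrenAttribute,
                                              &childrenValue) == kAXErrorSuccess) {
                CFArrayRef children = (CFArrayRef)childrenValue;
                if (CFArrayGetCount(children) > 0) {
                    AXUIElementRef child =
                        (AXUIElementRef)CFArrayGetValueAtIndex(children, 0);
                    // "Do your thing" -- for a button this is a press. As
                    // noted above, this may still activate the application.
                    AXUIElementPerformAction(child, kAXPressAction);
                }
                CFRelease(children);
            }
        }
        CFRelease(windows);
    }
    CFRelease(app);
}
```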
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.

Mac style menus on Windows, system wide

I'm a Mac user and a Windows user (and once upon a time I used to be an Amiga user). I much prefer the menu-bar-at-the-top-of-the-screen approach that Mac (and Amiga) take (/took), and I'd like to write something for Windows that can provide this functionality (and work with existing applications).
I know this is a little ambitious, especially as it's just an itch-to-scratch type of project and, thanks to a growing family, I have virtually zero free time. I looked into this a few years ago and concluded that it was very difficult, but that was before StackOverflow ;)
I presume that I would need to do something like this to achieve the desired outcome:
Create application that will be the custom menu bar that sits on top of all other windows. The custom menus would have to provide all functionality to replace the standard Win32 in-window menus. That's OK, it's just an application that behaves like a menu bar.
It would continuously enumerate windows to find windows that are being created/destroyed. It would enumerate the child windows collection to find the menu bar.
It would build a menu that represents the menu options in the window (see the sketch after this list).
It would hide the menu bar in the window and move all direct child windows up by a corresponding pixel amount. It would shorten the window height too.
It would capture all messages that an application sends to its menu, to adjust the custom menu accordingly.
It would constantly poll for the currently active window, so it can switch menus when necessary.
When a menu hit occurs, it would post a message to the window using the hwnd of the real menu child control.
That's it! Easy, eh? No, probably not.
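For what it's worth, the menu-walking and menu-hit steps might start out looking like the sketch below; the helper names are invented, and it ignores nested submenus and menus that change dynamically:

```cpp
// Sketch of the enumeration step: walk a foreign window's menu bar with
// the Win32 menu APIs.
#include <windows.h>
#include <string>
#include <vector>

std::vector<std::wstring> TopLevelMenuItems(HWND hwnd) {
    std::vector<std::wstring> items;
    HMENU menu = GetMenu(hwnd);          // nullptr if the window has no menu
    if (!menu)
        return items;
    int count = GetMenuItemCount(menu);
    for (int i = 0; i < count; ++i) {
        wchar_t text[256] = {};
        GetMenuStringW(menu, i, text, 256, MF_BYPOSITION);
        items.emplace_back(text);
    }
    return items;
}

// Sketch of the menu-hit step. One wrinkle: Win32 menus are not child
// controls, so there is no menu hwnd to post to; a menu selection is
// delivered to the owning window as WM_COMMAND with the item id.
void SendMenuCommand(HWND hwnd, UINT itemId) {
    PostMessageW(hwnd, WM_COMMAND, MAKEWPARAM(itemId, 0), 0);
}
```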
I would really appreciate any advice from Win32 gurus about where to start, ideas, pitfalls, thoughts on if it's even possible. I'm not a Win32 C++ programmer by day, but I've done a bit in my time and I don't mind digging my way through the MSDN platform SDK docs...
(I also have another idea, to create a taskbar for each screen in a multi-monitor setup and show the active windows for the desktop -- but I think I can do that in managed code and save myself a lot of work).
The real difference between the Mac menu across the top and the Windows approach is not just the menu itself: it's what that approach would mean for cracking open MDI apps.
In windows, MDI applications - like dev studio and office - have all their document windows hosted inside an application frame window. On the Mac, there are no per-application frame windows, all document windows share the desktop with all other document windows from other applications.
Lacking the ability to do a deep rework of traditional MDI apps to get their document windows out and onto the desktop, an attempt, however noble, to get a desktop menu, seems doomed to be a novelty with no real use or utility.
I am, all things considered, rather depressed by the current state of window managers on both Mac and Windows (and Linux): things like tabbed pages in browsers are really acts of desperation by application developers who have not been given such things as part of the standard window manager -- which is where I believe tabs really belong. Why should Notepad++ have its own set of tabs, and Chrome, and Firefox, and Internet Explorer (yes, I have been known to run all 4), along with Dev Studio's docking views and various paint programs?
It's just a mess of different interpretations of what a modern multi-document interface should look like.
The menu bar on a typical window is part of the non-client area of the window. It's drawn when the WndProc gets a WM_NCPAINT message and passes it on to DefWindowProc, which is part of User32.dll - the core window manager code.
Other things that are drawn in the same message? The caption, the window borders, the min/max/close boxes. These are all drawn while processing a single message. So in order to hide the menu for an application, you will have to take over handling of this message, which means changing the behavior of user32.dll. Hiding the menu is going to mean that you become responsible for drawing all of the non-client area.
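To make that concrete, here is a sketch of taking over WM_NCPAINT with SetWindowSubclass for a window in your own process; doing this to another process's windows would additionally require DLL injection, since SetWindowSubclass cannot cross process boundaries:

```cpp
// Link against comctl32. Works only on windows in our own process.
#include <windows.h>
#include <commctrl.h>
#pragma comment(lib, "comctl32.lib")

LRESULT CALLBACK NcPaintProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp,
                             UINT_PTR /*subclassId*/, DWORD_PTR /*refData*/) {
    if (msg == WM_NCPAINT) {
        // Swallowing the message suppresses DefWindowProc's non-client
        // painting: caption, borders, and menu bar all stop being drawn,
        // which is exactly why you become responsible for all of it.
        return 0;
    }
    return DefSubclassProc(hwnd, msg, wp, lp);
}

void TakeOverNonClientPainting(HWND hwnd) {
    SetWindowSubclass(hwnd, NcPaintProc, 1 /*subclass id*/, 0);
}
```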
And the appearance of all of these elements - The caption, the borders, etc. changes with every major version of Windows. So you have to chase that as well.
That's just one of about a dozen insurmountable problems with this idea. Even Microsoft probably couldn't pull this off and they have access to the source code of user32.dll!
It would be a far less difficult job to echo the menu for each application at the top of the screen, and even that is a nearly impossible job. When a menu pops up there is lots of interaction with the application, during which the menu can be (and often is) changed. It is very common for applications to change the state of menu items just before they are drawn. So you would have to replicate not only the appearance of the menus, but their entire message-flow interaction with the application.
What you are trying to do is about a dozen impossible jobs all at once. If you try it, you will probably learn a lot, but you will never get it to work.

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that there are only keyboard events when my application is the active application while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate. Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
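A condensed sketch of the CGEventTapCreate option, with the caveat above in effect (the tap simply fails to create without the required privileges); this version listens session-wide rather than to a single process:

```cpp
// Compile with: -framework ApplicationServices
// Listen-only global key-down tap; requires "Enable access for
// assistive devices" (or root), per the note above.
#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

static CGEventRef KeyDownCallback(CGEventTapProxy /*proxy*/, CGEventType type,
                                  CGEventRef event, void* /*refcon*/) {
    if (type == kCGEventKeyDown) {
        int64_t keycode =
            CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        std::printf("key down: %lld\n", (long long)keycode);
    }
    return event;  // listen-only: pass the event through unchanged
}

int main() {
    CFMachPortRef tap = CGEventTapCreate(
        kCGSessionEventTap, kCGHeadInsertEventTap, kCGEventTapOptionListenOnly,
        CGEventMaskBit(kCGEventKeyDown), KeyDownCallback, nullptr);
    if (!tap)
        return 1;  // creation fails without the required privileges

    CFRunLoopSourceRef source =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CFRunLoopRun();
}
```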
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (e.g. quit, bring up prefs) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
Update:
Apple actually seems to have changed everything again starting with 10.5 BTW (I recently upgraded and my sample code did not work as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of which level you plan to capture at, and regardless of whether you created the tap to actively filter (which allows you to modify and even discard events) or to listen only. You can still get notified when modifier flags change (and can even alter those events) and for other event types, but key-down events under no other circumstances.
However, using a Carbon event handler and the function RegisterEventHotKey() allows you to register a hotkey, and you'll get notified when it is pressed; you neither need to be root for that, nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
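A minimal sketch of that hotkey route, using the Carbon C API (callable from a Cocoa app); the choice of key, modifiers, and hotkey signature is arbitrary:

```cpp
// Compile with: -framework Carbon
// Register a global hotkey (Cmd-Option-S here, purely as an example)
// and get a callback when it is pressed. Neither root privileges nor
// assistive devices are needed for this.
#include <Carbon/Carbon.h>
#include <cstdio>

static OSStatus HotKeyHandler(EventHandlerCallRef /*next*/, EventRef /*event*/,
                              void* /*userData*/) {
    std::printf("hotkey pressed\n");
    return noErr;
}

int main() {
    EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(NewEventHandlerUPP(HotKeyHandler),
                                   1, &spec, nullptr, nullptr);

    EventHotKeyID hotKeyId = { 'htk1', 1 };  // app-chosen signature and id
    EventHotKeyRef hotKeyRef = nullptr;
    RegisterEventHotKey(kVK_ANSI_S, cmdKey | optionKey, hotKeyId,
                        GetApplicationEventTarget(), 0, &hotKeyRef);

    RunApplicationEventLoop();  // a Cocoa app would just run its normal loop
}
```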
