Responding to Exposé in macOS

Is it possible for an app to respond to Exposé on macOS such that it can choose what gets displayed?
I often use Exposé to switch windows. The problem is that if the windows look similar, it's harder to know at a glance which one I want.
Trying to switch to one of the four Visual Studio Code windows or one of the four terminal windows is not as easy as it could be if each window showed more info, for example the document name.
I can hover the mouse over each window, but that's not "at a glance", so what I'd like to do is, if and only if Exposé appears, choose what Exposé shows for my app.
One solution I thought of is to change the display when my app's window does not have focus. Unfortunately that's not a solution: I often have windows side by side so I can see both at once, and changing the display of the unfocused window would make side by side unusable.
Does macOS provide an API for responding to Exposé? Note: I'm asking a programming question. If I wanted to add this feature to Visual Studio Code, iTerm2, or my own app, what is the macOS-level event I would trigger on? I've never seen an app change its representation in Exposé, so I'm guessing there is no way to do it.

Related

Create a program with an Xcode-like interface

On the Mac it is usual that there is a "hidden" main window.
The usual example is TextEdit. When you open a file, you don't see a "main frame"; instead every single file is opened in its own TextEdit window. This is the OS X way of emulating the so-called MDI interface.
However, there is an exception. If you open Xcode and open a project there, you can click on a file and it will open inside the main Xcode window. And if you double-click the file, it will open in its own independent editor window, keeping the main Xcode window visible.
My question would be: do I need to do anything special in order to make my program behave like Xcode? Should I use a different class for the main frame, or maybe react differently to the open-document event?
Any hints/pointers where to look or even to the official Apple documentation would be helpful.
The TextEdit behavior you're describing is much more like “SDI” than “MDI”, and the terms “SDI” and “MDI” weren't even needed until Microsoft invented MDI long after Xerox invented the SDI-type interface of which macOS is a derivative.
Anyway, I think you are misunderstanding Xcode's behavior. You seem to think "its own independent editor window" is a different kind of window than "the main Xcode window". But in fact the new window is of the same kind as the old window, with some optional parts hidden. You can show those hidden parts and make the new window look exactly like the old window.
The ability to open multiple windows showing the same document (or, in Xcode's case, project) is a matter of software architecture. If you carefully design your app so that multiple windows can share a single model object graph, and can be notified and redraw themselves when the object graph changes, then you have an app that supports multiple windows showing the same document. If you want multiple kinds of windows showing the same document, nothing about Cocoa stands in your way. As a matter of fact, Xcode does have at least one other kind of window in which it shows some properties of a project: the project settings sheet.
That project settings sheet is really another window; macOS keeps it attached to the main window, but it is in fact an instance of NSWindow (or a subclass of NSWindow), no doubt with its own custom window controller that references the same project objects as the main window.
If you use the Cocoa NSDocument architecture, then a small amount of multi-window support is built-in: an NSDocument knows about its associated windows (via their window controllers). If you want to use the NSDocument architecture, you should read Document-Based App Programming Guide for Mac.
It is unclear what you are after. The traditional Mac UI has been one window per document - i.e. SDI, with a single instance of the app running multiple windows - but there has always been the ability for any app to organise the content of that window as it sees fit, including showing multiple "documents" within one window - i.e. an MDI-type UI.
Apps approach such "MDI" in different ways, e.g. some use panes (views) and others tabs. From macOS Sierra the standard NSWindow supports tabs; this is (semi-)automatic for standard document apps. Read Apple's "NSWindow Automatic Window Tabbing" section in the Sierra release notes for more details.
If you wish to use multiple panes - e.g. like Xcode - you just use views (NSView) and arrange them how you wish.
HTH

Observe any window when it moves on OS X

I want to observe any window on OS X to see if it is moved. I don't own the windows, so I can't get to them directly; I think I have to use the Accessibility APIs. I found a solution for the currently active application here: How can my app detect a change to another app's window? but I can't figure out how to modify it so that it works for any window that is open. I hope somebody can give me a hint about which direction to look in.
As I mentioned in the comments, people usually only want to detect window-move events on focused windows. (As unfocused windows seldom move.) If you want to detect application switches, you can poke into this sample project by Apple that shows how to update iChat status with the frontmost application’s name. And as you said, there’s already a solution for an active window.
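One way to extend that approach to every open window, sketched below in C (the Accessibility API is a plain C API): it requires the Accessibility permission, collects the owning PID of each on-screen window with CGWindowListCopyWindowInfo, and registers an AXObserver for kAXWindowMovedNotification on each application element. The bookkeeping here is illustrative only; a real tool would also have to notice applications launching and quitting.

    /* Build on macOS: clang watch_moves.c -framework ApplicationServices */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Called whenever a watched application reports that one of its windows moved. */
    static void windowMoved(AXObserverRef observer, AXUIElementRef element,
                            CFStringRef notification, void *refcon) {
        printf("a window of pid %d moved\n", (int)(intptr_t)refcon);
    }

    /* Register for window-move notifications on one application; registering on
       the application element covers all of that application's windows. */
    static void watchApplication(pid_t pid) {
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, windowMoved, &observer) != kAXErrorSuccess) return;
        AXUIElementRef app = AXUIElementCreateApplication(pid);
        AXObserverAddNotification(observer, app, kAXWindowMovedNotification,
                                  (void *)(intptr_t)pid);
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer), kCFRunLoopDefaultMode);
    }

    int main(void) {
        if (!AXIsProcessTrusted()) {
            fprintf(stderr, "grant this binary Accessibility access first\n");
            return 1;
        }
        /* Collect the owning PID of every on-screen window and watch each app once. */
        CFArrayRef windows = CGWindowListCopyWindowInfo(
            kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements,
            kCGNullWindowID);
        CFMutableSetRef seen = CFSetCreateMutable(NULL, 0, &kCFTypeSetCallBacks);
        for (CFIndex i = 0; i < CFArrayGetCount(windows); i++) {
            CFDictionaryRef info = CFArrayGetValueAtIndex(windows, i);
            CFNumberRef pidNum = CFDictionaryGetValue(info, kCGWindowOwnerPID);
            if (!pidNum || CFSetContainsValue(seen, pidNum)) continue;
            CFSetAddValue(seen, pidNum);
            int pid = 0;
            CFNumberGetValue(pidNum, kCFNumberIntType, &pid);
            watchApplication((pid_t)pid);
        }
        CFRelease(windows);
        CFRunLoopRun();   /* notifications arrive on this run loop */
        return 0;
    }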

How come some controls don't have a window handle?

I want to get the window handles of some controls to do some stuff that requires a handle. The controls are in a different application.
Strangely enough, I found out that many controls don't have a window handle, like the buttons in the toolbar (?) in Windows Explorer. Just try to get a handle to the Folder/Search/(etc.) buttons; it just gives me 0.
So... first question: how come some controls have no window handle? Aren't all controls windows at heart? (I'm just talking about standard controls, like the ones I would expect in Windows Explorer, nothing custom-drawn on a pane or the like.)
Which brings me to my second question: how do you work with them (for example using EnableWindow) if you cannot get their handle?
Many thanks for any inputs!
EDIT (ADDITIONAL INFORMATION):
Windows Explorer is just an example. I run into this problem frequently, and in a different application (the one I am really interested in, a proprietary one). I have "physical" controls, since I can get an AutomationElement for each of them, but they have no window handle. I am also trying to send a message (SendMessage) to get the button state, to find out whether the button is pushed or not. It is a standard button that seems to expose that state only through that message, at least as far as I have seen; the pushed state can also last a lot longer on that button than you would expect on a standard button. The Windows Explorer buttons show a similar behaviour, acting like button-style checkboxes even though they are (push) buttons. SendMessage requires a window handle.
Does a toolbar in some way change the behaviour of its child elements, taking away their window handles or something similar (using the parent handle plus a control/command ID for identification)? But then how do you use functions that require a window handle on those controls?
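For reference, the SendMessage query described in the edit looks like this when the target really is a standard Win32 button with its own HWND; BM_GETSTATE and BST_PUSHED are the documented button message and state flag, while the helper itself is only a sketch:

    #include <windows.h>

    /* Given the HWND of a standard push button (possibly in another process),
       BM_GETSTATE returns its current state flags; the message passes no
       pointers, so it is safe to send across processes. */
    BOOL IsButtonPushed(HWND hButton)
    {
        LRESULT state = SendMessage(hButton, BM_GETSTATE, 0, 0);
        return (state & BST_PUSHED) != 0;   /* see also BST_CHECKED */
    }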
If they don't have a handle, they're not real controls, they're just drawn to look like controls.
But of course, the toolbar buttons in Windows Explorer do have a window behind them: they are part of a toolbar control. Use the toolbar manipulation functions to interact with them, not EnableWindow.
Or, better yet, use the documented APIs for things like search. Reverse-engineering Windows Explorer has never ended well for anyone, least of all the poor Windows Shell team, saddled with years of backwards-compatibility hacks for certain developers who thought that APIs are for everyone else. Whatever you do manage to get to work is very likely to break on the next version of Windows.
The controls you are talking about are using the ToolbarWindow32 class. If you want to interact with them, you'll need to use the toolbar control APIs/messages. For example, for enabling buttons you'd want to use TB_ENABLEBUTTON.
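A small sketch of that approach, assuming you already have the HWND of the ToolbarWindow32 control (found e.g. with FindWindowEx or a spy tool) and the command identifier of the button inside it. TB_ENABLEBUTTON and TB_GETSTATE pass only integers, so unlike the pointer-based toolbar messages they can be sent directly to a control in another process:

    #include <windows.h>
    #include <commctrl.h>

    /* Enable or disable one button of a toolbar, identified by its command ID. */
    void EnableToolbarButton(HWND hToolbar, int idButton, BOOL enable)
    {
        SendMessage(hToolbar, TB_ENABLEBUTTON, (WPARAM)idButton, MAKELPARAM(enable, 0));
    }

    /* Query a button's state; TBSTATE_PRESSED / TBSTATE_CHECKED cover the
       "pushed" look the question asks about. Returns FALSE for unknown IDs. */
    BOOL IsToolbarButtonPressed(HWND hToolbar, int idButton)
    {
        LRESULT state = SendMessage(hToolbar, TB_GETSTATE, (WPARAM)idButton, 0);
        return (state != -1) && (state & (TBSTATE_PRESSED | TBSTATE_CHECKED)) != 0;
    }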
You can implement controls yourself using GDI, OpenGL or DirectX. Try Window Detective on Mozilla Firefox and you will see that there is only one window: its controls are drawn by the application itself and are not windows known to Windows.

Mac style menus on Windows, system wide

I'm a Mac user and a Windows user (and once upon a time I used to be an Amiga user). I much prefer the menu-bar-at-the-top-of-the-screen approach that Mac (and Amiga) take (/took), and I'd like to write something for Windows that can provide this functionality (and work with existing applications).
I know this is a little ambitious, especially as it's just an itch-to-scratch type of project and, thanks to a growing family, I have virtually zero free time. I looked into this a few years ago and concluded that it was very difficult, but that was before StackOverflow ;)
I presume that I would need to do something like this to achieve the desired outcome:
Create an application that will be the custom menu bar, sitting on top of all other windows. The custom menus would have to provide all the functionality to replace the standard Win32 in-window menus. That's OK, it's just an application that behaves like a menu bar.
It would continuously enumerate windows to find windows that are being created/destroyed, and enumerate each window's child window collection to find the menu bar (see the sketch after these steps).
It would build a menu that represents the menu options in the window.
It would hide the menu bar in the window and move all direct child windows up by a corresponding pixel amount. It would shorten the window height too.
It would capture all messages that an application sends to its menu, to adjust the custom menu accordingly.
It would constantly poll for the currently active window, so it can switch menus when necessary.
When a menu hit occurs, it would post a message to the window using the hwnd of the real menu child control.
That's it! Easy, eh? No, probably not.
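The sketch referred to above: a rough cut (in C) of the enumeration and forwarding parts of that plan, for windows that use a plain Win32 menu bar. Owner-drawn or non-native menus, and the dynamic menu updates discussed in the answers below, are not handled:

    #include <windows.h>
    #include <stdio.h>

    /* Read the top-level menu bar of the current foreground window. A real
       menu-bar replacement would build its own UI from this (walking submenus
       with GetSubMenu/GetMenuItemID) instead of just printing the labels. */
    void DumpForegroundMenu(void)
    {
        HWND hwnd = GetForegroundWindow();
        HMENU hMenu = GetMenu(hwnd);          /* NULL if there is no Win32 menu bar */
        if (hMenu == NULL) return;

        int count = GetMenuItemCount(hMenu);
        for (int i = 0; i < count; i++) {
            char text[256] = "";
            GetMenuStringA(hMenu, i, text, sizeof(text), MF_BYPOSITION);
            printf("%d: %s\n", i, text);      /* a real tool would use the W APIs */
        }
    }

    /* When the user picks an item in the replica, hand the command back to the
       window that owns the real menu, exactly as its own menu would. */
    void ForwardMenuCommand(HWND hwndOwner, UINT commandId)
    {
        PostMessage(hwndOwner, WM_COMMAND, MAKEWPARAM(commandId, 0), 0);
    }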
I would really appreciate any advice from Win32 gurus about where to start, ideas, pitfalls, thoughts on if it's even possible. I'm not a Win32 C++ programmer by day, but I've done a bit in my time and I don't mind digging my way through the MSDN platform SDK docs...
(I also have another idea, to create a taskbar for each screen in a multi-monitor setup and show the active windows for the desktop -- but I think I can do that in managed code and save myself a lot of work).
The real difference between the Mac menu across the top and the Windows approach is not just in the menu: it's how the menu is used to crack open MDI apps.
In Windows, MDI applications - like Dev Studio and Office - have all their document windows hosted inside an application frame window. On the Mac, there are no per-application frame windows; all document windows share the desktop with all other document windows from other applications.
Lacking the ability to do a deep rework of traditional MDI apps to get their document windows out onto the desktop, an attempt, however noble, to provide a desktop menu bar seems doomed to be a novelty with no real use or utility.
I am, all things considered, rather depressed by the current state of window managers on both Mac and Windows (and Linux): things like tabbed pages in browsers are really acts of desperation by application developers who have not been given such things as part of the standard window manager - which is where I believe tabs really belong. Why should Notepad++ have a set of tabs, and Chrome, and Firefox, and Internet Explorer (yes, I have been known to run all 4), along with Dev Studio's docking view and various paint programs?
It's just a mess of different interpretations of what a modern multi-document interface should look like.
The menu bar on a typical window is part of the non-client area of the window. It's drawn when the WndProc gets a WM_NCPAINT message and passes it on to DefWindowProc, which is part of User32.dll - the core window manager code.
Other things that are drawn in the same message? The caption, the window borders, the min/max/close boxes. These are all drawn while processing a single message. So in order to hide the menu for an application, you will have to take over handling of this message, which means changing the behavior of user32.dll. Hiding the menu is going to mean that you become responsible for drawing all of the non-client area.
And the appearance of all of these elements - the caption, the borders, etc. - changes with every major version of Windows. So you have to chase that as well.
That's just one of about a dozen insurmountable problems with this idea. Even Microsoft probably couldn't pull this off and they have access to the source code of user32.dll!
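To make the point concrete, here is roughly where that painting happens in a window procedure you own; for somebody else's process you could not even install this procedure without injecting code:

    #include <windows.h>

    /* The caption, borders, min/max/close box and the menu bar are all drawn
       while this one message is processed - normally by DefWindowProc inside
       user32.dll, not by the application itself. */
    LRESULT CALLBACK FrameWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_NCPAINT) {
            /* Suppressing just the menu bar would mean painting the entire
               non-client frame yourself here (and handling WM_NCCALCSIZE to
               reclaim the menu's space) instead of delegating. */
            return DefWindowProc(hwnd, msg, wParam, lParam);
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }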
It would be a far less difficult job to echo the menu for each application at the top of the screen, and even that is a nearly impossible job. When the menu pops up there is lots of interaction with the application, during which the menu can be (and often is) changed. It is very common for applications to change the state of menu items just before they are drawn. So you would have to replicate not only the appearance of the menus, but their entire message-flow interaction with the application.
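A sketch of the pattern being described, with an illustrative command ID and dirty flag: the item's state only becomes correct inside WM_INITMENUPOPUP, just before the menu appears, which is exactly the interaction a mirrored menu would have to reproduce:

    #include <windows.h>

    #define ID_FILE_SAVE 101              /* illustrative command identifier */
    static BOOL g_documentDirty = FALSE;  /* illustrative application state  */

    /* Typical application window procedure: menu items are enabled or greyed
       only when the popup is about to drop down, so an external snapshot of
       the menu can easily be stale by the time the user sees it. */
    LRESULT CALLBACK AppWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_INITMENUPOPUP) {
            HMENU hPopup = (HMENU)wParam;
            EnableMenuItem(hPopup, ID_FILE_SAVE,
                           MF_BYCOMMAND | (g_documentDirty ? MF_ENABLED : MF_GRAYED));
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }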
What you are trying to do is about a dozen impossible jobs all at once. If you try it, you will probably learn a lot, but you will never get it to work.

Sticky mouse when dragging controls in VS2005

Maybe this is a dumb question, but I have the following behavior in Visual Studio 2005 while designing forms:
1 - Drop a control onto the form (suppose it's a Label, just for discussion)
2 - Drag that label to a specific location (aligning w/other controls, whatever)
3 - Release the mouse button
4 - The control is still stuck to the mouse!!!
To get it un-stuck from the mouse, I have to hit ESC, which restores the Label to its original location.
This is driving me nuts. I literally have to use the arrow keys to move each control into place, pixel-by-pixel. I don't observe this behavior anywhere else in VS2005, nor do I observe it in the OS in general.
I am running on Windows XP inside a Parallels virtual machine, hosted on OS X. I don't think there is a driver problem though, because, as I already said, no other apps demonstrate anything like this.
Please tell me there is some tiny checkbox buried somewhere that will turn off this behavior.
Sounds like you might have ClickLock enabled (or a similar feature). Try this:
Go to Control Panel in Windows
Open the Mouse control panel
Go to the Activities tab
Deselect ClickLock
If that doesn't work, maybe you have a similar feature in OS X?
This problem spread to other applications within my VM, so I reinstalled Parallels tools and it went away.
