Is there a name for the kind of dashboard HyperCard uses: a dashboard that has icons sitting on top of a rectangular box?
In HyperCard, we called it the Home stack. In other software, it could be called the Home window, main window, navigation window, start center, launcher, etc.
“Windoid” is the name given to a floating palette like that.
HyperCard's windoids include the tools palette and the message box.
These are like mini windows without the normal system controls: they are not focusable, float above ordinary windows (a frontal z-index), and do not respond to the system commands or shortcuts that close regular windows. They can, however, respond to commands from HyperCard (including in HyperTalk scripts) to open, close and move them.
They otherwise behave like their non-HyperCard equivalents, such as tool palettes, color pickers and some ‘desk accessories’ in the classic Mac OS (pre-OS X, of course). These equivalents can be manipulated with AppleScript.
Is there a way to get VS Code to show the navigation menu (i.e. Code | File | Edit | ...) and the project name in full-screen mode on macOS? It's almost impossible to see the name of the project when multiple instances of VS Code are open in full-screen mode.
v1.42 has a new option that may help:
window.nativeFullScreen: Controls if native full-screen should be used on macOS. Disable this option to prevent macOS from creating a new space when going full-screen.
To disable it, add "window.nativeFullScreen": false to your settings.json.
I believe this is not what full-screen mode is made for. If you go full screen, you are supposed to work almost exclusively in that application (only occasionally switching to other apps such as Mail, e.g. via Command+Tab). You can always have the menu bar (and the window title) appear by moving the mouse pointer to the top of the screen, however.
The name of the project is visible in the file explorer.
For example, if the project's root folder is testgit, then testgit is the name shown at the top of the Explorer.
You can always quickly show the file explorer using the keyboard shortcut Shift-Cmd-E.
On the Mac, it is usual that there is a "hidden" main window.
The usual example is TextEdit. When you open a file, you don't see a "main frame". Instead, every single file is opened in its own TextEdit window. This is the macOS way of emulating the so-called MDI interface.
However, there is an exception. If you open a project in Xcode, you can click on a file and it will open inside the main Xcode window. And if you double-click the file, it opens in its own independent editor window, while the main Xcode window stays visible.
My question would be: do I need to do anything special to make my program behave like Xcode? Should I use a different class for the main frame, or perhaps react differently to the document-opening event?
Any hints/pointers where to look or even to the official Apple documentation would be helpful.
The TextEdit behavior you're describing is much more like “SDI” than “MDI”, and the terms “SDI” and “MDI” weren't even needed until Microsoft invented MDI long after Xerox invented the SDI-type interface of which macOS is a derivative.
Anyway, I think you are misunderstanding Xcode's behavior. You seem to think “its own independent editor window” is a different kind of window than “the main Xcode window”. But in fact the new window is of the same kind as the old window, with some optional parts hidden. You can show those hidden parts and make the new window look exactly like the old window.
The ability to open multiple windows showing the same document (or, in Xcode's case, project) is a matter of software architecture. If you carefully design your app so that multiple windows can share a single model object graph, and can be notified and redraw themselves when the object graph changes, then you have an app that supports multiple windows showing the same document. If you want multiple kinds of windows showing the same document, nothing about Cocoa stands in your way. As a matter of fact, Xcode does have at least one other kind of window in which it shows some properties of a project.
That project settings sheet is really another window; macOS keeps it attached to the main window, but it is in fact an instance of NSWindow (or a subclass of NSWindow), no doubt with its own custom window controller that references the same project objects as the main window.
If you use the Cocoa NSDocument architecture, then a small amount of multi-window support is built-in: an NSDocument knows about its associated windows (via their window controllers). If you want to use the NSDocument architecture, you should read Document-Based App Programming Guide for Mac.
It is unclear what you are after. The traditional Mac UI has been one window per document - i.e. SDI, with a single instance of the app running multiple windows - but any app has always been able to organise the content of a window as it sees fit, including showing multiple "documents" within one window - i.e. an MDI-type UI.
Apps approach such "MDI" in different ways; e.g. some use panes (views) and others tabs. From macOS Sierra the standard NSWindow supports tabs, and this is (semi)automatic for standard document apps. Read the NSWindow Automatic Window Tabbing section in Apple's Sierra release notes for more details.
If you wish to use multiple panes - e.g. like Xcode - you just use views (NSView) and arrange them how you wish.
HTH
In my MFC app, I'm attempting to make a window that resembles the Windows 7 Open File dialog, but it browses a virtual/fake file system. It doesn't need to be pixel-perfect, but I'd like parity with the native OS dialog where possible.
Probably the most challenging part is the address bar that runs along the top of an Open dialog. The same control sits atop all Windows Explorer windows. It shows the folder names that make up your path, shows and hides buttons when moused over (including an attractive fade animation), changes the active directory when names are clicked, and shows submenus when the triangles between names are clicked. This doesn't seem to correspond to any MFC control (or group of controls). Spy++ shows it as an "AddressDisplay Control" but I can't find much documentation beyond that.
Is there a way to access a control like this, or to mimic it, in MFC? Also, I am not browsing the real file system, so I have to be able to tell the control what to display--I can't just point it at C:\ and let the system do the rest.
Unfortunately, I think this is one of those controls that Microsoft has decided not to expose to developers through the MFC Feature Pack. The Feature Pack was developed from the BCG control library, and that library does contain the control you want; however, it's not free. The only other alternative is to code it yourself.
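If you do end up coding it yourself, the bare bones are fairly tame even without the animations and drop-down menus: a child window that lays out one button per path segment and tells its parent which segment was clicked. Below is a minimal sketch of that idea in MFC; the class name CVirtualBreadcrumb, the WM_BREADCRUMB_CLICKED message and the fixed 80-pixel segment width are illustrative choices, not anything provided by MFC, and the fancier Explorer behaviours (fade animation, overflow, the drop-down triangles) are left out.

    // Minimal sketch of a breadcrumb-style address bar driven by a virtual file system.
    // CVirtualBreadcrumb and WM_BREADCRUMB_CLICKED are illustrative names, not MFC APIs.
    #pragma once
    #include <afxwin.h>
    #include <memory>
    #include <vector>

    const UINT WM_BREADCRUMB_CLICKED = WM_APP + 1;   // wParam = index of the clicked segment

    class CVirtualBreadcrumb : public CWnd
    {
    public:
        // Call after Create(); replaces the displayed path with names from your virtual FS.
        void SetSegments(const std::vector<CString>& segments)
        {
            ASSERT(GetSafeHwnd() != NULL);
            for (auto& btn : m_buttons)
                btn->DestroyWindow();                // tear down any previous segment buttons
            m_buttons.clear();
            m_segments = segments;

            CRect rcClient;
            GetClientRect(&rcClient);
            int x = 2;
            for (size_t i = 0; i < m_segments.size(); ++i)
            {
                auto btn = std::make_unique<CButton>();
                CRect rcBtn(x, 2, x + 80, rcClient.Height() - 2);   // fixed width keeps the sketch short
                btn->Create(m_segments[i], WS_CHILD | WS_VISIBLE | BS_PUSHBUTTON,
                            rcBtn, this, ID_FIRST_SEGMENT + static_cast<UINT>(i));
                x = rcBtn.right + 2;
                m_buttons.push_back(std::move(btn));
            }
        }

    protected:
        enum { ID_FIRST_SEGMENT = 2000 };

        // Segment buttons notify us via WM_COMMAND; forward the segment index to the parent,
        // which can query the virtual file system for that folder and call SetSegments() again.
        virtual BOOL OnCommand(WPARAM wParam, LPARAM lParam)
        {
            UINT id = LOWORD(wParam);
            if (HIWORD(wParam) == BN_CLICKED &&
                id >= ID_FIRST_SEGMENT &&
                id < ID_FIRST_SEGMENT + static_cast<UINT>(m_segments.size()))
            {
                GetParent()->SendMessage(WM_BREADCRUMB_CLICKED, id - ID_FIRST_SEGMENT, 0);
                return TRUE;
            }
            return CWnd::OnCommand(wParam, lParam);
        }

    private:
        std::vector<CString> m_segments;
        std::vector<std::unique_ptr<CButton>> m_buttons;
    };

The parent (your Open-dialog replacement) would create it with CWnd::Create, handle WM_BREADCRUMB_CLICKED via ON_MESSAGE in its message map, and feed it a new segment list from the virtual file system whenever the "directory" changes.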
I'm a Mac user and a Windows user (and once upon a time I used to be an Amiga user). I much prefer the menu-bar-at-the-top-of-the-screen approach that Mac (and Amiga) take (/took), and I'd like to write something for Windows that can provide this functionality (and work with existing applications).
I know this is a little ambitious, especially as it's just an itch-to-scratch type of project and, thanks to a growing family, I have virtually zero free time. I looked into this a few years ago and concluded that it was very difficult, but that was before StackOverflow ;)
I presume that I would need to do something like this to achieve the desired outcome:
Create an application that will be the custom menu bar, sitting on top of all other windows. The custom menus would have to provide all the functionality needed to replace the standard Win32 in-window menus. That's OK; it's just an application that behaves like a menu bar.
It would continuously enumerate windows to find windows that are being created/destroyed. It would enumerate the child windows collection to find the menu bar.
It would build a menu that represents the menu options in the window.
It would hide the menu bar in the window and move all direct child windows up by a corresponding pixel amount. It would shorten the window height too.
It would capture all messages that an application sends to its menu, to adjust the custom menu accordingly.
It would constantly poll for the currently active window, so it can switch menus when necessary.
When a menu hit occurs, it would post a message to the window using the hwnd of the real menu child control.
That's it! Easy, eh? No, probably not.
I would really appreciate any advice from Win32 gurus about where to start, ideas, pitfalls, thoughts on if it's even possible. I'm not a Win32 C++ programmer by day, but I've done a bit in my time and I don't mind digging my way through the MSDN platform SDK docs...
(I also have another idea, to create a taskbar for each screen in a multi-monitor setup and show the active windows for the desktop -- but I think I can do that in managed code and save myself a lot of work).
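For concreteness, here is a rough sketch (plain Win32, C++) of what the "enumerate the windows", "find the menu bar" and "poll the active window" pieces described above might look like. It deliberately covers only the easy part: it finds top-level windows that own a menu and watches the foreground window; the hiding, replicating and message re-routing are where the real trouble starts.

    // Rough sketch of the enumeration / polling pieces described above (Win32, C++).
    // It only *finds* menus and the active window; the rest is the hard part.
    #include <windows.h>
    #include <cstdio>

    // Called once per top-level window; report the ones that actually own a menu bar.
    static BOOL CALLBACK EnumProc(HWND hwnd, LPARAM /*lParam*/)
    {
        if (!IsWindowVisible(hwnd))
            return TRUE;                          // keep enumerating

        HMENU hMenu = GetMenu(hwnd);              // NULL if the window has no menu bar
        if (hMenu != NULL)
        {
            wchar_t title[256] = L"";
            GetWindowTextW(hwnd, title, 256);
            int items = GetMenuItemCount(hMenu);
            printf("\"%ls\" has %d top-level menu items\n", title, items);
            // The "hide the menu bar in the window" step would be roughly:
            //   SetMenu(hwnd, NULL);   // removes the menu bar from that window
            // ...followed by resizing and shifting its child windows, which is where it gets ugly.
        }
        return TRUE;
    }

    int main()
    {
        for (;;)                                  // Ctrl+C to quit
        {
            EnumWindows(EnumProc, 0);             // "continuously enumerate windows"
            HWND active = GetForegroundWindow();  // "poll for the currently active window"
            wchar_t title[256] = L"";
            GetWindowTextW(active, title, 256);
            printf("active window: %ls\n", title);

            // A real implementation would diff these results against the previous pass,
            // rebuild its own menu for 'active', and, on a menu hit, do something like:
            //   PostMessage(active, WM_COMMAND, MAKEWPARAM(commandId, 0), 0);
            // where commandId was read out of the target's HMENU with GetMenuItemID().
            Sleep(1000);
        }
    }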
The real difference between the Mac menu across the top and the Windows approach is not just the menu itself: it's how MDI apps would have to be cracked open for a top-of-screen menu to make sense.
In Windows, MDI applications - like Dev Studio and Office - have all their document windows hosted inside an application frame window. On the Mac, there are no per-application frame windows; all document windows share the desktop with the document windows of every other application.
Without the ability to do a deep rework of traditional MDI apps to get their document windows out onto the desktop, an attempt - however noble - at a desktop menu seems doomed to be a novelty with no real use or utility.
I am, all things considered, rather depressed by the current state of window managers on Mac, Windows and Linux alike. Things like tabbed pages in browsers are really acts of desperation by application developers who have not been given such things as part of the standard window manager - which is where I believe tabs really belong. Why should Notepad++ have its own set of tabs, and Chrome, and Firefox, and Internet Explorer (yes, I have been known to run all four), along with Dev Studio's docking views and various paint programs?
It's just a mess of different interpretations of what a modern multi-document interface should look like.
The menu bar on a typical window is part of the non-client area of the window. It's drawn when the WndProc gets a WM_NCPAINT message and passes it on to DefWindowProc, which is part of User32.dll - the core window manager code.
Other things that are drawn in the same message? The caption, the window borders, the min/max/close boxes. These are all drawn while processing a single message. So in order to hide the menu for an application, you will have to take over handling of this message, which means changing the behavior of user32.dll. Hiding the menu is going to mean that you become responsible for drawing all of the non-client area.
And the appearance of all of these elements - the caption, the borders, etc. - changes with every major version of Windows, so you have to chase that as well.
That's just one of about a dozen insurmountable problems with this idea. Even Microsoft probably couldn't pull this off and they have access to the source code of user32.dll!
It would be a far less difficult job to echo the menu for each application at the top of the screen, and even that is nearly impossible. When a menu pops up there is lots of interaction with the application, during which the menu can be (and often is) changed. It is very common for applications to change the state of menu items just before they are drawn. So you would have to replicate not only the appearance of the menus, but their entire message-flow interaction with the application.
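To illustrate that last point: a typical application doesn't decide whether items like Save or Paste are enabled until the menu is about to drop down, inside its WM_INITMENUPOPUP handler, so an external copy of the menu has to reproduce that round-trip too. A sketch of the relevant handler in an ordinary app's window procedure (IDM_SAVE and IDM_PASTE are illustrative command IDs, and the "dirty" flag is hard-coded for the sketch):

    // Typical pattern: menu item state is computed just before the menu is shown.
    #include <windows.h>

    #define IDM_SAVE  101    // illustrative command IDs
    #define IDM_PASTE 102

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_INITMENUPOPUP:
        {
            HMENU hPopup = (HMENU)wParam;            // the submenu about to be displayed

            BOOL documentDirty    = TRUE;            // app-specific state, hard-coded here
            BOOL clipboardHasText = IsClipboardFormatAvailable(CF_TEXT);

            EnableMenuItem(hPopup, IDM_SAVE,
                           MF_BYCOMMAND | (documentDirty ? MF_ENABLED : MF_GRAYED));
            EnableMenuItem(hPopup, IDM_PASTE,
                           MF_BYCOMMAND | (clipboardHasText ? MF_ENABLED : MF_GRAYED));
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }

A menu-bar replacement therefore can't just copy the HMENU contents once; it has to provoke (or intercept) this exchange every time one of its menus opens, for every application it fronts.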
What you are trying to do is about a dozen impossible jobs all at once. If you try it, you will probably learn a lot, but you will never get it to work.
I have a few different things open in the terminal whenever I'm developing -- log tailing, Ruby console, plain shell in a certain directory, and so on.
How do I:
start all those things at once, hopefully in the right position on the screen?
make them distinct so I can switch to them with Quicksilver / Alt-Tab?
Fluid solved this problem with all of my web apps, so now I want to do it with my terminals.
And while we're on the topic, has anyone found a working solution for getting OS X to remember window positions on an external monitor? If I unplug it and plug it back in, I have to drag everything back to the same position (although at least Mercury Mover makes it possible to do it with the keyboard.)
Open Terminal and go into Preferences, then the Settings tab, and create a new settings profile for each of the windows you want. Either give them all different colour schemes, or duplicate one colour scheme several times so they all look the same. Under the Shell sub-tab, set "Run command" to a command to run at shell startup (this is the command that will cd to the directory you want, or tail a log).
Then arrange the windows as you want, click Window in the menu bar, and select Save Windows as Group...
In OS X Yosemite, you can use Window -> Save Windows as Group directly in Terminal; it will do all the work for you.