Is there a way to place a window at the same point, for example the top-right corner, on two displays with different resolutions?
For example, you have a MacBook and you have connected it to a big display.
Note: the window's "Spaces" property in IB is set to "Can join all spaces".
Spaces and displays are two separate concepts. So, "Can join all spaces" is not relevant to your question.
A window can only be at one position in the global screen coordinate system that spans the whole desktop. Each display constitutes a separate part of that coordinate system (ignoring mirroring). Therefore, no, it's not possible to have a window show up in the top-right corner of two separate displays. You would need two separate windows to achieve that.
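If you want a window in the same visual spot on each display, you would create one window per display and compute each position from that display's bounds. As a rough sketch (my own illustration, not your Cocoa setup), the CoreGraphics C API can enumerate displays and give each one's rectangle in the global coordinate space; note that CGDisplayBounds uses a top-left origin, while NSWindow screen coordinates use a bottom-left origin, so a real Cocoa implementation would flip accordingly:

    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>

    int main(void) {
        /* Ask CoreGraphics for the active displays (up to 16 here). */
        CGDirectDisplayID displays[16];
        uint32_t count = 0;
        if (CGGetActiveDisplayList(16, displays, &count) != kCGErrorSuccess)
            return 1;

        for (uint32_t i = 0; i < count; i++) {
            /* CGDisplayBounds is in the global display space, with (0,0)
               at the top-left of the main display. */
            CGRect b = CGDisplayBounds(displays[i]);
            printf("display %u: top-right corner at (%.0f, %.0f)\n",
                   i, CGRectGetMaxX(b), CGRectGetMinY(b));
        }
        return 0;
    }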
When a window is resized by Aero Snap, User32.GetWindowPlacement(hWnd).rcNormalPosition still stores its original rectangle, while User32.GetWindowRect returns the snapped rectangle.
Since Aero Snap appears to be independent of WINDOWPLACEMENT, we cannot collect complete information about the actual placement using user32.dll alone. So I'm wondering if there's a way to get the Aero Snap state of a window, indicating whether the window is docked and, if so, to which side.
Aero Snap is a feature of the Shell, not the windowing system. Thus, the windowing system cannot provide that information, because it is not aware of those states.
And the Shell doesn't make this information available either. So, in essence, the system doesn't provide the Aero Snap state of any given window through a public API.
I like having my main windows remember all of their placement information so that it can be restored when they are restarted. In the past, it was enough to save a copy of the window placement structure and to set it back when recreating the window.
The introduction of snap required keeping some extra information. I detected whether a window appeared to be snapped by comparing its window rectangle to the work area rectangle of the monitor that contains the window. If it seemed to be snapped to one of the edges, I recorded that along with the placement information. Upon creating the window, I first restore the window placement, and then, if I have a snap state recorded, I change the window's size and position accordingly.
You can distinguish a window that's been snapped to a monitor edge from one that's been carefully sized and placed there because the snapped window's rectangle won't match the one in the window placement.
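A minimal sketch of that detection heuristic in C (my own illustration: the helper name LooksSnapped is made up, it only covers the classic half-screen snap, and it ignores the subtlety that rcNormalPosition is in workspace rather than screen coordinates):

    #include <windows.h>
    #include <stdbool.h>

    /* Heuristic: a snapped window's actual rectangle differs from its
       WINDOWPLACEMENT rectangle and hugs the monitor's work-area edges. */
    bool LooksSnapped(HWND hwnd)
    {
        WINDOWPLACEMENT wp = { sizeof(wp) };
        RECT rc;
        if (!GetWindowPlacement(hwnd, &wp) || !GetWindowRect(hwnd, &rc))
            return false;

        /* A carefully placed window has rc == rcNormalPosition. */
        if (EqualRect(&rc, &wp.rcNormalPosition))
            return false;

        MONITORINFO mi = { sizeof(mi) };
        HMONITOR mon = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);
        if (!GetMonitorInfo(mon, &mi))
            return false;

        /* A half-snapped window spans the full work-area height and
           touches the left or right edge (the Windows 7 case). */
        const RECT *wa = &mi.rcWork;
        bool fullHeight  = rc.top == wa->top && rc.bottom == wa->bottom;
        bool touchesSide = rc.left == wa->left || rc.right == wa->right;
        return fullHeight && touchesSide;
    }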
This approach worked great in Windows 7. I recently discovered that Windows 10 added more flexibility to the snap locations and sizes, and plays more games to achieve its annoyingly invisible resize borders. So my detection code doesn't always recognize a snapped window, but that should be fixable.
Some popular software (like the major browsers) seem to remember that they were snapped, and I assume they use the same approach.
I'm using the HTMLHelp function from C to provide short, context-sensitive help for the edit fields on various dialogs. It works perfectly except when the dialog is located on a monitor whose screen has negative X coordinates.
For example, I have three monitors, and the top-left point of the CENTER one is the (0,0) point of the screen. That makes ALL X coordinates on the LEFT screen negative. When I call HTMLHelp with HH_DISPLAY_TEXT_POPUP, the tooltip it displays shows up stuck to the left edge of the CENTER screen instead of on the LEFT screen, where it belongs. When the coordinate for the help popup is on the center or right screens, the popup is exactly where it should be.
Does anyone know a way to get the HTMLHelp function to work correctly and just use the given coordinates instead of applying an invalid range check and "fixing" my X location?
If not, I guess I will be forced to write my own version of a help popup function, but I'd much rather use the system feature.
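For reference, the call pattern in question looks roughly like this (a sketch; ShowFieldHelp is a made-up helper, and pt carries the possibly negative screen coordinate that HTMLHelp appears to clamp):

    #include <windows.h>
    #include <htmlhelp.h>   /* link with htmlhelp.lib */

    /* Show a text popup at a screen coordinate (which may be negative
       on a monitor placed to the left of the primary one). */
    void ShowFieldHelp(HWND hwndCaller, LPCTSTR text, POINT screenPt)
    {
        HH_POPUP popup = {0};
        popup.cbStruct      = sizeof(popup);
        popup.pszText       = text;         /* idString == 0: use pszText */
        popup.pt            = screenPt;     /* desired popup origin */
        popup.clrForeground = (COLORREF)-1; /* -1 selects default colors */
        popup.clrBackground = (COLORREF)-1;
        popup.rcMargins.left = popup.rcMargins.top =
            popup.rcMargins.right = popup.rcMargins.bottom = -1;

        HtmlHelp(hwndCaller, NULL, HH_DISPLAY_TEXT_POPUP, (DWORD_PTR)&popup);
    }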
When designing a GUI in most languages, you typically don't give exact dimensions for each component. Rather, you say how GUI components fit and size relative to each other. For example, Button1 should take up all the space Button2 and Button3 don't use; the TextPanel should fill as much space as it can; and the horizontal list of images should expand and shrink as the window expands and shrinks. In AnyLogic, I don't see any obvious way to do this, yet I need to develop models that work on multiple screen sizes. Is it possible to auto-scale GUI components in AnyLogic as it is in other languages? If so, how?
Unfortunately, there is no direct support for that as far as I know.
However, some of your requests can be achieved programmatically, i.e. by using the dynamic properties of your GUI elements.
There is a function getWindowWidth() (and getWindowHeight()) for experiments, and you can set your button's width to equal that. With a bit of experimentation, you should be able to get your desired result.
I want my program's window to be as big as possible without overlapping the window manager's various small windows, e.g. the pager. Is there any way to ask the WM what the maximized window size is before I create my window?
The _NET_WORKAREA property of the root window is probably the closest match. However, on a multi-headed system it will give you the combined work area across all monitors.
If that's what you want, fine (but see here on making a window span multiple monitors). If you want to maximize over a single monitor, then there's a problem as there's no per-monitor API like _NET_WORKAREA. Your best bet is creating a window in a maximized state and then querying its size. If that's not an option, I'm afraid you will have to query the number and sizes of available monitors, and then go and calculate the work area of each monitor by subtracting "struts" from the full area (see here about _NET_WM_STRUT and _NET_WM_STRUT_PARTIAL).
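A sketch of the first option, reading _NET_WORKAREA from the root window with Xlib (the property holds one x, y, width, height quadruple per desktop; this reads just the first one):

    #include <X11/Xlib.h>
    #include <X11/Xatom.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;

        Atom workarea = XInternAtom(dpy, "_NET_WORKAREA", True);
        Atom actual_type;
        int actual_format;
        unsigned long nitems, bytes_after;
        unsigned char *data = NULL;

        /* _NET_WORKAREA holds CARDINAL[4] per desktop: x, y, w, h. */
        if (workarea != None &&
            XGetWindowProperty(dpy, DefaultRootWindow(dpy), workarea,
                               0, 4, False, XA_CARDINAL,
                               &actual_type, &actual_format,
                               &nitems, &bytes_after, &data) == Success &&
            data && nitems >= 4) {
            long *geom = (long *)data;  /* 32-bit items come back as longs */
            printf("work area: x=%ld y=%ld w=%ld h=%ld\n",
                   geom[0], geom[1], geom[2], geom[3]);
            XFree(data);
        }
        XCloseDisplay(dpy);
        return 0;
    }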
I'm designing my interface in Interface Builder (using Xcode 4.2 on Snow Leopard), and PERFECTLY aligning two elements (two NSButtons, bordered), one below the other.
The thing is that when the window is being resized, at some points the elements seem misaligned (by 1 pixel or so), while at others they're still perfectly aligned.
Here's a (zoomed) example of what I mean:
[screenshot: Aligned]
[screenshot: Mis-Aligned]
And here are my resizing settings (for the upper NSButton):
And for the container (of my upper NSButton):
I know I'm probably getting a bit too crazy about such a tiny issue, but I definitely need to resolve it.
So, why is that happening? What should I do in order to resolve it?
Are both buttons in the same container?
Do they have the same size & alignment settings?
Below the autosizing widgets in the inspector, there's a set of alignment buttons. Try selecting both buttons and clicking the left-side alignment button. (See if that makes the other side mis-align.) Below that are placement icons - verify both buttons have same settings there.
Type in values for W & H so both buttons are exactly the same size (even if the boxes already show the same values, type over them to be sure). Also type in X & Y so they're the same (except for the vertical offset).
Personally, having the center scaling set (last image, double-ended horizontal arrow) seems odd when it's only anchored on one side. That might have a strange effect. On the other hand, you have both vertical anchors set but not the vertical scaling.
If all else fails, you could try (after saving the original version first) deleting the second button, copying the first, and positioning the copy below it.