Get "Position" (Not Resolution) of macOS Extended Display - macos

I have an extended display set up like this:
In the extended display, I have a Finder window positioned on the left half of the extended screen:
When I call this AppleScript:
tell application "System Events"
    tell process "Finder"
        {position of window 1, size of window 1}
    end tell
end tell
I can get the position (and size) of the Finder window:
224, 1331, 881, 1075
However, if I move the bottom screen to the left:
The same AppleScript call now provides a different window position:
10, 1331, 881, 1075
How can I get the "position" of the extended screen?
I know I can get the bounds of the desktop using:
tell application "Finder" to get bounds of window of desktop
But that returns the exact same result for both extended display positions:
0, 0, 2304, 2416
I also know I can get the display resolution using
system_profiler SPDisplaysDataType
But that doesn't seem to tell me anything about the extended display's "position", just its resolution.
The same source also suggests the command
defaults read /Library/Preferences/com.apple.windowserver.plist
This looks promising because it lists UnmirroredOriginX and UnmirroredOriginY for each display. The problem with that file is that I don't see a way to figure out which of the 17 (in my case) stored configurations is currently active.
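Those keys can at least be surfaced from the shell; this is a sketch only, and it still doesn't tell me which stored configuration is the active one:
-- Sketch: dump every UnmirroredOrigin entry stored in the plist.
-- This does not identify which stored configuration is currently active.
do shell script "defaults read /Library/Preferences/com.apple.windowserver.plist | grep -E 'UnmirroredOrigin[XY]'"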
For background, my motivation for this question is to derive window positions like left, right, and center thirds, and comparable quarter and half positions, for windows on an extended display no matter where that display is "positioned" relative to the main display. I am trying to replicate the behavior of a program like Rectangle, but programmatically, so that windows can automatically be opened and sent to specific positions when certain scripts run.
I would just send keystrokes to Rectangle itself, but it has no way of specifying the main or extended display (only "Next" and "Previous" display), so as far as I know there is no deterministic way to guarantee which display the window will go to.
I would also be happy to hear about any other CLI-accessible program that can position any window to any 1/2, 1/3, or 1/4 position on a specific (main or extended) display.

The solution I came up with was to get the resolution using a regex on the output of
system_profiler SPDisplaysDataType
for lines matching
Resolution: (\d+) x (\d+)
I divide both of those numbers by 2 because I always use Retina / 4K monitors. This calculation seems to exactly match the AppleScript position values for the 4K display but not the Retina one. Luckily for me, the 4K display is the only one I need to be completely accurate.
Because I always scale my main monitor to a higher resolution than my laptop screen, I can infer that the larger resolution belongs to the external monitor, and therefore the main display, and the smaller one to the extended (laptop) display.
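A minimal sketch of that parsing step (the handler name and shell pipeline here are illustrative, not my exact script):
on displayResolutions()
    -- keep only the "Resolution: W x H ..." lines from system_profiler
    set rawLines to paragraphs of (do shell script "system_profiler SPDisplaysDataType | grep 'Resolution:'")
    set resolutions to {}
    repeat with aLine in rawLines
        set theLine to contents of aLine
        -- words 2 and 4 are the pixel width and height, e.g. "Resolution: 3840 x 2160 ..."
        set w to (word 2 of theLine) as integer
        set h to (word 4 of theLine) as integer
        -- halve for Retina / 4K scaling, as described above
        set end of resolutions to {w div 2, h div 2}
    end repeat
    -- e.g. {{1920, 1080}, {1440, 900}}; the larger pair is taken to be the main display
    return resolutions
end displayResolutions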
I then call
tell application "System Events"
    tell process "%s"
        {position of window 1, size of window 1}
    end tell
end tell
with "%s" replaced by the name of whatever target application owns the window to be positioned, which for me is always the frontmost (usually newly opened) window.
Because I always have the main monitor positioned above the extended display (the laptop), I assume the window is on the extended display if its returned Y position is greater than the halved resolution height of the main monitor.
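As a rough sketch, that check looks something like this (the handler name is made up, and mainHeight is assumed to be the halved resolution height of the main monitor from the parsing step above):
on isOnExtendedDisplay(mainHeight)
    tell application "System Events"
        tell (first process whose frontmost is true)
            set {winX, winY} to position of window 1
        end tell
    end tell
    -- with the main monitor arranged above the laptop, a Y value past the
    -- main monitor's height means the window sits on the extended display
    return (winY > mainHeight)
end isOnExtendedDisplay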
From there, I can keystroke Rectangle's "Next Display" hotkey if the window is on the wrong display, and then keystroke the appropriate 1/2, 1/3, or 1/4 hotkey.
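That last step is just System Events key codes. The shortcuts below assume Rectangle's default bindings (ctrl-opt-cmd-right for "Next Display", ctrl-opt-left for "Left Half"); substitute whatever hotkeys you actually have configured:
tell application "System Events"
    -- assumed default "Next Display" hotkey, only needed if the window is on the wrong display
    key code 124 using {control down, option down, command down} -- right arrow
    delay 0.2
    -- assumed default "Left Half" hotkey; swap in the desired 1/2, 1/3, or 1/4 binding
    key code 123 using {control down, option down} -- left arrow
end tell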

Related

Limit on window height when resizing with Applescript

I have a multi-monitor Mac desktop (4 displays, each 1920x1080, arranged in a 2x2 rectangle) and can use the mouse to open a window across all monitors, filling the whole four-screen desktop.
(I am running Mavericks and have disabled the "Displays have separate Spaces" checkbox.)
I want to be able to do this automatically, so I used AppleScript. However, the window will not open to a height greater than one display (1080 pixels), even though the displays are arranged in a 2x2 matrix so that the total height of the desktop is reported as 2160 pixels. Window width is no problem, and the script opens nicely across displays horizontally.
Here is the key part of the AppleScript:
tell application "Finder"
    set bounds of first window to {0, 0, 3840, 1800}
end tell
There seems to be some kind of limit on the vertical size of the window. Any ideas on how I can achieve this automation?
Googling has turned up endless gripes about multi-monitor support on Mavericks, but I can't find anything related to this particular issue.
Thanks in advance
BACKGROUND
I've tried this on two multi-monitor display configurations:
Early 2014 Mac Pro.
Four external 1920x1080 monitors arranged landscape in a 2x2 rectangle.
Reported desktop size is {0, 0, 3840, 2160}
MacBook Pro Retina Late 2013:
Two external 1920x1200 monitors arranged one above the other
(and the laptop's own 2880x1800 internal display of course)
Reported desktop size is {0, 0, 3360, 2400}
I don't have multiple monitors to test this, but on my one monitor the (0, 0) point is the upper-left corner of the screen. Maybe you need to adjust the second number of your bounds. My suggestion would be to open a window manually, run this code to get its bounds, and then try to set the bounds with the returned values. Of course I still don't know if this will work, but at least you'll know you're working with the proper bounds. Good luck.
tell application "Finder"
    return bounds of window 1
end tell
EDIT: once you know the proper bounds, you might try using System Events to resize the window. System Events doesn't know "bounds", but it does know "position" (the first two numbers of your bounds) and "size" (the width and height, i.e. the second pair of bounds numbers minus the first). Try this with your numbers.
tell application "System Events"
    tell process "Finder"
        set position of window 1 to {0, 400}
        set size of window 1 to {800, 500}
    end tell
end tell
Not to revive a dead subject, but with some external dependencies it is possible to resize a window larger than the monitor resolution. You need a program called MegaZoomer (https://github.com/ianh/megazoomer) and EasySIMBL (https://github.com/norio-nomura/EasySIMBL).
Install EasySIMBL first (it can be downloaded from http://www.macupdate.com/app/mac/44354/easysimbl), then pull down MegaZoomer from http://www.macupdate.com/app/mac/21275/megazoomer and copy the MegaZoomer package into the EasySIMBL packages directory. You will need to enable the package in SIMBL, and you may have to reboot. Then run your AppleScript and it should work.

Cocoa Accessibility API: Hide a window

I'm looking to hide a window on OS X (one not belonging to my app), but not the rest of that application. I have tried simply moving the window off the screen (like I would do on Windows), but the API always positions it at least 20 pixels away from the edge (#annoying).
Other things I have thought of:
Setting the opacity of the window to zero (can this be done?)
Minimizing the window, but it appears that the window handle becomes null once the window is minimized, so it might be hard to get back
Setting the window level (i.e. desktop) or z order (can this be done?)
Moving the window to a different workspace (can this be done?)
Does anyone know of a way to do this?

How do I change the viewport of a window in win32?

I have a window with child windows inside in it. The child windows take up about 1000 pixels of vertical space. However, our users don't always have 1000 pixels of vertical space available - they might have as little as 500 or 600 pixels.
I want to be able to display this window at a size of 500 pixels high, and have the user "scroll" up and down the window to see the full contents. The window should always be 500 pixels high, but the view within it should change.
Assume I can add a scroll bar somewhere so the user can choose which part of the window he wants to see. Windows will normally paint the window contents from height 0 to height 500; how do I tell it instead to "paint from height 250 to height 750", for example?
I know that I can set the viewport with functions like SetViewportOrgEx etc, but those functions require a device context - when do I call them if I want them to be "permanent"? Do I call them when I get the WM_PAINT message from windows? Or at some other time? And which functions from that family do I want to use?
Edit to add: I don't want to actually change the position of the child windows - they should stay at the same position, and the only thing that should change is the view into the window.
Thanks.
If you call ScrollWindowEx with the SW_SCROLLCHILDREN flag when you get the messages about the scroll bars changing, the child windows will be told to scroll along with everything else. This ought to put them in the right position.

Applescript, multiple monitors and maximum window sizes

I'm looking into window management on OS X (trying to achieve something like WinSplit Revolution), and I need to use AppleScript to pull out the maximum size of a window on a given monitor. Currently I've found:
tell application "Safari"
    set screen_width to (do JavaScript "screen.availWidth" in document 1)
    set screen_height to (do JavaScript "screen.availHeight" in document 1)
end tell
This works for the main monitor in a multiple-monitor setup, but doesn't provide anything at all for secondary monitors. I've also read about the method detailed here, and it obviously doesn't handle multiple displays in an efficient manner. Is there an effective way to get the maximum window size across multiple displays with AppleScript?
There have been many attempts to get monitor dimensions by reading system plist files. See http://macscripter.net/viewtopic.php?id=15425
If you have access to AppleScript Studio (ASS) terms, you can make method calls into NSScreen to get all the monitors and then ask them for their sizes. The easiest way to use ASS terms in a plain AppleScript is to compile an empty ASS application in Xcode. In this example, I've created a simple ASS app named ASSAccess which gives me access to ASS terms:
tell application "ASSAccess"
    -- The first item in this list will be your main monitor
    set screensArray to call method "screens" of class "NSScreen"
    set Displays to {}
    if screensArray is not {} then
        repeat with displayNo from 1 to (count screensArray)
            -- call visibleFrame instead to take into account the Dock and menubar
            set dims to call method "frame" of (item displayNo of screensArray)
            copy dims to end of Displays
        end repeat
    end if
end tell
Now, the issue I run into with these dimensions is the coordinate system. NSScreen's frame method gives you a rectangle whose origin is at the LOWER left corner of the main monitor, and any secondary screens are given relative to that origin. But if you are trying to determine whether a window is within those bounds, the window position is given in a coordinate system whose origin is at the UPPER left corner. This is a whole mess of conversion that I haven't figured out yet.
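The vertical part of that conversion, at least, is just a flip around the main display's frame height. A sketch (the handler and parameter names are made up; mainHeight is assumed to be the height taken from the frame of item 1 of screensArray above):
on flipY(frameY, frameHeight, mainHeight)
    -- NSScreen frames measure Y upward from the bottom of the main display;
    -- System Events positions measure Y downward from its top, so flip:
    return mainHeight - (frameY + frameHeight)
end flipY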

Auto-Hide taskbar not appearing when my application is maximized

My application draws all its own window borders and decorations. It works fine with the Windows taskbar set to auto-hide, except when my application window is maximized: the taskbar won't "roll up". It behaves normally if the application is not maximized, even when sized all the way to the bottom of the screen, and even if I just resize the window to take up the entire display (as though it were maximized).
I found the problem. My application was handling the WM_GETMINMAXINFO message and overriding the values in the MINMAXINFO parameter record. The values in the record were inflated by 7 pixels (the border width) beyond the screen resolution, which makes sense: when maximized, the window's borders are pushed beyond the visible part of the screen. The record also set ptMaxPosition (the point the window origin is moved to when maximized) to -7, -7. My application was overriding that with 0,0, and setting the max height and width to exactly the screen resolution (not inflated). I'm not sure why this was done; it was written by a predecessor. If I comment out that code and don't modify the MINMAXINFO structure, auto-hide works.
As to why, I'm not entirely sure. It's possible that the detection for popping up an "autohidden" taskbar is hooked into the mechanism for handling WM_MOUSEMOVE messages, and not for WM_NCMOUSEMOVE. With my application causing the maximize to park my border right on the bottom of the screen, I would have been generating WM_NCMOUSEMOVE events; with the MINMAXINFO left alone, I would have been generating WM_MOUSEMOVE.
This depends on whether 'Keep the taskbar on top of other windows' is checked in the taskbar properties. If it's checked, the taskbar will appear.
But don't be tempted to programmatically alter this setting on an end user's machine just to suit your needs; that's considered rude and bad practice. Your app should fit whatever environment it gets deployed to.
