I use Matlab on my MacBook Pro with Retina display.
Using get(0,'ScreenSize'), we obtain
ans =
1 1 1440 900
instead of 1 1 2880 1800. Is there any way to work with the true pixel sizes?
No, 1440-by-900 is likely the correct effective value for your screen's resolution. This is the value that the OS reports to applications, and it is not the same as the number of physical pixels (sometimes referred to as the "native resolution"). However, applications also need to check whether a display supports HiDPI mode (a.k.a. Retina). In your case, each "Retina pixel" is made up of a 2-by-2 set of raw pixels (which, in turn, each have RGB sub-pixels). Applications that are "Retina-aware" can then render certain graphics (e.g., images and video) at the full native resolution within regions of the screen. Some more details – probably more accurately stated – can be found in this article.
There are 3rd-party solutions to run OS X at the native resolution (e.g., SwitchResX and the methods discussed here), but this of course makes everything, UI included, ridiculously tiny. If you're running one of these, Matlab should report your resolution as 2880-by-1800.
I am not aware of any Matlab options, properties, or functions that allow one to actually take advantage of a Retina display. This means that, for example, when you display an image, each of its pixels is rendered as a 2-by-2 block of Retina pixels.
Today's displays vary hugely in size and resolution. For example, my 34.5cm × 19.5cm display (a diagonal of 39.6cm, or 15.6") has 1366 × 768 pixels, whereas the MacBook Pro (3rd generation) with a 15" diagonal has 2880 × 1800 pixels.
Multiple people have complained that everything is too small on such high-resolution displays (see example). That is easy to explain when developers use pixels to define their GUIs. For traditional displays this was not a big problem, as pixels had roughly the same size on most monitors. But on newer monitors with a much higher pixel density, the pixels are simply smaller.
So how can / should user-interface developers deal with that problem? Is it possible to get the physical size of the screen? Is it possible to set physical sizes instead of pixel-based ones? Is this still a problem (it's been a while since I last read about it), or has it been fixed in the meantime?
(While CSS seems to support cm, when I try it here, the rendered size is not the size I set.)
how can / should user interface developers deal with that problem?
Use a toolkit or framework that supports resolution independence. WPF is built from the ground up to be resolution-independent, but even an old framework like Windows Forms can learn new tricks. OS X/iOS and Windows (or the browser, if we're talking about the web) may try to take care of the problem with automatic scaling, but if bitmap graphics are involved, developers may need to provide bitmaps at several densities, as on Android (which faces the widest range of resolutions and densities of any OS).
Is it possible to get the physical size of the screen?
No, and developers shouldn't care about it. Developers should only care about the class of the device (say, a different UI for tablet and smartphone), and perhaps the DPI to decide which bitmap resource to use. Vector resources and fonts should be scaled by the framework.
Is that still a problem (it's been a while since I last read about it) or was that fixed meanwhile?
It depends on when you last read about it. Windows support is still spotty, even for its own built-in apps, and while anyone developing in WPF or UWP has it easy, don't expect major third-party apps to join in soon. OS X display scaling seems to work a bit better, while modern mobile OSes either run on a limited range of resolutions (iOS and Windows Phone) or handle every resolution imaginable quite nicely (Android).
There are a few ways to deal with different screen sizes. For example, when I make mobile apps in Java, I either use DIPs (density-independent pixels, which stay at a roughly fixed physical size) or make objects occupy a percentage of the screen with simple math. For web development, you can use vw and vh (viewport width and viewport height): by appending these to a value instead of px, the object takes up a percentage of the viewport; for example, 100vh takes up 100% of the viewport height. Finally, what I think is the best way, though time-consuming, is to use a library like Bootstrap that automatically resizes elements, even when the window is resized. W3Schools has a good tutorial on Bootstrap, and more detailed explanations of any of these options are a quick Google search away.
Designing a GUI in today's era of display diversity is a real challenge. I would suggest several hints, mainly about the design of GUI applications:
Never set or expect a constant pixel size for text; the user can change it from the OS's system settings. Use real-world measures for the text and check its pixel size when drawing (see the sketch after this list). Provide some way to fit text of arbitrary size within the boundaries of the window.
Never set or expect a constant pixel size for GUI widgets. Try to position them on the window adaptively, according to the size of the window. Most GUI widget toolkits today have instruments for this.
Never set or expect a constant pixel size for dialog windows. Let the OS choose the size for you and then use what you get (X11). Or, if you need to set a size and position (Windows), define it as a percentage of the screen size.
If possible, use scalable image formats for icons; SVG is great for this. Using sets of bitmap icons at different sizes is acceptable, but highly suboptimal in memory use, and it still will not provide perfect scaling in most cases.
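As a concrete illustration of the "no constant pixel sizes" hints above, here is a minimal Win32 sketch that derives a widget's pixel size from the DPI the window is actually rendered at, rather than hardcoding it. It assumes Windows 10's GetDpiForWindow is available; the 96-DPI baseline is the classic Windows convention, and the example numbers are my own:

#include <windows.h>

// Scale a length designed at the classic 96-DPI baseline to the DPI the
// window is actually rendered at, so the widget keeps its physical size.
int ScaleForDpi(HWND hwnd, int size96)
{
    UINT dpi = GetDpiForWindow(hwnd);  // e.g. 144 at 150% scaling
    return MulDiv(size96, dpi, 96);    // 100 becomes 150 at 150% scaling
}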
Developing with Matlab 2014b's GUIDE, some of my GUIs have elements whose units are specified as "characters". Depending on the screen magnification level in Windows 7 (Control Panel > Appearance > Display), the GUI can look very different, with elements scattered. Shouldn't using characters as the unit type make adapting to the screen magnification a piece of cake, since the system character size would change accordingly?
I'd rather not hard-code the units as pixels etc., so that the GUI is happy being used on Windows/Linux/Mac. Does anyone have any experience or suggestions with this?
I have found it is easiest to use pixels. You can then get the current window size and set things as percentages (stored in variables) of the real pixel dimensions. This is nice when you want to ensure a minimum or maximum panel or item size while resizing or scaling within a range.
If you put this logic in the figure's resize callback (ResizeFcn), it should work well.
I've tried to find an answer for this on MSDN, but I'm not getting a clear picture of how this is intended to work. All of my work is on Windows 8.1.
Here is my issue. I am working on a laptop with a high-resolution monitor, 3200x1800. I've been using EnumDisplayMonitors to get the bounding rectangle of my screen.
This seems to work fine if my display settings are at their defaults. But I've noticed that when I change the Windows display settings to provide larger text, the resolution returned by EnumDisplayMonitors changes. Rather than getting 3200x1800, I get 2133x1200.
I'm guessing that since I asked for larger text, Windows chooses to represent the screen at a smaller resolution.
It seems that if I look at the virtual screen properties, everything is represented in the actual coordinates of my screen, i.e. 3200x1800. But the APIs for getting the window and monitor rectangles seem to operate on this "other" coordinate space.
Is there any documentation, or are there Windows APIs, to handle the conversion between these "other coordinates" and the "virtual coordinates"? I.e., if I want EnumDisplayMonitors or GetMonitorInfo to give me the true screen coordinates, how could I convert 2133x1200 to 3200x1800?
You have increased the DPI of the video adapter to 150% (144 dots per inch) to keep text readable and avoid having windows the size of a postage stamp. Quite necessary on such high-resolution displays. But you haven't told Windows that your program knows how to deal with it.
So it assumes your program is an old one that was never designed to run on such monitors. It helps, and lies to you. It gets your program to render its output to a memory buffer, then takes that output, rescales it by 150%, and copies it to the video adapter. This is something you can see: text looks fuzzier if you put your program's output next to that of a program that doesn't ask for this kind of scaling, like Notepad.
And of course, it lies to you when you ask for the size of the screen. It reports the real size divided by the scale factor: 3200 / 1.5 ≈ 2133 and 1800 / 1.5 = 1200. That way, after rescaling, a window you create will fill the screen.
Which is all just fine, but of course not ideal: your program doesn't look as good as it should. You have to tell Windows that you know how to deal with the higher resolution. Do beware that this looks easier than it is in practice. Getting text to look crisp is trivial; it is bitmaps that are problematic. And in general this is a fertile source of bugs; even the big companies can get it wrong.
Before I start with an answer, let me ask: what are you really trying to do? Or, more specifically, why do you need to know the monitor resolution? The standard way to do this is to call GetWindowRect(GetDesktopWindow(), &rect), as in the sketch below. I'm not sure whether the screen coordinates change based on DPI settings, but you should try that instead of GetMonitorInfo, as the latter is for more advanced scenarios. And if GetWindowRect still returns a scaled rect, just call DPtoLP, LPtoDP, or another coordinate-mapping function as appropriate.
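For reference, here is a minimal, self-contained sketch of that call (the values in the comment reflect this question's 150% scaling scenario):

#include <windows.h>
#include <cstdio>

int main()
{
    // Bounding rectangle of the desktop window. A DPI-unaware process
    // sees the scaled-down (virtualized) coordinates, e.g. 2133 x 1200.
    RECT rect;
    GetWindowRect(GetDesktopWindow(), &rect);
    std::printf("%ld x %ld\n", rect.right - rect.left, rect.bottom - rect.top);
    return 0;
}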
When you adjust the display settings as you described, you are actually changing the DPI setting of the screen. As a result, certain APIs go into a compatibility mode that lets apps create larger elements and windows without knowing anything about this setting.
Why do you need to know the actual screen resolution since most of the windowing APIs will behave accordingly when the DPI scaling changes?
I suspect you could call SetProcessDPIAware, or use the manifest-file equivalent; a sketch follows. But do read this MSDN article first to understand DPI scaling.
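A minimal sketch of the API route (in production code the manifest is the preferred mechanism, and SetProcessDPIAware must be called before any windows are created):

#include <windows.h>
#include <cstdio>

int main()
{
    // Opt this process out of DPI virtualization. After this call the
    // system reports physical pixels instead of scaled coordinates.
    SetProcessDPIAware();

    // Now reports the native resolution, e.g. 3200 x 1800
    // instead of the virtualized 2133 x 1200.
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);
    std::printf("%d x %d\n", w, h);
    return 0;
}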
I'd like to play some older games on Windows 7. Running them isn't an issue, but the size and pixel density of modern monitors are. Pre-rendered games intended to be played full-screen at, e.g., 640x480 are now blown up to fill the whole screen, making everything look blurry. I've been looking at different solutions, but so far to no avail for a selection of games:
Running the game in "windowed mode" is an option for those games that support it.
DxWnd can be used to force some games into windowed mode, but it causes some applications to crash as well.
VirtualBox works nicely, since it automatically resizes to the application's desired full-screen resolution, but this is no option if VirtualBox's 3D support is insufficient to play the game.
Drivers like those from AMD or NVIDIA provide means to force the pixel aspect ratio to be maintained, if that is an issue on wide-screen monitors.
None of the above works for one particular game: it does not provide a windowed mode, DxWnd makes it crash, VirtualBox's 3D support is insufficient, and aspect ratio isn't an issue on my monitor.
Which brings me to the question: is there a way to lower the screen resolution while maintaining the original pixel density of the monitor, instead of having the image fill up the whole screen? This would essentially create a smaller viewport for the Windows environment to use, filling up the rest of the screen with big black borders.
Right-click on the game, click Properties, and then try ticking this option:
(screenshot of the relevant Properties setting not reproduced here)
I'm new to OpenGL development for MacOS.
I'm making a game at 1024x768 resolution. In fullscreen mode on widescreen monitors, my game looks stretched, which is not good.
Is there any function in OpenGL to get the pixels-per-inch value? If I can find it, I can decide whether to add bars to the sides of the screen.
OpenGL is a graphics library, which means it is not meant to perform such tasks; it is only for rendering something onto the screen. It is quite low-level. You could use the Cocoa API NSScreen to get the correct information about the screens connected to your Mac.
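If you are calling from C or C++ rather than Objective-C (an assumption on my part; the answer itself only points at NSScreen), the CoreGraphics display API can provide the same information. A minimal sketch:

#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

int main()
{
    CGDirectDisplayID display = CGMainDisplayID();

    // Physical size of the display in millimeters, as reported by the OS.
    CGSize mm = CGDisplayScreenSize(display);

    // Current horizontal resolution in pixels.
    size_t px = CGDisplayPixelsWide(display);

    // Pixels per inch = pixels / (physical width in inches);
    // one inch is 25.4 mm.
    double ppi = px / (mm.width / 25.4);
    std::printf("~%.0f pixels per inch\n", ppi);
    return 0;
}

(Build with -framework ApplicationServices. Note that CGDisplayScreenSize may return zeroes on displays that don't report their physical size.)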
I make game 1024x768 resolution.
That's the wrong approach: never hardcode any resolution. If you want to make a fullscreen game, use the fullscreen resolution. If you want to adjust the rendering resolution, switch the screen resolution and let the display do the proper scaling. By using the resolutions offered to you by the display and the OS (the sketch below shows one way to enumerate them on the Mac), you'll always get the proper aspect ratio.
Note that it may still be necessary to take the pixel aspect ratio into account. However, neither switching the display resolution nor determining the pixel aspect ratio is part of OpenGL; those are facilities provided by the OS.
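As a sketch of "using the resolutions offered to you", here is one way to enumerate the modes the OS advertises on the Mac, again using CoreGraphics (my choice of API, since this thread is about OS X; the answer itself is platform-neutral):

#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>

int main()
{
    // Ask the OS which display modes it offers, rather than
    // hardcoding a resolution such as 1024x768.
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(CGMainDisplayID(), nullptr);
    for (CFIndex i = 0; i < CFArrayGetCount(modes); ++i) {
        CGDisplayModeRef mode =
            (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);
        std::printf("%zu x %zu\n",
                    CGDisplayModeGetWidth(mode),
                    CGDisplayModeGetHeight(mode));
    }
    CFRelease(modes);
    return 0;
}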