How to get correct DPI information on Linux (X11)?

system: Ubuntu 20.04.3 LTS
The default resolution is 1280x720 and the DPI is 96.
When I set the 'fractional scaling' to 125%, I have two ways to get the DPI:
Using the command xrdb -query | grep dpi, the DPI is 192?!
Xft.dpi: 192
Using the command xdpyinfo, the DPI is 120:
screen #0:
dimensions: 2048x1152 pixels (433x244 millimeters)
resolution: 120x120 dots per inch
Why do the two commands return different DPI values?
When scaling to 125%, why are the dimensions 2048x1152? (2048/1280 = 1.6, 1152/720 = 1.6)
Is the X11 API wrong, or is it some other problem?
Thanks.

Strictly speaking, neither xrdb nor xdpyinfo is the right place to query the screen's pixel density.
xrdb shows you a DPI value because it is the place where one can (but is not required to!) set an overriding DPI value for Xft, and some desktop environments do, just "because". xdpyinfo mostly shows values that already existed way back in the original X11 core protocol, where one could also specify the physical dimensions of a screen. The problem is that modern systems, which can dynamically attach and remove displays, handle this through XRandR, and the capability to drive multiple X11 screens on the same X11 display is no longer used (it's all just one large X11 screen now). So depending on how you configured your monitors, the values reported by xdpyinfo may be off.
To arrive at the correct pixel density, one must use XRandR (the CLI query/set tool is xrandr) to retrieve information about the physically connected displays. However, be advised that it is perfectly possible for several displays of different pixel density to show overlapping regions of the X11 screen, and within those regions there is no unambiguous DPI value available.
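As a rough illustration, the same per-output data that xrandr prints can be read programmatically through the XRandR C API. This is a minimal sketch of my own (not from the answer above), assuming libXrandr development headers and a link against -lX11 -lXrandr; it divides each connected output's pixel size by the physical size the monitor reports:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        /* skip disconnected outputs and ones reporting no physical size */
        if (out->connection == RR_Connected && out->crtc &&
            out->mm_width && out->mm_height) {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            /* 25.4 mm per inch: pixels / physical size = dots per inch */
            printf("%s: %.1f x %.1f DPI\n", out->name,
                   crtc->width  * 25.4 / out->mm_width,
                   crtc->height * 25.4 / out->mm_height);
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

Be aware that when fractional scaling is active, the CRTC size may already be a scaled framebuffer size (which would explain a 2048x1152 report for a 1280x720 panel), so the value computed this way is an effective DPI, not necessarily that of the physical panel.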

Related

Windows System DPI: GetDeviceCaps(hdc, LOGPIXELSX/Y) vs GetDpiForSystem()

Recently, I read this excellent page about DPI on Win32:
DPI and device-independent pixels
However, I am confused about GetDeviceCaps(hdc, LOGPIXELSX/Y) vs GetDpiForSystem(). On systems where I tested, all three values always match.
Questions:
Is it possible for GetDeviceCaps(hdc, LOGPIXELSX) and GetDeviceCaps(hdc, LOGPIXELSY) to return different values? I assume that LOGPIXELS means DPI. (Please correct me if wrong!)
If the previous answer is yes, then is GetDeviceCaps(GetDC(NULL), LOGPIXELSX/Y) the same as GetDpiForSystem()?
When possible, I should be using GetDpiForWindow(hWnd), but I want to clarify my understanding about "system" DPI in my questions above.
As far as I can tell, GetDeviceCaps(hdc, LOGPIXELSX/Y) is not the same thing as GetDpiForSystem().
LOGPIXELSX: Number of pixels per logical inch along the screen width. In a system with multiple display monitors, this value is the same for all monitors.
LOGPIXELSY: Number of pixels per logical inch along the screen height. In a system with multiple display monitors, this value is the same for all monitors.
This function is not able to return a per-monitor DPI. For that, you should use GetDpiForWindow().
GetDpiForWindow() also returns a different value based on the DPI_AWARENESS value.
GetDpiForSystem() returns the system DPI. The return value depends on the calling context: if the current thread has a DPI_AWARENESS value of DPI_AWARENESS_UNAWARE, the return value will be 96.
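To make the comparison concrete, here is a minimal sketch of my own (not from the question) that prints the three values side by side. It assumes a Windows 10 1703+ SDK and target so that SetProcessDpiAwarenessContext and GetDpiForSystem are declared; comment out the awareness call to watch every query collapse to 96 in a DPI-unaware process:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Opt the process into per-monitor awareness (Windows 10 1703+ only). */
    SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);

    HDC screen = GetDC(NULL);   /* screen DC (primary monitor) */
    int  logX = GetDeviceCaps(screen, LOGPIXELSX);
    int  logY = GetDeviceCaps(screen, LOGPIXELSY);
    UINT sys  = GetDpiForSystem();

    printf("LOGPIXELSX=%d LOGPIXELSY=%d GetDpiForSystem=%u\n", logX, logY, sys);

    ReleaseDC(NULL, screen);
    return 0;
}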
Is it possible for GetDeviceCaps(hdc, LOGPIXELSX) != GetDeviceCaps(hdc, LOGPIXELSY)? I assume that "LOGPIXELS" means DPI. (Please correct me if wrong!)
For monitors, I believe LOGPIXELSX == LOGPIXELSY even if your display has non-square pixels (which is extremely rare nowadays). There are still many printers out there that have different horizontal and vertical dot densities. In some cases, the printer driver may hide that, giving the illusion of square pixels. Among those that don't, you not only have to be careful to use the right value for the context, but you should be aware that some drivers forget to swap the horizontal and vertical values when you change the page orientation to landscape.
LOGPIXELSX and LOGPIXELSY refer to the number of pixels per logical inch, an idea that's been buried in recent versions of Windows. You used to be able to tell Windows that, when a program wants to display something that's 1-inch long, use my logical inch value rather than the actual DPI. (In the olden days of CRT monitors, it was usually impossible for Windows to discover the actual DPI anyway.)
Common values for logical inches were 96 and 120. If you had a really high-end monitor, you might've used 144. If you were into WYSIWYG applications or desktop publishing, you'd usually choose a value that was about 20% larger than an actual inch on your screen. I've heard various rationales for this, but I usually chose a higher value for easier reading.
When possible, I should be using GetDpiForWindow(hWnd)
That's fine. I use LOGPIXELSX and LOGPIXELSY because I'm old school and have always written high-DPI aware code.
but I want to clarify my understanding about "system" DPI in my questions above.
I believe the system DPI is the scaled DPI of the primary monitor. The scaling gives the programmer the same functionality as a logical inch, but it's presented to the user in a different way conceptually.
On systems where I tested, all three values always match.
Yes, it's very common for all of the methods to report the same DPI.
If your program is not high-DPI aware, you'll get 96 regardless of how you ask.
If it is aware, the system DPI is the DPI of the primary monitor. (Well, it's the possibly scaled native DPI of the monitor, but that's the same value you'll be told for the DPI of that monitor.)
That covers a lot of common cases. To truly and thoroughly test, you'd need a second monitor with a different native DPI than the primary.
A couple points to keep in mind.
The GetDeviceCaps approach is specific to the device context you reference with the HDC parameter. Remember that you can have printer DCs, enhanced metafile DCs, and memory DCs in addition to screen DCs. For screens, the return value will depend on the DPI awareness.
DPI awareness comes into play for screens (not printers). Your program's UI thread can be:
DPI unaware, in which case all methods will return 96 dots per inch and actual differences will (or might) be handled by the system scaling things on the back end.
DPI aware, in which case most will return the system DPI. I believe the system DPI is the (scaled?) DPI of the primary monitor, but I'm not sure about that.
per-monitor DPI aware, in which case GetDeviceCaps and GetDpiForWindow will return the actual DPI of the monitor that's referenced by the DC. I don't recall what it returns if the DC spans multiple monitors.
It might be the actual DPI if the spanned monitors have the same DPI, or it might be the system DPI, or it might be the DPI of one of the spanned monitors.
GetDpiForMonitor ignores DPI awareness altogether. I assume it remains for backward compatibility.
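For completeness, a minimal sketch of the GetDpiForMonitor route mentioned above (shellscalingapi.h, link against Shcore.lib). The helper name DpiForHwnd is mine, not part of the API:

#include <windows.h>
#include <shellscalingapi.h>   /* GetDpiForMonitor; link with Shcore.lib */

/* Hypothetical helper: effective DPI of whichever monitor hosts hwnd. */
UINT DpiForHwnd(HWND hwnd)
{
    HMONITOR mon = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);
    UINT dpiX = 96, dpiY = 96;   /* classic default as a fallback */
    if (SUCCEEDED(GetDpiForMonitor(mon, MDT_EFFECTIVE_DPI, &dpiX, &dpiY)))
        return dpiX;             /* X and Y are equal for monitors in practice */
    return 96;
}

On Windows 10 1607+, GetDpiForWindow(hwnd) should give the same answer for a per-monitor-aware window in a single call; the sketch above is the older route that also works on Windows 8.1.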

How does Windows 10 guess physical dimensions of a 1080p display?

I'm running Windows 10 1803 on a laptop attached to a desktop display. The laptop's display resolution is the same as the attached display's: 1920x1080 pixels (namely "Full HD").
However, the screen sizes differ: the laptop has a 13" display and the desktop display is 27"...
Via the settings option Start -> System -> Display I can adjust the scaling for each (!) display, down to the lowest scaling factor of "100%" and up to a highest resolution of Full HD (1080p).
This setting is fine for the laptop display.
However, with the same setting on the 27" display, everything appears double-sized...
Q1: How does Windows 10 guess the (physical) dimensions of a display so that DPI can be calculated "per inch"?
Q2: Is there any option to make Windows 10 render a "virtual" screen resolution higher than the physical maximum of the 27" display?
Q3: Why does Windows 10 ignore the registry DWORD HKEY_CURRENT_USER\Control Panel\Desktop\LogPixels when I change the value from the initial 96 to a higher value?
Remark: Since I do not work at a stationary desktop, buying a 4K display is definitely not an option and not an answer to this topic.
However, using 2K+ displays would solve the problem, since Windows would then calculate a reasonable DPI for the attached displays.

Wrong screen size for Retina display in Matlab

I use Matlab on my MacBook Pro with Retina display.
Using get(0,'ScreenSize'), we obtain
ans =
1 1 1440 900
instead of 1 1 2880 1800. Is there any way to work with the right sizes?
No, 1440-by-900 is likely the correct effective value for your screen's resolution. This is the value that the OS tells applications, and it is not the same as the number of pixels (sometimes referred to as the "native resolution"). However, applications also need to check whether a display supports HiDPI mode (a.k.a. Retina). In your case, each "Retina pixel" is made up of a 2-by-2 set of raw pixels (which, in turn, each have RGB sub-pixels). Applications that are "Retina-aware" can then render certain graphics (e.g., images and video) at the full native resolution within regions of the screen. Some more details – probably more accurately stated – can be found in this article.
There are 3rd-party solutions to run OS X at the native resolution (e.g., SwitchResX and the methods discussed here), but this of course makes everything, UI included, ridiculously tiny. If you're running one of these, Matlab should report your resolution as 2880-by-1800.
I am not aware of any Matlab options, properties, or functions that allow one to actually take advantage of a Retina display. This means that, for example, when you display an image, each of its pixels is rendered as a 2-by-2 block of Retina pixels.

How to display 1:1 on a rMBP when not running in "Best for Retina"

Let's say I have a rMBP, and an image that is 1000x1000 pixels.
If I display the image onscreen at 1:1 while running the MBP in "Best for Retina" mode, it will be displayed 1:1 on the actual retina display pixels (i.e. it will take up the same screen real estate as a 500x500 image on a 1440x900 screen).
However, if I then switch to one of the "scaled" resolution modes, e.g. 1680x1050, the system no longer displays the image 1:1, but scales it down (it occupies the same screen real estate as a 500x500 image on a 1680x1050 screen).
I would like a way to have the image continue to display 1:1 on the retina display, regardless of the system resolution in use. I realize that I could calculate an appropriate "scaled" size, and scale the image up so that when it is scaled back down it corresponds to a 1:1 mapping, but this results in a noticeable quality degradation.
When running the MBP in the "scaled" resolutions, does Apple not provide any way to control the on-screen pixels directly (bypassing the scaling for just a part of the screen)?
No. Display scaling occurs at a very low level within the GPU and affects the entire display; there is no way to bypass it for part of the screen.
Look at it this way: if you set the resolution of an ordinary laptop's display to, say, 800x600, there is no way to display an image at the native resolution of the LCD, or to render content inside the black pillarboxes on the sides of the display. For all intents and purposes, the LCD is 800x600 while it's set to that resolution; the fact that it's actually (say) a 1440x900 display is temporarily forgotten.
The same principle applies to the MacBook Pro Retina display. The nature of the scaling is a little more complicated, but the "original" resolution of the display is still forgotten when you apply scaling, and there is no way to render directly to it.
Here are the APIs for addressing the pixels directly:
https://developer.apple.com/library/mac/documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/CapturingScreenContents/CapturingScreenContents.html#//apple_ref/doc/uid/TP40012302-CH10-SW1
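For context, here is a minimal sketch of my own (not from the linked documentation) showing how a program can at least observe the mismatch between the scaled resolution and the backing pixels via the CoreGraphics C API; build with -framework CoreGraphics:

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

int main(void)
{
    CGDirectDisplayID display = CGMainDisplayID();
    CGDisplayModeRef mode = CGDisplayCopyDisplayMode(display);

    size_t points = CGDisplayModeGetWidth(mode);       /* logical (point) width   */
    size_t pixels = CGDisplayModeGetPixelWidth(mode);   /* backing-store pixel width */

    printf("points: %zu, pixels: %zu, scale: %.2f\n",
           points, pixels, (double)pixels / (double)points);

    CGDisplayModeRelease(mode);
    return 0;
}

In a "scaled" mode such as 1680x1050 the point width and pixel width differ by a non-integer factor, which is precisely why a per-region 1:1 mapping cannot be requested.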

What scaling factor to use for mapping the Font size on a high resolution monitor?

We have a requirement where our application needs to support high resolution monitors. Currently, when the application comes up in High res monitor, the text that it displays is too small. We use Arial 12 point font by default.
Now to make the text visible, I need to change the font size proportionally. I am finding it tough to come up with a formula which would give me the target font size given the monitor resolution.
Here is my understanding of the problem.
1) On Windows, by default 96 pixels correspond to 1 logical inch. This means that when the monitor resolution increases, the screen size in logical inches also increases.
2) A 1-point font is 1/72 of a logical inch. Combined with the fact that there are 96 pixels per logical inch, this means there are 96/72 pixels per point.
So a 12-point font occupies 12 * 96/72 = 16 pixels.
Now I need to know the scaling factor by which to increase this pixel count so that the resulting font is properly visible. If I know the scaled pixel count, I can get the font size simply by dividing it by 96/72.
What is the suggested scaling factor which would ensure properly scaled Fonts on all monitor resolutions?
Also, please correct if my understanding is wrong.
There's an example on the MSDN page for the LOGFONT structure. Your understanding is correct; you need to scale the point size by the vertical DPI (LOGPIXELSY) / 72.
lfHeight = -PointSize * GetDeviceCaps(hDC, LOGPIXELSY) / 72;
If you set the resolution in Windows to match that of the physical monitor, no adjustment should be needed. Any well written program will do the multiplication and division necessary to scale the font properly, and in the newest versions of Windows the OS will lie about the resolution and scale the fonts automatically.
If you wish to handle this outside of the Windows settings, simply multiply your font size by your actual DPI and divide by 96.
Edit: Beginning with Windows Vista, Windows will not report your actual configured DPI unless you write a DPI-aware program. Microsoft has some guidance on the subject. You might find that the default scaling that Microsoft provides for non-DPI-aware programs is good enough for your purposes.
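As a hedged illustration of the two formulas above (points × DPI / 72 for the pixel height, or equivalently the 96-DPI pixel size times DPI / 96), here is a small helper; CreateDpiScaledFont is my own name, not a Windows API:

#include <windows.h>
#include <wchar.h>

/* Hypothetical helper: create a font of pointSize points sized for the
   DC's actual vertical DPI instead of assuming 96 DPI. */
HFONT CreateDpiScaledFont(HDC hdc, int pointSize, const wchar_t *face)
{
    LOGFONTW lf = {0};
    /* Negative height asks GDI to match character height rather than cell height. */
    lf.lfHeight = -MulDiv(pointSize, GetDeviceCaps(hdc, LOGPIXELSY), 72);
    wcsncpy(lf.lfFaceName, face, LF_FACESIZE - 1);
    return CreateFontIndirectW(&lf);
}

/* Example: on a 120 DPI monitor, 12 pt -> 12 * 120 / 72 = 20 pixels,
   the same as scaling the 96 DPI size (16 px) by 120 / 96 = 1.25. */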