X11 (xorg) fails to set/change resolution (Linux x86)

I'm trying to get a 10-inch touch display (native resolution: 1280x800) to switch to 1024x768, but everything I try is either ignored or results in an error. The display reportedly supports the resolution, though; xrandr --verbose reports the following (I'm using the default VESA driver):
xrandr: Failed to get size of gamma for output default
Screen 0: minimum 640 x 480, current 640 x 480, maximum 1280 x 800
default connected 640x480+0+0 (0x180) normal (normal) 0mm x 0mm
Identifier: 0x17d
Timestamp: 635022581
Subpixel: horizontal rgb
Clones:
CRTC: 0
CRTCs: 0
Transform: 1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000
filter:
1280x800 (0x17e) 0.0MHz
h: width 1280 start 0 end 0 total 1280 skew 0 clock 0.0KHz
v: height 800 start 0 end 0 total 800 clock 0.0Hz
800x600 (0x17f) 0.0MHz
h: width 800 start 0 end 0 total 800 skew 0 clock 0.0KHz
v: height 600 start 0 end 0 total 600 clock 0.0Hz
640x480 (0x180) 0.0MHz *current
h: width 640 start 0 end 0 total 640 skew 0 clock 0.0KHz
v: height 480 start 0 end 0 total 480 clock 0.0Hz
1024x768 (0x181) 0.0MHz
h: width 1024 start 0 end 0 total 1024 skew 0 clock 0.0KHz
v: height 768 start 0 end 0 total 768 clock 0.0Hz
If I try to change the resolution via xrandr --output default --mode 1024x768, I just get:
xrandr: Failed to get size of gamma for output default
xrandr: Configure crtc 0 failed
As far as I can see, only the second line is relevant to my problem; I don't know why xrandr fails to configure CRTC 0, though, since only the touch screen is connected.
Failing that, I tried to configure the mode directly using the following xorg.conf:
Section "InputClass"
Identifier "calibration"
MatchProduct "DIALOGUE INC PenMount USB"
Option "Calibration" "95 911 93 919"
Option "SwapAxes" "0"
EndSection
Section "Monitor"
Identifier "disp0"
Modeline "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
Option "PreferredMode" "1024x768_60.00"
EndSection
Section "Device"
Identifier "card0"
Driver "vesa"
EndSection
Section "Screen"
Identifier "src0"
Device "card0"
Monitor "disp0"
SubSection "Display"
Modes "1024x768_60.00" "1024x768"
EndSubSection
EndSection
Unfortunately, this doesn't work either; Xorg.log shows the following:
[634043.694] (II) VESA(0): Not using mode "1024x768_60.00" (no mode of this name)
[634043.694] (II) VESA(0): Not using built-in mode "1024x768" (no mode of this name)
Why doesn't this work? And what else can I try to get the display to switch to 1024x768?
I've uploaded the full logfile to Pastebin.

You can try creating and assigning the mode at runtime. First, generate a modeline:
cvt 1024 768
Then register it with the X server and attach it to the output (the modeline below is what cvt prints for 1024x768 at 60 Hz; substitute whatever your cvt emits):
xrandr --newmode "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
xrandr --addmode default 1024x768_60.00
xrandr --output default --mode 1024x768_60.00
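One caveat, judging from the zeroed pixel clocks in your xrandr output (every mode reports 0.0MHz): the vesa driver only exposes the mode list baked into the video BIOS and cannot program arbitrary modelines, so --newmode/--addmode may still end in the same "Configure crtc 0 failed". If the GPU has a native KMS driver (modesetting, intel, etc.), switching away from vesa usually makes RandR mode changes work.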

Related

How to capture Windows 10/Windows 11 advanced display settings for multiple monitors

I am trying to capture the following information about the displays connected to my laptop programmatically using PowerShell, but I am unable to find a way. This information can be found via the Windows GUI at Settings > Display > Advanced display (two screenshots attached below).
Here's the information I am trying to capture:
Display name: here the internal display's name is blank, while the external one is "ASUS VP228"
Display resolution: 1680x1050 and 1920x1080 for the two displays
Whether it's an internal/external display: True and False
Refresh rate: 60 Hz and 60 Hz
Here's what I've tried
Get-WmiObject win32_videocontroller
This returns a resolution under the VideoModeDescription parameter, but even then it seems to be the "Active Signal Resolution" value (from the screenshot below) rather than the display resolution.
Get-WmiObject Win32_Desktopmonitor
DeviceID : DesktopMonitor1
DisplayType :
MonitorManufacturer : (Standard monitor types)
Name : Generic PnP Monitor
ScreenHeight :
ScreenWidth :
This only returns information for one of the two monitors, and the ScreenHeight and ScreenWidth values are empty.
get-ciminstance -namespace root\wmi -classname wmimonitorbasicdisplayparams
Active : True
DisplayTransferCharacteristic : 120
InstanceName : DISPLAY\IVO8C66\5&462698&0&UID256_0
MaxHorizontalImageSize : 31
MaxVerticalImageSize : 17
SupportedDisplayFeatures : WmiMonitorSupportedDisplayFeatures
VideoInputType : 1
PSComputerName :
Active : True
DisplayTransferCharacteristic : 120
InstanceName : DISPLAY\ACI22C3\5&462698&0&UID265_0
MaxHorizontalImageSize : 48
MaxVerticalImageSize : 27
SupportedDisplayFeatures : WmiMonitorSupportedDisplayFeatures
VideoInputType : 1
PSComputerName :
Gives me the correct monitor count but doesn't give me any other information.
I then tried the DumpEDID tool. It gave me more information, but not the current monitor resolution or whether the display is internal.
DumpEDID v1.07
Copyright (c) 2006 - 2018 Nir Sofer
Web site: http://www.nirsoft.net
*****************************************************************
Active : Yes
Registry Key : DISPLAY\ACI22C3\5&462698&0&UID265
Monitor Name : ASUS VP228
Serial Number : G6LMTF155938
Manufacture Week : 26 / 2016
ManufacturerID : 26884 (0x6904)
ProductID : 8899 (0x22C3)
Serial Number (Numeric) : 155938 (0x00026122)
EDID Version : 1.3
Display Gamma : 2.20
Vertical Frequency : 50 - 75 Hz
Horizontal Frequency : 24 - 83 KHz
Maximum Image Size : 48 X 27 cm (21.7 Inch)
Maximum Resolution : 1920 X 1080
Support Standby Mode : No
Support Suspend Mode : No
Support Low-Power Mode : Yes
Support Default GTF : No
Digital : Yes
Supported Display Modes :
720 X 400 70 Hz
640 X 480 60 Hz
640 X 480 67 Hz
640 X 480 72 Hz
640 X 480 75 Hz
800 X 600 56 Hz
800 X 600 60 Hz
800 X 600 72 Hz
800 X 600 75 Hz
832 X 624 75 Hz
1024 X 768 60 Hz
1024 X 768 70 Hz
1024 X 768 75 Hz
1280 X 720 60 Hz
1152 X 864 75 Hz
1280 X 960 60 Hz
1440 X 900 60 Hz
1280 X 1024 60 Hz
1280 X 1024 75 Hz
1680 X 1050 60 Hz
1920 X 1080 60 Hz
*****************************************************************
*****************************************************************
Active : Yes
Registry Key : DISPLAY\IVO8C66\5&462698&0&UID256
Manufacture Week : 0 / 2019
ManufacturerID : 53030 (0xCF26)
ProductID : 35942 (0x8C66)
Serial Number (Numeric) : 0 (0x00000000)
EDID Version : 1.4
Display Gamma : 2.20
Maximum Image Size : 31 X 17 cm (13.9 Inch)
Maximum Resolution : 1920 X 1080
Support Standby Mode : No
Support Suspend Mode : No
Support Low-Power Mode : No
Support Default GTF : No
Digital : Yes
Supported Display Modes :
1920 X 1080 60 Hz
The script here gives me almost everything I want, except the monitor name and whether the display is internal.
The two screenshots I attached are from the Settings > Display > Advanced display page, which shows the information I am looking to capture.

Unable to determine why frame is too large for GIF in govips

I am trying to resize the following GIF.
The original dimensions are 1270 x 1270, with a total of 149 pages.
I am resizing to the following dimensions:
250 x 250 (successful)
500 x 500 (successful)
750 x 750 (unsuccessful)
The last case fails, and after some digging I found that the limits are set in libvips, but I cannot work out how the dimensions violate the constraints.
The constraints being:
if( (guint64) frame_rect.width * frame_rect.height > INT_MAX / 4 ||
    frame_rect.width > 65535 ||
    frame_rect.height > 65535 ) {
    vips_error( class->nickname, "%s", _( "frame too large" ) );
    return( -1 );
}
Currently I have the latest govips (v2.11.0) and vips (8.13.3) versions installed.
I tried different sizes and it works up to 740 x 740. I tried changing the export params but am unable to figure out the math behind why the frame is too large.
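For what it's worth, the check is easy to replay numerically. Here is a minimal C sketch (the frame_too_large helper is mine; it just re-implements the quoted condition): a single 750x750 frame trips none of the three tests, since 750 * 750 = 562,500 is far below INT_MAX / 4 = 536,870,911 and both dimensions are below 65,535.

#include <stdio.h>
#include <limits.h>
#include <stdint.h>

/* Re-implementation of the rejection test quoted above, for
   experimenting with candidate dimensions. */
static int frame_too_large(uint64_t width, uint64_t height)
{
    return width * height > (uint64_t) (INT_MAX / 4) ||
        width > 65535 ||
        height > 65535;
}

int main(void)
{
    printf("750 x 750 -> %d\n", frame_too_large(750, 750)); /* 0: allowed */
    printf("740 x 740 -> %d\n", frame_too_large(740, 740)); /* 0: allowed */
    return 0;
}

Since neither 740 x 740 nor 750 x 750 violates the quoted condition, the frame_rect that libvips rejects is evidently not the nominal output frame; printing frame_rect.width and frame_rect.height at the error site (or stepping through with a debugger) should reveal which dimension actually exceeds the limit.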

Calculating Bytes

Suppose each pixel in a digital image is represented by a 24-bit color value. How much memory does it take to store an uncompressed image of 2048 pixels by 1024 pixels?
I said that 24 bits is 3 bytes, that 2048 pixels is 6 KB (2048 * 3 / 1024), and that 1024 pixels is 3 KB (1024 * 3 / 1024). I then multiplied those to get 18 KB^2.
But the answer says 6 MB. How is this possible, and how do 2048 and 1024 play into it? The answer just states 6 MB without explaining.
24 bit => 24 bit / 8 bit = 3 byte
1) 2048 pixel * 1024 pixel = 2097152 pixel (Area)
1.1) 2097152 pixel * 3 byte = 6291456 byte (Each pixel 3 bytes)
2) 6291456 byte / 1024 byte = 6144 kilobyte
3) 6144 kilobyte / 1024 byte = 6 Megabyte
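As a single formula (memory = width * height * color depth / 8):

$$2048 \times 1024 \times \frac{24}{8}\,\text{bytes} = 6{,}291{,}456\,\text{bytes} = \frac{6{,}291{,}456}{1024 \times 1024}\,\text{MB} = 6\,\text{MB}$$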

Determining correct frame format in libuvc

I'm trying to connect to a UVC-compatible camera on OS X. Using the hello-world example from libuvc, my camera outputs this:
DEVICE CONFIGURATION (2560:c114/39254404) ---
Status: idle
VideoControl:
bcdUVC: 0x0100
VideoStreaming(1):
bEndpointAddress: 131
Formats:
UncompressedFormat(1)
bits per pixel: 16
GUID: 5931362000001000800000aa00389b71
default frame: 1
aspect ration: 0x0
interlace flags: 00
copy protect: 00
FrameDescriptor(1)
capabilities: 00
size: 752x480
bit rate: 346521600-346521600
max frame size: 721920
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(2)
capabilities: 00
size: 640x480
bit rate: 294912000-294912000
max frame size: 614400
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(3)
capabilities: 00
size: 320x240
bit rate: 73728000-73728000
max frame size: 153600
default interval: 1/60
interval[0]: 1/60
UncompressedFormat(2)
bits per pixel: 24
GUID: 7deb36e44f52ce119f530020af0ba770
default frame: 1
aspect ration: 0x0
interlace flags: 00
copy protect: 00
FrameDescriptor(1)
capabilities: 00
size: 752x480
bit rate: 519782400-519782400
max frame size: 1082880
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(2)
capabilities: 00
size: 640x480
bit rate: 442368000-442368000
max frame size: 921600
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(3)
capabilities: 00
size: 320x240
bit rate: 110592000-110592000
max frame size: 230400
default interval: 1/60
interval[0]: 1/60
END DEVICE CONFIGURATION
However, none of the frame formats seem to work, e.g.:
res = uvc_get_stream_ctrl_format_size(
    devh, &ctrl,
    UVC_FRAME_FORMAT_YUYV,
    752, 480, 60 /* width, height, fps */
);
Whatever frame format I try (I tried looping over the enum) I get something like this:
UVC initialized
Device found
Device opened
get_mode: Invalid mode (-51)
Device closed
UVC exited
The camera works fine in Windows and in Linux under ROS. What frame format should I use? Given the configuration, I hoped UVC_FRAME_FORMAT_RGB would work, but no dice. The code for libuvc seems to compare the UVC frame format to what the device provided, but I don't understand how it determines what's a valid format.
You have to use
const uvc_format_desc_t *uvc_get_format_descs(uvc_device_handle_t *devh)
The returned pointer is the first format that is valid for the given camera; you can then iterate through all available formats via the next pointer in uvc_format_desc_t.
frame_descs in uvc_format_desc_t contains the frame descriptors (width, height, etc.).
bDescriptorSubtype in uvc_format_desc_t contains the format, e.g. UVC_VS_FORMAT_UNCOMPRESSED.
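A minimal sketch of that iteration (assuming the struct layout of the current libuvc headers, with devh being the open handle from the hello-world example):

#include <stdio.h>
#include <libuvc/libuvc.h>

/* Walk every format descriptor the camera advertises and, for each
   one, every frame descriptor (resolution) it supports. */
static void list_formats(uvc_device_handle_t *devh)
{
    const uvc_format_desc_t *fmt;
    const uvc_frame_desc_t *frame;

    for (fmt = uvc_get_format_descs(devh); fmt; fmt = fmt->next) {
        /* bDescriptorSubtype distinguishes uncompressed from MJPEG;
           for uncompressed formats, the GUID pins down the pixel layout */
        printf("format subtype %d\n", (int) fmt->bDescriptorSubtype);
        for (frame = fmt->frame_descs; frame; frame = frame->next)
            printf("  %u x %u\n",
                   (unsigned) frame->wWidth, (unsigned) frame->wHeight);
    }
}

Incidentally, the first GUID in your dump (5931362000001000...) begins with the FourCC bytes "Y16 ", i.e. 16-bit grayscale rather than YUYV, which would explain why UVC_FRAME_FORMAT_YUYV is rejected for the 16-bpp format.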

CGDisplayModeGetWidth/Height() sometimes returns pixels, sometimes points

According to Apple, both CGDisplayModeGetWidth() and CGDisplayModeGetHeight() should return points instead of pixels starting in macOS 10.8. But Apple's word on those APIs isn't consistent because here they say that the functions return pixels and not points.
This confusion is also reflected in practice: sometimes the functions return points, and sometimes they return pixels. Consider this example:
CGDirectDisplayID id = CGMainDisplayID();
CFArrayRef modes = CGDisplayCopyAllDisplayModes(id, NULL);
CGDisplayModeRef mode;
int k, n;
n = CFArrayGetCount(modes);
for (k = 0; k < n; k++) {
    mode = (CGDisplayModeRef) CFArrayGetValueAtIndex(modes, k);
    printf("GOT SIZE: %d %d\n", (int) CGDisplayModeGetWidth(mode), (int) CGDisplayModeGetHeight(mode));
}
CFRelease(modes);
The code iterates over all available screen modes. In this example, the output is in pixels.
When using this code, however, the output is in points:
CGDirectDisplayID id = CGMainDisplayID();
CGDisplayModeRef mode = CGDisplayCopyDisplayMode(id);
printf("NEW GOT SIZE: %d %d\n", (int) CGDisplayModeGetWidth(mode), (int) CGDisplayModeGetHeight(mode));
CGDisplayModeRelease(mode);
But why? Why do CGDisplayModeGetWidth() and CGDisplayModeGetHeight() return pixels in the first code snippet and points in the second? This is confusing me.
To make things even more complicated, starting with macOS 10.8 there are two new APIs, namely CGDisplayModeGetPixelWidth() and CGDisplayModeGetPixelHeight(). These always return pixels, but I still don't understand why CGDisplayModeGetWidth() and CGDisplayModeGetHeight() return pixels in the first code snippet above... is this a bug?
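For diagnosing this, here is a small sketch (my own, not Apple sample code) that prints both sizes for every mode; wherever the point size and the pixel size differ, the mode is a scaled (HiDPI) mode:

#include <stdio.h>
#include <CoreGraphics/CoreGraphics.h>

/* Compile with: cc modes.c -framework CoreGraphics
   Prints point size next to pixel size for every mode of the main
   display and flags the modes where the two diverge. */
int main(void)
{
    CGDirectDisplayID display = CGMainDisplayID();
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
    CFIndex n = CFArrayGetCount(modes);

    for (CFIndex k = 0; k < n; k++) {
        CGDisplayModeRef mode =
            (CGDisplayModeRef) CFArrayGetValueAtIndex(modes, k);
        size_t w  = CGDisplayModeGetWidth(mode);      /* points */
        size_t h  = CGDisplayModeGetHeight(mode);
        size_t pw = CGDisplayModeGetPixelWidth(mode); /* pixels */
        size_t ph = CGDisplayModeGetPixelHeight(mode);
        printf("%4zu x %4zu points, %4zu x %4zu pixels%s\n",
               w, h, pw, ph, (w != pw || h != ph) ? " (scaled)" : "");
    }
    CFRelease(modes);
    return 0;
}

If every listed mode prints identical point and pixel sizes, that fits your observation below: with a NULL options dictionary, CGDisplayCopyAllDisplayModes seems to omit the scaled (Retina) modes, and for unscaled modes points and pixels coincide.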
EDIT
Here is the output for my 1680x1050 monitor. I am using Quartz Debug to put the monitor in 840x525 screen mode to do Retina tests. You can see that the output of the first code snippet must be in pixels because it returns modes such as 1680x1050 which would correspond to 3360x2100 pixels if it were points. Another proof that the first code snippet returns pixels not points lies in the fact that the screen mode the monitor is currently in (i.e. 840x525) isn't returned at all. Only the second code snippet returns this mode.
GOT SIZE: 1680 1050
GOT SIZE: 1152 870
GOT SIZE: 1280 1024
GOT SIZE: 1024 768
GOT SIZE: 1024 768
GOT SIZE: 1024 768
GOT SIZE: 832 624
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 1280 1024
GOT SIZE: 1280 960
GOT SIZE: 848 480
GOT SIZE: 1280 960
GOT SIZE: 1360 768
GOT SIZE: 800 500
GOT SIZE: 1024 640
GOT SIZE: 1280 800
GOT SIZE: 1344 1008
GOT SIZE: 1344 840
GOT SIZE: 1600 1000
--------------------------
NEW GOT SIZE: 840 525
