CGDisplayModeGetWidth/Height() sometimes returns pixels, sometimes points (macOS)

According to Apple, both CGDisplayModeGetWidth() and CGDisplayModeGetHeight() should return points instead of pixels starting in macOS 10.8. But Apple's word on these APIs isn't consistent: elsewhere the documentation says that the functions return pixels, not points.
This confusion is also reflected in practice: sometimes the functions return points, sometimes pixels. Consider this example:
CGDirectDisplayID id = CGMainDisplayID();
CFArrayRef modes = CGDisplayCopyAllDisplayModes(id, NULL);
CGDisplayModeRef mode;
int k, n;

/* Enumerate every mode the main display reports. */
n = CFArrayGetCount(modes);
for (k = 0; k < n; k++) {
    mode = (CGDisplayModeRef) CFArrayGetValueAtIndex(modes, k);
    printf("GOT SIZE: %d %d\n", (int) CGDisplayModeGetWidth(mode), (int) CGDisplayModeGetHeight(mode));
}
CFRelease(modes);
The code iterates over all available screen modes. In this example, the output is in pixels.
When using this code, however, the output is in points:
CGDirectDisplayID id = CGMainDisplayID();
CGDisplayModeRef mode = CGDisplayCopyDisplayMode(id);
printf("NEW GOT SIZE: %d %d\n", (int) CGDisplayModeGetWidth(mode), (int) CGDisplayModeGetHeight(mode));
CGDisplayModeRelease(mode);
But why? Why do CGDisplayModeGetWidth() and CGDisplayModeGetHeight() return pixels in the first code snippet and points in the second? This is confusing me.
To make things even more complicated, starting with macOS 10.8 there are two new APIs, namely CGDisplayModeGetPixelWidth() and CGDisplayModeGetPixelHeight(). These always return pixels, but I still don't understand why CGDisplayModeGetWidth() and CGDisplayModeGetHeight() return pixels in the first code snippet above... is this a bug?
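For comparison, both pairs of functions can be printed side by side for every mode returned by CGDisplayCopyAllDisplayModes(). A minimal sketch (main display only, macOS 10.8+, linked against the ApplicationServices framework):

#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

/* Print the "logical" size and the backing pixel size of every mode
   of the main display, side by side. */
int main(void) {
    CGDirectDisplayID display = CGMainDisplayID();
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
    CFIndex n = CFArrayGetCount(modes);

    for (CFIndex k = 0; k < n; k++) {
        CGDisplayModeRef mode = (CGDisplayModeRef) CFArrayGetValueAtIndex(modes, k);
        printf("mode %2ld: GetWidth/Height = %4zu x %4zu, GetPixelWidth/Height = %4zu x %4zu\n",
               (long) k,
               CGDisplayModeGetWidth(mode), CGDisplayModeGetHeight(mode),
               CGDisplayModeGetPixelWidth(mode), CGDisplayModeGetPixelHeight(mode));
    }
    CFRelease(modes);
    return 0;
}

For any mode where the two sizes differ, CGDisplayModeGetWidth()/CGDisplayModeGetHeight() are evidently reporting points rather than pixels.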
EDIT
Here is the output for my 1680x1050 monitor. I am using Quartz Debug to put the monitor into 840x525 mode for Retina tests. You can see that the output of the first code snippet must be in pixels, because it returns modes such as 1680x1050, which would correspond to 3360x2100 pixels if the values were points. Further proof that the first snippet returns pixels rather than points: the mode the monitor is currently in (840x525) isn't returned at all. Only the second code snippet returns this mode.
GOT SIZE: 1680 1050
GOT SIZE: 1152 870
GOT SIZE: 1280 1024
GOT SIZE: 1024 768
GOT SIZE: 1024 768
GOT SIZE: 1024 768
GOT SIZE: 832 624
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 800 600
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 640 480
GOT SIZE: 1280 1024
GOT SIZE: 1280 960
GOT SIZE: 848 480
GOT SIZE: 1280 960
GOT SIZE: 1360 768
GOT SIZE: 800 500
GOT SIZE: 1024 640
GOT SIZE: 1280 800
GOT SIZE: 1344 1008
GOT SIZE: 1344 840
GOT SIZE: 1600 1000
--------------------------
NEW GOT SIZE: 840 525

Related

Unable to Determine why Frame Too Large For GIF Govips

I am trying to resize the following GIF.
Original dimensions: 1270 x 1270, with 149 pages in total.
I am resizing to the following dimensions:
250 x 250 (successful)
500 x 500 (successful)
750 x 750 (unsuccessful)
It fails for the last case, and after some digging I found that the limits are set in libvips. I am not able to work out how the dimensions violate the constraints.
The constraints are:
if( (guint64) frame_rect.width * frame_rect.height > INT_MAX / 4 ||
    frame_rect.width > 65535 ||
    frame_rect.height > 65535 ) {
    vips_error( class->nickname, "%s", _( "frame too large" ) );
    return( -1 );
}
Currently I have the latest govips (v2.11.0) and vips (8.13.3) installed.
I tried different sizes and it works up to 740 x 740. I tried changing the export params but am unable to figure out the math behind why the frame is too large.
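For what it's worth, the quoted condition can be reproduced standalone to check candidate frame sizes against the three limits. This is only a sketch of the arithmetic above; it says nothing about how libvips actually computes frame_rect for an animated image:

#include <limits.h>
#include <stdint.h>
#include <stdio.h>

/* Standalone copy of the quoted libvips condition (uint64_t stands in for guint64). */
static int frame_too_large(uint64_t width, uint64_t height) {
    return width * height > INT_MAX / 4 ||
           width > 65535 ||
           height > 65535;
}

int main(void) {
    /* Candidate sizes from the question: a plain 750x750 page does not trip
       any of the three limits, so frame_rect is presumably not just the page size. */
    printf("740 x 740 -> %s\n", frame_too_large(740, 740) ? "frame too large" : "ok");
    printf("750 x 750 -> %s\n", frame_too_large(750, 750) ? "frame too large" : "ok");
    return 0;
}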

X11 (xorg) fails to set/change resolution (linux x86)

I'm trying to get a 10-inch touch display (native resolution: 1280x800) to switch to 1024x768, but everything I try is either ignored or results in an error. The display reportedly supports the resolution, though; xrandr --verbose reports (I'm using the default VESA driver):
xrandr: Failed to get size of gamma for output default
Screen 0: minimum 640 x 480, current 640 x 480, maximum 1280 x 800
default connected 640x480+0+0 (0x180) normal (normal) 0mm x 0mm
Identifier: 0x17d
Timestamp: 635022581
Subpixel: horizontal rgb
Clones:
CRTC: 0
CRTCs: 0
Transform: 1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000
filter:
1280x800 (0x17e) 0.0MHz
h: width 1280 start 0 end 0 total 1280 skew 0 clock 0.0KHz
v: height 800 start 0 end 0 total 800 clock 0.0Hz
800x600 (0x17f) 0.0MHz
h: width 800 start 0 end 0 total 800 skew 0 clock 0.0KHz
v: height 600 start 0 end 0 total 600 clock 0.0Hz
640x480 (0x180) 0.0MHz *current
h: width 640 start 0 end 0 total 640 skew 0 clock 0.0KHz
v: height 480 start 0 end 0 total 480 clock 0.0Hz
1024x768 (0x181) 0.0MHz
h: width 1024 start 0 end 0 total 1024 skew 0 clock 0.0KHz
v: height 768 start 0 end 0 total 768 clock 0.0Hz
If I try to change the resolution via xrandr --output default --mode 1024x768, I just get:
xrandr: Failed to get size of gamma for output default
xrandr: Configure crtc 0 failed
As far as I can see, only the second line is relevant to my problem; I don't know why xrandr would want to configure crtc 0, though, since I only have the touch screen connected.
Failing that, I tried to configure the mode directly using the following xorg.conf:
Section "InputClass"
    Identifier "calibration"
    MatchProduct "DIALOGUE INC PenMount USB"
    Option "Calibration" "95 911 93 919"
    Option "SwapAxes" "0"
EndSection

Section "Monitor"
    Identifier "disp0"
    Modeline "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
    Option "PreferredMode" "1024x768_60.00"
EndSection

Section "Device"
    Identifier "card0"
    Driver "vesa"
EndSection

Section "Screen"
    Identifier "src0"
    Device "card0"
    Monitor "disp0"
    SubSection "Display"
        Modes "1024x768_60.00" "1024x768"
    EndSubSection
EndSection
Unfortunately, this doesn't work either; Xorg.log shows the following:
[634043.694] (II) VESA(0): Not using mode "1024x768_60.00" (no mode of this name)
[634043.694] (II) VESA(0): Not using built-in mode "1024x768" (no mode of this name)
Why doesn't this work? And what else can I try to get the display to switch to 1024x768?
I've uploaded the full logfile to Pastebin.
You can try the following:
# Generate a CVT modeline for 1024x768 @ 60 Hz
cvt 1024 768
# Register the modeline printed by cvt and attach it to the output
xrandr --newmode "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
xrandr --addmode default 1024x768_60.00
# Switch to the new mode
xrandr --output default --mode 1024x768_60.00

Animated WebP only has key frames?

I want to convert an MP4 to an animated WebP, so I use an ffmpeg command.
The MP4 file is http://myvideodata.oss-cn-shenzhen.aliyuncs.com/crs_bcb3f246273d4dbb8ec7f93239fbea6e.mp4
ffmpeg -i ./test.mp4 ./test.webp
That works and the animated WebP is created. I then use the webpinfo tool (download it from https://developers.google.com/speed/webp/download and build the example in it, or use this one: http://myvideodata.oss-cn-shenzhen.aliyuncs.com/webpInfo):
./webpinfo ./test.webp
and get information like this:
RIFF HEADER:
File size: 1968244
Chunk VP8X at offset 12, length 18
ICCP: 0
Alpha: 1
EXIF: 0
XMP: 0
Animation: 1
Canvas size 362 x 330
Chunk ANIM at offset 30, length 14
Background color:(ARGB) ff ff ff ff
Loop count : 1
Chunk ANMF at offset 44, length 25116
Offset_X: 0
Offset_Y: 0
Width: 362
Height: 330
Duration: 42
Dispose: 0
Blend: 0
Chunk VP8 at offset 68, length 25092
Width: 362
Height: 330
Alpha: 0
Animation: 0
Format: Lossy (1)
Every frame is about 25 KB. My question is: are all the frames in the animated WebP key frames?
Can anyone help?
Yes, all frames are marked as key frames by the libwebp_anim encoder.

Calculating Bytes

Suppose each pixel in a digital image is represented by a 24-bit color value. How much memory does it take to store an uncompressed image of 2048 pixels by 1024 pixels?
I said that 24 bits is 3 bytes, that 2048 pixels is 6 KB (2048 * 3 / 1024), and that 1024 pixels is 3 KB (1024 * 3 / 1024). I then multiplied these to get 18 KB^2.
But the answer says 6 MB. How is that possible, and how do 2048 and 1024 play into it? The answer just says 6 MB and doesn't explain.
24 bits => 24 / 8 = 3 bytes per pixel
1) 2048 pixels * 1024 pixels = 2,097,152 pixels (area)
2) 2,097,152 pixels * 3 bytes/pixel = 6,291,456 bytes
3) 6,291,456 bytes / 1024 = 6144 kilobytes
4) 6144 kilobytes / 1024 = 6 megabytes
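The same arithmetic as a tiny self-contained C check (nothing assumed beyond the numbers above):

#include <stdio.h>

int main(void) {
    long long width = 2048, height = 1024;
    long long bytes_per_pixel = 24 / 8;                 /* 24-bit color = 3 bytes */
    long long bytes = width * height * bytes_per_pixel; /* 6,291,456 bytes */

    printf("%lld bytes = %lld KB = %lld MB\n",
           bytes, bytes / 1024, bytes / (1024 * 1024));
    return 0;
}

This prints 6291456 bytes = 6144 KB = 6 MB.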

Determining correct frame format in libuvc

I'm trying to connect to a UVC compatible camera on OS X. Using the hello world example from libuvc, my camera outputs this:
DEVICE CONFIGURATION (2560:c114/39254404) ---
Status: idle
VideoControl:
bcdUVC: 0x0100
VideoStreaming(1):
bEndpointAddress: 131
Formats:
UncompressedFormat(1)
bits per pixel: 16
GUID: 5931362000001000800000aa00389b71
default frame: 1
aspect ration: 0x0
interlace flags: 00
copy protect: 00
FrameDescriptor(1)
capabilities: 00
size: 752x480
bit rate: 346521600-346521600
max frame size: 721920
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(2)
capabilities: 00
size: 640x480
bit rate: 294912000-294912000
max frame size: 614400
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(3)
capabilities: 00
size: 320x240
bit rate: 73728000-73728000
max frame size: 153600
default interval: 1/60
interval[0]: 1/60
UncompressedFormat(2)
bits per pixel: 24
GUID: 7deb36e44f52ce119f530020af0ba770
default frame: 1
aspect ration: 0x0
interlace flags: 00
copy protect: 00
FrameDescriptor(1)
capabilities: 00
size: 752x480
bit rate: 519782400-519782400
max frame size: 1082880
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(2)
capabilities: 00
size: 640x480
bit rate: 442368000-442368000
max frame size: 921600
default interval: 1/60
interval[0]: 1/60
interval[1]: 1/30
FrameDescriptor(3)
capabilities: 00
size: 320x240
bit rate: 110592000-110592000
max frame size: 230400
default interval: 1/60
interval[0]: 1/60
END DEVICE CONFIGURATION
However, none of the frame formats seem to work, e.g.:
res = uvc_get_stream_ctrl_format_size(
    devh, &ctrl,
    UVC_FRAME_FORMAT_YUYV,
    752, 480, 60 /* width, height, fps */
);
Whatever frame format I try (I tried looping over the enum) I get something like this:
UVC initialized
Device found
Device opened
get_mode: Invalid mode (-51)
Device closed
UVC exited
The camera works fine in Windows and in Linux under ROS. What frame format should I use? Given the configuration, I hoped UVC_FRAME_FORMAT_RGB would work, but no dice. The code for libuvc seems to compare the UVC frame format to what the device provided, but I don't understand how it determines what's a valid format.
You have to use
const uvc_format_desc_t *uvc_get_format_descs(uvc_device_handle_t *devh)
The returned pointer is the first format descriptor that is valid for the given camera; you can then iterate through all available formats via the next pointer in uvc_format_desc_t.
frame_descs in uvc_format_desc_t contains the width, height, etc.
bDescriptorSubtype in uvc_format_desc_t contains the format, e.g. UVC_VS_FORMAT_UNCOMPRESSED.
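A minimal sketch of that iteration (the field names follow libuvc's uvc_format_desc_t / uvc_frame_desc_t as I understand them; double-check against the libuvc.h you build against):

#include <libuvc/libuvc.h>
#include <stdio.h>

/* Walk every format the camera advertises and print its frame sizes.
   devh is assumed to be an already-opened uvc_device_handle_t*. */
static void list_formats(uvc_device_handle_t *devh) {
    const uvc_format_desc_t *fmt = uvc_get_format_descs(devh);

    for (; fmt != NULL; fmt = fmt->next) {
        printf("format subtype %d, %d bits per pixel\n",
               (int) fmt->bDescriptorSubtype, (int) fmt->bBitsPerPixel);

        for (const uvc_frame_desc_t *frame = fmt->frame_descs;
             frame != NULL; frame = frame->next) {
            printf("  %u x %u\n", (unsigned) frame->wWidth, (unsigned) frame->wHeight);
        }
    }
}

Once you can see which subtypes and GUIDs the device actually exposes, you can pick the matching uvc_frame_format value to pass to uvc_get_stream_ctrl_format_size().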
