My computer has two monitors: the primary monitor's resolution is 800*600, and the other monitor's resolution is 1600*900.
I want to change the second monitor's resolution to 1024*768 in my own application.
How can I do this?
You should be able to enumerate displays using EnumDisplayDevices; it gives you a name in DISPLAY_DEVICE::DeviceName, which is needed by the following two functions. Once you find your display, use EnumDisplaySettingsEx to verify that the new resolution is valid, then use ChangeDisplaySettingsEx to do the actual resolution change.
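For illustration, a minimal C++ sketch of that sequence. The mode-matching loop and the CDS_UPDATEREGISTRY flag are assumptions, not the only valid approach, and error handling is reduced to return-value checks:

```cpp
#include <windows.h>

// Find the first attached, non-primary display and switch it to the
// requested resolution, but only if the display reports a matching mode.
bool SetSecondaryResolution(DWORD width, DWORD height) {
    DISPLAY_DEVICE dd = {};
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevices(NULL, i, &dd, 0); ++i) {
        if (!(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ||
            (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE))
            continue;

        DEVMODE dm = {};
        dm.dmSize = sizeof(dm);
        // Enumerate the modes this display supports to verify the target.
        for (DWORD m = 0; EnumDisplaySettingsEx(dd.DeviceName, m, &dm, 0); ++m) {
            if (dm.dmPelsWidth == width && dm.dmPelsHeight == height) {
                return ChangeDisplaySettingsEx(dd.DeviceName, &dm, NULL,
                                               CDS_UPDATEREGISTRY, NULL)
                       == DISP_CHANGE_SUCCESSFUL;
            }
        }
    }
    return false;  // no such display, or no matching mode
}
```

For the question's case, call SetSecondaryResolution(1024, 768); pass 0 instead of CDS_UPDATEREGISTRY if the change should not persist across reboots.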
To save memory and improve performance, I want to use a special-format texture to handle JPEG pictures. The format is handled via GL_TEXTURE_EXTERNAL_OES, but the process is the same as GL_TEXTURE_2D (it differs only in the glBindTexture target and the shader's texture declaration).
I have done this in EGL hardware mode ('rasterizer_type': 'direct-gles'), but I have problems in Skia hardware mode ('rasterizer_type': 'hardware'). I found that Skia hardware mode doesn't support it directly and instead calls render_image_fallback_function_ (HardwareRasterizer::Impl::RenderTextureEGL) to handle it, the same way it handles 360 video. The displayed result is very different from what EGL hardware mode shows; it seems this path is only meant for 360 video. Is there a way to make Skia hardware mode support this format directly, or do I have to add a new path in TexturedMeshRenderer that handles pictures separately from 360 video?
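For context, a minimal sketch of the only two places where the external-OES path differs from a plain GL_TEXTURE_2D path (identifier names here are illustrative, not from Cobalt):

```cpp
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Shader side: the sampler declaration is the only change.
static const char* kExternalOesFragmentShader =
    "#extension GL_OES_EGL_image_external : require\n"
    "precision mediump float;\n"
    "varying vec2 v_tex_coord;\n"
    "uniform samplerExternalOES u_texture;\n"  // sampler2D in the 2D path
    "void main() { gl_FragColor = texture2D(u_texture, v_tex_coord); }\n";

// Client side: the bind target is the only change.
void BindExternalTexture(GLuint texture_id) {
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture_id);  // not GL_TEXTURE_2D
}
```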
Cobalt/Starboard supports letting the platform define custom (possibly accelerated) image decode functionality in starboard/image.h, are you using this to set GL_TEXTURE_EXTERNAL_OES, or are you modifying common Cobalt code?
If you are modifying Cobalt code, you may want to search through https://cobalt.googlesource.com/cobalt/+/master/src/cobalt/renderer/rasterizer/skia/hardware_image.cc for references to "GL_TEXTURE_2D" and make sure that they still make sense after your changes. In particular, you may need to adjust HardwareFrontendImage::CanRenderInSkia().
I'm trying to get the max resolution of the user's display, not necessarily the current resolution, but the maximum the display supports. I know that I can get the current resolution with something like this, but I need the maximum (i.e., on a 13-inch MacBook Pro the resolution would be 2560x1600).
I know I can do this in Terminal using something like this, but I would like to avoid trying to do something hack-y in the Terminal, and instead do it with Swift. Any suggestions on how I can do this? Thanks.
You need to use Quartz Display Services. First, get a list of the displays, probably with CGGetActiveDisplayList. Then, for each display, use CGDisplayCopyAllDisplayModes. Iterate over the array of modes, using CGDisplayModeGetWidth and CGDisplayModeGetHeight to figure out which is the highest resolution.
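A minimal sketch of that loop using the CoreGraphics C API (shown here in C++; the same functions are callable from Swift under identical names, so the translation is mechanical). The 16-display cap is an arbitrary assumption:

```cpp
#include <CoreGraphics/CoreGraphics.h>
#include <cstdio>

int main() {
    const uint32_t kMaxDisplays = 16;  // assumption: at most 16 active displays
    CGDirectDisplayID displays[kMaxDisplays];
    uint32_t count = 0;
    if (CGGetActiveDisplayList(kMaxDisplays, displays, &count) != kCGErrorSuccess)
        return 1;

    for (uint32_t i = 0; i < count; ++i) {
        CFArrayRef modes = CGDisplayCopyAllDisplayModes(displays[i], NULL);
        if (!modes) continue;
        // Track the largest mode this display reports.
        size_t bestW = 0, bestH = 0;
        for (CFIndex m = 0; m < CFArrayGetCount(modes); ++m) {
            CGDisplayModeRef mode =
                (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, m);
            size_t w = CGDisplayModeGetWidth(mode);
            size_t h = CGDisplayModeGetHeight(mode);
            if (w * h > bestW * bestH) { bestW = w; bestH = h; }
        }
        CFRelease(modes);
        printf("display %u: max %zux%zu\n", i, bestW, bestH);
    }
    return 0;
}
```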
Is it possible to create a program that intercepts the video output to a specific monitor and flips the image about the y-axis? I have found one program called UltraMon that lets you display one monitor as the mirror image of another, but I want to set one monitor to mirror its own input signal. I can easily mirror the output of my own code, but I would like the Windows desktop and any arbitrary running software to be mirrored as well. Is there a way to hook into the rendering pipeline of Windows?
Edit:
I know there is a flipped option in Nvidia's Control Panel, but this is just a 180° rotation. I am looking for a true mirror image, such that when viewing the monitor through a single mirror the image appears normal; i.e., for normalized pixel coordinates, x' = 1 - x.
I am using OGRE to re-render a film, which has a wide aspect ratio (around 1.85). The OGRE config dialog seems to show the standard full-screen resolutions by default (800x600, 1024x768, etc.), but those obviously have aspect ratios of about 1.333. As long as I am not running in full-screen mode, why should I be restricted to these screen sizes?
I can definitely change the viewport size, but that would make it difficult for me to generate the video later.
Any idea?
There are no restrictions on using a non-standard (i.e., other than 4:3) aspect-ratio screen resolution in Ogre; the default OGRE configuration window just shows a list of default resolutions.
If you need a different window size, create it directly from code, based on the dimensions of your input video.
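For example, a minimal OGRE 1.x sketch (the 1184x640 size is an illustrative ~1.85:1 choice, and the window name is arbitrary):

```cpp
#include <Ogre.h>

int main() {
    Ogre::Root root("plugins.cfg");
    // Pick a render system from a saved config, or ask the user once.
    if (!root.restoreConfig() && !root.showConfigDialog())
        return 1;
    root.initialise(false);  // false: don't auto-create the dialog's window

    // Create a windowed render target at the film's aspect ratio instead.
    Ogre::RenderWindow* window =
        root.createRenderWindow("FilmRender", 1184, 640, /*fullScreen=*/false);
    // ... set up scene manager, camera, and viewport on `window` as usual ...
    return 0;
}
```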
I have a cursor whose size is 128x128, but when I use LoadCursor to load and show it, it comes out at only 32x32. Which API will load it correctly? It seems Windows resizes it. Thanks.
Windows XP does not include any system cursors that are larger than 32x32. (If larger cursors were included, they would be stretched down to 32x32 when the standard APIs load the cursors.)
For high-DPI systems, Windows XP has adjusted the SM_CXCURSOR and SM_CYCURSOR values to be 64x64 pixels. This size adjustment is to prevent the mouse pointer from virtually disappearing because it is too small to be effectively used. Although the other aspects of the system scale with DPI, the mouse pointer does not scale. Microsoft does not try to enforce a DPI-independent size for the mouse pointer.
The system also provides the SetSystemCursor API function that you can use to change the system cursor for specific categories. You can use this function to set a cursor of any size. However, you must call the function programmatically, and you can only use it to set a cursor for a specific category. You cannot use it to make all cursors on the system the same size.
http://support.microsoft.com/kb/307213
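As a hedged sketch of that SetSystemCursor approach (the .cur path is a placeholder; note that SetSystemCursor destroys the handle you pass it, and the OCR_* category IDs require OEMRESOURCE):

```cpp
#define OEMRESOURCE  // exposes the OCR_* cursor category IDs
#include <windows.h>

// Replace the standard arrow cursor system-wide with a 128x128 cursor.
// "big.cur" is a placeholder path; only the arrow category is changed.
bool ReplaceArrowCursor() {
    HCURSOR big = (HCURSOR)LoadImage(NULL, TEXT("big.cur"), IMAGE_CURSOR,
                                     128, 128, LR_LOADFROMFILE);
    if (!big) return false;
    return SetSystemCursor(big, OCR_NORMAL) != FALSE;
}
```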
Don't use LoadCursor, use LoadImage() instead.
SM_CXCURSOR by SM_CYCURSOR is the only cursor size the system can currently use.
Use GetSystemMetrics to find out those values.
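Putting those together, a minimal sketch for loading a cursor inside your own application (the .cur path is a placeholder):

```cpp
#include <windows.h>

// Load a cursor file at the size the system actually uses, rather than
// letting LoadCursor scale it down to the default 32x32.
HCURSOR LoadCursorAtSystemSize() {
    int cx = GetSystemMetrics(SM_CXCURSOR);
    int cy = GetSystemMetrics(SM_CYCURSOR);
    return (HCURSOR)LoadImage(NULL, TEXT("big.cur"), IMAGE_CURSOR,
                              cx, cy, LR_LOADFROMFILE);
}
```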