Use native resolution in DrawingArea on HiDPI displays - macOS

I'm developing a cross-platform photo retouching application based on Gtk-2 (but already able to support Gtk-3 with minor modifications).
In my program, the result of the image retouching is previewed in a scrollable area that is implemented through a Gtk::DrawingArea inserted into a Gtk::ScrolledWindow. The drawing itself is performed using Cairo.
Recently I had the opportunity to test the software on a MacBook Pro laptop with a Retina display, and I immediately realised that the preview image gets magnified by a factor of 2, like all the rest of the GUI elements.
Is there a way to tell Cairo and the DrawingArea to use the native screen resolution, instead of applying a 2x magnification? Is this supported in recent Gtk-3 versions?
Thanks for your help.
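
Not part of the original question, but here is a minimal gtkmm-3 sketch of the setup described above - a Gtk::DrawingArea inside a Gtk::ScrolledWindow with a Cairo draw handler - together with the Gtk::Widget::get_scale_factor() query (available since GTK 3.10) that reports the 2x HiDPI factor mentioned. The class name and sizes are hypothetical.

    // Minimal sketch, assuming gtkmm 3.10+: a DrawingArea inside a
    // ScrolledWindow, drawing with Cairo and querying the HiDPI scale factor.
    #include <gtkmm.h>

    class PreviewArea : public Gtk::DrawingArea {   // hypothetical class
    protected:
      bool on_draw(const Cairo::RefPtr<Cairo::Context>& cr) override {
        // On a Retina display GTK reports a scale factor of 2: the widget's
        // logical size is half its size in device pixels.
        const int scale = get_scale_factor();
        // ... draw the preview image with 'cr' here, using 'scale' to decide
        // how many image pixels map to one logical pixel ...
        (void)scale;
        return true;
      }
    };

    int main(int argc, char* argv[]) {
      auto app = Gtk::Application::create(argc, argv, "org.example.preview");
      Gtk::Window window;
      Gtk::ScrolledWindow scrolled;
      PreviewArea area;
      area.set_size_request(1600, 1200);   // hypothetical preview size
      scrolled.add(area);                  // GTK wraps it in a viewport
      window.add(scrolled);
      window.show_all();
      return app->run(window);
    }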

Related

WinJS: Ignore font scaling from Windows

I've developed a WinJS app for the Surface Pro 4. The app runs in fullscreen and is laid out for the screen resolution of 2736x1824 (the Surface's resolution).
Now when I start the app on the Surface, the DPI scaling comes into play and messes up my layout.
Is there a way to disable the scaling for the app?
I've tried:
Windows.UI.ViewManagement.ApplicationViewScaling.trySetDisableLayoutScaling(true);
but that doesn't seem to work.
Actually, making your app layout fit only one resolution, especially such a big resolution, is not really a great idea, since the app can run on a device with, for example, a 1920x1080 resolution, where even with DPI scaling disabled your layout will be messed up.
So I recommend making the app layout responsive so it will look right on every resolution.
Not sure if you're still looking at this issue, but I've found that the code you see all over the place is actually Xbox-only (Link To Docs).
And the Microsoft devs have been saying for a while that it's a user-based setting and they don't plan on allowing you to disable it (Forum Link).
You can detect the scaling value, though, with ResolutionScale (docs link):
Windows.Graphics.Display.DisplayInformation.getForCurrentView().resolutionScale
Here's a link to a sample that detects it and adjusts.
So what you can do (and what I did) was, where needed, use a CSS scale to adjust things to fit. Most of my app is responsive so it didn't matter, but if you use an iframe and set it to be 1000px wide, with this scale factor it will actually be 1400px or even 1800px wide and totally cut off.
I have listeners set up for the resize events and just adjust as needed.
Hope that helps!
-Dennis
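
Not part of the original answer: the JavaScript line above queries the WinRT DisplayInformation API; for consistency with the other examples in this document, here is a sketch of the same query in C++/WinRT. The helper name is hypothetical, and GetForCurrentView() must be called on a thread that owns a view.

    // Sketch only: the same WinRT DisplayInformation query as the WinJS call
    // above, expressed with C++/WinRT. Must run on a thread that owns a view
    // (CoreWindow), otherwise GetForCurrentView() fails.
    #include <winrt/Windows.Graphics.Display.h>

    using namespace winrt::Windows::Graphics::Display;

    // Hypothetical helper returning the raw scale multiplier for layout math.
    double currentScaleFactor() {
      DisplayInformation info = DisplayInformation::GetForCurrentView();
      // ResolutionScale() reports an enum such as Scale200Percent; the raw
      // pixels-per-view-pixel value is usually handier for adjusting sizes.
      return info.RawPixelsPerViewPixel();   // e.g. 2.0 at 200% scaling
    }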

MS Windows - Capture windows' GPU bitmaps

I presume DWM holds the bitmap data of each rendered window in the GPU. Can I access this data? I want to use it as a texture in D3D (or preferably OpenGL). Screenshotting each window to RAM and back to the GPU is too slow.
I've seen other posts like: obtaining full desktop screenshot from the GPU
so I'm doubtful, but maybe something has changed in the last 3 years.
Edit
So do all applications use Direct3D to draw all components? Would, say, this Chrome browser's content, or File Explorer's, or anything else exist as an image in the graphics card, or are only borders and such rendered through Direct3D/2D? I want to make sure before pursuing this. BTW: my idea is a desktop for the Rift without running an alternate shell.

OpenCV camera stream stopping while in fullscreen mode

I want to have two applications running simultaneously: one that analyzes the image from a webcam, written using OpenCV (the image is acquired through a callback function), and an application that goes into fullscreen mode (let's say a 3D game). The problem is that while the fullscreen mode is active the webcam image stream stops - the frames simply don't turn up and the callback function isn't called. This seems to be an issue with OpenCV - to test that, a simple application displaying the image from the camera has been prepared (a sketch of such a test app follows below).
Why would the image stream be blocked by the fullscreen mode? How can I bypass this?
Thanks for any hints.
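
Not part of the original question: a minimal sketch, assuming the OpenCV C++ API, of the kind of test application described above - it grabs frames from the default camera and shows them in a native OpenCV window, making it easy to see whether the stream freezes once another application goes fullscreen.

    // Minimal test-app sketch (assumes the OpenCV C++ API): grab frames from
    // the default webcam and display them until ESC is pressed.
    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main() {
      cv::VideoCapture cap(0);                // default webcam
      if (!cap.isOpened()) {
        std::cerr << "Cannot open the camera\n";
        return 1;
      }
      cv::namedWindow("camera");              // native OpenCV output window
      cv::Mat frame;
      while (true) {
        if (!cap.read(frame)) {               // a blocked stream shows up
          std::cerr << "No frame received\n"; // here: read() fails or stalls
          break;
        }
        cv::imshow("camera", frame);
        if (cv::waitKey(30) == 27)            // ESC quits
          break;
      }
      return 0;
    }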
Your question does not say whether you have tried searching for the problem in the OpenCV community first, so I'm posting this as a hint just in case: http://tech.groups.yahoo.com/group/OpenCV/
Also check out the list of issues, maybe it's a known bug: https://code.ros.org/trac/opencv/report/1
I'm not an OpenCV expert, so this is closer to a suggestion than an answer - but I've experienced something similar on my multi-monitor setup, using a number of media players on the second monitor and some fullscreen apps on the first.
In my limited testing, it comes down to what method is used to render the 3D app - DirectX seems to stop media players, OpenGL doesn't.
So it might not be OpenCV which has a problem - it may be what DirectX does to the hardware during a full-screen game.
Actually, the behaviour of the OpenCV camera stream is strange. It seems to depend on the native OpenCV window (cvNamedWindow()) that shows the output image from the webcam. If that window is on the same screen that went fullscreen, the streaming continues. If the camera window is placed on another screen, the stream stops.
Another curious thing happens with a screen resolution change: if you change the resolution of the screen while the camera window is not visible (closed or even minimized), the image stream gets blocked.
These are just my observations on the topic; maybe they'll be helpful for someone.
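
Not part of the original observations, but a tiny sketch of the workaround they imply: keep the OpenCV preview window on the monitor that will go fullscreen. The window name matches the test app above; the coordinates are hypothetical.

    // Sketch of the workaround implied above: move the preview window onto
    // the monitor that the fullscreen application will use.
    #include <opencv2/opencv.hpp>

    void placePreviewOnFullscreenMonitor() {
      cv::namedWindow("camera");
      // Hypothetical coordinates: a point inside the monitor that is about
      // to go fullscreen (e.g. the primary monitor's top-left area).
      cv::moveWindow("camera", 100, 100);
    }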

Does anyone know if there is a performance benefit to fullscreen OpenGL vs windowed OpenGL in OS X?

The client for the MMO I work on uses two contexts, one for a window view and one for fullscreen. I'm wondering if I can just use a window sized to the display and simply resize it if the user wants a smaller window so they can access their desktop.
Is there a performance penalty for running OpenGL in a window vs fullscreen, assuming the same dimensions, etc.?
The client shell is written in cocoa; the game code itself is cross-platform.
We only support OSX 10.5 and 10.6 for the next release.
Before 10.6, if your context did not have the full screen flag in its creation, then you had a small performance difference. Now, with 10.6, this has changed.
Have a look at:
http://lists.apple.com/archives/Cocoa-dev/2009/Sep/msg01054.html
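
Not part of the original answer: a sketch of what "the full screen flag in its creation" refers to on the pre-10.6 CGL path - a pixel format carrying kCGLPFAFullScreen and a context that captures the main display. The helper name is hypothetical and error handling is omitted.

    // Sketch (pre-10.6 CGL path): create a context whose pixel format has the
    // full-screen flag and make it cover the main display.
    #include <OpenGL/OpenGL.h>
    #include <ApplicationServices/ApplicationServices.h>

    CGLContextObj createFullScreenContext() {   // hypothetical helper
      CGLPixelFormatAttribute attribs[] = {
        kCGLPFAFullScreen,                      // the "full screen flag"
        kCGLPFADoubleBuffer,
        kCGLPFADisplayMask,
        (CGLPixelFormatAttribute)CGDisplayIDToOpenGLDisplayMask(CGMainDisplayID()),
        (CGLPixelFormatAttribute)0
      };
      CGLPixelFormatObj pixelFormat = NULL;
      GLint numFormats = 0;
      CGLContextObj context = NULL;

      CGLChoosePixelFormat(attribs, &pixelFormat, &numFormats);
      if (pixelFormat) {
        CGLCreateContext(pixelFormat, NULL, &context);
        CGLDestroyPixelFormat(pixelFormat);
      }
      if (context) {
        CGLSetCurrentContext(context);
        CGLSetFullScreen(context);              // captures the display (pre-10.6)
      }
      return context;
    }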
If there is a cost associated with clipping each frame, then yes.

GUI framework for automatic resizing

I want to build a desktop app where the size of both the window and the content is resized automatically according to the resolution of the monitor. I know it can be done easily with the docking features of .NET Forms, but my customer insists on going with Linux, so I can't use it.
I tried Flex & AIR, but the content is not resized automatically when I put the app in fullscreen or in another resolution (the app goes fullscreen but I still have tiny buttons). Now I am looking at Qt and Gtk...
Is there a GUI framework that can do that? I don't care about the programming language.
Also, since the app will go in a bar, it would be nice to be able to easily customize the skin (like in Flex, WPF, etc.).
Regards,
Pascal
An excellent place to start is understanding how the Screen class works: MSDN. Even though that is .NET, it will give you an idea of how the screen size, DPI, etc. can be obtained. In addition, that information should translate to the Mono platform. Since your client is insisting on Linux, you should look at MonoDevelop and then possibly the GTK# framework. My understanding is that GTK# is not a very friendly (that is, pretty) development system (yet). A GTK sketch of the same screen-size/DPI query follows the links below.
See:
MonoDevelop
GTK#
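
Not part of the original answer: a minimal sketch, assuming gtkmm (the GTK# calls are analogous), of obtaining the screen size and DPI that the answer suggests basing the layout on.

    // Sketch (assumes gtkmm): query the default screen's size and DPI, which
    // a layout could use to size the window and scale its content.
    #include <gtkmm.h>
    #include <iostream>

    int main(int argc, char* argv[]) {
      Gtk::Main kit(argc, argv);   // initializes GTK and opens the display
      Glib::RefPtr<Gdk::Screen> screen = Gdk::Screen::get_default();
      std::cout << "Screen: " << screen->get_width() << "x"
                << screen->get_height() << " pixels at "
                << screen->get_resolution() << " DPI\n";
      return 0;
    }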
