Change resolution for a specific application - Windows

On Microsoft Windows 11, is it possible to change the resolution of a single, specific application?
Let's say that I have a 4K resolution. Some applications are readable, but some really are not. The fonts, the buttons, the images... are too small. It would be great to make the application bigger (like games can run in a different resolution).
That way I would not need to change the resolution down and up on a daily basis for just one or two programs.

Answer
You cannot.
Workaround
What you can do is make sure the application does not have an assembly manifest that says it is dpiAware. Even if it ships with a manifest that declares it DPI aware, it obviously isn't.
Once the application is not dpiAware, you can set Windows display scaling to 125%, 150%, 200%, etc., and let Windows scale the application for you.
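For context, the dpiAware flag this workaround refers to is something the developer declares, either in the application manifest or programmatically. A minimal Win32 sketch, assuming a recent Windows 10/11 SDK, of what opting out looks like, which is exactly the state that lets Windows bitmap-scale the window for you (it is only an illustration, not something you can apply to someone else's binary):

    // Sketch: a process that opts out of DPI awareness (or ships no
    // <dpiAware> entry in its manifest) gets bitmap-scaled by Windows
    // at 125%/150%/200% display scaling.
    // SetProcessDpiAwarenessContext needs Windows 10 1703 or later.
    #include <windows.h>
    #include <cstdio>

    int main()
    {
        // Explicitly declare the process DPI-unaware; Windows will then
        // render it at 96 DPI and stretch the result to the display scale.
        SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_UNAWARE);

        std::printf("IsProcessDPIAware: %d\n", IsProcessDPIAware());
        std::printf("DPI reported to this process: %u\n", GetDpiForSystem());
        return 0;
    }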

Related

Default Windows is scaled to 150%?

I am a UX designer designing educational activities for schools. These are responsive websites. When I came to do QA, I saw that everything is HUGE on Windows. The devs tell me that the default resolution for Windows is 150%... Um, what? I've been in this game a long time and I have not encountered this. This makes no sense... Has anyone encountered something like this?
It seems like Windows tries to find a good physical size for its icons, and that depends on the resolution and the physical size of the screen.
For our end users, display scaling is a platform technology ensuring that content is presented at a consistent and optimal–yet easily adjustable–size for readability and comprehension on every device.
I checked several laptops to see what the recommended scaling was set to:
15" laptop at 1920x1080: recommended scaling was 125%.
17" laptop at 1366x768: recommended scaling was 100%.
12" tablet at 2160x1440: recommended scaling was 150%.
Because the pixel density differs on all of these screens, Windows appears to automatically pick a scaling value that it thinks will give a good physical size.
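To put numbers on that, here is a quick sketch using the sizes and resolutions listed above to compute each screen's pixel density; Windows' exact recommendation heuristic is its own and not reproduced here, so the code only illustrates how different the densities are:

    // Pixel density (PPI) for the three screens listed above. The point is
    // that the densities differ, so the "comfortable" scale differs too.
    #include <cmath>
    #include <cstdio>

    static double ppi(int width_px, int height_px, double diagonal_inches)
    {
        double diagonal_px = std::sqrt(double(width_px) * width_px +
                                       double(height_px) * height_px);
        return diagonal_px / diagonal_inches;
    }

    int main()
    {
        std::printf("15\" 1920x1080: %3.0f PPI -> recommended 125%%\n", ppi(1920, 1080, 15));
        std::printf("17\" 1366x768 : %3.0f PPI -> recommended 100%%\n", ppi(1366, 768, 17));
        std::printf("12\" 2160x1440: %3.0f PPI -> recommended 150%%\n", ppi(2160, 1440, 12));
        return 0;
    }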
This is a bug. Once a user chooses their preferred setting, Windows should leave it alone. I have resigned myself to resetting it manually each time I log on.

Qt Application Appearance Running Over Remote Desktop

I work on a desktop application that we sometimes have to run on a virtual machine accessed via Windows Remote Desktop. Fonts and gradients are noticeably degraded in appearance when running through Remote Desktop. The fonts are clearly not anti-aliased (and normally they are). The gradients degenerate into much larger bands of solid color, losing the smoother look. Initially, I had assumed Windows was doing this to improve performance, but when I compared application fonts in our product with those in other applications (Visual Studio specifically), I saw that Qt is definitely rendering fonts in dialogs and QGraphicsScene differently.
In the application title bar of my app, I see that the font exactly matches the appearance of other application title bars, and that makes sense because Windows draws that. Within my application, all of the top menu items and fonts on dialogs are not anti-aliased and look terrible. We use QGraphicsScene extensively, and those fonts are degraded as well.
I don't have another application that generates gradients to compare those, but I viewed a high resolution image through the Remote Desktop connection using the Windows image viewer, and it looks just as good as on a local desktop.
The degraded appearance means that we can't do screen shots for documentation while using the VM. We are also frequently required to do demos using VMs and Remote Desktop, and the appearance is not appealing to show to customers. In our industry and within our company, there's increasing pressure to use VMs instead of local, physical machines, so this is becoming a bigger problem.
Both symptoms lead me to believe that Qt knows I'm viewing the application through Remote Desktop and is choosing to degrade appearance in favor of performance. I don't want that, or at the very least, I need to be able to control it.
I suspect this is buried somewhere in Qt's style/theme system, but I haven't had any luck finding clues that would point me to the correct place to do something about this, or at least an answer that indicates whether or not it's even possible. Any advice is greatly appreciated.
With QGraphicsScene we have OpenGL available for rendering. On some VMs we mostly rely on software that emulates OpenGL via MS DirectX, i.e. software-based rather than hardware-accelerated rendering. The most popular approach to software OpenGL rendering is based on ANGLE.
To improve rendering on a VM, I would try building a custom Qt for your app using one of the proposed Qt build configurations for a Windows-specific Qt build.
With Qt evolving, it gets a bit confusing which configuration is best. I was told that since Qt 5.5, -opengl dynamic should be optimal for most environments. I used to build the -opengl es2 configuration with Qt 5.3, and that worked well without degrading the graphics. Mind that the VMs used were from VMware and not MS Hyper-V; Hyper-V would not even allow the app to load because OpenGL failed to initialize, and I could not get ANGLE to help there with that specific Qt.
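If rebuilding Qt isn't practical, and assuming your Qt was built with -opengl dynamic (the case for official binaries since roughly Qt 5.5), the rendering path can also be requested at runtime before the QApplication is constructed, or via the QT_OPENGL environment variable. A sketch of that idea; whether it actually cures the Remote Desktop degradation depends on the VM's graphics stack:

    // Sketch: select the OpenGL implementation at runtime. Requires a Qt
    // build configured with -opengl dynamic, and must run before the
    // QApplication object is created. QT_OPENGL=desktop|angle|software
    // achieves the same thing without touching the code.
    #include <QApplication>

    int main(int argc, char *argv[])
    {
        // Route OpenGL through ANGLE (OpenGL ES on top of Direct3D)...
        QCoreApplication::setAttribute(Qt::AA_UseOpenGLES);
        // ...or, on VMs with no usable GPU, force the software rasterizer:
        // QCoreApplication::setAttribute(Qt::AA_UseSoftwareOpenGL);

        QApplication app(argc, argv);
        // ... create and show your QGraphicsView / widgets here ...
        return app.exec();
    }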
I was able to address the issue with fonts in QGraphicsScene. Because of the nature of our product, the font handling for graphics items was fairly specialized, and very early in development, when I was very new to Qt, I had set the style strategy on those fonts to QFont::ForceOutline because I didn't want the font matching to use any bitmapped fonts. Through experimentation, I found that this strategy results in the fonts not being anti-aliased when running through Remote Desktop. Changing to QFont::PreferAntialias addressed the problem for the fonts in the scene, and that's a substantial and welcome improvement.
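Assuming the fonts are applied to the graphics items directly, the change boils down to something like this (the font family and size here are placeholders):

    // Sketch: prefer anti-aliased text for items in a QGraphicsScene rather
    // than forcing outline rendering; QFont::ForceOutline was what disabled
    // anti-aliasing over Remote Desktop in this case.
    #include <QFont>
    #include <QGraphicsScene>
    #include <QGraphicsSimpleTextItem>
    #include <QString>

    void addLabel(QGraphicsScene *scene, const QString &text)
    {
        QFont font("Arial", 10);                       // placeholder font
        font.setStyleStrategy(QFont::PreferAntialias); // was QFont::ForceOutline

        QGraphicsSimpleTextItem *item = scene->addSimpleText(text, font);
        item->setPos(0, 0);
    }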
Unfortunately, I haven't been able to find a solution for the general application fonts, nor for the gradient degradation, but at least with the fonts, I have something more to go on. My next step will be to start inspecting the fonts that Qt is using by default on some of the widgets and seeing what their attributes are.

Assessing feasibility of an OS X port

I need to figure out if porting an application to Mac OS X (not iOS) is feasible. I wrote some code for the Mac around 20 years ago, but what I'm looking at now is completely different and may require a complete rewrite, which I cannot afford. After googling for some time, I found a variety of APIs which appear and get deprecated so often that I feel completely lost.
The application draws by copying small fragments of bitmaps to the window. This is accomplished with BitBlt() on Windows or XCopyArea() on X11. In both cases, the source is stored in video memory, so copying is really fast, 500K copies per second on a decent card, possibly more. On the Mac, there used to be a CopyBits() function which did the same, but it is now deprecated. I found CGContextDrawImage(), which looks like it's getting deprecated too, but it copies from user memory and can only copy the whole image (not fragments). Is there any way to accomplish bitmap copying at decent speed?
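For reference, the Windows side of that drawing model is roughly the following GDI sketch; the two device contexts are assumed to already exist (a window DC and a memory DC with the source bitmap selected into it):

    // Sketch of the existing Windows approach: copy a small fragment of an
    // off-screen bitmap into the window. hdcWindow comes from GetDC() and
    // hdcSource is a CreateCompatibleDC() with the bitmap selected into it.
    #include <windows.h>

    void blitFragment(HDC hdcWindow, HDC hdcSource,
                      int dstX, int dstY, int srcX, int srcY, int w, int h)
    {
        BitBlt(hdcWindow, dstX, dstY, w, h, hdcSource, srcX, srcY, SRCCOPY);
    }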
I see everything is 64-bit. I would want to keep it 32-bit for a number of reasons. 32-bit applications still seem to be supported, but with the fast pace of deprecation, Apple may stop supporting them at any time. Is this a correct assessment?
Software distribution. I cannot find any information on this. It looks like you need to be a member of the Apple Developer Program to be able to install your software on users' computers. Is this true? In some other places, I have read that any software must undergo Apple approval. Is this correct?
Thank you for your help.
So much has changed in the past twenty years that it may indeed be quite difficult to port your app directly to modern OS X. You may be better served by taking the general design concept and application objectives and creating a fresh implementation using up-to-date software technology.
Your drawing system might be much easier to do with modern APIs, but the first step is deciding which framework to use. Invest some time in reading the documentation and watching the many videos available on the Developer website. A logical place to start is Getting Started with Graphics & Animation, but you may also wish to explore Metal Programming Guide and SpriteKit.
The notion of 64-bit vs 32-bit is irrelevant. All Mac computers run 64-bit code.
If you don't purchase a Developer program membership, you can still create an unsigned application with Xcode. It can be installed on another user's computer, but they'll need to specifically change the setting in System Preferences -> Security to "Allow apps downloaded from: Anywhere".
The WWDC videos are very useful in understanding the concepts and benefits of advancements in these frameworks made over the past few years.
After some investigation, it appears that OS X graphics is completely different from the others. The screen is regarded as a target for vector graphics, not bitmaps.
When the user changes the screen resolution, the screen resolution doesn't really change (as it would in Linux or Windows) and remains native; only the scale at which the vector graphics is rendered changes. Consequently, it's perfectly possible to set the screen "resolution" higher than the native one - you just see everything rendered smaller.
When you take a screenshot, the system simply renders everything to an off-screen bitmap (which can be any size), so you can get nice smooth screenshots at any size.
As everything is vector-based, applications that use bitmap graphics are at a huge disadvantage. It is very hard to get to native pixels without much overhead, and worse yet, an application that uses native pixels will behave strangely because it won't scale when the user changes the screen resolution. It will also have problems when screenshots are taken. Is it possible to make it work? I guess I won't find out until I fork over $2K for a MacBook Pro and try it.
32-bit apps seem to be supported, and I don't think there's an intent to drop that support.
As to code distribution, my Thawte Authenticode certificate is supposed to work on OS X as well, so I probably don't need to become a member of the Apple Developer Program to distribute software, but again, there's no definitive answer to that until I try.

How to make an application that suits any computer resolution?

I am working with C# WinForms. I created my application at a resolution of 1280 x 960.
But when I run it on a system that has a different resolution, it does not fit that resolution.
My question is: how do I make the application suit any computer's resolution?
What should I do for that?
If you are stuck with WinForms, you will want to get an understanding of all of the possible resolutions your customers might have. You can then code your form so that it is optimized for the broadest resolution, but still usable by your lowest resolution customers. You can make your forms a bit more dynamic by making use of the Dock property on your controls and using controls like SplitContainer and FlowLayoutPanel to segment the different areas of the form. Though you should really strive to avoid it, you should also use panels to ensure your form scrolls if it will be cut off in very low resolutions.
All that being said, by nature WinForms is very non-dynamic and it can take a lot of effort to get it to be responsive to a lot of different resolutions. If the spread between your minimum and maximum resolutions is not that great, then you can always just code and test to the lowest resolution. Back in the day, I used to keep my second monitor set to 1024x768 just for that purpose.
While WPF will provide you with a truly resolution-independent programming environment, Windows Forms does have limited support for resolution independence.
Automatic Scaling in Windows Forms
Automatic scaling enables a form and its controls, designed on one machine with a certain display resolution or system font, to be displayed appropriately on another machine with a different display resolution or system font. It assures that the form and its controls will intelligently resize to be consistent with native windows and other applications on both the users' and other developers' machines. The support of the .NET Framework for automatic scaling and visual styles enables .NET Framework applications to maintain a consistent look and feel when compared to native Windows applications on each user's machine.

How to preserve other application windows sizes and positions when changing resolution? (eg. to and from full screen game in non-desktop resolution)

Has anyone noticed this odd behavior of applications that utilize D3D or OpenGL when they go to full screen in Windows? It applies only when applications go to full screen and then switch back to windowed mode or terminate. They either shuffle the window positions of other applications (when I am on a single-monitor machine), or move all the other applications' windows to another screen when I am on a multi-monitor machine.
I would take this for granted if it weren't for applications that don't show these two anomalies. So, my question would be: what exactly does one need to take care of when writing an application to alleviate these two problems? Also, I am not sure whether this problem exists on platforms other than Windows.
My primary setup in this regard is OpenGL/C++, but I presume this applies to whatever setup you have, since it seems to be a platform API thing that needs to be taken care of.
edit: OK, here is some more clarification on my observation. The problem persists even at the same resolution as the desktop, so it does not seem to be related to the resolution switch as such, because I've seen applications/games that, even when not running at the desktop resolution, restore the desktop windows to their previous positions when they switch back.
edit2: it looks like it is a resolution-switch problem after all; Windows (at least XP) does not seem to remember the positions and sizes (in the case of a multi-monitor setup) of application windows. It looks like the only solution is the one I provided in an answer to this question, even though it seems like something the OS should provide, at least as an API call or two. I'm still not convinced this is the only solution; there must be an easy way of graceful restoration, no?
Shouldn't you be using ChangeDisplaySettingsEx(..., CDS_FULLSCREEN, NULL)? That will tell the system the resolution swap is temporary.
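For the record, that call might look roughly like this (the 1024x768 mode is just an example value); passing CDS_FULLSCREEN marks the change as temporary, and passing a null DEVMODE afterwards restores the registry-defined desktop mode:

    // Sketch: temporary mode switch for a full-screen session, plus the
    // matching restore. Example resolution only.
    #include <windows.h>

    void enterFullscreenMode()
    {
        DEVMODE dm = {};
        dm.dmSize       = sizeof(dm);
        dm.dmPelsWidth  = 1024;
        dm.dmPelsHeight = 768;
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

        // CDS_FULLSCREEN tells Windows the switch is temporary.
        ChangeDisplaySettingsEx(NULL, &dm, NULL, CDS_FULLSCREEN, NULL);
    }

    void leaveFullscreenMode()
    {
        // A null DEVMODE restores the default mode stored in the registry.
        ChangeDisplaySettingsEx(NULL, NULL, NULL, 0, NULL);
    }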
I can't say that I'm 100% certain about the situation you're experiencing. However, my guess is that it's because most D3D/OpenGL games will change the resolution of your machine when they start up or shut down, for performance reasons.
The ones you see that don't shuffle the windows around are likely not changing the resolution because they may be able to run at your current settings.
I've gone through some more research about this, and it looks like there is no default mechanism for restoring all running windows' sizes and positions after changing resolution, so it must be done from within the application (at least on XP).
So, in order to gracefully return from another resolution (a full-screen game, for example), I would need to get all running applications' window handles with EnumWindows and an appropriate callback, and store each window's RECT structure, obtained via GetWindowRect, in a list.
When switching back to the desktop resolution, I would call EnumWindows again, but with a different callback which sets each application window's position and size with SetWindowPos, using the list of RECTs I saved before switching to full screen.
There are gotchas, of course, like making sure you only touch window handles obtained through EnumWindows, etc. It seems odd that the OS doesn't provide a feature like this, even if only as an API. I wonder how other OSes out there handle this, if they handle it at all.
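A rough sketch of that save/restore idea (top-level, visible windows only; a real implementation would also want to skip its own windows, tool windows, and anything minimized):

    // Sketch: remember top-level window rectangles before the resolution
    // switch and put them back afterwards.
    #include <windows.h>
    #include <utility>
    #include <vector>

    static std::vector<std::pair<HWND, RECT>> g_savedRects;

    static BOOL CALLBACK SaveProc(HWND hwnd, LPARAM)
    {
        RECT rc;
        if (IsWindowVisible(hwnd) && GetWindowRect(hwnd, &rc))
            g_savedRects.push_back({hwnd, rc});
        return TRUE;                       // keep enumerating
    }

    void SaveWindowRects()                 // call before going full screen
    {
        g_savedRects.clear();
        EnumWindows(SaveProc, 0);
    }

    void RestoreWindowRects()              // call after the desktop mode is back
    {
        for (const auto &entry : g_savedRects)
        {
            if (!IsWindow(entry.first))    // window may have closed meanwhile
                continue;
            const RECT &rc = entry.second;
            SetWindowPos(entry.first, NULL, rc.left, rc.top,
                         rc.right - rc.left, rc.bottom - rc.top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
        }
    }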

Resources