I'm looking for a way to save and run a script that changes the layout of the connected monitors. I have different setups at home and in the office that I connect the laptop to. I'd also like it to detect which dock is the home dock and set up the layout accordingly. I also want certain apps to move to certain screens when the dock is connected. Is this even possible?
I have run a few queries through PowerShell to get the positions of the monitors relative to the primary screen (the laptop), to no avail, and I'm at a complete loss as to where to go next.
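As a starting point, here is a minimal C# sketch of enumerating the current layout (it assumes a project that references System.Windows.Forms; the 3440-pixel check is only a placeholder for whatever your own home dock actually reports):

```csharp
// Enumerate the attached displays and print each one's position relative to
// the primary screen, then branch on a crude "fingerprint" of the layout.
using System;
using System.Linq;
using System.Windows.Forms;

class ShowDisplays
{
    [STAThread]
    static void Main()
    {
        foreach (var screen in Screen.AllScreens)
        {
            Console.WriteLine($"{screen.DeviceName}  bounds={screen.Bounds}  primary={screen.Primary}");
        }

        // Placeholder heuristic: pretend the home dock is the one with a
        // 3440-pixel-wide ultrawide attached. Replace with your real values.
        bool looksLikeHomeDock = Screen.AllScreens.Any(s => s.Bounds.Width == 3440);
        Console.WriteLine(looksLikeHomeDock ? "Home dock layout detected" : "Office/other layout");
    }
}
```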
I understand that desktop/GUI apps are not supported in Windows containers. They do run, but there's no built-in way to interact with them. I had the following idea: maybe I could use the Desktop Sharing API (https://learn.microsoft.com/en-us/windows/win32/api/_rdp/) for this purpose. The idea is to run a desktop app, then run a sharing program that uses the Desktop Sharing API, and connect to it from the host using a Desktop Sharing API viewer program.
I had to brush up on window stations and desktops, and I noticed that when starting the container with cmd in interactive mode, I'm logged on as ContainerAdministrator with a service logon (logon type 5). I tried running some WinAPI functions that deal with desktops and window stations and got some access-denied errors, so I switched to running cmd as SYSTEM.
The window station of the cmd process (and its child processes) is not the interactive WinSta0 but some other service window station, which makes sense since I'm logged on as a service. I figured that I can't use this window station, so I used a little program I wrote to run notepad on WinSta0 in the Default desktop. Afterwards I ran another program that enumerates the windows on WinSta0\Default, and the notepad window does get enumerated and I also get its title, so it's running somewhere.
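For reference, a minimal C# sketch of what that launcher step can look like: CreateProcess with STARTUPINFO.lpDesktop pointed at WinSta0\Default (it assumes the caller, e.g. SYSTEM, already has sufficient rights on that window station and desktop):

```csharp
// Launch a process on a specific window station\desktop by setting
// STARTUPINFO.lpDesktop before calling CreateProcess.
using System;
using System.Runtime.InteropServices;

class LaunchOnDefaultDesktop
{
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
    struct STARTUPINFO
    {
        public int cb;
        public string lpReserved, lpDesktop, lpTitle;
        public int dwX, dwY, dwXSize, dwYSize, dwXCountChars, dwYCountChars;
        public int dwFillAttribute, dwFlags;
        public short wShowWindow, cbReserved2;
        public IntPtr lpReserved2, hStdInput, hStdOutput, hStdError;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct PROCESS_INFORMATION
    {
        public IntPtr hProcess, hThread;
        public int dwProcessId, dwThreadId;
    }

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern bool CreateProcess(
        string lpApplicationName, string lpCommandLine,
        IntPtr lpProcessAttributes, IntPtr lpThreadAttributes,
        bool bInheritHandles, uint dwCreationFlags, IntPtr lpEnvironment,
        string lpCurrentDirectory, ref STARTUPINFO lpStartupInfo,
        out PROCESS_INFORMATION lpProcessInformation);

    static void Main()
    {
        var si = new STARTUPINFO { cb = Marshal.SizeOf<STARTUPINFO>() };
        si.lpDesktop = @"WinSta0\Default";   // target window station\desktop

        if (!CreateProcess(null, "notepad.exe", IntPtr.Zero, IntPtr.Zero,
                false, 0, IntPtr.Zero, null, ref si, out var pi))
        {
            Console.WriteLine("CreateProcess failed: " + Marshal.GetLastWin32Error());
        }
        else
        {
            Console.WriteLine("Started notepad, PID " + pi.dwProcessId);
        }
    }
}
```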
So now I tried running the desktop sharing API program (also on WinSta0\Default). It runs and I can connect from the host, but I only get a black screen without anything on it. I also tried running a program that takes a screenshot of the windows but I get an empty bitmap.
So I thought that maybe the Default desktop is not the active desktop, and by using the OpenInputDesktop function I could confirm it: the current input desktop was the Winlogon desktop. So I used the SwitchDesktop function to switch to the Default desktop (and used OpenInputDesktop again to verify that it actually worked).
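A minimal C# sketch of that check-and-switch, assuming DESKTOP_SWITCHDESKTOP access can be obtained in this context:

```csharp
// Report which desktop currently receives input, then make Default the
// input desktop via SwitchDesktop.
using System;
using System.Runtime.InteropServices;
using System.Text;

class SwitchToDefaultDesktop
{
    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr OpenInputDesktop(uint dwFlags, bool fInherit, uint dwDesiredAccess);

    [DllImport("user32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr OpenDesktop(string lpszDesktop, uint dwFlags, bool fInherit, uint dwDesiredAccess);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool SwitchDesktop(IntPtr hDesktop);

    [DllImport("user32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern bool GetUserObjectInformation(IntPtr hObj, int nIndex, StringBuilder pvInfo, int nLength, out int lpnLengthNeeded);

    const int UOI_NAME = 2;
    const uint DESKTOP_SWITCHDESKTOP = 0x0100;
    const uint GENERIC_READ = 0x80000000;

    static void Main()
    {
        // 1. Which desktop is currently receiving input?
        IntPtr input = OpenInputDesktop(0, false, GENERIC_READ);
        var name = new StringBuilder(256);
        GetUserObjectInformation(input, UOI_NAME, name, name.Capacity, out _);
        Console.WriteLine("Current input desktop: " + name);

        // 2. Make Default the input desktop.
        IntPtr def = OpenDesktop("Default", 0, false, DESKTOP_SWITCHDESKTOP);
        Console.WriteLine(SwitchDesktop(def)
            ? "Switched to Default"
            : "SwitchDesktop failed: " + Marshal.GetLastWin32Error());
    }
}
```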
Unfortunately, this didn't change anything; I still get an empty screen and empty bitmaps.
I know that containers are built for microservices and are not supposed to run GUI apps and so on, but still: is there a way to make this work, or any ideas of what else I can check? Alternatively, if you know that it can't work, I would also be happy to hear a good technical explanation of why it doesn't.
I've got a personal laptop (running Windows 10) which I use at work, where I connect it to an external display in extended display mode. I keep all my personal icons and windows on the laptop display and all the work-related windows on the external display. Whenever I unplug it, all the windows and icons from that display are merged onto my laptop screen. I want to programmatically prevent anything on my primary screen from changing when the secondary display is disconnected. I'm currently writing a utility app with a variety of small productivity-improving features and would like to add such a feature to it. I can think of two ways to achieve this:
1. trick the system into thinking that the external display hasn't been disconnected; or
2. take all the open windows and icons on the disconnected screen and put them on a separate virtual desktop.
I was looking into the Windows GDI Device Context functions but haven't found anything about display connection/disconnection events. How can I detect display disconnection (and get that display's open windows and icons)? Anything that can be done using C#, C++ or PowerShell scripts would be much appreciated!
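For the detection part, a minimal C# sketch (it assumes a WinForms message loop is running, since SystemEvents.DisplaySettingsChanged is delivered through one; figuring out which windows lived on the removed screen is only hinted at in a comment):

```csharp
// Watch for display-configuration changes and report screens that appear or
// disappear. SystemEvents.DisplaySettingsChanged (Microsoft.Win32) needs a
// message loop, hence Application.Run().
using System;
using System.Linq;
using System.Windows.Forms;
using Microsoft.Win32;

static class DisplayWatcher
{
    static string[] _known = Screen.AllScreens.Select(s => s.DeviceName).ToArray();

    [STAThread]
    static void Main()
    {
        SystemEvents.DisplaySettingsChanged += (sender, e) =>
        {
            var current = Screen.AllScreens.Select(x => x.DeviceName).ToArray();
            foreach (var removed in _known.Except(current))
            {
                Console.WriteLine("Display disconnected: " + removed);
                // At this point you could enumerate top-level windows (EnumWindows)
                // and compare their last-known bounds against the removed screen
                // to find the windows that are about to be merged.
            }
            foreach (var added in current.Except(_known))
            {
                Console.WriteLine("Display connected: " + added);
            }
            _known = current;
        };

        Application.Run();   // keep a message loop alive so the event is delivered
    }
}
```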
I have 3 monitors, but I don't need them all turned on all the time. I can just shut them down with the power button, but I want to use their standby mode, like Windows does when the PC idles for a while - it turns off the monitors, HDDs, etc.
But of course, I want to keep using the PC with just that one monitor on standby. The others must remain on, and that one must not wake up even while I'm using the PC.
Is it possible to do that? It would be great to have shortcuts like Winkey+1, 2, 3, etc. to shut down and wake up each monitor.
An app with this exact feature is not likely to exist, but is there a Windows API function that can control the monitor state for each monitor in a multi-monitor system?
The display control panel applet calls SetDisplayConfig to start or stop forced projection on a particular target.
You can probably use MS Detours or some other API hooking tool to inspect the usage pattern of the API while using the applet to adjust display settings.
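If the Win+P-style topologies are enough (rather than per-target forced projection), SetDisplayConfig can also be driven directly with the SDC_TOPOLOGY_* flags; a minimal C# sketch, which is not necessarily the applet's exact call sequence:

```csharp
// Switch between the same "PC screen only / Extend / Second screen only"
// topologies as Win+P by calling SetDisplayConfig with topology flags.
using System;
using System.Runtime.InteropServices;

static class Topology
{
    [DllImport("user32.dll")]
    static extern int SetDisplayConfig(uint numPathArrayElements, IntPtr pathArray,
                                       uint numModeInfoArrayElements, IntPtr modeInfoArray,
                                       uint flags);

    const uint SDC_APPLY             = 0x00000080;
    const uint SDC_TOPOLOGY_INTERNAL = 0x00000001;  // PC screen only
    const uint SDC_TOPOLOGY_EXTEND   = 0x00000004;  // Extend
    const uint SDC_TOPOLOGY_EXTERNAL = 0x00000008;  // Second screen only

    static void Main(string[] args)
    {
        uint topology = args.Length > 0 && args[0] == "internal"
            ? SDC_TOPOLOGY_INTERNAL
            : SDC_TOPOLOGY_EXTEND;

        int result = SetDisplayConfig(0, IntPtr.Zero, 0, IntPtr.Zero, SDC_APPLY | topology);
        Console.WriteLine(result == 0 ? "OK" : "SetDisplayConfig returned " + result);
    }
}
```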
You'll want to try Display Fusion. You should be able to do what you're asking for using Monitor configurations.
I know I'm late on this, but use DDC to control your display. You can easily create hotkeys that send a command via DDC to the display to turn it off. This is equivalent to turning off the display using its button. Works like a charm for me. The only trick is that DDC command specs vary across monitor manufacturers, but it's not hard to find the right codes to send with the help of Google.
Ready-made tools also exist for this; search for anything related to DDC or EDID and you should find them.
Be aware, though, that this does not remove the display from Windows, which means apps may still find their way onto displays that are off and you will be left looking for them.
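A minimal C# sketch of the DDC/CI route via the dxva2.dll monitor configuration API (it assumes the monitor honors DDC/CI; VCP code 0xD6 is the power-mode control, and the exact off/standby values vary by model):

```csharp
// Send the DDC/CI power-mode command (VCP 0xD6) to a physical monitor.
// 0x01 = on; 0x04 or 0x05 = off/standby depending on the monitor.
using System;
using System.Runtime.InteropServices;

static class DdcPower
{
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
    struct PHYSICAL_MONITOR
    {
        public IntPtr hPhysicalMonitor;
        [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 128)]
        public string szPhysicalMonitorDescription;
    }

    [DllImport("user32.dll")] static extern IntPtr GetDesktopWindow();
    [DllImport("user32.dll")] static extern IntPtr MonitorFromWindow(IntPtr hwnd, uint flags);

    [DllImport("dxva2.dll", SetLastError = true)]
    static extern bool GetPhysicalMonitorsFromHMONITOR(IntPtr hMonitor, uint arraySize, [Out] PHYSICAL_MONITOR[] monitors);

    [DllImport("dxva2.dll", SetLastError = true)]
    static extern bool SetVCPFeature(IntPtr hPhysicalMonitor, byte vcpCode, uint newValue);

    [DllImport("dxva2.dll", SetLastError = true)]
    static extern bool DestroyPhysicalMonitors(uint arraySize, PHYSICAL_MONITOR[] monitors);

    const uint MONITOR_DEFAULTTOPRIMARY = 1;
    const byte VCP_POWER_MODE = 0xD6;

    static void Main()
    {
        // For brevity this targets the primary monitor; use EnumDisplayMonitors
        // to get an HMONITOR for each display and bind a hotkey per monitor.
        IntPtr hMonitor = MonitorFromWindow(GetDesktopWindow(), MONITOR_DEFAULTTOPRIMARY);

        var physical = new PHYSICAL_MONITOR[1];
        if (!GetPhysicalMonitorsFromHMONITOR(hMonitor, 1, physical))
        {
            Console.WriteLine("No DDC/CI handle for this monitor");
            return;
        }

        bool ok = SetVCPFeature(physical[0].hPhysicalMonitor, VCP_POWER_MODE, 0x04); // try standby/off
        Console.WriteLine(ok ? "Sent power-off" : "SetVCPFeature failed (monitor may not support DDC/CI)");

        DestroyPhysicalMonitors(1, physical);
    }
}
```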
I manage a number of Windows PCs which are used to control equipment. Each computer has a specific program installed which is what people launch to use that equipment. We want to require people to log in before they can access this program.
Currently, I have a wxPython app which just launches that executable when people log in with the correct credentials. However, you can just run the program directly and bypass logging on. I'd like to make a mock logon screen, i.e., fullscreen and modal, which only goes away when you log in. It should also not be possible to bypass it with Alt-Tab, the Windows key, etc. How might I accomplish this with wxPython?
There is no foolproof way to do this on Windows. You can show a wx.Frame modally using its MakeModal() method, and you can catch EVT_CLOSE and basically veto it if they try to close the frame. However, if they have access to the Task Manager or even Run, they can probably get around the screen. Most users won't be that smart, though. You can delete the shortcuts to the apps you want to launch with wx, and that will force most normal users to use your login screen. It's only the smart ones who like to trawl through the file system who will go around it.
I have a user that is currently running my Winforms app on Win7. My app allows users to select rows from an open Excel spreadsheet and drag-n-drop them onto the app. However, this user cannot do the drag-n-drop. The cursor changes to the "no" cursor (little circle with line through it) and the operation won't complete.
I was researching drag-n-drop and Win7 and everything I found points to UAC and/or UIPI. I was looking for some solutions and am not sure if any of the below would work:
If the user logs in as admin (and as a result runs my app as admin) would that allow drag-n-drop to work?
Does the user need to turn off or change the settings of UAC/UIPI in order to be able to drag-n-drop?
I am not sure what the issue is. My app usually runs from C:/Documents and Settings/... (C:/Users/... on Win7). Does where it is running from matter? Does drag-n-drop not work because the user is not running my app with enough permissions? Are his Excel and my app on different permission levels? If so, what can be done about that? Note that even though my app allows users to just drag the file directly, that doesn't work either.
Also, is there any way I can have the user reproduce this issue with other apps? Are there apps that come with Win7 with which he can see the same problem? For example, can this be reproduced using Notepad?
Thanks.
Explaining this problem away by UIPI is a very long stretch. It doesn't have anything to do with whether or not the user is logged in as an admin; that doesn't affect UAC, and your program will be running under that same account anyway. The only way UIPI could kick in to stop a drag-and-drop is when your program is elevated and Excel is not.
To get yourself elevated requires work and doesn't happen by accident. You'd have to include a manifest so that the user gets the UAC prompt, and you'd know about that. Or the user would have to edit the desktop shortcut and tick the "Run this program as an administrator" option, and he'd know about that. While UIPI can be bypassed for Windows messages (ChangeWindowMessageFilter), it cannot be for drag and drop, so if any elevation is going on then you're stuck. The ultimate test is to simply ask the user to put the UAC slider all the way down.
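A quick way to settle the elevation question without touching the UAC slider is to have a test build log whether the process token is actually in the Administrators role; a minimal C# sketch:

```csharp
// Log whether this process is running elevated. If it prints true while
// Excel runs non-elevated, UIPI blocking the drop becomes plausible.
using System;
using System.Security.Principal;

static class ElevationCheck
{
    static void Main()
    {
        var identity = WindowsIdentity.GetCurrent();
        var principal = new WindowsPrincipal(identity);
        bool elevated = principal.IsInRole(WindowsBuiltInRole.Administrator);
        Console.WriteLine("Running elevated: " + elevated);
    }
}
```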
The much more likely scenario is that your DragEnter event handler simply isn't happy with the data it sees and therefore doesn't assign the e.Effect property. If you can't get a debugger on-site then write a little test program that logs the values of e.Data.GetFormats() plus whatever else you use to check if the drop is acceptable. And don't forget the obvious: the user simply fumbling the drag somehow.
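A minimal sketch of such a test program: a bare WinForms window with a list box that accepts any drop and logs every format the drag source offers, plus whether the drop completes:

```csharp
// Drop probe: accept any drag, list the offered data formats, and confirm
// whether the drop actually completes.
using System;
using System.Windows.Forms;

class DropProbe : Form
{
    DropProbe()
    {
        Text = "Drop probe";
        var log = new ListBox { Dock = DockStyle.Fill, AllowDrop = true };
        Controls.Add(log);

        log.DragEnter += (s, e) =>
        {
            foreach (string format in e.Data.GetFormats())
                log.Items.Add("offered format: " + format);
            e.Effect = DragDropEffects.Copy;   // accept everything so the drop can complete
        };
        log.DragDrop += (s, e) => log.Items.Add("drop completed");
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new DropProbe());
    }
}
```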