Is there a way to trick GUI applications in Docker into thinking their window loaded?

I am trying to run a Windows 10 application inside a Windows Server Core container.
The app can run without user input via its COM interface (and without a visible GUI), but it apparently needs to load a hidden window in the background. When I start it in Docker, the application's log file indicates that it is stuck starting this window.
Is there a way to make the app assume it successfully loaded the window?
All the information I have found so far is about users who want to see the GUI, or about Linux/Windows combinations. None of it helped me.

Related

How to hide the terminal window of a server application (like warp) on Windows?

I have a small warp server project on Windows that listens on a particular port and does something whenever I send it a command via REST (for example: POST http://10.10.10.1:5000/print). It's a small client for printing PDFs/receipts directly from another computer.
It works, but when I package the whole project, the Rust compiler gives me an executable file (.exe) that displays a terminal window when I run it. I want this terminal window to be hidden somehow.
I tried running the program as a Windows service (using NSSM). That doesn't work for me since I need to access the printer; Windows doesn't allow my app to access devices or launch other executables while running as a service. (The reasons are explained here: How can I run an EXE program from a Windows Service using C#?)
So I plan to run my app as a tray-icon application so the user can control or close it (https://github.com/olback/tray-item-rs).
Unfortunately, I still cannot hide the app's terminal window.
Another solution I found is hstart (https://www.ntwind.com/software/hstart.html), but I would like to keep that as a last resort since many antivirus products and Windows Defender flag it as malware.
Does anyone know how to hide it or get rid of it?
After a lot of searching, it turns out to be easier than I thought. Just add
#![windows_subsystem = "windows"]
at the top of your main.rs file (Rust 1.18 or later) and the terminal window is gone.
These control the /SUBSYSTEM flag in the linker. For now, only
"console" and "windows" are supported.
When is this useful? In the simplest terms, if you're developing a
graphical application, and do not specify "windows", a console window
would flash up upon your application's start. With this flag, it
won't.
https://doc.rust-lang.org/reference/runtime.html#the-windows_subsystem-attribute
https://blog.rust-lang.org/2017/06/08/Rust-1.18.html
https://learn.microsoft.com/en-us/cpp/build/reference/subsystem-specify-subsystem?view=msvc-170
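
For reference, here is roughly what the top of such a main.rs could look like with the attribute in place. This is only a sketch: it assumes warp and tokio are listed in Cargo.toml, reuses the /print route and port 5000 from the question's example, and omits the actual printing logic.

    #![windows_subsystem = "windows"] // must be at the very top of main.rs; hides the console (Rust 1.18+)

    // Sketch only: the real PDF/receipt printing code is omitted.
    use warp::Filter;

    #[tokio::main]
    async fn main() {
        // POST http://<host>:5000/print -> would trigger the printing logic
        let print = warp::post()
            .and(warp::path("print"))
            .map(|| "print job accepted");

        warp::serve(print).run(([0, 0, 0, 0], 5000)).await;
    }

One side effect worth noting: with the windows subsystem there is no console at all, so anything the server prints to stdout/stderr is lost unless it is redirected to a log file.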

Windows Containers - Is it possible to interact with desktop apps running in a container using the Desktop Sharing API?

I understand that desktop/GUI apps are not supported in Windows containers. They do run, but there is no built-in way to interact with them. I had the following idea: maybe I could use the Desktop Sharing API (https://learn.microsoft.com/en-us/windows/win32/api/_rdp/) for this purpose. The plan is to run a desktop app, then run a sharing program that uses the Desktop Sharing API, and connect to it from the host using a Desktop Sharing API viewer program.
I had to brush up on window stations and desktops, and I noticed that when starting the container with cmd in interactive mode, I am logged on as ContainerAdministrator with a service logon (logon type 5). I tried running some WinAPI functions that deal with desktops and window stations and got access-denied errors, so I switched to running cmd as SYSTEM.
The window station of the cmd process (and other child processes) is not the interactive WinSta0 but some other service window station, which makes sense since I'm logged on as a service. I figured I can't use this window station, so I used a little program I wrote to run notepad on WinSta0 in the Default desktop. Afterwards I ran another program that enumerates the windows on WinSta0\Default; the notepad window does get enumerated and I also get its title, so it's running somewhere.
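
For anyone curious what such a launcher might look like, here is a rough sketch (not the asker's actual program) using the winapi crate; error handling is omitted and notepad.exe is just the example process from the question.

    // Launch notepad.exe on WinSta0\Default by setting lpDesktop in STARTUPINFOW.
    use std::mem;
    use std::ptr;
    use winapi::um::processthreadsapi::{CreateProcessW, PROCESS_INFORMATION, STARTUPINFOW};

    fn wide(s: &str) -> Vec<u16> {
        s.encode_utf16().chain(std::iter::once(0)).collect()
    }

    fn main() {
        unsafe {
            let mut si: STARTUPINFOW = mem::zeroed();
            si.cb = mem::size_of::<STARTUPINFOW>() as u32;
            // Target window station and desktop for the new process.
            let mut desktop = wide("WinSta0\\Default");
            si.lpDesktop = desktop.as_mut_ptr();

            let mut cmd = wide("notepad.exe");
            let mut pi: PROCESS_INFORMATION = mem::zeroed();
            CreateProcessW(
                ptr::null(),       // application name (command line is used instead)
                cmd.as_mut_ptr(),  // command line (must point to writable memory)
                ptr::null_mut(),   // process security attributes
                ptr::null_mut(),   // thread security attributes
                0,                 // do not inherit handles
                0,                 // no special creation flags
                ptr::null_mut(),   // inherit environment
                ptr::null(),       // inherit current directory
                &mut si,
                &mut pi,
            );
            // Real code should check the return value and CloseHandle pi.hProcess/pi.hThread.
        }
    }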
So now I tried running the desktop sharing API program (also on WinSta0\Default). It runs and I can connect from the host, but I only get a black screen without anything on it. I also tried running a program that takes a screenshot of the windows but I get an empty bitmap.
So I thought maybe the Default desktop is not the active desktop, and by using the OpenInputDesktop function I could confirm it - the current desktop was the Winlogon desktop, so I used the SwitchDesktop function to switch to the Default desktop (I used OpenInputDesktop again to verify that it actually worked).
Unfortunately, this didn't change anything; I still get an empty screen and empty bitmaps.
I know that containers are built for microservices and are not supposed to run GUI apps, but still: is there a way to make this work, or any ideas about what else I can check? Alternatively, if you know that it can't work, I would also be happy to hear a good technical explanation of why it doesn't.
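
For what it's worth, the desktop check/switch described above looks roughly like this with the winapi crate (a sketch only: it assumes the process is already running as SYSTEM inside the container, most error handling is skipped, and the requested access rights may need adjusting).

    use winapi::um::winuser::{
        CloseDesktop, GetUserObjectInformationW, OpenDesktopW, OpenInputDesktop,
        SwitchDesktop, DESKTOP_READOBJECTS, DESKTOP_SWITCHDESKTOP, UOI_NAME,
    };

    fn main() {
        unsafe {
            // 1. Which desktop currently receives input? (Winlogon, in the question.)
            let input = OpenInputDesktop(0, 0, DESKTOP_READOBJECTS);
            if !input.is_null() {
                let mut name = [0u16; 256];
                let mut needed = 0u32;
                GetUserObjectInformationW(
                    input as _,            // desktop handle
                    UOI_NAME as _,         // ask for the object's name
                    name.as_mut_ptr() as _,
                    (name.len() * 2) as u32,
                    &mut needed,
                );
                let name = String::from_utf16_lossy(&name);
                println!("input desktop: {}", name.trim_end_matches('\0'));
                CloseDesktop(input);
            }

            // 2. Open WinSta0\Default and make it the input desktop.
            let default_name: Vec<u16> = "Default\0".encode_utf16().collect();
            let default = OpenDesktopW(default_name.as_ptr(), 0, 0, DESKTOP_SWITCHDESKTOP);
            if default.is_null() || SwitchDesktop(default) == 0 {
                eprintln!("could not switch to the Default desktop");
            }
            if !default.is_null() {
                CloseDesktop(default);
            }
        }
    }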

Dialog window opened in a Windows service

I am running a Windows application as a service. This application uses Office applications (Word, Excel). The service runs under a specific user account.
Here is the problem:
When the service tries to use Excel, Excel shows a dialog window which blocks the application from running. Since it's a service running in non-interactive mode, you are not able to see the dialog window.
I was wondering if there is a way to get the title of that dialog window to understand its root cause.
I tried using EnumDesktopWindows but it didn't help.
Platform: Windows 10.
Thanks
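
For illustration, the EnumDesktopWindows approach mentioned above could be sketched like this with the winapi crate; run from inside the service process (or another process attached to the same desktop), it prints the titles of the top-level windows on that desktop, which should include the blocking Excel dialog. The buffer size and output format are arbitrary.

    use winapi::shared::minwindef::{BOOL, LPARAM, TRUE};
    use winapi::shared::windef::HWND;
    use winapi::um::processthreadsapi::GetCurrentThreadId;
    use winapi::um::winuser::{EnumDesktopWindows, GetThreadDesktop, GetWindowTextW};

    // Callback invoked once per top-level window; print its title if it has one.
    unsafe extern "system" fn print_title(hwnd: HWND, _lparam: LPARAM) -> BOOL {
        let mut buf = [0u16; 256];
        let len = GetWindowTextW(hwnd, buf.as_mut_ptr(), buf.len() as i32);
        if len > 0 {
            println!("{}", String::from_utf16_lossy(&buf[..len as usize]));
        }
        TRUE // keep enumerating
    }

    fn main() {
        unsafe {
            // Enumerate the windows on the desktop this thread is attached to.
            let desktop = GetThreadDesktop(GetCurrentThreadId());
            EnumDesktopWindows(desktop, Some(print_title), 0);
        }
    }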

Is it possible to run GUI apps in headless Windows containers?

This question is similar to, but different from, Is it possible to run GUI apps in windows containers? in the following way: I do not wish to log in and see the GUI. Instead, I would like to know whether it is possible to start a WinForms or WPF application from the command line inside a Windows container. If it runs at all, what are the window/screen implications for the GUI application?
Perhaps I have a GUI application that starts listening on a port when a user clicks a button, and perhaps I want to test this application in an automated way: both with a script that triggers the GUI button click, also running in the container, and with a script that creates and runs a container holding both the GUI app and the button clicker.
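
To make the "button clicker" part concrete, a script running on the same desktop inside the container could do something like the sketch below with the winapi crate. The window title "My GUI App" and button caption "Start listening" are made-up placeholders, and sending BM_CLICK only works for buttons that are real Win32 child windows (native or WinForms); a WPF button would need UI Automation instead.

    use std::ptr;
    use winapi::um::winuser::{FindWindowExW, FindWindowW, SendMessageW, BM_CLICK};

    fn wide(s: &str) -> Vec<u16> {
        s.encode_utf16().chain(std::iter::once(0)).collect()
    }

    fn main() {
        unsafe {
            // Find the app's top-level window by its title (any window class).
            let window = FindWindowW(ptr::null(), wide("My GUI App").as_ptr());
            if window.is_null() {
                eprintln!("main window not found");
                return;
            }
            // Find the child control by its caption, then send it a click.
            let button = FindWindowExW(
                window,
                ptr::null_mut(),
                ptr::null(),
                wide("Start listening").as_ptr(),
            );
            if button.is_null() {
                eprintln!("button not found");
                return;
            }
            SendMessageW(button, BM_CLICK, 0, 0);
        }
    }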

Why don't some programs appear on the task manager startup tab?

Many applications run at startup; however, some of them do not appear on the Task Manager Startup tab. What is that due to?
Is there any way to do this with a program, for example Spotify?
What do I need to do for a program to start at startup but not show in the startup applications tab?
Setting it under HKCU\Software\Microsoft\Windows\CurrentVersion\Run doesn't seem to work: the program starts, but it still shows on the mentioned tab.
Thank you in advance.
It's mostly a factor of how the program itself is written: whether it's written to run as a service, as a system tray application, or otherwise.
I know there are wrappers for running any exe as a service, NSSM being the main one I have experience with (but this is mostly for when there is going to be no user interaction).
I do not know of anything that can make an application run only in the system tray, and not in the taskbar, if the application doesn't support that itself.
But since Spotify does support running minimized to the tray, there do seem to be ways to "start Spotify minimized"; Spotify or other applications might have command-line options or other settings to tell them to start "hidden".
