Windows Service exits when calling a child process using _execv() - winapi

I have a C++ Windows application that is designed to run as a Windows service. It periodically executes an updater to check whether a new version is available; the updater is launched with _execv(). The updater looks for new versions, downloads them, stops the Windows service (all of these actions are logged), replaces the files, and starts the service again. Run in CLI mode (not as a service), this works fine. According to my log files, the child process is launched, but the parent process (the Windows service) exits.
Is it even "allowed" to launch child processes from a Windows service, and why does the service exit unexpectedly? My log files show no error (I am even monitoring for segfaults etc., which would be written to the log).

Why are you using _execv() rather than doing it the windows way and using CreateProcess()?
I assume you've added some debug output to your service and found that you aren't getting past the point where you call _execv()?

_execv replaces the existing process with a new one running the file you pass as a parameter. Under Unix (and similar) that's handled directly by the kernel. Windows, however, doesn't support that natively -- so the runtime emulates it by starting the new program as a separate process and having the calling process exit as soon as it has done so.
In other words, it sounds like _execv is doing exactly what it's designed to do -- but in this case it's probably not what you really want. You can spawn a process from a service, but you generally want to use CreateProcessAsUser to create it under a specified account rather than the service account (which has a rather unusual set of rights assigned to it). The service process then exits and restarts when asked to by the service control manager, i.e. when your updater calls ControlService, CreateService, etc.
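For reference, a minimal sketch of launching the updater with CreateProcess instead (the updater path and arguments are placeholders); the service keeps running while the updater does its work, and CreateProcessAsUser would be substituted if the updater must run under a different account:
#include <windows.h>

// Start the updater as an independent child process; the service keeps running.
// The path is a placeholder - swap in CreateProcessAsUser with a suitable token
// if the updater should not run under the service account.
bool LaunchUpdater()
{
    wchar_t cmdLine[] = L"\"C:\\path\\to\\updater.exe\" --check"; // writable: CreateProcess may modify it
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};

    if (!CreateProcessW(nullptr, cmdLine, nullptr, nullptr, FALSE,
                        CREATE_NO_WINDOW, nullptr, nullptr, &si, &pi))
        return false; // GetLastError() has the reason

    // Don't wait here: the updater will stop and restart this service itself.
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return true;
}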

Related

How to solve the issue of "CS8805: Program using top-level.." VS "1053: The service did not respond.."?

When the output type is set to "Windows application" or "Console application", the service can't be started and gives error "1053: The service did not respond to the start or control request in a timely fashion.". Digging into that, I found that it is because Windows 10 (and recent Windows Server versions) no longer allow services to interact with the UI (session 0) without the "Interactive Services Detection" service running - and that service is not available in Windows 10.
Trying to work around that by using output type "Class library" instead results in build error "CS8805: Program using top-level statements must be an executable.". I have read some articles saying that this can be caused by a stray double semi-colon somewhere, but I haven't found anything like that in the source files.
So - how do I create a Worker service which is functional on Windows 10 and new Windows servers?
A service can be a console or GUI .exe; it does not matter. However, it MUST call the service API when it starts:
When the service control manager starts a service process, it waits for the process to call the StartServiceCtrlDispatcher function. The main thread of a service process should make this call as soon as possible after it starts up (within 30 seconds).
Read more about services and look at the examples on MSDN...
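For illustration, this is roughly what that registration looks like in native code - a minimal sketch with a placeholder service name and no status reporting; a .NET Worker Service does the equivalent when it is hosted with the Windows service integration (UseWindowsService) rather than run as a plain console app:
#include <windows.h>

// Placeholder handler; a real service reports SERVICE_RUNNING / SERVICE_STOPPED
// through SetServiceStatus from here and from ServiceMain.
void WINAPI ServiceCtrlHandler(DWORD /*control*/) {}

void WINAPI ServiceMain(DWORD /*argc*/, LPWSTR* /*argv*/)
{
    RegisterServiceCtrlHandlerW(L"MyService", ServiceCtrlHandler);
    // ... report SERVICE_RUNNING and do the actual work ...
}

int wmain()
{
    SERVICE_TABLE_ENTRYW table[] = {
        { const_cast<LPWSTR>(L"MyService"), ServiceMain },
        { nullptr, nullptr }
    };
    // This call must happen promptly after startup, or the SCM fails the
    // start with error 1053.
    return StartServiceCtrlDispatcherW(table) ? 0 : static_cast<int>(GetLastError());
}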
Interactive services have been deprecated since Vista, but some compatibility remained for a while. That time is over; you just have to write a service plus a helper app that runs in the user session.
You have to specify the output type in your .csproj file by adding:
<PropertyGroup>
  ...
  <OutputType>Exe</OutputType>
  <_FunctionsSkipCleanOutput>true</_FunctionsSkipCleanOutput>
  ...
</PropertyGroup>

Managing the lifetime of a process I don't control

I'm using Chromium Embedded Framework 3 (via CEFGlue) to host a browser in a third-party process via a plugin. CEF spins up various external processes (e.g. the renderer process) and manages the lifetime of these.
When the third-party process exits cleanly, CefRuntime.Shutdown is called and all the processes exit cleanly. When the third-party process exits badly (for example, it crashes), I'm left with CEF executables still running, and this (sometimes) causes problems with the host application, meaning it doesn't start again.
I'd like a way to ensure that, however the host application exits, CefRuntime.Shutdown is called and the user doesn't end up with spurious processes running.
I've been pointed in the direction of job objects (see here) but this seems like it might be difficult to ship in a real solution as on some versions of Windows it requires administrative rights.
I could also set CEF to run in single process mode, but the documentation specifies that this is really for "debugging" only, so I'm assuming shipping this in production code is bad for some reason (see here).
What other options do I have?
Following on from the comments, I've tried passing the PID of the host process through to the client (I can do this by overriding OnBeforeChildProcessLaunch). I've then created a simple watchdog with the following code:
ThreadPool.QueueUserWorkItem(_ => {
    var process = Process.GetProcessById(pid);
    while (!process.WaitForExit(5000)) {
        Console.WriteLine("Waiting for external process to die...");
    }
    Process.GetCurrentProcess().Kill();
});
I can verify in the debugger that this code executes and that the PID I'm passing into it is correct. However, if I terminate the host process I find that the thread simply dies in a way I can't control, and the lines following the while loop are never executed (even if I replace them with a Console.WriteLine, I never see any more messages printed from this thread).
For posterity, the solution suggested by @IInspectable worked, but in order to make it work I had to switch the implementation of the external process to use the non-multi-threaded message loop.
settings.MultiThreadedMessageLoop = false;
CefRuntime.Initialize(mainArgs, settings, cefWebApp, IntPtr.Zero);

Application.Idle += (sender, e) => {
    // Check the parent on every idle tick and pump CEF's message loop manually.
    if (parentProcess.HasExited) Process.GetCurrentProcess().Kill();
    CefRuntime.DoMessageLoopWork();
};

Application.Run();
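For comparison, the job-object route mentioned in the question comes down to something like this native sketch: the host creates a job with the kill-on-close limit and assigns each child process it can get a handle to, so the children are terminated automatically when the host process goes away, cleanly or not:
#include <windows.h>

// Create a job object that kills every process assigned to it as soon as the
// last handle to the job is closed - which happens when the owning process
// exits, whether cleanly or via a crash.
HANDLE CreateKillOnCloseJob()
{
    HANDLE job = CreateJobObjectW(nullptr, nullptr);
    if (!job) return nullptr;

    JOBOBJECT_EXTENDED_LIMIT_INFORMATION limits = {};
    limits.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
    SetInformationJobObject(job, JobObjectExtendedLimitInformation,
                            &limits, sizeof(limits));
    return job;
}

// Usage (assuming the host can obtain a handle to each child process):
//   AssignProcessToJobObject(job, hChildProcess);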

Start and monitor multiple instances of one process in Windows

I have a Windows application of which I need multiple instances running, each with different command-line parameters. The application is quite unstable and tends to crash every 48 hours or so.
Since manually checking for failures and restarting crashed instances isn't something I love doing, I want to write a "manager program" for this. It would launch all the instances and then watch them; if a process crashes, it would be restarted.
On Linux I could achieve this with fork()s and pids, but that obviously isn't available on Windows. So, should I try to implement something around CreateProcess, or is there a better way?
When you call CreateProcess, you are returned a handle to the new process in the hProcess member of the PROCESS_INFORMATION struct that you pass to CreateProcess. You can use this handle to detect when the process terminates.
For instance, you can create another thread and call WaitForSingleObject(hProcess), blocking until the process terminates. Then you can decide whether or not to restart it.
Or you could call GetExitCodeProcess(hProcess, &exitcode) and test exitcode. If it has the value STILL_ACTIVE then your process has not terminated. This approach based on GetExitCodeProcess requires polling.
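A minimal sketch of the first (blocking) approach, with a placeholder command line and a deliberately naive restart policy - a real manager would add back-off, logging, and a way to shut down:
#include <windows.h>
#include <string>

// Launch one instance and block until it exits, restarting it each time.
// A real manager would run one of these loops per instance (e.g. on its own thread).
void RunAndKeepAlive(std::wstring commandLine)
{
    for (;;)
    {
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};

        // CreateProcess may modify the command-line buffer, so pass a writable copy.
        std::wstring cmd = commandLine;
        if (!CreateProcessW(nullptr, &cmd[0], nullptr, nullptr, FALSE,
                            0, nullptr, nullptr, &si, &pi))
            return; // could not start: GetLastError() has details

        WaitForSingleObject(pi.hProcess, INFINITE); // returns when the process dies

        DWORD exitCode = 0;
        GetExitCodeProcess(pi.hProcess, &exitCode); // decide here whether to restart

        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
}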
If it can be run as a daemon, the simplest way to ensure it keeps running is the Non-Sucking Service Manager (NSSM).
It lets applications that were not designed as services run as Win32 services. It will monitor them and restart them if necessary. And the source code is included, if any customization is needed.
All you need to do is define each of your instances as a service, with the required parameters, and it will do the rest.
If you have some kind of security policy restriction and can't use third-party tools, then coding will be necessary. The answer from David Heffernan gives you the appropriate direction.
Or it can be done in batch, VBS or JS without needing anything that doesn't ship with the system; the WMI Win32_Process class should allow you to handle it.

Connecting to an Adobe InDesign console

I have a single instance of InDesign Server running on a Windows 2007 VPS, which exposes a SOAP service on port 8081. It runs as a Windows service and executes both dev and live JSX scripts, depending on the path of the script (we have a dev folder and a live folder).
I am having trouble running a new script and would like to get access to the console of the running service, but I am struggling to find a reference for how to do this in the Adobe PDF docs. I know the script itself is being found, since there are errors in the Windows Event Viewer for a specific code line, but I think it is having trouble locating JSXBIN resources. The error message just lists the variable in question, rather than the explicit path.
I have modified the script to output path information to stdout, but this doesn't get into the Event Log. So, can I get a window on the console of the running service? I don't want to stop the current service, as that is in use for live.
Some ideas I've got from the docs:
InDesignServer -console
InDesignServer -LogToApplicationEventLog
However, I think this executable starts up a new instance, which isn't what I want (either it would choose a new port number, or it would try 8081 and fail to start since the port is in use - I've not tried either for obvious reasons). The flags respectively display stdout in the DOS window and redirect stdout to the Event Log.
In short, I don't think this is possible. I was hesitant to start a new instance on our live server in case it upset anything, but in fact it is quite safe; just ensure that the port you specify is different from your usual one.
InDesignServer -noconsole -port 10001
The -noconsole flag connects stdout and stderr to the current DOS window - -console opens a new one, so it's the former you want.
Aside: it may be worth avoiding -LogToApplicationEventLog, since the process can get disconnected from the console, which makes it fiddly to kill gracefully.

Stopped service does not release its resources?

I'm trying to deploy a patch to a service I created and replace the service file.
For that reason I need to stop the service so the file will be released.
I'm using sc \\remote stop svcname, then I query the service using sc \\remote query svcname until I see that its state is STOPPED.
At this point the service file should be unlocked, and to be on the safe side I also delete the service using sc \\remote delete svcname.
Still, it doesn't seem to release the file and any deletion or change attempt fails.
I know one solution might be polling the file repeatedly, but I want to avoid this method.
Any suggestions?
Windows doesn't guarantee that the process hosting the service terminates when the service is stopped (a single process may host more than one service). It just considers the service stopped once the service has handled the stop request sent to it.
So if the service process has a bug and does not properly release its resources, they may still be locked. I would probably wait a little and then simply terminate the process.
There is also a tool from Microsoft called handle.exe (that is the command-line version; there is also a GUI one) that can list which processes hold a file open. It should be possible to get the same information programmatically, but I am not sure of the exact calls to make (and you need administrator privileges; you have to give them to the tool too). That way you can check whether the file is open and by which process, and then either wait for that process to terminate or force-terminate it.
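A minimal sketch of the "wait a little, then terminate" approach, run locally on the machine hosting the service (OpenProcess cannot reach a remote machine); error handling is omitted and the service name is a placeholder:
#include <windows.h>

// Stop a service, then make sure the process that hosted it is really gone
// so its files are released. Note: if that process hosts other services,
// terminating it kills those as well.
bool StopServiceAndEnsureProcessGone(const wchar_t* serviceName, DWORD graceMs)
{
    SC_HANDLE scm = OpenSCManagerW(nullptr, nullptr, SC_MANAGER_CONNECT);
    SC_HANDLE svc = OpenServiceW(scm, serviceName, SERVICE_STOP | SERVICE_QUERY_STATUS);

    // Grab the hosting process ID before stopping, while it is still registered.
    SERVICE_STATUS_PROCESS ssp = {};
    DWORD needed = 0;
    QueryServiceStatusEx(svc, SC_STATUS_PROCESS_INFO,
                         reinterpret_cast<LPBYTE>(&ssp), sizeof(ssp), &needed);
    DWORD pid = ssp.dwProcessId;

    SERVICE_STATUS status = {};
    ControlService(svc, SERVICE_CONTROL_STOP, &status);

    bool gone = true;
    if (HANDLE proc = (pid != 0) ? OpenProcess(SYNCHRONIZE | PROCESS_TERMINATE, FALSE, pid) : nullptr)
    {
        // Give the process a grace period to exit on its own, then force it.
        if (WaitForSingleObject(proc, graceMs) == WAIT_TIMEOUT)
            gone = TerminateProcess(proc, 1) != 0;
        CloseHandle(proc);
    }

    CloseServiceHandle(svc);
    CloseServiceHandle(scm);
    return gone;
}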
