WNetGetUniversalName failing when called from a scheduled task - windows

I have a (win32) program that is run through a scheduled task.
When run, my software should map a number of local drive letters to UNC resources, verify that the mappings have been successful, run a few other tasks and then unmap the drives.
When running under the context of a local user, all works fine. However, when I run it through the system task scheduler, the verify task fails.
The verify task takes the drive letter, checks whether the drive is a network drive (through GetDriveType) and then, if the drive is of type DRIVE_REMOTE, calls WNetGetUniversalName and compares the result with the expected mapping.
When run from a regular user context, this works. But when the process is called through the task scheduler, WNetGetUniversalName fails with error 87: The parameter is incorrect.
After trying to isolate the issue, I came to the following conclusions:
The issue is not linked to user rights: even when the user is made a member of both the local administrators group and the domain administrators group, the error remains.
The parameters I pass to the functions are ALWAYS the same: the drive letter concatenated with ":\".
I have tried repeating the call after a short wait (100ms): same symptoms.
The mapping (made through WNetAddConnection2) actually succeeds.
The issue is not dependent on where the executable is located: the same thing happens whether it's on the local machine or run from a UNC path.
The issue occurs whether the scheduled task has been set to "run with highest privileges" or not.
Here is the exact call I use:
APIResult := WNetGetUniversalName(PWideChar(pathToCheck), UNIVERSAL_NAME_INFO_LEVEL, @RemoteNameInfo, Size);
I'm out of ideas about what to check next.
Edit: For now, I have reverted to a different behavior: every drive's status is checked (GetDriveType); if it's a network drive, it's unmapped, checked again and then mapped. This seems to work, but it's slower (of course) and it feels less secure.
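For what it's worth, one way to cross-check a mapping without calling WNetGetUniversalName at all is WMI: Win32_LogicalDisk exposes the UNC target of a mapped letter in its ProviderName property. A minimal PowerShell sketch of the same verify step (the drive letter and UNC are placeholders):

# DriveType 4 = network drive; ProviderName holds the UNC the letter resolves to.
$disk = Get-CimInstance Win32_LogicalDisk -Filter "DeviceID='Z:'"
if ($disk -and $disk.DriveType -eq 4) {
    if ($disk.ProviderName -eq '\\server\share') {
        Write-Output 'Mapping verified.'
    } else {
        Write-Output "Unexpected target: $($disk.ProviderName)"
    }
}

Keep in mind that drive mappings are per logon session, so whichever check you use has to run in the same session that created the mapping.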

Related

How to download a file using BITS in a Packer provisioner?

I'm writing a provisioning script in PowerShell for a Packer-built Windows image on a CI pipeline. This process involves downloading several large files. I'm under the impression that BITS is faster than Invoke-WebRequest, so I've decided to use BITS to asynchronously download these large files.
The problem is that BITS will only process jobs for users that are interactively logged on...
BITS transfers files only when the job's owner is logged on to the computer (the user must have logged on interactively). BITS does not support the RunAs command.
...unless the job was submitted by a service account.
You can use BITS to transfer files from a service. The service must use the LocalSystem, LocalService, or NetworkService system account. These accounts are always logged on; therefore, jobs submitted by a service using these accounts always run.
But even then, there's a wrinkle:
If a service running under a system account impersonates the user before calling BITS, BITS responds as it would for any user account (for example, the user needs to be logged on to the computer for the transfer to occur).
This is an issue because the provisioning script runs as the Administrator account, which is not a service account and therefore must be logged in interactively to use BITS. This happens to be Packer's behavior, so I can't change it. (Edit: I was wrong, I can change this. See my final answer.) How can I do the following in one PowerShell script?
Submit a BITS job as Administrator using a service account's credentials. I assume I need to pass something in to Start-BitsTransfer's -Credential parameter?
Store the BITS job in a local variable (jobs will be started at different places in the script)
Await the completion of the BITS job so I can start using the file I downloaded (jobs will be awaited at different places in the script; a sketch of this submit/store/await pattern follows the list)
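A minimal sketch of that submit/store/await pattern with the stock BITS cmdlets (the URL and paths are placeholders, and this assumes the job owner meets the logon requirements quoted above):

Import-Module BitsTransfer   # needed on older PowerShell versions

# Submit the job asynchronously and keep the BitsJob object for later.
$job = Start-BitsTransfer -Source 'https://example.com/big.iso' `
    -Destination 'C:\Downloads\big.iso' -Asynchronous

# ...other provisioning work...

# Await completion at the point where the file is actually needed.
while ($job.JobState -in 'Queued', 'Connecting', 'Transferring') {
    Start-Sleep -Seconds 5
}
if ($job.JobState -eq 'Transferred') {
    Complete-BitsTransfer -BitsJob $job   # commits the temp file to its final name
} else {
    Remove-BitsTransfer -BitsJob $job     # Error/TransientError: discard the job
    throw "BITS transfer failed in state $($job.JobState)"
}

Note that -Credential on Start-BitsTransfer authenticates you to the remote server; it does not change which account owns the job.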
You could use psexec to run a secondary script with SYSTEM rights from the administrator context, and have the primary script check the exit code of the psexec process to confirm it executed successfully.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/start-process?view=powershell-7.1
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec
https://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
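For illustration, a rough sketch of that approach (the psexec location, script path, and exit-code convention are assumptions):

# Run the download script as SYSTEM via psexec and propagate its exit code.
$proc = Start-Process -FilePath 'psexec.exe' -Wait -PassThru -NoNewWindow `
    -ArgumentList '-accepteula', '-s', 'powershell.exe',
                  '-ExecutionPolicy', 'Bypass', '-File', 'C:\provision\downloads.ps1'
if ($proc.ExitCode -ne 0) {
    throw "Secondary script failed with exit code $($proc.ExitCode)"
}

psexec's -s switch runs the command as the SYSTEM account, which satisfies the service-account requirement quoted in the question.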
It turns out there's a solution to this, although it's specific to Packer. I didn't mention much about my use of it because I didn't think it was that important.
Contrary to my initial belief, Packer's PowerShell provisioner lets you run the provisioning script with elevated privileges as any user...
elevated_user and elevated_password (string) - If specified, the PowerShell script will be run with elevated privileges using the given Windows user.
provisioner "powershell" {
elevated_user = "Administrator"
elevated_password = build.Password
}
...including service users.
If you specify an empty elevated_password value then the PowerShell script is run as a service account. For example:
provisioner "powershell" {
elevated_user = "SYSTEM"
elevated_password = ""
}
After adjusting my Packer template's provisioner block accordingly, I can now confirm that Start-BitsTransfer and friends work as expected. No need to pass complicated arguments or play tricks with sessions.

How to run a VB6 app from a scheduled task without users being able to run it

We have a legacy VB6 application that automatically emails reports. It runs from a scheduled task on a server. Occasionally a user will run the exe - it's in a folder that we can't lock them out of, and it has to remain in that folder for reasons too complicated to go into here. Is there a way to prevent users from running the exe while still letting it run from the scheduled task? I can modify the source code for the exe, so that's an option if someone can help me figure out how.
I'm going to call your existing app AppChild and a new VB6 (or other program language) program AppParent.
Modify AppChild to test for a command line parameter at either Sub Main() or at the first form loaded in the Form_Load() event. If the command line parameter isn't there, AppChild terminates.
AppParent would be in a location not accessible to the other users. The Scheduled task runs AppParent which runs AppChild and passes the required command line parameter. This could be secured somewhat by passing a calculated hash and decoding it in AppChild if needed.
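To make the calculated-hash idea concrete, here is a sketch of AppParent's side in PowerShell (the shared secret and the hour-granularity scheme are only illustrations; AppChild would recompute the same token in Sub Main() and terminate on a mismatch):

# Derive a token that is only valid for the current hour and hand it to AppChild.
$secret   = 'shared-secret'                       # placeholder, known to AppChild too
$material = $secret + (Get-Date -Format 'yyyyMMddHH')
$sha      = [System.Security.Cryptography.SHA256]::Create()
$token    = [Convert]::ToBase64String($sha.ComputeHash(
                [Text.Encoding]::UTF8.GetBytes($material)))
Start-Process 'C:\SharedFolder\AppChild.exe' -ArgumentList $token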
Or, if the users don't have access to the Scheduled Tasks, you could just run AppChild, passing the required parameter from the Scheduled Task. If the users do have access to the Scheduled Task this won't work, because they could then see the passed parameter and create a shortcut which passes the required parameter.
You didn't state which OS the server is running, but you may have problems using network resources if you try to run the Scheduled Task without a logged-in user. The Task Scheduler got a major security update to prevent hackers from running tasks without a logged-in user. Essentially, network resources, e.g. email, are not available unless a user is logged in.
https://technet.microsoft.com/en-us/library/cc722152(v=ws.11).aspx
The only way I found around that problem is to run a machine with a user with the correct permissions logged in all the time.
Are you sure you cannot lock the user out?
You could restrict access to the folder so that the user cannot access it and set up the scheduled task to use an account with access to the folder.
Although the users can't be locked out of the folder (perhaps the reports end up in there?), in Windows you can set permissions on a per-file basis. Make a new user that has full rights (the same as your users). Schedule the VB6 app to run as that user. Then remove the regular users' rights to see the app by changing the permissions on just the VB6 exe; a sketch follows.
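For instance, with PowerShell's ACL cmdlets (paths and account names are placeholders; test on a copy first, since a wrong entry can lock out the task account as well):

$exe = 'C:\Reports\EmailReports.exe'   # the VB6 exe
$acl = Get-Acl $exe
# Stop inheriting the folder's permissive entries and drop them from this file.
$acl.SetAccessRuleProtection($true, $false)
# Re-grant access only to the dedicated task account and to administrators.
foreach ($id in 'DOMAIN\ReportTask', 'BUILTIN\Administrators') {
    $acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule(
        $id, 'ReadAndExecute', 'Allow')))
}
Set-Acl -Path $exe -AclObject $acl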

Network address inaccessible if run by the Task Scheduler

I have a C# program that does this:
Directory.Exists(@"\\PcName\SomeDir");
and prints whether that path is accessible (exists) or not.
This is the problem: I run this app via the Task Scheduler right after log-in (auto-log-in user), using the "On Login" trigger, and it returns false, although that path IS accessible! (I manage to open that path using explorer.exe a few seconds before my app starts.) It is marked to:
Run with highest privileges
If I run it manually it runs OK, even when I right click the task and select "Run" via the Task Scheduler!
If I deselect "Run with highest privileges", there is no problem, but it must be run with highest privileges (it accesses the registry and a whole lot of other stuff)
It runs under same user if I run it manually or automatically by the task scheduler - I made sure using Process Explorer
It happens on certain machines (Win8 x64, admin-privileges user with no password, auto-log-in, workgroup machines, not domain), but not on others (same: Win8 x64, admin-privileges user with no password, auto-log-in, workgroup machines, not domain).
Even if I insert Thread.Sleep(TimeSpan.FromMinutes(1)); or add a 1-minute delay to the task (in the Task Scheduler), it still says the path does not exist.
Problem solved. I had to "impersonate", although I'm not really sure why: if I use the scheduler without restarting, it accesses the remote share with exactly the same settings, one to one. Only after a restart does it fail to access the share (and a moment later, again with the same settings, it is able to access it).
The only difference in running it immediately after restart is that the app-process’s parent is services.exe and not explorer.exe as usual. My guess is that it has to log in immediately after the restart, so it must use services.exe (explorer.exe is not supposed to exist at that stage, if I'm not mistaken).
Below is the solution in C#, roughly, to whom it may concern:
// LogonUser is a "P/Invoked" API (assumed here to be a wrapper returning a
// SafeHandle, hence the DangerousGetHandle() call below):
// http://www.pinvoke.net/default.aspx/advapi32/LogonUser.html
// This solution works only with LOGON32_LOGON_NEW_CREDENTIALS as the 4th parameter:
using (var h = LogonUser(username, domain, password,
                         LogonType.LOGON32_LOGON_NEW_CREDENTIALS,
                         LogonProvider.LOGON32_PROVIDER_DEFAULT))
{
    using (var winImperson8Ctx = WindowsIdentity.Impersonate(h.DangerousGetHandle()))
    {
        return Directory.Exists(path); // now works fine...
    }
}

Batch file called by scheduled task throws error when scheduled, runs fine when double clicked

I have a batch file that maps a networked drive. About a week or so ago the password expired, so the program calling the batch file started throwing errors.
I've updated the password in the batch file, and when I double click on the batch file, the drive maps fine. However, when the scheduled task kicks off, I get the following error:
Logon failure: unknown user name or bad password.
Anyone seen this before? I've tried recreating the scheduled task, but it doesn't seem to make any difference.
EDIT
I've updated the properties of the scheduled task, which isn't the problem. The problem seems to be the username and password in the batch file. The strange thing is if I log on interactively and double click the executable, everything works perfectly.
The last time the job ran it threw a "semaphore timeout period has expired" error. I've never seen this particular error before, but it seems like it was actually logged on and trying to copy files when this happened.
EDIT
I've revised my code to make it as simple as possible. I'm using a batch file to map the drive, then using code to transfer the files. I still run into the same issue - it works fine when I double click the batch file, but once I throw Scheduler into the picture, it throws a "Bad username or invalid password" error.
Occasionally when I'm trying to run the file by double clicking on it, I get a "Could not find part of the path" error. This says to me the drive mapping actually worked but something failed when it was trying to copy. (Most of the time, testing by double clicking works fine)
The username and password associated with the task when you created it are no longer valid or have changed.
This generally occurs when creating the task and forgetting to select the option not to store the password. If your password is set to expire, you will hit this problem every time you have to reset it. Make sure the task is configured as in the screenshot (not reproduced here), with the "Do not store password" option checked.
It sounds like the username and/or password associated with the scheduled task is no longer correct. The batch file is likely OK, you just need to change the properties of the scheduled task.
I ran into something similar while testing a new PowerShell script we wrote to create a scheduled task that backs up to one or more network locations. I went through a number of iterations, and when I decreased from two network locations to one, the scheduled task stopped working: individual steps in the called script gave "Logon failure: unknown user name or bad password" errors, even though the same arguments worked when I copied them and ran them from the command line.
After reading this question and Tim's comment, I tried deleting the scheduled task and re-creating it. After that it worked. I would concur that the scheduled task likely cached something.
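If you would rather script the re-creation than click through the GUI, something like this works on Windows 8/Server 2012 and later (task name, paths, and account are placeholders):

# Recreate the task from scratch so no stale credential blob survives.
Unregister-ScheduledTask -TaskName 'NightlyCopy' -Confirm:$false -ErrorAction SilentlyContinue
$action  = New-ScheduledTaskAction -Execute 'C:\Scripts\map-and-copy.bat'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'NightlyCopy' -Action $action -Trigger $trigger `
    -User 'DOMAIN\svc-copy' -Password 'the-new-password' -RunLevel Highest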
From:
https://danblee.com/log-on-as-batch-job-rights-for-task-scheduler/
Go to the Start menu.
Run.
Type secpol.msc and press Enter.
The Local Security Policy manager opens.
Go to Security Settings – Local Policies – User Rights Assignment node.
Double click Log on as a batch job on the right side.
Click Add User or Group…
Select the user and click OK.

Service doesn't behave the same as command line

I am running on a Windows Server 2003. This is my problem:
I wrote a Perl script to automate the copy of some files from my Server machine to some network drives. I am using xcopy to copy the files. My problem is the permissions.
If I run the script from the command line, it works, all the copies are successful.
If I try to run the script using a service, all the copies fail. This service is a program I wrote that takes the script and runs it; under the hood, all it does is call the C function system() to run the same command that I can run from the command line.
I have tried many variations of this to figure out what is wrong with it but I can't see why the service would not run the same way I run it from the command line.
I set up the service to run as the same user I am using from the command line.
I also tried to map the network drives as the user that has write permission, but the result is the same: run manually, the script works; from the service, it doesn't.
Any suggestion is appreciated.
Thanks
Tony
The service may be running as the system and not have access to the network drives. In the Service settings, change the service to run under your account (or an account with the relevant permissions/mappings).
When the service runs, it uses whatever credentials you specify in the Services manager of Windows. The default, LOCAL SERVICE, probably does not have permission to access the resources to be copied.
Create a new user account with the minimum set of permissions needed to perform the copy and configure your service to run under that account.
I did figure out the issue (I think), and that matches what I later found in another post:
https://serverfault.com/questions/4623/windows-can-i-map-a-network-drive-for-a-service-account
"Persistent drive mappings are only restored during an interactive login, which the service does not use. I believe the only way to get a service to use a network drive is for that service to map the drive itself, or alternatively for it to use a UNC path instead of a mapped drive."
What I did was map the drive from within the service, and that seems to work. It turns out that if I map the drive and save the credentials, I can later access the drive without having to map it again. I don't know why this approach works, though.
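In practice that boils down to something like the following, run from inside the service's own session (server, share, and account are placeholders):

# Option 1: map in the service's logon session, copy, then clean up.
net use Z: \\fileserver\share /user:DOMAIN\svc-copy the-password /persistent:no
xcopy C:\outbox\* Z:\inbox\ /E /Y
net use Z: /delete

# Option 2: skip the drive letter entirely; xcopy accepts UNC paths directly.
xcopy C:\outbox\* \\fileserver\share\inbox\ /E /Y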
-Thanks everybody for your help.
Tony
