I have a script for my domain stored on the Active Directory server. Every machine on the domain has a task that, when fired, calls this script.
Running the task from the AD server works; running the task from another machine doesn't. However, running the triggered command manually from cmd on the remote computer works.
Could anyone shed some light on this? Basically I call it so that the task's action is set up like this:
    Action:    PowerShell.exe
    Arguments: -NoProfile -ExecutionPolicy Bypass -File "\\<NameOfADServer>\C$\Tasks\script.ps1" "Argument 1" "Argument 2"
Running as SYSTEM is probably your issue - it won't have any access outside of the PC it's running on.
When you run it manually, you'll have that access.
There are several problems here.
You're running the task as the local SYSTEM account. SYSTEM generally does not have access to any network resources.
You're using the administrative share (\\<servername>\C$) to share the script. Only users that have Administrator access to the server can access the administrative shares. Administrative shares are heavily restricted by design and you cannot modify the access on them.
My guess is that the script works when you run it manually because it's using the current user's credentials for network access, but don't quote me on that.
The simplest solution with the least amount of change is to do this:
1. Create a group in Active Directory. Add the computer accounts (or, preferably, groups containing the computer accounts) that you want to be able to run the script. If you really want the SYSTEM account on every computer in the domain to be able to run the script, add the "Domain Computers" group.
2. Create a folder on the server and put the script in it. Don't put anything in this folder that you don't want your users to read. Grant the group created above the "Read" NTFS permission on the folder.
3. Share the folder out. Grant the group you just created "Full Control" share access. If you want, you can make it a hidden share by adding a dollar sign to the end of the share name.
4. Update your scheduled tasks to use \\<servername>\<sharename>\script.ps1. (A sketch of steps 1-3 follows.)
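On Server 2012 or later, steps 1-3 might be scripted roughly like this; the group, domain, folder, and share names are placeholders I've assumed:

    # Sketch only - group, domain, and path names are assumptions.
    Import-Module ActiveDirectory
    New-ADGroup -Name 'ScriptRunners' -GroupScope Global -GroupCategory Security
    Add-ADGroupMember -Identity 'ScriptRunners' -Members 'Domain Computers'

    # Create the folder and grant the group Read via NTFS;
    # (OI)(CI) makes the grant inherit to files and subfolders.
    New-Item -Path 'C:\Scripts' -ItemType Directory -Force
    icacls 'C:\Scripts' /grant 'CONTOSO\ScriptRunners:(OI)(CI)R'

    # Share it out; the trailing $ hides the share from casual browsing.
    New-SmbShare -Name 'Scripts$' -Path 'C:\Scripts' -FullAccess 'CONTOSO\ScriptRunners'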
This is almost certainly not the best method to accomplish what you're actually trying to do, but this is probably the best way to use scheduled tasks running scripts on a network share with the SYSTEM account.
Related
I have an issue with a remote site whose printer occasionally errors out. The current solution is to restart the print spooler on the server.
I am trying to create a simple PowerShell script that allows a non-admin user to restart the Spooler service without being able to see the admin credentials or edit the script.
We are replacing the server in a couple of months, so the configuration will be fixable then. We just need a temporary workaround so the user doesn't need to email me every couple of days when the spooler requires resetting.
Ideas?
1. Create a scheduled task with the admin credentials cached. Under Actions, have the task run the privileged PowerShell script: powershell c:\Path\MyScript.ps1.
2. Assign no schedule (i.e. delete all triggers).
3. Change the permissions on the task's XML file under C:\Windows\System32\Tasks to allow read/execute by non-admins.
4. Create a user-facing CMD script (or even a shortcut) that runs the task: schtasks /Run /TN MyTaskName /S Server. (A sketch of steps 1-2 follows.)
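On Server 2012 or later, the first two steps might be scripted like this; the task name, script path, and account are placeholders I've assumed:

    # Sketch only - task name, path, and credentials are assumptions.
    $action = New-ScheduledTaskAction -Execute 'powershell.exe' `
        -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Path\MyScript.ps1'
    # Registering with -User/-Password caches the credentials in Task Scheduler;
    # omitting -Trigger means the task only runs on demand.
    Register-ScheduledTask -TaskName 'RestartSpooler' -Action $action `
        -User 'CONTOSO\SpoolerAdmin' -Password 'P@ssw0rd!'

    # User-facing runner (save as RestartSpooler.cmd):
    # schtasks /Run /TN RestartSpooler /S Server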
There are other ways to isolate the credentials, but this seems to be the easiest.
I'm writing a provisioning script in PowerShell for a Packer-built Windows image on a CI pipeline. This process involves downloading several large files. I'm under the impression that BITS is faster than Invoke-WebRequest, so I've decided to use BITS to asynchronously download these large files.
The problem is that BITS will only process jobs for users that are interactively logged on...
BITS transfers files only when the job's owner is logged on to the computer (the user must have logged on interactively). BITS does not support the RunAs command.
...unless the job was submitted by a service account.
You can use BITS to transfer files from a service. The service must use the LocalSystem, LocalService, or NetworkService system account. These accounts are always logged on; therefore, jobs submitted by a service using these accounts always run.
But even then, there's a wrinkle:
If a service running under a system account impersonates the user before calling BITS, BITS responds as it would for any user account (for example, the user needs to be logged on to the computer for the transfer to occur).
This is an issue because the provisioning script runs as the Administrator account, which is not a service account and therefore must be logged in interactively to use BITS. I initially assumed this was fixed Packer behavior that I couldn't change; I was wrong, and I can change it. See my final answer. How can I do the following in one PowerShell script?
Submit a BITS job as Administrator using a service account's credentials. I assume I need to pass something in to Start-BitsTransfer's -Credential parameter?
Store the BITS job in a local variable (jobs will be started at different places in the script)
Await the completion of the BITS job so I can start using the file I downloaded (jobs will be awaited at different places in the script)
You could use PsExec to run a secondary script with SYSTEM rights from the Administrator context, and have the primary script check the exit code of the psexec process to confirm it executed successfully.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/start-process?view=powershell-7.1
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec
https://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
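A rough sketch of that approach; the secondary script path is an assumption, and psexec is expected to be on the PATH:

    # Sketch only - the script path is an assumption.
    $proc = Start-Process -FilePath 'psexec.exe' -Wait -PassThru -ArgumentList @(
        '-accepteula', '-s',                 # -s runs the command as SYSTEM
        'powershell.exe', '-NoProfile', '-ExecutionPolicy', 'Bypass',
        '-File', 'C:\provision\download.ps1'
    )
    # psexec propagates the remote process's exit code.
    if ($proc.ExitCode -ne 0) {
        throw "SYSTEM download script failed with exit code $($proc.ExitCode)"
    }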
It turns out there's a solution to this, although it's specific to Packer. I didn't mention much about my use of it because I didn't think it was that important.
Contrary to my initial belief, Packer's PowerShell provisioner lets you run the provisioning script with elevated privileges as any user...
elevated_user and elevated_password (string) - If specified, the PowerShell script will be run with elevated privileges using the given Windows user.
provisioner "powershell" {
elevated_user = "Administrator"
elevated_password = build.Password
}
...including service users.
If you specify an empty elevated_password value then the PowerShell script is run as a service account. For example:
provisioner "powershell" {
elevated_user = "SYSTEM"
elevated_password = ""
}
After adjusting my Packer template's provisioner block accordingly, I can now confirm that Start-BitsTransfer and friends work as expected. No need to pass complicated arguments or play tricks with sessions.
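For reference, a minimal sketch of the asynchronous pattern that now works under SYSTEM; the source URL and paths are placeholders:

    # Sketch only - URL and destination are assumptions.
    $job = Start-BitsTransfer -Source 'https://example.com/big-file.iso' `
        -Destination 'C:\Downloads\big-file.iso' -Asynchronous

    # ... other provisioning work runs here ...

    # Await completion (a production script should also handle the Error
    # states), then finalize the file on disk.
    while ($job.JobState -in 'Queued', 'Connecting', 'Transferring') {
        Start-Sleep -Seconds 5
    }
    Complete-BitsTransfer -BitsJob $job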
We have a legacy VB6 application that automatically emails reports. It runs from a scheduled task on a server. Occasionally a user will run the exe - it's in a folder that we can't lock them out of, and it has to remain in that folder for reasons too complicated to go into here. Is there a way to prevent users from running the exe while still letting it run from the scheduled task? I can modify the source code for the exe, so that's an option if someone can help me figure out how.
I'm going to call your existing app AppChild and a new VB6 (or other program language) program AppParent.
Modify AppChild to test for a command line parameter at either Sub Main() or at the first form loaded in the Form_Load() event. If the command line parameter isn't there, AppChild terminates.
AppParent would be in a location not accessible to the other users. The Scheduled task runs AppParent which runs AppChild and passes the required command line parameter. This could be secured somewhat by passing a calculated hash and decoding it in AppChild if needed.
Or, if the users don't have access to the Scheduled Tasks, you could just run AppChild, passing the required parameter from the Scheduled Task. If the users do have access to the Scheduled Task this won't work, because they could then see the passed parameter and create a shortcut which passes the required parameter.
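A PowerShell sketch of the calculated-hash idea, just to show its shape; the secret and the date-based token scheme are assumptions, and AppChild would verify by recomputing the same token:

    # Sketch only - secret, path, and token scheme are assumptions.
    $secret = 'replace-with-a-shared-secret'
    $seed   = (Get-Date -Format 'yyyy-MM-dd') + $secret
    $sha    = [System.Security.Cryptography.SHA256]::Create()
    $token  = -join ($sha.ComputeHash([Text.Encoding]::UTF8.GetBytes($seed)) |
                     ForEach-Object { $_.ToString('x2') })
    # AppChild recomputes today's token from the shared secret and compares.
    Start-Process -FilePath 'C:\Apps\AppChild.exe' -ArgumentList $token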
You didn't state which OS the server is running, but you may have problems using network resources if you try to run the Scheduled Task without a logged-in user. Task Scheduler got a major security update to prevent hackers from running tasks without a logged-in user. Essentially, network resources (e.g. email) are not available unless a user is logged in.
https://technet.microsoft.com/en-us/library/cc722152(v=ws.11).aspx
The only way I found around that problem is to run a machine with a user with the correct permissions logged in all the time.
Are you sure you cannot lock the user out?
You could restrict access to the folder so that the user cannot access it and set up the scheduled task to use an account with access to the folder.
Although the users can't be locked out of the folder (perhaps the reports end up in there?), in Windows you can set permissions on a per-file basis. Make a new user that has full rights (the same as your users). Schedule the VB6 app to run as that user. Then remove the regular users' rights to see the app by changing the permissions on just the VB6 exe.
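For example, from an elevated prompt, something along these lines; the account and path are placeholders:

    # Sketch only - account and path are assumptions.
    # Stop inheriting permissions, then grant only the task account and admins.
    icacls 'C:\Reports\AppChild.exe' /inheritance:r `
        /grant 'CONTOSO\ReportTask:(RX)' `
        /grant 'BUILTIN\Administrators:(F)'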
I have a PHP script that runs a Powershell Stop-Process command through shell_exec(). PHP runs as IUSR. When I run the script, I receive an access denied error message. If I run the command in PowerShell using my Administrator account, it works as expected.
How do I grant IUSR the ability to execute Stop-Process in Powershell?
I wasn't able to find a solution to grant IUSR the specific privileges to execute Stop-Process, but I was able to get around this by changing the "Anonymous Authentication" user associated with the kill script from IUSR to Administrator.
In IIS 8.5, go to Sites->My Site->Folder Name. On the main panel, click on Authentication under IIS. Right click on Anonymous Authentication and then click on Edit.
You can set the "Anonymous Authentication" value at any level of your IIS app; from the site level to the sub-directory level. I recommend only changing the value from IUSR to Administrator on the directory that actually hosts your kill script. Changing it for the whole site might create problems for other parts of the application.
I've seen some information that suggests if you add a limited user to the Performance Monitor Users group and grant it debug privileges, it will be able to terminate processes.
You might consider something a bit less risky though, like running another web app as a user with those rights, one that can only be accessed from the local machine. Then make your PHP app do a web request to the internal app to do its killin'.
If you're trying to kill only a specific process this lets you further limit the impact because the internal app could be designed to only kill that one thing.
Another way to achieve similar separation: have a scheduled task that looks for a file with specific content in a specific directory; when it sees the file, it kills the process and deletes the file. IUSR can be given permission to create files in that directory as a way to trigger this. This method is very easy to implement, but it isn't synchronous.
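A minimal sketch of the watcher side, run periodically by the scheduled task; the trigger path and process name are assumptions:

    # Sketch only - trigger path and target process are assumptions.
    $trigger = 'C:\KillRequests\restart-spooler.txt'
    if (Test-Path $trigger) {
        Stop-Process -Name 'SomeProcess' -Force -ErrorAction SilentlyContinue
        Remove-Item $trigger -Force
    }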
I'm trying to create a scheduled task in a Group Policy that runs a script that lives on the domain periodically.
I understand that storing the password in GP is a no-no, so I'm avoiding that. However, it seems like there is no way to deploy a scheduled task that can run with access to the network.
I tried the "System" account, that failed with access denied. I also tried using the "Do not store password" setting with a named account, which also prevents network access.
The scripts live under \\domain\NETLOGON, which grants full read access to Authenticated Users.
Is there any way to accomplish this without having to manually install the task on every server and provide a named service account?
This is a Windows 2012 server domain with about 20 servers.
I ended up getting the "System" account to work correctly. SYSTEM authenticates to network resources as the computer's domain account, which falls under Authenticated Users, so it can read the scripts from NETLOGON.