I have an issue with a remote site whose printer occasionally errors out. The current solution is to restart the print spooler on the server.
I am trying to create a simple PowerShell script that lets a non-admin user restart the Spooler service without being able to see the admin credentials or edit the script.
We are replacing the server in a couple of months, so the configuration will be fixable then. We just need a temporary workaround so the user doesn't need to email me every couple of days when the spooler requires resetting.
Ideas?
Create a scheduled task with the admin credentials cached. Under Actions, have the task run the privileged PowerShell script: powershell.exe C:\Path\MyScript.ps1. Assign no schedule (i.e., delete all triggers). Change the permissions on the task's XML file under C:\Windows\System32\Tasks to allow read/execute by non-admins. Create a user-facing CMD script (or even a shortcut) that runs the task: schtasks /Run /S Server /TN MyTaskName.
There are other ways to isolate the credentials, but this seems to be the easiest.
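For concreteness, here is a minimal sketch of the pieces, using placeholder names (SpoolerRestart, C:\Tasks\Restart-Spooler.ps1, DOMAIN\AdminUser):

# C:\Tasks\Restart-Spooler.ps1 -- the privileged script the task runs
Restart-Service -Name Spooler -Force

# Register the task once, as an admin on the server (the credentials get cached);
# then delete the trigger in Task Scheduler so it only runs on demand
schtasks /Create /TN SpoolerRestart /SC ONCE /ST 00:00 /RU DOMAIN\AdminUser /RP AdminPassword /TR "powershell.exe -NoProfile -File C:\Tasks\Restart-Spooler.ps1"

# The user-facing CMD script or shortcut then only needs:
schtasks /Run /S Server /TN SpoolerRestart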
I'm writing a provisioning script in PowerShell for a Packer-built Windows image on a CI pipeline. This process involves downloading several large files. I'm under the impression that BITS is faster than Invoke-WebRequest, so I've decided to use BITS to asynchronously download these large files.
The problem is that BITS will only process jobs for users that are interactively logged on...
BITS transfers files only when the job's owner is logged on to the computer (the user must have logged on interactively). BITS does not support the RunAs command.
...unless the job was submitted by a service account.
You can use BITS to transfer files from a service. The service must use the LocalSystem, LocalService, or NetworkService system account. These accounts are always logged on; therefore, jobs submitted by a service using these accounts always run.
But even then, there's a wrinkle:
If a service running under a system account impersonates the user before calling BITS, BITS responds as it would for any user account (for example, the user needs to be logged on to the computer for the transfer to occur).
This is an issue because the provisioning script runs as the Administrator account, which is not a service account and therefore must be logged in interactively to use BITS. This happens to be Packer's behavior, and I originally thought I couldn't change it. (It turns out I was wrong; I can change it. See my answer below.) How can I do the following in one PowerShell script?
Submit a BITS job as Administrator using a service account's credentials. I assume I need to pass something to Start-BitsTransfer's -Credential parameter?
Store the BITS job in a local variable (jobs will be started at different places in the script)
Await the completion of the BITS job so I can start using the file I downloaded (jobs will be awaited at different places in the script)
You could use psexec to run a secondary script with SYSTEM rights from the Administrator context, and have the primary script check the exit code of the psexec process to confirm it executed successfully.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/start-process?view=powershell-7.1
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec#:~:text=PsExec%20is%20a%20light-weight,to%20manually%20install%20client%20software.
https://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
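A minimal sketch of that approach, assuming psexec.exe is on the PATH and the secondary script lives at C:\provision\elevated.ps1 (both placeholders):

# Run the secondary script as SYSTEM and wait for it to finish
$p = Start-Process -FilePath 'psexec.exe' -Wait -PassThru `
    -ArgumentList '-accepteula -s powershell.exe -NoProfile -File C:\provision\elevated.ps1'
# psexec propagates the exit code of the process it started
if ($p.ExitCode -ne 0) { throw "Elevated script failed with exit code $($p.ExitCode)" }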
It turns out there's a solution to this, although it's specific to Packer. I didn't mention much about my use of it because I didn't think it was that important.
Contrary to my initial belief, Packer's PowerShell provisioner lets you run the provisioning script with elevated privileges as any user...
elevated_user and elevated_password (string) - If specified, the PowerShell script will be run with elevated privileges using the given Windows user.
provisioner "powershell" {
elevated_user = "Administrator"
elevated_password = build.Password
}
...including service users.
If you specify an empty elevated_password value then the PowerShell script is run as a service account. For example:
provisioner "powershell" {
elevated_user = "SYSTEM"
elevated_password = ""
}
After adjusting my Packer template's provisioner block accordingly, I can now confirm that Start-BitsTransfer and friends work as expected. No need to pass complicated arguments or play tricks with sessions.
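For reference, the async pattern then looks roughly like this inside the provisioning script (the source URL and destination path are placeholders):

# Start the download asynchronously and keep the job in a variable
$job = Start-BitsTransfer -Source 'https://example.com/big.iso' -Destination 'C:\Windows\Temp\big.iso' -Asynchronous

# ... other provisioning work ...

# Later, wait for the job to finish and finalize the file
while ($job.JobState -in 'Queued', 'Connecting', 'Transferring') {
    Start-Sleep -Seconds 5
}
if ($job.JobState -eq 'Transferred') {
    Complete-BitsTransfer -BitsJob $job   # renames the temp file into place
} else {
    throw "BITS job ended in state $($job.JobState)"
}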
I am running a Jenkins server as a service on a Windows 10 computer. In one of the Jenkins jobs I have to perform tests using a COM application. The same computer is also used by the developers in their daily work over RDP, and the Jenkins job in question runs at night when no developer is using the machine. But if no user is logged in on the computer or using RDP, the script in the job fails to start the COM application with the following message:
The server process could not be started because the configured
identity is incorrect. Check the username and password.
I found that the issue seems to be that the identity for the COM application is taken from the current interactive user, and if there is none, it fails; see:
https://support.microsoft.com/en-my/help/305761/com-server-application-that-uses-interactive-user-identity-fails-to-lo
I can't seem to solve my issue. I see two options:
1. Make sure that a user is logged in when the job is executed
2. Figure out how to run the COM application without an interactive user
For option 1, I see the following solutions and why they do not work:
Autologin on Windows start and leave the user logged in: Will not work, since we use the computer in our daily work through RDP, which means the locally logged-in user will be kicked out, as we are only allowed one session at a time.
Log in using RDP and then exit using the command tscon.exe 0 /dest:console, which leaves the session open: Will not work, since there are 15 people on the team using that machine over RDP, and people will forget to use this command when they log off at the end of the day.
For option 2, I am unable to find a way to do this.
Can I schedule a user in Windows to be automatically logged in before the job starts? Can I use a second computer and schedule an RDP session to the first computer before the job is executed?
Since nobody was able to provide a good solution, I will post my workaround as an answer and possible solution. What I ended up doing was using a second computer (running Windows) and scheduling a task on that computer that, every night (before the Jenkins job starts), opens an RDP session to the computer running Jenkins. This way the Jenkins job, and the COM application, has an active user session it can use.
This is how I achieved this:
1. Log in to the second computer (i.e. the one not running Jenkins), open the RDP (Remote Desktop Connection) dialog, and click Show Options.
2. Enter the details for the first computer (i.e. the one running Jenkins). Make sure to uncheck Always ask for credentials (you will need to save the credentials to be able to automate this).
3. Save the configuration to an .rdp file using Save As...
4. IMPORTANT: Press Connect to connect to the first computer, enter the password, and make sure to save it. Also accept any certificates etc. to prevent future warnings/dialogs.
5. Create a bat-file containing the following:
mstsc C:\Path\To\saved_rdp_file.rdp
6. Create a task in Windows Task Scheduler that calls the bat-file created in step 5 every night.
7. Optional: If you want to close the RDP session when Jenkins is done, create a second bat-file and schedule it as well, containing:
tasklist /FI "imagename eq mstsc.exe" | find "mstsc.exe" && taskkill /f /im mstsc.exe || echo process "mstsc.exe" is not running
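If you want to script the scheduling step as well, something like this should work (task names, times, and paths are my own placeholders):

# Open the RDP session at 01:00, before the Jenkins job starts
schtasks /Create /TN OpenJenkinsRDP /TR C:\Scripts\open_rdp.bat /SC DAILY /ST 01:00
# Close it again at 05:00, after the job has finished
schtasks /Create /TN CloseJenkinsRDP /TR C:\Scripts\close_rdp.bat /SC DAILY /ST 05:00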
I have a PHP script that runs a PowerShell Stop-Process command through shell_exec(). PHP runs as IUSR. When I run the script, I receive an access denied error message. If I run the command in PowerShell using my Administrator account, it works as expected.
How do I grant IUSR the ability to execute Stop-Process in PowerShell?
I wasn't able to find a solution to grant IUSR the specific privileges to execute Stop-Process, but I was able to get around this by changing the "Anonymous Authentication" user associated with the kill script from IUSR to Administrator.
In IIS 8.5, go to Sites->My Site->Folder Name. On the main panel, click on Authentication under IIS. Right click on Anonymous Authentication and then click on Edit.
You can set the "Anonymous Authentication" value at any level of your IIS app, from the site level down to the sub-directory level. I recommend only changing the value from IUSR to Administrator on the directory that actually hosts your kill script; changing it for the whole site might create problems for other parts of the application.
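If you would rather script the change than click through IIS Manager, the WebAdministration module can set the same values (the site and folder names here are only examples):

Import-Module WebAdministration
# Point anonymous authentication for just the kill-script folder at a specific account
Set-WebConfigurationProperty -PSPath 'IIS:\' -Location 'My Site/killscript' `
    -Filter 'system.webServer/security/authentication/anonymousAuthentication' `
    -Name 'userName' -Value 'Administrator'
Set-WebConfigurationProperty -PSPath 'IIS:\' -Location 'My Site/killscript' `
    -Filter 'system.webServer/security/authentication/anonymousAuthentication' `
    -Name 'password' -Value 'AdminPasswordHere'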
I've seen some information that suggests if you add a limited user to the Performance Monitor Users group and grant it debug privileges, it will be able to terminate processes.
You might consider something a bit less risky though, like running another web app, as a user with those rights, that can only be accessed from the local machine. Then make your PHP app do a web request to the internal app to do its killin'.
If you're trying to kill only a specific process this lets you further limit the impact because the internal app could be designed to only kill that one thing.
Another way to achieve a similar separation is, for example, a scheduled task that looks for a file with specific content in a specific directory; when it sees it, it kills the process and deletes the file. IUSR can be given permission to create files in that directory as a way to trigger this. This method is very easy to implement but isn't synchronous.
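A sketch of that trigger-file idea, with all names hypothetical (a scheduled task would run this every minute or so):

# C:\Scripts\Watch-KillRequest.ps1
$trigger = 'C:\KillDrop\kill.req'   # IUSR only has write access to C:\KillDrop
if (Test-Path $trigger) {
    # Only ever kills this one specific process, limiting the blast radius
    Stop-Process -Name 'stuckapp' -Force -ErrorAction SilentlyContinue
    Remove-Item $trigger -Force
}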
I have a script for my domain stored on the Active Directory server. Every machine on the domain has a task that, when fired, calls this script.
Running the task from the AD server works. Running the task from another machine doesn't. However, running the triggered command manually from cmd on the remote computer works.
Could anyone shed some light on this? Basically, the task's action is set up like this:
Action: PowerShell.exe
Arguments: -NoProfile -ExecutionPolicy Bypass -File "\\<NameOfADServer>\C$\Tasks\script.ps1" "Argument 1" "Argument 2"
Running as SYSTEM is probably your issue; it won't have any access outside of the PC it's running on.
When you run it manually, you'll have that access.
There are several problems here.
You're running the task as the local SYSTEM accounts. SYSTEM generally does not have access to any network resources.
You're using the administrative share (\\<servername>\C$) to share the script. Only users that have Administrator access to the server can access the administrative shares. Administrative shares are heavily restricted by design and you cannot modify the access on them.
My guess is that the script works when you run it manually is because it's using the current user's credentials for network access when you do that, but don't quote me on that.
The simplest solution with the least amount of change is to do this:
Create a group in Active Directory. Add the computer accounts (or, preferably, groups containing the computer accounts) that you want to be able to run the script to this new group. If you really want the SYSTEM account of every computer in the domain to be able to run the script, you can add the "Domain Computers" group to the group.
Create a folder on the server. Put the script in the folder. Don't put anything in this folder you don't want your users to read. Grant the group created above the "Read" NTFS permission on the folder.
Share the folder out. Grant the group you just created "Full Control" share access. If you want, you can make it a hidden share by adding a dollar sign to the end of the name.
Update your scheduled tasks to use \\<servername>\<sharename>\script.ps1.
This is almost certainly not the best method to accomplish what you're actually trying to do, but this is probably the best way to use scheduled tasks running scripts on a network share with the SYSTEM account.
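For what it's worth, here is a rough PowerShell sketch of the setup above, run on the server (the domain, group, folder, and share names are placeholders, and the AD cmdlets require RSAT):

# 1. AD group for the computers allowed to run the script
New-ADGroup -Name 'ScriptRunners' -GroupScope Global -GroupCategory Security
Add-ADGroupMember -Identity 'ScriptRunners' -Members 'Domain Computers'

# 2. Folder with read/execute NTFS access for the group
New-Item -Path 'C:\Scripts' -ItemType Directory -Force
icacls C:\Scripts /grant 'MYDOMAIN\ScriptRunners:(OI)(CI)RX'

# 3. Hidden share; Full Control at the share level, NTFS still limits to read
New-SmbShare -Name 'Scripts$' -Path 'C:\Scripts' -FullAccess 'MYDOMAIN\ScriptRunners'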
I am trying to have a Windows admin account automatically log in multiple local users via a script. The idea is to run a set of applications (tests) in each user's session.
Currently I am able to do so by logging in remotely (RDC) to each of the individual user accounts. This would be fine if there were just a few of these accounts, but now I have upwards of 30 machines with an average of 6 user accounts each, so RDPing into each is extremely time consuming.
Instead, I'd like to be able to log in as the admin and have some sort of script automatically log in the local users within a group, or just a list of users, so I can start the applications using PsTools (the applications require desktop interaction, so a session is required).
I have found that you can only automatically log in one user via Windows User Accounts.
Does anyone know of a way to login multiple accounts via command line, or automatically somehow?
Use Invoke-Command to execute commands against a remote computer that also has PowerShell and has WinRM enabled. Invoke-Command can also run non-PowerShell commands.
# users stored in a CSV with "username,password" columns
$userlist = Import-Csv 'C:\users.csv'
foreach ($user in $userlist) {
    $secure = ConvertTo-SecureString $user.password -AsPlainText -Force
    $cred   = New-Object System.Management.Automation.PSCredential($user.username, $secure)
    Invoke-Command -ComputerName 'TARGETPC' -Credential $cred -ScriptBlock { & 'C:\tests\run.exe' }
}
Use the -AsJob parameter to run them as separate jobs, or run them in sequence for simplicity. Remote PSSessions are another possibility to consider if you need to run multiple commands. Research storing credentials, encrypted, in a file for repeated use.
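One common way to store a credential encrypted for reuse (DPAPI-protected, so it can only be read back by the same user on the same machine; the path is a placeholder):

# Save once (prompts for username and password)
Get-Credential | Export-Clixml -Path 'C:\secure\test.cred'
# Reuse later in the script
$cred = Import-Clixml -Path 'C:\secure\test.cred'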
This is actually not possible; you cannot log in to multiple accounts from the command line. RDC is the only feasible way. Maybe you can automate the RDC for multiple users using automation software like AutoIt, WSH scripting, or a macro recorder, which might save some of the effort in your work.
Use runas to open a new cmd. Then start your tests from there; they will run under the new credentials.
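For example (the account name is a placeholder; runas has no password parameter and will prompt for it):

# opens a new cmd window running as testuser1
runas /profile /user:MYMACHINE\testuser1 cmd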
It can be done with the 3rd-party tool http://www.logonexpert.com via its command-line interface, like this:
le.exe /logon user1 pass1 domain1
le.exe /logon user2 pass2 domain1