Windows Automatic Multiple User login command line

I am trying to have a Windows admin account automatically log in multiple local users from a script. The idea is to run a set of applications (tests) in each user's session.
Currently I am able to do so by logging in remotely (RDC) to each of the individual user accounts. This would be fine if there were just a few of these accounts, but I now have upwards of 30 machines with an average of 6 user accounts each, so RDPing to each one is extremely time-consuming.
Instead, I'd like to be able to log in as the Admin and have some sort of script automatically log in the local users within a group, or just a list of users, so I can start the applications using PsTools (the applications require desktop interaction, so a session is required).
I have found that you can only automatically log in one user via Windows User Accounts.
Does anyone know of a way to log in multiple accounts via the command line, or automatically somehow?

Use Invoke-Command to execute commands against a remote computer that also has PowerShell installed and WinRM enabled. Invoke-Command can also run non-PowerShell commands.
# users stored in a CSV with "username,password" columns (path, computer name, and executable are illustrative)
$userlist = Import-Csv 'C:\users.csv'
foreach ($user in $userlist) {
    $secure = ConvertTo-SecureString $user.password -AsPlainText -Force
    $cred   = New-Object System.Management.Automation.PSCredential ($user.username, $secure)
    Invoke-Command -ComputerName 'TARGETPC' -Credential $cred -ScriptBlock { & 'C:\path\to\executable.exe' }
}
Use the -AsJob parameter to run them as separate jobs, or run them in sequence for simplicity. Remote PSSessions are another possibility to consider if you need to run multiple commands. Research storing credentials, encrypted, in a file for repeated use.
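A common pattern for that, sketched here under the assumption that the script always runs as the same user on the same machine (Export-Clixml protects the password with DPAPI, so the file can only be decrypted by that user on that machine; the path is a placeholder):
# Save the credential once (prompts interactively, stores the password encrypted)
Get-Credential | Export-Clixml -Path 'C:\secure\admin.cred'
# Reuse it later in the script without prompting
$cred = Import-Clixml -Path 'C:\secure\admin.cred'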

This is actually not possible; you cannot log in to multiple accounts from the command line. RDC is the only feasible way. Maybe you can automate the RDC logins for multiple users with automation software like AutoIt, WSH scripting, or a macro recorder, which might save you some of the effort.

Use runas to open a new cmd window. After that, start your tests; they will run under the new credentials.
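For example, a minimal sketch (the account name is a placeholder; runas prompts for the password interactively):
runas /user:MACHINE\testuser cmd
Anything started from that new cmd window then runs under testuser's credentials.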

It can be done with the 3rd-party tool http://www.logonexpert.com via its command line tool, like this:
le.exe /logon user1 pass1 domain1
le.exe /logon user2 pass2 domain1

Related

How do you securely have a non-admin run a powershell script?

I have an issue with a remote site whose printer occasionally errors out. The current solution is to restart the print spooler on the server.
I am trying to create a simple powershell script that allows a non-admin user to restart the Spooler service without being able to see the admin credentials or edit the script.
We are replacing the server in a couple of months, so the configuration will be fixable then. We just need a temporary workaround so the user doesn't need to email me every couple of days when the spooler requires resetting.
Ideas?
Create a scheduled task with the admin credentials cached. Under Actions, have the task run the privileged PowerShell script: powershell c:\Path\MyScript.ps1. Assign no schedule (i.e. delete all triggers).
Change the permissions on the task's XML file under C:\Windows\System32\Tasks to allow read/execute by non-admins.
Create a user-facing CMD script (or even a shortcut) that runs the task: schtasks /Run /TN:MyTaskName /S:Server.
There are other ways to isolate the credentials, but this seems to be the easiest.
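For illustration, a rough sketch of the two pieces, assuming a task named RestartSpooler on a server named PRINTSRV (both placeholders):
# MyScript.ps1 - the privileged script the scheduled task runs
Restart-Service -Name Spooler
# User-facing trigger (can live in a CMD script or shortcut on the user's machine)
schtasks /Run /TN RestartSpooler /S PRINTSRV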

How to download a file using BITS in a Packer provisioner?

I'm writing a provisioning script in PowerShell for a Packer-built Windows image on a CI pipeline. This process involves downloading several large files. I'm under the impression that BITS is faster than Invoke-WebRequest, so I've decided to use BITS to asynchronously download these large files.
The problem is that BITS will only process jobs for users that are interactively logged on...
BITS transfers files only when the job's owner is logged on to the computer (the user must have logged on interactively). BITS does not support the RunAs command.
...unless the job was submitted by a service account.
You can use BITS to transfer files from a service. The service must use the LocalSystem, LocalService, or NetworkService system account. These accounts are always logged on; therefore, jobs submitted by a service using these accounts always run.
But even then, there's a wrinkle:
If a service running under a system account impersonates the user before calling BITS, BITS responds as it would for any user account (for example, the user needs to be logged on to the computer for the transfer to occur).
This is an issue because the provisioning script runs as the Administrator account, which is not a service account and therefore must be logged in interactively to use BITS. This happens to be Packer's behavior, so I assumed I couldn't change it. (I was wrong, I can change this. See my final answer.) How can I do the following in one PowerShell script?
Submit a BITS job as Administrator using a service account's credentials. I assume I need to pass something in to Start-BitsTransfer's -Credential parameter?
Store the BITS job in a local variable (jobs will be started at different places in the script)
Await the completion of the BITS job so I can start using the file I downloaded (jobs will be awaited at different places in the script)
You could use psexec to run a secondary script with SYSTEM rights from the Administrator context, and have the primary script check the exit code of the psexec process to confirm it executed successfully.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/start-process?view=powershell-7.1
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec
https://weblogs.asp.net/soever/returning-an-exit-code-from-a-powershell-script
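A rough sketch of the psexec approach described above, assuming a hypothetical download script at C:\provision\download.ps1:
# Run the download script as SYSTEM via PsExec (-s) and wait for it to finish
$proc = Start-Process -FilePath 'psexec.exe' -ArgumentList '-accepteula', '-s', 'powershell.exe', '-NoProfile', '-File', 'C:\provision\download.ps1' -Wait -PassThru
# Surface a failure to the primary script via the exit code
if ($proc.ExitCode -ne 0) { throw "Download script failed with exit code $($proc.ExitCode)" }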
It turns out there's a solution to this, although it's specific to Packer. I didn't mention much about my use of it because I didn't think it was that important.
Contrary to my initial belief, Packer's PowerShell provisioner lets you run the provisioning script with elevated privileges as any user...
elevated_user and elevated_password (string) - If specified, the PowerShell script will be run with elevated privileges using the given Windows user.
provisioner "powershell" {
elevated_user = "Administrator"
elevated_password = build.Password
}
...including service users.
If you specify an empty elevated_password value then the PowerShell script is run as a service account. For example:
provisioner "powershell" {
elevated_user = "SYSTEM"
elevated_password = ""
}
After adjusting my Packer template's provisioner block accordingly, I can now confirm that Start-BitsTransfer and friends work as expected. No need to pass complicated arguments or play tricks with sessions.
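For reference, the asynchronous pattern from the question then looks roughly like this (URL and paths are placeholders, and a real script should also check for the error job states):
# Submit an asynchronous BITS job and keep a handle to it
$job = Start-BitsTransfer -Source 'https://example.com/big.iso' -Destination 'C:\temp\big.iso' -Asynchronous
# ...other provisioning work...
# Later, wait for the transfer to finish, then finalize it so the file is written to disk
while ($job.JobState -in 'Queued', 'Connecting', 'Transferring') {
    Start-Sleep -Seconds 5
}
Complete-BitsTransfer -BitsJob $job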

Grant IUSR Rights to Use PowerShell Stop-Process Command?

I have a PHP script that runs a Powershell Stop-Process command through shell_exec(). PHP runs as IUSR. When I run the script, I receive an access denied error message. If I run the command in PowerShell using my Administrator account, it works as expected.
How do I grant IUSR the ability to execute Stop-Process in Powershell?
I wasn't able to find a solution to grant IUSR the specific privileges to execute Stop-Process, but I was able to get around this by changing the "Anonymous Authentication" user associated with the kill script from IUSR to Administrator.
In IIS 8.5, go to Sites->My Site->Folder Name. On the main panel, click on Authentication under IIS. Right click on Anonymous Authentication and then click on Edit.
You can set the "Anonymous Authentication" value at any level of your IIS app; from the site level to the sub-directory level. I recommend only changing the value from IUSR to Administrator on the directory that actually hosts your kill script. Changing it for the whole site might create problems for other parts of the application.
I've seen some information that suggests if you add a limited user to the Performance Monitor Users group and grant it debug privileges, it will be able to terminate processes.
You might consider something a bit less risky though, like running another web app as a user with those rights, one that can only be accessed from the local machine. Then make your PHP app send a web request to the internal app to do its killing.
If you're trying to kill only a specific process this lets you further limit the impact because the internal app could be designed to only kill that one thing.
Another way to achieve a similar separation is, for example, a scheduled task that looks for a file with specific content in a specific directory; when it sees it, it kills the process and deletes the file. IUSR can be given permission to create files in that directory as a way to trigger this. This method is very easy to implement but isn't synchronous.
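A minimal sketch of that trigger-file approach (directory, file name, and process name are placeholders):
# Watcher script run by a scheduled task (e.g. every minute) under an admin account
$trigger = 'C:\KillRequests\kill.txt'
if (Test-Path $trigger) {
    Stop-Process -Name 'SomeProcess' -Force -ErrorAction SilentlyContinue
    Remove-Item $trigger -Force
}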

Powershell script on remote computer not running as a scheduled task

I have a script on my domain stored on the Active Directory server. Every machine on the domain has a task that, when fired, calls this script.
Running the task from the AD server works. Running the task from another machine doesn't work. However, manually running the command the task triggers from cmd on the remote computer does work.
Could anyone shed some light on this? The task's action is set up like this:
Action: PowerShell.exe
Arguments: -noprofile -ExecutionPolicy Bypass -File "\\<>NameOfADServer<>\C$\Tasks\script.ps1" "Argument 1" "Argument 2"
Running as SYSTEM is probably your issue - it won't have any access outside of the PC it's running on.
When you run it manually, you'll have that access.
There are several problems here.
You're running the task as the local SYSTEM accounts. SYSTEM generally does not have access to any network resources.
You're using the administrative share (\\<servername>\C$) to share the script. Only users that have Administrator access to the server can access the administrative shares. Administrative shares are heavily restricted by design and you cannot modify the access on them.
My guess is that the script works when you run it manually is because it's using the current user's credentials for network access when you do that, but don't quote me on that.
The simplest solution with the least amount of change is to do this:
Create a group in Active Directory. Add to this new group the computer accounts (or, preferably, groups containing the computer accounts) that you want to be able to run the script. If you really want the SYSTEM account on any computer in the domain to be able to run the script, you can add the "Domain Computers" group instead.
Create a folder on the server. Put the script in the folder. Don't put anything in this folder you don't want your users to read. Assign the "Read" NTFS permission to the group created above to the folder.
Share the folder out. Grant the group you just created the "Full Control" share access. If you want, you can make it a hidden share by adding a dollar sign to the end of the name.
Update your scheduled tasks to use \\<servername>\<sharename>\script.ps1.
This is almost certainly not the best method to accomplish what you're actually trying to do, but this is probably the best way to use scheduled tasks running scripts on a network share with the SYSTEM account.
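For reference, a rough PowerShell sketch of those steps, run on the server (group, folder, share, and domain names are placeholders; the first two commands need the ActiveDirectory RSAT module):
# 1. Create the group and add the computer accounts that should run the script
New-ADGroup -Name 'ScriptRunners' -GroupScope Global
Add-ADGroupMember -Identity 'ScriptRunners' -Members 'PC01$', 'PC02$'
# 2. Create the folder, drop the script in it, and grant the group Read/Execute NTFS access
New-Item -Path 'C:\Scripts' -ItemType Directory -Force
icacls 'C:\Scripts' /grant 'MYDOMAIN\ScriptRunners:(OI)(CI)RX'
# 3. Share the folder (the trailing $ hides the share) with Full Control for the group
New-SmbShare -Name 'Scripts$' -Path 'C:\Scripts' -FullAccess 'MYDOMAIN\ScriptRunners'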

How can I make Windows software run as a different user within a script?

I'm using a build script that calls Wise to create some install files. The problem is that the Wise license only allows it to be run under one particular user account, which is not the same account that my build script will run under. I know Windows has the runas command but this won't work for an automated script as there is no way to enter the password via the command line.
This might help: Why doesn't the RunAs program accept a password on the command line?
I recommend taking a look at CPAU.
Command line tool for starting a process in an alternate security context. Basically this is a runas replacement. Also allows you to create job files and encode the id, password, and command line in a file so it can be used by normal users.
You can use it like this (examples):
CPAU -u user [-p password] -ex "WhatToRun" [switches]
Or you can create a ".job" file which will have the user and password encoded inside of it. This way you can avoid having to put the password for the user inside your build script.
It's a bit of a workaround solution, but you can create a scheduled task that runs as your user account, and have it run regularly, maybe once every minute. Yes, you'll have to wait for it to run then.
This task can then look for some data files to process, and do the real work only if they are there.
This might help: it's a class I've used in another project to let people create their own accounts. Everyone had to have access to the program, but that same account couldn't be allowed access to the LDAP functionality, so the program uses this class to run that part as a different user.
http://www.codeproject.com/KB/dotnet/UserImpersonationInNET.aspx
