New to scripting, currently trying to write a script that will run Invoke-Command against every computer in the domain. My issue is that I am trying to direct the output from each computer to a text file.
# Name:
# Date: 02/10/2023
# Desc: Runs remote commands for every computer within domain to collect general information
#Defining variable that reads the computer names from the .txt file
$ADCS = Get-Content -Path "C:\computers.txt"
# Loop through each computer name in the list
foreach ($ADC in $ADCS) {
    # Run the Invoke-Command cmdlet for each computer
    Invoke-Command -ComputerName $ADC -ScriptBlock {
        # Write a message to indicate which computer the commands are being run on.
        Write-Output "Running command on $ADC"
        #systeminfo | find "Host Name"
        Get-ComputerInfo
        Out-File -FilePath C:\Users\aembrey\Documents\ComputerInfo.txt
    }
} | Out-File -FilePath C:\Users\aembrey\Documents\ComputerInfo.txt
This is what I am currently working with. I have tried multiple different ways of using Out-File and have failed to redirect the output.
I tried formatting the command the way you would a standard pipeline:
"get-computerinfo | out-file -filepath C:\Users\X\X"
But I receive an "An empty pipe element is not allowed." error.
I am sure this is a simple issue, but I am stumped.
Basically just trying to get the computer info of all computers, then save it to a text file.
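For what it's worth, a minimal sketch of one working pattern, reusing the paths from the question: a foreach statement cannot feed a pipeline (which is one way to get "An empty pipe element is not allowed"), so pipe through ForEach-Object instead and redirect once at the end:
# Read the computer names from the .txt file
$ADCS = Get-Content -Path "C:\computers.txt"

# Unlike a foreach statement, ForEach-Object participates in the pipeline,
# so everything the loop emits can be redirected to a single file.
$ADCS | ForEach-Object {
    "Running command on $_"
    Invoke-Command -ComputerName $_ -ScriptBlock { Get-ComputerInfo }
} | Out-File -FilePath "C:\Users\aembrey\Documents\ComputerInfo.txt"
Invoke-Command also accepts an array for -ComputerName, so passing $ADCS directly and redirecting the whole call to Out-File is an even shorter variant.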
I want to run pre/post patching PowerShell scripts with an Azure Automation account and get the output of the command which ran inside the VM, e.g. "Get-Service".
I just followed the instructions from Microsoft: https://learn.microsoft.com/en-us/azure/automation/update-management/pre-post-scripts#interact-with-machines
and their script:
https://github.com/azureautomation/update-management-run-script-with-run-command
Additionally, I've found a way to run a command, and it executes on the VM, but there is NO output.
$ServicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
Add-AzAccount -ServicePrincipal -TenantId $ServicePrincipalConnection.TenantId -ApplicationId $ServicePrincipalConnection.ApplicationId -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
$rgname ="raimundas-rg"
$vmname ="test-win-vm1"
$ScriptToRun = @"
Get-Service "wuauserv"
"@
Out-File -InputObject $ScriptToRun -FilePath ScriptToRun.ps1
Invoke-AzVMRunCommand -ResourceGroupName $rgname -Name $vmname -CommandId 'RunPowerShellScript' -ScriptPath ScriptToRun.ps1
Remove-Item -Path ScriptToRun.ps1
However, it did not do the trick, since it's using the old AzureRM cmdlets, and I did not get any output in the logs. Any suggestions on how to run scripts on multiple Azure VMs and get the output?
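One pattern that may help, sketched under the assumption that the Az module is in use (the documented parameter is -VMName), with the resource names from the snippet above: Invoke-AzVMRunCommand returns a result object whose Value collection carries the script's stdout and stderr, so the runbook can write those to its own output stream:
# Run the script on each VM and surface its output in the runbook job log.
# The VM list here is illustrative.
$vms = @('test-win-vm1', 'test-win-vm2')
foreach ($vm in $vms) {
    $result = Invoke-AzVMRunCommand -ResourceGroupName 'raimundas-rg' -VMName $vm -CommandId 'RunPowerShellScript' -ScriptPath 'ScriptToRun.ps1'
    Write-Output "=== $vm ==="
    # Value[0] is the StdOut component, Value[1] the StdErr component
    Write-Output $result.Value[0].Message
}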
I have an update script for running the Dell Command | Update tool, in short dcu-cli.exe. The thing now is that when I run the same script code locally on the computer, everything runs OK, but when I run the exact same code in a script with Invoke-Command (and yes, I have full admin rights), the exit code is 2, meaning "An unknown application error has occurred", instead of 0 (everything OK).
It is a very large script, so I created a new one to debug this. This is the shortened code:
Invoke-Command -ComputerName "MyComputer" -ScriptBlock {
    $ExitCode = 0

    # Declare path and arguments
    $DcuCliPath = 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe'
    $DellCommand = "/applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log"

    # Verify Dell Command | Update exists
    If (Test-Path -Path $DcuCliPath) {
        $objWMI = Get-WmiObject Win32_ComputerSystem
        Write-Host ("Dell Model [{0}]" -f $objWMI.Model.Trim())

        $serviceName = "DellClientManagementService"
        Write-Host ("Service [{0}] is currently [{1}]" -f $serviceName, (Get-Service $serviceName).Status)
        If ((Get-Service $serviceName).Status -eq 'Stopped') {
            Start-Service $serviceName
            Write-Host "Service [$serviceName] started"
        }

        # Update the system with the latest drivers
        Write-Host "Starting Dell Command | Update tool with arguments [$DellCommand] dcu-cli found at [$DcuCliPath]"
        $ExitCode = (Start-Process -FilePath $DcuCliPath -ArgumentList $DellCommand -PassThru -Wait).ExitCode
        Write-Host ("Dell Command | Update tool finished with ExitCode: [$ExitCode] current Win32 ExitCode: [$LastExitCode] Check log for more information: C:\Dell_Update.log")
    }
}
When I remove the Invoke-Command -ComputerName "MyComputer" -ScriptBlock { wrapper and then copy + run the script locally on the PC, the exit code is 0.
What I also noticed is that when I run the command via Invoke-Command, no log file is created either, even though I passed a log path in the arguments... So my best guess is that something is going wrong with local and remote paths?
So what am I missing? I'm guessing it is something simple, but I have spent several hours trying to get this running without any luck...
Try running it this way. You should be able to see any output or error messages. I typically add to the PATH first rather than using & or Start-Process.
invoke-command mycomputer {
    $env:path += ';C:\Program Files (x86)\Dell\CommandUpdate'
    dcu-cli /applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log
}
Using Start-Process inside Invoke-Command seems pretty challenging. I can't even see the output of findstr unless I save it to a file, and if I didn't wait, the output would be truncated. By default Start-Process runs in the background and in another window. There's a -NoNewWindow option too, but it doesn't help with Invoke-Command.
invoke-command localhost { # elevated
    start-process 'findstr' '/i word c:\users\joe\file1' -Wait -RedirectStandardOutput c:\users\joe\out
}
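The redirected file can then be read back once the command returns, e.g.:
Get-Content c:\users\joe\out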
@js2010, thanks for your additional help. Unfortunately this didn't help either.
So I did some more debugging, and it turns out it was a bug in the dcu-cli version running on my test machine, DOH...!!
On the test machine version 3.1.1 was running, and on another machine version 4.0 was running, and that one worked fine via remote PowerShell. So I looked for the release notes, which I found here: https://www.dell.com/support/kbdoc/000177325/dell-command-update
And as you can see, in version 3.1.3 there was this fix:
A problem was solved where dcu-cli.exe was not executed in an external interactive session of PowerShell.
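A quick way to check which dcu-cli version a remote machine is actually running, sketched with the install path from the script above:
Invoke-Command -ComputerName "MyComputer" -ScriptBlock {
    # Read the product version straight from the executable's file metadata
    (Get-Item 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe').VersionInfo.ProductVersion
}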
I'm trying to run the code below in an automated scheduled task.
Whether I run this task manually or on its schedule, it is not working. When the option 'Run only when user is logged on' is set, I at least see a PowerShell window opening, and I do see the jobs getting started. However, when the PS window closes, the jobs are not visible (not completed, failed, nothing).
The logging shows the script runs until the Import-Csv command. I have put the CSV in the root of C:, and I run the automated task as the logged-in user and with highest privileges.
Why doesn't it get past Import-Csv? When I run this script in e.g. PowerShell ISE, it works like a charm.
Running program
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Arguments:
-NoProfile -ExecutionPolicy Unrestricted -File "C:\Users\usr\Desktop\Scripts\script.ps1"
Start-in:
C:\Users\usr\Desktop\Scripts
Write-Host "Starting script"
$maxItems = 8
$iplist = import-csv "C:\Create.csv.txt"
Write-Host "Opened $($iplist[0])"
For ($i=0; $i -le $maxItems; $i++) {
Write-Host $iplist[$i].DisplayName
Start-Job -ScriptBlock {
Param($displayName)
try{
Start-Transcript
Write-Host "Found and started a job for $($displayName)"
Stop-Transcript
}
Catch{
Write-Host "Something went wrong "
Stop-Transcript
}
} -ArgumentList $iplist[$i].DisplayName
}
UPDATE:
The PS window closed before it got to do anything. The answer on this page sent me in the right direction. The full fix I used to get this working:
Task Scheduling and Powershell's Start-Job
First, to prevent the PowerShell window from closing, add the following line to the bottom of the script:
Read-Host 'Press Any Key to exit'
Second, if you run into issues with params, try explicitly naming the param with a flag:
$iplist = Import-csv -LiteralPath "C:\Create.csv.txt"
Third, make sure that you explicitly declare the delimiter being used if it is different from a comma, as in the sketch below.
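A sketch of both points together, assuming a hypothetical semicolon-delimited file:
# Name the parameters explicitly and declare the non-default delimiter
$iplist = Import-Csv -LiteralPath "C:\Create.csv.txt" -Delimiter ';'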
I'm having some trouble getting Clean Manager on Windows 10 to run remotely. I've seen a few different approaches where you can edit the registry and modify the /sageset or /sagerun values to be specific things and then run it remotely, but it seems no matter what I do, CleanMgr runs locally on my machine rather than remotely.
I believe this is the closest I've gotten to getting it to run remotely... It still seems to just run locally on my machine, though.
Any ideas?
( All variables are set before this portion of the script, this is just a small portion of what's going on that I'm stuck on )
## Starts cleanmgr.exe
Function Start-CleanMGR {
    Write-Host "Please provide your A-Account details to continue with this cleanup."
    $creds = Get-Credential
    Enter-PSSession -ComputerName $computername -Credential $creds
    try {
        $cleanmgr = Start-Process -Credential $creds -FilePath "C:\Windows\System32\cleanmgr.exe" -ArgumentList '/verylowdisk' -Wait -Verbose
        if ($cleanmgr) {
            Write-Host "Clean Manager ran successfully! " -NoNewline -ForegroundColor Green
            Write-Host "[DONE]" -ForegroundColor Green -BackgroundColor Black
        }
    }
    catch [System.Exception] {
        Write-Host "Cleanmgr is not installed! To use this portion of the script you must install the following windows features:" -NoNewline -ForegroundColor DarkGray
        Write-Host "[ERROR]" -ForegroundColor Red -BackgroundColor Black
    }
}
Start-CleanMGR
PowerShell always runs in the user context of the user who started the session. This is by design.
You cannot run a GUI-based application remotely using PowerShell. It is a Windows security boundary.
To run GUI apps, someone must be logged on, and you cannot use PowerShell to run code as the logged-on user.
You are also prompting for info, so someone must be logged on.
If you are expecting a user to provide info, then you need to:
Create the script
Deploy the script to the user's machine or a file share from which it can be run
Tell the user how to run it, or create a batch file they can double-click to run the PowerShell script
Or
Set the script to run as a scheduled task, at logon or at some point during the day, under the user's credentials, as sketched below.
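A minimal sketch of that last option with the ScheduledTasks cmdlets; the task name and script path are illustrative:
# Register a logon-triggered task that runs the script as the logging-on user
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Start-CleanMGR.ps1'
$trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName 'CleanMgrAtLogon' -Action $action -Trigger $trigger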
Variables have scope, and you cannot use local variables in a remote context unless they are scoped for that.
About Remote Variables
Using local variables
You can use local variables in remote commands, but the variable must be defined in the local session.
Beginning in PowerShell 3.0, you can use the Using scope modifier to identify a local variable in a remote command.
The syntax of Using is as follows:
The syntax of Using is as follows:
$Using:<VariableName>
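For example, a minimal sketch with an illustrative variable and computer name:
# The variable is defined locally and carried into the remote session via $Using:
$logPath = 'C:\Dell_Update.log'
Invoke-Command -ComputerName 'remotehost' -ScriptBlock {
    Get-Content -Path $Using:logPath -Tail 20
}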
Still, the remote variable is not something that will help in your use case, since you cannot do what you are after natively with PowerShell. You'll need a third-party tool like Microsoft Sysinternals PsExec to run code remotely as the logged-on user.
Using PsExec
Usage: psexec [\\computer[,computer2[,...] | @file]][-u user [-p psswd][-n s][-r servicename][-h][-l][-s|-e][-x][-i [session]][-c executable [-f|-v]][-w directory][-d][-<priority>][-a n,n,...] cmd [arguments]
-i   Run the program so that it interacts with the desktop of the specified session on the remote system. If no session is specified the process runs in the console session.
-u   Specifies optional user name for login to remote computer.
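With PsExec, something along these lines could launch cleanmgr in the logged-on user's session; the computer name, account, and session ID are illustrative:
psexec \\remotehost -u DOMAIN\adminuser -i 1 cleanmgr.exe /verylowdisk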
I suggest that you use Invoke-Command:
Function Start-CleanMGR ($computername, $creds) {
    Invoke-Command -ComputerName $computername -Credential $creds -ScriptBlock {
        try {
            $cleanmgr = Start-Process -FilePath "C:\Windows\System32\cleanmgr.exe" -ArgumentList '/verylowdisk' -Wait -Verbose
            if ($cleanmgr) {
                return "Clean Manager ran successfully!"
            }
        }
        catch [System.Exception] {
            return "Cleanmgr is not installed! To use this portion of the script you must install the following windows features:"
        }
    }
}
Start-CleanMGR -computername "remotehost" -creds (Get-Credential)
As long as you execute cleanmgr.exe under a user account that has local admin rights, everything will work. Running cleanmgr.exe under the SYSTEM account, e.g. from the Run Script tool in SCCM/MECM, will not work unless the script first opens a separate shell (DOS/PS) under a user that has local admin rights. Even cleanmgr.exe /verylowdisk will not run under the SYSTEM account.
I'm currently trying to pull msinfo data from a remote server and then save that output onto a share located on another server. When I run the command, a progress bar appears and then completes without apparent issue, but the file isn't saved to the UNC path. I've verified that I have permissions on the share and that the nfo generation itself works. Any ideas?
C:\Windows\system32>msinfo32 /computer servername /nfo \\sharename\filename.nfo
Very strange, it works in CMD but not in PowerShell. I haven't had the time to explore it, but if you need to run it in PowerShell you can use this workaround:
# Write the report to a local temp file first
$TempFile = [System.IO.Path]::GetTempFileName()
C:\Windows\system32\msinfo32 /computer Computer /nfo $TempFile

# msinfo32 returns immediately, so wait until the process has exited
Do {
    Sleep 5
} Until (!(Get-Process msinfo32 -ErrorAction SilentlyContinue))

# Copy the finished report to the share, then clean up
Copy-Item $TempFile \\Computer\Share\output.nfo
$TempFile | Remove-Item -Force
Got it figured out - I was able to use the switch user param to save the file after pulling it from the server.
Thanks!