I'm trying to run the code below as an automated scheduled task.
Whether I run the task manually or on its schedule, it is not working. When the option 'Run only when user is logged on' is set, I at least see a PowerShell window open, and I do see the jobs getting started. However, when the PowerShell window closes, the jobs are no longer visible (not completed, not failed, nothing).
The logging shows the script runs up to the Import-Csv command. I have put the CSV in the root of C:\, and I run the automated task as the logged-on user and with highest privileges.
Why doesn't it get past Import-Csv? When I run this script in, for example, PowerShell ISE, it works like a charm.
Running program
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Arguments:
-NoProfile -ExecutionPolicy Unrestricted -File "C:\Users\usr\Desktop\Scripts\script.ps1"
Start-in:
C:\Users\usr\Desktop\Scripts
Write-Host "Starting script"
$maxItems = 8
$iplist = import-csv "C:\Create.csv.txt"
Write-Host "Opened $($iplist[0])"
For ($i=0; $i -le $maxItems; $i++) {
Write-Host $iplist[$i].DisplayName
Start-Job -ScriptBlock {
Param($displayName)
try{
Start-Transcript
Write-Host "Found and started a job for $($displayName)"
Stop-Transcript
}
Catch{
Write-Host "Something went wrong "
Stop-Transcript
}
} -ArgumentList $iplist[$i].DisplayName
}
UPDATE:
The PS window closed before it got to do anything. The answer on this page sent me in the right direction. This is the full fix I used to get it working:
Task Scheduling and Powershell's Start-Job
First, to prevent the PowerShell window from closing, add the following line to the bottom of the script (an alternative that waits for the jobs instead is sketched after this list):
Read-Host 'Press Any Key to exit'
Second, if you run into issues with parameters, try explicitly naming the parameter with a flag:
$iplist = Import-csv -LiteralPath "C:\Create.csv.txt"
Third, make sure that you explicitly declare the delimiter being used if it is different from a comma.
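As mentioned above, instead of keeping the window open interactively, the script can also simply block until the background jobs are done. This is only a sketch, not part of the fix I actually used, and the log path is just an example:
# Append after the Start-Job loop: wait for every job started in this session,
# then collect their output and write it to a log file
Get-Job | Wait-Job | Receive-Job | Out-File "C:\Users\usr\Desktop\Scripts\job_output.log"
Get-Job | Remove-Job
Appended like this, the scheduled-task process stays alive until every job has finished, so nothing gets cut off when the window closes.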
Related
I wrote a .ps1 script to automate some package installation. The strange part is that when I run just the command snippet that executes the .exe file for SEP (Symantec Endpoint Protection), it executes fine, but when I execute the entire script, it does run the command snippet.
I am only running a simple .exe file, and even if I run it manually, it does not show any installer; rather, it installs silently in the background.
So in the script, I am only running the .exe file, that's it.
Should I be giving it any wait time or any other inputs?
Start-Process -Wait -FilePath "C:\Temp\Symantec-Windows\SEP 14.3.3384.1000 x64.exe" -PassThru
$SymVersion = Get-WmiObject -Class Win32_Product -ComputerName $hostname |
    Where-Object -FilterScript { $_.Name -eq "symantec endpoint protection" } |
    Format-List -Property Version, InstallState, Name
Write-Output $SymVersion
if ($SymVersion)
{
    # echo (Write-Output) has no -ForegroundColor; Write-Host is needed for colored output
    Write-Host 'Symantec is successfully installed' -ForegroundColor Green
}
else
{
    Write-Host 'Symantec is not successfully installed' -ForegroundColor Red
}
The Symantec antivirus .exe files are made for silent installation. If you want to proceed in GUI mode, it is better to unzip the file and use the MSI file with arguments. With your current script, it's better to check that the process exited with code 0. The following code is not tested.
$process = Start-Process -FilePath "C:\Temp\Symantec-Windows\SEP 14.3.3384.1000 x64.exe" -PassThru -Wait
if ($process.ExitCode -ne 0)
{
    throw "Installation process returned error code: $($process.ExitCode)"
}
else
{
    Write-Host "Installation Successful"
}
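If you do go the MSI route mentioned above, a minimal sketch of a silent install with logging could look like the following (the extracted MSI name and paths are assumptions, not taken from your environment):
# Hypothetical paths: point these at wherever the SEP package was extracted
$msi = 'C:\Temp\Symantec-Windows\Extracted\Sep64.msi'
$log = 'C:\Temp\SEP_install.log'

# /i = install, /qn = no UI (use /qb for a basic progress bar), /l*v = verbose log
$proc = Start-Process -FilePath 'msiexec.exe' -ArgumentList "/i `"$msi`" /qn /l*v `"$log`"" -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    throw "msiexec returned exit code $($proc.ExitCode); see $log"
}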
I have an update script that runs the Dell Command Update tool, in short dcu-cli.exe. The thing is that when I run the same script code locally on the computer, everything runs OK, but when I run the exact same code in a script with Invoke-Command (and yes, I have full admin rights), the exit code is 2, meaning "An unknown application error has occurred", instead of 0 (everything OK).
It is a very large script, so I created a new one to debug this. This is the shortened code:
Invoke-Command -ComputerName "MyComputer" -ScriptBlock {
    $ExitCode = 0

    # Declare path and arguments
    $DcuCliPath = 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe'
    $DellCommand = "/applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log"

    # Verify Dell Command | Update exists
    If (Test-Path -Path $DcuCliPath) {
        $objWMI = Get-WmiObject Win32_ComputerSystem
        Write-Host ("Dell Model [{0}]" -f $objWMI.Model.Trim())

        $serviceName = "DellClientManagementService"
        Write-Host ("Service [{0}] is currently [{1}]" -f $serviceName, (Get-Service $serviceName).Status)
        If ((Get-Service $serviceName).Status -eq 'Stopped') {
            Start-Service $serviceName
            Write-Host "Service [$serviceName] started"
        }

        # Update the system with the latest drivers
        Write-Host "Starting Dell Command | Update tool with arguments [$DellCommand] dcu-cli found at [$DcuCliPath]"
        $ExitCode = (Start-Process -FilePath ($DcuCliPath) -ArgumentList ($DellCommand) -PassThru -Wait).ExitCode
        Write-Host ("Dell Command | Update tool finished with ExitCode: [$ExitCode] current Win32 ExitCode: [$LastExitCode] Check log for more information: C:\Dell_Update.log")
    }
}
When I remove the Invoke-Command -ComputerName "MyComputer" -ScriptBlock { wrapper and then copy + run the script locally on the PC, the exit code is 0.
What I also noticed is that when I run the command via Invoke-Command, there is also no log file created, even though I passed one along in the arguments... So my best guess is that something is going wrong with local and remote paths?
So what am I missing? I'm guessing it is something simple, but I have spent several hours trying to get this running without any luck...
Try running it this way. You should be able to see any output or error messages. I typically add to the path first rather than using & or Start-Process.
Invoke-Command MyComputer {
    $env:Path += ';C:\Program Files (x86)\Dell\CommandUpdate'
    dcu-cli /applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log
}
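If you also need the exit code back on the calling side, a sketch like the one below (not tested; building on the snippet above) returns $LASTEXITCODE from the remote session:
$exitCode = Invoke-Command -ComputerName MyComputer -ScriptBlock {
    $env:Path += ';C:\Program Files (x86)\Dell\CommandUpdate'
    # Capture the tool's console output so only the exit code is emitted to the pipeline
    $toolOutput = dcu-cli /applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log
    $toolOutput | Write-Host
    $LASTEXITCODE
}
Write-Host "dcu-cli finished with exit code $exitCode"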
Using Start-Process inside Invoke-Command seems pretty challenging. I can't even see the output of findstr unless I save it to a file, and if I don't wait, the output gets truncated. By default, Start-Process runs in the background and in another window. There's a -NoNewWindow option too, but it doesn't help with Invoke-Command.
Invoke-Command localhost { # elevated
    Start-Process 'findstr' '/i word c:\users\joe\file1' -Wait -RedirectStandardOutput c:\users\joe\out
}
@js2010, thanks for your additional help. Unfortunately, this didn't help either.
So I did some more debugging, and it turns out it was a bug in the dcu-cli version running on my test machine, DOH...!!
On the test machine version 3.1.1 was running, and on another machine version 4.0 was running, and that one worked fine via remote PowerShell. So I looked for the release notes, which I found here: https://www.dell.com/support/kbdoc/000177325/dell-command-update
And as you can see, in version 3.1.3 there was this fix:
A problem was solved where dcu-cli.exe was not executed in an external interactive session of PowerShell.
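For anyone hitting the same thing, a quick way to check which dcu-cli version a remote machine is actually running (just a sketch; the path is the same one used in my script):
Invoke-Command -ComputerName MyComputer -ScriptBlock {
    $dcu = 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe'
    if (Test-Path $dcu) {
        # The executable's version info shows whether the 3.1.3 fix is included
        (Get-Item $dcu).VersionInfo | Select-Object ProductVersion, FileVersion
    } else {
        Write-Warning "dcu-cli.exe not found at $dcu"
    }
}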
My university requires all computers to perform a web-based login in order to get access to the internet, and claims that all users are logged off automatically at midnight (sounds strange, but it is true), so I am trying to write a PowerShell script (in Windows 10) to perform an automatic login at midnight.
My script is listed here. It opens an IE process in the background (in a non-visible way), fills in the username and password, logs in, and kills the IE process.
# If there are existing Internet Explorer processes, close them
$IE_Process = Get-Process iexplore -ErrorAction Ignore
if ($IE_Process) {
    $IE_Process | ForEach-Object { $_.CloseMainWindow() | Out-Null }
}
Stop-Process -Name "iexplore" -ErrorAction Ignore

# Login information
$url = "http://xxx.xxx.xxx.xxx/"
$username = "xxxxxxxx"
$password = "xxxxxxxx"

# Open an IE process
$ie = New-Object -ComObject InternetExplorer.Application
$ie.Silent = $true
$ie.Navigate($url)
while ($ie.Busy -eq $true) {
    Start-Sleep -Seconds 1
}

# The stupid webpage needs to submit twice
$ie.Document.getElementById("loginname").value = $username
$ie.Document.getElementById("password").value = $password
$ie.Document.getElementById("button").Click()
Start-Sleep -Seconds 1
$ie.Document.getElementById("loginname").value = $username
$ie.Document.getElementById("password").value = $password
$ie.Document.getElementById("button").Click()

# Close the IE process
$IE_Process = Get-Process iexplore -ErrorAction Ignore
if ($IE_Process) {
    $IE_Process | ForEach-Object { $_.CloseMainWindow() | Out-Null }
}
Stop-Process -Name "iexplore" -ErrorAction Ignore
Remove-Variable -Name ie, username, password, url, IE_Process -ErrorAction Ignore
The script is saved as "login_IE.ps1". It may be poorly written as I am new to PowerShell, but it works. If I open a cmd window and execute the following command, I am logged in.
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy RemoteSigned -File C:\Users\MyName\Documents\Powershell\login_IE.ps1
However, if I create a scheduled task in Windows Task Scheduler executing this script, it doesn't work. I fill in "Program/script:" as:
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe
and fill the "Add arguments (optional):" as:
-ExecutionPolicy RemoteSigned -File C:\Users\MyName\Documents\Powershell\login_IE.ps1
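For reference, an equivalent task could also be registered from PowerShell itself; this is only a sketch (the task name and the trigger time are made up, not what I actually configured):
$action  = New-ScheduledTaskAction -Execute 'C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe' `
    -Argument '-ExecutionPolicy RemoteSigned -File C:\Users\MyName\Documents\Powershell\login_IE.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At '00:05'
Register-ScheduledTask -TaskName 'University web login' -Action $action -Trigger $trigger -RunLevel Highest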
The scheduled task is run under my account (I am the only user of this computer).
If I run the scheduled task manually, in Task Manager I can see two IE processes opened under "Background processes", communicating with the internet, and then being killed, so I am pretty sure the script has actually been executed. But I found I am not logged in, since I don't have internet access. Where could the problem be?
Any advice is really appreciated. Thanks in advance.
I had a similar type of issue: when running the script directly from a PowerShell window it works as expected, but from Task Scheduler or the command line I was not getting the desired results.
Commenting out some lines and adding the ones below to my script helped me run it from the command line and Task Scheduler as well:
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Hope this helps anyone else.
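As a side note, if the portal is a plain HTML form, the IE automation could possibly be replaced with a direct HTTP POST. This is a rough, untested sketch: the field names 'loginname' and 'password' come from the script above, but the real form action URL and any hidden fields would have to be checked in the page source:
$url = 'http://xxx.xxx.xxx.xxx/'
# Fetch the page first to pick up any cookies, then post the login form
Invoke-WebRequest -Uri $url -SessionVariable session -UseBasicParsing | Out-Null
$response = Invoke-WebRequest -Uri $url -Method Post -WebSession $session -UseBasicParsing -Body @{
    loginname = 'xxxxxxxx'
    password  = 'xxxxxxxx'
}
$response.StatusCode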
I'm experiencing a weird case with my PowerShell script.
I have written a script that executes an .exe file.
This exe's runtime is about 3 hours, but it constantly crashes after 2 hours (give or take 1-2 minutes).
I have been breaking my head trying to figure out why the process is crashing.
Eventually I found that the .exe is crashing because PowerShell is crashing.
Here is the process execution command:
$Proc = Start-Process -FilePath $ExePath -ArgumentList $Arguments -NoNewWindow -PassThru
$Proc | Wait-Process -Timeout 28800 -ea 0 -ev timeouted
After I realized this issue was caused by PowerShell, I enabled Windows PowerShell logging and found the error message "The pipeline has been stopped".
The script needs to perform more actions after the process ends and to get its exit code; that's why I used the -PassThru flag.
I have tried running it without the -PassThru flag or the Wait-Process command; the result stayed the same (the process crashed after 2 hours, but there was no log entry with the message "The pipeline has been stopped").
Important points:
The .exe file's code is wrapped in try/catch blocks with a logger, but nothing was logged when it crashed - this is not a runtime error in the .exe file.
When running the .exe independently from the command line, it finishes successfully after ~3 hours.
The PowerShell script runs with Administrator privileges.
The exe is not causing the crash through high CPU/memory/disk usage.
I will follow up once I have more updates.
Thanks to all the helpers.
Your help is much appreciated!
In my opinion the Start-Process cmdlet is good for quick things, but it leaves a lot to be desired when trying to debug why an exe isn't behaving.
To work around this in your case, it might be useful to use .NET objects to redirect and change certain things about how the process is started. I put an example function below that I've used when having trouble debugging exe runs.
function Start-Exe
{
    param
    ( [string]$exePath, [string]$args1 )

    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = $exePath
    $pinfo.RedirectStandardError = $true
    $pinfo.RedirectStandardOutput = $true
    $pinfo.UseShellExecute = $false
    $pinfo.Arguments = $args1

    $p = New-Object System.Diagnostics.Process
    $p.StartInfo = $pinfo
    $p.Start() | Out-Null

    # Read the redirected streams before WaitForExit to avoid the documented
    # deadlock when the child process fills the output buffer
    # (for very chatty tools, one of the two streams should be read asynchronously)
    $stdout = $p.StandardOutput.ReadToEnd()
    $stderr = $p.StandardError.ReadToEnd()
    $p.WaitForExit()
    $exitCode = $p.ExitCode

    #OutputFile can be some log file, or just use Write-Host to pump to console
    #$stdout | Add-Content $global:OutputFile
    #$stderr | Add-Content $global:OutputFile

    return $exitCode
}
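Calling it would look something like this (the exe path and arguments are placeholders, not from your setup):
# Hypothetical long-running tool; keep the exit code for the follow-up actions
$exitCode = Start-Exe -exePath 'C:\Tools\LongJob.exe' -args1 '/run /verbose'
Write-Host "LongJob.exe finished with exit code $exitCode"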
You can try it without using Wait-Process and see if it is reproducible.
Or use Start-Process with the -Wait parameter.
Or create it in a job without -Wait and wait for the job using the Wait-Job cmdlet.
I had the same symptom when trying to execute an 8-hour process; it would always die after 2 hours. You need to clear the PowerShell IdleTimeout.
This answer helped me
I'm using Register-ScheduledJob to register a job in PowerShell; in the background job I execute a script. This script contains some commands like Get-Process and the Write-Host command. And...
Although every command is executed, in the results I don't see the outputs from Write-Host (Get-Process is OK).
Maybe someone knows why?
Write-Host writes to the host, which is the app that the script is running in (PowerShell.exe for instance), so it is output explicitly to the screen, and DOES NOTHING when you're running in a non-interactive environment. You should never use that to output data that you want to collect, only for lightweight debugging or for printing to the screen in interactive scripts.
You should generally use write-output for the data that you want to collect as output.
Although you can also use the debug/warning/error output (those are collected by the job, but not shown in the regular output).
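A tiny illustration of the difference, using a regular background job (how the Write-Host text is displayed varies by PowerShell version, but it is never part of the captured output):
$job = Start-Job { Write-Output 'data for the pipeline'; Write-Host 'host-only message' }
Wait-Job $job | Out-Null
# Only the Write-Output value ends up in the collected results
$result = Receive-Job $job
$result   # -> 'data for the pipeline'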
Thank you very much. Write-Output helped.
Additionally, something I discovered during the last few days that could be helpful for others: if you start a scheduled background job and in this job start a PowerShell script like this:
Register-ScheduledJob -Name temp -ScheduledJobOption $option -ScriptBlock {
    D:\scprit.ps1
}
Your job will never end, because after the script finishes the PowerShell window is still open. So additionally you have to add an exit to your script block:
Register-ScheduledJob -Name temp -ScheduledJobOption $option -ScriptBlock {
    D:\scprit.ps1
    Exit-PSSession
}
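To actually collect that Write-Output data afterwards, something like this should work (the job name 'temp' is taken from the example above):
# Scheduled jobs only show up in Get-Job once the PSScheduledJob module is loaded
Import-Module PSScheduledJob
# Each completed run of the scheduled job appears as a job instance
Get-Job -Name temp | Receive-Job -Keep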