I need to log my PowerShell output. My .ps1 file is something like this:
#Set-ExecutionPolicy Unrestricted
$ErrorActionPreference="SilentlyContinue"
Stop-Transcript | out-null
$ErrorActionPreference = "Continue"
$date = (Get-Date).tostring("MMddyy HHmmss")
$filename = 'C:\apierror\logs\' + $date + '.txt'
Start-Transcript -path $filename -append
$python = "C:\Python34\python.exe"
$python_path = "C:\script.py"
cd (split-path $python_path)
& $python $python_path
Stop-Transcript
Now, when I run this file directly from PowerShell, the output is logged correctly. But when I try to run it from Task Scheduler, only some portion of the console output is stored in the file.
Any ideas why that might be?
Using a transcript only stored partial output for some reason. I ended up doing the logging directly in the Python script instead of in PowerShell. That seems to be working correctly.
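If you want to stay on the PowerShell side, a minimal sketch of an alternative (an untested assumption on my part, not what I ended up using) would be to drop the transcript and redirect all of the Python call's output streams to the log file instead, reusing the $python, $python_path and $filename variables from the script above:
# Append every output stream of the Python process to the log file
& $python $python_path *>> $filename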
I have a list of urls (urls.txt):
https://example.com/1.webp
https://example.org/bar2.webp
... 10k more
Files vary in size from 1kb to 100kb.
How can I download these files quickly on a Windows 10 machine without installing any third-party software?
I need it to be in a single file that user can double-click without installing any additional software.
It should work on any decently up-to-date Windows 10 PC. AFAIK it means the PowerShell version is 5.1.
Additional information.
I tried this:
powershell -Command "Invoke-WebRequest https://example.com/1.webp -OutFile 1.webp"
but it is extremely slow due to sequential execution.
So far this works in PowerShell fast enough:
Get-Content .\urls.txt | ForEach-Object {
    $FileName = Split-Path -Leaf $_
    Invoke-WebRequest $_ -OutFile $FileName
}
But I can't figure out how to invoke this script with a double-click on a file.
Invoking .ps1 file from a .bat file doesn't work. Error:
download.ps1 cannot be loaded because running scripts is disabled on this system.
Asking user to adjust permissions is not an option.
This works in a clickable .bat file:
powershell -command ^
Invoke-WebRequest https://example.com/1.webp -OutFile 1.webp;
But this script fails silently:
powershell -command ^
Get-Content .\urls.txt |ForEach-Object { ^
$FileName = Split-Path -leaf $_ ^
Invoke-WebRequest $_ -OutFile $FileName ^
} ^
"...how do I iterate over a file lines with it? Sry, I never used Windows" (that must feel like me after a Linux machine).
Open a PowerShell prompt (Start → Run → PowerShell) or just type PowerShell.exe on the command prompt.
At the PowerShell prompt, to run the task in parallel using ForEach-Object -Parallel:
1..9 | ForEach-Object -Parallel { Invoke-WebRequest "https://example.com/$_.webp" -OutFile "$_.webp" }
Where "$_" is the current item (1to9`), you might also use a list here, like:
'One', 'Two', 'Three' |ForEach-Object -Parallel { ...
In case you "need to read it directly from the file", (presuming that you want use the name in the url as your filename) you might do something like this:
Get-Content .\urls.txt | ForEach-Object -Parallel {
    $FileName = Split-Path -Leaf $_
    Invoke-WebRequest $_ -OutFile $FileName
}
Update
(based on the additional information in your question and comments in this answer)
Final steps to making your command line easy to launch for a novice user, taking into account that passing "complex" commands with special characters (such as newlines, spaces and double quotes) from a batch file interpreter to PowerShell is quite a hassle, as there are a lot of exceptions on the exceptions. See: these stackoverflow questions
In your case it might be as simple as putting your commands on a single (quoted) command line and separating each statement with a semicolon (;):
powershell -command "Get-Content .\urls.txt |ForEach-Object { $FileName = Split-Path -leaf $_; Invoke-WebRequest $_ -OutFile $FileName }"
But to be on the safe side (in case e.g. a PowerShell command/parameter needs to be quoted itself), I would rather supply a robust solution, which is encoding your command line to base64 and using the -EncodedCommand parameter. See also these answers:
running powershell as shell command having error in StartTime variable for FilterHashtable
Pass complex arguments to powershell script through encoded command
Encoding
To encode your command line to base64:
$Command = {
    Get-Content .\urls.txt |ForEach-Object {
        $FileName = Split-Path -leaf $_
        Invoke-WebRequest $_ -OutFile $FileName
    }
}.ToString()
$Bytes = [System.Text.Encoding]::Unicode.GetBytes($Command)
[Convert]::ToBase64String($Bytes)
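To sanity-check the result, you can decode the base64 string back and confirm it is the command you expect (a quick verification sketch; $Base64 here is assumed to hold the string produced above):
$Base64 = [Convert]::ToBase64String($Bytes)
# Decode back to text; this should print the original script block
[System.Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($Base64))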
Download.bat
To include the encoded command line in a (single) batch file, add the following command to your batch file, where the base64 string is copied from the ToBase64String conversion above:
PowerShell -EncodedCommand CgAgACAAIAAgACAAIAAgACAARwBlAHQALQBDAG8AbgB0AGUAbgB0ACAALgBcAHUAcgBsAHMALgB0AHgAdAAgAHwARgBvAHIARQBhAGMAaAAtAE8AYgBqAGUAYwB0ACAAewAKACAAIAAgACAAIAAgACAAIAAgACAAIAAgACQARgBpAGwAZQBOAGEAbQBlACAAPQAgAFMAcABsAGkAdAAtAFAAYQB0AGgAIAAtAGwAZQBhAGYAIAAkAF8ACgAgACAAIAAgACAAIAAgACAAIAAgACAAIABJAG4AdgBvAGsAZQAtAFcAZQBiAFIAZQBxAHUAZQBzAHQAIAAkAF8AIAAtAE8AdQB0AEYAaQBsAGUAIAAkAEYAaQBsAGUATgBhAG0AZQAKACAAIAAgACAAIAAgACAAIAB9AAoAIAAgACAAIAA=
You could try the ForEach-Object -Parallel method for this case. I tried something similar once with multiple process starts for Robocopy to copy around 1000 small files (5-10 KB) to another hard drive.
I will look it up and see if I can find it again.
Edit 1: you can go about it like this, for example:
$allmylinks = Import-Csv -Path "path to your csv"
$allmylinks | ForEach-Object -Parallel {
    Invoke-WebRequest $_.url   # assumes the CSV has a column named "url"
}
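A slightly fuller sketch of the same idea, saving each file under the name taken from the URL and capping simultaneous downloads with -ThrottleLimit (the column name url and the limit of 10 are assumptions; note that ForEach-Object -Parallel needs PowerShell 7+):
Import-Csv -Path "path to your csv" | ForEach-Object -Parallel {
    # Derive the local file name from the last segment of the URL
    $name = Split-Path -Leaf $_.url
    Invoke-WebRequest $_.url -OutFile $name
} -ThrottleLimit 10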
I'm trying to run the code below in an automated scheduled task.
Whether I run this task manually or automatically, it is not working. When the option 'Run only when user is logged on' is set, I at least see a PowerShell window opening, and I do see the jobs getting started. However, when the PS window closes, the jobs are not visible (not completed, not failed, nothing).
The logging shows the script runs until the Import-Csv command. I have put the CSV in the root of C:, and I run the automated task as the logged-in user and with highest privileges.
Why doesn't it get past Import-Csv? When I run this script in e.g. PowerShell ISE it works like a charm.
Running program
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Arguments:
-NoProfile -ExecutionPolicy Unrestricted -File "C:\Users\usr\Desktop\Scripts\script.ps1"
Start-in:
C:\Users\usr\Desktop\Scripts
Write-Host "Starting script"
$maxItems = 8
$iplist = import-csv "C:\Create.csv.txt"
Write-Host "Opened $($iplist[0])"
For ($i = 0; $i -le $maxItems; $i++) {
    Write-Host $iplist[$i].DisplayName
    Start-Job -ScriptBlock {
        Param($displayName)
        try {
            Start-Transcript
            Write-Host "Found and started a job for $($displayName)"
            Stop-Transcript
        }
        Catch {
            Write-Host "Something went wrong"
            Stop-Transcript
        }
    } -ArgumentList $iplist[$i].DisplayName
}
UPDATE:
The PS window closed before it got to do anything. The answer on this page sent me in the right direction. The full fix I used to get this working:
Task Scheduling and Powershell's Start-Job
First, to prevent the PowerShell window from closing, add the following line to the bottom of the script:
Read-Host 'Press Any Key to exit'
Second, if you run into issues with params, try explicitly naming the param with a flag:
$iplist = Import-csv -LiteralPath "C:\Create.csv.txt"
Third, make sure that you explicitly declare the delimiter being used if it is different from a comma, as in the sketch below.
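Putting those three points together, the relevant lines might look something like this (the comma delimiter and the placement comments are assumptions; adjust them to your actual file and script):
# Near the top of the script: name the parameters explicitly and set the delimiter
$iplist = Import-Csv -LiteralPath "C:\Create.csv.txt" -Delimiter ','
# At the very bottom of the script: keep the console open so the jobs stay visible
Read-Host 'Press Any Key to exit'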
I'm experiencing a weird issue with my PowerShell script.
I have written a script that executes an .exe file.
The exe's runtime is about 3 hours, but it consistently crashes after 2 hours (give or take 1-2 minutes).
I have been breaking my head trying to figure out why the process is crashing.
Eventually I found that the .exe is crashing because PowerShell is crashing.
Here is the process execution command:
$Proc = Start-Process -FilePath $ExePath -ArgumentList $Arguments -NoNewWindow -PassThru
$Proc | Wait-Process -Timeout 28800 -ea 0 -ev timeouted
After I realized this issue is caused by PowerShell, I enabled Windows PowerShell logging and found the error message "The pipeline has been stopped".
The script needs to perform more actions after the process ends and get its exit code; that's why I used the -PassThru flag.
I have tried to run it without the -PassThru flag or the Wait-Process command; the result stayed the same (the process crashed after 2 hours, but there was no log with the message "The pipeline has been stopped").
Important points:
The .exe's source is wrapped in try/catch blocks with a logger, but it did not log anything when crashing, so this is not a runtime error in the .exe file.
When running the .exe independently from the command line, it finishes successfully after ~3 hours.
The PowerShell script runs with Administrator privileges.
The exe is not causing the crash through high CPU/memory/disk usage.
I will follow up once I have more updates.
Thanks to all the helpers.
Your help is much appreciated!
In my opinion the Start-Process cmdlet is good for quick things, but it leaves a lot to be desired when you're trying to debug why an exe isn't behaving.
To work around this in your case, it might be useful to use .NET objects to redirect and change certain things about your instantiation. I put an example function below that I've used when having trouble debugging exe runs.
function Start-Exe
{
    param
    ( [string]$exePath, [string]$args1 )

    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = $exePath
    $pinfo.RedirectStandardError = $true
    $pinfo.RedirectStandardOutput = $true
    $pinfo.UseShellExecute = $false
    $pinfo.Arguments = $args1

    $p = New-Object System.Diagnostics.Process
    $p.StartInfo = $pinfo
    $p.Start() | Out-Null

    # Read the redirected streams before waiting; draining them first avoids
    # the deadlock that can occur when the child process fills an output buffer
    $stdout = $p.StandardOutput.ReadToEnd()
    $stderr = $p.StandardError.ReadToEnd()
    $p.WaitForExit()
    $exitCode = $p.ExitCode

    #OutputFile can be some log file, or just use Write-Host to pump to console
    #$stdout | Add-Content $global:OutputFile
    #$stderr | Add-Content $global:OutputFile

    return $exitCode
}
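Calling it could then look something like this, reusing the $ExePath and $Arguments variables from the question:
$exitCode = Start-Exe -exePath $ExePath -args1 $Arguments
Write-Host "Process finished with exit code $exitCode"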
You can try it without using Wait-Process and see if it is still reproducible:
use Start-Process with the -Wait parameter,
or create it in a job without -Wait and wait for the job using the Wait-Job cmdlet.
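A rough sketch of the job-based variant, reusing $ExePath and $Arguments from the question (exit-code handling is omitted here, and the 28800-second timeout mirrors the original Wait-Process call):
$job = Start-Job -ScriptBlock {
    param($exe, $arguments)
    # Run the executable inside the background job
    & $exe $arguments
} -ArgumentList $ExePath, $Arguments
Wait-Job -Job $job -Timeout 28800 | Out-Null
Receive-Job -Job $job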
I had the same symptom when trying to execute an 8-hour process; it would always die after 2 hours. You need to clear the PowerShell IdleTimeout.
This answer helped me:
Using Task Scheduler, I am running a PS script to restart selected Windows services using Restart-Service. For troubleshooting I'd like to write the output to a log file so we can make sure the service did restart. For the life of me I can't get the output file to contain anything; it just creates the file (named in date format) with no contents.
Thank you
Edit:
OG Script
Restart-Service Printer Spooler -Force | Out-File c:\scripts\test3.txt
If I add -PassThru I get output, but it is pretty bare bones. I would like to log the steps of the Service Controller.
Restart-Service Printer Spooler -Force -PassThru | Out-File c:\scripts\test3.txt
$logFile = "C:\Windows\Temp\out.txt"
$serviceName = "serviceName"
Restart-Service $serviceName -Verbose *> $logFile
The -Verbose switch gives you detailed start/stop attempt information
*> Redirects all command output to the log file.
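If the scheduled task runs repeatedly and you'd rather keep a history than overwrite the file each time, the appending form of the redirection works the same way (a small variation on the above):
Restart-Service $serviceName -Verbose *>> $logFile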
Provided the service you're restarting writes to the event logs, I'd grab the data from there and log it, or leave it there and grab it as needed. If you want to output it, this is one approach:
$date = (Get-Date).AddMinutes(-5)
$serviceData = Get-Service wersvc
Restart-Service $serviceData
$eventData = Get-WinEvent -FilterHashtable @{ LogName = 'System'; StartTime = $date; ID = 7036 } | ? { $_.Message -match $serviceData.DisplayName }
$eventData | Out-File C:\logs\filename.txt
Disclosure
I'm a PHP/Linux developer who is having to get used to working in a Windows environment, so I may very well be missing something fundamental in this question. I've researched the heck out of this and can't seem to pinpoint a solution. Thanks for your help in advance.
I have a batch file that calls a PowerShell script; it doesn't work correctly when it is started by the Windows Task Scheduler, but works perfectly when it is launched by hand.
Below is the PowerShell script that the batch file is launching...
$WatchFolder = '\\networkshare\foldername'
$Filter = '*.csv'
$fsw = New-Object IO.FileSystemWatcher $WatchFolder, $Filter -Property @{ IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters] 'LastWrite' }

Register-ObjectEvent $fsw Changed -SourceIdentifier SAS_Poll_FileChanged -Action {
    $WatchFolder = '\\networkshare\foldername'
    $OutputFolder_one = '\\networkshare\outputFolder_One'
    $OutputFolder_two = '\\networkshare\outputFolder_Two'
    $name = $Event.SourceEventArgs.Name
    $lock_file = $WatchFolder + '\.' + $name + '.lock'
    $test = (Test-Path "$lock_file")
    if ( $test ) {
        return
    } else {
        Set-ExecutionPolicy -Scope CurrentUser Bypass
        Out-File -FilePath "$lock_file"
        Start-Process powershell -ArgumentList ".\General\PollingProcess-alone.ps1 $WatchFolder $MainFrameFolder $SASFolder $name $lock_file" -NoNewWindow
    }
}
I know the issue occurs at the following line...
Start-Process powershell -ArgumentList ".\General\PollingProcess-alone.ps1 $WatchFolder $MainFrameFolder $SASFolder $name $lock_file" -NoNewWindow
I know this because when the event is triggered (when the script is launched via Task Scheduler), the lock file is created and then the process hangs.
I therefore think that the issue has something to do with the path of the second PowerShell script I'm calling, but I don't know how to fix it. I've tried using the full path of the second script, but haven't been able to make it work.
Just to give you some more context: it is sort of important that the event process spins up a new PowerShell script, because I need these scripts to run concurrently.
Pretty sure the problem is with your argument list. Right now you are passing a single string with everything contained within it; change that to an array and things should work properly.
so convert
".\General\PollingProcess-alone.ps1 $WatchFolder $MainFrameFolder $SASFolder $name $lock_file"
to
".\General\PollingProcess-alone.ps1",$WatchFolder,$WatchFolder,etc..
Give that a shot and let us know if it works for you. Also, I want to say that your code is impressive for being relatively new to PowerShell, so kudos lol.