Scheduled PowerShell script won't run - Windows

I have a .ps1 script like the one below; it filters files and outputs a formatted HTML file.
$a = "<style>"
$a = $a + "BODY{background-color:peachpuff;}"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
$a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:thistle}"
$a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:PaleGoldenrod}"
$a = $a + "</style>"
$b = Get-Date -Format u
Get-ChildItem -Recurse K:\AppData\*.* -Filter *.CATPart | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-6)} | sort LastWriteTime -descending | select name,LastWriteTime,Directory | convertto-html -head $a -body "<H2>CATIA PAST 7 DAYS -- $b </H2>" | out-file C:\aaa\catia_result.htm
I can run this script manually with no problem at all, but when I schedule it to run, it only gives me the formatted HTM file without any of the filtered data in it. These are the arguments I used in Task Scheduler:
powershell.exe -ExecutionPolicy Bypass -Command "C:\aaa\RLTP_HTML_FINAL.ps1"
I tried changing the execution policy to Unrestricted; it still won't work. The task history shows the task completed, but there is no data in the HTML file.
I also tried using a batch file to call PowerShell to run the script; the result is the same: it works when run manually but not from Task Scheduler.

Most likely, when the script runs on a schedule, the session has no mapping for the K:\ drive. Make sure that:
The K:\ drive is mapped in the script itself, using the New-PSDrive cmdlet
The credentials the scheduled task is set to run under have access to the K:\ drive
Alternatively, you could simply specify a UNC path instead of referencing the K:\ drive.
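For example, a minimal sketch of both fixes (the UNC path "\\fileserver\AppData" is a placeholder for whatever K: actually points to):
# Option 1: map K: inside the script so the scheduled session has it
New-PSDrive -Name K -PSProvider FileSystem -Root "\\fileserver\AppData" | Out-Null
# Option 2: skip the drive letter entirely and query the UNC path directly
Get-ChildItem -Recurse "\\fileserver\AppData" -Filter *.CATPart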
P.S. Good job using -ExecutionPolicy Bypass; that helps to avoid any issues with the execution policy. :) It doesn't matter what the execution policy is set to, as long as you use that parameter.
If you want to capture any errors in your script, make this your last line:
Add-Content -Path $PSScriptRoot\error.log -Value $error;
You might see something about the K:\ drive missing.

Related

Powershell script: List files with specific change date (Amount if possible)

For licensing purposes I am trying to automate the counting process instead of having to log in to every single server, go into a directory, search for a file name, and count the results based on the change date.
What I'm aiming for:
Running a PowerShell script every month that recursively checks the directory "C:\Users" for the file "Outlook.pst", filters the results by change date (one month or newer), and then packs this into an email to send to my inbox.
I'm not sure if that's possible, because I am fairly new to PowerShell. Would appreciate your help!
It is possible.
I don't know how to start a PS session on a remote computer, but I think the Enter-PSSession cmdlet will do the trick; at least it was the first result when searching for "open remote powershell session". If that does not work, use Invoke-Command as suggested by lit to get $outlookFiles, as sketched below.
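A rough sketch of the Invoke-Command variant (SERVER01 is a placeholder, and PowerShell remoting must be enabled on the target):
$outlookFiles = Invoke-Command -ComputerName SERVER01 -ScriptBlock {
    # Run the same search on the remote machine and return the results
    Get-ChildItem -Path "C:\Users" -Recurse -ErrorAction SilentlyContinue |
        Where-Object { $_.Name -eq "Outlook.pst" }
}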
For the rest, use this:
$outlookFiles = Get-ChildItem -Path "C:\Users" -Recurse | Where-Object { $_.Name -eq "Outlook.pst" }
Now you have all files with this name. If you are not familiar with the pipe in PowerShell: it passes every object found by Get-ChildItem to the next pipeline section, where Where-Object filters the received objects. If the current object ($_) passes the condition, it is returned by the whole command.
Now you can filter these objects again to include only the latest ones:
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = $outlookFiles | Where-Object { $_.LastAccessTime -gt $latestDate }
Now you have all the data you want in one object; you only have to format it how you like, e.g. you could use $mailBody = $newFiles | Out-String and then Send-MailMessage -To x@y.z -From r@g.b -Body $mailBody to send the mail.
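Putting the mail step together, a minimal sketch (the addresses and SMTP server are placeholders to replace with your own):
$mailBody = $newFiles | Out-String
Send-MailMessage -To "me@example.com" -From "report@example.com" `
    -Subject "Outlook.pst files changed in the last month" `
    -Body $mailBody -SmtpServer "smtp.example.com"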

Remove an entry from credential manager for all users on Windows

I am currently implementing a "remove settings" step for all users in a Windows uninstaller and came across an issue I am not even sure is possible to solve.
The application stores credential entries for the current user using the Credential Manager (keymgr.dll). Let's call the target of the credential "X". On uninstall, all credentials stored with target "X" should be removed for all users. The uninstaller of course requires administrator privileges, but I still find it very difficult to accomplish this.
For the current user this is generally solved via cmdkey /delete:X from a command prompt. As far as I know, cmdkey.exe /list only lists entries for the current user and can't remove entries belonging to another user.
I have learned that the credentials are stored as OS files under the C:\Users\_user_\AppData\Local\Microsoft\Credentials folder, but I can't know which files are the entries I want to delete, and removing all of them would be dangerous for other applications. I also assume removing OS files is dangerous and could have limitations (extra UAC prompt?) as well.
The runas command is the closest shot I've got, but because it requires the user's password it becomes very difficult and not something I would want in the uninstaller. I would also need a way to get the username and domain for each user and iterate over them.
I would prefer to use either cmd or powershell for this.
I don't want to necro an old post, but I needed to do this myself, so I figured I'd add it in case anyone else needs it:
cmdkey /list | ForEach-Object{if($_ -like "*Target:*" -and $_ -like "*microsoft*"){cmdkey /del:($_ -replace " ","" -replace "Target:","")}}
A PowerShell one-liner that will remove any credential with "microsoft" in the target string.
Reference:
https://gist.github.com/janikvonrotz/7819990
I ran this and it purged the credentials locally without needing to run as admin (but I am a local admin).
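For anyone who wants to see what the one-liner does, here is the same logic expanded with comments (functionally equivalent; adjust "*microsoft*" to match your own target names):
cmdkey /list | ForEach-Object {
    # cmdkey /list prints one "Target: ..." line per stored credential
    if ($_ -like "*Target:*" -and $_ -like "*microsoft*") {
        # Strip the spaces and the "Target:" label, leaving only the credential name
        $target = $_ -replace " ", "" -replace "Target:", ""
        cmdkey /del:$target
    }
}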
The cmdkey.exe utility, when run from a batch file or a PowerShell command, may encounter two issues related to special characters:
1. If run from a batch file, a credential whose name contains "(" or ")" (left or right paren) without double quotes will not be removed.
2. If the credential name (aka target name) has a hyphen surrounded by spaces (" - "), cmdkey will not remove or create a credential with that string.
There are a few PowerShell modules written to try to do this, but the only one I found that handles these exceptions was on GitHub:
https://github.com/bamcisnetworks/BAMCIS.CredentialManager
BAMCIS.CredentialManager
Using this I was able to create credentials with parens or hyphens to set up a test environment, but more importantly to remove them: gather the user's list of cached credentials with the module's list command, then pass that information to the remove command to remove ALL cached credentials.
One caveat: some period of time after removing the credentials, two cached credentials dynamically reappear.
So to address frequent user lockout issues I am going to try to deploy this using SCCM under user context at logoff. Otherwise a system restart after removing the credentials may be needed.
Here is a prototype script that imports the module and then uses it to remove all cached credentials. As always: test, test, test, and use at your own risk!
Clear-Host
Import-Module "$PSScriptRoot\BAMCIS.CredentialManager\BAMCIS.CredentialManager.psd1"
$L = Get-CredManCredentialList -ErrorAction SilentlyContinue
If ($null -eq $L)
{
    Write-Host "No Cached Credentials found to remove, no action taken"
    $LASTEXITCODE = 0
    Write-Host "The last exit code is $LASTEXITCODE"
}
Else
{
    ForEach ($cred in $L)
    {
        Write-Host "`tProcessing...`n"
        Write-Host "$($cred.TargetName.ToString())`n"
        Write-Host "$($cred.Type.ToString())`n`n"
        $R = Remove-CredManCredential -TargetName $cred.TargetName.ToString() -Type $cred.Type.ToString() -Force
    }
    # Re-check the list; note that -ErrorVariable takes the variable name without the $ sigil
    $L = Get-CredManCredentialList -ErrorAction SilentlyContinue -ErrorVariable Cred_Error
    If ($null -eq $L)
    {
        Write-Host "All Cached Credentials removed, program Complete"
        $LASTEXITCODE = 0
        Write-Host "The last exit code is $LASTEXITCODE"
    }
    Else
    {
        Write-Host "WARNING: One or more Cached Credentials were not removed, program Complete"
        $LASTEXITCODE = 1
    }
}
Batch, from an elevated cmd prompt:
To find the main ones (this lists any with MS.O, Micro, and Teams):
for /f "tokens=1-4 Delims=:=" %A in ('cmdkey /list ^| findstr Target: ^| findstr /i "MS.O Micro Teams"') do @echo %D
This deletes all the entries for Teams:
for /f "tokens=1-4 Delims=:=" %A in ('cmdkey /list ^| findstr /i teams') do @cmdkey /delete:%D
If you want to include this in a batch script, the syntax is slightly different: double the variables (%%A and %%D).
I ran into a similar issue clearing old credentials for non-domain-joined devices, and created a logon task to clear them. In my case I was looking for a particular file server, so as not to clear all creds by accident. Because of syntax issues when piping string expressions into the task, it was easier to create a reference .ps1 and then reference it in the task. I pushed this script out through Intune.
# Full path of the file
$file = 'C:\script\clearcreds.ps1'
# If the file does not exist, create it (-Force also creates the parent folder if needed).
if (-not (Test-Path -Path $file -PathType Leaf)) {
    try {
        $null = New-Item -Path $file -ItemType "file" -Force -Value 'cmdkey /list | ForEach-Object{if($_ -like "*Target:*" -and $_ -like "*fileserver*"){cmdkey /del:($_ -replace " ","" -replace "Target:","")}}'
        Write-Host "The file [$file] has been created."
    }
    catch {
        throw $_.Exception.Message
    }
}
#######################################
# Create a scheduled task to run at logon.
#######################################
$schtaskName = "Clear Cached Creds"
$schtaskDescription = "Clears Cached Creds for each user at logon."
$trigger = New-ScheduledTaskTrigger -AtLogOn
$class = Get-CimClass MSFT_TaskEventTrigger root/Microsoft/Windows/TaskScheduler
# Execute the task in the users' context (S-1-5-32-545 is BUILTIN\Users)
$principal = New-ScheduledTaskPrincipal -GroupId "S-1-5-32-545" -Id "Author"
$action3 = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument "-File C:\script\clearcreds.ps1"
$settings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries
$null = Register-ScheduledTask -TaskName $schtaskName -Trigger $trigger -Action $action3 -Principal $principal -Settings $settings -Description $schtaskDescription -Force
Start-ScheduledTask -TaskName $schtaskName

Remote PowerShell parsing

I'm trying to retrieve a parsed list of different information regarding remote executables within a Windows domain. Permissions are taken care of and the individual PowerShell commands are working; my issue is outputting this recursive list to a file (putting it all together properly):
My desired Output (per computer):
computer_name.csv # Filename
$application1Name.exe, $application1Version, $application1LastModifiedDateMMDDYY, $application1MD5HASH
$application2Name.exe, $application2Version, $application2LastModifiedDateMMDDYY, $application2MD5HASH
...
So far I have all the pieces:
#A way to recursively retrieve executables from a given remote path (Name + LastModified):
Get-ChildItem \\192.168.X.X\C$\defaultPath\FoldersAndSubfoldersWithExecutables\ -Include *.exe -Recurse | ForEach-Object {$_.Name, $_.LastWriteTime} > C:\LOCALPATH\output.txt
#A way to retrieve the version info from remote executables (Version):
[System.Diagnostics.FileVersionInfo]::GetVersionInfo("\\192.168.X.X\C$\defaultPath\application1.exe").FileVersion
#A way to retrieve the MD5 Hash from remote executable files (MD5HASH):
get-FileHash \\192.168.X.X\C$\defaultPath\application1.exe -Algorithm MD5 | ForEach-Object { $_.Hash }
My issue is building the script structure to accommodate the desired output listed above. I have a list of IP addresses to loop this script through, but I'm having issues connecting the dots.
Thanks!
Each operation you listed can be executed within the ForEach-Object loop, and a resulting CSV string containing all the necessary data points can be built using string interpolation.
Get-ChildItem \\192.168.x.x\C$\defaultPath\FoldersAndSubfoldersWithExes\ -Include *.exe -Recurse | ForEach-Object {
    $Name = $_.Name
    $LastWriteTime = $_.LastWriteTime
    $Version = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
    $Hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
    "$Name, $Version, $LastWriteTime, $Hash"
} | Out-File computerName.csv
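To produce one CSV per computer as the question asks, the same block can be wrapped in a loop over the address list; a sketch, assuming a computers.txt with one IP per line (both local paths are placeholders):
foreach ($ip in (Get-Content C:\LOCALPATH\computers.txt)) {
    # Query each remote admin share and write one CSV per machine
    Get-ChildItem "\\$ip\C$\defaultPath\FoldersAndSubfoldersWithExes\" -Include *.exe -Recurse | ForEach-Object {
        $Version = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
        $Hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
        "$($_.Name), $Version, $($_.LastWriteTime), $Hash"
    } | Out-File "C:\LOCALPATH\$ip.csv"
}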

How to check a folder for the appearance of any new file?

Help please, I cannot find a solution (Windows platform). I need to:
Scan a folder
Detect when any new file appears
Process that file
Another method to detect "new" files is the archive attribute. Whenever a file is created or changed, this attribute is set by Windows.
Whenever you process a file, unset its archive attribute (attrib -a file.ext).
The advantage is that you don't depend on any timing.
To list "new" (or changed) files, use dir /aa (dir /a-a will list processed files).
For more info see dir /? and attrib /?.
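The same archive-attribute technique can also be driven from PowerShell; a minimal sketch (the folder path is a placeholder):
# List files whose archive attribute is still set (new or changed since the last sweep)
foreach ($file in (Get-ChildItem "C:\TargetDirectory" -Attributes Archive)) {
    # ... process $file here ...
    # then clear the archive attribute (the equivalent of attrib -a) so it is not picked up again
    $file.Attributes = $file.Attributes -band (-bnot [IO.FileAttributes]::Archive)
}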
Without knowing exactly what you're trying to execute, this is all I can provide. You would theoretically run this as a scheduled task every hour:
foreach ($file in (Get-ChildItem "C:\TargetDirectory" | where {$_.lastwritetime -gt (Get-Date).AddHours(-1)})) {
    # Execute-Command -Target $file
}
You could use the FileSystemWatcher class to monitor the folder for new files.
It can easily be used from PowerShell as well:
$FSW = New-Object System.IO.FileSystemWatcher
Then use Register-ObjectEvent to "listen" for events raised from it.
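A minimal sketch of that pattern (folder and filter are placeholders):
$FSW = New-Object System.IO.FileSystemWatcher "C:\TargetDirectory", "*.*"
# The watcher does not raise events until this is enabled
$FSW.EnableRaisingEvents = $true
Register-ObjectEvent $FSW Created -SourceIdentifier FileCreated -Action {
    # $Event is populated automatically inside the -Action block
    Write-Host "New file: $($Event.SourceEventArgs.FullPath)"
}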
FileSystemWatcher is a utility I have recently learned and will definitely use in the future. The best part is that it relies on .NET eventing, so you don't need to build an external triggering structure.
Here is an example of how I am using this in a 24/7 production environment (the full script receives an XML file, processes it, and inserts the results into SQL in under 3 seconds).
Function Submit-ResultFile {
    # Actual file processing takes place here
}
Function Create-Fsw {
    Register-ObjectEvent $fsw Created -SourceIdentifier "Spectro FileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $file = $Event.SourceEventArgs.FullPath
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -ForegroundColor Green
        Submit-ResultFile -xmlfile $file
    }
}
# In the following line, you can change IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $watchFolder, $watchFilter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
# The watcher raises no events until this is enabled
$fsw.EnableRaisingEvents = $true
# Sweep for files that existed before the watcher started (they will not raise events)
$xmlFiles = Get-ChildItem -Path $ResultsDirectory -Filter *.xml
foreach ($file in $xmlFiles)
{
    Submit-ResultFile -xmlfile $file.FullName
}
# Register a new File System Watcher
Create-Fsw
Several important points to be aware of:
- If files exist in that location before the FSW is created, they WILL NOT trigger an "objectevent"; that is why the script above begins by running a sweep for existing files.
- When the FSW does trigger, you want it to process only one file at a time, since the next file-creation event will generate a new "objectevent". Structuring an FSW to work on multiple files per trigger will eventually result in a crash.

Unix tail equivalent command in Windows PowerShell

I have to look at the last few lines of a large file (typical size is 500 MB-2 GB). I am looking for an equivalent of the Unix command tail for Windows PowerShell. A few available alternatives are:
http://tailforwin32.sourceforge.net/
and
Get-Content [filename] | Select-Object -Last 10
For me, it is not allowed to use the first alternative, and the second alternative is slow. Does anyone know of an efficient implementation of tail for PowerShell?
Use the -Wait parameter with Get-Content, which displays lines as they are added to the file. This feature was present in PowerShell v1, but for some reason was not documented well in v2.
Here is an example
Get-Content -Path "C:\scripts\test.txt" -Wait
Once you run this, update and save the file and you will see the changes on the console.
For completeness, I'll mention that PowerShell 3.0 now has a -Tail flag on Get-Content:
Get-Content ./log.log -Tail 10
gets the last 10 lines of the file
Get-Content ./log.log -Wait -Tail 10
gets the last 10 lines of the file and waits for more
Also, for those *nix users, note that PowerShell aliases cat to Get-Content, so this usually works:
cat ./log.log -Tail 10
As of PowerShell version 3.0, the Get-Content cmdlet has a -Tail parameter that should help. See the technet library online help for Get-Content.
I used some of the answers given here, but just a heads up that
Get-Content -Path Yourfile.log -Tail 30 -Wait
will chew up memory after a while. A colleague left such a "tail" running over the last day, and it went up to 800 MB. I don't know if Unix tail behaves the same way (but I doubt it). So it's fine for short-term use, but be careful with it.
PowerShell Community Extensions (PSCX) provides the Get-FileTail cmdlet. It looks like a suitable solution for the task. Note: I did not try it with extremely large files, but the description says it efficiently tails the contents and that it is designed for large log files.
NAME
Get-FileTail
SYNOPSIS
PSCX Cmdlet: Tails the contents of a file - optionally waiting on new content.
SYNTAX
Get-FileTail [-Path] <String[]> [-Count <Int32>] [-Encoding <EncodingParameter>] [-LineTerminator <String>] [-Wait] [<CommonParameters>]
Get-FileTail [-LiteralPath] <String[]> [-Count <Int32>] [-Encoding <EncodingParameter>] [-LineTerminator <String>] [-Wait] [<CommonParameters>]
DESCRIPTION
This implementation efficiently tails the contents of a file by reading lines from the end rather than processing the entire file. This behavior is crucial for efficiently tailing large log files, including large log files over a network. You can also specify the Wait parameter to have the cmdlet wait and display new content as it is written to the file. Use Ctrl+C to break out of the wait loop. Note that if an encoding is not specified, the cmdlet will attempt to auto-detect the encoding by reading the first character from the file. If no characters have been written to the file yet, the cmdlet will default to using Unicode encoding. You can override this behavior by explicitly specifying the encoding via the Encoding parameter.
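Going by the syntax above, usage mirrors Get-Content (the log path is a placeholder):
# Last 10 lines of a large log, then keep waiting for new content
Get-FileTail C:\logs\app.log -Count 10 -Wait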
Probably too late for an answer, but try this one:
Get-Content <filename> -tail <number of items wanted> -wait
Just some additions to previous answers. There are aliases defined for Get-Content: for example, if you are used to UNIX you might like cat, and there are also type and gc. So instead of
Get-Content -Path <Path> -Wait -Tail 10
you can write
# Print whole file and wait for appended lines and print them
cat <Path> -Wait
# Print last 10 lines and wait for appended lines and print them
cat <Path> -Tail 10 -Wait
I have a useful tip on this subject concerning multiple files.
Following a single log file (like tail -f in Linux) with Windows PowerShell 5.1 (Win7 and Win10) is easy (just use "Get-Content MyFile -Tail 1 -Wait"). However, watching MULTIPLE log files at once seems complicated. With PowerShell 7.x+, I've found an easy way using "Foreach-Object -Parallel", which runs multiple Get-Content commands concurrently. For example:
Get-ChildItem C:\logs\*.log | Foreach-Object -Parallel { Get-Content $_ -Tail 1 -Wait }
Using PowerShell v2 and below, Get-Content reads the entire file, so it was of no use to me. The following code works for what I needed, though there are likely some issues with character encodings. This is effectively tail -f, but it could easily be modified to get the last x bytes, or the last x lines if you want to search backwards for line breaks.
$filename = "\wherever\your\file\is.txt"
$reader = New-Object System.IO.StreamReader(New-Object IO.FileStream($filename, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read, [IO.FileShare]::ReadWrite))
# start at the end of the file
$lastMaxOffset = $reader.BaseStream.Length
while ($true)
{
    Start-Sleep -m 100
    # if the file size has not changed, idle
    if ($reader.BaseStream.Length -eq $lastMaxOffset) {
        continue
    }
    # seek to the last max offset
    $reader.BaseStream.Seek($lastMaxOffset, [System.IO.SeekOrigin]::Begin) | Out-Null
    # read out of the file until the EOF
    $line = ""
    while (($line = $reader.ReadLine()) -ne $null) {
        Write-Output $line
    }
    # update the last max offset
    $lastMaxOffset = $reader.BaseStream.Position
}
I found most of the code to do this here.
I took @hajamie's solution and wrapped it up in a slightly more convenient script wrapper.
I added an option to start from an offset before the end of the file, so you can use the tail-like functionality of reading a certain amount from the end of the file. Note the offset is in bytes, not lines.
There's also an option to continue waiting for more content.
Examples (assuming you save this as TailFile.ps1):
.\TailFile.ps1 -File .\path\to\myfile.log -InitialOffset 1000000
.\TailFile.ps1 -File .\path\to\myfile.log -InitialOffset 1000000 -Follow:$true
.\TailFile.ps1 -File .\path\to\myfile.log -Follow:$true
And here is the script itself...
param (
    [Parameter(Mandatory=$true,HelpMessage="Enter the path to a file to tail")][string]$File = "",
    [Parameter(Mandatory=$true,HelpMessage="Enter the number of bytes from the end of the file")][int]$InitialOffset = 10248,
    [Parameter(Mandatory=$false,HelpMessage="Continue monitoring the file for new additions?")][boolean]$Follow = $false
)
$ci = Get-ChildItem $File
$fullName = $ci.FullName
$reader = New-Object System.IO.StreamReader(New-Object IO.FileStream($fullName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read, [IO.FileShare]::ReadWrite))
# start at the requested offset before the end of the file
$lastMaxOffset = $reader.BaseStream.Length - $InitialOffset
while ($true)
{
    # if there is unread content, read it out
    if ($reader.BaseStream.Length -ge $lastMaxOffset) {
        # seek to the last max offset
        $reader.BaseStream.Seek($lastMaxOffset, [System.IO.SeekOrigin]::Begin) | Out-Null
        # read out of the file until the EOF
        $line = ""
        while (($line = $reader.ReadLine()) -ne $null) {
            Write-Output $line
        }
        # update the last max offset
        $lastMaxOffset = $reader.BaseStream.Position
    }
    if ($Follow) {
        Start-Sleep -m 100
    } else {
        break
    }
}
Try the Windows Server 2003 Resource Kit Tools; it contains a tail.exe that can be run on Windows systems.
https://www.microsoft.com/en-us/download/details.aspx?id=17657
There have been many valid answers; however, none of them has the same syntax as tail in Linux. The following function can be stored in your $Home\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 for persistence (see the PowerShell profiles documentation for more details).
This allows you to call...
tail server.log
tail -n 5 server.log
tail -f server.log
tail -Follow -Lines 5 -Path server.log
which comes quite close to the Linux syntax.
function tail {
    <#
    .SYNOPSIS
        Get the last n lines of a text file.
    .PARAMETER Follow
        Output appended data as the file grows.
    .PARAMETER Lines
        Output the last N lines (default: 10).
    .PARAMETER Path
        Path to the text file.
    .INPUTS
        System.Int
        IO.FileInfo
    .OUTPUTS
        System.String
    .EXAMPLE
        PS> tail c:\server.log
    .EXAMPLE
        PS> tail -f -n 20 c:\server.log
    #>
    [CmdletBinding()]
    [OutputType('System.String')]
    Param(
        [Alias("f")]
        [parameter(Mandatory=$false)]
        [switch]$Follow,
        [Alias("n")]
        [parameter(Mandatory=$false)]
        [Int]$Lines = 10,
        [parameter(Mandatory=$true, Position=5)]
        [ValidateNotNullOrEmpty()]
        [IO.FileInfo]$Path
    )
    if ($Follow)
    {
        Get-Content -Path $Path -Tail $Lines -Wait
    }
    else
    {
        Get-Content -Path $Path -Tail $Lines
    }
}
Very basic, but does what you need without any addon modules or PS version requirements:
while ($true) {Clear-Host; gc E:\test.txt | select -last 3; sleep 2 }
It is possible to download all of the UNIX commands compiled for Windows from this GitHub repository: https://github.com/George-Ogden/UNIX
For those admins who live by the axiom that less typing is best, here is the shortest version I can find:
gc filename -wai -ta 10
