Trying to pull out the newest errors in a log file and then create a custom .txt output - powershell-4.0

I am new to PowerShell scripting and have been tasked with creating some alerts based on errors in certain log files. These are just logs from a bespoke application.
My current code is:
$OutputFile3 = (Get-Location).Path + ".\Results.txt"
$Sourcefolder= "C:\Users\dewana\Documents\Test\"
$Targetfolder= "C:\Users\dewana\Documents\Test\Test3"
Get-ChildItem -Path $Sourcefolder -Recurse|
Where-Object {
$_.LastWriteTime -gt [datetime]::Now.AddMinutes(-5)
}| Copy-Item -Destination $Targetfolder
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object
{ $_.Contains("errors") }
Remove-Item $OutputFile3
New-Item $OutputFile3 -ItemType file
try
{
$stream = [System.IO.StreamWriter] $OutputFile3
$stream.WriteLine('clientID 1111')
$stream.WriteLine('SEV 1')
$stream.WriteLine('Issue with this process')
}
finally
{
$stream.close()
}
What I am struggling with is this line:
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object { $_.Contains("errors") }
I am trying to store the latest line in the log file that contains the word "error". I would then use the stored string in an if statement: if $Testing5 has a new error value assigned, create a custom text file.
I can't seem to find out why Get-Content is not working with Where-Object.

The only issue I can see is that your Where-Object script block is on the next line.
Get-Content -Tail -1 -Path $tempfile | Where-Object
{ $_.Contains("errors") }
If you separate at the pipe it's fine.
Get-Content -Tail -1 -Path $tempfile |
Where-Object { $_.Contains("errors") }
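Putting the fixed pipeline together with the if test you described, a minimal sketch (same paths and search string as your script; note -Tail 1 rather than -Tail -1, since -Tail takes the number of lines to read from the end of the file):
$OutputFile3 = Join-Path (Get-Location).Path 'Results.txt'
$Testing5 = Get-Content -Tail 1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" |
    Where-Object { $_.Contains("errors") } # .Contains is case-sensitive; -match 'error' would also catch "Error"
if ($Testing5) {
    # A matching line was found, so (re)create the custom alert file.
    Set-Content -Path $OutputFile3 -Value 'clientID 1111', 'SEV 1', 'Issue with this process'
}
Set-Content overwrites the file on each run, which also replaces the Remove-Item/New-Item/StreamWriter sequence.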


Getting root folder name with PS

I am trying to create a PowerShell script to fetch the name of the root-level folder whose subdirectories contain files with "Error" in their names and today's date. Below is the sample code I have tried so far to pick the folder names.
Root log folder - C:\Errorlogs; it contains many other application-level log folders.
$targetDir="C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*"|
where {([datetime]::now.Date -eq $_.lastwritetime.Date)} |
select FullName
I have tried the above code; however, it's giving me the whole path as result, whereas I only need the folder name.
Result - C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\04\Error.txt
Required - AsyncCreateUsersAPIProcessor
Use a regular expression with a named capture group to pull the folder name out of the path:
$rootPath = "C:\Temp\Errorlogs"
$date = [DateTime]::Now.ToString("yyyyMM\\\\dd")
$pattern = '\\(?<folder>\w+)\\' + $date + '\\Error.*$'
$files = Get-ChildItem -Path $rootPath -Recurse | Select-Object -Property Fullname | Where-Object {$_.Fullname -Match $pattern}
foreach($file in $files)
{
$file.Fullname -match $pattern
Write-Host "folder = " $Matches.folder
}
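For example, assuming today is 2023-02-04 (the date in the sample path above), $pattern becomes '\\(?<folder>\w+)\\202302\\04\\Error.*$' and the match extracts the wanted name:
'C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\04\Error.txt' -match $pattern   # True
$Matches.folder   # AsyncCreateUsersAPIProcessor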
Looks like you can do it just by splitting the path using \ as the delimiter and picking the 3rd token (index 2 of the array):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
Select-Object @{ N='Name'; E={ $_.FullName.Split('\')[2] }}
Another option if you want 2 levels up in the folder hierarchy is to query the .Directory property of the file then the .Parent property of the parent folder (2 times or as many times as needed):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
Select-Object @{ N='Name'; E={ $_.Directory.Parent.Parent.Name }}
As long as the subfolders inside the folder you are after all have numeric-only names, you can loop upward through the parent folders to the first non-numeric folder name and output that.
$targetDir = "C:\Errorlogs"
Get-ChildItem -Path $targetDir -File -Filter "*Error*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } | ForEach-Object {
$parentDir = $_.Directory
while ($parentDir.Name -match '^\d+$') { $parentDir = $parentDir.Parent }
$parentDir.Name
}
That way, even a path C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\02\04\1234\567\Error.txt would produce folder name AsyncCreateUsersAPIProcessor
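A quick sketch of that loop applied to the deep example path (a hypothetical file, assuming it exists on disk):
$file = Get-Item 'C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\02\04\1234\567\Error.txt'
$parentDir = $file.Directory
while ($parentDir.Name -match '^\d+$') { $parentDir = $parentDir.Parent }
$parentDir.Name   # AsyncCreateUsersAPIProcessor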

PowerShell command to fetch all file paths for all desired file extensions

I want to search all drives using PowerShell on a Windows machine to get the list of all files along with their extensions:
Based on a desired extension we pass in, like *.mp3, or
fetch all files with multiple extensions, like *.txt, *.mp3, etc.
I tried the script below, but it only reports files from the directory where it is run. I want to scan the whole machine.
Get-ChildItem -Path .\ -Filter *.doc* -Recurse -File | Sort-Object Length -Descending | ForEach-Object { $_.BaseName }
Check out the Get-PSDrive cmdlet. It returns a list of drives, and you can select just disk drives with the -PSProvider FileSystem parameter:
foreach ( $drive in $(Get-PSDrive -PSProvider FileSystem) ) {
Get-ChildItem -Path $drive.Root -Filter *.doc* -Recurse -File |
Sort-Object Length -Descending |
ForEach-Object { $_.BaseName }
}
Didn't test that but you get the idea.
Using -Include on Get-ChildItem will allow you to specify a list of extensions. The -ErrorAction will cause it to skip drives that are not available, such as an unmounted CD drive.
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
Get-ChildItem -Path $_.Root -Recurse -Include '*.doc*', '*.txt' -ErrorAction SilentlyContinue |
ForEach-Object { $_.Name }
}
Update:
Here is a better way. It will prevent unknown extensions from getting into the mix such as "Microsoft.NET.Sdk.Publish.Docker.targets."
$ExtensionList = @('.txt', '.doc', '.docx', '.mp3')
$TempFile = Join-Path -path $Env:TEMP -ChildPath "$($pid.ToString()).tmp"
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
Get-ChildItem -Path $_.Root -Recurse -ErrorAction SilentlyContinue |
Where-Object { $ExtensionList -contains $_.Extension } |
ForEach-Object {
[PSCustomObject]@{
HashCode = $_.GetHashCode();
DirectoryName = $_.DirectoryName
Name = $_.Name
}
}
} |
Export-Csv -Path $TempFile -Delimiter ';' -NoTypeInformation -Encoding ASCII
Write-Host "The temp file is $TempFile"
This is more than the original question asked for, but if you are going to go through the trouble of listing all your files, I suggest getting the file hash as well so you can determine whether you have duplicates. A simple file-name search will not detect that the same file has been saved under a different name. Adding to what @lit (https://stackoverflow.com/users/447901/lit) has posted:
$ExtensionList = @('.txt', '.doc', '.docx', '.mp3')
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
Get-ChildItem -Path $_.Root -Recurse -ErrorAction SilentlyContinue |
Where-Object { $ExtensionList -contains $_.Extension } |
## ForEach-Object { $_.Name, $_.FullName, $_.GetHashCode() }
Select-Object @{Name="Name";Expression={$_.Name}}, @{Name="Hash";Expression={(Get-FileHash -Path $_.FullName).Hash}}, @{Name="FullName";Expression={$_.FullName}} |
Export-Csv -Path C:\Temp\testing.csv -NoTypeInformation -Append
}
The addition of the file hash will allow you to see if you have duplicates, and the full name will allow you to see where they are located.
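As a follow-up sketch (assuming the same CSV path used above), the exported rows can be grouped by hash to list the duplicates:
Import-Csv -Path C:\Temp\testing.csv |
Group-Object -Property Hash |
Where-Object { $_.Count -gt 1 } |
ForEach-Object { $_.Group | Select-Object Name, FullName }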

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the most recently modified log file from one server to another (the servers are in different domains). While copying, it should check for the credentials and then execute the script.
Please let me know if the script is correct or what corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
Sort LastAccessTime -Descending |
select -First 1 |
select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '.log' | Where-Object {
$_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (get-date).ToString("yyyy/mm/dd")
} | ForEach-Object {
$destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
if (!(Test-Path $destDir)) {
New-Item -ItemType directory $destDir | Out-Null
}
Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular note these options:
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
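A hedged sketch of how those options could combine here (sourcepath is the placeholder from the question; \\otherserver\logs and OTHERDOMAIN\username are hypothetical, and mapping the share with net use first is one way to supply credentials across domains):
net use \\otherserver\logs /user:OTHERDOMAIN\username *
robocopy sourcepath \\otherserver\logs *.log /maxage:1 /mon:1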

powershell exporting to text file

I'm working on a script that checks the folders in a specific directory. When I run the script for the first time, it generates a txt file containing the folders in the directory.
I need the script to add any new directories that are found to the previously created txt file when the script is run again.
Does anyone have any suggestions how to make that happen?
Here is my code so far:
$LogFolders = Get-ChildItem -Directory mydirectory ;
If (-Not (Test-Path -path "txtfilelocated"))
{
Add-Content txtfilelocated -Value $LogFolders
break;
}else{
$File = Get-Content "txtfilelocated"
$File | ForEach-Object {
$_ -match $LogFolders
}
}
$File
Something like this? You can specify which directory to check by adding a path to the Get-ChildItem cmdlet in the first line.
$a = get-childitem | ? { $_.psiscontainer } | select -expand fullname #for V2.0 and above
$a = get-childitem -Directory | select -expand fullname #for V3.0 and above
if ( test-path .\list.txt )
{
compare-object (gc list.txt) ($a) -PassThru | Add-Content .\list.txt
}
else
{
$a | set-content .\list.txt
}
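One hedged refinement: Compare-Object passes through differences from both sides, so a folder that was deleted from disk but is still in list.txt would be appended again. Filtering on SideIndicator '=>' keeps only folders that are new on disk:
compare-object (gc list.txt) ($a) | ? { $_.SideIndicator -eq '=>' } | select -expand InputObject | Add-Content .\list.txt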

How can I use PowerShell to copy the same file to many directories with the same name?

I'm trying to copy one file to any subfolder in a directory that has a specific name. I am part way there, but just can't quite get it to work.
I am able to find all of the subfolders called "help" using:
Get-ChildItem -Path Y:\folder1\subfolder -Directory -Recurse | ? { ($_.PSIsContainer -eq $true) -and ($_.Name -like 'help')}
That will get any folder in Y:\folder1\subfolder named help. So I have been trying:
$folder = Get-ChildItem -Path Y:\folder1\subfolder -Directory -Recurse | ? { ($_.PSIsContainer -eq $true) -and ($_.Name -like 'help')}
foreach ($f in $folder){
Copy-Item Y:\Info.html -Destination $folder[$f]
}
and that does not work. Bonus points if you can also tell me how to have it write out to a csv file all of the directories it copies the file to.
Thanks
I wrote this with version 3, but I think it will work with 1 and 2 since I used Set-StrictMode -Version <number> to test them.
The CSV output will look something like this for every line: Y:\Info.html,Y:\folder1\subfolder\help
$logpath = 'C:\log.csv'
$logopts = @{filepath=$logpath; append=$true; encoding='ascii'}
$file = 'Y:\Info.html'
$path = 'Y:\folder1\subfolder'
$search = 'help'
gci $path -Directory -Recurse `
| ?{ $_.psIsContainer -and $_.name -match $search } `
| %{
cp $file $_.fullName; # copy file
$line = $file, $_.fullName -join ','; # build output
$line | out-file @logopts; # write output
}
Version 1
$folders = @(
(gci Y:\folder1\subfolder -dir -r | ? {$_.Name -like 'help'}).fullname
)
ForEach ($f in $folders) {
Copy-Item Y:\Info.html $f
}
Version 2
(gci Y:\folder1\subfolder -dir -r | ? {$_.Name -like 'help'}).fullname | % {cp Y:\Info.html $_}
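For the bonus CSV request, a hedged sketch (PowerShell 3+) that records each destination as the copy happens; Copy.log.csv is a hypothetical output name:
Get-ChildItem Y:\folder1\subfolder -Directory -Recurse | Where-Object { $_.Name -like 'help' } | ForEach-Object {
    Copy-Item Y:\Info.html -Destination $_.FullName      # copy the file
    [PSCustomObject]@{ Source = 'Y:\Info.html'; Destination = $_.FullName }   # emit a log row
} | Export-Csv -Path .\Copy.log.csv -NoTypeInformation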
