Unix tail equivalent command in Windows PowerShell

I have to look at the last few lines of a large file (typical size is 500MB-2GB). I am looking for an equivalent of the Unix command tail for Windows PowerShell. A few available alternatives are:
http://tailforwin32.sourceforge.net/
and
Get-Content [filename] | Select-Object -Last 10
I am not allowed to use the first alternative, and the second alternative is slow. Does anyone know of an efficient implementation of tail for PowerShell?

Use the -Wait parameter with Get-Content, which displays lines as they are added to the file. This feature was present in PowerShell v1, but for some reason it was not documented well in v2.
Here is an example:
Get-Content -Path "C:\scripts\test.txt" -Wait
Once you run this, update and save the file and you will see the changes on the console.

For completeness, I'll mention that PowerShell 3.0 now has a -Tail flag on Get-Content:
Get-Content ./log.log -Tail 10
gets the last 10 lines of the file
Get-Content ./log.log -Wait -Tail 10
gets the last 10 lines of the file and waits for more
Also, for *nix users, note that most systems alias cat to Get-Content, so this usually works:
cat ./log.log -Tail 10

As of PowerShell version 3.0, the Get-Content cmdlet has a -Tail parameter that should help. See the TechNet library online help for Get-Content.
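You can also view that help locally from the console, for example:
Get-Help Get-Content -Parameter Tail
Get-Help Get-Content -Online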

I used some of the answers given here, but just a heads-up that
Get-Content -Path Yourfile.log -Tail 30 -Wait
will chew up memory after a while. A colleague left such a "tail" running for a day and it grew to 800 MB. I don't know whether Unix tail behaves the same way (but I doubt it). So it's fine for short-term use, but be careful with it.
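One possible mitigation, offered here only as a sketch (the path and the one-hour restart interval are arbitrary assumptions): run the tail in a background job and restart it periodically so the accumulated state is discarded. Note that each restart re-prints the last 30 lines.
$logPath = 'C:\logs\app.log'
while ($true) {
    # tail in a child job so its memory is released when the job is removed
    $job = Start-Job { Get-Content -Path $using:logPath -Tail 30 -Wait }
    $deadline = (Get-Date).AddHours(1)
    while ((Get-Date) -lt $deadline) {
        Receive-Job -Job $job      # print whatever arrived since the last poll
        Start-Sleep -Seconds 1
    }
    Stop-Job -Job $job
    Remove-Job -Job $job
}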

PowerShell Community Extensions (PSCX) provides the Get-FileTail cmdlet. It looks like a suitable solution for the task. Note: I did not try it with extremely large files, but the description says it efficiently tails the contents and that it is designed for large log files.
NAME
Get-FileTail
SYNOPSIS
PSCX Cmdlet: Tails the contents of a file - optionally waiting on new content.
SYNTAX
Get-FileTail [-Path] <String[]> [-Count <Int32>] [-Encoding <EncodingParameter>] [-LineTerminator <String>] [-Wait] [<CommonParameters>]
Get-FileTail [-LiteralPath] <String[]> [-Count <Int32>] [-Encoding <EncodingParameter>] [-LineTerminator <String>] [-Wait] [<CommonParameters>]
DESCRIPTION
This implementation efficiently tails the contents of a file by reading lines from the end rather than processing the entire file. This behavior is crucial for efficiently tailing large log files and large log files over a network. You can also specify the Wait parameter to have the cmdlet wait and display new content as it is written to the file. Use Ctrl+C to break out of the wait loop. Note that if an encoding is not specified, the cmdlet will attempt to auto-detect the encoding by reading the first character from the file. If no characters have been written to the file yet, the cmdlet will default to Unicode encoding. You can override this behavior by explicitly specifying the encoding via the Encoding parameter.
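Based on the syntax above, a typical invocation would presumably look like this (the log path is illustrative):
Get-FileTail -Path C:\logs\app.log -Count 20 -Wait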

Probably too late for an answer, but try this one:
Get-Content <filename> -tail <number of items wanted> -wait

Just some additions to the previous answers. There are aliases defined for Get-Content; for example, if you are used to UNIX you might like cat, and there are also type and gc. So instead of
Get-Content -Path <Path> -Wait -Tail 10
you can write
# Print whole file and wait for appended lines and print them
cat <Path> -Wait
# Print last 10 lines and wait for appended lines and print them
cat <Path> -Tail 10 -Wait

I have a useful tip on this subject concerning multiple files.
Following a single log file (like tail -f in Linux) with Windows PowerShell 5.x (Win7 and Win10) is easy (just use Get-Content MyFile -Tail 1 -Wait). However, watching MULTIPLE log files at once seems complicated. With PowerShell 7.x+, however, I've found an easy way by using Foreach-Object -Parallel. This performs multiple Get-Content commands concurrently. For example:
Get-ChildItem C:\logs\*.log | Foreach-Object -Parallel { Get-Content $_ -Tail 1 -Wait }
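Note that ForEach-Object -Parallel limits how many script blocks run at once (the default throttle limit is 5), so if you are tailing more log files than that, raise the limit explicitly; the folder path is just an example:
Get-ChildItem C:\logs\*.log | ForEach-Object -Parallel { Get-Content $_ -Tail 1 -Wait } -ThrottleLimit 20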

Using PowerShell v2 and below, Get-Content reads the entire file, so it was of no use to me. The following code works for what I needed, though there are likely some issues with character encodings. It is effectively tail -f, but it could easily be modified to get the last x bytes, or the last x lines if you want to search backwards for line breaks.
$filename = "\wherever\your\file\is.txt"
$reader = New-Object System.IO.StreamReader(New-Object IO.FileStream($filename, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read, [IO.FileShare]::ReadWrite))

#start at the end of the file
$lastMaxOffset = $reader.BaseStream.Length

while ($true)
{
    Start-Sleep -m 100

    #if the file size has not changed, idle
    if ($reader.BaseStream.Length -eq $lastMaxOffset) {
        continue;
    }

    #seek to the last max offset
    $reader.BaseStream.Seek($lastMaxOffset, [System.IO.SeekOrigin]::Begin) | Out-Null

    #read out of the file until the EOF
    $line = ""
    while (($line = $reader.ReadLine()) -ne $null) {
        Write-Output $line
    }

    #update the last max offset
    $lastMaxOffset = $reader.BaseStream.Position
}
I found most of the code to do this here.
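As a rough sketch of the "last x bytes" variant mentioned above (the byte count is an arbitrary assumption, and a multi-byte encoding may be split at the seek point):
#read only the last $tailBytes bytes of the file
$filename = "\wherever\your\file\is.txt"
$tailBytes = 4KB
$stream = New-Object IO.FileStream($filename, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::ReadWrite)
$null = $stream.Seek(-[Math]::Min($tailBytes, $stream.Length), [IO.SeekOrigin]::End)
$reader = New-Object IO.StreamReader($stream)
$reader.ReadToEnd()
$reader.Close()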

I took @hajamie's solution and wrapped it up into a slightly more convenient script wrapper.
I added an option to start from an offset before the end of the file, so you can use the tail-like functionality of reading a certain amount from the end of the file. Note the offset is in bytes, not lines.
There's also an option to continue waiting for more content.
Examples (assuming you save this as TailFile.ps1):
.\TailFile.ps1 -File .\path\to\myfile.log -InitialOffset 1000000
.\TailFile.ps1 -File .\path\to\myfile.log -InitialOffset 1000000 -Follow:$true
.\TailFile.ps1 -File .\path\to\myfile.log -Follow:$true
And here is the script itself...
param (
    [Parameter(Mandatory=$true,HelpMessage="Enter the path to a file to tail")][string]$File = "",
    [Parameter(Mandatory=$true,HelpMessage="Enter the number of bytes from the end of the file")][int]$InitialOffset = 10248,
    [Parameter(Mandatory=$false,HelpMessage="Continue monitoring the file for new additions?")][boolean]$Follow = $false
)

$ci = Get-ChildItem $File
$fullName = $ci.FullName

$reader = New-Object System.IO.StreamReader(New-Object IO.FileStream($fullName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read, [IO.FileShare]::ReadWrite))

#start $InitialOffset bytes before the end of the file
$lastMaxOffset = $reader.BaseStream.Length - $InitialOffset

while ($true)
{
    #if there is content at or beyond the last offset we read, print it
    if ($reader.BaseStream.Length -ge $lastMaxOffset) {
        #seek to the last max offset
        $reader.BaseStream.Seek($lastMaxOffset, [System.IO.SeekOrigin]::Begin) | Out-Null

        #read out of the file until the EOF
        $line = ""
        while (($line = $reader.ReadLine()) -ne $null) {
            Write-Output $line
        }

        #update the last max offset
        $lastMaxOffset = $reader.BaseStream.Position
    }

    if ($Follow) {
        Start-Sleep -m 100
    } else {
        break;
    }
}

Try the Windows Server 2003 Resource Kit Tools; it contains a tail.exe which can be run on a Windows system.
https://www.microsoft.com/en-us/download/details.aspx?id=17657

There have been many valid answers; however, none of them has the same syntax as tail in Linux. The following function can be stored in your $Home\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 for persistence (see the PowerShell profiles documentation for more details).
This allows you to call...
tail server.log
tail -n 5 server.log
tail -f server.log
tail -Follow -Lines 5 -Path server.log
which comes quite close to the Linux syntax.
function tail {
<#
.SYNOPSIS
    Get the last n lines of a text file.
.PARAMETER Follow
    output appended data as the file grows
.PARAMETER Lines
    output the last N lines (default: 10)
.PARAMETER Path
    path to the text file
.INPUTS
    System.Int
    IO.FileInfo
.OUTPUTS
    System.String
.EXAMPLE
    PS> tail c:\server.log
.EXAMPLE
    PS> tail -f -n 20 c:\server.log
#>
    [CmdletBinding()]
    [OutputType('System.String')]
    Param(
        [Alias("f")]
        [parameter(Mandatory=$false)]
        [switch]$Follow,

        [Alias("n")]
        [parameter(Mandatory=$false)]
        [Int]$Lines = 10,

        [parameter(Mandatory=$true, Position=5)]
        [ValidateNotNullOrEmpty()]
        [IO.FileInfo]$Path
    )

    if ($Follow)
    {
        Get-Content -Path $Path -Tail $Lines -Wait
    }
    else
    {
        Get-Content -Path $Path -Tail $Lines
    }
}

Very basic, but it does what you need without any add-on modules or PS version requirements:
while ($true) {Clear-Host; gc E:\test.txt | select -last 3; sleep 2 }

It is possible to download all of the UNIX commands compiled for Windows from this GitHub repository: https://github.com/George-Ogden/UNIX

For those admins who live by the axiom that less typing is best, here is the shortest version I can find:
gc filename -wai -ta 10

Related

PowerShell script: List files with specific change date (amount if possible)

For licensing purposes I am trying to automate the counting process instead of having to log in to every single server, go into the directory, search for a file name, and count the results based on the change date.
What I'm aiming for:
Running a PowerShell script every month that checks the directory "C:\Users" for the file "Outlook.pst" recursively, then filters the result by change date (one month or newer), and finally packs this into an email to send to my inbox.
I'm not sure if that's possible, because I am fairly new to PowerShell. I would appreciate your help!
It is possible.
I don't know how to start a PS session on a remote computer, but I think the cmdlet Enter-PSSession will do the trick. At least it was the first result while searching for "open remote powershell session". If that does not work, use Invoke-Command as suggested by lit to get $outlookFiles as suggested below.
For the rest, use this:
$outlookFiles = Get-ChildItem -Path "C:\Users" -Recurse | Where-Object { $_.Name -eq "Outlook.pst" }
Now you have all files that have this name. If you are not familiar with the pipe in PowerShell: it forwards all objects found by Get-ChildItem to the next pipe section, where Where-Object filters the received objects. If the current object ($_) passes the condition, it is returned by the whole command.
Now you can filter these objects again to include only the latest ones:
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = $outlookFiles | Where-Object { $_.LastAccessTime -gt $latestDate }
Now you have all the data you want in one object. You only have to format it how you like, e.g. you could use $mailBody = $newFiles | Out-String and then use Send-MailMessage -To x@y.z -From r@g.b -Body $mailBody to send the mail.
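Putting the pieces together, a minimal end-to-end sketch might look like this (the recipient, sender, and SMTP server are purely illustrative assumptions, and LastWriteTime is used for the change date):
# find all Outlook.pst files under C:\Users changed within the last month
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = Get-ChildItem -Path "C:\Users" -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -eq "Outlook.pst" -and $_.LastWriteTime -gt $latestDate }

# format the list and mail it
$mailBody = $newFiles | Select-Object FullName, LastWriteTime | Out-String
Send-MailMessage -To "me@example.com" -From "report@example.com" -Subject "Outlook.pst report" -Body $mailBody -SmtpServer "smtp.example.com"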

Is there a way to make a link/symlink/shortcut to the latest file in Windows? Keep tailing the latest log file

I searched high and low, found how to do it in *nix, but nothing about Windows.
The first place I've seen this was Tomcat's catalina.out, and now I am wondering how to do a similar thing on Windows: considering a folder where log files are created, how do I make a file that reads/points to the latest log created?
I'm thinking a Powershell solution might be possible, but I honestly can't think or find any way to do it.
(edit) You guys downvoting could at least leave a comment to tell me what I did wrong or how I can improve this question.
(edit) The idea here is to have some way to create a symlink that points to the latest log file in a folder, so a program can always monitor the same file, no matter whether the latest file changes its name - like tail -f catalina.out always reads the latest catalina log file.
The only way out I can see, and one that I wanted to avoid, would be to write a PowerShell script that monitors a folder (https://superuser.com/questions/226828/how-to-monitor-a-folder-and-trigger-a-command-line-action-when-a-file-is-created) and dynamically creates a symlink to the latest file found (https://stackoverflow.com/a/11211005/1985023), then set it up as a service so it is always running in the background.
Instead of looking for a dynamically self-updating symlink (which would be quite cumbersome to implement - see the helpful hints from BACON in the comments in the question), you can make this work as a self-contained function/script with the help of PowerShell background jobs:
Run a loop that periodically gets the latest log-file lines from a background job that does the equivalent of Unix tail -f via Get-Content -Wait -Tail 10.
If a new log file is found, terminate the previous background job and start one for the new log file.
Note that this relies on periodic polling of the background job that tails the log. The code below allows you to adjust the polling interval.
Note that Get-Content -Wait itself polls the target file for changes every second.
Here's the code; run $VerbosePreference = 'Continue' to see what's going on inside the loop:
$dir = 'C:\path\to\logs'  # the log-file directory
$logFilePattern = '*.log' # wildcard pattern matching log files
$sleepIntervalMs = 1000   # how many msec. to sleep between getting new lines from the background job

Write-Host -ForegroundColor Green "Tailing the latest log(s) in $dir...`nPress any key to quit."

$currJob = $currLog = $null
while ($true) {
    # If the user pressed a key, clean up and exit.
    if ([console]::KeyAvailable) {
        $null = [console]::ReadKey($True) # consume the key - it will still have printed, though
        if ($currJob) { Remove-Job -Job $currJob -Force }
        break
    }
    # Get the latest lines from the current log from the background job.
    if ($currJob) {
        Write-Verbose "Checking for new lines in $newLog..."
        Receive-Job -Job $currJob
        Start-Sleep -Milliseconds $sleepIntervalMs # sleep a little
    }
    # Determine the first / newest log.
    $newLog = Get-ChildItem -LiteralPath $dir -Filter $logFilePattern | Sort-Object CreationTimeUtc -Descending | Select-Object -First 1
    if ($newLog.FullName -ne $currLog.FullName) { # new log file found.
        Write-Verbose "(New) log file found: $newLog"
        if ($currJob) {
            Write-Verbose "Terminating background job for previous log ($currLog)."
            Remove-Job -Job $currJob -Force
            # When a *new* log was just started, we show *all* lines (and keep listening for more).
            $tailArg = @{}
        } else {
            # When we first start monitoring, we start with the *last 10* lines
            # of the current log (and keep listening for more).
            $tailArg = @{ Tail = 10 }
        }
        $currLog = $newLog
        Write-Verbose "Starting background job for $currLog..."
        # Start the background job for the new log.
        $currJob = Start-Job { Get-Content -Wait @using:tailArg -LiteralPath $using:newLog.FullName }
    }
}
Write-Host -ForegroundColor Green "Terminated."

How to check a folder for the appearance of any new file?

Help please. I cannot find a solution. (Windows platform)
I need to:
Scan the folder.
Detect when any new file arrives.
Process the file.
Another method to detect "new files" is the archive attribute. Whenever a file is created or changed, this attribute is set by windows.
Whenever you process a file, unset its archive attribute (attrib -a file.ext).
The advantage is, you don't depend on any timing.
To list "new" (or changed) files, use dir /aa (dir /a-a will list processed files)
For more info, see dir /? and attrib /?
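If you prefer to stay in PowerShell rather than cmd, a rough equivalent of the same idea (the folder name is an assumption) would be:
# list files whose Archive attribute is still set, i.e. new or changed since last processing
$pending = Get-ChildItem 'C:\TargetDirectory' -File | Where-Object { $_.Attributes -band [IO.FileAttributes]::Archive }
foreach ($file in $pending) {
    # ... process $file here ...
    # then clear the Archive attribute to mark the file as processed
    $file.Attributes = $file.Attributes -band (-bnot [IO.FileAttributes]::Archive)
}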
Without knowing exactly what you're trying to execute, this is all I can provide. You would theoretically run this as a scheduled task every 1 hour:
foreach ($file in (Get-ChildItem "C:\TargetDirectory" | where {$_.LastWriteTime -gt (Get-Date).AddHours(-1)})) {
    # Execute-Command -Target $file
}
You could use the FileSystemWatcher class to monitor the folder for new files.
It can easily be used from PowerShell as well:
$FSW = New-Object System.IO.FileSystemWatcher
Then use Register-ObjectEvent to "listen" for events raised from it:
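A minimal sketch of wiring the two together (the watched folder is an assumption):
$FSW.Path = 'C:\TargetDirectory'
$FSW.Filter = '*.*'
$FSW.EnableRaisingEvents = $true
Register-ObjectEvent $FSW Created -SourceIdentifier NewFileCreated -Action {
    # $Event is available inside the -Action script block
    Write-Host "New file created: $($Event.SourceEventArgs.FullPath)"
}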
FileSystemWatcher is a utility I have recently learned about and will definitely use in the future. The best part is that it relies on .NET eventing, so you don't need to build an external triggering structure.
Here is an example of how I am using this in a 24/7 production environment (the full script receives an XML file, processes it, and inserts the results into SQL in under 3 seconds).
Function submit-resultFile {
    #Actual file processing takes place here
}

Function create-fsw {
    Register-ObjectEvent $fsw Created -SourceIdentifier "Spectro FileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $File = $Event.SourceEventArgs.Fullpath
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Verbose "The file '$name' was $changeType at $timeStamp"
        submit-ResultFile -xmlfile $file
    }
}

# In the following line, you can change IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $watchFolder, $watchFilter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}

# Sweep for files that already exist before the watcher starts raising events.
$xmlFiles = Get-ChildItem -Path $ResultsDirectory -Filter *.xml
foreach ($file in $xmlfiles)
{
    submit-ResultFile -xmlfile $File.FullName
}

# Register a new File System Watcher
Create-fsw
Several important points to be aware of:
- If files exist in that location before the FSW is created, they WILL NOT trigger an "objectevent", so in my script you'll observe that I begin by running a sweep for existing files.
- When the FSW does trigger, you want it to process only one file at a time, since the next file-creation event will generate a new "objectevent". Structuring a FSW to work on multiple files per trigger will eventually result in a crash.

Scheduled PowerShell script won't run

I have a ps1 script like the one below; it filters files and outputs a formatted HTML file.
$a = "<style>"
$a = $a + "BODY{background-color:peachpuff;}"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
$a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:thistle}"
$a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:PaleGoldenrod}"
$a = $a + "</style>"
$b = Get-Date -Format u
Get-ChildItem -Recurse K:\AppData\*.* -Filter *.CATPart | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-6)} | sort LastWriteTime -descending | select name,LastWriteTime,Directory | convertto-html -head $a -body "<H2>CATIA PAST 7 DAYS -- $b </H2>" | out-file C:\aaa\catia_result.htm
I can run this script manually with no problem at all, but when I schedule it to run, it only gives me the formatted HTM file without any filtered data in it. These are the arguments I used in Task Scheduler:
powershell.exe -ExecutionPolicy Bypass -Command "C:\aaa\RLTP_HTML_FINAL.ps1"
I tried changing the execution policy to Unrestricted, and it still won't work. The task history shows the task as completed, but there is no data in the HTML file.
I also tried using a batch file to call PowerShell and run the script; it gives the same result, working only when run manually, not from Task Scheduler.
Most likely, when you schedule the script to execute, it does not have a mapping to the K:\ drive. Make sure that:
The K:\ drive is mapped inside the script, using the New-PSDrive cmdlet
The credentials that the scheduled task is set to use have access to the K:\ drive
Alternatively, you could simply specify a UNC path instead of referencing the K:\ drive.
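For example, a minimal sketch of both options (the share name is an illustrative assumption):
# map K: inside the script so the scheduled task does not depend on an interactive drive mapping
New-PSDrive -Name K -PSProvider FileSystem -Root '\\fileserver\AppData' | Out-Null
# ...or skip the drive letter entirely and use the UNC path directly
Get-ChildItem -Recurse '\\fileserver\AppData\*.*' -Filter *.CATPart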
P.S. Good job using -ExecutionPolicy Bypass. That helps to avoid any issues with the execution policy! :) It doesn't matter what the execution policy is set to, as long as you use that parameter.
If you want to capture any errors in your script, make this your last line:
Add-Content -Path $PSScriptRoot\error.log -Value $error;
You might see something about the K:\ drive missing.

Extract hostnames from Perfmon blg with PowerShell

I'm writing a script which will automate the extraction of data from .blg Perfmon logs.
I've worked out the primary Import-Counter commands I will need to use to get the data out, but am trying to parametrise this so that I can do it for each machine in the log file (without having to open the log up in Perfmon, which can take 15 minutes or sometimes more, and is the reason I'm writing this script), and find out what each hostname is.
The script I have does the job, but it still takes a minute to return the data I want, so I wondered if there was a simpler way to do this, as I'm not too familiar with PowerShell.
Here's what I have:
$counters = Import-Counter -Path $log_path$logfile -ListSet * | Select-Object paths -ExpandProperty paths
$svrs = @()

# for each line in the list of counters, extract the name of the server and add it to the array
foreach ($line in $counters) {
    $svrs += $line.split("\")[2]
}

# remove duplicates and sort the list of servers
$sorted_svrs = $svrs | sort -unique

foreach ($svr in $sorted_svrs) {
    Write-Host $svr
}
I'm just printing the names for the moment, but they'll go into an array in the proper script, and then I'll run my Import-Counter block with each of these hosts parametrised in.
Just wondered if there was a better way of doing this?
$sorted_svrs = Import-Counter "$log_path$logfile" -Counter "\\*\physicaldisk(_total)\% disk time" | %{ $_.countersamples.path.split("\")[2] } | sort -Unique
