Is there a way to detect whether any file has been added to a folder, including its sub-folders?
For example, check whether any text file (*.txt) has been added to the folder c:\data-files\ or its sub-folders.
The folder can also be a shared folder on another machine.
Perhaps you are confused about the types of events that are triggered:
http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher_events(v=vs.110).aspx
This should work; it is taken from the link above and modified for your requirements:
# By BigTeddy 05 September 2011
# This script uses the .NET FileSystemWatcher class to monitor file events in folder(s).
# The advantage of this method over WMI eventing is that it can monitor sub-folders.
# The -Action parameter can contain any valid PowerShell commands; two are included as examples.
# The script can be set to a wildcard filter, and IncludeSubdirectories can be changed to $true.
# You need not subscribe to all three types of event; only Created is registered below.
# Version 1.1
$folder = '\\remote\shared' # Enter the root path you want to monitor.
$filter = '*.txt'           # You can enter a wildcard filter here.
# IncludeSubdirectories is set to $true so that sub-folders are monitored as well.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
# Subscribe only to the events you need; here only Created is registered:
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    Write-Host "The file '$name' was $changeType at $timeStamp" -ForegroundColor Green
    Out-File -FilePath c:\scripts\filechange\outlog.txt -Append -InputObject "The file '$name' was $changeType at $timeStamp"
}
Please note that once you close the PowerShell console, the FileSystemWatcher is disposed and will no longer monitor the folder(s), so you have to make sure the PowerShell window stays open. To do that without it getting in your way, I suggest a scheduled task: http://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/use-scheduled-tasks-to-run-powershell-commands-on-windows.aspx
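If you run it unattended (e.g. from that scheduled task), a minimal way to keep the session, and with it the watcher, alive is a small wait loop; this is a sketch of the general idea, not part of the original script:
# Keep the PowerShell session (and the event subscription) alive.
# Events registered with -Action are handled in the background, so this
# loop only has to prevent the session from exiting.
while ($true) {
    Wait-Event -Timeout 5 | Out-Null
}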
Firstly, I apologise if this isn't the correct forum to be posting this question; if this isn't the place to ask, can anyone direct me to a newcomer forum?
Secondly, you'll soon discover that I really don't know much about PowerShell or Visual Studio, but I'm learning. I'm sure the PowerShell script you're about to see could be better, but it works.
My issue was with a system that outputs .txt files onto two PCs (LH and RH). These .txt files were output with different names, as the system was handling three different products.
We then needed these files to be filtered by product and copied to network drives, while also being archived and deleted from the original folder. Oh, and this needed to be done in real time.
The PowerShell script I've been using is the following:
$folder = 'Target Folder'
$timeout = 1000
$filesystemwatcher = New-Object System.IO.FileSystemWatcher $folder
Write-Host "Monitoring... $folder
Transferring..."
while ($true) {
    $result = $filesystemwatcher.WaitForChanged('All', $timeout)
    if ($result.TimedOut -eq $false)
    {
        Write-Warning ('File {0}: {1}' -f $result.ChangeType, $result.Name)
    }
    $targetdirectory = "Target folder"
    $sourcedirectory = "export folder"
    if (-not (Test-Path -Path $targetdirectory)) {
        New-Item $targetdirectory -Type Directory
    }
    Copy-Item -Path "$sourcedirectory\*.txt" -Destination $targetdirectory
    $Files = Get-ChildItem -Path $sourcedirectory -Filter "*.txt" -Recurse
    foreach ($File in $Files)
    {
        if ($File.Name -like "1*.txt")
        {
            Move-Item -Path $File.FullName "1 folder"
        }
        elseif ($File.Name -like "2*.txt")
        {
            Move-Item -Path $File.FullName "2 folder"
        }
        elseif ($File.Name -like "3*.txt")
        {
            Move-Item -Path $File.FullName "3 folder"
        }
    }
}
Now this script gets the files moved and works, but it's a PowerShell script running 24/7, 365 days a year. Sometimes the script has stopped, sometimes it has been messed with; it's just not reliable enough.
So I want to turn it into an application via Visual Studio.
Is it possible? Remember, I have never used Visual Studio before (I've been trying it for a few hours and have learnt some basics).
Has anyone done anything similar to this before, or is there any guide anyone can think of that would suit my needs?
I'm looking for the application to have the following:
A status screen, i.e. what files it has found and where it has moved them to.
A settings option to set the filter strings and save paths, change source and export/archive paths, etc.
Can anyone point me in the right direction? Being new, I don't know what to search for to find guides for my needs.
Cheers
For licensing purposes, I am trying to automate the counting process instead of having to log in to every single server, go into the directory, search for a file name, and count the results based on the change date.
What I'm aiming for:
Running a PowerShell script every month that checks the directory "C:\Users" for the file "Outlook.pst" recursively, then filters the result by change date (one month or newer), then packs this into an email sent to my inbox.
I'm not sure if that's possible, because I am fairly new to PowerShell. I would appreciate your help!
It is possible.
I don't know off-hand how to start a PS session on a remote computer, but I think the cmdlet Enter-PSSession will do the trick; at least it was the first result when searching for "open remote powershell session". If that does not work, use Invoke-Command as suggested by lit to get $outlookFiles as shown below.
For the rest, use this:
$outlookFiles = Get-ChildItem -Path "C:\Users" -Recurse | Where-Object { $_.Name -eq "Outlook.pst" }
Now you have all files with this name. If you are not familiar with the pipe in PowerShell: it forwards all objects found by Get-ChildItem to the next pipe section, where Where-Object filters the received objects; if the current object ($_) passes the condition, it is returned by the whole command.
Now you can filter these objects again to include only the latest ones:
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = $outlookFiles | Where-Object { $_.LastWriteTime -gt $latestDate } # LastWriteTime corresponds to the "change date"
Now you have all the data you want in one object; you only have to format it how you like it, e.g. you could use $mailBody = $newFiles | Out-String and then Send-MailMessage -To x@y.z -From r@g.b -Body $mailBody to send the mail.
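Putting the pieces together, a minimal end-to-end sketch could look like this (the SMTP server and addresses are placeholders you would replace):
# Sketch: monthly Outlook.pst report; server and addresses are hypothetical.
$latestDate = (Get-Date).AddMonths(-1)
$newFiles = Get-ChildItem -Path "C:\Users" -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -eq "Outlook.pst" -and $_.LastWriteTime -gt $latestDate }
$mailBody = $newFiles | Select-Object FullName, LastWriteTime, Length | Out-String
Send-MailMessage -SmtpServer 'smtp.example.com' -From 'report@example.com' -To 'me@example.com' `
    -Subject "Outlook.pst files changed since $($latestDate.ToShortDateString())" -Body $mailBody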
I searched high and low and found how to do it on *nix, but nothing about Windows.
The first place I saw this was Tomcat's catalina.out, and now I'm wondering how to do a similar thing on Windows: given a folder where log files are created, how do you make a file that reads/points to the latest log created?
I'm thinking a PowerShell solution might be possible, but I honestly can't think of or find any way to do it.
(edit) You guys downvoting could at least leave a comment to tell me what I did wrong or how I can improve this question.
(edit) The idea here is to have some way to create a symlink that points to the latest log file in a folder, so a program can always monitor the same file, even if the latest file changes its name, just like tail -f catalina.out always reads the latest catalina log file.
The only way out I can see, and one that I wanted to avoid, would be to write a PowerShell script that monitors a folder (https://superuser.com/questions/226828/how-to-monitor-a-folder-and-trigger-a-command-line-action-when-a-file-is-created) and dynamically creates a symlink to the latest file found (https://stackoverflow.com/a/11211005/1985023), then set it up as a service so it is always running in the background.
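For reference, a rough sketch of that watch-and-relink idea (the latest.log link name and the 5-second poll are arbitrary choices, and creating symlinks usually requires an elevated session):
$dir = 'C:\path\to\logs'            # log directory (assumption)
$link = Join-Path $dir 'latest.log' # name of the symlink (assumption)
$current = $null
while ($true) {
    # Find the newest log, ignoring the link itself.
    $newest = Get-ChildItem -Path $dir -Filter '*.log' |
        Where-Object { $_.Name -ne 'latest.log' } |
        Sort-Object CreationTimeUtc -Descending |
        Select-Object -First 1
    if ($newest -and $newest.FullName -ne $current) {
        # Re-point the link; -Force replaces the existing link.
        New-Item -ItemType SymbolicLink -Path $link -Value $newest.FullName -Force | Out-Null
        $current = $newest.FullName
    }
    Start-Sleep -Seconds 5
}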
Instead of looking for a dynamically self-updating symlink (which would be quite cumbersome to implement; see the helpful hints from BACON in the comments on the question), you can make this work as a self-contained function/script with the help of PowerShell background jobs:
Run in a loop that periodically gets the latest log-file lines from a background job that does the equivalent of Unix tail -f via Get-Content -Wait -Tail 10.
If a new log file is found, terminate the previous background job and start one for the new log file.
Note that this relies on periodic polling of the background job that tails the log. The code below allows you to adjust the polling interval.
Note that Get-Content -Wait itself polls the target file for changes every second.
Here's the code; run $VerbosePreference = 'Continue' to see what's going on inside the loop:
$dir = 'C:\path\to\logs' # the log-file directory
$logFilePattern = '*.log' # wildcard pattern matching log files
$sleepIntervalMs = 1000 # how many msec. to sleep between getting new lines from the background job
Write-Host -ForegroundColor Green "Tailing the latest log(s) in $dir...`nPress any key to quit."
$currJob = $currLog = $null
while ($true) {
    # If the user pressed a key, clean up and exit.
    if ([console]::KeyAvailable) {
        $null = [console]::ReadKey($True) # consume the key - it will still have printed, though
        if ($currJob) { Remove-Job -Job $currJob -Force }
        break
    }
    # Get the latest lines from the current log from the background job.
    if ($currJob) {
        Write-Verbose "Checking for new lines in $currLog..."
        Receive-Job -Job $currJob
        Start-Sleep -Milliseconds $sleepIntervalMs # sleep a little
    }
    # Determine the newest log.
    $newLog = Get-ChildItem -LiteralPath $dir -Filter $logFilePattern | Sort-Object CreationTimeUtc -Descending | Select-Object -First 1
    if ($newLog.FullName -ne $currLog.FullName) { # new log file found
        Write-Verbose "(New) log file found: $newLog"
        if ($currJob) {
            Write-Verbose "Terminating background job for previous log ($currLog)."
            Remove-Job -Job $currJob -Force
            # When a *new* log was just started, we show *all* lines (and keep listening for more).
            $tailArg = @{}
        } else {
            # When we first start monitoring, we start with the *last 10* lines
            # of the current log (and keep listening for more).
            $tailArg = @{ Tail = 10 }
        }
        $currLog = $newLog
        Write-Verbose "Starting background job for $currLog..."
        # Start the background job for the new log.
        $currJob = Start-Job { Get-Content -Wait @using:tailArg -LiteralPath $using:newLog.FullName }
    }
}
Write-Host -ForegroundColor Green "Terminated."
Help please. I cannot find a solution (Windows platform).
I need to:
Scan a folder.
Detect when any new file arrives.
Process the file.
Another method to detect "new" files is the archive attribute. Whenever a file is created or changed, this attribute is set by Windows.
Whenever you process a file, unset its archive attribute (attrib -a file.ext).
The advantage is that you don't depend on any timing.
To list "new" (or changed) files, use dir /aa (dir /a-a will list already-processed files).
For more information, see dir /? and attrib /?.
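The same idea in PowerShell might look like this sketch (Process-File is a hypothetical placeholder for your own processing step):
# Find files whose Archive attribute is still set (i.e. new or changed).
$newFiles = Get-ChildItem -Path 'C:\data-files' -Recurse -File |
    Where-Object { $_.Attributes -band [IO.FileAttributes]::Archive }
foreach ($file in $newFiles) {
    # Process-File $file.FullName   # your processing step goes here
    # Clear the Archive attribute so the file is not picked up again.
    $file.Attributes = $file.Attributes -band (-bnot [IO.FileAttributes]::Archive)
}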
Without knowing exactly what you're trying to execute, this is all I can provide. You would theoretically run this as a scheduled task every hour:
foreach ($file in (Get-ChildItem "C:\TargetDirectory" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-1) })) {
    # Execute-Command -Target $file
}
You could use the FileSystemWatcher class to monitor the folder for new files.
It can easily be used from PowerShell as well:
$FSW = New-Object System.IO.FileSystemWatcher
Then use Register-ObjectEvent to "listen" for events raised from it.
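For example, a minimal sketch (the folder path and filter are assumptions):
$FSW = New-Object System.IO.FileSystemWatcher 'C:\data-files', '*.txt'
$FSW.IncludeSubdirectories = $true
$FSW.EnableRaisingEvents = $true
# Run this action whenever a matching file is created.
Register-ObjectEvent -InputObject $FSW -EventName Created -SourceIdentifier NewFile -Action {
    Write-Host "New file: $($Event.SourceEventArgs.FullPath)"
}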
FileSystemWatcher is a utility I have recently learned about and will definitely use in the future. The best part is that it relies on .NET eventing, so you don't need to build an external triggering structure.
Here is an example of how I am using this in a 24/7 production environment (the full script receives an XML file, processes it, and inserts the results into SQL in under 3 seconds).
Function Submit-ResultFile {
    # Actual file processing takes place here
}
Function Create-Fsw {
    Register-ObjectEvent $fsw Created -SourceIdentifier "Spectro FileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $file = $Event.SourceEventArgs.FullPath
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -ForegroundColor Green
        Submit-ResultFile -xmlfile $file
    }
}
# IncludeSubdirectories can be changed to $true if required.
$fsw = New-Object IO.FileSystemWatcher $watchFolder, $watchFilter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
# Sweep for files that already exist before the watcher starts (see note below).
$xmlFiles = Get-ChildItem -Path $ResultsDirectory -Filter *.xml
foreach ($file in $xmlFiles)
{
    Submit-ResultFile -xmlfile $file.FullName
}
# Register the new File System Watcher.
Create-Fsw
Several important points to be aware of:
- If files exist in that location before the FSW is created, they WILL NOT trigger an "objectevent", so in my script you'll observe that I begin by running a sweep for existing files.
- When the FSW does trigger, you want it to process only one file at a time, since the next file-creation event will generate a new "objectevent"; structuring a FSW to work on multiple files per trigger will eventually result in a crash.
I have the below script:
$pattern = 'Unable to authenticate user!'
$events = Get-WinEvent -ea SilentlyContinue `
    -ProviderName "Windows DB Controller - Task Manager Service" |
    Where-Object { $_.TimeCreated -gt [datetime]::today -and $_.Message -match $pattern }
$events >> D:\Error.txt
if ($events) {
    Send-MailMessage -SmtpServer smtp.domain.com -From No-reply@domain.com -To sunny@domain.com -Subject 'Error found in log' -Body ($events | Out-String)
}
I scheduled it to run every 10 minutes, and what I wanted to achieve with the script above is this:
Search for the specified error message in the event log, but only for the current date, and send an email notification as soon as the error message is encountered; however, I don't want to be notified again about an error message that appeared today and that I have already been notified about (I mean, I want to receive an error notification only once for a specific time of the current day).
The problem I am facing: I get multiple notifications for the same error message I have already been notified about.
I hope I have been clear enough about my exact problem.
Could you please help me figure out how to resolve this?
If you are running the script every 10 minutes, I would change the condition in the Where-Object so that instead of getting all of the events that happened "today", it only gets the events that happened in the last 10 minutes, i.e. the code becomes:
Where-Object { $_.TimeCreated -gt [datetime]::now.AddMinutes(-10) -and $_.Message -match $pattern }
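In context, the relevant part of your script would become something like this (same provider name as in the question):
$events = Get-WinEvent -ea SilentlyContinue `
    -ProviderName "Windows DB Controller - Task Manager Service" |
    Where-Object { $_.TimeCreated -gt [datetime]::now.AddMinutes(-10) -and $_.Message -match $pattern }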
Have a look at this thread:
Powershell - Tail Windows Event Log? Is it possible?
It's about tailing an event log, but the same method should work for what you're trying to do: just save the last index number to a file between runs.
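A rough sketch of that bookkeeping (the state-file path is an arbitrary choice):
$stateFile = 'D:\lastRecordId.txt' # remembers the last event already reported
$lastId = if (Test-Path $stateFile) { [int64](Get-Content $stateFile) } else { 0 }
$newEvents = Get-WinEvent -ea SilentlyContinue `
    -ProviderName "Windows DB Controller - Task Manager Service" |
    Where-Object { $_.RecordId -gt $lastId -and $_.Message -match $pattern }
if ($newEvents) {
    # ... send the notification here ...
    # Persist the highest record id seen so far.
    ($newEvents | Measure-Object RecordId -Maximum).Maximum | Set-Content $stateFile
}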
How about the following approach:
Register-WmiEvent -Query "select * from __InstanceCreationEvent where TargetInstance ISA 'Win32_NTLogEvent' and TargetInstance.SourceName = 'Windows DB Controller - Task Manager Service' and TargetInstance.Message LIKE '%Unable to authenticate user!%'" -SourceIdentifier "MyEventListener" -Action {
    # Your actions
    Write-Host "$($eventArgs.NewEvent.TargetInstance.RecordNumber) at $($eventArgs.NewEvent.TargetInstance.TimeGenerated)"
}
It uses WMI to subscribe to the event that occurs when an eventlog entry matching your criteria is generated. The action itself only receives the new object (so no more duplicates). I've included a sample action to help you understand how to access the object. This method gives you live monitoring.
Inside the action, $eventargs.NewEvent.TargetInstance gives you the object, which is an instance of Win32_NTLogEvent. To see the properties of this class, check out TechNet or run the following command:
([wmiclass]'win32_ntlogevent').Properties | ft Name
To make the script run forever, call it with powershell -NoExit -File script.ps1 or include a while ($true) loop at the end of your script. (I'm not sure how the while-loop affects resource usage long-term; you'd have to test.)