Yesterday's date in "MMdd", and selecting a specific output folder depending on the date

So I'm trying to manage CCTV footage, and so far I've come up with this code in PowerShell.
It gets yesterday's date in MMdd (today's version would be 0516), selects all files that begin with that prefix, compresses them using ffmpeg, moves them to another folder, and deletes the source files:
$a = get-date -format "MMdd"
$b = 1
$c = $a - $b
$d = $c.ToString("0000")
$inProcessPath = "sourcepath"
$oldVideos = Get-ChildItem -Include @("$d *") -Path $inProcessPath -Recurse;
Set-Location -Path 'D:\ffmpeg\bin';
foreach ($oldVideo in $oldVideos) {
$newVideo = [io.path]::ChangeExtension($oldVideo.FullName, '.avi')
$ArgumentList = '-i "{0}" -b 200000 "{1}"' -f $oldVideo, $newVideo;
Start-Process -FilePath "D:\ffmpeg\bin\ffmpeg.exe" -ArgumentList $ArgumentList -Wait -NoNewWindow;
}
Robocopy D:\ffmpeg\bin\ntv D:\newpaths "$d *.avi" /mov
get-childitem "sourcepath" -include "$d *.mp4" -recurse | foreach ($_) {remove-item $_.fullname}
But during the testing stage I realised that my implementation won't work when there is a month switch: from, let's say, 0601 it produces 0600 instead of 0531.
I also need the converted files moved to a directory matching the current month, so if I have folders May, June, etc., files that start with 05 should go to the May folder, and so on.
Can someone help me accomplish this task, in code or in advice? My programming knowledge is not enough to solve this issue. The main goal is automation.
For the first part, courtesy of @dotnetom, this worked:
$d = (get-date).AddDays(-1).ToString("MMdd")
For the second part I've come up with this:
$a = (get-date).AddDays(-1).ToString("MMMM")
Robocopy D:\Main\AdWords\ffmpeg\bin\ntv "D:\path\$a" "$d *.avi" /mov
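Putting both parts together, a minimal end-to-end sketch (the paths are placeholders taken from the question, and it assumes the videos sit directly in the source folder):
# Yesterday's date, e.g. "0531" when run on 1 June
$d = (Get-Date).AddDays(-1).ToString("MMdd")
# Yesterday's month name, used as the destination folder, e.g. "May"
$month = (Get-Date).AddDays(-1).ToString("MMMM")

$inProcessPath = "sourcepath"   # placeholder from the question
$oldVideos = Get-ChildItem -Path $inProcessPath -Include "$d *" -Recurse

foreach ($oldVideo in $oldVideos) {
    # write the .avi next to the source file
    $newVideo = [IO.Path]::ChangeExtension($oldVideo.FullName, '.avi')
    $argumentList = '-i "{0}" -b 200000 "{1}"' -f $oldVideo.FullName, $newVideo
    Start-Process -FilePath "D:\ffmpeg\bin\ffmpeg.exe" -ArgumentList $argumentList -Wait -NoNewWindow
}

# Move the converted files into the month-named folder (robocopy creates it if missing)
Robocopy $inProcessPath "D:\newpaths\$month" "$d *.avi" /mov

# Finally delete the source .mp4 files for that day
Get-ChildItem $inProcessPath -Include "$d *.mp4" -Recurse | Remove-Item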

To get the previous day you can use the AddDays function to get yesterday's date, and then format it according to your needs:
$d = (get-date).AddDays(-1).ToString("MMdd")
If we break this code down, the components are:
$currentDay = get-date # current day
$yesterday = $currentDay.AddDays(-1) # yesterday
$formattedYesterday = $yesterday.ToString("MMdd") #yesterday formatted to MMdd
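Because AddDays operates on the date value itself rather than on the formatted string, month and year boundaries are handled for you: run on 1 June, the expression yields 0531 rather than the 0600 produced by plain string arithmetic.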

Related

PS - Find Folders that Haven't Had their Files Modified in Some Time

We're migrating our FTP and I would like to only migrate folders that have had files in them that have been used/written in the last 6 months. I would think this would be something that I find all over the place with google, but all the scripts I've found have the same fatal flaw.
It seems with everything I find, it depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into it, there are files that are being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with the datemodified -lt adddays(-180) filter. If stuff that's "New" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -path $path -Directory where-object {LastWriteTime -lt (get-date).addmonths(-6))}
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow the thought process along.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder.FullName -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName plus the file and its dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have files recently modified you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
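If you prefer a single pipeline over the nested loops, an equivalent sketch (it reuses $path and $limit from above and, like the loop, only flags the folder that directly contains a recently touched file):
# Folders that directly contain at least one recently modified/accessed file
$recentDirs = Get-ChildItem $path -Recurse -File -Force |
    Where-Object { $_.LastWriteTime -gt $limit -or $_.LastAccessTime -gt $limit } |
    Select-Object -ExpandProperty DirectoryName -Unique

# Everything else is stale
$stale = $folders.FullName | Where-Object { $_ -notin $recentDirs }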

Folder 's"Date modified" incorrect (Windows 7)

Something very odd is going on with the "Date modified" field of several folders on an exFAT external drive I have. A folder in which several files were recently added still shows its creation date as its date modified. Even worse, another folder with recently added files shows a date that precedes its creation date! Has anyone observed this and know what might be going on? I have checked online and found nothing useful/relevant. The same information shows up both in Explorer and in a command prompt, so it's not specific to Explorer.
Run this PowerShell script. Close Explorer before running to avoid file-locking.
# ------------- give each folder the highest modified date of its files --------
function OneDir($dir)
{
    # elaborate one folder, with given name
    Set-Location -Path $dir.FullName
    $maxd = Get-Date 0    # earliest possible date, as the starting maximum
    $files = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PsIsContainer -eq $false }
    for ($i=0; $i -lt $files.Count; $i++)
    {
        $file = $files[$i]
        $cd = [datetime]($file.LastWriteTime)
        If ($cd -gt $maxd) { $maxd = $cd }
    }
    If ($files.Count -gt 0) { $dir.LastWriteTime = $maxd }
    Write-Host "$($dir.FullName) $($dir.LastWriteTime)"
}
#------------------------- main ------------------------------------
$startDir = Read-Host 'Foldername to start with'
Set-Location -Path $startDir
$t = Get-ItemProperty $startDir
OneDir $t
$dirs = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PSIsContainer }
for ($d=0; $d -lt $dirs.Count; $d++)
{
    OneDir $dirs[$d]
}
Write-Host "Finished. press Enter"
cmd /c pause

How to compress log files older than 30 days in Windows?

I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite = (get-date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
Now, I am unable to find a compress command in PowerShell with which I can compress (zip/tar) the server.log* files older than 30 days.
I'm expecting a single command that I can attach to the above command with a pipe sign.
You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
    $File | Compress-Archive -DestinationPath "$($File.FullName).zip"
}
If you have an older version of PowerShell you can use ZipFileExtensions' CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Savings? Zip format doesn't preserve that attribute, and may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have PowerShell 5, but it creates a zip for every log, which may not be what you want.
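If you're on PowerShell 5+ and all you need is a single archive per run rather than one zip per log, a minimal sketch using Compress-Archive -Update (the archive name is an assumption; this does not handle the locked-file and duplicate-entry concerns listed above):
$LastWrite = (Get-Date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File |
    Where-Object { $_.LastWriteTime -le $LastWrite }

# -Update adds to an existing archive instead of failing when it already exists
$Files | Compress-Archive -DestinationPath "$PWD\old-logs.zip" -Update

# Optionally delete the originals once the archive exists
if (Test-Path "$PWD\old-logs.zip") { $Files | Remove-Item }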
Here's the current version of the script - in case GitHub ever becomes unavailable:
#Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
#Files that have already been archived will be ignored.
param (
    [string] $ParentFolder = "$PSScriptRoot", #Files will be stored in the zip with path relative to this folder
    [string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
    $Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7)}, #a Where-Object function - default = older than 7 days
    [string] $ZipPath = "$PSScriptRoot\archive-$(get-date -f yyyy-MM).zip", #create one archive per run-month - it may contain older files
    [System.IO.Compression.CompressionLevel]$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
    [switch] $DeleteAfterArchiving = $true,
    [switch] $Verbose = $true,
    [switch] $Recurse = $true
)
@( 'System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder #change to the folder so we can get relative path
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) #CreateEntryFromFile raises UnauthorizedAccessException if item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try{
    $WriteArchive = [IO.Compression.ZipFile]::Open( $ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
    ForEach ($File in $FileList){
        Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
        $ArchivedFile = $null
        $RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
        $AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object {#zip will store multiple copies of the exact same file - prevent this by checking if already archived.
            (($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length) ) -and
            ([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).Seconds) -le 2) #ZipFileExtensions timestamps are only precise within 2 seconds.
        })
        If($AlreadyArchivedFile -eq $null){
            If($Verbose){Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc -f "yyyyMMdd-HHmmss") $($File.Length)" }
            Try{
                $ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
            }Catch{
                Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
                $skipped += [psobject]@{Path=$file.FullName; Reason=$_.Exception.Message}
            }
            If($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile){#HACK: fix for buggy date - adds an hour inside archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce DST attribute to file date in the archive.
                $entry = $WriteArchive.GetEntry($RelativePath)
                $entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) #TODO: This is better, but maybe not fully correct. Does it work in all time zones?
            }
        }Else{#Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
            Write-Warning "$($File.FullName) is already archived - No action taken."
            $skipped += [psobject]@{Path=$file.FullName; Reason="Already archived"}
        }
        If((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { #delete original if it's been successfully archived.
            Try {
                Remove-Item $File.FullName -Verbose:$Verbose
            }Catch{
                Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
            }
        }
        $countdown = $countdown -1
    }
}Catch [Exception]{
    Write-Error $_.Exception
}Finally{
    $WriteArchive.Dispose() #close the zip file so it can be read later
    Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
    $skipped | Format-Table -Autosize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30)}

prompt or PS command to delete folders older than x days [duplicate]

I'm building a setup with Inno Setup and I'd like to add a scheduled task that cleans log folders older than X days with a single command.
I've searched for example PowerShell or prompt commands, but none works.
Can you help me find the best way? Thanks
I don't have much time to research this, but if you would like to continuously search for files within a folder, covering a specific time-frame, you can use the following script:
while($true){
    # You may want to adjust these
    $fullPath = "C:\temp\_Patches\Java\Files\x86\Source"
    $numdays = 5
    $numhours = 10
    $nummins = 5
    function ShowOldFiles($path, $days, $hours, $mins)
    {
        $files = @(get-childitem $path -include *.* -recurse | where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.psIsContainer -eq $false)})
        if ($files -ne $NULL)
        {
            for ($idx = 0; $idx -lt $files.Length; $idx++)
            {
                $file = $files[$idx]
                write-host ("Old: " + $file.Name) -Foregroundcolor Red
                Start-Sleep -s 10
            }
        }
    }
    ShowOldFiles $fullPath $numdays $numhours $nummins
}
You would just need to add this script to your start-up folder and change the values (e.g. file path, file age, sleep). You can also append the data to a text file.
I started with the following post:
How can I check if a file is older than a certain time with PowerShell?
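Note that the script above only lists old files. Since the question asks for deleting folders older than X days with a single command, here is a minimal sketch (the path and age are placeholders; keep -WhatIf until you have verified the output, and remember the caveat from the FTP question above: a folder's LastWriteTime may be older than the files inside it):
$path = 'C:\MyApp\Logs'              # placeholder
$limit = (Get-Date).AddDays(-10)     # X = 10 days

Get-ChildItem $path -Directory |
    Where-Object { $_.LastWriteTime -lt $limit } |
    Remove-Item -Recurse -Force -WhatIf   # drop -WhatIf to actually delete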

Parsing data from multiple text files into a CSV

I have a directory full of files filled with content similar to the below. I want to copy everything after //TEST: and before //, along with the date/time and the IPO number, into a CSV.
IPO 7 604 1148 17 - Psuedo text here doesnt mean anything just filler text, beep, boop.txt
werqwerwqerw
erqwerwqer
2. (test) On 7 July 2017 at 0600Z, wqerwqerwqerwerwqerqwerwqjeroisduhsuf //TEST: 37MGUI2974027//,
sdfajsfjiosauf
sadfu
(test2) On 7 July 2017 at 0600Z, blah blah //TEST: 89MTU34782374//
blah blah text here //TEST: GHO394749374// (this is uneeded)
Now, each file has multiple instances of this data, and there may be hundreds of them.
I want to output it into a CSV similar to this:
89MTU34782374, 3 July 2016 at 0640Z, IPO 7 604 1148 17
I have successfully created that with the following, and I feel like I'm on the right track:
$x = "D:\New folder\"
$s = Get-Content $x
$ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
$test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
$date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
Write-Host $test"," $date"," IPO $ipo
However, I am having trouble getting it to find every instance in the file and print each one onto a new line. I should also note that every text file is formatted the same way.
Besides printing each string/variable onto a new line, I'm also having trouble figuring out how to do it for multiple files.
I have tried the following, but it seems to find the terms from the first file only, and spits them out once for every file contained in the directory:
$files = Get-ChildItem "D:\New folder\*.txt"
$s = Get-Content $files
for ($i=0; $i -lt $files.Count; $i++) {
$ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
$test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
$date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
Write-Host $test"," $date"," IPO $ipo
}
Does anyone have any ideas on how this could be done?
I did a bad job of explaining this.
Every document has an IPO number.
Every TEST string has a date/time associated with it.
There may be other TEST strings, but they can be ignored; they are unneeded without a date/time. I could clean them up easily if they got included in the output, though.
Every TEST + date/time combo should have the IPO number of the file it came from.
If the date and the //TEST: ...// substring always appear as pairs and in the same order, you should be able to extract both values with a single regular expression. Try something like this:
Get-ChildItem "D:\New folder\*.txt" | ForEach-Object {
$s = Get-Content $_.FullName
$ipo = [regex]::Matches($s,'(IPO .+?) -').Groups[1].Value
[regex]::Matches($s,' On (.+?),[\s\S]*?//TEST: (.+?)//') | ForEach-Object {
New-Object -Type PSObject -Property #{
IPO = $ipo
Date = $_.Groups[1].Value
Test = $_.Groups[2].Value
}
}
} | Export-Csv 'C:\path\to\output.csv' -NoType
Like so? Most of your code seems to be fine, if I understand your question.
It's the loop that is incorrect: you repeat the same thing for the number of files found, but never actually refer to the individual files. Also, $s = ... should be inside the loop so that you get the content of each file.
$files = Get-ChildItem "D:\New folder\*.txt"
foreach($file in $files){
    $s = Get-Content $file
    $ipo = [regex]::Match($s,'IPO([^/)]+?) -').Groups[1].Value
    $test = [regex]::Matches($s,'//TEST: ([^/)]+?)//').Groups[1].Value
    $date = [regex]::Matches($s,' On([^/)]+?),').Groups[1].Value
    Write-Host "$test, $date, IPO $ipo"
}
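If you want this loop's results in a CSV as well, a sketch that collects objects and exports them (the output path and column names are assumptions; like the loop above, it only takes the first match per file):
$files = Get-ChildItem "D:\New folder\*.txt"
$rows = foreach ($file in $files) {
    $s = Get-Content $file
    [pscustomobject]@{
        Test = [regex]::Matches($s, '//TEST: ([^/)]+?)//').Groups[1].Value
        Date = [regex]::Matches($s, ' On([^/)]+?),').Groups[1].Value
        IPO  = [regex]::Match($s, 'IPO([^/)]+?) -').Groups[1].Value
    }
}
$rows | Export-Csv 'D:\output.csv' -NoTypeInformation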
