PowerShell - Sequential foreach loop zipping files - Windows

I have a number of IIS servers, each with a number of sites on them, and I want to zip all the IIS logs regularly.
I cobbled together the following PowerShell script with the help of this site and Google:
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {& 'C:\Program Files\WinRAR\winrar.exe' a -tl -df -m5 "$file.rar" $File}
The problem with this script is that if there are, say, 2,000 total log files, it tries to launch 2,000 simultaneous copies of WinRAR and the server crashes. This was unexpected: I expected it to zip the files one at a time, sequentially.
Does anyone have any ideas to make this work the way I want?
I'd really like to use WinRAR rather than the native Compress-Archive option because:
I want the archive's date to reflect the original log file, not the date it was zipped.
I want the utility to delete the files after archiving, because the utility will not delete a file if archiving it failed.
I'm not married to Winrar if I can achieve this another way.

For testing purposes, I created a LogFolder with 3 separate log folders: Log1, Log2 and Log3. Each of these log folders has 2,000 files with 2 MB of data each. This is the command I ran to compress each folder separately.
You can also run these jobs serially if the parallel performance is too slow (for example, when reading from and writing to the same disk).
$ElementsInLog = (Get-ChildItem C:\temp\LogFolder\*.txt -Recurse).Length
$ElementsInLog1 = (Get-ChildItem C:\temp\LogFolder\Log1\*.txt -Recurse).Length
$ElementsInLog2 = (Get-ChildItem C:\temp\LogFolder\Log2\*.txt -Recurse).Length
$ElementsInLog3 = (Get-ChildItem C:\temp\LogFolder\Log3\*.txt -Recurse).Length
Write-Output "Main: $ElementsInLog`nLog1: $ElementsInLog1`nLog2: $elementsInLog2`nLog3: $elementsInLog3"
Write-output "Total File Size: $((Get-ChildItem C:\temp\LogFolder\Log1\*.txt -Recurse | Measure-Object length -Sum).Sum / 1024 / 1024 / 1024) GB"
Write-Output "Starting Tasks..."
$job1 = Start-Job -ScriptBlock {
    Write-Output "Log1: $(Get-Date -Format G)"
    Get-ChildItem C:\temp\LogFolder\Log1 -Recurse | Compress-Archive -DestinationPath C:\temp\log1.zip -CompressionLevel Fastest
    Write-Output "Finished: $(Get-Date -Format G)"
}
$job2 = Start-Job -ScriptBlock {
    Write-Output "Log2: $(Get-Date -Format G)"
    Get-ChildItem C:\temp\LogFolder\Log2 -Recurse | Compress-Archive -DestinationPath C:\temp\log2.zip -CompressionLevel Fastest
    Write-Output "Finished: $(Get-Date -Format G)"
}
$job3 = Start-Job -ScriptBlock {
    Write-Output "Log3: $(Get-Date -Format G)"
    Get-ChildItem C:\temp\LogFolder\Log3 -Recurse | Compress-Archive -DestinationPath C:\temp\log3.zip -CompressionLevel Fastest
    Write-Output "Finished: $(Get-Date -Format G)"
}
while ($job1.State -eq "Running" -or $job2.State -eq "Running" -or $job3.State -eq "Running") {
    Start-Sleep 5
}
Receive-Job $job1
Receive-Job $job2
Receive-Job $job3
Output Received
Main: 8000
Log1: 2000
Log2: 2000
Log3: 2000
Total File Size: 4.12791967391968 GB
Starting Tasks...
Log1: 2/10/2020 8:36:22 PM
Finished: 2/10/2020 8:37:30 PM
Log2: 2/10/2020 8:36:22 PM
Finished: 2/10/2020 8:37:27 PM
Log3: 2/10/2020 8:36:22 PM
Finished: 2/10/2020 8:37:28 PM
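If you would rather not poll the job states in a loop, Wait-Job can block until all jobs have finished; a minimal sketch, assuming the same $job1, $job2 and $job3 variables as above:
# Block until all three background jobs are done, then collect their output.
Wait-Job -Job $job1, $job2, $job3 | Out-Null
Receive-Job $job1, $job2, $job3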

You probably want to use Start-Process -Wait instead of the & call operator. The -Wait switch makes Start-Process wait for the launched process to finish, so your loop will run sequentially. Check this article on how to use Start-Process: A Better PowerShell Start Process. You might also want to use the Compress-Archive cmdlet instead of the command-line program WinRAR; the cmdlet may be better integrated and give you better feedback in your scripting.
Something like this with WinRAR?
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {
    $rarArgs = 'a -tl -df -m5 "' + $file.FullName + '.rar" "' + $file.FullName + '"'
    Start-Process -Wait -FilePath 'C:\Program Files\WinRAR\winrar.exe' -ArgumentList $rarArgs
}
Or this with Compress-Archive
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {
    Compress-Archive -Path $file.FullName -DestinationPath "$($file.FullName).zip"
}
The above are untested, but should work unless I made a typo.
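If you do end up with Compress-Archive, the original poster's two requirements (the archive date matching the log, and deleting the source only on success) can also be covered with built-in cmdlets. A rough, untested sketch:
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }
foreach ($file in $files) {
    $zipPath = "$($file.FullName).zip"
    try {
        # Stop on any compression error so a failed archive never deletes its source.
        Compress-Archive -Path $file.FullName -DestinationPath $zipPath -ErrorAction Stop
        # Carry the original log's timestamp over to the archive, then remove the source log.
        (Get-Item $zipPath).LastWriteTime = $file.LastWriteTime
        Remove-Item $file.FullName
    } catch {
        Write-Warning "Failed to archive $($file.FullName): $_"
    }
}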

Related

Powershell - extract specific items from zip then send a message (loop with message box)

I'm trying to achieve the following with this PowerShell script.
Copy any .zip file from folder "dropfilehere" to folder "IN".
For each .zip file in folder "IN", open the zip file and find only the .csv file.
When the .csv file is found, extract it to $dst under the name DB.csv (overwriting the old file).
Empty the contents of folders "dropfilehere" and "IN".
Finally, when all of the above is done, create a popup box with a message to the user using WScript.Shell.
This is the issue: when the message is sent, the user gets 10+ popup boxes, or an endless loop of them.
In the background I see cmd.exe and conhost.exe processes appearing as each popup box gets created.
I use a batch file to call the powershell script.
Powershell.exe -ExecutionPolicy Bypass -File C:\pathtoscript\call.ps1
exit
The script is:
$dst = "C:\Testing\DB"
Copy-item -Path "C:\Users\user\dropfilehere\*.zip" -destination "C:\Testing\Other\In" -Force
Foreach ($zipfile in (Get-ChildItem "C:\Testing\Other\In\*.zip" -Recurse)) {
Add-Type -Assembly System.IO.Compression.FileSystem
$zipFile = [IO.Compression.ZipFile]::OpenRead($zipfile)
$zipFile.Entries | where {$_.Name -like '*.csv'} | foreach {$FileName = $_.Name
[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, "$dst\DB.csv", $true)}
$zipFile.Dispose()
Remove-Item "C:\Testing\Other\In\*" -Recurse -Force
Remove-Item "C:\Users\user\dropfilehere\*" -Recurse -Force
$org="Name of Org"
$timeout = 60 # in seconds
$ws = New-Object -ComObject "Wscript.Shell"
$intButton = $ws.Popup("A new update message here`n
Another message here.",$timeout,$org, 0)
}
exit
There is code inside your foreach loop that should be placed after it, as shown below (properly indenting your code would have made that more obvious):
Add-Type -Assembly System.IO.Compression.FileSystem
$dst = "C:\Testing\DB"
Copy-item -Path "C:\Users\user\dropfilehere\*.zip" -destination "C:\Testing\Other\In" -Force
# Process all files.
foreach ($zipfile in (Get-ChildItem "C:\Testing\Other\In\*.zip" -Recurse)) {
    $archive = [IO.Compression.ZipFile]::OpenRead($zipfile.FullName)
    $archive.Entries |
        Where-Object { $_.Name -like '*.csv' } |
        ForEach-Object {
            [System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, "$dst\DB.csv", $true)
        }
    $archive.Dispose()
}
# Remove the folders containing the original *.zip files.
Remove-Item "C:\Testing\Other\In\*" -Recurse -Force
Remove-Item "C:\Users\user\dropfilehere\*" -Recurse -Force
# Show a message box.
$org = "Name of Org"
$timeout = 60 # in seconds
$ws = New-Object -ComObject "Wscript.Shell"
$intButton = $ws.Popup("A new update message here`nAnother message here.", $timeout, $org, 0)

Why is my Move-Item command filtering not detected?

My command to move all .rar files:
Move-Item -Path * -Filter *.rar -Destination .\Target
I changed it a little bit:
Move-Item -Path * -Filter .\*.rar -Destination .\NewTarget
Same issue again, .rar is not filtered.
Everything (all files) is moved to Target. Why?
I think this should do the trick :-)
It is better to filter the files first with Get-ChildItem and store them in a variable. Then, when you know you have exactly the files you want, move them with ForEach or ForEach-Object:
$Source = "C:\Users\Puzo\Desktop\FolderA"
$Destination = "C:\Users\Puzo\Desktop\FolderB"
$FilesToMove = Get-ChildItem -Path $Source -Filter "*.rar" -Recurse
$i = 0
ForEach ($File in $FilesToMove)
{
    Move-Item -Path $File.FullName -Destination ("$Destination\" + $File.Name)
    Write-Host ('File ' + $File.Name + ' was moved.') -ForegroundColor Yellow
    $i++
}
Write-Host "$i files were moved!" -ForegroundColor Green

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the latest modified log file from one server to another (the servers are in different domains). While copying, it should check the credentials and then execute the copy.
Please let me know if the script is correct or if any corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
Sort LastAccessTime -Descending |
select -First 1 |
select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '.log' | Where-Object {
$_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (get-date).ToString("yyyy/mm/dd")
} | ForEach-Object {
$destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
if (!(Test-Path $destDir)) {
New-Item -ItemType directory $destDir | Out-Null
}
Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular, note these options (a usage sketch follows the list):
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
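For example, something along these lines (the paths, share name and account are placeholders, so treat this as a sketch rather than a drop-in command):
# Map the remote share with credentials from the other domain first (placeholder names; prompts for the password).
net use \\otherserver\logshare /user:OTHERDOMAIN\svc-logcopy *
# Copy only .log files modified within the last day, keeping the folder structure, with retries and a log file.
robocopy D:\Logs \\otherserver\logshare\Logs *.log /S /MAXAGE:1 /R:2 /W:5 /LOG:C:\temp\robocopy.log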

Is there a faster way to move pictures by a PowerShell script

I want to move all pictures from several folders to one destination folder, if they are listed in my txt file.
The script works, but with about 81k pictures and 450k names (e.g. sample-green-bigpic-detail-3.jpg) in the txt file, it is damn slow.
Is there a way to script it so it works faster?
$qpath = "c:\sample\picz\"
$Loggit = "c:\sample\pic_move.log"
$txtZeileU = "c:\sample\names.txt"
$d_pic = "C:\sample\moved_picz"
$arrZeileU = Get-Content -Path $txtZeileU
foreach ($Zeile in $arrZeileU) {
    Get-ChildItem -Path $qpath -Recurse |
        Where-Object { $_.Name -eq $Zeile } |
        Move-Item -Destination $d_pic -Verbose -Force *>&1 |
        Out-File -FilePath $Loggit -Append
}
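One common way to speed this up (a sketch, not from the original thread, reusing the variables above): load the name list into a HashSet once and make a single pass over the folder tree, instead of rescanning it for every one of the 450k names.
# Build a case-insensitive lookup of wanted file names from the txt file.
$names = [System.Collections.Generic.HashSet[string]]::new(
    [string[]](Get-Content -Path $txtZeileU),
    [System.StringComparer]::OrdinalIgnoreCase)
# Single scan of the picture folders; move only files whose name is in the list.
Get-ChildItem -Path $qpath -Recurse -File |
    Where-Object { $names.Contains($_.Name) } |
    Move-Item -Destination $d_pic -Verbose -Force *>&1 |
    Out-File -FilePath $Loggit -Append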

How to monitor progress of md5 hashing of large drives/many files?

I am looking for the simplest and least intrusive way to monitor the progress of MD5 fingerprinting of a large drive with many files (8 TB, 2 million files).
What would be the best option so that, for example, if it gets stuck or enters an infinite loop, I can see which file is causing the trouble?
The code:
Get-childitem -recurse -file | select-object @{n="Hash";e={get-filehash -algorithm MD5 -path $_.FullName | Select-object -expandproperty Hash}},lastwritetime,length,fullname | export-csv "$((Get-Date).ToString("yyyyMMdd_HHmmss"))_filelistcsv_MD5_LWT_size_path_file.csv" -notypeinformation
If you want to list progress, you need to know where your process will end, so you need to list all the files BEFORE you start operating on them.
Write-Host "Listing Files..." -Fore Yellow
$AllFiles = Get-ChildItem -Recurse -File
$CurrentFile = 0 ; $TotalFiles = $AllFiles.Count
Write-Host "Hashing Files..." -Fore Yellow
$AllHashes = foreach ($File in $AllFiles) {
    Write-Progress -Activity "Hashing Files" -Status "$($CurrentFile)/$($TotalFiles) $($File.FullName)" -PercentComplete (($CurrentFile++/$TotalFiles)*100)
    [PSCustomObject]@{
        File          = $File.FullName
        Hash          = (Get-FileHash -LiteralPath $File.FullName -Algorithm MD5).Hash
        LastWriteTime = $File.LastWriteTime
        Size          = $File.Length
    }
}
$AllHashes | Export-Csv "File.csv" -NoTypeInformation
This will give you a nice header with a progress bar, both in the ISE and in a normal console window (the original post showed screenshots of each).
