How can I get a du-ish analysis using PowerShell? I'd like to periodically check the size of directories on my disk.
The following gives me the size of each file in the current directory:
foreach ($o in gci)
{
    Write-Output $o.Length
}
But what I really want is the aggregate size of all files in the directory, including subdirectories. Also I'd like to be able to sort it by size, optionally.
There is an implementation available at the "Exploring Beautiful Languages" blog:
"An implementation of 'du -s *' in Powershell"
function directory-summary($dir=".") {
get-childitem $dir |
% { $f = $_ ;
get-childitem -r $_.FullName |
measure-object -property length -sum |
select @{Name="Name";Expression={$f}},Sum}
}
(Code by the blog owner: Luis Diego Fallas)
Output:
PS C:\Python25> directory-summary
Name Sum
---- ---
DLLs 4794012
Doc 4160038
include 382592
Lib 13752327
libs 948600
tcl 3248808
Tools 547784
LICENSE.txt 13817
NEWS.txt 88573
python.exe 24064
pythonw.exe 24576
README.txt 56691
w9xpopen.exe 4608
I modified the command in the answer slightly to sort descending by size and include size in MB:
gci . |
%{$f=$_; gci -r $_.FullName |
measure-object -property length -sum |
select @{Name="Name"; Expression={$f}},
@{Name="Sum (MB)";
Expression={"{0:N3}" -f ($_.sum / 1MB) }}, Sum } |
sort Sum -desc |
format-table -Property Name,"Sum (MB)", Sum -autosize
Output:
PS C:\scripts> du
Name Sum (MB) Sum
---- -------- ---
results 101.297 106217913
SysinternalsSuite 56.081 58805079
ALUC 25.473 26710018
dir 11.812 12385690
dir2 3.168 3322298
Maybe it is not the most efficient method, but it works.
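On the efficiency point: for very large trees, one hedged alternative is to drop down to .NET's streaming enumeration instead of piping every FileInfo object through the pipeline. A rough sketch (the function name is my own, not from any answer here; note EnumerateFiles throws on access-denied directories):
# Rough sketch only: assumes a local, fully readable tree.
function Get-TreeSizeFast([string]$path = ".") {
    $full = (Resolve-Path $path).Path   # EnumerateFiles needs a real path, not the PS location
    $total = 0L
    foreach ($f in [System.IO.Directory]::EnumerateFiles($full, "*", [System.IO.SearchOption]::AllDirectories)) {
        $total += ([System.IO.FileInfo]$f).Length
    }
    "{0:N3} MB" -f ($total / 1MB)
}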
If you only need the total size of that path, a simplified version would be:
Get-ChildItem -Recurse ${HERE_YOUR_PATH} | Measure-Object -Sum Length
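For example, to pull just the number out of that result and format it (the path here is only illustrative):
$total = (Get-ChildItem -Recurse C:\Scripts | Measure-Object -Sum Length).Sum
"{0:N2} GB" -f ($total / 1GB)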
function Get-DiskUsage ([string]$path=".") {
    $groupedList = Get-ChildItem -Recurse -File $path |
        Group-Object directoryName |
        select name,@{name='length'; expression={($_.group | Measure-Object -sum length).sum } }
    foreach ($dn in $groupedList) {
        New-Object psobject -Property @{ directoryName=$dn.name; length=($groupedList | where { $_.name -like "$($dn.name)*" } | Measure-Object -Sum length).sum }
    }
}
Mine is a bit different; I group all of the files by directoryName, then walk through that list building totals for each directory (to include its subdirectories).
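A possible invocation, sorting by the rolled-up totals (property names as defined in the function above; the path is just an example):
Get-DiskUsage C:\Tools |
    Sort-Object -Property length -Descending |
    Format-Table directoryName, length -AutoSize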
Building on previous answers, this will work for those who want to show sizes in KB, MB, GB, etc., and still be able to sort by size. To change units, change "MB" to the desired unit in both "Name=" and "Expression=". You can also change the number of decimal places shown (rounding) by changing the "2".
function du($path=".") {
Get-ChildItem $path |
ForEach-Object {
$file = $_
Get-ChildItem -File -Recurse $_.FullName | Measure-Object -Property length -Sum |
Select-Object -Property @{Name="Name";Expression={$file}},
@{Name="Size(MB)";Expression={[math]::round(($_.Sum / 1MB),2)}} # round to 2 decimal places
}
}
This gives the size as a number not a string (as seen in another answer), therefore one can sort by size. For example:
PS C:\Users\merce> du | Sort-Object -Property "Size(MB)" -Descending
Name Size(MB)
---- --------
OneDrive 30944.04
Downloads 401.7
Desktop 335.07
.vscode 301.02
Intel 6.62
Pictures 6.36
Music 0.06
Favorites 0.02
.ssh 0.01
Searches 0
Links 0
My own take using the previous answers:
function Format-FileSize([int64] $size) {
    if ($size -lt 1KB)
    {
        return $size
    }
    if ($size -lt 1MB)
    {
        return "{0:0.0} KB" -f ($size/1KB)
    }
    if ($size -lt 1GB)
    {
        return "{0:0.0} MB" -f ($size/1MB)
    }
    return "{0:0.0} GB" -f ($size/1GB)
}
function du {
param(
[System.String]
$Path=".",
[switch]
$SortBySize,
[switch]
$Summary
)
$Path = (Get-Item $Path).FullName   # resolve the parameter, not a hardcoded "."
$groupedList = Get-ChildItem -Recurse -File $Path |
Group-Object directoryName |
select name,@{name='length'; expression={($_.group | Measure-Object -sum length).sum } }
$results = ($groupedList | % {
$dn = $_
if ($summary -and ($path -ne $dn.name)) {
return
}
$size = ($groupedList | where { $_.name -like "$($dn.name)*" } | Measure-Object -Sum length).sum
New-Object psobject -Property @{
    Directory = $dn.name;
    Size = Format-FileSize($size);
    Bytes = $size
}
})
if ($SortBySize)
{ $results = $results | sort-object -property Bytes }
$results | more
}
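Example usage of the above (a sketch; output will of course vary by machine):
du                              # current directory, unsorted
du -Path C:\Users -SortBySize   # sorted ascending by raw byte count
du -Summary                     # one row: the grand total for the path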
Related
I want to keep only the file with the largest version number for each zip file in the folder, using PowerShell. I wrote a script, but it returns all the files. How can I modify it to select only the file with the largest version?
$files = Get-ChildItem -Filter "*.zip"
$max = $files |Measure-Object -Maximum| ForEach-Object {[int]($_.Split("_")[-1].Split(".")[0])}
$largestFiles = $files | Where-Object {[int]($_.Split("_")[-1].Split(".")[0]) -eq $max}
Write-Output $largestFiles
Expectation:
A1_Fantasic_World_20.zip
A1_Fantasic_World_21.zip
B1_Mythical_Realms_11.zip
B1_Mythical_Realms_12.zip
C1_Eternal_Frame_Corporation_2.zip
C1_Eternal_Frame_Corporation_3.zip
↓
A1_Fantasic_World_21.zip
B1_Mythical_Realms_12.zip
C1_Eternal_Frame_Corporation_3.zip
A1_Fantasic_World's biggest number is 21, B1_Mythical_Realms's is 12, and C1_Eternal_Frame_Corporation's is 3. So I want to choose the biggest version of each zip.
First you add the calculated properties to the file system objects you use for filtering. Then, with a combination of Group-Object, Sort-Object, and Select-Object, you can filter out the desired files.
$FileList =
Get-ChildItem -Filter *.zip |
Select-Object -Property *,
@{
Name = 'Title'
Expression = {($_.BaseName -split '_')[0..$(($_.BaseName -split '_').count - 2)] -join '_' }
},
@{
Name = 'Counter'
Expression = {[INT]($_.BaseName -split '_')[-1]}
}
$LastOnesList =
$FileList |
Group-Object -Property Title |
ForEach-Object {
$_.Group | Sort-Object -Property Counter | Select-Object -Last 1
}
$LastOnesList |
Select-Object -Property Name
There is a folder containing database backup files. I need to recursively check all folders and export a list of the last backup file in each folder.
I have code here that contains my idea; I just have to add this part:
- For each selected file (the selected file is the last created backup file), check whether the file creation time is older than 24 hours, and export the result to a CSV file.
Thanks in advance.
[Cmdletbinding()]
param(
    [Parameter(Position=0,Mandatory=$false,ValueFromPipeline=$true)]
    $path = "F:\backups",
    [Parameter(Position=1,Mandatory=$false,ValueFromPipeline=$true)]
    $OutPutFilepath = "f:\backup-daily.csv"
)
function Get-LastestWroteFile {
    [Cmdletbinding()]
    param(
        [Parameter(Position=0,Mandatory=$true)]$Folder
    )
    begin{
        $Latest = Get-ChildItem $Folder.FullName -File |
            select FullName, CreationTime, LastAccessTime, LastWriteTime, Attributes,
                @{N='SizeInMb';E={$_.Length/1mb}}, Name |
            Sort-Object CreationTime | select -Last 1
    }
    process{
    }
    end{
        # new custom object with 3 props.
        if($Latest){
            return New-Object PSobject -Property @{
                "FullName" = $Latest.Name;
                LastWriteTime = $Latest.LastWriteTime;
                "Folder" = $Folder.FullName;
                "SizeInMB" = [math]::Round($Latest.SizeInMb,3)
            } #FileInfo=$Latest;
        }
    }
}
$OutPut = @()
Get-ChildItem -Directory -Path $path -Recurse | foreach{
    $OutPut += Get-LastestWroteFile $_
}
$OutPut | ConvertTo-Csv -NoTypeInformation -Delimiter '|' | Out-File -FilePath $OutPutFilepath
An advanced function isn't required; try this:
Get-ChildItem -Path $Path -Recurse -File | Where-Object -FilterScript {
    # TotalHours, not Hours: Hours is only the hour component of the timespan
    ([Datetime]::Now - $_.CreationTime).TotalHours -gt 24
} | Select-Object -Property Name, LastWriteTime, FullName,
    @{N='SizeInMb';E={$_.Length/1mb}}
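To produce the CSV the question asks for, you could capture that result and hand it to Export-Csv, reusing the question's $Path and $OutPutFilepath (a sketch, not tested against the real share):
$stale = Get-ChildItem -Path $Path -Recurse -File | Where-Object {
    ([Datetime]::Now - $_.CreationTime).TotalHours -gt 24
} | Select-Object -Property Name, LastWriteTime, FullName,
    @{N='SizeInMb'; E={$_.Length/1mb}}
$stale | Export-Csv -Path $OutPutFilepath -NoTypeInformation -Delimiter '|'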
I've been tasked with creating a script that checks to see if the office cameras we've set up have stopped uploading their feeds to the "Camera" share located on our Windows 2016 storage server. If the NEWEST .mkv is over an hour old compared to the current time (get-date) then the "problem" camera needs to be restarted manually. (No need to script that part.)
Here's what my Director has written so far:
#Variable Definitions start here
$numhours = 1
Get-ChildItem "d:\Shares\Cameras" | Foreach {
$folderToLookAt = ($_.FullName + "\*.mkv")
$result = Get-ChildItem -Recurse $folderToLookAt | Sort-Object CreationTime -Descending
echo $result[0].FullName
echo $result[0].CreationTime
}
The first variable really isn't used yet, but I'm kind of dumbstruck as to what to do next. The above successfully returns the full names and creation times of the newest .mkvs.
Suggestions on the next part?
Invert the logic - instead of searching all the files, sorting them, finding the most recent, and checking the date, do it the other way round.
Look for files created since the cutoff, and alert if there were none found:
$cutOffTime = [datetime]::Now.AddHours(-1)
Get-ChildItem "d:\Shares\Cameras" | Foreach {
$folderToLookAt = ($_.FullName + "\*.mkv")
$result = Get-ChildItem -Recurse $folderToLookAt | Where-Object { $_.CreationTime -gt $cutOffTime }
if (-not $result)
{
"$($_.Name) has no files since the cutoff time"
}
}
I'm assuming your paths look like:
D:\Shares\Cameras\Camera1\file1.mkv
D:\Shares\Cameras\Camera1\file2.mkv
D:\Shares\Cameras\Camera2\file1.mkv
D:\Shares\Cameras\Camera2\file2.mkv
D:\Shares\Cameras\Camera3\file1.mkv
.
.
.
If so, I would do something like this:
# The path to your files
$CameraShareRoot = 'D:\Shares\Cameras';
# Number of Hours
$NumberOfHours = 1;
# Date and time of significance. It's $NumberOfHours in the past.
$MinFileAge = (Get-Date).AddHours( - $NumberOfHours);
# Get all the folders at the camera share root
Get-ChildItem -Path $CameraShareRoot -Directory | ForEach-Object {
# Get the most recently created file in each folder
$_ | Get-ChildItem -Recurse -Filter '*.mkv' -File | Sort-Object -Property CreationTime -Descending | Select-Object -First 1
} | Where-Object {
# Remove any files that were created after our datetime
$_.CreationTime -lt $MinFileAge;
} | Select-Object -Property FullName, CreationTime
This will just output the full file name and creation time for stale cameras.
You could do something like this to email yourself a report when the results have any files:
# The path to your files
$CameraShareRoot = 'D:\Shares\Cameras';
# Number of Hours
$NumberOfHours = 1;
# Date and time of significance. It's $NumberOfHours in the past.
$MinFileAge = (Get-Date).AddHours( - $NumberOfHours);
# Get all the folders at the camera share root, save the results to $StaleCameraFiles
$StaleCameraFiles = Get-ChildItem -Path $CameraShareRoot -Directory | ForEach-Object {
# Get the most recently created file in each folder
$_ | Get-ChildItem -Recurse -Filter '*.mkv' -File | Sort-Object -Property CreationTime -Descending | Select-Object -First 1;
} | Where-Object {
# Remove any files that were created after our datetime
$_.CreationTime -lt $MinFileAge;
}
# If there are any stale camera files
if ($StaleCameraFiles) {
# Send an email
$MailMessage = @{
    SmtpServer = 'mail.example.com';
    To = 'youremail@example.com';
    From = 'youremail@example.com';
    Subject = 'Stale Camera Files';
    Body = $StaleCameraFiles | Select-Object -Property FullName, CreationTime | ConvertTo-Html -Fragment | Out-String;
    BodyAsHtml = $true;
}
Send-MailMessage @MailMessage;
}
Generally you will want to use LastWriteTime instead of CreationTime since the latter can be updated by a file move or copy, but maybe that's what you want here.
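For example, the same stale-camera check keyed on LastWriteTime would look like this (same paths and pattern as the code above, with only the property swapped):
$MinFileAge = (Get-Date).AddHours(-1)
Get-ChildItem -Path 'D:\Shares\Cameras' -Directory | ForEach-Object {
    # Most recently written file in each camera folder
    $_ | Get-ChildItem -Recurse -Filter '*.mkv' -File |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1
} | Where-Object { $_.LastWriteTime -lt $MinFileAge } |
    Select-Object -Property FullName, LastWriteTime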
You have to compare the CreationTime date with (Get-Date).AddHours(-1). The AddHours method adds hours to a DateTime; passing a negative value subtracts them.
You can use the following example:
$Path = 'd:\Shares\Cameras'
$CreationTime = Get-ChildItem -Path $Path -Filter *.mkv |
Sort-Object -Property CreationTime -Descending |
Select-Object -First 1 -ExpandProperty CreationTime
if ($CreationTime -lt (Get-Date).AddHours(-1)) {
# your action here (restart, send mail, write output, ...)
}
It also optimizes your code a bit. ;)
$LatestFile = Get-ChildItem C:\Users\Connor\Desktop\ | Sort CreationTime | Select -Last 1
if ($LatestFile.CreationTime -gt (Get-Date).AddHours(-1)){
#It's Currently Working
} else {
#Do Other Stuff
}
Try this:
Get-ChildItem "c:\temp" -Filter *.mkv -File | sort CreationTime -Descending |
select -First 1 | where CreationTime -lt (Get-Date).AddHours(-1) |
%{Write-Host "Alert !!" -ForegroundColor Red}
My group is moving to a new network where we can't directly copy from our computer in Network A to the new machine in Network B. After years on this machine in Network A, I've got project files interspersed all over the disk. I need to build a script to copy the folders and files to a backup disk. No problem there, but the network tech guy requires the total byte count to be known before copying.
In CMD I've used dir /AD /S /B > C:\Users\r6540\Desktop\UserFiles.txt from C:\ to generate a huge list of directories, including a lot of junk that I've manually edited out.
e.g.
C:\Dev\SSIS
C:\Dev\SSIS\DatabaseCleanup
C:\Dev\SSIS\DatabaseMaintTests
C:\Dev\SSIS\EclipseKeys
C:\Dev\SSIS\TemplateProject
I've never used PowerShell, but it certainly looks like this task would be within its ability. I found this:
$startFolder = "C:\Scripts"
$colItems = (Get-ChildItem $startFolder | Measure-Object -property length -sum)
"$startFolder -- " + "{0:N2}" -f ($colItems.sum / 1MB) + " MB"
$colItems = (Get-ChildItem $startFolder -recurse | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object)
foreach ($i in $colItems)
{
$subFolderItems = (Get-ChildItem $i.FullName | Measure-Object -property length -sum)
$i.FullName + " -- " + "{0:N2}" -f ($subFolderItems.sum / 1MB) + " MB"
}
at Microsoft TechNet, and also this from the same page:
$objFSO = New-Object -com Scripting.FileSystemObject
"{0:N2}" -f (($objFSO.GetFolder("C:\Scripts").Size) / 1MB) + " MB"
The output I'm looking for is the directory name, a tab, and the folder size (without the "MB" as shown above though) and CRLF as the EOL written to a text file.
e.g.
C:\Dev\SSIS 70.23
C:\Dev\SSIS\DatabaseCleanup 17.80
C:\Dev\SSIS\DatabaseMaintTests 22.91
C:\Dev\SSIS\EclipseKeys 1.22
C:\Dev\SSIS\TemplateProject 13.29
Anyone know PowerShell well enough to troop through UserFiles.txt and get the resulting text file output?
Form doesn't matter as much as function--so if you can come up with an alternate approach, I'd be glad to see it.
Thanks.
This is pretty straightforward in PowerShell:
$objFSO = New-Object -com Scripting.FileSystemObject
Get-Content c:\input.txt |
Foreach { "{0}`t{1:N2}" -f $_, (($objFSO.GetFolder($_).Size) / 1MB) } |
Out-File c:\output.txt -enc ascii
This is assuming the FileSystemObject script you found works. :-)
A simple approach is to use the pipeline more efficiently:
$inputFile = "C:\Users\r6540\Desktop\UserFiles.txt"
$outputFile = "C:\temp\output.txt"
Get-Content $inputFile | ForEach-Object{
$sum = Get-ChildItem $_ -recurse -Force | Measure-Object -property length -sum | Select-Object -ExpandProperty Sum
"{1}`t{0:N2}" -f ($sum / 1MB), $_
} | Set-Content $outputFile
So we take each line of the file and gather the size in MB using your posted logic, then send the output to a file. But we can improve on that a little.
$inputFile = "C:\Users\r6540\Desktop\UserFiles.txt"
$outputFile = "C:\temp\output.txt"
$results = Get-Content $inputFile | ForEach-Object{
$props = @{
Folder = $_
Sum = "{0:N2}" -f ((Get-ChildItem $_ -Recurse -Force | Measure-Object -property length -sum | Select-Object -ExpandProperty Sum) / 1MB)
}
New-Object -TypeName PSCustomObject -Property $props
}
$results | Export-CSV $outputFile -NoTypeInformation
"Total,{0}" -f ($results | Measure-Object Sum -sum | Select-Object -ExpandProperty Sum) | Add-Content $outputFile
The only difference here is that we produce nice CSV output, with a total appended at the bottom, since at the end of the day that might be nice to know as well (if I were the recipient of copies from multiple users, it would save me a few seconds of effort). You could just keep a running counter in the loop instead, as sketched below, but this gives you an opportunity to see PowerShell at work.
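The running-counter variant mentioned above might look like this (same $inputFile/$outputFile as before; the total is accumulated in raw bytes and formatted once at the end):
$totalBytes = 0
Get-Content $inputFile | ForEach-Object {
    $sum = (Get-ChildItem $_ -Recurse -Force | Measure-Object -Property Length -Sum).Sum
    $totalBytes += $sum   # ForEach-Object runs in the caller's scope, so this accumulates
    "{0}`t{1:N2}" -f $_, ($sum / 1MB)
} | Set-Content $outputFile
"Total`t{0:N2}" -f ($totalBytes / 1MB) | Add-Content $outputFile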
I want to output the folder name, LastWriteTime, and folder size; how can I combine both results into one line?
For folder name and lastwritetime:
get-item "\\server-01\Y$\Server1" | select name,lastwritetime
For folder size:
$folder = (Get-ChildItem "\\server-01\Y$\Server1" -recurse | Measure-Object -property length -sum)
$size = "{0:N2}" -f ($folder.sum / 1024MB) + " GB"
I need output format like this:
Name LastWriteTime Size
Server1 2014-05-05 55G
Also, how would I loop this over a list of PCs? Any ideas, please?
For Folder name and lastwritetime:
Get-Item $Path | Select-Object BaseName, LastWriteTime
For folder size:
$log="C:\log.txt"
$Path = "C:\Test"
$Items = Get-ChildItem $Path | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object
foreach ($f in $Items){
    $itemSum = Get-ChildItem ("$Path\" + $f.Name) |
        Select-Object @{ l="Path" ; e = {$f}}, LastWriteTime,
            @{ l="Size" ; e = {((Get-ChildItem ("$Path\" + $f.Name) -Recurse | Measure-Object length -Sum).Sum / 1KB)}}
}
Enjoy!!
FYI: "Query Folder tree for Size and export to a log on a server"
Select-Object will be your friend here:
foreach ($c in (get-content .\Servers.txt)) {
    Get-ChildItem \\$c\y$\mydirectory |
        Select-Object @{l="Name"; e={$c}}, LastWriteTime,
            @{l="Size"; e={(Get-ChildItem $_.FullName -Recurse | Measure-Object length -Sum).Sum}}
}
But you could also do yourself a favor and add a function like get-foldersize to your profile or to a standard tools module.
http://gallery.technet.microsoft.com/Get-FolderSize-b3d317f5
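In case that gallery link ever rots, a minimal sketch of such a helper (my own naming, not the gallery script) might be:
function Get-FolderSize {
    param([string]$Path = ".")
    # -Force includes hidden files; access-denied entries are skipped silently
    $sum = (Get-ChildItem -Path $Path -Recurse -File -Force -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum).Sum
    New-Object psobject -Property @{
        Path   = (Resolve-Path $Path).Path
        SizeMB = [math]::Round($sum / 1MB, 2)
    }
}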
Here's a true one-liner with some formatting.
Get-ChildItem -Directory -Force|ForEach {"{0,-30} {1,-30} {2:N2}MB" -f $_.Name, $_.LastWriteTime, ((Get-ChildItem $_ -Recurse|Measure-Object -Property Length -Sum -ErrorAction Stop).Sum/1MB)}