Efficiently counting files in directory and subfolders with specific name - performance

I can count all the files in a folder and its sub-folders; the folders themselves are not counted.
(gci -Path *Fill_in_path_here* -Recurse -File | where Name -like "*STB*").Count
However, PowerShell is too slow for the number of files involved (up to 700k). I read that cmd is faster at executing this kind of task.
Unfortunately I have no knowledge of cmd code at all. In the example above I am counting all the files with STB in the file name.
That is what I would like to do in cmd as well.
Any help is appreciated.

Theo's helpful answer based on direct use of .NET ([System.IO.Directory]::EnumerateFiles()) is the fastest option (in my tests; YMMV - see the benchmark code below[1]).
Its limitations in the .NET Framework (FullCLR) - on which Windows PowerShell is built - are:
An exception is thrown when an inaccessible directory is encountered (due to lack of permissions). You can catch the exception, but you cannot continue the enumeration; that is, you cannot robustly enumerate all items that you can access while ignoring those that you cannot.
Hidden items are invariably included.
With recursive enumeration, symlinks / junctions to directories are invariably followed.
By contrast, the cross-platform .NET Core framework, since v2.1 - on which PowerShell Core is built - offers ways around these limitations, via the EnumerationOptions options - see this answer for an example.
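A sketch of what that can look like (PowerShell Core only; the property names below are those defined by .NET Core's System.IO.EnumerationOptions type):
$opts = [System.IO.EnumerationOptions]::new()
$opts.RecurseSubdirectories = $true   # recurse without passing 'AllDirectories'
$opts.IgnoreInaccessible = $true      # skip inaccessible items instead of throwing
# Skipping reparse points prevents following symlinks/junctions; note that this
# replaces the default of skipping Hidden and System items.
$opts.AttributesToSkip = [System.IO.FileAttributes]::ReparsePoint
@([System.IO.Directory]::EnumerateFiles($somePath, '*STB*', $opts)).Count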
Note that you can also perform enumeration via the related [System.IO.DirectoryInfo] type, which - similar to Get-ChildItem - returns rich objects rather than mere path strings, allowing for much more versatile processing; e.g., to get an array of all file sizes (property .Length, implicitly applied to each file object):
([System.IO.DirectoryInfo] $somePath).EnumerateFiles('*STB*', 'AllDirectories').Length
A native PowerShell solution that addresses these limitations and is still reasonably fast is to use Get-ChildItem with the -Filter parameter.
(Get-ChildItem -LiteralPath $somePath -Filter *STB* -Recurse -File).Count
Hidden items are excluded by default; add -Force to include them.
To ignore permission problems, add -ErrorAction SilentlyContinue or -ErrorAction Ignore; the advantage of SilentlyContinue is that you can later inspect the $Error collection to determine the specific errors that occurred, so as to ensure that the errors truly only stem from permission problems.
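Putting these together, a sketch of a reasonably robust count that tolerates permission problems but records the errors for later review:
$Error.Clear()  # start with a clean automatic error collection
$count = (Get-ChildItem -LiteralPath $somePath -Filter *STB* -Recurse -File -Force -ErrorAction SilentlyContinue).Count
"$count matching files; $($Error.Count) error(s) recorded in `$Error for inspection."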
Note that PowerShell Core - unlike Windows PowerShell - helpfully ignores the inability to enumerate the contents of the hidden system junctions that exist for pre-Vista compatibility only, such as $env:USERPROFILE\Cookies.
In Windows PowerShell, Get-ChildItem -Recurse invariably follows symlinks / junctions to directories, unfortunately; more sensibly, PowerShell Core by default does not, and offers opt-in via -FollowSymlink.
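For example (a sketch; the parameter exists only in PowerShell Core):
# Opt into following symlinks/junctions during the recursive enumeration (PowerShell Core 6+).
(Get-ChildItem -LiteralPath $somePath -Filter *STB* -Recurse -File -FollowSymlink).Count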
Like the [System.IO.DirectoryInfo]-based solution, Get-ChildItem outputs rich objects ([System.IO.FileInfo] / [System.IO.DirectoryInfo]) describing each enumerated file-system item, allowing for versatile processing.
Note that while you can also pass wildcard arguments to -Path (the implied first positional parameter) and -Include (as in TobyU's answer), it is only -Filter that provides
significant speed improvements, due to filtering at the source (the filesystem driver), so that PowerShell only receives the already-filtered results; by contrast, -Path / -Include must first enumerate everything and match against the wildcard pattern afterwards.[2]
Caveats re -Filter use:
Its wildcard language is not the same as PowerShell's; notably, it doesn't support character sets/ranges (e.g. *[0-9]) and it has legacy quirks - see this answer.
It only supports a single wildcard pattern, whereas -Include supports multiple (as an array).
That said, -Filter processes wildcards the same way as cmd.exe's dir.
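To illustrate the single- vs. multi-pattern difference, a sketch (*TMP* is just a hypothetical second pattern):
(Get-ChildItem -LiteralPath $somePath -Filter *STB* -Recurse -File).Count          # one pattern, filtered at the source
(Get-ChildItem -LiteralPath $somePath -Recurse -File -Include *STB*, *TMP*).Count  # multiple patterns, filtered by PowerShell (slower)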
Finally, for the sake of completeness, you can adapt MC ND's helpful answer based on cmd.exe's dir command for use in PowerShell, which simplifies matters:
(cmd /c dir /s /b /a-d "$somePath/*STB*").Count
PowerShell captures an external program's stdout output as an array of lines, whose element count you can simply query with the .Count (or .Length) property.
That said, this may or may not be faster than PowerShell's own Get-ChildItem -Filter, depending on the filtering scenario; also note that dir /s can only ever return path strings, whereas Get-ChildItem returns rich objects whose properties you can query.
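For example, a quick sketch of working with the captured output:
$lines = cmd /c dir /s /b /a-d "$somePath/*STB*"  # one string per output line
$lines.Count  # number of files; .Count works even if a single string is returned (PSv3+)
$lines[0]     # the first matching file's full path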
Caveats re dir use:
/a-d excludes directories, i.e., only reports files, but then also includes hidden files, which dir doesn't do by default.
dir /s invariably descends into hidden directories too during the recursive enumeration; an /a (attribute-based) filter is only applied to the leaf items of the enumeration (only to files in this case).
dir /s invariably follows symlinks / junctions to other directories (assuming it has the requisite permissions - see next point).
dir /s quietly ignores directories or symlinks / junctions to directories if it cannot enumerate their contents due to lack of permissions - while this is helpful in the specific case of the aforementioned hidden system junctions (you can find them all with cmd /c dir C:\ /s /ashl), it can cause you to miss the content of directories that you do want to enumerate, but can't for true lack of permissions, because dir /s will give no indication that such content may even exist (if you directly target an inaccessible directory, you get a somewhat misleading File Not Found error message, and the exit code is set to 1).
Performance comparison:
The following tests compare pure enumeration performance without filtering, for simplicity, using a sizable directory tree assumed to be present on all systems, c:\windows\winsxs; that said, it's easy to adapt the tests to also compare filtering performance.
The tests are run from PowerShell, which means that some overhead is introduced by creating a child process for cmd.exe in order to invoke dir /s, though (a) that overhead should be relatively low and (b) the larger point is that staying in the realm of PowerShell is well worthwhile, given its vastly superior capabilities compared to cmd.exe.
The tests use the function Time-Command (downloadable from this Gist), which averages 10 runs by default.
# Warm up the filesystem cache for the target dir.,
# both from PowerShell and cmd.exe, to be safe.
gci 'c:\windows\winsxs' -rec >$null; cmd /c dir /s 'c:\windows\winsxs' >$null
Time-Command `
{ @([System.IO.Directory]::EnumerateFiles('c:\windows\winsxs', '*', 'AllDirectories')).Count },
{ (Get-ChildItem -Force -Recurse -File 'c:\windows\winsxs').Count },
{ (cmd /c dir /s /b /a-d 'c:\windows\winsxs').Count },
{ cmd /c 'dir /s /b /a-d c:\windows\winsxs | find /c /v """"' }
On my single-core VMWare Fusion VM with Windows PowerShell v5.1.17134.407 on Microsoft Windows 10 Pro (64-bit; Version 1803, OS Build: 17134.523) I get the following timings, from fastest to slowest (scroll to the right to see the Factor column to show relative performance):
Command Secs (10-run avg.) TimeSpan Factor
------- ------------------ -------- ------
@([System.IO.Directory]::EnumerateFiles('c:\windows\winsxs', '*', 'AllDirectories')).Count 11.016 00:00:11.0158660 1.00
(cmd /c dir /s /b /a-d 'c:\windows\winsxs').Count 15.128 00:00:15.1277635 1.37
cmd /c 'dir /s /b /a-d c:\windows\winsxs | find /c /v """"' 16.334 00:00:16.3343607 1.48
(Get-ChildItem -Force -Recurse -File 'c:\windows\winsxs').Count 24.525 00:00:24.5254979 2.23
Interestingly, both [System.IO.Directory]::EnumerateFiles() and the Get-ChildItem solution are significantly faster in PowerShell Core, which runs on top of .NET Core (as of PowerShell Core 6.2.0-preview.4, .NET Core 2.1):
Command Secs (10-run avg.) TimeSpan Factor
------- ------------------ -------- ------
@([System.IO.Directory]::EnumerateFiles('c:\windows\winsxs', '*', 'AllDirectories')).Count 5.094 00:00:05.0940364 1.00
(cmd /c dir /s /b /a-d 'c:\windows\winsxs').Count 12.961 00:00:12.9613440 2.54
cmd /c 'dir /s /b /a-d c:\windows\winsxs | find /c /v """"' 14.999 00:00:14.9992965 2.94
(Get-ChildItem -Force -Recurse -File 'c:\windows\winsxs').Count 16.736 00:00:16.7357536 3.29
[1] [System.IO.Directory]::EnumerateFiles() is inherently and undoubtedly faster than a Get-ChildItem solution. In my tests (see section "Performance comparison:" above), [System.IO.Directory]::EnumerateFiles() beat out cmd /c dir /s as well, slightly in Windows PowerShell, and clearly so in PowerShell Core, but others report different findings. That said, finding the overall fastest solution is not the only consideration, especially if more than just counting files is needed and if the enumeration needs to be robust. This answer discusses the tradeoffs of the various solutions.
[2] In fact, due to an inefficient implementation as of Windows PowerShell v5.1 / PowerShell Core 6.2.0-preview.4, use of -Path and -Include is actually slower than using Get-ChildItem unfiltered and instead using an additional pipeline segment with ... | Where-Object Name -like *STB*, as in the OP - see this GitHub issue.

One of the fastest ways to do it in cmd command line or batch file could be
dir "x:\some\where\*stb*" /s /b /a-d | find /c /v ""
Just a recursive (/s) dir command to list all files (no folders, /a-d) in bare format (/b), with all the output piped to the find command, which will count (/c) the number of non-empty lines (/v "").
But, in any case, you will need to enumerate the files, and that takes time.
Edited to adapt to comments, BUT
note: the approach below does not work for this case because, at least in Windows 10, the space padding in the summary lines of the dir command is set to five positions. File counts greater than 99999 are not correctly padded, so the sort /r output is not correct.
As pointed out by Ben Personick, the dir command also outputs the number of files, and we can retrieve that information:
@echo off
setlocal enableextensions disabledelayedexpansion
rem Configure where and what to search
set "search=x:\some\where\*stb*"
rem Retrieve the number of files
set "numFiles=0"
for /f %%a in ('
dir "%search%" /s /a-d /w 2^>nul %= get the list of the files =%
^| findstr /r /c:"^ *[1-9]" %= retrieve only summary lines =%
^| sort /r 2^>nul %= reverse sort, greater line first =%
^| cmd /e /v /c"set /p .=&&echo(!.!" %= retrieve only first line =%
') do set "numFiles=%%a"
echo File(s) found: %numFiles%
The basic idea is to use a series of piped commands to handle different parts of the data retrieval:
Use a dir command to generate the list of files (/w is included just to generate fewer lines).
As we only want summary lines with the number of files found, findstr is used to retrieve only those lines starting with spaces (the header/summary lines) and a number greater than 0 (file-count summary lines; as we are using /a-d, the directory-count summary lines will have a value of 0).
Sort the lines in reverse order so the greatest line comes first (summary lines start with a left-space-padded number). The greatest line (the final file count or equivalent) will be the first line.
Retrieve only this line using a set /p command in a separate cmd instance. As the full sequence is wrapped in a for /f, which has a performance problem when retrieving long lists from command execution, we try to retrieve as little as possible.
The for /f will tokenize the retrieved line, get the first token (number of files) and set the variable used to hold the data (the variable was initialized beforehand, as it is possible that no files are found).

You can speed things up in PowerShell if you include the filter directly within the command instead of filtering the command's result set, which is far bigger than a pre-filtered one.
Try this:
(Get-ChildItem -Path "Fill_in_path_here" -Recurse -File -Include "*STB*").Count

My guess is this is a lot faster:
$path = 'Fill_in_path_here'
@([System.IO.Directory]::EnumerateFiles($path, '*STB*', 'AllDirectories')).Count
If you do not want to recurse into subfolders, change 'AllDirectories' to 'TopDirectoryOnly'.
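Also note that EnumerateFiles() enumerates lazily, whereas wrapping it in @(...) collects every path into an array first. If memory matters for very large trees, a sketch of counting without materializing the array:
$count = 0
foreach ($file in [System.IO.Directory]::EnumerateFiles($path, '*STB*', 'AllDirectories')) { $count++ }
$count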

I would rather use PowerShell, as it is a far stronger tool, but you might give this .bat file script a try.
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION
SET /A "N=0"
FOR /F "delims=" %%f IN ('DIR /S /B /A:-D "C:\path\to\files\*STB*"') DO (SET /A "N=!N!+1")
ECHO There are !N! files.

General testing favors MCND's command when run against several systems
Results over 1000 Runs:
Summary
P/VM - OS - PS Ver - Files - Winner - Faster By % Seconds - Winner FPS - Loser FPS (Files Per Second)
---- - ----------- - ------ - ------ - ------ - ------------------- - ---------- - ----------------------------
PM - Win 7 - 5.1.1 - 87894 - SysIO - 9% (0.29s) - 27,237 FPS - 24,970 FPS
PM - Win 2012 - 5.1.1 - 114968 - MCND - 8% (0.38s) - 25,142 FPS - 23,226 FPS
VM - Win 2012 - 5.1.1 - 99312 - MCND - 34% (1.57s) - 21,265 FPS - 15,890 FPS
PM - Win 2016 - 5.1.1 - 102812 - SysIO - 2% (0.12s) - 20,142 FPS - 19,658 FPS
VM - Win 2012 R2 - 4.0 - 98396 - MCND - 29-34% (1.56-1.71s) - 19,787 FPS - 14,717 FPS
PM - Win 2008 R2 - 5.0.1 - 46557 - MCND - 13-17% (0.33-0.44s) - 18,926 FPS - 16,088 FPS
VM - Win 2012 R2 - 4.0 - 90906 - MCND - 22% (1.25s) - 16,772 FPS - 13,629 FPS
Additionally, Theo's command will bomb on C:\Windows while MCND's works as expected.
-- I have explained to MK in the comments that the \cookies directory and other such directories are intentionally non-traversable, so that you do not double-count the files contained within them.
The test MK ran on VMWare Fusion running atop his macOS is far from conclusive; it shows incredibly slow execution times, which immediately tipped me off that the results were strange.
In addition, I could not execute the command as written by MK and receive a result of the number of files in the folder, so I have included a snippet in my testing which shows that all methods used do give the correct result.
Here is the code used for my runs; note I also ran 1000 runs using MK's preferred method to compare output.
Strangely, the one .Count method for the MCND command seems to give very biased results on my Win 7 system: very different from any other system, hugely slower (5x) in the initial runs I posted, and varying the most on subsequent runs I tried.
But I think this is due to load, and I plan to drop those results if I ever post more; most of the remaining systems give results so similar that they could seem redundant unless they come from very different systems.
$MaxRuns=1000
$Root_Dir="c:\windows\winsxs"
$Results_SysIO=@()
$Results_MCND1=@()
$Results_MCND2=@()
$Results_MCND3=@()
$Results_Meta=@()
FOR ($j=1; $j -le $MaxRuns; $j++) {
Write-Progress -Activity "Testing Methods for $MaxRuns Runs" -Status "Progress: $($j/$MaxRuns*100)% -- Run $j of $MaxRuns" -PercentComplete ($j/$MaxRuns*100)
# Tests SysIO: @([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count
$Results_SysIO+=Measure-Command { @([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count }
sleep -milliseconds 500
# Tests MCND1 CMD Script: DIR "%~1" /s /a-d ^| FIND /I /V "" | find /c /v ""
$Results_MCND1+=Measure-Command {C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir}
sleep -milliseconds 500
# Tests MCND2 CMD Count: {cmd /c 'dir /s /b /a-d $Root_Dir | find /c /v """"'}
$Results_MCND2+=Measure-Command {cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`"}
sleep -milliseconds 500
# Tests MCND3 PS Count (cmd /c dir /s /b /a-d $Root_Dir).Count
$Results_MCND3+=Measure-Command {(cmd /c dir /s /b /a-d $Root_Dir).Count}
sleep -milliseconds 500
}
$CPU=Get-WmiObject Win32_Processor
""
"CPU: $($($CPU.name).count)x $($CPU.name | select -first 1) - Windows: $($(Get-WmiObject Win32_OperatingSystem).Version) - PS Version: $($PSVersionTable.PSVersion)"
ForEach ($Name in "SysIO","MCND1","MCND2","MCND3") {
$Results_Meta+=[PSCustomObject]@{
Method=$Name
Min=$($($(Get-Variable -Name "Results_$Name" -valueOnly).TotalSeconds|Measure-Object -Minimum).Minimum)
Max=$($($(Get-Variable -Name "Results_$Name" -valueOnly).TotalSeconds|Measure-Object -Maximum).Maximum)
Avg=$($($(Get-Variable -Name "Results_$Name" -valueOnly).TotalSeconds|Measure-Object -Average).Average)
}
}
$Results_Meta | sort Avg | select Method,Min,Max,Avg,@{N="Factor";e={("{0:f2}" -f (([math]::Round($_.Avg / $($Results_Meta | sort Avg | select Avg)[0].avg,2,1))))}}|FT
Time-Command `
{cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`"},
{C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir},
{@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count},
{(cmd /c dir /s /b /a-d $Root_Dir).Count} $MaxRuns `
""
"Results of Commands - (How many Files were in that Folder?):"
[PSCustomObject]@{
SysIO=$(&{ @([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count })
MCND1=$(&{C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir})
MCND2=$(&{cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`"})
MCND3=$(&{(cmd /c dir /s /b /a-d $Root_Dir).Count})
}
I have additional runs I haven't collected yet from other systems; the Win 7 results are inconsistent, though, so I'll probably strip them when I have more results from other systems to add to the list.
Detailed Findings
Physical Win 7 Laptop - 87894 Files - Loser: MCND is 9% (.29s) Slower - (Winning Method: 27,237 FPS) -- Results are not consistent on re-runs while other systems are.
CPU: 1x Intel(R) Core(TM) i5-4310U CPU @ 2.00GHz - Windows: 6.1.7601 - PS Version: 5.1.14409.1012
Method Min Max Avg Factor
------ --- --- --- ------
SysIO 3.0190345 6.1287085 3.2174689013 1.00
MCND1 3.3655209 5.9024033 3.5490564665 1.10
MCND3 3.5865989 7.5816207 3.8515160528 1.20
MCND2 3.7542295 7.5619913 3.9471552743 1.23
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count 3.227 00:00:03.2271969 1.00
C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir 3.518 00:00:03.5178810 1.09
cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`" 3.911 00:00:03.9106284 1.21
(cmd /c dir /s /b /a-d $Root_Dir).Count 16.338 00:00:16.3377823 5.06
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
87894 87894 87894 87894
Physical Win 2012 Desktop - 114968 Files - Loser: SysIO is 8% (.38s) Slower - (Winning Method: 25,142 FPS)
CPU: 1x Intel(R) Xeon(R) CPU E5-2407 0 @ 2.20GHz - Windows: 6.3.9600 - PS Version: 5.1.14409.1012
Method Min Max Avg Factor
------ --- --- --- ------
MCND1 4.4957173 8.6672112 4.5726616326 1.00
MCND3 4.6815509 18.6689706 4.7940769407 1.05
SysIO 4.8789948 5.1625618 4.9476786004 1.08
MCND2 5.0404912 7.2557797 5.0854683543 1.11
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir 4.542 00:00:04.5418653 1.00
(cmd /c dir /s /b /a-d $Root_Dir).Count 4.772 00:00:04.7719769 1.05
@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count 4.933 00:00:04.9330404 1.09
cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`" 5.086 00:00:05.0855891 1.12
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
114968 114968 114968 114968
VM Win 2012 Server - 99312 Files - Loser: SysIO is 34% (1.57s) Slower - (Winning Method: 21,265 FPS)
CPU: 4x Intel(R) Xeon(R) CPU E7- 2850 @ 2.00GHz - Windows: 6.3.9600 - PS Version: 5.1.14409.1005
Method Min Max Avg Factor
------ --- --- --- ------
MCND1 4.5563908 5.2656374 4.6812307177 1.00
MCND3 4.6696518 5.3846231 4.9064852835 1.05
MCND2 5.0559205 5.5583717 5.15425442679999 1.10
SysIO 6.036294 6.7952711 6.254027334 1.34
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir 4.669 00:00:04.6689048 1.00
(cmd /c dir /s /b /a-d $Root_Dir).Count 4.934 00:00:04.9336925 1.06
cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`" 5.153 00:00:05.1532386 1.10
@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count 6.239 00:00:06.2389727 1.34
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
99312 99312 99312 99312
Physical Win 2016 Server - 102812 Files - Loser: MCND is 2% (0.12s) Slower - (Winning Method: 20,142 FPS)
CPU: 2x Intel(R) Xeon(R) CPU E5-2667 v4 @ 3.20GHz - Windows: 10.0.14393 - PS Version: 5.1.14393.2608
Method Min Max Avg Factor
------ --- --- --- ------
SysIO 5.0414178 5.5279055 5.1043614001 1.00
MCND3 5.0468476 5.4673033 5.23160342460001 1.02
MCND1 5.1649438 5.6745749 5.26664923669999 1.03
MCND2 5.3280266 5.7989287 5.3747728434 1.05
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count 5.156 00:00:05.1559628 1.00
(cmd /c dir /s /b /a-d $Root_Dir).Count 5.256 00:00:05.2556244 1.02
C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir 5.272 00:00:05.2722298 1.02
cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`" 5.375 00:00:05.3747287 1.04
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
102812 102812 102812 102812
VM Win 2012 R2 Server - 98396 Files - Loser: SysIO 29-34% (1.56-1.71s) Slower - (Winning Method: 19,787 FPS)
CPU: 2x Intel(R) Xeon(R) CPU E7- 2850 @ 2.00GHz - Windows: 6.3.9600 - PS Version: 4.0
Method Min Max Avg Factor
------ --- --- --- ------
MCND1 4.7007419 5.9567352 4.97285509330001 1.00
MCND2 5.2086999 6.7678172 5.4849721167 1.10
MCND3 5.0116501 8.7416729 5.71391797679999 1.15
SysIO 6.2400687 7.414201 6.6862204345 1.34
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
C:\Admin\TestMCNDFindFiles1.cmd $Root... 5.359 00:00:05.3592304 1.00
cmd /c `"dir /s /b /a-d $Root_Dir `| ... 5.711 00:00:05.7107644 1.07
(cmd /c dir /s /b /a-d $Root_Dir).Count 6.173 00:00:06.1728413 1.15
@([System.IO.Directory]::EnumerateFil... 6.921 00:00:06.9213833 1.29
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
98396 98396 98396 98396
Physical Win 2008 R2 Server - 46557 Files - Loser: SysIO 13-17% (0.33-0.44s) Slower - (Winning Method: 18,926 FPS)
CPU: 2x Intel(R) Xeon(R) CPU 5160 @ 3.00GHz - Windows: 6.1.7601 - PS Version: 5.0.10586.117
Method Min Max Avg Factor
------ --- --- --- ------
MCND3 2.2370018 2.8176253 2.4653543378 1.00
MCND1 2.4063578 2.8108379 2.5373719772 1.03
MCND2 2.5953631 2.9085969 2.7312907064 1.11
SysIO 2.7207865 30.335369 2.8940406601 1.17
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
(cmd /c dir /s /b /a-d $Root_Dir).Count 2.500 00:00:02.5001477 1.00
C:\Admin\TestMCNDFindFiles1.cmd $Root_Dir 2.528 00:00:02.5275259 1.01
cmd /c `"dir /s /b /a-d $Root_Dir `| find /c /v `"`"`"`"`" 2.726 00:00:02.7259539 1.09
@([System.IO.Directory]::EnumerateFiles($Root_Dir, '*', 'AllDirectories')).Count 2.826 00:00:02.8259697 1.13
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
46557 46557 46557 46557
VMWare Win 2012 R2 Server - 90906 Files - Loser: SysIO 23% (1.25s) Slower - (Winning Method: 15,722 FPS)
CPU: 4x Intel(R) Xeon(R) CPU E7- 2850 @ 2.00GHz - Windows: 6.3.9600 - PS Version: 4.0
Method Min Max Avg Factor
------ --- --- --- ------
MCND1 5.0516057 6.4537866 5.423386317 1.00
MCND3 5.3297157 7.1722929 5.9030135773 1.09
MCND2 5.5460548 7.0356455 5.931334868 1.09
SysIO 6.2059999 19.5145373 6.6747122712 1.23
Command Secs (1000-run avg.) TimeSpan Factor
------- -------------------- -------- ------
C:\Admin\TestMCNDFindFiles1.cmd $Root... 5.409 00:00:05.4092046 1.00
(cmd /c dir /s /b /a-d $Root_Dir).Count 5.936 00:00:05.9358832 1.10
cmd /c `"dir /s /b /a-d $Root_Dir `| ... 6.069 00:00:06.0689899 1.12
@([System.IO.Directory]::EnumerateFil... 6.557 00:00:06.5571859 1.21
Results of Commands - (How many Files were in that Folder?):
SysIO MCND1 MCND2 MCND3
----- ----- ----- -----
90906 90906 90906 90906

Related

A Batch Script that runs depending on the Windows edition

I am looking for a batch file that will run or not depending on the edition of Windows. It should run on all editions of Windows 10 except Home (or Core) and S. For the moment I have successfully filtered according to the build, but that is not enough.
I also looked at the registry key EditionID in HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion and at wmic os get caption, but being a beginner in batch I can't integrate them into my .bat.
FOR /f "tokens=6 delims=[]. " %%i in ('ver') do SET build=%%i
IF %build% LSS 17763 (
COLOR E
ECHO.===============================================================================
ECHO. ALERT - Not compatible with previous versions ... the RS5
ECHO.===============================================================================
ECHO.
PAUSE
GOTO :EOF
)
You could just use this at the beginning of your batch file:
@Echo Off
SetLocal EnableExtensions
Set /A "SKU=OSV=0"
For /F "EOL=O Tokens=1,2 Delims=. " %%G In ('%SystemRoot%\System32\wbem\WMIC.exe OS Where "Version>10" Get OperatingSystemSKU^, Version 2^>NUL') Do Set /A "SKU=%%G, OSV=%%H" 2>NUL
If Not %OSV% Equ 10 GoTo :EOF
For %%G In (98 99 100 101 123 125 126 129 130 131) Do If %%G Equ %SKU% GoTo :EOF
Rem Your code goes here
The idea is to end the script if the detected OS is not Windows 10, or if the Windows 10 edition is any one of the following:
Edition Identifier   SKU   Operating System Definition
------------------   ---   ---------------------------
CoreN                98    Windows 10 Home N
CoreCountrySpecific  99    Windows 10 Home China
CoreSingleLanguage   100   Windows 10 Home single language
Core                 101   Windows 10 Home
IoTUAP               123   Windows 10 IoT Core
EnterpriseS          125   Windows 10 Enterprise LTSB
EnterpriseSN         126   Windows 10 Enterprise LTSB N
EnterpriseSEval      129   Windows 10 Enterprise LTSB Evaluation
EnterpriseSNEval     130   Windows 10 Enterprise LTSB N Evaluation
IoTUAPCommercial     131   Windows 10 IoT Core Commercial
You can obviously add or remove SKUs within the parentheses of the second For loop as required.

How do I force Robocopy to overwrite files?

In general, Robocopy ignores files for which the last-written date and file size are the same. How can we escape this design? I'd like to force overwriting with Robocopy.
I expected dst\sample.txt to be overwritten with test001.
But these files are recognized as the same by Robocopy and not overwritten. The "/IS" option is not effective in this case.
New-Item src -itemType Directory
New-Item dst -itemType Directory
New-Item src\sample.txt -itemType File -Value "test001"
New-Item dst\sample.txt -itemType File -Value "test002"
Set-ItemProperty src\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
Set-ItemProperty dst\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
ROBOCOPY.exe src dst /COPYALL /MIR
Get-Content src\sample.txt, dst\sample.txt
> test001
> test002
ROBOCOPY.exe src dst /COPYALL /MIR /IS
Get-Content src\sample.txt, dst\sample.txt
> test001
> test002
From the documentation:
/is Includes the same files.
/it Includes "tweaked" files.
"Same files" means files that are identical (name, size, times, attributes). "Tweaked files" means files that have the same name, size, and times, but different attributes.
robocopy src dst sample.txt /is # copy if attributes are equal
robocopy src dst sample.txt /it # copy if attributes differ
robocopy src dst sample.txt /is /it # copy irrespective of attributes
This answer on Super User has a good explanation of what kind of files the selection parameters match.
With that said, I could reproduce the behavior you describe, but from my understanding of the documentation and the output robocopy generated in my tests I would consider this a bug.
PS C:\temp> New-Item src -Type Directory >$null
PS C:\temp> New-Item dst -Type Directory >$null
PS C:\temp> New-Item src\sample.txt -Type File -Value "test001" >$null
PS C:\temp> New-Item dst\sample.txt -Type File -Value "test002" >$null
PS C:\temp> Set-ItemProperty src\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
PS C:\temp> Set-ItemProperty dst\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
Modified 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
Same 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> Get-Content .\src\sample.txt
test001
PS C:\temp> Get-Content .\dst\sample.txt
test002
The file is listed as copied, and since it becomes a "same" file after the first robocopy run, at least the times are synced. However, even though seven bytes were copied according to the output, no data was actually written to the destination file in either case, despite the data flag being set (via /copyall). The behavior also doesn't change if the data flag is set explicitly (/copy:d).
I had to modify the last write time to get robocopy to actually synchronize the data.
PS C:\temp> Set-ItemProperty src\sample.txt -Name LastWriteTime -Value (Get-Date)
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
100% Newer 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> Get-Content .\dst\sample.txt
test001
An admittedly ugly workaround would be to change the last write time of same/tweaked files to force robocopy to copy the data:
& robocopy src dst /is /it /l /ndl /njh /njs /ns /nc |
Where-Object { $_.Trim() } |
ForEach-Object {
$f = Get-Item $_.Trim()
$f.LastWriteTime = $f.LastWriteTime.AddSeconds(1)
}
& robocopy src dst /copyall /mir
Switching to xcopy is probably your best option:
& xcopy src dst /k/r/e/i/s/c/h/f/o/x/y
This is really weird: why is nobody mentioning the /IM switch?! I've been using it for a long time in backup jobs, but when I tried googling just now I couldn't land on a single web page that says anything about it, even on the MS website! I also found many user posts complaining about the same issue.
Anyway, to use Robocopy to overwrite EVERYTHING, whatever the size or time in source or destination, you must include these three switches in your command (/IS /IT /IM):
/IS :: Include Same files. (Includes same size files)
/IT :: Include Tweaked files. (Includes same files with different Attributes)
/IM :: Include Modified files (Includes same files with different times).
This is the exact command I use to transfer a few terabytes of mostly 1 GB+ files (ISOs, disk images, 4K videos):
robocopy B:\Source D:\Destination /E /J /COPYALL /MT:1 /DCOPY:DATE /IS /IT /IM /X /V /NP /LOG:A:\ROBOCOPY.LOG
I did a small test for you, and here is the result:
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1028 1028 0 0 0 169
Files : 8053 8053 0 0 0 1
Bytes : 649.666 g 649.666 g 0 0 0 1.707 g
Times : 2:46:53 0:41:43 0:00:00 0:41:44
Speed : 278653398 Bytes/sec.
Speed : 15944.675 MegaBytes/min.
Ended : Friday, August 21, 2020 7:34:33 AM
Dest, Disk: WD Gold 6TB (Compare the write speed with my result)
Even those "Extras" are for reporting only, because of the "/X" switch. As you can see, nothing was skipped, and the total number and size of all files equal the copied ones. Sometimes it will show a small number of skipped files when I abuse it and cancel it multiple times during an operation, but even then the values in the first two columns are always equal. I also confirmed that once before by running a PowerShell script that scans all files in the destination and generates a report of all time-stamps.
Some performance tips from my history with it and many tests and troubles:
. Despite what most users online advise, namely using maximum threads "/MT:128" as if it were a general trick to get the best performance ... PLEASE DON'T USE "/MT:128" WITH VERY LARGE FILES ... that's a big mistake, and it will decrease your drive performance dramatically after several runs. It will create very high fragmentation, or even cause the file system to fail in some cases, and you end up spending valuable time trying to recover a RAW partition and all that nonsense. And above all that, it will perform 4-6 times slower!!
For very large files:
Use Only "One" thread "/MT:1" | Impact: BIG
Must use "/J" to disable buffering. | Impact: High
Use "/NP" with "/LOG:file" and Don't output to the console by "/TEE" | Impact: Medium.
Put the "/LOG:file" on a separate drive from the source or destination | Impact: Low.
For regular big files:
Use multi threads, I would not exceed "/MT:4" | Impact: BIG
IF destination disk has low Cache specs use "/J" to disable buffering | Impact: High
3 & 4 same as above.
For thousands of tiny files:
Go nuts :) with multi threads. At first I would start with 16 and multiply by 2 while monitoring the disk performance; once it starts dropping, I fall back to the previous value and stick with it | Impact: BIG
Don't use "/J" | Impact: High
Use "/NP" with "/LOG:file" and Don't output to the console by "/TEE" | Impact: HIGH.
Put the "/LOG:file" on a separate drive from the source or destination | Impact: HIGH.
I did this for a home folder where all the folders are on the desktops of the corresponding users, reachable through a shortcut which did not have the appropriate permissions, so that users couldn't see it even if it was there. So I used Robocopy with the right parameters to overwrite the files:
FOR /F "tokens=*" %G IN ('dir /b') DO robocopy "\\server02\Folder with shortcut" "\\server02\home\%G\Desktop" /S /A /V /log+:C:\RobocopyShortcut.txt /XF *.url *.mp3 *.hta *.htm *.mht *.js *.IE5 *.css *.temp *.html *.svg *.ocx *.3gp *.opus *.zzzzz *.avi *.bin *.cab *.mp4 *.mov *.mkv *.flv *.tiff *.tif *.asf *.webm *.exe *.dll *.dl_ *.oc_ *.ex_ *.sy_ *.sys *.msi *.inf *.ini *.bmp *.png *.gif *.jpeg *.jpg *.mpg *.db *.wav *.wma *.wmv *.mpeg *.tmp *.old *.vbs *.log *.bat *.cmd *.zip /SEC /IT /ZB /R:0
As you see, there are many file types which I set to ignore (just in case); adjust them to your needs or your scenario.
It was tested on Windows Server 2012, and every switch is documented on Microsoft's sites and others.

how to find large files in command line windows for version 5.1

How do I find large files in Windows version 5.1 (XP) from the command line?
For Windows ver 6.1, I can run the following command:
forfiles /p c:\ /s /m *.* /c "cmd /c if @fsize gtr 100000 echo @file @fsize"
However, what is the equivalent command for Windows version 5.1?
Appreciate your quick help!
To be run from command line
for /r c:\ %f in (*) do if %~zf gtr 100000 echo %f %~zf
To run it from a batch file, change % to %%.
EDIT - As stated in the comments, arithmetic in batch command lines has some limits on the operands. In this case, the %~zf (size of the file referenced in the for command) in the left part of the if command has no problems, but the value on the right side is limited to values from -2147483647 to 2147483646.
To handle this, if the command is executed under an administrator account, the wmic command can be used. The command in the OP's question can be written as
wmic datafile where "drive='c:' AND filesize>'100000'" get caption, filesize
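If PowerShell happens to be installed on the Windows 5.1 (XP) machine (it is not part of a stock install there), a sketch of an equivalent with no 32-bit operand limit:
# List files larger than 100000 bytes, ignoring unreadable directories.
Get-ChildItem -Path C:\ -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.Length -gt 100000 } |
    Select-Object FullName, Length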
First, search for the biggest directories in the current directory:
diruse /* /m /c /, . |sort
cd to the chosen dir; there you can use diruse again, or
search for the biggest files in the chosen directory, those bigger than 500 MB for example:
forfiles /s /c "cmd /c if @fsize gtr 500000000 echo @path @fsize bytes"
For Windows, I couldn't find an answer that suited me for the command line, so I coded my own in PowerShell.
The following code will help you find the list of files (more than 1 KB) in your desired path to scan.
#Run via Powershell
#$PathToScan refers to the directory where you want to find your files greater than 1 KB.
$PathToScan="E:\"
"Path to scan:`t $PathToScan"
$hash = $null
$hash = @{}
Get-ChildItem -path $PathToScan -recurse | Where-Object {
#If you want to find files greater than 1GB, replace all KB in this powershell script to GB.
($_.Length /1KB) -gt 2
} | ForEach-Object {
$hash.add(
$_.DirectoryName.Replace("$PathToScan","")+$_.Name,
[math]::Round(($_.length/1KB),0))
}
$TableStyle = @{Expression={$_.Value}; Label="Size (KB)"; Width=10; Alignment = 'Left'},
@{Expression={$_.Name}; Label="Name"; Width=100}
$hash.GetEnumerator() | Sort Value -Descending| Format-Table -Property $TableStyle
Function Countdown{
Param ($TimeInSec)
Do {
Write-Host "$TimeInSec"
Start-Sleep -s 1
$TimeInSec--
}
Until ($TimeInSec -eq 0)
}
Countdown 5
Or you may view my code here;
[Find large files in KB for Windows]:
https://github.com/worldpeacez0991/powershell
Hope my answer helps someone, have a great day ^ ^

Windows batch file to get C:\ drive total space and free space available

I need a bat file to get C:\ drive total space and free space available in GB (giga bytes) in a Windows system and create a text file with the details.
Note: I don't want to use any external utilities.
Cut the last 9 digits off the size in bytes to get the size in GB:
@echo off & setlocal ENABLEDELAYEDEXPANSION
SET "volume=C:"
FOR /f "tokens=1*delims=:" %%i IN ('fsutil volume diskfree %volume%') DO (
SET "diskfree=!disktotal!"
SET "disktotal=!diskavail!"
SET "diskavail=%%j"
)
FOR /f "tokens=1,2" %%i IN ("%disktotal% %diskavail%") DO SET "disktotal=%%i"& SET "diskavail=%%j"
(ECHO(Information for volume %volume%
ECHO(total %disktotal:~0,-9% GB
ECHO(avail. %diskavail:~0,-9% GB)>size.txt
TYPE size.txt
cmd can calculate only with numbers up to 2^31-1 (2,147,483,647, roughly 2 gigabytes when counting bytes)
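If PowerShell is acceptable (it ships with the OS from Windows 7 on, so it is arguably not an external utility), its 64-bit arithmetic avoids the digit-trimming trick entirely; a minimal sketch:
# WMI reports Size and FreeSpace as 64-bit values; format them as GB directly.
$disk = Get-WmiObject Win32_LogicalDisk -Filter "DeviceID='C:'"
'{0:N1} GB total, {1:N1} GB free' -f ($disk.Size / 1GB), ($disk.FreeSpace / 1GB)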
Not a complete solution by any means, but someone might find this helpful:
dir | find "bytes"
This is probably not at all what you want since it uses PowerShell, but "external utilities" is a bit nebulous and leaves me some wiggle room. Plus, it's essentially a one-liner.
SETLOCAL
FOR /F "usebackq tokens=1,2" %%f IN (`PowerShell -NoProfile -EncodedCommand "CgBnAHcAbQBpACAAVwBpAG4AMwAyAF8ATABvAGcAaQBjAGEAbABEAGkAcwBrACAALQBGAGkAbAB0AGUAcgAgACIAQwBhAHAAdABpAG8AbgA9ACcAQwA6ACcAIgB8ACUAewAkAGcAPQAxADAANwAzADcANAAxADgAMgA0ADsAWwBpAG4AdABdACQAZgA9ACgAJABfAC4ARgByAGUAZQBTAHAAYQBjAGUALwAkAGcAKQA7AFsAaQBuAHQAXQAkAHQAPQAoACQAXwAuAFMAaQB6AGUALwAkAGcAKQA7AFcAcgBpAHQAZQAtAEgAbwBzAHQAIAAoACQAdAAtACQAZgApACwAJABmAH0ACgA="`) DO ((SET U=%%f)&(SET F=%%g))
@ECHO Used: %U%
@ECHO Free: %F%
Since batch/CMD is bad at nearly everything, I decided to use PowerShell, which is meant for such stuff and has quick and easy access to WMI.
Here's the PowerShell code:
Get-WMIObject -Query "SELECT * FROM Win32_LogicalDisk WHERE Caption='C:'" `
| % {
$f = [System.Math]::Round($_.FreeSpace/1024/1024/1024,1);
$t = [System.Math]::Round($_.Size/1024/1024/1024,1);
Write-Host ('' + ($t-$f) + ',' + $f);
}
This spits out the two values separated by a comma. Now, if only we could do this in a FOR loop!
PowerShell has the nice ability to accept a Base64-encoded command (to eliminate the need for escaping and making the code hard to read), so all we need to do is shrink this command as much as possible (to reduce the size of the encoded string—strictly a nicety, not absolutely necessary). I also reduced the sizes to integers, which rounded them. It's at least closer than discarding the lower-order decimal digits.
Shrinking the encoded command and encoding it in PowerShell looks like this:
$code = {
gwmi Win32_LogicalDisk -Filter "Caption='C:'"|%{$g=1073741824;[int]$f=($_.FreeSpace/$g);[int]$t=($_.Size/$g);Write-Host ($t-$f),$f}
}
$enc = [convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($code))
Write-Host $enc
(See PowerShell /? for more details.)
I would expect this to run on any Win7 or Win8 machine in existence. The PoSH code doesn't rely on any advanced features (except maybe the EncodedCommand bit), so if PoSH is installed on the XP or Vista machine, there's a good chance of it working. I can't speak about the history of MS pushing PoSH via Windows Update, but I think there's a good chance that this will work ubiquitously.
This should work in batch:
for /f "tokens=2" %%S in ('wmic volume get DriveLetter^, FreeSpace ^| findstr "^C:"') do set space=%%S
echo %space%

Free space in a CMD shell

Is there a way to get the amount of free diskspace of a disk or a folder in a CMD
without having to install some thirdparty applications?
I have a CMD script that copies a big file to a given directory and could of course use
the errorlevel returned by the copy command, but then I have to wait for the time
it takes to copy the file (e.g., to find out only then that the disk is full and the copy operation fails).
I would like to know before I start the copy whether it is worth attempting at all. I tried the DU.EXE utility from Sysinternals, but that shows occupied space only.
If you run "dir c:\", the last line will give you the free disk space.
Edit:
Better solution: "fsutil volume diskfree c:"
A possible solution:
dir|find "bytes free"
a more "advanced solution", for Windows Xp and beyond:
wmic /node:"%COMPUTERNAME%" LogicalDisk Where DriveType="3" Get DeviceID,FreeSpace|find /I "c:"
The Windows Management Instrumentation Command-line (WMIC) tool (Wmic.exe)
can gather vast amounts of information about a Windows Server 2003 as well as Windows XP or Vista machine. The tool accesses the underlying hardware by using Windows Management Instrumentation (WMI). Not for Windows 2000.
As noted by Alexander Stohr in the comments:
WMIC can see policy-based restrictions as well (even if 'dir' will still do the job),
and 'dir' is locale-dependent.
Using this command you can find all partitions, size & free space: wmic logicaldisk get size, freespace, caption
You can avoid the commas by using /-C on the DIR command.
FOR /F "usebackq tokens=3" %%s IN (`DIR C:\ /-C /-O /W`) DO (
SET FREE_SPACE=%%s
)
ECHO FREE_SPACE is %FREE_SPACE%
If you want to compare the available space to the space needed, you could do something like the following. I specified the number with thousands separators, then removed them; it is difficult to grasp the number without commas. SET /A is nice, but it stops working with large numbers.
SET EXITCODE=0
SET NEEDED=100,000,000
SET NEEDED=%NEEDED:,=%
IF %FREE_SPACE% LSS %NEEDED% (
ECHO Not enough.
SET EXITCODE=1
)
EXIT /B %EXITCODE%
UPDATE:
Much has changed since 2014. Here is a better answer. It uses PowerShell which is available on all currently supported Microsoft Windows systems.
The code below would be much clearer and easier to understand if the script were written in PowerShell without using cmd.exe as a wrapper. If you are using PowerShell Core, change powershell to pwsh.
SET "NEEDED=100,000,000"
SET "NEEDED=%NEEDED:,=%"
powershell -NoLogo -NoProfile -Command ^
$Free = (Get-PSDrive -Name 'C').Free; ^
if ($Free -lt [int64]%NEEDED%) { exit $true } else { exit $false }
IF ERRORLEVEL 1 (
ECHO "Not enough disk space available."
) else (
ECHO "Available disk space is adequate."
)
df.exe
Shows all your disks; total, used and free capacity. You can alter the output by various command-line options.
You can get it from http://www.paulsadowski.com/WSH/cmdprogs.htm, http://unxutils.sourceforge.net/ or somewhere else. It's a standard unix-util like du.
df -h will show the used and available disk space for all your drives. For example:
M:\>df -h
Filesystem Size Used Avail Use% Mounted on
C:/cygwin/bin 932G 78G 855G 9% /usr/bin
C:/cygwin/lib 932G 78G 855G 9% /usr/lib
C:/cygwin 932G 78G 855G 9% /
C: 932G 78G 855G 9% /cygdrive/c
E: 1.9T 1.3T 621G 67% /cygdrive/e
F: 1.9T 201G 1.7T 11% /cygdrive/f
H: 1.5T 524G 938G 36% /cygdrive/h
M: 1.5T 524G 938G 36% /cygdrive/m
P: 98G 67G 31G 69% /cygdrive/p
R: 98G 14G 84G 15% /cygdrive/r
Cygwin is available for free from: https://www.cygwin.com/
It adds many powerful tools to the command prompt. To get just the available space on drive M (as mapped in Windows to a shared drive), one could enter:
M:\>df -h | grep M: | awk '{print $4}'
The following script will give you free bytes on the drive:
@setlocal enableextensions enabledelayedexpansion
@echo off
for /f "tokens=3" %%a in ('dir c:\') do (
set bytesfree=%%a
)
set bytesfree=%bytesfree:,=%
echo %bytesfree%
endlocal && set bytesfree=%bytesfree%
Note that this depends on the output of your dir command, which needs the last line to contain the free space in the format 24 Dir(s) 34,071,691,264 bytes free. Specifically:
it must be the last line (or you can modify the for loop to detect the line explicitly rather than relying on setting bytesfree for every line).
the free space must be the third "word" (or you can change the tokens= bit to get a different word).
thousands separators are the , character (or you can change the substitution from comma to something else).
It doesn't pollute your environment namespace, setting only the bytesfree variable on exit. If your dir output is different (eg, different locale or language settings), you will need to adjust the script.
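A locale-independent alternative, if dropping into PowerShell for this one value is acceptable (a sketch):
# Free bytes on C: without parsing localized dir output.
(Get-PSDrive -Name C).Free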
Using paxdiablo's excellent solution, I wrote a slightly more sophisticated batch script which takes the drive letter as an incoming argument and checks whether the drive exists in a tricky (but not beautiful) way:
@echo off
setlocal enableextensions enabledelayedexpansion
set chkfile=drivechk.tmp
if "%1" == "" goto :usage
set drive=%1
set drive=%drive:\=%
set drive=%drive::=%
dir %drive%:>nul 2>%chkfile%
for %%? in (%chkfile%) do (
set chksize=%%~z?
)
if %chksize% neq 0 (
more %chkfile%
del %chkfile%
goto :eof
)
del %chkfile%
for /f "tokens=3" %%a in ('dir %drive%:\') do (
set bytesfree=%%a
)
set bytesfree=%bytesfree:,=%
echo %bytesfree% byte(s) free on volume %drive%:
endlocal
goto :eof
:usage
echo.
echo usage: freedisk ^<driveletter^> (eg.: freedisk c)
note1: you may type a simple letter (e.g. x) or use the x: or x:\ format as the drive letter argument
note2: the script will display stderr from %chkfile% only if its size is bigger than 0
note3: I saved this script as freedisk.cmd (see usage)
I made a variation to generate this output from the script:
volume C: - 49 GB total space / 29512314880 byte(s) free
I use diskpart to get this information.
@echo off
setlocal enableextensions enabledelayedexpansion
set chkfile=drivechk.tmp
if "%1" == "" goto :usage
set drive=%1
set drive=%drive:\=%
set drive=%drive::=%
dir %drive%:>nul 2>%chkfile%
for %%? in (%chkfile%) do (
set chksize=%%~z?
)
if %chksize% neq 0 (
more %chkfile%
del %chkfile%
goto :eof
)
del %chkfile%
echo list volume | diskpart | find /I " %drive% " >%chkfile%
for /f "tokens=6" %%a in ('type %chkfile%' ) do (
set dsksz=%%a
)
for /f "tokens=7" %%a in ('type %chkfile%' ) do (
set dskunit=%%a
)
del %chkfile%
for /f "tokens=3" %%a in ('dir %drive%:\') do (
set bytesfree=%%a
)
set bytesfree=%bytesfree:,=%
echo volume %drive%: - %dsksz% %dskunit% total space / %bytesfree% byte(s) free
endlocal
goto :eof
:usage
echo.
echo usage: freedisk ^<driveletter^> (eg.: freedisk c)
Is cscript a 3rd party app?
I suggest trying Microsoft Scripting, where you can use a programming language (JScript, VBS) to check on things like List Available Disk Space.
The scripting infrastructure is present on all current Windows versions (including 2008).
