On my home desktop, which is a Windows machine, I right-click the C:\Windows folder and open Properties, which displays the folder's size and size on disk.
If I use the du tool provided by Microsoft Sysinternals,
du C:\Windows
it produces:
Files: 77060
Directories: 21838
Size: 31,070,596,369 bytes
Size on disk: 31,151,837,184 bytes
If I run the same command as administrator, I get:
Files: 77894
Directories: 22220
Size: 32,223,507,961 bytes
Size on disk: 32,297,160,704 bytes
With PowerShell ISE running as administrator, I ran the following PowerShell snippet from this SO answer:
"{0:N2}" -f ((Get-ChildItem -path C:\InsertPathHere -recurse | Measure-Object -property length -sum ).sum /1MB) + " MB"
which output:
22,486.11 MB
The C# code in the following SO answer, run from a command prompt as administrator, returns:
35,163,662,628 bytes
Although close, it still does not match what Windows Explorer displays. None of these methods, therefore, returns the actual size of the directory. So my question is this:
Is there a scripted or coded method that will return the actual folder size of C:\Windows?
If there is no way of retrieving the folder size, is there a way I can programmatically retrieve the information displayed by Windows Explorer?
Windows has a somewhat strange way of actually storing data. For example, while a file may be 1 MB in size, on disk it is probably going to occupy more like 1.1 MB. The reason is that the on-disk figure includes the directory entry linking to the actual file, and the nominal size does not include the additional data Windows stores along with it.
Now you're probably thinking: that's nice and all, but how do you explain the massive size change when looking at the folder as an administrator? That is a good question, because this is yet more header/metadata stored in conjunction with the files, which only administrators are allowed to see.
Coming back to your original question about telling the actual size: that is quite hard to pin down on Windows, because of the amount of additional data it keeps in conjunction with the file you care about. For readability purposes, or if you are using this in some piece of code, I'd suggest taking the size on disk from the admin run. Not just because it appears to be the maximum figure (for me it is), but because when you are planning a transfer it is probably the most reliable number to go with: once you transfer the file, some of the additional data will be removed or changed, and you already know what the likely swing in file size will be.
You also have to take the file system format (NTFS, FAT32) into account, because the way it segments files into fixed-size allocation units can change the size on disk slightly as well, especially for huge files (1 GB+).
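To see that allocation-unit (cluster) rounding effect yourself, here is a rough Python sketch that estimates "size on disk" by rounding each file up to a whole number of clusters. The 4096-byte cluster size is an assumption (check yours with fsutil fsinfo ntfsinfo C:), and the sketch ignores compression and sparse files, so treat it as an illustration rather than an exact figure:

import os

CLUSTER_SIZE = 4096  # assumed NTFS cluster size; check with: fsutil fsinfo ntfsinfo C:

def folder_sizes(root):
    size = on_disk = 0
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                n = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files we cannot stat (permissions, in use)
            size += n
            on_disk += -(-n // CLUSTER_SIZE) * CLUSTER_SIZE  # round up to whole clusters
    return size, on_disk

size, on_disk = folder_sizes(r"C:\Windows")
print(f"Size:         {size:,} bytes")
print(f"Size on disk: {on_disk:,} bytes (approximate)")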
Hope that helps, mate, because we all know how wonderful Windows can be when you're trying to get information out of it (sigh).
The ambiguities and differences have a lot to do with junctions, soft links, and hard links (similar to symlinks if you come from the *nix world). The biggest issue: almost no Windows programs handle hard links well--they look like (and indeed are) "normal" files. All files in Windows have at least one hard link.
You can get an indication of the "true" disk storage by using the Sysinternals Disk Usage (du) utility:
> du64 c:\windows
On my machine this yields:
DU v1.61 - Directory disk usage reporter
Copyright (C) 2005-2016 Mark Russinovich
Sysinternals - www.sysinternals.com
Files: 204992
Directories: 57026
Size: 14,909,427,806 bytes
Size on disk: 15,631,523,840 bytes
This is a lot smaller than what you would see if you right-click and read the size in the Properties dialog. By default, du64 doesn't double-count files with multiple hard links--it returns the true disk space used, which is also why the command takes a while to run. You can use the -u option to have it naively count the size of all links:
> du64 -u c:\windows
DU v1.61 - Directory disk usage reporter
Copyright (C) 2005-2016 Mark Russinovich
Sysinternals - www.sysinternals.com
Files: 236008
Directories: 57026
Size: 21,334,850,784 bytes
Size on disk: 22,129,897,472 bytes
This is much bigger--but it has double-counted files that have multiple links pointing to the same storage space. Hope this helps.
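The same deduplication idea can be sketched in a few lines of Python: on Python 3.5+ under Windows, os.stat() fills st_ino with the file ID, so files reached through multiple hard links can be recognized and counted once. This is only a sketch of the technique, not a replacement for du64 (in particular, whether junctions are descended into varies with the Python version):

import os

def du(root):
    seen = set()                 # (device, file id) pairs already counted
    unique = naive = 0
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue         # unreadable file; skip it
            naive += st.st_size  # counts every link, like du64 -u
            if (st.st_dev, st.st_ino) not in seen:
                seen.add((st.st_dev, st.st_ino))
                unique += st.st_size  # counts shared storage once, like du64's default
    return unique, naive

unique, naive = du(r"C:\Windows")
print(f"counting hard-linked files once: {unique:,} bytes")
print(f"counting every link:             {naive:,} bytes")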
Related
Is it possible to add (free) disk space to C: with a batch script?
You could write a script that deletes the contents of the most common locations of temp files and caches on your system in order to free up space. Creating space out of nowhere is not possible, though, obviously. If you do not need to automate the cleanup, you are probably better off using a cleanup tool for Windows; simply google "cleanup tool windows", for example.
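For illustration, here is a minimal sketch of such a cleanup script in Python. It assumes the current user's %TEMP% folder is the only place you want to clear, and it silently skips anything that is locked or otherwise undeletable:

import os
import shutil
import tempfile

temp = tempfile.gettempdir()  # resolves to %TEMP% for the current user on Windows
freed = 0

for entry in os.scandir(temp):
    try:
        if entry.is_file(follow_symlinks=False):
            size = entry.stat(follow_symlinks=False).st_size
            os.remove(entry.path)
            freed += size
        elif entry.is_dir(follow_symlinks=False):
            shutil.rmtree(entry.path)  # may fail partway through on locked files
    except OSError:
        pass  # file in use or access denied; leave it alone

print(f"freed roughly {freed:,} bytes from {temp}")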
If you want to identify the most bloated places on your system, I found it very useful to use a tool that shows a sunburst diagram of the places taking up the most space on your hard drive. One such tool is http://www.jgoodies.com/freeware/jdiskreport/
If you are talking about resizing a partition, see e.g. https://www.partition-tool.com/resource/expand-windows-7-partition.htm for non-automated options, or http://www.itprotoday.com/management-mobility/formatting-and-resizing-partitions-diskpart for doing it from a batch file.
Yes, you can extend a partition's size using a script, but with precautions. There are many tools available for doing the same task safely, but since you asked, check a few examples before proceeding:
Diskpart Scripts and Examples - Microsoft
User input for a DISKPART batch file
Extend a Basic Volume and Increase Disk Space
For example, to check manually first (use with caution):
Open cmd and enter diskpart. Then, at the DISKPART prompt, enter the commands below:
list volume - lists all your volumes/partitions
select volume n - selects the volume to resize/extend, where n is the volume number
extend size=10240 - adds the desired amount of space, in MB, to the selected volume (10240 MB = 10 GB)
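To run this unattended, you can put the same commands in a text file and feed it to diskpart with its /s switch. The volume number and size below are placeholders; run list volume first and adjust them (again, use with caution):

rem extend.txt - example diskpart script; volume 2 and 10240 MB are placeholders
select volume 2
extend size=10240

Then, from an elevated command prompt:

diskpart /s extend.txt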
In a Windows 8 Command Prompt, I had a backup drive plugged in and I navigated to my User directory. I executed the command:
copy Documents G:/Seagate_backup/Documents
What I assumed was that copy would create the Documents directory on my backup drive and then copy the contents of the C: Documents directory into it. That is not what happened!
I proceeded to wipe my hard drive and reinstall the operating system, thinking I had backed up the important files, only to find out that copy had seemingly concatenated all the C: Documents files of different types (.doc, .pdf, .txt, etc.) into one file called "Documents". This file is of course unreadable, but opening it in Notepad reveals what happened: I can see some of my documents that were plain text scattered throughout the massively long file.
How do I undo this!!? It's terrible because I was actually helping a friend and was so sure of myself but now this has happened. The only thing I can think of doing is searching for some common separator amongst the concatenated files and write some sort of script to split the file back apart. But then I would have to guess the extensions of each of the pieces...
Merging files together the way copy does discards important file system information such as file size and file name. While the file name may not be that important, the size is; both parameters are used by the OS to tell files apart.
This problem might sound familiar if you have ever damaged your file allocation table and watched all your files disappear. In both cases you end up with a binary blob (be it an actual disk, or something like your file, which resembles a disk image) that lacks any size and filename information.
Fortunately, this is where a lot of file system recovery tools can help. They specialize in pattern matching: they look for giveaway clues as to what type a file is, where it starts, and what its size is.
This works because many file types carry a set of magic numbers, which exist precisely so a program can check whether a file really is of the type its extension claims.
In principle, then, it is possible to undo this process more or less well. You will need a data recovery tool or an analysis tool like binwalk to extract the documents from the concatenated binary blob--essentially the same tools that are used to recover deleted files, though without the file names, of course. I recommend renaming the file to a disk image (.img) and either mounting it from within the operating system as a virtual hard disk (don't worry that it has no file system; it should show up as an unformatted drive), or feeding it directly to a data recovery or analysis tool that can read raw binary files. binwalk, for instance, can do that directly, but it may not find all file types, since it is mainly intended for unpacking firmware images--which happen to be assembled in much the same way your files ended up.
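As a rough illustration of the magic-number idea (not a replacement for a proper recovery tool), here is a Python sketch that scans the blob for a few well-known signatures and prints the offsets where candidate files might start. The signature list is nowhere near complete, and "Documents" stands in for your concatenated file:

# A few well-known file signatures; extend as needed.
SIGNATURES = {
    b"%PDF-": "PDF",
    b"PK\x03\x04": "ZIP container (.zip, .docx, .xlsx, ...)",
    b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1": "OLE2 (.doc, .xls, ...)",
}

with open("Documents", "rb") as f:
    blob = f.read()

for magic, kind in SIGNATURES.items():
    pos = blob.find(magic)
    while pos != -1:
        print(f"offset {pos:>14,}: possible {kind}")
        pos = blob.find(magic, pos + 1)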
I have often noticed on my system that when I format my 16 GB pen drive by right-clicking it and selecting Format, it takes a long time, but when I select Quick Format it takes much less time. Can anyone tell me the technical difference between these two processes?
When you choose to run a regular format on a volume, files are removed from the volume that you are formatting and the hard disk is scanned for bad sectors. The scan for bad sectors is responsible for the majority of the time that it takes to format a volume.
If you choose the Quick format option, format removes files from the partition, but does not scan the disk for bad sectors. Only use this option if your hard disk has been previously formatted and you are sure that your hard disk is not damaged.
If you installed Windows XP on a partition that was formatted by using the Quick format option, you can also check your disk by using the chkdsk /r command after the installation of Windows XP is completed.
Source: http://support.microsoft.com/kb/302686
Full - writes zeros to every sector (on Windows Vista and later; on XP a full format only scans for bad sectors); quick - rewrites the file system structures only.
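For reference, the same choice is available from the command line via format's /Q switch (the drive letter here is just an example):

format E: /FS:NTFS /Q
format E: /FS:NTFS

The first performs a quick format; the second, without /Q, performs a full format.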
I am looking for some (efficient) code to determine the size of a directory/folder in Windows XP using SAS 9.1.3.
If you are not constrained by the SAS NOXCMD option (such as SAS Enterprise Guide hitting a SASApp workspace server in its default configuration, where the administrator has not opened it up), then I would suggest downloading the Microsoft Sysinternals Disk Usage (DU) tool and running it from a SAS DATA step via a pipe filename. Here is some sample SAS code:
filename du pipe "du -q c:\SAS\EBIEDIEG\Lev1\SASApp";
data work.diskusage;
  infile du;
  input;                            /* read the next line of du output into _infile_ */
  put _infile_;                     /* echo it to the SAS log */
  if _infile_ =: 'Size:' then do;   /* keep only the line starting with 'Size:' */
    sizeInBytes = input(scan(_infile_, 2, ' '), comma32.);
    output;
  end;
run;
Microsoft Sysinternals Disk Usage (DU) is similar to the familiar UNIX du command. You can download Sysinternals DU and review the documentation at http://technet.microsoft.com/en-au/sysinternals/bb896651. It also has a -l parameter that lets you specify how deep it should go.
If you are constrained by the NOXCMD option, then you could use a series of loops with the SAS DOPEN/DREAD/FILENAME/FOPEN/FINFO/FCLOSE/DCLOSE functions to walk the directory tree manually and add up the file sizes. It involves much more code, but it should run in a NOXCMD environment. If you want to use this method, a good starting point is the SAS documentation for DOPEN at http://support.sas.com/documentation/cdl/en/lrdict/64316/HTML/default/viewer.htm#a000209538.htm where you will also find the documentation and examples for the other functions.
I have a folder containing .tcb and .tch files. I need to know the total size of all the .tcb files, and likewise of all the .tch files.
I did it like this:
1) I created a temp folder and then:
mv *tch temp
2) and then:
du -sk temp
I found the command on the Internet, and Wikipedia says this: "du (abbreviated from disk usage) is a standard Unix program used to estimate the file space usage". I think the reason it says "estimate" is that if there are links, the size of the link will be shown instead of the linked file.
But if I do
ls -l
in the temp folder (which contains all the *.tch files) and then sum up the sizes displayed in the terminal, I get a different total. Why is that the case?
So, in sum, what I need is a command that shows me the real file size of all the .tch files in a folder that may also contain other file types.
I hope someone can help me with that. Thanks a lot!
You can use the -L option to du if you want to follow symbolic links (that is, calculate the size of the link target, not of the link itself). You can also use the -c option to display a grand total at the end.
Armed with those options, try du -skLc *.tch.
For more details on du, see this manpage.
Look at the specific man page for your version of du, as implementations vary considerably in how they count.
"Approximate" can be because:
du may report blocks used rather than bytes used; blocks overstate the size of files that aren't exact multiples of the block size, but they more accurately represent "space used that I can't use for other stuff" (the sketch after this list illustrates the difference).
Unix files can have "holes", created by seeking a long way forward and then writing; the OS doesn't actually allocate space for the skipped holes.
Symbolic links may or may not be dereferenced to the real file they point to.
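Here is a small Python sketch of that difference, assuming a Unix-like system where st_blocks is counted in 512-byte units: the apparent size is what ls -l and wc -c report, while the block total is roughly what du reports.

import glob
import os

apparent = 0  # sum of byte counts, what ls -l and wc -c report
on_disk = 0   # sum of allocated blocks, roughly what du reports

for path in glob.glob("*.tch"):
    st = os.stat(path)             # follows symlinks, like du -L
    apparent += st.st_size
    on_disk += st.st_blocks * 512  # st_blocks is counted in 512-byte units

print(f"apparent size: {apparent:,} bytes")
print(f"size on disk:  {on_disk:,} bytes")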
If you just want the byte count, use wc -c *.tcb