VBScript to get DC SYSVOL information (path, size, free space) - vbscript

I am obtaining information about my DCs and need to pull in the path to SYSVOL, the total size, and the free space remaining.
Can this be done in VBScript, and if so, how?
Thanks

Have a look at this Microsoft support page - it lists the registry entries where the SYSVOL and other NTDS directories are stored.
You might also be able to get this information through Active Directory, but I'm not sure - I will check ADSIEdit shortly.

You can do it through WMI. See an example here: http://www.computerperformance.co.uk/vbscript/wmi_disks.htm
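If you'd rather not go through WMI, here is a rough Win32 sketch of the same idea in C++ (not VBScript): it reads the SysVol value that the Netlogon service keeps under HKLM\SYSTEM\CurrentControlSet\Services\Netlogon\Parameters and then asks GetDiskFreeSpaceEx for the total and free space of the volume it sits on. The value name and the RegGetValue call assume Vista/Server 2008 or later - double-check both on your DC before relying on it.

// Sketch: read the SYSVOL path from the Netlogon parameters key and report
// the total/free space of the volume it lives on. Uses plain Win32 calls
// (RegGetValue, GetDiskFreeSpaceEx) instead of WMI; run it on the DC itself.
#include <windows.h>
#include <stdio.h>

int main()
{
    wchar_t sysvolPath[MAX_PATH] = L"";
    DWORD cb = sizeof(sysvolPath);

    // The Netlogon service stores the SYSVOL location in this value.
    LONG rc = RegGetValueW(HKEY_LOCAL_MACHINE,
                           L"SYSTEM\\CurrentControlSet\\Services\\Netlogon\\Parameters",
                           L"SysVol", RRF_RT_REG_SZ, NULL, sysvolPath, &cb);
    if (rc != ERROR_SUCCESS) {
        fwprintf(stderr, L"Could not read SysVol value (error %ld)\n", rc);
        return 1;
    }

    ULARGE_INTEGER freeToCaller, total, totalFree;
    if (!GetDiskFreeSpaceExW(sysvolPath, &freeToCaller, &total, &totalFree)) {
        fwprintf(stderr, L"GetDiskFreeSpaceEx failed (error %lu)\n", GetLastError());
        return 1;
    }

    wprintf(L"SYSVOL path : %s\n", sysvolPath);
    wprintf(L"Total size  : %llu bytes\n", total.QuadPart);
    wprintf(L"Free space  : %llu bytes\n", totalFree.QuadPart);
    return 0;
}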

Related

Finding actual size of a folder in Windows

On my home desktop, which is a Windows machine, I right-click the C:\Windows folder and open Properties to see the size it reports.
If I use the du tool provided by Microsoft Sysinternals:
du C:\Windows
This produces:
Files: 77060
Directories: 21838
Size: 31,070,596,369 bytes
Size on disk: 31,151,837,184 bytes
If I run the same command as administrator, I get:
Files: 77894
Directories: 22220
Size: 32,223,507,961 bytes
Size on disk: 32,297,160,704 bytes
With PowerShell ISE running as administrator, I ran the following PowerShell snippet from this SO answer:
"{0:N2}" -f ((Get-ChildItem -path C:\InsertPathHere -recurse | Measure-Object -property length -sum ).sum /1MB) + " MB"
which output
22,486.11 MB
The C# code in the following SO answer, run from a command prompt as Administrator, returns:
35,163,662,628 bytes
Although close, it still does not match what Windows Explorer displays. None of these methods, therefore, returns the actual size of the directory. So my question is this:
Is there a scripted or coded method that will return the actual folder size of C:\Windows?
If there is no way of retrieving the folder size, is there a way I can programmatically retrieve the information displayed by Windows Explorer?
Windows stores data in a somewhat odd way: while a file may be 1 MB in size, it will probably occupy closer to 1.1 MB on disk, because that figure includes the directory entry pointing to the file, and the estimated size doesn't include the additional data Windows may store alongside it.
Now you're probably thinking, "that's nice, but how do you explain the big jump in size when viewing it as admin?" Good question - that comes from additional header/metadata stored with the files that only administrators are allowed to see.
Coming back to your original question about the actual size: that is hard to pin down on Windows because of the amount of additional data it keeps alongside the file you care about. For readability, or if you are using this in code, I'd suggest going with the "size on disk" figure from the admin run - not just because it looks like the maximum size (for me it is), but because when you transfer the data that is usually the most reliable figure to work from; some of the additional data will be removed or changed in transit, and you already know what the likely swing in size will be.
You also have to take the file system format (NTFS, FAT32) into account, because the way it segments files into clusters can change the reported size slightly, especially for large files (1 GB and up), as the sketch below shows.
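To make that cluster point concrete, here is a minimal C++ sketch that estimates a single file's "size on disk" by rounding its (possibly compressed) size up to the volume's cluster size. It assumes the file lives on C: and ignores files small enough to be stored resident in the MFT, as well as alternate data streams, so treat the result as an approximation rather than exactly what Explorer shows.

// Rough sketch: approximate one file's "size on disk" by rounding its
// (possibly compressed) size up to the volume's cluster size. Ignores
// MFT-resident files and alternate data streams, so it is an estimate.
#include <windows.h>
#include <stdio.h>

int wmain(int argc, wchar_t **argv)
{
    if (argc < 2) { fwprintf(stderr, L"usage: %s <file>\n", argv[0]); return 1; }

    DWORD sectorsPerCluster, bytesPerSector, freeClusters, totalClusters;
    // Assumes the file lives on C:; pass the right root for other volumes.
    if (!GetDiskFreeSpaceW(L"C:\\", &sectorsPerCluster, &bytesPerSector,
                           &freeClusters, &totalClusters)) {
        fwprintf(stderr, L"GetDiskFreeSpace failed: %lu\n", GetLastError());
        return 1;
    }
    ULONGLONG clusterSize = (ULONGLONG)sectorsPerCluster * bytesPerSector;

    DWORD high = 0;
    DWORD low = GetCompressedFileSizeW(argv[1], &high);
    if (low == INVALID_FILE_SIZE && GetLastError() != NO_ERROR) {
        fwprintf(stderr, L"GetCompressedFileSize failed: %lu\n", GetLastError());
        return 1;
    }
    ULONGLONG size = ((ULONGLONG)high << 32) | low;
    ULONGLONG onDisk = (size + clusterSize - 1) / clusterSize * clusterSize;

    wprintf(L"Logical size : %llu bytes\n", size);
    wprintf(L"Size on disk : ~%llu bytes\n", onDisk);
    return 0;
}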
Hope that helps, mate - we all know how wonderful Windows can be when you're just trying to get information out of it (sigh).
The ambiguities and differences have a lot to do with junctions, soft links, and hard links (similar to symlinks if you come from the *nix world). The biggest issue: Almost no Windows programs handle hard links well--they look like (and indeed are) "normal" files. All files in Windows have 1+ hard links.
You can get an indication of "true" disk storage by using the Sysinternals Disk Usage (du) utility:
> du64 c:\windows
Yields on my machine:
DU v1.61 - Directory disk usage reporter
Copyright (C) 2005-2016 Mark Russinovich
Sysinternals - www.sysinternals.com
Files: 204992
Directories: 57026
Size: 14,909,427,806 bytes
Size on disk: 15,631,523,840 bytes
That is a lot smaller than what you would see if you right-click and check the size in the Properties dialog. By default du64 doesn't double-count files with multiple hard links--it returns the true disk space used. That's also why this command takes a while to run. You can use the -u option to have the disk usage utility naively count the size of all links.
> du64 -u c:\windows
DU v1.61 - Directory disk usage reporter
Copyright (C) 2005-2016 Mark Russinovich
Sysinternals - www.sysinternals.com
Files: 236008
Directories: 57026
Size: 21,334,850,784 bytes
Size on disk: 22,129,897,472 bytes
This is much bigger--but it's double counted files that have multiple links pointing to the same storage space. Hope this helps.
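If you want to see the hard-link effect for yourself, the link count is exposed through GetFileInformationByHandle. A small C++ sketch (assuming you just want to spot-check individual files under C:\Windows):

// Sketch: print the number of hard links a file has. Files with
// nNumberOfLinks > 1 (common under C:\Windows because of the WinSxS store)
// are the ones du counts once by default and multiple times with -u.
#include <windows.h>
#include <stdio.h>

int wmain(int argc, wchar_t **argv)
{
    if (argc < 2) { fwprintf(stderr, L"usage: %s <file>\n", argv[0]); return 1; }

    // FILE_FLAG_BACKUP_SEMANTICS also allows opening directories.
    HANDLE h = CreateFileW(argv[1], FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE,
                           NULL, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fwprintf(stderr, L"CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    BY_HANDLE_FILE_INFORMATION info;
    if (GetFileInformationByHandle(h, &info))
        wprintf(L"%s has %lu hard link(s)\n", argv[1], info.nNumberOfLinks);

    CloseHandle(h);
    return 0;
}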

Dumping the content of the $MFT file

For a commercial project I'm doing, I need to be able to read the actual data stored in the $MFT file.
I found a GPL library that could help, but since it's GPL I can't integrate it into my code.
Could someone please point me to a project that I could use, or point me at the relevant Windows API (something that doesn't require 1000 lines of code to implement)?
BTW, why doesn't Windows simply allow me to read the MFT file directly anyway (through CreateFile and ReadFile)? If I want to ruin my drive, that's my business, not Microsoft's.
Thanks.
You just have to open a handle to the volume using CreateFile() on \\.\X:, where X is the drive letter (check the MSDN documentation on CreateFile(); it mentions this in the Remarks section).
Read the first sector into an NTFS boot record structure (you can find it online; search for Richard "Flatcap" Russon - edit: I found it, http://www.flatcap.org/ntfs/ntfs/files/boot.html). One of the fields in the boot sector gives the start of the MFT in clusters (the LCN of VCN 0 of $MFT). Do a SetFilePointer() to that location and read in multiples of sectors. The first 1024 bytes at that location are the file record of the $MFT itself; parse that structure to find the data attribute, which is always non-resident, and its size is the actual size of the MFT file at that time.
The basic structures for $Boot, File Record and basic attributes (Standard Information, File Name and Data) along with the parsing code should run you less than 1000 lines of code.
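For reference, here is a condensed C++ sketch of the first steps described above: open the raw volume, read the boot sector, compute the byte offset of the $MFT from its starting LCN, and read the $MFT's own 1024-byte file record. It assumes the common 512-byte logical sector size, uses the standard NTFS boot-sector field offsets, and must be run elevated; treat it as a starting point, not production code.

#include <windows.h>
#include <stdio.h>
#include <string.h>

int main()
{
    // Open the raw C: volume (needs administrator rights).
    HANDLE hVol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                              FILE_SHARE_READ | FILE_SHARE_WRITE,
                              NULL, OPEN_EXISTING, 0, NULL);
    if (hVol == INVALID_HANDLE_VALUE) {
        fwprintf(stderr, L"CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    // Reads on a volume handle must be sector-aligned; this assumes the
    // common 512-byte logical sector size.
    BYTE boot[512];
    DWORD read = 0;
    if (!ReadFile(hVol, boot, sizeof(boot), &read, NULL) || read != sizeof(boot)) {
        fwprintf(stderr, L"ReadFile(boot sector) failed: %lu\n", GetLastError());
        return 1;
    }

    // Standard NTFS boot sector fields: bytes/sector, sectors/cluster,
    // and the starting LCN of the $MFT.
    WORD bytesPerSector;
    memcpy(&bytesPerSector, boot + 0x0B, sizeof(bytesPerSector));
    BYTE sectorsPerCluster = boot[0x0D];
    LONGLONG mftLcn;
    memcpy(&mftLcn, boot + 0x30, sizeof(mftLcn));
    LONGLONG mftOffset = mftLcn * sectorsPerCluster * bytesPerSector;

    wprintf(L"$MFT starts at LCN %lld (byte offset %lld)\n", mftLcn, mftOffset);

    // Seek to the $MFT and read its own file record (1024 bytes, starts with "FILE").
    LARGE_INTEGER pos;
    pos.QuadPart = mftOffset;
    if (!SetFilePointerEx(hVol, pos, NULL, FILE_BEGIN)) {
        fwprintf(stderr, L"SetFilePointerEx failed: %lu\n", GetLastError());
        return 1;
    }

    BYTE record[1024];
    if (ReadFile(hVol, record, sizeof(record), &read, NULL) && read == sizeof(record))
        wprintf(L"First file record signature is FILE: %hs\n",
                memcmp(record, "FILE", 4) == 0 ? "yes" : "no");

    CloseHandle(hVol);
    return 0;
}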
This is not going to be a trivial proposition. You'll likely have to roll your own code solution to accomplish this. You can get some info about the details of the $MFT by checking out http://www.ntfs.com/ntfs-mft.htm
Another option is to spend some time looking through the source code of the open-source project NTFS-3g. You can download the source from http://www.tuxera.com/community/ntfs-3g-download/
Another good project is the NTFSProgs http://en.wikipedia.org/wiki/Ntfsprogs
Good luck.

Getting physical sector of a filesystem directory entry

Is there a way to find the sector/cluster number of a directory entry in Windows?
If the solution differs for NTFS and FAT32, or only exists for one of them, that's OK - I can live with it.
Thanks,
Max
I believe FSCTL_GET_RETRIEVAL_POINTERS is what you are after. I also found a really detailed exploration of its use.
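For what it's worth, a minimal C++ sketch of calling it could look like the following. It maps the first extent of a file or directory to a logical cluster number and converts that to a volume-relative sector, assuming the path is on C: and the data is non-resident (resident NTFS files and small directories have no extents, and the ioctl typically fails with ERROR_HANDLE_EOF for them).

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int wmain(int argc, wchar_t **argv)
{
    if (argc < 2) { fwprintf(stderr, L"usage: %s <path>\n", argv[0]); return 1; }

    // FILE_FLAG_BACKUP_SEMANTICS lets us open directories as well as files.
    HANDLE h = CreateFileW(argv[1], FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fwprintf(stderr, L"CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    STARTING_VCN_INPUT_BUFFER in = {};        // start mapping from VCN 0
    alignas(8) BYTE outBuf[4096];             // enough extents for a quick look
    DWORD bytes = 0;
    if (!DeviceIoControl(h, FSCTL_GET_RETRIEVAL_POINTERS, &in, sizeof(in),
                         outBuf, sizeof(outBuf), &bytes, NULL)) {
        // Resident files (and heavily fragmented ones that overflow the buffer)
        // land here; check GetLastError() for ERROR_HANDLE_EOF / ERROR_MORE_DATA.
        fwprintf(stderr, L"DeviceIoControl failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 1;
    }

    RETRIEVAL_POINTERS_BUFFER *rp = (RETRIEVAL_POINTERS_BUFFER *)outBuf;
    LONGLONG firstLcn = rp->Extents[0].Lcn.QuadPart;

    // Convert the cluster number to a volume-relative sector number
    // (assumes the path lives on C:).
    DWORD spc = 0, bps = 0, freeC = 0, totalC = 0;
    GetDiskFreeSpaceW(L"C:\\", &spc, &bps, &freeC, &totalC);

    wprintf(L"%lu extent(s); first extent at LCN %lld, sector %lld\n",
            rp->ExtentCount, firstLcn, firstLcn * (LONGLONG)spc);

    CloseHandle(h);
    return 0;
}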

Is there a way to get the filename/location information from the MFT of an NTFS volume?

I need to get the list of all the files on a drive. I am using a recursive solution, but it is taking a lot of time. I was wondering: is it possible to get the names and locations of all the files on an NTFS drive from its Master File Table? I think it would be very fast. Any suggestions?
There is a tool that searches the MFT directly; it's called ndff. I have used it before and it is very fast.
Presumably it is possible to do what you want - there is another tool called "Everything" which I guess does the same thing; it also uses the USN change journal to keep its index up to date.
When you get a list of all the files on an NTFS-formatted drive using a recursive solution, you are getting them from the MFT. There should be little disk IO outside of the MFT when simply retrieving a list of filenames and directories.
Before going down the path of determining the format of the MFT (which is available from a variety of places on the Internet) and writing code to read it directly, you should probably profile your code and determine that you aren't already CPU or IO bound.
I have the impression you're imagining some kind of list-like structure in the MFT which you can read in one go with no or minimal seeking.
This is not the case. The MFT uses a type of b-tree to store pathnames. When you scan the directory structure on your disk, you are in fact walking the MFT b-tree; you are doing what you would have to do if you accessed the MFT directly.
Yes there is, and the program I just open-sourced does exactly this.
You can read the source to find out how it works, but basically, it just looks for FILE_NAME attributes inside the $MFT and then uses the ParentDirectory field to get the parent of every file.
That way it can completely avoid reading the contents of any directory.
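If you don't want to parse the $MFT by hand the way that program does, a documented route to roughly the same result is the FSCTL_ENUM_USN_DATA ioctl: it walks the volume's MFT records and returns each file's name plus its file and parent reference numbers, so you can rebuild full paths without reading any directory. A C++ sketch (requires administrator rights, minimal error handling, and struct names from a reasonably recent Windows SDK):

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main()
{
    // Open the raw volume; FSCTL_ENUM_USN_DATA needs an elevated process.
    HANDLE hVol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                              FILE_SHARE_READ | FILE_SHARE_WRITE,
                              NULL, OPEN_EXISTING, 0, NULL);
    if (hVol == INVALID_HANDLE_VALUE) {
        fwprintf(stderr, L"CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    MFT_ENUM_DATA_V0 enumData = {0};   // start at the first file record
    enumData.HighUsn = MAXLONGLONG;

    alignas(8) BYTE buffer[64 * 1024];
    DWORD bytes = 0;
    ULONGLONG count = 0;

    // Each successful call returns a batch of USN_RECORDs, preceded by the
    // file reference number to pass back in for the next batch.
    while (DeviceIoControl(hVol, FSCTL_ENUM_USN_DATA, &enumData, sizeof(enumData),
                           buffer, sizeof(buffer), &bytes, NULL)) {
        BYTE *p = buffer + sizeof(DWORDLONG);
        while (p < buffer + bytes) {
            USN_RECORD *rec = (USN_RECORD *)p;
            // File name is a counted (not NUL-terminated) UTF-16 string.
            wprintf(L"%llx %llx %.*s\n",
                    rec->FileReferenceNumber, rec->ParentFileReferenceNumber,
                    (int)(rec->FileNameLength / sizeof(WCHAR)),
                    (WCHAR *)((BYTE *)rec + rec->FileNameOffset));
            ++count;
            p += rec->RecordLength;
        }
        enumData.StartFileReferenceNumber = *(DWORDLONG *)buffer;
    }

    // The loop ends with ERROR_HANDLE_EOF once every record has been returned.
    wprintf(L"Enumerated %llu records\n", count);
    CloseHandle(hVol);
    return 0;
}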

Vista Recycle Bin Maximum Size

Does anyone know how to set the maximum size of the Recycle Bin via script, whether it's written in batch or VBScript or just a .reg file?
I've been exploring this idea for the last few days and can't get an answer. Maybe someone here knows how. I want to add it to my Vista cleanup script =)
The maximum size of the Recycle Bin for each drive is stored under this key:
HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\BitBucket\Volume\
There's a subkey with the GUID of each mounted volume. This GUID can be found with the MOUNTVOL command, or programmatically with GetVolumeNameForVolumeMountPoint().
You can just change the size value and it will be picked up by Explorer the next time something is deleted (i.e. you don't need to notify the shell about the change).
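A scripted/coded version of that registry change might look like the sketch below. It enumerates every volume GUID under the BitBucket key for the current user and writes a MaxCapacity value; on my understanding of Vista that value is a DWORD holding the limit in megabytes, but verify the value name and units on your own machine before adding it to your cleanup script.

// Sketch: cap the Recycle Bin for every volume listed under the current
// user's BitBucket key. Assumes the per-volume value is "MaxCapacity" in MB.
#include <windows.h>
#include <stdio.h>

int main()
{
    const wchar_t *base =
        L"Software\\Microsoft\\Windows\\CurrentVersion\\Explorer\\BitBucket\\Volume";
    DWORD capMB = 1024;   // desired maximum size in MB

    HKEY hBase;
    if (RegOpenKeyExW(HKEY_CURRENT_USER, base, 0,
                      KEY_ENUMERATE_SUB_KEYS, &hBase) != ERROR_SUCCESS) {
        fwprintf(stderr, L"Could not open %s\n", base);
        return 1;
    }

    wchar_t guid[256];
    for (DWORD i = 0; ; ++i) {
        DWORD cch = 256;
        if (RegEnumKeyExW(hBase, i, guid, &cch, NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
            break;   // no more volume subkeys

        HKEY hVol;
        if (RegOpenKeyExW(hBase, guid, 0, KEY_SET_VALUE, &hVol) == ERROR_SUCCESS) {
            RegSetValueExW(hVol, L"MaxCapacity", 0, REG_DWORD,
                           (const BYTE *)&capMB, sizeof(capMB));
            wprintf(L"Set MaxCapacity=%lu MB for volume %s\n", capMB, guid);
            RegCloseKey(hVol);
        }
    }

    RegCloseKey(hBase);
    return 0;
}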
This isn't something code should be able to do, therefore there probably isn't a way. Seems like something Raymond Chen would blog about.
Why would you want to do that?
