PowerShell script to output size on disk - Windows

I'm very new to PowerShell. Is it possible to obtain the actual size on disk of a file? I was able to use du, but is there another way of doing this without using that application?

This will give you the size of a file in bytes (strictly, its length; the allocated size on disk can differ slightly because of cluster allocation):
(gci <insert file path> | Measure-Object -Property length -Sum).sum
You can then use other logic to convert to KB, MB, GB, whatever you want. You can use the same command to get the size of a directory; add the -Recurse option to include all subdirectories and files under the root.
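For example, a minimal sketch of the conversion (the folder path is a placeholder):
# Total size of a directory tree, formatted in MB
$bytes = (gci C:\SomeFolder -Recurse | Measure-Object -Property Length -Sum).Sum
"{0:N2} MB" -f ($bytes / 1MB)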

Related

Can you compress files on a network share using PowerShell without downloading them to the client?

I have a network share hosted by a server (\\SERVER) which is being accessed by other servers/clients.
\\SERVER\SHARE\Folder\File
If I wanted to compress Folder and everything in it, is it possible to do using PowerShell WITHOUT having the files be downloaded to the machine that is running the command?
The files in question are large, and there is not enough room on the C drive to download and compress the files on the client. So for example if I navigated to the network share from the client using Windows File Explorer, and selected the folder to compress, it would start downloading the files to the client and then fail due to insufficient free space on the client.
What about PowerShell's Invoke-Command Option?
I do have the option to Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) with plenty of space, though. I would have to tell PowerShell to compress files on that drive somehow, instead of the default, which would be the C drive.
Error when running the PowerShell command below
Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"
Exception calling "Write" with "3" argument(s): "Stream was too long."
At
C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820
char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException
The problem is caused by Compress-Archive's limits. Its maximum file size is 2 GB. Documentation mentions this:
The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API.
As for a solution: compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module available for it, though manual compression is not that complex. As 7-Zip is not a native tool, install either it or the PowerShell module.
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
sz a $tgt $src
If the source files are small enough that a single file will never create an archive larger than the limit, consider compressing each file by itself. For example:
$srcDir = "C:\someidir"
$dstDir = "D:\archivedir"
# List all the files, not subdirs
$files = gci $srcDir -recurse | ? { -not $_.PSIsContainer }
foreach($f in $files) {
# Create new name for compressed archive. Add file path, but
# replace \ with _ so there are no name collisions.
$src = $f.FullName
$dst = "c:\temppi\" + $src.Replace('\', '_').Replace(':','_') + ".zip"
Compress-Archive -whatif -Path $src -DestinationPath $dstDir
}
As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server. There you can use local paths, though UNC paths should work well too - those are processed via loopback, so the data doesn't go through the network.
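A minimal sketch of that approach, assuming PowerShell Remoting is enabled on SERVER, 7-Zip is installed there, and the share lives at D:\Share locally (all three are assumptions):
Invoke-Command -ComputerName SERVER -ScriptBlock {
    # Paths here are local to SERVER; adjust to the real share location
    Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
    sz a "D:\Share\OtherFolder\Archive.7z" "D:\Share\Folder"
}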

How to write a PowerShell script to tell whether a disk is basic or dynamic?

I have to check whether a given node contains any dynamic disks, and get the list of dynamic disks, using a PowerShell script. I am not supposed to use the diskpart command. Any solutions other than diskpart will be appreciated.
https://social.technet.microsoft.com/Forums/windowsserver/en-US/cd7c0327-3fe9-45fc-a177-5a9927d468f3/does-the-getdisk-funtion-only-return-basic-disks?forum=winserverpowershell
Get-WmiObject Win32_DiskPartition -filter "Type='Logical Disk Manager'" | Select-Object *
You may also use the diskpart utility, which is easily scriptable (I have driven it from Python). The idea is that when you run diskpart and then list disk, the output will look like:
Disk ###  Status         Size     Free     Dyn  Gpt
--------  -------------  -------  -------  ---  ---
Disk 0    Online         476 GB   0 B           *
So you'll see all dynamic disks marked with an asterisk under "Dyn".
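Building on the WMI filter above, a small sketch that reports whether a node has any dynamic disks (the property selection is illustrative):
# Dynamic disks host 'Logical Disk Manager' partitions
$ldm = Get-WmiObject Win32_DiskPartition -Filter "Type='Logical Disk Manager'"
if ($ldm) {
    $ldm | Select-Object DiskIndex, Name, Type | Format-Table -AutoSize
} else {
    "No dynamic disks found on this node."
}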

cmd dir: write file size in Windows 10 in GB instead of bytes [closed]

I use the command
dir L01\*.gz /o-s/-c > L01.txt
to find all .gz files in a directory and sort them by size. It works well!
But I need the size in GB, not in bytes!
01/24/2020 12:36 AM 3018394731 v3000418221_L01_82_1.fq.gz
01/24/2020 12:36 AM 2883376937 v3000418221_L01_82_2.fq.gz
01/24/2020 12:36 AM 2875257587 v3000418221_L01_69_1.fq.gz
01/24/2020 12:36 AM 2785098410 v3000418221_L01_69_2.fq.gz
01/24/2020 12:36 AM 2520038171 v3000418221_L01_99_1.fq.gz
01/24/2020 12:36 AM 2478618550 v3000418221_L01_62_1.fq.gz
01/24/2020 12:36 AM 2470651439 v3000418221_L01_99_2.fq.gz
I also need only the filenames and sizes in GB, without date and time.
And it would be great if the command handled files in all subdirectories and gave output like:
directory L01 (or something else):
v3000418221_L01_82_1.fq.gz 2.5 Gb
v3000418221_L01_82_2.fq.gz 2.4 Gb
directory L02 :
v3000418221_L01_12_1.fq.gz 2.1 Gb
v3000418221_L01_32_2.fq.gz 0.4 Gb
v3000418221_L01_42_1.fq.gz 1.5 Gb
v3000418221_L01_8_2.fq.gz 2.4 Gb
It is not my computer, so I am trying to do this in cmd without installing Python.
You can do the following in PowerShell to search the current directory; it will output FileInfo objects sorted by Length (size), with the size converted to GB.
Output to Console Only
Get-ChildItem -Filter '*L01*.gz' | Sort Length -Desc |
Select LastWriteTime,Name,@{n='SizeGB';e={$_.Length/1GB}}
If you want to set the directory within the command, you can add -Path DirectoryPath to your Get-ChildItem command. The -Recurse switch allows for a recursive search from that directory. See Get-ChildItem.
If you want to pipe the output to a file as is, you can just add | Out-File L01.txt or | Set-Content L01.txt.
In PowerShell, dir is an alias for Get-ChildItem. So you can use dir -Filter '*L01*.gz' if you feel the need.
Output to File Without Table Headers
PowerShell works with objects. If your objects have property names, they will by default appear as column headers in a table output. If you want the headers removed, you can just pipe your output to Format-Table -HideTableHeaders.
Get-ChildItem -Filter '*L01*.gz' | Sort Length -Desc |
Select LastWriteTime,Name,@{n='SizeGB';e={$_.Length/1GB}} |
Format-Table -HideTableHeaders | Set-Content L01.txt
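Grouped Output per Directory
For the per-directory listing sketched in the question, something along these lines should work (a sketch; it assumes the .gz files sit under the current directory):
Get-ChildItem -Recurse -File -Filter '*.gz' |
    Group-Object DirectoryName |
    ForEach-Object {
        "directory $($_.Name) :"
        $_.Group | Sort-Object Length -Descending |
            ForEach-Object { '{0} {1:N1} GB' -f $_.Name, ($_.Length / 1GB) }
    }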
Output to File From CMD Shell
If you only want to run code from the cmd shell, then you can still execute the PowerShell code there. Just put the PowerShell code in a script file Something.ps1.
Powershell.exe -File Something.ps1
There are some differences in default encoding between Out-File and Set-Content. In Windows PowerShell, the > redirection operator (which can target a file or stream) uses Unicode (UTF-16) with a BOM, and Out-File behaves the same as the redirection operator when no parameters are supplied; Set-Content uses the culture-specific Default (ANSI) encoding. In PowerShell Core, or just PowerShell as of v7, both commands output files in UTF-8 with no BOM by default. Both commands have an -Encoding parameter for you to control your output encoding.
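For example, to pin the encoding explicitly rather than rely on the version-specific default:
# utf8 writes a BOM in Windows PowerShell, no BOM in PowerShell 7
Get-ChildItem -Filter '*L01*.gz' | Out-File L01.txt -Encoding utf8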

Filter int, float and character from a text file in a Shell Script

Suppose I have a text file, which contains data like this.
The output below was generated from du -sh /home/*
1.5G user1
2.5G user2
And so on...
Now I want those sizes stored in an array and compared against 5 GB, so I can tell when a user is consuming more than 5 GB. What can I do?
The du command shows the usage of each folder in the home directory. I want to be notified when some user is consuming more than 5 GB. Because there is a long list of users, it would be tedious to check each user's disk usage by hand. I want a shell script to identify the usage for each directory in /home; then I will add a mail function to notify myself of exceeded limits.
Note: I don't want to implement quotas, as I just want to monitor the usage.
Use du's -t (--threshold) option to specify you only want to know about directories with more than a certain amount of data in them:
$ du -sh -t 5G /home/*
If you're picky about precisely how big a gigabyte is, note that 5G uses multiples of 1024; you may prefer -t 5GB for multiples of 1000, or even -t 5000M to mix them.
For lots of users, you're probably better off using -d 1 instead of -s, to avoid the shell having to expand the * into a very long list:
$ du -h -d 1 -t 5G /home/

Command Prompt: dir file whose File size is greater than 1024KB

I am currently using the following command to list all documents in PDF format with their complete paths, but it returns a list of around 11,000 documents:
dir *.pdf /s /b
I'd like to list only those files that are larger than 1024 KB; the file size shouldn't be displayed, but the file should be greater than 1024 KB in size.
Is that possible using the command prompt?
Since you're using Windows, you will most likely have PowerShell:
ls *.pdf | where-object {$_.length -gt 1048576} | format-table -property Name
ls will list the files with the .pdf extension. where-object filters the result set to files with length greater than 1 MB (1048576 bytes = 1024 KB). format-table formats the final output to display only the name of each file.
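If you also want the recursive, bare-path behaviour of dir *.pdf /s /b, a sketch along these lines should work (PowerShell accepts size literals such as 1MB directly):
Get-ChildItem -Path . -Filter *.pdf -Recurse -File |
    Where-Object { $_.Length -gt 1MB } |
    Select-Object -ExpandProperty FullName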
