When running the following line in PowerShell including the "Format-Table -AutoSize", an empty output file is generated:
Get-ChildItem -Recurse | select FullName,Length | Format-Table -AutoSize | Out-File filelist.txt
The reason I need the output file to be AutoSized is that longer filenames from the directory are being truncated. I am trying to pull all file names and file sizes for all files within a folder and its subfolders. When I remove the -AutoSize element, an output file is generated, but with truncated file names:
Get-ChildItem -Recurse | select FullName,Length | Out-File filelist.txt
Like AdminOfThings commented, use Export-CSV to get the untruncated values of your object.
Get-ChildItem -Recurse | select FullName,Length | Export-Csv -Path $myPath -NoTypeInformation
I do not use Out-File much at all, and I only use Format-Table/Format-List for interactive scripts. If I want to write data to a file, Select-Object Column1,Column2 | Sort-Object Column1 | Export-CSV lets me select the properties of the object I am exporting and sort the records as needed. You can also change the delimiter from a comma to a tab, pipe, or whatever else you may need; see the sketch below.
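For example, a minimal sketch of a tab-delimited export (the output path is just a placeholder):

# -Delimiter switches the separator; "`t" is a tab character,
# and C:\temp\filelist.tsv is a placeholder path
Get-ChildItem -Recurse |
    Select-Object FullName, Length |
    Sort-Object FullName |
    Export-Csv -Path C:\temp\filelist.tsv -Delimiter "`t" -NoTypeInformation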
While the other answer may address the issue, you may have other reasons for wanting to use Out-File. Out-File has a "Width" parameter. If this is not set, PowerShell defaults to 80 characters - hence your issue. This should do the trick:
Get-ChildItem -Recurse | select FullName,Length | Out-File filelist.txt -Width 250

(250 is just an example; use any other value as needed.)
The Format-* cmdlets in PowerShell are intended only for console display. They emit formatting records rather than the original objects, so their output is useless to anything except the Out-* cmdlets.
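You can see this for yourself with a quick interactive check:

# Format-Table emits format records, not the original FileInfo objects
Get-ChildItem | Format-Table | Get-Member
# TypeName: Microsoft.PowerShell.Commands.Internal.Format.FormatStartData, ...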
The usual approach to get the data out is with Export-Csv. CSV files are easily imported into other scripts or spreadsheets.
If you really need to output a nicely formatted text file you can use .Net composite formatting with the -f (format) operator. This works similarly to printf() in C. Here is some sample code:
# Get the files for the report
$files = Get-ChildItem $baseDirectory -Recurse
# Path column width
$nameWidth = $files.FullName |
    ForEach-Object { $_.Length } |
    Measure-Object -Maximum |
    Select-Object -ExpandProperty Maximum
# Size column width
$longestFileSize = $files |
    ForEach-Object { $_.Length.ToString().Length } |
    Measure-Object -Maximum |
    Select-Object -ExpandProperty Maximum
# Have to consider that some directories will have no files with
# length strings longer than "Size (Bytes)"
$sizeWidth = [System.Math]::Max($longestFileSize, "Size (Bytes)".Length)
# Left-align paths, right-align file sizes (a negative width left-aligns)
$formatString = "{0,-$nameWidth} {1,$sizeWidth}"
# Build the report and write it to a file
# An ArrayList is much more efficient than using += with arrays
$lines = [System.Collections.ArrayList]::new($files.Length + 3)
# The [void] casts just prevent ArrayList.Add() from cluttering the
# console with the returned indices
[void]$lines.Add($formatString -f ("Path", "Size (Bytes)"))
[void]$lines.Add($formatString -f ("----", "------------"))
foreach ($file in $files) {
    [void]$lines.Add($formatString -f ($file.FullName, $file.Length.ToString()))
}
$lines | Out-File "Report.txt"
I am having a helluva time trying to understand why this script is not working as intended. It is a simple script in which I am attempting to import a CSV, select the few columns that I want, then export the CSV so that it overwrites the original file. (Basically, we have archived data that I only need a few columns from for another project, due to memory size constraints.) This script is very simple, which apparently has an inverse relationship with how much frustration it causes when it doesn't work... Right now the end result is that I end up with an empty CSV instead of a CSV containing only the columns I selected with Select-Object.
$RootPath = "D:\SomeFolder"
$csvFilePaths = Get-ChildItem $RootPath -Recurse -Include *.csv |
    ForEach-Object{
        Import-CSV $_ |
            Select-Object Test_Name, Test_DataName, Device_Model, Device_FW, Data_Avg_ms, Data_StdDev |
            Export-Csv $_.FullName -NoType -Force
    }
Unless you read the input file into memory in full, up front, you cannot safely read from and write back to the same file in a given pipeline.
Specifically, a command such as Import-Csv file.csv | ... | Export-Csv file.csv will erase the content of file.csv, because Export-Csv opens (and truncates) the output file before Import-Csv has finished reading it.
The simplest solution is to enclose the command that reads the input file in (...), but note that:
The file's content (transformed into objects) must fit into memory as a whole.
There is a slight risk of data loss if the pipeline is interrupted before all (transformed) objects have been written back to the file.
Applied to your command:
$RootPath = "D:\SomeFolder"
Get-ChildItem $RootPath -Recurse -Include *.csv -OutVariable csvFiles |
    ForEach-Object{
        (Import-CSV $_.FullName) | # NOTE THE (...)
            Select-Object Test_Name, Test_DataName, Device_Model, Device_FW,
                          Data_Avg_ms, Data_StdDev |
            Export-Csv $_.FullName -NoType -Force
    }
Note that I've used -OutVariable csvFiles in order to collect the CSV file-info objects in output variable $csvFiles. Your attempt to collect the file paths via $csvFilePaths = ... doesn't work, because it attempts to collect Export-Csv's output, but Export-Csv produces no output.
Also, to be safe, I've changed the Import-Csv argument from $_ to $_.FullName to ensure that Import-Csv finds the input file (because, regrettably, file-info object $_ is bound as a string, which sometimes expands to the mere file name).
A safer solution would be to output to a temporary file first, and (only) on successful completion replace the original file.
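A minimal sketch of that approach, assuming it replaces the pipeline inside the ForEach-Object block above (the .tmp suffix and the $? check are illustrative, not a hardened implementation):

# Write to a sibling temp file; the target differs from the input,
# so no (...) is needed and nothing is truncated prematurely
$tmpFile = $_.FullName + '.tmp'
Import-CSV $_.FullName |
    Select-Object Test_Name, Test_DataName, Device_Model, Device_FW, Data_Avg_ms, Data_StdDev |
    Export-Csv $tmpFile -NoTypeInformation
# Replace the original only if the export reported success
if ($?) { Move-Item -LiteralPath $tmpFile -Destination $_.FullName -Force }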
With either approach, the replacement file will have default file attributes and permissions; if the original file had special attributes and/or permissions that you want to preserve, you must recreate them explicitly.
As Matt commented, your last $PSItem ($_) no longer relates to Get-ChildItem but to the output of the Select-Object cmdlet, which doesn't have a FullName property.
You can use a different foreach approach:
$RootPath = "D:\SomeFolder"
$csvFilePaths = Get-ChildItem $RootPath -Recurse -Include *.csv
foreach ($csv in $csvFilePaths)
{
    (Import-CSV $csv.FullName) | # (...) reads the file in full before it is overwritten
        Select-Object Test_Name,Test_DataName,Device_Model,Device_FW,Data_Avg_ms,Data_StdDev |
        Export-Csv $csv.FullName -NoType -Force
}
Or, keeping your code, add a $CsvPath variable containing the CSV path and use it later on:
$RootPath = "D:\SomeFolder"
Get-ChildItem $RootPath -Recurse -Include *.csv | ForEach-Object{
    $CsvPath = $_.FullName
    (Import-CSV $CsvPath) | # (...) reads the file in full before it is overwritten
        Select-Object Test_Name,Test_DataName,Device_Model,Device_FW,Data_Avg_ms,Data_StdDev |
        Export-Csv $CsvPath -NoType -Force
}
So I have figured it out. In the original code I was attempting to pipe the Import-Csv cmdlet directly through the pipeline; I simply had to declare a variable that uses the Import-Csv cmdlet as its definition and pipe that variable through to the Select-Object and Export-Csv cmdlets. Here is the code snippet that gets what I wanted to get done, done. Thank you all for your assistance, I appreciate it!
$RootPath = "\someDirectory\"
$CsvFilePaths = @(Get-ChildItem $RootPath -Recurse -Include *.csv)
$ColumnsWanted = @('Test_Name','Test_DataName','Device_Model','Device_FW','Data_Avg_ms','Data_StdDev')
for($i = 0; $i -lt $CsvFilePaths.Length; $i++){
    $csvPath = $CsvFilePaths[$i]
    Write-Host $csvPath
    $importedCsv = Import-CSV $csvPath
    $importedCsv | Select-Object $ColumnsWanted | Export-CSV $csvPath -NoTypeInformation
}
I want to check files for integrity with a checksum. To make it easier I put the hash into an alternate data stream of the file. When someone alters the file I can verify this with the checksum.
However, when I add a data stream the file's LastWriteTime gets updated, so I added functionality to reverse it.
It works like a charm - mostly. But it fails with some files, about 5%, and I have no idea why. It looks like it fails with file names that contain spaces or extra dots, but many others that have spaces and multiple dots in the file name work just fine.
Does anyone know what's going on, how to prevent these failures or how to improve the code?
Thanks!
The code:
$filenames = Get-ChildItem *.xl* -Recurse | % { $_.FullName }
foreach( $filename in $filenames ) { ForEach-Object { $timelwt = Get-ItemProperty $filename | select -expand LastWriteTime | select -expand ticks } { add-content -stream MD5 -value (Get-FileHash -a md5 $filename).hash $filename } { Set-ItemProperty $filename -Name LastWriteTime -Value $timelwt } }
Your code can be reduced to this:
Get-ChildItem *.xl* -Recurse | ForEach-Object {
    $lastWriteTime = $_.LastWriteTime
    $_ | Add-Content -Stream MD5 -Value ($_ | Get-FileHash -Algorithm MD5).Hash
    $_.LastWriteTime = $lastWriteTime
}
Get-ChildItem with the wildcard you have in place will return FileInfo objects, which have a settable LastWriteTime property; there is no reason to use Get-ItemProperty or Set-ItemProperty with them.
As for why your code could be failing: the likely explanation is that some of your file paths contain wildcard metacharacters (such as [ and ]), and since you're not using -LiteralPath, the cmdlets default to the -Path parameter, which interprets those metacharacters; see the sketch below.
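A minimal sketch of the difference (the bracketed file name is hypothetical):

# 'report[1].xlsx' contains wildcard metacharacters; -Path would treat
# [1] as a character set, so the literal file is never matched
$file = Get-Item -LiteralPath 'C:\Data\report[1].xlsx'
Add-Content -LiteralPath $file.FullName -Stream MD5 -Value (
    Get-FileHash -LiteralPath $file.FullName -Algorithm MD5).Hash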
As an aside, I would personally recommend creating a separate checksum file for the files instead of adding an alternate data stream; see the sketch below.
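For example, a minimal sketch (the checksums.csv name is a placeholder):

# One checksum file for the whole tree instead of a stream per file
Get-ChildItem *.xl* -Recurse |
    Get-FileHash -Algorithm MD5 |
    Select-Object Hash, Path |
    Export-Csv checksums.csv -NoTypeInformation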
I have 30 folders. Each folder contains 22 .txt files. I am trying to get the file name and row count of each .txt file and output them to a .csv file, appending the name of the .csv file with the name of each subfolder.
The script I made works, but it pulls all the .txt files from all subfolders and outputs them to a single .csv file.
Any idea how I can create one .csv file per subfolder?
$results = Get-ChildItem "C:\Users\testserver\Documents\logfiles\*.txt" -Recurse | % { $_ | select name, @{n="lines";e={get-content $_ | measure-object -line | select -expa lines } } } | Select-Object name, lines
$results | Export-Csv "C:\Users\testserver\Documents\results.csv" -notype
Use the Group-Object cmdlet to process the files grouped by the directory they reside in:
$inDir = 'C:\Users\testserver\Documents\logfiles'
$outDir = 'C:\Users\testserver\Documents'
Get-ChildItem -File -Recurse -Filter *.txt $inDir |
    Group-Object DirectoryName |
    ForEach-Object {
        $outFile = Join-Path $outDir "results-$(Split-Path -Leaf $_.Name).csv"
        $_.Group |
            Select-Object Name, @{ n="Lines"; e={ (Get-Content $_.FullName | Measure-Object -Line).Lines } } |
            Export-Csv -NoTypeInformation $outFile
    }
The above creates output files such as results-foo.csv in $outDir, where foo is the name of a subdirectory containing *.txt files.
Note that the assumption is that no two subdirectories in the target $inDir directory tree have the same name; more work is needed if you need to handle such collisions, such as reflecting the relative paths in the file name, with \ replaced by a substitute character, as sketched below.
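A rough sketch of that idea, reusing $inDir and $outDir from above (the underscore substitute character is arbitrary):

Get-ChildItem -File -Recurse -Filter *.txt $inDir |
    Group-Object DirectoryName |
    ForEach-Object {
        # Derive the output name from the path relative to $inDir,
        # e.g. 'sub1\sub2' becomes 'results-sub1_sub2.csv'
        $relative = $_.Name.Substring($inDir.Length).TrimStart('\')
        $outFile = Join-Path $outDir "results-$($relative -replace '\\', '_').csv"
        $_.Group |
            Select-Object Name, @{ n="Lines"; e={ (Get-Content $_.FullName | Measure-Object -Line).Lines } } |
            Export-Csv -NoTypeInformation $outFile
    }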
I need my program to give me every folder containing files which are out of the Windows' number of characters limit. It means if a file has more than 260 characters (248 for folders), I need it to write the address of the file's parent. And I need it to write it only once. For now, I'm using this code:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    Select-Object -ExpandProperty FullName |
    Split-Path $_.FullName
But the Split-Path won't work (this is the first time I've used it). It tells me the -Path parameter has a null value (I can write -Path explicitly, but it doesn't change anything).
If you want an example of what I need: imagine folder3 has a 230-character address and file.txt has a 280-character address:
C:\users\folder1\folder2\folder3\file.txt
Would write:
C:\users\folder1\folder2\folder3
I'm using PS2, by the way.
Spoiler: the tool you are building may not be able to report paths over the limit since Get-ChildItem cannot access them. You can try nevertheless, and also find other solutions in the links at the bottom.
Issue in your code: $_ only works in specific contexts, for example a ForEach-Object loop.
But here, at the end of the pipeline, you're only left with a string containing the full path (not the complete file object any more), so directly passing it to Split-Path should work:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    Select-Object -ExpandProperty FullName |
    Split-Path
as "C:\Windows\System32\regedt32.exe" | Split-Path would output C:\Windows\System32
Sidenote: what do (Get-Item C:\Windows\System32\regedt32.exe).DirectoryName and (Get-Item C:\Windows\System32\regedt32.exe).Directory.FullName output on your computer ? These both show the directory on my system.
Adapted code example:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    ForEach-Object { $_.Directory.FullName } |
    Select-Object -Unique
Additional information about MAX_PATH:
How do I find files with a path length greater than 260 characters in Windows?
Why does the 260 character path length limit exist in Windows?
http://www.powershellmagazine.com/2012/07/24/jaap-brassers-favorite-powershell-tips-and-tricks/
https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247%28v=vs.85%29.aspx
https://gallery.technet.microsoft.com/scriptcenter/Get-ChildItemV2-to-list-29291aae
You cannot use Get-ChildItem to list paths greater than the Windows character limit.
There are a couple of alternatives for you. Try an external library like AlphaFS, or use robocopy. Boe Prox has a script that utilizes robocopy; it is available on TechNet, but I am not sure whether it will work on PS v2. Anyway, you can give it a try; a rough sketch of the robocopy idea follows.
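For instance (a rough sketch; /l is list-only mode, so nothing is copied, and NULL is just a dummy destination name):

# /l = list only, /s = recurse, /njh /njs = no job header/summary,
# /nc /ns = no file class/size columns, /ndl = no directory lines
robocopy $newPath NULL /l /s /njh /njs /nc /ns /ndl |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_.Length -gt 260 }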
I've had a similar problem and resolved it like this:
$PathTooLong = @()
Get-ChildItem -LiteralPath $Path -Recurse -ErrorVariable +e -ErrorAction SilentlyContinue
$e | Where-Object { $_.Exception -like 'System.IO.PathTooLongException*' } | ForEach-Object {
    $PathTooLong += $_.TargetObject
    $Global:Error.Remove($_)
}
$PathTooLong
On every path that is too long, or that the PowerShell engine can't handle, Get-ChildItem will throw an error. These errors are saved in the ErrorVariable called e in the example above.
When all errors are collected in $e you can filter out the ones you need by checking the error Exception for the string System.IO.PathTooLongException.
Hope it helps you out.
My group is moving to a new network where we can't directly copy from our computer in Network A to the new machine in Network B. After years on this machine in Network A, I've got project files interspersed all over the disk. I need to build a script to copy the folders and files to a backup disk. No problem there, but the network tech guy requires the total byte count to be known before copying.
In CMD I've used dir /AD /S /B > C:\Users\r6540\Desktop\UserFiles.txt from C:\ to generate a huge list of directories, including a lot of junk that I've manually edited out.
e.g.
C:\Dev\SSIS
C:\Dev\SSIS\DatabaseCleanup
C:\Dev\SSIS\DatabaseMaintTests
C:\Dev\SSIS\EclipseKeys
C:\Dev\SSIS\TemplateProject
I've never used PowerShell, but it certainly looks like this task would be within its ability. I found this:
$startFolder = "C:\Scripts"
$colItems = (Get-ChildItem $startFolder | Measure-Object -property length -sum)
"$startFolder -- " + "{0:N2}" -f ($colItems.sum / 1MB) + " MB"
$colItems = (Get-ChildItem $startFolder -recurse | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object)
foreach ($i in $colItems)
{
    $subFolderItems = (Get-ChildItem $i.FullName | Measure-Object -property length -sum)
    $i.FullName + " -- " + "{0:N2}" -f ($subFolderItems.sum / 1MB) + " MB"
}
at Microsoft TechNet, and also this on the same page:
$objFSO = New-Object -com Scripting.FileSystemObject
"{0:N2}" -f (($objFSO.GetFolder("C:\Scripts").Size) / 1MB) + " MB"
The output I'm looking for is the directory name, a tab, the folder size in MB (without the "MB" suffix shown above), and CRLF as the EOL, written to a text file.
e.g.
C:\Dev\SSIS 70.23
C:\Dev\SSIS\DatabaseCleanup 17.80
C:\Dev\SSIS\DatabaseMaintTests 22.91
C:\Dev\SSIS\EclipseKeys 1.22
C:\Dev\SSIS\TemplateProject 13.29
Anyone know PowerShell well enough to troop through UserFiles.txt and get the resulting text file output?
Form doesn't matter as much as function--so if you can come up with an alternate approach, I'd be glad to see it.
Thanks.
This is pretty straightforward in PowerShell:
$objFSO = New-Object -com Scripting.FileSystemObject
Get-Content c:\input.txt |
    Foreach { "{0}`t{1:N2}" -f $_, (($objFSO.GetFolder($_).Size) / 1MB) } |
    Out-File c:\output.txt -enc ascii
This is assuming the FileSystemObject script you found works. :-)
A simple approach is to use the pipeline more efficiently:
$inputFile = "C:\Users\r6540\Desktop\UserFiles.txt"
$outputFile = "C:\temp\output.txt"
Get-Content $inputFile | ForEach-Object{
    $sum = Get-ChildItem $_ -Recurse -Force | Measure-Object -Property Length -Sum | Select-Object -ExpandProperty Sum
    "{1}`t{0:N2}" -f ($sum / 1MB), $_
} | Set-Content $outputFile
So we take each line of the file, gather the size in MB using your posted logic, and send the output to a file. But we can improve on that a little.
$inputFile = "C:\Users\r6540\Desktop\UserFiles.txt"
$outputFile = "C:\temp\output.txt"
$results = Get-Content $inputFile | ForEach-Object{
    $props = @{
        Folder = $_
        Sum = "{0:N2}" -f ((Get-ChildItem $_ -Recurse -Force | Measure-Object -Property Length -Sum | Select-Object -ExpandProperty Sum) / 1MB)
    }
    New-Object -TypeName PSCustomObject -Property $props
}
$results | Export-CSV $outputFile -NoTypeInformation
"Total,{0}" -f ($results | Measure-Object Sum -sum | Select-Object -ExpandProperty Sum) | Add-Content $outputFile
The only difference here is that we produce nice CSV output, as well as a total appended to the bottom, since at the end of the day that might be nice to know too (if I were the recipient of multiple copies from multiple users, it would save me a few seconds of effort). You could also just keep a running counter in the loop, as sketched below, but this gives you an opportunity to see PowerShell at work.
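A minimal sketch of that counter variant (reusing $inputFile and $outputFile from above):

# Accumulate a running byte total while writing the per-folder lines
$total = 0
Get-Content $inputFile | ForEach-Object{
    $sum = (Get-ChildItem $_ -Recurse -Force | Measure-Object -Property Length -Sum).Sum
    $total += $sum
    "{1}`t{0:N2}" -f ($sum / 1MB), $_
} | Set-Content $outputFile
"Total`t{0:N2}" -f ($total / 1MB) | Add-Content $outputFile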