I am interested in searching for files by custom properties. For example, I want to find all JPEG images with certain dimensions. Something like:
Get-ChildItem -Path C:\ -Filter *.jpg -Recurse | Where-Object { $_.Dimension -eq '1024x768' }
I suspect it involves System.Drawing. How can it be done?
Thanks in advance
That's actually pretty easy to do and your gut feeling about System.Drawing was in fact correct:
Add-Type -Assembly System.Drawing
# $input is the automatic variable holding the script's pipeline input
$input | ForEach-Object { [Drawing.Image]::FromFile($_) }
Save that as Get-Image.ps1 somewhere in your path and then you can use it.
Another option would be to add the following to your $profile:
Add-Type -Assembly System.Drawing
function Get-Image {
$input | ForEach-Object { [Drawing.Image]::FromFile($_) }
}
which works pretty much the same. Of course, add fancy things like documentation and so on as you see fit.
You can then use it like so:
gci -inc *.jpg -rec | Get-Image | ? { $_.Width -eq 1024 -and $_.Height -eq 768 }
Note that you should dispose the objects created this way after using them.
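For example, a minimal sketch that disposes each image right after inspecting it (GDI+ keeps the file open until you do) and emits only the matching paths:
gci -inc *.jpg -rec | ForEach-Object {
    $img = [Drawing.Image]::FromFile($_.FullName)
    try {
        if ($img.Width -eq 1024 -and $img.Height -eq 768) { $_.FullName }
    } finally {
        $img.Dispose()  # releases the file handle GDI+ holds on the JPEG
    }
}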
Of course, you can add a custom Dimension property so you could filter for that:
function Get-Image {
$input |
ForEach-Object { [Drawing.Image]::FromFile($_) } |
ForEach-Object {
$_ | Add-Member -PassThru NoteProperty Dimension ('{0}x{1}' -f $_.Width,$_.Height)
}
}
Here's an alternative implementation as an (almost) one-liner:
Add-Type -Assembly System.Drawing
Get-ChildItem -Path C:\ -Filter *.jpg -Recurse | ForEach-Object { [System.Drawing.Image]::FromFile($_.FullName) } | Where-Object { $_.Width -eq 1024 -and $_.Height -eq 768 }
If you are going to need to run this command more than once, I would recommend Johannes' more complete solution instead.
For PowerShell 2.0 in Win 2008,
I need to check what's the newest file in a directory with about 1.6 million files.
I know I can use Get-ChildItem like so:
$path="G:\Calls"
$filter='*.wav'
$lastFile = Get-ChildItem -Recurse -Path $path -Include $filter | Sort-Object -Property LastWriteTime | Select-Object -Last 1
$lastFile.Name
$lastFile.LastWriteTime
The issue is that it takes very long to find the newest file due to the sheer number of files.
Is there a faster way to find that?
Sort-Object is slow here because it has to collect and compare all the items before it can emit the last one.
But you don't need to sort at all: you can stream over the files and keep track of the newest one seen so far:
Get-ChildItem -Recurse |ForEach-Object `
-Begin { $Newest = $Null } `
-Process { if ($_.LastWriteTime -gt $Newest.LastWriteTime) { $Newest = $_ } } `
-End { $Newest }
There are a couple of things that can be done to improve performance.
First, use -Filter rather than -Include, because the filter is passed down to the underlying Win32 API, which is a bit faster.
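For example (same $path and $filter as above), only the placement of the wildcard evaluation changes:
Get-ChildItem -Recurse -Path $path -Filter $filter   # wildcard evaluated by the filesystem provider, faster
Get-ChildItem -Recurse -Path $path -Include $filter  # wildcard evaluated by PowerShell afterwards, slower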
Also, because the script gathers all the files and then sorts them, you might be creating a very large memory footprint during the sorting phase. I don't know if it's possible to query the MFT or some other mechanism that avoids retrieving each file and inspecting its LastWriteTime, but an alternative approach could be:
gci -rec -file -filter *.wav | %{$v = $null}{if ($_.lastwritetime -gt $v.lastwritetime){$v=$_}}{$v}
I tried this with all files and saw the following:
measure-command{ ls -rec -file |sort lastwritetime|select -last 1}
. . .
TotalSeconds : 142.1333641
vs
measure-command { gci -rec -file | %{$v = $null}{if ($_.lastwritetime -gt $v.lastwritetime){$v=$_}}{$v} }
. . .
TotalSeconds : 87.7215093
which is a pretty good saving. There may be additional ways to improve performance.
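One more avenue, sketched here and untested at this scale (it requires .NET 4, which is not loaded by default under stock PowerShell 2.0): enumerate plain path strings with .NET instead of materializing a FileInfo object per file:
$newestPath = $null
$newestTime = [datetime]::MinValue
foreach ($f in [System.IO.Directory]::EnumerateFiles($path, $filter, 'AllDirectories')) {
    $t = [System.IO.File]::GetLastWriteTime($f)
    if ($t -gt $newestTime) { $newestTime = $t; $newestPath = $f }
}
$newestPath  # full path of the newest .wav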
I am new to PowerShell scripting and have been tasked to create some alerts based on errors in certain log files. These are just logs from a bespoke application.
My current code is:
$OutputFile3 = (Get-Location).Path + ".\Results.txt"
$Sourcefolder= "C:\Users\dewana\Documents\Test\"
$Targetfolder= "C:\Users\dewana\Documents\Test\Test3"
Get-ChildItem -Path $Sourcefolder -Recurse|
Where-Object {
$_.LastWriteTime -gt [datetime]::Now.AddMinutes(-5)
}| Copy-Item -Destination $Targetfolder
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object
{ $_.Contains("errors") }
Remove-Item $OutputFile3
New-Item $OutputFile3 -ItemType file
try
{
$stream = [System.IO.StreamWriter] $OutputFile3
$stream.WriteLine('clientID 1111')
$stream.WriteLine('SEV 1')
$stream.WriteLine('Issue with this process')
}
finally
{
$stream.close()
}
What I am struggling with is this line:
$Testing5 = Get-Content -Tail -1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" | Where-Object { $_.Contains("errors") }
I am trying to store the latest line which contains the word "error" in the log file. I then want to use the stored string in an if statement: if $Testing5 has a new error value assigned, create a custom text file.
I can't seem to find out why Get-Content is not working with Where-Object.
The only issue I can see is that your Where-Object script block is on the next line.
Get-Content -Tail -1 -Path $tempfile | Where-Object
{ $_.Contains("errors") }
If you separate at the pipe it's fine.
Get-Content -Tail -1 -Path $tempfile |
Where-Object { $_.Contains("errors") }
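From there, a minimal sketch of the check the question describes, keeping the question's paths and variable names (-Tail 1 is assumed here, since -Tail expects a positive count):
$Testing5 = Get-Content -Tail 1 -Path "C:\Users\dewana\Documents\Test\Test3\*.txt" |
    Where-Object { $_.Contains("errors") }
if ($Testing5) {
    # a matching error line was found; write the alert file
    Set-Content -Path $OutputFile3 -Value 'clientID 1111', 'SEV 1', 'Issue with this process'
}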
I want to search all drives using PowerShell on a Windows machine to get the list of all files along with their extensions:
based on a desired extension we pass in, like *.mp3, or
fetching all files with multiple extensions, like *.txt, *.mp3, etc.
I tried the script below, but it only searches from the directory where it is run. I want to scan the whole machine.
Get-ChildItem -Path .\ -Filter *.doc* -Recurse -File | Sort-Object Length -Descending | ForEach-Object { $_.BaseName }
Check out the Get-PSDrive cmdlet. It returns a list of drives, and you can specify just disk drives with the -PSProvider FileSystem parameter:
foreach ( $drive in $(Get-PSDrive -PSProvider FileSystem) ) {
Get-ChildItem -Path $drive.Root -Filter *.doc* -Recurse -File |
Sort-Object Length -Descending |
ForEach-Object { $_.BaseName }
}
Didn't test that but you get the idea.
Using -Include on Get-ChildItem will allow you to specify a list of extensions. The -ErrorAction will cause it to skip drives that are not available such as an unmounted CD drive.
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
    Get-ChildItem -Path $_.Root -Recurse -Include '*.doc*', '*.txt' -ErrorAction SilentlyContinue
} |
ForEach-Object { [PSCustomObject]@{HashCode = $_.GetHashCode(); FullName = $_.FullName} } |
Export-Csv -Path $TempFile -NoTypeInformation -Encoding ASCII
Update:
Here is a better way. It will prevent unknown extensions from getting into the mix such as "Microsoft.NET.Sdk.Publish.Docker.targets."
$ExtensionList = @('.txt', '.doc', '.docx', '.mp3')
$TempFile = Join-Path -path $Env:TEMP -ChildPath "$($pid.ToString()).tmp"
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
Get-ChildItem -Path $_.Root -Recurse -ErrorAction SilentlyContinue |
Where-Object { $ExtensionList -contains $_.Extension } |
ForEach-Object {
[PSCustomObject]@{
HashCode = $_.GetHashCode();
DirectoryName = $_.DirectoryName
Name = $_.Name
}
}
} |
Export-Csv -Path $TempFile -Delimiter ';' -NoTypeInformation -Encoding ASCII
Write-Host "The temp file is $TempFile"
This is more than what the original question asked, but if you are going to go through the trouble of listing all your files, I suggest getting the file hash as well so you can determine whether you have duplicates. A simple file name search will not detect that the same file has been saved under a different name. Adding to what @lit (https://stackoverflow.com/users/447901/lit) has posted:
$ExtensionList = @('.txt', '.doc', '.docx', '.mp3')
Get-PSDrive -PSProvider FileSystem |
ForEach-Object {
Get-ChildItem -Path $_.Root -Recurse -ErrorAction SilentlyContinue |
Where-Object { $ExtensionList -contains $_.Extension } |
## ForEach-Object { $_.Name, $_.FullName, (Get-FileHash -LiteralPath $_.FullName).Hash }
Select-Object @{Name="Name";Expression={$_.Name}}, @{Name="Hash";Expression={(Get-FileHash -LiteralPath $_.FullName).Hash}}, @{Name="FullName";Expression={$_.FullName}} |
Export-Csv -Path C:\Temp\testing.csv -NoTypeInformation -Append
}
The addition of the file hash will allow you to see if you have duplicates and the full name will allow you to see where they are located.
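As a follow-on sketch (assuming the CSV produced by the snippet above), you can group the exported rows by the hash column to surface the duplicates:
Import-Csv -Path C:\Temp\testing.csv |
    Group-Object Hash |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group | Select-Object Name, FullName }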
I have to apply a command IF the folder size is greater than or equal to 600 MB.
I tried something like this:
$folders = Get-ChildItem d:\home -exclude *.*
function Get-Size
{
param([string]$pth)
"{0:n2}" -f ((gci -path $pth -recurse | measure-object -property length -sum).sum /1mb)
}
ForEach ($subFolder in $folders){
echo $subFolder | select-object fullname
$size = Get-Size $subFolder
echo $size
if ($size -gt "600") { echo "Not ok." }
else { echo "OK template." }
}
It doesn't work. It writes the right size of the folder, but the IF statement is not evaluated correctly. What am I doing wrong?
The simplest way is to use the FileSystemObject COM object:
function Get-FolderSize($path) {
(New-Object -ComObject 'Scripting.FileSystemObject').GetFolder($path).Size
}
I'd recommend against doing formatting in a Get-Size function, though. It's usually better to have the function return the raw size, and do calculations and formatting when you actually display the value.
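For instance (a sketch with a hypothetical path), keep the raw byte count and format only at the point of display:
$size = Get-FolderSize 'D:\home\SomeFolder'   # raw size in bytes
'{0:n2} MB' -f ($size / 1MB)                  # formatting applied only for output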
Use it like this:
Get-ChildItem 'D:\home' | Where-Object {
$_.PSIsContainer -and
(Get-FolderSize $_.FullName) -gt 600MB
}
or like this:
Get-ChildItem 'D:\home' | Where-Object {
$_.PSIsContainer
} | ForEach-Object {
if ((Get-FolderSize $_.FullName) -gt 600MB) {
'Not OK.'
} else {
'OK template.'
}
}
On PowerShell v3 and newer you can use Get-ChildItem -Directory instead of Get-ChildItem | Where-Object { $_.PSIsContainer }.
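For example, the first filter above then becomes:
Get-ChildItem 'D:\home' -Directory | Where-Object { (Get-FolderSize $_.FullName) -gt 600MB }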
When you compare using $size -gt "600", the left operand is a string (your Get-Size function returns a formatted string), so PowerShell performs a lexicographic string comparison rather than a numeric one. Hence the wrong results.
Compare numbers instead.
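A minimal sketch of that fix, reusing the question's names: have Get-Size return a number and compare it against a numeric literal:
function Get-Size
{
    param([string]$pth)
    # return the size in megabytes as a number; format only when displaying
    (Get-ChildItem -Path $pth -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB
}
$size = Get-Size $subFolder.FullName
if ($size -ge 600) { echo "Not ok." } else { echo "OK template." }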
Does anybody know a PowerShell 2.0 command/script to count all folders and subfolders (recursive; no files) in a specific folder (e.g. the number of all subfolders in C:\folder1\folder2)?
In addition, I also need the number of all "leaf" folders; in other words, I only want to count folders which don't have subfolders.
In PowerShell 3.0 you can use the -Directory switch:
(Get-ChildItem -Path <path> -Directory -Recurse -Force).Count
You can use Get-ChildItem -Recurse to get all the files and folders in the current folder.
Pipe that into Where-Object to filter down to only those items that are containers.
$files = get-childitem -Path c:\temp -recurse
$folders = $files | where-object { $_.PSIsContainer }
Write-Host $folders.Count
As a one-liner:
(get-childitem -Path c:\temp -recurse | where-object { $_.PSIsContainer }).Count
To answer the second part of your question, getting the leaf folder count: just modify the Where-Object clause to add a non-recursive listing of each directory, keeping only those that return a count of 0:
(dir -rec | where-object{$_.PSIsContainer -and ((dir $_.fullname | where-object{$_.PSIsContainer}).count -eq 0)}).Count
It looks a little cleaner if you can use PowerShell 3.0:
(dir -rec -directory | where-object{(dir $_.fullname -directory).count -eq 0}).count
Another option:
(ls -force -rec | % { [int]$_.PSIsContainer } | measure -Sum).Sum
This is a pretty good starting point:
(gci -force -recurse | where-object { $_.PSIsContainer }).Count
However, I suspect that this will include .zip files in the count. I'll test that and try to post an update...
EDIT: Have confirmed that zip files are not counted as containers. The above should be fine!
Get the path's child items with the -Recurse option, pipe into Where-Object to keep only containers, then pipe to Measure-Object to count them:
((get-childitem -Path $the_path -recurse | where-object { $_.PSIsContainer }) | measure).Count