Grep only newest directories - windows

I have a directory of applications, each containing a directory for every version of that application.
Applications
    MyApp
        MyApp_0.1
        MyApp_0.2
    MyOtherApp
        MyOtherApp_0.1
        MyOtherApp_0.2
        MyOtherApp_0.3
I want to grep this tree, but it takes too long and yields too many old matches, so I only want to check in the highest version of each application.
I'll accept answers using any built-in Windows tools, GNU tools, or PowerShell, but I'm not very familiar with PowerShell, so a non-PowerShell answer would be preferable.

$AppDir = 'C:\apps'
$Apps = Get-ChildItem -Path $AppDir -Directory
Foreach ($App in $Apps) {
    # Sort by the parsed version number so that, e.g., 0.10 sorts above 0.2
    $HighestVersion = Get-ChildItem -Path $App.FullName -Directory |
        Sort-Object { [Version]($_.Name -split '_')[1] } -Descending |
        Select-Object -First 1
    [PSCustomObject]@{
        App     = $App.Name
        Version = [Version]($HighestVersion.Name -split '_')[1] # Convert it to a version for easier comparisons
        #Version = ($HighestVersion.Name -split '_')[1]         # Alternative: keep it as a plain string
        Path    = $HighestVersion.FullName
    }
}
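To run the actual search, the objects emitted above can be captured in a variable and fed to Select-String (PowerShell's closest grep equivalent); a minimal sketch, with a placeholder pattern:
# Capture the loop output first, e.g. $Newest = Foreach ($App in $Apps) { ... }
$Newest | ForEach-Object {
    Get-ChildItem -Path $_.Path -Recurse -File |
        Select-String -Pattern 'YourSearchPattern'
}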

Get folder permissions with only 3 levels of subfolders

First of all: sorry for my bad english.
So, I need to create various reports with all permissions of specified folders.
After some searching I found two ways.
One is AccessEnum, which is almost perfect, but it doesn't export all permissions, only the folders whose permissions differ from the root folder's. And I need all of them, even the ones that match the root folder.
The second one, a PowerShell script, is better, but it has one weakness: it recurses far too deeply, and one of the folders produced an output report of 7 GB. Holy shirt.
What I need is to modify the script to go only three levels of subfolders deep, for example:
"C:\Folder1" contains various subfolders but I want the script to go deep only to "C:\Folder1\Folder2\Folder3\"
How can I do it?
This is the script:
dir -Recurse "C:\FOLDER" | where { $_.PsIsContainer } | % { $path1 = $_.fullname; Get-Acl $_.Fullname | % { $_.access | Add-Member -MemberType NoteProperty '.\Application Data' -Value $path1 -passthru }} | Export-Csv "C:\REPORT.csv"
Use Get-ChildItem instead. It has a -Depth parameter, and with -Directory you can include only folders.
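A minimal sketch of the modified script, assuming PowerShell 5.0 or later for -Depth (adjust the depth value to whatever counts as three levels in your layout; the 'Path' property name here is illustrative):
# Only directories, and only a limited number of levels below C:\FOLDER
Get-ChildItem -Path 'C:\FOLDER' -Directory -Recurse -Depth 3 |
    ForEach-Object {
        $path1 = $_.FullName
        (Get-Acl -Path $path1).Access |
            Add-Member -MemberType NoteProperty -Name 'Path' -Value $path1 -PassThru
    } |
    Export-Csv -Path 'C:\REPORT.csv' -NoTypeInformation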
Reference:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/get-childitem?view=powershell-7

Powershell Get-HotFix find updates supplied in a text file [closed]

I am working on updates for Win 7 x64 Ultimate. I have a text file in which I have typed KBnnnnn entries, one per line. I want a sort of script/loop to go through each entry in the text file and find it in the installed updates. If found, append to a new text file the HotFixID, Description, date, etc., and status='INSTALLED'. If not found, status='NOT INSTALLED'.
Later I want to selectively uninstall specific hotfixes by a similar loop process, reading each entry from a text file and uninstalling it, updating the status on screen and in another text log file. I am very new to PowerShell; I tried to create a loop in cmd batch scripting using WMIC but have had no success yet.
Why are you not using WSUS for this? It is why it exists.
There are modules on powershellgallery.com for this kind of use case as well.
Find-Module -Name '*WSUS*' | Format-Table -AutoSize
Version Name Repository Description
------- ---- ---------- -----------
2.3.1.6 PoshWSUS PSGallery PowerShell module to manage a WSUS Server. Support site: https://github.com/proxb/PoshWSUS/
1.1.0 ecs.wsus PSGallery This Windows PowerShell module contains ECS.WSUS funtions
0.4.4 PSWsusSpringClean PSGallery Give your WSUS server a thorough spring cleaning
0.9.0 PSWSUSMigration PSGallery Powershell module to help WSUS (Windows Server Update Services) server migration. Support site: https://github.com/reiikei/PSWSUSMigration
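If one of those modules fits, it can be installed straight from the gallery, for example (assuming PowerShellGet is available on your system):
Install-Module -Name PoshWSUS -Scope CurrentUser
Import-Module -Name PoshWSUS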
I am working on updates for Win 7 x64 ultimate.
So, what version of PowerShell are you using on Win7?
I have a text file in which I have typed KBnnnnn one entry per line.
OK, a standard file that can be easily read using Import-Csv or Get-Content. Yet, why are you doing this? There is a cmdlet called Get-HotFix specifically for this.
# All Help topics and locations
Get-Help about_*
Get-Help about_Functions
Get-Help about* | Select Name, Synopsis
Get-Help about* |
Select-Object -Property Name, Synopsis |
Out-GridView -Title 'Select Topic' -OutputMode Multiple |
ForEach-Object { Get-Help -Name $_.Name -ShowWindow }
explorer "$pshome\$($Host.CurrentCulture.Name)"
# Get parameters, examples, full and Online help for a cmdlet or function
# Get a list of all functions
Get-Command -CommandType Function |
Out-GridView -PassThru -Title 'Available functions'
# Get a list of all commandlets
Get-Command -CommandType Cmdlet |
Out-GridView -PassThru -Title 'Available cmdlets'
# get function / cmdlet details
Get-Command -Name Import-Csv -Syntax
(Get-Command -Name Import-Csv).Parameters.Keys
Get-help -Name Import-Csv -Full
Get-help -Name Import-Csv -Online
Get-help -Name Import-Csv -Examples
Get-Command -Name Get-Content -Syntax
(Get-Command -Name Get-Content).Parameters.Keys
Get-help -Name Get-Content -Full
Get-help -Name Get-Content -Online
Get-help -Name Get-Content -Examples
Get-Command -Name Get-Hotfix -Syntax
(Get-Command -Name Get-Hotfix).Parameters.Keys
Get-help -Name Get-Hotfix -Full
Get-help -Name Get-Hotfix -Online
Get-help -Name Get-Hotfix -Examples
I want sort of a script/loop
Sure, you can do this.
About ForEach
Each of the above help files has examples of loops.
to go through each entry in text file and find it in installed
updates.
OK, this is a common thing and a very beginner-level PowerShell task, with lots of articles, samples, and videos all over the web, as well as in the help resources shown above.
If found append a new text file with HotFixID, Description, date etc.,
and status='INSTALLED'. If not found, status='NOT INSTALLED'
Again, nothing new or complicated here; this is a very common thing, done via the -Append switch or the Add-Content cmdlet.
Get-Command -Name Add-Content -Syntax
(Get-Command -Name Add-Content).Parameters.Keys
Get-help -Name Add-Content -Full
Get-help -Name Add-Content -Online
Get-help -Name Add-Content -Examples
Later I want to selectively uninstall specific HotFixes by a similar loop
process reading each entry from a text file and uninstalling it,
Again, nothing new or complicated here and again a very common thing. You do this via a comparison block/command in your code.
About Comparison Operators
updating status on screen and in another text log file.
Again, nothing new or complicated here; this is a very common thing. It is what Out-File, Export-Csv, Start-Transcript, or writing your own logger is for, along with progress bars. There are lots of articles, blogs, and videos on how to do this.
Script Write-Log PowerShell Logging Function
Get-Command -Name Write-Progress -Syntax
(Get-Command -Name Write-Progress).Parameters.Keys
Get-help -Name Write-Progress -Full
Get-help -Name Write-Progress -Online
Get-help -Name Write-Progress -Examples
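As a quick illustration, a progress bar around a loop over a list of KB IDs might look like this (purely a sketch; the file path is illustrative):
# Show progress while walking a list of KB IDs read from a text file
$KBs = @(Get-Content -Path 'D:\Temp\KBCheckList.txt' | Where-Object { $_ })
for ($i = 0; $i -lt $KBs.Count; $i++) {
    Write-Progress -Activity 'Checking hotfixes' -Status $KBs[$i] -PercentComplete ((($i + 1) / $KBs.Count) * 100)
    # ... lookup / logging logic for $KBs[$i] goes here ...
}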
I am very new to PowerShell, tried to create a loop in cmd batch
scripting using WMIC but no success yet.
OK, this is fine. It means you should spend time learning it first, and there are plenty of free text-based and video-based resources (YouTube videos, MSDN videos, etc.) for you to use. All it requires is that you search for them and use them as-is or tweak them as needed.
'powershell windows hotfix management'
'Beginning PowerShell'
'Intermediate PowerShell'
'Advanced PowerShell'
'PowerShell file and folder management'
'PowerShell hotfix report'
Sample scripts
'powershellgallery.com hotfix management'
'wmic find hotfix'
'vbscript wmic hotfix management'
The question here is, why are you using WMIC vs. PowerShell directly? Hence the cmdlets above. One can use WMIC without ever using PowerShell at all, in .bat/cmd/vbs files, as has been done for years.
You say you've done batch file programming. It's good to see you dip into the PowerShell pool, but that does not mean you can't stick with batch to do what you need and then convert it to PowerShell now or later.
Update based on your code comment
If you did this in the console/ISE/VSCode, it just works as it would from cmd.exe
wmic qfe get hotfixid > d:\temp\QfElist.txt
Get-content -Path 'd:\temp\QfElist.txt'
<#
Results
KB4537572
KB4513661
...
#>
But you could have just done this and gotten something far more useable
Get-HotFix
<#
# Results
Source Description     HotFixID  InstalledBy         InstalledOn
------ -----------     --------  -----------         -----------
LP70   Update          KB4537572 NT AUTHORITY\SYSTEM 11-Mar-20 00:00:00
LP70   Update          KB4513661                     09-Sep-19 00:00:00
LP70   Security Update KB4515383                     09-Sep-19 00:00:00
...
#>
Comparing this against a file is just as simple. Let's say your file looks like this.
$KBCheckList = '
KB4537572
KB4513661
KB4515400
' | Out-File -FilePath 'D:\Temp\KBCheckList.txt'
Now read the file, using this
Get-Content -Path 'D:\Temp\KBCheckList.txt'
<#
# Results
KdId
KB4537572
KB4513661
KB4515400
#>
or this
Import-Csv -Path 'D:\Temp\KBCheckList.txt'
<#
# Results
WARNING: One or more headers were not specified. Default names starting with "H" have been used in place of any missing headers.
H1
--
KB4537572
KB4513661
KB4515400
#>
You can see the difference is minor (visually), but a CSV file needs a header (and really should be properly formatted first). Either add it to the top of the file or add it on the fly:
Import-Csv -Path 'D:\Temp\KBCheckList.txt' -Header 'KBID'
<#
# Results
KBID
----
KB4537572
KB4513661
KB4515400
#>
All the above is just educational stuff for you. You only really need one of the two below, or similar.
Now just use the file. Read it in a loop and use an if/then or try/catch statement to get results:
Import-Csv -Path 'D:\Temp\KBCheckList.txt' -Header 'KBID' |
ForEach {
$PSItem.KBID
}
<#
# Results
KB4537572
KB4513661
KB4515400
#>
or just compare the file list to the results of the cmdlet
$QfeData = Get-Hotfix
$KBCheckList = Import-Csv -Path 'D:\Temp\KBCheckList.txt' -Header 'KBID'
Compare-Object -ReferenceObject $QfeData.HotFixID -DifferenceObject $KBCheckList.KBID
<#
# Results
InputObject SideIndicator
----------- -------------
KB4515400 => *** this means that this ID is only in the DifferenceObject which is your file, thus not installed.
KB4515383 <=
KB4516115 <=
KB4517245 <=
KB4521863 <=
KB4524244 <=
KB4524569 <=
KB4525419 <=
KB4528759 <=
KB4537759 <=
KB4538674 <=
KB4541338 <=
KB4551762 <=
#>
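Putting the pieces together, a minimal sketch of the requested status report, assuming the same check-list file and an illustrative output path:
# Mark each KB in the list as INSTALLED / NOT INSTALLED and append to a report file
$Installed = Get-HotFix
Import-Csv -Path 'D:\Temp\KBCheckList.txt' -Header 'KBID' |
ForEach-Object {
    $KB    = $_.KBID
    $Match = $Installed | Where-Object { $_.HotFixID -eq $KB }
    if ($Match) {
        "$($Match.HotFixID), $($Match.Description), $($Match.InstalledOn), INSTALLED"
    }
    else {
        "$KB, , , NOT INSTALLED"
    }
} | Add-Content -Path 'D:\Temp\KBStatusReport.txt'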

What is the most efficient way of getting a list of files utilizing Get-ChildItem

I have created a list of PowerShell commands for getting over 500,000 rows of directories. The goal is to get a list of the files in each of the directories specified in the PowerShell command. My syntax works perfectly if I run a small batch, but there are definite performance issues when running them in bulk. One thing I noticed is that if I run all 500,000 rows together, I get extremely high usage (about 12 GB, 97% of memory) and it takes a while to even begin generating a CSV file. Please see my code below.
I was thinking I could get the list of directories I need into a CSV, and from researching around here, I know I can use the CSV as a variable with a foreach. But I am stumped on putting all of that together.
Get-ChildItem -Path \\MYIP\ARCHIVE\ArchiveVolumes\UniqueID\ -Exclude *.wav*, *.md5*, *.abc -Recurse |
Select-Object FullName |
Add-Member -MemberType NoteProperty -Name Myfield -Value 123456 -PassThru |
Export-Csv -Append -Path C:\mypath\fileslist.csv -Encoding ascii -NoTypeInformation
I'm hoping that I can better utilize what I am running here as I am still learning PowerShell. Any ideas?
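One way to put those pieces together, sketched under the assumption of a directories.csv with a single FolderPath column (both names are illustrative), is to stream each directory's results straight to the output file so nothing large accumulates in memory:
# Assumed input: C:\mypath\directories.csv with a 'FolderPath' column listing each directory to scan
Import-Csv -Path C:\mypath\directories.csv | ForEach-Object {
    Get-ChildItem -Path $_.FolderPath -Exclude *.wav*, *.md5*, *.abc -Recurse |
        Select-Object FullName |
        Add-Member -MemberType NoteProperty -Name Myfield -Value 123456 -PassThru |
        Export-Csv -Append -Path C:\mypath\fileslist.csv -Encoding ascii -NoTypeInformation
}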

Checking whether a file is readable and regular in PowerShell

I'm new to PowerShell and I want to check whether a file is readable and regular. In Unix we can do it in one line using -f and -r. For example, the following shell script function accepts a filename as an argument and checks the readability and regularity of the file; what's the PowerShell equivalent of this?
_ChkRegularFile_R() # checks whether a file is regular and readable
{
    _CRFRfilename=$1   # name of the file to be checked
    _CRFRsts=1         # default is error
    if [ -f "$_CRFRfilename" ]
    then
        if [ -r "$_CRFRfilename" ]
        then
            _CRFRsts=0 # success: regular file is readable
        fi
    fi
    return $_CRFRsts
}
To test if a file is readable, you try to open it. If you get an error, then it's not readable. You need to either trap or catch exceptions or stop on errors, as appropriate. Remember, Windows locks files that are open for writing, so applications need to expect that they sometimes can't open a file.
If you absolutely have to, you can use something like this to test if you can read a file:
try {
[System.IO.File]::OpenRead($FullPathName).Close()
$Readable = $true
}
catch {
$Readable = $false
}
And this to test if you can write to a file:
try {
[System.IO.File]::OpenWrite($FullPathName).Close()
$Writable = $true
}
catch {
$Writable = $false
}
That logic is fairly easy to wrap into a function if you really need it.
As far as file types, nearly everything in the file system in Windows is a plain file or a directory, since Windows doesn't have the "everything is a file" convention. So, normally you can test as follows:
# Test if file-like
Test-Path -Path $Path -PathType Leaf
# Test if directory-like
Test-Path -Path $Path -PathType Container
If you're working with a FileInfo or DirectoryInfo object (i.e., the output of Get-Item, Get-ChildItem, or a similar object representing a file or directory) you'll have the PSIsContainer property which will tell you if the item is a file or a directory.
That covers probably 99.999% of cases.
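Wrapping those two checks into a PowerShell counterpart of the shell function in the question, as a minimal sketch (the function name is illustrative):
function Test-RegularReadableFile {
    param([string]$Path)

    # 'Regular' here means: the path exists and is a leaf item, not a container
    if (-not (Test-Path -Path $Path -PathType Leaf)) { return $false }

    # 'Readable' means: we can actually open it for reading
    try {
        [System.IO.File]::OpenRead((Resolve-Path -Path $Path).ProviderPath).Close()
        return $true
    }
    catch {
        return $false
    }
}
# Usage: Test-RegularReadableFile -Path 'C:\some\file.txt'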
However, if you need to know if something is an NTFS hard link to a file (rare, but oldest), an NTFS junction to a directory, an NTFS symlink, an NTFS volume mount point, or any type of NTFS reparse point, it gets much more complicated. [This answer does a good job describing the first three.]
Let's create a simple NTFS folder to test with:
# Create a test directory and change to it.
New-Item -Path C:\linktest -ItemType Directory | Select-Object -ExpandProperty FullName | Push-Location
# Create an empty file
New-Item -Path .\file1 -ItemType file -Value $null | Out-Null
New-Item -Path .\file2 -ItemType file -Value $null | Out-Null
# Create a directory
New-Item -Path .\dir1 -ItemType Directory | Out-Null
# Create a symlink to the file
New-Item -ItemType SymbolicLink -Path .\sfile1 -Value .\file1 | Out-Null
# Create a symlink to the folder
New-Item -ItemType SymbolicLink -Path .\sdir1 -Value .\dir1 | Out-Null
# Create a hard link to the file
New-Item -ItemType HardLink -Path .\hfile1 -Value .\file1 | Out-Null
# Create a junction to the folder
New-Item -ItemType Junction -Path .\jdir1 -Value .\dir1 | Out-Null
# View the item properties
Get-ChildItem -Path . | Sort-Object Name | Format-Table -Property Name, PSIsContainer, LinkType, Target, Attributes -AutoSize
Your output will be:
Name   PSIsContainer LinkType     Target               Attributes
----   ------------- --------     ------               ----------
dir1            True              {}                   Directory
file1          False HardLink     {C:\linktest\hfile1} Archive
file2          False              {}                   Archive
hfile1         False HardLink     {C:\linktest\file1}  Archive
jdir1           True Junction     {C:\linktest\dir1}   Directory, ReparsePoint
sdir1           True SymbolicLink {C:\linktest\dir1}   Directory, ReparsePoint
sfile1         False SymbolicLink {C:\linktest\file1}  Archive, ReparsePoint
Note that both file1 and hfile1 are hard links, even though file1 wasn't created as such.
To clean up the above garbage, do:
Get-ChildItem -Path C:\linktest\ | ForEach-Object { $_.Delete() }
There's a bug in Remove-Item with deleting some container links which prevents the command from removing the items.
The general solution would be to get the item and test it:
# Get the item. Don't use Get-ChildItem because that will get a directory's contents
$Item = Get-Item -Path $Path
# Is it a container
$Item.PSIsContainer
# Is it a link of some kind? (LinkType is empty for ordinary files and folders)
-not [System.String]::IsNullOrWhiteSpace($Item.LinkType)
$Item.LinkType -eq 'Junction'
# Is it a Reparse Point?
($Item.Attributes -band [System.IO.FileAttributes]::ReparsePoint) -eq [System.IO.FileAttributes]::ReparsePoint
There are several other potential attributes, too:
PS> [System.Enum]::GetNames([System.IO.FileAttributes])
ReadOnly
Hidden
System
Directory
Archive
Device
Normal
Temporary
SparseFile
ReparsePoint
Compressed
Offline
NotContentIndexed
Encrypted
IntegrityStream
NoScrubData
Note that Device is documented as reserved for future use. Ain't no device file type in Windows.
For volume mount points, I'm not 100% sure how those look. I know you can create them on Windows 8.1 and later with Get-Partition followed by an appropriate Add-PartitionAccessPath, but I'm on Windows 7 currently. I'm afraid I have no means of testing this at the moment.
Finally, I have no idea how exactly PowerShell Core 6.0 on Linux handles file types.
Soooo...
This is not something I regularly do, but if memory serves, in *nix a regular file is one that contains data, as opposed to a directory or a special file.
Again, not something I do or have to worry about under normal PoSH stuff.
So you are testing for whether the object is a writable file (and/or non-zero), or a directory, or binary?
So, in PoSH, prior to v3... you do something like this...
$IsDir = {$_.PsIsContainer}
$IsFile = {!$_.PsIsContainer}
ls D:\Temp | Where $IsDir
    Directory: D:\Temp
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 1/4/2018 2:31 PM ArchiveDestination
d----- 1/4/2018 1:40 PM ArchiveSource
d----- 1/1/2018 3:34 PM diff
...
ls D:\Temp | Where $IsFile
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 6/7/2017 5:28 PM 512 CombinedSources07Jun2017.txt
-a---- 2/24/2018 6:29 PM 115 EmpData.csv
-a---- 11/18/2017 6:47 PM 11686 fsoVolume.docx
...
PoSH V3 and higher. This is supported natively e.g.:
ls -directory
ls -ad
ls -file
ls -af
Of course any of the above can be set to just return true or false using if/then or try/catch.
If all the above is a bit more typing than you'd like then you can create your own function and give it whatever alias you choose, well, as long as it's not an alias already in use.
See the help files ...
# Get parameters, examples, full and Online help for a cmdlet or function
(Get-Command -Name Get-ChildItem).Parameters
Get-help -Name Get-ChildItem -Examples
Get-help -Name Get-ChildItem -Full
Get-help -Name Get-ChildItem -Online
Get-Help about_*
Get-Help about_Functions
Get-Alias -Definition Get-ChildItem
# Find all cmdlets / functions with a target parameter
Get-Help * -Parameter Append
# All Help topics locations
explorer "$pshome\$($Host.CurrentCulture.Name)"
Of course you can check / modify file attributes as well. See this article on the topic:
File Attributes in PowerShell
Fun with file and folder attributes, via PowerShell and the DIR command.
https://mcpmag.com/articles/2012/03/20/powershell-dir-command-tricks.aspx
So, you could do something like this, to achieve the same attribute check
Get-ChildItem -Path $FilePath -File -Force | Where {$_.Attributes -notmatch 'ReadOnly'}
Or a function with an alias:
Function Test-RegularFile
{
    [CmdletBinding()]
    [Alias('trf')]
    Param
    (
        [string]$FilePath
    )

    try
    {
        $file = Get-ChildItem -Path $FilePath -File -Force -ErrorAction Stop |
                Where-Object { $_.Attributes -notmatch 'ReadOnly' }
        if ($file)
        {
            "$FilePath is a regular file" # success: regular file is readable
        }
        else
        {
            Write-Warning -Message "$FilePath is not a regular file."
        }
    }
    catch
    {
        Write-Warning -Message "$FilePath is not a regular file."
    }
}
trf -FilePath D:\Temp\fsoVolume.txt
Since you are new to PoSH, it's really important that you get a base understanding before looking at conversion comparisons.
See this post for folks providing some paths for learning PowerShell.
https://www.reddit.com/r/PowerShell/comments/7oir35/help_with_teaching_others_powershell
To test whether it's a regular file:
Test-Path -PathType Leaf foo.txt
To test whether it's marked read-only (the 'r' in Mode means ReadOnly; to test actual readability, try opening the file as shown above):
Get-ChildItem foo.txt | ? { $_.Mode -match 'r'}
To test whether it's hidden:
Get-ChildItem -force foo.txt | ? { $_.Mode -match 'h'}

Remote Powershell parsing

I'm trying to retrieve a parsed list of various information regarding remote executables within a Windows domain. Permissions are taken care of and the individual PowerShell commands are working; my issue is outputting this recursive list to a file (putting it all together properly):
My desired Output (per computer):
computer_name.csv # Filename
$application1Name.exe, $application1Version, $application1LastModifiedDateMMDDYY, $application1MD5HASH
$application2Name.exe, $application2Version, $application2LastModifiedDateMMDDYY, $application2MD5HASH
...
So far I have all the pieces:
#A way to recursive retrieve executables from a given remote path (Name + LastModified):
Get-ChildItem \\192.168.X.X\C$\defaultPath\FoldersAndSubfoldersWithExecutables\ -Include *.exe -Recurse | ForEach-Object {$_.Name, $_.LastWriteTime} > C:\LOCALPATH\output.txt
#A way to retrieve the version info from remote executables (Version):
[System.Diagnostics.FileVersionInfo]::GetVersionInfo("\\192.168.X.X\C$\defaultPath\application1.exe").FileVersion
#A way to retrieve the MD5 Hash from remote executable files (MD5HASH):
get-FileHash \\192.168.X.X\C$\defaultPath\application1.exe -Algorithm MD5 | ForEach-Object { $_.Hash }
My issue is building the script structure to accommodate the desired output listed above. I have a list of IP addresses to loop this script through, but I'm having issues connecting the dots..
Thanks!
Each operation you listed can be executed within the ForEach-Object loop, and a resultant csv string containing all the necessary data points can be built using string interpolation.
Get-ChildItem \\192.168.x.x\C$\defaultPath\FoldersAndSubfoldersWithExes\ -Include *.exe -Recurse | ForEach-Object {
$Name = $_.Name
$LastWriteTime = $_.LastWriteTime
$Version =[System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
$Hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
"$Name, $Version, $LastWriteTime, $Hash"
} | Out-File computerName.csv
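To get one CSV per computer, the same pipeline can be wrapped in a loop over the address list; a sketch, assuming a plain-text computers.txt with one IP per line (the file names and local output path are illustrative):
# Assumed input: C:\LOCALPATH\computers.txt, one IP address per line
Get-Content -Path 'C:\LOCALPATH\computers.txt' | ForEach-Object {
    $Computer = $_
    Get-ChildItem "\\$Computer\C$\defaultPath\FoldersAndSubfoldersWithExes\" -Include *.exe -Recurse | ForEach-Object {
        $Version = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($_.FullName).FileVersion
        $Hash    = (Get-FileHash $_.FullName -Algorithm MD5).Hash
        "$($_.Name), $Version, $($_.LastWriteTime), $Hash"
    } | Out-File "C:\LOCALPATH\$Computer.csv"
}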
