Powershell "LastWriteTime" not working - sorting

Is there a way to stop PowerShell from sorting by default? I need to read files in from a directory, and the order in which they are listed in the directory needs to be preserved in the array (variable). Even when I use -LastWriteTime on the Get-ChildItem command, it seems to have no effect. The primary reason I want to do this is that the files all have the same name except for a number at the end, like the following:
document1.doc
document2.doc
document3.doc
.....
document110.doc
The problem is that if it's sorted by name, it will sort in this manner:
document1.doc
document10.doc
document100.doc
Which is horribly wrong!
Right now I have this command and it doesn't work:
$filesnames1 = get-childItem -name *.doc -Path c:\fileFolder\test | sort-object -LastWriteTime

You probably want something more along these lines.
$filesnames1 = Get-ChildItem -Path c:\fileFolder\test\*.doc |
Sort-Object -Property LastWriteTime
Neither of those cmdlets has a -LastWriteTime parameter; Sort-Object takes the property name via -Property.
If you need only the names from those filesystem objects, you can use ($filesnames1).Name after the code above. There are other ways.

Thanks for responding, Mike. What I did was put a -Filter (*.doc) just before -Path, which gave me the file objects. Then I piped into a Select-Object -ExpandProperty Name to list it exactly how I needed. It was a little trial and error, but I did eventually figure it out.
$filesnames1 = Get-ChildItem -Filter *.doc -Path c:\fileFolder\test |
Sort-Object -Property LastWriteTime | Select-Object -ExpandProperty Name

Related

Is there any way to find 'metadata' on folders using powershell

So I want to know if any of the folders in a directory have any subfolders or files in them. I tried just looking at the directory in PowerShell, but it gave me only Mode, LastWriteTime, and Name. Is there any way of adding to this list to include folder metadata, like size or the number of subfiles/folders? All I want to know is whether they are empty or not, so there may be a simpler way I'm missing.
Thanks for any help you can give!
I see the question is tagged 'windows', so on Windows you could also use a COM object.
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($pathToFolder)
$folder will be an object with a bunch of interesting metadata on it, including SubFolders and Files. One of the interesting ones is Size. If Size is zero, there are no files in that directory, or in any nested subdirectories either.
If you just want to know if there are folders/subfolders and/or files then this will work:
$folder="C:\Test"
Get-ChildItem $folder -Recurse | Measure-Object
Output (in my case)
Count : 2
Average :
Sum :
Maximum :
Minimum :
Property :
If you want to see more properties then this might work for you:
Get-ChildItem -Path $folder -Recurse | Format-List *
Alternatively, you can also select only the first x or last x items, or even skip items:
Get-ChildItem -Path $folder -Recurse | Select-Object -First 2 | Format-List *
* -Recurse will check all folders below the given path

Get formatted output of file details in a folder

I want to retrieve certain details about all files in a given folder.
Get-ItemPropertyValue .\*.dll -name versioninfo
Gives me output like this:
That isn't bad, but I want to include some other properties, and the -Include switch doesn't work like I thought.
And giving it -Name versioninfo, lastwritetime, for example, doesn't add another column to the list; it prints the date underneath:
How can I bring all the read properties of one file into the same row (add a column)?
EDIT:
I am aware of Format-List, but it's not giving me the wide list output, and Format-Wide only accepts one single property...
How about this?
Get-Item .\*.dll | Select-Object `
    @{N='ProductVersion';E={Get-ItemPropertyValue $_ -Name versionInfo | Select-Object -ExpandProperty ProductVersion}},
    @{N='FileVersion';E={Get-ItemPropertyValue $_ -Name versionInfo | Select-Object -ExpandProperty FileVersion}},
    Name,
    LastWriteTime
I found an easier, more readable way to do this, and still generate the output when run inside a script:
Get-Childitem .\ThirdPartyComponents\*.dll | select name, lastwritetime, @{l="ProductVersion";e={$_.VersionInfo.ProductVersion}}, @{l="FileVersion";e={$_.VersionInfo.FileVersion}} | ft
The last pipe to | ft (Format-Table) is needed because the command does not generate any output when run inside a script otherwise. Why that is, I'm not exactly sure...

In PowerShell, how can I extract a file from HKEY_Users from all SIDs?

I am writing a PowerShell module to look for data that each user who has logged onto the computer at some point might have in their directory in HKEY_USERS. My initial thought was to mount HKEY_USERS, find a way to store each user's SID in a string variable, and then loop through all folders like so:
dir HKU\<STRING VARIABLE HOLDING SID>\Software\MyApp\Mydesireddata
Is there a way I can avoid having to loop through SIDs (because I won't know them ahead of time), and extract that file info from each SID on the system while remembering which SID it came from?
EDIT: Here is an example of the key I'm trying to extract from each user's SID using regedit (vncviewer's EulaAccepted)
Use Get-ChildItem to retrieve each user-specific subkey:
$UserHives = Get-ChildItem Registry::HKEY_USERS\ |Where-Object {$_.Name -match '^HKEY_USERS\\S-1-5-21-[\d\-]+$'}
Then loop over each entry and retrieve the desired registry value:
foreach($Hive in $UserHives)
{
    # Construct path from base key
    $Path = Join-Path $Hive.PSPath "SOFTWARE\MyApp\DataKey"

    # Attempt to retrieve Item property
    $Item = Get-ItemProperty -Path $Path -Name ValueName -ErrorAction SilentlyContinue

    # Check if item property was there or not
    if($Item)
    {
        $Item.ValueName
    }
    else
    {
        # doesn't exist
    }
}
I tackled this issue a slightly different way; preferring to make use of a conspicuously placed wildcard.
Get-ItemProperty -Path Registry::HKEY_USERS\*\SOFTWARE\TestVNC\viewer\ -Name EulaAccepted |
    Select-Object -Property @{n="SID";e={$_.PSPath.Split('::')[-1].Split('\')[1]}},EulaAccepted
The wildcard will automatically check all available paths and return what you need as well as the SID from the parent path.
As for the username (which is probably more useful than a SID), you didn't specifically ask for it, but I added it in for grins; this should cover local and domain accounts.
mind the line breaks
Get-ItemProperty -Path Registry::HKEY_USERS\*\SOFTWARE\TestVNC\viewer\ -Name EulaAccepted |
    Select-Object -Property @{n="SID";e={$_.PSPath.Split('::')[-1].Split('\')[1]}},EulaAccepted |
    Select-Object -Property @{n="User";e={[System.Security.Principal.SecurityIdentifier]::new($_.SID).`
        Translate([System.Security.Principal.NTAccount]).Value}},SID,EulaAccepted
Getting the username is just ugly; there's likely a cleaner way to get it, but that's what I have in my head. The double Select-Object really makes my skin crawl; there's something unpleasant about it. I could do it in one shot, but then it gets so unwieldy that you don't even know what you're doing by looking at it.
I've included a screenshot of the registry below, and a screenshot of the screen output from running the few lines.

PowerShell - Determine the existence of certain files in a folder hierarchy efficiently

I'm looking to thin down how many folders I need to recover after a CryptoLocker outbreak at a client's site, and I started looking into PowerShell as a good way to do this. What I need to do is recover a folder if it has any file inside with the extension .encrypted.
I can run the below
get-childitem C:\ -recurse -filter "*.encrypted" | %{$_.DirectoryName} | Get-Unique
And get a list of all folders that have .encrypted files in them. What I would like to do is thin down the list. For example, take the below folder list, and assume * means the folder contains encrypted files.
C:\Folder1
C:\Folder1\Folder2\Folder4*
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
C:\Folder1\Folder3\Folder6\
rather than returning
C:\Folder1\Folder2\Folder4*
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
I would like it just to return the following, as this would be the optimal recovery option:
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
I know this is a fairly complex problem, so I'm not asking anyone to solve it for me; just some pointers in the right direction would be awesome, as my brain is fried at the moment and I need to write this fairly quickly.
Here's a simple way to do this that should be pretty efficient:
PS C:\> dir -ad -rec | where { test-path (join-path $_.FullName *.encrypted) }
dir is an alias for get-childitem
where is an alias for where-object
-ad means return directories only
-rec means recurse
test-path returns $true if the path exists (yes, it handles wildcards)
So, we recurse through all folders, forwarding each folder object down the pipeline. We get the full name of the folder and append *.encrypted to it. If test-path returns $true for this path, we forward the folder down the pipeline, and it ends up in the console output.
Now, if you want to get a little fancier, here's a more fleshed-out one-liner that will report the folders and the encrypted-file counts into a CSV file named after the machine:
dir -ad -rec | ? { test-path (join-path $_.FullName *.encrypted) } | % {
    [pscustomobject]@{"Path"=$_.fullname;"Count"=(dir (join-path $_ *.encrypted)).count}} |`
    Export-Csv "c:\temp\$(hostname).csv" -NoTypeInformation
(? and % are aliases for where-object and foreach-object respectively)
With a little more effort, you could use a fan-out scan of the entire company assuming powershell remoting is enabled on each target machine and have it return all results to you from all machines.
Good luck!
This is too much for a comment, but I don't know that it would be a good answer, just a kind of hackish way to get it done...
The only thing I could think of is to get your list of folders, then start matching them all against each other, and when you get two that at least partially match remove the longer one.
$FullList = GCI C:\ -Recurse -Filter *.encrypted | Select -Expand DirectoryName -Unique | Sort -Property Length
$ToRemove = @()
foreach($Folder in $FullList){$ToRemove += $FullList | Where{$_ -ne $Folder -and ($_ -match [regex]::Escape($Folder))}}
$FinalList = $FullList | Where{$ToRemove -notcontains $_}
That's going to be slow, though; there has to be a better way to do it. I just haven't thought of one yet.
Don't get me wrong, this will work, and it's certainly faster than going through things by hand, but I'm sure there's a better approach out there.

How can I use PowerShell's -filter parameter to exclude values/files?

I'm currently running a PowerShell (v3.0) script, one step of which is to retrieve all the HTML files in a directory. That works great:
$srcfiles = Get-ChildItem $srcPath -filter "*.htm*"
However, now I'm faced with having to identify all the non-HTML files...CSS, Word and Excel docs, pictures, etc.
I want something that would work like the -ne operator in conjunction with the -filter parameter. In essence: give me everything that's not "*.htm*".
-filter -ne doesn't work, I tried -!filter on a whim, and I can't seem to find anything in the PowerShell documentation on MSDN about negating the -filter parameter. Perhaps I need to pipe something...?
Does anyone have a solution for this?
-Filter is not the right way. Use the -exclude parameter instead:
$srcfiles = Get-ChildItem $srcPath -exclude *.htm*
-exclude accepts a string[] type as input, so you can exclude more than one extension/file type, as follows:
$srcfiles = Get-ChildItem $srcPath -exclude *.htm*,*.css,*.doc*,*.xls*
..And so on.
I am a little newer to PowerShell, but could you pipe to Where-Object?
$srcfiles = Get-ChildItem $srcPath | Where-Object {$_.Extension -notlike ".htm*"}
(Extension is the right property name here; note that -notlike is needed rather than -ne, since -ne would compare against the literal string instead of treating it as a wildcard.)
