How do you keep track of changes in the registry using PowerShell? - ruby

I have a Ruby script that checks a few registry keys that act as flags for our application. If a flag has been 'on' (DWORD value 1) for longer than 14 days, the script turns it off (0) and sends out an email.
To track this, on its first run the script creates a local DB and stores the flag name, date/time, and so on for every flag that is on in the registry. On the next run, it checks this DB to find out whether a flag has been on for more than 14 days.
I wanted to know if there is a better way to handle this using PowerShell, or should I create a local DB and follow the same flow?
Further info:
So I will have different accounts under 'HKEY_LOCAL_MACHINE\SOFTWARE\Company\Clients', and each client will have a Debug key that holds the flags, as seen below:
HKEY_LOCAL_MACHINE\SOFTWARE\Company\Clients
  \Client1\Debug
    IsEmail
    IsShip
  \Client2\Debug
    IsEmail
    IsPack
So I need to iterate through each client and perhaps create an XML/JSON file or DB with the client name, the flags defined, and the current date/time. During a later run, I should load this info, compare it against the registry entries, and turn off any flags that have been on for longer than a specified number of days, using the timestamps saved earlier.
Update 1:
A few updates based on my updated understanding of how PowerShell scripts work. I will have a master list of the flags I need, for example $MasterFlagList = @('IsEmail','IsShip','IsNew'). During the first run of the application, I would like to create a JSON string like @tukan mentioned in this answer, or an XML file that looks something like this:
<xml>
  <Client1>
    <IsEmail>
      <value>1</value>
      <time>*current time*</time>
    </IsEmail>
    <IsShip>
      <value>1</value>
      <time>*current time*</time>
    </IsShip>
    <IsPack>
      <value>0</value>
      <time>*current time*</time>
    </IsPack>
  </Client1>
  <Client2>
    <IsEmail>
      <value>0</value>
      <time>*current time*</time>
    </IsEmail>
    <IsShip>
      <value>0</value>
      <time>*current time*</time>
    </IsShip>
    <IsPack>
      <value>0</value>
      <time>*current time*</time>
    </IsPack>
  </Client2>
</xml>
So basically the XML will contain every flag from the master list; where a flag has an entry in the registry, that value (0 or 1) is loaded, and flags not present in the registry default to 0.
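A rough sketch of the defaulting step I have in mind (the paths and the master list are just my examples):
# Hypothetical: every flag from the master list gets a record; the registry
# value wins when it exists, otherwise the default is 0.
$MasterFlagList = @('IsEmail','IsShip','IsPack')
$debug = Get-ItemProperty 'HKLM:\SOFTWARE\Company\Clients\Client1\Debug'
$record = @{}
foreach ($flag in $MasterFlagList) {
    $value = if ($null -ne $debug.$flag) { $debug.$flag } else { 0 }
    $record[$flag] = @{ value = $value; time = (Get-Date).ToString('o') }
}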
On the next run, I will load this XML from disk into an object, and for each client check whether the XML already has a record; if so, I check the date/time to see whether the flag has been on for longer than the allowed period, and if it has, I turn the flag off.
A better way to implement the idea is also welcome. I am new to PowerShell, which is why I thought of this approach.
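For illustration, a minimal sketch of the second-run check, assuming I save the state as JSON rather than XML (the file name and the 14-day window are placeholders, and the time must be stored in a format [datetime] can parse, e.g. (Get-Date).ToString('o')):
# Load the saved state and turn off flags that have been on too long
$state = Get-Content 'C:\flagstate.json' -Raw | ConvertFrom-Json
foreach ($client in $state.PSObject.Properties) {
    foreach ($flag in $client.Value.PSObject.Properties) {
        $entry = $flag.Value
        if ($entry.value -eq 1 -and ((Get-Date) - [datetime]$entry.time).TotalDays -gt 14) {
            Set-ItemProperty -Path "HKLM:\SOFTWARE\Company\Clients\$($client.Name)\Debug" `
                -Name $flag.Name -Type DWord -Value 0
            # send the notification email here
        }
    }
}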

That is really a nasty hack. Why don't you store the date in the registry too, with the client record?
I think PowerShell does a better job of querying the registry than Ruby.
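For example, you could write a companion timestamp value next to each flag whenever you turn it on; the value name 'IsEmailSince' is just an illustration:
# Record when the flag was switched on, right next to it in the same key
Set-ItemProperty -Path Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug `
    -Name 'IsEmailSince' -Type String -Value (Get-Date).ToString('o')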
To get values from a registry path:
Get-ItemProperty -Path Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug
This yields the following result:
IsEmail : 0
IsShip : 0
PSPath : Microsoft.PowerShell.Core\Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug
PSParentPath : Microsoft.PowerShell.Core\Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1
PSChildName : Debug
PSProvider : Microsoft.PowerShell.Core\Registry
An alternative is to do it this way:
Get-Item -Path Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug
Which gives:
Hive: HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1

Name  Property
----  --------
Debug IsEmail : 0
      IsShip  : 0
If you want to get the value directly, you have to do it this way:
(Get-ItemProperty -Path Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug -Name IsEmail).IsEmail
To set a value (don't forget that you probably need admin rights for that!):
Set-ItemProperty -Path Registry::HKEY_LOCAL_MACHINE\SOFTWARE\company\Clients\Client1\Debug -Name 'IsEmail' -Type Dword -Value '1'
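If you want to fail fast when the session is not elevated, a quick check like this works (a sketch, not part of the original script):
# Warn early if the session is not elevated; writing under HKLM usually requires it
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Warning 'Run this script from an elevated PowerShell prompt.'
}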
Now to the core of your question (and probably where you have problems).
I had to test this, as I had a different solution in mind, so it took quite a while.
# %Y ... year
# %m ... month
# %d ... day
# %R ... 24-hour time and minutes
# %S ... seconds
# example output: 20180226_10:38:23
$time_stamp = Get-Date -UFormat "%Y%m%d_%R:%S"
Write-Output $time_stamp

Get-ChildItem 'HKLM:\SOFTWARE\company\Clients' -Recurse | ForEach-Object {
    $regkey = (Get-ItemProperty $_.PSPath)
    $regkey.PSObject.Properties | ForEach-Object {
        # you could filter it via -> If ($_.Name -like 'Is*') {
        # $regkey is a HashTable
        # printing some output so you can check
        Write-Output $regkey.PSParentPath
        Write-Output $regkey.PSPath
        Write-Output $_.Name ' = ' $_.Value
        Write-Output ''
        # convert to JSON - will contain more information than you need -> format it as you wish
        $convert_registry_to_json = $regkey | ConvertTo-Json
        #}
    }
}
# Printing JSON output
Write-Output $convert_registry_to_json
Any further filtering or formatting I leave to your discretion.
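If you want the JSON to survive between runs, persisting it is one line each way (the file name is arbitrary):
# Save the JSON for the next run, then load it back as objects later
$convert_registry_to_json | Set-Content 'C:\temp\flags.json'
$saved_state = Get-Content 'C:\temp\flags.json' -Raw | ConvertFrom-Json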
Edit - edited question
This made me remember why I hate XML :). I have created a simple solution for your formatting. This code is just a proof of concept; I recommend using functions to shorten your code (and avoid the duplication).
$time_stamp = Get-Date -UFormat "%Y%m%d_%R:%S"
$result_hash = @{}
$result_array = @()
$first_level_hash = @{}
$second_level_hash = @{}

Get-ChildItem 'HKLM:\SOFTWARE\company\Clients' -Recurse | ForEach-Object {
    $regkey = (Get-ItemProperty $_.PSPath)
    $regkey.PSObject.Properties | ForEach-Object {
        # a HashTable
        #Write-Host $regkey.PSParentPath
        $parent_key = $regkey.PSParentPath -Match '\w+$'
        #Write-Host "Found Parent:" $matches[0]
        $parent_value = $matches[0]
        #Write-Host $regkey.PSPath
        #Write-Host $_.Name ' = ' $_.Value
        #Write-Host ''
        $hash_key = $_.Name
        $hash_value = $_.Value
        If ($hash_key -like 'IsEmail') {
            $second_level_hash.Add('value', $hash_value)
            $second_level_hash.Add('time', $time_stamp)
            $first_level_hash.Add($hash_key, $second_level_hash)
            $result_hash.Add($parent_value, $first_level_hash)
            $result_array += ($result_hash)
            $second_level_hash = @{}
            $first_level_hash = @{}
            $result_hash = @{}
        } ElseIf ($hash_key -like 'IsShip') {
            $second_level_hash.Add('value', $hash_value)
            $second_level_hash.Add('time', $time_stamp)
            $first_level_hash.Add($hash_key, $second_level_hash)
            $result_hash.Add($parent_value, $first_level_hash)
            $result_array += ($result_hash)
            $second_level_hash = @{}
            $first_level_hash = @{}
            $result_hash = @{}
        } ElseIf ($hash_key -like 'IsPack') {
            $second_level_hash.Add('value', $hash_value)
            $second_level_hash.Add('time', $time_stamp)
            $first_level_hash.Add($hash_key, $second_level_hash)
            $result_hash.Add($parent_value, $first_level_hash)
            $result_array += ($result_hash)
            $second_level_hash = @{}
            $first_level_hash = @{}
            $result_hash = @{}
        }
        #$convert_registry_to_json = $regkey | ConvertTo-Json
        #}
    }
}
#Write-Output $result_hash
#Write-Output $result_hash.Item('IsEmail').Keys
#Write-Output $result_hash.Item('Client1').item('IsEmail')
Write-Output $result_array
ForEach ($result in $result_array) {
    ForEach ($entry in $result.GetEnumerator()) {
        $first_level_name = $($entry.Value)
        ForEach ($second_level in $first_level_name.GetEnumerator()) {
            $second_level_name = $($second_level.Value)
            ForEach ($third_level in $second_level_name.GetEnumerator()) {
                Write-Host "$($entry.Name) -> $($second_level.Name) -> $($third_level.Name): $($third_level.Value)"
            }
        }
    }
}
# You can convert it to JSON
# -Depth specifies how many levels of contained objects are included in the JSON representation. The default value is 2.
$convert_result_to_json = $result_array | ConvertTo-Json -Depth 3
Write-Output $convert_result_to_json
The output now is as follows; it can easily be converted to JSON or XML if you want:
Name Value
---- -----
Client1 {IsEmail}
Client1 {IsShip}
Client2 {IsEmail}
Client2 {IsPack}
Client1 -> IsEmail -> time: 20180226_13:31:58
Client1 -> IsEmail -> value: 1
Client1 -> IsShip -> time: 20180226_13:31:58
Client1 -> IsShip -> value: 0
Client2 -> IsEmail -> time: 20180226_13:31:58
Client2 -> IsEmail -> value: 1
Client2 -> IsPack -> time: 20180226_13:31:58
Client2 -> IsPack -> value: 0
with JSON format:
[
    {
        "Client1": {
            "IsEmail": {
                "time": "20180226_14:40:02",
                "value": 1
            }
        }
    },
    {
        "Client1": {
            "IsShip": {
                "time": "20180226_14:40:02",
                "value": 0
            }
        }
    },
    {
        "Client2": {
            "IsEmail": {
                "time": "20180226_14:40:02",
                "value": 1
            }
        }
    },
    {
        "Client2": {
            "IsPack": {
                "time": "20180226_14:40:02",
                "value": 0
            }
        }
    }
]
Second edit
As for the DB: perhaps the best solution would be to use SQLite; check the TechNet wiki article PowerShell: Accessing SQLite databases.
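A minimal sketch with the community PSSQLite module (assuming it is installed from the PowerShell Gallery; the table layout and file path are illustrative):
# Install-Module PSSQLite   # one-time setup
Import-Module PSSQLite
$db = 'C:\temp\flags.sqlite'
Invoke-SqliteQuery -DataSource $db -Query 'CREATE TABLE IF NOT EXISTS flags (client TEXT, flag TEXT, value INTEGER, time TEXT)'
Invoke-SqliteQuery -DataSource $db `
    -Query 'INSERT INTO flags (client, flag, value, time) VALUES (@client, @flag, @value, @time)' `
    -SqlParameters @{ client = 'Client1'; flag = 'IsEmail'; value = 1; time = (Get-Date).ToString('o') }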

Related

How to save a PNG as a JPG without saving the file in the dir

I'm using FromFile to get the image out of files, and it throws the following error for the PNGs on the FromFile line:
Exception calling "FromFile" with "1" argument(s): "The given path's
format is not supported."
So I'm trying to convert the BMPs to JPG (see the convert line above FromFile below), but all the examples I see (that seem usable) save the file. I don't want to save the file in the dir. All I need is the image format, so FromFile can use it like in this example. I saw ConvertTo-Jpeg, but I don't think this is a standard PowerShell module, and I don't see how to install it.
I saw this link, but I don't think that would leave the image in the format needed by FromFile.
This is my code:
$imageFile2 = Get-ChildItem -Recurse -Path $ImageFullBasePath -Include @("*.bmp","*.jpg","*.png") | Where-Object {$_.Name -match "$($pictureName)"} #$imageFile | Select-String -Pattern '$($pictureName)' -AllMatches
Write-Host $imageFile2
if ($imageFile2.Exists)
{
    if ($imageFile2 -Match "png")
    {
        $imageFile2 | .\ConvertTo-Jpeg # I don't think this will work with FromFile below
    }
    $image = [System.Drawing.Image]::FromFile($imageFile2)
}
else {
    Write-Host "$($imageFile2) does not exist"
}
And then I put it in Excel:
$xlsx = $result | Export-Excel -Path $outFilePath -WorksheetName $errCode -Autosize -AutoFilter -FreezeTopRow -BoldTopRow -PassThru # -ClearSheet can't ClearSheet every time or it clears previous data ###left off
$ws = $xlsx.Workbook.Worksheets[$errCode]
$ws.Dimension.Columns # number of columns
$tempRowCount = $ws.Dimension.Rows # number of rows
# only change the width of the 3rd column
$ws.Column(3).Width
$ws.Column(3).Width = 100
# change all row heights
for ($row = 2; ($row -le $tempRowCount); $row++)
{
    #Write-Host $($ws.Dimension.Rows)
    #Write-Host $($row)
    $ws.Row($row).Height
    $ws.Row($row).Height = 150
    # place the image in the spreadsheet
    # https://github.com/dfinke/ImportExcel/issues/1041 https://github.com/dfinke/ImportExcel/issues/993
    $drawingName = "$($row.PictureID)_Col3_$($row)" # Name_ColumnIndex_RowIndex
    Write-Host $image
    $picture = $ws.Drawings.AddPicture("$drawingName",$image)
    $picture.SetPosition($row - 1, 0, 3 - 1, 0)
    if ($ws.Row($row).Height -lt $image.Height * (375/500)) {
        $ws.Row($row).Height = $image.Height * (375/500)
    }
    if ($ws.Column(3).Width -lt $image.Width * (17/120)) {
        $ws.Column(3).Width = $image.Width * (17/120)
    }
}
Update:
I just wanted to reiterate that FromFile couldn't be used on the PNG image, so the way the Hey Scripting Guy article saves the image, like this, didn't work:
$image = [drawing.image]::FromFile($imageFile2)
I figured out that the $imageFile2 path had two filenames in it. Two files must have met the Get-ChildItem/Where-Object/match criteria. The images look identical but have similar names, so they'll be easy to process. After I split the names, FromFile works fine.
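For the original goal of converting without writing a file to the dir, one approach that should work is to re-encode the PNG into a MemoryStream and let FromStream produce the image (a sketch; $pngPath is a placeholder):
Add-Type -AssemblyName System.Drawing
# Re-encode the PNG as JPEG entirely in memory - no file is written to the dir
$png = [System.Drawing.Image]::FromFile($pngPath)
$ms  = New-Object System.IO.MemoryStream
$png.Save($ms, [System.Drawing.Imaging.ImageFormat]::Jpeg)
$png.Dispose()
$ms.Position = 0
$image = [System.Drawing.Image]::FromStream($ms)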

Speed up CSV PowerShell script

I've got a PowerShell script that works, but it takes ages to complete. I'm a newbie with PowerShell and I can't find a way to speed up the process. Hopefully somebody can point me in the right direction.
Example: I've got 2 CSV files.
CSV 1:
CI Name,Last Logon Account
Computer1, User1
Computer2, User2
Computer3, User3
CSV 2:
Device Display Label,Subscriber Employee Id
Computer1, User1
Computer2, User2
Computer3, User6
I want to have all the CI names in the first column, the last logon account in the second column, and to match the Subscriber Employee Id to the CI name from the first file.
Resulting in:
Ci name, Last logon Account, Subscriber Employee Id
Computer1,User1,User1
Computer2,User2,User2
Computer3,User3,User6
I have the following PowerShell script:
$Data = Import-Csv 'C:\Temp\Excel\CSV\file1.csv'
$Data2 = Import-Csv 'C:\Temp\Excel\CSV\file2.csv'
$combine = @()
foreach ($first in $Data) {
    foreach ($second in $Data2) {
        if ($second.'Device Display Label' -eq $first.'CI Name') {
            $match = New-Object PSObject
            $match | Add-Member NoteProperty "Ci Name" $first.'CI Name'
            $match | Add-Member NoteProperty "Last Logon Account" $first.'Last Logon Account'
            $match | Add-Member NoteProperty "Subscriber Employee Id" $second.'Subscriber Employee Id'
            $combine += $match
        }
    }
}
$combine
It works and it gives the desired result. The only problem is that both CSV files have 15,000 lines, so the script takes ages to finish. Is there a way to speed up the process? I hope somebody can point me in the right direction.
Use a hashtable to build an index out of one of the CSV files - this way you don't need the nested loops and the runtime should drop significantly:
# Build index/reference table from first data set
$DataTable = @{}
Import-Csv 'C:\Temp\Excel\CSV\file1.csv' | ForEach-Object {
    $DataTable[$_.'CI Name'] = $_
}

# No need to store the second data set in an intermediate variable
$combine = Import-Csv 'C:\Temp\Excel\CSV\file2.csv' | ForEach-Object {
    if ($DataTable.ContainsKey($_.'Device Display Label')) {
        # Take the existing object from the first data set
        # and add the subscriber from the second data set
        $DataTable[$_.'Device Display Label'] | Add-Member NoteProperty "Subscriber Employee Id" $_.'Subscriber Employee Id' -PassThru
    }
}
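Usage stays the same as before; for example, you can still dump the joined rows to a new CSV:
# Write the joined result out; -NoTypeInformation drops the '#TYPE' header line
$combine | Export-Csv 'C:\Temp\Excel\CSV\combined.csv' -NoTypeInformation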

PowerShell - How to count objects?

I am using PowerShell to build some scripts in an Active Directory environment and am currently struggling to find a way to count objects. My base search is:
$DClist = (Get-ADForest).Domains | % { Get-ADDomainController -Filter * -Server $_ } | Select Site, Name, Domain
And it generates the following output:
Site Name Domain
---- ---- ------
Site-A DC-123 acme.local
Site-A DC-ABC acme.local
Site-B DC-XYZ domain.local
Site-C DC-YPT domain.local
Now I would like to count the number of objects in the column 'Name' and display something like this:
Site Count_of_Name
---- ----
Site-A 2
Site-B 1
Site-C 1
I have already tried a lot of things, and the closest I got so far was using:
$DcList | Group-Object Site
But unfortunately it is not the right way to go, as it only counts the number of 'Site' entries and "ignores" the rest. I also tried this, but it did not work as I expected either:
$DcList | Group-Object Site, Name
Please help me figure out the logic of this.
********************** UPDATE **********************
I have finally been able to come up with this, but I cannot figure out a way to count the objects from the 'Site' column:
$DClist | Group-Object -Property Site | ForEach-Object -Process {
    [PSCustomObject]@{
        Site = $_.Name
        DCs = ($_.Group.Site)
    }
}
Please help me out. I feel I'm so close to a solution now. :)
You are REALLY close, and the answer is about what you would expect :)
When you use Group-Object you automatically get a Count property; just use it:
$DClist = (Get-ADForest).Domains | % { Get-ADDomainController -Filter * -Server $_ } | Select Site, Name, Domain
$dclist | Group-Object site | ForEach-Object {
    [PSCustomObject]@{
        site = $_.name
        DCs = $_.group
        count = $_.count
    }
}
edit:
You could also do this, which could be even faster when propagating through many objects. When doing a Select you can add a custom query and a label for that query:
@{name='fieldname';expression={$_.reference.to.object}}, or @{n='field';e={$_.expression}} if you want to shorten it.
$dclist | Group-Object site | ForEach-Object {
    $_ | select @{n='site';e={$_.name}}, count, @{n='DCs';e={$_.group}}
}
I don't exactly understand what you really want, but if it is some kind of tree, this will show it:
ForEach($Site in ($dclist | Group-Object site))
{
$Site.Count.ToString() + " " + $Site.Name
ForEach($Server in $Site.Group)
{
" + " + $Server.Name
}
}
Output:
2 Site-A
+ DC-123
+ DC-ABC
1 Site-B
+ DC-XYZ
1 Site-C
+ DC-YPT

My method to send values of performance counters to Graphite is very slow. What is the bottleneck? And how to improve?

Below I have some code that gets the values of instances of performance counters (which are instantiated once a page is visited) and sends them to Graphite to display graphs, in the following format:
[Path in Graphite (e.g., metric.pages.Counter1)] [value of counter] [epoch time]
To do this I made the following code, where the writer is already configured correctly to work:
# Get all paths to MultipleInstance counters and averages that start with "BLABLA",
# put them into an array, and get the epoch time
$pathsWithInstances = (Get-Counter -ListSet BLABLA*) | select -ExpandProperty PathsWithInstances
$epochtime = [int][double]::Parse((Get-Date -UFormat %s))

# The loop below splits each path (e.g., \BLABLA Web(welcome)\Page Requests) into three
# parts: the part before the opening brace (the counter category, e.g., "\BLABLA Web"),
# the part in between the braces (the page or service, e.g., "welcome"), and the part
# after the closing brace (the name of the test, e.g., "\Page Requests"). We build the
# metric out of this information and send it to Graphite.
foreach ($pathWithInstance in $pathsWithInstances)
{
    $instanceProperties = $pathWithInstance.Split('()')
    $counterCategory = $instanceProperties[0]
    if ($counterCategory -eq ("\BLABLA Web"))
    {
        # Replace the * with nothing so that counters that are used to display the
        # average (e.g., \BLABLAWeb(*)\Page Requests) are displayed on top in the
        # Graphite directory.
        $pagePath = $instanceProperties[1].Replace('*','')
        $nameOfTheTest = $instanceProperties[2]
        # The counter name used in the Graphite path gets whitespace and backslashes
        # removed (naming conventions)
        $counterName = $nameOfTheTest.Replace(' ','').Replace('\','')
        $pathToPerfCounter = $pathWithInstance
        $pathInGraphite = "metrics.Pages." + $pagePath + $counterName
        # Invoked like this since otherwise Get-Counter [path] does not seem to work
        $metricValue = [int] ((Get-Counter "$pathToPerfCounter").CounterSamples | select -Property CookedValue).CookedValue
        $metric = ($pathInGraphite + " " + $metricValue + " " + $epochTime)
        $writer.WriteLine($metric)
        $writer.Flush()
    }
}
Unfortunately this code is very slow: it takes about one second to send the value of each counter. Does someone see why it is so slow and how it can be improved?
You're getting one counter at a time, and it takes a second for Get-Counter to get and "Cook" the values. Get-Counter will accept an array of counters, and will sample, "cook" and return them all in that same second. You can speed it up by sampling them all at once, and then parsing the values from the array of results:
$CounterPaths = (
'\\Server1\Memory\Page Faults/sec',
'\\Server1\Memory\Available Bytes'
)
(Measure-Command {
    foreach ($CounterPath in $CounterPaths)
    { Get-Counter -Counter $CounterPath }
}).TotalMilliseconds

(Measure-Command {
    Get-Counter $CounterPaths
}).TotalMilliseconds
2017.4693
1012.3012
Example:
foreach ($CounterSample in (Get-Counter $CounterPaths).CounterSamples)
{
    "Path = $($CounterSample.Path)"
    "Metric = $([int]$CounterSample.CookedValue)"
}
Path = \\Server1\memory\page faults/sec
Metric = 193
Path = \\Server1\memory\available bytes
Metric = 1603678208
Use the Start-Job cmdlet to create a separate background job for each counter.
Here is a simple example of how to take the counter paths and pass them into an asynchronous ScriptBlock:
$CounterPathList = (Get-Counter -ListSet Processor).PathsWithInstances.Where({ $PSItem -like '*% Processor Time' });
foreach ($CounterPath in $CounterPathList) {
    Start-Job -ScriptBlock { (Get-Counter -Counter $args[0]).CounterSamples.CookedValue; } -ArgumentList $CounterPath;
}
# Call Receive-Job down here, once all jobs are finished
IMPORTANT: The above example uses PowerShell version 4.0's "method syntax" for filtering objects. Please make sure you're running PowerShell version 4.0, or change the Where method to use the traditional Where-Object instead.
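Collecting the output afterwards could look like this (a sketch; Wait-Job blocks until every job has finished):
# Wait for all the jobs, harvest their output, then clean up
$samples = Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
$samples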

How to split a huge folder?

We have a folder on Windows that's ... huge. I ran "dir > list.txt". The command lost response after 1.5 hours. The output file is about 200 MB, and it shows there are at least 2.8 million files. I know the situation is stupid, but let's focus on the problem itself. If I have such a folder, how can I split it into some "manageable" sub-folders? Surprisingly, all the solutions I have come up with involve getting all the files in the folder at some point, which is a no-no in my case. Any suggestions?
Thanks to Keith Hill and Mehrdad. I accepted Keith's answer because that's exactly what I wanted to do, but I couldn't quite get PS working quickly.
With Mehrdad's tip, I wrote this little program. It took 7+ hours to move 2.8 million files. So the initial dir command did finish; it just somehow didn't return to the console.
using System;
using System.IO;

namespace SplitHugeFolder
{
    class Program
    {
        static void Main(string[] args)
        {
            var destination = args[1];
            if (!Directory.Exists(destination))
                Directory.CreateDirectory(destination);

            var di = new DirectoryInfo(args[0]);
            var batchCount = int.Parse(args[2]);
            int currentBatch = 0;
            string targetFolder = GetNewSubfolder(destination);

            foreach (var fileInfo in di.EnumerateFiles())
            {
                if (currentBatch == batchCount)
                {
                    Console.WriteLine("New Batch...");
                    currentBatch = 0;
                    targetFolder = GetNewSubfolder(destination);
                }

                var source = fileInfo.FullName;
                var target = Path.Combine(targetFolder, fileInfo.Name);
                File.Move(source, target);
                currentBatch++;
            }
        }

        private static string GetNewSubfolder(string parent)
        {
            string newFolder;
            do
            {
                newFolder = Path.Combine(parent, Path.GetRandomFileName());
            } while (Directory.Exists(newFolder));
            Directory.CreateDirectory(newFolder);
            return newFolder;
        }
    }
}
I use Get-ChildItem to index my whole C: drive every night into c:\filelist.txt. That's about 580,000 files and the resulting file size is ~60MB. Admittedly I'm on Win7 x64 with 8 GB of RAM. That said, you might try something like this:
md c:\newdir
Get-ChildItem C:\hugedir -r |
    Foreach -Begin {$i = $j = 0} -Process {
        if ($i++ % 100000 -eq 0) {
            $dest = "C:\newdir\dir$j"
            md $dest
            $j++
        }
        Move-Item $_ $dest
    }
The key is to do the move in a streaming manner. That is, don't collect all the Get-ChildItem results into a single variable and then proceed; that would require all 2.8 million FileInfos to be in memory at once. Also, if you use the Name parameter on Get-ChildItem, it will output a single string containing the file's path relative to the base dir. Even then, perhaps this size will just overwhelm the memory available to you. And no doubt it will take quite a while to execute; IIRC, my indexing script takes several hours.
If it works, you should wind up with c:\newdir\dir0 through dir28, but then again, I haven't tested this script at all, so your mileage may vary. BTW, this approach assumes that your huge dir is a pretty flat dir.
Update: Using the Name parameter is almost twice as slow, so don't use that parameter.
I found that Get-ChildItem is the slowest option when working with many items in a directory. Look at the results:
Measure-Command { Get-ChildItem C:\Windows -rec | Out-Null }
TotalSeconds : 77,3730275
Measure-Command { listdir C:\Windows | Out-Null }
TotalSeconds : 20,4077132
measure-command { cmd /c dir c:\windows /s /b | out-null }
TotalSeconds : 13,8357157
(with the listdir function defined like this:
function listdir($dir) {
    $dir
    [system.io.directory]::GetFiles($dir)
    foreach ($d in [system.io.directory]::GetDirectories($dir)) {
        listdir $d
    }
}
)
With this in mind, here is what I would do: I would stay in PowerShell but use a more low-level approach with .NET methods:
function DoForFirst($directory, $max, $action) {
    function go($dir, $options)
    {
        foreach ($f in [system.io.Directory]::EnumerateFiles($dir))
        {
            if ($options.Remaining -le 0) { return }
            & $action $f
            $options.Remaining--
        }
        foreach ($d in [system.io.directory]::EnumerateDirectories($dir))
        {
            if ($options.Remaining -le 0) { return }
            go $d $options
        }
    }
    go $directory (New-Object PsObject -Property @{ Remaining = $max })
}
doForFirst c:\windows 100 { write-host File: $args }
# I use PsObject to avoid global variables and ref parameters.
To use this code you have to switch to the .NET 4.0 runtime; the enumerating methods are new in .NET 4.0.
You can specify any scriptblock as the -action parameter, so in your case it would be something like { Move-Item -LiteralPath $args -Destination c:\dir }.
Just try listing the first 1000 items; I hope it will finish very quickly:
doForFirst c:\yourdirectory 1000 { write-host '.' -nonew }
And of course you can process all the items at once; just use:
doForFirst c:\yourdirectory ([long]::MaxValue) { move-item ... }
Each item is processed immediately after it is returned, so the whole list is not read first and then processed; it is processed during the reading.
How about starting with this:
cmd /c dir /b > list.txt
That should get you a list of all the file names.
If you're doing "dir > list.txt" from a PowerShell prompt, Get-ChildItem is aliased as "dir". Get-ChildItem has known issues enumerating large directories, and the object collections it returns can get huge.
