I'm hoping to come up with a script that can walk a directory tree on a Windows server and show me a tree that only includes directories whose permissions differ from those of their parent (or child) directories. I want to produce an easily understandable report that can help me quickly audit the permissions of a folder structure.
Here's what I've got so far:
DIR "Z:\FileShare" -directory -recurse | GET-ACL | where {$_.AreAccessRulesProtected -eq $true} | select path, accessToString | format-list |out-file c:\permissions.txt
This produces a usable set of data as is but it's a bit bulky.
What I don't know is how to have it filter out redundant lines, such as "BUILTIN\Administrators Allow FullControl", and instead only show me the delta. Human-readable pseudo-code might be: "if this ACL line can be found in the immediate parent directory, don't show it here."
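Taken literally, that pseudo-code might look something like the sketch below, which compares each folder's access entries against those of its immediate parent (a rough, untested outline, assuming the Z:\FileShare root from above and PowerShell 3.0+ for the -notin operator):

# For each subfolder, keep only the ACL entries that do not also appear on the parent folder.
Get-ChildItem 'Z:\FileShare' -Directory -Recurse | ForEach-Object {
    $parentEntries = (Get-Acl (Split-Path $_.FullName -Parent)).Access |
        ForEach-Object { "$($_.IdentityReference) $($_.AccessControlType) $($_.FileSystemRights)" }

    $delta = (Get-Acl $_.FullName).Access |
        Where-Object { "$($_.IdentityReference) $($_.AccessControlType) $($_.FileSystemRights)" -notin $parentEntries }

    if ($delta) {
        [pscustomobject]@{
            Path  = $_.FullName
            Delta = $delta | ForEach-Object { "$($_.IdentityReference) $($_.AccessControlType) $($_.FileSystemRights)" }
        }
    }
}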
I tested this with a few folders, setting up different ACLs with my own user, and it seems to be working, but I haven't tested it enough to be sure. Basically, the script loops through the directories and adds the ACLs to a dictionary where the keys are the IdentityReferences and the values are the ACL properties you're interested in (FileSystemRights and AccessControlType), plus the folder's absolute path. While enumerating the directories, each object is compared against the stored values using the Compare-Acl function, which only returns $true if the object is different.
using namespace System.Collections
using namespace System.Collections.Generic
$map = [Dictionary[string, ArrayList]]::new()
$outObj = {
    [pscustomobject]@{
        AbsolutePath      = $dir.FullName
        FileSystemRights  = $acl.FileSystemRights
        IdentityReference = $acl.IdentityReference
        AccessControlType = $acl.AccessControlType
    }
}
function Compare-Acl {
    param(
        [object[]]$Reference,
        [object]$Difference
    )

    foreach($ref in $Reference)
    {
        $fsRights = $ref.FileSystemRights -eq $Difference.FileSystemRights
        $actRef   = $ref.AccessControlType -eq $Difference.AccessControlType
        if($fsRights -and $actRef)
        {
            return $false
        }
    }
    $true
}
foreach($dir in Get-ChildItem Z:\FileShare -Directory -Recurse)
{
    foreach($acl in (Get-Acl $dir.FullName | Where-Object AreAccessRulesProtected).Access)
    {
        if($thisKey = $map[$acl.IdentityReference])
        {
            $obj = & $outObj
            if(Compare-Acl -Reference $thisKey -Difference $obj)
            {
                $null = $thisKey.Add($obj)
            }
            continue
        }
        $obj = & $outObj
        $null = $map.Add($acl.IdentityReference, [object[]]$obj)
    }
}

$map.Keys.ForEach({ $map[$_] }) | Export-Csv path\to\acls.csv -NoTypeInformation
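If you want to skim the result without opening the CSV elsewhere, a quick follow-up along these lines groups the collected entries by folder (it just reads back the same placeholder path used above):

Import-Csv path\to\acls.csv |
    Sort-Object AbsolutePath |
    Format-Table IdentityReference, FileSystemRights, AccessControlType -GroupBy AbsolutePath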
Good Afternoon All!
Working on a script that my co-workers use: we move a lot of files up to SharePoint, and the objective was to make an easy way to set a file to read-only and compress it using the built-in NTFS compression.
All of that is functioning like it should with the code below. The problem I need to solve is this: somehow I need to pass the current directory the worker has open in File Explorer to the open-file dialog box, so that when the script is run you don't have to manually change it to the correct location.
Currently, I have it set to open in the user's Desktop directory.
Is this possible?
Thanks
Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
    InitialDirectory = [Environment]::GetFolderPath('Desktop')
    Multiselect      = $true # Multiple files can be chosen
}
[void]$FileBrowser.ShowDialog()
$path = $FileBrowser.FileNames

If ($FileBrowser.FileNames -like "*\*") {
    # Do something before work on individual files commences
    $FileBrowser.FileNames # Lists selected files (optional)

    foreach ($file in Get-ChildItem $path) {
        Get-ChildItem ($file) |
            ForEach-Object {
                Set-ItemProperty -Path $path -Name IsReadOnly -Value $true
                compact /C $_.FullName
            }
    }
    # Do something when work on individual files is complete
}
else {
    Write-Host "Cancelled by user"
}
The following solution determines the directory that the topmost File Explorer window has open (the window that is highest in the Z-order).
That is, if File Explorer itself is active, that active window's directory is used, otherwise it is the most recently active window's.
# Helper type for getting windows by class name.
Add-Type -Namespace Util -Name WinApi -MemberDefinition @'
// Find a window by class name and optionally also title.
// The TOPMOST matching window (in terms of Z-order) is returned.
// IMPORTANT: To not search for a title, pass [NullString]::Value, not $null, to lpWindowName.
[DllImport("user32.dll")]
public static extern IntPtr FindWindow(string lpClassName, string lpWindowName);
'@
# Get the topmost File Explorer window, by its class name.
$hwndTopMostFileExplorer = [Util.WinApi]::FindWindow(
    "CabinetWClass",     # the window class of interest
    [NullString]::Value  # no window title to search for
)

if (-not $hwndTopMostFileExplorer) {
    Write-Warning "There is no open File Explorer window."
    # Alternatively, use a *default* directory in this case.
    return
}
# Using a Shell.Application COM object, locate the window by its hWnd and query its location.
$fileExplorerWin = (New-Object -ComObject Shell.Application).Windows() |
    Where-Object hwnd -eq $hwndTopMostFileExplorer
# This should normally not happen.
if (-not $fileExplorerWin) {
    Write-Warning "The topmost File Explorer window, $hwndTopMostFileExplorer, must have just closed."
    return
}
# Determine the window's active directory (folder) path.
$fileExplorerDir = $fileExplorerWin.Document.Folder.Self.Path
# Now you can use $fileExplorerDir to initialize the dialog:
Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
    InitialDirectory = $fileExplorerDir
    Multiselect      = $true # Multiple files can be chosen
}
$null = $FileBrowser.ShowDialog()
# ...
I am trying to create an array of objects, and inside each object I want a property called something like "valid paths" indicating where those objects can be found, but I'm having trouble adding new paths after creating the array.
For example:
$Guids = $null
$Guids = @()
$Guids += [pscustomobject]@{
    Guid  = '{374DE290-123F-4565-9164-39C4925E467B}'
    Paths = @()
}
$Guids += [pscustomobject]@{
    Guid  = '{7d83ee9b-2244-4e70-b1f5-5393042af1e4}'
    Paths = @()
}
Here I am creating two objects indicating the guids I am looking for. Then to find and add the paths to the array, I do this:
$Rootpath = "HKU:\$($CurrentUser.PSChildName)"
$ShellFolderspath = "$RootPath\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
$UserShellFolderspath = "$Rootpath\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"
#region 1 - Identify registry keys to update
New-PSDrive -Name HKU -PSProvider Registry -Root HKEY_USERS -ErrorAction SilentlyContinue
#Adding Downloads Guids to array
ForEach ($object in $Guids)
{
    # Check Shell Folders path
    If ($Item = Get-ItemProperty $ShellFolderspath -Name $object.Guid -ErrorAction Stop)
    {
        "$($object.Guid) found in Shell Folders path"
        $object.Paths += $Item.PSPath
    }

    # Check User Shell Folders path
    If ($Item = Get-ItemProperty $UserShellFolderspath -Name $object.Guid -ErrorAction Stop)
    {
        "$($object.Guid) found in User Shell Folders path"
        $object.Paths += $Item.PSPath
    }
}
I've googled over and over and can't find any examples of anyone trying to do this. Any help would be appreciated!
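A variant of the same objects that gives Paths a growable List[string] instead of a fixed array, so entries can be appended with .Add() inside the loop, might look like this (a minimal sketch with the same GUIDs; not tested against the registry code above):

$Guids = foreach ($guid in '{374DE290-123F-4565-9164-39C4925E467B}',
                           '{7d83ee9b-2244-4e70-b1f5-5393042af1e4}') {
    [pscustomobject]@{
        Guid  = $guid
        Paths = New-Object System.Collections.Generic.List[string]   # growable, unlike @()
    }
}

# Later, inside the lookup loop:
# $object.Paths.Add($Item.PSPath)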
Starting premise: very restrictive environment, Windows 7 SP1, PowerShell 3.0. Limited or no possibility of using external libraries.
I'm trying to rewrite a bash tool I created previously, this time using PowerShell. In bash I implemented autocompletion to make the tool more user-friendly, and I want to do the same thing for the PowerShell version.
The bash version worked like this:
./launcher <Tab> => ./launcher test (or dev, prod, etc.)
./launcher test <Tab> => ./launcher test app1 (or app2, app3, etc.)
./launcher test app1 <Tab> => ./launcher test app1 command1 (or command2, command3, etc.).
As you can see, everything was dynamic: the list of environments was dynamic, the list of applications was dynamic depending on the environment selected, and the list of commands was also dynamic.
The problem is with the test → application connection. I want to show the correct application based on the environment already selected by the user.
Using PowerShell's DynamicParam I can get a dynamic list of environments based on a folder listing. I can't however (or at least I haven't found out how to) do another folder listing but this time using a variable based on the existing user selection.
Current code:
function ParameterCompletion {
    $RuntimeParameterDictionary = New-Object Management.Automation.RuntimeDefinedParameterDictionary

    # Block 1.
    $AttributeCollection = New-Object Collections.ObjectModel.Collection[System.Attribute]
    $ParameterName = "Environment1"
    $ParameterAttribute = New-Object Management.Automation.ParameterAttribute
    $ParameterAttribute.Mandatory = $true
    $ParameterAttribute.Position = 1
    $AttributeCollection.Add($ParameterAttribute)
    # End of block 1.

    $parameterValues = $(Get-ChildItem -Path ".\configurations" -Directory | Select-Object -ExpandProperty Name)
    $ValidateSetAttribute = New-Object Management.Automation.ValidateSetAttribute($parameterValues)
    $AttributeCollection.Add($ValidateSetAttribute)
    $RuntimeParameter = New-Object Management.Automation.RuntimeDefinedParameter($ParameterName, [string], $AttributeCollection)
    $RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)

    # Block 2: same thing as in block 1, just with 2 at the end of the variable names.
    # Problem section: how can I change this line to include ".\configurations\${myVar}"?
    # And what's the magic incantation to fill $myVar with the info I need?
    $parameterValues2 = $(Get-ChildItem -Path ".\configurations" -Directory | Select-Object -ExpandProperty Name)
    $ValidateSetAttribute2 = New-Object Management.Automation.ValidateSetAttribute($parameterValues2)
    $AttributeCollection2.Add($ValidateSetAttribute2)
    $RuntimeParameter2 = New-Object Management.Automation.RuntimeDefinedParameter($ParameterName2, [string], $AttributeCollection2)
    $RuntimeParameterDictionary.Add($ParameterName2, $RuntimeParameter2)

    return $RuntimeParameterDictionary
}
function App {
    [CmdletBinding()]
    Param()
    DynamicParam {
        return ParameterCompletion "Environment1"
    }
    Begin {
        $Environment = $PsBoundParameters["Environment1"]
    }
    Process {
    }
}
I would recommend using argument completers, which are semi-exposed in PowerShell 3 and 4, and fully exposed in version 5.0 and higher. For v3 and v4, the underlying functionality is there, but you have to override the TabExpansion2 built-in function to use them. That's OK for your own session, but it's generally frowned upon to distribute tools that do that to other people's sessions (imagine if everyone tried to override that function). A PowerShell team member has a module that does this for you called TabExpansionPlusPlus. I know I said overriding TabExpansion2 was bad, but it's OK if this module does it :)
When I needed to support versions 3 and 4, I would distribute my commands in modules, and have the modules check for the existence of the 'Register-ArgumentCompleter' command, which is a cmdlet in v5+ and is a function if you have the TE++ module. If the module found it, it would register any completer(s), and if it didn't, it would notify the user that argument completion wouldn't work unless they got the TabExpansionPlusPlus module.
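A rough sketch of that detection pattern in a module file (the completer body is just a placeholder here, not a working completer):

if (Get-Command Register-ArgumentCompleter -ErrorAction SilentlyContinue) {
    Register-ArgumentCompleter -CommandName launcher -ParameterName Environment1 -ScriptBlock {
        param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameter)
        # ...emit CompletionResult objects here...
    }
}
else {
    Write-Warning "Tab completion for 'launcher' requires PowerShell 5+ or the TabExpansionPlusPlus module."
}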
Assuming you have the TE++ module or PSv5+, I think this should get you on the right track:
function launcher {
    [CmdletBinding()]
    param(
        [string] $Environment1,
        [string] $Environment2,
        [string] $Environment3
    )

    $PSBoundParameters
}
1..3 | ForEach-Object {
    Register-ArgumentCompleter -CommandName launcher -ParameterName "Environment${_}" -ScriptBlock {
        param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameter)

        $PathParts = $fakeBoundParameter.Keys | where { $_ -like 'Environment*' } | sort | ForEach-Object {
            $fakeBoundParameter[$_]
        }

        Get-ChildItem -Path ".\configurations\$($PathParts -join '\')" -Directory -ErrorAction SilentlyContinue |
            select -ExpandProperty Name |
            where { $_ -like "${wordToComplete}*" } |
            ForEach-Object {
                New-Object System.Management.Automation.CompletionResult (
                    $_,
                    $_,
                    'ParameterValue',
                    $_
                )
            }
    }
}
For this to work, your current working directory will need a 'configurations' directory in it, and you'll need at least three levels of subdirectories (reading through your example, it looked like you were going to enumerate a directory and go deeper into that structure as parameters were added). The enumeration of the directory isn't very smart right now, and you can fool it pretty easily if you just skip a parameter, e.g., launcher -Environment3 <TAB> would try to give you completions for the first subdirectory.
This works if you will always have three parameters available. If you need a variable number of parameters, you could still use completers, but it might get a little trickier.
The biggest downside is that you'd still have to validate the user's input, since completers are basically just suggestions and users don't have to use them.
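For example, the function body could simply re-check the combination it was handed against the same folder structure (a hypothetical guard, not part of the completer above):

# Inside launcher's body:
$parts = $Environment1, $Environment2, $Environment3 | Where-Object { $_ }
$candidate = ".\configurations\$($parts -join '\')"
if (-not (Test-Path $candidate)) {
    throw "Unknown environment/app/command combination: $($parts -join ' ')"
}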
If you want to use dynamic parameters, it gets pretty crazy. There may be a better way, but I've never been able to see the values already typed at the command line from inside the DynamicParam {} block without using reflection, and at that point you're using functionality that could change at the next release (the members usually aren't public for a reason). It's tempting to try to use $MyInvocation inside the DynamicParam {} block, but it's not populated at the time the user is typing the command at the command line, and it only shows one line of the command anyway.
The code below was tested on PowerShell 5.1, so I can't guarantee that any other version has these exact same class members (it's based on something I first saw Garrett Serack do). Like the previous example, it depends on a .\configurations folder in the current working directory (if there isn't one, you won't see any -Environment parameters).
function badlauncher {
    [CmdletBinding()]
    param()

    DynamicParam {
        #region Get the arguments
        # In its current form, this will ignore parameter names, e.g., '-ParameterName ParameterValue' would ignore '-ParameterName',
        # and only 'ParameterValue' would be in $UnboundArgs
        $BindingFlags = [System.Reflection.BindingFlags] 'Instance, NonPublic, Public'
        $Context = $PSCmdlet.GetType().GetProperty('Context', $BindingFlags).GetValue($PSCmdlet)
        $CurrentCommandProcessor = $Context.GetType().GetProperty('CurrentCommandProcessor', $BindingFlags).GetValue($Context)
        $ParameterBinder = $CurrentCommandProcessor.GetType().GetProperty('CmdletParameterBinderController', $BindingFlags).GetValue($CurrentCommandProcessor)
        $UnboundArgs = @($ParameterBinder.GetType().GetProperty('UnboundArguments', $BindingFlags).GetValue($ParameterBinder) | where { $_ } | ForEach-Object {
            try {
                if (-not $_.GetType().GetProperty('ParameterNameSpecified', $BindingFlags).GetValue($_)) {
                    $_.GetType().GetProperty('ArgumentValue', $BindingFlags).GetValue($_)
                }
            }
            catch {
                # Don't do anything??
            }
        })
        #endregion

        $ParamDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary

        # Create an Environment parameter for each argument specified, plus one extra, as long as there
        # are valid subfolders under .\configurations
        for ($i = 0; $i -le $UnboundArgs.Count; $i++) {
            $ParameterName = "Environment$($i + 1)"

            $ParamAttributes = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
            $ParamAttributes.Add((New-Object Parameter))
            $ParamAttributes[0].Position = $i

            # Build the path that will be enumerated based on previous arguments
            $PathSb = New-Object System.Text.StringBuilder
            $PathSb.Append('.\configurations\') | Out-Null
            for ($j = 0; $j -lt $i; $j++) {
                $PathSb.AppendFormat('{0}\', $UnboundArgs[$j]) | Out-Null
            }

            $ValidParameterValues = Get-ChildItem -Path $PathSb.ToString() -Directory -ErrorAction SilentlyContinue | Select-Object -ExpandProperty Name

            if ($ValidParameterValues) {
                $ParamAttributes.Add((New-Object ValidateSet $ValidParameterValues))

                $ParamDictionary[$ParameterName] = New-Object System.Management.Automation.RuntimeDefinedParameter (
                    $ParameterName,
                    [string[]],
                    $ParamAttributes
                )
            }
        }

        return $ParamDictionary
    }

    process {
        $PSBoundParameters
    }
}
The cool thing about this one is that it can keep going as long as there are folders, and it automatically does parameter validation. Of course, you're breaking the laws of .NET by using reflection to get at all those private members, so I would consider this a terrible and fragile solution, no matter how fun it was to come up with.
I have a PowerShell script that goes through a directory and deletes files that are older than 18 months. Most of the files follow this naming convention:
20140628
However, some of the files do not. I am trying to make the script ignore these files and just delete the files that follow the naming convention. For some reason, though, the script below is doing the exact opposite of that and deleting the files that do not follow the naming convention and leaving the ones that do. How would I modify the script to fit my needs? Any help would be greatly appreciated.
$targetdirectory = "\\DPR320-W12-1600\PRTG"
$CleanupList = Get-ChildItem $targetdirectory\test
$Threshold = (get-date).AddMonths(-18)
$culture = [Globalization.CultureInfo]::InvariantCulture
foreach ($DirName in $CleanupList)
{
    try {
        [datetime]::ParseExact($DirName.BaseName, 'yyyyMMdd', $culture)
        Remove-Item $DirName.FullName -Force -Recurse
    }
    catch {
        continue
    }
}
$targetDirectory = "\\DPR320-W12-1600\PRTG"
$cleanupList = Get-ChildItem $(Join-Path $targetDirectory "test")
$threshold = (Get-Date).AddMonths(-18)
$culture = [Globalization.CultureInfo]::InvariantCulture

foreach ($dirName in $cleanupList) {
    try {
        $dateFromName = [datetime]::ParseExact($dirName.BaseName, "yyyyMMdd", $culture)
        if ($dateFromName -le $threshold) {
            Remove-Item $dirName.FullName -Force -Recurse
        }
    } catch {
        continue
    }
}
If parsing the name as a date fails, an error is thrown and the file is not deleted.
I'm not even sure continue is needed here, but you cannot remove the catch block.
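If you'd rather not drive the logic with exceptions at all, [datetime]::TryParseExact does the same job without the try/catch (a sketch reusing the variables defined above):

foreach ($dirName in $cleanupList) {
    $dateFromName = [datetime]::MinValue
    if ([datetime]::TryParseExact($dirName.BaseName, 'yyyyMMdd', $culture,
            [Globalization.DateTimeStyles]::None, [ref]$dateFromName) -and
        $dateFromName -le $threshold) {
        # The name parsed as a date and it is older than the threshold
        Remove-Item $dirName.FullName -Force -Recurse
    }
}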
I am creating a nightly database schema file and would like to put all the files created each night, one for each database, into a folder and compress that folder.
I have a PowerShell script that creates the schema-only creation scripts of the databases and then adds all the files to a new folder. The problem lies within the compression portion of this process.
Does anybody have any idea if this can be accomplished with the pre-installed Windows utility that handles folder compression?
It would be best to use that utility if possible rather than something like 7-Zip (I don't feel like installing 7-Zip on every customer's server, and it may take IT years to do it if I ask them).
A native way with the .NET 4.5 framework, but entirely feature-less:
Creation:
Add-Type -Assembly "System.IO.Compression.FileSystem" ;
[System.IO.Compression.ZipFile]::CreateFromDirectory("c:\your\directory\to\compress", "yourfile.zip") ;
Extraction:
Add-Type -Assembly "System.IO.Compression.FileSystem" ;
[System.IO.Compression.ZipFile]::ExtractToDirectory("yourfile.zip", "c:\your\destination") ;
As mentioned, totally feature-less, so don't expect an overwrite flag.
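In practice that means clearing the destination yourself when you want to overwrite (a small sketch around the same calls; CreateFromDirectory throws if the target zip already exists):

Add-Type -Assembly "System.IO.Compression.FileSystem"
if (Test-Path "yourfile.zip") { Remove-Item "yourfile.zip" }
[System.IO.Compression.ZipFile]::CreateFromDirectory("c:\your\directory\to\compress", "yourfile.zip")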
Here's a couple of zip-related functions that don't rely on extensions: Compress Files with Windows PowerShell.
The main function that you'd likely be interested in is:
function Add-Zip
{
    param([string]$zipfilename)

    if (-not (Test-Path $zipfilename))
    {
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }

    $shellApplication = New-Object -ComObject Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Start-Sleep -Milliseconds 500
    }
}
Usage:
dir c:\demo\files\*.* -Recurse | Add-Zip c:\demo\myzip.zip
There is one caveat: the Shell.Application object's NameSpace() function fails to open the zip file for writing if the path isn't absolute, so if you pass a relative path to Add-Zip it fails with a null error. The path to the zip file must be absolute.
Or you could just add a $zipfilename = resolve-path $zipfilename at the beginning of the function.
As of PowerShell 5 there is a Compress-Archive cmdlet that does the task out of the box.
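For example (assuming a .\in folder to compress; -Force overwrites an existing archive):

Compress-Archive -Path .\in\* -DestinationPath .\out.zip -Force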
This compresses .\in contents to .\out.zip with System.IO.Packaging.ZipPackage following the example here
$zipArchive = $pwd.Path + "\out.zip"
[System.Reflection.Assembly]::Load("WindowsBase,Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35")
$ZipPackage = [System.IO.Packaging.ZipPackage]::Open($zipArchive, [System.IO.FileMode]"OpenOrCreate", [System.IO.FileAccess]"ReadWrite")
$in = gci .\in | select -expand FullName
[array]$files = $in -replace "C:","" -replace "\\","/"

ForEach ($file In $files) {
    $partName = New-Object System.Uri($file, [System.UriKind]"Relative")
    $part = $ZipPackage.CreatePart($partName, "application/zip", [System.IO.Packaging.CompressionOption]"Maximum")
    $bytes = [System.IO.File]::ReadAllBytes($file)
    $stream = $part.GetStream()
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()
}
$ZipPackage.Close()
I used voithos' answer to zip files up in PowerShell, but had one problem with the Add-Zip function: the Start-Sleep -Milliseconds 500 caused problems if a file couldn't be fully zipped up in that time; the next copy starting before the previous one was complete caused errors, and some files were not zipped.
After playing around for a bit, I first tried keeping a counter that checked the count of $zipPackage.Items() and only continued after the item count increased, which did not work because it would return 0 in some cases when it should not. It seems to return 0 while the package is still zipping/copying the files (I think, haha). Adding a simple while loop with the Start-Sleep inside it, waiting for $zipPackage.Items().Count to be non-zero before continuing, seems to solve the problem.
function Add-Zip
{
    param([string]$zipfilename)

    if (-not (Test-Path $zipfilename))
    {
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }

    $shellApplication = New-Object -ComObject Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        do
        {
            Start-Sleep -Milliseconds 250
        }
        while ($zipPackage.Items().Count -eq 0)
    }
}
Using PowerShell Version 3.0:
Copy-ToZip -File ".\blah" -ZipFile ".\blah.zip" -Force
Hope this helps.