Efficient way to delete old log files on Windows Server 2016? - windows

I have over 10 TB of log file data. I want to find an efficient way to delete log files that are more than 3 years old. Originally, I was thinking about scheduling this PowerShell script:
# Delete all Files in PATH older than 3 years (1095 days) old
$Path = "Path to log files"
$Daysback = "-1095"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
Get-ChildItem $Path | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item
I am concerned that looping through 10 TB of data will take hours to run, even though this script appears to run in O(n). Does anybody have a better solution?

If performance matters, I'd get rid of the pipes and wouldn't use gci (use lower-level types from System.IO instead). If there are millions of files, I'd also write it in C# and only call the methods from the PowerShell runtime. Below is an example that removes files older than N days:
Add-Type -TypeDefinition @"
public static class LogCleaner {
    public static void Clean(string dirName, int age)
    {
        // Delete every file in dirName whose last write time is older than 'age' days
        foreach (string file in System.IO.Directory.GetFiles(dirName))
        {
            var fi = new System.IO.FileInfo(file);
            if (fi.LastWriteTime < System.DateTime.Now.AddDays(-age))
            {
                fi.Delete();
            }
        }
    }
}
"@
[LogCleaner]::Clean("C:\Temp\demo", 10)
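If you'd rather stay in PowerShell, the same "lower-level System.IO" idea can be sketched without Add-Type (my own variant, not part of the original answer); EnumerateFiles streams paths instead of building the whole list up front:
# Sketch: stream paths via System.IO instead of Get-ChildItem; adjust $Path to the real log root
$Path = "D:\Logs"
$cutoff = (Get-Date).AddDays(-1095)
foreach ($file in [System.IO.Directory]::EnumerateFiles($Path, '*', [System.IO.SearchOption]::AllDirectories)) {
    if ([System.IO.File]::GetLastWriteTime($file) -lt $cutoff) {
        [System.IO.File]::Delete($file)
    }
}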
You may also check how to achieve this using Windows-native utilities (like dir), or maybe install Git Bash and do this the Linux way (e.g. using the find command).
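For the native route, forfiles can select and delete by age in a single call, and GNU find (from Git Bash, where D:\Logs appears as /d/Logs) can do the same. Both lines below are sketches with placeholder paths and file masks, not tested against the asker's layout; the first runs from cmd or PowerShell, the second from Git Bash:
forfiles /P "D:\Logs" /S /M *.log /D -1095 /C "cmd /c del @file"
find /d/Logs -type f -name '*.log' -mtime +1095 -delete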

Related

Powershell: Export-CliXml is exporting without Structure, elements and nodes

I have a PowerShell script for Windows Server Update Services (WSUS) to automatically approve updates.
Approved updates get listed in an XML file. To achieve that, I export the selected data (Title, Arrivaldate and Update-ID) with Export-CliXml into a file:
That has worked for quite some time.
For the past few days, whenever a new list has to be created, the exported results have no nodes, no structure, no elements.
What am I doing wrong?
I need these Nodes in a further process. When I import the xml file with import-CliXml, I can't properly iterate through the values like
$List[x][x]
I know the problem: the xml file is not structured. But I am not able to solve it. Can someone help me?
Here is sample code which is similar to mine (which will obviously only work on a Windows Server with WSUS installed):
$List = @()
$Updates_Pretest = $WSUS.GetUpdates() | select -First 10
if ($Updates_Pretest.count -gt 0)
{
foreach ($Update in $Updates_Pretest)
{
$List += @($Update.Id.UpdateId.Guid;(Get-Date -Format "MM.dd.yyyy");$Update.title)
}
}
$List | Export-Clixml -Path ("c:\" + "Test.xml")
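The flat export is most likely because $List ends up as a plain array of strings rather than objects with named properties, so there is no structure for Export-CliXml to preserve. A minimal sketch of the structured variant (the property names are my own choice, not from the original post):
$List = @()
foreach ($Update in $Updates_Pretest) {
    # One object per update, so Export-CliXml can preserve named properties
    $List += [PSCustomObject]@{
        UpdateId    = $Update.Id.UpdateId.Guid
        ArrivalDate = Get-Date -Format "MM.dd.yyyy"
        Title       = $Update.Title
    }
}
$List | Export-Clixml -Path "C:\Test.xml"
# After Import-Clixml, entries can be addressed by property name instead of $List[x][x]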

Cannot remove item, The Directory is not empty

When using the Remove-Item command, even utilizing the -r and -Force parameters, sometimes the following error message is returned:
Remove-Item : Cannot remove item C:\Test Folder\Test Folder\Target: The directory is not empty.
Particularly, this happens when the directory to be removed is opened in Windows Explorer.
Now, while it is possible to avoid this simply by keeping Windows Explorer closed or not browsing that location, I run my scripts in a multi-user environment where people sometimes just forget to close their Explorer windows, so I am interested in a solution for deleting entire folders and directories even if they are opened in Windows Explorer.
Is there an option more powerful than -Force that I can set to achieve this?
To reliably reproduce this, create the folder C:\Test Folder\Origin and populate it with some files and subfolders (important), then take the following script or one like it and execute it once. Now open one of the subfolders of C:\Test Folder\Target (in my case, I used C:\Test Folder\Target\Another Subfolder containing A third file.txt), and try running the script again. You will now get the error. If you run the script a third time, you will not get the error again (depending on circumstances that I have yet to determine, though, the error sometimes occurs the second time and then never again, and at other times it occurs every second time).
$SourcePath = "C:\Test Folder\Origin"
$TargetPath = "C:\Test Folder\Target"
if (Test-Path $TargetPath) {
Remove-Item -r $TargetPath -Force
}
New-Item -ItemType directory -Path $TargetPath
Copy-Item $SourcePath -Destination $TargetPath -Force -Recurse -Container
Update: Starting with (at least [1]) Windows 10 version 20H2 (I don't know which Windows Server version and build that corresponds to; run winver.exe to check your version and build), the DeleteFile Windows API function now exhibits synchronous behavior on supported file-systems, including NTFS, which implicitly solves the problems with PowerShell's Remove-Item and .NET's System.IO.File.Delete / System.IO.Directory.Delete (but, curiously, not with cmd.exe's rd /s).
This is ultimately only a timing issue: the last handle to a subdirectory may not be closed yet at the time an attempt is made to delete the parent directory - and this is a fundamental problem, not restricted to having File Explorer windows open:
Incredibly, the Windows file and directory removal API is asynchronous: that is, by the time the function call returns, it is not guaranteed that removal has completed yet.
Regrettably, Remove-Item fails to account for that - and neither do cmd.exe's rd /s and .NET's [System.IO.Directory]::Delete() - see this answer for details.
This results in intermittent, unpredictable failures.
The workaround comes courtesy of this YouTube video (starts at 7:35); a PowerShell implementation of it is below:
Synchronous directory-removal function Remove-FileSystemItem:
Important:
The synchronous custom implementation is only required on Windows, because the file-removal system calls on Unix-like platforms are synchronous to begin with. Therefore, the function simply defers to Remove-Item on Unix-like platforms. On Windows, the custom implementation:
requires that the parent directory of a directory being removed be writable, and
is also applied when deleting directories on any network drives.
What will NOT prevent reliable removal:
File Explorer, at least on Windows 10, does not lock directories it displays, so it won't prevent removal.
PowerShell doesn't lock directories either, so having another PowerShell window whose current location is the target directory or one of its subdirectories won't prevent removal (by contrast, cmd.exe does lock - see below).
Files opened with FILE_SHARE_DELETE / [System.IO.FileShare]::Delete (which is rare) in the target directory's subtree also won't prevent removal, though they do live on under a temporary name in the parent directory until the last handle to them is closed.
What WILL prevent removal:
If there's a permissions problem (if ACLs prevent removal), removal is aborted.
If an indefinitely locked file or directory is encountered, removal is aborted. Notably, that includes:
cmd.exe (Command Prompt), unlike PowerShell, does lock the directory that is its current directory, so if you have a cmd.exe window open whose current directory is the target directory or one of its subdirectories, removal will fail.
If an application keeps a file open in the target directory's subtree that was not opened with file-sharing mode FILE_SHARE_DELETE / [System.IO.FileShare]::Delete (using this mode is rare), removal will fail. Note that this only applies to applications that keep files open while working with their content (e.g., Microsoft Office applications), whereas text editors such as Notepad and Visual Studio Code do not keep the files they've loaded open.
Hidden files and files with the read-only attribute:
These are quietly removed; in other words: this function invariably behaves like Remove-Item -Force.
Note, however, that in order to target hidden files / directories as input, you must specify them as literal paths, because they won't be found via a wildcard expression.
The reliable custom implementation on Windows comes at the cost of decreased performance.
function Remove-FileSystemItem {
<#
.SYNOPSIS
Removes files or directories reliably and synchronously.
.DESCRIPTION
Removes files and directories, ensuring reliable and synchronous
behavior across all supported platforms.
The syntax is a subset of what Remove-Item supports; notably,
-Include / -Exclude and -Force are NOT supported; -Force is implied.
As with Remove-Item, passing -Recurse is required to avoid a prompt when
deleting a non-empty directory.
IMPORTANT:
* On Unix platforms, this function is merely a wrapper for Remove-Item,
where the latter works reliably and synchronously, but on Windows a
custom implementation must be used to ensure reliable and synchronous
behavior. See https://github.com/PowerShell/PowerShell/issues/8211
* On Windows:
* The *parent directory* of a directory being removed must be
*writable* for the synchronous custom implementation to work.
* The custom implementation is also applied when deleting
directories on *network drives*.
* If an indefinitely *locked* file or directory is encountered, removal is aborted.
By contrast, files opened with FILE_SHARE_DELETE /
[System.IO.FileShare]::Delete on Windows do NOT prevent removal,
though they do live on under a temporary name in the parent directory
until the last handle to them is closed.
* Hidden files and files with the read-only attribute:
* These are *quietly removed*; in other words: this function invariably
behaves like `Remove-Item -Force`.
* Note, however, that in order to target hidden files / directories
as *input*, you must specify them as a *literal* path, because they
won't be found via a wildcard expression.
* The reliable custom implementation on Windows comes at the cost of
decreased performance.
.EXAMPLE
Remove-FileSystemItem C:\tmp -Recurse
Synchronously removes directory C:\tmp and all its content.
#>
[CmdletBinding(SupportsShouldProcess, ConfirmImpact='Medium', DefaultParameterSetName='Path', PositionalBinding=$false)]
param(
[Parameter(ParameterSetName='Path', Mandatory, Position = 0, ValueFromPipeline, ValueFromPipelineByPropertyName)]
[string[]] $Path
,
[Parameter(ParameterSetName='Literalpath', ValueFromPipelineByPropertyName)]
[Alias('PSPath')]
[string[]] $LiteralPath
,
[switch] $Recurse
)
begin {
# !! Workaround for https://github.com/PowerShell/PowerShell/issues/1759
if ($ErrorActionPreference -eq [System.Management.Automation.ActionPreference]::Ignore) { $ErrorActionPreference = 'Ignore'}
$targetPath = ''
$yesToAll = $noToAll = $false
function trimTrailingPathSep([string] $itemPath) {
if ($itemPath[-1] -in '\', '/') {
# Trim the trailing separator, unless the path is a root path such as '/' or 'c:\'
if ($itemPath.Length -gt 1 -and $itemPath -notmatch '^[^:\\/]+:.$') {
$itemPath = $itemPath.Substring(0, $itemPath.Length - 1)
}
}
$itemPath
}
function getTempPathOnSameVolume([string] $itemPath, [string] $tempDir) {
if (-not $tempDir) { $tempDir = [IO.Path]::GetDirectoryName($itemPath) }
[IO.Path]::Combine($tempDir, [IO.Path]::GetRandomFileName())
}
function syncRemoveFile([string] $filePath, [string] $tempDir) {
# Clear the ReadOnly attribute, if present.
if (($attribs = [IO.File]::GetAttributes($filePath)) -band [System.IO.FileAttributes]::ReadOnly) {
[IO.File]::SetAttributes($filePath, $attribs -band -bnot [System.IO.FileAttributes]::ReadOnly)
}
$tempPath = getTempPathOnSameVolume $filePath $tempDir
[IO.File]::Move($filePath, $tempPath)
[IO.File]::Delete($tempPath)
}
function syncRemoveDir([string] $dirPath, [switch] $recursing) {
if (-not $recursing) { $dirPathParent = [IO.Path]::GetDirectoryName($dirPath) }
# Clear the ReadOnly attribute, if present.
# Note: [IO.File]::*Attributes() is also used for *directories*; [IO.Directory] doesn't have attribute-related methods.
if (($attribs = [IO.File]::GetAttributes($dirPath)) -band [System.IO.FileAttributes]::ReadOnly) {
[IO.File]::SetAttributes($dirPath, $attribs -band -bnot [System.IO.FileAttributes]::ReadOnly)
}
# Remove all children synchronously.
$isFirstChild = $true
foreach ($item in [IO.directory]::EnumerateFileSystemEntries($dirPath)) {
if (-not $recursing -and -not $Recurse -and $isFirstChild) { # If -Recurse wasn't specified, prompt for nonempty dirs.
$isFirstChild = $false
# Note: If -Confirm was also passed, this prompt is displayed *in addition*, after the standard $PSCmdlet.ShouldProcess() prompt.
# While Remove-Item also prompts twice in this scenario, it shows the has-children prompt *first*.
if (-not $PSCmdlet.ShouldContinue("The item at '$dirPath' has children and the -Recurse switch was not specified. If you continue, all children will be removed with the item. Are you sure you want to continue?", 'Confirm', ([ref] $yesToAll), ([ref] $noToAll))) { return }
}
$itemPath = [IO.Path]::Combine($dirPath, $item)
([ref] $targetPath).Value = $itemPath
if ([IO.Directory]::Exists($itemPath)) {
syncremoveDir $itemPath -recursing
} else {
syncremoveFile $itemPath $dirPathParent
}
}
# Finally, remove the directory itself synchronously.
([ref] $targetPath).Value = $dirPath
$tempPath = getTempPathOnSameVolume $dirPath $dirPathParent
[IO.Directory]::Move($dirPath, $tempPath)
[IO.Directory]::Delete($tempPath)
}
}
process {
$isLiteral = $PSCmdlet.ParameterSetName -eq 'LiteralPath'
if ($env:OS -ne 'Windows_NT') { # Unix: simply pass through to Remove-Item, which on Unix works reliably and synchronously
Remove-Item @PSBoundParameters
} else { # Windows: use synchronous custom implementation
foreach ($rawPath in ($Path, $LiteralPath)[$isLiteral]) {
# Resolve the paths to full, filesystem-native paths.
try {
# !! Convert-Path does find hidden items via *literal* paths, but not via *wildcards* - and it has no -Force switch (yet)
# !! See https://github.com/PowerShell/PowerShell/issues/6501
$resolvedPaths = if ($isLiteral) { Convert-Path -ErrorAction Stop -LiteralPath $rawPath } else { Convert-Path -ErrorAction Stop -path $rawPath}
} catch {
Write-Error $_ # relay error, but in the name of this function
continue
}
try {
$isDir = $false
foreach ($resolvedPath in $resolvedPaths) {
# -WhatIf and -Confirm support.
if (-not $PSCmdlet.ShouldProcess($resolvedPath)) { continue }
if ($isDir = [IO.Directory]::Exists($resolvedPath)) { # dir.
# !! A trailing '\' or '/' causes directory removal to fail ("in use"), so we trim it first.
syncRemoveDir (trimTrailingPathSep $resolvedPath)
} elseif ([IO.File]::Exists($resolvedPath)) { # file
syncRemoveFile $resolvedPath
} else {
Throw "Not a file-system path or no longer extant: $resolvedPath"
}
}
} catch {
if ($isDir) {
$exc = $_.Exception
if ($exc.InnerException) { $exc = $exc.InnerException }
if ($targetPath -eq $resolvedPath) {
Write-Error "Removal of directory '$resolvedPath' failed: $exc"
} else {
Write-Error "Removal of directory '$resolvedPath' failed, because its content could not be (fully) removed: $targetPath`: $exc"
}
} else {
Write-Error $_ # relay error, but in the name of this function
}
continue
}
}
}
}
}
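For reference, a couple of hypothetical invocations (not part of the original answer) that the parameter declarations above accept:
# Preview removal of a directory tree, then pipe items in;
# Get-ChildItem output binds to -LiteralPath via its PSPath property.
Remove-FileSystemItem -Path 'C:\Test Folder\Target' -Recurse -WhatIf
Get-ChildItem 'C:\Test Folder\Target' -Directory | Remove-FileSystemItem -Recurse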
[1] I've personally verified that the issue is resolved in version 20H2, by running the tests in GitHub issue #27958 for hours without failure; this answer suggests that the problem was resolved as early as version 1909, starting with build 18363.657, but Dinh Tran finds that the issue is not resolved as of build 18363.1316 when removing large directory trees such as node_modules. I couldn't find any official information on the subject.

Dynamic parameter value depending on another dynamic parameter value

Starting premise: very restrictive environment, Windows 7 SP1, Powershell 3.0. Limited or no possibility of using external libraries.
I'm trying to re-write a bash tool I created previously, this time using PowerShell. In bash I implemented autocompletion to make the tool more user friendly and I want to do the same thing for the PowerShell version.
The bash version worked like this:
./launcher <Tab> => ./launcher test (or dev, prod, etc.)
./launcher test <Tab> => ./launcher test app1 (or app2, app3, etc.)
./launcher test app1 <Tab> => ./launcher test app1 command1 (or command2, command3, etc.).
As you can see, everything was dynamic. The list of environments was dynamic, the list of applications was dynamic depending on the environment selected, and the list of commands was also dynamic.
The problem is with the test → application connection. I want to show the correct application based on the environment already selected by the user.
Using PowerShell's DynamicParam I can get a dynamic list of environments based on a folder listing. I can't however (or at least I haven't found out how to) do another folder listing but this time using a variable based on the existing user selection.
Current code:
function ParameterCompletion {
$RuntimeParameterDictionary = New-Object Management.Automation.RuntimeDefinedParameterDictionary
# Block 1.
$AttributeCollection = New-Object Collections.ObjectModel.Collection[System.Attribute]
$ParameterName = "Environment1"
$ParameterAttribute = New-Object Management.Automation.ParameterAttribute
$ParameterAttribute.Mandatory = $true
$ParameterAttribute.Position = 1
$AttributeCollection.Add($ParameterAttribute)
# End of block 1.
$parameterValues = $(Get-ChildItem -Path ".\configurations" -Directory | Select-Object -ExpandProperty Name)
$ValidateSetAttribute = New-Object Management.Automation.ValidateSetAttribute($parameterValues)
$AttributeCollection.Add($ValidateSetAttribute)
$RuntimeParameter = New-Object Management.Automation.RuntimeDefinedParameter($ParameterName, [string], $AttributeCollection)
$RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)
# Block 2: same thing as in block 1 just with 2 at the end of variables.
# Problem section: how can I change this line to include ".\configurations\${myVar}"?
# And what's the magic incantation to fill $myVar with the info I need?
$parameterValues2 = $(Get-ChildItem -Path ".\configurations" -Directory | Select-Object -ExpandProperty Name)
$ValidateSetAttribute2 = New-Object Management.Automation.ValidateSetAttribute($parameterValues2)
$AttributeCollection2.Add($ValidateSetAttribute2)
$RuntimeParameter2 = New-Object Management.Automation.RuntimeDefinedParameter($ParameterName2, [string], $AttributeCollection2)
$RuntimeParameterDictionary.Add($ParameterName2, $RuntimeParameter2)
return $RuntimeParameterDictionary
}
function App {
[CmdletBinding()]
Param()
DynamicParam {
return ParameterCompletion "Environment1"
}
Begin {
$Environment = $PsBoundParameters["Environment1"]
}
Process {
}
}
I would recommend using argument completers, which are semi-exposed in PowerShell 3 and 4, and fully exposed in version 5.0 and higher. For v3 and v4, the underlying functionality is there, but you have to override the TabExpansion2 built-in function to use them. That's OK for your own session, but it's generally frowned upon to distribute tools that do that to other people's sessions (imagine if everyone tried to override that function). A PowerShell team member has a module that does this for you called TabExpansionPlusPlus. I know I said overriding TabExpansion2 was bad, but it's OK if this module does it :)
When I needed to support versions 3 and 4, I would distribute my commands in modules, and have the modules check for the existence of the 'Register-ArgumentCompleter' command, which is a cmdlet in v5+ and is a function if you have the TE++ module. If the module found it, it would register any completer(s), and if it didn't, it would notify the user that argument completion wouldn't work unless they got the TabExpansionPlusPlus module.
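That check could look something like this (a sketch of the pattern described, not the module's actual code):
if (Get-Command Register-ArgumentCompleter -ErrorAction SilentlyContinue) {
    # v5+ cmdlet, or the function provided by TabExpansionPlusPlus
    Register-ArgumentCompleter -CommandName launcher -ParameterName Environment1 -ScriptBlock { <# completer body #> }
}
else {
    Write-Warning "Argument completion requires PowerShell 5+ or the TabExpansionPlusPlus module."
}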
Assuming you have the TE++ module or PSv5+, I think this should get you on the right track:
function launcher {
[CmdletBinding()]
param(
[string] $Environment1,
[string] $Environment2,
[string] $Environment3
)
$PSBoundParameters
}
1..3 | ForEach-Object {
Register-ArgumentCompleter -CommandName launcher -ParameterName "Environment${_}" -ScriptBlock {
param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameter)
$PathParts = $fakeBoundParameter.Keys | where { $_ -like 'Environment*' } | sort | ForEach-Object {
$fakeBoundParameter[$_]
}
Get-ChildItem -Path ".\configurations\$($PathParts -join '\')" -Directory -ErrorAction SilentlyContinue | select -ExpandProperty Name | where { $_ -like "${wordToComplete}*" } | ForEach-Object {
New-Object System.Management.Automation.CompletionResult (
$_,
$_,
'ParameterValue',
$_
)
}
}
}
For this to work, your current working directory will need a 'configurations' directory contained in it, and you'll need at least three levels of subdirectories (reading through your example, it looked like you were going to enumerate a directory, and you would go deeper into that structure as parameters were added). The enumerating of the directory isn't very smart right now, and you can fool it pretty easily if you just skip a parameter, e.g., launcher -Environment3 <TAB> would try to give you completions for the first subdirectory.
This works if you will always have three parameters available. If you need a variable # of parameters, you could still use completers, but it might get a little trickier.
The biggest downside would be that you'd still have to validate the users' input since completers are basically just suggestions, and users don't have to use those suggestions.
If you want to use dynamic parameters, it gets pretty crazy. There may be a better way, but I've never been able to see the value of other dynamic parameters at the command line without using reflection, and at that point you're using functionality that could change at the next release (the members usually aren't public for a reason). It's tempting to try to use $MyInvocation inside the DynamicParam {} block, but it's not populated at the time the user is typing the command into the command line, and it only shows one line of the command anyway without using reflection.
The below was tested on PowerShell 5.1, so I can't guarantee that any other version has these exact same class members (it's based off of something I first saw Garrett Serack do). Like the previous example, it depends on a .\configurations folder in the current working directory (if there isn't one, you won't see any -Environment parameters).
function badlauncher {
[CmdletBinding()]
param()
DynamicParam {
#region Get the arguments
# In its current form, this will ignore parameter names, e.g., '-ParameterName ParameterValue' would ignore '-ParameterName',
# and only 'ParameterValue' would be in $UnboundArgs
$BindingFlags = [System.Reflection.BindingFlags] 'Instance, NonPublic, Public'
$Context = $PSCmdlet.GetType().GetProperty('Context', $BindingFlags).GetValue($PSCmdlet)
$CurrentCommandProcessor = $Context.GetType().GetProperty('CurrentCommandProcessor', $BindingFlags).GetValue($Context)
$ParameterBinder = $CurrentCommandProcessor.GetType().GetProperty('CmdletParameterBinderController', $BindingFlags).GetValue($CurrentCommandProcessor)
$UnboundArgs = @($ParameterBinder.GetType().GetProperty('UnboundArguments', $BindingFlags).GetValue($ParameterBinder) | where { $_ } | ForEach-Object {
try {
if (-not $_.GetType().GetProperty('ParameterNameSpecified', $BindingFlags).GetValue($_)) {
$_.GetType().GetProperty('ArgumentValue', $BindingFlags).GetValue($_)
}
}
catch {
# Don't do anything??
}
})
#endregion
$ParamDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
# Create an Environment parameter for each argument specified, plus one extra as long as there
# are valid subfolders under .\configurations
for ($i = 0; $i -le $UnboundArgs.Count; $i++) {
$ParameterName = "Environment$($i + 1)"
$ParamAttributes = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
$ParamAttributes.Add((New-Object Parameter))
$ParamAttributes[0].Position = $i
# Build the path that will be enumerated based on previous arguments
$PathSb = New-Object System.Text.StringBuilder
$PathSb.Append('.\configurations\') | Out-Null
for ($j = 0; $j -lt $i; $j++) {
$PathSb.AppendFormat('{0}\', $UnboundArgs[$j]) | Out-Null
}
$ValidParameterValues = Get-ChildItem -Path $PathSb.ToString() -Directory -ErrorAction SilentlyContinue | Select-Object -ExpandProperty Name
if ($ValidParameterValues) {
$ParamAttributes.Add((New-Object ValidateSet $ValidParameterValues))
$ParamDictionary[$ParameterName] = New-Object System.Management.Automation.RuntimeDefinedParameter (
$ParameterName,
[string[]],
$ParamAttributes
)
}
}
return $ParamDictionary
}
process {
$PSBoundParameters
}
}
The cool thing about this one is that it can keep going as long as there are folders, and it automatically does parameter validation. Of course, you're breaking the laws of .NET by using reflection to get at all those private members, so I would consider this a terrible and fragile solution, no matter how fun it was to come up with.
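With a hypothetical folder tree such as .\configurations\test\app1\command1 (my example layout, not from the post), usage would then look like this, with each value validated against the generated [ValidateSet()]:
badlauncher -Environment1 test -Environment2 app1 -Environment3 command1
# A value that isn't a matching subfolder is rejected by parameter validation:
badlauncher -Environment1 nosuchenv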

Is it possible to programmatically add folders to the Windows 10 Quick Access panel in the explorer window?

Apparently Microsoft has (sort of) replaced the "Favorites" Windows Explorer item with the Quick Access item. But I haven't been able to find a way to programmatically add folders to it (neither on Google nor MSDN). Is there no way to do this yet?
There is a simple way to do it in PowerShell (at least):
$o = new-object -com shell.application
$o.Namespace('c:\My Folder').Self.InvokeVerb("pintohome")
Hope it helps.
Yohan Ney's answer for pinning an item is correct. To unpin an item you can do this:
$QuickAccess = New-Object -ComObject shell.application 
($QuickAccess.Namespace("shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}").Items() | where {$_.Path -eq "C:\Temp"}).InvokeVerb("unpinfromhome")
Here's a script I wrote to make pin/unpin a little easier:
https://gallery.technet.microsoft.com/Set-QuickAccess-117e9a89
Maybe it will help someone until MS releases an API.
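A rough sketch of what such a helper might look like, based only on the pin/unpin verbs shown above (the function name and shape here are mine, not the linked script):
function Set-QuickAccessPin {
    param(
        [Parameter(Mandatory)] [string] $Path,
        [switch] $Unpin
    )
    $shell = New-Object -ComObject shell.application
    if ($Unpin) {
        # Locate the pinned entry inside the Quick Access shell namespace and unpin it
        ($shell.Namespace('shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}').Items() |
            Where-Object { $_.Path -eq $Path }).InvokeVerb('unpinfromhome')
    }
    else {
        # Pin the folder itself
        $shell.Namespace($Path).Self.InvokeVerb('pintohome')
    }
}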
I ran procmon and it seems that these registry keys are involved
Pin to Quick access:
HKEY_CLASSES_ROOT\Folder\shell\pintohome
When unpin:
HKEY_CLASSES_ROOT\PinnedFrequentPlace\shell\unpinfromhome\command
Also this resource is used when pinning: (EDIT1: can't find it any longer..)
AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations\{SOME_SORT_OF_GUID}.automaticDestinations-ms
You can try opening it with 7-Zip; there are several files in there which fit the destination.
EDIT2: I found that running this in the 'Run' dialog opens up Quick Access:
shell:::{679F85CB-0220-4080-B29B-5540CC05AAB6}
I got an answer here:
Windows 10 - Programmatically use Quick Access
Apparently, it's not possible yet, but a proposition for such an API has been made.
I like Johan's answer, but I added a little bit so it doesn't remove some of the items that were already in there. I had a ton pinned in there by accident; I must have selected pin folder or something to Quick Access.
$QuickAccess = New-Object -ComObject shell.application
$okItems = @("Desktop","Downloads","Documents","Pictures","iCloud Photos","iCloud Drive","PhpstormProjects","Wallpapers 5","Videos", "Schedules for testing")
($QuickAccess.Namespace("shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}").Items() | where {$_.name -notin $okItems}).InvokeVerb("unpinfromhome");
Building on what others have said... This allows you to remove all pinned folders (not just all/recent folders/items):
$o = new-object -com shell.application
$($o.Namespace("shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}").Items() | where { $_.IsFolder -eq "True" -and ($($_.Verbs() | Where-Object {$_.Name -in "Unpin from Quick access"}) -ne $null)}).InvokeVerb("unpinfromhome")
I needed this so I could back up / restore my list of Quick Access links quickly. So I put this at the top of my script (to remove all pinned items), and then the rest of the script re-adds them. This ensures the order is correct.
And yes, I'm sure there's a better syntax for the above code.
EDIT: After further investigation, I have realized Quick Access contains two "sections". One is Pinned Items, and the other is Frequent Folders. For some reason, Music and Videos come by default on the second section (at least in 1909), unlike the rest (Desktop/Downloads/Documents/Pictures). So the verb to invoke changes from unpinfromhome to removefromhome (defined in HKEY_CLASSES_ROOT\FrequentPlace, CLSID: {b918dbc4-162c-43e5-85bf-19059a776e9e}). In PowerShell:
$Unpin = @("$env:USERPROFILE\Videos","$env:USERPROFILE\Music")
$qa = New-Object -ComObject shell.application
$ob = $qa.Namespace('shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}').Items() | ? {$_.Path -in $Unpin}
$ob.InvokeVerb('removefromhome')
In Windows 1909, you can't unpin the Music or Videos links from Quick Access with the proposed PowerShell solution. It seems they're special because they don't include the "pin" icon, unlike the rest.
The solution is to pin and unpin them. I don't know much about the Windows API or PowerShell so there may be a less convoluted way.
$Unpin = @("$env:USERPROFILE\Videos","$env:USERPROFILE\Music")
$qa = New-Object -ComObject shell.application
ForEach ($dir in $Unpin) { $qa.Namespace($dir).Self.InvokeVerb('pintohome') }
$ob = $qa.Namespace('shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}').Items() | ? {$_.Path -in $Unpin}
$ob.InvokeVerb('unpinfromhome')
Another way is renaming f01b4d95cf55d32a.automaticDestinations-ms, then logging off/rebooting so that it's recreated. But I don't know if it has side effects. Batch script:
:: f01b4d95cf55d32a => Frequent Folders
:: 5f7b5f1e01b83767 => Recent Files
rename "%APPDATA%\Microsoft\Windows\Recent\AutomaticDestinations\f01b4d95cf55d32a.automaticDestinations-ms" f01b4d95cf55d32a.automaticDestinations-ms.bak
void PinToHome(const std::wstring& folder)
{
ShellExecute(0, L"pintohome", folder.c_str(), L"", L"", SW_HIDE);
}
That was the easy part; I'm still unable to do an unpinfromhome.
I was able to get this to work in C# using shell32 based on the information in this post and some info on shell32 from this post https://stackoverflow.com/a/19035049
You need to add a reference to "Microsoft Shell Controls and Automation".
This will add a link
Type shellAppType = Type.GetTypeFromProgID("Shell.Application");
Object shell = Activator.CreateInstance(shellAppType);
Shell32.Folder2 f = (Shell32.Folder2)shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { "C:\\temp" });
f.Self.InvokeVerb("pintohome");
This will remove a link by name
Type shellAppType = Type.GetTypeFromProgID("Shell.Application");
Object shell = Activator.CreateInstance(shellAppType);
Shell32.Folder2 f2 = (Shell32.Folder2)shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { "shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}" });
Console.WriteLine("item count: " + f2.Items().Count);
foreach (FolderItem fi in f2.Items())
{
Console.WriteLine(fi.Name);
if (fi.Name == "temp")
{
((FolderItem)fi).InvokeVerb("unpinfromhome");
}
}
For those that work with .NET Core:
Sadly, you cannot include a reference to "Microsoft Shell Controls and Automation" in the build-process.
But you can instead use dynamic, and omit the reference:
public static void PinToQuickAccess(string folder)
{
// You need to include "Microsoft Shell Controls and Automation" reference
// Cannot include reference in .NET Core
System.Type shellAppType = System.Type.GetTypeFromProgID("Shell.Application");
object shell = System.Activator.CreateInstance(shellAppType);
// Shell32.Folder2 f = (Shell32.Folder2)shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { folder });
dynamic f = shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { folder });
f.Self.InvokeVerb("pintohome");
}
And to unpin:
public static void UnpinFromQuickAccess(string folder)
{
// You need to include "Microsoft Shell Controls and Automation" reference
System.Type shellAppType = System.Type.GetTypeFromProgID("Shell.Application");
object shell = System.Activator.CreateInstance(shellAppType);
// Shell32.Folder2 f2 = (Shell32.Folder2)shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { "shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}" });
dynamic f2 = shellAppType.InvokeMember("NameSpace", System.Reflection.BindingFlags.InvokeMethod, null, shell, new object[] { "shell:::{679f85cb-0220-4080-b29b-5540cc05aab6}" });
foreach (dynamic fi in f2.Items())
{
if (string.Equals(fi.Path, folder))
{
fi.InvokeVerb("unpinfromhome");
}
}
}

Creating a zipped/compressed folder in Windows using Powershell or the command line

I am creating a nightly database schema file and would like to put all the files created each night, one for each database, into a folder and compress that folder.
I have a PowerShell script that creates the schema-only creation scripts for the databases and then adds all the files to a new folder. The problem lies within the compression portion of this process.
Does anybody have any idea if this can be accomplished with the pre-installed Windows utility that handles folder compression?
It would be best to use that utility if possible rather than something like 7-Zip (I don't feel like installing 7-Zip on every customer's server, and it may take IT years to do it if I ask them).
A native way with the .NET 4.5 framework, but entirely feature-less:
Creation:
Add-Type -Assembly "System.IO.Compression.FileSystem" ;
[System.IO.Compression.ZipFile]::CreateFromDirectory("c:\your\directory\to\compress", "yourfile.zip") ;
Extraction:
Add-Type -Assembly "System.IO.Compression.FileSystem" ;
[System.IO.Compression.ZipFile]::ExtractToDirectory("yourfile.zip", "c:\your\destination") ;
As mentioned, totally feature-less, so don't expect an overwrite flag.
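Since there is no overwrite flag, one workaround (my own sketch, not part of the original answer) is to delete any existing archive first, because CreateFromDirectory fails if the target file already exists:
Add-Type -Assembly "System.IO.Compression.FileSystem"
$zip = "C:\your\destination\yourfile.zip"
# CreateFromDirectory throws an IOException if the target zip already exists
if (Test-Path $zip) { Remove-Item $zip }
[System.IO.Compression.ZipFile]::CreateFromDirectory("c:\your\directory\to\compress", $zip)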
Here's a couple of zip-related functions that don't rely on extensions: Compress Files with Windows PowerShell.
The main function that you'd likely be interested in is:
function Add-Zip
{
param([string]$zipfilename)
if(-not (test-path($zipfilename)))
{
set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipfilename).IsReadOnly = $false
}
$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipfilename)
foreach($file in $input)
{
$zipPackage.CopyHere($file.FullName)
Start-sleep -milliseconds 500
}
}
Usage:
dir c:\demo\files\*.* -Recurse | Add-Zip c:\demo\myzip.zip
There is one caveat: the shell.application object's NameSpace() function fails to open up the zip file for writing if the path isn't absolute. So, if you passed a relative path to Add-Zip, it'll fail with a null error, so the path to the zip file must be absolute.
Or you could just add a $zipfilename = resolve-path $zipfilename at the beginning of the function.
As of PowerShell 5, there is a Compress-Archive cmdlet that does the task out of the box.
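For example (a sketch with placeholder paths; -Force overwrites an existing archive):
Compress-Archive -Path "C:\demo\files\*" -DestinationPath "C:\demo\myzip.zip" -Force
Expand-Archive -Path "C:\demo\myzip.zip" -DestinationPath "C:\demo\extracted"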
This compresses .\in contents to .\out.zip with System.IO.Packaging.ZipPackage following the example here
$zipArchive = $pwd.path + "\out.zip"
[System.Reflection.Assembly]::Load("WindowsBase,Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35")
$ZipPackage=[System.IO.Packaging.ZipPackage]::Open($zipArchive, [System.IO.FileMode]"OpenOrCreate", [System.IO.FileAccess]"ReadWrite")
$in = gci .\in | select -expand fullName
[array]$files = $in -replace "C:","" -replace "\\","/"
ForEach ($file In $files) {
$partName=New-Object System.Uri($file, [System.UriKind]"Relative")
$part=$ZipPackage.CreatePart($partName, "application/zip", [System.IO.Packaging.CompressionOption]"Maximum")
$bytes=[System.IO.File]::ReadAllBytes($file)
$stream=$part.GetStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
}
$ZipPackage.Close()
I used voithos' answer to zip files up in PowerShell, but had one problem with the Add-Zip function: the Start-Sleep -Milliseconds 500 caused problems if the file couldn't be fully zipped up in that time - the next one starting before it was complete caused errors, and some files were not zipped.
So after playing around for a bit, I first tried to get a counter going to check the count of $zipPackage.Items() and only continue after the item count increased (which did not work, as it would return 0 in some cases when it should not). I found that it returns 0 while the package is still zipping/copying the files (I think, haha). I added a simple while loop with the Start-Sleep inside of it, waiting for $zipPackage.Items().Count to be non-zero before continuing, and this seems to solve the problem.
function Add-Zip
{
param([string]$zipfilename)
if(-not (test-path($zipfilename)))
{
set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipfilename).IsReadOnly = $false
}
$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipfilename)
foreach($file in $input)
{
$zipPackage.CopyHere($file.FullName)
do
{
Start-sleep -milliseconds 250
}
while ($zipPackage.Items().count -eq 0)
}
}
Using PowerShell Version 3.0:
Copy-ToZip -File ".\blah" -ZipFile ".\blah.zip" -Force
Hope this helps.

Resources