So I want to know if any of the folders in a directory have any subfolders or files in them. I tried just looking at the directory in PowerShell, but it gave me only mode, last write time, and name. Is there any way of adding to this list to include folder metadata like size or the number of subfiles/subfolders? All I want to know is whether they are empty or not, so there may be a simpler way I'm missing.
Thanks for any help you can give!
I see the question is tagged 'windows', so on Windows you could also use a COM object.
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($pathToFolder)
$folder will be an object with a bunch of interesting metadata on it, including SubFolders and Files. One of the interesting ones is Size. If Size is zero, there are no files in that directory, or in any nested subdirectories either.
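For example, here's a quick sketch (assuming $pathToFolder is the parent directory whose subfolders you want to check) that lists each subfolder with its file and subfolder counts, so you can see at a glance which ones are empty:
$fso = New-Object -ComObject Scripting.FileSystemObject
Get-ChildItem $pathToFolder -Directory | ForEach-Object {
    # GetFolder exposes Files, SubFolders and Size for each folder
    $f = $fso.GetFolder($_.FullName)
    [pscustomobject]@{
        Folder     = $_.FullName
        Files      = $f.Files.Count
        SubFolders = $f.SubFolders.Count
        SizeBytes  = $f.Size
        IsEmpty    = ($f.Files.Count -eq 0 -and $f.SubFolders.Count -eq 0)
    }
}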
If you just want to know if there are folders/subfolders and/or files then this will work:
$folder="C:\Test"
Get-ChildItem $folder -Recurse | Measure-Object
Output (in my case)
Count : 2
Average :
Sum :
Maximum :
Minimum :
Property :
If you want to see more properties then this might work for you:
Get-ChildItem -Path $folder -Recurse | Format-List *
alternatively you can also select the first x, last x, or even skip items:
Get-ChildItem -Path $folder -Recurse |Select-Object -First 2| Format-List *
* -Recurse will check all folders below the given path
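If the goal is specifically to flag which subfolders are empty, you could combine this with a loop over each immediate subfolder. A rough sketch (assuming $folder is the parent directory; -Force is used so hidden items count too):
Get-ChildItem $folder -Directory | ForEach-Object {
    # count everything (recursively) inside this subfolder
    $count = (Get-ChildItem $_.FullName -Recurse -Force | Measure-Object).Count
    [pscustomobject]@{ Folder = $_.FullName; ItemCount = $count; IsEmpty = ($count -eq 0) }
}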
I've seen variations of this question answered, but typically using something like 7zip. I'm trying to find a solution that will work with the capabilities that come with Windows, absent any additional tools.
I have a directory that contains several hundred subdirectories. I need to individually compress each subdirectory....so I'll wind up with several hundred zip files, one per subdirectory. This is on a machine at work where I don't have administrative privileges to install new software...hence the desire to stay away from 7zip, winRar, etc.
If this has already been answered elsewhere, my apologies...
Never tried that myself, but there is Compress-Archive:
The Compress-Archive cmdlet creates a zipped (or compressed) archive file from one or more specified files or folders. An archive file allows multiple files to be packaged, and optionally compressed, into a single zipped file for easier distribution and storage. An archive file can be compressed by using the compression algorithm specified by the CompressionLevel parameter.
Because Compress-Archive relies upon the Microsoft .NET Framework API System.IO.Compression.ZipArchive to compress files, the maximum file size that you can compress by using Compress-Archive is currently 2 GB. This is a limitation of the underlying API.
Here's a sample script I just hacked together:
# configure as needed
$source = "c:\temp"
$target = "d:\temp\test"
# grab source file names and list them
$files = gci $source -recurse
$files
# target exists?
if( -not (test-path $target)) {
new-item $target -type directory
}
# compress, I am using -force here to overwrite existing files
$files | foreach{
$dest = "$target\" + $_.name + ".zip"
compress-archive $_ $dest -CompressionLevel Optimal -force
}
# list target dir contents
gci $target -recurse
You may have to improve it a bit when it comes to subfolders. In the above version, subfolders are compressed as a whole into a single file. This might not exactly be what you want.
Get-ChildItem c:\path\of\your\folder | ForEach-Object {
    $path = $_.FullName
    Compress-Archive -Path $path -DestinationPath "$path.zip"
}
I'm posting this as a quick snippet. Don't hesitate to comment if it does not fit your request.
In a folder X, there are subfolders Y1, Y2...
Y1.zip, Y2.zip... will be created.
Using PowerShell, go to the path that you would like to compress, then run:
$folderlist = Get-ChildItem "."
foreach ($Folder in $folderlist) { Compress-Archive -Path $Folder.Name -DestinationPath "$($Folder.Name).zip" }
Is there a way to stop PowerShell from sorting by default? I need to read in files from a directory, and the order in which they are listed in the directory needs to be the order in which they end up in the array (variable). Even when I use -lastwritetime on the get-childitem command, it seems to have no effect. The primary reason I want to do this is that the files have names that are all the same except each file has a number after it, like the following:
document1.doc
document2.doc
document3.doc
.....
document110.doc
The problem is if it's sorted by name, it will sort in this manner:
document1.doc
document10.doc
document100.doc
Which is horribly wrong!
Right now I have this command and it doesn't work:
$filesnames1 = get-childItem -name *.doc -Path c:\fileFolder\test | sort-object -LastWriteTime
You probably want something more along these lines.
$filesnames1 = Get-ChildItem -Path c:\fileFolder\test\*.doc |
Sort-Object -Property LastWriteTime
I don't think either of those two cmdlets have a -LastWriteTime parameter.
If you need only the names from those filesystem objects, you can use ($filesnames1).Name after the code above. There are other ways.
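If what you actually need is the files ordered by the number embedded in the name (rather than by LastWriteTime), a script-block sort should also work. This is just a sketch and assumes every name follows the documentN.doc pattern; -replace '\D' strips the non-digit characters so the remaining number sorts numerically (the [int] cast will fail for names with no digits):
$filesnames1 = Get-ChildItem -Filter *.doc -Path c:\fileFolder\test |
    Sort-Object { [int]($_.BaseName -replace '\D') } |
    Select-Object -ExpandProperty Name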
Thanks for responding Mike. What I did is put a "-filter *.pdf" just before -path, which gave me the headers. Then I piped in a "select-object -ExpandProperty name" to list it exactly how I needed it. It was a little trial and error, but I did eventually figure it out.
$filesnames1 = Get-ChildItem -Filter *.doc -Path c:\fileFolder\test |
    Sort-Object LastWriteTime | Select-Object -ExpandProperty Name
I'm trying to copy files using copy-item. Specifically, I want to copy files with a particular extension that are within a folder or its subfolders to another location, and to retain the subfolder hierarchy. I've tried using -filter and -include to specify the file extension, but no files are copied.
My source and destination paths are stored in variables $packageSourcePath and $objPath. When called, $packageSourcePath will be like the following ".\src\projects\Project1\PackageFiles" and $objPath will be like the following ".\bld\Project1\obj".
The command I've tried using is this:
Copy-Item -Path $packageSourcePath\* -Filter *.resw -Destination $objPath -Recurse
I've also tried variations, such as leaving off * from the path, or using -Include instead of -Filter. Nothing works. If I leave out the -Filter argument, then files copy, but all of the files are copied. I only want files with the particular extension.
I've given up on Copy-Item. JohnLBevan's answer didn't actually do what I want since all files in the source root get copied, even though they don't match the filter. I tried piping Convert-Path | Select-String | Copy-Item but still got all files in the source root being copied.
A contact in a different context provided a couple of suggestions:
1)
Get-ChildItem -Force -Recurse -ErrorAction Ignore -Path $packageSourcePath -Filter *.resw | % {
    $src = $_.FullName
    $dst = Join-Path $objPath $src.SubString($packageSourcePath.Length)
    echo "copy ""$src"" ""$dst"""
}
I think this is a bit harder to follow, hence less maintainable for the next person (likely another PS-neophyte like me) a year from now. ("Why is the -ErrorAction parameter needed here? What's the behaviour of the Substring() method, and why can't I find that using Get-Help?")
This suggestion is a bit clearer, after re-familiarizing with attrib and checking the effect of the xcopy switches:
2)
cd $packageSourcePath
attrib -a /s
attrib +a *.resw /s
xcopy /eidlm $packageSourcePath $objPath
But if we're going to use xcopy, we don't need to call attrib:
xcopy $packageSourcePath\*.resw $objPath /s /i > $null
The only problem with this for my scenario is that xcopy emits an error if no matching files are found. My script is being used for a VSTS build task, and the xcopy errors cause the build task to fail. (For that reason, I'm guessing that suggestion 2 also wouldn't work for me.)
So, I've opted for this:
# In PS version 5.1, nothing gets copied using Copy-Item $packageSourcePath\* -Filter *.resw ...
# so resorting to using xcopy, which mostly works. The one issue is that xcopy will output an
# error if no matching file is found, so using GCI first to test for a matching file.
if ($(Get-ChildItem $packageSourcePath\*.resw -Recurse).count -gt 0) {
xcopy $packageSourcePath\*.resw $objPath /s /i > $null
}
The condition using GCI is added to check there are matching files before calling xcopy, thereby avoiding any errors.
I'm still amazed that Copy-Item -Filter -Recurse didn't work.
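For reference, a rough pure-PowerShell equivalent of the xcopy approach is sketched below (my own untested sketch, not what the build task actually uses); it rebuilds each file's path relative to the source and creates the destination folder before copying:
$sourceRoot = (Resolve-Path $packageSourcePath).Path
Get-ChildItem -Path $sourceRoot -Filter *.resw -Recurse -File | ForEach-Object {
    # rebuild the path relative to the source root, then mirror it under $objPath
    $relative    = $_.FullName.Substring($sourceRoot.Length).TrimStart('\')
    $destination = Join-Path $objPath $relative
    New-Item -ItemType Directory -Path (Split-Path $destination -Parent) -Force | Out-Null
    Copy-Item -Path $_.FullName -Destination $destination
}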
This should do it (obviously this could be done in 1 line; I've assigned values to the variables just to help make it readable / self-explanatory):
[string]$filter = '*.resw'
[string]$source = Join-Path -Path $packageSourcePath -ChildPath '*'
[string]$target = $objPath
$source | Convert-Path | Copy-Item -Filter $filter -Recurse -Destination $target -Container #-Force
Notes:
We append the asterisk to the source path to ensure that we copy the contents of the source folder to the destination, without copying the source's root folder into the destination (i.e. if we're copying c:\temp\from to c:\temp\to, we don't want to end up with c:\temp\to\from, unless that's a copy of an actual c:\temp\from\from folder).
We use the Join-Path cmdlet to append this asterisk to ensure the appropriate slashes are inserted into the path.
We do a Convert-Path on the source to resolve the asterisk to the child folder/file names... for some reason copy-item doesn't handle these asterisks well. NB: Convert-Path will potentially return an array of paths; i.e. if there's more than one file/subfolder directly under the source folder. Get-Item or Resolve-Path could equally be used for this; I prefer Convert-Path since it returns a simple string array, rather than a more complex type; but there's no strong argument for using any one over the others.
We pipe these source paths to the Copy-Item command so it can be applied to each path returned by Convert-Path.
We include -Recurse to say we're interested in anything in the subfolders of the copied path.
We include the -Container parameter to say that we want to preserve any folder structure when copying. Strictly this is not needed, as this switch defaults to true (i.e. rather we should specify if we don't want this behaviour: -Container:$false); but I like to be clear that I deliberately want to preserve the directory structure, as opposed to leaving the assumption that I may not have thought of this. There's a better explanation of this here: https://stackoverflow.com/a/21798660/361842.
You could optionally include -Force; this would mean that should an item of the same name already exist in the target we overwrite it instead of getting an error.
Related documentation:
Join-Path
Convert-Path
Copy-Item
Update 2018-01-03
Per comments, this solution should ensure that only those items you want get copied, and pre-existing directories shouldn't cause issues.
[string]$filter = '*.resw'
[string]$source = $packageSourcePath
[string]$target = $objPath
#copy all files in subfolders of the source
$source | Get-ChildItem -Directory | Copy-Item -Filter $filter -Recurse -Destination $target -Container -Force
#copy all files in root of the source
$source | Get-ChildItem -File -Filter $filter | Copy-Item -Destination $target -Container -Force
This solution uses 2 steps; there's probably a better option, but due to the peculiarities / bug in this cmdlet the above's a reliable option.
I got this folder structure
C:\Users\myUser\Desktop
including folders called
BL-100
BL-105
BL-108
and so on...
Most BL- folders contain a file.xml, but not all.
So on the Desktop there are many folders starting with BL-, and most, but not all, of them contain a file.xml.
Now I want to find all folders that start with BL- and contain a file.xml, and rename those folders to RG-100, RG-105, RG-108 and so on.
At the moment I got this script:
foreach($Directory in Get-ChildItem -Path C:\Users\myUser\Desktop -Recurse | Where-Object{($_.Name.Substring(0,3) -eq 'BL-')}){
}
This does not work and is showing me error: Exception calling "Substring" with "2" argument(s): "Index and length must refer to a location within the string. Parameter name: length"
Anyone can help please?
The error you're seeing is because Substring fails when the string is shorter than the requested length; e.g. a folder or file whose name is shorter than 3 characters. To see what I mean, try running: '1'.Substring(0,3).
To avoid this, instead you could use the like operator. e.g.
foreach($Directory in (
Get-ChildItem -LiteralPath 'C:\Users\myUser\Desktop' -Recurse `
| Where-Object{($_.Name -like 'BL-*')}
)){
#...
}
Just do it:
Get-ChildItem "C:\Users\myuser\Desktop" -directory -Filter "BL-*"
I'm looking to thin down how many folders I need to recover after a cryptolocker outbreak at a clients site and started looking into powershell as a good way to do this. What I need to do is recover a folder if it has any file inside with the extension .encrypted.
I can run the below
get-childitem C:\ -recurse -filter "*.encrypted" | %{$_.DirectoryName} | Get-Unique
And get a list of all folders that have .encrypted files in them, but what I would like to do is thin the list down. For example, if we have the below folder list (assume * means the folder contains encrypted files):
C:\Folder1
C:\Folder1\Folder2\Folder4*
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
C:\Folder1\Folder3\Folder6\
rather than returning
C:\Folder1\Folder2\Folder4*
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
I would like it to return just the following, as this would be the optimal recovery option:
C:\Folder1\Folder2*
C:\Folder1\Folder3\Folder5*
I know this is a fairly complex problem so I'm not asking anyone to solve it for me just some pointers in the right direction would be awesome as my brain is fried at the moment and I need to write this fairly quickly.
Here's a simple way to do this that should be pretty efficient:
PS C:\> dir -ad -rec | where { test-path (join-path $_.FullName *.encrypted) }
dir is an alias for get-childitem
where is an alias for where-object
-ad means return directories only
-rec means recurse
test-path returns $true if the path exists (yes, it handles wildcards)
So, we recurse through all folders, forwarding each folder object down the pipeline. We get the full name of the folder and append *.encrypted to it. If test-path returns $true for this path, we forward the folder down the pipeline, and the folder ends up in the console output.
Now, if you want to get a little fancier, here's a more fleshed-out one-liner that will report the folders and the encrypted file count into a csv file named after the machine:
dir -ad -rec | ? { test-path (join-path $_.FullName *.encrypted) } | % {
    [pscustomobject]@{"Path"=$_.FullName;"Count"=(dir (join-path $_.FullName *.encrypted)).Count} } |
    Export-Csv "c:\temp\$(hostname).csv" -NoTypeInformation
(? and % are aliases for where-object and foreach-object respectively)
With a little more effort, you could do a fan-out scan of the entire company, assuming PowerShell remoting is enabled on each target machine, and have all results from all machines returned to you.
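As a rough sketch of that fan-out idea (assuming remoting is enabled; computers.txt is a hypothetical list of machine names, one per line):
$computers = Get-Content C:\temp\computers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
    # same directory test as above, run on each remote machine
    dir C:\ -ad -rec -ErrorAction SilentlyContinue |
        where { test-path (join-path $_.FullName *.encrypted) } |
        select @{n='Computer';e={$env:COMPUTERNAME}}, FullName
} | Export-Csv C:\temp\encrypted-folders.csv -NoTypeInformation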
Good luck!
This is too much for a comment, but I don't know that it would be a good answer, just a kind of hackish way to get it done...
The only thing I could think of is to get your list of folders, then start matching them all against each other, and when you get two that at least partially match remove the longer one.
$FullList = GCI C:\ -Recurse -Filter *.encrypted | Select -Expand DirectoryName -Unique | Sort -Property Length
$ToRemove = @()
foreach($Folder in $FullList){ $ToRemove += $FullList | Where{ $_ -ne $Folder -and ($_ -match [regex]::Escape($Folder)) } }
$FinalList = $FullList | Where{ $ToRemove -notcontains $_ }
That's going to be slow though; there has to be a better way to do it. I just haven't thought of one yet.
Don't get me wrong, this will work, and it's faster than going through things by hand for sure, but I'm sure that there has to be a better way to do it.
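One possible better way (just a sketch, untested at scale): index the folder list in a hashtable and drop any folder whose ancestor is also in the list, instead of comparing every pair of paths against each other:
$folders = Get-ChildItem C:\ -Recurse -Filter *.encrypted -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty DirectoryName -Unique

# index the folders for fast lookups (case-insensitive via ToLower)
$lookup = @{}
foreach ($f in $folders) { $lookup[$f.ToLower()] = $true }

$FinalList = foreach ($folder in $folders) {
    # walk up the ancestor chain; skip this folder if any ancestor also holds encrypted files
    $parent = Split-Path $folder -Parent
    $keep = $true
    while ($parent) {
        if ($lookup.ContainsKey($parent.ToLower())) { $keep = $false; break }
        $parent = Split-Path $parent -Parent
    }
    if ($keep) { $folder }
}
$FinalList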