I was downloading a huge torrent (1.2 TB, over 6,000 folders) split into two parts. Placing the second part was no problem, since that torrent's master folder had exactly the name I needed. The first part's master folder, however, had some generic torrent name instead of the name I needed. Instead of renaming the torrent to "source" (which I now think would have worked and renamed the generic folder to "source"), I selected all the files in the Files tab, chose right-click > Relocate, and BitTorrent simply moved all of the files into the same directory, without any subfolders, and created a mess.
I have an unfinished backup of this torrent whose files are still in the right places, so my idea was to match each finished file against the unfinished copy by name and put the finished file into the folder where its unfinished counterpart lives. I hope that was clear.
I tried to resolve this using PowerShell, but I don't know much about it, so I came up with the script below and nothing happens; something is wrong. Does anyone know a solution?
$itemlistA = Get-ChildItem -Path "D:\BitTorrent\" |
    ForEach-Object {
        $objnameA = $_.Name
        $objPathA = $_.FullName
    }

$itemlistB = Get-ChildItem -Path "E:\DesiredPath\" -Recurse |
    ForEach-Object {
        $objnameB = $_.Name
        $objPathB = $_.FullName
    }

ForEach-Object {
    if ($objnameA -eq $objnameB) {
        Copy-Item -Path $objPathA -Destination $objPathB
        Write-Host "ffff Object ($objnameA) new Path ($objPathB) ffff"
    }
}
If I'm understanding your intent correctly, the script below will accomplish your goal, assuming that goal is to copy files from a flattened directory into (potentially) nested directories so that the incoming files overwrite the files with matching names.
The O(n^2) performance of the nested loops could be improved with a sort and more efficient search.
You'd need to edit the script's params to reflect your own environment.
param(
    $pathToFiles = "$PSScriptRoot\BitTorrent\",
    $desiredPath = "$PSScriptRoot\DesiredPath\"
)

$itemlistA = Get-ChildItem -Path $pathToFiles | Select-Object -Property Name, FullName
$itemlistB = Get-ChildItem -Path $desiredPath -Recurse | Select-Object -Property Name, FullName

foreach ($fileA in $itemlistA) {
    foreach ($fileB in $itemlistB) {
        if ($fileB.Name -eq $fileA.Name) {
            Copy-Item -Path $fileA.FullName -Destination $fileB.FullName -Verbose
            break
        }
    }
}
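As a follow-up to the O(n^2) note above, here is a minimal sketch of the more efficient search, assuming PowerShell 3+ (for -File) and that file names are unique within the destination tree: index the destination once in a hashtable, then each source file needs only a single lookup.

param(
    $pathToFiles = "$PSScriptRoot\BitTorrent\",
    $desiredPath = "$PSScriptRoot\DesiredPath\"
)

# Build a name -> full path index of the destination: O(n), done once.
$destByName = @{}
Get-ChildItem -Path $desiredPath -Recurse -File | ForEach-Object {
    $destByName[$_.Name] = $_.FullName
}

# Each source file is now a single O(1) lookup instead of an inner loop.
Get-ChildItem -Path $pathToFiles -File | ForEach-Object {
    if ($destByName.ContainsKey($_.Name)) {
        Copy-Item -Path $_.FullName -Destination $destByName[$_.Name] -Verbose
    }
}

If the same file name occurs in several destination folders, the hashtable keeps only the last path seen, so check for duplicates first if that matters to you.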
I need some help getting back into PowerShell. I currently have a directory tree with a lot of folders; the structure is shown in the image I borrowed.
I want to share the folders named "C" and "F" throughout the whole directory tree at once, with multiple users having view and edit permissions. I hope someone can help; I'm very new to this.
Hi khuchatvui, and welcome to Stack Overflow!
New-SmbShare can be used for creating shared folders.
If I understand correctly, you only want to share folders with a specific name that exist at multiple levels. SMB share names have to be unique, so that will be a challenge if you want a specific share name.
You could partly automate this process by prompting for each share name during the creation:
Solution 1 - prompt for name
$FoldersToShare = Get-ChildItem -Path C:\Tests\ -Recurse -Directory |
    Where-Object { $_.Name -eq 'F' -or $_.Name -eq 'C' } |
    Select-Object -ExpandProperty FullName

foreach ($folder in $FoldersToShare) {
    New-SmbShare -Name (Read-Host -Prompt "Enter the sharename for $($folder)") -Path $folder -ChangeAccess "domain\groupname"
}
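If prompting for every share is too manual, here is a sketch of one workaround for the uniqueness constraint, assuming a name derived from the relative path is acceptable (e.g. A_A1_C):

$basePath = 'C:\Tests\'

Get-ChildItem -Path $basePath -Directory -Recurse |
    Where-Object { $_.Name -eq 'C' -or $_.Name -eq 'F' } |
    ForEach-Object {
        # Turn C:\Tests\A\A1\C into the unique share name A_A1_C.
        $shareName = $_.FullName.Substring($basePath.Length) -replace '\\', '_'
        New-SmbShare -Name $shareName -Path $_.FullName -ChangeAccess "domain\groupname"
    }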
If there is no pattern in the folders you want to share, but the names are unique, you could make a list of all the folders you want to share like this:
Solution 2 - create unique folder names
Get-ChildItem -Path c:\tests -Directory -Recurse | Select-Object Name, FullName | Export-Csv -NoTypeInformation -NoClobber -Delimiter ';' -Path C:\Tests\Stackoverflow\FoldersToShare.csv
Then modify that list in a text editor or Excel so it contains only the folders you want to share.
Finally, use PowerShell to import the contents of the modified csv file and loop through the entries with New-SmbShare to create the shared folders:
$FoldersToShare = Import-Csv -Path C:\Tests\Stackoverflow\FoldersToShare.csv -Delimiter ';'

foreach ($folder in $FoldersToShare) {
    New-SmbShare -Name $folder.Name -Path $folder.FullName -ChangeAccess "domain\groupname"
}
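Whichever route you take, you can sanity-check the result afterwards; for example (assuming the C:\Tests structure used in this answer):

# List the shares backed by folders under the test tree.
Get-SmbShare | Where-Object { $_.Path -like 'C:\Tests\*' }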
For my solution, I created the folder structure from your image under C:\Tests:

C:\Tests\
    A
        A1
            C
            F
        A2
            C
    B
        B1
            C
        B2
            C
We're migrating our FTP, and I would like to migrate only the folders that have had files used/written in them in the last 6 months. I would have thought this was something I'd find all over the place with Google, but all the scripts I've found have the same fatal flaw.
Everything I find depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date modified" of years ago, yet when you dig into them, there are files being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, called "Log6" let's say, and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like:
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with a LastWriteTime in the last 180 days. If anything "new" is found, don't add the overarching directory to the array; if not, list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-180) }
#$OldStuff = gci "D:\FTP\BELAMIINC" -File | Where-Object $filter

$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow the thought process along the way.
The use of -Force is mainly to find hidden files and folders.
$path  = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)

# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force

# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, the File and the File's dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
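From there, a possible usage sketch for the migration itself, assuming $path and $result are as above and that a hypothetical destination root D:\Migration is acceptable:

$destinationRoot = 'D:\Migration'   # hypothetical destination root

foreach ($folder in $result.FullName) {
    # Rebuild the same relative layout under the destination root.
    $relative = $folder.Substring($path.Length).TrimStart('\', '/')
    $target   = Join-Path $destinationRoot $relative
    New-Item -ItemType Directory -Force -Path $target | Out-Null

    # Copy the folder's direct files; nested folders that also had
    # recent files appear in $result themselves and get their own pass.
    Get-ChildItem -LiteralPath $folder -File |
        Copy-Item -Destination $target -Force
}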
I have a series of folders and subfolders, structured in this way:
001/Fabric/Blue/ (.jpg files, sequentially named)
001/Fabric/Green/ (.jpg files, sequentially named)
002/Fabric/Blue/ (.jpg files, sequentially named)
002/Fabric/Green/ (.jpg files, sequentially named)
etc.
The file names have excess characters that I would like to remove, and I would like to convert them to a simpler sequential format (0.jpg, 1.jpg, etc.).
I tried working from a few different PowerShell examples to get this working. I have the recursive searching functionality working; however, I receive an InvalidOperationException when the ForEach-Object loop tries to rename the files. Additionally, I am afraid my sequential numbering is not being reset for each folder in which files are renamed.
$i = 0
Get-ChildItem -Filter "*.jpg" -Recurse | ForEach-Object {
    Rename-Item $_ -NewName ('$i.jpg' -f $i++)
}
So, two questions:
How can I fix the error with Rename-Item?
How can I ensure my variable is reset for each subfolder the script starts renaming files in?
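On the first question: the format string '$i.jpg' is single-quoted, so $i is not expanded, and it contains no {0} placeholder, so -f has no effect; every file gets the same literal name $i.jpg, and the second rename collides with the first, which is the likely source of your error. A minimal fix for the name generation inside your existing loop would be:

Rename-Item $_ -NewName ('{0}.jpg' -f $i++)

The counter reset is addressed below.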
If you take a two-step approach, first getting all the folders containing jpg's and then iterating through that list, you have no problem beginning with 1 in each folder. But I'd always use leading zeroes for such a renumbering.
$BaseFld = "Q:\Test\"
$Ext = "*.jpg"
$jpgFolders = gci $($BaseFld+$Ext) -Recurse |
Select -ExpandProperty Directory -Unique |
select -ExpandProperty Fullname | Sort
ForEach ($Folder in $jpgFolders) {
Set-location $Folder
$i = 1
Get-ChildItem $Ext | %{Ren $_ -NewName ('{0:D4}.jpg' -f $i++) -whatif}
}
If the output suits you, remove the -WhatIf in the second-to-last line.
Another method:

$rootdir = "C:\temp"
gci $rootdir -Recurse -Directory | % {
    $i = 1
    gci $_.FullName -File -Filter "*.jpg" | % { Ren $_.FullName -NewName ('{0}.jpg' -f $i++) }
}
I need my program to give me every folder containing files which are over Windows' number-of-characters limit. That means if a file's path has more than 260 characters (248 for folders), I need it to write the address of the file's parent, and to write it only once. For now, I'm using this code:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    Select-Object -ExpandProperty FullName |
    Split-Path $_.FullName
But the Split-Path won't work (this is the first time I've used it). It tells me the -Path parameter has a null value (I can write -Path explicitly, but it doesn't change anything).
If you want an example of what I need: imagine folder3 has a 230-character address and file.txt has a 280-character address:
C:\users\folder1\folder2\folder3\file.txt
Would write:
C:\users\folder1\folder2\folder3
I'm using PS2, by the way.
Spoiler: the tool you are building may not be able to report paths over the limit since Get-ChildItem cannot access them. You can try nevertheless, and also find other solutions in the links at the bottom.
Issue in your code: $_ only works in specific contexts, for example a ForEach-Object loop.
But here, at the end of the pipeline, you're only left with a string containing the full path (not the complete file object any more), so directly passing it to Split-Path should work:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    Select-Object -ExpandProperty FullName |
    Split-Path
as "C:\Windows\System32\regedt32.exe" | Split-Path would output C:\Windows\System32
Sidenote: what do (Get-Item C:\Windows\System32\regedt32.exe).DirectoryName and (Get-Item C:\Windows\System32\regedt32.exe).Directory.FullName output on your computer? Both show the directory on my system.
Adapted code example:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    ForEach-Object { $_.Directory.FullName } |
    Select-Object -Unique
Additional information about MAX_PATH:
How do I find files with a path length greater than 260 characters in Windows?
Why does the 260 character path length limit exist in Windows?
http://www.powershellmagazine.com/2012/07/24/jaap-brassers-favorite-powershell-tips-and-tricks/
https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247%28v=vs.85%29.aspx
https://gallery.technet.microsoft.com/scriptcenter/Get-ChildItemV2-to-list-29291aae
You cannot use Get-ChildItem to list paths greater than the Windows character limit.
There are a couple of alternatives for you. Try an external library like AlphaFS, or you can use robocopy. Boe Prox has a script that utilizes robocopy; it is available on TechNet, but I am not sure whether it will work on PS v2. You can give it a try anyway.
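For reference, here is a rough sketch of the robocopy route (not Boe Prox's script; this just uses robocopy's list-only mode with NULL as a throwaway destination, which can enumerate paths that Get-ChildItem chokes on):

$maxLength = 248

# /L = list only (no copying), /E = recurse, /FP = print full paths,
# /NDL /NC /NS /NJH /NJS = suppress directory lines, class, size, header, summary.
robocopy $newPath NULL /L /E /FP /NDL /NC /NS /NJH /NJS |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_.Length -gt $maxLength } |
    # Plain string surgery instead of Split-Path, which can hit the
    # .NET path-length check on these very long strings.
    ForEach-Object { $_.Substring(0, $_.LastIndexOf('\')) } |
    Select-Object -Unique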
I've had a similar problem and resolved it like this:
$PathTooLong = @()

Get-ChildItem -LiteralPath $Path -Recurse -ErrorVariable +e -ErrorAction SilentlyContinue

$e | where {$_.Exception -like 'System.IO.PathTooLongException*'} | ForEach-Object {
    $PathTooLong += $_.TargetObject
    $Global:Error.Remove($_)
}

$PathTooLong
Get-ChildItem will throw an error on every path that is too long, or that the PowerShell engine can't handle. Those errors are saved in the ErrorVariable, called e in the example above.
When all errors are collected in $e, you can filter out the ones you need by checking the error's Exception for the string System.IO.PathTooLongException.
Hope it helps you out.
I have a file transfer/sync job that is copying files from the main network into a totally secure network using a custom protocol (i.e. no SMB). The problem is that, because I can't look back at the destination to see what files already exist there, it is filling up: the copy doesn't remove any files it hasn't touched (the way robocopy /MIR does).
Initially I wrote a script that:
1. Opens the log file and grabs the file paths out (this is quite quick and painless)
2. Does a Get-ChildItem on the destination folder (now using dir /s /b as it's way faster than gci)
3. Compares the two, and then removes the differences.
The problem is that there are more jobs that require this cleanup, but the log files are 100 MB and the folders contain 600,000 files, so it's taking ages and using tons of memory. I actually have yet to see one finish. I'd really like some ideas on how to make this faster (memory/CPU use doesn't bother me too much, but speed is essential).
$destinationMatch = "//server/fileshare/folder/"
The log file contains some headers and footers and then 600,000 lines like this one:
"//server/fileshare/folder/dummy/deep/tags/20140826/more_stuff/Deeper/2012-07-02_2_0.dat_v2" 33296B 0B completed
Here's the script:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
    [Parameter(Mandatory=$True)]
    [String]$logName,
    [Parameter(Mandatory=$True)]
    [String]$destinationMatch
)

$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select Name -first 1
$manifestFileName = [string]$manifestFile.name
$manifestFullPath = $logPath + "\" + $manifestFileName

$copiedList = @()
(gc $manifestFullPath -ReadCount 0) | where {$_.trim() -match $DestinationMatch} | % {
    if ( $_ -cmatch '(?<=")[^"]*(?=")' ){
        $copiedList += ($matches[0]).replace("/","\")
    }
}

$dest = $destinationMatch.replace("/","\")
$actualPathString = (gci -Path $dest -Recurse | select fullname).fullname

Compare-Object -ReferenceObject $copiedList -DifferenceObject $actualPathString -PassThru | % {
    $leaf = Split-Path $_ -leaf
    if ($leaf.contains(".")){
        $fsoData = gci -Path $_
        if (!($fsoData.PSIsContainer)){
            Remove-Item $_ -Force
        }
    }
}

$actualDirectory | where {$_.PSIsContainer -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue | where {!$_.PSIsContainer}).Length -eq 0} | remove-item -Recurse -Force
OK, so let's assume that your file copy preserves the last-modified timestamp. If you really need to pull a directory listing and compare it against a log, I think you're doing a decent job of it. The biggest slowdown is obviously going to be pulling your directory listing; I'll address that shortly. For now, I would propose the following modification of your code:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
    [Parameter(Mandatory=$True)]
    [String]$logName,
    [Parameter(Mandatory=$True)]
    [String]$destinationMatch
)

$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select -first 1

$RegExPattern = [regex]::Escape($DestinationMatch)
$FilteredManifest = gc $manifestFile.FullName | where {$_ -match "`"($RegExPattern[^`"]*)`""} | % {$matches[1] -replace '/','\'}

$dest = $destinationMatch.replace("/","\")
$DestFileList = gci -Path $dest -Recurse | select FullName, Attributes

# Remove files that are not in the manifest
$DestFileList | where {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -notmatch "Directory"} | % {Remove-Item $_.FullName -Force}

# Then remove directories that are not in the manifest and are now empty
$DestFileList | where {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -match "Directory" -and (gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue).Length -eq 0} | % {Remove-Item $_.FullName -Recurse -Force}
This stops you from duplicating effort. There's no need to get your manifest file and then assign different variables to different properties of the file object; just reference them directly. Later, when you pull your directory listing of the drive (the slow part here), keep the full name and attributes of the files/folders. That way you can easily filter against Attributes to see what is and isn't a directory, deal with the files first, and clean up the directories after the files are gone.
That should be a somewhat more streamlined version of your script. Now, about pulling that directory listing: using Get-ChildItem is going to be slower than some alternatives (such as dir /s /b), but it saves you from having to work out afterwards what's a file and what's a directory. If the files/folders you actually need to touch are a small percentage of the total, the double work may be worth it: pull the list with something like dir /s /b, parse it against the log, and only pull file/folder info for the specific items you need to address.
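For completeness, a minimal sketch of that dir /s /b route, assuming $dest holds the destination root as in the script above (each call returns a plain string array of full paths):

# /a-d = files only, /ad = directories only.
$DestFiles = cmd /c dir "$dest" /s /b /a-d
$DestDirs  = cmd /c dir "$dest" /s /b /ad

# These are ready to compare against $FilteredManifest, e.g.:
$orphans = $DestFiles | Where-Object { $FilteredManifest -notcontains $_ }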