How do I split thousands of pictures into multiple folders at once? - Windows

I have 33,000 photos in one folder and I want to divide them into multiple folders. The problem is I can't even access the folder: every time I open it, the computer starts overheating, the photos never load, and it freezes every time. Even if I did manage to load all 33,000 photos, it would probably take me all day to drag 100 photos at a time into a folder. There has to be an easier method; there has to be an application or piece of software that could do this automatically.

It's not the most efficient way, but you can use PowerShell to move them into generic numbered folders using the script below. Simply change the four variables at the top to suit and you should be good to go.
# Change these as preferred
$sourceFolder = "D:\source"
$destinationRoot = "D:\dest"
$extensions = @("*.png", "*.jpg", "*.jpeg")
$itemsPerFolder = 100
$folder = 0
while (@(Get-ChildItem $sourceFolder -Include $extensions -Recurse).Count -gt 0)
{
    $tDest = "$destinationRoot\$folder"
    mkdir $tDest
    Write-Host "Moving to folder $tDest"
    Get-ChildItem -File -Path $sourceFolder -Include $extensions -Recurse | Select-Object -First $itemsPerFolder | Move-Item -Destination $tDest
    $folder++
}
For a more efficient approach that only enumerates the source folder once, try:
# Change these as preferred
$sourceFolder = "D:\source"
$destinationRoot = "D:\dest"
$extensions = @("*.png", "*.jpg", "*.jpeg")
$itemsPerFolder = 100
$allItems = @(Get-ChildItem $sourceFolder -Include $extensions -Recurse)
for ($i = 0; $i -lt $allItems.Count; $i++)
{
    $folder = [math]::Floor($i / $itemsPerFolder)
    $tDest = "$destinationRoot\$folder"
    if (!(Test-Path $tDest))
    {
        mkdir $tDest
    }
    Move-Item $allItems[$i] -Destination $tDest
}
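Either way, you never have to open the folder in Explorer: save the script to a file (the name below is just an example) and run it from a PowerShell prompt.
# run from a PowerShell prompt; Split-Photos.ps1 is an example file name
powershell.exe -ExecutionPolicy Bypass -File .\Split-Photos.ps1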

Related

How can I move an array of files dynamically with Powershell and RoboCopy to individual subfolders of a fixed size?

I am creating a script that splits a target folder's files into subfolders of n files each, where n is a number specified dynamically.
So basically, if Folder A has 9000 files and I limit the number of files to 1000 per folder, the script would create nine sub-directories inside Folder A with 1000 files each.
Here is working code:
param (
    [Parameter(Mandatory,Position=0)]
    [String]
    $FileList,
    [Parameter(Mandatory=$false,ValueFromPipelineByPropertyName)]
    [Int32]
    $NumFilesPerFolder = 1000,
    [Parameter(Mandatory=$false,ValueFromPipelineByPropertyName)]
    [Int32]
    $FolderNumberPadding = 2
)
$Folders = Get-Content $FileList
Set-Location -LiteralPath ([IO.Path]::GetTempPath())
function Move-Files {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory,Position=0)]
        [System.Collections.ArrayList]
        $List,
        [Parameter(Mandatory)]
        [Int32]
        $Index
    )
    $BaseFolder = [System.IO.Path]::GetDirectoryName($List[0])
    $DestFolderName = $Index.ToString().PadLeft($FolderNumberPadding, '0')
    $DestFolder = New-Item -Path (Join-Path $BaseFolder $DestFolderName) -Type Directory -Force
    Move-Item $List -Destination $DestFolder -Force
}
foreach ($Folder in $Folders) {
    $Files = Get-ChildItem -LiteralPath $Folder -File -Force
    $filesidx = 1
    $totalidx = $null
    $groupidx = 0
    $FilesToMove = [System.Collections.ArrayList]@()
    foreach ($File in $Files) {
        if($null -eq $totalidx){
            $totalidx = $Files.Length
        }
        if($filesidx -eq 1){
            $groupidx++
        }
        $FilesToMove.Add($File)
        if($filesidx -eq $NumFilesPerFolder){
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            $filesidx = 1
        }elseif($totalidx -eq 1){
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            break
        }else{
            $filesidx++
        }
        $totalidx--
    }
}
Remove-Item $FileList -Force
$app = New-Object -ComObject Shell.Application
$appwin = $app.Windows()
foreach ($window in $appwin) {
    if($window.Name -eq "File Explorer"){
        $window.Refresh()
    }
}
Invoke-VBMessageBox "Operation Complete" -Title "Operation Complete" -Icon Information -BoxType OKOnly
This code runs reasonably well, but it heavily bottlenecks when actually moving the files with Move-Item. I'd like to try and use RoboCopy here, but I am perplexed as to how I can implement it.
What I'm having trouble with is that the items I need to move are stored in a list (see the Move-Files function); every item that needs to be moved is in the same sub-directory, so I can't just do RoboCopy.exe C:\Source C:\Destination /mov.
How can I integrate RoboCopy here to accomplish my goal? I really need multi-threaded performance as this function will be responsible for moving thousands of files around in production on a frequent basis.
Any help would be greatly appreciated - please let me know if I can provide more information to further clarify my objective.
Thanks for any help at all!
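One possible direction, sketched under the assumption that passing explicit file names to RoboCopy is acceptable: since every item in $List shares the same parent folder ($BaseFolder), you can hand RoboCopy that folder as the source, $DestFolder as the destination, and the file names as its file arguments, letting /MOV do the moving and /MT do the multi-threading. Inside Move-Files that could look roughly like:
# sketch: swap the Move-Item call for RoboCopy (untested against the full script)
$fileNames = $List | ForEach-Object { $_.Name }
robocopy $BaseFolder $DestFolder.FullName $fileNames /MOV /MT:16 /NJH /NJS | Out-Null
Note that RoboCopy exit codes below 8 indicate success, so treat $LASTEXITCODE accordingly, and very large batches can run into the command-line length limit.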

Sorting files into directories with PowerShell

I have the following problem and I would really appreciate some help with it. I am getting a constant flow of XML files into a folder. An XML file name can look like this (the numeric prefix only goes up to 1005):
1001.order-asdf1234.xml
1002.order-asdf4321.xml
I want to sort the files into uniquely named folders that are not based on the file names. An example would be:
C:\Directory Path...\Peter (All files starting with 1001 go in there)
C:\Directory Path...\John (All files starting with 1002 go there)
How can I create a batch or PowerShell script that continuously sorts files into the specified folders? Since I only have 5 folders, I would like to simply specify the target folder for each prefix rather than write elaborate loops, but I don't know how to do that.
The easiest way is to create a lookup Hashtable where you define which prefix ('1001' .. '1005') maps to which destination folder:
# create a Hashtable to map the digits to a foldername
$folderMap = @{
    '1001' = 'Peter'
    '1002' = 'John'
    '1003' = 'Lucretia'
    '1004' = 'Matilda'
    '1005' = 'Henry'
}
# set source and destination paths
$rootFolder = 'X:\Where\the\files\are'
$destination = 'Y:\Where\the\files\should\go'
# loop over the files in the root path
Get-ChildItem -Path $rootFolder -Filter '*.xml' -File |
    Where-Object { $_.BaseName -match '^\d{4}\.' } |
    ForEach-Object {
        $prefix = ($_.Name -split '\.')[0]
        $targetPath = Join-Path -Path $destination -ChildPath $folderMap[$prefix]
        $_ | Move-Item -Destination $targetPath -WhatIf
    }
Remove the -WhatIf safety switch once you are satisfied with the results shown on screen.
You could use a switch statement to decide on the target folder based on the first part of the file name:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml
switch($files)
{
    {$_.Name -like '1001*'} {
        $_ | Move-Item -Destination 'C:\path\to\Peter'
    }
    {$_.Name -like '1002*'} {
        $_ | Move-Item -Destination 'C:\path\to\John'
    }
    {$_.Name -like '1003*'} {
        # etc...
    }
    default {
        Write-Warning "No matching destination folder for file '$($_.Name)'"
    }
}
If you change your mind about loops, my preference would be to store the mapping in a hashtable and loop over the entries for each file:
$files = Get-ChildItem path\to\folder\with\xml\files -Filter *.xml
$targetFolders = @{
    '1001' = 'C:\path\to\Peter'
    '1002' = 'C:\path\to\John'
    '1003' = 'C:\path\to\Paul'
    '1004' = 'C:\path\to\George'
    '1005' = 'C:\path\to\Ringo'
}
foreach($file in $files){
    $prefix = @($targetFolders.Keys).Where({$file.Name -like "${_}*"}, 'First')[0]
    $file | Move-Item -Destination $targetFolders[$prefix]
}
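Neither snippet runs continuously on its own; a minimal way to keep sorting as new files arrive (a sketch, assuming a simple polling interval is good enough rather than a FileSystemWatcher) is to wrap the hashtable approach from the first answer in a loop:
# sketch: poll the source folder every 30 seconds, using $rootFolder, $destination and $folderMap from above
while ($true) {
    Get-ChildItem -Path $rootFolder -Filter '*.xml' -File |
        Where-Object { $folderMap.ContainsKey(($_.Name -split '\.')[0]) } |
        ForEach-Object {
            $_ | Move-Item -Destination (Join-Path $destination $folderMap[($_.Name -split '\.')[0]])
        }
    Start-Sleep -Seconds 30
}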

Accurate time measurements when copying data via PowerShell

Hi, I have been trying to find a way to estimate how long it will take to move databases from one location to another. My online research has helped me through a few issues so far, but I seem to be stuck: the script appears to use the correct commands to see all the files that need to be counted, yet on a 5 TB database it reports that the move will only take 22 milliseconds. So either I have a faster network and server than I ever knew, or I have screwed this up somehow in a way I cannot see.
$item = get-childitem 'D:\SQL01' -Recurse
$d = "E:\SQL01"
$results = @()
$results = Foreach ($i in $item) {
    Measure-Command -Expression {
        Copy-Item -literalpath $i $d
    }
}
($results | Measure-Object -Property TotalSeconds -Sum).Sum
$results -f "c"
Reading over this it seems fine, and it even returns a sum of the measured time, but there is no way that is accurate. Please leave a comment if anyone sees where I did something wrong or thinks there is something I could try differently.
Here is an example of Write-Progress in action:
# Get all directories on D:\SQL01 recursively
$directories = Get-ChildItem 'D:\SQL01' -Directory -Recurse
# Set a destination folder
$destination = 'E:\SQL01'
$dirCount = $directories.count
$i=0;foreach($directory in $directories)
{
    $progress = @{
        Activity = "Copying - {0}" -f $directory.FullName
        Status = "Folder $i of $dirCount"
        PercentComplete = $i++ / $dirCount * 100
    }
    Write-Progress @progress
    Copy-Item -Path $directory -Destination $destination -Recurse
}
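If the goal is a single accurate wall-clock figure for the whole copy rather than per-file measurements, one option (a sketch, reusing the loop above) is to wrap it in a Stopwatch:
# sketch: time the whole copy once instead of measuring each Copy-Item separately
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
# ... run the copy loop shown above here ...
$stopwatch.Stop()
Write-Host ("Total copy time: {0:c}" -f $stopwatch.Elapsed)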

Folder deleting after script ends

I am currently writing a script that takes a folder of files, moves the first file to a folder with a specific name, then moves the rest to another folder with a number for a name.
My script works; however, it also moves the folder itself and renames it. Which section of the code is causing this?
$path = "C:\Users\User1\Desktop\MergeTest\_First\"
$FileCount = Get-ChildItem -Path $path -File | Measure-Object | %{$_.Count}
$FirstFile = Get-ChildItem -Path $path -Force -File | Select-Object -First 1
$FinalReport = "C:\Users\User1\Desktop\MergeTest\___Final\TestOutput.xlsx"
Move-Item "C:\Users\User1\Desktop\MergeTest\_First\$FirstFile" $FinalReport
$Counter = 0;
Write-host $FileCount
for($Counter = 0; $Counter -lt $FileCount; $Counter++)
{
    $FileInWork = Get-ChildItem -Path $path -Force -File | Select-Object -First 1
    move-item "C:\Users\User1\Desktop\MergeTest\_First\$FileInWork" "C:\Users\User1\Desktop\MergeTest\__Second\$Counter.xlsx"
    Write-host "File Moved"
}
What you could do is add an -Include condition (e.g. -Include *.xlsx, or *.txt, *.log, whatever file type you're moving) to your Move-Item commands so only files of that type are moved and the folder itself is left as it is.
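For example (assuming .xlsx files and the paths from the question), note that -Include only has an effect when the -Path contains a wildcard:
# only the .xlsx files are moved; the folder itself is left alone
Move-Item -Path "C:\Users\User1\Desktop\MergeTest\_First\*" -Include *.xlsx -Destination "C:\Users\User1\Desktop\MergeTest\__Second"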
I believe your code could do with some cleaning up. Right now you are executing Get-ChildItem three times, where using it once is enough.
Also, you should use Join-Path rather than constructing the paths and filenames yourself.
Especially where you do "C:\Users\User1\Desktop\MergeTest\_First\$FileInWork", you should realize that Get-ChildItem returns FileInfo and/or DirectoryInfo objects, not strings.
Anyway, the below code should do what you want:
# define the path where all other paths are in
$rootPath = "C:\Users\User1\Desktop\MergeTest"
# create the working paths using the common root folder path
$filesPath = Join-Path -Path $rootPath -ChildPath '_First'
$firstDestination = Join-Path -Path $rootPath -ChildPath '___Final'
$secondDestination = Join-Path -Path $rootPath -ChildPath '__Second'
# test if the destination folders exist and if not create them
if (!(Test-Path -Path $firstDestination -PathType Container)) {
    Write-Host "Creating folder '$firstDestination'"
    $null = New-Item -Path $firstDestination -ItemType Directory
}
if (!(Test-Path -Path $secondDestination -PathType Container)) {
    Write-Host "Creating folder '$secondDestination'"
    $null = New-Item -Path $secondDestination -ItemType Directory
}
# get an array of all FileInfo objects in $filesPath
# you could consider adding -Filter '*.xlsx' here.
$allFiles = Get-ChildItem -Path $filesPath -Force -File
Write-Host ('Total number of files found: {0}' -f $allFiles.Count)
# move the files
for ($i = 0; $i -lt $allFiles.Count; $i++) {
    if ($i -eq 0) {
        # the first file should go in the $firstDestination folder with specified name
        $target = Join-Path -Path $firstDestination -ChildPath 'TestOutput.xlsx'
    }
    else {
        # all other files go to the $secondDestination folder
        # each file should have the index number as name
        $target = Join-Path -Path $secondDestination -ChildPath ('{0}.xlsx' -f ($i + 1))
    }
    $allFiles[$i] | Move-Item -Destination $target -Force -WhatIf
}
Hope that helps
Remove the -WhatIf switch once you are satisfied with what the console output shows.
P.S. I really think you should edit your question and change its title, because nothing in the question has to do with a folder being deleted after the script ends.

Copy first N files from source directory to "serialized" destination directory using PowerShell

Either a PowerShell or a batch script will work. I want to distribute every N files from directory A into directories B1, B2, B3, etc.
Example:
C:\a (has 9 .jpg files)
file1.jpg
file2.jpg
...
file9.jpg
Then C:\b1, C:\b2 and C:\b3 should have 3 files each. The script should create the C:\b* directories as well.
So far I came up with this code; it runs fine but copies ALL of the files from directory A to directory B:
$sourceFolder = "C:\a"
$destinationFolder = "C:\b"
$maxItems = 9
Get-Childitem $sourceFolder\*.jpg | ForEach-Object {Select-Object -First $maxItems | Robocopy $sourceFolder $destinationFolder /E /MOV}
This also works, and it will calculate how many new folders should be created:
$excludealreadycopieditems = @()
$sourcefolder = "C:\a"
$destinationFolder = "C:\b"
$maxitemsinfolder = 3
# Calculate how many folders should be created:
$folderstocreate = [math]::Ceiling((get-childitem $sourcefolder\*.jpg).count / $maxitemsinfolder)
# For loop for the process
for ($i = 1; $i -lt $folderstocreate + 1; $i++)
{
    # Create the new folders:
    New-Item -ItemType directory $destinationFolder$i
    # Copy the items (if moving instead of copying, use Move-Item)
    get-childitem $sourcefolder\*.jpg -Exclude $excludealreadycopieditems | sort-object name | select -First $maxitemsinfolder | Copy-Item -Destination $destinationFolder$i
    # Exclude the already copied items:
    $excludealreadycopieditems = $excludealreadycopieditems + (get-childitem $destinationFolder$i\*.jpg | select -ExpandProperty name)
}
Something like this should do:
$cnt = 0
$i = 1
Get-ChildItem "$sourceFolder\*.jpg" | % {
if ($script:cnt -ge $maxItems) {
$script:i++
$script:cnt = 0
}
$dst = "$destinationFolder$script:i"
if (-not (Test-Path -LiteralPath $dst)) {
New-Item $dst -Type Directory | Out-Null
}
Copy-Item $_.FullName $dst
$script:cnt++
}
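Note that this snippet assumes $sourceFolder, $destinationFolder and $maxItems are defined beforehand (for example as below), and it uses the $script: scope modifier so the counters keep their values across ForEach-Object iterations.
# example values; adjust the paths and batch size to suit
$sourceFolder = 'C:\a'
$destinationFolder = 'C:\b'
$maxItems = 3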
