Hi, I have been trying to find a way to estimate how long it will take to move databases from one location to another. My online research has helped me through a few issues so far, but now I'm stuck: the script below seems to use the correct commands to see all the files that need to be counted, yet on a 5 TB database it reports that the move will take only 22 milliseconds. So either I have a faster network and server than I ever knew, or I have screwed this up somehow in a way I cannot see.
$item = Get-ChildItem 'D:\SQL01' -Recurse
$d = "E:\SQL01"
$results = @()
$results = foreach ($i in $item) {
    Measure-Command -Expression {
        Copy-Item -LiteralPath $i $d
    }
}
($results | Measure-Object -Property TotalSeconds -Sum).Sum
$results -f "c"
Reading over this, it seems fine, and it even returns a sum of the elapsed time, but there is no way that is accurate. Please leave a comment if you see where I did something wrong or think there is something I could try differently.
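Not an answer so much as a sanity check (the diagnosis is an assumption): if each Copy-Item inside the loop fails with a non-terminating error, for instance because $i stringifies to a bare file name instead of a full path, Measure-Command still returns a TimeSpan of a few milliseconds, and the sum ends up meaningless. A minimal sketch that times the whole tree in one call and surfaces errors:
# A sketch: time the entire copy in a single call; -ErrorAction Stop
# makes failures visible instead of silently producing tiny timings.
$source = 'D:\SQL01'
$destination = 'E:\SQL01'

$elapsed = Measure-Command {
    Copy-Item -Path $source -Destination $destination -Recurse -Force -ErrorAction Stop
}

"Copy took {0:N1} seconds" -f $elapsed.TotalSeconds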
Here is an example of Write-Progress in action:
# Get all directories on D:\SQL01 recursively
$directories = Get-ChildItem 'D:\SQL01' -Directory -Recurse

# Set a destination folder
$destination = 'E:\SQL01'

$dirCount = $directories.Count
$i = 0
foreach ($directory in $directories)
{
    $progress = @{
        Activity        = "Copying - {0}" -f $directory.FullName
        Status          = "Folder $i of $dirCount"
        PercentComplete = $i++ / $dirCount * 100
    }
    Write-Progress @progress
    Copy-Item -Path $directory.FullName -Destination $destination -Recurse
}
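If the goal is an up-front estimate rather than live progress, one rough approach (a sketch; the sampling choices are my assumptions) is to time a small sample copy, derive throughput, and extrapolate to the total size. The first ten files may not be representative, database files especially vary wildly in size, so treat the number as a ballpark:
# Rough estimate: copy a small sample, compute throughput, extrapolate.
$source = 'D:\SQL01'
$destination = 'E:\SQL01'   # assumed to already exist
$sampleSize = 10            # files to sample; larger samples estimate better

$files = Get-ChildItem $source -Recurse -File
$sample = $files | Select-Object -First $sampleSize
$sampleBytes = ($sample | Measure-Object -Property Length -Sum).Sum

$elapsed = Measure-Command {
    $sample | ForEach-Object { Copy-Item -LiteralPath $_.FullName -Destination $destination }
}

$totalBytes = ($files | Measure-Object -Property Length -Sum).Sum
$bytesPerSecond = $sampleBytes / $elapsed.TotalSeconds
$estimate = [TimeSpan]::FromSeconds($totalBytes / $bytesPerSecond)
"Estimated total copy time: {0}" -f $estimate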
I have 33,000 photos in one folder and I want to divide them into multiple folders. The problem is I can't even access the folder; every time I open it, the computer starts overheating. It doesn't even load the photos; it freezes the computer every time. Even if I did manage to load all 33,000 photos, it would probably take me all day to drag 100 photos at a time into folders. There has to be an easier method; there has to be an application or piece of software that could do this automatically.
It's not the most efficient way, but you can use PowerShell to move them into generic numbered folders using the script below. Simply change the four variables to suit and you should be good to go.
# Change these as preferred
$sourceFolder = "D:\source"
$destinationRoot = "D:\dest"
$extensions = @("*.png", "*.jpg", "*.jpeg")
$itemsPerFolder = 100

$folder = 0
while (@(Get-ChildItem $sourceFolder -Include $extensions -Recurse).Count -gt 0)
{
    $tDest = "$destinationRoot\$folder"
    mkdir $tDest
    Write-Host "Moving to folder $tDest"
    Get-ChildItem -File -Path $sourceFolder -Include $extensions -Recurse |
        Select-Object -First $itemsPerFolder |
        Move-Item -Destination $tDest
    $folder++
}
For a more efficient way, try
# Change these as preferred
$sourceFolder = "D:\source"
$destinationRoot = "D:\dest"
$extensions = @("*.png", "*.jpg", "*.jpeg")
$itemsPerFolder = 100

$allItems = @(Get-ChildItem $sourceFolder -Include $extensions -Recurse)
for ($i = 0; $i -lt $allItems.Count; $i++)
{
    $folder = [math]::Floor($i / $itemsPerFolder)
    $tDest = "$destinationRoot\$folder"
    if (!(Test-Path $tDest))
    {
        mkdir $tDest
    }
    Move-Item $allItems[$i] -Destination $tDest
}
I am creating a script that splits a target folder's files into subfolders of n files each, where n is a number specified dynamically.
So basically, if Folder A has 9000 files, and I limit the number of files to 1000 per folder, the script would create nine sub-directories inside of Folder A with 1000 files each.
Here is working code:
param (
    [Parameter(Mandatory, Position = 0)]
    [String]
    $FileList,

    [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
    [Int32]
    $NumFilesPerFolder = 1000,

    [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
    [Int32]
    $FolderNumberPadding = 2
)

$Folders = Get-Content $FileList
Set-Location -LiteralPath ([IO.Path]::GetTempPath())

function Move-Files {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0)]
        [System.Collections.ArrayList]
        $List,

        [Parameter(Mandatory)]
        [Int32]
        $Index
    )

    $BaseFolder = [System.IO.Path]::GetDirectoryName($List[0])
    $DestFolderName = $Index.ToString().PadLeft($FolderNumberPadding, '0')
    $DestFolder = New-Item -Path (Join-Path $BaseFolder $DestFolderName) -Type Directory -Force
    Move-Item $List -Destination $DestFolder -Force
}

foreach ($Folder in $Folders) {
    $Files = Get-ChildItem -LiteralPath $Folder -File -Force
    $filesidx = 1
    $totalidx = $null
    $groupidx = 0
    $FilesToMove = [System.Collections.ArrayList]@()

    foreach ($File in $Files) {
        if ($null -eq $totalidx) {
            $totalidx = $Files.Length
        }
        if ($filesidx -eq 1) {
            $groupidx++
        }
        # ArrayList.Add() returns the new index; discard it so it
        # doesn't leak into the script's output
        $null = $FilesToMove.Add($File)
        if ($filesidx -eq $NumFilesPerFolder) {
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            $filesidx = 1
        }
        elseif ($totalidx -eq 1) {
            Move-Files -List $FilesToMove -Index $groupidx
            $FilesToMove.Clear()
            break
        }
        else {
            $filesidx++
        }
        $totalidx--
    }
}

Remove-Item $FileList -Force

$app = New-Object -ComObject Shell.Application
$appwin = $app.Windows()
foreach ($window in $appwin) {
    if ($window.Name -eq "File Explorer") {
        $window.Refresh()
    }
}

Invoke-VBMessageBox "Operation Complete" -Title "Operation Complete" -Icon Information -BoxType OKOnly
This code runs reasonably well, but it heavily bottlenecks when actually moving the files with Move-Item. I'd like to try and use RoboCopy here, but I am perplexed as to how I can implement it.
What I'm having trouble with is that the items I need to move are stored in a list (see the Move-Files function), and all the items that need to be moved live in the same sub-directory. So I can't just do RoboCopy.exe C:\Source C:\Destination /mov.
How can I integrate RoboCopy here to accomplish my goal? I really need multi-threaded performance as this function will be responsible for moving thousands of files around in production on a frequent basis.
Any help would be greatly appreciated - please let me know if I can provide more information to further clarify my objective.
Thanks for any help at all!
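In case it helps, here is a minimal, untested sketch of how the Move-Files function above might shell out to RoboCopy. RoboCopy accepts individual file names after the source and destination directories, which fits this case because every file in a batch shares one parent folder; /MOV moves the files and /MT enables multi-threaded copying (the thread count of 16 is an arbitrary assumption to tune):
# Sketch only: hand one batch of same-directory files to robocopy.
function Move-FilesWithRobocopy {
    param (
        [Parameter(Mandatory)]
        [System.Collections.ArrayList]
        $List,

        [Parameter(Mandatory)]
        [Int32]
        $Index
    )

    # Uses the script-scoped $FolderNumberPadding, as Move-Files does
    $BaseFolder = [System.IO.Path]::GetDirectoryName($List[0].FullName)
    $DestFolderName = $Index.ToString().PadLeft($FolderNumberPadding, '0')
    $DestFolder = Join-Path $BaseFolder $DestFolderName

    # robocopy <source> <dest> [file [file]...] - bare file names are
    # listed after the two directories; /NJH /NJS trim the job banner.
    $fileNames = $List | ForEach-Object { $_.Name }
    robocopy $BaseFolder $DestFolder $fileNames /MOV /MT:16 /NJH /NJS | Out-Null
}
Note that robocopy exit codes below 8 indicate success, so if you check $LASTEXITCODE afterwards, treat values of 8 or higher as errors.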
We're migrating our FTP server and I would like to migrate only the folders that have had files in them used/written in the last 6 months. I would think this would be something I'd find all over the place with Google, but all the scripts I've found have the same fatal flaw.
It seems with everything I find, it depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into it, there are files that are being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, idk, called "Log6" let's say, and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with the datemodified -lt adddays(-180) filter. If stuff that's "New" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -path $path -Directory where-object {LastWriteTime -lt (get-date).addmonths(-6))}
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow along with the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)

# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force

# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder.FullName -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName plus the file and its dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}

$result | Out-GridView
If you need to find the folders that don't have files recently modified you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
Hi all, I'm stuck. I have a PowerShell script which looks to a specific folder for files which are older than 30 days from the last modified date (additionally, it'll create the folder if it doesn't exist). It creates the folder, it gives me the total files, and it'll list all of the files in a test query, but it won't actually count the number of 30+ day old files. I've tried several methods to get this count (some derived from other solutions on this site for deleting old files), but PowerShell just doesn't want to do it.
Here's my code so far...
$HomePath = $env:USERPROFILE
$CompanyFolder = "\Company"
$TimeSensativeFolder = "\TimeSensative"
$TimeSensativePath = $HomePath+$CompanyFolder+$TimeSensativeFolder
$OldFilesAmount = 0
$TotalFilesAmount = 0
$TimeLimit = (Get-Date).AddDays(-30)
$StatusOK = "No old files were found in the time sensative folder."
$StatusCreated = "The time sensative folder was created."
$StatusError1 = "There were old files found in the time sensative folder!"
$StatusError2 = "Unable to create the time sensative folder!"
function MakeTimeSensativeFolder ($TimeSensativePath) {
    try {
        md $TimeSensativePath -Force -ErrorAction Stop
        Write-Host $StatusCreated
    }
    catch {
        Write-Host $StatusError2
    }
}

function CountOldFiles () {
    $OldFilesAmount = $OldFilesAmount + 1
}

if(!(Test-Path $TimeSensativePath -PathType Container)) {
    MakePHIFolder $TimeSensativePath
}
else {
}

try {
    $TotalFilesAmount = (Get-ChildItem $PHIPath -Recurse -File | Measure-Object).Count

    # I've tried this...
    Get-Item $PHIPath | Foreach {$_.LastWriteTime} -ErrorAction Stop
    if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $TimeLimit}) {
        CountOldFiles
    }

    # And I've tried this...
    Get-ChildItem -Path $PHIPath -Recurse -File | Foreach-Object {
        if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $TimeLimit}) {
            CountOldFiles
        }
    }

    # I've even tried this...
    Get-ChildItem $PHIPath -Recurse -File | ? {
        -not $_.PSIsContainer -and $_.LastWriteTime -lt $TimeLimit
    } | CountOldFiles

    # And this, as well...
    Get-ChildItem -Path $PHIPath -Recurse -File | Where-Object {$_.LastWriteTime -gt $TimeLimit} | CountOldFiles
}
catch {
    MakeTimeSensativeFolder $TimeSensativePath
}
# Used for testing.
<#
Get-ChildItem $TimeSensativePath -Recurse -File
Write-Host "TimeSensative folder exists:" $TimeSensativePathExists
Write-Host "Home TimeSensative path:" $TimeSensativePath
Write-Host "Old files found:" $OldFilesAmount
Write-Host "Total files found:" $TotalFilesAmount
Exit
#>
# Determining proper grammar for status message based on old file count.
if ($OldFilesAmount -eq 1) {
    $StatusError1 = "There was "+$OldFilesAmount+" old file of "+$TotalFilesAmount+" total found in the PHI folder!"
}
if ($OldFilesAmount -gt 1) {
    $StatusError1 = "There were "+$OldFilesAmount+" old files of "+$TotalFilesAmount+" total found in the PHI folder!"
}

# Give statuses.
if ($OldFilesAmount -gt 0) {
    Write-Host $StatusError1
}
else {
    Write-Host $StatusOK
}
Depending on which I tried, I would get no result or I'd get something like this:
Get-Content : Cannot find drive. A drive with the name '12/22/2016 17' does not exist.
At C:\Users\nobody\Scripts\PS1\ts_file_age.ps1:54 char:14
+ if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $Tim ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (12/22/2016 17:String) [Get-Content], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.GetContentCommand
In every instance, there's no old-file count like the one I'm trying to get.
It's been a bit of a head scratcher. Any advice?
Thanks so much in advance!
Filtering files with last write time is easy enough. Like so,
$allFiles = gci
$d = (Get-Date).AddDays(-30)
$newFiles = @()
$oldFiles = @()
$allFiles | % { if ($_.LastWriteTime -ge $d) { $newFiles += $_ } else { $oldFiles += $_ } }
What's done here is that, first, all the files are gathered into a collection. This isn't mandatory, but it lets you browse the collection and check that it's been populated properly, which is useful when you have complex paths or exclusion filters.
The second step just gets a DateTime that is used later to divide the files into old and new ones, just like the sample did, so nothing interesting here. Actually, there's one little thing: the date is -30 days, but the hours, minutes and seconds are based on the current time, so if you need a really tight limit, consider using midnight: ([datetime]::Today).AddDays(-30).
The third step is to declare two empty collections for new and old files.
The last step iterates through $allFiles and checks the last write time. If it's greater than or equal to the cut-off, the file is added to $newFiles, otherwise to $oldFiles.
After the last step, further processing should be simple enough.
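For example, a minimal sketch of that further processing (the archive path is a hypothetical stand-in):
# Hypothetical follow-up: report the split, then move old files aside.
$archive = 'D:\archive'   # assumed destination, adjust to taste
"{0} new, {1} old" -f $newFiles.Count, $oldFiles.Count

if (!(Test-Path $archive)) { mkdir $archive | Out-Null }
$oldFiles | Move-Item -Destination $archive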
This is what I do to get (delete in this case) files older than X days:
$Days = 5
$limit = (Get-Date).AddDays(-$Days)
$CurrentDate = Get-Date
#This will delete all files older than 5 days
Get-ChildItem -Path $Workdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $limit } | Remove-Item -Force
I would like to copy files between folders - just the ones modified today or one day before (CSV files with new entries).
Here is my code:
foreach ($file in (Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2")) {
    if ($file.LastWriteTime = (Get-Date).AddDays(-1)) {
        Copy-Item -Path "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv" -Destination "\\Oracle\MP"
        "copying $file"
    } else {
        "not copying $file"
    }
}
What is wrong - any suggestions?
You need to compare the date with -gt; otherwise you're looking for files that were copied at that EXACT time.
Note that doing (Get-Date).AddDays(-1) is perfectly valid but will give you anything modified in the last 24 hours.
$DestinationFolder = "\\Oracle\MP\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)
$Files = Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv"

foreach ($File in $Files) {
    if ($File.LastWriteTime -gt $EarliestModifiedTime)
    {
        Copy-Item $File -Destination $DestinationFolder
        Write-Host "Copying $File"
    }
    else
    {
        Write-Host "Not copying $File"
    }
}
If you didn't want to write out the "Copying ..." and "Not Copying ..." then you could simplify this quite a bit.
$DestinationFolder = "\\Oracle\MP\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)
Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv" -File | ? { $_.LastWriteTime -gt $EarliestModifiedTime } | Copy-Item -Destination $DestinationFolder
Finally, if you want to copy anything since the beginning of (eg midnight at the start of) yesterday then change the following line:
$EarliestModifiedTime = (Get-date).AddDays(-1).Date
@Mr Tree, I have one more related question. A few times per day I get a new file at D:\Shares\WinCAP Data\DAYPROT\OFS-HT (location 1) with the fixed name abcDD.MM.YYYY.csv (e.g. abc03.09.2015.csv), and I have a service which calls my PowerShell script below every 10 minutes. I set it up as you suggested in the posts above. My goal is: 1. check whether there is a new file named abcDD.MM.YYYY.csv; 2. rename it to abcDD.MM.YYYYHT.csv and move it to the \\Oracle\MP\PURO\ (location 2) folder, where it needs to overwrite the existing file for the current day.
The problem is that if the file already exists at location 2, the script does not move and overwrite it. Thanks for hints.
$DestinationFolder = "\\Oracle\MP\PURO\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)

Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-HT\*.csv" | ?{ !($_.FullName -match "HT\.csv") } | Rename-Item -NewName { $_.Name -replace "\.csv", "HT.csv" }

$Files = Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-HT\*.csv" -File
foreach ($File in $Files) {
    if ($File.LastWriteTime -gt $EarliestModifiedTime)
    {
        Move-Item $File -Destination $DestinationFolder
        Write-Host "Moving $File"
    }
    else
    {
        Write-Host "Not moving $File"
    }
}
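If the only failure is the overwrite, note that Move-Item refuses to replace an existing file unless you add -Force; assuming that is what's happening here, the fix is one flag:
# -Force lets Move-Item overwrite an existing file at the destination
Move-Item $File -Destination $DestinationFolder -Force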