I am trying to loop through all files in a folder, no matter the type, and replace a string with one that is input by the user.
I can do this now with the code below, but only for one file extension.
This is my code:
$NewString = Read-Host -Prompt 'Input New Name Please'
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
$InputFiles = Get-Item "$scriptPath\*.md"
$OldString = 'SolutionName'
$InputFiles | ForEach {
(Get-Content -Path $_.FullName).Replace($OldString,$NewString) | Set-Content -Path $_.FullName
}
echo 'Complete'
How do I loop through the files regardless of extension,
so that whether a file is .md, .txt, .cshtml or something else, the string is replaced as instructed?
To get all the files in a folder you can use Get-ChildItem. Add the -Recurse switch to also include files inside sub-folders.
E.g. you could rewrite your script like this:
$path = 'c:\tmp\test'
$NewString = Read-Host -Prompt 'Input New Name Please'
$OldString = 'SolutionName'
Get-ChildItem -Path $path |
    Where-Object { !$_.PsIsContainer } |
    ForEach-Object {
        (Get-Content $_.FullName).Replace($OldString, $NewString) | Set-Content -Path $_.FullName
    }
This will first get all the files inside the folder defined in $path, then replace the value given in $OldString with what the user entered when prompted, and finally save the files. Using $_.FullName with Get-Content avoids problems when the script's working directory differs from $path.
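If you also want the files inside sub-folders, as mentioned above, the same pipeline works with -Recurse. A minimal sketch (using -File, which is equivalent to filtering out containers and needs PowerShell 3.0 or later):

Get-ChildItem -Path $path -Recurse -File |
    ForEach-Object {
        (Get-Content $_.FullName).Replace($OldString, $NewString) | Set-Content -Path $_.FullName
    }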
Note: the script doesn't distinguish between files whose content changed and files it left untouched, so every file's modified date gets updated. If this information is important to you, you need to add a check that a file actually contains $OldString before changing and saving it.
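A minimal sketch of that check, assuming $path, $OldString and $NewString are defined as above (-Raw reads the whole file as a single string):

Get-ChildItem -Path $path -File | ForEach-Object {
    $content = Get-Content -Path $_.FullName -Raw
    # Only rewrite the file if it actually contains the old string,
    # so untouched files keep their original modified date
    if ($content -like "*$OldString*") {
        $content.Replace($OldString, $NewString) | Set-Content -Path $_.FullName
    }
}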
We're migrating our FTP and I would like to migrate only the folders that have had files in them used or written in the last 6 months. I would think this would be something I'd find all over the place with Google, but all the scripts I've found share the same fatal flaw.
It seems everything I find depends on the "Date modified" of the folder. The problem with that is that I have plenty of folders showing a "Date modified" of years ago, yet when you dig into them, there are files being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with a DateModified -lt (Get-Date).AddDays(-180) filter. If anything "new" is found, don't add the overarching directory to the array; otherwise, list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow the thought process.
The use of -Force is mainly to include hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force

# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder.FullName -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, the file and the file's dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have files recently modified you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
I was downloading a huge torrent (1.2 TB with over 6000 folders) divided into 2 parts, so I placed the 2nd part in the designated place, and that was not a problem since the torrent's master folder was named exactly what was needed. The 1st part's master folder, however, had some generic torrent name instead of the name I needed. Instead of renaming the torrent to "source" (which I think would have worked and renamed the generic folder to "source"), I selected all the files in the Files tab, right-clicked > Relocate, and BitTorrent simply moved all of the files to the same directory, without any subfolders, and created a mess.
So I have an unfinished backup of this torrent whose files are in place, and my idea was to use the unfinished copy's names, match them with the finished ones, and put each finished file into the folder path of its unfinished counterpart. I hope that was clear.
I tried to resolve this using PowerShell, but I don't know much, so I came up with this and nothing happens; something is wrong. Does anyone know a solution?
$itemlistA = Get-ChildItem -Path "D:\BitTorrent\" |
    ForEach-Object {
        $objnameA = $_.Name
        $objPathA = $_.FullName
    }

$itemlistB = Get-ChildItem -Path "E:\DesiredPath\" -Recurse |
    ForEach-Object {
        $objnameB = $_.Name
        $objPathB = $_.FullName
    }

ForEach-Object {
    if($objnameA -eq $objnameB){
        Copy-Item -path $objPathA -Destination $objPathB
        Write-Host "ffff Object ($objnameA) new Path ($objPathB) ffff"
    }
}
If I'm understanding your intent correctly, the script below will accomplish your goal, assuming your goal is to copy files from a flattened directory into some (potentially) nested directories so that the incoming files overwrite files with matching names.
The O(n^2) performance of the nested loops could be improved with a sort and a more efficient search; a hashtable-based sketch follows the script below.
You'd need to edit the script's params to reflect your own environment.
param(
    $pathToFiles = "$PSScriptRoot\BitTorrent\",
    $desiredPath = "$PSScriptRoot\DesiredPath\"
)
$itemlistA = Get-ChildItem -Path $pathToFiles | Select-Object -Property Name, FullName
$itemlistB = Get-ChildItem -Path $desiredPath -Recurse | Select-Object -Property Name, FullName
foreach ($fileA in $itemlistA) {
    foreach ($fileB in $itemListB) {
        if ($fileB.Name -eq $fileA.Name) {
            Copy-Item -path $fileA.FullName -Destination $fileB.FullName -Verbose
            break
        }
    }
}
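As for the more efficient search mentioned above, here is a minimal sketch using a hashtable keyed by file name (an assumption on my part: if several destination files share a name, only the first match is kept):

# Index the destination files by name once, so each source file
# is matched with a constant-time lookup instead of a full scan.
$index = @{}
foreach ($fileB in $itemlistB) {
    if (-not $index.ContainsKey($fileB.Name)) {
        $index[$fileB.Name] = $fileB.FullName
    }
}
foreach ($fileA in $itemlistA) {
    if ($index.ContainsKey($fileA.Name)) {
        Copy-Item -Path $fileA.FullName -Destination $index[$fileA.Name] -Verbose
    }
}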
I have a main folder in which other folders with .mkv files reside. These folders follow a certain format.
Folder = NameSerie.SxEy.Randomstuf
Item in folder = NameSerie.SxEy.Randomstuf.mkv
Where x is the season number and y is the episode number.
What I want to do is automatically create a folder if the NameSerie folder doesn't already exist and put the .mkv files in it.
So, for each folder named NameSerie.SxEy.Randomstuf, we check if folder NameSerie exists and create it if not. Then we enter the folder NameSerie.SxEy.Randomstuf and copy the NameSerie.SxEy.Randomstuf.mkv file into the NameSerie folder.
The file name needs to change from NameSerie.SxEy.Ra.n[dom}stuf.mkv to NameSerie.SxEy.mkv, but I can't seem to figure out how to remove the random stuff after NameSerie.SxEy.< >.mkv.
This is the code that I have managed to create, but I'm still stuck. I have managed to create a folder if one does not exist, but this only works if the .mkv file is not in a folder.
$Location = "\\<ip>\Share\Media\Series"

#rename files
Get-ChildItem $Location | Rename-Item -NewName { $_.Name.Replace(" [480p]","") }

#make folder for serie if it does not exist
Get-ChildItem "$Location\*.mkv" |
    Foreach-Object {
        $FullName = $_.Name
        $pos = $FullName.IndexOf(" - ")
        $Name = $FullName.Substring(0, $pos)
        Write-Host $_.FullName
        $TARGETDIR = "$Location\$Name"
        if( -Not (Test-Path -Path $TARGETDIR ) )
        {
            New-Item -ItemType directory -Path $TARGETDIR
        }
        Move-Item -Path $_.FullName -Destination $TARGETDIR
    }
You could use the parameter -Recurse when you list files from the path like this:
Get-ChildItem "$Location\*.mkv" -Recurse
instead of:
Get-ChildItem "$Location\*.mkv"
With the -Recurse parameter, the script will list the files in the given directory, and for each directory inside it, do the same recursively. So the .mkv files inside other folders won't be missed any longer.
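The question also asks how to trim the random part from the file names; that isn't covered above, but here is a minimal, hypothetical sketch, assuming every name contains a season/episode token of the form SxEy with digits (e.g. S01E05):

# Rename NameSerie.SxEy.<randomstuf>.mkv to NameSerie.SxEy.mkv.
# The regex captures everything up to and including the SxEy token
# and drops whatever follows before the .mkv extension.
Get-ChildItem "$Location\*.mkv" -Recurse |
    Rename-Item -NewName { $_.Name -replace '^(.+?\.S\d+E\d+).*\.mkv$', '$1.mkv' }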
I currently have a CSV which contains one column that lists many file FullNames (i.e. "\\server\sub\folder\file.ext").
I am attempting to import this CSV, move each file to a separate location, and append a GUID to the beginning of the filename in the new location (i.e. GUID_File.ext). I've been able to move the files and generate the GUID_, but I haven't been able to store and reuse the existing filename.ext; it just gets cut off, and the file ends up being just GUID_. I am simply not sure how to store the existing filename for reuse.
$Doc = Import-CSV C:\Temp\scripttest.csv
ForEach ($line in $Doc)
{
    $FileBase = $Line.basename
    $FileExt = $Line.extension
    Copy-Item -path $line.File -Destination "\\Server\Folder\$((new-guid).guid.replace('-',''))_$($Filebase)$($FileExt)"
}
If possible, I'm also going to need to store all the new GUID_File.ext names back into a CSV and write any errors to another file.
I currently have a CSV which contains one column that lists many file FullNames (i.e. "\\server\sub\folder\file.ext").
This isn't a CSV. It's just a plaintext file with a list.
Here's how you can accomplish your goal, however:
foreach ($path in (Get-Content -Path C:\Temp\scripttest.csv))
{
    $file = [System.IO.FileInfo]$path
    $prefix = (New-Guid).Guid -replace '-'
    Copy-Item -Path $file.FullName -Destination "\\Server\Folder\${prefix}_$($file.Name)"
}
This will take your list, convert each item into a FileInfo type it can work with, and do the rest of your logic. Note $file.Name is used in the destination so only the file name, not the whole original path, is appended after the GUID.
Based on:
$FileBase = $line.basename
$FileExt = $line.extension
it sounds like you mistakenly think that the $line instances representing the objects returned from Import-Csv C:\Temp\scripttest.csv are [System.IO.FileInfo] instances, but they're not:
What Import-Csv outputs are [pscustomobject] instances whose properties reflect the column values of the input CSV, and the values of these properties are invariably strings.
You must therefore use $line.<column1Name> to refer to the column containing the full filenames, where <column1Name> is the name defined for the column of interest in the header line (the 1st line) of the input CSV file.
If the CSV file has no header line, you can specify the column names by passing an array of column names to Import-Csv's -Header parameter, e.g.,
Import-Csv -Header Path, OtherCol1, OtherCol2, ... C:\Temp\scripttest.csv
I'll assume that the column of interest is named Path in the following solution:
$Doc = Import-Csv C:\Temp\scripttest.csv
ForEach ($rowObject in $Doc)
{
    $fileName = Split-Path -Leaf $rowObject.Path
    Copy-Item -Path $rowObject.Path `
              -Destination "\\Server\Folder\$((new-guid).guid.replace('-',''))_$fileName"
}
Note how Split-Path -Leaf is used to extract the filename, including extension, from the full input path.
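For instance, a quick console check:

Split-Path -Leaf '\\server\sub\folder\file.ext'   # returns 'file.ext'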
If I read your question carefully, you want to:
copy the files listed in the CSV file in the 'File' column.
the new files should have a GUID prepended to the filename
you need a new CSV file where the new filenames are stored for later reference
you want to track any errors and write those to a (log) file
Assuming you have an input CSV file looking something like this:
File,Author,MoreStuff
\\server\sub\folder\file.ext,Someone,Blah
\\server\sub\folder\file2.ext,Someone Else,Blah2
\\server\sub\folder\file3.ext,Same Someone,Blah3
Then the script below hopefully does what you want.
It creates new filenames by prepending a GUID and copies the files listed in the File column of the CSV to the destination path.
It outputs a new CSV file in the destination folder like this:
OriginalFile,NewFile
\\server\sub\folder\file.ext,\\anotherserver\sub\folder\38f7bec9e4c0443081b385277a9d253d_file.ext
\\server\sub\folder\file2.ext,\\anotherserver\sub\folder\d19546f7a3284ccb995e5ea27db2c034_file2.ext
\\server\sub\folder\file3.ext,\\anotherserver\sub\folder\edd6d35006ac46e294aaa25526ec5033_file3.ext
Any errors are listed in a log file (also in the destination folder).
$Destination = '\\Server\Folder'
$ResultsFile = Join-Path $Destination 'Copy_Results.csv'
$Logfile     = Join-Path $Destination 'Copy_Errors.log'
$Doc         = Import-Csv C:\Temp\scripttest.csv

# create an array to store the copy results in
$result = @()

# loop through the csv data using only the column called 'File'
ForEach ($fileName in $Doc.File) {
    # check if the given file exists; if not then write to the errors log file
    if (Test-Path -Path $fileName -PathType Leaf) {
        $oldBaseName = Split-Path -Path $fileName -Leaf
        # or do $oldBaseName = [System.IO.Path]::GetFileName($fileName)
        $newBaseName = "{0}_{1}" -f (New-Guid).ToString("N"), $oldBaseName
        # (New-Guid).ToString("N") returns the Guid without hyphens, same as (New-Guid).Guid.Replace('-','')
        $destinationFile = Join-Path $Destination $newBaseName
        try {
            Copy-Item -Path $fileName -Destination $destinationFile -Force -ErrorAction Stop
            # add an object to the results array to store the original filename and the full filename of the copy
            $result += New-Object -TypeName PSObject -Property @{
                'OriginalFile' = $fileName
                'NewFile'      = $destinationFile
            }
        }
        catch {
            Write-Error "Could not copy file to '$destinationFile'"
            # write the error to the log file
            Add-Content $Logfile -Value "$((Get-Date).ToString("yyyy-MM-dd HH:mm:ss")) - ERROR: Could not copy file to '$destinationFile'"
        }
    }
    else {
        Write-Warning "File '$fileName' does not exist"
        # write the error to the log file
        Add-Content $Logfile -Value "$((Get-Date).ToString("yyyy-MM-dd HH:mm:ss")) - WARNING: File '$fileName' does not exist"
    }
}

# finally create a CSV with the results of this copy.
# the CSV will have two headers, 'OriginalFile' and 'NewFile'
$result | Export-Csv -Path $ResultsFile -NoTypeInformation -Force
Thank you to everyone for the solutions; all of them worked, and worked well. I chose Theo's as the answer because his solution also solved the error logging and stored all the newly renamed GUID_File.ext names next to the existing CSV info.
Thank you all.
I am using a PowerShell script which should create a file that lists the directory structure (folders, subfolders, files, etc.):
$path = "golf.de/dgv"
Get-ChildItem -Path $folder -recurse | sort Directory, Name | format-Table -auto $path, Directory, Name | Out-File C:\Users\J.Kammermeier\Desktop\Johannes\testtext.txt
Until now the output looks like this:
C:\Users\J.Kammermeier\Desktop\Johannes Test-Datei1.txt
C:\Users\J.Kammermeier\Desktop\Johannes Test-Datei2.txt
C:\Users\J.Kammermeier\Desktop\Johannes\Sonstige Datein\Musik WACKEN.txt
but I need it in this order:
.../Johannes Test-Datei1.txt
...Johannes\Sonstige Datein\Musik WACKEN.txt
How can I achieve this?
You'll have to mangle the Directory property a bit, using Select-Object and calculated properties:
# Set the path and folder property
$path = "golf.de/dgv"
$folder = "C:\Users\J.Kammermeier\Desktop\Johannes"
# Get the name of the parent folder (the part we want to remove)
$basePath = (Get-Item $folder).Parent.FullName
# Retrieve the files
$files = Get-ChildItem -Path $folder -Recurse
# Select the Name property and then two calculated properties, "BaseURL" and "Directory"
$files = $files | Select-Object @{Name="BaseURL";Expression={"$path"}},
                                @{Name="Directory";Expression={$_.Directory.FullName.Substring($basePath.Length + 1)}},
                                Name
# Sort them
$files = $files |Sort-Object Directory, Name
# Formatted output to file
$files | Format-Table -AutoSize | Out-File C:\Users\J.Kammermeier\Desktop\Johannes\testtext.txt
From the details, I guess you're trying to audit the files for a website; you could combine the Path and Directory properties and fix the backslashes with -replace:
@{Name="URLPath";Expression={"$path/" + $($_.Directory.FullName.Substring($basePath.Length + 1) -replace "\\","/")}}