Bulk file renaming - match against a predefined text file - Windows

Good day,
I am trying to rename/organize files based on matches looked up in a text file.
I have a couple hundred media files with Cyrillic (Russian) names in a folder, like this:
файл 35.avi
файл34.avi
файл2 4.avi
файл14.avi
*note that some files have spaces
The text file, with the desired names, looks like this:
файл 35.avi| 4. файл 35.avi
файл34.avi| 3. файл34.avi
файл2 4.avi| 1. файл2 4.avi
файл14.avi| 2. файл14.avi
The reason it looks that way (with | as a separator) is that I tried using "Bulk Renaming Utility", which uses the pipe | as a separator for its "Rename Pairs" function. So essentially, the filename to the right of the pipe is the final product. Unfortunately, that function does not work with Cyrillic (Russian) or other non-standard characters.
I found a PowerShell script HERE which appears to be almost what I need, except that it does not match file names before renaming.
Similarly, I found a Python script HERE which does what I need, but it's for Ubuntu. Unfortunately, I am on Windows 7 and not sure it applies to me.
Any recommendations?
Thank you very much for your time!

You could read the text file into a hashtable, where the key is the old name (the value on the left hand side of the |), and the value is the new name:
$RenameTable = @{}
Get-Content textfile.txt | ForEach-Object {
    $OldName, $NewName = $_.Split('|')
    $RenameTable[$OldName] = $NewName
}
Then rename the files based on what is in the hashtable:
Get-ChildItem .\folder\with\avi\files | Rename-Item -NewName {
    if ($RenameTable.ContainsKey($_.Name)) {
        $RenameTable[$_.Name]
    } else {
        $_.Name
    }
}
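One caveat with Cyrillic file names: Windows PowerShell may misread the text file if its encoding doesn't match the console default, and the hashtable keys will then never match the real file names. If the names come out garbled, pass an explicit encoding to Get-Content (UTF8 here is an assumption; match it to however the text file was actually saved):

```powershell
# Read the rename pairs with an explicit encoding so the Cyrillic
# characters survive the round trip
Get-Content textfile.txt -Encoding UTF8 | ForEach-Object {
    $OldName, $NewName = $_.Split('|')
    $RenameTable[$OldName] = $NewName
}
```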

Related

How can I convert part of a filename to become the file extension?

I downloaded a backup folder of about 3,000 files from our email service provider. None of the files have an associated filetype; instead the file extension was appended to the name of each individual file. For example:
community-involvement-photo-1-jpg
social-responsibility-31-2012-png
report-02-12-15-pdf
I can manually change the last dash to a period and the files work just fine. I'm wondering if there is a way to batch convert all of the files so they can be sorted and organized properly. I know in the Command Line I can do something like ren *. *.jpg but there are several different file types contained in the folder, so it wouldn't work for all of them. Is there any way I can tell it to convert the last "-" in each file name into a "." ?
I'm on Windows 10; unable to install any filename conversion programs unless I want to go through weeks of trouble with the IT group.
$Ordner = "c:\temp\pseudodaten"
$Liste = (Get-ChildItem -Path $Ordner).Name
cd $Ordner
foreach ($Datei in $Liste) {
    $Length = $Datei.Length
    # Re-insert the dot before the trailing 3-character extension
    $NeuerName = $Datei.Substring(0, $Length - 4) + "." + $Datei.Substring($Length - 3, 3)
    Rename-Item -Path $Datei -NewName $NeuerName
}
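If the trailing extension is not always exactly three characters, a regex replace on the last dash avoids the hard-coded offsets. A sketch (the folder path is the one from the snippet above, untested against your data):

```powershell
# Turn the last "-" of each name into a "." so e.g.
# "report-02-12-15-pdf" becomes "report-02-12-15.pdf"
Get-ChildItem -Path "c:\temp\pseudodaten" -File |
    Rename-Item -NewName { $_.Name -replace '-(\w+)$', '.$1' }
```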

Request assistance writing a PS script to search for a list of files (path + filename) against all Windows servers in my environment

What I'm trying to accomplish:
Create a PS script to run from a single Admin machine, but search against C$ on all Windows servers in AD.
Search for a specific list of paths\filenames that I provide.
If ANY of the specific list of paths\filenames are found on a server, THEN output the server name, and paths\filenames to a *.CSV file titled "Badfiles.csv" on the Admin machine.
Was trying to build from the following syntax but admittedly my old brain is not good at this stuff, and I've only specified a single file here - how do I refer to a list of multiple paths\files? Thank you for helping an old lady out. :)
$name = gc env:computername
$computers = Get-Content -Path C:\Windows\Temp\v.bat
$csvfile = "c:\temp\$badfiles.csv"
foreach ($computer in $computers) {
    "\\$computer\C$\" | Get-ChildItem -Recurse -Filter "*.bat"
}
To refer to a list of items, whether those are files or computer names, you will need to use what is called an array.
You can create an array in many ways. In your case it might be best to create the list in a txt file; when you then read its contents in PowerShell with Get-Content and save the result in a variable, it will automatically be stored as an array.
Then iterate through the items using what is called a foreach loop, which lets you take each item in the array, do something with it, then move on to the next item until every item has been dealt with.
Now the most important part of what you want to achieve is not clear. Let me explain.
To check whether a file exists you can use Test-Path. It returns true or false, and you can then act on the result. You need to provide an exact path and file name for this to work.
If you don't know the exact names and paths of the files that need to be checked, you can use Get-ChildItem, similar to the code you provided. The caveat is that you should narrow the scope of the file search as much as you can. In your example you search for the .bat extension on the whole machine, and that can cause problems: a typical C: drive has hundreds of thousands, if not millions, of files and folders, and parsing all of them can take a long time.
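For example, pointing the search at one known folder on the admin share instead of the whole drive keeps the scan fast (the folder here is only a placeholder):

```powershell
# Search a single remote folder rather than the entire C$ share
Get-ChildItem "\\$computer\C$\Windows\Temp" -Filter '*.bat' -Recurse
```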
So this is an important distinction to understand, and what causes confusion for me is that in step 2 you say "Search for a specific list of paths\filenames that I provide", yet in the code you use Get-ChildItem to get all files instead of providing a list of filenames.
Going further, I will assume you have a list of filenames with exact, known paths.
Now in your given code I can see you have found some of the right commands but they need to be arranged differently to produce the results you need.
Please review this example code that might help you further:
Example ComputerList.txt file content (a list of computer hostnames to check):
computer1
serverXYZ
laptop123
Example FileList.txt file content (a list of files to check for on each of the above computers):
c:\temp\virus.bat
c:\games\game.exe
c:\Pictures\personal.jpg
Now the PowerShell code:
# Get the lists of items from the TXT files and save them as arrays
$ComputerNames = Get-Content 'c:\temp\ComputerList.txt'
$FileList = Get-Content 'c:\temp\FileList.txt'

# Define the path and name of the CSV report
$BadFiles = "c:\temp\badfiles.csv"

# Initialize the array that will collect the found bad files
$BadFileList = @()

# Iterate through each hostname in the computer list
foreach ($computer in $ComputerNames) {
    # Iterate through each file in the list and test its path on the current computer
    foreach ($file in $FileList) {
        # Convert the local file path to a C$ share path
        $file = $file -replace '^c:', 'c$'
        # Define the path of the file to test
        $FileToTest = "\\$computer\$file"
        # Test the path and record an entry if the file was found
        if (Test-Path $FileToTest -ErrorAction SilentlyContinue) {
            # This block runs only when a bad file has been found.
            # Build a properly formatted entry for the resulting CSV file
            $BadFile = "" | Select-Object Computer, File
            # Save information about the current computer and file
            $BadFile.Computer = $computer
            $BadFile.File = $file
            # Append the entry to the array of found bad files
            $BadFileList += $BadFile
        }
    }
}

# When done iterating through every computer and file, save the results in a CSV file
$BadFileList | ConvertTo-Csv -NoTypeInformation | Out-File $BadFiles
The above is a full code snippet you can test and run in your environment. First please create the two TXT files and make sure you run PowerShell with the appropriate permissions to access the C$ network shares of the servers.
The snippet should work but I have not tested it myself. Let me know if there are any errors.
Please test and feel free to ask if you have any follow up questions.

Script to compare two different folder contents and rename them based on minimum similarity

Story:
I have multiple folders with 1000+ files each. The files are named similarly to each other with slight differences, but they relate to the same content.
For example, in one folder I have files named quite simply "Jobs to do.doc" and in another folder "Jobs to do (UK) (Europe).doc" etc.
This is on Windows 10, not Linux.
Question:
Is there a script to compare each folder's contents and rename the files based on minimum similarity? The end result would be to remove all the jargon and have each file named the same as its counterparts, while STILL remaining in its respective folder.
*Basically: compare the contents of multiple folders against one folder's contents and rename the files so each file in each folder is named the same?
Example:
D:/Folder1/Name_Of_File1.jpeg
D:/Folder2/Name_Of_File1 (Europe).jpeg
D:/Folder3/Name_of_File1_(Random).jpeg
D:/folder1/another_file.doc
D:/Folder2/another_file_(date_month_year).txt
D:/Folder3/another_file(UK).XML
I have used different file extensions in the example above in the hope that someone can write a script that ignores file extensions.
I hope this makes sense. So: either a script to remove the content in brackets while keeping the files' integrity, or one to rename ALL files across all folders based on minimum similarity.
The problem is that it's 1000+ files in each folder, so I want to run this as an automated job.
Thanks in advance.
If the stuff you want to get rid of is always in brackets, then you could write a regex like
(.*?)([\s_]*\(.*\))
Try something like this
$folder = Get-ChildItem 'C:\TestFolder'
$regex = '(.*?)([\s_]*\(.*\))'
foreach ($file in $folder) {
    if ($file.BaseName -match $regex) {
        Rename-Item -Path $file.FullName -NewName "$($matches[1])$($file.Extension)" -Verbose #-WhatIf
    }
}
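To see what the regex captures, you can test it against one of the example names; the first capture group holds the cleaned-up base name:

```powershell
'another_file_(date_month_year)' -match '(.*?)([\s_]*\(.*\))'   # True
$matches[1]                                                     # another_file
```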
Regarding consistency, you could run a precheck using the same regex:
# Compute each file's cleaned-up base name if it matches the regex (no renaming here)
$folder1 = Get-ChildItem 'D:\T1' | ForEach-Object { if ($_.BaseName -match $regex) { $matches[1] } else { $_.BaseName } }
$folder2 = Get-ChildItem 'D:\T2' | ForEach-Object { if ($_.BaseName -match $regex) { $matches[1] } else { $_.BaseName } }
# Compare base names in the two folders - if all are the same, nothing is returned
Compare-Object $folder1 $folder2
Maybe you could build on that idea.

Windows batch script that moves files based on a partial character string by looking it up in a CSV/txt file

What I'm looking for might be a variation of this solution: Windows batch file to sort files into separate directories based on types specified in a csv
My situation: a batch process on a server creates files that look like this: S0028513-010716-0932.txt. S stands for summary, the first five digits stand for a supplier, and the last two before the hyphen stand for the Distribution Center. After the first hyphen there is the date, and after the second hyphen the timestamp.
What I need to do is:
set a variable for the month/year (e.g. 0716) (this has been done with "set /P c=Please enter MMYY:"). This part is done.
create a folder with subfolders (e.g. 0716\PHARMA, 0716\MEDICAL, etc). I've done this part.
look up the supplier number in a CSV file (e.g. S00285 above) and
move the file to the corresponding folder based on MMYY\PHARMA, etc.
Points 3 and 4 are obviously missing. A practical example: there are three folders where the files can be moved: PHARMA, MEDICAL and CONSUMER.
The CSV file looks like this:
S00285 CONSUMER
S00286 PHARMA
S00287 MEDICAL
...
What I want the script to do is to look up the month/year combination in variable c and take all files that correspond to this month/year and move them to the three folders according to the assignment in the CSV file.
Can this be done with standard Windows scripting? Sorry guys, I'm a novice as you can tell. I have only some very basic knowledge of BASH scripting.
Thank you a lot for any advice.
BR
Marcio
This can fairly easily be accomplished with PowerShell
$FolderRoot = "E:\Target\Directory"
Set-Location $FolderRoot
# 1. Have user input month/year string
do {
    $MMYY = $(Read-Host 'Please enter MMYY').Trim()
} until ($MMYY -match '^\d{4}$')

# 2. Create directory
mkdir $MMYY

# ?. Gather input files for that month/year
$Files = Get-ChildItem -Filter S*.txt | Where-Object { $_.BaseName -match "S\d{7}-\d{2}$MMYY-\d{4}" }

# ?. Load the CSV file into a hash table to easily look up supplier numbers
$SupplierLookupTable = @{}
# Assuming the CSV has headers: Supplier,Industry
Import-Csv -Path E:\path\to\suppliers.csv | ForEach-Object {
    $SupplierLookupTable[$_.Supplier] = $_.Industry
}

foreach ($File in $Files)
{
    # Grab the S and first 5 digits from the file name
    $Supplier = $File.BaseName.Substring(0, 6)
    # 3. Look up the industry
    $Industry = $SupplierLookupTable[$Supplier]
    $Destination = Join-Path $MMYY $Industry
    # Create the folder if it doesn't already exist
    if (-not (Test-Path $Destination))
    {
        mkdir $Destination
    }
    # 4. Move the file
    Move-Item $File.FullName -Destination $Destination
}

Recursively replace certain text in all files following certain pattern with another text

I have certain files contained in different directories inside a parent directory. Some of these files are prefixed with certain text. I want to replace this prefix with another text using PowerShell. I have tried the below, but no luck. PowerShell outputs as if the file names have been renamed; however, when I checked the directory, nothing had actually changed:
Your ForEach-Object loop just takes the names of the files, replaces the prefix, then echoes the modified string. Use Rename-Item instead of ForEach-Object:
Get-ChildItem abc* | Rename-Item -NewName { $_.Name.Replace('abc', 'uvw') }
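To preview the result before committing, append -WhatIf; the renames are only printed, not performed:

```powershell
# Dry run: shows what would be renamed without changing anything
Get-ChildItem abc* | Rename-Item -NewName { $_.Name.Replace('abc', 'uvw') } -WhatIf
```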
