Extracting multiple files into their respective directories in Windows

I'm posting this question after extensive searching did not yield a solution to my problem.
Here's the problem: I have a folder in Windows with multiple subfolders, and each of them contains one or more compressed (RAR) archives:
Master_folder
    sub_folder1
        rarfolder1
    sub_folder2
        rarfolder1
    and so on
Is there a way to extract the folder that sub_folderX (where X runs from 1 to 300) contains into sub_folderX itself, and so on for all the other subfolders?
All the posts and solutions out there on extracting multiple files simultaneously (even using the CLI) talk about extracting everything into a single location, and I observed similar results when experimenting with the WinRAR GUI options.
However, I don't want to put them in a single location, since the extracted folders have the same name; their location within their outer folder is what differentiates them.

If you are open to scripting, you can iterate recursively over the subfolders using command-line UnRAR and a little PowerShell:

# Root path where the .rar files are located
$Directory = "T:\*"

# Find every .rar below the root, then extract each archive into the
# folder that contains it ("x" keeps the folder structure stored in
# the archive; "e" would flatten it)
$rarFiles = Get-ChildItem -Path $Directory -Recurse -Include *.rar
foreach ($rar in $rarFiles) {
    & "C:\Program Files (x86)\WinRAR\UnRAR.exe" x -o- $rar.FullName "$($rar.Directory)\"
}


How can I convert part of a filename to become the file extension?

I downloaded a backup folder of about 3,000 files from our email service provider. None of the files have an associated filetype; instead the file extension was appended to the name of each individual file. For example:
community-involvement-photo-1-jpg
social-responsibility-31-2012-png
report-02-12-15-pdf
I can manually change the last dash to a period and the files work just fine. I'm wondering if there is a way to batch convert all of the files so they can be sorted and organized properly. I know that in the command line I can do something like ren *. *.jpg, but there are several different file types in the folder, so that wouldn't work for all of them. Is there any way I can tell it to convert the last "-" in each file name into a "."?
I'm on Windows 10; unable to install any filename conversion programs unless I want to go through weeks of trouble with the IT group.
# Folder that holds the files to rename
$Ordner = "c:\temp\pseudodaten"
$Liste = (Get-ChildItem -Path $Ordner).Name
cd $Ordner
foreach ($Datei in $Liste) {
    $Length = $Datei.Length
    # Assumes a three-character extension: drop the last dash and
    # re-append the final three characters after a dot
    # (e.g. "report-02-12-15-pdf" -> "report-02-12-15.pdf")
    $NeuerName = $Datei.Substring(0, $Length - 4) + "." + $Datei.Substring($Length - 3, 3)
    Rename-Item -Path $Datei -NewName $NeuerName
}
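Note that the substring approach assumes every extension is exactly three characters long. If some names end in four-character extensions like -jpeg, a regex-based variant (a sketch, untested against your exact file set, reusing the same folder path) turns only the last dash into a dot regardless of extension length:

# Replace the final "-" in each name with "." so the trailing token
# becomes a real file extension (e.g. "...-jpg" -> "....jpg")
Get-ChildItem -Path "c:\temp\pseudodaten" -File |
    Rename-Item -NewName { $_.Name -replace '-(?=[^-]+$)', '.' }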

Script to compare two different folder contents and rename them based on minimum similarity

Story:
I have multiple folders with 1000+ files in each that are named similarly to one another but slightly differently, even though they relate to the same content.
For example, in one folder I have files named quite simply "Jobs to do.doc" and in another folder "Jobs to do (UK) (Europe).doc" etc.
This is on Windows 10, not Linux.
Question:
Is there a script to compare each folder's contents and rename the files based on minimum similarity? The end result would be to remove all the jargon and have each file in each folder (multiple folders) named the same as its counterparts, but STILL remain in its respective folder.
*Basically: compare multiple folders' contents to one folder's contents and rename the files so that each file in each folder is named the same?
Example:
D:/Folder1/Name_Of_File1.jpeg
D:/Folder2/Name_Of_File1 (Europe).jpeg
D:/Folder3/Name_of_File1_(Random).jpeg
D:/folder1/another_file.doc
D:/Folder2/another_file_(date_month_year).txt
D:/Folder3/another_file(UK).XML
I have used different file extensions in the above example in the hope that someone can write a script that ignores file extensions.
I hope this makes sense. So: either a script to remove the content in brackets while keeping each file's integrity, or one to rename ALL files across all folders based on minimum similarity.
The problem is that it's 1000+ files in each folder, so I want to run it as an automated job.
Thanks in advance.
If the stuff you want to get rid of is always in brackets, then you could write a regex like
(.*?)([\s_]*\(.*\))
Try something like this:
$folder = Get-ChildItem 'C:\TestFolder'
$regex = '(.*?)([\s_]*\(.*\))'
foreach ($file in $folder){
    if ($file.BaseName -match $regex){
        # Keep only the part before the bracketed jargon, then re-append the extension
        Rename-Item -Path $file.FullName -NewName "$($matches[1])$($file.Extension)" -Verbose #-WhatIf
    }
}
Regarding consistency, you could run a precheck using the same regex:
# Normalize each filename that matches the regex and keep only its new basename
$folder1 = Get-ChildItem 'D:\T1' | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
$folder2 = Get-ChildItem 'D:\T2' | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
# Compare basenames in the two folders - if all are the same, nothing is returned
Compare-Object $folder1 $folder2
Maybe you could build with that idea.
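Since you have more than two folders, one way to build on that idea (a sketch, assuming hypothetical folders D:\T1 through D:\T3, with D:\T1 as the reference) is to run the same normalization against each folder and compare it to the first:

# Normalize basenames in the reference folder
$reference = Get-ChildItem 'D:\T1' | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
foreach ($path in 'D:\T2','D:\T3'){
    # Normalize the current folder the same way
    $names = Get-ChildItem $path | foreach {if ($_.BaseName -match $regex){$matches[1]}else{$_.BaseName}}
    # Anything returned here is a basename that does not line up with D:\T1
    Compare-Object $reference $names
}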

Copy files based on existence of other files (Windows)

I have a folder of images in jpg format called "finalpics" and also another folder ("sourcepics") which has several subfolders containing RAW files in various formats.
I need a script (batch file?) that will copy all the files from "sourcepics" and its subfolders to another folder ("sourcefinal"), but only those files whose names also exist in "finalpics".
As an example:
"finalpics" contains files called mypic1.jpg, mypic2.jpg, mypic3.jpg.
"sourcepics" contains files called mypic1.dng, mypic2.psd, mypic3.cr2, yourpic1.dng, yourpic2.psd, yourpic3.cr2.
I'd want the script to copy the 'mypic' files but not the 'yourpic' files to "sourcefinal".
There are over a thousand JPGs in "finalpics", but probably 40,000 files in the various subfolders of "sourcepics".
Hope that makes sense.
Thanks for looking.
I think this PowerShell code will do what you're after; it will copy files of the same name (ignoring file extension) from "SourcePics" to "SourceFinal" if they exist in FinalPics:
# Define your folder locations:
$SourcePicsFolder = 'C:\SourcePics'
$FinalPicsFolder = 'C:\FinalPics'
$SourceFinalFolder = 'C:\SourceFinal'

# Get existing files into arrays:
$SourcePics = Get-ChildItem -Path $SourcePicsFolder -Recurse
$FinalPics = Get-ChildItem -Path $FinalPicsFolder -Recurse

# Loop over all files in the source folder:
foreach($file in $SourcePics)
{
    # Using the BaseName property (which ignores the file extension), if the
    # $FinalPics array contains a basename equal to the basename of $file, copy it:
    if($FinalPics.BaseName -contains $file.BaseName)
    {
        Copy-Item -Path $file.FullName -Destination $SourceFinalFolder
    }
}
Note: there is no filtering based on file type (it will copy all files with matching names). Also, if your 'SourcePics' folder has two images with the same filename in different subfolders, and a file of that name also exists in 'FinalPics', then you may get a file-already-exists error when it tries the second copy. To overwrite instead, use the -Force parameter on the Copy-Item command.
I tested the above code with some .dng files in 'SourcePics' and .jpg files in 'FinalPics' and it worked (ignoring the yourpic files).
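If those name collisions are a concern, a variant of the loop (a sketch that reuses the variables defined above) recreates each file's subfolder path under SourceFinal instead of copying everything into one flat folder:

foreach ($file in ($SourcePics | Where-Object { -not $_.PSIsContainer }))
{
    if ($FinalPics.BaseName -contains $file.BaseName)
    {
        # Rebuild the file's path relative to SourcePics under SourceFinal
        $relative = $file.FullName.Substring($SourcePicsFolder.Length).TrimStart('\')
        $target = Join-Path $SourceFinalFolder $relative
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item -Path $file.FullName -Destination $target
    }
}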

Recursive "touch" on fileserver

There's something I want to accomplish with administrating my Windows file server:
I want to change the "Last Modified" date of all the folders on my server (just the folders and subfolders, not the files within them) to match the most recent "Created" (or maybe "Last Modified") date of the files within each folder. (In many cases, the date on the folder is MUCH newer than the newest file within it.)
I'd like to do this recursively, from the deepest subfolder to the root. I'd also like to do this without me manually entering any dates and times.
I'm sure that with a combination of scripting and a Windows port of "touch" I could maybe accomplish this. Do you have any suggestions?
This closed topic seems really close, but I'm not sure how to touch only the folders without touching the files inside, or how to get the most recent file's date: Recursive touch to fix syncing between computers
If it's for backup purposes, Windows has the Archive flag (instead of modifying the timestamp). You can set it recursively with ATTRIB /S, for example attrib +A /S /D (see ATTRIB /?).
If it is for other purposes, you can use some touch.exe implementation together with a recursive FOR /R loop (see FOR /?):
http://ss64.com/nt/for_r.html
http://ss64.com/nt/touch.html
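If you'd rather avoid a separate touch.exe, a rough PowerShell equivalent of that recursive loop (a sketch, assuming a hypothetical root of D:\Share; note it stamps every folder with the current time rather than the newest file's time) would be:

# "Touch" only the folders, not the files, recursively
Get-ChildItem -Path 'D:\Share' -Recurse -Directory |
    ForEach-Object { $_.LastWriteTime = Get-Date }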
I think you can do this in PowerShell. I just tried throwing something together and it seems to work correctly. You could invoke it with Set-DirectoryMaxTime(".\Directory") and it will operate recursively on each directory under that path.
function Set-DirectoryMaxTime([System.IO.DirectoryInfo]$directory)
{
    # Grab a list of all the files in the directory
    $files = Get-ChildItem -File $directory

    # Get the current CreationTime of the directory we are looking at
    $maxdate = Get-Date $directory.CreationTime

    # Find the most recently edited file's LastWriteTime
    foreach($file in $files)
    {
        if($file.LastWriteTime -gt $maxdate) { $maxdate = $file.LastWriteTime }
    }

    # This needs to be in a try/catch block because there is a reasonable chance
    # of it failing if a folder is currently in use
    try
    {
        # Give the directory a LastWriteTime equal to the newest file's LastWriteTime
        $directory.LastWriteTime = $maxdate
    } catch {
        # One of the directories could not be updated
        Write-Host "Could not update directory: $directory"
    }

    # Get all the subdirectories of this directory
    $subdirectories = Get-ChildItem -Directory $directory

    # Jump into each subdirectory and do the same thing to its CreationTime
    foreach($subdirectory in $subdirectories)
    {
        Set-DirectoryMaxTime($subdirectory)
    }
}

How to read images from folders in matlab

I have six folders of images, and each folder contains some images. I know how to read images in MATLAB, but my question is how I can traverse these folders and read the images from my abc.m script.
So basically you want to read images in different folders without putting all of the images into one folder and using imread()? You could just copy all of the images into your MATLAB working directory (naming them in a way that tells you which folder they came from) and then load them that way.
Use the cd command to change directories (as in *nix) and then load/read the images as you traverse each folder. You might need absolute path names.
The easiest way is certainly to right-click the folder in MATLAB and choose "Add to Path" >> "Selected Folders and Subfolders".
Then you can just read the images with imread without specifying the path.
If you know the path to the directory containing the images, you can use dir on it to list all the files (and directories) in it. Filter the files by the image extension you want and voila: you have an array with all the images in the directory you specified:
dirname = 'images';
ext = '.jpg';
sDir = dir( fullfile(dirname, ['*' ext]) );
sDir([sDir.isdir]) = [];  % remove directories
% The following is obsolete because of the wildcarded dir call above
b = arrayfun(@(x) strcmpi(x.name(end-length(ext)+1:end), ext), sDir);  % filter on extension
sFiles = sDir(b);
You probably want to prefix the name of each file with the directory before using it:
sFileName{ii} = fullfile(dirname, sFiles(ii).name);
You can process the resulting files however you want. For example, loading all of them:
for ii = 1:numel(sFiles)
    data{ii} = imread(fullfile(dirname, sFiles(ii).name));
end
If you also want to recurse the subdirectories, I suggest you take a look at:
How to get all files under a specific directory in MATLAB?
or other solutions on the FEX:
http://www.mathworks.com/matlabcentral/fileexchange/8682-dirr-find-files-recursively-filtering-name-date-or-bytes
http://www.mathworks.com/matlabcentral/fileexchange/15505-recursive-dir
EDIT: added Amro's suggestion of wildcarding the dir call
