Batch rename files in Windows 10 - cmd

I have a folder in which I need to rename all of the files to replace a string with another string:
File 01 (Something) ABC.txt ---> File 01 - ABC.txt
File 02 (Something) DEF.txt ---> File 02 - DEF.txt
In other words, I need to replace "(Something)" with "-".
I tried the ren solution mentioned in this answer, but I got "The syntax of the command is incorrect." Here's what I tried:
ren *(Something)* *-*

I was looking for a cmd approach, but it looks like what I need can't be done with the ren command.
So I ended up just creating a simple PHP script based on this answer:
<?php
$path = 'E:/My Folder';
if ($handle = opendir($path)) {
    while (false !== ($fileName = readdir($handle))) {
        if (in_array($fileName, array('.', '..'))) {
            continue;
        }
        $newName = str_replace("(Something)", "-", $fileName);
        rename("{$path}/{$fileName}", "{$path}/{$newName}");
    }
    closedir($handle);
}
?>
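
(For comparison, the same rename is only a few lines of Perl. Below is a minimal, untested sketch along the lines of the PHP above; the folder path is the same assumption.)

#!/usr/bin/perl
# Minimal sketch: replace "(Something)" with "-" in every file name
# under $path, mirroring the PHP script above. The path is an assumption.
use strict;
use warnings;

my $path = 'E:/My Folder';
opendir my $dh, $path or die "Cannot open $path: $!";
for my $name (readdir $dh) {
    next if $name eq '.' or $name eq '..';
    (my $new = $name) =~ s/\Q(Something)\E/-/;
    next if $new eq $name;    # nothing to replace
    rename "$path/$name", "$path/$new" or warn "rename $name: $!";
}
closedir $dh;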

Related

Executing Perl script from Windows command line with 2 entries

This is my Perl script:
use strict;
use warnings;
use XML::Twig;
use Data::Dumper;

sub xml2array {
    my $path = shift;
    my $twig = XML::Twig->new->parsefile($path);
    return map { $_->att('VirtualPath') } $twig->get_xpath('//Signals');
}

sub compareMappingToArray {
    my $mapping    = shift;
    my $signalsRef = shift;
    my $i = 1;
    print "In file : $mapping\n";
    open(my $fh, $mapping);
    while (my $r = <$fh>) {
        chomp $r;
        if ($r =~ /\'(ModelSpecific.*)\'/) {
            my $s = $1;
            my @matches = grep { /^$s$/ } @{$signalsRef};
            print "line $i : not found - $s\n" if scalar @matches == 0;
            print "line $i : multiple $s\n"    if scalar @matches > 1;
        }
        $i = $i + 1;    # keep line index
    }
}

my $mapping = "C:/Users/HOR1DY/Desktop/Global/TA_Mapping/CAN/CAN_ESP_002_mapping.pm";
my @virtualpath = xml2array("SignalModel.xml");
compareMappingToArray($mapping, \@virtualpath);
The script works well. Its aim is to compare the files "SignalModel.xml" and "CAN_ESP_002_mapping.pm" and put the lines that didn't match into a .TXT file. Here is what the .TXT file looks like:
In file : C:/Users/HOR1DY/Desktop/Global/TA_Mapping/CAN/CAN_ESP_002_mapping.pm
line 331 : not found - ModelSpecific.EID.NET.CAN_Engine.VCU.Transmit.VCU_202.R2B_VCU_202__byte_3
line 348 : not found - ModelSpecific.EID.NET.CAN_Engine.CMM_WX.Transmit.CMM_HYB_208.R2B_CMM_HYB_208__byte_2
line 368 : not found - ModelSpecific.EID.NET.CAN_Engine.VCU.Transmit.VCU_222.R2B_VCU_222__byte_0
For this script I put the two files that need to be compared inside the code. Instead of doing that, I would like to run the script from the Windows command line with something like:
C:\Users>perl CANMappingChecker.pl -'file 1' 'file 2'
All the files are in a .zip file, so if the script could go inside the archive and take the 2 files that I need for comparison, that would be perfect.
I really don't know what to put inside my script to make that work from the Windows cmd. Thanks for your help!
Program (or script) parameters are stored in the @ARGV array. shift and pop without any parameter will work on @ARGV when used outside of a sub; in a sub they operate on @_.
See Archive::Zip for zip file handling.
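
As a rough, untested sketch of both points, assuming the script's existing subs, that the mapping file is the first argument and the signal file the second, and a made-up zip member name for illustration:

#!/usr/bin/perl
# Sketch only: read the two file names from @ARGV, and if the second
# one is a zip, pull the XML member out of it first. The member name
# 'SignalModel.xml' and the usage shape are assumptions.
use strict;
use warnings;
use Archive::Zip qw( :ERROR_CODES );

my ( $mapping, $xml ) = @ARGV;
die "usage: perl CANMappingChecker.pl <mapping.pm> <signals>\n"
    unless defined $mapping and defined $xml;

if ( $xml =~ /\.zip$/i ) {
    my $zip = Archive::Zip->new($xml) or die "Cannot read zip $xml\n";
    $zip->extractMember('SignalModel.xml') == AZ_OK
        or die "Cannot extract SignalModel.xml from $xml\n";
    $xml = 'SignalModel.xml';
}

my @virtualpath = xml2array($xml);
compareMappingToArray( $mapping, \@virtualpath );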

How to compare the content of multiple txt files in bash shell and delete the one (file) which is a duplicate

I am trying to achieve this on macOS; I tried something similar using fdupes, but it didn't work. Here is what I am trying to achieve:
There are 100 files in directory 'alpha'
Pick one file A and compare it with each remaining file in the directory 'alpha'
If content of file A matches any file (duplicate), delete the duplicate file
Move to file B, compare it with the remaining files, and do the same (check for duplicates)
Repeat until all files are checked for duplicates; the remaining files should be unique
Update
I slightly modified something similar I found here, but it is not detecting all duplicates in a single run; I have to run it multiple times to remove them all. Not sure if it is working correctly:
use Digest::MD5;

%check = ();
while (<*>) {
    -d and next;    # skip directories
    $fname = "$_";
    print "checking .. $fname\n";
    $md5 = getmd5($fname) . "\n";
    if ( !defined( $check{$md5} ) ) {
        # first file seen with this digest
        $check{$md5} = "$fname";
    }
    else {
        print "Found duplicate files: $fname and $check{$md5}\n";
        print "Deleting duplicate $check{$md5}\n";
        unlink $check{$md5};
    }
}

sub getmd5 {
    my $file = "$_";
    open( FH, "<", $file ) or die "Cannot open file: $!\n";
    binmode(FH);
    my $md5 = Digest::MD5->new;
    $md5->addfile(FH);
    close(FH);
    return $md5->hexdigest;
}
You should limit the number of times that you have to read each file's contents:
1. Inventory the files using Path::Class or some similar method.
   a. Build a hash relating file sizes and Digest::MD5 digests to a list of file names.
2. Compare likely duplicates only: matching file size and digest.
The following is untested:
use strict;
use warnings;

use Path::Class;
use Digest::MD5;

my $dir = dir('.');
my %files_per_digest;

# Inventory Directory
while ( my $file = $dir->next ) {
    next if $file->is_dir;    # skip sub-directories (and the '.'/'..' entries)
    my $size   = $file->stat->size;
    my $digest = do {
        my $md5 = Digest::MD5->new;
        $md5->addfile( $file->openr );
        $md5->hexdigest;
    };
    push @{ $files_per_digest{"$size - $digest"} }, $file;
}

# Compare likely duplicates only
for my $files ( grep { @$_ > 1 } values %files_per_digest ) {
    # Sort by alpha
    @$files = sort @$files;
    print "Comparing: @$files\n";
    for my $i ( reverse 0 .. $#$files ) {
        for my $j ( 0 .. $i - 1 ) {
            my $fh1  = $files->[$i]->openr;
            my $fh2  = $files->[$j]->openr;
            my $diff = 0;
            while ( !eof($fh1) && !eof($fh2) ) {
                $diff = 1, last if scalar(<$fh1>) ne scalar(<$fh2>);
            }
            # duplicates never differed and are the same length
            if ( !$diff and eof($fh1) and eof($fh2) ) {
                print " $files->[$i] ($i) is duplicate of $files->[$j] ($j)\n";
                $files->[$i]->remove();
                splice @$files, $i, 1;
                last;    # entry $i is gone; move on to the next $i
            }
        }
    }
}
I've used rdfind in the past with very good success. It's very accurate, fast, and seems to run leaner than fdupes. According to RDFind's web site (http://rdfind.pauldreik.se/), it can be installed using MacPorts.

How to find specific files recursively in a directory, rename them by prefixing the sub-directory name, and move them to a different directory

I am a Perl noob, trying to do the following:
Search recursively in a directory for files whose name contains a specific string, say 'abc.txt'
The file can be in two different sub-directories, say dir_1 or dir_2
Once a file is found, if it is in dir_1, rename it to dir_1_abc.txt; if it is in dir_2, rename it to dir_2_abc.txt
Once all the files have been found and renamed, move them all to a new directory named, say, dir_3
I don't mind having to use a module to accomplish this. I have been trying to do it using File::Find::Rule and File::Copy, but am not getting the desired result. Here is my sample code:
#!/usr/bin/perl -sl
use strict;
use warnings;
use File::Find::Rule;
use File::Copy;

my $dir1 = '/Users/macuser/ParentDirectory/logs/dir_1';
my $dir2 = '/Users/macuser/ParentDirectory/logs/dir_2';
# ideally I just want to define one directory but because of the logic I am using in the IF
# statement, I am specifying two different directory paths
my $dest_dir = '/Users/macuser/dir_3';

my (@old_files) = find(
    file => (),
    name => '*abc.txt',
    in   => $dir1, $dir2 );    # not sure if I can give two directories, works with one

foreach my $old_file (@old_files) {
    print $old_file;    # added this for debug
    if ($dest_dir =~ m/dir_1/)
    {
        print "yes in the loop";
        rename($old_file, "dir_1_$old_file");
        print $old_file;
        copy "$old_file", "$dest_dir";
    }
    if ($dest_dir =~ m/dir_2/)
    {
        print "yes in the loop";
        rename($old_file, "dir_2_$old_file");
        print $old_file;
        copy "$old_file", "dest_dir";
    }
}
The code above does not change the file name; instead, when I print $old_file inside the if, it spits out the whole directory path where the file was found, prefixed with dir_1 or dir_2 respectively. Something is horribly wrong. Please help.
If you have bash (I assume it is available on OS X), you can do this in a few lines (usually I put them in one line):
destdir="your_dest_dir"
for i in `find /Users/macuser/ParentDirectory/logs -type f -iname '*abc.txt'`
do
    prefix=`dirname "$i"`
    if [[ $prefix = *dir_1* ]] ; then
        prefix="dir_1"
    elif [[ $prefix = *dir_2* ]] ; then    # mirror the dir_1 case so the prefix is never a full path
        prefix="dir_2"
    fi
    dest="$destdir/${prefix}_`basename "$i"`"
    mv "$i" "$dest"
done
The advantage of this method is that you can have many sub-directories under logs and you don't need to specify them. You can search for files like blah_abc.txt and tada_abc.txt too. If you want an exact match, just use abc.txt instead of *abc.txt.
If the files can be placed in the destination as you rename them, try this:
#!/usr/bin/perl
use strict;
use File::Find;
use File::Copy;

my $dest_dir = '/Users/macuser/dir_3';
foreach my $dir ('/Users/macuser/ParentDirectory/logs/dir_1',
                 '/Users/macuser/ParentDirectory/logs/dir_2') {
    my $prefix = $dir;
    $prefix =~ s/.*\///;    # keep only the last path component (dir_1 / dir_2)
    find(sub {
        move($File::Find::name, "$dest_dir/${prefix}_$_") if /abc\.txt$/;
    }, $dir);
}
If you need to do all the renaming first and then move them all, you could either remember the list of files you have to move, or make two passes, making sure the pattern on the second pass still matches after the renames from the first pass.
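For instance, here is an untested sketch of the first option, remembering the list during the search and moving everything afterwards (same assumed paths as the snippet above):

#!/usr/bin/perl
# Untested sketch: collect the rename targets in a first pass, then
# move them all once the search is finished. Paths as assumed above.
use strict;
use warnings;
use File::Find;
use File::Copy;

my $dest_dir = '/Users/macuser/dir_3';
my @to_move;

foreach my $dir ('/Users/macuser/ParentDirectory/logs/dir_1',
                 '/Users/macuser/ParentDirectory/logs/dir_2') {
    (my $prefix = $dir) =~ s{.*/}{};
    find(sub {
        # record source path and prefixed target name; move nothing yet
        push @to_move, [ $File::Find::name, "${prefix}_$_" ] if /abc\.txt$/;
    }, $dir);
}

# Second pass: every rename was decided before anything moves.
move( $_->[0], "$dest_dir/$_->[1]" ) or warn "move failed: $!" for @to_move;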

fopen with dynamic filename is not working

I need to make a web site on which people can upload data files that, after some treatment, will be plotted using JpGraph. The file is analyzed with a bash script called analfile.sh, which looks like this:
#!/bin/bash
file=$1
fecha=`date +%s`
mv $1 $fecha.dat
echo $fecha.dat
So it gives back another file whose name is like 1321290921.dat. That is the file that I need to plot.
This is my current PHP code:
$target_path = "/home/myhome";
$target_path = $target_path . basename($_FILES['rdata']['name']);
$target_file = basename($_FILES['rdata']['name']);
if (move_uploaded_file($_FILES['rdata']['tmp_name'], $target_path)) {
    echo "The file " . $target_file . " has been uploaded";
    chdir('/home/myhome');
    $filetoplot = shell_exec('./analfile.sh' . ' ' . $target_file);
} else {
    echo "There was an error uploading the file, please <a href=\"index.html\">try again!</a>";
}
//$filetoplot="1321290921.dat"
echo "<br>The file to be opened is " . $filetoplot . "<br>";
if ($file_handle = fopen("$filetoplot", 'r')) {
    while ($line_of_text = fgets($file_handle)) {
        $parts = explode('.', $line_of_text);
        echo $line_of_text;
        print $parts[0] . $parts[1] . $parts[2] . "<br>";
    }
    fclose($file_handle);
}
I have permission to read and write in the target directory. I find it strange that if I uncomment the line $filetoplot="1321290921.dat", the script works perfectly. I guess I am doing something stupid, since this is my first PHP code, but after some hours of googling I was not able to find a solution.
Any help will be appreciated.
The first thing I'm seeing is that you don't append a trailing slash (/) to your home path, so the path ends up like /home/myhomefoo rather than /home/myhome/foo.
You should also move $target_file earlier, and reuse that within $target_path. There's no reason to do the same thing twice.
If that doesn't help, we'll see what goes next.

Locating a file on the path

Does anybody know how to determine the location of a file that's in one of the folders specified by the PATH environment variable, other than doing a dir filename.exe /s from the root folder?
I know this is stretching the bounds of a programming question but this is useful for deployment-related issues, also I need to examine the dependencies of an executable. :-)
You can use the where.exe utility in the C:\Windows\System32 directory.
For Windows NT-based systems:
for %i in (file) do @echo %~dp$PATH:i
Replace file with the name of the file you're looking for.
If you want to locate the file at the API level, you can use PathFindOnPath. It has the added bonus of being able to specify additional directories, in case you want to search in additional locations apart from just the system or current user path.
On Windows I'd say use %WINDIR%\system32\where.exe
Your question's title doesn't specify Windows, so I imagine some folks might find this question looking for the same thing with a POSIX OS in mind (like myself).
This php snippet might help them:
<?php
function Find( $file )
{
    foreach ( explode( ':', $_ENV['PATH'] ) as $dir )
    {
        $command = sprintf( 'find -L %s -name "%s" -print', $dir, $file );
        $output = array();
        $result = -1;
        exec( $command, $output, $result );
        if ( count( $output ) == 1 )
        {
            return $output[0];
        }
    }
    return null;
}
?>
This is slightly altered production code I'm running on several servers (i.e. taken out of OO context, with some sanitization and error checking left out for brevity).
Using PowerShell on Windows...
Function Get-ENVPathFolders {
    #.Synopsis Split $env:Path into an array
    #.Notes
    # - Handles 1) folders ending in a backslash 2) double-quoted folders 3) folders with semicolons 4) folders with spaces 5) double-semicolons i.e. blanks
    # - Example path: 'C:\WINDOWS\;"C:\Path with semicolon; in the middle";"E:\Path with semicolon at the end;";;C:\Program Files;'
    # - 2018/01/30 by Chad@ChadsTech.net - Created
    $NewPath = @()
    # remove a trailing semicolon from the path, then split it into an array using a double-quote as the delimiter, keeping the delimiter
    $env:Path.ToString().TrimEnd(';') -split '(?=["])' | ForEach-Object {
        If ($_ -eq '";') {
            # throw away a blank line
        } ElseIf ($_.ToString().StartsWith('";')) {
            # if line starts with "; remove the "; and any trailing backslash
            $NewPath += ($_.ToString().TrimStart('";')).TrimEnd('\')
        } ElseIf ($_.ToString().StartsWith('"')) {
            # if line starts with " remove the " and any trailing backslash
            $NewPath += ($_.ToString().TrimStart('"')).TrimEnd('\')
        } Else {
            # split by semicolon and remove any trailing backslash
            $_.ToString().Split(';') | ForEach-Object {
                If ($_.Length -gt 0) { $NewPath += $_.TrimEnd('\') }
            }
        }
    }
    Return $NewPath
}

$myFile = 'desktop.ini'
Get-ENVPathFolders | ForEach-Object { If (Test-Path -Path "$_\$myFile") { Write-Output "Found [$_\$myFile]" } }
I also blogged the answer with some details over at http://blogs.catapultsystems.com/chsimmons/archive/2018/01/30/parse-envpath-with-powershell
In addition to the 'where' (MS Windows) and 'which' (Unix/Linux) utilities, I have written my own utility which I call 'findinpath'. In addition to finding the executable that would be executed if handed to the command line interpreter (CLI), it finds all matches, returned in path-search order, so you can track down path-order problems. In addition, my utility returns not just executables but any file-specification match, to catch those times when a desired file isn't actually executable.
I also added a feature that has turned out to be very nifty; the -s flag tells it to search not just the system path, but everything on the system disk, known user-directories excluded. I have found this feature to be incredibly useful in systems administration tasks...
Here's the 'usage' output:
usage: findinpath [ -p <path> | -path <path> ] | [ -s | -system ] <file>
or findinpath [ -h | -help ]
where: <file> may be any file spec, including wild cards
-h or -help returns this text
-p or -path uses the specified path instead of the PATH environment variable.
-s or -system searches the system disk, skipping /d /l/ /nfs and /users
Writing such a utility is not hard; I'll leave it as an exercise for the reader. Or, if asked here, I'll post my script - it's in 'bash'.
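
In the same spirit, a bare-bones sketch of the core idea (every match, in path-search order) fits in a few lines of Perl; this assumes a POSIX-style ':' separator and leaves out the -p and -s options:

#!/usr/bin/perl
# Sketch only: print every match for a file spec in each PATH entry,
# in search order. Wildcards go through glob(); PATH entries that
# contain spaces would need File::Glob's bsd_glob instead.
use strict;
use warnings;

my $spec = shift or die "usage: $0 <file-spec>\n";
for my $dir ( split /:/, $ENV{PATH} ) {
    next unless length $dir and -d $dir;
    print "$_\n" for glob "$dir/$spec";
}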
Just for kicks, here's a one-liner PowerShell implementation:
function PSwhere($file) { $env:Path.Split(";") | ? { test-path $_\$file* } }
