Change file names in Windows folders

Hi, I am trying to change the file names in some of my folders on a Windows machine.
I have a bunch of files whose names start with a capital letter, for example "Hello.html", but I want to change that to "hello.html". Since there are thousands of files, I cannot go and change them manually. I am looking for a script and just need some help getting started.
I also have access to a Linux machine, so I can copy the files over there and run any scripts. I would really appreciate it if someone could guide me to get started in either the Linux or Windows environment.

On some Linux systems you can use the rename command, which accepts regular expressions. Try the following:
rename 's/^([A-Z])/\l$1/' *
This should replace an uppercase character at the beginning of each name with its lowercase equivalent.
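If your rename is the Perl-based one (sometimes installed as prename), it also accepts a -n flag for a dry run, so you can preview the renames before committing (a hedged suggestion; check rename --help on your system):
rename -n 's/^([A-Z])/\l$1/' *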
Otherwise, if your Linux system does not provide such a command, you can write your own little Perl script:
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;

my @files = `ls`;                 # list the current directory
foreach (@files) {
    chomp($_);
    if ($_ =~ m/^[A-Z]/) {        # only names starting with an uppercase letter
        my $newname = $_;
        $newname =~ s/^([A-Z])/\l$1/;   # lowercase the first character
        move($_, $newname);
    }
}
exit 0;
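Since the files live on a Windows machine, here is a sketch of a variant that avoids shelling out to ls and uses Perl's built-in readdir and rename instead; the directory path is a placeholder, and note that on a case-insensitive filesystem a case-only rename may need to go through a temporary name first:
#!/usr/bin/perl
use strict;
use warnings;

my $dir = 'C:/my_folder';    # hypothetical path; point this at your folder

opendir(my $dh, $dir) or die "Cannot open $dir: $!";
foreach my $name (readdir $dh) {
    next unless $name =~ m/^[A-Z]/;            # only names starting with an uppercase letter
    (my $newname = $name) =~ s/^([A-Z])/\l$1/; # lowercase the first character
    rename("$dir/$name", "$dir/$newname")
        or warn "Could not rename $name: $!";
}
closedir $dh;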

A very easy-to-use option is ReNamer.
Once installed, simply add the files to be renamed and add a case rule to change them to lower case, or a regex rule for advanced cases.

Related

Perl code doesn't run in a bash script scheduled with crontab

I want to schedule my Perl code to run every day at a specific time, so I put the below code in a bash file:
Automate.sh
#!/bin/sh
perl /tmp/Taps/perl.pl
The schedule is specified in the crontab as follows:
10 17 * * * sh /tmp/Taps/Automate.sh > /tmp/Taps/result.log
When the time reaches 17:10, the .sh file doesn't run. However, when I run ./Automate.sh manually, it runs and I see the result. I don't know what the problem is.
Perl Code
#!/usr/bin/perl -w
use strict;
use warnings;
use Data::Dumper;
use XML::Dumper;
use TAP3::Tap3edit;
$Data::Dumper::Indent=1;
$Data::Dumper::Useqq=1;
my $dump = new XML::Dumper;
use File::Basename;
my $perl='';
my $xml='';
my $tap3 = TAP3::Tap3edit->new();
foreach my $file (glob '/tmp/Taps/X*')
{
    $files = basename($file);
    $tap3->decode($files) || die $tap3->error;
}
my $filename=$files.".xml\n";
$perl = $tap3->structure;
$dump->pl2xml($perl, $filename);
print "Done \n";
error:
No such file or directory for file X94 at /tmp/Taps/perl.pl line 22.
X94.xml
foreach my $file(glob 'Taps/X*') -- when you're running from cron, your current directory is /. You'll want to provide the full path to that Taps directory. Also specify the output directory for Out.xml
Cron uses a minimal environment and a short $PATH, which may not necessarily include the expected path to perl. Try specifying this path fully. Or source your shell settings before running the script.
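For example, the crontab entry could set PATH explicitly and redirect errors into the log as well (a sketch; adjust PATH to wherever which perl points on your machine):
PATH=/usr/local/bin:/usr/bin:/bin
10 17 * * * /bin/sh /tmp/Taps/Automate.sh > /tmp/Taps/result.log 2>&1
The 2>&1 also captures error messages, which makes cron failures much easier to diagnose.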
There are a lot of things that can go wrong here. The most obvious and certain one is that if you use a glob to find the file in directory "Taps", then remove the directory from the file name by using basename, then Perl cannot find the file. Not quite sure what you are trying to achieve there. The file names from the glob will be for example Taps/Xfoo, a relative path to the working directory. If you try to access Xfoo from the working directory, that file will not be found (or the wrong file will be found).
This should also (probably) lead to a fatal error, which should be reported in your error log. (Assuming that the decode function returns a false value upon error, which is not certain.) If no errors are reported in your error log, that is a sign the program does not run at all. Or it could be that decode does not return false on missing file, and the file is considered to be empty.
I assume that when you test the program, you cd to /tmp and run it, or your "Taps" directory is in your home directory. So you are making assumptions about where your program looks for the files. You should be certain where it looks for files, probably by using only absolute paths.
Another simple error might be that crontab does not have permission to execute the file, or no read access to "Taps".
Edit:
Other complications in your code:
You include Data::Dumper, but never actually use that module.
The $xml variable is not used.
The $files variable is not declared (this code would never run under use strict).
You use $files outside your foreach loop, which means the XML conversion only runs once. Since you use glob, I assumed you were reading more than one file, in which case this solution will probably not do what you want. It is also possible that you are using a glob because the file name can change, e.g. X93, X94, etc. In that case you will only process the last file name returned by the glob. This looks like a weak link in your logic.
You add a newline \n to a file name, which is strange.
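Putting those fixes together, a corrected sketch might look like this (an assumption on my part that you want one .xml output per input file; the module calls are the ones already in your code):
#!/usr/bin/perl
use strict;
use warnings;
use XML::Dumper;
use TAP3::Tap3edit;

my $dump = XML::Dumper->new();

foreach my $file ( glob '/tmp/Taps/X*' ) {
    my $tap3 = TAP3::Tap3edit->new();
    $tap3->decode($file) or die $tap3->error;       # full path, no basename
    $dump->pl2xml( $tap3->structure, "$file.xml" ); # absolute output path next to the input
}
print "Done\n";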

In Perl Script Not able to add a directory to Archive

I have to add a Windows directory to an archive using a Perl script. After executing the below script, only the directory name is archived; the directory content under "C:\Software\Postgres" is not included in the archive. Can you please help me find where I am making a mistake in the script below?
use warnings;
use strict;
use Archive::Tar;

tar_BKUP();

sub tar_BKUP {
    my $src_D   = 'C:\Software\Postgres';
    my $dst_Tar = 'C:\temp\Postgres.tar';
    my $tar     = Archive::Tar->new();
    $tar->add_files($src_D);
    $tar->write($dst_Tar);
}
Here is an example using File::Find::Rule:
use warnings;
use strict;
use Archive::Tar;
use File::Find::Rule;

tar_BKUP();

sub tar_BKUP {
    my $src_D   = 'C:/Software/Postgres';
    my $dst_Tar = 'C:/temp/Postgres.tar';
    my $tar     = Archive::Tar->new();
    my @files   = File::Find::Rule->file->in($src_D);  # recurse into subdirectories
    $tar->add_files(@files);
    $tar->write($dst_Tar);
}
See also How can I archive a directory in Perl like tar does in UNIX?
The documentation for the add_files() method says this:
$tar->add_files( @filenamelist )
Takes a list of filenames and adds them to the in-memory archive.
So you pass it a list of filenames and those files get added to the archive. It looks like you think you can pass it a directory and get all of the files in that directory added in one go. But it's not documented to work like that.
If you know there are no subdirectories below your source directory, then you could do something like this:
$tar->add_files( glob( "$src_D/*" ) );
But if you need to include the contents of subdirectories, then Håkon's answer using File::Find::Rule is a good approach.
If a Perl module isn't working how you expect it to, then checking the documentation is always a good first step :-)

Detecting that files are being copied in a folder

I am running a script which copies one folder from a specific location if it does not exist (or is not consistent). The problem appears when I run the script two or more times concurrently: as the first script is trying to copy the files, the second comes along and tries the same thing, resulting in a mess. How could I avoid this situation? Something like a system-wide mutex.
I tried a simple test with -w: I manually copied the folder, and while it was copying I ran the script:
use strict;
use warnings;
my $filename = 'd:\\folder_to_copy';
if (-w $filename) {
    print "i can write to the file\n";
} else {
    print "yikes, i can't write to the file!\n";
}
Of course this won't work, because I still have write access to that folder.
Any idea how I could check whether the folder is being copied, in Perl or using batch commands?
Sounds like a job for a lock file. There are myriad CPAN modules that implement lock files, but most of them don't work on Windows. Here are a few that seem to support Windows according to CPAN Testers:
File::Lockfile
File::TinyLock
File::Flock::Tiny
After having a quick view at the source code, the only module I can recommend is File::Flock::Tiny. The others seem racy.
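A minimal sketch of the File::Flock::Tiny route, assuming its trylock interface (which, as I read the docs, returns a lock object on success and undef if another process already holds the lock):
use strict;
use warnings;
use File::Flock::Tiny;

# hypothetical lock file path; any writable location will do
my $lock = File::Flock::Tiny->trylock('C:/temp/copy-folder.lock');
if ($lock) {
    # ... copy the folder here ...
    $lock->release;
} else {
    print "another instance is already running.\n";
}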
If you need a systemwide mutex, then one "trick" is to (ab)use a directory. The command mkdir is usually atomic and either works or doesn't (if the directory already exists).
Change your script as follows:
my $mutex_dir = '/tmp/my-mutex-dir';
if ( mkdir $mutex_dir ) {
    # run your copy-code here
    # when finished:
    rmdir $mutex_dir;
} else {
    print "another instance is already running.\n";
}
The only thing you need to make sure is that you're allowed to create a directory in /tmp (or wherever).
Note that I intentionally do NOT firstly test for the existence of $mutex_dir because between the if (not -d $mutex_dir) and the mkdir someone else could create the directory and the mkdir would fail anyway. So simply call mkdir. If it worked then you can do your stuff. Don't forget to remove the $mutex_dir after you're done.
That's also the downside of this approach: If your copy-code crashes and the script prematurely dies then the directory isn't deleted. Presumably the lock file mechanism suggested in nwellnhof's answer behaves better in that case and automatically unlocks the file.
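One way to soften that downside is to run the copy inside an eval so the directory is removed even when the copy dies (a sketch building on the code above; it still won't help if the process is killed outright):
my $mutex_dir = '/tmp/my-mutex-dir';
if ( mkdir $mutex_dir ) {
    eval {
        # run your copy-code here
    };
    my $err = $@;        # remember any failure
    rmdir $mutex_dir;    # always remove the lock directory
    die $err if $err;    # re-raise after cleanup
} else {
    print "another instance is already running.\n";
}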
As the first script is trying to copy the files, the second comes and
tries the same thing resulting in a mess
The simplest approach would be to create a file which contains 1 while another instance of the script is running. Then you can add a conditional based on that.
my $data;
{ local $/; open my $fh, "<", 'flag' or die $!; $data = <$fh> // 0; }
die "another instance of script is running" if $data == 1;
Another approach would be to set an environment variable within the script and check it in BEGIN block.
You can use Windows mutex or semaphore objects via the Win32::IPC distribution:
http://search.cpan.org/~cjm/Win32-IPC-1.11/
use Win32::Mutex;
use Digest::MD5 qw(md5_hex);

my $mutex = Win32::Mutex->new(0, md5_hex $filename);
if ($mutex) {
    do_your_job();
    $mutex->release;
} else {
    # fail...
}

Perl: Check if string is valid directory, case SENSITIVE

So I have run into an issue. The -d switch will check whether a directory exists just fine. However, I need it to be case sensitive. If I have a directory Users and I do -d "UsErS", it will return true. I need it to return true only if the case matches.
Any help is much appreciated.
Code:
if (-d $cmdLine[1]) {
    chdir $cmdLine[1];
    print "CD: Successfully changed directory.\n";
} else {
    print "CD: Error: $cmdLine[1] is not a valid directory.\n";
}
The only definitive source for the file name is the filesystem itself. This snippet lists the entries in the parent of the target directory and verifies that the name specified matches exactly one of those entries. I tested it from Linux on a remote NTFS share (mounted with CIFS).
use strict;
use warnings;
use File::Basename;

my $target = shift;
my ($base, $parent) = fileparse($target);

opendir(my $PARENT, $parent)
    or die("Error opening '$parent': $!");
my %entries = map { $_ => 1 } readdir($PARENT);  # exact names as stored on disk
closedir($PARENT);

if (-d $target && exists($entries{$base})) {
    print("'$target' exists (and correct case)\n");
} else {
    print("'$target' does not exist.\n");
}
I can't conceive of how you could experience this problem outside of a case-insensitive filesystem (e.g. NTFS, (V)FAT, others?), and the problem with them (at least when Windows is the OS managing it) is that you cannot necessarily guarantee that the case of the filesystem entry is what you want it to be. For example, try to rename an NTFS file changing only the case. In Windows, the file name doesn't get changed. You'd have to change it to something different entirely, then change it back to the old name with the correct case. There are (or were) configurable Windows settings that do special things if the file name is all uppercase.
Another thing to consider is that if the filesystem is case-insensitive, then there's no possibility that there could be two entries in the same directory that differed only by case. I just don't understand what useful contingency this check would account for.
Check the Win32 module:
use Win32;
if (-d $cmdLine[1] and $cmdLine[1] eq Win32::GetLongPathName($cmdLine[1])) { .. }
You may also need use File::Spec::Functions 'canonpath'; if you want to normalize directory separators (/ into \ on win32)
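For example, normalizing the separators first and then comparing against what the filesystem reports (a sketch; canonpath is the documented File::Spec helper, GetLongPathName comes from Win32 as above):
use Win32;
use File::Spec::Functions 'canonpath';

my $dir = canonpath( $cmdLine[1] );   # turns / into \ on Win32
if ( -d $dir and $dir eq Win32::GetLongPathName($dir) ) {
    # directory exists and the case matches exactly
}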
What TLP is suggesting is to combine both -d and eq. Something like:
if (-d $dirname && $dirname eq "Users") {
    ....
}
BTW, -d alone is working fine for me; it's case sensitive. Try the code below and change $dir to any directory that exists on your system.
#!/usr/bin/perl
use strict;
use warnings;

my $dir = "/pathto/code";
if (-d $dir) { print "DIR: $dir"; }
else         { print "$dir is not a directory"; }

Best way to search for a string in a file on a network drive

Here is my problem: we have a file server (Windows 2003) that people keep putting forms on that contain PII. Policy now says the last four digits of a person's SSN are no longer allowed on any forms on our file servers. I'm trying to figure out a script to scan documents for a string such as "SSN" or "Last Four", and all I can find are instructions/examples on how to search text files on a local machine. I've seen PowerShell scripts that do this, but (don't ask why) PowerShell scripting is disabled on our servers.
Is this possible? I've been reading heavily through multiple Perl books hoping for a clue to get me in the right direction and have had zero luck.
Assuming you get access to the files eventually, here's how you can go about searching a directory of files, looking for a string match.
use strict;
use warnings;
use File::Find;

our $CHECK_FILE_EXTENSION = qr/\.txt$/;

File::Find::find({ wanted => \&find_ssn, no_chdir => 1 }, $_) for @ARGV;
exit;

sub find_ssn
{
    ## File::Find sets $File::Find::name to the full path of the file,
    ## which is the correct path for an 'open' call when 'no_chdir' is used
    return unless $File::Find::name =~ $CHECK_FILE_EXTENSION;
    open my $fh, '<', $File::Find::name
        or die "Can't read file, $File::Find::name, $!\n";
    while (<$fh>)
    {
        if (/SSN/)
        {
            ## file has 'SSN' in it, do your work here
        }
    }
    close $fh;
}
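You could then point it at one or more top-level directories to scan, e.g. (the script name and UNC path here are hypothetical):
perl find_ssn.pl //fileserver/forms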
Aside from I/O speed, there's no real difference between accessing a file remotely and locally. It's just a file descriptor.
C:\>perl -MFile::Slurp -E "my $dir = q|//SERVER/Share/Test|; for my $file (read_dir($dir)) { say qq|$file: |, (read_file(qq|$dir/$file|) =~ /foo/) ? q|match| : q|not match| }"
bar.txt: not match
foo.txt: match
