Delete files from 00012 to 00441 using Terminal - bash

This is my Bash version:
3.2.57(1)-release
I found that this MIGHT be possible using rm in the Terminal, but it doesn't work for me.
(Screenshot of the first files in the directory omitted.)
Typing rm _{00012..00441} I get this error:
(Terminal error screenshot omitted.)
It seems the shell is unable to handle the leading zeros, since it tries to find files named 12, 13, 14, 15, ... instead of 00012, 00013, 00014, 00015.
To make things more difficult, the last file in the range has a different number of leading zeros, so using rm _000{ won't work either.
I'm also trying to use AppleScript to run this as:
do shell script ("rm _{" & STARTrange & ".." & ENDrange & "}.psd")

Try this command.
find ./ -name "_0*.psd" -type f -delete;
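Note that the _0*.psd glob matches every file whose name starts with _0, not just 00012 through 00441, so it may be worth previewing the matches before deleting anything, for example:
find ./ -name "_0*.psd" -type f -print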

Thanks to Cyrus I made this work. The answer on OS X is to install the latest version of Bash using Homebrew.
Using this guide I was able to change the default Bash in Terminal to the one Homebrew installs: https://apple.stackexchange.com/questions/193411/update-bash-to-version-4-0-on-osx
Thanks

For versions of bash older than 4.0, either of these should work:
rm _000{12..99}.psd _00{100..441}.psd
rm $(printf '_%.5i.psd ' {12..441})
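For reference, the printf form works even on bash 3.2 because printf does the zero-padding itself (the brace expansion only supplies plain numbers). With a few sample numbers it expands like this:
$ printf '_%.5i.psd ' 12 13 441
_00012.psd _00013.psd _00441.psd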


Cygwin execution of .sh file can't find grep command?

So I was trying to create a little .sh script for my work and ran into one little problem.
My Cygwin terminal (x64) runs just fine and I use it often enough to do manual greps.
In the beginning I had some issues with this command, but now it works fine in the Cygwin terminal.
Once I wrote my little script and tried to run it, the only output I get is "line 6: grep: command not found".
My execution method is:
Open cygwin terminal
cd to script location
type in ./script.sh
enter :)
Anyone know how to fix that? I already added the Cygwin bin folder to my system path (Win 10, btw) but that didn't help. I looked around for a while but haven't found anything useful, mostly issues with grep itself.
my script for reference:
mkdir -p output
PATH=$PWD"/output"
while IFS=";" read -r component location global
do
cd $location
grep -iRl $global --exclude-dir={wrongdir1,wrongdir2} > $PATH"/"$component".txt"
done < input.csv
You're overwriting your Cygwin system PATH: PATH=$PWD"/output". Once PATH no longer contains Cygwin's bin directory, the shell can't find grep anymore. Use a different variable name instead of PATH.
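A minimal corrected sketch of the script above, assuming the same input.csv format; OUT_DIR is just an illustrative replacement name, and the cd runs in a subshell so each iteration starts from the original directory:
#!/bin/bash
mkdir -p output
OUT_DIR="$PWD/output"   # any name other than PATH works
while IFS=";" read -r component location global
do
    (
        cd "$location" || exit
        grep -iRl "$global" --exclude-dir={wrongdir1,wrongdir2} . > "$OUT_DIR/$component.txt"
    )
done < input.csv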

Copy command with `cp -f -R` strangely does not work on all OS X machines

I am on OS X El Capitan. I have 2 directories SourceDir & DestDir with the following structures.
ParentDir/SourceDir
  |-s_dir1/ss_dir1
  |-s_dir2

ParentDir/DestDir
  |-s_dir1/ss_dir2
  |-ddir1
  |-ddir2
  |-ddir3
I want to do a copy command in such a way that s_dir1 in ParentDir/SourceDir gets merged with s_dir1 in ParentDir/DestDir and additionally, s_dir2 gets placed into ParentDir/DestDir. So, after the copy ParentDir/DestDir should look like this:
ParentDir/DestDir
  |-s_dir1/ss_dir1
  |-s_dir1/ss_dir2
  |-s_dir2
  |-ddir1
  |-ddir2
  |-ddir3
I use the following command to copy:
cp -f -R ParentDir/SourceDir/ ParentDir/DestDir/
It works fine on macOS Sierra, but strangely it doesn't work on my OS X El Capitan machine. Yet when I tried it on my colleague's El Capitan machine, it worked fine!
What is wrong?
Do different versions of MacOS El Capitan behave differently to cp command?
Or, do I need to change the copy command syntax on El Capitan?
How can I do a copy command on my MacOS El Capitan machine to ensure the correct recursive copy at least on all OSX machines?
Please do not suggest upgrading the El Capitan machine to Sierra; that is not an option for me. Hence I posted this question to get some other syntax options for the copy command.
It appears that the standard cp command is being overridden by something else. You can force it to use the standard cp command by explicitly specifying /bin/cp. You could also try to override the current override with something like an alias, but it'd really be better to find out what the current override is, where it's coming from, and (maybe) get rid of it.
The first thing to do is run type cp; its output tells you what cp currently resolves to:
If it prints something like "cp is /usr/local/bin/cp", then you've got a custom-installed version of cp that's overriding the standard one... and is apparently causing problems. Your best option here is to find out what installed the nonstandard cp command (maybe Homebrew?) and remove it.
If it prints something like "cp is aliased to <some command string>", then you have an alias defined in one of your shell initialization files (or one of the files they run, etc). Check your ~/.profile, ~/.bash_profile, ~/.bash_login, and ~/.bashrc for the source of the alias definition.
If it prints "cp is a function", then you have a function defined in one of your shell init files. Check as you would for an alias definition.
The trick for me was having to run the cp command recursively as the super user. Not sure why:
sudo /bin/cp -R -v -p source destination
The -v shows you the progress as each file is copied.
Cheers
Greg

Windows version of the Unix touch command to modify a file's modification date from part of the filename

I have this bash script that loops through the files in the current directory, extracts the date part from each filename, and then uses the (Unix) touch command to modify (or update) that file's modification date (mtime) to this date.
Filename example
Electric company name - bill - 2014-03-22.pdf
Bash script:
I save this bash script as _save_file_mtime_from_filename.sh (and make it executable with chmod +x), put it in the directory where I'd like to change the files' modification times, and then run it from the command line.
#!/bin/bash
CURRENT_DIR=$(dirname "$_")
cd "$CURRENT_DIR"
for f in *
do
# Strip the file extension
d=${f%.*}
# Strip the last 10 characters
d=${d:${#d}-10}
# Check the format / mask
if [[ $d = ????-??-?? ]] ; then
# Strip the dash
d=${d//-}
# Run `touch` on the file with the extracted date format
# and add `0000` to the file date
touch -t ${d}0000 "$f"
fi
done
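For clarity, here is roughly what those expansions produce for the sample filename above (just an illustration, not part of the script):
f='Electric company name - bill - 2014-03-22.pdf'
d=${f%.*}          # 'Electric company name - bill - 2014-03-22'
d=${d:${#d}-10}    # '2014-03-22'
d=${d//-}          # '20140322'
touch -t ${d}0000 "$f"   # sets the mtime to 2014-03-22 00:00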
Now I'm searching for a Windows version of this script.
I've searched the net (and Stack Overflow) and found some related questions like https://stackoverflow.com/questions/51435/windows-version-of-the-unix-touch-command and https://superuser.com/questions/251470/windows-recursive-touch-command/251507#251507
Does anyone have any idea for a Windows version using a _save_file_mtime_from_filename.bat executable version that does essentially the same thing?
With a little tweaking and the help of a (Mac) Automator action saved as an 'application', you can even trigger this script from the right mouse button in the Mac Finder (https://discussions.apple.com/thread/5287944?start=15&tstart=0). Sweet!
Install Cygwin, or install binaries from GnuWin32 or UnixUtils
I agree with #konsolebox. Just install Cygwin, and you can run your script with no changes.
However, if you don't want to go that route, you should be able to install "coreutils" from GnuWin32. Coreutils contains "touch", among other things. Writing a DOS batch file to emulate what you've written above might be a little painful, though. The part that looks tricky to me is [[ $d = ????-??-?? ]]. You'll have to do something creative to match that, perhaps using grep, which you can get from the same page. Now you're installing a couple of things and doing a lot of work, though. Installing Cygwin would be easier.
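For reference, the pattern test itself can be handed to grep along these lines (shell syntax shown for brevity; a batch file would pipe the date string through the same grep call):
if echo "$d" | grep -qE '^[0-9]{4}-[0-9]{2}-[0-9]{2}$'; then
    touch -t "${d//-/}0000" "$f"
fi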

ZSH complains about RVM __rvm_cleanse_variables: function definition file not found

When using the latest ZSH and RVM on Mac OS X 10.7.4, ZSH complains about this:
__rvm_cleanse_variables: function definition file not found
Running the following solved the problem:
rm ~/.zcompdump*
Note: The * is in case there are multiple .zcompdump files.
Sometimes there is also a ~/.zcompdump-<COMPUTER NAME>-<VERSION> file, so use:
rm -f ~/.zcompdump*
To disable the .zcompdump* file(s), you could look in your .zshrc (or /etc/zsh/* files) for compinit and add the -D flag.
This might be better than creating the files and deleting them at every login.
(source: http://www.csse.uwa.edu.au/programming/linux/zsh-doc/zsh_23.html)
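A minimal sketch of what that looks like in ~/.zshrc, assuming compinit is already being loaded there:
autoload -Uz compinit
compinit -D    # -D turns off dumping, so no .zcompdump file is written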
My problem persisted even after the compinit -D and rm -f ~/.zcompdump*. I found this github issue and checked my .zplug directory and sure enough found some non-hidden zcompdump files (no preceding '.'). Deleted those and I was good to go. If you're using a zsh plugin manager like zgen or zplug, check their directories.
Add rm -f ~/.zcompdump{,.zwc} to your .zlogin to automate it.

How can I make the "find" Command on OS X default to the current directory?

I am a heavy command line user and use the find command extensively in my build system scripts. However on Mac OS X when I am not concentrating I often get output like this:
$ find -name \*.plist
find: illegal option -- n
find: illegal option -- a
find: illegal option -- m
find: illegal option -- e
find: *.plist: No such file or directory
Basically, I forgot to add the little dot:
$ find . -name \*.plist
This is because BSD find requires the path while GNU find doesn't (it assumes the current directory if you don't specify one). I use Linux, Mac OS X and Cygwin, often all at the same time, so it's of great benefit to me to have all my tools behave the same way. I tried writing a bash find function that added "./" if I forgot, but I failed. Thanks for your help. :)
Install GNU find instead.
$ brew install findutils
$ alias find=gfind
Yay, it works!
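To make that stick across new shells, the alias can go in your bash startup file, e.g.:
# in ~/.bash_profile (or ~/.bashrc)
alias find=gfind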
If you can't discipline yourself to use find 'correctly', then why not install GNU find (from findutils) in a directory on your PATH ahead of the system find command?
I used to have my own private variant of cp that would copy files to the current directory if the last item in the list was not a directory. I kept that in my personal bin directory for many years - but eventually removed it because I no longer used the functionality. (My 'cp.sh' was written in 1987 and edited twice, in 1990 and 1997, as part of changes to version control system notations. I think I removed it around 1998. The primary problem with the script is that cp file1 file2 is ambiguous between copying a file over another and copying two files to the current directory.)
Consider writing your own wrapper to find:
#!/bin/sh
[ ! -d "$1" ] && set -- . "$#"
exec /usr/bin/find "$#"
The second line says "if argument 1 is not a directory, then adjust the command line arguments to include dot ahead of the rest of the command". That will be confusing if you ever type:
~/bin/find /non-existent/directory -name '*.plist' -print
because the non-existent directory isn't a directory and the script will add dot to the command line -- the sort of reason that I stopped using my private cp command.
If you must call it 'find', then you want:
alias find=/usr/bin/find\ .
in your .profile or .bash_profile or …. Substitute the real path (if not /usr/bin/find) on your Mac OS X. Use the full path to avoid cycles (bash would normally handle alias find=find without looping, but better to be sure).
But you had better not name the alias find (use findl, myfind, etc.), because it will become a habit and cause you trouble when you try it on another system.
find ./ -name "*.plist"
Edit: hmm, I may have misunderstood the question! If you were crazy, how about emulating it via a shell script? I routinely keep random utility scripts in ~/.bin, and that's the first thing in my PATH. If you had a similar setup, perhaps you could do something like this (untested!):
#!/bin/sh
# remapping find!
CMD=`echo "$1" | cut -c 1`
if [ "$CMD" = '-' ]
then
    # pwd search
    /usr/bin/find ./ "$@"
else
    # regular find
    /usr/bin/find "$@"
fi
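Saved as, say, ~/.bin/find (ahead of /usr/bin in your PATH), it would behave roughly like this:
find -name '*.plist'        # first argument starts with '-', so it searches ./
find /etc -name '*.plist'   # anything else is passed straight to /usr/bin/find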
I would suggest that if you're writing scripts (which are more likely to be migrated from one system to another sometime in the future), you should try to use the more specific form of the command, that is, specifying the "." instead of relying on a default. For the same reason, I might even suggest writing sh scripts instead of relying on bash, which might not be installed everywhere.
This is probably not what you want, but how about: alias find="find ."
or choose a new name (findl for find local?)
You may want to run the commands found in this link: https://www.topbug.net/blog/2013/04/14/install-and-use-gnu-command-line-tools-in-mac-os-x/
It is a bit outdated; for example, I found I did not have to add many commands to my path at all.
This covers your problem by having your system use the non-BSD (GNU) find utility from the findutils package, while also installing other tools you may want as well.
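If you go that route with Homebrew, the findutils formula also ships a gnubin directory; putting it ahead of /usr/bin on your PATH gives you GNU find under its usual name (path shown for the default /usr/local prefix; adjust if yours differs):
export PATH="/usr/local/opt/findutils/libexec/gnubin:$PATH"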
