Why is my bash randomly selecting files and renaming? - macos

On my Mac I am writing a terminal script to rename some files in a folder with an incrementing number. The files have a number at the end and, read top to bottom, they are in order:
foobar101.png
foobar107.png
foobar115.png
foobar121.png
foobar127.png
foobar133.png
foobar141.png
foobar145.png
foobar151.png
foobar155.png
When I create and run my loop it works:
DIR="/customlocation/on/mac"
add=1;
for thefile in $(find $DIR -name "*.png" ); do
    cd $DIR
    mv -v "${thefile}" foobar"${add}".png
    ((add++))
done
However, when it runs, the increment is not as expected:
foobar101.png -> need foobar1.png but is foobar10.png
foobar107.png -> need foobar2.png but is foobar3.png
foobar115.png -> need foobar3.png but is foobar4.png
foobar121.png -> need foobar4.png but is foobar2.png
foobar127.png -> need foobar5.png but is foobar9.png
foobar133.png -> need foobar6.png but is foobar6.png
foobar141.png -> need foobar7.png but is foobar1.png
foobar145.png -> need foobar8.png but is foobar5.png
foobar151.png -> need foobar9.png but is foobar8.png
foobar155.png -> need foobar10.png but is foobar7.png
I've tried searching on SO, Linux/Unix, Ask Ubuntu, and SuperUser, but I don't see any questions that address controlling the increment, and I don't know what in particular I should be looking for. So how can I make the increment run from the lowest number/filename upward, instead of the Mac seemingly renaming in a random order, so that I get the desired output?
EDIT:
After a comment from Etan I looked into the numerical values at the end: some of the files are actually named foobarXXXX (four digits), and that is the issue. The answer below, while awesome and a new approach I will look into, still produces the same outcome because of those files. If I remove all the foobarXXXX files and leave only foobarXXX files, both my code and the code in fedorqui's answer work. Is there a way I can handle this inside the loop, or do I have to go over all the names, test the length of the numeric part, and adjust accordingly?

You cannot rely on the order of a find command; it returns entries in whatever order the VFS hands them to it.
You may, instead, want to sort it:
DIR="/customlocation/on/mac"
add=1;
while IFS= read -r thefile; do
    cd $DIR
    mv -v "${thefile}" foobar"${add}".png
    ((add++))
done < <(find $DIR -name "*.png" | sort)
#-------^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Note this uses process substitution, which feeds the while loop:
Process substitution is a form of redirection where the input or
output of a process (some sequence of commands) appear as a temporary
file.
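Regarding the edit: a plain lexical sort still misorders mixed-width numbers (foobar1015.png sorts before foobar107.png), so one option is to sort on the trailing number itself. Here is a rough sketch of that idea, building on the answer above rather than quoting it; the decorate/sort/strip pipeline is my own assumption:
DIR="/customlocation/on/mac"
add=1
cd "$DIR" || exit 1
while IFS= read -r thefile; do
    mv -v "$thefile" "foobar${add}.png"
    ((add++))
done < <(
    # decorate each path with its trailing number, sort numerically,
    # then strip the decoration, so foobar107.png precedes foobar1015.png
    find . -name 'foobar*.png' \
        | sed 's/.*foobar\([0-9][0-9]*\)\.png$/\1 &/' \
        | sort -n \
        | cut -d' ' -f2-
)
Because sort has to read all of its input before it can emit anything, every mv runs only after find has finished walking the directory, so no three- or four-digit special-casing is needed inside the loop.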

Related

Print all ongoing targets in Makefile

I have written a makefile which has pretty complicated dependencies and executes with multiple jobs in parallel (make -j100, for example). I am trying to find a way to print the names of all currently running targets. Any ideas? Thanks in advance.
If what you want is a kind of command that you can run from time to time while make is running, and that shows all currently executing recipes, you could slightly modify your recipes such that they first create a temporary file with the name of the target, do whatever they are supposed to do and delete the temporary file. Listing these temporary files anytime will then show you the currently executing recipes.
Example if all targets are located under the directory from which make is called (or sub-directories of it):
TAGSDIR := .tags
MKTAG = mkdir -p "$(TAGSDIR)/$(@D)" && touch "$(TAGSDIR)/$@"
RMTAG = rm -f "$(TAGSDIR)/$@"
<target>: <prerequisites>
	@$(MKTAG)
	<regular recipe>
	@$(RMTAG)
And list all files under .tags to get the names of all currently running recipes. Example with find:
find .tags -type f -printf '%P\n'
You could even encapsulate this in an infinite loop and refresh the list e.g. every second:
while true; do clear; find .tags -type f -printf '%P\n'; sleep 1; done
EDIT
Andreas noticed that this works only if the targets are all located under the directory from which make is called. If a target is ../foobar, for instance, the temporary tag file would be .tags/../foobar, which is not what we want.
Andreas suggests substituting .. with \.\. and / with \/. We could maybe find a way to do something like this under GNU/Linux and macOS (though not exactly that, since you cannot have a slash in a file name), but there could still be other issues under Windows (C:, backslashes...).
We could also store the name of the target in a text file and use mktemp or an equivalent to generate the text file with a unique name. But we would then need a way to propagate this unique name from MKTAG to RMTAG. This is doable with a shell variable and a one-line recipe (or the .ONESHELL special target) but not very nice.
As you use GNU make, we could also use abspath and create temporary files named $(TAGSDIR)/$(abspath $@), but I do not know what abspath does under Windows with drive letters, nor do I know if you can name a file something\c:\something under Windows...
So, if your targets are not all located under the directory from which make is called, the best is to use another solution.

How to loop through the result of find in bash and perform operations on each of them

I am very new to bash scripting. I need to perform the same operation on 300 files. I have tried the following so far:
empty_files=$(find ./ -type f -empty)
echo "$empty_files"
Now I have the full paths to all 300 files, stored in the variable empty_files.
Now what I want to do is, for each file in the result:
go to its parent's parent directory,
then go to the other child (the sibling of the earlier file's parent directory) and find a file named 'abc.js' inside it,
then, inside abc.js, find the particular word (object property) "user",
and on a new line, insert a line of JS code (the exact same code for all files).
Please let me know if this is possible from the Mac command line.
Thanks
You can use a for loop:
for file in "${empty_files[#]}"
do
... code that uses "$file"
done
You could also pipe directly from find into the loop:
find . -type f -empty | while read -r file
do
    ... code that uses "$file"
done
This version should work for any filenames that don't contain a newline.
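For the specific steps listed in the question, the body of that loop might look something like the sketch below. The sibling directory name other, the location of abc.js, and the inserted console.log line are placeholders of mine, not values from the question:
find . -type f -empty | while IFS= read -r file
do
    grandparent=$(dirname "$(dirname "$file")")   # the file's parent's parent
    target="$grandparent/other/abc.js"            # "other" is a placeholder sibling directory
    [ -f "$target" ] || continue
    # append one line of JS after every line containing "user"
    # (BSD sed on macOS needs an argument after -i; '' means edit in place, no backup)
    sed -i '' '/user/a\
console.log("inserted line");
' "$target"
done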

need to get recursive list of files using mac command line

I need to get the contents of a folder via the Mac console window
and put it into a text file via >output.txt.
The existing structure looks like:
folder/index.html
folder/images/backpack.png
folder/shared/bootstrap/fonts/helvertica.eot
folder/css/fonts/helverticabold.eot
folder/shared/css/astyle.css
folder/js/libs/jquery-ui-1.10.4/jquery-ui.min.js
folder/js/libs/jquery.tipsy.js
folder/js/libs/raphael.js
What I want would look like this (the folder prefix is gone):
index.html
images/backpack.png
shared/bootstrap/fonts/helvertica.eot
css/fonts/helverticabold.eot
shared/css/astyle.css
js/libs/jquery-ui-1.10.4/jquery-ui.min.js
js/libs/jquery.tipsy.js
js/libs/raphael.js
No css/fonts or js/libs or css folders listed
i.e. no folders, and no formatting like
/folder/shared/css/astyle.css Or
./folder/shared/css/astyle.css
Even better would be with quotes and commas:
“index.html”,
“images/backpack.png”,
“shared/bootstrap/fonts/helvertica.eot”,
“css/fonts/helverticabold.eot”,
“shared/css/astyle.css”,
“js/libs/jquery-ui-1.10.4/jquery-ui.min.js”,
“js/libs/jquery.tipsy.js”,
“js/libs/raphael.js”
I want to make a JSON document. Is this possible?
Thanks.
This is the sort of task that find is good at:
% find folder -type f | sed -e 's,folder/,",' -e 's/$/",/'
Alternatively, you can avoid the 's,folder/,",' substitution by running find from inside the folder, say:
(cd folder; find . -type f) | sed 's/\(.*\)/"\1",/'
Further refinements are an exercise for the reader!
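If you want exactly the quoted, comma-separated list from the question (no leading ./, and no comma after the last entry), one possible refinement, assuming the BSD sed that ships with macOS:
# strip the leading "./", wrap each path in quotes with a trailing comma,
# then drop the comma on the very last line
(cd folder && find . -type f) \
    | sed -e 's|^\./||' -e 's/.*/"&",/' \
    | sed -e '$ s/,$//'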

Bash script to find file older than X days, then subsequently delete it, and any files with the same base name?

I am trying to figure out a way to search a directory for a file older than 365 days. If it finds a match, I'd like it to both delete the file and locate any other files in the directory that have the same basename, and delete those as well.
File name examples: 12345.pdf (search for, then delete); 12345_a.pdf, 12345_xyz.pdf (delete if they exist).
Thanks! I am very new to BASH scripting, so patience is appreciated ;-))
I doubt this can be done cleanly in a single pass.
Your best bet is to use -mtime or a variant to collect names and then use another find command to delete files matching those names.
UPDATE
With respect to your comment, I mean something like:
# find basenames of old files
find .... -printf '%f\n' | sort -u > oldfiles
for file in ($<oldfiles); do find . -name $file -exec rm; done

batch rename files and folders at once

I got help regarding the following question:
batch rename files with ids intact
It's a great example of how to rename specific files in a group, but I am wondering if there is a similar script I could use to do the following:
I have a group of nested folders and files within a root directory that contain [myprefix_foldername] and [myprefix_filename.ext]
I would like to rename all of the folders and files to [foldername] and [filename.ext]
Can I use a similar methodology to what is found in the post above?
Thanks!
jml
Yes, quite easily, with find.
find rootDir -name "myprefix_*"
This will give you a list of all files and folders in rootDir that start with myprefix_. From there, it's a short jump to a batch rename:
find rootDir -name "myprefix_*" | while read f
do
echo "Moving $f to ${f/myprefix_/}"
mv "$f" "${f/myprefix_/}"
done
EDIT: IFS added per http://www.cyberciti.biz/tips/handling-filenames-with-spaces-in-bash.html
EDIT 2: IFS removed in favor of while read.
EDIT 3: As bos points out, you may need to change while read f to while read -d $'\n' f if your version of Bash still doesn't like it.
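One wrinkle worth noting: if a prefixed folder is renamed before the prefixed files inside it, the paths already queued in the pipe stop existing. A depth-first variant avoids that; the -depth flag and the dirname/basename split below are my additions, not part of the answer above:
# -depth lists a directory's contents before the directory itself, so files
# are renamed before their (possibly prefixed) parent folders
find rootDir -depth -name "myprefix_*" | while IFS= read -r f
do
    # strip the prefix from the last path component only
    new="$(dirname "$f")/$(basename "$f" | sed 's/^myprefix_//')"
    echo "Moving $f to $new"
    mv "$f" "$new"
done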
