Bash '.' vs proper dir string - bash

I have been running a number of find commands and have noticed something that seems odd about how bash handles . versus a directory given as an explicit path string.
find . -type f -exec sh -c 'cd $(dirname "$0") && aunpack "$0"' {} \;
acts completely differently to
find [current dir] -type f -exec sh -c 'cd $(dirname "$0") && aunpack "$0"' {} \;
What gives?
Does bash treat '.' and an explicitly specified directory path differently? Isn't '.' a substitute for the current directory?

What find does is append the rest of the path to the location passed as an argument.
For example, if you are in the directory "/home/user/find":
find .
Prints:
.
./a
./b
But if you try:
find /home/user/find
It prints:
/home/user/find
/home/user/find/a
/home/user/find/b
So find appends the rest of the path (/a, /b...) to the argument (. or /home/user/find).

You could use the pwd command instead of the . and it will behave the same.
find "`pwd`" -type f -exec sh -c 'cd $(dirname "$0") && aunpack "$0"' {} \;

@arutaku has pinpointed the source of the problem; let me point out another possible solution. If your version of find supports it, the -execdir primary does what you want very simply: it cd's to the directory each file is in, then executes the command with just the filename (no path):
find . -type f -execdir aunpack {} \;
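If you want to see exactly what -execdir does, a quick probe like this prints the directory find switched into and the argument it handed over (GNU find passes the name as ./name; this is for inspection only):
find . -type f -execdir sh -c 'echo "in $PWD: $1"' _ {} \;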

Bash has nothing to do with it; this is the logic of find. find does not try to expand or normalize the path(s) you give it, it just uses them verbatim: not only for . but for any path specification (e.g. ../../my/other/project).
I find this reasonable, because any conversion would be more complicated than the current behavior: at the very least we would have to decide whether symbolic links are resolved during the conversion, and whenever we wanted a relative path for some reason we would have to relativize it again.
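The symlink point is easy to see with bash's own pwd, which keeps the logical (symlinked) path unless you ask it to resolve links (a quick illustration, assuming /tmp/link is a symlink you created to the project directory):
$ ln -s /home/user/find /tmp/link
$ cd /tmp/link
$ pwd
/tmp/link
$ pwd -P
/home/user/find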

Related

Find and rename files by pattern works in Debian, but not in CentOS7

I need to find and rename files with question mark in names.
Example: "style.css?ver=111" should become "style.css"
I use this command
find . -type f -name "*\?*" -exec rename 's/\?.*//' '{}' \;
In Debian everything works fine, but in CentOS 7 I get an error: "rename: not enough arguments".
Any ideas why?
For a reliable option that should work in any POSIX-compliant system, you may use
find . -type f -name "*\?*" -exec sh -c 'mv -- "$1" "${1%%\?*}"' findshell {} \;
$1 is the name of each file found, and ${1%%\?*} is a parameter expansion that removes everything from the first question mark to the end of the name.
That should be enough if you have a few matching files. If you need it, a more efficient alternative is
find . -type f -name "*\?*" -exec sh -c '
    for file in "$@"; do
        mv -- "$file" "${file%%\?*}"
    done
' findshell {} +
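A quick interactive check of the %%\?* expansion used above, with a hypothetical file name:
$ f='style.css?ver=111'
$ echo "${f%%\?*}"
style.css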

Unix find -exec: Why do the following behaviors differ?

The following works as intended:
$ find . -name .git -exec dirname '{}' \;
./google/guava
./JetBrains/intellij-community
./zsh-users/zsh-syntax-highlighting
But the following only returns dots:
$ find . -name .git -exec echo "$(dirname '{}')" \;
.
.
.
Why is that, and how can I use $(dirname '{}') in a find -exec command?
Please note, I am asking about UNIX find (OS X and FreeBSD, specifically), not GNU.
Reason for behaviour difference
Your shell evaluates the $(dirname '{}') before find ever runs, leading to this command being executed:
find . -name .git -exec echo . \;
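You can check that expansion directly: dirname applied to the literal string {} (which contains no slash) just returns ., and that is what the shell splices into the command line before find starts:
$ dirname '{}'
.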
Other ways to do this
You can of course use shell expansion inside find by calling another shell yourself (or better, by calling a script that uses the shell you want in its shebang).
In other words:
find . -name .git -exec sh -c 'dirname {}' \;
Solution without dirname (POSIX, faster, one less subprocess to call):
find . -name .git -exec sh -c 'path={}; echo "${path%/*}" ' \;
Combining /u/tripleee's answer (upvote him, not me) with the find optimization:
find . -name .git -exec sh -c 'for f; do echo "${f%/*}"; done' _ {} \+
If you're using GNU find, you can ditch dirname and use the -printf action:
find . -name .git -printf "%h\n"
The general answer is to run -exec sh (or -exec bash if you need Bash features).
find . -name .git -exec sh -c 'for f; do dirname "$f"; done' _ {} \+
dirname can easily be replaced with something simpler, but in the general case, this is a useful pattern for when you want a shell to process the results from find.
-exec ... \+ is running the shell on as many matches as possible, instead of executing a separate shell for each match. This optimization is not available in all versions of find.
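One way to watch the batching is to have each shell invocation report how many paths it received; the count per batch depends on your tree and the system's argument-length limit (inspection only):
find . -name .git -exec sh -c 'echo "this shell received $# paths"' _ {} \+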
If you have completely regular file names (no newlines in the results, etc) you might be better off with something like
find . -name .git | sed 's%/[^/]*$%%'
However, assuming that file names will be regular is a huge recurring source of bugs. Don't do that for a general-purpose tool.
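If you do want a pipeline, a NUL-delimited variant avoids the newline pitfall inside the pipe itself, assuming your find and xargs support -print0 and -0 (the output is still newline-separated, so treat it as display only):
find . -name .git -print0 | xargs -0 -n1 dirname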

Collect jars into a tar without directory

I have a large project that creates a large number of jars in paths like project/subproject/target/subproject.jar. I want a command to collect all the jars into one compressed tar, but without the directories. The command I have come up with so far is: find project -name \*.jar -exec tar -rvf Collectors.tar.gz -C $(dirname {}) $(basename {}) \; but this isn't quite working as I intended; the directories are still there.
Does anyone have any ideas for how to resolve this issue?
Your command is quite close, but the problem is that Bash is executing $(dirname {}) and $(basename {}) before executing find; so your command expands to this:
find project -name \*.jar -exec tar -rvf Collectors.tar.gz -C . {} \;
where the -C . is a no-op and the {} just expands to the full relative directory+filename.
One general-purpose way to fix this sort of thing is to wrap up the argument to -exec in a Bash one-liner, so you invoke Bash for each individual file, and let it execute the dirname and basename at the right time:
find project -name \*.jar -exec bash -c 'tar -rvf Collectors.tar.gz -C "$(dirname "$1")" "$(basename "$1")"' '' '{}' \;
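The empty string argument fills $0 for the inline script, so {} lands in $1; any placeholder word works there and makes bash's error messages easier to attribute (same command, with a hypothetical placeholder name):
find project -name \*.jar -exec bash -c 'tar -rvf Collectors.tar.gz -C "$(dirname "$1")" "$(basename "$1")"' collectjar '{}' \;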
In your specific case, however, I'd point you to find's -execdir action, which is the same as -exec except that it cd's into the file's directory first. So you can simply write:
find project -name '*.jar' -execdir tar -rvf "$PWD/Collectors.tar.gz" '{}' \;
(Note that $PWD part, which is to make sure that you write to the Collectors.tar.gz in the current directory, rather than in the directory that find -execdir will cd into.)

recursively rename directories in bash

I'd like to recursively rename all directories containing the string foo by replacing that part of the string with Bar. I've got something like this so far, but it doesn't quite work. I'd also like foo to be matched case-insensitively.
find . -type d -exec bash -c 'mv "$1" "${1//foo/Bar}"' -- {} \;
Are there any elegant one-liners that might be better than this attempt? I've actually tried a few but thought I'd defer to the experts. Note: I'm doing this on a Mac OS X system and don't have tools like rename installed.
Try the following code using parameter expansion (the echo makes it a dry run; remove it to actually rename):
find . -type d -iname '*foo*' -depth -exec bash -c '
echo mv "$1" "${1//[Ff][Oo][Oo]/Bar}"
' -- {} \;
But your best bet will be the prename command (sometimes named rename or file-rename)
find . -type d -iname '*foo*' -depth -exec rename 's#Foo#Bar#gi' {} +
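The Perl rename also accepts -n, so you can preview what the find/rename combination would do before letting it loose (assuming the Perl rename, not the util-linux one, is what's installed):
find . -type d -iname '*foo*' -depth -exec rename -n 's#Foo#Bar#gi' {} +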
And if you are using bash 4 or zsh (** means recursive):
shopt -s globstar
rename -n 's#Foo#Bar#gi' **/*foo*/
If it fits your needs, remove the -n (dry run) switch to rename for real.
SOME DOC
rename was originally written by Perl's dad, Larry Wall himself.
I suspect the problem is getting it to work with mkdir -p foo/foo/foo.
In this regard, I think a solution based on find will likely not work, because the list of paths is computed before the renames happen: renaming a parent directory invalidates the paths of children that have not been processed yet.
The following is in no way elegant, and stretches the definition of a one-liner, but works for the above test.
$ mkdir -p foo/foo/foo
$ (shopt -s nullglob && _() { for P in "$1"*/; do Q="${P//[Ff][Oo][Oo]/bar}"; mv -- "$P" "$Q"; _ "$Q"; done } && _ ./)
$ find
.
./bar
./bar/bar
./bar/bar/bar
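For readability, here is the same recursive helper written out with comments; it is a sketch of the one-liner above, with a small guard added so directories whose names do not change are not passed to mv (the function name is arbitrary):
shopt -s nullglob                        # unmatched globs expand to nothing
rename_tree() {
    local P Q
    for P in "$1"*/; do                  # each immediate subdirectory of $1
        Q="${P//[Ff][Oo][Oo]/bar}"       # case-insensitive foo -> bar
        [ "$P" != "$Q" ] && mv -- "$P" "$Q"
        rename_tree "$Q"                 # recurse into the (possibly renamed) dir
    done
}
rename_tree ./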
find . -type d -iname '*foo*' -exec bash -O nocasematch -c \
'[[ $1 =~ (foo) ]] && mv "$1" "${1//${BASH_REMATCH[1]}/Bar}"' -- {} \;
Pro: Avoids sed.
Con: Will not find all matches if there are multiple in different cases.
Con: Is ridiculous.
Thanks @Gilles Quenot and Wenli: the following worked for me. I based it on both of your solutions.
find . -depth -type d -name 'ReplaceMe*' -execdir bash -c 'mv "$1" "${1/ReplaceMe/ReplaceWith}"' -- {} \;
The -execdir seems to be key on Linux (Red Hat 7.6).
I've been searching similar answers and this one worked:
find . -depth -name 'foo' -execdir bash -c 'mv "$0" "${0//foo/Bar}"' {} \;

Bash script, run echo command with find, set a variable and use that variable

I want to run two commands but the second command depends on the first.
Is there a way to do something like this?
find . -name '*.txt' -exec 'y=$(echo 1); echo $y' {} \;
...
And actually, I want to do this.
Run the find command, change to the directory that the file is in, and then run the command on the file in that directory.
find . -name '*.txt' -exec 'cd basedir && /mycmd/' {} \;
How do I do that?
find actually has a primary that switches to each file's directory and executes a command from there:
find . -name '*.txt' -execdir /mycmd {} \;
Find's -exec option expects an executable with arguments, not a command, but you can use bash -c cmd to run an arbitrary shell command like this:
find . -name '*.txt' -exec bash -c 'cd $(dirname {}) && pwd && /mycmd $(basename {})' \;
I have added pwd to confirm that mycmd executes in the right directory; you can remove it. dirname gives you the directory of each file and basename gives you the file name. If you omit basename, your command will receive (as {}) the pathname of each file relative to the directory where you ran find, which differs from mycmd's current directory because of the cd, so mycmd will likely fail to find the file. If you want your command to receive an absolute pathname, you can try this:
find "$PWD" -name '*.txt' -exec bash -c 'cd "$(dirname {})" && pwd && /mycmd {}' \;
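A slightly more robust variant passes each path to the inline script as a positional argument instead of splicing {} into the script text, so file names containing spaces or quotes cannot confuse the inner shell (same behaviour otherwise; the placeholder name is arbitrary):
find "$PWD" -name '*.txt' -exec bash -c 'cd "$(dirname "$1")" && pwd && /mycmd "$1"' findsh {} \;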
