Bash -c argument passing to find - bash

This command works
find . -name \*.txt -print
and outputs two filenames
This command works
bash -c 'echo . $0 $1 -print' "-name" "\*.txt"
and outputs this result:
. -name *.txt -print
But this command
bash -c 'find . $0 $1 -print' "-name" "\*.txt"
does not give an error but does not output anything either.
Can anyone tell me what is happening here?

It looks like you're trying to use "\*.txt" to forestall glob-expansion so that the find command sees *.txt instead of e.g. foo.txt.
However, what it ends up seeing is \*.txt. No files match that pattern, so you see no output.
To make find see *.txt as its 3rd argument, you could do this:
bash -c 'find . $0 "$1" -print' "-name" "*.txt"
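A quick way to check what the inline script actually receives is to print its arguments:
bash -c 'printf "arg: %s\n" "$0" "$@"' "-name" "*.txt"
arg: -name
arg: *.txt
With no backslash in the outer argument, *.txt arrives intact as the pattern for find to use.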
Edit: Are you really getting . -name *.txt -print as the output of the first command where you replaced find with echo? When I run that command, I get . -name \*.txt -print.

Well the suggestions from francesco work. But I am still confused by the behaviour here.
We know that putting unquoted wild cards in a find command will usually result in an error. To wit:
find . -name *.txt -print
find: paths must precede expression: `HowTo-Word-Split.txt'
find: possible unquoted pattern after predicate `-name'?
However putting the wild card in single quotes (or escaping it if it is only 1 char) will work like so:
find . -name \*.txt -print
which gives this output (on two separate lines)
> ./HowTo-Word-Split.txt
> ./bash-parms.txt
So in the bash -c version what I was thinking was this:
bash -c 'find . $0 $1 -print' "-name" "*.txt"
would result in the *.txt being expanded even before being passed into the command string,
and that using single quotes around it would result in trying to execute (after the argument substitution and the -c taking effect)
find . -name *.txt -print
which as I just demonstrated does not work.
However there seems to be some sort of magic associated with the -c switch as demonstrated by setting -x at the bash prompt, like so:
$ set -x
$ bash -c ' find . $0 "$1" -print' "-name" "*.txt"
+ bash -c ' find . $0 "$1" -print' -name '*.txt'
./HowTo-Word-Split.txt
./bash-parms.txt
Note that even though I used double quotes in the -c line, bash actually executed the find with single quotes put around the argument, thus making find work.
Problem solved. :)!

Related

Edit a find -exec echo command to include a grep for a string

So I have the following command which looks for a series of files and appends three lines to the end of everything found. Works as expected.
find /directory/ -name "file.php" -type f -exec sh -c "echo -e 'string1\nstring2\nstring3\n' >> {}" \;
What I need to do is also look for any instance of string1, string2, or string3 in the find output of file.php prior to echoing/appending the lines, so I don't append to a file unnecessarily. (This is being run in a crontab)
Using | grep -v "string" after the find breaks the -exec command.
How would I go about accomplishing my goal?
Thanks in advance!
That -exec command isn't safe for strings with spaces.
You want something like this instead (assuming finding any of the strings is reason not to add any of the strings).
find /directory/ -name "file.php" -type f -exec sh -c "grep -qE 'string1|string2|string3' \"\$1\" || echo -e 'string1\nstring2\nstring3\n' >> \"\$1\"" - {} \;
To explain the safety issue.
find places {} in the command it runs as a single argument but when you splat that into a double-quoted string you lose that benefit.
So instead of doing that you pass the file as an argument to the shell and then use the positional arguments in the shell command with quotes.
The command above simply chains the echo to a failure from grep to accomplish the goal.
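If readability matters, the same approach can be written with the script in single quotes and the files passed as positional arguments in one batch; a rough sketch using the placeholder path and strings from the question (printf is used instead of echo -e for portability):
find /directory/ -name "file.php" -type f -exec sh -c '
  for f in "$@"; do
    grep -qE "string1|string2|string3" "$f" || printf "string1\nstring2\nstring3\n" >> "$f"
  done
' sh {} +
With {} + find hands the shell many files per invocation instead of starting one shell per file.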

Bash parameter expansion in brackets not working as expected

I am writing a script that wraps the find command to search for specific source file types under a given directory. A sample invocation would be :
./find_them.sh --java --flex --xml dir1
The above command would search for .java, .as and .xml files under dir1.
To do this manually I came up with the following find command :
find dir1 -type f -a \( -name "*.java" -o -name "*.as" -o -name "*.xml" \)
As I am doing this in a script where I want to be able to specify different file sets to search for, you end up with the following structure:
find_cmd_file_sets=$(decode_file_sets) # Assume this creates a string with the file sets e.g. -name "*.java" -o -name "*.as" etc
dirs=$(get_search_dirs) # assume this gives you the list of dirs to search, defaulting to the current directory
for dir in $dirs
do
find $dir -type f -a \( $find_cmd_file_sets \)
done
The above script doesn't behave as expected: you execute the script and the find command churns for a while before returning no results.
I'm certain the equivalents of decode_file_sets and get_search_dirs I've created are generating the correct results.
A simpler example if to execute the following directly in a bash shell
file_sets=' -name "*.java" -o -name "*.as" '
find dir -type f -a \( $file_sets \) # Returns no result
# Executing result of below command directly in the shell returns correct result
echo find dir -type f -a \\\( $file_sets \\\)
I don't understand why variable expansion in brackets of the find command would change the result. If it makes any difference I am using git-bash under Windows.
This is really frustrating. Any help would be much appreciated. Most importantly I would like to understand why the variable expansion of $file_sets is behaving as it is.
Hope this will work; it's tested on bash.
file_sets=' -name "*.java" -o -name "*.as" '
command=`echo "find $dir -type f -a \( $file_sets \)"`
eval $command
TLDR: Don't use quotes in find_cmd_file_sets variable and disable pathname expansion (set -f) before calling find.
When you have "special" character in a variable content and then you try to expand that variable without quotes than bash will surround each word with "special" character with single quotes, e.g.:
#!/usr/bin/env bash
set -x
VAR='abc "def"'
echo $VAR
The output is:
+ VAR='abc "def"'
+ echo abc '"def"'
abc "def"
As you can see, bash surrounded "def" with single quotes. In your case, the call to the find command becomes:
find ... -name '"*.java"' ...
So it tries to find files whose names start with " and end with .java"
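You can see this literally by creating a throwaway file whose name really does contain the quote characters:
$ touch '"literal.java"'
$ find . -name '"*.java"'
./"literal.java"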
To prevent that behavior, the only thing you can do (which I'm aware of) is to use double quotes when expanding the variable, e.g.:
#!/usr/bin/env bash
set -x
VAR='abc "def"'
echo "$VAR"
The output is:
+ VAR='abc "def"'
+ echo 'abc "def"'
abc "def"
The only problem, as you probably noticed already, is that now the whole variable is in quotes and is treated as single argument. So this won't work in your find command.
The only option left is to not use quotes, neither in variable content nor when expanding the variable. But then, of course, you have a problem with pathname expansion:
#!/usr/bin/env bash
set -x
VAR='abc *.java'
echo $VAR
The output is:
+ VAR='abc *.java'
+ echo abc file1.java file2.java
abc file1.java file2.java
Fortunately you can disable pathname expansion using set -f:
#!/usr/bin/env bash
set -x
VAR='abc *.java'
set -f
echo $VAR
The output is:
+ VAR='abc *.java'
+ set -f
+ echo abc '*.java'
abc *.java
To sum up, the following should work:
#!/usr/bin/env bash
pattern='-name *.java'
dir="my_project"
set -f
find "$dir" -type f -a \( $pattern \)
bash arrays were introduced to allow this kind of nested quoting:
file_sets=( -name "*.java" -o -name "*.as" )
find dir -type f -a \( "${file_sets[@]}" \)
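Putting it together, the whole wrapper can be built on arrays; a rough sketch where the option names and defaulting are assumptions standing in for your decode_file_sets and get_search_dirs:
#!/usr/bin/env bash
file_set=()
dirs=()
for arg in "$@"; do
  case $arg in
    --java) file_set+=(-o -name '*.java') ;;
    --flex) file_set+=(-o -name '*.as')   ;;
    --xml)  file_set+=(-o -name '*.xml')  ;;
    *)      dirs+=("$arg")                ;;
  esac
done
file_set=("${file_set[@]:1}")          # drop the leading -o (assumes at least one file-set option was given)
[ "${#dirs[@]}" -eq 0 ] && dirs=(.)    # default to the current directory
for dir in "${dirs[@]}"; do
  find "$dir" -type f -a \( "${file_set[@]}" \)
done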

find command fusses on -exec arg

I am trying to build and run a find command from a script. But I get a very cryptic error message from find. The following basically sums up how I build the command line and run it
$ xx="find . -name 'p*' -mmin +10 -exec echo {} \\;"
$ echo "$xx" #.....and I get the same print from echo $xx
find . -name 'p*' -mmin +10 -exec echo {} \;
$ $xx
find: missing argument to `-exec'
$ find . -name 'p*' -mmin +10 -exec echo {} \;
./p2.sh
./p1.sh
$ read xx
find . -name 'p*' -mmin +2 -exec echo {} \\;
$ $xx
find: missing argument to `-exec'
I am stuck and will appreciate your help. I am also wondering what's causing this. I am using bash 3.2.51 on SLES.
The actual command I want to execute is a little bit longer but I used echo here just to illustrate.
Thanks
Dinesh
Trying to store complicated commands in bash variables and then evaluate the variables pretty well never works.
If you need to build a command in pieces, use an array. See this useful Bash FAQ: I'm trying to put a command in a variable, but the complex cases always fail!.
Here's the basic strategy:
# Make an array
declare -a findcmd=(find .)
# Add some arguments
findcmd+=(-name 'p*')
findcmd+=(-mmin +10)
findcmd+=(-exec echo {} \;)
# Run the command
"${findcmd[#]}"
You need to understand how bash quoting works. Remember that the quoting (and de-quoting) only happens once, when you type the command (or when bash reads it from a script file). Quotes which get into the values of variables are just ordinary characters.
If you're experimenting with set -x, remember also that set -x inserts quotes in order to remove ambiguities. These quotes are not part of the variables. While that is clearly essential, it seems to be confusing to programmers who are not familiar with the bash execution model.
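For example, tracing a small array expansion shows the quotes appearing only in the trace line, not in the arguments themselves (exact trace formatting can vary a little between bash versions):
$ args=(-name 'p*')
$ set -x
$ printf '%s\n' "${args[@]}"
+ printf '%s\n' -name 'p*'
-name
p*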

Unix find: list of files from stdin

I'm working in Linux & bash (or Cygwin & bash).
I have a huge--huge--directory structure, and I have to find a few needles in the haystack.
Specifically, I'm looking for these files (20 or so):
foo.c
bar.h
...
quux.txt
I know that they are in a subdirectory somewhere under ..
I know I can find any one of them with
find . -name foo.c -print
This command takes a few minutes to execute.
How can I print the names of these files with their full directory name? I don't want to execute 20 separate finds--it will take too long.
Can I give find the list of files from stdin? From a file? Is there a different command that does what I want?
Do I have to first assemble a command line for find with -o using a loop or something?
If your directory structure is huge but not changing frequently, it is good to run
cd /to/root/of/the/files
find . -type f -print > ../LIST_OF_FILES.txt   # and the next one is sometimes handy too
find . -type d -print > ../LIST_OF_DIRS.txt
After that you can find anything really fast (with grep, sed, etc.) and update the file lists only when the tree changes. (It is a simplified replacement if you don't have locate.)
So,
grep '/foo.c$' LIST_OF_FILES.txt #list all foo.c in the tree..
When you want to find a list of files, you can try the following:
fgrep -f wanted_file_list.txt < LIST_OF_FILES.txt
or directly with the find command
find . -type f -print | fgrep -f wanted_file_list.txt
The -f option for fgrep means "read patterns from the file", so you can easily grep the input for multiple patterns...
You shouldn't need to run find twenty times.
You can construct a single command with a multiple of filename specifiers:
find . \( -name 'file1' -o -name 'file2' -o -name 'file3' \) -exec echo {} \;
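If the twenty names live in a file, you can also build that expression in a loop while keeping each -name argument intact; a sketch assuming one name per line in a wanted_file_list.txt like the one above:
names=()
while IFS= read -r n; do
  [ -n "$n" ] && names+=(-o -name "$n")
done < wanted_file_list.txt
names=("${names[@]:1}")   # drop the leading -o
find . \( "${names[@]}" \) -print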
Is the locate(1) command an acceptable answer? Nightly it builds an index, and you can query the index quite quickly:
$ time locate id_rsa
/home/sarnold/.ssh/id_rsa
/home/sarnold/.ssh/id_rsa.pub
real 0m0.779s
user 0m0.760s
sys 0m0.010s
I gave up executing a similar find command in my home directory at 36 seconds. :)
If nightly doesn't work, you could run the updatedb(8) program by hand once before running locate(1) queries. /etc/updatedb.conf (updatedb.conf(5)) lets you select specific directories or filesystem types to include or exclude.
Yes, assemble your command line.
Here's a way to process a list of files from stdin and assemble your (FreeBSD) find command to use extended regular expression matching (n1|n2|n3).
For GNU find you may have to use one of the following options to enable extended regular expression matching:
-regextype posix-egrep
-regextype posix-extended
echo '
foo\\.c
bar\\.h
quux\\.txt
' | xargs bash -c '
IFS="|";
find -E "$PWD" -type f -regex "^.*/($*)$" -print
echo find -E "$PWD" -type f -regex "^.*/($*)$" -print
' arg0
# note: "$*" uses the first character of the IFS variable as array item delimiter
(
IFS='|'
set -- 1 2 3 4 5
echo "$*" # 1|2|3|4|5
)
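For reference, an untested sketch of the same idea with GNU find, using -regextype in place of the BSD-only -E flag:
printf '%s\n' 'foo\\.c' 'bar\\.h' 'quux\\.txt' | xargs bash -c '
IFS="|"
find "$PWD" -regextype posix-extended -type f -regex "^.*/($*)$" -print
' arg0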

How does command substitution work with find?

I have the following command
find . -name "*.tiff" -exec echo `basename -s .tiff {}` \;
I expect this to print all my .tiff-files without their file extensions. What I get is
./file1.tiff
./file2.tiff
...
The command,
find . -name "*.tiff" -exec basename -s .tiff {} \;
does yield
file1
file2
...
Is this not supposed to be the input of echo?
The content of the backticks is executed before the find command - yielding just the placeholder {}, which is used in the find command line - hence your result. You can always use set -x to examine what the shell is up to.
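If you do want the substitution to happen once per file, run it in a shell that find starts for you; a minimal sketch, assuming your basename supports -s as in the question:
find . -name "*.tiff" -exec sh -c 'for f in "$@"; do echo "stem: $(basename -s .tiff "$f")"; done' sh {} +
Here the $(...) sits inside single quotes, so it is evaluated by the inner sh for each file rather than by your interactive shell up front.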
Putting a command in backticks causes it to be executed and replaced by its output before find ever starts, which is why the substitution never sees the individual files.
Instead, get rid of the echo and the command substitution and let find run basename directly:
find . -name "*.tiff" -exec basename -s .tiff {} \;
This will execute basename on each found file.
