How to search for files ending in a star? - bash

I want to make a script that searches for, shows and deletes files whose names end in "*". My command:
echo rm `find -name "*[*]"`
The command works, but say I create two files: something and something* (ending in a star). Now when I run the command, it shows me: rm something* something. It lists the similar file "something" as well.
Why?

As Stefan Hamcke states in comments, this is because the wildcard (*) from find's result ("something*") is passed as an argument to echo and ends up being expanded again, resulting in the final output having both something and something*.
Do this instead:
find . -name "*[*]" -exec echo rm {} +
Output:
rm ./something*
You can also achieve the same with the expression "*\*" in find.
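Once the dry run prints exactly the names you expect, you can swap the echo for a real removal. As a hedged sketch (the -ok variant, which prompts before each deletion, is my own safety suggestion rather than part of the original answer):
find . -name "*[*]" -ok rm {} \;
find . -name "*[*]" -exec rm -- {} +
The first form asks for confirmation file by file; the second deletes everything that matched in one go.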

The bug is that you do not quote the argument to echo. Here is a trivial fix:
echo "rm $(find -name "*[*]")"
This is not entirely a minimal fix because I also replaced the obsolescent `backtick` syntax with the modern, recommended $(command substitution) syntax.
Without quotes, the string returned from the command substitution gets evaluated by the shell for word splitting and wildcard expansion. For details, see When to wrap quotes around a shell variable?
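To see that word splitting and wildcard expansion in action, here is a small illustration of my own (not from the original answer), using printf to show each resulting argument:
var='foo *'
printf '<%s>\n' $var      # unquoted: splits into two words and expands * against the current directory
printf '<%s>\n' "$var"    # quoted: a single argument, printed as <foo *>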

Related

How to avoid using spaces as separators in zsh for-loop?

I'm trying to make a little script to convert some files from a music library.
But if I do something like:
#!/usr/bin/zsh
for x in $(find -name "*.m4a");
do
echo $x;
done
When run in a folder containing:
foo\ bar.m4a
it will return:
foo
bar.m4a
How could I prevent the for loop from interpreting space characters as separators?
I could replace $(find -name "*.m4a") with $(find -name "*.m4a" | sed "s/ /_/g") and then use sed the other way round inside the loop, but what if file names/paths already contain underscores (or whatever other character I use instead of underscores)?
Any idea?
You can prevent word splitting of the command substitution by double-quoting it:
#!/usr/bin/zsh
for x in "$(find -name "*.m4a")"
do
echo "$x"
done
Notice that the double quotes inside the command substitution don't conflict with the double quotes outside of it (I'd actually never noticed this before now). You could just as easily use single quotes if you find it more readable, as in "$(find -name '*.m4a')". I usually use single quotes with find primaries anyway.
Quoting inside the loop body is important for the same reason. It will ensure that the value of x is passed as a single argument.
But this is definitely a hybrid, Frankensteinian solution. You'd be better off with either globbing or using find as follows:
find . -name '*.m4a' -exec echo {} \;
but this form is limiting. You can add additional -exec primaries, which will be executed like shell commands that are separated by &&, but you can't pipe from one -exec to another and you can't interact with the shell (e.g. by assigning or expanding parameters).
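If you do need shell features for each match (pipes, parameter expansion, and so on), one common workaround, which I am adding here rather than quoting from the original answer, is to have find start a small shell and hand it the matched names:
find . -name '*.m4a' -exec sh -c '
  for f do
    printf "Processing %s\n" "$f"
    # ffmpeg -i "$f" "${f%.m4a}.ogg"   # hypothetical conversion step
  done
' sh {} +
The file names arrive as positional parameters of the inner sh, so they survive spaces and other odd characters untouched.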
Use zsh's globbing facilities here instead of find.
for x in **/*.m4a; do
echo "$x"
done
Quoting $x in the body of the loop is optional under the default settings of zsh, but it's not a bad idea to do so anyway.
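If you want to lean on zsh a bit more, glob qualifiers can tighten this up; this sketch (my addition, using standard zsh qualifiers) matches only regular files and quietly produces an empty list when nothing matches:
for x in **/*.m4a(N.); do   # N: no error when nothing matches; .: plain files only
  echo "$x"
done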
I found out.
As suggested here : https://askubuntu.com/questions/344407/how-to-read-complete-line-in-for-loop-with-spaces
I may set the IFS (Internal Field Separator) to '\n'.
So this works:
#!/usr/bin/zsh
IFS=$'\n'
for x in $(find -name "*.m4a");
do
echo $x;
done
I hope this could help someone else!
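One refinement I would suggest (not part of the original answer) is to confine the IFS change so it does not leak into the rest of the script, for example by running the loop in a subshell:
#!/usr/bin/zsh
(
  IFS=$'\n'
  for x in $(find . -name "*.m4a"); do
    echo "$x"
  done
)
# outside the subshell, IFS still has its original value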

bash variable substitution within command substitution

I want to do something like the following:
#!/bin/bash
cmd="find . -name '*.sh'"
echo $($cmd)
What I expect is that it will show all the shell script files in the current directory, but nothing happened.
I know that I can solve the problem with eval according to this post
#!/bin/bash
cmd="find . -name '*.sh'"
eval $cmd
So my question is why command substitution doesn't work here and what's the difference between $(...) and eval in terms of the question?
Command substitution works here; you just have the wrong quoting. Your script finds only one file name: the one with single quotes and an asterisk in it:
'*.sh'
You can create such an unusual file with this command and test it:
touch "'*.sh'"
Quoting in bash is different from other programming languages. Check out the details in this answer.
What you need is this quoting:
cmd="find . -name *.sh"
echo $($cmd)
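A quick way to see exactly which words the unquoted $cmd turns into, offered as my own illustration rather than part of this answer, is to print each resulting word on its own line:
cmd="find . -name '*.sh'"
printf '<%s>\n' $cmd
# <find>
# <.>
# <-name>
# <'*.sh'>   (assuming no file in the current directory happens to match the pattern '*.sh' with literal quotes)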
Since you are already including the pattern *.sh inside double quotes, there's no need for the single quotes to protect the pattern, and as a result the single quotes become part of the pattern.
You can try using an array to keep *.sh quoted until it is passed to the command substitution:
cmd=(find . -name '*.sh')
echo $("${cmd[#]}")
I don't know if there is a simple way to convert your original string to an array without the pattern being expanded.
Update: This isn't too bad, but it's probably better to just create the array directly if you can.
cmd="find . -name *.sh"
set -f
cmd=($cmd)
set +f
echo $("${cmd[#]}")
When you use the echo $($cmd) syntax, it's basically equivalent to just putting $cmd on its own line. The problem is the way bash wants to interpolate the wildcard before the command runs. The way to protect against that is to put the variable containing the * char in quotes AGAIN when you dereference it in the script.
But if you put the whole command find . -name "*.sh" in a variable and then quote it when you expand it, as in echo $("$cmd"), the shell will interpret the entire string as the name of a single command to execute, and you get a command not found error.
So it really depends on what you really need in the variable and what can be pulled out of it. If you need the program in the variable, this will work:
#!/bin/bash
cmd='/usr/bin/find'
$cmd . -maxdepth 1 -name "*.sh"
This will find all the files in the current working directory that end in .sh without having the shell interpolate the wildcard.
If you need the pattern to be in a variable, you can use:
#!/bin/bash
pattern="*.sh"
/usr/bin/find . -maxdepth 1 -name "$pattern"
But if you put the whole thing in a variable, you won't get what you expect. Hope this helps. If a bash guru knows something I'm missing I'd love to hear it.

Bash script look for a file type in a given folder

I have a bit of a problem. I'm trying to write a script that looks for all files of a given type (php) in a given directory. If it doesn't find any, it goes through all the sub-directories in the parent directory. If it does find one, it performs a given operation and breaks.
Here is what I have so far:
function findPHP(){
declare -a FILES
FILES=$(find ./ -type f -name \*.php)
for f in $FILES
do
echo "Processing $f file..."
# take action on each file.
done
}
Any ideas?
Inside $(...) the command is parsed just like a normal command line, so the backslash in \*.php simply escapes the wildcard and find receives a literal *.php pattern, which is what you want; the escaping is not the problem. The real issues are that FILES=$(...) assigns one long string rather than an array (despite the declare -a), and that the unquoted $FILES in the for loop is then split on whitespace, so any file name containing a space gets broken into pieces.
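For completeness, here is one way the function could be written so that file names containing spaces (or even newlines) survive intact. This is a common pattern I am adding as a sketch, not something from the original answer, and it assumes your find supports -print0:
findPHP() {
    # Read NUL-delimited names from find so whitespace in names is preserved.
    while IFS= read -r -d '' f; do
        echo "Processing $f file..."
        # take action on each file.
    done < <(find . -type f -name '*.php' -print0)
}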

/usr/bin/find: cannot build its arguments dynamically

The following command works as expected interactively, in a terminal.
$ find . -name '*.foo' -o -name '*.bar'
./a.foo
./b.bar
$
However, if I do this, I get no results!
$ ftypes="-name '*.foo' -o -name '*.bar'"
$ echo $ftypes
-name '*.foo' -o -name '*.bar'
$ find . $ftypes
$
My understanding was/is that $ftypes would get expanded by bash before find got a chance to run. In which case, the ftypes approach should also have worked.
What is going on here?
Many thanks in advance.
PS: I have a need to dynamically build a list of file types (the ftypes variable above) to be given to find later in a script.
Both answers so far have recommended using eval, but that has a well-deserved reputation for causing bugs. Here's an example of the sort of bizarre behavior you can get with this:
$ touch a.foo b.bar "'wibble.foo'"
$ ftypes="-name '*.foo' -o -name '*.bar'"
$ eval find . $ftypes
./b.bar
Why didn't it find the file ./a.foo? It's because of exactly how that eval command got parsed. bash's parsing goes something like this (with some irrelevant steps left out):
1. bash looks for quotes first (none found -- yet).
2. bash substitutes variables (but doesn't go back and look for quotes in the substituted values -- this is what led to the problem in the first place).
3. bash does wildcard matching (in this case it looks for files matching '*.foo' and '*.bar' -- note that it hasn't parsed the quotes, so it just treats them as part of the filename to match -- and finds 'wibble.foo' and substitutes it for '*.foo'). After this the command is roughly eval find . -name "'wibble.foo'" -o -name "'*.bar'". BTW, if it had found multiple matches things would've gotten even sillier by the end.
4. bash sees that the command on the line is eval, and runs the whole parsing process over on the rest of the line.
5. bash does quote matching again, this time finding two single-quoted strings (so it'll skip most parsing on those parts of the command).
6. bash looks for variables to substitute and wildcards to match, etc, but there aren't any in the unquoted sections of the command.
7. Finally, bash runs find, passing it the arguments ".", "-name", "wibble.foo", "-o", "-name", and "*.bar".
8. find finds one match for "*.bar", but no match for "wibble.foo". It never even knows you wanted it to look for "*.foo".
So what can you do about this? Well, in this particular case adding strategic double-quotes (eval "find . $ftypes") would prevent the spurious wildcard substitution, but in general it's best to avoid eval entirely. When you need to build commands, an array is a much better way to go (see BashFAQ #050 for more discussion):
$ ftypes=(-name '*.foo' -o -name '*.bar')
$ find . "${ftypes[#]}"
./a.foo
./b.bar
Note that you can also build the options bit by bit:
$ ftypes=(-name '*.foo')
$ ftypes+=(-o -name '*.bar')
$ ftypes+=(-o -name '*.baz')
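To tie this back to the goal of building the list of types dynamically, here is a small sketch of assembling the array from a list of extensions; the extension list itself is just a made-up example:
ftypes=()
for ext in foo bar baz; do                  # hypothetical list of extensions
  [ ${#ftypes[@]} -gt 0 ] && ftypes+=(-o)   # separate the tests with -o
  ftypes+=(-name "*.$ext")
done
find . "${ftypes[@]}"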
Simply prefix the line with eval to force the shell to expand and parse the command:
eval find . $ftypes
Without the eval, the '*.foo' is passed on literally instead of just *.foo (that is, the ' are suddenly considered to be part of the filename, so find is looking for files that start with a single quote and have an extension of foo').
The problem is that the single quotes inside $ftypes are passed to find literally as part of its arguments instead of being interpreted by the shell.
One way around it is:
$ eval find . $ftypes

Shell script: Check that file is file and not directory

I'm currently working on a small cute shell script to loop through a specific folder and only output the files inside it, excluding any eventual directories. Unfortunately I can't use find as I need to access the filename variables.
Here's my current snippet, which doesn't work:
for filename in "/var/myfolder/*"
do
if [ -f "$filename" ]; then
echo $filename # Is file!
fi
done;
What am I doing wrong?
You must not quote /var/myfolder/*; remove the double quotes so that the expression is expanded by the shell into the desired list of file names.
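For reference, the corrected loop (my restatement of the fix, with the glob unquoted and the uses of $filename quoted) looks like this:
for filename in /var/myfolder/*
do
    if [ -f "$filename" ]; then
        echo "$filename" # Is file!
    fi
done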
What you're doing wrong is not using find. The filename can be retrieved by using {}.
find /var/myfolder -maxdepth 1 -type f -exec echo {} \;
Try it without the double quotes around /var/myfolder/* (by putting double quotes there you are making all the files into one single string instead of each filename being a separate word):
for filename in "/var/myfolder/*"
The quotes mean you get one giant string from that glob -- stick an echo $filename immediately before the if to discover that it only goes through the 'loop' once, with something that isn't useful.
Remove the quotes and try again :)
You can use find and avoid all these hassles.
for i in $(find /var/myfolder -type f)
do
echo $(basename $i)
done
Isn't this what you're trying to do in your situation? If you want to restrict depth, use the -maxdepth option to find.
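If your find is GNU find, an alternative I would add here is to let find print the base names itself, which avoids both basename and the word-splitting problems of looping over an unquoted $(...):
find /var/myfolder -maxdepth 1 -type f -printf '%f\n'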

Resources