Shell script: Check that file is file and not directory - bash

I'm currently working on a small cute shell script to loop through a specific folder and output only the files inside it, excluding any directories. Unfortunately I can't use find, as I need the filename in a variable.
Here's my current snippet, which doesn't work:
for filename in "/var/myfolder/*"
do
    if [ -f "$filename" ]; then
        echo $filename # Is file!
    fi
done
What am I doing wrong?

You must not quote /var/myfolder/*; that is, you must remove the double quotes in order for the expression to be correctly expanded by the shell into the desired list of file names.

What you're doing wrong is not using find. The filename can be retrieved by using {}.
find /var/myfolder -maxdepth 1 -type f -exec echo {} \;

Try without double quotes around /var/myfolder/* (the reason being that the double quotes make all the files a single string instead of each filename a separate string).

for filename in "/var/myfolder/*"
The quotes mean you get one giant string from that glob. Stick an echo _ $filename _ immediately before the if to see that it only goes through the 'loop' once, with something that isn't useful.
Remove the quotes and try again :)
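For reference, a sketch of the corrected loop: the glob is left unquoted so it expands to one word per directory entry, while the expansion of $filename is quoted so each name stays intact.

```shell
# Unquoted glob: one word per directory entry.
# Quoted "$filename": each name survives intact, spaces and all.
for filename in /var/myfolder/*
do
    if [ -f "$filename" ]; then
        echo "$filename" # Is file!
    fi
done
```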

You can use find and avoid all these hassles.
for i in $(find /var/myfolder -type f)
do
    echo $(basename $i)
done
Isn't this what you're trying to do? If you want to restrict the depth, use find's -maxdepth option.
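One caveat: because $(find ...) is word-split by the shell, the loop above mangles file names containing spaces. A sketch of a variant that survives spaces (though not the rare newline in a name) pipes find into read -r instead:

```shell
# find's output is piped into `read -r`, which reads one whole line
# per iteration, so names containing spaces are not split apart.
find /var/myfolder -maxdepth 1 -type f | while IFS= read -r i
do
    basename "$i"
done
```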

Related

How to search for files whose names end in stars?

I want to make a script that searches for, shows, and deletes files whose names end in "*". My command:
echo rm `find -name "*[*]"`
The command works, but suppose I create two files: something and something* (ending in a star). Now when I run the command, it shows me rm not only for something* but also for the similar file something.
Why?
As Stefan Hamcke states in comments, this is because the wildcard (*) from find's result ("something*") is passed as argument to echo and ends up being expanded again, resulting in the final output having both something and something*.
Do this instead:
find . -name "*[*]" -exec echo rm {} +
Output:
rm ./something*
You can also achieve the same with the expression "*\*" in find.
The bug is that you do not quote the argument to echo. Here is a trivial fix:
echo "rm $(find -name "*[*]")"
This is not entirely a minimal fix because I also replaced the obsolescent `backtick` syntax with the modern, recommended $(command substitution) syntax.
Without quotes, the string returned from the command substitution gets evaluated by the shell for token splitting and wildcard expansion. For details, see When to wrap quotes around a shell variable?
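To see the re-expansion in action, here is a small sketch in a throwaway directory (the file names something and something* are just illustrative):

```shell
# Demonstrate how an unquoted command substitution result is
# glob-expanded again, while a quoted one is left alone.
cd "$(mktemp -d)"
touch something 'something*'
echo rm $(find . -name "*[*]")    # → rm ./something ./something*
echo "rm $(find . -name "*[*]")"  # → rm ./something*
```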

How to avoid using spaces as separators in zsh for-loop?

I'm trying to make a little script to convert some files from a music library.
But if I do something like:
#!/usr/bin/zsh
for x in $(find -name "*.m4a");
do
    echo $x;
done
When run in a folder containing:
foo\ bar.m4a
it returns:
foo
bar.m4a
How could I prevent the for loop from interpreting space characters as separators?
I could replace $(find -name "*.m4a") with $(find -name "*.m4a" | sed "s/ /_/g") and then apply sed the other way around inside the loop, but what if file names/paths already contain underscores (or other characters I might use instead of the underscore)?
Any idea?
You can prevent word splitting of the command substitution by double-quoting it:
#!/usr/bin/zsh
for x in "$(find -name "*.m4a")"
do
echo "$x"
done
Notice that the double quotes inside the command substitution don't conflict with the double quotes outside of it (I'd actually never noticed this before now). You could just as easily use single quotes if you find it more readable, as in "$(find -name '*.m4a')". I usually use single quotes with find primaries anyway.
Quoting inside the loop body is important for the same reason. It will ensure that the value of x is passed as a single argument.
But this is definitely a hybrid, Frankensteinian solution. You'd be better off with either globbing or using find as follows:
find . -name '*.m4a' -exec echo {} \;
but this form is limiting. You can add additional -exec primaries, which will be executed like shell commands that are separated by &&, but you can't pipe from one -exec to another and you can't interact with the shell (e.g. by assigning or expanding parameters).
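For instance, a sketch of two chained -exec primaries: for each matching file, echo runs only if test -s (file exists and is non-empty) succeeded, much like test -s file && echo file in the shell:

```shell
# Two -exec primaries per file; the second runs only when the
# first one (test -s: non-empty regular file) exits successfully.
find . -name '*.m4a' -type f -exec test -s {} \; -exec echo {} \;
```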
Use zsh's globbing facilities here instead of find.
for x in **/*.m4a; do
echo "$x"
done
Quoting $x in the body of the loop is optional under the default settings of zsh, but it's not a bad idea to do so anyway.
I found it out.
As suggested here: https://askubuntu.com/questions/344407/how-to-read-complete-line-in-for-loop-with-spaces
I can set IFS (the Internal Field Separator) to $'\n'.
So this works :
#!/usr/bin/zsh
IFS=$'\n'
for x in $(find -name "*.m4a");
do
    echo $x;
done
I hope this could help someone else!
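The same IFS trick works in bash as well. A small self-contained sketch (the file names and the temporary directory are illustrative):

```shell
# With IFS set to newline only, $(find ...) is split on newlines
# instead of spaces, so names containing spaces survive intact.
cd "$(mktemp -d)"
touch 'foo bar.m4a' baz.m4a
IFS=$'\n'
for x in $(find . -name '*.m4a'); do
    echo "$x"
done
unset IFS   # restore the default splitting behaviour afterwards
```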

bash variable substitution within command substitution

I want to do something like the following:
#!/bin/bash
cmd="find . -name '*.sh'"
echo $($cmd)
What I expect is that it will show all the shell script files in the current directory, but nothing happened.
I know that I can solve the problem with eval according to this post
#!/bin/bash
cmd="find . -name '*.sh'"
eval $cmd
So my question is why command substitution doesn't work here and what's the difference between $(...) and eval in terms of the question?
Command substitution works here; you just have the wrong quoting. Your script finds only one file name! The one with single quotes and an asterisk in it:
'*.sh'
You can create such an unusual file with this command and test it:
touch "'*.sh'"
Quoting in bash is different than in other programming languages. Check out details in this answer.
What you need is this quoting:
cmd="find . -name *.sh"
echo $($cmd)
Since you are already including the pattern *.sh inside double quotes, there's no need for the single quotes to protect the pattern, and as a result the single quotes become part of the pattern. (Note that the unquoted expansion of $cmd can still glob-expand *.sh against files in the current directory before find runs, so this only behaves as intended when no .sh files live there.)
You can try using an array to keep *.sh quoted until it is passed to the command substitution:
cmd=(find . -name '*.sh')
echo $("${cmd[@]}")
I don't know if there is a simple way to convert your original string to an array without the pattern being expanded.
Update: This isn't too bad, but it's probably better to just create the array directly if you can.
cmd="find . -name *.sh"
set -f
cmd=($cmd)
set +f
echo $("${cmd[@]}")
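Putting the array form together, a sketch in a temporary directory (the file names are illustrative):

```shell
# Each array element is one argument; "${cmd[@]}" expands to exactly
# those words, so '*.sh' reaches find unexpanded by the shell.
cd "$(mktemp -d)"
touch a.sh b.sh notes.txt
cmd=(find . -name '*.sh')
"${cmd[@]}"    # runs: find . -name '*.sh'
```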
When you use the echo $($cmd) syntax, it's basically equivalent to just putting $cmd on its own line. The problem is that bash interpolates the wildcard before the command runs. The way to protect against that is to quote the variable containing the * again when you dereference it in the script.
But if you put the whole command find . -name "*.sh" in a variable and then quote the expansion, as in echo $("$cmd"), the shell interprets the entire line as the name of a single command to execute, and you get a "command not found" error.
So it really depends on what you really need in the variable and what can be pulled out of it. If you need the program in the variable, this will work:
#!/bin/bash
cmd='/usr/bin/find'
$cmd . -name "*.sh" -maxdepth 1
This will find all the files in the current working directory that end in .sh without having the shell interpolate the wildcard.
If you need the pattern to be in a variable, you can use:
#!/bin/bash
pattern="*.sh"
/usr/bin/find . -name "$pattern" -maxdepth 1
But if you put the whole thing in a variable, you won't get what you expect. Hope this helps. If a bash guru knows something I'm missing I'd love to hear it.

Copy File: Name Contains Spaces

I have a large shell script that processes files on each of my Solaris systems.
At the beginning, the script creates a variable FILENAME.
Sometimes people create directories/files that contain spaces.
e.g.
/users/ldap/Anukriti's System Backup/BACKUP/workspace/BP8/scripts/yui/editor/simpleeditor.js
Later in the script I run
cp $FILENAME $DESTDIR/
As you can imagine this always fails because the following is invalid.
cp /users/ldap/Anukriti's System Backup/BACKUP/workspace/BP8/scripts/yui/editor/simpleeditor.js $DESTDIR
I have tried putting the variable in quotes, but it is not working. I have used find with the -exec option before, but for this circumstance that is not really an option, especially since Solaris find does not support the -wholename or -path options.
What can i do here?
You just have to protect the variables with quotes:
cp "$FILENAME" "$DESTDIR"
NOTE
Don't use single quotes '; variables are not expanded inside them.
Looks like I need to use curly braces for variable expansion and double quotes:
cp "${FILENAME}" $DESTDIR
Make sure that $DESTDIR exists and is a directory.
And yes, use double quotes for both variables, and get rid of the trailing /.
You might not believe it, but that is your problem. :-)
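A quick sketch showing the quoted copy surviving both spaces and an apostrophe (the paths and the file name here are made up for illustration):

```shell
# A name with spaces and an apostrophe is passed as one argument
# because both expansions are double-quoted.
cd "$(mktemp -d)"
mkdir dest
FILENAME="Anukriti's System Backup.js"
touch "$FILENAME"
cp "$FILENAME" "dest/"
ls dest
```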

Bash script look for a file type in a given folder

I have a bit of a problem. I'm trying to write a script that looks for all files of a given type (php) in a given directory. If it doesn't find one, it goes through all the subdirectories of the parent directory. If it finds one, it performs a given operation and stops.
Here is what I have so far:
function findPHP(){
    declare -a FILES
    FILES=$(find ./ -type f -name \*.php)
    for f in $FILES
    do
        echo "Processing $f file..."
        # take action on each file.
    done
}
Any ideas?
When using $(...), the usual shell quoting rules still apply inside the parentheses (unlike some quirks of the similar backquote syntax), so \*.php reaches find as *.php and the escaping is fine. The actual problems are that FILES=$(find ...) stores the whole output as a single string rather than an array (the declare -a has no effect on that assignment), and that the unquoted $FILES is then split on whitespace, which breaks for file names containing spaces.
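A sketch of a variant that sidesteps both issues by streaming find's output through read -r, so each path is handled whole even when it contains spaces (the function name and messages follow the question):

```shell
# Stream results instead of collecting them into a string;
# `read -r` hands the loop one complete path per iteration.
findPHP() {
    find . -type f -name '*.php' | while IFS= read -r f
    do
        echo "Processing $f file..."
        # take action on each file.
    done
}
```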
