else part of an if/else block never runs in Bash script - bash

In the function below, I'm trying to print the file name if there are any files in the directory, or an error if there are none, but the else part of the inner if/else block, which prints the error message, never runs.
It must be something simple, but I can't figure it out.
walk_dir () {
    shopt -s nullglob dotglob
    for pathname in "$1"/*; do
        if [ -d "$pathname" ]; then
            printf "\nFiles under $(basename "$pathname")\n";
            printf "==========================================\n";
            walk_dir "$pathname";
        else
            fc=$(find "$pathname" -maxdepth 1 -type f | wc -l);
            # Here is the problem
            if [ $fc -gt 0 ] && [ ! -z $fc ]; then
                printf '%s\n' $(basename "$pathname");
            else
                printf '\e[30mNo candidate file found.\e[39m\n';
            fi
        fi
    done
}

There are three cases for walk_dir:
Non-empty folder
Empty folder
Regular file
(Special files are ignored; probably N/A here.)
When called on a non-empty folder, walk_dir recurses into every sub-folder (line #7) and prints the base name of every regular file (line #12).
When called on an empty folder, walk_dir does nothing, since "$1"/* expands to an empty list (nullglob is set).
When the directory contains files, the code "counts" with the find on line #9, but that branch only processes non-directory entries - most likely regular files (assuming no special devices, etc.); directories take the recursive call instead.
For a regular file, the find on line #9 always sets fc="1", therefore the condition on line #11 is always true, and the else part on lines #13-14 is never reached.
1 walk_dir () {
2 shopt -s nullglob dotglob
3 for pathname in "$1"/*; do
4 if [ -d "$pathname" ]; then
5 printf "\nFiles under $(basename "$pathname")\n";
6 printf "==========================================\n";
7 walk_dir "$pathname";
8 else
9 fc=$(find "$pathname" -maxdepth 1 -type f | wc -l);
10 # Here is the problem
11 if [ $fc -gt 0 ] && [ ! -z $fc ]; then
12 printf '%s\n' $(basename "$pathname");
13 else
14 printf '\e[30mNo candidate file found.\e[39m\n';
15 fi
16 fi
17 done
18 }
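To see this concretely, here is a minimal hypothetical demonstration (the scratch file name is made up, not from the thread): running the same find against a single regular file always reports exactly one file.
touch /tmp/scratch_file                                   # hypothetical regular file
fc=$(find /tmp/scratch_file -maxdepth 1 -type f | wc -l)
echo "$fc"                                                # always prints 1, so [ $fc -gt 0 ] never fails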

Following @dash-o's explanation, I put the if condition before iterating over the files, and it worked as I expected.
walk_dir () {
    shopt -s nullglob dotglob
    fc=$(find "$1" -maxdepth 1 -type f | wc -l);
    if [ $fc -eq 0 ] || [ -z $fc ]; then
        printf '\e[30mNo candidate file found.\e[0m\n';
    fi
    for pathname in "$1"/*; do
        if [ -d "$pathname" ]; then
            printf "\nFiles under $(basename "$pathname")\n";
            printf "==========================================\n";
            walk_dir "$pathname";
        else
            printf '%s\n' $(basename "$pathname");
        fi
    done
}
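For reference, a lightly polished variant of the same fix with the expansions quoted and fc made local (this cleanup is my own assumption, not part of the original answer):
walk_dir () {
    shopt -s nullglob dotglob
    local fc pathname
    fc=$(find "$1" -maxdepth 1 -type f | wc -l)
    if [ "$fc" -eq 0 ]; then
        printf '\e[30mNo candidate file found.\e[0m\n'
    fi
    for pathname in "$1"/*; do
        if [ -d "$pathname" ]; then
            printf '\nFiles under %s\n' "$(basename "$pathname")"
            printf '==========================================\n'
            walk_dir "$pathname"
        else
            printf '%s\n' "$(basename "$pathname")"
        fi
    done
}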

Too long for a comment!
I don't understand the usage of
fc=$(find "$pathname" -maxdepth 1 -type f | wc -l);
In that part of the code, $pathname is either a file (not a directory) or, if nothing was found, the unexpanded search pattern of the for loop. So this code seems better:
if [[ -f $filename ]]
then
# use input redirection to suppress filename
fc=$(wc -l <"$filename")
else
echo "not a plain file"
fi

Related

BASH test if file name ends with .dylib

I'm walking a file tree in order to identify all .DYLIB files.
#!/bin/bash
#script to recursively travel a dir of n levels

function traverse() {
    for file in "$1"/*
    do
        if [ ! -d "${file}" ] ; then
            echo "${file} is a file"
        else
            echo "entering recursion with: ${file}"
            traverse "${file}"
        fi
    done
}

function main() {
    traverse "$1"
}

main "$1"
I want to test if the filename ends with .DYLIB before printing "... is a file". I think I may need to add to the condition "if [ ! -d "${file}" ] ; then", but I'm not sure. Is there a way to do this in bash?
No need to write your own recursive function. You can recursively find all *.dylib files using a ** glob:
shopt -s globstar
ls "$1"/**/*.dylib
Or use find:
find "$1" -name '*.dylib'
To use these results I recommend looping over them directly. It avoids using up memory with a temporary array.
shopt -s globstar
for file in "$1"/**/*.dylib; do
echo "$file"
done
or
while IFS= read -rd '' file; do
echo "$file"
done < <(find "$1" -name '*.dylib')
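One assumed refinement (not in the original answer): with the glob version, if no .dylib file exists at all, the unexpanded pattern itself is iterated once, so it helps to also set nullglob.
shopt -s globstar nullglob   # nullglob: a glob with no matches expands to nothing
for file in "$1"/**/*.dylib; do
    echo "$file"
done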
Is there a way I can store everything in a string array so that I can perform an operation on those .dylib files?
But if you do indeed want an array, you can write:
shopt -s globstar
files=("$1"/**/*.dylib)
or
readarray -td '' files < <(find "$1" -name '*.dylib')
Then to loop over the array you'd write:
for file in "${files[@]}"; do
    echo "$file"
done
Don't add to the condition in if [ ! -d "$file" ], because then the else block will try to recurse into files that don't have the suffix. But recursion should only be for directories.
You should add a nested condition for this. You can use bash's [[ ]] condition operator to have = perform wildcard matching.
if [ ! -d "${file}" ] ; then
    if [[ "$file" = *.DYLIB ]]; then
        echo "${file} is a file"
    fi
else
    echo "entering recursion with: ${file}"
    traverse "${file}"
fi
Or you could invert the sense of the directory test:
if [ -d "$file" ]; then
    echo "entering recursion with: ${file}"
    traverse "${file}"
elif [[ "$file" = *.DYLIB ]]; then
    echo "$file is a file"
fi

(unix) How to write script to count the number of files in a directory using loop

How can I create a bash script to count the number of files in a directory using a loop?
The script should take a target directory and output: Number of files in ::
#!/bin/bash
counter=0
if [ ! -d "$1" ]
then
    printf "%s\n" " $1 is not a directory"
    exit 0
fi
directory="$1"
number="${directory##*/}"
number=${#number}
if [ $number -gt 0 ]
then
    directory="$directory/"
fi
for line in ${directory}*
do
    if [ -d "$line" ]
    then
        continue
    else
        counter=$(( $counter + 1))
    fi
done
printf "%s\n" "Number of files in $directory :: $counter"
I would use (GNU) find and wc:
find /path/to/dir -maxdepth 1 -type f -printf '.' | wc -c
The above find command prints a dot for every file in the directory and wc -c counts those dots. This would work well with any kind of special character (including whitespaces and newlines) in the filenames.
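A minimal wrapper sketch built around that pipeline (the error message and output format are my assumptions, matching what the question asks for):
#!/bin/bash
# Hypothetical wrapper: count regular files directly inside the directory given as $1.
dir=${1:?directory argument required}
count=$(find "$dir" -maxdepth 1 -type f -printf '.' | wc -c)
printf 'Number of files in %s :: %d\n' "$dir" "$count"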
You don't really need a loop. The following will count the files in a directory:
files=($(ls $1))
echo ${#files[@]}
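A variant of the same idea that avoids parsing ls (my own suggestion, not from the original answer; like the ls version it counts directories as well as files):
shopt -s nullglob
files=("$1"/*)
echo "${#files[@]}"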

Setting Shell Positional Parameters With "set ls -AF"

I am writing a ksh function (that is placed in the .profile file) that will present a menu of subdirectories and permit the user to choose one into which to cd. Here is the code:
# Menu driven subdirectory descent.
d(){
    # Only one command line argument accepted
    [ "$1" = "--" ] && shift $#   # Trap for "ls --" feature
    wd=`pwd`; arg="${1:-$wd}"
    dirs="`/bin/ls -AF $arg 2>/dev/null | grep /$ | tr -d \"/\"`"
    # Set the names of the subdirectories to positional parameters
    if [ "$dirs" ] ;then
        set $dirs
        if [ $# -eq 1 -a "$arg" = "$wd" ] ;then cd $arg/$1; return; fi # trap: it's obvious; do it
    else echo "No subdirectories found" >&2; return 1
    fi
    # Format and display the menu
    if [ `basename "${arg}X"` = "${arg}X" ] ;then arg="$wd/$arg"; fi # Force absolute path if relative
    echo -e "\n\t\tSubdirectories relative to ${arg}: \n"
    j=1; for i; do echo -e "$j\t$i"; j=`expr $j + 1`; done | pr -r -t -4 -e3
    echo -e "\n\t\tEnter the number of your choice -- \c "
    # Convert user-input to directory-name and cd to it
    read choice; echo
    dir=`eval "(echo $\{"$choice"\})"` # Magic here.
    [ "$choice" -a "$choice" -ge 1 -a "$choice" -le "$#" ] && cd $arg/`eval echo "$dir"`
}
This function works reasonably well with the exception of directory names that contain space characters. If the directory name contains a space, the set command sets each space delimited element of the directory name (instead of the complete directory name) into a separate positional parameter; that is not useful here.
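A tiny illustration of that splitting behavior (the directory names are made up):
dirs='My Documents
Downloads'
set $dirs    # default IFS splits on the space as well as the newline
echo $#      # prints 3 (My / Documents / Downloads) instead of the intended 2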
I have attempted to set the $IFS shell variable (which contains a space, tab, and newline by default) to a single newline character with:
IFS=`echo` # echo outputs a trailing newline character by default
Which appears to accomplish what is intended as verified with:
echo -e "$IFS\c" | hexdump -c
But despite my best efforts (over the course of several days' work), I have failed to get directory names that contain spaces set as whole values of positional parameters.
What am I missing?
Suggestions are hereby solicited and most welcome.
ADVAthanksNCE
Bob
Short answer: You can't do that. Don't try. See the ParsingLs page for an understanding of why programmatic use of ls is inherently error-prone.
You can't get -F behavior without implementing it yourself in shell (which is indeed feasible), but the following is the correct way to put a list of subdirectories into the argument list:
set -- */
If you don't want to have a literal / on the end of each entry:
set -- */ # put list of subdirectories into "$#"
set -- "${@%/}" # strip trailing / off each
Even better, though: Use an array to avoid needing eval magic later.
dirs=( */ )
dirs=( "${dirs[@]%/}" )
printf '%s\n' "${dirs[$choice]}" # emit entry at position $choice
Let's tie this all together:
d() {
    destdir=$(
        FIGNORE=   # ksh93 equivalent to bash shopt -s dotglob
        while :; do
            subdirs=( ~(N)*/ )   # ksh93 equivalent to subdirs=( */ ) with shopt -s nullglob
            (( ${#subdirs[@]} > 2 )) || break   # . and .. are two entries
            for idx in "${!subdirs[@]}"; do
                printf '%d) %q\n' "$idx" "${subdirs[$idx]%/}" >&2
            done
            printf '\nSelect a subdirectory: ' >&2
            read -r choice
            if [[ $choice ]]; then
                cd -- "${subdirs[$choice]}" || break
            else
                break
            fi
        done
        printf '%s\n' "$PWD"
    )
    [[ $destdir ]] && cd -- "$destdir"
}
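A possible way to try it out (the file name is hypothetical): source the function into an interactive ksh session and call it.
. ./d.ksh   # hypothetical file containing the function above
d           # pick subdirectories by number; press Enter on an empty line to stop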
Although still not working, this version does pass shellcheck albeit with one exception:
3 # Menu driven subdirectory descent.
4 function d{
5 # Only one command line argument accepted
6 [ "$1" = "--" ] && shift $# # Trap for "ls --" feature
7 wd="$PWD"; arg="${1:-$wd}"
8 set -- "${@%/}" # Set the names of the subdirectories to positional parameters
9 if [ $# -eq 1 -a "$arg" = "$wd" ] ;then cd "$arg/$1" || exit 1; return; # trap: it's obvious; do it
10 else echo "No subdirectories found" >&2; return 1
11 fi
12 # Format and display the menu
13 if [[ $(basename "${arg}X") = "${arg}X" ]] ;then arg="$wd/${arg}"; fi # Force absolute path if relitive
14 echo -e "\n\t\tSubdirectories relative to ${arg}: \n"
15 j=1; for i; do echo -e "$j\t$i"; j=(expr $j + 1); done | pr -r -t -4 -e3
16 echo -e "\n\t\tEnter the number of your choice -- \c "
17 # Convert user-input to directory-name and cd to it
18 read -r choice; echo
19 dir=(eval "(echo $\{\"$choice\"\})") # Magic here.
20 [ "$choice" -a "$choice" -ge 1 -a "$choice" -le "$#" ] && cd "${arg}"/"$(eval echo "${dir}")" || exit 1
^SC2128 Expanding an array without an index only gives the first element.
21 }
Once I have incorporated your suggestions into the code, and made it functional, I'll post it here, and mark my question answered. Thank you for your kind assistance.
I've used the code you kindly wrote as a basis for the d function below. It pretty much does what I'd like, with a few little issues:
All subdirectory names that contain a SPACE character come out surrounded by quote characters (the %q format does this), but those that do not are not.
All subdirectory names that contain a SINGLE QUOTE character have that character escaped with a BACKSLASH character.
Given that 1 and 2 above cause no issues, they are acceptable, but not ideal.
After user input does the cd, the menu of subdirectory names is again looped through. This could be considered a feature, I suppose. I tried substituting a return for the break commands in the sections of code following the cd commands, but was unsuccessful in overcoming the subsequent looped menu.
The inclusion of "." and ".." at the head of the menu of subdirectories is not ideal, and actually serves no good purpose.
------------------- Code Begins ------------------------------
d() {
    if [ "$BASH" ] && [ "$BASH" != "/bin/sh" ]; then
        echo "$FUNCNAME: ksh only"; return 1
    fi
    FIGNORE=   # ksh93 equivalent to bash shopt -s dotglob
    if [ ${#} -gt 0 ] ;then   # Only one command line argument accepted
        cd -- "$1" && return 0
    fi
    if [ `ls -AF1|grep /|wc -l` -eq 1 ] ;then   # cd if only one subdirectory
        cd -- `ls -AF1|grep /` && return 0
    fi
    destdir=$(
        while :; do
            subdirs=( ~(N)*/ )   # ksh93 equivalent to subdirs=( */ ) with shopt -s nullglob
            (( ${#subdirs[@]} > 2 )) || break   # . and .. are two entries
            echo -e "\n\t\tSubdirectories below ${PWD}: \n" >&2
            for idx in "${!subdirs[@]}"; do
                printf '%d) %q\n' "$idx" "${subdirs[$idx]%/}" >&2
            done
            printf '\nSelect a subdirectory: ' >&2
            read -r
            if [[ $REPLY ]]; then
                cd -- "${subdirs[$REPLY]}" || break   # Continue to loop through subdirectories after cding
            else
                break
            fi
        done
        printf '%s\n' "$PWD"
    )
    [[ $destdir ]] && cd -- "$destdir"
}
--------------------------- Code Ends ------------------------------------
So, overall I'm very pleased, and consider myself very fortunate to have received the knowledgeable assistance of such an accomplished Unix wizard. I can't thank you enough.

bash returning from recursion

I have to search all subdirs recursively and print the type and name prefixed with * (the number of * equals the depth of the file/dir). The problem comes when I enter a dir and then want to get back out, but nothing happens.
my test file
DIR test
*FILE ace
*FILE base
*DIR father
**FILE cookies
*DIR mother
**DIR how
***FILE youdoing
*FILE zebra
my code
maxDepth is how far into the dir it can go (default 3) and currDepth is 1 at the beginning
function tree(){
    maxDepth=$2
    currDepth=$3
    #print the starting file
    if [ "$currDepth" -eq 0 ];then
        printf "%s %s\n" DIR "$1"
        currDepth=1
    fi
    for path in "$1"/*;do
        for i in $( seq 1 $currDepth );do
            echo -n *
        done
        if [ -d "$path" ];then
            printf "%s %s\n" DIR "${path##*/}"
            if [[ "$currDepth" -lt "$maxDepth" ]];then
                tree "$path" "$maxDepth" "$(( currDepth + 1 ))"
            fi
            continue
        fi
        if [ -f "$path" ];then
            printf "%s %s\n" FILE "${path##*/}"
            continue
        fi
        if [ -L "$path" ];then
            printf "%s %s\n" LINK "${path##*/}"
            continue
        fi
    done
}
my output
DIR test
*FILE ace
*FILE base
*DIR father
**FILE cookies
**DIR mother
***DIR how
***FILE zebra
What am I doing wrong?
Debug your script by doing set -x before you run it.
Make sure integers are always integers by declaring them with the -i integer attribute.
Use expression syntax consistently. It's a good idea to always use [[ ]] tests for string comparisons and (( )) for arithmetic and numeric comparisons, if your target shell is bash.
Use (( )) for loops instead of seq, which is nonstandard.
Explicitly declare your variables in the function (using local or declare) to ensure they are scoped to the function.
Actually call the inner tree.
#!/bin/bash
tree() {
    local -i maxDepth=$2   # Make sure these values are always integers
    local -i currDepth=$3
    # print the starting file
    if (( currDepth == 0 )); then   # use bash arithmetic
        printf "%s %s\n" DIR "$1"
        currDepth=1
    fi
    for path in "$1"/*; do
        for ((i=0; i<currDepth; i++)); do
            printf '*'
        done
        if [[ -d "$path" ]]; then
            printf "%s %s\n" DIR "${path##*/}"
            if [[ "$currDepth" -lt "$maxDepth" ]]; then
                tree "$path" "$maxDepth" "$(( currDepth + 1 ))"
            fi
            continue
        fi
        if [[ -f "$path" ]]; then
            printf "%s %s\n" FILE "${path##*/}"
            continue
        fi
        if [[ -L "$path" ]]; then
            printf "%s %s\n" LINK "${path##*/}"
            continue
        fi
    done
}
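An example invocation consistent with the currDepth == 0 branch (the call itself is my assumption; the question doesn't show how the function is invoked):
tree ./test 3 0   # root dir "test", maxDepth=3, currDepth=0 so the top-level DIR line is printed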
Here a solution using find, stat and sed:
find <DIR> -exec stat --printf="%n,%F\n" "{}" \; | \
sed -r -e "s/[^\/]+\//\*/g" -e "s/regular file/FILE/" -e "s/directory/DIR/" | \
sed -r -e "s/([\*]+)([^,]+),(.+)/\1 \3 \2/"
IMPORTANT: Use DIR not DIR/ otherwise DIR name will not appear in results.
Explanation:
find returns recursively all files and directory within DIR.
-exec option in find allows to pass each result to another command.
Here I'm passing each result to the command stat
stat has an option to format the output -printf (see manpage) :
%n is the filename (with relative path)
%F is the file type (regular file, directory, symbolic link, block special file, ...)
So,
find <DIR> -exec stat --printf="%n,%F\n" "{}" \;
returns the following, one result by line (assuming that there are only regular files and directories in DIR) :
DIR/path/to/file,regular file
DIR/path/to/dir,directory
Then, I'm using sed to transform each line the way you required using regular expression:
Replace each path component followed by / with * -> ***basename,file type
Replace "regular file" by FILE
Replace "directory" by DIR
Shuffle around basename and filetype using back referencing in sed.
NOTE: I will not explain in detail how regular expressions work as it would be too long.
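To make the steps concrete, here is one assumed line from the stat output traced by hand through the pipeline (not actual captured output; note the space introduced by the \1 \3 \2 replacement):
test/mother/how,directory    (assumed line printed by stat)
**how,directory              (after s/[^\/]+\//\*/g)
**how,DIR                    (after s/directory/DIR/)
** DIR how                   (after s/([\*]+)([^,]+),(.+)/\1 \3 \2/)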
I should have used local in front of currDepth=$3

loop over directories to echo its content

The directories are variables set to the full-path
for e in "$DIR_0" "$DIR_1" "$DIR_2"
do
    for i in $e/*
    do
        echo $i
    done
done
The output for each line is the full path. I want only the name of each file
You are looking for basename.
This is the Bash equivalent of basename:
echo "${i##*/}"
It strips off everything before and including the last slash.
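A quick illustration with a made-up path (not from the question):
i=/home/user/docs/report.txt
echo "${i##*/}"   # prints: report.txt   (basename-like)
echo "${i%/*}"    # prints: /home/user/docs   (the dirname-like counterpart)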
If you truly do not wish to recurse you can achieve that more succinctly with this find command:
find "$DIR_0" "$DIR_1" "$DIR_2" -maxdepth 1 -type f -exec basename {} \;
If you wish to recurse over subdirs simply leave out maxdepth:
find "$DIR_0" "$DIR_1" "$DIR_2" -type f -exec basename {} \;
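With GNU find, a variant that avoids spawning basename once per file (this assumes GNU findutils and is not part of the original answer):
find "$DIR_0" "$DIR_1" "$DIR_2" -type f -printf '%f\n'   # %f is the name with leading directories removed; add -maxdepth 1 to stay non-recursive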
To traverse a directory recursively with bash, try this (you can find it here):
#! /bin/bash

indent_print()
{
    for((i=0; i < $1; i++)); do
        echo -ne "\t"
    done
    echo "$2"
}

walk_tree()
{
    local oldifs bn lev pr pmat
    if [[ $# -lt 3 ]]; then
        if [[ $# -lt 2 ]]; then
            pmat=".*"
        else
            pmat="$2"
        fi
        walk_tree "$1" "$pmat" 0
        return
    fi
    lev=$3
    [ -d "$1" ] || return
    oldifs=$IFS
    IFS=""
    for el in $1/*; do
        bn=$(basename "$el")
        if [[ -d "$el" ]]; then
            indent_print $lev "$bn/"
            pr=$( walk_tree "$el" "$2" $(( lev + 1)) )
            echo "$pr"
        else
            if [[ "$bn" =~ $2 ]]; then
                indent_print $lev "$bn"
            fi
        fi
    done
    IFS=$oldifs
}

walk_tree "$1" "\.sh$"
See also the POSIX compliant Bash functions to replace basename & dirname here:
http://cfaj.freeshell.org/src/scripts/

Resources