Checking whether the input argument to a script is empty fails in a bash script

This is a small bash program that is tasked with looking through a directory and counting how many files it contains. It should ignore subdirectories and only count the files.
Below is my bash code, which seems to fail to count the files in the directory. I say this because if I remove the if statement and just increment the counter, the for loop iterates and prints 4 in the counter (though this includes directories). With the if statement it prints this to the console.
folder1 has files
Looking at other questions I think the expression in my if statement is right, and I am getting no errors for syntax or other problems.
So I am simply dumbfounded as to why it is not counting the files.
#!/bin/bash
folder=$1
if [ $1 = empty ]; then
folder=empty
counter=0
echo $folder has $counter files
exit
fi
for d in $(ls $folder); do
if [[ -f $d ]]; then
let 'counter++'
fi
done
echo $folder has $counter files
Thank you.

Your entire script could be simplified as below, with some enhancements. Never use the output of ls programmatically; it should be used only on the command line. The -z construct allows you to assert whether the parameter following it is empty or non-empty.
For looping over files, use the default glob expansion provided by the shell. Note that && is a shorthand for performing an action when the command on its left succeeds, roughly the short-hand equivalent of if <condition>; then <action>; fi
#!/usr/bin/env bash
[ -z "$1" ] && { printf 'invalid argument passed\n' >&2 ; exit 1 ; }
shopt -s nullglob
count=0   # start from zero so an empty directory reports 0 files
for file in "$1"/*; do
[ -f "$file" ] && ((count++))
done
printf 'folder %s has %d files\n' "$1" "$count"
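As a quick usage sketch, assuming the script above were saved as count.sh (a name chosen here purely for illustration) and run against a folder containing two regular files and one subdirectory:
$ ./count.sh folder1
folder folder1 has 2 files
$ ./count.sh
invalid argument passed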

Related

For files in directory Bash [duplicate]

I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:
#!/bin/bash
for filename in "$1"/*; do
echo "File:"
echo $filename
done
And I am running the above using:
sh test.sh path/to/loop/over
However, the above doesn't output the files in the directory path/to/loop/over; it instead outputs:
File:
path/to/loop/over/*
I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:
File:
foo.txt
File:
bar.txt
Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding /* after the $1; however, this doesn't seem to help (neither do these suggestions).
Iterate over content of directory
Compatible answer (not only bash)
As this question is tagged shell, there is a POSIX compatible way:
#!/bin/sh
for file in "$1"/* ;do
[ -f "$file" ] && echo "Process '$file'."
done
This will be enough (and works with filenames containing spaces):
$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.
This works well with any POSIX shell. Tested with bash, ksh, dash, zsh and busybox sh.
#!/bin/sh
cd "$1" || exit 1
for file in * ;do
[ -f "$file" ] && echo "Process '$file'."
done
This version won't print the path:
$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
Some bash ways
Introduction
I don't like to use shopt when it's not needed... (This changes standard
bash behaviour and makes scripts less readable.)
There is an elegant way of doing this using standard bash, without requiring shopt.
Of course, the previous answer works fine under bash, but there are some
interesting ways to make your script more powerful, flexible, pretty, detailed...
Sample
#!/bin/bash
die() { echo >&2 "$0 ERROR: $@";exit 1;} # Emergency exit function
[ "$1" ] || die "Argument missing." # Exit unless argument submitted
[ -d "$1" ] || die "Arg '$1' is not a directory." # Exit if argument is not dir
cd "$1" || die "Can't access '$1'." # Exit unless access dir.
files=(*) # All files names in array $files
[ -f "$files" ] || die "No files found." # Exit if no files found
for file in "${files[#]}";do # foreach file:
echo Process "$file" # Process file
done
Explanation: considering globbing vs real files
When doing:
files=(/path/to/dir/*)
the variable $files becomes an array containing all the files under /path/to/dir/:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
But if nothing matches the glob pattern, the star won't be replaced and the array becomes:
declare -p files
declare -a files=([0]="/path/to/dir/*")
From there, looking for $files is like looking for ${files[0]}, i.e. the first field in the array. So
[ -f "$files" ] || die "No files found."
will execute the die function unless the first field of the array files is a file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).
But you could replace this filesystem test with a string-based test, like:
[ "$files" = "/path/to/dir/*" ] && die "No files found."
or, using array length:
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."
Dropping paths by using Parameter expansion:
To suppress the path from the filenames, instead of cd $path you could do:
targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."
Then:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
You could then run:
printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo
This would happen if the directory is empty, or misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)
You can verify this easily for yourself. In a directory with four files,
sh$ echo *
a file or two
sh$ echo [ot]*
or two
sh$ echo n*
n*
And in Bash,
bash$ echo n*
n*
bash$ shopt -s nullglob
bash$ echo n*
I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/

Bash Recursive Loop searching for file

Our assignment was just a conditional statement to say whether a file exists or not in the current directory. I have already completed the primary objective, but I am trying to figure out how I could recurse using a loop and find any other occurrences of the file name.
Currently, I am getting an infinite loop after finding the file in the first directory.
File exists and is located at /home/charlie/file.txt
File exists and is located at /home/charlie/file.txt
...
Questions:
Would I need to have a nested for loop somewhere even though I am recursively calling the function?
Does using $pwd mess it up as I am trying to step into the directories?
Why is it printing twice as of now?
#!/bin/bash
if [[ $# -eq 0 ]] ; then
echo 'Usage: findFile_new.sh [file name]'
exit 0
fi
exist="File exists and is located at "
function check() {
for file in $(pwd)/*
do
if [ -d "$file" ]; then
check $file $1
else
## Look for file
if [ -f "$1" ]; then
echo $exist$(pwd)/$1
fi
fi
done
}
check $1
Getting closer...
Why am I getting this as my output?
File exists and is located at
/home/charlie//#/*testFile.txt
File exists and is located at
/home/charlie//compareQuiz.sh/testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/dir3/dir4//*testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/file.txt/*testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/testFile.txt/*testFile.txt
File exists and is located at
/home/charlie//findFile_new.sh/*testFile.txt
File exists and is located at
/home/charlie//testFile.txt/*testFile.txt
File exists and is located at
/home/charlie//testscript.sh/*testFile.txt
File exists and is located at
/home/charlie//~/*testFile.txt
#!/bin/bash
if [[ $# -eq 0 ]] ; then
echo 'Usage: findFile_new.sh [file name]'
exit 0
fi
exist="File exists and is located at "
echo $1 $2
function check() {
for file in $1/*
do
if [ -d "$file" ]; then
check $file $2
else
## Look for file
if [ -f "$2" ]; then
echo $exist$file
fi
fi
done
}
check $1 $2
Yes, you are almost there. A possible problem is that the variable $file
will contain a pathname to the file, such as path/to/the/testFile.txt, while
the variable $2 may contain only testFile.txt.
Would you please try the following:
#!/bin/bash
# usage: check path targetname
check() {
local file
for file in "$1"/*; do
if [[ -d $file ]]; then
check "$file" "$2"
elif [[ -f $file && ${file##*/} = $2 ]]; then
echo "found: $file"
fi
done
}
if (( $# != 2 )); then
echo "usage: $0 path name"
exit 0
fi
check "$1" "$2"
The expression ${file##*/} removes everything up to and including the rightmost slash,
leaving just the filename, so now you can compare it with $2.
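As a quick illustration of that expansion (the path here is just an example):
$ file=path/to/the/testFile.txt
$ echo "${file##*/}"
testFile.txt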
Your search is not truly recursive and will eventually fall into an infinite loop.
Rolling your own Bash script that covers every case for a recursive search within a directory can be tricky (permission handling and links come to mind...).
What you could do is use find, which takes care of all the recursive woes, and then extract the data you may need. The script (or single command, really) would be find . -iname "$1"
find . (in this directory), -iname (match this name, ignoring case), and $1 (the first argument's value).
From the above, you could grep to extract any data you may need.
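For instance, a purely illustrative sketch (the grep pattern is an assumption, not something from the original answer):
$ find . -iname "$1" | grep 'dir1'
This keeps only the matches whose path contains dir1; any other filter or extraction with grep would work the same way.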
I know this does not answer your question 1:1, but if you're just starting with Bash, I'd say you're better off using the tools provided with Bash first, before rolling your own.
As your task was already accomplished and it's more of a general question, I figured I'd provide you with "what I would do". :)
Another point: when you mean the environment variable that bash maintains, use $PWD rather than $pwd. Variable names are case-sensitive, and sticking to the uppercase name makes it simple to see which env_vars are user-declared and which are system-reserved.
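A quick check in an interactive shell shows the difference (the path is just an example):
$ echo "$PWD"
/home/charlie
$ echo "$pwd"
(prints an empty line, since the lowercase variable is unset unless you define it yourself)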
edit:
If you'd like to see a nice example of how this can be done, please checkout https://www.shellscript.sh/eg/directories/#traverse2.sh

Test -d directory true - subdirectory false (POSIX)

I'm trying to print all directories/subdirectories from a given start directory.
for i in $(ls -A -R -p); do
if [ -d "$i" ]; then
printf "%s/%s \n" "$PWD" "$i"
fi
done;
This script returns all of the directories found in the . directory and all of the files in that directory, but for some reason the test fails for subdirectories. All of the directories end up in $i and the output looks exactly the same.
Let's say I have the following structure:
foo/bar/test
echo $i prints
foo/
bar/
test/
While the contents of the folders are listed like this:
./foo:
file1
file2
./bar:
file1
file2
However the test statement just prints:
PWD/TO/THIS/DIRECTORY/foo
For some reason it returns true for the first level directories, but false for all of the subdirectories.
(ls is probably not a good way of doing this and I would be glad for a find statement that solves all of my issues, but first I want to know why this script doesn't work the way you'd think.)
As pointed out in the comments, the issue is that the directory names include a :, so -d is false.
I guess that this command gives you the output you want (although it requires Bash):
# enable globstar for **
# disabled in non-interactive shell (e.g. a script)
shopt -s globstar
# print each path ending in a / (all directories)
# ** expands recursively
printf '%s\n' **/*/
The standard way would be either to do the recursion yourself, or to use find:
find . -type d
Consider your output:
dir1:
dir1a
Now, the following will be true:
[ -d dir1/dir1a ]
but that's not what your code does; instead, it runs:
[ -d dir1a ]
To avoid this, don't attempt to parse ls; if you want to implement recursion in baseline POSIX sh, do it yourself:
callForEachEntry() {
# because calling this without any command provided would try to execute all found files
# as commands, checking for safe/correct invocation is essential.
if [ "$#" -lt 2 ]; then
echo "Usage: callForEachEntry starting-directory command-name [arg1 arg2...]" >&2
echo " ...calls command-name once for each file recursively found" >&2
return 1
fi
# try to declare variables local, swallow/hide error messages if this fails; code is
# defensively written to avoid breaking if recursing changes either, but may be faulty if
# the command passed as an argument modifies "dir" or "entry" variables.
local dir entry 2>/dev/null ||: "not strict POSIX, but available in dash"
dir=$1; shift
for entry in "$dir"/*; do
# skip if the glob matched nothing
[ -e "$entry" ] || [ -L "$entry" ] || continue
# invoke user-provided callback for the entry we found
"$#" "$entry"
# recurse last, so the callback still sees the right values if the "local" above failed on a baseline platform.
if [ -d "$entry" ]; then
callForEachEntry "$entry" "$#"
fi
done
}
# call printf '%s\n' for each file we recursively find; replace this with the code you
# actually want to call, wrapped in a function if appropriate.
callForEachEntry "$PWD" printf '%s\n'
find can also be used safely, but not as a drop-in replacement for the way ls was used in the original code -- for dir in $(find . -type d) is just as buggy. Instead, see the "Complex Actions" and "Actions In Bulk" sections of Using Find.
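As a minimal sketch of the "actions in bulk" idea (printing each directory is just a stand-in for whatever action you actually need):
find . -type d -exec printf '%s\n' {} +
Here find passes the directory names directly as arguments to printf, so no shell word-splitting or glob expansion ever touches them.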

bash - recursive script can't see files in sub directory

I've got a recursive script which iterates over a list of names, some of which are files and some of which are directories.
If it's a (non-empty) directory, I should call the script again with all of the files in the directory and check if they are legal.
The part of the code making the recursive call:
if [[ -d $var ]] ; then
if [ "$(ls -A $var)" ]; then
./validate `ls $var`
fi
fi
The part of code checking if the files are legal:
if [[ -f $var ]]; then
some code
fi
But, after making the recursive calls, I can no longer check any of the files inside that directory: because they are not in the same directory as the main script, the -f $var test cannot see them.
Any suggestion how can I still see them and use them?
Why not use find? It is a simple and easy solution to the problem.
Always quote variables; you never know when you will find a file or directory name with spaces.
shopt -s nullglob
if [[ -d "$path" ]] ; then
contents=( "$path"/* )
if (( ${#contents[@]} > 0 )); then
"$0" "${contents[@]}"
fi
fi
you're re-inventing find
of course, var is a lousy variable name
if you're recursively calling the script, you don't need to hard-code the script name.
you should consider putting the logic into a function in the script, and the function can recursively call itself, instead of having to spawn a new process to invoke the shell script each time. If you do this, use $FUNCNAME instead of "$0"
A few people have mentioned how find might solve this problem; I just wanted to show how that might be done:
find /yourdirectory -type f -exec ./validate {} +
This will find all regular files in yourdirectory and, recursively, in all its sub-directories, and pass their paths as arguments to ./validate. The {} is expanded to the paths of the files that find locates within yourdirectory. The + at the end means that each call to validate will be on a large number of files, instead of calling it individually on each file (which is what happens when the + is replaced with \;); this sometimes provides a huge speedup.
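To make the difference concrete (the directory name is just a placeholder):
find /yourdirectory -type f -exec ./validate {} +
find /yourdirectory -type f -exec ./validate {} \;
The first form batches many file paths into each invocation of ./validate; the second runs ./validate once per file.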
One option is to change directory (carefully) into the sub-directory:
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ./validate $(ls))
fi
fi
The outer parentheses start a new shell so the cd command does not affect the main shell. The exec replaces the original shell with (a new copy of) the validate script. Using $(...) instead of back-ticks is sensible. In general, it is sensible to enclose variable names in double quotes when they refer to file names that might contain spaces (but see below). The $(ls) will list the files in the directory.
Heaven help you with the ls commands if any file names or directory names contain spaces; you should probably be using * glob expansion instead. Note that a directory containing a single file with a name such as -n would trigger a syntax error in your script.
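A rough sketch of what the glob-based variant might look like (my own illustration, not part of the original answer; an empty directory would still need handling, for example by enabling nullglob in the called script):
if [[ -d "$var" ]] ; then
(cd "$var" && exec ./validate *)
fi
The shell expands * inside the subdirectory itself, so names containing spaces reach ./validate as separate, intact arguments. The ./validate path caveat discussed in the Corrigendum below still applies.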
Corrigendum
As Jens noted in a comment, the location of the shell script (validate) has to be adjusted as you descend the directory hierarchy. The simplest mechanism is to have the script on your PATH, so you can write exec validate or even exec $0 instead of exec ./validate. Failing that, you need to adjust the value of $0 — assuming your shell leaves $0 as a relative path and doesn't mess around with converting it to an absolute path. So, a revised version of the code fragment might be:
# For validate on PATH or absolute name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec $0 $(ls))
fi
fi
or:
# For validate not on PATH and relative name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ../$0 $(ls))
fi
fi

bash - Test if directory contains files ending on .suite

I'm currently writing a bash script for executing test suites. Besides passing the suites directly to this script, like
./bash-specs test.suite
it should also be able to execute all scripts in a given directory if no suite is passed to it, like so
./bash-specs # executes all tests in the directory, namely test.suite
This is implemented like this
(($# == 0)) && set -- *.suite
So, if no suite is passed, all the files ending on .suite are executed. This works fine but fails if the directory contains no such files.
That means I will also need a check to test if there actually are files with that ending.
How would I do this in bash?
I thought a test like
[[ -f *.suite ]]
should work, but it seems to fail when there is more than one such file in the directory.
The reason -f is failing is that -f only takes a single parameter. When you do [[ -f *.suite ]], it expands to:
[[ -f test.suite test2.suite test3.suite ]]
... which is not valid.
Instead, do this:
shopt -s nullglob
FILES=`echo *.suite`
if [[ -z $FILES ]]; then
echo "No suites found"
exit
fi
for i in $FILES; do
# Run your test on file $i
done
nullglob is a shell option that makes wildcard patterns that aren't found expand to nothing, rather than expanding to the wildcard pattern itself. Once $FILES is set to either a list of files or nothing, we can use -z to test for emptiness, and display the appropriate error message.
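A sketch of an array-based variant that also copes with spaces in filenames (my own illustration, not from the original answer):
shopt -s nullglob
files=( *.suite )
if (( ${#files[@]} == 0 )); then
echo "No suites found" >&2
exit 1
fi
for i in "${files[@]}"; do
echo "running $i"   # replace this with the actual test run
done
Because the names stay in an array and are always quoted, they are never re-split on whitespace.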
ls -al | grep "\.suite";echo $?
This will show 0 if such a file is present and 1 if it is not.
I would iterate over every suite file like this:
for i in *.suite ; do
if [ -x $i ] ; then
echo running $i
fi
done
Right after:
(($# == 0)) && set -- *.suite
test whether $1 is empty (with -z); if it is, it means there is no file matching *.suite.
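A minimal sketch of that check, assuming shopt -s nullglob is in effect (my assumption; without it an unmatched pattern expands to the literal *.suite and the -z test would not fire):
shopt -s nullglob
(($# == 0)) && set -- *.suite
if [[ -z "$1" ]]; then
echo "No suites found" >&2
exit 1
fi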
