Setting directory as variable in Shell script - shell

What I am trying to do is to count all the files in a directory using shell script.
For example, when execute the program,
./test.sh project
it should count all the files in the folder called "project".
But I am having trouble with the directory part.
What I have done so far is,
#!/bin/bash
directory=$1
count=ls $directory | wc -l
echo "$folder has $count files"
but it does not work... Can anyone clear up my confusion please?
Thanks!

You have incorrect syntax when setting count. To run nested commands in bash you need command substitution, $(..), which runs the commands in a subshell and returns the result:
count=$(ls -- "$directory" | wc -l)
But never parse ls output in scripts for any purpose; use the more general-purpose find command:
find "$1" -maxdepth 1 -type f | wc -l
Read more about the $(..) form at Wiki Bash Hackers - Command substitution.
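Putting the pieces together, a minimal corrected test.sh might look like this (the `${1:-.}` fallback to the current directory is an addition for robustness, not part of the original question):

```shell
#!/bin/bash
# Count regular files (non-recursively) in the directory given as $1.
directory=${1:-.}            # fall back to the current directory if no argument
count=$(find "$directory" -maxdepth 1 -type f | wc -l)
echo "$directory has $count files"
```

Running `./test.sh project` would then print something like `project has 3 files`.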

#!/bin/bash
directory=$1
count=`ls "$directory" | wc -l`
echo "$directory has $count files"

Related

What does "ls -t . | while read line" mean?

Here is the code:
if test $# -eq 1
then
    if test $1 = "--exec"
    then
        ls -t . | while read line
        do
            if test -f $line -a -x $line
            then
                echo $line
            fi
        done
    fi
fi
I don't understand the utility of . here in ls -t . | while read line; can you explain?
The line lists all the files in the present directory, most recently modified first (that is what ls -t does). The pipe operator sends the directory listing to the while loop, which reads each line of ls output into the variable "line". When I do this kind of thing, I usually use a foreach loop, but either way works.
Your code simply prints all executables in the current folder, ordered by modification time (newest first).
Generally, shell scripting is risky: it is easy to write error-prone or bad-style code without realizing it, and even when you do, you may not fully understand the problem.
To achieve the same goal, I would write the following:
find . -maxdepth 1 -type f -executable -printf '%T@ %p\0' | sort -zk 1nr | sed -z 's/^[^ ]* //'
It's more concise and leaves less room for error.
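If the sed step feels opaque, the same newest-first listing can be sketched with a null-delimited read loop (again assuming GNU find's -printf and GNU sort's -z):

```shell
# List executable regular files in the current directory, newest first,
# without parsing ls. %T@ prints the mtime as seconds since the epoch.
find . -maxdepth 1 -type f -executable -printf '%T@ %p\0' |
  sort -znr |
  while IFS= read -r -d '' entry; do
    printf '%s\n' "${entry#* }"    # drop the leading timestamp field
  done
```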

Shell Script to Display Number of Files and Directories in a Directory

I'm trying to write a script that will tell you how many files and how many directories are in a given directory.
Here's the script I've written, but the output is always "Number of files is ." and "Number of directories is ."
Here's my code:
#!/bin/sh
if [ -d "$#" ]
then
find "$#" -type f | ls -l "$#" | wc -l | echo "Number of files is $#"
find "$#" -type d | ls -l "$#" | wc -l | echo "Number of directories is $#"
fi
You seem to be having difficulty understanding how pipes work. You cannot "natively" use the result (stdout) of the left-hand side of a pipe as a variable on the right-hand side; you either need to consume it and read it into a variable, e.g.
printf "line1\nline2\n" | while read line; do_stuff_with "${line}"; done
or you need to use command substitution (and optionally assign it to a variable), e.g.
files=$(find "$1" -maxdepth 1 -type f -printf . | wc -c)
A few further notes:
$# expands to the number of positional parameters, not to the directory argument, so [ -d "$#" ] tests for a directory literally named after that count (e.g. 1) and will fail; you want "$1" instead.
The ls is completely superfluous
find works recursively, but I guess you only want the first directory level to be checked so this needs the maxdepth parameter
This will break on weird paths with newlines which can be worked around by telling find to print a character for each found directory/file and then count bytes instead of lines
In case you really don't want this to be recursive it might be easier to just use globbing to obtain the desired result:
$ cat t.sh
#!/bin/bash
for file in "${1-.}"/*; do
    [ -d "${file}" ] && ((directories++))
    [ -f "${file}" ] && ((files++))
done
echo "Number of files: ${files-0}"
echo "Number of directories: ${directories-0}"
$ ./t.sh
Number of files: 6
Number of directories: 1
$ ./t.sh /tmp
Number of files: 9
Number of directories: 3
You might want to check man test and tweak the tests with regard to symbolic links to obtain your desired result.
You seem to be confused about piping here.
You want the output of find ... | wc -l to be expanded in the echo command.
So, your script, given what you want to accomplish should look something like this:
#!/bin/sh
if [ -d "$1" ]; then
    echo "Number of files is $(find "$1" -type f | wc -l)"
    echo "Number of directories is $(find "$1" -type d | wc -l)"
else
    echo "[ERROR] Please provide a directory."
    exit 1
fi
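A quick sanity check of the two counts (note that find -type d includes the top-level directory itself in its output):

```shell
tmp=$(mktemp -d)
touch "$tmp/a.txt" "$tmp/b.txt"
mkdir "$tmp/sub"
find "$tmp" -type f | wc -l    # 2
find "$tmp" -type d | wc -l    # 2 -- "$tmp" itself plus sub/
```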

How to get the number of files in a folder as a variable?

Using bash, how can one get the number of files in a folder, excluding directories from a shell script without the interpreter complaining?
With the help of a friend, I've tried
$files=$(find ../ -maxdepth 1 -type f | sort -n)
$num=$("ls -l" | "grep ^-" | "wc -l")
which returns from the command line:
../1-prefix_blended_fused.jpg: No such file or directory
ls -l : command not found
grep ^-: command not found
wc -l: command not found
respectively. These commands work on the command line, but NOT with a bash script.
Given a folder filled with image files named like 1-pano.jpg, I want to grab all the images in the directory to get the largest numbered file, to tack onto the next image being processed.
Why the discrepancy?
The quotes are causing the error messages.
To get a count of files in the directory:
shopt -s nullglob
numfiles=(*)
numfiles=${#numfiles[@]}
which creates an array and then replaces it with the count of its elements. This will include files and directories, but not dotfiles or . or .. or other dotted directories.
Use nullglob so an empty directory gives a count of 0 instead of 1.
You can instead use find -type f or you can count the directories and subtract:
# continuing from above
numdirs=(*/)
numdirs=${#numdirs[@]}
(( numfiles -= numdirs ))
Also see "How can I find the latest (newest, earliest, oldest) file in a directory?"
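Put together, a minimal sketch of this array-counting approach (non-recursive, dotfiles excluded):

```shell
#!/bin/bash
shopt -s nullglob          # empty directory -> 0, not 1
entries=(*)                # files and directories
dirs=(*/)                  # directories only
numfiles=$(( ${#entries[@]} - ${#dirs[@]} ))
echo "$numfiles files, ${#dirs[@]} directories"
```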
You can have as many spaces as you want inside an execution block. They often aid in readability. The only downside is that they make the file a little larger and may slow initial parsing (only) slightly. There are a few places that must have spaces (e.g. around [, [[, ], ]] and = in comparisons) and a few that must not (e.g. around = in an assignment).
ls -l | grep -v ^d | wc -l
One line.
How about:
count=$(find .. -maxdepth 1 -type f|wc -l)
echo $count
let count=count+1 # Increase by one, for the next file number
echo $count
Note that this solution is not efficient: it spawns sub shells for the find and wc commands, but it should work.
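The let line can also be written with POSIX arithmetic expansion, which avoids the bash-only let (a sketch, keeping the same parent-directory layout as the snippet above):

```shell
count=$(find .. -maxdepth 1 -type f | wc -l)
count=$((count + 1))   # increase by one, for the next file number
echo "$count"
```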
file_num=$(ls -1 --file-type | grep -v '/$' | wc -l)
This is a bit more lightweight than a find command, and counts all files of the current directory.
The most straightforward, reliable way I can think of is using the find command to create a reliably countable output.
Counting characters output of find with wc:
find . -maxdepth 1 -type f -printf '.' | wc --char
or string length of the find output:
a=$(find . -maxdepth 1 -type f -printf '.')
echo ${#a}
or using find output to populate an arithmetic expression:
echo $(($(find . -maxdepth 1 -type f -printf '+1')))
Simple efficient method:
#!/bin/bash
RES=$(find "${SOURCE}" -type f | wc -l)
Get rid of the quotes. The shell is treating them like one file, so it's looking for "ls -l".
Remove the quotes and you will be fine.
Expanding on the accepted answer (by Dennis W): when I tried this approach I got incorrect counts for dirs without subdirs in Bash 4.4.5.
The issue is that by default nullglob is not set in Bash, so numdirs=(*/) sets a 1-element array containing the literal glob pattern */. Likewise, I suspect numfiles=(*) would have 1 element for an empty folder.
Setting shopt -s nullglob, so that non-matching globs expand to nothing, resolves the issue for me. For an excellent discussion of why nullglob is not set by default in Bash, see the answer here: Why is nullglob not default?
Note: I would have commented on the answer directly but lack the reputation points.
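The nullglob effect described above can be seen directly (a minimal bash demonstration in an empty directory):

```shell
tmp=$(mktemp -d)   # empty directory for the demo
cd "$tmp"
a=(*/); echo ${#a[@]}       # 1 -- the unmatched glob is kept literally
shopt -s nullglob
b=(*/); echo ${#b[@]}       # 0 -- the unmatched glob expands to nothing
```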
Here's one way you could do it as a function. Note: you can pass this dirs (for the directory count), files (for the file count), or anything else (e.g. "all") for a count of everything in a directory. It does not traverse the tree, as we aren't looking to do that.
function get_counts_dir() {
    # -- handle inputs (e.g. get_counts_dir "files" /path/to/folder)
    [[ -z "${1}" ]] && type="files" || type="${1,,}"
    [[ -z "${2}" ]] && dir="$(pwd)" || dir="${2}"
    shopt -s nullglob
    prev_dir=$(pwd)              # don't clobber $PWD, which cd maintains
    cd "${dir}" || return 1
    numfiles=(*)
    numfiles=${#numfiles[@]}
    numdirs=(*/)
    numdirs=${#numdirs[@]}
    # -- handle input types files/dirs/or both
    result=0
    case "${type,,}" in
        "files")
            result=$(( numfiles - numdirs ))
            ;;
        "dirs")
            result=${numdirs}
            ;;
        *)  # -- returns all files/dirs
            result=${numfiles}
            ;;
    esac
    cd "${prev_dir}" || return 1
    shopt -u nullglob
    # -- return result --
    [[ -z ${result} ]] && echo 0 || echo ${result}
}
Examples of using the function :
folder="/home"
get_counts_dir "files" "${folder}"
get_counts_dir "dirs" "${folder}"
get_counts_dir "both" "${folder}"
Will print something like :
2
4
6
Short and sweet method which also ignores symlinked directories.
count=$(ls -l | grep ^- | wc -l)
or if you have a target:
count=$(ls -l /path/to/target | grep ^- | wc -l)

How do I get a list of all available shell commands

In a typical Linux shell (bash) it is possible to hit tab twice to get a list of all available shell commands.
Is there a command which has the same behaviour? I want to pipe it into grep and search it.
You could use compgen. For example:
compgen -c
You also could grep it, like this:
compgen -c | grep top$
Source: http://www.cyberciti.biz/open-source/command-line-hacks/compgen-linux-command/
You can list the directories straight from $PATH if you tweak the field separator first. The parens limit the effect to the one command, so use: (...) | grep ...
(IFS=':'; ls -1 $PATH)
"tab" twice & "y" prints all files in the paths of $PATH. So just printing all files in PATH is sufficient.
Just type this in the shell:
# printf "%s\n" ${PATH//:/\/* } > my_commands
This redirects all the commands to the file "my_commands".
This lists all the files in your PATH variable (i.e. runs ls on every directory in PATH). The default user and system commands will be in /bin and /sbin respectively, but on installing some software we add its directory to the PATH variable.
There may be things on your path which aren't actually executable.
#!/bin/sh
for d in ${PATH//:/ }; do
for f in "$d"/*; do
test -x "$f" && echo -n "$f "
done
done
echo ""
This will also print paths, of course. If you only want unqualified filenames, it should be easy to adapt this.
Funny, StackOverflow doesn't know how to handle syntax highlighting for this. :-)
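One possible adaptation for unqualified names (a sketch; note that ${PATH//:/ } is a bashism despite the #!/bin/sh line above, and that paths containing spaces will still break it):

```shell
#!/bin/bash
# Print the bare name of every executable on $PATH, deduplicated.
for d in ${PATH//:/ }; do
  for f in "$d"/*; do
    test -x "$f" && basename "$f"
  done
done | sort -u
```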
Similar to #ghoti, but using find:
#!/bin/sh
for d in ${PATH//:/ }; do
find $d -maxdepth 1 -type f -executable
done
Bash uses a builtin command named 'complete' to implement the tab feature.
I don't have the details to hand, but this should tell you all you need to know:
help complete
(IFS=':'; find $PATH -maxdepth 1 -type f -executable -exec basename {} \; | sort | uniq)
It doesn't include shell builtins though.
An answer got deleted, I liked it most, so I'm trying to repost it:
compgen is of course better
echo $PATH | tr ':' '\n' | xargs -n 1 ls -1
I found this to be the most typical shell approach; I think it also works with other shells (which I doubt of constructs like IFS=':').
Clearly there may be problems if a file is not executable, but I think for my question that is enough - I just want to grep the output, i.e. search for some commands.

Calling linux utilities with options from within a Bash script

This is my first Bash script so forgive me if this question is trivial. I need to count the number of files within a specified directory $HOME/.junk. I thought this would be simple and assumed the following would work:
numfiles= find $HOME/.junk -type f | wc -l
echo "There are $numfiles files in the .junk directory."
Typing find $HOME/.junk -type f | wc -l at the command line works exactly how I expected it to, simply returning the number of files. Why is this not working when it is entered within my script? Am I missing some special notation when it comes to passing options to the utilities?
Thank you very much for your time and help.
You just need to surround it with backticks:
numfiles=`find $HOME/.junk -type f | wc -l`
The term for this is command substitution.
if you are using bash you can also use $() for command substitution, like so:
numfiles=$(find $HOME/.junk -type f | wc -l)
I find this to be slightly more readable than backquotes, as well as having the ability to nest several commands inside one another.
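Nesting is where $() really pays off, since backquotes would need escaping. A small demonstration (dirname and basename do pure string manipulation, no filesystem access):

```shell
# The inner substitution runs first: dirname gives /usr/share,
# then basename gives the last component of that.
parent=$(basename "$(dirname /usr/share/doc)")
echo "$parent"    # share
```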
with bash 4 (if you want recursive):
#!/bin/bash
shopt -s globstar
i=0
for file in **
do
    ((i++))
done
echo "total files: $i"
If not:
#!/bin/bash
shopt -s dotglob
shopt -s nullglob
i=0
for file in *
do
    ((i++))
done
echo "total files: $i"
