I'm trying to print all files and folders of a directory.
The problem is that the echo outside the if (echo "$f out of if") prints fine, but the code never executes the echo statements inside the if blocks.
I'm sure the files and folders exist (they are printed outside the if) and I have checked the permissions.
dirPath is /tmp/testFolder.
for f in `ls $dirPath`; do
echo "$f out of if"
if [ -d $f ]; then
echo "$f is a directory"
fi
if [ -f $f ]; then
echo "$f is a file"
fi
done
EDIT:
Thank you all for answering! I tried this solution, but I ran into the problem that "/tmp/testFolder" (dirPath) is itself recognized as a folder inside the for-loop (and I need to find the subfolders only):
shopt -s globstar
for f in "$dirPath"/**; do
echo "$f out of if"
if [ -d $f ]; then
echo "$f is a directory"
fi
if [ -f $f ]; then
echo "$f is a file"
fi
done
The problem with your code is that the f variable doesn't contain the full path to the file(s) in your target directory. So unless your current directory happens to be the target directory, the tests for "directory" and "file" are always going to fail.
You could prefix $f with $dirPath in your if statements (such as if [ -d "$dirPath/$f" ]; then echo "$dirPath/$f is a directory"; fi). Or, better yet, use glob expansion in the for loop instead, which gives you the full path to each file.
#!/bin/bash
for f in "$dirPath"/*; do
if [ -d "$f" ]; then
echo "$f is a directory"
elif [ -f "$f" ]; then
echo "$f is a file"
fi
done
The code inside the if doesn't execute because there is no namefile1 in the current directory, which is what your code is actually checking. That is, $f does not include $dirPath (the directory prefix) the way your code is written.
Better:
for f in "$dirPath"/*; do
echo "$f out of if"
if [ -d "$f" ]; then
echo "$f is a directory"
fi
if [ -f "$f" ]; then
echo "$f is a file"
fi
done
See also the following list of bash pitfalls:
for f in $(ls *.mp3)
[ -n $foo ] or [ -z $foo ]
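For example, parsing ls falls apart as soon as a name contains whitespace (a minimal sketch with a hypothetical file name):
touch "/tmp/testFolder/my song.mp3"   # hypothetical file whose name contains a space
for f in $(ls /tmp/testFolder); do    # word splitting breaks the name into "my" and "song.mp3"
    echo "got: $f"
done
for f in /tmp/testFolder/*; do        # the glob keeps each name intact (with its full path)
    echo "got: $f"
done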
The problem is that $f is a relative path: $f is test.file, but you want to check /tmp/testFolder/test.file. The proper solution would be:
for f in "$dirPath"/*; do
echo "$f out of if"
if [ -d "$f" ]; then
echo "$f is a directory"
fi
if [ -f "$f" ]; then
echo "$f is a file"
fi
done
The immediate problem is that f is set to only the filename, not the entire path. That is, if /tmp/testFolder contains a file named example.txt, f will be set to just "example.txt", so if [ -d $f ]; then checks for a file named "example.txt" in the current working directory, not in /tmp/testFolder. One option would be to use if [ -d "$dirPath/$f" ]; then, but there's a better way.
As a general rule, you shouldn't parse the output of ls because its output is ambiguous in several ways. To get a list of files in a particular directory, just use "$dirPath"/* -- this gets a list of matching files without any of the ambiguity or parsing problems you'll have with ls. Bonus: it includes the specified path as part of the result (for example, /tmp/testFolder/* might give "/tmp/testFolder/example.txt").
Another suggestion: you should (almost) always put variable references in double-quotes, e.g. "$f" instead of just $f.
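To see why the quotes matter, here is a quick sketch with a hypothetical name containing a space:
f="/tmp/testFolder/my file.txt"   # hypothetical path with a space in it
[ -f $f ]                         # expands to: [ -f /tmp/testFolder/my file.txt ] -> "too many arguments"
[ -f "$f" ]                       # expands to a single argument and works as intended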
Fixing all of this gives:
for f in "$dirPath"/*; do # Note that $dirPath should be quoted, but * cannot be
echo "$f out of if"
if [ -d "$f" ]; then
echo "$f is a directory"
fi
if [ -f "$f" ]; then
echo "$f is a file"
fi
done
Note that the echo commands will also print the full path. If you don't want that, you can either use the basename command to get just the name portion, e.g. f_name="$(basename "$f")", or (as @melpomene pointed out) use the expansion "${f##*/}" (which trims everything up to and including the last "/" in the variable):
for f in "$dirPath"/*; do
f_name="${f##*/}" # The quotes are not strictly needed in an assignment, but do no harm
echo "$f_name out of if"
if [ -d "$f" ]; then
echo "$f_name is a directory"
fi
if [ -f "$f" ]; then
echo "$f_name is a file"
fi
done
Oh, and there's one possible downside to using a wildcard instead of ls: if there are no matches, the glob is left unexpanded, so the loop runs once with the raw wildcard (e.g. "/tmp/testFolder/*") in $f. It doesn't matter in this case, but for places where it does you can either start the loop with [ -e "$f" ] || continue (i.e. skip the iteration if there's nothing actually there), or if you're using bash (not just a generic shell) you can set the nullglob shell option (shopt -s nullglob).
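For example, with nullglob set, a glob that matches nothing simply disappears (a sketch assuming an empty directory):
shopt -s nullglob                 # unmatched globs expand to nothing instead of themselves
for f in /tmp/emptyFolder/*; do   # hypothetical empty directory: the loop body never runs
    echo "found: $f"
done
shopt -u nullglob                 # restore the default behaviour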
One more recommendation: shellcheck.net is good at spotting common scripting mistakes, so I recommend running your scripts through it for suggestions.
Based on your edit, you need only compare whether "$f" is equal to "$dataPath/" to detect and avoid printing the "$dataPath" directory itself (actually using !=). Continuing from your example (and incorporating all the comments about not using for i in `ls anything`...), you could do:
#!/bin/bash
shopt -s globstar ## enable globstar
[ -z "$1" ] && { ## validate at least 1 argument provided
printf "error: insufficient input\nusage: %s path\n" "${0##*/}" >&2
exit 1
}
datapath="$1" ## set datapath
[ -d "$datapath" ] || { ## validate argument is a directory
printf "error: invalid path - not a directory.\n" >&2
exit 1;
}
## output absolute file/directory names
for f in "$datapath"/**; do ## find files & subdirs below datapath
if [ -d "$f" ]; then ## am I a directory that isn't datapath
[ "$f" != "$datapath/" ] && printf "directory: %s\n" "$f"
elif [ -f "$f" ]; then ## am I a file?
printf "filename : %s\n" "$f"
else ## am I neither a file nor a directory?
printf "unknown : %s\n" "$f"
fi
done
Now if you only want the relative filenames below "$dataPath", you simply need to trim "$dataPath/" from the beginning of each filename using the parameter expansion ${parameter#word} (where word is the pattern you want to strip, e.g. "$dataPath/" here):
printf "\nwith relative path\n\n"
## output relative file/directory names
for f in "$datapath"/**; do
relpath="${f#$datapath/}" ## trim $datapath/ from f
if [ -d "$f" ]; then
[ -n "$relpath" ] && printf "directory: %s\n" "$relpath"
elif [ -f "$f" ]; then
printf "filename : %s\n" "$relpath"
else
printf "unknown : %s\n" "$relpath"
fi
done
Example, including both of the above, on the following directory structure:
Example Directory
$ tree ~/dev/src-c/tmp/tst/sha2dcr
/home/david/dev/src-c/tmp/tst/sha2dcr
├── chkpass.c
├── dat
│ ├── pw
│ ├── users.bin
│ ├── users.bin.sav
│ └── users.txt
├── getpass.c
├── getpass.h
├── readuserdb.c
├── sha256d.c
├── sha256d.h
├── sha256dtst.c
├── sha256fread.c
├── sha256fread_timed.c
└── userdb.c
Example Script Use/Output
Including both absolute and relative paths, you would have:
$ bash ~/scr/utl/filelist.sh ~/dev/src-c/tmp/tst/sha2dcr
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/chkpass.c
directory: /home/david/dev/src-c/tmp/tst/sha2dcr/dat
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/pw
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.bin
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.bin.sav
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.txt
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/getpass.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/getpass.h
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/readuserdb.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256d.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256d.h
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256dtst.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256fread.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256fread_timed.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/userdb.c
with relative path
filename : chkpass.c
directory: dat
filename : dat/pw
filename : dat/users.bin
filename : dat/users.bin.sav
filename : dat/users.txt
filename : getpass.c
filename : getpass.h
filename : readuserdb.c
filename : sha256d.c
filename : sha256d.h
filename : sha256dtst.c
filename : sha256fread.c
filename : sha256fread_timed.c
filename : userdb.c
(note: the output of directory: /home/david/dev/src-c/tmp/tst/sha2dcr/ is suppressed in both cases)
Look things over and let me know if you have further questions.
(note: I use all lower-case variable names, in case the difference from your $dataPath looks like an unexplained copy error...)
Related
I want to print the structure of a folder with a shell script, so it would look like this:
File : linux-3.14/COPYING
File : linux-3.14/CREDITS
Directory : linux-3.14/Documentation
File : linux-3.14/Documentation/00-INDEX
Directory : linux-3.14/Documentation/ABI
File : linux-3.14/Documentation/ABI/README
and this is my script. The problem is that it prints all files and folders of the current directory, but it does not print them for the subfolders. Maybe I am doing the recursion wrong.
dirPrint() {
# Find all files and print them first
file=$1
for f in $(ls ${file}); do
if [ -f ${f} ];
then
path="$(pwd)/$f"
echo "File: $path"
fi
done
# Find all directories and print them
for f in $(ls ${file}); do
if [ -d ${f} ];
then
path="$(pwd)/$f"
echo "Directory: $path"
echo " $(dirPrint "$path")"
fi
done
}
if [ $# -eq 0 ]; then
dirPrint .
else
dirPrint "$1"
fi
And also what is the difference between using $1, "$1" and "${1}"?
There are various problems in your script. You shouldn't parse the output of ls; iterate over the expansion of a wildcard instead. Always double-quote your variables to prevent spaces in filenames from breaking your commands.
#! /bin/bash
dir_find () {
local dir=$1
local indent=$2
for f in "$dir"/* ; do
printf '%s%s\n' "$indent" "${f##*/}"
if [[ -d $f ]] ; then
dir_find "$f" " $indent"
fi
done
}
dir_find .
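As for your last question: $1 unquoted undergoes word splitting and glob expansion, "$1" does not, and the braces in "${1}" change nothing for $1 by itself -- they only matter when the name would otherwise be misread, e.g. "${10}" (the tenth argument) versus "$10" ($1 followed by a literal 0), or "${dir}name" versus "$dirname". A quick sketch:
set -- "my file.txt"        # hypothetical single argument containing a space
printf '<%s>\n' $1          # two words: <my> and <file.txt>
printf '<%s>\n' "$1"        # one word: <my file.txt>
printf '<%s>\n' "${1}"      # identical to "$1" here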
I'm writing code to swap the upper- and lower-case letters in the file names in a certain directory.
If the directory is not accessible, it shows an error message.
#!/bin/bash
if [ $# -eq 0 ];then
path=$(pwd)
for a in "$path"/*
do
mv "$i" "`echo $i | tr '[:upper:][:lower:]' '[:lower:][:upper:]'`"
done
else
if [ ! -d "$1" ];then
echo "Unable to access directory!"
else
for i in "$1"/*
do
mv "$i" "`echo $i | tr '[:upper:][:lower:]' '[:lower:][:upper:]'`"
done
fi
fi
The problem is that when I echo $i, it gives not ONLY the filename, but the filename with the directory!
So when I try to mv the file, not only the file's name is changed but ALSO the directory's name, so I can't mv the file.
Like this:
mv: cannot move 'test3/Ipad.txt' to 'TEST3/iPAD.TXT': No such file or directory
mv: cannot move 'test3/iPhone' to 'TEST3/IpHONE': No such file or directory
mv: cannot move 'test3/macOS' to 'TEST3/MACos': No such file or directory
How can I change the file names in a certain directory?
Any help would be appreciated, and thanks in advance.
I'd use a single loop for handling both cases.
#! /bin/sh -
case $# in
( 0 ) path=. ;;
( * ) path=$1
esac
for fpath in "$path"/*; do
echo mv -- "$fpath" "${fpath%/*}/$(
printf '%s\n' "${fpath##*/}" |
tr '[a-zA-Z]' '[A-Za-z]')"
done
If the path given by the user is not present or accessible, this will error out without causing any harm. But if you insist on handling that yourself, add a check before the loop, or use nullglob with bash and the script will exit silently in that case.
Also note that files whose names contain neither upper- nor lower-case letters are ignored here: nothing will happen to them, but mv will complain that source and target are the same.
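If you want to silence those complaints, a small guard (just a sketch building on the same loop) can skip names that the case swap leaves unchanged:
for fpath in "$path"/*; do
    name="${fpath##*/}"
    new="$(printf '%s\n' "$name" | tr '[a-zA-Z]' '[A-Za-z]')"
    [ "$name" = "$new" ] && continue          # nothing to rename, skip it
    echo mv -- "$fpath" "${fpath%/*}/$new"    # drop the echo once the output looks right
done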
You may want to cd to the directory first, then work on filenames in the directory:
#!/bin/bash
if [ $# -eq 0 ]; then
path=$(pwd)
cd "$path"
for i in *
do
mv "$i" "`echo $i | tr '[:upper:][:lower:]' '[:lower:][:upper:]'`"
done
cd -
else
if [ ! -d "$1" ]; then
echo "Unable to access directory!"
else
cd "$1"
for i in *
do
mv "$i" "`echo $i | tr '[:upper:][:lower:]' '[:lower:][:upper:]'`"
done
cd -
fi
fi
I'm trying to iterate over a folder, running a grep on each file and putting the results into separate files tagged with a .res extension. Here's what I have so far...
#!/bin/bash
directory=$(pwd)
searchterms="searchterms.txt"
extension=".end"
usage() {
echo "usage: fmat [[[-f file ] [-d directory ] [-e ext]] | [-h]]"
echo " file - text file containing a return-delimited list of materials"
echo " directory - directory to process"
echo " ext - file extension of files to process"
echo ""
}
while [ "$1" != "" ]; do
case $1 in
-d | --directory ) shift
directory=$1
;;
-f | --file ) shift
searchterms=$1
;;
-e | --extension ) shift
extension=$1
;;
-h | --help ) usage
exit
;;
* ) usage
exit 1
esac
shift
done
if [ ! -d "$directory" ]; then
echo "Sorry, the directory '$directory' does not exist"
exit 1
fi
if [ ! -f "$searchterms" ]; then
echo "Sorry, the searchterms file '$searchterms' does not exist"
exit 1
fi
echo "Searching '$directory' ..."
for file in "${directory}/*"; do
printf "File: %s\n" ${file}
[ -e "$file" ] || continue
printf "%s\n" ${file}
if [ ${file: -3} == ${extension} ]; then
printf "%s will be processed\n" ${file}
#
# lots of processing here
#
fi
done
I know that it's down to my poor understanding of globbing... but I can't get the test on the extension to work.
Essentially, I want to be able to specify a source directory, a file with search terms, and an extension to search for.
NOW, I realise there may be quicker ways to do this, e.g.
grep -f searchterms.txt *.end > allchanges.end.res
but I may have other processing I need to do to the files, and I want to save the results into separate files: so bing.end and bong.end would be grep'ed into bing.end.res and bong.end.res.
Please let me know just how stupid I'm being ;-)
Just for completeness' sake, here's the last part, working, thanks to @chepner and @Gordon Davisson:
echo "Searching '$directory' ..."
for file in "${directory}"/*; do
[ -e "$file" ] || continue
# show which files will be processed
if [[ $file = *.${extension#.} ]]; then
printf "Processing %s \n" "$file"
head -n 1 "${file}" > "${file}.res"
grep -f "$searchterms" "${file}" >> "${file}.res"
fi
done
You just need to leave the * out of the quotes, so that it isn't treated as a literal *:
for file in "${directory}"/*; do
Unlike most languages, the quotes don't define a string (as everything in bash is already a string: it's the only data type). They simply escape each character inside the quotes. "foo" is exactly the same as \f\o\o, which (because escaping most characters doesn't really have any effect) is the same as foo. Quoted or not, all characters not separated by word-splitting characters are part of the same word.
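A quick way to see the difference (with a hypothetical directory):
dir=/tmp/testFolder
printf '%s\n' "$dir/*"    # one word: the literal string /tmp/testFolder/*
printf '%s\n' "$dir"/*    # one word per entry inside /tmp/testFolder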
http://shellcheck.net will catch this, although not with the most useful error message. (It will also catch the other parameter expansions that you did not quote but should have.)
I would like to make a bash function that lists all the directories (and files) inside a given directory.
searchInRepo(){
file_list=`ls $1`
#echo $file_list
for aFile in $file_list; do
echo "$aFile --"
# case : directory
if [ -d $aFile ]
then
echo "$aFile ***"
cd $aFile
searchInRepo $aFile
cd ..
# case : file
elif [ -f $aFile ]
then
echo "$aFile is a regular file"
fi
done
}
As you can see, this is a recursive function. When I call it with ls $1 (listing the parameter's files) it doesn't recognize the directories as directories. When I just use ls (no argument involved) everything works fine.
Any suggestions here?
Cheers!
Why use ls when bash can do it for you? This will check to make sure the argument has a trailing /* so it will work with bare directory names.
if [[ ! "$1" =~ /\*$ ]]
then
if [[ ! "$1" =~ /$ ]]
then
searchpath="$1/*"
else
searchpath="$1*"
fi
fi
echo "Searching $searchpath"
for f in $searchpath; do
if [ -d "$f" ]
then
echo "Directory -> $f"
else
echo "File -> $f"
fi
done
Why even use a for loop? find is made for this.
find . # List all files and directories recursively
find . -type f # List only files
find . -maxdepth 1 # List all files and directories in current directory
If you are using git, you can also use git ls-files
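For the file/directory labelling from the question, a find-based version could look like this (a sketch; read -d '' needs bash, and -print0/-mindepth assume GNU or BSD find):
find "$dirPath" -mindepth 1 -print0 |
while IFS= read -r -d '' f; do
    if [ -d "$f" ]; then
        echo "Directory: $f"
    elif [ -f "$f" ]; then
        echo "File: $f"
    fi
done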
I have a basic script to edit config files in ~/.config - it works with the cd lines, but that seems redundant:
dir=$HOME/.config/$1
if [ ! -d "$dir" ]; then :
else
cd "$dir" &&
for file in * ; do
case "$file" in
conf | config | *.cfg | *rc) $EDITOR "$file" ;;
*) : ;;
esac
done
cd - 1>/dev/null;
fi
Changing it to use the variable "$dir" fails. What am I doing wrong?
dir=$HOME/.config/$1
if [ ! -d "$dir" ]; then :
else
for file in "$dir" ; do
case "$file" in
conf | config | *.cfg | *rc) $EDITOR "$file" ;;
*) : ;;
esac
done;
fi
You're not globbing the files inside $dir, merely listing $dir itself. Try $dir/*.
You can't use just "$dir" because that gives just a single item: the directory. You need $dir/*, but that includes the path as well, so you have to strip that off to compare just the file name:
...
for file in "$dir"/*; do
filename=$(basename "$file")
case "$filename" in
...
Your first version does a * on the directory, yielding a list of all the files in the directory. Your second version just has one entry in the list --- the directory itself, not its contents.
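Putting that together with the original script, a sketch of the loop without the cd lines might look like this (matching just the bare file name, as above):
dir=$HOME/.config/$1
if [ -d "$dir" ]; then
    for file in "$dir"/*; do
        case "${file##*/}" in                # strip the path, compare only the name
            conf | config | *.cfg | *rc) $EDITOR "$file" ;;
            *) : ;;
        esac
    done
fi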