Print the structure of a folder with a recursive function in a bash shell script

I want to print the structure of a folder with a shell script, so the output would look like this:
File : linux-3.14/COPYING
File : linux-3.14/CREDITS
Directory : linux-3.14/Documentation
File : linux-3.14/Documentation/00-INDEX
Directory : linux-3.14/Documentation/ABI
File : linux-3.14/Documentation/ABI/README
And this is my script. The problem is that it prints all files and folders of the current directory, but not those of the subfolders. Maybe I'm doing the recursion wrong:
dirPrint() {
    # Find all files and print them first
    file=$1
    for f in $(ls ${file}); do
        if [ -f ${f} ]; then
            path="$(pwd)/$f"
            echo "File: $path"
        fi
    done
    # Find all directories and print them
    for f in $(ls ${file}); do
        if [ -d ${f} ]; then
            path="$(pwd)/$f"
            echo "Directory: $path"
            echo " $(dirPrint "$path")"
        fi
    done
}
if [ $# -eq 0 ]; then
    dirPrint .
else
    dirPrint "$1"
fi
Also, what is the difference between using $1, "$1" and "${1}"?

There are various problems in your script. You shouldn't parse the output of ls; iterate over the expansion of a wildcard instead. Always double-quote variable expansions to prevent spaces in filenames from breaking your commands.
#! /bin/bash
dir_find () {
    local dir=$1
    local indent=$2
    for f in "$dir"/* ; do
        printf '%s%s\n' "$indent" "${f##*/}"
        if [[ -d $f ]] ; then
            dir_find "$f" "    $indent"
        fi
    done
}
dir_find .
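As for the last question: "$1" and "${1}" behave identically; the braces only matter when the expansion is directly followed by text that could be parsed as part of the name (e.g. "${1}x"). Unquoted $1 additionally undergoes word splitting and globbing. A small sketch (the helper functions are just for illustration):

```shell
#!/bin/bash
# "$1" vs "${1}": identical here. The braces are only needed when the
# expansion is followed by text that would otherwise be parsed as part of
# the name, e.g. "${1}_bak". Unquoted $1 is split on whitespace (and
# glob-expanded), which is what breaks scripts on names with spaces.

count_unquoted() {
    set -- $1        # unquoted: the value is re-split into words
    echo "$#"
}

count_quoted() {
    set -- "$1"      # quoted: the value stays one word
    echo "$#"
}

arg="two words"
count_unquoted "$arg"    # prints 2
count_quoted "$arg"      # prints 1
```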

Related

bash - test if files and folders exist in a for statement

I'm trying to print all files and folders of a directory.
The problem is that the print outside the if (echo "$f out of if") is OK, but the code never executes the echos inside the if statements.
I'm sure that the files and folders exist (they are printed outside the if) and I checked the permissions.
dirPath is /tmp/testFolder.
for f in `ls $dirPath`; do
    echo "$f out of if"
    if [ -d $f ]; then
        echo "$f is a directory"
    fi
    if [ -f $f ]; then
        echo "$f is a file"
    fi
done
EDIT:
Thank you all for the answers! I tried this solution, but found the problem that "/tmp/testFolder" ($dirPath) itself is also recognized as a folder inside the for loop (and I need to find the subfolders only):
shopt -s globstar
for f in "$dirPath"/**; do
    echo "$f out of if"
    if [ -d $f ]; then
        echo "$f is a directory"
    fi
    if [ -f $f ]; then
        echo "$f is a file"
    fi
done
The problem with your code is that the f variable doesn't contain the full path to the file(s) in your target directory. So unless the target is the current directory, the tests for "directory" and "file" are always going to fail.
You could prefix $f with $dirPath in your if statements (such as if [ -d "$dirPath/$f" ]; then echo "$dirPath/$f is a directory"; fi). Or better yet, use glob expansion in the for loop instead, which gives you the full path to each file.
#!/bin/bash
for f in "$dirPath"/*; do
    if [ -d "$f" ]; then
        echo "$f is a directory"
    elif [ -f "$f" ]; then
        echo "$f is a file"
    fi
done
The code in the if doesn't execute because the file in question doesn't exist in the current directory, which is what your code is checking. That is, $f does not include $dirPath (the directory prefix) the way your code is written.
Better:
for f in "$dirPath"/*; do
    echo "$f out of if"
    if [ -d "$f" ]; then
        echo "$f is a directory"
    fi
    if [ -f "$f" ]; then
        echo "$f is a file"
    fi
done
See also the following list of bash pitfalls:
for f in $(ls *.mp3)
[ -n $foo ] or [ -z $foo ]
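The second pitfall is worth a quick demonstration: when foo is empty, the unquoted test collapses to [ -n ], and a one-argument [ merely checks that its argument is a non-empty string (here the literal "-n"), so it is always true:

```shell
#!/bin/bash
# Why [ -n $foo ] is a pitfall: with foo empty, the test degenerates to
# [ -n ], which is true, because test treats the lone "-n" as a plain
# non-empty string rather than as an operator.

is_set_unquoted() {
    local foo=""
    if [ -n $foo ]; then echo yes; else echo no; fi
}

is_set_quoted() {
    local foo=""
    if [ -n "$foo" ]; then echo yes; else echo no; fi
}

is_set_unquoted    # prints yes -- wrong
is_set_quoted      # prints no  -- correct
```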
The problem is that $f is a relative path: $f is test.file, but you want to check /tmp/testFolder/test.file. The proper solution would be:
for f in "$dirPath"/*; do
    echo "$f out of if"
    if [ -d "$f" ]; then
        echo "$f is a directory"
    fi
    if [ -f "$f" ]; then
        echo "$f is a file"
    fi
done
The immediate problem is that f is set to only the filename, not the entire path. That is, if /tmp/testFolder contains a file named example.txt, f will be set to just "example.txt", so if [ -d $f ]; then is checking for a file named "example.txt" in the current working directory, not in /tmp/testFolder. One option would be to use if [ -d $dirPath/$f ]; then, but there's a better way.
As a general rule, you shouldn't parse the output of ls because its output is ambiguous in several ways. To get the list of files in a particular directory, just use "$dirPath"/* -- this expands to the matching files without any of the ambiguity or parsing problems of ls. Bonus: it includes the specified path as part of each result (for example, /tmp/testFolder/* might give "/tmp/testFolder/example.txt").
Another suggestion: you should (almost) always put variable references in double-quotes, e.g. "$f" instead of just $f.
Fixing all of this gives:
for f in "$dirPath"/*; do    # Note that $dirPath should be quoted, but * cannot be
    echo "$f out of if"
    if [ -d "$f" ]; then
        echo "$f is a directory"
    fi
    if [ -f "$f" ]; then
        echo "$f is a file"
    fi
done
Note that the echo commands will also give the full path. If you don't want that, you can either use the basename command to get just the name portion, e.g. f_name="$(basename "$f")", or (as @melpomene pointed out) use the expansion "${f##*/}" (which trims everything up to the last "/" in the variable):
for f in "$dirPath"/*; do
    f_name="${f##*/}"    # The quotes are not strictly needed in an assignment, but do no harm
    echo "$f_name out of if"
    if [ -d "$f" ]; then
        echo "$f_name is a directory"
    fi
    if [ -f "$f" ]; then
        echo "$f_name is a file"
    fi
done
Oh, and there's one possible downside to using a wildcard instead of ls: it'll return the raw wildcard if there are no matches. It doesn't matter in this case, but for places where it does you can either start the loop with [ -e "$f" ] || continue (i.e. skip the loop if there's nothing actually there), or if you're using bash (not just a generic shell) you can set the nullglob shell option (shopt -s nullglob).
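The [ -e "$f" ] || continue guard can be sketched in a few lines (the temporary directory is only for the demonstration):

```shell
#!/bin/bash
# Count the entries in a directory, guarding against the case where the
# wildcard matches nothing and is returned literally.
count_entries() {
    local dir=$1 n=0 f
    for f in "$dir"/*; do
        [ -e "$f" ] || continue    # skip the unexpanded "*" on no match
        n=$((n + 1))
    done
    echo "$n"
}

demo=$(mktemp -d)        # throwaway directory for the demo
count_entries "$demo"    # prints 0 (without the guard it would print 1)
touch "$demo/a" "$demo/b"
count_entries "$demo"    # prints 2
rm -r "$demo"
```

With shopt -s nullglob the guard becomes unnecessary, since an unmatched pattern expands to nothing and the loop body never runs.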
One more recommendation: shellcheck.net is good at spotting common scripting mistakes, so I recommend running your scripts through it for suggestions.
Based on your edit, you need only compare "$f" against "$datapath/" (actually using !=) to detect and avoid printing the "$datapath" directory itself. Continuing from your example (and incorporating all the comments about not using `for i in `ls anything``), you could do:
#!/bin/bash
shopt -s globstar               ## enable globstar
[ -z "$1" ] && {                ## validate at least 1 argument provided
    printf "error: insufficient input\nusage: %s path\n" "${0##*/}" >&2
    exit 1
}
datapath="$1"                   ## set datapath
[ -d "$datapath" ] || {         ## validate argument is a directory
    printf "error: invalid path - not a directory.\n" >&2
    exit 1
}
## output absolute file/directory names
for f in "$datapath"/**; do     ## find files & subdirs below datapath
    if [ -d "$f" ]; then        ## am I a directory that isn't datapath?
        [ "$f" != "$datapath/" ] && printf "directory: %s\n" "$f"
    elif [ -f "$f" ]; then      ## am I a file?
        printf "filename : %s\n" "$f"
    else                        ## am I neither a file nor a directory?
        printf "unknown  : %s\n" "$f"
    fi
done
Now if you only want the relative filenames below "$datapath", you simply trim "$datapath/" from the beginning of each filename using the parameter expansion ${parameter#word} (where word expands to the pattern you want removed, e.g. "$datapath/" here):
printf "\nwith relative path\n\n"
## output relative file/directory names
for f in "$datapath"/**; do
    relpath="${f#$datapath/}"   ## trim $datapath/ from f
    if [ -d "$f" ]; then
        [ -n "$relpath" ] && printf "directory: %s\n" "$relpath"
    elif [ -f "$f" ]; then
        printf "filename : %s\n" "$relpath"
    else
        printf "unknown  : %s\n" "$relpath"
    fi
done
Example including both above on the following directory structure:
Example Directory
$ tree ~/dev/src-c/tmp/tst/sha2dcr
/home/david/dev/src-c/tmp/tst/sha2dcr
├── chkpass.c
├── dat
│   ├── pw
│   ├── users.bin
│   ├── users.bin.sav
│   └── users.txt
├── getpass.c
├── getpass.h
├── readuserdb.c
├── sha256d.c
├── sha256d.h
├── sha256dtst.c
├── sha256fread.c
├── sha256fread_timed.c
└── userdb.c
Example Script Use/Output
Including both absolute and relative paths, you would have:
$ bash ~/scr/utl/filelist.sh ~/dev/src-c/tmp/tst/sha2dcr
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/chkpass.c
directory: /home/david/dev/src-c/tmp/tst/sha2dcr/dat
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/pw
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.bin
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.bin.sav
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/dat/users.txt
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/getpass.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/getpass.h
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/readuserdb.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256d.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256d.h
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256dtst.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256fread.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/sha256fread_timed.c
filename : /home/david/dev/src-c/tmp/tst/sha2dcr/userdb.c
with relative path
filename : chkpass.c
directory: dat
filename : dat/pw
filename : dat/users.bin
filename : dat/users.bin.sav
filename : dat/users.txt
filename : getpass.c
filename : getpass.h
filename : readuserdb.c
filename : sha256d.c
filename : sha256d.h
filename : sha256dtst.c
filename : sha256fread.c
filename : sha256fread_timed.c
filename : userdb.c
(note: the output of directory: /home/david/dev/src-c/tmp/tst/sha2dcr/ is suppressed in both cases)
Look things over and let me know if you have further questions.
(note: I use all lower-case variable names, in case you notice an otherwise unexplained difference from your copy...)

How to accommodate spaces in a variable in a bash and iterate in directory tree in linux

I am writing a bash script to iterate over a directory and its sub-directories and check whether each file is opened by a process:
if yes, move it to another location;
if no, skip it.
My issue is that the source folders have spaces in their names, such as "FTP SYNC LOCAL".
My script is so far able to iterate over the folders and subfolders and test whether a file is opened by another process.
It only does this if the file name doesn't contain a space; if it does, nothing happens.
print_folder_recurse() {
    for i in "$1"/*; do
        if [ -d "$i" ]; then
            echo $i
            #lsof "$i" | grep Serv-U | wc -l
            print_folder_recurse "$i"
        elif [ -f "$i" ]; then
            echo $i
            flag=$(lsof "$i" | grep Serv-U | wc -l)
            if [ $flag == 0 ]; then
                echo "Done"
            elif [ $flag != 0 ]; then
                echo "Skip Next"
            fi
        fi
    done
}
path=""
if [ -d "$1" ]; then
    path=$1;
else
    direct="/Source/FTP Sync"
    echo $direct
    path="$direct"
fi
#echo "base path: $path"
print_folder_recurse $path
The problem is at the bottom of the code, with the variable "direct". If I write it like this instead:
    direct="/Source"
    echo $direct
    path="$direct"
fi
#echo "base path: $path"
print_folder_recurse $path
the script executes.
I could prevent the issue by renaming the folder to Source/FTP_Sync, but I can't do that since it would affect a major workflow.
Any help will be appreciated.
The print_folder_recurse function reads the $path variable as two separate arguments because of the space in the value, i.e. $1 = /Source/FTP and $2 = Sync. The solution is to wrap the $path variable in double quotes, like this: print_folder_recurse "$path", so that print_folder_recurse receives it as a single argument.
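The effect is easy to see with a tiny hypothetical helper that just reports how many arguments it received:

```shell
#!/bin/bash
# How many arguments does a function actually receive?
arg_count() { echo "$#"; }

path="/Source/FTP Sync"    # example path containing a space
arg_count $path            # prints 2 -- split into "/Source/FTP" and "Sync"
arg_count "$path"          # prints 1 -- passed through as a single argument
```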

directory isn't recognized as being a directory

I would like to make a bash function that lists all the directories (and files) inside a given directory.
searchInRepo() {
    file_list=`ls $1`
    #echo $file_list
    for aFile in $file_list; do
        echo "$aFile --"
        # case : directory
        if [ -d $aFile ]; then
            echo "$aFile ***"
            cd $aFile
            searchInRepo $aFile
            cd ..
        # case : file
        elif [ -f $aFile ]; then
            echo "$aFile is a regular file"
        fi
    done
}
As you can see, this is a recursive function. When I call it with ls $1 (listing the argument's files) it doesn't recognize the directories as directories. When I just use ls (no argument involved) everything works fine.
Any suggestions here ?
Cheers !
Why use ls when bash can do it for you? This normalizes the argument so it ends with a trailing /*, which means it also works with bare directory names (note: the original version left searchpath unset when the argument already ended in /*, so that case is handled explicitly here):
if [[ "$1" =~ /\*$ ]]; then
    searchpath="$1"         # already ends in /*
elif [[ "$1" =~ /$ ]]; then
    searchpath="$1*"        # ends in /, just append *
else
    searchpath="$1/*"       # bare directory name, append /*
fi
echo "Searching $searchpath"
for f in $searchpath; do    # deliberately unquoted so the glob expands
    if [ -d "$f" ]; then
        echo "Directory -> $f"
    else
        echo "File -> $f"
    fi
done
Why even use a for loop? find is made for this.
find . # List all files and directories recursively
find . -type f # List only files
find . -maxdepth 1 # List all files and directories in current directory
If you are using git, you can also use git ls-files

Bash script loop through subdirectories and write to file without using find,ls etc

Sorry for asking this question again. I have already received an answer using find, but unfortunately I need to write it without using any predefined commands.
I am trying to write a script that loops recursively through the subdirectories of the current directory. It should check the file count in each directory. If the file count is greater than 10, it should write the names of those files to a file named "BigList"; otherwise it should write them to "ShortList". The output should look like this:
---<directory name>
    <filename>
    <filename>
    <filename>
    <filename>
    ....
---<directory name>
    <filename>
    <filename>
    <filename>
    <filename>
    ....
My script only works if the subdirectories don't contain subdirectories in turn.
I am confused because it doesn't work as I expect.
Here is my script
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
    path=$1;
else
    path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
    local files_list=""
    local cnt=0
    for i in "$1"/*; do
        if [ -d "$i" ]; then
            echo "dir: $i"
            parent_dir=$i
            echo before recursion
            loop_folder_recurse "$i"
            echo after recursion
            if [ $cnt -ge 10 ]; then
                echo -e "---"$parent_dir >> BigList
                echo -e $file_list >> BigList
            else
                echo -e "---"$parent_dir >> ShortList
                echo -e $file_list >> ShortList
            fi
        elif [ -f "$i" ]; then
            echo file $i
            if [ $cur_fol != $main_pwd ]; then
                file_list+=$i'\n'
                cnt=$((cnt + 1))
            fi
        fi
    done
}
echo "Base path: $path"
loop_folder_recurse $path
How can I modify my script to produce the desired output?
This bash script produces the output that you want:
#!/bin/bash
bigfile="$PWD/BigList"
shortfile="$PWD/ShortList"
shopt -s nullglob
loop_folder_recurse() {
    (
        [[ -n "$1" ]] && cd "$1"
        for i in */; do
            [[ -d "$i" ]] && loop_folder_recurse "$i"
            count=0
            files=''
            for j in *; do
                if [[ -f "$j" ]]; then
                    files+="$j"$'\n'
                    ((++count))
                fi
            done
            if ((count > 10)); then
                outfile="$bigfile"
            else
                outfile="$shortfile"
            fi
            echo "$i" >> "$outfile"
            echo "$files" >> "$outfile"
        done
    )
}
loop_folder_recurse
Explanation
shopt -s nullglob is used so that when a directory is empty, the loop will not run. The body of the function is within ( ) so that it runs within a subshell. This is for convenience, as it means that the function returns to the previous directory when the subshell exits.
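The subshell behaviour is easy to verify in isolation: a cd inside ( ) is confined to the parentheses, which is why the recursive calls never need a matching cd ..:

```shell
#!/bin/bash
# A cd inside ( ) does not affect the calling shell.
visit() {
    (
        cd "$1" || exit 1
        pwd > /dev/null    # do some work in the target directory
    )
}

start=$PWD
visit /                    # enter the root directory in a subshell
[ "$PWD" = "$start" ] && echo "still in the original directory"
```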
Hopefully the rest of the script is fairly self-explanatory but if not, please let me know and I will be happy to provide additional explanation.

shell script only runs as desired in certain directories

I am writing a Bourne shell script that essentially has the same functionality as ls.
Here is my code.
#!/bin/sh
echo "\n"
if [ "$#" -eq 0 ]; then
    SEARCH_DIR=`pwd`
fi
if [ "$#" -gt 0 ]; then
    SEARCH_DIR=$1
    if [ ! -d "$SEARCH_DIR" ]; then
        echo "Directory Does Not Exist - - - Exiting"
        echo "\n"
        exit
    fi
fi
DIR_CONTENT=`ls $SEARCH_DIR`
for file in $DIR_CONTENT; do
    if [ -f "$file" ]; then
        echo "f\c"
    fi
    if [ -d "$file" ]; then
        echo "d\c"
    fi
    if [ ! -f "$file" ] && [ ! -d "$file" ]; then
        echo "-\c"
    fi
    if [ -r "$file" ]; then
        echo "r\c"
    else
        echo "-\c"
    fi
    if [ -w "$file" ]; then
        echo "w\c"
    else
        echo "-\c"
    fi
    if [ -x "$file" ]; then
        echo "x\c"
    else
        echo "-\c"
    fi
    echo ' \c'
    echo "$file"
done
echo "\n"
When I execute the script, I get the desired output for that specific directory:
For example:
$ ./dirinfo
dirinfo version 0.1
drwx Desktop
frwx dirinfo
frw- #dirinfo#
frwx dirinfo~
frwx dirinfo2~
But if I try to pass an argument for a different directory the script doesn't seem to acknowledge my if statements.
For example:
$ ./dirinfo /bin
dirinfo version 0.1
---- bash
---- bunzip2
---- busybox
---- bzcat
---- bzcmp
But if I execute the script from the /bin directory I get the desired effect:
$ cd /bin
$ ~/dirinfo
dirinfo version 0.1
fr-x bash
fr-x bunzip2
fr-x busybox
fr-x bzcat
fr-x bzcmp
Could someone please attempt to point me in the right direction? Thanks!
I don't have a bash to test right now, but maybe $file doesn't have the full path, so evaluating -r or -w would not work. When you cd to the destination directory, the files are at ./.
Yes, as user430051 mentioned, you are running the script from one directory while listing files from another, which will not work.
The solution is to prepend your search dir to the filename, like this:
for file in $DIR_CONTENT; do
    file="$SEARCH_DIR/$file"
    if [ -f "$file" ]; then
        echo "f\c"
    fi
and it should work.
There can be many ways to solve your problem, but the simplest solution would be to add the one line that is missing (i.e. cd "$SEARCH_DIR"); just add it to your script after DIR_CONTENT=`ls $SEARCH_DIR` and your script will behave as you expect.
The main difference between this solution and the one given above by Nachiket is that here the file names in the output will not carry the absolute path, which I guess is your expectation.
DIR_CONTENT=`ls $SEARCH_DIR`
cd "$SEARCH_DIR"
for file in $DIR_CONTENT; do
    if [ -f "$file" ]
