Bash: Find any subdirectories without a given file present - bash

I want to know if my file exists in any of the subdirectories below. The subdirectories are created in earlier steps of my shell script. The code below always tells me the file does not exist (even if it does), and I want the path to be printed as well.
#!/bin/bash
....
if ! [[ -e [ **/**/somefile.txt && -s **/**/somefile.txt ]]; then
echo "===> Warn: somefile.txt was not created in the following path: "
# I want to be able to print the path in which file is not generated
exit 1
fi
I know the file name is somefile.txt, which is to be created in all subdirectories, but the subdirectory names change a lot; hence the globbing.

#!/bin/bash
shopt -s globstar ## enable **, which by default behaves like a plain *
for d in **/; do
    if ! [[ -s "$d/somefile.txt" ]]; then
        echo "===> WARN: somefile.txt was not created (or is empty) in $d" >&2
        exit 1
    fi
done
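If you would rather report every subdirectory that is missing the file, instead of exiting at the first one, a small variation on the same idea (a sketch, still assuming bash 4+ for globstar) might be:
#!/bin/bash
shopt -s globstar
missing=()                                # subdirectories lacking a non-empty somefile.txt
for d in **/; do
    [[ -s "${d}somefile.txt" ]] || missing+=("$d")
done
if ((${#missing[@]})); then
    printf '===> WARN: somefile.txt was not created (or is empty) in %s\n' "${missing[@]}" >&2
    exit 1
fi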


For files in directory Bash [duplicate]

I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:
#!/bin/bash
for filename in "$1"/*; do
echo "File:"
echo $filename
done
And I am running the above using:
sh test.sh path/to/loop/over
However, the above doesn't output the files at the directory path/to/loop/over, it instead outputs:
File:
path/to/loop/over/*
I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:
File:
foo.txt
File:
bar.txt
Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding /* after the $1; however, this doesn't seem to help (neither do these suggestions).
Iterate over content of directory
A compatible answer (not only bash)
As this question is tagged shell, there is a POSIX-compatible way:
#!/bin/sh
for file in "$1"/* ; do
    [ -f "$file" ] && echo "Process '$file'."
done
That will be enough (and it works with filenames containing spaces):
$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.
This works with any POSIX shell. Tested with bash, ksh, dash, zsh and busybox sh.
#!/bin/sh
cd "$1" || exit 1
for file in * ; do
    [ -f "$file" ] && echo "Process '$file'."
done
This version won't print the path:
$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
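One edge case: if the directory is empty, the shell leaves the pattern "$1"/* unexpanded and the loop runs once with that literal string. The [ -f "$file" ] test already filters it out, but you can make the intent explicit (a POSIX sketch):
#!/bin/sh
for file in "$1"/* ; do
    [ -e "$file" ] || continue            # glob did not match anything: directory is empty
    [ -f "$file" ] && echo "Process '$file'."
done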
Some bash ways
Introduction
I don't like to use shopt when it isn't needed... (It changes standard
bash behaviour and makes scripts less readable.)
There is an elegant way of doing this in standard bash, without requiring shopt.
Of course, the previous answer works fine under bash, but there are some
interesting ways to make your script more powerful, flexible, pretty, and detailed...
Sample
#!/bin/bash
die() { echo >&2 "$0 ERROR: $@";exit 1;} # Emergency exit function
[ "$1" ] || die "Argument missing." # Exit unless argument submitted
[ -d "$1" ] || die "Arg '$1' is not a directory." # Exit if argument is not dir
cd "$1" || die "Can't access '$1'." # Exit unless access dir.
files=(*) # All files names in array $files
[ -f "$files" ] || die "No files found." # Exit if no files found
for file in "${files[#]}";do # foreach file:
echo Process "$file" # Process file
done
Explanation: considering globbing vs real files
When doing:
files=(/path/to/dir/*)
variable $files becomes an array containing all files contained under /path/to/dir/:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
But if nothing matches the glob pattern, the star won't be replaced and the array becomes:
declare -p files
declare -a files=([0]="/path/to/dir/*")
From there, looking at $files is like looking at ${files[0]}, i.e. the first field in the array. So
[ -f "$files" ] || die "No files found."
will execute the die function unless the first field of the array files is a file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).
But you could replace this filesystem test with a string-based test, like:
[ "$files" = "/path/to/dir/*" ] && die "No files found."
or, using array length:
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."
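Or, as a sketch relying on bash's nullglob option (discussed further below): a non-matching glob then expands to nothing, so the array is simply empty and its length tells you everything (reusing the die helper from above):
shopt -s nullglob
files=(/path/to/dir/*)
((${#files[@]})) || die "No files found."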
Dropping paths by using Parameter expansion:
To strip the path from filenames, instead of cd $path you could do:
targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."
Then:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
You could then run:
printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo
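If the prefix isn't known in advance, a sketch using the related expansion ${files[@]##*/}, which removes everything up to and including the last slash of each element, gives the same listing:
printf 'File: %s\n' "${files[@]##*/}"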
This output would happen if the directory is empty or its name is misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)
You can verify this easily for yourself. In a directory with four files,
sh$ echo *
a file or two
sh$ echo [ot]*
or two
sh$ echo n*
n*
And in Bash,
bash$ echo n*
n*
bash$ shopt -s nullglob
bash$ echo n*
I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/
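Applied to the script from the question, a sketch using nullglob, so the loop body simply never runs when the directory is empty or the pattern matches nothing:
#!/bin/bash
shopt -s nullglob                 # non-matching globs expand to nothing
for filename in "$1"/*; do
    echo "File:"
    echo "$filename"
done
Note that the script then has to be run with bash test.sh path/to/loop/over rather than sh test.sh, since shopt is a bash builtin.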

How to identify files which are not in list using bash?

Unfortunately my knowledge of bash is not great and I have a very non-standard task.
I have a file containing a list of files.
Example:
/tmp/my/file1.txt
/tmp/my/file2.txt
How can I write a script which checks that the files from the folder /tmp/my exist, and produces two types of messages after the script is done?
1 - The files exist; show the files:
/tmp/my/file1.txt
/tmp/my/file2.txt
2 - The folder /tmp/my includes files and folders which are not in your list; show those files and folders:
/tmp/my/test
/tmp/my/1.txt
You speak of files and folders, which seems unclear.
Anyway, I wanted to try it with arrays, so here we go:
unset valid_paths; declare -a valid_paths
unset invalid_paths; declare -a invalid_paths
while read -r line
do
    if [ -e "$line" ]
    then
        valid_paths=("${valid_paths[@]}" "$line")
    else
        invalid_paths=("${invalid_paths[@]}" "$line")
    fi
done < files.txt
echo "VALID PATHS:"; echo "${valid_paths[@]}"
echo "INVALID PATHS:"; echo "${invalid_paths[@]}"
You can check for the files' existence (assuming a list of files, one filename per line) and print the existing ones with a prefix using this:
# Part 1 - check list contents for files
while read -r thefile; do
    if [[ -n "$thefile" ]] && [[ -f "/tmp/my/$thefile" ]]; then
        echo "Y: $thefile"
    else
        echo "N: $thefile"
    fi
done < filelist.txt | sort
# Part 2 - check existing files against list
for filepath in /tmp/my/* ; do
    filename="$(basename "$filepath")"
    grep -q "$filename" filelist.txt || echo "U: $filename"
done
The files that exist are prefixed here with Y:, and all others are prefixed with N:.
In the second section, files in the /tmp/my directory that are not in the file list are labelled with U: (unaccounted for/unexpected).
You can swap the -f test, which checks that a path exists and is a regular file, for -d (exists and is a directory) or -e (exists).
See
man test
for more options.
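One caveat for the second section: plain grep "$filename" matches substrings, so an actual file named 1.txt would be treated as listed just because it matches the entry file1.txt as a substring. A stricter sketch uses fixed-string, whole-line matching:
# Part 2 (stricter) - require an exact, literal match in the list
for filepath in /tmp/my/* ; do
    filename="$(basename "$filepath")"
    grep -Fxq -- "$filename" filelist.txt || echo "U: $filename"
done
Here -F treats the name as a literal string (no regex metacharacters) and -x requires the whole line to match.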

Bash: Creating subdirectories reading from a file

I have a file that contains some keywords, and I intend to create a subdirectory named after each keyword in the same directory using a bash script. Here is the code I am using, but it doesn't seem to be working.
I don't know where I have gone wrong. Help me out.
for i in `cat file.txt`
do
# if [[ ! -e $path/$i ]]; then
echo "creating" $i "directory"
mkdir $path/$i
# fi
grep $i file >> $path/$i/output.txt
done
echo "created the files in "$path/$TEMP/output.txt
You've gone wrong by looping over the output of cat file.txt, and by leaving your variables unquoted. Read the file line by line instead, and quote your expansions:
while read i
do
echo "Creating $i directory"
mkdir "$path/$i"
grep "$i" file >> "$path/$i"/output.txt
done < file.txt
echo "created the files in $path/$TEMP/output.txt"
mkdir will refuse to create a directory if parts of the path do not exist.
e.g. if there is no /foo/bar directory, then mkdir /foo/bar/baz will fail.
You can relax this a bit by using the -p flag, which will create parent directories if necessary (in the example, it might create /foo and /foo/bar).
You should also use quotes, in case your paths contain blanks:
mkdir -p "${path}/${i}"
Finally, make sure that you are actually allowed to create directories in $path.
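Putting those points together, a hedged sketch of the whole loop, assuming $path is already set and file.txt holds one keyword per line (file is the data file from the question):
#!/bin/bash
path=${path:?path must be set}            # stop early if $path is empty or unset
while IFS= read -r i; do
    [ -n "$i" ] || continue               # skip blank lines
    echo "Creating $i directory"
    mkdir -p "$path/$i" || exit 1
    grep -- "$i" file >> "$path/$i/output.txt"
done < file.txt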

make alias for ls so that it doesn't show files of the pattern *~

Is there a series of commands that does ls then removes backup files? I want to do something like
ls | grep -v *~
but this shows all the files on separate lines; is there any way to make the output otherwise identical to plain ls?
When I type man ls, my man page for ls has this -B option:
-B Force printing of non-printable characters (as defined by ctype(3)
and current locale settings) in file names as \xxx, where xxx is the
numeric value of the character in octal.
It is not identical to the one you showed, and I searched for "ignored" but no results popped up. By the way, I am on a Mac, which might have a different version of ls?
Alternatively, can I tell a directory to stop making backup files?
Assuming ls from GNU coreutils,
-B, --ignore-backups
do not list implied entries ending with ~
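With GNU ls, then, the alias the question asks for can be as simple as (a one-line sketch):
alias ls='ls -B'    # hide *~ backup files in every listing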
You can also set GLOBIGNORE='*~' in Bash so that * never expands to filenames ending in ~ (in Bash, FIGNORE only affects filename completion).
You can list all files not ending in ~ with:
ls -d *[^~]
The *[^~] specifies all files that don't end in ~. The -d flag tells ls not to show the directory contents for any directories that it matches (as with the default ls command).
Edit: If you alias your ls to use the command above, it will break the standard ls usage, so you're better off using ephemient's solution if you want your ls usage to always exclude backup files.
For people forced to use ls that doesn't have -B (e.g., using BSD ls in Mac OS X), you can create an alias to a bash function that is based on Mansoor Siddiqui's suggestion. If you add the following function to your bash profile where you keep your aliases (.bash_profile, .profile, .bashrc, .bash_aliases, or equivalent):
ls_no_hidden() {
    nonflagcount=0
    ARG_ARRAY=("${@}")
    flags="-l"
    curdir=`pwd`
    shopt -s nullglob
    # Iterate through args, find all flags (arg starting with dash (-))
    for (( i = 0; i < $# ; i++ )); do
        if [[ ${ARG_ARRAY[${i}]} == -* ]]; then
            flags="${flags} ${ARG_ARRAY[${i}]}";
        else
            ((nonflagcount++));
        fi
    done
    if [[ $nonflagcount -eq 0 ]]; then
        # ls current directory if no non-flag args provided
        FILES=`echo *[^~#]`
        # check if files are present, before calling ls
        # to suppress errors if no matches.
        if [[ -n $FILES ]]; then
            ls $flags -d *[^~#]
        fi
    else
        # loop through all args, and run ls for each non-flag
        for (( i = 0; i < $# ; i++ )); do
            if [[ ${ARG_ARRAY[${i}]} != -* ]]; then
                # if directory, enter the directory
                if [[ -d ${ARG_ARRAY[${i}]} ]]; then
                    cd ${ARG_ARRAY[${i}]}
                    # check that the cd was successful before calling ls
                    if [[ $? -eq 0 ]]; then
                        pwd # print directory you are listing (feel free to comment out)
                        FILES=`echo *[^~#]`
                        if [[ -n $FILES ]]; then
                            ls $flags -d *[^~#]
                        fi
                        cd $curdir
                    fi
                else
                    # if file, list the file
                    if [[ -f ${ARG_ARRAY[${i}]} ]]; then
                        ls $flags ${ARG_ARRAY[${i}]}
                    else
                        echo "Directory/File not found: ${ARG_ARRAY[${i}]}"
                    fi
                fi
            fi
        done
    fi
}
alias l=ls_no_hidden
Then l will be mapped to ls but not show files that end in ~ or #.

How can I trap a return of two variables with an if statement using bash?

My task is to list a user's folder in /Users on a mac. I have to allow for dupe folders (large enterprise of 650 mac clients) or where a desktop analyst has backed up a folder and appended something. My $fourFour variable picks that up. However, I must flag that for logging.
This is as far as I have got; see below. The pattern in $fourFour may match one or more folders, and I need the if statement to echo this accordingly.
folders=$(ls -d */ | grep $fourFour | awk '{print $(NF)}' | sed 's/\///')
echo folders is $folders
if [[ "$folders" == "" ]]; then
echo no items
else
echo one or more items
fi
Do not parse the output of ls unless you absolutely have to. Your code above has major issues with whitespace in folder names.
Bash arrays can be your friend:
#!/bin/bash
shopt -s nullglob
folders=(*$fourFour*/)
# Remove the trailing slashes
folders=("${folders[#]%/}")
if [[ "${#folders[#]}" -gt 0 ]]; then
echo "Folders:" "${folders[#]}"
else
echo "No folders"
fi
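If you need to flag each match for logging, as the question mentions, you can then loop over the array (a sketch; the log file path is hypothetical):
for dir in "${folders[@]}"; do
    printf 'Flagged possible duplicate folder: %s\n' "$dir" >> /var/log/user-folder-audit.log
done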
You don't have to call so many tools to find your folders. Just use the shell (bash):
#!/bin/bash
shopt -s nullglob
for dir in *$fourFour*/ # putting a slash "/" ensures you get directory entries
do
echo "Do something with $dir"
# if you want to check if its empty folder
v=$(echo "$dir"/*)
case "${#v}" in
0) echo "No files in $dir";;
*) echo "Files in $dir";;
esac
done
If you just want to check whether there are any folders that matched your pattern:
v=$(echo *$fourFour*/)
case "${#v}" in
    0) echo "no items";;
    *) echo "one or more items";;
esac
