Bash: For loops inside for loops

I cannot make this simple script work in bash:
# Works
for f in *; do
for j in $f/Attachments/*.pdf; do
echo "$j"
done;
done
# Doesn't work
for f in *; do
for j in $f/Attachments/*.pdf; do
if [ ! pdfinfo "$j" &> /dev/null ]; then
echo "$j"
fi
done;
done
I have read 10+ guides, and I cannot understand why this script lists a bunch of random directories.
It should:
List folders in the current directory
In each folder it should list all PDF-files in the subdirectory Attachments
For each file it should check if it is corrupt, and if so print it

The problem is the test itself: [ is the test command and does not run other commands, so [ ! pdfinfo "$j" ] never actually executes pdfinfo. To branch on a command's exit status, run the command directly with if ! pdfinfo "$j". What you want can be achieved with this snippet:
for f in */Attachments/*.pdf; do
if ! pdfinfo "$f" &>/dev/null; then
echo "$f"
fi
done
In your code, for f in * iterates over every entry in the current directory, files and directories alike. If you want directories only, you should have used for f in */, like this:
for d in */; do
for f in "$d"Attachments/*.pdf; do
[[ -f $f ]] || continue
if ! pdfinfo "$f" &>/dev/null; then
echo "$f"
fi
done
done
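If a folder has no Attachments subdirectory (or no PDFs in it), the unexpanded pattern itself gets passed to pdfinfo; the [[ -f $f ]] || continue guard above covers that. As an alternative sketch, assuming you only care about regular files, nullglob makes unmatched patterns expand to nothing:
shopt -s nullglob   # unmatched globs disappear instead of staying literal
for f in */Attachments/*.pdf; do
if ! pdfinfo "$f" &>/dev/null; then
echo "$f"
fi
done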


BASH test if file name ends with .dylib

I'm walking a file tree in order to identify all .DYLIB files.
#!/bin/bash
#script to recursively travel a dir of n levels
function traverse() {
for file in "$1"/*
do
if [ ! -d "${file}" ] ; then
echo "${file} is a file"
else
echo "entering recursion with: ${file}"
traverse "${file}"
fi
done
}
function main() {
traverse "$1"
}
main "$1"
I want to test if the filename ends with .DYLIB before printing "... is a file". I think I may need to add to the condition "if [ ! -d "${file}" ] ; then", but I'm not sure. Is there a way to do this in bash?
No need to write your own recursive function. You can recursively find all *.dylib files using a ** glob:
shopt -s globstar
ls "$1"/**/*.dylib
Or use find:
find "$1" -name '*.dylib'
To use these results I recommend looping over them directly. It avoids using up memory with a temporary array.
shopt -s globstar
for file in "$1"/**/*.dylib; do
echo "$file"
done
or
while IFS= read -rd '' file; do
echo "$file"
done < <(find "$1" -name '*.dylib' -print0)
Is there a way I can store everything in a string array so that I can perform an operation on those .dylib files?
But if you do indeed want an array, you can write:
shopt -s globstar
files=("$1"/**/*.dylib)
or
readarray -td '' files < <(find "$1" -name '*.dylib' -print0)
Then to loop over the array you'd write:
for file in "${files[#]}"; do
echo "$file"
done
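For example, to then run an operation on each collected .dylib (otool -L is just an illustrative choice; on macOS it prints the libraries a Mach-O file links against):
for file in "${files[@]}"; do
otool -L "$file"
done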
Don't add to the condition in if [ ! -d "$file" ], because then the else branch would try to recurse into files that don't have the suffix, and recursion should only happen for directories.
You should add a nested condition instead. You can use bash's [[ ]] conditional, where = (or ==) performs wildcard matching:
if [ ! -d "${file}" ] ; then
if [[ "$file" = *.DYLIB ]]; then
echo "${file} is a file"
fi
else
echo "entering recursion with: ${file}"
traverse "${file}"
fi
Or you could invert the sense of the directory test:
if [ -d "$file" ]; then
echo "entering recursion with: ${file}"
traverse "${file}"
elif [ "$file" = *.DYLIB ]; then
echo "$file is a file"
fi
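The question mixes .DYLIB and .dylib; if the files on disk may use either case, a small hedged sketch using nocasematch (or find's -iname) makes the comparison case-insensitive:
shopt -s nocasematch   # [[ ]] and case patterns now match case-insensitively
if [[ "$file" == *.dylib ]]; then   # matches .dylib, .DYLIB, .Dylib, ...
echo "$file is a file"
fi
# or, with find: find "$1" -iname '*.dylib'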

can't read all file lines in bash pipeline

I searched and couldn't find anything; maybe I don't understand the problem properly.
I have a bash function that reads files in the current dir and subdirectories. I'm trying to arrange the text and analyze the data, but somehow I'm losing lines when I use a pipeline.
The code:
function recursiveFindReq {
for file in *.request; do
if [[ -f "$file" ]]; then
echo handling "$file"
echo ---------------with pipe-----------------------
cat "$file" | while read -a line; do
if (( ${#line} > 1 )); then
echo ${line[*]}
fi
done
echo ----------------without pipe----------------------
cat "$file"
echo
echo num of lines: `cat "$file" | wc -l`
echo --------------------------------------
fi
done
for dir in ./*; do
if [[ -d "$dir" ]]; then
echo cd to $dir
cd "$dir"
recursiveFindReq "$1"
cd ..
fi
done
}
The output is:
[screenshot: lines are lost even when they meet the requirement]
I marked with two red arrows the places where I'm losing info.
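One common reason a cat "$file" | while read loop appears to drop lines is a final line with no trailing newline: read returns non-zero for it, so the loop body is skipped for that line. A hedged sketch of the usual rewrite, keeping the original word-splitting logic, would be:
# read the file directly instead of piping from cat; -r keeps backslashes;
# the || clause still processes a last line that lacks a newline
while read -r -a line || (( ${#line[@]} > 0 )); do
# note: ${#line} is the character length of the first word only;
# ${#line[@]} would be the number of words
if (( ${#line} > 1 )); then
echo "${line[*]}"
fi
done < "$file"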

Bash: check all files in a directory for their extension

I am writing a shell script that reads all the files in the directory given by user input and counts how many files have a given extension. I just started learning Bash and I am not sure why it is not locating the files or reading the directory. I am only handling two extensions as an example, but my count is always 0.
This is how I run my script:
$./check_ext.sh /home/user/temp
my script check_ext.sh
#!/bin/bash
count1=0
count2=0
for file in "ls $1"
do
if [[ $file == *.sh ]]; then
echo "is a txt file"
(( count1++ ))
elif [[ $file == *.mp3 ]]; then
echo "is a mp3 file"
(( count2++ ))
fi
done;
echo $count $count2
"ls $1" doesn't execute ls on $1, it just a plain string. Command substitution syntax is $(ls "$1")
However there is no need to use ls, just use globbing:
count1=0
count2=0
for file in "$1"/*; do
if [[ $file == *.sh ]]; then
echo "is a txt file"
(( count1++ ))
elif [[ $file == *.mp3 ]]; then
echo "is a mp3 file"
(( count2++ ))
fi
done
echo "counts: $count1 $count2"
for file in "$1"/* will iterate through all the files/directories in the directory denoted by $1
EDIT: For doing it recursively inside a directory:
count1=0
count2=0
while IFS= read -r -d '' file; do
if [[ $file == *.sh ]]; then
echo "is a txt file"
(( count1++ ))
elif [[ $file == *.mp3 ]]; then
echo "is a mp3 file"
(( count2++ ))
fi
done < <(find "$1" -type f -print0)
echo "counts: $count1 $count2"
POSIXly (here : is the no-op command, so : "$((count1+=1))" performs the increment through arithmetic expansion without relying on Bash's (( )) construct):
count1=0
count2=0
for f in "$1"/*; do
case $f in
(*.sh) printf '%s is a txt file\n' "$f"; : "$((count1+=1))" ;;
(*.mp3) printf '%s is a mp3 file\n' "$f"; : "$((count2+=1))" ;;
esac
done
printf 'counts: %d %d\n' "$count1" "$count2"
You can use Bash arrays for this too. If you only want to deal with the extensions sh and mp3:
#!/bin/bash
shopt -s nullglob
shs=( "$1"/*.sh )
mp3s=( "$1"/*.mp3 )
printf 'counts: %d %d\n' "${#shs[@]}" "${#mp3s[@]}"
If you want to deal with more extensions, you can generalize this process:
#!/bin/bash
shopt -s nullglob
exts=( sh mp3 gz txt )
counts=()
for ext in "${exts[#]}"; do
files=( "$1"/*."$ext" )
counts+=( "${#files[@]}" )
done
printf 'counts:'
printf ' %d' "${counts[@]}"
echo
If you want to deal with all extensions (using associative arrays, available in Bash ≥ 4):
#!/bin/bash
shopt -s nullglob
declare -A exts
for file in "$1"/*.*; do
ext=${file##*.}
((++exts[$ext]))
done
for ext in "${!exts[#]}"; do
printf '%s: %d\n' "$ext" "${exts[$ext]}"
done

Bash script: loop through subdirectories and write to file without using find, ls, etc.

Sorry for asking this question again. I have already received an answer that uses find, but unfortunately I need to write this without using such predefined commands.
I am trying to write a script that loops recursively through the subdirectories of the current directory. It should check the file count in each directory; if the file count is greater than 10, it should write the names of all these files to a file named "BigList", otherwise it should write them to a file named "ShortList". The output should look like this:
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
My script only works if the subdirectories don't themselves contain subdirectories.
I am confused because it doesn't work as I expect.
Here is my script:
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
path=$1;
else
path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
local files_list=""
local cnt=0
for i in "$1"/*;do
if [ -d "$i" ];then
echo "dir: $i"
parent_dir=$i
echo before recursion
loop_folder_recurse "$i"
echo after recursion
if [ $cnt -ge 10 ]; then
echo -e "---"$parent_dir >> BigList
echo -e $file_list >> BigList
else
echo -e "---"$parent_dir >> ShortList
echo -e $file_list >> ShortList
fi
elif [ -f "$i" ]; then
echo file $i
if [ $cur_fol != $main_pwd ]; then
file_list+=$i'\n'
cnt=$((cnt + 1))
fi
fi
done
}
echo "Base path: $path"
loop_folder_recurse $path
How can I modify my script to produce the desired output?
This bash script produces the output that you want:
#!/bin/bash
bigfile="$PWD/BigList"
shortfile="$PWD/ShortList"
shopt -s nullglob
loop_folder_recurse() {
(
[[ -n "$1" ]] && cd "$1"
for i in */; do
[[ -d "$i" ]] && loop_folder_recurse "$i"
count=0
files=''
for j in *; do
if [[ -f "$j" ]]; then
files+="$j"$'\n'
((++count))
fi
done
if ((count > 10)); then
outfile="$bigfile"
else
outfile="$shortfile"
fi
echo "$i" >> "$outfile"
echo "$files" >> "$outfile"
done
)
}
loop_folder_recurse
Explanation
shopt -s nullglob is used so that when a directory is empty, the loop does not run at all. The body of the function is wrapped in ( ) so that it runs in a subshell; this is for convenience, as it means the function returns to the previous directory automatically when the subshell exits.
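As a minimal illustration of that subshell trick (the paths are just examples): a cd inside ( ) only changes the directory for the subshell, so the caller's working directory is untouched.
demo() {
(
cd /tmp || exit 1
pwd   # prints /tmp inside the subshell
)
pwd   # still prints the original directory: the cd did not leak out
}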
Hopefully the rest of the script is fairly self-explanatory but if not, please let me know and I will be happy to provide additional explanation.

Check if each file in a list (which is in a file) exists in bash

I have a text file (ListOfAllFiles.txt) that has a list of 500 files, some of which exist and some of which don't.
I'd like to make two text files that indicate which files exist and which don't.
This is my code thus far:
#!/bin/bash
for f in $(cat /path/to/ListOfAllFiles.txt)
do
if [[ -f $f ]]; then
echo $f > /path/to/FilesFound.txt
else
echo $f > /path/to/FilesNOTFound.txt
fi
done
What am I doing wrong?
Your biggest problem is that each pass through the loop will overwrite either /path/to/FilesFound.txt or /path/to/FilesNOTFound.txt; instead of using >, you should be using >>. Fixing that, and making other improvements for robustness, we get:
#!/bin/bash
echo -n > /path/to/FilesFound.txt # reset to empty file
echo -n > /path/to/FilesNOTFound.txt # reset to empty file
while IFS= read -r f ; do
if [[ -f "$f" ]]; then
echo "$f" >> /path/to/FilesFound.txt
else
echo "$f" >> /path/to/FilesNOTFound.txt
fi
done < /path/to/ListOfAllFiles.txt
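To make the difference concrete, a quick illustration of > versus >> (out.txt is just an example file):
echo one > out.txt     # out.txt contains: one
echo two > out.txt     # > truncates first, so out.txt now contains only: two
echo three >> out.txt  # >> appends, so out.txt contains: two, three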
