BASH test if file name ends with .dylib

I'm walking a file tree in order to identify all .DYLIB files.
#!/bin/bash
# script to recursively traverse a dir of n levels
function traverse() {
    for file in "$1"/*
    do
        if [ ! -d "${file}" ] ; then
            echo "${file} is a file"
        else
            echo "entering recursion with: ${file}"
            traverse "${file}"
        fi
    done
}
function main() {
    traverse "$1"
}
main "$1"
I want to test if the filename ends with .DYLIB before printing "... is a file". I think I may need to add to the condition "if [ ! -d "${file}" ] ; then", but I'm not sure. Is there a way to do this in bash?

No need to write your own recursive function. You can recursively find all *.dylib files using a ** glob:
shopt -s globstar
ls "$1"/**/*.dylib
Or use find:
find "$1" -name '*.dylib'
To use these results I recommend looping over them directly. It avoids using up memory with a temporary array.
shopt -s globstar
for file in "$1"/**/*.dylib; do
    echo "$file"
done
or
while IFS= read -rd '' file; do
    echo "$file"
done < <(find "$1" -name '*.dylib' -print0)
Is there a way I can store everything in a string array so that I can perform an operation on those .dylib files?
But if you do indeed want an array, you can write:
shopt -s globstar
files=("$1"/**/*.dylib)
or
readarray -td '' files < <(find "$1" -name '*.dylib' -print0)
Then to loop over the array you'd write:
for file in "${files[#]}"; do
echo "$file"
done
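From there, any per-file operation goes in the loop body. For instance (using macOS's otool -L purely as an illustration of "an operation" on a dylib; substitute whatever you actually need):
for file in "${files[@]}"; do
    otool -L "$file"   # print each dylib's linked libraries
done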

Don't add to the condition in if [ ! -d "$file" ], because then the else block will try to recurse into files that don't have the suffix. But recursion should only be for directories.
You should add a nested condition for this. You can use bash's [[ ]] condition operator to have = perform wildcard matching.
if [ ! -d "${file}" ] ; then
    if [[ "$file" = *.DYLIB ]]; then
        echo "${file} is a file"
    fi
else
    echo "entering recursion with: ${file}"
    traverse "${file}"
fi
Or you could invert the sense of the directory test:
if [ -d "$file" ]; then
    echo "entering recursion with: ${file}"
    traverse "${file}"
elif [[ "$file" = *.DYLIB ]]; then
    echo "$file is a file"
fi
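Note that these patterns are case-sensitive, and on macOS the extension is normally lowercase .dylib. If the case can vary, one option (a sketch, assuming bash 3.1+ for the nocasematch option) is:
shopt -s nocasematch
if [[ "$file" = *.dylib ]]; then   # now matches .dylib, .DYLIB, .Dylib, ...
    echo "$file is a file"
fi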

Related

Bash: For loops inside for loops

I cannot make this simple script work in bash
# Works
for f in *; do
    for j in $f/Attachments/*.pdf; do
        echo "$j"
    done
done
# Doesn't work
for f in *; do
    for j in $f/Attachments/*.pdf; do
        if [ ! pdfinfo "$j" &> /dev/null ]; then
            echo "$j"
        fi
    done
done
I have read 10+ guides, and I cannot understand why this script lists a bunch of random directories.
It should:
List folders in the current directory
In each folder it should list all PDF-files in the subdirectory Attachments
For each file it should check if it is corrupt, and if so print it
What you want can be achieved with this code snippet:
for f in */Attachments/*.pdf; do
    if ! pdfinfo "$f" &>/dev/null; then
        echo "$f"
    fi
done
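One caveat: if nothing matches, the unexpanded pattern itself is passed to pdfinfo. Enabling nullglob makes an unmatched glob expand to nothing instead (a sketch):
shopt -s nullglob
for f in */Attachments/*.pdf; do   # loop body is skipped entirely if no PDFs exist
    if ! pdfinfo "$f" &>/dev/null; then
        echo "$f"
    fi
done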
In your code, for f in * iterates over all entries (including non-directories). If you want directories only, use for f in */, like this:
for d in */; do
    for f in "$d"Attachments/*.pdf; do
        [[ -f $f ]] || continue
        if ! pdfinfo "$f" &>/dev/null; then
            echo "$f"
        fi
    done
done

Recursively search for files

I am trying to find all files under a given directory name, including all subdirectories, meaning the process is recursive. Here is my code:
myrecursive() {
    if [ -f $1 ]; then
        echo $1
    elif [ -d $1 ]; then
        for i in $(ls $1); do
            if [ -f $1 ]; then
                echo $i
            else
                myrecursive $i
            fi
        done
    else
        echo " sorry"
    fi
}
myrecursive $1
However, when I pass a directory that contains another directory, I get "sorry" twice. Where is my mistake?
The goal you are trying to achieve can be done simply with the find command:
# will search for all files recursively in current directory
find . -type f -exec echo {} \;
# will search for all *.txt files recursively in current directory
find . -name "*.txt" -exec echo {} \;
# will search for all *.txt files recursively in current directory,
# but depth is limited to 3
find . -maxdepth 3 -name "*.txt" -exec echo {} \;
See man find for the manual.
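As a quick illustration of how -exec works: find substitutes {} with each found path. Ending the command with \; runs it once per file, while ending it with + batches many paths into a single invocation, which spawns far fewer processes:
find . -name "*.txt" -exec echo {} +   # one echo invocation for many files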
The problem with your code is quite simple.
The ls command returns bare filenames without the directory prefix, so they
aren't valid paths for recursion. Use globbing instead. The loop below simply replaces $(ls $1) with $1/*
myrecursive() {
    if [ -f $1 ]; then
        echo $1
    elif [ -d $1 ]; then
        for i in $1/*; do
            if [ -f $i ]; then
                echo $i
            else
                myrecursive $i
            fi
        done
    else
        echo " sorry"
    fi
}
myrecursive $1
Hope that helps
#!/bin/bash
myrecursive() {
    if [ -f "$1" ]; then
        echo "$1"
    elif [ -d "$1" ]; then
        for i in "$1"/*; do
            if [ -f "$i" ]; then   # here now our file is $i
                echo "$i"
            else
                myrecursive "$i"
            fi
        done
    else
        echo " sorry"
    fi
}
myrecursive "$1"

How to search for multiple file extensions from shell script

for file in "$1"/*
do
if [ ! -d "${file}" ] ; then
if [[ $file == *.c ]]
then
blah
blah
Above code traverses all the .c files in a directory and does some action.I want to include .cpp,.h,.cc files as well.How can I check multiple file extensions in the same if condition ?
Thanks
You can combine conditions using boolean operators:
if [[ "$file" == *.c ]] || [[ "$file" == *.cpp ]] || [[ "$file" == *.h ]] || [[ "$file" == *.cc ]]; then
    # ...
fi
Another alternative would be to use a regex:
if [[ "$file" =~ \.(c|cpp|h|cc)$ ]]; then
    # ...
fi
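A case statement is another idiomatic way to match several extensions (a sketch; put your action where the comment is):
case "$file" in
    *.c|*.cpp|*.h|*.cc)
        # ... action for matching files ...
        ;;
esac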
Using extended patterns:
# Only necessary prior to bash 4.1; since then,
# extglob is temporarily turned on for the pattern argument to !=
# and =/== inside [[ ... ]]
shopt -s extglob nullglob
for file in "$1"/*; do
    if [[ -f $file && $file = *.@(c|cc|cpp|h) ]]; then
        ...
    fi
done
The extended pattern can also be used to generate the file list; in that case, you definitely need the shopt command:
shopt -s extglob nullglob
for file in "$1"/*.@(c|cc|cpp|h); do
    ...
done
Why not just iterate over selected file extensions?
#!/bin/bash
for file in "${1}"/*.[ch] "${1}"/*.cc "${1}"/*.cpp; do
    if [ -f "$file" ]; then
        # action for the file
        echo "$file"
    fi
done
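With nullglob enabled, an unmatched pattern expands to nothing rather than being passed through literally, so the -f guard becomes unnecessary (a sketch; keep the guard if a directory could match one of the patterns):
shopt -s nullglob
for file in "${1}"/*.[ch] "${1}"/*.cc "${1}"/*.cpp; do
    echo "$file"   # only real matches reach this point
done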

Bash script loop through subdirectories and write to file without using find,ls etc

Sorry for asking this question again. I have already received an answer that uses find, but unfortunately I need to write it without using predefined commands such as find or ls.
I am trying to write a script that will loop recursively through the subdirectories of the current directory. It should check the file count in each directory. If the file count is greater than 10, it should write all the names of these files to a file named "BigList"; otherwise it should write them to a file named "ShortList". The output should look like:
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
My script only works if the subdirectories don't themselves contain subdirectories.
I am confused, because it doesn't work as I expect.
Here is my script
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
    path=$1;
else
    path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
    local files_list=""
    local cnt=0
    for i in "$1"/*;do
        if [ -d "$i" ];then
            echo "dir: $i"
            parent_dir=$i
            echo before recursion
            loop_folder_recurse "$i"
            echo after recursion
            if [ $cnt -ge 10 ]; then
                echo -e "---"$parent_dir >> BigList
                echo -e $file_list >> BigList
            else
                echo -e "---"$parent_dir >> ShortList
                echo -e $file_list >> ShortList
            fi
        elif [ -f "$i" ]; then
            echo file $i
            if [ $cur_fol != $main_pwd ]; then
                file_list+=$i'\n'
                cnt=$((cnt + 1))
            fi
        fi
    done
}
echo "Base path: $path"
loop_folder_recurse $path
How can I modify my script to produce the desired output?
This bash script produces the output that you want:
#!/bin/bash
bigfile="$PWD/BigList"
shortfile="$PWD/ShortList"
shopt -s nullglob
loop_folder_recurse() {
    (
        [[ -n "$1" ]] && cd "$1"
        for i in */; do
            [[ -d "$i" ]] && loop_folder_recurse "$i"
            count=0
            files=''
            for j in *; do
                if [[ -f "$j" ]]; then
                    files+="$j"$'\n'
                    ((++count))
                fi
            done
            if ((count > 10)); then
                outfile="$bigfile"
            else
                outfile="$shortfile"
            fi
            echo "$i" >> "$outfile"
            echo "$files" >> "$outfile"
        done
    )
}
loop_folder_recurse
Explanation
shopt -s nullglob is used so that when a directory is empty, the loop will not run. The body of the function is within ( ) so that it runs within a subshell. This is for convenience, as it means that the function returns to the previous directory when the subshell exits.
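As a minimal illustration of the subshell behaviour: a cd inside ( ... ) is undone when the subshell exits, so the caller's working directory is untouched:
pwd                  # e.g. /home/user
( cd /tmp && pwd )   # prints /tmp
pwd                  # still /home/user: the cd never escaped the subshell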
Hopefully the rest of the script is fairly self-explanatory but if not, please let me know and I will be happy to provide additional explanation.

sh: Test for existence of files

How does one test for the existence of files in a directory using bash?
if ... ; then
    echo 'Found some!'
fi
To be clear, I don't want to test for the existence of a specific file. I would like to test if a specific directory contains any files.
I went with:
(
    shopt -s dotglob nullglob
    existing_files=( ./* )
    if [[ ${#existing_files[@]} -gt 0 ]] ; then
        some_command "${existing_files[@]}"
    fi
)
Using the array avoids race conditions from reading the file list twice.
From the man page:
-f file
True if file exists and is a regular file.
So:
if [ -f someFileName ]; then echo 'Found some!'; fi
Edit: I see you already got the answer, but for completeness, you can use the info in Checking from shell script if a directory contains files - and lose the dotglob option if you want hidden files ignored.
I typically just use a cheap ls -A to see if there's a response.
Pseudo-maybe-correct-syntax-example-ahoy:
if [[ $(ls -A "$my_directory_path_variable") ]]; then ...
Edit, this will work:
shopt -s nullglob
myDir=(./*)
if [ ${#myDir[@]} -gt 0 ]; then echo "there's something down here"; fi
You can use ls in an if statement thus:
if [[ "$(ls -a1 | egrep -v '^\.$|^\.\.$')" = "" ]] ; then echo empty ; fi
or, thanks to ikegami,
if [[ "$(ls -A)" = "" ]] ; then echo empty ; fi
or, even shorter:
if [[ -z "$(ls -A)" ]] ; then echo empty ; fi
These basically list all files in the current directory (including hidden ones) that are neither . nor ...
If that list is empty, then the directory is empty.
If you want to discount hidden files, you can simplify it to:
if [[ "$(ls)" = "" ]] ; then echo empty ; fi
A bash-only solution (without invoking external programs like ls or egrep) can be written as follows:
emp=Y; for i in *; do if [[ $i != "*" ]]; then emp=N; break; fi; done; echo $emp
It's not the prettiest code in the world, it simply sets emp to Y and then, for every real file, sets it to N and breaks from the for loop for efficiency. If there were zero files, it stays as Y.
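Wrapped in a reusable function (a sketch; the two extra patterns catch hidden files, while . and .. can never match them):
is_empty() {
    local f
    for f in "$1"/* "$1"/.[!.]* "$1"/..?*; do
        [ -e "$f" ] && return 1   # something real matched: not empty
    done
    return 0   # every pattern was left unexpanded: empty
}
is_empty /tmp/somedir && echo empty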
Try this
if [ -f /tmp/foo.txt ]
then
    echo the file exists
fi
ref: http://tldp.org/LDP/abs/html/fto.html
How about this to check whether a directory is empty or not:
$ find "/tmp" -type f -exec echo Found file {} \;
#!/bin/bash
if [ -e "$1" ]; then
    echo "File exists"
else
    echo "File does not exist"
fi
I don't have a good pure sh/bash solution, but it's easy to do in Perl:
#!/usr/bin/perl
use strict;
use warnings;
die "Usage: $0 dir\n" if scalar @ARGV != 1 or not -d $ARGV[0];
opendir my $DIR, $ARGV[0] or die "$ARGV[0]: $!\n";
my @files = readdir $DIR;
closedir $DIR;
if (scalar @files == 2) { # . and ..
    exit 0;
}
else {
    exit 1;
}
Call it something like emptydir and put it somewhere in your $PATH, then:
if emptydir dir ; then
    echo "dir is empty"
else
    echo "dir is not empty"
fi
It dies with an error message if you give it no arguments, two or more arguments, or an argument that isn't a directory; it's easy enough to change if you prefer different behavior.
# tested on Linux BASH
# an empty directory has a hard link count of exactly 2 (itself and its
# entry in the parent); note this counts subdirectories, not plain files
directory=$1
if test $(stat -c %h "$directory") -gt 2;
then
    echo "not empty"
else
    echo "empty"
fi
For fun:
if ( shopt -s nullglob ; perl -e'exit !@ARGV' ./* ) ; then
    echo 'Found some!'
fi
(Doesn't check for hidden files)