Check if a file exists inside a "Variable" Path - bash

I am trying to find out whether a file exists in an iPhone application's directory.
Unfortunately, the app's directory differs from one device to another.
On my device, I use the following command to see if the file exists:
if [[ -f "/var/mobile/Applications/D0D2B991-3CDA-457B-9187-1F02A84FF3AB/AppName.app/filename.txt" ]]; then
echo "The File Exists";
else
echo "The File Does Not Exist";
fi
I want a command that automatically searches for the file without needing to specify the "variable" part of the path.
I tried this:
if [[ -f "/var/mobile/Applications/*/AppName.app/filename.txt" ]]; then
echo "The File Exists";
else
echo "The File Does Not Exist";
fi
But no luck; it didn't find the file.
Maybe that's because I have two paths matching /var/mobile/Applications/*/AppName.app/, since I have cloned the app.
I would like a way to find out whether the file filename.txt exists inside any folder named AppName.app under /var/mobile/Applications/*/.

You can do this as follows:
[[ $(find /var/mobile/Applications/*/AppName.app/ -name filename.txt -print -quit | wc -l) -gt 0 ]] && echo "The File Exists" || echo "The File Does Not Exist"

The -f test can only take one argument. You would need to put it in a loop to check whether the glob expands and whether any of its matches is a regular file, i.e.
shopt -s nullglob
found=
for file in /var/mobile/Applications/*/AppName.app/filename.txt; do
[[ -f $file ]] && found=: && break
done
[[ -n $found ]] && echo "The File Exists" || echo "The File Does Not Exist"
If you're not sure specifically where the file is located, you can use find, doing something like the example below, which will exit early once the file is found. (This should work with GNU find; I haven't tested it on BSD.)
if [[ -f $(find /some_root_directory -type f -name 'filename.txt' -print -quit) ]]; then
echo "The File Exists"
else
echo "The File Does Not Exist"
fi

# if a glob matches nothing, remove it instead of leaving the literal glob
shopt -s nullglob
# stick all matches in an array
files=( /var/mobile/Applications/*/AppName.app/filename.txt )
case "${#files[#]}" in
0 ) echo "Sorry, no such file." ;;
1 ) echo "The file exists: ${files[0]}" ;;
* ) echo "There are multiple files matching this pattern: ${files[*]}" ;;
esac

I like this technique for this purpose:
if find /var/mobile/Applications/*/AppName.app/ -name filename.txt -print -quit | grep -q .; then
echo "The File Exists"
else
echo "The File Does Not Exist"
fi
This has some advantages over this form:
[[ $(find ..... -print -quit | wc -l) -gt 0 ]]
Because:
It doesn't need a $() subshell
It doesn't need to count lines with wc
It doesn't need to compare numbers with the -gt operator
It doesn't need to be inside a [[ ... ]]
Basically it's a find ... | grep -q . versus [[ $(find ... | wc -l) -gt 0 ]]
Or find ... | grep -q . versus [[ -f $(find ...) ]]
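If the check is needed in more than one place, the same find ... | grep -q . form can be wrapped in a small helper. This is only a sketch, and the function name file_exists_under is made up for illustration:
# succeed if at least one regular file named $2 exists anywhere under $1
file_exists_under() {
    find "$1" -type f -name "$2" -print -quit 2>/dev/null | grep -q .
}
file_exists_under /var/mobile/Applications filename.txt \
    && echo "The File Exists" \
    || echo "The File Does Not Exist"
The exit status of grep -q becomes the function's exit status, so the helper can be used directly in an if or with && / ||.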

Related

Bash Find Function Ubuntu - Find in directory tree, files that have the same name as their directory

I want to find and print files in a directory tree that have the same name as their directories.
This is my code so far:
#!/bin/bash
if [ $# -eq 0 ]
then
echo "No args"
fi
if [[ -d $1 ]] # if it's a directory
then
find "$1" -type f | (while read -r var1 # for every regular file in the dir tree
do
if [[ -f $var1 ]]
then
echo "$var1" # full path
# I don't know how to get the dir name
echo "$(basename "$var1")" # file name
echo
# then compare it and print the full path
fi
done)
fi
I want to do this using find in bash on Linux. Thanks.
You can use this script with find:
while IFS= read -rd '' f; do
d="${f%/*}" # directory part of the path
[[ ${d##*/} == "${f##*/}" ]] && echo "$f" # basename of the directory equals basename of the file
done < <(find . -type f -print0)
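If you want to keep the question's pattern of passing the directory as the first argument, the same loop can take $1, falling back to the current directory. A minimal sketch, assuming a default of . is acceptable:
#!/bin/bash
# search the directory given as $1 (default: .) for files whose
# basename matches the basename of the directory containing them
dir=${1:-.}
while IFS= read -rd '' f; do
d="${f%/*}" # containing directory
[[ ${d##*/} == "${f##*/}" ]] && echo "$f"
done < <(find "$dir" -type f -print0)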

Bash Script - Find File by User Input

I'm trying to write a very simple bash script that will do the following (the script I have so far doesn't seem to function at all, so I won't waste your time by posting it).
I need it to find files by name. The script should take the user's input and search the .waste directory for a match. If the directory is empty, it should echo "No match was found because the folder is empty!", and if it simply fails to find a match, "No match found."
I have defined: target=/home/user/bin/.waste
You can use the find command to do this:
find /path/to/your/.waste -name 'filename.*' -print
Alternatively, you can set this as a function in your .bash_profile
searchwaste() {
find /path/to/your/.waste -name "$1" -print
}
Note the quotes around the $1: they keep the shell from expanding the pattern, so it is passed to find as-is and find does the matching.
searchwaste "*.txt"
The above command would search your .waste directory for any .txt files
Here you go, pretty straightforward script:
#!/usr/bin/env bash
target=/home/user/bin/.waste
if [ ! "$(ls -A $target)" ]; then
echo -e "Directory $target is empty"
exit 0
fi
found=0
while read -r line; do
found=$((found+1))
echo -e "Found: $line"
done < <(find "$target" -iname "*$1*" )
if [[ "$found" == "0" ]]; then
echo -e "No match for '$1'"
else
echo -e "Total: $found elements"
fi
Btw., in the *nix world there are no folders, only directories :)
This is a solution.
#!/bin/bash
target="/home/user/bin/.waste"
read name
output=$( find "$target" -name "$name" 2> /dev/null )
if [[ -n "$output" ]]; then
echo "$output"
else
echo "No match found"
fi

Find whether a directory exists by specifying only its partial name

I want to find out whether a directory exists by giving just a partial name as the parameter, i.e. if a directory named /home/directory exists, I want to detect it by giving just /home/direc*.
Is there any way to do this in a shell script?
I tried the following, but it doesn't work:
directory=/home/direc*
if [[ -d "$directory" ]]; then
echo found;
else
echo not found
fi
directory=/home/direc*
for f in $directory
do
if [ -d "$f" ]
then
echo "$f"
fi
done
You can use "wc" to count the results and do it like this:
files=$(ls /home/dir* 2> /dev/null | wc -l)
if [ "$files" != "0" ]
then
echo "Dir exists"
else
echo "Doesn't exist"
fi

How to test if multiple files exist using a Bash script

How can I use the test command for an arbitrary number of files, passed in using an argument with a wildcard?
For example:
test -f /var/log/apache2/access.log.* && echo "exists one or more files"
Currently, it prints an error:
bash: test: too many arguments
This solution seems to me more intuitive:
if [ `ls -1 /var/log/apache2/access.log.* 2>/dev/null | wc -l ` -gt 0 ];
then
echo "ok"
else
echo "ko"
fi
To avoid "too many arguments error", you need xargs. Unfortunately, test -f doesn't support multiple files. The following one-liner should work:
for i in /var/log/apache2/access.log.*; do test -f "$i" && echo "exists one or more files" && break; done
By the way, /var/log/apache2/access.log.* is called shell-globbing, not regexp. Please see Confusion with shell-globbing wildcards and Regex for more information.
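As a quick illustration of the difference (these filenames are made up just for the demonstration): as a shell glob, the . in access.log.* is a literal dot and * matches anything, while as a regular expression . matches any single character and * means "zero or more of the preceding":
# hypothetical test files
touch access.log.1 access.logX
# shell glob: '.' is literal, '*' matches anything
ls access.log.*             # matches access.log.1 only
# regex (grep): '.' is any character, '*' repeats it
ls | grep 'access.log.*'    # matches both access.log.1 and access.logX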
First, store files in the directory as an array:
logfiles=(/var/log/apache2/access.log.*)
Then perform a test on the count of the array:
if [[ ${#logfiles[@]} -gt 0 ]]; then
echo 'At least one file found'
fi
This one is suitable for use with the Unofficial Bash Strict Mode, since nothing in it returns a non-zero exit status when no files are found.
The array logfiles=(/var/log/apache2/access.log.*) will always contain at least the unexpanded glob, so one can simply test for existence of the first element:
logfiles=(/var/log/apache2/access.log.*)
if [[ -f ${logfiles[0]} ]]
then
echo 'At least one file found'
else
echo 'No file found'
fi
If you wanted a list of files to process as a batch, as opposed to doing a separate action for each file, you could use find, store the results in a variable, and then check that the variable is not empty. For example, I use the following to compile all the .java files in a source directory.
SRC=$(find src -name "*.java")
if [ -n "$SRC" ]; then
javac -classpath $CLASSPATH -d obj $SRC
# stop if compilation fails
if [ $? != 0 ]; then exit; fi
fi
You just need to test if ls has something to list:
ls /var/log/apache2/access.log.* >/dev/null 2>&1 && echo "exists one or more files"
Variation on a theme:
if ls /var/log/apache2/access.log.* >/dev/null 2>&1
then
echo 'At least one file found'
else
echo 'No file found'
fi
ls -1 /var/log/apache2/access.log.* | grep . && echo "One or more files exist."
Or using find
if [ $(find /var/log/apache2/ -type f -name "access.log.*" | wc -l) -gt 0 ]; then
echo "ok"
else
echo "ko"
fi
The condition below doesn't produce stderr: because [[ ... ]] is a compound command, the 2> /dev/null is applied before the command substitution runs, so the error from ls when nothing matches goes to the black hole (/dev/null). Therefore I suggest this code:
if [[ $(ls -1 /var/log/apache2/access.log.* | wc -l ) -gt 0 ]] 2> /dev/null
then
echo "exists one or more files."
fi
More simply:
if ls /var/log/apache2/access.log.* 2>/dev/null 1>&2; then
echo "ok"
else
echo "ko"
fi

sh: Test for existence of files

How does one test for the existence of files in a directory using bash?
if ... ; then
echo 'Found some!'
fi
To be clear, I don't want to test for the existence of a specific file. I would like to test if a specific directory contains any files.
I went with:
(
shopt -s dotglob nullglob
existing_files=( ./* )
if [[ ${#existing_files[@]} -gt 0 ]] ; then
some_command "${existing_files[#]}"
fi
)
Using the array avoids race conditions from reading the file list twice.
From the man page:
-f file
True if file exists and is a regular file.
So:
if [ -f someFileName ]; then echo 'Found some!'; fi
Edit: I see you already got the answer, but for completeness, you can use the info in Checking from shell script if a directory contains files - and lose the dotglob option if you want hidden files ignored.
I typically just use a cheap ls -A to see if there's a response.
Pseudo-maybe-correct-syntax-example-ahoy:
if [[ $(ls -A my_directory_path_variable) ]]; then ....
Edit: this will work:
myDir=(./*); if [ -e "${myDir[0]}" ]; then echo "there's something down here"; fi
You can use ls in an if statement thus:
if [[ "$(ls -a1 | egrep -v '^\.$|^\.\.$')" = "" ]] ; then echo empty ; fi
or, thanks to ikegami,
if [[ "$(ls -A)" = "" ]] ; then echo empty ; fi
or, even shorter:
if [[ -z "$(ls -A)" ]] ; then echo empty ; fi
These basically list all files in the current directory (including hidden ones), excluding the . and .. entries.
If that list is empty, then the directory is empty.
If you want to discount hidden files, you can simplify it to:
if [[ "$(ls)" = "" ]] ; then echo empty ; fi
A bash-only solution (no invoking external programs like ls or egrep) can be done as follows:
emp=Y; for i in *; do if [[ $i != "*" ]]; then emp=N; break; fi; done; echo $emp
It's not the prettiest code in the world: it simply sets emp to Y and then, for every real file, sets it to N and breaks out of the for loop for efficiency. If there are zero files, it stays Y.
Try this
if [ -f /tmp/foo.txt ]
then
echo the file exists
fi
ref: http://tldp.org/LDP/abs/html/fto.html
How about this to check whether a directory is empty or not:
$ find "/tmp" -type f -exec echo Found file {} \;
#!/bin/bash
if [ -e "$1" ]; then
echo "File exists"
else
echo "File does not exist"
fi
I don't have a good pure sh/bash solution, but it's easy to do in Perl:
#!/usr/bin/perl
use strict;
use warnings;
die "Usage: $0 dir\n" if scalar #ARGV != 1 or not -d $ARGV[0];
opendir my $DIR, $ARGV[0] or die "$ARGV[0]: $!\n";
my #files = readdir $DIR;
closedir $DIR;
if (scalar #files == 2) { # . and ..
exit 0;
}
else {
exit 1;
}
Call it something like emptydir and put it somewhere in your $PATH, then:
if emptydir dir ; then
echo "dir is empty"
else
echo "dir is not empty"
fi
It dies with an error message if you give it no arguments, two or more arguments, or an argument that isn't a directory; it's easy enough to change if you prefer different behavior.
# tested on Linux bash
directory=$1
if test $(stat -c %h "$directory") -gt 2;
then
echo "not empty"
else
echo "empty"
fi
Note that on traditional filesystems a directory's hard-link count is 2 plus the number of subdirectories, so this detects subdirectories but will still report "empty" for a directory that contains only regular files.
For fun:
if ( shopt -s nullglob ; perl -e'exit !@ARGV' ./* ) ; then
echo 'Found some!'
fi
(Doesn't check for hidden files)
