I want to process a set of files (*.ui) in the current directory. The following script works as expected if some *.ui files are found, but if no .ui file exists in the current directory, the for loop is entered all the same. Why is that?
for f in *.ui
do
echo "Processing $f..."
done
It prints:
Processing *.ui...
Use:
shopt -s nullglob
From man bash:
nullglob
If set, bash allows patterns which match no files (see Pathname Expansion
above) to expand to a null string, rather than themselves.
You already have the how; the why is that bash first tries to match *.ui against existing files, and if that yields no results, it falls back to treating the pattern as the literal string "*.ui".
for f in "*.ui"
do
echo "Processing $f..."
done
will indeed print "Processing *.ui...".
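Putting the fix together with the loop from the question, a minimal sketch:
#!/bin/bash
shopt -s nullglob           # an unmatched *.ui now expands to nothing
for f in *.ui
do
    echo "Processing $f..."
done
shopt -u nullglob           # optional: restore the default globbing behaviour
With no .ui files present, the loop body never runs.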
I'm trying to compare sizes of files inside two directories.
My problem is that when I store the sizes of the files inside of a "for" loop, my variable takes all the sizes at once instead of taking them one by one.
Here is the problematic part of my code:
for dir1Files in dir1/*
do
sizeFile1=`stat -c%s $dir1Files`
for dir2Files in dir2/*
do
sizeFile2=`stat -c%s $dir2Files`
diffSize=$((sizeFile1-sizeFile2))
echo "$diffSize"
done
done
I realised, thanks to set -x, that my variables sizeFile1 and sizeFile2 are not integers. Instead, they are a few lines long and contain the sizes of my files in directories, with "one line = one integer", if that makes sense.
For example, with three files in dir1, my variable sizeFile1 is :
12500
14534
23000
What I would like is for my variable to go from 12500 to 14534 to 23000, one value per iteration. How should I do that? I'm guessing I need to change my "for" into something else?
Thanks in advance.
Nothing here is broken 100% of the time, but it can certainly break when unusual filenames are present. To make this code more robust:
Quote every variable expansion. This prevents a file named "dir1/ *" (with a space in its name) from being split into two words, with the * then expanding to a list of all files in the current directory when the stat command line is generated.
Use shopt -s nullglob so the loops do not run at all when a glob has no matches, instead of running once with the glob pattern itself as the filename.
shopt -s nullglob # prevent dir1/* from ever evaluating to itself
for dir1File in dir1/*; do
sizeFile1=$(stat -c%s "$dir1File")
for dir2File in dir2/*; do
sizeFile2=$(stat -c%s "$dir2File")
diffSize=$((sizeFile1-sizeFile2))
echo "$diffSize"
done
done
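Note that stat -c%s is GNU coreutils syntax; BSD/macOS stat spells it stat -f%z. If portability matters, a sketch of the size line using wc -c instead:
sizeFile1=$(wc -c < "$dir1File")   # byte count; any padding spaces from wc are harmless inside $((...))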
I'm not sure how to word my question exactly...
I have the code
if grep "mynamefolder" /vol/Homefs/
then
echo "yup"
else
echo "nope"
fi
which gives me the output
grep: /vol/Homefs/: Is a directory
nope
The sh file containing the code and the directory I'm targeting are not in the same directory (if that makes sense).
I want to find the name mynamefolder inside /vol/Homefs/ without going through any subdirectories. Doing grep -d skip, which I hoped would "skip" subdirectories and look only at the directory itself, just gives me nope even though the folder/file/word I'm testing it on does exist.
Edit: I forgot to mention that I would also like mynamefolder to be a variable I can pass on the command line from putty, something like
./file spaing, with spaing replacing mynamefolder.
I'm not sure if I explained that well enough, let me know!
You just want
if [ -e /vol/Homefs/"$1" ]; then
echo yup
else
echo nope
fi
The [ command, with the -e operator, tests if the named file entry exists.
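Here $1 is the script's first command-line argument, so if this is saved as file, the invocation from the question's edit would look like this (assuming /vol/Homefs/spaing exists):
$ ./file spaing
yup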
grep is not needed here; nothing is being searched inside a file.
If you insist on using grep, you should know that grep searches file contents, not directories. You can convert the directory listing to a string first:
echo /vol/Homefs/* | grep mynamefolder
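To keep the yup/nope behaviour with the $1 argument from above, a sketch along those lines:
if echo /vol/Homefs/* | grep -qw "$1"; then   # -q: exit status only; -w: avoid matching inside longer names
    echo yup
else
    echo nope
fi
Bear in mind this is string matching against the expanded listing, not a real existence test; the [ -e ] version above is more robust.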
I am trying to find and process all files with a given ending (.txt in the example below) in a directory. My current example finds all files containing txt anywhere in the file name, so it also matches files such as file.txt.xls.
DATADIR=$1
for DATA in `ls $DATADIR`; do
DATABASENAME=$(basename $DATA)
echo "Basename of file $DATABASENAME"
if [[ ${DATABASENAME} =~ .*txt ]];
then
DATAPATH="$DATADIR$DATABASENAME"
echo "File path $DATAPATH"
fi
done
If I understand right, that is the for loop you want:
for file in *.txt ; do
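Applied to the script from the question, a minimal sketch that keeps the DATADIR argument and drops the ls parsing:
#!/bin/bash
DATADIR=$1
shopt -s nullglob                 # optional: no iterations at all when nothing matches
for DATA in "$DATADIR"/*.txt; do  # the glob itself filters on the .txt ending
    echo "File path $DATA"
done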
Let's say I have a file called path.txt containing the text $HOME/filedump/ on a single line. How can I then read the contents of path.txt into a variable, while having Bash parse said content?
Here's an example of what I'm trying to do:
#!/bin/bash
targetfile="path.txt"
target=$( [[ -f $targetfile ]] && echo $( < $targetfile ) || echo "Not set" )
echo $target
Desired output: /home/joe/filedump/
Actual output: $HOME/filedump/
I've tried using cat in place of <, wrapping it in quotes, and more. Nothing seems to get me anywhere.
I'm sure I'm missing something obvious, and there's probably a simple builtin command. All I can find on Google is pages about reading variables from ini/config files or splitting one string into multiple variables.
If you want to evaluate the contents of path.txt and assign that to target, then use:
target=$(eval echo $(<path.txt))
for example:
$ target=$(eval echo $(<path.txt)); echo "$target"
/home/david/filedump/
This might not necessarily suit your needs (depending on the context of the code you provided), but the following worked for me:
targetfile="path.txt"
target=$(cat $targetfile)
echo $target
Here's a safer alternative to eval. In general, you should not be using configuration files that require bash to evaluate their contents; that opens a security hole in your script. Instead, detect whether there is something that requires expansion and handle it explicitly. For example,
IFS= read -r path < path.txt
if [[ $path =~ '$HOME' ]]; then
target=$HOME${path#\$HOME}
# more generally, target=${path/\$HOME/$HOME}, but
# when does $HOME ever appear in the *middle* of a path?
else
target=$path
fi
This requires you to know ahead of time what variables might appear in path.txt, but that's a good thing. You should not be evaluating unknown code.
Note that you can use any placeholder instead of a variable in this case; %h/filedump can be detected and processed just as easily as $HOME/filedump, without the presumption that the contents can or should be evaluated as shell code.
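A sketch of that placeholder approach, assuming %h is the chosen stand-in for the home directory:
IFS= read -r path < path.txt     # path.txt contains: %h/filedump/
target=${path/"%h"/$HOME}        # substitute the placeholder with the real value
echo "$target"                   # -> /home/joe/filedump/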
Consider the following bash code:
for f in /tmp/*.dat; do echo ${f}; done
when I run this and there is no *.dat file in /tmp the output is:
/tmp/*.dat
which is clearly not what I want. However, when there is such a file, it will print out the correct one
/tmp/foo.dat
How can I force the for loop to do nothing when there is no such file in the directory? The find command is not an option, sorry for that :/ I would also like a solution that does not test whether *.dat is an existing file. Any solutions so far?
This should work:
shopt -s nullglob
...
From the Bash manual:
nullglob
If set, Bash allows filename patterns which match no files to expand
to a null string, rather than themselves.
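Applied to the loop from the question, a minimal sketch:
shopt -s nullglob                # /tmp/*.dat expands to nothing when there are no matches
for f in /tmp/*.dat; do echo "${f}"; done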