Bash - while inside for loop not exiting

I'm a beginner getting started with bash scripting.
I have 10 directories (dir1-dir10) in the current working directory, plus script.sh and a file called "tocopyfile".
dir1-dir10 are empty.
tocopyfile is a test text file used for the purpose of my training.
script.sh contains the following code:
dir=`pwd`
i="0"
for directory in `ls $dir`;do
while [ $i -le 10 ]
do
cp tocopyfile $directory/file$i &
i=$[$i+1]
done
done
The script should copy 10 copies of the file "tocopyfile" into every dir (dir1-dir10), using the naming convention file#. The problem is that the script exits after the first directory, without executing the while loop for the remaining dirs.
Can someone explain what I'm doing wrong please?
Help is greatly appreciated.

The immediate issue is that you need to reset the value of i for each iteration of the outer loop.
for directory in `ls $dir`; do # No! but more on that in a moment
i=0
while [ $i -le 10 ]
There are a few other issues with your code.
dir=$(pwd) is almost always pointless; bash already provides a variable PWD containing the name of the current working directory. You don't actually need this, though; you can simply use ./*/ to expand to a list of directories in the current working directory.
Never use the output of ls in a script.
$[...] is obsolete syntax; use $((...)) instead.
Cleaning your code up a bit, we get
for directory in ./*/; do
    i=0
    while [ "$i" -le 10 ]; do
        cp tocopyfile "$directory/file$i" &
        i=$((i+1))
    done
done
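One pitfall worth noting with ./*/: if no subdirectories exist, the unexpanded pattern itself is passed through the loop. A minimal sketch (directory and file names here are invented for the demo, run in a scratch directory) guards against that with nullglob:

```shell
# Sketch: nullglob makes ./*/ expand to nothing, rather than the
# literal pattern, when no subdirectories exist.
workdir=$(mktemp -d)
cd "$workdir"
mkdir dir1 dir2
printf 'hello\n' > tocopyfile
shopt -s nullglob
for directory in ./*/; do
    i=0
    while [ "$i" -le 10 ]; do
        cp tocopyfile "$directory/file$i"
        i=$((i+1))
    done
done
# dir1 and dir2 each now hold file0 .. file10 (11 copies; -le 10 runs
# the body for 0 through 10 inclusive)
```
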

You need to initialize $i inside the for loop, so that $i is 0 at the start of each pass through your while:
dir=`pwd`
for directory in `ls $dir`;do
i="0" # <===== notice the change here
while [ $i -le 10 ]
do
cp tocopyfile $directory/file$i &
i=$[$i+1]
done
done
Other things you might want to change:
Double-quote all your variables (in the event they have spaces in them).
Use $() instead of the long-deprecated back-tick syntax.
Use $(()) instead of the deprecated $[] syntax.
Tidy up your indentation.
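The syntax substitutions from that list look like this side by side (a minimal sketch; only the modern forms matter here):

```shell
# Modern equivalents of the legacy forms used in the question:
dir=$(pwd)        # instead of dir=`pwd`
i=0
i=$((i + 1))      # instead of i=$[$i+1]
echo "$i"         # double-quote expansions in case of spaces
```
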

Related

How to delete folders that fail a condition in bash script

I have a number of folders that are constantly and automatically generated. Some are garbage and need to be cleared out. Each folder produces a generations.txt, in which I want to count the important lines to determine whether or not the folder should be deleted. I'd like to have a bash script I can run every so often to clean things up.
Here's what I have. I can echo the command I want but I don't believe it outputs the integer to compare to 5. Any suggestions would really help me out. Please and thank you!
#!/bin/bash
SEARCHABLES="grep -Evc 'Value:' "
for d in */
do
PATH=$d'generations.txt'
COMMAND=$SEARCHABLES$PATH
if $COMMAND < 5
then
rm -rf $d
fi
done
You're not getting the output of the command, you need $(...) to execute a command and substitute its output.
To perform the arithmetic comparison, you have to put it inside ((...)).
#!/bin/bash
SEARCHABLES="grep -Evc 'Value:' "
for d in */
do
    file="$d"'generations.txt'   # note: don't call this PATH as in the
                                 # original; clobbering PATH breaks command lookup
    COMMAND=$SEARCHABLES$file
    if (( $($COMMAND) < 5 ))
    then
        rm -rf "$d"
    fi
done
See BashFAQ/050 - I'm trying to put a command in a variable, but the complex cases always fail!
for a more detailed explanation.
In short, embedding a command in a variable is a faulty approach to the problem here because the single quotes in 'Value:' will be treated as literal data to search for. Syntax parsing happens before expansions, so you can't embed quotes in a variable like that. (Also avoid assigning to PATH as in the original script: overwriting PATH breaks command lookup for grep itself.) What you need is a function:
_count() {
    grep -Evc 'Value:' "$1"
}
_count "${d}generations.txt"
Then compare the output of the function using an arithmetic expression:
occurrences=$( _count "${d}generations.txt" )
if (( occurrences < 5 )) ; then
...
fi
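Put together, an end-to-end sketch of the function approach might look like this (the directory layout and file contents are invented for the demo; the 5-line threshold is from the question):

```shell
workdir=$(mktemp -d)
cd "$workdir"
mkdir keep garbage
# 'keep' has 5 important lines, 'garbage' has none
printf 'important\nimportant\nimportant\nimportant\nimportant\n' > keep/generations.txt
printf 'Value: 1\n' > garbage/generations.txt

_count() {
    grep -Evc 'Value:' "$1"   # count lines NOT matching 'Value:'
}

for d in */; do
    occurrences=$(_count "${d}generations.txt")
    if (( occurrences < 5 )); then
        rm -rf "$d"
    fi
done
```

After the loop, keep/ survives (5 important lines is not below the threshold) and garbage/ is removed.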

load multiple files from different folders with same file name in bash-scipt

So far I am able to read in the files from a single folder on my Ubuntu using:
for i in /path/to/files/Folder1/*.pcd
do
if [ ! -z $last_i ]
then
./vapp $last_i $i
fi
last_i="$i"
done
this will read all files in Folder1. I also have folders 2 and 3 (i.e. Folder2, Folder3). Inside each folder are several hundred files which are simply numbered, such as 0000.pcd, 0001.pcd ... 0129.pcd ... and so on.
I have tried to use
/path/to/files/Folder{1..3}/*.pcd
The problem is that it then takes all files from one folder, processing two files at a time within it, and goes through the whole folder that way before moving on to the next folder.
What I really want is to take the ith filename, e.g. 000i.pcd, from each of my three folders and pass them (including the path) to my application to do some calculations.
Effectively I want to do this:
./vapp /Folder1/000i.pcd /Folder2/000i.pcd /Folder3/000i.pcd
Using native bash features alone, with its extended glob features. Run the script from /path/to/files/
#!/bin/bash
shopt -s globstar nullglob dotglob
i=0
end=129
while [ "$i" -le "$end" ]
do
    # Generate the file name for the numbers 0-129, with
    # 4-character zero padding taken care of by printf
    file="$(printf "%04d.pcd" "$i")"
    # The ** pattern enabled by globstar matches 0 or more directories,
    # allowing the pattern to match to an arbitrary depth in the current
    # directory.
    fileList=( **/"$file" )
    # Un-comment the line below to check that the required files
    # are seen before running the executable:
    # printf "%s\n" "${fileList[@]}"
    ./vapp "${fileList[@]}"
    i=$((i+1))
done
The usage of the globstar feature is re-used from this wonderful answer.
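Since every folder uses identical file names, a simpler variant of the same idea is to iterate one folder and derive the sibling paths by basename (folder names are taken from the question; echo stands in for ./vapp in this sketch):

```shell
# Sketch: build a scratch layout matching the question, then derive
# each file's counterparts in the other folders from its basename.
workdir=$(mktemp -d)
cd "$workdir"
mkdir Folder1 Folder2 Folder3
for n in 0000 0001; do
    for f in Folder1 Folder2 Folder3; do
        : > "$f/$n.pcd"
    done
done

out=""
for f in Folder1/*.pcd; do
    name=${f##*/}    # strip the directory part, e.g. 0000.pcd
    out="$out Folder1/$name Folder2/$name Folder3/$name"
done
echo "$out"
```
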
I have solved my problem yesterday in a similar way to what Inian suggested. Except that my way is the manual way!
It works for me but I agree that Inian's way is much more flexible. Here is my solution anyway:
j=0
leadingZeros1="00000000"
leadingZeros2="0000000"
leadingZeros3="000000"
fol0="lidar03_00/"
fol1="lidar03_01/"
fol2="lidar03_02/"
ext=".pcd"
basepath="/my/path/"
for i in /my/path/lidar03_00/*.pcd
do
if [ ! -z "$last_i" ]
then
((j+=1))
if [ $j -le 9 ]
then
mypath1=$basepath$fol0$leadingZeros1$j$ext
mypath2=$basepath$fol1$leadingZeros1$j$ext
mypath3=$basepath$fol2$leadingZeros1$j$ext
fi
sleep .009
if [ $j -ge 10 ]
then
if [ $j -le 99 ]
then
mypath1=$basepath$fol0$leadingZeros2$j$ext
mypath2=$basepath$fol1$leadingZeros2$j$ext
mypath3=$basepath$fol2$leadingZeros2$j$ext
fi
if [ $j -ge 100 ]
then
if [ $j -le 999 ]; then
mypath1=$basepath$fol0$leadingZeros3$j$ext
mypath2=$basepath$fol1$leadingZeros3$j$ext
mypath3=$basepath$fol2$leadingZeros3$j$ext
fi;
fi
fi
#echo $mypath1
#echo $mypath2
#echo $mypath3
./vapp -i "$mypath1" "$mypath2" "$mypath3"
fi
last_i="$i"
done
I admit it is not the nicest solution, but it fixes my problem for now. If I need to do this again I will probably do it Inian's way.
Thanks all for the help.
I tried to avoid the nested ifs using logical AND (&&), but somehow it didn't work and I didn't want to spend more time on this. Thus the clumsy way...
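For the record, the nested ifs above only exist to pick the right number of leading zeros, and printf can do that in one line (a sketch; the 9-character width matches the question's eight zeros plus one digit, and the path pieces are copied from the script above):

```shell
basepath="/my/path/"
fol0="lidar03_00/"
ext=".pcd"
j=7
printf -v padded '%09d' "$j"   # zero-pad to 9 characters, e.g. 000000007
mypath1=$basepath$fol0$padded$ext
echo "$mypath1"                # /my/path/lidar03_00/000000007.pcd
```

The same line works unchanged for j=10 or j=123, which is what removes the need for the width-by-width if ladder.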

Bash script to compare files

I have a folder with a ton of old photos with many duplicates. Sorting it by hand would take ages, so I wanted to use the opportunity to use bash.
Right now I have the code:
#!/bin/bash
directory="~/Desktop/Test/*"
for file in ${directory};
do
for filex in ${directory}:
do
if [ $( diff {$file} {$filex} ) == 0 ]
then
mv ${filex} ~/Desktop
break
fi
done
done
And getting the exit code:
diff: {~/Desktop/Test/*}: No such file or directory
diff: {~/Desktop/Test/*:}: No such file or directory
File_compare: line 8: [: ==: unary operator expected
I've tried modifying working code I've found online, but it always seems to spit out some error like this. I'm guessing it's a problem with the nested for loop?
Also, why does it seem there are different ways to call variables? I've seen examples that use ${file}, "$file", and "${file}".
You have the {} in the wrong places:
if [ $( diff {$file} {$filex} ) == 0 ]
They should be at:
if [ $( diff ${file} ${filex} ) == 0 ]
(though the braces are optional now), but you should allow for spaces in the file names:
if [ $( diff "${file}" "${filex}" ) == 0 ]
Now it simply doesn't work properly because when diff finds no differences, it generates no output (and you get errors because the == operator doesn't expect nothing on its left-side). You could sort of fix it by double quoting the value from $(…) (if [ "$( diff … )" == "" ]), but you should simply and directly test the exit status of diff:
if diff "${file}" "${filex}"
then : no difference
else : there is a difference
fi
and maybe for comparing images you should be using cmp (in silent mode) rather than diff:
if cmp -s "$file" "$filex"
then : no difference
else : there is a difference
fi
In addition to the problems Jonathan Leffler pointed out:
directory="~/Desktop/Test/*"
for file in ${directory};
~ and * won't get expanded inside double-quotes; the * will be expanded when you use the variable without quotes, but since the ~ won't be, the shell looks for files under a directory literally named "~" (not your home directory) and finds no matches. Also, as Jonathan pointed out, using variables (like ${directory}) without double-quotes will get you into trouble with filenames that contain spaces or other metacharacters. The better way is to keep the wildcard out of the variable and add it when you reference the variable, with the variable in double-quotes and the * outside them:
directory=~/"Desktop/Test"
for file in "${directory}"/*;
Oh, and another note: when using mv in a script it's a good idea to use mv -i to avoid accidentally overwriting another file with the same name.
And: use shellcheck.net to sanity-check your code and point out common mistakes.
If you are simply interested in knowing if two files differ, cmp is the best option. Its advantages are:
It works for text as well as binary files, unlike diff which is for text files only
It stops after finding the first difference, and hence it is very efficient
So, your code could be written as:
if ! cmp -s "$file" "$filex"; then
# files differ...
mv "$filex" ~/Desktop
# any other logic here
fi
Hope this helps. I didn't understand what you are trying to do with your loops and hence didn't write the full code.
You can use diff "$file" "$filex" &>/dev/null and get the last command result with $? :
#!/bin/bash
SEARCH_DIR="."
DEST_DIR="./result"
mkdir -p "$DEST_DIR"
directory="."
ls "$directory" | while read -r file
do
    ls "$directory" | while read -r filex
    do
        if [ ! -d "$filex" ] && [ ! -d "$file" ] && [ "$filex" != "$file" ]
        then
            diff "$file" "$filex" &>/dev/null
            if [ "$?" == 0 ]
            then
                echo "$filex is a duplicate. Moving to $DEST_DIR"
                mv "$filex" "$DEST_DIR"
            fi
        fi
    done
done
Note that you can also use fslint or fdupes utilities to find duplicates
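As an aside on efficiency: the pairwise diff loop is quadratic in the number of files. Hashing each file once and grouping by checksum scales much better (a sketch using md5sum from GNU coreutils and a bash 4+ associative array; the file names and contents are invented for the demo):

```shell
workdir=$(mktemp -d)
cd "$workdir"
printf 'same\n'  > a.jpg
printf 'same\n'  > b.jpg
printf 'other\n' > c.jpg

declare -A seen    # bash 4+ associative array: checksum -> first file seen
dups=""
for f in *.jpg; do
    sum=$(md5sum "$f" | awk '{print $1}')
    if [ -n "${seen[$sum]}" ]; then
        dups="$dups $f"    # same content as ${seen[$sum]}
    else
        seen[$sum]=$f
    fi
done
echo "duplicates:$dups"
```

Each file is read exactly once, and only files with matching checksums are candidates for moving or removal.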

Bash create while loop to remove all files within a directory

Trying to create a while loop to remove 8 files from a specified test folder. I keep getting the error "no such file or directory" even though I am positive I am in the right folder, because I can use the ls command to see the files... Anyway, here is what I have:
#!/bin/bash
var=(`ls ~/Random/Unit1/Test`)
x=${#var[@]}
i=0
while [ $i -lt $x ] ; do
rm $var # this line is incorrect and needs changing
((i++))
done
var is an array variable; right now you're accessing it as a scalar, which in bash yields the first element, so you remove the first file and then try to remove it again once for every remaining file in the directory. If you want to remove every file you need the value at each index, i.e. in the loop you would get the ith value in the array:
rm "${var[$i]}"
There are a couple of problems with your script, the first one is that you should cd to the folder from where you want to remove the files, you can use pushd and popd for it. Second, you should enclose the var variable with double quotes. Also, as stated in #redball's answer, you are accessing an array, you have to use array notation on it.
#!/bin/bash
DIRECTORY=~/Random/Unit1/Test
var=(`ls "$DIRECTORY"`)
x=${#var[@]}
i=0
# Saves the current directory and changes to the one pointed to by "$DIRECTORY"
pushd "$DIRECTORY"
while [ $i -lt $x ] ; do
    rm "${var[$i]}"
    ((i++))
done
# Restores the previously saved directory
popd
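For what it's worth, neither the array nor the counter is needed here: a glob handles this directly (a sketch in a scratch directory created for the demo):

```shell
# Sketch: let the glob do the work; no ls parsing, no index loop.
workdir=$(mktemp -d)
mkdir "$workdir/Test"
touch "$workdir/Test/f1" "$workdir/Test/f2"

rm -- "$workdir/Test/"*    # the glob expands to every entry in Test
```

The `--` guards against filenames that begin with a dash being taken as options to rm.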

Extracting end of filename in bash script

Within my backup script, I'd like to only keep 7 days worth of backups (tried using logrotate for this and it worked perfectly, but I ran into issues with the timing of cron.daily and how it affected the "dateext"). I'm running into problems using parameter expansion to extract the date from the filenames.
Here are some examples of some of the files
foo.bar.tar.gz-20120904
bar.baz.tar.gz-20120904
...
Here is my bash script:
#!/bin/bash
path="/foo/"
today=$(date +%Y%m%d)
keepDays=7
keepSeconds=$(date -d "-$keepDays day" +%s)
for f in $path"*"; do
fileSeconds=$(date -d ${f##*-} +%s)
if [ $fileSeconds -lt $keepSeconds ]
then
rm $f
fi
done
Here is the error I'm getting:
date: extra operand `/foo/foo.bar.tar.gz-20120904'
Remove the quotes around the *, that prevents globbing:
for f in ${path}*; do
(the { } are not strictly required here, but make it easier to read)
Not part of the question, but the Bourne shell syntax [ $fileSeconds -lt $keepSeconds ] could be written as (( $fileSeconds < $keepSeconds )) which is possibly safer.
As cdarke says, remove the quotes around the * in the for loop:
for f in ${path}/*; do
What happens is that the shell executing date gets '/foo/*' and expands that into a list of file names (more than one) and then uses ${f##*-} on part of the list, and date is called with multiple names, and objects.
You'd see this if you ran with bash -x your-script.sh, for instance. When something mysterious goes on, the first step is to make sure you know what the shell is doing. Adding echo "$f" or echo $f in the loop would help you understand — though you'd get two different answers.
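To see what ${f##*-} does in isolation (the filename is taken from the question's examples):

```shell
f="/foo/foo.bar.tar.gz-20120904"
stamp=${f##*-}    # greedily strip everything through the last '-'
echo "$stamp"     # 20120904
```
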