Run a command only once in a for loop in Bash

I have the following pattern of bash code from a function:
for folder in ${FOLDER[@]}; do
if [ $# -ne 0 ]; then
rsync -rvh --delete-after ${folder} ${folder_dest}
else
local zipFile=$(stamp-files file.zip)
unzip -q ${zipFile} "${folder}"* -d "${folder_src}"
rsync -rvh --delete-after ${folder_src}${folder} ${folder_dest}
rsync -rvh --delete-after ${folder_src}${folder} ${folder_dest2}
fi
done
So in this function I have an if statement in a for loop. As you can see, in the else branch there is a call to another function, stamp-files. This call is needed only in the first iteration, and even though it is not causing any issue, it could definitely be improved by calling it just once.
Do you know how I could do that simply while keeping the shape of the code (the if statement inside the for loop)?

You could set a variable and change it in the first iteration:
first='yes'
for folder in "{folders[#]}"; do
if [[ $first == 'yes' ]]; then
local files=$(stamp-files "$folder")
rsync -rvh --delete-after "$files" "$folder_dest"
first='no'
else
rsync -rvh --delete-after "$folder" "$folder_dest"
fi
done
The way you tried to use $# wouldn't work: that's the number of positional parameters and has nothing to do with the iteration number or similar.
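As a quick demonstration of what $# actually holds (the function name here is made up):

count_args() { echo "$#"; }
count_args a b c   # prints 3: three positional parameters
count_args         # prints 0: no arguments at all

It stays the same on every iteration of a loop, so it cannot serve as a first-iteration flag.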
I've also changed a few things:
folders instead of FOLDER; uppercase variable names are more likely to clash with shell and environment variables
All expansions quoted to prevent word splitting and globbing (expansions inside [[...]] are exempt from word splitting, so the left-hand side needs no quotes; see the short demo after this list)
I use $folder instead of ${folder}, but that's really just personal taste
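As a quick illustration of why the quoting matters (the variable name is arbitrary):

value='two words'
printf '<%s>\n' $value     # unquoted: split into <two> and <words>
printf '<%s>\n' "$value"   # quoted: a single argument, <two words>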

If you need to keep the structure of your for loop and the if inside it:
unset stamp_files_done
for folder in ${FOLDER[@]}; do
if [ $# -ne 0 ]; then
rsync -rvh --delete-after ${folder} ${folder_dest}
else
if [[ -z ${stamp_files_done+x} ]]; then local files=$(stamp-files ${folder}); stamp_files_done=""; fi
rsync -rvh --delete-after ${files} ${folder_dest}
fi
done
You could also explain the overall task to get a more accurate solution.
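For reference, ${stamp_files_done+x} is what makes this a run-once guard: it expands to x when the variable is set (even to an empty string) and to nothing when it is unset, so -z distinguishes the two states. A minimal sketch with a made-up variable name:

unset guard
[[ -z ${guard+x} ]] && echo 'first time'   # prints: first time
guard=''
[[ -z ${guard+x} ]] && echo 'first time'   # prints nothing now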

I know you've said you'd like to keep the shape of the code, but stylistically I'm always hesitant to structure code with an if inside a loop just to handle the first element of an array - I prefer to hoist the code out to emphasise that the first element is being treated specially. The other option is to add a state variable, which never makes code any clearer.
In your case, how about
if [ ${#FOLDER[@]} -ne 0 ]; then
local files=$(stamp-files ${FOLDER[0]})
rsync -rvh --delete-after ${files} ${folder_dest}
for folder in ${FOLDER[@]:1}; do
rsync -rvh --delete-after ${folder} ${folder_dest}
done
fi
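The ${FOLDER[@]:1} slice expands to every element from index 1 onwards, so the loop skips the first element that was already handled. A tiny illustration:

arr=(a b c d)
echo "${arr[@]:1}"   # -> b c d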

Related

What to do to make loop ignore empty directories

I have a loop and I need it to ignore empty directories.
for i in */*/
do
cd "$i"
mv ./*.py ..
cd -
rm -r "$i"
done
What can I add on to make it ignore empty directories?
I have this but I would like something simpler
x=$(shopt -s nullglob dotglob; echo "$i"/*)
(( ${#x} )) || continue
What can I add on to make it ignore empty directories?
Bash does not have a primitive operator for testing whether a directory is empty. The best alternative in your case is probably to test whether pathname expansion matches any files within. That's what you are already considering, though I would write it differently.
As a general rule, I would also avoid changing the working directory. If you must change directory then consider doing it in a subshell, so that you need only let the subshell terminate to revert to the original working directory. Using a subshell is also a good approach when different parts of your script want different shell options.
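For example, a minimal sketch of the subshell approach applied to the loop body from the question:

( cd "$i" && mv ./*.py .. )
# back in the original working directory here; no cd - needed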
I would probably write your script like this:
#!/bin/bash
shopt -s nullglob dotglob
for i in */*/; do
anyfiles=( "$i"/* )
if [[ ${#anyfiles[@]} -ne 0 ]]; then
# Process nonempty directory "$i"
# If there are any Python files within then move them to the parent directory
pyfiles=( "$i"/*.py )
if [[ ${#pyfiles[@]} -ne 0 ]]; then
mv "${pyfiles[#]}" "$(dirname "$i")"
fi
# Remove directory "$i" and any remaining contents
rm -r "$i"
fi
done
If you want that as part of a larger script, then you could put everything from the shopt to the end in a subshell to limit the scope of the shopt.
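A minimal sketch of that, with the processing body elided:

(
# subshell: the shopt changes die with it
shopt -s nullglob dotglob
for i in */*/; do
    anyfiles=( "$i"* )
    (( ${#anyfiles[@]} )) || continue
    # ... process and remove nonempty directory "$i" as above ...
done
)
# nullglob and dotglob revert to their previous state here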
Alternatively, you could simplify that slightly at the cost of some clarity by using loops to skip the capture of the directory contents into explicit variables:
#!/bin/bash
shopt -s nullglob dotglob
for i in */*/; do
for anyfile in "$i"/*; do
# Process nonempty directory "$i"
# If there are any Python files within then move them to the parent directory
for pyfile in "$i"/*.py; do
mv "$i"/*.py "$(dirname "$i")"
break
done
# Remove directory "$i" and any remaining contents
rm -r "$i"
break
done
done
In that case, each inner loop contains an unconditional break at the end, so at most one iteration will be performed.

Bash Recursive Loop searching for file

Our assignment was just a conditional statement to say if a file exists or doesn't in the current directory. I already completed the primary objective but I am just trying to figure out how I could recurse using a loop and finding any other occurrences of the file name.
Currently, I am getting an infinite loop after finding the file in the first directory.
File exists and is located at /home/charlie/file.txt
File exists and is located at /home/charlie/file.txt
...
Questions:
Would I need to have a nested for loop somewhere even though I am recursively calling the function?
Does using $pwd mess it up as I am trying to step into the directories?
Why is it printing twice as of now?
#!/bin/bash
if [[ $# -eq 0 ]] ; then
echo 'Usage: findFile_new.sh [file name]'
exit 0
fi
exist="File exists and is located at "
function check() {
for file in $(pwd)/*
do
if [ -d "$file" ]; then
check $file $1
else
## Look for file
if [ -f "$1" ]; then
echo $exist$(pwd)/$1
fi
fi
done
}
check $1
Getting closer...
Why am I getting this as my output?
File exists and is located at
/home/charlie//#/*testFile.txt
File exists and is located at
/home/charlie//compareQuiz.sh/testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/dir3/dir4//*testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/file.txt/*testFile.txt
File exists and is located at
/home/charlie//dir1/dir2/testFile.txt/*testFile.txt
File exists and is located at
/home/charlie//findFile_new.sh/*testFile.txt
File exists and is located at
/home/charlie//testFile.txt/*testFile.txt
File exists and is located at
/home/charlie//testscript.sh/*testFile.txt
File exists and is located at
/home/charlie//~/*testFile.txt
#!/bin/bash
if [[ $# -eq 0 ]] ; then
echo 'Usage: findFile_new.sh [file name]'
exit 0
fi
exist="File exists and is located at "
echo $1 $2
function check() {
for file in $1/*
do
if [ -d "$file" ]; then
check $file $2
else
## Look for file
if [ -f "$2" ]; then
echo $exist$file
fi
fi
done
}
check $1 $2
Yes, you are almost there. A possible problem is that the variable $file
will contain a pathname to the file, such as path/to/the/testFile.txt, while
the variable $2 may contain only testFile.txt.
Would you please try the following:
#!/bin/bash
# usage: check path targetname
check() {
local file
for file in "$1"/*; do
if [[ -d $file ]]; then
check "$file" "$2"
elif [[ -f $file && ${file##*/} = $2 ]]; then
echo "found: $file"
fi
done
}
if (( $# != 2 )); then
echo "usage: $0 path name"
exit 0
fi
check "$1" "$2"
The expression ${file##*/} removes the pathname preceding the rightmost slash
inclusive and now you can compare it with $2.
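A quick illustration with a made-up path:

file=path/to/the/testFile.txt
echo "${file##*/}"   # -> testFile.txt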
Your search is not truly recursive and will eventually fall into an infinite loop.
Rolling your own Bash script that covers every case of recursive search within a directory can be tricky (permission handling and links come to mind...).
What you could do instead is use find, which takes care of all the recursive woes, and then extract the data you may need. The script (or single command, really) would be find . -iname "$1"
Here find . searches the current directory, -iname matches the given name while ignoring case, and $1 is the value of the first argument.
From the above you could grep to extract any data you may need.
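For instance, a minimal sketch of the whole script built on find (the output format just mimics the one from the question):

#!/bin/bash
[ $# -eq 1 ] || { echo "Usage: $0 <file name>"; exit 1; }
find . -iname "$1" | while IFS= read -r match; do
    echo "File exists and is located at $match"
done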
I know this does not answer your question 1:1, but if you're just starting with Bash, I'd say you're better off using tools provided within Bash first, then trying rolling your own.
As your task was already accomplished and it's more of a general question, I figured I'd provide you with "what I would do". :)
Another point: when you want the working directory, use $PWD, not $pwd. Variable names are case-sensitive, and $PWD is the variable the shell actually maintains; keeping your own variables lowercase makes it easy to tell user-declared variables apart from shell-reserved ones.
edit:
If you'd like to see a nice example of how this can be done, please checkout https://www.shellscript.sh/eg/directories/#traverse2.sh

Checking whether the input argument to a script is empty fails in bash

This is a small bash program that is tasked with looking through a directory and counting how many files are in the directory. It's to ignore other directories and only count the files.
Below is my bash code, which seems to fail to count the files in the directory. I say this because if I remove the if statement and just increment the counter, the for loop iterates and the counter prints 4 (though this includes directories). With the if statement it prints this to the console:
folder1 has files
Looking at other questions, I think the expression in my if statement is right, and I am getting no errors for syntax or other problems.
So I am simply dumbfounded as to why it is not counting the files.
#!/bin/bash
folder=$1
if [ $1 = empty ]; then
folder=empty
counter=0
echo $folder has $counter files
exit
fi
for d in $(ls $folder); do
if [[ -f $d ]]; then
let 'counter++'
fi
done
echo $folder has $counter files
Thank you.
Your entire script could be simplified as below, with a few enhancements. Never use the output of ls programmatically; it should be used only on the command line. The -z construct allows you to test whether the parameter following it is empty.
For looping over files, use the default glob expansion provided by the shell. Note that && is a short-hand for performing an action when the command on its left succeeds, roughly equivalent to if <condition>; then <action>; fi
#!/usr/bin/env bash
[ -z "$1" ] && { printf 'invalid argument passed\n' >&2 ; exit 1 ; }
shopt -s nullglob
count=0
for file in "$1"/*; do
[ -f "$file" ] && ((count++))
done
printf 'folder %s had %d files\n' "$1" "$count"

Stop watching a folder for changes if a file exists (bash)

I'm using fswatch to keep track of changes to a directory, but would like this process to stop if a certain file exists (with a wildcard). This certain file is created in an alternative directory (not in the tracked directory) by another process (which is generating changes that need to be tracked).
Here's what I tried to do:
while [[ $(shopt -s nullglob; set -- "${file_to_check}"; echo $#) -eq 1 ]]; do
fswatch "${path_to_the_tracked_directory}"
done && echo "Done"
However, this script does not terminate after ${file_to_check} appears.
The complex bit in the condition is to take care of wildcards as per: Bash check if file exists with double bracket test and wildcards
EDIT:
The complex bit can be simplified to:
while [ $(shopt -s nullglob; set -- "${path_to_the_file_to_check}"${file_to_check_with_wildcards}; echo $#) -eq 0 ]; do
fswatch "${path_to_the_tracked_directory}"
done && echo "Done"
One solution is to use -1/--one-event option (https://github.com/emcrisostomo/fswatch/wiki/How-to-Use-fswatch).
The code then looks like:
while [ $(shopt -s nullglob; set -- "${path_to_the_file_to_check}"${file_to_check_with_wildcards}; echo $#) -eq 0 ]; do
fswatch -1 "${path_to_the_tracked_directory}"
done && echo "Done"
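For reference, the counting trick works by expanding the glob into the positional parameters of the subshell inside $(...), with nullglob making a non-match expand to nothing rather than the literal pattern. A standalone sketch with a made-up pattern:

count=$(shopt -s nullglob; set -- /tmp/flag_*.done; echo $#)
echo "$count"   # number of matching files, 0 if none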

bash - recursive script can't see files in sub directory

I got a recursive script which iterates a list of names, some of which are files and some are directories.
If it's a (non-empty) directory, I should call the script again with all of the files in the directory and check if they are legal.
The part of the code making the recursive call:
if [[ -d $var ]] ; then
if [ "$(ls -A $var)" ]; then
./validate `ls $var`
fi
fi
The part of code checking if the files are legal:
if [[ -f $var ]]; then
some code
fi
But after making the recursive calls, I can no longer check any of the files inside that directory: because they are not in the same directory as the main script, the -f $var test cannot see them.
Any suggestion how can I still see them and use them?
Why not use find? Simple and easy solution to the problem.
Always quote variables; you never know when you will encounter a file or directory name with spaces:
shopt -s nullglob
if [[ -d "$path" ]] ; then
contents=( "$path"/* )
if (( ${#contents[@]} > 0 )); then
"$0" "${contents[#]}"
fi
fi
you're re-inventing find
of course, var is a lousy variable name
if you're recursively calling the script, you don't need to hard-code the script name.
you should consider putting the logic into a function in the script, so that the function can recursively call itself instead of spawning a new process to invoke the shell script each time. If you do this, use $FUNCNAME instead of "$0"; see the sketch below.
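A minimal sketch of that function-based approach (the echo is a placeholder for the real validation logic):

#!/bin/bash
shopt -s nullglob
validate() {
    local entry
    for entry in "$@"; do
        if [[ -d $entry ]]; then
            "$FUNCNAME" "$entry"/*      # recurse within the same process
        elif [[ -f $entry ]]; then
            echo "validating $entry"    # placeholder for the real check
        fi
    done
}
validate "$@"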
A few people have mentioned how find might solve this problem, I just wanted to show how that might be done:
find /yourdirectory -type f -exec ./validate {} +
This will find all regular files in yourdirectory and, recursively, in all its sub-directories, and pass their paths as arguments to ./validate. The {} is expanded to the paths of the files that find locates within yourdirectory. The + at the end means that each call to validate receives a large batch of files at once, instead of validate being invoked individually for each file (which is what happens when the + is replaced with \;); this can provide a huge speedup.
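To make the difference concrete, both forms side by side (same hypothetical directory as above):

find /yourdirectory -type f -exec ./validate {} \;   # one ./validate process per file
find /yourdirectory -type f -exec ./validate {} +    # few processes, many files each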
One option is to change directory (carefully) into the sub-directory:
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ./validate $(ls))
fi
fi
The outer parentheses start a new shell so the cd command does not affect the main shell. The exec replaces the original shell with (a new copy of) the validate script. Using $(...) instead of back-ticks is sensible. In general, it is sensible to enclose variable names in double quotes when they refer to file names that might contain spaces (but see below). The $(ls) will list the files in the directory.
Heaven help you with the ls commands if any file names or directory names contain spaces; you should probably be using * glob expansion instead. Note that a directory containing a single file with a name such as -n would trigger a syntax error in your script.
Corrigendum
As Jens noted in a comment, the location of the shell script (validate) has to be adjusted as you descend the directory hierarchy. The simplest mechanism is to have the script on your PATH, so you can write exec validate or even exec $0 instead of exec ./validate. Failing that, you need to adjust the value of $0 — assuming your shell leaves $0 as a relative path and doesn't mess around with converting it to an absolute path. So, a revised version of the code fragment might be:
# For validate on PATH or absolute name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec $0 $(ls))
fi
fi
or:
# For validate not on PATH and relative name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ../$0 $(ls))
fi
fi
