Recursively change directories and execute a command in each - bash

I'm trying to write a bash script to recursively go through a directory and execute a command at each landing. Each folder from the base has the prefix "lab" and I only want to recurse through those folders. An example without recursively going through the folders would be:
#!/bin/bash
cd $HOME/gpgn302/lab00
scons -c
cd $HOME/gpgn302/lab00/lena
scons -c
cd $HOME/gpgn302/lab01
scons -c
cd $HOME/gpgn302/lab01/cloudpeak
scons -c
cd $HOME/gpgn302/lab01/bear
scons -c
And while this works, if I want to add more directories in, say, lab01, I would have to edit the script. Thank you in advance.

There are a few close suggestions here, but here's one that actually works:
find "$HOME"/gpgn302/lab* -type d -exec bash -c 'cd "$1"; scons -c' -- {} \;
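A quick way to sanity-check the quoting in that pattern, using a throwaway mktemp tree and `touch` as a stand-in for `scons -c` (the directory names and marker file below are made up for the demo):

```shell
#!/bin/bash
# Sketch: verify the find/bash -c pattern on a fake lab* tree.
set -e
base=$(mktemp -d)
mkdir -p "$base/lab00/lena" "$base/lab01/bear"

# Same shape as the answer: `--` becomes $0, each found directory becomes $1.
find "$base"/lab* -type d -exec bash -c 'cd "$1"; touch ran-here' -- {} \;

# Every lab* directory, including nested ones, now contains the marker file.
ls "$base/lab00/ran-here" "$base/lab00/lena/ran-here" "$base/lab01/bear/ran-here"
```

Adding a new directory under lab01 needs no script changes; find picks it up on the next run.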

Use find for this kind of task:
find "$HOME/gpgn302" -name 'lab*' -type d -execdir scons -c . \;

It's easy to use find to locate and run commands.
Here's an example which changes into the correct directory before running your command:
find -name 'lab*' -type d -execdir scons -c \;
Update:
As per thatotherguy's comment, this doesn't work. find -type d only returns directory names, but -execdir runs the command from the directory containing each match, so in this example scons -c would be executed in the parent directory of each found lab* directory.
Use thatotherguy's method or this which is very similar:
find . -name 'lab*' -type d -print -exec bash -c 'cd "$1"; scons -c' -- {} \;

If you want to do it with bash:
#!/bin/bash
# set default pattern to `lab` if no arguments
if [ $# -eq 0 ]; then
    pattern=lab
fi
# get the absolute path to this script
if [[ "$0" = /* ]]; then
    script_path=$0
else
    script_path=$(pwd)/$0
fi
for dir in "$pattern"*; do
    if [ -d "$dir" ]; then
        echo "Entering $dir"
        cd "$dir" > /dev/null
        bash "$script_path" dummy
        cd - > /dev/null
    fi
done

Related

executing multiple commands with find -exec but one of the command is 'cd'

Here's what I am trying to achieve:
find .. -type d -depth 1 \( -exec cd "{}" \; -exec touch abc \; \)
I find that the 'cd' part of the command is not working, I get the file 'abc' in the current folder and not in the children folders
how can I execute the command inside the folders found?
To clarify, following Dibery's comment: I need to be able to cd to each folder to execute more complex commands (touch was an example)
I'm on MacOS if it makes a difference
The command cd cannot be used with -exec in find because cd is a shell built-in (you can check this with type cd) rather than an executable (i.e., there's no such executable /usr/bin/cd). In your case, you can incorporate the folder name into the touch command as:
find .. -type d -depth 1 -exec touch "{}/abc" \;
Or using git as you requested (the -C option allows you to run git as if you were in that directory):
find .. -type d -depth 1 -exec git -C "{}" some_git_action \;
Even without find:
for i in ../*/; do cd "$i"; some_cmd; cd -; done
cd to that directory and use cd - to go back to the original position, and adding the trailing / will make the asterisk expand to only the directories.
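A minimal check of that trailing-slash behavior, using a throwaway mktemp directory (the names dirA/dirB/plainfile are arbitrary):

```shell
#!/bin/bash
# Sketch: a trailing / in a glob restricts matches to directories.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/dirA" "$tmp/dirB"
touch "$tmp/plainfile"

cd "$tmp"
matched=""
for i in */; do matched="$matched $i"; done
echo "matched:$matched"   # only dirA/ and dirB/, not plainfile
```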
If Dibery's comment isn't sufficient, you can pipe the find to a while loop as such:
find . -maxdepth 1 -type d | while read -r dir; do
    cd "$dir"
    touch some_file.txt
    cd -
done
You can use a shell loop and run your commands in a subshell so you don't have to change directory back again:
for d in ./*/; do (
    cd "$d"
    touch foo # Or whatever you want
)
done
Alternatively, to get your find command to work, you could start a subshell for each directory:
find -maxdepth 1 -type d -exec bash -c 'cd "$1"; touch bar' _ {} \;
Where again, touch bar can be something arbitrarily complex.

bash, "make clean" in all subdirectories

How can I find every Makefile file in the current path and subdirs and run a make clean command in every occurance.
What I have till now (does not work) is something like:
find . -type f -name 'Makefile' 2>/dev/null | sed 's#/Makefile##' | xargs -I% cd % && make clean && cd -
Another option would be to use find with -execdir but this gives me the issue with $PATH : The current directory is included in the PATH environment variable, which is insecure in combination with the -execdir action of find ....
But I do not want to change the $PATH variable.
An answer using the tools I used would be helpful so that I can understand what I do wrong,
but any working answer is acceptable.
Of course find is an option. My approach with that would be more like:
find . -name Makefile -exec bash -c 'make -C "${1%/*}" clean' -- {} \;
But since you're using bash anyway, if you're in bash 4, you might also use globstar.
shopt -s globstar
for f in **/Makefile; do make -C "${f%/*}" clean; done
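The `${f%/*}` expansion is what turns each Makefile path into its directory; a small illustration of the suffix-stripping rules (the path is made up):

```shell
#!/bin/bash
# Sketch: % strips the shortest suffix matching the pattern, %% the longest.
f="sub/dir/Makefile"
echo "${f%/*}"    # sub/dir  (drops just /Makefile)
echo "${f%%/*}"   # sub      (drops /dir/Makefile)
```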
If you want to use the execution feature of find you can still do this:
find "${PWD}" -name Makefile -exec sh -c 'cd "${0%Makefile}" && make clean' {} \;
I would use the following approach:
find "$(pwd)" -name Makefile | while read -r line; do cd "$(dirname "$line")" && make clean; done
Please note the find $(pwd) which gives the full path as output of find.

Unable to use dirname within subshell in find command

I am trying to make a small script that can move all files from one directory to another, and I figured using find would be the best solution. However, I have run into a problem with using subshells for the 'dirname' value when creating the target directory paths: {} evaluates to '.' (a single dot) when inside a subshell. As seen in my script below, the -exec mkdir -p $toDir/$(dirname {}) \; portion of the find command is what does not work. I want to create all of the target directories needed to move the files, but I cannot use dirname in a subshell to get only the directory path.
Here is the script:
#!/bin/bash
# directory containting files to deploy relative to this script
fromDir=../deploy
# directory where the files should be moved to relative to this script
toDir=../main
if [ -d "$fromDir" ]; then
if [ -d "$toDir" ]; then
toDir=$(cd $toDir; pwd)
cd $fromDir
find * -type f -exec echo "Moving file [$(pwd)/{}] to [$toDir/{}]" \; -exec mkdir -p $toDir/$(dirname {}) \; -exec mv {} $toDir/{} \;
else
echo "A directory named '$toDir' does not exist relative to this script"
fi
else
echo "A directory named '$fromDir' does not exist relative to this script"
fi
I know that you can use -exec sh -c 'echo $(dirname {})' \;, but with this, I would then not be able to use the $toDir variable.
Can anyone help me figure out a solution to this problem?
Since you appear to be re-creating all the files and directories, try the tar trick:
mkdir -p "$toDir"
cd "$fromDir"
tar -cf - . | ( cd "$toDir" && tar -xvf - )
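A self-contained sketch of the same trick with throwaway mktemp directories and a dummy file, to show the tree arrives intact on the other side:

```shell
#!/bin/bash
# Sketch: copy a tree (files + subdirectories) from one dir to another via tar.
set -e
fromDir=$(mktemp -d)
toDir=$(mktemp -d)
mkdir -p "$fromDir/a/b"
echo hello > "$fromDir/a/b/file.txt"

cd "$fromDir"
tar -cf - . | ( cd "$toDir" && tar -xf - )

cat "$toDir/a/b/file.txt"   # hello
```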

changing to current directory where file.sh is located using variable

I want a script that changes directory to the directory where the file.sh is located, lets say var1.
Then I want to copy files from another location ,lets say var2, to the current dir which would be var.
Then I want to do some unzipping and deleting rows in the files, which would be in var
I have tried the below, but my syntax is not correct. Can someone please advise?
#!/bin/bash
# Configure bash so the script will exit if a command fails.
set -e
#var is where the script is stored
var="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd";
#another dir I want to copy from
var2 = another/directory
#cd to the directory I want to copy my files to
cd "$var" + /PointB
#copy from var2 to the current location
#include the subdirectories
cp -r var2 .
# This will unzip all .zip files in this dir and all subdirectories under this one.
# -o is required to overwrite everything that is in there
find -iname '*.zip' -execdir unzip -o {} \;
#delete specific rows 1-6 and the last one from the csv file
find ./ -iname '*.csv' -exec sed -i '1,6d;$ d' '{}' ';'
a few mistakes here:
# no: var="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd";
var=$(cd "$(dirname "$0")" && pwd)
The stuff in $() executes in a subshell, so the "pwd" must be performed in the same shell that you have "cd"-ed in.
# no: var2 = another/directory
var2=another/directory
The = cannot have whitespace around it.
# no: cd "$var" + /PointB
cd "$var"/PointB
shell is not JavaScript; string concatenation does not have a separate operator
# no: cp -r var2 .
cp -r "$var2" .
Need the $ to get the variable's value.
# no: find -iname '*.zip' -execdir unzip -o {} \;
find . -iname '*.zip' -execdir unzip -o {} \;
Specify the starting directory as the first argument to find.

How to go to each directory and execute a command?

How do I write a bash script that goes through each directory inside a parent_directory and executes a command in each directory.
The directory structure is as follows:
parent_directory (name could be anything - doesn't follow a pattern)
    001 (directory names follow this pattern)
        0001.txt (filenames follow this pattern)
        0002.txt
        0003.txt
    002
        0001.txt
        0002.txt
        0003.txt
        0004.txt
    003
        0001.txt
the number of directories is unknown.
This answer posted by Todd helped me.
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && pwd" \;
The \( ! -name . \) avoids executing the command in current directory.
You can do the following, when your current directory is parent_directory:
for d in [0-9][0-9][0-9]; do
    ( cd "$d" && your-command-here )
done
The ( and ) create a subshell, so the current directory isn't changed in the main script.
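You can verify that the subshell leaves the parent script's working directory alone (the mktemp directory is a stand-in for a real child directory):

```shell
#!/bin/bash
# Sketch: the cd inside ( ... ) affects only the subshell.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/child"
before=$PWD
( cd "$tmp/child" && pwd > /dev/null )
echo "still in: $PWD"
```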
You can achieve this by piping and then using xargs. The catch is that you need the -I flag, which replaces the placeholder in your bash command with each argument passed by xargs.
ls -d */ | xargs -I {} bash -c "cd '{}' && pwd"
You may want to replace pwd with whatever command you want to execute in each directory.
If you're using GNU find, you can try -execdir parameter, e.g.:
find . -type d -execdir realpath "{}" ';'
or (as per @gniourf_gniourf's comment):
find . -type d -execdir sh -c 'printf "%s/%s\n" "$PWD" "$0"' {} \;
Note: You can use ${0#./} instead of $0 to fix ./ in the front.
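That `${0#./}` note relies on prefix stripping; a one-line illustration with a made-up path:

```shell
#!/bin/bash
# Sketch: ${var#pattern} removes the shortest matching prefix.
p="./some/dir"
echo "${p#./}"   # some/dir
```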
or more practical example:
find . -name .git -type d -execdir git pull -v ';'
If you want to include the current directory, it's even simpler by using -exec:
find . -type d -exec sh -c 'cd -P -- "{}" && pwd -P' \;
or using xargs:
find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'
Or a similar example suggested by @gniourf_gniourf:
find . -type d -print0 | while IFS= read -r -d '' file; do
    # ...
done
The above examples support directories with spaces in their name.
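To see why the NUL-delimited form is space-safe, here's a sketch with a deliberately awkward directory name (the mktemp path is arbitrary):

```shell
#!/bin/bash
# Sketch: -print0 / read -d '' keeps "has space" as one entry.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/has space"

count=0
while IFS= read -r -d '' dir; do
    count=$((count + 1))
done < <(find "$tmp" -type d -print0)

echo "$count directories"   # 2: $tmp itself plus "has space"
```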
Or by assigning into bash array:
dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
    cd "$dir"
    echo "$PWD"
done
Change . to your specific folder name. If you don't need to run recursively, you can use: dirs=(*) instead. The above example doesn't support directories with spaces in the name.
So as @gniourf_gniourf suggested, the only proper way to put the output of find in an array without using an explicit loop will be available in Bash 4.4 with:
mapfile -t -d '' dirs < <(find . -type d -print0)
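A sketch of that mapfile form (requires bash >= 4.4 for the -d '' option; the mktemp names are arbitrary):

```shell
#!/bin/bash
# Sketch: load find's NUL-delimited output into an array without a loop.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/one dir" "$tmp/two"

mapfile -t -d '' dirs < <(find "$tmp" -mindepth 1 -type d -print0)
echo "${#dirs[@]} entries"   # 2, with "one dir" intact
```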
Or not a recommended way (which involves parsing of ls):
ls -d */ | awk '{print $NF}' | xargs -n1 sh -c 'cd $0 && pwd && echo Do stuff'
The above example ignores the current dir (as requested by OP), but it'll break on names with spaces.
See also:
Bash: for each directory at SO
How to enter every directory in current path and execute script? at SE Ubuntu
If the toplevel folder is known you can just write something like this:
for dir in `ls $YOUR_TOP_LEVEL_FOLDER`; do
    for subdir in `ls $YOUR_TOP_LEVEL_FOLDER/$dir`; do
        $(PLAY AS MUCH AS YOU WANT);
    done
done
On the $(PLAY AS MUCH AS YOU WANT); you can put as much code as you want.
Note that I didn't "cd" on any directory.
Cheers,
for dir in PARENT/*
do
    test -d "$dir" || continue
    # Do something with $dir...
done
While one-liners are good for quick and dirty usage, I prefer the more verbose version below for writing scripts. This is the template I use; it takes care of many edge cases and allows you to write more complex code to execute on a folder. You can write your bash code in the function dir_command. Below, dir_command implements tagging each repository in git as an example. The rest of the script calls dir_command for each folder in the directory. An example of iterating through only a given set of folders is also included.
#!/bin/bash
#Use set -x if you want to echo each command while getting executed
#set -x
#Save current directory so we can restore it later
cur=$PWD
#Save command line arguments so functions can access it
args=("$@")
#Put your code in this function
#To access command line arguments use syntax ${args[1]} etc
function dir_command {
    #This example command implements doing git status for folder
    cd "$1"
    echo "$(tput setaf 2)$1$(tput sgr 0)"
    git tag -a "${args[0]}" -m "${args[1]}"
    git push --tags
    cd ..
}
#This loop will go to each immediate child and execute dir_command
find . -maxdepth 1 -type d \( ! -name . \) | while read -r dir; do
    dir_command "$dir/"
done
#This example loop only loops through give set of folders
declare -a dirs=("dir1" "dir2" "dir3")
for dir in "${dirs[@]}"; do
    dir_command "$dir/"
done
#Restore the folder
cd "$cur"
I don't get the point of the file formatting, since you only want to iterate through folders... Are you looking for something like this?
cd parent
find . -type d | while read -r d; do
    ls "$d"/
done
you can use
find .
to search all files/dirs in the current directory recursively.
Then you can pipe the output to the xargs command like so:
find . | xargs 'command here'
#!/bin/bash
for folder_to_go in $(find . -mindepth 1 -maxdepth 1 -type d); do
    # you can add -name "pattern" to the find to restrict which folders it visits
    # -mindepth / -maxdepth 1 means one folder depth
    cd "$folder_to_go"
    echo "$folder_to_go ########################################## "
    # whatever you want to do goes here
    cd ../ # if maxdepth/mindepth = 2, cd ../../
done
# you can try adding more internal for loops with more patterns; this will sneak anywhere you want
You could run a sequence of commands in each folder in 1 line like:
for d in PARENT_FOLDER/*; do (cd "$d" && tar -cvzf "$d.tar.gz" *.*); done
for p in [0-9][0-9][0-9]; do
    (
        cd "$p"
        for f in [0-9][0-9][0-9][0-9]*.txt; do
            ls "$f" # Your operands
        done
    )
done
