How to go to each directory and execute a command? - bash

How do I write a bash script that goes through each directory inside a parent_directory and executes a command in each directory?
The directory structure is as follows:
parent_directory (name could be anything - doesn't follow a pattern)
    001 (directory names follow this pattern)
        0001.txt (filenames follow this pattern)
        0002.txt
        0003.txt
    002
        0001.txt
        0002.txt
        0003.txt
        0004.txt
    003
        0001.txt
The number of directories is unknown.

This answer posted by Todd helped me.
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && pwd" \;
The \( ! -name . \) avoids executing the command in current directory.
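A slightly cleaner variant (my tweak, not part of Todd's answer) uses -mindepth 1 instead of the ! -name . test and hands each directory to the inner shell as a positional parameter, which copes better with odd names than embedding {} inside the command string:
find . -mindepth 1 -maxdepth 1 -type d -exec bash -c 'cd "$1" && pwd' _ {} \;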

You can do the following, when your current directory is parent_directory:
for d in [0-9][0-9][0-9]
do
    ( cd "$d" && your-command-here )
done
The ( and ) create a subshell, so the current directory isn't changed in the main script.
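To make the subshell behavior concrete, here is a sketch with wc standing in for the real command (my illustration, not part of the answer):
for d in [0-9][0-9][0-9]
do
    ( cd "$d" && wc -l *.txt )   # the cd happens inside the subshell
done
pwd   # still prints parent_directory: the subshell's cd never leaked out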

You can achieve this by piping to xargs. The catch is that you need the -I flag, which replaces the given placeholder in your bash command with each item that xargs reads.
ls -d */ | xargs -I {} bash -c "cd '{}' && pwd"
You may want to replace pwd with whatever command you want to execute in each directory.
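Since parsing ls breaks on unusual names, a more robust variant of the same pipeline (my suggestion, again with pwd as the placeholder command) feeds null-delimited find output to xargs and passes each name as a positional parameter:
find . -mindepth 1 -maxdepth 1 -type d -print0 | xargs -0 -I {} bash -c 'cd "$1" && pwd' _ {}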

If you're using GNU find, you can try the -execdir parameter, e.g.:
find . -type d -execdir realpath "{}" ';'
or (as per @gniourf_gniourf's comment):
find . -type d -execdir sh -c 'printf "%s/%s\n" "$PWD" "$0"' {} \;
Note: You can use ${0#./} instead of $0 to strip the leading ./.
or more practical example:
find . -name .git -type d -execdir git pull -v ';'
If you want to include the current directory, it's even simpler by using -exec:
find . -type d -exec sh -c 'cd -P -- "{}" && pwd -P' \;
or using xargs:
find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'
Or a similar example suggested by @gniourf_gniourf:
find . -type d -print0 | while IFS= read -r -d '' file; do
# ...
done
The above examples support directories with spaces in their name.
Or by assigning into a bash array:
dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
    ( cd "$dir" && echo "$PWD" )
done
Change . to your specific folder name. If you don't need to run recursively, you can use: dirs=(*) instead. The above example doesn't support directories with spaces in the name.
So as @gniourf_gniourf suggested, the only proper way to put the output of find into an array without using an explicit loop will be available in Bash 4.4 with:
mapfile -t -d '' dirs < <(find . -type d -print0)
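The resulting array can then be consumed with an ordinary quoted loop, for example:
for dir in "${dirs[@]}"; do
    ( cd "$dir" && pwd )   # subshell, so there is no need to cd back
done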
Or a not-recommended way (which involves parsing ls):
ls -d */ | awk '{print $NF}' | xargs -n1 sh -c 'cd $0 && pwd && echo Do stuff'
The above example would ignore the current dir (as requested by the OP), but it'll break on names with spaces.
See also:
Bash: for each directory at SO
How to enter every directory in current path and execute script? at SE Ubuntu

If the top-level folder is known, you can just write something like this:
for dir in `ls $YOUR_TOP_LEVEL_FOLDER`;
do
    for subdir in `ls $YOUR_TOP_LEVEL_FOLDER/$dir`;
    do
        $(PLAY AS MUCH AS YOU WANT);
    done
done
In place of $(PLAY AS MUCH AS YOU WANT); you can put as much code as you want.
Note that I didn't "cd" into any directory.
Cheers,

for dir in PARENT/*
do
    test -d "$dir" || continue
    # Do something with $dir...
done
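If the command should run inside each directory rather than merely receive its name, a subshell keeps the loop's working directory intact (a sketch; pwd is a placeholder):
for dir in PARENT/*
do
    test -d "$dir" || continue
    ( cd "$dir" && pwd )
done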

While one-liners are good for quick and dirty usage, I prefer the more verbose version below for writing scripts. This is the template I use; it takes care of many edge cases and lets you write more complex code to execute on a folder. You can put your bash code in the function dir_command. Below, dir_command implements tagging each repository in git as an example. The rest of the script calls dir_command for each folder in the directory. An example of iterating through only a given set of folders is also included.
#!/bin/bash
#Use set -x if you want to echo each command while it gets executed
#set -x
#Save current directory so we can restore it later
cur=$PWD
#Save command line arguments so functions can access them
args=("$@")
#Put your code in this function
#To access command line arguments use the syntax ${args[0]}, ${args[1]}, etc
function dir_command {
    #This example tags the git repository in the given folder and pushes the tag
    cd "$1"
    echo "$(tput setaf 2)$1$(tput sgr 0)"
    git tag -a "${args[0]}" -m "${args[1]}"
    git push --tags
    cd ..
}
#This loop will go to each immediate child and execute dir_command
find . -maxdepth 1 -type d \( ! -name . \) | while read -r dir; do
    dir_command "$dir/"
done
#This example loop only goes through a given set of folders
declare -a dirs=("dir1" "dir2" "dir3")
for dir in "${dirs[@]}"; do
    dir_command "$dir/"
done
#Restore the folder
cd "$cur"

I don't see what the file format has to do with it, since you only want to iterate through folders. Are you looking for something like this?
cd parent
find . -type d | while read -r d; do
    ls "$d"/
done

You can use
find .
to search all files/dirs in the current directory recursively. Then you can pipe the output to the xargs command, like so:
find . | xargs your-command
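For example, with wc -l as a stand-in command; the -print0/-0 pair is worth adding so that names with spaces survive the pipe:
find . -name '*.txt' -print0 | xargs -0 wc -l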

#!/bin/bash
for folder_to_go in $(find . -mindepth 1 -maxdepth 1 -type d \( -name "*" \) ) ;
# you can add a pattern instead of * ; as written it goes into every folder
# -mindepth/-maxdepth 1 means one folder deep
do
    cd "$folder_to_go"
    echo "$folder_to_go" "########################################## "
    # whatever you want to do goes here
    cd ../ # if maxdepth/mindepth = 2, cd ../../
done
# you can try adding many internal for loops with many patterns; this will sneak into anywhere you want

You could run a sequence of commands in each folder in one line, like:
for d in PARENT_FOLDER/*; do (cd "$d" && tar -cvzf "../$(basename "$d").tar.gz" *.*); done

for p in [0-9][0-9][0-9]; do
    (
        cd "$p"
        for f in [0-9][0-9][0-9][0-9]*.txt; do
            ls "$f" # Your operands
        done
    )
done

Related

Correct usage of find and while-read loop in different formats?

After reading multiple answers on Stack Overflow, I came up with the following solution to read directory paths from find's output:
find "$searchdir" -type d -execdir test -d {}/.git \; -prune -print0 | while read -r -d $'\0' dir; do
# do stuff
done
However, most sources recommend something like the following approach:
while IFS= read -r -d '' file; do
some command "$file"
done < <(find . -type f -name '*.mp3' -print0)
Why are they using process substitution? Does this change anything about the whole process, or is it just another way to do the same thing?
Is read's -d '' argument different from -d $'\0', or are they the same thing? Does the empty string always amount to \0, so that the bash-specific $'' syntax is completely unnecessary?
I also tried doing it directly in find -exec/-execdir by passing it multiple times and failed. Maybe filtering and testing can be done in one command?
non working example:
find "$repositories_root_dir" -type d -execdir test -d {}/.git \; -prune -execdir sh -c "if git ls-remote --exit-code . \"origin/${target_branch_name}\" &> /dev/null; then echo \"Found branch '${target_branch_name}' in {}\"; git checkout \"${target_branch_name}\"; fi" \;
Sources:
https://github.com/koalaman/shellcheck/wiki/Sc2044
https://mywiki.wooledge.org/BashPitfalls#for_f_in_.24.28ls_.2A.mp3.29
In your non-working example, if you test the existence of a .git sub-directory to process only git clones and discard the other directories, then you should probably not prune because it does the exact opposite: skip only git clones.
Moreover, when using -execdir sh -c SCRIPT, you should pass positional parameters to your script instead of trying to embed the current directory name in the script with {}, which is not portable. And you could do the same for the branch name. Note that the directory name is not needed for what you try to accomplish in each git clone, because your script is executed from there.
Try this, maybe:
find "$repositories_root_dir" -type d -name '.git' -execdir sh -c '
if git ls-remote --exit-code . "origin/$1" &> /dev/null; then
printf "Found branch %s in " "$1"; pwd
echo git checkout "$1"
fi' _ "$target_branch_name" \;
(_ is assigned to positional parameter $0). Remove the echo if the result looks correct.
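As for the first question above: a pipeline runs the while loop in a subshell, so variables modified inside it are lost when the loop ends, whereas the done < <(find ...) form keeps the loop in the current shell. And read -d '' is the same as -d $'\0': bash strings cannot hold a NUL byte, so $'\0' collapses to the empty string, which read already treats as a NUL delimiter. A minimal sketch of the subshell difference:
count=0
find . -type d -print0 | while IFS= read -r -d '' dir; do
    count=$((count + 1))   # increments a copy inside the pipeline's subshell
done
echo "$count"              # prints 0

count=0
while IFS= read -r -d '' dir; do
    count=$((count + 1))   # runs in the current shell
done < <(find . -type d -print0)
echo "$count"              # prints the real directory count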

executing multiple commands with find -exec but one of the command is 'cd'

Here's what I am trying to achieve:
find .. -type d -depth 1 \( -exec cd "{}" \; -exec touch abc \; \)
I find that the 'cd' part of the command is not working: I get the file 'abc' in the current folder rather than in the child folders.
How can I execute the command inside the folders found?
To clarify, following Dibery's comment: I need to be able to cd to each folder to execute more complex commands (touch was an example)
I'm on MacOS if it makes a difference
The command cd cannot be used with -exec in find because cd is a shell built-in (you can check this with type cd) rather than an executable (i.e., there's no executable /usr/bin/cd). In your case, you may incorporate the folder name into the touch command as:
find .. -type d -depth 1 -exec touch "{}/abc" \;
Or using git as you requested (the -C option allows you to run git as if you were in that directory):
find .. -type d -depth 1 -exec git -C "{}" some_git_action \;
Even without find:
for i in ../*/; do cd "$i"; some_cmd; cd -; done
cd to that directory and use cd - to go back to the original position, and adding the trailing / will make the asterisk expand to only the directories.
If Dibery's comment isn't sufficient, you can pipe the find output to a while loop as such:
find . -maxdepth 1 -type d | while read -r dir; do
    cd "$dir"
    touch some_file.txt
    cd -
done
You can use a shell loop and run your commands in a subshell so you don't have to change directory back again:
for d in ./*/; do (
    cd "$d"
    touch foo # Or whatever you want
)
done
Alternatively, to get your find command to work, you could start a subshell for each directory:
find . -maxdepth 1 -type d -exec bash -c 'cd "$1"; touch bar' _ {} \;
Where again, touch bar can be something arbitrarily complex.
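For instance, something more involved than touch (the ls pipeline here is only an illustration):
find . -maxdepth 1 -type d -exec bash -c 'cd "$1" && pwd && ls -lt | head -n 3' _ {} \;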

How to cd into grep output?

I have a shell script which basically searches all folders inside a location and I use grep to find the exact folder I want to target.
for dir in /root/*; do
grep "Apples" "${dir}"/*.* || continue
While grep successfully finds my target directory, I'm stuck on how to move the folders I want into that target directory. An idea I had was to cd into the grep output, but that's where I got stuck. I tried some Google results; none helped with my case.
Example grep output: Binary file /root/ant/containers/secret/Documents/2FD412E0/file.extension matches
I want to cd into 2FD412E0 and move two folders inside that directory.
dirname is the key to that:
cd $(dirname $(grep "...." ...))
will let you enter the directory.
As people mentioned, dirname is the right tool to strip off the file name from the path.
I would use find for such kind of task:
while read -r file
do
    target_dir=$(dirname "$file")
    # do something with "$target_dir"
done < <(find /root/ -type f \
    -exec grep "Apples" --files-with-matches {} \;)
Consider using find's -maxdepth option. See the man page for find.
Well, there is actually a simpler solution :) I just like to write bash scripts. You might simply use a single find command like this:
find /root/ -type f -exec grep Apples {} ';' -exec ls -l {} ';'
Note the second -exec. It will be executed if the previous -exec command exited with status 0 (success). From the man page:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ; is encountered. The string {} is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find.
Replace the ls -l command with your stuff.
And if you want to execute dirname within the -exec command, you may do the following trick:
find /root/ -type f -exec grep -q Apples {} ';' \
    -exec sh -c 'cd "$(dirname "$0")"; pwd' {} ';'
Replace pwd with your stuff.
When find is not available
In the comments you write that find is not available on your system. The following solution works without find:
grep -R --files-with-matches Apples "${dir}" | while read -r file
do
    target_dir=$(dirname "$file")
    # do something with "$target_dir"
    echo "$target_dir"
done

Find path in bash in a case-insensitive manner

Suppose a path like
/home/albfan/Projects/InSaNEWEBproJECT
Setting aside the fact that one shouldn't use names like that: is there a way to check for a path in a case-insensitive manner?
I came up with the following solution, but I would like to find a builtin or GNU program if possible.
function searchPathInsensitive {
    # Replace slash with comma (not a valid directory character, which lets us parse dirs with spaces)
    # also remove the first / if present (otherwise it creates an empty first element)
    ORG="$1"
    if [ "${ORG:0:1}" = "/" ]
    then
        ORG="${ORG:1}"
    else
        ORG="${PWD:1}/$ORG"
    fi
    OLDIFS=$IFS
    IFS=,
    DIR=""
    for dir in ${ORG//\//,}
    do
        if [ -z "$DIR" ]
        then
            DIR="/$dir"
        else
            TMP_DIR="$DIR/$dir"
            DIR=$(/usr/bin/find "$DIR" -maxdepth 1 -ipath "$TMP_DIR" -print -quit)
            if [ -z "$DIR" ]
            then
                # If some part of the path does not exist, just copy the element
                # exit 1
                DIR="$TMP_DIR"
            fi
        fi
    done
    IFS=$OLDIFS
    echo "$DIR"
}
to use it just do:
(searching on my home)
$ searchPathInsensitive projects/insanewebproject
/home/albfan/Projects/InSaNEWEBproJECT
(inside a project)
$ searchPathInsensitive src/main/java/org/package/webprotocolhttpwrapper.java
/home/albfan/Projects/InSaNEWEBproJECT/src/main/java/org/package/WebProtocolHTTPWrapper.java
$ searchPathInsensitive src/main/resources/logout.png
/home/albfan/Projects/InSaNEWEBproJECT/src/main/resources/LogOut.PNG
I guess the solution is related in some way to find -ipath, since all I do in the function is search for the next path element in a case-insensitive manner.
My fault! I guess I tried
find -ipath 'projects/insanewebproject'
but the trick here is that I must use
find -ipath './projects/insanewebproject'
That ./ makes the difference. Thanks!
The man page says -path is more portable than -wholename.
If you expect only one result, you can add | head -n1; that way head kills the pipe once its one-line buffer fills:
find -ipath './projects/insanewebproject' | head -n1
The simplest solution:
$ find . | grep -qi '/path/to/something[^/]*$'
But if you have some additional conditions that must be checked for matched file, you can run grep inside find:
$ find . -exec sh -c 'echo {} | grep -qi /path/to/something' \; -print
Here you will get all files that are in the directory. If you want to get only the directory's name:
$ find . -exec sh -c 'echo {} | grep -qi /path/to/something[^/]*$' \; -print
Example of usage:
$ mkdir -p Projects/InSaNEWEBproJECT/src/main/resources/
$ find . -exec sh -c 'echo {} | grep -qi /projects/insanewebproject[^/]*$' \; -print
./Projects/InSaNEWEBproJECT

Apply a script to subdirectories

I have read many times that if I want to execute something over all subdirectories I should run something like one of these:
find . -name '*' -exec command arguments {} \;
find . -type f -print0 | xargs -0 command arguments
find . -type f | xargs -I {} command arguments {} arguments
The problem is that it works well with standard commands, but not as expected when the command is a user-defined function or a script. How can I fix it?
So what I am looking for is a line of code or a script in which I can replace command with myfunction or myscript.sh, and it goes to every single subdirectory from the current directory and executes that function or script there, with whatever arguments I supply.
To put it another way, I want something that works over all subdirectories as nicely as for file in *; do command_myfunction_or_script.sh arguments $file; done works over the current directory.
Instead of -exec, try -execdir.
It may be that in some cases you need to use bash:
foo () { echo $1; }
export -f foo
find . -type f -name '*.txt' -exec bash -c 'foo arg arg' \;
The last line could be:
find . -type f -name '*.txt' -exec bash -c 'foo "$#"' _ arg arg \;
Depending on what args might need expanding and when. The underscore represents $0.
You could use -execdir where I have -exec if that's needed.
The examples that you give, such as:
find . -name '*' -exec command arguments {} \;
Don't go to every single subdirectory from current directory and execute command there, but rather execute command from the current directory with the path to each file listed by the find as an argument.
If what you want is to actually change directory and execute a script, you could try something like this:
STDIR=$PWD; IFS=$'\n'; for dir in $(find . -type d); do cd $dir; /path/to/command; cd $STDIR; done; unset IFS
Here the current directory is saved to STDIR and the bash Internal Field Separator is set to a newline so names won't split on spaces. Then for each directory (-type d) that find returns, we cd to that directory, execute the command (using the full path here as changing directories will break a relative path) and then cd back to the starting directory.
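Spelled out over multiple lines, the same one-liner reads:
STDIR=$PWD
IFS=$'\n'
for dir in $(find . -type d); do
    cd "$dir"
    /path/to/command
    cd "$STDIR"
done
unset IFS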
There may be some way to use find with a function, but it won't be terribly elegant. If you have bash 4, what you probably want to do is use globstar:
shopt -s globstar
for file in **/*; do
myfunction "$file"
done
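If the function should receive directories rather than every file, a trailing slash in the globstar pattern restricts the matches to directories (a small variation, not from the original answer):
shopt -s globstar
for dir in **/*/; do
    myfunction "$dir"
done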
If you're looking for compatibility with POSIX or older versions of bash, you will be forced to source the file defining your function when you invoke bash. So something like this:
find <args> -exec bash -c '. funcfile;
for file; do
myfunction "$file"
done' _ {} +
But that's just ugly. When I get to this point, I usually just put my function in a script on my PATH and live with it.
If you want to use a bash function, this is one way.
work ()
{
    local file="$1"
    local dir=$(dirname "$file")
    pushd "$dir"
    echo "in directory $(pwd) working with file $(basename "$file")"
    popd
}
find . -name '*' | while read -r line;
do
    work "$line"
done
