executing multiple commands with find -exec but one of the command is 'cd' - bash

Here's what I am trying to achieve:
find .. -type d -depth 1 \( -exec cd "{}" \; -exec touch abc \; \)
I find that the 'cd' part of the command is not working: I get the file 'abc' in the current folder and not in the child folders.
How can I execute the command inside the folders found?
To clarify, following Dibery's comment: I need to be able to cd to each folder to execute more complex commands (touch was an example)
I'm on MacOS if it makes a difference

The command cd cannot be used with -exec in find because cd is a shell built-in (you can check this with type cd) rather than an executable (i.e., there's no such executable /usr/bin/cd). In your case, you may incorporate the folder name into the touch command as:
find .. -type d -depth 1 -exec touch "{}/abc" \;
Or using git as you requested (the -C option allows you to run git as if you were in that directory):
find .. -type d -depth 1 -exec git -C "{}" some_git_action \;
Even without find:
for i in ../*/; do cd "$i"; some_cmd; cd -; done
This cds into each directory and uses cd - to go back to the original position; adding the trailing / makes the asterisk expand to directories only.
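As a sanity check, the incorporation approach can be tried in a throwaway sandbox (the directory names a and b are made up for the demo; note that GNU find spells the depth test -mindepth 1 -maxdepth 1, while the BSD/macOS find in the question accepts -depth 1):

```shell
# Throwaway sandbox: verify that embedding "{}" in the touch path
# creates the file inside each found directory, not in the cwd.
tmp=$(mktemp -d)
mkdir "$tmp/a" "$tmp/b"
find "$tmp" -mindepth 1 -maxdepth 1 -type d -exec touch "{}/abc" \;
ls "$tmp/a" "$tmp/b"
```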

If Dibery's comment isn't sufficient, you can pipe the find output to a while loop as such:
find . -maxdepth 1 -type d | while read -r dir; do
cd "$dir"
touch some_file.txt
cd -
done

You can use a shell loop and run your commands in a subshell so you don't have to change directory back again:
for d in ./*/; do (
cd "$d"
touch foo # Or whatever you want
)
done
Alternatively, to get your find command to work, you could start a subshell for each directory:
find -maxdepth 1 -type d -exec bash -c 'cd "$1"; touch bar' _ {} \;
Where again, touch bar can be something arbitrarily complex.
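A quick sandbox run of this pattern (the directory names x and y are invented for the demo) confirms the inner shell really changes into each found directory before running the command:

```shell
# Each found directory is passed to the inner bash as $1 ("_" fills $0);
# the inner shell cds there, so the touched file lands inside it.
tmp=$(mktemp -d)
mkdir "$tmp/x" "$tmp/y"
find "$tmp" -mindepth 1 -maxdepth 1 -type d -exec bash -c 'cd "$1" && touch bar' _ {} \;
```

Passing {} as a positional parameter (rather than splicing it into the -c string) keeps directory names with spaces or quotes from breaking the inner script.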

Related

copy files with the base directory

I am searching specific directory and subdirectories for new files, I will like to copy the files. I am using this:
find /home/foo/hint/ -type f -mtime -2 -exec cp '{}' ~/new/ \;
It is copying the files successfully, but some files have same name in different subdirectories of /home/foo/hint/.
I will like to copy the files with its base directory to the ~/new/ directory.
test#serv> find /home/foo/hint/ -type f -mtime -2 -exec ls '{}' \;
/home/foo/hint/do/pass/file.txt
/home/foo/hint/fit/file.txt
test#serv>
~/new/ should look like this after copy:
test#serv> ls -R ~/new/
/home/test/new/pass/:
file.txt
/home/test/new/fit/:
file.txt
test#serv>
platform: Solaris 10.
Since you can't use rsync or fancy GNU options, you need to roll your own using the shell.
The find command lets you run a full shell in your -exec, so you should be good to go with a one-liner to handle the names.
If I understand correctly, you only want the parent directory, not the full tree, copied to the target. The following might do:
#!/usr/bin/env bash
findopts=(
-type f
-mtime -2
-exec bash -c 'd="${0%/*}"; d="${d##*/}"; mkdir -p "$1/$d"; cp -v "$0" "$1/$d/"' {} ./new \;
)
find /home/foo/hint/ "${findopts[@]}"
Results:
$ find ./hint -type f -print
./hint/foo/slurm/file.txt
./hint/foo/file.txt
./hint/bar/file.txt
$ ./doit
./hint/foo/slurm/file.txt -> ./new/slurm/file.txt
./hint/foo/file.txt -> ./new/foo/file.txt
./hint/bar/file.txt -> ./new/bar/file.txt
I've put the options to find into a bash array for easier reading and management. The script for the -exec option is still a little unwieldy, so here's a breakdown of what it does for each file. Bearing in mind that in this format, arguments are numbered from zero, the {} becomes $0 and the target directory becomes $1...
d="${0%/*}" # Store the source directory in a variable, then
d="${d##*/}" # strip everything up to the last slash, leaving the parent.
mkdir -p "$1/$d" # create the target directory if it doesn't already exist,
cp "$0" "$1/$d/" # then copy the file to it.
I used cp -v for verbose output as shown in "Results" above, but IIRC -v is also not supported by Solaris' cp, and can safely be dropped.
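For example, applying the two expansions in isolation to one of the paths from the question:

```shell
# Walk through the two-step stripping on a sample path.
f=/home/foo/hint/do/pass/file.txt
d="${f%/*}"   # drop the filename:        /home/foo/hint/do/pass
d="${d##*/}"  # keep only the last part:  pass
echo "$d"
```

Both expansions are plain POSIX parameter expansion, so this part works even in a minimal /bin/sh.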
The --parents flag should do the trick:
find /home/foo/hint/ -type f -mtime -2 -exec cp --parents '{}' ~/new/ \;
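A minimal sandbox check of --parents (GNU coreutils only, so this won't run on the Solaris box from the question; the paths are made up) shows how the relative path is recreated under the target:

```shell
# cp --parents recreates the source's relative path under the target dir.
tmp=$(mktemp -d)
mkdir -p "$tmp/hint/fit" "$tmp/new"
touch "$tmp/hint/fit/file.txt"
( cd "$tmp" && find hint -type f -exec cp --parents {} new/ \; )
```

Note the whole relative path (including hint/) is kept, which is the caveat a later answer raises.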
Try testing with rsync -R, for example:
find /your/path -type f -mtime -2 -exec rsync -R '{}' ~/new/ \;
From the rsync man:
-R, --relative
Use relative paths. This means that the full path names specified on the
command line are sent to the server rather than just the last parts of the
filenames.
The problem with the answers by @Mureinik and @nbari might be that the absolute path of new files will be recreated in the target directory. In this case you might want to switch to the base directory before the command and go back to your current directory afterwards:
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; ; cd "$path_current"
or
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec rsync -R '{}' ~/new/ \; ; cd "$path_current"
Both ways work for me at a Linux platform. Let’s hope that Solaris 10 knows about rsync’s -R ! ;)
I found a way around it:
cd ~/new/
find /home/foo/hint/ -type f -mtime -2 -exec nawk -v f={} '{n=split(FILENAME, a, "/");j= a[n-1];system("mkdir -p "j"");system("cp "f" "j""); exit}' {} \;

removing path prefix from find results

At the simplest, if I execute
find . -type f -exec cp {} /new/path/{}
The path that is expanded is /new/path/./path/to/file. I would like to remove that ./ that is prefixed by the find command before I use {} in the exec.
I am using the builtin Freebsd find, but I do have access to gnufind if that will help (though I do not normally use gnufind).
Where you will have a problem is when find descends into subdirectories, and it tries to exec something like cp ./foo/bar.txt /new/path/./foo/bar.txt and "/new/path" has no subdirectory "foo" -- you might want to:
specify -maxdepth 1 so you do not descend into subdirs
find . -maxdepth 1 -type f -exec cp {} /new/path/{} \;
just use a directory destination for cp, so the files end up in a single dir (will suffer from collisions if you have "./foo/bar.txt" and "./qux/bar.txt")
find . -type f -exec cp -t /new/path {} +
use tar to copy the whole tree: this will preserve directory structure
tar cf - . | ( cd /new/path && tar xvf - )
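A sandbox run of the tar pipe (names invented for the demo) shows the tree being recreated under the destination:

```shell
# The first tar serializes the tree to stdout; the second, running in a
# subshell cd'ed to the destination, unpacks it with structure intact.
tmp=$(mktemp -d)
mkdir -p "$tmp/src/foo" "$tmp/dst"
echo hello > "$tmp/src/foo/bar.txt"
( cd "$tmp/src" && tar cf - . ) | ( cd "$tmp/dst" && tar xf - )
```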

How to use the pwd as a filename

I need to go through a whole set of subdirectories, and each time it finds a subdirectory named 0, it has to go inside it. Once there, I need to execute a tar command to compact some files.
I tried the following
find . -type d -name "0" -exec sh -c '(cd {} && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/)' ';'
which seems to work. However, because this is done in many directories named 0, it will always overwrite the previous filename.tar one (and I lose the info about where it was created).
One way to solve this would be to use the $pwd as the filename (+.tar at the end).
I tried double ticks, backticks, etc, but I never manage to get the correct filename.
"$PWD"".tar", `$PWD.tar`, etc
Any idea? Any other way is ok, as long as I can link the name of the file with the directory it was created.
I'd need this to transfer the directoryToTransfer easily from the cluster to my home computer.
You can try "${PWD//\//_}.tar". However you have to use bash -c instead of sh -c.
Edit:
So now your code should look like this:
find . -type d -name "0" -exec bash -c 'cd "$1" && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/"${PWD//\//_}.tar"' _ {} ';'
(Passing {} as a positional parameter instead of splicing it into the -c string keeps odd directory names from breaking the inner script.)
I personally don't really like using the -exec flag for find as it makes the code less readable and also forks a new process for each file. I would do it like this, which should work unless a filename somewhere contains a newline (which is very unlikely).
while read -r dir; do
( cd "$dir" && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/"${PWD//\//_}.tar" )
done < <(find . -type d -name "0")
But this is just my personal preference. The -exec variant should work too.
You can use the -execdir option in find to descend into each found directory and then run the tar command, which greatly simplifies it:
find . -type d -name "0" -execdir tar -cvf filename.tar RS* \;
(Caveat: RS* is expanded by your interactive shell before find even runs, or passed literally to tar if nothing matches in your current directory; to glob inside each found directory, use -execdir sh -c 'tar -cvf filename.tar RS*' \; instead.)
If you want tar file to be created in ~/home/directoryForTransfer/ then use:
find . -type d -name "0" -execdir tar -cvf ~/home/directoryForTransfer/filename.tar RS* \;

How to create in all subfolders of a given directory subfolder with given name

I have a directory structure like this in my CentOS:
dir1
dir2
dir3
...
Now, I would like to create in each dirN folder a sub-folder named converted. I tried with:
> find . -maxdepth 1 -type d -execdir mkdir converted {} +
But without success. Could anybody help?
TCLSH VERSION:
find . -maxdepth 1 -type d -exec mkdir -p \{\}/converted \;
You have to escape the curly braces with "\".
BASH VERSION:
Fist login into BASH:
bash --login
then perform the command:
sudo find . -maxdepth 1 -type d -exec mkdir -p {}/converted \;
This is what worked for me. But note that it also creates the folder converted in the current directory itself, since find . matches . as well.
Also, are you in the specific folder where you want to create those subdirectories?
If not, please navigate into the folder first, or change the find command's path parameter like so:
sudo find /PATH/TO/YOUR/DIR -maxdepth 1 -type d -exec mkdir -p {}/converted \;
If you really want to stick to tclsh:
Optional:
This is just my opinion:
You should really use BASH or ZSH for your daily sysadmin/programming work.
If you want to change the $SHELL type in:
sudo chsh
Another way:
bash
for i in dir*; do [ -d "$i" ] || continue; mkdir "$i"/converted; done
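A quick sandbox check of the find variant (dir1 and dir2 stand in for the real folders); note that GNU find substitutes {} even when it is part of a larger argument like {}/converted:

```shell
# Each immediate subdirectory gets a converted/ child.
tmp=$(mktemp -d)
mkdir "$tmp/dir1" "$tmp/dir2"
find "$tmp" -mindepth 1 -maxdepth 1 -type d -exec mkdir -p {}/converted \;
```

Using -mindepth 1 here also avoids the "converted on top" surprise mentioned above, since it excludes the starting directory itself.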

How to go to each directory and execute a command?

How do I write a bash script that goes through each directory inside a parent_directory and executes a command in each directory.
The directory structure is as follows:
parent_directory (name could be anything - doesnt follow a pattern)
001 (directory names follow this pattern)
0001.txt (filenames follow this pattern)
0002.txt
0003.txt
002
0001.txt
0002.txt
0003.txt
0004.txt
003
0001.txt
the number of directories is unknown.
This answer posted by Todd helped me.
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && pwd" \;
The \( ! -name . \) avoids executing the command in current directory.
You can do the following, when your current directory is parent_directory:
for d in [0-9][0-9][0-9]
do
( cd "$d" && your-command-here )
done
The ( and ) create a subshell, so the current directory isn't changed in the main script.
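A quick check that the parentheses really protect the caller's working directory (the sub directory name is made up):

```shell
# The cd happens inside a subshell, so the parent shell's cwd is untouched.
tmp=$(mktemp -d)
mkdir "$tmp/sub"
before=$PWD
( cd "$tmp/sub" && touch made-here )
after=$PWD
```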
You can achieve this by piping and then using xargs. The catch is you need to use the -I flag which will replace the substring in your bash command with the substring passed by each of the xargs.
ls -d */ | xargs -I {} bash -c "cd '{}' && pwd"
You may want to replace pwd with whatever command you want to execute in each directory.
If you're using GNU find, you can try -execdir parameter, e.g.:
find . -type d -execdir realpath "{}" ';'
or (as per @gniourf_gniourf comment):
find . -type d -execdir sh -c 'printf "%s/%s\n" "$PWD" "$0"' {} \;
Note: You can use ${0#./} instead of $0 to strip the leading ./.
or more practical example:
find . -name .git -type d -execdir git pull -v ';'
If you want to include the current directory, it's even simpler by using -exec:
find . -type d -exec sh -c 'cd -P -- "{}" && pwd -P' \;
or using xargs:
find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'
Or similar example suggested by @gniourf_gniourf:
find . -type d -print0 | while IFS= read -r -d '' file; do
# ...
done
The above examples support directories with spaces in their name.
Or by assigning into bash array:
dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
( cd "$dir" && echo "$PWD" )
done
done
Change . to your specific folder name. If you don't need to run recursively, you can use: dirs=(*) instead. The above example doesn't support directories with spaces in the name.
So as @gniourf_gniourf suggested, the only proper way to put the output of find into an array without using an explicit loop will be available in Bash 4.4 with:
mapfile -t -d '' dirs < <(find . -type d -print0)
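A small sandbox run of the NUL-delimited mapfile (requires bash >= 4.4; the name "a dir" deliberately contains a space to show it survives intact):

```shell
# mapfile splits on NUL bytes from -print0, so names with spaces stay whole.
tmp=$(mktemp -d)
mkdir "$tmp/a dir" "$tmp/b"
mapfile -t -d '' dirs < <(find "$tmp" -mindepth 1 -type d -print0)
echo "${#dirs[@]}"
```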
Or not a recommended way (which involves parsing of ls):
ls -d */ | awk '{print $NF}' | xargs -n1 sh -c 'cd $0 && pwd && echo Do stuff'
The above example ignores the current dir (as requested by OP), but it breaks on names containing spaces.
See also:
Bash: for each directory at SO
How to enter every directory in current path and execute script? at SE Ubuntu
If the top-level folder is known you can just write something like this:
for dir in `ls $YOUR_TOP_LEVEL_FOLDER`;
do
for subdir in `ls $YOUR_TOP_LEVEL_FOLDER/$dir`;
do
your-commands-here
done
done
In place of your-commands-here you can put as much code as you want.
Note that I didn't "cd" to any directory.
Cheers,
for dir in PARENT/*
do
test -d "$dir" || continue
# Do something with $dir...
done
While one-liners are good for quick and dirty usage, I prefer the more verbose version below for writing scripts. This is the template I use; it takes care of many edge cases and allows you to write more complex code to execute on a folder. You can write your bash code in the function dir_command. Below, dir_command implements tagging each repository in git as an example. The rest of the script calls dir_command for each folder in the directory. An example of iterating through only a given set of folders is also included.
#!/bin/bash
#Use set -x if you want to echo each command while getting executed
#set -x
#Save current directory so we can restore it later
cur=$PWD
#Save command line arguments so functions can access it
args=("$@")
#Put your code in this function
#To access command line arguments use syntax ${args[1]} etc
function dir_command {
#This example command implements doing git status for folder
cd "$1"
echo "$(tput setaf 2)$1$(tput sgr 0)"
git tag -a ${args[0]} -m "${args[1]}"
git push --tags
cd ..
}
#This loop will go to each immediate child and execute dir_command
find . -maxdepth 1 -type d \( ! -name . \) | while read -r dir; do
dir_command "$dir/"
done
#This example loop only loops through give set of folders
declare -a dirs=("dir1" "dir2" "dir3")
for dir in "${dirs[@]}"; do
dir_command "$dir/"
done
#Restore the folder
cd "$cur"
I don't get the point of the file formatting, since you only want to iterate through folders... Are you looking for something like this?
cd parent
find . -type d | while read -r d; do
ls "$d/"
done
you can use
find .
to search all files/dirs in the current directory recursively.
Then you can pipe the output to the xargs command like so:
find . | xargs 'command here'
#!/bin/bash
for folder_to_go in $(find . -mindepth 1 -maxdepth 1 -type d \( -name "*" \) );
# you can add a pattern instead of * here; as written it goes into every folder
# -mindepth/-maxdepth 1 means one folder deep
do
cd "$folder_to_go"
echo "$folder_to_go ########################################## "
# whatever you want to do goes here
cd ../ # if maxdepth/mindepth = 2, cd ../../
done
# you can try adding nested for loops with more patterns; this will sneak anywhere you want
You could run a sequence of commands in each folder in one line like:
for d in PARENT_FOLDER/*/; do (cd "$d" && tar -cvzf "$OLDPWD/${d%/}.tar.gz" *.*); done
for p in [0-9][0-9][0-9];do
(
cd "$p"
for f in [0-9][0-9][0-9][0-9]*.txt;do
ls "$f"; # Your operands
done
)
done
