How to use if command as an extension of find? - shell

I want to search for some name(s) in a directory tree, and when I find a specific directory, I want to check whether it has a certain default subdirectory. The problem is that I do not know how to accomplish this. I tried using this command:
find -iname $i -exec if [ -d $1/subdir ] then echo $1 fi
but then I get an error like this:
find: missing argument to `-exec'
So, what is the right solution for this?

-exec requires a single executable, not an arbitrary shell command. Run a new shell instance explicitly, and pass your shell command as the argument to its -c option. Pass {} as a positional argument to sh so that the name of the found directory is properly passed to the shell command; note that the first argument after the -c script becomes $0, so a placeholder (_ here) is needed before {} for it to land in $1.
find . -iname "$i" -exec sh -c 'if [ -d "$1"/subdir ]; then echo "$1"; fi' _ '{}' \;
It might be a little simpler to reorganize your logic, if possible:
find . -wholename "*/$i/subdir" -type d -exec dirname '{}' \;
This has find look for the subdir directory itself instead of its parent, then print the name of the directory containing subdir.
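For instance, given a hypothetical tree containing ./work/proj/subdir, with $i set to proj:
find . -wholename "*/proj/subdir" -type d -exec dirname '{}' \;
./work/proj
The second line is the output: the directory that contains subdir.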

Related

find + cp spaces in path AND need to rename. Howto?

I need to find all files recursively with the name 'config.xml' and set them aside for analysis. The paths have spaces in them, just to keep it interesting. However, I need the copies to be unique or they will collide in the same folder. What I would like to do is basically copy them off, but using the name of the directory they were found in. The command I want is something like the one from this question, except that I need it to do something like $(dirname {}). When I do that, nothing gets moved (but I get no error).
Sample, but non-functional command:
find . -name 'config.xml' -exec sh -c 'cp "$1" "$2.xml"' -- {} "$HOME/data/$(dirname {})" \;
To do this with just one shell, not one per file found (as used by prior answers):
while IFS= read -r -d '' filename; do
  outFile="$HOME/data/${filename%/*}.xml"
  mkdir -p -- "${outFile%/*}"
  cp -- "$filename" "$outFile"
done < <(find . -name 'config.xml' -print0)
This way your find emits a NUL-delimited stream of filenames, consumed one-by-one by the while read loop in the parent shell.
(You could use "$HOME/data/$(dirname "$filename").xml", but from a performance perspective that's really silly: $() forks off a subshell, and dirname is an external executable that needs to be exec'd, linked and loaded; there's no point in all that overhead when you can do the string manipulation inside the shell itself.)
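To make the difference concrete, here is a minimal sketch of the two equivalent ways to get the directory part of a path, assuming filename holds something like ./a/b/config.xml:
dir=$(dirname "$filename")   # forks a subshell and execs the external dirname binary
dir=${filename%/*}           # pure in-shell parameter expansion, no extra processes
(The two differ for a bare name with no slash: dirname prints ., while the expansion leaves the name unchanged.)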
You may use it like this:
find . -name 'config.xml' -exec bash -c \
'd="$HOME/data/${1%/*}/"; mkdir -p "$d"; command cp -p "$1" "$d"' - {} \;
-exec sh is a little hard to handle, but not impossible. The $(dirname ...) is expanded before sh is run, so it's equal to dirname {}, the dirname of the literal string {}. Do something like -exec sh -c ' .... ' -- {} and put the $(dirname ...) inside the sh script, using $1.
find . -name 'config.xml' -exec sh -c 'out=$2/data/$(dirname "$1").xml; mkdir -p "${out%/*}"; cp "$1" "$out"' -- {} "$HOME" \;
(The added mkdir -p creates the destination directory first; without it, cp has nowhere to put files found in subdirectories.)

How to cd into grep output?

I have a shell script which basically searches all folders inside a location and I use grep to find the exact folder I want to target.
for dir in /root/*; do
grep "Apples" "${dir}"/*.* || continue
While grep successfully finds my target directory, I'm stuck on how to move the folders I want into that target directory. An idea I had was to cd into the grep output, but that's where I got stuck. I tried some Google results; none helped with my case.
Example grep output: Binary file /root/ant/containers/secret/Documents/2FD412E0/file.extension matches
I want to cd into 2FD412E0 and move two folders inside that directory.
dirname is the key to that:
cd "$(dirname "$(grep -l "...." ...)")"
will let you enter the directory. Note the -l (--files-with-matches) flag: without it, grep prints Binary file ... matches rather than just the file name.
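A concrete, hedged version of the same idea (the Apples pattern and /root come from the question; head -n 1 is an added assumption that only the first match matters):
cd "$(dirname "$(grep -Rl "Apples" /root | head -n 1)")"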
As people mentioned, dirname is the right tool to strip the file name off a path.
I would use find for this kind of task:
while read -r file
do
  target_dir=$(dirname "$file")
  # do something with "$target_dir"
done < <(find /root/ -type f \
  -exec grep "Apples" --files-with-matches {} \;)
Consider using find's -maxdepth option. See the man page for find.
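For example, to search only files at most two levels below /root/ (the depth of 2 is just an illustration):
find /root/ -maxdepth 2 -type f -exec grep -l "Apples" {} \;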
Well, there is actually a simpler solution :) I just like to write bash scripts. You can simply use a single find command like this:
find /root/ -type f -exec grep Apples {} ';' -exec ls -l {} ';'
Note the second -exec. It will only be executed if the previous -exec command exited with status 0 (success). From the man page:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ; is encountered. The string {} is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find.
Replace the ls -l command with your stuff.
And if you want to execute dirname within the -exec command, you can do the following trick:
find /root/ -type f -exec grep -q Apples {} ';' \
  -exec sh -c 'cd "$(dirname "$0")"; pwd' {} ';'
Replace pwd with your stuff.
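For instance, to move two folders into each matching file's directory, a sketch might look like this (/tmp/folderA and /tmp/folderB are hypothetical stand-ins for the two folders from the question):
find /root/ -type f -exec grep -q Apples {} ';' \
  -exec sh -c 'mv /tmp/folderA /tmp/folderB "$(dirname "$0")"' {} ';'
Note this runs once per matching file, so it only makes sense when there is a single match.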
When find is not available
In the comments you write that find is not available on your system. The following solution works without find:
grep -R --files-with-matches Apples "${dir}" | while read -r file
do
  target_dir=$(dirname "$file")
  # do something with "$target_dir"
  echo "$target_dir"
done

find piped to xargs with complex command

I am trying to process DVD files that are in many different locations on a disk. The thing they have in common is that they (each set of input files) are in a directory named VIDEO_TS. The output in each case will be a single file named for the parent of this directory.
I know I can get a fully qualified path to each directory with:
find /Volumes/VolumeName -type d -name "VIDEO_TS" -print0
and I can get the parent directory by piping to xargs:
find /Volumes/VolumeName -type d -name "VIDEO_TS" -print0 | xargs -0 -I{} dirname {}
and I also know that I can get the parent directory name on its own by appending:
| xargs -0 -I{} basename {}
What I can't figure out is how I then pass these parameters to, e.g., HandBrakeCLI:
./HandBrakeCLI -i /path/to/filename/VIDEO_TS -o /path/to/convertedfiles/filename.m4v
I have read here about the expansion capabilities of the shell and suspect that's going to help here (not using dirname or basename for a start), but the more I read the more confused I get!
You don't actually need xargs for this at all: You can read a NUL-delimited stream into a shell loop, and run the commands you want directly from there.
#!/bin/bash
source_dir=/Volumes/VolumeName
dest_dir=/Volumes/OtherName
while IFS= read -r -d '' dir; do
  name=${dir%/VIDEO_TS}   # trim /VIDEO_TS off the end of dir, assign to name
  name=${name##*/}        # remove everything before the last remaining / from name
  ./HandBrakeCLI -i "$dir" -o "$dest_dir/$name.m4v"
done < <(find "$source_dir" -type d -name "VIDEO_TS" -print0)
See the article Using Find on Greg's wiki, or BashFAQ #001 for general information on processing input streams in bash, or BashFAQ #24 to understand the value of using process substitution (the <(...) construct here) rather than piping from find into the loop.
Also, find contains an -exec action which can be used as follows:
source_dir=/Volumes/VolumeName
dest_dir=/Volumes/OtherName
export dest_dir # export allows use by subprocesses!
find "$source_dir" -type d -name "VIDEO_TS" -exec bash -c '
for dir; do
name=${dir%/VIDEO_TS}
name=${name##*/}
./HandBrakeCLI -i "$dir" -o "$dest_dir/$name.m4v"
done
' _ {} +
This passes the found directory names directly on the argument list of the shell invoked with bash -c. Since the default list a for loop iterates over is "$@", the argument list, this implicitly iterates over the directories found by find.
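A quick standalone illustration of that default (the two paths are arbitrary stand-ins):
set -- /a/VIDEO_TS /b/VIDEO_TS   # simulate the arguments find would pass
for dir; do                      # equivalent to: for dir in "$@"; do
  echo "$dir"
done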
If I understand what you are trying to do, the simplest solution would be to create a little wrapper which takes a path and invokes your CLI:
File: CLIWrapper
#!/bin/bash
for dir in "$#"; do
./HandBrakeCLI -i "${dir%/*}" -o "/path/to/convertedfiles/${dir##*/}.m4v"
done
Edit: I think I misunderstood the question. It's possible that the above script should read:
./HandBrakeCLI -i "$dir" -o "/path/to/convertedfiles/${dir##*/}.m4v"
or perhaps something slightly different. But the theory is valid. :)
Then you can invoke that script using the -exec option to find. The script loops over its arguments, making it possible for find to send multiple arguments to a single invocation using the + terminator:
find /Volumes/VolumeName -type d -name "VIDEO_TS" -exec ./CLIWrapper {} +

Apply a script to subdirectories

I have read many times that if I want to execute something over all subdirectories I should run something like one of these:
find . -name '*' -exec command arguments {} \;
find . -type f -print0 | xargs -0 command arguments
find . -type f | xargs -I {} command arguments {} arguments
The problem is that this works well with core utilities, but not as expected when the command is a user-defined function or a script. How can I fix it?
So what I am looking for is a line of code or a script in which I can replace command with myfunction or myscript.sh, and that goes into every single subdirectory of the current directory and executes the function or script there, with whatever arguments I supply.
Explaining it another way, I want something that works over all subdirectories as nicely as for file in *; do command_myfunction_or_script.sh arguments "$file"; done works over the current directory.
Instead of -exec, try -execdir.
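-execdir runs the command from the directory containing each matched file instead of from the directory where you started find. A minimal sketch, where myscript.sh stands in for your own script:
find . -type f -execdir myscript.sh arguments {} \;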
It may be that in some cases you need to use bash:
foo () { echo "$1"; }
export -f foo
find . -type f -name '*.txt' -exec bash -c 'foo arg arg' \;
The last line could be:
find . -type f -name '*.txt' -exec bash -c 'foo "$#"' _ arg arg \;
Depending on what args might need expanding and when. The underscore represents $0.
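To see the placeholder at work, here is a tiny standalone demonstration (the strings are arbitrary):
bash -c 'echo "\$0 is $0, \$1 is $1"' _ hello
This prints: $0 is _, $1 is hello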
You could use -execdir where I have -exec if that's needed.
The examples that you give, such as:
find . -name '*' -exec command arguments {} \;
Don't go into every single subdirectory of the current directory and execute command there; rather, they execute command from the current directory, with the path to each file listed by find as an argument.
If what you want is to actually change directory and execute a script, you could try something like this:
STDIR=$PWD; IFS=$'\n'; for dir in $(find . -type d); do cd "$dir"; /path/to/command; cd "$STDIR"; done; unset IFS
Here the current directory is saved to STDIR and the bash Internal Field Separator is set to a newline so names won't split on spaces. Then for each directory (-type d) that find returns, we cd to that directory, execute the command (using the full path here as changing directories will break a relative path) and then cd back to the starting directory.
There may be some way to use find with a function, but it won't be terribly elegant. If you have bash 4, what you probably want to do is use globstar:
shopt -s globstar
for file in **/*; do
  myfunction "$file"
done
If you're looking for compatibility with POSIX or older versions of bash, you will be forced to source the file defining your function when you invoke bash. So something like this:
find <args> -exec bash -c '. funcfile;
for file; do
myfunction "$file"
done' _ {} +
But that's just ugly. When I get to this point, I usually just put my function in a script on my PATH and live with it.
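In other words, something like this hypothetical ~/bin/myfunction (the echo is a placeholder body), assuming ~/bin is on your PATH:
#!/bin/bash
# ~/bin/myfunction: the former shell function as a standalone script
echo "processing $1"
after which find <args> -exec myfunction {} \; calls it directly.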
If you want to use a bash function, this is one way.
work ()
{
  local file="$1"
  local dir=$(dirname "$file")
  pushd "$dir"
  echo "in directory $(pwd) working with file $(basename "$file")"
  popd
}
find . -name '*' | while IFS= read -r line
do
  work "$line"
done

Shell script help! Looping over directories under a directory

I want to write a shell script that loops through all directories under a directory and calls a java program with the directory name as an argument at each iteration.
So the parent directory is provided as an argument to the shell script, e.g.:
. myShell.sh /myFolder/myDirectory
There are 100 directories under /myFolder/myDirectory. For each "directory_i", I want to run:
java myProg directory_i
If someone can provide me with a working shell script that'll be perfect!
You could use find.
The myShell.sh script might look a bit like this. This version will recursively process any and all subdirectories under your target.
DIR="$1"
find "$DIR" -type d -exec java myProg {} \;
The exact set of find options available depends on your variety of unix. If you don't want recursion, you may be able to use -maxdepth as Neeraj noted, or perhaps -prune, which starts to get a bit ugly:
find "$DIR" \( ! -name . -prune \) -type d -exec java myProg {} \;
EDIT: Added prune example.
#!/bin/bash
for file in "$1"/*; do
  if [ -d "$file" ]; then
    java myProg "$file"
    # java your_program_name directory_i
  fi
done
#!/bin/sh
for i in */.; do
echo "$i" aka "${i%/.}"
: your_command
done
