Using an alias in find -exec - bash

I have a very long command in bash, which I do not want to type all the time, so I put an alias in my .profile
alias foo='...'
Now I want to execute this alias using find -exec
find . -exec foo '{}' \;
but find cannot find foo:
find: foo: No such file or directory
Is it possible to use an alias in find?

find itself doesn't know anything about aliases, but your shell does. If you are using a recent enough version of bash (I think 4.0 added this feature), you can use find . -exec ${BASH_ALIASES[foo]} {} \; to insert the literal content of the alias at that point in the command line.
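For example, a minimal interactive-shell sketch, with ls -la standing in as the long command (this only behaves well when the alias body is a simple command with no embedded quoting):
alias foo='ls -la'
echo "${BASH_ALIASES[foo]}"    # prints the literal alias body: ls -la
# unquoted, the expansion word-splits into "ls" and "-la",
# which find receives as separate -exec arguments:
find . -maxdepth 1 -exec ${BASH_ALIASES[foo]} {} \;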

Nope, find doesn't know anything about your aliases. Aliases are not like environment variables in that they aren't "inherited" by child processes.
You can create a shell script with the same command, make it executable (chmod +x) and put it in your PATH. That will work with find.
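A minimal sketch of such a wrapper, assuming ~/bin is on your PATH and your-very-long-command stands in for the alias body:
$ cat ~/bin/foo
#!/bin/bash
# same command the alias ran; "$@" forwards the file name(s) from find
your-very-long-command "$@"
$ chmod +x ~/bin/foo
$ find . -exec foo '{}' \;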

Another way of calling an alias when processing the results of find is to use something like this answer
so the following should work:
alias ll="ls -al"
find . -type d | while read -r folder; do ll "$folder"; done
This works because the alias is expanded by your interactive shell when it parses the loop; find itself never sees the alias.

I am using the commonly known ll alias for this example, but you may use your own alias instead; just replace ll in the following line with your alias (foo) and it should work:
find . -exec `alias ll | cut -d"'" -f2` {} \;
your case:
find . -exec `alias foo | cut -d"'" -f2` {} \;
Note it assumes your alias is quoted using the following syntax:
alias foo='your-very-long-command'
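For instance, with the ll alias above, alias ll | cut -d"'" -f2 prints ls -al, so after the command substitution find effectively runs:
find . -exec ls -al {} \;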

It's not possible (or at least difficult and error-prone) to use aliases in the find command.
An easier way to achieve the desired result is to put the contents of the alias in a shell script and run that script:
alias foo | sed "s/alias foo='//;s/'$/ \"\$@\"/" > /tmp/foo
find -exec bash /tmp/foo {} \;
The sed command removes the leading alias foo=' and replaces the trailing ' with "$@", which expands to the arguments passed to the script.
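If the alias were, say, alias foo='your-very-long-command' (borrowing the earlier answer's placeholder), the generated /tmp/foo would contain the single line:
your-very-long-command "$@"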

You can use the variable instead.
So instead of:
alias foo="echo test"
use:
foo="echo test"
then execute it either by command substitution or eval, for instance:
find . -type f -exec sh -c "eval $foo" \;
or:
find . -type f -exec sh -c "echo `$foo`" \;
Here is a real example, which finds all non-binary files:
IS_BINARY='import sys; sys.exit(not b"\x00" in open(sys.argv[1], "rb").read())'
find . -type f -exec bash -c "python -c '$IS_BINARY' {} || echo {}" \;
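Note that splicing {} into the command string, as above, misbehaves on file names containing spaces or quotes. A safer sketch of the same idea passes the file name as a positional parameter (same IS_BINARY code, exported so the child shell can expand it):
export IS_BINARY='import sys; sys.exit(not b"\x00" in open(sys.argv[1], "rb").read())'
# the child bash expands "$IS_BINARY" itself; "$1" is the file from find, _ fills $0
find . -type f -exec bash -c 'python -c "$IS_BINARY" "$1" || printf "%s\n" "$1"' _ {} \;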

I ran into the same thing and pretty much implemented skjaidev's solution.
I created a bash script called findVim.sh with the following contents:
[ roach@sepsis:~ ]$ cat findVim.sh
#!/bin/bash
find . -iname "$1" -exec vim '{}' \;
Then I added the .bashrc alias as:
[ roach@sepsis:~ ]$ cat ~/.bashrc | grep fvim
alias fvim='sh ~/findVim.sh'
Finally, I reloaded .bashrc with source ~/.bashrc.
Anyways long story short I can edit arbitrary script files slightly faster with:
$ fvim foo.groovy

Related

How to cd into grep output?

I have a shell script which basically searches all folders inside a location and I use grep to find the exact folder I want to target.
for dir in /root/*; do
grep "Apples" "${dir}"/*.* || continue
While grep successfully finds my target directory, I'm stuck on how I can move the folders I want to move in my target directory. An idea I had was to cd into grep output but that's where I got stuck. Tried some Google results, none helped with my case.
Example grep output: Binary file /root/ant/containers/secret/Documents/2FD412E0/file.extension matches
I want to cd into 2FD412E0 and move two folders inside that directory.
dirname is the key to that:
cd "$(dirname "$(grep "...." ...)")"
will let you enter the directory.
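A concrete sketch with the asker's pattern (using grep -Rl so only matching file names are printed, and head -n 1 to keep the first match):
cd "$(dirname "$(grep -Rl "Apples" /root/ | head -n 1)")"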
As people mentioned, dirname is the right tool to strip off the file name from the path.
I would use find for such kind of task:
while read -r file
do
    target_dir=$(dirname "$file")
    # do something with "$target_dir"
done < <(find /root/ -type f \
    -exec grep "Apples" --files-with-matches {} \;)
Consider using find's -maxdepth option. See the man page for find.
Well, there is actually simpler solution :) I just like to write bash scripts. You might simply use single find command like this:
find /root/ -type f -exec grep Apples {} ';' -exec ls -l {} ';'
Note the second -exec. It will be executed, if the previous -exec command exited with status 0 (success). From the man page:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ; is encountered. The string {} is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find.
Replace the ls -l command with your stuff.
And if you want to execute dirname within the -exec command, you may do the following trick:
find /root/ -type f -exec grep -q Apples {} ';' \
    -exec sh -c 'cd "$(dirname "$0")"; pwd' {} ';'
Replace pwd with your stuff.
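For the asker's actual goal of moving folders, the pwd placeholder could become an mv along these lines (the folder names and destination are hypothetical):
find /root/ -type f -exec grep -q Apples {} ';' \
    -exec sh -c 'cd "$(dirname "$0")" && mv folder1 folder2 /destination/' {} ';'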
When find is not available
In the comments you write that find is not available on your system. The following solution works without find:
grep -R --files-with-matches Apples "${dir}" | while read -r file
do
    target_dir=$(dirname "$file")
    # do something with "$target_dir"
    echo "$target_dir"
done

Edit a find -exec echo command to include a grep for a string

So I have the following command which looks for a series of files and appends three lines to the end of everything found. Works as expected.
find /directory/ -name "file.php" -type f -exec sh -c "echo -e 'string1\nstring2\nstring3\n' >> {}" \;
What I need to do is also look for any instance of string1, string2, or string3 in the find output of file.php prior to echoing/appending the lines, so I don't append to a file unnecessarily. (This is being run in a crontab.)
Using | grep -v "string" after the find breaks the -exec command.
How would I go about accomplishing my goal?
Thanks in advance!
That -exec command isn't safe for strings with spaces.
You want something like this instead (assuming finding any of the strings is reason not to add any of the strings).
find /directory/ -name "file.php" -type f -exec sh -c "grep -qE 'string1|string2|string3' \"\$1\" || echo -e 'string1\nstring2\nstring3\n' >> \"\$1\"" - {} \;
To explain the safety issue.
find places {} in the command it runs as a single argument but when you splat that into a double-quoted string you lose that benefit.
So instead of doing that you pass the file as an argument to the shell and then use the positional arguments in the shell command with quotes.
The command above simply chains the echo to a failure from grep to accomplish the goal.
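A quick way to convince yourself the guard is idempotent (hypothetical test file; printf replaces echo -e for portability):
guard='grep -qE "string1|string2|string3" "$1" || printf "string1\nstring2\nstring3\n" >> "$1"'
printf 'existing content\n' > /tmp/file.php
sh -c "$guard" - /tmp/file.php   # appends the three lines
sh -c "$guard" - /tmp/file.php   # second run finds string1 and does nothing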

Apply a script to subdirectories

I have read many times that if I want to execute something over all subdirectories I should run something like one of these:
find . -name '*' -exec command arguments {} \;
find . -type f -print0 | xargs -0 command arguments
find . -type f | xargs -I {} command arguments {} arguments
The problem is that it works well with standard executables, but not as expected when the command is a user-defined function or a script. How do I fix this?
So what I am looking for is a line of code or a script in which I can replace command with myfunction or myscript.sh, and have it go into every single subdirectory of the current directory and execute that function or script there, with whatever arguments I supply.
Explaining it another way, I want something that works over all subdirectories as nicely as for file in *; do command_myfunction_or_script.sh arguments "$file"; done works over the current directory.
Instead of -exec, try -execdir.
It may be that in some cases you need to use bash:
foo () { echo $1; }
export -f foo
find . -type f -name '*.txt' -exec bash -c 'foo arg arg' \;
The last line could be:
find . -type f -name '*.txt' -exec bash -c 'foo "$@"' _ arg arg \;
Depending on what args might need expanding and when. The underscore represents $0.
You could use -execdir where I have -exec if that's needed.
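Putting those pieces together, a minimal end-to-end sketch (foo here is just an example function that prints each file name it is handed):
foo() { printf 'processing %s\n' "$1"; }
export -f foo                    # exported functions are visible to child bash processes
find . -type f -name '*.txt' -exec bash -c 'foo "$1"' _ {} \;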
The examples that you give, such as:
find . -name '*' -exec command arguments {} \;
don't go to every single subdirectory of the current directory and execute command there, but rather execute command from the current directory with the path of each file listed by find as an argument.
If what you want is to actually change directory and execute a script, you could try something like this:
STDIR=$PWD; IFS=$'\n'; for dir in $(find . -type d); do cd "$dir"; /path/to/command; cd "$STDIR"; done; unset IFS
Here the current directory is saved to STDIR and the bash Internal Field Separator is set to a newline so names won't split on spaces. Then for each directory (-type d) that find returns, we cd to that directory, execute the command (using the full path here as changing directories will break a relative path) and then cd back to the starting directory.
There may be some way to use find with a function, but it won't be terribly elegant. If you have bash 4, what you probably want to do is use globstar:
shopt -s globstar
for file in **/*; do
    myfunction "$file"
done
If you're looking for compatibility with POSIX or older versions of bash, you will be forced to source the file defining your function when you invoke bash. So something like this:
find <args> -exec bash -c '. funcfile;
for file; do
myfunction "$file"
done' _ {} +
But that's just ugly. When I get to this point, I usually just put my function in a script on my PATH and live with it.
If you want to use a bash function, this is one way.
work ()
{
    local file="$1"
    local dir=$(dirname "$file")
    pushd "$dir"
    echo "in directory $(pwd) working with file $(basename "$file")"
    popd
}
find . -name '*' | while read -r line
do
    work "$line"
done

how to use a bash function defined in your .bashrc with find -exec

my .bashrc has the following function
function myfile {
    file "$1"
}
export -f myfile
It works fine when I call it directly:
rajesh@rajesh-desktop:~$ myfile out.ogv
out.ogv: Ogg data, Skeleton v3.0
It does not work when I try to invoke it through find -exec:
rajesh@rajesh-desktop:~$ find ./ -name *.ogv -exec myfile {} \;
find: `myfile': No such file or directory
Is there a way to call bash functions with find -exec?
Any help is greatly appreciated.
Update:
Thanks for the response Jim.
But that's exactly what I wanted to avoid in the first place, since I have a lot of utility functions defined in my bash scripts and I wanted to use them with other useful commands like find -exec.
I totally see your point, though: find can run executables, but it has no idea that the argument passed is a function defined in a script.
I get the same error when I try to exec it at the bash prompt:
$ exec myfile out.ogv
I was hoping there might be some neat trick whereby exec could be given some hypothetical command like "bash -myscriptname -myfunctionname".
I guess I should try to find some way to create a bash script on the fly and run it with exec.
find ./ -name '*.ogv' -exec bash -c 'myfile {}' \;
I managed to run it perhaps more elegantly as:
function myfile { ... }
export -f myfile
find -name out.ogv -exec bash -c '"$@"' myfile myfile '{}' \;
Notice that myfile is given twice. The first one is the $0 parameter of the script (and in this case it can be basically anything). The second one is the name of the function to run.
$ cat functions.bash
#!/bin/bash
function myecho { echo "$@"; }
function myfile { file "$@"; }
function mycat { cat "$@"; }
myname=$(basename "$0")
eval ${myname} "$@"
$ ln functions.bash mycat
$ ./mycat /etc/motd
Linux tallguy 2.6.32-22-core2 ...
$ ln functions.bash myfile
$ myfile myfile
myfile: Bourne-Again shell script text executable
$ ln functions.bash myecho
$ myecho does this do what you want\?
does this do what you want?
$
where, of course, the functions can be a tad more complex than my examples.
You can get bash to run a function by putting the command into bash's StdIn:
bash$ find ./ -name '*.ogv' -exec echo myfile {} \; | bash
The command above will work for your example but you need to take note of the fact that all of the 'myfile...' commands are generated at once and sent to a single bash process.
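Note also that the generated command lines are unquoted, so this breaks on file names containing spaces. One hedge, assuming bash's printf %q is available, is to have a child shell quote each name before the text reaches the final bash:
find ./ -name '*.ogv' -exec bash -c 'printf "myfile %q\n" "$1"' _ {} \; | bash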
I don't think find can do this, since it's the find command itself that's executing
the command, and not the shell you're currently running...so bash functions or aliases
won't work there. If you take your function definition and turn it into a separate
bash script called myfile, make it executable, and install it on your path somewhere,
find should do the right thing with it.
Even simpler:
function myfile { echo "$@" ; }
export -f myfile
find . -type f -exec bash -c 'myfile "{}"' \;
Child shell scripts seem to keep the parent's exported functions, so you could do a script similar to this one:
'runit.sh'
#! /bin/bash
"$@"
then do find -name out.ogv -exec ./runit.sh myfile '{}' \;
and it works! :)
Thanks Joao. This looks like a very clever and elegant solution. One little issue was that I had to source my script first to get the myfile function, so I borrowed from your suggestion and made my runit.sh as follows:
#!/bin/bash
script_name=$1
func_name=$2
func_arg=$3
source "$script_name"
"$func_name" "$func_arg"
Now I can run it as follows
$ find ./ -name *.ogv -exec ./runit.sh ~/.bashrc myfile {} \;
./out.ogv: Ogg data, Skeleton v3.0
Otherwise I was getting
$ find ./ -name *.ogv -exec ./runit.sh myfile {} \;
./runit.sh: 1: myfile: not found
Anyway thanks a lot.

How do I write a bash alias/function to grep all files in all subdirectories for a string?

I've been using the following command to grep for a string in all the python source files in and below my current directory:
find . -name '*.py' -exec grep -nHr <string> {} \;
I'd like to simplify things so that I can just type something like
findpy <string>
And get the exact same result. Aliases don't seem sufficient since they only do a string expansion, and the argument I need to specify is not the last argument. It sounds like functions are suitable for the task, so I have several questions:
How do I write it?
Where do I put it?
If you don't want to create an entire script for this, you can do it with just a shell function:
findpy() { find . -name '*.py' -exec grep -nHr "$1" {} \; ; }
...but then you may have to define it in both ~/.bashrc and ~/.bash_profile, so it gets defined for both login and interactive shells (see the INVOCATION section of bash's man page).
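A common arrangement that avoids defining the function twice is to keep it in ~/.bashrc and have ~/.bash_profile source that file:
# in ~/.bash_profile
[ -f ~/.bashrc ] && . ~/.bashrc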
All the "find ... -exec" solutions above are OK in the sense that they work, but they are horribly inefficient and will be extremely slow for large trees. The reason is that they launch a new process for every single *.py file. Instead, use xargs(1), and run grep only on files (not directories):
#! /bin/sh
find . -name \*.py -type f | xargs grep -nHr "$1"
For example:
$ time sh -c 'find . -name \*.cpp -type f -exec grep foo {} \; >/dev/null'
real 0m3.747s
$ time sh -c 'find . -name \*.cpp -type f | xargs grep foo >/dev/null'
real 0m0.278s
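For the record, POSIX find can also batch arguments itself via the -exec ... {} + terminator, which gets most of the same speedup without a pipeline (foo stands in for the search string):
find . -name '*.py' -type f -exec grep -nH foo {} +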
On a side note, you should take a look at Ack for what you are doing. It is a grep replacement written in Perl that can filter files based on the target language, ignore .svn directories, and the like.
Example (snippet from Trac source):
$ ack --python foo ./mysource
ticket/tests/wikisyntax.py
139:milestone:foo
144:<a class="missing milestone" href="/milestone/foo" rel="nofollow">milestone:foo</a>
ticket/tests/conversion.py
34: ticket['foo'] = 'This is a custom field'
ticket/query.py
239: count_sql = 'SELECT COUNT(*) FROM (' + sql + ') AS foo'
I wanted something similar, and the answer by Idelic reminded me of one of the nice features of xargs: it puts the command at the end. You see, my problem was that I wanted to write a shell alias that would "accept parameters" (really, one that would expand in such a way that I could pass parameters to grep).
Here's what I added to my bash_aliases:
alias findpy="find . -type f -name '*.py' | xargs grep"
This way, I could write findpy WORD or findpy -e REGEX or findpy -il WORD - the point being that I could use any grep command-line option.
Put the following lines in a file named findpy:
#!/bin/bash
find . -name '*.py' -exec grep -nHr "$1" {} \;
Then say
chmod u+x findpy
I normally have a directory called bin in my home directory where I put little shell scripts like this. Make sure to add the directory to your PATH.
The script:
#!/bin/bash
find . -name '*.py' -exec grep -nHr "$1" {} ';'
is how I'd do it.
You write it with an editor like vim and put it somewhere on your path. My normal approach is to have a ~/bin directory and make sure my .profile file (or equivalent) contains:
PATH=$PATH:~/bin
Many versions of grep have options to do recursion, specify filename patterns, etc.
grep --perl-regexp --recursive --include='*.py' --regexp="$1" .
This recurses starting from the current directory (.), looks only at files ending in 'py', uses Perl-style regular expressions.
If your version of grep doesn't support --recursive and --include, then you can still use find and xargs, but be sure to allow for pathnames with embedded spaces by using the -print0 argument to find and the --null option to xargs to handle that.
find . -type f -name '*.py' -print0 | xargs --null grep "$1"
should work.
Add the following line to your ~/.bashrc or ~/.bash_profile or ~/.profile
alias findpy='find . -type f -name "*.py" -print0 | xargs -0 grep'
then you can use it like this
findpy def
or with grep options
findpy -i class
the following alias will ignore the version control meta-directory of git and svn
alias findpy='find . -type f -not -path "*/.git/*" -a -not -path "*/.svn/*" -name "*.py" -print0 | xargs -0 grep'
#######################################################################################
#
# Function to search all files (including sub-directories) that match a given file
# extension ($2) looking for an indicated string ($1) - in a case insensitive manner.
#
# For Example:
#
# -> findfile AllowNegativePayments cpp
#
#
#######################################################################################
findfile ()
{
find . -iname "*.$2*" -type f -print0 | xargs -0 grep -i "$1" 2> /dev/null
}
alias _ff='findfile'
