How to find a bash script, then execute it and pass some arguments - bash

How do I write the command?
I tried find . -name test.bash | xargs bash dd, which threw the error bash: xx: No such file or directory.
I also tried find . -name test.bash | xargs bash -c, but that doesn't work either.

I created a super simple test script that just prints its arguments.
#!/bin/bash
# print all of the arguments passed to the script
echo "$@"
Then I find it in its directory and call it with arguments:
find . -name test.bash -exec {} arg1 arg2 \;
It runs and outputs "arg1 arg2".
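If you would rather keep the xargs pipeline from the question, something along these lines should also work (a rough sketch; -print0, -0 and -I{} are my assumptions about the find/xargs you have, not part of the answer above):
# each matching script is substituted for {} and run with the two arguments
find . -name test.bash -print0 | xargs -0 -I{} bash {} arg1 arg2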

Unix shell script not executing from another script

I have written the below command in a shell script:
/usr/bin/find ${FilePath[$i]} -name ${FileName[$i]}* -type f -mtime +${DaysNo[$i]} | grep ${FilePath[$i]}$tempfile > tempFilesList
It works fine when I execute this script directly, but gives me the below error when I try to execute it from another shell script.
ERROR : /usr/bin/find: bad option resultmgr.log_2019-11-07
/usr/bin/find: [-H | -L] path-list predicate-list
It's likely that ${FileName[$i]}* is being expanded to multiple file names which would give you something like -name file1 file2 in your command.
That could happen if, for example, files matching that mask existed in your current working directory for the case where you run it from another script, but not when you're running it from the command line. Some shells will expand the pattern if possible but leave it alone if not, as per the following transcript:
~> echo testprog*
testprog testprog.c
~> echo nosuchfile*
nosuchfile*
~> _
That file2 would then be considered a control argument to find and therefore invalid.
You can check this by simply echoing out the command before running it:
echo Will run: /usr/bin/find ${FilePath[$i]} -name ${FileName[$i]}* -type f -mtime +${DaysNo[$i]} ...
and seeing what it outputs.
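If that is the cause, the usual fix (a sketch of the idea, assuming you want find itself to do the name matching rather than the shell) is to quote the expansions so the * reaches find literally:
# the quotes stop the shell from glob-expanding the pattern before find sees it
/usr/bin/find "${FilePath[$i]}" -name "${FileName[$i]}*" -type f -mtime +"${DaysNo[$i]}" | grep "${FilePath[$i]}$tempfile" > tempFilesList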

Bash -c argument passing to find

This command works
find . -name \*.txt -print
and outputs two filenames
This command works
bash -c 'echo . $0 $1 -print' "-name" "\*.txt"
and outputs this result:
. -name *.txt -print
But this command
bash -c 'find . $0 $1 -print' "-name" "\*.txt"
does not give an error but does not output anything either.
Can anyone tell me what is happening here?
It looks like you're trying to use "\*.txt" to forestall glob-expansion so that the find command sees *.txt instead of e.g. foo.txt.
However, what it ends up seeing is \*.txt. No files match that pattern, so you see no output.
To make find see *.txt as its 3rd argument, you could do this:
bash -c 'find . $0 "$1" -print' "-name" "*.txt"
Edit: Are you really getting . -name *.txt -print as the output of the first command where you replaced find with echo? When I run that command, I get . -name \*.txt -print.
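As an aside (a common convention, not something from the question): because the first argument after the -c string becomes $0 rather than $1, it is usual to pass a throwaway placeholder such as _ so that the real arguments start at $1:
# _ only fills $0; -name becomes $1 and *.txt becomes $2
bash -c 'find . "$1" "$2" -print' _ "-name" "*.txt"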
Well the suggestions from francesco work. But I am still confused by the behaviour here.
We know that putting unquoted wild cards in a find command will usually result in an error. To wit:
find . -name *.txt -print
find: paths must precede expression: `HowTo-Word-Split.txt'
find: possible unquoted pattern after predicate `-name'?
However putting the wild card in single quotes (or escaping it if it is only 1 char) will work like so:
find . -name \*.txt -print
which gives this output (on two separate lines)
> ./HowTo-Word-Split.txt
> ./bash-parms.txt
So in the bash -c version what I was thinking was this:
bash -c 'find . $0 $1 -print' "-name" "*.txt"
would result in the *.txt being expanded even before being passed into the command string,
and that quoting it would result in trying to execute (after the argument substitution and the -c taking effect)
find . -name *.txt -print
which as I just demonstrated does not work.
However there seems to be some sort of magic associated with the -c switch as demonstrated by setting -x at the bash prompt, like so:
$ set -x
$ bash -c ' find . $0 "$1" -print' "-name" "*.txt"
+ bash -c ' find . $0 "$1" -print' -name '*.txt'
./HowTo-Word-Split.txt
./bash-parms.txt
Note that even though I used double quotes in the -c line, the trace shows the argument with single quotes around it. That is just how set -x displays arguments containing glob characters; the reason find works is that "$1" is quoted inside the -c string, so the *.txt passed in is not expanded again by the inner shell and reaches find literally.
Problem solved. :)!
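To make that quoting point concrete, here is a small illustration of my own (the file names in the comment are just the ones from the listing above): an unquoted $1 is glob-expanded again by the inner shell, while a quoted "$1" is passed through untouched.
bash -c 'printf "%s\n" $1' _ '*.txt'     # may expand to HowTo-Word-Split.txt bash-parms.txt
bash -c 'printf "%s\n" "$1"' _ '*.txt'   # prints *.txt literally, which is what find needs for -name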

find -exec when used with sed for file rename not working

I've been trying:
find dev-other -name '*.flac' -type f -exec echo $(echo {} | sed 's,^[^/]*/,,') \;
I expect to see a list of paths to .flac files within dev-other, but without a prepended dev-other/, e.g.:
4515/11057/4515-11057-0095.flac
4515/11057/4515-11057-0083.flac
4515/11057/4515-11057-0040.flac
4515/11057/4515-11057-0105.flac
4515/11057/4515-11057-0017.flac
4515/11057/4515-11057-0001.flac
Instead I see
dev-other/4515/11057/4515-11057-0095.flac
dev-other/4515/11057/4515-11057-0083.flac
dev-other/4515/11057/4515-11057-0040.flac
dev-other/4515/11057/4515-11057-0105.flac
dev-other/4515/11057/4515-11057-0017.flac
Why isn't the sed replace working here, even though it works on its own?
$ echo $(echo dev-other/4515/11057/4515-11057-0047.flac | sed 's,^[^/]*/,,')
4515/11057/4515-11057-0047.flac
I first tried with expansions:
find dev-other -name '*.flac' -type f -exec a={} echo ${a#*/} \;
But got the errors:
find: a=dev-other/700/122866/700-122866-0001.flac: No such file or directory
find: a=dev-other/700/122866/700-122866-0030.flac: No such file or directory
find: a=dev-other/700/122866/700-122866-0026.flac: No such file or directory
find: a=dev-other/700/122866/700-122866-0006.flac: No such file or directory
find: a=dev-other/700/122866/700-122866-0010.flac: No such file or directory
The command substitution $(echo {} | sed ...) is expanded by your shell once, before find ever runs, so sed only sees the literal characters {} and leaves them unchanged; find then substitutes each unmodified path into the already-expanded command, which is why the dev-other/ prefix is still there. For your use-case you can instead use parameter expansion inside a small shell started by -exec:
find dev-other -name '*.flac' -type f -exec bash -c 'x=$1; y="${x#*/}"; echo "$y"' bash {} \;
I start a separate shell (bash or sh will do) via bash -c because the string manipulation with parameter expansion has to happen once per file. Think of each path that find produces as being passed as an argument to this sub-shell, where the manipulation takes place.
When bash -c executes a command, the next argument after the command string is used as $0 (the script's "name" in the process listing), and subsequent arguments become the positional parameters ($1, $2, etc.). This means that the filename passed by find (in place of the {}) becomes the first positional parameter of the script -- and is referenced by $1 inside the mini-script.
If you don't want to pass an extra bash word just to fill $0, use _ in its place:
find dev-other -name '*.flac' -type f -exec bash -c 'x=$1; y="${x#*/}"; echo "$y"' _ {} \;
where _ is a bash predefined variable (not defined in dash, for instance): "At shell startup, set to the absolute pathname used to invoke the shell or shell script being executed as passed in the environment or argument list" (see the Special Parameters section of man bash).
Worth looking at Using Find - Complex Actions
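As an alternative (my addition, and it assumes GNU findutils; the %P directive is not available in every find implementation), GNU find can strip the starting-point prefix itself, with no sed or sub-shell at all:
# %P prints each path with the starting point (here dev-other/) removed
find dev-other -name '*.flac' -type f -printf '%P\n'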

Apply a script to subdirectories

I have read many times that if I want to execute something over all subdirectories I should run something like one of these:
find . -name '*' -exec command arguments {} \;
find . -type f -print0 | xargs -0 command arguments
find . -type f | xargs -I {} command arguments {} arguments
The problem is that this works well with regular commands, but not as expected when the command is a user-defined function or a script. How can I fix that?
So what I am looking for is a line of code or a script in which I can replace command with myfunction or myscript.sh, and it goes to every single subdirectory of the current directory and executes that function or script there, with whatever arguments I supply.
To put it another way, I want something that works over all subdirectories as nicely as for file in *; do command_myfunction_or_script.sh arguments $file; done works over the current directory.
Instead of -exec, try -execdir.
It may be that in some cases you need to use bash:
foo () { echo $1; }
export -f foo
find . -type f -name '*.txt' -exec bash -c 'foo arg arg' \;
The last line could be:
find . -type f -name '*.txt' -exec bash -c 'foo "$@"' _ arg arg \;
Depending on what args might need expanding and when. The underscore represents $0.
You could use -execdir where I have -exec if that's needed.
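To make that concrete, here is a small sketch of my own that combines the two ideas, the exported function plus the path found by find passed in as $1:
foo () { printf 'found: %s\n' "$1"; }
export -f foo
# _ fills $0; each path found by find becomes $1 inside the inner bash
find . -type f -name '*.txt' -exec bash -c 'foo "$1"' _ {} \;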
The examples that you give, such as:
find . -name '*' -exec command arguments {} \;
Don't go to every single subdirectory from current directory and execute command there, but rather execute command from the current directory with the path to each file listed by the find as an argument.
If what you want is to actually change directory and execute a script, you could try something like this:
STDIR=$PWD; IFS=$'\n'; for dir in $(find . -type d); do cd $dir; /path/to/command; cd $STDIR; done; unset IFS
Here the current directory is saved to STDIR and the bash Internal Field Separator is set to a newline so names won't split on spaces. Then for each directory (-type d) that find returns, we cd to that directory, execute the command (using the full path here as changing directories will break a relative path) and then cd back to the starting directory.
There may be some way to use find with a function, but it won't be terribly elegant. If you have bash 4, what you probably want to do is use globstar:
shopt -s globstar
for file in **/*; do
myfunction "$file"
done
If you're looking for compatibility with POSIX or older versions of bash, you will be forced to source the file defining your function when you invoke bash. So something like this:
find <args> -exec bash -c '. funcfile;
for file; do
myfunction "$file"
done' _ {} +
But that's just ugly. When I get to this point, I usually just put my function in a script on my PATH and live with it.
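A rough sketch of that last suggestion (the name ~/bin/myfunction is only an example):
#!/bin/bash
# ~/bin/myfunction -- the former shell function as a stand-alone script (chmod +x it)
for file; do
    printf 'processing %s\n' "$file"
done
which find can then call directly:
find . -type f -exec myfunction {} +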
If you want to use a bash function, this is one way.
work ()
{
    local file="$1"
    local dir=$(dirname "$file")
    pushd "$dir"
    echo "in directory $(pwd) working with file $(basename "$file")"
    popd
}
find . -name '*' | while read -r line
do
    work "$line"
done

Using an alias in find -exec

I have a very long command in bash, which I do not want to type all the time, so I put an alias in my .profile
alias foo='...'
Now I want to execute this alias using find -exec
find . -exec foo '{}' \;
but find cannot find foo:
find: foo: No such file or directory
Is it possible to use an alias in find?
find itself doesn't know anything about aliases, but your shell does. If you are using a recent enough version of bash (I think 4.0 added this feature), you can use find . -exec ${BASH_ALIASES[foo]} {} \; to insert the literal content of the alias at that point in the command line.
Nope, find doesn't know anything about your aliases. Aliases are not like environment variables in that they aren't "inherited" by child processes.
You can create a shell script with the same commands, make it executable (chmod +x), and have it in your PATH. This will work with find.
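A minimal sketch of that approach (the path ~/bin/foo is just an example; your-very-long-command stands for whatever the alias really runs):
#!/bin/bash
# ~/bin/foo -- the same command the alias ran, now as an executable script
your-very-long-command "$@"
After chmod +x ~/bin/foo, and with ~/bin on your PATH, find . -exec foo '{}' \; works as expected.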
Another way of calling an alias when processing the results of find is to use something like this answer
so the following should work:
alias ll="ls -al"
find . -type d | while read folder; do ll $folder; done
I am using the commonly known ll alias for this example, but you may use your own alias instead; just replace ll in the following line with your alias (foo) and it should work:
find . -exec `alias ll | cut -d"'" -f2` {} \;
your case:
find . -exec `alias foo | cut -d"'" -f2` {} \;
Note it assumes your alias is quoted using the following syntax:
alias foo='your-very-long-command'
It's not possible (or at least difficult and error-prone) to use aliases in the find command.
An easier way to achieve the desired result is putting the contents of the alias in a shellscript and run that shellscript:
alias foo | sed "s/alias foo='//;s/'$/ \"\$@\"/" > /tmp/foo
find -exec bash /tmp/foo {} \;
The sed command removes the leading alias foo=' and replaces the trailing ' with "$@", which will expand to the arguments passed to the script.
You can use the variable instead.
So instead of:
alias foo="echo test"
use:
foo="echo test"
then execute it either by command substitution or eval, for instance:
find . -type f -exec sh -c "eval $foo" \;
or:
find . -type f -exec sh -c "echo `$foo`" \;
Here is real example which is finding all non-binary files:
IS_BINARY='import sys; sys.exit(not b"\x00" in open(sys.argv[1], "rb").read())'
find . -type f -exec bash -c "python -c '$IS_BINARY' {} || echo {}" \;
I ran into the same thing and pretty much implemented skjaidev's solution.
I created a bash script called findVim.sh with the following contents:
[ roach#sepsis:~ ]$ cat findVim.sh
#!/bin/bash
find . -iname $1 -exec vim '{}' \;
Then I added the .bashrc alias as:
[ roach#sepsis:~ ]$ cat ~/.bashrc | grep fvim
alias fvim='sh ~/findVim.sh'
Finally, I reloaded .bashrc with source ~/.bashrc.
Anyway, long story short, I can edit arbitrary script files slightly faster with:
$ fvim foo.groovy

Resources