Why doesn't this command work?
find / -type f -name "*.c" -exec wc -c < \{} \;
I'm trying to hide the filename while showing the number of characters.
You can do the following:
find / -type f -name "*.c" -exec wc -c {} + | awk '{print $1}'
The < does not work because the redirection applies to the find command itself. So what you can do is use awk to print only the first column.
As pointed out by @Charles Duffy, this is a bit more efficient: wc is started once per batch of files rather than once per file, there is no shell wrapper, and the single awk instance is shared rather than invoked per file.
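One hedged caveat, not from the original answer: when find batches several files into a single wc call, wc also prints a final "total" line, which awk would echo as well. A small refinement that filters it out:
find / -type f -name "*.c" -exec wc -c {} + | awk '$2 != "total" {print $1}'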
Use a bash command with -exec, like below:
find . -type f -name "*.c" -exec bash -c 'wc -c < "$1"' _ {} \;
In your case the redirection < is applied to the find command itself, resulting in an error, because there is no file literally named {} for the shell to open.
A side note: the positional parameter inside the bash -c script should be double quoted to deal with non-standard filenames, e.g. those containing spaces.
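A tiny illustration of that quoting point (the file name here is made up):
touch 'my file.c'                        # a test file with a space in its name
bash -c 'wc -c < "$1"' _ 'my file.c'     # quoted: prints the byte count
bash -c 'wc -c < $1' _ 'my file.c'       # unquoted: fails with an "ambiguous redirect" error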
Edit
To avoid paying the extra cost of invoking one sub-shell per file, we can further improve the -exec part like below:
find . -type f -name "*.c" -exec bash -c 'for arg; do wc -c <"$arg"; done' _ {} +
Courtesy of @Charles Duffy for this suggestion.
If you really want the redirection, you could do it in a subshell:
find / -type f -name "*.c" -exec sh -c 'wc -c < "{}"' \;
Edit: don't do this; embedding {} inside the quoted shell command breaks, and becomes dangerous, for filenames containing quotes or other shell syntax.
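To see why, a hedged demonstration with a contrived file name: because {} is substituted into the command string before sh parses it, a crafted name gets executed as shell code.
touch './$(echo pwned).c'                              # contrived file name containing $( )
find . -name '*.c' -exec sh -c 'wc -c < "{}"' \;       # sh runs the $(...) embedded in the name
find . -name '*.c' -exec sh -c 'wc -c < "$1"' _ {} \;  # safe: the name is passed as data, not code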
Related
I want to create a shell script that searches inside all folders of the current directory and returns all files that satisfy some condition, but without using any print flag.
(Here the condition is ending with .py.)
What I have done:
find . -name '*.py'| sed -n 's/\.py$//p'
The output:
./123
./test
./abc/dfe/test3
./testing
./test2
What I would like to achieve:
123
test
test3
testing
test2
Use -exec:
find . -name '*.py' -exec sh -c 'for f; do f=${f%.py}; echo "${f##*/}"; done' sh {} +
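For reference, here is what the two parameter expansions in that sh -c script do, shown step by step on a hypothetical path:
f=./abc/dfe/test3.py
f=${f%.py}          # ./abc/dfe/test3   (strip the .py suffix)
echo "${f##*/}"     # test3             (strip everything up to the last /)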
If GNU basename is an option, you can simplify this to
find . -name '*.py' -exec basename -s .py {} +
POSIX basename is a little more expensive, as you'll have to call it on every file individually:
find . -name '*.py' -exec basename {} .py \;
Using GNU grep instead of sed:
find . -name '*.py' | grep -oP '[^/]+(?=\.py$)'
If portability is not a concern, this is a very readable option:
find . -name '*.py' | xargs basename -a
This also differs from chepner's answer in that it retains the .py file ending in the output.
I'm not familiar with the -exec flag, and I'm sure his one-liners can be customized to do the same, but I couldn't do so off the top of my head.
Chepner's version achieves the same with the small modification:
find . -name '*.py' -exec basename {} \;
if you want the literal output from find and didn't intend to drop the file endings when you used dummy names (123, test, etc.) in your question.
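A hedged aside: as written, the find | xargs basename pipeline above can mangle file names containing spaces or newlines. With GNU find and xargs you can make it null-safe:
find . -name '*.py' -print0 | xargs -0 basename -a          # keeps the .py ending
find . -name '*.py' -print0 | xargs -0 basename -a -s .py   # GNU basename: also drops the ending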
find shows entries relative to where you ask it to search, so you can simply replace the . with a *:
find * -name '*.py'| sed -n 's/\.py$//p'
(Be aware that this skips top level hidden directories)
This might work for you (GNU parallel):
find . -name '*.py' 2>/dev/null | parallel echo "{/.}"
I have a piece of code that should print all files in a specific dir. I use find -exec for this:
find ${_di} -type f -print -exec Log "$(stat -c%y {}) - {}" \;
where Log is a function of mine defined in the same file.
But it does not work and I get an error message:
"find: Log: No such file or directory".
Why? What is wrong in this piece of code?
A function can't be used in -exec; however, bash -c can be used as the command.
The command below is slightly modified to use + as the -exec terminator, with {} last, to reduce the number of bash processes spawned:
find ${_di} -type f -print -exec bash -c "$(typeset -f Log)"$'\n''for arg; do Log "$(stat -c%y "$arg") - $arg"; done' -- {} +
The -- argument can be replaced by anything else; it is used as the shell's $0 argument:
bash -c 'echo $0' hello
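To see what the "$(typeset -f Log)" part injects, here is a minimal standalone sketch (the body of Log is made up for illustration):
Log() { printf 'LOG: %s\n' "$*"; }              # hypothetical Log definition
typeset -f Log                                   # prints the function's source text
bash -c "$(typeset -f Log)"$'\n''Log hello'      # the child bash now has Log defined and runs it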
Maybe the -printf "%TY-%Tm-%Td %TT - %p\n" option could achieve the same result more efficiently, without launching another process.
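For example (GNU find assumed, and quoting the _di variable for safety), the printing part of the original task could be done entirely inside find:
find "${_di}" -type f -printf '%TY-%Tm-%Td %TT - %p\n'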
Also, using echo may be less safe than using find's -print option, considering the following use case:
touch file.$'\e#8'
find . -type d ! -name . -prune -o -name file'*' -print
find . -type d ! -name . -prune -o -name file'*' -exec echo {} \;
You need to export the function, and then, as Nahuel says, run bash in the -exec:
$ export -f Log
$ find "${_di}" -type f -exec bash -c 'Log "$(stat -c%y "$1") - $1"' bash {} \;
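If you also want the one-shell-per-batch behaviour from the earlier answer, the exported function combines with a loop like this (a sketch, assuming bash is the child shell):
$ export -f Log
$ find "${_di}" -type f -exec bash -c 'for f; do Log "$(stat -c%y "$f") - $f"; done' bash {} +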
Is there a shell-independent equivalent of Bash substring replacement:
foo=Hello
echo ${foo/o/a} # will output "Hella"
Most of the time I can use bash, so that is not a problem; however, when combined with find -exec it does not work. For instance, to rename all .cpp files to .c, I'd like to use:
# does not work
find . -name '*.cpp' -exec mv {} {/.cpp$/.c}
For now, I'm using:
# does work, but longer
while read file; do
mv "$file" "${file/.cpp$/.c}";
done <<< $(find . -name '*.cpp')
Ideally a solution that could be used in scripts is better!
Using find and -exec you can do this:
find . -name '*.cpp' -exec bash -c 'f="$1"; mv "$f" "${f/.cpp/.c}"' - '{}' \;
However, this will fork a bash for each filename, so using xargs or a loop in the current shell like this is better for performance reasons:
while IFS= read -d '' -r file; do
mv "$file" "${file/.cpp/.c}"
done < <(find . -name '*.cpp' -print0)
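A middle ground, offered as a sketch rather than part of the original answer, keeps everything inside find but starts one bash per batch of files instead of per file; ${f%.cpp}.c is used so the substitution is anchored to the end of the name:
find . -name '*.cpp' -exec bash -c 'for f; do mv -- "$f" "${f%.cpp}.c"; done' - {} +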
Btw, an alternative to using bash would be to use rename. If you have the cool version of the rename command, which is shipped along with perl, you can do:
find -name '*.cpp' -exec rename 's/\.cpp$/.c/' {} +
The above example assumes that you have GNU findutils, in which case you don't need to pass the current directory since it is the default. If you don't have GNU findutils, you need to pass it explicitly:
find . -name '*.cpp' -exec rename 's/\.cpp$/.c/' {} +
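If you are unsure how a rename will behave, the Perl rename also supports a dry run via -n, which only reports what would be renamed:
find . -name '*.cpp' -exec rename -n 's/\.cpp$/.c/' {} +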
When I run this command in Terminal:
find . -type f -name "*.png" -exec sh -c "file {} | egrep -o '^.*\d+,'" \;
I get this error if a filename contains parentheses:
sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `file ./(terrible filename).png | egrep -o '^.*\d+,''
I know it has something to do with the sh -c, but I don't know how to fix it. Thanks.
This is the result I'm trying to get:
./(terrible filename).png: PNG image data, 512 x 512,
You are basically pasting the file name into sh -c '...' without any quoting. The string inside sh -c after the substitutions made by find needs to be valid sh syntax, which means there can be no unquoted single quotes, parentheses, etc.
A more robust approach is to use -exec file {} and pass all the output from find to egrep.
find . -type f -name "*.png" -exec file {} \; | egrep -o '^.*\d+,'
The placeholder token {} gets replaced by find with the filename currently being processed. When it is a lone token, find can pass in any file name at all; but if you interpolate it into a longer string, such as a shell command, you will need to ensure that any necessary quoting etc. is added somehow. That's messy, so usually you will want to find a solution where you don't need to do that.
(As pointed out in comments to the other answer, -exec sh -c 'file "$1"' _ {} \; is another way to accomplish that; this generalizes to arbitrarily complex shell commands. If your find supports -exec {} +, you want to add a simple loop: -exec sh -c 'for f; do file "$f"; done' _ {} + -- incidentally, the _ is a dummy placeholder for $0.)
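Putting that together for the original command (the egrep part is kept verbatim from the question), a batched sketch looks like this:
find . -type f -name "*.png" -exec sh -c 'for f; do file "$f"; done' _ {} + | egrep -o '^.*\d+,'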
Are there parentheses in the file names? This might help:
find . -type f -name "*.png" -exec sh -c "file '{}' | egrep -o '^.*\d+,'" \;
I'm trying to construct a find command to process a bunch of files in a directory using two different executables. Unfortunately, -exec on find doesn't allow using a pipe, or even \|, because the shell interprets that character first.
Here is specifically what I'm trying to do (which doesn't work because pipe ends the find command):
find /path/to/jpgs -type f -exec jhead -v {} | grep 123 \; -print
Try this
find /path/to/jpgs -type f -exec sh -c 'jhead -v {} | grep 123' \; -print
Alternatively you could try to embed your exec statement inside a sh script and then do:
find -exec some_script {} \;
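A safer variant of the sh -c approach above (a sketch) passes the file name as an argument instead of splicing {} into the command string, so names containing quotes or spaces are handled:
find /path/to/jpgs -type f -exec sh -c 'jhead -v "$1" | grep 123' _ {} \; -print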
A slightly different approach would be to use xargs:
find /path/to/jpgs -type f -print0 | xargs -0 jhead -v | grep 123
which I always found a bit easier to understand and to adapt (the -print0 and -0 arguments are necessary to cope with filenames containing blanks)
This might (not tested) be more efficient than using -exec, because it pipes the list of files to xargs, and xargs makes sure that the jhead command line does not get too long.
With -exec you can only run a single executable with some arguments, not arbitrary shell commands. To circumvent this, you can use sh -c '<shell command>'.
Do note that the use of -exec is quite inefficient. For each file that is found, the command has to be executed again. It would be more efficient if you can avoid this. (For example, by moving the grep outside the -exec or piping the results of find to xargs as suggested by Palmin.)
Using the find command for this type of task is maybe not the best alternative. I use the following command frequently to find files that contain the requested information:
for i in dist/*.jar; do echo ">> $i"; jar -tf "$i" | grep BeanException; done
As this outputs a list, would you not do:
find /path/to/jpgs -type f -exec jhead -v {} \; | grep 123
or
find /path/to/jpgs -type f -print -exec jhead -v {} \; | grep 123
Put your grep on the results of the find -exec.
There is kind of another way you can do it, but it is also pretty hacky.
Using the shell option extquote, you can do something similar to this in order to make find exec stuff and then pipe it to sh:
root@ifrit findtest # find -type f -exec echo ls $"|" cat \;|sh
filename
root@ifrit findtest # find -type f -exec echo ls $"|" cat $"|" xargs cat \;|sh
h
I just figured I'd add that because, at least the way I visualized it, it was closer to the OP's original question of using pipes within -exec.