search with find command until first match - bash

I just need to search for a specific directory that can be anywhere. Is there a way to run this command until the first match? Thanks!
I'm now using
find / -noleaf -name 'experiment' -type d | wc -l

As Rudolf Mühlbauer mentions, the -quit option tells find to quit. The man page example is that
find /tmp/foo /tmp/bar -print -quit
will print only /tmp/foo.
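Applied to the command in the question, a sketch (assuming GNU find, which supports -quit; demonstrated here on a throwaway tree rather than /):

```shell
# Build a small demo tree with two 'experiment' directories.
dir=$(mktemp -d)
mkdir -p "$dir/a/experiment" "$dir/b/experiment"

# -print -quit makes find stop as soon as the first match is printed.
find "$dir" -name experiment -type d -print -quit
```

Only one of the two directories is printed; find exits without descending further.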
Slightly more generally, find may be the wrong tool for what you want to do. See man locate. For example, on my system,
locate experiment | head -3
produces
/usr/lib/tc/experimental.dist
/usr/share/doc/xorg/reference/experimental.html
/usr/share/doc/xorg/reference/experimental.txt
while locate -r 'experimental..$' produces (with 6 lines snipped for brevity)
/usr/src/linux-headers-3.2.0-24-generic/include/config/experimental.h
(snip)
/usr/src/linux-headers-3.2.0-32-generic/include/config/experimental.h

Related

Grep - showing current directory/file in a recursive search

The problem
Sometimes, when I run the grep tool recursively, it gets stuck in some big directory or big file. I would like to see the directory or file name, because I may realise I don't need to scan that specific directory/file the next time I use grep for a similar purpose, and could then exclude it with the corresponding grep options.
Is there a way to tell grep the current path directory/file which is being scanned in such searches?
My attempts
I tried to search here, but it's hard to find anything because the keywords "current directory" are usually used for other reasons, so the terminology conflicts.
I have also tried things like:
man grep | grep -i current
man grep | grep -i status
(and many others) without success so far.
EDIT: I have just found a useful answer here for a different problem, but I guess it may work if I modify the following code by adding an echo command somewhere in the for loop. However, I have also just realised that it requires bash 4, and sadly I have bash 3.
# Requires bash 4 and GNU grep
shopt -s globstar
files=(**)
total=${#files[@]}
for ((i=0; i<total; i+=100)); do
    echo "$i/$total" >&2
    grep -d skip -e "$pattern" "${files[@]:i:100}" >>results.txt
done
find . -type f -exec echo grepping {} \; -exec time grep pattern {} \; 2>&1
find . -type f to find all the files recursively.
-exec echo grepping {} to call out each file
-exec time grep ... {} to report the time each grep takes
2>&1 to get time's stderr onto stdout.
This doesn't report a total time per directory. Doing that this way either requires more advanced find, to find leaf dirs for grep -d, or to add some cumulative time per path, which I'd do with perl -p... but that's nontrivial as well.
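A rough sketch of that cumulative-time-per-path idea in plain shell rather than perl (assumptions: bash and GNU date with %N; pattern is a placeholder you set yourself):

```shell
pattern='hello'

# Time each per-file grep, then sum the elapsed nanoseconds per directory.
find . -type f -print0 |
while IFS= read -r -d '' f; do
    start=$(date +%s%N)
    grep -q -e "$pattern" "$f"
    end=$(date +%s%N)
    printf '%s\t%d\n' "$(dirname "$f")" $((end - start))
done |
awk -F'\t' '{sum[$1] += $2}
            END {for (d in sum) printf "%s\t%.1f ms\n", d, sum[d] / 1e6}'
```

Directories whose total grep time dominates are the ones worth excluding next run.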

How to correctly redirect output of ls -d */?

Rather new to command line, so bear with me.
I'm supposed to be finding directories in /usr/local that end with a number. I've managed to list just the directories with:
ls -d */
but when I try using anything with via piping:
find -name
grep
look
there's no output shown. I've even tried just using the '*' wildcard for searching, but nothing shows up.
Any ideas where I'm going wrong?
The find command should be able to do what you want, and from the looks of it you have it just about right:
find / -type d -name <directory_name>
That will look for any directory with the name you specify, starting from the root directory. If you ran the command as shown above, I think the flaw was that you were not specifying the directory to start your search from. You can consult the man page as well if you need any other parameters:
http://unixhelp.ed.ac.uk/CGI/man-cgi?find
find /usr/local -type d -name '*[0-9]'
This does it all in one; looks under /usr/local/ for directories where the name ends with a digit (and implicitly prints the result).
Your code using ls might need to look like:
cd /usr/local || exit 1
ls -d */ | grep '[0-9]/$'
This will list directories with a slash at the end of the name, so you need to search for the names where there's a digit followed by the slash and the end of the name. One difference between this and the find command is that ls only lists directories immediately in /usr/local whereas find will search down directory hierarchies. If you don't want find to search down the hierarchy, say so:
find /usr/local -maxdepth 1 -type d -name '*[0-9]'
(If you place -maxdepth 1 at the end, some versions of find get snotty about it and complain.)
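A quick way to convince yourself of the difference, using a throwaway directory in place of /usr/local:

```shell
dir=$(mktemp -d)
mkdir -p "$dir/logs2" "$dir/cache" "$dir/deep/www3"

# With -maxdepth 1, only the immediate child ending in a digit is found;
# deep/www3 is not reported.
find "$dir" -maxdepth 1 -type d -name '*[0-9]'

# The ls pipeline reports the same directory (with a trailing slash).
(cd "$dir" && ls -d */ | grep '[0-9]/$')
```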
find /path/to/some/dir -maxdepth 1 -type d -name '*[0-9]'
ls -l /path/to/some/dir | grep "^d" | awk '{print $9}'
for file in /path/to/some/dir/*; do
    if [[ -d $file ]]; then
        echo "$file"
    fi
done

using find and sed in Windows 7 to recursively modify files

I'm trying to use find in Windows 7 with GNU sed to recursively replace a line of text in multiple files, across multiple directories. I looked at this question but the PowerShell solution seems to work with only one file, and I want to work with all files with a certain extension, recursively from the current directory. I tried this command:
find "*.mako" -exec sed -i "s:<%inherit file="layout.mako"/>:<%inherit file="../layout.mako"/>:"
But that gives me a bunch of crap and doesn't change any files:
---------- EDIT.MAKO
File not found - -EXEC
File not found - SED
File not found - -I
File not found - LAYOUT.MAKO/>:<%INHERIT FILE=../LAYOUT.MAKO/>:
How can I do this? It seems like I should have all the tools installed that I need, without having to install Cygwin or UnixUtils or anything else.
Edit: okay, working with GNU find, I still can't get anywhere, because I can't get the find part to work:
> gfind -iname "*.mako" .
C:\Program Files (x86)\GnuWin32\bin\gfind.exe: paths must precede expression
> gfind . -iname "*.mako"
C:\Program Files (x86)\GnuWin32\bin\gfind.exe: paths must precede expression
> gfind -iname "*.mako" .
C:\Program Files (x86)\GnuWin32\bin\gfind.exe: paths must precede expression
I was originally not using GNU find in Windows 7 because of this question.
Edit:
I tried the following, but sed doesn't see any input files this way:
> ls -r | grep mako | sed -i 's/file="layout.mako"/file="..\/layout.mako"/'
sed.exe: no input files
The FIND from Windows is being found instead of GNU find.
So rename your find.exe (from GNU) to gfind.exe (for example) and then call gfind instead of find when you wish to run it.
[edit]
gfind . -name "*.mako" (not gfind -iname "*.make" .)
[/edit]
You're executing the regular windows 'find' command, which has completely different command line arguments than gnu find. MS find has no capability of executing a program for each match, it simply searches.
Addition to Marc B/KevinDTimm answers: your find syntax is wrong.
It is not:
find "*.mako"
but:
find -name "*.mako"
Also, if there are directories that match "*.mako", they would be sent to sed. To avoid that:
find -name "*.mako" -type f
Finally, I think that you are missing a '\;' at the end of your find command.
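Putting those pieces together, the complete command would look something like the sketch below (shown with Unix find; on the asker's setup substitute gfind, and note that -i with no suffix argument is GNU sed):

```shell
# Demo .mako file containing the line to rewrite.
dir=$(mktemp -d)
printf '<%%inherit file="layout.mako"/>\n' > "$dir/page.mako"

# Rewrite the layout.mako path in-place in every .mako file.
find "$dir" -type f -name '*.mako' \
    -exec sed -i 's:file="layout.mako":file="../layout.mako":' {} \;

cat "$dir/page.mako"
```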
In Powershell, note the escape sequence using backticks
find . -type f -exec grep hello `{`} `;
It's much easier to use xargs:
find . | xargs grep hello
I tried the following, but sed doesn't see any input files this way:
ls -r | grep mako | sed -i 's/file="layout.mako"/file="../layout.mako"/'
sed.exe: no input files
With this you are now running into PowerShell's "ls" alias. Either call "ls.exe" or go all PowerShell like this:
ls -r | select-string mako -list | select -exp path | sed -i 's/file="layout.mako"/file="..\/layout.mako"/'
Edit:
Workaround if stdin handling doesn't seem to be working.
ls -r | select-string mako -list | select -exp path | % {sed -i 's/file="layout.mako"/file="..\/layout.mako"/' $_}
Per your
Edit:
I tried the following, but sed doesn't see any input files this way:
ls -r | grep mako | sed -i
's/file="layout.mako"/file="../layout.mako"/' sed.exe: no input files
you need to use xargs to assemble the list of files passed to sed, i.e.
ls -r | grep mako | xargs sed -i
's:file="layout.mako":file="../layout.mako":'
Note that for most versions of sed, you can use an alternate character to delimit the substitute match/replace strings (instead of the usual '/'), which saves you from escaping the slashes inside the paths, as in this example.
I hope this helps.
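To see why the alternate delimiter helps, compare the two forms on the same input (a sketch; the input line mimics the asker's file):

```shell
demo=$(mktemp)
echo 'file="layout.mako"' > "$demo"

# With the default '/' delimiter, the '/' inside the replacement must be escaped.
sed 's/file="layout.mako"/file="..\/layout.mako"/' "$demo"

# With ':' as the delimiter, no escaping is needed.
sed 's:file="layout.mako":file="../layout.mako":' "$demo"
```

Both commands print the same rewritten line; the second is simply easier to read and type.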

Find, grep, and execute - all in one?

This is the command I've been using for finding matches (queryString) in php files, in the current directory, with grep, case insensitive, and showing matching results in line:
find . -iname "*php" -exec grep -iH queryString {} \;
Is there a way to also pipe just the file name of the matches to another script?
I could probably run the -exec command twice, but that seems inefficient.
What I'd love to do on Mac OS X is then actually to "reveal" that file in the finder. I think I can handle that part. If I had to give up the inline matches and just let grep show the files names, and then pipe that to a third script, that would be fine, too - I would settle.
But I'm actually not even sure how to pipe the output (the matched file names) to somewhere else...
Help! :)
Clarification
I'd like to reveal each of the files in a Finder window, so I'm probably not going to use the -q flag and stop at the first one.
I'm going to run this in the console, ideally I'd like to see the inline matches printed out there, as well as being able to pipe them to another script, like oascript (applescript, to reveal them in the finder). That's why I have been using -H - because I like to see both the file name and the match.
If I had to settle for just using -l so that the file name could more easily be piped to another script, that would be OK, too. But I think after looking at the reply below from #Charlie Martin, that xargs could be helpful here in doing both at the same time with a single find, and single grep command.
I did say bash but I don't really mind if this needs to be ran as /bin/sh instead - I don't know too much about the differences yet, but I do know there are some important ones.
Thank you all for the responses, I'm going to try some of them at the command line and see if I can get any of them to work and then I think I can choose the best answer. Leave a comment if you want me to clarify anything more.
Thanks again!
You bet. The usual thing is something like
$ find /path -name pattern -print | xargs command
So you might for example do
$ find . -name '*.[ch]' -print | xargs grep -H 'main'
(Quiz: why -H?)
You can carry this on farther; for example, you might use
$ find . -name '*.[ch]' -print | xargs grep -H 'main' | cut -d ':' -f 1
to get the vector of file names for files that contain 'main', or
$ find . -name '*.[ch]' -print | xargs grep -H 'main' | cut -d ':' -f 1 |
xargs growlnotify -
to have each name become a Growl notification.
You could also do
$ grep pattern `find /path -name pattern`
or
$ grep pattern $(find /path -name pattern)
(in bash(1) at least these are equivalent) but you can run into limits on the length of a command line that way.
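For instance, on a small tree both forms give the same answer; the command-line length limit only bites once there are very many files (sketch assumes filenames without whitespace, since the command substitution is unquoted on purpose):

```shell
dir=$(mktemp -d)
printf 'int main(void) { return 0; }\n' > "$dir/a.c"
printf '/* no entry point here */\n'    > "$dir/b.c"

# Command-substitution form (subject to the argument-length limit)...
grep -l main $(find "$dir" -name '*.c')

# ...and the xargs form, which batches arguments to stay under the limit.
find "$dir" -name '*.c' -print | xargs grep -l main
```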
Update
To answer your questions:
(1) You can do anything in bash you can do in sh. The one thing I've mentioned that would be any different is the use of $(command) in place of using backticks around command, and that works in the version of sh on Macs. The csh, zsh, ash, and fish are different.
(2) I think merely doing $ open $(dirname arg) will open a Finder window on the containing directory.
It sounds like you want to open all *.php files that contain querystring from within a Terminal.app session.
You could do it this way:
find . -name '*.php' -exec grep -li 'querystring' {} \; | xargs open
With my setup, this opens MacVim with each file on a separate tab. YMMV.
Replace -H with -l and you will get a list of those filenames that matched the pattern.
if you have bash 4, enable globstar with `shopt -s globstar` and simply do
grep pattern /path/**/*.php
the ** operator is like
grep pattern `find -name \*.php -print`
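Note that ** only recurses into subdirectories once globstar is enabled, which is off by default; a minimal check (assumes bash 4+):

```shell
shopt -s globstar   # required; otherwise ** behaves like a single *

dir=$(mktemp -d)
mkdir -p "$dir/app/sub"
printf '$querystring = 1;\n' > "$dir/app/sub/page.php"

# The recursive glob reaches the nested .php file.
grep -l querystring "$dir"/**/*.php
```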
find /home/aaronmcdaid/Code/ -name '*.cpp' -exec grep -q -iH boost {} \; -exec echo {} \;
The first change I made is to add -q to your grep command. This is "Exit immediately with zero status if any match is found".
The good news is that this speeds up grep when a file has many matching lines, since you don't care how many matches there are. But that means we need another -exec at the end to actually print the filename when grep has been successful.
The grep result will be sent to stdout, so another -exec predicate is probably the best solution here.
Pipe to another script:
find . -iname "*.php" | myScript
File names will come into the stdin of myScript 1 line at a time.
You can also use xargs to form/execute commands to act on each file:
find . -iname "*.php" | xargs ls -l
act on files you find that match:
find . -iname "*.php" | xargs grep -l pattern | myScript
act on files that don't match the pattern:
find . -iname "*.php" | xargs grep -L pattern | myScript
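A small illustration of -l versus -L (with the hypothetical myScript left off so the output is visible directly):

```shell
dir=$(mktemp -d)
printf 'pattern here\n' > "$dir/match.php"
printf 'nothing\n'      > "$dir/other.php"

# -l: names of files that DO contain the pattern
find "$dir" -iname "*.php" | xargs grep -l pattern

# -L: names of files that do NOT contain the pattern
find "$dir" -iname "*.php" | xargs grep -L pattern
```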
In general, using multiple -execs with grep -q will be FAR faster than piping, since find has an implied short-circuiting -a between each juxtaposed pair of expressions that isn't separated by an explicit operator. The main problem here is that you want something to happen if grep matches something AND for the matches to be printed. If the files are reasonably sized, this should be faster (because grep -q exits after finding a single match):
find . -iname "*php" -exec grep -iq queryString {} \; -exec grep -iH queryString {} \; -exec otherprogram {} \;
If the files are particularly big, encapsulating it in a shell script may be faster than running multiple grep commands:
find . -iname "*php" -exec bash -c \
'out=$(grep -iH queryString "$1"); [[ -n $out ]] && echo "$out" && exit 0 || exit 1' \
bash {} \; -print
Also note, if the matches are not particularly needed, then
find . -iname "*php" -exec grep -iq queryString {} \; -exec otherprogram {} \;
will virtually always be faster than a piped solution like
find . -iname "*php" -print0 | xargs -0 grep -iH queryString | ...
Additionally, you should really have -type f in all cases, unless you want to catch *php directories
Regarding which is faster: if you actually care about the minuscule time difference (perhaps because you want to save your processor some work), try prefixing each command with time and see which one performs better.

grep returns "Too many argument specified on command" [duplicate]

This question already has answers here:
Argument list too long error for rm, cp, mv commands
(31 answers)
Closed 7 years ago.
I am trying to list all files we received in one month
The filename pattern will be
20110101000000.txt
YYYYMMDDHHIISS.txt
The entire directory is having millions of files.
For one month there can be minimum 50000 files.
Idea of sub directory is still pending.
Is there any way to list a huge number of files with almost-similar names?
grep -l 20110101*
I am trying this and it returns an error.
I tried PHP and it took a huge amount of time, which is why I am using a shell script. I don't understand why the shell is also not giving a result.
Any suggestion please!!
$ find ./ -type f -name '20110101*' -print0 | xargs -0 grep -l "search_pattern"
You can use find and xargs. xargs will run grep for each batch of files found by find. You can use -P to run multiple greps in parallel and -n to control how many files are passed per grep invocation. The -print0 option makes find separate each filename with a null character, to avoid confusion caused by any spaces in the file names. If you are sure there will not be any spaces, you can drop the -print0 and -0 arguments.
This should be the faster way:
find . -name "20110101*" -exec grep -l "search_pattern" {} +
Should you want to avoid the leading dot:
find . -name "20110101*" -exec grep -l "search_pattern" {} + | sed 's/^.\///'
or better thanks to adl:
find . -name "20110101*" -exec grep -l "search_pattern" {} + | cut -c3-
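On a throwaway tree, this shows the leading ./ being stripped by cut -c3-:

```shell
dir=$(mktemp -d)
cd "$dir"
printf 'search_pattern\n' > 20110101000000.txt
printf 'unrelated\n'      > 20110101000001.txt

# Only the matching file is listed, without the leading "./".
find . -name "20110101*" -exec grep -l "search_pattern" {} + | cut -c3-
```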
The 20110101* is getting expanded by your shell before getting passed to the command, so you're getting one argument passed for every file in the dir that starts with 20110101.
If you just want a list of matching files you can use find:
find . -name "20110101*"
(note that this will search every subdirectory also)
Some in-depth information is available here, along with another workaround: for FILE in 20110101*; do grep foo "${FILE}"; done. Most people will go with xargs, and more seasoned admins with -exec {} +, which accomplishes exactly the same thing but is shorter to type. You would use the inline shell for construct when seeing results matters more than running fewer processes: with the for construct you may end up running grep thousands of times, but you see each match in real time, while with find and/or xargs you see batched results and grep is run significantly less often.
you need to put in a search term, so
grep -l "search term" 20110101*
if you want to just find the files, use ls 20110101* (though with that many files, this hits the same argument-list limit)
Just pipe the output of ls to grep: ls | grep '^20110101'
