Finding text inside a folder using terminal - macOS

I am trying to find a line of code in a folder and using the terminal. I use this command, which I think should work:
MacBook-Pro:myWordpress myId$ grep "<?php get_template_part('loop', 'archives'); ?>" -R
where the folder I am inspecting is called "myWordpress". But I get this:
grep: warning: recursive search of stdin
I don't use the terminal much, so I am unsure how to do what I want. Any help appreciated. Thanks
David

You have to specify the directory too as the last argument:
grep -r -F "string to search" /path/to/dir
Or if you want to search in current directory, write . as the target directory:
grep -r -F "string to search" .
If you don't specify a directory, grep will try to search in the standard input, but recursive search does not make sense there - that's why you get that warning.
The manpage for grep says that:
Not all grep implementations support -r and among those that do, the behaviour with symlinks may differ.
In this case, you can try a different approach, with find and xargs:
find /path/to/dir -type f -print0 | xargs -0 grep -F "string to search"
Again, you can write . as the directory parameter (after find) if you want to search in the current directory.
Edit: as @EdMorton pointed out, the -F option is needed if you want grep to search for a fixed string instead of a regular expression. I added it to my examples above, since it seems you are trying to search for PHP snippets that may contain special characters, which would lead to different output in regexp mode.
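To see the difference -F makes, here is a small sketch (the file name t.txt and its contents are made up for the demo):

```shell
# '.' is a regex metacharacter, so without -F it matches any character
printf 'version 1x2\n' > t.txt

grep -c '1.2' t.txt             # regex mode: '.' matches the 'x', prints 1
grep -c -F '1.2' t.txt || true  # fixed-string mode: no literal "1.2", prints 0

rm t.txt
```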

For macOS this will be super useful and easy, and it also highlights the matching text in the search results!
brew install ack
ack "text"

I would suggest giving a try to ripgrep
$ brew install ripgrep
Besides being faster, it gives you multiple options; check the examples.
Once you have it installed just need to do:
$ rg your-string

Never use -r or -R with grep. It is a terrible idea, completely unnecessary (the UNIX tool to find files is named find!), and not portable: -r is not in POSIX grep, and the implementations that do support it differ in behaviour.
Best I can tell without any sample input/output, all you need is:
grep "<?php get_template_part('loop', 'archives'); ?>" *
or if you have files AND directories in your current directory but only want to search in the files:
find . -maxdepth 1 -type f -exec grep -H "<?php get_template_part('loop', 'archives'); ?>" {} +
or to search in all sub-directories:
find . -type f -exec grep -H "<?php get_template_part('loop', 'archives'); ?>" {} +
If that doesn't do what you want then provide the sample input/output in your question.
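The recursive form above can be smoke-tested in a throwaway tree (the wp/ directory and file names are invented for the demo):

```shell
mkdir -p wp/parts
printf "<?php get_template_part('loop', 'archives'); ?>\n" > wp/parts/index.php

# -H forces the file name to be printed even when grep receives a single file
find wp -type f -exec grep -H "get_template_part" {} +
# → wp/parts/index.php:<?php get_template_part('loop', 'archives'); ?>

rm -r wp
```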

Related

Alias for a combination of grep and find is needed

Many times I need to search a directory and everything below it for a pattern in all files of a specific type. For example, I need to ask grep not to look into files other than *.h, *.cpp or *.c. But if I enter:
grep -r pattern .
it looks into all files. If I enter:
grep -r pattern *.c
it tries *.c files in the current folder (no such files in my case) and files inside folders named *.c (no such folders in my case). I want to ask it to look into all folders, but only into files of the given type. I think grep alone is not enough for this purpose, so I get help from find, like this:
grep pattern `find . -name '*c'`
First, let me know whether I'm right about getting help from find; can grep alone be enough? Second, I would prefer to write a bash alias to be used like this:
mygrep pattern c
to be translated to the same command, avoiding the use of backticks and quotes and being simpler. I tried:
alias mygrep="grep $1 `find . -name '*$2'`"
But it doesn't work and issues an error:
grep: c: No such file or directory
I tried to change it, but I couldn't get to a working alias.
Any idea?
This would be better done as a function than an alias, and using -exec instead of passing the output of find to grep. That output would be subject to word splitting and globbing, so could produce surprising results as is. Instead try:
mygrep () {
    find . -name "*$2" -exec grep "$1" {} +
}
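A quick sanity check of the function, using a throwaway directory (all file names invented):

```shell
mygrep () {
    find . -name "*$2" -exec grep "$1" {} +
}

mkdir -p demo/sub
printf 'int main(void) { return 0; }\n' > demo/sub/prog.c
printf 'main street\n' > demo/notes.txt

cd demo
mygrep 'main' .c    # only *.c files are searched, at any depth
cd .. && rm -r demo
```

Note that grep omits the file name when it is handed a single file; add -H to the grep call if you always want the name printed.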

search with find command until first match

I just need to search for a specific directory that can be anywhere. Is there a way to run this command until the first match? Thanks!
I'm now using
find / -noleaf -name 'experiment' -type d | wc -l
As Rudolf Mühlbauer mentions, the -quit option tells find to quit. The man page example is that
find /tmp/foo /tmp/bar -print -quit
will print only /tmp/foo.
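A disposable sketch of -quit stopping at the first hit (the directory names are invented):

```shell
mkdir -p lab/a/experiment lab/b/experiment

# -print -quit prints the first matching directory, then stops the search
find lab -type d -name experiment -print -quit

rm -r lab
```

Which of the two directories gets printed depends on the filesystem's traversal order; the point is that exactly one line comes out.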
Slightly more generally, find may be the wrong tool for what you want to do. See man locate. For example, on my system,
locate experiment | head -3
produces
/usr/lib/tc/experimental.dist
/usr/share/doc/xorg/reference/experimental.html
/usr/share/doc/xorg/reference/experimental.txt
while locate -r 'experimental..$' produces (with 6 lines snipped for brevity)
/usr/src/linux-headers-3.2.0-24-generic/include/config/experimental.h
(snip)
/usr/src/linux-headers-3.2.0-32-generic/include/config/experimental.h

How to rename files in current directory and its subdirectories using bash script?

I'm able to use the 'rename' command to add the missing character to all filenames in the current directory like this:
echo "Renaming files..."
rename -v 's/^abcd124(.+)/abcd1234$1/' *.wav.gz;
echo "Done."
However, I'd like to do this for the current directory and all its subdirectories. I tried this:
echo "Renaming files..."
for dir in $(find ./ -type d); do
rename -v "s/^$dir\/abcd124(.+)/$dir\/abcd1234$1/" *.wav.gz;
done;
echo "Done."
However, if the $dir variable contains any of these special characters: {}[]()^$.|*+?\ then they are not escaped properly with \ and my script fails.
What would be the best way to solve this problem? Also, what do you guys think of using awk to solve this problem (advantages/disadvantages?)
You can also try:
find . -name "*.wav.gz" | xargs rename -v 's/abcd124(.+)/abcd1234$1/'
It works on systems with the xargs command available. Note that I edited the regular expression slightly: it is single-quoted so the shell does not expand $1, and it is not anchored with ^, because find hands rename paths like ./sub/abcd124foo.wav.gz rather than bare file names.
Try:
find ./ -type d -execdir rename -v "s/^abcd124(.+)/abcd1234\1/" *.wav.gz ";"
find already provides an iterator over your files; you don't need a for loop around it or xargs behind it, which are often seen. In rare cases they might be helpful, but normally not.
Here, -execdir is useful. GNU find has it; I don't know if your find has it too.
But you need to make sure there is no *.wav.gz file in the directory where you start this command, because otherwise your shell will expand the pattern and hand the expanded names over to rename.
Note: I get a warning from rename that I should replace \1 with $1 in the regex, but inside double quotes the shell expands $1 to an empty string, so the pattern isn't caught; either single-quote the expression or keep \1.
Here is another approach. Why search for directories at all, if what we want are the wav.gz files?
find . -name "*.wav.gz" -exec rename -v 's/abcd124([^\/]*)$/abcd1234\1/' {} ";"
(The ^ anchor is dropped here, because find passes paths like ./sub/abcd124foo.wav.gz, which never start with abcd124; the [^\/]*$ part keeps the match confined to the file name.)
In bash 4:
shopt -s globstar
rename -v 's/abcd124([^\/]*)$/abcd1234$1/' **/*.wav.gz
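A minimal check that globstar really recurses (bash 4+ only; the gdemo tree and file names are invented):

```shell
shopt -s globstar
mkdir -p gdemo/sub
touch gdemo/top.wav.gz gdemo/sub/deep.wav.gz

cd gdemo
printf '%s\n' **/*.wav.gz   # lists both files, including the nested one
cd .. && rm -r gdemo
```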
Just be aware that Gentoo Linux points its rename utility to
http://developer.berlios.de/project/showfiles.php?group_id=413
via its ebuild
http://sources.gentoo.org/cgi-bin/viewvc.cgi/gentoo-x86/sys-apps/rename/rename-1.3.ebuild?view=markup
On Debian (and probably Ubuntu),
rename is /usr/bin/prename, which is a Perl script.
See rename --help before you move.

How do I get a list of all available shell commands

In a typical Linux shell (bash) it is possible to hit tab twice to get a list of all available shell commands.
Is there a command which has the same behaviour? I want to pipe it into grep and search it.
You could use compgen. For example:
compgen -c
You also could grep it, like this:
compgen -c | grep top$
Source: http://www.cyberciti.biz/open-source/command-line-hacks/compgen-linux-command/
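compgen is a bash builtin, so it has to run from bash; for example (the "ls" prefix is just an arbitrary search term):

```shell
# list every available command whose name starts with "ls"
compgen -c | grep '^ls' | sort -u
```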
You can list the directories straight from $PATH if you tweak the field separator first. The parens limit the effect to the one command, so use: (...) | grep ...
(IFS=': '; ls -1 $PATH)
Hitting "tab" twice and then "y" prints all files in the directories of $PATH, so just printing all files in PATH is sufficient.
Just type this in the shell:
# printf "%s\n" ${PATH//:/\/* } > my_commands
This redirects all the command names to a file, "my_commands".
It lists all the files reachable through your PATH variable (an ls of all the directories in PATH). The default user and system commands live in /bin and /sbin respectively, but when you install software, its commands go into some directory that is linked in via the PATH variable.
There may be things on your path which aren't actually executable.
#!/bin/sh
for d in ${PATH//:/ }; do
for f in "$d"/*; do
test -x "$f" && echo -n "$f "
done
done
echo ""
This will also print paths, of course. If you only want unqualified filenames, it should be easy to adapt this.
Funny, StackOverflow doesn't know how to handle syntax highlighting for this. :-)
Similar to @ghoti's answer, but using find:
#!/bin/sh
for d in ${PATH//:/ }; do
find "$d" -maxdepth 1 -type f -executable
done
Bash uses a builtin command named 'complete' to implement the tab feature.
I don't have the details to hand, but this should tell you all you need to know:
help complete
(IFS=':'; find $PATH -maxdepth 1 -type f -executable -exec basename {} \; | sort | uniq)
It doesn't include shell builtins though.
An answer got deleted that I liked most, so I'm trying to repost it:
compgen is of course better
echo $PATH | tr ':' '\n' | xargs -n 1 ls -1
I find this to be the most typical shell approach; I think it also works with other shells, which I doubt for things like IFS=':'.
Clearly there may be problems if a file is not executable, but I think for my question that is enough: I just want to grep the output, i.e. search for some commands.

List files not matching a pattern?

Here's how one might list all files matching a pattern in bash:
ls *.jar
How to list the complement of a pattern? i.e. all files not matching *.jar?
Use egrep-style extended pattern matching.
ls !(*.jar)
This is available starting with bash-2.02-alpha1.
Must first be enabled with
shopt -s extglob
As of bash-4.1-alpha there is a config option to enable this by default.
ls | grep -v '\.jar$'
for instance.
Little known bash expansion rule:
ls !(*.jar)
With an appropriate version of find, you could do something like this, but it's a little overkill:
find . -maxdepth 1 ! -name '*.jar'
find finds files. The . argument specifies that you want to start searching from ., i.e. the current directory. -maxdepth 1 tells it to search only one level deep, i.e. the current directory. ! -name '*.jar' selects all files whose names don't match the glob pattern *.jar.
Like I said, it's a little overkill for this application, but if you remove the -maxdepth 1, you can then recursively search for all non-jar files or what have you easily.
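A throwaway demonstration (file names invented):

```shell
mkdir fdemo
touch fdemo/app.jar fdemo/readme.txt fdemo/lib.jar

find fdemo -maxdepth 1 -type f ! -name '*.jar'
# → fdemo/readme.txt

rm -r fdemo
```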
POSIX defines non-matching bracket expressions, so we can let the shell expand the file names for us.
ls *[!j][!a][!r]
This has some quirks, though, but at least it is compatible with just about any unix shell.
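One of those quirks is easy to trigger; this sketch (file names invented) hides not only the .jar file but also an innocent bystander:

```shell
mkdir bdemo && cd bdemo
touch notes.txt data.jar x.c war

# data.jar is hidden as intended, but so is "war": its middle 'a'
# lands exactly on the [!a] slot of the pattern
printf '%s\n' *[!j][!a][!r]
# → notes.txt and x.c only

cd .. && rm -r bdemo
```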
If your ls supports it (man ls) use the --hide=<PATTERN> option. In your case:
$> ls --hide=*.jar
No need to parse the output of ls (which is a bad idea anyway), and it scales to hiding multiple types of files. At some point I needed to see what non-source, non-object, non-libtool-generated files were in a (cluttered) directory:
$> ls src --hide=*.{lo,c,h,o}
Worked like a charm.
Another approach can be using ls -I flag (Ignore-pattern).
ls -I '*.jar'
And if you want to exclude more than one file extension, separate them with a pipe |, like ls test/!(*.jar|*.bar). Let's try it:
$ mkdir test
$ touch test/1.jar test/1.bar test/1.foo
$ ls test/!(*.jar|*.bar)
test/1.foo
Looking at the other answers you might need to shopt -s extglob first.
One solution would be ls -1 | grep -v '\.jar$'
Some mentioned variants of this form:
ls -d *.[!j][!a][!r]
But this seems to work only in bash, while the following seems to work in both bash and zsh:
ls -d *.[^j][^a][^r]
ls -I "*.jar"
-I, --ignore=PATTERN
do not list implied entries matching shell PATTERN
It works without having to execute anything before
It also works inside the quotes of watch: watch -d 'ls -I "*.gz"', unlike watch 'ls !(*.jar)', which produces: sh: 1: Syntax error: "(" unexpected
Note: for some reason CentOS requires quoting the pattern after -I, while Ubuntu does not
