Removing final bash script argument

I'm trying to write a script that searches a directory for files and greps for a pattern. Something similar to the below except the find expression is much more complicated (excludes particular directories and files).
#!/bin/bash
if [ -d "${!#}" ]
then
path=${!#}
else
path="."
fi
find $path -print0 | xargs -0 grep "$@"
Obviously, the above doesn't work because "$@" still contains the path. I've tried variants of building up an argument list by iterating over all the arguments to exclude the path, such as
args=${@%$path}
find $path -print0 | xargs -0 grep "$args"
or
whitespace="[[:space:]]"
args=""
for i in "${@%$path}"
do
# handle the NULL case
if [ ! "$i" ]
then
continue
# quote any arguments containing white-space
elif [[ $i =~ $whitespace ]]
then
args="$args \"$i\""
else
args="$args $i"
fi
done
find $path -print0 | xargs -0 grep --color "$args"
but these fail with quoted input. For example,
# ./find.sh -i "some quoted string"
grep: quoted: No such file or directory
grep: string: No such file or directory
Note that if "$@" doesn't contain the path, the first script does do what I want.
EDIT : Thanks for the great solutions! I went with a combination of the answers:
#!/bin/bash
path="."
end=$#
if [ -d "${!#}" ]
then
path="${!#}"
end=$((end - 1))
fi
find "$path" -print0 | xargs -0 grep "${@:1:$end}"

EDIT:
The original was just slightly off: no argument should be removed if the last argument is not a directory.
#!/bin/bash
if [ -d "${!#}" ]
then
path="${!#}"
remove=1
else
path="."
remove=0
fi
find "$path" -print0 | xargs -0 grep "${@:1:$(($# - remove))}"
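As a quick sanity check of slicing the positional parameters this way, here is a minimal sketch (the arguments are hypothetical, chosen to mirror the failing example above):

```shell
#!/usr/bin/env bash
# Sketch: slice the positional parameters to drop a trailing directory argument.
set -- -i "some quoted string" /tmp     # hypothetical arguments
remove=0
[ -d "${!#}" ] && remove=1              # last argument is a directory here
# Each surviving argument stays a single word, quoting intact:
printf '<%s>\n' "${@:1:$(($# - remove))}"
```

The quoted string comes through as one argument, which is exactly what the string-rebuilding loop above could not guarantee.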

Related

Get directory when last folder in path ends in given string (sed in ifelse)

I am attempting to find multiple files with the .py extension and grep to see if any of these files contain the string nn, then return only the unique directory names. Afterwards, if the last folder of the path ends in nn, select it.
For example:
find `pwd` -iname '*.py' | xargs grep -l 'nn' | xargs dirname | sort -u | while read files; do if [[ sed 's|[\/](.*)*[\/]||g' == 'nn' ]]; then echo $files; fi; done
However, I cannot use sed in an if-else expression; how can I use it in this case?
[[ ]] is not bracket syntax for an if statement as in other languages such as C or Java. It's a special command for evaluating a conditional expression. Depending on your intentions, you need to either omit it or use it correctly.
If you're trying to test a command for success or failure just call the command:
if command ; then
:
fi
If you want to test whether the output of the command is equal to some value, you need to use a command substitution:
if [[ $( command ) = some_value ]] ; then
:
fi
In your case though, a simple parameter expansion will be easier:
# if $files does not contain a trailing slash
if [[ "${files: -2}" = "nn" ]] ; then
echo "${files}"
fi
# if $files does contain a trailing slash
if [[ "${files: -3}" = "nn/" ]] ; then
echo "${files%/}"
fi
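For instance, with a hypothetical path value, the negative-offset expansion picks off the last two characters (the space before the minus sign matters, as it distinguishes this from the ${var:-default} form):

```shell
files=/src/models_nn            # hypothetical value
echo "${files: -2}"             # prints nn
```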
The shell loop and the [[ are superfluous here, since you use sed anyway. The task can be accomplished by:
find "$PWD" -type f -name '*.py' -exec grep -l 'nn' {} + |
sed -n 's%\(.*nn\)/[^/]*$%\1%p' | sort -u
assuming pathnames don't contain a newline character.
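To see what that sed filter does, here is a small sketch with two hypothetical paths; only the one whose directory component ends in nn survives, with the filename stripped off:

```shell
# The pattern \(.*nn\)/[^/]*$ requires "nn" immediately before the final slash,
# and keeps everything up to that point.
printf '%s\n' /src/models_nn/train.py /src/utils/helpers.py |
sed -n 's%\(.*nn\)/[^/]*$%\1%p'
# prints only /src/models_nn
```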

iterate over lines in file then find in directory

I am having trouble looping and searching. It seems that the loop is not waiting for the find to finish. What am I doing wrong?
I made a loop that reads a file line by line. I then want to use that name to search a directory, looking to see if a folder has that name. If it exists, copy it to a drive.
#!/bin/bash
DIRFIND="$2"
DIRCOPY="$3"
if [ -d $DIRFIND ]; then
while IFS='' read -r line || [[ -n "$line" ]]; do
echo "$line"
FILE=`find "$DIRFIND" -type d -name "$line"`
if [ -n "$FILE" ]; then
echo "Found $FILE"
cp -a "$FILE" "$DIRCOPY"
else
echo "$line not found."
fi
done < "$1"
else
echo "No such file or directory"
fi
Have you tried xargs...
Proposed Solution
cat filenamelist | xargs -n1 -I {} find . -type d -name {} -print | xargs -n1 -I {} mv {} .
What the above does is pipe a list of filenames into find (one at a time); when a match is found, find prints the name and passes it to xargs, which moves the file...
Expansion
file = yogo
yogo -> find . -type d -name yogo -print | xargs -n1 -I {} mv ./<path>/yogo .
I hope the above helps; note that xargs has the advantage that you do not run out of command-line buffer space.
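The original loop also works once the find target is quoted; here is a minimal sketch of the same idea wrapped in a function (copy_named_dirs is a hypothetical name; arguments are list file, search directory, destination directory):

```shell
#!/usr/bin/env bash
# Sketch: copy every directory under $2 whose name appears on a line of $1 into $3.
copy_named_dirs() {
    local listfile=$1 search=$2 dest=$3
    while IFS= read -r name; do
        # -exec avoids capturing find's output into a single string variable
        find "$search" -type d -name "$name" -exec cp -a {} "$dest" \;
    done < "$listfile"
}
```

Usage would be copy_named_dirs names.txt /search/dir /backup/drive.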

find executables in my PATH with a particular string

Is there a way to quickly know whether an executable in my $PATH contains a particular string? For instance, I want to quickly list the executables that contain SRA.
The reason I'm asking is that I have several scripts with the characters SRA in them. The problem is that I always forget the starting character of the file (if I do remember, I use tab completion to find it).
You can store all the paths in an array and then use find with its various options:
IFS=":" read -ra paths <<< "$PATH"
find "${paths[@]}" -type f -executable -name '*SRA*'
IFS=":" read -ra paths <<< "$PATH" reads all the paths into an array, setting the field separator temporarily to :, as seen in Setting IFS for a single statement.
-type f looks for files.
-executable looks for files that are executable. You may use -perm +111 instead on OSX (source).
Since the -executable option is not available in FreeBSD or OSX, ghoti nicely recommends using the -perm option:
find -perm -o=x,-g=x,-u=x
For example:
find ${PATH//:/ } -maxdepth 1 -executable -name '*SRA*'
And if you happen to have spaces (or other hazardous characters) in the $PATH (the <<< trick borrowed from the answer of @fedorqui):
tr ":\n" "\\000" <<< "$PATH" | \
xargs -0r -I{} -n1 find {} -maxdepth 1 -executable -name '*SRA*'
It also handles the empty $PATH correctly.
A bit clumsy:
find $(echo $PATH | tr : ' ') -name \*SRA\*
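If you only need command names rather than full paths, bash's own compgen builtin is another option; note this is a looser sketch, since compgen -c lists builtins, aliases, and functions as well as $PATH executables:

```shell
# List every command name bash can run, filtered for the substring.
compgen -c | grep -i 'SRA' | sort -u
```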
I wrote a bash script that wraps this up for OSX, based off this great answer on this page.
I think it will work for other operating systems as well. Note that it also ignores errors, sorts the results, and only shows unique values!
executables_in_path_matching_substring.sh
#!/usr/bin/env bash
function show_help()
{
ME=$(basename "$0")
IT=$(cat <<EOF
returns a list of files in the path that match a substring
usage: $ME SUBSTRING
e.g.
# Find all files in the path that have "git" in their name
$ME git
EOF
)
echo "$IT"
echo
exit
}
if [ -z "$1" ]
then
show_help
fi
if [ "$1" == "help" ] || [ "$1" == '?' ] || [ "$1" == "--help" ] || [ "$1" == "h" ]; then
show_help
fi
SUBSTRING="$1"
IFS=":" read -ra paths <<< "$PATH"
find "${paths[@]}" -type f -perm +111 -name "*$SUBSTRING*" 2>/dev/null | sort | uniq

Find all files that contain both string1 and string2

The following script finds and prints the names of all files that contain either string1 or string2.
However, I could not figure out how to change this code so that it prints only those files that contain both string1 and string2. Kindly suggest the required change.
number=0
for file in `find -name "*.txt"`
do
if [ "`grep "string2\|string1" $file`" != "" ]  # change has to be done here
then
echo "`basename $file`"
number=$((number + 1))
fi
done
echo "$number"
Using grep and cut:
grep -H string1 input | grep -E '[^:]*:.*string2' | cut -d: -f1
You can use this with the find command:
find -name '*.txt' -exec grep -H string1 {} \; | grep -E '[^:]*:.*string2'
And if the patterns are not necessarily on the same line:
find -name '*.txt' -exec grep -l string1 {} \; | \
xargs -n 1 -I{} grep -l string2 {}
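The same two-pass idea can also be made safe for filenames containing spaces or even newlines, assuming GNU grep (for -Z) and GNU xargs (for -0 and -r); the files here are hypothetical:

```shell
# Sketch: create two sample files, then filter for files matching both strings.
tmp=$(mktemp -d) && cd "$tmp"
printf 'string1\nstring2\n' > 'a b.txt'   # contains both strings
printf 'string1\n' > only1.txt            # contains only string1
# grep -lZ prints matching filenames NUL-terminated; xargs -0 reads them back.
find . -name '*.txt' -exec grep -lZ 'string1' {} + |
xargs -0r grep -l 'string2'               # prints ./a b.txt
```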
This solution can handle files with spaces in their names:
number=0
oldIFS=$IFS
IFS=$'\n'
for file in `find -name "*.txt"`
do
if grep -l "string1" "$file" >/dev/null; then
if grep -l "string2" "$file" >/dev/null; then
basename "$file"
number=$((number + 1))
fi
fi
done
echo $number
IFS=$oldIFS

Bash: Native way to check if an entry is one line?

I have a find script that automatically opens a file if just one file is found. The way I currently handle it is by doing a line count on the search results. Is there an easier way to do this?
if [ "$( cat "$temp" | wc -l | xargs echo )" == "1" ]; then
edit `cat "$temp"`
fi
EDITED - here is the context of the whole script.
term="$1"
temp=".aafind.txt"
find src sql common -iname "*$term*" | grep -v 'src/.*lib' >> "$temp"
if [ ! -s "$temp" ]; then
echo "ø - including lib..." 1>&2
find src sql common -iname "*$term*" >> "$temp"
fi
if [ "$( cat "$temp" | wc -l | xargs echo )" == "1" ]; then
# just open it in an editor
edit `cat "$temp"`
else
# format output
term_regex=`echo "$term" | sed "s%\*%[^/]*%g" | sed "s%\?%[^/]%g" `
cat "$temp" | sed -E 's%//+%/%' | grep --color -E -i "$term_regex|$"
fi
rm "$temp"
Unless I'm misunderstanding, the variable $temp contains one or more filenames, one per line, and if there is only one filename it should be edited?
[ $(wc -l <<< "$temp") = "1" ] && edit "$temp"
If $temp is a file containing filenames:
[ $(wc -l < "$temp") = "1" ] && edit "$(cat "$temp")"
Several of the results here will read through an entire file, whereas one can stop and have an answer after one line and one character:
if { IFS='' read -r result && ! read -n 1 _; } <file; then
echo "Exactly one line: $result"
else
echo "Either no valid content at all, or more than one line"
fi
For safely reading from find, if you have GNU find and bash as your shell, replace <file with < <(find ...) in the above. Even better, in that case, is to use NUL-delimited names, such that filenames with newlines (yes, they're legal) don't trip you up:
if { IFS='' read -r -d '' result && ! read -r -d '' -n 1 _; } \
< <(find ... -print0); then
printf 'Exactly one file: %q\n' "$result"
else
echo "Either no results, or more than one"
fi
Well, given that you are storing these results in the file $temp this is a little easier:
[ "$( wc -l < $temp )" -eq 1 ] && edit "$( cat $temp )"
Instead of 'cat $temp' you can do '< $temp', but it might take away some readability if you are not very familiar with redirection 8)
If you want to test whether the file is empty or not, test -s does that.
if [ -s "$temp" ]; then
edit `cat "$temp"`
fi
(A non-empty file by definition contains at least one line. You should find that wc -l agrees.)
If you genuinely want a line count of exactly one, then yes, it can be simplified substantially;
if [ $( wc -l <"$temp" ) = 1 ]; then
edit `cat "$temp"`
fi
You can use arrays:
x=($(find . -type f))
[ "${#x[*]}" -eq 1 ] && echo "just one" || echo "many"
But you might have problems in case of filenames with whitespace, etc.
Still, something like this would be a native way
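A NUL-safe variant of the same array idea, assuming bash 4.4+ (for mapfile -d '') and GNU find (for -print0); the test directory here is hypothetical:

```shell
# Sketch: read NUL-delimited find output into an array, then test its length.
tmp=$(mktemp -d) && cd "$tmp" && touch single.txt   # hypothetical test dir
mapfile -d '' x < <(find . -maxdepth 1 -type f -print0)
(( ${#x[@]} == 1 )) && echo "just one" || echo "many"
```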
no this is the way, though you're making it over-complicated:
if [ "`wc -l $temp | cut -d' ' -f1`" = "1" ]; then
edit "$temp";
fi
what's complicating it is:
useless use of cat,
unuseful use of xargs
and I'm not sure if you really want the edit `cat $temp`, which edits the file named by the content of $temp
