# first example
> alias gostyle="goimports -w $(find . -type f -name '*.go' -not -path './vendor/*')"
> alias gostyle
gostyle=$'goimports -w / gofiles /'
# second example
> alias gostyle="goimports -w $(find . -type f -name 'main.go' -not -path './vendor/*')"
> alias gostyle
gostyle='goimports -w ./main.go'
Why, in the first example, is there a $ in front of the command?
How can I use the wildcard * directly in the alias?
Why do I get "permission denied" when I use the first alias?
Because you are using double quotes instead of single, the $(find ...) is executed once, at the time you define your alias. You end up with an alias with a hard-coded list of files.
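To see the difference in isolation, here is a throwaway illustration (using date, not part of your command):
alias now="echo $(date +%s)"    # double quotes: $(date +%s) runs right now; the result is frozen into the alias
alias later='echo $(date +%s)'  # single quotes: the command substitution is stored literally and runs each time the alias is used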
The trivial fix is to use single quotes instead of double (where obviously then you need to change the embedded single quotes to double quotes instead, or come up with a different refactoring); but a much better solution is to use a function instead of an alias. There is basically no good reason to use an alias other than for backwards compatibility with dot files from the paleolithic age of Unix.
gostyle () {
    goimports -w $(find . -type f -name '*.go' -not -path './vendor/*') "$@"
}
(Unfortunately, I am not familiar with goimports; perhaps it needs its argument to be double quoted, or perhaps you should add one -w for each result that find produces. Or perhaps you actually want
gostyle () {
    find . -type f -name '*.go' -not -path './vendor/*' -print0 |
        xargs -0 goimports -w "$@"
}
where you might or might not want to include "$@".)
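If it turned out that goimports really wanted one file at a time (just a guess on my part), a per-file -exec variant would be one way to do it, dropping the "$@" pass-through for simplicity:
gostyle () {
    # run goimports once per matching file instead of passing the whole list at once
    find . -type f -name '*.go' -not -path './vendor/*' -exec goimports -w {} \;
}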
I'm trying to write one line of code that finds all .sh files in the current directory and its subdirectories, and print them without the .sh extension (preferably without the path too).
I think I got the find command down. I tried using the output of
find . -type f -iname "*.sh" -print
as input for echo, and formatting it along these lines
echo "${find_output%.sh}"
However, I cannot get it to work in one line, without a variable assignment.
I got inspiration from this answer on stackoverflow https://stackoverflow.com/a/18639136/15124805
to use this line:
echo "${$( find . -type f -iname "*.sh" -print)%.sh}"
But I get this error:
ash: ${$( find . -type f -iname "*.sh" -print)%.sh}: bad substitution
I also tried using xargs
find . -type f -iname "*.sh" -print |"${xargs%.sh}" echo
But I get a "command not found" error; probably I didn't use xargs correctly, but I'm not sure how I could improve this or if it's the right way to go.
How can I make this work?
That's the classic useless use of echo. You simply want
find . -type f -iname "*.sh" -exec basename {} .sh \;
If you have GNU find, you can also do this with -printf.
However, basename only matches .sh literally, so if you really expect extensions with different variants of capitalization, you need a different approach.
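For example, something along these lines with GNU find (a sketch; -printf '%f\n' drops the leading path, and the shell loop strips the suffix regardless of capitalization):
find . -type f -iname '*.sh' -printf '%f\n' |
while IFS= read -r name; do
    printf '%s\n' "${name%.[Ss][Hh]}"
done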
For the record, the syntax you tried to use for xargs would attempt to use the value of a variable named xargs. The correct syntax would be something like
find . -type f -iname "*.sh" -print |
xargs -n 1 sh -c 'echo "${1%.[Ss][Hh]}"' _
but that's obviously rather convoluted. In some more detail, you need sh because the parameter expansion you are trying to use is a feature of the shell, not of echo (or xargs, etc.).
(You can slightly optimize by using a loop:
find . -type f -iname "*.sh" -print |
xargs sh -c 'for f; do
echo "${f%.[Ss][Hh]}"
done' _
but this is still not robust for all file names; see also https://mywiki.wooledge.org/BashFAQ/020 for probably more than you realized you needed to know about this topic. If you have GNU find and GNU xargs, you can use find ... -print0 | xargs -r0)
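Spelled out, that more robust variant could look like this (GNU find and GNU xargs assumed):
find . -type f -iname "*.sh" -print0 |
xargs -r0 sh -c 'for f; do
    echo "${f%.[Ss][Hh]}"
  done' _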
I am trying to find all directories that start with a year in brackets, such as this:
[1990] Nature Documentary
and then rename them removing brackets and inserting a dash in between.
1990 - Nature Documentary
The find command below seems to find the results; however, I could not prefix the pattern with ^ to mark the start of the directory name, otherwise it doesn't return any hits.
I am pretty sure I need to use -exec or -execdir, but I am not sure how to store the found pattern and manipulate it.
find . -type d -name '\[[[:digit:]][[:digit:]][[:digit:]][[:digit:]]] *'
With [p]rename:
-depth -exec prename -n 's/\[(\d{4})]([^\/]+)$/$1 -$2/' {} +
Drop -n if the output looks good.
Without it, you'd need a shell script with several hardly intelligible parameter expansions there:
-depth -exec sh -c '
for dp; do
yr=${dp##*/[} yr=${yr%%]*}
echo mv "$dp" "${dp%/*}/$yr -${dp##*/\[????]}"
done' sh {} +
Remove echo to apply changes.
You can use the rename command
find . -type d -name '\[[[:digit:]][[:digit:]][[:digit:]][[:digit:]]\] *'| rename -n 's/(\[\d{4}\]) ([\w,\s]+)+$/$1 - $2/'
Note: the changes will not take effect until you remove the -n option.
I very often use find to search for files and symbols in a huge source tree. If I don't limit the directories and file types, it takes several minutes to search for a symbol in a file. (I already mounted the source tree on an SSD and that halved the search time.)
I have a few aliases to limit the directories that I want to search, e.g.:
alias findhg='find . -name .hg -prune -o'
alias findhgbld='find . \( -name .hg -o -name bld \) -prune -o'
alias findhgbldins='find . \( -name .hg -o -name bld -o -name install \) -prune -o'
I then also limit the file types as well, e.g.:
findhgbldins \( -name '*.cmake' -o -name '*.txt' -o -name '*.[hc]' -o -name '*.py' -o -name '*.cpp' \)
But sometimes I only want to check for symbols in cmake files:
findhgbldins \( -name '*.cmake' -o -name '*.txt' \) -exec egrep -H 'pattern' {} \;
I could make a whole bunch of aliases for all possible combinations, but it would be a lot easier if I could use variables to select the file types, e.g:
export SEARCHALL="\( -name '*.cmake' -o -name '*.txt' -o -name '*.[hc]' -o -name '*.py' -o -name '*.cpp' \)"
export SEARCHSRC="\( -name '*.[hc]' -o -name '*.cpp' \)"
and then call:
findhgbldins $SEARCHALL -exec egrep -H 'pattern' {} \;
I tried several variants of escaping \, (, * and ), but there was no combination that worked.
The only way I could make it work was to turn off globbing in Bash, i.e. set -f, before calling my 'find' contraption, and then turn globbing on again afterwards.
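In other words, something like:
set -f                                                  # stop the shell expanding the globs in $SEARCHALL
findhgbldins $SEARCHALL -exec egrep -H 'pattern' {} \;
set +f                                                  # turn globbing back on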
One alternative I came up with is to define a set of functions (with the same names as my aliases findhg, findhgbld, and findhgbldins) which take a simple parameter that is used in a case structure to select the different file types I am looking for, something like:
findhg () {
case $1 in
'1' )
find <many file arguments> ;;
'2' )
find <other file arguments> ;;
...
esac
}
findhgbld () {
case $1 in
'1' )
find <many file arguments> ;;
'2' )
find <other file arguments> ;;
...
esac
}
etcetera
My question is: Is it at all possible to pass these types of arguments to a command as a variable ?
Or is there maybe a different way to achieve the same thing, i.e. having a combination of a command (findhg, findhgbld, findhgbldins) and a single argument to create a large number of combinations for searching?
It's not really possible to do what you want without unpleasantness. The basic problem is that when you expand a variable without double-quotes around it (e.g. findhgbldins $SEARCHALL), it does word splitting and glob expansion on the variable's value, but does not interpret quotes or escapes, so there's no way to embed something in the variable's value to suppress glob expansion (well, unless you use invalid glob patterns, but that'd keep find from matching them properly too). Putting double-quotes around it (findhgbldins "$SEARCHALL") suppresses glob expansion, but it also suppresses word splitting, which you need to let find interpret the expression properly. You can turn off glob expansion entirely (set -f, as you mentioned), but that turns it off for everything, not just this variable.
One thing that would work (but would be annoying to use) would be to put the search options in arrays rather than plain variables, e.g.:
SEARCHALL=( \( -name '*.cmake' -o -name '*.txt' -o -name '*.[hc]' -o -name '*.py' -o -name '*.cpp' \) )
findhgbldins "${SEARCHALL[@]}" -exec egrep -H 'pattern' {} \;
but that's a lot of typing to use it (and you do need every quote, bracket, brace, etc to get the array to expand right). Not very helpful.
My preferred option would be to build a function that interprets its first argument as a list of file types to match (e.g. findhgbldins mct -exec egrep -H 'pattern' {} \; might find make/cmake, c/h, and text files). Something like this:
findhgbldins() {
    filetypes=()
    if [[ $# -ge 1 && "$1" != "-"* ]]; then  # if we were passed a type list (not just a find primitive starting with "-")
        typestr="$1"
        while [[ "${#typestr}" -gt 0 ]]; do
            case "${typestr:0:1}" in  # this looks at the first char of typestr
                c) filetypes+=(-o -name '*.[ch]');;
                C) filetypes+=(-o -name '*.cpp');;
                m) filetypes+=(-o -name '*.make' -o -name '*.cmake');;
                p) filetypes+=(-o -name '*.py');;
                t) filetypes+=(-o -name '*.txt');;
                ?) echo "Usage: $0 [cCmpt] [find options]" >&2
                   return 1 ;;
            esac
            typestr="${typestr:1}"  # remove first character, so we can process the remainder
        done
        # Note: at this point filetypes will be something like '-o' -name '*.txt' -o -name '*.[ch]'
        # To use it with find, we need to remove the first element (`-o`), and add parens
        filetypes=( \( "${filetypes[@]:1}" \) )
        shift  # and get rid of $1, so it doesn't get passed to `find` later!
    fi
    # Run `find`
    find . \( -name .hg -o -name bld -o -name install \) -prune -o "${filetypes[@]}" "$@"
}
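Hypothetical usage (egrep and 'pattern' are just stand-ins, as above):
findhgbldins mct -exec egrep -H 'pattern' {} \;           # make/cmake, c/h and text files
findhgbldins -name '*.py' -exec egrep -H 'pattern' {} \;  # no type list: find primaries pass straight through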
...you could also use a similar approach to building a list of directories to prune, if you wanted to.
As I said, that'd be my preferred option. But there is a trick (and I do mean trick), if you really want to use the variable approach. It's called a magic alias, and it takes advantage of the fact that aliases are expanded before wildcards, but functions are processed afterward, and does something completely unnatural with the combination. Something like this:
alias findhgbldins='shopts="$SHELLOPTS"; set -f; noglob_helper find . \( -name .hg -o -name bld -o -name install \) -prune -o'
noglob_helper() {
    "$@"
    case "$shopts" in
        *noglob*) ;;
        *) set +f ;;
    esac
    unset shopts
}
export SEARCHALL="( -name *.cmake -o -name *.txt -o -name *.[hc] -o -name *.py -o -name *.cpp )"
Then if you run findhgbldins $SEARCHALL -exec egrep -H 'pattern' {} \;, it expands the alias, records the current shell options, turns off globbing, and passes the find command (including $SEARCHALL, word-split but not glob-expanded) to noglob_helper, which runs the find command with all options, then turns glob expansion back on (if it wasn't disabled in the saved shell options) so it doesn't mess you up later. It's a complete hack, but it should actually work.
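The question's other variable would need the same treatment (inner quotes dropped, since globbing is already off by the time it expands), for example:
export SEARCHSRC="( -name *.[hc] -o -name *.cpp )"
findhgbldins $SEARCHSRC -exec egrep -H 'pattern' {} \;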
I'm trying to capture a find outcome within a bash script.
It works fine until I add another word: for example, alien works, but alien 1 does not.
found201=$(ssh root@192.168.1.201 find "${folder201[@]}" ! -path "*/.wdmc/*" -type f -iname "*$ffind*" | sort)
If I run this in a terminal:
found201=$(ssh root@192.168.1.201 'find /shares/Public/ /shares/Videos/ -type f -iname "*alien 1*"' | sort)
with the whole 'find ...' quoted it works, but then it's not using the string/array variables.
When I add the single quotes in the script I get "bad substitution" (I assume it then treats it as a string, not a command).
I need to use find because later I need to delete files etc. in a set manner.
How can I add the ' ' around find and still use the strings/array?
Running it normally I get this:
++ ssh root@192.168.1.201 find /shares/Public/ /shares/Videos/ '' '!' -path '*/.wdmc/*' -type f -iname '*alien 1*'
++ sort
find: unrecognized: 1*
Try like this, enclosing the entire find command within double-quotes, and using single-quotes inside:
found201=$(ssh root@192.168.1.201 "find ${folder201[@]} ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)
However there is a caveat:
this won't work with folders in ${folder201[@]} that contain spaces.
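If folders with spaces mattered, one possible workaround (a sketch, untested against this setup) is to let bash quote each array element with printf %q before handing the command to the remote shell:
found201=$(ssh root@192.168.1.201 "find $(printf '%q ' "${folder201[@]}") ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)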
I think I have sorted it:
found201=$(ssh root@192.168.1.201 "find ${folder201[@]} ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)
I changed '${folder201[@]}' to ${folder201[@]} (removed the single quotes).
I'm trying to run find, and exclude several directories listed in an array. I'm finding some weird behavior when it's expanding, though, which is causing me issues:
~/tmp> skipDirs=( "./dirB" "./dirC" )
~/tmp> bars=$(find . -name "bar*" -not \( -path "${skipDirs[0]}/*" $(printf -- '-o -path "%s/*" ' "${skipDirs[@]:1}") \) -prune); echo $bars
./dirC/bar.txt ./dirA/bar.txt
This did not skip dirC as I would have expected. The problem is that the printf expands the quotes around "./dirC".
~/tmp> set -x
+ set -x
~/tmp> bars=$(find . -name "bar*" -not \( -path "${skipDirs[0]}/*" $(printf -- '-o -path "%s/*" ' "${skipDirs[@]:1}") \) -prune); echo $bars
+++ printf -- '-o -path "%s/*" ' ./dirC
++ find . -name 'bar*' -not '(' -path './dirB/*' -o -path '"./dirC/*"' ')' -prune
+ bars='./dirC/bar.txt
./dirA/bar.txt'
+ echo ./dirC/bar.txt ./dirA/bar.txt
./dirC/bar.txt ./dirA/bar.txt
If I try to remove the quotes in the $(printf ...), then the * gets expanded immediately, which also gives the wrong results. Finally, if I remove the quotes and try to escape the *, then the \ escape character gets included as part of the filename in the find, and that does not work either. I'm wondering why the above does not work, and what would work. I'm trying to avoid using eval if possible, but currently I'm not seeing a way around it.
Note: This is very similar to: Finding directories with find in bash using a exclude list, however, the posted solutions to that question seem to have the issues I listed above.
The safe approach is to build your array explicitly:
#!/bin/bash
skipdirs=( "./dirB" "./dirC" )
skipdirs_args=( -false )
for i in "${skipdirs[@]}"; do
    skipdirs_args+=( -o -type d -path "$i" )
done
find . \! \( \( "${skipdirs_args[@]}" \) -prune \) -name 'bar*'
I slightly modify the logic in your find since you had a slight (logic) error in there: your command was:
find -name 'bar*' -not stuff_to_prune_the_dirs
How does find proceed? It will walk the file tree, and when it finds a file (or directory) that matches bar*, it will then apply the -not ... part. That's really not what you want! Your -prune is never going to be applied!
Look at this instead:
find . \! \( -type d -path './dirA' -prune \)
Here find will completely prune the directory ./dirA and print everything else. Now it's among everything else that you want to apply the filter -name 'bar*'! The order is very important! There's a big difference between this:
find . -name 'bar*' \! \( -type d -path './dirA' -prune \)
and this:
find . \! \( -type d -path './dirA' -prune \) -name 'bar*'
The first one doesn't work as expected at all! The second one is fine.
Notes.
I'm using \! instead of -not as \! is POSIX, -not is an extension not specified by POSIX. You'll argue that -path is not POSIX either so it doesn't matter to use -not. That's a detail, use whatever you like.
You had to use some dirty trick to build your commands to skip your dirs, as you had to consider the first term separately from the others. By initializing the array with -false, I don't have to consider any terms specially.
I'm specifying -type d so that I'm sure I'm pruning directories.
Since my pruning really applies to the directories, I don't have to include wildcards in my exclude terms. This is funny: your problem that seemingly is about wildcards that you can't handle disappears completely when you use find appropriately as explained above.
Of course, the method I gave really applies with wildcards too. For example, if you want to exclude/prune all subdirectories called baz inside subdirectories called foo, the skipdirs array given by
skipdirs=( "./*/foo/baz" "./*/foo/*/baz" )
will work fine!
The issue here is that the quotes you are using on "%s/*" aren't doing what you think they are.
That is to say, you think you need the quotes on "%s/*" to prevent the results from the printf from being globbed however that isn't what is happening. Try the same thing without the directory separator and with files that start and end with double quotes and you'll see what I mean.
$ ls
"dirCfoo"
$ skipDirs=( "dirB" "dirC" )
$ printf -- '%s\n' -path "${skipDirs[0]}*" $(printf -- '-o -path "%s*" ' "${skipDirs[@]:1}")
-path
dirB*
-o
-path
"dirCfoo"
$ rm '"dirCfoo"'
$ printf -- '%s\n' -path "${skipDirs[0]}*" $(printf -- '-o -path "%s*" ' "${skipDirs[@]:1}")
-path
dirB*
-o
-path
"dirC*"
See what I mean? The quotes aren't being handled specially by the shell. They just happen not to glob in your case.
This issue is part of why things like what is discussed at http://mywiki.wooledge.org/BashFAQ/050 don't work.
To do what you want here I believe you need to create the find arguments array manually.
sD=(-path /dev/null)
for dir in "${skipDirs[@]}"; do
    sD+=(-o -path "$dir")
done
and then expand "${sD[@]}" on the find command line (-not \( "${sD[@]}" \) or so).
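Put together, a sketch of the whole thing (pruning first and filtering by -name afterwards, as the other answer here explains):
skipDirs=( "./dirB" "./dirC" )
sD=(-path /dev/null)
for dir in "${skipDirs[@]}"; do
    sD+=(-o -path "$dir")
done
find . -not \( \( "${sD[@]}" \) -prune \) -name 'bar*'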
And yes, I believe this makes the answer you linked to incorrect (though the other answer might work, for non-whitespace files etc., because of the array indirection that is going on).