I would like to use sed to delete and replace some characters in a bash script.
#!/bin/bash
DIR="."
file_extension=".mkv|.avi|.mp4"
files=$(find "$DIR" -maxdepth 1 -type f -regex ".*\.\(mkv\|avi\|mp4\)" -printf "%f\n")
In order to simplify $files, I would like to use $file_extension in it, i.e. change .mkv|.avi|.mp4 to mkv\|avi\|mp4
How can I do that with sed? Or is there an easier alternative?
No need for sed; bash has basic substitution operators built in. The basic syntax for a replace-all operation is ${variable//pattern/replacement}, but unfortunately it can't be nested so you need a helper variable. For clarity, I'll even use two:
file_extension_without_dot="${file_extension//./}" # mkv|avi|mp4
file_extension_regex="${file_extension_without_dot//|/\\|}" # mkv\|avi\|mp4
files=$(find "$DIR" -maxdepth 1 -type f -regex ".*\.\($file_extension_regex\)" -printf "%f\n")
If your find supports it, you could also consider using a different -regextype (see find -regextype help) so you don't need quite so many backslashes anymore.
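For example, with GNU find you can switch to POSIX extended regexes, which need no backslashes for grouping or alternation. A sketch, assuming you store the extensions without the leading dots:
file_extension="mkv|avi|mp4"
files=$(find "$DIR" -maxdepth 1 -type f -regextype posix-extended -regex ".*\.($file_extension)" -printf "%f\n")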
Is there any way to get the basename in the command find?
What I don't need:
find /dir1 -type f -printf "%f\n"
find /dir1 -type f -exec basename {} \;
Why, you may ask? Because I need to continue using the found file. I basically want something like this:
find . -type f -exec find /home -type l -name "*{}*" \;
And it uses ./file1, not file1, as the argument for -name.
Thanks for your help :)
If you've got Bash version 4.3 or later, try this Shellcheck-clean pure Bash code:
#! /bin/bash -p
shopt -s dotglob globstar nullglob
for path in ./**; do
    [[ -L $path ]] && continue
    [[ -f $path ]] || continue
    base=${path##*/}
    for path2 in /home/**/*"$base"*; do
        [[ -L $path2 ]] && printf '%s\n' "$path2"
    done
done
shopt -s ... enables some Bash settings that are required by the code:
dotglob enables globs to match files and directories whose names begin with a dot (.); find shows such files by default.
globstar enables the use of ** to match paths recursively through directory trees. globstar was introduced in Bash 4.0, but it was dangerous to use before Bash 4.3 (2014) because it followed symlinks when looking for matches.
nullglob makes globs expand to nothing when nothing matches (otherwise they expand to the glob pattern itself, which is almost never useful in programs).
See Removing part of a string (BashFAQ/100 (How do I do string manipulation in bash?)) for an explanation of ${path##*/}. That always works, even in some rare cases where $(basename "$path") doesn't.
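To illustrate that caveat, a contrived sketch: a name with a trailing newline survives ${path##*/}, but the command substitution around basename strips it:
path=$'./name-with-trailing-newline\n'
printf '<%s>\n' "${path##*/}"            # trailing newline is preserved
printf '<%s>\n' "$(basename "$path")"    # $(...) strips trailing newlines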
See the accepted, and excellent, answer to Why is printf better than echo? for an explanation of why I used printf instead of echo to output the found paths.
This solution works correctly even if you've got files that contain pattern characters (?, *, [, ], \) in their names.
Spawn a shell and make the second call to find from there
find /dir1 -type f -exec sh -c '
    for p; do
        find /dir2 -type l -name "*${p##*/}*"
    done' sh {} +
If your files may contain special characters in their names (like [, ?, etc.), you may want to escape them like this to avoid false positives:
find /dir1 -type f -exec sh -c '
    for p; do
        esc=$(printf "%sx\n" "${p##*/}" | sed "s/[?*[\]/\\\&/g")
        esc=${esc%x}
        find /dir2 -type l -name "*$esc*"
    done' sh {} +
You'll have to forward the name to another program to evaluate; there is no way to do that within find itself.
find . -type f -printf '%f\0' |
xargs -r0I{} find /home -type l -name '*{}*'
This answers your question about merging the functionality of %f and -exec find, and is based on your example. However, your example injects raw filenames as -name patterns, so avoid that and look at the other solutions instead.
Simply spawn a bash shell:
find /dir1 -type f -exec bash -c '
    base=$(basename "$1")
    echo "$base"
    do_something_else "$base"
' bash {} \;
In the bash part, $1 is each file found by find, one at a time.
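If there are many files, a variant with {} + runs one bash per batch of files instead of one per file. A sketch, with do_something_else still a placeholder:
find /dir1 -type f -exec bash -c '
    for path; do
        base=$(basename "$path")
        echo "$base"
        do_something_else "$base"
    done
' bash {} +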
I'm trying to write one line of code that finds all .sh files in the current directory and its subdirectories, and print them without the .sh extension (preferably without the path too).
I think I got the find command down. I tried using the output of
find . -type f -iname "*.sh" -print
as input for echo, and formatting it along these lines
echo "${find_output%.sh}"
However, I cannot get it to work in one line, without variable assignment.
I got inspiration from this answer on stackoverflow https://stackoverflow.com/a/18639136/15124805
to use this line:
echo "${$( find . -type f -iname "*.sh" -print)%.sh}"
But I get this error:
ash: ${$( find . -type f -iname "*.sh" -print)%.sh}: bad substitution
I also tried using xargs
find . -type f -iname "*.sh" -print |"${xargs%.sh}" echo
But I get a "command not found" error. Probably I didn't use xargs correctly, but I'm not sure how I could improve this or if it's the right way to go.
How can I make this work?
That's the classic useless use of echo. You simply want
find . -type f -iname "*.sh" -exec basename {} .sh \;
If you have GNU find, you can also do this with -printf.
However, basename only matches .sh literally, so if you really expect extensions with different variants of capitalization, you need a different approach.
For the record, the syntax you tried to use for xargs would attempt to use the value of a variable named xargs. The correct syntax would be something like
find . -type f -iname "*.sh" -print |
xargs -n 1 sh -c 'echo "${1%.[Ss][Hh]}"' _
but that's obviously rather convoluted. In some more detail, you need sh because the parameter expansion you are trying to use is a feature of the shell, not of echo (or xargs, etc.).
(You can slightly optimize by using a loop:
find . -type f -iname "*.sh" -print |
xargs sh -c 'for f; do
    echo "${f%.[Ss][Hh]}"
done' _
but this is still not robust for all file names; see also https://mywiki.wooledge.org/BashFAQ/020 for probably more than you realized you needed to know about this topic. If you have GNU find and GNU xargs, you can use find ... -print0 | xargs -r0)
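Spelled out, a sketch of that null-delimited version (it assumes GNU find and GNU xargs, as noted above):
find . -type f -iname "*.sh" -print0 |
xargs -r0 sh -c 'for f; do
    echo "${f%.[Ss][Hh]}"
done' _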
# first example
> alias gostyle="goimports -w $(find . -type f -name '*.go' -not -path './vendor/*')"
> alias gostyle
gostyle=$'goimports -w / gofiles /'
# second example
> alias gostyle="goimports -w $(find . -type f -name 'main.go' -not -path './vendor/*')"
> alias gostyle
gostyle='goimports -w ./main.go'
Why, in the first example, do I have $ in front of the command?
How can I use the wildcard * directly in the alias?
Why do I get "permission denied" when I use the first alias?
Because you are using double quotes instead of single, the $(find ...) is executed once, at the time you define your alias. You end up with an alias with a hard-coded list of files.
The trivial fix is to use single quotes instead of double (where obviously then you need to change the embedded single quotes to double quotes instead, or come up with a different refactoring); but a much better solution is to use a function instead of an alias. There is basically no good reason to use an alias other than for backwards compatibility with dot files from the paleolithic age of Unix.
gostyle () {
    goimports -w $(find . -type f -name '*.go' -not -path './vendor/*') "$@"
}
(Unfortunately, I am not familiar with goimports; perhaps it needs its argument to be double quoted, or perhaps you should add one -w for each result that find produces. Or perhaps you actually want
gostyle () {
    find . -type f -name '*.go' -not -path './vendor/*' -print0 |
        xargs -0 goimports -w "$@"
}
where you might or might not want to include "$@".)
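(Yet another sketch, untested for the same reason: skip xargs and let find invoke goimports directly, assuming goimports, like gofmt, accepts multiple file arguments. Note this variant drops the "$@" pass-through.)
gostyle () {
    find . -type f -name '*.go' -not -path './vendor/*' -exec goimports -w {} +
}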
Here are some names:
El Peulo'Pasa, Van O'Driscoll, Mike_Willam
How do I filter the names that contain ', using find in POSIX bash?
If I use the following command,
find . -maxdepth 1 -mindepth 1 -type d -regex '^.*[']*$' -print
Bash runs into a problem because the ' terminates the quoted string, so the pattern gets mangled.
You don't need -regex (which is a non-POSIX action) for this at all; -name is more than adequate. (-mindepth and -maxdepth are also extensions that aren't present in the POSIX standard).
To make a ' literal, put it inside double quotes, or precede it with a backslash in an unquoted context:
find . -maxdepth 1 -mindepth 1 -type d -name "*'*" -print
...or the 100% identical but harder-to-read command line...
find . -maxdepth 1 -mindepth 1 -type d -name '*'\''*' -print
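(If you want find itself to run a command on each match, a sketch with dosomethingwith standing in for whatever command you want, as in the examples below:)
find . -maxdepth 1 -mindepth 1 -type d -name "*'*" -exec dosomethingwith {} +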
If you're just searching the current directory (and not its subdirectories), you don't even need find, just a wildcard ("glob") expression:
ls *\'*
(Note that the ' must be escaped or double-quoted, but the asterisks must not be.)
If you want to do operations on these files, you can either use that wildcard expression directly:
dosomethingwith *\'*
# or
for file in *\'*; do
    dosomethingwith "$file"
done
...or if you're using bash, store the filenames in an array, then use that. This involves getting the quoting just right, to avoid trouble with other weird characters in filenames (e.g. spaces):
filelist=( *\'* )
dosomethingwith "${filelist[@]}"
# or
for file in "${filelist[@]}"; do
    dosomethingwith "$file"
done
Note that arrays are not part of the POSIX shell standard; they work in some shells (bash, ksh, zsh, etc.), but not in others (e.g. dash). If you want to use arrays, be sure to use the right shebang to get the shell you want (and don't override it by running the script with sh scriptname).
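A POSIX-sh sketch that avoids arrays by reusing the positional parameters (dosomethingwith is still the placeholder command from above):
set -- *\'*   # note: if nothing matches, $1 is the literal pattern
for file; do
    dosomethingwith "$file"
done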
I need to create a list of files in a directory and all of its subdirectories. I've used
find . -type f
to get the list of files, but I only need the file name, not the path that leads to it.
So instead of returning this:
./temp2/temp3/file3.txt
./temp2/file2.txt
./file.txt
I need to return
file3.txt
file2.txt
file.txt
find . -type f -exec basename {} \;
or better yet:
find . -type f -printf "%f\n"
You can use the -printf option in GNU find:
find . -type f -printf "%f\n"
For non-GNU find, use -execdir with printf:
find . -type f -execdir printf "%s\n" {} +
find . -type f | xargs -n 1 basename
The command basename strips the directories and outputs only the file name. Use xargs to feed the output of find to basename, one name per invocation (-n 1), since basename treats a second argument as a suffix to strip.
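If your filenames may contain spaces or quotes, a null-delimited sketch (assumes GNU find and GNU xargs):
find . -type f -print0 | xargs -0 -n 1 basename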
Late answer :)
The -printf "%f" is the best solution, but -printf is available only in GNU find. On OS X (and probably FreeBSD too), for example, it will print:
find: -printf: unknown primary or operator
For such cases, I prefer the following:
find . -type f | sed 's:.*/::'
It is faster (on large trees) than running basename once per file.
Drawback: it only handles filenames that do not contain a \n (newline). For those cases, the easiest (and most universal, but slow) way is using basename.
You can also use Perl:
perl -MFile::Find -E 'find(sub{say if -f},".")'