I'm trying to pass a parameter to rm in a bash script to clean my system automatically. For example, I want to remove everything except the *.doc files, so I wrote the following:
#!/bin/bash
remove_Target="!*.doc"
rm $remove_Target
However, the output always says
rm: cannot remove ‘!*.doc’: No such file or directory
It seems that bash adds single quotes for me when passing the variable to rm. How can I remove the single quotes?
Using Bash
Suppose that we have a directory with three files
$ ls
a.py b.py c.doc
To delete all except *.doc:
$ shopt -s extglob
$ rm !(*.doc)
$ ls
c.doc
!(*.doc) is an extended shell glob, or extglob, that matches every file except those ending in .doc.
The extglob feature is bash-specific (available since bash 2.02, not part of POSIX sh) and must be enabled with shopt -s extglob before the pattern is parsed.
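As a sketch of the same sequence in a throwaway directory (created with mktemp -d so nothing real is touched), previewing the match with echo before deleting:

```shell
#!/usr/bin/env bash
# Scratch-directory demo of extglob: preview with echo, then delete.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
shopt -s extglob          # enable !(...) and friends before using them
touch a.py b.py c.doc
echo !(*.doc)             # preview what would be removed: a.py b.py
rm -- !(*.doc)            # now actually remove everything except *.doc
ls                        # only c.doc remains
```

Previewing with echo first is a cheap dry run; if the pattern is wrong you find out before anything is deleted.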
Using find
Alternatively:
find . -maxdepth 1 -type f ! -name '*.doc' -delete
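To sanity-check the find variant (note that -maxdepth and -delete are GNU/BSD extensions, not POSIX), you can run the same expression without -delete first to see what would be removed:

```shell
#!/usr/bin/env bash
# Scratch-directory demo: list the candidates first, then delete them.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch a.py b.py c.doc
find . -maxdepth 1 -type f ! -name '*.doc'          # prints ./a.py and ./b.py (order may vary)
find . -maxdepth 1 -type f ! -name '*.doc' -delete
ls                                                  # only c.doc is left
```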
Related
I'm trying to understand how to use curly braces and quotes properly in bash. I'm wondering why the third example of an ls command doesn't work.
#!/bin/bash -vx
# File name prefix.
File_name_prefix='this_is_a_file_name_prefix'
# Let's do this in the /tmp directory.
cd /tmp
# Let's make three empty files.
touch ${File_name_prefix}_1.txt
touch ${File_name_prefix}_2.txt
touch ${File_name_prefix}_3.txt
# Let's list the three files.
# This works.
ls "$File_name_prefix"*
# This works.
ls ${File_name_prefix}*
# This does not work.
ls "${File_name_prefix}*"
# This fails.
find ./ -type f -name '${File_name_prefix}*'
# This fails spectacularly.
find ./ -type f -name ${File_name_prefix}*
# But this works.
find ./ -type f -name "${File_name_prefix}*"
echo "Why?"
# Clean up.
rm ${File_name_prefix}*
exit
When you execute a command like the first and second examples:
ls "$File_name_prefix"*
ls ${File_name_prefix}*
the shell performs pathname expansion (globbing) on the unquoted * before running the command: it matches the pattern against the contents of the relevant directory (the one named on the command line, or the current directory for a relative path).
So, assuming $File_name_prefix is fp and the directory contains the files fp1 fp2 fp3, what actually runs is:
ls fp1 fp2 fp3
In the third example, however, the shell treats the quoted argument as final and does not apply * expansion inside the double quotes,
so what runs is:
ls "fp*"
And because there is no file literally named fp* (with an asterisk in its name), only fp1 fp2 fp3, ls reports that there is no such file or directory.
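A minimal sketch of the three quoting cases, using echo (and the hypothetical fp1/fp2/fp3 files from above) so nothing is actually listed or removed:

```shell
#!/usr/bin/env bash
# Demonstrate where quoting stops glob expansion.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch fp1 fp2 fp3
prefix=fp
echo "$prefix"*      # * is outside the quotes: expands to fp1 fp2 fp3
echo ${prefix}*      # unquoted: expands to fp1 fp2 fp3
echo "${prefix}*"    # * is inside the quotes: stays literal, prints fp*
```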
I'm trying to remove all .js and .js.map files from any sub-directory of src called __tests__.
$ find . -path './src/**' -name __tests__ | # find subdirectories
> sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | # for each subdirectory, concat *.js and *.js.map
> xargs rm # remove files
This fails with the following errors:
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
However, if I change my xargs rm to xargs echo rm, copy and paste the output, and run it, it works.
$ find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' |
> xargs echo rm # echo command to remove files
rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
$ rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
Wrapping the output of my echo in $(...) and prepending rm results in the same error as before.
$ rm $(find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g')
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
What am I doing wrong?
I doubt it matters, but I'm using GitBash on Windows.
First, to explain the issue: In find | sed | xargs rm, the shell only sets up communication between those programs, but it doesn't actually process the results in any way. That's a problem here because *.js needs to be expanded by a shell to replace it with a list of filenames; rm treats every argument it's given as a literal name. (This is unlike Windows, where programs do their own command-line parsing and glob expansion).
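A quick way to see this, assuming a scratch directory with a couple of .js files: a pattern fed to a command through xargs arrives as literal text, while the same pattern typed on a shell command line gets expanded first.

```shell
#!/usr/bin/env bash
# Globs flow through a pipe as literal text; only the shell expands them.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch a.js b.js
printf '%s\n' '*.js' | xargs echo rm   # prints: rm *.js   (literal)
echo rm *.js                           # prints: rm a.js b.js (shell expanded)
```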
Arguably, you don't need find here at all. Consider:
shopt -s globstar # enable ** as a recursion operator
rm ./src/**/__tests__/*.js{,.map} # delete *.js and *.js.map in any __tests__ directory under src
...or, if you do want to use find, let it do the work of coming up with a list of individual files matching *.js, instead of leaving that work to happen later:
find src -regextype posix-egrep -regex '.*/__tests__/[^/]*[.]js([.]map)?' -delete
You need to have your globs (*) expanded. File name expansion is performed by the shell on UNIX, not by rm or other programs. Try:
.... | xargs -d $'\n' sh -c 'IFS=; for f; do rm -- $f; done' sh
...to explain this:
The -d $'\n' (a GNU xargs extension) ensures that xargs splits only on newlines (not spaces!), and also stops it from treating backslashes and quotes as special.
sh -c '...' sh runs ... as a script, with sh as $0, and subsequent arguments in $1, etc; for f; will thus iterate over those arguments.
Clearing IFS with IFS= prevents string-splitting from happening when $f is used unquoted, so only glob expansion happens.
Using the -- argument to rm ensures that it treats subsequent arguments as filenames, not options, even if they start with dashes.
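The $0/$1 plumbing described above can be seen in isolation with a toy command:

```shell
# sh -c runs the quoted script; the next word becomes $0, the rest $1, $2, ...
sh -c 'echo "0=$0 1=$1 2=$2"' sh foo bar   # prints: 0=sh 1=foo 2=bar
```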
That said, if you have really a lot of files for each pattern, you might still run into an "Argument list too long" error when the glob expands, even though you are using xargs.
Another caveat is that filenames containing newlines can potentially be split into multiple names (depending on the details of the version of find you're using). A way to solve this that will work with all POSIX-compliant versions of find might be:
find ./src -type d -name __tests__ -exec sh -c '
  for d; do
    rm -- "$d"/*.js "$d"/*.js.map
  done
' sh {} +
(Note that "$d"/*.js{,.map} would not work here: brace expansion is a bash feature, and sh -c may run a POSIX shell that lacks it.)
I am trying to find some scripts in bash.
FOLDERS='one,two'
eval find "{$FOLDERS}/*.sh"
Of course I want to do this without eval. But removing eval simply gives:
find: {one,two}/*.sh: No such file or directory
How can I make find accept a variable set of folders using something like brace expansion, and without using a loop?
Use an array, then you can expand the array elements directly.
folders=(one two)
find "${folders[@]}" -name '*.sh'
The nice thing about this is it'll work even if the folders have whitespace, commas, or other special characters.
folders=('comma,separated,name' 'My Documents')
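For example, printing each element on its own line shows that the elements stay intact, whatever they contain:

```shell
#!/usr/bin/env bash
# "${folders[@]}" expands to one word per array element, spaces and all.
folders=('comma,separated,name' 'My Documents')
printf '<%s>\n' "${folders[@]}"
# <comma,separated,name>
# <My Documents>
```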
A working command using brace expansion would look like:
$ find {one,two} -name "*.sh"
Demonstration:
$ cd /tmp
$ mkdir {one,two}
$ touch {one,two}/{a.sh,b.sh,c.txt}
$ ls one/
a.sh b.sh c.txt
$ ls two/
a.sh b.sh c.txt
$ find {one,two} -name "*.sh"
one/b.sh
one/a.sh
two/b.sh
two/a.sh
I've created a script that removes all zero-length files from a directory.
#!/bin/bash
find . -size 0 -type f -exec rm -i '{}' \;
It works well, except that it only works in the directory the script is actually located in and its sub-directories. I want to be able to use a directory as a command line argument (bash scriptname dirname) while executing the script and have it only search that directory and its sub-directories, not the actual directory the script is located in. Is there a way to do this?
With $1, $2, and so on you can access the positional command line arguments of your bash script. So in your case this should be something like
find "$1" -size 0 -type f -exec rm -i '{}' \;
(quoting $1 so that a directory name containing spaces still works).
In Bash you can pass arguments to your script; they are accessed with the $ sign. For example:
./hello 123 abc xyz
Here $0 is your program name,
$1 is 123, and so on.
$1 through $9 can be used directly; for the tenth argument and beyond, use braces: ${10}, ${11}, and so on.
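A tiny self-contained illustration (writing the script body to a temp file so it can be invoked with arguments, as ./hello is above):

```shell
#!/usr/bin/env bash
# Toy script showing positional parameters in action.
script=$(mktemp)
cat > "$script" <<'EOF'
echo "got $# arguments; first=$1 second=$2"
EOF
sh "$script" 123 abc xyz   # prints: got 3 arguments; first=123 second=abc
```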
If your version of find accepts multiple paths, you can pass all the positional parameters like this:
find "$#" -size 0...
Note: you should not use $* for file names! Unquoted, it is subject to word splitting and globbing, so any file names containing spaces or newlines will break the command. "$@" preserves each parameter as a separate word, so it is safe to use here. If find doesn't accept multiple paths, you can loop over the parameters like this:
for dir in "$#"; do
find "$dir" -size 0...
done
If I write this line in my script:
list=$(ls /path/to/some/files/*/*/*.dat)
it works fine. But what I need is
files="files/*/*/*.dat"
list=$(ls /path/to/some/${files})
and it says
ls: /path/to/some/files/*/*/*.dat: No such file or directory
How should I do it?
If you only get that message when there truly are no matching .dat files, add this to your script:
shopt -s nullglob
It will cause the glob to expand to an empty list if there are no matching files, rather than being treated literally.
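A quick sketch of the effect in an empty scratch directory:

```shell
#!/usr/bin/env bash
# Without nullglob a non-matching glob is passed through literally;
# with nullglob it expands to nothing at all.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
echo *.dat            # prints the literal pattern: *.dat
shopt -s nullglob
echo *.dat            # prints an empty line
```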
Try this:
list=$(find /path/to/some/files/ -mindepth 3 -maxdepth 3 -name '*.dat')