Need a shell script that deletes all files except *.pdf - bash

Can anyone write a shell script that deletes all the files in the folder except those with pdf extension?

This will include all subdirectories:
find . -type f ! -iname '*.pdf' -delete
This will act only in the current directory:
find . -maxdepth 1 -type f ! -iname '*.pdf' -delete
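A quick sanity check of the two commands above in a throwaway directory (the filenames are made up for the demo):

```shell
# Scratch directory with a mix of PDF and non-PDF files.
tmp=$(mktemp -d)
cd "$tmp"
mkdir sub
touch a.pdf b.PDF c.txt sub/d.pdf sub/e.log

# Current directory only: c.txt is removed, sub/e.log survives.
find . -maxdepth 1 -type f ! -iname '*.pdf' -delete

# Recursive: now sub/e.log goes too. -iname keeps b.PDF as well.
find . -type f ! -iname '*.pdf' -delete
```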

$ ls -1 | grep -v '\.pdf$' | xargs -I {} rm -i {}
Or, if you are confident:
$ ls -1 | grep -v '\.pdf$' | xargs -I {} rm {}
Note that the dot must be escaped in the grep pattern, and that parsing ls breaks on unusual filenames. Hence the bulletproof version:
$ find . -maxdepth 1 -type f ! -iname '*.pdf' -delete

This should do the trick:
shopt -s extglob
rm !(*.pdf)
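A minimal sketch of the extglob approach, using made-up filenames. Since `!(...)` is only valid once extglob is enabled, the pattern is run here via `bash -O extglob` so it works even when the demo itself is executed by a plain POSIX shell:

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch a.pdf b.pdf notes.txt image.png

# !(*.pdf) matches everything in the directory except *.pdf.
bash -O extglob -c 'rm !(*.pdf)'
```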

ls | grep -v '\.pdf$' | xargs rm
This filters out all files ending in .pdf and runs rm on the rest. The dot must be escaped in the grep pattern, and this breaks on filenames containing whitespace.

Copying the result of a find operation in shell

I want to find a file, and simultaneously copy it to another directory like this:
cp (find . -name myFile | tail -n 1) dir/to/copy/to
But this says unexpected token `find'
Is there a better way to do this?
You may use a pipeline:
find . -name 'myFile' -print0 | tail -z -n 1 | xargs -0 -I {} cp {} /dir/to/copy/to/
The -print0 option (together with tail -z and xargs -0) handles filenames containing whitespace or glob characters. If your tail lacks -z, a newline-delimited version works for well-behaved filenames:
find . -name 'myFile' | tail -n 1 | xargs -I {} cp {} /dir/to/copy/to/
Two options are available:
Add the missing $() to evaluate the command substitution (the tail is only needed when the same file exists in multiple directories):
cp $(find . -name myFile | tail -n 1) dir/to/copy/to
find . -name myFile -type f -exec cp {} dir/to/copy/to \;
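Both variants can be checked in a scratch directory (paths and contents here are invented for the demo):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p src/deep dest
echo hello > src/deep/myFile

# Command substitution passes find's result to cp as its source argument.
cp "$(find src -name myFile | tail -n 1)" dest/

# Or let find invoke cp itself, which copes better with odd filenames.
find src -name myFile -type f -exec cp {} dest/ \;
```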

Clear multiple files

How can I clear/empty multiple files using bash?
For a single file you can use
> foo.log
But I've tried
> *.log;
find . -maxdepth 1 -name "*.log" | xargs >;
But they don't seem to work. How can I do this?
The redirection must be performed by a shell, one file at a time:
find . -maxdepth 1 -name "*.log" -exec sh -c '> "$1"' sh {} \;
Or, with a plain shell loop:
for f in *.log
do
  > "$f"
done
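A quick demonstration of the loop approach, with throwaway files invented for the example:

```shell
tmp=$(mktemp -d)
cd "$tmp"
echo "old data" > a.log
echo "old data" > b.log

# Truncate every .log file in the current directory to zero bytes.
for f in *.log
do
  > "$f"
done
```

After the loop, both files still exist but are empty.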
This also works, using a Perl one-liner to truncate each file:
find . -maxdepth 1 -name "*.log" | xargs -t -n1 -I '{}' perl -e "open(I,'>{}')"

removing files that have extension .bin then printing bye

filename=file.bin
extension=$(echo "$filename" | awk -F\. '{print $2}')
if [ "${extension}" == "bin" ]; then
  rm *."$extension"
fi
Would something like this work? How do I delete all files that have the same extension in a folder?
You don't need to extract the extension yourself, this is what globbing is for. Simply do:
rm *.bin
Or recursively:
find ./ -name "*.bin" -exec rm -f {} \;
Aside from globbing, this is also doable with find.
find -type f -name "*.bin" -exec rm {} \;
Or more efficiently, with newer versions of find:
find -type f -name "*.bin" -exec rm {} +
which is roughly equivalent to
find -type f -name "*.bin" | xargs rm
Note: by default, find recurses into subdirectories.
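The difference between the glob and find approaches is easy to see in a scratch directory (filenames invented for the demo):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir sub
touch a.bin keep.txt sub/b.bin

# The glob only touches the current directory; sub/b.bin is untouched.
rm *.bin

# find recurses by default (an explicit . keeps it portable to non-GNU find).
find . -type f -name "*.bin" -exec rm {} +
```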

How to find the executable files in the current directory and find out their extensions?

I need to find all executable files from /bin. How to do it using
find . -executable
and how to check if the file is script (for example, sh, pl, bash)?
#!/bin/bash
for file in $(find /bin); do
  if [ -x "$file" ]; then
    file "$file"
  fi
done
and even better:
find /bin -type f -perm /111 -print0 | xargs -0 file
(older GNU find versions used -perm +111 for the same test)
find /bin/ -executable returns all executable files from the /bin/ directory.
To filter by extension, use the -name flag. For example, find /bin/ -executable -name "*.sh" returns sh scripts.
UPD:
If a file is not a binary and has no extension, you can still figure out its type from the shebang.
For example, find ~/bin/ -executable | xargs grep --files-with-matches '#!/bin/bash' returns files from the ~/bin/ directory that contain #!/bin/bash.
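A minimal sketch of the shebang-grep idea, using GNU find's -executable and a made-up script name (grep -l is the short form of --files-with-matches):

```shell
tmp=$(mktemp -d)
cd "$tmp"
printf '#!/bin/bash\necho hi\n' > script.sh
printf 'plain text, not a script\n' > notes.txt
chmod +x script.sh

# Only executable files reach grep; -l prints the names of those that match.
find . -type f -executable | xargs grep -l '^#!/bin/bash'
```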
To find all the shell scripts:
find . -type f -executable | xargs file -i | grep x-shellscript | cut -d":" -f1
To find all the executables:
find . -type f -executable | xargs file -i | grep x-exec | cut -d":" -f1
To find all the shared libraries
find . -type f -executable | xargs file -i | grep x-sharedlib | cut -d":" -f1
This worked for me, so I thought I'd share it...
find ./ -type f -exec sh -c '
  case "$(head -n 1 "$1")" in
    ?ELF*) exit 0;;
    MZ*) exit 0;;
    "#!"*/ocamlrun*) exit 0;;
  esac
  exit 1
' sh {} \; -print

how to ignore directories but not the files in them in bash script with find

I want to run a find command but only find the files in directories, not the directories or subdirectories themselves. Also acceptable would be to find the directories but grep them out or something similar, still listing the files in those directories. As of right now, to find all files changed in the last day in the working directory, and grep'ing out DS_Store and replacing spaces with underscores:
find . -mtime -1 -type f -print | grep -v '\.DS_Store' | awk '{gsub(/ /,"_")}; 1'
Any help would be appreciated!
If you have GNU find:
find . -mtime -1 ! -name '.DS_Store' -type f -printf '%f\n'
will print only the basename of the file.
For other versions of find:
find . -mtime -1 ! -name '.DS_Store' -type f -exec basename {} \;
To also replace spaces with underscores, you can combine this with tr:
find . -mtime -1 -type f ! -name '.DS_Store' -exec sh -c 'basename "$1" | tr " " _' _ {} \;
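The basename-plus-tr trick can be seen in isolation with a made-up filename containing a space:

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch "my report.txt"

# Print the basename of each regular file, with spaces turned into underscores.
find . -type f -exec sh -c 'basename "$1" | tr " " _' _ {} \;
```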
