Passing expression to find command not working - terminal

I tried the following command to list all the files that have /SL/src/ in their path:
find * -type f -name '*/SL/src/*'
but it does not print anything, even though there are files within the directory SL/src.

You might use something like this (note that -name only matches the last path component, so a pattern containing / never matches anything):
find * -type d -exec sh -c '
    for i do
        [ "$(basename "$i")" = src ] &&
        [ "$(basename "$(dirname "$i")")" = SL ] &&
        find "$i" -type f
    done
' sh {} +
or simply:
find * -type f | grep /SL/src/
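Alternatively, if your find supports the -path test (it is in POSIX nowadays, and both GNU and BSD find have it), you can match against the whole pathname, which is what the -name attempt was presumably aiming for:
find . -type f -path '*/SL/src/*'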

Related

Rename file if it is the only one with the extension in directory

This works; however, I would like to do it only if it is the only .jpg in the given directory. The one below will just rename them all to folder.jpg, overwriting the other files:
find . -type f -name '*.jpg' -execdir mv {} 'folder.jpg' \;
I guess find cannot filter by the number of matches, but you can always exec a shell which does more elaborate checks for you:
find . -type f -name '*.jpg' -execdir sh -c '[ $# = 1 ] && mv "$1" folder.jpg' sh {} +
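If you want to preview what would be renamed before committing, a dry-run sketch of the same command with mv replaced by echo:
find . -type f -name '*.jpg' -execdir sh -c '[ $# = 1 ] && echo mv "$1" folder.jpg' sh {} +
Once the output looks right, drop the echo.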

BASH: If condition uses result from find command to determine which file will be written

I want to list all files in a nested directory, but that directory has some files with spaces in their names. I want to write the paths of the files that don't have a space in their name and of those that do into two different files.
So far, I only know how to find those having a space in their name, with this command:
find /<my directory> -type f -name '* *'
I want something like:
find /<my directory> -type f
if [ name has space]
then > a.txt
else > b.txt
fi
Thank you in advance.
You can put a condition in a brief -exec. This is somewhat more complex than you would hope because -exec cannot directly contain shell builtins.
find "$path" -type f -exec sh -c 'for f; do
case $f in *\ *) dest=a;; *) dest=b;; esac;
echo "$f" >>$dest.txt
done' _ {} +
In other words, pass the found files to the following sh -c ... script. (The underscore is to populate $0 with something inside the subshell.)
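As a quick illustration of that argument mapping (the filenames here are made up):
sh -c 'echo "0=$0 1=$1 2=$2"' _ "file one" "file two"
# prints: 0=_ 1=file one 2=file two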
If the directory tree isn't too deep, perhaps it would be a lot easier to just run find twice.
find "$path" -type f -name '* *' >a.txt
find "$path" -type f \! -name '* *' >b.txt
Use two separate commands:
find "$path" -type f -name '* *' > a.txt
find "$path" -type f -not -name '* *' > b.txt

Count the number of files in a directory with a blank in its name

If you want a breakdown of how many files are in each dir under your current dir:
for i in $(find . -maxdepth 1 -type d) ; do
echo -n $i": " ;
(find $i -type f | wc -l) ;
done
It does not work when the directory name has a blank in it. Can anyone here tell me how I must edit this shell script so that such directory names are also accepted when counting their file contents?
Thanks
Your code suffers from a common issue described in http://mywiki.wooledge.org/BashPitfalls#for_i_in_.24.28ls_.2A.mp3.29.
In your case you could do this instead:
for i in */; do
echo -n "${i%/}: "
find "$i" -type f | wc -l
done
This will work with all types of file names:
find . -maxdepth 1 -type d -exec sh -c 'printf "%s: %i\n" "$1" "$(find "$1" -type f | wc -l)"' Counter {} \;
How it works
find . -maxdepth 1 -type d
This finds the directories just as you were doing
-exec sh -c 'printf "%s: %i\n" "$1" "$(find "$1" -type f | wc -l)"' Counter {} \;
This feeds each directory name to a shell script which counts the files, similarly to what you were doing.
There are some tricks here: Counter {} are passed as arguments to the shell script. Counter becomes $0 (which is only used if the shell script generates an error). find replaces {} with the name of a directory it found, and this is available to the shell script as $1. This is done in a way that is safe for all types of file names.
Note that, wherever $1 is used in the script, it is inside double quotes. This protects it against word splitting and other unwanted shell expansions.
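If your find supports -exec ... + (most modern implementations do), a batched sketch of the same idea starts one shell for a whole group of directories instead of one per directory:
find . -maxdepth 1 -type d -exec sh -c '
    for d do
        printf "%s: %i\n" "$d" "$(find "$d" -type f | wc -l)"
    done
' Counter {} +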
I found the solution; this is what I have to consider:
#!/bin/bash
SAVEIFS=$IFS
IFS=$(echo -en "\n\b")
for i in $(find . -maxdepth 1 -type d); do
    echo -n " $i: "
    find "$i" -type f | wc -l
done
IFS=$SAVEIFS

ls command and size of files in shell script

count=0; #count for counting
IFS='
'
for x in `ls -l $input`; #for loop using ls command
do
a=$(ls -ls | awk '{print $6}') #print[6] is sizes of file
echo $a
b=`echo $a | awk '{split($0,numbers," "); print numbers[1]}'`
echo $b
if [ $b -eq 0 ] # b is only size of a file
then
count=`expr $count + 1` #if b is zero , the count will increase one by one
fi
echo $count
done
I want to find zero-size files. I do that using the find command. The second thing is that I want to count the number of zero-size files using ls and awk, but the code above doesn't work. What is my mistake?
The -s test is true if a file has non-zero size. If that test fails for a file, increment your empty-file count.
empty_files=0
for f in "$input"/*; do
[ -s "$f" ] || : $(( empty_files++ ))
done
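To report the total afterwards, you could add something like:
printf 'Found %d empty files\n' "$empty_files"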
Your main mistake is that you're parsing ls!
If you want to find (regular) files that are empty, and if you have a version of find that supports the -empty predicate, use it:
find . -type f -empty
Note that this will recurse in subfolders too; if you don't want that, use:
find . -maxdepth 1 -type f -empty
(assuming that your find also supports -maxdepth).
If you only want to count how many empty (regular) files you have:
find . -maxdepth 1 -type f -empty -printf x | wc -m
and if you want to perform both operations at the same time, i.e., print out the name or save them in an array for future use, and count them:
empty_files=()
while IFS= read -r -d '' f; do
empty_files+=( "$f" )
done < <(find . -maxdepth 1 -type f -empty -print0)
printf 'There are %d empty files:\n' "${#empty_files[@]}"
printf ' %s\n' "${empty_files[@]}"
With Bash≥4.4, you could use mapfile instead of the while-read loop:
mapfile -t -d '' empty_files < <(find . -maxdepth 1 -type f -empty -print0)
printf 'There are %d empty files:\n' "${#empty_files[@]}"
printf ' %s\n' "${empty_files[@]}"
For a POSIX-compliant way, use test with the -s option:
find . -type f \! -exec test -s {} \; -print
and if you don't want to recurse into subdirectories, you'll have to -prune them:
find . \! -name . -prune -type f \! -exec test -s {} \; -print
and if you want to count them:
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf x \; | wc -m
and here, if you want to perform both operations (count them and save them in an array for later use), use the previous while-read loop (or mapfile if you live in the future) with this find:
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf '%s\0' {} \;
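Spelled out, that combination might look like this (bash, since read -d '' and process substitution are not POSIX):
empty_files=()
while IFS= read -r -d '' f; do
    empty_files+=( "$f" )
done < <(find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf '%s\0' {} \;)
printf 'There are %d empty files\n' "${#empty_files[@]}"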
Also see chepner's answer for a pure shell solution (needs minor tweaking to be POSIX compliant).
Regarding your comment
I want to count and delete [empty files]. How can I do that at the same time?
If you have GNU find (or a find that supports all the goodies):
find . -maxdepth 1 -type f -empty -printf x -delete | wc -m
if not,
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf x \; -exec rm {} \; | wc -m
Make sure that the -delete (or -exec rm {} \;) predicate is at the end! Do not exchange the order of the predicates!

Perform an action in every sub-directory using Bash

I am working on a script that needs to perform an action in every sub-directory of a specific folder.
What is the most efficient way to write that?
A version that avoids creating a sub-process:
for D in *; do
if [ -d "${D}" ]; then
echo "${D}" # your processing here
fi
done
Or, if your action is a single command, this is more concise:
for D in *; do [ -d "${D}" ] && my_command; done
Or an even more concise version (thanks @enzotib). Note that in this version each value of D will have a trailing slash:
for D in */; do my_command; done
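If my_command has to run inside each directory rather than just receive its name, one sketch is to cd in a subshell, so you automatically return for the next iteration:
for D in */; do
    ( cd "$D" && my_command )
done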
for D in `find . -type d`
do
    # Do whatever you need with D
done
The simplest non-recursive way is:
for d in */; do
echo "$d"
done
The / at the end tells it to use directories only.
There is no need for
find
awk
...
Use the find command.
In GNU find, you can use the -execdir parameter:
find . -type d -execdir realpath "{}" ';'
or by using the -exec parameter:
find . -type d -exec sh -c 'cd -P "$0" && pwd -P' {} \;
or with xargs command:
find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'
Or using a for loop:
for d in */; { echo "$d"; }
For recursion, try recursive globbing (**/) instead (enable it with: shopt -s globstar).
For more examples, see: How to go to each directory and execute a command? at SO
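For instance, a minimal recursive sketch (assuming bash >= 4, where globstar is available):
shopt -s globstar
for d in **/; do
    echo "$d"
done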
Handy one-liners
for D in *; do echo "$D"; done
for D in *; do find "$D" -type d; done ### Option A
find * -type d ### Option B
Option A is correct for folder names containing spaces. It is also generally faster, since it doesn't print each word of a folder name as a separate entity.
# Option A
$ time for D in ./big_dir/*; do find "$D" -type d > /dev/null; done
real 0m0.327s
user 0m0.084s
sys 0m0.236s
# Option B
$ time for D in `find ./big_dir/* -type d`; do echo "$D" > /dev/null; done
real 0m0.787s
user 0m0.484s
sys 0m0.308s
find . -type d -print0 | xargs -0 -n 1 my_command
This will create a subshell (which means that variable values will be lost when the while loop exits):
find . -type d | while read -r dir
do
something
done
This won't:
while read -r dir
do
something
done < <(find . -type d)
Either one will work if there are spaces in directory names.
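A small demonstration of the difference (in bash, where each part of a pipeline runs in a subshell):
count=0
find . -type d | while read -r dir; do
    count=$((count + 1))
done
echo "$count"    # prints 0: the increment happened in the subshell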
You could try:
#!/bin/bash
### $1 == the first argument to this script
### usage: script.sh /path/to/dir/
for f in `find "$1" -maxdepth 1 -mindepth 1 -type d`; do
    ( cd "$f" && <your job here> )   # subshell, so each iteration starts from the original directory
done
or similar...
Explanation:
find "$1" -maxdepth 1 -mindepth 1 -type d :
Only find directories, with a maximum recursion depth of 1 (only the immediate subdirectories of $1) and a minimum depth of 1 (which excludes the current folder .).
The accepted answer will break on white space if the directory names contain it, and the preferred command-substitution syntax for bash/ksh is $(). Use find's -exec option with +, e.g.
find .... -exec mycommand {} +   # this is the same as passing to xargs
or use a while loop
find .... | while read -r D
do
# use variable `D` or whatever variable name you defined instead here
done
If you want to perform an action INSIDE each folder and not ON the folder:
Explanation: you have many PDFs and you would like to concentrate them inside a single folder.
My folders:
AV 001/
AV 002/
for D in *; do cd "$D"; done;                    # VERY DANGEROUS COMMAND - DON'T USE
                                                 # missing "", it will list files too; it can go up too
for d in */; do cd "$d"; echo $d; cd ..; done;   # works successfully
for D in "$(ls -d */)"; do cd "$D"; done;        # bash: cd: $'Athens Voice 001/\nAthens Voice 002/' - there is no such folder
for D in "$(*/)"; do cd "$D"; done;              # bash: Athens Voice 001/: is a folder
for D in "$(`find . -type d`)"; do cd $D; done;  # bash: ./Athens: there is no such folder or file
for D in *; do if [ -d "${D}" ] then cd ${D}; done;   # many arguments
