Get folder list of files with specific system permissions - bash

I have hundreds of subdirectories and I need to find every subdirectory that contains files with a specific permission (in this case 0775).
I'm trying to achieve that with:
find . -maxdepth 2 -type f -perm 0775 -printf '%h\n' | wc -l
This command prints the same directory name multiple times; I need to display only one occurrence per directory.
Thanks for any help!

find . -maxdepth 2 -type f -perm 0775 -printf '%h\n' | sort | uniq | wc -l

You are searching for files (-type f) and printing only the directories of those files with %h. That is not what you need. Search for directories with -type d and drop the -printf option completely:
find . -maxdepth 2 -type d -perm 0775 | wc -l
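A minimal self-contained sketch of the sort | uniq approach from the first answer (sort -u is an equivalent shorthand). It assumes GNU find for -printf; the throwaway directory tree is purely illustrative:

```shell
# Sketch: count unique directories containing files with mode 0775.
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
touch "$tmp/a/f1" "$tmp/a/f2" "$tmp/b/f3"
chmod 0775 "$tmp/a/f1" "$tmp/a/f2" "$tmp/b/f3"

# Two matching files live in a/, one in b/ -- %h would print a/ twice,
# so sort -u collapses the duplicates before counting.
count=$(find "$tmp" -maxdepth 2 -type f -perm 0775 -printf '%h\n' | sort -u | wc -l)
echo "$count"   # 2

rm -rf "$tmp"
```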


Command to find N Largest files in macOS directory excluding specific directories [duplicate]

I want to find the largest files in my project, excluding the .target folder.
So far I have found the following command, but I'm not sure how to exclude the .target folder:
find . -type f -exec du -h {} + | sort -hr | head -n 10
Also, please share what the command would be for excluding multiple directories.
Simply use -path and -prune to exclude the .target directory:
find . -path ./.target -prune -o -type f -exec du -h {} + | sort -hr | head -n 10
Would you please try the following:
find . -type d -name .target -prune -o -type f -exec du -h {} + | sort -hr | head -n 10
-prune is an action that skips directories matched by the preceding condition.
If you want to exclude multiple directories, append further names, say "exclude2", joined with -o:
find . -type d \( -name .target -o -name exclude2 \) -prune -o -type f -exec du -h {} + | sort -hr | head -n 10
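A small self-contained check of the -prune behavior described above; the directory and file names are illustrative:

```shell
# Sketch: verify that -prune skips the excluded directories.
tmp=$(mktemp -d)
mkdir -p "$tmp/.target" "$tmp/exclude2" "$tmp/src"
echo data > "$tmp/.target/a"
echo data > "$tmp/exclude2/b"
echo data > "$tmp/src/c"

# Only src/c should survive the prune: .target and exclude2 are
# matched by name, pruned, and never descended into.
found=$(find "$tmp" -type d \( -name .target -o -name exclude2 \) -prune -o -type f -print)
echo "$found"

rm -rf "$tmp"
```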

How to write wc's stdout into a file?

The command below shows how many characters each file in the current directory contains:
find -name '*.*' |xargs wc -c
I want to write the standard output into a file:
find -name '*.*' |xargs wc -c > /tmp/record.txt
but it runs into an error:
wc: .: Is a directory
How can I write all of the standard output into a file?
Why -name '*.*'? That will not find every file, and it will also match directories. You need to use -type f, and better than piping the result to xargs is using -exec:
find . -maxdepth 1 -type f -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't descend into subdirectories.
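A minimal sketch of the -type f / -exec fix, using a throwaway directory so the example is self-contained (GNU find assumed; file names are illustrative):

```shell
# Sketch: count bytes per regular file and save the output to a file.
tmp=$(mktemp -d)
printf 'abc' > "$tmp/one.txt"    # 3 bytes
printf 'hello' > "$tmp/two.txt"  # 5 bytes

# -type f keeps directories out of wc's argument list,
# so there is no "wc: .: Is a directory" warning.
find "$tmp" -maxdepth 1 -type f -exec wc -c {} + > "$tmp/record.txt"
record=$(cat "$tmp/record.txt")
echo "$record"

rm -rf "$tmp"
```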
I think you maybe meant find | xargs wc -c?
find -name '.' just returns .
Filter for regular files only, if files are all you want:
find -type f

Bash script: list all files including subdirectories and sort them by date

I have a bash script:
for entry in "/home/pictures"/*
do
echo "ls -larth $entry"
done
I want to also list the files in subfolders, including their path.
I want to sort the results by date.
It must be a bash script, because some other software (Jenkins) will call it.
Try find.
find /home/pictures -type f -exec ls -l --full-time {} \; | sort -k 6
If there are no newlines in file names, use:
find /home/pictures -type f -printf '%T@ %p\n' | sort -n
If you cannot tolerate timestamps in the output, use:
find /home/pictures -type f -printf '%28T@ %p\n' | sort -n | cut -c30-
If there is a possibility of newlines in file names, and if you can make the program that consumes the output accept null-terminated records, you can use:
find /home/pictures -type f -printf '%T@,%p\0' | sort -nz
For no timestamps in the output, use:
find /home/pictures -type f -printf '%28T@ %p\0' | sort -nz | cut -zc30-
P.S.
I have assumed that you want to sort by last modification time.
I found the solution to my question:
find . -name '*' -exec ls -larth {} +
(The * must be quoted, or the shell will expand it before find sees it.)
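A self-contained sketch of the epoch-timestamp sort described in the accepted answer, using GNU find's %T@ directive and GNU touch's -d to fabricate two mtimes (the file names and dates are illustrative):

```shell
# Sketch: list files under a directory, oldest first, by mtime.
tmp=$(mktemp -d)
touch -d '2020-01-01' "$tmp/old.txt"
touch -d '2023-06-15' "$tmp/new.txt"

# %T@ prints seconds since the epoch, so a plain numeric sort
# orders the entries by modification time.
ordered=$(find "$tmp" -type f -printf '%T@ %p\n' | sort -n | awk '{print $2}')
echo "$ordered"

rm -rf "$tmp"
```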

How to check if a file has a symlink

I am writing a bash shell script (for RH and Solaris) that uses openssl to create hash symlinks to certificates, all within the same directory. I have approximately 130 certificates, and when new certs are added I would like the script to create symlinks for the new certs only. I may have to resort to deleting all symlinks and recreating them, but I would prefer a not-so-difficult way to avoid that if one exists.
I know how to find all the symlinks:
find . -lname '*.cer'
or
find . -type l -printf '%p -> %l\n'
But I am not sure how to negate this or find the inverse of this result in a for or other loop. I want to find all files in the current directory that are missing a symlink.
Thanks!
$ ls -srlt
example1.png
example.png -> ../example.png
$ find . -type f -print
./example1.png
$ find . ! -type f -print
.
./example.png
Assuming GNU find (whose use I infer from your use of nonstandard action -printf), the following command will output the names of all files in the current directory that aren't the target of symlinks located in the same directory:
comm -23 <(find . -maxdepth 1 -type f -printf '%f\n' | sort) \
<(find . -maxdepth 1 -lname '*' -printf '%l\n' | sort)
Note: This assumes that the symlinks were defined with a mere filename as the target.
Also, GNU find doesn't sort the output, so explicit sort commands are needed.
You can process the resulting files in a shell loop to create the desired symlinks.
comm -23 <(find . -maxdepth 1 -type f -printf '%f\n' | sort) \
<(find . -maxdepth 1 -lname '*' -printf '%l\n' | sort) |
while IFS= read -r name; do ln -s "$name" "${name}l"; done
(In the above command, l is appended to the target filename to form the symlink name, as an example.)
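A self-contained sketch of the comm -23 approach (requires bash for process substitution, and GNU find for -printf; the certificate and link names are illustrative):

```shell
# Sketch: find regular files that no local symlink points to.
tmp=$(mktemp -d)
cd "$tmp"
touch linked.cer orphan.cer
ln -s linked.cer linked.cer.0   # the link name here is only illustrative

# comm -23 keeps lines unique to the first list: regular files
# that do not appear as the target of any symlink in this directory.
missing=$(comm -23 <(find . -maxdepth 1 -type f -printf '%f\n' | sort) \
                   <(find . -maxdepth 1 -lname '*' -printf '%l\n' | sort))
echo "$missing"   # orphan.cer

cd / && rm -rf "$tmp"
```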

Using wc * to count the number of lines of the files in the current directory

When I use wc *, it shows warnings for the directories.
Is it possible to filter the current directory so that only the list of files is passed to wc?
Another way with find:
find . -maxdepth 1 -type f -exec wc {} \;
You can use find:
find . -maxdepth 1 -type f -print0 | xargs -0 wc -l
Or with gnu wc:
find . -maxdepth 1 -type f -print0 | wc --files0-from=- -l
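A self-contained sketch of the -print0 / xargs -0 variant, with a subdirectory present to show that plain wc * would have warned (GNU tools assumed; file names are illustrative):

```shell
# Sketch: count lines per regular file, ignoring directories.
tmp=$(mktemp -d)
printf 'a\nb\n' > "$tmp/x.txt"    # 2 lines
printf '1\n2\n3\n' > "$tmp/y.txt" # 3 lines
mkdir "$tmp/subdir"               # plain `wc *` would warn on this

# -print0 / -0 keeps filenames with spaces or newlines intact.
result=$(find "$tmp" -maxdepth 1 -type f -print0 | xargs -0 wc -l)
echo "$result"

rm -rf "$tmp"
```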
