How can I get the full path of a .pid file that is inside a directory?
The code below returns only the file name:
root#linux [/]# ls -l $(find /* -name dovecot | grep var/run) | grep pid
-rw------- 1 root root 5 Nov 28 15:22 master.pid
Guess this is what you are looking for:
find /var/run -name "*.pid" 2>/dev/null | grep dovecot | xargs ls -l
You can also narrow the matches down by specifying (part of) the path inside the grep filter expression.
I think the interpretation of the output must be that the find command finds a directory name such as:
/var/run/dovecot
and you do an ls -l on the directory, which lists the files in the directory without any path leading to it. What you need is to find a reliable way of listing the files in the directory with their full path names.
One way — not I think a good way — of doing it would be:
find $(find /* -name dovecot -type d | grep var/run) -type f -name '*.pid' \
-exec ls -l {} +
This uses your first find command to get the directories you're interested in, then runs find again to find .pid files and execs ls -l on them. The + notation means that find behaves a bit like xargs, bunching a lot of file names together into a single run of ls -l.
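To make the batching visible in a throwaway sandbox (the paths here are made up, and echo stands in for ls -l so the number of invocations is easy to see): with \; the command runs once per match, while with + the matches are bunched into as few invocations as possible, just like xargs.

```shell
# Sandbox layout (made-up paths).
mkdir -p /tmp/execdemo/dovecot
touch /tmp/execdemo/dovecot/master.pid /tmp/execdemo/dovecot/other.pid

# With \; the command runs once per match: two echo lines here.
find /tmp/execdemo -type f -name '*.pid' -exec echo {} \;

# With + the matches are batched into one invocation: one echo line.
find /tmp/execdemo -type f -name '*.pid' -exec echo {} +

rm -r /tmp/execdemo
```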
cat /var/run/dovecot/master.pid
?
Or :
# readlink -f /var/run/dovecot/*.pid
/var/run/dovecot/master.pid
I get error printouts when I use (1) xargs find | xargs mv or (2) xargs find -exec mv to move files, although the moving itself works as expected. I get no error printouts when I use (3) an intermediate variable, which is what I want.
Finding the inode numbers, which is what I'm doing with stat and sed, shouldn't be related to the problem.
I have condensed my problem into the following. Basically I want to get rid of the error printouts in both of Methods 1 and 2 (no piping to /dev/null please).
Note I'm on macOS, using BSD stat, and I'm also using zsh.
And I also know about find -print0 | xargs -0, I just wanted to condense the problem so that it's not as long. This is as condensed as I want to get.
Setup
Make test subdirectories (which are later moved by inode number).
mkdir -p "./testdir/dir"{1..4}
Method 1 - xargs find | xargs mv
Works to move the subdirectories, but with error printouts.
stat -s ./testdir/* | sed -E -n 's/.*st_ino=([0-9]+).*/\1/p' | xargs -I abc find ./testdir/* -maxdepth 1 -inum abc | xargs -I {} mv {} ~/.Trash
Error printouts:
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir2: No such file or directory
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir2: No such file or directory
find: ./testdir/dir3: No such file or directory
Method 2 - xargs find -exec mv
Works to move the subdirectories, but with error printouts.
stat -s ./testdir/* | sed -E -n 's/.*st_ino=([0-9]+).*/\1/p' | xargs -I abc find ./testdir/* -maxdepth 1 -inum abc -exec mv {} ~/.Trash \;
Error printouts:
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir2: No such file or directory
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir2: No such file or directory
find: ./testdir/dir3: No such file or directory
find: ./testdir/dir1: No such file or directory
find: ./testdir/dir2: No such file or directory
find: ./testdir/dir3: No such file or directory
find: ./testdir/dir4: No such file or directory
Method 3 - intermediate variable
Works to move the subdirectories, but this time without error printouts, which is what I want.
dirs_to_move=$(stat -s ./testdir/* | sed -E -n 's/.*st_ino=([0-9]+).*/\1/p' | xargs -I abc find ./testdir/* -maxdepth 1 -inum abc)
printf '%s\n' "$dirs_to_move" | xargs -I {} mv {} ~/.Trash
Question
How do I eliminate the error printouts from both of Methods 1 and 2? What is causing the printouts?
Short answer: find is being told to search all 4 subdirectories, even after some of them have been deleted.
Detailed explanation: The root problem is that the wildcard in the xargs -I abc find ./testdir/* -maxdepth 1 ... part gets expanded by the shell before any of the commands run. So that part of the script becomes:
... | xargs -I abc find ./testdir/dir1 ./testdir/dir2 ./testdir/dir3 ./testdir/dir4 -maxdepth 1 ...
What happens then is the stat | sed part sends the first inode number to xargs, that runs find, find searches all 4 directories for the matching item, and either deletes it directly or sends its path to the next xargs for deletion.
Next, the stat | sed part sends the second inode number to xargs, that runs find, find searches all 4 directories... oops, hey, one of them's missing! So it prints an error message about ./testdir/dir1 not existing, searches the other three, and (one way or another) deletes the next one.
Next comes the third inode number, and this time neither ./testdir/dir1 nor ./testdir/dir2 exists, so you get two error messages. Etc etc etc.
(There's also an additional problem with the second one, where immediately after running the mv command, find then tries to search its contents, and oops it's gone. That's why you get more error messages that way. I think you might've wanted -maxdepth 0 to keep it from trying to do that.)
Solution: I'm not sure what the larger context is, but my immediate reaction is that this is an overcomplex mess and as much as possible should be removed. But without knowing what can be changed without breaking the big picture, the minimal fix I see is to just have find search the entire testdir directory, rather than using a wildcard to list specific (wrong) subdirectories:
... | xargs -I abc find ./testdir -maxdepth 1 ...
(And note that in this form, -maxdepth 1 is actually correct.)
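Here is a sketch of the fix applied end to end, in a throwaway /tmp layout. Since stat -s is BSD-specific, this portable version substitutes ls -di | awk for the stat | sed inode extraction; the trash location is made up. The point is the same: find's starting point is the parent directory, which always exists, so no errors are printed.

```shell
# Recreate the question's layout under /tmp (trash location is made up).
mkdir -p /tmp/mvdemo/testdir/dir1 /tmp/mvdemo/testdir/dir2 \
         /tmp/mvdemo/testdir/dir3 /tmp/mvdemo/testdir/dir4 /tmp/mvdemo/trash

# The fix: give find the parent directory itself, not a shell-expanded
# glob of its subdirectories, so find's starting point always exists.
# ls -di stands in for the BSD-specific `stat -s | sed` inode extraction.
ls -di /tmp/mvdemo/testdir/* | awk '{print $1}' |
  xargs -I abc find /tmp/mvdemo/testdir -maxdepth 1 -inum abc |
  xargs -I {} mv {} /tmp/mvdemo/trash

rm -r /tmp/mvdemo
```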
I have a directory like this:
A/B/C/D/data. Inside it are folders like 202012, 202013, etc.
Now, I want to find all folders starting with 2020 inside the data folder and then obtain the name of the one that was created most recently. So, I did this:
find /A/B/C/D/data/ -name "2020*" -type d. This gave me all folders starting with 2020. But when I pipe the output of this to ls -t | head -1 using the | operator, it simply returns the data folder. My expectation is that it should return the latest folder inside the data folder.
I am doing like this,
find /A/B/C/D/data/ -name "2020*" -type d | ls -t | head -1
How can I do this?
Thanks!
shopt -s globstar # Enable **
ls -dFt /A/B/C/D/data/**/2020*
Note that this would also list files starting with 2020, not only directories. For this reason, I used the -F flag. This appends a / to each directory, so you can distinguish files and directories more easily. If you are sure that your directory entries don't contain newline characters or slashes, you can pipe the output to | grep '/$' and get only the directories.
If you need this for a quick inspection in an interactive shell, I would use ls -dFtr ... to get them sorted in reverse order. This makes sure that the ones you are interested in show up at the end of the list.
You need to run the output of find through xargs to give it as command line arguments to ls:
find /A/B/C/D/data/ -name "2020*" -type d | xargs ls -t -d | head -1
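A quick sanity check of that pipeline in a throwaway directory (the layout mimics the question's data folder; touch -t sets distinct modification times so ls -t has something to sort):

```shell
# Throwaway layout mimicking the question's data directory.
mkdir -p /tmp/data/202012 /tmp/data/202013 /tmp/data/other
touch -t 202001010000 /tmp/data/202012
touch -t 202006010000 /tmp/data/202013

# Newest matching directory first; -d keeps ls from listing contents.
find /tmp/data -name "2020*" -type d | xargs ls -t -d | head -1
# prints /tmp/data/202013

rm -r /tmp/data
```

Note that this still breaks on paths containing whitespace; find -print0 | xargs -0 fixes that where supported.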
Rather new to command line, so bear with me.
I'm supposed to be finding directories in /usr/local that end with a number. I've managed to list just the directories with:
ls -d */
but when I try using anything with via piping:
find -name
grep
look
there's no output shown. I've even tried just using the '*' wildcard for searching, but nothing shows up.
Any ideas where I'm going wrong?
The find command should be able to do what you want, and from the looks of it you have it just about right:
find / -type d -name <directory_name>
That will look for any directory with the name you specify, starting from the root directory. If you ran the command as shown above, I think the flaw was that you were not specifying the directory to start your search from. You can consult the man page if you need any other parameters:
http://unixhelp.ed.ac.uk/CGI/man-cgi?find
find /usr/local -type d -name '*[0-9]'
This does it all in one; looks under /usr/local/ for directories where the name ends with a digit (and implicitly prints the result).
Your code using ls might need to look like:
cd /usr/local || exit 1
ls -d */ | grep '[0-9]/$'
This will list directories with a slash at the end of the name, so you need to search for the names where there's a digit followed by the slash and the end of the name. One difference between this and the find command is that ls only lists directories immediately in /usr/local whereas find will search down directory hierarchies. If you don't want find to search down the hierarchy, say so:
find /usr/local -maxdepth 1 -type d -name '*[0-9]'
(If you place -maxdepth 1 at the end, some versions of find get snotty about it and complain.)
find /path/to/some/dir -maxdepth 1 -type d -name '*[0-9]'
ls -l /path/to/some/dir | grep "^d" | awk '{print $9}' | grep '[0-9]$'
for file in /path/to/some/dir/*[0-9]; do
    if [[ -d $file ]]; then
        echo "$file"
    fi
done
How to write a single-line command-line invocation that counts the total number of files in the directories /usr/bin, /bin and /usr/doc?
So far, what I can think of is to use
cd /usr/bin&&ls -l | wc -l
but I don't know how to add them together, something like:
(cd /usr/bin&&ls -l | wc -l) + (cd /bin&&ls -l | wc -l)
Maybe there is a better way to do it, like get all the stdout of each directory, then pipe to wc -l
Any idea?
How about using the find command + wc -l?
find /usr/bin /bin /usr/doc -type f |wc -l
Using ls on multiple directories in conjunction with wc is a little more succinct:
ls /usr/bin /bin /usr/doc | wc -l
Note that with multiple directory arguments, ls prints a header line and a blank line for each directory, so the count is slightly inflated.
Assuming bash or similarly capable shell, you can use an array:
files=(/usr/bin/* /bin/* /usr/doc/*)
num=${#files[@]}
This technique will correctly handle filenames that contain newlines.
As Kent points out, find may be preferred as it will ignore directory entries. Tweak it if you want symbolic links.
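If arrays are not available (plain POSIX sh), the same counting trick works with the positional parameters; the paths below are just a stand-in layout. One caveat applies to both forms: a glob that matches nothing stays literal and counts as 1, unless you enable nullglob (bash) or its equivalent.

```shell
# POSIX variant of the array count: load the glob matches into "$@"
# and read $#. Example layout; adjust the globs to your directories.
mkdir -p /tmp/cntdemo/a /tmp/cntdemo/b
touch /tmp/cntdemo/a/f1 /tmp/cntdemo/a/f2 /tmp/cntdemo/b/f3

set -- /tmp/cntdemo/a/* /tmp/cntdemo/b/*
echo "$#"
# prints 3

rm -r /tmp/cntdemo
```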
A -maxdepth, if your find supports it, is needed unless you want to recurse into any unexpected directories therein. Also throwing away stderr in case a directory is not present for some odd reason.
find /usr/bin /bin /usr/doc -maxdepth 1 -type f 2>/dev/null | wc -l
I want to write a bash script which will use a list of all the directories containing specific files. I can use find to echo the path of each and every matching file. I only want to list the path to the directory containing at least one matching file.
For example, given the following directory structure:
dir1/
matches1
matches2
dir2/
no-match
The command (looking for 'matches*') will only output the path to dir1.
As extra background, I'm using this to find each directory which contains a Java .class file.
find . -name '*.class' -printf '%h\n' | sort -u
From man find:
-printf format
%h Leading directories of file’s name (all but the last element). If the file name contains no slashes (since it is in the current directory) the %h specifier expands to ".".
On OS X and FreeBSD, with a find that lacks the -printf option, this will work:
find . -name '*.class' -print0 | xargs -0 -n1 dirname | sort --unique
The -n1 in xargs limits each invocation of dirname to at most one argument taken from standard input.
GNU find
find /root_path -type f -iname "*.class" -printf "%h\n" | sort -u
OK, I come way too late, but you could also do it without find, to answer specifically the "matching file with Bash" part (or at least a POSIX shell).
ls */*.class | while read -r; do
echo "${REPLY%/*}"
done | sort -u
The ${VARNAME%/*} will strip everything after the last / (if you wanted to strip everything after the first, it would have been ${VARNAME%%/*}).
Regards.
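A tiny demonstration of the two expansions (the path is made up):

```shell
path=dir1/sub/matches1

# % removes the shortest matching suffix: strips the last /component.
echo "${path%/*}"     # prints dir1/sub

# %% removes the longest matching suffix: strips from the first slash.
echo "${path%%/*}"    # prints dir1
```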
find / -name '*.class' -printf '%h\n' | sort --unique
Far too late, but this might be helpful to future readers:
I personally find it more helpful to have the list of folders printed into a file, rather than to Terminal (on a Mac).
For that, you can simply output the paths to a file, e.g. folders.txt, by using:
find . -name '*.sql' -print0 | xargs -0 -n1 dirname | sort --unique > folders.txt
How about this?
find dirs/ -name '*.class' -exec dirname '{}' \; | awk '!seen[$0]++'
For the awk command, see #43 on this list
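For reference, the awk idiom keeps the first occurrence of each line while preserving input order, unlike sort -u, which reorders. A quick check with made-up paths:

```shell
# !seen[$0]++ is true only the first time a line is seen, so awk
# prints each unique line once, in the original order.
printf '%s\n' dirs/b dirs/a dirs/b dirs/a | awk '!seen[$0]++'
# prints:
# dirs/b
# dirs/a
```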