Counting directories on UNIX - bash

I need a bash script that can count the directories that are inside other directories on FreeBSD.
The case is like this:
The path is home/myuser/direct. Inside this directory there are about 20 directories named with a single letter, like A, B, C, D, E, F and so on. Inside every letter directory (A, B, ...) there are many other directories with different names, such as mydirectory1, mydirectory2 and so on. Inside mydirectory1 there are different files and directories, and I need to count only the directories under mydirectory1, not the files. I came up with this, but using it I have to run the command manually for each letter directory:
home/myuser/direct# ls -l A/* | grep ^d | wc -l
Then for the B directory I would have to run:
home/myuser/direct# ls -l B/* | grep ^d | wc -l
and so on. Is there a way to do this automatically, i.e. to go through the letters A, B, C and so on without editing the command each time?
P.S. Sorry about the confusion, as English is not my first language :(

This solution assumes you want the number of subfolders for each folder in the current directory. If you want to sum them all up into one value, that is a different question... It is not incredibly robust to variations in folder names, but it should work for most cases, as long as the names don't contain strange punctuation:
for D in */; do echo "$D": $(ls -d "$D"*/| wc -l); done
Example output:
DATA/: 14
LOGS/: 2
PLOTS/: 3
SCRIPTS/: 2
ls: libraries//*/: No such file or directory
libraries/: 0
Here is a version which suppresses the error for empty folders:
for D in */; do echo "$D": $(ls -d "$D"*/ 2>/dev/null |wc -l); done

Try
find * -type d -print | wc -l
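If you want one count per letter directory, matching the layout from the question, a small loop can reproduce the manual ls -l A/* | grep ^d | wc -l counts automatically. This is only a sketch: it assumes the path from the question and a find that supports -mindepth/-maxdepth (FreeBSD's does).
#!/bin/sh
# Sketch: for each one-letter directory, count the directories two levels down,
# i.e. the same thing "ls -l A/* | grep ^d | wc -l" counts for A.
cd /home/myuser/direct || exit 1   # path assumed from the question
for letter in */
do
count=$(find "$letter" -mindepth 2 -maxdepth 2 -type d | wc -l)
echo "$letter $count"
done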


Bash recursive concatenation over multiple folders per folder

I have a system that creates a folder per day, with a txt file generated every 10 minutes.
I need to write a bash script that runs from the start folder, goes over each day folder, merges all the txt files into one file per day, and writes that file into a destination folder.
The last solution I had was something like this:
for i in $dirm;
do
ls -1U | find . -name "*.txt" | xargs cat *.txt > all
cut -c 1-80 $i/all > $i/${i##*/}
.....
done
For some reason I can't get the loop right to go through each folder: this finds all the .txt files, but not per folder. The cut is there because I only need the first 80 characters of each line.
It's probably a really easy problem, but I can't get my head around it.
I assume $dirm is the directory list; in that case you should run find on $i and not on the current directory (.):
for i in $dirm;
do
ls -1U | find $i -name "*.txt" | xargs cat *.txt > all
cut -c 1-80 $i/all > $i/${i##*/}
.....
done
I think you're trying to combine the output of ls and find. To do that, piping one command into the other does not work. Instead, run them together in a subshell:
(ls; find) | xargs...
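For the per-day merge itself, find alone can feed cat, so ls is not needed at all. Here is a rough sketch, assuming $dirm holds the day folders and that the merged files should go into a separate destination directory (the destination path and the .txt suffix below are made up for illustration):
#!/bin/bash
# Sketch: merge each day folder's *.txt files, keep the first 80 characters
# of every line, and write one output file per day into $dest.
dest=/path/to/destination   # assumed destination folder
for i in $dirm
do
find "$i" -maxdepth 1 -name '*.txt' -exec cat {} + | cut -c 1-80 > "$dest/${i##*/}.txt"
done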

Get the latest created directory from a filepath

I am trying to find the latest directory created in a given filepath.
ls -t sorts the contents by the timestamp of each file or directory, but I need only the directories.
You can use the fact that directories have a d at the beginning of their ls -l line.
Hence, you can do:
ls -lt /your/dir | grep ^d
This way, the most recently created directory will appear at the top. If you want it the other way round, with the oldest at the top and the newest at the bottom, use -r:
ls -ltr /your/dir | grep ^d
*/ matches directories.
So you could use the following command to get the most recent directory:
ls -td /path/to/dir/*/ | head -1
BUT, I would not recommend this because parsing the output of ls is unsafe.
Instead, you should create a loop and compare timestamps:
dirs=( /path/to/dir/*/ )
newest=${dirs[0]}
for d in "${dirs[@]}"
do
if [[ $d -nt $newest ]]
then
newest=$d
fi
done
echo "Most recent directory is: $newest"

Listing only directories using ls in Bash? [closed]

This command lists directories in the current path:
ls -d */
What exactly does the pattern */ do?
And how can we give the absolute path in the above command (e.g. ls -d /home/alice/Documents) for listing only directories in that path?
*/ is a pattern that matches all of the subdirectories in the current directory (* would match all files and subdirectories; the / restricts it to directories). Similarly, to list all subdirectories under /home/alice/Documents, use ls -d /home/alice/Documents/*/
Four ways to get this done, each with a different output format
1. Using echo
Example: echo */, echo */*/
Here is what I got:
cs/ draft/ files/ hacks/ masters/ static/
cs/code/ files/images/ static/images/ static/stylesheets/
2. Using ls only
Example: ls -d */
Here is exactly what I got:
cs/ files/ masters/
draft/ hacks/ static/
Or as list (with detail info): ls -dl */
3. Using ls and grep
Example: ls -l | grep "^d"
Here is what I got:
drwxr-xr-x 24 h staff 816 Jun 8 10:55 cs
drwxr-xr-x 6 h staff 204 Jun 8 10:55 draft
drwxr-xr-x 9 h staff 306 Jun 8 10:55 files
drwxr-xr-x 2 h staff 68 Jun 9 13:19 hacks
drwxr-xr-x 6 h staff 204 Jun 8 10:55 masters
drwxr-xr-x 4 h staff 136 Jun 8 10:55 static
4. Bash script (not recommended for filenames containing spaces)
Example: for i in $(ls -d */); do echo ${i%%/}; done
Here is what I got:
cs
draft
files
hacks
masters
static
If you'd like to keep '/' as the ending character, the command is: for i in $(ls -d */); do echo ${i}; done
cs/
draft/
files/
hacks/
masters/
static/
I use:
ls -d */ | cut -f1 -d'/'
This creates a single column without a trailing slash - useful in scripts.
To list only the top-level directories (without descending into subfolders):
find /home/alice/Documents -maxdepth 1 -type d
To list all directories recursively, including subfolders:
find /home/alice/Documents -type d
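Note that the first find command also prints /home/alice/Documents itself. If you only want its subdirectories, and perhaps just their bare names, something like the following should work (a sketch; -printf is a GNU find extension, so a portable basename variant is shown as well):
# Subdirectories only; -mindepth 1 excludes the starting directory itself
find /home/alice/Documents -mindepth 1 -maxdepth 1 -type d
# Bare names, with GNU find
find /home/alice/Documents -mindepth 1 -maxdepth 1 -type d -printf '%f\n'
# Bare names, portably
find /home/alice/Documents -mindepth 1 -maxdepth 1 -type d -exec basename {} \;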
Four (more) Reliable Options.
An unquoted asterisk * will be interpreted as a pattern (glob) by the shell. The shell will use it in pathname expansion. It will then generate a list of filenames that match the pattern.
A simple asterisk will match all filenames in the PWD (present working directory). A more complex pattern such as */ will match all filenames that end in /, that is, all directories. That is why the command:
1.- echo
echo */
echo ./*/ ### Avoid misinterpreting filenames like "-e dir"
will be expanded (by the shell) to echo all directories in the PWD.
To test this, create a directory (mkdir) named something like test-dir, and cd into it:
mkdir test-dir; cd test-dir
Create some directories:
mkdir {cs,files,masters,draft,static} # Safe directories.
mkdir {*,-,--,-v\ var,-h,-n,dir\ with\ spaces} # And some that are a bit less safe.
touch -- 'file with spaces' '-a' '-l' 'filename' # And some files.
The command echo ./*/ will remain reliable even with odd named files:
./--/ ./-/ ./*/ ./cs/ ./dir with spaces/ ./draft/ ./files/ ./-h/
./masters/ ./-n/ ./static/ ./-v var/
But the spaces in filenames make reading a bit confusing.
If instead of echo we use ls, the shell is still what expands the list of filenames; the shell is the reason we get a list of directories in the PWD at all. The -d option to ls makes it list the directory entry itself instead of the contents of each directory (as it would by default).
ls -d */
However, this command is (somewhat) less reliable: it will fail with some of the odd names listed above, choking on several of them, and you would have to remove them one by one to find out which ones cause the problems.
2.- ls
GNU ls will accept the "end of options" (--) marker.
ls -d ./*/ ### More reliable BSD ls
ls -d -- */ ### More reliable GNU ls
3.- printf
To list each directory in its own line (in one column, similar to ls -1), use:
$ printf "%s\n" */ ### Correct even with "-", spaces or newlines.
And, even better, we could remove the trailing /:
$ set -- */; printf "%s\n" "${@%/}" ### Correct with spaces and newlines.
An attempt like
$ for i in $(ls -d */); do echo ${i%%/}; done
will fail in several ways:
It breaks on some of the names shown above, just as ls -d */ does.
It is affected by the value of IFS.
It splits names on spaces and tabs (with the default IFS).
Each newline in a name starts a new argument to echo.
4.- Function
Finally, using the argument list inside a function will not affect the arguments list of the present running shell. Simply
$ listdirs(){ set -- */; printf "%s\n" "${@%/}"; }
$ listdirs
presents this list:
--
-
*
cs
dir with spaces
draft
files
-h
masters
-n
static
-v var
These options are safe with several types of odd filenames.
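One more option in the same odd-filename-safe spirit (my addition, not part of the answer above): if the directory names need to be handed to another program, a NUL-delimited find pipeline sidesteps word splitting entirely. -print0 and xargs -0 exist in both the GNU and BSD userlands, but check your system:
# Long-list only the directories in the current directory, safely
find . -mindepth 1 -maxdepth 1 -type d -print0 | xargs -0 ls -ld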
The tree command is also pretty useful here. By default it will show all files and directories to a complete depth, with some ASCII characters showing the directory tree.
$ tree
.
├── config.dat
├── data
│   ├── data1.bin
│   ├── data2.inf
│   └── sql
│       └── data3.sql
├── images
│   ├── background.jpg
│   ├── icon.gif
│   └── logo.jpg
├── program.exe
└── readme.txt
But if we want just the directories, without the ASCII tree, and with the full path from the current directory, we can do:
$ tree -dfi
.
./data
./data/sql
./images
The arguments being:
-d List directories only.
-f Prints the full path prefix for each file.
-i Makes tree not print the indentation lines, useful when used in conjunction with the -f option.
And if you then want the absolute path, you could start by specifying the full path to the current directory:
$ tree -dfi "$(pwd)"
/home/alice/Documents
/home/alice/Documents/data
/home/alice/Documents/data/sql
/home/alice/Documents/images
And to limit the depth, you can set the maximum level of subdirectories with -L level, e.g.:
$ tree -dfi -L 1 "$(pwd)"
/home/alice/Documents
/home/alice/Documents/data
/home/alice/Documents/images
More arguments can be seen with man tree.
In case you're wondering why output from 'ls -d */' gives you two trailing slashes, like:
[prompt]$ ls -d */
app// cgi-bin// lib// pub//
it's probably because somewhere your shell or session configuration files alias the ls command to a version of ls that includes the -F flag. That flag appends a character to each output name (that's not a plain file) indicating the kind of thing it is. So one slash is from matching the pattern '*/', and the other slash is the appended type indicator.
To get rid of this issue, you could of course define a different alias for ls. However, to temporarily bypass the alias, you can prefix the command with a backslash:
\ls -d */
Actual ls solution, including symlinks to directories
Many answers here don't actually use ls (or only use it in the trivial sense of ls -d, while using wildcards for the actual subdirectory matching). A true ls solution is useful, since it allows the use of ls options for sorting order, etc.
Excluding symlinks
One solution using ls has been given, but it does something different from the other solutions in that it excludes symlinks to directories:
ls -l | grep '^d'
(possibly piping through sed or awk to isolate the file names)
Including symlinks
In the (probably more common) case that symlinks to directories should be included, we can use the -p option of ls, which makes it append a slash character to names of directories (including symlinked ones):
ls -1p | grep '/$'
or, getting rid of the trailing slashes:
ls -1p | grep '/$' | sed 's/\/$//'
We can add options to ls as needed (if a long listing is used, the -1 is no longer required).
Note: if we want trailing slashes, but don't want them highlighted by grep, we can hackishly remove the highlighting by making the actual matched portion of the line empty:
ls -1p | grep -P '(?=/$)'
For a plain list of the current directory, it'd be:
ls -1d */
If you want it sorted and clean:
ls -1d */ | cut -c 1- | rev | cut -c 2- | rev | sort
Remember: uppercase and lowercase letters behave differently in the sort.
I just add this to my .bashrc file (you could also just type it on the command line if you only need/want it for one session):
alias lsd='ls -ld */'
Then lsd will produce the desired result.
Here is what I am using
ls -d1 /Directory/Path/*/
If hidden directories don't need to be listed, I offer:
ls -l | grep "^d" | awk -F" " '{print $9}'
And if hidden directories do need to be listed, use:
ls -Al | grep "^d" | awk -F" " '{print $9}'
Or
find -maxdepth 1 -type d | awk -F"./" '{print $2}'
For listing only directories:
ls -l | grep ^d
For listing only files:
ls -l | grep -v ^d
Or also you can do as:
ls -ld */
Try this one. It works on every Linux distribution.
ls -ltr | grep drw
ls and awk (without grep)
There is no need to use grep, since awk can do the regular-expression check itself, so it is enough to do this:
ls -l | awk '/^d/ {print $9}'
where ls -l lists files with their permissions,
awk filters the output,
/^d/ is a regular expression that matches only lines starting with the letter d (directories), looking at the first column, the permissions,
{print} would print all columns,
{print $9} prints only the 9th column (the name) from the ls -l output.
Very simple and clean
To list folders without the trailing /:
ls -d */|sed 's|[/]||g'
I found this solution the most comfortable, I add to the list:
find * -maxdepth 0 -type d
The difference is that it has no ./ at the beginning, and the folder names are ready to use.
Test whether the item is a directory with test -d:
for i in $(ls); do test -d "$i" && echo "$i"; done
FYI, if you want to print all the files one per line, you can use ls -1, which will print each file on a separate line.
file1
file2
file3
*/ is a filename matching pattern that matches directories in the current directory.
To list directories only, I like this function:
# Long list only directories
llod () {
ls -l --color=always "$@" | grep --color=never '^d'
}
Put it in your .bashrc file.
Usage examples:
llod # Long listing of all directories in current directory
llod -tr # Same but in chronological order oldest first
llod -d a* # Limit to directories beginning with letter 'a'
llod -d .* # Limit to hidden directories
Note: it will break if you use the -i option. Here is a fix for that:
# Long list only directories
llod () {
ls -l --color=always "$@" | egrep --color=never '^d|^[[:digit:]]+ d'
}
file * | grep directory
Output (on my machine) --
[root@rhel6 ~]# file * | grep directory
mongo-example-master: directory
nostarch: directory
scriptzz: directory
splunk: directory
testdir: directory
The above output can be refined further by using cut:
file * | grep directory | cut -d':' -f1
mongo-example-master
nostarch
scriptzz
splunk
testdir
* could be replaced with any path that's permitted
file - determine file type
grep - searches for string named directory
-d - to specify a field delimiter
-f1 - denotes field 1
A one-liner to list only the directories from "here", with a file count for each:
for i in `ls -d */`; do g=`find ./$i -type f -print| wc -l`; echo "Directory $i contains $g files."; done
Using Perl:
ls | perl -nle 'print if -d;'
I partially solved it with:
cd "/path/to/pricipal/folder"
for i in $(ls -d .*/); do sudo ln -s "$PWD"/${i%%/} /home/inukaze/${i%%/}; done
 
ln: «/home/inukaze/./.»: cannot overwrite directory
ln: «/home/inukaze/../..»: cannot overwrite directory
ln: accessing «/home/inukaze/.config»: too many levels of symbolic links
ln: accessing «/home/inukaze/.disruptive»: too many levels of symbolic links
ln: accessing «/home/inukaze/innovations»: too many levels of symbolic links
ln: accessing «/home/inukaze/sarl»: too many levels of symbolic links
ln: accessing «/home/inukaze/.e_old»: too many levels of symbolic links
ln: accessing «/home/inukaze/.gnome2_private»: too many levels of symbolic links
ln: accessing «/home/inukaze/.gvfs»: too many levels of symbolic links
ln: accessing «/home/inukaze/.kde»: too many levels of symbolic links
ln: accessing «/home/inukaze/.local»: too many levels of symbolic links
ln: accessing «/home/inukaze/.xVideoServiceThief»: too many levels of symbolic links
Well, this solved the major part of it for me :)
Here is a variation using tree which outputs directory names only, one per line. Yes, it's ugly, but hey, it works.
tree -d | grep -E '^[├|└]' | cut -d ' ' -f2
or with awk
tree -d | grep -E '^[├|└]' | awk '{print $2}'
The following is probably better, however:
ls -l | awk '/^d/{print $9}'
If you have a space in your folder name, printing $9 alone won't work; try the command below:
ls -l yourfolder/alldata/ | grep '^d' | awk '{print $9" " $10}'
Output:
testing_Data
Folder 1
To answer the original question, */ has nothing to do with ls per se; it is done by the shell/Bash, in a process known as globbing.
This is why echo */ and ls -d */ output the same elements. (The -d flag makes ls output the directory names and not contents of the directories.)
Adding on to make it full circle: to retrieve the path of every folder, use a combination of Albert's answer and Gordan's. That should be pretty useful.
for i in $(ls -d /pathto/parent/folder/*/); do echo ${i%%/}; done
Output:
/pathto/parent/folder/childfolder1
/pathto/parent/folder/childfolder2
/pathto/parent/folder/childfolder3
/pathto/parent/folder/childfolder4
/pathto/parent/folder/childfolder5
/pathto/parent/folder/childfolder6
/pathto/parent/folder/childfolder7
/pathto/parent/folder/childfolder8
Here is what I use for listing only directory names:
ls -1d /some/folder/*/ | awk -F "/" "{print \$(NF-1)}"

How to get the total number of files in 3 directories?

How do I write a single-line command invocation that counts the total number of files in the directories /usr/bin, /bin and /usr/doc?
So far, what I can think of is to use
cd /usr/bin&&ls -l | wc -l
but I don't know how to add them together, something like:
(cd /usr/bin&&ls -l | wc -l) + (cd /bin&&ls -l | wc -l)
Maybe there is a better way to do it, like collecting the output for each directory and then piping it all to wc -l.
Any idea?
How about using the find command + wc -l?
find /usr/bin /bin /usr/doc -type f |wc -l
Using ls with multiple directories in conjunction with wc is a little more succinct:
ls /usr/bin /bin /usr/doc | wc -l
Assuming bash or similarly capable shell, you can use an array:
files=(/usr/bin/* /bin/* /usr/doc/*)
num=${#files[@]}
This technique will correctly handle filenames that contain newlines.
As Kent points out, find may be preferred as it will ignore directory entries. Tweak it if you want symbolic links.
A -maxdepth, if your find supports it, is needed unless you want to recurse into any unexpected directories therein. Also throw away stderr in case a directory is not present for some odd reason:
find /usr/bin /bin /usr/doc -maxdepth 1 -type f 2>/dev/null | wc -l
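If you also want to see how that total breaks down per directory, a small loop over the same find invocation will do it (just a sketch building on the command above):
for d in /usr/bin /bin /usr/doc
do
printf '%s: %s\n' "$d" "$(find "$d" -maxdepth 1 -type f 2>/dev/null | wc -l)"
done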

Counting the contents of a directory

I know how I would approach counting the number of files in a directory: I would use a for filename in * loop and then test the file names to fit my purpose. But I'm having trouble figuring out how to loop through a directory and count how many (sub)directories are in it.
Could anyone point me in the right direction?
You can test if it's a directory by using -d.
You can use find: find . -mindepth 1 -maxdepth 1 -type d
((n=0))
for fn in *
do
[[ -d "${fn}" ]] && ((n=n+1))   # count only the directories
done
echo "${n}"
Keep a counter and only increment it for directories...
What are you trying to do? Take a look at the wc command, specifically wc -l, which counts the number of lines in its input. You can use a whole array of commands that generate output and then pipe that to wc -l. Be careful of commands that add headers or footers to their output (like ls -l, which prints a "total" line).
Here are some examples:
This will count all files and directories that don't start with .:
$ ls | wc -l
It's the same as the for loop you had in your question.
This will count all files and directories, including hidden ones. Note the ls -A instead of ls -a: the first won't list . and .., while the second will:
$ ls -A | wc -l
This will count all files and directories in the entire directory tree
$ find . | wc -l
This will only count the directories in the whole directory tree
$ find . -type d| wc -l
This will count all the files in the whole directory tree
$ find . -type f | wc -l
This will limit you to the number of directories in the current directory
$ find . -mindepth 1 -maxdepth 1 -type d | wc -l
And, you can use this to assign it to a variable:
$ num_of_files=$(find . -type f | wc -l)
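And for the question as asked (counting subdirectories rather than files), the same pattern applies (just a sketch):
$ num_of_dirs=$(find . -mindepth 1 -maxdepth 1 -type d | wc -l); echo "$num_of_dirs"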
Here is how to count directories, or do stuff with the directory names.
#!/bin/bash
old_IFS=$IFS
IFS=$'\n'
array=($(ls -F /foo/bar/ | grep '/$')) # this creates an array named "array" that holds
IFS=$old_IFS # all the directory names located in /foo/bar/
echo ${#array[@]} # this will give you the number of directories in /foo/bar/
for ((i=0; i<${#array[@]}; i++))
do
echo ${array[$i]} # this will output a list of all the directories
done
Alternatively you could do:
ls -F /foo/bar/ | grep '/$' | cat > directorynames.txt
and then count the number of lines. Or you could get rid of the cat and just put the above in a loop that counts up for every line.
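A glob-based variant of the same idea avoids ls and the temporary file entirely (just a sketch, counting the subdirectories of /foo/bar/ as in the snippet above):
shopt -s nullglob   # an empty match expands to nothing
dirs=(/foo/bar/*/)   # one array element per subdirectory
echo "${#dirs[@]}"   # the number of directories in /foo/bar/
for d in "${dirs[@]}"
do
echo "$d"   # list the directory names
done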
