I have a Ruby one-liner, ruby1.9 -ine '#some statement' src/**. I assumed that, like Perl, Ruby skips directories (well, that's how I remember it). But I get this error: -e:1:in `gets': Is a directory. Besides giving it a list of files, is there a quick way of getting round this?
I don't think it ever skipped directories; at least 1.8.6 does not. So I suppose the only quick way is to give it a list of files, or to manipulate ARGV, but then it would hardly be a proper one-liner anymore.
Something like this:
ruby -ne 'ARGV.delete_if{|s| File.ftype(s) == "directory"}; do_stuff_here' src/**
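A variant of the same ARGV trick that also works when the first argument happens to be a directory (the implicit -n loop opens the first file before the script body runs) is to do the filtering in a BEGIN block; a sketch, keeping the same placeholder statement:
ruby1.9 -ne 'BEGIN { ARGV.delete_if { |s| File.directory?(s) } }; do_stuff_here' src/**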
So yeah, giving it a proper file list seems to be the nicer solution.
If you want only files recursively, then find(1) will be your best bet. Note that piping the names straight into ruby -ne would make Ruby treat the names themselves as the input lines, so hand them over as arguments instead, e.g. with xargs:
find ./src -type f -print0 | xargs -0 ruby1.9 -ne '#some statement'
I believe your assumption is actually about your shell's handling of **; the shell interprets unescaped metacharacters in commands.
By default, bash(1) will not expand ** recursively. You need to set the globstar option for this behavior:
$ ls -l /tmp
total 20
drwx------ 2 sarnold sarnold 4096 2011-11-17 15:43 keyring-9mdW7p
drwx------ 2 gdm gdm 4096 2011-11-17 15:43 orbit-gdm
drwx------ 2 sarnold sarnold 4096 2011-11-17 15:44 orbit-sarnold
drwx------ 2 sarnold sarnold 4096 2011-11-17 15:46 plugtmp
drwx------ 2 sarnold sarnold 4096 2011-11-17 15:43 ssh-ZriaCoWL2248
$ shopt -u globstar
$ echo /tmp/**
/tmp/keyring-9mdW7p /tmp/orbit-gdm /tmp/orbit-sarnold /tmp/plugtmp /tmp/ssh-ZriaCoWL2248
$ shopt -s globstar
$ echo /tmp/**
/tmp/ /tmp/keyring-9mdW7p /tmp/keyring-9mdW7p/control /tmp/orbit-gdm /tmp/orbit-sarnold /tmp/orbit-sarnold/linc-9a5-0-240e051029b41 /tmp/orbit-sarnold/linc-9ad-0-1b1412421b16c /tmp/plugtmp /tmp/ssh-ZriaCoWL2248 /tmp/ssh-ZriaCoWL2248/agent.2248
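Note that even with globstar set, src/** still expands to directories as well as files, so the original one-liner would still need the ARGV filter shown above, or a more specific glob; for example (assuming the files of interest end in .rb):
ruby1.9 -ne '#some statement' src/**/*.rb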
I would like to list all directories in a directory. Some of them have spaces in their names. There are also files in the target directory, which I would like to ignore.
Here is the output of ls -lah data/:
drwxr-xr-x 5 me staff 160B 24 Sep 11:30 Wrecsam - Wrexham
-rw-r--r-- 1 me staff 77M 24 Sep 11:31 Wrexham.csv
drwxr-xr-x 5 me staff 160B 24 Sep 11:32 Wychavon
-rw-r--r-- 1 me staff 84M 24 Sep 11:33 Wychavon.csv
I would like to iterate only over the "Wrecsam - Wrexham" and "Wychavon" directories.
This is what I've tried.
for d in "$(find data -maxdepth 1 -type d -print | sort -r)"; do
    echo $d
done
But this gives me output like this:
Wychavon
Wrecsam
-
Wrexham
I want output like this:
Wychavon
Wrecsam - Wrexham
What can I do?
Your for loop is not doing the right thing: the quoted command substitution hands the loop one big string, and the unquoted $d is then subject to word splitting. You can use a glob instead of having to invoke an external command in a subshell:
shopt -s nullglob # make glob expand to nothing if there are no matches
for dir in data/*/; do
    echo dir="$dir"
done
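If you only want the bare directory names (as in the desired output above), parameter expansion can strip the data/ prefix and the trailing slash; a minimal sketch:
shopt -s nullglob
for dir in data/*/; do
    dir=${dir%/}          # drop the trailing slash
    echo "${dir##*/}"     # drop the leading "data/", leaving just the name
done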
Related:
Looping over directories in Bash
Why you shouldn't parse the output of ls(1)
I am trying to output the number of directories in a given path on a SINGLE line. My desire is to output this:
X-many directories
Currently, with my bash script, I get this:
X-many
directories
Here's my code:
ARGUMENT=$1
ls -l $ARGUMENT | egrep -c '^drwx'; echo -n "directories"
How can I fix my output? Thanks
I suggest
echo "$(ls -l "$ARGUMENT" | egrep -c '^drwx') directories"
This relies on the shell stripping the final newline when it performs command substitution.
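The same idea works with printf, which avoids echo's portability quirks; the command substitution still strips the trailing newline:
printf '%s directories\n' "$(ls -l "$ARGUMENT" | egrep -c '^drwx')"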
Do not parse ls output to count directories; you can get wrong results if special characters have been used in file or directory names.
To count directories, use:
shopt -s nullglob
arr=( "$ARGUMENT"/*/ )
echo "${#arr[@]} directories"
The / at the end of the glob makes sure only directories under the "$ARGUMENT" path are matched.
shopt -s nullglob makes the array empty (instead of containing the literal pattern) when there is no directory in the given argument.
As an alternative solution (the -1 discounts /etc itself, which find also lists):
$ bc <<< "$(find /etc -maxdepth 1 -type d | wc -l)-1"
116
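A variation that avoids the subtraction (and the bc call) is to exclude the starting directory itself with -mindepth; like the original, it can still miscount names containing newlines:
find /etc -mindepth 1 -maxdepth 1 -type d | wc -l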
Another one:
$ count=0; while read curr_line; do count=$((count+1)); done < <(ls -l /etc | grep ^d); echo ${count}
116
It also works correctly with spaces in the folder names:
$ ls -la
total 20
drwxrwxr-x 5 alex alex 4096 Jun 30 18:40 .
drwxr-xr-x 11 alex alex 4096 Jun 30 16:41 ..
drwxrwxr-x 2 alex alex 4096 Jun 30 16:43 asdasd
drwxrwxr-x 2 alex alex 4096 Jun 30 16:43 dfgerte
drwxrwxr-x 2 alex alex 4096 Jun 30 16:43 somefoler with_space
$ count=0; while read curr_line; do count=$((count+1)); done < <(ls -l ./ | grep ^d); echo ${count}
3
I'm trying to do the following on OSX:
ls -lR --ignore *.app
So that I can recursively search through all folders except for .app folders.
However, it seems there are no --ignore or --hide options in Darwin.
Perhaps a script could recursively search one folder deep for a given set? I'm not sure I can pipe ls -lR through anything, because of the format of the output:
./ROOT/Applications/Some_app:
drwxr-xr-x 3 admin root 102 26 Jun 11:03 app-bundle.app #<- WANT THIS
drwxr-xr-x# 24 admin root 816 26 Jun 11:24 folder #<- WANT THIS
./ROOT/Applications/Some_app/app-bundle.app: #<- DON'T WANT
drwxr-xr-x 7 admin root 238 26 Jun 11:03 Contents #<- DON'T WANT
...
Use find; -ls prints every entry it visits (including the .app bundles themselves), and -prune stops find from descending into the directories that match *.app:
find . -ls -name '*.app' -prune
In bash, you can use extended globbing to exclude a pattern.
shopt -s extglob # this must be on its own line
echo !(*.app) # match everything except for the given pattern
If you have bash version 4 or higher, you can use globstar to do this recursively.
shopt -s globstar
shopt -s extglob
echo **/!(*.app)
An alternative is to pipe to grep and filter the names out, for example:
ls | grep -v '\.app$'
How can I write a bash script on Linux to determine which files in two directories have different permissions?
For example, I have two directories:
fold1 having two files:
1- file1 (-rw-rw-r--)
2- file2 (-rw-rw-r--)
fold2 having files with the same names but different permissions:
1- file1 (-rwxrwxr-x)
2- file2 (-rw-rw-r--)
I need a script to output the file names that have different permissions,
so the script will print only file1
I am currently checking the permissions manually by displaying the files with:
for i in `find .`; do ls -l $i; ls -l ../fold2/$i; done
Parsing the output of find . with for i in $(find .) is going to give you trouble for any filenames with spaces, newlines, or other perfectly normal characters:
$ touch "one file"
$ for i in `find .` ; do ls -l $i ; done
total 0
-rw-r--r-- 1 sarnold sarnold 0 2012-02-08 17:30 one file
ls: cannot access ./one: No such file or directory
ls: cannot access file: No such file or directory
$
Since permissions can also differ by owner or by group, I think you should include those as well. If you need to include the SELinux security label, the stat(1) program makes that easy to get as well via the %C directive:
for f in *; do
    stat -c "%a%g%u" "$f" "../scatman/${f}" |
        sort | uniq -c | grep -q '^\s*1' && echo "$f" is different
done
(Do whatever you want for the echo command...)
Example:
$ ls -l sarnold/ scatman/
sarnold/:
total 0
-r--r--r-- 1 sarnold sarnold 0 2012-02-08 18:00 funky file
-rw-r--r-- 1 sarnold sarnold 0 2012-02-08 18:01 second file
-rw-r--r-- 1 root root 0 2012-02-08 18:05 third file
scatman/:
total 0
-rw-r--r-- 1 sarnold sarnold 0 2012-02-08 17:30 funky file
-rw-r--r-- 1 sarnold sarnold 0 2012-02-08 18:01 second file
-rw-r--r-- 1 sarnold sarnold 0 2012-02-08 18:05 third file
$ cd sarnold/
$ for f in * ; do stat -c "%a%g%u" "$f" "../scatman/${f}" | sort | uniq -c | grep -q '^\s*1' && echo "$f" is different ; done
funky file is different
third file is different
$
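For the directory layout in the question, the same idea could be run from inside fold1 (using the fold1/fold2 names from above); a sketch, with the format string spaced out for readability:
cd fold1
for f in *; do
    # permission bits, group and owner for the file and its twin in fold2;
    # identical output collapses to one line under uniq -c, any difference leaves count-1 lines
    stat -c '%a %g %u' "$f" "../fold2/$f" | sort | uniq -c | grep -q '^ *1' && echo "$f"
done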
If the glob */ only matches directories, then logically the extglob !(*/) should match non-directories; but this doesn't work. Is this a bug or am I missing something? Does this work on any shell?
Test 1 to prove that */ works
$ cd /tmp; ls -ld */
drwxr-xr-x 2 seand users 4096 Jan 1 15:59 test1//
drwxr-xr-x 2 seand users 4096 Jan 1 15:59 test2//
drwxr-xr-x 2 seand users 4096 Jan 1 15:59 test3//
Test 2 to show potential bug with !(*/)
$ cd /tmp; shopt -s extglob; ls -ld !(*/)
/bin/ls: cannot access !(*/): No such file or directory
In Bash, !() (like *, ?, *(), and @()) only applies to one path component. Thus, !(anything containing a / slash) doesn't work.
If you switch to zsh, you can use *(^/) to match all non-directories, or *(.) to match all plain files.
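For example, with zsh's default options (ls -ld is only used here to show what matched):
ls -ld *(^/)   # everything in the current directory that is not a directory
ls -ld *(.)    # plain files only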
The answer to the specific question has already been given, and I am not sure whether you really wanted another solution or were just interested in analyzing the behavior, but one way to list all non-directories in the current folder is to use find (with -maxdepth placed before the tests, as GNU find expects):
find . -maxdepth 1 ! -type d