List *.h and *.cpp using bash

I am using bash version 4.2.28 on Fedora 16. I have the extglob option set. I am trying to list all files matching *.h or *.cpp using ls *(h|cpp) but the command returns the following:
[agnel#damien cadcore]$ ls *(h|cpp)
ls: cannot access *(h|cpp): No such file or directory
I have verified that there are indeed several .h and .cpp files in my current directory. Am I doing something wrong or could this be a bug in bash or ls?
Update: Thank you for your answers. Using *.h *.cpp does what I need. However, I would still like to know why extglob didn't work like I expected.

The extended glob *(pattern-list) matches zero or more occurrences of the given pattern list. It does not match an arbitrary string followed by one of the alternatives. You want:
$ ls *.@(h|cpp)
This matches something, followed by a period, followed by either "h" or "cpp".
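A quick sketch of the fixed pattern in action (the filenames below are made up). Note that extglob has to be enabled on a line *before* the one containing the pattern, because the pattern is rejected at parse time otherwise:

```shell
#!/usr/bin/env bash
# Demo of @( ) extended globbing -- "exactly one of the alternatives".
shopt -s extglob
cd "$(mktemp -d)"
touch main.cpp util.h notes.txt   # sample files (assumed names)
out=$(ls *.@(h|cpp))              # notes.txt is not matched
echo "$out"
```

This prints main.cpp and util.h but skips notes.txt, which is exactly what the original `*(h|cpp)` attempt could not do.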

I don't think you need complicated globbing in this case: simply try echo *.h *.cpp.

You should be able to do just ls *h *cpp


concatenating an element of array in a for loop

I have a script in which I am attempting to match strings in filenames on either side of a keyword. The keywords are meant for pattern matching with the wildcard character, as in:
ls *spin*.txt
This will of course match any of the following filenames:
one_spin.txt
4_5342spin-yyy.txt
fifty_spinout.txt
...etc.
What I would like to do is use the word 'spin' and a number of other words as match cases in an array that I can pass through a loop. I'd like these matches to be case-insensitive. I attempt this like so:
types=(spin wheel rotation)
for file in $types; do
ls '*'${file}'*.txt'
done
EDIT: Because I'm looking for a solution that is malleable, I'd also like to be able to do something like:
types=(spin wheel rotation)
for file in $types; do
find . -iname "*$file*.txt"
done
I'm not sure how bash interprets either of these except seeing that it does not list the desired files. Can someone clarify what is happening in my script, and offer a solution that meets the aforementioned criteria?
Your attempt will work with a few more tweaks. As you are assigning types as an array, you need to access it as an array.
Would you please try:
types=(spin wheel rotation)
for file in "${types[@]}"; do
ls *"${file}"*.txt
done
If your bash supports the shopt builtin, you can also say:
shopt -s extglob
ls *@(spin|wheel|rotation)*.txt
If you want to make it match in a case-insensitive way, please try:
shopt -s extglob nocaseglob
ls *@(spin|wheel|rotation)*.txt
which will match one_Spin.txt, fifty_SPINOUT.TXT, etc.
Hope this helps.
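A runnable sketch combining both ideas from this answer (the sample filenames are assumptions): iterate the array with `"${types[@]}"`, and let nocaseglob/nullglob handle case-insensitive, possibly-empty matches without extended globbing:

```shell
#!/usr/bin/env bash
# nullglob: a pattern with no matches expands to nothing (loop body skipped)
# nocaseglob: globs match filenames case-insensitively
shopt -s nullglob nocaseglob
cd "$(mktemp -d)"
touch one_Spin.txt big_wheel.txt unrelated.log   # sample files

types=(spin wheel rotation)
matches=()
for word in "${types[@]}"; do
    for f in *"$word"*.txt; do
        matches+=("$f")
    done
done
printf '%s\n' "${matches[@]}"
```

Here `*spin*.txt` picks up one_Spin.txt despite the capital S, and the `rotation` pattern silently matches nothing instead of being passed through literally.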
Don't make it complicated; please try this instead:
ls *{spin,wheel,rotation}*.txt
This is also helpful for creating files:
touch 1_{spin,wheel,rotation,ads,sad,zxc}_2.txt
Or directories:
mkdir -p {test,test2/log,test3}
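One caveat worth sketching (no files need to exist for this): brace expansion is purely textual and happens before globbing, so every alternative is generated regardless of what is on disk:

```shell
#!/usr/bin/env bash
# Brace expansion generates each alternative; since the result contains no
# unmatched glob characters, echo receives the literal names.
out=$(echo 1_{spin,wheel,rotation}_2.txt)
echo "$out"   # -> 1_spin_2.txt 1_wheel_2.txt 1_rotation_2.txt
```

This is why `touch` and `mkdir -p` work so well with braces, but also why `ls *{spin,wheel,rotation}*.txt` can complain about patterns with no matching files.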

Terminal - run 'file' (file type) for the whole directory

I'm a beginner in the terminal and bash language, so please be gentle and answer thoroughly. :)
I'm using Cygwin terminal.
I'm using the file command, which returns the file type, like:
$ file myfile1
myfile1: HTML document, ASCII text
Now, I have a directory called test, and I want to check the type of all files in it.
My endeavors:
I checked in the man page for file (man file), and I could see in the examples that you could type the names of all files after the command and it gives the types of all, like:
$ file myfile{1,2,3}
myfile1: HTML document, ASCII text
myfile2: gzip compressed data
myfile3: HTML document, ASCII text
But my files' names are random, so there's no specific pattern to follow.
I tried using the for loop, which I think is going to be the answer, but this didn't work:
$ for f in ls; do file $f; done
ls: cannot open `ls' (No such file or directory)
$ for f in ./; do file $f; done
./: directory
Any ideas?
Every Unix or Linux shell supports some kind of globbing. In your case, all you need is the * glob. This magic symbol represents all folders and files in the given path.
e.g., file directory/*
Shell will substitute the glob with all matching files and directories in the given path. The resulting command that will actually get executed might be something like:
file directory/foo directory/bar directory/baz
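A small sketch of that substitution (the directory and filenames are assumptions): capturing the glob in an array shows the exact argument list a command like file would receive.

```shell
#!/usr/bin/env bash
# The shell, not the command, expands the glob; the array below holds
# exactly the arguments that would be passed on.
cd "$(mktemp -d)"
mkdir directory
touch directory/foo directory/bar directory/baz
args=(directory/*)
printf '%s\n' "${args[@]}"   # sorted: bar, baz, foo
```

Note the results come back alphabetically sorted, which is why bar appears before foo.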
You can use a combination of the find and xargs command.
For example:
find /your/directory/ -print0 | xargs -0 file
HTH
file directory/*
Is probably the shortest simplest solution to fix your issue, but this is more of an answer as to why your loops weren't working.
for f in ls; do file $f; done
ls: cannot open `ls' (No such file or directory)
For this loop it is saying "for f in the directory or file 'ls'; do..." If you wanted it to execute the ls command then you would need to do something like this:
for f in `ls`; do file "$f"; done
But that wouldn't work correctly if any of the filenames contain whitespace. It is safer and more efficient to use the shell's builtin "globbing" like this
for f in *; do file "$f"; done
For this one there's an easy fix.
for f in ./; do file $f; done
./: directory
Currently, you're asking it to run the file command on the directory "./".
Change it to ./*, meaning everything within the current directory (which is the same thing as just *):
for f in ./*; do file "$f"; done
Remember, double quote variables to prevent globbing and word splitting.
https://github.com/koalaman/shellcheck/wiki/SC2086
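A tiny sketch of what SC2086 is warning about (the filename here is made up): an unquoted variable undergoes word splitting, so one filename containing a space becomes two separate arguments.

```shell
#!/usr/bin/env bash
# Word splitting on the unquoted expansion turns one name into two words.
f="two words.txt"
unquoted=$(printf '[%s]' $f)     # -> [two][words.txt]  (two arguments)
quoted=$(printf '[%s]' "$f")     # -> [two words.txt]   (one argument)
echo "$unquoted vs $quoted"
```

With `file $f`, the unquoted form would make file look for two nonexistent files named "two" and "words.txt".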

Looping through files of specified extensions in bash

I am trying to loop through files of a list of specified extensions with a bash script. I tried the solution given at Matching files with various extensions using for loop but it does not work as expected. The solution given was:
for file in "${arg}"/*.{txt,h,py}; do
Here is my version of it:
for f in "${arg}"/*.{epub,mobi,chm,rtf,lit,djvu}
do
echo "$f"
done
When I run this in a directory with an epub file in it, I get:
/*.epub
/*.mobi
/*.chm
/*.rtf
/*.lit
/*.djvu
So I tried changing the for statement:
for f in "${arg}"*.{epub,mobi,chm,rtf,lit,djvu}
Then I got:
089281098X.epub
*.mobi
*.chm
*.rtf
*.lit
*.djvu
I also get the same result with:
for f in *.{epub,mobi,chm,rtf,lit,djvu}
So it seems that the "${arg}" argument is unnecessary.
Although either of these statements finds files of the specified extensions and can pass them to a program, I get read errors from the unresolved *. filenames.
I am running this on OS X Mountain Lion. I was aware that the default bash shell was outdated so I upgraded it from 3.2.48 to 4.2.45 using homebrew to see if this was the problem. That didn't help so I am wondering why I am getting these unexpected results. Is the given solution wrong or is the OS X bash shell somehow different from the *NIX version? Is there perhaps an alternate way to accomplish the same thing that might work better in the OS X bash shell?
This may be a BASH 4.2ism. It does not work in my BASH which is still 3.2. However, if you shopt -s extglob, you can use *(...) instead:
shopt -s extglob
for file in *.*(epub|mobi|chm|rtf|lit|djvu)
do
...
done
@David W.: shopt -s extglob; for f in *.*(epub|mobi|chm|rtf|lit|djvu) results in: 089281098X.epub
@kojiro: arg=.; shopt -s nullglob; for f in "${arg}"/*.{epub,mobi,chm,rtf,lit,djvu} results in: ./089281098X.epub
shopt -s nullglob; for f in "${arg}"*.{epub,mobi,chm,rtf,lit,djvu} results in: 089281098X.epub
So all of these variations work but I don't understand why. Can either of you explain what is going on with each variation and what ${arg} is doing? I would really like to understand this so I can increase my knowledge. Thanks for the help.
In mine:
for f in *.*(epub|mobi|chm|rtf|lit|djvu)
I didn't include ${arg} which expands to the value of $arg. The *(...) matches the pattern found in the parentheses which is one of any of the series of extensions. Thus, it matches *.epub.
Kojiro's:
arg=.
shopt -s nullglob
for f in "${arg}"/*.{epub,mobi,chm,rtf,lit,djvu}
Is including $arg and the slash in his matching. Thus, kojiro's results start with ./ because that's what they are asking for.
It's like the difference between:
echo *
and
echo ./*
By the way, you could do this with the other expressions too:
echo *.*(epub|mobi|chm|rtf|lit|djvu)
The shell is doing all of the expansion for you. It really has nothing to do with the for statement itself.
A glob has to expand to an existing, found name, or it is left alone with the asterisk intact. If you have an empty directory, *.foo will expand to *.foo. (Unless you use the nullglob Bash extension.)
The problem with your code is that you start with an arg, $arg, which is apparently empty or undefined. So your glob, ${arg}/*.epub expands to /*.epub because there are no files ending in ".epub" in the root directory. It's never looking in the current directory. For it to do that, you'd need to set arg=. first.
In your second example, the ${arg}*.epub does expand because $arg is empty, but the other files don't exist, so they continue not to expand as globs. As I hinted at before, one easy workaround would be to activate nullglob with shopt -s nullglob. This is bash-specific, but will cause *.foo to expand to an empty string if there is no matching file. For a strict POSIX solution, you would have to filter out unexpanded globs using [ -f "$f" ]. (Then again, if you wanted POSIX, you couldn't use brace expansion either.)
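A minimal sketch of that behavior in an empty directory (created just for the demo):

```shell
#!/usr/bin/env bash
# Without nullglob an unmatched glob is left literal; with nullglob it
# expands to nothing at all.
cd "$(mktemp -d)"
before=$(echo *.foo)
shopt -s nullglob
after=$(echo *.foo)
echo "before=$before after=$after"   # -> before=*.foo after=
```

This is exactly why the unmatched patterns showed up as literal `*.mobi`, `*.chm`, etc. in the question.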
To summarize, the best solutions are to use (most intuitive and elegant):
shopt -s extglob
for f in *.*(epub|mobi|chm|rtf|lit|djvu)
or, in keeping with the original solution given in the referenced thread (which was wrong as stated):
shopt -s nullglob
for f in "${arg}"*.{epub,mobi,chm,rtf,lit,djvu}
This should do it:
find ./ \( -name '*.epub' -o -name '*.mobi' -o -name '*.chm' -o -name '*.rtf' -o -name '*.lit' -o -name '*.djvu' \) |
while IFS= read -r file; do
    echo "$file"
done

How do I traverse through every file in a folder?

I have a folder called exam. This folder has 3 folders called math, physics and english. All of these folders have some sub folders and files in them. I want to traverse through every folder and print the path of every folder and file on another file called files. I've done this:
#!/bin/bash
LOC="/home/school/exam/*"
{
for f in $LOC
do
echo $f
done
} > "files"
The output I get is:
/home/school/exam/math
/home/school/exam/physics
/home/school/exam/english
I can't figure out how to make the code visit and do the same thing to the sub folders of exam. Any suggestions?
PS I'm just a beginner in shell scripting.
find /home/school/exam -print > files
With the globstar option, bash will recurse into subdirectories and match all filenames when you use two adjacent stars.
Use:
shopt -s globstar
for i in /home/school/exam/**
The reference here is man bash:
globstar
If set, the pattern ** used in a pathname expansion context
will match all files and zero or more directories and
subdirectories. If the pattern is followed by a /, only
directories and subdirectories match.
and info bash:
* Matches any string, including the null string. When the
globstar shell option is enabled, and * is used in a
pathname expansion context, two adjacent *s used as a
single pattern will match all files and zero or more
directories and subdirectories. If followed by a /, two
adjacent *s will match only directories and subdirecto‐
ries.
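A small sketch of that recursion (the directory layout below is an assumption standing in for the exam folder):

```shell
#!/usr/bin/env bash
# With globstar, ** in a pathname expansion recurses into subdirectories.
shopt -s globstar
cd "$(mktemp -d)"
mkdir -p exam/math/sub
touch exam/math/q1.txt exam/math/sub/q2.txt
out=$(printf '%s\n' exam/**)
echo "$out"
```

The expansion includes both the directories and the files nested inside them, which a single `*` would not reach.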
You can use the find command: it will list all the files, and then you can do something with them, using -exec or xargs for example.
You can also use the tree command, which is included in most *nix distributions. (Ubuntu is a notable exception, but it can be installed via apt-get.)
LOC="/home/school/exam/"
tree -if $LOC > files
How about this to list all your files recursively:
for i in *; do ls -l "$i"; done

Can I use shell wildcards to select filenames ranging across double-digit numbers (e.g., from foo_1.jpg to foo_54.jpg)?

I have a directory with image files foo_0.jpg to foo_99.jpg. I would like to copy files foo_0.jpg through foo_54.jpg.
Is this possible just using bash wildcards?
I am thinking something like cp foo_[0-54].jpg but I know this selects 0-5 and 4 (right?)
Also, if it is not possible (or efficient) with just wildcards what would be a better way to do this?
Thank you.
I assume you want to copy these files to another directory:
cp -t target_directory foo_{0..54}.jpg
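For anyone unfamiliar with `{0..54}`, a quick sketch of the sequence expansion it relies on (truncated to 0..3 here for brevity): the names are generated textually, whether or not the files exist, so cp will complain about any gaps in the numbering.

```shell
#!/usr/bin/env bash
# {m..n} is a brace sequence expansion; no globbing is involved.
out=$(echo foo_{0..3}.jpg)
echo "$out"   # -> foo_0.jpg foo_1.jpg foo_2.jpg foo_3.jpg
```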
I like glenn jackman's answer, but if you really want to use globbing, the following might also work for you:
$ shopt -s extglob
$ cp foo_+([0-9]).jpg $targetDir
In extended globbing +() matches one or more instances of whatever expression is in the parentheses.
Now, this will copy ALL files that are named foo_ followed by any number, followed by .jpg. This will include foo_55.jpg, foo_139.jpg, and foo_1223218213123981237987.jpg.
On second thought, glenn jackman has the better answer. But, it did give me a chance to talk about extended globbing.
ls foo_[0-9].jpg foo_[1-4][0-9].jpg foo_5[0-4].jpg
Try it with ls and if that looks good to you then do the copy.
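A quick way to convince yourself the three patterns together cover exactly 0 through 54, using throwaway sample files:

```shell
#!/usr/bin/env bash
# foo_[0-9]       matches 0-9   (10 files)
# foo_[1-4][0-9]  matches 10-49 (40 files)
# foo_5[0-4]      matches 50-54 ( 5 files)
cd "$(mktemp -d)"
touch foo_{0..99}.jpg
count=$(ls foo_[0-9].jpg foo_[1-4][0-9].jpg foo_5[0-4].jpg | wc -l)
echo "$count"   # -> 55
```

Exactly 55 names, i.e. foo_0.jpg through foo_54.jpg, with no overlap between the three patterns.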
for i in `seq 0 54`; do cp foo_$i.jpg <target>; done
An extension of @grok12's answer above (requires extglob):
$ ls foo_@([0-9]|[0-4][0-9]|5[0-4]).jpg
Basically the pattern above will match:
anything with a single digit, OR
two digits where the first digit is between 0-4 and the second digit is between 0-9, OR
two digits where the first digit is 5 and the second digit is between 0-4
Alternatively you can achieve a similar result with the brace expansion below:
$ ls foo_{[0-9],[0-4][0-9],5[0-4]}.jpg
