bash script - Iterate list of files without extension in a directory - bash

I want to iterate all the files in a directory in a bash script.
List all files with extensions .LOG .txt .MAP .TL9*
List all files which have no extension.
I am trying this:
for file in *.{LOG,txt,MAP,TL9*}; do
I want to list only the files that end with one of the above extensions.
So I do not want to list a file like temp.txt.EXT, because it does not end with one of the given extensions. Similarly, I don't want temp.TL94.JPG or temp.TL9.JPG to be reported.
But in the above for loop, how do I insert the check which also gives me the files with no extension?
Please help.

Using extglob, you can do this:
shopt -s nullglob
shopt -s extglob
for file in @(!(*.*)|*.@(LOG|txt|MAP|TL9!(*.*))); do
echo "$file"
done

With extglob:
*.@(LOG|txt|MAP|TL9) !(*.*)
*.@(LOG|txt|MAP|TL9) matches all .LOG, .txt, .MAP, and .TL9 files
!(*.*) matches all files except the ones having a . in their name
Enable extglob first if not enabled:
shopt -s extglob
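Both patterns can go in the same loop; a minimal sketch (with nullglob added here so a pattern with no matches expands to nothing instead of itself):
shopt -s extglob nullglob
for file in *.@(LOG|txt|MAP|TL9) !(*.*); do
echo "$file"
done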

You could also use the find command to list files with the extension MAP, LOG, or TL9, or without any extension at all.
#!/bin/bash
files=$(find . -type f \( -regex ".*\.\(LOG\|MAP\|TL9\)" -o ! -name "*.*" \))
for file in $files
do
echo "$file"
done
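Iterating over the command substitution like this breaks on filenames containing spaces; a sketch of a more robust variant (same find expression, just piped into a while read loop) would be:
find . -type f \( -regex ".*\.\(LOG\|MAP\|TL9\)" -o ! -name "*.*" \) |
while IFS= read -r file; do
echo "$file"
done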

Related

Rename .txt files

Looking for help with the following code. I have a folder titled data, with 6 subfolders (folder1, folder2, etc.). Each folder has a text file I want to rename to "homeworknotes", keeping the .txt extension.
(I'll use "notes" for short below.)
So far I have the following code:
for file in data/*/*.txt; do
mv $file "notes"
done
find
You can use the find command with -execdir, which will execute the command of your choice in the directory where each file matching the pattern is found:
find data -type f -name '*.txt' -execdir mv \{\} notes.txt \;
data is the path to the directory where find should look for matching files recursively
-type f looks only for files, not directories
-name '*.txt' matches anything that ends with .txt
-execdir mv \{\} notes.txt runs the command mv {} notes.txt in the directory where the file was found; {} is the original filename found by find.
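If you want to see what would happen before touching anything, one option (purely illustrative) is to put echo in front of mv so the commands are printed instead of executed:
find data -type f -name '*.txt' -execdir echo mv \{\} notes.txt \;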
bash
EDIT1: To do this without find you need to handle recursive directory traversal yourself (unless you have a fixed directory layout). In bash you can set the following shell options with the shopt -s command:
extglob - extended globbing support (extended patterns such as !(...) and @(...); see "Pathname Expansion" in man bash)
globstar - allows ** in pathname expansion; **/ will match any directories and their subdirectories (see "Pathname Expansion" in man bash)
nullglob - makes patterns that match no files expand to nothing (in case there's a directory without any .txt file)
The following script will traverse the directories under data/ and rename the .txt files to notes.txt:
#!/bin/bash
shopt -s extglob globstar nullglob
for f in data/**/*.txt ; do
mv "$f" "$(dirname "$f")"/notes.txt
done
mv "$f" "$(dirname "$f")"/notes.txt moves (renames) the file; $f contains the matched path, e.g. data/folder1/day4notes.txt, and $(dirname "$f") gets the directory that file is in - in this case data/folder1 - so we just append /notes.txt to that.
EDIT2: If you are absolutely positive that you want to do this only in the first level of subdirectories under data/, you can omit extglob and globstar (and, if you know there is at least one .txt file in each directory, also nullglob) and go ahead with the pattern you posted; but you still need to use mv "$f" "$(dirname "$f")"/notes.txt to rename the file.
NOTE: When experimenting with things like this, always make a backup beforehand. If you have multiple .txt files in any of the directories, they will all get renamed to notes.txt, so you might lose data in that case.
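If that collision worries you, a small guard can be added to the loop above (a sketch, not part of the original answer) so an existing notes.txt is never overwritten:
#!/bin/bash
shopt -s globstar nullglob
for f in data/**/*.txt ; do
target="$(dirname "$f")/notes.txt"
# skip this file if a notes.txt already exists in its directory
[ -e "$target" ] && { echo "skipping $f: $target already exists" >&2; continue; }
mv "$f" "$target"
done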

find and delete empty files in directory and its subdirs without find

I am trying to make a bash script that finds and removes empty files in a directory including subdirectories, without using the find command.
This is part of the script using the find command but I am unsure how to convert this line without using find.
find . -type f -empty -delete
Try this code:
# enable recursive globstar matching
shopt -s globstar
# directory to delete files from
dir="/tmp"
# loop through files recursively
for f in "${dir}"/* "${dir}"/**/* ; do
# check that it is a regular file and that it is empty
if [ -f "$f" ] && [ ! -s "$f" ]; then
# remove file
rm "$f"
fi
done
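One difference from find . -type f -empty -delete: the globs above skip hidden files by default. If dot files should be removed too, dotglob can be enabled as well (a sketch under that assumption):
shopt -s globstar nullglob dotglob
dir="/tmp"
for f in "${dir}"/**/*; do
# only remove empty regular files, like find -type f -empty
if [ -f "$f" ] && [ ! -s "$f" ]; then
rm "$f"
fi
done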

bash script: extracting and converting files

I would like to write a bash script that extracts files from a .zip file and converts them into text files - pretty much adds ".txt" at the end of the file name as they are all text files but do not have an extension.
I am pretty much new to shell. I've found this:
cd /path/to/files
for i in *.gz
do
gunzip $i
done
for i in *.zip
do
unzip $i
done
I imagine it extracts the files, but how do I then rename/convert them?
You can use extglob to find all files that don't have a .txt extension:
shopt -s extglob
for f in !(*.txt); do
mv "$f" "$f".txt
done
PS: If you only want to touch the files that have no extension at all, you can use the !(*.*) pattern instead.
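Putting that together with the extraction loop from the question (the path and archive names are just the placeholders from the question):
cd /path/to/files
shopt -s extglob nullglob
for i in *.zip; do
unzip "$i"
done
# give every extension-less regular file a .txt suffix
for f in !(*.*); do
[ -f "$f" ] && mv "$f" "$f".txt
done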

Filenames with wildcards in variables

#!/bin/bash
outbound=/home/user/outbound/
putfile=DATA_FILE_PUT_*.CSV
cd $outbound
filecnt=0
for file in $putfile; do let filecnt=filecnt+1; done
echo "Filecount: " $filecnt
So this code works well when there are files located in the outbound directory. I can place files into the outbound path, and as long as they match the putfile mask, the count is incremented as expected.
The problem comes in if I run this while there are no files located in $outbound.
If there are zero files there, $filecnt still returns 1, but I'm looking to have it return 0 if there are no files there.
Am I missing something simple?
Put set -x just below the #! line to watch what your script is doing.
If there is no matching file, then the wildcard is left unexpanded, and the loop runs once, with file having the value DATA_FILE_PUT_*.CSV.
To change that, set the nullglob option. Note that this only works in bash, not in sh.
shopt -s nullglob
putfile=DATA_FILE_PUT_*.CSV
for file in $putfile; do let filecnt=filecnt+1; done
Note that the putfile variable contains the wildcard pattern, not the list of file names. It might make more sense to put the list of matches in a variable instead. This needs to be an array variable, and you need to change the current directory first. The number of matching files is then the length of the array.
#!/bin/bash
shopt -s nullglob
outbound=/home/user/outbound/
cd "$outbound"
putfiles=(DATA_FILE_PUT_*.CSV)
echo "Filecount: " ${#putfiles}
If you need to iterate over the files, take care to protect the expansion of the array with double quotes, otherwise if a file name contains whitespace then it will be split over several words (and if a filename contains wildcard characters, they will be expanded).
#!/bin/bash
shopt -s nullglob
outbound=/home/user/outbound/
cd "$outbound"
putfiles=(DATA_FILE_PUT_*.CSV)
for file in "${putfiles[#]}"; do
echo "Processing $file"
done
You could test if file exists first
for file in $putfile; do
if [ -f "$file" ] ; then
let filecnt=filecnt+1
fi
done
Or look for your files with find
for file in $(find . -type f -name "$putfile"); do
let filecnt=filecnt+1
done
or simply
filecnt=$(find . -type f -name "$putfile" | wc -l); echo $filecnt
This is because when no matches are found, bash by default expands the wildcard DATA_FILE_PUT_*.CSV to the word DATA_FILE_PUT_*.CSV and therefore you end up with a count of 1.
To disable this behavior, use shopt -s nullglob
Not sure why you need a piece of code here. The following one-liner should do the job.
ls ${outbound}/${putfile} | wc -l
Or
find ${outbound} -maxdepth 1 -type f -name "${putfile}" | wc -l

Bash scripting, loop through files in folder fails

I'm looping through certain files (all files starting with MOVIE) in a folder with this bash script code:
for i in MY-FOLDER/MOVIE*
do
which works fine when there are files in the folder. But when there aren't any, it somehow goes on with one file which it thinks is named MY-FOLDER/MOVIE*.
How can I keep it from executing the code after
do
if there aren't any matching files in the folder?
With the nullglob option.
$ shopt -s nullglob
$ for i in zzz* ; do echo "$i" ; done
$
for i in $(find MY-FOLDER -name 'MOVIE*' -type f); do
echo "$i"
done
The find utility is one of the Swiss Army knives of Linux. It starts at the directory you give it and finds all files in all subdirectories, according to the options you give it.
-type f will find only regular files (not directories).
As I wrote it, the command will find files in subdirectories as well; you can prevent that by adding -maxdepth 1
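For example, to stay in the top level of MY-FOLDER only (a sketch using the corrected pattern above):
find MY-FOLDER -maxdepth 1 -name 'MOVIE*' -type f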
Edit, 8 years later (thanks for the comment, @tadman!)
You can avoid the loop altogether with
find . -type f -exec echo "{}" \;
This tells find to echo the name of each file by substituting its name for {}. The escaped semicolon is necessary to terminate the command that's passed to -exec.
for file in MY-FOLDER/MOVIE*
do
# Skip if not a file
test -f "$file" || continue
# Now you know it's a file.
...
done
