Shell Script: Get newest file (of an extension) from a directory - bash

I want to get the newest file in a directory. I am using
BashFAQ/099 as a reference.
My Code:
#!/bin/bash
function newest_file() {
local file_ext="$2"
local files=("$1"/*."${file_ext}")
local newest="${files[@]:0:1}"
for f in "${files[@]}"; do
if [[ "$f" -nt "$newest" ]]; then
newest="$f"
fi
done
echo "$newest"
}
newest_file "${1:-$HOME/Downloads}" "${2:-*}"
Output:
➜ bash func_get_new_file.sh $HOME/reports txt
/home/ashwin/reports/test4.txt
➜ bash func_get_new_file.sh $HOME/reports
newest_file:2: no matches found: /home/ashwin/reports/*.*
I get an error when the file extension is *. I want to understand why *.* does not match anything.
I also want to understand the (...) syntax used for creating an array of files in a directory, i.e. files=("$1"/*.${file_ext}) in my code above.
As suggested by @Cyrus, I updated the code as per shellcheck.net.
On further debugging and reading other posts, I figured out that passing an asterisk as a parameter is troublesome. I am updating my code to check for any special characters being passed in $2, as below:
#!/bin/bash
newest_file() {
local file_ext="$2"
if [[ ! "${file_ext}" =~ ^[.a-zA-Z0-9]*$ ]]; then
echo "ERROR: file extension can have "." or alphanumberic characters. Don't provide any ext if all files should be checked"
return 1
fi
[[ -z "${file_ext}" ]] && local files=("$1"/*) || local files=("$1"/*.${file_ext})
local newest="${files[@]:0:1}"
for f in "${files[@]}"; do
if [[ $f -nt $newest ]]; then
newest="$f"
fi
done
echo "$newest"
}
newest_file "${1:-$HOME/Downloads}" "${2}"
Unless anyone can provide a simple solution to passing an asterisk, I will go with the above solution.
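For what it's worth, the ( ... ) in files=( ... ) simply collects the results of the glob into a bash array. Below is a minimal sketch (an illustration, not necessarily the accepted approach) of how the function could sidestep the asterisk problem by treating an empty or * extension as "all files" and enabling nullglob; the default directory is the one from the original script:

#!/bin/bash
newest_file() {
    local dir="$1" file_ext="$2" files newest f
    shopt -s nullglob                        # unmatched globs expand to nothing
    if [[ -z "$file_ext" || "$file_ext" == "*" ]]; then
        files=("$dir"/*)                     # no extension given: consider every entry
    else
        files=("$dir"/*."$file_ext")         # array of all matching paths
    fi
    shopt -u nullglob
    (( ${#files[@]} )) || return 1           # nothing matched
    newest="${files[0]}"
    for f in "${files[@]}"; do
        [[ "$f" -nt "$newest" ]] && newest="$f"
    done
    echo "$newest"
}
newest_file "${1:-$HOME/Downloads}" "$2"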

Related

bash if and then statements both run

I'm running this in bash and even though there is a .txt file it prints out "no new folders to create" in the terminal.
Am I missing something?
FILES=cluster-02/*/*
for f in $FILES
do
if [[ $f == *."txt" ]]
then
cat $f | xargs mkdir -p
else
echo "No new folders to create"
fi
done;
As mentioned in the first comment, the behaviour is indeed what you should expect from your script: you loop through all files, text files and others alike. When a file is a text file the if-branch runs, and when it is any other type of file the else-branch runs.
To solve this, you might decide not to take the other files into account (only handle text files), which you can do as follows:
FILES=cluster-02/*/*.txt
You're looping over multiple files, so the first result may trigger the if and the second can show the else.
You could save the wildcard result in an array, check if there's something in it, and loop if so:
shopt -s nullglob
FILES=( foo/* )
if (( ${#FILES[@]} )); then
for f in "${FILES[@]}"; do
if [[ $f == *."txt" ]]; then
echo $f
fi
done
else
echo "No new folders to create"
fi
#!/usr/bin/env bash
# Create an array containing a list of files
# This is safer to avoid issues with files having special characters such
# as spaces, glob-characters, or other characters that might be cumbersome
# Note: if no files are found, the array contains a single element with the
# string "cluster-02/*/*"
file_list=( cluster-02/*/* )
# loop over the content of the file list
# ensure to quote the list to avoid the same pitfalls as above
for _file in "${file_list[@]}"
do
[ "${_file%.txt}" == "${_file}" ] && continue # skip, not a txt
[ -f "${_file}" ] || continue # check if the file exists
[ -r "${_file}" ] || continue # check if the file is readable
[ -s "${_file}" ] || continue # check if the file is empty
< "${_file}" xargs mkdir -p -- # add -- to avoid issues with entries starting with -
_c=1
done;
[ "${_c}" ] || echo "No new folders to create"

Recursively list hidden files without ls, find or extendedglob

As an exercise I have set myself the task of recursively listing files using bash builtins. I particularly don't want to use ls or find and I would prefer not to use setopt extendedglob. The following appears to work but I cannot see how to extend it with /.* to list hidden files. Is there a simple workaround?
g() { for k in "$1"/*; do # loop through directory
[[ -f "$k" ]] && { echo "$k"; continue; }; # echo file path
[[ -d "$k" ]] && { [[ -L "$k" ]] && { echo "$k"; continue; }; # echo symlinks but don't follow
g "$k"; }; # start over with new directory
done; }; g "/Users/neville/Desktop" # original directory
Added later: sorry - I should have said: 'bash-3.2 on OS X'
Change
for k in "$1"/*; do
to
for k in "$1"/* "$1"/.[^.]* "$1"/..?*; do
The second glob matches all files whose names start with a dot followed by anything other than a dot, while the third matches all files whose names start with two dots followed by something. Between the two of them, they will match all hidden files other than the entries . and ...
Unfortunately, unless the shell option nullglob is set, those (like the first glob) could remain as-is if there are no files whose names match (extremely likely in the case of the third one) so it is necessary to verify that the name is actually a file.
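A minimal sketch of that check, assuming nullglob stays unset so unmatched patterns remain literal:

for k in "$1"/* "$1"/.[^.]* "$1"/..?*; do
    [[ -e "$k" || -L "$k" ]] || continue   # skip literal patterns left by unmatched globs
    # ... original loop body here ...
done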
An alternative would be to use the much simpler glob "$1"/.*, which will always match the . and .. directory entries, and will consequently always be substituted. In that case, it's necessary to remove the two entries from the list:
for k in "$1"/* "$1"/.*; do
if ! [[ $k =~ /\.\.?$ ]]; then
# ...
fi
done
(It is still possible for "$1"/* to remain in the list, though. So that doesn't help as much as it might.)
Set the GLOBIGNORE variable to exclude . and .., which implicitly turns on the dotglob shell option (as if by shopt -s dotglob). Then your original code works with no other changes.
user@host [/home/user/dir]
$ touch file
user@host [/home/user/dir]
$ touch .dotfile
user@host [/home/user/dir]
$ echo *
file
user@host [/home/user/dir]
$ GLOBIGNORE=".:.."
user@host [/home/user/dir]
$ echo *
.dotfile file
Note that this is bash-specific. In particular, it does not work in ksh.
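A minimal sketch of the same idea applied to the function from the question (the Desktop path is copied from there):

GLOBIGNORE=".:.."                # any non-null value excludes . and .. and enables dotglob
g() { for k in "$1"/*; do
    [[ -f "$k" ]] && { echo "$k"; continue; }
    [[ -d "$k" ]] && { [[ -L "$k" ]] && { echo "$k"; continue; }; g "$k"; }
  done; }
g "/Users/neville/Desktop"
unset GLOBIGNORE                 # restore default globbing afterwards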
You can specify multiple arguments to for:
for k in "$1"/* "$1"/.*; do
But if you do search for .* in directories, you should be aware that it also gives you the . and .. entries. You may also be given a nonexistent name if the "$1"/* glob matches nothing (the pattern is then left as a literal string), so I would check for that too.
With that in mind, this is how I would correct the loop:
g() {
local k subdir
for k in "$1"/* "$1"/.*; do # loop through directory
[[ -e "$k" ]] || continue # Skip missing files (unmatched globs)
subdir=${k##*/}
[[ "$subdir" = . ]] || [[ "$subdir" = .. ]] && continue # Skip the pseudo-directories "." and ".."
if [[ -f "$k" ]] || [[ -L "$k" ]]; then
printf %s\\n "$k" # Echo the paths of files and symlinks
elif [[ -d "$k" ]]; then
g "$k" # start over with new directory
fi
done
}
g ~neville/Desktop
Here the funky-looking ${k##*/} is just a fast way to take the basename of the file, while local was put in so that the variables don't modify any existing variables in the shell.
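For illustration (the path is made up), ${k##*/} removes the longest prefix matching */:

k="/Users/neville/Desktop/some.file"
echo "${k##*/}"   # -> some.file               (longest prefix matching */ removed)
echo "${k%/*}"    # -> /Users/neville/Desktop  (the complementary "dirname" expansion)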
One more thing I've changed is echo "$k" to printf %s\\n "$k", because echo is irredeemably flawed in its argument handling and should be avoided for the purpose of echoing an unknown variable. (See Rich's sh tricks for an explanation of how; it boils down to -n and -e throwing a spanner in the works.)
By the way, this will NOT print sockets or fifos - is that intentional?

Bash check if file with variable name inside loop exists

I would like to check if a file exists. Of course this is explained in many places. Now I am inside a loop like:
for ((l=0;l<5;l+=1));
do
if -a FILENAMEl #FILENAME contains l!!!!!!!!!
then "FILENAMEl exists"
else
do
.............
fi
done
Any ideas?
Thanks so much!!!
The main problem is that you are mixing the syntax of a variable name l with that of a file name. If you wish to use them together, to form part of a filename with a variable, you need a syntax break (caused by "$"), or use braces ({}).
If the file name has a variable in the middle, then braces work best. For example: "my_file_${l}_head.txt" would create files like my_file_1_head.txt, my_file_2_head.txt, etc.
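As a quick, made-up illustration of the brace form:

for l in 1 2 3; do
    touch "my_file_${l}_head.txt"   # braces keep the variable l separate from "_head"
done
# creates my_file_1_head.txt, my_file_2_head.txt, my_file_3_head.txt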
Here is your original example corrected:
for ((l=0;l<5;l+=1))
do
if test -a FILENAME$l
then echo "FILENAME$l exists"
else echo "FILENAME$l doesn't exist"
fi
done
However, I wouldn't write code this way.
I only took your example and changed it as little as possible to show you the essential difference.
Here's another way to write it, using a more DRY (Don't Repeat Yourself) approach:
for l in {1..5}; do
file="filename$l"
if [[ -a "$file" ]]; then
echo "$file exists"
else
echo "$file does not exist"
fi
done
If you want more minimalism, here's yet another approach:
for l in {1..5}; do f="filename$l"
[[ -a "$f" ]] && echo "$f exists" || echo "$f does not exist"
done
Now, if you need to do something other than just print out the status, using function calls to make the extra work modular works well:
for l in {1..5}; do f="filename$l"
[[ -a "$f" ]] && process_file "$f" || non_existant_file "$f"
done
Then, elsewhere, you should define both process_file and non_existant_file:
process_file() {
local file="$1"
# do whatever is needed for an existing file
}
non_existant_file() {
local file="$1"
# do whatever is needed for a non-existant file
}
Assume you're trying to find which files exist in the filename format file1.csv, file2.csv, etc...
for i in {1..5};
do f="file$i.csv";
if test -e $f;
then echo "$f exists";
else echo "$f does not exist";
fi
done
Perhaps what you need is simply a find
find . -name "file?.csv" -size +10k
You can restrict the file name to suffixes 1..5 and act on the find results (see find's -exec or, more generally, xargs as below).
find . -name "file[1-5].csv" -size +10c | xargs head -1

Bash: Pass alias or function as argument to program

Quite often I need to work on the newest file in a directory.
Normally I do:
ls -rt
and then open the last file in vim or less.
Now i wanted to produce an alias or function, like
lastline() {ls -rt | tail -n1}
# or
alias lastline=$(ls -rt | tail -n1)
Calling lastline outputs the newest file in the directory, which is nice.
But calling
less lastline
wants to open the file "lastline" which doesn't exist.
How do I make bash execute the function or alias, if possible without a lot of typing $() or ``?
Or is there any other way to achieve the same result?
Thanks for your help.
You're parsing ls, and this is very bad. Moreover, if the last modified “file” is a directory, you'll be lessing/viming a directory.
So you need a robust way to determine the last modified file in the current directory. You may use a helper function like the following (that you'll put in your .bashrc):
last_modified_regfile() {
# Finds the last modified regular file in current directory
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file
last_modified_regfile_ret=
for file in *; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
[[ $last_modified_regfile_ret ]]
}
Then you may define another function that will vim the last found file:
vimlastline() {
last_modified_regfile && vim -- "$last_modified_regfile_ret"
}
You may even have last_modified_regfile take optional arguments: the directories where it will find the last modified regular file:
last_modified_regfile() {
# Finds the last modified regular file in current directory
# or in directories given as arguments
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file dir
local save_shopt_nullglob=$(shopt -p nullglob)
shopt -s nullglob
(( $# )) || set .
last_modified_regfile_ret=
for dir; do
dir=${dir%/}
[[ -d $dir/ ]] || continue
for file in "$dir"/*; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
done
$save_shopt_nullglob
[[ $last_modified_regfile_ret ]]
}
Then you can even alter vimlastline accordingly:
vimlastline() {
last_modified_regfile "$#" && vim -- "$last_modified_regfile_ret"
}
Use command substitution like this:
lastline() { ls -rt | tail -n1; }
less "$(lastline)"
Or pipe it to xargs:
lastline | xargs -I {} less '{}'

How to test filename expansion result in bash?

I want to check whether a directory has files or not in bash.
My code is here.
for d in {,/usr/local}/etc/bash_completion.d ~/.bash/completion.d
do
[ -d "$d" ] && [ -n "${d}/*" ] &&
for f in $d/*; do
[ -f "$f" ] && echo "$f" && . "$f"
done
done
The problem is that "~/.bash/completion.d" has no files.
So $d/* is treated as the literal string "~/.bash/completion.d/*", not the empty string that filename expansion would produce.
As a result of that code, bash tries to run
. "~/.bash/completion.d/*"
and of course, it generates an error message.
Can anybody help me?
If you set the nullglob bash option, through
shopt -s nullglob
then globbing will drop patterns that don't match any file.
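A minimal sketch of how that slots into the loop from the question:

shopt -s nullglob
for d in {,/usr/local}/etc/bash_completion.d ~/.bash/completion.d; do
    [ -d "$d" ] || continue
    for f in "$d"/*; do              # expands to nothing when the directory is empty
        [ -f "$f" ] && echo "$f" && . "$f"
    done
done
shopt -u nullglob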
# NOTE: using only bash builtins
# Assuming $d contains directory path
shopt -s nullglob
# Assign matching files to array
files=( "$d"/* )
if [ ${#files[@]} -eq 0 ]; then
echo 'No files found.'
else
# Whatever
fi
Assignment to an array has other benefits, including desirable (correct!) handling of filenames/paths containing white-space, and simple iteration without using a sub-shell, unlike the following code, which runs its loop in one:
find "$d" -type f |
while read; do
# Process $REPLY
done
Instead, you can use:
for file in "${files[@]}"; do
# Process $file
done
with the benefit that the loop is run by the main shell, meaning that side-effects (such as variable assignment, say) made within the loop are visible for the remainder of script. Of course, it's also way faster, if performance is an issue.
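To make the sub-shell point concrete, a small made-up demonstration (reusing $d and the files array from above):

count=0
find "$d" -type f | while read -r; do
    count=$((count + 1))          # this runs in a sub-shell created by the pipe
done
echo "$count"                     # still 0 - the increment is lost with the sub-shell

count=0
for file in "${files[@]}"; do
    count=$((count + 1))          # this runs in the main shell
done
echo "$count"                     # the number of files found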
Finally, an array can also be inserted in command line arguments (without splitting arguments containing white-space):
$ md5sum fileA "${files[@]}" fileZ
You should always attempt to correctly handle files/paths containing white-space, because one day, they will happen!
You could use find directly in the following way:
for f in $(find {,/usr/local}/etc/bash_completion.d ~/.bash/completion.d -maxdepth 1 -type f);
do echo $f; . $f;
done
But find will print a warning if one of the directories isn't found; you can either add 2> /dev/null or run the find only after testing that the directories exist (like in your code).
recurse() {
for files in "$1"/*;do
if [ -d "$files" ];then
numfile=$(ls $files|wc -l)
if [ "$numfile" -eq 0 ];then
echo "dir: $files has no files"
continue
fi
recurse "$files"
elif [ -f "$files" ];then
echo "file: $files";
:
fi
done
}
recurse /path
Another approach
# prelim stuff to set up d
files=`/bin/ls $d`
if [ ${#files} -eq 0 ]
then
echo "No files were found"
else
# do processing
fi
