While loop does not execute - bash

I currently have this code:
listing=$(find "$PWD")
fullnames=""
while read listing;
do
if [ -f "$listing" ]
then
path=`echo "$listing" | awk -F/ '{print $(NF)}'`
fullnames="$fullnames $path"
echo $fullnames
fi
done
For some reason, this script isn't working, and I think it has something to do with the way that I'm writing the while loop / declaring listing. Basically, the code is supposed to pull out the actual names of the files, i.e. blah.txt, from the find $PWD.

read listing does not read a value from the string listing; it sets the variable listing to a line read from standard input. Try this:
# Ignoring the possibility of file names that contain newlines
while read; do
[[ -f $REPLY ]] || continue
path=${REPLY##*/}
fullnames+=( "$path" )
echo "${fullnames[@]}"
done < <( find "$PWD" )
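If filenames containing newlines are a possibility, a null-delimited variant handles them (a sketch, assuming a find that supports -print0):
while IFS= read -r -d '' f; do
    [[ -f $f ]] || continue
    fullnames+=( "${f##*/}" )
done < <( find "$PWD" -print0 )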
With bash 4 or later, you can simplify this with
shopt -s globstar
for f in **/*; do
[[ -f $f ]] || continue
paths+=( "$f" )
done
fullnames=( "${paths[@]##*/}" )
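For example, to print the collected basenames one per line:
printf '%s\n' "${fullnames[@]}"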

Generate multiple output files for loop

I'm trying to generate a new output file from each existing file in a directory of .txt files. I want to check each file line by line for two substrings and append the lines that match either substring to each new output file.
I'm having trouble generating the new files.
This is what I currently have:
#!/bin/sh
# My first Script
success="(Compiling)\s\".*\"\s\-\s(Succeeded)"
failure="(Compiling)\s\".*\"\s\-\s(Failed)"
count_success=0
count_failure=0
for i in ~/Documents/reports/*;
do
while read -r line;
do
if [[$success=~$line]]; then
echo $line >> output_$i
count_success++
elif [[$failure=~$]]; then
echo $line >> output_$i
count_failure++
fi
done
done
echo "$count_success of jobs ran succesfully"
echo "$count_failure of jobs didn't work"
Any help would be appreciated, thanks
Please use https://www.shellcheck.net/ to check your shell scripts.
If you use Visual Studio Code, you could install the "ShellCheck" extension (by Timon Wong).
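ShellCheck can also be run locally from the command line; for example (the script name here is just a placeholder):
shellcheck myscript.sh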
About your program:
Assume bash
Define different extensions for input and output files (really important if they are in the same directory)
Loop over report (input) files only
Clear the output file
Read the input file
In the if tests:
use [[ ... ]] with a space after [[ and before ]]
put spaces before and after operators (=~)
reverse the operand order for =~ (the string on the left, the regex on the right)
Prevent globbing and word splitting with "..."
#! /bin/bash
# Input file extension
declare -r EXT_REPORT=".txt"
# Output file extension
declare -r EXT_OUTPUT=".output"
# RE
declare -r success="(Compiling)\s\".*\"\s\-\s(Succeeded)"
declare -r failure="(Compiling)\s\".*\"\s\-\s(Failed)"
# Counters
declare -i count_success=0
declare -i count_failure=0
for REPORT_FILE in ~/Documents/reports/*"${EXT_REPORT}"; do
# Clear output file
: > "${REPORT_FILE}${EXT_OUTPUT}"
# Read input file (see named file in "done" line)
while read -r line; do
# does the line match the success pattern ?
if [[ $line =~ $success ]]; then
echo "$line" >> "${REPORT_FILE}${EXT_OUTPUT}"
count_success+=1
# does the line match the failure pattern ?
elif [[ $line =~ $failure ]]; then
echo "$line" >> "${REPORT_FILE}${EXT_OUTPUT}"
count_failure+=1
fi
done < "$REPORT_FILE"
done
echo "$count_success of jobs ran succesfully"
echo "$count_failure of jobs didn't work"
What about using grep?
success='Compiling\s".*"\s-\sSucceeded'
failure='Compiling\s".*"\s-\sFailed'
count_success=0
count_failure=0
for i in ~/Documents/reports/*; do
(( count_success += $(grep -E "$success" "$i" | tee "output_$i" | wc -l) ))
(( count_failure += $(grep -E "$failure" "$i" | tee -a "output_$i" | wc -l) ))
done
echo "$count_success of jobs ran succesfully"
echo "$count_failure of jobs didn't work"

Why is "ls -1 $fl | wc -l" not returning value 0 in my for loop?

I am trying to add a condition in a for loop to check the existence of a file as well as check for file size > 0 KB.
Period file contains monthly data:
20180101
20180201
20180301
20180401
20180501
There are individual files created for each month. Suppose a file is not created for one month (20180201); then the loop below should terminate.
For example:
xxx_20180101.txt
xxx_20180301.txt
xxx_20180401.txt
xxx_20180501.txt
if [[ $STATUS -eq 0 ]]; then
for per in `cat ${PATH}/${PERIOD}.txt | cut -f 1 -d";"`
do
for fl in `ls -1 ${PATH}/${FILE} | grep ${per}`
do
if [[ `ls -1 $fl | wc -l` -eq 0 ]]; then
echo "File not found"
STATUS=1
else
if [[ -s "$fl" ]]; then
echo "$fl contain data.">>/dev/null
else
echo "$fl File size is 0KB"
STATUS=1
fi
fi
done
done
fi
but ls -1 $fl | wc -l does not return 0 when the if condition is executed.
The following is a demonstration of what a best-practices rewrite might look like.
Note:
We do not (indeed, must not) use a variable named PATH to store a directory under which we look for data files; doing this overwrites the PATH environment variable used to find programs to execute.
ls is not used anywhere; it is a tool intended to generate output for human consumption, not machines.
Reading through input is accomplished with a while read loop; see BashFAQ #1 for more details. Note that the input source for the loop is established at the very end; see the redirection after the done.
Finding file sizes is done with stat -c here; for more options, portable to platforms where stat -c is not supported, see BashFAQ #87.
Because your filename format is well-formed (with an underscore before the substring from your input file, and a .txt after that substring), we're refining the glob to look only for names matching that restriction. This prevents a search for 001 from also matching xxx_0015.txt, xxx_5001.txt, etc.
#!/usr/bin/env bash
# ^^^^ -- NOT /bin/sh; this lets us use bash-only syntax
path=/provided/by/your/code # replacing buggy use of PATH in original code
period=likewise # replacing use of PERIOD in original code
shopt -s nullglob # generate a zero-length list for unmatched globs
while IFS=';' read -r per _; do
# populate an array with a list of files with names containing $per
files=( "$path/$period/"*"_${per}.txt" )
# if there aren't any, log a message and proceed
if (( ${#files[@]} == 0 )); then
echo "No files with $per found in $path/$period" >&2
continue
fi
# if they *do* exist, loop over them.
for file in "${files[@]}"; do
if [[ -s "$file" ]]; then
echo "$file contains data" >&2
if (( $(stat -c +%s -- "$file") >= 1024 )); then
echo "$file contains 1kb of data or more" >&2
else
echo "$file is not empty, but is smaller than 1kb" >&2
fi
else
echo "$file is empty" >&2
fi
done
done < "$path/$period.txt"
Here's a refactoring of Mikhail's answer with the standard http://shellcheck.net/ warnings ironed out. I have not been able to understand the actual question well enough to guess whether this actually solves the OP's problem.
while IFS='' read -r per; do
if [ -e "xxx_$per.txt" ]; then
echo "xxx_$per.txt found" >&2
else
echo "xxx_$per.txt not found" >&2
fi
done <periods.txt
You are overengineering this. Just iterate over the contents of the periods file and search for each period in the list of files, like this:
for per in `cat periods.txt`
do
if ls | grep -q "$per"; then
echo "$per found";
else
echo "$per not found"
fi
done
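If you want to avoid parsing ls, bash's compgen builtin can test a glob directly; a sketch assuming the xxx_<period>.txt naming from the question:
while read -r per; do
    if compgen -G "*${per}*" > /dev/null; then
        echo "$per found"
    else
        echo "$per not found"
    fi
done < periods.txt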

Bash: Pass alias or function as argument to program

Quite often I need to work on the newest file in a directory.
Normally I do:
ls -rt
and then open the last file in vim or less.
Now I wanted to produce an alias or function, like
lastline() {ls -rt | tail -n1}
# or
alias lastline=$(ls -rt | tail -n1)
Calling lastline outputs the newest file in the directory, which is nice.
But calling
less lastline
wants to open the file "lastline" which doesn't exist.
How do I make bash execute the function or alias, if possible without a lot of typing $() or ``?
Or is there any other way to achieve the same result?
Thanks for your help.
You're parsing ls, and this is very bad. Moreover, if the last modified “file” is a directory, you'll be lessing/viming a directory.
So you need a robust way to determine the last modified file in the current directory. You may use a helper function like the following (that you'll put in your .bashrc):
last_modified_regfile() {
# Finds the last modified regular file in current directory
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file
last_modified_regfile_ret=
for file in *; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
[[ $last_modified_regfile_ret ]]
}
Then you may define another function that will vim the last found file:
vimlastline() {
last_modified_regfile && vim -- "$last_modified_regfile_ret"
}
You may even have last_modified_regfile take optional arguments: the directories where it will find the last modified regular file:
last_modified_regfile() {
# Finds the last modified regular file in current directory
# or in directories given as arguments
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file dir
local save_shopt_nullglob=$(shopt -p nullglob)
shopt -s nullglob
(( $# )) || set .
last_modified_regfile_ret=
for dir; do
dir=${dir%/}
[[ -d $dir/ ]] || continue
for file in "$dir"/*; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
done
$save_shopt_nullglob
[[ $last_modified_regfile_ret ]]
}
Then you can even alter vimlastline accordingly:
vimlastline() {
last_modified_regfile "$@" && vim -- "$last_modified_regfile_ret"
}
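For example (the directory names here are just examples):
vimlastline ~/Downloads /tmp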
Use command substitution like this:
lastline() { ls -rt | tail -n1; }
less "$(lastline)"
Or pipe it to xargs:
lastline | xargs -I {} less '{}'

Infinite while-loop in BASH script

I'm really struggling to see why this while-loop never ends. When the loop starts, my variable LOC is set to Testing/, which is a directory I created to test this program.
I want the loop to end once all Directories have had the "count" function applied to them.
Here are the things I have tried;
I've checked my count function, and it doesn't produce an infinite loop
I've tried running through the algorithm by hand
PARSE=1
LOC=$LOC/
count
AVAILABLEDIR=$(ls $LOC -AFl | sed "1 d" | grep "/$" | awk '{ print $9 }')
while [ $PARSE = "1" ]
do
if [[ ${AVAILABLEDIR[@]} == '' ]]; then
PARSE=0
fi
DIRBASE=$LOC
for a in ${AVAILABLEDIR[@]}; do
LOC="${DIRBASE}${a}"
LOCLIST="$LOCLIST $LOC"
count
done
for a in ${LOCLIST[@]}; do
TMPAVAILABLEDIR=$(ls $a -AFl | sed "1 d" | grep "/$" | awk '{ print $9 }')
PREPEND=$a
if [[ ${TMPAVAILABLEDIR[@]} == '' ]]; then
continue
fi
for a in ${TMPAVAILABLEDIR[@]}; do
TMPAVAILABLEDIR2="$TMPAVAILABLEDIR2 ${PREPEND[@]}${a}"
done
NEWAVAILABLEDIR="$NEWAVAILABLEDIR $TMPAVAILABLEDIR2"
done
AVAILABLEDIR=$NEWAVAILABLEDIR
NEWAVAILABLEDIR=''
LOC=''
done
I am really struggling, and any input would be greatly appreciated, I've been trying to figure this out for the last couple of hours.
You should try to run the script with the -x option, or write it into the first line:
#!/bin/bash -x
Then it tells you everything it does.
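For example, you can enable tracing for a single run without editing the script (the script name is just a placeholder), and capture the trace, which goes to stderr, in a file:
bash -x ./yourscript.sh 2> trace.log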
In that case, you might notice two errors:
You never reset TMPAVAILABLEDIR2
You do ls on regular files as well.
If you really must avoid recursion, try this. It is completely recursion-free:
#!/bin/bash
count() {
echo counting "$1"
}
todo=(Testing)
while test ${#todo[@]} != 0
do
doit=("${todo[@]}")
todo=()
for dir in "${doit[@]}"
do
for entry in "$dir"/* # If the directory is empty, this shows an entry named "*"
do
test -e "$entry" || continue # Skip the entry "*" of an empty directory
count "$entry"
test -d "$entry" || continue
todo+=("$entry")
done
done
done
You wrote you want to perform "count" on all directories.
Look at the options of find:
find "$LOC" -type d | while read -r dir; do
# cd in a subshell so the working directory is restored for the next iteration
( cd "$dir" && count )
done
Or shorter (when your function count accepts a directory as parameter 1):
find $LOC -type d | xargs count
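Note that xargs runs external commands, not shell functions, so count would first have to be exported to child shells; a bash-specific sketch:
export -f count
find "$LOC" -type d | xargs -I{} bash -c 'count "$1"' _ {}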
I now see you do not want to use find or ls -R (recursive function). Then you should make your own recursive function, something like
function parseDir {
ls -d "$1"/*/ 2>/dev/null | while read -r dir; do
count
parseDir "$dir"
done
}
I have no idea if this will work, but it’s an interesting question I couldn't stop thinking about.
while true ; do
for word in * ; do
if [[ -d "$word" ]] ; then
d[$((i++))]="$PWD"/"$word"
elif [[ -f "$word" ]] ;then
f[$((j++))]="$PWD"/"$word"
fi
done
[[ $k -gt $i ]] && cd ..
cd "${d[$((k++))]}" || break
done

How to test filename expansion result in bash?

I want to check whether a directory has files or not in bash.
My code is here.
for d in {,/usr/local}/etc/bash_completion.d ~/.bash/completion.d
do
[ -d "$d" ] && [ -n "${d}/*" ] &&
for f in $d/*; do
[ -f "$f" ] && echo "$f" && . "$f"
done
done
The problem is that ~/.bash/completion.d has no files.
So $d/* is treated as the literal string "~/.bash/completion.d/*", not the empty string that would result from filename expansion.
As a result of that code, bash tries to run
. "~/.bash/completion.d/*"
and of course, it generates an error message.
Can anybody help me?
If you set the nullglob bash option, through
shopt -s nullglob
then globbing will drop patterns that don't match any file.
# NOTE: using only bash builtins
# Assuming $d contains directory path
shopt -s nullglob
# Assign matching files to array
files=( "$d"/* )
if [ ${#files[@]} -eq 0 ]; then
echo 'No files found.'
else
# Whatever
fi
Assignment to an array has other benefits, including desirable (correct!) handling of filenames/paths containing white-space, and simple iteration without using a sub-shell, unlike the following code, which does use one:
find "$d" -type f |
while read; do
# Process $REPLY
done
Instead, you can use:
for file in "${files[@]}"; do
# Process $file
done
with the benefit that the loop is run by the main shell, meaning that side-effects (such as variable assignment, say) made within the loop are visible for the remainder of the script. Of course, it's also way faster, if performance is an issue.
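For instance, a counter incremented inside a piped while loop is lost when the pipeline's sub-shell exits, whereas the array-based for loop keeps it (a small illustrative sketch):
count=0
find "$d" -type f | while read -r; do (( count++ )); done
echo "$count"   # still 0: the loop body ran in a sub-shell
count=0
for file in "${files[@]}"; do (( count++ )); done
echo "$count"   # number of files in the array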
Finally, an array can also be inserted in command line arguments (without splitting arguments containing white-space):
$ md5sum fileA "${files[@]}" fileZ
You should always attempt to correctly handle files/paths containing white-space, because one day, they will happen!
You could use find directly in the following way:
for f in $(find {,/usr/local}/etc/bash_completion.d ~/.bash/completion.d -maxdepth 1 -type f);
do echo $f; . $f;
done
But find will print a warning if some of the directories aren't found; you can either add 2> /dev/null or put the find call after testing whether the directories exist (like in your code).
recurse() {
for files in "$1"/*;do
if [ -d "$files" ];then
numfile=$(ls $files|wc -l)
if [ "$numfile" -eq 0 ];then
echo "dir: $files has no files"
continue
fi
recurse "$files"
elif [ -f "$files" ];then
echo "file: $files";
:
fi
done
}
recurse /path
Another approach
# prelim stuff to set up d
files=`/bin/ls $d`
if [ ${#files} -eq 0 ]
then
echo "No files were found"
else
# do processing
fi
