Bash script does not find index from array

I have written a bash script which I want to use to monitor backups on a Synology via Pushgateway.
The script should search for subfolders in the backup folder, write the newest file into a variable, and write the age and size of that file into an array.
To finally hand everything to the Pushgateway, I list all metrics by index. All folders and files exist, but when I execute the script, often one or more indexes are not found. If I execute the commands manually one by one, I get correct output.
Here is the script:
#!/bin/bash
set -e
backup_dir=$1

for dir in $(find "$backup_dir" -maxdepth 1 -mindepth 1 -type d \( ! -name @eaDir \)); do
    if compgen -G "${dir}/*.vib" > /dev/null; then
        latest_vib=$(ls -t1 "$dir"/*.vib | head -1)
        age_vib=$(( $(date +%s) - $(stat -c %Y "$latest_vib") ))
        size_vib=$(stat -c %s "$latest_vib")
        arrage_vib+=("${age_vib}")
        arrsize_vib+=("${size_vib}")
    fi
    if compgen -G "${dir}/*.vbk" > /dev/null; then
        latest_vbk=$(ls -t1 "$dir"/*.vbk | head -1)
        age_vbk=$(( $(date +%s) - $(stat -c %Y "$latest_vbk") ))
        size_vbk=$(stat -c %s "$latest_vbk")
        arrage_vbk+=("${age_vbk}")
        arrsize_vbk+=("${size_vbk}")
    fi
    min_dir=$(echo "$dir" | cut -d'/' -f4- | tr '[:upper:]' '[:lower:]')
    sign_dir=${min_dir//_/-}
    arrdir+=("${sign_dir}")
done
echo "${arrdir[4]}"
echo "${arrage_vib[4]}"
cat << EOF | curl -ks -u user:pw --data-binary @- https://pushgateway/metrics/job/backup/instance/instance_name
# HELP backup_age displays the age of backups in seconds
# TYPE backup_age gauge
backup_age_vib{dir="${arrdir[1]}"} ${arrage_vib[1]}
backup_age_vib{dir="${arrdir[2]}"} ${arrage_vib[2]}
backup_age_vib{dir="${arrdir[3]}"} ${arrage_vib[3]}
backup_age_vib{dir="${arrdir[4]}"} ${arrage_vib[4]}
backup_age_vbk{dir="${arrdir[1]}"} ${arrage_vbk[1]}
...
# HELP backup_size displays the size of backups in bytes
# TYPE backup_size gauge
backup_size_vib{dir="${arrdir[1]}"} ${arrsize_vib[1]}
...
EOF
I hope you can help me and point out where I made a mistake. I am also open to general optimizations of the script, since I assume it can be solved in a better, more performant way. I took a few lines of the code from here ;-).
Many thanks in advance.
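Regarding the "general optimizations" part: here is a minimal sketch of a word-splitting-safe version of the loop, assuming GNU find/stat as in the script above. Note two things: for dir in $(find ...) splits directory names on whitespace, which can produce stray array entries, and bash arrays are zero-based, so indexing from 1 skips the first entry entirely; either could explain the missing indexes. Only the .vib branch is shown; .vbk would be handled identically.

#!/bin/bash
set -e
backup_dir=$1

while IFS= read -r -d '' dir; do
    if compgen -G "$dir/*.vib" > /dev/null; then
        # newest .vib by modification time, without parsing ls output
        latest_vib=$(find "$dir" -maxdepth 1 -name '*.vib' -printf '%T@ %p\n' | sort -n | tail -1 | cut -d' ' -f2-)
        arrage_vib+=( "$(( $(date +%s) - $(stat -c %Y "$latest_vib") ))" )
        arrsize_vib+=( "$(stat -c %s "$latest_vib")" )
    fi
    min_dir=$(tr '[:upper:]' '[:lower:]' <<< "${dir##*/}")
    arrdir+=( "${min_dir//_/-}" )
done < <(find "$backup_dir" -maxdepth 1 -mindepth 1 -type d ! -name @eaDir -print0)

echo "${arrdir[0]}" "${arrage_vib[0]}"   # bash arrays start at index 0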

Related

Bash script comparison in combination with getfattr

I am currently stuck with a problem in my Bash script and seem to get deeper into the dark with every attempt to fix it.
Background:
We have a folder which is getting filled with numbered crash folders, which in turn get filled with crash files. Someone exports a list of these folders on a daily basis, and during that export the numbered crash folders get an attribute "user.exported=1".
Some of them do not get exported, so they will not have the attribute, and these should be deleted only if they are older than 30 days.
My problem:
I am setting up a bash script, which will eventually run via cron, to regularly check for folders that have the attribute "user.exported=1" and are older than 14 days, and delete them via rm -rfv FOLDER >> deleted.log.
We also have folders which never get the attribute "user.exported=1"; these need to be deleted once they are older than 30 days. I created an IF ELIF FI comparison to check for that, but that is where I got stuck.
My Code:
#!/bin/bash

# Variable definition
LOGFILE="/home/crash/deleted.log"
DATE=`date '+%d/%m/%Y'`
TIME=`date '+%T'`
FIND=`find /home/crash -maxdepth 1 -mindepth 1 -type d`

# Code execution
printf "\n$DATE-$TIME\n" >> "$LOGFILE"
for d in $FIND; do
    # Check if crash folders are older than 14 days and have been exported
    if [[ "$(( $(date +"%s") - $(stat -c "%Y" $d) ))" -gt "1209600" ]] && [[ "$(getfattr -d --absolute-names -n user.exported --only-values $d)" == "1" ]]; then
        #echo "$d is older than 14 days and exported"
        "rm -rfv $d" >> "$LOGFILE"
    # Check if crash folders are older than 30 days and delete regardless
    elif [[ "$(( $(date +"%s") - $(stat -c "%Y" $d) ))" -gt "1814400" ]] && [[ "$(getfattr -d --absolute-names -n user.exported $d)" == FALSE ]]; then
        #echo "$d is older than 30 days"
        "rm -rfv $d" >> "$LOGFILE"
    fi
done
The IF part is working fine and it deleted the folders with the attribute "user.exported=1", but the ELIF part does not seem to work, as I only get output in my shell such as:
/home/crash/1234: user.exported: No such attribute
./crash_remove.sh: line 20: rm -rfv /home/crash/1234: No such file or directory
When I look into the crash folder after the script has run, the folder and its contents are still there.
I definitely have an error in my script but cannot see it. Could anyone please help me out with this?
Thanks in advance
Only quote the expansions, not the whole command.
Instead of:
"rm -rfv $d"
do:
rm -rfv "$d"
If you quote it all, bash tries to run a command named literally rm<space>-rfv<space><expansion of d>.
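You can see that directly in an interactive shell (the path is just the one from your error output):

$ d=/home/crash/1234
$ "rm -rfv $d"
bash: rm -rfv /home/crash/1234: No such file or directory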
Do not use backticks `...`. Use $(...) instead. See the Bash Hackers wiki on obsolete and deprecated syntax.
Do not use for i in $(cat) or var=$(...); for i in $var. Use a while IFS= read -r loop instead. See "How to read a file line by line in bash".
Instead of if [[ "$(( $(date +"%s") - $(stat -c "%Y" $d) ))" -gt "1814400" ]], just do the comparison inside the arithmetic expression, like: if (( $(date +"%s") - $(stat -c "%Y" "$d") > 1814400 )).
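Putting that advice together, here is a minimal sketch of the read-loop version. The thresholds and the getfattr call come from the question; treating a missing attribute as "not exported" is an assumption:

#!/bin/bash
LOGFILE="/home/crash/deleted.log"
now=$(date +%s)

find /home/crash -maxdepth 1 -mindepth 1 -type d -print0 |
while IFS= read -r -d '' d; do
    age=$(( now - $(stat -c %Y "$d") ))
    # empty when the attribute is missing (getfattr then prints only an error)
    exported=$(getfattr --absolute-names -n user.exported --only-values "$d" 2>/dev/null)
    if (( age > 1209600 )) && [[ $exported == 1 ]]; then
        rm -rfv "$d" >> "$LOGFILE"
    elif (( age > 1814400 )) && [[ -z $exported ]]; then
        rm -rfv "$d" >> "$LOGFILE"
    fi
done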
I think you could just do it all in find, like:

find /home/crash -maxdepth 1 -mindepth 1 -type d '(' \
    -mtime +14 \
    -exec sh -c '[ "$(getfattr --absolute-names -n user.exported --only-values "$1")" = "1" ]' -- {} \; \
    -exec echo rm -vrf {} + \
')' -o '(' \
    -mtime +30 \
    -exec sh -c '! getfattr --absolute-names -n user.exported "$1" > /dev/null 2>&1' -- {} \; \
    -exec echo rm -vrf {} + \
')' >> "$LOGFILE"

Note that -mtime +14 means "older than 14 days", the second sh -c test treats a missing user.exported attribute as "not exported", and the echo makes this a dry run; drop it once the logged output looks right.

Getting the path to the newest file in a directory with f=$(cd dir | ls -t | head) not honoring "dir"

I would like to get a file (a zip file) from a path with this part of code: file=$(cd '/path_to_zip_file' | ls -t | head -1). Instead, I get my .sh file in the directory where I am running the script.
Why can't I get the file from /path_to_zip_file?
Below is my code in the .sh file:
file=$(cd '/path_to_zip_file' | ls -t | head -1)
last_modified=`stat -c "%Y" $file`;
current=`date +%s`

echo $file

if [ $(($current-$last_modified)) -gt 86400 ]; then
    echo 'Mail'
else
    echo 'No Mail'
fi;
If you were going to use ls -t | head -1 (which you shouldn't), the cd would need to run as a prior command (happening before ls takes place), not as a pipeline component (running in parallel with ls, with its stdout connected to ls's stdin):
set -o pipefail # otherwise, a failure of ls is ignored so long as head succeeds
file=$(cd '/path_to_zip_file' && ls -t | head -1)
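A quick way to convince yourself that each part of a pipeline runs in its own subshell, so the cd never affects ls (/home/user stands in for whatever your current directory is):

$ pwd
/home/user
$ cd /tmp | pwd    # pwd runs in parallel with cd, in a separate subshell
/home/user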
A better-practice approach might look like:
newest_file() {
    local result=$1; shift                 # first, treat our first arg as latest
    while (( $# )); do                     # as long as we have more args...
        [[ $1 -nt $result ]] && result=$1  # replace "result" if they're newer
        shift                              # then take them off the argument list
    done
    [[ -e $result || -L $result ]] || return 1  # fail if no file found
    printf '%s\n' "$result"                # more reliable than echo
}
newest=$(newest_file /path/to/zip/file/*)
newest=${newest##*/} ## trim the path to get only the filename
printf 'Newest file is: %s\n' "$newest"
To understand the ${newest##*/} syntax, see the bash-hackers' wiki on parameter expansion.
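A quick illustration of that expansion, using a made-up path:

path=/path/to/zip/file/archive.zip
echo "${path##*/}"   # archive.zip  (strips the longest prefix matching */)
echo "${path%/*}"    # /path/to/zip/file  (the converse: strips the last component)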
For more on why using ls in scripts (except for output displayed to humans) is dangerous, see ParsingLs.
Both BashFAQ #99 -- How do I get the latest (or oldest) file from a directory? -- and BashFAQ #3 -- How can I sort or compare files based on some metadata attribute (newest / oldest modification time, size, etc.)? -- have useful discussion of the larger context in which this question was asked.

How to prevent Travis-CI to terminate a job?

I have a bunch of files that need to be copied over to a tmp/ directory and then compressed.
I tried cp -rf $SRC $DST, but the job is terminated before the command completes. The verbose option doesn't help either, because the log file then exceeds the size limit.
I wrote a small function to print only a percentage bar, but I get the same problem with the log size limit, so maybe I need to redirect stdout to stderr, but I'm not sure.
This is the snippet with the function:
function cp_p() {
    local files=0
    while IFS= read -r -d '' file; do ((files++)); done < <(find -L $1 -mindepth 1 -name '*.*' -print0)
    local duration=$(tput cols)
    duration=$(($duration<80?$duration:80-8))
    local count=1
    local elapsed=1
    local bar=""

    already_done() {
        bar="\r|"
        for ((done=0; done<$(( ($elapsed)*($duration)/100 )); done++)); do
            printf -v bar "$bar▇"
        done
    }
    remaining() {
        for ((remain=$(( ($elapsed)*($duration)/100 )); remain<$duration; remain++)); do
            printf -v bar "$bar "
        done
        printf -v bar "$bar|"
    }
    percentage() {
        printf -v bar "$bar%3d%s" $elapsed '%%'
    }

    mkdir -p "$2/$1"
    chmod `stat -f %A "$1"` "$2/$1"

    while IFS= read -r -d '' file; do
        file=$(echo $file | sed 's|^\./\(.*\)|"\1"|')
        elapsed=$(( (($count)*100)/($files) ))
        already_done
        remaining
        percentage
        printf "$bar"
        if [[ -d "$file" ]]; then
            dst=$2/$file
            test -d "$dst" || (mkdir -p "$dst" && chmod `stat -f %A "$file"` "$dst")
        else
            src=${file%/*}
            dst=$2/$src
            test -d "$dst" || (mkdir -p "$dst" && chmod `stat -f %A "$src"` "$dst")
            cp -pf "$file" "$2/$file"
        fi
        ((count++))
    done < <(find -L $1 -mindepth 1 -name '*.*' -print0)

    printf "\r"
}
This is the error I get
packaging files (this may take several minutes) ...
|▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇▇ | 98%
The log length has exceeded the limit of 4 MB (this usually means that the test suite is raising the same exception over and over).
The job has been terminated
Have you tried travis_wait cp -rf $SRC $DST? See https://docs.travis-ci.com/user/common-build-problems/#Build-times-out-because-no-output-was-received for details.
Also, I believe disk operations are generally rather slow on macOS builds. You might be better off compressing the file tree in a single pass instead of copying it first. Assuming you want to gzip the thing:
travis_wait tar -zcf $DST.tar.gz $SRC
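One detail worth knowing from the docs linked above: travis_wait extends the no-output timeout to 20 minutes by default, and takes an optional number of minutes as its first argument. So if the archive step itself is slow:

travis_wait 30 tar -zcf $DST.tar.gz $SRC   # allow up to 30 minutes without log output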

Why is while not working?

AIM: To find files with a word count less than 1000 and move them to another folder. Loop until all under-1k files are moved.
STATUS: It will only move one file, then error with "Unable to move file as it doesn't exist". For some reason $INPUT_SMALL doesn't seem to update with the new file name.
What am I doing wrong?
Current Script:
Check for input files already under 1k and move to Split folder
INPUT_SMALL=$( ls -S /folder1/ | grep -i reply | tail -1 )
INPUT_COUNT=$( cat /folder1/$INPUT_SMALL 2>/dev/null | wc -l )

function moveSmallInput() {
    while [[ $INPUT_SMALL != "" ]] && [[ $INPUT_COUNT -le 1003 ]]
    do
        echo "Files smaller than 1k have been found in input folder, these will be moved to the split folder to be processed."
        mv /folder1/$INPUT_SMALL /folder2/
    done
}
I assume you are looking for files that have the word reply somewhere in the path. My solution is:
wc -w $(find /folder1 -type f -path '*reply*') | \
while read wordcount filename
do
    if [[ $wordcount -lt 1003 ]]
    then
        printf "%4d %s\n" $wordcount $filename
        #mv "$filename" /folder2
    fi
done
Run the script once, if the output looks correct, then uncomment the mv command and run it for real this time.
Update
The above solution has trouble with files with embedded spaces. The problem occurs when the find command hands its output to the wc command. After a little bit of thinking, here is my revised solution:
find /folder1 -type f -path '*reply*' | \
while read filename
do
    set $(wc -w "$filename")    # $1 = word count, $2 = filename
    wordcount=$1
    if [[ $wordcount -lt 1003 ]]
    then
        printf "%4d %s\n" $wordcount $filename
        #mv "$filename" /folder2
    fi
done
A somewhat shorter version
#!/bin/bash
find ./folder1 -type f | while read f
do
    (( $(wc -w "$f" | awk '{print $1}') < 1000 )) && cp "$f" folder2
done
I left cp instead of mv for safety reasons. Change it to mv after validating.
If you also want to filter on reply, use @Hai's version of the find command.
Your variables INPUT_SMALL and INPUT_COUNT are not functions; they're just values you assigned once. You either need to set them inside your while loop, or turn them into functions and evaluate them each time (rather than just expanding the variable values, as you are now). A sketch of the function variant follows.
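A minimal sketch of that function approach, keeping the ls/grep pipeline from the question unchanged (the helper names smallest_input and line_count are made up for the sketch):

smallest_input() { ls -S /folder1/ | grep -i reply | tail -1; }
line_count()     { cat "/folder1/$1" 2>/dev/null | wc -l; }

function moveSmallInput() {
    local f
    f=$(smallest_input)
    while [[ $f != "" ]] && [[ $(line_count "$f") -le 1003 ]]
    do
        mv "/folder1/$f" /folder2/
        f=$(smallest_input)   # re-evaluate, so the next iteration sees the next file
    done
}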

KornShell script to get files between two dates

I need to get the files between two given dates via a KornShell (ksh) script. If there are multiple files on one day, get the latest of the files for that day.
I haven't tried it out, but there's a mailing list post about finding files between two dates. The relevant part:
Touch 2 files, start_date and stop_date, like this:

$ touch -t 200603290000.00 start_date
$ touch -t 200603290030.00 stop_date

Ok, start_date is 03/29/06 midnight, stop_date is 03/29/06 30 minutes after midnight. You might want to do an ls -al to check.

On to find. You can use -newer and then ! -newer, like this:

$ find /dir -newer start_date ! -newer stop_date -print

Combine that with ls -l, and you get:

$ find /dir -newer start_date ! -newer stop_date -print0 | xargs -0 ls -l

(Or you can try -exec to execute ls -l. I am not sure of the format, so you have to muck around a little bit.)
In bash, just as an example, you can use the -nt test operator (the Korn shell has it too, if I am not wrong):
printf "Enter start date( YYYYMMDD ):"
read startdate
printf "Enter end date( YYYYMMDD ):"
read enddate
touch -t "${startdate}0000.00" sdummy
touch -t "${enddate}0000.00" edummy
for fi in *
do
if [ $fi -nt "sdummy" -a ! $fi -nt "edummy" ] ;then
echo "-->" $fi
fi
done
In a nutshell, for ksh:

#!/usr/bin/ksh
# main from_date to_date path
# date format: YYMMDDhhmmss
ls -l --time-style "+%y%m%d%H%M%S" "$3" | awk '{ print $6 " " $7 }' | while read t n
do
    if (( t > $1 )) && (( t < $2 )); then
        echo $t $n
    fi
done
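If GNU find happens to be available (the GNU ls --time-style option above suggests it may be), its -newermt test expresses "between two dates" directly, without dummy touch files. A hedged sketch, assuming YYYY-MM-DD arguments and paths without embedded spaces for the newest-per-day step:

#!/usr/bin/ksh
# usage: between_dates.sh 2006-03-29 2006-04-01 /dir   (between_dates.sh is a made-up name)
find "$3" -type f -newermt "$1" ! -newermt "$2" -printf '%TY%Tm%Td %T@ %p\n' |
    sort -k1,1 -k2,2n |
    awk '{ latest[$1] = $3 } END { for (d in latest) print latest[d] }'   # keeps the newest path per day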
