mv command and rename not working on multiple files - bash

Below is a bash script that moves files around and renames them. The problem is it doesn't work when there is more than one file in the directory. I'm assuming that's because the last parameter in the mv command is a file. Any suggestions?
'#!/bin/bash'
'INPUTDIR="/home/southern-uniontn/S001007420"'
'OUTPUTDIR="/mnt/edi-06/southern-uniontn/flats-in"'
'BACKUPDIR="/backup/southern-uniontn/S001007420"'
YEAR=`date +%Y`
MONTH=`date +%m`
DAY=`date +%d`
HOUR=`date +%H`
MINUTE=`date +%M`
######## Do some error checking #########
# Does backup dir exist?
if [ ! -d $BACKUPDIR/$YEAR ]
then
mkdir $BACKUPDIR/$YEAR
fi
if [ ! -d $BACKUPDIR/$YEAR/$MONTH ]
then
mkdir $BACKUPDIR/$YEAR/$MONTH
fi
if [ ! -d $BACKUPDIR/$YEAR/$MONTH/$DAY ]
then
mkdir $BACKUPDIR/$YEAR/$MONTH/$DAY
fi
if [[ $(find $INPUTDIR -type f | wc -l) -gt 0 ]];
then
###### Rename the file, move it to Backup, then copy to the Output Directory #####
for f in $INPUTDIR/*
do
echo "`date` - Move recurring txt flat file to BackupDir for Union TN from Southern"
mv $INPUTDIR/* $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt
sleep 2
echo "`date` - Copy backup file to the Union TN Output Directory"
cp $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt $OUTPUTDIR/
done;
fi

Some notes:
Get out of the habit of using ALLCAPS variable names; leave those as reserved by the shell. One day you'll write PATH=something and then wonder why your script is broken.
mkdir -p can create parent directories, and will not error if the dir already exists
store the filenames in an array. Then the shell does not have to duplicate the work, and you don't need to count how many there are: if there are no files, the loop has zero iterations
if you want to keep the same directory hierarchy in the outputdir, you need to do that by hand.
use read to get the date parts
with bash v4.2+, printf can be used instead of calling out to date
use magic value "-1" to mean "now".
printf '%(%Y-%m-%d)T\n' -1 prints "2021-10-25" (as of the day I write this)
This is, I think, what you want:
#!/bin/bash
inputdir='/home/southern-uniontn/S001007420'
outputdir='/mnt/edi-06/southern-uniontn/flats-in'
backupdir='/backup/southern-uniontn/S001007420'
read year month day hour minute < <(printf '%(%Y %m %d %H %M)T\n' -1)
# create backup dirs if not exists
date_dir="$year/$month/$day"
mkdir -p "$backupdir/$date_dir"
mkdir -p "$outputdir/$date_dir"
mapfile -t files < <(find "$inputdir" -type f)
for f in "${files[@]}"
do
###### Rename the file, move it to Backup, then copy to the Output Directory #####
backup_file="UnionTN-S001007420-$year$month$day-$hour$minute.txt"
printf '%(%c)T - Move recurring txt flat file to backupdir for Union TN from Southern\n' -1
mv "$f" "$backupdir/$date_dir/$backup_file"
printf '%(%c)T - Copy backup file to the Union TN Output Directory\n' -1
cp "$backupdir/$date_dir/$backup_file" "$outputdir/$date_dir/$backup_file"
done

When using a glob with mv, the target must be an existing directory, and all matching files will be moved inside that directory.
In your case,
mv $INPUTDIR/* $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt
tells mv to move every file matching $INPUTDIR/* to the single path $BACKUPDIR/$YEAR/$MONTH/$DAY/UnionTN-S001007420-$YEAR$MONTH$DAY-$HOUR$MINUTE.txt. With more than one source file, that last argument would have to be an existing directory, so mv fails.
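For illustration, a minimal sketch of that behaviour (the /tmp paths and the .bak suffix here are made up):
# suppose /tmp/in contains a.txt and b.txt, and /tmp/backup is an existing directory
mv /tmp/in/* /tmp/backup/            # fine: both files are moved into the directory, keeping their names
mv /tmp/in/* /tmp/backup/new.txt     # fails with more than one source file: the target is not a directory
# renaming while moving therefore has to happen one file at a time:
for f in /tmp/in/*; do
    mv "$f" "/tmp/backup/$(basename "$f").bak"
done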
I'm not sure what you're trying to do, but I hope this helps.
Some more advice you could use (a short sketch applying several of these points follows the list):
Don't put the shebang (the first line beginning with "#") and the first three variable declarations inside single-quotes.
Some argue it is more portable and better to write /usr/bin/env bash instead of /bin/bash in the shebang
if [ CONDITION ]; then ACTION; fi statements can be simplified by writing [ CONDITION ] && ACTION
You reduce your likelihood of encountering unexpected behaviour by double-quoting your strings and variables (i.e. write "${year}/${month}/" instead of $year/$month).
No need to call mkdir a, followed by mkdir a/b, then mkdir a/b/c and so on; you can just call mkdir -p a/b/c. The -p flag tells mkdir to create parent directories if they don't already exist.
It is unnecessary to check whether a directory exists before calling mkdir -p, since -p already handles existing directories for you.
As pointed out by commenters, all-caps variable names are conventionally reserved for special POSIX and environment variables. You should use another casing.
You could use date to do the formatting for you: date +%Y/%m/%d will print 2021/10/25
Strings without interpolation can have single-quotes.
(Optional, prevent undesired behaviors) Put set -e at the beginning of your scripts, after the shebang, to tell bash to halt if an error is encountered
And finally, use man <command_name> for built-in documentation!
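Putting several of these points together, a minimal sketch (the directory names are made-up test locations, not your real paths):
#!/usr/bin/env bash
set -e                                   # stop at the first error

input_dir="$HOME/input-test"             # lower-case names; always quote your expansions
backup_root="$HOME/backup-test"

date_dir=$(date +%Y/%m/%d)               # e.g. 2021/10/25
mkdir -p "$backup_root/$date_dir"        # creates any missing parents, no error if it already exists
[ -d "$input_dir" ] || mkdir -p "$input_dir"   # short one-line form instead of if/then/fi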

Related

Use of mkdir -v output with a newline embedded

First of all, this question is purely theoretical; it involves creating a directory with a newline in its name, something that should NEVER be done.
That said, I'm trying to use mkdir -pv output to remove the created directories at a specific point in my script, but only the newly created ones, not the ones that previously existed.
mkdir -pv prints one line per directory that did not exist before the call, so I can feed those names back into an rm -rf command. It works OK except when a directory name contains a newline, and I can't see what is wrong with it.
My minimal working example:
declare -a created
# Delete previous traces
mkdir_out=$(mkdir -pv 'new 10'{1,2,3,$'\n',"'a",4})
# Convert to array
IFS=$'\n' read -d '' -a created < <(printf '%s' "${mkdir_out}")
# Debug
printf '=>[1] %s\n' "${created[@]}"
# We only want content between first and last quote
created=( "${created[@]%[\'\"]}" )
created=( "${created[@]#*[\'\"]}" )
# Debug
printf '=>[2] %s\n' "${created[@]}"
rm -rfv "${created[@]}"
ls # Directory "new 10\n" is still there!!
So, what is the safe way to do that?
Output like mkdir: created directory 'foo' is only meant for humans. Don't try to parse it.
If you want to handle all possible filenames and you can't deal in \0 separated lists, you have to do them one by one. Here's an example:
declare -a created dirs
dirs=( 'new 10'{1,2,3,$'\n',"'a",4} )
created=()
for dir in "${dirs[@]}"
do
if [[ ! -d "$dir" ]] && mkdir -p "$dir"
then
created+=( "$dir" )
fi
done
rm -rfv "${created[@]}"
ls # Directory "new 10\n" is not there.

Unix script - Comparing number of filename date with my single input date

I am new to Unix scripting. I have been trying to write this script for a week but couldn't get it working. Please help me with this.
I have more than 100 files (all with different names) in a directory, and each filename contains a date string (ex: 20171101). I want to compare the date in each filename with my input date (today - 10 days = 20171114), using only the date string in the filename, and if the filename date is earlier than the input date, delete that file. Could anyone please help with this? Thanks
My script:
ten_days_ago=$(date -d "10 days ago" +%Y%m%d)
cd "$destination_dir" ;
ls *.* | awk -F '-' '{print $2}'
ls *.* | awk -F '-' '{print $2}' > removal.txt
while read filedate
do
if [ "$filedate" -lt "$ten_days_ago" ] ; then
cd "$destination_dir" ;
rm *-"$filedate"*
echo "deletion done"
fi
done <removal.txt
This script is working fine, but I also need to send an email: a success email if the deletion was done, otherwise a failure email.
But if I write the email inside the while loop, it will be sent on every iteration.
You're probably trying to pipe to mail from the middle of your loop. (Your question should really show this code, otherwise we can't say what's wrong.) A common technique is to redirect the loop's output to a file, and then send that. (Using a temporary file is slightly ugly, but avoids sending an empty message when there is no output from the loop.)
Just loop over the files and decide which to remove.
#!/bin/bash
t=$(mktemp -t tendays.XXXXXXXX) || exit
# Remove temp file if interrupted, or when done
trap 'rm -f "$t"' EXIT HUP INT TERM
ten_days_ago=$(date -d "10 days ago" +%Y%m%d)
for file in *-[1-9]*[1-9]-*; do
date=${file#*-} # strip prefix up through first dash
date=${date%-*} # strip from last dash from the previous result
if [ "$date" -lt "$ten_days_ago" ]; then
rm -v "$file"
fi
done >"$t" 2>&1
test -s "$t" || exit # Quit if empty
mail -s "Removed files" recipient@example.net <"$t"
I removed the (repeated!) cd so this can be run in any directory -- just switch to the directory you want before running the script. This also makes it easier to test in a directory with a set of temporary files.
Collecting the script's standard error also means the mail message will contain any error messages if rm fails for some reason or you have other exceptions.
By the by you should basically never use ls in scripts.
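To make that last point concrete, here is a small sketch of iterating over files without parsing ls (the glob pattern is only an example):
shopt -s nullglob                        # an empty match gives zero loop iterations
for file in *-[0-9]*-*; do
    printf 'found: %s\n' "$file"         # the glob hands you each name intact, spaces and all
done

# if you need recursion or finer filtering, let find produce a NUL-separated list:
while IFS= read -r -d '' file; do
    printf 'found: %s\n' "$file"
done < <(find . -type f -name '*2017*' -print0)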

Change date modified of multiple folders to match that of their most recently modified file

I've been using the following bash script as an app which I can drop a folder on, and it will update the date modified of the folder to match the most recently modified file in that folder.
for f in each "$@"
do
echo "$f"
done
$HOME/setMod "$@"
This gets the folder name, and then passes it to this setMod script in my home folder.
#!/bin/bash
# Check that exactly one parameter has been specified - the directory
if [ $# -eq 1 ]; then
# Go to that directory or give up and die
cd "$1" || exit 1
# Get name of newest file
newest=$(stat -f "%m:%N" * | sort -rn | head -1 | cut -f2 -d:)
# Set modification date of folder to match
touch -r "$newest" .
fi
However, if I drop more than one folder on it at a time, it won't work, and I can't figure out how to make it work with multiple folders at once.
Also, I learned from Apple Support that the reason so many of my folders keep getting the mod date updated is due to some Time Machine-related process, despite the fact I haven't touched some of them in years. If anyone knows of a way to prevent this from happening, or to somehow automatically periodically update the date modified of folders to match the date/time of the most-recently-modified file in them, that would save me from having to run this step manually pretty regularly.
The setMod script currently accepts only one parameter.
You could either make it accept many parameters and loop over them,
or you could make the calling script use a loop.
I take the second option, because the caller script has some mistakes and weak points. Here it is corrected and extended for your purpose:
for dir; do
echo "$dir"
"$HOME"/setMod "$dir"
done
Or to make setMod accept multiple parameters:
#!/bin/bash
setMod() {
cd "$1" || return 1
# Get name of newest file
newest=$(stat -f "%m:%N" * | sort -rn | head -1 | cut -f2 -d:)
# Set modification date of folder to match
touch -r "$newest" .
}
for dir; do
if [ ! -d "$dir" ]; then
echo "not a directory, skipping: $dir"
continue
fi
(setMod "$dir")
done
Notes:
for dir; do is equivalent to for dir in "$@"; do
The parentheses around (setMod "$dir") make it run in a sub-shell, so that the script itself doesn't change the working directory, the effect of the cd operation is limited to the sub-shell within (...)
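A tiny demonstration of the sub-shell point, using a throwaway directory:
mkdir -p /tmp/subshell-demo
pwd                                  # e.g. /home/you
(cd /tmp/subshell-demo && pwd)       # prints /tmp/subshell-demo
pwd                                  # still /home/you: the cd only affected the sub-shell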

Bash: Creating subdirectories reading from a file

I have a file that contains some keywords, and I intend to create, in the same directory, a subdirectory for each keyword using a bash script. Here is the code I am using, but it doesn't seem to be working.
I don't know where I have gone wrong. Help me out.
for i in `cat file.txt`
do
# if [[ ! -e $path/$i ]]; then
echo "creating" $i "directory"
mkdir $path/$i
# fi
grep $i file >> $path/$i/output.txt
done
echo "created the files in "$path/$TEMP/output.txt
You've gone wrong in two places: looping over the output of cat file.txt with a for loop, and leaving your variable expansions unquoted. Read the file with a while read loop and quote the expansions:
while read i
do
echo "Creating $i directory"
mkdir "$path/$i"
grep "$i" file >> "$path/$i"/output.txt
done < file.txt
echo "created the files in $path/$TEMP/output.txt"
mkdir will refuse to create a directory if parts of its path do not exist.
e.g. if there is no /foo/bar directory, then mkdir /foo/bar/baz will fail.
you can relax this a bit by using the -p flag, which will create parent directories if necessary (in the example, it might create /foo and /foo/bar).
you should also use quotes, in case your paths contain blanks.
mkdir -p "${path}/${i}"
finally, make sure that you are actually allowed to create directories in $path
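For that last point, a small check you could add near the top of the script (assuming $path has already been set):
if [ ! -d "$path" ] || [ ! -w "$path" ]; then
    echo "cannot create directories under: $path" >&2
    exit 1
fi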

bash - recursive script can't see files in sub directory

I have a recursive script which iterates over a list of names, some of which are files and some of which are directories.
If it's a (non-empty) directory, I should call the script again with all of the files in the directory and check if they are legal.
The part of the code making the recursive call:
if [[ -d $var ]] ; then
if [ "$(ls -A $var)" ]; then
./validate `ls $var`
fi
fi
The part of code checking if the files are legal:
if [[ -f $var ]]; then
some code
fi
But after making the recursive calls, I can no longer check any of the files inside that directory: because they are not in the same directory as the main script, the -f $var test cannot see them.
Any suggestion how can I still see them and use them?
Why not use find? Simple and easy solution to the problem.
Always quote variables; you never know when you will find a file or directory name with spaces.
shopt -s nullglob
if [[ -d "$path" ]] ; then
contents=( "$path"/* )
if (( ${#contents[@]} > 0 )); then
"$0" "${contents[@]}"
fi
fi
you're re-inventing find
of course, var is a lousy variable name
if you're recursively calling the script, you don't need to hard-code the script name.
you should consider putting the logic into a function in the script, and the function can recursively call itself, instead of having to spawn an new process to invoke the shell script each time. If you do this, use $FUNCNAME instead of "$0"
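Here is a minimal sketch of that last suggestion; validate_tree is a made-up name and the echo stands in for the real legality check:
#!/bin/bash
shopt -s nullglob                    # empty directories expand to nothing

validate_tree() {
    local entry
    for entry in "$@"; do
        if [[ -d $entry ]]; then
            "$FUNCNAME" "$entry"/*   # recurse without spawning another copy of the script
        elif [[ -f $entry ]]; then
            echo "checking: $entry"  # placeholder for the real check
        fi
    done
}

validate_tree "$@"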
A few people have mentioned how find might solve this problem, I just wanted to show how that might be done:
find /yourdirectory -type f -exec ./validate {} +
This will find all regular files in yourdirectory and recursively in all its sub-directories, and pass their paths as arguments to ./validate. The {} is expanded to the paths of the files that find locates within yourdirectory. The + at the end means that each call to validate receives as many file paths as possible, instead of validate being called once per file (which is what the \; terminator does); this sometimes provides a huge speedup.
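Spelling out the two forms (using the same yourdirectory and ./validate names as above):
find /yourdirectory -type f -exec ./validate {} \;   # one ./validate call per file
find /yourdirectory -type f -exec ./validate {} +    # one call per large batch of file paths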
One option is to change directory (carefully) into the sub-directory:
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ./validate $(ls))
fi
fi
The outer parentheses start a new shell so the cd command does not affect the main shell. The exec replaces the original shell with (a new copy of) the validate script. Using $(...) instead of back-ticks is sensible. In general, it is sensible to enclose variable names in double quotes when they refer to file names that might contain spaces (but see below). The $(ls) will list the files in the directory.
Heaven help you with the ls commands if any file names or directory names contain spaces; you should probably be using * glob expansion instead. Note that a directory containing a single file with a name such as -n would trigger a syntax error in your script.
Corrigendum
As Jens noted in a comment, the location of the shell script (validate) has to be adjusted as you descend the directory hierarchy. The simplest mechanism is to have the script on your PATH, so you can write exec validate or even exec $0 instead of exec ./validate. Failing that, you need to adjust the value of $0 — assuming your shell leaves $0 as a relative path and doesn't mess around with converting it to an absolute path. So, a revised version of the code fragment might be:
# For validate on PATH or absolute name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec $0 $(ls))
fi
fi
or:
# For validate not on PATH and relative name in $0
if [[ -d "$var" ]] ; then
if [ "$(ls -A $var)" ]; then
(cd "$var"; exec ../$0 $(ls))
fi
fi
