Bash to select oldest folder in directory and write to log

In the bash script below, the oldest folder in a directory is selected. If there are 3 folders in the directory /home/cmccabe/Desktop/NGS/test and nothing is done to them (i.e. no files deleted or renamed), then the script correctly identifies f1 as the oldest. However, if something is done to one of the folders, the script identifies f2 as the oldest. I am not sure why, or how to prevent that from happening. Thank you :).
folders in directory
f1
f2
f3
Bash
# oldest folder used analysis and version log created
dir=/home/cmccabe/Desktop/NGS/test
{
read -r -d $'\t' time && read -r -d '' filename
} < <(find "$dir" -maxdepth 1 -mindepth 1 -printf '%T+\t%P\0' | sort -z )
printf 'The oldest folder is %s, created on %s and analysis done using v1.3 by %s at %s\n' "$filename" "$time" "$USER" "$(date "+%D %r")" >> /home/cmccabe/Desktop/NGS/test/log
echo "$filename"

Your idea of using find is right, but it needs a little tweaking, like this:
$ IFS= read -r -d $'\0' line < <(find . -maxdepth 1 -type d -printf '%T# %p\0' \
2>/dev/null | sort -z -n)
$ printf "The oldest directory: %s\n" "${line#* }"
Similar to the one answered here.

When you add, remove, or rename a file in a folder, the modification time of the folder is updated, because the folder's list of entries changed. The creation date of a folder is generally not saved on Linux filesystems. See this question for more information: How to get file creation date/time in Bash/Debian?
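This behavior is easy to observe directly. A minimal sketch using throwaway directories (GNU `stat -c %Y` assumed for the epoch mtime):

```shell
# Creating an entry inside a directory bumps the directory's own mtime,
# which is why the "oldest" folder can change after files are touched.
tmp=$(mktemp -d)
mkdir "$tmp/d"
before=$(stat -c %Y "$tmp/d")   # directory mtime as epoch seconds
sleep 1
touch "$tmp/d/newfile"          # adding an entry modifies the directory
after=$(stat -c %Y "$tmp/d")
[ "$after" -gt "$before" ] && echo "directory mtime updated"
rm -rf "$tmp"
```

Editing the *contents* of an existing file, by contrast, does not touch the parent directory's mtime; only changes to the set of entries do.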

Related

How can I delete files and folders older than 30 days at the same time in Unix

How can I delete files and folders older than 30 days at the same time in Unix?
But I need to make sure that folders containing files newer than 30 days are kept from deletion.
To delete files older than 30 days you can use (the -exec form avoids the word-splitting problems that rm -f $(find ...) would have with spaces in names):
find . -type f -mtime +30 -exec rm -f {} +
After that, you can find and log the directories that no longer contain any files:
find . -type d | while read -r line; do if [ "$(ls -A "$line" | wc -l)" -eq 0 ]; then echo "$line"; fi; done > logerr.txt
Then you drop these:
rm -r $(find . -type d | while read -r line; do if [ "$(ls -A "$line" | wc -l)" -eq 0 ]; then echo "$line"; fi; done)
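If GNU find is available, the empty-directory detection can be done with its built-in -empty test instead of the ls | wc loop. A sketch against throwaway directories:

```shell
# GNU find can locate and delete empty directories directly.
tmp=$(mktemp -d)
mkdir -p "$tmp/keep" "$tmp/drop"
touch "$tmp/keep/file"
find "$tmp" -mindepth 1 -type d -empty > "$tmp/logerr.txt"  # log the empty ones
find "$tmp" -mindepth 1 -type d -empty -delete              # then remove them
ls "$tmp"                                                   # only keep/ (and the log) remain
rm -rf "$tmp"
```

-delete implies depth-first traversal, so nested empty directories are handled too.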
Let me know if it was helpful.
Thank you

Automator/AppleScript: Move files with the same prefix to a new folder. The folder name must be the files' prefix

I'm a photographer and I have multiple jpg files of clothing in one folder. The file name structure is:
TYPE_FABRIC_COLOR (Example: BU23W02CA_CNU_RED, BU23W02CA_CNU_BLUE, BU23W23MG_LINO_WHITE)
I have to move files of the same TYPE (BU23W02CA) into one folder named after the TYPE.
For example:
MAIN FOLDER>
BU23W02CA_CNU_RED.jpg, BU23W02CA_CNU_BLUE.jpg, BU23W23MG_LINO_WHITE.jpg
Became:
MAIN FOLDER>
BU23W02CA_CNU > BU23W02CA_CNU_RED.jpg, BU23W02CA_CNU_BLUE.jpg
BU23W23MG_LINO > BU23W23MG_LINO_WHITE.jpg
Here are some scripts.
V1
#!/bin/bash
find . -maxdepth 1 -type f -name "*.jpg" -print0 | while IFS= read -r -d '' file
do
    # Extract the directory name
    dirname=$(echo "$file" | cut -d'_' -f1-2 | sed 's#\./\(.*\)#\1#')
    #DEBUG echo "$file --> $dirname"
    # Create it if not already existing
    if [[ ! -d "$dirname" ]]
    then
        mkdir "$dirname"
    fi
    # Move the file into it
    mv "$file" "$dirname"
done
it assumes all files that the find lists are of the format you described in your question, i.e. TYPE_FABRIC_COLOR.ext.
dirname is the extraction of the first two words delimited by _ in the file name.
since find lists the files with a ./ prefix, it is removed from the dirname as well (that is what the sed command does).
the find specifies the name of the files to consider as *.jpg. You can change this to something else, if you want to restrict which files are considered in the move.
this version loops through each file, creates a directory from its first two sections (if it does not already exist), and moves the file into it.
if you want to see what the script is doing to each file, you can add option -v to the mv command. I used it to debug.
However, since it loops through each file one by one, this might take time with a large number of files, hence this next version.
V2
#!/bin/bash
while IFS= read -r dirname
do
    echo ">$dirname"
    # Create it if not already existing
    if [[ ! -d "$dirname" ]]
    then
        mkdir "$dirname"
    fi
    # Move the files into it
    find . -maxdepth 1 -type f -name "${dirname}_*" -exec mv {} "$dirname" \;
done < <(find . -maxdepth 1 -type f -name "*.jpg" -print | sed 's#^\./\(.*\)_\(.*\)_.*\..*$#\1_\2#' | sort | uniq)
this version loops on the directory names instead of on each file.
the last line does the "magic". It finds all files, and extracts the first two words (with sed) right away. Then these words are sorted and "uniqued".
the while loop then creates each directory one by one.
the find inside the while loop moves all files that match the directory being processed into it. Why did I not simply do mv ${dirname}_* ${dirname}? Because the expansion of the * wildcard could result in an argument list too long for the mv command. Doing it with find ensures that it will work even on a LARGE number of files.
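The same batching can also be had from find itself with -exec ... {} +, which packs as many file names into each mv invocation as the kernel's argument-length limit allows. A sketch with throwaway files (mv -t is GNU coreutils; the sample names are from the question):

```shell
# -exec ... {} + runs one mv per batch of names, not one mv per file.
# mv -t puts the target directory first, which the {} + form requires.
tmp=$(mktemp -d)
mkdir "$tmp/BU23W02CA_CNU"
touch "$tmp/BU23W02CA_CNU_RED.jpg" "$tmp/BU23W02CA_CNU_BLUE.jpg"
find "$tmp" -maxdepth 1 -type f -name 'BU23W02CA_CNU_*' \
    -exec mv -t "$tmp/BU23W02CA_CNU" {} +
ls "$tmp/BU23W02CA_CNU"
rm -rf "$tmp"
```

Compared with -exec ... \;, this avoids spawning one mv process per file, which matters at the LARGE-file-count scale discussed above.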
Suggesting oneliner awk script:
echo "$(ls -1 *.jpg)"| awk '{system("mkdir -p "$1 OFS $2);system("mv "$0" "$1 OFS $2)}' FS=_ OFS=_
Explanation:
echo "$(ls -1 *.jpg)": List all jpg files in current directory one file per line
FS=_ : Set awk field separator to _ $1=type $2=fabric $3=color.jpg
OFS=_ : Set awk output field separator to _
awk script explanation
{ # for each file name from list
system ("mkdir -p "$1 OFS $2); # execute "mkdir -p type_fabric"
system ("mv " $0 " " $1 OFS $2); # execute "mv current-file to type_fabric"
}

Bash script to copy *.log files into a new directory

In my folder there are different files like this:
stats.log
move_2021-05-24.log
sync_2021-05-24.log
application.log
I want to copy all *.log files with a date other than today to a specific folder.
My current script looks like this, but it does not work as I thought. It currently moves all log files, I think, and not just log files with a date older than today's date.
cd /share/CACHEDEV1_DATA/app
for file in *.log
do
day=$(echo ${file} | cut -d"-" -f3)
now="$(date +'%d')"
if [ "$day" != "$now" ];
then
mv ${file} ~/share/CACHEDEV1_DATA/rclone/logs/
fi
done
I would be glad if I could get advice on how my script would need to look like to work correctly.
I hope you consider logrotate. It can do everything you need and more.
But if you want to roll your own, here is how you can find files older than a day and move them. Note: this will overwrite files with the same name at the destination. I added an echo statement before mv so you can see if it looks good to you.
find /share/CACHEDEV1_DATA/app -maxdepth 1 -type f -mtime +0 -print0 | \
while IFS= read -r -d '' file; do
echo mv "$file" ~/share/CACHEDEV1_DATA/rclone/logs/
done
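If you specifically want the decision to come from the date embedded in the file name rather than the modification time, a sketch (assuming names ending in _YYYY-MM-DD.log, as in the question; app_${today}.log is an illustrative stand-in for a file dated today):

```shell
# Extract the text between the last "_" and ".log" and compare it to today.
today=$(date +%F)
for file in move_2021-05-24.log sync_2021-05-24.log "app_${today}.log" stats.log; do
    d=${file##*_}; d=${d%.log}        # "move_2021-05-24.log" -> "2021-05-24"
    case $d in
        [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]) ;;  # looks like a date
        *) continue ;;                # stats.log and friends are skipped
    esac
    if [ "$d" != "$today" ]; then
        echo "would move: $file"
    fi
done
```

The case guard is what keeps undated files like stats.log from being swept up, which is where the cut -d"-" -f3 approach in the question goes wrong.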

Removing old folders in bash backup script

I have a bash script that rsyncs files onto my NAS to the directory below:
mkdir /backup/folder_`date +%F`
How would I go about writing a cleanup script that removes directories older than 7 days old based upon the date in directories name?
#!/bin/bash
shopt -s extglob
OLD=$(exec date -d "now - 7 days" '+%s')
cd /backup || exit 1 ## If necessary.
while read DIR; do
if read DATE < <(exec date -d "${DIR#*folder_}" '+%s') && [[ $DATE == +([[:digit:]]) && DATE -lt OLD ]]; then
echo "Removing $DIR." ## Just an example message. Or we could just exclude this and add -v option to rm.
rm -ir "$DIR" ## Change to -fr to skip confirmation.
fi
done < <(exec find -maxdepth 1 -type d -name 'folder_*')
exit 0
We could actually use more careful approaches like -rd $'\0', -print0 and IFS= but I don't think they are really necessary this time.
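For completeness, a sketch of what that NUL-safe variant would look like, run here against throwaway directories (GNU date assumed; plain rm -r instead of rm -ir so the sketch runs unattended):

```shell
# Same logic as the script above, but with -print0 and read -d ''
# so directory names containing newlines cannot break the loop.
tmp=$(mktemp -d)
mkdir "$tmp/folder_2000-01-01" "$tmp/folder_$(date +%F)"
OLD=$(date -d "now - 7 days" '+%s')
cd "$tmp" || exit 1
while IFS= read -r -d '' DIR; do
    DATE=$(date -d "${DIR#*folder_}" '+%s' 2>/dev/null) || continue
    if [ "$DATE" -lt "$OLD" ]; then
        echo "Removing $DIR."
        rm -r "$DIR"
    fi
done < <(find . -maxdepth 1 -type d -name 'folder_*' -print0)
ls "$tmp"    # only today's folder remains
```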
Create a list of folders with the pattern you want to remove, remove the folders you want to keep from the list, delete everything else.
How about a simple find:
find /backup -maxdepth 1 -name 'folder_*' -type d -ctime +7 -exec rm -rf {} +

Folder Creation Subtract file number?

I haven't been able to find an answer that best suits my needs, and I apologize if someone is able to find it easily.
I have a script that works to move files into folders based on their names. It worked perfectly until I realized that the files were missing their extension (another script was responsible for the file naming, based on an email subject line). Once I fixed this problem, it then started making a folder for each file. Is there any way I can make this script drop everything in the folder name after the first (.)?
Here is the script
#!/bin/bash
#folder script
#Benjamin D. Schran
MAIN_DIR=/PGHWH1/Photos
cd $MAIN_DIR
find . -maxdepth 1 -type f > SCRIPT_LOG1
find . -name '* *' | while read -r fname
do
new_fname=$(echo "$fname" | tr " " "_")
if [ -e "$new_fname" ]
then
echo "File $new_fname already exists. Not replacing $fname"
else
echo "Creating new file $new_fname to replace $fname"
mv "$fname" "$new_fname"
fi
done
find . -maxdepth 1 -type f | while read file;
do
f=$(basename "$file")
f1=${f%.*}
if [ -d "$f1" ];
then
mv "$f" "$f1"
else
mkdir "$f1"
chmod 777 "$f1"
mv "$f" "$f1"
fi
done
SCRIPTLOG=Script_log.$(date +%Y-%m-%d-%H-%M)
find . -type f > SCRIPT_LOG2
cd /PGHWH1/bin
sh scriptlog.sh > $SCRIPTLOG.html
mv $SCRIPTLOG.html /PGHWH1/log
rm $MAIN_DIR/SCRIPT_LOG1 $MAIN_DIR/SCRIPT_LOG2
What I need it to do is to take files that look like
Filename-date.%.jpg
and make
Foldername-date
then move the files of
Filename-date.1.jpg
Filename-date.2.jpg
Filename-date.3.jpg
to the appropriate folder
Foldername-date
but the current output is
Foldername-date.1
Foldername-date.2
Foldername-date.3
Any help at all would be appreciated
The following lines do the job in my bash:
#first create a tmp file with unique directory names
ls *.jpg | awk -F'.' '{print $1}' | uniq > dirs
#second create the directories
mkdir -p $(cat dirs)
#third move the files
for i in $(cat dirs); do mv "$i"*.jpg "$i"/; done
#(optionally) remove the tmp file
rm dirs
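An equivalent without the temporary file, using the %%.* parameter expansion to keep only the part before the first dot (shown here against throwaway files; the sample names are illustrative):

```shell
# ${f%%.*} strips the longest suffix starting at a dot, i.e. everything
# after the FIRST ".", so "Filename-date.1.jpg" yields "Filename-date".
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch Filename-date.1.jpg Filename-date.2.jpg Other-date.1.jpg
for f in *.jpg; do
    dir=${f%%.*}
    mkdir -p "$dir"
    mv "$f" "$dir/"
done
ls "$tmp"    # Filename-date  Other-date
```

Note that the script in the question uses ${f%.*}, which strips only after the LAST dot and therefore produces the unwanted Foldername-date.1 style names.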
