BASH: Using a pipe to create a variable [duplicate]

I have a bash script that finds files in a certain directory older than a certain date and deletes them. It works fine, and I'm able to echo the number of deleted files, but I'm having problems getting that integer into a variable.
#!/bin/bash
# Make this dynamic to look at different directories.
pathtofolder=/var/www/website/temp2/
if [ $hours ]; then
# To specify older than one day it is better to talk in hours because
# the days integer is just an integer so everything less than 2 days
# would be 1 day, so 1 day 23 hours and 59 minutes is not greater than
# 1 day.
# For this reason I am using mmin and using the hours in minutes.
timeinmins=$(($hours*60))
elif [ $mins ]
then
timeinmins=$mins
else
# The default is 24 hours but we want to test with 24 minutes
timeinmins=24
fi
find "$pathtofolder"* -mmin +$timeinmins -exec rm -vr {} \; | output="$(wc -l)"
echo "Files deleted: $output"
echo "Minutes: $timeinmins"
In the above case, the $output is blank.
But the following works, just to echo it...
find "$pathtofolder"* -mmin +$timeinmins -exec rm -vr {} \; | echo "Files deleted: $(wc -l)"
Any ideas? Thanks in advance.
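
For reference, each segment of a pipeline runs in its own subshell, so the assignment to output on the right-hand side of the | is lost as soon as that subshell exits. The usual fix is to capture the whole pipeline with a command substitution instead; a minimal sketch reusing the question's variables:
# rm -v prints one line per removed file, so wc -l counts the deletions
output=$(find "$pathtofolder"* -mmin +"$timeinmins" -exec rm -vr {} \; | wc -l)
echo "Files deleted: $output"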

Related

Bash script find if file is created less then X time ago

I want to have a bash script which tells me if a file was created less than 1 hour ago.
Any suggestion on how I can achieve that?
You can use the find command with the -mmin option.
if [ "$(find path/to/file -mmin -60 | wc -c)" -gt 0 ]; then
echo "found" # file was modified within the past hour
fi
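
A variation on the same idea that stops at the first match (a sketch; -print -quit is GNU find, so other finds may need the wc -c form above):
if [ -n "$(find path/to/file -mmin -60 -print -quit)" ]; then
echo "found" # non-empty output means the file was modified in the past hour
fi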

Move file that has aged x minutes [duplicate]

I have a recurring process that checks whether a file has aged x minutes. When it has, I move the file to a new directory.
However, I noticed that files are being moved instantly. Any ideas what could be causing the issue?
# Expected age time = 10 minutes
EXPECTED_AGE_TIME=10
# How long the file has actually aged
ACTUAL_AGE_TIME=$((`date +%s` - `stat -L --format %Y $FILE`))
if [[ $ACTUAL_AGE_TIME > $((EXPECTED_AGE_TIME * 60)) ]]; then
mv $FILE ./loaded/$FILE
fi
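For reference, the likely root cause: inside [[ ]], the > operator compares strings lexicographically rather than numerically, so for example "9" sorts after "600" and a 9-second-old file looks "older" than the 600-second threshold. A numeric test needs -gt or an arithmetic context; a minimal sketch with the same variables:
if (( ACTUAL_AGE_TIME > EXPECTED_AGE_TIME * 60 )); then
mv -- "$FILE" "./loaded/$FILE"
fi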
Building on the comment above to use find: apply find to the single file.
find "$FILE" -mmin +10 -exec mv '{}' ../loaded/ \;
This eliminates the messy date math, date formatting, and so on.
Checking the relative age of files can be done with Bash's built-in file date comparison operator -ot.
See help test:
FILE1 -nt FILE2 True if file1 is newer than file2 (according to modification date).
FILE1 -ot FILE2 True if file1 is older than file2.
#!/usr/bin/env bash
declare -- TIME_FILE
TIME_FILE="$(mktemp)" || exit 1 # Failed to create temp-file
trap 'rm -- "$TIME_FILE"' EXIT # Purge the temp-file on exit
declare -i EXPECTED_AGE_TIME=10
# Set the time of the reference $TIME_FILE to $EXPECTED_AGE_TIME minutes ago
touch --date "$((EXPECTED_AGE_TIME)) min ago" "$TIME_FILE"
# If $FILE is older than $TIME_FILE, then move it
[[ "$FILE" -ot "$TIME_FILE" ]] && mv -- "$FILE" "./loaded/$FILE"

How to delete files older than 30 days based on the date in the filename [duplicate]

I have CSV files that get updated every day; we process the files and delete those older than 30 days based on the date in the filename.
Example filenames :
XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
I would like to schedule the job in crontab to delete 30 days older files daily.
Path could be /mount/store/
XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
if [ $(date -d '-30 days' +%Y%m%d) -gt $D ]; then
rm -rf $D
fi
The above script doesn't seem to help me. Kindly help me with this.
I have been trying this for the last two days.
Using CentOS 7.
Thanks.
For all files:
- extract the date
- touch the file with that date
- delete files with the -mtime option
Do this in the desired dir for all files:
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
d=$(echo "$f" | sed -r 's/[^_]+_[^_]+_(20[0-9]{6})\.[0-9]{6}_.\.csv/\1/')
touch -d "$d" "$f"
After performing that for the whole dir, delete the older-thans:
find YourDir -type f -mtime +30 -name "*.csv" -delete
GNU find has the -delete option. Other finds might need -exec rm ... instead.
Test before running. Another pitfall is the different kinds of file dates affected by touch (mtime, ctime, atime).
Test, manipulating the date with touch:
touch XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv; d=$(echo $f | sed -r 's/[^_]+_[^_]+_(20[0-9]{6})\.[0-9]{6}_.\.csv/\1/'); touch -d $d $f
ls -l $f
-rw-rw-r-- 1 stefan stefan 0 Oct 1 00:00 XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
An efficient way to extract the date from the filename is to use parameter expansions:
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
d=${f%%.*} # removes largest suffix .*
d=${d##*_} # removes largest prefix *_
Or use a bash-specific regex:
if [[ $f =~ [0-9]{8} ]]; then echo "$BASH_REMATCH"; fi
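For illustration, the expansion-based extraction can be wired into the touch-and-find approach above; a sketch assuming the /mount/store path from the question and the same filename layout:
for f in /mount/store/*.csv; do
d=${f%%.*} # the path holds no dots, so this cuts at the filename's first dot
d=${d##*_} # leaves the 8-digit date, e.g. 20171001
touch -d "$d" "$f"
done
find /mount/store -type f -name "*.csv" -mtime +30 -delete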
Here is a solution if you have dgrep from dateutils.
ls *.csv | dateutils.dgrep -i '%Y%m%d' --le $(date -d "-30 day" +%F) | xargs -d '\n' rm
First we can use either ls or find to obtain a list of filenames. We can then pipe the results to dgrep to filter the filenames containing a date string that matches our condition (in this case, older than 30 days). Finally, we pipe the result to xargs rm to remove all the matched files.
-i '%Y%m%d' input date format as specified in your filename
--le $(date -d "-30 day" +%F) filter dates that are older than 30 days
You can change rm to printf "%s\n" to test the command before actually deleting anything.
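For instance, that dry run might look like this (same pipeline, printf in place of rm):
ls *.csv | dateutils.dgrep -i '%Y%m%d' --le "$(date -d '-30 day' +%F)" | xargs -d '\n' printf '%s\n'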
The following approach does not look at any generation-time information of the file; it assumes the date in the filename is unrelated to the day the file was created.
#!/usr/bin/env bash
d=$(date -d "-30 days" "+%Y%m%d")
for file in /yourdir/*csv; do
date=${file:$((${#file}-21)):8} # the 8-digit date starts 21 characters from the end
(( date < d )) && rm "$file"
done
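
To schedule the cleanup daily via crontab as the question asks, a sketch along these lines should work; /usr/local/bin/cleanup_csv.sh is a hypothetical script holding whichever snippet above you settle on:
# m h dom mon dow command -- run every day at 02:00
0 2 * * * /usr/local/bin/cleanup_csv.sh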

Boolean check if a file has been opened in the past hour

I am trying to write a crontab job that checks the files in a specified directory and tests whether each was opened in the past hour.
#!/bin/bash
for F in /My/Path/*.txt; do
if [ ***TEST IF FILE WAS OPENED IN THE PAST HOUR*** ]
then
echo "$F"
fi
done
Thanks for any help.
This can be done with a simple find:
find /path/to/directory -type f -newermt "1 hours ago"
The -newermt test looks at modification time, so any files modified within the past hour will print to stdout. No need to loop and print.
#!/bin/bash
RECENT_FILES=$(find /path/to/directory -type f -newermt "1 hours ago")
if [[ -n $RECENT_FILES ]]; then
echo "$RECENT_FILES"
else
echo "No recently modified files found in dir"
fi
You can always redirect the results to a log file if you're trying to compile a list as well:
find /path/to/directory -type f -newermt "1 hours ago" >> "$yourLogFile"
A more rigorous approach uses GNU date, which has the option -r:
-r, --reference=FILE
display the last modification time of FILE
Using the above, incorporated into your script:
#!/bin/bash
for filename in /My/Path/*.txt ; do
if (( (($(date +%s) - $(date -r "$filename" +%s))/60) <= 60 )); then
echo "$filename"
fi
done
The logic is straightforward: we compute the file's age by subtracting the file's modification time (in epoch seconds) from the current time and converting to minutes. If the file was modified within the last 60 minutes, it is printed.

bash script to delete backups older than 7 days by retaining 1 backup each day [duplicate]

We have a backup strategy where six DB dumps are taken every day. We want to delete backups that are older than 7 days, but we also want to retain a single backup for each day. The backup files are in the format 2015_08_09_01_00_01.sql.gz.
Any help would be appreciated
I tried the commands below, which worked just fine.
find . -mtime +7 -mtime -24 | sort -n > testbackups.txt
sort -u -t_ -k5,5 testbackups.txt > testbackups2.txt
grep -v -x -f testbackups2.txt testbackups.txt > delbackups7.txt
while read -r file; do rm ~/"$file"; done < delbackups7.txt
I am listing the files between 7 and 24 days old, retaining one backup out of the six for each day, and then deleting the remaining files.
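For comparison, a more compact variant of the same idea (a sketch assuming the backups sit in the current directory and follow the 2015_08_09_01_00_01.sql.gz naming):
# keep the first dump of each day, delete the rest of the 7-to-24-day-old files
find . -maxdepth 1 -name '*.sql.gz' -mtime +7 -mtime -24 | sort |
awk -F_ '{ day = $1 "_" $2 "_" $3 } seen[day]++' |
while IFS= read -r file; do rm -- "$file"; done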
In Linux Bash, you'll want something like /usr/bin/find /PATH/TO/BACKUPS/ -type d -mtime +6 -exec rm -r {} \;. That uses the find command to locate entries under /PATH/TO/BACKUPS/ that are older than 6 days and remove them; -type d restricts the match to directories.
