Shell - Delete files in directory after 3 months

All,
I need to delete files within a directory that are older than a certain date. My function is called in this format:
Example file: my_test_file_2015_04_01.log
Example function
rem_files $my_directory my_test_file_*.log
How would I remove all files older than 3 months?
function rem_files
{
?????????
}

If you can trust the files' modification time, you can use find -name 'my_test_file_*.log' -mtime +90 to find files older than 90 days.
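A minimal sketch of such a function, assuming 3 months is approximated as 90 days and that the caller quotes the pattern so the shell does not expand it first:

function rem_files
{
    # $1 = directory to search, $2 = filename pattern
    # delete regular files in that directory whose modification time is older than 90 days
    find "$1" -maxdepth 1 -type f -name "$2" -mtime +90 -delete
}

# usage: quote the pattern so the glob reaches find unexpanded
rem_files "$my_directory" 'my_test_file_*.log'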

Related

making text files containing list of files in different directories with random name in bash

I have a script which makes some files in pairs (all the resulting files have either R1 or R2 in the file name) and then puts each pair in a separate directory with a random name (different every time, so I cannot predict it). For example, if I have 6 files, they form 3 pairs of file_R1.txt / file_R2.txt like this:
s1_R1.txt and s1_R2.txt
s2_R1.txt and s2_R2.txt
s3_R1.txt and s3_R2.txt
In this example I will have 3 directories (one per pair of files). I want to make 2 text files (d1.txt and d2.txt) containing the above file names: d1.txt will have all the files with R1 and d2.txt will contain all the files with R2. To do so, I wrote the following short bash code, but it does not return what I want. Do you know how to fix it?
For file in /./*R*.txt;
do;
touch "s1.txt"
touch "s2.txt"
echo "${fastq}" >> "s1.txt"
done
Weird question, not sure I get it, but for your d1 and d2 files:
#!/bin/bash
find . -type f -name "*R1*" -print >d1.txt
find . -type f -name "*R2*" -print >d2.txt
find is used since you have files under different sub-directories.
Using > ensures the text files are emptied out if they already exist.
Note the code you put in your question is not valid bash.
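For reference, a corrected version of that loop might look like the sketch below; it assumes the files sit in the current directory, which is exactly the limitation that makes find the better tool here:

#!/bin/bash
# append each matching file name to s1.txt (R1 files) or s2.txt (R2 files)
for file in ./*R*.txt; do
    case "$file" in
        *R1*) echo "$file" >> s1.txt ;;
        *R2*) echo "$file" >> s2.txt ;;
    esac
done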

How to delete specific files in unix

We have a few files on our server instance under the /wslogs/instance_name directory, and these are all log files created on a daily basis.
I am looking for a script to automatically delete those files based on date.
So let's say delete files older than 10 days. The problem is that the filenames are not purely in date format; rather, they are
hostname_%m%d%Y_access.log and hostname_%m%d%Y_error.log
For example, ra70960708_12042016_access.log and ra70960708_12042016_error.log (where ra70960708 is the server name or hostname).
I'm trying to use the rm command, but I am unable to figure out how to specify the files if, say, I have to delete those which are more than 10 days older than the current date.
Any help would be greatly appreciated.
Cheers,
Ashley
Forget about the name, and use the modification time instead:
The command below will list the files in the current directory that match the glob hostname_*_error.log and were last modified more than 10 days ago:
find . -maxdepth 1 -mindepth 1 \
-type f -name 'hostname_*_error.log' \
-mtime +10
They can then be deleted with -delete.
. is the directory to search in.
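Once the listing looks right, appending -delete removes the files. A sketch covering both log types, assuming the /wslogs/instance_name path from the question:

# delete both access and error logs older than 10 days, top level only
find /wslogs/instance_name -maxdepth 1 -mindepth 1 \
    -type f \( -name '*_access.log' -o -name '*_error.log' \) \
    -mtime +10 -delete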

Count files and folders in directory (ignore hidden files) [duplicate]

This has probably been answered before, but I cannot find it.
How do I count the files and directories in a directory without including subdirectories? Also, hidden files should be ignored (because this folder is a git repo).
More precisely, I need an if clause that checks that the current folder contains only one file (namely "script.sh") and no sub-folders, with the exception of .git/.
How can I do this in Bash?
EDIT: In contrast to Recursively count specific files BASH, I want to ignore the .git folder and I do not want to count files and folders in subfolders.
find should help:
if [ "$(cd /some/dir && find * -maxdepth 0 -type f)" == "script.sh" ]; then
* expands to only the non-hidden entries, and -maxdepth 0 stops find from descending into subdirectories.
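A fuller sketch of the check, assuming it should also confirm there are no visible sub-directories (the .git folder is hidden, so * already skips it):

cd /some/dir || exit 1
# non-hidden regular files at the top level only
files=$(find * -maxdepth 0 -type f 2>/dev/null)
# non-hidden directories at the top level only
dirs=$(find * -maxdepth 0 -type d 2>/dev/null)
if [ "$files" == "script.sh" ] && [ -z "$dirs" ]; then
    echo "only script.sh here, no visible sub-folders"
fi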

Delete files in a directory older than the current month in shell

I want to know how to delete files in a directory that are older than the current month. I have tried to do this, but without success.
Assuming 30 days per month, use find -mtime +30:
-mtime n
File's data was last modified n*24 hours ago. See the comments
for -atime to understand how rounding affects the interpretation
of file modification times.
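A minimal sketch, assuming a hypothetical /path/to/logs directory and that "older than the current month" is approximated as more than 30 days old:

# list the candidates first
find /path/to/logs -maxdepth 1 -type f -mtime +30
# then delete them once the listing looks right
find /path/to/logs -maxdepth 1 -type f -mtime +30 -delete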

Bash script to find file older than X days, then subsequently delete it, and any files with the same base name?

I am trying to figure out a way to search a directory for a file older than 365 days. If it finds a match, I'd like it to both delete the file and locate any other files in the directory that have the same basename, and delete those as well.
File name examples: 12345.pdf (the file to search for and delete), plus 12345_a.pdf and 12345_xyz.pdf (delete these if they exist).
Thanks! I am very new to BASH scripting, so patience is appreciated ;-))
I doubt this can be done cleanly in a single pass.
Your best bet is to use -mtime or a variant to collect names and then use another find command to delete files matching those names.
UPDATE
With respect to your comment, I mean something like:
# find basenames of old files
find .... -printf '%f\n' | sort -u > oldfiles
for file in $(< oldfiles); do find . -name "$file" -exec rm {} \; ; done
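Put together, a rough sketch of that two-pass approach; the /path/to/dir directory is hypothetical, the 365-day cutoff is from the question, and it assumes the "primary" files (like 12345.pdf) contain no underscore:

#!/bin/bash
# pass 1: collect base names (12345 from 12345.pdf) of primary files older than 365 days
find /path/to/dir -maxdepth 1 -type f -name '*.pdf' ! -name '*_*' -mtime +365 -printf '%f\n' \
    | sed 's/\.pdf$//' | sort -u > oldfiles

# pass 2: delete each primary file plus anything sharing its base name (12345_a.pdf, 12345_xyz.pdf, ...)
while IFS= read -r base; do
    find /path/to/dir -maxdepth 1 -type f -name "${base}*.pdf" -exec rm {} +
done < oldfiles

rm -f oldfiles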
