In my folder there are different files like this:
stats.log
move_2021-05-24.log
sync_2021-05-24.log
application.log
I want to copy all *.log files dated on a day other than today to a specific folder.
My current script looks like this, but it does not work as I expected. I think it currently moves all log files, not just the log files with a date older than today's.
cd /share/CACHEDEV1_DATA/app
for file in *.log
do
day=$(echo ${file} | cut -d"-" -f3)
now="$(date +'%d')"
if [ "$day" != "$now" ];
then
mv ${file} ~/share/CACHEDEV1_DATA/rclone/logs/
fi
done
I would be glad to get advice on what my script needs to look like to work correctly.
I hope you'll consider logrotate. It can do everything you need and more.
But if you want to roll your own, here is how you can find files older than a day and move them. Note: this will overwrite files with the same name at the destination. I put an echo in front of mv so you can check that the output looks right to you first.
find /share/CACHEDEV1_DATA/app -maxdepth 1 -type f -mtime +1 -print0 | \
while IFS= read -rd '' file; do
echo mv "$file" ~/share/CACHEDEV1_DATA/rclone/logs/
done
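If you would rather keep the filename-based check from your original script, here is a minimal sketch of that approach (assuming the date in the name is always YYYY-MM-DD right before .log, as in your examples, and reusing the destination path from your script), again with echo in front of mv so you can dry-run it first. Files without a date in the name, such as stats.log, are simply skipped:
cd /share/CACHEDEV1_DATA/app || exit 1
today=$(date +'%Y-%m-%d')
for file in *_*.log; do
    # strip everything up to the underscore and the .log suffix to get the date
    filedate=${file##*_}
    filedate=${filedate%.log}
    if [ "$filedate" != "$today" ]; then
        echo mv "$file" ~/share/CACHEDEV1_DATA/rclone/logs/
    fi
done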
Related
I am trying to write a shell script to do this.
I guess it needs a for loop or find syntax, but I am stuck on scanning every folder.
I have tried something like "find" -maxdepth 1 -name "*.jpg | mv " but failed.
For every jpg file in every dir (folder1, folder2, folder3...folder5...etc):
move the files to the target dir, which is the parent dir;
if a file name is duplicated, move it to the dup dir instead.
Something like
for f in folder*/*.jpg; do
if [ -e "$(basename "$f")" ]; then
mv "$f" dup/
else
mv "$f" .
fi
done
run from the parent directory. It just iterates over every jpg in the folder subdirectories, moving each one to one place or the other depending on whether a file with that name already exists.
Slightly more efficient bash version:
for f in folder*/*.jpg; do
if [[ -e ${f##*/} ]]; then
mv "$f" dup/
else
mv "$f" .
fi
done
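The ${f##*/} expansion gives the same result as basename, but it is handled by the shell itself, so no subshell or external basename process is forked for every file. A quick illustration:
f="folder1/photo.jpg"
echo "${f##*/}"            # photo.jpg - pure parameter expansion
echo "$(basename "$f")"    # photo.jpg - same result, but spawns basename each time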
In the bash script below, the oldest folder in a directory is selected. If there are 3 folders in the directory /home/cmccabe/Desktop/NGS/test and nothing is done to them (i.e. no files deleted or renamed), then the script correctly identifies f1 as the oldest. However, if something is done to a folder, then the script identifies f2 as the oldest. I am not sure why, or how to prevent that from happening. Thank you :).
folders in directory
f1
f2
f3
Bash
# oldest folder used analysis and version log created
dir=/home/cmccabe/Desktop/NGS/test
{
read -r -d $'\t' time && read -r -d '' filename
} < <(find "$dir" -maxdepth 1 -mindepth 1 -printf '%T+\t%P\0' | sort -z )
printf "The oldest folder is $filename, created on $time and analysis done using v1.3 by $USER at $(date "+%D %r")\n" >> /home/cmccabe/Desktop/NGS/test/log
echo "$filename"
Your idea of using find is right, but it needs a little tweaking, like this:
$ IFS= read -r -d $'\0' line < <(find . -mindepth 1 -maxdepth 1 -type d -printf '%T# %p\0' \
    2>/dev/null | sort -z -n)
$ printf "The oldest directory: %s\n" "${line#* }"
Similar to the one answered here.
When you change the contents of a folder (create, delete or rename a file in it), the modification date of the folder is updated. The creation date of a folder is not stored at all. See this question for more information: How to get file creation date/time in Bash/Debian?
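You can see that behaviour directly (assuming GNU stat; f1 is one of the folders from the question):
$ stat -c '%y' f1       # modification time of the folder before
$ touch f1/newfile      # create a new file inside f1
$ stat -c '%y' f1       # the folder's modification time has moved forward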
I take a backup of an important folder every day using cron. The backup is stored with the current date in its name.
Now my requirement is that I need to keep only the current day's and the last two days' backups.
i.e. I want to keep only:
test_2016-11-04.tgz
test_2016-11-03.tgz
test_2016-11-02.tgz
The remaining backups have to be deleted automatically. Please let me know how to do this in a shell script.
Below is my backup folder structure.
test_2016-10-30.tgz test_2016-11-01.tgz test_2016-11-03.tgz
test_2016-10-31.tgz test_2016-11-02.tgz test_2016-11-04.tgz
With ls -lrt | head -n -3 | awk '{print $9}'
you can print all but the 3 newest files in your directory.
Passing this output to rm, you obtain the desired result.
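Putting that together, something like the line below should do it when run inside the backup directory (GNU xargs assumed for -d and -r; parsing ls is fine here only because the test_YYYY-MM-DD.tgz names contain no spaces or newlines):
ls -lrt test_*.tgz | head -n -3 | awk '{print $9}' | xargs -r -d '\n' rm -f --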
You could append this to the end of your backup script:
find ./backupFolder -name "test_*.tgz" -mtime +3 -type f -delete
Or alternatively use this:
ls -1 test_*.tgz | sort -r | awk 'NR > 3 { print }' | xargs -d '\n' rm -f --
Generate an array on files you want to keep:
names=()
for d in {0..2}; do
names+=( "test_"$(date -d"$d days ago" "+%Y-%m-%d")".tgz" )
done
so that it looks like this:
$ printf "%s\n" "${names[@]}"
test_2016-11-04.tgz
test_2016-11-03.tgz
test_2016-11-02.tgz
Then, loop through the files and keep those that are not in the array:
for file in test_*.tgz; do
[[ ! ${names[*]} =~ "$file" ]] && echo "remove $file" || echo "keep $file"
done
If run on your directory, this would result in output like:
remove test_2016-10-30.tgz
remove test_2016-10-31.tgz
remove test_2016-11-01.tgz
keep test_2016-11-02.tgz
keep test_2016-11-03.tgz
keep test_2016-11-04.tgz
So now it is just a matter of replacing those echo calls with something more meaningful, like rm.
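For reference, the final loop with rm in place of the echo calls might look like this (keep the echo version until a dry run shows the right files being picked):
for file in test_*.tgz; do
    [[ ! ${names[*]} =~ "$file" ]] && rm -f -- "$file"
done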
I have several thousand eBooks named like AuthorFirstName AuthorLastName Title XX.pdf, where XX is a number from 1-99 (the volume number).
The author and title can be multiple words, and I want to move the files into a folder named AuthorFirstName AuthorLastName Title. Everything except the number should be the folder name, so that all volumes of an eBook end up in the same folder.
For example
root.....>AuthorFirstName AuthorLastName Title>AuthorFirstName AuthorLastName Title XX.pdf
You can use a mix of find, sed and bash scripting for the task. You will have to write it on your own though, and ask for help if you get stuck.
You can also try some ready-made tools for mass moving/renaming, like these: http://tldp.org/LDP/GNU-Linux-Tools-Summary/html/mass-rename.html
I have never used one of these myself, though.
I would try with this:
for folder in $(ls | sed -r "s/(.*) ([0-9]{1,2})/\1/" | uniq)
do
mkdir $folder
mv $(find . -name "$folder*") $folder
done
I don't know if this is correct, but it may give you some hints.
edit: added uniq to the pipe.
Use a loop as shown below:
find . -type f -name "*.pdf" -print0 | while IFS= read -r -d '' file
do
# extract the name of the directory to create
dirName="${file% *}"
# create the directory if it doesn't exist
[[ ! -d "$dirName" ]] && mkdir "$dirName"
mv "$file" "$dirName"
done
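The key step is the parameter expansion ${file% *}, which removes everything from the last space onward, i.e. the volume number and the .pdf extension, leaving the directory name to create. For example, with a made-up name:
file="./AuthorFirstName AuthorLastName Title 07.pdf"
echo "${file% *}"    # ./AuthorFirstName AuthorLastName Title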
I have a folder "test", and in it there are 20 other folders with different names like A, B, ... (actually they are names of people, not A, B, ...). I want to write a shell script that goes into each folder like test/A, renames all the .c files to A[1,2..], and copies them to the "test" folder. I started like this, but I have no idea how to complete it!
#!/bin/sh
for file in `find test/* -name '*.c'`; do mv $file $*; done
Can you help me please?
This code should get you close. I tried to document exactly what I was doing.
It does rely on BASH and the GNU version of find to handle spaces in file names. I tested it on a directory full of .DOC files, so you'll want to change the extension as well.
#!/bin/bash
V=1
SRC="."
DEST="/tmp"
#The last path we saw -- make it garbage, but not blank (or it will break the '[' test command).
LPATH="/////"
#Let us find the files we want
find "$SRC" -iname "*.doc" -print0 | while IFS= read -r -d $'\0' i
do
echo "We found the file name... $i";
#Now we strip off just the file name.
FNAME=$(basename "$i" .doc)
echo "And the basename is $FNAME";
#Now we get the last chunk of the directory
ZPATH=$(dirname "$i" | awk -F'/' '{ print $NF}' )
echo "And the last chunk of the path is... $ZPATH"
# If we are down a new path, then reset our counter.
if [ "$LPATH" != "$ZPATH" ]; then
V=1
fi;
LPATH=$ZPATH
# Eat the error message if the directory already exists
mkdir "$DEST/$ZPATH" 2> /dev/null
echo cp \"$i\" \"$DEST/${ZPATH}/${FNAME}${V}\"
cp "$i" "$DEST/${ZPATH}/${FNAME}${V}"
# Bump the counter so the next file in the same directory gets the next number
V=$((V + 1))
done
#!/bin/bash
## Find folders under test. This assumes you are already in the directory where test exists; otherwise give the path before "test".
folders="$(find test -mindepth 1 -maxdepth 1 -type d)"
## Look into each folder in $folders, find the folder[0-9]*.c files and move them to the test folder, right?
for folder in $folders;
do
##Find folder-named-.c files.
leaf_folder="${folder##*/}"
folder_named_c_files="$(find $folder -type f -name "*.c" | grep "${leaf_folder}[0-9]")"
## Move these folder_named_c_files to the test folder. basename will hold just the file name.
## You didn't mention what name to rename the files to, so tweak the mv command accordingly.
for file in $folder_named_c_files; do basename="${file##*/}"; mv "$file" "test/$basename"; done
done