I have a directory where many files are created daily, and I need to copy only the newly generated files. All the files are created with names starting with abc_.
For example: today I have a file abc_0520123.pdf; the next day two files, abc_0521234.pdf and abc_0521254.pdf, are created, and I want to copy only those two new files.
Please help me work out how to compare the old files with the new ones and copy them.
You can use find.
find /my_directory -mtime -1 # Finds everything modified less than one day ago.
find /my_directory -ctime -1 # Finds everything whose status (ctime) changed less than one day ago.
find /my_directory -ctime +5 # Finds everything whose status changed more than 5 days ago.
(Note that -ctime tests the inode change time, not a true creation time, and that GNU find takes a plain number of 24-hour periods, without a d suffix.)
If you want to move the files, you can use -exec:
find /my_directory -mtime -1 -type f -exec mv {} /new_dir/ \;
This finds only regular files under /my_directory that are less than 1 day old and moves them to /new_dir.
Find is one of the most useful commands you can ever learn!
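Since the files in the question all start with abc_, a minimal sketch that combines a name filter with the timestamp test might look like this (treat /my_directory and /new_dir as placeholders, and use cp rather than mv if the originals must stay in place):
# Copy regular files named abc_* that were modified within the last day.
find /my_directory -name 'abc_*' -type f -mtime -1 -exec cp {} /new_dir/ \;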
Please assist: I am trying to list the files from the last 3 days in a directory, starting from midnight of the first of those 3 days. I created this so far: "find ./* -type f -mtime -3 -exec ls {} \;". This only pulls files from 3 days before the current time; it doesn't start at midnight of the first day. I need the data to start from midnight 3 days ago through today.
Thank you.
You can touch a file with the earliest date you want to use. Then use find's -newer option.
touch -t <earliestDate> someTempFile
find . -type f -newer someTempFile -exec ls -l {} \;
You don't need -exec at all if you would run ls without options and are only matching regular files, since find already prints each matching path.
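For the "midnight of the first day" requirement, a minimal sketch (assuming GNU date for the -d option; touch -t takes a [[CC]YY]MMDDhhmm[.ss] timestamp):
# Create a marker stamped at 00:00 three days ago, then list newer files.
touch -t "$(date -d '3 days ago' +%Y%m%d)0000" someTempFile
find . -type f -newer someTempFile -exec ls -l {} \;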
I am using the command "find /path/* -type d -ctime +5" to find directories that are more than 5 days old. This command lists each matching directory and all of its matching sub-directories, but I want to stop at the first matched directory.
For the following directory structure:
/temp/a/b/c/file.txt
Let's say directories 'b' and 'c' were created 5 days ago.
The above command lists the following as the output:
/temp/a/b and /temp/a/b/c.
Instead of the above output, I want only "/temp/a/b" as the output.
Is there any way to do that?
You can stop searching a branch with -prune:
find /path -type d -ctime +4 -prune
This prints every directory whose ctime is more than 4 days old (i.e. at least 5 days), and -prune stops find from descending into a matched directory, so its subdirectories are skipped.
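With the tree from the question, where b and c both match the time test, the result would be (a sketch, assuming /temp is the search root):
find /temp -type d -ctime +4 -prune
# Prints /temp/a/b only; because b is pruned, /temp/a/b/c is never visited.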
I have a folder called Photos & Videos.
It has folders for the various years for which I have photos and videos.
The year folders have folders for months and the months folders have folders for the days of that month.
The days folders contain all my image and video files.
Now, I want to run commands on all these files using one single exiftool command.
For example, to remove the metadata of all files in a folder, I'm running this type of command on every single day folder one at a time.
exiftool -all= -overwrite_original Documents/Personal/Photos\ \&\ Videos/2015/3-2015/7-3-2015/
The problem is I have too many days folders.
So, is there a way I can target all the files in the Photos & Videos folder at once, using one single command?
Please help.
Add -r to your command; -r will recurse into all the subdirectories.
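For the question's layout, that might look like this (a sketch that points exiftool at the top-level folder so it walks every year, month, and day directory):
exiftool -r -all= -overwrite_original Documents/Personal/Photos\ \&\ Videos/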
A more generic solution that will work for all commands is to use find.
find Documents/Personal/Photos\ \&\ Videos/2015/3-2015/7-3-2015/ -type f will print out every file in Documents/Personal/Photos & Videos/2015/3-2015/7-3-2015. To run a command on each of those files, you can use the -exec parameter, with an admittedly odd syntax:
find Documents/Personal/Photos\ \&\ Videos/2015/3-2015/7-3-2015/ -type f -exec exiftool -all= -overwrite_original '{}' \;
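Since exiftool accepts many files per invocation, ending -exec with + instead of \; passes the paths to exiftool in batches, starting far fewer processes on a large tree; pointing find at the top-level folder then covers the whole collection in one command:
find Documents/Personal/Photos\ \&\ Videos/ -type f -exec exiftool -all= -overwrite_original '{}' +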
A cron action saves database files on an hourly basis and assigns a file name based on year, month, day and hour:
/$(date +\%m)/$(date +\%y\%m\%d\%H)_thedb.sql
This leads to archive bloat, and the goal is to keep only the last file of each day (i.e. delete all those numerically lower than 15050923*) in a separate cron action.
What is an effective way of achieving this?
Before you start with complex bash string substitutions, I suggest you go after the file date instead; find can help you with that.
For example, to delete all files in a directory that are older than 5 days, you could try something like this:
find <DIR> -mtime +5 -exec rm {} \;
Now if there are subdirectories in <DIR>, you might also want to include the option -type f to limit the match to regular files, and -maxdepth 1 to keep find out of the subdirectories.
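Combined, that might look like this (with <DIR> still a placeholder; -maxdepth goes first because GNU find warns when it follows other tests):
find <DIR> -maxdepth 1 -type f -mtime +5 -exec rm {} \;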
If you have a file and want to delete everything older than that, you could slightly modify this:
find <DIR> -not -newer <FILE> -not -name <FILE> -exec rm {} \;
I simply don't know why there is no -older test in find; it seems so obvious.
Warning: I strongly recommend first leaving out -exec and everything after it, to check whether the files it finds can all safely be deleted.
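Applied to the question's yymmddHH naming scheme, a minimal sketch (the backup directory and the 23:00 dump name are assumptions based on the example in the question):
# Delete every dump from 2015-05-09 that is older than the 23:00 dump.
find /backups/05 -name '150509*_thedb.sql' -not -newer /backups/05/15050923_thedb.sql -not -name '15050923_thedb.sql' -exec rm {} \;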
I have a backup system on my Ubuntu server that makes a database backup every day and saves it in a folder named after the day: $(date +%d%m%y)
But in the script, when I try to find and delete the folders from the last week, the command doesn't find any directory.
I'm trying with: find -name $(date +%d%m%y) -type d -mtime +7 -exec rm -r {};
It never finds a directory. I tried changing the -mtime value to 1 day or 2, but it still finds nothing.
I think you made a small mistake:
When you back up on the 7th of May, you create a folder named 070515. When you search a week later, you look for a folder named 140515 that was modified more than 7 days ago; however, that folder was only created today.
You may not need the name of the folder at all; just use
find /backup/path -type d -mtime +7
to find all folders older than 7 days.
I suspect at least two errors in your find command:
The path where to search is missing: find /where/to/search -name ...
$(date +%d%m%y) always expands to today's date, and a directory named with today's date will not have a modification time of +1 or +7 days. Instead, try the following:
find /where/to/search -type d -mtime +7
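To actually delete what that finds, a sketch (assuming the dated folders sit directly under the backup path; -maxdepth 1 keeps find from descending into directories it is about to remove, and it is worth running without the -exec part first to verify the matches):
find /where/to/search -maxdepth 1 -type d -mtime +7 -exec rm -r {} +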