Linux command to copy recently created/updated files? - shell

I want to copy recently created/updated files to another folder. Say, for example, files created in the last 3 days should be copied to another folder (/tmp). How do I do that? Is it possible?

You can use the find command's -mtime test to find files that were last modified within a certain time, and then use its -exec action to copy them somewhere.
For example, this command will find files modified within the last three days in your current directory and copy them to your /tmp directory:
find . -mtime -3 -type f -exec cp "{}" /tmp \;
-mtime n
    File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
-exec command ;
    Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ';' is encountered. The string '{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a '\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
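If your cp is GNU coreutils cp, a batched variant (a sketch) uses -t to name the target directory up front, which lets find append all matched files to a single cp invocation instead of running one cp per file:
# GNU cp only: -t takes the destination directory first, so the '+'
# terminator can append every matched file to one cp command line
find . -mtime -3 -type f -exec cp -t /tmp {} +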

Related

Check if file is in a folder with a certain name before proceeding

So, I have this simple script which converts videos in a folder into a format which the R4DS can play.
#!/bin/bash
scr='/home/user/dpgv4/dpgv4.py'
mkdir -p 'DPG_DS'
find '../Exports' -name "*1080pnornmain.mp4" -exec python3 "$scr" {} \;
The problem is, some of the videos are invalid and won't play, and I've moved those videos to a different directory inside the Exports folder. What I want to do is check to make sure the files are in a folder called new before running the python script on them, preferably within the find command. The path should look something like this:
../Exports/(anything here)/new/*1080pnornmain.mp4
Note that the (anything here) text does not indicate a single directory; it could be something like foo/bar, foo/b/ar, f/o/o/b/a/r, etc.
You cannot use -name because the search is on the path now. My first solution was:
find ./Exports -path '**/new/*1080pnornmain.mp4' -exec python3 "$scr" {} \;
But, as @dan pointed out in the comments, it is wrong: ** has no special globstar meaning to find, and a * in a -path pattern also matches slashes:
This checks if /new/ is somewhere in the preceding path, it doesn't have to be a direct parent.
So, the star is not enough here. Another possibility, using find only, could be this one:
find ./Exports -regex '.*/new/[^\/]*1080pnornmain.mp4' -exec python3 "$scr" {} \;
This regex matches:
- any number of nested folders before new with .*/new
- any character except / (to rule out deeper subpaths) followed by your filename, with [^\/]*1080pnornmain.mp4
Performance could degrade given that it uses regular expressions.
Generally, instead of using the -exec option of the find command, you can pass find's output to xargs, which can spawn fewer processes by batching arguments. When piping, use null-delimited output so filenames containing spaces or newlines survive intact:
find ./Exports -regex '.*/new/[^\/]*1080pnornmain.mp4' -print0 | xargs -0 -I '{}' python3 "$scr" '{}'
(Note that -I forces one invocation per file, so drop it if the script accepts multiple arguments.)
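If throughput is the real goal, GNU xargs can also run several conversions in parallel (a sketch; the -P4 process count is an arbitrary choice, and $scr is the script variable from the question):
# spawn up to four python3 processes at once, one file per invocation
find ./Exports -regex '.*/new/[^\/]*1080pnornmain.mp4' -print0 |
    xargs -0 -n1 -P4 python3 "$scr"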

Find, unzip and grep the content of multiple files in one step/command

First I asked a question here: Unzip a file and then display it in the console in one step
It works and helped me a lot. (please read)
Now I have a second issue. I do not have a single zipped log file; I have a lot of them in different folders, which I need to find first. The files have the same names. For example:
/somedir/server1/log.gz
/somedir/server2/log.gz
/somedir/server3/log.gz
and so on...
What I need is a way to:
find all the files like: find /somedir/server* -type f -name log.gz
unzip the files like: gunzip -c log.gz
use grep on the content of the files
Important! The whole thing should be done in one step.
I cannot first store the extracted files in the filesystem because it is read-only. I need to somehow connect, with pipes, the output of one command to the input of the next.
Before, the log files were in text format (.txt), so I did not have to unzip them first. In that case it was easy, for example:
find /somedir/server* -type f -name log.txt | xargs grep "term"
Now I have to deal with zipped files. That means that after I find the files, I first need to unzip them somehow and then send the contents to grep.
With one file I do:
gunzip -c /somedir/server1/log.gz | grep term
But for multiple files I don't know how to do it. For example, how do I pass the output of find to gunzip and then to grep?
Also, if there is another way / "best practice" to do that, it is welcome :)
find lets you invoke a command on the files it finds:
find /somedir/server* -type f -name log.gz -exec gunzip -c '{}' + | grep ...
From the man page:
-exec command {} +
    This variant of the -exec action runs the specified command on the selected files, but the command line is built by appending each selected file name at the end; the total number of invocations of the command will be much less than the number of matched files. The command line is built in much the same way that xargs builds its command lines. Only one instance of {} is allowed within the command, and (when find is being invoked from a shell) it should be quoted (for example, '{}') to protect it from interpretation by shells. The command is executed in the starting directory. If any invocation with the + form returns a non-zero value as exit status, then find returns a non-zero exit status. If find encounters an error, this can sometimes cause an immediate exit, so some pending commands may not be run at all. This variant of -exec always returns true.
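One caveat with the single-stream approach: grep cannot tell you which log a match came from. A per-file variant (a sketch, assuming GNU grep for its --label option) keeps the filename visible:
# run grep once per archive; -H plus --label prefixes each match
# with the path of the .gz file instead of "(standard input)"
find /somedir/server* -type f -name log.gz -exec sh -c '
    gunzip -c "$1" | grep -H --label="$1" "term"' _ {} \;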

How to cmp multiple files with relative paths using bash

I have a directory called filesystem that looks a little like this:
- filesystem
  - etc
    - systemd
      - system
        - custom.service
        - custom2.service
    - hostname
These files are copied into the root directory and then need to be verified. For example, filesystem/etc/hostname is copied to /etc/hostname.
I've tried writing a bash script to compare every file in filesystem:
for file in $(find filesystem -type f)
do
    cmp file ${file#filesystem}
done
The purpose of ${file#filesystem} is to remove the 'filesystem' from the path of the second file.
This isn't working - it returns 'No such file or directory'. How do I fix this code?
As noted in the comments, the specific problem with your code is that you were missing a $ to expand file. That said, processing the output of ls or find can run into problems whenever filenames contain any IFS character. Spaces are a common example, but newlines will trip up many attempts to handle the spaces.
One option for addressing this is to use -exec with find and invoking a shell, since you need some of the shell capabilities for parameter expansion.
Here we'll use sh -c with a command string that runs cmp, and we'll pass that sh two arguments: the first is a placeholder that becomes the shell's name ($0), and the second is the filename, which becomes $1:
find filesystem -type f -exec sh -c 'cmp "$1" "${1#filesystem}"' _ {} \;
We quote the variables within sh -c and find will ensure {} is passed in correctly as a single argument.
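If the tree contains many files, the same idea can be batched (a sketch): the + terminator hands sh many filenames at once, and a POSIX for loop walks the positional parameters so one shell handles a whole batch instead of one per file:
find filesystem -type f -exec sh -c '
    for f do
        cmp "$f" "${f#filesystem}"
    done' _ {} +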

How can I copy all the XML files whose filenames contain the current date from all directories?

I want to copy all the XML files that have the current date in their file names from all directories. Below is the script I have written.
#!/bin/bash
CURRENT_DATE=`date +'%d%m%Y'`
Temp_Path=/appinfprd/bi/infogix/IA83/InfogixClient/Scripts/IRP/New_Vendors/
FILE_PATH=/bishare/DLSFTP/DLSTREAM/
FILE_DATE=`date -d "-2 days" +"%Y%m%d"`
cd $FILE_PATH
find . -name '*$FILE_DATE*.xml' -exec cp $Temp_Path
But it is not working.
Your find statement is wrong. You should end it with \; to indicate the end of the -exec command and put {} in the command where the name of each found file should appear. So, you want:
find . -name "*$FILE_DATE*.xml" -exec cp "{}" "$Temp_Path" \;
Edit
As stated in the comments, there was also a problem in your initial post: the single quotes should be double quotes. You might be interested in this man page, in particular these sections:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ; is encountered. The string {} is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a \) or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
-exec command {} +
This variant of the -exec action runs the specified command on the selected files, but the command line is built by appending each selected file name at the end; the total number of invocations of the command will be much less than the number of matched files. The command line is built in much the same way that xargs builds its command lines. Only one instance of {} is allowed within the command, and (when find is being invoked from a shell) it should be quoted (for example, '{}') to protect it from interpretation by shells. The command is executed in the starting directory. If any invocation returns a non-zero value as exit status, then find returns a non-zero exit status. If find encounters an error, this can sometimes cause an immediate exit, so some pending commands may not be run at all. This variant of -exec always returns true.
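Putting those fixes together, a corrected version of the script might look like this (a sketch: the unused CURRENT_DATE assignment is dropped, backticks are replaced with $(), and the paths are the ones from the question):
#!/bin/bash
Temp_Path=/appinfprd/bi/infogix/IA83/InfogixClient/Scripts/IRP/New_Vendors/
FILE_PATH=/bishare/DLSFTP/DLSTREAM/
FILE_DATE=$(date -d "-2 days" +"%Y%m%d")

# abort if the directory change fails rather than copying from the wrong place
cd "$FILE_PATH" || exit 1
find . -name "*$FILE_DATE*.xml" -exec cp "{}" "$Temp_Path" \;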

Bash find: changing matched name for use in -exec

I'm writing a deploy script, and I need to run a less compiler against all .less files in a directory. This is easy to do with the following find command:
find -name "*.less" -exec plessc {} {}.css \;
After running this command on a folder with a file named main.less, I'm left with a file named main.less.css, but I want it to be main.css.
I know I can easily strip the .less portion of the resulting files with this command: rename 's/\.less//' *.css but I'm hoping to learn something new about using -exec.
Is it possible to modify the name of the file that matches while using it in the -exec parameter?
Thanks!
Your find command is using a couple of non-standard GNU extensions:
You do not say where to find; this is an error in POSIX, but GNU find selects the current directory in that case.
You use a non-isolated {}; POSIX find doesn't expand it in that case.
Here is a one-liner that should work with most find implementations and fix your double extension issue:
find . -name "*.less" -exec sh -c 'plessc "$0" "$(dirname "$0")/$(basename "$0" .less).css"' {} \;
On Solaris 10 and older, sh -c should be replaced by ksh -c if the PATH isn't POSIX compliant.
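For comparison, the same rename logic can be written without find as a plain bash loop (a sketch, assuming bash 4+ for globstar; plessc is the compiler from the question):
#!/bin/bash
# recurse with globstar instead of find; nullglob makes the loop
# a no-op when nothing matches
shopt -s globstar nullglob
for f in ./**/*.less; do
    plessc "$f" "${f%.less}.css"
done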
No, it is not possible to do that directly. You can only use {} to insert the full filename. However, within -exec you could run other tools, such as awk, or redirect output to another program via pipes.
From the find man page:
-exec command ;
    Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of `;' is encountered. The string `{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a `\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
