how does grep only today's files in current directory? - shell

I want to grep files which were created today in the current directory. How many ways are there to do that, and what's the best way?
grep --color 'content' ./directory

This should do the trick for you:
find ./directory -maxdepth 1 -type f -daystart -ctime 0 -print | xargs grep --color 'content'
In the above command, we are using find to find all the files (-type f) in the directory that were made today (-daystart -ctime 0) and then -print the full file paths to standard output. We then send the output to xargs, which runs grep on each file in the list. This is much simpler than having to create a for loop and iterate over each line of the output.

If I understand correctly, you want to grep "content" within all files in ./directory modified today; for that you can use a combination of find and xargs. For example, to find the files in ./directory modified today, you can give the -mtime 0 option, which finds files modified zero 24-hour periods ago (i.e. today). To handle strange filenames, use the -print0 option to have find output NUL-terminated filenames. Your find command could be:
find . -maxdepth 1 -type f -mtime 0 -print0
Once the list of files is generated, you can pass the result to xargs -0, which will process the list of filenames as NUL-terminated. Using your grep command, you would have:
xargs -0 grep --color 'content'
To put it all together, simply pipe the result of find to xargs, e.g.
find . -maxdepth 1 -type f -mtime 0 -print0 |
xargs -0 grep --color 'content'
Give that a go and let me know if it does what you need or if you have further questions.
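As a quick sanity check, the pipeline can be exercised end to end in a throwaway directory (everything below is made up for the demo; grep -l is used instead of --color so the output is just the matching path):

```shell
# Demo in a scratch directory: one file mentions "content", one does not.
# The filename with a space shows why -print0 / xargs -0 matters.
dir=$(mktemp -d)
printf 'some content here\n' > "$dir/today file.txt"
printf 'nothing relevant\n'  > "$dir/other.txt"

# -mtime 0: modified within the last 24-hour period (both files qualify,
# since we just created them); grep -l prints only matching filenames.
matches=$(find "$dir" -maxdepth 1 -type f -mtime 0 -print0 |
          xargs -0 grep -l 'content')

echo "$matches"
rm -rf "$dir"
```

Only the path of "today file.txt" comes back, spaces and all.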
Edit Per Comment
If you want more exact control of the hour, minute, or second from which you want to select your files, you can use the -newermt option for find to find all files newer than the date you give as the option. For example, -newermt "2021-07-02 02:10:00" would select today's files created after 02:10:00 (all files after 2:10:00 am this morning).
Modifying the test above and replacing -mtime 0 with -newermt "2021-07-02 02:10:00" you would have:
find . -maxdepth 1 -type f -newermt "2021-07-02 02:10:00" -print0 |
xargs -0 grep --color 'content'
(adjust the time to your exact starting time you want to begin selecting files from)
Give that a go also. It is quite a bit more flexible as you can specify any time within the day to begin selecting files from based on the files modification time.
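Here is a minimal sketch of -newermt in action, assuming GNU find and GNU touch (for the -d timestamp option); the filenames and cutoff are invented for the demo:

```shell
dir=$(mktemp -d)
touch -d '2021-07-01 00:00:00' "$dir/old.txt"   # well before the cutoff
touch "$dir/new.txt"                            # just now, after the cutoff

# Only files modified after the given timestamp are selected.
found=$(find "$dir" -maxdepth 1 -type f -newermt '2021-07-02 02:10:00')
echo "$found"
rm -rf "$dir"
```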

Related

Check from files in directory which is the most recent in Bash Shell Script

I am making a bash script to run in a directory with files generated everyday and copy the most recent file to another directory.
I am using this now
for [FILE in directory]
do
if [ls -Art | tail -n 1]
something...
else
something...
fi
done
I know this is not alright. I would like to compare the date modified of the files with the current date and if it was equal, copy that file then.
How would that work or is there an easier method to do it?
We could use find:
find . -maxdepth 1 -daystart -type f -mtime -1 -exec cp -f {} dest \;
Explanation:
-maxdepth 1 limits the search to the current directory.
-daystart sets the reference time of -mtime to the beginning of today.
-type f limits the search to files.
-mtime -1 limits the search to files that have been modified less than 1 day from reference time.
-exec cp -f {} dest \; copies the found files to directory dest.
Note that -daystart -mtime -1 means anytime after today 00:00 (included), but also tomorrow or any time in the future. So if you have files with a last modification time in the year 2042 they will be copied too. Use -mtime 0 if you prefer copying files that have been modified between today at 00:00 (excluded) and tomorrow at 00:00 (included).
Note also that all this could be impacted by irregularities like daylight saving time or leap seconds (not tested).
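The -daystart -mtime -1 behaviour can be checked in a small self-contained run (scratch directories only; GNU find and touch assumed for -daystart and -d):

```shell
src=$(mktemp -d); dest=$(mktemp -d)
touch "$src/today.log"                 # modified now -> should be copied
touch -d '2 days ago' "$src/old.log"   # modified before today 00:00 -> skipped

find "$src" -maxdepth 1 -daystart -type f -mtime -1 -exec cp -f {} "$dest" \;

copied=$(ls "$dest")
echo "$copied"
rm -rf "$src" "$dest"
```

Only today.log lands in the destination directory.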
The newest file is different from file(s) modified today.
Using ls is actually a pretty simple and portable approach. The stdout output format is defined by POSIX (if not printing to a terminal), and ls -A is also in newer POSIX standards.
It should look more like this though:
newest=$(ls -At | head -n 1)
You could add -1, but AFAIK it shouldn't be required, as it's not printing to a terminal.
If you don't want to use ls, you can use this on Linux:
find . -mindepth 1 -maxdepth 1 -type f -exec stat -c '%Y:%n' {} + |
sort -n |
tail -n 1 |
cut -d : -f 2-
Note using 2- not 2 with cut, in case a filename contains :.
Also, the resulting file name will be a relative path (./file), or an empty string if no files exist.
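The stat-based pipeline can be sketched like this (GNU stat assumed for -c; filenames invented for the demo):

```shell
dir=$(mktemp -d)
touch -d '1 hour ago' "$dir/older.txt"
touch "$dir/newest.txt"

# %Y = mtime in epoch seconds, %n = name; sort numerically, keep the last
# line, then strip everything up to the first colon to recover the path.
newest=$(find "$dir" -mindepth 1 -maxdepth 1 -type f -exec stat -c '%Y:%n' {} + |
         sort -n | tail -n 1 | cut -d : -f 2-)
echo "$newest"
rm -rf "$dir"
```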

Script to find recursively the number of files with a certain extension

We have a highly nested directory structure, where we have a directory, let's call it 'my Dir', appearing many times in our hierarchy. I am interested in counting the number of "*.csv" files in all directories named 'my Dir' (yes, there is a whitespace in the name). How can I go about it?
I tried something like this, but it does not work:
find . -type d -name "my Dir" -exec ls "{}/*.csv" \; | wc -l
If you want to count the number of files matching the pattern '*.csv' under "my Dir", then:
don't ask for -type d; ask for -type f
don't ask for -name "my Dir" if you really want -name '*.csv'
don't try to ls *.csv on each match, because if there are more than N csv files in a directory, you would potentially count each one N times
also beware of embedding {} in -exec code!
For counting files from find, I like to use a trick I learned from Stéphane Chazelas on U&L; for example, from: Counting files in Linux:
find "my Dir" -type f -name '*.csv' -printf . | wc -c
This requires GNU find, as -printf is a GNU extension to the POSIX standard.
It works by looking within "my Dir" (from the current working directory) for files that match the pattern; for each matching file, it prints a single dot (period); that's all piped to wc who counts the number of characters (periods) that find produced -- the number of matching files.
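A quick verification of the dot-counting trick in a scratch tree (GNU find assumed for -printf; the file names are made up):

```shell
dir=$(mktemp -d)
mkdir -p "$dir/my Dir"
touch "$dir/my Dir/a.csv" "$dir/my Dir/b.csv" "$dir/my Dir/notes.txt"

# One dot per matching file; wc -c counts the dots, so filenames with
# newlines cannot skew the total.
count=$(find "$dir/my Dir" -type f -name '*.csv' -printf . | wc -c)
echo "$count"
rm -rf "$dir"
```

The count is 2: only the csv files, regardless of what their names contain.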
You would exclude all paths that are not under my Dir:
find . -type f -not '(' -not -path '*/my Dir/*' -prune ')' -name '*.csv'
Another solution is to use the -path predicate to select your files.
find . -path '*/my Dir/*.csv'
Counting the number of occurrences could be a simple matter of piping to wc -l, though this will obviously produce the wrong result if some of the files contain newlines in their names. (This is slightly pathological, but definitely something you want to cover in production code.) A common arrangement is to just print a newline for every found file, instead of its name.
find . -path '*/my Dir/*.csv' -printf '.\n' | wc -l
(The -printf predicate is not in POSIX but it's not hard to replace with an -exec or similar.)

How to find the particular files in a directory in a shell script?

I'm trying to find the particular files below in a directory, using a find command pattern in a shell script.
The below files will be created in the directory "/data/output" in the below format every time.
PO_ABCLOAD0626201807383269.txt
PO_DEF 0626201811383639.txt
So I need to find whether txt files starting with "PO_ABCLOAD" and "PO_DEF" have been created or not. If none have been created for four hours, then I need to write to the logs.
I have written a script, but I am stuck on finding files in the "PO_ABCLOAD" and "PO_DEF" format in the below script.
Please help with this.
What changes do I need to make to the find command?
My script is:
file_path=/data/output
PO_count='find ${file_path}/PO/*.txt -mtime +4 -exec ls -ltr {} + | wc -l'
if [ $PO_count == 0 ]
then
find ${file_path}/PO/*.xml -mtime +4 -exec ls -ltr {} + >
/logs/test/PO_list.txt
fi
Thanks in advance
Welcome to the forum. To search for files which match the names you are looking for you could try the -iname or -name predicates. However, there are other issues with your script.
Modification times
Firstly, I think that find's -mtime test works in a different way than you expect. From the manual:
-mtime n
File's data was last modified n*24 hours ago.
So if, for example, you run
find . -mtime +4
you are searching for files which are more than four days old. To search for files that are more than four hours old, I think you need to use the -mmin option instead; this will search for files which were modified a certain number of minutes ago.
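The difference is easy to demonstrate in a scratch directory (GNU touch -d assumed for backdating; filenames made up): a file five hours old is matched by -mmin +240 but not by -mtime +4:

```shell
dir=$(mktemp -d)
touch -d '5 hours ago' "$dir/stale.txt"
touch "$dir/fresh.txt"

by_mmin=$(find "$dir" -type f -mmin +240)    # older than 240 minutes
by_mtime=$(find "$dir" -type f -mtime +4)    # older than 4 *days*

echo "mmin:  $by_mmin"
echo "mtime: $by_mtime"
rm -rf "$dir"
```

-mmin +240 finds stale.txt; -mtime +4 finds nothing, since neither file is more than four days old.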
Command substitution syntax
Secondly, using ' for command substitution in Bash will not work: you need to use backticks instead - as in
PO_COUNT=`find ...`
instead of
PO_COUNT='find ...'
Alternatively - even better (as codeforester pointed out in a comment) - use $(...) - as in
PO_COUNT=$(find ...)
Redundant options
Thirdly, using -exec ls -ltr {} + is redundant in this context - since all you are doing is determining the number of lines in the output.
So the relevant line in your script might become something like
PO_COUNT=$(find $FILE_PATH/PO/ -mmin +240 -a -name 'PO_*' | wc -l)
or
PO_COUNT=$(find $FILE_PATH/PO/PO_* -mmin +240 | wc -l)
If you wanted tighter matching of filenames, try (as per codeforester's suggestion) something like
PO_COUNT=$(find $file_path/PO/PO_* -mmin +240 -a \( -name 'PO_DEF*' -o -name 'PO_ABCLOAD*' \) | wc -l)
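The counting line above can be sketched end to end with invented PO_ filenames in a scratch directory (GNU touch -d assumed for backdating):

```shell
dir=$(mktemp -d)
touch -d '5 hours ago' "$dir/PO_ABCLOAD1.txt" "$dir/PO_DEF2.txt"
touch -d '5 hours ago' "$dir/PO_OTHER3.txt"   # matches neither name pattern
touch "$dir/PO_DEF_new.txt"                   # too recent for -mmin +240

# Count files older than 4 hours whose names match either prefix.
PO_COUNT=$(find "$dir" -mmin +240 \( -name 'PO_DEF*' -o -name 'PO_ABCLOAD*' \) | wc -l)
echo "$PO_COUNT"
rm -rf "$dir"
```

The count is 2: the fresh PO_DEF file fails the age test and PO_OTHER3 fails the name test.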
Alternative file-name matching in Bash
One last thing ...
If using bash, you can use brace expansion to match filenames, as in
PO_COUNT=$(find $file_path/PO/PO_{ABCLOAD,DEF}* -mmin +240 | wc -l)
Although this is slightly more concise, I don't think it is compatible with all shells.

I want to grep a pattern inside a file n list that file based on current date

ls -l | grep "Feb 22" | grep -l "good" *
This is the command I am using. I have 4 files, among which one file contains the word good. I want to list that file, and that file's creation date is the current date. Based on both criteria I want to list that file.
Try this :
find . -type f -newermt 2018-02-21 ! -newermt 2018-02-22 -exec grep -l good {} \;
or
find . -type f -newermt 2018-02-21 ! -newermt 2018-02-22 | xargs grep -l good
And please, don't parse ls output
Hi, try the below command. How does it work? Here, find with the parameter -mtime -1 will search for files with the current date in the current directory as well as its subdirectories. Each file found is passed to grep one at a time, and grep checks for the string in that file.
find . -mtime -1 -type f | xargs grep -i "good"
The above command will list all the files with the current date. To list files of a specific kind, use the below command; here I am listing only txt files.
find . -name "*.txt" -mtime -1 -type f | xargs grep -i "good"
find . is for running it from the current directory (the dot means the current directory). To run it from a specific directory path, modify it like below:
find /yourpath/ -name "*.txt" -mtime -1 -type f | xargs grep -i "good"
Also, grep -i means "ignore case". For a case-sensitive match, just use grep "good".
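A throwaway demo of the case-insensitive variant (invented files; -print0/-0 added here as a safety net for odd filenames, which the plain pipe above does not have):

```shell
dir=$(mktemp -d)
printf 'This is GOOD news\n' > "$dir/report.txt"
printf 'nothing here\n'      > "$dir/other.txt"

# -mtime -1: modified within the last day; -il: ignore case, list filenames.
hits=$(find "$dir" -mtime -1 -type f -print0 | xargs -0 grep -il 'good')
echo "$hits"
rm -rf "$dir"
```

Only report.txt is listed, even though "GOOD" is upper-case in the file.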

Shell script: Want to find all files in last hour and cat the last 1000 lines to 1 log file

I'm currently trying to get the last files modified for the hour in a directory and cat them to one static log file. The problem is I don't know how many files there will be to cat... could be 5, could be 15. So I'm thinking it'd go something like this but the caveat that I only want probably the last 1000 lines of each file i'm finding... I tried a standard tail with a wild card but got an error saying illegal function or something.
find /xxy/ -mmin -60 | cat /xxy/*.log > /xxy/static.log
It works... but if the file is a month old I'm getting everything. I'd like to shorten it to just 1000 entries per log file found, but Google isn't aiding me at this point and I'm a bit of a beginner.
Any tips or pointers would be great. But I might have to approach it differently.
Thanks.
The only one of these "last hour" of commands that ever worked for me is:
find . -mtime -.04
Sending that to a file of course could be as easy as:
find . -mtime -.04 > static.log
I first posted this solution as a comment, I thought I was missing some requirement.
This solution (only finding files with a name like *.log) should work:
find /xxy/ -type f -name "*.log" -mmin -60 -exec tail -1000 "{}" \;
OP said (in a comment on the original question) that this did not work.
His error, -exec: command not found, suggests that he had a pipe or semicolon before the -exec. If his find is broken, he can try something like
find /xxy/ -type f -name "*.log" -mmin -60 -print0 | xargs --null -n1 tail -1000
I added -print0 and --null to support filenames containing a newline.
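A self-contained check of that pipeline (scratch files; GNU xargs assumed for --null), confirming each log contributes at most 1000 lines:

```shell
dir=$(mktemp -d); out=$(mktemp)
seq 1 1500 > "$dir/big.log"     # should contribute only its last 1000 lines
seq 1 10   > "$dir/small.log"   # contributes all 10 lines

# -n1 runs tail once per file, so each log is capped independently.
find "$dir" -type f -name '*.log' -mmin -60 -print0 |
    xargs --null -n1 tail -n 1000 > "$out"

lines=$(wc -l < "$out")
echo "$lines"
rm -rf "$dir" "$out"
```

The combined output is 1010 lines: 1000 from the big log plus all 10 from the small one. Writing the result outside the searched directory avoids find picking up the output file itself.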
Thanks for the feedback, ended up using this:
FILENUM=$(find /xxy/ -mmin -60 | wc -l)
find /xxy/ -type f -name "*" | tail -n$FILENUM | xargs tail -n1000
