Find recent files in directory with Bash [closed] - bash

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 1 year ago.
Improve this question
I want to write a Bash script to look for a new file (recently added file) in a folder and output the name of the new file (including the file extension, e.g. example.png) to a variable.
How would I do it?

find . -mmin -10
This command recursively finds all the files and directories in the current folder that have been modified/created in the past 10 minutes.
-mmin stands for minutes. You can also use -mtime which counts days. In addition you can use a + in front of the given number which makes find search for files that have been created/modified at least the given amount of time ago.
If you intend to further parse and use the output, which I assume you do because you intend to store it in a variable, you shouldn't use ls, as it is optimized for presentation and can change its output format depending on where you write to.
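As a sketch of how that output could be captured into a variable, here is one way to pick the single newest file (GNU find and GNU touch assumed; finddemo, old.png and new.png are made-up names for the demonstration):

```shell
mkdir -p finddemo
touch -d '1 hour ago' finddemo/old.png
touch finddemo/new.png

# %T@ prints the mtime as a sortable epoch timestamp (GNU find), %f the
# bare filename; sort newest-first and keep the name of the top entry
newest=$(find finddemo -maxdepth 1 -type f -printf '%T@ %f\n' \
  | sort -nr | head -n 1 | cut -d' ' -f2-)
echo "$newest"   # new.png
```

This avoids parsing ls, but note it still assumes filenames without embedded newlines.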

You probably want to use the -t option of ls:
ls -t
From man ls:
--sort=WORD
        sort by WORD instead of name: none (-U), size (-S), time (-t), version (-v), extension (-X)

--time=WORD
        change the default of using modification times; access time (-u): atime, access, use; change time (-c): ctime, status; birth time: birth, creation; with -l, WORD determines which time to show; with --sort=time, sort by WORD (newest first)

--time-style=TIME_STYLE
        time/date format with -l; see TIME_STYLE below

-t      sort by time, newest first; see --time
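Putting that together, the newest name can be pulled into a variable like this (lsdemo and the file names are illustrative; GNU touch -d assumed). Bear in mind the caveat from the other answer that parsing ls output is fragile with unusual filenames:

```shell
mkdir -p lsdemo
touch -d '1 hour ago' lsdemo/old.txt
touch lsdemo/new.txt

# -t sorts newest first; head keeps the first name
newest=$(ls -t lsdemo | head -n 1)
echo "$newest"   # new.txt
```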

Related

Is there a way to stop cp from overwriting the second of two globbed files [closed]

Closed 1 year ago.
Intention:
cp /path/to/code.{c,h} .
Concise version:
cp /path/to/code.* .
Recurring typo:
cp /path/to/code.*
In the typo case the second file is overwritten by the first.
This has bitten me repeatedly and I'm not optimistic there's a solution outside of re-writing my neural circuits, but one can dream.
Asking for confirmation every time or some visual indication of danger would both be solutions.
Defaulting to --no-clobber or some such is not a solution because I am usually clobbering something in the intended destination.
As suggested, you could create an alias
alias cp='cp -i'
such that you will always be prompted when invoking cp from the command line. Note that this will not affect scripts.
The man page for cp has this to say:
-i, --interactive
prompt before overwrite (overrides a previous -n option)
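A quick sandbox run showing how -i catches exactly this typo (cpdemo and the file names are made up for the demonstration; piping "n" simulates declining the prompt):

```shell
mkdir -p cpdemo
printf 'first\n'  > cpdemo/code.c
printf 'second\n' > cpdemo/code.h

# the typo "cp cpdemo/code.*" (destination forgotten) expands to
#   cp cpdemo/code.c cpdemo/code.h
# so code.c would silently overwrite code.h -- with -i, cp prompts first,
# and answering "n" declines the overwrite:
yes n | cp -i cpdemo/code.* 2>/dev/null || true
cat cpdemo/code.h   # still "second": the prompt saved the file
```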

Total size of all file types in a folder [closed]

Closed 5 years ago.
Is there any way/tool in Windows 7 to see the total size of files of a particular type? For example, if I have a directory with 5 files, where 2 files are JPG, 1 is a log file and 2 are DOCX files, it should report something like below:
jpg - 2 files - Total size - 10 MB
log file - 1 file - Total size - 5 KB
document file - 2 files - Total - 45 MB
-Rajesh
Is there a way to do this in Linux (e.g. some form of ls or grep)? If there is, it is probably supported by Cygwin.
In other words, you could install cygwin and then run something like the 'find' command shown here: https://askubuntu.com/questions/558979/how-to-display-disk-usage-by-file-type.
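If you do install Cygwin (or are on Linux), a rough one-liner along those lines might look like the sketch below (GNU find assumed; it also assumes file names without spaces or newlines, and sizedemo with its contents is just illustrative test data):

```shell
mkdir -p sizedemo
printf 'aaaa' > sizedemo/a.jpg
printf 'bb'   > sizedemo/b.jpg
printf 'c'    > sizedemo/note.log

# %s prints the size in bytes, %p the path; awk groups by extension
summary=$(find sizedemo -type f -printf '%s %p\n' \
  | awk '{
      n = split($2, parts, ".")
      ext = (n > 1) ? parts[n] : "(none)"
      bytes[ext] += $1; count[ext]++
    }
    END { for (e in bytes) printf "%s - %d files - %d bytes\n", e, count[e], bytes[e] }')
echo "$summary"
```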
Also, if you put the Cygwin executable directory in your PATH environment variable, you can run all of the Cygwin commands from a Windows command prompt.
And if you just want a good way to see where all of your disk space is being used there are a number of good tools for that. I personally like spacesniffer.
You can start a command window and use dir.
ex:
dir *.txt

How to find and delete 15 days older files with Cron Jobs? [closed]

Closed 7 years ago.
I want to delete the files, which are older than 15 days. Files directory is
/home/app_admin/files
There are 100+ folders in the 'files' directory. I only want to delete the files which are older than 15 days, without deleting the folders.
In the 'files' directory, there are many folders like below ↓↓↓
16 Jwellary Image Clipping
300 photos to be retouched
Ashraf Amin
Background removal
Background removal and retouching
Brad C
Change wording in a Photoshop file
Color Correct 123 Product Photos
cut out some high res photos
I want to delete the files from these folders.
Can anyone please help me figure this out?
Check first what would be deleted:
find path/to/basedir -mtime +15 -type f
If the output looks good, then add -delete:
find path/to/basedir -mtime +15 -type f -delete
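A sandbox run of the same command shows that only old regular files are removed while the folder structure stays in place (GNU touch -d assumed; crondemo and the file names are made up). The commented crontab line at the end is just an example schedule:

```shell
mkdir -p crondemo/sub
touch -d '20 days ago' crondemo/sub/old.txt
touch crondemo/sub/new.txt

# -type f restricts the deletion to regular files; directories survive
find crondemo -type f -mtime +15 -delete
ls crondemo/sub    # only new.txt remains

# a hypothetical crontab entry running it nightly at 02:30:
# 30 2 * * * find /home/app_admin/files -type f -mtime +15 -delete
```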

way to sort by CPU% on 'top' by default? [closed]

Closed 8 years ago.
When I run top on my MacBook (BSD?), I almost always want to sort by CPU%.
It doesn't seem to be the default, so I wind up typing 'o' (to sort), typing in 'cpu' and hitting Enter
all the time!
Is there a way to make sort-by-CPU% the default, or something?
%CPU is the default sort column on every system I've ever used. But if it's not, you can press Shift-P to sort by %CPU. Note that it's not listed in interactive help, only the man page.
M %MEM
N PID
P %CPU
T TIME+
top -o +%CPU
As John Kugelman said, usually %CPU is a default sort column. Anyway, reading man top(1) seems a good idea.
Looking at the top manpage, you can use option -o %CPU, although that should already be the default. You could create an alias in your .bashrc (alias top='top -o %CPU') to make this permanent.
-o  :Override-sort-field as:  -o fieldname
        Specifies the name of the field on which tasks will be sorted, independent of what is reflected in the configuration file. You can prepend a '+' or '-' to the field name to also override the sort direction. A leading '+' will force sorting high to low, whereas a '-' will ensure a low to high ordering.
        This option exists primarily to support automated/scripted batch mode operation.
Another way is to establish the sorting you prefer in top and then press W to save the configuration. Next time you start top it is going to load the config.
Press h in top to see keyboard shortcuts.
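Since the question mentions a MacBook: macOS ships the BSD top, which uses lowercase field names, so the Linux-style -o %CPU will not work there. A sketch for a shell startup file, assuming macOS's top:

```shell
# macOS (BSD) top sorts with lowercase field names:
#   top -o cpu
# make that the default by putting an alias in ~/.bashrc or ~/.zshrc:
alias top='top -o cpu'
alias top   # prints the definition just created
```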

Log file pattern matching [closed]

Closed 8 years ago.
I want to monitor an application log file for specific error patterns, but only in the content added in the last 10 minutes (or since the script last ran). Please note I don't want to scan the entire log file, only the lines added in the last 10 minutes; when the pattern is matched, I want it displayed on screen. I'm confused about how to achieve this through a script.
TIA
regards
tnt5273
# record the current line count, then every 10 minutes grep only the
# lines appended since the previous pass
FILE=logfile
lines=$(wc -l < "$FILE")
while sleep 600; do
    clines=$(wc -l < "$FILE")
    diff=$((clines - lines))
    tail -n "$diff" "$FILE" | grep PATTERN
    lines=$clines
done
What you appear to be describing is commonly achieved at a console by:
tail -F /path/to/my/file | grep "pattern"
This is an idiom used by many system administrators.
There's another approach where you want to be alerted if a particular event is logged, but you don't want to watch for it.
The Simple Event Correlator is a Perl script designed to watch logs, correlate events and perform actions.