Bash command: head

I am trying to find all files matching dummy* in the folder named dummy. Then I need to sort them by time of creation and get the first 10 files. The command I am trying is:
find -L /home/myname/dummy/dummy* -maxdepth 0 -type f -printf '%T@ %p\n' | sort -n | cut -d' ' -f 2- | head -n 10 -exec readlink -f {} \;
But this doesn't seem to work with the following error:
head: invalid option -- 'e'
Try 'head --help' for more information.
How do I make bash not read -exec as part of the head command?
UPDATE1:
Tried the following:
find -L /home/myname/dummy/dummy* -maxdepth 0 -type f -exec readlink -f {} \; -printf '%T@ %p\n' | sort -n | cut -d' ' -f 2- | head -n 10
But the output is not sorted by timestamp, because both -exec and -printf print the files, and sort then sorts all of that output together.
Files in dummy are as follows:
dummy1, dummy2, dummy3, etc. This is the order in which they were created.

How do I make bash not read -exec as part of the head command?
The -exec and subsequent arguments appear intended to be directed to find, but find's argument list stops at the first |, so you need to move those arguments ahead of it:
find -L /home/myname/dummy/dummy* -maxdepth 0 -type f -printf '%T@ %p\n' -exec readlink -f {} \; | sort -n | cut -d' ' -f 2- | head -n 10
However, it doesn't make much sense to both -printf the file details and -exec readlink on the results. Possibly you wanted to run readlink on each filename that makes it past head. In that case, you might want to look into the xargs command, which serves exactly this purpose: converting data read from standard input into arguments to a command. For example:
find -L /home/myname/dummy/dummy* -maxdepth 0 -type f -printf '%T@ %p\n' |
sort -n |
cut -d' ' -f 2- |
head -n 10 |
xargs -rd '\n' readlink -f
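Here -r (GNU xargs) skips running readlink entirely when nothing makes it through the pipe, and -d '\n' treats each input line as one argument, so filenames containing spaces survive intact.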

I think you are over-complicating things here. Using just ls and head should get you the results you want:
ls -lt /home/myname/dummy/dummy* | head -10
To sort by ctime specifically, use the -c flag for ls:
ls -ltc /home/myname/dummy/dummy* | head -10
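Note that ls -lt lists newest first, so head shows the 10 most recent. If you want the first 10 files created (the oldest), a small variation: add -r to reverse the sort:
ls -ltcr /home/myname/dummy/dummy* | head -10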

Related

What is the correct Linux command of find, grep and sort?

I am writing a command using find, grep and sort to display a sorted list of all files that contain 'some-text'.
I was unable to figure out the command.
Here is my attempt:
$ find . -type f | grep -l "some-text" | sort
but it didn't work.
You need to use something like xargs so that the filenames coming through the pipe | are handed to grep as arguments; without it, grep searches the list of names itself rather than the contents of the files.
xargs converts input read from standard input into arguments to a command.
In my case, I have file1.txt, file2.txt and file3.txt, and they contain the word test. This will do it:
za:tmp za$ find . -type f | xargs grep -l "test" | sort
./file1.txt
./file2.txt
./file3.txt
or
za:tmp za$ find . -type f | xargs grep -i "test" | sort
./file1.txt:some test string
./file2.txt:some test string
./file3.txt:some test string
You can use this in any Unix:
find . -type f -exec sh -c 'grep "some-text" {} /dev/null > /dev/null 2>&1' \; -a -print 2> /dev/null | sort
A more optimized solution that works only with GNU grep:
find . -type f -exec grep -Hq "some-text" {} \; -a -print 2> /dev/null | sort

How do I print all the files older than 10 days containing particular string?

I have tried this but not working.
find . -mtime +10 -print| grep -H -r "test" | cut -d: -f1
You can make use of xargs to process the files found by find, but find alone can do it:
find . -mtime +10 -exec grep -l "test" {} \+
find ... -exec XXX {} \; (or \+, thanks Kevin) performs the XXX command on the files found by find.
grep -l just shows the names of the files, which I think is what you are trying to get with cut -d: -f1.
You may also want to add -type f to match only files, not directories.
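Putting those pieces together, a sketch of the full command (search string "test" taken from the question):
find . -type f -mtime +10 -exec grep -l "test" {} +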
You have to execute it using xargs, like:
find . -mtime +10 -print0 | xargs -0 grep -H -r "test" | cut -d: -f1
edit
I inserted options so that you won't have problems with spaces in the filenames.

Find Files having multiple links in shell script

I want to find the files which have multiple links.
I am using ubuntu 10.10.
find -type l
It shows all the links, but I want to count the links for a particular file.
Thanks.
With this command, you will get a summary of linked files:
find . -type l -exec readlink -f {} \; | sort | uniq -c | sort -n
or
find . -type l -print0 | xargs -n1 -0 readlink -f | sort | uniq -c | sort -n
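If what you are after is the link count of one particular file, GNU stat can print the number of hard links directly (the path here is just an example):
stat -c %h /path/to/somefile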

using pipes with a find command

I have a series of delimited files, some of which have some bad data and can be recognized by doing a column count on them. I can find them with the following command:
find ./ -name 201201*gz -mtime 12
They are all gzipped and I do not want to un-archive them all. So to check the column counts I've been doing I'm running this as a second command on each file:
zcat ./path/to/file.data | awk '{print NF}' | head
I know I can run a command on each file through find with -exec, but how can I also get it to run through the pipes? A couple things I tried, neither of which I expected to work and neither of which did:
find ./ -name 201201*gz -mtime 12 -print -exec zcat {} \; | awk '{print NF}'| head
find ./ -name 201201*gz -mtime 12 -print -exec "zcat {} | awk '{print NF}'| head" \;
I'd use an explicit loop approach:
find . -name "201201*gz" -mtime 12 | while IFS= read -r file; do
    echo "$file: "
    zcat "$file" | awk '{print NF}' | head
done
More or less, you pipe find's output through xargs like:
find . -name "foo" -print0 | xargs -0 echo
So your command would look like:
find ./ -name "201201*gz" -mtime 12 -print0 | xargs -0 zcat | awk '{print NF}'| head
-print0 and xargs -0 just help to make sure that files with special characters don't break the pipe. Note that this version feeds every archive into a single zcat stream, so head stops after ten lines of combined output rather than ten lines per file; the loop above keeps the output separated per file.

Bash Script to find the most recently modified file

I have two folders, for arguments sake /Volumes/A and /Volumes/B. They are mounted network shares from a Windows server containing a load of .bkf files from Backup Exec.
I am looking for a script which will look into both folders and find the most recently modified .bkf file and copy it to another location. There are other files in the folders which must be ignored.
Thanks in advance!!
Shaun
Edit:
I knocked this together:
cp `ls -alt /Volumes/E /Volumes/F | grep bkf | head -n 1 | awk '{print $8}'` /Volumes/$xservedisk/Windows/
Can anyone think of any reasons why I shouldn't use it?
Thanks again
Shaun
I prefer this for finding the most recently modified file:
find . -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
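To pick out only the newest entry from that listing, one possible addition: take the last line and strip the two timestamp fields:
find . -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort | tail -n 1 | cut -d' ' -f3-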
shopt -s nullglob   # otherwise an unmatched glob is taken literally and ends up in NEWEST
NEWEST=
for f in /Volumes/A/*.bkf /Volumes/B/*.bkf
do
    if [ -z "$NEWEST" ]
    then
        NEWEST=$f
    elif [ "$f" -nt "$NEWEST" ]
    then
        NEWEST=$f
    fi
done
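$NEWEST then holds the path of the most recent .bkf file (or stays empty if nothing matched), so the copy step could look like this, with the destination as a placeholder:
[ -n "$NEWEST" ] && cp "$NEWEST" /path/to/destination/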
This goes through some twists just to make sure filenames with odd characters are handled well, which mouviciel's answer doesn't:
NEWEST=$(find /Volumes/A /Volumes/B -name '*.bkf' -printf '%T@ %p\0' | \
    sort -rnz | xargs -0n1 2>/dev/null | head -n1 | cut -d' ' -f2-)
[[ -n "$NEWEST" ]] && cp -v "$NEWEST" /other-location
Actually, since these files are coming from Windows and are thus pretty much guaranteed not to have odd characters in their names (like embedded newlines), this simpler version will do:
NEWEST=$(find /Volumes/A /Volumes/B -name '*.bkf' -printf '%T@ %p\n' | \
    sort -rn | head -n1 | cut -d' ' -f2-)
[[ -n "$NEWEST" ]] && cp -v "$NEWEST" /other-location
Finding files is done with: find /Volumes/[AB] -name '*.bkf'
Sorting files by modification time is done with: ls -t
If there are not too many files, you can simply use:
ls -lrt `find /Volumes/[AB] -name '*.bkf'`
The last displayed file is the most recently modified.
edit
A more robust solution (thanks ephemient) is:
find /Volumes/[AB] -type f -name '*.bkf' -print0 | xargs -0 ls -lrt
cp `find /Volumes/[AB] -name '*bkf' -type f -printf "%T@\t%p\n" | sort -nr | head -1 | cut -f2` dst_directory/
