Sort names of zipped files and write list to file - shell

I tried to list the zipped files in sorted order and write the result to a new file, but it does not work properly in my shell script. Why is my script not working?
ls | grep gz | sort -t '.' -k 2,2n > filename

I did not find any problem with your commands, but they do not seem like the right way to do this, at least to me. The two alternatives below should serve you better; try them out.
With only names:
find . -type f -name '*.gz' 2>/dev/null -exec basename {} \; | sort > filename.txt
With full paths:
find . -type f -name '*.gz' 2>/dev/null | sort > filename.txt
You can also add the -maxdepth 1 option to search only the current directory where you are running this, and not recursively within nested dirs (note that GNU find expects this global option before tests like -name or -type):
find . -maxdepth 1 -type f -name '*.gz' 2>/dev/null | sort > filename.txt
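If you still want the numeric sort from the original attempt (say the names look like log.1.gz, log.2.gz, ... and should be ordered by the number after the first dot), the same idea adapts; a minimal sketch, assuming GNU find for -printf:
# %f prints the bare filename; then sort numerically on the 2nd dot-separated field
find . -maxdepth 1 -type f -name '*.gz' -printf '%f\n' | sort -t '.' -k 2,2n > filename.txt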
Hope this helps you :)

Related

How to write wc's stdout into a file?

The command below shows how many characters each file in the current directory contains.
find -name '*.*' |xargs wc -c
I want to write the standard output into a file.
find -name '*.*' |xargs wc -c > /tmp/record.txt
It encounters an issue:
wc: .: Is a directory
How to write all the standard output into a file?
Why -name '*.*'? That will not find every file (only names containing a dot) and it also matches directories. You need to use -type f, and better than piping the result to xargs is using -exec:
find . -maxdepth 1 -type f -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't descend into subdirectories. (GNU find expects this global option before tests such as -type.)
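One quirk of the {} + form: when wc receives more than one file it also prints a summary line per batch. A small sketch to keep only the per-file counts, assuming wc's usual "total" label:
find . -maxdepth 1 -type f -exec wc -c {} + | grep -v ' total$' > /tmp/record.txt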
I think you maybe meant find | xargs wc -c? Note that the pattern '*.*' also matches . itself (each * can match an empty string), which is exactly where the "wc: .: Is a directory" error comes from. Filter only files, if you want only files:
find -type f
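And if any filenames contain spaces, a NUL-separated sketch of the same pipeline (assuming find and xargs support -print0/-0, as the GNU and BSD versions do):
find . -maxdepth 1 -type f -print0 | xargs -0 wc -c > /tmp/record.txt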

I want to grep a pattern inside a file and list that file based on the current date

ls -l | grep "Feb 22" | grep -l "good" *
This is the command I am using. I have 4 files, one of which contains the word good. I want to list that file, and its creation date is the current date. Based on both criteria I want to list that file.
Try this:
find . -type f -newermt 2018-02-21 ! -newermt 2018-02-22 -exec grep -l good {} \;
or
find . -type f -newermt 2018-02-21 ! -newermt 2018-02-22 | xargs grep -l good
And please, don't parse ls output.
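If you would rather compute "today" at run time than hard-code the dates, a small sketch (assuming GNU find, whose -newermt takes a date string, and date +%F for the YYYY-MM-DD format):
today=$(date +%F)
find . -type f -newermt "$today" -exec grep -l good {} +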
Hi, try the command below. How does it work? find with the parameter -mtime -1 searches for files modified within the last 24 hours (i.e. with the current date) in the current directory as well as its subdirectories. Each file found is passed to grep one at a time, and grep checks for the string in that file:
find . -mtime -1 -type f | xargs grep -i "good"
The command above lists all files with the current date. To list only files of a specific kind, use the command below; here I am listing only txt files:
find . -name "*.txt" -mtime -1 -type f | xargs grep -i "good"
find . runs it from the current directory (the dot means the current directory). To run it from a specific directory path, modify it like below:
find /yourpath/ -name "*.txt" -mtime -1 -type f | xargs grep -i "good"
Also, grep -i means "ignore case". For a case-sensitive match just use grep "good".
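Since the goal is to list the matching file rather than its matching lines, grep -l prints just the filename; a sketch combining both criteria:
find . -type f -mtime -1 -exec grep -l "good" {} +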

Terminal find, directories last instead of first

I have a makefile that concatenates JavaScript files together and then runs the file through uglify-js to create a .min.js version.
I'm currently using this command to find and concat my files
find src/js -type f -name "*.js" -exec cat {} >> ${jsbuild}$# \;
But it lists files in subdirectories first. That makes heaps of sense, but I'd like it to list the .js files directly in src/js before those in the subdirectories, to avoid getting my undefined JS error.
Is there any way to do this? I've had a google around and seen the sort command and the -s flag for find, but it's a bit above my understanding at this point!
[EDIT]
The final solution is slightly different from the accepted answer, but it is marked as accepted because it brought me to the answer. Here is the command I used:
cat `find src/js -type f -name "*.js" -print0 | xargs -0 stat -f "%z %N" | sort -n | sed -e "s|[0-9]*\ \ ||"` > public/js/myCleverScript.js
Possible solution:
use find to get each filename along with its directory depth, i.e. find ... -printf "%d\t%p\n"
sort list by directory depth with sort -n
remove directory depth from output to use filenames only
test:
without sorting:
$ find folder1/ -depth -type f -printf "%d\t%p\n"
2 folder1/f2/f3
1 folder1/file0
with sorting:
$ find folder1/ -type f -printf "%d\t%p\n" | sort -n | sed -e "s|[0-9]*\t||"
folder1/file0
folder1/f2/f3
The command you need looks like:
cat $(find src/js -type f -name "*.js" -printf "%d\t%p\n" | sort -n | sed -e "s|[0-9]*\t||")>min.js
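If the tree may contain filenames with embedded spaces or newlines, a NUL-separated sketch of the same depth-sort idea (assuming GNU find, sort, sed and xargs, which all support NUL-terminated records via \0, -z and -0):
find src/js -type f -name '*.js' -printf '%d\t%p\0' | sort -z -n | sed -z -e 's|^[0-9]*\t||' | xargs -0 cat > min.js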
Mmmmm...
find src/js -type f
shouldn't find ANY directories at all, and doubly so as your directory names will probably not end in ".js". (The quotes around your -name pattern are not the problem either; they are actually useful, since they stop the shell from expanding *.js before find sees it.)
find src/js -type f -name "*.js" -exec cat {} >> ${jsbuild}$# \;
You can give find the first directory level already expanded on the command line, which enforces the order of the directory-tree traversal. This solves the problem only for the top directory (unlike the already accepted solution by Sergey Fedorov), but it should answer your question too, and more options are always welcome.
Using GNU coreutils ls, you can sort directories before regular files with the --group-directories-first option. From reading the Mac OS X ls manpage it seems that directories are always grouped there, so on OS X you would just drop the option.
ls -A --group-directories-first -r | tac | xargs -I'%' find '%' -type f -name '*.js' -exec cat '{}' + > ${jsbuild}$#
If you do not have the tac command, you can easily implement it using sed; it reverses the order of lines. See info sed tac in the GNU sed documentation.
tac() {
    # 1!G: on every line but the first, append the hold space (the earlier lines, already reversed)
    # $p:  on the last line, print the accumulated pattern space
    # h:   save the pattern space back into the hold space for the next cycle
    sed -n '1!G;$p;h'
}
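Usage is the same as the real tac, reading stdin and printing the lines in reverse order:
printf '%s\n' one two three | tac
# prints: three, two, one (each on its own line)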
You could do something like this...
First create a variable holding the name of our output file:
OUT="$(pwd)/theLot.js"
Then, get all "*.js" in top directory into that file:
cat *.js > $OUT
Then have "find" grab all other "*.js" files below current directory:
find . -type d ! -name . -exec sh -c "cd {} ; cat *.js >> $OUT" \;
Just to explain the "find" command, it says:
find
. = starting at current directory
-type d = all directories, not files
! -name . = except the current one
-exec sh -c = and for each one you find execute the following
"..." = go to that directory and concatenate all "*.js" files there onto end of $OUT
\; = and that's all for today, thank you!
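One caveat with embedding {} inside the sh -c string: a directory name containing spaces, quotes or $ would be re-parsed by the inner shell. A safer sketch of the same loop passes the directory as a positional parameter instead:
export OUT   # export it so the inner sh inherits the variable
find . -type d ! -name . -exec sh -c 'cd "$1" && cat *.js >> "$OUT"' sh {} \;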
I'd get the list of all the files:
$ find src/js -type f -name "*.js" > list.txt
Sort them by depth, i.e. by the number of '/' in them, using the following ruby script:
sort.rb:
files=[]; while gets; files<<$_; end
files.sort! {|a,b| a.count('/') <=> b.count('/')}
files.each {|f| puts f}
Like so:
$ ruby sort.rb < list.txt > sorted.txt
Concatenate them:
$ while read -r FILE; do cat "$FILE" >> output.txt; done < sorted.txt
(All this assumes that your file names don't contain newline characters.)
EDIT:
I was aiming for clarity. If you want conciseness, you can absolutely condense it to something like:
find src/js -name '*.js'| ruby -ne 'BEGIN{f=[];}; f<<$_; END{f.sort!{|a,b| a.count("/") <=> b.count("/")}; f.each{|e| puts e}}' | xargs cat >> concatenated

Better way to limit the unix command find by filename

I'm getting results using find with filenames that have '~' and .swp, etc. So I did the following, but is there a better way to do this? The '.*.js' -iname '*.js' part feels "redundant".
$ find ./ '.*.js' -iname '*.js' -print0 | xargs -0 grep -n ".*loginError.*"
find: `.*.js': No such file or directory
./js/signin.js:252: foo.loginError();
./js/signin.js:339:foo.loginError = function() {
./js/signin.js:340: foo.log("ui.loginError");
Try using
find . -name \*.js -print0 | xargs -0 grep -n ".*loginError.*"
That will find only files with the 'js' extension, and none ending in ~ or .swp.
EDIT: Added -print0, paired with xargs -0, to handle filenames safely.
To do it all in one command without xargs, you could do it like this:
find . -name "*.js" -exec grep -n ".*loginError.*" /dev/null {} \;
The /dev/null piece makes grep think it is searching multiple files, so it prints the filename prefix correctly; otherwise it would print only the line number and match without telling you which file they came from.
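If your grep supports -H (GNU and BSD grep both do), it forces the filename prefix and can replace the /dev/null trick; using + instead of \; also batches files into fewer grep invocations:
find . -name "*.js" -exec grep -nH ".*loginError.*" {} +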

Why does my shell script not find anything (find . -name script.sh | grep watermelon)

I have a script that I'm running from the home directory to search for all files called "script.sh" that contain the string "watermelon". It's not finding anything but I can clearly see these scripts in the subdirectories. Could someone please suggest a change to the command I'm using:
find . -name script.sh | grep watermelon
You need to use xargs:
find . -name script.sh | xargs grep watermelon
xargs will modify the behavior to search within the files, rather than just search within the names of the files.
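If any of the paths might contain spaces, the NUL-separated form is safer, and -l prints just the names of the matching files:
find . -name script.sh -print0 | xargs -0 grep -l watermelon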
find returns the filename it finds by default. If you want it to search within the files then you need to pipe it to xargs or use the -exec and -print predicates:
find . -name script.sh -exec grep -q watermelon {} \; -print
use -type f to restrict the match to regular files
find . -type f -name "script.sh" -exec grep "watermelon" "{}" +
or if you have bash 4
shopt -s globstar
grep -Rl "watermelon" **/script.sh
