How to write wc's stdout into a file? - bash

The command below shows how many characters each file in the current directory contains.
find -name '*.*' | xargs wc -c
I want to write the standard output into a file.
find -name '*.*' | xargs wc -c > /tmp/record.txt
It encounters an issue:
wc: .: Is a directory
How can I write all the standard output into a file?

Why -name '*.*'? That will not find every file (names without a dot are skipped) and it will match directories too. You need to use -type f, and better than piping the result to xargs is using -exec:
find . -maxdepth 1 -type f -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't descend into subdirectories.
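If you want a single grand total, be aware that with many files -exec ... + batches the arguments, so wc may be invoked several times and print one "total" line per batch. A sketch that sums the per-file counts itself with awk:
find . -maxdepth 1 -type f -exec wc -c {} + |
awk '$2 != "total" {sum += $1} END {print sum}'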

I think you maybe meant find |xargs wc -c?
find -name '.' just returns .

Filter only files, if you want only files.
find -type f

Related

Remove empty files and save a list of deleted files

I need a script that removes all empty files and writes a list of deleted files to a text file.
Deleting files works. Unfortunately, the listing does not work.
find . -type f -empty -print -delete
I tried something like this:
-print >> test.txt
When I redirect the output of your command to a file in ., it gets deleted by the find command before anything is written to it, since it is empty.
To solve this, make sure the output file is not empty at the beginning, or save it elsewhere:
find . -type f -empty -print -delete > ../log
or
date > log
find . -type f -empty -print -delete >> log
or, adapted from @DanielFarrell's comment:
find . -type f -empty -a -not -wholename ./log -print -delete > log
The added -a -not -wholename ./log excludes ./log from the find operation.
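If your find lacks -wholename (it is a GNU extension), the more widely supported -path test does the same job; a sketch:
find . -type f -empty ! -path ./log -print -delete > log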
You can use the -exec option with the rm command instead of -delete.
find . -type f -empty -exec rm --verbose {} \; >> logfile.txt
logfile.txt:
removed './emptyfile1'
removed './emptyfile0'
Or you can use pipes and xargs for cleaner output:
find . -type f -empty | xargs ls | tee -a logfile.txt | xargs rm
This will give you only the deleted filenames.
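Note that this pipeline breaks on filenames containing whitespace. A NUL-safe sketch, assuming GNU find and bash:
find . -type f -empty -print0 |
while IFS= read -r -d '' f; do
    printf 'removed %s\n' "$f" >> logfile.txt
    rm -- "$f"
done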

Sort names of zipped files and write list to file

I tried to list the zipped files in sorted order and write the result to a new file, but it does not work properly in my shell script. Why is my script not working?
ls | grep gz | sort -t '.' -k 2,2n > filename
I did not find any problem with your commands, but they do not seem like the right way to do this, at least to me. The two approaches below are better, I think. Try them out.
With only names:
find . -type f -name '*.html' 2>/dev/null -exec basename {} \; | sort > filename.txt
With full paths:
find . -type f -name '*.html' 2>/dev/null | sort > filename.txt
You can also add the -maxdepth 1 flag to search only the current directory where you are running this, and not recursively within nested dirs:
find . -maxdepth 1 -type f -name '*.html' 2>/dev/null | sort > filename.txt
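Adapted back to your .gz case (a sketch using GNU find's -printf to print bare filenames), keeping your numeric sort on the second dot-separated field:
find . -maxdepth 1 -type f -name '*.gz' -printf '%f\n' | sort -t '.' -k 2,2n > filename.txt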
Hope this helps you :)

Get the filename from bash script recursively using find

I am trying to retrieve the filename from the find command recursively. This command prints all the filenames with the full path:
for f in $(find -name '*.png'); do echo "$f"; done
./x.png
./bg.png
./s/bg.png
But when I try to just get the name of the file using these commands, it prints
for f in $(find -name '*.png'); do echo "${f##*/}";done
bg.png
and
for f in $(find -name '*.png'); do echo $(basename $f);done
bg.png
It omits the other 2 files. I am new to shell scripting; I couldn't figure out what's wrong with this one.
EDIT:
This is what I actually wanted:
I want to loop through a directory recursively and find all png images,
send each one to pngnq for RGBA compression,
which outputs the new file as orgfilename-nq8.png,
send that to pngcrush to generate a new file (the original file will be overwritten),
and remove the new file.
I have code which works on a single directory:
for f in *.png; do pngnq -f -n 256 "$f" && pngcrush "${f%.*}"-nq8.png "$f";rm "${f%.*}"-nq8.png; done
I want to do this recursively
Simply do:
find -name '*.png' -printf '%f\n'
If you want to run something for each file:
find -name '*.png' -printf '%f\n' |
while IFS= read -r file; do
    # do something with "$file"
done
Or with xargs:
find -name '*.png' -printf '%f\n' | xargs -n1 command
But first be sure you cannot use find directly, like this:
find -name '*.png' -exec command {} +
or
find -name '*.png' -exec bash -c 'do_something with "${1##*/}"' -- {} \;
Search for -printf on http://unixhelp.ed.ac.uk/CGI/man-cgi?find or in
man find | less +/^' *-printf'
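Putting this together with your pngnq/pngcrush loop, a recursive sketch. It assumes GNU find's -execdir, which runs the command inside each file's own directory, and it skips the intermediate -nq8 files in case find encounters them while they exist:
find . -type f -name '*.png' ! -name '*-nq8.png' -execdir bash -c '
    f=${1##*/}
    pngnq -f -n 256 "$f" && pngcrush "${f%.*}-nq8.png" "$f"
    rm -f "${f%.*}-nq8.png"
' bash {} \;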

Piping find to find

I want to pipe a find result to a new find. What I have is:
find . -iname "2010-06*" -maxdepth 1 -type d | xargs -0 find '{}' -iname "*.jpg"
Expected result: Second find receives a list of folders starting with 2010-06, second find returns a list of jpg's contained within those folders.
Actual result: "find: ./2010-06 New York\n: unknown option"
Oh darn. I have a feeling it concerns the format of the output that the second find receives as input, but my only idea was to suffix -print0 to the first find, with no change whatsoever.
Any ideas?
You need 2 things: -print0, and more importantly -I{} on xargs, otherwise the {} doesn't do anything.
find . -maxdepth 1 -type d -iname "2010-06*" -print0 | xargs -0 -I{} find '{}' -iname '*.jpg'
Useless use of xargs.
find 2010-06* -iname "*.jpg"
At least GNU find accepts multiple paths to search in. -maxdepth and -type d are implicitly taken care of, since the shell glob supplies the matching top-level names as starting points.
How about
find . -iwholename "./2010-06*/*.jpg"
etc?
Although you did say that you specifically want this find + pipe problem to work, it's inefficient to fork an extra find command. Since you are specifying -maxdepth 1, you are not traversing subdirectories, so just use a for loop with shell expansion.
for file in *2010-06*/*.jpg
do
echo "$file"
done
If you want to find all jpg files inside each 2010-06* folders recursively, there is also no need to use multiple finds or xargs
for directory in 2010-06*/
do
    find "$directory" -iname "*.jpg" -type f
done
Or just
find 2010-06* -type f -iname "*.jpg"
Or even better, if you have bash 4 and above
shopt -s globstar
shopt -s nullglob
for file in 2010-06*/**/*.jpg
do
echo "$file"
done

shell script to traverse files recursively

I need some assistance in creating a shell script to run a specific command (any) on each file in a folder, as well as recursively diving into sub-directories.
I'm not sure how to start.
A point in the right direction would suffice. Thank you.
To apply a command (say, echo) to all files below the current path, use
find . -type f -exec echo "{}" \;
for directories, use -type d
You should be looking at the find command.
For example, to change permissions all JPEG files under your /tmp directory:
find /tmp -name '*.jpg' -exec chmod 777 {} ';'
Although, if there are a lot of files, you can combine it with xargs to batch them up, something like:
find /tmp -name '*.jpg' | xargs chmod 777
And, on implementations of find and xargs that support null-separation:
find /tmp -name '*.jpg' -print0 | xargs -0 chmod 777
Bash 4.0
#!/bin/bash
shopt -s globstar
for file in **/*.txt
do
echo "do something with $file"
done
To recursively list all files:
find . -name '*'
And let's say, for example, you want to grep each file; then:
find . -type f -name 'pattern' -print0 | xargs -0 grep 'searchtext'
Within a bash script, you can go through the results from the "find" command this way:
for F in $(find . -type f)
do
    # command that uses $F
done
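Be aware that the command substitution splits filenames on whitespace. A safer sketch keeps the iteration inside find itself:
find . -type f -exec sh -c '
    for F in "$@"; do
        # command that uses "$F"
        printf "%s\n" "$F"
    done
' sh {} +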
