Shell script to traverse files recursively

I need some assistance in creating a shell script that runs a specific command (any command) on each file in a folder, and recursively dives into sub-directories.
I'm not sure how to start.
A point in the right direction would suffice. Thank you.

To apply a command (say, echo) to all files below the current path, use
find . -type f -exec echo "{}" \;
For directories, use -type d instead.
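For example, to print every directory below the current path:
find . -type d -exec echo "{}" \;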

You should be looking at the find command.
For example, to change permissions on all JPEG files under your /tmp directory:
find /tmp -name '*.jpg' -exec chmod 777 {} ';'
If there are a lot of files, though, you can combine it with xargs to batch them up, something like:
find /tmp -name '*.jpg' | xargs chmod 777
And, on implementations of find and xargs that support null-separation:
find /tmp -name '*.jpg' -print0 | xargs -0 chmod 777
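Alternatively, POSIX find can do the batching itself with the + terminator to -exec, which avoids the whitespace pitfalls of the plain xargs pipe without requiring -print0 support:
find /tmp -name '*.jpg' -exec chmod 777 {} +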

With Bash 4.0 or later, you can use globstar:
#!/bin/bash
shopt -s globstar
for file in **/*.txt
do
echo "do something with $file"
done
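One caveat: if no .txt files exist, the pattern is passed through unexpanded and the loop body runs once on the literal string **/*.txt. Enabling nullglob alongside globstar avoids that:
shopt -s globstar nullglob
for file in **/*.txt
do
    echo "do something with $file"
done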

To recursively list all files
find . -name '*'
And let's say, for example, you want to grep each file; then:
find . -type f -name 'pattern' -print0 | xargs -0 grep 'searchtext'

Within a bash script, you can go through the results of the find command this way (reading line by line handles spaces in filenames, which an unquoted for loop over command substitution would not):
find . -type f | while read -r F
do
    # command that uses "$F"
done
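If filenames may also contain newlines, a null-delimited read is more robust. A sketch, assuming bash and a find that supports -print0:
while IFS= read -r -d '' F
do
    # command that uses "$F"
done < <(find . -type f -print0)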

Related

How to write the wc's stdout into a file?

The command below shows how many characters each file in the current directory contains.
find -name '*.*' | xargs wc -c
I want to write the standard output into a file.
find -name '*.*' | xargs wc -c > /tmp/record.txt
It encounters an issue:
wc: .: Is a directory
How can I write all of the standard output into a file?
Why -name '*.*'? That will not find every file (only names containing a dot) and will also match directories. You need to use -type f, and better than piping the result to xargs is using -exec:
find . -maxdepth 1 -type f -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't descend into subdirectories (note that GNU find wants -maxdepth before the other tests).
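One more wrinkle: when wc receives several file operands it appends a total line (one per batch when -exec ... + splits a long list). A rough filter, assuming no file is literally named total:
find . -maxdepth 1 -type f -exec wc -c {} + | grep -v ' total$' > /tmp/record.txt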
I think you maybe meant find | xargs wc -c?
find -name '.' just returns .
Filter for files only, if files are all you want:
find -type f

Find file and rename it BASH

Any idea why this is not working? I checked many links and I can't figure out why; I think the syntax is correct.
I want to find the file maplist.txt.old and then rename it to maplist.txt in the same folder. I got no errors.
find ~ -type f -name csgo/maplist.txt.old -execdir mv csgo/maplist.txt.old maplist.txt \;
Lots of ways to handle this. The immediate problem is that -name takes a bare filename, not a path, so -name csgo/maplist.txt.old can never match. Since you are looking in ~/csgo you can go directly to that directory in the find. The -execdir option runs the command in the matched file's directory. So, without changing your example much:
find ~/csgo -type f -name maplist.txt.old -execdir mv maplist.txt.old maplist.txt \;
To automate this a bit further, you may want to handle it with a bash for loop, for example:
for file in $( find ~/csgo -type f -name maplist.txt.old ) ; do
    mv "$file" "$( echo "$file" | sed -e 's/\.old$//' )"
done
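Note that the loop above splits on whitespace, so it breaks on paths containing spaces. A safer sketch does the rename inside find itself via a small inline shell:
find ~/csgo -type f -name maplist.txt.old -execdir sh -c 'mv "$1" "${1%.old}"' _ {} \;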

copy files with the base directory

I am searching a specific directory and its subdirectories for new files, which I would like to copy. I am using this:
find /home/foo/hint/ -type f -mtime -2 -exec cp '{}' ~/new/ \;
It copies the files successfully, but some files have the same name in different subdirectories of /home/foo/hint/.
I would like to copy the files along with their base directory into the ~/new/ directory.
test@serv> find /home/foo/hint/ -type f -mtime -2 -exec ls '{}' \;
/home/foo/hint/do/pass/file.txt
/home/foo/hint/fit/file.txt
test@serv>
~/new/ should look like this after copy:
test@serv> ls -R ~/new/
/home/test/new/pass/:
file.txt
/home/test/new/fit/:
file.txt
test@serv>
Platform: Solaris 10.
Since you can't use rsync or fancy GNU options, you need to roll your own using the shell.
The find command lets you run a full shell in your -exec, so you should be good to go with a one-liner to handle the names.
If I understand correctly, you only want the parent directory, not the full tree, copied to the target. The following might do:
#!/usr/bin/env bash
findopts=(
-type f
-mtime -2
-exec bash -c 'd="${0%/*}"; d="${d##*/}"; mkdir -p "$1/$d"; cp -v "$0" "$1/$d/"' {} ./new \;
)
find /home/foo/hint/ "${findopts[#]}"
Results:
$ find ./hint -type f -print
./hint/foo/slurm/file.txt
./hint/foo/file.txt
./hint/bar/file.txt
$ ./doit
./hint/foo/slurm/file.txt -> ./new/slurm/file.txt
./hint/foo/file.txt -> ./new/foo/file.txt
./hint/bar/file.txt -> ./new/bar/file.txt
I've put the options to find into a bash array for easier reading and management. The script for the -exec option is still a little unwieldy, so here's a breakdown of what it does for each file. Bear in mind that in this format arguments are numbered from zero: the {} becomes $0 and the target directory becomes $1...
d="${0%/*}" # Store the source directory in a variable, then
d="${d##*/}" # strip everything up to the last slash, leaving the parent.
mkdir -p "$1/$d" # create the target directory if it doesn't already exist,
cp "$0" "$1/$d/" # then copy the file to it.
I used cp -v for verbose output, as shown in "Results" above, but IIRC it's not supported by Solaris either, and it can safely be dropped.
The --parents flag (GNU cp) should do the trick:
find /home/foo/hint/ -type f -mtime -2 -exec cp --parents '{}' ~/new/ \;
Try testing with rsync -R, for example:
find /your/path -type f -mtime -2 -exec rsync -R '{}' ~/new/ \;
From the rsync man:
-R, --relative
Use relative paths. This means that the full path names specified on the
command line are sent to the server rather than just the last parts of the
filenames.
The problem with the answers by @Mureinik and @nbari might be that the absolute path of the new files will reappear inside the target directory. In this case you might want to switch to the base directory before the command and go back to your current directory afterwards:
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; ; cd $path_current
or
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec rsync -R '{}' ~/new/ \; ; cd $path_current
Both ways work for me on a Linux platform. Let's hope that Solaris 10 knows about rsync's -R! ;)
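If your rsync honors it, you can also skip the cd dance by marking the anchor point with /./ in the source path; --relative then recreates only the directories after the dot. A sketch, untested on Solaris:
find /home/foo/hint/./ -type f -mtime -2 -exec rsync -R '{}' ~/new/ \;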
I found a way around it:
cd ~/new/
find /home/foo/hint/ -type f -mtime -2 -exec nawk -v f={} '{n=split(FILENAME, a, "/");j= a[n-1];system("mkdir -p "j"");system("cp "f" "j""); exit}' {} \;

How to print the deleted file names along with path in shell script

I am deleting the files in all the directories and subdirectories using the command below:
find . -type f -name "*.txt" -exec rm -f {} \;
But I want to know which files were deleted, along with their paths. How can I do this?
Simply add a -print argument to your find.
$ find . -type f -name "*.txt" -print -exec rm -f {} \;
As noted by @JonathanRoss, you can achieve an equivalent result with the -v option to rm.
It's beyond the scope of your question, but more generally it gets more interesting if you want to delete directories recursively. Then:
a simple -exec rm -r argument keeps it silent
a -print -exec rm -r argument reports the toplevel directories you're operating on
a -exec rm -rv argument reports all you're removing
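If you also want a permanent record of what was removed, the -print output can simply be redirected, since only -print writes to stdout here. For example, logging to a hypothetical deleted.log:
find . -type f -name "*.txt" -print -exec rm -f {} \; > deleted.log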

Why does my shell script not find anything (find . -name script.sh | grep watermelon)

I have a script that I'm running from the home directory to search for all files called "script.sh" that contain the string "watermelon". It's not finding anything, but I can clearly see these scripts in the subdirectories. Could someone please suggest a change to the command I'm using:
find . -name script.sh | grep watermelon
You need to use xargs:
find . -name script.sh | xargs grep watermelon
xargs will modify the behavior so that grep searches within the files, rather than just matching against the names of the files.
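As with the earlier examples, a plain pipe to xargs mishandles filenames containing spaces; on implementations of find and xargs that support null-separation, this is safer:
find . -name script.sh -print0 | xargs -0 grep watermelon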
By default, find returns the filenames it finds. If you want to search within the files, you need to pipe to xargs or use the -exec and -print predicates:
find . -name script.sh -exec grep -q watermelon {} \; -print
Use -type f to match regular files only:
find . -type f -name "script.sh" -exec grep "watermelon" {} +
Or, if you have bash 4:
shopt -s globstar
grep -Rl "watermelon" **/script.sh
