Is there a way to perform echo | tee during find -exec? - bash

I have a problem with a code similar to the following:
function echotee() { echo $1 | tee -a ${FILE}; }
export -f echotee
find . -delete -exec sh -c 'echotee "Deleting: {}"' \;
The function echotee usually works as expected. However, during the -exec it does not: it just prints to the terminal, omitting tee.
Hoping the question is not too trivial, thanks in advance.

Why don't you just use this:
find . -delete -exec sh -c 'echo "Deleting: $1" | tee -a "$2"' _ {} "${FILE}" \;
No need to define and call a function.
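If there are many files, a variant that batches filenames with -exec ... + and loops inside a single sh would spawn far fewer processes; here "${FILE}" is passed as $0 so it is visible inside the loop (a sketch of the same idea, not a drop-in replacement):
find . -delete -exec sh -c '
    for f do
        printf "Deleting: %s\n" "$f" | tee -a "$0"
    done
' "${FILE}" {} +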
You mentioned in a comment that you want to use echotee as a central point to print and log information. Have you considered a setup like this instead:
#!/usr/bin/env bash
# Send all script output to console and logfile
LOGFILE="..."
exec > >(tee -ia "${LOGFILE}") 2>&1
find . -delete -printf "Deleting: %f\n"
or this:
#!/usr/bin/env bash
# Set up fd 3 to send output to console and logfile on demand
LOGFILE="..."
exec 3> >(tee -ia "${LOGFILE}")
find . -delete -printf "Deleting: %f\n" 1>&3 2>&1
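Elsewhere in the same script you can then decide per command whether output should be logged: anything redirected to fd 3 goes to both the console and the logfile, everything else only to the console (a minimal sketch):
echo "this line goes to both the console and the logfile" >&3
echo "this line goes to the console only"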

Use name() instead of function name().
You did not set or export the FILE variable.
sh does not support exported functions; that is a bash feature, so you have to call bash.
sh -c ' .... "{}"' will break on filenames containing a " character. Pass the filename as a positional argument and use $1 instead.
$1 and $FILE expansions are not quoted and are subject to word splitting and filename expansion.
echo $1 will break on filenames like -e. Prefer printf.
Check your scripts with shellcheck - it will catch many such mistakes.
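For instance, if the snippet lived in a file (say, a hypothetical myscript.sh), running shellcheck on it would flag the unquoted expansions:
shellcheck myscript.sh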
I think you meant to:
FILE=/tmp/log.txt
echotee() { printf "%s\n" "$1" | tee -a "$FILE"; }
export -f echotee
export FILE
find . -exec bash -c 'echotee "Deleting: $1"' -- {} \;
But the version from Shawn with -printf "Deleting: %p\n" | tee "$FILE" looks just nicer.
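Spelled out, that would be roughly the following (a sketch; use tee -a if the log should be appended to rather than overwritten):
find . -delete -printf "Deleting: %p\n" | tee "$FILE"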
I think spawning tee and a pipe will be slower; doing it like this could be a bit faster:
echotee() { printf "%s\n" "$1" >> "$FILE"; printf "%s\n" "$1"; }
or like:
exec 10>>"$FILE"
echotee() { printf "%s\n" "$1" >&10; printf "%s\n" "$1"; }
You could remove the pipe either way, just:
echotee() { tee -a "$FILE" <<<"$1"; }

Related

How to extract code into a function when using xargs -P?

At first, I wrote the code below, and it ran well.
# version1
all_num=10
thread_num=5
a=$(date +%H%M%S)
seq 1 ${all_num} | xargs -n 1 -I {} -P ${thread_num} sh -c 'echo abc{}'
b=$(date +%H%M%S)
echo -e "startTime:\t$a"
echo -e "endTime:\t$b"
Now I want to extract the code into a function, but it is wrong. How can I fix it?
get_file(i){
echo "abc"+i
}
all_num=10
thread_num=5
a=$(date +%H%M%S)
seq 1 ${all_num} | xargs -n 1 -I {} -P ${thread_num} sh -c "$(get_file {})"
b=$(date +%H%M%S)
echo -e "startTime:\t$a"
echo -e "endTime:\t$b"
Because /bin/sh isn't guaranteed to support either printing text that, when evaluated, defines your function, or exporting functions through the environment, we need to do this the hard way: duplicate the text of the function inside the copy of sh started by xargs.
Other questions already exist on this site describing how to accomplish this with bash, which is considerably easier. See for example How can I use xargs to run a function in a command substitution for each match?
#!/bin/sh
all_num=10
thread_num=5
batch_size=1 # but with a larger all_num, turn this up to start fewer copies of sh
a=$(date +%H%M%S) # warning: this is really inefficient
seq 1 ${all_num} | xargs -n "${batch_size}" -P "${thread_num}" sh -c '
get_file() { i=$1; echo "abc ${i}"; }
for arg do
get_file "$arg"
done
' _
b=$(date +%H%M%S)
printf 'startTime:\t%s\n' "$a"
printf 'endTime:\t%s\n' "$b"
Note:
echo -e is not guaranteed to work with /bin/sh. Moreover, for a shell to be truly compliant, echo -e is required to write -e to its output. See Why is printf better than echo? on UNIX & Linux Stack Exchange, and the APPLICATION USAGE section of the POSIX echo specification.
Putting {} in a sh -c '...{}...' position is a Really Bad Idea. Consider the case where you're passed in a filename that contains $(rm -rf ~)'$(rm -rf ~)' -- it can't be safely inserted in an unquoted context, or a double-quoted context, or a single-quoted context, or a heredoc.
Note that seq is also nonstandard and not guaranteed to be present on all POSIX-compliant systems. i=0; while [ "$i" -lt "$all_num" ]; do echo "$i"; i=$((i + 1)); done is an alternative that will work on all POSIX systems.
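Put together, a seq-free version of the same pipeline might look like this (a sketch under the same assumptions as the script above):
#!/bin/sh
all_num=10
thread_num=5
i=1
while [ "$i" -le "$all_num" ]; do
    echo "$i"
    i=$((i + 1))
done | xargs -n 1 -P "$thread_num" sh -c '
    get_file() { i=$1; echo "abc ${i}"; }
    for arg do
        get_file "$arg"
    done
' _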

bash - forcing globstar asterisk expansion when passed to loop

I am attempting to write a script that uses globstar expressions to execute a command (for example ls):
#!/usr/bin/env bash
shopt -s globstar nullglob
DISCOVERED_EXTENSIONS=$(find . -type f -name '*.*' | sed 's|.*\.||' | sort -u | tr '\n' ' ' | sed "s| | ./\**/*.|g" | rev | cut -c9- | rev | echo "./**/*.$(</dev/stdin)")
IFS=$'\n'; set -f
for f in $(echo $DISCOVERED_EXTENSIONS | tr ' ' '\n'); do
ls $f;
done
unset IFS; set +f
shopt -u globstar nullglob
The script output is:
ls: ./**/*.jpg: No such file or directory
ls: ./**/*.mp4: No such file or directory
It is passing ls "./**/*.avi" instead of ls ./**/*.avi (the glob is not expanded). I attempted to use eval, envsubst and even a custom expand function, to no avail.
The result of echo "$DISCOVERED_EXTENSIONS" is:
./**/*.jpg ./**/*.mp4
What changes can be recommended so that the value of $f is the result of glob expansion and not the expression itself?
EDIT: I'm keeping the question up even though I have resolved my problem by not using globstar at all; that solves my immediate problem but doesn't answer the question.
As pynexj points out, set -f undoes shopt -s globstar nullglob, which makes the script I've written non-functional, and removing set -f breaks the script as well.
$f is the result of glob expansion
The result of glob expansion is a list of arguments. It could be saved in an array. Saving it is just a matter of calling a subshell and transferring the data.
mapfile -t -d '' arr < <(bash -O globstar -O nullglob -c 'printf "%s\0" '"$f")
ls "${arr[@]}"
Notes:
Do not do for i in $(....). Use a while IFS= read -r loop; see the BashFAQ entry on how to read a stream line by line.
I have no idea what is going on in that long DISCOVERED_EXTENSIONS line, but I would use find . -maxdepth 1 -type f -name '*.*' -exec bash -c 'printf "%s\n" "${0##*.}"' {} \; | sort -u.
I usually recommend using find instead of globbing, and working on pipelines/streams. I guess I would write it as: find . -maxdepth 1 -type f -name '*.*' -exec bash -c 'printf "%s\n" "${0##*.}"' {} \; | sort -u | while IFS= read -r ext; do find . -type f -name "*.$ext" | xargs -d '\n' ls; done
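Spelled out over several lines, the same idea reads a bit more easily (a sketch of that pipeline, not a drop-in replacement for the original script):
#!/usr/bin/env bash
find . -maxdepth 1 -type f -name '*.*' \
    -exec bash -c 'printf "%s\n" "${0##*.}"' {} \; |
sort -u |
while IFS= read -r ext; do
    find . -type f -name "*.$ext" | xargs -d '\n' ls
done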

bash: Call a function from a while with arguments

I have this working:
$ find . -name 'copy_*.txt' |while read i ; do echo $i; git mv $i ${i%.txt}.cob ;done
I want to have the main body in a bash function:
$ my_mv () { echo $1; mv $1 ${1%.cob}.toto; }
To then call it with:
$ find . -name 'copy_*.txt' |while read i ;do my_mv $i; done
But I get a silent execution and nothing happens:
$ my_mv () { echo $1; mv $1 ${1%.cob}.toto; }
$ find . -name 'copy_*.txt' |while read i ;do my_mv $i; done
$
same with:
$ my_mv () { printf '%s\n' $1; mv $1 ${1%.cob}.toto; }
As said in the comments, it's not working because you're searching for files copy_*.txt but trying to move files with the suffix .cob. Furthermore,
your variables are not quoted and will cause problems with whitespace in filenames.
Export your function to make my_mv available in find and use -exec to prevent problems with filenames containing newlines:
my_mv () { for i; do echo "$i"; mv "$i" "${i%.cob}.toto"; done; }
export -f my_mv
find . -name 'copy_*.cob' -exec bash -c 'my_mv "$@"' bash {} +
It's often easier to use a small shell script instead of a function:
find . -name 'copy_*.cob' -exec sh -c '
for i; do
echo "$i"
mv "$i" "${i%.cob}.toto"
done
' sh {} +
Or move the code into a shell script mymv.sh
#!/bin/sh
for i; do
echo "$i"
mv "$i" "${i%.cob}.toto"
done
and execute the script in find:
find . -name 'copy_*.cob' -exec ./mymv.sh {} +
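Remember to make the script executable first, or find will fail to run it:
chmod +x ./mymv.sh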

Why am I not getting a value when I call a function within another in a bash script?

I have a function that generates a random file name
#generate random file names
get_rand_filename() {
if [ "$ASCIIONLY" == "1" ]; then
for ((i=0; i<$((MINFILENAMELEN+RANDOM%MAXFILENAMELEN)); i++)) {
printf \\$(printf '%03o' ${AARR[RANDOM%aarrcount]});
}
else
# no need to escape double quotes for filename
cat /dev/urandom | tr -dc '[ -~]' | tr -d '[$></~:`\\]' | head -c$((MINFILENAMELEN+RANDOM%MAXFILENAMELEN)) #| sed 's/\(["]\)/\\\1/g'
fi
printf "%s" $FILEEXT
}
export -f get_rand_filename
When I call it from within another function
cf(){
fD=$1
echo "the target dir recieved is " $fD
CFILE="$(get_rand_filename)"
echo "the file name is "$CFILE
}
export -f cf
when I call
echo "$targetdir" | xargs -0 sh -c 'cf $1' sh
I only get the $FILEEXT (no random file name).
when I call
cf "$targetdir"
I get a valid result
I need to be able to handle spaces in the $targetdir and file name string.
echo "$targetdir" | xargs -0 sh -c 'cf $1' sh
You should invoke bash rather than sh. Function exporting is a bash feature.
$ foo() { echo bar; }
$ export -f foo
$ sh -c 'foo'
sh: 1: foo: not found
$ bash -c 'foo'
bar
Also, get rid of the -0 option since the input isn't NUL-separated. Use -d'\n' instead. And quote "$1" for robustness.
echo "$targetdir" | xargs -d'\n' bash -c 'cf "$1"' bash
Actually, you could use -0 if you change the input format.
printf '%s\0' "$targetdir" | xargs -0 bash -c 'cf "$1"' bash
For what it's worth, mktemp creates random temporary files, and does it safely. It makes sure the file doesn't already exist and then creates it to prevent anybody else from snatching up the name in the split second between the name being generated and it being returned to the caller.
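A minimal sketch of what that could look like here, assuming GNU mktemp (the --suffix option is a GNU extension) and the extension kept in $FILEEXT:
CFILE=$(mktemp --suffix="$FILEEXT" /tmp/rand_XXXXXXXX) || exit 1
echo "the file name is $CFILE"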

getting the output of a grep command in a loop

I have a shell script that includes this search:
find . -type f -exec grep -iPho "barh(li|mar|ag)" {} \;
I want to capture each string the grep command finds and send it to a function I will create named "parser":
parser(){
# do stuff with each single grep result found
}
How can that be done?
Is this right?
find . -type f -exec grep -iPho "barh(li|mar|ag)" {parser $1} \;
I do not want to send the entire find command output to the function.
Only a shell can execute a function, so you need to use bash -c in your find in order to call it. That is also why you need to export your function, so that the new process sees it.
parser() {
while IFS= read -r line; do
echo "Processing line: $line"
done <<< "$1"
}
export -f parser
find . -type f -exec bash -c 'parser "$(grep -iPho "barh(li|mar|ag)" "$1")"' -- {} \;
The code above will send all occurrences from file1, then file2, etc. to your function to process. It will not send the lines one by one, so you need to loop over them inside your function. If there is no occurrence of your regex in a file, it will still call your function with empty input!
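If the empty-input case is a problem, one option is to check grep's exit status and only call the (exported) function when something was actually found (a sketch of that guard):
find . -type f -exec bash -c '
    # relies on the exported parser function from above
    out=$(grep -iPho "barh(li|mar|ag)" "$1") && parser "$out"
' -- {} \;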
That might not be the best solution for you so let's try to add the loop inside the bash -c statement and really process the lines one by one:
parser() {
echo "Processing line: $1"
}
export -f parser
find . -type f -exec bash -c 'grep -iPho "barh(li|mar|ag)" "$#" | while IFS= read -r line; do parser "$line"; done' -- {} +
EDIT: A very nice and simple solution that does not use bash -c, suggested by @gniourf_gniourf:
parser() {
echo "Processing line: $1"
}
find . -type f -exec grep -iPho "barh(li|mar|ag)" {} + | while IFS= read -r line; do parser "$line"; done
This approach works fine and will process each line one by one. You also do not need to export your function. But you have to watch out for some things that might surprise you.
Each command in a pipeline is executed in its own subshell, and any variable assignment in your parser function, or in your while loop in general, will be lost after returning from that subshell. If you are writing a script, a simple shopt -s lastpipe will suffice to run the last pipe command in the current shell environment. Or you can use process substitution:
parser() {
echo "Processing line: $1"
}
while IFS= read -r line; do
parser "$line";
done < <(find . -type f -exec grep -iPho "barh(li|mar|ag)" {} +)
Note that in the previous bash -c examples, you will experience the same behavior and your variable assignments will be lost as well.
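For completeness, a minimal lastpipe sketch (it only takes effect in a non-interactive script, where job control is off), showing that a variable set inside the loop survives:
#!/usr/bin/env bash
shopt -s lastpipe
parser() { echo "Processing line: $1"; }   # same parser as above
count=0
find . -type f -exec grep -iPho "barh(li|mar|ag)" {} + |
while IFS= read -r line; do
    parser "$line"
    count=$((count + 1))
done
echo "total matches: $count"   # survives because the while loop ran in the current shell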
You need to export your function.
You also need to call bash to execute the function.
parser() {
echo "GOT: $1"
}
export -f parser
find Projects/ -type f -name '*rb' -exec bash -c 'parser "$0"' {} \;
I suggest you use sed; it is a more powerful tool for text processing.
For example, to append the string "myparse" to every line that ends with "ha", I can do this:
# echo "haha" > text1
# echo "hehe" > text2
# echo "heha" > text3
# find . -type f -exec sed '/ha$/s/ha$/ha myparse/' {} \;
haha myparse
heha myparse
hehe
If you really want to modify the files in place, not just print to stdout, you can do it like this:
# find . -type f -exec sed -i '/ha$/s/ha$/ha myparse/' {} \;
