How to pass a custom function's output to the find utility as options? - bash

I want to pass a list of extensions as a parameter, so I wrote a small helper function that parses a string of extensions and formats it the way the find utility expects:
exts="txt log";
track=0;
function ext_parse()
{
    for i in $exts; do
        if [[ $track -eq 0 ]]
        then
            varstr="-iname \"*.$i\"";
            track=1;
        else
            varstr="$varstr -o -iname \"*.$i\" ";
        fi
    done
    echo "$varstr";
}
So it returns:
-iname "*.txt" -o -iname "*.log"
If I put this into "find" directly it works well:
find . -type f \( -iname "*.txt" -o -iname "*.log" \) -print
But every attempt I've made to substitute this string via the function above fails.
Is there any way to obtain that behavior, or is it impossible by design?
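For reference, here is a minimal sketch of the failure mode (my reproduction, not from the question): after word splitting, the embedded double quotes are not removed, so find receives them as literal characters in the pattern.
varstr='-iname "*.txt" -o -iname "*.log"'
find . -type f \( $varstr \) -print   # matches nothing: find looks for names literally containing the quote characters
By the time find sees them, the quotes are data, not syntax, which is why no amount of quoting inside the string reproduces the hand-typed command.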

I would argue it's cleaner, safer and easier to use arrays:
ext_parse() {
    local i
    varstr=()
    for i; do
        ((${#varstr[@]}!=0)) && varstr+=( -o )
        varstr+=( -iname "*.$i" )
    done
}
To use this function, first call it with the appropriate arguments, e.g. ext_parse txt log; this sets the array varstr, which you can then use as:
find -type f \( "${varstr[@]}" \)
So your workflow looks like this:
$ ext_parse txt log
$ find -type f \( "${varstr[@]}" \)
RE: your comment: to address your worry that find would be run once per array element (it is not), do the following test: save the following script as banana:
#!/bin/bash
for ((i=1;i<=$#;++i)); do
printf 'Argument %d: %s\n' "$i" "${!i}"
done
Then chmod +x banana, and try it:
$ ext_parse txt log
$ ./banana -type f \( "${varstr[@]}" \)
Argument 1: -type
Argument 2: f
Argument 3: (
Argument 4: -iname
Argument 5: *.txt
Argument 6: -o
Argument 7: -iname
Argument 8: *.log
Argument 9: )
So you can see that banana is executed once only, with all the arguments given above: exactly what you want!
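If you want to double-check what find will actually receive, bash's printf %q can render each word of the expanded command line unambiguously (a small debugging aid, not part of the answer above):
ext_parse txt log
printf '%q ' find . -type f \( "${varstr[@]}" \) -print; echo
Every argument is printed shell-quoted, so the patterns show up as \*.txt and \*.log, confirming that the globs are stored unexpanded in the array.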

Related

Exec not outputting result to function

How do I get the result of find to be used by the function? I keep getting blank results.
#!/bin/bash
#functions
function codec_scan () {
    echo "parameter 1 is this: $1" # >> $fulllog
}
#exporting
export -f codec_scan
#Main Code
find . -type f \( -name "*.avi" -o -name "*.AVI" -o -name "*.m4v" -o -name "*.mkv" -o -name "*.mp4" -o -name "*.MP4" \) -exec bash -c codec_scan \"\{}\" \;
\"\{}\" is literal characters " with {} inside. It's a sole {}. Nothing before it, nothing after it. Unless you want to add a literal " characters.
It's bash -c 'script...'. The arguments that follow are arguments to the script, not to the function. Each function has its own $1 $2 ... positional arguments, they are separate to script. And they arguments to bash -c start from $0, not from $1, see man bash, it is like bash -c 'script..' <$0> <$1> <$2> ...
You want:
find .... -exec bash -c 'codec_scan "$@"' _ {} \;
Do not use the function keyword in bash; just write codec_scan() {. See https://wiki.bash-hackers.org/scripting/obsolete
You may be interested in -iname.
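If you would rather start one bash per batch of files instead of one per file, the same fix works with -exec ... + and a small loop; a sketch along those lines (using -iname, which also collapses the duplicated upper/lower-case patterns):
find . -type f \( -iname '*.avi' -o -iname '*.m4v' -o -iname '*.mkv' -o -iname '*.mp4' \) \
    -exec bash -c 'for f; do codec_scan "$f"; done' _ {} +
codec_scan still has to be exported with export -f so the child bash can see it.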

Bash: How to use functions with parameters with find and ssh

I'm trying to search for files of a specific type on a remote ssh client, and want to call a function with the filename passed as a function parameter:
out=$(ssh operator@$IP << EOF
check_cert_date () {
echo "checking" $1
}
$(typeset -f)
find /opt -iname *.der -o -iname *.pem -exec bash -c 'for arg; do check_cert_date "$arg"; done' - {} \;
EOF
)
Files are found, but the filename itself is not passed to check_cert_date(), i.e. $1 is always empty.
Watch out for quoting with 'Here Documents': use << "EOF" so the local shell does not expand $1 and $arg before the script is sent to the remote host.
Also, find needs parens for your action to apply to both *.der and *.pem files (and the patterns should be quoted so the shell doesn't expand them):
find /opt \( -iname '*.der' -o -iname '*.pem' \) -print | while read -r file; do check_cert_date "$file"; done
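Putting both points together, a sketch of the corrected command (my adaptation: the function is simply defined inside the quoted heredoc, so the typeset -f injection is no longer needed):
out=$(ssh operator@"$IP" << "EOF"
check_cert_date () {
    echo "checking $1"
}
find /opt \( -iname '*.der' -o -iname '*.pem' \) -print | while read -r file; do check_cert_date "$file"; done
EOF
)
echo "$out"
With the quoted "EOF" delimiter, nothing inside the heredoc is expanded locally, so $1 and $file reach the remote shell intact.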

How to stop Bash expansion of "*.h" in a function?

When I run the following function, Bash expands my variable in an unexpected way, which prevents me from getting the expected result.
It comes down to the way bash deals with the "*.h" which I am passing into the function.
Here is the function I call:
link_files_of_type_from_directory "*.h" ./..
I would expect this variable to stay that way all the way through, but by the time it hits the echo $command_to_run; part of my Bash script, the variable has expanded to...
MyHeader1.h MyHeader2.h MyHeader3.h
and so on.
What I want is for Bash not to expand the pattern, so that my code runs the following:
find ./.. -type f -name '*.h'
Instead of
find ./.. -type f -name MyHeader1.h MyHeader2.h MyHeader3.h
This is the code:
function link_files_of_type_from_directory {
    local file_type=$1;
    local directory_to_link=$2;
    echo "File type $file_type";
    echo "Directory to link $directory_to_link";
    command="find $directory_to_link -type f -name $file_type";
    echo $command;
    #for i in $(find $directory_to_link -type f -name $file_type);
    for i in $command;
    do
        echo $i;
        if test -e $(basename $i); then
            echo $i exists;
        else
            echo Linking: $i;
            ln -s $i;
        fi
    done;
}
How can I prevent the expansion so that Bash searches for files matching *.h in the directory I pass in?
UPDATE 1:
So I've updated the call to be
link_files_of_type_from_directory "'*.h'" ..
And the function now assembles the string of the command to be evaluated like so:
mmd="find $directory_to_link -type f -name $file_type";
When I echo it out, it's correct :)
find .. -type f -name '*.h'
But I can't seem to get the find command to actually run. Here are the errors / mistakes I'm getting while trying to correctly assemble the for loop:
# for i in $mmd; # LOOPS THROUGH STRINGS IN COMMAND
# for i in '$(mmd)'; # RUNS MMD LITERALLY
# for i in ${!mmd}; # Errors out with: INVALID VARIABLE NAME - find .. -type f -name '*.h':
Would love help on this part, even though it is a different question :)
With your variables quoted, the unnecessary semicolons removed, and your loop wrapped into an -exec action to prevent problems with spaces, tabs and newlines in filenames, your function looks like this:
function link_files_of_type_from_directory {
    local file_type=$1
    local directory_to_link=$2
    echo "File type $file_type"
    echo "Directory to link $directory_to_link"
    find "$directory_to_link" -type f -name "$file_type" -exec sh -c '
        for i do
            echo "$i"
            if test -e "$(basename "$i")"; then
                echo "$i exists"
            else
                echo "Linking: $i"
                ln -s "$i"
            fi
        done
    ' sh {} +
}
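A usage sketch, matching the original invocation:
link_files_of_type_from_directory '*.h' ./..
The quotes at the call site (single or double) keep the *.h pattern from expanding in the caller's directory, and the quoted "$file_type" inside the function hands it to find intact, so find itself interprets the glob.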

BASH: If condition uses result from find command to determine which file will be written

I want to list all files in a nested directory, but some files in that directory have spaces in their names. I want to write the paths of the files without spaces in their names and the paths of those with spaces into two different files.
So far, I only know how to find the ones that have a space in their name, with this command:
find /<my directory> -type f -name "* *"
I want something like:
find /<my directory> -type f
if [ name has space]
then > a.txt
else > b.txt
fi
Thank you in advance.
You can put a condition in a brief -exec. This is somewhat more complex than you would hope because -exec cannot directly contain shell builtins.
find "$path" -type f -exec sh -c 'for f; do
case $f in *\ *) dest=a;; *) dest=b;; esac;
echo "$f" >>$dest.txt
done' _ {} +
In other words, pass the found files to the sh -c ... script above. (The underscore is to populate $0 with something inside the subshell.)
If the directory tree isn't too deep, perhaps it would be a lot easier to just run find twice.
find "$path" -type f -name '* *' >a.txt
find "$path" -type f \! -name '* *' >b.txt
Use two separate commands:
find "$path" -type f -name '* *' > a.txt
find "$path" -type f -not -name '* *' > b.txt

Linux/sh: How to list only the files in a folder (with whitespace) and save them into a variable in one line

I am trying to get the list of files in this format (with whitespace):
"file1.html" "file 2.php" "file_3.php"
#!/bin/sh
WEB_DIR="/volume1/web"
IFS=$'\n'
for file in $(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f) ; do
printf "\"$file\" "
done
output:
"/volume1/web/.htaccess" "/volume1/web/file.html" "/volume1/web/a b.php"
and the output is perfect, but... how do I put this output into a variable?
I tried this...
IFS=$'\n'
for file in $(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f) ; do
mystring+=$(printf "\"$file\" ")
done
echo ${mystring}
In output I have this:
tmp.sh: line 48: mystring+="/volume1/web/.htaccess" : not found
Note:
The answer below accepts the premise of the question: to build a single string value with a list of double-quoted file paths, such as the one shown in the question ("/volume1/web/.htaccess" "/volume1/web/file.html" "/volume1/web/a b.php").
However, the OP ultimately wanted to use that string as part of another command, which does not work, because the embedded double quotes are no longer recognized as string delimiters that identify separate arguments when you reference the string variable.
The correct solution is to use find ... -exec ...+ (in this case) or, generally, xargs to pass a list of filenames as operands (arguments) to another command; e.g., to pass the filenames to command foo ({} robustly passes all file paths, whether they contain spaces or not):
find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -exec foo -bar {} +
If the list of filenames doesn't go at the end of the target command line, an intermediate sh -c command is necessary:
find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -exec sh -c 'foo -bar "$#" -baz' - {} +
You've tagged your question bash, but your shebang line targets sh, where Bash extensions to the POSIX shell specification aren't guaranteed to be available.
In your case (it sounds like you're using dash, which acts as /bin/sh on Ubuntu):
ANSI C-quoted strings such as $'\n' aren't available: $'\n' expands to the literal string $\n.
This means that any of these 3 literal characters ($, \ or n) serves as a field separator; in your case that just happened to work, because the file paths happened not to contain those characters.
Operator += isn't recognized - the whole token mystring+="/volume1/web/.htaccess" is treated as a command name, which causes the error you're seeing.
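In POSIX sh the append has to be spelled out as a plain reassignment instead, which works in dash and bash alike; a one-line illustration:
mystring="$mystring\"$file\" "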
Possible solutions:
If you do want to target Bash, replace #!/bin/sh with #!/bin/bash.
Note that for your bash code to be fully robust, you should turn off globbing (set -f) in addition to setting $IFS.
Your code can be streamlined - see below.
If not (if your code must be portable), you must find POSIX-compliant alternatives - see below.
Here's a portable solution (works with any POSIX-compliant sh):
while IFS= read -r file; do
mystring="$mystring$(printf "\"$file\" ")"
done <<EOF
$(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f)
EOF
echo "$mystring"
A much more efficient variant that uses find ... -exec ... + to produce the output with (typically) a single printf call:
IFS= read -r mystring <<EOF
$(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -exec printf '"%s" ' {} +)
EOF
echo "$mystring"
The bash equivalent, using a process substitution (<(...)):
IFS= read -r mystring < \
<(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -exec printf '"%s" ' {} +)
echo "$mystring"
Also note that GNU find has a built-in -printf action that supports a variety of format strings, which makes calling the external printf utility unnecessary:
IFS= read -r mystring < <(find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -printf '"%p" ')
echo "$mystring"
I am not really sure why we are bothering with the intermediate step of using a variable:
find "${WEB_DIR}" -mindepth 1 -maxdepth 1 -type f -print0 | xargs -0 /usr/syno/bin/7z \
a "${BACKUP_DIR}/backup_webfiles_${TIMESTAMP}.7z" -xr!thumbs.db -xr!#eaDir -xr!#tmp \
-xr!#recycle -xr!lost+found -xr!.DS_Store -t7z -m0=lzma2 -ms=off -mhe -mmt -mx9 \
-v${SPLIT_VOLUME} -p"${PASSWORD}"
I am not sure whether you have to tell 7z where the file names come from; with xargs they are appended to the end of the 7z command line as arguments, so you should not need a stdin marker (-) where "${only_files}" used to be.
