escaping sh/bash function arguments - bash

I want to submit multiple commands with arguments to shell functions, and thus quote my commands like this:
$ CMD=''\''CMD'\'' '\''ARG1 ARG2 ARG3'\'' '\''ARG4'\'''
$ echo $CMD
'CMD' 'ARG1 ARG2 ARG3' 'ARG4'
Now when I try to use them in a function like this:
$ function execute { echo "$1"; echo "$2"; echo "$3"; }
I get the result:
$ execute $CMD
'CMD'
'ARG1
ARG2
How can I get to this result:
$ execute $CMD
CMD
ARG1 ARG2 ARG3
Thanks in advance!
PS: I use an unquoting function like:
function unquote { echo "$1" | xargs echo; }
EDIT:
to make my intentions more clear: I want to gradually build up a command that needs arguments with spaces passed to subfunctions:
$ CMD='HOST '\''HOSTNAME'\'' '\''sh SCRIPTNAME'\'' '\''MOVE '\''\'\'''\''/path/to/DIR1'\''\'\'''\'' '\''\'\'''\''/path/to/DIR2'\''\'\'''\'''\'''
$ function execute { echo "$1 : $2 : $3 : $4"; }
$ execute $CMD
HOST : 'HOSTNAME' : 'sh : SCRIPTNAME'
The third argument breaks unexpectedly at a space; the quoting is ignored. Why?

Use an array and @ in double quotes:
function execute () {
    echo "$1"
    echo "$2"
    echo "$3"
}
CMD=('CMD' 'ARG1 ARG2 ARG3' 'ARG4')
execute "${CMD[@]}"
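A quick way to verify this (a sketch mirroring the answer, with `printf` substituted so each argument shows up on its own line):

```shell
#!/bin/bash
# Each array element arrives in the function as exactly one
# positional parameter, spaces and all.
execute() {
    printf '<%s>\n' "$@"
}

CMD=('CMD' 'ARG1 ARG2 ARG3' 'ARG4')
execute "${CMD[@]}"
# prints:
# <CMD>
# <ARG1 ARG2 ARG3>
# <ARG4>
```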

function execute {
    while (( $# > 0 )); do
        cmd=$(cut -d' ' -f1 <<< "$1")
        arg=$(sed 's/[^ ]* //' <<< "$1")
        echo "$cmd receives $arg"
        shift
    done
}
CMD1="CMD1 ARG11 ARG12 ARG13"
CMD2="CMD2 ARG21 ARG22 ARG23"
execute "$CMD1" "$CMD2"
Gives:
CMD1 receives ARG11 ARG12 ARG13
CMD2 receives ARG21 ARG22 ARG23
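The same word split can be done with parameter expansion alone, avoiding the `cut` and `sed` subshells (a sketch, not part of the original answer):

```shell
#!/bin/bash
execute() {
    while (( $# > 0 )); do
        # ${1%% *} keeps the first word, ${1#* } drops it.
        echo "${1%% *} receives ${1#* }"
        shift
    done
}

execute "CMD1 ARG11 ARG12 ARG13" "CMD2 ARG21 ARG22 ARG23"
# prints:
# CMD1 receives ARG11 ARG12 ARG13
# CMD2 receives ARG21 ARG22 ARG23
```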

Related

Prevent variable expansion in the parameter of a function

function kubeall {
    for i in `seq 0 2`; do
        echo pod-$i
        kubectl exec -it pod-$i -- bash -c "$@"
    done
}
kubeall "cat ~/logs/pod-$i/log.out"
Is it possible to prevent expansion of the variable ($i in this case) in the parameter itself?
Passing $i to kubeall as-is won't help. You should pass $i to bash as a positional parameter instead.
function kubeall {
    for i in {0..2}; do
        echo "pod-$i"
        kubectl exec -it "pod-$i" -- bash -c "$1" bash "$i"
    done
}
kubeall 'cat ~/"logs/pod-$1/log.out"'
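The positional-parameter trick can be checked without kubectl; here a plain `bash -c` stands in for the shell inside the pod (the function name is hypothetical):

```shell
#!/bin/bash
# The word after the command string becomes $0 inside bash -c,
# so a throwaway name ("bash") is passed first, then the real value,
# which the child script reads as $1.
kubeall_local() {
    for i in {0..2}; do
        bash -c 'echo "pod-$1"' bash "$i"
    done
}

kubeall_local
# prints:
# pod-0
# pod-1
# pod-2
```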
Not sure if this is what you want, but here it goes:
#!/bin/bash
function kubeall {
    echo '$@: '"$@"
    for i in $(seq 0 0); do
        echo pod-$i
        # Note: pass $i into the new shell.
        bash -c "i=$i; $@"
    done
}
# Note: single quotes here send the arguments as-is.
kubeall 'echo ~/pod-$i/log.out'
Output:
$@: echo ~/pod-$i/log.out
pod-0
/home/username/pod-0/log.out
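The `i=$i` injection can be seen in isolation (the `~` and `$i` only expand inside the child shell):

```shell
#!/bin/bash
i=0
# Single quotes around the payload keep ~ and $i unexpanded until the
# child shell runs them; the "i=$i; " prefix seeds the variable there.
bash -c "i=$i; "'echo ~/pod-$i/log.out'
```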

concatenate a function with string and execute it

I want to concatenate a command specified in a function with a string and execute it afterwards.
I will simplify my need with an example that executes "ls -l -a":
#!/bin/bash
echo -e "specify command"
read command # ls
echo -e "specify argument"
read arg # -l
test () {
    $command $arg
}
eval 'test -a'
Except that it doesn't work.
Use an array, like this:
args=()
read -r command
args+=( "$command" )
read -r arg
args+=( "$arg" )
"${args[@]}" -a
If you want a function, then you could do this:
run_with_extra_switch () {
    "$@" -a
}
run_with_extra_switch "${args[@]}"
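A deterministic check of the function, using `printf` instead of a command like `ls` so the output does not depend on the current directory:

```shell
#!/bin/bash
run_with_extra_switch () {
    "$@" -a
}

# printf '%s\n' prints every remaining argument on its own line, so
# the appended -a shows up as one extra, separate argument.
args=( printf '%s\n' hello )
run_with_extra_switch "${args[@]}"
# prints:
# hello
# -a
```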
#!/bin/bash
echo -e "specify command"
read command # ls
echo -e "specify argument"
read arg # -l
# using a variable
fun1 () {
    line="$command $arg"
}
# call the function
fun1
# parameter expansion will expand the variable into the command, which is then executed
$line
# or using stdout (overhead)
fun2 () {
    echo "$command $arg"
}
# command substitution runs the function in a sub-shell; its output is expanded into a command and executed
$(fun2)
It will work for the given question; however, to understand how it works, read up on shell expansion, and pay attention to the risk of executing arbitrary commands.
Before executing the command, you can prepend it with printf '<%s>\n', for example, to show what will actually be executed.
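For example, substituting printf for the bare expansion makes the word splitting visible:

```shell
#!/bin/bash
command=ls
arg=-l
fun1 () {
    line="$command $arg"
}
fun1
# Each word produced by the unquoted expansion appears in its own
# <...> pair, showing exactly what would be executed.
printf '<%s>\n' $line
# prints:
# <ls>
# <-l>
```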

Parsing function param in bash and pass it to new function

I'm trying to build a bash function that takes two arguments, and executes the first in a docker container, then greps the second argument in the output..but I can't get it to work.
This is what I've come up so far:
function scriptCheckData() {
    local cmd=eval "$1"
    local RESULT=$(docker exec -i "$dockerName" cmd)
    if ! echo "$RESULT" | grep "$2"; then
        echo grep failed for "$2" in: "$RESULT"
        cleanupAndExit
    fi
}
Function call:
scriptCheckData 'test.php --option 1' 'expected string'
Any help appreciated!
EDIT:
Solution was to not put my var in quotes, like this:
function scriptCheckData() {
    local RESULT=$(docker exec -i "$dockerName" $1)
    if ! echo "$RESULT" | grep "$2"; then
        echo "grep failed for [$2] in: [$RESULT]"
        cleanupAndExit
    fi
}
change this
local RESULT=$(docker exec -i "$dockerName" cmd)
with
local RESULT=$(docker exec -i "$dockerName" $cmd)
Then I would suggest using this syntax to improve the output:
echo "grep failed for [$2] in: [$RESULT]"
Anyhow, just to be sure that the function is really getting the parameters, you could add some debug lines to your script, i.e.:
echo "cmd[$cmd]"
echo "param1[$1]"
echo "param2[$2]"
Regards
Claudio
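The difference between "$1" and $1 can be reproduced without docker; a hypothetical fake_exec function stands in here for `docker exec`:

```shell
#!/bin/bash
# Stand-in for the container shell: prints one line per argument
# it receives, so splitting behaviour is easy to see.
fake_exec() {
    printf 'arg:%s\n' "$@"
}

cmd='test.php --option 1'
fake_exec "$cmd"   # quoted: arrives as ONE argument
fake_exec $cmd     # unquoted: splits into three arguments (the fix)
```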

How to change argv[0] value in shell / bash script?

The set command can be used to change values of the positional arguments $1 $2 ...
But, is there any way to change $0 ?
In Bash 5.0 or later you can change $0 like this:
$ cat bar.sh
#!/bin/bash
echo $0
BASH_ARGV0=lol
echo $0
$ ./bar.sh
./bar.sh
lol
ZSH even supports assigning directly to 0:
$ cat foo.zsh
#!/bin/zsh
echo $0
0=lol
echo $0
$ ./foo.zsh
./foo.zsh
lol
Here is another method. It is implemented through direct command execution, which is somewhat better than sourcing (the dot command). However, this method only works with the sh interpreter, not bash, since it relies on sh accepting the -s and -c options together:
#! /bin/sh
# try executing this script with several arguments to see the effect
test ".$INNERCALL" = .YES || {
    export INNERCALL=YES
    cat "$0" | /bin/sh -s -c : argv0new "$@"
    exit $?
}
printf "argv[0]=$0\n"
i=1 ; for arg in "$@" ; do printf "argv[$i]=$arg\n" ; i=`expr $i + 1` ; done
The expected output of both examples, in the case of ./the_example.sh 1 2 3, should be:
argv[0]=argv0new
argv[1]=1
argv[2]=2
argv[3]=3
#! /bin/sh
# try executing this script with several arguments to see the effect
test ".$INNERCALL" = .YES || {
    export INNERCALL=YES
    # this method works both for sh and bash interpreters
    sh -c ". '$0'" argv0new "$@"
    exit $?
}
printf "argv[0]=$0\n"
i=1 ; for arg in "$@" ; do printf "argv[$i]=$arg\n" ; i=`expr $i + 1` ; done
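Another option worth knowing: bash's `exec -a` builtin option sets argv[0] for the program it launches, and it works in bash versions before 5 as well. A minimal check:

```shell
#!/bin/bash
# The inner bash is started with argv[0] set to "customname",
# so its $0 reports that name instead of "bash".
bash -c 'exec -a customname bash -c "echo \$0"'
# prints:
# customname
```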

How to make a bash function which can read from standard input?

I have some scripts that work with parameters; they work just fine, but I would like them to be able to read from stdin, from a pipe for example. Suppose this one is called read:
#!/bin/bash
function read()
{
    echo $*
}
read $*
Now this works with read "foo" "bar", but I would like to use it as:
echo "foo" | read
How do I accomplish this?
It's a little tricky to write a function which can read standard input, but works properly when no standard input is given. If you simply try to read from standard input, it will block until it receives any, much like if you simply type cat at the prompt.
In bash 4, you can work around this by using the -t option to read with an argument of 0. It succeeds if there is any input available, but does not consume any of it; otherwise, it fails.
Here's a simple function that works like cat if it has anything from standard input, and echo otherwise.
catecho () {
    if read -t 0; then
        cat
    else
        echo "$*"
    fi
}
$ catecho command line arguments
command line arguments
$ echo "foo bar" | catecho
foo bar
This makes standard input take precedence over command-line arguments, i.e., echo foo | catecho bar would output foo. To make arguments take precedence over standard input (echo foo | catecho bar outputs bar), you can use the simpler function
catecho () {
    if [ $# -eq 0 ]; then
        cat
    else
        echo "$*"
    fi
}
(which also has the advantage of working with any POSIX-compatible shell, not just certain versions of bash).
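The argument-precedence variant can be verified quickly:

```shell
#!/bin/bash
catecho () {
    if [ $# -eq 0 ]; then
        cat
    else
        echo "$*"
    fi
}

echo foo | catecho bar   # arguments win: prints bar
echo foo | catecho       # no arguments: reads the pipe, prints foo
```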
You can use <<< to get this behaviour. read <<< echo "text" should do it.
Test with readly (I prefer not using reserved words):
function readly()
{
    echo $*
    echo "this was a test"
}
$ readly <<< echo "hello"
hello
this was a test
With pipes, based on this answer to "Bash script, read values from stdin pipe":
$ echo "hello bye" | { read a; echo $a; echo "this was a test"; }
hello bye
this was a test
To combine a number of other answers into what worked for me (this contrived example turns lowercase input to uppercase):
uppercase() {
    local COMMAND='tr [:lower:] [:upper:]'
    if [ -t 0 ]; then
        if [ $# -gt 0 ]; then
            echo "$*" | ${COMMAND}
        fi
    else
        cat - | ${COMMAND}
    fi
}
Some examples (the first has no input, and therefore no output):
:; uppercase
:; uppercase test
TEST
:; echo test | uppercase
TEST
:; uppercase <<< test
TEST
:; uppercase < <(echo test)
TEST
Step by step:
test if file descriptor 0 (/dev/stdin) was opened by a terminal
if [ -t 0 ]; then
tests for CLI invocation arguments
if [ $# -gt 0 ]; then
echo all CLI arguments to command
echo "$*" | ${COMMAND}
else if stdin is piped (i.e. not terminal input), output stdin to command (cat - and cat are shorthand for cat /dev/stdin)
else
cat - | ${COMMAND}
Here is an example implementation of a sprintf function in bash, which uses printf and standard input:
sprintf() { local stdin; read -d '' -u 0 stdin; printf "$#" "$stdin"; }
Example usage:
$ echo bar | sprintf "foo %s"
foo bar
This would give you an idea how function can read from standard input.
Late to the party here. Building off of @andy's answer, here's how I define my to_uppercase function.
if stdin is not empty, use stdin
if stdin is empty, use args
if args are empty, do nothing
to_uppercase() {
    local input="$([[ -p /dev/stdin ]] && cat - || echo "$@")"
    [[ -n "$input" ]] && echo "$input" | tr '[:lower:]' '[:upper:]'
}
Usages:
$ to_uppercase
$ to_uppercase abc
ABC
$ echo abc | to_uppercase
ABC
$ to_uppercase <<< echo abc
ABC
Bash version info:
$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin17)
I've discovered that this can be done in one line using test and awk...
test -p /dev/stdin && awk '{print}' /dev/stdin
The test -p tests for input on a pipe, which accepts input via stdin. Only if input is present do we want to run the awk since otherwise it will hang indefinitely waiting for input which will never come.
I've put this into a function to make it easy to use...
inputStdin () {
    test -p /dev/stdin && awk '{print}' /dev/stdin && return 0
    ### accepts input if any but does not hang waiting for input
    #
    return 1
}
Usage...
_stdin="$(inputStdin)"
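A small check of both branches; /dev/null stands in here for "no pipe", so the function must return immediately instead of blocking:

```shell
#!/bin/bash
inputStdin () {
    test -p /dev/stdin && awk '{print}' /dev/stdin && return 0
    return 1
}

echo hello | inputStdin                    # pipe present: echoes the input
inputStdin < /dev/null || echo "no pipe"   # not a pipe: returns 1
```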
Another function uses awk without the test to wait for commandline input...
inputCli () {
    local _input=""
    local _prompt="$1"
    #
    [[ "$_prompt" ]] && { printf "%s" "$_prompt" > /dev/tty; }
    ### no prompt at all if none supplied
    #
    _input="$(awk 'BEGIN {getline INPUT < "/dev/tty"; print INPUT}')"
    ### accept input (used in place of 'read')
    ### put in a BEGIN section so it will only accept 1 line and exit on ENTER
    ### WAITS INDEFINITELY FOR INPUT
    #
    [[ "$_input" ]] && { printf "%s" "$_input"; return 0; }
    #
    return 1
}
Usage...
_userinput="$(inputCli "Prompt string: ")"
Note that the > /dev/tty on the first printf seems to be necessary to get the prompt to print when the function is called in a command substitution $(...).
This use of awk allows the elimination of the quirky read command for collecting input from keyboard or stdin.
Yet another version that:
works by passing text through a pipe or from arguments
easy to copy and paste by changing command in last line
works in bash, zsh
# Prints a text in a decorated balloon
function balloon()
{
    (test -p /dev/stdin && cat - || echo "$@") | figlet -t | cowsay -n -f eyes | toilet -t --gay -f term
}
Usage:
# Using with a pipe
$ fortune -s | balloon
# Passing text as parameter
balloon "$(fortune -s )"
