Bash 'xargs' and 'stat' behaviour as variable [duplicate] - bash

What is the correct way to call some command stored in variable?
Are there any differences between 1 and 2?
#!/bin/sh
cmd="ls -la $APPROOTDIR | grep exception"
#1
$cmd
#2
eval "$cmd"

Unix shells perform a series of transformations on each line of input before executing it. For most shells it looks something like this (taken from the Bash man page):
initial word splitting
brace expansion
tilde expansion
parameter, variable and arithmetic expansion
command substitution
secondary word splitting
path expansion (aka globbing)
quote removal
Using $cmd directly gets it replaced by your command during the parameter expansion phase, and it then undergoes all following transformations.
Using eval "$cmd", the double quotes protect the value from word splitting and globbing after the parameter expansion phase; after quote removal, the whole command string is passed as a single argument to eval, whose function is to run the entire chain again before executing.
So basically, they're the same in most cases and differ when your command makes use of the transformation steps up to parameter expansion. For example, using brace expansion:
$ cmd="echo foo{bar,baz}"
$ $cmd
foo{bar,baz}
$ eval "$cmd"
foobar foobaz

If you just do eval $cmd when cmd="ls -l" (interactively or in a script), you get the desired result. In your case, though, the stored command contains a pipe to grep, so plain $cmd will not work: the |, grep and the pattern are handed to ls as literal arguments, producing error messages instead of a pipeline.
So use eval (see "The args are read and concatenated together" in its man page entry) and make sure the stored command is complete and valid, not one that generates an error message.
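For example, a minimal sketch using the question's command (assuming $APPROOTDIR points at an existing directory):
cmd="ls -la $APPROOTDIR | grep exception"
$cmd          # '|', 'grep' and 'exception' are passed to ls as extra arguments, producing errors
eval "$cmd"   # the stored string is re-parsed, so the pipeline runs as intended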

$cmd would just replace the variable with its value to be executed on the command line.
eval "$cmd" does variable expansion & command substitution before executing the resulting value on the command line.
The 2nd method is helpful when you want to run commands that aren't flexible, e.g.
a for i in {$a..$b} style loop won't work, because brace expansion happens before the variables are expanded. In this case, a pipe to bash or eval is a workaround (see the sketch below).
Tested on Mac OSX 10.6.8, Bash 3.2.48
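A minimal sketch of that workaround (the variable values here are just for illustration):
a=1; b=3
for i in $(eval echo "{$a..$b}"); do echo "$i"; done   # eval re-parses the line, so brace expansion sees {1..3}
for ((i=a; i<=b; i++)); do echo "$i"; done             # arithmetic for-loop: an alternative that avoids eval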

I think you should put backtick (`) symbols around your variable.

Related

Is there any alternative to using eval in a shell script to achieve variable expansion

I have the following case where exec and eval will handle variables passed as arguments differently.
Here, eval seems to produce the intended output.
But is there any alternative to using it?
$ cat arg.sh
#!/bin/bash
eval ./argtest $*
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 2
Arg1: arg1
Arg2: subarg1 subarg2
But if I use exec instead of the eval call, the single quotes are not honored.
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 3
Arg1: arg1
Arg2: 'subarg1
Arg3: subarg2'
You should do:
#!/bin/bash
./argtest "$@"
To properly pass unchanged arguments.
Then do:
$ ./arg.sh arg1 'subarg1 subarg2'
As you would do with any other command.
Research when to use quoting in shell, how "$@" expands the positional parameters specially inside double quotes, how $* and "$@" differ, and how word splitting works. Also research what variable expansion is, in which contexts it happens, and how single quotes differ from double quotes. And because eval and exec are mentioned, see the BashFAQ entry "Eval command and security issues". Remember to check your scripts with https://shellcheck.net .
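A quick illustration of the $* vs. "$@" difference (using a throwaway helper function rather than the argtest script):
count() { echo "$#"; }            # prints how many arguments it received
set -- arg1 'subarg1 subarg2'     # two positional parameters
count $*                          # 3 -- unquoted $* is word-split, breaking the second argument apart
count "$@"                        # 2 -- "$@" preserves each original argument as one word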
Is there any alternative to using eval in a shell script to achieve variable expansion
Yes - use envsubst for variable expansion, it's a tool just for that.
#!/bin/bash
arg=$(VARIABLE=something envsubst '$VARIABLE' <<<"$1")
./argtest "$arg"
$ bash -x ./arg.sh 'string with **not-expanded** $VARIABLE'
+ ./argtest 'string with **not-expanded** something'
Is there any alternative to using eval in a shell script to achieve single-quote parsing
Yes - you would potentially write your own parser, for example with sed or awk, that splits the string and then reloads the arguments. A very, very crude example:
#!/bin/bash
readarray -t args < <(sed "s/ *'\([^']*\)' */\n\1\n/; s/\n$//" <<<"$*")
./argtest "${args[@]}"
$ bash -x ./arg.sh "arg1 'subarg1 subarg2'"
+ ./argtest 'arg1' 'subarg1 subarg2'
Using $*, the shell word-splits the parameters and passes the result of that word splitting to eval or exec, respectively. What happens afterwards differs between them:
exec simply replaces the current process with a new one, based on the first parameter it gets. Then it passes the remaining parameters unmodified to this process.
eval, on the other hand, concatenates the parameters into a single string (using one space as a separator between them), then treats the resulting string as a new command to which the usual expansion and word-splitting mechanisms of bash are applied, and finally runs that command.
The mechanism is completely different, which is not surprising, since these commands serve a different purpose.
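A minimal sketch of that difference, using a command string that contains quotes and a space:
set -- "echo 'a b'"   # one positional parameter: the string  echo 'a b'
eval "$@"             # eval re-parses it and runs echo with the argument "a b"  ->  prints: a b
exec "$@"             # exec looks for a program literally named "echo 'a b'" and fails (aborting a script)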

Can't seem to escape a space in a shell script [duplicate]


Bash - difference between <<EOF and <<'EOF'

GNU Bash - 3.6.6 Here Documents
[n]<<[-]word
here-document
delimiter
If any part of word is quoted, the delimiter is the result of quote removal on word, and the lines in the here-document are not expanded. If word is unquoted, all lines of the here-document are subjected to parameter expansion, command substitution, and arithmetic expansion, the character sequence \newline is ignored, and ‘\’ must be used to quote the characters ‘\’, ‘$’, and ‘`’.
If I single-quote EOF, it works. I think it's because the /bin/bash process being invoked gets the un-expanded string, and then the invoked process interprets the lines.
$ /bin/bash<<'EOF'
#!/bin/bash
echo $BASH_VERSION
EOF
3.2.57(1)-release
However, the below causes an error. I thought $BASH_VERSION would be expanded and the version of the current bash process passed to the /bin/bash process being invoked. But it is not working.
$ /bin/bash<<EOF
#!/bin/bash
echo $BASH_VERSION
EOF
/bin/bash: line 2: syntax error near unexpected token `('
/bin/bash: line 2: `echo 5.0.17(1)-release'
/bin/bash<<EOF
#!/bin/bash
echo $BASH_VERSION
EOF
As you can infer from the error message, the heredoc is being expanded to:
/bin/bash<<EOF
#!/bin/bash
echo 5.0.17(1)-release
EOF
It sounds like that's what you expect: it's being expanded to the outer shell's version. The problem isn't with the heredoc or the expansion; it's that unquoted parentheses are a syntax error. Try running just the echo command by hand and you'll get the same error:
$ echo 5.0.17(1)-release
bash: syntax error near unexpected token `('
To fix this, you could add extra quotes:
/bin/bash<<EOF
echo '$BASH_VERSION'
EOF
This will work and print the outer shell's version. I used single quotes to demonstrate that these quotes will not inhibit variable expansion. The outer shell doesn't see these quotes. Only the inner shell does.
(I also got rid of the #!/bin/bash shebang line. There's no need for it since you're explicitly invoking bash.)
However, quoting is not 100% robust. If $BASH_VERSION happened to contain single quotes you'd have a problem. The quotes make parentheses ( ) safe but they aren't foolproof. As a general technique, if you want this to be completely safe no matter what special characters are in play you'll have to jump through some ugly hoops.
Use printf '%q' to escape all special characters.
/bin/bash <<EOF
echo $(printf '%q' "$BASH_VERSION")
EOF
This will expand to echo 5.0.17\(1\)-release.
Pass it in as an environment variable and use <<'EOF' to disable interpolation inside the script.
OUTER_VERSION="$BASH_VERSION" /bin/bash <<'EOF'
echo "$OUTER_VERSION"
EOF
This would be my choice. I prefer to use the <<'EOF' form whenever possible. Having the parent shell interpolate the script being passed to a child shell can be confusing and difficult to reason about. Also, the explicit $OUTER_VERSION variable makes it crystal clear what's happening.
Use bash -c 'script' instead of a heredoc and then pass the version in as a command-line argument.
bash -c 'echo "$1"' bash "$BASH_VERSION"
I might go with this for a single-line script.
If you don't quote EOF, variables in the heredoc are expanded by the original shell before passing it as input to the invoked shell. So it's equivalent to executing
echo 5.0.17(1)-release
in the invoked shell. That's not valid bash syntax, so you get an error.
Quoting the word prevents variable expansion, so the invoked shell receives $BASH_VERSION literally, and expands it itself.
In the first case, the quotes prevent any changes in the here document, so the sub-shell sees echo $BASH_VERSION and it expands the string and echoes it.
In the second case, the absence of quotes means that the first shell expands the information, so the sub-shell sees echo 5.0.17(1)-release, and if you type that at the command line, you get the syntax error.
If you used echo "$BASH_VERSION" in both, then both would work, but different shells would expand $BASH_VERSION.
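A minimal sketch of that last point:
/bin/bash <<EOF
echo "$BASH_VERSION"
EOF
# unquoted delimiter: the outer shell expands $BASH_VERSION before /bin/bash ever sees the line

/bin/bash <<'EOF'
echo "$BASH_VERSION"
EOF
# quoted delimiter: the text is passed through literally, so the inner /bin/bash expands it itself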

Can't use redirections in a shell command stored in a string [duplicate]

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between the following?
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations bash uses when interpreting a command, and look at which of those are followed when each method is in use.
Bash Parsing Stages
Quote Processing
Splitting Into Commands
Special Operator Parsing
Expansions
Word Splitting
Globbing
Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split on the characters in IFS (whitespace by default) into separate words
Those words are expanded as globs if they look like glob patterns and have matches, and finally the command is executed.
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, current directory, etc. will expire when this new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e., sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do.)
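For example, a quick sketch of the separate-process point (using bash -c; the same applies to sh -c):
bash -c 'cd /tmp; myvar=1'    # runs in a child process
pwd                           # the parent's working directory is unchanged
echo "${myvar:-unset}"        # prints: unset -- the child's variable never existed in the parent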
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" will thus parse as printf %s\n "two words", as opposed to the usual/expected behavior of printf %s\n two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
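A short sketch of the redirection point:
s='echo hello >out.txt'
$s            # prints: hello >out.txt   -- the redirection is just another word, no file is created
eval "$s"     # the redirection is honored: out.txt now contains "hello"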
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this, you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
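A brief sketch of that distinction:
shellvar=1                      # plain shell variable: not exported
export envvar=2                 # environment variable: inherited by child processes
bash -c 'echo "${shellvar:-unset} ${envvar:-unset}"'   # prints: unset 2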
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even more clear if we do set -x first
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints $var because parameter expansion and those earlier steps had already been done, and so we don't get the value of var, as we do with eval.
A shell started with bash -c only inherits exported variables from the parent shell, and since I didn't export var it's not available to the new shell.

Command works normally in Shell, but not while using a script

I used this command in my Bash Shell:
printf $VAR1 >> `printf $VAR2`
and it normally worked. But when I write this into the script file and run it in Shell, it does not work. File "script.sh" contains this:
#!/bin/bash
printf $VAR1 >> `printf $VAR2`
and the output in Shell is:
script.sh: line 2: `printf $VAR2`: ambiguous redirect
I don't know how this is possible, because the command is absolutely the same. And of course, I run the script on the same system and in the same shell window.
Thank you for your help.
There are 3 points worth addressing here:
Shell variables vs. environment variables:
Scripts (unless invoked with . / source) run in a child process that only sees the parent [shell]'s environment variables, not its regular shell variables.
This is what likely happened in the OP's case: $VAR1 and $VAR2 existed as regular shell variables, but not environment variables, so script script.sh didn't see them.
Therefore, for a child process to see a parent shell's shell variables, the parent must export them first, as a result of which they (also) become environment variables: export VAR1=... VAR2=...
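A small sketch of that fix, using the variable names from the answer below:
VAR1=UselessData VAR2=FileHoldingUselessData   # plain shell variables: invisible to ./script.sh
./script.sh                                    # fails: $VAR2 is empty inside the script (ambiguous redirect)
export VAR1 VAR2                               # promote them to environment variables
./script.sh                                    # now the script sees both values and appends to the file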
Bash's error messages relating to output redirection (>, >>):
If the filename argument of an output redirection is an unquoted command substitution (`...`, or its modern equivalent, $(...)) - i.e., the output from a command - Bash reports the error ambiguous redirect in the following cases:
The command output has embedded whitespace, i.e., contains more than one word.
The command output is empty, which is what likely happened in the OP's case.
As an aside: In this case, the error message's wording is unfortunate, because there's nothing ambiguous about a missing filename - it simply cannot work, because files need names.
It is generally advisable to double-quote command substitutions (e.g., >> "$(...)") and also variable references (e.g., "$VAR2"): this allows filenames with embedded whitespace, and, should the output be unexpectedly empty, you'll get the (slightly) more meaningful error message No such file or directory.
Not double-quoting a variable reference or command substitution subjects its value to so-called shell expansions: further, often unintended, interpretation by the shell.
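A short sketch of those two error cases:
f=""                  # empty target (a multi-word value behaves similarly when unquoted)
echo data >> $f       # error: ambiguous redirect
echo data >> "$f"     # error: No such file or directory -- the (slightly) clearer message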
The wisdom of using a command substitution to generate a filename:
Leaving aside that printf $VAR2 is a fragile way to print the value of variable $VAR2 in general (the robust form again involves double-quoting: printf "$VAR2", or, even more robustly, to rule out inadvertent interpretation of escape sequences in the variable value, printf %s "$VAR2"), there is no good reason to employ a command substitution to begin with if all that's needed is a variable's value:
>> "$VAR2" is enough to robustly specify the value of variable $VAR2 as the target filename.
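Put together, a more robust version of the command from the question might look like this (the trailing newline is an assumption about the desired output):
printf '%s\n' "$VAR1" >> "$VAR2"   # everything double-quoted, no command substitution needed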
I tried this on my Mac (10.11.1) in a terminal window and it worked fine.
Are you sure your default shell is bash?
echo $SHELL
Did you use export to set your shell vars?
$ export VAR1="UselessData"
$ export VAR2="FileHoldingUselessData"
$ ./script.sh
$ cat FileHoldingUselessData
UselessData$
However... echo, I think, does a better job here, since with printf the output terminates at the first space, so...
$ cat script.sh
#!/bin/bash
echo $VAR1 >> `printf $VAR2`
$ ./script.sh
$ cat FileHoldingUselessData
Some Useless Data
Which leads me to believe you might want to just use echo instead of printf altogether.
#!/bin/bash
echo $VAR1 >> `echo $VAR2`
