This question already has answers here:
How to substitute quoted, multi-word strings as arguments?
(4 answers)
Closed 7 years ago.
script.sh:
#!/bin/bash
echo "First argument: $1"
wrapper.sh:
#!/bin/bash
CALLER='./script.sh "this should be one argument"'
$CALLER
what happens:
$ ./wrapper.sh
First argument: "this
what I was expecting:
$ ./wrapper.sh
First argument: this should be one argument
I tried different things to make it work the way I want, but I can't find a way to invoke script.sh with a single argument containing spaces from within wrapper.sh.
I would also like to understand the way nested quotes are interpreted.
This works instead (only last line changed):
#!/bin/bash
CALLER='./script.sh "this should be one argument"'
eval "$CALLER"
The reason is that quoting is handled at an earlier stage of parsing than variable substitution, so you need to send the result of the substitution ($CALLER) back through the parser (using eval), while double-quoting it (the "…" around $CALLER) to avoid the field splitting that would otherwise accompany the substitution.
Further reading: the POSIX documentation on this, and the links already given in comments.
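To see where the quotes end up, it can help to print each word of the plain expansion; a small sketch (assuming the same $CALLER as above):
CALLER='./script.sh "this should be one argument"'
# Plain expansion: only word splitting happens; the double quotes stay literal characters.
printf '<%s> ' $CALLER; echo
# <./script.sh> <"this> <should> <be> <one> <argument">
# eval re-parses the expanded string, so this time the quotes are honored.
eval "$CALLER"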
This question already has answers here:
Stripping prefixes and suffixes from shell words matching a pattern
(2 answers)
Difference between ${} and $() in Bash [duplicate]
(3 answers)
Closed 1 year ago.
I have a string with the structure task_name-student_name and I want to split it into two variables:
task: containing the chunk before the -
student: containing the chunk after the -
I can get this to work with sed:
string="task_name-student_name"
student=$(echo "$string" | sed "s/.*-//")
task=$(echo "$string" | sed "s/-[^-]*$//")
However, VS Code suggests "See if you can use $(variable//search/replace) instead".
So I have two questions:
Why would $(variable//search/replace) be better
How do I get the parameter expansion to work without it being interpreted as a command?
When I try
echo $("$string"//-[^-]*$//)
or
echo $(echo $("$string"//-[^-]*$//))
I get this output:
bash: task_name-student_name//-[^-]*$//: No such file or directory
Thanks in advance!
First: for variable expansion, you want curly braces instead of parentheses. $(something) will execute something as a command; ${something} will expand something as a variable. And just for completeness, $((something)) will evaluate something as an arithmetic expression (integers only, no floating point).
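For instance (the variable and values here are just illustrative):
var="hello"
echo "${var}"       # parameter expansion: the variable's value -> hello
echo "$(echo hi)"   # command substitution: runs a command      -> hi
echo "$((2 + 3))"   # arithmetic expansion                      -> 5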
As for replacing the sed with a variable expansion: I wouldn't use ${variable//search/replace} for this; there are more appropriate modifications. ${variable#pattern} will remove the shortest possible match of pattern from the beginning of the variable's value, so use that with the pattern *- to remove everything through the first "-":
student=${string#*-}
Similarly, ${variable%pattern} will remove from the end of the variable's value, so you can use this with the pattern -* to remove from the dash to the end:
task=${string%-*}
Note that the patterns used here are "glob" expressions (like filename wildcards), not regular expressions like sed uses; they're just different enough to be confusing. Also, the way I've written these assumes there's exactly one "-" in the string; if there's a possibility some student will have a hyphenated name or something like that, you may need to modify them.
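For instance, if a student name may itself contain a hyphen (while the task name does not), student=${string#*-} still works, but the task needs the greedy %% variant so the removal starts at the first "-"; a small sketch with a made-up name:
string="task_name-mary-jane"
student=${string#*-}    # shortest match from the front: removes "task_name-"  -> mary-jane
task=${string%%-*}      # longest match from the end: removes "-mary-jane"     -> task_name
echo "task=$task student=$student"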
There are lots more modifications you can do in a parameter expansion; see the bash hacker's wiki on the subject. Some of these modifications will work in other shells besides bash; the # and % modifiers (and the "greedy" versions, ## and %%) will work in any shell that conforms to the POSIX standard.
This question already has answers here:
Bash doesn't parse quotes when converting a string to arguments
(5 answers)
Closed 1 year ago.
Imagine I want to call some command some-command via $() with an argument stored in another variable argument, the latter containing a space.
With my current understanding, the fact that result="$(some-command $argument)" (i.e. unquoted expansion) leads to passing two arguments is as expected.
Question part: why does result="$(some-command "$argument")" (i.e. concatenation) lead to the desired result of passing one single argument?
More details:
./some-command:
#!/usr/bin/env bash
echo "Arg 1: $1"
echo "Arg 2: $2"
./test-script:
#!/usr/bin/env bash
export PATH="`pwd -P`:$PATH"
argument="one two"
echo "Calling with expansion"
res="$(some-command $argument)"
echo $res
echo "Calling with concatenation"
res="$(some-command "$argument")"
echo $res
Calling test-script leads to the following output:
Calling with expansion
Arg 1: one Arg 2: two
Calling with concatenation
Arg 1: one two Arg 2:
I don't seem to grasp when something is expanded/evaluated and how the expanded variables are grouped into the arguments passed to scripts.
Thank you!
P.S. Bonus curiosity is why result="$(some-command \"$argument\")" does not work at all.
That's how quoting and expansions work in bash. In fact, double quotes after = are not needed, as word-splitting is not performed, so
result=$(some-command "$argument")
should work the same way.
There's no "concatenation" going on. Bash treats the string inside $( ) as a command line in its own right and performs the usual expansions on it before running it.
So, what happens with some-command "$argument"? First, parameter expansion expands $argument into a string containing a space. Because that expansion is inside double quotes, it is not subject to word splitting, so it stays a single word and is passed as one argument.
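A quick way to see the difference from an interactive shell (printing one bracketed field per argument that printf receives):
argument="one two"
printf '[%s]' $argument; echo      # unquoted: split into two words -> [one][two]
printf '[%s]' "$argument"; echo    # quoted: stays one word         -> [one two]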
This question already has answers here:
How can I preserve quotes in printing a bash script's arguments
(7 answers)
Closed 4 years ago.
I have a script that logs the user argument list. This list is later processed by getopt.
If the script is started like:
./script.sh -a 'this is a sentence' -b 1
... and then I save "$@", I get:
-a this is a sentence -b 1
... without the single quotes. I think (because of the way Bash treats quotes) these are removed and are not available to the script.
For logging accuracy, I'd like to include the quotes too.
Can the original argument list be obtained without needing to quote-the-quotes?
No, there is no way to obtain the command line from before the shell performed whitespace tokenization, wildcard expansion, and quote removal on it.
If you want to pass in literal quotes, try
./script.sh '"-a"' '"this is a sentence"' '"-b"' '"1"'
Notice also how your original command line could have been written
'./script.sh' '-a' 'this is a sentence' '-b' '1'
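If the goal is just accurate logging, one common workaround is to re-quote each argument with printf's %q format. This reconstructs equivalent quoting, not the exact quoting the caller typed; a minimal sketch:
#!/bin/bash
# Build a log line in which each argument is re-quoted so it could be
# pasted back into a shell; the quoting is equivalent, not identical,
# to what the user originally typed.
printf -v logged '%q ' "$@"
echo "args: $logged"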
This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
(3 answers)
Closed 6 years ago.
If I use this command in a pipeline, it works very well:
pipeline ... | grep -P '^[^\s]*\s3\s'
But if I want to store the grep command in a variable like:
var="grep -P '^[^\s]*\s3\s'"
and then put the variable in the pipeline:
pipeline ... | $var
nothing happens, as if there weren't any matches.
What am I doing wrong?
The robust way to store a simple command in a variable in Bash is to use an array:
# Store the command names and arguments individually
# in the elements of an *array*.
cmd=( grep -P '^[^\s]*\s3\s' )
# Use the entire array as the command to execute - be sure to
# double-quote ${cmd[@]}.
echo 'before 3 after' | "${cmd[@]}"
If, by contrast, your command is more than a simple command and, for instance, involves pipes, multiple commands, loops, ..., defining a function is the right approach:
# Define a function encapsulating the command...
myGrep() { grep -P '^[^\s]*\s3\s'; }
# ... and use it:
echo 'before 3 after' | myGrep
Why what you tried didn't work:
var="grep -P '^[^\s]*\s3\s'"
causes the single quotes around the regex to become a literal, embedded part of $var's value.
When you then use $var - unquoted - as a command, the following happens:
Bash performs word-splitting, which means that it breaks the value of $var into words (separate tokens) at whitespace (the characters in the special variable $IFS, which contains a space, a tab, and a newline by default).
Bash also performs globbing (pathname expansion) on the resulting words, which is not a problem here, but can have unintended consequences in general.
Also, if any of your original arguments had embedded whitespace, word splitting would split them into multiple words, and your original argument partitioning is lost.
(As an aside: "$var" - i.e., double-quoting the variable reference - is not a solution, because then the entire string is treated as the command name.)
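For example, with the value above:
var="grep -P '^[^\s]*\s3\s'"
echo 'before 3 after' | "$var"
# bash: grep -P '^[^\s]*\s3\s': command not found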
Specifically, the resulting words are:
grep
-P
'^[^\s]*\s3\s' - including the surrounding single quotes
The words are then interpreted as the name of the command and its arguments, and invoked as such.
Given that the pattern argument passed to grep starts with a literal single quote, matching won't work as intended.
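To confirm, compare the stored form with the literal command (assuming GNU grep with -P support, and that the pattern happens to match no file name in the current directory):
var="grep -P '^[^\s]*\s3\s'"
echo 'foo 3 bar' | $var                       # no output: grep's pattern starts with a literal '
echo 'foo 3 bar' | grep -P '^[^\s]*\s3\s'     # prints: foo 3 bar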
Short of using eval "$var" - which is NOT recommended for security reasons - you cannot persuade Bash to see the embedded single quotes as syntactical elements that should be removed (a process appropriately called quote removal).
Using an array bypasses all these problems by storing arguments in individual elements and letting Bash robustly assemble them into a command with "${cmd[@]}".
What you are doing wrong is trying to store a command in a variable. For simplicity, robustness, etc. commands are stored in aliases (if no arguments) or functions (if arguments), not variables. In this case:
$ alias foo='grep X'
$ echo "aXb" | foo
aXb
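Note that alias expansion is off by default in non-interactive shells, so inside a script you would need to enable it first; a minimal sketch if you want the alias approach in a script rather than at the prompt:
#!/bin/bash
# Aliases are only expanded in scripts after enabling expand_aliases,
# and the alias must be defined before the line that uses it is read.
shopt -s expand_aliases
alias foo='grep X'
echo "aXb" | foo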
I recommend you read the book Shell Scripting Recipes by Chris Johnson ASAP to get the basics of shell programming and then Effective Awk Programming, 4th Edition, by Arnold Robbins when you're ready to start writing scripts to manipulate text.
This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
(3 answers)
Closed 3 years ago.
Explaining the question through examples...
Demonstrates that the single quotes after --chapters get escaped when the variable is expanded (I didn't expect this):
prompt#ubuntu:/my/scripts$ cat test1.sh
#!/bin/bash
actions="--tags all:"
actions+=" --chapters ''"
mkvpropedit "$1" $actions
prompt#ubuntu:/my/scripts$ ./test1.sh some.mkv
Error: Could not open '''' for reading.
And now for some reason mkvpropedit receives the double quotes as part of the filename (I didn't expect this either):
prompt#ubuntu:/my/scripts$ cat test1x.sh
#!/bin/bash
command="mkvpropedit \"$1\""
command+=" --tags all:"
command+=" --chapters ''"
echo "$command"
$command
prompt#ubuntu:/my/scripts$ ./test1x.sh some.mkv
mkvpropedit "some.mkv" --tags all: --chapters ''
Error: Could not open '''' for reading.
The above echo'd command seems to be correct. Putting the same text in another script gives the expected result:
prompt#ubuntu:/my/scripts$ cat test2.sh
#!/bin/bash
mkvpropedit "$1" --tags all: --chapters ''
prompt#ubuntu:/my/scripts$ ./test2.sh some.mkv
The file is being analyzed.
The changes are written to the file.
Done.
Could anyone please explain why the quotes are not behaving as expected? I found searching on this issue difficult, as there are so many other quoting discussions on the web. I wouldn't even know how to explain the question without examples.
I am afraid that some day a file name in the argument will contain some character that breaks everything, hence the perhaps excessive quoting. I do not understand why the same command executes differently when typed directly in the script versus when provided via a variable. Please enlighten me.
Thanks for reading.
The important thing to keep in mind is that quotes are only removed once, when the command line is originally parsed. A quote which is inserted into the command line as a result of parameter substitution ($foo) or command substitution ($(cmd args)) is not treated as a special character. [Note 1]
That is different from whitespace and glob metacharacters: word splitting and pathname expansion happen after parameter/command substitution (unless the substitution occurs inside quotes). [Note 2]
The consequence is that it is almost impossible to create a bash variable $args such that
cmd $args
does what you want: if $args contains quotes, they are not removed, and words inside $args are delimited by sequences of whitespace, not single whitespace characters.
The only way to do it is to set $IFS to include some non-whitespace character; that character can then be used inside $args as a single-character delimiter. However, there is no way to quote a character inside a value, so once you do that, the character you chose cannot be used other than as a delimiter. This is not usually very satisfactory.
There is a solution, though: bash arrays.
If you make $args into an array variable, then you can expand it with the quoted "${args[@]}" syntax:
cmd "${args[@]}"
which produces exactly one word per element of $args, and suppresses word-splitting and pathname expansion on those words, so they end up as literals.
So, for example:
actions=(--tags all:)
actions+=(--chapters '')
mkvpropedit "$1" "${actions[#]}"
will probably do what you want. So would:
args=("$1")
args+=(--tags)
args+=(all:)
args+=(--chapters)
args+=('')
mkvpropedit "${args[#]}"
and so would
command=(mkvpropedit "$1" --tags all: --chapters '')
"${command[#]}"
I hope that's semi-clear.
man bash (or the online version) contains a blow-by-blow account of how bash assembles commands, starting at the section "EXPANSION". It's worth reading for a full explanation.
Notes:
This doesn't apply to eval or commands like bash -c which evaluate their argument again after command line processing. But that's because command-line processing happens twice.
Word splitting is not the same as "dividing the command into words", which happens when the command is parsed. For one thing, word-splitting uses as separator characters the value of $IFS, whereas command-line parsing uses whitespace. But neither of these are done inside quotes, so they are similar in that respect. In any case, words are split in one way or another both before and after parameter substitution.
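A small illustration of the $IFS point in Note 2 (run in an interactive shell; the values are made up):
IFS=':'
var="a b:c"
printf '<%s>\n' $var    # splits on ":" only -> <a b> then <c>; the space survives
unset IFS               # restore default splitting behavior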