Random single quotes being added to option - bash

I'm trying to do an rsync like this in a bash script:
rsync -e "ssh ${flags_ssh}" -avRz --rsync-path="sudo rsync" $direcNew $(eval echo ${user_name})@$(eval echo ${instance_address}):$(eval echo ${mountdir})
However, when I run this using bash -x like this:
bash -x ./myscript
I see that it's trying to run that command, except with the rsync-path option looking like
'--rsync-path=sudo rsync'
How do I prevent this? I need the double quotes to stay and the single quotes to go away. I don't know why this is happening, and I've tried endless combinations of eval and backslashes with no success.

You don't need to prevent this. What's happening is that when the shell parses --rsync-path="sudo rsync", it removes the quotes (after they have the intended effect of having the space treated as part of the argument, rather than a separator between arguments). Then, when it sees it's in -x mode, it comes up with a representation that would have led to the space being treated that way, and prints that. It could print any equivalent representation, including (but not limited to) any of these:
--rsync-path="sudo rsync"
"--rsync-path=sudo rsync"
--rsync-path=sudo\ rsync
--rsync-path=sudo" "rsync
'--rsync-path=sudo rsync'
$'--rsync-path=sudo rsync'
...etc
The fact that it picked a different representation than you did is not important, because these are all fully equivalent -- they all result in exactly the same thing being passed as an argument to rsync, so you don't need to worry about it.
You also intrinsically can't "fix" it, because by the time the shell prints its interpretation of your command, it's already forgotten which representation you happened to use -- it only knows the resulting argument that's going to be passed to rsync. In order to get it to print something else, you'd have to be passing a different actual argument to rsync, and that would break your rsync command.
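You can watch bash do this with any command. For example, using printf as a stand-in for rsync:
$ bash -xc 'printf "%s\n" --rsync-path="sudo rsync"'
+ printf '%s\n' '--rsync-path=sudo rsync'
--rsync-path=sudo rsync
The trace line uses single quotes, but printf still receives exactly one argument containing a space, which is all that matters.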

Related

Bash command works when I run it myself but fails in the script

My company has a tool that dynamically generates commands to run based on an input json. It works very well when all arguments to the compiled command are single words, but is failing when we attempt multi word args. Here is the minimal example of how it fails.
# Print and execute the command.
print_and_run() {
  local command=("$@")
  if [[ ${command[0]} == "time" ]]; then
    echo "Your command: time ${command[@]:1}"
    time ${command[@]:1}
  fi
}
# How print_and_run is called in the script
print_and_run time docker run our-container:latest $generated_flags
# Output
Your command: time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"
Usage: our-program [OPTIONS] COMMAND1 [ARGS]... [COMMAND2 [ARGS]...]...
Try 'our-program --help' for help.
Error: No such command 'val3"'.
But if I copy the printed command and run it myself it works fine (I've omitted docker flags). Shelling into the container and running the program directly with these arguments works as well, so the parsing logic there is solid (it's a Python program that uses click to parse the args).
Now, I have a working solution that uses eval, but my entire team jumped down my throat at that suggestion. I've also proposed a solution using delineating characters for multi-word arguments, but that was shot down as well.
No other solutions proposed by other engineers have worked either. So can I ask someone to perhaps explain why val3 is being treated as a separate command, or to help me find a solution to get bash to properly evaluate the dynamically determined command without using eval?
Your command after expanding $generated_flags is:
print_and_run time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"
Your specific problem is that in --arg2="val2 val3" the quotes are literal, not syntactical, because quotes are processed before variables are expanded. This means --arg2="val2 and val3" end up as two separate arguments. Then, I assume, docker is trying to interpret val3" as some kind of docker command, because it's not part of any argument, and it throws an error because it doesn't know what that means.
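You can see the split for yourself; printf prints each argument it receives on its own line (a minimal reproduction, no docker needed):
generated_flags='subcommand --arg1=val1 --arg2="val2 val3"'
printf '<%s>\n' $generated_flags
# <subcommand>
# <--arg1=val1>
# <--arg2="val2>
# <val3">
Note that the double quotes survive as literal characters inside the broken-up words.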
Normally you'd fix this via an array to properly maintain the string boundary.
generated_flags=( "subcommand" "--arg1=val1" "--arg2=val2 val3" )
print_and_run time docker run our-container:latest "${generated_flags[@]}"
This will maintain --arg2=val2 val3 as a single argument as it gets passed into print_and_run, then you just have to expand your command array correctly inside the function (make sure to quote the expansion).
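A sketch of print_and_run adjusted along those lines (the quoted "${command[@]:1}" is the important part; the echoed string is informational only, since the original quoting is gone by then):
print_and_run() {
  local command=("$@")
  if [[ ${command[0]} == "time" ]]; then
    echo "Your command: time ${command[*]:1}"
    # quoted expansion: each array element stays one argument, spaces and all
    time "${command[@]:1}"
  fi
}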
The question is:
why val3 is being treated as a separate command
Unquoted variable expansions undergo word splitting and filename expansion. Word splitting splits the result of the expansion on spaces, tabs, and newlines, breaking it into separate "words".
a="something else"
$a # results in two "words": 'something' and 'else'
It is irrelevant what you put inside the variable's value or how many quotes or escape sequences it contains. Every run of consecutive whitespace splits the value into words. Quotes (" and ') and escapes (\) are parsed when they are part of the input line, not when they are part of the result of an unquoted expansion.
help me find a solution to
Write a parser that will actually parse the commands and split it according to the rules that you want to use and then execute the command split into separate words. For example, a very crude such parser is included in xargs:
$ echo " 'quotes quotes' not quotes" | xargs printf "'%s'\n"
'quotes quotes'
'not'
'quotes'
For example, Python has shlex.split, which you can just use, and at the same time you introduce Python, which is far easier to manage than badly written Bash scripts.
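If the generated strings really do use shell-style quoting, shlex.split can do the splitting even when called from a shell script. A sketch, assuming python3 is on the PATH:
$ python3 -c 'import shlex, sys; print("\n".join(shlex.split(sys.argv[1])))' 'subcommand --arg1=val1 --arg2="val2 val3"'
subcommand
--arg1=val1
--arg2=val2 val3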
tool that dynamically generates commands to run based on an input json
Overall, the proper way forward would be to upgrade the tool to generate a JSON array that represents the words of the command to be executed. Then you can just execute that array of words, which is, again, trivial to do properly in Python with json and subprocess.run, but requires some gymnastics with jq, read, and Bash arrays in the shell.
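A sketch of those gymnastics, assuming the tool writes a JSON array of words to a hypothetical file cmd.json, that no argument contains a newline, and that you have Bash 4+ for readarray:
# cmd.json: ["docker","run","our-container:latest","subcommand","--arg1=val1","--arg2=val2 val3"]
# jq -r '.[]' prints each array element raw, one per line;
# readarray -t collects those lines into a Bash array
readarray -t cmd < <(jq -r '.[]' cmd.json)
# quoted expansion passes each element as exactly one argument
"${cmd[@]}"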
Check your scripts with shellcheck.

Script takes only first part of double quotes

Yesterday I asked a similar question about escaping double quotes in env variables, but it didn't solve my problem (probably because I didn't explain it well enough), so I would like to be more specific.
I'm trying to run a script (which I know is written in Perl), but I have to treat it as a black box because of a permissions issue (so I don't know how the script works). Let's call this script script_A.
I'm trying to run a basic command in the shell: script_A -arg "date time".
If I run it from the command line, it works fine, but if I try to use it from a bash script or a Perl script (for example via the system function), it takes only the first part of the string that was sent as an argument. In other words, it fails with the following error: '"date' is not valid.
Example to specify a little bit more:
If I run from the command line (works fine):
> script_A -arg "date time"
If I run from (for example) a Perl script (fails):
my $args = $ENV{SOME_ENV}; # Assume that SOME_ENV has '-arg "date time"'
my $cmd = "script_A $args";
system($cmd);
I think that the problem comes from the environment variable, but I can't get single quotes to work when defining the env variable. For example, I can't use the following method:
setenv SOME_ENV '-arg "date time"'
Because it fails with the following error: '"date' is not valid.
Also, I tried to use the following method:
setenv SOME_ENV "-arg '"'date time'"'"
But now the env variable contains:
echo $SOME_ENV
> -arg 'date time' # should be -arg "date time"
Another note: using \" also fails in the shell (I tried it).
Any suggestions on how to locate the reason for the error and how to solve it?
The $args, obtained from %ENV as you show, is a string.
The problem is in what happens to that string as it is manipulated before the arguments are passed to the program, which needs to receive the strings -arg and date time.
If the program is executed in a way that bypasses the shell, as your example is, then the whole -arg "date time" is passed to it as its first argument. This is clearly wrong, as the program expects -arg and then another string for its value (date time).
If the program were executed via the shell, which happens when there are shell metacharacters in the command line (not the case in your example), then the shell would break the string into words, except for the quoted part; this is how it works from the command line. That can be enforced with
system('/bin/tcsh', '-c', $cmd);
This is the most straightforward fix, but I can't honestly recommend involving the shell just for argument parsing. Also, you are then in the game of layered quoting and escaping, which can get rather involved and tricky. For one, if things aren't right, the shell may end up breaking the command into the words -arg, "date, and time".
How you set the environment variable works:
> setenv SOME_ENV '-arg "date time"'
> perl -wE'say $ENV{SOME_ENV}' #--> -arg "date time" (so it works)
This, I believe, has always worked this way in [t]csh.
Then, in a Perl script, parse this string into the -arg and date time strings, and execute the program in a way that bypasses the shell (if the command doesn't need the shell):
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"([^"]+)"/;
my @cmd = ('script_A', @args);
system(@cmd) == 0 or die "Error with system(@cmd): $?";
This assumes that SOME_ENV's first word is always the option's name (-arg) and that all the rest is always the option's value, under quotes. The regex extracts the first word, as consecutive non-space characters, and, after spaces, everything in quotes.† These are the program's arguments.
In the system LIST form, the program that is the first element of the list is executed without using a shell, and the remaining elements are passed to it as arguments. Please see system for more on this, and also for the basics of how to investigate failure by looking into the $? variable.
It is in principle advisable to run external commands without the shell. However, if your command needs the shell, then make sure that the string is escaped just right to preserve the quotes.
Note that there are modules that make it easier to use external commands. A few, from simple to complex: IPC::System::Simple, Capture::Tiny, IPC::Run3, and IPC::Run.
I must note that that's an awkward environment variable; any way to organize things otherwise?
† To make this work for non-quoted arguments as well (-arg date) make the quote optional
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"?([^"]+)/;
where I now left out the closing (unnecessary) quote for simplicity

Echo-ing an environment variable returns string literal rather than environment variable value

I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a $SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value,' which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected "var_value" to be printed. Instead, the string literal $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code as half-parsing the command in $line, then without double-quotes it gets one-and-a-half parsed, which is weird. Double-quotes suppress that half-parsing stage, so the contents of the variable get parsed exactly once.)
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
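For instance, a sketch of the listener built on source instead of eval (SET_VAR is visible to the sourced commands because they run in the listener's own shell; note that source may read the fifo to end-of-file before executing anything):
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
SET_VAR=var_value
# run the incoming lines in this shell, not a subshell
source "$fifo_name"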
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command to print another command in a variable is just begging for trouble.
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo

Rsync and quotes [duplicate]

I wrote a bash script with the following:
SRC="dist_serv:$HOME/www/"
DEST="$HOME/www/"
OPTIONS="--exclude 'file.php'"
rsync -Cavz --delete $OPTIONS $SRC $DEST
rsync fails and I can't figure out why, although it seems to be related to the $OPTIONS variable (it works when I remove it). I tried escaping the space with a backslash (among many other things) but that didn't work.
The error message is :
rsync: mkdir "/home/xxx/~/public_html/" failed: No such file or directory (2)
I tried quoting the variable, which throws another error ("unknown option" on my variable $OPTIONS):
rsync: --exclude 'xxx': unknown option
rsync error: syntax or usage error (code 1) at main.c(1422) [client=3.0.6]
You shouldn't put $ in front of the variable names when assigning values to them. SRC is a variable, $SRC is the value that it expands to.
Additionally, ~ is not expanded to the path of your home directory when you put it in quotes. It is generally better to use $HOME in scripts as this variable behaves like a variable, which ~ doesn't do.
Always quote variable expansions:
rsync -Cavz --delete "$OPTIONS" "$SRC" "$DEST"
unless there is some reason not to (there very seldom is). The shell will perform word splitting on them otherwise.
User @Fred points out that you can't use double quotes around $OPTIONS, but it should be OK if you use OPTIONS='--exclude="file.php"' (note the =).
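It also helps to see exactly what rsync receives in each case; printf prints one argument per line:
OPTIONS="--exclude 'file.php'"
printf '<%s>\n' $OPTIONS
# <--exclude>
# <'file.php'>
printf '<%s>\n' "$OPTIONS"
# <--exclude 'file.php'>
Unquoted, rsync gets an exclude pattern with literal quote characters; quoted, the whole string becomes a single (unknown) option, which matches the two errors above.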
One technique which I find invaluable is using positional parameters to make it easy to work with list of options.
When you put options inside a variable (such as your OPTIONS variable), you need to find a way to include quotes inside the value, and omit quotes when referencing the variable. It works, but you are always one typo away from a difficult to debug failure.
Instead, try the following.
set -- -Cavz --delete
set -- "$#" --exclude "file.php"
set -- "$#" "dist_serv:~/www/"
set -- "$#" "~/www/"
rsync "$#"
Of course, in this case, everything could be on the same line, but in many cases there will be conditional expressions so that, for instance, you can omit a given option, or select different files to work with. The nice thing is, you always use the same quoting you would use on a single command line, all thanks to the magic of "$@", which avoids having to reference (or quote) any specific variable.
If actual positional parameters get in the way, you can put them in variables, or create a function to isolate a context that avoids touching them where they matter.
I use this trick all the time, and I have stopped pulling my hair out due to quoting causing problems inside values I pass as parameter to commands.
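For example, conditionally adding an option stays readable and safe (a sketch; exclude_file, SRC, and DEST are hypothetical variables):
set -- -Cavz --delete
if [ -n "$exclude_file" ]; then
  # the filename stays a single argument even if it contains spaces
  set -- "$@" --exclude "$exclude_file"
fi
rsync "$@" "$SRC" "$DEST"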
A similar result can be achieved by using an array.
declare -a ARGUMENTS=()
ARGUMENTS=(-Cavz --delete )
ARGUMENTS+=(--exclude "file.php")
ARGUMENTS+=("dist_serv:~/www/")
ARGUMENTS+=("~/www/")
rsync "${ARGUMENTS[#]}"

Command substitution in shell script without globbing

Consider this little shell script.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
The script executes the command specified as the first argument, transforms the output obtained from that command and prints it to standard output. This script may be executed in this manner.
sh foo.sh "echo select * from table"
This does not do what I want. It may print something like the following,
$ sh foo.sh "echo select * from table"
SELECT FILEA FILEB FILEC FROM TABLE
if fileA, fileB, and fileC are present in the current directory.
From a user perspective, this command is reasonable. The user has quoted the * in the command line argument, so the user doesn't expect the * to be globbed. But my script astonishes the user by using this argument in a command substitution which causes globbing of * as seen in the above output.
I want the output to be the following instead.
SELECT * FROM TABLE
The entire text in cmd actually comes from command line arguments to the script so I would like to preserve any * symbol present in the argument without globbing them.
I am looking for a solution that works for any POSIX shell.
One solution I have come up with is to disable globbing with set -o noglob just before the command substitution. Here is the complete code.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
set -o noglob
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
This does what I expect.
$ sh foo.sh "echo select * from table"
SELECT * FROM TABLE
Apart from this, is there any other concept or trick (such as a quoting mechanism) I need to be aware of to disable globbing only within a command substitution, without having to use set -o noglob?
I am not against set -o noglob. I just want to know if there is another way. You know, globbing can be disabled for normal command line arguments just by quoting them, so I was wondering if there is anything similar for command substitution.
If I understand correctly, you want the user to provide a shell command as a command-line argument, which will be executed by the script, and is expected to produce an SQL string, which will be processed (upper-cased) and echoed to stdout.
The first thing to say is that there is no point in having the user provide a shell command that the script just blindly executes. If the script applied some kind of modification/preprocessing of the command before it executed it then perhaps it could make sense, but if not, then the user might as well execute the command himself and pass the output to the script as a command-line argument, or via stdin.
But that being said, if you really want to do it this way, then there are two things that need to be said. Firstly, this is the proper form to use:
out=$(eval "$cmd");
A fairly advanced understanding of the shell grammar and expansion rules would be required to fully understand the rationale for using the above syntax, but basically executing $cmd and executing eval "$cmd" have subtle differences that render the $cmd form inappropriate for executing a given shell command string.
Just to give some detail that will hopefully clarify the above point, there are seven kinds of expansion that are performed by the shell in the following order when processing input: (1) brace expansion, (2) tilde expansion, (3) parameter and variable expansion, (4) arithmetic expansion, (5) command substitution, (6) word splitting, and (7) pathname expansion. Notice that variable expansion happens somewhat in the middle of that sequence, and thus the variable-expanded shell command (which was provided by the user) will not receive the benefit of the prior expansion types. Other issues are that leading variable assignments, pipelines, and command list tokens will not be executed correctly under the $cmd form, because they are parsed and processed prior to variable expansion (actually prior to all expansions) as well.
By running the command through eval, properly double-quoted, you ensure that the full shell parsing/processing/execution algorithm will be applied to the shell command string that was given by the user of your script.
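A toy example makes the difference visible; the command string contains a pipe, which the $cmd form treats as just another word:
cmd='echo "a  b" | tr a A'
$cmd          # prints: "a b" | tr a A   (the pipe is just data)
eval "$cmd"   # prints: A  b             (the pipe actually pipes)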
The second thing to say is this: If you try the above proper form in your script, you will find that it has not solved your problem. You will still get SELECT FILEA FILEB FILEC FROM TABLE as output.
The reason is this: Since you've decided you want to accept an arbitrary shell command from the user of your script, it is now the user's responsibility to properly quote all metacharacters that may be embedded in that piece of code. It does not make sense for you to accept a shell command as a command-line argument, but somehow change the processing rules for shell commands so that certain metacharacters will no longer be metacharacters when the given shell command is executed. Actually, you could do something like that, perhaps using set -o noglob as you discovered, but then that must become a contract between the script and the user of the script; the user must be made aware of exactly what the precise processing rules will be when the command is executed so that he can properly use the script.
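If you do adopt the set -o noglob contract, note that a command substitution runs in a subshell, so the setting can be scoped to the substitution itself and the rest of the script is unaffected (set -f is the portable spelling of set -o noglob):
out=$(set -f; eval "$cmd")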
Under this design, the user could call the script as follows (notice the extra layer of quoting for the shell command string evaluation; could alternatively backslash-escape just the asterisk):
$ sh foo.sh "echo 'select * from table'";
I'd like to return to my earlier comment about the overall design; it doesn't really make sense to do it this way. It makes more sense to take the text-to-process itself, not a shell command that is expected to produce the text-to-process.
Here is how that could be done:
## take the text-to-process via a command-line argument
sql="$1";
## process and echo it
echo "$sql"| tr a-z A-Z;
(I also removed the -s option of tr, which really doesn't make sense here.)
Notice that the script is simpler now, and usage is also simpler:
$ sh foo.sh 'select * from table';

Resources