Consider this little shell script.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
The script executes the command specified as the first argument, transforms the output obtained from that command, and prints the result to standard output. The script may be executed in this manner:
sh foo.sh "echo select * from table"
This does not do what I want. It may print something like the following,
$ sh foo.sh "echo select * from table"
SELECT FILEA FILEB FILEC FROM TABLE
if fileA, fileB and fileC are present in the current directory.
From a user perspective, this command is reasonable. The user has quoted the * in the command line argument, so the user doesn't expect the * to be globbed. But my script astonishes the user by using this argument in a command substitution, which causes the * to be globbed, as seen in the output above.
I want the output to be the following instead.
SELECT * FROM TABLE
The entire text in cmd actually comes from command line arguments to the script, so I would like to preserve any * present in the argument without globbing it.
I am looking for a solution that works for any POSIX shell.
One solution I have come up with is to disable globbing with set -o noglob just before the command substitution. Here is the complete code.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
set -o noglob
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
This does what I expect.
$ sh foo.sh "echo select * from table"
SELECT * FROM TABLE
Apart from this, is there any other concept or trick (such as a quoting mechanism) I need to be aware of to disable globbing only within a command substitution, without having to use set -o noglob?
I am not against set -o noglob. I just want to know if there is another way. You know, globbing can be disabled for normal command line arguments just by quoting them, so I was wondering if there is anything similar for command substitution.
If I understand correctly, you want the user to provide a shell command as a command-line argument, which will be executed by the script, and is expected to produce an SQL string, which will be processed (upper-cased) and echoed to stdout.
The first thing to say is that there is no point in having the user provide a shell command that the script just blindly executes. If the script applied some kind of modification/preprocessing of the command before it executed it then perhaps it could make sense, but if not, then the user might as well execute the command himself and pass the output to the script as a command-line argument, or via stdin.
But that being said, if you really want to do it this way, then there are two things that need to be said. Firstly, this is the proper form to use:
out=$(eval "$cmd");
A fairly advanced understanding of the shell grammar and expansion rules is required to fully understand the rationale for using the above syntax, but basically executing $cmd and executing eval "$cmd" have subtle differences that render the $cmd form inappropriate for executing a given shell command string.
Just to give some detail that will hopefully clarify the above point, there are seven kinds of expansion that are performed by the shell in the following order when processing input: (1) brace expansion, (2) tilde expansion, (3) parameter and variable expansion, (4) arithmetic expansion, (5) command substitution, (6) word splitting, and (7) pathname expansion. Notice that variable expansion happens somewhat in the middle of that sequence, and thus the variable-expanded shell command (which was provided by the user) will not receive the benefit of the prior expansion types. Other issues are that leading variable assignments, pipelines, and command list tokens will not be executed correctly under the $cmd form, because they are parsed and processed prior to variable expansion (actually prior to all expansions) as well.
By running the command through eval, properly double-quoted, you ensure that the full shell parsing/processing/execution algorithm will be applied to the shell command string that was given by the user of your script.
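To make the difference concrete, here is a minimal sketch (the command string is only an example):
cmd='printf "%s\n" one two | wc -l'
$cmd           # no pipeline: printf itself receives |, wc and -l as extra arguments
eval "$cmd"    # the full shell grammar applies: the pipeline runs and prints 2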
The second thing to say is this: If you try the above proper form in your script, you will find that it has not solved your problem. You will still get SELECT FILEA FILEB FILEC FROM TABLE as output.
The reason is this: Since you've decided you want to accept an arbitrary shell command from the user of your script, it is now the user's responsibility to properly quote all metacharacters that may be embedded in that piece of code. It does not make sense for you to accept a shell command as a command-line argument, but somehow change the processing rules for shell commands so that certain metacharacters will no longer be metacharacters when the given shell command is executed. Actually, you could do something like that, perhaps using set -o noglob as you discovered, but then that must become a contract between the script and the user of the script; the user must be made aware of exactly what the precise processing rules will be when the command is executed so that he can properly use the script.
Under this design, the user could call the script as follows (notice the extra layer of quoting for the shell command string evaluation; one could alternatively backslash-escape just the asterisk):
$ sh foo.sh "echo 'select * from table'";
I'd like to return to my earlier comment about the overall design; it doesn't really make sense to do it this way. It makes more sense to take the text-to-process itself, not a shell command that is expected to produce the text-to-process.
Here is how that could be done:
## take the text-to-process via a command-line argument
sql="$1";
## process and echo it
echo "$sql"| tr a-z A-Z;
(I also removed the -s option of tr, which really doesn't make sense here.)
Notice that the script is simpler now, and usage is also simpler:
$ sh foo.sh 'select * from table';
Related
So I generally create job files with a list of commands in it. Then I execute it like so
cat jobFile | while read a; do $a; done
Which always works in bash. However, I've just started working on a Mac, which apparently uses zsh, and this command fails with "no such file" etc. I've tested the job file by running a few lines from it manually, so it should be fine.
I've found questions on zsh read in, but they tend to be about reading in from variables, e.g. $a=('a' 'b' 'c') or echo $a.
Thank you for your answers!
In bash, unquoted parameter expansions always undergo word-splitting, so if a="foo bar", then $a expands to two words, foo and bar. As a command, this means running the command foo with an argument bar.
In zsh, parameter expansions do not undergo word-splitting by default, which means the same expansion $a would produce a single word, foo bar, treated as the name of the command to execute.
In either case, relying on parameter expansion to "parse" a shell command is fragile; in addition to word-splitting, the expansion is subject to pathname expansion (globbing), and you are limited to simple commands and their arguments. No pipes, lists (&&, ||), or redirections allowed, as everything will be treated as the command name and a sequence of arguments.
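A minimal illustration of the difference:
a='echo hello world'
$a    # bash: splits into three words and prints "hello world"
      # zsh: no word-splitting; looks for a command literally named "echo hello world" and fails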
What you want in both shells is to simply treat your job file as a shell script, which can be executed in the current shell using the . command:
. jobFile
Why are you executing it in such a cumbersome way? Assuming jobFile is a file holding a sequence of bash commands, you can simply run it as
bash jobFile
If it contains a sequence of zsh commands, you can likewise run it as
zsh jobFile
If you follow this approach, I would however reflect in the name of the job file, what shell it is intended for, i.e.
bash jobFile.bash
zsh jobFile.zsh
and, if you write a job file so that it is supposed to be compatible with either shell, I would name it jobFile.sh.
Yesterday I asked a similar question about escaping double quotes in env variables, although it didn't solve my problem (probably because I didn't explain it well enough), so I would like to be more specific.
I'm trying to run a script (which I know is written in Perl), although I have to use it as a black box because of a permissions issue (so I don't know how the script works). Let's call this script script_A.
I'm trying to run a basic command in the shell: script_A -arg "date time".
If I run it from the command line, it works fine, but if I try to use it from a bash or Perl script (for example using the system function), it will take only the first part of the string that was sent as an argument. In other words, it will fail with the following error: '"date' is not valid.
Example to specify a little bit more:
If I run from the command line (works fine):
> script_A -arg "date time"
If I run from (for example) a Perl script (fails):
my $args = $ENV{SOME_ENV}; # Assume that SOME_ENV has '-arg "date time"'
my $cmd = "script_A $args";
system($cmd");
I think the problem comes from the environment variable, but I can't use single quotes when defining the env variable. For example, I can't use the following method:
setenv SOME_ENV '-arg "date time"'
Because it fails with the following error: '"date' is not valid.
Also, I tried to use the following method:
setenv SOME_ENV "-arg '"'date time'"'"
Although now the env variable will contain:
echo $SOME_ENV
> -arg 'date time' # should be -arg "date time"
Another note: using \" fails in the shell (I tried it).
Any suggestions on how to locate the reason for the error and how to solve it?
The $args, obtained from %ENV as you show, is a string.
The problem is in what happens to that string as it is manipulated before arguments are passed to the program, which needs to receive the strings -arg and date time.
If the program is executed in a way that bypasses the shell, as your example is, then the whole -arg "date time" is passed to it as its first argument. This is clearly wrong, as the program expects -arg and then another string for its value (date time).
If the program were executed via the shell, which happens when there are shell metacharacters in the command line (not the case in your example), then the shell would break the string into words, except for the quoted part; this is how it works from the command line. That can be enforced with
system('/bin/tcsh', '-c', $cmd);
This is the most straightforward fix, but I can't honestly recommend involving the shell just for argument parsing. Also, you are then in the game of layered quoting and escaping, which can get rather involved and tricky. For one, if things aren't right, the command may end up broken into the words -arg, "date, and time".
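Here is a quick illustration of that failure mode in a POSIX shell; note that the quote characters inside the variable's value are not re-interpreted after expansion:
$ s='-arg "date time"'
$ printf '<%s>\n' $s
<-arg>
<"date>
<time">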
The way you set the environment variable works:
> setenv SOME_ENV '-arg "date time"'
> perl -wE'say $ENV{SOME_ENV}' #--> -arg "date time" (so it works)
which is how I believe [t]csh has always worked.
Then, in a Perl script, parse this string into the strings -arg and date time, and execute the program in a way that bypasses the shell (if the shell isn't needed by the command):
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"([^"]+)"/;
my @cmd = ('script_A', @args);
system(@cmd) == 0 or die "Error with system(@cmd): $?";
This assumes that SOME_ENV's first word is always the option's name (-arg) and that all the rest is always the option's value, under quotes. The regex extracts the first word, as consecutive non-space characters, and after spaces everything in quotes.† These are the program's arguments.
In the system LIST form the program that is the first element of the list is executed without using a shell, and the remaining elements are passed to it as arguments. Please see system for more on this, and also for basics of how to investigate failure by looking into $? variable.
It is in principle advisable to run external commands without the shell. However, if your command needs the shell, then make sure that the string is escaped just right to preserve the quotes.
Note that there are modules that make it easier to use external commands. A few, from simple to complex: IPC::System::Simple, Capture::Tiny, IPC::Run3, and IPC::Run.
I must note that that's an awkward environment variable; any way to organize things otherwise?
† To make this work for non-quoted arguments as well (-arg date), make the quote optional
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"?([^"]+)/;
where I have now left out the closing (unnecessary) quote for simplicity.
I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a $SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value,' which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected to "var_value" to be printed. Instead, the string literal $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code half-parsing the command in $line, without double-quotes it gets one and a half-parsed, which is weird. Double-quotes suppress that half-parsing stage, so the contents of the variable get parsed exactly once.)
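To see the half-parsing in action, consider this small sketch:
line='echo "$HOME"'
$line          # half-parsed: prints "$HOME" with the quote characters intact
eval "$line"   # fully parsed: prints your actual home directory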
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
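For instance, the listener could be reduced to something like this sketch (assuming commands arrive in batches from one writer at a time):
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
while :; do
    . "./$fifo_name"    # blocks until a writer opens the fifo, then runs its commands in the current shell
done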
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
Is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command to print another command in a variable is just begging for trouble.
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo
Consider the following script:
#!/bin/bash
CMD="echo hello world > /tmp/hello.out"
${CMD}
The output for this is:
hello world > /tmp/hello.out
How can I modify CMD so that the output gets redirected to hello.out?
For my use case, it is not feasible to either do this:
${CMD} > /tmp/hello.out
or to add this at the top of the script:
exec > /tmp/hello.out
No, there is no way to make a redirection happen from a variable.
Why?
The first thing the shell does with a command line is:
Each line that the shell reads from the standard input or a script is called a pipeline; it contains one or more commands separated by zero or more pipe characters (|). For each pipeline it reads, the shell breaks it up into commands, sets up the I/O for the pipeline, then does the following for each command (Figure 7-1):
From: Learning the bash Shell: Unix Shell Programming (chapter preview, Figure 7-1).
That means that even before starting with the first word of a command line, the redirections are set up.
The "Parameter Expansion" happens quite a lot latter (in step 6 of the Figure).
There is no way to set up redirections after a variable is expanded.
Unless ...
The "command line is reprocessed" using eval.
eval "$CMD"
But this comes with a lot of danger.
The command line is changed by the first processing in the 12 steps detailed in the book (quotes are removed, variables expanded, words split, etc.).
It is usually quite difficult to estimate all the changes and consequences before the line is actually processed.
And then, it is processed again.
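As an aside, if the goal is simply to bundle a command together with its redirection, storing it in a shell function instead of a string avoids the reprocessing danger entirely; a sketch:
# store the command in a function rather than a variable
hello() {
    echo hello world > /tmp/hello.out
}
hello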
You can use eval to instruct the shell to reinterpret the variable content as a shell command:
eval "$CMD"
So I have a script where I run script.sh followed by input for a set of if-else statements, like this:
script.sh fnSw38h$?2
The output echoes out the input in the end.
But I noticed that $? is interpreted as 0/1 so the output would echo:
fnSw38h12
How can I stop the shell from expanding the characters and take the input at face value?
I looked at something like opt noglob or something similar but they didn't work.
When I put it like this:
script.sh 'fnSw38h$?2'
it works. But how do I capture that within single quotes ('') when I can't state variables inside it like Var='$1'
Please help!
How to pass a password to a script
I gather from the comments that the true purpose of this script is to validate a password. If this is an important or sensitive application, you really should be using professional security tools. If this application is not sensitive or this is just a learning exercise, then read on for a first introduction to the issues.
First, do not do this:
script.sh fnSw38h$?2
This password will appear in ps and be visible to any user on the system in plain text.
Instead, have the user type the password as input to the script, such as:
#!/bin/sh
IFS= read -r var
Here, read will gather input from the keyboard free from shell interference and it will not appear in ps output.
var will have the password for you to verify but you really shouldn't have plain text passwords saved anywhere for you to verify against. It is much better to put the password through a one-way hash and then compare the hash with something that you have saved in a file. For example:
var=$(head -n1 | md5sum)
Here, head will read one line (the password) and pass it to md5sum which will convert it to a hash. This hash can be compared with the known correct hash for this user's password. The text returned by head will be exactly what the user typed, unmangled by the shell.
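A sketch of what the comparison step might look like; the path of the stored-hash file is hypothetical:
stored=$(cat /path/to/stored.md5)   # holds md5sum's full output line for the correct password
var=$(head -n1 | md5sum)
[ "$var" = "$stored" ] && echo "password accepted" || echo "password rejected"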
Actually, for a known hash algorithm, it is possible to make a reverse look-up table for common passwords. So, the solution is to create a variable, called salt, that has some user-dependent information:
var=$( { head -n1; echo "$salt"; } | md5sum)
The salt does not have to be kept secret. It is just there to make look-up tables more difficult to compute.
The md5sum algorithm, however, has been found to have some weaknesses. So, it should be replaced with more recent hash algorithms. As I write, that would probably be a sha-2 variant.
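The salted example above could be adapted along these lines, assuming the coreutils sha256sum tool is available:
# same idea as before, with a stronger hash
var=$( { head -n1; echo "$salt"; } | sha256sum )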
Again, if this is a sensitive application, do not use home-made tools.
Answer to original question
how do I capture that within single quotes ('') when I can't state variables inside it like Var='$1'
The answer is that you don't need to. Consider, for example, this script:
#!/bin/sh
var=$1
echo $var
First, note that $$ and $? are both shell variables:
$ echo $$ $?
28712 0
Now, let's try our script:
$ bash ./script.sh '$$ $?'
$$ $?
These variables were not expanded because (1) when they appeared on the command line, they were in single-quotes, and (2) in the script, they were assigned to variables and bash does not expand variables recursively. In other words, on the line echo $var, bash will expand $var to get $$ $? but there it stops. It does not expand what was in var.
You can escape any dollar signs in a double-quoted string that are not meant to introduce a parameter expansion.
var=foo
# Pass the literal string fnSw38h$?2foo to script.sh
script.sh "fnSw38h\$?2$var"
You cannot do what you are trying to do. What is entered on the command line (such as the arguments to your script) must be in shell syntax, and will be interpreted by the shell (according to the shell's rules) before being handed to your script.
When someone runs the command script.sh fnSw38h$?2, the shell parses the argument as the text "fnSw38h", followed by $? which means "substitute the exit status of the last command here", followed by "2". So the shell does as it's been told, it substitutes the exit status of the last command, then hands the result of that to your script.
Your script never receives "fnSw38h$?2", and cannot recover the argument in that form. It receives something like "fnSw38h02" or "fnSw38h12", because that's what the user asked the shell to pass it. That might not be what the user wanted to pass it, but as I said, the command must be in shell syntax, and in shell syntax an unescaped and unquoted $? means "substitute the last exit status here".
If the user wants to pass "$?" as part of the argument, they must escape or single-quote it on the command line. Period.
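For example, either of these would deliver the literal string to the script:
$ script.sh 'fnSw38h$?2'      # single quotes: nothing inside is expanded
$ script.sh "fnSw38h\$?2"     # double quotes with the dollar sign escaped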