So I generally create job files with a list of commands in them. Then I execute the file like so:
cat jobFile | while read a; do $a; done
This always works in bash. However, I've just started working on a Mac, which apparently uses zsh, and this command fails with "no such file" errors. I've tested the job file by running a few lines from it manually, so it should be fine.
I've found questions on zsh read, but they tend to be about reading in from variables, e.g. $a=('a' 'b' 'c') or echo $a.
Thank you for your answers!
In bash, unquoted parameter expansions always undergo word-splitting, so if a="foo bar", then $a expands to two words, foo and bar. As a command, this means running the command foo with an argument bar.
In zsh, parameter expansions do not undergo word-splitting by default, which means the same expansion $a would produce a single word foo bar, treated as the name of the command to execute.
In either case, relying on parameter expansion to "parse" a shell command is fragile; in addition to word-splitting, the expansion is subject to pathname expansion (globbing), and you are limited to simple commands and their arguments. No pipes, lists (&&, ||), or redirections allowed, as everything will be treated as the command name and a sequence of arguments.
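To see the difference concretely (a minimal sketch):

a='echo foo bar'
$a    # bash: word-splits into echo, foo, bar, and prints "foo bar"
      # zsh: looks for a command literally named "echo foo bar" and fails with "command not found"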
What you want in both shells is to simply treat your job file as a shell script, which can be executed in the current shell using the . command:
. jobFile
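For example, with a hypothetical jobFile that uses a pipeline and a redirection, neither of which the $a approach could handle:

printf '%s\n' 'echo hello | tr a-z A-Z' 'date > /tmp/last_run' > jobFile
. jobFile    # prints HELLO and writes a timestamp to /tmp/last_run

This works the same way in both bash and zsh.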
Why are you executing it in such a cumbersome way? Assuming jobFile is a file holding a sequence of bash commands, you can simply run it as
bash jobFile
If it contains a sequence of zsh commands, you can likewise run it as
zsh jobFile
If you follow this approach, I would, however, reflect in the name of the job file what shell it is intended for, i.e.
bash jobFile.bash
zsh jobFile.zsh
and, if you write a job file so that it is supposed to be compatible with either shell, I would name it jobFile.sh.
I noticed that my script was ignoring my positional arguments in old terminal tabs, but working on recently created ones, so I decided to reduce it to the following:
TAG=test
while getopts 't:' c
do
case $c in
t)
TAG=$OPTARG
;;
esac
done
echo $TAG
And running the script I have:
~ source my_script
test
~ source my_script -t "test2"
test2
~ source my_script -t "test2"
test
I thought it could be that c was a special variable used elsewhere, but after changing it to other names I had the exact same problem. I also tried adding a .sh extension to the file to see if that was a problem, but nothing worked.
Am I doing something wrong? And why does it work the first time, but not on subsequent attempts?
I am on MacOS and I use zsh.
Thank you very much.
The problem is that you're using source to run the script (the . command does the same thing). This makes it run in your current (interactive) shell (rather than a subprocess, like scripts normally do). This means it uses the same variables as the current shell, which is necessary if you want it to change those variables, but it can also have weird effects if you're not careful.
In this case, the problem is that getopts uses the variable OPTIND to keep track of where it is in the argument list (so it doesn't process the same argument twice). The first time you run the script with -t test2, getopts processes those arguments, and leaves OPTIND set to 3 (meaning that it's already done the first two arguments, "-t" and "test2"). The second time you run it with options, it sees that OPTIND is set to 3, so it thinks it's already processed both arguments and just exits the loop.
One option is to add unset OPTIND before the while getopts loop, to reset the count and make it start from the beginning each time.
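A minimal sketch of that fix applied to the script above:

unset OPTIND    # reset the getopts position left over from a previous sourced run
TAG=test
while getopts 't:' c
do
case $c in
t)
TAG=$OPTARG
;;
esac
done
echo $TAG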
But unless there's some reason for this script to run in the current shell, it'd be better to make it a standard shell script and have it run as a subprocess. To do this:
Add a "shebang" line as the first line of the script. To make the script run in bash, that'd be either #!/bin/bash or #!/usr/bin/env bash. For zsh, use #!/bin/zsh or #!/usr/bin/env zsh. Since the script runs in a separate shell process, the you can run bash scripts from zsh or zsh scripts from bash, or whatever.
Add execute permission to the script file with chmod +x my_script (or whatever the file's actual name is).
Run the script with ./my_script (note the lack of a space between . and /), or by giving the full path to the script, or by putting the script in some directory in your PATH (the directories that're automatically searched for commands) and just running my_script. Do NOT run it with the bash, sh, zsh etc commands; these override the shebang and therefore can cause confusion.
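Putting those steps together (assuming the file is named my_script and starts with a zsh shebang):

$ chmod +x my_script
$ ./my_script -t test2
test2
$ ./my_script -t test3
test3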
Note: adding ".sh" to the filename is not recommended; it does nothing useful, and makes the script less convenient to run since you have to type in the extension every time you run it.
Finally, a couple of recommendations: there are a bunch of all-caps variable names with special meanings (like PATH and OPTIND), so unless you want one of those special meanings, it's best to use lower- or mixed-case variable names (e.g. tag instead of TAG). Also, double-quoting variable references (e.g. echo "$tag" instead of echo $tag) avoids a lot of weird parsing headaches. Run your scripts through shellcheck.net; it's good at spotting common mistakes like these.
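A small bash illustration of the quoting point:

tag='a   b'
echo $tag      # unquoted: word splitting collapses the run of spaces -> a b
echo "$tag"    # quoted: prints a   b exactly as stored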
I have a simple bash script that does something like the following:
#!/bin/bash
a=$(curl -s http://metadata/endpoint/internal)
echo "$a - bar"
(this is just a simplification). Note the use of the two $ signs to execute a command and resolve a variable.
Using Terraform, I want to write this file to a GCP instance during startup. Per Terraform's instructions, I'm attempting to avoid using the File Provisioner. Using the metadata_startup_script field in google_compute_instance, I am including the script so that I can write it to a particular location.
E.g.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<EOI
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Terraform is interpolating the $ in the subscript somewhere (either in the loading into metadata_startup_script, or in the writing out of the script to disk).
So, depending on what I use to try to escape the interpolation, it still fails to write. For example, I have tried (in the subscript):
echo "\$a - bar"
echo "${“$”}a - bar"
echo "$$a - bar"
According to the Terraform docs, I'm supposed to use $$, but when I do it in the above, I get:
echo "1397a - bar"
All of which fail to replicate the original script.
I’m just looking for the exact bash script, as written, to be written to disk.
My goal would be to do the above without extra escape sequences (as detailed here - Escaping dollar sign in Terraform) so that I can continue to run the original script (for debugging purposes).
I would also prefer not to build a packer image with the original script in it.
Thanks!
I don't think it's Terraform eating your variable interpolations here, because Terraform only understands ${ (a dollar sign followed by a brace) as starting an interpolation, whereas your example contains only $a.
However, you do seem to be embedding one bash script inside another, so it seems plausible to me that the outer bash is resolving your $a before the inner bash gets a chance to look at it. If so, you can use the literal variant of bash heredoc syntax, as described in answers to How to cat <<EOF >> a file containing code?, so that the outer bash will take the content as literal and leave it to the inner bash to evaluate.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Notice that I wrote <<'EOI' instead of <<EOI, following the guidance from that other question in combination with the main Bash documentation on "here documents" (bold emphasis mine):
This type of redirection instructs the shell to read input from the current source until a line containing only word (with no trailing blanks) is seen. All of the lines read up to that point are then used as the standard input (or file descriptor n if n is specified) for a command.
The format of here-documents is:
[n]<<[-]word
here-document
delimiter
No parameter and variable expansion, command substitution, arithmetic expansion, or filename expansion is performed on word. If any part of word is quoted, the delimiter is the result of quote removal on word, and the lines in the here-document are not expanded. If word is unquoted, all lines of the here-document are subjected to parameter expansion, command substitution, and arithmetic expansion, the character sequence \newline is ignored, and \ must be used to quote the characters \, $, and `.
If the redirection operator is <<-, then all leading tab characters are stripped from input lines and the line containing delimiter. This allows here-documents within shell scripts to be indented in a natural fashion.
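A compact illustration of that rule:

# unquoted delimiter: $HOME below is expanded before cat sees it
cat <<EOI
$HOME
EOI
# quoted delimiter: the text passes through literally
cat <<'EOI'
$HOME
EOI

The first prints something like /Users/you; the second prints the literal text $HOME.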
If your machine image is configured to run cloud-init at startup -- this is often but not always what is responsible for executing metadata_startup_script -- you may be able to achieve a similar effect without so much Bash scripting indirection by using Cloud Config YAML instead of a shell script directly.
For example, if your intent is only to write the content of that file into the designated location in the filesystem, you could potentially follow the Writing out arbitrary files example:
metadata_startup_script = <<-EOF
#cloud-config
${yamlencode({
write_files = [
{
encoding = "b64"
content = filebase64("${path.module}/scripts/simple_bash.sh")
path = "/etc/myservice/serv.conf"
owner = "root:root"
permissions = "0644"
},
]
})}
EOF
Cloud-init evaluates its modules at various points in the startup lifecycle. The Write Files module used here is specified to run once on the first boot of an instance, which matches how Cloud-init would typically treat a naked shell script too.
I do not think your issue is related to TF interpolation. I think you have problems because of normal bash interpolation, as it's bash that is going to try to resolve the $ in your /etc/myservice/serv.conf while writing its content.
The regular solution is to use 'EOI', not EOI:
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
I have the following bash script:
$ echo $(dotnet run --project Updater)
UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a"
$ export UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a"
$ echo $UPDATE_NEEDED
0
$ export $(dotnet run --project Updater)
$ echo $UPDATE_NEEDED
'0'
Why is it that $UPDATE_NEEDED is 0 after the 3rd command, but '0' after the 5th command?
What would I need to do to get it to simply set 0? Using UPDATE_NEEDED=0 instead is not an option, as some of the other variables may contain a space (And I'd like to optimistically quote them to have it properly parse spaces).
Also, this is a bit of a XY problem. If anyone knows an easier way to export multiple variables from an executable that can be used later on in the bash script, that could also be useful.
To expand on the answer by Glenn:
1. When you write something like export UPDATE_NEEDED='0' in Bash code, this is 100% identical to export UPDATE_NEEDED=0. The quotes are used by Bash to parse the command expression, but they are then discarded immediately. Their only purpose is to prevent word splitting and to avoid having to escape special characters. In the same vein, the code fragment 'foo bar' is exactly identical to foo\ bar as far as Bash is concerned: both lead to the space being treated as literal rather than as a word splitter.
2. Conversely, parameter expansion and command substitution follow different rules, and preserve literal quotes.
3. When you use eval, the command line arguments passed to eval are treated as if they were Bash code, and thus follow the same rules of expansion as regular Bash code, which leads to the same result as (1).
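A short demonstration of the contrast between (1) and (2):

export a='0'    # quote removal applies: a contains just 0
echo "$a"       # prints: 0
b="a='1'"       # here the single quotes are data, part of b's value
export $b       # the expanded word keeps them: a is now the three characters '1'
echo "$a"       # prints: '1'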
Apparently that Updater project is doing the equivalent of
echo "UPDATE_NEEDED=\'0\' MD5_SUM=\"7e3ad68397421276a205ac5810063e0a\""
It's explicitly outputting the quotes.
When you do export UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a",
bash will eventually remove the quotes before actually setting the variables.
I agree with @pynexj, eval is warranted here, although additional quoting is recommended:
eval export "$(dotnet ...)"
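For example, simulating the Updater output with a hardcoded string (a sketch, not the real program):

out="UPDATE_NEEDED='0' MD5_SUM=\"7e3ad68397421276a205ac5810063e0a\""
eval export "$out"        # eval re-parses the string as code, so the quotes are honored and removed
echo "$UPDATE_NEEDED"     # prints: 0
echo "$MD5_SUM"           # prints: 7e3ad68397421276a205ac5810063e0a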
I often see the key=value bash-script pattern used to pass variables to a bash script. Here is an example:
$ echo $0
-bash
$ cat foo.sh
#!/usr/bin/env bash
echo "key1: $key1"
$ key1=value1 ./foo.sh
key1: value1
I have checked the Bash Reference Manual, but I can't find a description related to this usage.
Section 3.7.4 "Environment"
The environment for any simple command or function may be augmented temporarily by prefixing it with parameter assignments, as described in Shell Parameters. These assignment statements affect only the environment seen by that command.
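In other words, with the foo.sh example above:

$ key1=value1 ./foo.sh
key1: value1
$ echo "key1: $key1"
key1:

The assignment is visible to foo.sh, but the calling shell's environment is left untouched.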
Apparently the online version of the documentation fails to describe the syntax of a simple command completely. It reads:
3.2.1 Simple Commands
A simple command is the kind of command encountered most often. It’s just a sequence of words separated by blanks, terminated by one of the shell’s control operators (see Definitions). The first word generally specifies a command to be executed, with the rest of the words being that command’s arguments.
There is some information about the variable assignments that may precede a simple command in the section Simple Command Expansion:
3.7.1 Simple Command Expansion
When a simple command is executed, the shell performs the following expansions, assignments, and redirections, from left to right.
The words that the parser has marked as variable assignments (those preceding the command name) and redirections are saved for later processing.
...
The text after the = in each variable assignment undergoes tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal before being assigned to the variable.
The manual page bundled with the shell seems to be better at this. It says (the emphasis is mine):
Simple Commands
A simple command is a sequence of optional variable assignments followed by blank-separated words and redirections, and terminated by a control operator.
You can always read the documentation of bash (and of any other CLI program installed on your computer) using man.
Type man bash on your terminal and use the usual keys (<spacebar>, b, /, q etc.) to navigate in the document. Behind the scenes, man uses your default pager program (which is, most probably, less) to put the information on screen.
Consider this little shell script.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
The script executes the command specified as the first argument, transforms the output obtained from that command and prints it to standard output. This script may be executed in this manner.
sh foo.sh "echo select * from table"
This does not do what I want. It may print something like the following,
$ sh foo.sh "echo select * from table"
SELECT FILEA FILEB FILEC FROM TABLE
if fileA, fileB and fileC are present in the current directory.
From a user perspective, this command is reasonable. The user has quoted the * in the command line argument, so the user doesn't expect the * to be globbed. But my script astonishes the user by using this argument in a command substitution which causes globbing of * as seen in the above output.
I want the output to be the following instead.
SELECT * FROM TABLE
The entire text in cmd actually comes from command line arguments to the script, so I would like to preserve any * symbols present in the argument without globbing them.
I am looking for a solution that works for any POSIX shell.
One solution I have come up with is to disable globbing with set -o noglob just before the command substitution. Here is the complete code.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
set -o noglob
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
This does what I expect.
$ sh foo.sh "echo select * from table"
SELECT * FROM TABLE
Apart from this, is there any other concept or trick (such as a quoting mechanism) I need to be aware of to disable globbing only within a command substitution, without having to use set -o noglob?
I am not against set -o noglob. I just want to know if there is another way. You know, globbing can be disabled for normal command line arguments just by quoting them, so I was wondering if there is anything similar for command substitution.
If I understand correctly, you want the user to provide a shell command as a command-line argument, which will be executed by the script, and is expected to produce an SQL string, which will be processed (upper-cased) and echoed to stdout.
The first thing to say is that there is no point in having the user provide a shell command that the script just blindly executes. If the script applied some kind of modification/preprocessing of the command before it executed it then perhaps it could make sense, but if not, then the user might as well execute the command himself and pass the output to the script as a command-line argument, or via stdin.
But that being said, if you really want to do it this way, then there are two things that need to be said. Firstly, this is the proper form to use:
out=$(eval "$cmd");
A fairly advanced understanding of the shell grammar and expansion rules would be required to fully understand the rationale for using the above syntax, but basically executing $cmd and executing eval "$cmd" have subtle differences that render the $cmd form inappropriate for executing a given shell command string.
Just to give some detail that will hopefully clarify the above point, there are seven kinds of expansion that are performed by the shell in the following order when processing input: (1) brace expansion, (2) tilde expansion, (3) parameter and variable expansion, (4) arithmetic expansion, (5) command substitution, (6) word splitting, and (7) pathname expansion. Notice that variable expansion happens somewhat in the middle of that sequence, and thus the variable-expanded shell command (which was provided by the user) will not receive the benefit of the prior expansion types. Other issues are that leading variable assignments, pipelines, and command list tokens will not be executed correctly under the $cmd form, because they are parsed and processed prior to variable expansion (actually prior to all expansions) as well.
By running the command through eval, properly double-quoted, you ensure that the full shell parsing/processing/execution algorithm will be applied to the shell command string that was given by the user of your script.
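A small example of the difference:

cmd='echo a | tr a b'
$cmd             # prints: a | tr a b   (the pipe is just another word after expansion)
eval "$cmd"      # prints: b            (the string is re-parsed, so the pipeline runs)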
The second thing to say is this: If you try the above proper form in your script, you will find that it has not solved your problem. You will still get SELECT FILEA FILEB FILEC FROM TABLE as output.
The reason is this: Since you've decided you want to accept an arbitrary shell command from the user of your script, it is now the user's responsibility to properly quote all metacharacters that may be embedded in that piece of code. It does not make sense for you to accept a shell command as a command-line argument, but somehow change the processing rules for shell commands so that certain metacharacters will no longer be metacharacters when the given shell command is executed. Actually, you could do something like that, perhaps using set -o noglob as you discovered, but then that must become a contract between the script and the user of the script; the user must be made aware of exactly what the precise processing rules will be when the command is executed so that he can properly use the script.
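If you did want that kind of contract, it is worth noting that a command substitution already runs in a subshell, so the option can be enabled inside it without leaking into the rest of the script. A sketch, using the POSIX-equivalent set -f spelling together with the eval form from above:

out=$(set -f; eval "$cmd")    # noglob is in effect only inside the subshell

Either way, the quoting contract with the user remains.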
Under this design, the user could call the script as follows (notice the extra layer of quoting for the shell command string evaluation; could alternatively backslash-escape just the asterisk):
$ sh foo.sh "echo 'select * from table'";
I'd like to return to my earlier comment about the overall design; it doesn't really make sense to do it this way. It makes more sense to take the text-to-process itself, not a shell command that is expected to produce the text-to-process.
Here is how that could be done:
## take the text-to-process via a command-line argument
sql="$1";
## process and echo it
echo "$sql"| tr a-z A-Z;
(I also removed the -s option of tr, which really doesn't make sense here.)
Notice that the script is simpler now, and usage is also simpler:
$ sh foo.sh 'select * from table';