Including "cat" command in unix shell Here Document - bash

I'm trying to create a Here Document which is a shell script that includes the cat command. Of course, it fails when encountering the 2nd cat. I'm performing a lot of substitutions as well, so I can't use the "DOC" escape trick.
myfile="/tmp/myipaddr"
cat >/usr/bin/setIPaddress <<_DOC_
...
OUT=`cat $myfile`
...
_DOC_
I suppose I could echo into a file, but that seems kludgy, and I have a lot of quotes and backticks I'd need to escape?!? Any other thoughts?

Suppose the file contains
hello world
As written, the script you generate will contain the line
OUT=hello world
because the command substitution is performed immediately.
At the very least, you need to quote the line in the here document as
OUT="`cat $myfile`"
I suspect what you want is to include the literal command substitution in the resulting shell script. To do that, you would want to quote the backticks to prevent them from being evaluated immediately. Better still, use the recommended form of command substitution, $(...), and quote the dollar sign.
cat >/usr/bin/setIPaddress <<_DOC_
...
OUT=\$(cat $myfile)
...
_DOC_
/usr/bin/setIPaddress will then include the line
OUT=$(cat /tmp/myipaddr)
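If you prefer to keep the backtick form, escaping the backticks works the same way as escaping the dollar sign (a sketch of the same here document):
cat >/usr/bin/setIPaddress <<_DOC_
...
OUT=\`cat $myfile\`
...
_DOC_
Either way, $myfile itself is still expanded while the here document is read, so the generated script refers to /tmp/myipaddr directly.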

Related

Bash script: any way to collect remainder of command line as a string, including quote characters?

The following simplified version of a script I'll call logit obviously just appends everything but $1 to a text file, so I can keep track of time like this:
$ logit Started work on default theme
But bash expansion gets confused by quotes of any kind. What I'd like is to do things like
$ logit Don't forget a dark mode
But when that happens of course shell expansion rules cause a burp:
quote>
I know this works:
# Yeah yeah I can enclose it in quotes but I'd prefer not to
$ logit "Don't forget a dark mode"
Is there any way to somehow collect the remainder of the command line before bash gets to it, without having to use quotes around my command line?
Here's a minimal working version of the script.
#!/bin/bash
log_file=~/log.txt
now=$(date +"%T %r")
echo "${now} ${#:1}" >> $log_file
Is there any way to somehow collect the remainder of the command line before bash gets to it, without having to use quotes around my command line?
No. There is no "before Bash gets to it" stage. Bash reads the input you type, Bash parses the input you type; there is nothing in between and nothing "before". There is only Bash.
You could use a different shell, or write your own. Note that shell-style quote parsing is very common, so it is probably worth understanding it and getting used to it.
You can use a backslash (\) before the single quote:
$ logit Don\'t forget a dark mode
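With that backslash in place the apostrophe reaches the script intact, and the logged line comes out whole (log path taken from the script above; the timestamp is only illustrative):
$ logit Don\'t forget a dark mode
$ tail -1 ~/log.txt
22:15:07 10:15:07 PM Don't forget a dark mode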

How to inject a bash script with a dollar sign ($) in terraform?

I have a simple bash script that does something like the following:
#!/bin/bash
a=$(curl -s http://metadata/endpoint/internal)
echo "$a - bar"
(this is just a simplification). Note the use of the two $ signs to execute a command and resolve a variable.
Using Terraform, I want to write this file to a GCP instance during startup. Per Terraform's instructions, I'm attempting to avoid using the File Provisioner. Using the metadata_startup_script field in google_compute_instance, I am including the script so that I can write it to a particular location.
E.g.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<EOI
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Terraform is interpolating the $ in the subscript somewhere (either when loading it into metadata_startup_script, or when writing the script out to disk).
So, depending on what I use to try to escape the interpolation, it still fails to write. For example, I have tried (in the subscript):
echo "\$a - bar"
echo "${“$”}a - bar"
echo "$$a - bar"
According to the terraform docs, I'm supposed to use $$, but when I do it in the above, I get:
echo "1397a - bar"
All of which fail to replicate the original script.
I’m just looking for the exact bash script, as written, to be written to disk.
My goal would be to do the above without extra escape sequences (as detailed here - Escaping dollar sign in Terraform) so that I can continue to run the original script (for debugging purposes).
I would also prefer not to build a packer image with the original script in it.
Thanks!
I don't think it's Terraform eating your variable interpolations here, because Terraform only understands ${ (a dollar sign followed by a brace) as starting an interpolation, whereas your example contains only $a.
However, you do seem to be embedding one bash script inside another, so it seems plausible to me that the outer bash is resolving your $a before the inner bash gets a chance to look at it. If so, you can use the literal variant of bash heredoc syntax, as described in answers to How to cat <<EOF >> a file containing code?, so that the outer bash will take the content as literal and leave it to the inner bash to evaluate.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Notice that I wrote <<'EOI' instead of <<EOI, following the guidance from that other question in combination with the main Bash documentation on "here documents" (bold emphasis mine):
This type of redirection instructs the shell to read input from the current source until a line containing only word (with no trailing blanks) is seen. All of the lines read up to that point are then used as the standard input (or file descriptor n if n is specified) for a command.
The format of here-documents is:
[n]<<[-]word
here-document
delimiter
No parameter and variable expansion, command substitution, arithmetic expansion, or filename expansion is performed on word. If any part of word is quoted, the delimiter is the result of quote removal on word, and the lines in the here-document are not expanded. If word is unquoted, all lines of the here-document are subjected to parameter expansion, command substitution, and arithmetic expansion, the character sequence \newline is ignored, and \ must be used to quote the characters \, $, and `.
If the redirection operator is <<-, then all leading tab characters are stripped from input lines and the line containing delimiter. This allows here-documents within shell scripts to be indented in a natural fashion.
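To see the difference concretely, here is a quick interactive comparison (the variable name is just for illustration):
$ a=outer
$ cat <<EOI
> echo "$a"
> EOI
echo "outer"
$ cat <<'EOI'
> echo "$a"
> EOI
echo "$a"
With the unquoted delimiter, the shell reading the here document substitutes $a itself; with the quoted delimiter, the text passes through untouched, which is what lets the inner script keep its own $ expressions.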
If your machine image is configured to run cloud-init at startup -- this is often but not always what is responsible for executing metadata_startup_script -- you may be able to achieve a similar effect without so much Bash scripting indirection by using Cloud Config YAML instead of a shell script directly.
For example, if your intent is only to write the content of that file into the designated location in the filesystem, you could potentially follow the Writing out arbitrary files example:
metadata_startup_script = <<-EOF
#cloud-config
${yamlencode({
write_files = [
{
encoding = "b64"
content = filebase64("${path.module}/scripts/simple_bash.sh")
path = "/etc/myservice/serv.conf"
owner = "root:root"
permissions = "0644"
},
]
})}
EOF
Cloud-init evaluates its modules at various points in the startup lifecycle. The Write Files module used here is specified to run once on the first boot of an instance, which matches how Cloud-init would typically treat a naked shell script too.
I do not think your issue is related to TF interpolation. I think you have problems because of normal bash interpolation, as it's bash that is going to try to resolve the $ in your /etc/myservice/serv.conf while writing its content.
The regular solution is to use 'EOI', not EOI:
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF

Read file and run command in zsh

So I generally create job files with a list of commands in them. Then I execute it like so
cat jobFile | while read a; do $a; done
Which always works in bash. However, I've just started working on a Mac, which apparently uses zsh. And this command fails with "no such file" etc. I've tested the job file by running a few lines from it manually, so it should be fine.
I've found questions on zsh read, but they tend to be reading in from variables, e.g. $a=('a' 'b' 'c') or echo $a
Thank you for your answers!
In bash, unquoted parameter expansions always undergo word-splitting, so if a="foo bar", then $a expands to two words, foo and bar. As a command, this means running the command foo with an argument bar.
In zsh, parameter expansions do not undergo word-splitting by default, which means the same expansion $a would produce a single word, foo bar, treated as the name of the command to execute.
In either case, relying on parameter expansion to "parse" a shell command is fragile; in addition to word-splitting, the expansion is subject to pathname expansion (globbing), and you are limited to simple commands and their arguments. No pipes, lists (&&, ||), or redirections allowed, as everything will be treated as the command name and a sequence of arguments.
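A quick way to see the difference (a hypothetical line standing in for one from a job file):
a='ls -l /tmp'
$a    # bash: word-splits and runs: ls -l /tmp
      # zsh:  treats the whole string as one command name and fails
Because that single "name" contains a slash, zsh would try to execute it as a path and report "no such file or directory" (with no slash it would say "command not found"), which would match the error you are seeing.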
What you want in both shells is to simply treat your job file as a shell script, which can be executed in the current shell using the . command:
. jobFile
Why are you executing it in such a cumbersome way? Assuming jobFile is a file holding a sequence of bash commands, you can simply run it as
bash jobFile
If it contains a sequence of zsh commands, you can likewise run it as
zsh jobFile
If you follow this approach, I would, however, reflect in the name of the job file which shell it is intended for, e.g.
bash jobFile.bash
zsh jobFile.zsh
and, if you write a job file so that it is supposed to be compatible with either shell, I would name it jobFile.sh.

How to use a pure string as an argument for python program through bash terminal

I am trying to give an argument to my python program through the terminal.
For this I am using the lines:
import sys
something = sys.argv[1]
I now try to put in a string like this through the bash terminal:
python my_script.py 2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
This returns a bash error because some of the characters in the string are bash special characters.
How can I use the string exactly as it is?
You can put the raw string into a file, for example like this, with cat and a here document.
cat <<'EOF' > file.txt
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
EOF
and then run
python my_script.py "$(< file.txt)"
You can also use the text editor of your choice for the first step if you prefer that.
If this is a recurring task which you have to perform from time to time, you can make your life easier with a little alias in your shell:
alias escape='read -r string ; printf "Copy this:\n%q\n" "${string}"'
It is using printf "%q" to escape your input string.
Run it like this:
escape
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
Copy this:
2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
You can use the escaped string directly in your shell, without additional quotes, like this:
python my_script.py 2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
In order to make life easier, shells like bash do a little bit of extra work to help users pass the correct arguments to the programs they instruct it to execute. This extra work usually results in predictable argument arrays getting passed to programs.
Oftentimes, though, this extra help results in unexpected arguments getting passed to programs; and sometimes results in the execution of undesired additional commands. In this case, though, it ended up causing Bash to emit an error.
In order to turn off this extra work, Bash allows users to indicate where arguments should begin and end by surrounding them with quotation marks. Bash supports both single quotes (') and double quotes (") to delimit arguments. As a last resort, if a string may contain both single and double quotes (or if double quotes are required but aren't strict enough, since they still allow $ and ` expansion), Bash lets you indicate that a special or whitespace character should be part of the adjacent argument by preceding it with a backslash (\).
If this method of escaping arguments is too cumbersome, it may be worth simplifying your program's interface by having it consume this data from a file instead of a command line argument. Another option is to create a program that loads the arguments from a more controlled location (like a file) and directly execs the target program with the desired argument array.
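A minimal sketch of that last idea, assuming the raw string has been saved verbatim into a file (the file and wrapper names here are hypothetical):
#!/bin/bash
# run_my_script.sh: read the raw argument from a file and exec the target program
arg=$(< raw_arg.txt)              # read the file contents as-is; nothing inside is expanded
exec python my_script.py "$arg"   # hand the string over as a single argument
This keeps the awkward quoting out of the interactive command line entirely.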

How do I avoid calling part of my string as a command?

I run with the file with command line arguments:
samplebash.bsh fakeusername fakepassword&123
.bsh file:
echo "Beginning script..."
argUsername='$1'
argPassword='$2'
protractor indv.js --params.login.username=$argUsername --params.login.password=$argPassword
Output:
Beginning script...
123: command not found
The Issue: For some reason, it interprets what follows the & symbol in the password as a command. How do I avoid this?
The problem isn't happening in your script; it's happening in your original command line. & is a command terminator, which specifies that the command before it should be executed in the background. So your command was equivalent to:
samplebash.bsh fakeusername fakepassword &
123
You need to quote the argument to prevent special characters from being interpreted by the shell.
samplebash.bsh fakeusername 'fakepassword&123'
Also, you shouldn't put single quotes around a variable the way you do in your assignments; that prevents the variable from being expanded. So it should be:
argUsername=$1
argPassword=$2
And you should put double quotes around the variables when you use them in the command, to prevent wildcards and whitespace from being interpreted.
protractor indv.js --params.login.username="$argUsername" --params.login.password="$argPassword"
As a general rule, you should always put double quotes around variables unless you know they're not needed.
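Putting those fixes together, the whole script would look something like this (same commands as in the question, only the quoting changed):
#!/bin/bash
echo "Beginning script..."
argUsername=$1
argPassword=$2
protractor indv.js --params.login.username="$argUsername" --params.login.password="$argPassword"
and it would be invoked with the password in single quotes:
samplebash.bsh fakeusername 'fakepassword&123'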
