Stop bash from expanding $ from command line - bash

I have a script I am trying to call that needs to have the $ symbol passed to it. If I run the script as
./script "blah$blah"
it is passed in fine, but the script then calls another program I have no control over, which expands the parameter to just "blah". The program is being called with the command program $#. I was wondering if there is a way to prevent the parameter from being expanded when it is passed on to that program.

Escape the $ character with a backslash, e.g.: "This will not expand \$hello"
Use single quotes: 'This will not expand $hello'
Or use a here document:
<<'EOF'
This will not expand $hello
EOF
In your case I recommend using single quotes for readability: ./script 'blah$blah'.
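For instance (a quick sketch, assuming ./script does nothing more than printf '%s\n' "$1" so you can see what arrives):
./script 'blah$blah'      # prints blah$blah
./script "blah\$blah"     # prints blah$blah
./script "blah$blah"      # prints just blah when the variable blah is unset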

A couple of options involving changing the quoting:
./script 'blah$blah'
./script "blah\$blah"
I hope this helps.

Call using single quotes:
./script 'blah$blah'
Or escape the $:
./script "blah\$blah"

What is the correct way to call some command stored in variable?
Are there any differences between 1 and 2?
#!/bin/sh
cmd="ls -la $APPROOTDIR | grep exception"
#1
$cmd
#2
eval "$cmd"
Unix shells perform a series of transformations on each line of input before executing it. For most shells it looks something like this (taken from the Bash man page):
initial word splitting
brace expansion
tilde expansion
parameter, variable and arithmetic expansion
command substitution
secondary word splitting
path expansion (aka globbing)
quote removal
Using $cmd directly gets it replaced by your command during the parameter expansion phase, and it then undergoes all following transformations.
Using eval "$cmd" does nothing until the quote removal phase, where $cmd is returned as is, and passed as a parameter to eval, whose function is to run the whole chain again before executing.
So basically, they're the same in most cases and differ when your command makes use of the transformation steps up to parameter expansion. For example, using brace expansion:
$ cmd="echo foo{bar,baz}"
$ $cmd
foo{bar,baz}
$ eval "$cmd"
foobar foobaz
If you just do eval $cmd with cmd="ls -l" (interactively or in a script), you get the desired result. In your case, though, the value contains a pipe to grep; run as plain $cmd, the |, grep and the pattern are passed to ls as literal arguments, so you get error messages from ls instead of a pipeline.
So use eval (see help eval: "The args are read and concatenated together") and give it a complete, working command, not one that generates an error message.
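To make that concrete, here is a minimal sketch (substituting /etc and the pattern hosts for the questioner's directory and pattern):
cmd="ls -la /etc | grep hosts"
$cmd          # |, grep and hosts reach ls as literal arguments; ls complains about the ones that are not files
eval "$cmd"   # the pipeline actually runs and prints the matching entries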
$cmd just replaces the variable with its value, which is then executed on the command line.
eval "$cmd" performs variable expansion and command substitution before executing the resulting value on the command line.
The second method is helpful when you want to run commands that can't take variables directly, e.g. a for i in {$a..$b} loop won't work because brace expansion happens before variable expansion. In such cases, piping the command to bash or using eval is a workaround.
Tested on Mac OS X 10.6.8, Bash 3.2.48.
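A sketch of that brace-expansion case (a and b are just example bounds):
a=1; b=3
for i in {$a..$b}; do echo "$i"; done              # prints the literal string {1..3}
eval "for i in {$a..$b}; do echo \"\$i\"; done"    # prints 1, 2 and 3 on separate lines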
I think you should put backtick (`) symbols around your variable.

sed - inserting line with /c\ that has a variable that contains spaces

I have just recently got back into learning bash. I'm currently working on a project of mine, and when using sed I've run into an issue. I've tried looking around the web for help but haven't had any joy; I suspect I may not be using the correct terminology, so I can't find what I'm looking for. ANYHOW.
So in my script I'm trying to assign the output of date to a variable. Here's the line from my script.
origdate=$(date)
When I call it the output looks like this:
Wed Oct 5 19:40:45 BST 2016
Part of my script then generates a file and writes information to it, and I am using sed to find lines in that file and replace parts of them. This is the first time I've played around with sed, and I've used it successfully so far for my needs. However, I'm getting stuck when I try this:
sed -i '/origdate=empty/c\'$origdate'' $sd/pingcheck-email-$job.txt
When I run the script and it gets to this line, this is the error I'm getting:
sed: can't read Oct: No such file or directory
sed: can't read 5: No such file or directory
sed: can't read 19:52:56: No such file or directory
sed: can't read BST: No such file or directory
sed: can't read 2016: No such file or directory
I suspect it's something to do with the spaces in the date variable. My question is: how can I work around this? Can I get sed to 'ignore' the spaces, or should I just use cut to pull the date out into one variable and do the same again to put the time into another?
Even if someone could kindly point me in the right direction that'd be great!
Thanks in advance!
Double-quote the variable:
sed -i '/origdate=empty/c\'"$origdate" $sd/pingcheck-email-$job.txt
or alternatively, double-quote the whole sed script (the backslash then has to be doubled so it is not taken as an escape for $):
sed -i "/origdate=empty/c\\$origdate" $sd/pingcheck-email-$job.txt
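As a self-contained sketch of the first form (writing a throwaway file instead of the questioner's $sd/pingcheck-email-$job.txt, and assuming GNU sed for -i):
origdate=$(date)
printf 'origdate=empty\n' > /tmp/pingcheck-test.txt
sed -i '/origdate=empty/c\'"$origdate" /tmp/pingcheck-test.txt
cat /tmp/pingcheck-test.txt    # the matched line has been replaced by the full date, spaces and all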
The problem is not with sed but rather with how bash word splits on your date given your command.
Bash
In bash, word splitting is performed on the command line so that text is broken up into a list of arguments. To illustrate, I'm going to run a simple script that outputs the first argument only.
bash -c 'echo $1' ignored_0 foo bar
Think of bash -c 'echo $1' ignored_0 as the command (sed in your case) and foo bar as the arguments. In this case, foo bar is split into two arguments, foo and bar.
To pass foo bar in as the first parameter, you need to have the text in either single or double quotes. See the GNU manual on quoting.
bash -c 'echo $1' ignored_0 'foo bar'
bash -c 'echo $1' ignored_0 "foo bar"
Parameter expansion does not occur when the variable is inside a single quote.
var="foo bar"
bash -c 'echo $1' ignored_0 '$var'
bash -c 'echo $1' ignored_0 "$var"
NOTE: In the command bash -c 'echo $1', I do not want $1 to expand before being passed as an argument to bash, because it is part of the code I want to execute.
Parameter expansion occurs when variables are outside of quotes, but word splitting will apply after the parameter is expanded. From the bash man page in the Word Splitting section:
The shell scans the results of parameter expansion, command
substitution, and arithmetic expansion that did not occur within
double quotes for word splitting.
var="foo bar"
bash -c 'echo $1' ignored_0 $var
The last step in Shell Expansions is Quote Removal, where unquoted quote characters are removed before the arguments are passed to commands. The following command shows that appending ''"" has no effect on the argument passed.
bash -c 'echo $1' ignored_0 foo''""
Application
In your example, the trailing '' after $origdate is extraneous. The important part is that $origdate is not quoted so word splitting applies to the expanded variable.
When -e is not passed to sed, sed expects the expression to be a single argument, i.e. one word from bash. When you run your command, the expression sed receives is /origdate=empty/c\Wed, and the rest of the date is treated as files for the expression to be applied to.
The simple fix is to put double quotes around the string for which you want to prevent word splitting. I've modified the command so that anyone can run this example without having the files on their system.
In this example, the \ must be escaped so that it is not considered an escape character for $.
echo "origdate=empty" | sed "/origdate=empty/c\\$origdate"
You can also change the type of quotes you are using without affecting word splitting like so.
echo "origdate=empty" | sed '/origdate=empty/c\'"$origdate"
You need to escape with a backslash, e.g. \/ or \%.

printing the ampersand

I have a bash script that takes a URL with variables and writes it to a file; the problem is that the ampersand is interfering and being interpreted as a command/control character.
In this situation the string cannot be escaped BEFORE being passed to the script, and I have yet to find any way to do this.
if [ $1 ] ; then
url=$1
printf %q "$url" > "/somepath/somefile"
fi
with $1 being for example localhost?x=1&y=2&z=3
What gets printed is only the part before the first ampersand: "localhost?x=1"
I have also tried echo instead of printf, but it's exactly the same.
Your script is fine, but you need to invoke the script with a quoted parameter:
./myscript.sh "localhost?x=1&y=2&z=3"
There is no problem with echo or printf. The problem is that when you run the script with an unquoted argument, the shell starts those two jobs in the background. For more information you can check: http://hacktux.com/bash/ampersand.
You can simply start the script with 'localhost?x=1&y=2&z=3' in single quotes, so bash will not treat the ampersand as an operator but just as a normal character.
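You can see the backgrounding with a stand-in for the script (a sketch; printf just shows what arrives as $1, and _ only fills the $0 slot):
bash -c 'printf "%s\n" "$1"' _ localhost?x=1&y=2&z=3      # & backgrounds the command, so only localhost?x=1 arrives
bash -c 'printf "%s\n" "$1"' _ 'localhost?x=1&y=2&z=3'    # single quotes: the full URL arrives intact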
Quote things. Replace all $1s with "$1"s, and quote the argument when you actually invoke your script.
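That is, a sketch of the snippet from the question with the quoting applied:
if [ "$1" ] ; then
    url="$1"
    printf %q "$url" > "/somepath/somefile"
fi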

How do I embed an expect script that takes in arguments into a bash shell script?

I am writing a bash script which amongst many other things uses expect to automatically run a binary and install it by answering installer prompts.
I was able to get my expect script to work fine when it is called from my bash script with the command "expect $expectscriptname $Parameter". However, I want to embed the expect script into the shell script instead of having to maintain two separate script files for the job. I searched around and found that the procedure to embed expect into a bash script is to declare a variable like VAR below and then echo it:
VAR=$(expect -c "
#content of expect script here
")
echo "$VAR"
1) I don't understand how echoing $VAR actually runs the expect script. Could anyone explain?
2) I am not sure how to pass $Parameter into VAR or to the echo statement. This is my main concern.
Any ideas? Thanks.
Try something like:
#!/bin/sh
tclsh <<EOF
puts $1
EOF
I don't have the expect command installed today, so I used tclsh instead.
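The parameter goes through in the same way, because the unquoted EOF delimiter lets the shell expand $1 before tclsh (or expect) ever sees the script; quoting it inside puts keeps a multi-word argument together (a sketch):
#!/bin/sh
# $1 is expanded by the shell before tclsh runs the embedded script
tclsh <<EOF
puts "$1"
EOF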
In bash, the construct $(cmd) runs the specified command and captures its output. It's similar to the backtick notation, though there are some slight differences. Thus, the assignment to VAR is what runs the expect command:
# expect is run here
VAR=$(expect -c "
# ...
")
# This echoes the output of the expect command.
echo "$VAR"
From the bash manual page:
When the old-style backquote form of substitution is used, backslash retains its literal meaning except when followed by $, `, or \. The first backquote not preceded by a backslash terminates the command substitution. When using the $(command) form, all characters between the parentheses make up the command; none are treated specially.
That's why it works: the bash comment character (#) isn't treated as a comment here, because the expect script sits inside a double-quoted string within the $( ... ) and is passed through to expect -c as-is.
EDIT
Passing parameters: Just put 'em in there. For instance, consider this script:
foo="Hello, there"
bar=$(echo "
# $foo
")
echo $bar
If you run that script, it prints:
# Hello, there
Thus, the value of $foo was substituted inside the quotes. The same should work for the expect script.
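So for the expect case, something along these lines should work (a sketch only: installer.bin and the prompt text are placeholders for whatever your installer actually asks, and $Parameter is the variable from the question):
VAR=$(expect -c "
    # these lines are Tcl comments inside the quoted string, not bash comments
    spawn ./installer.bin
    expect \"Enter the value:\"
    send \"$Parameter\r\"
    expect eof
")
echo "$VAR"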
Instead of a bash script and an expect script, have you considered writing just a single expect script?
Expect is a superset of Tcl, which means it is a fully functioning programming language, and you can do anything with it that you can do with bash (and for the things that you can't, you can always exec shell commands). You don't have to use expect just to "expect" things.
