printing the ampersand - bash

I have a bash script that takes a URL containing query variables and writes it to a file. The problem is that the ampersand is being interpreted as a control operator rather than as part of the string.
In this situation the string cannot be escaped before being passed to the script, and I have yet to find any way to do this.
if [ $1 ] ; then
url=$1
printf %q "$url" > "/somepath/somefile"
fi
with $1 being for example localhost?x=1&y=2&z=3
What gets printed is only the part before the first ampersand: "localhost?x=1"
I have also tried echo instead of printf, but the result is exactly the same.

Your script is fine, but you need to invoke the script with a quoted parameter:
./myscript.sh "localhost?x=1&y=2&z=3"

There is no problem with echo or printf. The problem is that when you run the script with an unquoted argument, bash treats each & as a control operator and starts the parts before it as background jobs. For more information you can check: http://hacktux.com/bash/ampersand.
You can simply start the script with 'localhost?x=1&y=2&z=3' in single quotes, so bash will not treat the ampersand as an operator but just as a normal character.

Quote things. Replace every $1 with "$1", and quote the argument when you actually invoke your script.
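To make the fix concrete, here is a minimal sketch of the script and its invocation (the filename save_url.sh and the output path /tmp/somefile are placeholders, not from the question):

```shell
#!/bin/bash
# save_url.sh - minimal sketch; writes its first argument to a file.
# [ -n "$1" ] tests for a non-empty first argument; note the quotes.
if [ -n "$1" ]; then
    url=$1
    printf '%s\n' "$url" > /tmp/somefile
fi
```

Invoke it as ./save_url.sh 'localhost?x=1&y=2&z=3'. Without the quotes, the shell parses the call as ./save_url.sh localhost?x=1 & y=2 & z=3: the first two parts are started as background jobs, and the script only ever receives localhost?x=1.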

Related

Bash script printing a variable value after a newline character

I have a bash script which calls another script (some_script). some_script expects some input from the user, and I have used a printf statement for this purpose.
But the problem is that the variable values are not being accepted by the target script. I think this is because '\' is being taken as an escape character in the script.
The statement somewhat looks like this
printf 'yes\n$var1\n$var2\n$var3' | some_script
If I directly replace the variables with their values it runs perfectly, but I want the script to take the values from the variables. How do I achieve this?
There is a difference between " and '. Try
printf "yes\n$var1\n$var2\n$var3" | some_script
because with ' the variables won't get substituted.
Yes, \ is a character that has to be escaped: use \\n if you want printf to emit a literal \n rather than a newline.
For more detail, we would need to know more about how your script works.
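A short sketch of the difference (the variable values are just illustrative):

```shell
var1=foo var2=bar var3=baz

# Double quotes: the shell substitutes the variables first, and printf
# then interprets each \n as a newline.
printf "yes\n$var1\n$var2\n$var3\n"

# Single quotes would pass the string through untouched, so the target
# script would literally receive the text $var1, not its value.

# A more robust variant: keep the format string constant and pass the
# values as arguments, so a stray % inside a value cannot be misread
# as a format directive.
printf 'yes\n%s\n%s\n%s\n' "$var1" "$var2" "$var3"
```

Both printf calls emit the same four lines (yes, foo, bar, baz), which can then be piped into some_script.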

Running variable commands in BASH

I have a BASH script called script.sh that takes 3 arguments and runs an executable file with them. The first two are just numbers, but the last is an argument giving the input file. I would like the script to run the executable both with the input file as a normal argument and with "<" redirecting the file to stdin. For example,
bash script.sh 5 1 input.txt
calls the BASH script, and the contents of script.sh are as follows:
#!/bin/bash
command1="./l${1}t${2} $3"
command2="./l${1}t${2} < $3"
echo + ${command1}
${command1}
echo + ${command2}
${command2}
When I echo command1 I get
./l5t1 input.txt
which is exactly what I want and it runs just fine.
When I echo command2 I get
./l5t1 < input.txt
which is again what I want. The problem is the actual command the script runs is
./l5t1 '<' input.txt
which of course causes a segmentation fault in my program.
I'd like to know if there is a way I can run command2 so that it executes the string exactly as it is printed in the echo output. Honestly, I have no idea why the single quotes are even inserted around the < character.
If you want to store commands it's better to use functions than variables. As you've found out, redirections don't work when stored in variables (nor do |, ;, or &).
command1() {
"./l${1}t${2}" "$3"
}
command2() {
"./l${1}t${2}" < "$3"
}
command1 "$@"
command2 "$@"
Here I've defined two functions, which are then called with the script's own arguments; "$@" forwards all of the script's positional parameters to the functions.
Notice also that I've put quotes around "./l${1}t${2}" and "$3". Using double quotes allows these parameters to contain spaces. Liberal quoting is a good defensive scripting technique.
(I strongly recommend not doing eval "$command2". Using eval is a really dangerous habit to get into.)
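To see why the variable form fails, here is a small demonstration (the helper function and file paths are illustrative, not from the question). The shell parses redirections before it expands variables, so a "<" produced by expansion is just an ordinary word:

```shell
# A stand-in command that prints each argument it receives in brackets.
show_args() { printf '[%s]' "$@"; echo; }

# Redirection stored in a variable: by the time $cmd is expanded, the
# shell has already finished parsing redirections, so '<' and 'input.txt'
# are passed as two ordinary arguments.
cmd='show_args < input.txt'
$cmd    # prints: [<][input.txt]

# Redirection inside a function body is parsed as syntax and works:
printf 'hello\n' > /tmp/demo_input
run_cat() { cat < "$1"; }
run_cat /tmp/demo_input    # prints: hello
```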

Bash: Passing a variable into a script that has spaces

I currently have a bash script. It looks like this:
#!/bin/bash
case "$1" in
sendcommand)
do X with $2
exit
;;
esac
How would I send all of "this command with spaces" into $2, without "command" acting as $3, "with" as $4, and so on? Is there something like PHP's or JavaScript's encodeURI for bash?
You also have to call your script with the second argument in quotes too
./scriptname sendcommand "command with spaces"
Your script should look like this
#!/bin/bash
case "$1" in
sendcommand)
something "$2"
exit
;;
esac
You can just use double quotes:
do X with "$2"
Enclose it in double quotes:
do_X_with "$2"
The double quotes preserve the internal spacing of the variable, including newlines if they are in the value in the first place. Indeed, it is important to understand the uses of double quotes with "$@" and "$*" too, not to mention when using bash arrays.
You can't easily have a command called do because the shell uses do as a keyword in its loop structure. To invoke it, you would have to specify a path to the command, such as ./do or $HOME/bin/do.
But $2 is "this" and the OP wants it to be "this command with spaces".
OK. We need to review command-line invocations and desired behaviours. Let's assume that the script being executed is called script, and that the command being executed is othercommand (we can't use command; that is a standard utility).
Possible invocations include:
script sendcommand "this command with spaces"
script sendcommand 'this command with spaces'
script sendcommand this command with spaces
The single-quote and double-quote invocations are equivalent in this example. They wouldn't be equivalent if there were variables to be expanded or commands to be invoked inside the argument lists.
It is possible to write script to handle all three cases:
#!/bin/bash
case "$1" in
sendcommand)
shift
othercommand "$*"
exit
;;
esac
Suppose that the invocation encloses the arguments in quotes. The shift command removes $1 from the argument list and renumbers the remaining (single) argument as $1. The script then invokes othercommand with a single string argument consisting of the contents of the arguments concatenated together. If there were several arguments, the contents would be separated by a single space (the first character of $IFS).
Suppose that the invocation does not enclose the arguments in quotes. The shift command still removes $1 (the sendcommand) from the argument list, and "$*" then joins the remaining arguments into a single space-separated argument.
In all three cases, the othercommand sees a single argument that consists of "this command with spaces" (where the program does not see the double quotes, of course).
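Here is a runnable sketch of that dispatch pattern; othercommand is stood in by a function that brackets each argument it receives, so the joining is visible:

```shell
#!/bin/bash
# Sketch only: a stand-in that prints each of its arguments in brackets.
othercommand() { printf '<%s>\n' "$@"; }

case "$1" in
    sendcommand)
        shift               # drop 'sendcommand'; the rest renumber from $1
        othercommand "$*"   # "$*" joins the remaining args into ONE word
        exit
        ;;
esac
```

All three invocations above print <this command with spaces>. Had the script used "$@" instead of "$*", the unquoted invocation would instead produce four separately bracketed words.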

Stop bash from expanding $ from command line

I have a script I am trying to call that needs to have the $ symbol passed to it. If I run the script as
./script "blah$blah"
it is passed in fine, but then the script calls another program I have no control over, which expands the parameter to just "blah". The program is being called with program $@. I was wondering if there was a way to prevent the parameter from being expanded when passed to the next script.
Escape the character $ with: \, e.g.: "This will not expand \$hello"
use single quotes: 'This will not expand $hello'
Use a HERE DOC:
<<'EOF'
This will not expand $hello
EOF
In your case I recommend using single quotes for readability: ./script 'blah$blah'.
A couple of options involving changing the quoting:
./script 'blah$blah'
./script "blah\$blah"
I hope this helps.
Call using single quotes:
./script 'blah$blah'
Or escape the $
./script "blah\$blah"
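The three quoting behaviours side by side (the variable name blah is purely illustrative):

```shell
blah=SURPRISE

printf '%s\n' 'blah$blah'    # single quotes: prints blah$blah
printf '%s\n' "blah\$blah"   # escaped $ in double quotes: prints blah$blah
printf '%s\n' "blah$blah"    # unescaped $ in double quotes: prints blahSURPRISE
```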

How do I embed an expect script that takes in arguments into a bash shell script?

I am writing a bash script which amongst many other things uses expect to automatically run a binary and install it by answering installer prompts.
I was able to get my expect script to work fine when it is called from my bash script with the command "expect $expectscriptname $Parameter". However, I want to embed the expect script into the shell script instead of having to maintain two separate script files for the job. I searched around and found that the procedure for embedding expect in a bash script is to declare a variable like VAR below and then echo it:
VAR=$(expect -c "
#content of expect script here
")
echo "$VAR"
1) I don't understand how echoing $VAR actually runs the expect script. Could anyone explain?
2) I am not sure how to pass $Parameter into VAR or to the echo statement. This is my main concern.
Any ideas? Thanks.
Try something like:
#!/bin/sh
tclsh <<EOF
puts $1
EOF
I don't have the expect command installed today, so I used tclsh instead.
In bash, the construct $(cmd) runs the specified command and captures its output. It's similar to the backtick notation, though there are some slight differences. Thus, the assignment to VAR is what runs the expect command:
# expect is run here
VAR=$(expect -c "
# ...
")
# This echoes the output of the expect command.
echo "$VAR"
From the bash manual page:
When the old-style backquote form of substitution is used, backslash retains its literal meaning except when followed by $, `, or \. The first backquote not preceded by a backslash terminates the command substitution. When using the $(command) form, all characters between the parentheses make up the command; none are treated specially.
That's why it works: The bash comment character (#) isn't treated as a comment character within the $( ... ).
EDIT
Passing parameters: Just put 'em in there. For instance, consider this script:
foo="Hello, there"
bar=$(echo "
# $foo
")
echo $bar
If you run that script, it prints:
# Hello, there
Thus, the value of $foo was substituted inside the quotes. The same should work for the expect script.
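As a concrete sketch, here is the same mechanism with bash -c standing in for expect -c (the variable name and values are illustrative); command substitution plus double quotes works identically for both:

```shell
#!/bin/bash
Parameter="hello-world"

# Because the script text is in DOUBLE quotes, the outer shell substitutes
# $Parameter before bash -c (or expect -c) ever sees it. The $(...)
# assignment is what actually runs the inner script; echo only displays
# the captured output.
VAR=$(bash -c "
echo answering-prompt
echo $Parameter
")
echo "$VAR"    # prints: answering-prompt, then hello-world
```

With expect -c, any dollar signs that the inner expect script must see literally (rather than have bash expand) would need to be escaped as \$.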
Instead of a bash script and an expect script, have you considered writing just a single expect script?
Expect is a superset of Tcl, which means it is a fully functioning programming language, and you can do anything with it that you can do with bash (and for the things that you can't, you can always exec bash commands). You don't have to use expect just to "expect" things.
