I'm having trouble using a single quote in a command executed from within a shell script.
In my script I execute an rdesktop command that should have -u '' (<- 2 single quotes) as a parameter.
However, no matter how I try to escape the quotes, the argument is not passed correctly.
If I just echo $command, the output looks fine; if I execute it, weird output is produced.
This is the part of the script that doesn't work:
command="rdesktop -u "\'\'" $server"
`$command`
I also tried executing it directly:
`rdesktop -u "\'\'" $server`
I would appreciate any help, since I have read quite a few tutorials on escaping characters in shell scripts and didn't find the solution.
EDIT:
interestingly enough, if I just use
command=rdesktop -u '' $server
and echo it, the output is fine
however, if I execute it with
$command
it fails...
If your shell is bash or ksh or zsh, it's much safer and easier to build up a command with an array rather than a string:
command=( rdesktop -u '' "$server" )
and execute it like this
"${command[#]}"
I can't imagine the remote server needs to see a username named literally '' (i.e. 2 single quotes) -- it probably wants just an empty string.
When you invoke rdesktop -u '' $server, you are invoking rdesktop with 3 arguments, the second of which is the empty string. It's not clear to me why you are trying to invoke it in backticks, but you can get what you want using your first definition of command and:
eval $command
Note that if you do not completely control the contents of $server, then this is a big security risk. (For example, if server is the string ; cmd, then invoking eval will execute cmd)
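A harmless way to see that risk in action, with echo standing in for an attacker's command:
server='; echo INJECTED'
command="rdesktop -u '' $server"
eval $command    # runs rdesktop (with no server argument), then the injected echo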
First, if you type
`rdesktop something`
then that means to invoke rdesktop, take the output, and run that output as a command. To “execute it directly,” simply type
rdesktop something
Now for the quotes. If you execute
rdesktop -u '' $server
then rdesktop will not see the quotes. They are removed as part of the shell parsing the command. To get rdesktop to see an argument '', use
rdesktop -u "''" $server
There's no need really to add any further escaping, since single quotes are not special inside a double-quoted string.
EDIT: When all you want to do with the command in a variable is to execute it, note that this stripping of quotes only happens once:
cmd="rdesktop -u '' $server"
$cmd
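If you want to check exactly which words that expansion produces, you can substitute an argument printer for rdesktop (a quick sketch; the server name is a placeholder):
server=myserver.example.com
cmd="rdesktop -u '' $server"
printf '<%s>\n' $cmd
# <rdesktop>
# <-u>
# <''>
# <myserver.example.com>
Note that the third word really is the two-character string '', which is what rdesktop would receive as its -u argument.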
I am trying to create and use variables inside heredoc like this,
#!/bin/bash
sudo su - postgres <<EOF
IP="XYZ"
echo "$IP"
EOF
This doesn't work as expected; the echo just prints a blank line.
But if I use quotes around EOF like this,
#!/bin/bash
sudo su - postgres <<"EOF"
IP="XYZ"
echo "$IP"
EOF
It works. Can someone please explain this? According to what I read in the man page, the behaviour should be the opposite.
The shell evaluates the unquoted here document and performs variable interpolation before passing it to the command (in your case, sudo). Because IP is not a defined variable in the parent shell, it gets expanded to an empty string.
With quotes, you prevent variable interpolation by the parent shell, and so the shell run by sudo sees and expands the variable.
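A minimal sketch showing both behaviours in a single unquoted here-document (assuming bash as the outer shell):
#!/bin/bash
IP="XYZ"                # defined in the parent shell
sudo su - postgres <<EOF
echo "$IP"              # expanded by the parent shell before sudo runs, prints XYZ
echo "\$HOME"           # the backslash defers expansion to the postgres shell
EOF
Escaping the dollar sign (or quoting the delimiter, as in your second version) is the usual way to let the inner shell do the expansion.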
I get a set of strings as input in the terminal. I need to replace the ".awk" substring with ".sh" in each string using the shell and then output the modified string.
I wrote such script for doing this:
#!/bin/bash
while read line
do
result=${line/.awk/.sh}
echo $result
done
But it gives me an error: script-ch.sh: 6: script-ch.sh: Bad substitution.
How should I change this simple script to fix error?
UPD: ".awk" may be inside the string. For example: "file.awk.old".
If you are using Bash, then there is nothing wrong with your substitution. There is no reason to spawn an additional subshell and use a separate utility when bash substring replacement was tailor-made to do that job:
$ fn="myfile.awk.old"; echo "$fn --> ${fn/.awk/.sh}"
myfile.awk.old --> myfile.sh.old
Note: if you are substituting .sh for .awk, then the . is unnecessary. A simple ${fn/awk/sh} will suffice.
I suspect you have some stray DOS character in your original script.
Not sure why it works for you and not for me; it might be the input you're giving it. It could have a space in it.
This should work:
#!/bin/bash
while read line
do
result=$(echo "$line" | sed 's/\.awk/.sh/')
echo "$result"
done
If you run chmod +x script.sh and then run it with ./script.sh, or if you run it with bash script.sh, it should work fine.
Running it with sh script.sh will not work because the hashbang line will be ignored and the script will be interpreted by dash, which does not support that string substitution syntax.
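You can confirm the difference without the script by pointing each shell at the same expression (assuming dash is installed; the exact error text may vary by version):
bash -c 'line=file.awk.old; echo "${line/.awk/.sh}"'    # prints file.sh.old
dash -c 'line=file.awk.old; echo "${line/.awk/.sh}"'    # fails with a Bad substitution error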
I cannot believe I've spent 1.5 hours on something as trivial as this. I'm writing a very simple shell script which greps a file, stores the output in a variable, and echos the variable to STDOUT.
I have checked the grep command with the regex on the command line, and it works fine. But for some reason, the grep command doesn't work inside the shell script.
Here is the shell script I wrote up:
#!/bin/bash
tt=grep 'test' $1
echo $tt
I ran this with the following command: ./myScript.sh testingFile. It just prints an empty line.
I have already used chmod and made the script executable.
I have checked that the PATH variable has /bin in it.
Verified that echo $SHELL gives /bin/bash
In my desperation, I have tried all combinations of:
tt=grep 'test' "$1"
echo ${tt}
Not using the command line argument at all, and hardcoding the name of the file tt=grep 'test' testingFile
I found this: grep fails inside bash script but works on command line, and even used dos2unix to remove any possible carriage returns.
Also, when I try to use any of the grep options, like: tt=grep -oE 'test' testingFile, I get an error saying: ./out.sh: line 3: -oE: command not found.
This is crazy.
You need to use command substitution:
#!/usr/bin/env bash
test=$(grep 'foo' "$1")
echo "$test"
Command substitution allows the output of a command to replace the command itself. Command substitution occurs when a command is enclosed like this:
$(command)
or like this using backticks:
`command`
Bash performs the expansion by executing COMMAND and replacing the command substitution with the standard output of the command, with any trailing newlines deleted. Embedded newlines are not deleted, but they may be removed during word splitting.
The $() version is usually preferred because it allows nesting:
$(command $(command))
For more information read the command substitution section in man bash.
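As a small illustration of nesting, the inner substitution here feeds the outer one (readlink -f assumes GNU coreutils):
script_dir=$(dirname "$(readlink -f "$0")")   # $0 is the script's own path when run as a script
echo "This script lives in $script_dir"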
I'm using the psql command \copy and I would like to pass a variable to it from the shell (for table name) like I've done when scripting queries. I've read in the documentation that:
The syntax of the command is similar to that of the SQL COPY command. Note that, because of this, special parsing rules apply to the \copy command. In particular, the variable substitution rules and backslash escapes do not apply.
This seems quite definitive; however, I'm wondering if anyone knows of a workaround.
You could use shell variable substitution with heredoc syntax. Example:
#!/bin/sh
tablename=foo
psql -d test <<EOF
\copy $tablename FROM '/path/to/file'
EOF
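The same shell-side substitution works without a here-document too, since psql -c accepts a single backslash command (a sketch; table name and path are placeholders):
tablename=foo
psql -d test -c "\copy $tablename FROM '/path/to/file'"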
You can pass the variable from the shell to psql with the -v psql_var="$shell_var" command-line parameter (or access it directly with a shell escape `echo "$shell_var"` after exporting it). Then you can build the \copy meta-command inside another meta-command (locally with \set or server-side with \gset). Example:
#!/bin/sh
tablename=foo
psql -d test -v tbl="$tablename" <<\EOF
\set cmd '\\copy ' :tbl ' FROM ''/path/to/file'''
\echo :cmd
:cmd
EOF
I am writing a bash script which amongst many other things uses expect to automatically run a binary and install it by answering installer prompts.
I was able to get my expect script to work fine when it is called from my bash script with the command "expect $expectscriptname $Parameter". However, I want to embed the expect script in the shell script instead of maintaining two separate script files for the job. I searched around and found that the usual way to embed expect in a bash script is to declare a variable like VAR below and then echo it:
VAR=$(expect -c "
#content of expect script here
")
echo "$VAR"
1) I don't understand how echoing $VAR actually runs the expect script. Could anyone explain?
2) I am not sure how to pass $Parameter into VAR or to the echo statement. This is my main concern.
Any ideas? Thanks.
Try something like:
#!/bin/sh
tclsh <<EOF
puts $1
EOF
I don't have the expect command installed today, so I used tclsh instead.
In bash, the construct $(cmd) runs the specified command and captures its output. It's similar to the backtick notation, though there are some slight differences. Thus, the assignment to VAR is what runs the expect command:
# expect is run here
VAR=$(expect -c "
# ...
")
# This echoes the output of the expect command.
echo "$VAR"
From the bash manual page:
When the old-style backquote form of substitution is used, backslash retains its literal meaning except when followed by $, `, or \. The first backquote not preceded by a backslash terminates the command substitution. When using the $(command) form, all characters between the parentheses make up the command; none are treated specially.
That's why it works: The bash comment character (#) isn't treated as a comment character within the $( ... ).
EDIT
Passing parameters: Just put 'em in there. For instance, consider this script:
foo="Hello, there"
bar=$(echo "
# $foo
")
echo $bar
If you run that script, it prints:
# Hello, there
Thus, the value of $foo was substituted inside the quotes. The same should work for the expect script.
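Putting the two pieces together, an embedded expect script that receives a bash parameter might look something like this (a sketch; the installer name and prompt text are made up, and the parameter must not contain characters that Tcl treats specially, such as quotes, brackets, or dollar signs):
#!/bin/bash
Parameter="$1"
VAR=$(expect -c "
    spawn ./installer.bin
    expect \"Enter license key:\"
    send \"$Parameter\r\"
    expect eof
")
echo "$VAR"    # shows whatever output the installer produced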
Instead of a bash script and an expect script, have you considered writing just a single expect script?
Expect is a superset of Tcl, which means it is a fully functioning programming language, and you can do anything with it that you can do with bash (and for the things that you can't, you can always exec bash commands). You don't have to use expect just to "expect" things.