Expect loop doesn't print variables - bash

I can't get the variables inside an expect loop to be printed:
#!/bin/bash
expect -c "
set var1 10
puts {$var1}
puts $expect_out({$var1})
foreach name { bob jim jacobo henric } {
puts {hello $name $var1}
}"
my output is:
# ./test
({})
hello
hello
hello
hello
So basically it's not expanding any variable.
Any idea?

You should quote with ', not "! This should get the job done:
#!/bin/bash
expect -c '
set var1 10
puts "$var1"
send_user "Confirm? (Y/n): "
expect_user -re "(.*)\n"
set var1 $expect_out(1,string)
foreach name { bob jim jacobo henric } {
puts "hello $name $var1"
}
'
Output:
10
Confirm? (Y/n): Y
hello bob Y
hello jim Y
hello jacobo Y
hello henric Y

Thanks a lot, it worked like a charm.
But it has some issues when the list elements are passed in via a parameter, like this:
function a {
names=$1
expect -c '
foreach name { $names } {
puts "$name"
}
'
}
It just prints:
$names

You mean that any bash variable must be wrapped in single and double quotes, like this?
' "$variable" '

OK, I decided to put the expect code in a separate script, test.ex:
set var [lindex $argv 0]
foreach int { $var } {
puts $int
}
When I call the expect script, it does not cycle over the string's elements:
test.ex "string1 string2"
as it just prints:
$var
It just ignores the variable's content and treats the variable name as a literal string without expanding it. Is there a way to get it to work?

I figured it out!
Just needed to remove the curly brackets {} around the variable that needs to be looped over:
set var [lindex $argv 0]
foreach int $var {
puts $int
}
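For reference, the complete test.ex would then be something like this (a sketch, assuming expect is installed at /usr/bin/expect):
#!/usr/bin/expect
set var [lindex $argv 0]
foreach int $var {
    puts $int
}
Called as ./test.ex "string1 string2" it prints string1 and string2 on separate lines, because without the braces $var is substituted and its value is treated as a Tcl list that foreach splits on whitespace.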

Related

Iteration in TCL using for each

I'm trying to iterate over values using foreach in Tcl.
This is how I give the input:
./a.sh value1,value2
In the a.sh script:
set var [lindex $argv 0]
set values [split $var ","]
foreach {set i 0} {$values} {
puts "iterated once $i"
}
I want the iteration to happen twice since there are two values passed, but instead it iterates only once.
Please help me with this.
Thanks in advance.
The foreach command, in its simplest form, takes a variable name, a list value, and a script to run for each element of the list. What you were passing was… weird in several ways at once. Look, here's a correct way of doing it:
set values [split $var ","]
foreach item $values {
puts "iterated: $item"
}
If you want to count through, you should set up your own counter:
set values [split $var ","]
foreach item $values {
set i [incr counter]
puts "iterated #$i: $item"
}
That can be shortened to this:
foreach item [split $var ","] {
puts "iterated #[incr counter]: $item"
}

for loop, expect and pssh

I need a script to modify the same data on multiple servers. For now the for loop generates the command lines, but I'm experiencing some problems with expect and pssh.
The for loop:
for ((var1=1;var1<=14; var1++))
{
cda stm add "$var1/$var2/$var3" ss 1
for ((var2=1;var2<=8;var2++))
{
cda stm add "$var1/$var2/$var3" ss 1
for ((var3=1; var3<64; var3++))
{
cda stm add "$var1/$var2/$var3" ss 1
}
}
}
I'm using pssh instead of ssh in the expect script.
The full code:
#!/usr/bin/expect
set timeout 20
set ip [lindex $argv 0]
set user [lindex $argv 1]
set password [lindex $argv 2]
for ((var1=1;var1<=14; var1++))
{
cda stm add "$var1/$var2/$var3" ss 1
for ((var2=1;var2<=8;var2++))
{
cda stm add "$var1/$var2/$var3" ss 1
for ((var3=1; var3<64; var3++))
{
cda stm add "$var1/$var2/$var3" ss 1
}
}
}
spawn pssh "$user\#$ip"
expect "yes/no" {
send "yes\r"
expect "*?assword" { send "[lindex $argv 2]\r" }
} "*?assword" { send "[lindex $argv 2]\r" }
expect "SH"
interact
I'm getting the following error:
wrong # args: should be "for start test next command"
while executing
"for ((var1=1"
(file "./ssh" line 9)
for loops in expect and Tcl have to look like this:
for {set var1 1} {$var1<=14} {incr var1} {
commands...
}
In other words, the for command requires four arguments: the start code, the condition, the "next" code and the loop body. Note that newlines are command separators in Tcl, so the open brace for the loop body must be on the same line as the for command (or you must use a backslash-newline line continuation).
You used bash/ksh syntax instead.
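For reference, the equivalent counting loop in bash/ksh syntax (roughly the form the question was written in) would be:
for ((var1=1; var1<=14; var1++)); do
    echo "$var1"
done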
Apart from the syntax error in the for loop, there's a logical problem with the code snippet in the question:
var2 isn't available for printing in the outermost loop (for the first time).
var3 isn't available for printing in the 2nd loop (for the first time).
Here's probably what you need:
for {set var1 1} {$var1 <=14} {incr var1} {
puts "LEVEL_1: $var1"
# cda stm add "$var1" ss 1
for {set var2 1} {$var2 <=8} {incr var2} {
puts "LEVEL_2: $var1/$var2"
# cda stm add "$var1/$var2" ss 1
for {set var3 1} {$var3 <64} {incr var3} {
puts "LEVEL_3: $var1/$var2/$var3"
# cda stm add "$var1/$var2/$var3" ss 1
}
}
}

Variables using in Expect/TCL in Bash Script (missing operand at _@_)

Hi all,
I want to use expect code in a bash script. For background, here is where I got my solution for the expect script: Save terminal output to variable in expect/tcl
The script works fine when I use it standalone. But when I use it in a bash script to process multiple files, I have problems with the variables inside the expect part.
I get this error:
set new error bounds? (0: no, 1: yes): missing operand at _@_
in expression " _@_> 0.3 || > 0.3 "
(parsing expression " > 0.3 || > 0.3 ")
...
Code:
#! /bin/bash
MLI_offs=$1
MLI_snr=$2
MLI_diff_par=$3
/usr/bin/expect <<EOF
spawn offset_fitm $MLI_offs $MLI_snr $MLI_diff_par MLI_coffs MLI_coffsets 7.0 6 1
set range 1.5
set azimuth 1.5
while {true} {
expect "enter minimum SNR threshold:"
send "7.0\r"
expect "enter the range and azimuth error thresholds:"
send "$range $azimuth\r"
expect -re {range: ([0-9.]+) azimuth: ([0-9.]+)} {
set range $expect_out(1,string)
set azimuth $expect_out(2,string)
}
expect "set new error bounds? (0: no, 1: yes):" {
if { $range > 0.3 || $azimuth > 0.3 } {
send "1\r"
} else {
send "0\r"
break
}
}
}
interact
EOF
Thanks,
Bjoern
The problem is that the variables are being substituted inside bash as well as Tcl/Expect. Since bash replaces unknown variables with the empty string, this leaves an entirely wrong script (which in turn complains because it can't figure out what's going on). It will have broken other things too (the use of expect_out) but it happens you've not hit it.
The simplest thing is to stop using bash as a wrapper as Tcl's quite adept at doing that sort of thing itself via the argv global. Thus:
#! /usr/bin/expect
set MLI_offs [lindex $argv 0]
set MLI_snr [lindex $argv 1]
set MLI_diff_par [lindex $argv 2]
# Alternatively, replace the preceding three lines with:
# lassign $argv MLI_offs MLI_snr MLI_diff_par
spawn offset_fitm $MLI_offs $MLI_snr $MLI_diff_par MLI_coffs MLI_coffsets 7.0 6 1
set range 1.5
set azimuth 1.5
while {true} {
expect "enter minimum SNR threshold:"
send "7.0\r"
expect "enter the range and azimuth error thresholds:"
send "$range $azimuth\r"
expect -re {range: ([0-9.]+) azimuth: ([0-9.]+)} {
set range $expect_out(1,string)
set azimuth $expect_out(2,string)
}
expect "set new error bounds? (0: no, 1: yes):" {
if { $range > 0.3 || $azimuth > 0.3 } {
send "1\r"
} else {
send "0\r"
break
}
}
}
interact
You can escape the special '$' with a '\' for all expect (Tcl) variables to prevent bash from interpreting them; e.g. $range should be written as \$range in your bash script.
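For example, a minimal sketch of that escaping approach (the names come from the question, the values are made up): bash variables are left alone so bash expands them inside the heredoc, while expect/Tcl variables get a leading backslash so they survive until expect sees them:
#!/bin/bash
MLI_offs=$1    # bash variable: expanded by bash inside the heredoc

/usr/bin/expect <<EOF
puts "offset file: $MLI_offs"
set range 1.5
set azimuth 1.5
send_user "thresholds: \$range \$azimuth\n"
if { \$range > 0.3 || \$azimuth > 0.3 } {
    send_user "above threshold\n"
}
EOF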

while loops within expect

I am using expect within bash. I want my script to telnet into a box, expect a prompt, and send a command. If a different prompt then appears, it should proceed; otherwise it should send that command again.
My script goes like this:
#!/bin/bash
# IP and PORT1 are filled in here
expect -c "
set timeout -1
spawn telnet $IP $PORT1
sleep 1
send \"\r\"
send \"\r\"
set temp 1
while( $temp == 1){
expect {
Prompt1 { send \"command\" }
Prompt2 {send \"Yes\"; set done 0}
}
}
"
Output:
invalid command name "while("
while executing
"while( == 1){"
Kindly help me.
I tried to change it to while [ $temp == 1] {
I am still facing the error below:
Output:
invalid command name "=="
while executing
"== 1"
invoked from within
"while [ == 1] {
expect {
This is how I'd implement this:
expect -c '
set timeout -1
spawn telnet [lindex $argv 0] [lindex $argv 1]
send "\r"
send "\r"
expect {
Prompt1 {
send "command"
exp_continue
}
Prompt2 {
send "Yes\r"
}
}
' $IP $PORT1
use single quotes around the expect script to protect expect variables
pass the shell variables as arguments to the script.
use "exp_continue" to loop instead of an explicit while loop (you had the wrong terminating variable name anyway)
The syntax for while is "while test body". There must be a space between each of those parts, which is why you get the error invalid command name "while(".
Also, because of Tcl quoting rules, 99.99% of the time the test needs to be in curly braces. So the syntax is:
while {$temp == 1} {
For more information see http://tcl.tk/man/tcl8.5/TclCmd/while.htm
(you probably have other problems related to your choice of shell quotes; this answer addresses your specific question about the while statement)
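For completeness, here is roughly what the asker's while-based version looks like with the Tcl syntax corrected, written as a standalone expect script instead of inline in bash (a sketch; Prompt1, Prompt2 and the command are the placeholders from the question, and the host and port are passed as arguments):
#!/usr/bin/expect
set timeout -1
spawn telnet [lindex $argv 0] [lindex $argv 1]
send "\r"
send "\r"
set temp 1
while {$temp == 1} {
    expect {
        "Prompt1" { send "command\r" }
        "Prompt2" { send "Yes\r"; set temp 0 }
    }
}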

idioms for returning multiple values in shell scripting

Are there any idioms for returning multiple values from a bash function within a script?
http://tldp.org/LDP/abs/html/assortedtips.html describes how to echo multiple values and process the results (e.g., example 35-17), but that gets tricky if some of the returned values are strings with spaces in them.
A more structured way to return would be to assign to global variables, like
foo () {
FOO_RV1="bob"
FOO_RV2="bill"
}
foo
echo "foo returned ${FOO_RV1} and ${FOO_RV2}"
I realize that if I need re-entrancy in a shell script I'm probably doing it wrong, but I still feel very uncomfortable throwing global variables around just to hold return values.
Is there a better way? I would prefer portability, but it's probably not a real limitation if I have to specify #!/bin/bash.
In the special case where your values never contain spaces, this read trick can be a simple solution:
get_vars () {
#...
echo "value1" "value2"
}
read var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"
But of course, it breaks as soon as there is a space in one of the values. You could modify IFS and use a special separator in your function's echo, but then the result is not really simpler than the other suggested solutions.
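For instance, a sketch of that IFS variant, assuming bash and that a tab character can never occur inside a value:
get_vars () {
    # join the values with a tab, which we assume never appears in them
    printf '%s\t%s\n' "first value" "second value with spaces"
}

IFS=$'\t' read -r var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"
# var1='first value', var2='second value with spaces'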
This question was posted 5 years ago, but I have an interesting answer to post. I have just started learning bash, and I encountered the same problem you did. I think this trick might be helpful:
#!/bin/sh
foo=""
bar=""
my_func(){
echo 'foo="a"; bar="b"'
}
eval $(my_func)
echo $foo $bar
# result: a b
This trick is also useful when a child process cannot send a value back to its parent process.
Much as I love shell, it's probably the case that as soon as you're throwing arbitrary structured data around, Unix Bourne/POSIX shell is not the right choice.
If there are characters which do not occur inside fields, then separate with one of those. The classic example is /etc/passwd, /etc/group and various other files which use a colon as a field separator.
If using a shell which can handle a NUL character inside strings, then joining on the NUL and separating on it (via $IFS or whatever) can work well. But several common shells, including bash, break on NUL. A test would be an old .sig of mine:
foo=$'a\0b'; [ ${#foo} -eq 3 ] && echo "$0 rocks"
Even if that would work for you, you've just reached one of the warning signs that it's time to switch to a more structured language (Python, Perl, Ruby, Lua, Javascript ... pick your preferred poison). Your code is likely to become hard to maintain; even if you can, there's a smaller pool of people who'll understand it well enough to maintain it.
Yet another way:
function get_tuple()
{
echo -e "Value1\nValue2"
}
IFS=$'\n' read -d '' -ra VALUES < <(get_tuple)
echo "${VALUES[0]}" # Value1
echo "${VALUES[1]}" # Value2
In older versions of Bash that don't support nameref (introduced in Bash 4.3-alpha), I may define a helper function in which the return value is assigned to the given variable. It's sort of like using eval to do the same kind of variable assignment.
Example 1
## Add two complex numbers and return the result.
## re: real part, im: imaginary part.
##
## Helper function named by the 5th positional parameter
## has to have been defined before the function is called.
complexAdd()
{
local re1="$1" im1="$2" re2="$3" im2="$4" fnName="$5" sumRe sumIm
sumRe=$(($re1 + $re2))
sumIm=$(($im1 + $im2))
## Call the function and return 2 values.
"$fnName" "$sumRe" "$sumIm"
}
main()
{
local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
## Define the function to receive multiple return values
## before calling complexAdd().
retValAssign() { bazRe="$1"; bazIm="$2"; }
## Call complexAdd() for the first time.
complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm" 'retValAssign'
## Redefine the function to receive multiple return values.
retValAssign() { quxRe="$1"; quxIm="$2"; }
## Call complexAdd() for the second time.
complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm" 'retValAssign'
echo "foo = $fooRe + $fooIm i"
echo "bar = $barRe + $barIm i"
echo "baz = foo + bar = $bazRe + $bazIm i"
echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
Example 2
## Add two complex numbers and return the result.
## re: real part, im: imaginary part.
##
## Helper functions
## getRetRe(), getRetIm(), setRetRe() and setRetIm()
## have to have been defined before the function is called.
complexAdd()
{
local re1="$1" im1="$2" re2="$3" im2="$4"
setRetRe "$re1"
setRetRe $(($(getRetRe) + $re2))
setRetIm $(($im1 + $im2))
}
main()
{
local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
## Define getter and setter functions before calling complexAdd().
getRetRe() { echo "$bazRe"; }
getRetIm() { echo "$bazIm"; }
setRetRe() { bazRe="$1"; }
setRetIm() { bazIm="$1"; }
## Call complexAdd() for the first time.
complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm"
## Redefine getter and setter functions.
getRetRe() { echo "$quxRe"; }
getRetIm() { echo "$quxIm"; }
setRetRe() { quxRe="$1"; }
setRetIm() { quxIm="$1"; }
## Call complexAdd() for the second time.
complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm"
echo "foo = $fooRe + $fooIm i"
echo "bar = $barRe + $barIm i"
echo "baz = foo + bar = $bazRe + $bazIm i"
echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
You can make use of associative arrays if you have bash 4, e.g.
declare -A ARR
function foo(){
...
ARR["foo_return_value_1"]="VAR1"
ARR["foo_return_value_2"]="VAR2"
}
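Assuming the declaration above, the caller side then reads the values straight out of the array (the stored strings are just the placeholders from the snippet):
foo
echo "first:  ${ARR[foo_return_value_1]}"    # VAR1
echo "second: ${ARR[foo_return_value_2]}"    # VAR2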
you can combine them as strings.
function foo(){
...
echo "$var1|$var2|$var3"
}
then whenever you need to use those return values,
ret="$(foo)"
IFS="|"
set -- $ret
echo "var1 one is: $1"
echo "var2 one is: $2"
echo "var3 one is: $3"
I would go for the solution I suggested here, but using an array variable instead. Older versions of bash don't support associative arrays.
E.g.,
function some_func() # ARRVAR args...
{
local _retvar=$1 # I use underscore to avoid clashes with return variable names
local -a _out
# ... some processing ... (_out[2]=xxx etc.)
eval $_retvar='("${_out[@]}")'
}
Calling site:
function caller()
{
local -a tuple_ret # Do not use leading '_' here.
# ...
some_func tuple_ret "arg1"
printf " %s\n" "${tuple_ret[#]}" # Print tuple members on separate lines
}
Later versions of Bash support nameref. Use declare -n var_name to give var_name the nameref attribute. A nameref gives your function the ability to "pass by reference", which is commonly used in C++ functions to return multiple values. According to the Bash man page:
A variable can be assigned the nameref attribute using the -n option to the declare or local builtin commands to create a nameref, or a reference to another variable. This allows variables to be manipulated indirectly. Whenever the nameref variable is referenced or assigned to, the operation is actually performed on the variable specified by the nameref variable's value. A nameref is commonly used within shell functions to refer to a variable whose name is passed as an argument to the function.
The following are some interactive command line examples.
Example 1:
$ unset xx yy
$ xx=16
$ yy=xx
$ echo "[$xx] [$yy]"
[16] [xx]
$ declare -n yy
$ echo "[$xx] [$yy]"
[16] [16]
$ xx=80
$ echo "[$xx] [$yy]"
[80] [80]
$ yy=2016
$ echo "[$xx] [$yy]"
[2016] [2016]
$ declare +n yy # Use -n to add and +n to remove nameref attribute.
$ echo "[$xx] [$yy]"
[2016] [xx]
Example 2:
$ func()
> {
> local arg1="$1" arg2="$2"
> local -n arg3ref="$3" arg4ref="$4"
>
> echo ''
> echo 'Local variables:'
> echo " arg1='$arg1'"
> echo " arg2='$arg2'"
> echo " arg3ref='$arg3ref'"
> echo " arg4ref='$arg4ref'"
> echo ''
>
> arg1='1st value of local assignment'
> arg2='2nd value of local assignment'
> arg3ref='1st return value'
> arg4ref='2nd return value'
> }
$
$ unset foo bar baz qux
$
$ foo='value of foo'
$ bar='value of bar'
$ baz='value of baz'
$ qux='value of qux'
$
$ func foo bar baz qux
Local variables:
arg1='foo'
arg2='bar'
arg3ref='value of baz'
arg4ref='value of qux'
$
$ {
> echo ''
> echo '2 values are returned after the function call:'
> echo " foo='$foo'"
> echo " bar='$bar'"
> echo " baz='$baz'"
> echo " qux='$qux'"
> }
2 values are returned after the function call:
foo='value of foo'
bar='value of bar'
baz='1st return value'
qux='2nd return value'
I am new to bash, but found this code helpful.
function return_multiple_values() {
eval "$1='What is your name'"
eval "$2='my name is: BASH'"
}
return_var=''
res2=''
return_multiple_values return_var res2
echo $return_var
echo $res2
Shell script functions can only return the exit status of the last command executed, or an exit status specified explicitly by a return statement.
To return a string, one way is this:
function fun()
{
echo "a+b"
}
var=`fun` # Invoke the function in a new child shell and capture the results
echo $var # use the stored result
This may reduce your discomfort, although it adds the overhead of creating a new subshell and hence is marginally slower.
