Expecting the Unexpected in Expect - expect

How can I give an instruction to expect for when it sees anything other than what it expects?
For example, I'm trying to automate a login. For anything other than a successful attempt, I need the script to mail me the error. How do I expect unexpected output? (I know some outcomes, like connection denied or wrong password, but I want to catch everything.)

Try like this:
set timeout 10    ;# set a reasonable timeout
# expect and send username/password ...
set success 0
set err_msg ""
expect {
    "Login success!" {
        set success 1
    }
    eof {
        set err_msg $expect_out(buffer)
    }
    timeout {
        expect *
        set err_msg $expect_out(buffer)
    }
}
if {! $success} {
    send_mail $err_msg
    exit 1
}
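Note that send_mail is not an Expect/Tcl builtin; it stands for whatever notification proc you provide. A minimal sketch, assuming a local mail command is available (the address and subject are placeholders):
proc send_mail {body} {
    # Pipe the captured output into the system mail command.
    exec mail -s "login automation failed" admin@example.com << $body
}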

Related

Detect output to tty

I'm making a program to automatically type out the password when trying to ssh. The idea is that I'll run my type_pass executable in the background and then start the ssh process, something like
$ type_pass & ssh user@host.com
Currently this works because I've put a small delay (5 seconds) in the type_pass application itself.
type_pass looks something like
#include <string.h>     // strlen
#include <errno.h>      // errno
#include <unistd.h>     // usleep, sleep
#include <sys/ioctl.h>  // ioctl, TIOCSTI

// Inject txt into the terminal attached to tofd, one character at a time.
void typeout(int tofd, const char* txt, int usecs = 100000) {
    int len = strlen(txt);
    for (int ii = 0; ii < len; ++ii) {
        usleep(usecs);
        if (ioctl(tofd, TIOCSTI, &txt[ii]) == -1) {
            int err = errno;   // inspect err here if the injection fails
            (void)err;
        }
    }
}

int main() {
    sleep(10);               // crude fixed delay until the password prompt appears
    typeout(1, "asdas\n");   // fd 1 is assumed to be the controlling terminal
}
My question is how can I intelligently wait for the password prompt from the server and then start sending out the keys for the password?
Here's an expect script that I use for this purpose:
#!/usr/bin/expect
send_user "password: "
set old_timeout $timeout
set timeout -1
expect_user -re "(.*)\n"
send_user "\n"
set timeout $old_timeout
set password "$expect_out(1,string)\r"
spawn /bin/bash
interact {
    -o -nobuffer "assword: " {
        send $password
    }
}
The above program asks the end user for the password and then stores it in the password variable. This way, the password is never written to a file, although expect will echo it to the terminal as it is typed. There is code that can be added to turn the echo off but I don't know it offhand (a sketch of one way to do it is at the end of this answer).
Expect then launches a bash interactive shell.
From this point on, expect reviews all the output to the terminal. If a string is output that contains "assword:", expect will output the stored password variable.
By the way, I am not a fan of expect, but I find this program to be helpful in many cases including ssh.
I've also found sshpass to be helpful for your particular problem. I've also used ansible for similar work - but you have enough here to solve your problem.
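Regarding the echo mentioned above: Expect has a built-in stty command, so one way to hide the typed password is to wrap the expect_user call like this (a sketch of just that part of the script):
stty -echo
send_user "password: "
expect_user -re "(.*)\n"
send_user "\n"
stty echo
set password "$expect_out(1,string)\r"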

Shell script AT Commands : not able to send sms through serial port

I have the shell script (expect) below where I am trying to send an SMS. I have referred to many Stack Overflow answers and found out that Ctrl-Z maps to \x1a. However, neither appending it to the message before sending it to the port nor sending Ctrl-Z separately to the port helped me. It times out later.
The script is written to send the SMS in PDU format. Irrespective of that, I believe this is a generic issue with sending Ctrl-Z to a port. If you feel the script has other errors as well, please share the solution for those too.
Also, the length (34) mentioned below is (PDU_LENGTH - 2)/2 as per the specification. This length doesn't include the Ctrl-Z character.
set at_command "AT+CMGS=34\r"
set message_content "0011000C810056890......"
Script:
set PROMPT "0"
set timeout "$COMMAND_TIMEOUT"
send "$at_command"
expect {
"OK" { puts "Command Accepted\n"; }
"ERROR" { puts "Command Failed\n"; }
timeout { puts "Unable to connect to $HOSTIP at $HOSTPORT"; exit 1 }
"*>*" { set PROMPT "1"; }
}
if { "$PROMPT" == "1" } {
send "$message_content"
send "\x1a"
expect {
"OK" { puts "\nCommand accepted"; }
"ERROR" { puts "\nCommand failed"; }
"*>*" { puts "CTRL-Z dint reach UT. Error..."; }
"*" { puts "Unexpected return value received"; }
}
}
I am quite sure the script sends "$message_content" to the port, but it exits immediately after sending it.
OUTPUT:
AT+CMGS=34
>
I did something like this in C# with an SMS gateway module.
I had to switch to PDU mode first!
After that I had to transmit the expected PDU length and finally the PDU itself.
Every command has to be terminated with a carriage return (ASCII 13), and the PDU has to be terminated with ASCII 26 (Ctrl-Z).
Here you can see a schematic(!) flow of how I did it in C#:
1) Create PDU and get length
int len;
var pdu = PDUGenerator.GetPdu(destination, message, "", out len);
2) Switch to PDUMode
SendToCom("AT+CMGF=0" + System.Convert.ToChar(13));
3) Announce message length
SendToCom("AT+CMGS=" + len + System.Convert.ToChar(13));
4) Send PDU and commit
SendToCom(pdu + System.Convert.ToChar(26));
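Translated back into the expect script from the question, the same schematic flow might look roughly like this (a sketch reusing the question's variables; the key points are waiting for the ">" prompt before sending the PDU and terminating the PDU with \x1a instead of a carriage return):
send "AT+CMGF=0\r"                ;# switch the modem to PDU mode
expect "OK"
send "$at_command"                ;# e.g. AT+CMGS=34\r announces the PDU length
expect ">"                        ;# wait for the data prompt before sending anything
send "$message_content\x1a"       ;# the PDU itself, terminated with Ctrl-Z (ASCII 26)
expect {
    "OK"    { puts "Message accepted" }
    "ERROR" { puts "Message rejected" }
    timeout { puts "No response from modem"; exit 1 }
}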

expect fails when running proc inside proc

My script works fine (it retrieves the sftp prompt) when using one proc, but when I call a proc from inside another proc, the script gets stuck and I do not know why.
Please do not refactor the code; that is not the point. I need to understand what the issue is here.
Working code:
proc sftp_connect {} {
    set times 0;
    set connection_retry 2
    set timeout 1;
    while { $times < $connection_retry } {
        spawn sftp ${SFTP_USER}@${SFTP_SERVER}
        expect {
            timeout { puts "Connection timeout"; exit 1}
            default {exit 2}
            "*assword:*" {
                send "${SFTP_PASSWORD}\n";
                expect {
                    "sftp>" { puts "Connected"; set times [ expr $times+1]; exp_continue}
                }
            }
        }
    }
    send "quit\r";
}
sftp_connect
Debug output:
expect: does "\r\nsftp> " (spawn_id exp5) match glob pattern "sftp>"? yes
But after moving the password-sending code into a separate proc, expect no longer retrieves the sftp prompt ("sftp>"):
proc sftp_send_password {} {
    send "${SFTP_PASSWORD}\n";
    expect {
        "sftp>" { puts "Connected"; set times [ expr $times+1]; exp_continue}
    }
}
proc sftp_connect {} {
    set times 0;
    set connection_retry 2
    set timeout 1;
    while { $times < $connection_retry } {
        spawn sftp ${SFTP_USER}@${SFTP_SERVER}
        expect {
            timeout { puts "Connection timeout"; exit 1}
            default {exit 2}
            "*assword:*" { sftp_send_password }
        }
    }
    send "quit\r";
}
sftp_connect
Debug output:
expect: does "" (spawn_id exp0) match glob pattern "sftp>"? yes
I don't have my copy of "Exploring Expect" handy, but I think you're running into a variable scoping issue. spawn invisibly sets a variable named spawn_id. When you call spawn in a proc, that variable is scoped only for that proc. Declare it as global:
proc sftp_connect {} {
    global spawn_id
    # ... rest is the same
}
I think you don't have to do the same thing in sftp_send_password because expect has a more forgiving scoping scheme than Tcl (if expect does not find a local variable, it looks in the global namespace).
Your sftp_send_password proc will not affect the times variable in sftp_connect though, due to the same variable scoping issue. I'd recommend:
proc sftp_send_password {times_var} {
    upvar 1 $times_var times ;# link this var to that in the caller
    send "${SFTP_PASSWORD}\n";
    expect {
        "sftp>" { puts "Connected"; incr times; exp_continue}
    }
    # note use of `incr` instead of `expr`
}
And then the sftp_connect proc sends the times variable name:
sftp_send_password times
The following is from the expect man page:
Expect takes a rather liberal view of scoping. In particular, variables read by commands specific to the Expect program will be sought first from the local scope, and if not found, in the global scope. For example, this obviates the need to place global timeout in every procedure you write that uses expect. On the other hand, variables written are always in the local scope (unless a global command has been issued). The most common problem this causes is when spawn is executed in a procedure. Outside the procedure, spawn_id no longer exists, so the spawned process is no longer accessible simply because of scoping. Add a global spawn_id to such a procedure.
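Putting both fixes together, the calling proc from the question might look like this (a sketch; everything else stays as in the original):
proc sftp_connect {} {
    global spawn_id    ;# let procs called from here see the id that spawn sets
    set times 0
    set connection_retry 2
    set timeout 1
    while { $times < $connection_retry } {
        spawn sftp ${SFTP_USER}@${SFTP_SERVER}
        expect {
            timeout { puts "Connection timeout"; exit 1 }
            default { exit 2 }
            "*assword:*" { sftp_send_password times }
        }
    }
    send "quit\r"
}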

expect - how to discard the buffer if expected string is not found

I need to spawn a script that produces lots of output, which makes regex matching of the output slow. The buffer also fills quickly, even when I use quite a large match_max value.
I would like to check the output for a particular string. If the string does not exist, I would like to discard the output read so far.
I have tried using default matches, globs and negative regexes to catch the unwanted strings, but could not get this working.
How can this be done with expect?
This 'seems' to work (more testing is required):
set success_string "\[INFO\] Started Jetty\r\n"
spawn "/usr/bin/mvn" "-pl" ":cloud-client-ui" "jetty:run"
expect {
    -re "(\[^\r]*)\r\n" {
        set current_line $expect_out(buffer)
        if { [string equal "$current_line" "$success_string"] } {
            puts "exiting with matched string $current_line"
            exit 0
        } else {
            puts "discarding $current_line"
            exp_continue
        }
    }
    eof     { puts "eof"; exit 1 }
    timeout { puts "timeout"; exit 1 }
}
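Another way to keep the buffer under control is Expect's full_buffer pattern, which fires when match_max characters have accumulated without any other pattern matching, so the accumulated output can simply be dropped. A sketch along those lines (using an exact match for the Jetty line from above):
match_max 100000
expect {
    -ex "\[INFO] Started Jetty" {
        puts "matched success string"
        exit 0
    }
    full_buffer {
        # match_max bytes arrived without a match; discard them and keep reading
        exp_continue
    }
    eof     { puts "eof"; exit 1 }
    timeout { puts "timeout"; exit 1 }
}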

Detect end of TCL background process in a TCL script

I'm working on a program that uses an exec command to run a makefile. This can take a long time, so I want to put it in the background so the GUI doesn't lock up. However, I also want the GUI to be disabled, and a progress bar to run, only while the makefile is compiling.
So how can I detect when a background process has finished in Tcl?
Edit: It gets more complicated because my boss wants the command window to stay open (or be visible) so the user can see the progress of the make and see if it errors.
P.S. Would figuring out threading be easier? I need some way to prevent the GUI from locking up (prevent NOT RESPONDING).
Edit: The GUI is made with Tk.
I think Tk is single-threaded, which causes the problem. Or it could be that it defaults to single-threaded and I want to set it to multi-threaded.
As @glenn-jackman pointed out, the use of fileevent is preferred (because it should work everywhere).
proc handle_bgexec {callback chan} {
    append ::bgexec_data($chan) [read $chan]
    if {[eof $chan]} {
        # end of file: close the pipe and hand the collected output to the callback
        close $chan
        {*}$callback $::bgexec_data($chan)
        unset ::bgexec_data($chan)
    }
}
proc bgexec {callback args} {
    set chan [open "| $args" r]
    fconfigure $chan -blocking false
    fileevent $chan readable [list handle_bgexec $callback $chan]
    return
}
Invoke this as bgexec job_done cmd /c start /wait cmd /c make all-all. job_done gets called with the output of the command after it finishes.
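For the Tk side of the question, the callback is the natural place to re-enable the GUI. A minimal sketch, assuming a button .run and a ttk::progressbar .pb already exist (those widget names are placeholders):
proc job_done {output} {
    .pb stop                          ;# stop the indeterminate progress bar
    .run configure -state normal      ;# re-enable the GUI
    puts "make finished:\n$output"
}
# before starting the build:
.run configure -state disabled
.pb start
bgexec job_done cmd /c start /wait cmd /c make all-all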
It is also possible to use threads for this, but that requires a threaded Tcl build (which is now the default on all platforms AFAIK, though older versions of Tcl, especially under Unix, did not build a threaded Tcl by default) and the Thread package (which is included by default). An approach using threads would be:
thread::create "[list exec cmd /c start /wait cmd /c make all-all];[list thread::send [thread::id] {callback code}];thread::exit"
If you need to call this on a regular basis, it might be worth using a single worker thread instead of creating a new one for each job.
Edit: Add /wait as a parameter for start to keep the first cmd running.
cmd /c start /wait cmd /c make all-all
You want to run the make process in a pipeline and use the event loop and fileevent to monitor its progress (see http://wiki.tcl.tk/880)
proc handle_make_output {chan} {
    # The channel is readable; try to read it.
    set status [catch { gets $chan line } result]
    if { $status != 0 } {
        # Error on the channel
        puts "error reading $chan: $result"
        set ::DONE 2
    } elseif { $result >= 0 } {
        # Successfully read the channel
        puts "got: $line"
    } elseif { [chan eof $chan] } {
        # End of file on the channel
        puts "end of file"
        set ::DONE 1
    } elseif { [chan blocked $chan] } {
        # Read blocked. Just return
    } else {
        # Something else
        puts "can't happen"
        set ::DONE 3
    }
}
set chan [open "|make" r]
chan configure $chan -blocking false
chan event $chan readable [list handle_make_output $chan]
vwait ::DONE
close $chan
I'm not certain about the use of vwait within Tk's event loop. Perhaps an expert will help me out here.
