Performance bottleneck in expect script

I'm trying to find the point in my script that is slowing down the entire process. I'm using an expect script to send a sed command that searches and replaces a line in a file. This takes anywhere from 2s to 20s to finish, when it shouldn't take more than a second. I'm running two expect scripts in parallel in two terminals. The first, launchmpj.exp, launches a qsub job that takes several seconds to start. The second, launchneuron.exp, waits for the qsub job to start and then continues. When the qsub job starts, launchmpj.exp sends a command that lets launchneuron.exp know the job has started and stop waiting.
Here's launchmpj.exp:
#!/usr/bin/expect -f
set timeout -1
spawn ssh $::env(username)@server
expect "$ "
send "qsub -I -q berger -A lc_tb -l nodes=\$nbnodes -l walltime=24:00:00 -d .\r"
expect "$ "
send "cp \$PBS_NODEFILE node`sed -n '1p' nodequeue`\r"
expect "$ "
send "sed -i '/wait=on/ s//wait=off/' `sed -n '1p' qsubwaitqueue`\r"
expect "$ "
send "cd $::env(MPJ_HOME)/bin\r"
expect "$ "
send "sh $::env(MPJLAUNCH)\r"
expect "Process 6 ended"
Here is the second file, launchneuron.exp:
#!/usr/bin/expect -f
set timeout -1
spawn ssh $::env(username)@server
expect "$ "
send "set qsubwait = qsub`sed -n '\$p' queue`.sh\r"
expect "$ "
send "sh \$qsubwait\r"
expect "$ "
send "set nodefile = node`sed -n '1p' nodequeue`\r"
expect "$ "
send "ssh `sed -n '2'p \$nodefile`\r"
expect "$ "
send "cd $::env(NEURON_HOME)\r"
expect "$ "
send "nrniv -python $::env(NEURONPY)\r"
expect "$ "
As part of the process, I'm running a sed substitution on the file below. The execution of sed alone is very fast, so sed itself is not the bottleneck. However, when the same command is sent from the expect script, it takes a long time.
sed -i '/wait=on/ s//wait=off/' qsubwait.sh
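One way to separate sed's own runtime from the expect round-trip is to time the command on the remote side (a hedged sketch; time is the shell builtin):
send "time sed -i '/wait=on/ s//wait=off/' `sed -n '1p' qsubwaitqueue`\r"
expect "$ "
If time reports milliseconds while the expect step still takes seconds, the delay is in the surrounding session, not in sed.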
File qsubwait.sh:
wait=on
echo "Waiting for qsub to start."
while [ $wait = on ]; do
eval `sed -n '1'p qsubwait.sh`
echo `sed -n '1'p qsubwait.sh`
done
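Note that this loop re-runs sed as fast as the shell can spawn it, which keeps a CPU busy on the shared host while waiting and may itself be slowing everything else down, including the sed sent from launchmpj.exp. A hedged variant that polls once per second instead (same semantics assumed):
wait=on
echo "Waiting for qsub to start."
while [ $wait = on ]; do
    eval `sed -n '1'p qsubwait.sh`
    sleep 1   # poll once per second instead of spinning
done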

Are you hitting expect's timeout value? Run your expect script with exp_internal 1 to see what expect is waiting for.
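For example, a minimal sketch of where the flag goes in launchmpj.exp:
#!/usr/bin/expect -f
exp_internal 1   ;# dump expect's pattern-matching diagnostics to stderr
set timeout -1
spawn ssh $::env(username)@server
expect "$ "
The diagnostics show exactly which pattern expect is waiting on and what has arrived in the buffer so far.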

Related

Testing account existence using expect

I have a list of 400 servers and I'd like to check Unix account existence on each of them with expect in a loop.
I wrote a bash script that uses the expect command, but it returns an error message whose meaning I don't understand.
#!/bin/bash
fic_serv="test.txt"
echo "Passwd"
stty -echo
read -s passwd
stty echo
suffix="suffix"
account="acc"
for server in `cat $fic_serv`
do
prompt="[$acc@$server ~]$ "
expect -c "
spawn ssh -o StrictHostKeyChecking=no $account@$server.$suffix
expect "Password: "
send "$passwd\r"
expect $prompt
send "logout\r"
"
done
[acc@serv ~]$ couldn't read file "
send "passwd\r"
expect [acc@server ~]$
send "logout\r"
": no such file or directory
(I modified the value)
You should use while, not for, to read a file in Bash. Use a redirect to treat the file as standard input and read one line at a time:
while read server; do
...
done < $fic_serv
Your major problem is that Expect interprets your double quotes as the end of the script. Escape them, as in \", or use braces, {}. Note that Tcl does not translate \r inside braces, so braces suit the expect patterns while the send strings are better kept in escaped quotes:
expect -c "
spawn ssh -o StrictHostKeyChecking=no $account@$server.$suffix
expect {Password: }
send \"$passwd\r\"
expect $prompt
send \"logout\r\"
"
If you have 400 servers to manage, I strongly recommend you use Ansible.
You could just put the list of hosts into a file, let's call it inventory, and run the following command:
ansible -i inventory -m shell -a "id acc" all
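The inventory here is just a plain file listing one host per line (names hypothetical):
server1.example.com
server2.example.com
server3.example.com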
Using here-docs in the shell to embed code for another language is usually better than quoting hell, and sharing variables through the environment is easier and safer than parameter expansion:
export account passwd
while IFS= read -r server; do
export prompt="[$account@$server ~]$ "
export host="$server.$suffix"
expect << 'END_EXPECT'
    spawn ssh -o StrictHostKeyChecking=no $env(account)@$env(host)
    expect "Password: "
    send "$env(passwd)\r"
    expect $env(prompt)
    send "logout\r"
    expect eof
END_EXPECT
done < "$fic_serv"
As shown, I like to indent the heredoc to make it more obvious.
And depending on the error message or login prompt, there can be more logic to indicate that the account name and/or password are incorrect.
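A sketch of what that extra logic might look like inside the heredoc; the "Permission denied" text is an assumption about what the server sends back on a bad password:
    expect {
        "Permission denied" { send_user "login failed on $env(host)\n"; exit 1 }
        "Password: "        { send "$env(passwd)\r"; exp_continue }
        -re {\$ $}          { send "logout\r" }
        timeout             { send_user "timed out on $env(host)\n"; exit 1 }
    }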

Redirect grep output to file in server

I am attempting to write a script that greps for something on a number of servers and appends the output from all of them to a single file. The servers are password protected. I use expect to log in to the servers and run the grep command, but I want the output of each command to end up in one file.
Here is an overview of what I want to do:
spawn ssh xxx@server1
expect "password: "
send "PASSWORD\r"
expect "$ "
send "grep <something> /some/log/file >> file.txt\r"
expect "$ "
send "exit\r"
... then continue doing this on dozens more servers, with the output of the grep command appended to file.txt each time. I don't mind where file.txt actually lives; it can be on my local computer or on any of the servers.
The best I've come up with is to write each output to a file on the server where the grep runs, then scp all those files to my local machine and concatenate them. That seems incredibly wasteful, though, so I am looking for a way to send the output from each server to one place directly.
It would be both easier to automate and more secure if you used public key authentication instead of password authentication to get to the servers. Then you could simply loop over them like this:
for host in server1 server2 server3 ...; do
ssh -n "$host" 'grep <something> /some/log/file'
done >file.txt
Since you have password access, you can easily put a public key in .ssh/authorized_keys to enable key access first. You can do it with your expect script:
spawn ssh xxx@server1
expect "password: "
send "PASSWORD\r"
expect "$ "
send "mkdir -p .ssh\r"
expect "$ "
send "cat >>.ssh/authorized_keys <<EOF\r"
send "(public key goes here)\r"
send "EOF\r"
expect "$ "
send "chmod 0700 .ssh\r"
expect "$ "
send "chmod 0600 .ssh/authorized_keys\r"
expect "$ "
send "exit\r"
If for some reason you must use a solution with password-entry, you can append to a file with expect with something like:
log_user 0                ;# don't echo spawned output to the screen
set fh [open foo.log a]   ;# open the file for appending
set servers {user@server user@server2 […]}
foreach s $servers {
spawn ssh $s
[…]
send "command\r"
expect "$ " { puts $fh "$expect_out(buffer)"}
}
close $fh

Perl program not running through automated script until it is executed manually first

I am writing code to automate some steps. First it is required to switch user and then run a Perl script. Here is my code:
if [ -a /try/Test ]
then
su trial -c ". /try/.profile Test"
expect -c 'spawn try1;
send "3\r";
send "1\r";
send "show\r";
interact';
fi
try1 is the Perl program I am trying to call. The script throws this error:
couldn't execute "try1": no such file or directory
while executing
"spawn try1"
but once I do this step manually and then run the script, it runs without any error.
I think you've already asked about it (and I did answer, didn't I)?
Here's the basic skeleton (make sure to add error/timeout/unexpected output handling):
# collect password
stty -echo
send_user -- "Password: "
expect_user -re "(.*)\n"
send_user "\n"
stty echo
set pass $expect_out(1,string)

spawn sudo sh
expect -re ": *$"
send -- "$pass\r"
expect -re "\$ *$"
send "echo SETTING PARAMS\r"
expect -re "\$ *$"
send "echo RUNNING MY COMMAND\r"
expect -re "\$ *$"
interact

Bash script to wait for gnome-terminal to finish before continuing script, only works for first instance of script

I have a bash script that opens a new gnome-terminal with two tabs, each running another script. After the scripts in the two tabs finish, the main script in the parent terminal continues to run.
When I run multiple instances of this bash script, it no longer waits for the additional gnome-terminals to finish before continuing the parent script.
How do I fix it so that additional instances of the script run just like the first one?
Here is the bash script that I'm running. I run additional instances of this by typing sh scriptname.sh in a new terminal.
gnome-terminal --tab --command="expect launchneuron.exp" --tab --command="expect launchmpj.exp"
echo "Simulation Complete"
echo "Plotting Results"
expect -c "
set timeout -1
spawn ssh $username@server
expect \"password\"
send \"$password\r\"
expect \"$ \"
send \"qsub -I -q abc -A lc_tb -l nodes=1 -l walltime=24:00:00 -d .\r\"
expect \"$ \"
send \"sh plotgraph.sh\r\"
expect \"$ \"
send \"exit\r\"
"
You can make the parent terminal wait by running the commands in the background and calling wait. Here is a minimal demonstration:
#!/bin/bash
date
bash -c "sleep 7" &
bash -c "sleep 5" &
wait
date
As you can see when running this script, both sleep commands run in parallel, but the main script blocks at wait until they finish.
Sat Jul 27 01:11:49 2013
Sat Jul 27 01:11:56 2013
Replace sleep 7 with expect launchneuron.exp and sleep 5 with expect launchmpj.exp, then add your plot commands after the call to wait:
echo "Simulation Complete"
...(your code to plot results)
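Putting it together, a minimal sketch of the rewritten parent script (the plotting expect -c block from the question goes where the final comment is):
#!/bin/bash
# run both expect scripts in the background in this terminal
expect launchneuron.exp &
expect launchmpj.exp &
wait    # block until both background jobs have exited
echo "Simulation Complete"
echo "Plotting Results"
# ... expect -c block that runs plotgraph.sh goes here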

Trimming output of expect script

I am new to Expect scripts. I have an Expect script embedded in a bash script. The script returns the statistics I need, but mixed in with a lot of other terminal output. Is there any way I can get precisely the output of the command only?
I have spent a day searching various forums but didn't have any luck.
Any sort of help will be appreciated.
Stats=$(expect -c "
spawn ssh $Username@$Host
expect \"password:\"
send \"$Password\r\"
expect \"\\\\$\"
send \"ps -A | grep java\r\"
expect -re \"$USER.*\"
send \"logout\r\"
")
echo $Stats > someFile.txt
You can turn off logging except for the command output:
expect -c "
log_user 0
spawn ssh $Username@$Host
expect \"password:\"
send \"$Password\r\"
expect \"\\\\$\"
log_user 1
send \"ps -A | grep java\r\"
expect -re \"$USER.*\"
send \"logout\r\"
"

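Note that even with log_user, $expect_out(buffer) still begins with the echoed ps command itself. A hedged sketch of one way to trim it, in plain expect syntax (assumes the captured buffer ends with the shell prompt):
log_user 0
send "ps -A | grep java\r"
expect "$ "
# drop the echoed command (first line) and the shell prompt (last line)
set lines [lrange [split $expect_out(buffer) "\n"] 1 end-1]
puts [join $lines "\n"]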