stdout stream redirect in expect scripts - bash

I have the following bash script which wraps an expect script (in here-document form):
#!/bin/bash
PASSWORD_MYSQL_ROOT=root
expect <<- DONE
set timeout -1
spawn mysqldump --all-databases --user=root --password --protocol=TCP --host=localhost --verbose > somebackupfile.sql
expect "*?asswor?:*"
send -- "$PASSWORD_MYSQL_ROOT\r"
expect eof
DONE
When I execute this script, I get the following output:
spawn mysqldump --all-databases --user=root --password --protocol=TCP --host=localhost --verbose > somebackupfile.sql
Usage: mysqldump [OPTIONS] database [tables]
OR mysqldump [OPTIONS] --databases [OPTIONS] DB1 [DB2 DB3...]
OR mysqldump [OPTIONS] --all-databases [OPTIONS]
For more options, use mysqldump --help
send: spawn id exp4 not open
while executing
"send -- "root\r""
So something didn't work right.
After some trial and error, I found out that the stdout redirect > somebackupfile.sql is the culprit -- the script works once this redirect is removed.
So I am wondering: How do I use stream redirection in expect scripts?

Untested, but this should work. spawn runs the command directly, without a shell, so > somebackupfile.sql is passed to mysqldump as two extra arguments (hence the usage message). Let expect write the output to the file instead, using log_file:
expect <<- DONE
set timeout -1
spawn mysqldump --all-databases --user=root --password --protocol=TCP --host=localhost --verbose
expect "*?asswor?:*"
send -- "$PASSWORD_MYSQL_ROOT\r"
log_file somebackupfile.sql
expect eof
DONE
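If you would rather keep the shell-style redirect, another option (also untested) is to spawn a shell and let it do the redirection; the dump then goes straight to the file instead of through expect's log:
expect <<- DONE
set timeout -1
# the redirection is now handled by sh, not passed to mysqldump as arguments
spawn sh -c "mysqldump --all-databases --user=root --password --protocol=TCP --host=localhost --verbose > somebackupfile.sql"
expect "*?asswor?:*"
send -- "$PASSWORD_MYSQL_ROOT\r"
expect eof
DONE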

Related

Bash script, remote psql command over ssh, pipe sql file, password fails

I have a bash script. I want to run a Postgres command over ssh, piping in a local SQL file. The problem is that psql prompts for a password, and my SQL file gets piped into that prompt instead. How do I write a command that pipes the file in only after I have typed the password?
ssh server "psql -W -h db_host -p 5432 -U db_user -d postgres" < staging.sql
I suggest breaking it down into multiple steps:
# Transfer the sql file to the server
scp staging.sql server:
# Execute the queries in that file with psql over ssh
# Notes:
# - ssh -t forces pseudo-terminal allocation so that psql can prompt for the password. You may try it without this option and see if it still works.
# - psql -f FILENAME reads commands from file
#
ssh -t server \
'psql -W -h db_host -U db_user -d postgres -f staging.sql; rm staging.sql'
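Alternatively, if you can get psql past the password prompt non-interactively, the original single-command pipe works as written. A sketch, assuming the password can be supplied via the PGPASSWORD environment variable or a ~/.pgpass file on the server (drop -W, since it forces a prompt):
# PGPASSWORD (or ~/.pgpass) supplies the password, so stdin stays free for the piped file
# db_password_here is a placeholder for the real password
ssh server "PGPASSWORD='db_password_here' psql -h db_host -p 5432 -U db_user -d postgres" < staging.sql
Putting the password on the command line is visible to other users on the server, so a ~/.pgpass entry is usually the cleaner choice.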

Turn off the return message from the executed command

I'm developing a bash script. I've used the ssh command in my bash script to run some commands on a remote server, and I need to get the result of the command which runs on the remote server, so I wrote this code:
db="$(ssh -t user#host 'mysql --user=username -ppassword -e \"SHOW DATABASES;\" | grep -Ev \"(Database|information_schema|performance_schema)\"' | grep -Ev \"(mysql)\")"
But each time I run my bash script, I get Connection to host closed. at the beginning of the db result. This is a default message from the ssh command.
Also, if I add > /dev/null 2>&1 to the end of my command, the db variable ends up empty.
How can I turn off the return message from the executed command?
Like this:
#!/bin/bash
db=$(
ssh -t user@host bash <<EOF
mysql --user=username -ppassword -e "SHOW DATABASES" |
grep -Ev "(Database|information_schema|performance_schema|mysql)" \
2> >(grep -v 'Connection to host closed')
EOF
)
or, if Connection to host closed comes from STDOUT:
...
mysql --user=username -ppassword -e "SHOW DATABASES" |
grep -Ev "(Database|information_schema|performance_schema|mysql|Connection to host closed)"
...
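Another option, if nothing on the remote side actually needs a TTY (a sketch, untested against your setup): drop -t, and ssh stops printing the Connection to ... closed. message altogether, while remote stderr stays separate from stdout:
# without -t there is no pseudo-terminal, so ssh emits no closing message
db=$(ssh user@host "mysql --user=username -ppassword -e 'SHOW DATABASES'" | grep -Ev '(Database|information_schema|performance_schema|mysql)')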

Sybase query:save an output to a file

I've created the following script:
#!/bin/bash
isql -U databasename_dba -P password -b <<EOF!
select quantity, date from name_table where numer_id="1234"
go
quit
EOF!
Running the script, I got the desired output:
user#system$ ./EXECUTE_DAILY_4:
But now, how can I save the result that I see in my terminal window to a file (.csv, for example)?
I adapted the following to my Sybase query:
#!/bin/bash
cat > test.sql <<EOF!
isql -U databasename_dba -P password -b
select quantity, date from name_table where numer_id="1234"
go
quit
EOF!
isql test.sql >result.csv
Without success -- the above does not work.
Thanks in advance
A couple of options:
isql -U databasename_dba -P password -b <<-EOF > result.csv 2>&1
select quantity, date from name_table where numer_id="1234"
go
EOF
The '> result.csv 2>&1' says to write stdout to the file 'result.csv' and also redirect stderr (fd=2) to stdout (fd=1), i.e., stderr is also written to 'result.csv'. From here you can do what you want with 'result.csv' (eg, check for errors, parse/process the file as needed).
NOTE: The 'quit' is superfluous as the isql session will automatically exit/quit when it has nothing else to do.
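As a sketch of the 'check for errors' idea mentioned above (the exact message format depends on your Sybase/isql version, so treat the pattern as an assumption to adjust):
# Sybase error messages typically start with "Msg <number>, Level <n>, ..."
if grep -q '^Msg ' result.csv; then
    echo "isql reported errors -- see result.csv" >&2
    exit 1
fi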
If you want to place the query in a *.sql file:
echo "select quantity, date from name_table where numer_id='1234'" > test.sql
echo "go" >> test.sql
From here you have a couple options for submitting to isql:
isql -U databasename_dba -P password -b -i test.sql -o result.csv
or
isql -U databasename_dba -P password -b -i test.sql > result.csv 2>&1
The '-i test.sql' tells isql to take its input from the file 'test.sql'; the first example uses '-o result.csv' to direct stdout to 'result.csv', while the second example directs stdout/stderr to 'result.csv'.
You could effectively do the same thing but within the script itself. Something like:
#!/bin/bash
command=$(
isql -U databasename_dba -P password -b <<EOF!
select quantity, date from name_table where numer_id="1234"
go
EOF!
)
echo "$command" >> FILE.csv
I solved it using this (a basic solution):
./EXECUTE_DAILY_4 > FILE.csv
I'm open to seeing more suggestions,
thanks.

How to run remote script with sudo access

I am trying to run a remote script which needs to run with sudo, but it's unable to run the script as sudo. Is my approach to sudo correct? Please help. Thanks in advance.
spawn ssh -t {*}$ssh_opts $user#$ip bash $script {*}$argv
Try this :-
#!/usr/bin/expect
set timeout -1
spawn -noecho bash -c "ssh user@host \"sudo su -c 'your_command' - sudo_user\""
expect {
    -re ".*assword:" {
        send "password\r"
        exp_continue
    }
    eof {
        wait
    }
}
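Adapted to the original spawn line -- running a script file over ssh with sudo -- it might look like this (a sketch; the host, user, script path, arguments and the literal password are placeholders to fill in):
#!/usr/bin/expect
set timeout -1
# placeholders: user@host, /path/to/script.sh and its args, and the password below
spawn -noecho ssh -t user@host "sudo bash /path/to/script.sh arg1 arg2"
expect {
    -re ".*assword.*:" {
        send "password\r"
        exp_continue
    }
    eof {
        wait
    }
}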
Hope it helps!

send: spawn id exp7 not open

When I try to execute an autoexpect file I get the error send: spawn id exp7 not open.
Here is my file sh.exp:
#!/usr/bin/expect
# mysql credentials and connection data
db_host='localhost'
db_name='webui_dev'
db_user='root'
db_pass=''
new_db_name='db_2011'
expect <<EOF
log_user 0
spawn mysql -h $db_host -u $db_user -p $db_pass 'create database $new_db_name'
expect "password:"
send "$db_pass\r"
log_user 1
expect eof
EOF
I can't find where the error is.
Try quoting your variables properly:
spawn mysql -h "$db_host" -u "$db_user" -p "$db_pass" "create database $new_db_name"
Variables not quoted in double quotes are subject to word splitting and pathname expansion. And variables under single quotes don't expand. Read more about shell quoting here.
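For instance (a hypothetical two-liner just to illustrate the difference):
var='create database db_2011'
printf '%s\n' $var      # unquoted: word splitting turns this into three arguments
printf '%s\n' "$var"    # quoted: passed as a single argument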
Update: It seems that you're actually expanding the variables through a here document, but the advice still applies, since your arguments still need to be quoted for expect. This is what the input to expect would look like:
log_user 0
spawn mysql -h "localhost" -u "root" -p "" "create database db_2011"
expect "password:"
send "\r"
log_user 1
expect eof
This is how it appears without the quoting:
...
spawn mysql -h localhost -u root -p 'create database db_2011'
...
UPDATE:
The actual cause of the problem is that mysql exits quickly, without showing a password prompt, because of the extra argument. The solution is to send the command manually. It's also preferable to run the whole thing as an expect script rather than an embedded one, for less confusion:
#!/usr/bin/expect
# mysql credentials and connection data
set db_host "localhost"
set db_name "webui_dev"
set db_user "root"
set db_pass ""
set new_db_name "db_2011"
log_user 0
spawn mysql -h "$db_host" -u "$db_user" -p
expect "password:"
send "$db_pass\r"
expect "mysql>"
send "create database $new_db_name; exit; \r"
log_user 1
expect eof
Save the script as an expect file and run it with expect -f expect_file.
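As a side note (a sketch, not part of the fix above): with an empty root password, as in the question, this can also be done without expect at all, since there is nothing to prompt for; for a non-empty password, the MYSQL_PWD environment variable or a ~/.my.cnf entry serves the same purpose.
#!/bin/bash
db_host='localhost'
db_user='root'
new_db_name='db_2011'
# no -p: the account has an empty password in the question, so mysql never prompts
mysql -h "$db_host" -u "$db_user" -e "create database $new_db_name"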
