Expect deletes ^H after spawn bash -c

I have an expect script that grabs a bunch of configs and outputs them to separate files. The files have a bunch of ^H characters I'd like to clean out. I used Ctrl-V Ctrl-H to embed the literal backspace in the expect script, but when the script runs the character just disappears. The sed works fine in bash, but fails in expect. How can I delete the ^H from the files?
foreach host $ip {
    set output [ open "$host" w ]
    set timeout 2
    spawn ssh -oKexAlgorithms=+diffie-hellman-group1-sha1 -o "StrictHostKeyChecking no" -oHostKeyAlgorithms=+ssh-dss $username@$host
    expect {
        eof {wait; spawn telnet $host;
            expect "ame:";
            send "$username\r"}
    }
    expect "word:"
    send "$password\r"
    expect "#"
    send "$command\r"
    sleep 2
    expect {
        "ore--" { send -- " "; puts $output $expect_out(buffer); exp_continue}
        "#" {send -- "exit\r"}
    }
    puts $output $expect_out(buffer)
    close $output
    close
}
foreach host $ip {
    spawn bash -c "sed -i 's/^H//gi; s/--More--//g' ./$host"
}
close
Here's what it looks like when it runs:
spawn bash -c sed -i 's//gi; s/--More--//g' ./192.168.50.1
spawn bash -c sed -i 's//gi; s/--More--//g' ./192.168.51.1
spawn bash -c sed -i 's//gi; s/--More--//g' ./192.168.52.1

Assuming an ASCII-based locale, this should work:
tr -d '\010' < file > file.new
\010 is octal 8, which is the ASCII equivalent of backspace.
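Since tr cannot edit a file in place, write to a temporary file and move it back. A minimal sketch for cleaning up all the per-host output files, assuming their names are collected in a bash array called hosts (that array is hypothetical; the Tcl $ip list from the script is not visible to bash):
for host in "${hosts[@]}"; do
    # strip every backspace, then replace the original file
    tr -d '\010' < "$host" > "$host.new" && mv "$host.new" "$host"
done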

For calling out to sed, since you don't need to interact with the process, you can use exec instead:
foreach host $ip {
    exec sed -i {s/^H//gi; s/--More--//g} ./$host
}
The braces {} are Tcl's version of single quotes: nothing inside them is substituted, so the literal ^H survives.

Related

sftp bash script doesn't download files

I have a problem with my bash script.
It connects to my sftp server.
It gets the list of files to download.
It should download the files, but it doesn't. I can see the commands in the output; if I type the commands manually, it works.
You can see the script and the log files here:
#!/bin/bash
LOCALDIR="/data/IMPORT/$(date +%Y%m%d)"
REMOTEDIR="/EXPORT"
FILELIST="$LOCALDIR/filelist.txt"
FILELIST2="$LOCALDIR/filelist2.txt"
SFTP="sftp -P 1234 -i /var/xxxxxx.pem -oStrictHostKeyChecking=no user#xxxxx.xxxx"
PASSPHRASE="xxxxxxxxxxxxxxxx"
mkdir -p $LOCALDIR
rm $FILELIST
rm $FILELIST2
allfilenames=()
function readFileList {
expect -c "
spawn $SFTP
expect \"assphrase\"
send \"$PASSPHRASE\r\"
expect \"sftp>\"
send \"lcd $LOCALDIR\r\"
send \"ls -l $REMOTEDIR/*\r\"
expect \"sftp>\"
send \"exit\r\"
interact " > $FILELIST
}
function getFiles {
myfilenames=("$#")
expect -c "
spawn $SFTP
expect \"assphrase\"
send \"$PASSPHRASE\r\"
expect \"sftp>\"
send \"lcd $LOCALDIR\r\"
expect \"sftp>\"
send \"cd $REMOTEDIR\r\"
expect \"sftp>\"
" >> $FILELIST2
for filepath in "${myfilenames[@]}"
do
file="$(basename -- $filepath)"
expect -c "
send \"get -P $file\r\n\"
sleep 3
expect \"sftp>\"
" >> $FILELIST2
done
expect -c "
send \"exit\r\"
" >> $FILELIST2
}
readFileList
c=0
if [[ -f "$FILELIST" ]]; then
while read line; do
filename=$(echo $line | awk '{ print $9 }')
if [[ "$filename" =~ ^$REMOTEDIR ]] ; then
allfilenames+=($filename)
fi
done < $FILELIST
fi
getFiles "${allfilenames[#]}"
filelist.txt looks like:
spawn sftp -P 1234 -i /var/xxxxxx.pem -oStrictHostKeyChecking=no user@xxxxx.xxxx
Enter passphrase for key '/var/xxxxxx.pem':
Connected to xxxxx.xxxx.
sftp> lcd /data/IMPORT/20200401
sftp> ls -l /EXPORT/*
-rw-r--r-- 0 1000472 1000472 3681 Mar 31 22:31 /EXPORT/file1.txt
-rw-r--r-- 0 1000472 1000472 14537 Mar 31 22:34 /EXPORT/file2.txt
-rw-r--r-- 0 1000472 1000472 5932 Mar 31 22:34 /EXPORT/file3.txt
sftp> exit
filelist2.txt looks like:
spawn sftp -P 1234 -i /var/xxxxxx.pem -oStrictHostKeyChecking=no user@xxxxx.xxxx
Enter passphrase for key '/var/xxxxxx.pem':
Connected to xxxxx.xxxx.
sftp> lcd /data/IMPORT/20200401
sftp> cd /EXPORT
sftp> get -P file1.txt
get -P file2.txt
get -P file3.txt
exit
expect -c 'spawn sftp ...'
expect -c 'send "get file\r" '
Here, when the first expect -c completes, the SFTP connection is closed, so the second expect -c cannot work. You have to use one single expect -c for one SFTP session.
It's like when you manually sftp to the server, you cannot temporarily go back to Bash and come back to sftp later.
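For example, getFiles could be rewritten to do the whole download with a single expect -c, moving the per-file loop into Tcl. This is only a sketch built from the variables already in the script ($SFTP, $PASSPHRASE, $LOCALDIR, $REMOTEDIR, $FILELIST2); it assumes the remote paths contain no spaces:
function getFiles {
    # one Tcl list of remote paths, built from the bash arguments
    local files="$*"
    expect -c "
        set timeout -1
        spawn $SFTP
        expect \"assphrase\"
        send \"$PASSPHRASE\r\"
        expect \"sftp>\"
        send \"lcd $LOCALDIR\r\"
        expect \"sftp>\"
        send \"cd $REMOTEDIR\r\"
        expect \"sftp>\"
        foreach f {$files} {
            send \"get -P [file tail \$f]\r\"
            expect \"sftp>\"
        }
        send \"exit\r\"
        expect eof
    " >> "$FILELIST2"
}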

How to loop Bash array in Expect script

I have an array in my bash script:
my_file=("a.txt" "b.txt" "c.txt" "d.txt" "e.txt")
Now in the same file I want to use Expect and make a looping to get some files in sftp. This is my code
/usr/bin/expect <<EOF
set timeout -1
array set param ${!my_file[@]}
spawn sftp $sftp_option $user#$host
expect "Password:"
send "$pswd\r"
expect "sftp>"
for arg in $param; do
send "mget $arg*\r"
expect "sftp>"
done
send "bye\r"
EOF
With that code I can't make a loop with that array above. And I got error like this:
wrong # args: should be "array set arrayName list"
while executing
"array set param 0 1"
Full Code:
#!/bin/bash
export PATH=$PATH:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin;
homeDir=/home/jlo_apps/daemon/jlo_ex/collection/wssh;
filePID=$homeDir/rsync_rfs_bbni.pid;
trap "rm $filePID; exit" SIGHUP SIGINT SIGTERM;
if [ -e $filePID ]; then
datenow=`date`;
echo "[ Rsync ] There are rsync processes still running...";
echo "========================= $datenow =========================";
exit 1;
else
echo $$ > $filePID;
fi
. $homeDir/get_rfs_bni.conf;
strconn="$dbuser/$dbpass#$dbhost:$dbport/$dbsid";
profile=`$sqlldr_path -s $strconn <<< "select host|| '|' ||username|| '|' ||password|| '|' ||port|| '|' ||outdir from p_epdp where bank='BBNI';" | tail -2 | head -1`;
mapfile mid < <($sqlldr_path -s $strconn <<< "select distinct(substr(mid, 0, 13)) from p_mid where kode_bank = 'BBNI';" | tail -3 | head -2);
host=$(echo $profile | cut -d"|" -f1);
user=$(echo $profile | cut -d"|" -f2);
pswd=$(echo $profile | cut -d"|" -f3);
port=$(echo $profile | cut -d"|" -f4);
outdir=$(echo $profile | cut -d"|" -f5);
/usr/bin/expect <<EOF
set timeout -1
spawn sftp $sftp_option $user@$host
expect "Password:"
send "$pswd\r"
expect "sftp>"
send "cd $outdir\r"
expect "sftp>"
send "lcd $rfshome\r"
expect "sftp>"
foreach arg {${mid[@]}} {
send "mget $arg*\r"
expect "sftp>"
}
send "bye\r"
EOF
rm $filePID;
sleep 10;
datenow=`date`;
echo "========================= $datenow =========================";
Is there any solution to this problem without splitting the bash and Expect parts into separate files?
The correct syntax to interpolate the array's values would be
array set param ${my_file[@]}
without a !, but this produces
array set param a.txt b.txt c.txt d.txt e.txt
However, the Expect syntax for creating an array looks like
array set param {one one.txt two two.txt}
with alternating keys and values (more like an associative array than just a list of values). But then you still can't use a shell for loop inside the Expect script; the language uses a completely different syntax (Expect is based on TCL, not shell script).
Probably you are looking for something like
/usr/bin/expect <<EOF
set timeout -1
spawn sftp $sftp_option $user@$host
expect "Password:"
send "$pswd\r"
expect "sftp>"
foreach arg {${my_file[@]}} {
send "mget $arg*\r"
expect "sftp>"
}
send "bye\r"
EOF
where I cribbed the proper TCL loop syntax from Pass bash array to expect script
It's a bit awkward looking, but you can do this:
$ my_file=("a.txt" "b.txt" "c.txt" "d.txt" "e.txt")
$ expect -f - -- "${my_file[@]}" <<'END'
foreach arg $argv {puts $arg}
END
a.txt
b.txt
c.txt
d.txt
e.txt
The -f - tells expect to use stdin for the script file.
Then -- ends the command line options, and the remaining arguments go into the argv list.
Don't forget to use a quoted heredoc, like I demonstrate. Otherwise the shell will expand the expect variables.
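Applied to the original sftp task, the same pattern could look roughly like this (a sketch only: the prompt strings are assumptions, $sftp_option is omitted for brevity, and the connection details are passed as arguments so the heredoc can stay quoted):
my_file=("a.txt" "b.txt" "c.txt" "d.txt" "e.txt")
expect -f - -- "$user@$host" "$pswd" "${my_file[@]}" <<'END'
set timeout -1
# argv holds: user@host, the password, then the file name prefixes
lassign $argv target pswd
spawn sftp $target
expect "Password:"
send "$pswd\r"
expect "sftp>"
foreach arg [lrange $argv 2 end] {
    send "mget $arg*\r"
    expect "sftp>"
}
send "bye\r"
expect eof
END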

Why can't I pass the variable's value into a file in the /etc directory?

I want to pass the value of the $ip variable into the file /etc/test.json with bash.
ip="xxxx"
sudo bash -c 'cat > /etc/test.json <<EOF
{
"server":"$ip",
}
EOF'
I expect the content of /etc/test.json to be
{
"server":"xxxx",
}
However the real content in /etc/test.json is:
{
"server":"",
}
But if I replace the target directory /etc/ with /tmp
ip="xxxx"
cat > /tmp/test.json <<EOF
{
"server":"$ip",
}
EOF
the value of the $ip variable gets passed into /tmp/test.json:
$ cat /tmp/test.json
{
"server":"xxxx",
}
In Kamil Cuk's example, the subprocess is cat > /etc/test.json which contains no variable.
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
It does not export the $ip variable at all.
Now let's analyze the following:
ip="xxxx"
sudo bash -c "cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
The different parts in
"cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
will concatenate into one long string that is passed as the command. Why can the $ip variable inherit its value from the parent process here?
There are two reasons for this behavior:
By default, variables are not passed to the environment of subsequently executed commands.
The variable is not expanded in the current context, because your command is wrapped in single quotes.
Exporting the variable
Place an export statement before the variable; see man 1 bash:
The supplied names are marked for automatic export to the environment of subsequently executed commands.
And as noted by Léa Gris you also need to tell sudo to preserve the environment with the -E or --preserve-environment flag.
export ip="xxxx"
sudo -E bash -c 'cat > /etc/test.json <<EOF
{
"server":"$ip",
}
EOF'
Expand the variable in the current context:
This is the reason your second command works: there are no quotes around the here document in that example, so the shell expands $ip before cat ever runs.
But if I replace the target directory /etc/ with /tmp [...] the value of the $ip variable gets passed into /tmp/test.json
You can change your original snippet by replacing the single quotes with double quotes and escaping the quotes around your ip:
ip="xxxx"
sudo bash -c "cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
Edit: For your additional questions:
In Kamil Cuk's example, the subprocess is cat > /etc/test.json which contains no variable.
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
It does not export the $ip variable at all.
Correct, and you did not wrap the here document in single quotes. Therefore $ip is substituted in the current context and the string passed to the subprocess's standard input is
{
"server":"xxxx",
}
So in this example the subprocess does not need to know the $ip variable.
Simple example
$ x=1
$ sudo -E sh -c 'echo $x'
[sudo] Password for kalehmann:
This echoes nothing because:
'echo $x' is wrapped in single quotes, so $x is not substituted in the current context.
$x is not exported, so the subprocess does not know its value.
$ export y=2
$ sudo -E sh -c 'echo $y'
[sudo] Password for kalehmann:
2
This echoes 2 because:
'echo $y' is wrapped in single quotes, so $y is not substituted in the current context.
$y is exported, so the subprocess does know its value.
$ z=3
$ sudo -E sh -c "echo $z"
[sudo] Password for kalehmann:
3
This echoes 3 because:
"echo $z" is wrapped in double quotes, so $z is substituted in the current context before sudo even runs.
There is little need to do the here document inside the subshell. Just do it outside.
sudo tee /etc/test.json <<EOF
{
"server":"$ip",
}
EOF
or
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
Generally, it is not safe to build a fragment of JSON using string interpolation, because it requires you to ensure the variables are properly encoded. Let a tool like jq do that for you.
Pass the output of jq to tee, and use sudo to run tee to ensure that the only thing you do as root is open the file with the correct permissions.
ip="xxxx"
jq -n --arg x "$ip" '{server: $x}' | sudo tee /etc/test.json > /dev/null
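To see why the encoding matters, here is a hypothetical value that would break naive interpolation but that jq escapes correctly:
ip='10.0.0.1" , "admin":"oops'   # hypothetical value containing double quotes
jq -n --arg x "$ip" '{server: $x}' | sudo tee /etc/test.json > /dev/null
# the embedded quotes are escaped, so /etc/test.json still holds valid JSON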

How to auto-input the password when running vncserver :1?

I'm writing a bash script that connects to my remote machine and then runs some commands, one of them being vncserver :1, but this command asks for a password. How can I handle this in my shell script? (I just want to run the script without having to type the password.)
This is my script:
ssh -i $pem -o StrictHostKeyChecking=no -o 'IdentitiesOnly yes' admin@$ip -f '
pkill vnc ;
vncserver :1 ;
'
Thanks all, it's working now:
ssh -i $pem -o StrictHostKeyChecking=no -o 'IdentitiesOnly yes' admin@$ip -f '
pkill vnc ;
expect -c "
spawn vncserver :1;
expect -nocase \"password:\" {
send \"$pass\r\";
expect -nocase \"Verify:\" {
send \"$pass\r\";
expect -nocase \"Would you like to enter a view-only password \(y\/n\)\?\" {
send \"n\r\";
expect eof }; }; interact } ;
"
'

how to copy two files by SCP in expect script

Please advise why
spawn scp $FILE1 $FILE2 $LOGIN@$IP:/tmp
in my expect script copies only FILE1 and not FILE2.
I tried to transfer both files by scp as
scp file1.csv file2.crt 192.8.200.1:/tmp
without expect and they transferred successfully to /tmp.
So why, via expect, is FILE1 the only file that gets copied?
What is wrong in my syntax?
Example of my expect script:
#!/usr/bin/expect -f
set FILE1 file1.csv
set FILE2 file2.crt
set multiPrompt {[#>$]}
spawn scp $FILE1 $FILE2 $LOGIN@$IP:/tmp
expect {
")?" { send "yes\r" ; exp_continue }
word: {send $PASS\r}
}
I also tried this:
spawn scp "$FILE1 $FILE2" $LOGIN@$IP:/tmp
OR
spawn scp '$FILE1 $FILE2' $LOGIN@$IP:/tmp
but I get the same problem.
Please help!
You can work around it by using a list and foreach in the expect script, spawning one scp per file, like:
#!/usr/bin/expect
set files { file1.csv file2.crt }
foreach file $files {
    puts "Let's scp $file"
    spawn scp $file $LOGIN@$IP:/tmp
    expect {
        ")?" { send "yes\r"; exp_continue }
        "word:" { send "$PASS\r" }
    }
    expect eof
}
