how to store the values to a variable - bash

I'm trying to get the used storage for an FTP server through lftp.
lftp :~> open username:password@IP
lftp username@IP:~> du
897146 ./volume(sda1)
897146 .
I want to get the value of 897146 from a sh script.
This is what I got so far:
#!/bin/bash
FTP_PASS=password
FTP_HOST=IP
FTP_USER=username
LFTP=lftp
lftp << EOF
open ${FTP_USER}:${FTP_PASS}@${FTP_HOST}
FOO="$(du)"
quit
EOF
echo "$FOO"
But I'm getting
Unknown command `FOO=9544 ./logs'.
Unknown command `9636'.

The du command runs inside the FTP session, so its output appears within the output of the lftp command itself. To get the output of du, you need to capture the output of the whole lftp command in your variable:
#!/usr/bin/env bash
FTP_PASS=password
FTP_HOST=IP
FTP_USER=username
FOO=$(lftp << EOF | filter_out_things_unrelated_to_du
open ${FTP_USER}:${FTP_PASS}@${FTP_HOST}
du
quit
EOF
)
echo "$FOO"
You will probably need to filter out the FTP session header and MOTD from the remote FTP server, and anything else not related to the output of du.
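For illustration, here is a minimal sketch of one way to do that filtering. It assumes lftp's -u (credentials) and -e (run commands and exit) options, and it relies on du printing the grand total on the line whose path field is just ".":
#!/usr/bin/env bash
FTP_PASS=password
FTP_HOST=IP
FTP_USER=username
# Run du non-interactively; the awk filter keeps only the grand-total line (path ".")
FOO=$(lftp -u "${FTP_USER},${FTP_PASS}" "${FTP_HOST}" -e "du; quit" 2>/dev/null \
      | awk '$2 == "." {print $1}')
echo "$FOO"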

Related

Expect Script - bash script file not found

My expect script
#!/usr/bin/expect -f
# I tried replacing "sh -" with "bash -s"; still no positive results
spawn ssh xxxx@yyyy "sh -" < test.sh
expect "password: "
send "zzzzz\r"
expect "$ "
This command works well if executed in the terminal
ssh xxxx@yyyy "sh -" < test.sh
But if I execute it via the expect script, it fails.
This is the output I get when I execute it via the expect script. May I know where I am going wrong?
bash: test.sh: No such file or directory
P.S.: Yes, the file exists and the credentials are right.
The expect script was unable to read the contents of the file; that was the issue. I solved it by reading the contents of the file and passing that variable instead of the file name,
set fh [open test.sh r]
set contents [read $fh]
close $fh
and replacing the "sh -" with bash -c '$contents'.
Thank you everyone for the valuable comments.
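For reference, a minimal sketch of the fixed expect script under those assumptions (xxxx, yyyy and zzzzz are the placeholders from the question; the simple quoting only holds for scripts that contain no single quotes):
#!/usr/bin/expect -f
# Read the local script into a variable instead of redirecting the file
set fh [open test.sh r]
set contents [read $fh]
close $fh
# Pass the script text as the remote command
spawn ssh xxxx@yyyy "bash -c '$contents'"
expect "password: "
send "zzzzz\r"
expect eof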

Issue while capturing SFTP output to variable

I am not able to store the output of the below SFTP command in the result variable.
The command is:
result=`sftp -oPort=$p_port $p_ftp_user_id@$p_host <<EOF
cd $p_remote_dir
mget TEST_FEED*
rm TEST_FEED*
exit
EOF`
echo "$result"
When I fire the above command in a shell script, the error output is like below:
Gtk-WARNING **: cannot open display: dora:0.0 Host key verification
failed. Couldn't read packet: Connection reset by peer
I want this error to be stored in the result variable so that I can use it for the below validation:
value2=`echo "$result" |grep "failed" |wc -l`
But I am getting an empty result. Please help me see what I am missing here.
You should redirect stderr to stdout for the sftp command using 2>&1:
result=$(sftp -oPort=$p_port $p_ftp_user_id@$p_host <<EOF 2>&1
cd $p_remote_dir
mget TEST_FEED*
rm TEST_FEED*
exit
EOF
)
echo "$result"

BASH: sending commands to ftp and validating status codes

I want to write a bash script that runs ftp in the background. I want some way to send commands to it and receive responses. For example, I want to run ftp, then send it
user username pass
cd foo
ls
binary
mput *.html
and receive status codes and verify them. I tried to do it this way
tail -n 1 -f in | ftp -n host >> out &
and then reading the out file and verifying. But it doesn't work. Can somebody show me the right way? Thanks a lot.
I'd run one set of commands, check the output, and then run the second set in reaction to the output. You could use here-documents for the command sets and command substitution to capture the output in a variable, e.g. like this:
output=$(cat <<EOF | ftp -n host
user username pass
cd foo
ls
binary
mput *.html
EOF
)
if [[ $output =~ "error message" ]]; then
# do stuff
fi
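If you want to check the numeric FTP status codes rather than a specific error message, one option is to make the server replies visible with the verbose flag and grep for 4xx/5xx codes. This is only a sketch; it assumes the server's numeric replies appear at the start of output lines when -v is used:
output=$(ftp -n -v host <<EOF 2>&1
user username pass
cd foo
binary
mput *.html
quit
EOF
)
# FTP error replies start with a 4xx (transient) or 5xx (permanent) code
if printf '%s\n' "$output" | grep -qE '^[45][0-9]{2} '; then
    echo "FTP reported an error" >&2
fi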

Bash script to pass commands remotely via SSH

I'm just starting out with bash and am trying to write a script to search specific files on a server remotely, based on (a) device name and (b) string. My goal is to get all output containing 'string' for the device specified. When I try it, the script below just hangs. However, when I run the command directly on the server (grep -i "router1" /var/log/router.log | grep -i "UPDOWN"), it works. Any ideas?
#!/bin/bash
#
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str
while read /home/user1/syslogs
do
ssh "$user"#server1234 'grep -i "$dev" /var/log/"$syslogs" 2> /dev/null | grep -i "$str"'
done
You seem to be mis-using the read command. You don't specify the file to read from as an argument; read always reads from standard input. It's not clear what you want to do with the value you read from the file as a result, but you want something like this:
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str
while read fileName; do
# Also: I'm borrowing sputnick's solution to the nested quote problem.
ssh $user@server1234 <<EOF
grep -i "$dev" /var/log/$fileName 2>/dev/null | grep -i "$str"
EOF
done < /home/user1/syslogs
The message "Pseudo-terminal will not be allocated because stdin is not a terminal" appears because the stdin of the remote host's shell is redirected from a here document and no command is specified for the remote host to execute. The remote host first assumes it will need to allocate a pseudo-terminal for an interactive login session because no command was given (see the synopsis of the ssh man page: ssh ... [user@]hostname [command]), but then realizes that the stdin of its shell is not a terminal, since it is redirected from a here document. The result is that the remote host refuses to allocate a pseudo-terminal.
The solution in the given case would be to just specify a shell as a command for the remote host to execute the commands given in the here document.
As an alternative to specifying a shell as a command the remote host could be told in advance that there is no need for the allocation of a pseudo-terminal using the -T switch.
The -t switch, on the other hand, would be necessary only if a specified command expects an interactive login shell session on the remote host (such as top or vim).
- ssh $user@server1234 <<EOF ...
+ ssh $user@server1234 /bin/sh <<EOF ...
+ ssh -T $user@server1234 <<EOF ...
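Putting the pieces together, a minimal sketch of the loop using the -T variant (same server name and syslog list file as in the question):
#!/bin/bash
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str
while read -r fileName; do
    # -T: no pseudo-terminal is needed; the commands come from the here document
    ssh -T "$user"@server1234 <<EOF
grep -i "$dev" /var/log/$fileName 2>/dev/null | grep -i "$str"
EOF
done < /home/user1/syslogs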

sftp put command fails when in shell script

I am trying to make a shell script that creates a MySQL dump and then puts it on another computer. I have already set up keyless ssh and sftp. The script below creates the MySQL dump file on the local computer when it is run and doesn't throw any errors; however, the file "dbdump.db" is never put on the remote computer. If I execute the sftp connection and the put command by hand, it works.
contents of mysql_backup.sh
mysqldump --all-databases --master-data > dbdump.db
sftp -b /home/tim tim@100.10.10.1 <<EOF
put dbdump.db
exit
EOF
Try to use scp; that should be easier in your case.
scp dbdump.db tim@100.10.10.1:/home/tim/dbdump.db
Both sftp and scp are using SSH.
Please write the mput/put commands into one file (file_contains_put_command) and try the below command.
sftp2 -B file_contains_put_command /home/tim tim@100.10.10.1 >> log_file
Example:
echo binary > sample_file
echo mput dbdump.db >> sample_file
echo quit >> sample_file
sftp2 -B sample_file /home/tim tim@100.10.10.1 >> log_file
Your initial approach is only a few characters off from working, though. You're telling sftp to read its batch commands from /home/tim (-b /home/tim). If you change this to -b -, it should read its batch commands from stdin.
Something along these lines; if -b /home/tim was intended to, e.g., change directory remotely, you can add cd /home/tim to your here document.
mysqldump --all-databases --master-data > dbdump.db
sftp -b - tim@100.10.10.1 <<EOF
put dbdump.db
exit
EOF
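Since sftp started with -b aborts as soon as a transfer command fails and then exits non-zero, a small extension of that sketch can also report whether the upload succeeded (same host and file name as above):
mysqldump --all-databases --master-data > dbdump.db
if sftp -b - tim@100.10.10.1 <<EOF
put dbdump.db
exit
EOF
then
    echo "dbdump.db uploaded"
else
    echo "upload of dbdump.db failed" >&2
fi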
