Check lftp status when executing SFTP from shell script - shell

I am using lftp to connect to SFTP server using the below in a shell script.
host=testurl.url.com
user=username
pass=pass
lftp<<EOF
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
bye
EOF
When I execute the above from a shell script, the script exits, but I am not sure whether a connection was established, and I don't see the output of the cd command that I ran inside lftp.
Is there a way to write the output to a log file so I can see whether the connection succeeded and what the cd command returned?
Thank you.

I added an ls to the list of commands and I was able to list the directories:
host=testurl.url.com
user=username
pass=pass
lftp<<EOF
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
ls
bye
EOF
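
To also get a record of whether the connection succeeded, a minimal sketch (reusing the same host, user and pass variables) is to redirect everything lftp prints into a log file and then check its exit status; lftp's debug command raises the verbosity so the SFTP handshake appears in the log, and the cmd:fail-exit setting makes lftp stop with a non-zero status when a command such as cd fails:

host=testurl.url.com
user=username
pass=pass
log=lftp_session.log

# send both stdout and stderr of the whole lftp session to the log file
lftp >"$log" 2>&1 <<EOF
set cmd:fail-exit yes
debug 3
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
ls
bye
EOF

# a non-zero exit status means one of the commands above failed
if [ $? -eq 0 ]; then
    echo "lftp finished without errors, output is in $log"
else
    echo "lftp reported a failure, check $log"
fi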

Related

Script to copy data from cluster local to pod is neither working nor giving any error

The bash script I'm trying to run on the K8S cluster node from a proxy server is as below:
#!/usr/bin/bash
cd /home/ec2-user/PVs/clear-nginx-deployment
for file in $(ls)
do
kubectl -n migration cp $file clear-nginx-deployment-d6f5bc55c-sc92s:/var/www/html
done
This script does not copy the data that is in the path /home/ec2-user/PVs/clear-nginx-deployment on the master node.
But it works fine when I try the same script manually on the destination cluster.
I am using python's paramiko.SSHClient() for executing the script remotely:
import os
import paramiko
import error_handler  # the author's local helper module for error reporting

def ssh_connect(ip, user, password, command, port):
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(ip, username=user, password=password, port=port)
        stdin, stdout, stderr = client.exec_command(command)
        lines = stdout.readlines()
        for line in lines:
            print(line)
    except Exception as error:
        filename = os.path.basename(__file__)
        error_handler.print_exception_message(error, filename)
    return
To make sure the above function is working fine, I tried another script:
#!/usr/bin/bash
cd /home/ec2-user/PVs/clear-nginx-deployment
mkdir kk
This one runs fine with the same Python function and creates the directory 'kk' in the desired path.
Could you please suggest the reason behind this, or an alternative way to carry it out?
Thank you in advance.
The issue is now solved.
Actually, the issue was related to permissions, which I only found out later. So what I did to resolve it was to first scp the script to the remote machine with:
scp script.sh user@ip:/path/on/remote
And then run the following command from the local machine to run the script remotely:
sshpass -p "password" ssh user@ip "cd /path/on/remote ; sudo su -c './script.sh'"
And as I mentioned in the question, I am using Python for this.
I used the system function in Python's os module to run both of the above commands from my local machine:
scp the script to the remote machine:
import os
command = "scp script.sh user@ip:/path/on/remote"
os.system(command)
run the script remotely:
import os
command = "sshpass -p \"password\" ssh user@ip \"cd /path/on/remote ; sudo su -c './script.sh'\""
os.system(command)
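
The same two steps also work as a plain shell wrapper; the following is only a sketch of that (user@ip, /path/on/remote and the password are placeholders from above), assuming sshpass is installed on the local machine and the remote user is allowed to run sudo:

#!/usr/bin/bash
set -e  # stop at the first failing command

REMOTE="user@ip"
REMOTE_PATH="/path/on/remote"
PASSWORD="password"

# 1. Copy the script to the remote machine
sshpass -p "$PASSWORD" scp script.sh "$REMOTE:$REMOTE_PATH/"

# 2. Execute it there with elevated permissions
sshpass -p "$PASSWORD" ssh "$REMOTE" "cd '$REMOTE_PATH' && sudo su -c './script.sh'"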

How to run shell commands in lftp to do file transfers?

I need to copy a few files from a remote directory which has subdirectories within it. I am using lftp to do that, but shell commands inside it aren't working. Is there a workaround for this? Please see the code below. Any help is much appreciated, guys!
lftp -u $USER,$PASS sftp://$HOST <<EOF 2>&1
#find file file_name with absolutepath from REMOTE_DIR which lies in any of its subdirectories
filefound=`find "${REMOTE_DIR}" -name "${file_name}"`
#Get the absolutepath for subdirectory where the file resides
dir_loc=`dirname "${filefound}"`
lcd ${LOCAL_DIR}
cd ${dir_loc}
get ${file_name}
bye
EOF
The error I am getting is:
filefound: command not found
dir_loc: command not found
That's a conceptual misunderstanding.
You cannot run shell commands using an SFTP client. To run shell commands, use an SSH client.
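
One way to apply that split to the script above, sketched on the assumption that the same account also has plain SSH shell access and that sshpass is available for the password (key-based login would remove that dependency), is to resolve the path over ssh first and leave only the transfer to lftp:

# run the shell commands remotely over SSH, not inside lftp
filefound=$(sshpass -p "$PASS" ssh "$USER@$HOST" "find '$REMOTE_DIR' -name '$file_name'")
dir_loc=$(dirname "$filefound")

# lftp then only has to run file-transfer commands
lftp -u "$USER,$PASS" sftp://$HOST <<EOF 2>&1
lcd ${LOCAL_DIR}
cd ${dir_loc}
get ${file_name}
bye
EOF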

Putty: trying to send multiple commands to remote server but only the first is executed [duplicate]

I want to run multiple commands automatically, like sudo bash, ssh server01, ls, cd /tmp, etc., at server login.
I am using the "Remote command" option under SSH in PuTTY.
I tried multiple commands with the && delimiter, but it is not working.
There is some information lacking in your question.
You say you want to run sudo bash, then ssh server01.
Will sudo prompt for a password on your remote server?
Assuming sudo does not ask for a password, running bash will open another shell waiting for user input. The command ssh server01 will not run until that bash shell is exited.
If you want to run 2 commands, try first simpler ones like:
ls -l /tmp ; echo "hi there"
or if you prefer:
ls -l /tmp && echo "hi there"
Does this work?
If what you want is to run ssh after running bash, you can try:
sudo bash -c "ssh server01"
That is probably because the command is expected to be a program name followed by parameters, which will be passed directly to the program. In order to get && and other functionality that is provided by a command line interpreter such as bash, try this:
/bin/bash -c "command1 && command2"
I tried what I suggested in my previous answer.
It is possible to run 2 simple commands in PuTTY separated by a semicolon. As in my example, I tried it with ls and echo. The remote server runs them and then the session closes.
I also tried to ssh to a remote server that is configured for not asking for a password. In that case, it also works, I get connected to the 2nd server and I can run commands on it. Upon exit, the 2 connections are closed.
So please, let us know what you actually need / want.
You can execute two consecutive commands in PuTTY using a regular shell syntax. E.g. using ; or &&.
But you want to execute ssh server01 in sudo bash shell, right?
These are not two consecutive commands; it's the ssh server01 command executed within sudo bash.
So you have to use the sudo command-line syntax to execute ssh server01, like:
sudo bash -c "ssh server01"
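
If the goal is simply to hand PuTTY more than one command, its command-line client plink can read the whole remote command sequence from a file; a rough sketch (commands.txt and user@server01 are placeholders) looks like this:

# commands.txt - run on the remote server, one line after another
cd /tmp
ls -l
echo "hi there"

# -m reads the remote command(s) from the file, -t allocates a terminal
# in case one of the commands (for example sudo) needs one
plink -ssh -t user@server01 -m commands.txt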

Shell script for kubectl for uploading a file on sftp

I am writing a shell script to automate the kubectl commands for uploading a file to the SFTP server. I am using the below sequence of commands and want to put them into a shell script:
winpty kubectl --kubeconfig="C:\kubeconfig" -n namespace exec -it podname -- bash -c "sftp username"
Are you sure you want to continue connecting (yes/no)?: yes
Enter password: *******
cd foldername
put filename
I can script the first command, but then it asks for the yes/no prompt and I am stuck there.
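
One way to get past both prompts, sketched on the assumption that sshpass is installed in the pod image and that auto-accepting the host key is acceptable in your environment, is to drop the TTY (-i without -t, so winpty is no longer needed) and feed sftp its commands on standard input:

# StrictHostKeyChecking=no answers the yes/no host-key question,
# and sshpass supplies the password non-interactively
kubectl --kubeconfig="C:\kubeconfig" -n namespace exec -i podname -- \
    bash -c "sshpass -p 'yourpassword' sftp -o StrictHostKeyChecking=no username" <<'EOF'
cd foldername
put filename
bye
EOF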

How to use pbrun in a bash script?

In interactive mode, I do
pbrun su - privuser
rm dir1
But if I run the above commands in a bash script file, I simply get a new bash terminal window after the first command.
Is it possible to use pbrun in a script file?
Yes, you can use pbrun and the su command in your bash script, but note the difference first:
pbrun - runs a command through a secured (privilege-managed) session, whereas
sudo - elevates the user rather than the whole terminal.
So, if you have valid credentials, you can use the following syntax:
pbrun -u [username] command
Try using "pbrun -u privuser" - it might work.
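
A short sketch of how that could look in a script file, assuming your pbrun/PowerBroker policy allows passing a command to su (policies differ, so this may need adjusting), avoids the interactive shell by handing the command to su -c, or by feeding the commands on standard input:

#!/bin/bash
# variant 1: pass the command directly, so no interactive shell is opened
pbrun su - privuser -c 'rm dir1'

# variant 2: if only an interactive su is allowed, feed the commands on stdin
pbrun su - privuser <<'EOF'
rm dir1
exit
EOF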
