Shell script for kubectl for uploading a file on sftp - bash

I am writing a shell script to automate the kubectl commands for uploading a file to an SFTP server. I am using the sequence of commands below and want to script them:
winpty kubectl --kubeconfig="C:\kubeconfig" -n namespace exec -it podname -- bash -c "sftp username"
Are you sure you want to continue connecting (yes/no)?: yes
Enter password: *******
cd foldername
put filename
I can script the first command, but then it asks the yes/no prompt, and I am stuck there.
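One way to get past both prompts, sketched below and untested: answer the host-key question with StrictHostKeyChecking=no and drive sftp from a batch file through sshpass (which would have to be available inside the pod image). username@host, the password, and the folder/file names are placeholders for whatever your pod actually connects to.
#!/bin/bash
# Sketch only: pod name, namespace, server, user, password and paths are placeholders,
# and it assumes sshpass is installed inside the pod image.
winpty kubectl --kubeconfig="C:\kubeconfig" -n namespace exec -i podname -- bash -c '
  # write the sftp commands to a batch file so nothing has to be typed interactively
  printf "cd foldername\nput filename\n" > /tmp/upload.batch
  # StrictHostKeyChecking=no answers the yes/no host-key prompt automatically;
  # sshpass supplies the password non-interactively
  sshpass -p "PASSWORD" sftp -o StrictHostKeyChecking=no -b /tmp/upload.batch username@host
'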

Related

Check lftp status when executing SFTP from shell script

I am using lftp to connect to an SFTP server with the below in a shell script.
host=testurl.url.com
user=username
pass=pass
lftp<<EOF
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
bye
EOF
When executing the above in a shell script, the script exits, but I am not sure whether a connection was established, and I don't see the output of the cd command I executed within lftp.
Is there a way to write to a log file so I can see whether the connection succeeded and what the cd command output?
Thank you.
I added an ls to the list of commands and was able to list the directories:
host=testurl.url.com
user=username
pass=pass
lftp<<EOF
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
ls
bye
EOF
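A sketch of one way to capture the session in a log file and check its status, reusing the variables from the question; cmd:fail-exit is an lftp setting that makes the session exit non-zero when a command such as cd fails:
#!/bin/bash
host=testurl.url.com
user=username
pass=pass

# send everything lftp prints (including errors) to a log file;
# cmd:fail-exit makes lftp exit with a non-zero status if any command fails
lftp > lftp.log 2>&1 <<EOF
set cmd:fail-exit yes
open sftp://${host}
user ${user} ${pass}
cd test/myfolder/
ls
bye
EOF

if [ $? -eq 0 ]; then
    echo "lftp session succeeded, details in lftp.log"
else
    echo "lftp reported an error, see lftp.log" >&2
fi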

sshpass not executing in bash script

I have a Dockerfile (these are the relevant commands):
RUN apk app --update bash openssh sshpass
CMD ["bin/sh", "/home/build/build.sh"]
My Dockerfile is run by this command:
docker run --rm -it -v $(pwd):/home <image-name>
and all of the commands within my bash script (which is inside the mounted volume) execute. These commands range from npm installs to using tar to compress a file, and I want to SFTP that tar.gz file.
I am using sshpass to automate logging in, which I know isn't secure, but I'm not worried about that for this application.
sshpass -p <password> sftp -P <port> username@host << EOF
<command>
<command>
EOF
But the sshpass command is never executed. I've tested my docker run command by appending /bin/sh to it, and it also does not run; the sftp command by itself does.
And when I say it's never executed, I mean I don't receive an error or any output at all.
There are two possible reasons:
Your apk command is wrong; it should be RUN apk add --update bash openssh sshpass, but I assume that is a typo.
The known-hosts entry seems to be missing. Check the container logs with docker logs -f, and add an entry to known_hosts; see the suggested build script below.
Here is a working example that you can try
Dockerfile
FROM alpine
RUN apk add --update bash openssh sshpass
COPY build.sh /home/build/build.sh
CMD ["bin/sh", "/home/build/build.sh"]
build script
#!/bin/bash
echo "adding host to known host"
mkdir -p ~/.ssh
touch ~/.ssh/known_hosts
ssh-keyscan sftp >> ~/.ssh/known_hosts
echo "run command on remote server"
sshpass -p pass sftp foo@sftp << EOF
ls
pwd
EOF
Now build the image with docker build -t ssh-pass .
And finally, here is a docker-compose file for testing the above:
version: '3'
services:
  sftp-client:
    image: ssh-pass
    depends_on:
      - sftp
  sftp:
    image: atmoz/sftp
    ports:
      - "2222:22"
    command: foo:pass:1001
So you will be able to connect to the sftp container using docker-compose up.
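Coming back to the original goal of uploading the tar.gz, the build script could end with something like the sketch below; the host name, port, credentials and archive name are placeholders:
#!/bin/bash
# Sketch only: host, port, user, password and the archive name are placeholders.
# Pre-accept the server key so the yes/no prompt never appears,
mkdir -p ~/.ssh
ssh-keyscan -p 2222 sftp-host >> ~/.ssh/known_hosts

# then upload the archive produced earlier in the build
sshpass -p "$SFTP_PASSWORD" sftp -P 2222 user@sftp-host <<EOF
put build-output.tar.gz
bye
EOF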

Running a bash script from alpine based docker

I have Dockerfile containing:
FROM alpine
COPY script.sh /script.sh
CMD ["./script.sh"]
and a script.sh (with executable permission):
#!/bin/bash
echo "hello world from script file"
when I run
docker run --name testing fff0e5c81ca0
where fff0e5c81ca0 is the image ID from the build, I get the error:
standard_init_linux.go:195: exec user process caused "no such file or directory"
So how can I solve it?
To run a bash script in an Alpine-based image, you need to do one of the following:
Install bash
RUN apk add --update bash
Use #!/bin/sh in script instead of #!/bin/bash
Doing either one of these two (or both) is enough.
Or, as @Maroun suggests in a comment, you can change your CMD to execute your bash script:
CMD ["sh", "./script.sh"]
Your Dockerfile may look like this:
FROM openjdk:8u171-jre-alpine3.8
COPY script.sh /script.sh
CMD ["sh", "./script.sh"]
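If you prefer the second option instead, a quick way to check and adjust the script before rebuilding is sketched below (file names follow the question; the image tag is a placeholder):
# show the current shebang; a plain alpine image has /bin/sh but no /bin/bash
head -n1 script.sh

# point the script at the shell alpine actually ships with
sed -i '1s|^#!/bin/bash$|#!/bin/sh|' script.sh

# rebuild and run (image tag and container name are placeholders)
docker build -t script-test .
docker run --rm --name testing script-test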

BASH instead of CSH while running Commands on a Remote Linux Server over SSH

I would like to run a command on a remote server over ssh, under bash, while my default shell is csh.
Minimal example (the real command is more complex and is generated by my IDE's remote debugger):
ssh hostname 'ls | head'
I don't have admin privileges. Trying chsh -s /bin/bash results in the error chsh: cannot lock /etc/passwd; try again later.
I tried adding to .cshrc the following
setenv SHELL /bin/bash
exec /bin/bash --login
but it freezes the console when sending a command through ssh (while a regular interactive ssh login works).
Any idea how to solve that?
NOTE: I need a solution that configures the host, because I don't have access to the ssh command, which is generated automatically by my IDE's debugger. In the IDE I can only set the host name and port number. (EDIT) Therefore solutions like ssh hostname '/bin/bash -c "ls | head"' won't apply.
EDIT2:
Actual command shown by IDE (again, I can't edit it):
ssh://username@localhost:2213/home/lab/username/anaconda2/envs/tf_011b/bin/python -u /specific/a/home/cc/cs/username/.pycharm_helpers/pydev/pydevd.py --multiproc --qt-support --client '0.0.0.0' --port 41823 --file /home/lab/username/remote_py/nlteach/show_attend_and_tell/train_saat_classifier.py --train_dir=/home/lab/username/nlteach/output/train/d=cub/imSD=11%imSP=rnd%tcSP=cvpr16/CSat/res50%lr0_02LrDTexpLrDc0_938OrmspWDc0/emb=512%ldTrn=0%nU=512%noHid=1%lr=0_02%lrDT=fix%lrDc=1%o=rmsp/
I am not sure why, but on a bash enabled server it works, while it fails on the csh host.
Thanks!
Invoke bash on the remote side, telling it what commands to run:
ssh hostname '/bin/bash -c "ls | head"'
If the command is too complicated (eg because of quotation mark escaping), then write your commands to a script, copy the script, then run the script:
scp script.bash hostname:/tmp/
ssh hostname '/bin/bash /tmp/script.bash'

Executing a script running ssh commands in the background

I'm trying to execute this script on a remote server with requiretty enabled in the sudoers file.
#!/bin/bash
value=$(ssh -tt localhost sudo bash -c hostname)
echo $value
If I run the script with ./sample.sh & it stays stopped in the background. Only by using fg can I force the script to run. I think the problem is the missing tty for the output, but what can I do?
... what can I do?
You can stty -tostop.
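A minimal way to apply that, assuming the job is being stopped because it writes to the terminal while tostop is set:
# -tostop tells the terminal driver not to stop background jobs
# when they write output to the terminal
stty -tostop
./sample.sh &
wait    # optional: block until the background script has finished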
