bash script for ssh connect and change folder

There is the following bash script:
#!/bin/bash
set -o errexit
# Common parameters
server="some_server"
login="admin"
default_path="/home/admin/web/"
html_folder="/public_html"
# Project parameters
project_folder="project_name"
go_to_folder() {
ssh "$login#$server"
cd "/home/admin/web/"
}
go_to_folder
I get the error "deploy.sh: line 16: cd: /home/admin/web/: No such file or directory", but if I connect manually and change the directory with "cd", it works. How can I change my script?

It seems obvious, doesn't it? You are trying to cd on the local machine, not on the target machine. The commands to be run over ssh must be passed in-line as its arguments; written on a separate line, the script opens a no-op session on the remote machine and then runs the cd locally.
go_to_folder() {
ssh "$login#$server" "cd /home/admin/web/"
}
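Keep in mind that a cd passed this way only affects that single remote session: the remote shell changes directory and exits. Any commands that should run in that directory have to be chained in the same invocation, for example (using ls as a stand-in for whatever you actually need to run there):
ssh "$login@$server" "cd /home/admin/web/ && ls"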
Or a cleaner way to do this would be to use a here-doc:
go_to_folder() {
ssh "$login#$server" <<EOF
cd /home/admin/web/
EOF
}
Another way to make ssh read the commands to run from standard input is to use a here-string (<<<).
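For example, a minimal here-string sketch (with pwd as a stand-in command):
ssh "$login@$server" <<< "cd /home/admin/web/ && pwd"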

Related

Script to copy data from cluster local to pod is neither working nor giving any error

The bash script I'm trying to run on the K8S cluster node from a proxy server is as below:
#!/usr/bin/bash
cd /home/ec2-user/PVs/clear-nginx-deployment
for file in $(ls)
do
kubectl -n migration cp $file clear-nginx-deployment-d6f5bc55c-sc92s:/var/www/html
done
This script does not copy the data that is in the path /home/ec2-user/PVs/clear-nginx-deployment on the master node.
But it works fine when I try the same script manually on the destination cluster.
I am using Python's paramiko.SSHClient() to execute the script remotely:
def ssh_connect(ip, user, password, command, port):
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(ip, username=user, password=password, port=port)
        stdin, stdout, stderr = client.exec_command(command)
        lines = stdout.readlines()
        for line in lines:
            print(line)
    except Exception as error:
        filename = os.path.basename(__file__)
        error_handler.print_exception_message(error, filename)
    return
To make sure the above function is working fine, I tried another script:
#!/usr/bin/bash
cd /home/ec2-user/PVs/clear-nginx-deployment
mkdir kk
This one runs fine with the same Python function, and creates the directory 'kk' in the desired path.
Could you please suggest the reason behind this, or an alternative way to carry it out?
Thank you in advance.
The issue is now solved.
Actually, the issue was related to permissions, as I found out later. What I did to resolve it was to first scp the script to the remote machine with:
scp script.sh user@ip:/path/on/remote
And then run the following command from the local machine to run the script remotely:
sshpass -p "password" ssh user@ip "cd /path/on/remote ; sudo su -c './script.sh'"
And as I mentioned in the question, I am using Python for this.
I used the system function in Python's os module to run the above commands from my local machine, to both:
scp the script to remote:
import os
command = "scp script.sh user#ip:/path/on/remote"
os.system(command)
and run the script remotely:
import os
command = "sshpass -p \"passowrd\" ssh user#ip \"cd /path/on/remote ; sudo su -c './script.sh'\""
os.system(command)
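For reference, here is the same two-step approach as a single shell script; a minimal sketch in which user, ip, and the paths are the placeholders from above:
#!/bin/bash
# copy the deployment script to the remote machine
scp script.sh user@ip:/path/on/remote
# run it remotely with elevated permissions; sshpass supplies the login password non-interactively
sshpass -p "password" ssh user@ip "cd /path/on/remote && sudo su -c './script.sh'"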

Terraform `local-exec` to set a local alias

I'm trying to set up an alias to quickly ssh into the newly created host when I create an AWS instance in terraform. I do this by running
# Handy alias to quickly ssh into newly created host
provisioner "local-exec" {
command = "alias sshopenldap='ssh -i ${var.key_path} ubuntu#${aws_instance.ldap_instance.public_dns}'"
}
When I see the output of this execution:
aws_instance.ldap_instance (local-exec): Executing: /bin/sh -c "alias sshopenldap='ssh -i ~/.ssh/mykey.pem ubuntu@ec2-IP.compute-1.amazonaws.com'"
It seems to be OK, but the alias is not set. Could it be that the way the command is run wraps it in a new scope rather than the current shell's? If I copy-paste the command as-is into the console, the alias is set fine.
Is there a workaround for this?
I'm running terraform in a Mac OS X Mountain Lion terminal.
You could try something like:
# Handy alias to quickly ssh into newly created host
provisioner "local-exec" {
command = "echo \"alias sshopenldap='ssh -i ${var.key_path} ubuntu#${aws_instance.ldap_instance.public_dns}'\" > script.sh && source script.sh && rm -rf source.sh"
}
Not sure how the quote escaping will go...
It is indeed not possible to set an alias for the current shell from a script file, which is what you are trying to do. The only way out of this is not to run the script, but to source it instead. So:
source somefile.sh
instead of executing it should do the trick.
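One possible workaround, sketched here with a hypothetical ~/.terraform_aliases file: have local-exec append the alias to that file, and source the file from your own shell startup so every new shell picks it up (<public_dns> stands in for the instance's address):
# in the local-exec command, append the alias to a file instead of defining it directly:
echo "alias sshopenldap='ssh -i ~/.ssh/mykey.pem ubuntu@<public_dns>'" >> ~/.terraform_aliases
# one-time setup in your own ~/.bashrc:
[ -f ~/.terraform_aliases ] && source ~/.terraform_aliases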

Running part of bash script on a remote machine

I need to run some commands locally and then some commands on a remote machine, all using a single local bash script.
For simplicity just say I want to do the following and execute it on my local desktop machine.
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh myuser:mypass@somewhere.com
cd /tmp/uploads <--- these commands don't run in the ssh connection
./process-uploads.sh
exit
#run command locally again.
cd -
echo 'complete!'
Any ideas of how to do this?
You can use a here-doc with the ssh command:
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh -t -t myuser@somewhere.com <<EOF
cd /tmp/uploads
./process-uploads.sh
exit
EOF
#run command locally again.
cd -
echo 'complete!'
If you want to run only one command:
ssh myuser@somewhere.com 'cd /tmp/uploads; ./process-uploads.sh'
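Note that OpenSSH does not accept a password embedded in the destination, so key-based authentication (or a helper like sshpass) is needed for non-interactive logins. Also, if the cd can fail, chaining with && is safer, so the script never runs from the wrong directory:
ssh myuser@somewhere.com 'cd /tmp/uploads && ./process-uploads.sh'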

Problems using scala to remotely issue commands via ssh

I have a problem with scala when I want to create a directory remotely via ssh.
ssh commands via Scala, such as date or ls, work fine.
However, when I run e.g.
"ssh user#Main.local 'mkdir Desktop/test'".!
I get: bash: mkdir Desktop/test: No such file or directory
res7: Int = 127
When I copy-paste the command into my shell it executes without any problems.
Does anybody know what is going on??
EDIT:
I found this post: sbt (Scala) via SSH results in command not found, but works if I do it myself
The only thing I could take away from it was to use the full path for the directory to be created, but it still does not work :(
Thanks!
ssh doesn't require that you pass the entire command line you want to run as a single argument. You're allowed to pass it multiple arguments, one for the command you want to run, and more for any arguments you want to pass that command.
So, this should work just fine, without the single quotes:
"ssh user#Main.local mkdir Desktop/test"
This shows how to get the same error message in an ordinary bash shell, without involving ssh or Scala:
bash-3.2$ ls -d Desktop
Desktop
bash-3.2$ 'mkdir Desktop/test'
bash: mkdir Desktop/test: No such file or directory
bash-3.2$ mkdir Desktop/test
bash-3.2$
For your amusement, note also:
bash-3.2$ mkdir 'mkdir Desktop'
bash-3.2$ echo echo foo > 'mkdir Desktop'/test
bash-3.2$ chmod +x 'mkdir Desktop'/test
bash-3.2$ 'mkdir Desktop/test'
foo
UPDATE:
Note that both of these work too:
Process(Seq("ssh", "user#Main.local", "mkdir Desktop/test")).!
Process(Seq("ssh", "user#Main.local", "mkdir", "Desktop/test")).!
Using the form of Process.apply that takes a Seq removes one level of ambiguity about where the boundaries between the arguments lie. But note that once the command reaches the remote host, it will be processed by the remote shell which will make its own decision about where to put the argument breaks. So for example if you wanted to make a directory with a space in the name, this works locally:
Process(Seq("mkdir", "foo bar")).!
but if you try the same thing remotely:
Process(Seq("ssh", "user#Main.local", "mkdir", "foo bar")).!
You'll get two directories named foo and bar, since the remote shell inserts an argument break.
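If the space should be preserved, one option (assuming the remote login shell is bash) is to add a second layer of quoting, so that the remote shell sees a single argument:
ssh user@Main.local "mkdir 'foo bar'"
From Scala, the equivalent is to pass the already-quoted string mkdir 'foo bar' as a single element of the Seq.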

call shell-sourced function remotely by ssh

I have a function that is sourced through the .bashrc file on remote host A.
If I use "which" on remote host A, I get the function body as output.
I need to run it through ssh remotely from another host B.
Currently, all my tries end with a "command not found" error.
I already tried to pass
ssh A "source /home/user/.bashrc && function "
, but this did not help.
I also tried forcing ssh to assign a pseudo-tty with the -t flag. The SHELL on both hosts is bash.
After ssh localhost on host A, the function status is still available.
Output :
[user@hostA ~]$ which status
status is a function
status ()
{
dos -s $*
}
[user@hostB ~]$ ssh hostA " source /home/user/deploy/bin/_bashrc && status all "
ls: : No such file or directory
bash: status: command not found
Basically, you can't. To do that, you need to copy the sourced file to the remote host and source it there. Note that your file may be sourcing some other files as well… This is almost like running a local program on the remote host.
The trick is to get the remote end to properly load your file containing the function into the shell environment.
I found with bash that the following works...
Put your function into .bashrc on the remote:
foo_func()
{
echo Hello World
}
Then on the local side:
ssh user@remote bash -l -c foo_func
The bash -l instructs bash to run as a login shell (sourcing startup files) and then the -c tells the shell to execute the string foo_func.
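Another option, for the case where the function is also defined in your local shell on host B: serialize the definition with typeset -f and prepend it to the remote command, so the remote side does not need to source anything:
# expand the local definition of `status` into the remote command, then invoke it
ssh user@hostA "$(typeset -f status); status all"
Note that the shipped body must be self-contained on the remote side: the status function above calls dos, which must still exist on host A.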
