MATLAB runs on a Windows host machine. Using MATLAB's 'system' call and Cygwin, I have to run some applications on a remote Linux system.
The problem is that the commands after the ssh call are ignored, so
system('C:\cygwin64\bin\bash -l -c "ssh -t -t 10.0.0.127; cd /home/superuser/MAGIC_PATH"')
does not work (the cd is executed by the local Cygwin bash only after the ssh session ends, not on the remote machine).
So I tried to issue the cd command separately after establishing the SSH connection, but then the MATLAB script blocks and I have to type the command manually, which is not the desired solution.
In MATLAB:
cygwin_path='C:\cygwin64\bin\bash';
binary_path='/home/superuser/MAGIC_PATH';
remote_IP='10.0.0.127';
SSH_string=sprintf('%s -l -c "ssh -t -t %s &"',cygwin_path,remote_IP)
ChangeDIR_string=sprintf('%s -l -c "cd /home/superuser/"',cygwin_path)
So how can I change my code, or rather the system call, so that it automatically runs multiple remote commands and starts some applications as background jobs?
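One direction that might work (a sketch, not tested against this setup): run everything in a single ssh invocation, chain the remote commands with &&, and background the long-running application with nohup so that ssh can return. The remote user name (superuser) and application name (my_app) below are assumptions; substitute your own. This is the command that the Cygwin bash -l -c string, and in turn MATLAB's system call, would need to carry (inside a MATLAB single-quoted string, any single quotes have to be doubled):
# runs the cd and the application on the remote host, then detaches so ssh returns
ssh superuser@10.0.0.127 'cd /home/superuser/MAGIC_PATH && nohup ./my_app > app.log 2>&1 &'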
Related
I would like to run a command on a remote server via ssh, under bash, while my default login shell is csh.
A minimal example (the real command is more complex and is generated by my IDE's remote debugger):
ssh hostname 'ls | head'
I don't have admin privileges. Trying chsh -s /bin/bash results in the error chsh: cannot lock /etc/passwd; try again later.
I tried adding to .cshrc the following
setenv SHELL /bin/bash
exec /bin/bash --login
but that freezes the console when a command is sent through ssh (while a regular interactive ssh login works).
Any idea how to solve that?
NOTE: I need a solution that configures the host, because I don't have access to the ssh command, which is generated automatically by my IDE's debugger. In the IDE I can only set the host name and port number. (EDIT) Therefore solutions like ssh hostname '/bin/bash -c "ls | head"' won't apply.
EDIT2:
The actual command shown by the IDE (again, I can't edit it):
ssh://username@localhost:2213/home/lab/username/anaconda2/envs/tf_011b/bin/python -u /specific/a/home/cc/cs/username/.pycharm_helpers/pydev/pydevd.py --multiproc --qt-support --client '0.0.0.0' --port 41823 --file /home/lab/username/remote_py/nlteach/show_attend_and_tell/train_saat_classifier.py --train_dir=/home/lab/username/nlteach/output/train/d=cub/imSD=11%imSP=rnd%tcSP=cvpr16/CSat/res50%lr0_02LrDTexpLrDc0_938OrmspWDc0/emb=512%ldTrn=0%nU=512%noHid=1%lr=0_02%lrDT=fix%lrDc=1%o=rmsp/
I am not sure why, but on a server whose default shell is bash it works, while it fails on the csh host.
Thanks!
Invoke bash on the remote side, telling it what commands to run:
ssh hostname '/bin/bash -c "ls | head"'
If the command is too complicated (e.g. because of quotation-mark escaping), then write your commands to a script, copy the script over, and run the script:
scp script.bash hostname:/tmp/
ssh hostname '/bin/bash /tmp/script.bash'
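If copying a file back and forth is inconvenient, another option (a sketch, assuming bash exists on the remote side) is to feed the script to the remote bash over stdin:
# bash -s reads the commands from stdin; the quoted EOF prevents local expansion
ssh hostname 'bash -s' <<'EOF'
ls | head
EOF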
I am running a very simple script that will ssh into a remote Ubuntu instance, move around the directory structure, and execute a few things; afterwards I want the prompt to stay on the remote machine. Currently, when the script ends, it lands back at the local prompt. How do I modify the script so that it finishes at the remote prompt?
local$ ssh -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh;"
Two things need to be added to your command line:
The bash command at the end starts the bash shell (you can start any other shell you want).
The -t switch makes sure the remote server allocates a TTY for you, so your shell will work as expected:
local$ ssh -t -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh; bash"
As discussed here, we can SSH into a remote Ubuntu Server instance, run commands, and open a prompt:
ssh -t host 'cmd1; cmd2; sh -i'
However, the shell prompt I get on my Ubuntu Server is not the default one; e.g. when I try to run ~/.bashrc, I get the error shown in the snapshot below.
So my need is
ssh -t host 'cmd1; cmd2; OPEN_DEFAULT_PROMPT'
where OPEN_DEFAULT_PROMPT opens the default Ubuntu Server shell prompt right after cmd1; cmd2.
Simply use bash instead of sh -i, i.e.
ssh -t host 'cmd1; cmd2; bash'
New to Docker here. I have a series of commands which work just fine if I fire them off in the shell, but don't if I put them in a script.
boot2docker destroy
boot2docker init
boot2docker start
boot2docker ssh &
host=$(boot2docker ip 2> /dev/null)
# everything works fine up to here
ssh -i $HOME/.ssh/id_boot2docker -o "StrictHostKeyChecking no" -o "UserKnownHostsFile /dev/null" docker@$host docker run --net=host my-image
If I don't try to run a command via ssh, everything works. Viz:
ssh -i $HOME/.ssh/id_boot2docker -o "StrictHostKeyChecking no" -o "UserKnownHostsFile /dev/null" docker@$host
This brings up the docker ssh prompt. But if I do run the command via the script (and this is what I actually need to do) I get the error message:
level="fatal" msg="Post http:///var/run/docker.sock/v1.16/containers/create: dial unix /var/run/docker.sock: no such file or directory. Are you trying to connect to a TLS-enabled daemon without TLS?"
Again, if I just enter that last command, or the whole litany of commands, into the shell, no problems. How can I make this script work?
Thanks
update
If I put that last line in its own script and run the two scripts in sequence from the command line, everything is fine (same as just typing all the commands in sequence). If I chain the scripts, or create a third to run them in sequence, I get the error. What am I to make of this?
Thanks
host probably isn't defined when you try to use it. You can confirm that by echoing its value before running ssh. The easiest solution would be to put these two lines together in the same file:
host=$(boot2docker ip 2> /dev/null)
ssh -i $HOME/.ssh/id_boot2docker -o "StrictHostKeyChecking no" -o "UserKnownHostsFile /dev/null" docker@$host docker run --net=host my-image
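If the steps really must live in separate files, a possible workaround (a sketch; the script name setup.sh is hypothetical) is to source the first script instead of executing it, so that host is set in the current shell rather than in a child process that exits:
# source runs setup.sh in the current shell, so variables like $host survive
source ./setup.sh
ssh -i $HOME/.ssh/id_boot2docker -o "StrictHostKeyChecking no" -o "UserKnownHostsFile /dev/null" docker@$host docker run --net=host my-image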
I'm currently trying to ssh into a remote machine and run a script, then leave the node with the script still running. Below is my script. However, when it runs, the script executes successfully on the remote machine, but the ssh session hangs. What's the problem?
ssh -x $username#$node 'rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
exit;'
There are situations when you want to start a script on a remote machine/server (one that will terminate on its own) and then disconnect from the server.
e.g. a script running on a local box which, when executed:
takes a model and copies it to a remote server,
creates a script for running a simulation with the model and pushes it to the server,
starts that script on the server and disconnects.
The job of the script thus started is to run the simulation on the server and, once completed (it will take days), copy the results back to the client.
I would use the following command:
ssh remoteserver 'nohup /path/to/script </dev/null >nohup.out 2>&1 &'
@CKeven, you may put all those commands in one script, push it to the remote server, and launch it as follows:
echo '#!/bin/bash
rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
' > script.sh
chmod u+x script.sh
rsync -azvp script.sh remotehost:/tmp
ssh remotehost '/tmp/script.sh </dev/null >nohup.out 2>&1 &'
Hope this works ;-)
Edit:
You can also use
ssh user@host 'screen -S SessionName -d -m "/path/to/executable"'
which creates a detached screen session and runs the target command within it.
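To check on that detached session later, something along these lines should work (a sketch; reattaching needs a TTY, hence the -t):
ssh user@host 'screen -ls'
ssh -t user@host 'screen -r SessionName'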
What do you think about using screen for this? You could run screen via ssh to start the command (concat.sh) and then you'd be able to return to the screen session if you wanted to monitor it (could be handy, depending on what concat does).
To be more specific, try this:
ssh -t $username@$node screen -dm -S testing ./monitor/concat.sh
You should find that the prompt returns immediately, and that concat.sh is running on the remote machine. I'll explain some of the options:
ssh -t allocates a TTY; screen needs this.
screen -dm makes it start in "detached" mode. This is like "background" for your purposes.
-S testing gives your screen session a name. It is optional but recommended.
Now, once you've done this, you can go to the remote machine and run this:
screen -r testing
This will attach you to the screen session which contains your program. From there you can control it, kill it, see its output, and so on. Ctrl-A, then d will detach you from the screen session. screen -ls will list all running sessions.
It could be the standard input stream. Try ssh -n ... or ssh -f ....
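Applied to the command from the question above, that might look like this (a sketch; -n redirects ssh's stdin from /dev/null, -f puts ssh in the background after authentication, and redirecting the output lets the connection close):
ssh -n -x $username@$node 'nohup ./monitor/concat.sh > /dev/null 2>&1 &'
# or, letting ssh background itself after authentication:
ssh -f -x $username@$node 'nohup ./monitor/concat.sh > /dev/null 2>&1 &'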
For me, only this worked:
screen -dmS name sh my-script.sh
This, of course, depends on screen, and lets you attach later, if you ever want stdin or stdout. Screen will terminate itself when my-script.sh ends.
Below is a more general solution that took some effort to find, and it really works for me:
#!/usr/bin/bash
theScreenSessionName="test"
theTabNumber="1"
theStuff="date; hostname; cd /usr/local; pwd; /usr/local/bin/top"
echo "this is a test"
ssh -f user#server "/usr/local/bin/screen -x $theScreenSessionName -p $theTabNumber -X stuff \"
$theStuff
\""
It sends the $theStuff list of commands to tab No. $theTabNumber of the screen session $theScreenSessionName, created beforehand on 'server' on behalf of 'user'.
Please be aware of a trailing whitespace after
-X stuff \"
which is sent to work around a glitch in the 'stuff' option. The whitespace and the $theStuff line are each followed by an 'Enter' (^M) keystroke. Don't miss them!
The "this is a test" message is echoed in the initial terminal, and $theStuff commands are really executed inside the mentioned screen/tab.