I have the ssh script below, which I am trying to execute through Jenkins; it runs fine when I invoke it from a shell.
#ssh to remote machine
sshpass ssh 10.40.94.36 -l root -o StrictHostKeyChecking=no
#Remove old slave.jar
rm -f slave.jar
#download slave.jar to that machine
wget http://10.40.95.14:8080/jnlpJars/slave.jar
pwd
#make new dir to that machine
mkdir //var//Jenkins
# make slave online
java -jar slave.jar -jnlpUrl http://10.40.95.14:8080/computer/nodeV/slave-agent.jnlp
When I execute this script through a shell it downloads the jar file to the remote machine and also makes the new directory. But when I invoke it through the Jenkins shell plugin, every command runs separately, so the jar gets downloaded on the master and the directory also gets created on the master.
Also, I am using sshpass for passwordless automated login, which sometimes fails. Is there another way of doing this?
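A sketch of one way to fix both problems (assuming SSH keys can be installed with ssh-copy-id, which removes the need for sshpass): feed the remote commands to ssh on standard input with a here-doc, so everything between the markers runs on the remote machine instead of on the Jenkins master.
#!/bin/bash
# Key-based login assumed (set up once with: ssh-copy-id root@10.40.94.36).
ssh -o StrictHostKeyChecking=no root@10.40.94.36 <<'EOF'
# Everything below runs on the remote machine, not on the master.
rm -f slave.jar
wget http://10.40.95.14:8080/jnlpJars/slave.jar
mkdir -p /var/Jenkins
java -jar slave.jar -jnlpUrl http://10.40.95.14:8080/computer/nodeV/slave-agent.jnlp
EOF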
I'm trying to set up a GitLab pipeline, and one of the steps includes running a .bat script on a Windows Server.
The Windows Server has an SSH daemon installed and configured.
I've tried the following command from a Unix host
sshpass -p <pwd> ssh -o StrictHostKeyChecking=no <user>@<ip> "C:\Temp\test.bat"
and everything is working fine.
The GitLab job will be executed from a custom image, like this:
build_and_deploy_on_integrazione:
  stage: build
  tags:
    - maven
  image: <custom_image>
  script:
    - apt-get update -y
    - apt-get install -y sshpass
    - sshpass -p <pwd> ssh -o StrictHostKeyChecking=no <user>@<ip> "C:\Temp\test.bat"
    - echo "Done"
Just to be sure, I started a container of the custom image from the command line on the same machine that hosts the GitLab Runner instance and executed the steps of the script, and everything ran fine there too.
But when I run the pipeline from GitLab, the .bat file is not executed; the only output I see is
Warning: Permanently added '<ip>' (RSA) to the list of known hosts.
and nothing else.
I've checked the SSH daemon log and the connection is established correctly, so the "SSH" part of the script seems to be working, but the .bat script is never executed.
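One thing worth checking (an assumption on my part, the thread does not confirm it): when the remote command is split across two lines in the YAML, the .bat path may not reach ssh as part of the same command. Collapsing it into a single quoted entry guarantees that exactly one command string is sent, and routing it through cmd /c makes the Windows invocation explicit:
script:
  - apt-get update -y
  - apt-get install -y sshpass
  - 'sshpass -p <pwd> ssh -o StrictHostKeyChecking=no <user>@<ip> "cmd /c C:\Temp\test.bat"'
  - echo "Done"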
The command below works if I write it inside a script (test.sh) and execute it directly on the specific machine.
sshpass -p $HOST_PWD sftp testuser@host <<!
cd parent
mkdir test
bye
!
But when I try to run it in Jenkins (either the script directly, or by invoking the test.sh file at its specific path) with "Execute shell script on remote host using ssh", it fails with
sshpass: Failed to run command: No such file or directory
I have installed sshpass, lftp and rsync on the remote machine.
Issue:
I have added export $HOST_PWD in the .bashrc of the specific machine as well as of the Jenkins user, but it is not finding it.
The script is placed on the specific machine; if I execute it directly on that machine it works, even with $HOST_PWD. But it does not work when invoked from Jenkins, either via the script or directly, using "Execute shell script on remote host using ssh".
Working with changes:
If I put the password in directly instead of $HOST_PWD, it works.
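A likely explanation (my assumption, not confirmed in the thread): the Jenkins plugin runs the remote command in a non-interactive shell, and the stock .bashrc on most distributions returns early for non-interactive shells, so an export placed there never takes effect in that session. One way to sidestep shell startup files entirely is sshpass's -f option, which reads the password from a file instead of the environment (~/.host_pwd is a placeholder; protect it with chmod 600):
# Password kept in a file instead of an environment variable.
sshpass -f ~/.host_pwd sftp testuser@host <<!
cd parent
mkdir test
bye
!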
I am running a very simple script that ssh-es into a remote Ubuntu instance, moves around the directory structure, and executes a few things; after that I want the prompt to stay on the remote machine. Instead, when the script ends, it ends back at the local prompt. How do I modify the script so that it finishes at the remote prompt?
local$ ssh -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh;"
Two things need to be added to your command line:
The bash command at the end starts the bash shell (you can start any other shell you want).
The -t switch makes sure the remote server allocates you a TTY, so your interactive shell works as expected:
local$ ssh -t -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh; bash"
I have an Ant script, called from Jenkins, that - after other deployment tasks - starts a JBoss server. The deployment package already contains a startup script which wraps the JBoss run script:
[...]/bin/run.sh -b ip -c config >/dev/null 2>&1 &
The startup script runs fine when executed manually (i.e. ssh to the server and run sudo ./startup.sh).
Now I'm having trouble invoking this startup script from the sshexec task. The task can execute the startup script, and JBoss does get spun up, but it terminates as soon as the task returns and moves on to the next task - similar to running run.sh directly and then closing the session.
My task is pretty standard
<sshexec host="host" username="username" password="password"
command="echo password | sudo -S sh ${JBOSS_HOME}/server/config/startup.sh" />
I'm confused. Shouldn't the startup script above have already covered starting JBoss separately from the session? Any idea how to solve this?
The remote system is Red Hat 6.
Never mind, I found it. I still needed to combine nohup and background execution with the startup script, plus the "dirty workaround" from here:
https://unix.stackexchange.com/questions/91065/nohup-sudo-does-not-prompt-for-passwd-and-does-nothing (which was actually brilliant).
End result:
echo password | sudo -S env && sudo sh -c 'nohup startup.sh > /dev/null 2>&1 &'
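Why this works (my reading of the linked workaround, not spelled out above): the first echo password | sudo -S env caches sudo's credentials for the session, so the second sudo does not prompt again (a prompt would be invisible behind nohup's redirections), while nohup plus & with output redirected detaches the startup script from the SSH session, so JBoss survives the sshexec task returning.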
I need to run some commands locally and then some commands on a remote machine, all from a single local bash script.
For simplicity, just say I want to do the following and execute it on my local desktop machine.
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh myuser:mypass@somewhere.com
cd /tmp/uploads <--- these commands don't run in the ssh connection
./process-uploads.sh
exit
#run command locally again.
cd -
echo 'complete!'
Any ideas of how to do this?
You can use a here-doc with the ssh command:
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh -t -t myuser@somewhere.com <<EOF
cd /tmp/uploads
./process-uploads.sh
exit
EOF
#run command locally again.
cd -
echo 'complete!'
If you want to run only one command:
ssh myuser@somewhere.com 'cd /tmp/uploads; ./process-uploads.sh'
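Two caveats worth adding (general ssh and shell behavior, not from the answer itself): ssh does not accept the user:password@host form from the question, so the commands above assume key-based or prompted authentication (or sshpass, as used earlier in this thread). And with an unquoted delimiter like <<EOF, variables inside the here-doc are expanded by the local shell before ssh sends the text; quote the delimiter to have the remote shell expand them instead:
ssh myuser@somewhere.com <<'EOF'
# $HOME is expanded on the remote machine, not locally.
echo "remote home is $HOME"
EOF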