A Bash script is CRON-ed to run on server A every day and connects to a remote MySQL database to run a query, dump the results to a file, and send an email.
In my freestyle project, I have set up my build to pull that Bash script from my Github repo and execute it. Jenkins starts it, but the script fails. Here is what I have discovered:
Jenkins runs my script in a clone of my repo on its own server, not on server A where it is CRON-ed.
This explains why my shell script fails to find the mysql client executable.
Based on my understanding, this means that my freestyle project needs to connect to server A, where my script is CRON-ed, and run it like so:
ssh serverA_username@serverA # using server A's user key
SCRIPT="$WORKSPACE/shell/bash_script_name.sh"
sh -x "$SCRIPT"
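Spelled out a bit more, I imagine the build step would have to copy the script to server A and then run it there, something like this (the /tmp staging path is only a placeholder):

scp "$WORKSPACE/shell/bash_script_name.sh" serverA_username@serverA:/tmp/
ssh serverA_username@serverA 'bash -x /tmp/bash_script_name.sh'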
Am I correct? Any pointers welcome. Thank you!
I have a job that sshes into other servers and deploys some configuration with scp, but I cannot find any way to access the SSH key file used in my project configuration in TeamCity in order to run a shell command in my job ("ssh -i ~/.ssh/password"), because TeamCity runs only in the job directory. Therefore, I want to ask: is there any way to access the SSH private key file mentioned in the project settings?
Just to say, I cannot use SSH-EXEC and SSH-UPLOAD, as I have a shell script that sshes into many servers one by one, reading them from a file, so one separate SSH-EXEC build step per server in the TeamCity project would not be practical; I have to access the key file somehow without the standard SSH-EXEC and SSH-UPLOAD steps.
What have I tried?
I only had one idea: somehow access the SSH key, which is located outside the working directory, by its path (I found this in the documentation):
<TeamCity Data Directory>/config/projects/<project>/pluginData/ssh_keys
The problem with this is that I cannot just cd to that path, as the job does not want to go outside the working directory where TeamCity executes it, so I could not reach the directory where the ssh_keys for my project are located.
UPD: Found a solution: use the SSH Agent build feature; that way the uploaded SSH key is available right from the command line in the job.
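With that feature enabled, the key is loaded into an ssh-agent for the duration of the build, so a command-line step can loop over servers without pointing at a key file at all. A rough sketch (hosts.txt, the deploy user, and the remote command are placeholders):

while read -r host; do
  ssh -n "deploy@$host" "some-remote-command"   # -n keeps ssh from swallowing the rest of hosts.txt
done < hosts.txt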
There is a need to upload the script from my local machine to a server and then run it there. Can someone please let me know how I can achieve this?
Just copy it from your local machine to the server using SCP for Linux or SMB for Windows. Once done, you can log into the server over SSH or RDP and execute your JMeter test in command-line non-GUI mode.
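On Linux that could look roughly like this (file names, user, and host are placeholders):

scp test-plan.jmx user@server:/tmp/
ssh user@server 'jmeter -n -t /tmp/test-plan.jmx -l /tmp/results.jtl'

where -n runs JMeter in non-GUI mode, -t points at the test plan, and -l writes the results file.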
If you want fully unattended/automated execution, consider the following:
Set up a version control system, e.g. use Github to store your script(s)
Configure a Webhook to trigger an action when you commit the file
Install Jenkins on the server
Configure Jenkins to listen for the Github Webhook and kick off a build running a JMeter test when it fires
This way, whenever you add a new script or update an existing one, the job is triggered automatically and runs the test (a sketch of the build step is shown below). Check out the How to Integrate Your GitHub Repository to Your Jenkins Project article for detailed steps if needed.
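As a sketch, the build step of such a job can be as simple as running the freshly checked-out plan in non-GUI mode (the plan's file name is an assumption):

jmeter -n -t "$WORKSPACE/test-plan.jmx" -l "$WORKSPACE/results.jtl"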
When I want to update my app running on a server, I have to ssh to the server, cd to the app folder, execute git pull, then run npm build, and then restart the server. How can I automate this with a bash script or something? Is this the kind of case Jenkins (or some other tool) is meant for?
But how do I do this with a simple bash script or something?
I don't need the app rebuilt every time I push to git, only when I want to update and restart everything.
Also, if a build takes a long time, is there a way to be notified by email that the build succeeded?
For now, every time I update the remote app I have to keep the terminal open until the build finishes, and only then can I close the SSH session. Some builds take a long time.
You can simply script those commands, and put that script on your server.
That way, all you need to do is to ssh to that server and call that script, which will execute those commands on demand.
Is this the kind of case Jenkins (or some other tool) is meant for?
Not in this case, since it is purely on demand: you can execute the script through a simple ssh call, no need for Jenkins.
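A minimal sketch of such a script, assuming the app lives in /var/www/myapp and is managed by systemd (both are placeholders), with an optional mail at the end to cover the email notification you mentioned:

#!/usr/bin/env bash
# update.sh, stored on the server; paths, service name, and address are placeholders
set -euo pipefail
cd /var/www/myapp
git pull
npm run build                                               # or whatever your build command is
sudo systemctl restart myapp
echo "myapp rebuilt" | mail -s "build OK" you@example.com   # requires a configured mail command

An update then becomes a single call, which you can background with nohup so you do not have to keep the terminal open:

ssh user@server 'nohup /var/www/myapp/update.sh > /tmp/update.log 2>&1 &'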
I have a deployment script checked in the Github which I checkout with the source code in the pipeline. I need to execute this shell script to the remote server to perform the tasks (create backup, start / stop Jboss). I am using ssh agent to ssh into the remote server. How can I execute this within the Jenkins pipeline, and get the output from the commands back to the pipeline?
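For illustration, what I have in mind inside the sshagent block is something along these lines (user, host, and script name are placeholders), with the sh step's returnStdout option used to read the remote output back into the pipeline:

scp "$WORKSPACE/deploy.sh" jboss@remote-host:/tmp/deploy.sh
ssh jboss@remote-host 'bash /tmp/deploy.sh'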
Thank you
I need to run a bash script that periodically deletes old git branches, but I am having trouble finding a way to connect to the git repo via the Execute shell option.
Currently I am using cygwin in order to run git commands. Here is what I have in execute shell:
#!c:\cygwin64\bin\bash --login
git ls-remote git@10.1.1.126:/external-web/collette-com.git
This command is throwing the following error.
[Delete Branches] $ c:\cygwin64\bin\bash --login
C:\Users\tbraga\AppData\Local\Temp\hudson5750784484659728632.sh
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Build step 'Execute shell' marked build as failure
I have tried running this command in the command line and am prompted for a password. Could this be the issue?
I have the git plugin configured within Jenkins and the connection works perfectly when using Source Code Management Git.
Any suggestions on how to make this connection work in the Execute shell field would be greatly appreciated.
I solved this problem by passing my credentials to my execute shell script through the Credentials Binding Plugin in Jenkins
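For example, with an SSH private key bound to an environment variable by that plugin (the variable name GIT_KEY is just an illustration), the execute-shell step can tell git which key to use and skip the host-key prompt:

export GIT_SSH_COMMAND="ssh -i $GIT_KEY -o StrictHostKeyChecking=no"
git ls-remote git@10.1.1.126:/external-web/collette-com.git

GIT_SSH_COMMAND is honored by git 2.3 and later.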
It's simple enough to create an SSH key associated with your user.
Try here: https://confluence.atlassian.com/bitbucketserver/creating-ssh-keys-776639788.html
Put the keys under %userprofile%/.ssh and try running it again.
You can also use the same credentials used in your Jenkins configuration.
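Since the error is "Host key verification failed", you will likely also need the git server in known_hosts, for example:

ssh-keygen -t rsa -b 4096 -C "jenkins"          # generate a key pair if you do not have one yet
ssh-keyscan 10.1.1.126 >> ~/.ssh/known_hosts    # record the git server's host key so the build is not prompted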
I use SSH keys for auth to Github and had this same issue. My Jenkins configuration has EC2 slaves, so the default SSH key on the machine wasn't correct for Github.
I fixed it with the SSH Agent Plugin. In the job, enable the "SSH Agent" setting and choose the stored SSH key for Github authentication. It should be the same one selected for the Git-SCM configuration used to clone the repo.
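Once the agent holds the right key, the original command works unchanged, and you can verify the key is loaded from the build log first:

ssh-add -l        # lists the keys the agent is currently serving
git ls-remote git@10.1.1.126:/external-web/collette-com.git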