Git push step failing in Jenkins on Windows

The code below was working fine and all of a sudden it broke. I am using a Windows box.
stage('Push') {
    withCredentials([usernamePassword(credentialsId: 'gitlogin', passwordVariable: 'GIT_PASSWORD', usernameVariable: 'GIT_USERNAME')]) {
        sh('git push --tags origin $BRANCH_NAME')
    }
    if ("${BRANCH_NAME}" == "develop" || "${BRANCH_NAME}".startsWith("release")) {
        sshagent(credentials: ['GitSSHLOGIN']) {
            sh("git tag -a PBCS_${BRANCH_NAME}_${ReleaseNumber}_${BUILD_NUMBER} -m 'Tag the build ${BRANCH_NAME}_${ReleaseNumber}_${BUILD_NUMBER}'")
            sh('git push --tags origin $BRANCH_NAME')
        }
    }
}
Below is the error we are getting:
+ git push --tags origin release/21.04
Could not create directory '/c/Jenkins/jobs/branches/release-21-04.3rkqb4/workspace/nullnull/.ssh'.
ssh_askpass: exec(/usr/lib/ssh/ssh-askpass): No such file or directory
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
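The nullnull in the path suggests the HOME environment variable is not set for the account running the Jenkins agent, so ssh has nowhere to create its .ssh directory and falls back to ssh-askpass. One hedged workaround, a sketch only, is to skip SSH for this push and reuse the HTTPS credentials already bound above (the host and repo path below are placeholders, and a password containing special characters would need URL-encoding):

```groovy
// Sketch: push over HTTPS using the already-bound credentials.
// gitserver.example.com/group/repo.git is a placeholder for the real remote.
withCredentials([usernamePassword(credentialsId: 'gitlogin',
        passwordVariable: 'GIT_PASSWORD', usernameVariable: 'GIT_USERNAME')]) {
    // Single quotes: the shell expands the variables, so Jenkins can mask them in the log.
    sh 'git push --tags https://$GIT_USERNAME:$GIT_PASSWORD@gitserver.example.com/group/repo.git $BRANCH_NAME'
}
```

Alternatively, setting HOME explicitly for the step (for example with withEnv(["HOME=${env.WORKSPACE}"]) around the sshagent block) would give ssh a writable location for its .ssh directory, though host key verification would still need a known_hosts entry there.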

Related

Jenkins with jFrog Artifactory push Docker images

I'm trying to configure a new pipeline in Jenkins. I have purchased and installed JFrog Artifactory Pro on Windows Server, and it's up and running at: https://artifactory.mycompany.com
I found this sample here:
https://github.com/jfrog/project-examples/blob/master/jenkins-examples/pipeline-examples/declarative-examples/docker-push-example/Jenkinsfile
More specifically this section:
stage ('Push image to Artifactory') {
    steps {
        rtDockerPush(
            serverId: "ARTIFACTORY_SERVER",
            image: ARTIFACTORY_DOCKER_REGISTRY + '/hello-world:latest',
            // Host:
            // On OSX: "tcp://127.0.0.1:1234"
            // On Linux can be omitted or null
            host: HOST_NAME,
            targetRepo: 'docker-local',
            // Attach custom properties to the published artifacts:
            properties: 'project-name=docker1;status=stable'
        )
    }
}
It builds and creates the Docker image, but when it gets to the push step it fails to push the image and errors out. I'm not sure what should go in the following:
ARTIFACTORY_DOCKER_REGISTRY
host: HOST_NAME
I've created a new local repo in Artifactory, "docker-local". Omitting host gives
"Unsupported OS".
Putting host back in with "host: 'tcp://IP ADDRESS'" or "artifactory.mycompany.com:80/artifactory" generates
"Unsupported protocol scheme".
How would one configure jenkins pipeline to work with jFrog artifactory?
Found the solution:
ARTIFACTORY_DOCKER_REGISTRY should be IP/Artifactory-Repo-Key/Image:Tag
HOST should be the Docker daemon (Docker for Windows listens on localhost:2375)
stage('Build image') { // build and tag docker image
    steps {
        echo 'Starting to build docker image'
        script {
            def dockerfile = 'Dockerfile'
            def customImage = docker.build('10.20.111.23:8081/docker-virtual/hello-world:latest', "-f ${dockerfile} .")
        }
    }
}
stage ('Push image to Artifactory') { // take that image and push to artifactory
    steps {
        rtDockerPush(
            serverId: "jFrog-ar1",
            image: "10.20.111.23:8081/docker-virtual/hello-world:latest",
            host: 'tcp://localhost:2375',
            targetRepo: 'local-repo', // where to copy to (from docker-virtual)
            // Attach custom properties to the published artifacts:
            properties: 'project-name=docker1;status=stable'
        )
    }
}

Unable to print credentials set in Jenkins Pipeline

Credentials are configured in Jenkins but there's an error suggesting they are not.
I've followed the documentation provided by the Jenkins website.
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    environment {
        AWS_ACCESS_KEY_ID = credentials('jenkins-aws-secret-key-id')
        AWS_SECRET_ACCESS_KEY = credentials('jenkins-aws-secret-access-key')
    }
    stages {
        stage('checkout') {
            steps {
                git(url: 'git@bitbucket.org:user/bitbucketdemo.git', branch: 'master', credentialsId: 'jenkins')
                echo 'hello'
            }
        }
        stage('packer') {
            steps {
                echo $AWS_ACCESS_KEY_ID
            }
        }
    }
}
It should print out the value of the environment variable
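As an aside, the echo step as written is not valid Groovy: $AWS_ACCESS_KEY_ID outside a string is shell syntax, not Groovy, so the pipeline fails to compile before credentials are even resolved. A minimal correction (note that Jenkins masks bound credential values as **** in the console anyway):

```groovy
stage('packer') {
    steps {
        // Groovy string interpolation, not shell expansion:
        echo "${AWS_ACCESS_KEY_ID}"
        // or reference the environment variable explicitly:
        echo "${env.AWS_ACCESS_KEY_ID}"
    }
}
```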
I used the CloudBees AWS Credentials plugin. Once it was installed, I was able to add my AWS credentials (an additional selection in the Credentials pull-down menu).
Then use the following snippet in my Jenkinsfile
withCredentials([[
    $class: 'AmazonWebServicesCredentialsBinding',
    accessKeyVariable: 'AWS_ACCESS_KEY_ID',
    credentialsId: 'AWS',
    secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
]]) {
    sh 'packer build -var aws_access_key=${AWS_ACCESS_KEY_ID} -var aws_secret_key=${AWS_SECRET_ACCESS_KEY} example4.json'
}

Jenkinsfile | Upload documents from Jenkins workspace to confluence

I need to upload documents from Jenkins workspace to confluence via Jenkinsfile.
I followed this link and started writing the basic code below, though I'm sure it will not work as-is. Can anyone please add to it, comment, or suggest a few links?
void Publish_Doc_Confluence() {
    withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: 'iam_user_jenkins']]) {
        publishConfluence attachArchivedArtifacts: true, pageName: '', replaceAttachments: true, siteName: '', spaceName: ''
    }
}
I have also tried using a curl command to upload the file, but in vain:
stage('Publish to Confluence') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'confluence', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
            sh '''
                curl -D- -u $USERNAME:$PASSWORD -X PUT -H "X-Atlassian-Token: nocheck" -F "file=@code/pydoc/*.html" -F "minorEdit=false" 'https://alm-tuigroup.atlassian.net/wiki/rest/api/content/504955238/child/attachment'
            '''
        }
    }
}
And where exactly on the Confluence side do I find the details below?
pageName
siteName
spaceName
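For what it's worth, in the Confluence Publisher plugin those values do not come from the page itself: siteName must match a Confluence site configured under Manage Jenkins -> Configure System, spaceName is the space key visible in the Confluence space URL, and pageName is the title of the target page. A hedged sketch with placeholder values (all three strings below are assumptions to be replaced with your own):

```groovy
// Placeholders: 'wiki.example.com' must match a site configured in Jenkins'
// global configuration; 'DOCS' is the space key; 'Build Artifacts' is the page title.
publishConfluence siteName: 'wiki.example.com',
                  spaceName: 'DOCS',
                  pageName: 'Build Artifacts',
                  attachArchivedArtifacts: true,
                  replaceAttachments: true
```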

Jenkins Pipeline - Parallel steps skip latest Git commits and Ruby's Bundler commands pause unexpectedly

I am writing my first declarative pipeline and would like to run the Git Checkout and Bundler stages in parallel. When the steps from these stages are executed in parallel, I see odd behavior. For example:
I want to check out the latest copy of a PR to test. In the parallel step, the checkout fails to get the latest commit and fetches an older commit as HEAD. I have verified that the Bitbucket server has newer commits than the one checked out.
Figured this out: in Bitbucket Server, you need to manually open the PR page and refresh the "commit cache". This way the latest commits are served to Jenkins.
Source: https://community.atlassian.com/t5/Bitbucket-questions/Change-pull-request-refs-after-Commit-instead-of-after-Approval/qaq-p/194702
I want to run Ruby's Bundler (bundle install & bundle update) in 4 different repositories at the same time. When the parallel step executes, it unexpectedly pauses/hangs on bundle install in the cr_dbvals repo and prints nothing to the console. I have to abort the build at that point.
Agents OS: Windows 10 x64
Running these steps sequentially yields the expected results and everything works fine. I'm not sure what I am missing in my script:
pipeline {
    agent {
        node {
            label "${env.executor_label}"
        }
    }
    stages {
        stage('Set Build Name') {
            steps {
                script {
                    currentBuild.displayName = "#${env.BUILD_NUMBER} - ${env.app_node} - ${env.browser}#${env.NODE_NAME}"
                }
            }
        } // stage('Set Build Name')
        stage('Git Checkout') {
            steps {
                parallel(
                    "Git Cucumber-Watir": {
                        dir('cucumber-watir') {
                            git(url: 'http://git-repo-url/cucumber-watir.git', branch: 'master', changelog: true)
                            bat(script: 'git config --add remote.origin.fetch +refs/pull-requests/*/from:refs/remotes/origin/pr/*')
                            bat(script: 'git fetch origin -p')
                            bat(script: "git checkout ${env.cucumber_watir_branch}")
                        }
                    },
                    "Git CrModels": {
                        dir('cr_models') {
                            git(url: 'http://git-repo-url/cr_models.git', branch: 'master', changelog: true)
                            bat(script: 'git config --add remote.origin.fetch +refs/pull-requests/*/from:refs/remotes/origin/pr/*')
                            bat(script: 'git fetch origin -p')
                            bat(script: "git checkout ${env.cr_models_branch}")
                        }
                    },
                    "Git CaModels": {
                        dir('ca_models') {
                            git(url: 'http://git-repo-url/ca_models.git', branch: 'master', changelog: true)
                            bat(script: 'git config --add remote.origin.fetch +refs/pull-requests/*/from:refs/remotes/origin/pr/*')
                            bat(script: 'git fetch origin -p')
                            bat(script: "git checkout ${env.ca_models_branch}")
                        }
                    },
                    "Git CrDbVal": {
                        dir('cr_dbvals') {
                            git(url: 'http://git-repo-url/cr_dbvals.git', branch: 'master', changelog: true)
                            bat(script: 'git config --add remote.origin.fetch +refs/pull-requests/*/from:refs/remotes/origin/pr/*')
                            bat(script: 'git fetch origin -p')
                            bat(script: "git checkout ${env.cr_dbvals_branch}")
                        }
                    }
                ) // parallel
            } // steps
        } // stage('Git Checkout')
        stage('Bundler') {
            steps {
                parallel(
                    "Bundle Cucumber-Watir": {
                        dir('cucumber-watir') {
                            bat(script: "bundle install")
                            bat(script: "bundle update")
                        }
                    },
                    "Bundle CrModels": {
                        dir('cr_models') {
                            bat(script: "bundle install")
                            bat(script: "bundle update")
                        }
                    },
                    "Bundle CaModels": {
                        dir('ca_models') {
                            bat(script: "bundle install")
                            bat(script: "bundle update")
                        }
                    },
                    "Bundle CrDbVal": {
                        dir('cr_dbvals') {
                            bat(script: "bundle install")
                            bat(script: "bundle update")
                        }
                    }
                ) // parallel
            } // steps
        } // stage('Bundler')
        stage('Execute Test(s)') {
            steps {
                dir(path: 'cucumber-watir') {
                    bat 'set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1'
                    bat(script: 'cucumber -t %tags% -f json -o cucumber.json -f pretty --expand')
                }
            } // steps
        } // stage('Execute Test(s)')
    } // stages
    post {
        always {
            dir(path: 'cucumber-watir') {
                cucumber 'cucumber.json' // Build Cucumber Report
            }
            deleteDir() // Cleanup
        } // always
    } // post
} // pipeline
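One thing worth noting: calling the parallel step inside a steps block is the older scripted form and is deprecated in declarative pipelines, which have a dedicated parallel block for stages. A hedged sketch of that restructuring (only two of the four repos shown; whether it resolves the hang also depends on how Bundler behaves when several installs share a gem cache on one agent):

```groovy
// Sketch: declarative 'parallel' stages instead of parallel-inside-steps.
stage('Bundler') {
    parallel {
        stage('Bundle Cucumber-Watir') {
            steps {
                dir('cucumber-watir') {
                    bat 'bundle install'
                    bat 'bundle update'
                }
            }
        }
        stage('Bundle CrDbVal') {
            steps {
                dir('cr_dbvals') {
                    bat 'bundle install'
                    bat 'bundle update'
                }
            }
        }
    }
}
```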

Grunt Shell + Heroku Push = No stdout

I'm using Grunt to build, add, commit, and push my code up to Heroku.
Build, add, and commit are working great.
But when I run "git push heroku master" via grunt-shell, I get no stdout while the process runs.
Here is the code in Grunt.js:
'git-push': {
    command: 'git push heroku master',
    options: {
        failOnError: true,
        stdout: true,
        execOptions: { cwd: '../deploy' }
    }
}
But I am only seeing the following when the process runs:
$ grunt push
Running "shell:git-push" (shell) task
Done, without errors.
I would like to see the output of the push while the push is in process.
Any way to do this?
Update: Full grunt shell script
shell: {
    'git-add': {
        command: 'git --no-pager add .',
        options: {
            stdout: true,
            execOptions: { cwd: '../deploy' }
        }
    },
    'git-commit': {
        command: 'git --no-pager commit -m "update"',
        options: {
            stdout: true,
            execOptions: { cwd: '../deploy' }
        }
    },
    'git-push': {
        command: 'git --no-pager push heroku master',
        options: {
            failOnError: true,
            stdout: true,
            execOptions: { cwd: '../deploy' }
        }
    }
}
Final Grunt Shell (working):
shell: {
    'git-add': {
        command: 'git --no-pager add .',
        options: {
            stdout: true,
            stderr: true,
            execOptions: { cwd: '../deploy' }
        }
    },
    'git-commit': {
        command: 'git --no-pager commit -m "update"',
        options: {
            stdout: true,
            stderr: true,
            execOptions: { cwd: '../deploy' }
        }
    },
    'git-push': {
        command: 'git --no-pager push heroku master',
        options: {
            failOnError: true,
            stdout: true,
            stderr: true,
            execOptions: { cwd: '../deploy' }
        }
    }
}
See:
How to make git diff write to stdout?
Adding --no-pager as an option, gives output.
git --no-pager <subcommand> <options>
Also, certain git commands write to stderr, as discussed here:
http://git.661346.n2.nabble.com/git-push-output-goes-into-stderr-td6758028.html
By including the flag and capturing stderr in the grunt task, I was able to get output for the last part of the Heroku push process (but not the part where the upload is tracked):
Fetching repository, done.
-----> Node.js app detected
PRO TIP: Specify a node version in package.json
See https://devcenter.heroku.com/articles/nodejs-support
