Jenkins git checkout stage not able to checkout yml file - jenkins-pipeline

I have created a Jenkins pipeline for an application. I have the following stages in my declarative pipeline:
1. Checkout
2. NuGet restore
3. Sonar scan start
4. dotnet build
5. Sonar scan end
6. Build Docker image
7. Run container
8. Deploy on Google Kubernetes cluster
If I don't include the 8th step, my pipeline works fine, but if I include it, the pipeline works only the first time. On the next runs the first (checkout) stage fails with an error.
I have created a Windows machine on Azure and I am running Jenkins on that machine.
Jenkinsfile:
pipeline {
    // agent and environment declarations (registry, registryCredential, scannerHome, containerName, PROJECT_ID, etc.) omitted in the original snippet
    stages {
        stage('Code Checkout') {
            steps {
                echo 'Cloning project...'
                deleteDir()
                checkout changelog: false, poll: false, scm: [$class: 'GitSCM', branches: [[name: '*/development']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/shailu0287/JenkinsTest.git']]]
                echo 'Project cloned...'
            }
        }
        stage('Nuget Restore') {
            steps {
                echo "nuget restore"
                bat 'dotnet restore \"WebApplication4.sln\"'
            }
        }
        stage('Sonar Scan Start') {
            steps {
                withSonarQubeEnv('SonarQube_Home') {
                    echo "Sonar scan start"
                    echo "${scannerHome}"
                    bat "${scannerHome}\\SonarScanner.MSBuild.exe begin /k:\"Pan33r\" /d:sonar.login=\"squ_e2ecec8e21976c04764cc4940d3d3ddbec9e2898\""
                }
            }
        }
        stage('Build Solution') {
            steps {
                echo "Build Solution"
                bat "\"${tool 'MSBUILD_Home'}\" WebApplication4.sln /p:Configuration=Release /p:Platform=\"Any CPU\" /p:ProductVersion=1.0.0.${env.BUILD_NUMBER}"
            }
        }
        stage('Sonar Scan End') {
            steps {
                withSonarQubeEnv('SonarQube_Home') {
                    echo "${scannerHome}"
                    echo "sonar scan end"
                    bat "${scannerHome}\\SonarScanner.MSBuild.exe end /d:sonar.login=\"squ_e2ecec8e21976c04764cc4940d3d3ddbec9e2898\""
                }
            }
        }
        stage('Building docker image') {
            steps {
                script {
                    echo "Building docker image"
                    dockerImage = docker.build registry + ":$BUILD_NUMBER"
                }
            }
        }
        stage('Containers') {
            parallel {
                stage("Run PreContainer Checks") {
                    environment {
                        containerID = "${bat(script: 'docker ps -a -q -f name="c-Shailendra-master"', returnStdout: true).trim().readLines().drop(1).join("")}"
                    }
                    steps {
                        script {
                            echo "Run PreContainer Checks"
                            echo env.containerName
                            echo "containerID is "
                            echo env.containerID
                            if (env.containerID != null) {
                                echo "Stop container and remove from stopped container list too"
                                bat "docker stop ${env.containerID} && docker rm ${env.containerID}"
                            }
                        }
                    }
                }
                stage("Publish Docker Image to DockerHub") {
                    steps {
                        script {
                            echo "Pushing docker image to docker hub"
                            docker.withRegistry('', registryCredential) {
                                dockerImage.push("$BUILD_NUMBER")
                                dockerImage.push('latest')
                            }
                        }
                    }
                }
            }
        }
        stage('Docker Deployment') {
            steps {
                echo "${registry}:${BUILD_NUMBER}"
                echo "Docker Deployment by using docker hub's image"
                bat "docker run -d -p 7200:80 --name c-${containerName}-master ${registry}:${BUILD_NUMBER}"
            }
        }
        stage('Deploy to GKE') {
            steps {
                echo "Deployment started ..."
                step([$class: 'KubernetesEngineBuilder', projectId: env.PROJECT_ID, clusterName: env.CLUSTER_NAME, location: env.LOCATION, manifestPattern: 'Kubernetes.yml', credentialsId: env.CREDENTIALS_ID, verifyDeployments: true])
            }
        }
    }
}
If I remove the last stage, all my builds work fine. If I include it, only the first build works; after that I have to restart the machine. I am not sure what the issue with the YML file is.

Related

Run a set of Linux commands using a Jenkinsfile in Jenkins

I had to create a Jenkins job to automate certain tasks: updating the public site, changing the public version to the latest public release, updating the software on the public site, and restarting the server. These include operations such as copying files to a tmp folder, logging in to an on-prem server, going to the folder, unzipping the file, etc.
I have created the Jenkinsfile as follows:
pipeline {
    options {
        skipDefaultCheckout()
        timestamps()
    }
    parameters {
        string(name: 'filename', defaultValue: 'abc', description: 'Enter the file name that needs to be copied')
        string(name: 'database', defaultValue: 'abc', description: 'Enter the database that needs to be created')
        choice(name: 'Run', choices: '', description: 'Data migration')
    }
    agent {
        node { label 'aws && build && linux && ubuntu' }
    }
    triggers {
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Clean & Clone') {
            steps {
                cleanWs()
                checkout scm
            }
        }
        stage('Updating the public site') {
            steps {
                sh "scp ./${filename}.zip <user>@<server name>:/tmp"
                sh "ssh <user>@<server name>"
                sh "cp ./tmp/${filename}.zip ./projects/xyz/xyz-site/"
                sh "cd ./projects/xyz/xyz-site/"
                sh "unzip ./${filename}.zip"
                sh "cp -R ./${filename}/* ./"
            }
        }
        stage('Changing public version to latest public release') {
            steps {
                sh "scp ./${filename}.sql.gz <user>@<server name>:/tmp"
                sh "ssh <user>@<server name>"
                sh "mysql -u root -p<PASSWORD>"
                sh "show databases;"
                sh "create database ${params.database};"
                sh "GRANT ALL PRIVILEGES ON <newdb>.* TO 'ixxyz'@'localhost' WITH GRANT OPTION;"
                sh "exit;"
                sh "zcat tmp/${filename}.sql.gz | mysql -u root -p<PASSWORD> <newdb>"
                sh "db.default.url=\"jdbc:mysql://localhost:3306/<newdb>\""
                sh "ps aux | grep monitor.sh | awk '{print \"kill \"\$2}' | bash"
            }
        }
        stage('Updating Software on public site') {
            steps {
                sh "scp <user>@<server>:/tmp/abc<version>_empty_h2.zip"
                sh "ssh <user>@<server name>"
                sh "su <user>"
                sh "mv tmp/<version>_empty_h2.zip ./xyz/projects/xyz"
                sh "cd xyz/projects/xyz"
                sh "cp latest/conf/local.conf <version>_empty_h2/conf/"
            }
        }
        stage('Restarting Server') {
            steps {
                sh "rm latest/RUNNING_PID"
                sh "bash reload.sh"
                sh "nohup bash monitor.sh &"
            }
        }
    }
}
Is there a way I can dynamically obtain the zip filename in the root folder? I used ${filename}.zip, but it doesn't seem to work.
Also, is there a better way to perform these operations using Jenkins? Any help is much appreciated.
You could write all the steps of each stage in one shell script and execute it within that stage.
Regarding filename.zip, you can either take it as a parameter and pass the value to your stages, or use the find command in a shell step (or shell script) to locate .zip files in the current directory: find <dir> -iname \*.zip, e.g. find . -iname \*.zip.
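For the single-script approach, a rough sketch of the first stage (assuming passwordless SSH to the target host; user@server and the remote paths are placeholders, not values from the question) runs all remote commands in one sh step so the working directory and session persist:

stage('Updating the public site') {
    steps {
        // Copy the archive, then run every remote command in a single ssh
        // invocation so the cd and the following commands share one session.
        sh """
            scp ./${params.filename}.zip user@server:/tmp
            ssh user@server '
                cd ~/projects/xyz/xyz-site/ &&
                cp /tmp/${params.filename}.zip . &&
                unzip -o ${params.filename}.zip &&
                cp -R ${params.filename}/* .
            '
        """
    }
}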
Example:
pipeline {
    options {
        skipDefaultCheckout()
        timestamps()
    }
    parameters {
        string(name: 'filename', defaultValue: 'abc', description: 'Enter the file name that needs to be copied')
        choice(name: 'Run', choices: '', description: 'Data migration')
    }
    stage('Updating the public site') {
        steps {
            sh "scp ./${params.filename}.zip <user>@<server name>:/tmp"
            ...
        }
    }
}
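If you would rather discover the archive name at run time than pass it as a parameter, a minimal sketch (the stage name reuses the one from the question; user@server is illustrative, and GNU find on the Linux agent is assumed) captures the find output in a script block:

stage('Updating the public site') {
    steps {
        script {
            // Pick up the first .zip sitting in the workspace root.
            def zipName = sh(
                script: "find . -maxdepth 1 -iname '*.zip' -printf '%f\\n' | head -n 1",
                returnStdout: true
            ).trim()
            echo "Found archive: ${zipName}"
            sh "scp ./${zipName} user@server:/tmp"
        }
    }
}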
To execute a script at a certain location, as in your question, you could use dir with the path where your scripts are placed,
or you can give the path directly: sh label: 'execute script', script: "C:\\Data\\build.sh"
stage('Your stage name') {
    steps {
        script {
            // Give path where your scripts are placed
            dir("C:\\Data") {
                sh label: 'execute script', script: "build.sh <Your Arguments>"
                ...
            }
        }
    }
}

How to configure a Jenkins pipeline with the Logstash plugin?

Use case: I want to send the Jenkins job console log to Elasticsearch and from there to Kibana so that I can visualise the data.
I am using the Logstash plugin to achieve this. For freestyle jobs the plugin configuration works fine, but for Jenkins pipeline jobs I get all the required data such as build number, job name and build duration, yet the build result, i.e. success or failure, is not shown.
I tried two ways:
1.
stage('send to ES') {
    logstashSend failBuild: true, maxLines: -1
}
2.
timestamps {
    logstash {
        node() {
            sh '''
                echo 'Hello, World!'
            '''
            try {
                stage('GitSCM') {
                    git url: 'github repo.git'
                }
                stage('Initialize') {
                    jdk = tool name: 'jdk'
                    env.JAVA_HOME = "${jdk}"
                    echo "jdk installation path is: ${jdk}"
                    sh "${jdk}/bin/java -version"
                    sh '$JAVA_HOME/bin/java -version'
                    def mvnHome = tool 'mvn'
                }
                stage('Build Stage') {
                    def mvnHome = tool 'mvn'
                    sh "${mvnHome}/bin/mvn -B verify"
                }
                currentBuild.result = 'SUCCESS'
            } catch (Exception err) {
                currentBuild.result = 'FAILURE'
            }
        }
    }
}
But with both approaches I am not getting the build result, i.e. success or failure, in my Elasticsearch or Kibana.
Can someone help?
I didn't find a clean way to do that; my solution was to add these lines at the end of the Jenkinsfile:
echo "Current result: ${currentBuild.currentResult}"
logstashSend failBuild: true, maxLines: 3
In my case, I don't need it to send all console logs, only one log entry with the result per job.
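In a declarative pipeline, one natural place for "the end of the Jenkinsfile" is a post block, where currentBuild.currentResult is already final. A minimal sketch (the Build stage is just a placeholder):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'build steps here'
            }
        }
    }
    post {
        always {
            // By this point the result is SUCCESS, UNSTABLE or FAILURE,
            // so the record shipped by logstashSend includes the outcome.
            echo "Current result: ${currentBuild.currentResult}"
            logstashSend failBuild: true, maxLines: 3
        }
    }
}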

Unable to print credentials set in Jenkins Pipeline

Credentials are configured in Jenkins but there's an error suggesting they are not.
I've followed the documentation provided on the Jenkins website.
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    environment {
        AWS_ACCESS_KEY_ID = credentials('jenkins-aws-secret-key-id')
        AWS_SECRET_ACCESS_KEY = credentials('jenkins-aws-secret-access-key')
    }
    stages {
        stage('checkout') {
            steps {
                git(url: 'git@bitbucket.org:user/bitbucketdemo.git', branch: 'master', credentialsId: 'jenkins')
                echo 'hello'
            }
        }
        stage('packer') {
            steps {
                echo $AWS_ACCESS_KEY_ID
            }
        }
    }
}
It should print out the value of the environment variable
I used the CloudBees AWS Credentials plugin. Once installed, I was able to add my AWS credentials (an additional selection in the Credentials pull-down menu).
Then I used the following snippet in my Jenkinsfile:
withCredentials([[
    $class: 'AmazonWebServicesCredentialsBinding',
    accessKeyVariable: 'AWS_ACCESS_KEY_ID',
    credentialsId: 'AWS',
    secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
]]) {
    sh 'packer build -var aws_access_key=${AWS_ACCESS_KEY_ID} -var aws_secret_key=${AWS_SECRET_ACCESS_KEY} example4.json'
}
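If you are on a declarative pipeline like the one in the question, the same binding can sit directly inside a stage; a sketch reusing the snippet above (stage name and packer file taken from the question and answer):

stage('packer') {
    steps {
        withCredentials([[
            $class: 'AmazonWebServicesCredentialsBinding',
            credentialsId: 'AWS',
            accessKeyVariable: 'AWS_ACCESS_KEY_ID',
            secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
        ]]) {
            // Single quotes: the shell expands the variables, so Groovy never
            // interpolates the secret values into the pipeline script.
            sh 'packer build -var aws_access_key=${AWS_ACCESS_KEY_ID} -var aws_secret_key=${AWS_SECRET_ACCESS_KEY} example4.json'
        }
    }
}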

How to re-use a pre-process step in Jenkins/Groovy for each test

I'm using the following code to run our voter. Currently I have one target called Run Tests, which uses exactly the same steps as the other one (Lint); at the moment I duplicate the code, which I don't think is a good solution.
Is there a nice way to avoid this duplication and run the shared part only once, as a prerequisite process?
I need all the steps up to the cd into the project. The only difference is that in one target I run
go test ...
and in the second
go lint
All the steps before that are identical.
#!/usr/bin/env groovy
try {
    parallel(
        'Run Tests': {
            node {
                //————————Here we start
                checkout scm
                def dockerImage = 'docker.company:50001/crt/deg:0.1.3-09'
                setupPipelineEnvironment script: this,
                measureDuration(script: this, measurementName: 'build') {
                    executeDocker(dockerImage: dockerImage, dockerWorkspace: '/go/src') {
                        sh """
                            mkdir -p /go/src/github.com/ftr/myGoProj
                            cp -R $WORKSPACE/* /go/src/github.com/ftr/MyGoProj
                            cd /go/src/github.com/ftr/MyGoProj
                            //————————Here we finish and TEST
                            go test -v ./...
                        """
                    }
                }
            }
        },
        'Lint': {
            node {
                //————————Here we start
                checkout scm
                def dockerImage = 'docker.company:50001/crt/deg:0.1.3-09'
                setupPipelineEnvironment script: this,
                measureDuration(script: this, measurementName: 'build') {
                    executeDocker(dockerImage: dockerImage, dockerWorkspace: '/go/src') {
                        sh """
                            mkdir -p /go/src/github.com/ftr/myGoProj
                            cp -R $WORKSPACE/* /go/src/github.com/ftr/MyGoProj
                            cd /go/src/github.com/ftr/MyGoProj
                            //————————Here we finish and LINT
                            go lint
                        """
                    }
                }
            }
        }
    )
}
You can use a function and pass the Go arguments:
try {
    parallel(
        'Run Tests': {
            node {
                checkout scm
                runTestsInDocker('test -v ./...')
            }
        },
        'Lint': {
            node {
                checkout scm
                runTestsInDocker('lint')
            }
        }
    )
}

def runTestsInDocker(goArgs) {
    def dockerImage = 'docker.company:50001/crt/deg:0.1.3-09'
    setupPipelineEnvironment script: this,
    measureDuration(script: this, measurementName: 'build') {
        executeDocker(dockerImage: dockerImage, dockerWorkspace: '/go/src') {
            sh """
                mkdir -p /go/src/github.com/ftr/myGoProj
                cp -R $WORKSPACE/* /go/src/github.com/ftr/MyGoProj
                cd /go/src/github.com/ftr/MyGoProj
                go ${goArgs}
            """
        }
    }
}
Update
If some actions can be separated out of runTestsInDocker, they probably should be.
For example the setupPipelineEnvironment step. I don't know the exact logic, but maybe it can be run once before running the tests.
node {
    stage('setup') {
        setupPipelineEnvironment script: this
    }
    stage('Tests') {
        parallel(
            'Run Tests': {
                node {
                    checkout scm
                    runTestsInDocker('test -v ./...')
                }
            },
            'Lint': {
                node {
                    checkout scm
                    runTestsInDocker('lint')
                }
            }
        )
    }
}

def runTestsInDocker(goArgs) {
    def dockerImage = 'docker.company:50001/crt/deg:0.1.3-09'
    measureDuration(script: this, measurementName: 'build') {
        executeDocker(dockerImage: dockerImage, dockerWorkspace: '/go/src') {
            sh """
                mkdir -p /go/src/github.com/ftr/myGoProj
                cp -R $WORKSPACE/* /go/src/github.com/ftr/MyGoProj
                cd /go/src/github.com/ftr/MyGoProj
                go ${goArgs}
            """
        }
    }
}
Note
If you are running the parallel branches on remote agents, you must remember that setup performed on the master may not be available on a remote slave.
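One way to deal with that, sketched here under the assumption that the checked-out and prepared sources are all the parallel branches need, is to stash the workspace after the setup stage and unstash it on each remote agent:

node('master') {
    stage('setup') {
        checkout scm
        setupPipelineEnvironment script: this
        // Make the prepared workspace available to the agents used below.
        stash name: 'sources', includes: '**'
    }
}

parallel(
    'Run Tests': {
        node {
            unstash 'sources'
            runTestsInDocker('test -v ./...')
        }
    },
    'Lint': {
        node {
            unstash 'sources'
            runTestsInDocker('lint')
        }
    }
)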

How to keep a process running after the stage is finished in a declarative Jenkins pipeline

pipeline {
    agent none
    stages {
        stage('Server') {
            agent {
                node {
                    label "xxx"
                    customWorkspace "/home/xxx/server"
                }
            }
            steps {
                // start server
                sh 'node server.js &'
            }
        }
        stage('RunCase') {
            agent {
                node {
                    label 'clientServer'
                    customWorkspace "/home/xxx/CITest"
                }
            }
            steps {
                sh 'start test'
                sh 'run case here'
            }
        }
    }
}
I created the above Jenkins pipeline. What I want to do is:
1. Start the server on the server node.
2. Start the tests on the test node.
However, I found that the server process is closed when the second stage starts.
So how do I keep the server running until my second-stage testing work is finished? I tried to use &, but it is still not working. It seems Jenkins kills all the processes I started in the first stage.
One solution is to try to start the two stages in "parallel" mode. For more information see these two pages: parallel-declarative-blog and jenkins-pipeline-syntax. But be careful, because it is not guaranteed that the first stage starts before the second one, so you may need a waiting time before your tests. Here is an example Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Run Tests') {
            parallel {
                stage('Start Server') {
                    steps {
                        sh 'node server.js &'
                    }
                }
                stage('Run Cases') {
                    steps {
                        sh 'run case here'
                    }
                }
            }
        }
    }
}
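For the waiting time mentioned above, a rough sketch of the test branch (purely illustrative: it assumes both branches share an agent and that the server listens on port 3000, neither of which is stated in the question) could poll the server before running the cases:

stage('Run Cases') {
    steps {
        // Wait up to roughly 60 seconds for the server to answer before starting.
        sh 'for i in $(seq 1 30); do curl -sf http://localhost:3000 && break; sleep 2; done'
        sh 'run case here'
    }
}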
Another solution would be to start the Node server in the background. For this you can try different tools, like nohup or pm2.
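For the nohup route, the detail that usually matters is Jenkins' ProcessTreeKiller, which reaps processes spawned by a build; overriding JENKINS_NODE_COOKIE is the commonly used way to opt a process out of that. A minimal sketch for the first stage (server.js as in the question; the log path is arbitrary):

stage('Server') {
    steps {
        // dontKillMe tells the ProcessTreeKiller to leave this process alone,
        // so the server survives after the stage (and its agent) finishes.
        sh 'JENKINS_NODE_COOKIE=dontKillMe nohup node server.js > server.log 2>&1 &'
    }
}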
