helm test via Jenkins pipeline

I'm running a basic Groovy Jenkins pipeline-as-code for establishing a connection to a Kubernetes cluster. Below is the snippet that tries to connect to the cluster and list all the releases.
stage('Helm list') {
    steps {
        withCredentials([file(credentialsId: "kubeconfig-gke", variable: "kubeconfig")]) {
            helm list -a
        }
    }
}
I get the following error in the Jenkins console output:
groovy.lang.MissingPropertyException: No such property: list for class: groovy.lang.Binding
Possible solutions: class
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:270)
at org.kohsuke.groovy.sandbox.impl.Checker$6.call(Checker.java:289)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:293)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:269)

Jenkins evaluates the pipeline body as Groovy, so the bare helm list -a is parsed as Groovy code, which is why the sandbox complains about a missing property named list. Run it inside a shell step instead:
stage('Helm list') {
    steps {
        withCredentials([file(credentialsId: "kubeconfig-gke", variable: "kubeconfig")]) {
            sh """
                helm list -a
            """
        }
    }
}
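Wrapping the command in sh fixes the Groovy parse error, but helm still has to be pointed at the kubeconfig that withCredentials fetched. A minimal sketch, assuming the kubeconfig-gke credential from the question, that passes the injected temp-file path through the KUBECONFIG environment variable (the variable name KUBECONFIG_FILE is arbitrary):

```groovy
stage('Helm list') {
    steps {
        withCredentials([file(credentialsId: 'kubeconfig-gke', variable: 'KUBECONFIG_FILE')]) {
            // helm (and kubectl) read the cluster config from KUBECONFIG
            sh 'KUBECONFIG="$KUBECONFIG_FILE" helm list -a'
        }
    }
}
```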

Related

Jenkins function of shared library executed on wrong node

I have a simple shared library and want to call a function which creates a file.
My pipeline script looks like this:
@Library('jenkins-shared-library') _
pipeline {
    agent {
        node {
            label 'node1'
        }
    }
    stages {
        stage('test') {
            steps {
                script {
                    def host = sh(script: 'hostname', returnStdout: true).trim()
                    echo "Hostname is: ${host}"
                    def testlib = new TestLibrary(script: this)
                    testlib.createFile("/tmp/file1")
                }
            }
        }
    }
}
This pipeline job is triggered by another job that runs on the built-in master node.
The stage 'test' is correctly executed on 'node1'.
Problem: the file "/tmp/file1" is created on the Jenkins master instead of on "node1".
I also tried it without the shared library, loading a Groovy script directly in a step:
pipeline {
    agent {
        node {
            label 'node1'
        }
    }
    stages {
        stage('test') {
            steps {
                script {
                    def script = load "/path/to/script.groovy"
                    script.createFile("/tmp/file1")
                }
            }
        }
    }
}
This also creates the file on the master node instead of on "node1".
Is there no way of loading external libraries or classes and executing them on the node the stage is running on? I don't want to place all the code directly into the step.
OK, I found it myself: Groovy pipeline code, including shared libraries and scripts brought in via load, always executes on the controller (master). Only steps such as sh and bat run on the agent, so anything that must touch the node's filesystem has to go through one of those steps.
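The Groovy of a shared library always runs on the controller, but the file operation itself can still land on the agent if the library delegates it to a step such as sh, which executes wherever the enclosing node or stage runs. A hedged sketch of what the question's TestLibrary could look like under that approach (the script field is the pipeline's this, passed in as script: this):

```groovy
// src/TestLibrary.groovy (hypothetical shared-library class)
class TestLibrary implements Serializable {
    def script  // the pipeline script ('this'), which provides the steps

    // Runs on the current agent, because 'sh' executes on the stage's node
    def createFile(String path) {
        script.sh "touch '${path}'"
    }
}
```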

Nullpointer exception after running kubectl and helm command in Jenkinsfile

I am trying to run kubectl and helm commands in the deploy step of the Jenkinsfile, but I am getting a NullPointerException. Below is my code:
stage ('Create and Deploy to k8s Dev Environment') {
    //agent {label 'docker-maven-slave'}
    options {
        skipDefaultCheckout()
    }
    steps {
        withCredentials([string(credentialsId: K8S_DEV_SECRET_ID)]) {
            command """
                kubectl apply --server=https://10.0.0.0:443 --insecure-skip-tls-verify=false --namespace: "dev-ns" -f -
                helm template -f "portal-chart/deploy/values-dev.yaml" portal-chart
            """
        }
    }
}
Below are the logs:
[Pipeline] End of Pipeline
hudson.remoting.ProxyException: java.lang.NullPointerException
Caused: hudson.remoting.ProxyException: com.xxx.jenkins.pipeline.library.utils.dispatch.ShellCommandException: Exception calling shell command, 'kubectl apply ... ': null
at command.call(command.groovy:51)
I was able to resolve this. Below is the code:
stage ('Create and Deploy to k8s Dev Environment') {
    //agent {label 'docker-maven-slave'}
    options {
        skipDefaultCheckout()
    }
    steps {
        command """
            kubectl apply --server=https://10.0.0.0:443 --insecure-skip-tls-verify=false --namespace: "dev-ns" -f -
            helm template -f "portal-chart/deploy/values-dev.yaml" portal-chart
        """
    }
}
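For reference, command here is a custom step from an in-house library (the stack trace points at command.groovy:51 in com.xxx.jenkins.pipeline.library), so the NullPointerException most likely came from that wrapper, plausibly because K8S_DEV_SECRET_ID was unset, rather than from kubectl itself. With plain built-in steps, one reading of the intended stage (piping the rendered chart into kubectl; note that --namespace takes a space-separated value, not a colon) could be sketched as:

```groovy
stage('Create and Deploy to k8s Dev Environment') {
    options { skipDefaultCheckout() }
    steps {
        // Render the chart with helm and pipe the manifests into kubectl
        sh '''
            helm template -f portal-chart/deploy/values-dev.yaml portal-chart \
              | kubectl apply --server=https://10.0.0.0:443 \
                  --insecure-skip-tls-verify=false --namespace dev-ns -f -
        '''
    }
}
```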

How to configure jenkins pipeline with logstash plugin?

Use case: I want to send the Jenkins job console log to Elasticsearch and from there to Kibana, so that I can visualise the data.
I am using the Logstash plugin to achieve this. For freestyle jobs the Logstash plugin configuration works fine, but for pipeline jobs I get all the required data (build number, job name, build duration, and so on) except the build result, i.e. success or failure.
I tried in two ways:
1.
stage('send to ES') {
    logstashSend failBuild: true, maxLines: -1
}
2.
timestamps {
    logstash {
        node() {
            sh '''
                echo 'Hello, World!'
            '''
            try {
                stage('GitSCM') {
                    git url: 'github repo.git'
                }
                stage('Initialize') {
                    jdk = tool name: 'jdk'
                    env.JAVA_HOME = "${jdk}"
                    echo "jdk installation path is: ${jdk}"
                    sh "${jdk}/bin/java -version"
                    sh '$JAVA_HOME/bin/java -version'
                    def mvnHome = tool 'mvn'
                }
                stage('Build Stage') {
                    def mvnHome = tool 'mvn'
                    sh "${mvnHome}/bin/mvn -B verify"
                }
                currentBuild.result = 'SUCCESS'
            } catch (Exception err) {
                currentBuild.result = 'FAILURE'
            }
        }
    }
}
But in both ways I am not getting the build result (success or failure) into Elasticsearch or Kibana.
Can someone help?
I didn't find a clean way to do it; my solution was to add these lines at the end of the Jenkinsfile:
echo "Current result: ${currentBuild.currentResult}"
logstashSend failBuild: true, maxLines: 3
In my case I don't need to send all the console logs, only one log entry with the result per job.
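Another option is to call logstashSend from a post block, which runs after the build result has been resolved, so currentBuild.currentResult is final by the time the event is shipped. A sketch under that assumption:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'echo Hello, World!'
            }
        }
    }
    post {
        // 'always' runs whether the build succeeded or failed,
        // and currentResult is settled at this point
        always {
            echo "Current result: ${currentBuild.currentResult}"
            logstashSend failBuild: false, maxLines: 3
        }
    }
}
```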

How can I use vSphere Cloud Plugin inside Jenkins pipeline code?

I have a setup at work where the vSphere hosts are manually restarted before the execution of a specific Jenkins job. As a noob in the office, I automated this process by adding an extra build step that restarts the VMs using the vSphere Cloud Plugin (https://wiki.jenkins-ci.org/display/JENKINS/vSphere+Cloud+Plugin).
I would now like to implement this as pipeline code; please advise.
I have already checked that the plugin is pipeline compatible.
I currently trigger the vSphere host restart from the pipeline by remotely triggering a job configured with the vSphere Cloud Plugin.
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                script {
                    sh "curl -v 'http://someserver.com/job/Vivin/job/executor_configurator/buildWithParameters?Host=build-114&token=bonkers'"
                }
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(8)
                        }
                    }
                }
            }
        }
    }
}
I would like to call the vSphere Cloud Plugin directly from the pipeline code itself; please help me integrate it.
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                vSphere cloud plugin code that is requested
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(8)
                        }
                    }
                }
            }
        }
    }
}
Well, I found the solution myself with the help of the 'Pipeline Syntax' feature available in the menu of any Jenkins pipeline job.
The Pipeline Syntax page can generate the step syntax for all the parameters exposed through the API of the installed plugins of a Jenkins server, so you can build the snippet you need from it:
http://<jenkins server url>/job/<pipeline job name>/pipeline-syntax/
My Jenkinsfile (pipeline) now looks like this:
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                vSphere buildStep: [$class: 'PowerOff', evenIfSuspended: false, ignoreIfNotExists: false, shutdownGracefully: true, vm: 'brewery-133'], serverName: 'vspherecentral'
                vSphere buildStep: [$class: 'PowerOn', timeoutInSeconds: 180, vm: 'brewery-133'], serverName: 'vspherecentral'
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(1)
                        }
                    }
                }
            }
        }
    }
}

How to trigger multiple runs in a single Jenkins pipeline job?

I have a pipeline job which runs with the below pipeline Groovy script:
pipeline {
    agent none
    parameters {
        string(name: 'Unique_Number', defaultValue: '', description: 'Enter Unique Number')
    }
    stages {
        stage('Build') {
            agent { node { label 'Build' } }
            steps {
                script {
                    sh './build.sh'
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    sh './deploy.sh'
                }
            }
        }
        stage('Test') {
            agent { node { label 'Test' } }
            steps {
                script {
                    sh './test.sh'
                }
            }
        }
    }
}
I trigger this job multiple times with a different unique ID number as the input parameter, so as a result I have multiple runs/builds of this job sitting at different stages.
Given that, I need to promote a set of runs/builds to the next stage (i.e. from Build to Deploy, or from Deploy to Test) as one single build, instead of promoting each run/build individually. Is there any possibility?
I was also trying to do the same thing and found no relevant answers; maybe this helps someone.
The following reads a file that contains Jenkins job names and runs them iteratively from one single job.
Please adapt the code below to your Jenkins setup.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    git branch: 'Your Branch name', credentialsId: 'Your credentials', url: ' Your BitBucket Repo URL '
                    // Read a file from the workspace that contains the Jenkins job names
                    def filePath = readFile "${WORKSPACE}/ Your File Location"
                    // Read the file line by line
                    def lines = filePath.readLines()
                    // Iterate and run the Jenkins jobs one by one
                    for (line in lines) {
                        build(job: "$line/branchName",
                            parameters: [
                                string(name: 'vertical', value: "${params.vertical}"),
                                string(name: 'environment', value: "${params.environment}"),
                                string(name: 'branch', value: "${params.aerdevops_branch}"),
                                string(name: 'project', value: "${params.host_project}")
                            ]
                        )
                    }
                }
            }
        }
    }
}
You can start multiple jobs from one pipeline if you run something like:
build job: 'One', wait: false
build job: 'Two', wait: false
Your main job starts the child pipelines, and the child pipelines run in parallel.
You can read the Pipeline Build Step documentation for more information.
Also, you can read about parallel runs in declarative pipelines; there are plenty of examples of parallel execution available.
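If the parent should also wait for the children and fail when any of them fails, the same build steps can be wrapped in a parallel block. A minimal sketch (the job names One and Two are placeholders):

```groovy
// Scripted snippet: run two downstream jobs in parallel and wait for both
parallel(
    'One': { build job: 'One', wait: true },
    'Two': { build job: 'Two', wait: true }
)
```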
