How to resolve "please provide compiled classes with sonar.java.binaries property"? - sonarqube

I tried all the possible solutions posted by other people, but I am still struggling to resolve this issue. I believe that in my case it has something to do with the agents. I will post two code snippets: the first one works but the second one doesn't, even though both call the same Groovy methods.
The first snippet (code 1) works fine and the pipeline executes successfully:
pipeline {
    agent { label 'docker-kitchensink-slave' }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        // Build and Unit Tests
        stage('Build and Unit Tests') {
            steps {
                script {
                    if (buildType.buildSystem == 'npm') {
                        buildNpm(configuration)
                    } else {
                        build(configuration)
                    }
                }
            }
        }
        // SonarQube Analysis
        stage('SonarQube analysis') {
            steps {
                script {
                    if (buildType.buildSystem != 'npm') {
                        sonarQubeGating(configuration)
                    }
                }
            }
        }
        // Build Docker Image and Push to Artifactory
        stage('Build Docker Image and Push to Artifactory') {
            steps {
                artifactoryImagePush(configuration)
            }
        }
        // Approve DEV Deployment
        stage('Approve Dev Deployment') {
            agent none
            when {
                anyOf {
                    expression {
                        return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                    }
                }
            }
            steps {
                approveDeployment()
            }
        }
    }
}
The second snippet (code 2) doesn't work:
pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                checkout scm
            }
        }
        // Build and Unit Tests
        stage('Build and Unit Tests') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                script {
                    if (buildType.buildSystem == 'npm') {
                        buildNpm(configuration)
                    } else {
                        build(configuration)
                    }
                }
            }
        }
        // SonarQube Analysis
        stage('SonarQube analysis') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                script {
                    if (buildType.buildSystem != 'npm') {
                        sonarQubeGating(configuration)
                    }
                }
            }
        }
        // Build Docker Image and Push to Artifactory
        stage('Build Docker Image and Push to Artifactory') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                unstash 'artifacts'
                unstash 'artifacts'
                artifactoryImagePush(configuration)
            }
        }
        // Approve DEV Deployment
        stage('Approve Dev Deployment') {
            agent none
            when {
                anyOf {
                    expression {
                        return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                    }
                }
            }
            steps {
                approveDeployment()
            }
        }
    }
}
I get the following error:
[ERROR] Failed to execute goal org.sonarsource.scanner.maven:sonar-maven-plugin:3.6.0.1398:sonar (default-cli) on project xyz-service: Your project contains .java files, please provide compiled classes with sonar.java.binaries property, or exclude them from the analysis with sonar.exclusions property. -> [Help 1]
Below is my sonar code:
void call(Map optionParams = [:]) {
    script {
        try {
            String jacocoPath = optionParams.get('buildSystem').equals('gradle') ?
                    'build/JacocoReport/test/jacocoTestReport.xml' : 'target/site/jacoco/jacoco.xml'
            glSonarMavenScan gitUserCredentialsId: 'sonar-key', javaVersionForSonar: '11.0', mavenVersion: '3.5.4',
                    additionalProps: ['sonar.coverage.jacoco.xmlReportPaths' : jacocoPath]
        } catch (Exception e) {
            echo "The following Sonar exception thrown ${e}"
            // Stop build here, unless 'requireSonar' is set to False (String or Boolean)
            if (!optionParams.get('requireSonar').toString().equalsIgnoreCase('false')) {
                throw e
            }
        }
    }
}
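(For reference, the property named in the error can also be passed through the same additionalProps map. This is only a sketch, assuming a standard Maven layout where compiled classes land in target/classes and assuming glSonarMavenScan forwards additionalProps to the scanner unchanged; it only helps if the compiled classes actually exist in the workspace the scan runs in:)

// Hypothetical variant of the call above: point the scanner at the compiled classes explicitly.
// 'target/classes' assumes a standard Maven build; a Gradle build would use a different path.
glSonarMavenScan gitUserCredentialsId: 'sonar-key', javaVersionForSonar: '11.0', mavenVersion: '3.5.4',
        additionalProps: ['sonar.coverage.jacoco.xmlReportPaths': jacocoPath,
                          'sonar.java.binaries'                 : 'target/classes']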

I'm a little confused about what you're trying to achieve here. You are showing the code that works. Are you trying to understand WHY the first block works compared to the second, or are you just trying to get it working? If the latter, you are clearly already done.
If the former: I'm only familiar with scripted pipeline, not declarative pipeline, but it seems possible to me that if there is more than one build node that satisfies that label, then each of those agent lines could select a build node, and each one could potentially select a different one. If the build step executes on a different node than the one the SonarQube scan runs on, you will find yourself in a workspace without any compiled classes.
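If that is what is happening here, there are two ways out: either give the whole pipeline a single agent as in code 1, or hand the build output to the analysis stage explicitly. Below is a minimal sketch of the second option, assuming a Maven layout (output under target/) and using a new, hypothetical stash name ('classes-for-sonar' is not the 'artifacts' stash your shared library already produces):

// Build and Unit Tests
stage('Build and Unit Tests') {
    agent { label 'docker-kitchensink-slave' }
    steps {
        script {
            if (buildType.buildSystem == 'npm') {
                buildNpm(configuration)
            } else {
                build(configuration)
            }
        }
        // Keep the compiled classes and coverage output for a later stage
        // that may run on a different node with a fresh workspace.
        stash name: 'classes-for-sonar', includes: 'target/**'
    }
}
// SonarQube Analysis
stage('SonarQube analysis') {
    agent { label 'docker-kitchensink-slave' }
    steps {
        // Restore the build output into this stage's workspace before scanning.
        unstash 'classes-for-sonar'
        script {
            if (buildType.buildSystem != 'npm') {
                sonarQubeGating(configuration)
            }
        }
    }
}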

Related

Parallel pipeline with 2 inline stages

I am playing with Jenkins pipelines, starting from https://www.jenkins.io/doc/pipeline/examples/#parallel-multiple-nodes
Simple two parallel steps (OK)
I made a first test pipeline this way:
pipeline {
    stages {
        stage('Build') {
            steps {
                script {
                    def labels = ['precise', 'trusty'] // labels for Jenkins node types we will build on
                    def builders = [:]
                    for (x in labels) {
                        def label = x // Need to bind the label variable before the closure - can't do 'for (label in labels)'
                        // Create a map to pass in to the 'parallel' step so we can fire all the builds at once
                        builders[label] = {
                            node('JenkinsNode') {
                                sh script: 'echo build', label: 'Build on $env.NODE_NAME'
                            }
                        }
                    }
                    parallel builders
                }
            }
        }
    }
}
It resulted in the following expected diagram in Blue Ocean view:
Simple two parallel steps with two sub-steps each (KO)
Attempt#1
Then I tried to split each parallel step into two inline stages (to simulate build and test, for example):
pipeline {
    stages {
        stage('Build') {
            steps {
                script {
                    def labels = ['precise', 'trusty'] // labels for Jenkins node types we will build on
                    def builders = [:]
                    for (x in labels) {
                        def label = x // Need to bind the label variable before the closure - can't do 'for (label in labels)'
                        // Create a map to pass in to the 'parallel' step so we can fire all the builds at once
                        builders[label] = {
                            node('JenkinsNode') {
                                stage("build") {
                                    sh script: 'echo build', label: 'Build on $env.NODE_NAME'
                                }
                                stage("test") {
                                    sh script: 'echo run unit tests', label: 'Run unit tests on $env.NODE_NAME'
                                }
                            }
                        }
                    }
                    parallel builders
                }
            }
        }
    }
}
The Jenkins logs show that both the build and test stages run for each parallel step, but the Blue Ocean view only shows the build stage:
I would expect something like:
I'm not very clear on the boundary between declarative and scripted pipelines, and I suspect a misunderstanding around this.
Attempt#2
Following a suggestion in the comments, I slightly changed the code to give the sub-stages unique names (build1, test1, build2, test2), and it does not change the diagram. I still only get the build steps.
Here are the Jenkins logs in this case:
Question: Is the pipeline invalid (leading to only "build" sub-steps instead of build + test sub-steps), or is it a limitation of Blue Ocean (1.25.3)?
When combining declarative and scripted syntax, things become a bit tricky.
In this specific case, to make it work as you expect, you must wrap each parallel branch's code in an encapsulating stage that has the same name as the parallel branch.
This causes Blue Ocean to display the inner stages as requested.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    def labels = ['precise', 'trusty']
                    def builders = [:]
                    for (x in labels) {
                        def label = x
                        builders[label] = {
                            stage(label) { // Encapsulating stage with same name as parallel branch
                                node('JenkinsNode') {
                                    stage("build") {
                                        sh script: 'echo build', label: 'Build on $env.NODE_NAME'
                                    }
                                    stage("test") {
                                        sh script: 'echo run unit tests', label: 'Run unit tests on $env.NODE_NAME'
                                    }
                                }
                            }
                        }
                    }
                    parallel builders
                }
            }
        }
    }
}
Or in a more Groovy way:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    def labels = ['precise', 'trusty']
                    def builders = labels.collectEntries {
                        ["${it}" : {
                            stage(it) { // Encapsulating stage with same name as parallel branch
                                node('JenkinsNode') {
                                    stage("build") {
                                        sh script: 'echo build', label: 'Build on $env.NODE_NAME'
                                    }
                                    stage("test") {
                                        sh script: 'echo run unit tests', label: 'Run unit tests on $env.NODE_NAME'
                                    }
                                }
                            }
                        }]
                    }
                    parallel builders
                }
            }
        }
    }
}
The result:

What is the correct way to write a condition using when in a Jenkinsfile?

I am trying to write a when statement in a single stage block in a Jenkinsfile. I have tried writing it as below, and I know it's not correct: it's a declarative pipeline script, and a stage expects only a single when block. How can I combine both of my when blocks into a single when?
stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when {
            anyOf {
                expression {
                    return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                }
            }
        }
        when {
            expression {
                input message: 'Deploy test?'
                return true
            }
            beforeAgent true
        }
        steps {
            approveDeployment()
        }
    }
}
Write a function with the conditions outside the pipeline scope and use that function as the condition.
def checkcondition() {
    your_condition
}

stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when { expression { return checkcondition() } }
        steps {
            approveDeployment()
        }
    }
}
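If you would rather keep everything in one when block instead, the two conditions from the question can be combined under allOf. A rough sketch (it keeps the input-inside-expression trick exactly as in the question, and beforeAgent true so the condition is evaluated before an agent is grabbed):

stage('Approve Dev Deployment') {
    agent { label 'docker-kitchensink-slave' }
    when {
        beforeAgent true
        allOf {
            // Branch gate from the first when block in the question
            expression {
                return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
            }
            // Manual approval gate from the second when block in the question
            expression {
                input message: 'Deploy test?'
                return true
            }
        }
    }
    steps {
        approveDeployment()
    }
}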

Jenkins pipelines with parallel and different containers

I am already running Jenkins pipelines with parallel stages, based on the example from: Is it possible to create parallel Jenkins Declarative Pipeline stages in a loop?
I want to run each job in a different, isolated container; the agent label should be the same for all of them. I tried a few options, and all of them ended up with errors. I think I need to use both declarative and scripted syntax, but I am not sure how.
Things I tried:
def generateTerraformStage(env) {
    return {
        agent { label 'local_terraform' }
        stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
            echo "${env}"
            sleep 30
        }
    }
}

stage('parallel stages') {
    agent { label 'local_terraform' }
    steps {
        script {
            parallel parallelStagesMapEnvironment
        }
    }
}
One of the errors I got during testing:
"java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps" and "java.lang.IllegalArgumentException: Expected named arguments but got org.jenkinsci.plugins.workflow.cps.CpsClosure2#560f3533"
Dynamic parallel stages can only be created with scripted pipeline syntax; the built-in declarative pipeline API (agent, options, when, etc.) is not available there.
I don't see any indication that you really need dynamic stages (e.g. based on a value returned by a 3rd-party service), so I prepared two solutions:
dynamic parallel stages - stages are generated based on something computed at runtime
static parallel stages - you know all stages up front (the when block can be used to disable the ones that are not needed - e.g. based on passed-in parameters)
pipeline {
    // ...
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    // params.ENVS == ['envA', 'envB', 'envC']
                    def values = params.ENVS.split(',')
                    def stages = [:]
                    for (def value in values) {
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages
                }
            }
        }
        stage('static parallel stages') {
            parallel {
                stage('envA') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envA') }
                    }
                    steps {
                        terraformStageLogic 'envA'
                    }
                }
                stage('envB') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envB') }
                    }
                    steps {
                        terraformStageLogic 'envB'
                    }
                }
                stage('envC') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envC') }
                    }
                    steps {
                        terraformStageLogic 'envC'
                    }
                }
                // ...
            }
        }
    }
}

Closure<Void> generateTerraformStage(env) {
    return {
        node('local_terraform') {
            stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
                echo "${env}"
                sleep 30
            }
        }
    }
}

void terraformStageLogic(env) {
    echo "${env}"
    sleep 30
}
When the stage responsible for generating or executing other stages (dynamic parallel stages and static parallel stages) doesn't use the workspace, you don't need to allocate a node to it; doing so would be a waste of resources.
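In other words, the orchestrating stage can run without any agent at all. A small sketch of how that looks when the whole pipeline starts from agent none and only the generated closures call node():

pipeline {
    agent none // no executor reserved at the top level
    stages {
        stage('dynamic parallel stages') {
            // No agent here either: this stage only builds the map and calls parallel,
            // so it needs no workspace of its own.
            steps {
                script {
                    def stages = [:]
                    for (def value in params.ENVS.split(',')) {
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages // each closure allocates node('local_terraform') itself
                }
            }
        }
    }
}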

Jenkins cannot download Docker

I use Jenkins to check out source code, build an image, and push the image.
In Jenkins I configured the Docker tool like this:
And this is my Jenkins pipeline script:
pipeline {
    agent none
    environment {
        registry = ""
        registryCredential = ''
        imageName = 'imageName'
        dockerImage = ''
        dockerHome = tool 'docker_latest'
        PATH = "$dockerHome/bin:$PATH"
    }
    stages {
        stage('Prepare') {
            agent {
                label "${config.job.agent}"
            }
            steps {
                echo "CheckOut"
                script {
                    checkout
                }
            }
        }
        stage('Building image') {
            steps {
                dir('jenkins-slave-savi') {
                    script {
                        dockerImage = docker.build imageName + ":$BUILD_NUMBER"
                    }
                }
            }
        }
        stage('Deploy Image') {
            steps {
                script {
                    docker.withRegistry(registry) {
                        dockerImage.push()
                    }
                }
            }
        }
    }
}
But when I run this script, I get this error:
ERROR: Failed to download pre-1.11.x URL https://get.docker.com/builds/Linux/x86_64/docker-latest from agent: java.net.ConnectException: Connection timed out (Connection timed out)
ERROR: Failed to download https://get.docker.com/builds/Linux/x86_64/docker-latest.tgz from agent; will retry from master
Any solutions?
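This question has no answer in the thread, so only a hedged pointer: the download is triggered by the tool 'docker_latest' line, which asks Jenkins' Docker tool installer to fetch a tarball from get.docker.com, and that fetch is what times out. If the agent already has a Docker client installed, one option is to drop the tool/PATH lines and let the docker.* steps use the client found on the agent, roughly like this (the 'docker' label is only a placeholder for an agent with Docker pre-installed):

pipeline {
    // Hypothetical label: any agent that already has a Docker client installed.
    agent { label 'docker' }
    environment {
        registry = ''              // registry URL and credentials left as placeholders, as in the question
        registryCredential = ''
        imageName = 'imageName'
        // Note: no  dockerHome = tool 'docker_latest'  and no PATH override here;
        // docker.build / docker.withRegistry shell out to the docker binary on the agent's PATH.
    }
    stages {
        stage('Building image') {
            steps {
                dir('jenkins-slave-savi') {
                    script {
                        def image = docker.build("${imageName}:${BUILD_NUMBER}")
                        docker.withRegistry(registry, registryCredential) {
                            image.push()
                        }
                    }
                }
            }
        }
    }
}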

The skipDefaultCheckout true option is skipping all the stages and going straight to the post actions

I need to check out the code into the $GOPATH/src/dev-DIR folder, but with the skipDefaultCheckout true option the checkout scm stage and the other stages are not executing; the post actions execute directly. Please help me see where I am going wrong.
pipeline {
    agent { node { label 'project_a' } }
    options {
        skipDefaultCheckout true
    }
    environment {
        PATH = "$PATH:/opt/jenkins/:/usr/local/go/bin/"
        GIT_REPO = get_gitrepo()
        GOPATH = "${env.WORKSPACE}"
        PROJECT_WORKSPACE = "${env.WORKSPACE}/src/dev-DIR"
    }
    stages {
        stage('checkout scm') {
            steps {
                dir("${GOPATH}/src/dev-DIR") {
                    checkout scm
                }
            }
        }
        stage('Install Prerequisites') {
            // go get -t
        }
    }
    post {
        //some actions
    }
}
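This question also has no answer in the thread, and the snippet is clearly elided ('// go get -t' and '//some actions' are placeholders). For reference, in declarative syntax a stage needs a steps (or stages/parallel) block and a post section needs at least one condition block such as always, so a filled-in version of the elided parts would look roughly like this (the go get command and the echo are illustrations only):

stage('Install Prerequisites') {
    steps {
        dir("${PROJECT_WORKSPACE}") {
            // Illustrative only: the original stage contained just the comment "go get -t".
            sh 'go get -t ./...'
        }
    }
}

and for the post section:

post {
    always {
        // Illustrative only: the original post section contained just "//some actions".
        echo 'post actions'
    }
}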