clean install on merge and clean install javadoc:javadoc pmd:cpd pmd:pmd checkstyle:checkstyle spotbugs:spotbugs on schedule - maven

I have a simple Java project in GitLab. I would like to achieve using GitLab CI:
a mvn clean install on merge
a more complex command mvn clean install site -U javadoc:javadoc pmd:cpd pmd:pmd checkstyle:checkstyle spotbugs:spotbugs sonar:sonar once a day on a scheduled basis
In order to do so, I wrote this .gitlab-ci.yml file:
build-on-merge:
  image: myimage
  stage: build
  tags:
    - type/docker
  services:
    - docker:dind
  script:
    - mvn clean install
  artifacts:
    paths:
      - target/*.*

once-per-day-at-midnight:
  image: myimage
  stage: build
  tags:
    - type/docker
  services:
    - docker:dind
  script:
    - mvn clean install site pmd:cpd pmd:pmd javadoc:javadoc checkstyle:checkstyle spotbugs:spotbugs
  rules:
    - if: $RUN_TESTS == 'YES'
In the file, you first see the simple build; this one works fine and gets triggered on merge.
However, the more complex once-a-day command does not run at all.
What did I do wrong? How can I achieve those two pipelines?
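A minimal sketch of one way to get the two pipelines, assuming job selection via GitLab's predefined $CI_PIPELINE_SOURCE variable (job names and Maven commands are taken from the question; tags/services omitted for brevity, and the rule values are worth verifying against your GitLab version):

```yaml
build-on-merge:
  image: myimage
  stage: build
  script:
    - mvn clean install
  rules:
    # run for merge request pipelines and normal branch pushes
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
    - if: $CI_PIPELINE_SOURCE == "push"

once-per-day-at-midnight:
  image: myimage
  stage: build
  script:
    - mvn clean install site -U javadoc:javadoc pmd:cpd pmd:pmd checkstyle:checkstyle spotbugs:spotbugs sonar:sonar
  rules:
    # run only in pipelines started by a pipeline schedule
    - if: $CI_PIPELINE_SOURCE == "schedule"
```

Note that the original rule $RUN_TESTS == 'YES' only fires if the pipeline schedule itself defines a RUN_TESTS variable set to YES; checking $CI_PIPELINE_SOURCE avoids that extra configuration.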

Related

How do I always force a rerun of a job defined as a dependency (Gitlab CI/CD)

My problem is basically this:
I have a build job and a deploy job in my gitlab-ci.yml.
build:
  extends: .node_base
  artifacts:
    paths:
      - artifact_folder
  stage: deploy
  script:
    - npm start

deploy:
  tags:
    - linux-docker
  stage: deploy
  when: manual
  image: registry.gitlab.com/gitlab-org/cloud-deploy/aws-base:latest
  script:
    - aws --endpoint-url $AWS_HOST s3 sync artifact_folder/ s3://$AWS_S3_BUCKET --delete --acl public-read
  dependencies:
    - build
The build job downloads files from an external location and saves them in an artifact for my deploy job to use.
The deploy takes the files from the build job artifact and uploads them to an s3 bucket.
So far so good. The problem is that every time I want to deploy new changes, I first have to re-run the build job to get the updated files from the external location before I re-run the deploy job.
It's not a big issue, but I would like, if possible, to have only one job that does both the build step and the deploy step.
My first idea was simply to run the - npm start from the build job as a before_script in the deploy job. However, I am limited by the infrastructure set up by DevOps at the moment, which means the build job runs on an environment where npm is installed, and the deploy job runs on an environment where npm is not installed.
Is there any way I can run these two jobs separately, but somehow also only need one button in GitLab to start both of these scripts?
Or perhaps force the build job to always re-run before the deploy job runs, or vice versa. And disable the deploy job from being able to run independently of the build job?
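One pattern that may fit (a sketch, assuming a GitLab version that supports needs; job details abbreviated from the question): make build the only manual job and let deploy start automatically once build succeeds. Pressing the single play button on build then runs both, and deploy never runs without a fresh build artifact from the same pipeline:

```yaml
build:
  stage: build
  when: manual          # the one button that starts the whole chain
  allow_failure: false  # keeps deploy blocked until build has actually succeeded
  script:
    - npm start
  artifacts:
    paths:
      - artifact_folder

deploy:
  stage: deploy
  needs: [build]        # waits for build and consumes its artifact
  script:
    - aws --endpoint-url $AWS_HOST s3 sync artifact_folder/ s3://$AWS_S3_BUCKET --delete --acl public-read
```

The two jobs still run on their separate runners (npm vs. aws environments); only the triggering is chained.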

Running github actions for `uses` in another directory

My Maven project is in the ./java directory. I want to run the Maven tests in ./java, but I got the following error:
The goal you specified requires a project to execute but there is no POM in this directory (/github/workspace). Please verify you invoked Maven from the correct directory. -> [Help 1]
Here is my workflow:
# This is a basic workflow to help you get started with Actions
name: CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:

jobs:
  build:
    defaults:
      run:
        working-directory: java
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run maven test
        uses: xlui/action-maven-cli@master
        with:
          lifecycle: 'clean package test'
It seems that working-directory has no effect here. How can I fix it?
working-directory can only be applied to run: steps; it has no effect on uses: steps (actions).
The action xlui/action-maven-cli@master currently doesn't accept a path input telling it where to execute the Maven commands.
You could either:
- use another (similar) action from the GitHub Marketplace which allows specifying a directory path before executing the Maven commands,
- open a PR to update the xlui/action-maven-cli action, adding a path input (or env variable) used to perform a cd before executing the Maven commands, or
- install Maven directly in your workflow within a step (for example with a setup-maven action), then run the Maven command in another run: | step (without using the action), with a cd ./java before mvn clean package test.
Something like this (for the 3rd option):
steps:
  - uses: actions/checkout@v2
  - name: Install maven
    run: |
      ...
  - name: Run maven command
    run: |
      cd ./java
      mvn clean package test

How to merge artifacts across jobs for the same stage in Gitlab CI?

In GitLab CI, artifacts are segregated by the job that generated them, so when downloading you can only do it on a per-job basis.
Is there a way to download all the artifacts, or pass on the artifacts to some other stage and upload from there? Basically some way to merge all the artifacts of a stage.
A possible scenario where it can be needed: Let's say in a stage deploy, I am deploying my project on 10 different servers, using 10 different parallel jobs. Each of these generates some artifacts. However, there is no way to download them all from the UI.
So does anyone know of a workaround? I am not looking for API based solution, but instead UI based or editing the CI yaml file to make it work.
You can create a "final" (package) stage in your pipeline which combines all the artifacts together, using the artifacts syntax.
For example:
stages:
  - build
  - package

.artifacts_template:
  artifacts:
    name: linux-artifact
    paths:
      - "*.txt"
    expire_in: 5 minutes

build:linux-1:
  extends: .artifacts_template
  stage: build
  script:
    - touch hello-world-linux-1.txt

build:linux-2:
  extends: .artifacts_template
  stage: build
  script:
    - touch hello-world-linux-2.txt

build:linux-3:
  extends: .artifacts_template
  stage: build
  script:
    - touch hello-world-linux-3.txt

package:
  stage: package
  script:
    - echo "packaging everything here"
  needs:
    - build:linux-1
    - build:linux-2
    - build:linux-3
  artifacts:
    name: all-artifacts
    paths:
      - "*.txt"
    expire_in: 1 month

Break Push in GitLab based on SonarQube Analysis Result

I have an application in Spring Boot which uses Gradle to build the code.
I have set up https://github.com/gabrie-allaigre/sonar-gitlab-plugin on SonarQube and have integrated GitLab CI
to analyse the code on every push/commit. What I want to achieve is to break the push/commit if the analysis fails.
Below is my .gitlab-ci.yml
image: XXXXXX:oraclejdk:1.8.0_121

before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle

sonarqube_master_job:
  stage: test
  only:
    - master
    - release2.0
  script:
    - ./gradlew assemble
    - ./gradlew -x test sonarqube -Dsonar.host.url=http://sonarqube.XXX.XXX.XXX:9000/sonarqube -Dsonar.login=xxxxxxxxxxxxxxxxxxxx

sonarqube_preview_feature_job:
  stage: test
  only:
    - /^feature\/*/
    - development
  script:
    - git checkout $CI_COMMIT_REF_NAME
    - git merge --no-commit --no-ff
    - ./gradlew assemble
    - ./gradlew -x test sonarqube -Dsonar.host.url=http://XXXX.XXXXX.com:9000/sonarqube -Dsonar.login=xxxxxxxxxxxxxxxxxxxxx -Dsonar.analysis.mode=preview -Dsonar.gitlab.commit_sha=$CI_COMMIT_REF -Dsonar.gitlab.ref_name=$CI_COMMIT_REF_NAME -Dsonar.gitlab.project_id=$CI_PROJECT_ID --stacktrace
How do I make sure the push fails if the analysis fails? Do I need to use webhooks? Is there a sample CI file?
@jibsonline, you can refer to my answer provided in the link below.
However, the script only covers how to break the build on the Sonar analysis and display the results.
How to integrate Sonar Quality Gates with Gitlab-CI
Since GitLab triggers the build once the changes are pushed, it is not advisable to set up an automated tool that reverts code changes on your behalf. Instead, whenever a build fails, script your pipeline (via job dependencies) so that the code is not deployed; since the code is not deployed, your environment will not be affected. Also, set up an email notification for failed builds.
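Worth noting: CI runs after the push has already been accepted, so a pipeline can fail the build but cannot reject the push itself (that would require a server-side hook). On recent SonarQube versions, one way to fail the CI job when the quality gate fails is the sonar.qualitygate.wait analysis parameter. A sketch (the host URL is a placeholder; check that your SonarQube version supports this parameter):

```yaml
sonarqube_master_job:
  stage: test
  script:
    - ./gradlew assemble
    # the scanner polls the server for the quality gate result and
    # exits non-zero if the gate fails, which fails this CI job
    - ./gradlew -x test sonarqube -Dsonar.host.url=http://sonarqube.example.com:9000 -Dsonar.qualitygate.wait=true
```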

Build docker image including version with bitbucket pipelines

I'm pretty new to Bitbucket Pipelines and I encountered a problem. I'm creating a pipeline to deploy a new version of our Spring Boot application (which runs in a Kubernetes cluster) to our test environment. The problem I encountered is the versioning of our docker build. Our versioning is set up as the following:
alpha_0.1
alpha_0.2
beta_1.0
gamma_1.0
gamma_1.1
So every minor update/bugfix increases the build number by 0.1, a major update increases the version by 1.0, and every major update gets a new version name.
Currently I have the following setup:
image: java:8

options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          caches:
            - gradle
          script:
            - ./gradlew test
            - ./gradlew build
            - docker build -t <application_name>/<version_name>_<version_number>

What is the best way to include the version_name and the version_number in the Bitbucket pipeline? Until now we ran a Ruby script which allowed user input for the version numbering, but Bitbucket Pipelines is not interactive.
Assuming that alpha_0.1 etc. are tags and that the pipeline runs if a commit is tagged, you can get the tag for the current commit like this:
TAG=$(git tag --contains $BITBUCKET_COMMIT)
You can then use your favorite language or command-line tool to create the <version_name> and <version_number> from the tag you got. It may make sense to export the tag as a shell variable to be able to use it in a script.
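For instance, plain shell parameter expansion can split the two parts at the underscore. A sketch using a hypothetical example tag (in the pipeline, TAG would come from the git tag command above):

```shell
# Hypothetical example value; in the pipeline this would be:
# TAG=$(git tag --contains $BITBUCKET_COMMIT)
TAG="beta_1.0"

VERSION_NAME="${TAG%_*}"     # remove "_" and everything after it -> "beta"
VERSION_NUMBER="${TAG#*_}"   # remove everything up to the "_"    -> "1.0"

echo "${VERSION_NAME} ${VERSION_NUMBER}"
```

These variables could then feed the Docker tag, e.g. docker build -t "<application_name>/${VERSION_NAME}_${VERSION_NUMBER}" . in the step's script.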
This is one of the shippable.yml files I have, feel free to adapt it to Atlassian's pipelines.yml and Gradle:
language: java

jdk:
  - oraclejdk8

branches:
  only:
    - master

...

build:
  ci:
    # Generates build number
    - BUILD_NUMBER=`git log --oneline | wc -l`
    - echo "Build number':' ${BUILD_NUMBER}"
    # Sets version
    - mvn versions:set -DnewVersion=1.0.${BUILD_NUMBER}
    # Builds and pushes to Docker Hub
    - mvn package
    - docker login -u ${DOCKERHUB_USERNAME} -p ${DOCKERHUB_PASSWD} --email ${DOCKERHUB_EMAIL} https://index.docker.io/v1/
    - mvn -X docker:build -Dpush.image=true
My project's version (in pom.xml) is set to 0-SNAPSHOT.
This also uses Spotify's Maven plugin to build the Docker image instead of docker build -t ...
