GitLab CI/CD: Publishing the result of the build stage to another stage - continuous-integration

I'm building a CI/CD pipeline with Gitlab CI/CD.
.gitlab-ci.yml:
build:
  image: node:16-alpine
  stage: build
  script:
    - yarn install
    - yarn build
  artifacts:
    paths:
      - //something here
I have a build job which builds the app. In the next job, which deploys the app, I need the build directory produced by the previous build job. How do I publish it as an artifact?
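A minimal sketch of the shape this usually takes, assuming yarn build writes its output to build/ (the directory name depends on the bundler, e.g. dist/ for some tools):

```yaml
build:
  image: node:16-alpine
  stage: build
  script:
    - yarn install
    - yarn build
  artifacts:
    paths:
      - build/          # upload the whole build output directory
    expire_in: 1 week   # optional housekeeping

deploy:
  stage: deploy
  needs:
    - job: build
      artifacts: true   # fetch build/ from the build job before running
  script:
    - ls build/         # the artifact is extracted into the workspace
```

With needs:artifacts: true, the deploy job downloads and extracts the build job's artifact automatically before its script runs.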

This is what I do:
needs:
  - job: "build"
    artifacts: true
stage: release
image: registry.gitlab.com/gitlab-org/release-cli:latest # Docker image
script:
  - echo "release job"
release:
  name: 'Release Executables $CI_COMMIT_SHORT_SHA'
  description: 'Created using the release-cli'
  tag_name: '$CI_COMMIT_SHORT_SHA'
  assets:
    links:
      - name: 'comment one'
        url: 'https://my.gitlab.my/you/project/-/jobs/${CI_JOB_ID}/artifacts/path to file'
The result shows up under Releases.


GitLab CI/CD not caching Maven dependencies

I am new to GitLab CI/CD. I have set up a simple pipeline to run jobs for a Maven project when a merge request is sent. Here is the merged content of the pipeline:
---
stages:
  - ".pre"
  - pre-merge
  - build-artifact
  - unit-test
  - build-image
  - tag-image
  - deploy
  - ".post"
compile:
  stage: pre-merge
  script:
    - mvn $MAVEN_CLI_OPTS compile
  only:
    - merge_requests
build:
  stage: build-artifact
  script:
    - mvn $MAVEN_CLI_OPTS clean package
  artifacts:
    untracked: false
    expire_in: 1 day
    paths:
      - target/*.jar
  only:
    - main
unit-test:
  stage: unit-test
  script:
    - mvn $MAVEN_CLI_OPTS clean test
  only:
    - merge_requests
    - main
build-image:
  stage: build-image
  script:
    - echo build image using docker and publish it to artifactory
  only:
    - main
deploy-to-dev:
  stage: deploy
  script:
    - update the argocd directory
  environment:
    name: development
  only:
    - main
tag-image:
  stage: tag-image
  script:
    - re-tag the image and publish
  only:
    - tags
deploy-to-pr:
  stage: deploy
  script:
    - update the argocd directory
  environment:
    name: production
  only:
    - tags
default:
  tags:
    - maven
variables:
  MAVEN_OPTS: "-Dhttps.protocols=TLSv1.2 -Dmaven.repo.local=$CI_PROJECT_DIR/.m2/repository
    -Dorg.slf4j.simpleLogger.showDateTime=true -Djava.awt.headless=true"
  MAVEN_CLI_OPTS: "--batch-mode"
  HELM_CHART: my-chart
cache:
  paths:
    - ".m2/repository"
However, it seems the cached repository is not used: I can see Maven downloading all the dependencies again in the compile and unit-test jobs. (I know I could just use verify, but I want to test caching in GitLab CI.)
My setup:
- The runner uses the Kubernetes executor
- The runner uses the image maven:3.8.6-openjdk-11-slim
- I exec'd into the runner pod and confirmed that .m2/repository is populated with jars
- No shared cache, only local
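One thing worth checking in a setup like this: GitLab cache paths are resolved relative to $CI_PROJECT_DIR, and with no cache:key every branch and pipeline shares one default cache that different runner pods may not have. A hedged sketch of a per-branch cache keyed explicitly (the key name is arbitrary):

```yaml
variables:
  # keep the local repo inside the project dir so it is cacheable
  MAVEN_OPTS: "-Dmaven.repo.local=$CI_PROJECT_DIR/.m2/repository"
cache:
  key: "maven-$CI_COMMIT_REF_SLUG"   # one cache per branch
  paths:
    - .m2/repository
```

Also note that with the Kubernetes executor and no shared (distributed) cache configured, each ephemeral runner pod keeps its own local cache, so cache misses on fresh pods are expected behaviour.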

Why is a job artifact not being added in the pipeline?

UPDATE: adding when: always under artifacts fixed the issue; the unit tests were failing, so the coverage folder was never created and therefore never uploaded as an artifact.
When unit tests are run, a coverage folder is created. I want to save that coverage folder as an artifact in the pipeline so that sonarqube can access the reports in that folder to give an accurate coverage report. When I push up any code, I'm not seeing the coverage folder being saved as an artifact after the unit tests are run in the pre-build stage, so it is not being passed along to sonarqube in the build stage.
This is the yml file:
stages:
  - Pre-Build
  - Build
  - etc.
Unit Tests:
  stage: Pre-Build
  allow_failure: true
  script:
    - npm ci
    - npm run test
  artifacts:
    paths:
      - coverage
    when: always
SonarQube:
  stage: Build
  needs: ['Unit Tests']
  except:
    refs:
      - tags
Try adding a trailing slash to the directory path:
Unit Tests:
  stage: Pre-Build
  allow_failure: true
  script:
    - npm ci
    - npm run test
  artifacts:
    paths:
      - coverage/
    when: always

Deploy an Azure Static Web App with DevOps in a multi-stage pipeline

I'm currently developing a Blazor WebAssembly application that will be deployed to an Azure Static Web App. Now I need to create my CI/CD pipeline, and there is a pretty easy way to build and publish the application using the AzureStaticWebApp task, as shown here:
- task: AzureStaticWebApp@0
  inputs:
    app_location: 'App.Web'
    app_build_command: 'dotnet build'
    api_location: 'App.Api'
    output_location: 'wwwroot'
    azure_static_web_apps_api_token: $(deployment_token)
This task, however, builds and releases the application at the same time. In my pipeline, I'd like to build my Blazor application and store it as an artifact. One stage of the pipeline will publish this artifact to a test environment and, if all tests pass, another stage will publish the same artifact to the production environment. The goal is to deploy to production the exact same artifact that was tested.
Is there a way to accomplish this using the AzureStaticWebApp task or is there any alternative?
I think you can build the artifact in one job like this:
jobs:
  - job: BuildWebsite
    displayName: Build website
    pool:
      name: $(azdoPool)
    steps:
      - checkout: self
      - task: Npm@1
        displayName: npm install
        inputs:
          verbose: false
      - powershell: |
          npx gatsby build
        displayName: build
      - task: ArchiveFiles@2
        displayName: package artifacts
        inputs:
          rootFolderOrFile: '$(System.DefaultWorkingDirectory)/public/'
          includeRootFolder: false
          archiveType: 'zip'
          archiveFile: '$(Build.ArtifactStagingDirectory)/$(webPackageFile)'
      - task: PublishPipelineArtifact@1
        displayName: publish website artifact
        inputs:
          targetPath: '$(Build.ArtifactStagingDirectory)/$(webPackageFile)'
          ArtifactName: $(artifactName)
(Notice the publish of the pipeline artifact in the last task.)
Then test/deploy in another job:
parameters:
  application: ''
jobs:
  - job: DeployStaticWebsite
    displayName: deploy
    pool:
      name: $(azdoPool)
    steps:
      - task: DownloadPipelineArtifact@2
        displayName: Download $(webPackageFile)
        inputs:
          artifactName: $(artifactName)
          targetPath: $(artifactExtractPath)
          itemPattern: '**/$(webPackageFile)'
      - task: ExtractFiles@1
        displayName: extract website
        inputs:
          archiveFilePatterns: $(artifactExtractPath)/$(webPackageFile)
          destinationFolder: $(artifactExtractPath)/extracted
      - task: AzureStaticWebApp@0
        displayName: Deploy App
        inputs:
          output_location: $(extractedZipPath)
          azure_static_web_apps_api_token: $(deployTokenMercleCom)
          skip_app_build: true
Notice how the first task downloads the artifact/zip from the previous job.
These two jobs should be referenced by one or more stages in a different template file, something like this:
stages:
  - stage: Build
    displayName: Build
    jobs:
      - template: templates/build.yml
        parameters:
          application: ${{ variables.environment }}
  - stage: app1
    displayName: app1
    variables:
      application: app1
      deploymentToken: $(deployTokenApp1)
    jobs:
      - template: templates/deploy.yml
        parameters:
          application: ${{ variables.environment }}
  - stage: app2
    displayName: app2
    variables:
      application: app2
      deploymentToken: $(deployTokenApp2)
    jobs:
      - template: templates/deploy.yml
        parameters:
          application: ${{ variables.environment }}
You can then keep adding stages for app3, app4, as many as you want; they all deploy the same artifact that was originally built. You'll probably also want to use deployment jobs if you want environment approvals for each stage (so that each stage's deployment can be approved beforehand).
Note: some of the variables in this YAML need defining before these template examples will run.
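The deployment-jobs suggestion above might look like this sketch (the environment name is a placeholder; approvals and checks are configured on the environment in Azure DevOps, not in the YAML):

```yaml
jobs:
  - deployment: DeployStaticWebsite
    displayName: deploy
    environment: production        # approvals/checks attach to this environment
    pool:
      name: $(azdoPool)
    strategy:
      runOnce:
        deploy:
          steps:
            - task: DownloadPipelineArtifact@2
              inputs:
                artifactName: $(artifactName)
                targetPath: $(artifactExtractPath)
```

A deployment job also records deployment history against the environment, which a plain job does not.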

SonarCloud code coverage remains 0.0 in GitHub Actions build

I have set up CI for a .NET Core solution using GitHub Actions. When code is pushed to the master branch, the solution is built, the unit tests are run, and code analysis is run with SonarCloud.
The code analysis step is actually performed by sonarcloud-github-action.
The quality gate in SonarCloud does not pass because the coverage percentage is 0.0% (for both new and existing code). I'm generating code coverage reports using Coverlet. The coverage.opencover.xml file is successfully generated after test execution for each unit test project.
In the sonar-project.properties file I'm referencing these files as follows:
sonar.cs.opencover.reportsPaths=**\coverage.opencover.xml
But apparently the code coverage reports are recognized but not processed by the SonarCloud scanner.
In the log of my GitHub Actions workflow, I do see these warnings:
INFO: Parsing the OpenCover report <path>/coverage.opencover.xml
INFO: Adding this code coverage report to the cache for later reuse: <path>/coverage.opencover.xml
...
WARN: Missing blame information for the following files:
WARN: * <path>/coverage.opencover.xml
WARN: This may lead to missing/broken features in SonarQube
In trying to solve the 'Missing blame information' warning, I added the coverage files to the exclusions in my SonarCloud project (**/coverage.opencover.xml), but that didn't solve the issue. The warning still appears and code coverage is still 0.0%.
Any hints to get this going?
[edit]:
My workflow in GitHub Actions looks like this:
name: .NET Core
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 2.2.108
      - name: Build with dotnet
        run: dotnet build src/<solution>.sln --configuration Release
      - name: Unit Tests
        run: dotnet test src/<solution>.sln /p:CollectCoverage=true /p:CoverletOutputFormat=opencover
      - name: SonarCloud Scan
        uses: sonarsource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
I had a similar problem getting the coverage of a TypeScript project to work. Without your Sonar logs I can only guess, but my problem was that the paths inside the lcov.info were absolute paths from GitHub, something like SF:/home/runner/work/YourRepoName..., while Sonar was starting a Docker container with its workdir set to /github/workspace and therefore could not locate the files listed in the lcov.info.
Check your logs if you find something like
2019-11-28T15:36:34.9243068Z WARN: Could not resolve 2 file paths in [/github/workspace/test/unit/coverage/lcov.info], first unresolved path: /home/runner/work/jobreporter/jobreporter/dispatcher/index.ts
2019-11-28T15:36:34.9243445Z INFO: Sensor SonarJS Coverage [javascript] (done) | time=8ms
So for the time being I had to replace all folder names in the lcov.info with /github/workspace.
In my case I used:
- name: 'Run npm lint and test'
  shell: bash
  run: |
    pushd .
    npm ci
    npm run lint:ci
    npm run test --if-present
    sed -i 's+/home/runner/work/jobreporter/jobreporter+/github/workspace+g' test/unit/coverage/lcov.info
    sed -i 's+/home/runner/work/jobreporter/jobreporter+/github/workspace+g' eslint-report.json
After that, the coverage was reported correctly. Maybe that helps.
Regards, Mathias
I had the same problem with a Node build, where the paths in the lcov.info were not the same as the ones in the GitHub Actions Docker container.
To work around it, I do my builds not by setting up Node directly in the worker, but by using a Docker Action, so that my paths stay the same in all Actions. If you dig in the logs, you can see precisely how the docker actions are run, and the available environment.
For reference, my actions look like this
- name: 'yarn install'
  uses: docker://node:10.16.3-buster
  with:
    args: yarn install
  env:
    NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
    CI: true
The downside is that my builds are a bit slower, but all my actions are run in Docker, which I find cleaner.
To get past this error you need to run your tests with the --blame parameter.
Here is my GitHub action for building and pushing to SonarCloud.
name: Build and run tests
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          # Disabling shallow clone is recommended for improving relevancy of reporting for SonarCloud
          fetch-depth: 0
      - name: Setup .NET SDK (v5.0)
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.100'
      - name: Install dependencies
        run: dotnet restore
      - name: Build
        run: dotnet build --configuration Release --no-restore
      - name: Test
        run: dotnet test --blame --no-restore --verbosity normal /p:CollectCoverage=true /p:CoverletOutputFormat=opencover /p:CoverletOutput=opencover.xml
      - name: SonarCloud Scan
        uses: sonarsource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
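A side note on the original question: its reportsPaths pattern uses backslashes. Forward slashes in sonar-project.properties work on every platform, so a safer form of that line is:

```properties
sonar.cs.opencover.reportsPaths=**/coverage.opencover.xml
```

This does not change what the pattern matches on Linux runners, but it avoids platform-specific separator surprises.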

Gitlab CI - Publish Failed Test Results to Pages

I'm creating a simple Java project using Gradle which generates a test report (i.e. BDD Cucumber, JUnit, etc.). This project is deployed to GitLab, where it is built as part of the GitLab CI process.
My JUnit reports are generated in the folder build/reports/tests/test/ relative to the project path (as an index.html, some CSS files, etc.).
How do I configure my .gitlab-ci.yml to publish the content of build/reports/tests/test/ to GitLab Pages even after my test cases fail?
This is what I have in my .gitlab-ci.yml: (My repo can be found HERE)
Version 1: Doesn't publish anything to pages
image: java:8-jdk
stages:
  - test
before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
test:
  stage: test
  script:
    - mkdir public
    - ./gradlew test
  artifacts:
    paths:
      - build/reports/tests/test/
  only:
    - master
  after_script:
    - mv build/reports/tests/test/* public
Version 2: Doesn't execute the deploy stage since test has failed.
image: java:8-jdk
stages:
  - test
  - deploy
before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
test:
  stage: test
  script:
    - ./gradlew test
  artifacts:
    paths:
      - build/reports/tests/test/
pages:
  stage: deploy
  dependencies:
    - test
  script:
    - mkdir public
    - mv build/reports/tests/test/* public
  artifacts:
    paths:
      - public
  only:
    - master
I solved the issue by adding when: always at the end of my pages job. It now executes regardless of the exit code of the jobs it depends on.
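A sketch of Version 2's pages job with that fix applied (the test job may additionally need when: always under its own artifacts section, so the reports are uploaded even when tests fail):

```yaml
pages:
  stage: deploy
  dependencies:
    - test
  script:
    - mkdir public
    - mv build/reports/tests/test/* public
  artifacts:
    paths:
      - public
  only:
    - master
  when: always   # run this job even if the test stage failed
```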
