How to include an XML file from another project in a GitLab CI pipeline? - maven

I have a project /templates where I want to add a common ci_settings.xml that sets some defaults for Maven commands.
I then want to reuse this template in another project:
.gitlab-ci.yml:
image: maven:3.8.4-eclipse-temurin-11

include:
  project: 'all/templates'
  ref: master
  file:
    - 'ci_settings.xml'

deploy:
  stage: deploy
  script: mvn deploy -s ci_settings.xml
Result:
Found errors in your .gitlab-ci.yml:
Included file `ci_settings.xml` does not have YAML extension!
How can I actually make use of this external file, if not via include?

You can use include only with YAML files. But you can clone the /templates project in your pipeline via CI_JOB_TOKEN and use the file that way. Since you don't need the commit history here, you can set the clone depth to 1.
image: maven:3.8.4-eclipse-temurin-11

deploy:
  stage: deploy
  script:
    - git clone --depth 1 https://gitlab-ci-token:${CI_JOB_TOKEN}@your_path_to_templates_project.git templates
    - mvn deploy -s templates/ci_settings.xml
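Alternatively, if the templates project is public (or you provision an access token for it), you can fetch just that one file instead of cloning the whole repository. A minimal sketch, assuming a public project and a hypothetical instance URL; GitLab serves raw files at /-/raw/<ref>/<path>:

deploy:
  stage: deploy
  script:
    # Hypothetical host and project path; adjust the ref and file path as needed
    - wget -O ci_settings.xml "https://gitlab.example.com/all/templates/-/raw/master/ci_settings.xml"
    - mvn deploy -s ci_settings.xml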

Related

GitLab working directory not clean when using cache with GIT_STRATEGY: none

I have a GitLab pipeline with a package step that does a Maven build on tag events, and a release step that uploads the jar to the GitLab generic package registry using curl and the GitLab release-cli.
What I expect is for the .m2 cache to be loaded into the package step so mvn clean package can do its thing, and then for only the created jar and test results to be archived.
The release step should begin clean: no git clone, no cache, and only the jar and test results.
Instead, find . shows the release step contains everything, including:
- the Git directory (.git)
- the full checked-out repository
- the .m2 cache
- target (fully built, as the package step produced it)
From the caching documentation (https://docs.gitlab.com/ee/ci/caching/) on GitLab:
- artifacts: the dependencies keyword controls which job fetches the artifacts
- cache is disabled with cache: []
Why is GitLab putting so much content into the release job? The release job fails at times because it finds multiple jar files from previous tags (i.e., the clean and the archiving retain past versions).
gitlab-ci.yml
variables:
  MAVEN_CLI_OPTS: "-s $CI_PROJECT_DIR/.m2/settings.xml"
  MAVEN_VERSION_PLUGIN_VERSION: 2.11.0
  MAVEN_ARTIFACT_NAME: test-component
  GIT_CLEAN_FLAGS: -ffd
  PACKAGE_REGISTRY_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${MAVEN_ARTIFACT_NAME}"

cache:
  key: primary
  paths:
    - .m2/repository

stages:
  - package
  - release

package:
  stage: package
  image: maven:latest
  script:
    - mvn ${MAVEN_CLI_OPTS} clean package
  artifacts:
    paths:
      - target/*.jar
      - target/surefire-reports
  only:
    - tags
    - merge_requests
    - branches
  except:
    - main

release:
  stage: release
  image: alpine:latest
  cache: []
  variables:
    GIT_STRATEGY: none
  dependencies:
    - package
  script:
    - |
      apk add curl gitlab-release-cli
      find .
      JAR_NAME=`basename target/${MAVEN_ARTIFACT_NAME}-${CI_COMMIT_TAG}.jar`
      curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file target/${JAR_NAME} ${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}
      release-cli create --name "Release $CI_COMMIT_TAG" --description "$TAG_MESSAGE" --tag-name ${CI_COMMIT_TAG} --assets-link "{\"name\":\"jar\",\"url\":\"${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}\"}"
  only:
    - tags
See the GitLab docs on GIT_STRATEGY:
A Git strategy of none also re-uses the local working copy, but skips all Git operations normally done by GitLab. GitLab Runner pre-clone scripts are also skipped, if present. This strategy could mean you need to add fetch and checkout commands to your .gitlab-ci.yml script.
It can be used for jobs that operate exclusively on artifacts, like a deployment job. Git repository data may be present, but it’s likely out of date. You should only rely on files brought into the local working copy from cache or artifacts.
So the GitLab documentation is pretty clear that you should always expect the Git repository data to be present. When you want to work exclusively with artifacts, you can create a new temporary directory and reference the path to the artifacts explicitly, rather than relying on a totally clean working directory.
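For example, a minimal sketch of the release job following that advice (the temporary directory and copy path are illustrative, not part of the original answer):

release:
  stage: release
  image: alpine:latest
  cache: []
  variables:
    GIT_STRATEGY: none
  dependencies:
    - package
  script:
    # Copy only the artifact we need into a fresh directory, so stale files
    # in the reused working copy (.git, old jars, caches) cannot interfere.
    - mkdir -p /tmp/release
    - cp target/*.jar /tmp/release/
    - cd /tmp/release
    - ls -la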

How do you import artifacts from one Bitbucket pipeline to another?

We have a complex build system with a many-to-many relationship between our libraries and our applications. We put each library and application in its own repository, and use the output of the library builds in our application builds.
On our old Jenkins server, we simply set up a custom workspace and checked out the projects into standardized relative paths so they could find each other. Post-build steps ensured that only successful builds were copied to the central bin folder at the expected relative path.
On our Bamboo server, our repository was fetched to a checkout directory at the expected relative path, and we could fetch artifacts from other builds and put them in the central bin folder at the expected relative path.
Now I'm trying to set up some Bitbucket Pipelines builds, and I can't see an obvious way to do a similar thing. The working folder is set automatically by Pipelines, so I can't check the repository out into a subfolder that is relative to other build outputs. I can create artifacts, but I can't seem to import them into other pipelines. I can create caches, but again I can't seem to import them into other pipelines.
Library bitbucket-pipelines.yml
image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  branches:
    master:
    eCRF2:
      - step:
          name: Build and Test
          caches:
            - dotnetcore
            - platform2
          script:
            - dotnet restore ./NET5/Platform2.sln
            - dotnet build ./NET5/Platform2.sln --no-restore --configuration Release
          artifacts:
            - NET5/Platform2/bin/**

definitions:
  caches:
    platform2: NET5/Platform2/bin
App bitbucket-pipelines.yml
image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - dotnetcore
          - platform2
        script:
          - export PROJECT_NAME=./PlatformDataService.sln
          - dotnet restore ${PROJECT_NAME}
          - dotnet build ${PROJECT_NAME} --no-restore --configuration Release
        artifacts:
          - PlatformDataService/bin/**
https://support.atlassian.com/bitbucket-cloud/docs/deploy-build-artifacts-to-bitbucket-downloads/ did get me to upload a file to the Downloads section of the repository, but how do I pull it into the other pipeline?
Is there a way to solve this within Bitbucket Pipelines itself, or do I have to set up a NuGet server that's available outside my VPN?
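One option that builds on the Downloads approach above: the consuming pipeline can pull the file back down through the Bitbucket API's Downloads endpoint. A rough sketch, assuming an app password with repository read access and a zip of the bin folder uploaded by the library build; the workspace, repository, and file names are hypothetical:

- step:
    name: Fetch library output
    script:
      # Hypothetical names; the Downloads endpoint redirects to the stored
      # file, so curl needs -L to follow the redirect.
      - curl -sSfL -u "${BITBUCKET_USERNAME}:${BITBUCKET_APP_PASSWORD}" -o Platform2-bin.zip "https://api.bitbucket.org/2.0/repositories/<workspace>/platform2/downloads/Platform2-bin.zip"
      - unzip Platform2-bin.zip -d NET5/Platform2/bin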

How to include a script.py on my gitlab-ci.yml?

I am implementing a gitlab-ci.yml for my project. In this yml file I need to execute a script.py file. This script.py is located in a different project. Is there any way to include this Python script without uploading it to my project?
Something like:
include: 'https://gitlab.com/khalilazennoud3/<project-name>/-/blob/main/script.py'
There's no way to include a file that isn't a pipeline definition template, but you can still grab that file. The way I'd do it is to add a job in a prior stage that clones the other repository and uploads the file you need as an artifact. Then the job that needs the file will have the artifact available.
Here's an example pipeline with just these two Jobs:
stages:
  - "Setup other Project Files" # or whatever
  - Build

Grab Python Script from Other Repo:
  stage: "Setup other Project Files"
  image: gitscm/git
  variables:
    GIT_STRATEGY: none
  script:
    - git clone git@gitlab.example.com:user/project.git
  artifacts:
    paths:
      - path/to/script.py
    when: on_success # since if the clone fails, there's nothing to upload
    expire_in: 1 week # or whatever makes sense

Build Job:
  stage: Build
  image: python
  dependencies: ['Grab Python Script from Other Repo']
  script:
    - ls -la # this will show `script.py` from the first job along with the contents of "this" project where the pipeline is running
    - ./do_something_with_the_file.sh
Let's go through these line by line. For the first job:
- We're using the Git image since all we need here is git.
- The GIT_STRATEGY: none variable tells the GitLab Runner not to clone/fetch the project the pipeline is running for. This is super useful if the job does things like sending notifications to Slack, hitting another API, etc.
- In the script, all we do is clone the other project so that we can upload the file as an artifact.
For the second job:
- Use whatever image you're using for this job as normal.
- The dependencies keyword controls which artifacts from previous stages will be 1) required and 2) downloaded for this specific job. By default, all available artifacts are downloaded for all jobs; this keyword restricts that, since we only need the script.py file.
- In the script we just check that the file is present (a temporary sanity check); after that you can use it however you need to.

How to use bitbucket pipeline to generate war file and copy it to downloads?

I want to use Bitbucket as a Maven repository for a personal project. My plan is to use Bitbucket Pipelines to build the project and copy the war file to the Downloads page. After the build finishes successfully, I get the following message:
Building war: /opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war
but when the second step runs to copy the war, I get the following:
File opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war doesn't exist.
Also, the Artifacts tab is empty.
bitbucket-pipelines.yml:
image: maven:3.6.3

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - maven
        script:
          - mvn clean compile package
        artifacts:
          - opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war
    - step:
        name: Generate and deploy war
        script:
          - pipe: atlassian/bitbucket-upload-file:0.3.2
            variables:
              BITBUCKET_USERNAME: $BITBUCKET_USERNAME
              BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
              FILENAME: "opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war"
If you are sure your path is correct, try clearing the Bitbucket Maven cache.
You can find it under Pipelines -> Caches.
The problem is here:
opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war
You have removed the "/" at the beginning of the path string, which makes it a path relative to the current directory. Try something like this to use an absolute path:
/opt/atlassian/pipelines/agent/build/todoey-be/target/todoey.war
However, instead of a hard-coded full path, it is better to build the path from an environment variable, and to add some temporary debug lines to verify it. Consider something like this:
script:
  - mvn clean compile package
  # Now verify the path of the built WAR output file
  - echo $BITBUCKET_CLONE_DIR # Debug: print the Git clone directory
  - pwd # Debug: print the current working directory
  - find "$(pwd -P)" -name todoey.war # Debug: show the full path of todoey.war under the current working directory
  - echo "$BITBUCKET_CLONE_DIR/todoey-be/target/todoey.war" # Debug: print the expected full path of todoey.war
artifacts:
  - "$BITBUCKET_CLONE_DIR/todoey-be/target/todoey.war"
- pipe: atlassian/bitbucket-upload-file:0.3.2
  variables:
    BITBUCKET_USERNAME: $BITBUCKET_USERNAME
    BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
    FILENAME: "$BITBUCKET_CLONE_DIR/todoey-be/target/todoey.war"
$BITBUCKET_CLONE_DIR is a predefined environment variable pointing to your project's Git clone root folder. It's described here: https://support.atlassian.com/bitbucket-cloud/docs/variables-in-pipelines/
Update 5 December 2022: the guidance on Bitbucket artifacts says you must use only relative paths in the artifacts: section, but you can use full paths in the pipe: step. So something like this:
image: maven:3.6.3

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - maven
        script:
          - mvn clean compile package
        artifacts:
          # Use a relative path here
          - todoey-be/target/todoey.war
    - step:
        name: Generate and deploy war
        script:
          - pipe: atlassian/bitbucket-upload-file:0.3.2
            variables:
              BITBUCKET_USERNAME: $BITBUCKET_USERNAME
              BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
              # Use the full path here, built from an environment variable
              FILENAME: "$BITBUCKET_CLONE_DIR/todoey-be/target/todoey.war"

Build docker image including version with bitbucket pipelines

I'm pretty new to Bitbucket Pipelines and I've encountered a problem. I'm creating a pipeline to deploy a new version of our Spring Boot application (which runs in a Kubernetes cluster) to our test environment. The problem I encountered is the versioning of our Docker build. Our versioning is set up as follows:
alpha_0.1
alpha_0.2
beta_1.0
gamma_1.0
gamma_1.1
So every minor update/bugfix increases the version by 0.1, a major update increases it by 1.0, and every major update gets a new version name.
Currently I have the following setup:
image: java:8

options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          caches:
            - gradle
          script:
            - ./gradlew test
            - ./gradlew build
            - docker build -t <application_name>/<version_name>_<version_number> .
What is the best way to include the version_name and the version_number in the Bitbucket pipeline? Until now we ran a Ruby script that allowed user input for version numbering, but Bitbucket Pipelines are not interactive.
Assuming that alpha_0.1 etc. are tags and that the pipeline runs if a commit is tagged, you can get the tag for the current commit like this:
TAG=$(git tag --contains $BITBUCKET_COMMIT)
You can then use your favorite language or command-line tool to create the <version_name> and <version_number> from the tag you got. It may make sense to export the tag as a shell variable to be able to use it in a script.
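For example, a minimal sketch that splits a tag such as gamma_1.1 using shell parameter expansion (tag format as described in the question; the image name is a placeholder):

script:
  - export TAG=$(git tag --contains $BITBUCKET_COMMIT)  # e.g. "gamma_1.1"
  - export VERSION_NAME=${TAG%%_*}    # part before the underscore, e.g. "gamma"
  - export VERSION_NUMBER=${TAG##*_}  # part after the underscore, e.g. "1.1"
  - docker build -t <application_name>/${VERSION_NAME}_${VERSION_NUMBER} .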
This is one of the shippable.yml files I have; feel free to adapt it to Atlassian's pipelines.yml and Gradle:
language: java

jdk:
  - oraclejdk8

branches:
  only:
    - master

...

build:
  ci:
    # Generates build number
    - BUILD_NUMBER=`git log --oneline | wc -l`
    - 'echo "Build number: ${BUILD_NUMBER}"'
    # Sets version
    - mvn versions:set -DnewVersion=1.0.${BUILD_NUMBER}
    # Builds and pushes to Docker Hub
    - mvn package
    - docker login -u ${DOCKERHUB_USERNAME} -p ${DOCKERHUB_PASSWD} --email ${DOCKERHUB_EMAIL} https://index.docker.io/v1/
    - mvn -X docker:build -Dpush.image=true
My project's version (in pom.xml) is set to 0-SNAPSHOT.
This also uses Spotify's Maven plugin to build the Docker image instead of docker build -t ...
