I'm trying to understand how to get the build and deployment of my projects working.
I have a repo named projects. Inside projects I have several CRA (create react app) projects:
/projects/react1
/projects/react2
...
In my Travis dashboard I can see my projects repo.
Inside each project I have its own .travis.yml (the only thing that changes is the app name). I also have the API key in an env variable in Travis.
Example: /projects/react1
language: node_js
node_js:
  - "node"
sudo: false
branches:
  only:
    - master
cache:
  directories:
    - node_modules
install:
  - npm install
deploy:
  provider: heroku
  skip_cleanup: true
  keep-history: true
  api-key:
    secure: $HEROKU_KEY
  app: path-to-my-project
How can I make the build and deployment work with multiple projects inside a single repo? I could make it work with one project per repo, but I'm totally stuck here, and the tutorials I found didn't help much.
I have a GitLab pipeline with a package step that does a Maven build on the tag event, and a release step that uploads the jar to the GitLab generic package registry using curl and the GitLab release-cli.
What I expect to happen is that the .m2 cache is loaded into the package step so mvn clean package can do its thing, and that only the created jar and test results are archived.
The release step should begin clean: no git clone, no cache, and only the jar and test results.
Instead, 'find .' shows that the release step contains everything, including:
- the Git directory (.git)
- the full checked-out repository
- the .m2 cache
- target (fully built, as the package step produced it)
The GitLab cache documentation (https://docs.gitlab.com/ee/ci/caching/) states:
- Artifacts: use the 'dependencies' keyword to control which jobs fetch the artifacts
- Disabling cache: use 'cache: []'
Why is GitLab putting so much content into the release job? The release job sometimes fails because it finds multiple jar files from previous tags (i.e. the clean and the archiving are retaining past versions).
.gitlab-ci.yml:
variables:
  MAVEN_CLI_OPTS: "-s $CI_PROJECT_DIR/.m2/settings.xml"
  MAVEN_VERSION_PLUGIN_VERSION: 2.11.0
  MAVEN_ARTIFACT_NAME: test-component
  GIT_CLEAN_FLAGS: -ffd
  PACKAGE_REGISTRY_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${MAVEN_ARTIFACT_NAME}"

cache:
  key: primary
  paths:
    - .m2/repository

stages:
  - package
  - release

package:
  stage: package
  image: maven:latest
  script:
    - mvn ${MAVEN_CLI_OPTS} clean package
  artifacts:
    paths:
      - target/*.jar
      - target/surefire-reports
  only:
    - tags
    - merge_requests
    - branches
  except:
    - main

release:
  stage: release
  image: alpine:latest
  cache: []
  variables:
    GIT_STRATEGY: none
  dependencies:
    - package
  script:
    - |
      apk add curl gitlab-release-cli
      find .
      JAR_NAME=`basename target/${MAVEN_ARTIFACT_NAME}-${CI_COMMIT_TAG}.jar`
      curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file target/${JAR_NAME} ${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}
      release-cli create --name "Release $CI_COMMIT_TAG" --description "$TAG_MESSAGE" --tag-name ${CI_COMMIT_TAG} --assets-link "{\"name\":\"jar\",\"url\":\"${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}\"}"
  only:
    - tags
See the GitLab docs on GIT_STRATEGY:
A Git strategy of none also re-uses the local working copy, but skips all Git operations normally done by GitLab. GitLab Runner pre-clone scripts are also skipped, if present. This strategy could mean you need to add fetch and checkout commands to your .gitlab-ci.yml script.
It can be used for jobs that operate exclusively on artifacts, like a deployment job. Git repository data may be present, but it’s likely out of date. You should only rely on files brought into the local working copy from cache or artifacts.
So the GitLab documentation is pretty clear that you should always expect the Git repository data to be present. When you want to work exclusively with artifacts, you can create a new temporary directory and reference the path to the artifacts explicitly, rather than relying on a totally clean working directory.
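For example, here is a minimal sketch of the release job above, reworked so the upload only ever sees one jar: copy the expected artifact into a fresh scratch directory (the mktemp approach is just an illustration, not something GitLab requires) and upload from there.

release:
  stage: release
  image: alpine:latest
  cache: []
  variables:
    GIT_STRATEGY: none
  dependencies:
    - package
  script:
    - |
      apk add curl gitlab-release-cli
      # Copy only the artifact this tag should publish into a clean scratch
      # directory, so jars left over from earlier tags cannot be picked up.
      WORKDIR=$(mktemp -d)
      JAR_NAME=${MAVEN_ARTIFACT_NAME}-${CI_COMMIT_TAG}.jar
      cp target/${JAR_NAME} ${WORKDIR}/
      curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file ${WORKDIR}/${JAR_NAME} ${PACKAGE_REGISTRY_URL}/${CI_COMMIT_TAG}/${JAR_NAME}
  only:
    - tags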
I have a project /templates where I want to add a common ci_settings.xml that sets some defaults for maven commands.
I then want to reuse this template in another project:
.gitlab-ci.yml:
image: maven:3.8.4-eclipse-temurin-11

include:
  project: 'all/templates'
  ref: master
  file:
    - 'ci_settings.xml'

deploy:
  stage: deploy
  script: mvn deploy -s ci_settings.xml
Result:
Found errors in your .gitlab-ci.yml:
Included file `ci_settings.xml` does not have YAML extension!
How can I actually make use of this external file, if not via include?
You can use include only with YAML files. But you can clone the /templates project in your pipeline via CI_JOB_TOKEN and use the file that way. As you don't need the commit history here, you can set the clone depth to 1.
image: maven:3.8.4-eclipse-temurin-11

deploy:
  stage: deploy
  script:
    - git clone --depth 1 https://gitlab-ci-token:${CI_JOB_TOKEN}@your_path_to_templates_project.git templates
    - mvn deploy -s templates/ci_settings.xml
We have a complex build system, with a many-to-many relationship between our libraries and our applications. We put each library and application in its own repository, and use the output of the library builds in our application builds.
On our old Jenkins server, we simply set up a custom workspace and checked out the projects into standardized relative paths so they could find each other. Post-build steps ensured that only successful builds were copied to the central bin folder at the expected relative path.
On our Bamboo server, our repository was fetched to a Checkout Directory at the expected relative path, and we could fetch artifacts from other builds and put them in the central bin folder at the expected relative path.
Now I'm trying to set up some Bitbucket Pipelines builds, and I can't see an obvious way to do a similar thing. The working folder is set automatically by Pipelines, so I can't place the repository in a subfolder at a path relative to other build outputs. I can create artifacts, but I can't seem to import them into other pipelines. I can create caches, but again I can't seem to import them into other pipelines.
Library bitbucket-pipelines.yml
image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  branches:
    master:
    eCRF2:
      - step:
          name: Build and Test
          caches:
            - dotnetcore
            - platform2
          script:
            - dotnet restore ./NET5/Platform2.sln
            - dotnet build ./NET5/Platform2.sln --no-restore --configuration Release
          artifacts:
            - NET5/Platform2/bin/**

definitions:
  caches:
    platform2: NET5/Platform2/bin
App bitbucket-pipelines.yml
image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - dotnetcore
          - platform2
        script:
          - export PROJECT_NAME=./PlatformDataService.sln
          - dotnet restore ${PROJECT_NAME}
          - dotnet build ${PROJECT_NAME} --no-restore --configuration Release
        artifacts:
          - PlatformDataService/bin/**
https://support.atlassian.com/bitbucket-cloud/docs/deploy-build-artifacts-to-bitbucket-downloads/ did get me to upload a file to the Downloads section of the repository, but how do I pull it into the other pipeline?
Is there a way to solve this within Bitbucket Pipelines itself, or do I have to set up a NuGet server that's available outside my VPN?
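One workaround is sketched below: have the library pipeline publish its bin folder to Downloads as a zip, then have the app pipeline pull it back through the Bitbucket API. The workspace and repository names, the Platform2-bin.zip file name, and the BB_AUTH_STRING variable (a username:app-password pair stored as a secured repository variable) are all assumptions for illustration; this is a workaround, not a first-class cross-pipeline artifact feature.

# Step in the app's bitbucket-pipelines.yml (hypothetical names throughout)
- step:
    name: Fetch library output
    script:
      # Download the zip the library pipeline uploaded to its Downloads section
      - curl -sSfL -u "${BB_AUTH_STRING}" -o Platform2-bin.zip "https://api.bitbucket.org/2.0/repositories/my-workspace/platform2/downloads/Platform2-bin.zip"
      # Unpack it to the relative path the app solution expects
      - mkdir -p NET5/Platform2/bin
      - unzip -o Platform2-bin.zip -d NET5/Platform2/bin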
In my project I use Travis CI for continuous integration (a build on every MR to the master branch) and also for deploying the artifact to Heroku. Here is my .travis.yml file:
language: java
jdk: oraclejdk8
branches:
  only:
    - master
script:
  mvn package
deploy:
  provider: heroku
  api_key: $HEROKU_API_KEY
notifications:
  email:
    on_success: never
    on_failure: always
And here is my Procfile:
web: java -Dserver.port=$PORT -jar target/my-artifact.jar
Here you can see that I use the PORT Heroku variable, but I also use a few custom variables. Sometimes I need to update their values after a new build. Previously I did this manually, but I'm looking at how to automate it. I need to update Heroku environment variables with values that I determine at the time of the Travis CI build. How can I do that?
You can set your environment variables using the Heroku platform API: https://devcenter.heroku.com/articles/platform-api-reference#config-vars
In Travis, you can run a task pre-deploy using the 'before_deploy' step (https://docs.travis-ci.com/user/customizing-the-build#The-Build-Lifecycle)
So create a script that uses the Heroku platform API to update your environment and run it as part of your before_deploy step.
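For example, a minimal sketch of such a before_deploy step (the app name my-heroku-app and the MY_VAR/COMPUTED_VALUE names are placeholders; HEROKU_API_KEY is the env variable already configured in Travis):

before_deploy:
  - |
    # PATCH the app's config vars with a value computed during this build
    curl -sSf -X PATCH "https://api.heroku.com/apps/my-heroku-app/config-vars" \
      -H "Accept: application/vnd.heroku+json; version=3" \
      -H "Authorization: Bearer ${HEROKU_API_KEY}" \
      -H "Content-Type: application/json" \
      -d "{\"MY_VAR\": \"${COMPUTED_VALUE}\"}"

Heroku restarts the app's dynos whenever config vars change, so the new values are picked up immediately.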
We currently have three separate AppVeyor projects, one for each branch in our repository.
Our problem is the following:
AppVeyor ignores my filter on GitHub branches. Every time we make a commit to master, stage or dev, it builds on all three projects instead of just the one for the branch we committed to.
Each branch has a unique appveyor.yml file looking like this:
This is the appveyor.yml for dev
version: 0.0.{build}
branches:
  only:
    - dev
image: Visual Studio 2017
configuration: dev
before_build:
  - nuget restore
build:
  project: Core.Api.sln
  publish_wap: true
  verbosity: minimal
build_script:
  - ps: .\build.ps1
after_build:
  - cmd: dotnet publish src\Core.Api --output %appveyor_build_folder%\dist
test: off
artifacts:
  - path: dist
    name: dist.web
deploy:
  ...
When we make a commit, it builds on all projects. Any ideas?
This happens because each project has a webhook configured on GitHub, and each time someone makes a commit, every project's build is triggered by its webhook. Then, regardless of which branch is configured for the project (that setting is only the default branch for manual/API builds), AppVeyor reads appveyor.yml from the branch where the commit was made.
The solution is to use either alternative YAML file names or an alternative YAML file location.
With alternative YAML file names you can have files like appveyor-dev.yml and appveyor-stage.yml and point each AppVeyor project at its own file. An alternative YAML file location is basically the same, but the file lives somewhere other than the repo. I personally like the alternative YAML file location more, because it means less duplication and fewer potential merge issues.
In both cases, when a webhook for, say, the dev branch reaches the stage project, that project still reads appveyor-stage.yml and does the right filtering.
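As a sketch of the alternative file names approach (assuming dev and stage projects; only the parts that differ are shown), each file keeps its own branch filter and each AppVeyor project is pointed at its own file in the project settings:

# appveyor-dev.yml (the dev project's custom config file)
branches:
  only:
    - dev
configuration: dev

# appveyor-stage.yml (the stage project's custom config file)
branches:
  only:
    - stage
configuration: stage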