I have a set of Azure Build Pipelines that compile Rust projects and currently use blob storage to cache the .cargo and target folders.
When compiling locally, once a binary has been compiled the first time, subsequent cargo builds don't recompile the dependent libraries/crates, only the local binary. With my current pipeline setup, however, even after downloading the cache and building into the correct target folder, the pipeline still downloads and builds all the crates.
This is my config.toml for the cache and any pipeline builds.
[build]
target-dir = "./target"
dep-info-basedir = "."
incremental = true
It has reduced compilation times in some cases but not nearly as much as I expect.
Can I cache more folders to increase speed? Is there some identifier cargo checks that is invalidating the cache?
The pipelines run a custom xtask binary which performs many tasks, including running cargo build --release. Could this be causing issues?
You need to cache target and ~/.cargo/registry as mentioned by Caesar in the comments above.
The following worked for me (docs):
- task: Cache@2
  inputs:
    key: '"cargo" | "$(Agent.OS)" | Cargo.lock'
    path: $(Build.SourcesDirectory)\target
  displayName: cache cargo build
- task: Cache@2
  inputs:
    key: '"cargo-registry" | "$(Agent.OS)" | Cargo.lock'
    path: $(UserProfile)\.cargo\registry
  displayName: cache cargo registry
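The paths above are for a Windows-hosted agent. On a Linux-hosted agent the same two tasks work with POSIX paths; a sketch, where the CARGO_HOME redirect is an assumption I use so the registry lands in a workspace path the Cache task can reach:

```yaml
variables:
  CARGO_HOME: $(Pipeline.Workspace)/.cargo  # assumed: redirect cargo's home into the workspace

steps:
- task: Cache@2
  inputs:
    key: '"cargo-registry" | "$(Agent.OS)" | Cargo.lock'
    path: $(CARGO_HOME)/registry
  displayName: cache cargo registry
- task: Cache@2
  inputs:
    key: '"cargo" | "$(Agent.OS)" | Cargo.lock'
    path: $(Build.SourcesDirectory)/target
  displayName: cache cargo build
```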
Related
I use the Sonar tasks in my Azure DevOps pipeline to run static analysis on my code, but I would like to avoid having the pipeline download the plugins every time it runs.
I think I could use the Cache task but I'm not sure how should I configure it.
I was facing the same issue. My pipeline was downloading the Sonar plugins every time and taking about 40~60 seconds.
I was able to cache the plugins through the .sonar/cache folder and decrease the download time to around 8~12 seconds.
Example:
variables:
  SONAR_PLUGINS: /home/vsts/.sonar/cache

steps:
- task: Cache@2
  inputs:
    key: sonar | "$(Agent.OS)" | $(Build.Repository.Name)
    path: $(SONAR_PLUGINS)
  displayName: cache sonar plugins
I have a simple Gradle project that has org.gradle.caching=true set in gradle.properties in order to enable the local build cache.
When I run the build directly (./gradlew clean build) I can see that the local build cache is being used: https://scans.gradle.com/s/ykywrv3lzik3s/performance/build-cache
However, when I run the build with Coverity (bin/cov-build --dir cov-int ./gradlew clean build) I see the build cache is disabled for the same build: https://scans.gradle.com/s/j2pvoyhgzvvxk/performance/build-cache
How is Coverity causing the build cache to be disabled, and is there a way to run a build with Coverity and the Gradle Build Cache?
You can't use the build cache with Coverity, or at least you don't want to.
The Gradle Build Cache causes compilation to be skipped:
The Gradle build cache is a cache mechanism that aims to save time by reusing outputs produced by other builds. The build cache works by storing (locally or remotely) build outputs and allowing builds to fetch these outputs from the cache when it is determined that inputs have not changed, avoiding the expensive work of regenerating them.
Were that mechanism to be used with Coverity, it would prevent cov-build from seeing the compilation steps, and hence it would be unable to perform its own compilation of the source code, which is a necessary prerequisite to performing its static analysis.
I don't know precisely how Coverity is disabling the cache (or if that is even intentional on Coverity's part), but if it didn't do so, then you would have to yourself, as described in the Synopsys article Cov-build using gradle shows "No files were emitted" error message, the key step of which is:
Use "clean" and "cleanBuildCache" task to remove all saved cache data which prevent full compilation.
before running cov-build.
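In practice that means clearing any previously cached outputs before the instrumented build, so that every compile step actually executes where cov-build can see it. A sketch, assuming the Gradle wrapper and the cleanBuildCache task mentioned in the article are available in your build:

```shell
# Remove previous outputs and cached build results so no compilation is skipped
./gradlew clean cleanBuildCache

# Every compilation step now runs for real under cov-build's observation
bin/cov-build --dir cov-int ./gradlew build
```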
I'm new to GitLab CI. Every time GitLab CI runs, it replaces the old folder on the server. I have a small problem: I want to reduce the Gradle build time for a project that includes DL4J (very large and slow to build), so I want to keep the build folder from the last version. I followed this to reduce the Gradle build time.
Question: is it possible, via the GitLab CI config, to keep some folders so they survive between runs? This is my gitlab-ci:
stages:
  - build

something_run:
  stage: build
  script:
    - gradle build
    - systemctl restart myproject
  tags:
    - ml
  only:
    - master
When it runs, Gradle builds the project, and the build takes quite a long time. So I want the next CI run to not delete the last build output.
Take a look at cache (https://docs.gitlab.com/ee/ci/yaml/#cache)
cache is used to specify a list of files and directories which should be cached between jobs.
GitLab CI/CD provides a caching mechanism that can be used to save time when your jobs are running.
See also https://docs.gitlab.com/ee/ci/caching/index.html
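Applied to the job above, a minimal sketch (the cached paths are an assumption; adjust them to where Gradle writes its outputs and caches in your project):

```yaml
something_run:
  stage: build
  cache:
    key: "$CI_COMMIT_REF_SLUG"  # one cache per branch
    paths:
      - build/    # Gradle build outputs
      - .gradle/  # project-local Gradle caches
  script:
    - gradle build
    - systemctl restart myproject
```

Note that the cache only covers paths inside the project directory, so to cache Gradle's dependency cache you typically also point GRADLE_USER_HOME at a folder inside the project.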
I've started a new Spring Boot and Kotlin project and I wanted to use Travis-CI as my CI server.
I also wanted to use codecov to collect reports about my code coverage.
Everything seems to work perfectly apart from one thing: my project is currently an empty Spring Boot project (with no tests), yet the build itself takes up to 2m (mostly due to the time it takes to install Gradle).
I checked their site and saw some build optimizations, but they looked too early for this stage of the project (e.g. parallel test execution).
Am I missing something? Is 2m the baseline for Travis-CI build time?
My current configurations for Travis :
# This enables the 'defaults' to test java applications:
language: java
# We can specify a list of JDKs to be used for testing
# A list of available JDKs in Xenial can be seen at:
# https://docs.travis-ci.com/user/reference/xenial/#jvm-clojure-groovy-java-scala-support
jdk:
  - openjdk11
before_script:
  # Makes sure that gradle commands can be executed on build
  - chmod +x gradlew
script:
  # Makes sure that gradle can be executed.
  - ./gradlew check
  # Generates the reports for codecov
  - ./gradlew jacocoTestReport
# This is to enable CodeCov's coverage
# If a build is successful, the code is submitted for coverage analysis
after_success:
  - bash <(curl -s https://codecov.io/bash)
You'll want to use caching to improve the speed of your builds on Travis. Gradle has a dedicated guide on building on Travis: https://guides.gradle.org/executing-gradle-builds-on-travisci/
For caching, scroll down to Enable caching of downloaded artifacts
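The relevant snippet caches Gradle's dependency and wrapper directories, and removes the lock files and plugin-resolution metadata beforehand so they don't invalidate the cache on every run:

```yaml
before_cache:
  - rm -f  $HOME/.gradle/caches/modules-2/modules-2.lock
  - rm -fr $HOME/.gradle/caches/*/plugin-resolution/
cache:
  directories:
    - $HOME/.gradle/caches/
    - $HOME/.gradle/wrapper/
```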
I hope someone can help me with a simple setup of Maven CI scripts for GitLab.
I searched Stack Overflow and Google, which turned up several questions and answers, but they either seem completely different from my case or I don't understand them.
I have a simple setup of two projects. project B depends on project A (= pom packaging).
In the runner configuration /etc/gitlab-runner/config.toml I have added the line with the volumes:
[[runners]]
  ...
  [runners.docker]
    ...
    volumes = ["/cache", "/.m2"]
    ...
My .gitlab-ci.yml for both projects looks like this:
image: maven:3.6.1-jdk-12

cache:
  paths:
    - /.m2/repository
    - target/

variables:
  MAVEN_OPTS: "-Dmaven.repo.local=/.m2/repository"

maven_job:
  script:
    - mvn clean install
With this, the first project builds correctly, and I can see that the caching is working: when executed again and again, it does not re-download all the Maven plugins needed for the build.
It also states
[INFO] Installing /builds/end2end/projectA/pom.xml to /.m2/repository/de/end2end/projectA/0.4.4-SNAPSHOT/projectA-0.4.4-SNAPSHOT.pom
It reports though at the end
WARNING: /.m2/repository: not supported: outside build directory
WARNING: /.m2/repository/classworlds: not supported: outside build directory
WARNING: /.m2/repository/classworlds/classworlds: not supported: outside build directory
WARNING: /.m2/repository/classworlds/classworlds/1.1-alpha-2: not supported: outside build directory
WARNING: /.m2/repository/classworlds/classworlds/1.1-alpha-2/_remote.repositories: not supported: outside build directory
[...]
When executing projectB, the job fails with the info, that it cannot find projectA.
So - what is wrong with the configuration of the runner / .gitlab-ci.yml files ?
I tried
cache:
  paths:
    - .m2/repository
which removes the warnings, but then projectA gets installed into the project-local .m2:
[INFO] Installing /builds/end2end/projectA/pom.xml to /builds/end2end/projectA/.m2/repository/de/end2end/projectA/0.4.4-SNAPSHOT/projectA-0.4.4-SNAPSHOT.pom
and projectB fails with the same error as above.
In fact, as described in the GitLab docs, you are using dynamic storage, so the volume is shared only between subsequent runs of the same concurrent job for one project. If you want to share data between projects, you must use host-bound storage.
As for the warning: the cache only supports paths inside the working directory, so an absolute path like /.m2/repository is not supported. In your case you don't need to cache the Maven repository at all, because you are already using a volume for it.
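A sketch of the host-bound variant in config.toml (the host path /srv/m2 is an assumption; any directory on the runner host works):

```toml
[[runners]]
  [runners.docker]
    # host-bound storage: /srv/m2 on the runner host is mounted at /.m2
    # inside every job container, so it survives across jobs and projects
    volumes = ["/srv/m2:/.m2:rw"]
```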