GitLab CI cache update - Gradle

On GitlabCI I have caching setup and it is working properly:
cache:
  key: gradle
  paths:
    - .gradle/caches
before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
To speed up uploading my cache (< 20 s) and take advantage of it, I delete the "extra" files that were updated during the build:
after_script:
  - rm -rf .gradle/caches/$GRADLE_VERSION/
  - rm .gradle/caches/journal-1/file-access.bin
  - find .gradle/caches/ -name "*.lock" -type f -delete
I expect CI to skip uploading the cache, since none of the files have been updated. That is, the result of
- find .gradle/caches/ -mmin -5 -exec ls -la {} +
is an empty list as well.
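A quick way to double-check that claim on the runner is to compare against a marker file instead of relying on -mmin; a minimal sketch, with illustrative paths and GNU find assumed:

```shell
# List anything under the cache dir modified since a marker timestamp.
marker=$(mktemp)
touch -d '1 minute ago' "$marker"          # pretend the build started a minute ago
mkdir -p .gradle/caches
echo demo > .gradle/caches/new-entry.bin   # stand-in for a file the build touched
find .gradle/caches -type f -newer "$marker"
```

Anything the command prints is a file the runner would consider changed when it repacks the cache archive.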
But that is not the case and my cache is uploaded on every job.
Am I missing something else? Has anyone else run into this?

Related

Using GitHub cache action with multiple cache paths?

I'm trying to use the official GitHub cache action (https://github.com/actions/cache) to cache some binary files to speed up some of my workflows; however, I've been unable to get it working when specifying multiple cache paths.
Here's a simple, working test I've set up using a single cache path:
There is one action for writing the cache, and one for reading it (both executed in separate workflows, but on the same repository and branch).
The write-action is executed first; it creates a file "subdir/a.txt" and then caches it with the "actions/cache@v2" action:
# Test with single path
- name: Create file
  shell: bash
  run: |
    mkdir subdir
    cd subdir
    printf '%s' "Lorem ipsum" >> a.txt
- name: Write cache (Single path)
  uses: actions/cache@v2
  with:
    path: "D:/a/cache_test/cache_test/**/*.txt"
    key: test-cache-single-path
The read-action retrieves the cache, prints a list of all files in the directory recursively to confirm it has restored the file from the cache, and then prints the contents of the cached txt-file:
- name: Get cached file
  uses: actions/cache@v2
  id: get-cache
  with:
    path: "D:/a/cache_test/cache_test/**/*.txt"
    key: test-cache-single-path
- name: Print files
  shell: bash
  run: |
    echo "Cache hit: ${{steps.get-cache.outputs.cache-hit}}"
    cd "D:/a/cache_test/cache_test"
    ls -R
    cat "D:/a/cache_test/cache_test/subdir/a.txt"
This works without any issues.
Now, the description of the cache action contains an example for specifying multiple cache paths:
- uses: actions/cache@v2
  with:
    path: |
      path/to/dependencies
      some/other/dependencies
    key: ${{ runner.os }}-${{ hashFiles('**/lockfiles') }}
But when I try that for my example actions, it fails to work.
In the new write-action, I create two files, "subdir/a.txt" and "subdir/b.md", and then cache them by specifying two paths:
# Test with multiple paths
- name: Create files
  shell: bash
  run: |
    mkdir subdir
    cd subdir
    printf '%s' "Lorem ipsum" >> a.txt
    printf '%s' "dolor sit amet" >> b.md
- name: Write cache (Multi path)
  uses: actions/cache@v2
  with:
    path: |
      "D:/a/cache_test/cache_test/**/*.txt"
      "D:/a/cache_test/cache_test/**/*.md"
    key: test-cache-multi-path
The new read-action is the same as the old one, but also specifies both paths:
# Read cache
- name: Get cached file
  uses: actions/cache@v2
  id: get-cache
  with:
    path: |
      "D:/a/cache_test/cache_test/**/*.txt"
      "D:/a/cache_test/cache_test/**/*.md"
    key: test-cache-multi-path
- name: Print files
  shell: bash
  run: |
    echo "Cache hit: ${{steps.get-cache.outputs.cache-hit}}"
    cd "D:/a/cache_test/cache_test"
    ls -R
    cat "D:/a/cache_test/cache_test/subdir/a.txt"
    cat "D:/a/cache_test/cache_test/subdir/b.md"
This time I still get the confirmation that the cache has been read:
Cache restored successfully
Cache restored from key: test-cache-multi-path
Cache hit: true
However "ls -R" does not list the files, and the "cat" commands fail because the files do not exist.
Where is my error? What is the proper way of specifying multiple paths with the cache action?
I was able to make it work with a few modifications:
use relative paths instead of absolute
use a hash of the content for the key
It looks like, in bash at least, the absolute paths look like this:
/d/a/so-foobar-cache/so-foobar-cache/cache_test/cache_test/subdir
Where so-foobar-cache is the name of the repository.
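Deriving the relative path from an absolute one can be scripted; a sketch assuming GNU coreutils realpath and the workspace path shown above:

```shell
# Turn a hypothetical absolute workspace path into a repo-relative one.
# -m tolerates paths that don't exist on the machine running this.
ws="/d/a/so-foobar-cache/so-foobar-cache"
abs="$ws/cache_test/cache_test/subdir"
rel=$(realpath -m --relative-to="$ws" "$abs")
echo "$rel"   # cache_test/cache_test/subdir
```

The relative form is what ends up in the workflow's path: entries below.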
.github/workflows/foobar.yml
name: Store and Fetch cached files
on: [push]
jobs:
  store:
    runs-on: windows-2019
    steps:
      - name: Create files
        shell: bash
        id: store
        run: |
          mkdir -p 'cache_test/cache_test/subdir'
          cd 'cache_test/cache_test/subdir'
          echo pwd $(pwd)
          printf '%s' "Lorem ipsum" >> a.txt
          printf '%s' "dolor sit amet" >> b.md
          cat a.txt b.md
      - name: Store in cache
        uses: actions/cache@v2
        with:
          path: |
            cache_test/cache_test/**/*.txt
            cache_test/cache_test/**/*.md
          key: multiple-files-${{ hashFiles('cache_test/cache_test/**') }}
      - name: Print files (A)
        shell: bash
        run: |
          echo "Cache hit: ${{steps.store.outputs.cache-hit}}"
          find cache_test/cache_test/subdir
          cat cache_test/cache_test/subdir/a.txt
          cat cache_test/cache_test/subdir/b.md
  fetch:
    runs-on: windows-2019
    needs: store
    steps:
      - name: Restore
        uses: actions/cache@v2
        with:
          path: |
            cache_test/cache_test/**/*.txt
            cache_test/cache_test/**/*.md
          key: multiple-files-${{ hashFiles('cache_test/cache_test/**') }}
          restore-keys: |
            multiple-files-${{ hashFiles('cache_test/cache_test/**') }}
            multiple-files-
      - name: Print files (B)
        shell: bash
        run: |
          find cache_test -type f | xargs -t grep -e.
Log
$ gh run view 1446486801
✓ master Store and Fetch cached files · 1446486801
Triggered via push about 3 minutes ago
JOBS
✓ store in 5s (ID 4171907768)
✓ fetch in 10s (ID 4171909690)
First job
$ gh run view 1446486801 --log --job=4171907768 | grep -e Create -e Store -e Print
store Create files 2021-11-10T22:59:32.1396931Z ##[group]Run mkdir -p 'cache_test/cache_test/subdir'
store Create files 2021-11-10T22:59:32.1398025Z mkdir -p 'cache_test/cache_test/subdir'
store Create files 2021-11-10T22:59:32.1398695Z cd 'cache_test/cache_test/subdir'
store Create files 2021-11-10T22:59:32.1399360Z echo pwd $(pwd)
store Create files 2021-11-10T22:59:32.1399936Z printf '%s' "Lorem ipsum" >> a.txt
store Create files 2021-11-10T22:59:32.1400672Z printf '%s' "dolor sit amet" >> b.md
store Create files 2021-11-10T22:59:32.1401231Z cat a.txt b.md
store Create files 2021-11-10T22:59:32.1623649Z shell: C:\Program Files\Git\bin\bash.EXE --noprofile --norc -e -o pipefail {0}
store Create files 2021-11-10T22:59:32.1626211Z ##[endgroup]
store Create files 2021-11-10T22:59:32.9569082Z pwd /d/a/so-foobar-cache/so-foobar-cache/cache_test/cache_test/subdir
store Create files 2021-11-10T22:59:32.9607728Z Lorem ipsumdolor sit amet
store Store in cache 2021-11-10T22:59:33.9705422Z ##[group]Run actions/cache@v2
store Store in cache 2021-11-10T22:59:33.9706196Z with:
store Store in cache 2021-11-10T22:59:33.9706815Z path: cache_test/cache_test/**/*.txt
store Store in cache cache_test/cache_test/**/*.md
store Store in cache
store Store in cache 2021-11-10T22:59:33.9708499Z key: multiple-files-25c0e6413e23766a3681413625169cee1ca3a7cd2186cc1b1df5370fb43bce55
store Store in cache 2021-11-10T22:59:33.9709961Z ##[endgroup]
store Store in cache 2021-11-10T22:59:35.1757943Z Received 260 of 260 (100.0%), 0.0 MBs/sec
store Store in cache 2021-11-10T22:59:35.1761565Z Cache Size: ~0 MB (260 B)
store Store in cache 2021-11-10T22:59:35.1781110Z [command]C:\Windows\System32\tar.exe -z -xf D:/a/_temp/653f7664-e139-4930-9710-e56942f9fa47/cache.tgz -P -C D:/a/so-foobar-cache/so-foobar-cache
store Store in cache 2021-11-10T22:59:35.2069751Z Cache restored successfully
store Store in cache 2021-11-10T22:59:35.2737840Z Cache restored from key: multiple-files-25c0e6413e23766a3681413625169cee1ca3a7cd2186cc1b1df5370fb43bce55
store Print files (A) 2021-11-10T22:59:35.3087596Z ##[group]Run echo "Cache hit: "
store Print files (A) 2021-11-10T22:59:35.3088324Z echo "Cache hit: "
store Print files (A) 2021-11-10T22:59:35.3088983Z find cache_test/cache_test/subdir
store Print files (A) 2021-11-10T22:59:35.3089571Z cat cache_test/cache_test/subdir/a.txt
store Print files (A) 2021-11-10T22:59:35.3090176Z cat cache_test/cache_test/subdir/b.md
store Print files (A) 2021-11-10T22:59:35.3104465Z shell: C:\Program Files\Git\bin\bash.EXE --noprofile --norc -e -o pipefail {0}
store Print files (A) 2021-11-10T22:59:35.3106449Z ##[endgroup]
store Print files (A) 2021-11-10T22:59:35.3494703Z Cache hit:
store Print files (A) 2021-11-10T22:59:35.4456032Z cache_test/cache_test/subdir
store Print files (A) 2021-11-10T22:59:35.4456852Z cache_test/cache_test/subdir/a.txt
store Print files (A) 2021-11-10T22:59:35.4459226Z cache_test/cache_test/subdir/b.md
store Print files (A) 2021-11-10T22:59:35.4875011Z Lorem ipsumdolor sit amet
store Post Store in cache 2021-11-10T22:59:35.6109511Z Post job cleanup.
store Post Store in cache 2021-11-10T22:59:35.7899690Z Cache hit occurred on the primary key multiple-files-25c0e6413e23766a3681413625169cee1ca3a7cd2186cc1b1df5370fb43bce55, not saving cache.
Second job
$ gh run view 1446486801 --log --job=4171909690 | grep -e Restore -e Print
fetch Restore 2021-11-10T22:59:50.8498516Z ##[group]Run actions/cache@v2
fetch Restore 2021-11-10T22:59:50.8499346Z with:
fetch Restore 2021-11-10T22:59:50.8499883Z path: cache_test/cache_test/**/*.txt
fetch Restore cache_test/cache_test/**/*.md
fetch Restore
fetch Restore 2021-11-10T22:59:50.8500449Z key: multiple-files-
fetch Restore 2021-11-10T22:59:50.8501079Z restore-keys: multiple-files-
fetch Restore multiple-files-
fetch Restore
fetch Restore 2021-11-10T22:59:50.8501644Z ##[endgroup]
fetch Restore 2021-11-10T22:59:53.1143793Z Received 257 of 257 (100.0%), 0.0 MBs/sec
fetch Restore 2021-11-10T22:59:53.1145450Z Cache Size: ~0 MB (257 B)
fetch Restore 2021-11-10T22:59:53.1163664Z [command]C:\Windows\System32\tar.exe -z -xf D:/a/_temp/30b0dc24-b25f-4713-b3d3-cecee7116785/cache.tgz -P -C D:/a/so-foobar-cache/so-foobar-cache
fetch Restore 2021-11-10T22:59:53.1784328Z Cache restored successfully
fetch Restore 2021-11-10T22:59:53.5197756Z Cache restored from key: multiple-files-
fetch Print files (B) 2021-11-10T22:59:53.5483939Z ##[group]Run find cache_test -type f | xargs -t grep -e.
fetch Print files (B) 2021-11-10T22:59:53.5484730Z find cache_test -type f | xargs -t grep -e.
fetch Print files (B) 2021-11-10T22:59:53.5498140Z shell: C:\Program Files\Git\bin\bash.EXE --noprofile --norc -e -o pipefail {0}
fetch Print files (B) 2021-11-10T22:59:53.5498674Z ##[endgroup]
fetch Print files (B) 2021-11-10T22:59:55.8119800Z grep -e. cache_test/cache_test/subdir/a.txt cache_test/cache_test/subdir/b.md
fetch Print files (B) 2021-11-10T22:59:56.1777887Z cache_test/cache_test/subdir/a.txt:Lorem ipsum
fetch Print files (B) 2021-11-10T22:59:56.1784138Z cache_test/cache_test/subdir/b.md:dolor sit amet
fetch Post Restore 2021-11-10T22:59:56.3890391Z Post job cleanup.
fetch Post Restore 2021-11-10T22:59:56.5481739Z Cache hit occurred on the primary key multiple-files-, not saving cache.
Came here to see if I can cache multiple binary files. I see there is a separate workflow for pushing the cache and another one for retrieving it. We had a different use case where we need to install certain dependencies; sharing it here.
Usecase
Your workflow needs gcc and python3 to run (the dependencies can be any others as well).
You have a script to install the dependencies, ./install-dependencies.sh, and you pass it the appropriate environment variables, like ENV_INSTALL_PYTHON=true or ENV_INSTALL_GCC=true.
Points to be noted
./install-dependencies.sh takes care of installing the dependencies into the path ~/bin and produces the executable binaries in the same path. It also ensures that the $PATH environment variable is updated with the new binary paths.
Instead of duplicating the cache-check and binary-install steps twice (as we have two binaries now), we do it only once. Even if we had to install 50 binaries, we could still do it in the same two steps.
The cache key name python-gcc-cache-key can be anything, but make sure it is unique.
The third step, - name: install python, gcc, takes care of creating the key python-gcc-cache-key if it was not found, even though we have not mentioned this key name anywhere in this step.
The first step is where you check out the repository containing your ./install-dependencies.sh script.
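For context, a hypothetical sketch of what ./install-dependencies.sh might look like (the real script is not shown in the post; the wrapper scripts below are placeholders standing in for actual download/build steps):

```shell
#!/usr/bin/env bash
# Hypothetical install-dependencies.sh: installs requested tools into
# ~/bin, guarded by ENV_INSTALL_* variables.
BIN_DIR="$HOME/bin"
mkdir -p "$BIN_DIR"

# For demonstration, default both flags to true:
ENV_INSTALL_PYTHON="${ENV_INSTALL_PYTHON:-true}"
ENV_INSTALL_GCC="${ENV_INSTALL_GCC:-true}"

if [ "$ENV_INSTALL_PYTHON" = "true" ]; then
  # Placeholder: a real script would download or build python here.
  printf '#!/bin/sh\nexec /usr/bin/python3 "$@"\n' > "$BIN_DIR/python"
  chmod +x "$BIN_DIR/python"
fi

if [ "$ENV_INSTALL_GCC" = "true" ]; then
  # Placeholder: a real script would download or build gcc here.
  printf '#!/bin/sh\nexec /usr/bin/gcc "$@"\n' > "$BIN_DIR/gcc"
  chmod +x "$BIN_DIR/gcc"
fi

# Later steps prepend this directory to PATH, as the validate step shows.
export PATH="$BIN_DIR:$PATH"
```

The key point is that everything lands under ~/bin, which is exactly the set of paths the cache step stores and restores.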
Workflow
name: Install dependencies
on: [push]
jobs:
  install_dependencies:
    runs-on: ubuntu-latest
    name: Install python, gcc
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      ## python, gcc installation
      # Check if python, gcc are present in the worker cache
      - name: python, gcc cache
        id: python-gcc-cache
        uses: actions/cache@v2
        with:
          path: |
            ~/bin/python
            ~/bin/gcc
          key: python-gcc-cache-key
      # Install python, gcc if they were not found in the cache
      - name: install python, gcc
        if: steps.python-gcc-cache.outputs.cache-hit != 'true'
        working-directory: .github/workflows
        env:
          ENV_INSTALL_PYTHON: true
          ENV_INSTALL_GCC: true
        run: |
          ./install-dependencies.sh
      - name: validate python, gcc
        working-directory: .github/workflows
        run: |
          ENV_INSTALL_BINARY_DIRECTORY_LINUX="$HOME/bin"
          export PATH="$ENV_INSTALL_BINARY_DIRECTORY_LINUX:$PATH"
          python3 --version
          gcc --version
Benefits
The benefit depends on which binaries you are trying to install.
For us, the saving was nearly 50 seconds every time there was a cache hit.

Bitbucket pipelines how to merge two variables to produce another variable to be used somewhere else

I am trying to work out a Bitbucket pipeline using bitbucket-pipelines.yml:
image: microsoft/dotnet:sdk
pipelines:
  branches:
    master:
      - step:
          script:
            - dotnet build $PROJECT_NAME
            - export EnvrBuild=Production_$BITBUCKET_BUILD_NUMBER
            - '[ ! -e "$BITBUCKET_CLONE_DIR/$EnvrBuild" ] && mkdir $BITBUCKET_CLONE_DIR/$EnvrBuild'
            - dotnet publish $PROJECT_NAME --configuration Release
            - cp -r $BITBUCKET_CLONE_DIR/$PROJECT_NAME/bin/Release/netcoreapp2.1/publish/** $BITBUCKET_CLONE_DIR/$EnvrBuild
          artifacts:
            - $EnvrBuild/**
I am new to pipelines in Bitbucket. When I echo $EnvrBuild I get the right result, but $EnvrBuild does not contain anything in the subsequent steps and the step does not produce any artifacts; however, if I hard-code the values, it works. Is there a way to do something like $BITBUCKET_BUILD_NUMBER+"_"+$BITBUCKET_BRANCH? (I know this is wrong, but you get the idea of what I am trying to achieve.) Thank you in advance.
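On the concatenation part of the question: shell needs no + operator, since adjacent expansions join directly. A minimal sketch with made-up values:

```shell
# Stand-ins for the Bitbucket-provided variables:
BITBUCKET_BUILD_NUMBER=42
BITBUCKET_BRANCH=master
# Juxtaposition concatenates; braces keep the variable names unambiguous:
export EnvrBuild="${BITBUCKET_BRANCH}_${BITBUCKET_BUILD_NUMBER}"
echo "$EnvrBuild"   # master_42
```

This works inside a script: line; the artifacts limitation discussed next is a separate issue.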
Variable expansion is not allowed when specifying artifacts; you have to provide a static value. However, you can store multiple subdirectories under a static build directory and cover them implicitly with wildcards. Here is an example:
image: microsoft/dotnet:sdk
pipelines:
  branches:
    master:
      - step:
          script:
            - dotnet build $PROJECT_NAME
            - export EnvrBuild=Production_$BITBUCKET_BUILD_NUMBER
            - '[ ! -e "$BITBUCKET_CLONE_DIR/$EnvrBuild" ] && mkdir $BITBUCKET_CLONE_DIR/$EnvrBuild'
            - dotnet publish $PROJECT_NAME --configuration Release
            - mkdir -p $BITBUCKET_CLONE_DIR/build_dir/$EnvrBuild
            - cp -r $BITBUCKET_CLONE_DIR/$PROJECT_NAME/bin/Release/netcoreapp2.1/publish/** $BITBUCKET_CLONE_DIR/build_dir/$EnvrBuild
          artifacts:
            - build_dir/**
      - step:
          script:
            - export EnvrBuild=Production_$BITBUCKET_BUILD_NUMBER
            - ls build_dir/$EnvrBuild

Travis CI keeps saying "change detected" even if the directory is removed in `before_cache`

Here is an example build.
I'm not sure what went wrong, but even if I have this part in .travis.yml, it keeps saying change detected:
cache:
  directories:
    - $HOME/virtualenv/python3.6.*
    - $HOME/.cache/pip
before_cache:
  - rm -f $HOME/.cache/pip/log/debug.log
  - rm -rf $HOME/.cache/pip/http
The part of the log looks like:
$ rm -f $HOME/.cache/pip/log/debug.log
$ rm -rf $HOME/.cache/pip/http
store build cache
change detected (content changed, file is created, or file is deleted):
/home/travis/.cache/pip/http/0/2/8/3/8/0283814c221ac4bc25c88210daf1373d5ea1599443793f980776f2bd
/home/travis/.cache/pip/http/0/a/8/f/a/0a8faabd212d81beff3ad0e11f3e4746188c0ad05c9190218de2e48a
/home/travis/.cache/pip/http/1/a/6/6/8/1a668413371d25a5c96b0d9ce943feb382e5084277c6becd46243276
/home/travis/.cache/pip/http/3/a/f/3/a/3af3addf06e983a6c02f46e7bea70c221d3ff95bf1418fa6da354e14
/home/travis/.cache/pip/http/3/d/0/7/9/3d0790aa6d8aba43447ad4d8fdc684c544812f2cc57ad084f4b1b2db
/home/travis/.cache/pip/http/4/3/5/8/9/435895f5c58d1fbe5d6efd64c4a3afa3e8b280691afe2988aaf12f5c
/home/travis/.cache/pip/http/7/b/f/8/d/7bf8d0ac304d190542382e13233a33a5644477a3738766e4e84c6fe1
/home/travis/.cache/pip/http/8/1/0/3/1/8103159fcae9ef47b8f04cd57495057ea1d442635ec8972390866e7e
/home/travis/.cache/pip/http/8/a/4/e/c/8a4eccf4e850fd6cf5ebe8398d0140632e274527091f7587e668be40
/home/travis/.cache/pip/http/9/8/b/2/1/98b21875dce7a2d53963fef4d2f05edab9e8237a174b4b9a157cb45f
/home/travis/.cache/pip/http/a/c/1/0/1/ac
...
changes detected, packing new archive
.
uploading archive
Any solution?
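For what it's worth, the kind of content check Travis appears to run can be reproduced locally by checksumming the tree twice; a sketch with illustrative names:

```shell
# Snapshot a directory's content state before and after a change.
dir=$(mktemp -d)
echo a > "$dir/keep.txt"
snap1=$(cd "$dir" && find . -type f -exec md5sum {} + | sort | md5sum)
echo b > "$dir/new-http-entry"   # stand-in for a fresh pip http cache file
snap2=$(cd "$dir" && find . -type f -exec md5sum {} + | sort | md5sum)
[ "$snap1" = "$snap2" ] || echo "change detected"
```

If anything under the cached directories differs between the two snapshots, a re-upload is triggered, which is why files written after before_cache still count.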
In my .travis.yml I added:
cache:
  directories:
    - $HOME/.cache/wheels
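To actually populate and reuse that directory, a sketch of the surrounding .travis.yml (pip's --wheel-dir and --find-links flags; the requirements file name is an assumption):

```yaml
cache:
  directories:
    - $HOME/.cache/wheels
install:
  # Build wheels into the cached directory, then install from it.
  - pip wheel --wheel-dir="$HOME/.cache/wheels" -r requirements.txt
  - pip install --find-links="$HOME/.cache/wheels" -r requirements.txt
```

Because the wheels directory only changes when dependencies change, the cache stays stable between builds.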

How to enable GZip compression for Gitlab Pages?

I am using GitLab Pages to serve my site, which scores poorly in page ranking. I can't find any solution for how to do the following in GitLab (non-enterprise version):
Specify HTTP cache headers for various page resources, such as an image, so that they can be cached.
Enable GZip compression, since the page-ranking report says compression is disabled on gitlab.io.
It looks like specifying HTTP Cache Headers is still not possible. But at least they have hardcoded "max-age=600" for all the resources here.
You can compress contents of your public folder via .gitlab-ci.yml:
script:
  - npm install
  - npm run build
  - gzip -k -6 -r public
GitLab has support for serving compressed assets if you pre-compress them in the pages CI Job already. See the documentation.
Note that you can and also should use brotli compression as it's optimized for web content and supported by most modern browsers.
There is also a suggested snippet for your .gitlab-ci.yml:
pages:
  # Other directives
  script:
    # Build the public/ directory first
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec gzip -f -k {} \;
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec brotli -f -k {} \;
I haven't found a way of influencing cache behavior. I am also looking for this.
Enable GZip compression for GitLab Pages
If you add the precompressed .gz versions of the files of your static site, then nginx can serve them instead of the regular ones. Add this line to your .gitlab-ci.yml file:
image: alpine:latest
pages:
  stage: deploy
  script:
    - mkdir .temp
    - cp -r * .temp
    - mv .temp public
    - gzip -k -9 $(find public -type f)
  artifacts:
    paths:
      - public
  only:
    - master
This command compresses all files found in the public directory with the maximum compression ratio.
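The effect of gzip -k can be sanity-checked locally against a throwaway tree; a minimal sketch:

```shell
# Build a throwaway "public" tree and compress it the same way.
dir=$(mktemp -d)
mkdir -p "$dir/public"
echo '<html>hello</html>' > "$dir/public/index.html"
gzip -k -9 $(find "$dir/public" -type f)
ls "$dir/public"   # the original index.html plus index.html.gz alongside it
```

Because -k keeps the originals, nginx can fall back to the uncompressed file for clients that do not accept gzip.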

Mage::getStoreConfig not returning updated values

Recently we've noticed that Mage::getStoreConfig is not returning updated values in a app/code/local plugin. Everything was working as of last Friday so we assume that something has changed on the server.
We can see the values updating correctly in the database table core_config_data.
We have
recompiled
flushed the Magento Cache
flushed the Cache Storage
reset folder and file ownership and permissions
find . -type f -exec chmod 644 {} \;
find . -type d -exec chmod 755 {} \;
For example, we added an extra character to the store phone number and can see that the database value has updated, but the change does not show with the following line:
Mage::getStoreConfig('general/store_information/phone')
As a test we duplicated the site and database via Plesk and applied the latest patches to both sites. The duplicated site worked as normal.
I'm intrigued to find out what has happened, so any ideas as to what the issue might be would be welcome.
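One thing worth ruling out is a stale file cache under var/ that survives an admin flush. A sketch, assuming a standard Magento 1 directory layout (paths here are typical, not taken from the question; the demo runs against a throwaway tree):

```shell
# Clear Magento 1's file caches by hand; point MAGENTO_ROOT at a
# real install instead of the throwaway default used here.
MAGENTO_ROOT="${MAGENTO_ROOT:-$(mktemp -d)}"
mkdir -p "$MAGENTO_ROOT/var/cache" "$MAGENTO_ROOT/var/full_page_cache"
touch "$MAGENTO_ROOT/var/cache/mage--0"   # stand-in cache entry
rm -rf "$MAGENTO_ROOT/var/cache/"* "$MAGENTO_ROOT/var/full_page_cache/"*
```

If the config values come back correct after removing the file cache but not after the admin flush, the cache backend configuration is the place to look.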