Fastlane SwiftPM dependency caching - continuous-integration

SwiftPM fetches all modules from scratch whenever I deploy a build, and this takes a long time. I use Fastlane on GitHub for our CI/CD. Maybe there is some way to enable package caching?
It looks like GitHub does not provide built-in caching for Swift packages.
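That said, the generic actions/cache action can be pointed at SwiftPM's working directories to get a similar effect. A sketch (the cached path and the cache key are assumptions to adapt to your project; a command-line `swift build` resolves packages into `.build`, while Xcode/fastlane builds may resolve them elsewhere):

```yaml
# Sketch: cache SwiftPM checkouts with the generic actions/cache action.
# Path and key are assumptions; keyed on Package.resolved so the cache is
# invalidated whenever dependency pins change.
- uses: actions/cache@v3
  with:
    path: .build
    key: ${{ runner.os }}-spm-${{ hashFiles('**/Package.resolved') }}
    restore-keys: |
      ${{ runner.os }}-spm-
```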

Related

Automate updating outdated dependencies in CI/CD using `yarn outdated`

My team is developing a React component library which relies on MaterialUI components. Our employer's customer wants outdated dependencies to be flagged and/or upgraded automatically (at minimum when the dependency on MaterialUI becomes outdated). We are using yarn as our dependency manager.
I found yarn lists all the outdated dependencies (or a specific dependency if specified) through the yarn outdated command. One can then upgrade said dependencies using the yarn upgrade command to which the dependency to be updated is supplied as parameter. To do this using a single command, running yarn upgrade-interactive lists outdated dependencies which the user can then select to be updated.
I am wondering if there is a way to automate this process. I tried piping the results of yarn outdated to yarn upgrade as well as to yarn version, but yarn upgrade seems to ignore whatever input it receives and updates every package regardless, and yarn version throws errors saying the versions are not proper semvers.
I realise yarn upgrade-interactive makes this process easy and quick for developers; however, the project is intended to become open-source over time, and the customer prefers a centralised solution rather than relying on every individual contributor to track this themselves. As far as I am aware, yarn upgrade-interactive cannot be automated, as it requires user input to select the package(s) to be updated.
Other solutions I found, such as Dependabot or packages like 'yarn-outdated-notifier', seem to only work with GitHub. The project is currently running on Azure DevOps and, when it goes public, will run on GitLab.
Is there any way we could do this in our CI/CD environment or with any (free) solutions? The customer prefers to have as few dependencies as possible.
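One way to script the filtering step described above is with standard text tools: extract the package names from the yarn outdated table and pass them to yarn upgrade one at a time. A minimal sketch, assuming Yarn 1's table-style output (the exact columns vary by yarn version, so the sample below is a hypothetical output to illustrate the parsing, not a guaranteed format):

```shell
#!/bin/sh
# Hypothetical sample of `yarn outdated` table output; in CI you would
# capture the real output instead: outdated=$(yarn outdated || true)
outdated='Package        Current  Wanted  Latest  Package Type     URL
@mui/material  5.11.0   5.11.4  5.15.0  dependencies     https://mui.com/
react          18.1.0   18.2.0  18.2.0  devDependencies  https://react.dev/'

# Skip the header row and keep only the first column (the package name).
names=$(printf '%s\n' "$outdated" | awk 'NR > 1 { print $1 }')

# Upgrade each outdated package individually. The real upgrade command is
# left commented out so the sketch is safe to dry-run.
for pkg in $names; do
  echo "would run: yarn upgrade $pkg"
  # yarn upgrade "$pkg"
done
```

This keeps the upgrade per-package (avoiding yarn upgrade's update-everything behaviour), and the loop body is where you could restrict it to a single dependency such as @mui/material.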

How to avoid Gradle wrapper downloading distro when running in Gradle docker image?

My project is built with gradlew. GitLab CI builds the project in a docker runner with an official Gradle image (see https://hub.docker.com/_/gradle).
Now even though Gradle is already installed in the container, the wrapper will still download the distro every time. This makes up the majority of the build time.
How can I "tell" the wrapper about the already installed distro, so that it will not redownload it (assuming of course the versions match)?
Of course the alternative is to use gradle instead of gradlew in CI and rely on the docker image to have the correct distro but I'd like to avoid this if possible, as I would then have to manually keep .gitlab-ci.yml and the wrapper config in sync.
I don't think you can instruct the wrapper to use a local version of Gradle that was installed manually.
The only approach I can think of to prevent downloading the distribution on every build, that doesn't involve additional steps when upgrading Gradle, is to cache the Gradle home folder (e.g. /home/gradle/.gradle). This should be possible even if it resides in a Docker container.
I don't know the details of how GitLab supports caching, but it probably only makes sense if the cache is stored locally (either on the same machine or in a cache server with high network bandwidth). If it has to be uploaded and downloaded from something like an S3 bucket on every build, that would likely take as much time as downloading it from services.gradle.org. But if you can make this work, you will not only cache the Gradle distribution but also the build dependencies, which should further speed up the build.
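A sketch of that caching approach in .gitlab-ci.yml (the key and paths are assumptions to adapt; GRADLE_USER_HOME is redirected into the project directory here because GitLab runners typically only cache paths inside it):

```yaml
# Sketch: cache the Gradle user home between GitLab CI builds, so both
# the wrapper's downloaded distribution and the dependency caches survive.
variables:
  GRADLE_USER_HOME: "$CI_PROJECT_DIR/.gradle"

build:
  image: gradle:jdk11          # hypothetical tag; match your project's JDK
  cache:
    key: "$CI_COMMIT_REF_SLUG"
    paths:
      - .gradle/wrapper        # the downloaded distributions
      - .gradle/caches         # the build dependencies
  script:
    - ./gradlew build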

Go binaries for concourse ci tasks

What are some good patterns for using go in concourse-ci tasks? For example, should I build files locally with all dependencies and check in cross-compiled binaries to the repo? Should I build on concourse prior to running the task?
Examples of what people do here would be great. Public repos of pipelines/tasks even better.
The way I see it there are currently 3 options for handling go builds:
Use vendoring
Explicitly declare the dependencies as concourse resources
Maintain a docker image with the required dependencies already included
All options have pros and cons. The first option is currently my favorite, since responsibility for handling dependencies lies with the project maintainers and there is a very clear way to see which versions/revisions of the dependencies are being used (just check the vendoring tool's config), but it does force you to keep all dependency code in the project's repo.
The second option follows the go "philosophy" of always tracking master, but it may lead to slower builds (concourse will need to check every single resource regularly) and may lead to sudden breakage because of changes in dependencies.
The third option lets you implicitly pin the revisions of the dependencies in the docker image; in that respect it's similar to the first. However, it requires maintaining docker images (not necessarily one per project, but possibly more than one, depending on how many projects use this option and whether there are conflicting dependency versions between them).
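As an illustration of the first (vendoring) option, a minimal Concourse task for a Go project whose dependencies are committed under vendor/ might look like this. Everything here is a sketch: the resource names, image tag, and package path are hypothetical, and the -mod=vendor flag assumes a modules-era Go toolchain rather than the older vendoring tools:

```yaml
# task.yml - sketch of a Concourse task building a vendored Go project.
platform: linux

image_resource:
  type: registry-image
  source:
    repository: golang      # a plain upstream image; no project deps baked in
    tag: "1.21"

inputs:
  - name: project-repo      # git resource holding the source plus vendor/

outputs:
  - name: built-binary

run:
  path: sh
  args:
    - -ec
    - |
      cd project-repo
      # vendor/ is in the repo, so no network access is needed to build
      go build -mod=vendor -o ../built-binary/app ./cmd/app
```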

Scaling TeamCity build chains

We have many projects that are treated, built and deployed the same way and we want to have a unified TeamCity build chain for all of them.
Our build chain should contain:
Compilation and testing for pull requests and develop branch.
SonarQube analysis for pull requests.
NPM publish and autolabel for pull requests merged into develop.
All of the projects are NPM packages that comply with the following:
required scripts: install, test, clean, build
package.json and sonar-project.properties in the root
We had the idea of using common parametrized builds for SonarQube and NPM publish, since they are agnostic to the project itself, but it looks like TeamCity does not support anything like this unless we replicate chain builds for each project.
Ideally it should look something like this: a single parametrized SonarQube build and a single NPM publish build, with each project's compilation build feeding into them.
The problem is that if I add finish-build triggers and dependencies from SonarQube and NPM publish to all of the projects, every project will be built, not only the one that has just changed.
TeamCity does make something like this possible by duplicating build configurations per project.
I do not want many duplicated builds, in the same way I do not fancy duplicated code. Is there a way to create a common build chain, or should I move on and look for a scripted way of generating the duplicated configurations?

How to get Jenkins repository server to host only stable builds?

I have Jenkins version 2.7.1 running on a Windows 7 machine. It is successfully pulling code from a subversion repository and running tests. I have the test jobs set up for the development branch of each project only.
We periodically make stable releases of the projects as jar files with version numbers. I would like to have Jenkins be the repository manager for those stable releases. These are made by hand; there is no Jenkins job making or testing stable releases. The projects do use Maven.
Each stable build is tagged in the subversion repository, so it could be made again on demand if needed.
I downloaded the Maven repository server hoping to make this fit the purpose. I read the documentation that's provided, but it's pretty terse. As I understand it and have it configured now, this appears to have a couple of issues:
If I go to jenkins-ip/plugin/repository/project, it has made directories there that expose the names of all of my projects, which seems undesirable. (Here jenkins-ip is the IP where I access Jenkins on my local network.)
On the other hand, there's nothing but empty directories under these projects, so they're currently useless.
These projects all correspond to the continuous testing of the development branch. There's no apparent way to get the stable builds into the hierarchy. (It doesn't seem efficient to create a job for each stable release...)
Is there any way to get Jenkins (with this plugin or through another method) to be the repository manager just for the stable builds? I know that I can start a different repository manager like archiva, but it would be ideal to use Jenkins since it's already running and it seems to claim capability for this function now.
To use the Maven repository server plugin, you have to build the project on Jenkins.
The plugin will then expose all archived artifacts as a Maven repo.
Note that you need to use the "Maven project" job type for it to work (freestyle is not supported).
There are several plugins that will help you manage building from multiple tags, however not all of them work with "Maven project" type.
You could also try Jenkins pipeline (previously "Workflow") or the Job-DSL plugin.
The simplest solution would be to have a build parameter specify the tag name (then check out e.g. ^/tags/projectname/${tagParam}), but you then have to figure out how to trigger the job.
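That parameterized-tag idea could be sketched as a declarative pipeline. The repository URL, project name, and Maven invocation below are hypothetical placeholders, not taken from the question:

```groovy
// Sketch: a parameterized pipeline that builds a given Subversion tag
// and archives the resulting jar so Jenkins can serve it as an artifact.
pipeline {
    agent any
    parameters {
        string(name: 'TAG', description: 'Subversion tag to build, e.g. 1.2.3')
    }
    stages {
        stage('Checkout tag') {
            steps {
                checkout([$class: 'SubversionSCM',
                          locations: [[remote: "https://svn.example.com/repo/tags/projectname/${params.TAG}"]]])
            }
        }
        stage('Build and archive') {
            steps {
                bat 'mvn -B clean package'   // bat, since the node is Windows 7
                archiveArtifacts artifacts: 'target/*.jar'
            }
        }
    }
}
```

The trigger question remains: such a job would typically be started manually with the tag name, or from a post-commit hook that fires when a new tag is created.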
