Gradle incremental builds on different machines

We have multiple GitLab runners on different machines.
We split our pipeline into multiple dependent steps, and I see that Gradle doesn't run incrementally across them.
For example, we have a "build APK" step and an "upload APK to HockeyApp" step. We always copy the APK output from one step to the next, yet Gradle still builds the APK from scratch because the upload step needs it.
How can I troubleshoot it?
What folders to copy to make sure gradle runs incrementally?
We are on Gradle 4.x (4.1 and moving to 4.2)

I don't think you should do this. You would probably need to copy over the .gradle directory from the root project, but I'm not sure that works well or as expected. Instead, I suggest you use the new Build Cache, which sounds like exactly what you need in your situation: it is an official feature and does not involve unsupported copying-around of build metadata.
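For illustration, a minimal sketch of wiring up the Build Cache in Gradle 4.x; the cache node URL below is a placeholder you would replace with your own:

# gradle.properties -- turn on task output caching for every build
org.gradle.caching=true

// settings.gradle -- share a remote cache between all CI machines
buildCache {
    remote(HttpBuildCache) {
        url = 'https://your-cache-node.example.com/cache/'
        push = true  // let CI builds populate the cache
    }
}

Alternatively, leave gradle.properties alone and pass --build-cache on the command line for CI builds only.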

Related

How to avoid Gradle wrapper downloading distro when running in Gradle docker image?

My project is built with gradlew. GitLab CI builds the project in a Docker runner using an official Gradle image (see https://hub.docker.com/_/gradle).
Now, even though Gradle is already installed in the container, the wrapper still downloads the distribution every time, and this makes up the majority of the build time.
How can I "tell" the wrapper about the already installed distribution, so that it will not re-download it (assuming, of course, that the versions match)?
Of course, the alternative is to use gradle instead of gradlew in CI and rely on the Docker image to have the correct distribution, but I'd like to avoid this if possible, as I would then have to manually keep .gitlab-ci.yml and the wrapper config in sync.
I don't think you can instruct the wrapper to use a local version of Gradle that was installed manually.
The only approach I can think of to prevent downloading the distribution on every build, that doesn't involve additional steps when upgrading Gradle, is to cache the Gradle home folder (e.g. /home/gradle/.gradle). This should be possible even if it resides in a Docker container.
I don't know the details of how GitLab supports caching, but it probably only makes sense if the cache is stored locally (either on the same machine or in a cache server with high network bandwidth). If it has to be uploaded and downloaded from something like an S3 bucket on every build, that would likely take as much time as downloading it from services.gradle.org. But if you can make this work, you will not only cache the Gradle distribution but also the build dependencies, which should further speed up the build.
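As a sketch (the job name and image tag are illustrative), the usual trick in .gitlab-ci.yml is to move the Gradle user home inside the project directory, since GitLab can only cache paths under it; the wrapper downloads distributions into $GRADLE_USER_HOME/wrapper/dists, so caching that path avoids the re-download:

variables:
  GRADLE_USER_HOME: "$CI_PROJECT_DIR/.gradle"

cache:
  key: "$CI_COMMIT_REF_SLUG"
  paths:
    - .gradle/wrapper   # cached wrapper distributions
    - .gradle/caches    # cached build dependencies

build:
  image: gradle:4.2-jdk8
  script:
    - ./gradlew build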

Clean up the .gradle folder on macOS

I'm trying to clean up some space on my Mac, and I have noticed that .gradle folder takes 4.49GB.
Inside it, I can see that there is a cache folder that is 1.23GB large and wrapper that is 3.18GB large.
Inside the wrapper folder I can see such a structure:
gradle-2.13-all
gradle-3.1-bin
gradle-3.5-rc-2-bin
gradle-3.5.1-bin
gradle-4.2-all
gradle-4.2.1-bin
And so on. This looks like old data that is still present.
I did not find any Gradle command to clean up its old data, nor did I find out whether this data is important.
So the question is what can be deleted in order to free some space?
You can safely delete the whole .gradle folder if you don't need to work offline. The folder will be recreated as soon as you run any Gradle task, and Gradle will re-download the previously cached dependencies and wrapped Gradle versions as needed.
You have most likely cached more dependencies and wrapper distributions than you currently need, so deleting them can free some space for you.
Gradle 4.10 and later will clean up that folder for you and remove unused files automatically. See the release notes.
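For example (assuming the default Gradle user home), either of these frees the space, and everything removed is re-downloaded on demand:

# remove everything (dependencies re-download on the next build)
rm -rf ~/.gradle

# or reclaim only the wrapper distributions, keeping the dependency cache
rm -rf ~/.gradle/wrapper/dists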

Do I need bcsymbolmap files created by Carthage

I am using Carthage dependency manager in my iOS project.
I have the Carthage/build folder in my repository to always have ready to go built frameworks when checking out the repo.
I am wondering what the bcsymbolmap files in the build folder are for. Quite a few of them are created with every carthage update.
Do I need to keep these files? Should I have them in my repository?
No, you don't need those files. If you set up Carthage properly, the binary, .dSYM, and .bcsymbolmap files are copied during the build phase. When you archive the build for App Store distribution, all needed files are included in the archive, and after you upload the build to the App Store you can upload the dSYM files at any time (so you can decode your crash reports). In fact, you don't need to store .dSYM and .bcsymbolmap files in your repository.
There is a good article explaining what happens when a framework is built (and what the Carthage scripts actually do): https://instabug.com/blog/ios-binary-framework/. It also explains what .bcsymbolmap files are used for: Apple's servers can rebuild your code using Bitcode, and you can then symbolicate your crash reports.
So, you don't need to keep those files, and there is no need to store them in the repository. Another reason not to commit the contents of the Build folder is that the project may fail to build with it on another machine with a different environment anyway. If you want to build your project with the same versions of dependencies, use the carthage bootstrap command instead of update.
P.S.
You can also look into what the copy-frameworks command does:
https://github.com/Carthage/Carthage/blob/fc0166b4827736bac2c804fc928797f1a742c455/Source/carthage/CopyFrameworks.swift
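For reference, that command is typically wired up as a Run Script build phase in Xcode along these lines (the framework name is just an example):

/usr/local/bin/carthage copy-frameworks

with each framework listed under the phase's Input Files:

$(SRCROOT)/Carthage/Build/iOS/Alamofire.framework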
If you use carthage build without the specification of a project, all
.bcsymbolmaps should be deleted, but if you use e.g. carthage build Alamofire it should just delete the corresponding .bcsymbolmap
From the discussion in a GitHub issue. It looks like you do not need those files, since the default behaviour is to delete them when running a new build.
In general, you should not commit files generated during a local build into your repository, since builds can be device specific, and everyone cloning into or pulling from your repository should be able to perform a build themselves.
Bitcode Symbol Map (BCSymbolMap)
A .bcsymbolmap is a textual file with debug information, generated for decoding stack traces. It solves the same problem as a .dSYM, but at a lower level, and only when Bitcode is generated.
It looks like:
BCSymbolMap Version: 2.0
__swift_FORCE_LOAD_$_swiftCompatibility50
__swift_FORCE_LOAD_$_swiftCompatibility51
__swift_FORCE_LOAD_$_swiftCompatibilityDynamicReplacements
_$sSo26SCNetworkReachabilityFlagsVMa
_$sSo24SCNetworkReachabilityRefaMa
...
As for "Do I need to keep these files? Should I have them in my repository?": they are optional.
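If you ever do need them, it is when symbolicating a Bitcode build: dsymutil can apply the symbol maps to the obfuscated dSYMs that Apple hands back after recompiling from Bitcode. A sketch, with placeholder paths:

dsymutil -symbol-map Path/To/BCSymbolMaps MyApp.app.dSYM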

How to configure maven-release-plugin to copy assets as part of the build cycle?

I have the following requirement.
As part of our release cycle (when artifacts are pushed to Artifactory), we need some assets from the project directory to be copied to a different machine, inside a new folder.
Example:
project/root/src/main/assets/*
copied to
v4 (create this directory before scp-ing the files over).
We currently use the maven-release-plugin for releases and are wondering whether it can be configured to do this. I tried searching for it, but in vain.
Any help here is much appreciated.
Thanks
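One possible approach, sketched under the assumption that plain scp access to the target machine works: release:perform runs the deploy phase, so binding the wagon-maven-plugin's upload goal to deploy would run the copy during the release. The coordinates, versions, server id, host, and paths below are all illustrative:

<build>
  <extensions>
    <!-- adds scp:// transport support to Maven -->
    <extension>
      <groupId>org.apache.maven.wagon</groupId>
      <artifactId>wagon-ssh</artifactId>
      <version>3.5.3</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>wagon-maven-plugin</artifactId>
      <version>2.0.2</version>
      <executions>
        <execution>
          <id>copy-assets</id>
          <phase>deploy</phase>
          <goals>
            <goal>upload</goal>
          </goals>
          <configuration>
            <fromDir>${project.basedir}/src/main/assets</fromDir>
            <includes>**/*</includes>
            <serverId>asset-host</serverId>
            <!-- the v4 directory is created on the remote side -->
            <url>scp://asset-host.example.com/opt/assets/v4</url>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>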

Run TeamCity Build Step on a specific Agent

I've got TeamCity installed and working, and I need to have a build step run on a particular build agent (everything's running on Windows, but we have a Mac portion I need to build as well).
How do I tell the build step what agent I want it to run on? I've seen this, but that references an entire build; I just want a particular step to run on a given agent.
Is this even possible?
As far as I am aware, it is not possible. You may want a separate build configuration to build for Mac.
Sharma is somewhat correct, and KIR has it completely correct.
I needed a build configuration for each platform, Mac and Windows. Then I set a snapshot dependency from the Windows build on the Mac build (to make sure the Mac version builds completely first) and an artifact dependency from the same (to copy the resulting build output from the Mac to the Windows box). Then I modified the build process on the Windows box to include the artifacts, and voila, it works like a charm.
