My build files are large and messy, which makes them difficult to read. For example:
plugins {
    ...
    id "com.google.protobuf" version "0.8.17"
}

dependencies {
    implementation "androidx.datastore:datastore-core:1.0.0"
    implementation "com.google.protobuf:protobuf-javalite:3.18.0"
    ...
}

protobuf {
    protoc {
        artifact = "com.google.protobuf:protoc:3.14.0"
    }
    // Generates the java Protobuf-lite code for the Protobufs in this project. See
    // https://github.com/google/protobuf-gradle-plugin#customizing-protobuf-compilation
    // for more information.
    generateProtoTasks {
        all().each { task ->
            task.builtins {
                java {
                    option 'lite'
                }
            }
        }
    }
}
I want to move the above code into an external file and then include it from the build file. How should I do it?
According to the Gradle documentation, it is currently not possible to move the plugins block to any file other than the project's build script or the settings.gradle file.
The other sections, such as dependencies or protobuf, can be moved into separate Gradle files and imported with the following statement:
apply from: "${project.rootDir}/your-gradle-file"
Of course, the path of your-gradle-file should be adjusted to the folder structure you choose for your project.
If you want to split the dependencies across multiple Gradle files, you can do the following.
In your main Gradle file:
dependencies {
    apply from: "${project.rootDir}/depsGroup1.gradle"
    apply from: "${project.rootDir}/depsGroup2.gradle"
}

and within each depsGroup file:

dependencies {
    implementation xyz
}
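The same pattern works for the protobuf configuration, as long as the com.google.protobuf plugin itself stays in the main build script's plugins block. A sketch, assuming the external file is named protobuf-config.gradle in the project root (the file name is just an example):

// protobuf-config.gradle (hypothetical file name)
dependencies {
    implementation "androidx.datastore:datastore-core:1.0.0"
    implementation "com.google.protobuf:protobuf-javalite:3.18.0"
}

protobuf {
    protoc {
        artifact = "com.google.protobuf:protoc:3.14.0"
    }
    generateProtoTasks {
        all().each { task ->
            task.builtins {
                java {
                    option 'lite'
                }
            }
        }
    }
}

and in the main build.gradle:

plugins {
    id "com.google.protobuf" version "0.8.17" // the plugins block cannot be moved
}

apply from: "${project.rootDir}/protobuf-config.gradle"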
I have a multi-project Gradle build. In each subproject I have a properties.gradle file like the following:
def usefulMethod() {
    return 'foo'
}

ext {
    foo = 'bar'
    usefulMethod = this.&usefulMethod
}
And then I import it into the subproject build.gradle using apply from: './properties.gradle'.
However, if two subprojects import a variable with the same name, I get this error:
Cannot add extension with name 'foo', as there is an extension already registered with that name.
It seems that adding to ext affects the entire project instead of just the subproject like I wanted. What is the correct way to import properties and variables from an external file in a subproject without leaking them into the entire project build?
The plain ext is the extension for the ENTIRE project, the root project and all subprojects. To avoid polluting the root namespace whenever you include a file via apply from..., you should instead use project.ext; project refers to the current project or subproject being built. For example, the file below could be apply from'd to add a downloadFile function to the current project:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'de.undercouch:gradle-download-task:3.4.3'
    }
}

apply plugin: de.undercouch.gradle.tasks.download.DownloadTaskPlugin

project.ext {
    downloadFile = { String url, File destination ->
        if (destination.exists()) {
            println "Skipping download because ${destination} already exists"
        } else {
            download {
                src url
                dest destination
            }
        }
    }
}
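For example, if the script above is saved as gradle/download.gradle (a hypothetical path) and applied from a subproject's build.gradle, the function can be called like any other method; the URL and file names below are just placeholders:

apply from: "${rootDir}/gradle/download.gradle"

task fetchSchema {
    doLast {
        // downloadFile was registered on this project's ext by the applied script
        downloadFile('https://example.com/schema.json', file("$buildDir/schema.json"))
    }
}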
My project is using the Spotless plugin. I need to exclude Java files in the generated-resources directory from formatting. How do I do that?
This is how I am using the plugin.
apply plugin: "com.diffplug.gradle.spotless"
spotless {
lineEndings = 'unix';
java {
eclipseFormatFile "eclipse-java-google-style.xml"
}
}
My sourceSets include the generated-resources directory, and I do not want to remove it from them.
You can specify a target for the spotless formatter which allows includes and excludes.
I use the following in the top-level build.gradle in a multi-project build where all Java code resides in subdirectories under the modules directory:
subprojects {
    ...
    spotless {
        java {
            target project.fileTree(project.rootDir) {
                include '**/*.java'
                exclude 'modules/*/generated/**/*.*'
            }
            googleJavaFormat()
        }
    }
    ...
}
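Adapted to the single-project setup from the question, keeping its Eclipse formatter, it could look roughly like this (a sketch; the include/exclude patterns are assumptions about where your sources and generated-resources actually live):

spotless {
    lineEndings = 'unix'
    java {
        // Format only hand-written sources; skip anything under generated-resources.
        target project.fileTree(project.projectDir) {
            include 'src/**/*.java'
            exclude '**/generated-resources/**'
        }
        eclipseFormatFile "eclipse-java-google-style.xml"
    }
}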
In my Gradle build script I want to import a ZIP dependency that contains static analysis configuration (CheckStyle, PMD etc.) and then "apply from" the files in that ZIP. When anyone runs the "check" task, my custom static analysis configuration should be used then.
I've tried the somewhat convoluted solution below, but I can't get it to work. The files are retrieved and unpacked into the "config" directory, but "apply from" does not work - Gradle complains it cannot find the files; I assume this is due to "apply from" being run during the build configuration phase.
Is there a simpler way to do this?
repositories {
    maven { url MY_MAVEN_REPO }
}

configurations {
    staticAnalysis {
        description = "Static analysis configuration"
    }
}

dependencies {
    staticAnalysis group: 'my-group', name: 'gradle-static-analysis-conf', version: '+', ext: 'zip'
}

// Unzip static analysis conf files to "config" in root project dir.
// This is the Gradle default location.
task prepareStaticAnalysisConf(type: Copy) {
    def confDir = new File(rootProject.projectDir, "config")
    if (!confDir.exists()) {
        confDir.mkdirs()
    }
    from {
        configurations.staticAnalysis.collect { zipTree(it) }
    }
    into confDir
    apply from: 'config/quality.gradle'
}
check.dependsOn('prepareStaticAnalysisConf')
You are perfectly right: Gradle runs apply during the evaluation phase, when prepareStaticAnalysisConf has not yet been executed and the archive is not unpacked.
Instead of a task, just write some top-level code; it should do the trick. Also, you'd better use a buildscript-level dependency, so that it is resolved before the script is executed.
Here is the full script:
buildscript {
    repositories {
        maven { url MY_MAVEN_REPO }
    }
    dependencies {
        classpath group: 'my-group', name: 'gradle-static-analysis-conf', version: '+', ext: 'zip'
    }
}

def zipFile = buildscript.configurations.classpath.singleFile

copy {
    from zipTree(zipFile)
    into 'config'
}

apply from: 'config/quality.gradle'
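For context, config/quality.gradle is whatever script your archive ships; a hypothetical sketch of the kind of content it might contain (plugin choices and paths here are invented for illustration):

// Hypothetical contents of config/quality.gradle
apply plugin: 'checkstyle'
apply plugin: 'pmd'

checkstyle {
    configFile = file("$rootDir/config/checkstyle/checkstyle.xml")
}

pmd {
    ruleSetFiles = files("$rootDir/config/pmd/ruleset.xml")
}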
I am trying to use the avro-gradle-plugin on GitHub, but have not had any luck getting it to work. Does anyone have any sample code showing how they got it to work?
I figured out how to do it myself. The following is a snippet that I would like to share for people who might run into the same issues as I did:
apply plugin: 'java'
apply plugin: 'avro-gradle-plugin'

sourceCompatibility = "1.6"
targetCompatibility = "1.6"

buildscript {
    repositories {
        maven {
            // your maven repo information here
        }
    }
    dependencies {
        classpath 'org.apache.maven:maven-artifact:2.2.1'
        classpath 'org.apache.avro:avro-compiler:1.7.1'
        classpath 'org.apache.avro.gradle:avro-gradle-plugin:1.7.1'
    }
}

compileAvro.source = 'src/main/avro'
compileAvro.destinationDir = file("$buildDir/generated-sources/avro")

sourceSets {
    main {
        java {
            srcDir compileAvro.destinationDir
        }
    }
}

dependencies {
    compileAvro
}
I found "com.commercehub.gradle.plugin.avro" gradle plugin to work better.
use the folllowing:
// Gradle 2.1 and later
plugins {
    id "com.commercehub.gradle.plugin.avro" version "VERSION"
}

// Earlier versions of Gradle
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath "com.commercehub.gradle.plugin:gradle-avro-plugin:VERSION"
    }
}

apply plugin: "com.commercehub.gradle.plugin.avro"
More details at https://github.com/commercehub-oss/gradle-avro-plugin
When evaluating a plugin, the following questions need to be asked:
Are generated files included in the source jar?
Is the plugin fast? A good plugin uses the Avro tools API instead of forking a VM for every file; for a large number of files, creating a VM per file can take 10 minutes to compile.
Do you need intermediate avsc files?
Is the build incremental (i.e. does it avoid regenerating all files unless one of the sources has changed)?
Is the plugin flexible enough to give access to the generated schema files, so that further actions, such as registering the schema in a schema repository, can be performed?
It is easy enough to implement without any plugin if you are not happy with a plugin or need more flexibility.
//
// define source and destination
//
def avdlFiles = fileTree('src/Schemas').include('**/*.avdl')
// Do NOT generate into $buildDir, because IntelliJ will ignore files in
// this location and will show errors in source code
def generatedJavaDir = "generated/avro/java"
sourceSets.main.java.srcDir generatedJavaDir

//
// Make avro-tools available to the build script
//
buildscript {
    dependencies {
        classpath group: 'org.apache.avro', name: 'avro-tools', version: avro_version
    }
}

//
// Define task's input and output, compile idl to schema and schema to java
//
task buildAvroDtos() {
    group = "build"
    inputs.files avdlFiles
    outputs.dir generatedJavaDir
    doLast {
        avdlFiles.each { avdlFile ->
            def parser = new org.apache.avro.compiler.idl.Idl(avdlFile)
            parser.CompilationUnit().getTypes().each { schema ->
                def compiler = new org.apache.avro.compiler.specific.SpecificCompiler(schema)
                compiler.compileToDestination(avdlFile, new File(generatedJavaDir))
            }
        }
    }
}

//
// Publish source jar, including generated files
//
task sourceJar(type: Jar, dependsOn: buildAvroDtos) {
    from sourceSets.main.allSource
    // Package schemas into source jar
    into("Schemas") { from avdlFiles }
}

// Clean "generated" folder upon "clean" task
clean {
    delete('generated')
}
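One wiring detail worth making explicit for the snippet above (a single line, assuming the buildAvroDtos task name used there): Java compilation should depend on the generation task so the generated sources exist before the compiler runs.

// Generated Avro classes must exist before the Java compiler runs.
compileJava.dependsOn buildAvroDtos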
Configuration for Avro with Gradle as the build tool needs to be added along with applying the java plugin.
Below are the changes in settings.gradle:
pluginManagement {
    repositories {
        gradlePluginPortal()
        mavenCentral()
    }
}
Below are the changes in build.gradle:
plugins {
    id "com.github.davidmc24.gradle.plugin.avro" version "1.3.0"
}

repositories {
    mavenCentral()
}

dependencies {
    implementation "org.apache.avro:avro:1.11.0"
}

generateAvroJava {
    source("${projectDir}/src/main/resources/avro") // source path of the Avro files
}
If you also want to generate setter methods, add this block to build.gradle as well:
avro {
    createSetters = true
}
I am new to gradle but learning quickly. I need to get some specific JARs from logback into a new directory in my release task. The dependencies are resolving OK, but I can't figure out how, in the release task, to extract just logback-core-1.0.6.jar and logback-access-1.0.6.jar into a directory called 'lib/ext'. Here are the relevant excerpts from my build.gradle.
dependencies {
    ...
    compile 'org.slf4j:slf4j-api:1.6.4'
    compile 'ch.qos.logback:logback-core:1.0.6'
    compile 'ch.qos.logback:logback-classic:1.0.6'
    runtime 'ch.qos.logback:logback-access:1.0.6'
    ...
}
...
task release(type: Tar, dependsOn: war) {
    extension = "tar.gz"
    classifier = project.classifier
    compression = Compression.GZIP
    into('lib') {
        from configurations.release.files
        from configurations.providedCompile.files
    }
    into('lib/ext') {
        // TODO: Right here I want to extract just logback-core-1.0.6.jar and logback-access-1.0.6.jar
    }
    ...
}
How do I iterate over the dependencies to locate those specific files and drop them into the lib/ext directory created by into('lib/ext')?
Configurations are just (lazy) collections. You can iterate over them, filter them, etc. Note that you typically only want to do this in the execution phase of the build, not in the configuration phase. The code below achieves this by using the lazy FileCollection.filter() method. Another approach would have been to pass a closure to the Tar.from() method.
task release(type: Tar, dependsOn: war) {
    ...
    into('lib/ext') {
        from findJar('logback-core')
        from findJar('logback-access')
    }
}

def findJar(prefix) {
    configurations.runtime.filter { it.name.startsWith(prefix) }
}
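The other approach mentioned above, passing a closure to from(), would look roughly like this inside the release task (a sketch reusing the same logback prefixes; the closure defers resolving the configuration until the task actually runs):

into('lib/ext') {
    // The closure is not evaluated until execution time, keeping resolution lazy.
    from {
        configurations.runtime.filter { file ->
            file.name.startsWith('logback-core') || file.name.startsWith('logback-access')
        }
    }
}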
It is worth noting that the accepted answer filters the Configuration as a FileCollection, so within the collection you can only access the attributes of a file. If you want to filter on the dependency itself (on group, name, or version) rather than its filename in the cache, then you can use something like:
task copyToLib(type: Copy) {
    from findJarsByGroup(configurations.compile, 'org.apache.avro')
    into "$buildDir/lib"
}

def findJarsByGroup(Configuration config, groupName) {
    config.files { it.group.equals(groupName) }
}
files takes a dependency spec closure, which is just a filter function on a Dependency; see https://gradle.org/docs/current/javadoc/org/gradle/api/artifacts/Dependency.html
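Applied to the logback jars from the question, that could look like the following (a sketch; the task name copyLogbackExt and the output directory are made up for illustration):

task copyLogbackExt(type: Copy) {
    // Filter by dependency coordinates (group/name) instead of by file name.
    from configurations.runtime.files { dep ->
        dep.group == 'ch.qos.logback' && dep.name in ['logback-core', 'logback-access']
    }
    into "$buildDir/lib/ext"
}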