I'd like to have three Java applications (a backend, a front end, and an Android app) communicate using protocol buffers (gRPC). All three apps should have access to a shared protobuf repo (GitHub) where I manage the .proto files. I am new to Gradle and protobufs, so I'm not sure what the proper way to manage this is, and any help or guidance would be appreciated.
Can I have each Gradle project declare my GitHub protobuf repo as a dependency, and then pull it down and compile it when I build the project? I assume this would be better than storing the compiled protobuf classes, since the Android app might need a different "Java-lite" version of the protobufs.
I am using the google/protobuf-gradle-plugin to compile the .proto files, and I see documentation for compiling from local files and for pulling in projects that have precompiled .proto files, but none for pulling in remote .proto files. Am I on the right track?
In what form is your remote .proto file/repo? If it is just a URL, then you can use the Download task from the de.undercouch.download plugin:
buildscript {
repositories {
mavenCentral()
}
dependencies {
classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.0'
}
}
plugins {
id "de.undercouch.download" version "3.2.0"
}
group 'testtest'
version '1.0-SNAPSHOT'
apply plugin: 'java'
apply plugin: 'com.google.protobuf'
sourceCompatibility = 1.8
repositories {
mavenCentral()
}
task downloadFile {
    doLast {
        download {
            src 'https://raw.githubusercontent.com/grpc/grpc-java/master/compiler/src/test/proto/test.proto'
            dest "$projectDir/src/main/proto/test.proto"
            overwrite true
        }
    }
}
build.dependsOn downloadFile
dependencies {
compile "io.grpc:grpc-protobuf-lite:1.5.0"
compile "io.grpc:grpc-stub:1.5.0"
}
protobuf {
protoc {
artifact = 'com.google.protobuf:protoc:3.3.0'
}
plugins {
javalite {
artifact = "com.google.protobuf:protoc-gen-javalite:3.0.0"
}
grpc {
artifact = "io.grpc:protoc-gen-grpc-java:1.5.0"
}
}
generateProtoTasks {
all().each { task ->
task.builtins {
remove java
}
task.plugins {
javalite {}
grpc {
option 'lite'
}
}
}
}
}
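One thing to keep in mind: build.dependsOn downloadFile only adds the download to the build, it does not guarantee that the download runs before the proto code generation. A small sketch of one way to enforce that ordering, assuming the generateProto task name that the protobuf plugin uses for the main source set:
// Ensure the downloaded .proto file is in place before code generation runs.
// matching{}.all{} is used because the protobuf plugin may only create the
// 'generateProto' task after the build script has been evaluated.
tasks.matching { it.name == 'generateProto' }.all { generateTask ->
    generateTask.dependsOn downloadFile
}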
Related
I want to use Spring Boot inside a Fabric mod. It works while I run my Minecraft server with the gradlew.bat runServer command, but when I run gradlew.bat build and put the production jar on a real server, I get a NoClassDefFoundError, like this:
java.lang.RuntimeException: Could not execute entrypoint stage 'server' due to errors, provided by 'litopia_services'!
Caused by: java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
at fr.litopia.services.LitopiaServices.onInitializeServer(LitopiaServices.java:23) ~[litopia-services-1.0.0.jar:?]
That seems legitimate, because the jar Gradle builds does not contain the Spring Boot transitive dependencies.
To work around this I have tried api instead of include, and implementation include, but none of that has worked. My build.gradle currently looks like this:
plugins {
id 'fabric-loom' version '0.11-SNAPSHOT'
id 'org.springframework.boot' version '2.5.14'
id 'io.spring.dependency-management' version '1.0.15.RELEASE'
}
sourceCompatibility = JavaVersion.VERSION_17
targetCompatibility = JavaVersion.VERSION_17
archivesBaseName = project.archives_base_name
version = project.mod_version
group = project.maven_group
repositories {
// Add repositories to retrieve artifacts from in here.
// You should only use this when depending on other mods because
// Loom adds the essential maven repositories to download Minecraft and libraries from automatically.
// See https://docs.gradle.org/current/userguide/declaring_repositories.html
// for more information about repositories.
maven { url 'https://jitpack.io' }
maven { url 'https://oss.sonatype.org/content/repositories/snapshots' }
maven { url 'https://mvnrepository.com/artifac' }
maven { url 'https://zoidberg.ukp.informatik.tu-darmstadt.de/artifactory/public-releases' }
maven { url 'https://maven.terraformersmc.com/releases' }
mavenCentral()
}
configurations {
springBom
compileOnly.extendsFrom(springBom)
annotationProcessor.extendsFrom(springBom)
implementation.extendsFrom(springBom)
all {
// We need to use fabric implementation of log4j
exclude group: 'org.springframework.boot', module: 'spring-boot-starter-logging'
}
}
dependencies {
// To change the versions see the gradle.properties file
minecraft "com.mojang:minecraft:${project.minecraft_version}"
mappings "net.fabricmc:yarn:${project.yarn_mappings}:v2"
modImplementation "net.fabricmc:fabric-loader:${project.loader_version}"
// Fabric API. This is technically optional, but you probably want it anyway.
modImplementation "net.fabricmc.fabric-api:fabric-api:${project.fabric_version}"
// Spring boot dependencies
springBom enforcedPlatform('org.springframework.boot:spring-boot-dependencies:2.5.14')
api include('org.springframework.boot:spring-boot-starter')
implementation include('org.springframework.boot:spring-boot-dependencies:2.5.14')
implementation include('org.springframework.boot:spring-boot-starter-web')
implementation include('org.springframework.boot:spring-boot-starter-websocket')
// OpenAPI swagger
implementation include('org.springdoc:springdoc-openapi-ui:1.6.13')
// Spring discord api
implementation include('com.discord4j:discord4j-core:3.2.3')
}
processResources {
inputs.property "version", project.version
filesMatching("fabric.mod.json") {
expand "version": project.version
}
}
tasks.withType(JavaCompile).configureEach {
it.options.release = 17
}
java {
// Loom will automatically attach sourcesJar to a RemapSourcesJar task and to the "build" task
// if it is present.
// If you remove this line, sources will not be generated.
withSourcesJar()
}
jar {
from("LICENSE") {
rename { "${it}_${project.archivesBaseName}"}
}
}
I'm trying to publish a library to an AWS S3 Maven repository using this guide. After finally getting it to upload the artifacts to the S3 bucket without error, I made it a dependency of a new project per the guide.
When I tried to build the new project, an error occurred stating that it couldn't find the first of my library's dependencies. Sure enough, there was no pom.xml file generated that would have included that dependency (and others).
Not knowing a lot about how to program Gradle tasks, I think the problem is within the pom.withXml {} portion of the script. Or the problem may occur before that, since there's not even an empty pom.xml file.
Here is the entire script:
apply plugin: 'com.android.library'
apply plugin: 'maven-publish'
// update these next lines to fit your submodule
group = 'com.myproject'
version = '1.0'
// Add sources as an artifact
task sourceJar(type: Jar) {
from android.sourceSets.main.java.srcDirs
classifier "source"
}
// Loop over all variants
android.libraryVariants.all { variant ->
variant.outputs.all { output ->
// This creates a publication for each variant
publishing.publications.create(variant.name, MavenPublication) {
// The sources artifact from earlier
artifact sourceJar
// Variant dependent artifact, e.g. release, debug
artifact source: output.outputFile, classifier: output.name
// Go through all the dependencies for each variant and add them to the POM
// file as dependencies
pom.withXml {
def dependencies = asNode().appendNode('dependencies')
// Filter out anything that's not an external dependency. You shouldn't
// be publishing artifacts that depend on local (e.g. project) dependencies,
// but who knows...
configurations.getByName(variant.name + "CompileClasspath").allDependencies
.findAll { it instanceof ExternalDependency }
.each {
def dependency = dependencies.appendNode('dependency')
dependency.appendNode('groupId', it.group)
dependency.appendNode('artifactId', it.name)
dependency.appendNode('version', it.version)
}
}
}
}
}
// Ensure that the publish task depends on assembly
tasks.all { task ->
if (task instanceof AbstractPublishToMaven) {
task.dependsOn assemble
}
}
// Configure the destination repository with
// S3 URL and access credentials
publishing {
repositories {
maven {
url "s3://sdk.myproject.com.s3.amazonaws.com"
credentials(AwsCredentials) {
accessKey AWS_ACCESS_KEY
secretKey AWS_SECRET_KEY
}
}
}
}
Any ideas about what's going wrong? Thank you!
I'm trying to build a project that uses both Google protocol buffers and Kotlin using Gradle. I want the proto files to compile into Java source, which is then called from my Kotlin code.
My source files are arranged like this:
src/main/proto/*.proto
src/main/kotlin/*.kt
src/test/kotlin/*.kt
Here's my build.gradle file:
version '1.0-SNAPSHOT'
apply plugin: 'kotlin'
apply plugin: 'java'
apply plugin: 'com.google.protobuf'
repositories {
mavenCentral()
maven { url "http://dl.bintray.com/kotlin/kotlin-eap-1.1" }
}
buildscript {
ext.kotlin_version = '1.1-M02'
repositories {
mavenCentral()
maven { url "http://dl.bintray.com/kotlin/kotlin-eap-1.1" }
}
dependencies {
classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.0'
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
}
}
protobuf {
protoc {
artifact = 'com.google.protobuf:protoc:3.0.0'
}
}
dependencies {
compile 'com.google.protobuf:protobuf-java:3.0.0'
compile "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
testCompile 'junit:junit:4.12'
}
When I run ./gradlew assemble I get a bunch of "Unresolved reference" errors during :compileKotlin. Afterwards I can see that there are no Java source files generated, so it appears that the proto compiler is not being invoked at all.
If I remove the apply plugin: 'kotlin' line, then ./gradlew assemble successfully generates the Java source, but of course my Kotlin source is never compiled.
How do I fix my build.gradle so that I can call my protobuf code from Kotlin?
To get protobuf-gradle-plugin and kotlin-gradle-plugin to cooperate, you need to ensure that the Java code is (re)generated before invoking the Kotlin compiler.
For Gradle's default source sets, main and test, you can do that like this:
compileKotlin.dependsOn ':generateProto'
compileTestKotlin.dependsOn ':generateTestProto'
If you are using other source sets, you'll need to make adjustments.
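For example, a hypothetical custom source set called integrationTest might be wired like this (the task names are assumptions based on the plugins' compile<SourceSet>Kotlin / generate<SourceSet>Proto naming conventions):
// Hypothetical 'integrationTest' source set; adjust the names to your setup.
compileIntegrationTestKotlin.dependsOn ':generateIntegrationTestProto'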
Older versions of protobuf-gradle-plugin also required updating sourceSets, but newer versions do not seem to require this.
// Don't do this with protobuf-gradle-plugin 0.9.0 or higher
sourceSets.main.java.srcDirs += "${protobuf.generatedFilesBaseDir}/main/java"
sourceSets.test.java.srcDirs += "${protobuf.generatedFilesBaseDir}/test/java"
For Kotlin and Android:
android {
sourceSets {
debug.java.srcDirs += 'build/generated/source/proto/debug/java'
release.java.srcDirs += 'build/generated/source/proto/release/java'
}
}
An additional source directory has to be added for every build type. In this sample there are two build types: debug and release.
If you're using grpc, another line has to be added per build type:
android {
sourceSets {
debug.java.srcDirs += 'build/generated/source/proto/debug/java'
debug.java.srcDirs += 'build/generated/source/proto/debug/grpc'
release.java.srcDirs += 'build/generated/source/proto/release/java'
release.java.srcDirs += 'build/generated/source/proto/release/grpc'
}
}
At least with Kotlin 1.0.6, protobuf-gradle-plugin 0.8.0, protobuf 3.2.x and grpc 1.x it's not required to fiddle with the task order.
If you are working with multiple build types and flavors in Android, with protobuf-lite and Kotlin, use the configuration below.
For example, I have debug and release build types with demo and prod flavors, which produces the demoDebug, demoRelease, prodDebug, and prodRelease variants.
Then use:
android{
sourceSets {
debug.java.srcDirs += 'build/generated/source/proto/demoDebug/javalite'
debug.java.srcDirs += 'build/generated/source/proto/prodDebug/javalite'
release.java.srcDirs += 'build/generated/source/proto/demoRelease/javalite'
release.java.srcDirs += 'build/generated/source/proto/prodRelease/javalite'
}
}
Then tie each variant's compileKotlin task to the corresponding generateProto task:
tasks.withType(org.jetbrains.kotlin.gradle.tasks.KotlinCompile).all {
if (getName() == 'compileDemoDebugKotlin')
dependsOn(':app:generateDemoDebugProto')
if (getName() == 'compileDemoReleaseKotlin')
dependsOn(':app:generateDemoReleaseProto')
if (getName() == 'compileProdDebugKotlin')
dependsOn(':app:generateProdDebugProto')
if (getName() == 'compileProdReleaseKotlin')
dependsOn(':app:generateProdReleaseProto')
}
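If you prefer not to repeat the if blocks, the same wiring can be expressed as a loop over the variant names (a sketch equivalent to the block above, using the same hypothetical demo/prod flavor names):
// Same wiring as above, but driven by a list of variant names so new
// flavors only need to be added in one place.
['DemoDebug', 'DemoRelease', 'ProdDebug', 'ProdRelease'].each { variant ->
    def kotlinTaskName = "compile${variant}Kotlin".toString()
    tasks.matching { it.name == kotlinTaskName }.all { task ->
        task.dependsOn(":app:generate${variant}Proto")
    }
}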
For the Gradle setup:
plugins {
id 'com.android.application'
id 'kotlin-android'
id 'com.google.protobuf' version "0.8.17"
}
Then, at the bottom of build.gradle:
protobuf {
protoc {
artifact = "com.google.protobuf:protoc:3.10.0"
}
// Generates the java Protobuf-lite code for the Protobufs in this project. See
// https://github.com/google/protobuf-gradle-plugin#customizing-protobuf-compilation
// for more information.
generateProtoTasks {
all().each { task ->
task.builtins {
java {
option 'lite'
}
}
}
}
}
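Note that the generated lite classes also need the matching runtime library on the classpath. A rough sketch (the version here is only an assumption; pick one that matches your protoc):
dependencies {
    // Runtime library for the generated "lite" classes; version is an example
    implementation 'com.google.protobuf:protobuf-javalite:3.10.0'
}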
How can I fetch dependencies from a specific URL? I.e., I would like to do something like this:
dependencies {
compile 'http://rforge.net/Rserve/files/RserveEngine.jar'
compile 'http://rforge.net/Rserve/files/REngine.jar'
}
Is this even possible, or do I have to download the jars in a separate task and add them using compile files('....')?
Basically this is the way to go:
apply plugin: 'java'
repositories {
ivy {
url 'http://rforge.net/Rserve/files/'
layout "pattern", {
artifact "[artifact].[ext]"
}
}
}
configurations{
rserve
}
dependencies {
rserve name: 'RserveEngine'
rserve name: 'REngine'
}
task fetchRserve(type: Copy) {
from configurations.rserve
into "$buildDir/rserve"
}
You can experiment with the Ivy layout to introduce modules, versions, and extensions; see the Gradle documentation on Ivy repositories for details.
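For example, if the server published versioned jars under per-module directories, a versioned pattern might look roughly like this. The layout and coordinates below are hypothetical, purely to illustrate the pattern tokens; newer Gradle versions use patternLayout instead of layout "pattern", { }:
repositories {
    ivy {
        url 'http://rforge.net/Rserve/files/'
        // Hypothetical layout: <module>/<artifact>-<version>.jar
        patternLayout {
            artifact '[module]/[artifact]-[revision].[ext]'
        }
        // There is no ivy.xml on the server, so resolve from the artifact alone
        metadataSources { artifact() }
    }
}
dependencies {
    // group:name:version coordinates are illustrative only
    rserve 'rserve:RserveEngine:1.8.6'
}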
I am trying to use the avro-gradle-plugin on GitHub, but have not had any luck getting it to work. Does anyone have sample code showing how they got it to work?
I figured out how to do it myself. The following is a snippet that I would like to share for people who might run into the same issues as I did:
apply plugin: 'java'
apply plugin: 'avro-gradle-plugin'
sourceCompatibility = "1.6"
targetCompatibility = "1.6"
buildscript {
repositories {
maven {
// your maven repo information here
}
}
dependencies {
classpath 'org.apache.maven:maven-artifact:2.2.1'
classpath 'org.apache.avro:avro-compiler:1.7.1'
classpath 'org.apache.avro.gradle:avro-gradle-plugin:1.7.1'
}
}
compileAvro.source = 'src/main/avro'
compileAvro.destinationDir = file("$buildDir/generated-sources/avro")
sourceSets {
main {
java {
srcDir compileAvro.destinationDir
}
}
}
dependencies {
compileAvro
}
I found "com.commercehub.gradle.plugin.avro" gradle plugin to work better.
use the folllowing:
// Gradle 2.1 and later
plugins {
id "com.commercehub.gradle.plugin.avro" version "VERSION"
}
// Earlier versions of Gradle
buildscript {
repositories {
jcenter()
}
dependencies {
classpath "com.commercehub.gradle.plugin:gradle-avro-plugin:VERSION"
}
}
apply plugin: "com.commercehub.gradle.plugin.avro"
more details at https://github.com/commercehub-oss/gradle-avro-plugin
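A minimal usage sketch to go with it (assuming the plugin's default of picking up schema files from src/main/avro, and adding the Avro runtime that the generated classes need; the version is only an example):
repositories {
    mavenCentral()
}
dependencies {
    // Runtime needed by the generated SpecificRecord classes; version is an example
    implementation 'org.apache.avro:avro:1.11.0'
}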
When evaluating a plugin, the following questions need to be asked:
Are generated files included in the source jar?
Is the plugin fast? A good plugin uses the avro-tools API instead of forking a VM for every file; for a large number of files, creating a VM per file can make the compile take 10 minutes.
Do you need intermediate avsc files?
Is the build incremental (i.e., does it avoid regenerating all files unless one of the sources changed)?
Is the plugin flexible enough to give access to the generated schema files, so that further actions, such as registering the schema in a schema repository, can be performed?
It is easy enough to implement without any plugin if you are not happy with the available plugins or need more flexibility.
//
// define source and destination
//
def avdlFiles = fileTree('src/Schemas').include('**/*.avdl')
// Do NOT generate into $buildDir, because IntelliJ will ignore files in
// this location and will show errors in source code
def generatedJavaDir = "generated/avro/java"
sourceSets.main.java.srcDir generatedJavaDir
//
// Make avro-tools available to the build script
//
buildscript {
dependencies {
classpath group:'org.apache.avro', name:'avro-tools' ,version: avro_version
}
}
//
// Define task's input and output, compile idl to schema and schema to java
//
task buildAvroDtos(){
group = "build"
inputs.files avdlFiles
outputs.dir generatedJavaDir
doLast{
avdlFiles.each { avdlFile ->
def parser = new org.apache.avro.compiler.idl.Idl(avdlFile)
parser.CompilationUnit().getTypes().each { schema ->
def compiler = new org.apache.avro.compiler.specific.SpecificCompiler(schema)
compiler.compileToDestination(avdlFile, new File(generatedJavaDir))
}
}
}
}
//
// Publish source jar, including generated files
//
task sourceJar(type: Jar, dependsOn: buildAvroDtos) {
from sourceSets.main.allSource
// Package schemas into source jar
into("Schemas") { from avdlFiles }
}
// Clean "generated" folder upon "clean" task
clean {
delete('generated')
}
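Note that nothing above makes the Java compilation depend on buildAvroDtos (adding a srcDir does not create a task dependency), so you may want to wire that in explicitly, assuming the java plugin's standard compileJava task:
// Generate the Avro classes before compiling the main sources
compileJava.dependsOn buildAvroDtos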
To configure Avro with Gradle as the build tool, add the following along with applying the java plugin.
The changes below go in settings.gradle:
pluginManagement {
repositories {
gradlePluginPortal()
mavenCentral()
}
}
The changes below go in build.gradle:
plugins {
id "com.github.davidmc24.gradle.plugin.avro" version "1.3.0"
}
repositories {
mavenCentral()
}
dependencies {
implementation "org.apache.avro:avro:1.11.0"
}
generateAvroJava {
source("${projectDir}/src/main/resources/avro")//sourcepath avrofile
}
If you also want to generate setter methods, add this block to build.gradle:
avro {
createSetters = true
}
Reference: https://github.com/davidmc24/gradle-avro-plugin