Gradle unable to publish multiple Maven native artifacts

I can't seem to get Gradle to publish multiple artifacts to a Maven repository. It publishes some, but not all, and I have no idea why. The targets are debug & release versions of static libraries, built for OS X and Windows (4 static libraries in total). The OS X libraries are stored but the Windows ones are not. If I modify the filter closure such that the OS X libraries are filtered out, nothing is stored.
model {
    buildTypes {
        debug
        release
    }
    platforms {
        "osx-x86_64" {
            operatingSystem "osx"
            architecture "x86_64"
        }
        "windows-x86_64" {
            operatingSystem "windows"
            architecture "x86_64"
        }
    }
    toolChains {
        // OS X and Windows toolchains (Clang and Mingw) described here
        // - they build the artifacts I wish to store ok
        // just removed for clarity
    }
} // end of model
libraries {
    saveMe {}
}
nativeArtifacts {
    libSaveMe {
        from(libraries.saveMe) {
            it instanceof StaticLibraryBinary &&
                (it.targetPlatform.name == "osx-x86_64" ||
                 it.targetPlatform.name == "windows-x86_64")
            // if I make this closure on the line above something like:
            // it instanceof StaticLibraryBinary && it.targetPlatform.name == "windows-x86_64"
            // then nothing is saved in the Maven repo
        }
    }
}
publishing {
    repositories {
        maven {
            credentials {
                username = 'user'
                password = 'password'
            }
            url "http://hostname.com/path/to/repo"
        }
    }
    publications {
        mavPub(MavenPublication) {
            from nativeArtifacts.libSaveMe
        }
    }
}
I'm using a very nice external plugin (the Gradle Native Artifacts Plugin, written by Serge Gebhardt, @sgeb), but before I try to make sense of its code (I'm a beginner in Gradle), I thought I'd ask and see if there's something obvious I'm doing wrong.
I've tried putting logger statements in the filter closure and I can see that all possible combos of debug/release static/shared libraries are being tried, and the filter is correctly identifying true/false whether the libraries should be saved, but it doesn't make it to Maven.
Is there a debugging line I could put in publishing{} (or a task) to see what the actual contents of the nativeArtifacts.libSaveMe collection is?
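One way to answer that last question yourself is a small diagnostic task that dumps what each publication will actually publish. This is only a sketch against the standard maven-publish model (the task name is made up; the properties used are the usual `MavenPublication` ones):

```groovy
// Sketch: dump what each Maven publication will actually publish.
// The task name is arbitrary; properties come from the maven-publish plugin.
task inspectPublications {
    doLast {
        publishing.publications.withType(MavenPublication) { pub ->
            println "Publication '${pub.name}': ${pub.groupId}:${pub.artifactId}:${pub.version}"
            pub.artifacts.each { artifact ->
                println "  file=${artifact.file} classifier=${artifact.classifier} ext=${artifact.extension}"
            }
        }
    }
}
```

Running `gradle inspectPublications` should then show whether the Windows artifacts ever made it into the publication at all.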

OK, so the moral of the story is: never assume, check. In this instance, check that the statement:
toolChains {
    // OS X and Windows toolchains (Clang and Mingw) described here
    // - they build the artifacts I wish to store ok
    // just removed for clarity
}
is actually true. It wasn't.
The publish task was run by a CI server, and the toolchain was failing to build the Windows artifacts there even though it worked on my local machine (due to a configuration error in the CI server's mingw installation). The toolchain failure occurred without an error (unless run with --debug) and hence was invisible to me: the toolchain was silently replaced by a non-Windows compiler that didn't create the artifacts.
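One way to surface that kind of silent toolchain substitution is to assert up front that the expected binaries are buildable. A rough sketch against the old software-model API used in the question (the task name is made up, and property availability may vary by Gradle version):

```groovy
// Sketch: fail fast when the expected Windows binaries are not buildable,
// so a silently substituted toolchain becomes a hard error.
// Uses the Gradle 2.x software-model API from the question.
task checkWindowsBinaries {
    doLast {
        def broken = binaries.withType(StaticLibraryBinary).findAll {
            it.targetPlatform.name == 'windows-x86_64' && !it.buildable
        }
        if (!broken.empty) {
            throw new GradleException(
                "Unbuildable windows-x86_64 binaries: ${broken*.name} - check the mingw toolchain")
        }
    }
}
```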

Related

LeakCanary not installed in launcher

I made the change as suggested in app/build.gradle:
debugImplementation 'com.squareup.leakcanary:leakcanary-android:2.9.1'
But when I installed my APK, I couldn't see LeakCanary in my launcher.
Any idea why?
@Harsh,
Did you implement an app with multiple build types?
If yes, you need to change the LeakCanary dependency to follow your build types:
Example:
your build type is
buildTypes {
    dev {}
    ....
    release {}
}
If you want to install it with the dev variant, your dependency should look like this: devImplementation 'com.squareup.leakcanary:leakcanary-android:xxx_xxx'
If not, remove this block in build.gradle:
variantFilter { variant ->
    if (variant.buildType.name == "debug") {
        setIgnore(true)
    }
}
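Putting the two pieces together, a minimal sketch (the build-type name dev is just an example; substitute your own):

```groovy
// Sketch: package LeakCanary only into the "dev" build type.
android {
    buildTypes {
        dev {}
        release {}
    }
}

dependencies {
    // compiled only into dev variants, never into release
    devImplementation 'com.squareup.leakcanary:leakcanary-android:2.9.1'
}
```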

Mixed Clojure and ClojureScript Build With Clojurephant

I'm considering moving a project from boot to Gradle with clojurephant hoping to leverage more of the Gradle ecosystem. This project builds one large uberjar that contains a Clojure project with Ring and Jetty that in turn ships a ClojureScript app built with re-frame.
In boot, I essentially just required boot-cljs, added
(boot-cljs/cljs :optimizations :advanced)
to my build task, which also calls (pom), (aot) and (uber) (all standard boot tasks), and everything worked smoothly.
With Clojurephant, I find that the Clojure and ClojureScript parts end up in different subdirectories. In particular, underneath build I find:
clojure/main
clojurescript/main
resources/main (essentially a copy of my resources project folder)
Adding to my confusion, these paths don't translate in any way I can see to the structure of the uberjar that Gradle builds using the shadow plugin.
Some excerpts from my build.gradle:
plugins {
    id 'dev.clojurephant.clojure' version '0.5.0'
    id 'dev.clojurephant.clojurescript' version '0.5.0'
    id 'application'
    id 'com.github.johnrengelman.shadow' version '5.0.0'
    id 'maven-publish'
    id 'distribution'
    id 'com.meiuwa.gradle.sass' version '2.0.0'
}
// ...
clojure {
    builds {
        main {
            aotAll()
        }
    }
}
// ...
clojurescript {
    builds {
        all {
            compiler {
                outputTo = 'public/app.js'
                outputDir = 'public/js/out'
                main = 'com.example.mycljsstuff'
                assetPath = 'js/out'
            }
        }
        main {
            compiler {
                optimizations = 'advanced'
                sourceMap = 'public/app.js.map'
            }
        }
        dev {
            compiler {
                optimizations = 'none'
                preloads = ['com.example.mycljsstuff']
            }
        }
    }
}
EDIT: Forgot to mention that for boot I configure the init function to start loading the CLJS code in a file called app.cljs.edn. With Clojurephant I only found a way to set a main namespace, not a function.
My question here is ultimately, how can I configure the ClojureScript build so that it works in the end when being delivered from the Uberjar?
The Clojure things seem to work. Ring and Jetty run and happily deliver a first static webpage. But all the CLJS/JS things can't be found.
I'd be super happy to just receive some pointers to other projects where I can learn, documentation, or tutorials. I haven't found a lot and then got lost in understanding the code of Clojurephant itself.
A year ago at work I was able to split up a combined CLJ & CLJS (backend/frontend) project into 2 separate projects: pure Clojure for the backend, and pure ClojureScript for the frontend. This resolved many, many problems we had and I would never try to keep two codebases in the same project again.
The backend CLJ part continued to use Lein as the build tool. It's not perfect but it is well understood.
The frontend CLJS part was transitioned from the original Figwheel (aka "1.0") to the newer Figwheel-Main (aka "2.0"). Following the lead from figwheel.org, we chose to restructure the build to use Deps/CLI (the original combined project used Lein for everything). The transition away from Lein to Deps/CLI was a real winner for CLJS work.
While Deps/CLI works great for pure Clojure code, be aware that it does not natively support the inclusion of Java source code. I have a template project you can clone that shows a simple workaround for this.
For any CLJS project, I highly recommend the use of Figwheel-Main over the original Figwheel as it is a major "2.0" type of upgrade, and will make your life much, much better.
Enjoy!
Trying to answer my own question here, since I managed to get it running.
Clojure
plugins {
    id 'dev.clojurephant.clojure' version '0.5.0'
    id 'application'
    id 'com.github.johnrengelman.shadow' version '5.0.0'
    // ... more to come further down
}
group = 'com.example'
version = '1.0.0-SNAPSHOT'
targetCompatibility = 1.8
mainClassName = 'com.example.myproject'
dependencies {
    implementation(
        'org.clojure:clojure:1.10.1',
        'ring:ring:1.8.0',
        // and many more
    )
    testImplementation(
        'junit:junit:4.12',
        'clj-http-fake:clj-http-fake:1.0.3',
        'ring:ring-mock:0.4.0'
    )
    devImplementation(
        'org.clojure:tools.namespace:0.3.0-alpha4',
        'cider:cider-nrepl:0.21.1',
        'org.clojure:java.classpath',
        'jonase:eastwood:0.3.11',
        'lein-bikeshed:lein-bikeshed:0.5.2'
    )
}
clojure {
    builds {
        main {
            aotAll()
        }
    }
}
clojureRepl {
    handler = 'cider.nrepl/cider-nrepl-handler'
}
This is enough to get an executable JAR running -main from the com.example.myproject namespace when calling ./gradlew shadowJar from the command line. I'm not sure if the application plugin is relevant here. Also, ./gradlew clojureRepl spins up an nREPL that Emacs/CIDER can connect to.
ClojureScript
// Merge with plugins above. Reproduced only the CLJS relevant
// part here
plugins {
    id 'dev.clojurephant.clojurescript' version '0.5.0'
}
// Again, merge with dependencies above
dependencies {
    implementation(
        // ....
        'org.clojure:clojurescript:1.10.597',
        're-frame:re-frame:0.10.5',
        'reagent:reagent:0.7.0',
        // and a few more
    )
}
clojurescript {
    builds {
        all {
            compiler {
                outputTo = 'public/app.js'
                outputDir = 'public/state/'
                main = 'com.example.myproject.webui'
                assetPath = 'status/app.out'
            }
        }
        main {
            compiler {
                optimizations = 'advanced'
                sourceMap = 'public/app.js.map'
            }
        }
        dev {
            compiler {
                optimizations = 'none'
                preloads = ['com.example.myproject.webui']
            }
        }
    }
}
This creates a /public folder at the top level of the JAR, with app.js inside that folder, which is where the HTML file delivered by Ring expects it.
One important step for me was to call my init CLJS function in the CLJS file itself, which had been taken care of by some other component before. I'm not sure this setup is totally correct, and I will do the Figwheel setup eventually; maybe the call to init will not be necessary then.
CSS
// Merge ....
plugins {
    id 'com.meiuwa.gradle.sass' version '2.0.0'
}
sassCompile {
    output = file("$buildDir/resources/main/public/")
    source = fileTree("${rootDir}/src/main/resources/public/")
    include("**/*.scss")
    exclude("**/_*.sass", "**/_*.scss")
}
This compiles my app.scss to app.css in the right spot, where my HTML file searches for it.
Pro & Con
After migrating, I get for free:
Faster compilation locally and in CI, after setting up the caches correctly.
License and OWASP dependency-check reports via the plugins com.github.jk1.dependency-license-report and org.owasp.dependencycheck, for which equivalents exist in Leiningen but not boot (AFAIK).
Maven publishing with authentication via HTTP header instead of username/password, which is not available in boot.
On the downside:
Yucky syntax in my build file. No, really.
I have to enter the nrepl port number in Emacs manually.

Can Gradle C++ dependencies get set differently for different applications?

I have several applications in the same project that share dependencies which are also built in the project; some applications need them as static libraries and others as shared. I built a very basic reproduction of this error based off of the answer at https://github.com/gradle/gradle-native/issues/1017 and it's getting practically the same error. It's a basic project with 2 libraries, :1 and :2, and a cpp-application at :app. This spits out the following error:
Could not determine the dependencies of task ':app:installDebug'.
> Could not resolve all task dependencies for configuration ':app:nativeRuntimeDebug'.
> Could not resolve project :1.
Required by:
project :app
> Module 'gradletest:1' has been rejected:
Cannot select module with conflict on capability 'gradletest:1:unspecified' also provided by [gradletest:1:unspecified(debugSharedRuntimeElements), gradletest:1:unspecified(debugStaticRuntimeElements)]
> Could not resolve project :1.
Required by:
project :app > project :2
> Module 'gradletest:1' has been rejected:
Cannot select module with conflict on capability 'gradletest:1:unspecified' also provided by [gradletest:1:unspecified(debugSharedRuntimeElements), gradletest:1:unspecified(debugStaticRuntimeElements)]
Here's my sample that reproduces the break. I'm pretty sure I followed everything the answer on GitHub said, and I have perused the Gradle docs for everything I can find that may be relevant. In my current, non-MVP situation I would have an :app2 that links :1 and :2 as shared libraries, so just setting :1 and :2 to static-only will not fix the issue, unfortunately.
1/build.gradle.kts
plugins {
    `cpp-library`
    `cpp-unit-test`
}
library {
    linkage.set(listOf(Linkage.STATIC, Linkage.SHARED))
}
2/build.gradle.kts
plugins {
    `cpp-library`
    `cpp-unit-test`
}
library {
    linkage.set(listOf(Linkage.STATIC, Linkage.SHARED))
    dependencies {
        implementation(project(":1"))
    }
}
app/build.gradle.kts
plugins {
    `cpp-application`
    `cpp-unit-test`
}
dependencies {
    implementation(project(":1")) {
        attributes { attribute(Attribute.of("org.gradle.native.linkage", Linkage::class.java), Linkage.STATIC) }
    }
    implementation(project(":2")) {
        attributes { attribute(Attribute.of("org.gradle.native.linkage", Linkage::class.java), Linkage.STATIC) }
    }
}
application {
}
This was done on Gradle 5.4.1; unfortunately it is incredibly hard for me to get updates, so I was unable to test whether the old version is the problem.

Android Studio 2.2's incremental compiler can't see generated protobufs

I upgraded my stable version of Android Studio to 2.2 and now the IDE's "incremental compiler" can't find any of the symbols for generated protobuf classes. I open the project and it can build and deploy the app to a device just fine. But when I open a Java class file that contains generated protobuf references, Android Studio marks them as errors soon after the file opens. Every Java import of a generated proto class is marked with "Cannot resolve symbol".
I first noticed this a month ago on the canary channel but didn't think anything of it because I was floundering with other protobuf issues (upgrading to 3.0.0 with its javalite split). I forgot about it until today. It's still possible to work on the project, it's just that the IDE is near useless since it thinks there are errors (even though real compiles are fine with it).
For reference.
gradle-2.14.1
com.android.tools.build:gradle:2.2.0
com.google.protobuf:protobuf-gradle-plugin:0.8.0
com.google.protobuf:protobuf-lite:3.0.0
com.google.protobuf:protoc:3.0.0
com.google.protobuf:protoc-gen-javalite:3.0.0
And in the modules that contain .proto files:
protobuf {
    protoc {
        artifact = google_protoc_artifact
    }
    plugins {
        javalite {
            artifact = google_protoc_javalite_artifact
        }
    }
    generateProtoTasks {
        all().each { task ->
            task.builtins {
                remove java
            }
            task.plugins {
                javalite { }
            }
        }
    }
}
We had the same issue and found out the following:
1) In order for IDEA (Studio) to see your sources, you need to help it by adding the idea plugin to your module:
apply plugin: 'idea'

idea {
    module {
        // Use "${protobuf.generatedFilesBaseDir}/main/javalite" for LITE_RUNTIME protos
        sourceDirs += file("${protobuf.generatedFilesBaseDir}/main/java")
    }
}
2) Another bug: Android Studio seems to ignore any source directory under build/. You have to move your generated directory outside of build:
protobuf {
    generatedFilesBaseDir = "$projectDir/src/generated"
}
These two changes fixed the problem introduced by Android Studio 2.2 for us.
In my case, I was using the Kotlin protobuf plugin. To fix the error of the IDE not being able to resolve the generated classes, I tweaked the other answer above to point to the main folder.
// build.gradle.kts
plugins {
    idea
}

idea {
    module {
        // explicitly tell IntelliJ where to resolve generated proto files
        sourceDirs.plusAssign(file("build/generated/source/proto/main/"))
    }
}

How to not build gradle subproject using docker plugin if docker not available?

I have a small multiproject Gradle build. I first constructed a Maven build structure for it, and that works fine. There are three subprojects. Two of them construct WAR files. The last constructs a Docker image using the other two WAR files.
The top-level Maven aggregator uses profile activation so that the Docker image project is only ever built if the OS is Linux. Eventually, I'll need a better conditional check for that (when Docker becomes a first-class element in Windows). For now, the Maven profile activation check works fine.
I'd like to do something similar in the Gradle build, as presently it tries to build the Docker image on Windows, which doesn't work.
What's the best way to have the Gradle build in the Docker-building subproject to do nothing if Docker isn't available (might as well do it right for the future)? If that's too hard I can settle for only building it on Linux.
I don't think there is a simple solution for this, especially if you want it to be platform independent. The only solution I can suggest is to make a function that runs a CLI command such as docker -v and parses its output. To run this command, though, you have to make platform-specific calls. Something like this:
import org.apache.tools.ant.taskdefs.condition.Os

// true if Docker is available locally, otherwise false
ext.isDockerAvailable = { ->
    def commandStdOut = new ByteArrayOutputStream()
    if (Os.isFamily(Os.FAMILY_WINDOWS)) {
        // for Windows OS
        exec {
            commandLine 'cmd', '/c', 'docker', '-v'
            standardOutput = commandStdOut
            ignoreExitValue = true
        }
    } else if (Os.isFamily(Os.FAMILY_UNIX)) {
        // for Unix-family OS
        exec {
            commandLine 'sh', '-c', 'docker -v'
            standardOutput = commandStdOut
            ignoreExitValue = true
        }
    } else {
        // if OS is unsupported
        println 'Unsupported OS version'
    }
    // check command output for the expected version banner
    return commandStdOut.toString().trim().contains('Docker version')
}.call()

task buildDockerImage {
    enabled = isDockerAvailable
}
Sure, this works for Windows- and Unix-family OSes.
And if you don't care much about whether the image gets built or not, why not simply set ignoreFailures on the Docker tasks and ignore them when they fail on Windows?
Anyway, I suppose it's much better to use a remote build server (with the Docker remote API available) for such a task. In that case, neither you nor anyone else needs a local Docker installation, as long as you are on the same LAN.
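If the goal is to skip the whole Docker subproject rather than individual tasks, the availability check can gate every task in that subproject. A sketch, assuming the subproject is called :docker-image (a hypothetical name) and isDockerAvailable is defined in the root project as above:

```groovy
// Sketch: in the root build.gradle, skip all tasks of the Docker
// subproject when Docker is not available locally.
// ":docker-image" is a hypothetical subproject name.
project(':docker-image') {
    tasks.all {
        onlyIf { rootProject.ext.isDockerAvailable }
    }
}
```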
