I made the change as suggested in app/build.gradle:
debugImplementation 'com.squareup.leakcanary:leakcanary-android:2.9.1'
But after installing my APK, I can't see the LeakCanary icon in my launcher.
Any idea why?
@Harsh,
Did you implement an app with multiple build types?
If yes, you need to adjust the LeakCanary dependency to match your build types.
Example:
your build type is
buildTypes {
    dev {}
    ....
    release {}
}
If you want to install the dev variant, your dependency should look like this: devImplementation 'com.squareup.leakcanary:leakcanary-android:xxx_xxx'
If not: remove these lines from build.gradle:
variantFilter { variant ->
    if (variant.buildType.name == "debug") {
        setIgnore(true)
    }
}
I have a project that builds into an SDK library.
Before building a release, I would like to verify with a Gradle task that all configured dependencies are available. At the moment the build passes even with a misconfigured dependency, because Gradle manages to resolve a different version of it that some other library pulls in. Since that is not a guaranteed situation, I'd like a task that verifies whether all configured dependencies are actually available, and fails otherwise.
I started a Task like this:
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

abstract class CheckDependenciesTask : DefaultTask() {
    @TaskAction
    fun checkDependencies() {
        project.configurations.forEach { config ->
            if (config.isCanBeResolved) {
                println("Resolving ${config.name}")
                config.resolve()
            } else {
                println("Not resolving ${config.name}")
            }
        }
    }
}
But that resolves whole configurations rather than individual dependencies, and I haven't been able to figure out how to check whether each dependency is available. When running lintAnalyzeDebug --info, Gradle neatly prints which ones are missing, so there must be some mechanism that performs this check but fails instead of trying to resolve a substitute.
Any idea or pointers on how to achieve this would be appreciated. Thank you.
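One possible approach (a sketch, not a definitive answer; VerifyDependenciesTask is a name I made up): instead of calling resolve(), walk each resolvable configuration's resolutionResult, which reports unresolved components without aborting immediately, collect the failures, and only then fail the build:

```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.GradleException
import org.gradle.api.artifacts.result.UnresolvedDependencyResult
import org.gradle.api.tasks.TaskAction

abstract class VerifyDependenciesTask : DefaultTask() {
    @TaskAction
    fun verify() {
        val failures = mutableListOf<String>()
        project.configurations
            .filter { it.isCanBeResolved }
            .forEach { config ->
                // resolutionResult walks the dependency graph leniently and
                // marks unresolved nodes instead of throwing on the first one
                config.incoming.resolutionResult.allDependencies.forEach { dep ->
                    if (dep is UnresolvedDependencyResult) {
                        failures += "${config.name}: ${dep.attempted} (${dep.failure.message})"
                    }
                }
            }
        if (failures.isNotEmpty()) {
            throw GradleException("Unresolved dependencies:\n" + failures.joinToString("\n"))
        }
    }
}
```

This distinguishes "this exact dependency could not be found" from "the configuration as a whole failed", which seems closer to what you are after.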
I upgraded my stable version of Android Studio to 2.2 and now the IDE's "incremental compiler" can't find any of the symbols for generated protobuf classes. I open the project and it can build and deploy the app to a device just fine. But when I open a Java class file that contains generated protobuf references, Android Studio marks them as errors soon after the file opens. Every Java import of a generated proto class is marked with "Cannot resolve symbol".
I first noticed this a month ago on the canary channel but didn't think anything of it because I was floundering with other protobuf issues (upgrading to 3.0.0 with its javalite split). I forgot about it until today. It's still possible to work on the project, it's just that the IDE is near useless since it thinks there are errors (even though real compiles are fine with it).
For reference.
gradle-2.14.1
com.android.tools.build:gradle:2.2.0
com.google.protobuf:protobuf-gradle-plugin:0.8.0
com.google.protobuf:protobuf-lite:3.0.0
com.google.protobuf:protoc:3.0.0
com.google.protobuf:protoc-gen-javalite:3.0.0
And in the modules that contain .proto files:
protobuf {
    protoc {
        artifact = google_protoc_artifact
    }
    plugins {
        javalite {
            artifact = google_protoc_javalite_artifact
        }
    }
    generateProtoTasks {
        all().each { task ->
            task.builtins {
                remove java
            }
            task.plugins {
                javalite { }
            }
        }
    }
}
We had the same issue and found out the following:
1) In order for IDEA (Studio) to see your sources, you need to help it by adding the idea plugin to your module:
apply plugin: 'idea'

idea {
    module {
        // Use "${protobuf.generatedFilesBaseDir}/main/javalite" for LITE_RUNTIME protos
        sourceDirs += file("${protobuf.generatedFilesBaseDir}/main/java")
    }
}
2) Another bug: Android Studio seems to ignore any source directory under build/. You have to move the generated directory outside of build/:
protobuf {
    generatedFilesBaseDir = "$projectDir/src/generated"
}
These two changes fixed the problem introduced by Android Studio 2.2 for us.
In my case, I was using the Kotlin protobuf plugin. To fix the IDE not being able to resolve the generated classes, I tweaked the other answer above to point at the main folder:
// build.gradle.kts
plugins {
    idea
}

idea {
    module {
        // explicitly tell IntelliJ where to resolve generated proto files
        sourceDirs.plusAssign(file("build/generated/source/proto/main/"))
    }
}
In our Gradle project, we want to add a new module for functional-tests that needs to be able to access dependencies from other subprojects but still not be run as part of the full project build. If I try this, it still gets built:
def javaProjects() {
    return subprojects.findAll { it.name != 'functional-tests' }
}

configure(javaProjects()) {
    ...
}

project(':functional-tests') {
    ....
}
The result is the same even if I move the functional-tests build to a separate build.gradle file of its own. Can someone point out how to achieve this?
I found a better solution: exclude the functional tests from running, either on the command line or via the build file.
For example, to run all tests except the functional tests, run:
$ gradle check -x :functional-tests:check
Then when building the project, you can let the subprojects build but exclude their tests from running:
$ gradle clean assemble -x :functional-tests:check
A better option is to disable the functional tests in your build file unless a property is set. For example, in your build.gradle you'd add:
project(':functional-tests') {
    test {
        onlyIf {
            project.hasProperty("functionalTests")
        }
    }
}
This way, functional tests are always skipped unless you specify a specific build property:
$ gradle check
$ gradle -PfunctionalTests check
Hope that helps!
I do it like this:
// for all subprojects
subprojects {
    if (it.name != 'project name') {
        // do something
    }
}
This way I can exclude particular projects from subprojects.
You can also use the same pattern with allprojects or project.
As far as I know, it's not possible to deactivate or exclude a project after it has been included in settings.gradle. It can, however, be included conditionally in settings.gradle:
include 'p1', 'p2', 'p3'
if (any_condition_here) {
    include 'functional-tests'
}
It will require an additional check in build.gradle, to only configure the project if it was actually included.
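That check could look something like this (a sketch; findProject returns null when the project was not included, so the configuration block is simply skipped):

```groovy
// build.gradle (root) -- only configure functional-tests when it exists
if (findProject(':functional-tests') != null) {
    project(':functional-tests') {
        // functional-test-specific configuration goes here
    }
}
```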
What also comes to mind is the -a command-line switch, see here. Maybe it helps somehow.
You can't exclude the subproject, but you can disable subproject tasks:
gradle.taskGraph.whenReady {
    gradle.taskGraph.allTasks.each {
        if (it.project == project) {
            it.onlyIf { false }
        }
    }
}
Just to mention that you don't need to create a new module for integration/functional tests. I prefer to create a new, dedicated source set.
The approach is nicely described here: https://tomgregory.com/gradle-integration-tests/
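A minimal sketch of that source-set approach (names like integrationTest are conventions, not requirements; adjust to taste):

```groovy
// build.gradle -- dedicated source set for functional/integration tests
sourceSets {
    integrationTest {
        // tests compile and run against the production classes
        compileClasspath += sourceSets.main.output
        runtimeClasspath += sourceSets.main.output
    }
}

configurations {
    // let the new source set inherit the normal dependencies
    integrationTestImplementation.extendsFrom implementation
    integrationTestRuntimeOnly.extendsFrom runtimeOnly
}

tasks.register('integrationTest', Test) {
    description = 'Runs the integration tests.'
    testClassesDirs = sourceSets.integrationTest.output.classesDirs
    classpath = sourceSets.integrationTest.runtimeClasspath
    // deliberately not wired into `check`, so a plain build skips these tests
}
```

Because the task is not attached to check, gradle build stays fast and you run gradle integrationTest explicitly when you want them.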
I am creating Eclipse project files as shown:
eclipse {
    project {
        natures 'org.eclipse.jdt.core.javanature'
        buildCommand 'org.eclipse.jdt.core.javabuilder'
    }
    classpath {
        downloadSources true
        downloadJavadoc true
    }
}
I have a multi-project gradle build where projects reference each other and 3rd party libs. For projectA, its dependencies are:
dependencies {
    compile project(':projectB')
    compile project(':projectC')
    compile "com.google.guava:guava:${VER_GUAVA}"
}
This works great, except that the generated projects don't reference each other. It builds just fine, but it means that if I refactor something in projectB, references in projectA aren't refactored with it. The fix is apparently to set the referencedProjects variable of the eclipse configuration, but I'd like for this to be automatically populated based on the dependencies. Something like:
eclipse {
    project {
        referencedProjects dependencies.grep(dep is project)
        ...
Anybody have any hints?
This is how I ended up fixing it:
// add all project dependencies as referencedProjects
tasks.cleanEclipse.doLast {
    project.configurations.stream()
        .flatMap({ config -> config.dependencies.stream() })    // get all deps
        .filter({ dep -> dep instanceof ProjectDependency })    // that are Projects
        .forEach({ dep -> eclipse.project.referencedProjects.add(dep.name) }) // and add to referencedProjects
}

// always create "fresh" projects
tasks.eclipse.dependsOn(cleanEclipse)
I should probably mention that I'm using the wuff plugin because I'm in an OSGi environment, and it does tricks with the dependencies. That might be the reason I'm not getting this automatically, but the snippet above fixes it easily enough.
I can't seem to get Gradle to publish multiple artifacts to a Maven repository. It publishes some, but not all, and I have no idea why. The targets are debug & release versions of static libraries, built for OS X and Windows (4 static libraries in total). The OS X libraries are stored but the Windows ones are not. If I modify the filter closure such that the OS X libraries are filtered out, nothing is stored.
model {
    buildTypes {
        debug
        release
    }
    platforms {
        "osx-x86_64" {
            operatingSystem "osx"
            architecture "x86_64"
        }
        "windows-x86_64" {
            operatingSystem "windows"
            architecture "x86_64"
        }
    }
    toolChains {
        // OS X and Windows toolchains (Clang and Mingw) described here
        // - they build the artifacts I wish to store ok
        // just removed for clarity
    }
} // end of model

libraries {
    saveMe {}
}

nativeArtifacts {
    libSaveMe {
        from (libraries.saveMe) { it instanceof StaticLibraryBinary &&
            (it.targetPlatform.name == "osx-x86_64" ||
             it.targetPlatform.name == "windows-x86_64")
            // if I make this closure on the line above something like:
            // it instanceof StaticLibraryBinary && it.targetPlatform.name == "windows-x86_64"
            // then nothing is saved in the Maven repo
        }
    }
}

publishing {
    repositories {
        maven {
            credentials {
                username = 'user'
                password = 'password'
            }
            url "http://hostname.com/path/to/repo"
        }
    }
    publications {
        mavPub(MavenPublication) {
            from nativeArtifacts.libSaveMe
        }
    }
}
I'm using a very nice external plugin (the Gradle Native Artifacts Plugin, written by Serge Gebhardt @sgeb), but before I try to make sense of its code (I'm a beginner with Gradle), I thought I'd ask whether there's something obvious I'm doing wrong.
I've tried putting logger statements in the filter closure and I can see that all possible combos of debug/release static/shared libraries are being tried, and the filter is correctly identifying true/false whether the libraries should be saved, but it doesn't make it to Maven.
Is there a debugging line I could put in publishing{} (or a task) to see what the actual contents of the nativeArtifacts.libSaveMe collection is?
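For the last question, one generic way to inspect what a publication will actually upload (a sketch; it assumes the native-artifacts plugin has already registered its artifacts on the publication by execution time, and printPublications is just a name I chose):

```groovy
// build.gradle -- dump each Maven publication's coordinates and artifacts
tasks.register('printPublications') {
    doLast {
        publishing.publications.withType(MavenPublication) { pub ->
            println "Publication ${pub.name}: ${pub.groupId}:${pub.artifactId}:${pub.version}"
            pub.artifacts.each { art ->
                println "  artifact: ${art.file} (classifier: ${art.classifier}, ext: ${art.extension})"
            }
        }
    }
}
```

Running gradle printPublications right before publish should show whether the Windows libraries ever made it into the publication at all.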
OK, so the moral of the story is: never assume, check. In this instance, check that the statement:
toolChains {
    // OS X and Windows toolchains (Clang and Mingw) described here
    // - they build the artifacts I wish to store ok
    // just removed for clarity
}
is actually true. It wasn't.
The publish task was run on a CI server, and the toolchain was failing to build the Windows artifacts there, while it worked on my local machine (due to a configuration error in the CI server's MinGW installation). The toolchain failure occurred without an error (unless run with --debug) and hence was invisible to me: the toolchain was silently replaced by a non-Windows compiler that didn't create the artifacts.