I'm trying to use the tooling API to run a Gradle task from Groovy code. The following works:
ProjectConnection connection = GradleConnector.newConnector()
        .forProjectDirectory(new File(System.properties.getProperty('user.dir')))
        .connect()
connection.newBuild()
        .forTasks('deploy')
        .setStandardOutput(System.out)
        .run()
But the task I want to run depends on project properties. For example, if I was running it from the command line, I'd use
gradle -Penv=local deploy
I can't figure out how, using the tooling API, to set those project values.
You can do:
connection.newBuild()
        .withArguments('-Penv=local')
        .forTasks('deploy')
        .setStandardOutput(System.out)
        .run()
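withArguments accepts the same build arguments you would otherwise pass on the gradle command line, so additional -P project properties or options such as --info can be supplied the same way.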
Related
I am using the Gradle tooling API and I encountered the following scenario.
There is a project that applies a certain plugin P which creates a task T only if the property shouldApplyP is passed.
Hence, if you run ./gradlew tasks --all you won't see task T, but if you run
./gradlew -PshouldApplyP tasks --all you will see it.
In the Gradle tooling API, once a ProjectConnection has been created, I can do
connection.getModel(GradleProject.class).getTasks()
But I can't see this specific task. Is there a way to pass this property
(-PshouldApplyP) to the project connection so that the task shows up in the getTasks() result?
connection.newBuild()
        .withArguments("-PshouldApplyP", "tasks", "--all")
        .run()
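Since ModelBuilder also extends LongRunningOperation, the property can likewise be passed when fetching the model itself, so that the task shows up in getTasks(). A minimal sketch of that variant:
connection.model(GradleProject.class)
        .withArguments('-PshouldApplyP')
        .get()
        .getTasks()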
In our CI environment, we currently have one build server (based on Atlassian Bamboo) and two SonarQube instances (versions 6.0 and 6.5). Initially, our CI server was configured to communicate with the 6.0 SonarQube instance. This has been configured in the /home/bamboo/.gradle/gradle.properties file on our CI server like this:
systemProp.sonar.host.url=<http url of SonarQube 6.0 instance>
systemProp.sonar.login=<username here>
systemProp.sonar.password=<password here>
Now we have another Gradle-based project running on our CI server which shall talk to the new SonarQube 6.5 instance. I have tried to configure this but failed every time.
Things I have done so far:
Added commandline arguments to gradle wrapper command:
I have tried adding -Dsonar.host.url=, -Dsonar.login=, -Dsonar.password= to the Gradle command. As this didn't seem to work, I have also tried to set commandline arguments as SonarQube system properties using -DsystemProp.sonar.host.url=, -DsystemProp.sonar.login=, -DsystemProp.sonar.password=. This didn't work either.
Added properties to the build.gradle file like this:
sonarqube {
    properties {
        property "sonar.host.url", "<http url of SonarQube 6.5 instance>"
        property "sonar.login", "<username here>"
        property "sonar.password", "<password here>"
        ...<other SonarQube analysis settings here>...
    }
}
In all cases, the CI server talked to the wrong SonarQube instance (6.0). My question is whether it is possible to configure a single project to talk to a different SonarQube instance. We use Gradle 3.2.1 as the build tool, together with the org.sonarqube Gradle plugin.
Thank you for any help.
André
Your first try did not work because you set the system properties on the command line, but the systemProp.* project properties from gradle.properties are applied later and reset the system properties to the configured values.
Your second try did not work because the systemProp.sonar.login syntax is only supported in gradle.properties files, not in -P command-line project properties.
Your third try did not work because the SonarQube scanner prefers system property values over values configured via the DSL, so that local configuration can override what is configured in the build script.
You need to set the system properties in your build script manually; this then overwrites what was automatically set from the project properties. Using the project's gradle.properties file does not work, as the user file overwrites the project file. So you need something like System.properties.'sonar.login' = '...' in your build script. You can either hard-code the values there, or use project properties that you set in your gradle.properties file or via -P parameters.
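As an illustration, a minimal sketch of such a build.gradle snippet; the sonar.* keys are from the question, while the sonarHostUrl, sonarLogin and sonarPassword project property names are hypothetical:
// map -P project properties onto the sonar.* system properties
// (the sonarHostUrl/sonarLogin/sonarPassword names are hypothetical)
if (project.hasProperty('sonarHostUrl')) {
    System.properties.'sonar.host.url' = project.sonarHostUrl
    System.properties.'sonar.login' = project.sonarLogin
    System.properties.'sonar.password' = project.sonarPassword
}
This one project could then be built with gradle sonarqube -PsonarHostUrl=<http url of SonarQube 6.5 instance> -PsonarLogin=<username here> -PsonarPassword=<password here> while all other projects keep using the defaults from the user gradle.properties file.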
Besides that, I'd never depend on having any configuration in the Gradle user home directory on a build server. Most build servers use build agents that might run on distributed machines, so you would always have to make sure that all build agents are configured identically. I'd always put the necessary settings in the build configuration on the build server itself, either as system properties, environment variables, or command-line arguments.
Just my 2ct.
According to Google's Dataflow documentation, Dataflow job template creation is "currently limited to Java and Maven." However, the documentation for Java across GCP's Dataflow site is... messy, to say the least. The 1.x and 2.x versions of Dataflow are pretty far apart in terms of details, and I have some specific code requirements that lock me into the 2.0.0r3 codebase, so I'm pretty much required to use Apache Beam. Apache is -- understandably -- quite dedicated to Maven, but institutionally my company has thrown the bulk of its weight behind Gradle, so much so that they migrated all their Java projects over to it last year and have pushed back against re-introducing Maven.
However, now we seem to be at an impasse, because we've got a specific goal to try to centralize a lot of our back-end gathering in GCP's Dataflow, and GCP Dataflow doesn't appear to have formal support for Gradle. If it does, it's not in the official documentation.
Is there a sufficient technical basis to actually build Dataflow templates with Gradle and the issue is that Google's docs simply haven't been updated to support this? Is there a technical reason why Gradle can't do what's being done with Maven? Is there a better guide for working with GCP Dataflow than the docs on Google's and Apache's websites? I haven't worked with Maven archetypes before, and all the searches I've done for "gradle archetypes" turn up results from, at best, over a year ago. Most of the information points to forum discussions from 2014 and version 1.7rc3, but we're on 3.5. This feels like it ought to be a solved problem, but for the life of me I can't find any current information on this online.
Commandline to Run Cloud Dataflow Job With Gradle
Generic Execution
$ gradle clean execute -DmainClass=com.foo.bar.myfolder.MyPipeline -Dexec.args="--runner=DataflowRunner --gcpTempLocation=gs://my-bucket/tmpdataflow" -Pdataflow-runner
Specific Example
$ gradle clean execute -DmainClass=com.foo.bar.myfolder.MySpannerPipeline -Dexec.args="--runner=DataflowRunner --gcpTempLocation=gs://my-bucket/tmpdataflow --spannerInstanceId=fooInstance --spannerDatabaseId=barDatabase" -Pdataflow-runner
Explanation of Commandline
gradle clean execute uses the execute task, which allows us to easily pass command-line flags to the Dataflow pipeline. The clean task removes previous build outputs.
-DmainClass= specifies the Java main class, since we have multiple pipelines in a single folder. Without this, Gradle doesn't know what the main class is and where to pass the args. Note: your build.gradle file must include the task execute per below.
-Dexec.args= specifies the execution arguments, which will be passed to the pipeline. Note: your build.gradle file must include the task execute per below.
--runner=DataflowRunner and -Pdataflow-runner ensure that the Google Cloud Dataflow runner is used and not the local DirectRunner.
--spannerInstanceId= and --spannerDatabaseId= are just pipeline-specific flags; your pipeline probably won't have them.
build.gradle contents (NOTE: You need to populate your specific dependencies)
apply plugin: 'java'
apply plugin: 'maven'
apply plugin: 'application'
group = 'com.foo.bar'
version = '0.3'
mainClassName = System.getProperty("mainClass")
sourceCompatibility = 1.8
targetCompatibility = 1.8
repositories {
    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
    maven { url "http://repo.maven.apache.org/maven2" }
}
dependencies {
    compile group: 'org.apache.beam', name: 'beam-sdks-java-core', version: '2.5.0'
    // Insert your build deps for your Beam Dataflow project here
    runtime group: 'org.apache.beam', name: 'beam-runners-direct-java', version: '2.5.0'
    runtime group: 'org.apache.beam', name: 'beam-runners-google-cloud-dataflow-java', version: '2.5.0'
}
task execute(type: JavaExec) {
    // main class and program arguments are supplied on the command line
    // via -DmainClass=... and -Dexec.args="..." (see above)
    main = System.getProperty("mainClass")
    classpath = sourceSets.main.runtimeClasspath
    // forward all -D system properties to the pipeline JVM
    systemProperties System.getProperties()
    // split the -Dexec.args string into individual program arguments
    args System.getProperty("exec.args").split()
}
Explanation of build.gradle
We use the task execute(type: JavaExec) in order to easily pass runtime flags into the Java Dataflow pipeline program. For example, we can specify what the main class is (since we have more than one pipeline in the same folder) and we can pass specific Dataflow arguments (i.e., specific PipelineOptions).
The line of build.gradle that reads runtime group: 'org.apache.beam', name: 'beam-runners-google-cloud-dataflow-java', version:'2.5.0' is very important. It provides the Cloud Dataflow runner that allows you to execute pipelines in Google Cloud Platform.
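Note that the build.gradle above pulls in both the DirectRunner and the DataflowRunner unconditionally, so with it the runner is selected purely by --runner=. In some setups the -Pdataflow-runner property is instead used to gate the runner dependency; a hedged sketch of that pattern (the conditional structure is an assumption, only the property name comes from the command line above):
dependencies {
    // only add the Cloud Dataflow runner when -Pdataflow-runner is passed (sketch)
    if (project.hasProperty('dataflow-runner')) {
        runtime group: 'org.apache.beam', name: 'beam-runners-google-cloud-dataflow-java', version: '2.5.0'
    } else {
        runtime group: 'org.apache.beam', name: 'beam-runners-direct-java', version: '2.5.0'
    }
}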
There's absolutely nothing stopping you from writing your Dataflow application/pipeline in Java and using Gradle to build it.
Gradle will simply produce an application distribution (e.g. ./gradlew clean distTar), which you then extract and run with the --runner=TemplatingDataflowPipelineRunner --dataflowJobFile=gs://... parameters.
It's just a runnable Java application.
The template and all the binaries will then be uploaded to GCS, and you can execute the pipeline through the console, CLI or even Cloud Functions.
You don't even need to use Gradle. You could just run it locally and the template/binaries will be uploaded. But I'd imagine you are using a build server like Jenkins.
Maybe the Dataflow docs should read "Note: Template creation is currently limited to Java", because this feature is not available in the Python SDK yet.
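For illustration, a hedged sketch of that flow; the application name my-pipeline and the GCS path are hypothetical and depend on your project:
$ ./gradlew clean distTar
$ tar -xf build/distributions/my-pipeline.tar   # actual archive name includes your project name/version
$ my-pipeline/bin/my-pipeline --runner=TemplatingDataflowPipelineRunner --dataflowJobFile=gs://my-bucket/templates/my-template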
Update: 7th December 2020
We can stage Dataflow templates using Gradle as well.
To stage a template, these are the mandatory parameters:
project
region
gcpTempLocation (good to have if you don't have bucket-create access; if not given, Dataflow will create a bucket automatically)
stagingLocation
templateLocation
Here is the sample command line in gradle:
gradle clean execute -DmainClass=com.something.mainclassname -Dexec.args="--runner=DataflowRunner --project=<project_id> --region=<region_name> --gcpTempLocation=gs://bucket/somefolder --stagingLocation=gs://bucket/somefolder --templateLocation=gs://bucket/somefolder"
Assumptions:
GOOGLE_APPLICATION_CREDENTIALS environment variable is set with a service account key.
gradle is installed.
JAVA_HOME environment variable is set.
Bare minimum dependencies are added.
compile 'org.apache.beam:beam-sdks-java-core:2.22.0'
compile 'org.apache.beam:beam-sdks-java-io-google-cloud-platform:2.22.0'
compile 'org.apache.beam:beam-sdks-java-extensions-google-cloud-platform-core:2.22.0'
compile 'org.apache.beam:beam-runners-google-cloud-dataflow-java:2.22.0'
I have a Go application which exposes a REST API and logs information to a DB. I am trying to convert the Makefile to a Gradle build. Is there a default way, similar to the maven2gradle plugin, or must the Gradle build file be written manually? I checked the syntactical differences between Gradle and Makefiles, but I am still not clear about passing runtime arguments to Gradle similar to
run: build
	./hello -conf=/apps/content/properties/prop.json -v=0 -logDest="FILE" -log_dir="/var/log/logdir"
hello is my executable and the others are runtime arguments. This is my first attempt at migrating Make to Gradle and I couldn't find any clear documentation. Please help.
As far as I have checked, there is no plugin that does this directly. As a workaround, the build steps could be written as separate tasks in Gradle and ordered accordingly. The tasks would set the GOPATH, install dependencies, and build the application, each run as a command-line process from Gradle, which supports this as described in the Gradle documentation (see the sketch below). Hope it helps.
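As an illustration, a minimal, hedged build.gradle sketch of that workaround; the executable name and flags are taken from the question, while the task names and the go build invocation are assumptions:
// wrap the Makefile steps in Exec tasks (task names are hypothetical)
task goBuild(type: Exec) {
    // build the 'hello' executable from the Go sources in the project directory
    commandLine 'go', 'build', '-o', 'hello'
}
task goRun(type: Exec) {
    dependsOn goBuild
    // mirrors the Makefile's 'run' target; the shell quotes around FILE and
    // the log dir are not needed, as each argument is passed as-is
    commandLine './hello',
            '-conf=/apps/content/properties/prop.json',
            '-v=0',
            '-logDest=FILE',
            '-log_dir=/var/log/logdir'
}
Running gradle goRun would then correspond to make run.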
I have a custom plugin and I am writing tests for it. For this I'm using the Gradle tooling API (I found that to be the recommended way to test).
One of the tests requires me to run a task with some environment variable set. How do I test this? I do not see ProjectConnection providing a way to set environment variables.
If I had to test it manually, I would do this:
setenv LRG_REPOS foo
gradle verify_lrg -PlrgName=abc
where verify_lrg is task added by my custom plugin.
Currently, to solve this, I am running the build using ProcessBuilder, but I would like to know if there is a Gradle tooling API way (because all the other tests use the Gradle tooling API).
Configuring environment variables via the Gradle tooling API is possible since version 3.5; see the details at https://github.com/gradle/gradle/pull/1029
https://github.com/gradle/gradle/blob/446468213543e32a0ce1bce0bbedabcfe925c572/subprojects/tooling-api/src/main/java/org/gradle/tooling/LongRunningOperation.java#L190
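For illustration, a minimal sketch using LongRunningOperation.setEnvironmentVariables (requires tooling API >= 3.5); the project directory path is hypothetical, while the task, property, and variable names come from the question:
ProjectConnection connection = GradleConnector.newConnector()
        .forProjectDirectory(new File('/path/to/test/project'))  // hypothetical path
        .connect()
try {
    connection.newBuild()
            .setEnvironmentVariables([LRG_REPOS: 'foo'])  // replaces 'setenv LRG_REPOS foo'
            .withArguments('-PlrgName=abc')
            .forTasks('verify_lrg')
            .setStandardOutput(System.out)
            .run()
} finally {
    connection.close()
}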