Spring Cloud Dataflow - handling argument in Task - spring

I would like to pass the ID of an object to a variable when starting a task in Spring Cloud Data Flow. I know this can be done with arguments or parameters, but I don't know how to handle those arguments or parameters in my Java code so that I can read the value there. Could you please indicate how this could be done?

In the context of Spring Cloud Data Flow, you can pass either arguments or properties to your task application.
Arguments are passed to the Spring Cloud Task application as its own command line arguments, so you need to write them exactly as the command line arguments your application expects.
Properties are application configuration properties or task deployer properties, and they must use the prefix app, deployer, or scheduler.
For instance, for the out-of-the-box timestamp task application, you can see how arguments and properties can be used in the following example:
Register out-of-the-box task applications
Create timestamp task:
dataflow:>task create a1 --definition "timestamp"
Launch the task with arguments and properties
dataflow:>task launch a1 --arguments "--spring.main.banner-mode=off" --properties "app.timestamp.format=YYYY/DD/MM"
In the above case, the command line argument --spring.main.banner-mode=off is passed to the timestamp application, while the property app.timestamp.format=YYYY/DD/MM sets the timestamp application's format configuration property.
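On the Java side, such an argument can be read like any other Spring Boot command line option, for example with an ApplicationRunner, and bound properties can be injected with @Value or @ConfigurationProperties. The following is a minimal sketch of a task application; the objectId argument name is purely illustrative and not part of the timestamp example above:

import java.util.List;

import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@EnableTask
@SpringBootApplication
public class ObjectIdTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(ObjectIdTaskApplication.class, args);
    }

    // Runs once when the task starts; ApplicationArguments exposes parsed
    // options such as --objectId=42 (a hypothetical argument name).
    @Bean
    public ApplicationRunner runner() {
        return (ApplicationArguments arguments) -> {
            List<String> values = arguments.getOptionValues("objectId");
            if (values != null && !values.isEmpty()) {
                System.out.println("Received objectId: " + values.get(0));
            }
        };
    }
}

Such a task could then be launched with, for example, task launch a1 --arguments "--objectId=42".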

Related

Alternative way to define bootRun task in Gradle doesn't work

I usually define tasks in Gradle (using Groovy) like tasks.withType(Type); e.g.: tasks.withType(JavaCompile), tasks.withType(Test), etc.
Now, I want to do the same with some of the provided Spring Boot tasks, namely bootRun and bootStartScripts, but Gradle cannot find them.
I know it's silly and I could get away with just using bootRun and bootStartScripts directly, but I would like to understand why those cannot be configured/defined that way.
I guess by "define" you mean configure, because withType can only be used to configure existing tasks. It takes a task type (a class) and a closure that is used to configure all available tasks of that type. This distinction matters, because a project may contain multiple tasks of the same type that should actually do completely different things; whether to configure all those tasks or just a specific one is important!
To pass the task type to the method withType you need to know the name of the class implementing the task type. This name is not necessarily related to the name(s) of the actual task(s). For the tasks test and compileJava of the Gradle Java Plugin those classes are org.gradle.api.tasks.testing.Test and org.gradle.api.tasks.compile.JavaCompile. Since those classes are provided by Gradle, they are automatically imported and can be referenced via their simple names Test and JavaCompile. But the Spring Boot Plugin is a third-party plugin, so the classes need to be referenced by their full names.
The task bootStartScripts from your question is of type CreateStartScripts, which is provided by Gradle. Therefore it can be configured like this:
tasks.withType(CreateStartScripts) {
    // configure
}
The task bootRun is of type org.springframework.boot.gradle.tasks.run.BootRun, which is provided by the Spring Boot Plugin, so you need to specify the fully qualified name:
tasks.withType(org.springframework.boot.gradle.tasks.run.BootRun) {
    // configure
}

Passing arguments to the task in the Spring Cloud Dataflow

I need to pass parameters to a Spring Cloud Data Flow task at startup. While I found out how to do this in the Spring Data Flow Shell (e.g., task create my-composed-task --definition "mytaskapp --displayMessage=hello"), I don't know how to refer to these parameters in Java code. Can anyone guide me?
The simplest example can be found in the Spring Cloud Task timestamp sample.
Here, the application prints the current timestamp when it starts; however, there is also the ability to override the format of the timestamp through the setFormat(String format) method.
For instance, you can create and launch the application as:
task create myTaskDefinition --definition "timestamp --format='yyyy'"
task launch myTaskDefinition
When the launch is successful, instead of the default yyyy-MM-dd HH:mm:ss.SSS, you will see the resulting output in yyyy format.
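The way the sample makes --format available in Java code is through a configuration properties class, so the argument ends up as an ordinary bean property. Below is a simplified sketch of that approach; the class name, the timestamp prefix, and the runner are abbreviations of the sample rather than its exact source:

import java.text.SimpleDateFormat;
import java.util.Date;

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@EnableTask
@EnableConfigurationProperties(TimestampTaskApplication.TimestampProperties.class)
@SpringBootApplication
public class TimestampTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(TimestampTaskApplication.class, args);
    }

    // The --format value from the task definition is bound onto this
    // properties class; without it, the default pattern below is used.
    @ConfigurationProperties(prefix = "timestamp")
    public static class TimestampProperties {
        private String format = "yyyy-MM-dd HH:mm:ss.SSS";
        public String getFormat() { return format; }
        public void setFormat(String format) { this.format = format; }
    }

    @Bean
    public CommandLineRunner printTimestamp(TimestampProperties properties) {
        return args -> System.out.println(
                new SimpleDateFormat(properties.getFormat()).format(new Date()));
    }
}

Data Flow resolves the short --format name in the definition to the application's qualified property (assumed here to be timestamp.format), so in code you only deal with the bound value.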

spring boot batch to spring cloud task with multiple jobs

I have a Spring Boot batch application with 5 distinct jobs that I run from the console using the command:
java -jar artifactName jobName param1
Now this project will be moved to the cloud, so I need to use Spring Cloud Task. So far so good.
I know that I have to add @EnableTask to the main class and also define the following property in application.properties:
spring.application.name=cloudTask
Reading the Spring documentation, I understand that to trigger my jobs using the Spring Cloud Data Flow server, I can define a task, which in this case would be cloudTask. But that alone doesn't make sense to me, because my application has 5 different jobs, so how would I trigger each of them? The questions are:
How do I connect this task name with the jobs defined in the application?
Logic tells me that I would also need to define 5 task names, but then how do I bind each task name to its respective job?
With the @EnableTask annotation, you should be able to register your batch application as a Task application in SCDF, under 'Apps'.
Once your batch application appears under Apps:
If all 5 jobs are independent, you should be able to create 5 different composed tasks with the same app name but different parameters,
OR
If the jobs are interlinked, the linked jobs can be combined into one composed task by providing aliases and passing the corresponding sets of parameters in the DSL.
Once a composed task is launched, the task execution status can be viewed under 'Tasks -> Executions', and the corresponding job status can be viewed under 'Jobs'.
To pass custom parameters to tasks, @EnableConfigurationProperties and @ConfigurationProperties can be leveraged, as sketched below.
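A minimal sketch of that last point, assuming a hypothetical cloudtask prefix with jobName and param1 parameters that the application could use to decide which of the 5 jobs to run; none of these names come from the question, they are for illustration only:

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableConfigurationProperties(CloudTaskConfiguration.CloudTaskProperties.class)
public class CloudTaskConfiguration {

    // Binds custom launch parameters such as:
    //   task launch cloudTask --arguments "--cloudtask.job-name=jobThree --cloudtask.param1=foo"
    @ConfigurationProperties(prefix = "cloudtask")
    public static class CloudTaskProperties {

        // Name of the batch job this launch should run (hypothetical parameter).
        private String jobName;

        // An additional job-specific value (hypothetical parameter).
        private String param1;

        public String getJobName() { return jobName; }
        public void setJobName(String jobName) { this.jobName = jobName; }
        public String getParam1() { return param1; }
        public void setParam1(String param1) { this.param1 = param1; }
    }
}

Alternatively, Spring Boot's standard spring.batch.job.names property can be passed at launch time to restrict which of the registered jobs a particular task execution runs.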

Not able to pass JOB PARAMETER to Steps - Spring BatchJobs

We are implementing Spring Batch jobs.
We need to pass a job parameter from the client/master to the slave.
The client/master is where our job and partitioning code live; we are invoking the job from a JUnit test, passing the job parameter.
The slave is where all the steps and their implementations (reader, writer, and processor) are defined.
We are able to achieve this in a standalone setup, but not in the client/server setup, and I am not sure why it works standalone but not there.
We are using WebLogic and Spring Integration along with JMS to achieve this.
Please assist.
We were able to resolve this issue with the following entry in our bean configuration file:
<property name="load" value="#{jobParameters[load]}"></property>
The load value is passed from our shell script:
./esk200.sh esk200-context.xml TIMESTAMP load=full
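For reference, the Java-config equivalent of that late binding is a step-scoped bean with a jobParameters SpEL expression; this is a generic sketch, not the question's actual configuration, and LoadAwareReader is a hypothetical class:

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoadStepConfiguration {

    // @StepScope defers bean creation until the step runs, so the
    // 'load' job parameter (e.g. load=full) is available for injection.
    @Bean
    @StepScope
    public LoadAwareReader loadAwareReader(
            @Value("#{jobParameters['load']}") String load) {
        return new LoadAwareReader(load);
    }

    // Hypothetical reader class shown only so the sketch is self-contained.
    public static class LoadAwareReader {
        private final String load;
        public LoadAwareReader(String load) { this.load = load; }
        public String getLoad() { return load; }
    }
}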

Enterprise manager 12c jmx operations

Wanted to know if OEM Cloud Control 12c supports Custom JMX operations as in JConsole. I've instrumented my java application to add some JMX Operations which take in a String as parameter, does some processing and return the result.
Example: Something like the add() operation
I tried using the jmxcli utility for creating a metadata plugin, but looks like the arguments (or parameters) to the JMX Operation should be hardcoded while creating the plugin. Is there any other way to run JMX operations on user-defined parameters in OEM?
This can be done using a custom UI; the scope of the QueryDescriptor should be "USER". The custom UI is used to input the value, and that value is then used in the JMX fetchlet.
