Spring Boot Batch to Spring Cloud Task with multiple jobs

I have a Spring Boot batch application with 5 unique jobs that are executed from the console using the command:
java -jar artifactName jobName param1
but now this project will be moved to the cloud, so I need to use Spring Cloud Task. So far so good.
I know that I have to add @EnableTask to the main class and also define this property in application.properties:
spring.application.name=cloudTask
Reading the Spring documentation, I understand that to trigger my jobs through the Spring Cloud Data Flow server I can define a task, which in this case would be cloudTask. But that does not make sense to me, because how would it be triggered, given that my application has 5 different jobs? So the question is:
how do I connect this task name with the jobs defined in the application?
Logic tells me that I need to define 5 task names as well, but then how do I bind each task name to its respective job?

With the @EnableTask annotation, you should be able to register your batch application as a Task application in SCDF, under 'Apps'.
Once your batch application appears in Apps:
If all 5 jobs are independent, you should be able to create 5 different composed tasks with the same app name but with different parameters,
OR
If the jobs are interlinked, the linked jobs can be combined into one composed task by providing an alias and passing the corresponding set of parameters in the DSL.
Once the composed task is launched, the task execution status can be viewed under 'Task -> Executions' and the corresponding job status under 'Jobs'.
To pass custom parameters to tasks, @EnableConfigurationProperties / @ConfigurationProperties can be leveraged.
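As a concrete sketch of the binding, assuming Spring Boot's spring.batch.job.names property and illustrative names (batch-app, job1): register the application once, then create one task definition per job and select the job at launch time.

dataflow:>app register --name batch-app --type task --uri maven://com.example:artifactName:1.0.0
dataflow:>task create task-job1 --definition "batch-app"
dataflow:>task launch task-job1 --arguments "--spring.batch.job.names=job1 --param1=value"

Repeating task create / task launch for each of the 5 job names gives you 5 task definitions backed by the same application. (On Spring Boot 3 the property is spring.batch.job.name, singular.)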

Related

Spring Cloud Dataflow - Set max-connection-pool for Composed Task Runner

I've encountered an issue in Spring Cloud Dataflow when running multiple composed tasks at once.
The Hikari DataSource takes 10 connections from the connection pool by default. When running, for example, 10 composed tasks at once, this means 100 connections, plus the connections required by every task in each composed task.
I tried running the Composed Task Runner locally with spring.datasource.hikari.maximum-pool-size=1 and it worked.
Is there any way to set this property on every Composed Task Runner by default? I did not find any documentation on modifying things like this for composed tasks.
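One per-launch sketch, assuming your SCDF version honors Composed Task Runner properties under the app.composed-task-runner prefix (that prefix is an assumption here, borrowed from the convention used for other CTR properties, not a verified server-wide default):

dataflow:>task launch my-composed-task --properties "app.composed-task-runner.spring.datasource.hikari.maximum-pool-size=1"

This only affects the single launch; making it a true default would mean baking the property into a custom CTR image or the server configuration.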

Spring Boot Quartz to skip some jobs

I use a Spring Boot + Maven multi-module project. I am using a cluster-aware Quartz configuration, so job details are stored in the database. Let's say 2 Spring Boot Maven modules (projects) are using Quartz, but I want each module to run its own jobs, which are different for each module. As it stands, when I start one module, it actually tries to run jobs from the other module too, because the Quartz engine reads the jobs to run from the database. So how do I specify, in each module, to run only the specific jobs related to that module?
One way is to have different table prefixes for each module. Is there any other way to use the same Quartz tables between 2 different modules, with each module deciding which jobs to run?
I have almost the same setup. My solution was:
Job A writes a flag ("running_by" = jobA) into the DB table next to the job, and when it's done, it removes the flag.
Job B then checks whether the job already has a flag. If not, it proceeds; if yes, it skips the job because Job A is already running it. A sketch of this guard follows.
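A minimal Java sketch of that guard, assuming an illustrative table quartz_job_flags(job_name, running_by) and a JobFactory that autowires Spring beans into Quartz jobs (e.g. one built on AutowireCapableBeanFactory); all names are illustrative:

import org.quartz.DisallowConcurrentExecution;
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;

@DisallowConcurrentExecution
public class JobB implements Job {

    @Autowired
    private JdbcTemplate jdbc; // assumes an autowiring Spring JobFactory

    @Override
    public void execute(JobExecutionContext context) {
        // Atomically claim the job: the UPDATE only succeeds if no flag is set.
        int claimed = jdbc.update(
                "UPDATE quartz_job_flags SET running_by = 'jobB' "
                        + "WHERE job_name = 'shared-job' AND running_by IS NULL");
        if (claimed == 0) {
            return; // another module already holds the flag, so skip this run
        }
        try {
            // ... do the actual work here ...
        } finally {
            // Clear the flag so the next run can claim the job again.
            jdbc.update("UPDATE quartz_job_flags SET running_by = NULL "
                    + "WHERE job_name = 'shared-job'");
        }
    }
}

The single UPDATE ... WHERE running_by IS NULL doubles as a cheap compare-and-set, so two modules racing for the same job cannot both claim it.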
Configure a property in quartz.properties:
org.quartz.jobStore.isClustered = true
That makes Quartz work in cluster mode.

Spring task:scheduled or @Scheduled to restrict a job from running on multiple instances

I have one @Scheduled job which runs on multiple servers in a clustered environment. However, I want the job to run on only one server; the other servers should not run the same job once another server has started it.
I have explored Spring Batch, which has a lock mechanism using a database table, but I'm looking for a solution using only Spring's task:scheduler.
I had the same problem, and the solution I implemented was a lock mechanism with Hazelcast. To make it easy to use, I also added a dedicated annotation and a bit of Spring AOP. With this trick I was able to enforce a single schedule across the cluster with a single annotation, roughly as sketched below.
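A minimal sketch of that idea, assuming Hazelcast 4+ (CP subsystem FencedLock) and an illustrative @ClusterSingleton annotation; these names are assumptions, not the original poster's code:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.cp.lock.FencedLock;

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@interface ClusterSingleton {
    String value(); // name of the distributed lock
}

@Aspect
@Component
class ClusterSingletonAspect {

    private final HazelcastInstance hazelcast;

    ClusterSingletonAspect(HazelcastInstance hazelcast) {
        this.hazelcast = hazelcast;
    }

    @Around("@annotation(singleton)")
    public Object runOnOneNodeOnly(ProceedingJoinPoint pjp, ClusterSingleton singleton) throws Throwable {
        FencedLock lock = hazelcast.getCPSubsystem().getLock(singleton.value());
        if (!lock.tryLock()) {
            return null; // another node is already running this job; skip
        }
        try {
            return pjp.proceed();
        } finally {
            lock.unlock();
        }
    }
}

A @Scheduled method then only needs @ClusterSingleton("nightly-report") on top, and only the node that wins the lock executes it.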
Spring Batch has the nice property that it will not run a job with the same job parameters twice.
You can use this feature so that when the same Spring Batch job kicks off on another server, it does not run again.
Usually people pass a timestamp as a parameter precisely to bypass this logic, which you can change; see the sketch below.
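A minimal illustration of that behavior, assuming the JobLauncher and Job beans come from your Spring Batch configuration (the names and parameter values here are illustrative):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class RunOncePerParameters {

    static void launchTwice(JobLauncher launcher, Job job) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("run.key", "daily-2024-01-01") // fixed key, no timestamp
                .toJobParameters();
        launcher.run(job, params); // first launch runs normally
        launcher.run(job, params); // second launch throws JobInstanceAlreadyCompleteException
    }
}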

Spring Cloud Dataflow - handling arguments in a Task

I would like to pass the ID of an object into a variable when starting a task in Spring Cloud Dataflow. I know this can be done with arguments or parameters, but I don't know how to handle these arguments or parameters in Java code so I can pick up the value. Could you please indicate how this could be done?
In the context of Spring Cloud Data Flow, you can pass arguments or properties to your task application.
The arguments you pass to the Spring Cloud Task application are the command line arguments for the task application itself; you need to qualify them as command line arguments for your application.
The properties you pass for the Spring Cloud Task application are the application configuration properties or task deployer properties. They have to use the prefix app, deployer or scheduler.
For instance, for the out-of-the-box timestamp task application, you can see how arguments and properties can be used in the following example:
Register out-of-the-box task applications
Create timestamp task:
dataflow:>task create a1 --definition "timestamp"
Launch the task with arguments and properties
dataflow:>task launch a1 --arguments "--spring.main.banner-mode=off" --properties "app.timestamp.format=YYYY/DD/MM"
In the above case, the command line argument --spring.main.banner-mode=off is passed to the timestamp application, while the timestamp application's format property is passed as the application property app.timestamp.format.
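To pick such a value up in Java, one option is Spring Boot's normal property binding. A minimal sketch, assuming an illustrative argument --object.id=42 passed via --arguments at launch:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask
public class IdTaskApplication implements ApplicationRunner {

    // --object.id=42 on the command line binds here like any other Boot property
    @Value("${object.id:unset}")
    private String objectId;

    public static void main(String[] args) {
        SpringApplication.run(IdTaskApplication.class, args);
    }

    @Override
    public void run(ApplicationArguments args) {
        System.out.println("object.id via @Value: " + objectId);
        // the parsed command line is also available directly:
        System.out.println("object.id via args: " + args.getOptionValues("object.id"));
    }
}

Launched, for example, with: dataflow:>task launch a1 --arguments "--object.id=42"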

How to make Activiti run multiple parallel process instances

I have a Spring Boot web application project using an embedded Activiti engine (via the activiti-spring-boot-starter-basic Maven dependency) with a simple workflow that processes a business request. The service tasks are implemented with JavaDelegate objects.
We may have multiple parallel requests that must be processed at the same time. However, the Activiti engine waits for a given process instance to reach a wait state (waiting for messages) before beginning a new process instance.
We are starting a process instance using this command:
runtimeService.startProcessInstanceByKey("process_simple", variables);
I have added the following to application.properties, to no avail:
# Activiti config
spring.activiti.jobExecutorActivate = true
spring.activiti.asyncExecutorEnabled = true
spring.activiti.asyncExecutorActivate = true
I have also defined all service tasks as asynchronous and set the multi-instance type to parallel, but that does not work either.
Have I missed a setting?
Thanks for any help.
