Completing user tasks in Activiti with Spring Boot

I have two user tasks, Leave and Audit. I completed the Leave task and moved on to the Audit task, but
the ACT_RU_TASK table still has rows for both Leave and Audit. How do I keep only the Audit row in ACT_RU_TASK after completing the Leave task?
BPMN:
(BPMN diagram image)
@SpringBootTest
class ActivitistartApplicationTests {

    @Autowired
    ProcessEngine processEngine;
    @Autowired
    RuntimeService runtimeService;
    @Autowired
    TaskService taskService;

    @Test
    void startProcess() {
        String processDefiKey = "porcess";
        ProcessInstance pi = runtimeService.startProcessInstanceByKey(processDefiKey);
    }

    @Test
    void completeLeave() {
        String assignee = "Tom";
        TaskQuery taskQuery = taskService.createTaskQuery();
        List<Task> list = taskQuery.taskAssignee(assignee).list();
        for (Task task : list) {
            processEngine.getTaskService().complete(task.getId());
        }
    }

    @Test
    void completeCheck() {
        String assignee = "Paul";
        TaskQuery taskQuery = taskService.createTaskQuery();
        List<Task> list = taskQuery.taskAssignee(assignee).list();
        for (Task task : list) {
            processEngine.getTaskService().complete(task.getId());
        }
    }
}
Step 1
I run startProcess(); a new task is created and ACT_RU_TASK contains its row.
(ACT_RU_TASK table screenshot)
Step 2
I run completeLeave().
(ACT_RU_TASK table screenshot, after completing the Leave task)
The Leave task row should have been deleted from ACT_RU_TASK.
Step 3
I run completeCheck().
The Check task is completed.
The Check task row is deleted from ACT_RU_TASK, but the Leave task row still exists.
After all tasks are completed, ACT_RU_TASK should contain no rows for this process.
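For reference, Activiti removes completed tasks from ACT_RU_TASK and moves them to the history table ACT_HI_TASKINST, so the runtime table should only ever hold the currently open tasks. A minimal sketch to verify that from the API instead of reading the table directly (assuming the same "porcess" definition key as above):

@Test
void listOpenTasks() {
    // TaskQuery reads the runtime tasks, i.e. exactly what ACT_RU_TASK holds
    List<Task> tasks = taskService.createTaskQuery()
            .processDefinitionKey("porcess")
            .list();
    for (Task task : tasks) {
        System.out.println(task.getName() + " -> " + task.getAssignee());
    }
}

If a Leave row is still present after completeLeave(), it may belong to a second process instance left over from an earlier run of startProcess(); printing task.getProcessInstanceId() in the loop above shows which instance each row belongs to.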

Related

Camunda Spring Boot external tasks do not run in parallel

I created a small Camunda application with Spring Boot; all tasks are external. The problem is that the tasks do not run in parallel, only one task at a time.
And one of the tasks does not complete until I close the Spring Boot application.
Please help.
Adding the following to application.yml did not solve the problem
spring:
  task:
    execution:
      pool:
        core-size: 10
        max-size: 20
    scheduling:
      pool:
        size: 20
@Component
@ExternalTaskSubscription("getCustomers") // create a subscription for this topic name
public class GetCustomers implements ExternalTaskHandler {

    @Resource(name = "amlDataSourceJdbcTemplate")
    private JdbcTemplate jdbc;

    @Override
    public void execute(ExternalTask externalTask, ExternalTaskService externalTaskService) {
        Logger.getLogger("GetCustomers").log(Level.INFO, "worker: GetCustomers started");

        String sql = "SELECT cust_i_id FROM CUSTOMERS";
        List<String> customersList = jdbc.query(sql, (rs, rowNum) -> rs.getString("cust_i_id"));

        // we could call an external service to create the loan documents here
        VariableMap variables = Variables.createVariables();
        variables.put(Common.VAR_NAME_CUSTOMERS_LIST, customersList);
        Logger.getLogger("GetCustomers").log(Level.INFO, "worker: GetCustomers finished");

        // complete the external task
        externalTaskService.complete(externalTask, variables);
        // externalTaskService.complete(externalTask);
        Logger.getLogger("GetCustomers").log(Level.INFO, "worker: GetCustomers Task completed");
    }
}
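One direction worth testing (a sketch, not a confirmed fix): hand the long-running work to an application-managed thread pool so execute() returns immediately and the client thread is free to fetch the next task. Everything here is an assumption to verify against your client version — in particular whether completing a task from a non-client thread is supported — and the topic's lock duration must outlast the work.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@Component
@ExternalTaskSubscription("getCustomers")
public class GetCustomersParallel implements ExternalTaskHandler {

    // hypothetical pool; size it to the number of tasks you want in flight at once
    private final ExecutorService pool = Executors.newFixedThreadPool(10);

    @Override
    public void execute(ExternalTask externalTask, ExternalTaskService externalTaskService) {
        pool.submit(() -> {
            try {
                // ... the slow JDBC work from the handler above goes here ...
                externalTaskService.complete(externalTask);
            } catch (Exception e) {
                // zero retries for the sketch; adjust the retry strategy as needed
                externalTaskService.handleFailure(externalTask, e.getMessage(), null, 0, 0);
            }
        });
    }
}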

How to configure Spring Batch jobs to run in parallel, independently of another set of jobs?

I am working with Spring Batch and have the following situation:
we have two sets of schedulers.
public class SchedulerA {

    @Autowired
    private Job a;
    @Autowired
    private Job b;
    @Autowired
    private Job c;
    @Autowired
    private Job d;

    @Autowired
    private SpringBatchJobHandler springBatchJobHandler;

    @Autowired
    private JobHandler jobHandler;

    private List<String> jobName = new ArrayList<>();

    // @PostConstruct rather than @Bean: a @Bean method must not be void,
    // and the intent here is to populate the list once after injection
    @PostConstruct
    public void schedulerLoad() {
        jobName.add(a.getName());
        jobName.add(b.getName());
        jobName.add(c.getName());
        jobName.add(d.getName());
    }

    @Scheduled(fixedDelay = 300000)
    private void jobScheduler() throws Exception {
        for (String job : jobName) {
            if (!jobHandler.isJobForceStopped()) {
                springBatchJobHandler.runJob(job);
            }
        }
    }
}
And this is the second scheduler:
public class SchedulerB {

    @Autowired
    private Job q;
    @Autowired
    private Job w;
    @Autowired
    private Job e;
    @Autowired
    private Job r;

    @Autowired
    private SpringBatchJobHandler springBatchJobHandler;

    @Autowired
    private JobHandler jobHandler;

    private List<String> jobName = new ArrayList<>();

    // same note as above: @PostConstruct instead of a void @Bean method
    @PostConstruct
    public void schedulerLoad() {
        jobName.add(q.getName());
        jobName.add(w.getName());
        jobName.add(e.getName());
        jobName.add(r.getName());
    }

    @Scheduled(fixedDelay = 300000)
    private void jobScheduler() throws Exception {
        for (String job : jobName) {
            if (!jobHandler.isJobForceStopped()) {
                springBatchJobHandler.runJob(job);
            }
        }
    }
}
What we are trying to achieve is that the jobs within each scheduler run sequentially, while the two scheduler classes run in parallel.
Ex: SchedulerA and SchedulerB should run at the same time, in parallel, but the jobs within each class must run sequentially.
Is it possible to achieve the above scenario?
I know the question may be confusing, but we don't have any other choice..!!
Please share your feedback on this situation.
There are two TaskExecutors involved in your scenario:
The one used by Spring Batch to run jobs when launched via a JobLauncher. By default, this uses a SyncTaskExecutor which runs submitted jobs in the calling thread. If you use the default JobLauncher in your SpringBatchJobHandler, then your jobs will be run in sequence since you submit them in a for loop.
The one used by Spring Boot to run scheduled tasks. By default, this one has a pool size of 1, see Spring Boot documentation: The thread pool uses one thread by default and those settings can be fine-tuned using the spring.task.scheduling namespace. This means if you schedule two tasks to run at the same time as in your use case, they will be run in sequence by the same thread.
Now, if you want to run those scheduled tasks in parallel using different threads, you need to increase the pool size of the TaskExecutor configured by Spring Boot to 2 or more. You can do that by setting the following property:
spring.task.scheduling.pool.size=2
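The same effect can be achieved programmatically; a minimal sketch, assuming Spring Boot's scheduler auto-configuration backs off once you define your own TaskScheduler bean (SchedulingConfig is a hypothetical name):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;

@Configuration
public class SchedulingConfig {

    @Bean
    public ThreadPoolTaskScheduler taskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(2); // one thread per scheduler class that must run concurrently
        return scheduler;
    }
}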

Spring Batch question for email summary at the end of all jobs

We have approximately 20 different Spring Batch jobs (some running as microservices, some lumped together in one Spring Boot app). What I need to do is gather all the errors encountered by ALL the jobs, as well as the number of records processed, and summarize it all in an email.
I have implemented ItemListenerSupport as a start:
public class BatchItemListener extends ItemListenerSupport<BaseDomainDataObject, BaseDomainDataObject> {

    private final static Log logger = LogFactory.getLog(BatchItemListener.class);
    private final static Map<String, Integer> numProcessedMap = new HashMap<>();
    // stack traces keyed by item; the original Map<String, Integer> would not compile here
    private final static Map<String, String> errorMap = new HashMap<>();

    @Override
    public void onReadError(Exception ex) {
        logger.error("Encountered error on read", ex);
    }

    @Override
    public void onProcessError(BaseDomainDataObject item, Exception ex) {
        String msgBody = ExceptionUtils.getStackTrace(ex);
        errorMap.put(item.toString(), msgBody);
    }

    @Override
    public void onWriteError(Exception ex, List<? extends BaseDomainDataObject> items) {
        logger.error("Encountered error on write", ex);
        numProcessedMap.computeIfAbsent("numErrors", val -> items.size());
    }

    @Override
    public void afterWrite(List<? extends BaseDomainDataObject> items) {
        logger.info("Logging successful number of items written...");
        numProcessedMap.computeIfAbsent("numSuccess", val -> items.size());
    }
}
But how do I access the errors accumulated in the listener when my batch jobs are finally finished? Right now I don't even have a good way to know when they are all finished. Any suggestions? Does Spring Batch provide something better for summarizing jobs?
Spring Batch does not provide a way to orchestrate jobs. The closest you can get out of the box is a "master" job with multiple steps of type JobStep that delegate to your sub-jobs. With this approach, you can do the aggregation in a JobExecutionListener#afterJob configured on the master job.
Otherwise, you can use Spring Cloud Data Flow and create a composed task of all your jobs.
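A sketch of that "master job" approach, assuming Spring Batch 4 builder factories; wrappedJobA, wrappedJobB, and masterJob are hypothetical names, and the listener body is where the maps collected by BatchItemListener would be aggregated into the email:

@Bean
public Step wrappedJobA(StepBuilderFactory steps, Job jobA, JobLauncher jobLauncher) {
    // a JobStep: a step whose only work is launching a sub-job
    return steps.get("wrappedJobA").job(jobA).launcher(jobLauncher).build();
}

@Bean
public Job masterJob(JobBuilderFactory jobs, Step wrappedJobA, Step wrappedJobB) {
    return jobs.get("masterJob")
            .start(wrappedJobA)
            .next(wrappedJobB)
            .listener(new JobExecutionListenerSupport() {
                @Override
                public void afterJob(JobExecution jobExecution) {
                    // every sub-job has finished at this point:
                    // aggregate counters and errors here and send the summary email
                }
            })
            .build();
}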

Job execution listener fails to store values in the job execution context table

I have implemented logic to store the number of records processed with failure and success into the job execution context, in the afterJob() method of a JobExecutionListener implementation:
public void afterJob(JobExecution jobExecution) {
    final long jobExecutionId = jobExecution.getId();
    final BatchStatus jobStatus = jobExecution.getStatus();
    final ExecutionContext jobExecutionContext = jobExecution.getExecutionContext();
    String exitCodeAndMessage = null;
    Map<String, Integer> recordsProcessed = null;
    switch (jobStatus) {
        case COMPLETED:
            //exitCodeAndMessage = getExitCodeAndMessageFromEveryStep(jobExecution);
            recordsProcessed = getExitCodeAndMessageFromEveryStep(jobExecution);
            if (exitCodeAndMessage == null) {
                exitCodeAndMessage = "COMPLETED";
            }
            jobExecutionContext.putString("AfterJob", "Success");
            jobExecutionContext.put("recordsProcessed", recordsProcessed);
            break;
After running the job, there is no value stored inside the table. Why?
The job execution context is not saved after the job is finished but is saved in between every step execution. So you can use a StepExecutionListener and add these metrics in the afterStep method. More details on this here: https://docs.spring.io/spring-batch/4.0.x/reference/html/domain.html#executioncontext
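A sketch of that suggestion: values written to the step execution context in afterStep are persisted at the step boundary, unlike values written in afterJob (RecordCountListener is a hypothetical name):

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;

public class RecordCountListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // persisted with the step execution once the step finishes
        stepExecution.getExecutionContext().putLong("written", stepExecution.getWriteCount());
        stepExecution.getExecutionContext().putLong("skipped", stepExecution.getSkipCount());
        return stepExecution.getExitStatus();
    }
}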

How to dynamically schedule a Spring Batch job with ThreadPoolTaskScheduler

I have a Spring Batch application in which I want to schedule job calls.
The scheduling interval is not known at build time, so I can't just annotate my job with @Scheduled. This led me to use a ThreadPoolTaskScheduler.
The thing is, the schedule method takes a Runnable as a parameter. Is it possible to schedule jobs this way?
I can call the job directly from the following service, but I can't schedule it.
Here is the background of my problem; I tried to keep it simple:
@Service
public class ScheduledProcessor {

    private final ThreadPoolTaskScheduler threadPoolTaskScheduler;
    private Application application;
    private List<ScheduledFuture> scheduledTasks;

    @Autowired
    public ScheduledProcessor(ThreadPoolTaskScheduler threadPoolTaskScheduler, Application application) {
        this.threadPoolTaskScheduler = threadPoolTaskScheduler;
        this.application = application;
        scheduledTasks = new ArrayList();
        Trigger trigger = new CronTrigger("0/6 * * * * *");
        // Here I am trying to schedule my job.
        // The following line is wrong because a Job can't be cast to a Runnable,
        // but I wanted to show the intended behaviour.
        threadPoolTaskScheduler.schedule((Runnable) application.importUserjob, trigger);
        System.out.println("Job launch !");
    }
}
And here is the job definition, built with JobBuilderFactory:
@Bean
public Job importUserJob(JobBuilderFactory jobs, Step s1, Step s2) {
    return jobs.get("importUserJob")
            .incrementer(new RunIdIncrementer())
            .flow(s1)
            .end()
            .build();
}
I understand (well, I'm not even sure about that) that I can't directly cast a Job to a Runnable, but is it possible to convert it in some way? Or can you give me some advice about what to use to dynamically schedule Spring Batch jobs?
In case it changes anything, I also need to be able to restart / skip my steps, as I currently can with the ThreadPoolTaskScheduler.
Thank you in advance for any help or hint you could provide.
I finally got it working!
I created a class which implements Runnable (it also extends Thread for convenience, although strictly speaking that is redundant, since Runnable only has the single run() method to implement).
@Component
public class MyRunnableJob extends Thread implements Runnable {

    private Job job;
    private JobParameters jobParameters;
    private final JobOperator jobOperator;

    @Autowired
    public MyRunnableJob(JobOperator jobOperator) {
        this.jobOperator = jobOperator;
    }

    public void setJob(Job job) {
        this.job = job;
    }

    @Override
    public void run() {
        try {
            String dateParam = new Date().toString();
            this.jobParameters = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            System.out.println("jobName : " + job.getName() + " at " + dateParam);
            jobOperator.start(job.getName(), jobParameters.toString());
        } catch (NoSuchJobException | JobInstanceAlreadyExistsException | JobParametersInvalidException ex) {
            Logger.getLogger(MyRunnableJob.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
In my SchedulingProcessor class, I set the Job on the myRunnableJob instance and then pass it as a parameter to the schedule method.
public class SchedulingProcessor {

    // Autowired fields:
    private final JobLauncher jobLauncher;
    private final Job importUserJob;
    private final ThreadPoolTaskScheduler threadPoolTaskScheduler;
    private final MyRunnableJob myRunnableJob;

    // Other fields:
    private List<ScheduledFuture> scheduledTasks;

    @Autowired
    public SchedulingProcessor(JobLauncher jobLauncher, Job importUserJob, ThreadPoolTaskScheduler threadPoolTaskScheduler, MyRunnableJob myRunnableJob) throws Exception {
        this.jobLauncher = jobLauncher;
        this.importUserJob = importUserJob;
        this.threadPoolTaskScheduler = threadPoolTaskScheduler;
        this.myRunnableJob = myRunnableJob;
        Trigger trigger = new CronTrigger("0/6 * * * * *");
        myRunnableJob.setJob(this.importUserJob);
        scheduledTasks = new ArrayList();
        scheduledTasks.add(this.threadPoolTaskScheduler.schedule((Runnable) myRunnableJob, trigger));
    }
}
The scheduledTasks list is just there to keep control over the tasks I have scheduled.
This trick enabled me to dynamically schedule Spring Batch jobs (thanks to the ThreadPoolTaskScheduler) by encapsulating them in a class implementing Runnable. I hope it can help someone in the same situation as mine.
Here's another way to trigger them from your Spring context:
Job emailJob = (Job) applicationContext.getBean("xyzJob");
JobLauncher launcher = (JobLauncher) applicationContext.getBean("jobLauncher");
launcher.run(emailJob, new JobParameters());
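On Java 8+, the Runnable wrapper class can also be replaced by a lambda; a sketch assuming the jobLauncher, importUserJob, and threadPoolTaskScheduler beans shown earlier:

Trigger trigger = new CronTrigger("0/6 * * * * *");
threadPoolTaskScheduler.schedule(() -> {
    try {
        JobParameters params = new JobParametersBuilder()
                .addString("date", new Date().toString())
                .toJobParameters();
        jobLauncher.run(importUserJob, params);
    } catch (Exception e) {
        // log the failure; the trigger will still fire again on the next cron match
    }
}, trigger);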
