Launch spring cloud task with JVM args via DeployerPartitionHandler - spring-boot

I am planning to execute a Spring Batch job on Pivotal Cloud Foundry. The job runs fine on a single JVM with multiple threads (local partitioning). I am looking to scale the job, and the first option I considered is running the worker processes as Spring Cloud Tasks.
Before running it in PCF, I am running it locally. I created a partition handler in the following manner.
@Bean
public PartitionHandler partitionHandler(TaskLauncher taskLauncher, JobExplorer jobExplorer, TaskRepository taskRepository) throws Exception {
    Resource resource = this.resourceLoader
            .getResource("file:I:/Project/target/batch.jar");
    DeployerPartitionHandler partitionHandler =
            new DeployerPartitionHandler(taskLauncher, jobExplorer, resource, "workerStep", taskRepository);
    List<String> commandLineArgs = new ArrayList<>(5);
    commandLineArgs.add("--spring.profiles.active=worker");
    commandLineArgs.add("--spring.cloud.task.initialize-enabled=false");
    commandLineArgs.add("--spring.batch.initializer.enabled=false");
    commandLineArgs.add("--java.security.krb5.conf=I:/krb5.conf");
    commandLineArgs.add("--java.security.auth.login.config=I:/jaas.conf");
    partitionHandler
            .setCommandLineArgsProvider(new PassThroughCommandLineArgsProvider(commandLineArgs));
    partitionHandler
            .setEnvironmentVariablesProvider(new SimpleEnvironmentVariablesProvider(this.environment));
    partitionHandler.setMaxWorkers(2);
    partitionHandler.setApplicationName("PartitionedBatchJobTask");
    return partitionHandler;
}
@Bean
@Profile("worker")
public DeployerStepExecutionHandler stepExecutionHandler(JobExplorer jobExplorer) {
    return new DeployerStepExecutionHandler(this.context, jobExplorer, this.jobRepository);
}
The task is started, but as mentioned above, the jar has to run with certain JVM args, and these args are not being passed properly when the app is launched as a task. (I am connecting to a DB using Kerberos, so I need to send these properties as JVM args.)
I see the command below being executed for the task:
Command to be executed: I:/java.exe -jar I:/Project/target/batch.jar --spring.profiles.active=worker --spring.cloud.task.initialize-enabled=false --spring.batch.initializer.enabled=false --java.security.krb5.conf=I:/krb5.conf --java.security.auth.login.config=I:/jaas.conf --jdk.tls.client.protocols=TLSv1.2 --spring.cloud.task.job-execution-id=1 --spring.cloud.task.step-execution-id=3 --spring.cloud.task.step-name=workerStep --spring.cloud.task.name=application-1_migrateProfileJob_migrateProfileFollowerStep:partition0 --spring.cloud.task.parentExecutionId=29 --spring.cloud.task.executionid=30
Since the JVM args are sent after the jar in the command, the app is not able to recognize the properties. Can anyone please let me know what I am doing wrong?
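Worth noting: java.security.krb5.conf and java.security.auth.login.config are JVM system properties, so they only take effect when passed before -jar as -D flags; anything after the jar is just an application argument. A possible workaround is to hand them to the deployer as deployment properties rather than command-line args. A minimal sketch, assuming the local deployer's javaOpts deployment property and a setDeploymentProperties(Map) method on DeployerPartitionHandler (both worth verifying against your deployer/task versions):
// Hypothetical sketch, inside the partitionHandler bean above:
// route the Kerberos settings through deployment properties so the
// local deployer renders them as -D JVM flags before -jar.
Map<String, String> deploymentProperties = new HashMap<>();
deploymentProperties.put("spring.cloud.deployer.local.javaOpts", // assumed local-deployer property key
        "-Djava.security.krb5.conf=I:/krb5.conf -Djava.security.auth.login.config=I:/jaas.conf");
partitionHandler.setDeploymentProperties(deploymentProperties); // assumed setter; check your version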

Related

How to get previous task local variables of a running process instance in activiti

I am developing a spring boot application with activiti as the workflow engine. The activiti-spring-boot-starter dependency version is 7.1.0.M6 and spring-boot-starter-parent version is 2.6.7.
I have defined a BPMN 2.0 diagram using activiti-modelling-app and I am now starting the process instance. After completing a task, I want to access its task local variables when processing the next task. I am unable to figure out the api for it.
I tried using the historyService as below, but with no luck. I get an empty result list every time with different APIs (finished(), unfinished(), etc.).
HistoricTaskInstance acceptMobile = historyService.createHistoricTaskInstanceQuery()
        .processInstanceId(processInstanceId)
        .taskName("my-task1")
        .singleResult();
Can someone guide me on the right API to use to get the local variables of a previously completed task?
Thanks.
The best way to transfer variables between tasks is to use execution variables via DelegateExecution.
Execution variables are attached to the execution, i.e. the specific pointer to where the process is active; for more information, see apiVariables.
Let's say you have Task-A and Task-B with different listeners.
Here's how to pass an execution variable from Task-A to Task-B:
#Component("TaskListenerA")
public class TaskListenerA implements TaskListener {
#Override
public void notify(DelegateTask task) {
DelegateExecution execution = task.getExecution();
if("complete".equals(task.getEventName()) {
String myTaskVar = (String) task.getVariable("taskAvariable")
execution.setVariable("exeVariable", myTaskVar);
}
}
}
#Component("TaskListenerB")
public class TaskListenerB implements TaskListener {
#Override
public void notify(DelegateTask task) {
DelegateExecution execution = task.getExecution();
String myVariable = execution.get("exeVariable");
}
}

Spring Batch - Running a particular job in an application with multiple jobs

I have just joined a new project, which is all about migrating COBOL jobs to Spring Batch.
Before I joined, my colleague started on this migration and created the Spring Batch application with one job. My colleague was/is able to run that job from the command line just by running the application's jar file.
After I joined, I followed the first job as a template and developed the second job. But my job does not run from the command line via the application's jar file the way the first job does.
I tried to explicitly launch my job (the 2nd job) from a CommandLineRunner's run method and it worked. (The 1st job didn't have to do this.)
Once the jobs are developed and tested, each job will be run from the Tivoli Workload Scheduler, scheduled at different times.
The command that I used from command line:
java -Dspring.profiles.active=dev -jar target/SPRINGBATCHJOBS-0.0.2-SNAPSHOT.jar job.name=JOB2 dateCode=$$$$
Package structure
App
  src
    main
      java
        com.test
          BatchApplication.java
          common
            dao
              ...
            entity
              ...
            service
              ...
          job1
            config
              ...
            core
              ...
            dao
              ...
            entity
              ...
          job2
            config
              ...
            core
              ...
            dao
              ...
            entity
              ...
Code snippet (at a very high level):
@SpringBootApplication
public class BatchApplication {
    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
@Configuration
public class Job2Config {

    @Bean
    public Job JOB2() {
        return jobBuilderFactory.get("JOB2")
                .preventRestart()
                .incrementer(batchJobIncrementer)
                .listener(jobExecutionListener())
                .flow(job2Step())
                .end()
                .build();
    }

    @Bean
    public Step job2Step() {
        return stepBuilderFactory.get("job2Step")
                .<String, String>chunk(Integer.parseInt(chunkSize))
                .reader(job2ItemReader())
                .writer(this.job2ItemWriter)
                .faultTolerant()
                .skipLimit(Integer.parseInt(this.skipLimit))
                .skipPolicy(skipPolicy())
                .retryPolicy(new NeverRetryPolicy())
                .listener(chunkListener())
                .allowStartIfComplete(true)
                .build();
    }

    @Bean
    public JdbcCursorItemReader<String> job2ItemReader() {
        return new JdbcCursorItemReaderBuilder<String>()
                .dataSource(oracleDataSource)
                .name("job2ItemReader")
                .sql(this.sqlQuery)
                .fetchSize(Integer.parseInt(this.fetchSize))
                .rowMapper((resultSet, rowNum) -> resultSet.getString(1))
                .build();
    }
}
@Component
public class Job2CommandLineRunner implements CommandLineRunner {

    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private Job JOB2;

    @Override
    public void run(String... args) throws Exception {
        // "params" stands for the parsed command-line arguments
        // (parsing omitted in this high-level snippet)
        String jobName = params.get("job.name");
        if (jobName != null && jobName.equalsIgnoreCase("JOB2")) {
            JobParameters jobParameters =
                    new JobParametersBuilder()
                            .addLong("time", System.currentTimeMillis())
                            .addString("dateCode", params.get("dateCode"))
                            .toJobParameters();
            this.jobLauncher.run(JOB2, jobParameters);
        }
    }
}
I have a couple of questions:
1. The 2nd job simply followed the 1st job's implementation, but I'm not sure why the 2nd job doesn't get invoked from the command line like the 1st one does. To invoke it, I had to launch it explicitly from the CommandLineRunner.
2. When there are multiple jobs in a Spring Batch application, is invoking a particular job explicitly through a CommandLineRunner the right way of doing it? If not, what is the right way of invoking a particular job (in an application with multiple jobs) from the command line? (Eventually the jobs need to be called through the Tivoli Workload Scheduler.)
It would be great if someone could help me clarify my questions.
Thanks in advance.
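For what it's worth, Spring Boot's batch auto-configuration launches every Job bean at startup unless told otherwise; the spring.batch.job.names property restricts which jobs the built-in runner executes, so one job can be selected per invocation. A hedged sketch of the command (assuming Spring Boot 2.x; in Boot 3 the property is spring.batch.job.name):
java -Dspring.profiles.active=dev -jar target/SPRINGBATCHJOBS-0.0.2-SNAPSHOT.jar --spring.batch.job.names=JOB2 dateCode=$$$$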

Is there a way to unregister Spring Cron Task from ScheduledTaskRegistrar?

We have written a simple application in Spring Boot that triggers a cron job. We are able to launch it successfully. Below is a piece of the code.
CronTask task = new CronTask(new Runnable() {
    @Override
    public void run() {
        System.out.println("Job running ...");
    }
}, cronExpression);
taskRegistrar.addCronTask(task);
taskRegistrar.afterPropertiesSet();
Now, how do I unregister/kill/destroy/remove the task I started?
I can get the cron tasks back from the registrar using
this.taskRegistrar.getCronTaskList();
but I don't see a method in the registrar to unregister a task, nor any method on the task to destroy it.
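A sketch of one possible approach, assuming Spring 4.3+, where ScheduledTaskRegistrar.scheduleCronTask(...) returns a ScheduledTask handle (verify against your Spring version):
// Schedule via scheduleCronTask(...) instead of addCronTask(...) and keep the handle.
// Note: the returned handle can be null if the registrar's scheduler has not been initialized yet.
ScheduledTask scheduledTask = taskRegistrar.scheduleCronTask(task);

// Later, to unregister the task:
if (scheduledTask != null) {
    scheduledTask.cancel(); // cancels the scheduled task (may interrupt it if currently running)
}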

Activiti Escalation Listener Configuration

I am using Activiti 5.18.
Behind the scenes: there are a few tasks which get routed through a workflow. Some of these tasks are eligible for escalation. I have written my escalation listener as follows.
@Component
public class EscalationTimerListener implements ExecutionListener {

    @Autowired
    ExceptionWorkflowService exceptionWorkflowService;

    @Override
    public void notify(DelegateExecution execution) throws Exception {
        // Process the escalated tasks here
        this.exceptionWorkflowService.escalateWorkflowTask(execution);
    }
}
Now when I start my Tomcat server, the Activiti framework internally calls the listener even before my entire Spring context is loaded. Hence exceptionWorkflowService is null (since Spring hasn't injected it yet) and my code breaks.
Note: this scenario only occurs if my server isn't running at the escalation time of the tasks and I start/restart the server after that time. If the server is already running at escalation time, the process runs smoothly, because by then the service has been injected and the listener triggers later.
I have tried delaying the Activiti configuration using the @DependsOn annotation so that it loads after ExceptionWorkflowService is initialized, as below.
@Bean
@DependsOn({ "dataSource", "transactionManager", "exceptionWorkflowService" })
public SpringProcessEngineConfiguration getConfiguration() {
    final SpringProcessEngineConfiguration config = new SpringProcessEngineConfiguration();
    config.setAsyncExecutorActivate(true);
    config.setJobExecutorActivate(true);
    config.setDataSource(this.dataSource);
    config.setTransactionManager(this.transactionManager);
    config.setDatabaseSchemaUpdate(this.schemaUpdate);
    config.setHistory(this.history);
    config.setTransactionsExternallyManaged(this.transactionsExternallyManaged);
    config.setDatabaseType(this.dbType);
    // Async job executor
    final DefaultAsyncJobExecutor asyncExecutor = new DefaultAsyncJobExecutor();
    asyncExecutor.setCorePoolSize(2);
    asyncExecutor.setMaxPoolSize(50);
    asyncExecutor.setQueueSize(100);
    config.setAsyncExecutor(asyncExecutor);
    return config;
}
But this gives a circular reference error.
I have also tried registering the bean with the SpringProcessEngineConfiguration as below.
Map<Object, Object> beanObjectMap = new HashMap<>();
beanObjectMap.put("exceptionWorkflowService", new ExceptionWorkflowServiceImpl());
config.setBeans(beanObjectMap);
and then accessing it in my listener as:
Map<Object, Object> registeredBeans = Context.getProcessEngineConfiguration().getBeans();
ExceptionWorkflowService exceptionWorkflowService = (ExceptionWorkflowService) registeredBeans.get("exceptionWorkflowService");
exceptionWorkflowService.escalateWorkflowTask(execution);
This works, but my repository is autowired into my service, and the repository hasn't been initialized yet! So it again throws an error in the service layer :)
So is there a way that I can trigger escalation listeners only after my entire spring context is loaded?
Have you tried binding the class to ApplicationListener?
Not sure if it will work, but equally I'm not sure why your listener code is actually being executed on startup.
Try setting the implementation type of the listeners using a Java class or delegate expression, and then have the class implement JavaDelegate instead of ExecutionListener; a sketch of the delegate-expression wiring follows.
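A hedged sketch of the BPMN wiring for the delegate-expression route (the bean name escalationTimerListener is assumed to match the @Component above; check the exact element and attribute names against your Activiti 5.18 schema). Because the expression is resolved from the Spring context each time the event fires, the listener arrives fully injected:
<extensionElements>
  <!-- Resolved from the Spring context when the event fires, so dependencies are injected -->
  <activiti:executionListener event="start" delegateExpression="${escalationTimerListener}" />
</extensionElements>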

Rerunning a completed Job from Spring Framework?

I am developing a web application using the Spring Framework that does some jobs as the application starts up; these jobs primarily consist of loading data from CSVs and making Java objects out of them.
Currently, I am trying to build a RESTful API using Restlet and Spring, and one of the queries is supposed to take a job name as a parameter and restart that job even if it has been marked as COMPLETED. How do I accomplish a job restart? I have tried the JobOperator interface's startNextInstance() method and have also tried to manually increment the JobParameters so that there is no "job instance already running" exception.
Does anyone have a code snippet or an alternative idea on how to restart a job in Spring that has been marked as COMPLETED?
Any help would be greatly appreciated, thanks!
Because of the terms you're using, I'm quite certain you're using Spring Batch.
In Batch terms you cannot actually restart a COMPLETED instance or execution. A single job instance is identified by its job parameters. If you need to run the job again with the same parameters, one way is to include some unique parameter, for example the current timestamp, in the JobParameters before launching.
So "restarting" a completed job means starting a new instance of the job with similar parameters. Here's a slightly modified snippet I've used before that uses JobLauncher and JobRegistry to launch a new job by name:
@Autowired
@Qualifier("asyncJobLauncher")
private JobLauncher asyncJobLauncher;

@Autowired
private JobRegistry jobRegistry;

...

public JobExecution startJob(String jobName) {
    Job job;
    try {
        job = jobRegistry.getJob(jobName);
    } catch (NoSuchJobException e) {
        // handle invalid job name
        return null;
    }
    JobParametersBuilder jobParams = new JobParametersBuilder();
    jobParams.addLong("currentTime", System.currentTimeMillis());
    // add other relevant parameters
    try {
        return asyncJobLauncher.run(job, jobParams.toJobParameters());
    } catch (JobExecutionAlreadyRunningException e) {
        // handle instance already running with the given params
        return null;
    } catch (Exception e) {
        // handle other errors
        return null;
    }
}
Hope it helps, here's some reading about the subject.
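Since the question mentions startNextInstance(): that path only works when the job has a JobParametersIncrementer registered. A minimal sketch, assuming the standard RunIdIncrementer and placeholder bean names (myJob, step):
@Bean
public Job myJob(JobBuilderFactory jobs, Step step) { // "myJob" and "step" are placeholder names
    return jobs.get("myJob")
            .incrementer(new RunIdIncrementer()) // bumps the run.id parameter so each launch is a new instance
            .start(step)
            .build();
}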
