Spring Cloud Data Flow in OpenShift - how to get the execution ID

Currently I am running Spring Cloud Data Flow in an OpenShift environment, and I can trigger OpenShift pods through it. I use an Oracle database for both Data Flow and the application that Data Flow launches.
Whenever a pod is triggered, Data Flow generates an execution ID, which is visible in the pod's JSON as an argument to the pod. How can I get it as an input parameter for my Spring Batch job?
On Spring Data Flow I am running Spring Batch jobs, so maybe a better way is to somehow set the job execution ID and pass it as an input parameter?
Since the execution ID cannot be fetched from the pod arguments, Spring Batch generates a new execution ID and updates the status on that ID. In the Data Flow UI, the ID that SCDF triggered shows status NA, while the newly generated execution ID carries the actual success or failure status.
How can I get the execution ID from the pods, or how can I stop SCDF from generating an execution ID and let my Spring Batch job generate it instead?
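A minimal sketch of one way to pick the ID up, assuming the task is a Spring Boot application and that SCDF passes the standard --spring.cloud.task.executionid=<id> argument on launch (the runner class and parameter name here are illustrative):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.stereotype.Component;

// Illustrative runner: reads the task execution ID that SCDF passes on the
// command line and forwards it to the batch job as a job parameter.
@Component
public class TaskIdAwareJobRunner implements ApplicationRunner {

    private final JobLauncher jobLauncher;
    private final Job job;

    // Spring Boot exposes --spring.cloud.task.executionid=<id> as a property.
    @Value("${spring.cloud.task.executionid:unknown}")
    private String taskExecutionId;

    public TaskIdAwareJobRunner(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("taskExecutionId", taskExecutionId)
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}

If you launch the job yourself like this, disable Boot's automatic launch with spring.batch.job.enabled=false so the job does not run twice. Note also that the "NA" status symptom is usually fixed without any of this: with spring-cloud-task-batch on the classpath and the Oracle datasource shared between SCDF and the task, Spring Cloud Task links the batch job execution to the task execution automatically.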

Related

Asynchronous Kafka consumer in Spring Batch Application

In our Spring Batch application, the workers' item processors interact with another service asynchronously through Kafka. We require an acknowledgement in order to retry failed batches, but the constraint is that we must not block waiting for that acknowledgement.
Is there any mechanism in Spring Batch by which we can consume Kafka asynchronously?
Is it possible to rerun a specific local worker step when the job is rerun?
We implement producers and consumers over the same step using a Spring Batch decider: on the first run the step only produces to Kafka, and on the second run it consumes from Kafka.
We are looking for a solution that lets us consume Kafka asynchronously in a Spring Batch application so that we can rerun a specific worker step.
According to your diagram, you are making that call from an item processor. The closest feature Spring Batch offers is the AsyncItemProcessor: a special processor that processes each item asynchronously on a separate thread and returns a Future per item, which an AsyncItemWriter then unwraps with the result of the call before writing.
Other than that, I do not see any other obvious way to do this with a built-in Spring Batch feature, so you would have to manage it in a custom ItemProcessor.
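A minimal configuration sketch of that processor/writer pair (the step wiring and Kafka call are omitted; Order and OrderResult are hypothetical types standing in for your items):

import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

public class AsyncStepConfig {

    // Hypothetical domain types standing in for your actual items.
    record Order(String id) {}
    record OrderResult(String orderId, boolean acked) {}

    // Delegate processor: the (blocking) Kafka send/ack exchange would live here.
    ItemProcessor<Order, OrderResult> kafkaCallingProcessor() {
        return order -> new OrderResult(order.id(), true);
    }

    // Wraps the delegate so each item is processed on a separate thread;
    // the step then sees a Future<OrderResult> per item.
    AsyncItemProcessor<Order, OrderResult> asyncProcessor() {
        AsyncItemProcessor<Order, OrderResult> processor = new AsyncItemProcessor<>();
        processor.setDelegate(kafkaCallingProcessor());
        processor.setTaskExecutor(new SimpleAsyncTaskExecutor());
        return processor;
    }

    // Unwraps the Futures (waiting for completion) before delegating the write.
    AsyncItemWriter<OrderResult> asyncWriter(ItemWriter<OrderResult> delegate) {
        AsyncItemWriter<OrderResult> writer = new AsyncItemWriter<>();
        writer.setDelegate(delegate);
        return writer;
    }
}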

Trigger Spring Integration from a Spring Batch job

I'm looking for a way to trigger a Spring Integration flow that downloads files from within a Spring Batch job. So far I have only seen examples of triggering a batch job after the integration flow has received the files.
Basically, one step of my batch job should fetch a file from a remote server using Spring Integration, and the next step should process it / upload it to some server.
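One common shape for this is a Tasklet step that calls into the integration flow through a messaging gateway. A minimal sketch, where the gateway interface, channel name, and remote path are all hypothetical (the flow behind the gateway, e.g. an SFTP outbound gateway doing the actual download, would be defined elsewhere):

import java.io.File;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.integration.annotation.MessagingGateway;

// Hypothetical gateway fronting the download flow; the channel name is illustrative.
@MessagingGateway(defaultRequestChannel = "fileDownloadChannel")
interface FileDownloadGateway {
    File fetch(String remotePath);
}

// Batch step that delegates the download to the integration flow.
class FetchFileTasklet implements Tasklet {

    private final FileDownloadGateway gateway;

    FetchFileTasklet(FileDownloadGateway gateway) {
        this.gateway = gateway;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        File downloaded = gateway.fetch("/remote/inbound/data.csv");
        // Stash the local path so the next step (process/upload) can pick it up.
        chunkContext.getStepContext().getStepExecution().getJobExecution()
                .getExecutionContext().putString("inputFile", downloaded.getAbsolutePath());
        return RepeatStatus.FINISHED;
    }
}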

How to stop jobs from Spring Cloud Data Flow immediately

I have used Spring Cloud Data Flow to control some batch jobs. In SCDF, after I defined some tasks, they were launched as jobs with a running status. When I tried to stop a particular job, it did not stop immediately; the job kept running until it finished its current step.
For example, my job 'ABC' has two steps, A and B. If I stop job 'ABC' in SCDF while step A is executing, the job keeps running until step A completes, and then it does not execute step B.
So, is there any way to stop a job immediately from Spring Cloud Data Flow?
From Spring Cloud Data Flow, the batch job stop operation is delegated to the Spring Batch API. This means Spring Cloud Data Flow offers nothing to stop a batch job immediately; that has to be handled by Spring Batch or by the job implementation itself.
When a stop request is sent for a running batch job execution, the current step execution has its terminateOnly flag set to true. That flags the step execution as ready to stop, but when it actually stops depends on the underlying step implementation: chunk-oriented steps check the flag between chunks, while a long-running custom tasklet has to check it itself.
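So for a chunk-oriented step the stop takes effect at the next chunk boundary, but a long-running tasklet must cooperate if you want it to react sooner. A minimal sketch, with the loop and unit of work purely illustrative:

import org.springframework.batch.core.JobInterruptedException;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

// Long-running tasklet that cooperates with stop requests by checking the
// terminateOnly flag between units of work.
public class CooperativeTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
            throws Exception {
        for (int i = 0; i < 1000; i++) {
            if (chunkContext.getStepContext().getStepExecution().isTerminateOnly()) {
                // A stop was requested (e.g. from SCDF): exit now rather than
                // finishing the remaining units of work.
                throw new JobInterruptedException("Stop requested");
            }
            doUnitOfWork(i);
        }
        return RepeatStatus.FINISHED;
    }

    private void doUnitOfWork(int i) { /* hypothetical unit of work */ }
}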

Configuring Spring Batch tasks in Spring Cloud Data Flow

I have created a project with two REST APIs that launch different jobs. My project is connected to a MySQL database. I would like to monitor both jobs in Spring Cloud Data Flow. How do I configure SCDF against MySQL so that both jobs are monitored? Additionally, if we launch a job by calling the API directly, will SCDF monitor those job instances? If not, please let me know how we can do that.
Thanks in advance.
Please take a moment to read the Spring Batch Admin to SCDF migration guide. It is a requirement that the jobs are wrapped with the Spring Cloud Task programming model.
Once you have the batch job wrapped as a task, you can register it in SCDF and build task/batch pipelines using SCDF's DSL or the GUI.
As for the datasource, all you need to ensure is that the same datasource is shared between SCDF and the batch jobs. With this, SCDF's dashboard will automatically list the jobs and their execution details.
Here are a few examples for your reference.
"Additionally, if we launch a job by calling the API directly, will SCDF monitor those job instances?"
Assuming you're referring to SCDF's task launch API (e.g., a scheduled trigger or some other means): yes, the job executions will be captured in the database, as long as SCDF and the batch jobs share a common datasource, as explained above.
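A minimal sketch of the task wrapping, assuming a Spring Boot application (the batch job beans themselves are defined elsewhere; @EnableTask comes from Spring Cloud Task):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// Wraps the batch application as a Spring Cloud Task so SCDF can launch it
// and record its executions alongside the batch job executions.
@EnableTask
@SpringBootApplication
public class BatchTaskApplication {
    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}

Point both SCDF and this application at the same MySQL schema through the usual spring.datasource.url/username/password properties, and the dashboard will pick up the job executions.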

Accessing remote Spring Batch jobs from Spring Batch Admin

I am new to Spring Batch. I want to run Spring Batch jobs on server A and launch those jobs from server B using Spring Batch Admin. Is that possible? I have looked into the following two approaches:
1. JMX: I could convert the Spring Batch beans into MBeans, but I can't read them from Spring Batch Admin. Can you tell me how to read MBeans from Spring Batch Admin and launch them?
2. Common repository: I think that if I use the same database repository for both Spring Batch and Spring Batch Admin, then I can launch remote jobs from Spring Batch Admin (from server B). But in the job XML file in Spring Batch Admin, what should the classpath for the tasklet be?
Can you help with the above, or tell me if another way exists?
We ended up implementing a framework that uses MQ communication to handle this. Each 'batch node' registers itself along with any 'batch class' parameters such as 'nodeType=A' or 'jobSizeICanHandle=BIG' (these are fictitious, but you get the point). The client console reads this information and queries the nodes over MQ for the job list. It then submits job requests, with parameters, via a rudimentary text-based protocol (property-file format):
command=START_JOB
job=JobABC
param1=x
param2=y
One of the batch nodes picks up the message and starts the job, then returns success/fail status the same way, in a message with the same correlation ID, so the client can show the response to the user.
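A rough sketch of the receiving side of such a protocol, reconstructed from the description above (the queue name and property keys are illustrative; this assumes Spring JMS plus Spring Batch's JobLauncher and JobLocator):

import java.io.StringReader;
import java.util.Properties;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobLocator;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

// Batch-node side: consumes the text-based command messages and starts jobs.
@Component
public class BatchCommandListener {

    private final JobLauncher jobLauncher;
    private final JobLocator jobLocator;

    public BatchCommandListener(JobLauncher jobLauncher, JobLocator jobLocator) {
        this.jobLauncher = jobLauncher;
        this.jobLocator = jobLocator;
    }

    @JmsListener(destination = "batch.commands") // illustrative queue name
    public void onCommand(String body) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(body)); // property-file format, as above

        if ("START_JOB".equals(props.getProperty("command"))) {
            JobParametersBuilder params = new JobParametersBuilder();
            props.stringPropertyNames().stream()
                 .filter(key -> key.startsWith("param"))
                 .forEach(key -> params.addString(key, props.getProperty(key)));
            jobLauncher.run(jobLocator.getJob(props.getProperty("job")),
                            params.toJobParameters());
            // Success/failure would be reported back on a reply queue with the
            // same correlation ID, as described above.
        }
    }
}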
This allows us to do what you're talking about AND to kick off the jobs via an external scheduler (Control-M). The 'nodeType=A' mentioned above lets us query individual nodes (a node listens where 'nodeType=A or nodeType=*'), so commands can be targeted at specific nodes when necessary.
Keep in mind, this is our own console, not the Spring Batch Admin console. So perhaps that doesn't help you, but building a simple console doesn't take that long using the Spring Batch APIs (4 or 5 ASPs).
The batch nodes could also have exposed simple services such as HTTP REST endpoints or whatever, but we use MQ heavily and I liked the idea of not having to preregister nodes (the framework code doesn't know or care that it's in an HTTP container, so it couldn't easily register an endpoint). With MQ, the channel is preconfigured and all the apps just use it, so it seemed easier.
Good luck.
I am trying to do the same thing, but it seems that in order to launch a job directly from Spring Batch Admin, all of the job's resources have to be added to the Spring Batch Admin web app. Maybe try RESTful job submission with Spring MVC.
@chau One way to use Spring Batch Admin as-is, but "discover" and "invoke" remote jobs, is to provide your own implementations of org.springframework.batch.admin.service.JobService and org.springframework.batch.core.launch.JobOperator that can query and invoke jobs from a remote job registry/repository.
You can find a custom JobService implementation and a JMX-enabled job administrator in https://github.com/regunathb/Trooper/tree/master/batch-core as org.trpr.platform.batch.impl.spring.admin.SimpleJobService and org.trpr.platform.batch.impl.spring.jmx.JobAdministrator.
The Spring beans XML that uses these beans is here: https://github.com/regunathb/Trooper/blob/master/batch-core/src/main/resources/packaged/common-batch-config.xml
