Passing arguments to a task in Spring Cloud Dataflow

I need to pass parameters to a Spring Cloud Data Flow task at startup. I found out how to do this in the Spring Data Flow Shell (e.g.: task create my-composed-task --definition "mytaskapp --displayMessage=hello"), but I don't know how to refer to these parameters in Java code. Can anyone guide me?

The simplest example can be found in the Spring Cloud Task timestamp sample.
There, the application prints the current timestamp when it starts; however, the format of the timestamp can also be overridden through the setFormat(String format) method.
For instance, you can create and launch the application as:
task create myTaskDefinition --definition "timestamp --format='yyyy'"
task launch myTaskDefinition
When the launch is successful, instead of the default yyyy-MM-dd HH:mm:ss.SSS, you will see the resulting output in yyyy format.
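Inside the sample, the format value reaches the Java code through a configuration properties class bound to the launch arguments. A minimal sketch of that pattern (class and bean names here are illustrative, not necessarily those of the actual sample):

import java.text.SimpleDateFormat;
import java.util.Date;

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
@EnableConfigurationProperties(TimestampTaskApplication.TimestampProperties.class)
public class TimestampTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(TimestampTaskApplication.class, args);
    }

    @Bean
    public CommandLineRunner printTimestamp(TimestampProperties props) {
        // Prints the current time using whatever format was supplied at launch.
        return args -> System.out.println(
                new SimpleDateFormat(props.getFormat()).format(new Date()));
    }

    // Binds the "timestamp.format" property, which --format='yyyy' in the task definition sets.
    @ConfigurationProperties("timestamp")
    public static class TimestampProperties {

        // Default pattern used when no format is supplied.
        private String format = "yyyy-MM-dd HH:mm:ss.SSS";

        public String getFormat() {
            return format;
        }

        public void setFormat(String format) {
            this.format = format;
        }
    }
}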

Related

Spring Batch with unknown datasource

I have a working Spring Boot application which embeds a Spring Batch job. The job is not run on a schedule; instead we kick it off with an endpoint. It is working as it should. The basics of the batch are:
Kick the endpoint to start the job
Reader reads from input file
Processor reads from oracle database using jpa repository and simple spring datasource config
Writer writes to output file
However, there are new requirements:
The schema of the repository database is from now on unknown at application startup. The tables are the same; it is just an unknown schema. This fact is out of our control and you might think it is stupid, but there are reasons for it and this can't be changed. With the current functionality this means we need to reconfigure the datasource when we learn the new schema name and restart the application. This job will be run a number of times while migrating from one system to another, so it has a limited life cycle and we just need a "quick fix" to be able to use it without rewriting the whole app. So what I would like to do is:
Send the schema name as a query param to the application, put it in the job parameters, and then get a new datasource when the processor reads from the repository. Would this be doable at all using Spring Batch? Any help appreciated!
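This is broadly doable with Spring Batch's late binding of job parameters. Below is a minimal sketch of the idea, with hypothetical bean, property and endpoint names (app.oracle.*, /migrate, "schema"); note that an existing JPA repository stays bound to the datasource it was started with, so the processor would have to read through the step-scoped datasource (for example via a JdbcTemplate) rather than the original repository:

import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@Configuration
public class SchemaAwareBatchConfig {

    // Recreated for every step execution, so the "schema" job parameter chosen at
    // launch time decides which schema unqualified table names resolve against.
    @Bean
    @StepScope
    public DataSource repositoryDataSource(
            @Value("#{jobParameters['schema']}") String schema,
            @Value("${app.oracle.url}") String url,
            @Value("${app.oracle.username}") String username,
            @Value("${app.oracle.password}") String password) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource(url, username, password);
        dataSource.setDriverClassName("oracle.jdbc.OracleDriver");
        dataSource.setSchema(schema); // sets the default schema on each connection
        return dataSource;
    }
}

// Kick-off endpoint: turns the ?schema=... query parameter into a job parameter.
@RestController
class MigrationController {

    private final JobLauncher jobLauncher;
    private final Job migrationJob;

    MigrationController(JobLauncher jobLauncher, Job migrationJob) {
        this.jobLauncher = jobLauncher;
        this.migrationJob = migrationJob;
    }

    @PostMapping("/migrate")
    public void migrate(@RequestParam String schema) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("schema", schema)
                .addLong("startedAt", System.currentTimeMillis()) // keeps each run unique
                .toJobParameters();
        jobLauncher.run(migrationJob, params);
    }
}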

Spring Cloud Dataflow - handling argument in Task

I would like to pass the ID of an object into a variable when starting a Task in Spring Cloud Dataflow. I know that it can be done with arguments or parameters, but I don't know how to handle these arguments or parameters in Java code so I can pick up this value. Could you please indicate how this could be done?
In the context of Spring Cloud Data Flow, you can pass arguments or properties to your task application.
The arguments you pass to a Spring Cloud Task application are the command line arguments of the task application itself, so you need to qualify them exactly as your application expects them on the command line.
The properties you pass to a Spring Cloud Task application are either application configuration properties or task deployer properties, and they have to use the prefix app, deployer or scheduler.
For instance, for the out-of-the-box timestamp task application, you can see how arguments and properties can be used in the following example:
Register out-of-the-box task applications
Create timestamp task:
dataflow:>task create a1 --definition "timestamp"
Launch the task with arguments and properties
dataflow:>task launch a1 --arguments "--spring.main.banner-mode=off" --properties "app.timestamp.format=YYYY/DD/MM"
In the above case, the command line argument --spring.main.banner-mode=off is passed through to the timestamp application, while the timestamp application's format property is set through the application property app.timestamp.format.
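Inside the task application's own Java code, such command line arguments can be read like any other Spring Boot arguments, for example through ApplicationArguments. A minimal sketch (class and bean names are illustrative):

import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class ArgumentAwareTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(ArgumentAwareTaskApplication.class, args);
    }

    @Bean
    public CommandLineRunner logArguments(ApplicationArguments args) {
        return ignored -> {
            // Each "--name=value" passed through --arguments shows up as an option here.
            args.getOptionNames().forEach(name ->
                    System.out.println(name + " = " + args.getOptionValues(name)));
            // Values without a leading "--" are available as non-option arguments.
            System.out.println("non-option args: " + args.getNonOptionArgs());
        };
    }
}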

spring boot batch to spring cloud task with multiple jobs

I have a Spring Boot batch application that has 5 unique jobs which I execute from the console using the command:
java -jar artifactName jobName param1
but now this project will be moved to the cloud, so I need to use Spring Cloud Task. So far so good.
I know that I have to add @EnableTask to the main class and also define the property in application.properties:
spring.application.name=cloudTask
So, reading the Spring documentation, I understand that to trigger my jobs using the Spring Cloud Data Flow server I can define a task, which in this case I should register as cloudTask. But that does not make sense to me, because how will I trigger it? My application has 5 different jobs, so the question is:
How do I connect this task name with the jobs defined in the application?
Logic tells me that I would also need to define 5 task names, but then how do I bind each task name to its respective job?
With the @EnableTask annotation, you should be able to register your batch application as a Task application in SCDF, under 'Apps'.
Once your batch application appears under Apps:
If all 5 jobs are independent, you should be able to create 5 different composed tasks with the same app name but with different parameters,
OR
If the jobs are interlinked, the linked jobs can be combined into one composed task by providing aliases and passing the corresponding set of parameters in the DSL.
Once the composed task is launched, the task execution status can be viewed under 'Tasks -> Executions' and the corresponding job status under 'Jobs'.
To pass custom parameters to tasks, @EnableConfigurationProperties / @ConfigurationProperties can be leveraged.
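If you prefer to keep one task application that contains all 5 jobs and choose the job per launch, one possible sketch (it assumes Spring Boot 2.x / Spring Batch 4, where the spring.batch.job.names property selects which jobs Boot runs at startup; class, job and step names are illustrative):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class MultiJobTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(MultiJobTaskApplication.class, args);
    }

    // One Job bean per existing batch job; only the jobs named in
    // spring.batch.job.names are executed when the task starts.
    @Bean
    public Job jobOne(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("jobOne")
                .start(steps.get("stepOne")
                        .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                        .build())
                .build();
    }

    @Bean
    public Job jobTwo(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("jobTwo")
                .start(steps.get("stepTwo")
                        .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                        .build())
                .build();
    }
}

With that in place, a launch such as task launch cloudTask --arguments "--spring.batch.job.names=jobOne" would run only jobOne of the registered task application.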

excel file as job parameter in spring batch

I have a requirement to use Spring Batch to bulk upload Excel data. A user would import this file from a UI and the service is expected to take this file and import it into the database. I am new to Spring Batch and with some analysis was able to infer that we cannot send an Excel file as a job parameter. Is saving the file locally the only way to read it? Is there any way I can read the incoming file directly using Spring Batch?
If I understand your question correctly, you want a user to invoke a Spring service endpoint with a file, and then a Spring Batch job should pick that file up as its input and start processing.
Yes, this is very much doable, and you do not need to explicitly save the file locally yourself.
Here is what I would do:
Take the file as an input to a POST endpoint using Spring's org.springframework.web.multipart.MultipartFile type. Let's call this object "file".
Then get the input stream from the MultipartFile object using file.getInputStream().
Set this input stream as the "resource" of Spring Batch's "FlatFileItemReader" object.
Sample code:
flatFileItemReader.setResource(new InputStreamResource(file.getInputStream()));
Once this is done and you start the Spring batch job, this file will be processed in your job.
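A minimal sketch of that wiring (assuming the uploaded file is delimited text such as CSV, since FlatFileItemReader parses flat files rather than binary .xlsx; the endpoint path, the String item type and the injected reader/job beans are illustrative and would be defined in your own job configuration):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.core.io.InputStreamResource;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class UploadController {

    private final JobLauncher jobLauncher;
    private final Job importJob;
    private final FlatFileItemReader<String> reader; // configured with its LineMapper in the job config

    public UploadController(JobLauncher jobLauncher, Job importJob,
                            FlatFileItemReader<String> reader) {
        this.jobLauncher = jobLauncher;
        this.importJob = importJob;
        this.reader = reader;
    }

    @PostMapping("/import")
    public void importFile(@RequestParam("file") MultipartFile file) throws Exception {
        // Point the reader at the uploaded content instead of a file on disk
        // (with concurrent uploads the reader should be step-scoped).
        reader.setResource(new InputStreamResource(file.getInputStream()));

        JobParameters params = new JobParametersBuilder()
                .addLong("startedAt", System.currentTimeMillis()) // make each run unique
                .toJobParameters();
        jobLauncher.run(importJob, params);
    }
}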

Not able to pass JOB PARAMETER to Steps - Spring BatchJobs

We are implementing Spring Batch jobs.
We need to pass the job parameters from the client/master to the slave.
The client/master is where our job and partitioning code live. We are calling the job from a JUnit test, passing the job parameters.
The slave is where all the steps and their implementations (reader, writer and processor) are defined.
We are able to achieve this in a standalone setup, but not in the client & server setup. I am not sure why our setup fails here when it works standalone.
We are using WebLogic and Spring Integration along with JMS to achieve this.
Please assist.
We were able to resolve this issue as below in our bean configuration file:
<property name="load" value="#{jobParameters[load]}"></property>
load is passed from our shell script:
./esk200.sh esk200-context.xml TIMESTAMP load=full
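For reference, the same late binding expressed in Java config; note that the SpEL expression only resolves when the bean is step- or job-scoped (the processor here is just an illustration):

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoadAwareConfig {

    // "load" comes from the launch command, e.g. load=full in the shell script above.
    @Bean
    @StepScope
    public ItemProcessor<String, String> loadAwareProcessor(
            @Value("#{jobParameters['load']}") String load) {
        // Trivial processor that just tags each item with the load mode it was run under.
        return item -> load + ":" + item;
    }
}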
