Kafka ItemWriter for Spring Batch for Spring Cloud Data Flow

I've created a Spring Batch application which reads data from a file and writes it to a database. Now I want to write the output data to Kafka as well. I've tried to use Spring Cloud Data Flow but could not even run it.
I followed this tutorial: https://spring.io/guides/gs/batch-processing/
When I register the application in Spring Cloud Data Flow, its status shows as N/A. I also checked this tutorial: http://www.baeldung.com/spring-cloud-data-flow-batch-processing and added @EnableTask to my application class, but the result is the same.
My question is: how can I write data to Kafka from my Spring Batch job and run that job in Spring Cloud Data Flow?
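For the Kafka side, Spring Batch's KafkaItemWriter can be used in place of (or in addition to) the database writer. Below is a minimal sketch, assuming Spring Batch 4.2+ (which ships org.springframework.batch.item.kafka.KafkaItemWriter), spring-kafka on the classpath, and the Person domain class from the linked guide; the broker address, topic name and key mapping are illustrative. On older Spring Batch versions you would wrap a KafkaTemplate in a custom ItemWriter instead.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.batch.item.kafka.KafkaItemWriter;
import org.springframework.batch.item.kafka.builder.KafkaItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaWriterConfig {

    @Bean
    public KafkaTemplate<String, Person> kafkaTemplate() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        KafkaTemplate<String, Person> template =
                new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
        template.setDefaultTopic("persons"); // hypothetical topic name; KafkaItemWriter sends to the default topic
        return template;
    }

    @Bean
    public KafkaItemWriter<String, Person> kafkaItemWriter(KafkaTemplate<String, Person> template) {
        return new KafkaItemWriterBuilder<String, Person>()
                .kafkaTemplate(template)
                .itemKeyMapper(Person::getLastName) // derive the message key from each item
                .build();
    }
}

Plug the kafkaItemWriter bean into your step in place of the JdbcBatchItemWriter (or combine both with a CompositeItemWriter), rebuild the jar, and register it in Data Flow as a task application; the @EnableTask annotation you already added is what lets the server track the launched job.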

Related

Can we use Spring Boot to expose REST APIs for Spark jobs?

I have been given a task where I have Apache Spark SQL queries and they want me to expose the results as Spring REST APIs using Spring Boot.
Is this possible? What I would be given are Spark SQL queries.
The data currently runs on our Databricks cluster, but it also needs to be exposed as REST APIs.
Yes, you can do that. You can look into the Spring for Apache Hadoop Spark integration.
https://docs.spring.io/spring-hadoop/docs/current/reference/html/springandhadoop-spark.html
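For a rough idea of what this looks like, here is a minimal sketch of exposing a Spark SQL query through a Spring Boot REST endpoint; it assumes spark-sql is on the classpath and that a local SparkSession is acceptable for illustration. The endpoint path, request parameter and application name are made up, and accepting raw SQL from the caller is only for demonstration.

import java.util.List;

import org.apache.spark.sql.SparkSession;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SparkSqlController {

    // A local session for demonstration; point the master at your cluster in practice
    private final SparkSession spark = SparkSession.builder()
            .appName("spark-rest")
            .master("local[*]")
            .getOrCreate();

    @GetMapping("/query")
    public List<String> runQuery(@RequestParam String sql) {
        // Each result row is serialized as a JSON string
        return spark.sql(sql).toJSON().collectAsList();
    }
}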

Where are the logs for a Spring Cloud Data Flow Stream

I am using Spring Cloud Data Flow Stream. I have launched 3 Spring Boot apps as a stream, namely a source, a processor and a sink.
How can I see the logs of these Spring Boot apps? Thanks.
Apparently you can look at http://localhost:9393/runtime/apps; the stdout attribute in each app's runtime attributes tells you where its log output is written.
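If you prefer to read that endpoint programmatically, here is a minimal sketch using RestTemplate; it assumes the Data Flow server runs on localhost:9393 without authentication.

import org.springframework.web.client.RestTemplate;

public class RuntimeAppsClient {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();
        // Returns a JSON document listing each deployed app and its runtime attributes
        String response = rest.getForObject("http://localhost:9393/runtime/apps", String.class);
        System.out.println(response);
    }
}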

Spring Cloud Data Flow - can it be used without Spring Boot?

Can Spring Cloud Data Flow be used in Spring 5 applications - NOT Spring Boot? My current employer seems to view Spring Boot applications as insecure (I've no idea why). Anyway, I'd like to try to use this stack for an integration project, so is it possible to use it without Spring Boot?
With Spring Cloud Data Flow you can deploy streams, tasks and batch jobs.
This is all based on Spring, Spring Cloud and Spring Boot. Spring Boot is essentially a preconfigured Spring stack.
Spring Cloud Data Flow is a runtime that usually needs a cloud infrastructure such as Kubernetes.
I'm not sure whether that is really what you are looking for, or rather something like https://spring.io/projects/spring-integration

How to register a Spring Batch application in Spring Cloud Data Flow?

I am trying to register a Spring Batch project jar file in Spring Cloud Data Flow. I am pretty new to Spring Cloud Data Flow. Please walk me through the step-by-step process of registering a Spring Batch job in Spring Cloud Data Flow.
There are a few examples in the SCDF-samples repo.
Also, we spend cycles writing the documentation, so please make sure to review the reference guide; in particular the task/batch sections could be useful.
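As a starting point, here is a minimal sketch of the application class for a Spring Batch job that Spring Cloud Data Flow can register and launch as a task; it assumes spring-cloud-starter-task is on the classpath, and the package and class names are illustrative.

package com.example.batch;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}

Once the jar is built, you register it in the Data Flow shell or dashboard as an application of type task, create a task definition from it, and launch it; the reference guide's task/batch sections walk through those steps.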

Spring Boot to Spring XD conversion

I have an existing stand-alone Spring Boot (CommandLineRunner) application that generates JSON payloads, currently written to the local file system. Though I could tail that file easily in Spring XD, I'd rather have the entire Spring Boot application run as either a source or job module in Spring XD, but I'm having some difficulties making the conversion. Does anyone know of a Spring XD sample where the XD module (source or job) is the creator/originator of outgoing messages, instead of using some sort of inbound adapter from another external system? (Maybe my entire premise is misguided, but I want the application to be managed by Spring XD so that I don't have to manage another out-of-band orchestration tool to start and stop the JSON-generating process.)
