Spring XD stream that responds to an HTTP POST

I would like to create a stream in Spring XD that starts with an HTTP source and uses the result of the stream execution as the body of the HTTP response.
In the following example, I would like to know if there is anything I could use as the sink, after upper-casing in the transformer, so that the body of the response to the original HTTP POST is that upper-cased payload.
stream create --name httptest --definition "http | transform --expression=payload.toUpperCase() | ??" --deploy
Currently, when I use the log module as the sink, the returned body is null.

No; that is not currently supported; the stream is a one-way street.
You can use a stand-alone Spring Integration or Spring Integration Java DSL application (perhaps a Spring Boot app, which provides a fast on-ramp with the web and integration starters) to provide that functionality.
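For instance, a minimal Spring Boot sketch of such an app might look like the following; the class name and endpoint path are illustrative, not from the original answer:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class UppercaseApplication {

    public static void main(String[] args) {
        SpringApplication.run(UppercaseApplication.class, args);
    }

    // Echo the POSTed body back, upper-cased, as the HTTP response body.
    @PostMapping("/uppercase")
    public String uppercase(@RequestBody String body) {
        return body.toUpperCase();
    }
}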

Related

Start Spring Batch job through JMS with Spring Cloud Dataflow

I have an application which listens to an ActiveMQ queue and starts a batch job when it receives a message.
I'd like to use Spring Cloud Dataflow to provide a UI, but I can't find information on how to configure it.
Since it uses Spring Boot, I should be able to replicate how my application currently works (use a REST API to make it listen to ActiveMQ and start the job when a message arrives), but I can't find anything on how to make it start the batch job in Cloud Dataflow.
You have a few options here.
Option 1: Launch your application as-is and manually send a message to launch the task.
Any arbitrary Spring Boot application can be launched from Dataflow (simply register it as type = "App").
Taken from https://github.com/spring-cloud/spring-cloud-dataflow/blob/main/spring-cloud-dataflow-docs/src/main/asciidoc/streams.adoc#register-a-stream-application:
Registering an application by using --type app is the same as registering a source, processor or sink. Applications of the type app can be used only in the Stream Application DSL (which uses double pipes || instead of single pipes | in the DSL) and instructs Data Flow not to configure the Spring Cloud Stream binding properties of the application. The application that is registered using --type app does not have to be a Spring Cloud Stream application. It can be any Spring Boot application. See the Stream Application DSL introduction for more about using this application type.
You would have to send the task launch request in your code. You can use the Dataflow REST client to do this. You can get an idea of how to do that by looking at https://github.com/spring-cloud/spring-cloud-dataflow/tree/main/spring-cloud-dataflow-tasklauncher/spring-cloud-dataflow-tasklauncher-sink.
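As a rough sketch of the REST-client approach (the server URI and task name are assumptions, and the exact launch(...) signature varies across Dataflow client versions):

import java.net.URI;
import java.util.Collections;
import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;

public class TaskLaunchClient {
    public static void main(String[] args) {
        // Connect to the Dataflow server's REST API (URI is an assumption).
        DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));
        // Launch a pre-registered task definition, e.g. from your JMS listener.
        long executionId = dataFlow.taskOperations().launch(
                "my-batch-task",          // task definition name (assumption)
                Collections.emptyMap(),   // deployment properties
                Collections.emptyList()); // command-line arguments
        System.out.println("Launched task execution " + executionId);
    }
}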
Option 2: Use pre-built stream applications to model the same flow as your application.
The app you describe can be logically modeled as a Spring Cloud Stream application.
There is a JMS source (provides messages to signal the need to kickoff task/batch job)
There is a TaskLauncher sink (receives messages and kicks off the task/batch job)
This app can actually be constructed with little effort by using the pre-packaged applications to model this flow.
JMS Source
Dataflow Tasklauncher Sink
If you have to register these applications in the UI, they can be found at:
maven://org.springframework.cloud.stream.app:jms-source-kafka:3.1.1
maven://org.springframework.cloud:spring-cloud-dataflow-tasklauncher-sink-kafka:2.9.2
Stream definition:
jms-source | dataflow-tasklauncher-sink
The README(s) on the above source/sinks give details about the configuration options.
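For example, from the Dataflow shell, registering the two apps and creating the stream (the app and stream names here are up to you) would look like:

app register --name jms-source --type source --uri maven://org.springframework.cloud.stream.app:jms-source-kafka:3.1.1
app register --name dataflow-tasklauncher-sink --type sink --uri maven://org.springframework.cloud:spring-cloud-dataflow-tasklauncher-sink-kafka:2.9.2
stream create --name jms-task-launcher --definition "jms-source | dataflow-tasklauncher-sink" --deploy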
Option 3: Custom Spring Cloud Stream app with function composition
The previous option would create two separate apps. However, if you want to keep the logic in a single app, you can look into creating a custom Spring Cloud Stream app that uses function composition and leverages the pre-built reusable Java functions that the apps in option 2 are built upon (see the configuration sketch after this list):
JMS Supplier
TaskLauncherFunction
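A hypothetical application.properties for such a composed app might look like the following; the function bean names are assumptions derived from the component names above, so check each function's README for the actual names, properties, and required dependencies:

# Compose the JMS supplier with the task launcher function (bean names are assumptions).
spring.cloud.function.definition=jmsSupplier|taskLauncherFunction
# JMS destination to listen on (property name per the JMS supplier's README).
jms.supplier.destination=task-requests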

Internal Channels in Spring Cloud Stream

I started developing a Spring Cloud Stream project. I successfully receive messages from Kafka through the @StreamListener annotation. Before sending the message to any output channel, I have to convert the payload by calling an external service or making a DB call. I don't want to call the external service or DB method from the same @StreamListener method. My question is: can we create internal channels (like a Spring Integration DSL flow) in Spring Cloud Stream?
Yes, you can. In Spring Cloud Stream, a channel is only bound to the binder when you declare a binding for it, so any additional channels you define remain internal to the application.
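As a minimal sketch (channel, class, and payload types are illustrative), the bound input can hand off to an internal channel where the external service or DB call happens, before replying to the bound output:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.messaging.MessageChannel;

@EnableBinding(Processor.class)
public class EnrichmentFlow {

    // A plain Spring Integration channel; it has no binder binding, so it stays internal.
    @Bean
    public MessageChannel enrichChannel() {
        return new DirectChannel();
    }

    // Receive from the Kafka-bound input and hand off to the internal channel.
    @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = "enrichChannel")
    public String receive(String payload) {
        return payload;
    }

    // Call the external service / DB here, then reply to the bound output.
    @ServiceActivator(inputChannel = "enrichChannel", outputChannel = Processor.OUTPUT)
    public String enrich(String payload) {
        return payload.toUpperCase(); // stand-in for the external service / DB call
    }
}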

Spring Cloud Stream router app

I've been playing with the router sink in the Spring Cloud Stream App Starters, and I have a question about content type.
I'm sending a JSON string to the router, and I would like to write a SpEL expression to determine the routing. However, even when I run this by modifying the JUnit test cases in the project, the "payload" shows up as a String, not parsed JSON. When running the JUnit test case for the filter processor, also in the Spring Cloud Stream App Starters, all I need to do is pass valid JSON in a string, and the payload is a LinkedHashMap. (A regular string, like "Hello, world!", makes the payload show up as a String type.)
I really want my router to have a HashMap payload as well. Otherwise, I can't figure out how to write my SpEL expression. I learned earlier how to set a content type on Spring Cloud Stream, and so when I deploy the router in Spring Cloud Dataflow, I try to set it via:
stream deploy --name router-flow --properties "app.router.spring.cloud.stream.bindings.input.content-type=application/json"
However, the payload still shows up as a string. Where am I going wrong?
You can use the #jsonPath() SpEL function - it's automatically registered for use by Stream apps.
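For example, assuming your incoming JSON has a "type" field you want to route on (the field name here is a stand-in for your own), the stream definition could be:

stream create --name router-flow --definition "http | router --expression=#jsonPath(payload,'$.type')" --deploy

The expression result is used as the name of the destination to route each message to.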

Reactive stream Kafka Stream fan-out to HTTP actors

I'm very new to Akka Streams and reactive streaming. I have a question: is it possible to have a REST API receive a message and drop it on the Kafka bus, with a Kafka streaming consumer then aggregating the messages in a maximum time window and returning the answer back?
How would I implement such a system, and where should I start?
For a rest API you can consider the Kafka REST Proxy: https://github.com/confluentinc/kafka-rest
Of course you can instead build your own using akka-http and akka-stream-kafka.
As to windowing, I'm sure it can be done in Akka Streams, but personally I'd suggest using Kafka Streams as the first port of call:
http://docs.confluent.io/current/streams/developer-guide.html#windowing
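As a rough Kafka Streams sketch of a one-minute tumbling-window aggregation (topic names and the count aggregation are assumptions):

import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.WindowedSerdes;

public class WindowedAggregation {
    public static StreamsBuilder buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("events", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey()
               // Aggregate per key over one-minute tumbling windows.
               .windowedBy(TimeWindows.of(Duration.ofMinutes(1)))
               .count()
               .toStream()
               // Publish windowed counts to an output topic for downstream consumers.
               .to("event-counts", Produced.with(
                       WindowedSerdes.timeWindowedSerdeFrom(String.class),
                       Serdes.Long()));
        return builder;
    }
}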
I'm not sure what exactly you mean by returning the answer back, but if you follow the approach above, you can use the REST Proxy to consume the windowed-aggregated messages, or you can build a REST service that queries the Kafka Streams state stores via the so-called "interactive queries". This post shows how to do it using javax.ws.rs: https://www.confluent.io/blog/unifying-stream-processing-and-interactive-queries-in-apache-kafka/ but for a reactive application you can do the same using akka-http instead (I'm implementing this exact thing on one of my projects).

Spring Cloud DataFlow for HTTP request/response exchange

I would like to use streams to handle an HTTP request/response exchange. I didn't see any Spring Cloud Stream App Starters with HTTP sink functionality. Will I need to build a custom sink to handle the response? If so, do I pass the request through my processing pipeline, then use the request in my sink to form the response? I don't think I've misunderstood the use case of Spring Cloud DataFlow and Spring Cloud Stream. Perhaps there are app starters available for this pattern.
Spring Cloud Stream/Dataflow is for unidirectional (stream) processing; it is not intended for request/reply processing.
You could, however, utilize a stream from a Spring Integration application; for example, with the RabbitMQ binder...
http-inbound-gateway -> amqp-outbound-gateway
where the outbound gateway is configured to expect the reply from a specific queue, and your stream could then be...
:requestQueue > processor1 | ... | processorn > :replyQueue
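A minimal Spring Integration Java DSL sketch of that gateway app might look like this (the path, routing key, and bean wiring are assumptions; the RabbitTemplate must be configured with a reply listener container on the reply queue):

import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.http.dsl.Http;

@Configuration
public class GatewayFlow {

    @Bean
    public IntegrationFlow httpToAmqp(RabbitTemplate rabbitTemplate) {
        return IntegrationFlows
                // Receive the HTTP request and hold the connection open for the reply.
                .from(Http.inboundGateway("/process"))
                // Send to the stream's request queue and wait for the reply
                // (the template's reply container listens on the reply queue).
                .handle(Amqp.outboundGateway(rabbitTemplate)
                        .routingKey("requestQueue"))
                .get();
    }
}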
Spring Integration doesn't currently have an outbound gateway for Kafka. I opened an issue.
