Using Spring Integration Router with Spring Cloud Stream - spring-boot

I have been trying to use Spring Integration's @Router with Spring Cloud Stream and the Kafka binder. My understanding is that when you return a list of channel names from the method annotated with @Router, the messages should be produced to the corresponding Kafka topics. But I don't see the messages being produced.
Does Spring Integration's @Router work with Spring Cloud Stream? If not, what's the alternative, and how do I programmatically route to channels selected at runtime?

What makes you believe that Spring Integration's @Router has any effect in Spring Cloud Stream? It's a completely different framework.
Yes, there are mechanisms to route FROM and TO in Spring Cloud Stream, and they are described here.
I think the specific section of interest for your use case is Routing FROM, but consider reading the full section to understand the differences and the mechanisms used.
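For reference, "Routing FROM" in recent Spring Cloud Stream versions is handled by the built-in RoutingFunction, enabled purely through configuration. A minimal sketch, assuming hypothetical topic and function-bean names (orderHandler and defaultHandler would be your own Function beans):

```yaml
spring:
  cloud:
    stream:
      function:
        routing:
          enabled: true            # registers the built-in "functionRouter" binding
      bindings:
        functionRouter-in-0:
          destination: incoming-topic
    function:
      # SpEL expression evaluated per message to select the target function bean
      routing-expression: "headers['type'] == 'order' ? 'orderHandler' : 'defaultHandler'"
```

The routing decision can also be made in code by registering a MessageRoutingCallback bean instead of the expression.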

Related

Spring and Google Cloud PubSub - subscribing to events

Following the documentation, there are multiple ways to integrate Google Cloud Pub/Sub events with a Spring application:
Spring Cloud GCP has several modules for sending messages to Pub/Sub topics and receiving messages from Pub/Sub subscriptions using the Spring Framework. You can use these modules independently or combine them for different use cases:
Spring Cloud GCP Pub/Sub Starter lets you send and receive messages using helper classes and call the Pub/Sub Java client library for more advanced scenarios.
Spring Integration Channel Adapters for Pub/Sub let you connect Spring Integration Message Channels with Pub/Sub.
Spring Cloud Stream Binder for Pub/Sub lets you use Pub/Sub as messaging middleware in Spring Cloud Stream applications.
I don't fully understand what those different use cases are, and how to determine which module is best for which use case.
The application I am working on (a Dockerized Spring Boot app deployed to Kubernetes in GCP) is rather simple: it is expected to act upon received Pub/Sub events, and it is not going to publish any events itself.
The Spring Cloud GCP Pub/Sub Starter module contains the Java client classes for Pub/Sub, which your Spring application uses to perform administrative and functional operations (i.e. sending and receiving messages).
The Spring Integration Channel Adapters for Pub/Sub module is used when your Spring application works with Message Channels; it routes messages between message channels and Pub/Sub using channel adapters.
The Spring Cloud Stream Binder for Pub/Sub module is used in Spring Cloud Stream applications to plug in Pub/Sub as the messaging middleware.
Since your application's requirements are basic, you can simply go with the Spring Cloud GCP Pub/Sub Starter module. For more information, refer to this Google documentation.
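For a receive-only app like yours, the Starter's PubSubTemplate is usually all you need. A minimal sketch, assuming a placeholder subscription name ("my-subscription"); the import packages are from recent spring-cloud-gcp releases (older versions use org.springframework.cloud.gcp.*):

```java
import com.google.cloud.spring.pubsub.core.PubSubTemplate;
import com.google.cloud.spring.pubsub.support.BasicAcknowledgeablePubsubMessage;

public class PubSubEventListener {

    private final PubSubTemplate pubSubTemplate;

    public PubSubEventListener(PubSubTemplate pubSubTemplate) {
        this.pubSubTemplate = pubSubTemplate;
    }

    /** Subscribes and acknowledges each message after it has been handled. */
    public void start() {
        pubSubTemplate.subscribe("my-subscription",
                (BasicAcknowledgeablePubsubMessage message) -> {
            String payload = message.getPubsubMessage().getData().toStringUtf8();
            handle(payload);
            message.ack(); // ack only after successful processing
        });
    }

    /** Business logic for one Pub/Sub event; returns a value to ease testing. */
    static String handle(String payload) {
        return "handled:" + payload;
    }
}
```

In a Spring Boot app you would typically call start() from a bean's @PostConstruct or an ApplicationRunner.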

Spring Security for Spring cloud stream with RabbitMQ as binder

I have a Spring Boot application with Spring Cloud Stream enabled. The project contains both API endpoints and producer/consumer streams; in our case RabbitMQ is the binder. We have enabled Spring Security, but it only covers the API endpoints.
I need to know how security can be enforced for stream requests coming from RabbitMQ. There is no user context involved here: the other services publish their requests to a queue (our cloud stream has listeners for that queue) rather than calling the API directly, so I am not sure how the client credentials flow can be used.
@Vignesh
I'm not sure whether it is possible to share user context from the message producer to the message consumer.
I created an issue: https://github.com/Azure/azure-sdk-for-java/issues/23323
Please vote for the issue if you think this is necessary for you: click +1 to vote, or add comments. Further discussion in the issue is welcome.

Spring boot Kafka messaging. How to use SpEL to manage handler access

I'm using Kafka in a Spring Boot project. It has a lot of benefits when you have a simple flow (using @KafkaListener, @KafkaHandler), and Spring prepares almost everything for you.
In my application I have different handlers for the same message data. I want to use SpEL to pick a handler based on header data, but I haven't found a corresponding API for that.
So my question: is it possible to select handlers via SpEL when I have special headers for that (for example the header "X-OPERATION_TYPE": "patch")? If so, how?
P.S.
I can work around this using the GoF Strategy pattern, for example, but I hope Spring already has a solution for this case.
There is no such "conditional routing" in Spring for Apache Kafka, but you can do the routing manually in a single @KafkaListener with a plain if...else or switch.
For more comprehensive routing logic it would be better to take a look into Spring Integration: https://docs.spring.io/spring-integration/docs/5.0.9.RELEASE/reference/html/messaging-routing-chapter.html
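To illustrate the manual approach, the header-based dispatch can live in a plain map so the @KafkaListener method stays small. This is a self-contained sketch with made-up handler logic; only the "X-OPERATION_TYPE" header name comes from the question:

```java
import java.util.Map;
import java.util.function.Function;

public class OperationRouter {

    // One entry per value carried in the "X-OPERATION_TYPE" header.
    private static final Map<String, Function<String, String>> HANDLERS = Map.of(
            "patch", payload -> "patched:" + payload,
            "create", payload -> "created:" + payload);

    /** Routes a payload to the handler selected by the header value. */
    public static String dispatch(String operationType, String payload) {
        Function<String, String> handler = HANDLERS.get(operationType);
        if (handler == null) {
            throw new IllegalArgumentException("Unknown operation: " + operationType);
        }
        return handler.apply(payload);
    }

    /*
     * In a Spring Kafka app the listener is then the only framework-aware piece:
     *
     * @KafkaListener(topics = "my-topic")
     * public void listen(ConsumerRecord<String, String> record) {
     *     String op = new String(record.headers().lastHeader("X-OPERATION_TYPE").value());
     *     dispatch(op, record.value());
     * }
     */
}
```

Adding a new operation type then only touches the map, which is essentially the Strategy pattern the question mentions, minus the boilerplate.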

Kafka bindings without @EnableBinding annotations in Spring

I'm using Spring Cloud Stream to connect to my Kafka broker. It works fine. Now I want to create my binding in code instead of with the annotation.
Is there a convenient way to do it?
Could you elaborate on why you want to do the binding programmatically instead of using @EnableBinding?
While Spring Cloud Stream simplifies exactly that, if you prefer to wire up the connection yourself (for some specific reason), you might want to check the Spring Integration adapters to do the binding. But in that case you are on your own for setting up the lifecycle and all the other goodies that Spring Cloud Stream provides.
If you still want to use Spring Cloud Stream but don't want to use the annotation, then check here to see all the configuration that Spring Cloud Stream applies when you annotate, and adapt it to your use case.
Please follow https://github.com/spring-cloud/spring-cloud-stream/issues/954. We plan to add this feature to 1.3.0.RC1.
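For readers on newer versions: Spring Cloud Stream 3.x added a functional programming model that needs no @EnableBinding at all; any Function/Supplier/Consumer bean is bound from configuration. A minimal sketch with placeholder names (in a real app the factory method below would be a @Bean method in a @Configuration class):

```java
import java.util.function.Function;

// With Spring Cloud Stream 3.x, exposing this as a @Bean and setting
//   spring.cloud.function.definition=uppercase
//   spring.cloud.stream.bindings.uppercase-in-0.destination=input-topic
//   spring.cloud.stream.bindings.uppercase-out-0.destination=output-topic
// is all the "binding code" required; binding names follow the
// <functionName>-in-<index> / <functionName>-out-<index> convention.
public class UppercaseFunction {

    public static Function<String, String> uppercase() {
        return String::toUpperCase;
    }
}
```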

Spring Cloud Contract - integrating with a non-Spring endpoint

I have a Spring webapp that communicates with an external service over Kafka. Is it possible to somehow test the contract between those services?
Yes, you can. Spring Cloud Contract supports CDC with messaging. If you're using Spring Cloud Stream, the work to be done is trivial. If not, you'll have to implement your own, as presented in this issue: Spring Cloud Contract and plain Spring AMQP. Summing it up, it's enough for both the consumer and the producer to implement a custom org.springframework.cloud.contract.verifier.messaging.MessageVerifier bean that is responsible for receiving and sending messages via Kafka.
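To make the shape of that bean concrete, here is a self-contained sketch using a simplified stand-in for the real MessageVerifier contract (the actual interface has a few more overloads, and a real implementation would delegate to a Kafka producer/consumer rather than an in-memory queue):

```java
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class InMemoryKafkaMessageVerifier {

    // destination (topic) -> queue of payloads; a real implementation would
    // send via a Kafka producer and poll a consumer instead.
    private final Map<String, BlockingQueue<Object>> topics = new ConcurrentHashMap<>();

    /** Mirrors MessageVerifier's send(message, destination). */
    public void send(Object message, String destination) {
        topics.computeIfAbsent(destination, d -> new LinkedBlockingQueue<>()).add(message);
    }

    /** Mirrors MessageVerifier's receive(destination, timeout, timeUnit). */
    public Object receive(String destination, long timeout, TimeUnit unit) {
        try {
            BlockingQueue<Object> queue =
                    topics.computeIfAbsent(destination, d -> new LinkedBlockingQueue<>());
            return queue.poll(timeout, unit); // null when nothing arrives in time
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }
}
```

Registering such a bean on both sides lets the generated contract tests send and receive through your Kafka setup without either service exposing a Spring endpoint.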
