Output bindings in Spring Cloud Stream Kafka Binder

The page above says that you can't have just one output binding, but I need exactly one output with the Spring Cloud Stream Kafka Binder. What should I do?
Some articles suggest using org.springframework.cloud.stream.function.StreamBridge, but it doesn't work for me. I use StreamBridge to send messages to a Kafka topic, but my consumer Spring Boot application never receives them.
Here are my application.yml and the producing code:
// Producer Spring Boot application
spring.cloud.stream:
  kafka:
    binder:
      brokers: {AWS.IP}:9092
      zkNodes: {AWS.IP}:2181
  bindings:
    deliveryIncoming:
      destination: deliveryIncomingtopic
@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    // want to produce to deliveryIncomingtopic and send it to the consumer application
    return args -> { };
}
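For reference, a minimal sketch of how a StreamBridge-based producer is usually wired. This is an assumption-laden example, not your actual code: the configuration class name and payload are made up, and it relies on StreamBridge's documented behavior of creating an output binding on the fly when no binding with the given name exists (in which case the name is used as the destination).

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ProducerRunnerConfig {

    @Bean
    public CommandLineRunner commandLineRunner(StreamBridge streamBridge) {
        // Sends one record; "deliveryIncomingtopic" is resolved as a binding
        // name, and StreamBridge creates the binding dynamically if absent.
        return args -> streamBridge.send("deliveryIncomingtopic", "hello from producer");
    }
}
```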
// Consumer Spring Boot application
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
@Bean
public Consumer<KStream<String, String>> deliveryIncoming() {
    return input ->
        input.foreach((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
EDIT
Sorry, I think I was a bit unclear. I just want to do the following:
produce(deliveryIncomingtopic) -> Kafka -> consume(deliveryIncomingtopic)

If that's the case, then you need to change your bean definition to return a java.util.function.Function instead of a java.util.function.Consumer.
@Bean
public Function<KStream<String, String>, KStream<String, String>> deliveryIncoming() {
    return input ->
        // peek logs each record and returns the stream, so the lambda
        // satisfies Function (foreach returns void and would not compile here)
        input.peek((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
However, AFAIK, you still need to define the output binding in your application.yml. You can use the same name with a different suffix, something like below:
deliveryIncoming-in-0:
  destination: <your_topic_name>
deliveryIncoming-out-0:
  destination: <your_topic_name>
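As an aside, the Function-vs-Consumer distinction here is plain java.util.function semantics, nothing Spring-specific: a Consumer only receives a value, so there is nothing for the binder to route to an output binding, while a Function returns one. A trivial illustration:

```java
import java.util.function.Consumer;
import java.util.function.Function;

public class FunctionVsConsumer {
    // A Function returns a value that a framework can route onward.
    static Function<String, String> process = value -> value.toUpperCase();

    // A Consumer only receives; there is nothing to bind to an output.
    static Consumer<String> consume = value -> System.out.println("consumed: " + value);

    public static void main(String[] args) {
        consume.accept("hello");
        System.out.println("produced: " + process.apply("hello"));
    }
}
```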

Just to make things clear here: are you looking to consume a message from your inbound topic deliveryIncomingtopic and then produce another message to a different output topic?
If that is your question, then I believe you are missing something in your application.yml: you need another configuration entry for the output topic. Since you're using Spring Cloud Function (based on what I see in your application.yml), add configuration for the output topic as follows:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming;deliveryOutput
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
        deliveryOutput-out-0:
          destination: deliveryOutput
Also define another bean for your producer function:
@Bean
public Supplier<Message<String>> deliveryOutput() {
    // do your necessary logic here to build the outbound message
    // (note: there is no java.util.function.Producer; Supplier is the
    // functional interface for a source binding)
}
Hope this matches your expectation.

Related

Setting `client.id` dynamically with Spring Cloud Stream Kafka Streams

I need to set a specific client.id value for Kafka Streams flows with Spring Cloud Stream dynamically.
I know that you can set the value statically as follows:
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            functions:
              kafkaStreamCountWords:
                configuration:
                  client.id: client-id-kafkaStreamCount
But I need to do it dynamically for all streams.
With the Spring Cloud Stream Kafka binder there are customizers for producers and consumers. For example:
@Bean
public ConsumerConfigCustomizer consumerConfigCustomizer() {
    return (consumerProperties, bindingName, destination) -> {
        consumerProperties.put(ConsumerConfig.CLIENT_ID_CONFIG, bindingName);
    };
}

@Bean
public ProducerConfigCustomizer producerConfigCustomizer() {
    return (producerProperties, bindingName, destination) -> {
        producerProperties.put(ProducerConfig.CLIENT_ID_CONFIG, bindingName);
    };
}
But they are not valid for the Kafka Streams binder at the function level.
Does anyone know how I can do it for Spring Cloud Stream Kafka Streams?

Spring Boot and Kotlin: Dynamic Kafka topic listeners

I am writing a service with Spring Boot and Kotlin. The service should listen to Kafka topics; I am using Spring Cloud Stream for that.
What's working is having a hardcoded topic to listen to. For example, in application.yaml I define the topic:
spring:
  cloud:
    stream:
      default:
        contentType: application/*+avro
        group: my-consumer-group
        consumer:
          useNativeDecoding: false
      # Binding-specific configs (Kafka agnostic)
      bindings:
        my_topic:
          # Topic to consume from
          destination: my_topic
and then access it like:
interface MyTopicSink {
    @Input(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "my_topic" // (from `application.yaml`)
    }
}

/**
 * Reads from my_topic Kafka topics.
 */
@Service
@EnableBinding(MyTopicSink::class)
class MyFancyConsumer() {
    /**
     * Listens & consumes from my_topic.
     */
    @StreamListener(MyTopicSink.INPUT)
    fun processTopic(
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) key: String,
        @Payload payload: org.apache.avro.generic.GenericData.Record
    ) {
        log.info("Yaay, this is working")
    }
}
That's great and all, but does anyone know if I can make this dynamic? For example, one idea is to keep a list of topics defined somewhere and have this consumer read from all of them. Has anyone done something similar? Thoughts?
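One configuration-only option, assuming all topics can share the same handler: Spring Cloud Stream accepts a comma-separated list of topics in a single input binding's destination. A sketch (the extra topic names here are made up):

```yaml
spring:
  cloud:
    stream:
      bindings:
        my_topic:
          # One binding consuming from several topics
          destination: my_topic,my_second_topic,my_third_topic
```

This keeps the single consumer but won't help if each topic needs a different handler or payload type.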

Spring Cloud Stream Kafka send message

How can I send a message with the new Spring Cloud Stream Kafka functional model?
The deprecated way looked like this:
public interface OutputTopic {
    @Output("output")
    MessageChannel output();
}

@Autowired
OutputTopic outputTopic;

public void someMethod() {
    outputTopic.output().send(MessageBuilder.build());
}
But how can I send a message in the functional style?
application.yml
spring:
cloud:
function:
definition: process
stream:
bindings:
process-out-0:
destination: output
binder: kafka
@Configuration
public class Configuration {
    @Bean
    Supplier<Message<String>> process() {
        return () -> MessageBuilder.withPayload("foo")
                .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                .build();
    }
}
I would autowire a MessageChannel, but there is no MessageChannel bean for process, process-out-0, output, or anything similar. Or can I send a message with a Supplier bean?
Could someone please give me an example?
Thanks a lot!
You can either use the StreamBridge or the reactor API - see "Sending arbitrary data to an output (e.g. Foreign event-driven sources)" in the reference documentation.
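For completeness, a hedged sketch of the StreamBridge route (the service class and method names are made up; `process-out-0` is the binding name from the application.yml above, which StreamBridge resolves to the configured destination at runtime):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

@Service
public class SenderService {

    private final StreamBridge streamBridge;

    public SenderService(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void someMethod() {
        // Send to the "process-out-0" binding; the binder routes it to
        // the "output" destination configured in application.yml.
        streamBridge.send("process-out-0",
                MessageBuilder.withPayload("foo").build());
    }
}
```

With this approach no Supplier bean is needed; the Supplier style is for messages generated by the framework polling, while StreamBridge suits on-demand sends from arbitrary code.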

Spring Cloud Stream. Azure EventHubs multiple input channels

I am having problems customizing the input channel name when using the Spring Cloud Stream Azure Event Hubs binder (spring-cloud-azure-eventhubs-stream-binder 1.2.6).
When I use Sink interface provided by spring cloud like this:
@EnableBinding(Sink::class)
internal class SomeListener {
    @StreamListener(Sink.INPUT)
    fun listener(
        message: GenericMessage<MyMessage>
    ) {
        logger.trace("Got message")
    }
}
with application properties like this:
spring:
  cloud:
    azure:
      eventhub:
        connection-string: ...
        checkpoint-access-key: ...
        checkpoint-storage-account: ...
    stream:
      eventhub:
        bindings:
          input:
            consumer:
              checkpoint-mode: BATCH
      bindings:
        input:
          destination: test
          group: $Default
          binder: eventhub
I can receive messages without any problems.
But when I want to use a custom channel like this:
internal interface CustomSink {
    @Output(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "custom-input"
    }
}

@EnableBinding(CustomSink::class)
internal class SomeEventSinkListener {
    @StreamListener(CustomSink.INPUT)
    fun listener(
        message: GenericMessage<MyMessage>
    ) {
        logger.trace("Got message")
    }
}
and properties like this:
spring:
  cloud:
    azure:
      eventhub:
        connection-string: <secret>
        checkpoint-access-key: <secret>
        checkpoint-storage-account: <secret>
    stream:
      eventhub:
        bindings:
          custom-input:
            consumer:
              checkpoint-mode: BATCH
      bindings:
        custom-input:
          destination: test
          group: $Default
          binder: eventhub
I did not receive any messages.
The only thing that changed is the channel name, and I don't know what is wrong. I can customize the output channel name without a problem, but input channels only work if I use the Sink interface binding. I would like to listen on two different destinations in the future; that's why I need custom input channels.
I appreciate any help. Thanks.

How to read Kafka Message Key from Spring cloud streams?

I am using Spring Cloud Stream to consume messages from Kafka.
Is it possible to read the Kafka message key from the code?
I have a Kafka topic that generally carries two types of messages, and the action to take depends on the message key. The Spring documentation only shows the following way to read a message, where I have to specify the concrete payload type (the Greetings class here). I need a way to read the message key first and then decide which POJO to deserialize into.
public class GreetingsListener {
    @StreamListener(GreetingsProcessor.INPUT)
    public void handleGreetings(@Payload Greetings request) {
    }
}
You can try something like this:
@StreamListener(GreetingsProcessor.INPUT)
public void handleGreetings(@Payload Greetings request,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) {
}
You need to provide a proper deserializer for the key. For example, if your key is a String, you can provide:
spring.cloud.stream.kafka.binder.configuration.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
If you need different key deserializers for different input channels, this setting can instead go under the consumer section of each Kafka binding. For example:
spring:
  cloud:
    stream:
      kafka:
        bindings:
          <channel_name>:
            consumer:
              startOffset: latest
              autoCommitOffset: true
              autoCommitOnError: true
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
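To close the loop on the original question (choosing the POJO by key), one hedged sketch is to receive the payload as a raw String and deserialize it manually based on the key. The Farewell type, the key value "greeting", and the use of Jackson's ObjectMapper are all assumptions here, not part of the question:

```java
@StreamListener(GreetingsProcessor.INPUT)
public void handleByKey(@Payload String rawJson,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    if ("greeting".equals(key)) {   // hypothetical key value
        Greetings greetings = mapper.readValue(rawJson, Greetings.class);
        // handle the Greetings message...
    } else {
        // Farewell is a hypothetical second payload type
        Farewell farewell = mapper.readValue(rawJson, Farewell.class);
        // handle the other message type...
    }
}
```

This assumes the binder hands you the payload as a String (e.g. with content-type conversion to the default text handling) rather than converting it to a POJO up front.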
