How to read the Kafka message key from Spring Cloud Stream?

I am using Spring Cloud Stream to consume messages from Kafka.
Is it possible to read the Kafka message key from the code?
I have a Kafka topic that generally carries two types of messages, and the action to take depends on the message key. The Spring documentation only shows the following way to read a message. Here I have to specify the concrete payload type (the Greetings class). What I need instead is a way to read the message key first and then decide which POJO to deserialize into.
public class GreetingsListener {

    @StreamListener(GreetingsProcessor.INPUT)
    public void handleGreetings(@Payload Greetings request) {
    }
}

You can try something like this:
@StreamListener(GreetingsProcessor.INPUT)
public void handleGreetings(@Payload Greetings request,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) {
}
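Since the original question was how to pick the target POJO based on the key, here is a minimal sketch of that dispatch. The "greeting"/"farewell" keys, the Farewell class, and receiving the raw payload as byte[] are assumptions for illustration, not part of the question:

@StreamListener(GreetingsProcessor.INPUT)
public void handle(@Payload byte[] raw,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    if ("greeting".equals(key)) {
        // the key says this record carries a Greetings payload
        Greetings greetings = mapper.readValue(raw, Greetings.class);
        // ... handle greetings
    } else if ("farewell".equals(key)) {
        // hypothetical second message type
        Farewell farewell = mapper.readValue(raw, Farewell.class);
        // ... handle farewell
    }
}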
You need to provide a proper deserializer for the key. For example, if your key is a String, you can configure:
spring.cloud.stream.kafka.binder.configuration.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
If different input channels need different key deserializers, this setting can instead go under the consumer section of each Kafka binding. For example:
spring:
  cloud:
    stream:
      kafka:
        bindings:
          <channel_name>:
            consumer:
              startOffset: latest
              autoCommitOffset: true
              autoCommitOnError: true
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer

Related

Get topic name from Spring Cloud Stream MessageChannel

We are using Spring Cloud Stream Kafka in a Spring Boot application to send data to Kafka, like this:

producerChannel.send(MessageBuilder
        .withPayload(data)
        .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
        .build());
I would like to know whether it's possible to get the topic name from the MessageChannel, rather than reading it directly from the YAML file.

@Output("topic-name-out")
MessageChannel producerChannel();
The topic name is defined in kafka.yaml:
spring:
  cloud:
    stream:
      bindings:
        topic-name-out:
          destination: topic_name_to_producer
          contentType: application/json
          producer:
            partitionCount: ${partition_count:3}
I see you are using the annotation-based programming model (e.g., @Output). It has been deprecated for 3+ years and is being removed from the code base.
Please upgrade to the functional model.
As for your question about the destination name that is configured externally: there is a way to access it programmatically via the bindings, but I am more curious why you need it, since it is an internal detail, and given that it is externally configurable it can change without notice, affecting your code.
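If you do need it, a minimal sketch (assuming the binding name topic-name-out from the question) is to read the configured destination from BindingServiceProperties:

@Autowired
private BindingServiceProperties bindingServiceProperties;

public String resolveTopicName() {
    // returns the externally configured destination, e.g. topic_name_to_producer
    return bindingServiceProperties.getBindingProperties("topic-name-out").getDestination();
}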
You can create a MessageChannel bean and bind it to the producer topic:

@Bean
MessageChannel producerLogger() {
    // MessageChannel#send(Message, long) is its single abstract method,
    // so the channel can be implemented as a lambda
    return (message, timeout) -> {
        RecordMetadata meta = message.getHeaders().get(KafkaHeaders.RECORD_METADATA, RecordMetadata.class);
        Object txnId = message.getHeaders().get("txnId");
        if (Objects.nonNull(meta) && txnId instanceof byte[]) {
            log.trace("Topic [{}] Partition [{}] Offset [{}] TxnId [{}]",
                    meta.topic(), meta.partition(), meta.offset(), new String((byte[]) txnId));
        }
        return true;
    };
}
This will give you the topic, partition, and offset to which your application is producing each message.
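For the bean above to actually receive the record metadata, it presumably has to be registered as the binding's recordMetadataChannel; a sketch, assuming the topic-name-out binding from the question and the producerLogger bean name above:

spring:
  cloud:
    stream:
      kafka:
        bindings:
          topic-name-out:
            producer:
              recordMetadataChannel: producerLogger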

One output binding in Spring Cloud Stream Kafka Binder

The linked page says that you can't create just one output, but I need to make just one output using the Spring Cloud Stream Kafka binder.
What should I do?
Some articles suggest using org.springframework.cloud.stream.function.StreamBridge, but it doesn't work for me: I used StreamBridge to send messages to Kafka, but the messages never reach my Spring Boot consumer application.
Here are my application.yml files and the producing code:
// Producer Spring Boot application
spring.cloud.stream:
  kafka:
    binder:
      brokers: {AWS.IP}:9092
      zkNodes: {AWS.IP}:2181
  bindings:
    deliveryIncoming:
      destination: deliveryIncomingtopic

@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    // wanna produce deliveryIncomingtopic and send it to the consumer application
}
// Consumer Spring Boot application
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic

@Bean
public Consumer<KStream<String, String>> deliveryIncoming() {
    return input ->
        input.foreach((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
EDIT
Sorry, I think I was unclear. I just want to do the following:
produce(deliveryIncomingtopic) -> Kafka -> consume(deliveryIncomingtopic)
If that's the case, then you need to change your bean definition to return a java.util.function.Function instead of a java.util.function.Consumer.
@Bean
public Function<KStream<String, String>, KStream<String, String>> deliveryIncoming() {
    // peek logs each record and still returns the stream downstream,
    // which a Function must do (foreach would return void)
    return input ->
        input.peek((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
However, AFAIK you still need to define the output binding in your application.yml. You can use the same name with a different suffix, something like this:

deliveryIncoming-in-0:
  destination: <your_topic_name>
deliveryIncoming-out-0:
  destination: <your_topic_name>
Just to make things clear here: are you looking to consume a message from your inbound topic deliveryIncomingtopic and then generate/produce another message to a different output topic?
If that's what you're asking, then I believe you are missing something in your application.yml: you need another configuration entry for the output topic.
Since you're using Spring Cloud Function (based on what I see in your application.yml), add configuration for the output topic as follows:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming;deliveryOutput
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
        deliveryOutput-out-0:
          destination: deliveryOutput
And also define another bean for your producer function (in the functional model, producers are Suppliers; there is no Producer functional type):

@Bean
public Supplier<KStream<String, String>> deliveryOutput() {
    // do your necessary logic here to build the outbound message stream
}
Hope this will match your expectation.
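If the producer application only needs to publish from its CommandLineRunner, a StreamBridge-based sketch (using the deliveryIncoming binding name from the question's application.yml; the payload is made up) could look like this:

@Bean
public CommandLineRunner commandLineRunner(StreamBridge streamBridge) {
    // sends one record to the binding bound to deliveryIncomingtopic above
    return args -> streamBridge.send("deliveryIncoming", "hello delivery");
}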

Spring Boot and Kotlin: Dynamic Kafka topic listeners

I am writing a service with Spring Boot and Kotlin. The service should listen to Kafka topics, and I am using Spring Cloud Stream for that.
What's working is a hardcoded topic to listen to. For example, in application.yaml I define the topic:
spring:
  cloud:
    stream:
      default:
        contentType: application/*+avro
        group: my-consumer-group
        consumer:
          useNativeDecoding: false
      # Binding-specific configs (Kafka agnostic)
      bindings:
        my_topic:
          # Topic to consume from
          destination: my_topic
and then access it like:
interface MyTopicSink {
    @Input(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "my_topic" // (from `application.yaml`)
    }
}

/**
 * Reads from the my_topic Kafka topic.
 */
@Service
@EnableBinding(MyTopicSink::class)
class MyFancyConsumer {
    /**
     * Listens to & consumes from my_topic.
     */
    @StreamListener(MyTopicSink.INPUT)
    fun processTopic(
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) key: String,
        @Payload payload: org.apache.avro.generic.GenericData.Record
    ) {
        log.info("Yaay, this is working")
    }
}
That's great and all, but does anyone know if I can make this dynamic? For example, one of my ideas is to have a list of topics defined somewhere and then have this consumer read from all of them dynamically. Has anyone done something similar? Thoughts?
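One lead, based on the documentation quoted in the dynamic-channels question further below: a consumer binding's destination accepts a comma-separated list of topics, so the list could be injected from an external property. A sketch, where topics.to.consume is a hypothetical property:

spring:
  cloud:
    stream:
      bindings:
        my_topic:
          # e.g. topics.to.consume=topic-1,topic-2
          destination: ${topics.to.consume}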

Spring Cloud Stream Kafka send message

How can I send a message with the new Spring Cloud Stream Kafka functional model?
The deprecated way looked like this:

public interface OutputTopic {

    @Output("output")
    MessageChannel output();
}

@Autowired
OutputTopic outputTopic;

public void someMethod() {
    outputTopic.output().send(MessageBuilder.withPayload("foo").build());
}

But how can I send a message in the functional style?
application.yml
spring:
  cloud:
    function:
      definition: process
    stream:
      bindings:
        process-out-0:
          destination: output
          binder: kafka

@Configuration
public class Configuration {

    @Bean
    Supplier<Message<String>> process() {
        return () -> MessageBuilder.withPayload("foo")
                .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                .build();
    }
}
I would autowire a MessageChannel, but there is no MessageChannel bean for process, process-out-0, output, or anything like that. Or can I only send messages via a Supplier bean?
Could someone please give me an example?
Thanks a lot!
You can either use the StreamBridge or the reactor API - see Sending arbitrary data to an output (e.g. Foreign event-driven sources)
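A minimal StreamBridge sketch against the process-out-0 binding configured above (the class and method names here are made up):

@Component
public class MessageSender {

    private final StreamBridge streamBridge;

    public MessageSender(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void send(String payload) {
        // builds the same kind of keyed message as the Supplier above,
        // but sends it on demand instead of by polling
        streamBridge.send("process-out-0", MessageBuilder.withPayload(payload)
                .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                .build());
    }
}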

Spring Cloud Stream dynamic channels

I am using Spring Cloud Stream and want to programmatically create and bind channels. My use case is that during application startup I receive the dynamic list of Kafka topics to subscribe to. How can I then create a channel for each topic?
I ran into a similar scenario recently, and below is my sample of creating SubscribableChannels dynamically.
// describe the consumer binding for the dynamic topic
ConsumerProperties consumerProperties = new ConsumerProperties();
consumerProperties.setMaxAttempts(1);
BindingProperties bindingProperties = new BindingProperties();
bindingProperties.setConsumer(consumerProperties);
bindingProperties.setDestination(retryTopic);
bindingProperties.setGroup(consumerGroup);

// bindingServiceProperties, bindingTargetFactory, beanFactory and
// bindingService are autowired Spring Cloud Stream / Spring beans
bindingServiceProperties.getBindings().put(consumerName, bindingProperties);
SubscribableChannel channel = (SubscribableChannel) bindingTargetFactory.createInput(consumerName);
beanFactory.registerSingleton(consumerName, channel);
channel = (SubscribableChannel) beanFactory.initializeBean(channel, consumerName);
bindingService.bindConsumer(channel, consumerName);
channel.subscribe(consumerMessageHandler);
I had to do something similar for the Camel Spring Cloud Stream component.
Perhaps the consumer code that binds a destination ("really just a String indicating the channel name") will be useful to you.
In my case I only bind a single destination, but I don't imagine it being much different conceptually for multiple destinations.
Below is the gist of it:
@Override
protected void doStart() throws Exception {
    SubscribableChannel bindingTarget = createInputBindingTarget();
    bindingTarget.subscribe(message -> {
        // have your way with the received incoming message
    });

    endpoint.getBindingService().bindConsumer(bindingTarget,
            endpoint.getDestination());
    // at this point the binding is done
}

/**
 * Create a {@link SubscribableChannel} and register it in the
 * {@link org.springframework.context.ApplicationContext}.
 */
private SubscribableChannel createInputBindingTarget() {
    SubscribableChannel channel = endpoint.getBindingTargetFactory()
            .createInputChannel(endpoint.getDestination());
    endpoint.getBeanFactory().registerSingleton(endpoint.getDestination(), channel);
    channel = (SubscribableChannel) endpoint.getBeanFactory().initializeBean(channel,
            endpoint.getDestination());
    return channel;
}
See here for the full source for more context.
I had a task where I did not know the topics in advance. I solved it by having one input channel which listens to all the topics I need.
https://docs.spring.io/spring-cloud-stream/docs/Brooklyn.RELEASE/reference/html/_configuration_options.html
Destination
The target destination of a channel on the bound middleware (e.g., the RabbitMQ exchange or Kafka topic). If the channel is bound as a consumer, it could be bound to multiple destinations and the destination names can be specified as comma-separated String values. If not set, the channel name is used instead.
So my configuration:

spring:
  cloud:
    stream:
      default:
        consumer:
          concurrency: 2
          partitioned: true
      bindings:
        # inputs
        input:
          group: application_name_group
          destination: topic-1,topic-2
          content-type: application/json;charset=UTF-8
Then I defined one consumer which handles messages from all these topics:

@Component
@EnableBinding(Sink.class)
public class CommonConsumer {

    private final static Logger logger = LoggerFactory.getLogger(CommonConsumer.class);

    @StreamListener(target = Sink.INPUT)
    public void consumeMessage(final Message<Object> message) {
        logger.info("Received a message: \nmessage:\n{}", message.getPayload());
        // Here I define logic which handles messages depending on message headers and topic.
        // In my case I have configuration which forwards these messages to webhooks,
        // so I need a mapping of topic name -> webhook URI.
    }
}
Note: in your case this may not be a solution. I needed to forward messages to webhooks, so I could have a configuration mapping.
I also considered other ideas:
1) Use a plain Kafka client consumer without Spring Cloud.
2) Create a predefined number of inputs, for example 50:
input-1
input-2
...
input-50
and then have configuration for only some of these inputs.
Related discussions
Spring cloud stream to support routing messages dynamically
https://github.com/spring-cloud/spring-cloud-stream/issues/690
https://github.com/spring-cloud/spring-cloud-stream/issues/1089
We use Spring Cloud 2.1.1.RELEASE:

MessageChannel messageChannel = createMessageChannel(channelName);
messageChannel.send(getMessageBuilder().apply(data));

public MessageChannel createMessageChannel(String channelName) {
    return (MessageChannel) applicationContext.getBean(channelName);
}

public Function<Object, Message<Object>> getMessageBuilder() {
    return payload -> MessageBuilder
            .withPayload(payload)
            .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
            .build();
}
For incoming messages, you can explicitly use the BinderAwareChannelResolver to dynamically resolve the destination. You can check this example, where the router sink uses the binder-aware channel resolver.
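A short sketch of that approach for the outbound side (the destination name and payload are placeholders), assuming a BinderAwareChannelResolver bean is available for injection:

@Autowired
private BinderAwareChannelResolver resolver;

public void sendTo(String destination, Object payload) {
    // resolves (and creates, if necessary) the output binding for the destination
    resolver.resolveDestination(destination)
            .send(MessageBuilder.withPayload(payload).build());
}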