Setting `client.id` dynamically with Spring Cloud Stream Kafka Streams - apache-kafka-streams

I need to set a specific client.id value for Kafka Streams flows with Spring Cloud Stream dynamically.
I know that you can set the value statically as follows:
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            functions:
              kafkaStreamCountWords:
                configuration:
                  client.id: client-id-kafkaStreamCount
But I need to do it dynamically for all streams.
With the Spring Cloud Stream Kafka binder there are customizers for producers and consumers. For example:

@Bean
public ConsumerConfigCustomizer consumerConfigCustomizer() {
    return (consumerProperties, bindingName, destination) -> {
        consumerProperties.put(ConsumerConfig.CLIENT_ID_CONFIG, bindingName);
    };
}

@Bean
public ProducerConfigCustomizer producerConfigCustomizer() {
    return (producerProperties, bindingName, destination) -> {
        producerProperties.put(ProducerConfig.CLIENT_ID_CONFIG, bindingName);
    };
}
But these do not apply to the Kafka Streams binder at the function level.
Does anyone know how I can do it for Spring Cloud Stream Kafka Streams?
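One possibility (a sketch, not from this thread): the Kafka Streams binder builds its topologies through spring-kafka's StreamsBuilderFactoryBean, and a StreamsBuilderFactoryBeanConfigurer bean is applied to every factory bean the binder creates, so it can rewrite client.id in one place for all functions. Deriving the value from the application.id and the "-client" suffix below are assumptions:

```java
import java.util.Properties;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.StreamsBuilderFactoryBeanConfigurer;

@Configuration
public class ClientIdConfig {

    @Bean
    public StreamsBuilderFactoryBeanConfigurer clientIdConfigurer() {
        return factoryBean -> {
            Properties props = factoryBean.getStreamsConfiguration();
            if (props != null) {
                // Derive client.id from the application.id the binder assigned,
                // so every function gets a distinct, predictable value
                String appId = props.getProperty(StreamsConfig.APPLICATION_ID_CONFIG);
                props.put(StreamsConfig.CLIENT_ID_CONFIG, appId + "-client");
            }
        };
    }
}
```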

Related

Get topic name from Spring Cloud Stream MessageChannel

We are using Spring Cloud Stream Kafka in a Spring Boot application to send data to Kafka, like this:

producerChannel.send(MessageBuilder
        .withPayload(data)
        .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
        .build());
I would like to know whether it's possible to get the topic name from the MessageChannel, rather than reading it directly from the YAML file?

@Output("topic-name-out")
MessageChannel producerChannel();
The topic name is defined in kafka.yaml:
spring:
  cloud:
    stream:
      bindings:
        topic-name-out:
          destination: topic_name_to_producer
          contentType: application/json
          producer:
            partitionCount: ${partition_count:3}
I see you are using the annotation-based programming model (e.g., @Output). It has been deprecated for 3+ years and is being removed from the code base.
Please upgrade to the functional model.
As for your question about the destination name that is configured externally: there is a way to access it programmatically via the bindings, but I am curious why you need it, since it is an internal detail, and given that it is externally configurable it can change without notice, affecting your code.
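If all that is needed is the configured destination string, one low-tech sketch (an assumption about your setup, not a binder API) is to read the same property the binder resolves from the Spring Environment; the bean name here is made up:

```java
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;

@Component
public class TopicNameLookup {

    private final Environment env;

    public TopicNameLookup(Environment env) {
        this.env = env;
    }

    /** Returns the destination configured for the topic-name-out binding. */
    public String producerTopic() {
        return env.getProperty("spring.cloud.stream.bindings.topic-name-out.destination");
    }
}
```

The obvious caveat, as noted above, is that this value is externally configurable and can change without notice.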
You can create a bean and bind it to the producer topic:

@Bean
public MessageHandler producerLogger() {
    return message -> {
        RecordMetadata meta = message.getHeaders().get(KafkaHeaders.RECORD_METADATA, RecordMetadata.class);
        Object txnId = message.getHeaders().get("txnId");
        if (Objects.nonNull(meta) && txnId instanceof byte[]) {
            log.trace("Topic [{}] Partition [{}] Offset [{}] TxnId [{}]", meta.topic(), meta.partition(), meta.offset(),
                    new String((byte[]) txnId));
        }
    };
}

This will give you the topic, partition, and offset to which your application is producing the message.

1 output bindings in Spring Cloud Stream Kafka Binder

This page says that you can't have just one output, but I need exactly one output with the Spring Cloud Stream Kafka binder. What should I do?
Some articles say to use org.springframework.cloud.stream.function.StreamBridge, but it doesn't work for me.
I used StreamBridge to send messages to Kafka, but my Spring Boot consumer application never receives them.
Here are my application.yml and producer code:
// producer Spring Boot application
spring.cloud.stream:
  kafka:
    binder:
      brokers: {AWS.IP}:9092
      zkNodes: {AWS.IP}:2181
  bindings:
    deliveryIncoming:
      destination: deliveryIncomingtopic

@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    // wanna produce deliveryIncomingtopic and send to Spring's Consumer
}
// consumer Spring Boot application
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic

@Bean
public Consumer<KStream<String, String>> deliveryIncoming() {
    return input ->
        input.foreach((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
EDIT
Sorry, I think I was unclear. I just want to do the following:
produce(deliveryIncomingtopic) -> Kafka -> consumer(deliveryIncomingtopic)
If that's the case, then you need to change your bean function definition to return a java.util.function.Function instead of a java.util.function.Consumer.
@Bean
public Function<KStream<String, String>, KStream<String, String>> deliveryIncoming() {
    // peek (unlike foreach) is not terminal, so the stream can be returned
    return input ->
        input.peek((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
However, AFAIK, you still need to define the output binding in your application.yml. You can use the same name with a different suffix, something like below:
deliveryIncoming-in-0:
  destination: <your_topic_name>
deliveryIncoming-out-0:
  destination: <your_topic_name>
Just to make things clear here: are you looking to consume a message from your inbound topic deliveryIncomingtopic and then produce another message to a different output topic?
If that's your question, then I believe you are missing something in your application.yml: you need another configuration for your output topic.
Since you're using Spring Cloud Function (based on what I see in your application.yml), you should add more configuration for the output topic as follows:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming;deliveryOutput
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
        deliveryOutput-out-0:
          destination: deliveryOutput
And also define another bean for your producer function (a Supplier; there is no Producer functional interface in java.util.function):

@Bean
public Supplier<KStream<String, String>> deliveryOutput() {
    // do your necessary logic here to outbound your message
}
Hope this will match your expectation.
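Since the asker mentioned StreamBridge "doesn't work", here is a minimal hedged sketch of how it is usually wired for the produce-then-consume flow described in the EDIT. The deliveryOutgoing-out-0 binding name is an assumption and must match an entry under spring.cloud.stream.bindings pointing at deliveryIncomingtopic:

```java
@Bean
public CommandLineRunner commandLineRunner(StreamBridge streamBridge) {
    return args ->
        // Sends to whatever destination is configured for deliveryOutgoing-out-0,
        // e.g. spring.cloud.stream.bindings.deliveryOutgoing-out-0.destination=deliveryIncomingtopic
        streamBridge.send("deliveryOutgoing-out-0", "hello from the producer app");
}
```

If the consumer application still sees nothing, it is worth confirming (e.g. with kafka-console-consumer) that the messages actually reach the topic, to separate a producer problem from a consumer one.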

Spring Boot and Kotlin: Dynamic Kafka topic listeners

I am writing a service with Spring Boot and Kotlin. The service should listen to Kafka topics, and I am using Spring Cloud Stream for that.
What's working is having a hardcoded topic to listen to. For example, in application.yaml I define the topic:
spring:
  cloud:
    stream:
      default:
        contentType: application/*+avro
        group: my-consumer-group
        consumer:
          useNativeDecoding: false
      # Binding-specific configs (Kafka agnostic)
      bindings:
        my_topic:
          # Topic to consume from
          destination: my_topic
and then access it like:
interface MyTopicSink {
    @Input(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "my_topic" // (From `application.yaml`)
    }
}

/**
 * Reads from my_topic Kafka topics.
 */
@Service
@EnableBinding(MyTopicSink::class)
class MyFancyConsumer() {
    /**
     * Listens & consumes from the my_topic.
     */
    @StreamListener(MyTopicSink.INPUT)
    fun processTopic(
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) key: String,
        @Payload payload: org.apache.avro.generic.GenericData.Record
    ) {
        log.info("Yaay, this is working")
    }
}
That's great and all, but does anyone know if I can make this dynamic? For example, one idea is to have a list of topics defined somewhere and have this consumer read from all of them. Has anyone done something similar? Thoughts?
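One binder feature that may cover the "list of topics" idea, assuming all topics share the same payload handling: the Kafka binder accepts a comma-separated destination, so a single consumer binding can subscribe to several topics. The extra topic names below are placeholders:

```yaml
spring:
  cloud:
    stream:
      bindings:
        my_topic:
          # A single consumer binding can subscribe to several topics
          destination: my_topic,my_other_topic,yet_another_topic
```

This keeps the list in configuration rather than code, which fits the "defined somewhere" requirement without any custom binding logic.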

Spring Cloud Stream Kafka send message

How can I send a message with the new Spring Cloud Stream Kafka functional model?
The deprecated way looked like this:
public interface OutputTopic {
    @Output("output")
    MessageChannel output();
}

@Autowired
OutputTopic outputTopic;

public void someMethod() {
    outputTopic.output().send(MessageBuilder.withPayload("foo").build());
}
But how can I send a message in the functional style?
application.yml
spring:
  cloud:
    function:
      definition: process
    stream:
      bindings:
        process-out-0:
          destination: output
          binder: kafka
@Configuration
public class Configuration {

    @Bean
    Supplier<Message<String>> process() {
        return () -> MessageBuilder.withPayload("foo")
                .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                .build();
    }
}
I would autowire a MessageChannel, but there is no MessageChannel bean for process, process-out-0, output, or anything like that. Or can I send a message with a Supplier bean?
Could someone please give me an example?
Thanks a lot!
You can either use the StreamBridge or the reactor API - see Sending arbitrary data to an output (e.g. Foreign event-driven sources)
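A minimal sketch of the StreamBridge option, assuming the process-out-0 binding from the question's application.yml (the MessageSender service class itself is made up):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

@Service
public class MessageSender {

    private final StreamBridge streamBridge;

    public MessageSender(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void send(String payload) {
        // Sends on demand to the destination configured for process-out-0
        streamBridge.send("process-out-0",
                MessageBuilder.withPayload(payload)
                        .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                        .build());
    }
}
```

Unlike a Supplier, which is polled by the framework, StreamBridge lets arbitrary imperative code (a REST controller, a scheduled job) trigger the send.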

How to read Kafka Message Key from Spring cloud streams?

I am using Spring Cloud Stream to consume messages from Kafka.
Is it possible to read the Kafka message key from the code?
I have a Kafka topic that generally carries two types of messages, and the action to take depends on the message key. The Spring documentation only shows the following way to read a message; here I have to specify the concrete payload type (the Greetings class). However, I need a way to read the message key first and then decide which POJO to deserialize into.
public class GreetingsListener {

    @StreamListener(GreetingsProcessor.INPUT)
    public void handleGreetings(@Payload Greetings request) {
    }
}
You can try something like this:
@StreamListener(GreetingsProcessor.INPUT)
public void handleGreetings(@Payload Greetings request, @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) {
}
You need to provide a proper deserializer for the key. For e.g. if your key is String, then you can provide:
spring.cloud.stream.kafka.binder.configuration.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
If you need a different key deserializer for different input channels, this setting can be placed under the consumer section of each Kafka binding. For example:
spring:
  cloud:
    stream:
      kafka:
        bindings:
          <channel_name>:
            consumer:
              startOffset: latest
              autoCommitOffset: true
              autoCommitOnError: true
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
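The key-then-deserialize idea in the answer can be sketched without any Spring machinery. Below, the key names, the Payload marker interface, and the record types are all hypothetical stand-ins; in a real listener the raw payload would arrive via @Payload String and the key via the header shown above:

```java
import java.util.Map;
import java.util.function.Function;

public class KeyDispatchSketch {

    interface Payload {}
    record Greeting(String text) implements Payload {}
    record Farewell(String text) implements Payload {}

    // One parser per known key; unknown keys are rejected explicitly
    static final Map<String, Function<String, Payload>> PARSERS = Map.of(
            "greeting", Greeting::new,
            "farewell", Farewell::new);

    static Payload dispatch(String key, String rawPayload) {
        Function<String, Payload> parser = PARSERS.get(key);
        if (parser == null) {
            throw new IllegalArgumentException("Unknown message key: " + key);
        }
        return parser.apply(rawPayload);
    }

    public static void main(String[] args) {
        // prints "Greeting"
        System.out.println(dispatch("greeting", "hello").getClass().getSimpleName());
    }
}
```

In a real application the parsers would be JSON/Avro deserializers rather than record constructors, but the key-to-type routing stays the same.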