I am facing a scenario in my Kafka Streams job, written with Spring Cloud Stream, where the producer failed with an exception.
Kafka Streams has a config to handle this:
default.production.exception.handler
I am unable to find the corresponding handler in Spring Cloud Stream.
Kindly help. Thanks in advance.
Spring Boot version: 2.2.4.RELEASE
You can set any arbitrary Kafka Streams property, including default.production.exception.handler, using the
spring.cloud.stream.kafka.streams.binder.configuration
property. See the documentation. So just add it there.
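For example, a minimal sketch, assuming an application.properties-based setup: add the line spring.cloud.stream.kafka.streams.binder.configuration.default.production.exception.handler=com.example.LogAndContinueProductionHandler (the class name is illustrative), then implement Kafka's ProductionExceptionHandler contract:

```java
package com.example;

import java.util.Map;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.streams.errors.ProductionExceptionHandler;

// Illustrative handler: log the failed record and keep the stream running.
public class LogAndContinueProductionHandler implements ProductionExceptionHandler {

    @Override
    public ProductionExceptionHandlerResponse handle(ProducerRecord<byte[], byte[]> record,
                                                     Exception exception) {
        System.err.println("Failed to produce to " + record.topic() + ": " + exception);
        return ProductionExceptionHandlerResponse.CONTINUE; // or FAIL to stop the stream
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // no-op; required by the Configurable contract
    }
}
```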
Spring Cloud Stream Kafka pause/resume binders
In a Spring Cloud Stream application with Kafka binders, I am trying to PAUSE/RESUME the input bindings. All of the sample solutions I found suggest using BindingsEndpoint.
Reference: https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/2.2.0.M1/spring-cloud-stream-binder-kafka.html
Using the BindingsEndpoint works in Spring Integration annotation-based configuration. Is there any way to use this feature in functional-style programming, where the Kafka listener is a Consumer<T> bean?
Thanks in advance!
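For context, the BindingsEndpoint approach itself should carry over to the functional model; what changes is the binding name, which follows the <beanName>-in-<index> convention. A minimal sketch, assuming a Consumer<T> bean named process and the actuator bindings endpoint available:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
import org.springframework.stereotype.Component;

@Component
public class BindingPauser {

    @Autowired
    private BindingsEndpoint bindingsEndpoint;

    // A functional binding is named <beanName>-in-<index>, so a
    // Consumer<T> bean called "process" yields the binding "process-in-0".
    public void pause() {
        bindingsEndpoint.changeState("process-in-0", BindingsEndpoint.State.PAUSED);
    }

    public void resume() {
        bindingsEndpoint.changeState("process-in-0", BindingsEndpoint.State.RESUMED);
    }
}
```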
I have a use case where I want to get the underlying Kafka producer (KafkaTemplate) in a Spring Cloud Stream application. While navigating the code I stumbled upon KafkaProducerMessageHandler, which has a getKafkaTemplate method. However, it fails to auto-wire.
Also, if I directly auto-wire KafkaTemplate, the template is initialized with default properties and ignores the brokers configured under the binder key of the SCSt configuration.
How can I access the underlying KafkaTemplate or a producer/consumer in a Spring Cloud Stream app?
EDIT: Actually my SCSt app has multiple Kafka binders and I want to get the KafkaTemplate or Kafka producer corresponding to each binder. Is that possible somehow?
It's not entirely clear why you would need to do that, but you can capture the KafkaTemplates by adding a ProducerMessageHandlerCustomizer @Bean to the application context.
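A minimal sketch of that approach (class and field names are illustrative, and I'm assuming the customizer interface from Spring Cloud Stream's config package together with the KafkaProducerMessageHandler the question already mentions). The customizer is invoked once per output binding, so with multiple binders you can key the captured templates by binding name, which addresses the EDIT:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.cloud.stream.config.ProducerMessageHandlerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
public class TemplateCapturingConfig {

    // One template per output binding; with multiple binders, each
    // binding's handler carries the template built for that binder.
    private final Map<String, KafkaTemplate<?, ?>> templates = new ConcurrentHashMap<>();

    @Bean
    public ProducerMessageHandlerCustomizer<KafkaProducerMessageHandler<?, ?>> templateCapturer() {
        return (handler, destinationName) ->
                templates.put(destinationName, handler.getKafkaTemplate());
    }

    public KafkaTemplate<?, ?> templateFor(String destination) {
        return templates.get(destination);
    }
}
```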
I started developing a Spring Cloud Stream project. I successfully receive messages from Kafka through the @StreamListener annotation. Before sending the message to any output channel, I have to convert the payload by calling an external service or making a DB call. I don't want to call the external service or DB method from the same StreamListener method. My question is: can we create internal channels (like a Spring Integration DSL flow) in Spring Cloud Stream?
Yes, you can. In Spring Cloud Stream, a channel is only bound to the binder when you configure a binding for it, so any additional channels you declare stay internal to the application and can be wired together just as in a Spring Integration DSL flow.
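A minimal sketch of that idea under the annotation model the question uses (channel and method names are illustrative): the listener hands the payload to an unbound internal channel, and a @ServiceActivator performs the external call before publishing to the bound output:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;

@EnableBinding(Processor.class)
public class InternalChannelFlow {

    // No binding is configured for this channel, so it never touches
    // the binder and stays internal to the application.
    @Bean
    public MessageChannel enrichChannel() {
        return new DirectChannel();
    }

    @Autowired
    @Qualifier("enrichChannel")
    private MessageChannel enrichChannel;

    @StreamListener(Processor.INPUT)
    public void receive(String payload) {
        enrichChannel.send(MessageBuilder.withPayload(payload).build());
    }

    // The external-service/DB call lives here, out of the listener;
    // the return value is published to the bound output channel.
    @ServiceActivator(inputChannel = "enrichChannel", outputChannel = Processor.OUTPUT)
    public String enrich(String payload) {
        return payload.toUpperCase(); // placeholder for the real transformation
    }
}
```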
I am designing a real-time Kafka consuming job and am deciding between Spring Boot Batch and plain Spring Boot.
Kafka version: 2.11-1.1.1
JDK: 1.8
Which is better?
1. Spring Boot Batch + Kafka
2. Spring Boot + Kafka
And please tell me why :)
If you are thinking of creating jobs and managing them through Spring Cloud Data Flow later, go with #1. If your use case is just to keep consuming Kafka messages as they arrive on a topic, use Spring Boot with Spring Kafka (#2). It totally depends on your use case.
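For illustration, a minimal sketch of option #2 with Spring Boot and Spring Kafka (topic and group names are made up; the broker address comes from spring.kafka.bootstrap-servers):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;

@SpringBootApplication
public class ConsumerApp {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApp.class, args);
    }

    // Invoked for every record that arrives on the topic.
    @KafkaListener(topics = "events", groupId = "realtime-consumer")
    public void onMessage(String payload) {
        System.out.println("Received: " + payload);
    }
}
```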
I am new to Spring Cloud and I'm searching for a simple, complete example of Spring Cloud Stream using the Kafka broker.
You can refer to the Spring Cloud Stream samples repo and switch a sample application to the Kafka binder if it uses a different binder.
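As a starting point, a minimal self-contained sketch in the functional style (bean and topic names are illustrative); with spring-cloud-stream-binder-kafka on the classpath, the Consumer bean below is bound to a Kafka topic through the consume-in-0 binding:

```java
import java.util.function.Consumer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class KafkaStreamApp {

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamApp.class, args);
    }

    // Bound to the topic configured for the "consume-in-0" binding.
    @Bean
    public Consumer<String> consume() {
        return payload -> System.out.println("Received: " + payload);
    }
}
```

The topic can be set with spring.cloud.stream.bindings.consume-in-0.destination=my-topic; without it, the binding name itself is used as the destination.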