Spring Cloud Stream with Kafka binder - spring-boot

I am new to Spring Cloud and I'm looking for a simple, complete example of Spring Cloud Stream using the Kafka broker.

You can refer to the samples repo here; if a given sample application uses a binder other than Kafka, just switch it to the Kafka binder.
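For orientation, here is a minimal functional-style sketch of such an application, assuming spring-cloud-stream and the Kafka binder are on the classpath. The topic name my-topic and the binding name consume-in-0 (derived from the function name by convention) are illustrative:

    import java.util.function.Consumer;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class SampleApplication {

        public static void main(String[] args) {
            SpringApplication.run(SampleApplication.class, args);
        }

        // Bound as "consume-in-0" by the functional binding convention; point it
        // at a topic with: spring.cloud.stream.bindings.consume-in-0.destination=my-topic
        @Bean
        public Consumer<String> consume() {
            return message -> System.out.println("Received: " + message);
        }
    }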

Related

Spring Cloud Stream PAUSE/RESUME Kafka binders

In a Spring Cloud Stream application with Kafka binders, I am trying to PAUSE/RESUME the input binders. I searched, and all of the sample solutions suggest using BindingsEndpoint.
References:
https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/2.2.0.M1/spring-cloud-stream-binder-kafka.html
Spring cloud stream kafka pause/resume binders
Using the BindingsEndpoint works in Spring Integration annotation-based configuration. Is there any way to use this feature in functional-style programming, where the Kafka listener is a Consumer<T> bean?
Thanks in advance!
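For context, the solutions the question refers to typically look like the following sketch, assuming spring-boot-starter-actuator is on the classpath; the binding name consume-in-0 is illustrative. With the functional model, binding names follow the functionName-in-0 convention, so the same endpoint should be addressable by that name:

    import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
    import org.springframework.stereotype.Component;

    @Component
    public class BindingPauser {

        private final BindingsEndpoint bindingsEndpoint;

        public BindingPauser(BindingsEndpoint bindingsEndpoint) {
            this.bindingsEndpoint = bindingsEndpoint;
        }

        // Pauses the consumer behind the given binding.
        public void pause() {
            bindingsEndpoint.changeState("consume-in-0", BindingsEndpoint.State.PAUSED);
        }

        // Resumes it again.
        public void resume() {
            bindingsEndpoint.changeState("consume-in-0", BindingsEndpoint.State.RESUMED);
        }
    }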

Get underlying low-level Kafka consumers and Producers in Spring Cloud Stream

I have a use case where I want to get the underlying Kafka producer (KafkaTemplate) in a Spring Cloud Stream application. While navigating the code I stumbled upon KafkaProducerMessageHandler, which has a getKafkaTemplate method. However, it fails to autowire.
Also, if I autowire KafkaTemplate directly, the template is initialized with default properties and ignores the broker configured under the binder key of the Spring Cloud Stream configuration.
How can I access the underlying KafkaTemplate or a producer/consumer in a Spring Cloud Stream app?
EDIT: Actually my SCSt app has multiple Kafka binders and I want to get the KafkaTemplate or Kafka producer corresponding to each binder. Is that possible somehow?
It's not entirely clear why you would need to do that, but you can capture the KafkaTemplates by adding a ProducerMessageHandlerCustomizer @Bean to the application context.
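A sketch of that approach (the map that stores the templates is illustrative). Because the customizer is invoked once per output binding with its destination name, this should also cover the multi-binder case from the EDIT:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import org.springframework.cloud.stream.config.ProducerMessageHandlerCustomizer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
    import org.springframework.kafka.core.KafkaTemplate;

    @Configuration
    public class TemplateCapturingConfiguration {

        // destination name -> the KafkaTemplate backing that output binding
        private final Map<String, KafkaTemplate<?, ?>> templates = new ConcurrentHashMap<>();

        @Bean
        public ProducerMessageHandlerCustomizer<KafkaProducerMessageHandler<?, ?>> templateCapturer() {
            return (handler, destinationName) ->
                    templates.put(destinationName, handler.getKafkaTemplate());
        }
    }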

Internal Channels in Spring Cloud Stream

I started developing a Spring Cloud Stream project. I successfully receive messages from Kafka through the @StreamListener annotation. Before sending the message to any output channel, I have to convert the payload by calling an external service or a DB call. I don't want to call the external service or DB method from the same stream listener method. My question is: can we create internal channels (like a Spring Integration DSL flow) in Spring Cloud Stream?
Yes, you can. In Spring Cloud Stream, only the channels for which you declare a binding are connected to the binder; any other channel behaves as a regular Spring Integration channel, so you can wire internal flows between them.
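A sketch of what that can look like with the annotation model from the question; the channel name enrichChannel and the enrichment logic are illustrative:

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Processor;
    import org.springframework.context.annotation.Bean;
    import org.springframework.integration.annotation.ServiceActivator;
    import org.springframework.integration.channel.DirectChannel;
    import org.springframework.messaging.MessageChannel;
    import org.springframework.messaging.handler.annotation.SendTo;

    @EnableBinding(Processor.class)
    public class InternalFlow {

        // An internal channel: it has no binding, so it never touches the binder.
        @Bean
        public MessageChannel enrichChannel() {
            return new DirectChannel();
        }

        @StreamListener(Processor.INPUT)
        @SendTo("enrichChannel")
        public String receive(String payload) {
            return payload;
        }

        // A plain Spring Integration endpoint on the internal channel; do the
        // external service or DB call here, then forward to the bound output.
        @ServiceActivator(inputChannel = "enrichChannel", outputChannel = Processor.OUTPUT)
        public String enrich(String payload) {
            return payload.toUpperCase(); // stand-in for the real conversion
        }
    }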

Producer Exception Handler in Spring Cloud Stream

I am facing a scenario in my Kafka Streams job, written with Spring Cloud Stream, where the producer fails with an exception.
Kafka Streams itself has a config to handle this:
default.production.exception.handler
I am unable to find the corresponding handler in Spring Cloud Stream.
Kindly help. Thanks in advance.
Spring Boot version: 2.2.4.RELEASE
You can set any arbitrary Kafka Streams property, including default.production.exception.handler, using the spring.cloud.stream.kafka.streams.binder.configuration property. See the documentation. So just add it there.
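In application.properties that would look like the line below, where com.example.MyProductionExceptionHandler is a hypothetical class implementing org.apache.kafka.streams.errors.ProductionExceptionHandler:

    spring.cloud.stream.kafka.streams.binder.configuration.default.production.exception.handler=com.example.MyProductionExceptionHandler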

Expose kafka stream metrics with spring actuator (prometheus)

I am running a Kafka Streams app with Spring Boot 2.
I would like to have my Kafka Streams metrics available in Prometheus format at host:8080/actuator/prometheus.
I haven't managed to get this working, and I am not sure I understand how Kafka Streams metrics are exported.
Can actuator pick up these JMX metrics?
Is there a way to get these metrics and expose them in Prometheus format?
PS: it didn't work with the jmx_prometheus_agent Java agent either.
Does anyone have a solution or an example?
Thank you
You can produce all available Kafka Streams metrics (the same as from KafkaStreams.metrics()) into Prometheus using the micrometer-core and spring-kafka libraries. To integrate Kafka Streams with Micrometer, declare a KafkaStreamsMicrometerListener bean:
    @Bean
    KafkaStreamsMicrometerListener kafkaStreamsMicrometerListener(MeterRegistry meterRegistry) {
        return new KafkaStreamsMicrometerListener(meterRegistry);
    }
where MeterRegistry is from the micrometer-core dependency.
If you create Kafka Streams using StreamsBuilderFactoryBean from spring-kafka, then you need to add the listener to it:
    streamsBuilderFactoryBean.addListener(kafkaStreamsMicrometerListener);
And if you create KafkaStreams objects directly, then on each KafkaStreams object you need to invoke
    kafkaStreamsMicrometerListener.streamsAdded(beanId, kafkaStreams);
where beanId is any unique identifier per KafkaStreams object.
As a result, Kafka Streams provides multiple useful Prometheus metrics, like kafka_consumer_coordinator_rebalance_latency_avg, kafka_stream_thread_task_closed_rate, etc. KafkaStreamsMicrometerListener under the hood uses KafkaStreamsMetrics.
If you want Grafana graphs of these Prometheus metrics, add them as the Gauge metric type.
I don't have a complete example, but the metrics are readily accessible and documented in the Confluent documentation on Monitoring Kafka Streams.
Alternatively, you could skip actuator and use a @RestController from Spring Web along with KafkaStreams#metrics() to publish exactly what you need.
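That alternative could look roughly like this sketch, assuming a KafkaStreams instance is available for injection (for example, obtained from a StreamsBuilderFactoryBean); the endpoint path is illustrative:

    import java.util.Map;
    import java.util.stream.Collectors;

    import org.apache.kafka.streams.KafkaStreams;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class StreamsMetricsController {

        private final KafkaStreams kafkaStreams;

        public StreamsMetricsController(KafkaStreams kafkaStreams) {
            this.kafkaStreams = kafkaStreams;
        }

        // Flattens KafkaStreams#metrics() into a simple "group:name" -> value map.
        @GetMapping("/streams-metrics")
        public Map<String, Object> metrics() {
            return kafkaStreams.metrics().entrySet().stream()
                    .collect(Collectors.toMap(
                            e -> e.getKey().group() + ":" + e.getKey().name(),
                            e -> e.getValue().metricValue(),
                            (first, second) -> first)); // ignore duplicate names
        }
    }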
