Spring Cloud Stream PAUSE/RESUME Kafka binders

In a Spring Cloud Stream application with Kafka binders, I am trying to PAUSE/RESUME the input bindings. All the sample solutions I found suggest using BindingsEndpoint.
References:
https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/2.2.0.M1/spring-cloud-stream-binder-kafka.html
Spring cloud stream kafka pause/resume binders
Using the BindingsEndpoint works with Spring Integration annotation-based configuration. Is there any way to use this feature in functional-style programming, where the Kafka listener is a Consumer<T> bean?
Thanks in advance!
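For what it's worth, a minimal sketch of doing this in the functional model, assuming a Consumer bean named myConsumer (so the input binding is named myConsumer-in-0) and that the BindingsEndpoint bean is available (it is auto-configured when Spring Boot actuator is on the classpath):

import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
import org.springframework.stereotype.Component;

@Component
class BindingControl {

    private final BindingsEndpoint bindings;

    BindingControl(BindingsEndpoint bindings) {
        this.bindings = bindings;
    }

    // Functional bindings are named <beanName>-in-<index>;
    // "myConsumer-in-0" is a placeholder for your own binding name.
    void pause() {
        bindings.changeState("myConsumer-in-0", BindingsEndpoint.State.PAUSED);
    }

    void resume() {
        bindings.changeState("myConsumer-in-0", BindingsEndpoint.State.RESUMED);
    }
}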

Related

Get underlying low-level Kafka consumers and producers in Spring Cloud Stream

I have a use case where I want to get the underlying Kafka producer (KafkaTemplate) in a Spring Cloud Stream application. While navigating the code I stumbled upon KafkaProducerMessageHandler, which has a getKafkaTemplate method. However, it fails to auto-wire.
Also, if I auto-wire KafkaTemplate directly, the template is initialized with default properties and ignores the brokers configured under the binder key of the SCSt configuration.
How can I access the underlying KafkaTemplate or a producer/consumer in a Spring Cloud Stream app?
EDIT: Actually my SCSt app has multiple Kafka binders and I want to get the KafkaTemplate or Kafka producer corresponding to each binder. Is that possible somehow?
It's not entirely clear why you would need to do that, but you can capture the KafkaTemplates by adding a ProducerMessageHandlerCustomizer @Bean to the application context.
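A rough sketch of that approach, not a definitive implementation: it assumes spring-cloud-stream's ProducerMessageHandlerCustomizer is invoked once per output binding with the destination name, which also covers the multi-binder EDIT, since each binding's template is captured separately:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.cloud.stream.config.ProducerMessageHandlerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
class KafkaTemplateCapture {

    // destination name -> the KafkaTemplate the binder created for that binding
    private final Map<String, KafkaTemplate<?, ?>> templates = new ConcurrentHashMap<>();

    @Bean
    ProducerMessageHandlerCustomizer<KafkaProducerMessageHandler<?, ?>> templateCapturer() {
        return (handler, destinationName) ->
                templates.put(destinationName, handler.getKafkaTemplate());
    }

    KafkaTemplate<?, ?> templateFor(String destination) {
        return templates.get(destination);
    }
}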

Producer Exception Handler in Spring Cloud Stream

I am facing a scenario in my Kafka Streams job, written with Spring Cloud Stream, where the producer fails with an exception.
Kafka Streams has a config to handle this:
default.production.exception.handler
I am unable to find the equivalent handler in Spring Cloud Stream.
Kindly help. Thanks in advance.
Spring boot version: 2.2.4.RELEASE
You can set any arbitrary Kafka Streams property, including default.production.exception.handler, using the spring.cloud.stream.kafka.streams.binder.configuration property. See the documentation. So just add it there.
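For example, in application.properties, where com.example.CustomProductionExceptionHandler is a placeholder for your own org.apache.kafka.streams.errors.ProductionExceptionHandler implementation:

spring.cloud.stream.kafka.streams.binder.configuration.default.production.exception.handler=com.example.CustomProductionExceptionHandler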

Which is the better combination? Spring Boot with Kafka

I am designing a realtime Kafka consuming job and deciding between Spring Boot Batch and plain Spring Boot.
kafka version : 2.11-1.1.1
jdk : 1.8
Which is better?
1. Spring Boot Batch + Kafka
2. Spring Boot + Kafka
Please tell me why :)
If you are planning to create jobs and manage them through Spring Cloud Data Flow later, go with #1; if your use case is just to keep consuming Kafka messages as they arrive on a topic, use Spring Boot and Spring Kafka (#2). It totally depends on your use case.
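For illustration, a minimal sketch of option #2 (plain Spring Boot plus Spring Kafka); the topic name and group id are placeholders:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
class EventListener {

    // Spring Kafka runs the poll loop; this method is invoked for each record.
    @KafkaListener(topics = "events", groupId = "event-consumers")
    void onMessage(String payload) {
        System.out.println("received: " + payload);
    }
}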

Expose Kafka Streams metrics with Spring Actuator (Prometheus)

I am running a Kafka Streams app with Spring Boot 2.
I would like to have my Kafka Streams metrics available in Prometheus format at host:8080/actuator/prometheus.
I haven't managed to get this working, and I am not sure I understand how Kafka Streams metrics are exported.
Can actuator pick up these JMX metrics?
Is there a way to get these metrics and expose them in Prometheus format?
PS: it didn't work with the jmx_prometheus_agent Java agent either.
Does someone have a solution or an example?
Thank you
You can publish all available Kafka Streams metrics (the same ones as from KafkaStreams.metrics()) to Prometheus using the micrometer-core and spring-kafka libraries. To integrate Kafka Streams with Micrometer, define a KafkaStreamsMicrometerListener bean:
@Bean
KafkaStreamsMicrometerListener kafkaStreamsMicrometerListener(MeterRegistry meterRegistry) {
    return new KafkaStreamsMicrometerListener(meterRegistry);
}
where MeterRegistry is from micrometer-core dependency.
If you create Kafka Streams using StreamsBuilderFactoryBean from spring-kafka, then you need to add the listener to it:
streamsBuilderFactoryBean.addListener(kafkaStreamsMicrometerListener);
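In a Spring Boot application you usually don't call addListener by hand; one way, assuming a spring-kafka version that ships StreamsBuilderFactoryBeanConfigurer, is a configurer bean:

@Bean
StreamsBuilderFactoryBeanConfigurer metricsConfigurer(KafkaStreamsMicrometerListener listener) {
    // Invoked with the auto-configured StreamsBuilderFactoryBean before it starts.
    return factoryBean -> factoryBean.addListener(listener);
}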
And if you create KafkaStreams objects directly, then on each KafkaStreams object you need to invoke
kafkaStreamsMicrometerListener.streamsAdded(beanId, kafkaStreams);
where beanId is any unique identifier per KafkaStreams object.
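For instance (a sketch; topology and props are whatever you already build, and "my-streams" is an arbitrary id):

KafkaStreams kafkaStreams = new KafkaStreams(topology, props);
// Register with the listener before start() so the meters are bound from the beginning.
kafkaStreamsMicrometerListener.streamsAdded("my-streams", kafkaStreams);
kafkaStreams.start();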
As a result, Kafka Streams provides multiple useful Prometheus metrics, like kafka_consumer_coordinator_rebalance_latency_avg, kafka_stream_thread_task_closed_rate, etc. KafkaStreamsMicrometerListener under the hood uses KafkaStreamsMetrics.
If you want Grafana graphs of these Prometheus metrics, add them as the Gauge metric type.
I don't have a complete example, but the metrics are readily accessible and documented in Confluent's documentation on Monitoring Kafka Streams.
Maybe dismiss actuator and use a @RestController from Spring Web along with KafkaStreams#metrics() to publish exactly what you need.
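A rough sketch of that alternative, assuming a KafkaStreams bean is injectable; note that metric names repeat across threads and tasks, so a real endpoint would also key by tags:

import java.util.Map;
import java.util.stream.Collectors;

import org.apache.kafka.streams.KafkaStreams;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
class StreamsMetricsController {

    private final KafkaStreams kafkaStreams;

    StreamsMetricsController(KafkaStreams kafkaStreams) {
        this.kafkaStreams = kafkaStreams;
    }

    @GetMapping("/streams-metrics")
    Map<String, Object> metrics() {
        // Simplified: on name collisions (same metric from several threads) keep the first value.
        return kafkaStreams.metrics().entrySet().stream()
                .collect(Collectors.toMap(
                        e -> e.getKey().group() + ":" + e.getKey().name(),
                        e -> e.getValue().metricValue(),
                        (a, b) -> a));
    }
}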

Spring Cloud Stream with Kafka binder

I am new to Spring Cloud and I'm searching for a simple, complete example of Spring Cloud Stream using a Kafka broker.
You can refer to the Spring Cloud Stream samples repository and switch the sample application to the Kafka binder if it uses a binder other than Kafka.
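As a starting point, a minimal functional-style sketch (not taken from the samples repo); the consumer is bound to a Kafka topic via the spring.cloud.stream.bindings.logMessage-in-0.destination property:

import java.util.function.Consumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    // Spring Cloud Stream binds this Consumer to the topic configured under
    // spring.cloud.stream.bindings.logMessage-in-0.destination.
    @Bean
    public Consumer<String> logMessage() {
        return message -> System.out.println("received: " + message);
    }
}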