How to use Spring Cloud Stream Kafka with Confluent Schema Registry?

I am seeking a simple working example that uses Spring Cloud Stream Kafka with the Confluent Schema Registry (producer and consumer). I followed the Spring Cloud Stream reference guide and added the following code, but it didn't work. Can anyone guide me on how to achieve this? Many thanks!
@Bean
public SchemaRegistryClient schemaRegistryClient(
        @Value("${spring.cloud.stream.schemaRegistryClient.endpoint}") String endpoint) {
    // Confluent-compatible client, pointed at the configured registry endpoint
    ConfluentSchemaRegistryClient client = new ConfluentSchemaRegistryClient();
    client.setEndpoint(endpoint);
    return client;
}

I tried this schema registry sample and it worked for me. All instructions are mentioned in the README file.

Here is a simple example that shows Spring Cloud Stream with Avro serialization. This example stubs and mocks out the schema registry.
Your bean looks correct; you just need a running schema registry and the spring.cloud.stream.schemaRegistryClient.endpoint property configured in application.properties (or application.yml).
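For example, with a local Confluent Schema Registry, which listens on port 8081 by default, the application.properties entry might look like this (the host and port here are assumptions for a local setup):
spring.cloud.stream.schemaRegistryClient.endpoint=http://localhost:8081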

Related

How to leverage Armeria's fantastic JSON to GRPC transcoding functions to springboot project

We have an existing Spring Boot project with a poor API management system, so we want to do something like grpc-gateway does, but without adding a sidecar to our existing services. We found that Armeria has a wonderful JSON-gRPC transcoding function. How do we leverage this in our existing Spring Boot project?
We found that Armeria has a wonderful JSON-gRPC transcoding function.
I guess a minimal example may look like the following:
final GrpcService grpcService =
        GrpcService.builder()
                   .addService(new MyGrpcService())
                   .enableHttpJsonTranscoding(true) // enable HTTP/JSON to gRPC transcoding
                   .build();
final ServerBuilder sb = Server.builder();
sb.service(grpcService)               // bind the gRPC service at its own routes
  .serviceUnder("/foo", grpcService); // and also expose it under the /foo prefix
final Server server = sb.build();
Runtime.getRuntime().addShutdownHook(new Thread(() -> server.stop().join()));
server.start().join(); // start the server and block until startup completes
How do we leverage this in our existing Spring Boot project?
Armeria also offers spring-boot integration. An example can be found in the following repository.
You can also ask on Slack or in the GitHub issues if you have any additional or follow-up questions.
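For illustration, here is a minimal sketch of that integration, assuming the Armeria Spring Boot starter is on the classpath: the starter picks up any ArmeriaServerConfigurator bean and applies it to the embedded Armeria server, so the transcoding gRPC service from the snippet above could be registered like this:
import com.linecorp.armeria.server.grpc.GrpcService;
import com.linecorp.armeria.spring.ArmeriaServerConfigurator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ArmeriaConfig {

    @Bean
    public ArmeriaServerConfigurator grpcTranscodingConfigurator() {
        // Reuse the same GrpcService setup as above, but let the starter
        // attach it to the embedded server instead of building one manually.
        return serverBuilder -> serverBuilder.service(
                GrpcService.builder()
                           .addService(new MyGrpcService())
                           .enableHttpJsonTranscoding(true)
                           .build());
    }
}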

Spring Cloud Stream PAUSE/RESUME Kafka binders

In a Spring Cloud Stream application with Kafka binders, I am trying to PAUSE/RESUME the input binders. All of the sample solutions I found suggest using BindingsEndpoint.
References:
https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/2.2.0.M1/spring-cloud-stream-binder-kafka.html
Spring cloud stream kafka pause/resume binders
Using the BindingsEndpoint works with annotation-based Spring Integration configuration. Is there any way to use this feature in functional-style programming, where the Kafka listener is a Consumer<T> bean?
Thanks in advance!
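For reference, a sketch of the BindingsEndpoint usage those solutions describe; the binding name "myConsumer-in-0" follows Spring Cloud Stream's functional naming convention for a Consumer<T> bean named myConsumer and is an assumption here:
@Autowired
private BindingsEndpoint bindingsEndpoint;

public void pauseInput() {
    // State is the enum accepted by BindingsEndpoint#changeState
    bindingsEndpoint.changeState("myConsumer-in-0", State.PAUSED);
}

public void resumeInput() {
    bindingsEndpoint.changeState("myConsumer-in-0", State.RESUMED);
}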

Mocking SchemaRegistryClient in stream processor Consumer

I have a Reactor-based Spring Boot Kafka stream processing app that I am writing integration tests for. I am using Spring's @EmbeddedKafka broker. It works great: it overrides the bootstrap broker URLs that get configured on my reactive processor's consumer and publisher. What I haven't figured out yet is how to deal with the schema registry for my processor when testing. I'm using Confluent's KafkaAvroSerializer and KafkaAvroDeserializer classes and have the schema.registry.url field configured in my Spring app configs to be injected into the Kafka properties. I'm using Confluent's MockSchemaRegistryClient for the test producer and consumer, but what I need is a way to inject this mock client into the actual consumer and producer in my stream processor code, and I see no way to do that. It almost seems like I need an embedded version of the schema registry to point them at, like the embedded broker. Our build pipeline does not support spinning up containers, otherwise I'd use Docker or Testcontainers. Has anyone else solved this already? Any help or suggestions appreciated.
I managed to figure this out. If you use a URL that begins with mock:// for your test's SerDes, and you override the schema.registry.url property in the @SpringBootTest annotation with the same mock URL, then your processor's consumer and producer will also pick up and use the mock schema registry client, and everything just works!
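A sketch of what that can look like on the test class; the scope name "test-scope" and the exact property key are assumptions that depend on how your app wires the schema registry URL into its Kafka properties:
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

@EmbeddedKafka
@SpringBootTest(properties = {
        "spring.kafka.properties.schema.registry.url=mock://test-scope"
})
class StreamProcessorIntegrationTest {
    // SerDes created by both the test and the processor resolve
    // mock://test-scope to the same in-memory MockSchemaRegistryClient.
}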

Expose kafka stream metrics with spring actuator (prometheus)

I am running a Kafka Streams app with Spring Boot 2.
I would like to have my Kafka Streams metrics available in the Prometheus format at host:8080/actuator/prometheus.
I haven't managed to get this working, and I am not sure I understand how Kafka Streams metrics are exported.
Can actuator pick up these JMX metrics?
Is there a way to get these metrics and expose them in the Prometheus format?
PS: it didn't work with the Java jmx_prometheus_agent either.
Does someone have a solution or an example?
Thank you
You can produce all available Kafka Streams metrics (the same ones returned by KafkaStreams.metrics()) into Prometheus using the micrometer-core and spring-kafka libraries. To integrate Kafka Streams with Micrometer, you can define a KafkaStreamsMicrometerListener bean:
@Bean
KafkaStreamsMicrometerListener kafkaStreamsMicrometerListener(MeterRegistry meterRegistry) {
    return new KafkaStreamsMicrometerListener(meterRegistry);
}
where MeterRegistry is from the micrometer-core dependency.
If you create Kafka Streams using the StreamsBuilderFactoryBean from spring-kafka, then you need to add the listener to it:
streamsBuilderFactoryBean.addListener(kafkaStreamsMicrometerListener);
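If you rely on Spring Boot's auto-configured factory bean, one way to register the listener is through a StreamsBuilderFactoryBeanConfigurer bean from spring-kafka (a sketch, assuming a recent spring-kafka version; the bean method name is illustrative):
@Bean
public StreamsBuilderFactoryBeanConfigurer micrometerListenerConfigurer(
        KafkaStreamsMicrometerListener listener) {
    // spring-kafka applies every configurer to the StreamsBuilderFactoryBean
    // before the KafkaStreams instance is started
    return factoryBean -> factoryBean.addListener(listener);
}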
And if you create KafkaStreams objects directly, then on each KafkaStreams object you need to invoke
kafkaStreamsMicrometerListener.streamsAdded(beanId, kafkaStreams);
where beanId is any unique identifier per KafkaStreams object.
As a result, Kafka Streams provides multiple useful Prometheus metrics, like kafka_consumer_coordinator_rebalance_latency_avg, kafka_stream_thread_task_closed_rate, etc. Under the hood, KafkaStreamsMicrometerListener uses KafkaStreamsMetrics.
If you want Grafana graphs of these Prometheus metrics, add them using the Gauge metric type.
I don't have a complete example, but the metrics are readily accessible and documented in Confluent's documentation on Monitoring Kafka Streams.
Maybe dismiss actuator and use a @RestController from Spring Web along with KafkaStreams#metrics() to publish exactly what you need.
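A sketch of that alternative, assuming your application exposes its KafkaStreams instance as a bean (the endpoint path and key format are arbitrary choices here):
import java.util.Map;
import java.util.stream.Collectors;
import org.apache.kafka.streams.KafkaStreams;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class StreamsMetricsController {

    private final KafkaStreams kafkaStreams;

    public StreamsMetricsController(KafkaStreams kafkaStreams) {
        this.kafkaStreams = kafkaStreams;
    }

    @GetMapping("/streams-metrics")
    public Map<String, Object> metrics() {
        // KafkaStreams#metrics() returns Map<MetricName, ? extends Metric>;
        // flatten it to "group:name" -> current value, keeping the first
        // entry when metrics differ only by tags
        return kafkaStreams.metrics().entrySet().stream()
                .collect(Collectors.toMap(
                        e -> e.getKey().group() + ":" + e.getKey().name(),
                        e -> e.getValue().metricValue(),
                        (first, second) -> first));
    }
}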

AWS kinesis consumer with Java and Spring

I want to write an AWS Kinesis stream consumer in a Spring Boot application, and I'm not sure whether Spring has native support for Kinesis or whether I have to use the Kinesis Client Library.
According to this blog post, org.springframework.integration:spring-integration-aws has it (a RELEASE version is available in the Maven repo). However, this example on GitHub uses org.springframework.cloud:spring-cloud-starter-stream-kinesis, which is available only in the Spring snapshots repo as 1.0.0.BUILD-SNAPSHOT.
EDIT: The question is, where can I find an example of KinesisMessageDrivenChannelAdapter?
It is not clear what the question is, though.
If you are looking for a sample, there is indeed none yet. Right now the solution we have in Spring is a Channel Adapter for Spring Integration, and that KinesisMessageDrivenChannelAdapter is exactly a consumer implementation for AWS Kinesis:
@SpringBootApplication
public class MyConfiguration {

    @Bean
    public KinesisMessageDrivenChannelAdapter kinesisInboundChannelAdapter(AmazonKinesis amazonKinesis) {
        KinesisMessageDrivenChannelAdapter adapter =
                new KinesisMessageDrivenChannelAdapter(amazonKinesis, "MY_STREAM");
        adapter.setOutputChannel(kinesisReceiveChannel());
        return adapter;
    }

    @Bean
    public MessageChannel kinesisReceiveChannel() {
        // Channel that receives records consumed from the Kinesis stream
        return new DirectChannel();
    }
}
The sample you found on GitHub is for Spring Cloud Stream and is based on the Kinesis Binder, which indeed is still under development.
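To actually consume the records the adapter emits, a Spring Integration endpoint can listen on that output channel (a sketch; the channel name matches the kinesisReceiveChannel bean defined above):
@ServiceActivator(inputChannel = "kinesisReceiveChannel")
public void handleKinesisRecord(Message<?> message) {
    // Payload is the record data consumed from the MY_STREAM Kinesis stream
    System.out.println("Received: " + message.getPayload());
}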
