I am not using the Spring Kafka module to produce and consume messages. Instead, I am using the Apache Kafka client library with my own producer and consumer implementations. Since I am not using Spring Kafka, the Spring Sleuth auto-configuration is not applied to generate traces. I have referred to https://docs.spring.io/spring-cloud-sleuth/docs/current-SNAPSHOT/reference/html/integrations.html but I can't find any documentation on how to apply Spring Sleuth to code that uses third-party libraries.
If you're not using Spring, then don't add it to non-Spring code.
The traces in Sleuth are implemented using the Brave Kafka instrumentation.
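If you need traces on hand-built clients, one option (my assumption, not something the Sleuth docs spell out for this case) is to wrap them with Brave's kafka-clients instrumentation, reusing the brave.Tracing instance that Sleuth auto-configures. A minimal sketch:

```java
import brave.Tracing;
import brave.kafka.clients.KafkaTracing;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.producer.Producer;

final class TracedKafkaClients {

    // `tracing` is assumed to be the brave.Tracing bean Sleuth auto-configures;
    // inject it wherever you build your plain Kafka clients.
    static <K, V> Producer<K, V> tracedProducer(Tracing tracing, Producer<K, V> plain) {
        // The wrapper injects B3 trace headers into every outgoing record.
        return KafkaTracing.create(tracing).producer(plain);
    }

    static <K, V> Consumer<K, V> tracedConsumer(Tracing tracing, Consumer<K, V> plain) {
        // The wrapper continues traces from the headers of polled records.
        return KafkaTracing.create(tracing).consumer(plain);
    }
}
```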
I am looking for examples of complex Spring Boot and Apache Kafka topics/scenarios. Whatever I found on the web was very basic, and the demos were all similar.
Does anyone have a Spring Boot and Apache Kafka example?
I was also looking for solid examples, and I could not find anything concrete beyond "hello world"... Then I realized I was overlooking Confluent's documentation.
How to Work with Apache Kafka in Your Spring Boot Application
This is literally "hello world" but might be a good warm-up for beginners.
Spring for Apache Kafka Deep Dive – Part 1: Error Handling, Message Conversion and Transaction Support
Spring for Apache Kafka Deep Dive – Part 2: Apache Kafka and Spring Cloud Stream
Spring for Apache Kafka Deep Dive – Part 3: Apache Kafka and Spring Cloud Data Flow
Spring for Apache Kafka Deep Dive – Part 4: Continuous Delivery of Event Streaming Pipelines
I think the Confluent blog is a real hidden gem for developers. There are lots of cool topics there besides examples/tutorials. To name a few:
Spring for Apache Kafka – Beyond the Basics: Can Your Kafka Consumers Handle a Poison Pill?
Advanced Testing Techniques for Spring for Apache Kafka
How to Use Schema Registry and Avro in Spring Boot Applications
Please refer to this tutorial; it walks through the steps.
https://howtodoinjava.com/kafka/spring-boot-with-kafka/
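In short, the core of that tutorial boils down to something like this (a sketch; the topic and group names are made up):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class GreetingService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public GreetingService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publish via the KafkaTemplate that Spring Boot auto-configures.
    public void send(String message) {
        kafkaTemplate.send("greetings", message);
    }

    // Consume from the same topic; the listener container is auto-configured too.
    @KafkaListener(topics = "greetings", groupId = "greetings-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```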
In a few words:
I'm trying to decide between the default Spring for Apache Kafka stack (KafkaTemplate) and the pair ReactiveKafkaProducerTemplate and ReactiveKafkaConsumerTemplate for my Reactor-based application.
Some more context:
In the company where I work, we're developing a high-availability application that publishes a set of requests directly to a Kafka broker. Since this is an API-centric application expected to receive a few million requests per week, we decided to go with a stack based on Project Reactor with Spring WebFlux and Kotlin.
After doing some digging, I discovered that Spring for Apache Kafka has a simple wrapper around the Reactor Kafka implementation, but this wrapper lacks a lot of the functionality present in the default KafkaTemplate mentioned before: a metrics binder out of the box (for Prometheus integration), associated factories, extensive documentation, auto-configuration, etc.
I'm trying to understand what I'm really giving up when using the default implementation in favor of the reactive one. Am I giving up back pressure functionality? Am I sacrificing the reactive stack present in my application? Will this take a toll in the future? Does anyone have experience working with a reactive stack alongside a non-reactive solution?
I also have a few concerns regarding the DLT flow facilitated by the default implementation, such as the SeekToCurrentErrorHandler strategy.
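For context, here is roughly what the reactive side looks like when set up by hand, since unlike KafkaTemplate there is no auto-configuration for it (a sketch; the broker address and topic name are made up):

```java
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.reactive.ReactiveKafkaProducerTemplate;
import reactor.kafka.sender.SenderOptions;

public class ReactiveProducerSketch {

    public static void main(String[] args) {
        Map<String, Object> props = Map.of(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        // No associated factory or auto-configuration: you build SenderOptions yourself.
        ReactiveKafkaProducerTemplate<String, String> template =
                new ReactiveKafkaProducerTemplate<>(SenderOptions.<String, String>create(props));

        // send() returns a Mono, so publishing composes with the rest of a WebFlux
        // pipeline and participates in Reactor's back pressure.
        template.send("requests", "payload")
                .doOnSuccess(result -> System.out.println(result.recordMetadata()))
                .block();
    }
}
```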
In my corporate project I am using Spring Boot and the Apache ActiveMQ 5.x Spring Boot starter. I am a total beginner at this.
My goal is to expose Prometheus endpoint with some JMS queue metrics:
number of messages in queue
number of messages in error queue
What are dedicated tools for retrieving such metrics? Up to now I have found two possible ways. Can anyone confirm which of these two tools can solve my problem?
https://docs.spring.io/spring-integration/docs/5.1.7.RELEASE/reference/html/#system-management-chapter
https://activemq.apache.org/components/artemis/documentation/latest/metrics.html (here the example is not very helpful)
I don't think the Spring stuff will work because that will provide Spring-related metrics from the application itself, not the ActiveMQ broker.
Also, the documentation for ActiveMQ you cited is for ActiveMQ Artemis. However, the dependency you're using is for ActiveMQ 5.x, so that documentation is not applicable. If you choose to use ActiveMQ Artemis, it is very simple to expose a Prometheus endpoint using this Prometheus metrics plugin implementation. It's worth noting that Artemis is ActiveMQ's next-generation message broker. If you're starting a new project, I would recommend you use it rather than 5.x. Artemis is planned to replace 5.x and become ActiveMQ 6.0 in the future.
I think your best bet would be to configure the Prometheus JMX exporter. It even has a sample configuration for ActiveMQ 5.x.
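The exporter is driven by a YAML file of MBean-matching rules; the rule below is only an illustration of the idea (the actual ActiveMQ sample config in the exporter repo is more complete), matching ActiveMQ 5.x's per-queue QueueSize attribute:

```yaml
# Illustrative jmx_exporter rule for ActiveMQ 5.x queue depth.
# MBean: org.apache.activemq:type=Broker,brokerName=...,destinationType=Queue,destinationName=...
rules:
  - pattern: org.apache.activemq<type=Broker, brokerName=(\S+), destinationType=Queue, destinationName=(\S+)><>QueueSize
    name: activemq_queue_size
    labels:
      destination: "$2"
    help: "Number of messages on the queue"
    type: GAUGE
```

Since the DLQ in ActiveMQ 5.x is just another queue (ActiveMQ.DLQ by default), your "number of messages in the error queue" falls out of the same rule.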
ActiveMQ comes with Jolokia bundled by default for exposing JMX beans for the JVM, queues, and a bunch of other metrics over HTTP. That way you can easily export them using software like Telegraf, which comes with a simple input plugin for ActiveMQ and a simple output plugin for Prometheus.
I have a project with Spring Boot 2.1.9, Spring Kafka, and Spring Sleuth (2.1.6).
All was going well until I reached the point of tracing messages to/from Kafka.
Kafka messaging is done through:
kafkaTemplate.send(uri, pojo)
And here I realized that no tracing is injected into the Kafka messaging: I debugged doSend in KafkaTemplate and printed the message received by the @KafkaListener (looking for keys from brave...KafkaKeys), and never saw any sign of tracing.
As I understand from the docs, this messaging instrumentation is enabled by default (it's not that I have disabled "messaging" or "integration").
I tried registering custom bean implementations of "Propagation.Setter" just to see if it's actually being called, but never saw it invoked.
Additional note: I found that org.apache.kafka...KafkaProducer (from spring-kafka 2.2.9) is used instead of Sleuth's.
What am I missing?
We are using plain Spring AMQP in our spring boot projects.
We want to make sure that our message consumers can test against real messages and avoid testing against static test messages.
Thus our producers could generate message snippets in a test phase that are picked up by the consumer tests, making sure the consumer always tests against the latest message version and revealing whether changes in the producer break the consumer.
It seems like Spring Cloud Contract does exactly that. So is there a way to integrate spring cloud contract with spring amqp? Any hints in which direction to go would be highly appreciated.
Actually, we don't support it out of the box, but you can set it up yourself. In the autogenerated tests we use an interface to receive and send messages, so you could implement your own class that uses spring-amqp. The same goes for the consumer side (the stub runner). What you would need to do is implement and register a bean of type org.springframework.cloud.contract.verifier.messaging.MessageVerifier for both producer and consumer. This should work because what we do in the autogenerated tests is @Inject MessageVerifier, so if you register your own bean it will work.
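A sketch of what such a bean could look like on top of spring-amqp; the exact MessageVerifier generic type and method set vary between Spring Cloud Contract versions, so treat the signatures below as assumptions to check against your version. You would register it as a @Bean in the test configuration on both sides.

```java
import java.util.Map;
import java.util.concurrent.TimeUnit;

import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.MessageBuilder;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.cloud.contract.verifier.messaging.MessageVerifier;

public class SpringAmqpMessageVerifier implements MessageVerifier<Message> {

    private final RabbitTemplate rabbitTemplate;

    public SpringAmqpMessageVerifier(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @Override
    public void send(Message message, String destination) {
        // Treats the destination as a routing key; adapt to your exchange layout.
        rabbitTemplate.send(destination, message);
    }

    @Override
    public <T> void send(T payload, Map<String, Object> headers, String destination) {
        // Naive payload serialization, good enough for a sketch.
        MessageBuilder builder = MessageBuilder.withBody(String.valueOf(payload).getBytes());
        headers.forEach(builder::setHeader);
        rabbitTemplate.send(destination, builder.build());
    }

    @Override
    public Message receive(String destination, long timeout, TimeUnit timeUnit) {
        return rabbitTemplate.receive(destination, timeUnit.toMillis(timeout));
    }

    @Override
    public Message receive(String destination) {
        return receive(destination, 5, TimeUnit.SECONDS);
    }
}
```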
UPDATE:
As @Mathias mentioned, AMQP support is already there in Spring Cloud Contract: https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_stub_runner_spring_amqp