JAX-RS and AMQP Zipkin integration - Spring

I've been roaming the depths of the internet, but I remain unsatisfied by the examples I've found so far. Can someone point me to, or show me, a good starting point for integrating Zipkin tracing with JAX-RS clients and AMQP clients?
My scenario is quite simple, and I'd honestly expect this task to be trivial. We have a microservices-based architecture, and it's time we start tracing our requests to get a global perspective of our inter-service dependencies and of what the requests actually look like (we do have metrics, but I want more!). Communication is done via auto-generated JAX-RS clients, and we use RabbitTemplate for messaging.
I've seen Brave integrations with JAX-RS, but they are a bit simplistic. My Zipkin server is a Spring Boot mini app using stream-rabbit, so Zipkin data is sent over RabbitMQ.
Thanks in advance.

After some discussion with Marcin Grzejszczak and Adrian Cole (Zipkin and Sleuth creators/active developers), I ended up creating a Jersey filter that acts as a bridge between Sleuth and Brave. Regarding AMQP integration, I added a new @StreamListener with a condition for Zipkin-format spans (based on headers). Messages sent to the Sleuth exchange in Zipkin format are then accepted and consumed by that listener. For JavaScript (zipkin-js), I ended up creating a new AMQP logger that sends Zipkin spans to a designated exchange. If someone ends up reading this and needs more detail, you're welcome to reach out to me.
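For anyone curious, a minimal sketch of such a conditional listener is below. This is not the original code: the Sink binding, the "format" header name, and the injected span consumer are illustrative assumptions, and deserializing the payload into zipkin2.Span objects would need an appropriate message converter.

import java.util.List;
import java.util.function.Consumer;

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Hypothetical listener: only messages whose "format" header equals "zipkin" are
// routed here; Sleuth-format spans keep flowing to the default Sleuth listener.
@EnableBinding(Sink.class)
public class ZipkinFormatSpanListener {

    private final Consumer<List<zipkin2.Span>> spanConsumer;

    public ZipkinFormatSpanListener(Consumer<List<zipkin2.Span>> spanConsumer) {
        this.spanConsumer = spanConsumer;
    }

    @StreamListener(target = Sink.INPUT, condition = "headers['format'] == 'zipkin'")
    public void acceptZipkinSpans(List<zipkin2.Span> spans) {
        spanConsumer.accept(spans);
    }
}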

Related

OpenTelemetry: Context propagation using messaging (Artemis)

I wrote some microservices using Quarkus that communicate via Artemis. Now I want to add OpenTelemetry for tracing purposes.
What I have already tried is calling service B from service A using HTTP/REST. Here the trace ID from service A is automatically added to the header of the HTTP request and used in service B, so this works fine, and in Jaeger I can see the correlation.
But how can this be achieved with Artemis as the messaging system? Do I have to (manually) add the trace ID from service A to the message and read it in service B to somehow set up the context (I don't know whether this is possible)? Or is there an automatism like there is for HTTP requests?
I would appreciate any assistance.
I have to mention at this point that I have little experience with tracing so far.
There is no Quarkus or Quarkiverse extension, nor a SmallRye lib, that provides integration between Artemis and OpenTelemetry yet.
Also, the OpenTelemetry messaging spec is being worked on at the moment, because the correct way to correlate sent and received messages across services is still under definition at the OTel spec level.
However, I had exactly the same problem as you and did manual instrumentation that you can use as inspiration: quarkus-observability-demo-activemq
It will correlate the sending service as the parent of the receiving end.
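As a rough illustration of what such manual instrumentation can look like (not the demo's actual code), the sketch below injects the current W3C trace context into JMS message properties on the producer side and extracts it on the consumer side. The helper class, the property-name mangling, and the use of the javax.jms API (jakarta.jms in newer Quarkus versions) are assumptions.

import java.util.List;

import javax.jms.JMSException;
import javax.jms.Message;

import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.context.Context;
import io.opentelemetry.context.propagation.TextMapGetter;
import io.opentelemetry.context.propagation.TextMapSetter;

// Hypothetical helper for manual trace-context propagation over Artemis/JMS messages.
public final class JmsTraceContextPropagation {

    // JMS property names must be valid Java identifiers, so '-' is replaced with '_'.
    private static final TextMapSetter<Message> SETTER = (message, key, value) -> {
        try {
            message.setStringProperty(key.replace('-', '_'), value);
        } catch (JMSException e) {
            throw new IllegalStateException(e);
        }
    };

    private static final TextMapGetter<Message> GETTER = new TextMapGetter<Message>() {
        @Override
        public Iterable<String> keys(Message carrier) {
            return List.of("traceparent", "tracestate");
        }

        @Override
        public String get(Message carrier, String key) {
            try {
                return carrier.getStringProperty(key.replace('-', '_'));
            } catch (JMSException e) {
                return null;
            }
        }
    };

    // Producer side (service A): copy the current trace context into the message.
    public static void inject(Message message) {
        GlobalOpenTelemetry.getPropagators().getTextMapPropagator()
                .inject(Context.current(), message, SETTER);
    }

    // Consumer side (service B): rebuild the context so the consumer span becomes
    // a child of the producer span.
    public static Context extract(Message message) {
        return GlobalOpenTelemetry.getPropagators().getTextMapPropagator()
                .extract(Context.current(), message, GETTER);
    }

    private JmsTraceContextPropagation() {
    }
}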

Spring Reactive Stack with Spring for Apache Kafka

In a few words:
I'm trying to decide between using the default Spring for Apache Kafka stack, KafkaTemplate, or the pair ReactiveKafkaProducerTemplate and ReactiveKafkaConsumerTemplate for my Reactor-based application.
Some more context:
In the company where I work, we're developing a high-availability application aimed at publishing a set of requests directly to a Kafka broker. Since this is an API-centric application expected to receive a few million requests per week, we decided to go with a stack based on Project Reactor with Spring WebFlux and Kotlin.
After doing some digging, I discovered that Spring for Apache Kafka has a simple wrapper around the Reactor Kafka implementation, but this wrapper lacks a lot of the functionality present in the default KafkaTemplate mentioned before, things like: a metrics binder out of the box (for Prometheus integration), associated factories, extensive documentation, auto-configuration, etc.
I'm trying to understand what I'm really giving up by using the default implementation in favor of the reactive one. Am I giving up back-pressure functionality? Am I sacrificing the reactive stack present in my application? Will this take a toll in the future? Does anyone have experience working with a reactive stack alongside a non-reactive solution?
I also have a few concerns regarding the DLT flow facilitated in the default implementation, such as the SeekToCurrentErrorHandler strategy.
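For reference, here is a minimal side-by-side sketch of the two send APIs under discussion; the RequestPublisher class, topic name, and String payloads are placeholders, and in Spring Kafka 3.x the classic template returns a CompletableFuture instead of a ListenableFuture.

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.reactive.ReactiveKafkaProducerTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.ListenableFuture;
import reactor.core.publisher.Mono;
import reactor.kafka.sender.SenderResult;

// Hypothetical publisher showing the two send styles side by side.
public class RequestPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final ReactiveKafkaProducerTemplate<String, String> reactiveTemplate;

    public RequestPublisher(KafkaTemplate<String, String> kafkaTemplate,
                            ReactiveKafkaProducerTemplate<String, String> reactiveTemplate) {
        this.kafkaTemplate = kafkaTemplate;
        this.reactiveTemplate = reactiveTemplate;
    }

    // Classic, non-reactive send: the result arrives on a future, outside any
    // Reactor pipeline (CompletableFuture in Spring Kafka 3.x).
    public ListenableFuture<SendResult<String, String>> publishBlockingStyle(String payload) {
        return kafkaTemplate.send("requests", payload);
    }

    // Reactive send: the result is a Mono, so it composes with WebFlux handlers
    // and participates in the pipeline's back pressure.
    public Mono<SenderResult<Void>> publishReactiveStyle(String payload) {
        return reactiveTemplate.send("requests", payload);
    }
}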

JMS configuration for Spring Integration

I am trying to implement ActiveMQ (I just want to receive messages) with Spring Integration. I can't find any clues on how to provide Java configuration for ActiveMQ. What are the minimum required components for the job? In some examples there is a channel and an adapter, in others there isn't. I am unable to understand the Spring Integration concepts of adapter, channel, and service activator; they all feel the same to me. I find the integration documentation going over my head, even though I never had problems understanding other Spring modules (Boot, MVC, Cloud, Batch). Can someone point me in the right direction, or tell me what it is that I am doing wrong?
You are probably missing the fact that Spring Integration is a reference implementation of the well-known Enterprise Integration Patterns. So please consider starting from the theory and ideas; then you can come back to Spring Integration as an API for those EIPs. See the respective book on the matter: https://www.enterpriseintegrationpatterns.com.
To read messages from a JMS destination you need to use a JmsMessageDrivenEndpoint with the respective ConnectionFactory injected.
There is nothing more to it than an ActiveMQConnectionFactory declared as a bean.
For example, in tests we do this:
new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false")
And an in-memory broker is started.
See this test class with the Java DSL for one way to configure JMS components: https://github.com/spring-projects/spring-integration/blob/master/spring-integration-jms/src/test/java/org/springframework/integration/jms/dsl/JmsTests.java
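To make that concrete, here is a minimal sketch of a Java configuration with a message-driven channel adapter. The queue name "my.queue" and the println handler are placeholders, and in Spring Integration 6.x you would use IntegrationFlow.from(...) instead of the IntegrationFlows factory.

import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class JmsInboundConfig {

    // Embedded, non-persistent broker, the same trick the Spring Integration tests use.
    @Bean
    public ConnectionFactory connectionFactory() {
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }

    // Message-driven endpoint (the "adapter"): pushes messages from the queue into
    // the flow; the handle(...) step plays the role of a service activator.
    @Bean
    public IntegrationFlow jmsInboundFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("my.queue"))
                .handle(message -> System.out.println("Received: " + message.getPayload()))
                .get();
    }
}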

Example of RabbitMQ with RPC in Spring Integration

After searching for different ways to implement it, I'm stuck.
What I'm looking for is to realize this example (https://www.rabbitmq.com/tutorials/tutorial-six-spring-amqp.html) with Spring Integration.
I found an interesting post (Spring integration with Rabbit AMQP for "Client Sends Message -> Server Receives & returns msg on return queue --> Client get correlated msg"), but it didn't help me with what I need.
My case will be a system where a client calls the convertSendAndReceive method and a server (based on Spring Integration) responds.
Thanks
According to your explanation, it sounds like the pair of an Outbound Gateway on the client side and an Inbound Gateway on the server side is what you need.
Spring Integration's AMQP support provides those implementations for you with built-in correlation functionality: https://docs.spring.io/spring-integration/docs/5.0.0.RELEASE/reference/html/amqp.html
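A rough Java DSL sketch of that gateway pair follows; the queue name "rpc.requests", the request channel "rpcRequests", and the trivial uppercase handler on the server side are illustrative assumptions, not part of the original answer.

import org.springframework.amqp.core.AmqpTemplate;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class AmqpRpcConfig {

    // Server side: the inbound gateway receives the request, pushes it through the
    // flow, and sends the reply back on the caller's reply queue; correlation is
    // handled by the framework.
    @Bean
    public IntegrationFlow serverFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                .from(Amqp.inboundGateway(connectionFactory, "rpc.requests"))
                .<String, String>transform(String::toUpperCase)
                .get();
    }

    // Client side: the outbound gateway publishes the request and waits for the
    // correlated reply, much like RabbitTemplate#convertSendAndReceive.
    @Bean
    public IntegrationFlow clientFlow(AmqpTemplate amqpTemplate) {
        return IntegrationFlows
                .from("rpcRequests")
                .handle(Amqp.outboundGateway(amqpTemplate).routingKey("rpc.requests"))
                .get();
    }
}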

Spring Boot Micro Service Tracing Options

I have the requirements below; is there any open source library that covers all of them?
1. We are building a distributed microservice architecture with Spring Boot, which includes more than 100 microservices.
2. There is a lot of inter-microservice communication involved in completing a single transaction.
3. We want to trace every microservice call, and the trace should provide the following information:
a. Transaction ID / trace ID
b. Backend transaction status - HTTP status for REST, and likewise for SOAP
c. Time taken for that call
d. Request and response payload
Currently we are achieving this using a home-grown tracing framework. Is there any open source project that will handle all of this without any coding from the developer? I know we have a few options with Spring Boot, such as Spring Cloud Zipkin and Sleuth - do these handle the above requirements?
My project has similar requirements to yours. IMHO, Spring Cloud Sleuth + Zipkin work well in my case.
For all inter-microservice communication we are using Kafka, and Spring Cloud Sleuth + Zipkin have no problem tracing all the calls, from REST -> Kafka -> more Kafka -> REST.
To enable Kafka tracing, simply add:
spring:
  sleuth:
    propagation-keys: some-key
    sampler:
      probability: 1
    messaging:
      kafka:
        enabled: true
We are also using Azure Application Insights for centralized logging, which is well integrated with Spring Cloud.
Hope the above gives you some confidence in using Sleuth + Zipkin.
