How to add the request info in raven-java with logback? - sentry

We are using raven with logback in Java, and Sentry for error reporting. We can successfully log errors, but how can we add the request info?
I have not found any example at https://docs.getsentry.com/hosted/clients/java/modules/logback/.

You can use the MDC or markers in SLF4J; here's the Sentry example: https://docs.sentry.io/clients/java/modules/logback/. Please check the "Mapped Tags" section.
You can specify MDC keys to send as tags instead of including them in Additional Data. This allows them to be filtered within Sentry.
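For a servlet-based app, a common approach is a filter that copies request info into the MDC before the request is handled. This is a minimal sketch; the filter and the requestUrl/requestMethod key names are illustrative, not part of the raven-java API:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import org.slf4j.MDC;

// Copies request info into the MDC so it travels with every log event
// on this thread; the Sentry appender picks MDC entries up as
// Additional Data, or as tags if the keys are configured as Mapped Tags.
public class RequestInfoMdcFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest http = (HttpServletRequest) req;
        MDC.put("requestUrl", http.getRequestURL().toString());
        MDC.put("requestMethod", http.getMethod());
        try {
            chain.doFilter(req, res);
        } finally {
            // request threads are pooled, so always clean up the MDC
            MDC.remove("requestUrl");
            MDC.remove("requestMethod");
        }
    }

    @Override
    public void destroy() { }
}

The "Mapped Tags" section of the linked docs then shows how to list those MDC keys in the appender configuration so they are sent as tags rather than Additional Data.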

Related

How to add same prefix to all LOGS in request in SpringBoot project

I use @Slf4j to add logs in all layers of my Spring Boot project.
I want to use a unique ID for all logs inside one request; for example, generate a UUID at the RestController and automatically add it to every log in this thread:
[..... Some UUID ....] "The real logger message"
How do I do this? It needs to be possible to filter all logs of a specific request.
If your application handles each request on its own thread (as opposed to a reactive application, where everything might happen on a small set of event-loop threads), use the mapped diagnostic context (MDC). The MDC is a thread-bound key-value map where you can put and retrieve data. Baeldung has a tutorial on logging with the MDC using multiple logging frameworks, and there are plenty of examples across the web.
If your application is reactive, you may want to check out the Project Reactor reference.
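As a minimal sketch of the MDC approach for a servlet-based Spring Boot app (the requestId key and the filter name are illustrative):

import java.io.IOException;
import java.util.UUID;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

// Generates a UUID per request and binds it to the current thread,
// so every log statement made during the request can include it.
@Component
public class RequestIdFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request,
                                    HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        MDC.put("requestId", UUID.randomUUID().toString());
        try {
            chain.doFilter(request, response);
        } finally {
            MDC.remove("requestId"); // threads are pooled; don't leak the ID
        }
    }
}

Then reference the key in your logback pattern, e.g. <pattern>[%X{requestId}] %-5level %logger{36} - %msg%n</pattern>, and every log line produced while handling that request will carry the UUID, which makes per-request filtering straightforward.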

How to log the trace id using Springboot custom filters

I have a few custom filters in my Spring Boot WebFlux API. Spring Sleuth is enabled in this project; however, these filters are not logging the trace and span IDs in the log messages.
I made sure that the order was set properly for these filters.
Example:
2020-03-23 12:53:18.895 -2020-03-23 12:53:18.895 INFO [my-spring-boot,,,] 9569 --- [ctor-http-nio-3] c.d.a.f.test.myTestEnvWebFilter : Reading data from header
Can someone please share your insights on this?
It might be due to the default sampler percentage configuration; take a look at this article for an example: https://www.baeldung.com/tracing-services-with-zipkin
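For instance, assuming Spring Cloud Sleuth 2.x with Brave on the classpath, you can force sampling of every request while you verify the setup (the configuration class name is illustrative):

import brave.sampler.Sampler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SamplingConfig {

    // Sample every request instead of the default percentage, so trace
    // and span IDs are always generated and available to the log pattern.
    @Bean
    public Sampler defaultSampler() {
        return Sampler.ALWAYS_SAMPLE;
    }
}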

How to Add DataDog trace ID in Logs using Spring Boot + Logback

OK, I spent quite some time figuring out how to configure things to get the DataDog trace ID into my logs, but couldn't get it working. To be clear, what I'm looking for is to see trace IDs in log messages, the same way that adding spring-cloud-starter-sleuth to the classpath automatically configures SLF4J/Logback to show trace IDs in log messages.
Where I've started:
We've got a simple Spring Boot web application running as a Docker container, deployed on AWS Elastic Beanstalk; its logs go to CloudWatch and we read them there.
We have DataDog as a Java agent (thus no dependencies in pom.xml)
We have SLF4J/Logback in our dependencies list.
There are no other related dependencies (like dd-trace-ot or any opentracing libs).
What I did so far:
I found on SO that adding opentracing-spring-cloud-starter would add the log integration automatically, but I couldn't get it working.
The DD website says configuring the pattern is enough to see the IDs, but in our case it didn't work (is it because our logs are not JSON?). Adding dd-trace-ot didn't help either.
Notes:
We can't switch to JSON logs.
We can't switch to any other library (e.g. Sleuth).
We can't go away from CloudWatch.
Can someone tell me how exactly I need to configure the application to see trace IDs in log messages? Is there any documentation or samples I can look at?
Do you have the ability to add some parameters to the logs you send? From the documentation, you should be able to inject the trace ID into your logs in a way that Datadog will interpret.
You can also look at a parser to extract the trace id and span id from the raw log. This documentation should help you out on that.
From the documentation, if you don't have JSON logs, you need to include dd.trace_id and dd.span_id in your formatter:
If your logs are raw formatted, update your formatter to include dd.trace_id and dd.span_id in your logger configuration:
<Pattern>%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %X{dd.trace_id:-0} %X{dd.span_id:-0} - %m%n</Pattern>
So if you add %X{dd.trace_id:-0} %X{dd.span_id:-0} to your pattern, it should work.
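Putting it together, a minimal logback.xml might look like this (the appender name and level are illustrative; only the %X{dd.trace_id:-0} %X{dd.span_id:-0} conversion words come from the DataDog docs):

<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- trace/span ids default to 0 when no trace is active -->
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %X{dd.trace_id:-0} %X{dd.span_id:-0} - %m%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>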

Spring cloud sleuth not appending keys to Hibernate query logs

When I am using Spring Cloud Sleuth, I observe that it appends the application name and key details to all of the application logs.
But this is not happening for Hibernate logs or JPA queries.
Is there any way to achieve this using Sleuth?
You can check out Brave's integration with JDBC via p6spy: https://github.com/openzipkin/brave/tree/master/instrumentation/p6spy
Extract from the docs:
brave-instrumentation-p6spy
This includes a tracing event listener for P6Spy (a proxy for calls to your JDBC driver). It reports to Zipkin how long each statement takes, along with relevant tags like the query.
P6Spy requires a spy.properties in your application classpath (e.g. src/main/resources). brave.p6spy.TracingP6Factory must be in the modulelist to enable tracing.
modulelist=brave.p6spy.TracingP6Factory
url=jdbc:p6spy:derby:memory:p6spy;create=true
In addition, you can specify the following options in spy.properties:
remoteServiceName
By default the Zipkin service name for your database is the name of the database. Set this property to override it:
remoteServiceName=myProductionDatabase
includeParameterValues
When set to true, the tag sql.query will also include the JDBC parameter values.
Note: if you enable this please also consider enabling 'excludebinary' to avoid logging large blob values as hex (see http://p6spy.readthedocs.io/en/latest/configandusage.html#excludebinary).
includeParameterValues=true
excludebinary=true
spy.properties applies globally to any instrumented jdbc connection. To override this, add the zipkinServiceName property to your connection string.
jdbc:mysql://127.0.0.1:3306/mydatabase?zipkinServiceName=myServiceName
This will override the remoteServiceName set in spy.properties.
The current tracing component is used at runtime. Until you have instantiated brave.Tracing, no traces will appear.
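For a Spring Boot app, a minimal sketch of routing the datasource through P6Spy so the listener sees each statement (the URL and credentials are placeholders; only the jdbc:p6spy: prefix and the driver class come from P6Spy itself):

# application.properties
spring.datasource.driver-class-name=com.p6spy.engine.spy.P6SpyDriver
spring.datasource.url=jdbc:p6spy:mysql://127.0.0.1:3306/mydatabase
spring.datasource.username=myuser
spring.datasource.password=secret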

Assign different logger (appender) to different Apache Camel routes?

I was wondering if there is any way already implemented in Apache Camel to log to different loggers depending on the route. I am using the Spring DSL to create the routes. My use case is that I want a different log file for each route I define.
Is that possible?
You can enable MDC logging, which then includes details about which route the log statement comes from: http://camel.apache.org/mdc-logging.html
The logging framework you use (log4j, logback, etc.) can then be configured to log to different appenders based on an MDC key (e.g. camel.routeId), as in the sketch below.
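For example, first turn on MDC logging on the context in the Spring DSL:

<camelContext useMDCLogging="true" xmlns="http://camel.apache.org/schema/spring">
  <!-- routes ... -->
</camelContext>

Then a logback SiftingAppender can split the output into one file per route; the file paths and the defaultValue below are illustrative, while camel.routeId is the MDC key Camel populates:

<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
  <!-- the discriminator reads the camel.routeId MDC key -->
  <discriminator>
    <key>camel.routeId</key>
    <defaultValue>no-route</defaultValue>
  </discriminator>
  <sift>
    <!-- one file appender is created per distinct route id -->
    <appender name="FILE-${camel.routeId}" class="ch.qos.logback.core.FileAppender">
      <file>logs/${camel.routeId}.log</file>
      <encoder>
        <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
      </encoder>
    </appender>
  </sift>
</appender>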
