Spring/Gradle: Adding custom logging pattern to application.yml removes log highlighting

I'm working in a Spring Boot application using Gradle. I'm probably going to give a lot of unnecessary context here, but I'm not sure where my problem is.
I have an SLF4J logger that resolves to an org.slf4j.impl.Log4jLoggerAdapter. I'm trying to change the log pattern layout for the console logs. The default pattern looks like:
2022-03-02 17:42:48.892 [ INFO] 19296 --- [ main] ggestions.jvm.JVMLatestVersionSuggestion
In the console output, the level, timestamp, and class path are all highlighted. However, I really don't like the format and would like to override it.
I've added the following to my application.yml:
logging.pattern.console: '%d [%t] %-5level %logger{36} - %m%n'
This gives me the layout I want, but the highlighting disappears and everything is the default colour. I've seen advice to add %highlight, but that doesn't render correctly; it just shows up literally as the word %highlight in the output, like so:
2022-03-02 17:50:09,525 [main] %highlight(INFO ) com.indeed.common.boot.suggestions.jvm.JVMLatestVersionSuggestion
The logger still resolves to org.slf4j.impl.Log4jLoggerAdapter. My guess is that Log4j doesn't support highlighting, but then I'm not clear on what made it work in the first place. Is it possible to get it back?
(Note: this is work code, and I don't have the ability to change the SLF4J implementation to Log4j2.)
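For context (and this is an assumption worth checking): the default colours come from Spring Boot's own %clr Logback converter rather than from %highlight, so with a Logback backend the custom pattern could be wrapped like the sketch below. It would not help while the logger resolves to the Log4j adapter, since %clr is only registered for Boot's Logback configuration:

```yaml
# application.yml - sketch using Spring Boot's %clr converter (Logback backend only)
logging.pattern.console: '%clr(%d){faint} [%t] %clr(%-5level) %clr(%logger{36}){cyan} - %m%n'
```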


Customising Junit5 test output via Gradle

I'm trying to output BDD-style text from my JUnit tests, like the following:
Feature: Adv Name Search
  Scenario: Search by name v1
    Given I am at the homepage
    When I search for name Brad Pitt
    And I click the search button2
    Then I expect to see results with name 'Brad Pitt'
When running in the IntelliJ IDE this displays nicely, but when running in Gradle nothing is displayed. I did some research and enabled the showStandardStreams boolean on the test task, i.e.
In my build.gradle file I've added ...
test {
    useJUnitPlatform()
    testLogging {
        showStandardStreams = true
    }
}
This produces ...
> Task :test

Adv Name Search STANDARD_OUT
    Feature: Adv Name Search
    Tests the advanced name search feature in IMDB

Adv Name Search > Search by name v1 STANDARD_OUT
    Scenario: Search by name v1
      Given I am at the homepage
      When I search for name Brad Pitt
      And I click the search button2
      Then I expect to see results with name 'Brad Pitt'
... which is pretty close, but I don't really want to see the extra output from Gradle (the lines like "Adv Name Search STANDARD_OUT" plus the extra blank lines).
Is there a way to not show this additional Gradle logging in the test section?
Or maybe my tests shouldn't be using System.out.println at all, but rather proper logging (e.g. Log4j) plus Gradle config to display it?
Any help / advice is appreciated.
Update (1)
I've created a minimal reproducible example at https://github.com/bobmarks/stackoverflow-junit5-gradle if anyone wants to quickly clone it and run ./gradlew clean test.
You can replace your test { … } configuration with the following to get what you need:
test {
    useJUnitPlatform()
    systemProperty "file.encoding", "utf-8"
    onOutput { descriptor, event ->
        if (event.destination == TestOutputEvent.Destination.StdOut) {
            logger.lifecycle(event.message.replaceFirst(/\s+$/, ''))
        }
    }
}
See also the docs for onOutput.
FWIW, I had originally posted the following (incomplete) answer, which turned out to focus on the wrong approach of configuring the test logging:
I don't believe this is possible. Let me try to explain why.
Looking at the code which produces the lines that you don’t want to see, it doesn’t seem possible to simply configure this differently:
Here’s the code that runs when something is printed to standard out in a test.
The method it calls next unconditionally adds the test descriptor and event name (→ STANDARD_OUT) that you don't want to see; there's no way to switch this off.
So how standard output is logged probably cannot be changed.
What about using a proper logger in the tests, though? I doubt that this will work either:
Running tests basically means running some testing tool – JUnit 5 in your case – in a separate process.
This tool doesn’t know anything/much about who runs it; and it probably shouldn’t care either. Even if the tool should provide a logger or if you create your own logger and run it as part of the tests, then the logger still has to print its log output somewhere.
The most obvious “somewhere” for the testing tool process is standard out again, in which case we wouldn’t win anything.
Even if there were some interprocess communication between Gradle and the testing tool for exchanging log messages, you'd still have to find some configuration option on the Gradle side that controls how Gradle prints the received log messages to the console. I don't think such an option (let alone the IPC for log messages) exists.
One thing that can be done is to set the displayGranularity property in the testLogging options.
From the documentation:
"The display granularity of the events to be logged. For example, if set to 0, a method-level event will be displayed as "Test Run > Test Worker x > org.SomeClass > org.someMethod". If set to 2, the same event will be displayed as "org.SomeClass > org.someMethod"."

config yml for mybatis different level

logging:
  level:
    root: info
    com:
      demo:
        mapper: debug
        insertBigData: info
I want to see my MyBatis SQL in debug mode, but there is a function that inserts a huge value, and I want to ignore that specific function. YAML doesn't seem to support this style of config.
I know moving the function to another mapper is one way. What else can I do to ignore the log of the function insertBigData?
It might not be ideal, but specifying a partially-qualified key seems to work:
logging:
  level:
    root: info
    com:
      demo:
        mapper: debug
        mapper.insertBigData: info

Adding a custom trace ID with alphanumeric values and printing it in the application log

I am using Sleuth 2.1.3.
I want to add a custom trace ID as a correlation ID with an alphanumeric value, and I want it printed in the logs along with the span ID and parent ID.
If I use the implementation below for creating a new custom trace ID, does it get printed in the logs?
I tried the implementation below but don't see any custom trace ID in the log:
https://github.com/openzipkin/zipkin-aws/blob/release-0.11.2/brave-propagation-aws/src/main/java/brave/propagation/aws/AWSPropagation.java
Tracing.newBuilder().propagationFactory(
    ExtraFieldPropagation.newFactoryBuilder(B3Propagation.FACTORY)
        .addField("x-vcap-request-id")
        .addPrefixedFields("x-baggage-", Arrays.asList("country-code", "user-id"))
        .build()
);
I tried the above code from https://cloud.spring.io/spring-cloud-sleuth/reference/html/#propagation but didn't see any custom trace ID in the log.
You've passed in B3Propagation.FACTORY as the implementation of the propagation factory, so you're explicitly stating that you want the default B3 headers. You've said that you also want some other alphanumeric field to be propagated. In a log parsing tool you can then define that you want to use your custom field as the trace ID, but that doesn't mean the default X-B3-TraceId field will be changed. If you want Sleuth to understand your custom field as the trace ID, you need to change the logging format and implement a different propagation factory bean.
One way that worked for me is using ExtraFieldPropagation and adding those keys to the Sleuth properties under propagation-keys and whitelisted-mdc-keys.
Sample code:

@Autowired
Tracer tracer;

Span currentSpan = tracer.nextSpan().start();
ExtraFieldPropagation.set("customkey", "customvalue");

and in the configuration:

sleuth:
  log:
    slf4j:
      whitelisted-mdc-keys: customkey
  propagation:
    tag:
      enabled: true
  propagation-keys: customkey

Spring boot logging setup of FileAppender - where does it use the max-size property?

Overflowers, please pardon my question if it's already been answered or if the answer is naive.
I have a very basic Spring Boot (1.5.4) logging setup in application.properties:
logging.level.org=WARN
logging.level.com=WARN
logging.level.springfox=OFF
logging.level.org.hibernate.hql.internal.ast=ERROR
logging.level.com.MyCompany.kph=DEBUG
logging.file=/var/MyProduct/logs/MyProduct.log
logging.file.max-size=2GB
logging.file.max-history=100
The 2GB is not being honoured. No value I put in there is being honoured. Even xxxxx as a value does not cause a blow-up.
logging.file does work, and I can see it being used inside DefaultLogbackConfiguration.
From following the source, I can see the method DefaultLogbackConfiguration#setMaxFileSize(a, b) being called, but that method is hard-coded to 10MB. This aligns with the behaviour I'm seeing.
Am I doing something wrong and triggering the default behaviour? Or does the default behaviour get loaded first, with specific settings applied on top? (If it does, I can't find it, and it's not working for me.)
Can someone point to me where max-size gets consumed and used?
Thanks
Rich
Christ, just by writing this post and reading the docs for my Spring version, I see that max-size is not used at all in 1.5.x. That is why it's not working.
https://docs.spring.io/spring-boot/docs/1.5.19.BUILD-SNAPSHOT/reference/htmlsingle/#boot-features-logging
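For completeness: in Boot 1.5.x, where logging.file.max-size is not supported, size-based rolling can still be configured with a custom logback-spring.xml. A minimal sketch under that assumption (the appender name and paths are illustrative, reusing the log file from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- pull in Boot's defaults so FILE_LOG_PATTERN etc. are defined -->
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>/var/MyProduct/logs/MyProduct.log</file>
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>/var/MyProduct/logs/MyProduct.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
            <!-- the values the properties in the question were trying to set -->
            <maxFileSize>2GB</maxFileSize>
            <maxHistory>100</maxHistory>
        </rollingPolicy>
    </appender>
    <root level="WARN">
        <appender-ref ref="FILE"/>
    </root>
</configuration>
```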

How to log MDC with Spring Sleuth?

I have a Spring Boot + Sleuth based application. Everything works as expected. For now I have logs like this:
2017-05-04 17:55:52.226 INFO [alert,692d0eeca479e216,c3c8b680dc29ad02,false] 17292 --- [cTaskExecutor-1] c.k.a.b.s.alert.impl.AlertServiceImpl : Alert state to process: xxx
Now, I want to add custom MDC to my log like the contract reference for example. I want to have logs like this:
2017-05-04 17:55:52.226 INFO [alert,692d0eeca479e216,c3c8b680dc29ad02,false] [CONTRACT_REF] 17292 --- [cTaskExecutor-1] c.k.a.b.s.alert.impl.AlertServiceImpl : Alert state to process: xxx
I tried various things with no success:
Use the Spring Sleuth Tracer to add a tag;
Add logging.pattern.level=%5p %mdc to my application.properties file with MDC.put(xxx, xxx)
How can I add custom MDC/tags to my log?
For versions before 2.x, you have to create your own implementation of SpanLogger. The easiest way is to extend Slf4jSpanLogger and provide your own code to add, update, and remove entries from the MDC context. Then you can change your logging pattern so your logs contain what they need.
I was able to add data to the MDC fairly easily by doing MDC.put("yourCoolKey", "your cool value") (see MDC.put JavaDoc).
Once you put the value into the MDC, you can use the sequence %X{yourCoolKey} in your logging pattern (in my case, the value of logging.pattern.console) to print the string "your cool value" as part of each log statement.
Optionally, you can specify a default value in the pattern string by adding :-<defaultValue> after the key, such as %X{yourCoolKey:-N/A}, which will print the string "N/A" whenever the MDC does not have an entry for "yourCoolKey". The default, if not specified, is a blank string ("").
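Putting the pieces above together, here is a sketch of a console pattern using %X (the key name contractRef is illustrative, matching the [CONTRACT_REF] placeholder in the question, and this assumes a Logback backend):

```properties
# application.properties - prepend the MDC value to each log line (sketch)
logging.pattern.console=%d{yyyy-MM-dd HH:mm:ss.SSS} %5p [%X{contractRef:-N/A}] %logger{36} : %m%n
```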
