How to remove Spring Data CustomConversions warnings from application startup?

I have an application with the following Spring dependencies:
starterBase : 'org.springframework.boot:spring-boot-starter:2.2.1.RELEASE',
starterActuator: 'org.springframework.boot:spring-boot-starter-actuator:2.2.1.RELEASE',
starterJpa : 'org.springframework.boot:spring-boot-starter-data-jpa:2.2.1.RELEASE',
starterTest : 'org.springframework.boot:spring-boot-starter-test:2.2.1.RELEASE',
starterWeb : 'org.springframework.boot:spring-boot-starter-web:2.2.1.RELEASE',
elasticsearch : 'org.springframework.boot:spring-boot-starter-data-elasticsearch:2.2.1.RELEASE'
The moment I added the elasticsearch dependency, the following warnings appeared when starting the application:
WARN [main] o.s.data.convert.CustomConversions.register - Registering converter from class org.springframework.data.geo.Point to interface java.util.Map as writing converter although it doesn't convert to a store-supported type! You might wanna check you annotation setup at the converter implementation.
WARN [main] o.s.data.convert.CustomConversions.register - Registering converter from interface java.util.Map to class org.springframework.data.geo.Point as reading converter although it doesn't convert from a store-supported type! You might wanna check you annotation setup at the converter implementation.
WARN [main] o.s.data.convert.CustomConversions.register - Registering converter from class org.springframework.data.elasticsearch.core.geo.GeoPoint to interface java.util.Map as writing converter although it doesn't convert to a store-supported type! You might wanna check you annotation setup at the converter implementation.
WARN [main] o.s.data.convert.CustomConversions.register - Registering converter from interface java.util.Map to class org.springframework.data.elasticsearch.core.geo.GeoPoint as reading converter although it doesn't convert from a store-supported type! You might wanna check you annotation setup at the converter implementation.
I debugged the code: in spring-data-commons:2.2.1.RELEASE, CustomConversions.java has a private method named 'register' at line 196, and its Javadoc mentions Mongo types. That is strange, because we are not using Mongo. Is this Mongo reference correct?
But the main question is, is there any way to avoid/remove these warnings?

This code was refactored into spring-data-commons in April 2017, and the comment was copied from its original location without being adapted. So there is nothing Mongo-specific here.
As for the warnings, all you can do at the moment is ignore them; we'll check whether we need them at all.
Addition:
There is an issue for this, and the corresponding PR is in the pipeline of being processed, so hopefully these warnings will be dealt with soon.

I fixed it by adding this to my application.yml:
logging.level.org.springframework.data.convert.CustomConversions: ERROR
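If you configure Logback through an XML file instead of properties, the equivalent should work as a logger entry in logback-spring.xml (mirroring the Log4j2 answer below):
<logger name="org.springframework.data.convert.CustomConversions" level="ERROR"/>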

If you use Log4j2, you can suppress these warnings by setting a specific log level for this logger, something like the following.
<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
    <Loggers>
        <Root level="info">
            <!-- <AppenderRef ref="........"/> -->
        </Root>
        <Logger name="org.springframework.data.convert.CustomConversions" level="ERROR"/>
    </Loggers>
</Configuration>

Related

inconsistent bean validation initialization of ConstraintValidator defined via ServiceLoader

This question asks about some specifics of a more general topic, the modularization of bean validation, that I asked about before.
In the question linked above, following this documentation and this post, I split the annotation and the ConstraintValidator definition into 2 Java modules and linked them together using ServiceLoader, as shown in the documentation here. This mostly works. But there is one unsolved issue: it does not work for validation defined via XML, which I again set up according to the documentation. What does not work: the pairing between the annotation and the ConstraintValidator is not established; the service loader mechanism is not used at all.
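For context, the ServiceLoader wiring from the linked documentation boils down to a provider-configuration file placed next to the validator implementation; a minimal sketch, with all package and class names hypothetical:
// Module B: the validator implementation. It is registered by listing its fully
// qualified name in src/main/resources/META-INF/services/javax.validation.ConstraintValidator
package com.example.validation;

import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

// MyConstraint lives in module A and deliberately declares an empty validatedBy(),
// so the pairing is established through the ServiceLoader mechanism.
public class MyConstraintValidator implements ConstraintValidator<MyConstraint, String> {

    @Override
    public boolean isValid(String value, ConstraintValidatorContext context) {
        // nulls are left to @NotNull; treat them as valid here
        return value == null || !value.trim().isEmpty();
    }
}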
To recap: I have a working setup using this ServiceLoader approach, and it works when validating objects coming through the REST layer. Everything is paired correctly.
BUT! We also receive these DTOs through Kafka, and here we have two different flows. Some initialization of common ConstraintValidators happens on startup, and then:
If the first message comes in via REST, the ServiceLoader mechanism is discovered at request time, some further initialization seemingly takes place, and after that even Kafka messages work, meaning the pairing for the custom validator is available everywhere. (Great!)
If a Kafka message arrives first, though (the typical case), the service loader mechanism is not consulted at all, and this somehow 'destroys' the configuration in such a way that even a later REST request won't work either, reporting that there is no ConstraintValidator for the given annotation. The initialization somehow completes defectively.
validation.xml is as easy as:
<validation-config
        xmlns="http://xmlns.jcp.org/xml/ns/validation/configuration"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/validation/configuration validation-configuration-2.0.xsd"
        version="2.0">
    <constraint-mapping>/META-INF/validation-constraints.xml</constraint-mapping>
</validation-config>
notes:
Version 2.0 is used because of hibernate-validator 6.2.0, which comes from the Spring dependency management.
Why not use annotations and drop this XML stuff altogether? The file is not mine and cannot be modified.
If there is some trivial newbie mistake here, please advise. Maybe there is some way to kick the service loader functionality into action from the validation.xml file that I'm not aware of and cannot find anywhere.
EDITS/suggestions:
A: try injecting the validator on startup to make sure it's loaded:
@Autowired
private Validator validator;

@EventListener(ApplicationReadyEvent.class)
public void logReady() {
    System.out.println(validator.toString());
}
This did print an initialized validator, but it did not help.

How to use Spring environment values from config server in Logstash Encoder of Logback

I would like to use Spring environment values as custom fields in the Logstash encoder of a Logback appender.
There is a general configuration tag to use properties
<property resource="logstash.properties" />
And there is a special configuration tag from Spring for this purpose
<springProperty name="appEnv" source="environment"/>
The properties of both tags can then be used in the custom fields of the Logstash encoder
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <customFields>{"application.environment":"${appEnv}"}</customFields>
</encoder>
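Put together, a minimal logback-spring.xml sketch could look like this (the appender name, file names, and the property source app.environment are assumptions):
<configuration>
    <springProperty name="appEnv" source="app.environment"/>
    <appender name="json" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/app.json</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/app.%d{yyyy-MM-dd}.json</fileNamePattern>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <customFields>{"application.environment":"${appEnv}"}</customFields>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="json"/>
    </root>
</configuration>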
The problem, as far as I understand it, is that this only works under certain circumstances, probably because Logback has already finished its configuration by the time the Spring environment is built.
It seems to work when
The property is local and static (available at configuration time)
The property is in bootstrap.properties
It seems NOT to work when
The property is dynamic, as when it is retrieved from the Spring config server
My property values delivered from the config server are null when Logback is configured, and therefore the log shows them as appEnv_IS_UNDEFINED for a property called appEnv.
Because most examples just use spring.application.name, this mostly goes unnoticed.
To solve the timing problem, I searched for a way to reload the Logback configuration onApplicationEvent. I found this answer that confirms my problem and offers a skeleton solution.
I found other solutions where the Logback appender that uses the Logstash encoder is completely programmatically built and added to the LoggerContext.
However, I wonder if there is also a way to stick with the XML configuration of the appender and "just reload" the config programmatically when the Spring environment is ready. How would I do this?
I found this answer to do the reload, but it does not work for my case. The appEnv_IS_UNDEFINED values continue to appear in the log file.
I was able to solve my problem by implementing a Spring ApplicationContextInitializer.
In the initialize method that gets called, I can access my Logback appender and encoder via the root logger.
Logger rootLogger = (Logger) LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME);
RollingFileAppender jsonFileAppender = (RollingFileAppender) rootLogger.getAppender(LOGSTASH_APPENDER_NAME);
LogstashEncoder encoder = (LogstashEncoder) jsonFileAppender.getEncoder();
From the LogstashEncoder, I can get the customFields
String customFields = encoder.getCustomFields();
And there I found the unresolved properties in the JSON String as expected
{"application.environment":"appEnv_IS_UNDEFINED"}
Since I can get the built Spring Environment from the passed ApplicationContext
springEnvironment = applicationContext.getEnvironment();
I can match the unresolved properties with the regex (\w+)_IS_UNDEFINED and replace them with the real values from the Spring Environment.
Surprisingly, I do not need to reload or restart anything. It is sufficient to just set the fixed customFields on the encoder; immediately afterwards, the log messages contain the correct values.
encoder.setCustomFields(fixedCustomFields);
With this Initializer in place, I can fully configure my appender and the LogstashEncoder in logback-spring.xml or an included file.
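For reference, a minimal sketch of such an initializer, assuming the appender is named json and that each placeholder name can be resolved directly as a property in the Spring Environment:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.rolling.RollingFileAppender;
import net.logstash.logback.encoder.LogstashEncoder;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.core.env.Environment;

public class LogstashCustomFieldsInitializer
        implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    private static final String LOGSTASH_APPENDER_NAME = "json"; // assumption: your appender's name
    private static final Pattern UNRESOLVED = Pattern.compile("(\\w+)_IS_UNDEFINED");

    @Override
    public void initialize(ConfigurableApplicationContext applicationContext) {
        Environment springEnvironment = applicationContext.getEnvironment();

        Logger rootLogger = (Logger) LoggerFactory.getLogger(org.slf4j.Logger.ROOT_LOGGER_NAME);
        RollingFileAppender<ILoggingEvent> jsonFileAppender =
                (RollingFileAppender<ILoggingEvent>) rootLogger.getAppender(LOGSTASH_APPENDER_NAME);
        LogstashEncoder encoder = (LogstashEncoder) jsonFileAppender.getEncoder();

        // Replace every <name>_IS_UNDEFINED placeholder with the value from the Environment
        Matcher matcher = UNRESOLVED.matcher(encoder.getCustomFields());
        StringBuffer fixedCustomFields = new StringBuffer();
        while (matcher.find()) {
            String value = springEnvironment.getProperty(matcher.group(1), "");
            matcher.appendReplacement(fixedCustomFields, Matcher.quoteReplacement(value));
        }
        matcher.appendTail(fixedCustomFields);

        encoder.setCustomFields(fixedCustomFields.toString());
    }
}

The initializer itself is registered under the org.springframework.context.ApplicationContextInitializer key in META-INF/spring.factories.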

How to set date format for JsonObjectMapper in Spring Integration

I am converting my Java object to a Map using Spring Integration ObjectToMapTransformer's transformPayload().
Everything works fine, except that the Instant fields in my object get broken up into epochSecond and nano fields, which in turn throws an exception when persisting to the data store (MongoDB).
This is the Spring Integration JsonObjectMapper being used to convert the Object to Map:
private final JsonObjectMapper<?, ?> jsonObjectMapper = JsonObjectMapperProvider.newInstance();
My question is: how can I configure the date format for the above mapper? Do we have options similar to Jackson's ObjectMapper::configure() here?
I cannot find any, either in the source code or on the internet.
I also tried enabling/disabling spring.jackson.serialization.WRITE_DATES_AS_TIMESTAMPS in my application.properties, but no joy.
I have the jackson-datatype-jsr310 dependency in my pom.xml.
How to get the Instant in correct format?
I think we should add support for custom JsonObjectMapper injection. That way you would be able to build a Jackson2JsonObjectMapper based on the desired ObjectMapper.
Please, raise a JIRA ticket on the matter and don't hesitate with the contribution: https://github.com/spring-projects/spring-integration/blob/master/CONTRIBUTING.adoc
Meanwhile, as a workaround, I'd suggest a pair of ObjectToJsonTransformer/JsonToObjectTransformer:
.transform(Transformers.toJson(jsonObjectMapper(), ObjectToJsonTransformer.ResultType.NODE))
.transform(Transformers.fromJson(Map.class, jsonObjectMapper()))
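As a sketch of the jsonObjectMapper() factory used in that workaround, assuming the goal is Instants serialized as ISO-8601 strings instead of epochSecond/nano fields:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.springframework.integration.support.json.Jackson2JsonObjectMapper;

public Jackson2JsonObjectMapper jsonObjectMapper() {
    ObjectMapper objectMapper = new ObjectMapper()
            .registerModule(new JavaTimeModule()) // from jackson-datatype-jsr310
            .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS); // ISO-8601 instead of epochSecond/nano
    return new Jackson2JsonObjectMapper(objectMapper);
}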

Change PDFBox logging level using logback

I have a java app that is running on spring boot.
I'm using tika which in turn uses pdfbox.
I'm using logback as my logging implementation with slf4j.
I know that pdfbox uses apache commons logging.
I'm trying to change the logging level to FATAL like so:
<logger name="org.apache.pdfbox" level="FATAL"/>
The problem is that it still doesn't change the level.
I've run this with a debugger; inspecting the logger that pdfbox uses gives these results:
result = SLF4JLocationAwareLog
name = org.apache.pdfbox.util.PDFStreamEngine
logger.level = null
logger.loggerContext = ch.qos.logback.classic.LoggerContext[default]
From the logger context, I can tell that it is indeed using Logback, but my configuration is not being applied.
I'll answer my own question and hope that someone will find it useful.
The reason logger.level was null is that I didn't specify anything for it, so the level was inherited from the parent logger. FATAL didn't work because Logback's highest level is ERROR, not FATAL.
http://logback.qos.ch/apidocs/ch/qos/logback/classic/Level.html
When I changed it to ERROR, everything worked as expected.
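For completeness, the working Logback configuration is just the original logger with the level lowered from FATAL to ERROR:
<logger name="org.apache.pdfbox" level="ERROR"/>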

Spring boot with Thymeleaf static assets getting repetitive log statements

I'm converting from JSPs to Thymeleaf while converting an SOA service to Spring Boot. I'm wondering if I have misconfigured something, as I keep getting these statements:
o.s.b.a.e.m.EndpointHandlerMapping - Looking up handler method for path /css/bootstrap.min.css
o.s.b.a.e.m.EndpointHandlerMapping - Looking up handler method for path /img/gizmonicInstitute.jpg
o.s.b.a.e.m.EndpointHandlerMapping - Did not find handler method for [/img/gizmonicInstitute.jpg]
o.s.b.a.e.m.EndpointHandlerMapping - Did not find handler method for [/css/bootstrap.min.css]
Within my .html file (located within the /resources/templates directory)
<link th:href="@{/css/bootstrap.min.css}"
      href="../../css/bootstrap.min.css" rel="stylesheet" media="screen"/>
...
<img src="/img/gizmonicInstitute.jpg"/>
Within my Spring Boot startup log, I see the ResourceHttpRequestHandler mapped as follows:
o.s.w.s.h.SimpleUrlHandlerMapping - Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
Is there something I'm not configuring? The pages are being found and render fine; it's just that these messages are littering my logs.
It turns out that the existing SLF4J/Logback configuration we had in a logback.xml file set the root level to info:
<root level="info">
    <appender-ref ref="STDOUT"/>
    <appender-ref ref="central"/>
</root>
This required setting a configuration level for Spring:
logging.level.org.springframework.*: WARN
Once I set the level to WARN or above, these messages disappeared (which I interpreted as a misconfiguration in my Spring Boot migration). Shout out to @AndyWilkinson for directing my attention toward log message levels.
Update #1
Note that once logback.xml is used, setting the configuration level via properties does not seem to work. So I had to add this to logback.xml:
<logger name="org.springframework" level="WARN"/>
