Spring - Thymeleaf - Hot reload of message files (localisation)

I'm using Spring Boot (2.5.7) with the devtools dependency for hot reload. It works pretty well (including changes in fragments), but not for the localisation files (message_XX.properties under resources/lang). Every time I make a change there, I need to restart the server. Here is my application.yaml:
spring:
  thymeleaf:
    cache: false
    mode: HTML
    encoding: UTF-8
    prefix: file:src/main/resources/templates/
  web:
    resources:
      static-locations:
        - file:src/main/resources/static/
      cache:
        period: 0
Some edits:
- I use VS Code and Gradle 7.
- I redefined a MessageSource.
Any idea?
Thanks!

Forgot to reply earlier; in the end, I simply went back to:
spring:
  thymeleaf:
    cache: false
    mode: HTML
    encoding: UTF-8
  cache:
    period: 0
My LocaleConfig class:
import java.util.Locale;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.ResourceBundleMessageSource;
import org.springframework.web.servlet.i18n.AcceptHeaderLocaleResolver;

@Configuration
public class LocaleConfig {

    @Bean
    public AcceptHeaderLocaleResolver localeResolver() {
        final AcceptHeaderLocaleResolver resolver = new AcceptHeaderLocaleResolver();
        resolver.setDefaultLocale(Locale.ENGLISH);
        return resolver;
    }

    @Bean
    public ResourceBundleMessageSource messageSource() {
        final ResourceBundleMessageSource source = new ResourceBundleMessageSource();
        source.setBasename("lang/message");
        return source;
    }
}
And then I need two terminals: one with a continuous Gradle build and one with the Spring Boot bootRun task:
./gradlew compileJava --continuous
and
./gradlew bootRun
DISCLAIMER: occasionally I need to stop and restart both, as the new messages are sometimes not picked up.
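For anyone hitting the same thing: a ReloadableResourceBundleMessageSource can avoid the restart dance entirely, because it can re-read the bundle files from disk. A minimal sketch, assuming the bundles live under src/main/resources/lang and that reading straight from the source tree is acceptable during development:
import java.util.concurrent.TimeUnit;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.ReloadableResourceBundleMessageSource;

@Configuration
public class ReloadableMessageSourceConfig {

    @Bean
    public ReloadableResourceBundleMessageSource messageSource() {
        final ReloadableResourceBundleMessageSource source = new ReloadableResourceBundleMessageSource();
        // file: reads from the source tree during development;
        // switch to classpath:lang/message for the packaged application
        source.setBasename("file:src/main/resources/lang/message");
        source.setDefaultEncoding("UTF-8");
        source.setCacheMillis(TimeUnit.SECONDS.toMillis(1)); // re-check the files every second
        return source;
    }
}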

Related

Spring integration properties for Kafka

While trying to use listener config properties in application.yml, I am facing an issue where the @KafkaListener-annotated method is not invoked at all if I use the application.yml config (listener.type: batch). It only gets invoked when I explicitly call setBatchListener(true) in code. Here is my code and configuration.
Consumer code:
@KafkaListener(containerFactory = "kafkaListenerContainerFactory",
        topics = "${spring.kafka.template.default-topic}",
        groupId = "${spring.kafka.consumer.group-id}")
public void receive(List<ConsumerRecord<String, byte[]>> consumerRecords, Acknowledgment acknowledgment) {
    processor.process(consumerRecords, acknowledgment);
}
application.yml:
listener:
  missing-topics-fatal: false
  type: batch
  ack-mode: manual
Consumer configuration:
// bean method signature reconstructed here; the original snippet started mid-method,
// but the containerFactory attribute above implies a bean named "kafkaListenerContainerFactory"
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
        KafkaProperties kafkaProperties) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(
            new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties()));
    factory.setErrorHandler(new SeekToCurrentErrorHandler(
            new UpdateMessageErrorHandler(), new FixedBackOff(idleEventInterval, maxFailures)));
    final ContainerProperties properties = factory.getContainerProperties();
    properties.setIdleBetweenPolls(idleBetweenPolls);
    properties.setIdleEventInterval(idleEventInterval);
    return factory;
}
If I'm not mistaken, by building the ConcurrentKafkaListenerContainerFactory yourself in your configuration, you're essentially overriding a piece of code that is usually executed within the ConcurrentKafkaListenerContainerFactoryConfigurer class in the Spring auto-configuration package:
if (properties.getType().equals(Type.BATCH)) {
    factory.setBatchListener(true);
    factory.setBatchErrorHandler(this.batchErrorHandler);
} else {
    factory.setErrorHandler(this.errorHandler);
}
Since it's hardcoded in your application.yaml file anyway, why is it a bad thing for it to be configured in your @Configuration class instead?
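In other words, since you have taken over the factory definition, you can simply set the flag yourself. A one-line sketch, assuming the same factory bean shown above:
// inside the kafkaListenerContainerFactory bean method, before returning the factory
factory.setBatchListener(true); // what the auto-configuration would do for listener.type=batch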

Swagger UI - Load custom file.yaml/json instead default configuration

I'm developing a Spring Boot REST project which runs perfectly. I'm trying to implement the OpenApi-ui in the project. It's working fine by default, but I'd like to use my own yaml/json information file instead of the default info.
I have been following the SpringDoc F.A.Q. documentation, but nothing is working for me. The UI throws FAILED TO LOAD API DEFINITION: Fetch error undefined /open-api.yaml. Am I missing something in my configuration?
Thanks in advance.
Implementation
implementation group: 'org.springdoc', name: 'springdoc-openapi-ui', version: '1.5.9'
App config (yaml path = src/main/resources)
springdoc:
  api-docs:
    enabled: false
  swagger-ui:
    url: /open-api.yaml
Configuration
@Bean
public OpenAPI customOpenAPI() {
    return new OpenAPI()
            .components(new Components().addSecuritySchemes("basicScheme",
                    new SecurityScheme().type(SecurityScheme.Type.HTTP).scheme("basic")))
            .info(new Info().title("MyApp").version("1.0")
                    .license(new License().name("Apache 2.0").url("http://springdoc.org")));
}

@Bean
public SpringDocConfiguration springDocConfiguration() {
    return new SpringDocConfiguration();
}

@Bean
public SpringDocConfigProperties springDocConfigProperties() {
    return new SpringDocConfigProperties();
}
Yaml file
openapi: 3.0.3
info:
  title: MyApp
  description: MyApp Description
  version: 1.0.0
servers:
  - url: http://localhost:8080
    description: Local server
{...more}
Access URL to OpenApi UI
http://localhost:8080/swagger-ui/index.html?configUrl=/v3/api-docs/swagger-config
[OpenApi UI screenshot showing the error]
In case someone is looking for something similar: we finally created a new class extending SwaggerIndexPageTransformer (which implements SwaggerIndexTransformer), which let us @Override its transform method to change the URL.
You can follow https://github.com/springdoc/springdoc-openapi/issues/763
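For what it's worth, the Fetch error undefined /open-api.yaml usually just means nothing is actually served at that path. A minimal layout that lets Spring Boot serve the file itself (assuming the default static resource mapping) is:
src/main/resources/static/open-api.yaml    ->    http://localhost:8080/open-api.yaml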

How to configure LoggingMeterRegistry step duration in Spring Boot 2.x?

I am trying to configure the LoggingMeterRegistry to log metrics for my Spring Boot 2.1.6 application. I want the metrics to be logged every hour.
In my application.yml, I have configured the following:
management:
  metrics:
    export:
      logging:
        enabled: true
        step: 60m
But in the logs I see the metrics being logged every minute. I've tried the other variation of the property key as well, e.g.
management.metrics.export.logging:
  enabled: true
  step: 60m
I have also tried various formats for the duration string, e.g. 1h, PT60M, but with no success. The metrics are logged at 1-minute intervals.
I was looking at the code in StepDurationConverterTest and StepDurationConverter, which converts the step duration String to a Duration object, and it looks like both formats, 60m and 1h, should work.
Any ideas why I can't seem to change the logging interval?
I think the problem here is that there's no
org.springframework.boot.actuate.autoconfigure.metrics.export.logging
package like there is for the other MeterRegistry implementations (e.g. org.springframework.boot.actuate.autoconfigure.metrics.export.jmx).
I.e., there's no auto-configuration for these properties in Spring Boot. This is probably because the LoggingMeterRegistry is marked as @Incubating.
You need to manually configure the LoggingMeterRegistry as a bean and create your own @ConfigurationProperties LoggingProperties and LoggingPropertiesConfigAdapter to get this to work. Or just hardcode the step period you want.
To configure the step duration in Micrometer, follow the steps below:
import java.time.Duration;

import io.micrometer.core.instrument.Clock;
import io.micrometer.core.instrument.logging.LoggingMeterRegistry;
import io.micrometer.core.instrument.logging.LoggingRegistryConfig;
import io.micrometer.core.instrument.util.NamedThreadFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoggingMeterRegistryConfig {

    @Bean
    public LoggingMeterRegistry loggingMeterRegistry() {
        LoggingRegistryConfig config = new LoggingRegistryConfig() {
            @Override
            public String get(String key) {
                return null; // fall back to defaults for everything not overridden below
            }

            @Override
            public Duration step() {
                return Duration.ofMinutes(2); // publish to the log every 2 minutes
            }
        };
        return LoggingMeterRegistry.builder(config)
                .clock(Clock.SYSTEM)
                .threadFactory(new NamedThreadFactory("logging-metrics-publisher"))
                .build();
    }
}
The following @Bean supplies config from the Spring Environment, allowing you to specify a property logging.step: 1h to get your desired period.
@Bean
LoggingMeterRegistry loggingMeterRegistry(Environment env) {
    LoggingRegistryConfig springBasedConfig = prop -> env.getProperty(prop, String.class);
    return new LoggingMeterRegistry(springBasedConfig, Clock.SYSTEM);
}
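With that bean in place, the period comes from any Spring property source. LoggingRegistryConfig's default prefix is logging, so Micrometer looks up logging.step; for example, in application.yml:
logging:
  step: 1h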

Spring Cloud Stream Kafka Channel Not Working in Spring Boot Application

I have been attempting to get an inbound SubscribableChannel and an outbound MessageChannel working in my Spring Boot application.
I have set up the Kafka channel and tested it successfully.
Furthermore, I have created a basic Spring Boot application that tests adding and receiving things from the channel.
The issue I am having is that when I put the equivalent code in the application it belongs in, the messages never seem to get sent or received. It's hard to ascertain what's going on by debugging, but the only thing that looks different to me is the channel name: in the working implementation the channel name is like application.channel, while in the non-working app it's localhost:8080/channel.
I was wondering if there is some Spring Boot configuration blocking or redirecting the creation of the channels to a different channel source?
Anyone had any similar issues?
application.yml
spring:
  datasource:
    url: jdbc:h2:mem:dpemail;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    platform: h2
    username: hello
    password:
    driverClassName: org.h2.Driver
  jpa:
    properties:
      hibernate:
        show_sql: true
        use_sql_comments: true
        format_sql: true
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092
      bindings:
        email-in:
          destination: email
          contentType: application/json
        email-out:
          destination: email
          contentType: application/json
Email
public class Email {

    private long timestamp;
    private String message;

    public long getTimestamp() {
        return timestamp;
    }

    public void setTimestamp(long timestamp) {
        this.timestamp = timestamp;
    }

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }
}
Binding Config
@EnableBinding(EmailQueues.class)
public class EmailQueueConfiguration {
}
Interface
public interface EmailQueues {

    String INPUT = "email-in";
    String OUTPUT = "email-out";

    @Input(INPUT)
    SubscribableChannel inboundEmails();

    @Output(OUTPUT)
    MessageChannel outboundEmails();
}
Controller
@RestController
@RequestMapping("/queue")
public class EmailQueueController {

    private EmailQueues emailQueues;

    @Autowired
    public EmailQueueController(EmailQueues emailQueues) {
        this.emailQueues = emailQueues;
    }

    @RequestMapping(value = "sendEmail", method = POST)
    @ResponseStatus(ACCEPTED)
    public void sendToQueue() {
        MessageChannel messageChannel = emailQueues.outboundEmails();
        Email email = new Email();
        email.setMessage("hello world: " + System.currentTimeMillis());
        email.setTimestamp(System.currentTimeMillis());
        messageChannel.send(MessageBuilder.withPayload(email)
                .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
                .build());
    }

    @StreamListener(EmailQueues.INPUT)
    public void handleEmail(@Payload Email email) {
        System.out.println("received: " + email.getMessage());
    }
}
I'm not sure if one of the inherited configuration projects using Spring Cloud / Spring Cloud Sleuth might be preventing it from working, but even when I remove it, it still doesn't work. Unlike my application that does work with the above code, I never see the ConsumerConfig being logged, e.g.:
o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
auto.commit.interval.ms = 100
auto.offset.reset = latest
bootstrap.servers = [localhost:9092]
check.crcs = true
client.id = consumer-2
connections.max.idle.ms = 540000
enable.auto.commit = false
exclude.internal.topics = true
(This configuration is what I see in my basic Spring Boot application when running the above code, where writing and reading from the Kafka channel works.)
I assume some overriding Spring Boot configuration from one of the libraries I'm using is creating a different type of channel; I just cannot find what that configuration is.
What you posted contains a lot of unrelated configuration, so it's hard to determine if anything gets in the way. Also, when you say "it appears that the messages never get sent or received", are there any exceptions in the logs? And please state the versions of Kafka and Spring Cloud Stream you're using.
Now, I did try to reproduce it based on your code (after cleaning up a bit to only leave relevant parts) and was able to successfully send/receive.
My Kafka version is 0.11 and Spring Cloud Stream 2.0.0.
Here is the relevant code:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092
      bindings:
        email-in:
          destination: email
        email-out:
          destination: email
@SpringBootApplication
@EnableBinding(KafkaQuestionSoApplication.EmailQueues.class)
public class KafkaQuestionSoApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaQuestionSoApplication.class, args);
    }

    @Bean
    public ApplicationRunner runner(EmailQueues emailQueues) {
        return new ApplicationRunner() {
            @Override
            public void run(ApplicationArguments args) throws Exception {
                emailQueues.outboundEmails().send(new GenericMessage<String>("Hello"));
            }
        };
    }

    @StreamListener(EmailQueues.INPUT)
    public void handleEmail(String payload) {
        System.out.println("received: " + payload);
    }

    public interface EmailQueues {

        String INPUT = "email-in";
        String OUTPUT = "email-out";

        @Input(INPUT)
        SubscribableChannel inboundEmails();

        @Output(OUTPUT)
        MessageChannel outboundEmails();
    }
}
Okay, so after a lot of debugging... I discovered that something was creating a Test Support Binder (how, I don't know yet), which is obviously meant to stop tests from adding messages to a real channel.
After adding
@SpringBootApplication(exclude = TestSupportBinderAutoConfiguration.class)
the Kafka channel configuration works and messages are being sent. It would be interesting to know what on earth is setting up this test support binder... I'll find that sucker eventually.
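For what it's worth, the usual suspect for a Test Support Binder appearing out of nowhere is spring-cloud-stream-test-support sitting on the main classpath instead of the test one; it exists precisely so tests don't publish to a real broker. A sketch of the likely Gradle fix, assuming the dependency was declared with a compile-scope configuration:
dependencies {
    // keep the test binder off the runtime classpath so the real Kafka binder is used
    testImplementation 'org.springframework.cloud:spring-cloud-stream-test-support'
}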

How to drop in-memory h2db between Spring Integration tests?

I am using Liquibase in my Spring web application. I have a bunch of entities with hundreds of tests for REST APIs in the integration tests for each entity, like User, Account, Invoice, License, etc. All of my integration tests pass when run by class, but a lot of them fail when run together using gradle test. It is very likely there is data collision between the tests, and I am not interested in spending time cleaning up the data right now. I prefer dropping the DB and context after every class. I figured I could use @DirtiesContext at the class level, so I annotated my test with it.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {Application.class, SecurityConfiguration.class},
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@DirtiesContext
public class InvoiceResourceIntTest {
I see that after adding the annotation, the web application context starts for every class, but when the Liquibase initialization happens, the queries are not run because the checksums match. Since this is an in-memory DB, I expected the DB to be destroyed along with the Spring context, but that is not happening.
I also set jpa hibernate ddl-auto to create-drop, but that did not help. The next option I am considering is, instead of mem, writing the h2db to a file and deleting that file in @BeforeClass of my integration test classes. I prefer dropping the DB automatically in memory instead of managing it in the test, but I want to try this as a last option. Thanks for the help.
Update:
I updated the test as below.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {Application.class, SecurityConfiguration.class},
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
        properties = "spring.datasource.name=AccountResource")
@DirtiesContext
public class AccountResourceIntTest {
I have set unique names for each integration test, but I still don't see a new database being created; I only see the Liquibase checksums in the logs.
Here is my app config from application.yml
spring:
  datasource:
    driver-class-name: org.h2.Driver
    url: jdbc:h2:mem:myApp;DB_CLOSE_DELAY=-1
    name:
    username:
    password:
  jpa:
    database-platform: com.neustar.registry.le.domain.util.FixedH2Dialect
    database: H2
    open-in-view: false
    show_sql: true
    hibernate:
      ddl-auto: create-drop
      naming-strategy: org.springframework.boot.orm.jpa.hibernate.SpringNamingStrategy
    properties:
      hibernate.cache.use_second_level_cache: false
      hibernate.cache.use_query_cache: false
      hibernate.generate_statistics: true
      hibernate.hbm2ddl.auto: validate
My project was generated with JHipster 2.x, if it matters. Please see my database configuration class below. AppProperties are application-specific properties (different from Spring's).
@Configuration
public class DatabaseConfiguration {

    private static final int LIQUIBASE_POOL_INIT_SIZE = 1;
    private static final int LIQUIBASE_POOL_MAX_ACTIVE = 1;
    private static final int LIQUIBASE_POOL_MAX_IDLE = 0;
    private static final int LIQUIBASE_POOL_MIN_IDLE = 0;
    private static final Logger LOG = LoggerFactory.getLogger(DatabaseConfiguration.class);

    /**
     * Creates data source.
     *
     * @param dataSourceProperties Data source properties configured.
     * @param appProperties the app properties
     * @return Data source.
     */
    @Bean(destroyMethod = "close")
    @ConditionalOnClass(org.apache.tomcat.jdbc.pool.DataSource.class)
    @Primary
    public DataSource dataSource(final DataSourceProperties dataSourceProperties,
            final AppProperties appProperties) {
        LOG.info("Configuring Datasource with url: {}, user: {}",
                dataSourceProperties.getUrl(), dataSourceProperties.getUsername());
        if (dataSourceProperties.getUrl() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database URL!");
            throw new ApplicationContextException("Data source is not configured correctly, please specify URL");
        }
        if (dataSourceProperties.getUsername() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database user!");
            throw new ApplicationContextException(
                    "Data source is not configured correctly, please specify database user");
        }
        if (dataSourceProperties.getPassword() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database password!");
            throw new ApplicationContextException(
                    "Data source is not configured correctly, "
                    + "please specify database password");
        }
        PoolProperties config = new PoolProperties();
        config.setDriverClassName(dataSourceProperties.getDriverClassName());
        config.setUrl(dataSourceProperties.getUrl());
        config.setUsername(dataSourceProperties.getUsername());
        config.setPassword(dataSourceProperties.getPassword());
        config.setInitialSize(appProperties.getDatasource().getInitialSize());
        config.setMaxActive(appProperties.getDatasource().getMaxActive());
        config.setTestOnBorrow(appProperties.getDatasource().isTestOnBorrow());
        config.setValidationQuery(appProperties.getDatasource().getValidationQuery());
        org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource(config);
        LOG.info("Data source is created: {}", dataSource);
        return dataSource;
    }

    /**
     * Create data source for Liquibase using the DBA user and password provided for "liquibase"
     * in application.yml.
     *
     * @param dataSourceProperties Data source properties
     * @param liquibaseProperties Liquibase properties.
     * @param appProperties the app properties
     * @return Data source for Liquibase.
     */
    @Bean(destroyMethod = "close")
    @ConditionalOnClass(org.apache.tomcat.jdbc.pool.DataSource.class)
    public DataSource liquibaseDataSource(final DataSourceProperties dataSourceProperties,
            final LiquibaseProperties liquibaseProperties, final AppProperties appProperties) {
        LOG.info("Configuring Liquibase Datasource with url: {}, user: {}",
                dataSourceProperties.getUrl(), liquibaseProperties.getUser());
        /*
         * This is needed for integration testing. When we run integration tests using
         * SpringJUnit4ClassRunner, Spring uses H2DB if it is in the class path. In that case,
         * we have to create a pool for H2DB. Need to find a better solution for this.
         */
        if (dataSourceProperties.getDriverClassName() != null
                && dataSourceProperties.getDriverClassName().startsWith("org.h2.")) {
            return dataSource(dataSourceProperties, appProperties);
        }
        if (dataSourceProperties.getUrl() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database URL!");
            throw new ApplicationContextException("Liquibase is not configured correctly, please specify URL");
        }
        if (liquibaseProperties.getUser() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database user!");
            throw new ApplicationContextException(
                    "Liquibase is not configured correctly, please specify database user");
        }
        if (liquibaseProperties.getPassword() == null) {
            LOG.error("Your Liquibase configuration is incorrect, please specify database password!");
            throw new ApplicationContextException(
                    "Liquibase is not configured correctly, please specify database password");
        }
        PoolProperties config = new PoolProperties();
        config.setDriverClassName(dataSourceProperties.getDriverClassName());
        config.setUrl(dataSourceProperties.getUrl());
        config.setUsername(liquibaseProperties.getUser());
        config.setPassword(liquibaseProperties.getPassword());
        // for the Liquibase pool, we don't need more than 1 connection
        config.setInitialSize(LIQUIBASE_POOL_INIT_SIZE);
        config.setMaxActive(LIQUIBASE_POOL_MAX_ACTIVE);
        // for the Liquibase pool, we don't want any connections to linger around
        config.setMaxIdle(LIQUIBASE_POOL_MAX_IDLE);
        config.setMinIdle(LIQUIBASE_POOL_MIN_IDLE);
        org.apache.tomcat.jdbc.pool.DataSource dataSource = new org.apache.tomcat.jdbc.pool.DataSource(config);
        LOG.info("Liquibase data source is created: {}", dataSource);
        return dataSource;
    }

    /**
     * Creates a Liquibase instance.
     *
     * @param dataSource Data source to use for Liquibase.
     * @param dataSourceProperties Datasource properties.
     * @param liquibaseProperties Liquibase properties.
     * @return Liquibase instance to be used in Spring.
     */
    @Bean
    public SpringLiquibase liquibase(@Qualifier("liquibaseDataSource") final DataSource dataSource,
            final DataSourceProperties dataSourceProperties, final LiquibaseProperties liquibaseProperties) {
        // Use liquibase.integration.spring.SpringLiquibase if you don't want Liquibase to start asynchronously
        SpringLiquibase liquibase = new AsyncSpringLiquibase();
        liquibase.setDataSource(dataSource);
        liquibase.setChangeLog("classpath:config/liquibase/master.xml");
        liquibase.setContexts(liquibaseProperties.getContexts());
        liquibase.setDefaultSchema(liquibaseProperties.getDefaultSchema());
        liquibase.setDropFirst(liquibaseProperties.isDropFirst());
        liquibase.setShouldRun(liquibaseProperties.isEnabled());
        return liquibase;
    }
}
This is because each test shares the same database, and the lifecycle of H2 is not in our control. If you start a process (the VM) and require a database named foo, close the application context, start a new one, and require foo again, you'll get the same instance.
In the upcoming 1.4.2 release we've added a property to generate a unique name for the database on startup (see spring.datasource.generate-unique-name), and that value will be set to true by default in 1.5.
In the meantime, you can annotate each test with @SpringBootTest(properties = "spring.datasource.name=xyz") where xyz is different for each test that requires a separate DB.
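For example, on Boot 1.4.2+ you can flip the property on yourself in the test application.yml (a sketch; on 1.5 it becomes the default):
spring:
  datasource:
    generate-unique-name: true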
If I understand everything correctly, Liquibase takes care of the database status. For every file, including the test data, Liquibase stores a checksum in a table to check whether something has changed or not. The H2 instance is still alive after a @DirtiesContext, so the checksums still exist in the database. Liquibase thinks everything is correct, but the test data may have changed.
To force Liquibase to drop the database and recreate a completely new one, you must set these properties in the application.yml used for tests:
liquibase:
  contexts: test
  drop-first: true
Or, as an alternative, you can hardcode it:
liquibase.setDropFirst(true);
You can either annotate your test with @DirtiesContext, which slows down the tests because the whole application context gets rebuilt.
Or you can create a custom TestExecutionListener, which is much faster. I've created a custom TestExecutionListener which recreates the database and keeps the context.
public class CleanUpDatabaseTestExecutionListener extends AbstractTestExecutionListener {

    @Inject
    SpringLiquibase liquibase;

    @Override
    public int getOrder() {
        return Ordered.HIGHEST_PRECEDENCE;
    }

    @Override
    public void afterTestClass(TestContext testContext) throws Exception {
        // This is a bit dirty, but it works well: autowire this listener's fields
        // from the test's application context, then re-run Liquibase
        testContext.getApplicationContext()
                .getAutowireCapableBeanFactory()
                .autowireBean(this);
        liquibase.afterPropertiesSet();
    }
}
If you are using the TestExecutionListener, you must register the listener on your test with:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@WebAppConfiguration
@IntegrationTest
@TestExecutionListeners(listeners = {
        DependencyInjectionTestExecutionListener.class,
        TransactionalTestExecutionListener.class,
        CleanUpDatabaseTestExecutionListener.class,
})
public class Test {
    // your tests
}
NOTE: DO NOT USE @DirtiesContext and the TestExecutionListener together; this will lead to an error.
Solved by removing the username, url, and password parameters:
spring:
  autoconfigure:
    exclude: org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration
  jackson:
    serialization:
      indent_output: true
  datasource:
    driver-class-name: org.hsqldb.jdbcDriver
    generate-unique-name: true
  jpa:
    hibernate:
      dialect: org.hibernate.dialect.HSQLDialect
      ddl-auto: validate
    show-sql: true
  h2:
    console:
      enabled: false
liquibase:
  change-log: classpath:/liquibase/db.changelog-master.xml
  drop-first: true
  contexts: QA
