@DataJpaTest updating actual data with MySQL but working fine with H2 - spring-boot

I am learning @DataJpaTest; my test case is below:
import com.demo.mockito.entity.StudentEntity;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
class StudentRepositoryTest {
@Autowired
private StudentRepository studentRepository;
@Test
public void findAll() {
StudentEntity student1 = new StudentEntity("shrikant", new Date());
studentRepository.save(student1);
List<StudentEntity> entityList = studentRepository.findAll();
assertEquals(1, entityList.size());
}
}
It's giving me this error:
expected: <1> but was: <33>
Expected :1
Actual :33
because right now there are 33 records in the DB, and every run of the save test case adds one more.
src/main/test/application.properties
spring.datasource.url=jdbc:mysql://localhost:3306/student_db?jdbcCompliantTruncation=false&sessionVariables=sql_mode='NO_ENGINE_SUBSTITUTION'&useSSL=false&useServerPrepStmts=false&rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.username=root
spring.datasource.password=root
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL8Dialect
spring.jpa.generate-ddl=true
spring.jpa.database.schema=student_db
spring.jpa.hibernate.ddl-auto=update
spring.jpa.show-sql=true
build.gradle
plugins {
id 'java'
id 'org.springframework.boot' version '2.2.6.RELEASE'
id 'io.spring.dependency-management' version '1.0.9.RELEASE'
}
group 'com.demo.mockito'
version '1.0-SNAPSHOT'
repositories {
mavenCentral()
}
dependencies {
compile 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
compileOnly 'org.projectlombok:lombok'
annotationProcessor 'org.projectlombok:lombok'
testRuntime('org.junit.jupiter:junit-jupiter-engine:5.2.0')
runtime 'mysql:mysql-connector-java'
testImplementation('org.springframework.boot:spring-boot-starter-test') {
exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
}
}
test {
useJUnitPlatform()
}
If I use H2 instead, it gives the correct result because every time it recreates a new instance with no data.
Is this intended behavior, or am I doing something wrong?
Is H2 the standard for database testing?
But I don't want to configure another database in my application when my intention is to test MySQL only.

if I use H2 instead, it gives the correct result because every time it
recreates a new instance with no data.
You answered your own question.
By default, each time the Spring Boot container starts (which happens when you define a test with @SpringBootTest or a test slice annotation such as @DataJpaTest), a new database instance is created when you use an in-memory DB such as H2 (that is H2's default behavior, which you can change). When you use MySQL, Spring Boot doesn't apply that strategy by default: it does not touch the existing DB content.
Indeed, the official doc states:
Spring Boot chooses a default value for you based on whether it thinks
your database is embedded. It defaults to create-drop if no schema
manager has been detected or none in all other cases.
About:
but I don't want to configure another database in my application when
my intention is to test MySQL only.
For unit tests you want an in-memory DB such as H2, because it is straightforward and doesn't require a long or complex setup (populating and cleaning the DB state).
For integration tests you want the target DB (here MySQL), because you want tests that stay as close as possible to your application's real behavior.
To achieve that, you have to use a dedicated test database, populate fixture data for the tests, and finally clean the data so the tests are reproducible.
Both kinds of tests are complementary.
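If you do want to keep this slice test pointed at the real MySQL database, the test itself has to become independent of whatever rows already exist, for example by asserting relative to the current row count, or by cleaning the table in a setup method against a throwaway test schema. A minimal sketch of the first idea (a hypothetical rewrite of the test above, reusing the same entity and repository):

import com.demo.mockito.entity.StudentEntity;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import java.util.Date;
import static org.junit.jupiter.api.Assertions.assertEquals;

@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
class StudentRepositoryTest {

    @Autowired
    private StudentRepository studentRepository;

    @Test
    void savingOneStudentAddsExactlyOneRow() {
        long rowsBefore = studentRepository.count();               // whatever is already in MySQL
        studentRepository.save(new StudentEntity("shrikant", new Date()));
        assertEquals(rowsBefore + 1, studentRepository.findAll().size()); // relative assertion, reproducible
    }
}

Either way, the point is that a test running against a shared MySQL instance must not assume an empty table; only a database created fresh for the test run (H2 in-memory, or a dedicated test schema) gives you that assumption for free.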

Related

How to resolve an error injecting a MapStruct bean in Spring

I'm trying to inject my mapper using MapStruct, but Spring doesn't recognize the bean.
Here is my mapper:
package com.api.gestioncartera.Services.Mappers;
import org.mapstruct.Mapper;
import org.springframework.stereotype.Component;
import com.api.gestioncartera.Entities.CollectionCompany;
import com.api.gestioncartera.Services.DTO.CollectionCompanyDto;
@Mapper(componentModel = "spring")
public interface CollectionCompanyMapper {
CollectionCompanyDto collectionCompanyToCollectionCompanyDto(CollectionCompany collectionCompany);
}
Here is my service, where I'm trying to inject it:
@Service
@Transactional
public class CollectionCompanyServiceImp implements CollectionCompanyService{
@Autowired
private CollectionCompanyMapper companyMapper;
}
My gradle config
plugins {
id 'org.springframework.boot' version '2.5.6'
id 'io.spring.dependency-management' version '1.0.11.RELEASE'
id 'java'
}
...
dependencies {
...
implementation 'org.mapstruct:mapstruct:1.4.2.Final'
annotationProcessor 'org.mapstruct:mapstruct-processor:1.4.2.Final'
}
compileJava {
options.compilerArgs += [
'-Amapstruct.suppressGeneratorTimestamp=true',
'-Amapstruct.suppressGeneratorVersionInfoComment=true',
'-Amapstruct.verbose=true',
'-Amapstruct.defaultComponentModel=spring'
]
}
I also enabled annotation processing in the IDE (see the "Properties in the IDE" screenshot).
The error is:
Consider defining a bean of type 'com.api.gestioncartera.Services.Mappers.CollectionCompanyMapper' in your configuration.
I noticed that I don't have any plugin referencing MapStruct; could this be the problem?
I'm using Spring Tool Suite 4 (Eclipse) + Gradle 6.8 + Spring Boot 2.5.6.
Please help!
Eclipse has its problems with annotation processing.
I solved the issue in my projects using this plugin:
https://plugins.gradle.org/plugin/
Add this to the top of your Gradle plugins block.
plugins {
id "eclipse"
id "com.diffplug.eclipse.apt" version "3.37.1"
}
Then do a Gradle refresh.
If it's still not working, run:
./gradlew eclipse eclipseJdtApt eclipseFactorypath
Hope this helps!
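For reference, merged with the plugins already declared in the question's build, the top of build.gradle would then look roughly like this (versions copied from the snippets above, so adjust to your setup):

plugins {
    id 'eclipse'
    id 'com.diffplug.eclipse.apt' version '3.37.1'
    id 'org.springframework.boot' version '2.5.6'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
}

followed by a Gradle refresh in the IDE so Eclipse regenerates its annotation-processing factory path.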

Spring Boot - Can't resolve @RunWith - cannot find symbol

Spring Boot project.
In build.gradle:
dependencies {
implementation 'com.google.code.gson:gson:2.7'
implementation 'com.h2database:h2'
implementation 'org.springframework.boot:spring-boot-starter'
implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
implementation 'org.springframework.boot:spring-boot-starter-jdbc'
implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml'
implementation 'com.squareup.okhttp3:logging-interceptor:3.8.0'
implementation('com.squareup.retrofit2:retrofit:2.4.0')
implementation('com.squareup.retrofit2:converter-gson:2.4.0')
implementation group: 'javax.validation', name: 'validation-api', version: '2.0.1.Final'
testImplementation('org.springframework.boot:spring-boot-starter-test') {
exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
}
}
test {
useJUnitPlatform()
}
Here is my test class:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;
import org.springframework.test.context.junit4.SpringRunner;
import static org.assertj.core.api.AssertionsForInterfaceTypes.assertThat;
@RunWith(SpringRunner.class)
@DataJpaTest
public class CategoryRepositoryIntegrationTest {
@Autowired
private TestEntityManager entityManager;
@Autowired
private CategoryRepository productRepository;
But I get error:
error: cannot find symbol
@RunWith(SpringRunner.class)
^
symbol: class RunWith
1 error
FAILURE: Build failed with an exception
You mixed JUnit 4 and 5: the Test annotation you use is from JUnit 5, while the RunWith annotation is from JUnit 4. I would recommend sticking with JUnit 5. For this you only need to replace RunWith with the following line:
@ExtendWith(SpringExtension.class)
Or, if you use Spring Boot 2.1 or newer, you can simply remove the RunWith annotation and it will also work, because the test slice annotations are already annotated with @ExtendWith(SpringExtension.class).
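Put together, a JUnit 5-only version of the test class from the question would look something like this (a sketch; the autowired fields are kept as in the original, and CategoryRepository is assumed to exist in the same package):

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;

@DataJpaTest
public class CategoryRepositoryIntegrationTest {

    @Autowired
    private TestEntityManager entityManager;      // provided by the @DataJpaTest slice

    @Autowired
    private CategoryRepository productRepository; // the repository under test

    @Test
    void contextLoads() {
        // @Test is org.junit.jupiter.api.Test, so no @RunWith is needed;
        // on Spring Boot 2.1+ @DataJpaTest already registers the SpringExtension
    }
}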
I found this in the Spring Boot docs; not sure if it helps with your question:
If you are using JUnit 4, don't forget to also add @RunWith(SpringRunner.class) to your test, otherwise the annotations will be ignored. If you are using JUnit 5, there's no need to add the equivalent @ExtendWith(SpringExtension.class) as @SpringBootTest and the other @…Test annotations are already annotated with it.
Here is the link: https://docs.spring.io/spring-boot/docs/2.1.5.RELEASE/reference/html/boot-features-testing.html
Check section 46.3.

How to build a SOAP WS with Apache CXF + Spring Boot in Gradle?

The assignment was simple: a SOAP web service implemented with Spring Boot and JDBC, built with Gradle.
After some time looking around we discovered that Spring-WS only works with a contract-first development style.
We didn't want that, so we dug a little further and confirmed what we already knew: we had to use Apache CXF for a contract-last development style.
So off we went to search, code and test; but once the data access and facades were done we couldn't figure out how to wire the Apache CXF web service to the Spring Boot service facade.
So… how is it done?
This is more of a rhetorical question, because after looking around we could not find an example of Spring Boot and Apache CXF working seamlessly together, so for anyone who may be searching, here is a simple example.
First the dependencies used by the Gradle project
build.gradle file
buildscript {
ext {
springBootVersion = '2.0.1.RELEASE'
}
repositories {
mavenCentral()
}
dependencies {
classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
}
}
apply plugin: 'java'
apply plugin: 'eclipse-wtp'
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management'
apply plugin: 'war'
group = 'com.telcel'
version = '0.0.1-RC'
sourceCompatibility = 1.8
repositories {
mavenCentral()
}
configurations {
providedRuntime
}
dependencies {
// Apache CXF
compile(group: 'org.apache.cxf', name: 'cxf-spring-boot-starter-jaxws', version: '3.1.15') {
exclude(module: 'spring-boot-starter-tomcat')
}
// JDBC support
compile('org.springframework.boot:spring-boot-starter-jdbc')
// embedded servlet container
compile group: 'org.springframework.boot', name: 'spring-boot-starter-undertow', version: '1.5.4.RELEASE'
runtime group: 'com.ibm.informix', name: 'jdbc', version: '4.10.10.0'
testCompile('org.springframework.boot:spring-boot-starter-test')
testRuntime group: 'com.ibm.informix', name: 'jdbc', version: '4.10.10.0'
}
Then, we need some basic things for the CXF config.
application.properties file:
cxf.path=/service
server.address=0.0.0.0
We needed Spring Boot to create a CXF Endpoint, and we also needed that Endpoint to use our Spring-aware facade... this is where the wiring magic happened.
WebServiceConfig.java
package com.telcel.validaserie;
import com.telcel.validaserie.ui.ValidaSerieEndpoint;
import org.apache.cxf.Bus;
import org.apache.cxf.jaxws.EndpointImpl;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.xml.ws.Endpoint;
@Configuration
public class WebServiceConfig {
@Autowired
private Bus bus;
@Autowired
private ValidaSerieEndpoint validaSerieEndpoint;
@Bean
public Endpoint endpoint() {
EndpointImpl endpoint = new EndpointImpl(bus, validaSerieEndpoint);
endpoint.publish("/");
return endpoint;
}
}
Notice the autowired ValidaSerieEndpoint that goes as a parameter into the EndpointImpl constructor; that's the trick, plain and simple.
Finally, just a simple web service implementation exposed as a Spring bean (notice the Spring @Service stereotype).
ValidaSerieEndpoint.java
package com.telcel.validaserie.ui;
import com.telcel.validaserie.servicios.ValidaSeriesFacade;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
@Service
@WebService
public class ValidaSerieEndpoint {
@Autowired
private ValidaSeriesFacade validaSeriesFacade;
@WebMethod
public String validaTelefonoIccid(@WebParam(name = "iccid") String iccid) {
return validaSeriesFacade.validaTelefonoIccid(iccid);
}
@WebMethod
public String validaTelefonoImei(@WebParam(name = "imei") String imei) {
return validaSeriesFacade.validaTelefonoImei(imei);
}
@WebMethod
public int validaFacturaIccid(@WebParam(name = "iccid") String iccid, @WebParam(name = "fuerza-venta") String fuerzaVenta) {
return validaSeriesFacade.validaFacturaIccid(iccid, fuerzaVenta);
}
@WebMethod
public int validaFacturaImei(@WebParam(name = "imei") String imei, @WebParam(name = "fuerza-venta") String fuerzaVenta) {
return validaSeriesFacade.validaFacturaImei(imei, fuerzaVenta);
}
}
And that's it; quite simple once you look at it... hope this helps.
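As a quick smoke test (assuming the application runs on the default port 8080), the CXF service listing, and from there the generated WSDL, should be reachable under the configured cxf.path:

curl http://localhost:8080/service

which lists the published SOAP endpoints together with links to their ?wsdl documents.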

How do I resolve an error in my configuration when trying to unit test a Spring Kafka Consumer?

Code Location
I think there might be too many modules to make this question clean-looking, so here is the repo; I will hopefully include all the necessary components.
https://github.com/ewingian/RestCalculator
Problem
I am learning to write Kafka services; this includes learning unit testing for the producer and consumer. I followed a tutorial on setting up unit testing for the consumer. When I run the test I receive a class configuration error.
ERROR
9:34:59.547 [main] WARN kafka.server.BrokerMetadataCheckpoint - No meta.properties file under dir /tmp/kafka-8346130278143417083/meta.properties
09:34:59.567 [main] ERROR kafka.server.KafkaServer - [Kafka Server 0], Fatal error during KafkaServer startup. Prepare to shutdown
java.lang.NoClassDefFoundError: org/apache/kafka/common/network/LoginType
at kafka.network.Processor.<init>(SocketServer.scala:406)
at kafka.network.SocketServer.newProcessor(SocketServer.scala:141)
at kafka.network.SocketServer$$anonfun$startup$1$$anonfun$apply$1.apply$mcVI$sp(SocketServer.scala:94)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at kafka.network.SocketServer$$anonfun$startup$1.apply(SocketServer.scala:93)
at kafka.network.SocketServer$$anonfun$startup$1.apply(SocketServer.scala:89)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
at kafka.network.SocketServer.startup(SocketServer.scala:89)
at kafka.server.KafkaServer.startup(KafkaServer.scala:219)
at kafka.utils.TestUtils$.createServer(TestUtils.scala:120)
at kafka.utils.TestUtils.createServer(TestUtils.scala)
at org.springframework.kafka.test.rule.KafkaEmbedded.before(KafkaEmbedded.java:154)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:191)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:51)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.network.LoginType
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 23 common frames omitted
The picture shows my directory structure
When I look at the list of libraries in the project structure settings of IDEA I see org.apache.kafka:kafka-clients:0.11.0.0; however, I cannot import the missing class (org/apache/kafka/common/network/LoginType), which I understand is part of kafka-clients.
Question
Has anyone come across this error before? Have I misconfigured my Gradle file? Is my project directory set up correctly to effectively perform Kafka unit testing? What might I be missing? I have not found much information on LoginType yet, but will keep searching.
Here is a copy of my Gradle build file:
buildscript {
repositories {
mavenCentral()
}
dependencies {
classpath("org.springframework.boot:spring-boot-gradle-plugin:1.5.10.RELEASE")
}
}
apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'idea'
apply plugin: 'org.springframework.boot'
jar {
baseName = 'calculator'
version = '0.1.0'
}
repositories {
mavenCentral()
}
sourceCompatibility = 1.8
targetCompatibility = 1.8
dependencies {
compile("org.springframework.boot:spring-boot-starter-web")
testCompile("org.springframework.boot:spring-boot-starter-test")
compile("org.springframework.kafka:spring-kafka:1.3.2.RELEASE")
testCompile("org.springframework.kafka:spring-kafka-test")
}
task wrapper(type: Wrapper) {
gradleVersion = '2.3'
}
If there is anything else I might need to include in this question please let me know. Thanks
Unit Test Code
package com.calculator;
/**
* Created by ian on 2/9/18.
*/
import com.calculator.kafka.services.*;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.assertThat;
import static org.springframework.kafka.test.assertj.KafkaConditions.key;
import static org.springframework.kafka.test.hamcrest.KafkaMatchers.hasValue;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.junit.After;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.listener.config.ContainerProperties;
import org.springframework.kafka.test.rule.KafkaEmbedded;
import org.springframework.kafka.test.utils.KafkaTestUtils;
import org.springframework.test.context.junit4.SpringRunner;
@RunWith(SpringRunner.class)
@SpringBootTest
public class KafkaTest {
// private static final Logger LOGGER = LoggerFactory.getLogger(KafkaTest.class);
private static final String TEMPLATE_TOPIC = "input";
private static String SENDER_TOPIC = "input";
@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, TEMPLATE_TOPIC);
private KafkaMessageListenerContainer<String, Integer> container;
private BlockingQueue<ConsumerRecord<String, Integer>> records;
@Autowired
private KafkaConsumer consumer;
@Before
public void setUp() {
records = new LinkedBlockingQueue<>(); // initialize the queue that the listener below writes into
// Set up the consumer properties
Map<String, Object> integerProperties = KafkaTestUtils.consumerProps("jsa-group", "false", embeddedKafka);
// create a Kafka consumer factory
DefaultKafkaConsumerFactory<String, Integer> consumerFactory = new DefaultKafkaConsumerFactory<String, Integer>(integerProperties);
// set the topic that needs to be consumed
ContainerProperties containerProperties = new ContainerProperties(SENDER_TOPIC);
// create a Kafka MessageListenerContainer
container = new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);
// setup a Kafka message listener
container.setupMessageListener(new MessageListener<String, Integer>() {
@Override
public void onMessage(ConsumerRecord<String, Integer> record) {
// LOGGER.debug("test-listener received message='{}'", record.toString());
records.add(record);
}
});
// start the container and underlying message listener
container.start();
}
@After
public void tearDown() {
// stop the container
container.stop();
}
@Test
public void testTemplate() throws Exception {
// send the message
String greeting = "Hello Spring Kafka Sender!";
Integer i1 = 12;
consumer.processMessage(i1);
// check that the message was received
ConsumerRecord<String, Integer> received = records.poll(10, TimeUnit.SECONDS);
// Hamcrest Matchers to check the value
assertThat(received, hasValue(i1));
// AssertJ Condition to check the key
assertThat(received).has(key(null));
}
}
Here is a summary of what I discovered:
I was not using the most up-to-date Spring Boot version. Following the tutorials, I used 1.5.10.RELEASE. Nothing wrong with that, but it had compatibility issues with Spring Kafka.
I tried using an up-to-date version of Spring Kafka with Spring Boot 1.5.10.RELEASE. I kept getting errors about certain classes not being found and had to lower the Spring Kafka version to 1.3.2.RELEASE.
This configuration allowed me to run my Spring Boot application, but the unit testing failed, hence this Stack Overflow question.
I attempted to rewrite my Gradle file to use newer Spring Boot and Kafka versions, but was met with failure; I think the repositories were no good.
SOLUTION
Finally I went to Spring's website and used their project generator. I pulled in the latest Spring Kafka and Boot, which gave me a fresh build.gradle with the proper repos. I added spring-kafka-test manually and the unit tests ran successfully.
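For anyone following the same path, the regenerated build essentially lets the Boot plugin manage the Kafka versions. A sketch of what that build.gradle can boil down to (plugin versions are illustrative; use whatever Initializr generates for you):

plugins {
    id 'org.springframework.boot' version '2.0.5.RELEASE'
    id 'io.spring.dependency-management' version '1.0.6.RELEASE'
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    implementation 'org.springframework.kafka:spring-kafka'                // version managed by the Boot BOM
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
    testImplementation 'org.springframework.kafka:spring-kafka-test'       // embedded broker used by the test above
}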

Spring Boot 1.5 validated ConfigurationProperties

Jump to the bottom for the motivations and the solutions to this issue!
In the process of upgrading from Spring Boot 1.4 to 1.5 I read (source: https://github.com/spring-projects/spring-boot/wiki/Spring-Boot-1.5-Release-Notes#upgrading-from-spring-boot-14)
If you have @ConfigurationProperties classes that use JSR-303 constraint annotations, you should now additionally annotate them with @Validated. Existing validation will currently continue to work, however, a warning will be logged. In the future, classes without @Validated will not be validated at all.
So, diligently, I added @Validated to all of my configuration properties. Now I have a specific use case that breaks, i.e. the property is not loaded anymore (I summarize first, then add code).
If I use a template property defined in the application.properties file and then try to override the value for specific profiles, the application does not start.
Here is some sample code to reproduce (relevant files):
build.gradle
buildscript {
ext {
springBootVersion = '1.5.1.RELEASE'
}
repositories {
mavenCentral()
}
dependencies {
classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
}
}
apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = 1.8
repositories {
mavenCentral()
}
dependencies {
compile('org.springframework.boot:spring-boot-starter-web')
testCompile('org.springframework.boot:spring-boot-starter-test')
}
application.properties : demo.prop=${profile.prop}
application-demo.properties : profile.prop=demo
DemoApplication.java
package package;
import javax.validation.constraints.NotNull;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.GetMapping;
@SpringBootApplication
public class DemoApplication {
public static void main(String[] args) {
SpringApplication.run(DemoApplication.class, args);
}
@org.springframework.web.bind.annotation.RestController
public static class RestController {
@Autowired
private DemoProperties properties;
@GetMapping
public String get() {
return properties.prop == null ? "null" : properties.prop;
}
}
@Component
@ConfigurationProperties(prefix = "demo")
// @Validated
public static class DemoProperties {
@NotNull
private String prop;
public void setProp(String prop) {
this.prop = prop;
}
public String getProp() {
return prop;
}
}
}
As it stands, my application produces the expected result when run with -Dspring.profiles.active=demo
curl "http://localhost:8080"
demo
However, uncommenting // @Validated and running the application as before produces
curl "http://localhost:8080"
null
Full application available at https://github.com/ThanksForAllTheFish/boot-props (including a test case showing that defining profile.prop in config/application.properties fails as well with @Validated but succeeds without).
I guess it is a bug in Spring Boot, but it may be me not understanding something, so Stack Overflow first (as hinted by the Spring Boot issue manager on GitHub).
This github issue seems related: https://github.com/spring-projects/spring-boot/issues/8173
As I found the solution to my issue (some time ago already, but added as an explanation in the question itself), I figured it might be more helpful to copy my findings here.
The problem with my sample code is that @Validated wraps the real class with a proxy so that validation concerns can be injected; therefore return properties.prop == null ? "null" : properties.prop; is actually trying to access the prop field of the proxy. Changing to getProp() is the fix. Pretty obvious once found out.
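Concretely, the controller method just has to go through the getter instead of reading the field:

    @GetMapping
    public String get() {
        // properties may be a @Validated proxy, so use the getter rather than the field
        return properties.getProp() == null ? "null" : properties.getProp();
    }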
Regarding the production code: the issue was related to https://github.com/spring-projects/spring-boot/issues/8173, or more precisely to https://github.com/spring-cloud/spring-cloud-commons/issues/177, as we use spring-cloud. Basically, there was a conflict between a spring-cloud BeanPostProcessor and a spring-boot one (details in the GitHub ticket) that was solved in the Dalston.RELEASE of spring-cloud. Just updating the dependency in our project solved the issue in production as well. A lot of digging and testing just to change 7 characters in our codebase.
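On the spring-cloud side, the fix amounted to moving the BOM up to Dalston.RELEASE, which in a Gradle build that applies the io.spring.dependency-management plugin is a one-liner along these lines (a sketch, not our exact production build):

dependencyManagement {
    imports {
        mavenBom 'org.springframework.cloud:spring-cloud-dependencies:Dalston.RELEASE'
    }
}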
