Hibernate ColumnTransformer and DataJpaTest - spring-boot

I have a MySQL database that is read and written through Hibernate and uses field-level encryption. The application runs on Spring Boot.
@ColumnTransformer(
    read = "AES_DECRYPT(message, 'secret')",
    write = "AES_ENCRYPT(?, 'secret')"
)
@Column(
    columnDefinition = "varbinary(5120)"
)
private String field;
When writing unit tests I get an exception because the tests run on an embedded H2 database and the encryption functions are MySQL-specific.
@RunWith(SpringRunner.class)
@DataJpaTest
I found this solution, but it does not work for me: How to Ignore Certain Fields in unit tests, Hibernate
Is there any way to test this behaviour while ignoring encryption and decryption in the test configuration?
regards,
Moritz
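One possible workaround (not part of the original question, only a sketch): give H2 no-op stand-ins for the MySQL functions so the @ColumnTransformer SQL still parses under @DataJpaTest. H2 can bind SQL functions to static Java methods via CREATE ALIAS; the package, class and file names below are illustrative.

package com.example.test;

import java.nio.charset.StandardCharsets;

// Hypothetical test-only helper. A src/test/resources/schema.sql picked up by the
// embedded H2 could then contain:
//   CREATE ALIAS IF NOT EXISTS AES_ENCRYPT FOR "com.example.test.H2AesStubs.aesEncrypt";
//   CREATE ALIAS IF NOT EXISTS AES_DECRYPT FOR "com.example.test.H2AesStubs.aesDecrypt";
public class H2AesStubs {

    // No-op "encryption": just store the plain bytes in the varbinary column.
    public static byte[] aesEncrypt(String value, String key) {
        return value == null ? null : value.getBytes(StandardCharsets.UTF_8);
    }

    // No-op "decryption": read the bytes back as a plain string.
    public static String aesDecrypt(byte[] value, String key) {
        return value == null ? null : new String(value, StandardCharsets.UTF_8);
    }
}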

Related

How can I write a @DataJpaTest in a Spring Boot application that uses RSA keys loaded through configuration?

I followed the Spring Boot guides to set up JWTs using spring-boot-starter-oauth2-resource-server, so I have references to the RSA keys used for signing the JWTs in my application.yml:
rsa:
  privateKey: classpath:certs/private.pem
  publicKey: classpath:certs/public.pem
This worked great until I tried to write a @DataJpaTest for testing the service layer of the application.
@DataJpaTest
public class FooTest {

    @Test
    public void test() {
        System.out.println();
    }
}
That test fails with the error:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.lang.String] to type [java.security.interfaces.RSAPublicKey]
at app//org.springframework.boot.context.properties.bind.BindConverter.convert(BindConverter.java:118)
at app//org.springframework.boot.context.properties.bind.BindConverter.convert(BindConverter.java:100)
at app//org.springframework.boot.context.properties.bind.BindConverter.convert(BindConverter.java:92)
at app//org.springframework.boot.context.properties.bind.Binder.bindProperty(Binder.java:459)
at app//org.springframework.boot.context.properties.bind.Binder.bindObject(Binder.java:403)
at app//org.springframework.boot.context.properties.bind.Binder.bind(Binder.java:343)
I know the converters are available somewhere, because the same test runs fine with @SpringBootTest. I think I found them in org.springframework.security.converter.RsaKeyConverters. But I don't know how to register them so they're picked up during the @DataJpaTest.
I don't think those converters should be necessary for the test - FooTest has no dependencies right now.
How can I either set up the @DataJpaTest to work with this recommended Spring Boot project setup, or change the project setup so that I can easily write and run @DataJpaTest classes?
RsaKeyConverters is somehow configured by SecurityAutoConfiguration. @SpringBootTest considers all auto-configurations and can therefore set up RsaKeyConverters properly, but @DataJpaTest only considers the auto-configurations related to testing the JPA layer, so it ignores SecurityAutoConfiguration.
You can use @ImportAutoConfiguration to tell it to also consider SecurityAutoConfiguration. After that, however, you will find that although the auto-configuration is considered, it is only enabled when Spring Boot starts in servlet mode, while @DataJpaTest starts it in 'none' mode. So you also need to set spring.main.web-application-type=servlet to force servlet mode.
Making these two configuration changes should solve the problem:
@ImportAutoConfiguration(classes = SecurityAutoConfiguration.class)
@DataJpaTest(properties = "spring.main.web-application-type=servlet")
public class FooTest {
}
Have you tried to @MockBean Converter<String, RSAPublicKey> rsaPublicKeyConver; in your test class?

Cassandra unit spring - Launch an embedded Cassandra only once per test class

I am setting up integration tests for my Spring Boot application using the cassandra-unit-spring Maven dependency. I am able to run tests that invoke the Spring Boot application, which in turn accesses an in-memory embedded Cassandra database.
Below is the code for my test class:
@RunWith(SpringJUnit4ClassRunner.class)
@TestExecutionListeners({CassandraUnitDependencyInjectionTestExecutionListener.class,
        DependencyInjectionTestExecutionListener.class})
@CassandraDataSet(value = "cassandra/dbcreate.cql", keyspace = "test")
@EmbeddedCassandra
@SpringBootTest({"spring.data.cassandra.port=9142"})
public class IntegrationTest {

    @Autowired
    private TestRepository testRepository;

    @Test
    public void testFindById() {
        Token token = generateRandomToken();
        testRepository.insert(token);
        Optional<Token> tokenStored = testRepository.findById(token.getKey());
        compareReplayToken(token, tokenStored.get()); // This method does the assertions
    }
}
This single test invokes the embedded Cassandra and creates the keyspace and tables from the commands in the cassandra/dbcreate.cql file. After the test runs, the keyspace and tables are dropped.
So far, so good. But if I add multiple tests to this class, this approach creates the keyspace and tables at the beginning of each test and drops them after the test finishes.
The dbcreate.cql file contains many commands that create multiple tables, and running them for every test makes my tests really slow.
Also, this problem multiplies when I try to have multiple such test classes.
A possible solution I could think of is to have a separate CQL file for each test class, containing only the commands that class needs. But that still doesn't solve the problem of the database being reset for each test within a single class.
I want to run all my integration tests against a single launch of the embedded Cassandra, with the keyspace and tables created and dropped only once, for fast execution.
What should be the ideal solution for such a problem?
Any help is much appreciated.
Thanks!
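One direction worth trying (a sketch only, not a verified answer): start the embedded Cassandra once per JVM from a shared base class and load the CQL data set a single time, instead of relying on the per-test listener to recreate the schema. This assumes cassandra-unit's EmbeddedCassandraServerHelper, CQLDataLoader and ClassPathCQLDataSet; the base class name is illustrative.

import org.cassandraunit.CQLDataLoader;
import org.cassandraunit.dataset.cql.ClassPathCQLDataSet;
import org.cassandraunit.utils.EmbeddedCassandraServerHelper;
import org.junit.BeforeClass;

public abstract class AbstractCassandraIT {

    private static boolean started;

    @BeforeClass
    public static synchronized void startCassandraOnce() throws Exception {
        if (!started) {
            // Boots the embedded Cassandra; it stays up until the JVM exits.
            EmbeddedCassandraServerHelper.startEmbeddedCassandra();
            // Creates the keyspace and tables exactly once for all test classes.
            new CQLDataLoader(EmbeddedCassandraServerHelper.getSession())
                    .load(new ClassPathCQLDataSet("cassandra/dbcreate.cql", "test"));
            started = true;
        }
    }
}

Concrete test classes would then extend this base class, keep @SpringBootTest, and drop @EmbeddedCassandra, @CassandraDataSet and the Cassandra test execution listener, so the schema is no longer recreated for every test.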

Problems with integration tests after upgrading H2 database from 1.4.200 to 2.0.204

Recently I upgraded the H2 database in our Spring Boot 2.5.8 project from version 1.4.200 to 2.0.204. It is used for testing purposes only; for production we use PostgreSQL 12.9.
It seems that after the upgrade some words, for example day and value, became keywords in H2. When the integration tests run, Hibernate fails on the DDL step.
Postgres 12 - Keywords
H2 - Keywords
What is the best solution for that case?
The first option: review all entities and apply back-ticks around reserved column names, e.g. change

@NotNull
@Column(name = "day", nullable = false)
private LocalDate day;

to

@NotNull
@Column(name = "`day`", nullable = false)
private LocalDate day;
The second option: provide a dedicated SpringPhysicalNamingStrategy and override the toPhysicalColumnName method for integration tests only. Check the list of reserved keywords in H2 and quote them; the properties below register the strategy, and a sketch of it follows.
# Datasource related properties
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.url=jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
spring.datasource.username=sa
spring.datasource.password=sa
spring.sql.init.mode=always
spring.sql.init.continue-on-error=true
spring.sql.init.platform=h2
spring.jpa.show-sql=true
spring.jpa.generate-ddl=true
spring.jpa.hibernate.ddl-auto=update
spring.jpa.properties.hibernate.jdbc.time_zone=UTC
spring.jpa.hibernate.naming.physical-strategy=[project-related-package-name-here].strategy.CustomH2NamingStrategy
spring.jpa.defer-datasource-initialization=true
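A minimal sketch of such a naming strategy, assuming Spring Boot's SpringPhysicalNamingStrategy as the base class; the class name matches the property above, but the keyword list is illustrative and would need to be extended:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import org.hibernate.boot.model.naming.Identifier;
import org.hibernate.engine.jdbc.env.spi.JdbcEnvironment;
import org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy;

public class CustomH2NamingStrategy extends SpringPhysicalNamingStrategy {

    // Only the keywords that actually clash in this schema; extend as needed.
    private static final Set<String> H2_KEYWORDS = new HashSet<>(Arrays.asList("DAY", "VALUE"));

    @Override
    public Identifier toPhysicalColumnName(Identifier name, JdbcEnvironment jdbcEnvironment) {
        Identifier physicalName = super.toPhysicalColumnName(name, jdbcEnvironment);
        if (physicalName != null && H2_KEYWORDS.contains(physicalName.getText().toUpperCase())) {
            // The second argument marks the identifier as quoted in generated SQL/DDL.
            return Identifier.toIdentifier(physicalName.getText(), true);
        }
        return physicalName;
    }
}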
I think the first solution should work with both PostgreSQL and H2. Although the day identifier is non-reserved in PostgreSQL 12.9, it might become reserved in a future release.
The second one should sort out the problem with the H2 database only.
What do you think? Are there better solutions for this case? Or should I take the list of reserved keywords from the SQL:2016 standard and apply it to both databases via a custom SpringPhysicalNamingStrategy?
In H2 2.0 you can use the SET NON_KEYWORDS setting by appending ;NON_KEYWORDS=DAY,VALUE to the JDBC URL, but the normal solution is to quote all identifiers in the generated SQL unconditionally, for example with spring.jpa.properties.hibernate.globally_quoted_identifiers=true.
You should also normally have ;MODE=PostgreSQL;DATABASE_TO_LOWER=TRUE;DEFAULT_NULL_ORDERING=HIGH in the H2 JDBC URL when you use it in place of PostgreSQL. But it is a bad idea to use a different DBMS for production and testing, and Hibernate ORM doesn't fully support H2 2.0 yet. Edited: Hibernate ORM 5.6.5.Final has basic support for H2 2.*.*, and some additional issues were fixed in newer versions.

SpringBoot 1.5: @SpringBootTest and in-memory database

I have a Spring Boot application that uses Spring Data JPA to connect to the database, and a YAML properties file where the database connection is defined.
Everything works very well.
I created a test like this:
@ActiveProfiles("dev")
@RunWith(SpringRunner.class)
@SpringBootTest(classes = MyMicroServiceApp.class, webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class MyMicroServiceAppTest {

    @Test
    public <T> void postConnex() {
        // Create object connexCreate
        ...

        // Create POST
        ResponseEntity<Udsaconnex> result1 = this.restTemplate().postForEntity("http://localhost:" + port + "/v1/connex",
                connexCreate, Udsaconnex.class);
        id = result1.getBody().getIdconnex();
        assertEquals(result1.getBody().toString().isEmpty(), false);
    }
}
For my test, I have not configured any database connection properties, but the test works and I see this in the console:
Hibernate: drop table connex if exists
I don't understand why. Does @SpringBootTest mock the database automatically, like @DataJpaTest?
That seems possible, but I can't find anything about it in the Spring Boot documentation.
Thanks for your help.
If you have an application.yml file specifying the database location, then @SpringBootTest will use that configuration and connect to your configured database.
From the title of your question I guess you have an in-memory database among your build dependencies. Spring Boot auto-configures certain embedded databases (H2, HSQL, Derby) when they are found on the classpath. See this link for the list of supported databases:
Spring Boot Embedded Database Support
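For completeness, a hedged illustration of making the choice explicit (property values here are placeholders, not from the original post): the properties attribute of @SpringBootTest can pin the datasource so the embedded-database auto-configuration does not kick in.

@ActiveProfiles("dev")
@RunWith(SpringRunner.class)
@SpringBootTest(
        classes = MyMicroServiceApp.class,
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
        properties = {
                // Placeholder values; point these at the database the test should really use.
                "spring.datasource.url=jdbc:postgresql://localhost:5432/mydb",
                "spring.datasource.username=test",
                "spring.datasource.password=test"
        })
public class MyMicroServiceAppTest {
}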

Different UserType for different database in Hibernate

I'm developing an application with Spring Boot and Hibernate. I've created a class implementing UserType that maps java.time.Duration to INTERVAL in PostgreSQL. Using it in the following manner:
@TypeDefs({
    @TypeDef(typeClass = DurationUserType.class, defaultForType = Duration.class)
})
works fine.
I'm running unit tests against H2. Since H2 doesn't support the INTERVAL type, Hibernate fails at initialization. I'm looking for a way to associate DurationUserType with the PostgreSQL dialect only, so that it is not used in the unit tests.
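One possible direction (a sketch only, not a confirmed answer, assuming Hibernate's MetadataBuilderContributor SPI; the contributor class name is illustrative): register DurationUserType programmatically instead of via @TypeDefs, and activate the contributor only in the PostgreSQL configuration so the H2 test setup never sees it.

import java.time.Duration;

import org.hibernate.boot.MetadataBuilder;
import org.hibernate.boot.spi.MetadataBuilderContributor;

public class PostgresDurationTypeContributor implements MetadataBuilderContributor {

    @Override
    public void contribute(MetadataBuilder metadataBuilder) {
        // Same mapping as the @TypeDef above, applied only when this contributor is registered.
        metadataBuilder.applyBasicType(new DurationUserType(), Duration.class.getName());
    }
}

The production configuration would then register it with spring.jpa.properties.hibernate.metadata_builder_contributor=<your package>.PostgresDurationTypeContributor, while the H2 test properties simply omit that line (and the @TypeDefs annotation would be removed).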
