Spring Kafka Consumer consumed message as LinkedHashMap hence automatically converting BigDecimal to double - spring-boot

I am using an annotation-based Spring Kafka listener to consume Kafka messages; the code is below.
Consuming Employee Object
class Employee {
private String name;
private String address;
private Object account;
//getters
//setters
}
The concrete type of the account object is decided at runtime: it can be a saving account, a current account, etc.
class SavingAcc {
private BigDecimal balance;
}
class CurrentAcc {
private BigDecimal balance;
private BigDecimal limit;
}
Both the saving and current account classes have BigDecimal fields to store balances.
When the Employee object is sent by the Kafka producer, all fields are mapped correctly and appear in the expected format (BigDecimal, etc.).
But when the Employee object is consumed in another service, the account object comes through as a LinkedHashMap and the BigDecimal fields are converted to Double, which is causing issues.
As I understand it, the main reasons could be:
a) The declaration of account as Object instead of a specific type.
b) Or the deserializer should be configured more specifically. [I have already given Employee.class as the target type to the Kafka JsonDeserializer, so the Employee fields are mapped correctly, but the account fields are not.]
@Bean
public ConsumerFactory<String, Employee> consumerFactory() {
return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(), new JsonDeserializer<>(Employee.class));
}
I need help with how to map the payload so that the account fields deserialize properly.

Use Generics and a custom JavaType method.
class Employee<T> {
private String name;
private String address;
private T account;
//getters
//setters
}
JavaType withCurrent = TypeFactory.defaultInstance().constructParametricType(Employee.class, CurrentAcc.class);
JavaType withSaving = TypeFactory.defaultInstance().constructParametricType(Employee.class, SavingAcc.class);
public static JavaType determineType(String topic, byte[] data, Headers headers) {
// If it's a current account
return withCurrent;
// else
return withSaving;
}
If you construct the deserializer yourself, use:
deser.setTypeResolver(MyClass::determineType);
When configuring with properties, use:
spring.json.value.type.method=com.mycompany.MyClass.determineType
You have to inspect the data, headers, or topic to determine which type you want.
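For reference, here is a minimal sketch of wiring such a type resolver into a consumer factory yourself (spring-kafka 2.5.3 or later is assumed for setTypeResolver; MyClass and the trusted package name are placeholders):
@Bean
public ConsumerFactory<String, Employee<?>> consumerFactory(KafkaProperties kafkaProperties) {
    JsonDeserializer<Employee<?>> valueDeserializer = new JsonDeserializer<>();
    // Delegate the choice of target JavaType to the static method shown above
    valueDeserializer.setTypeResolver(MyClass::determineType);
    // Trust the package containing Employee and the account classes
    valueDeserializer.addTrustedPackages("com.mycompany");
    return new DefaultKafkaConsumerFactory<>(
            kafkaProperties.buildConsumerProperties(),
            new StringDeserializer(),
            valueDeserializer);
}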
EDIT
Here is a complete example. In this case, I pass a type hint in the Account object, but an alternative would be to set a header on the producer side.
@SpringBootApplication
public class JacksonApplication {
public static void main(String[] args) {
SpringApplication.run(JacksonApplication.class, args);
}
@Data
public static class Employee<T extends Account> {
private String name;
private T account;
}
@Data
public static abstract class Account {
private final String type;
protected Account(String type) {
this.type = type;
}
}
@Data
public static class CurrentAccount extends Account {
private BigDecimal balance;
private BigDecimal limit;
public CurrentAccount() {
super("C");
}
}
@Data
public static class SavingAccount extends Account {
private BigDecimal balance;
public SavingAccount() {
super("S");
}
}
@KafkaListener(id = "empListener", topics = "employees")
public void listen(Employee<Account> e) {
System.out.println(e);
}
@Bean
public NewTopic topic() {
return TopicBuilder.name("employees").partitions(1).replicas(1).build();
}
@Bean
public ApplicationRunner runner(KafkaTemplate<String, Employee> template) {
return args -> {
Employee<CurrentAccount> emp1 = new Employee<>();
emp1.setName("someOneWithACurrentAccount");
CurrentAccount currentAccount = new CurrentAccount();
currentAccount.setBalance(BigDecimal.ONE);
currentAccount.setLimit(BigDecimal.TEN);
emp1.setAccount(currentAccount);
template.send("employees", emp1);
Employee<SavingAccount> emp2 = new Employee<>();
emp2.setName("someOneWithASavingAccount");
SavingAccount savingAccount = new SavingAccount();
savingAccount.setBalance(BigDecimal.ONE);
emp2.setAccount(savingAccount);
template.send("employees", emp2);
};
}
private static final JavaType withCurrent = TypeFactory.defaultInstance()
.constructParametricType(Employee.class, CurrentAccount.class);
private static final JavaType withSaving = TypeFactory.defaultInstance()
.constructParametricType(Employee.class, SavingAccount.class);
public static JavaType determineType(String topic, byte[] data, Headers headers) throws IOException {
if (JsonPath.read(new ByteArrayInputStream(data), "$.account.type").equals("C")) {
return withCurrent;
}
else {
return withSaving;
}
}
}
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.type.method=com.example.demo.JacksonApplication.determineType
Result
JacksonApplication.Employee(name=someOneWithACurrentAccount, account=JacksonApplication.CurrentAccount(balance=1, limit=10))
JacksonApplication.Employee(name=someOneWithASavingAccount, account=JacksonApplication.SavingAccount(balance=1))
POM
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.5.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.example</groupId>
<artifactId>demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>jackson</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>11</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
EDIT2
And here is an example that conveys the type hint in a header instead...
@SpringBootApplication
public class JacksonApplication {
public static void main(String[] args) {
SpringApplication.run(JacksonApplication.class, args);
}
@Data
public static class Employee<T extends Account> {
private String name;
private T account;
}
@Data
public static abstract class Account {
}
@Data
public static class CurrentAccount extends Account {
private BigDecimal balance;
private BigDecimal limit;
}
@Data
public static class SavingAccount extends Account {
private BigDecimal balance;
}
@KafkaListener(id = "empListener", topics = "employees")
public void listen(Employee<Account> e) {
System.out.println(e);
}
@Bean
public NewTopic topic() {
return TopicBuilder.name("employees").partitions(1).replicas(1).build();
}
@Bean
public ApplicationRunner runner(KafkaTemplate<String, Employee> template) {
return args -> {
Employee<CurrentAccount> emp1 = new Employee<>();
emp1.setName("someOneWithACurrentAccount");
CurrentAccount currentAccount = new CurrentAccount();
currentAccount.setBalance(BigDecimal.ONE);
currentAccount.setLimit(BigDecimal.TEN);
emp1.setAccount(currentAccount);
template.send("employees", emp1);
Employee<SavingAccount> emp2 = new Employee<>();
emp2.setName("someOneWithASavingAccount");
SavingAccount savingAccount = new SavingAccount();
savingAccount.setBalance(BigDecimal.ONE);
emp2.setAccount(savingAccount);
template.send("employees", emp2);
};
}
private static final JavaType withCurrent = TypeFactory.defaultInstance()
.constructParametricType(Employee.class, CurrentAccount.class);
private static final JavaType withSaving = TypeFactory.defaultInstance()
.constructParametricType(Employee.class, SavingAccount.class);
public static JavaType determineType(String topic, byte[] data, Headers headers) throws IOException {
if (headers.lastHeader("accountType").value()[0] == 'C') {
return withCurrent;
}
else {
return withSaving;
}
}
public static class MySerializer extends JsonSerializer<Employee<?>> {
@Override
public byte[] serialize(String topic, Headers headers, Employee<?> emp) {
headers.add(new RecordHeader("accountType",
new byte[] { (byte) (emp.getAccount() instanceof CurrentAccount ? 'C' : 'S')}));
return super.serialize(topic, headers, emp);
}
}
}
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.producer.value-serializer=com.example.demo2.JacksonApplication.MySerializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.type.method=com.example.demo2.JacksonApplication.determineType

This annotation solved my problem:
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class")
private T account;
It binds the concrete class of the generic type to the field.
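Applied to the original Employee class, a sketch would look like this (Jackson writes the fully qualified class name into an "@class" property on serialization and uses it to pick SavingAcc or CurrentAcc on deserialization):
class Employee<T> {
    private String name;
    private String address;

    // The "@class" property carries the concrete account type in the JSON payload
    @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class")
    private T account;

    // getters and setters
}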

Related

Optimistic locking test in Spring Data JPA

I'm trying to test the optimistic locking mechanism in Spring Data JPA by loading a certain entity twice using a findBy method, updating the first copy, and then asserting that updating the second copy throws an OptimisticLockingFailureException.
The problem is that no exception is thrown and the second update succeeds.
After investigating, I found that the findBy method hits the database only the first time and caches the returned entity; when I call it again it returns the cached entity, so both loaded objects are the same instance. The first update is therefore reflected in both of them and the second object never holds stale data.
So, how do I force the second findBy call to load the entity from the database?
Here is my code:
Test class
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
class PersistenceTests {
@Autowired
private ProductRepository repository;
private ProductEntity savedEntity;
@DynamicPropertySource
static void databaseProperties(DynamicPropertyRegistry registry) {
registry.add("spring.datasource.url", () -> "jdbc:mysql://localhost:3306/code_snippet");
registry.add("spring.datasource.username", () -> "root");
registry.add("spring.datasource.password", () -> "System");
registry.add("spring.jpa.hibernate.ddl-auto", () -> "create-drop");
}
@BeforeEach
void setupDb() {
repository.deleteAll();
ProductEntity entity = new ProductEntity(1, "n", 1);
savedEntity = repository.save(entity);
assertEqualsProduct(entity, savedEntity);
}
@Test
void optimisticLockError() {
// Store the saved entity in two separate entity objects
ProductEntity entity1 = repository.findById(savedEntity.getId()).get();
ProductEntity entity2 = repository.findById(savedEntity.getId()).get();
// Update the entity using the first entity object
entity1.setName("n1");
repository.save(entity1);
// Update the entity using the second entity object.
// This should fail since the second entity now holds an old version number,
// i.e. an Optimistic Lock Error
assertThrows(OptimisticLockingFailureException.class, () -> {
entity2.setName("n2");
repository.save(entity2);
});
// Get the updated entity from the database and verify its new state
ProductEntity updatedEntity = repository.findById(savedEntity.getId()).get();
assertEquals(1, (int) updatedEntity.getVersion());
assertEquals("n1", updatedEntity.getName());
}
private void assertEqualsProduct(ProductEntity expectedEntity, ProductEntity actualEntity) {
assertEquals(expectedEntity.getId(), actualEntity.getId());
assertEquals(expectedEntity.getVersion(), actualEntity.getVersion());
assertEquals(expectedEntity.getProductId(), actualEntity.getProductId());
assertEquals(expectedEntity.getName(), actualEntity.getName());
assertEquals(expectedEntity.getWeight(), actualEntity.getWeight());
}
}
Entity
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Version;
@Entity
@Table(name = "product")
public class ProductEntity {
@Id
@GeneratedValue
private Integer id;
@Version
private Integer version;
private int productId;
private String name;
private int weight;
public ProductEntity() {
}
public ProductEntity(int productId, String name, int weight) {
this.productId = productId;
this.name = name;
this.weight = weight;
}
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
public Integer getVersion() {
return version;
}
public void setVersion(Integer version) {
this.version = version;
}
public int getProductId() {
return productId;
}
public void setProductId(int productId) {
this.productId = productId;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getWeight() {
return weight;
}
public void setWeight(int weight) {
this.weight = weight;
}
}
Repository
import java.util.Optional;
import org.springframework.data.repository.PagingAndSortingRepository;
public interface ProductRepository extends PagingAndSortingRepository<ProductEntity, Integer> {
Optional<ProductEntity> findByProductId(int productId);
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.7.2</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<groupId>com.javaworld.codesnippet</groupId>
<artifactId>writing-persistence-tests</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>writing-persistence-tests</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>11</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Main Class
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class WritingPersistenceTestsApplication {
public static void main(String[] args) {
SpringApplication.run(WritingPersistenceTestsApplication.class, args);
}
}
The problem is that your test method is transactional by default. You can disable the transaction for this method by adding:
@Test
@Transactional(value = Transactional.TxType.NEVER)
Then the second save fails with an ObjectOptimisticLockingFailureException.
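A sketch of the adjusted test (assuming javax.transaction.Transactional, which provides TxType.NEVER):
@Test
@Transactional(value = Transactional.TxType.NEVER) // no surrounding test-managed transaction
void optimisticLockError() {
    // Without a shared persistence context, each findById loads a fresh copy from the database
    ProductEntity entity1 = repository.findById(savedEntity.getId()).get();
    ProductEntity entity2 = repository.findById(savedEntity.getId()).get();

    entity1.setName("n1");
    repository.save(entity1);

    // entity2 still carries the old @Version value, so this save must fail
    assertThrows(OptimisticLockingFailureException.class, () -> {
        entity2.setName("n2");
        repository.save(entity2);
    });
}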

MongoDB reactive Spring Boot: DBRef resolution is not supported

I wrote this Spring Boot reactive MongoDB program:
@SpringBootApplication
public class ReactitveMongoDbApplication {
public static void main(String[] args) {
SpringApplication.run(ReactitveMongoDbApplication.class, args);
}
@Bean
CommandLineRunner ok(SoscieteRepository sos, TransactionRepository trs) {
return a -> {
List<Sosciete> ls = List.of(new Sosciete("SG", "sosciete general", 1235.22),
new Sosciete("AB", "AIR BOLL", 478.36), new Sosciete("TO", "TOYOTA", 458.24));
trs.deleteAll().subscribe(null, null, () -> {
sos.deleteAll().subscribe(null, null, () -> {
ls.forEach(t -> sos.save(t).subscribe(so -> {
System.out.println(so);
for (int i = 0; i < 10; i++) {
Transaction tr = new Transaction();
tr.setDate(Instant.now());
tr.setSosciete(so);
double x = 1 + ((Math.random() * 12) - 6) / 100;
tr.setPrice(so.getPrice() * x);
trs.save(tr).subscribe(ts -> {
System.out.println(ts);
});
}
}));
});
});
System.out.println("done !");
};
}
}
interface SoscieteRepository extends ReactiveMongoRepository<Sosciete, String> {
}
@Document
@Data
@AllArgsConstructor
@NoArgsConstructor
class Sosciete {
@Id
private String id;
private String name;
private double price;
}
interface TransactionRepository extends ReactiveMongoRepository<Transaction, String> {
}
@Document
@Data
@AllArgsConstructor
@NoArgsConstructor
class Transaction {
@Id
private String id;
private Instant date;
private double price;
@DBRef(db = "sosciete", lazy = true)
private Sosciete sosciete;
}
@RestController
class ReactiveRestController {
@Autowired
private SoscieteRepository sos;
@Autowired
private TransactionRepository trans;
#GetMapping(value = "/soscietes")
public Flux<Sosciete> getAllSc() {
return sos.findAll();
}
#GetMapping(value = "/transactions")
public Flux<Transaction> getAllTr() {
return trans.findAll();
}
}
the dependencies:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
application.properties file:
spring.data.mongodb.database=webflux
spring.data.mongodb.port=27017
server.port=3000
My code works fine at this URL:
http://localhost:3000/soscietes
but at this URL:
http://localhost:3000/transactions
the code throws the following error:
java.lang.UnsupportedOperationException: DBRef resolution is not supported!
The @DBRef annotation is the cause of this error; it does not work at all and throws the exception above.
Is there a configuration that makes it work?
Thank you in advance.
I got the same issue with Spring Boot 2.7.0.
I changed @DBRef to @DocumentReference and it works.
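A sketch of the Transaction document with that change (assuming Spring Data MongoDB 3.3+, where org.springframework.data.mongodb.core.mapping.DocumentReference is available; a lazy attribute also exists if needed):
@Document
@Data
@AllArgsConstructor
@NoArgsConstructor
class Transaction {
    @Id
    private String id;
    private Instant date;
    private double price;

    // Replaces @DBRef: stores the id of the referenced Sosciete document
    // and lets Spring Data resolve it when the Transaction is read
    @DocumentReference
    private Sosciete sosciete;
}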
I found a solution: I changed the Spring Boot parent version in pom.xml from this:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.6.3</version>
<relativePath />
</parent>
to 2.1.2.RELEASE

Spring Boot with AWS PostgreSQL

I am creating a Java Spring Boot project with PostgreSQL hosted on AWS.
The error:
Whitelabel Error Page
This application has no configured error view, so you are seeing this as a fallback.
Fri Aug 13 10:46:24 UTC 2021
[a93e3797-1] There was an unexpected error (type=Not Found, status=404).
UserController:
@RestController
public class UserController {
@Autowired
private userService userService;
private static final String ERROR_MAPPING = "/error";
@PostMapping(path="/employees")
public customers addEmployee(@RequestBody customers employee) {
return userService.save(employee);
}
@GetMapping(path="/employees")
public ResponseEntity<List<customers>> getAllEmployees() {
return ResponseEntity.ok(userService.listAll());
}
}
User Service:
@Service
public class userService {
@Autowired
private userRepository repo;
public List<customers> listAll(){
return repo.findAll();
}
public customers save(customers u) {
repo.save(u);
return u;
}
}
User Repository:
@Repository
public interface userRepository extends JpaRepository<customers, Long> {
}
User Model:
@Entity
@Table(name = "customers")
public class customers {
private int Customer_Id;
private String First_Name;
private String Last_Name;
private int Phone_Number;
public customers(int customer_Id, String first_Name, String last_Name, int phone_Number) {
Customer_Id = customer_Id;
First_Name = first_Name;
Last_Name = last_Name;
Phone_Number = phone_Number;
}
public customers() {
}
public int getCustomer_Id() {
return Customer_Id;
}
public void setCustomer_Id(int customer_Id) {
Customer_Id = customer_Id;
}
public String getFirst_Name() {
return First_Name;
}
public void setFirst_Name(String first_Name) {
First_Name = first_Name;
}
public String getLast_Name() {
return Last_Name;
}
public void setLast_Name(String last_Name) {
Last_Name = last_Name;
}
public int getPhone_Number() {
return Phone_Number;
}
public void setPhone_Number(int phone_Number) {
Phone_Number = phone_Number;
}
}
Application:
package com.esdt.user;
@SpringBootApplication
@EntityScan("com.esdt.user.model.*")
@EnableJpaRepositories("com.esdt.user.repository.*")
@ComponentScan(basePackages={ "com.esdt.user.controller.*", "com.esdt.user.service.*" })
public class UserApplication {
public static void main(String[] args) {
SpringApplication.run(UserApplication.class, args);
}
}
properties:
## Spring DATASOURCE (DataSourceAutoConfiguration & DataSourceProperties)
spring.datasource.url=jdbc:postgresql://awswebsite/postgres?allowPublicKeyRetrieval=true&useSSL=false
spring.datasource.username=user
spring.datasource.password=pass
# The SQL dialect makes Hibernate generate better SQL for the chosen database
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
# Hibernate ddl auto (create, create-drop, validate, update)
spring.jpa.hibernate.ddl-auto=create
spring.jpa.hibernate.show-sql=true
I added the following and it works:
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>

Transaction support in spring-boot running Tomcat, with AspectJ load-time weaving (LTW)

I am having a hard time configuring transaction support in Spring Boot 2.0.3 with AspectJ LTW (load-time weaving). My Spring Boot application runs the embedded Tomcat servlet container. In the persistence layer I am not using JPA, but the Spring JdbcTemplate instead.
I opted for AspectJ mode for transaction management because we are working on a rather big project with nested transactions, and it is sometimes hard to keep track of every use of the @Transactional annotation. Whenever the annotation is used I want a predictable result, an atomic DB operation, without having to think about whether there is self-invocation or whether the annotated method is public.
I have read a bunch of documentation about transaction support in Spring and how to configure AspectJ LTW.
Unfortunately, I cannot make it work. I have created a test (a Spring Boot test class) that is meant to mimic different failures in code that should be transactional (see below). I also cannot see the weaving actually happening. I am clearly missing something, but cannot figure out what.
My test class:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = TestConfig.class)
@ActiveProfiles("TEST")
public class TransactionalIT {
@SpyBean
private JdbcTemplate jdbcTemplate;
// we need this guy in order to perform a cleanup in static #AfterClass method
private static JdbcTemplate jdbcTemplateInStaticContext;
@Autowired
private PlatformTransactionManager txManager;
@Spy
private NestedTransactionsJdbcDao dao;
@Before
public void setUp() {
if (jdbcTemplateInStaticContext == null) {
// Making sure we're working with the proper tx manager
assertThat(txManager).isNotNull();
assertThat(txManager.getClass()).isEqualTo(DataSourceTransactionManager.class);
jdbcTemplateInStaticContext = jdbcTemplate;
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_a (id varchar(12) PRIMARY KEY, name varchar(24), description varchar(255));");
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_b (id varchar(12) PRIMARY KEY, name varchar(24), description varchar(255));");
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_a_to_b_assn (entity_a_id varchar(12) NOT NULL, entity_b_id varchar(12) NOT NULL, " +
"CONSTRAINT fk_entity_a FOREIGN KEY (entity_a_id) REFERENCES entity_a(id), " +
"CONSTRAINT fk_entity_b FOREIGN KEY (entity_b_id) REFERENCES entity_b(id), " +
"UNIQUE (entity_a_id, entity_b_id));");
}
}
@AfterClass
public static void cleanup() {
if (jdbcTemplateInStaticContext != null) {
jdbcTemplateInStaticContext.execute("DROP TABLE entity_a_to_b_assn;");
jdbcTemplateInStaticContext.execute("DROP TABLE entity_a;");
jdbcTemplateInStaticContext.execute("DROP TABLE entity_b;");
}
}
@Test
public void createObjectGraph_FailsDuring_AnAttemptToCreate3rdEntityA() {
doThrow(new RuntimeException("blah!")).when(jdbcTemplate).update(eq("INSERT INTO entity_a (id, name, description) VALUES(?, ?, ?);"),
eq("a3"), eq("entity a3"), eq("descr_a_3"));
try {
dao.createObjectGraph(getObjectGraph());
fail("Should never reach this point");
} catch (RuntimeException e) {
assertThat(e.getMessage()).isEqualTo("blah!");
assertDbCounts(0L, 0L, 0L);
}
}
private void assertDbCounts(long expectedACount, long expectedBCount, long expectedAToBCount) {
Long actualACount = jdbcTemplate.queryForObject("SELECT count(*) count_a FROM entity_a", new LongRowMapper());
assertThat(actualACount).isEqualTo(expectedACount);
Long actualBCount = jdbcTemplate.queryForObject("SELECT count(*) count_b FROM entity_b", new LongRowMapper());
assertThat(actualBCount).isEqualTo(expectedBCount);
Long actualAToBCount = jdbcTemplate.queryForObject("SELECT count(*) count_a_to_b FROM entity_b", new LongRowMapper());
assertThat(actualAToBCount).isEqualTo(expectedAToBCount);
}
private final class LongRowMapper implements RowMapper<Long> {
@Override
public Long mapRow(ResultSet resultSet, int i) throws SQLException {
return resultSet.getLong(1);
}
}
private ObjectGraph getObjectGraph() {
EntityA a1 = new EntityA("a1", "entity a1", "descr_a_1");
EntityA a2 = new EntityA("a2", "entity a2", "descr_a_2");
EntityA a3 = new EntityA("a3", "entity a3", "descr_a_3");
EntityB b1 = new EntityB("b1", "entity b1", "descr_b_1");
EntityB b2 = new EntityB("b2", "entity b2", "descr_b_2");
EntityB b3 = new EntityB("b3", "entity b3", "descr_b_3");
AtoBAssn a1b1 = new AtoBAssn("a1", "b1");
AtoBAssn a1b3 = new AtoBAssn("a1", "b3");
AtoBAssn a2b2 = new AtoBAssn("a2", "b2");
AtoBAssn a2b3 = new AtoBAssn("a2", "b3");
AtoBAssn a3b1 = new AtoBAssn("a3", "b1");
return new ObjectGraph(
Lists.newArrayList(a1, a2, a3),
Lists.newArrayList(b1, b2, b3),
Lists.newArrayList(a1b1, a1b3, a2b2, a2b3, a3b1));
}
@Data
@AllArgsConstructor
private class EntityA {
private String id;
private String name;
private String description;
}
@Data
@AllArgsConstructor
private class EntityB {
private String id;
private String name;
private String description;
}
@Data
@AllArgsConstructor
private class AtoBAssn {
private String idA;
private String idB;
}
@Data
@AllArgsConstructor
private class ObjectGraph {
private List<EntityA> aList;
private List<EntityB> bList;
List<AtoBAssn> aToBAssnList;
}
@Repository
public class NestedTransactionsJdbcDao {
@Transactional
public void createObjectGraph(ObjectGraph og) {
createEntitiesA(og.getAList());
createEntitiesB(og.getBList());
createAtoBAssn(og.getAToBAssnList());
doSomethingElse();
}
@Transactional
public void createEntitiesA(List<EntityA> aList) {
aList.forEach(a ->
jdbcTemplate.update("INSERT INTO entity_a (id, name, description) VALUES(?, ?, ?);",
a.getId(), a.getName(), a.getDescription()));
}
@Transactional
public void createEntitiesB(List<EntityB> bList) {
bList.forEach(b ->
jdbcTemplate.update("INSERT INTO entity_b (id, name, description) VALUES(?, ?, ?);",
b.getId(), b.getName(), b.getDescription()));
}
@Transactional
/**
* Intentionally access is set to package-private
*/
void createAtoBAssn(List<AtoBAssn> aToBAssnList) {
aToBAssnList.forEach(aToB ->
jdbcTemplate.update("INSERT INTO entity_a_to_b_assn (entity_a_id, entity_b_id) VALUES(?, ?);",
aToB.getIdA(), aToB.getIdB()));
}
void doSomethingElse() {
// Intentionally left blank
}
}
}
Here is my configuration class:
import org.apache.catalina.loader.WebappClassLoader;
import org.springframework.context.annotation.AdviceMode;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableLoadTimeWeaving;
import org.springframework.context.annotation.Primary;
import org.springframework.instrument.classloading.LoadTimeWeaver;
import org.springframework.instrument.classloading.tomcat.TomcatLoadTimeWeaver;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.aspectj.AnnotationTransactionAspect;
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@EnableLoadTimeWeaving(aspectjWeaving = EnableLoadTimeWeaving.AspectJWeaving.ENABLED)
public class EventCoreConfig {
@Bean
public LoadTimeWeaver loadTimeWeaver() {
// https://tomcat.apache.org/tomcat-8.0-doc/api/org/apache/tomcat/InstrumentableClassLoader.html
return new TomcatLoadTimeWeaver(new WebappClassLoader());
}
@Bean
@Primary
public PlatformTransactionManager txManager(DataSource dataSource) {
DataSourceTransactionManager txManager = new DataSourceTransactionManager(dataSource);
AnnotationTransactionAspect aspect = new AnnotationTransactionAspect();
aspect.setTransactionManager(txManager);
return txManager;
}
}
Here is the portion of my pom.xml that is adding dependencies of interest:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ch.vorburger.mariaDB4j</groupId>
<artifactId>mariaDB4j</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mariadb.jdbc</groupId>
<artifactId>mariadb-java-client</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
Any help would be greatly appreciated. I know this is a somewhat advanced topic, but I do not think it should be this complicated. Spring's documentation lacks examples of how to perform this kind of configuration properly, and I have not found any success stories with a similar setup.

Writing to Dynamo table works fine, but reading throws DynamoDBMappingException

Executing orderRequestDao.save(new OrderRequest("5000", "body")); successfully places a record in DynamoDB. Any attempt to read returns:
[Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException:
could not invoke null on class
com.cfa.fulfillmentApi.model.OrderRequest
with value 100 of type class java.lang.String] with root cause
(Record with id: 100 exists)
I'm using the following jars (aws.sdk.version: 1.11.86):
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-dynamodb</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.github.derjust</groupId>
<artifactId>spring-data-dynamodb</artifactId>
<version>4.4.1</version>
</dependency>
DyamoDb config:
Primary partition key: id (String)
Dao:
@EnableScan
public interface OrderRequestDao extends CrudRepository<OrderRequest, String> {
OrderRequest findOne(String s);
OrderRequest save(OrderRequest or);
}
Domain object:
@DynamoDBTable(tableName = "dev_transx")
public class OrderRequest {
private String id;
private String body;
public OrderRequest(String id, String body) {
this.id = id;
this.body = body;
}
public OrderRequest() {}
@DynamoDBHashKey
public String getId()
{
return id;
}
@DynamoDBAttribute
public String getBody()
{
return body;
}
public void setBody(String body) {
this.body = body;
}
@Override
public String toString() {
return String.format(
"Customer[id=%d, body='%s']",
id, body);
}
@Override
public int hashCode() {
return id.hashCode();
}
}
I've tried just about every data type for id in the domain class, but no luck.
I removed aws-java-sdk-dynamodb since it is already pulled in by spring-data-dynamodb.
Most importantly, I added a setter for id in the domain class.
Adding a setter method for the field works like a charm
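For reference, a sketch of the hash-key accessors the mapper needs (DynamoDBMapper populates fields through setters, so a missing setId is what produces the "could not invoke null" message when reading):
@DynamoDBHashKey
public String getId() {
    return id;
}

// Without this setter the mapper cannot write the key back into the object
// when loading, hence the DynamoDBMappingException on reads.
public void setId(String id) {
    this.id = id;
}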
