Spring Data & Redis: sort the results - Spring Boot

From a Spring Boot application, I want to use Redis and retrieve paged and sorted results.
Maven config:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-redis</artifactId>
<version>2.1.4.RELEASE</version>
</dependency>
<dependency>
<groupId>redis.clients</groupId>
<artifactId>jedis</artifactId>
<version>2.10.2</version>
<type>jar</type>
</dependency>
Based on an online example, I implemented the following repository:
@Repository
public interface StudentRepository extends PagingAndSortingRepository<Student, String> {
    List<Student> findByName(String name, Pageable page);
    List<Student> findByName(String name);
}
Now I want to sort my Students using
studentRepository.findAll(Sort.by("range").ascending());
With Student:
@RedisHash("Student")
public class Student implements Serializable {
    @Id
    private String id;
    @Indexed
    private String name;
    private Gender gender;
    private int grade;
    // getters and setters omitted
}
Whatever the sort is, the order is always the same. After a debug session, I understood that the reason is:
class RedisQueryEngine extends QueryEngine<RedisKeyValueAdapter, RedisOperationChain, Comparator<?>> {

    /**
     * Creates new {@link RedisQueryEngine} with defaults.
     */
    RedisQueryEngine() {
        this(new RedisCriteriaAccessor(), null);
    }

    /**
     * Creates new {@link RedisQueryEngine}.
     *
     * @param criteriaAccessor
     * @param sortAccessor
     * @see QueryEngine#QueryEngine(CriteriaAccessor, SortAccessor)
     */
    private RedisQueryEngine(CriteriaAccessor<RedisOperationChain> criteriaAccessor,
            @Nullable SortAccessor<Comparator<?>> sortAccessor) {
        super(criteriaAccessor, sortAccessor);
    }
The no-arg constructor that Spring calls passes a null sortAccessor. That accessor is then consulted by the QueryEngine class when my query executes:
public <T> Collection<T> execute(KeyValueQuery<?> query, String keyspace, Class<T> type) {
    CRITERIA criteria = this.criteriaAccessor.map(it -> it.resolve(query)).orElse(null);
    SORT sort = this.sortAccessor.map(it -> it.resolve(query)).orElse(null);
    return execute(criteria, sort, query.getOffset(), query.getRows(), keyspace, type);
}
Is this behavior expected? Can Redis only page results (this works; I tested it) but not sort them (in which case paging is of little use)?
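Since the query engine is created without a sort accessor, one possible workaround is to sort the fetched entries in application code. This is a sketch I am adding, not part of Spring Data's API; the `Student` stand-in and `sortByName` helper are hypothetical:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class InMemorySortSketch {
    // Hypothetical minimal stand-in for the Student hash (field names assumed).
    record Student(String id, String name) {}

    // Sort client-side, since RedisQueryEngine is built with a null sortAccessor
    // and therefore ignores the Sort passed to findAll.
    static List<Student> sortByName(List<Student> fetched) {
        return fetched.stream()
                .sorted(Comparator.comparing(Student::name))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Student> fetched = List.of(new Student("2", "Bob"), new Student("1", "Alice"));
        System.out.println(sortByName(fetched).get(0).name()); // Alice
    }
}
```

The obvious caveat: sorting after the fetch only makes sense when paging is also done client-side, which matches the observation in the question that server-side paging without sorting is of limited use.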

Related

Transaction support in spring-boot running Tomcat, with AspectJ load-time weaving (LTW)

I am having a hard time configuring transactional support in Spring Boot 2.0.3 with AspectJ LTW (load-time weaving). My Spring Boot app runs an embedded Tomcat servlet container. In my persistence layer I am not using JPA, but Spring's JdbcTemplate instead.
I opted for AspectJ mode for transaction management because we maintain a rather big project with nested transactions, and it is sometimes hard to keep track of every use of the @Transactional annotation. When the annotation is used, I want a predictable result - an atomic DB operation - without having to think about self-invocation or whether the annotated method is public.
I have read a bunch of documentation regarding transaction support in spring and how to configure LTW AspectJ weaving.
Unfortunately, I cannot make it work. I have created a Spring Boot test class meant to mimic different failures in code that should be transactional (see below). I also cannot see the weaving actually happening. I am clearly missing something, but cannot figure out what.
My test class:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = TestConfig.class)
@ActiveProfiles("TEST")
public class TransactionalIT {
@SpyBean
private JdbcTemplate jdbcTemplate;
// we need this one in order to perform a cleanup in the static @AfterClass method
private static JdbcTemplate jdbcTemplateInStaticContext;
@Autowired
private PlatformTransactionManager txManager;
@Spy
private NestedTransactionsJdbcDao dao;
@Before
public void setUp() {
if (jdbcTemplateInStaticContext == null) {
// Making sure we're working with the proper tx manager
assertThat(txManager).isNotNull();
assertThat(txManager.getClass()).isEqualTo(DataSourceTransactionManager.class);
jdbcTemplateInStaticContext = jdbcTemplate;
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_a (id varchar(12) PRIMARY KEY, name varchar(24), description varchar(255));");
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_b (id varchar(12) PRIMARY KEY, name varchar(24), description varchar(255));");
jdbcTemplateInStaticContext.execute("CREATE TABLE entity_a_to_b_assn (entity_a_id varchar(12) NOT NULL, entity_b_id varchar(12) NOT NULL, " +
"CONSTRAINT fk_entity_a FOREIGN KEY (entity_a_id) REFERENCES entity_a(id), " +
"CONSTRAINT fk_entity_b FOREIGN KEY (entity_b_id) REFERENCES entity_b(id), " +
"UNIQUE (entity_a_id, entity_b_id));");
}
}
@AfterClass
public static void cleanup() {
if (jdbcTemplateInStaticContext != null) {
jdbcTemplateInStaticContext.execute("DROP TABLE entity_a_to_b_assn;");
jdbcTemplateInStaticContext.execute("DROP TABLE entity_a;");
jdbcTemplateInStaticContext.execute("DROP TABLE entity_b;");
}
}
@Test
public void createObjectGraph_FailsDuring_AnAttemptToCreate3rdEntityA() {
doThrow(new RuntimeException("blah!")).when(jdbcTemplate).update(eq("INSERT INTO entity_a (id, name, description) VALUES(?, ?, ?);"),
eq("a3"), eq("entity a3"), eq("descr_a_3"));
try {
dao.createObjectGraph(getObjectGraph());
fail("Should never reach this point");
} catch (RuntimeException e) {
assertThat(e.getMessage()).isEqualTo("blah!");
assertDbCounts(0L, 0L, 0L);
}
}
private void assertDbCounts(long expectedACount, long expectedBCount, long expectedAToBCount) {
Long actualACount = jdbcTemplate.queryForObject("SELECT count(*) count_a FROM entity_a", new LongRowMapper());
assertThat(actualACount).isEqualTo(expectedACount);
Long actualBCount = jdbcTemplate.queryForObject("SELECT count(*) count_b FROM entity_b", new LongRowMapper());
assertThat(actualBCount).isEqualTo(expectedBCount);
Long actualAToBCount = jdbcTemplate.queryForObject("SELECT count(*) count_a_to_b FROM entity_a_to_b_assn", new LongRowMapper());
assertThat(actualAToBCount).isEqualTo(expectedAToBCount);
}
private final class LongRowMapper implements RowMapper<Long> {
@Override
public Long mapRow(ResultSet resultSet, int i) throws SQLException {
return resultSet.getLong(1);
}
}
private ObjectGraph getObjectGraph() {
EntityA a1 = new EntityA("a1", "entity a1", "descr_a_1");
EntityA a2 = new EntityA("a2", "entity a2", "descr_a_2");
EntityA a3 = new EntityA("a3", "entity a3", "descr_a_3");
EntityB b1 = new EntityB("b1", "entity b1", "descr_b_1");
EntityB b2 = new EntityB("b2", "entity b2", "descr_b_2");
EntityB b3 = new EntityB("b3", "entity b3", "descr_b_3");
AtoBAssn a1b1 = new AtoBAssn("a1", "b1");
AtoBAssn a1b3 = new AtoBAssn("a1", "b3");
AtoBAssn a2b2 = new AtoBAssn("a2", "b2");
AtoBAssn a2b3 = new AtoBAssn("a2", "b3");
AtoBAssn a3b1 = new AtoBAssn("a3", "b1");
return new ObjectGraph(
Lists.newArrayList(a1, a2, a3),
Lists.newArrayList(b1, b2, b3),
Lists.newArrayList(a1b1, a1b3, a2b2, a2b3, a3b1));
}
@Data
@AllArgsConstructor
private class EntityA {
private String id;
private String name;
private String description;
}
@Data
@AllArgsConstructor
private class EntityB {
private String id;
private String name;
private String description;
}
@Data
@AllArgsConstructor
private class AtoBAssn {
private String idA;
private String idB;
}
@Data
@AllArgsConstructor
private class ObjectGraph {
private List<EntityA> aList;
private List<EntityB> bList;
List<AtoBAssn> aToBAssnList;
}
@Repository
public class NestedTransactionsJdbcDao {
@Transactional
public void createObjectGraph(ObjectGraph og) {
createEntitiesA(og.getAList());
createEntitiesB(og.getBList());
createAtoBAssn(og.getAToBAssnList());
doSomethingElse();
}
@Transactional
public void createEntitiesA(List<EntityA> aList) {
aList.forEach(a ->
jdbcTemplate.update("INSERT INTO entity_a (id, name, description) VALUES(?, ?, ?);",
a.getId(), a.getName(), a.getDescription()));
}
@Transactional
public void createEntitiesB(List<EntityB> bList) {
bList.forEach(b ->
jdbcTemplate.update("INSERT INTO entity_b (id, name, description) VALUES(?, ?, ?);",
b.getId(), b.getName(), b.getDescription()));
}
/**
 * Access is intentionally package-private.
 */
@Transactional
void createAtoBAssn(List<AtoBAssn> aToBAssnList) {
aToBAssnList.forEach(aToB ->
jdbcTemplate.update("INSERT INTO entity_a_to_b_assn (entity_a_id, entity_b_id) VALUES(?, ?);",
aToB.getIdA(), aToB.getIdB()));
}
void doSomethingElse() {
// Intentionally left blank
}
}
}
Here is my configuration class:
import org.apache.catalina.loader.WebappClassLoader;
import org.springframework.context.annotation.AdviceMode;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableLoadTimeWeaving;
import org.springframework.context.annotation.Primary;
import org.springframework.instrument.classloading.LoadTimeWeaver;
import org.springframework.instrument.classloading.tomcat.TomcatLoadTimeWeaver;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import javax.sql.DataSource;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.aspectj.AnnotationTransactionAspect;
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@EnableLoadTimeWeaving(aspectjWeaving = EnableLoadTimeWeaving.AspectJWeaving.ENABLED)
public class EventCoreConfig {
@Bean
public LoadTimeWeaver loadTimeWeaver() {
// https://tomcat.apache.org/tomcat-8.0-doc/api/org/apache/tomcat/InstrumentableClassLoader.html
return new TomcatLoadTimeWeaver(new WebappClassLoader());
}
@Bean
@Primary
public PlatformTransactionManager txManager(DataSource dataSource) {
DataSourceTransactionManager txManager = new DataSourceTransactionManager(dataSource);
AnnotationTransactionAspect aspect = new AnnotationTransactionAspect();
aspect.setTransactionManager(txManager);
return txManager;
}
}
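A minimal sketch of an alternative wiring, assuming AspectJ weaving is actually active (this is an addition for comparison, not part of the original configuration): AnnotationTransactionAspect is an AspectJ singleton, so configuring a locally constructed instance has no effect; the transaction manager can instead be set on the woven singleton via aspectOf():

```java
// Sketch: wire the transaction manager into the AspectJ singleton.
// AnnotationTransactionAspect.aspectOf() returns the woven aspect instance;
// a `new AnnotationTransactionAspect()` configured locally is never consulted.
@Bean
@Primary
public PlatformTransactionManager txManager(DataSource dataSource) {
    DataSourceTransactionManager txManager = new DataSourceTransactionManager(dataSource);
    AnnotationTransactionAspect.aspectOf().setTransactionManager(txManager);
    return txManager;
}
```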
Here is the portion of my pom.xml that is adding dependencies of interest:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ch.vorburger.mariaDB4j</groupId>
<artifactId>mariaDB4j</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mariadb.jdbc</groupId>
<artifactId>mariadb-java-client</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
Any help will be greatly appreciated. I know this is a somewhat advanced topic, but I do not think it should be this complicated. I think Spring's documentation lacks examples of how to properly perform this kind of configuration, and I have not found any success stories out there with a similar setup.

How to prevent hibernate5 from lazy fetching when jackson serializes json object?

I'm currently running a spring-boot application where an endpoint returns a Page of a particular object stored in the database. For our purposes, let's call that object "x". Within "x" there is a list of objects that are set to be lazily fetched.
@Entity
@DynamicUpdate
class x {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Integer id;
@JsonIgnore
@OneToMany(mappedBy = "x", cascade = CascadeType.MERGE, fetch = FetchType.LAZY)
private List<y> lazilyFetchedList;
@Override
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
@JsonIgnore
public List<y> getLazilyFetchedList() {
return lazilyFetchedList;
}
public void setLazilyFetchedList(List<y> lazilyFetchedList) {
this.lazilyFetchedList = lazilyFetchedList;
}
}
I set #JsonIgnore above because I don't want lazilyFetchedList to be sent to the client upon a GET call.
My problem: even though the field is successfully ignored by Jackson from the client's point of view, Hibernate still issues additional queries to fetch the lazilyFetchedList while serializing the Java object "x" (even though Jackson never uses the result).
I have already tried answers from Avoid Jackson serialization on non fetched lazy objects but none of the answers seem to work.
Here is what my controller looks like:
@RequestMapping(value = "/{id}/x", method = RequestMethod.GET)
public ApiResponse<?> findX(@PathVariable Integer id, PagingInfo info) {
Page<x> page = repo.findX(id, toPageable(info));
return toResponse(page, FIND_LIST_STATUS);
}
Here's what my configuration of the object mapper looks like:
@Configuration
@EnableWebMvc
public class ApiWebConfig extends WebMvcConfigurerAdapter {
@Bean
public ObjectMapper objectMapper() {
ObjectMapper objectMapper = new ObjectMapper();
configureDefaultObjectMapper(objectMapper);
customizeObjectMapper(objectMapper);
return objectMapper;
}
public static void configureDefaultObjectMapper(ObjectMapper objectMapper) {
objectMapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
objectMapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
objectMapper.setSerializationInclusion(JsonInclude.Include.ALWAYS);
objectMapper.registerModule(new Hibernate5Module());
JavaTimeModule javaTimeModule = new JavaTimeModule();
javaTimeModule.addSerializer(ZonedDateTime.class, ZonedDateTimeSerializer.INSTANCE);
javaTimeModule.addSerializer(OffsetDateTime.class, OffsetDateTimeSerializer.INSTANCE);
objectMapper.registerModule(javaTimeModule);
}
/**
* Only register a json message converter
*/
@Override
public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
converters.clear();
MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
converter.setSupportedMediaTypes(Arrays.asList(MediaType.APPLICATION_JSON, ActuatorMediaTypes.APPLICATION_ACTUATOR_V1_JSON));
converter.setObjectMapper(objectMapper());
converters.add(converter);
}
}
Versions:
Spring-Boot 1.5.3
Jackson 2.8.6
Hibernate 5.0.11.Final
jackson-datatype-hibernate5 2.9.0
Add the following dependency to your pom.xml:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-hibernate5</artifactId>
</dependency>
(add the version if it is not managed by spring-boot-dependencies or spring-boot-starter-parent)
Add the following code to your spring configuration class:
@Bean
protected Module module() {
return new Hibernate5Module();
}
With the following imports:
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.datatype.hibernate5.Hibernate5Module;
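For completeness, a sketch of the module's relevant switch (assuming jackson-datatype-hibernate5 is on the classpath): by default Hibernate5Module serializes uninitialized lazy properties as null instead of forcing a fetch, and FORCE_LAZY_LOADING is the feature that would re-enable the extra queries, so it should stay disabled:

```java
// Sketch: the module's default already avoids triggering lazy loads;
// FORCE_LAZY_LOADING (off by default) is the switch that would undo that.
Hibernate5Module module = new Hibernate5Module();
module.configure(Hibernate5Module.Feature.FORCE_LAZY_LOADING, false);
objectMapper.registerModule(module);
```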

How do I use spring data jpa to query jsonb column?

I'm having a problem getting this native query right against a postgres 9.4 instance.
My repository has a method:
@Query(value = "SELECT t.* " +
"FROM my_table t " +
"WHERE t.field_1 = ?1 " +
"AND t.field_2 = 1 " +
"AND t.field_3 IN ?2 " +
"AND t.jsonb_field #>> '{key,subkey}' = ?3",
nativeQuery = true)
List<Entity> getEntities(String field1Value,
Collection<Integer> field3Values,
String jsonbFieldValue);
But the logs show this:
SELECT t.* FROM my_table t
WHERE t.field_1 = ?1
AND t.field_2 = 1
AND t.field_3 IN ?2
AND t.jsonb_field ? '{key,subkey}' = ?3
And I get this exception:
Internal Exception: org.postgresql.util.PSQLException: No value
specified for parameter 2.
I logged the parameters directly before method invocation, and they are all supplied.
I'm not sure why #>> shows ? in the log. Do I need to escape #>>? Do I need to format the collection for IN? Do I need to escape the json path?
When I execute the query directly against the db, it works. Example:
SELECT *
FROM my_table t
WHERE t.field_1 = 'xxxx'
AND t.field_2 = 1
AND t.field_3 IN (13)
AND t.jsonb_field #>> '{key,subkey}' = 'value'
I found the Specification API from Spring Data very helpful.
Let's say we have an entity named Product with a property named title of type JSON(B).
I assume this property contains the title of the Product in different languages, for example: {"EN":"Multicolor LED light", "EL":"Πολύχρωμο LED φώς"}.
The source code below finds a product (or more than one, in case the field is not unique) by title and locale, passed as arguments.
@Repository
public interface ProductRepository extends JpaRepository<Product, Integer>, JpaSpecificationExecutor<Product> {
}
public class ProductSpecification implements Specification<Product> {
private String locale;
private String titleToSearch;
public ProductSpecification(String locale, String titleToSearch) {
this.locale = locale;
this.titleToSearch = titleToSearch;
}
@Override
public Predicate toPredicate(Root<Product> root, CriteriaQuery<?> query, CriteriaBuilder builder) {
return builder.equal(builder.function("jsonb_extract_path_text", String.class, root.<String>get("title"), builder.literal(this.locale)), this.titleToSearch);
}
}
@Service
public class ProductService {
@Autowired
private ProductRepository productRepository;
public List<Product> findByTitle(String locale, String titleToSearch) {
ProductSpecification cs = new ProductSpecification(locale, titleToSearch);
return productRepository.findAll(cs);
// Or using a lambda expression - without the need for the ProductSpecification class:
// return productRepository.findAll((Root<Product> root, CriteriaQuery<?> query, CriteriaBuilder builder) ->
//         builder.equal(builder.function("jsonb_extract_path_text", String.class, root.<String>get("title"), builder.literal(locale)), titleToSearch));
}
}
You can find another answer about the way you should use Spring Data here.
Hope that helps.
If the operator is being converted to a question mark for one reason or another, you should try using the equivalent function instead. You can find the corresponding function by running \doS+ #>> in the psql console; it tells us that the function called is jsonb_extract_path_text. That would make your query:
@Query(value = "SELECT t.* " +
"FROM my_table t " +
"WHERE t.field_1 = ?1 " +
"AND t.field_2 = 1 " +
"AND t.field_3 IN ?2 " +
"AND jsonb_extract_path_text(t.jsonb_field, '{key,subkey}') = ?3",
nativeQuery = true)
This may be an old topic, but I am adding here how to search a jsonb column by field using a Spring Specification.
If you want to search with "LIKE", you need to create a LIKE disjunction with the following code:
final Predicate likeSearch = cb.disjunction();
After that, let's assume you have a jsonb field in your object named address, and address has 5 fields. To search across all these fields you need to add a "LIKE" expression for each field:
for (String field : ADDRESS_SEARCH_FIELDS) {
    likeSearch.getExpressions().add(cb.like(cb.lower(cb.function("json_extract_path_text", String.class,
            root.get("address"), cb.literal(field))), "%" + searchKey.toLowerCase() + "%"));
}
Here cb is the CriteriaBuilder, and searchKey is the text you want to find in the address fields.
Hope this helps.
You can also use the FUNC JPQL keyword to call custom functions, avoiding a native query.
Something like this:
@Query(value = "SELECT t FROM my_table t "
        + "WHERE t.field_1=:field_1 AND t.field_2=1 AND t.field_3 IN :field_3 "
        + "AND FUNC('jsonb_extract_path_text', t.jsonb_field, 'key', 'subkey')=:value")
List<Entity> getEntities(@Param("field_1") String field_1, @Param("field_3") Collection<Integer> field_3, @Param("value") String value);
I suggest not following this route; I prefer the generic CRUD way. (I am also working on advanced auto-generated DAO methods, in the way StrongLoop LoopBack does it, as a Spring Data REST Maven plugin, but that is only experimental at the moment.)
But with this JSON, what to do now... I was looking for something similar to Spring Data's MongoDB JSON processing via the @Document annotation, but that is not yet available. There are other ways, though :-)
In general it is about implementing your JSON user type (UserType interface):
public class YourJSONBType implements UserType {
    // implement sqlTypes(), returnedClass(), nullSafeGet(), nullSafeSet(), etc.
}
Finally you need to enhance your JPA classes with specification of your implemented user type:
@Entity
@Data
@AllArgsConstructor
@NoArgsConstructor
@TypeDef(name = "JsonbType", typeClass = YourJSONBType.class)
public class Person {
    @Id
    @GeneratedValue
    private Long id;
    @Column(columnDefinition = "jsonb")
    @Type(type = "JsonbType")
    private Map<String, Object> info;
}
See other related articles here:
Mapping PostgreSQL JSON column to Hibernate value type
The full implementation example is available here:
https://github.com/nzhong/spring-data-jpa-postgresql-json
https://github.com/mariusneo/postgres-json-jpa
A similar, but slightly different, example is available here:
http://www.wisely.top/2017/06/27/spring-data-jpa-postgresql-jsonb/?d=1
Sharing my own example, as I struggled to adapt the provided answers to my specific needs; hopefully this helps others. My examples are in Groovy, integrating with a PostgreSQL database. This is a simple example of how to search a JSON column on a field called "name", with paging.
JSON support class
@TypeDefs([@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)])
@MappedSuperclass
class JSONSupport {}
The entity class:
@Entity
@Table(name = "my_table")
class MyEntity extends JSONSupport {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
Long pk
@Type(type = "jsonb")
@Column(columnDefinition = "jsonb")
String jsonData
}
The specification class
class NameSpecification implements Specification<MyEntity> {
private final String name
NameSpecification(String name) {
this.name = name
}
@Override
Predicate toPredicate(Root<MyEntity> root, CriteriaQuery<?> query, CriteriaBuilder builder) {
return builder.equal(
builder.function(
"jsonb_extract_path_text",
String.class,
root.<String>get("jsonData"),
builder.literal("name")
),
this.name
)
}
}
The repository
interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, Long>, JpaSpecificationExecutor<MyEntity> {}
The usage
@Service
class MyEntityService {
private final MyEntityRepository repo
MyEntityService(MyEntityRepository repo) {
this.repo = repo
}
Page<MyEntity> getEntitiesByNameAndPage(String name, Integer page, Integer pageSize) {
PageRequest pageRequest = PageRequest.of(page, pageSize, Sort.by("pk"))
NameSpecification spec = new NameSpecification(name)
return repo.findAll(spec, pageRequest)
}
}
Create a table in postgres DB
CREATE TABLE shared.my_data (
id serial PRIMARY KEY,
my_config jsonb
);
Insert data into the table
INSERT INTO shared.my_data (id, my_config) VALUES (1,
'{"useTime": true,
"manualUnassign": false,
"require": true,
"blockTime": 10,
"additionalHours": 1,
"availableGroup": [10, 20, 30]
}');
Check data in table:
SELECT * FROM shared.my_data;
Spring Boot Java project
Java version: 11
Spring Boot version: 2.7.1
Maven dependencies in the pom.xml file.
For Postgres JSONB, we need a particular vladmihalcea dependency, version 2.14.0:
<dependency>
<groupId>com.vladmihalcea</groupId>
<artifactId>hibernate-types-52</artifactId>
<version>2.14.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-jpa</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.24</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
JSON object class
import com.fasterxml.jackson.annotation.JsonProperty;
import java.util.List;
public class MyConfig {
    @JsonProperty("useTime")
    private boolean useTime;
    @JsonProperty("manualUnassign")
    private boolean manualUnassign;
    @JsonProperty("require")
    private boolean require;
    @JsonProperty("additionalHours")
    private int additionalHours;
    @JsonProperty("blockTime")
    private int blockTime;
    @JsonProperty("availableGroup")
    private List<Integer> availableGroup;
}
Entity: root object encapsulating the columns of a table row
import com.vladmihalcea.hibernate.type.json.JsonBinaryType;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;
import javax.persistence.*;
@Data
@Entity
@Table(name = "my_data", schema = "shared")
@Builder
@NoArgsConstructor
@AllArgsConstructor
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
public class MyData {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    @Type(type = "jsonb")
    @Column(columnDefinition = "jsonb")
    private MyConfig myConfig;
}
Repository layer
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface MyDataRepo extends JpaRepository<MyData, Long> {
}
Service layer
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.List;
@Service
public class MyDataService {
@Autowired
private MyDataRepo myDataRepo;
public List<MyData> getAllMyspecificData(){
List<MyData> allMyData = myDataRepo.findAll();
return allMyData;
}
}
REST End point
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.util.List;
@RestController
@RequestMapping(path = "/my")
public class MyResource {
@Autowired
MyDataService myDataService;
@GetMapping("/data")
public ResponseEntity<Object> getAllMyData() {
List<MyData> myDataList =
myDataService.getAllMyspecificData();
return new ResponseEntity<>(myDataList, HttpStatus.OK);
}
}

POST duplicate entry not causing PK collision in Spring Data REST

As per the Spring Data REST documentation, the POST method creates a new entity from the given request body. However, I found it can also update an existing entity. In some cases this can be problematic. Here is an example:
DemoApplication.java
package com.example;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class DemoApplication {
public static void main(String[] args) {
SpringApplication.run(DemoApplication.class, args);
}
}
UserRepository.java
package com.example;
import org.springframework.data.repository.PagingAndSortingRepository;
public interface UserRepository extends PagingAndSortingRepository<User, String> {}
User.java
package com.example;
import javax.persistence.Entity;
import javax.persistence.Id;
@Entity
public class User {
@Id
private String username;
private String password;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
}
pom.xml (within project tag)
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>demo</name>
<description>Demo project for Spring Boot</description>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.1.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
application.properties: empty
URL: http://localhost:8080/users
Method: POST
JSON content:
{"username":"user","password":"password"}
I assumed the above POST request would get HTTP 201 only the first time. However, I was able to send the POST request many times and got HTTP 201 every time. In addition, I was also able to change the password in the database using a POST request.
I believe this is a security problem. For example, I might allow anonymous user registration through a POST request; in that situation, an existing user could be overwritten.
Question: How can I prevent a new entity from being created by a POST request if an entity with the same id already exists? Or did I misinterpret the Spring Data REST documentation?
Supplementary explanation:
The cause of this issue lies in the design behind Spring Data REST. Spring Data REST is built upon Spring Data JPA, which was not designed to be exposed directly to the outside, so it "trusts" the data that comes in. The method isNew in org.springframework.data.repository.core.support.AbstractEntityInformation shows how data is determined to be new or not:
public boolean isNew(T entity) {
ID id = getId(entity);
Class<ID> idType = getIdType();
if (!idType.isPrimitive()) {
return id == null;
}
if (id instanceof Number) {
return ((Number) id).longValue() == 0L;
}
throw new IllegalArgumentException(String.format("Unsupported primitive id type %s!", idType));
}
The result of the isNew method eventually affects the save method in org.springframework.data.jpa.repository.support.SimpleJpaRepository:
public <S extends T> S save(S entity) {
if (entityInformation.isNew(entity)) {
em.persist(entity);
return entity;
} else {
return em.merge(entity);
}
}
In the situation described in this question, the username field, which is also the id of the user entity, always contains data when creating new users. Therefore, in isNew, id == null always returns false, and save always performs a merge operation.
These hints are all I can provide; I do not know whether there is a solution to this problem.
The URL links are just references; they may not be exactly the same version as what I am using.
To make an entity work properly with Spring Data REST (and Spring Data JPA as well), the entity class needs to implement Persistable. The notable method to override is isNew(); it is called instead of the one in AbstractEntityInformation mentioned in the question. To let an entity know its own state (new or old), a version field is also needed: by annotating an explicit field with @Version, Spring Data JPA will update that field, so when the entity is first constructed the field holds its default value (null or 0, depending on the data type). In addition, because Spring Data REST is meant to be exposed to the outside world, @JsonIgnore is used on the version field to protect it from being misused.
For this particular question, the User.java class needs to be changed to as following:
package com.example;
import javax.persistence.*;
import org.springframework.data.domain.Persistable;
import com.fasterxml.jackson.annotation.JsonIgnore;
@Entity
public class User implements Persistable<String> {
private static final long serialVersionUID = 7509971300023426574L;
@Id
private String username;
private String password;
@Version
@JsonIgnore
private Long version;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public Long getVersion() {
return version;
}
public void setVersion(Long version) {
this.version = version;
}
@Override
public String getId() {
return username;
}
@Override
public boolean isNew() {
return version == null;
}
}
As @Alan Hay mentioned, validation should also be performed on the incoming data. It is definitely good to have.
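One way to add such validation is a repository event handler that rejects a create when the id is already taken. This is a sketch with assumed names (the handler class and exception choice are not from the original answer); note that on the Spring Data version bundled with Boot 1.5 the repository method is exists(id) rather than existsById(id):

```java
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.rest.core.annotation.HandleBeforeCreate;
import org.springframework.data.rest.core.annotation.RepositoryEventHandler;
import org.springframework.stereotype.Component;

// Sketch: reject a POST whose id already exists, complementing the
// Persistable/isNew approach above.
@Component
@RepositoryEventHandler(User.class)
public class UserCreateGuard {

    private final UserRepository repository;

    public UserCreateGuard(UserRepository repository) {
        this.repository = repository;
    }

    @HandleBeforeCreate
    public void checkNotExists(User user) {
        // exists(id) here; renamed to existsById(id) in later Spring Data versions
        if (repository.exists(user.getUsername())) {
            throw new DataIntegrityViolationException("username already taken");
        }
    }
}
```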

Writing to Dynamo table works fine, but reading throws DynamoDBMappingException

Executing orderRequestDao.save(new OrderRequest("5000", "body")); successfully places a record in Dynamo. Any attempt to read returns:
[Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException:
could not invoke null on class
com.cfa.fulfillmentApi.model.OrderRequest
with value 100 of type class java.lang.String] with root cause
(Record with id: 100 exists)
I'm using the following jars (aws.sdk.version: 1.11.86):
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-dynamodb</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.github.derjust</groupId>
<artifactId>spring-data-dynamodb</artifactId>
<version>4.4.1</version>
</dependency>
DynamoDB config:
Primary partition key: id (String)
Dao:
@EnableScan
public interface OrderRequestDao extends CrudRepository<OrderRequest, String> {
OrderRequest findOne(String s);
OrderRequest save(OrderRequest or);
}
Domain object:
@DynamoDBTable(tableName = "dev_transx")
public class OrderRequest {
private String id;
private String body;
public OrderRequest(String id, String body) {
this.id = id;
this.body = body;
}
public OrderRequest() {}
@DynamoDBHashKey
public String getId()
{
return id;
}
@DynamoDBAttribute
public String getBody()
{
return body;
}
public void setBody(String body) {
this.body = body;
}
@Override
public String toString() {
return String.format(
"Customer[id=%s, body='%s']",
id, body);
}
@Override
public int hashCode() {
return id.hashCode();
}
}
I've tried just about every data type for id in the domain class, but no luck.
I removed aws-java-sdk-dynamodb, since it was already included in spring-data-dynamodb.
Most importantly, I added a setter for id in the domain class.
Adding a setter method for the field works like a charm
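For reference, a minimal sketch of what the fix looks like (the class here is a stripped-down stand-in for the real OrderRequest, with only the id property):

```java
public class OrderRequestSetterSketch {
    // Minimal stand-in for the domain class; only the hash-key property is shown.
    static class OrderRequest {
        private String id;

        public String getId() { return id; }

        // The missing piece: DynamoDBMapper populates the hash key reflectively
        // when reading, so the property must be writable.
        public void setId(String id) { this.id = id; }
    }

    public static void main(String[] args) {
        OrderRequest or = new OrderRequest();
        or.setId("100"); // what the mapper does when materializing a record
        System.out.println(or.getId()); // 100
    }
}
```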
