Spring Security 5 provides a ReactiveSecurityContextHolder to fetch the SecurityContext from a Reactive context, but when I tried to implement AuditorAware so that auditing works automatically, it did not work. Currently I cannot find a Reactive variant of AuditorAware.
@Bean
public AuditorAware<Username> auditor() {
return () -> ReactiveSecurityContextHolder.getContext()
.map(SecurityContext::getAuthentication)
.log()
.filter(a -> a != null && a.isAuthenticated())
.map(Authentication::getPrincipal)
.cast(UserDetails.class)
.map(auth -> new Username(auth.getName()))
.switchIfEmpty(Mono.empty())
.blockOptional();
}
I have added @EnableMongoAuditing on my Boot application class.
On the Mongo document class, I added the audit-related annotations:
@CreatedDate
private LocalDateTime createdDate;
@CreatedBy
private Username author;
When I added a post, the createdDate is filled, but author is null.
{"id":"5a49ccdb9222971f40a4ada1","title":"my first post","content":"content of my first post","createdDate":"2018-01-01T13:53:31.234","author":null}
The complete code is here, based on Spring Boot 2.0.0.M7.
Update: Spring Boot 2.4.0-M2 (Spring Data Commons 2.4.0-M2 / Spring Data MongoDB 3.1.0-M2) includes a ReactiveAuditorAware. Check this new sample. Note: use @EnableReactiveMongoAuditing to activate it.
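For reference, a minimal sketch of what such a bean can look like (assuming the same Username wrapper as above); unlike AuditorAware, ReactiveAuditorAware returns a Mono, so no blocking is needed:
@Bean
public ReactiveAuditorAware<Username> reactiveAuditorAware() {
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .filter(Authentication::isAuthenticated)
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(userDetails -> new Username(userDetails.getUsername()));
}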
I am posting another solution that takes the entity's id into account, to support update operations:
@Component
@RequiredArgsConstructor
public class AuditCallback implements ReactiveBeforeConvertCallback<AuditableEntity> {
private final ReactiveMongoTemplate mongoTemplate;
private Mono<?> exists(Object id, Class<?> entityClass) {
// An empty Mono signals that the entity has no stored counterpart (i.e. it is new).
if (id == null) {
return Mono.empty();
}
return mongoTemplate.findById(id, entityClass);
}
@Override
public Publisher<AuditableEntity> onBeforeConvert(AuditableEntity entity, String collection) {
var securityContext = ReactiveSecurityContextHolder.getContext();
return securityContext
.zipWith(exists(entity.getId(), entity.getClass()))
.map(tuple2 -> {
// Update path: a stored document exists, so only touch the "last modified"
// audit fields on the incoming entity (not on the copy loaded from the database,
// which would discard the caller's changes).
entity.setLastModifiedBy(tuple2.getT1().getAuthentication().getName());
entity.setLastModifiedDate(Instant.now());
return entity;
})
.switchIfEmpty(Mono.zip(securityContext, Mono.just(entity))
.map(tuple2 -> {
// Create path: fill in both the "created" and "last modified" audit fields.
var auditableEntity = tuple2.getT2();
String principal = tuple2.getT1().getAuthentication().getName();
Instant now = Instant.now();
auditableEntity.setLastModifiedBy(principal);
auditableEntity.setCreatedBy(principal);
auditableEntity.setLastModifiedDate(now);
auditableEntity.setCreatedDate(now);
return auditableEntity;
}));
}
}
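For completeness, a minimal sketch of the AuditableEntity base class assumed above (the field names are inferred from the setters the callback uses, and Lombok is assumed since the answer already uses it):
@Getter
@Setter
public abstract class AuditableEntity {

    @Id
    private String id;

    private String createdBy;
    private Instant createdDate;
    private String lastModifiedBy;
    private Instant lastModifiedDate;
}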
Deprecated: see the updated solution in the original post.
Until an official reactive AuditorAware is provided, an alternative is to implement auditing via the Spring Data Mongo-specific ReactiveBeforeConvertCallback.
Do not use @EnableMongoAuditing.
Implement your own ReactiveBeforeConvertCallback; here I use a PersistentEntity interface for the entities that need to be audited.
public class PersistentEntityCallback implements ReactiveBeforeConvertCallback<PersistentEntity> {
@Override
public Publisher<PersistentEntity> onBeforeConvert(PersistentEntity entity, String collection) {
var user = ReactiveSecurityContextHolder.getContext()
.map(SecurityContext::getAuthentication)
.filter(it -> it != null && it.isAuthenticated())
.map(Authentication::getPrincipal)
.cast(UserDetails.class)
.map(userDetails -> new Username(userDetails.getUsername()))
.switchIfEmpty(Mono.empty());
var currentTime = LocalDateTime.now();
if (entity.getId() == null) {
entity.setCreatedDate(currentTime);
}
entity.setLastModifiedDate(currentTime);
return user
.map(u -> {
if (entity.getId() == null) {
entity.setCreatedBy(u);
}
entity.setLastModifiedBy(u);
return entity;
}
)
.defaultIfEmpty(entity);
}
}
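Then register the callback as a bean so Spring Data Mongo discovers it (a minimal sketch; any ReactiveBeforeConvertCallback bean in the application context is picked up automatically):
@Bean
public PersistentEntityCallback persistentEntityCallback() {
    return new PersistentEntityCallback();
}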
Check the complete code here.
To have the createdBy attribute filled, you need to link your auditorAware bean to the @EnableMongoAuditing annotation.
In your MongoConfig class, define your bean :
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
....
}
and use it in the annotation :
@Configuration
@EnableMongoAuditing(auditorAwareRef="auditorAware")
class MongoConfig {
....
}
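For example, a minimal sketch of the bean body (assuming the blocking SecurityContextHolder is populated; adapt the principal extraction to your security setup):
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
    // Resolve the current user from the (blocking) security context.
    return () -> Optional.ofNullable(SecurityContextHolder.getContext().getAuthentication())
            .filter(Authentication::isAuthenticated)
            .map(Authentication::getName);
}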
Related
I am working on a Spring Boot project with Spring Data REST, Gradle, and an Oracle Express DB, in which I use DozerBeanMapper to map entities to DTOs and vice versa. I use no XML configuration for Dozer, just Java:
@Slf4j
@Configuration
public class DozerConfig {
@Bean
public DozerBeanMapper getDozerMapper() {
log.info("Initializing DozerBeanMapper bean");
return new DozerBeanMapper();
}
}
Also, for clarity, I have explicitly configured all the fields that have to be mapped, although they all have the same names. For example, my Client mapper:
@Slf4j
@Component
public class ClientMapper extends BaseMapper {
private BeanMappingBuilder builder = new BeanMappingBuilder() {
@Override
protected void configure() {
mapping(Client.class, ClientDTO.class)
.fields("id", "id")
.fields("name", "name")
.fields("midName", "midName")
.fields("surname", "surname")
.exclude("password")
.fields("phone", "phone")
.fields("email", "email")
.fields("address", "address")
.fields("idCardNumber", "idCardNumber")
.fields("idCardIssueDate", "idCardIssueDate")
.fields("idCardExpirationDate", "idCardExpirationDate")
.fields("bankAccounts", "bankAccounts")
.fields("accountManager", "accountManager")
.fields("debitCardNumber", "debitCardNumber")
.fields("creditCardNumber", "creditCardNumber")
.fields("dateCreated", "dateCreated")
.fields("dateUpdated", "dateUpdated");
}
};
@Autowired
public ClientMapper(DozerBeanMapper mapper) {
super(mapper);
mapper.addMapping(builder);
}
public ClientDTO toDto(Client entity) {
log.info("Mapping Client entity to DTO");
return mapper.map(entity, ClientDTO.class);
}
public Client toEntity(ClientDTO dto) {
log.info("Mapping Client DTO to entity");
return mapper.map(dto, Client.class);
}
public List<ClientDTO> toDtos(List<Client> entities) {
log.info("Mapping Client entities to DTOs");
return entities.stream()
.map(entity -> toDto(entity))
.collect(Collectors.toList());
}
public List<Client> toEntities(List<ClientDTO> dtos) {
log.info("Mapping Client DTOs to entities");
return dtos.stream()
.map(dto -> toEntity(dto))
.collect(Collectors.toList());
}
public EmployeeDTO toEmployeeDto(Employee entity) {
log.info("Mapping Employee entity to DTO");
return mapper.map(entity, EmployeeDTO.class);
}
public Employee toEmployeeEntity(EmployeeDTO dto) {
log.info("Mapping Employee DTO to entity");
return mapper.map(dto, Employee.class);
}
public List<EmployeeDTO> toEmployeeDtos(List<Employee> entities) {
log.info("Mapping Employee entities to DTOs");
return entities.stream()
.map(entity -> toEmployeeDto(entity))
.collect(Collectors.toList());
}
public List<Employee> toEmployeeEntities(List<EmployeeDTO> dtos) {
log.info("Mapping Employee DTOs to entities");
return dtos.stream()
.map(dto -> toEmployeeEntity(dto))
.collect(Collectors.toList());
}
}
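(For context, BaseMapper is not shown above; presumably it just holds the shared DozerBeanMapper, something like the following sketch.)
public abstract class BaseMapper {

    protected final DozerBeanMapper mapper;

    protected BaseMapper(DozerBeanMapper mapper) {
        this.mapper = mapper;
    }
}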
Despite this I get the following exception:
"httpStatus": "500 Internal Server Error",
"exception": "java.lang.IllegalArgumentException",
"message": "setAttribute(name, value):\n name: "http://apache.org/xml/features/validation/schema\"\n value: \"true\"",
"stackTrace": [
"oracle.xml.jaxp.JXDocumentBuilderFactory.setAttribute(JXDocumentBuilderFactory.java:289)",
"org.dozer.loader.xml.XMLParserFactory.createDocumentBuilderFactory(XMLParserFactory.java:71)",
"org.dozer.loader.xml.XMLParserFactory.createParser(XMLParserFactory.java:50)",
"org.dozer.loader.xml.MappingStreamReader.<init>(MappingStreamReader.java:43)",
"org.dozer.loader.xml.MappingFileReader.<init>(MappingFileReader.java:44)",
"org.dozer.DozerBeanMapper.loadFromFiles(DozerBeanMapper.java:219)",
"org.dozer.DozerBeanMapper.loadCustomMappings(DozerBeanMapper.java:209)",
"org.dozer.DozerBeanMapper.initMappings(DozerBeanMapper.java:315)",
"org.dozer.DozerBeanMapper.getMappingProcessor(DozerBeanMapper.java:192)",
"org.dozer.DozerBeanMapper.map(DozerBeanMapper.java:120)",
"com.rosenhristov.bank.exception.mapper.ClientMapper.toDto(ClientMapper.java:52)",
"com.rosenhristov.bank.service.ClientService.lambda$getClientById$0(ClientService.java:27)",
"java.base/java.util.Optional.map(Optional.java:265)",
"com.rosenhristov.bank.service.ClientService.getClientById(ClientService.java:27)",
"com.rosenhristov.bank.controller.ClientController.getClientById(ClientController.java:57)",
"java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
"java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
"java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
"java.base/java.lang.reflect.Method.invoke(Method.java:566)",
"org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:197)",
"org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:141)",
"org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:106)",
"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:893)",
"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:807)",
"org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)",
"org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1061)",
"org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:961)",
"org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)",
"org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898)",
"javax.servlet.http.HttpServlet.service(HttpServlet.java:626)",
"org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)",
"javax.servlet.http.HttpServlet.service(HttpServlet.java:733)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:93)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:202)",
"org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97)",
"org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:542)",
"org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:143)",
"org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)",
"org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)",
"org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)",
"org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374)",
"org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)",
"org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:868)",
"org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1590)",
"org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)",
"java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)",
"java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)",
"org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)",
"java.base/java.lang.Thread.run(Thread.java:834)"
]
It seems Dozer is trying to find some XML config file, judging by this:
"oracle.xml.jaxp.JXDocumentBuilderFactory.setAttribute(JXDocumentBuilderFactory.java:289)"
It seems it is searching for an XML validation schema (take a look at var1 in the picture below):
When I start the application I just saw the following in the IntelliJ console:
Trying to find Dozer configuration file: dozer.properties
2020-12-23 11:46:09.855 WARN 17488 --- [ restartedMain] org.dozer.config.GlobalSettings: Dozer configuration file not found: dozer.properties. Using defaults for all Dozer global properties.
Probably I have to look into dozer.properties and find out how to make it use the Java configuration?
Can someone help me, please? I searched for a solution on the internet but still haven't found a suitable one. I am new to Dozer; I have used MapStruct before.
You can try my beanknife to generate the DTO file automatically. It will have a read method to convert the entity to a DTO. Although there is no converter from DTO to entity, I think you don't need it in most situations.
@ViewOf(value = Client.class, genName = "ClientDto", includePattern = ".*")
class ClientDtoConfiguration {}
Then it will generate a DTO class named "ClientDto" with all the properties of Client.
Client client = ...
ClientDto clientDto = ClientDto.read(client);
List<Client> clients = ...
List<ClientDto> clientDtos = ClientDto.read(clients);
Then serialize the DTOs instead of the entities.
I created my own HttpTraceRepository to write and read HttpTrace objects from a capped collection in MongoDB.
I don't understand why HttpTrace is a final class without a public constructor.
In my case, when I read from MongoDB (the findAll method) to find the HTTP traces of the last X hours and show them in the Spring Boot Admin UI, it's impossible to instantiate the HttpTrace objects.
@Component
public class MongoTraceRepository implements HttpTraceRepository {
private static final String ADMIN_TRACE = "admin.trace";
private MongoOperations mongoOps;
@Value("${admin.display.trace.last.x.hours}")
private int displayTraceLastXHours;
@Autowired
public MongoTraceRepository(MongoOperations mongoOps) {
this.mongoOps = mongoOps;
}
@PostConstruct
public void initialize() {
MongoCollection<Document> collection = mongoOps.getCollection(ADMIN_TRACE);
boolean collectionExists = mongoOps.collectionExists(ADMIN_TRACE);
if (!collectionExists) {
collection.drop();
// 100 Mo max, 500 000 documents max, capped !
mongoOps.createCollection(ADMIN_TRACE, CollectionOptions.empty().size(104857600).maxDocuments(500000).capped());
}
}
@Override
public List<HttpTrace> findAll() {
Date yesterday = Date.from(LocalDateTime.now().minusHours(displayTraceLastXHours).atZone(ZoneId.systemDefault()).toInstant());
return mongoOps.find(new Query(where("timestamp").gte(yesterday)), HttpTrace.class, ADMIN_TRACE)
.stream()
.sorted((o1, o2) -> o2.getTimestamp().compareTo(o1.getTimestamp()))
.collect(Collectors.toList());
}
@Override
@Async
public void add(HttpTrace traceInfo) {
mongoOps.save(traceInfo, ADMIN_TRACE);
}
}
This is the error log:
org.springframework.data.mapping.MappingException: No property request found on entity class org.springframework.boot.actuate.trace.http.HttpTrace$Request to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:68)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:49)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:75)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:86)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:273)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:253)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1387)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1334)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1276)
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:71)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:49)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:75)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:86)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:273)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:253)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:202)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:198)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:86)
at org.springframework.data.mongodb.core.MongoTemplate$ReadDocumentCallback.doWith(MongoTemplate.java:2785)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:2448)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2244)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2227)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:770)
There is a constructor specifically suited to creating an HttpTrace from a persistence store.
It was only added in 2.1.0, though.
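For illustration, a hedged sketch of rebuilding an HttpTrace from a stored Mongo Document on Spring Boot 2.1+ (the document field names here are assumptions; map them to whatever your repository actually persists):
private HttpTrace toHttpTrace(Document doc) {
    // Re-create the nested Request and Response parts first
    // (their public constructors also exist since 2.1.0).
    HttpTrace.Request request = new HttpTrace.Request(
            doc.getString("method"),
            URI.create(doc.getString("uri")),
            Collections.emptyMap(),   // headers, if you persist them
            null);                    // remoteAddress, if you persist it
    HttpTrace.Response response = new HttpTrace.Response(
            doc.getInteger("status"),
            Collections.emptyMap());  // headers, if you persist them
    // The "persistence store" constructor: principal and session may be null.
    return new HttpTrace(request, response,
            doc.getDate("timestamp").toInstant(),
            null, null,
            doc.getLong("timeTaken"));
}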
Are there any custom param converters in Spring? For example:
@RequestMapping(value = "/getEmployees/{empName}")
public void getEmployees(@PathVariable("empName") Employee employee) {
}
The path variable coming from the request is of type String. Based on the given empName, it should be assigned to an Employee object. In JAX-RS we can use ParamConverter and ParamConverterProvider for the conversion. Do we have converters like that in Spring?
Yes, you can use custom param converters in Spring.
1. Create a converter component that implements Spring's Converter interface:
@Component
public class StringToRightsModeConverter implements Converter<String, RightsMode> {
@Override
public RightsMode convert(String s) {
try{
return RightsMode.valueOf(s);
} catch (Exception e) {
return RightsMode.getByCode(s);
}
}
}
2. That's all. Spring will automatically use it for String -> RightsMode type conversion:
@GetMapping({"/periods"})
public List<Period> periods(
@RequestParam(required = false) RightsMode rightsMode) {
................................................................
}
Spring 3 type conversion
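Applied to the original question, a hedged sketch of a String -> Employee converter for the empName path variable (EmployeeService.findByName is a hypothetical lookup, not a real API):
@Component
public class StringToEmployeeConverter implements Converter<String, Employee> {

    private final EmployeeService employeeService; // hypothetical lookup service

    public StringToEmployeeConverter(EmployeeService employeeService) {
        this.employeeService = employeeService;
    }

    @Override
    public Employee convert(String empName) {
        // Resolve the incoming path variable to a domain object
        // before the controller method is invoked.
        return employeeService.findByName(empName);
    }
}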
I'm using the gemfire-json-server module in SpringXD to populate a GemFire grid with json representation of “Order” objects. I understand the gemfire-json-server module saves data in Pdx form in GemFire. I’d like to read the contents of the GemFire grid into an “Order” object in my application. I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I’m using the Spring Data GemFire libraries to read contents of the cluster. The code snippet to read the contents of the Grid follows:
public interface OrderRepository extends GemfireRepository<Order, String>{
Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: The data was initially stored in GemFire using SpringXD's gemfire-json-server-module
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key), but, interestingly enough, if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
...
}
This works!
Under-the-hood I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
However, the orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {
protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0L);
private Order amazon;
private Order bestBuy;
private Order target;
private Order walmart;
@Autowired
private OrderRepository orderRepository;
@Resource(name = "Orders")
private com.gemstone.gemfire.cache.Region<Long, Object> orders;
protected Order createOrder(String name) {
return createOrder(ID_SEQUENCE.incrementAndGet(), name);
}
protected Order createOrder(Long id, String name) {
return new Order(id, name);
}
protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
try {
if (pdxInstance == null) {
return null;
}
else if (toType.isInstance(pdxInstance)) {
return toType.cast(pdxInstance);
}
else if (pdxInstance instanceof PdxInstance) {
return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
}
else {
throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
pdxInstance.getClass().getName()));
}
}
catch (IOException e) {
throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
}
}
protected void log(Object value) {
System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
}
protected Order put(Order order) {
Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
}
protected PdxInstance toPdx(Object obj) {
try {
return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
}
catch (JsonProcessingException e) {
throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
}
}
@Before
public void setup() {
amazon = put(createOrder("Amazon Order"));
bestBuy = put(createOrder("BestBuy Order"));
target = put(createOrder("Target Order"));
walmart = put(createOrder("Wal-Mart Order"));
}
@Test
public void regionGet() {
assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
}
@Test
public void repositoryFindOneMethod() {
log(orderRepository.findOne(target.getTransactionId()));
assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
}
@Test
public void repositoryQueryMethod() {
assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
}
#Region("Orders")
#JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "#type")
public static class Order implements PdxSerializable {
protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();
@Id
private Long transactionId;
private String name;
public Order() {
}
public Order(Long transactionId) {
this.transactionId = transactionId;
}
public Order(Long transactionId, String name) {
this.transactionId = transactionId;
this.name = name;
}
public String getName() {
return name;
}
public void setName(final String name) {
this.name = name;
}
public Long getTransactionId() {
return transactionId;
}
public void setTransactionId(final Long transactionId) {
this.transactionId = transactionId;
}
@Override
public void fromData(PdxReader reader) {
Order order = (Order) pdxSerializer.fromData(Order.class, reader);
if (order != null) {
this.transactionId = order.getTransactionId();
this.name = order.getName();
}
}
@Override
public void toData(PdxWriter writer) {
pdxSerializer.toData(this, writer);
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Order)) {
return false;
}
Order that = (Order) obj;
return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
}
@Override
public int hashCode() {
int hashValue = 17;
hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
return hashValue;
}
@Override
public String toString() {
return String.format("{ @type = %1$s, id = %2$d, name = %3$s }",
getClass().getName(), getTransactionId(), getName());
}
}
public static class OrderPdxSerializer implements PdxSerializer {
@Override
public Object fromData(Class<?> type, PdxReader in) {
if (Order.class.equals(type)) {
return new Order(in.readLong("transactionId"), in.readString("name"));
}
return null;
}
@Override
public boolean toData(Object obj, PdxWriter out) {
if (obj instanceof Order) {
Order order = (Order) obj;
out.writeLong("transactionId", order.getTransactionId());
out.writeString("name", order.getName());
return true;
}
return false;
}
}
public interface OrderRepository extends GemfireRepository<Order, Long> {
Order findByTransactionId(Long transactionId);
}
@Configuration
protected static class GemFireConfiguration {
@Bean
public Properties gemfireProperties() {
Properties gemfireProperties = new Properties();
gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
gemfireProperties.setProperty("mcast-port", "0");
gemfireProperties.setProperty("log-level", "warning");
return gemfireProperties;
}
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
cacheFactoryBean.setProperties(gemfireProperties);
//cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
cacheFactoryBean.setPdxReadSerialized(false);
return cacheFactoryBean;
}
@Bean(name = "Orders")
public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
regionFactoryBean.setCache(gemfireCache);
regionFactoryBean.setName("Orders");
regionFactoryBean.setPersistent(false);
return regionFactoryBean;
}
@Bean
public GemfireRepositoryFactoryBean orderRepository() {
GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
new GemfireRepositoryFactoryBean<>();
repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
return repositoryFactoryBean;
}
}
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented ones (JavaScript, Python, Ruby, etc.) using the Developer REST API, in addition to Java) and also so OQL can be used to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case... essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logical enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
cacheFactoryBean.setProperties(gemfireProperties);
cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
cacheFactoryBean.setPdxReadSerialized(false);
return cacheFactoryBean;
}
You will also notice, I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). This outcome should not happen, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from the findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return based on what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently than direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So, I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type meta-data required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data that the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class, I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under-the-hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automagically convert PDX to an Order.
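One possible shape of that idea, as an untested sketch (it assumes the Orders Region is a Spring bean eligible for proxying, and reuses the JSONFormatter/Jackson round-trip from the test class above):
@Aspect
@Component
public class PdxToOrderAspect {

    private final ObjectMapper objectMapper = new ObjectMapper();

    // Convert any PdxInstance coming back from Region.get(..) into an Order.
    @Around("execution(* com.gemstone.gemfire.cache.Region.get(..))")
    public Object convertPdx(ProceedingJoinPoint joinPoint) throws Throwable {
        Object result = joinPoint.proceed();
        if (result instanceof PdxInstance) {
            return objectMapper.readValue(
                    JSONFormatter.toJSON((PdxInstance) result), Order.class);
        }
        return result;
    }
}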
Though, I don't think any of this should be necessary as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.
We currently have an application which uses multiple databases with the same schema. At the moment we're using a custom solution for switching between them based on the user's session. This works via
public final class DataSourceProxy extends BasicDataSource {
...
Authentication auth = SecurityContextHolder.getContext().getAuthentication();
if (auth != null && auth.getDetails() instanceof Map) {
Map<String, String> details = (Map<String, String>) auth.getDetails();
String targetUrl = details.get("database");
Connection c = super.getConnection();
Statement s = c.createStatement();
s.execute("USE " + targetUrl + ";");
s.close();
return c;
} else {
return super.getConnection();
}
}
Now we want to build a solution using AbstractRoutingDataSource. The problem is:
@Component
public class CustomRoutingDataSource extends AbstractRoutingDataSource {
@Autowired
Environment env;
@Autowired
DbDetailsRepositoy repo;
public CustomRoutingDataSource() {
Map<Object, Object> targetDataSources = new HashMap<Object, Object>();
for (DBDetails dbd : repo.findAll()) {
// create a DataSource for each entry and put it into the map
}
setTargetDataSources(targetDataSources);
}
@Override
protected Object determineCurrentLookupKey() {
Authentication auth = SecurityContextHolder.getContext().getAuthentication();
if (auth != null && auth.getDetails() instanceof Map) {
Map<String, String> details = (Map<String, String>) auth.getDetails();
return details.get("database");
}
return null;
}
}
Inside the constructor (or even via #PostConstruct) we have to fill the targetDataSources Map. But(!) for this we need the connection details which are stored in another database, which has its own DataSource and Entity Manager.
It seems like Spring can't determine the order of Bean construction, or maybe I'm just missing something. It always gives a NullPointerException when accessing the repository (which btw is a JpaRepository).
We're using Spring 3.2.3, Spring Data, Hibernate 4.2. Complete Annotation and Java-Code configuration of Spring and Spring Security.
Please help us!
Spring of course has to call the constructor before it can populate the properties. But that's not a Spring thing, that's basic Java 101 and one of the many downsides of using field injection.
To avoid this, simply add your dependencies to the constructor:
@Component
class CustomRoutingDataSource extends AbstractRoutingDataSource {
@Autowired
public CustomRoutingDataSource(DbDetailsRepository repo, Environment environment) {
…
}
…
}
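A fuller sketch of that idea (DbDetailsRepository and DBDetails are taken from the question, but their accessor names and the BasicDataSource construction are illustrative assumptions):
@Component
public class CustomRoutingDataSource extends AbstractRoutingDataSource {

    @Autowired
    public CustomRoutingDataSource(DbDetailsRepository repo, Environment env) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        for (DBDetails dbd : repo.findAll()) {
            // One DataSource per configured database; accessor names are assumptions.
            BasicDataSource dataSource = new BasicDataSource();
            dataSource.setUrl(dbd.getUrl());
            dataSource.setUsername(dbd.getUsername());
            dataSource.setPassword(dbd.getPassword());
            targetDataSources.put(dbd.getName(), dataSource);
        }
        setTargetDataSources(targetDataSources);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (auth != null && auth.getDetails() instanceof Map) {
            Map<String, String> details = (Map<String, String>) auth.getDetails();
            return details.get("database");
        }
        return null;
    }
}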