Own HttpTraceRepository - spring-boot

I created my own HttpTraceRepository to write and read HttpTrace documents from MongoDB, using a capped collection.
What I don't understand is why HttpTrace is a final class without a public constructor.
In my case, when I read from MongoDB (the findAll method) to find the HTTP traces of the last X hours and show them in the Spring Boot Admin UI, it is impossible to instantiate the HttpTrace objects.
@Component
public class MongoTraceRepository implements HttpTraceRepository {

    private static final String ADMIN_TRACE = "admin.trace";

    private MongoOperations mongoOps;

    @Value("${admin.display.trace.last.x.hours}")
    private int displayTraceLastXHours;

    @Autowired
    public MongoTraceRepository(MongoOperations mongoOps) {
        this.mongoOps = mongoOps;
    }

    @PostConstruct
    public void initialize() {
        MongoCollection<Document> collection = mongoOps.getCollection(ADMIN_TRACE);
        boolean collectionExists = mongoOps.collectionExists(ADMIN_TRACE);
        if (!collectionExists) {
            collection.drop();
            // 100 MB max, 500,000 documents max, capped!
            mongoOps.createCollection(ADMIN_TRACE,
                    CollectionOptions.empty().size(104857600).maxDocuments(500000).capped());
        }
    }

    @Override
    public List<HttpTrace> findAll() {
        Date yesterday = Date.from(LocalDateTime.now().minusHours(displayTraceLastXHours)
                .atZone(ZoneId.systemDefault()).toInstant());
        return mongoOps.find(new Query(where("timestamp").gte(yesterday)), HttpTrace.class, ADMIN_TRACE)
                .stream()
                .sorted((o1, o2) -> o2.getTimestamp().compareTo(o1.getTimestamp()))
                .collect(Collectors.toList());
    }

    @Override
    @Async
    public void add(HttpTrace traceInfo) {
        mongoOps.save(traceInfo, ADMIN_TRACE);
    }
}
This is the error log:
org.springframework.data.mapping.MappingException: No property request found on entity class org.springframework.boot.actuate.trace.http.HttpTrace$Request to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:68)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:49)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:75)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:86)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:273)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:253)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1387)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1334)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1276)
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:71)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:49)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:75)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:86)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:273)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:253)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:202)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:198)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:86)
at org.springframework.data.mongodb.core.MongoTemplate$ReadDocumentCallback.doWith(MongoTemplate.java:2785)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:2448)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2244)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2227)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:770)

There is a constructor specifically suited to creating an HttpTrace from a persistence store.
It was only added in Spring Boot 2.1.0, though.
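For reference, reading the traces back can then go through a custom reading converter that invokes that constructor. The following is only a sketch: it assumes the public constructors Spring Boot 2.1.0 added to HttpTrace and its nested Request/Response types (check the exact signatures for your version), skips the headers, and uses whatever field names your add method happens to persist:
@ReadingConverter
public class HttpTraceReadConverter implements Converter<Document, HttpTrace> {

    @Override
    public HttpTrace convert(Document source) {
        Document req = source.get("request", Document.class);
        Document res = source.get("response", Document.class);
        // Headers are omitted for brevity; rebuild them from the document if you stored them.
        HttpTrace.Request request = new HttpTrace.Request(req.getString("method"),
                URI.create(req.getString("uri")), Collections.emptyMap(), req.getString("remoteAddress"));
        HttpTrace.Response response = new HttpTrace.Response(res.getInteger("status"), Collections.emptyMap());
        return new HttpTrace(request, response, source.getDate("timestamp").toInstant(),
                null, null, source.getLong("timeTaken"));
    }
}
Register the converter through MongoCustomConversions so that mongoOps.find(..., HttpTrace.class, ...) can materialize HttpTrace instances directly.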

Related

How to make AuditorAware work with Spring Data Mongo Reactive

Spring Security 5 provides a ReactiveSecurityContextHolder to fetch the SecurityContext from a Reactive context, but when I implement AuditorAware so that auditing works automatically, it does not work. Currently I cannot find a Reactive variant of AuditorAware.
@Bean
public AuditorAware<Username> auditor() {
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .log()
            .filter(a -> a != null && a.isAuthenticated())
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(auth -> new Username(auth.getName()))
            .switchIfEmpty(Mono.empty())
            .blockOptional();
}
I have added @EnableMongoAuditing on my Boot application class.
On the Mongo document class, I added the audit-related annotations:
@CreatedDate
private LocalDateTime createdDate;

@CreatedBy
private Username author;
When I added a post, the createdDate is filled, but author is null.
{"id":"5a49ccdb9222971f40a4ada1","title":"my first post","content":"content of my first post","createdDate":"2018-01-01T13:53:31.234","author":null}
The complete code is here, based on Spring Boot 2.0.0.M7.
Update: Spring Boot 2.4.0-M2 / Spring Data Commons 2.4.0-M2 / Spring Data Mongo 3.1.0-M2 include a ReactiveAuditorAware; check this new sample. Note: use @EnableReactiveMongoAuditing to activate it.
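With those milestones, the blockOptional() workaround above becomes unnecessary. A minimal sketch of the reactive bean, assuming the same Username wrapper as in the question:
@Bean
public ReactiveAuditorAware<Username> reactiveAuditorAware() {
    // ReactiveAuditorAware returns a Mono, so the security context is resolved lazily, per subscription
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .filter(Authentication::isAuthenticated)
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(user -> new Username(user.getUsername()));
}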
I am posting another solution, which takes the input id into account in order to support update operations:
@Component
@RequiredArgsConstructor
public class AuditCallback implements ReactiveBeforeConvertCallback<AuditableEntity> {

    private final ReactiveMongoTemplate mongoTemplate;

    private Mono<?> exists(Object id, Class<?> entityClass) {
        if (id == null) {
            return Mono.empty();
        }
        return mongoTemplate.findById(id, entityClass);
    }

    @Override
    public Publisher<AuditableEntity> onBeforeConvert(AuditableEntity entity, String collection) {
        var securityContext = ReactiveSecurityContextHolder.getContext();
        return securityContext
                .zipWith(exists(entity.getId(), entity.getClass()))
                .map(tuple2 -> {
                    var auditableEntity = (AuditableEntity) tuple2.getT2();
                    auditableEntity.setLastModifiedBy(tuple2.getT1().getAuthentication().getName());
                    auditableEntity.setLastModifiedDate(Instant.now());
                    return auditableEntity;
                })
                .switchIfEmpty(Mono.zip(securityContext, Mono.just(entity))
                        .map(tuple2 -> {
                            var auditableEntity = (AuditableEntity) tuple2.getT2();
                            String principal = tuple2.getT1().getAuthentication().getName();
                            Instant now = Instant.now();
                            auditableEntity.setLastModifiedBy(principal);
                            auditableEntity.setCreatedBy(principal);
                            auditableEntity.setLastModifiedDate(now);
                            auditableEntity.setCreatedDate(now);
                            return auditableEntity;
                        }));
    }
}
Deprecated: see the updated solution in the original post.
Before the official reactive AuditorAware was provided, there was an alternative: implementing this via Spring Data Mongo's ReactiveBeforeConvertCallback.
Do not use @EnableMongoAuditing.
Implement your own ReactiveBeforeConvertCallback; here I use a PersistentEntity interface for those entities that need to be audited.
public class PersistentEntityCallback implements ReactiveBeforeConvertCallback<PersistentEntity> {

    @Override
    public Publisher<PersistentEntity> onBeforeConvert(PersistentEntity entity, String collection) {
        var user = ReactiveSecurityContextHolder.getContext()
                .map(SecurityContext::getAuthentication)
                .filter(it -> it != null && it.isAuthenticated())
                .map(Authentication::getPrincipal)
                .cast(UserDetails.class)
                .map(userDetails -> new Username(userDetails.getUsername()))
                .switchIfEmpty(Mono.empty());

        var currentTime = LocalDateTime.now();
        if (entity.getId() == null) {
            entity.setCreatedDate(currentTime);
        }
        entity.setLastModifiedDate(currentTime);

        return user
                .map(u -> {
                    if (entity.getId() == null) {
                        entity.setCreatedBy(u);
                    }
                    entity.setLastModifiedBy(u);
                    return entity;
                })
                .defaultIfEmpty(entity);
    }
}
Check the complete code here.
To have the createdBy attribute filled, you need to link your auditorAware bean to the @EnableMongoAuditing annotation.
In your MongoConfig class, define your bean:
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
    ....
}
and use it in the annotation:
@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorAware")
class MongoConfig {
    ....
}
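For the blocking (non-reactive) case, the body of that bean typically looks like the following sketch based on Spring Security's SecurityContextHolder; note that this is exactly the variant that does not work in a reactive pipeline, which is what the question above is about:
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
    // Optional.empty() when there is no authenticated user, so auditing is simply skipped
    return () -> Optional.ofNullable(SecurityContextHolder.getContext().getAuthentication())
            .filter(Authentication::isAuthenticated)
            .map(Authentication::getName);
}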

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating to adopt spring-data-mongodb for a project. In summary, my aim is:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc
The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any;, allowing it to store objects of any class. In my case, it is one of the classes named TSDModule_Name_HereModuleType, which can be browsed here.
Use spring-data-mongodb as persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement objects, which spring-data-mongodb doesn't seem to handle by itself; it throws a CodecConfigurationException: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here; however, it seems I am missing something, since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: this works with version 1.5.6.RELEASE of org.springframework.boot:spring-boot-starter-parent. With version 2.0.0.M3, all hell breaks loose.
It seems that I missed something while trying to add the converter earlier, so I added it like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        //Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        //dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        // compare with equals() rather than ==; string identity is not guaranteed here
        if (XMLConstants.NULL_NS_URI.equals(name.getNamespaceURI())) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate the MongoConverter converter?
MongoConverter is an interface, so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for the convenience of just mapping an existing domain object to the database layer with no boilerplate, but even if you weren't having the JAXB class-structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain models need to change while your persisted data must remain in its existing state. If you just persist the data straight, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context with @Component plus auto-class scanning, or manually via some Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}
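The @Lazy on the MongoConverter parameter matters here: the CustomConversions bean is itself an input to building the MappingMongoConverter, so injecting the converter eagerly would create a circular dependency; deferring its resolution breaks the cycle.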

Read Application Object from GemFire using Spring Data GemFire. Data stored using SpringXD's gemfire-json-server

I'm using the gemfire-json-server module in SpringXD to populate a GemFire grid with JSON representations of “Order” objects. I understand the gemfire-json-server module saves data in PDX form in GemFire. I'd like to read the contents of the GemFire grid into an “Order” object in my application, but I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I'm using the Spring Data GemFire libraries to read the contents of the cluster. The code snippet to read the contents of the grid follows:
public interface OrderRepository extends GemfireRepository<Order, String> {
    Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: The data was initially stored in GemFire using SpringXD's gemfire-json-server-module
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key); but, interestingly enough, if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
    ...
}
This works!
Under-the-hood I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
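For illustration, with that annotation Jackson embeds the concrete class name in the JSON it writes, so the serialized Order looks roughly like this (a sketch; field order and values are illustrative):
{
    "@type": "org.apache.geode.demo.cc.model.Order",
    "transactionId": 1,
    "name": "Amazon Order"
}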
However, the orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {

    protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0l);

    private Order amazon;
    private Order bestBuy;
    private Order target;
    private Order walmart;

    @Autowired
    private OrderRepository orderRepository;

    @Resource(name = "Orders")
    private com.gemstone.gemfire.cache.Region<Long, Object> orders;

    protected Order createOrder(String name) {
        return createOrder(ID_SEQUENCE.incrementAndGet(), name);
    }

    protected Order createOrder(Long id, String name) {
        return new Order(id, name);
    }

    protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
        try {
            if (pdxInstance == null) {
                return null;
            }
            else if (toType.isInstance(pdxInstance)) {
                return toType.cast(pdxInstance);
            }
            else if (pdxInstance instanceof PdxInstance) {
                return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
            }
            else {
                throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
                        pdxInstance.getClass().getName()));
            }
        }
        catch (IOException e) {
            throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
        }
    }

    protected void log(Object value) {
        System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
    }
    protected Order put(Order order) {
        Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
        return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
    }

    protected PdxInstance toPdx(Object obj) {
        try {
            return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
        }
        catch (JsonProcessingException e) {
            throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
        }
    }

    @Before
    public void setup() {
        amazon = put(createOrder("Amazon Order"));
        bestBuy = put(createOrder("BestBuy Order"));
        target = put(createOrder("Target Order"));
        walmart = put(createOrder("Wal-Mart Order"));
    }

    @Test
    public void regionGet() {
        assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
    }

    @Test
    public void repositoryFindOneMethod() {
        log(orderRepository.findOne(target.getTransactionId()));
        assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
    }

    @Test
    public void repositoryQueryMethod() {
        assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
        assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
        assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
        assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
    }
    @Region("Orders")
    @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
    public static class Order implements PdxSerializable {

        protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();

        @Id
        private Long transactionId;

        private String name;

        public Order() {
        }

        public Order(Long transactionId) {
            this.transactionId = transactionId;
        }

        public Order(Long transactionId, String name) {
            this.transactionId = transactionId;
            this.name = name;
        }

        public String getName() {
            return name;
        }

        public void setName(final String name) {
            this.name = name;
        }

        public Long getTransactionId() {
            return transactionId;
        }

        public void setTransactionId(final Long transactionId) {
            this.transactionId = transactionId;
        }

        @Override
        public void fromData(PdxReader reader) {
            Order order = (Order) pdxSerializer.fromData(Order.class, reader);
            if (order != null) {
                this.transactionId = order.getTransactionId();
                this.name = order.getName();
            }
        }

        @Override
        public void toData(PdxWriter writer) {
            pdxSerializer.toData(this, writer);
        }

        @Override
        public boolean equals(Object obj) {
            if (obj == this) {
                return true;
            }
            if (!(obj instanceof Order)) {
                return false;
            }
            Order that = (Order) obj;
            return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
        }

        @Override
        public int hashCode() {
            int hashValue = 17;
            hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
            return hashValue;
        }

        @Override
        public String toString() {
            return String.format("{ @type = %1$s, id = %2$d, name = %3$s }",
                    getClass().getName(), getTransactionId(), getName());
        }
    }
    public static class OrderPdxSerializer implements PdxSerializer {

        @Override
        public Object fromData(Class<?> type, PdxReader in) {
            if (Order.class.equals(type)) {
                return new Order(in.readLong("transactionId"), in.readString("name"));
            }
            return null;
        }

        @Override
        public boolean toData(Object obj, PdxWriter out) {
            if (obj instanceof Order) {
                Order order = (Order) obj;
                out.writeLong("transactionId", order.getTransactionId());
                out.writeString("name", order.getName());
                return true;
            }
            return false;
        }
    }

    public interface OrderRepository extends GemfireRepository<Order, Long> {
        Order findByTransactionId(Long transactionId);
    }
    @Configuration
    protected static class GemFireConfiguration {

        @Bean
        public Properties gemfireProperties() {
            Properties gemfireProperties = new Properties();
            gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
            gemfireProperties.setProperty("mcast-port", "0");
            gemfireProperties.setProperty("log-level", "warning");
            return gemfireProperties;
        }

        @Bean
        public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
            CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
            cacheFactoryBean.setProperties(gemfireProperties);
            //cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
            cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
            cacheFactoryBean.setPdxReadSerialized(false);
            return cacheFactoryBean;
        }

        @Bean(name = "Orders")
        public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
            PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
            regionFactoryBean.setCache(gemfireCache);
            regionFactoryBean.setName("Orders");
            regionFactoryBean.setPersistent(false);
            return regionFactoryBean;
        }

        @Bean
        public GemfireRepositoryFactoryBean orderRepository() {
            GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
                    new GemfireRepositoryFactoryBean<>();
            repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
            return repositoryFactoryBean;
        }
    }
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented (JavaScript, Python, Ruby, etc.) using the Developer REST API, in addition to Java), and also to be able to use OQL to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case... essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logical enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
    CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
    cacheFactoryBean.setProperties(gemfireProperties);
    cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
    cacheFactoryBean.setPdxReadSerialized(false);
    return cacheFactoryBean;
}
You will also notice, I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). The outcome should not be a PdxInstance, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from the findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return based on what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently than direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So, I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type metadata required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data, which the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class, I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under-the-hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automagically convert PDX to an Order.
Though, I don't think any of this should be necessary as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.

DeferredResult in Spring MVC

I have one class that extends DeferredResult and implements Runnable, as shown below:
public class EventDeferredObject<T> extends DeferredResult<Boolean> implements Runnable {

    private Long customerId;
    private String email;

    @Override
    public void run() {
        RestTemplate restTemplate = new RestTemplate();
        EmailMessageDTO emailMessageDTO = new EmailMessageDTO("dineshshe@gmail.com", "Hi There");
        Boolean result = restTemplate.postForObject("http://localhost:9080/asycn/sendEmail", emailMessageDTO, Boolean.class);
        this.setResult(result);
    }

    //Constructor and getter and setters
}
Now I have a controller that returns an object of the above class. Whenever a new request comes to the controller, we check whether that request is present in the HashMap (which stores the unprocessed requests at that instant). If it is not present, we create an EventDeferredObject, store it in the HashMap, and start it on a new thread. If a request of this type is already present, we return it from the HashMap. On completion of a request, we delete it from the HashMap.
@RestController // assumed; Spring needs a controller stereotype to detect this class
@RequestMapping(value = "/sendVerificationDetails")
public class SendVerificationDetailsController {

    private ConcurrentMap<String, EventDeferredObject<Boolean>> requestMap = new ConcurrentHashMap<String, EventDeferredObject<Boolean>>();

    @RequestMapping(value = "/sendEmail", method = RequestMethod.POST)
    public EventDeferredObject<Boolean> sendEmail(@RequestBody EmailDTO emailDTO) {
        EventDeferredObject<Boolean> eventDeferredObject = null;
        System.out.println("Size:" + requestMap.size());
        if (!requestMap.containsKey(emailDTO.getEmail())) {
            eventDeferredObject = new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail());
            requestMap.put(emailDTO.getEmail(), eventDeferredObject);
            Thread t1 = new Thread(eventDeferredObject);
            t1.start();
        }
        else {
            eventDeferredObject = requestMap.get(emailDTO.getEmail());
        }
        eventDeferredObject.onCompletion(new Runnable() {
            @Override
            public void run() {
                if (requestMap.containsKey(emailDTO.getEmail())) {
                    requestMap.remove(emailDTO.getEmail());
                }
            }
        });
        return eventDeferredObject;
    }
}
Now this code works fine as long as no request identical to one already stored in the HashMap comes in. If a number of different requests arrive at the same time, the code works fine.
Well, I do not know if I understood correctly, but I think you might have race conditions in the code, for example here:
if (!requestMap.containsKey(emailDTO.getEmail())) {
    eventDeferredObject = new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail());
    requestMap.put(emailDTO.getEmail(), eventDeferredObject);
    Thread t1 = new Thread(eventDeferredObject);
    t1.start();
}
else {
    eventDeferredObject = requestMap.get(emailDTO.getEmail());
}
Think of a scenario in which you have two requests with the same key emailDTO.getEmail().
Request 1 checks if there is a key in the map, does not find it, and puts it inside.
Request 2 comes some time later, checks if there is a key in the map, finds it, and goes to fetch it; however, just before that, the thread started by request 1 finishes and another thread, started by the onCompletion event, removes the key from the map. At this point,
requestMap.get(emailDTO.getEmail())
will return null, and as a result you will have a NullPointerException.
Now, this does look like a rare scenario, so I do not know if this is the problem you see.
I would try to modify the code as follows (I did not run it myself, so I might have errors):
public class EventDeferredObject<T> extends DeferredResult<Boolean> implements Runnable {

    private Long customerId;
    private String email;
    private ConcurrentMap ourConcurrentMap;

    @Override
    public void run() {
        ...
        this.setResult(result);
        ourConcurrentMap.remove(this.email);
    }

    //Constructor and getter and setters
}
so the DeferredResult implementation has the responsibility to remove itself from the concurrent map. Moreover, I do not use onCompletion to set a callback thread, as it seems to me an unnecessary complication. To avoid the race conditions I talked about before, one needs to combine the verification of the presence of an entry and its fetching into one atomic operation; this is done by the putIfAbsent method of ConcurrentMap. Therefore I change the controller into:
@RequestMapping(value = "/sendVerificationDetails")
public class SendVerificationDetailsController {

    private ConcurrentMap<String, EventDeferredObject<Boolean>> requestMap = new ConcurrentHashMap<String, EventDeferredObject<Boolean>>();

    @RequestMapping(value = "/sendEmail", method = RequestMethod.POST)
    public EventDeferredObject<Boolean> sendEmail(@RequestBody EmailDTO emailDTO) {
        EventDeferredObject<Boolean> eventDeferredObject = new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail(), requestMap);
        EventDeferredObject<Boolean> oldEventDeferredObject = requestMap.putIfAbsent(emailDTO.getEmail(), eventDeferredObject);
        if (oldEventDeferredObject == null) {
            // no value was present before
            Thread t1 = new Thread(eventDeferredObject);
            t1.start();
            return eventDeferredObject;
        }
        else {
            return oldEventDeferredObject;
        }
    }
}
If this does not solve the problem you have, I hope that at least it might give you some ideas.

Can you think of a better way to only load DBUnit once per test class with Spring?

I realise that best practice may advise loading test data in every @Test method; however, this can be painfully slow for DBUnit, so I have come up with the following solution, which:
Only loads a data set once per test class
Supports multiple data sources, and those not named "dataSource", from the ApplicationContext
Does not strictly require rollback of the inserted DBUnit data set
While the code below works, what is bugging me is that my test class has the static method beforeClassWithApplicationContext(), but it cannot belong to an interface because it is static. Therefore my use of Reflection is not type-safe. Is there a more elegant solution?
/**
 * My Test class
 */
@RunWith(SpringJUnit4ClassRunner.class)
@TestExecutionListeners({ DependencyInjectionTestExecutionListener.class, DirtiesContextTestExecutionListener.class, DbunitLoadOnceTestExecutionListener.class })
@ContextConfiguration(locations = { "classpath:resources/spring/applicationContext.xml" })
public class TestClass {

    public static final String TEST_DATA_FILENAME = "Scenario-1.xml";

    public static void beforeClassWithApplicationContext(ApplicationContext ctx) throws Exception {
        DataSource ds = (DataSource) ctx.getBean("dataSourceXyz");
        IDatabaseConnection conn = new DatabaseConnection(ds.getConnection());
        IDataSet dataSet = DbUnitHelper.getDataSetFromFile(conn, TEST_DATA_FILENAME);
        InsertIdentityOperation.CLEAN_INSERT.execute(conn, dataSet);
    }

    @Test
    public void somethingToTest() {
        // do stuff...
    }
}
/**
 * My new custom TestExecutionListener
 */
public class DbunitLoadOnceTestExecutionListener extends AbstractTestExecutionListener {

    final String methodName = "beforeClassWithApplicationContext";

    @Override
    public void beforeTestClass(TestContext testContext) throws Exception {
        super.beforeTestClass(testContext);
        Class<?> clazz = testContext.getTestClass();
        Method m = null;
        try {
            m = clazz.getDeclaredMethod(methodName, ApplicationContext.class);
        }
        catch (Exception e) {
            throw new Exception("Test class must implement " + methodName + "()", e);
        }
        m.invoke(null, testContext.getApplicationContext());
    }
}
One other thought I had was possibly creating a static singleton class for holding a reference to the ApplicationContext and populating it from DbunitLoadOnceTestExecutionListener.beforeTestClass(). I could then retrieve that singleton reference from a standard @BeforeClass method defined on TestClass. My code above calling back into each TestClass just seems a little messy.
After the helpful feedback from Matt and JB, here is a much simpler solution that achieves the desired result:
/**
 * My Test class
 */
@RunWith(SpringJUnit4ClassRunner.class)
@TestExecutionListeners({ DependencyInjectionTestExecutionListener.class, DirtiesContextTestExecutionListener.class })
@ContextConfiguration(locations = { "classpath:resources/spring/applicationContext.xml" })
public class TestClass {

    private static final String TEST_DATA_FILENAME = "Scenario-1.xml";

    // must be static
    private static volatile boolean isDataSetLoaded = false;

    // use the Qualifier to select a specific dataSource
    @Autowired
    @Qualifier("dataSourceXyz")
    private DataSource dataSource;

    /**
     * For performance reasons, we only want to load the DBUnit data set once per test class
     * rather than before every test method.
     *
     * @throws Exception
     */
    @Before
    public void before() throws Exception {
        if (!isDataSetLoaded) {
            isDataSetLoaded = true;
            IDatabaseConnection conn = new DatabaseConnection(dataSource.getConnection());
            IDataSet dataSet = DbUnitHelper.getDataSetFromFile(conn, TEST_DATA_FILENAME);
            InsertIdentityOperation.CLEAN_INSERT.execute(conn, dataSet);
        }
    }

    @Test
    public void somethingToTest() {
        // do stuff...
    }
}
The class DbunitLoadOnceTestExecutionListener is no longer required and has been removed (along with its entry in @TestExecutionListeners). It just goes to show that reading up on all the fancy techniques can sometimes cloud your own judgement :o)
Not a specialist, but couldn't you call an instance method of your test object in prepareTestInstance(), after having verified that it implements the appropriate interface, and call this method only if it is the first time prepareTestInstance is invoked with a test instance of this class? You would just have to keep a set of already-seen classes:
@Override
public void prepareTestInstance(TestContext testContext) throws Exception {
    MyDbUnitTest instance = (MyDbUnitTest) testContext.getTestInstance();
    if (!this.alreadySeenClasses.contains(instance.getClass())) {
        instance.beforeClassWithApplicationContext(testContext.getApplicationContext());
        this.alreadySeenClasses.add(instance.getClass());
    }
}
