My objective is to add a custom convenience method to a Spring Data REST API without creating my own controller.
According to the documentation here, I have extended PagingAndSortingRepository in the following manner:
Repository:
@RepositoryRestResource
public interface PartyRestRepository extends PagingAndSortingRepository<Party, String>, CustomPartyRestRepository {
}
Interface with my method:
public interface CustomPartyRestRepository {
    void dynamicPartyCreation(final String name);
}
Implementation:
public class CustomPartyRestRepositoryImpl implements CustomPartyRestRepository {

    @Autowired
    private PartyService partyService;

    @Autowired
    private PartyRepository partyRepository;

    @Autowired
    private HeroService heroService;

    @Override
    public void dynamicPartyCreation(final String name) {
        final Party party = this.partyService.createParty(name);
        final List<Hero> heroes = IntStream.range(0, 3)
                .mapToObj(i -> this.heroService.createHero(String.format("Hero %d for %s", i, name)))
                .collect(Collectors.toList());
        party.setMembers(heroes);
        this.partyRepository.save(party);
    }
}
When I do GET localhost:8080/profile/parties/, I see that Spring has picked up my method and is exposing it:
<...cut...>
{
"name": "dynamicPartyCreation",
"type": "SAFE"
}
<..cut..>
But I can't seem to use it. GET localhost:8080/parties/dynamicPartyCreation/ results in a 404, as does POST, with or without a body, with or without a query param. A PUT simply creates a party and ignores the /parties/dynamicPartyCreation/ part of the URL (meaning my method isn't called). I have tried a million combinations, but I can't use it.
What am I doing wrong?
Well, interesting, I never thought it could work this way...
Try
return this.partyRepository.save(party);
Although I don't understand how it can work at all, when you basically @Autowired PartyRepositoryCustom into itself...
So maybe it should be something like this:
public class CustomPartyRestRepositoryImpl implements CustomPartyRestRepository {

    @Autowired
    private PartyService partyService;

    @Autowired
    private ListableBeanFactory beanFactory;

    @Autowired
    private HeroService heroService;

    @Override
    public Party dynamicPartyCreation(final String name) {
        final Party party = this.partyService.createParty(name);
        final List<Hero> heroes = IntStream.range(0, 3)
                .mapToObj(i -> this.heroService.createHero(String.format("Hero %d for %s", i, name)))
                .collect(Collectors.toList());
        party.setMembers(heroes);
        return beanFactory.getBean(PartyRepository.class).save(party);
    }
}
If the server doesn't start, try accessing/initialising the PartyService and HeroService via beanFactory too.
Although it should work (I successfully tested a similar method in my current project), I still don't think it's a good idea to implement it this way:
It's not a search (even if it returns a new object).
It changes the DB, so this endpoint should be accessed via the PUT method, not GET.
So I suggest creating a custom controller method instead, and moving the logic into the PartyService class.
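For illustration, such a controller could look roughly like this (a sketch only; the PartyController class and the service-level dynamicPartyCreation method are my assumptions, not code from the question):

@RepositoryRestController
public class PartyController {

    @Autowired
    private PartyService partyService;

    // PUT rather than GET, since the endpoint changes the database
    @PutMapping("/parties/dynamicPartyCreation")
    @ResponseBody
    public Party dynamicPartyCreation(@RequestParam final String name) {
        // hypothetical service method holding the moved logic
        return this.partyService.dynamicPartyCreation(name);
    }
}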
Oh, btw. The URL you were looking for is
GET /parties/search/dynamicPartyCreation/
Related
In general this backstory does not matter, but it helps explain the code below:
The server handles users and user groups. User groups are able to "discover" places - at this point in time these places are coming exclusively from the Google Places API.
Current Implementation
Currently, I have a lot of JpaRepository objects in my service layer, which I call Repositories. I am stressing "Repository" because in my proposed solution below, they'd be downgraded to DAOs.
However, what I do not like in my current code, and also the reason for my question here, is the number of repositories one can find in the UserGroupService.
@Service
public class UserGroupService {

    private final static Logger LOGGER = LogManager.getLogger(UserGroupService.class);

    @Autowired
    private UserGroupRepository userGroupRepository;

    @Autowired
    private UserGroupPlaceRepository userGroupPlaceRepository;

    @Autowired
    private PlaceRepository placeRepository;

    @Autowired
    private GooglePlaceRepository googlePlaceRepository;

    @Autowired
    private GooglePlaces googlePlaces;

    public UserGroupService() {
    }

    @Transactional
    public void discoverPlaces(Long groupId) {
        final UserGroup userGroup = this.userGroupRepository.findById(groupId).orElse(null);
        if (userGroup == null) {
            throw new EntityNotFoundException(String.format("User group with id %s not found.", groupId));
        }

        List<PlacesSearchResult> allPlaces = this.googlePlaces.findPlaces(
                userGroup.getLatitude(),
                userGroup.getLongitude(),
                userGroup.getSearchRadius());

        allPlaces.forEach(googlePlaceResult -> {
            GooglePlace googlePlace = this.googlePlaceRepository.findByGooglePlaceId(googlePlaceResult.placeId);
            if (googlePlace != null) {
                return;
            }

            Place place = new Place();
            place.setLatitude(googlePlaceResult.geometry.location.lat);
            place.setLongitude(googlePlaceResult.geometry.location.lng);
            place.setPlaceType(Place.PlaceType.GOOGLE_PLACE);
            place.setName(googlePlaceResult.name);
            place.setVicinity(googlePlaceResult.vicinity);
            place = this.placeRepository.save(place);

            UserGroupPlace.UserGroupPlaceId userGroupPlaceId = new UserGroupPlace.UserGroupPlaceId();
            userGroupPlaceId.setUserGroup(userGroup);
            userGroupPlaceId.setPlace(place);

            UserGroupPlace userGroupPlace = new UserGroupPlace();
            userGroupPlace.setUserGroupPlaceId(userGroupPlaceId);
            this.userGroupPlaceRepository.save(userGroupPlace);

            googlePlace = new GooglePlace();
            googlePlace.setPlace(place);
            googlePlace.setGooglePlaceId(googlePlaceResult.placeId);
            this.googlePlaceRepository.save(googlePlace);
        });
    }
}
A Solution That Does Not Work
What could make this code a lot simpler, and what had the potential to resolve this mess, is @Inheritance:
@Entity
@Table(name = "place")
@Inheritance(strategy = InheritanceType.JOINED)
public class Place { /* .. */ }

@Entity
@Table(name = "google_place")
public class GooglePlace extends Place { /* .. */ }
However, this is not an option, because then I cannot have a PlaceRepository which saves just a Place. Hibernate does not seem to like it.
My proposal
I think my confusion starts with the names that Spring is using. E.g. JpaRepository - I am not so sure this is actually "the right" name, because as far as I understand, these objects actually work like data access objects (DAOs). I think it should actually look something like this:
public interface PlaceDao extends JpaRepository<Place, Long> {
}

public interface GooglePlaceDao extends JpaRepository<GooglePlace, Long> {
}
@Repository
public class GooglePlaceRepository {

    @Autowired
    private PlaceDao placeDao;

    @Autowired
    private GooglePlaceDao googlePlaceDao;

    public List<GooglePlace> findByGroupId(Long groupId) {
        // ..
    }

    public void save(GooglePlace googlePlace) {
        // ..
    }

    public void saveAll(List<GooglePlace> googlePlaces) {
        // ..
    }
}
@Service
public class UserGroupService {

    @Autowired
    private GooglePlaceRepository googlePlaceRepository;

    @Autowired
    private UserGroupRepository userGroupRepository;

    @Autowired
    private GooglePlaces googlePlaces;

    @Transactional
    public void discoverPlaces(Long groupId) {
        final UserGroup userGroup = this.userGroupRepository.findById(groupId)
                .orElseThrow(() -> new EntityNotFoundException(String.format("User group with id %s not found.", groupId)));

        List<PlacesSearchResult> fetched = this.googlePlaces.findPlaces(
                userGroup.getLatitude(),
                userGroup.getLongitude(),
                userGroup.getSearchRadius());

        // Either do the mapping here or let GooglePlaces return
        // List<GooglePlace> instead of List<PlacesSearchResult>
        List<GooglePlace> places = fetched.stream().map(googlePlaceResult -> {
            GooglePlace googlePlace = this.googlePlaceRepository.findByGooglePlaceId(googlePlaceResult.placeId);
            if (googlePlace != null) {
                return googlePlace;
            }

            Place place = new Place();
            place.setLatitude(googlePlaceResult.geometry.location.lat);
            place.setLongitude(googlePlaceResult.geometry.location.lng);
            place.setPlaceType(Place.PlaceType.GOOGLE_PLACE);
            place.setName(googlePlaceResult.name);
            place.setVicinity(googlePlaceResult.vicinity);

            googlePlace = new GooglePlace();
            googlePlace.setPlace(place);
            googlePlace.setGooglePlaceId(googlePlaceResult.placeId);
            return googlePlace;
        }).collect(Collectors.toList());

        this.googlePlaceRepository.saveAll(places);

        // Add places to group..
    }
}
Summary
I would like to know what I don't see. Am I fighting the framework, or does my data model not make sense, and is that why I find myself struggling with this? Or am I still having issues with how the two patterns "Repository" and "DAO" are supposed to be used?
How would one implement this?
I would say you are correct that there are too many repository dependencies in your service. Personally, I try to keep the number of @Autowired dependencies to a minimum, and I try to use a repository in only one service and expose its higher-level functionality via that service. At our company we call that data sovereignty (in German: Datenhoheit), and its purpose is to ensure that there is only one place in the application where those entities are modified.
From what I understand from your code, I would introduce a PlacesService which has all the dependencies on the PlaceRepository, GooglePlaceRepository and GooglePlaces. If you feel like Service is not the right name, you could also call it PlacesDao, mark it with Spring's @Component annotation and inject all the repositories, which are by definition collections of things:
@Component
public class PlacesDao {

    @Autowired
    private PlaceRepository placeRepository;

    @Autowired
    private GooglePlaceRepository googlePlaceRepository;

    // ...
}
This service/DAO could offer an API like findPlacesForGroup(userGroup) and createNewPlace(...), thus making your forEach loop smaller and more elegant; a sketch follows below.
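Sketched out, those two methods on the PlacesDao above might look like this (findByUserGroupId is an assumed query method, and createNewPlace simply mirrors the mapping from the question's loop):

public List<GooglePlace> findPlacesForGroup(final UserGroup userGroup) {
    // hypothetical derived query on GooglePlaceRepository
    return this.googlePlaceRepository.findByUserGroupId(userGroup.getId());
}

public Place createNewPlace(final PlacesSearchResult result) {
    final Place place = new Place();
    place.setLatitude(result.geometry.location.lat);
    place.setLongitude(result.geometry.location.lng);
    place.setPlaceType(Place.PlaceType.GOOGLE_PLACE);
    place.setName(result.name);
    place.setVicinity(result.vicinity);
    return this.placeRepository.save(place);
}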
On a side note: you can merge your first four lines into just one. Java's Optional supports an orElseThrow() method:
UserGroup userGroup = userGroupRepository.findById(groupId).orElseThrow(() ->
        new EntityNotFoundException(String.format("User group with id %s not found.", groupId)));
The forEach does not look like a good approach to me. You're doing way too much for just a single responsibility of a function. I would refactor this to a standard for loop.
Place place = new Place();
place.setLatitude(googlePlaceResult.geometry.location.lat);
place.setLongitude(googlePlaceResult.geometry.location.lng);
place.setPlaceType(Place.PlaceType.GOOGLE_PLACE);
place.setName(googlePlaceResult.name);
place.setVicinity(googlePlaceResult.vicinity);
place = this.placeRepository.save(place);
This part can easily be a method in a service.
UserGroupPlace.UserGroupPlaceId userGroupPlaceId = new UserGroupPlace.UserGroupPlaceId();
userGroupPlaceId.setUserGroup(userGroup);
userGroupPlaceId.setPlace(place);
UserGroupPlace userGroupPlace = new UserGroupPlace();
userGroupPlace.setUserGroupPlaceId(userGroupPlaceId);
this.userGroupPlaceRepository.save(userGroupPlace);
That part as well.
googlePlace = new GooglePlace();
googlePlace.setPlace(place);
googlePlace.setGooglePlaceId(googlePlaceResult.placeId);
this.googlePlaceRepository.save(googlePlace);
And this part: I don't understand why you're doing this. You could just update the googlePlace instance you loaded from the repo. Hibernate/transactions do the rest for you.
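A sketch of what that means in practice (the method name and the setter used are illustrative, not from the question):

@Transactional
public void renameGooglePlace(final String placeId, final String newName) {
    final GooglePlace googlePlace = this.googlePlaceRepository.findByGooglePlaceId(placeId);
    if (googlePlace != null) {
        // the entity is managed, so Hibernate's dirty checking flushes
        // this change at commit - no explicit save() call required
        googlePlace.getPlace().setName(newName);
    }
}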
The use case is to implement a dirty field tracker. For this I have an interface:
public interface Dirtyable {

    String ID = "dirty";

    Set<String> getDirty();

    static <T> T wrap(final T delegate) {
        return DirtyableInterceptor.wrap(delegate, ReflectionUtils::getPropertyName);
    }

    static <T> T wrap(final T delegate, final Function<Method, String> resolver) {
        return DirtyableInterceptor.wrap(delegate, resolver);
    }
}
In the interceptor class the wrapping method is:
static <T> T wrap(final T delegate, final Function<Method, String> resolver) {
    requireNonNull(delegate, "Delegate must be non-null");
    requireNonNull(resolver, "Resolver must be non-null");

    final Try<Class<T>> delegateClassTry = Try.of(() -> getClassForType(delegate.getClass()));
    return delegateClassTry.flatMapTry(delegateClass ->
            dirtyableFor(delegate, delegateClass, resolver))
            .mapTry(Class::newInstance)
            .getOrElseThrow(t -> new IllegalStateException(
                    "Could not wrap dirtyable for " + delegate.getClass(), t));
}
The method dirtyableFor defines a Byte Buddy subclass which forwards to a specific instance at each call. However, instrumenting at every invocation is a bit expensive, so it caches the instrumented subclass, keyed by the given instance's class. For this I use the resilience4j library (a.k.a. javaslang-circuitbreaker).
private static <T> Try<Class<? extends T>> dirtyableFor(final T delegate,
                                                        final Class<T> clazz,
                                                        final Function<Method, String> resolver) {
    long start = System.currentTimeMillis();
    Try<Class<? extends T>> r = Try.of(() -> ofCheckedSupplier(() ->
            new ByteBuddy().subclass(clazz)
                    .defineField(Dirtyable.ID, Set.class, Visibility.PRIVATE)
                    .method(nameMatches("getDirty"))
                    .intercept(reference(new HashSet<>()))
                    .implement(Dirtyable.class)
                    .method(not(isDeclaredBy(Object.class))
                            .and(not(isAbstract()))
                            .and(isPublic()))
                    .intercept(withDefaultConfiguration()
                            .withBinders(Pipe.Binder.install(Function.class))
                            .to(new DirtyableInterceptor(delegate, resolver)))
                    .make().load(clazz.getClassLoader())
                    .getLoaded())
            .withCache(getCache())
            .decorate()
            .apply(clazz));
    System.out.println("Instrumentation time: " + (System.currentTimeMillis() - start));
    return r;
}

private static <T> Cache<Class<? super T>, Class<T>> getCache() {
    final CachingProvider provider = Caching.getCachingProvider();
    final CacheManager manager = provider.getCacheManager();
    final javax.cache.Cache<Class<? super T>, Class<T>> cache =
            manager.getCache(Dirtyable.ID);
    final Cache<Class<? super T>, Class<T>> dirtyCache = Cache.of(cache);
    dirtyCache.getEventStream().map(Object::toString).subscribe(logger::debug);
    return dirtyCache;
}
From the logs, the instrumentation time drops from 70-100 ms for a cache miss to 0-2 ms for a cache hit.
For completeness here is the interceptor method:
@RuntimeType
@SuppressWarnings("unused")
public Object intercept(final @Origin Method method, final @This Dirtyable dirtyable,
                        final @Pipe Function<Object, Object> pipe) throws Throwable {
    if (ReflectionUtils.isSetter(method)) {
        final String property = resolver.apply(method);
        dirtyable.getDirty().add(property);
        logger.debug("Intercepted setter [{}], resolved property " +
                "[{}] flagged as dirty.", method, property);
    }
    return pipe.apply(this.delegate);
}
This solution works well, except that the DirtyableInterceptor is always the same for cache hits, so the delegate instance is also the same.
Is it possible to bind a forwarder to a supplier of an instance so that intercepted methods would forward to it? How could this be done?
You can create a stateless interceptor by making your intercept method static. To access the object's state, define two fields on your subclass which you access using @FieldValue annotations in your now-static interceptor. Instead of using the FixedValue::reference instrumentation, you would also need to use the FieldAccessor implementation to read the value. You also need to define the fields using the defineField builder method.
You can set these fields either by:
Adding setter methods in your Dirtyable interface and intercepting them using the FieldAccessor implementation.
Defining an explicit constructor to which you supply the values. This also allows you to define the fields to be final. To implement the constructor, you first need to invoke a super constructor and then call the FieldAccessor several times to set the fields.
Doing so, you have created a fully stateless class that you can reuse, but one that you need to initialize. Byte Buddy already offers a built-in TypeCache for easy reuse.
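A rough sketch of that approach (the field names "delegate" and "resolver" and the class names are my assumptions, not verified against the original project):

// Stateless interceptor: per-instance state is read from fields defined
// on the instrumented subclass via @FieldValue.
public class StatelessDirtyableInterceptor {

    @RuntimeType
    @SuppressWarnings("unused")
    public static Object intercept(@Origin final Method method,
                                   @This final Dirtyable dirtyable,
                                   @FieldValue("delegate") final Object delegate,
                                   @FieldValue("resolver") final Function<Method, String> resolver,
                                   @Pipe final Function<Object, Object> pipe) {
        if (ReflectionUtils.isSetter(method)) {
            dirtyable.getDirty().add(resolver.apply(method));
        }
        // forward to the per-instance delegate read from the field
        return pipe.apply(delegate);
    }
}

And the built-in cache, which could replace the resilience4j decorator (again a sketch):

private static final TypeCache<Class<?>> TYPE_CACHE = new TypeCache<>(TypeCache.Sort.SOFT);

static Class<?> instrumentedClassFor(final Class<?> clazz) {
    return TYPE_CACHE.findOrInsert(clazz.getClassLoader(), clazz, () ->
            new ByteBuddy().subclass(clazz)
                    .defineField("delegate", Object.class, Visibility.PRIVATE)
                    .defineField("resolver", Function.class, Visibility.PRIVATE)
                    // ... constructor, Dirtyable implementation and method
                    // interception as in dirtyableFor above ...
                    .make()
                    .load(clazz.getClassLoader())
                    .getLoaded());
}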
I am using Spring version 4.3.3 and Jackson version 2.8.3. I am trying to filter out specific fields from an entity bean based on custom logic that is determined at runtime. The @JsonFilter annotation seems ideal for this type of functionality. The problem is that when I put it at the field or method level, my custom filter never gets invoked. If I put it at the class level, it gets invoked just fine. I don't want to use it at the class level, though, since then I would need to separately maintain the list of hardcoded field names that I want to apply the logic to. As of Jackson 2.3, the ability to put this annotation at the field level is supposed to exist.
Here is the most basic custom filter without any custom logic yet:
public class MyFilter extends SimpleBeanPropertyFilter {

    @Override
    protected boolean include(BeanPropertyWriter beanPropertyWriter) {
        return true;
    }

    @Override
    protected boolean include(PropertyWriter propertyWriter) {
        return true;
    }
}
Then I have the Jackson ObjectMapper configuration:
public class MyObjectMapper extends ObjectMapper {

    public MyObjectMapper() {
        SimpleFilterProvider filterProvider = new SimpleFilterProvider();
        filterProvider.addFilter("myFilter", new MyFilter());
        setFilterProvider(filterProvider);
    }
}
Then finally I have my entity bean:
@Entity
public class Project implements Serializable {

    private Long id;
    private Long version;

    @JsonFilter("myFilter") private String name;
    @JsonFilter("myFilter") private String description;

    // getters and setters
}
If I move the @JsonFilter annotation to the class level, where @Entity is, the filter at least gets invoked; but when it is at the field level, as in the example here, it never gets invoked.
I have the same need, but after examining the unit tests I discovered that this is not the use case covered by annotating a field.
Annotating a field invokes a filter on the value of the field, not on the instance containing the field. For example, imagine you have two classes, A and B, where A contains a field of type B.
class A {
    @JsonFilter("myFilter") B foo;
}
Jackson applies "myFilter" to the fields in B, not in A. Since your example contains fields of type String, which has no fields of its own, Jackson never invokes your filter.
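A constructed example of that semantics, following the explanation above (my own sketch, not code from the question):

A a = new A();
a.foo = new B(); // assume B has public String fields x and y

ObjectMapper mapper = new ObjectMapper();
mapper.setFilterProvider(new SimpleFilterProvider()
        .addFilter("myFilter", SimpleBeanPropertyFilter.serializeAllExcept("y")));

String json = mapper.writeValueAsString(a);
// json ≈ {"foo":{"x":null}} - the filter ran against B's properties,
// never against A's property "foo"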
I have a need to exclude certain fields based on the caller's permissions. For example, an employee's profile may contain his taxpayer id, which is considered sensitive information and should only be serialized if the caller is a member of the Payrole department. Since I'm using Spring Security, I wish to integrate Jackson with the current security context.
public class EmployeeProfile {

    private String givenName;
    private String surname;
    private String emailAddress;

    @VisibleWhen("hasRole('PayroleSpecialist')")
    private String taxpayerId;
}
The most obvious way to do this is Jackson's filter mechanism, but it has a few limitations:
Jackson does not support nested filters so adding an access filter prohibits using filters for any other purpose.
One cannot add Jackson annotations to existing, third-party classes.
Jackson filters are not designed to be generic. The intent is to write a custom filter for each class you wish to apply filtering to. For example, if you need to filter classes A and B, then you have to write an AFilter and a BFilter.
For my use-case, the solution is to use a custom annotation introspector in conjunction with a chaining filter.
public class VisibilityAnnotationIntrospector extends JacksonAnnotationIntrospector {

    private static final long serialVersionUID = 1L;

    @Override
    public Object findFilterId(Annotated a) {
        Object result = super.findFilterId(a);
        if (null != result) return result;

        // By always returning a value, we cause Jackson to query the filter provider.
        // A more sophisticated solution will introspect the annotated class and only
        // return a value if the class contains annotated properties.
        return a instanceof AnnotatedClass ? VisibilityFilterProvider.FILTER_ID : null;
    }
}
The next class is basically a copy of SimpleBeanPropertyFilter that replaces calls to include with calls to isVisible. I'll probably update this to use a Java 8 BiPredicate to make the solution more general, but it works for now.
This class also takes another filter as an argument and will delegate to it the final decision on whether to serialize the field if the field is visible.
public class AuthorizationFilter extends SimpleBeanPropertyFilter {

    private final PropertyFilter antecedent;

    public AuthorizationFilter() {
        this(null);
    }

    public AuthorizationFilter(final PropertyFilter filter) {
        this.antecedent = null != filter ? filter : serializeAll();
    }

    @Deprecated
    @Override
    public void serializeAsField(Object bean, JsonGenerator jgen, SerializerProvider provider, BeanPropertyWriter writer) throws Exception {
        if (isVisible(bean, writer)) {
            this.antecedent.serializeAsField(bean, jgen, provider, writer);
        } else if (!jgen.canOmitFields()) { // since 2.3
            writer.serializeAsOmittedField(bean, jgen, provider);
        }
    }

    @Override
    public void serializeAsField(Object pojo, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (isVisible(pojo, writer)) {
            this.antecedent.serializeAsField(pojo, jgen, provider, writer);
        } else if (!jgen.canOmitFields()) { // since 2.3
            writer.serializeAsOmittedField(pojo, jgen, provider);
        }
    }

    @Override
    public void serializeAsElement(Object elementValue, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (isVisible(elementValue, writer)) {
            this.antecedent.serializeAsElement(elementValue, jgen, provider, writer);
        }
    }

    private static boolean isVisible(Object pojo, PropertyWriter writer) {
        // Code to determine if the field should be serialized.
    }
}
I then add a custom filter provider to each instance of ObjectMapper.
@SuppressWarnings("deprecation")
public class VisibilityFilterProvider extends SimpleFilterProvider {

    private static final long serialVersionUID = 1L;

    static final String FILTER_ID = "dummy-filter-id";

    @Override
    public BeanPropertyFilter findFilter(Object filterId) {
        return super.findFilter(filterId);
    }

    @Override
    public PropertyFilter findPropertyFilter(Object filterId, Object valueToFilter) {
        if (FILTER_ID.equals(filterId)) {
            // This implies that the class did not have an explicit filter annotation.
            return new AuthorizationFilter(null);
        }

        // The class has an explicit filter annotation so delegate to it.
        final PropertyFilter antecedent = super.findPropertyFilter(filterId, valueToFilter);
        return new AuthorizationFilter(antecedent);
    }
}
Finally, I have a Jackson module that automatically registers the custom annotation introspector so I don't have to add it to each ObjectMapper instance manually.
public class FieldVisibilityModule extends SimpleModule {

    private static final long serialVersionUID = 1L;

    public FieldVisibilityModule() {
        super(PackageVersion.VERSION);
    }

    @Override
    public void setupModule(Module.SetupContext context) {
        super.setupModule(context);

        // Append after other introspectors (instead of before) since
        // explicit annotations should have precedence
        context.appendAnnotationIntrospector(new VisibilityAnnotationIntrospector());
    }
}
There are more improvements that can be made and I still have more unit tests to write (e.g., handling arrays and collections) but this is the basic strategy I used.
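For completeness, wiring it all together could look like this (a usage sketch; the populated profile instance is assumed):

EmployeeProfile profile = new EmployeeProfile(); // assume populated
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new FieldVisibilityModule());
mapper.setFilterProvider(new VisibilityFilterProvider());

String json = mapper.writeValueAsString(profile);
// taxpayerId is omitted unless the current caller may see it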
You can try this approach for the same purpose:
@Entity
@Inheritance(
        strategy = InheritanceType.SINGLE_TABLE
)
@DiscriminatorColumn(
        discriminatorType = DiscriminatorType.STRING,
        length = 2
)
@Table(
        name = "project"
)
@JsonTypeInfo(
        use = Id.CLASS,
        include = As.PROPERTY,
        property = "@class"
)
@JsonSubTypes({
        @Type(
                value = BasicProject.class,
                name = "basicProject"
        ),
        @Type(
                value = AdvanceProject.class,
                name = "advanceProject"
        )})
public abstract class Project {

    private Long id;
    private Long version;
}

@Entity
@DiscriminatorValue("AD")
public class AdvanceProject extends Project {

    private String name;
    private String description;
}

@Entity
@DiscriminatorValue("BS")
public class BasicProject extends Project {

    private String name;
}
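With this mapping, the serialized JSON carries the concrete type, so Jackson can round-trip the subtype; roughly (a sketch, assuming the usual getters/setters exist and com.example is the package):

ObjectMapper mapper = new ObjectMapper();
String json = mapper.writeValueAsString(new AdvanceProject());
// json ≈ {"@class":"com.example.AdvanceProject","id":null,"version":null,
//         "name":null,"description":null}

Project back = mapper.readValue(json, Project.class); // an AdvanceProject instance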
I don't think you will make it work. I was trying, and these are the results of my investigation; maybe they will be helpful.
First of all, as @Faron noticed, the @JsonFilter annotation is applied to the class being annotated, not to a field.
Secondly, I see things this way. Let's imagine that somewhere in Jackson's internals you are able to get the actual field. You can figure out whether the annotation is there using the Java Reflection API. You can even get the filter name. Then you get to the filter and pass the field value there. But it happens at runtime; how will you get the corresponding JsonSerializer for the field type if you decide to serialize the field? It is impossible because of type erasure.
The only alternative I see is to forget about dynamic logic. Then you can do the following things:
1) extend JacksonAnnotationIntrospector (almost the same as implementing AnnotationIntrospector, but with no useless default code), overriding the hasIgnoreMarker method. Take a look at this answer. A short sketch follows after this list.
2) criminal starts here. A kinda weird way, taking into account your initial goal, but still: extend BeanSerializerModifier and filter out fields there. An example can be found here. This way you can define a serializer that actually doesn't serialize anything (again, I understand how strange it is, but maybe someone will find it helpful).
3) similar to the approach above: define a useless serializer based on BeanDescription, implementing ContextualSerializer's createContextual method. The example of this magic is here.
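A minimal sketch of option 1 (the @Hidden marker annotation is my invention for illustration):

public class IgnoringIntrospector extends JacksonAnnotationIntrospector {

    @Override
    public boolean hasIgnoreMarker(final AnnotatedMember m) {
        // statically ignore members carrying the (hypothetical) marker
        return m.hasAnnotation(Hidden.class) || super.hasIgnoreMarker(m);
    }
}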
Thanks to this really good blog, I was able to use @JsonView to filter out specific fields from an entity bean based on custom logic determined at runtime.
Since @JsonFilter does not apply to the fields within a class, I found this to be a cleaner workaround.
Here is the sample code:
@Data
@AllArgsConstructor
public class TestEntity {

    private String a;

    @JsonView(CustomViews.SecureAccess.class)
    private Date b;

    @JsonView(CustomViews.SecureAccess.class)
    private Integer c;

    private List<String> d;
}
public class CustomViews {

    public static interface GeneralAccess {}
    public static interface SecureAccess {}

    public static class GeneralAccessClass implements GeneralAccess {}
    public static class SecureAccessClass implements SecureAccess, GeneralAccess {}

    public static Class getWriterView(final boolean hasSecureAccess) {
        return hasSecureAccess
                ? SecureAccessClass.class
                : GeneralAccessClass.class;
    }
}
@Test
public void test() throws JsonProcessingException {
    final boolean hasSecureAccess = false; // Custom logic resolved to a boolean value at runtime.
    final TestEntity testEntity = new TestEntity("1", new Date(), 2, ImmutableList.of("3", "4", "5"));
    final ObjectMapper objectMapper = new ObjectMapper().enable(MapperFeature.DEFAULT_VIEW_INCLUSION);

    final String serializedValue = objectMapper
            .writerWithView(CustomViews.getWriterView(hasSecureAccess))
            .writeValueAsString(testEntity);

    Assert.assertTrue(serializedValue.contains("a"));
    Assert.assertFalse(serializedValue.contains("b"));
    Assert.assertFalse(serializedValue.contains("c"));
    Assert.assertTrue(serializedValue.contains("d"));
}
It appears that updates via MongoOperations do not trigger the events in AbstractMongoEventListener.
This post indicates that was at least the case in Nov 2014
Is there currently any way to listen to update events like the one below? This seems to be quite a big omission if that is the case.
MongoTemplate.updateMulti()
Thanks!
This is no oversight. Events are designed around the lifecycle of a domain object, or a document at least, which means they usually contain an instance of the domain object you're interested in.
Updates, on the other hand, are completely handled in the database, so there are no documents or even domain objects handled in MongoTemplate. Consider this basically the same way JPA @EntityListeners are only triggered for entities that are loaded into the persistence context in the first place, but not when a query is executed, as the execution of the query happens in the database.
I know it's too late to answer this question, but I had the same situation with the MongoTemplate.findAndModify method, and the reason I needed events was for auditing purposes. Here is what I did.
1. Event publisher (which is, of course, one of MongoTemplate's methods):
public class CustomMongoTemplate extends MongoTemplate {

    private ApplicationEventPublisher applicationEventPublisher;

    @Autowired
    public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
        this.applicationEventPublisher = applicationEventPublisher;
    }

    // Default constructor here

    @Override
    public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
        T result = super.findAndModify(query, update, entityClass);

        // Publishing custom event on findAndModify
        if (result != null && result instanceof Parent) // All of my domain classes extend Parent
            this.applicationEventPublisher.publishEvent(new AfterFindAndModify(
                    this,
                    ((Parent) result).getId(),
                    result.getClass().toString())
            );

        return result;
    }
}
2. Application event:
public class AfterFindAndModify extends ApplicationEvent {

    private DocumentAuditLog documentAuditLog;

    public AfterFindAndModify(Object source, String documentId, String documentObject) {
        super(source);
        this.documentAuditLog = new DocumentAuditLog(documentId, documentObject, new Date(), "UPDATE");
    }

    public DocumentAuditLog getDocumentAuditLog() {
        return documentAuditLog;
    }
}
3. Application listener:
public class FindandUpdateMongoEventListner implements ApplicationListener<AfterFindAndModify> {

    @Autowired
    MongoOperations mongoOperations;

    @Override
    public void onApplicationEvent(AfterFindAndModify event) {
        mongoOperations.save(event.getDocumentAuditLog());
    }
}
and then
@Configuration
@EnableMongoRepositories(basePackages = "my.pkg")
@ComponentScan(basePackages = {"my.pkg"})
public class MongoConfig extends AbstractMongoConfiguration {

    //.....

    @Bean
    public FindandUpdateMongoEventListner findandUpdateMongoEventListner() {
        return new FindandUpdateMongoEventListner();
    }
}
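Note that the custom template also has to be registered as the MongoTemplate bean so it actually gets injected; a sketch to add to MongoConfig (the constructor arguments are an assumption based on AbstractMongoConfiguration's factory methods):

@Bean
public MongoTemplate mongoTemplate() throws Exception {
    // assumes CustomMongoTemplate exposes a matching constructor
    return new CustomMongoTemplate(mongoDbFactory(), mappingMongoConverter());
}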
You can listen to database changes, even changes made completely outside your program (MongoDB 4.2 and newer).
(The code is in Kotlin; the same works in Java.)
@Autowired private lateinit var op: MongoTemplate

@PostConstruct
fun listenOnExternalChanges() {
    Thread {
        op.getCollection("Item").watch().onEach {
            if (it.updateDescription.updatedFields.containsKey("name")) {
                println("name changed on a document: ${it.updateDescription.updatedFields["name"]}")
            }
        }
    }.start()
}
This code only works when replication is enabled. You can enable it even when you have a single node:
Add the following replica set details to the mongodb.conf (/etc/mongodb.conf or /usr/local/etc/mongod.conf or C:\Program Files\MongoDB\Server\4.0\bin\mongod.cfg) file:
replication:
replSetName: "local"
Restart the mongo service, then open the mongo console and run this command:
rs.initiate()
I am looking for a transportation layer for GWT. I would like to create AJAX requests using a generic method. For example, this is my DAO/service:
public class GenericDao<T extends GenericModel<T>> {

    private Logger logger = LoggerFactory.getLogger(this.getClass().getCanonicalName());

    @Transient protected Class<T> entityClass;

    public GenericDao() {
        super();
    }

    public GenericDao(Class<? extends GenericModel<T>> clazz) {
        this.entityClass = (Class<T>) clazz;
    }

    public T getBy(Long id) {
        return JPA.em().find(entityClass, id);
    }

    public List<GenericModel<T>> get() {
        logger.error("trying to get data from db");
        return getList();
    }

    public List<GenericModel<T>> getList() {
        return JPA.em().createQuery("FROM " + entityClass.getSimpleName()).getResultList();
    }

    public void save(GenericModel<T> entityClass) {
        JPA.em().getTransaction().begin();
        JPA.em().persist(entityClass);
        JPA.em().getTransaction().commit();
    }

    public void update(T entityClass) {
        JPA.em().getTransaction().begin();
        JPA.em().merge(entityClass);
        JPA.em().getTransaction().commit();
    }

    public void delete(T entityClass) {
        JPA.em().getTransaction().begin();
        JPA.em().remove(entityClass);
        JPA.em().getTransaction().commit();
    }
}
GenericModel/Entity:
@MappedSuperclass
public class GenericModel<T extends GenericModel<T>> implements Identifiable, Versionable {

    @Transient
    protected Class<T> entityClass;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Version
    private Integer version;

    // setter & getter
    @Override
    public Long getId() { return id; }

    public void setId(Long id) { this.id = id; }

    @Override
    public Integer getVersion() { return version; }

    public void setVersion(Integer version) { this.version = version; }

    // constructor
    public GenericModel() {
        Class<?> obtainedClass = getClass();
        Type genericSuperclass = null;
        for (;;) {
            genericSuperclass = obtainedClass.getGenericSuperclass();
            if (genericSuperclass instanceof ParameterizedType) {
                break;
            }
            obtainedClass = obtainedClass.getSuperclass();
        }
        ParameterizedType genericSuperclass_ = (ParameterizedType) genericSuperclass;
        try {
            entityClass = ((Class) ((Class) genericSuperclass_
                    .getActualTypeArguments()[0]));
        } catch (ClassCastException e) {
            entityClass = guessEntityClassFromTypeParametersClassTypedArgument();
        }
    }

    public GenericModel(Long id) {
        this();
        this.id = id;
    }
}
I am looking for a mechanism that will allow me to use this generic service for all models on the client side (each DB entity has an id, so I would like to download all my entities this way using AJAX; I should have only one generic method for that on the client side).
I've already checked:
GWT-RPC
RequestFactory
RestyGWT
But none of them support this feature.
I've found here:
https://www.mail-archive.com/google-web-toolkit@googlegroups.com/msg100095.html
the information that gwt-jackson supports generics and polymorphism. Unfortunately, I didn't find any working example of that. Can someone help, give an example, or confirm that information?
All entities have an id and a version parameter. So I would like to have one method on the client-side RF that will allow me to get that entity from the server (service/DAO/whatever) by id - like this: Request getBy(Long id); But unfortunately I can't make it work. I like the RF way, so I've tried it first. Generally, I don't want to repeat code for downloading an entity/proxy by id.
For better understanding, please look also on:
RequestFactory client-side inheritance of typed class with generics
I'm confused as to why you think RPC can't handle generics - according to your link, it can, but RestyGWT cannot. Granted, none of your JPA references make any sense in GWT, but those would live in a DAO on the server, not in the entity/model class itself, or at least not in the client version. If you had an RPC method that returned T where <T extends GenericModel<T>>, then you would have serializers for every possible GenericModel<?> subtype, and any/all that are GWT-compatible could be sent over the wire.
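A sketch of what such an RPC declaration might look like (service and method names are illustrative, not from the question):

// GWT-RPC can declare the supertype and still transport any
// GWT-compatible subtype over the wire.
@RemoteServiceRelativePath("models")
public interface ModelService extends RemoteService {
    GenericModel<?> getBy(String entityName, Long id);
}

public interface ModelServiceAsync {
    void getBy(String entityName, Long id, AsyncCallback<GenericModel<?>> callback);
}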
Edit from update to question:
Your GenericModel class uses features of Java that cannot work in GWT, such as reflection. This cannot be compiled by GWT, since the compiler relies on removing reflection information to minimize your compiled size - leaving in general reflection information means leaving in details about all classes and members, even ones that it can't statically prove are in use, since some reflection might make use of them.
If there is a way to phrase your model object in a way that just deals with the data at hand, focus on that. Otherwise consider a DTO which is just the data to send over the wire - I'm not sure how you would plan to use the entityClass field on the client, or why that would be important to read from the superclass's generics instead of just using getClass().
RequestFactory will have a hard time dealing with generics - unlike RPC (and possibly RestyGWT) it cannot handle polymorphism the way you want, but will instead only send the fields for the declared type, not any arbitrary subtype. RPC will actually send the instance if it is something that the client can handle.