I am trying to implement a QueryDslMongoRepository for a model "Document":
@QueryEntity
@Document(collection = "currentDocuments")
public class DocumentImpl extends TranslatableObjectImpl implements Document
In our current implementation, a document that is about to be deleted moves from the "currentDocuments" collection into the "deletedDocuments" collection.
I can't find a way to create a repository like this
public interface DocumentRepository extends MongoRepository<DocumentImpl, String>, QueryDslPredicateExecutor<DocumentImpl> {}
with a dynamic collection name.
My goal is to keep the advantages of Querydsl in one repository that works against different collections, and to be able to move models from one collection into another, for example
public void move(DocumentImpl entity, String sourceCollection, String targetCollection) {
    repository.delete(entity, sourceCollection);
    repository.save(entity, targetCollection);
}
or something like
public List<Document> findAllDocumentsWithAttachments(String collectionName) {
    return repository.findAll(QDocumentImpl.documentImpl.attachments.isNotEmpty(), collectionName);
}
Any suggestions?
I implemented this feature by creating my own FactoryBean extending MongoRepositoryFactoryBean.
Based on the answer to the linked question, I implemented the following solution.
Entity
@QueryEntity
public class Document extends AbstractObject {
}
Custom QuerydslMongoRepository
public interface CustomQuerydslMongoRepository<T extends AbstractObject, ID extends Serializable> extends MongoRepository<T, ID>, QueryDslPredicateExecutor<T> {

    List<T> findAll(Predicate predicate, String collectionName);

    T save(T entity, String collectionName);

    ...
}
Custom QuerydslMongoRepository Implementation
public class CustomQuerydslMongoRepositoryImpl<T extends AbstractObject, ID extends Serializable> extends QueryDslMongoRepository<T, ID> implements CustomQuerydslMongoRepository<T, ID> {

    // All instance variables are available in super, but they are private
    private static final EntityPathResolver DEFAULT_ENTITY_PATH_RESOLVER = SimpleEntityPathResolver.INSTANCE;

    private final EntityPath<T> path;
    private final PathBuilder<T> pathBuilder;
    private final MongoOperations mongoOperations;

    public CustomQuerydslMongoRepositoryImpl(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
        this(entityInformation, mongoOperations, DEFAULT_ENTITY_PATH_RESOLVER);
    }

    public CustomQuerydslMongoRepositoryImpl(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations, EntityPathResolver resolver) {
        super(entityInformation, mongoOperations, resolver);
        this.path = resolver.createPath(entityInformation.getJavaType());
        this.pathBuilder = new PathBuilder<T>(path.getType(), path.getMetadata());
        this.mongoOperations = mongoOperations;
    }

    @Override
    public List<T> findAll(Predicate predicate, String collectionName) {
        MongodbQuery<T> query = createQueryFor(predicate, collectionName);
        return query.list();
    }

    @Override
    public T save(T entity, String collectionName) {
        Assert.notNull(entity, "Entity must not be null!");
        mongoOperations.save(entity, collectionName);
        return entity;
    }

    private MongodbQuery<T> createQueryFor(Predicate predicate, String collectionName) {
        Class<T> domainType = getEntityInformation().getJavaType();
        MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), domainType, collectionName);
        return query.where(predicate);
    }
}
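The move use case from the question also needs a collection-aware delete, which the excerpt above does not show. A minimal sketch of what could be added to the implementation, assuming MongoOperations#remove(Object, String) is used (the matching declaration would also have to be added to CustomQuerydslMongoRepository):
@Override
public void delete(T entity, String collectionName) {
    Assert.notNull(entity, "Entity must not be null!");
    // removes the entity (matched by its id) from the given collection
    mongoOperations.remove(entity, collectionName);
}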
Custom Repository Factory
public class CustomQueryDslMongodbRepositoryFactoryBean<R extends QueryDslMongoRepository<T, I>, T, I extends Serializable> extends MongoRepositoryFactoryBean<R, T, I> {

    @Override
    protected RepositoryFactorySupport getFactoryInstance(MongoOperations operations) {
        return new CustomQueryDslMongodbRepositoryFactory<T, I>(operations);
    }

    public static class CustomQueryDslMongodbRepositoryFactory<T, I extends Serializable> extends MongoRepositoryFactory {

        private MongoOperations operations;

        public CustomQueryDslMongodbRepositoryFactory(MongoOperations mongoOperations) {
            super(mongoOperations);
            this.operations = mongoOperations;
        }

        @SuppressWarnings({ "rawtypes", "unchecked" })
        protected Object getTargetRepository(RepositoryMetadata metadata) {
            return new CustomQuerydslMongoRepositoryImpl(getEntityInformation(metadata.getDomainType()), operations);
        }

        protected Class<?> getRepositoryBaseClass(RepositoryMetadata metadata) {
            return CustomQuerydslMongoRepository.class;
        }
    }
}
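For the custom factory bean to be picked up, it has to be registered with the repository infrastructure. A minimal sketch, assuming Java configuration and a hypothetical base package:
@Configuration
@EnableMongoRepositories(
        basePackages = "com.example.documents",   // hypothetical package
        repositoryFactoryBeanClass = CustomQueryDslMongodbRepositoryFactoryBean.class)
public class MongoRepositoryConfig {
}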
Entity Repository
public interface DocumentRepository extends CustomQuerydslMongoRepository<Document, String> {
}
Usage in Service
@Autowired
DocumentRepository repository;

public List<Document> getAllDocuments(Predicate predicate) {
    return repository.findAll(predicate, "myCustomCollection");
}
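With the collection-aware save shown above and the delete variant sketched earlier, the move operation from the question could then look roughly like this (a sketch, not part of the original solution):
public void move(Document entity, String sourceCollection, String targetCollection) {
    // remove the document from the old collection, then persist it into the new one
    repository.delete(entity, sourceCollection);
    repository.save(entity, targetCollection);
}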
Related
So I'm new to unit testing.
I'm trying to test the behavior of the findAll() method in IngredientServiceImplTest.
The problem I'm facing has to do with the return value of the mocked method: it always returns empty, and therefore the exception is thrown.
Can someone tell me what I'm missing?
Testing class.
@ExtendWith(MockitoExtension.class)
class IngredientServiceImplTest {

    @Mock
    private MenuItemIngredientRepository menuItemIngredientRepository;

    @Mock
    private IngredientRepository ingredientRepository;

    @InjectMocks
    private IngredientServiceImpl ingredientService;

    @Mock
    private JpaRepository<Ingredient, Long> jpaRepository;

    @Mock
    private BaseMapper<IngredientCreateDto, IngredientUpdateDto, IngredientResponseDto, Ingredient> baseMapper;

    @BeforeEach
    void init() {
        ingredientService.jpaRepository = jpaRepository;
        ingredientService.baseMapper = baseMapper;
    }

    @Test
    void When_FindById_ReturnIngredient() {
        Ingredient ingredient = new Ingredient();
        ingredient.setId(1L);
        ingredient.setName("Name");
        IngredientCreateDto ingredientCreateDto = new IngredientCreateDto();
        ingredientCreateDto.setName("Name");
        when(jpaRepository.findById(ingredient.getId())).thenReturn(Optional.of(ingredient));
        when(baseMapper.createDtoToEntity(ingredientCreateDto)).thenReturn(ingredient);
        assertEquals(ingredientService.findById(ingredient.getId()).getName(), ingredient.getName());
    }
}
Base service class
@AllArgsConstructor
@NoArgsConstructor
public class BaseServiceImpl<CREATE_DTO, UPDATE_DTO, RESPONSE_DTO, ENTITY> implements BaseService<CREATE_DTO, UPDATE_DTO, RESPONSE_DTO, ENTITY> {

    @Autowired
    protected JpaRepository<ENTITY, Long> jpaRepository;

    @Autowired
    protected BaseMapper<CREATE_DTO, UPDATE_DTO, RESPONSE_DTO, ENTITY> baseMapper;

    @Override
    public List<RESPONSE_DTO> findAll() {
        return jpaRepository.findAll()
                .stream()
                .map(baseMapper::entityToResponseDto)
                .collect(Collectors.toList());
    }

    @Override
    public RESPONSE_DTO findById(Long id) {
        return jpaRepository.findById(id)
                .map(baseMapper::entityToResponseDto)
                .orElseThrow(() -> {
                    throw new RuntimeException("Entity with id: " + id + " does not exist!");
                });
    }

    @Override
    public RESPONSE_DTO save(CREATE_DTO entity) {
        return baseMapper.entityToResponseDto(jpaRepository.save(baseMapper.createDtoToEntity(entity)));
    }
}
Ingredient Service class
@Service
@AllArgsConstructor
@Validated
public class IngredientServiceImpl extends BaseServiceImpl<IngredientCreateDto, IngredientUpdateDto, IngredientResponseDto, Ingredient> implements IngredientService {

    private final MenuItemIngredientRepository menuItemIngredientRepository;
    private final IngredientRepository ingredientsRepository;

    @Override
    public IngredientResponseDto update(Long id, IngredientUpdateDto ingredient) {
        super.findById(id);
        Ingredient entityIngredient = baseMapper.updateDtoToEntity(ingredient);
        entityIngredient.setId(id);
        entityIngredient.setUpdatedAt(LocalDateTime.now());
        return baseMapper.entityToResponseDto(jpaRepository.save(entityIngredient));
    }

    @Override
    public List<IngredientResponseDto> findTopIngredients(Integer n) {
        return menuItemIngredientRepository.findTopIngredients(n)
                .stream()
                .map(id -> baseMapper.entityToResponseDto(jpaRepository.getOne(id)))
                .collect(Collectors.toList());
    }

    @Override
    public List<IngredientResponseDto> findAllByFilter(IngredientFilter ingredientFilter) {
        return ingredientsRepository.findAllByFilter(ingredientFilter)
                .stream()
                .map(ingredient -> baseMapper.entityToResponseDto(ingredient))
                .collect(Collectors.toList());
    }
}
I'm using spring-data-rest and I have a JpaRepository like this:
@RepositoryRestResource(path = "projects")
public interface ProjectsRepository extends JpaRepository<MetricsProjects, Integer> {...}
My repository interface:
@RepositoryRestResource(path = "projects")
public interface ProjectsRepository extends JpaRepository<MetricsProjects, Integer> {

    List<MetricsProjects> findByProjectName(String projectName);

    @Override
    @RestResource(exported = false)
    public void deleteById(Integer id);

    @Override
    @RestResource(exported = false)
    public void delete(MetricsProjects entity);

    @Override
    @RestResource(exported = false)
    public void deleteAll(Iterable<? extends MetricsProjects> entities);

    @Override
    @RestResource(exported = false)
    public void deleteAll();

    @Override
    @RestResource(exported = false)
    public void deleteInBatch(Iterable<MetricsProjects> entities);

    @Override
    @RestResource(exported = false)
    public void deleteAllInBatch();
}
I've also added disableDefaultExposure(), as suggested somewhere.
My Configuration file:
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration restConfig) {
        restConfig.disableDefaultExposure();
    }
}
But I still see the DELETE methods exposed in my Swagger UI. How do I prevent this?
Create a controller method for the DELETE endpoint and return 405 Method Not Allowed.
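A minimal sketch of that approach, assuming a @RepositoryRestController so the mapping takes precedence over the repository-backed endpoint (controller and path are illustrative):
@RepositoryRestController
public class MetricsProjectsDeleteBlocker {

    // Shadows the exported item DELETE endpoint and always answers 405 Method Not Allowed
    @DeleteMapping("/projects/{id}")
    public ResponseEntity<Void> blockDelete(@PathVariable Integer id) {
        return ResponseEntity.status(HttpStatus.METHOD_NOT_ALLOWED).build();
    }
}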
Alternatively, modify configureRepositoryRestConfiguration to
@Override
public void configureRepositoryRestConfiguration(RepositoryRestConfiguration restConfig) {
    restConfig.getExposureConfiguration()
            .forDomainType(put the class type here)
            .withItemExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.DELETE))
            .withCollectionExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.DELETE));
}
Replace "put the class type here" with the actual domain class; for example, if the class is MetricsProjects, use MetricsProjects.class there.
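Applied to the ProjectsRepository above, that would be, for example:
restConfig.getExposureConfiguration()
        .forDomainType(MetricsProjects.class)
        .withItemExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.DELETE))
        .withCollectionExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.DELETE));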
Here is additional info.
I would like to save a new entity using a HazelcastRepository.
When the id is null, the KeyValueTemplate uses SecureRandom and generates an id like -123123123123123123.
I don't want to save ids like that; instead, I would like to get the id from a sequence in the database and put it into the map.
I have found two solutions:
1) In AdminService, get the next value from the sequence in the database and set it.
2) Create an atomic counter on the Hazelcast server and initialize it with the current value of the sequence. In AdminService, get the counter, increment the value and set the id.
But they are not very pretty.
Do you have any other ideas?
The code:
@Configuration
@EnableHazelcastRepositories(basePackages = "com.test")
public class HazelcastConfig {

    @Bean
    public HazelcastInstance hazelcastInstance(ClientConfig clientConfig) {
        return HazelcastClient.newHazelcastClient(clientConfig);
    }

    @Bean
    @Qualifier("client")
    public ClientConfig clientConfig() {
        ClientConfig clientConfig = new ClientConfig();
        clientConfig.setClassLoader(HazelcastConfig.class.getClassLoader());
        ClientNetworkConfig networkConfig = clientConfig.getNetworkConfig();
        networkConfig.addAddress("127.0.0.1:5701");
        networkConfig.setConnectionAttemptLimit(20);
        return clientConfig;
    }

    @Bean
    public KeyValueTemplate keyValueTemplate(ClientConfig clientConfig) {
        return new KeyValueTemplate(new HazelcastKeyValueAdapter(hazelcastInstance(clientConfig)));
    }
}
@Service
@RequiredArgsConstructor
public class AdminService {

    private final UserRepository userRepository;
    ...

    @Transactional
    public User addOrUpdateUser(UserUpdateDto dto) {
        validate(dto);
        User user = dto.getId() != null ? userService.getUser(dto.getId()) : new User();
        mapUser(user, dto);
        return userRepository.save(user);
    }
    ...
}
@Repository
public interface UserRepository extends HazelcastRepository<User, Long> {
}

@KeySpace("users")
@Entity
@Table(name = "users")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class User extends DateAudit implements Serializable {

    @javax.persistence.Id
    @org.springframework.data.annotation.Id
    // @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "user_generator")
    // @SequenceGenerator(name = "user_generator", sequenceName = "user_seq")
    private Long id;
    ...
}
Hazelcast server:
@Component
@Slf4j
public class UserLoader implements ApplicationContextAware, MapStore<Long, User> {

    private static UserJpaRepository userJpaRepository;

    @Override
    public User load(Long key) {
        log.info("load({})", key);
        return userJpaRepository.findById(key).orElse(null);
    }

    @Override
    public Map<Long, User> loadAll(Collection<Long> keys) {
        Map<Long, User> result = new HashMap<>();
        for (Long key : keys) {
            User user = this.load(key);
            if (user != null) {
                result.put(key, user);
            }
        }
        return result;
    }

    @Override
    public Iterable<Long> loadAllKeys() {
        return userJpaRepository.findAllId();
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        userJpaRepository = applicationContext.getBean(UserJpaRepository.class);
    }

    @Override
    public void store(Long aLong, User user) {
        userJpaRepository.save(user);
    }

    @Override
    public void storeAll(Map<Long, User> map) {
        for (Map.Entry<Long, User> mapEntry : map.entrySet()) {
            store(mapEntry.getKey(), mapEntry.getValue());
        }
    }

    @Override
    public void delete(Long aLong) {
        userJpaRepository.deleteById(aLong);
    }

    @Override
    public void deleteAll(Collection<Long> collection) {
        collection.forEach(this::delete);
    }
}
public interface UserJpaRepository extends CrudRepository<User, Long> {

    @Query("SELECT u.id FROM User u")
    Iterable<Long> findAllId();
}
I think that there is no better way than what you described.
I'd go with the second solution, because then you are at least only coupled to the Hazelcast server.
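A rough sketch of that second option, assuming Hazelcast 3.x (where HazelcastInstance#getAtomicLong is still available) and a hypothetical counter name; the counter would have to be seeded once with the current value of the database sequence:
@Service
@RequiredArgsConstructor
public class UserIdGenerator {

    private final HazelcastInstance hazelcastInstance;

    // cluster-wide counter, seeded elsewhere from the DB sequence
    public Long nextId() {
        return hazelcastInstance.getAtomicLong("user-id-sequence").incrementAndGet();
    }
}
In AdminService, a new User would then get userIdGenerator.nextId() set as its id before userRepository.save(user), so the KeyValueTemplate never has to generate a random id.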
I upgraded from Spring Boot 1.5.8 to 2.1.5. When I try to start the application I get the following error:
IllegalArgumentException: Fragment implementation .OtmUniParentRepository2019GeneratedImpl$$EnhancerBySpringCGLIB$$cdf9e294 does not implement x.OtmUniParentRepository2019Generated!
Why can't I start it anymore?
The files:
OtoUniChildRepository2019
@RepositoryRestResource(collectionResourceRel = "OtoUniChild2019", path = "OtoUniChild2019")
public interface OtoUniChildRepository2019 extends OtoUniChildRepository2019Generated {
}

@Transactional
class OtoUniChildRepository2019Impl extends HeartcoreRepositoryImpl<OtoUniChild> {

    @PostConstruct
    private void setIni() {
        super.setIni(OtoUniChild.TABLENAME, OtoUniChild.getColumnName(), OtoUniChild.class, "AGRIDB2019");
    }
}
OtoUniChildRepository2019Generated
public interface OtoUniChildRepository2019Generated extends HeartcoreRepository<OtoUniChild> {

    OtoUniChild findByIdAndOtoUniParentIsNotNull(@Param("id") String id);

    OtoUniChild findByOtoUniParentId(@Param("id") String id);
}

@Transactional
class OtoUniChildRepository2019GeneratedImpl extends HeartcoreRepositoryImpl<OtoUniChild> {

    @PostConstruct
    private void setIni() {
        super.setIni(OtoUniChild.TABLENAME, OtoUniChild.getColumnName(), OtoUniChild.class, "AGRIDB2019");
    }
}
HeartcoreRepository
@NoRepositoryBean
public interface HeartcoreRepository<T extends Heartcore> extends RevisionRepository<T, String, Integer>, PagingAndSortingRepository<T, String>, HeartcoreCustomRepository<T> {

    @Override
    T findOne(String id);

    boolean existsById(String id);

    @Override
    Collection<T> findAll();

    List<T> findAllByKanton(@Param("kanton") String kanton);
}
HeartcoreCustomRepository
public interface HeartcoreCustomRepository<T extends Heartcore> {
    List<T> findCustom(String sqlQuery);
    List<T> findCustom(String select, String where);
    Class<T> getType();
    T findOne(String id);
    Collection<T> findAll();
    String getSequence(String sequenceName);
}
HeartcoreCustomRepositoryImpl
@Transactional
public class HeartcoreRepositoryImpl<T extends Heartcore> implements HeartcoreCustomRepository<T> {

    @PersistenceContext
    protected EntityManager entityManager;

    // irrelevant code

    public void setIni(String tablename, List<String> columns, Class<T> type, String schema) {
        this.tablename = tablename;
        this.columns = columns;
        this.type = type;
        this.schema = schema;
        MultitenantDataSource multitenantDataSource = (MultitenantDataSource) entityManager.getEntityManagerFactory().getProperties().get("hibernate.connection.datasource");
        DataSource dataSource = (DataSource) multitenantDataSource.determineTargetDataSource();
        try {
            this.dbDriver = dataSource.getConnection().getMetaData().getDriverName();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    // irrelevant code
With 1.5.8 it works fine, and I couldn't find any information about breaking changes.
EDIT: Is something wrong with this inheritance structure of the repositories? I tried several different approaches, but none worked. Is there another way to implement some basic functionality for repositories?
I have my own custom Spring Data common repository to provide common behavior to all Spring Data repositories. All I need is to modify the EntityManager when a repository is being created, but I can't inject a Spring bean into my JpaRepositoryFactoryBean because the factory is created via the new operator.
public class BasicJpaRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable> extends JpaRepositoryFactoryBean<T, S, ID> {

    @Autowired
    private SomeService service; // - it does not work

    @Override
    protected RepositoryFactorySupport createRepositoryFactory(EntityManager em) {
        // do some logic here
        service.doSmth();
        return new CommonRepositoryFactory<>(em);
    }

    private static class CommonRepositoryFactory<T, I extends Serializable> extends JpaRepositoryFactory {

        private final EntityManager em;

        public CommonRepositoryFactory(EntityManager em) {
            super(em);
            this.em = em;
        }

        @SuppressWarnings("unchecked")
        protected Object getTargetRepository(RepositoryMetadata metadata) {
            JpaEntityInformation entityInformation = getEntityInformation(metadata.getDomainType());
            return new CommonRepositoryImpl(entityInformation, em);
        }

        protected Class<?> getRepositoryBaseClass(RepositoryMetadata metadata) {
            return CommonRepositoryImpl.class;
        }
    }
}
Implement a setter in that class or one that extends from it.
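A minimal sketch of that suggestion, assuming a recent Spring Data version (where the factory bean constructor takes the repository interface) and keeping the SomeService and CommonRepositoryFactory from the question:
public class BasicJpaRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable>
        extends JpaRepositoryFactoryBean<T, S, ID> {

    private SomeService service;

    public BasicJpaRepositoryFactoryBean(Class<? extends T> repositoryInterface) {
        super(repositoryInterface);
    }

    // setter injection instead of the field injection that did not work above
    @Autowired
    public void setSomeService(SomeService service) {
        this.service = service;
    }

    @Override
    protected RepositoryFactorySupport createRepositoryFactory(EntityManager em) {
        service.doSmth();
        return new CommonRepositoryFactory<>(em); // inner factory as defined in the question
    }
}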