How to use @Autowired in a class annotated with @Entity? - spring

I have an entity called TimeBooking. When I request this entity and return it to the client, I want to fetch a list of ActivityTimeBookings from a repository. But when the method gets called, the repository is null.
So I tried to @Autowired the repository, marked it as transient, and told Spring that there is a dependency that should be injected.
@Configurable(preConstruction = true)
@Entity
public class TimeBooking extends BaseEntity {

    @Autowired
    private transient ActivityTimeBookingRepository activityTimeBookingRepository;

    ...

    @JsonProperty("activityTimeBookings")
    private List<ActivityTimeBooking> activityTimeBookings() {
        return this.activityTimeBookingRepository.findByDate(this.timeFrom);
    }
}
Any suggestions?

Using @Autowired in a class annotated with @Entity is a bad practice.
The solution is given below:
1. Create a service interface:
public interface TimeBookingService {
    // the date is passed in by the caller (e.g. the entity's timeFrom value)
    List<ActivityTimeBooking> activityTimeBookings(LocalDateTime timeFrom);
}
2. Create an implementation of the service interface:
@Service
public class TimeBookingServiceImpl implements TimeBookingService {

    @Autowired
    private ActivityTimeBookingRepository activityTimeBookingRepository;

    @Override
    public List<ActivityTimeBooking> activityTimeBookings(LocalDateTime timeFrom) {
        // the service has no timeFrom field of its own, so the date is passed in
        return this.activityTimeBookingRepository.findByDate(timeFrom);
    }
}
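With this in place, the entity stays a plain data holder and the lookup happens in the web/service layer. A minimal sketch of how a caller might use the service (the controller, the TimeBookingRepository, and the getTimeFrom() accessor are assumptions for illustration, not part of the original code):
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class TimeBookingController {

    @Autowired
    private TimeBookingRepository timeBookingRepository; // assumed Spring Data repository for TimeBooking

    @Autowired
    private TimeBookingService timeBookingService;

    @GetMapping("/time-bookings/{id}")
    public Map<String, Object> get(@PathVariable Long id) {
        TimeBooking booking = timeBookingRepository.findById(id)
                .orElseThrow(() -> new IllegalArgumentException("No TimeBooking with id " + id));
        // the service does the repository lookup instead of the entity
        List<ActivityTimeBooking> activities =
                timeBookingService.activityTimeBookings(booking.getTimeFrom()); // getTimeFrom() assumed

        Map<String, Object> response = new HashMap<>();
        response.put("timeBooking", booking);
        response.put("activityTimeBookings", activities);
        return response;
    }
}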

Injecting something into JPA entities is indeed usually a bad practice.
Entities are typically created by the JPA implementation (like Hibernate), and Spring, as a DI framework, doesn't really participate in that process.
Note also that a query can create many instances of this class, so if you later serialize a list of these objects you may end up running N queries against the database when N such entities were retrieved.
Answering your question about "getting access to the repo", I believe you should consider refactoring.
In the service class (assuming you have a "regular" controller, service and dao layering), you can do something like:
class MyService {

    SomeResult doSomething() {
        List<TimeBooking> allTimeBookings = dao.getAllTimeBookings();
        LocalDateTime timeFrom = calculateTimeFrom(allTimeBookings);
        List<ActivityTimeBooking> allActivityTimeBookings = dao.findByDate(timeFrom);
        return calculateResults(allTimeBookings, allActivityTimeBookings);
    }
}

class MyDao {
    List<ActivityTimeBooking> findByDate(LocalDateTime timeFrom) {...}
    List<TimeBooking> getAllTimeBookings() {...}
}
Regarding the service implementation, I've assumed this use case can't be covered by a usual "JOIN between two tables", so creating an association between TimeBooking and ActivityTimeBooking is not an option.
Note 2: I've used one repository (dao) for brevity; in a real application you might want to inject two different repositories into the service.

Related

How to avoid Spring Repository<T, ID> to leak persistence information into service tier

I'm using spring-data-mongodb at the moment, so this question is primarily in the context of MongoDB, but I suspect it applies to repository code in general.
Out of the box, when using a MongoRepository<T, ID> interface (or any other Repository<T, ID> descendant), the entity type T is expected to be the document type (the type that defines the document schema).
As a result, injecting such a repository into a service component means the repository leaks database schema information into the service tier (highly pseudo):
class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

interface MyRepository extends MongoRepository<MyDocument, String> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        var documentId = convert(id, ...);
        var matchingDocument = repository.findById(documentId).orElse(...);
        var model = convert(matchingDocument, ...);
        return model;
    }
}
Whilst ideally I'd want to do this:
class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

@Configuration
class MyMagicConversionConfig {
    ...
}

class MyDocumentToModelConverter implements Converter<MyDocument, MyModel> {
    ...
}

class MyModelToDocumentConverter implements Converter<MyModel, MyDocument> {
    ...
}

// Note that the model and the model's ID type are used in the repository declaration
interface MyRepository extends MongoRepository<MyModel, UUID> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        // Repository now returns the model because it was converted upstream
        // by the mongo persistence layer.
        var matchingModel = repository.findById(id).orElse(...);
        return matchingModel;
    }
}
Defining this conversion once seems significantly more practical than having to do it consistently throughout my service code, so I suspect I'm just missing something.
But of course this requires some way to inform the Mongo mapping layer which conversion has to be applied to move between MyModel and MyDocument, and to use the latter as its actual source of mapping metadata (e.g. @Document, @Id, etc.).
I've been fiddling with custom converters, but I just can't seem to make the MongoDB mapping component do the above.
My two questions are:
Is it currently possible to define custom converters or implement callbacks that allow me to define and implement this model <-> document conversion once and abstract it away from my service tier?
If not, what is the idiomatic way to clean this up such that the service layer can stay blissfully unaware of how, or with what schema, an entity is persisted? A lot of Spring Boot codebases appear to be fine with using the type that defines the database schema as their model, but that seems suboptimal. Suggestions welcome!
Thanks!
I think you're blowing things a bit out of proportion. The service layer is not aware of the schema. It is aware of the types returned by the repository. How the properties of those are mapped onto the schema depends on the object-document mapping. This, by default, uses the property name, as that's the most straightforward thing to do. That translation can be customized either by using annotations on the document type or by registering a FieldNamingStrategy with Spring Data MongoDB.
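As a rough illustration of the second option, a minimal sketch of registering a naming strategy via the Java-config style of Spring Data MongoDB (the config class name and database name are assumptions):
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.SnakeCaseFieldNamingStrategy;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
public class MyMongoConfig extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "mydb"; // assumed database name
    }

    @Override
    protected FieldNamingStrategy fieldNamingStrategy() {
        // store camelCase entity properties as snake_case document fields
        return new SnakeCaseFieldNamingStrategy();
    }
}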
Spring Data MongoDB's object-document mapping subsystem provides a lot of customization hooks that allow transforming arbitrary MongoDB documents into entities. The types which the repositories return are your domain objects that, again only by default, are mapped onto a MongoDB document 1:1, simply because that's the most reasonable thing to do in the first place.
If really in doubt, you can manually implement individual repository methods using the MongoTemplate API, which lets you explicitly define the type the data should be projected into.
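A minimal sketch of such a manually implemented method, assuming a custom repository fragment backed by MongoTemplate (the fragment interface, collection name, and the assumption that MyModel's properties can be mapped from the stored document fields are all illustrative):
import java.util.Optional;
import java.util.UUID;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

interface MyRepositoryCustom {
    Optional<MyModel> findModelById(UUID id);
}

class MyRepositoryCustomImpl implements MyRepositoryCustom {

    private final MongoTemplate mongoTemplate;

    MyRepositoryCustomImpl(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public Optional<MyModel> findModelById(UUID id) {
        // the id conversion is only for illustration, matching the String @Id of MyDocument
        Query query = Query.query(Criteria.where("_id").is(id.toString()));
        // read from the collection backing MyDocument, but project the result into MyModel
        return Optional.ofNullable(mongoTemplate.findOne(query, MyModel.class, "myDocument"));
    }
}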
You can use something like MapStruct or write your own Singleton Mapper.
Then create default methods in your repository:
interface DogRepository extends MongoRepository<DogDocument, String> {

    // findById(String) returning Optional<DogDocument> is inherited from MongoRepository

    default DogModel dogById(String id) {
        return findById(id)
                .map(DogMapper.INSTANCE::toModel)
                .orElse(null);
    }
}
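For completeness, the DogMapper.INSTANCE referenced above could, for instance, be a MapStruct mapper. A minimal sketch, assuming DogModel and DogDocument properties line up by name:
import org.mapstruct.Mapper;
import org.mapstruct.factory.Mappers;

@Mapper
public interface DogMapper {

    DogMapper INSTANCE = Mappers.getMapper(DogMapper.class);

    // MapStruct generates the implementation; properties with matching names are copied
    DogModel toModel(DogDocument document);

    DogDocument toDocument(DogModel model);
}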

I'm getting the error "Parameter 0 of constructor required a single bean, but 2 were found", but one of the beans "found" has been deleted by me

I'm trying to set up an H2 database using JPA/JDBC. After creating an implementation of a query interface using JPA as opposed to JDBC, I am now getting the error:
Parameter 0 of constructor in com.nsa.charitystarter.service.CharityQueries required a single bean, but 2 were found:
- charityRepoJDBC: defined in file [C:\Users\V La Roche\Desktop\assessment-1-starter\out\production\classes\com\nsa\charitystarter\data\CharityRepoJDBC.class]
- charityRepoJPA: defined in null
I'm unsure what has gone wrong and not really sure where to go from here; I haven't been able to find many people online with a similar issue.
My implementation using JDBC:
@Repository
public class CharityRepoJDBC implements CharityRepository {

    private JdbcTemplate jdbc;
    private RowMapper<Charity> charityMapper;

    @Autowired
    public CharityRepoJDBC(JdbcTemplate aTemplate) {
        jdbc = aTemplate;
        charityMapper = (rs, i) -> new Charity(
                rs.getLong("id"),
                rs.getString("name"),
                rs.getString("registration_id"),
                rs.getString("acronym"),
                rs.getString("purpose")
        );
    }

    @Override
    public List<Charity> findCharityBySearch(String searchTerm) {
        String likeSearch = "%" + searchTerm + "%";
        return jdbc.query(
                "select id, acronym, name, purpose, logo_file_name, registration_id " +
                        "from charity " +
                        "where concat(name, acronym, purpose, registration_id) like ?",
                new Object[]{likeSearch},
                charityMapper);
    }

    @Override
    public Optional<Charity> findById(Long id) {
        return Optional.of(
                jdbc.queryForObject(
                        "select id, acronym, name, purpose, logo_file_name, registration_id from charity where id=?",
                        new Object[]{id},
                        charityMapper)
        );
    }
}
Charity finder implementation:
@Service
public class CharityQueries implements CharityFinder {

    private CharityRepository charityRepository;

    public CharityQueries(CharityRepository repo) {
        charityRepository = repo;
    }

    public Optional<Charity> findCharityByIndex(Integer index) {
        return charityRepository.findById(index.longValue());
    }

    public List<Charity> findCharityBySearch(String searchTerm) {
        return charityRepository.findCharityBySearch(searchTerm);
    }
}
CharityFinder interface
public interface CharityFinder {
    Optional<Charity> findCharityByIndex(Integer index);
    List<Charity> findCharityBySearch(String searchTerm);
}
Error log:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of constructor in com.nsa.charitystarter.service.CharityQueries required a single bean, but 2 were found:
- charityRepoJDBC: defined in file [C:\Users\V La Roche\Desktop\assessment-1-starter\out\production\classes\com\nsa\charitystarter\data\CharityRepoJDBC.class]
- charityRepoJPA: defined in null
Action:
Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
Process finished with exit code 0
You currently have the following definition:
@Repository
public class CharityRepoJDBC implements CharityRepository {
And you are injecting CharityRepository into your service layer, CharityQueries:
@Service
public class CharityQueries implements CharityFinder {
    private CharityRepository charityRepository;
Hence, when you deploy your application, the container is confused about which bean to autowire into the service.
By default Spring autowires by type, and by that rule there are 2 beans which qualify to be injected by the Spring container: the generated implementation of CharityRepository (charityRepoJPA) and your own CharityRepoJDBC.
So in this case you need to explicitly tell the container which bean you are trying to autowire.
You can try adding qualifiers as below to solve the issue:
@Service
public class CharityQueries implements CharityFinder {

    private CharityRepository charityRepository;

    // the qualifier goes on the injection point; with constructor injection that's the constructor parameter
    public CharityQueries(@Qualifier("CharityRepoJDBC") CharityRepository repo) {
        charityRepository = repo;
    }
and at the same time modify your CharityRepoJDBC to be:
@Repository(value = "CharityRepoJDBC")
public class CharityRepoJDBC implements CharityRepository {
You seem to have both the Spring Data JDBC starter and the Spring Data JPA starter on the classpath.
Spring Data JDBC has a bug which causes it to produce implementations for repository interfaces even when it shouldn't, so you end up with one implementation from JPA and another from JDBC.
If you really want to use both Spring Data JDBC and Spring Data JPA, you can limit the @EnableJdbcRepositories and @EnableJpaRepositories annotations using their include and exclude filters, as sketched below.
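As a minimal sketch of that idea, one common variant is to point each module at its own package via basePackages (include/exclude type filters are another option); the package names below are assumptions, not taken from the question:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jdbc.repository.config.EnableJdbcRepositories;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@Configuration
@EnableJpaRepositories(basePackages = "com.nsa.charitystarter.data.jpa")   // JPA repositories live here
@EnableJdbcRepositories(basePackages = "com.nsa.charitystarter.data.jdbc") // Spring Data JDBC repositories live here
public class RepositoryConfig {
}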
But from your code and the tags you used, I suspect you are not actually interested in Spring Data JDBC but only in Spring JDBC.
If that is the case, look for the dependency spring-boot-starter-data-jdbc and change it to spring-boot-starter-jdbc.
In case all this Spring (Data) JDBC/JPA confuses you, this question and its answers might help: Spring Data JDBC / Spring Data JPA vs Hibernate
I solved it by putting the @Primary annotation on the repository.
In your case it would go on whichever repository implementation you want injected, for example:
@Primary
@Repository
public class CharityRepoJDBC implements CharityRepository {
    ...
}

Repository pattern in MVC: controller code explanation

What is the purpose of StudentRepository in this example? Why do I need one?
public class StudentController : Controller
{
    private IStudentRepository _repository;

    public StudentController() : this(new StudentRepository())
    {
    }

    public StudentController(IStudentRepository repository)
    {
        _repository = repository;
    }
I updated this to include the specific question I think you're getting at. The purpose of StudentRepository is to encapsulate interactions with persisted data. The controller need not know whether the data is stored in a database, a flat file, in memory, etc.
The reason you inject it via an interface is that you may eventually have multiple implementations of that repository, and the interface is just a contract ensuring consistent functionality across all implementations. This is called constructor injection (a type of dependency injection), in case you want to learn more.

Why is this method in a Spring Data repository considered a query method?

We have implemented an application that should be able to use either JPA, Couchbase or MongoDB (for now; this may increase in the future). We successfully implemented JPA and Couchbase by separating the repositories for each, e.g. JPA repositories come from org.company.repository.jpa while Couchbase repositories come from org.company.repository.cb. All repository interfaces extend a common repository found in org.company.repository. We are now targeting MongoDB by creating a new package org.company.repository.mongo. However, we are encountering this error:
No property updateLastUsedDate found for type TokenHistory!
Here is our code:
@Document
public class TokenHistory extends BaseEntity {

    private String subject;
    private Date lastUpdate;

    // Getters and setters here...
}
Under org.company.repository.TokenHistoryRepository.java:
@NoRepositoryBean
public interface TokenHistoryRepository<ID extends Serializable> extends TokenHistoryRepositoryCustom, BaseEntityRepository<TokenHistory, ID> {
    // No problem here. Handled by Spring Data
    TokenHistory findBySubject(@Param("subject") String subject);
}

// The custom method
interface TokenHistoryRepositoryCustom {
    void updateLastUsedDate(@Param("subject") String subject);
}
Under org.company.repository.mongo.TokenHistoryMongoRepository.java:
@RepositoryRestResource(path = "/token-history")
public interface TokenHistoryMongoRepository extends TokenHistoryRepository<String> {
    TokenHistory findBySubject(@Param("subject") String subject);
}

class TokenHistoryMongoRepositoryCustomImpl {
    public void updateLastUsedDate(String subject) {
        //TODO implement this
    }
}
And for the Mongo configuration:
@Configuration
@Profile("mongo")
@EnableMongoRepositories(basePackages = {
        "org.company.repository.mongo"
}, repositoryImplementationPostfix = "CustomImpl",
        repositoryBaseClass = BaseEntityRepositoryMongoImpl.class
)
public class MongoConfig {
}
The setup is the same for both JPA and Couchbase, but we didn't encounter that error there. It was able to use the inner class with the "CustomImpl" postfix, which should be the case based on the documentation.
Is there a problem in my setup or configuration for MongoDB?
Your TokenHistoryMongoRepositoryCustomImpl doesn't actually implement the TokenHistoryRepositoryCustom interface, which means there's no way for us to find out that updateLastUsedDate(…) in that class is meant to be an implementation of the interface method. Hence, it's considered a query method and triggers query derivation.
I highly doubt that this works for the other stores as claimed, as the code inspecting query methods is shared in DefaultRepositoryInformation.
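In other words, the fix is to make the custom implementation class actually implement the custom fragment interface, roughly:
class TokenHistoryMongoRepositoryCustomImpl implements TokenHistoryRepositoryCustom {

    @Override
    public void updateLastUsedDate(String subject) {
        // TODO implement this
    }
}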

Using QueryDslRepositorySupport in combination with interface repositories

Since I didn't get a reply on the Spring forum, I'll give it a try here.
Is there a way to have a common repository interface which is extended by other interfaces in the following way:
@NoRepositoryBean
public interface CommonRepository<T> extends JpaRepository<T, Long>, QueryDslPredicateExecutor<T> {
    T getById(final long id);
}

@Repository
public interface ConcreteRepository extends CommonRepository<ConcreteEntity> {
    List<ConcreteEntity> getByNameAndAddress(final String name, final String address);
}

public class ConcreteRepositoryImpl extends QueryDslRepositorySupport implements ConcreteRepository {

    private BooleanExpression nameEquals(final QConcreteEntity entity, final String name) {
        return entity.name.eq(name);
    }

    public List<ConcreteEntity> getByNameAndAddress(final String name, final String address) {
        QConcreteEntity entity = QConcreteEntity.concreteEntity;
        return from(entity).where(entity.name.eq(name).and(entity.address.eq(address))).list(entity);
    }
}
The problem with this implementation is that I have to implement getById(final long id) in each concrete class, which I don't want to do. Normally, Spring Data automatically knows about each entity. I also want to keep the functionality of QueryDslRepositorySupport.
In my example it would normally generate something like:
select .. from concreteentity en where en.id = ...
Is there a way to solve this? I already stumbled upon
Spring Jpa adding custom functionality to all repositories and at the same time other custom funcs to a single repository
and
http://docs.spring.io/spring-data/data-jpa/docs/current/reference/html/repositories.html#repositories.custom-implementations
but I don't think these solutions are helpful and I don't entirely understand how I could use them to solve the problem.
Thanks,
Christian
One way to create a generic getById under QuerydslRepositorySupport is like this:
@SuppressWarnings("unchecked")
public T getById(long id) {
    return (T) getEntityManager().find(getBuilder().getType(), id);
}
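A sketch of how that snippet could live in a shared base class, so concrete implementations don't have to repeat it (class names and wiring are illustrative, not a drop-in replacement for Spring Data's repositoryBaseClass mechanism):
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;

// Shared base class: the entity class is handed to QuerydslRepositorySupport,
// so getBuilder().getType() resolves to the right type at runtime.
public abstract class CommonRepositorySupport<T> extends QuerydslRepositorySupport {

    protected CommonRepositorySupport(Class<T> domainClass) {
        super(domainClass);
    }

    @SuppressWarnings("unchecked")
    public T getById(long id) {
        return (T) getEntityManager().find(getBuilder().getType(), id);
    }
}

// A concrete custom implementation could then extend it:
class ConcreteRepositoryImpl extends CommonRepositorySupport<ConcreteEntity> {

    ConcreteRepositoryImpl() {
        super(ConcreteEntity.class);
    }
}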

Resources