How does Spring Data JPA work internally

I was going through the Spring Data JPA Tutorial.
I am confused about how this framework works internally.
Let me state a specific scenario.
There was this specific code:
/**
 * Custom finder
 */
public List<Location> getLocationByStateName(String name) {
    @SuppressWarnings("unchecked")
    List<Location> locs = entityManager
            .createQuery("select l from Location l where l.state like :state")
            .setParameter("state", name + "%")
            .getResultList(); // note
    return locs;
}
This was simply replaced by the following interface:
@Repository
public interface LocationJPARepository extends JpaRepository<Location, Long> {
    List<Location> findByStateLike(String stateName);
}
And the corresponding test case worked fine:
@Test
public void testFindWithLike() throws Exception {
    List<Location> locs = locationRepository.getLocationByStateName("New");
    assertEquals(4, locs.size());
}
New test case:
@Test
public void testFindWithLike() throws Exception {
    List<Location> locs = locationJPARepository.findByStateLike("New");
    assertEquals(4, locs.size());
}
My questions:
How does the framework know whether I am looking for an exact match using = or a partial match using the SQL LIKE operator? (It can't just be the method name, can it?)
If it somehow decides I am looking for a partial match, there are still sub-options ... like name%, %name or %name% ...
Also, how does it decide whether case matters in the LIKE? (I could make it case-insensitive by using SQL LIKE with toUpper(), i.e. by comparing everything in upper case.)
(Added question) Is there a way I can see the EXACT SQL in a log somewhere?
I hope I was able to explain my question properly. Let me know if I need to add more clarity.

I recommend taking a look at the Query Creation section of the reference guide. It explains the rules pretty clearly.
For instance, when you want to find a User by first name and ignore case, you would use a method name like findByFirstnameIgnoreCase, which translates into a condition like UPPER(x.firstname) = UPPER(?1).
By default, when you have a findByProperty method the match is exact, so if you want LIKE functionality you would use the method name findByFirstnameLike, which in turn translates into the condition where x.firstname like ?1.
You can combine these keywords, but it can get a little crazy. Personally I prefer using the @Query annotation for more complicated queries to avoid super long repository method names.
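To make this concrete, here is a rough sketch of how different method names could map to different conditions on the repository from the question. Only findByStateLike comes from the question; the other method names and the @Query example are my own illustrations, not taken from the tutorial:
@Repository
public interface LocationJPARepository extends JpaRepository<Location, Long> {

    // exact match: ... where l.state = ?1
    List<Location> findByState(String state);

    // LIKE match, wildcards supplied by the caller, e.g. findByStateLike("New%"):
    // ... where l.state like ?1
    List<Location> findByStateLike(String statePattern);

    // case-insensitive prefix match, wildcard added for you by the StartingWith keyword
    List<Location> findByStateStartingWithIgnoreCase(String statePrefix);

    // explicit JPQL when the derived name gets unwieldy
    @Query("select l from Location l where l.state like :state")
    List<Location> findByStatePattern(@Param("state") String statePattern);
}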

Related

Java 8 Application layer and specific output transformation

I have a Gradle multi-project with 2 subprojects, trying to emulate a hexagonal architecture:
rest-adapter
application layer
I don't want the application services to expose the domain models, and I don't want to force a specific representation as output. So I would like something like: application services consume 2 args (a command and something) and return a T. The client configures the service.
The rest adapter doesn't have access to the domain model, so I can't return the domain models and let the adapter create its representation.
What about the something? I tried:
have a signature <T> List<T> myUseCase(Command c, Function<MyDomainModel, T> fn). The application layer is the owner of the transformation functions (because the signature uses MyDomainModel) and exposes a dictionary of functions. So the rest controller references one of them. It works, and I'm searching for a better, more elegant way if it exists.
have a signature <T> List<T> myUseCase(Command c, FnEnum fn). For each enum value I have an associated Function. With this I found the signature more elegant: the consumer states which transformation it wants via an enum. But it doesn't work because the generic method doesn't compile; the T cannot be resolved. Currently, I haven't found a way.
something with Java 8 Consumer or Supplier or something else, but I failed to wrap my head around it.
I'm feeling there's a more elegant solution for this kind of problem: a service which accepts a function that transforms and builds an output that the client provides.
I think what you need to implement is the so-called "Data Transformer" pattern.
Imagine that you have a use case that returns a certain domain object (for example "User"), but you shouldn't expose the domain to clients, and you want every client to choose the format of the returned data.
So you define a data transformer interface for the domain object:
public interface UserDataTransformer {
    void write(User user);
    String read();
}
For every output format your clients need, you define a class implementing the interface. For example, if you want to represent the User in XML format:
public class UserXMLDataTransformer implements UserDataTransformer {

    private String xmlUser;

    @Override
    public void write(User user) {
        this.xmlUser = xmlEncode(user);
    }

    private String xmlEncode(User user) {
        String xml = << transform user to xml format >>;
        return xml;
    }

    @Override
    public String read() {
        return this.xmlUser;
    }
}
Then you make your application service depend on the data transformer interface and inject it in the constructor:
public class UserApplicationService {

    private UserDataTransformer userDataTransformer;

    public UserApplicationService(UserDataTransformer userDataTransformer) {
        this.userDataTransformer = userDataTransformer;
    }

    public void myUseCase(Command c) {
        User user = << call the business logic of the domain and construct the user object you wanna return >>;
        this.userDataTransformer.write(user);
    }
}
And finally, the client could look something like this:
public class XMLClient {

    public static void main(String[] args) {
        UserDataTransformer userDataTransformer = new UserXMLDataTransformer();
        UserApplicationService userService = new UserApplicationService(userDataTransformer);

        Command c = << data input needed by the use case >>;
        userService.myUseCase(c);

        String xmlUser = userDataTransformer.read();
        System.out.println(xmlUser);
    }
}
I've assumed that the output is a String, but you could use generics to return any type you want.
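A minimal sketch of that generic variant, assuming the same User domain object (the type parameter and class names here are my own, not part of the answer above):
public interface UserDataTransformer<T> {
    // each adapter decides what T is: a String, a DTO, a JSON tree, ...
    void write(User user);
    T read();
}

public class UserXmlDataTransformer implements UserDataTransformer<String> {

    private String xmlUser;

    @Override
    public void write(User user) {
        // placeholder: a real implementation would serialize the user to XML here
        this.xmlUser = "<user/>";
    }

    @Override
    public String read() {
        return this.xmlUser;
    }
}
UserApplicationService would then also take a type parameter (or accept a UserDataTransformer<?>), so each client still picks the concrete transformer it wants.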
I haven't mentioned it, but this approach of injecting the transformer into the application service follows the "ports and adapters" pattern. The transformer interface would be the port, and every class implementing it would be an adapter for the desired format.
Also, this was just an example. You can use a dependency injection framework like Spring to create the component instances and wire them all together, and you should use the composition root pattern to do it.
Hope this example helped.
I'm feeling there's a more elegant solution for this kind of problem: a service which accepts a function that transforms and builds an output that the client provides.
You are sending data across the boundary between the application and the REST layer (and presumably between the application and the REST consumer); it may be useful to think about messaging patterns.
For example, the application can define a service provider interface that defines a contract/protocol for accepting data from the application.
interface ResponseBuilder {...}
void myUseCase(Command c, ResponseBuilder builder)
The REST adapter provides an implementation of the ResponseBuilder that can take the inputs and generate some useful data structure from them.
The response builder semantics (the names of the functions in the interface) might be drawn from the domain model, but the arguments will normally be either primitives or other message types.
CQS would imply that a query should return a value; so in that case you might prefer something like
interface ResponseBuilder<T> {
    ...
    T build();
}

<T> T myUseCase(Command c, ResponseBuilder<T> builder)
If you look carefully, you'll see that there's no magic here; we've simply switched from having a direct coupling between the application and the adapter to having an indirect coupling with the contract.
EDIT
My first solution is using a Function<MyDomainModel, T>, which is a bit different from your ResponseBuilder; but in the same vein.
It's almost dual to it. You'd probably be a little bit better off with a less restrictive signature on myUseCase:
<T> List<T> myUseCase(Command c, Function<? super MyDomainModel, T> fn)
The dependency structure is essentially the same -- the only real difference is what the REST adapter is coupled to. If you think the domain model is stable, and the output representations are going to change a lot, then the function approach gives you the stable API.
I suspect that you will find, however, that the output representations stabilize long before the domain model does, in which case the ResponseBuilder approach will be the more stable choice.
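For comparison, here is a rough sketch of the Function-based option, with the transformation functions owned and published by the application layer as in the question's first attempt. MyApplicationService, AS_SUMMARY and SomeRestController are invented names; Command and MyDomainModel are the question's types:
// application-layer module: owns the domain model, the use case, and the published transformations
public class MyApplicationService {

    // one of the transformations the application layer exposes to adapters
    public static final Function<MyDomainModel, String> AS_SUMMARY = Object::toString;

    public <T> List<T> myUseCase(Command c, Function<? super MyDomainModel, T> fn) {
        List<MyDomainModel> results = runDomainLogic(c); // domain work, not shown here
        return results.stream().map(fn).collect(Collectors.toList());
    }

    private List<MyDomainModel> runDomainLogic(Command c) {
        return new ArrayList<>(); // placeholder
    }
}

// rest-adapter module: references only a published transformation, not the domain logic
public class SomeRestController {

    private final MyApplicationService service;

    public SomeRestController(MyApplicationService service) {
        this.service = service;
    }

    public List<String> handle(Command c) {
        return service.myUseCase(c, MyApplicationService.AS_SUMMARY);
    }
}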

Spring AOP - annotation with args

I am stuck with a problem in Spring Boot. I am trying to give extra functionality to some RestControllers, and I am trying to achieve it with some custom annotations. Here is an example.
My annotation:
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface MyCustomAnnotation {
    String someArg();
}
My aspect:
@Aspect
@Component
public class MyAspect {

    @Around(
        value = "@annotation(MyCustomAnnotation)",
        argNames = "proceedingJoinPoint,someArg"
    )
    public Object addMyLogic(ProceedingJoinPoint proceedingJoinPoint, String someArg)
            throws Throwable {
        System.out.println(someArg);
        return proceedingJoinPoint.proceed();
    }
}
My method:
@MyCustomAnnotation(someArg = "something")
@GetMapping("/whatever/route")
public SomeCustomResponse endpointAction(@RequestParam Long someId) {
    SomeCustomResult result = someActionDoesNotMatter(someId);
    return new SomeCustomResponse(result);
}
Mostly based on the docs (https://docs.spring.io/spring/docs/3.0.3.RELEASE/spring-framework-reference/html/aop.html - 7.2.4.6 Advice parameters) I am pretty sure it should work.
I am here because it does not...
What drives me crazy is that even IntelliJ, when it tries to help with argNames (empty string -> red underline -> Alt+Enter -> "Correct argNames attribute"), gives me exactly this, and keeps it red...
Based on the docs, proceedingJoinPoint should not even be needed in argNames (it does not work without it either): "If the first parameter is of the JoinPoint, ProceedingJoinPoint..."
With the current setup, it says "Unbound pointcut parameter 'someArg'".
At this point, I should also note that without the args it is working fine.
I have two questions, actually:
Why does this not work? (That one was pretty obvious.)
If I would like to give some extra functionality to some controllers and parameterise it from the outside, is this the right pattern in Spring Boot? (With Python it was quite easy to do this with decorators; I am not quite sure that I am not misguided by the similar syntax.)
One example (the example above was pretty abstract):
I would like to create a @LogEndpointCall annotation, and the developer of a route can later just put it on the endpoint that he is developing
...however, it would be nice if he could add a string (or more likely, an enum) as a parameter:
@LogEndpointCall(EndpointCallLogEnum.NotVeryImportantCallWhoCares)
or
@LogEndpointCall(EndpointCallLogEnum.PrettySensitiveCallCheckItALot)
so that the same logic is triggered, but with a different param -> and a save to a different destination will be made.
You cannot directly bind an annotation property to an advice parameter. Just bind the annotation itself and access its parameter normally:
@Around("@annotation(myCustomAnnotation)")
public Object addMyLogic(
        ProceedingJoinPoint thisJoinPoint,
        MyCustomAnnotation myCustomAnnotation
) throws Throwable {
    System.out.println(thisJoinPoint + " -> " + myCustomAnnotation.someArg());
    return thisJoinPoint.proceed();
}
It will print something like this with Spring AOP:
execution(SomeCustomResponse de.scrum_master.app.Application.endpointAction(Long)) -> something
and something like this with AspectJ (because AspectJ also knows call join points, not just execution):
call(SomeCustomResponse de.scrum_master.app.Application.endpointAction(Long)) -> something
execution(SomeCustomResponse de.scrum_master.app.Application.endpointAction(Long)) -> something
If you want your advice to intercept methods while also taking their arguments into consideration, you must mention that explicitly in your pointcut expression. To make it work, here is what you should do:
@Around(
    value = "@annotation(MyCustomAnnotation) && args(someArg)",
    argNames = "someArg")
Notice that I added && args(someArg); you can add as many arguments as you want. In argNames you can omit proceedingJoinPoint.
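Combining the two answers, a sketch of an advice that binds both the annotation and one method argument might look like this. The Long id binding matches the endpointAction(Long someId) signature from the question; the class name and the printed message are my own:
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class MyArgsAwareAspect {

    // @annotation(myCustomAnnotation) binds the annotation instance,
    // args(id) binds the single Long argument of the intercepted method.
    @Around(
        value = "@annotation(myCustomAnnotation) && args(id)",
        argNames = "myCustomAnnotation,id")
    public Object addMyLogic(ProceedingJoinPoint proceedingJoinPoint,
                             MyCustomAnnotation myCustomAnnotation,
                             Long id) throws Throwable {
        System.out.println(myCustomAnnotation.someArg() + " called with id " + id);
        return proceedingJoinPoint.proceed();
    }
}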

Multi-Column Search with Spring JPA Specifications

I want to create a multi-field search in a Spring Boot back-end. How do I do this with a Specification<T>?
Environment
Spring Boot
Hibernate
Gradle
IntelliJ
The UI in the front end is a jQuery DataTable. Each column allows a single string search term to be applied. The search terms across more than one column are joined by an AND.
I have the filters coming from the front end already getting populated into a Java object.
Step 1
Extend JpaSpecificationExecutor
public interface SomeRepository extends JpaRepository<Some, Long>, PagingAndSortingRepository<Some, Long>, JpaSpecificationExecutor {
Step 2
Create a new class SomeSpec.
This is where I am lost as to what the code looks like and how it works.
Do I need a method for each column?
What is Root and what is CriteriaBuilder?
What else is required?
I am rather new to JPA, so while I don't need anyone to write the code for me, a detailed explanation would be good.
UPDATE
It appears QueryDSL is the easier and better way to approach this. I am using Gradle. Do I need to change my build.gradle from this?
If you don't want to use QueryDSL, you'll have to write your own specifications. First of all, you need to extend your repository from JpaSpecificationExecutor like you did. Make sure to add the generic though (JpaSpecificationExecutor<Some>).
After that you'll have to create three specifications (one for each column), in the Spring docs they define these specifications as static methods in a class. Basically, creating a specification means that you'll have to subclass Specification<Some>, which has only one method to implement, toPredicate(Root<Some>, CriteriaQuery<?>, CriteriaBuilder).
If you're using Java 8, you can use a lambda instead of an anonymous inner class, e.g.:
public class SomeSpecs {
    public static Specification<Some> withAddress(String address) {
        return (root, query, builder) -> {
            // ...
        };
    }
}
For the actual implementation, you can use Root to get to a specific node, e.g. root.get("address"). The CriteriaBuilder, on the other hand, is used to define the where clause, e.g. builder.equal(..., ...).
In your case you want something like this:
public class SomeSpecs {
    public static Specification<Some> withAddress(String address) {
        return (root, query, builder) -> builder.equal(root.get("address"), address);
    }
}
Or alternatively if you want to use a LIKE query, you could use:
public class SomeSpecs {
    public static Specification<Some> withAddress(String address) {
        return (root, query, builder) -> builder.like(root.get("address"), "%" + address + "%");
    }
}
Now you have to repeat this for the other fields you want to filter on. After that, you combine the specifications (using and(), or(), ...). Then you can use the repository.findAll(Specification) method to query based on that specification, for example:
public List<Some> getSome(String address, String name, Date date) {
    return repository.findAll(where(withAddress(address))
            .and(withName(name))
            .and(withDate(date)));
}
You can use static imports to import withAddress(), withName() and withDate() to make it easier to read. The where() method can also be statically imported (comes from Specification.where()).
Be aware though that the method above may have to be tweaked since you don't want to filter on the address field if it's null. You could do this by returning null, for example:
public List<Some> getSome(String address, String name, Date date) {
    return repository.findAll(where(address == null ? null : withAddress(address))
            .and(name == null ? null : withName(name))
            .and(date == null ? null : withDate(date)));
}
You could consider using Spring Data's support for QueryDSL, as you would get quite a lot without having to write very much code, i.e. you would not actually have to write the specifications.
See here for an overview:
https://spring.io/blog/2011/04/26/advanced-spring-data-jpa-specifications-and-querydsl/
Although this approach is really convenient (you don’t even have to write a single line of implementation code to get the queries executed) it has two drawbacks: first, the number of query methods might grow for larger applications because of - and that’s the second point - the queries define a fixed set of criterias. To avoid these two drawbacks, wouldn’t it be cool if you could come up with a set of atomic predicates that you could combine dynamically to build your query?
So essentially your repository becomes:
public interface SomeRepository extends JpaRepository<Some, Long>,
        PagingAndSortingRepository<Some, Long>, QueryDslPredicateExecutor<Some> {
}
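With that in place you can already pass predicates built from the generated query type straight to the repository. A small sketch, where QSome is the class QueryDSL generates for the Some entity and the field names are assumptions:
// "address contains 'foo' (ignoring case) AND name = 'bar'"
BooleanBuilder builder = new BooleanBuilder();
builder.and(QSome.some.address.containsIgnoreCase("foo"));
builder.and(QSome.some.name.eq("bar"));

// someRepository is an injected SomeRepository
Iterable<Some> matches = someRepository.findAll(builder);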
You can also get request parameters automatically bound to a predicate in your Controller:
See here:
https://spring.io/blog/2015/09/04/what-s-new-in-spring-data-release-gosling#querydsl-web-support
So your Controller would look like:
@Controller
class SomeController {

    private final SomeRepository repository;

    @RequestMapping(value = "/", method = RequestMethod.GET)
    String index(Model model,
                 @QuerydslPredicate(root = Some.class) Predicate predicate,
                 Pageable pageable) {
        model.addAttribute("data", repository.findAll(predicate, pageable));
        return "index";
    }
}
So with the above in place, it is simply a case of enabling QueryDSL on your project, and the UI should now be able to filter, sort and page data by various combinations of criteria.

override condition in Spring QuerydslPredicate

I'm using QuerydslPredicate (Spring 4.2.5, Spring Boot 1.3.3, querydsl-core 3.7.0) to create a search web service.
My Ticket entity has properties like name, description, etc.
I want a strict equality on the name field, but a "contains" comparison on the description.
The web service
public Page<Ticket> findAll(@QuerydslPredicate(root = Ticket.class) Predicate predicate,
                            String description, Pageable pageable) {
    BooleanBuilder builder = new BooleanBuilder(predicate);
    if (isNotEmpty(description)) {
        builder.and(QTicket.ticket.description.containsIgnoreCase(description));
    }
    return ticketService.findAll(builder, pageable);
}
Problem: when I query my web service like this: http...?description=foo, two comparisons are generated for the description (I started a debugger and looked at the generated BooleanBuilder). The pseudo-code looks like this: "description = foo AND description contains foo".
I'd like to keep the "contains" comparison only.
I found a workaround: I simply renamed the web service's parameter description to descriptionFragment. This way, I can call http...?descriptionFragment=foo.
public Page<Ticket> findAll(@QuerydslPredicate(root = Ticket.class) Predicate predicate,
                            String descriptionFragment, Pageable pageable) {
    BooleanBuilder builder = new BooleanBuilder(predicate);
    if (isNotEmpty(descriptionFragment)) {
        builder.and(QTicket.ticket.description.containsIgnoreCase(descriptionFragment));
    }
    return ticketService.findAll(builder, pageable);
}
Question: I'd like to avoid this workaround. Is there a way to override default equality on the description field?
I found a solution: my TicketRepository should extend QuerydslBinderCustomizer<QTicket>:
@Override
default void customize(QuerydslBindings bindings, QTicket qTicket) {
    bindings.bind(qTicket.description).first(StringExpression::containsIgnoreCase);
}
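For context, the customize method is declared as a default method on the repository interface itself. A sketch of the full interface, assuming the repository also extends QueryDslPredicateExecutor as needed for the @QuerydslPredicate binding:
public interface TicketRepository extends JpaRepository<Ticket, Long>,
        QueryDslPredicateExecutor<Ticket>, QuerydslBinderCustomizer<QTicket> {

    @Override
    default void customize(QuerydslBindings bindings, QTicket qTicket) {
        // use "contains, ignoring case" for description instead of the default strict equality
        bindings.bind(qTicket.description).first(StringExpression::containsIgnoreCase);
    }
}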

How to trim white space from char fields in a POJO using Hibernate and a legacy database

My table has a column of type char(5) and I cannot change it to varchar(5). So when I fetch values out of the table using HibernateTemplate, they come back padded with spaces (for example, a single-character value is padded out to five characters). A custom fix is to call .trim() with a null check, but is there a provided approach to handle this kind of situation?
P.S. I am using Spring's Hibernate DAO support.
(In SQL, the CHAR data type is a fixed-length character string. By definition, the additional characters are padded with spaces.)
One way of avoiding an explicit call to trim() is to provide a lifecycle method using the @PostLoad annotation on your entity.
e.g.:
@PostLoad
protected void trim() {
    if (stringAttr != null) {
        stringAttr = stringAttr.trim();
    }
}
I have referred to the discussion on a similar question here.
Of the suggested solutions I feel adding a user type best suits the requirements and makes more sense, because:
1. @PostLoad (or lifecycle methods in general) does not work when using a SessionFactory with Hibernate.
2. Modifications in accessors could be missed during a code refactor; another developer may overlook the trimming setters, which may lead to them being overwritten.
So I am using the following solution:
1. Have a package-info.java: this holds the details of the typedefs being used.
2. Annotate the fields with @Type(type = "MyTrimTypeString").
3. Define a user type class MyTrimTypeString. The code for it follows the reply by StepherSwensen, with a slight update; nullSafeGet looks like:
@Override
public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
    final String val = rs.getString(names[0]);
    return (val == null ? val : val.trim());
}
and nullSafeSet looks like:
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws HibernateException, SQLException {
    final String val = (String) value;
    st.setString(index, val);
}
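Steps 1 and 2 are not shown above; the wiring could look roughly like this, assuming the user type class from step 3 is the typeClass being registered (the package name and column name below are placeholders):
// package-info.java
@TypeDef(name = "MyTrimTypeString", typeClass = MyTrimTypeString.class)
package com.example.domain;

import org.hibernate.annotations.TypeDef;

// and on the entity, for the CHAR(5) column:
@Type(type = "MyTrimTypeString")
@Column(name = "STATE_CODE", length = 5)
private String stateCode;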
