Spring Data Elasticsearch: how to create a repository method for a keyword field

Let's say I have a mapping like this, and I want to search by the "requestId.keyword" field to fetch exact-match requests. How can I implement this with a Spring Data Elasticsearch repository without using the @Query annotation?
"requestId": {
"type": "text",
"analyzer": "1_to_15_analyzer_without_space",
"search_analyzer": "all_symbols_and_fold_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
}

This is not possible with the mechanism that builds queries by introspecting the method name. The first idea would be something like this (I am using a Foo entity here):
SearchHits<Foo> searchByRequestId_Keyword(String keyword);
The analysis of the method name is done in the spring-data-commons module, which only uses the names of the Java properties of an entity (possibly nested). But the keyword subfield only exists in Elasticsearch and - if not autocreated - in the @MultiField annotation. The code that parses the method name does not use store-specific information, so an approach like this will not work; it fails with an error saying that keyword is not a property - which is correct for the Java object, where requestId is just a String.
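For context, such a multi-field mapping would typically be declared on the entity roughly like the following sketch; the field and analyzer names come from the mapping above, while the index name and the rest of the entity are assumptions:

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.annotations.InnerField;
import org.springframework.data.elasticsearch.annotations.MultiField;

@Document(indexName = "foo") // index name assumed
public class Foo {

    @Id
    private String id;

    // main field is text with the custom analyzers; the keyword subfield
    // only exists on the Elasticsearch side
    @MultiField(
        mainField = @Field(type = FieldType.Text,
            analyzer = "1_to_15_analyzer_without_space",
            searchAnalyzer = "all_symbols_and_fold_analyzer"),
        otherFields = @InnerField(suffix = "keyword", type = FieldType.Keyword)
    )
    private String requestId;

    // getters and setters omitted
}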
What you can do is to first add a custom repository fragment interface:
public interface FooKeywordRepository {
    SearchHits<Foo> searchByRequestIdKeyword(String keyword);
}
and provide an implementation, which must be named like the interface with Impl as a suffix:
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.data.elasticsearch.core.query.Query;

public class FooKeywordRepositoryImpl implements FooKeywordRepository {

    private final ElasticsearchOperations operations;

    public FooKeywordRepositoryImpl(ElasticsearchOperations operations) {
        this.operations = operations;
    }

    @Override
    public SearchHits<Foo> searchByRequestIdKeyword(String keyword) {
        // native query using the Elasticsearch query DSL
        Query query1 = new NativeSearchQueryBuilder()
            .withQuery(QueryBuilders.termQuery("requestId.keyword", keyword))
            .build();
        // equivalent store-agnostic criteria query
        Query query2 = new CriteriaQuery(Criteria.where("requestId.keyword").is(keyword));
        return operations.search(query1, Foo.class); // could be query2 as well
    }
}
An ElasticsearchOperations is injected, and you use it to execute a query that you build yourself. I have put in two ways to build the query; both work.
Your repository definition to use would then be:
public interface FooRepository extends ElasticsearchRepository<Foo, String>, FooKeywordRepository {
    // other custom methods if needed
}
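Using it is then just a matter of injecting the combined repository. A minimal sketch (the service class is made up for illustration; SearchHits is streamable, so the entities can be unwrapped from the hits):

import java.util.List;
import java.util.stream.Collectors;

import org.springframework.data.elasticsearch.core.SearchHit;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.stereotype.Service;

@Service
public class FooService {

    private final FooRepository fooRepository;

    public FooService(FooRepository fooRepository) {
        this.fooRepository = fooRepository;
    }

    // exact match on the keyword subfield, unwrapped to the entities
    public List<Foo> findByExactRequestId(String requestId) {
        SearchHits<Foo> hits = fooRepository.searchByRequestIdKeyword(requestId);
        return hits.stream().map(SearchHit::getContent).collect(Collectors.toList());
    }
}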

Related

GraphQL mutation implementation in Spring Boot using @GraphQlRepository

I have the following Spring Boot reactive "stack" with GraphQL and MongoDB (in Kotlin):
spring-boot-starter-webflux
spring-boot-starter-graphql
spring-boot-starter-data-mongodb-reactive
A very basic example for a server which exposes a GraphQL API to query customers:
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document
import org.springframework.data.mongodb.repository.ReactiveMongoRepository
import org.springframework.graphql.data.GraphQlRepository

@SpringBootApplication
class ServerApplication

fun main(args: Array<String>) {
    runApplication<ServerApplication>(*args)
}

@Document
data class Customer(
    @Id val id: String? = null,
    val name: String?,
)

@GraphQlRepository
interface CustomerRepository : ReactiveMongoRepository<Customer, String>
In combination with the following GraphQL schema file:
type Customer {
    id: ID
    name: String
}

type Query {
    customers: [Customer]
    customerById(id: ID!): Customer
}

type Mutation {
    createCustomer(name: String!): Customer
}
It is already possible to query customers / customerById and retrieve the data accordingly using e.g.:
{
    customers { id name }
    customerById(id: "...") { name }
}
This is made possible by the @GraphQlRepository annotation, which automatically registers a handler for fetching data directly from the database.
However, I can't find anything in the documentation about how mutations are implemented, i.e. whether there is a similarly simple automatic solution as for the queries, or whether this has to be implemented manually by a controller with @MutationMapping:
import org.springframework.graphql.data.method.annotation.Argument
import org.springframework.graphql.data.method.annotation.MutationMapping
import org.springframework.stereotype.Controller
import reactor.core.publisher.Mono

@Controller
class CustomerController(val customerRepository: CustomerRepository) {

    @MutationMapping
    fun createCustomer(@Argument name: String?): Mono<Customer> {
        return customerRepository.save(Customer(name = name))
    }
}
As far as I know, mutations must be implemented through a dedicated @MutationMapping-annotated method in the controller, like you suggest. I have done a couple of exercises, and the only difference from your example is that I used a dedicated input type - both in the schema and in the Java codebase - to define it; in your case, a String will do.
The schema:
type Query {
    obras: [Obra]
    obrasPorArtista(apellidoArtista: String!): [Obra]
}

type Mutation {
    addObra(nueva: ObraInput): Obra
}

type Obra {
    id: ID
    titulo: String
    precio: Float
}

input ObraInput {
    titulo: String
    artista: String
    precio: Float
}
The Controller (the service is injected):
@MutationMapping
public Mono<Obra> addObra(@Argument ObraInput nueva) {
    return obraService.guardarObra(nueva);
}
The ObraInput:
public record ObraInput(String titulo, String artista, double precio) {}
(Obra is an entity with the JPA annotations, columns, etc.)
Hope it helps!
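For completeness, invoking such a mutation from the client side would look something like this (the field names come from the schema above; the values are placeholders):

mutation {
    addObra(nueva: { titulo: "...", artista: "...", precio: 1.0 }) {
        id
        titulo
        precio
    }
}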

spring-data-neo4j v6: No converter found capable of converting from type [MyDTO] to type [org.neo4j.driver.Value]

Situation
I'm migrating a Kotlin Spring Data Neo4j application from spring-data-neo4j version 5.2.0.RELEASE to version 6.0.11.
The original application has several repository interfaces with custom queries that take some DTO as a parameter and use the various DTO fields to construct the query. All queries of that kind currently fail with
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [MyDTO] to type [org.neo4j.driver.Value]
The reference documentation for spring-data-neo4j v6 only provides examples where the parameters passed to custom query methods of a @Repository interface are of the same type as the @Node class associated with that repository. The documentation does not explicitly state that only parameters of the node class are allowed.
Question
Is there any way to pass an arbitrary DTO (not being a @Node class) to a custom query method in a @Repository interface in spring-data-neo4j v6, like it was possible in v5?
Code samples
Example node entity
@Node
data class MyEntity(
    @Id
    val attr1: String,
    val attr2: String,
    val attr3: String
)
Example DTO
data class MyDTO(
    val field1: String,
    val field2: String
)
Example Repository interface
@Repository
interface MyRepository : PagingAndSortingRepository<MyEntity, String> {

    // ConverterNotFoundException is thrown when this method is called
    @Query("MATCH (e:MyEntity {attr1: {0}.field1}) " +
        "CREATE (e)-[l:LINK]->(n:OtherEntity {attr2: {0}.field2})")
    fun doSomethingWithDto(dto: MyDTO)
}
Solutions tried so far
Annotate DTO as if it were a Node entity
Based on the following found in the reference docs https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/#custom-queries.parameters
Mapped entities (everything with a @Node) passed as parameter to a function that is annotated with a custom query will be turned into a nested map.
@Node
data class MyDTO(
    @Id
    val field1: String,
    val field2: String
)
Replace {0} with $0 in custom query
Based on the following found in the reference docs https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/#custom-queries.parameters
You do this exactly the same way as in a standard Cypher query issued in the Neo4j Browser or the Cypher-Shell, with the $ syntax (from Neo4j 4.0 on upwards, the old {foo} syntax for Cypher parameters has been removed from the database).
...
[In the given listing] we are referring to the parameter by its name. You can also use $0 etc. instead.
@Repository
interface MyRepository : PagingAndSortingRepository<MyEntity, String> {

    // ConverterNotFoundException is thrown when this method is called
    @Query("MATCH (e:MyEntity {attr1: $0.field1}) " +
        "CREATE (e)-[l:LINK]->(n:OtherEntity {attr2: $0.field2})")
    fun doSomethingWithDto(dto: MyDTO)
}
Details
spring-boot-starter: v2.4.10
spring-data-neo4j: v6.0.12
neo4j-java-driver: v4.1.4
Neo4j server version: v3.5.29
RTFM Custom conversions ...
Found the solution myself. Hopefully someone else may benefit from this as well.
Solution
Create a custom converter
import mypackage.model.*
import com.fasterxml.jackson.core.type.TypeReference
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import org.neo4j.driver.Value
import org.neo4j.driver.Values
import org.springframework.core.convert.TypeDescriptor
import org.springframework.core.convert.converter.GenericConverter
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair
import java.util.HashSet

class DtoToNeo4jValueConverter : GenericConverter {

    override fun getConvertibleTypes(): Set<ConvertiblePair>? {
        val convertiblePairs: MutableSet<ConvertiblePair> = HashSet()
        convertiblePairs.add(ConvertiblePair(MyDTO::class.java, Value::class.java))
        return convertiblePairs
    }

    override fun convert(source: Any?, sourceType: TypeDescriptor, targetType: TypeDescriptor?): Any? {
        return if (MyDTO::class.java.isAssignableFrom(sourceType.type)) {
            // generic way of converting an object into a map
            val dataclassAsMap = jacksonObjectMapper().convertValue(source as MyDTO, object :
                TypeReference<Map<String, Any>>() {})
            Values.value(dataclassAsMap)
        } else null
    }
}
Register custom converter in config
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.data.neo4j.core.convert.Neo4jConversions
import org.springframework.core.convert.converter.GenericConverter
import java.util.*

@Configuration
class MyNeo4jConfig {

    // no override here: this configuration class does not extend AbstractNeo4jConfig
    @Bean
    fun neo4jConversions(): Neo4jConversions {
        val additionalConverters: Set<GenericConverter?> = Collections.singleton(DtoToNeo4jValueConverter())
        return Neo4jConversions(additionalConverters)
    }
}
It's ridiculous that the framework would force you to write a custom converter for this. I made a @Transient object in my overridden User class for a limited set of updatable user-profile fields, and I'm encountering the same error. I guess I will just have to break the object up into its component String fields in the method params to get around this problem. What a mess.
#Query("MATCH (u:User) WHERE u.username = :#{#username} SET u.firstName = :#{#up.firstName},u.lastName = :#{#up.firstName},u.intro = :#{#up.intro} RETURN u")
Mono<User> update(#Param("username") String username,#Param("up") UserProfile up);
No converter found capable of converting from type [...UserProfile] to type [org.neo4j.driver.Value]

SpringData Mongo projection: ignore and override the values on save

Let me explain my problem with Spring Data Mongo. I have the following interface, in which I declared a custom query with a projection that ignores the index field. This example is only for illustration; in real life I will ignore a bunch of fields.
public interface MyDomainRepo extends MongoRepository<MyDomain, String> {

    @Query(fields = "{ index: 0 }")
    MyDomain findByCode(String code);
}
In my MongoDB instance, the stored document is MyDomain(code="mycode", info=null, index=19). When I use findByCode from MyDomainRepo, I get MyDomain(code="mycode", info=null, index=null). So far so good - this is the expected behaviour. But the problem happens when I decide to save the object returned by findByCode.
For instance, in the following example I took the findByCode result and set the info property to myinfo, which gives the object below:
MyDomain(code="mycode", info="myinfo", index=null)
Then I used save from MyDomainRepo. The index had been ignored, as expected from the projection, but when I saved the object back, with or without an update, Spring Data Mongo overrode the index property with null, and consequently my record in the MongoDB instance was overwritten too. The following is the resulting JSON in MongoDB:
{
    "_id": "5f061f9011b7cb497d4d2708",
    "info": "myinfo",
    "_class": "io.springmongo.models.MyDomain"
}
Is there a way to tell Spring Data Mongo to simply ignore null fields when saving?
Save is a replace operation, and you won't be able to signal it to patch only some fields; it will replace the document with whatever you send.
Your option is to use the custom-method extension provided by Spring Data repositories to define your own update method:
public interface MyDomainRepositoryCustom {
    void updateNonNull(MyDomain myDomain);
}

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

public class MyDomainRepositoryImpl implements MyDomainRepositoryCustom {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public MyDomainRepositoryImpl(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public void updateNonNull(MyDomain myDomain) {
        // populate only the fields you want to patch
        Update update = Update.update("key1", "value1")
            .set("key2", "value2");
        // you can also use Update.fromDocument(Document object, String... exclude) to
        // create the update, but then you need to use a MongoConverter
        // to convert your domain object to a Document.
        // create `queryToMatchId` to match the id
        mongoTemplate.updateFirst(queryToMatchId, update, MyDomain.class);
    }
}

public interface MyDomainRepository extends MongoRepository<..., ...>,
        MyDomainRepositoryCustom {
}
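Usage would then look something like this (a sketch: it assumes MyDomain has the usual getters/setters and that the queryToMatchId placeholder above has been filled in to match the document id):

MyDomain domain = myDomainRepository.findByCode("mycode");
domain.setInfo("myinfo");
// patches only the populated fields instead of replacing the whole document
myDomainRepository.updateNonNull(domain);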

Applying groupBy to Spring specification query

I'm building a simple search engine using Spring specification-driven repositories. The problem is that I make join-heavy queries which end up duplicating records, so I need to apply a groupBy restriction, but it isn't applied to the resulting query when I call .groupBy in my specification. How can I achieve such grouping?
The minimal example may be specified as follows:
Repository
import org.springframework.data.jpa.repository.JpaSpecificationExecutor;

public interface EntityRepository extends JpaSpecificationExecutor<Entity> {}
Specification
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.From;
import javax.persistence.criteria.Predicate;
import javax.persistence.criteria.Root;

import org.springframework.data.jpa.domain.Specification;

public class DummyEntitySpecification implements Specification<Entity> {

    public Predicate toPredicate(Root<Entity> root, CriteriaQuery<?> criteriaQuery,
            CriteriaBuilder builder) {
        // From here I get results with repeated main-entity data, but what I actually
        // need is to filter results by the presence of related subentities of some sort
        From join = root.join("SubEntity");
        // though I call groupBy here, Spring internally uses another query
        // instance, which doesn't inherit the grouping
        criteriaQuery.groupBy(root.get("id"));
        return builder.ge(join.get("id"), 1);
    }
}
Call
public class Anyclass {

    @Autowired
    private EntityRepository entityRepository;

    public void uselessSearch() {
        // this may return several entities with id = 1, for example
        entityRepository.findAll(new DummyEntitySpecification());
    }
}

ElasticSearch in Spring with @Query

I have successfully created a query using ElasticSearch's _plugin/head interface. The query is meant to return the latest timestamp for a specific device at a specific location. The query looks as follows:
{
    "query": {
        "bool": {
            "must": [
                { "term": { "deviceevent.location.id": "1" } },
                { "term": { "deviceevent.deviceId": "AHE1LDD01" } }
            ]
        }
    },
    "from": 0,
    "size": 1,
    "sort": {
        "timestamp": {
            "order": "desc"
        }
    }
}
The above query works as intended.
Now using Spring-Boot and Spring-Data-ElasticSearch, I defined my own ElasticSearchRepository which looks as follows:
package com.repository.elasticsearch;

import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.elasticsearch.annotations.Query;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

import com.domain.DeviceEvent;

public interface DeviceEventRepository extends ElasticsearchRepository<DeviceEvent, String> {

    @Query("{\"bool\":{\"must\":[{\"term\":{\"deviceevent.location.id\": \"?0\"}},{\"term\":{\"deviceevent.deviceId\": \"?1\"}}]}},\"from\": 0,\"size\": 1,\"sort\":{\"timestamp\":{\"order\":\"desc\"}}")
    DeviceEvent findLatestCheckInAtLocation(Long locationId, String deviceId);
}
The above code is breaking mainly because I would expect it to return one DeviceEvent, but it's actually returning ten device events (the default page size). It also seems that the results are not ordered by timestamp in descending order. It's as if the size and sort parts of the query are not being picked up.
What am I doing wrong here?
Instead of controlling the result size in the query annotation, use the Pageable interface. The following is taken from the documentation:
public interface BookRepository extends ElasticsearchRepository<Book, String> {

    @Query("{\"bool\" : {\"must\" : {\"field\" : {\"name\" : \"?0\"}}}}")
    Page<Book> findByName(String name, Pageable pageable);
}
This would allow you to:
findByName("foo-name", new PageRequest(0, 1));
If you also want to sort:
findByName("foo-name", new PageRequest(0, 1, new Sort(new Sort.Order(Sort.Direction.ASC, "name")))).getContent().get(0);
