Spring Data JPA: dynamically pass a WHERE clause to a query

My entity looks like this:
public class Event {
    String title;
    String description;
    String city;
}
I am new to Spring Data JPA. I want to implement a search feature: when a user enters "Hello Hyderabad Fest",
I want to tokenize the string into words and, with a single query to the database, find entities where any word matches any property, i.e.
WHERE title LIKE '%Hello%' OR title LIKE '%Hyderabad%' OR title LIKE '%Fest%'
   OR description LIKE '%Hello%' OR description LIKE '%Hyderabad%' OR description LIKE '%Fest%'
   OR city LIKE '%Hello%' OR city LIKE '%Hyderabad%' OR city LIKE '%Fest%'
How can we achieve this in Spring Data JPA?
Can we dynamically pass a WHERE condition to Spring Data JPA named queries?
Can we use a Lucene-style query, as we would with NoSQL databases?
Any other suggestions?
Thanks in advance.

A PostgreSQL full-text search query solved the above issue: http://rachbelaid.com/postgres-full-text-search-is-good-enough/
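Another option, staying inside Spring Data JPA, is to build the WHERE clause dynamically with the Specification API. Below is a minimal sketch, assuming the Event entity above and a repository that also extends JpaSpecificationExecutor; the class and method names (EventSpecifications, searchAllFields) are only illustrative, and the javax imports would be jakarta on newer Spring Boot versions.
import org.springframework.data.jpa.domain.Specification;

import javax.persistence.criteria.Predicate;
import java.util.ArrayList;
import java.util.List;

public class EventSpecifications {

    // Builds (title LIKE %w% OR description LIKE %w% OR city LIKE %w%) OR ... for each word
    public static Specification<Event> searchAllFields(String input) {
        return (root, query, cb) -> {
            List<Predicate> predicates = new ArrayList<>();
            for (String word : input.split("\\s+")) {
                String pattern = "%" + word + "%";
                predicates.add(cb.like(root.get("title"), pattern));
                predicates.add(cb.like(root.get("description"), pattern));
                predicates.add(cb.like(root.get("city"), pattern));
            }
            return cb.or(predicates.toArray(new Predicate[0]));
        };
    }
}
With a repository declared as interface EventRepository extends JpaRepository<Event, Long>, JpaSpecificationExecutor<Event>, the search could then be run as eventRepository.findAll(EventSpecifications.searchAllFields("Hello Hyderabad Fest")).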

Related

Update by query multiple fields using Spring Data Elasticsearch?

I want to update all the documents that have, for example, the same name. I've seen in the Elasticsearch documentation that I can use _update_by_query, so I tried to implement it in my repository like this:
#Query("{\"script\": { \"inline\": \"ctx._source.name = ?1\"; \"ctx._source.username = ?2\"; \"ctx._source.avatar = ?3\", \"lang\": \"painless\" }, \"query\": { \"match\": { \"name\" : \"?1\" }}")
List<User> update(String name, String username, String avatar);
But I get the following error:
nested exception is ElasticsearchStatusException[Elasticsearch exception [type=parsing_exception, reason=[script] query does not support [inline]]]
at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:157) ~[spring-kafka-2.5.0.RELEASE.jar:2.5.0.RELEASE]
Edit 26.06.2020:
This answer is not correct; I added a correct one below.
Old, incorrect answer:
It seems strange to me that this error comes from org.springframework.kafka.listener.SeekUtils.
To update using a script, you can use the update(UpdateQuery updateQuery, IndexCoordinates index) method of an ElasticsearchOperations instance.
To have this in your repository, you will need to create a repository customization as described here. In the implementation, autowire an ElasticsearchOperations instance. In the custom repository interface, you define the method
List<User> update(String name, String username, String avatar);
In the implementation, build an UpdateQuery object with the script and the other information and pass it to the ElasticsearchOperations instance.
After checking the code of Spring Data Elasticsearch, I need to withdraw what I wrote in the first answer:
Currently Spring Data Elasticsearch does not support update by query. It is only possible to update entities with a known id, either in a single operation or in a batch update.
I created an issue in Jira to add support for that.
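For the case that is supported (updating a document whose id is already known), a minimal sketch could look like the following; it assumes Spring Data Elasticsearch 4.x, an injected ElasticsearchOperations, and an index named "users", all of which are illustrative rather than taken from the question.
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.document.Document;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
import org.springframework.data.elasticsearch.core.query.UpdateQuery;

public class UserUpdater {

    private final ElasticsearchOperations operations;

    public UserUpdater(ElasticsearchOperations operations) {
        this.operations = operations;
    }

    // Partial update of a single document identified by its known id
    public void updateUser(String id, String name, String username, String avatar) {
        Document patch = Document.create();
        patch.put("name", name);
        patch.put("username", username);
        patch.put("avatar", avatar);

        UpdateQuery updateQuery = UpdateQuery.builder(id)
                .withDocument(patch)
                .build();

        operations.update(updateQuery, IndexCoordinates.of("users"));
    }
}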

spring-data-mongodb using the fieldName instead of _id

I have a POJO with an attribute like this:
class A {
    @Id
    @Field("item_id")
    private String itemId;
}
When I try to update a document in a MongoDB collection based on the itemId as below, it works well, and the mongo ops logs show that the query was translated to "_id in itemIds":
Query query = new Query(Criteria.where("itemId").in(itemIds));
Update update = new Update();
update.set("field2", "abd");
mongoTemplate.updateMulti(query, update, A.class);
When I upgraded to spring-data-mongodb-2.1.5.RELEASE, the query I saw in the mongo logs was "item_id in itemIds". Since item_id is not a field and there is no index for it in the collection, the query took forever to complete.
Can anyone help me understand why the spring-data library built the query with _id in the older version but uses the field name as-is in the newer version?
After a 2-minute search of the Spring documentation (https://docs.spring.io/spring-data/mongodb/docs/1.3.3.RELEASE/reference/html/mapping-chapter.html):
The following outlines what field will be mapped to the '_id' document field:
A field annotated with @Id (org.springframework.data.annotation.Id) will be mapped to the '_id' field.
A field without an annotation but named id will be mapped to the '_id' field.
Did you try that already?
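If the goal is simply to restore the old query shape, one workaround is to target the stored field name directly; a minimal sketch, reusing the itemIds list and mongoTemplate from the question (the "_id" key is what the @Id property is actually stored under):
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

// Query the document field the @Id property is stored under, bypassing the property-name translation
Query query = new Query(Criteria.where("_id").in(itemIds));
Update update = new Update().set("field2", "abd");
mongoTemplate.updateMulti(query, update, A.class);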

Convert ObjectId to String in Spring Data

How can I reference two mongodb collections using spring data while the localField is of type ObjectId and foreignField is of type String?
ProjectionOperation convertId = Aggregation.project().and("_id").as("agentId");
LookupOperation activityOperation = LookupOperation.newLookup()
        .from("activity")
        .localField("agentId")
        .foreignField("agent_id")
        .as("activities");
Aggregation aggregation = Aggregation.newAggregation(convertId, activityOperation);
return mongoTemplate.aggregate(aggregation, "agents", AgentDTO.class).getMappedResults();
However, this doesn't return any records because of the type mismatch. Is it possible to use $toString or $convert in a ProjectionOperation? Or what other options are there?
I was able to solve it by writing a native MongoDB aggregation operation in Java code, as described in MongoDB $aggregate $push multiple fields in Java Spring Data.
After implementing this solution I was able to add a native $addFields stage as follows:
AggregationOperation addField = new GenericAggregationOperation("$addFields", "{ \"agId\": { \"$toString\": \"$_id\" }}");
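GenericAggregationOperation is not part of Spring Data MongoDB; it is a small custom class from the linked answer. A possible sketch, assuming the pre-3.x AggregationOperation contract of spring-data-mongodb 2.x:
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;

// Wraps a raw aggregation stage given as an operator plus its JSON body,
// e.g. "$addFields" and "{ \"agId\": { \"$toString\": \"$_id\" } }"
public class GenericAggregationOperation implements AggregationOperation {

    private final String operator;
    private final String json;

    public GenericAggregationOperation(String operator, String json) {
        this.operator = operator;
        this.json = json;
    }

    @Override
    public Document toDocument(AggregationOperationContext context) {
        return new Document(operator, Document.parse(json));
    }
}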

Spring Data find substring containing special characters

How can I retrieve data containing special characters like . / - from Elasticsearch using Spring Data?
I have a document defined like this:
@Document(indexName = "audit-2018.135", type = "audit")
public class Trace {

    @Id
    private String id;

    @Field(type = FieldType.Text)
    private String uri;

    // setters & getters
}
I need to retrieve data from Elasticsearch by the field uri.
Here's an example of how the data looks:
martin-int-vip.vs.cz:5080/kib/api/runtime/case/CASE0000000000324223
When using Kibana I can use:
uri: "martin-int-vip.vs.cz:5080/kib"
and I get back the record above, which contains the desired substring.
Now I need to achieve the same from Java; however, with Spring Data it's not working as expected.
I have an Elasticsearch repository defined like this:
public interface TraceRepository extends ElasticsearchRepository<Trace, String> {
List<Trace> findByUriContaining(String uri);
}
When I call the method findByUriContaining with the uri parameter as:
martin-int-vip.vs.cz:5080\/kib
or even this
martin-int
I get back 0 results. When I send "kib" as the parameter it correctly returns all records containing the word "kib", but it's not working with special characters like . / - etc. How should I query Elasticsearch from Java to get all records which contain my desired substring? Thanks
I found out that by default the Elasticsearch analyzer tokenizes your query. Instead of "martin-int-vip.vs.cz:5080/kib" it looks for a field which contains martin and int and vip...
In order to query this kind of field you need to change the behaviour of the analyzer, or you can use ElasticsearchTemplate and Criteria to build more flexible queries.
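A minimal sketch of the Criteria approach, assuming the pre-4.x ElasticsearchTemplate API that matches the @Document(type = ...) mapping used above; the variable names are illustrative:
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;

import java.util.List;

// Criteria.contains() targets the raw substring instead of relying on the analyzed terms
CriteriaQuery query = new CriteriaQuery(
        new Criteria("uri").contains("martin-int-vip.vs.cz:5080/kib"));
List<Trace> traces = elasticsearchTemplate.queryForList(query, Trace.class);
Alternatively, mapping the field as FieldType.Keyword keeps it from being tokenized at all, at the cost of reindexing the data.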

Group together node properties and return them as a view in Cypher

I am working with v2.2.3 of Neo4j and Spring Data Neo4j (SDN) 4.
I want to return a few properties of a node using a Cypher query and map them into attributes of a POJO. My function in the Spring Data repository looks something like this:
@Query("MATCH (n:ServiceProvider {profileStatus:{pStatus}, currentResidenceState:{location}}) "
        + "RETURN n.name, n.currentResidenceAddress, n.employmentStatus, "
        + "n.idProofType, n.idProofNumber "
        + "ORDER BY n.registrationDate DESC SKIP {skip} LIMIT {limit}")
List<AdminSearchMapResult> getServiceProviderRecords(
        @Param("pStatus") String pStatus,
        @Param("location") String location,
        @Param("skip") int skip, @Param("limit") int limit);
I get an error like
Scalar response queries must only return one column. Make sure your cypher query only returns one item.
I think it's because I can't bundle all the returned attributes into a view that can be mapped onto the POJO.
If I return the node itself and map it into a POJO, it works.
Kindly guide
This can be done using @QueryResult.
Annotate the AdminSearchMapResult POJO with @QueryResult. For example:
@QueryResult
public class AdminSearchMapResult {
    String name;
    String currentResidenceAddress;
    ...
}
Optionally annotate properties with @Property(name = "n.idProofType") if the alias is different from the field name.
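For example, if the Cypher column comes back as n.idProofType, the corresponding field inside the same class could be mapped like this:
// Maps the returned column alias to a differently named POJO field
@Property(name = "n.idProofType")
String idProofType;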
