Querying with Completion Suggesters in Elasticsearch with Java API - elasticsearch

I have my indices created, and the mapping type for my 'suggest' field set to completion. I can't figure out how to build the query for completion suggestions with the Elasticsearch Java API.
I'm trying to base my implementation on this query:
"song-suggest" : {
    "text" : "n",
    "completion" : {
        "field" : "suggest"
    }
}
Here's what I have so far,
CompletionSuggestionBuilder compBuilder = new CompletionSuggestionBuilder("complete");
compBuilder.text("n");
compBuilder.field("suggest");
SearchResponse searchResponse = localClient.prepareSearch(INDEX_NAME)
        .setTypes("completion")
        .setQuery(QueryBuilders.matchAllQuery())
        .addSuggestion(compBuilder)
        .execute().actionGet();
CompletionSuggestion compSuggestion = searchResponse.getSuggest().getSuggestion("complete");
Am I missing something, doing something wrong? Thanks!

I'm not sure if this is the best way to do it, but the following works for me. Hope it helps.
@Override
public List<SuggestionResponse> findSuggestionsFor(String suggestRequest) {
    CompletionSuggestionBuilder suggestionsBuilder = new CompletionSuggestionBuilder("completeMe");
    suggestionsBuilder.text(suggestRequest);
    suggestionsBuilder.field("suggest");
    SuggestRequestBuilder suggestRequestBuilder =
            client.prepareSuggest(MUSIC_INDEX).addSuggestion(suggestionsBuilder);
    logger.debug(suggestRequestBuilder.toString());
    SuggestResponse suggestResponse = suggestRequestBuilder.execute().actionGet();
    Iterator<? extends Suggest.Suggestion.Entry.Option> iterator =
            suggestResponse.getSuggest().getSuggestion("completeMe").iterator().next().getOptions().iterator();
    List<SuggestionResponse> items = new ArrayList<>();
    while (iterator.hasNext()) {
        Suggest.Suggestion.Entry.Option next = iterator.next();
        items.add(new SuggestionResponse(next.getText().string()));
    }
    return items;
}

paramsMap = req.getParameterMap();
String prefix = getParam("prefix");
if (prefix == null) {
    EndpointUtil.badRequest("Autocomplete EndPoint: prefix parameter is missing", resp);
    return;
}
SearchRequest searchRequest = new SearchRequest("section");
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.timeout(new TimeValue(60, TimeUnit.SECONDS));
searchSourceBuilder.from(0);
searchSourceBuilder.size(MAX_HITS);
CompletionSuggestionBuilder suggestionBuilder = new CompletionSuggestionBuilder("text.completion")
        .prefix(prefix, Fuzziness.AUTO).size(MAX_HITS);
SuggestBuilder suggestBuilder = new SuggestBuilder();
suggestBuilder.addSuggestion(SUGGEST_NAME, suggestionBuilder);
searchSourceBuilder.suggest(suggestBuilder);
searchRequest.source(searchSourceBuilder);
SearchResponse searchResponse = getElasticClient().search(searchRequest);
Suggest suggest = searchResponse.getSuggest();
List<Document> results = new ArrayList<>();
Suggest.Suggestion<Suggest.Suggestion.Entry<Suggest.Suggestion.Entry.Option>> suggestion =
        suggest.getSuggestion(SUGGEST_NAME);
for (Suggest.Suggestion.Entry<Suggest.Suggestion.Entry.Option> entry : suggestion.getEntries()) {
    for (Suggest.Suggestion.Entry.Option option : entry.getOptions()) {
        Document doc = new Document();
        doc.append("text", option.getText().toString());
        results.add(doc);
    }
}
sendJsonResult(results, resp);

But I'm running into the error "field "suggest" doesn't have type 'completion'". My mapping looks like this:
.field("suggest")
.startObject()
    .field("type", "completion")
    .field("index_analyzer", "simple")
    .field("search_analyzer", "simple")
.endObject()
It sounds like your mapping was not applied correctly. Did you verify it?
Based on the mapping you provided, I think you are missing the properties around your mapping. Try the following mapping:
XContentFactory.jsonBuilder().startObject()
.startObject("properties")
.startObject("suggest")
.field("type", "completion")
.endObject()
.endObject()
.endObject()
By the way, SimpleAnalyzer is the default analyzer for suggestions, so you don't need to define it explicitly.
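For reference, the builder above serializes to a single-line JSON object. A minimal stdlib-only sketch that assembles the same string by hand, useful for eyeballing what actually gets sent to the cluster (the helper name is mine, not part of any API):

```java
public class MappingSketch {

    // Hand-rolled version of the JSON that the XContentBuilder mapping
    // above produces for a completion field.
    static String completionMapping(String field) {
        return "{\"properties\":{\"" + field + "\":{\"type\":\"completion\"}}}";
    }

    public static void main(String[] args) {
        System.out.println(completionMapping("suggest"));
    }
}
```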

To anyone who still needs this: the code snippet below works with ES v6.3:
CompletionSuggestionBuilder suggestionBuilder =
        new CompletionSuggestionBuilder("<field_name>").prefix("<search_term>");
SearchRequestBuilder requestBuilder = oaEsClient.client().prepareSearch("<index_name>")
        .setTypes("<type_name>")
        .suggest(new SuggestBuilder().addSuggestion("<suggestion_name>", suggestionBuilder))
        .setSize(20)
        .setFetchSource(true)
        .setExplain(false);
SearchResponse response = requestBuilder.get();
Suggest suggest = response.getSuggest();

Related

How to use Term Query for nested objects in spring data elasticsearch?

My Document is like:
class Foo {
    private Integer idDl;
    private String Name;
    private String Add;
    @Field(type = FieldType.Nested)
    private List<Bar> Bar;
}
class Bar {
    private String barId;
    private List<String> barData;
}
and Foo sample response data is like:
{
  "idDl": 123,
  "Name": "ABCD",
  "Add": "FL",
  "Bar": [
    {
      "barId": "A456B",
      "barData": [
        "Bar1",
        "Bar2"
      ]
    },
    {
      "barId": "A985D",
      "barData": [
        "Bar4",
        "Bar5"
      ]
    }
  ]
}
I want to return all Foo objects where Bar.barId matches. I am using the NativeSearchQueryBuilder provided by spring-data-elasticsearch:
String[] includeFields = new String[]{"idDl", "Name"};
String[] excludeFields = new String[]{"Add"}; // to exclude the Add field of Foo
Query searchQuery = new NativeSearchQueryBuilder()
        .withQuery(termQuery("Bar.barId", "A456B"))
        .withSourceFilter(new FetchSourceFilter(includeFields, excludeFields))
        .build();
return elasticsearchRestTemplate.queryForList(searchQuery, Foo.class);
We have also tried using nestedQuery as follows:
SearchQuery searchQuery = new NativeSearchQueryBuilder()
        .withQuery(nestedQuery("Bar",
                boolQuery().must(termQuery("Bar.barId", "A456B")), ScoreMode.Max))
        .withIndices(indices)
        .withSourceFilter(new FetchSourceFilter(includeFields, excludeFields))
        .build();
return elasticsearchRestTemplate.queryForList(searchQuery, Foo.class);
But I am getting this exception:
org.elasticsearch.ElasticsearchStatusException: Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:177) ~[elasticsearch-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:2053) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:2030) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1777) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1734) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1696) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:1092) ~[elasticsearch-rest-high-level-client-6.8.7.jar:6.8.7]
I am using termQuery as in the first snippet, but I get no response for it; if I instead use matchQuery("Bar.barId", "A456B"), I do get a response. We just want to compare query performance between termQuery and matchQuery. How do I fetch the data using termQuery?
P.S: we are using spring-boot-starter-data-elasticsearch 2.2.6.RELEASE in our spring-boot project.
We had a similar requirement and solved it with this snippet; I've tried to convert it to fit your requirement. The code is pretty straightforward; let me know if you need further clarification.
BoolQueryBuilder boolQueryBuilder = boolQuery();
BoolQueryBuilder nestedBoolQueryBuilder = boolQuery().must(boolQuery()
        .should(termQuery("Bar.barId", barId.toLowerCase()))).minimumNumberShouldMatch(1);
QueryBuilder nestedQueryBuilder = nestedQuery("Bar", nestedBoolQueryBuilder);
boolQueryBuilder = boolQueryBuilder.must(nestedQueryBuilder);
SearchQuery searchQuery = new NativeSearchQueryBuilder()
        .withQuery(boolQueryBuilder)
        .withPageable(pageable)
        .build();
You haven't specified any analyzer, so the default standard analyzer is used.
Unlike full-text queries, term-level queries do not analyze search terms. Instead, term-level queries match the exact terms stored in a field.
Reference
A term query does not analyze its input, so it looks for A456B; but the index contains a456b, because the standard analyzer lowercases tokens.
A match query, on the other hand, is a full-text query that applies the analyzer at both index time and search time, so at search time a456b matches the term a456b stored in the index.
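The difference can be illustrated without a cluster. This is a simplified stand-in for the standard analyzer (whitespace split plus lowercasing, not the real tokenizer) showing why the unanalyzed term A456B misses while the analyzed match hits:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Set;

public class TermVsMatchSketch {

    // Stand-in for the standard analyzer: split on whitespace, lowercase.
    static List<String> analyze(String text) {
        List<String> tokens = new ArrayList<>();
        for (String t : text.split("\\s+")) {
            tokens.add(t.toLowerCase(Locale.ROOT));
        }
        return tokens;
    }

    public static void main(String[] args) {
        // Index time: the stored value is analyzed, so "A456B" is indexed as "a456b".
        Set<String> index = new HashSet<>(analyze("A456B"));

        // term query: the search term is NOT analyzed -> exact lookup fails.
        boolean termHit = index.contains("A456B");

        // match query: the search term IS analyzed first -> "a456b" is found.
        boolean matchHit = index.containsAll(analyze("A456B"));

        System.out.println("term=" + termHit + " match=" + matchHit); // term=false match=true
    }
}
```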

More Like This Query Not Getting Serialized - NEST

I am trying to create an Elasticsearch MLT query using NEST's object initializer syntax. However, the final query, when serialized, is missing ONLY the MLT part. Every other query is present.
When inspecting the query object, the MLT is present. It's just not getting serialized.
I wonder what I may be doing wrong.
I also noticed that when I add Fields it works. But I don't believe Fields is a mandatory property here, such that the MLT query would be ignored when it is not set.
The MLT query is initialized like this;
new MoreLikeThisQuery
{
    Like = new[]
    {
        new Like(new MLTDocProvider
        {
            Id = parameters.Id
        }),
    }
}
MLTDocProvider implements the ILikeDocument interface.
I expect the serialized query to contain the MLT part, but it is the only part that is missing.
This looks like a bug in the conditionless behaviour of the more like this query in NEST; I've opened an issue to address it. In the meantime, you can get the desired behaviour by marking the MoreLikeThisQuery as verbatim, which will override NEST's conditionless behaviour:
var client = new ElasticClient();
var parameters = new
{
    Id = 1
};
var searchRequest = new SearchRequest<Document>
{
    Query = new MoreLikeThisQuery
    {
        Like = new[]
        {
            new Like(new MLTDocProvider
            {
                Id = parameters.Id
            }),
        },
        IsVerbatim = true
    }
};
var searchResponse = client.Search<Document>(searchRequest);
var searchResponse = client.Search<Document>(searchRequest);
which serializes as
{
  "query": {
    "more_like_this": {
      "like": [
        {
          "_id": 1
        }
      ]
    }
  }
}

Spring boot custom query MongoDB

I have this MongoDb query:
db.getCollection('user').find({
    $and : [
        {"status" : "ACTIVE"},
        {"last_modified" : { $lt: new Date(), $gte: new Date(new Date().setDate(new Date().getDate()-1))}},
        {"$expr": { "$ne": ["$last_modified", "$time_created"] }}
    ]
})
It works in Robo3T, but when I put it in Spring Boot as a custom query, it throws an error on project start.
@Query("{ $and : [ {'status' : 'ACTIVE'}, {'last_modified' : { $lt: new Date(), $gte: new Date(new Date().setDate(new Date().getDate()-1))}}, {'$expr': { '$ne': ['$last_modified', '$time_created']}}]}")
public List<User> findModifiedUsers();
I tried to make query with Criteria in spring:
Query query = new Query();
Criteria criteria = new Criteria();
criteria.andOperator(Criteria.where("status").is(UserStatus.ACTIVE), Criteria.where("last_modified").lt(new Date()).gt(lastDay), Criteria.where("time_created").ne("last_modified"));
but it doesn't work; it returns all users, as if the last criterion (last_modified not equal to time_created) weren't there.
Does anyone know what the problem could be?
I think this feature is not supported yet by Criteria - check https://jira.spring.io/browse/DATAMONGO-1845.
One workaround is to pass a raw query via mongoTemplate like this:
BasicDBList expr = new BasicDBList();
expr.addAll(Arrays.asList("$last_modified","$time_created"));
BasicDBList and = new BasicDBList();
and.add(new BasicDBObject("status","ACTIVE"));
and.add(new BasicDBObject("last_modified",new BasicDBObject("$lt",new Date()).append("$gte",lastDate)));
and.add(new BasicDBObject("$expr",new BasicDBObject("$ne",expr)));
Document document = new Document("$and",and);
FindIterable<Document> result = mongoTemplate.getCollection("Users").find(document);
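To make the shape of that raw filter easier to see, here is a stdlib-only sketch modeling the same $and/$expr document with plain maps and lists (the $lt/$gte values are placeholders passed in by the caller; no MongoDB classes involved):

```java
import java.util.ArrayList;
import java.util.Date;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ExprFilterSketch {

    // Models the filter the BasicDBObject code above builds:
    // { $and: [ {status}, {last_modified range}, {$expr $ne} ] }
    static Map<String, Object> buildFilter(Object lt, Object gte) {
        List<Object> and = new ArrayList<>();
        and.add(Map.of("status", "ACTIVE"));
        Map<String, Object> range = new LinkedHashMap<>();
        range.put("$lt", lt);
        range.put("$gte", gte);
        and.add(Map.of("last_modified", range));
        and.add(Map.of("$expr",
                Map.of("$ne", List.of("$last_modified", "$time_created"))));
        return Map.of("$and", and);
    }

    public static void main(String[] args) {
        Map<String, Object> filter = buildFilter(new Date(), new Date());
        System.out.println(((List<?>) filter.get("$and")).size()); // 3 clauses
    }
}
```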

WrapperQueryBuilder - aggs query throwing Query Malformed exception

I have a Json query string:
"\"query\":{\"match_all\": {}},\"aggs\":{\"avg1\":{\"avg\":{\"field\":\"age\"} } }";
When the query is executed via the Jest client, aggregation values are available.
But when this query is converted into a WrapperQueryBuilder object, I get the following exception:
; nested: QueryParsingException[[st1index] [_na] query malformed, must start with start_object]; }{[ixJ-6RHNR5C6fC7HfJHqaw][st1index][4]: SearchParseException[[st1index][4]: from[-1],size[-1]: Parse Failure [Failed to parse source [{
"query" : {
"wrapper" : {
"query" : "InF1ZXJ5Ijp7Im1hdGNoX2FsbCI6IHt9fSwiYWdncyI6eyJhdmcxIjp7ImF2ZyI6eyJmaWVsZCI6ImFnZSJ9IH0gfQ=="
}
}
}]]]; nested: QueryParsingException[[st1index] [_na] query malformed, must start with start_object]; }]
How do I fix this?
Edit 1: code analysis details added:
public static void main(String[] args) throws Exception {
    try {
        // Jest client building
        JestClientFactory factory = new JestClientFactory();
        HttpClientConfig config = new HttpClientConfig.Builder("http://localhost:9201")
                .connTimeout(10000)
                .readTimeout(10000)
                .multiThreaded(true).build();
        factory.setHttpClientConfig(config);
        JestClient jestClient = factory.getObject();
        String query = "{\"query\":{\"match_all\": {}},\"aggs\":{\"avg1\":{\"avg\":{\"field\":\"age\"} } }}";
        String query2 = "{\"match_all\": {}},\"aggs\":{\"avg1\":{\"avg\":{\"field\":\"age\"} } }}";
        WrapperQueryBuilder wrapQB = new WrapperQueryBuilder(query2);
        SearchSourceBuilder ssb = new SearchSourceBuilder();
        ssb.query(wrapQB);
        // working code commented out
        // Search.Builder searchBuilder = new Search.Builder(query).addIndex("st1index").addType("st1type");
        // code which needs to be fixed
        Search.Builder searchBuilder =
                new Search.Builder(ssb.toString()).addIndex("st1index").addType("st1type");
        SearchResult result = jestClient.execute(searchBuilder.build());
        System.out.println(result.getJsonString());
    } catch (Exception e) {
        System.out.println("inside exception block");
        e.printStackTrace();
    }
}
With the plain String query (the commented-out working line), the aggs results are displayed; but using the WrapperQueryBuilder, I am unable to retrieve aggs results.
You're almost there, you're simply missing enclosing braces:
"{\"query\":{\"match_all\": {}},\"aggs\":{\"avg1\":{\"avg\":{\"field\":\"age\"}}}}";
^ ^
| |
this one... ...and this one
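As a sanity check, the query value in the error message is just the wrapped query base64-encoded. Decoding it shows the payload starts with "query" rather than {, which is exactly what "must start with start_object" complains about:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodeWrapperQuery {

    static String decode(String b64) {
        return new String(Base64.getDecoder().decode(b64), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Base64 payload copied verbatim from the error message in the question.
        String b64 = "InF1ZXJ5Ijp7Im1hdGNoX2FsbCI6IHt9fSwiYWdncyI6eyJhdmcxIjp7ImF2ZyI6eyJmaWVsZCI6ImFnZSJ9IH0gfQ==";
        String decoded = decode(b64);
        System.out.println(decoded);
        // The decoded string begins with "query", not '{' -> not a start_object.
        System.out.println(decoded.startsWith("{")); // false
    }
}
```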
UPDATE
In the WrapperQueryBuilder, you can only pass the content of the query part, not the aggregations part. You need to add the aggregation part directly on the SearchSourceBuilder like this:
SearchSourceBuilder ssb = new SearchSourceBuilder();
// add the query part
String query ="{\"match_all\": {}}";
WrapperQueryBuilder wrapQB = new WrapperQueryBuilder(query);
ssb.query(wrapQB);
// add the aggregation part
AvgBuilder avgAgg = AggregationBuilders.avg("avg1").field("age");
ssb.aggregation(avgAgg);

Elasticsearch: bulk update multiple documents saved in a Java String?

I can create the following string saved in a Java String object called updates.
{ "update":{ "_index":"myindex", "_type":"order", "_id":"1"} }
{ "doc":{"field1" : "aaa", "field2" : "value2" }}
{ "update":{ "_index":"myindex", "_type":"order", "_id":"2"} }
{ "doc":{"field1" : "bbb", "field2" : "value2" }}
{ "update":{ "_index":"myindex", "_type":"order", "_id":"3"} }
{ "doc":{"field1" : "ccc", "field2" : "value2" }}
Now I want to do a bulk update within a Java program:
Client client = getClient(); //TransportClient
BulkRequestBuilder bulkRequest = client.prepareBulk();
//?? how to attach updates variable to bulkRequest?
BulkResponse bulkResponse = bulkRequest.execute().actionGet();
I am unable to find a way to attach the above updates variable to bulkRequest before execute.
I notice that I am able to add an UpdateRequest object to bulkRequest, but it seems to add only one document at a time. As indicated above, I have multiple to-be-updated documents in one string.
Can someone enlighten me on this? I have a gut feeling that I may be doing things the wrong way.
Thanks and regards.
The following code should work fine for you.
For each document update, create a separate update request as below and keep adding it to the same bulk request.
Once the bulk request is ready, execute it.
JSONObject obj = new JSONObject();
obj.put("field1", "value1");
obj.put("field2", "value2");
UpdateRequest updateRequest = new UpdateRequest(index, indexType, id1).doc(obj.toString());
BulkRequestBuilder bulkRequest = client.prepareBulk();
bulkRequest.add(updateRequest);
obj = new JSONObject();
obj.put("fieldX", "value1");
obj.put("fieldY", "value2");
updateRequest = new UpdateRequest(index, indexType, id2).doc(obj.toString());
// keep adding to the same bulk builder; calling client.prepareBulk() again here would discard the first update
bulkRequest.add(updateRequest);
bulkRequest.execute().actionGet();
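If the bulk payload already exists as one NDJSON string (as in the question), one option is to split it into action/document line pairs yourself and turn each pair into its own UpdateRequest for the bulk builder. A stdlib-only sketch of just the pairing step (parsing the _index/_id metadata out of the action line is left out):

```java
import java.util.ArrayList;
import java.util.List;

public class BulkBodySketch {

    // Splits a bulk NDJSON string into (action line, doc line) pairs;
    // each pair would back one UpdateRequest added to the bulk builder.
    static List<String[]> pair(String ndjson) {
        String[] lines = ndjson.split("\n");
        List<String[]> pairs = new ArrayList<>();
        for (int i = 0; i + 1 < lines.length; i += 2) {
            pairs.add(new String[] { lines[i], lines[i + 1] });
        }
        return pairs;
    }

    public static void main(String[] args) {
        // The bulk body from the question: action line followed by doc line.
        String updates =
              "{ \"update\":{ \"_index\":\"myindex\", \"_type\":\"order\", \"_id\":\"1\"} }\n"
            + "{ \"doc\":{\"field1\" : \"aaa\", \"field2\" : \"value2\" }}\n"
            + "{ \"update\":{ \"_index\":\"myindex\", \"_type\":\"order\", \"_id\":\"2\"} }\n"
            + "{ \"doc\":{\"field1\" : \"bbb\", \"field2\" : \"value2\" }}\n"
            + "{ \"update\":{ \"_index\":\"myindex\", \"_type\":\"order\", \"_id\":\"3\"} }\n"
            + "{ \"doc\":{\"field1\" : \"ccc\", \"field2\" : \"value2\" }}";
        System.out.println(pair(updates).size()); // 3 action/doc pairs
    }
}
```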
I ran into the same problem where only one document got updated in my program. Then I found the following way, which worked perfectly fine. This uses the Spring Java client; I have also listed the dependencies I used in the code.
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.index.query.QueryBuilder;
import org.springframework.data.elasticsearch.core.query.UpdateQuery;
import org.springframework.data.elasticsearch.core.query.UpdateQueryBuilder;
private UpdateQuery updateExistingDocument(String Id) {
    // Add UpdatedDateTime, CreatedDateTime, CreatedBy, UpdatedBy fields to existing documents in Elasticsearch
    UpdateRequest updateRequest = new UpdateRequest().doc(
            "UpdatedDateTime", new Date(), "CreatedDateTime", new Date(),
            "CreatedBy", "admin", "UpdatedBy", "admin");
    // Create the update query
    UpdateQuery updateQuery = new UpdateQueryBuilder().withId(Id).withClass(ElasticSearchDocument.class).build();
    updateQuery.setUpdateRequest(updateRequest);
    // Execute the update
    elasticsearchTemplate.update(updateQuery);
    return updateQuery;
}
