I need a query that does a partial match on a string and filters out documents that have a specific value for a field.
I tried this payload for ES:
payload = {
  search_request: {
    _source: [ 'name', 'source', 'pg_id' ],
    query: {
      match: { name: query_string }
      bool: {
        must_not: {
          term: { "source.source": source_value }
        }
      }
    },
    size: 100
  },
  query_hint: query,
  algorithm: algorithm,
  field_mapping: { title: ["_source.name", "_source.source"] }
}
But ES throws this error:
{
  :error=> {
    :root_cause=> [
      {
        :type=>"parse_exception",
        :reason=>"failed to parse search source. expected field name but got [START_OBJECT]"
      }
    ],
    :type=>"search_phase_execution_exception",
    :reason=>"all shards failed",
    :phase=>"query",
    :grouped=>true,
    :failed_shards=> [
      {
        :shard=>0,
        :index=>"articles",
        :node=>"3BUP3eN_TB2-zExigd_k2g",
        :reason=> {
          :type=>"parse_exception",
          :reason=>"failed to parse search source. expected field name but got [START_OBJECT]"
        }
      }
    ]
  },
  :status=>400
}
I am using Elasticsearch 2.4
First of all, your JSON format is not valid. Check the commas and quotes.
Also, if you only need to filter documents, filters are much faster than queries. Check the documentation.
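For reference, here is a minimal sketch of what the search_request body could look like once it is valid JSON, with the match clause and the must_not term combined inside a single bool query instead of sitting side by side under query (substitute your own query_string and source_value; the field names are taken from the question):
{
  "_source": ["name", "source", "pg_id"],
  "query": {
    "bool": {
      "must": [
        { "match": { "name": "your query string" } }
      ],
      "must_not": [
        { "term": { "source.source": "value to exclude" } }
      ]
    }
  },
  "size": 100
}
In ES 2.x the must_not clause runs in filter context, so it only excludes documents and does not affect scoring.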
PS: my GraphQL skills are pretty basic, so sorry for any incorrect use of words and terms.
I want to achieve filtering on the code field highlighted below
(transactions --> edges --> node --> header --> transactionSource --> code = "something").
{
  transactions(last: 10) {
    edges {
      node {
        amount
        periodId
        header {
          owner {
            owner {
              code
              description
              dbId
              ownerDbId
              ownerCode
            }
          }
          transactionSource {
            code
          }
        }
      }
    }
    pageInfo {
      hasNextPage
    }
  }
}
The client I'm working with has defined a list of filtering options, which I can successfully filter on, but when I try to filter on the code field I get the following result:
{
  "errors": [
    {
      "message": "Argument 'filter' has invalid value. In field 'code': Unknown field.",
      "locations": [
        {
          "line": 2,
          "column": 26
        }
      ],
      "extensions": {
        "code": "ARGUMENTS_OF_CORRECT_TYPE",
        "codes": [
          "ARGUMENTS_OF_CORRECT_TYPE"
        ],
        "number": "5.6.1"
      }
    }
  ]
}
I presume this is because I don't know exactly how to set up the filter correctly.
Is there a way to filter on any field, or do I need to talk to the guys maintaining the client and ask them nicely to make the code field available for filtering?
Thanks
I am using Elasticsearch 6.8, and below is a sample of the query I send:
{
  "query": {
    ...
    "bool": {
      "should": [
        { "match_phrase": { "descriptor": "xxx" } },
        { "match_phrase": { "descriptor": "xxx" } },
        { "match_phrase": { "descriptor": "xxx" } },
        { "match_phrase": { "descriptor": "xxx" } }
      ]
    }
    ...
  }
}
As you can see, there are many match_phrase clauses under the should array. Does the order of these clauses matter in terms of the scores in the result?
The filter parameter indicates filter context. Its term and range clauses are used in filter context. They will filter out documents which do not match, but they WILL NOT affect the score for matching documents.
Please also see the Elasticsearch documentation as a reference.
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-filter-context.html
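To illustrate the difference, here is a minimal sketch of a bool query (the status and publish_date fields are made up for the example): the should clauses contribute to _score, while everything under filter only includes or excludes documents and leaves the score untouched.
{
  "query": {
    "bool": {
      "should": [
        { "match_phrase": { "descriptor": "xxx" } },
        { "match_phrase": { "descriptor": "yyy" } }
      ],
      "filter": [
        { "term": { "status": "published" } },
        { "range": { "publish_date": { "gte": "2019-01-01" } } }
      ]
    }
  }
}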
I want to find exact matches on an (analyzed string) field in ES. All values are integers, but they are mapped as strings. Unfortunately, I cannot change the mapping, and using
query: {
  match: {
    fieldName: '1234'
  }
}
also gives me 0 hits.
I cannot figure out if it's the standard analyzer working in a bizarre way when the mapping is
index: {
  type: {
    properties: {
      fieldName: {
        type: string
      }
    }
  }
}
and data is
{fieldName: '12345'}
or if there is something in the match query that I'm missing.
Thanks :)
Change your quotation marks around the fieldName value from ticks ' to quotes ". Trying your query with the correct quotes returns the expected results on my end.
{
  "query": {
    "match": {
      "fieldName": "1234"
    }
  }
}
I am using Elasticsearch 1.1.
Normally, in this version, percolation on nested documents should work.
However, when I try to do this I get the following error:
failures: [
  {
    index: test
    shard: 4
    reason: BroadcastShardOperationFailedException[[test][4] ]; nested: PercolateException[failed to percolate]; nested: ElasticsearchIllegalArgumentException[Nothing to percolate];
  }
]
I have the following percolator (sorry, elasticsearch-head stripped out all the quotes):
{
  _index: test
  _type: .percolator
  _id: 27
  _version: 1
  _score: 1
  _source: {
    query: {
      filtered: {
        query: {
          match_all: { }
        }
        filter: {
          nested: {
            filter: {
              term: {
                city: london
              }
            }
            path: location
          }
        }
      }
    }
  }
}
And while trying to percolate this document I am getting the error:
{
  ...
  "location": {
    "date": "2014-05-05T15:07:58",
    "namedplaces": {
      "city": "london"
    }
  }
}
Any idea why it doesn't work?
EDIT:
In the Elasticsearch log I get more detail about the error:
[2014-05-06 13:33:48,972][DEBUG][action.percolate ] [Tomazooma] [test][2], node[H42BBxajRs2w2NmllMnp7g], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.percolate.PercolateRequest#7399452e]
org.elasticsearch.percolator.PercolateException: failed to percolate
at org.elasticsearch.action.percolate.TransportPercolateAction.shardOperation(TransportPercolateAction.java:198)
at org.elasticsearch.action.percolate.TransportPercolateAction.shardOperation(TransportPercolateAction.java:55)
at org.elasticsearch.action.support.broadcast.TransportBroadcastOperationAction$AsyncBroadcastAction$2.run(TransportBroadcastOperationAction.java:226)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: org.elasticsearch.ElasticsearchIllegalArgumentException: Nothing to percolate
at org.elasticsearch.percolator.PercolatorService.percolate(PercolatorService.java:187)
at org.elasticsearch.action.percolate.TransportPercolateAction.shardOperation(TransportPercolateAction.java:194)
... 5 more
The ES documentation is not really clear about it, but when you look at this page, you will see that when you percolate you need to wrap the document you are percolating in doc{}. It is indeed compulsory; otherwise you get exactly the exception you ran into.
Try it like this:
{
  "doc": {
    ...
    "location": {
      "date": "2014-05-05T15:07:58",
      "namedplaces": {
        "city": "london"
      }
    }
  }
}
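For reference, a rough sketch of the full percolate call in ES 1.x (the type name mytype is hypothetical; use whatever type your document is indexed under):
curl -XGET 'localhost:9200/test/mytype/_percolate?pretty' -d '{
  "doc": {
    "location": {
      "date": "2014-05-05T15:07:58",
      "namedplaces": {
        "city": "london"
      }
    }
  }
}'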
I hope it will help ;-)
Another reason for the "Nothing to percolate" exception is not setting the Content-Length HTTP header.
Because the GET request has a body, it should also have a Content-Length HTTP header, but not all APIs will set this for you, as I found out the hard way!
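If you are building the request by hand rather than through a client library, the shape is roughly the following (index, type, and the Content-Length value are only illustrative; the header must equal the byte length of the body you actually send):
GET /test/mytype/_percolate HTTP/1.1
Host: localhost:9200
Content-Length: 97

{ "doc": { "location": { "date": "2014-05-05T15:07:58", "namedplaces": { "city": "london" } } } }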
I'm querying for states using the state code as the query string, and "in" and "or" (Indiana and Oregon) are failing, presumably because they're reserved words.
I can confirm that the data exists in the index correctly, because when I run:
curl -XGET 'localhost:9200/state/_search?size=200&pretty=true' -d '{"query" : {"match_all" : {}}}' > out.txt
I can see the data there for both the working states and the non-working states. Plus, if I change the state code of a non-working state in CouchDB to something like XYZ, I can verify that the change makes it to ES by running the above command and searching for XYZ. So I know I'm looking at the right data and it's indexing fine.
The problem is the query. Right now, here's what my entire query object looks like:
var q = {
  size: 0,
  query: {
    filtered: {
      query: { term: { postcode: 'tn' } },
      filter: { term: { version: 2 } }
    }
  },
  facets: {
    version: { terms: { field: "version" } },
    count: { statistical: { field: "latestValues.enroll" } }
  }
};
If I run that query with "or" as the postcode, I get no results. If I swap "or" out for "tn" or "tx" or "sc" etc., then it works fine.
I looked for a way to escape reserved words and found this link, but it doesn't seem to work for me when running the following query:
var q = {
  size: 0,
  query: {
    filtered: {
      query: { match_all: { } },
      filter: { term: { version: 2, postcode: 'or' } }
    }
  },
  facets: {
    version: { terms: { field: "version" } },
    count: { statistical: { field: "latestValues.enroll" } }
  }
};
(Note that that query also works when swapping "or" out for a non-reserved-word state, so I know it's not a problem with the query itself.)
Any ideas?
This is not about "reserved" words, it's about stop words. You are using an analyzer which removes stop words (which was the behaviour of the default analyzer up until a more recent version of Elasticsearch).
You'll need to change the analyzer for the field; see here: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/analysis.html
This change will require reindexing, though.
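For example, a rough sketch of index settings that keep the standard analyzer but drop its stop-word list (the analyzer name no_stopwords is made up here; the index has to be recreated and the data reindexed for this to take effect):
{
  "settings": {
    "analysis": {
      "analyzer": {
        "no_stopwords": {
          "type": "standard",
          "stopwords": "_none_"
        }
      }
    }
  },
  "mappings": {
    "state": {
      "properties": {
        "postcode": {
          "type": "string",
          "analyzer": "no_stopwords"
        }
      }
    }
  }
}
Alternatively, since state codes are exact tokens anyway, mapping postcode as not_analyzed would also avoid the problem.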