Wildcard on key in graylog - elasticsearch

Hi, I am trying to use a wildcard on the key inside my query. Because I have arrays in my data, I am saving it in a flattened form with keys like obj_0_ID, obj_1_ID and so on. Is there any way to write something like obj_*_ID:123?
Thank you

You can use query_string, which lets you specify wildcards on field names. Here is the example from the docs:
{
  "query_string" : {
    "fields" : ["city.*"],
    "query" : "this AND that OR thus",
    "use_dis_max" : true
  }
}
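Applied to the flattened keys from the question, the query might look like the following sketch (the field pattern obj_*_ID is an assumption about how the flattened keys are actually named in your mapping):
{
  "query" : {
    "query_string" : {
      "fields" : ["obj_*_ID"],
      "query" : "123"
    }
  }
}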

Related

How to query strings case-insensitively in Elasticsearch

I'm searching on two fields: one field must match exactly, and the other one uses a query string.
I have this data:
{
  "NUMBER" : "5587120",
  "SID" : "121213-13131-_X",
  "ADDRESS" : "purwakarta"
}
I have tried using a query string like this:
GET test/_doc/_search
{
  "query" : {
    "bool" : {
      "must" : [
        { "match" : { "NUMBER" : "5587120" } }
      ],
      "filter" : {
        "query_string" : {
          "default_field" : "SID.keyword",
          "query" : "*X*"
        }
      }
    }
  }
}
When I enter the same text as the one that was indexed, the document I want appears, but when I type the text in lowercase, it doesn't.
It's not entirely clear from your question which field you want the case-insensitive search on; based on the context I am assuming it's the SID.keyword field.
Why your solution isn't working: keyword fields are not analyzed, so Elasticsearch indexes them as-is. For your field SID.keyword you provide the value 121213-13131-_X, and it is stored unchanged as a single token that is exactly the same as the provided value.
You are then running query_string against SID.keyword, so the query uses the analyzer configured for that field, the keyword analyzer, which is effectively a no-op and therefore does not lowercase the *X* in your query either; only an exact-case wildcard can match the indexed token.
Solution: if you want case-insensitive search, then instead of using the SID.keyword field, create a custom analyzer that uses the keyword tokenizer and passes its output through the lowercase token filter, so your 121213-13131-_X will be indexed as 121213-13131-_x (note the lowercase x). Your query string will then use the same analyzer and will match the document, since Elasticsearch ultimately matches on tokens. A sketch of such a mapping follows below.
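A minimal sketch of that setup, assuming an ES 6.x-style single _doc type to match the GET test/_doc/_search request above (the analyzer name lowercase_keyword is a placeholder; the index test and the field SID come from the question):
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "lowercase_keyword": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [ "lowercase" ]
        }
      }
    }
  },
  "mappings": {
    "_doc": {
      "properties": {
        "SID": {
          "type": "text",
          "analyzer": "lowercase_keyword"
        }
      }
    }
  }
}
With this mapping, point default_field at SID instead of SID.keyword and both *X* and *x* will match. On versions that support normalizers, a keyword field with a lowercase normalizer is an alternative that achieves the same effect without switching the field to text.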

How to specify certain fields only in the query property

I am using a service which wraps requests to Elastic Search. This service only allows me to send the query property to Elastic Search. I want to tell Elastic Search to look only for matches in a certain field in a document.
For example, if this is my document:
{
  "name": "foo",
  "value": "true"
}
Then I want to tell Elastic Search to look only for documents where name equals foo.
The Elastic Search documentation says to do this by using the fields property like so:
{
  "multi_match" : {
    "query" : "this is a test",
    "fields" : [ "subject^3", "message" ]
  }
}
But I can ONLY access the query property, so I can't specify fields. Lower down on the page, under best_fields, it says that this is equivalent to doing something like +first_name:will +first_name:smith. But when I put that in the query property, it looks for text that literally matches +first_name:will +first_name:smith in the value, rather than looking for a first_name field that has the value will.
Is it possible to specify what field to search in with Elastic Search using only the query property?
This sounds like a perfect match for query_string (https://www.elastic.co/guide/en/elasticsearch/reference/1.x/query-dsl-query-string-query.html). You can do something like this with it:
"query_string" : {
"query" : "subject:whatever OR message:whatever"
}
So, if you can change multi_match to query_string this would be what you are looking for.
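For the example document in the question, that would translate to something like this (the field name name is taken from the question's document; this is a sketch assuming a default string mapping):
"query_string" : {
  "query" : "name:foo"
}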
Lucene supports fielded data. When performing a search you can either specify a field or use the default field. The field names and the default field are implementation specific.
You can search any field by typing the field name followed by a colon ":" and then the term you are looking for.
{
  "query": {
    "query_string": {
      "query": "Name:\"foo bar cook\"",
      "default_operator" : "or"
    }
  }
}
Use default_operator: set it to and to require all of the terms, or to or to match any of them.
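For example, to require every term within the Name field you can use query_string's grouping syntax together with the and operator (a sketch, assuming the Name field is analyzed):
{
  "query": {
    "query_string": {
      "query": "Name:(foo bar cook)",
      "default_operator": "and"
    }
  }
}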

Is there a way to apply the synonym token filter in ElasticSearch to field names rather than the value?

Consider the following JSON file:
{
  "titleSony": "Matrix",
  "cast": [
    {
      "firstName": "Keanu",
      "lastName": "Reeves"
    }
  ]
}
Now, I know in ElasticSearch, you can apply a synonym token filter to field values as given in the following link: Elasticsearch Analysis: Synonym token filter.
Hence, I can create a "synonym.txt" file with Matrix => Matx, then if I search for titleSony:Matx, it will return the documents with Matrix as well.
Now, what I would like is to create a synonym for the field name titleSony. For example - titleSony => titleAll, such that when I search for titleAll, I should get all documents with titleSony as well.
Is there any way to accomplish this in ElasticSearch?
Yes, somewhat. Elasticsearch has some default behavior very similar to this, which I'll touch on in a bit.
The feature you're looking for is called "Copy to field." It allows you to specify that the terms in one field should be copied into another. This is useful for consolidating terms you expect to match into a single field, to help simplify your query when you would like to match against any one of a number of fields.
In this example, you would specify in your mapping that the terms in the titleSony field ought to be copied into the titleAll field. Presumably you'd have other fields (say, titleDisney) which also copy into that field as well. So a search against titleAll will effectively match the other fields whose terms are copied into it.
An excerpt of your mapping might look something like this:
{
  "movies" : {
    "properties" : {
      "titleSony" : { "type" : "string", "copy_to" : "titleAll" },
      "titleDisney" : { "type" : "string", "copy_to" : "titleAll" },
      "titleAll" : { "type" : "string" },
      "cast" : { ... },
      ...
    }
  }
}
I mentioned earlier that Elasticsearch does something like this. By default it creates a special field called _all into which all the document's terms are copied. This field lets you construct very simple queries to match against terms that occur in any field on the document. So as you see, this is a fairly common convention in Elasticsearch. (Elasticsearch mapping: _all field.)
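With a mapping like the excerpt above, searching the consolidated field is then just an ordinary query against titleAll (the index and field names come from the mapping sketch; the query itself is illustrative):
{
  "query": {
    "match": {
      "titleAll": "Matrix"
    }
  }
}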

Find uppercase strings with wildcard

I have a field my_field that is defined like this:
"properties" : {
...
"my_field" : { "type" : "string", "store" : "no", "index" : "not_analyzed" },
...
}
All lowercase strings stored in that field can be found with a wildcard:
e.g. kindergarten can be found with my_field:kinder*
but uppercase strings cannot be found with a wildcard:
e.g. KINDERGARTEN can be found neither with my_field:KINDER* nor with my_field:kinder*
Is that the expected behaviour or am I doing something wrong?
You must set lowercase_expanded_terms to false in order to do case-sensitive search with wildcards. Like this: http://localhost:9200/test/_search?lowercase_expanded_terms=false&q=my_field:KINDER*
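The same option can also be set in the request body instead of the URL, on the older Elasticsearch versions where lowercase_expanded_terms still exists (a sketch):
{
  "query": {
    "query_string": {
      "query": "my_field:KINDER*",
      "lowercase_expanded_terms": false
    }
  }
}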
I did a quick test and everything looks correct to me.
I would test analysis on that field with the /_analyze API to check that the values really aren't being lowercased:
curl -XPOST 'http://localhost:9200/test/_analyze?field=my_field' -d 'This Should Be Single Token'
Or try the Index Termlist Plugin to see which tokens are actually stored in that field.

Doing search in elasticsearch

I prepare a query object and do a search in Elasticsearch.
To build the query object, I provide a key and its value.
The problem is that when the key and value are something like "brand":"Men's Wear", Elasticsearch is unable to give me the related docs. I think the problem is the apostrophe, or maybe the space. Everything is fine if I use another JSON property for the key and value that has no space or apostrophe, like "priority":"high".
Any help please!
Update:
No, the match query is still not working! I also found one more problem while creating the search query. The query I am using is:
var qryObj1 = {
  "query" : {
    "text" : { "name" : "Tom" }
  }
};
This returns all docs whose name is Tom. Now I want all docs whose name is Tom and whose profession is developer, so here is the modified version:
qryObj1 = {
  "query" : {
    "text" : { "name" : "Tom", "profession" : "developer" }
  },
  "operator" : "and"
};
But the search result is the same as before. Any help?
Sounds like you are using a term query, aren't you?
Term queries are not analyzed, so they don't match your analyzed content.
Try a match query instead. It should work.
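For the "brand":"Men's Wear" example from the question, a match query would look something like this (the field name brand comes from the question; the sketch assumes a version where the old text query has been renamed to match and that the field is analyzed):
{
  "query": {
    "match": {
      "brand": "Men's Wear"
    }
  }
}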
You need to use a bool query:
http://www.elasticsearch.org/guide/reference/query-dsl/bool-query.html
With it you can ask ES to combine several queries with AND (must) or OR (should):
"bool" : {
"must" : [
"text" : {"name":"Tom"},
"text" : {"profession":"developer"}
]
}
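Wrapped in a complete request body, and using match, which is the newer name for the text query shown above, that would be something like:
{
  "query" : {
    "bool" : {
      "must" : [
        { "match" : { "name" : "Tom" } },
        { "match" : { "profession" : "developer" } }
      ]
    }
  }
}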
