Is it possible to search documents in an Elasticsearch index by version? I tried this:
curl -XGET eshost:9200/myindex/mytype/_search -d '{"query":{"match":{"_version":"2"}}}'
But it does not work.
I need such a query to get all documents that have never been updated.
Try using the version parameter, which returns the version for each search hit:
{
"version": true,
"query" : {
"term" : { "user" : "kimchy" }
}
}
Unfortunately, you cannot query or filter by _version - the problem is that the field is not indexed, so queries and filters cannot access it:
http://elasticsearch-users.115913.n3.nabble.com/Can-i-filter-query-by-version-td4044331.html
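Since _version cannot be queried, one workaround is to request "version": true in the search body and filter the hits on the client, keeping those whose _version is still 1 (i.e. never updated - assuming no deletes or reindexing have bumped versions). A minimal Python sketch, using a made-up response in the standard hits format:

```python
# Client-side workaround: _version is not indexed, so ask Elasticsearch to
# return it ("version": true in the search body) and filter hits locally.
# sample_response is an illustrative stand-in for a real /_search response.
sample_response = {
    "hits": {
        "hits": [
            {"_id": "1", "_version": 1, "_source": {"user": "kimchy"}},
            {"_id": "2", "_version": 3, "_source": {"user": "bob"}},
            {"_id": "3", "_version": 1, "_source": {"user": "alice"}},
        ]
    }
}

def never_updated(response):
    """Return hits whose _version is still 1 (created once, never updated)."""
    return [h for h in response["hits"]["hits"] if h.get("_version") == 1]

ids = [h["_id"] for h in never_updated(sample_response)]
print(ids)  # ['1', '3']
```

On a large index this means paging through all documents (e.g. with scroll), so it is a last resort rather than a real query.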
You can search as normal and set "version" to true, and it will return the current version of each document:
GET indextest/original/_search?pretty=true
{
"version": true
}
You might try
GET index/item/_search?pretty=true
{
"version": true
}
Filtering by version is not possible; you can only get the version of each document by setting the parameter "version": true.
In the Elasticsearch term query documentation (https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-term-query.html), it is stated that there is a case_insensitive field.
However, I can't manage to set it. I can set boost and value without issue, but not case_insensitive.
GET movies/_search
{
"query": {
"term": {
"overview": {
"value" : "batman",
"boost": 0.5,
"case_insensitive": true
}
}
}
}
When I run it, I get the error "[term] query does not support [case_insensitive]".
Where did I get it wrong, or is the documentation wrong?
It looks like you are on a version older than ES 7.10.0, where this option did not exist; even in the ES 7.9 documentation, the case_insensitive option is not present.
Please see the related GitHub issue and PR which added case_insensitive support to the term query.
Please also refer to the diff where the caseInsensitive field was added to TermQuery.
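On ES 7.10 or later, a request like the following should be accepted. A sketch of the body, built here as a Python dict so the structure can be verified (index and field names taken from the question); note the underscore in case_insensitive:

```python
import json

# Term query with the case_insensitive flag (supported from ES 7.10 onward).
# The parameter name uses an underscore: case_insensitive, not case-insensitive.
query = {
    "query": {
        "term": {
            "overview": {
                "value": "batman",
                "boost": 0.5,
                "case_insensitive": True,
            }
        }
    }
}
print(json.dumps(query, indent=2))
```

On older clusters the same body produces the "[term] query does not support [case_insensitive]" error, regardless of spelling.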
I am trying to perform the simplest filter for a specific property value, as JSON input, in a Kibana visualization, thoroughly without success.
I can't, to my surprise, find concrete examples of doing that (I have been searching for a couple of minutes now).
Say we have a document with the following structure:
{
a: true,
b: 10
}
How can I add a Filter aggregation for all documents with a = true ?
I tried the "script", "query", and "filters" APIs, but all give me parse errors. My filter JSONs are all valid; my problem is with the exact syntax Elasticsearch expects. All the examples I found out there and tried (after adapting them to my index structure) give me parsing errors.
Kibana's version: 6.4.3
How is this accomplished?
An example:
POST /sales/_search?size=0
{
"aggs" : {
"docs" : {
"filter" : { "term": { "a": true } }
}
}
}
Here is the link to the official documentation with an example.
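To see what the filter aggregation actually computes, here is a small local illustration in Python (the sample documents are made up, mirroring the a/b structure from the question): the aggregation's doc_count is simply the number of documents the filter matches.

```python
# Local illustration of a filter aggregation on {"term": {"a": true}}:
# count the documents whose field "a" is true.
# The docs below are invented examples with the structure from the question.
docs = [
    {"a": True, "b": 10},
    {"a": False, "b": 7},
    {"a": True, "b": 3},
]

doc_count = sum(1 for d in docs if d.get("a") is True)
print(doc_count)  # 2
```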
When a time-based index is added to Kibana, you have to pick the field that will act as the time field. If you want to switch from one field to another, normally I would delete the index pattern and re-add it. But you end up losing scripted fields and field formatting this way.
Is there any way to modify the existing index pattern's time field without losing scripted fields/formatting?
It can probably be done by messing around directly with /.kibana/index-pattern/index_pattern_name but all my attempts with changing timeFieldName directly ended up dropping scripted fields.
This is what worked for me on Kibana 7.7.0 (Elastic Cloud):
find the {id} of the document whose title field corresponds to the index pattern you want to change
GET .kibana/_search
{
"query": {
"match": {
"index-pattern.title": "{NAME_OF_THE_INDEX}"
}
}
}
change the time field with the following code
POST .kibana/_update/{id}
{
"doc": {
"index-pattern": {
"timeFieldName" : "{NEW_TIME_FIELD_NAME}"
}
}
}
The easiest way seems to be updating the corresponding document:
POST /.kibana/index-pattern/YOUR_INDEX_NAME/_update
{
"doc": {
"timeFieldName": "NEW_TIME_FIELD_NAME"
}
}
It should preserve scripted fields.
This doesn't seem to work on Kibana 5.
Instead, the following is the way that worked on Kibana 5.
1. find the {id} of the document whose title field corresponds to the index pattern you want to change
GET .kibana/index-pattern/_search
{
"_source" : "_id",
"query" : {
"match" : {
"title": "{NAME_OF_THE_INDEX}"
}
}
}
2. change the time field with the following code
POST /.kibana/index-pattern/{id}/_update
{
"doc": {
"timeFieldName" : "{NEW_TIME_FIELD_NAME}"
}
}
This worked fine for me on Kibana 5.
I have a 100GB ES index. I need to change one field to multi-fields, e.g. username to username.username and username.raw (not_analyzed). I know the change will apply to incoming data, but how can I make it affect the old data too? Should I use a scroll to copy the whole index to a new one, or is there a better solution that rebuilds just the one field?
There's a way to achieve this without reindexing all your data by using the update by query plugin.
Basically, after installing the plugin, you can run the following query; the script is a deliberate no-op assignment that forces each document to be re-indexed, which re-populates the multi-field.
curl -XPOST 'localhost:9200/your_index/_update_by_query' -d '{
"query" : {
"match_all" : {}
},
"script" : "ctx._source.username = ctx._source.username;"
}'
It might take a while to run on 100GB of documents, but after it completes, the username.raw field will be populated.
Note: for this plugin to work, one needs to have scripting enabled.
POST index/type/_update_by_query
{
"query" : {
"match_all" : {}
},
"script" :{
"inline" : "ctx._source.username = ctx._source.username;",
"lang" : "painless"
}
}
This worked for me on ES 5.6; the one above did not!
I have documents which at first contain only "url" (analyzed) and "respsize" (not_analyzed) fields. I want to update the documents that match a given url and add a new field, "category".
I mean:
at first doc1:
{
"url":"http://stackoverflow.com/users/4005632/mehmet-yener-yilmaz",
"respsize":"500"
}
I have an external data and I know "stackoverflow.com" belongs to category 10,
And I need to update the doc, and make it like:
{
"url":"http://stackoverflow.com/users/4005632/mehmet-yener-yilmaz",
"respsize":"500",
"category":"10"
}
Of course I will do this for all documents whose url field contains "stackoverflow.com",
and I need to update each doc only once, because the category of a url never changes, so there is no need to update it again.
I think I need to use the _update API with the _version number to check this, but I can't compose the DSL query.
EDIT
I ran this and it looks like it works fine:
But the documents are not changed..
Although the query result looks successful, the new field is not added to the docs. Is a refresh or something else needed?
You could use the update by query plugin in order to do just that. The idea is to select all documents without a category whose url matches a certain string, and add the category you wish.
curl -XPOST 'localhost:9200/webproxylog/_update_by_query' -H "Content-Type: application/json" -d '
{
"query": {
"filtered": {
"filter": {
"bool": {
"must": [
{
"term": {
"url": "stackoverflow.com"
}
},
{
"missing": {
"field": "category"
}
}
]
}
}
}
},
"script" : "ctx._source.category = \"10\";"
}'
After running this, all your documents with url: stackoverflow.com that don't have a category, will get category: 10. You can run the same query again later to fix new stackoverflow.com documents that have been indexed in the meantime.
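For illustration, the effect of that update-by-query can be sketched locally in Python (the sample docs are invented, following the url/respsize structure from the question): only stackoverflow.com documents that do not yet have a category receive category: 10.

```python
# Local illustration of the update-by-query semantics: for every doc whose
# url contains "stackoverflow.com" and that has no "category" field yet,
# set category to "10". Docs already carrying a category are left alone.
docs = [
    {"url": "http://stackoverflow.com/users/4005632", "respsize": "500"},
    {"url": "http://example.com/page", "respsize": "200"},
    {"url": "http://stackoverflow.com/questions/1", "respsize": "300",
     "category": "10"},
]

for d in docs:
    if "stackoverflow.com" in d["url"] and "category" not in d:
        d["category"] = "10"

print([d.get("category") for d in docs])  # ['10', None, '10']
```

Because the query excludes documents that already have a category, re-running it is idempotent, which is what makes the "run it again later for new documents" approach safe.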
Also make sure to enable scripting in elasticsearch.yml and restart ES:
script.inline: on
script.indexed: on
In the script, you're free to add as many fields as you want, e.g.
...
"script" : "ctx._source.category1 = \"10\"; ctx._source.category2 = \"20\";"
UPDATE
ES 2.3 now features the update by query functionality out of the box. You can still use the above query exactly as is and it will work (except that filtered and missing are deprecated, but still working ;) ).
That all sounds great, but just to add to @Val's answer: update by query is available from Elasticsearch 2.x, but not for earlier versions. In our case we're using 1.4 for legacy reasons and there is no chance of upgrading in the foreseeable future, so another solution is the update-by-query plugin provided here: https://github.com/yakaz/elasticsearch-action-updatebyquery