I am using AWS Elasticsearch. Right now Kibana shows 0 free space. I want to delete documents that are older than 30 days.
Is there any setting or query that can clear documents older than 30 days?
You could simply use the delete by query API to delete the documents which are older than 30 days :)
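For example, a delete by query call could look like the one below. This is only a sketch: the endpoint, the index name, and the @timestamp date field are assumptions, so adjust them to your own mapping.

# Assumes documents carry a @timestamp date field; replace the endpoint and index name with yours
curl -X POST "https://<your-es-endpoint>/my-index/_delete_by_query" -H 'Content-Type: application/json' -d'
{
  "query": {
    "range": {
      "@timestamp": {
        "lt": "now-30d/d"
      }
    }
  }
}'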
I have an Elasticsearch cluster on Kubernetes, and I also have a Curator job that deletes indices older than 7 days.
I want to change the Curator setup to work according to a certain condition:
If a document has key1=value1, delete it after 10 days; otherwise delete it after 7 days.
Is there any way to do it?
Curator is limited to deleting indices as a whole; it does not work at the document level.
What Curator does under the hood is call DELETE index-name, and there is no way to configure it to call the delete by query API, which is what you're asking for.
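To illustrate the difference, a rough sketch is shown below (the index name, host, and @timestamp date field are assumptions; key1/value1 come from your example). You would have to schedule the second call yourself, e.g. as a Kubernetes CronJob.

# What Curator does: drop a whole index
curl -X DELETE "http://elasticsearch:9200/my-index-2019.01.01"

# What your condition would need instead: a delete by query run on your own schedule
curl -X POST "http://elasticsearch:9200/my-index/_delete_by_query" -H 'Content-Type: application/json' -d'
{
  "query": {
    "bool": {
      "must": [
        { "term": { "key1": "value1" } },
        { "range": { "@timestamp": { "lt": "now-10d/d" } } }
      ]
    }
  }
}'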
I have a lot of Elasticsearch clusters which hold historical indices (more than 10 years old). Some of these indices were recreated with the latest settings and fields, but the old ones were never deleted.
Now I need to delete the old indices which are not receiving any search or index requests.
I've already looked at Elasticsearch Curator, but it does not work with older versions of ES.
Is there any API which gives the last time an index or search request hit an index? That would serve my purpose very well.
EDIT: I've also checked https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-stats.html, but this doesn't give the last time an indexing or search request came in either; all it gives is the number of these requests since the last restart.
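For reference, this is the kind of call I mean (the index name is just an example); the counters it returns, such as indexing.index_total and search.query_total, are cumulative and carry no "last request" timestamp.

# Stats restricted to the indexing and search metrics for one index
curl "http://localhost:9200/my-old-index/_stats/indexing,search?pretty"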
We are using Elasticsearch 5 and I need to delete indices older than 30 days through the API. Can anyone help me out with this?
You can use curator to do that.
https://www.elastic.co/guide/en/elasticsearch/client/curator/5.x/delete_indices.html
curator --host <hostname> delete indices --older-than 30
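Note that the --older-than flag comes from the old Curator 3.x command line; since Curator 4/5 you describe the deletion in a YAML action file and pass it to curator together with a client config. A minimal sketch (the file names are hypothetical, and the filter uses the index creation date rather than a name pattern):

# delete_old_indices.yml -- hypothetical Curator 5.x action file
cat > delete_old_indices.yml <<'EOF'
actions:
  1:
    action: delete_indices
    description: "Delete indices whose creation date is older than 30 days"
    options:
      ignore_empty_list: True
    filters:
    - filtertype: age
      source: creation_date
      direction: older
      unit: days
      unit_count: 30
EOF

# curator.yml holds the client section (hosts, port) and logging settings
curator --config curator.yml delete_old_indices.yml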
Sorry if this is a simple question - I'm new to ELK and have it all running with data coming through ok. My issue is that I'm concerned about storage growth given the number of records that will be coming through.
Having searched on Google, I've seen that in Graylog there is a setting to limit the amount of data to retain (Graylog2- how to config logs retention to 1 week), and I'd like to do the same in ELK, but I can't find the correct setting.
There is no easy way to do this in the GUI (yet). What you need is Curator, which can delete or roll up indices based on time (e.g. delete indices older than 7 days) or on the number of documents in an index.
In a future version there will be a built-in tool for that in Kibana, but it's not in the current release (6.5). It will probably ship with Elastic 6.6 (as a beta), but you may even have to wait for 7.x.
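In the meantime you can check which indices are taking up space and delete old ones by hand (or via Curator) with plain REST calls; a small example (the index name is made up):

# List indices sorted by on-disk size, largest first
curl "http://localhost:9200/_cat/indices?v&s=store.size:desc"

# Delete one specific old index
curl -X DELETE "http://localhost:9200/logstash-2018.11.01"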
I am playing with ES. When one updates a document in ES, ES automatically increments the version of the document.
While this is great, I wonder if ES keeps the old documents too?
If it keeps the whole old documents, the storage on disk could grow a lot if I update documents often.
So in general, I am planning to do daily updates on all documents in some index. Over 1 year I will have 365 updates on every document in one index. Is this OK to do? Will I have 365 copies of each document stored in ES?
Is there a way to clean up old versions of the documents?
No, it does not keep old documents; the version is just for optimistic locking (concurrent updates).
http://www.elasticsearch.org/blog/versioning/
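To make the optimistic locking part concrete, here is a small sketch against an older ES version (index, type, and id are made up): the version only lets a concurrent update be rejected, it does not let you read back old copies.

# First write creates the document at version 1
curl -X PUT "http://localhost:9200/my-index/my-type/1" -H 'Content-Type: application/json' -d'{"counter": 1}'

# This update only succeeds if the document is still at version 1;
# if someone else updated it in the meantime, ES returns a version conflict
curl -X PUT "http://localhost:9200/my-index/my-type/1?version=1" -H 'Content-Type: application/json' -d'{"counter": 2}'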