Aggregation value error in Elastic Search - elasticsearch

I am trying to create a date histogram and aggregate a particular field to find its maximum value. The field is of type long in the mapping of my Elasticsearch index, but I get the result as a floating-point number.
For example:
Instead of getting 31032832 I am getting 3.1032832E7.
However, I am able to get 31032832 properly when I query my Elasticsearch index through the Chrome plugin Sense.

I found out what the issue was! The max aggregation was giving me a double value.
While accessing the result I called myResult.getMax().longValue(), which solved my problem.
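For reference, here is a minimal sketch of that pattern against the Elasticsearch 5.x Java transport client; the index, field, and aggregation names ("myindex", "timestamp", "bytes", "per_day", "max_bytes") are made up for illustration, and package paths differ between client versions:

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.bucket.histogram.DateHistogramInterval;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.metrics.max.Max;

// Date histogram with a max sub-aggregation on a long field ("bytes" is hypothetical)
SearchResponse response = client.prepareSearch("myindex")
        .setSize(0)
        .addAggregation(AggregationBuilders.dateHistogram("per_day")
                .field("timestamp")
                .dateHistogramInterval(DateHistogramInterval.DAY)
                .subAggregation(AggregationBuilders.max("max_bytes").field("bytes")))
        .get();

Histogram perDay = response.getAggregations().get("per_day");
for (Histogram.Bucket bucket : perDay.getBuckets()) {
    Max max = bucket.getAggregations().get("max_bytes");
    // getValue() returns a double (e.g. 3.1032832E7); convert it back to a long
    long maxAsLong = (long) max.getValue();
    System.out.println(bucket.getKeyAsString() + " -> " + maxAsLong);
}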

Related

JanusGraph with Elasticsearch index is not working

I have added a mixed index in JanusGraph to support full-text search with Elasticsearch.
The mixed index looks like this:
myindex = mgmt.buildIndex("myesindex", Vertex.class)
.addKey("name", Mapping.TEXTSTRING.asParameter())
.addKey("sabindex", Mapping.TEXTSTRING.asParameter())
.buildMixedIndex("search");
I am able to load data into the Elasticsearch engine.
I am also able to execute queries successfully.
The issue I am facing is this: when I run the query
g.V().has('code','abc').valueMap()
==>{str=[some text], code=[abc], sab=[sab], sabindex=[sabindex], name=[[some tex]]}
I get the result successfully, but when I try to search with both name and code:
g.V().has('name', textContains('some text')).has('code','abc').valueMap()
(the code field is also indexed with a composite index)
I get no result, even though the data is present in the graph and in Elasticsearch.
In another scenario, the same query with a different name and code works successfully. I have also rebuilt the graph multiple times, but without positive results.
The first query shows the value is name=[[some tex]]. It is missing the final t in text, so that explains why the query isn't matching on some text.
If you instead do textContains('some tex'), you would get the same result as the first query. Using the profile() step would show that the myindex was utilized.
See this gist for a recreation of the scenario.
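A quick way to verify this is to run the adjusted traversal with the profile() step appended (property names taken from the question):
g.V().has('name', textContains('some tex')).has('code','abc').profile()
The profile output should indicate whether the mixed index was hit or the traversal fell back to a full scan.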

Elastic Search - Find document with a conflicting field type

I'm using Elasticsearch 5.6.2 with Kibana and I'm currently facing a problem.
My documents are indexed on the field timestamp, which is normally an integer. However, somebody recently logged a document with a timestamp that is not an integer, and Kibana complains about a conflicting type.
The Discover panel displays nothing and the following errors pop up:
Saved "field" parameter is now invalid. Please select a new field.
Discover: "field" is a required parameter
How can I find the document(s) causing these conflicts, so that I can track down the service creating the bad logs?
The field type (either integer or text/keyword) is not defined on a per-document basis but on a per-index basis (in the mappings). I guess you are handling time-series data, and you probably have an index per day (or per month, or ...).
In Kibana Dev Tools:
List the created indices with GET _cat/indices
For each index (logstash-2017.09.28 in my example) do a GET logstash-2017.09.28/_mapping and check the type of the #timestamp field
The field type is probably different between indices.
You won't be able to change the field type on already created indices. Deleting documents won't solve your problem. The only solution is to drop the index or reindex the whole index with a new field type (in a specific mapping).
To avoid this problem on future indices, the solution is to create an index template with a mapping declaring that the #timestamp field is of type date (or whatever type you need).
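A minimal sketch of such a template, using the legacy template syntax of Elasticsearch 5.x (the template name and index pattern are illustrative):
PUT _template/timestamp_type_template
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "#timestamp": { "type": "date" }
      }
    }
  }
}
New indices matching logstash-* will then always map #timestamp as a date.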

how to find the date range in uri request search in elasticsearch

I have a search in which I need to find the delta of data:
http://localhost:9200/index/index_type/_search?q=sampledate[21-02-2015 TO 22-02-2015]
but this search is giving me an error.
Could anybody help?
You can use the query below:
GET /index_name/index_type/_search?q=dateCreated:[2016-01-06+TO+2016-01-07]
This will only work if dateCreated is a date field; it won't work with a String field.
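If you are unsure about the field type, you can check the mapping first, for example:
GET /index_name/_mapping
and verify that dateCreated is mapped with "type": "date".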
We had a similar weird issue with this date field in Elasticsearch 7.6.1.
We found a working solution by removing the colon (:) after the date field and surrounding the entire date query part with parentheses.
i.e. the query
GET /index_name/index_type/_search?q=dateCreated:[2016-01-06+TO+2016-01-07]
changes to
GET /index_name/index_type/_search?q=(dateCreated[2016-01-06+TO+2016-01-07])
This should work.

How to add a numeric filter on kibana dashboard?

I have a field that contains numbers. I want a filter that shows all logs that are less than a constant value.
When I try to add a new query filter, all I can see is a query string option.
If you are talking about the query field, a syntax like this works:
field:<10
This will find just the records with a field value less than 10. I found this by experimentation one day -- I don't know if it's documented anywhere.
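For what it's worth, the Kibana search bar passes this through to Elasticsearch as Lucene query string syntax, which also accepts explicit range expressions, so (with a hypothetical field name) the following two filters are equivalent:
bytes:<10
bytes:[* TO 10}
The curly brace makes the upper bound exclusive, matching the behaviour of <.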

How to get facet.query results only, using solrj?

I'm trying to get results of a facet query using solrj, but it seems it doesn't matter whether I add the facet query or not. I get the same document list anyway.
So this query returns the same document list...
query.setQuery(searchString);
query.setFacet(true);
query.addFacetField("CATNAME_STR");
query.addFacetQuery("CATNAME_STR:" + facetName);
...with this query
query.setQuery(searchString);
query.setFacet(true);
query.addFacetField("CATNAME_STR");
The only difference is that I can get the number of documents matching the facet query with response.getFacetQuery();
I was expecting it to work like
http://localhost:8983/solr/select/?q=*%3A*&version=2.2&start=0&rows=10&indent=on&facet=on&facet.field=CATNAME_STR&fq=CATNAME_STR:Erasmus
Any ideas?
Thanks.
By the way I'm using Solr Version 3.1.0 and solr-core-3.1.0
As it turns out, fq=CATNAME_STR:Erasmus does not correspond to query.addFacetQuery("CATNAME_STR:Erasmus") but to query.addFilterQuery("CATNAME_STR:Erasmus").
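So the SolrJ equivalent of the URL above is roughly this (a sketch reusing searchString from the question and assuming a SolrServer instance named solrServer):
SolrQuery query = new SolrQuery();
query.setQuery(searchString);
query.setFacet(true);
query.addFacetField("CATNAME_STR");
// fq=CATNAME_STR:Erasmus maps to a filter query, not a facet query
query.addFilterQuery("CATNAME_STR:Erasmus");
QueryResponse response = solrServer.query(query);
// facet counts per CATNAME_STR value remain available via response.getFacetFields()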
