I ran a conversion program, catmandu, and sent the results to elasticsearch. I'm new to elasticsearch. Do you know how I could find something similar to '.schema' for sqlite3 in elasticsearch? I'd like to know what fields are in there.
After searching around this morning, I was unable to find anything that would tell me the fields if I did not know them ahead of time.
Thank you for your help.
You can use the mapping API to get the current mapping for an index. This will show you all of the fields that have been indexed:
curl -XGET "http://localhost:9200/{yourIndex}/_mapping"
We have a requirement that we return just the source fields in search results, without any of the metadata. From searching, I gather that this is not possible with elasticsearch, but I did find a reference to maybe using a plugin in this thread:
Filter out metadata fields and only return source fields in elasticsearch
The plugin that was linked was this one:
https://github.com/imotov/elasticsearch-just-source/blob/master/src/main/java/org/elasticsearch/examples/justsource/rest/action/RestJustSourceAction.java
I'm still learning about elasticsearch, but can someone explain how I would implement and deploy that plugin in our elasticsearch configuration?
Thanks,
Jim
As stated in the first link you referenced, it is possible to do this with response filtering, which is not a plugin but a standard feature of ES:
GET /index/type/_search?filter_path=hits.hits._source
If you want to get rid of the hits.hits._source wrapper, you can use jq:
curl -XGET localhost:9200/index/type/_search?filter_path=hits.hits._source | jq '.hits.hits[]._source'
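Just to illustrate (the documents below are invented), filter_path trims the response down to the bare _source objects, and the jq step then unwraps them:
curl -XGET "localhost:9200/index/type/_search?filter_path=hits.hits._source"
{
  "hits" : {
    "hits" : [
      { "_source" : { "title" : "first doc" } },
      { "_source" : { "title" : "second doc" } }
    ]
  }
}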
I am now working with Elasticsearch, and I can't figure out how to update an Elasticsearch index, type, or document without deleting and reindexing. Or is that the best way to achieve it?
So if I have products in my SQL product table, is it better to delete the product type and reindex it, or even reindex the entire DB as an index in Elasticsearch? What is the best approach and how can I achieve it?
I would like to do it with NEST preferably, but if it is easier, plain Elasticsearch works for me as well.
Thanks
This can be a real challenge! Historic records in elasticsearch will need to be reindexed when the template changes. New records will automatically be formatted according to the template you specify.
Using this link has helped us a lot:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html
You'll want to be sure to have the logstash filter set up to match the fields in your template.
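As a rough sketch of what such a template might look like on an older ES version (the template name, index pattern, and fields below are placeholders, not taken from your setup):
curl -XPUT "http://localhost:9200/_template/products_template" -d '
{
  "template": "products-*",
  "settings": { "number_of_shards": 1 },
  "mappings": {
    "product": {
      "properties": {
        "name":  { "type": "string" },
        "price": { "type": "double" }
      }
    }
  }
}'
Any new index whose name matches the pattern picks up these settings and mappings automatically; existing indices still need to be reindexed, as noted above.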
Let's say I have an index, test, which already exists. I want to add a new field newfield1, with some data, for all documents in the database. Currently I am simply deleting everything and then reinserting the data with the newfield1 data added in. I understand this isn't the most efficient way, but that's not my question right now.
Sometimes the data in newfield1 does not get indexed and I can't visualize it in Kibana. It's pretty annoying. Is there something wrong with what I'm doing?
NOTE: I CAN query this field in Elasticsearch, which makes me think there's a problem with Kibana.
Kibana caches the field mapping. Go to Settings -> Indices, select your index, and click the orange "Refresh" button.
Not much to go on here, but first make sure your cluster is green.
$ curl -XGET 'http://localhost:9200/_cluster/health?pretty=true'
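The output will look something along these lines (the numbers below are just an example from a single-node setup):
{
  "cluster_name" : "elasticsearch",
  "status" : "green",
  "timed_out" : false,
  "number_of_nodes" : 1,
  "number_of_data_nodes" : 1,
  "active_primary_shards" : 5,
  "active_shards" : 5,
  "relocating_shards" : 0,
  "initializing_shards" : 0,
  "unassigned_shards" : 0
}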
If you are still struggling to understand the state of your cluster, then perhaps consider installing one of the plugins like HQ: https://github.com/royrusso/elasticsearch-HQ
I'm using Elasticsearch version 1.2.0
I have documents that were indexed via bulk indexing.
Search works fine: when I use the _search endpoint, I can get the document that I want.
However, I cannot get the exact same document using the GET API.
For example, the code snippet below does not retrieve any result.
curl -XGET "http://xxx.xxx.xxx.xxx:9200/my_index/my_type/my_id?pretty"
However, when I specify the routing value, it retrieves the correct result that I wanted to get.
curl -XGET "http://xxx.xxx.xxx.xxx:9200/my_index/my_type/my_id?routing=3&pretty"
Here is the thing I want to understand, because I've never used any kind of routing setting for my indexing operations.
And there is NO parent-child relation involving "my_type".
Could anyone recommend other possible reasons for this kind of problem?
Thanks in advance.
Elasticsearch version 1.2.0 has a severe bug with respect to indexing.
The documentation recommends an upgrade to 1.2.1. I think you are running into this issue.
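If you want to double-check which version a node is actually running, the root endpoint reports it (the host below is just an example):
curl -XGET "http://localhost:9200/"
The response includes a version block with the exact number.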
I was trying to run Elasticsearch on CouchDB using the river plugin. Unfortunately, the number of hits I got is not the same as the actual number of documents. Say I have 10000 documents in my couch database; however, Elasticsearch only returns 9340 hits. Does anyone know why this problem arises? Would you mind explaining it to me?
Regards,
Jemie Effendy
Look into the bool query. This will return results that MUST have what you've defined in your query. Also look into your mapping; Elasticsearch may be tokenising your data in a way you don't expect.
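As a minimal sketch (the index, type, and field names below are made up, not taken from your setup), a bool query with a must clause looks like this:
curl -XGET "http://localhost:9200/myindex/mytype/_search" -d '
{
  "query": {
    "bool": {
      "must": [
        { "match": { "title": "some text" } }
      ]
    }
  }
}'
The _mapping call shown earlier in this thread is also the quickest way to see how each field is actually being analysed.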