Elasticsearch geo_point mapping overwritten on first save

I realise mapping types are being removed in 7.x but I am working with a solution that uses 6.x.
I have an index I am creating which has a location property. When creating the index I add the following mapping property:
mappings: {
  _doc: {
    properties: {
      location: {
        type: 'geo_point'
      }
    }
  }
}
There are other properties that will be in the index, but I'm happy for those to be mapped dynamically (I presume that's fine, as it's been done this way elsewhere in the application with no problems).
The index is created ok but when I index my first entity and run a query using the location field I get the following error: failed to find geo_point field [location]
Looking at the mappings now defined in Elasticsearch, I can see my location field has become an object with two float values instead of a geo_point:
{"job-posts":{"aliases":{},"mappings":{"_doc":{"properties":{"location":{"properties":{"lat":{"type":"float"},"lon":{"type":"float"}}},"settings":{"index":{"creation_date":"1591636220162","number_of_shards":"5","number_of_replicas":"1","uuid":"qwAybNlFQ4i3q7IecdZFvA","version":{"created":"6040099"},"provided_name":"job-posts"}}}}
Any ideas as to what I'm doing wrong and why my mapping is being overwritten?
Updated
Right after I create the index the mapping looks like this:
{"job-posts":{"mappings":{"_doc":{"properties":{"location":{"type":"geo_point"}}}}}}

Looks like you forgot to include properties:
PUT myindex?include_type_name=true
{
  "mappings": {
    "_doc": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
If that doesn't help, what's the index mapping right after you create the index but before you sync the first doc?
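For what it's worth, the float mapping shown in the question is exactly what dynamic mapping produces when a { lat, lon } object is indexed with no geo_point mapping in place; a sketch of how to reproduce and verify that (the document ID and coordinates are made up):

# with no explicit mapping for location, dynamic mapping
# creates location.lat and location.lon as floats
PUT job-posts/_doc/1
{
  "location": { "lat": 41.12, "lon": -71.34 }
}

GET job-posts/_mapping

So it's also worth checking whether something in your client deletes and recreates the index, or writes to a differently named index, before that first save.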

Related

Create a new index with field property marked as 'not-analyzed' in Kibana Dev Tools

How can I create a new Elasticsearch index in the Kibana dev tools and define one of its text field properties as not-analyzed?
I know the command to create a new index is:
PUT /myNewIndexName
but I want one of the properties in the document to be not-analyzed
The analyzed / not_analyzed values are from pre-7.x versions; the current approach is to use a keyword field instead - https://www.elastic.co/guide/en/elasticsearch/reference/7.15/keyword.html
PUT myNewIndexName
{
  "mappings": {
    "properties": {
      "field_name": {
        "type": "keyword"
      }
    }
  }
}
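A keyword field is matched on its exact value, for example with a term query; a quick sketch (the field name and value are placeholders):

GET myNewIndexName/_search
{
  "query": {
    "term": {
      "field_name": "exact-value"
    }
  }
}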

Update restrictions on Elasticsearch Object type field

I have to store documents in which a single field contains a single JSON object. This object has variable depth and a variable schema.
I configured a mapping like this:
"mappings": {
"properties": {
"#timestamp": {
"type": "date"
},
"message": {
"type": "object"
}
}
}
It works fine at first: Elasticsearch creates and updates the mapping as documents are received.
The problem is that after some mapping updates, it starts rejecting new documents and no longer updates the mapping. When this happens I switch to new indices, and mapping updates work again for those indices. I'd like to know the right solution.
For example, the first document is:
{
  "personalInfo": {
    "fistName": "tom"
  },
  "moviesStatistics": {
    "count": 100
  }
}
The second document, which will update the Elasticsearch mapping, is:
{
  "personalInfo": {
    "fistName": "tom",
    "lastName": "hanks"
  },
  "moviesStatistics": {
    "count": 100
  },
  "education": {
    "title": "a title..."
  }
}
Elasticsearch creates the mapping from doc1 and updates it with doc2, doc3, ... until a certain number of documents has been received. After that it starts rejecting every document whose fields don't match the latest mapping.
In the end I found the solution in the Elasticsearch documentation: https://www.elastic.co/guide/en/elasticsearch/reference/7.13//dynamic-field-mapping.html
We can use dynamic mapping and simply use this mapping:
"mappings": {
  "dynamic": "true"
}
You should also change some of the default mapping limits mentioned here:
https://www.elastic.co/guide/en/elasticsearch/reference/7.13//mapping-settings-limit.html
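The rejections described above typically come from the index.mapping.total_fields.limit safeguard (1000 fields by default). It is a dynamic index setting, so it can be raised on a live index; a sketch, with an illustrative index name and limit:

PUT my-index/_settings
{
  "index.mapping.total_fields.limit": 2000
}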

Change geoip.location mapping from number to geo_point

I am using a Logstash geoip filter to derive a location from my Filebeat IIS logs:
filter {
  geoip {
    source => "clienthost"
  }
}
But the data type in Elasticsearch is:
geoip.location.lon = NUMBER
geoip.location.lat = NUMBER
But in order to map points, I need to have
geoip.location = GEO_POINT
Is there a way to change the mapping?
I tried posting a changed mapping
sudo curl -XPUT "http://localhost:9200/_template/filebeat" -d @/etc/filebeat/geoip-mapping-new.json
with a new definition but it's not making a difference:
{
  "mappings": {
    "geoip": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  },
  "template": "filebeat-*"
}
Edit: I've tried this with both ES/Kibana/Logstash 5.6.3 and 5.5.0
This is not a solution, but I deleted all the data and reinstalled ES, Kibana, Logstash and Filebeat 5.5.
And now ES recognizes location as a geo_point. I guess that even though I had changed the mapping template, there was still data that had been mapped incorrectly, and Kibana was assuming the incorrect data type; a full reindex of the data would probably have fixed the problem.
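That fits with how index templates work: a template is only applied when an index is created, so indices that already existed keep their old mapping even after the template is updated. A less drastic fix is to reindex each old index into a new one that picks up the corrected template (the index names below are hypothetical), or to delete the old indices if the data is disposable:

# the destination index is created fresh, so it picks up
# the updated filebeat-* template and its geo_point mapping
POST _reindex
{
  "source": { "index": "filebeat-2017.10.01" },
  "dest": { "index": "filebeat-2017.10.01-fixed" }
}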

Is it possible to update an existing field in an index through mapping in Elasticsearch?

I've already created an index, and it contains data from my MySQL database. I've got a few fields that are strings in my table but that I need as different types (integer and double) in Elasticsearch.
So I'm aware that I could do it through mapping as follows:
{
  "mappings": {
    "my_type": {
      "properties": {
        "userid": {
          "type": "text",
          "fielddata": true
        },
        "responsecode": {
          "type": "integer"
        },
        "chargeamount": {
          "type": "double"
        }
      }
    }
  }
}
But I've only tried this when creating a new index. What I want to know is: how can I update an existing field (i.e. chargeamount in this scenario) with a PUT mapping request?
Is this possible? Any help could be appreciated.
Once a mapping type has been created, you're very constrained on what you can update. According to the official documentation, the only changes you can make to an existing mapping after it's been created are the following, but changing a field's type is not one of them:
In general, the mapping for existing fields cannot be updated. There
are some exceptions to this rule. For instance:
new properties can be added to Object datatype fields.
new multi-fields can be added to existing fields.
doc_values can be disabled, but not enabled.
the ignore_above parameter can be updated.
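Since changing a field's type in place isn't possible, the usual workaround is to create a new index with the desired mapping and copy the data across with the _reindex API; a sketch, assuming the existing index is called my_index:

# create the new index with the corrected types
PUT my_index_v2
{
  "mappings": {
    "my_type": {
      "properties": {
        "userid": { "type": "text", "fielddata": true },
        "responsecode": { "type": "integer" },
        "chargeamount": { "type": "double" }
      }
    }
  }
}

# copy the documents across; values are re-parsed under the new mapping
POST _reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}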

How to change an existing Elasticsearch mapping from index: no to index: analyzed

I am new to Elasticsearch and I want to update an existing mapping under my index. My existing mapping looks like this:
"load":{
"mappings": {
"load": {
"properties":{
"customerReferenceNumbers": {
"type": "string",
"index": "no"
}
}
}
}
}
I would like to update this field in my mapping to be analyzed, so that my customerReferenceNumbers field becomes available for search.
To do so, I am trying to run the following request in the Sense plugin:
PUT /load/load/_mapping
{
  "load": {
    "properties": {
      "customerReferenceNumbers": {
        "type": "string",
        "index": "analyzed"
      }
    }
  }
}
but I am getting the following error with this command:
MergeMappingException[Merge failed with failures {[mapper customerReferenceNumbers] has different index values]
Though there is existing data associated with this mapping, I don't understand why Elasticsearch won't let me update the mapping from not indexed to indexed.
Thanks in advance!!
Elasticsearch doesn't allow this kind of change.
Even if it were possible, you would have to reindex your data for the new mapping to be used anyway, so it is faster to create a new index with the new mapping and reindex your data into it.
If you can't afford any downtime, take a look at the alias feature which is designed for these use cases.
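A minimal sketch of the alias swap (index and alias names are illustrative; the application always queries the alias, so the switch is atomic and needs no downtime):

POST /_aliases
{
  "actions": [
    { "remove": { "index": "load_v1", "alias": "load_alias" } },
    { "add": { "index": "load_v2", "alias": "load_alias" } }
  ]
}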
This is by design. You cannot change the mapping of an existing field in this way. Read more about this at https://www.elastic.co/blog/changing-mapping-with-zero-downtime and https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html.