how to modify the type mapping in elasticsearch to another type - elasticsearch

The thing is that I already defined a field "myvalue" as INTEGER. Now I think that was a mistake and I want to store a string in the same field, so I want to change it to STRING without losing data. Is there any way of doing this, or do I need to re-create the index and re-index the whole data?
I already tried running:
{
  "mappings": {
    "myvalue": {
      "type": "string"
    }
  }
}
But if I get the mapping again from the server, it still appears as integer.

There is no way to change the mapping of a core field type for existing data. You will need to re-create the index with the myvalue field defined as a string and re-index your data.
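A minimal sketch of that workflow, assuming a recent (7.x+) typeless mapping, an existing index named myindex, and a new index named myindex_v2 (all of these names are placeholders; on older versions the string type would be used instead of keyword/text and the properties would sit under your mapping type):
PUT myindex_v2
{
  "mappings": {
    "properties": {
      "myvalue": { "type": "keyword" }
    }
  }
}

POST _reindex
{
  "source": { "index": "myindex" },
  "dest":   { "index": "myindex_v2" }
}
After re-indexing you can drop the old index and, if needed, add an alias so existing queries keep using the original name.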

Related

What is the field "your_type" in Elasticsearch PUT request?

I am trying to resolve this error:
Fielddata is disabled on text fields by default. Set fielddata=true on
and saw one post which suggested doing this, but I didn't get what the your_type part of the endpoint is in the given snippet:
PUT your_index/_mapping/your_type
I don't know what version of Elasticsearch you have, but as of 7.x the mapping type has been removed.
In your case (version 7.x or later) it could look like this:
PUT my-index-000001/_mapping
{
  "properties": {
    "name-field": {
      "type": "text",
      "fielddata": true
    }
  }
}
A little about the mapping type:
Since the first release of Elasticsearch, each document has been
stored in a single index and assigned a single mapping type. A mapping
type was used to represent the type of document or entity being
indexed, for instance a twitter index might have a user type and a
tweet type.
Each mapping type could have its own fields, so the user type might
have a full_name field, a user_name field, and an email field, while
the tweet type could have a content field, a tweeted_at field and,
like the user type, a user_name field.
More information here:
https://www.elastic.co/guide/en/elasticsearch/reference/6.5/removal-of-types.html#_why_are_mapping_types_being_removed

Can I add a field automatically to an elastic search index when the data is being indexed?

I have 2 loggers from 2 different clusters logging into my Elasticsearch. logger1 uses indices mydata-cluster1-YYYY.MM.DD and logger2 uses indices mydata-cluster2-YYYY.MM.DD.
I have no way of touching the loggers, so I would like to add a field on the ES side when the data is indexed to show which cluster the data belongs to. Can I use mappings to do this?
Thanks
What if you use the PUT mapping API in order to add a field to your index:
PUT mydata-cluster1-YYYY.MM.DD/_mapping/mappingtype   <-- change the mapping type according to yours
{
  "properties": {
    "your_field": {
      "type": "text"   <-- type of the field
    }
  }
}
This SO answer could come in handy. Hope it helps!

Add typed additional attributes to an existing document elasticsearch

I added a field to the document:
POST /erection/shop/1/_update
{
  "doc": {
    "my_field": ""
  }
}
The new field is assigned the type "String". How can I create a new field with the type "Boolean"/"Integer"?
And a 2nd question:
Is it possible to add one field to all documents using one query (without updating each document)?
1) Explicitly define a mapping prior to the first update you do.
2) No, you can't do it with a single query. You can do it in your application using "scan" and then "bulk update".
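For point 1, a minimal sketch of such an explicit mapping, assuming a pre-6.x cluster where shop is the mapping type (as in the question) and the new field should be a boolean:
PUT /erection/_mapping/shop
{
  "properties": {
    "my_field": { "type": "boolean" }
  }
}
Once the field is mapped this way, the first _update that sends a value for my_field will store it as a boolean instead of letting dynamic mapping guess the type.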

Can I use ElasticSearch mapping transform to duplicate a field

I read here about mapping transform: https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-transform.html
The result of the transform is indexed but the original source is
stored in the _source field.
So I thought I could use it to "copy" a field. I tried:
{
  "mappings": {
    "opportunity": {
      "transform": {
        "script": "ctx._source['skill_suggest']=ctx._source['skill']"
      }
    }
  }
}
Then I perform a query on the "skill_suggest" field but it returns no results (the same query on "skill" works fine).
So what am I doing wrong?
Can I somehow "copy" some fields on the fly? I want to perform full-text search on "skill" but also use the Completion Suggester, and I cannot modify the data schema sent from the client.
This sounds like a perfect match for multi-fields: https://www.elastic.co/guide/en/elasticsearch/reference/current/_multi_fields.html
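A rough sketch of what that could look like here, assuming a pre-5.x cluster (string type and the opportunity mapping type, as in the question), a placeholder index name opportunities, and a completion sub-field named suggest:
PUT opportunities
{
  "mappings": {
    "opportunity": {
      "properties": {
        "skill": {
          "type": "string",
          "fields": {
            "suggest": { "type": "completion" }
          }
        }
      }
    }
  }
}
Full-text search keeps targeting skill, the completion suggester is pointed at skill.suggest, and the documents sent by the client do not need to change.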

Elasticsearch - Extra unmapped fields on geo-shape type index

I have some extra inner fields on a geo-shape type field. For example, "shape" is a geo-shape type field which has the regular required fields like "coordinates", "radius" etc., but it may also have other fields like "metadata" which I want Elasticsearch to not parse and not store in the index. For example:
"shape": {
  "coordinates": [6.77, 8.99],
  "radius": 500,
  "metadata": "some value"
}
Mapping schema looks like this:
"shape":{
"type":"geo_shape"
}
How can I achieve this? Using "dynamic": false in the mapping schema does not seem to work.
Setting dynamic to false in your root mapping, like you did, is the way to go: are you sure it doesn't work? Or are you saying that because it appears in your result hit _source?
Actually, by default, the _source attribute contains the exact same document that you submitted.
However, that doesn't mean the extra metadata field has been indexed and/or stored.
If you want to check this, request that field specifically in your search, like this:
POST _search
{
  "fields": ["shape.metadata"]
}
You should get your search hits back, but without any fields values.
If it still bothers you, disable the _source attribute in your mapping.
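If you do go that route, a minimal sketch of disabling _source at index creation time (it cannot be toggled on an existing index), using placeholder index/type names in the same older, typed style as above:
PUT your_index
{
  "mappings": {
    "your_type": {
      "_source": { "enabled": false },
      "properties": {
        "shape": { "type": "geo_shape" }
      }
    }
  }
}
Keep in mind that without _source you lose the original documents, which also rules out things like reindexing from this index later.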
