I am following these docs: https://www.elastic.co/guide/en/elasticsearch/reference/current/runtime-indexed.html
I have a field which I would like to be scripted at index time rather than at runtime, and according to the docs above I can do that simply by putting the field and its script inside the mappings object as normal.
Here is a simplified version of the index I'm trying to create:
{
"settings": {
"analysis": {
"analyzer": {
"case_insensitive_analyzer": {
"type": "custom",
"filter": ["lowercase"],
"tokenizer": "keyword"
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "text"
},
"events": {
"properties": {
"fields": {
"type": "text"
},
"id": {
"type": "text"
},
"event": {
"type": "text"
},
"time": {
"type": "date"
},
"user": {
"type": "text"
},
"state": {
"type": "integer"
}
}
},
"eventLast": {
"type": "date",
"on_script_error": "fail",
"script": {
"source": "def events = doc['events']; emit(events[events.length-1].time.value"
}
}
}
}
}
I'm getting this 400 error back:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "unknown parameter [script] on mapper [eventLast] of type [date]"
}
],
"type": "mapper_parsing_exception",
"reason": "Failed to parse mapping [_doc]: unknown parameter [script] on mapper [eventLast] of type [date]",
"caused_by": {
"type": "mapper_parsing_exception",
"reason": "unknown parameter [script] on mapper [eventLast] of type [date]"
}
},
"status": 400
}
Essentially, I'm trying to create a scripted, indexed field that is calculated from the last event time in the events array of the document.
Thanks
TL;DR
As the error states, you cannot define your script there.
There is a specific way to create runtime fields in Elasticsearch.
You need to put the field definition in the runtime object at the root of the JSON, not inside mappings.
Solution
{
"settings": {
"analysis": {
"analyzer": {
"case_insensitive_analyzer": {
"type": "custom",
"filter": ["lowercase"],
"tokenizer": "keyword"
}
}
}
},
"runtime": {
"eventLast": {
"type": "date",
"on_script_error": "fail",
"script": {
"source": "def events = doc['events']; emit(events[events.length-1].time.value"
}
}
},
"mappings": {
"properties": {
"id": {
"type": "text"
},
"events": {
"properties": {
"fields": {
"type": "text"
},
"id": {
"type": "text"
},
"event": {
"type": "text"
},
"time": {
"type": "date"
},
"user": {
"type": "text"
},
"state": {
"type": "integer"
}
}
}
}
}
}
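Two notes on the script itself: the events object has no doc values of its own, so the script reads the indexed leaf field doc['events.time'], and emit() for a date runtime field expects epoch milliseconds, hence the toInstant().toEpochMilli() conversion. Once the index exists, you can retrieve or sort on the field at search time; a minimal sketch, assuming a placeholder index name of my-index:
GET my-index/_search
{
  "fields": ["eventLast"],
  "sort": [
    { "eventLast": "desc" }
  ]
}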
I'm trying to set up a mapping for this kind of log (the automatic mapping didn't work).
Here's the log that I have to analyze with Kibana (found on the internet):
{"index":
{"_index":"logstash-2015.05.18","_type":"log"
}
}
{"#timestamp":"2015-05-18T09:03:25.877Z","ip":"185.124.182.126","extension":"gif","response":"404",
"geo":{
"coordinates":{
"lat":36.518375,"lon":-86.05828083
},
"src":"PH","dest":"MM","srcdest":"PH:MM"
},
"#tags":["success","info"],"utc_time":"2015-05-18T09:03:25.877Z","referer":"http://twitter.com/error/william-shepherd","agent":"Mozilla/5.0 (X11; Linux x86_64; rv:6.0a1) Gecko/20110421 Firefox/6.0a1","clientip":"185.124.182.126","bytes":804,"host":"motion-media.theacademyofperformingartsandscience.org","request":"/canhaz/gemini-7.gif","url":"https://motion-media.theacademyofperformingartsandscience.org/canhaz/gemini-7.gif","#message":"185.124.182.126 - - [2015-05-18T09:03:25.877Z] \"GET /canhaz/gemini-7.gif HTTP/1.1\" 404 804 \"-\" \"Mozilla/5.0 (X11; Linux x86_64; rv:6.0a1) Gecko/20110421 Firefox/6.0a1\"","spaces":"this is a thing with lots of spaces wwwwoooooo","xss":"<script>console.log(\"xss\")</script>","headings":["<h3>f-i-j-nl-ng</h5>","http://facebook.com/success/lodewijk-van-den-berg"],"links":["daniel-tani#facebook.com","http://nytimes.com/security/kathryn-sullivan","www.nytimes.com"],
"relatedContent":[
{"url":"http://www.laweekly.com/news/cbs-crew-rat-fink-2368032","og:type":"article","og:title":"CBS Crew Rat Fink","og:description":"Near a couple of auto body shops (and a sharp new Space Invader mosaic that we'll post soon) near Temple and Westmoreland is a CBS wall with a nice Rat ...","og:url":"http://www.laweekly.com/news/cbs-crew-rat-fink-2368032","article:published_time":"2008-01-14T08:05:26-08:00","article:modified_time":"2014-10-28T14:59:52-07:00","article:section":"News","article:tag":"Mark Mauer","og:image":"http://IMAGES1.laweekly.com/imager/cbs-crew-rat-fink/u/original/2430299/img_2049.jpg","og:image:height":"360","og:image:width":"480","og:site_name":"LA Weekly","twitter:title":"CBS Crew Rat Fink","twitter:description":"Near a couple of auto body shops (and a sharp new Space Invader mosaic that we'll post soon) near Temple and Westmoreland is a CBS wall with a nice Rat ...","twitter:card":"summary","twitter:image":"http://IMAGES1.laweekly.com/imager/cbs-crew-rat-fink/u/original/2430299/img_2049.jpg","twitter:site":"#laweekly"
},
{"url":"http://www.laweekly.com/news/push-and-retna-in-koreatown-2368043","og:type":"article","og:title":"Push and Retna in Koreatown","og:description":"Yeah, I originally had this posted this morning as Push & Ayer - Sorry. It looked like a Retna piece, but I saw the Ayer in there and thought that must ...","og:url":"http://www.laweekly.com/news/push-and-retna-in-koreatown-2368043","article:published_time":"2008-01-29T07:28:32-08:00","article:modified_time":"2014-10-28T14:59:54-07:00","article:section":"News","article:tag":"Shelley Leopold","og:image":"http://IMAGES1.laweekly.com/imager/push-and-retna-in-koreatown/u/original/2430376/img_3671.jpg","og:image:height":"360","og:image:width":"480","og:site_name":"LA Weekly","twitter:title":"Push and Retna in Koreatown","twitter:description":"Yeah, I originally had this posted this morning as Push & Ayer - Sorry. It looked like a Retna piece, but I saw the Ayer in there and thought that must ...","twitter:card":"summary","twitter:image":"http://IMAGES1.laweekly.com/imager/push-and-retna-in-koreatown/u/original/2430376/img_3671.jpg","twitter:site":"#laweekly"
},
{"url":"http://www.laweekly.com/news/asylm-ruets-pdb-on-santa-monica-2368012","og:type":"article","og:title":"Asylm, Ruets, PDB on Santa Monica","og:description":"Not a new piece, but a well-hidden gem a little south of Santa Monica Blvd. in an alley off of Heliotrope or Edgemont. I've been sitting on this for a w...","og:url":"http://www.laweekly.com/news/asylm-ruets-pdb-on-santa-monica-2368012","article:published_time":"2008-04-22T15:11:15-07:00","article:modified_time":"2014-10-28T14:59:48-07:00","article:section":"News","article:tag":"Culture and Lifestyle","og:image":"http://images1.laweekly.com/imager/asylm-ruets-pdb-on-santa-monica/u/original/2430137/img_5027.jpg","og:image:height":"360","og:image:width":"480","og:site_name":"LA Weekly","twitter:title":"Asylm, Ruets, PDB on Santa Monica","twitter:description":"Not a new piece, but a well-hidden gem a little south of Santa Monica Blvd. in an alley off of Heliotrope or Edgemont. I've been sitting on this for a w...","twitter:card":"summary","twitter:image":"http://images1.laweekly.com/imager/asylm-ruets-pdb-on-santa-monica/u/original/2430137/img_5027.jpg","twitter:site":"#laweekly"
},
{"url":"http://www.laweekly.com/news/laurence-tribe-tangles-with-cbs-and-la-city-hall-2396867","og:type":"article","og:title":"Laurence Tribe Tangles with CBS and L.A. City Hall","og:description":"The United States Court of Appeals for the Ninth Circuit’s Courtroom 3 - a miniature auditorium with comfortable, smoked salmon-colored seats - wa...","og:url":"http://www.laweekly.com/news/laurence-tribe-tangles-with-cbs-and-la-city-hall-2396867","article:published_time":"2008-06-04T14:16:10-07:00","article:modified_time":"2014-11-26T14:43:59-08:00","article:section":"News","og:site_name":"LA Weekly","twitter:title":"Laurence Tribe Tangles with CBS and L.A. City Hall","twitter:description":"The United States Court of Appeals for the Ninth Circuit’s Courtroom 3 - a miniature auditorium with comfortable, smoked salmon-colored seats - wa...","twitter:card":"summary","twitter:site":"#laweekly"
}
],
"machine":{
"os":"win xp","ram":3221225472
},
"#version":"1"
}
And here's the mapping I've put in Kibana's Dev Tools:
PUT logstash-2019.05.09
{
"mappings": {
"doc": {
"properties": {
"index": {
"_index": {
"type": "keyword"
},
"_type": {
"type": "text"
}
},
"#timestamp": {
"type": "date"
},
"ip": {
"type": "ip"
},
"extension": {
"type": "text"
},
"response": {
"type": "text"
},
"geo": {
"coordinates": {
"type": "geo_point"
},
"src": {
"type": "text"
},
"dest": {
"type": "text"
},
"srcdest": {
"type": "text"
}
},
"tags": {
"type": "text"
},
"utc_time": {
"type": "date"
},
"referer": {
"type": "text"
},
"agent": {
"type": "text"
},
"clientip": {
"type": "ip"
},
"bytes": {
"type": "integer"
},
"host": {
"type": "text"
},
"request": {
"type": "text"
},
"url": {
"type": "text"
},
"#message": {
"type": "text"
},
"spaces": {
"type": "text"
},
"xss": {
"type": "text"
},
"links": {
"type": "text"
},
"relatedContent": {
"url": {
"type": "text"
},
"og:type": {
"type": "text"
},
"og:title": {
"type": "text"
},
"og:description": {
"type": ""
},
"og:url": {
"type": ""
},
"article:published_time": {
"type": "date"
},
"article:modified_time": {
"type": "date"
},
"article:section": {
"type": "keyword"
},
"article:tag": {
"type": "text"
},
"og:image": {
"type": "text"
},
"og:image:height": {
"type": "integer"
},
"og:image:width": {
"type": "integer"
},
"og:site_name": {
"type": "text"
},
"twitter:title": {
"type": "text"
},
"twitter:description": {
"type": "text"
},
"twitter:card": {
"type": "keyword"
},
"twitter:image": {
"type": "text"
},
"twitter:site": {
"type": "keyword"
}
},
"machine": {
"os": {
"type": "text"
},
"ram": {
"type": "integer"
}
},
"#version": {
"type": "integer"
}
}
}
}
}
And here's the error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "No type specified for field [index]"
}
],
"type": "mapper_parsing_exception",
"reason": "Failed to parse mapping [doc]: No type specified for field [index]",
"caused_by": {
"type": "mapper_parsing_exception",
"reason": "No type specified for field [index]"
}
},
"status": 400
}
I've already searched the internet for solutions, but I didn't find anything that helps.
You're missing the properties keyword for all your object fields. Use this mapping instead:
PUT logstash-2019.05.09
{
"mappings": {
"doc": {
"properties": {
"#timestamp": {
"type": "date"
},
"ip": {
"type": "ip"
},
"extension": {
"type": "text"
},
"response": {
"type": "text"
},
"geo": {
"properties": {
"coordinates": {
"type": "geo_point"
},
"src": {
"type": "text"
},
"dest": {
"type": "text"
},
"srcdest": {
"type": "text"
}
}
},
"tags": {
"type": "text"
},
"utc_time": {
"type": "date"
},
"referer": {
"type": "text"
},
"agent": {
"type": "text"
},
"clientip": {
"type": "ip"
},
"bytes": {
"type": "integer"
},
"host": {
"type": "text"
},
"request": {
"type": "text"
},
"url": {
"type": "text"
},
"#message": {
"type": "text"
},
"spaces": {
"type": "text"
},
"xss": {
"type": "text"
},
"links": {
"type": "text"
},
"relatedContent": {
"properties": {
"url": {
"type": "text"
},
"og:type": {
"type": "text"
},
"og:title": {
"type": "text"
},
"og:description": {
"type": ""
},
"og:url": {
"type": ""
},
"article:published_time": {
"type": "date"
},
"article:modified_time": {
"type": "date"
},
"article:section": {
"type": "keyword"
},
"article:tag": {
"type": "text"
},
"og:image": {
"type": "text"
},
"og:image:height": {
"type": "integer"
},
"og:image:width": {
"type": "integer"
},
"og:site_name": {
"type": "text"
},
"twitter:title": {
"type": "text"
},
"twitter:description": {
"type": "text"
},
"twitter:card": {
"type": "keyword"
},
"twitter:image": {
"type": "text"
},
"twitter:site": {
"type": "keyword"
}
}
},
"machine": {
"properties": {
"os": {
"type": "text"
},
"ram": {
"type": "integer"
}
}
},
"#version": {
"type": "integer"
}
}
}
}
}
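The og:description and og:url fields are mapped as text here because every field needs a concrete type (an empty type string is rejected as well). After creating the index, you can verify that the object fields ended up with their nested properties as intended:
GET logstash-2019.05.09/_mapping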
I'm using Postman to communicate with my Elasticsearch server, and I'm receiving an error in Postman when I send this request to the server. Where could I have gone wrong?
Here is my code.
{
"mappings": {
"post": {
"properties": {
"city": {
"type": "text"
},
"contact_email": {
"type": "text"
},
"country": {
"type": "text"
},
"description": {
"type": "text"
},
"image": {
"type": "text"
},
"post_id": {
"type": "text"
},
"state_province": {
"type": "text"
},
"title": {
"type": "text"
}
}
}
}
}
I've tried sending the request to my server, but I keep receiving this error:
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Root mapping definition has unsupported parameters: [post : {properties={country={type=text}, image={type=text}, post_id={type=text}, city={type=text}, description={type=text}, state_province={type=text}, title={type=text}, contact_email={type=text}}}]"
}
It seems like you are using Elasticsearch version 7.0. Since Elasticsearch no longer supports more than one mapping type per index, the mapping type name is no longer required and should not be provided in this version.
Remove the mapping type name post from the JSON input. Use this instead:
{
"mappings": {
"properties": {
"city": {
"type": "text"
},
"contact_email": {
"type": "text"
},
"country": {
"type": "text"
},
"description": {
"type": "text"
},
"image": {
"type": "text"
},
"post_id": {
"type": "text"
},
"state_province": {
"type": "text"
},
"title": {
"type": "text"
}
}
}
}
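In Postman, send this as a PUT to the index you want to create, with the JSON above as the raw request body and Content-Type set to application/json. A minimal sketch, assuming a local cluster on the default port and a placeholder index name of posts:
PUT http://localhost:9200/posts
You can then confirm the result with:
GET http://localhost:9200/posts/_mapping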
I just deployed a small application that loads a few thousand docs into an index, and when working with production data I get an error in my search request.
The HTTP status code is 400 and the error is:
{
"error": {
"root_cause": [
{
"type": "number_format_exception",
"reason": "empty String"
}
],
"type": "number_format_exception",
"reason": "empty String"
},
"status": 400
}
Okay, I kind of get that my mapping defines some numeric field which I obviously don't store correctly, but how am I supposed to find that field?
Each doc contains hundreds of fields... I mean, really?
I tried looking in /var/log/elasticsearch but there's nothing useful there.
Please help me, I need to get this thing going.
I defined some fields as integer which should hold arrays and might be empty. Could that be a problem?
My ES version is 6.6.0.
Update:
The error occurs while searching; during indexing everything is fine.
My mapping for that index:
{
"development-object-1551202425": {
"mappings": {
"_doc": {
"dynamic": "false",
"properties": {
"accommodation": {
"properties": {
"badges": {
"properties": {
"maskedProp1": {
"type": "boolean"
},
"maskedProp2": {
"type": "boolean"
},
"maskedProp3": {
"type": "boolean"
},
"maskedProp4": {
"type": "boolean"
},
"maskedProp5": {
"type": "boolean"
},
"maskedProp6": {
"type": "boolean"
}
}
},
"businessTypes": {
"type": "integer"
},
"classification": {
"properties": {
"classification": {
"type": "keyword"
},
"classificationValue": {
"type": "short"
}
}
},
"endowments": {
"type": "integer"
},
"hasPrice": {
"type": "boolean"
},
"lowestPrice": {
"type": "float"
},
"metascore": {
"type": "short"
},
"rating": {
"type": "short"
},
"regionscore": {
"type": "short"
}
}
},
"certificates": {
"type": "integer"
},
"geoLocation": {
"type": "geo_point"
},
"id": {
"type": "text"
},
"isAccommodation": {
"type": "boolean"
},
"location": {
"properties": {
"maskedProp1": {
"type": "integer"
},
"maskedProp2": {
"type": "integer"
},
"id": {
"type": "integer"
},
"name": {
"type": "text",
"fielddata": true
},
"zipcodes": {
"type": "integer"
}
}
},
"maskedProp1": {
"type": "integer"
},
"maskedProp2": {
"type": "integer"
},
"description": {
"type": "text"
},
"sortTitle": {
"type": "keyword"
},
"title": {
"type": "text"
}
}
}
}
}
}
The index name consists of an environment string (development) and an appended timestamp (I work with automatic index switching and query through an alias called {env}-{name}-current).
In my case the error was an empty "size" parameter in the query; I was looking for the error in my filters and didn't notice it.
A more verbose error message (at least naming the property or setting where the error occurred) could save thousands of hours of debugging all around the world, I guess.
For now you have to take apart your query DSL section by section to find the issue.
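For illustration, a request shaped like the one below reproduces the error, because "size" is sent as an empty string instead of a number (the index name is just a placeholder):
GET development-object-1551202425/_search
{
  "size": "",
  "query": {
    "match_all": {}
  }
}
Elasticsearch tries to parse the empty string as a number and replies with "number_format_exception: empty String", without naming the offending parameter.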
I tried posting a _bulk request into Elasticsearch, but it throws:
{
"took": 1,
"errors": true,
"items": [
{
"index": {
"_index": "quick",
"_type": "parts",
"_id": "ACI250-2016",
"status": 400,
"error": {
"type": "mapper_parsing_exception",
"reason": "failed to parse [part]",
"caused_by": {
"type": "number_format_exception",
"reason": "For input string: \"250-2016\""
}
}
}
}
]
}
And here's what I'm trying to post:
POST _bulk
{"index":{"_index":"quick","_type":"parts","_id":"ACI250-2016"}}
{"eMfg":"ACI","part":"250-2016"}
And the mapping is:
{
"quick": {
"mappings": {
"parts": {
"properties": {
"app": {
"type": "string"
},
"eMfg": {
"type": "string"
},
"fPart": {
"type": "long"
},
"oPart": {
"type": "string"
},
"ofPart": {
"type": "string"
},
"part": {
"type": "long"
},
"price": {
"type": "double"
},
"title": {
"type": "string"
}
}
}
}
}
}
According to your mapping, part has type long and you're trying to send "250-2016". The reason might be that at some point you sent a document whose part was coercible to a number, e.g. "250", and now you're sending a string, so it fails.
The best way is to use the mapping above to define a new index where part is mapped as a string (see below), and then you can try your bulk import again.
DELETE /quick
PUT /quick
{
"mappings": {
"parts": {
"properties": {
"app": {
"type": "string"
},
"eMfg": {
"type": "string"
},
"fPart": {
"type": "long"
},
"oPart": {
"type": "string"
},
"ofPart": {
"type": "string"
},
"part": {
"type": "string" <-- change this
},
"price": {
"type": "double"
},
"title": {
"type": "string"
}
}
}
}
}
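With the index recreated, you can confirm the new field type and then resend the same _bulk request from the question; a quick check:
GET /quick/_mapping
The part field should now show "type": "string", and the bulk document with "part": "250-2016" will index without the number_format_exception.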