How to cast a field in Elasticsearch pipelines / Painless script - elasticsearch

I have an application that logs level as an integer. I am using filebeat to send the logs to ES. I have set level as a string in the ES index, which works for most applications. But when filebeat receives an integer, the indexing of course fails with:
"type":"illegal_argument_exception","reason":"field [level] of type [java.lang.Integer] cannot be cast to [java.lang.String]"
In my document: "level":30
I added a step Script in my ingestion pipeline. But I can't manage to make it work: either I get a compilation error or the script is somehow failing and nothing at all get indexed.
Some very basic script I tried:
if (doc['level'].value == 30) {
  doc['level'].value = 'info';
}
Any idea on how to handle this in ES pipelines?
Regards
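For reference, scripts in ingest pipelines operate on ctx rather than doc, which is likely why the attempt above fails to compile. A minimal sketch of a script processor that casts level to a string (the pipeline name cast-level is made up for the example):

```
PUT _ingest/pipeline/cast-level
{
  "processors": [
    {
      "script": {
        "source": "if (ctx.level != null) { ctx.level = ctx.level.toString(); }"
      }
    }
  ]
}
```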

The best way is to transform the data before sending it to ES.
You can use a processor in filebeat to transform your data:
https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html
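As a sketch, Filebeat's convert processor (available in recent Filebeat versions; the field name is taken from the question) can force level to a string before it is shipped:

```
processors:
  - convert:
      fields:
        - {from: "level", type: "string"}
      ignore_missing: true
      fail_on_error: false
```

With fail_on_error: false the event is still shipped unchanged if a conversion fails, instead of being dropped.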

Related

Apache NiFi: PutElasticSearchHttp is not working, with blank error

I currently have Elasticsearch version 6.2.2 and Apache Nifi version 1.5.0 running on the same machine. I'm trying to follow the Nifi example located: https://community.hortonworks.com/articles/52856/stream-data-into-hive-like-a-king-using-nifi.html except instead of storing to Hive, I want to store to Elasticsearch.
Initially I tried using the PutElasticsearch5 processor but I was getting the following error on Elasticsearch:
Received message from unsupported version: [5.0.0] minimal compatible version is: [5.6.0]
When I tried Googling this error message, it seemed like the consensus was to use the PutElasticsearchHttp processor. My Nifi looks like:
And the configuration for the PutElasticsearchHttp processor:
When the flowfile gets to the PutElasticsearchHttp processor, the following error shows up:
PutElasticSearchHttp failed to insert StandardFlowFileRecord into Elasticsearch due to , transferring to failure.
It seems like the reason is blank/null. There also wasn't anything in the Elasticsearch log.
After the ConvertAvroToJson, the data is a JSON array with all of the entries on a single line. Here's a sample value:
{"City": "Athens",
"Edition": 1896,
"Sport": "Aquatics",
"sub_sport": "Swimming",
"Athlete": "HAJOS, Alfred",
"country": "HUN",
"Gender": "Men",
"Event": "100m freestyle",
"Event_gender": "M",
"Medal": "Gold"}
Any ideas on how to debug/solve this problem? Do I need to create anything in Elasticsearch first? Is my configuration correct?
I was able to figure it out. After the ConvertAvroToJSON, the flow file was a single line containing a JSON array of records. Since I wanted to index each record individually, I needed a SplitJson processor. Now my Nifi looks like this:
The configuration of the SplitJson looks like this:
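The original screenshot is missing, but for a top-level JSON array the essential SplitJson setting is the JsonPath expression, typically:

```
JsonPath Expression: $.*
```

Each element of the array then becomes its own flow file.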
The index name cannot contain the / character. Try with a valid index name: e.g. sports.
I had a similar flow, where changing the type to _doc did the trick after including SplitJson.

ElasticSearch MetricBeat mapping issue

I have installed MetricBeat on my Windows system. And started it. In the configuration metricbeats.yml I have set the elasticsearch property as follows
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["10.193.164.145:9200"]
  template.name: "metricbeat"
  template.path: "metricbeat.template.json"
  template.overwrite: false
Now when I start my MetricBeat, I repeatedly get this message in the logs
Can not index event (status=400): "MapperParsingException[mapping [default]]; nested: MapperParsingException[No handler for type [keyword] declared on field [hostname]]; "
What is the issue here?
Is it due to compatibility? My Elasticsearch version is 1.4.x and Metricbeat is 5.5.x.
Please do let me know.
1.4 doesn't seem to be supported anymore; the keyword field type used in Metricbeat's template only exists since Elasticsearch 5.0, which is why the mapping fails.
https://discuss.elastic.co/t/metricbeat-compatibility-with-elasticsearch/99213
I don't think any compatibility matrix currently pairs the Elasticsearch 1.x series with Metricbeat 5.x, but you can cross-check the compatibility matrix here:
product compatibility matrix
You can also check the document below for reference; not sure whether it helps with your problem:
elastic product end of life dates

Elasticsearch _cat/indices is giving error?

Currently I am using the Elasticsearch helpers scan API, but it is not able to fetch data.
Command:
helpers.scan(
    client=client,
    query={"query": {"match_all": {}}},
    scroll='10m',
    index="debug",
    doc_type="tool",
    _source=True
)
output :
......
<generator object scan at 0x1556640>
<generator object scan at 0x1556640>
<generator object scan at 0x1556640>
<generator object scan at 0x1556640>
<generator object scan at 0x1556640>
.......
when I am doing
curl -XGET "http://localhost:9200/debug/tool/_search?pretty=true&q=*:*"
(only 10 by default)
it is able to fetch the data.
After digging the elastic when I check the indices using this command :
http://127.0.0.1:9200/_cat/indices
I found this : No handler found for uri [/_cat/indices] and method [GET]
But when I use http://localhost:9200/_aliases, I can see my indices. Why are the indices not listed when I run the _cat/indices command?
_cat/indices is only available in ES versions 1.0 and later; on older versions the request fails with exactly this No handler found error.

Spring Data MongoDB - $eq within $project support

I'm currently writing an aggregation query for MongoDB in my Spring project in which I'm using $project operator. Within this operator I would like to compare two fields in order to return the result as projected "matches" key value. Here's the mongoDB shell equivalent (which works):
{ $project: {
    matches: { $eq: ["$lastDate", "$meta.date"] }
} }
I've read Spring Data MongoDB documentation and found some useful info about ProjectionOperator's 'andExpression' method which uses SpEL. The result Java code of my investigation was:
new ProjectionOperation().andExpression("lastDate == meta.date").as("matches")
Unfortunately I'm receiving exception:
java.lang.IllegalArgumentException: Unsupported Element:
org.springframework.data.mongodb.core.spel.OperatorNode#70c1152a Type: class org.springframework.data.mongodb.core.spel.OperatorNode You probably have a syntax error in your SpEL expression!
As far as I've checked, Spring Data MongoDB handles all Arithmetic operators correctly but cannot handle the comparison ones. Therefore I want to ask is there any other way to create such query with Spring Data MongoDB? Or maybe I don't know something crucial about SpEL?
I resolved this issue by passing JSON aggregate command (created with DBObjects in order to preserve flexibility of the query) to MongoDB, i.e.:
MongoOperations#executeCommand(DBObject command)
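As a sketch of that workaround (the collection name events is hypothetical), the command document passed to executeCommand would look roughly like:

```
{
  "aggregate": "events",
  "pipeline": [
    { "$project": { "matches": { "$eq": ["$lastDate", "$meta.date"] } } }
  ]
}
```

Building this from BasicDBObject instances keeps the query flexible while bypassing the SpEL limitation.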

debugging elasticsearch

I'm using tire and elasticsearch. The service started on port 9200. However, it was returning two errors:
"org.elasticsearch.search.SearchParseException: [countries][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"query":{"query_string":{"query":"name:"}}}]]"
and
"Caused by: org.apache.lucene.queryParser.ParseException: Cannot parse 'name:': Encountered "<EOF>" at line 1, column 5."
So, I reinstalled elasticsearch and the service container. The service starts fine.
Now, when I search using tire I get no results when results should appear, and I don't receive any error messages.
Does anybody have any idea how I might find out what is wrong, let alone fix it?
First of all, you don't need to reindex anything in the usual cases. It depends on how you installed and configured elasticsearch, but when you install and upgrade e.g. with Homebrew, the data is persisted safely.
Second, no need to reinstall anything. The error you're seeing means just what it says on the tin: a SearchParseException, i.e. your query is invalid:
{"query":{"query_string":{"query":"name:"}}}
Notice that you didn't pass any query string for the name qualifier. You have to pass something, e.g.:
{"query":{"query_string":{"query":"name:foo"}}}
or, in Ruby terms:
Tire.index('test') { query { string "name:hey" } }
See this update to the Railscasts episode on Tire for an example how to catch errors due to incorrect Lucene queries.
