Elasticsearch get field mapping for all fields without using asterisk (*) - elasticsearch

To get the mapping for all fields in an index, I do this:
GET http://localhost:9200/some_awesome_index/_mapping/field/*
But I don't want to use the *. Is there another way to get the same result without using a *? I have dynamic fields, so I can't pass a comma-separated list of every single field I need.

If you want to get the mapping for all the fields of all the types in an index, use
GET /index_name/_mapping
If you want to get the mapping for all the fields of a single type in an index, use
GET /index_name/type_name/_mapping
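For example, a minimal sketch using the index from the question (the type name my_type is a placeholder); both calls include dynamically mapped fields, so no wildcard is needed:

# mapping for all fields of all types in the index
curl -XGET 'http://localhost:9200/some_awesome_index/_mapping?pretty'

# mapping for all fields of a single type
curl -XGET 'http://localhost:9200/some_awesome_index/my_type/_mapping?pretty'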

Related

elasticsearch - Tag data with lookup table values

I’m trying to tag my data according to a lookup table.
The lookup table has these fields:
• Key - represents the field name in the data I want to tag.
In the real data the field is a subfield of the "Headers" field.
An example for the “Key” field:
"*Server*" (* is a wildcard).
• Value - represents the wanted value of the field mentioned above.
The value in the lookup table is only part of the string in the real data value.
An example for the "Value" field:
"Avtech".
• Vendor - the value I want to add to the real data if a field-value combination is found in a document.
An example for combination in the real data:
“Headers.Server : Linux/2.x UPnP/1.0 Avtech/1.0”
A match for that document in the lookup table will be:
Key= Server (with wildcard on both sides).
Value= Avtech(with wildcard on both sides)
Vendor= Avtech
So basically I'll need to add a field to that document with the value "Avtech".
The subfields in "Headers" are dynamic fields that change from document to document.
If a match is not found, I'll need to set the tag field to the value "Unknown".
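To make that concrete, here is a minimal sketch of one lookup row and a matching document, using the example values above (the name of the added tag field is just a placeholder):

Lookup row:      { "Key": "*Server*", "Value": "*Avtech*", "Vendor": "Avtech" }
Document:        { "Headers": { "Server": "Linux/2.x UPnP/1.0 Avtech/1.0", ... } }
Desired result:  { "Headers": { ... }, "tag": "Avtech" }   (or "tag": "Unknown" when no row matches)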
I've tried to use the enrich processor, using the lookup table as the source data, where the match field would be "Value" and the enrich field would be "Vendor".
In the enrich processor I didn't know how to refer to the field, since it's dynamic, and I wanted to check whether the value appears anywhere in the "Headers" subfields.
Also, I don't think there will be a match between the "Value" in the lookup table and the value of the Headers subfield, since the "Value" field in the lookup table is a substring with wildcards on both sides.
I could use some help accomplishing what I'm trying to do, and with how to search with wildcards inside an enrich processor.
Or, if you have another idea besides the enrich processor, such as a parent-child or terms-lookup mechanism, that would help too.
Thanks!
Adi.
There are two ways to accomplish this:
1. Using a combination of Logstash & Elasticsearch
2. Using only the Elasticsearch ingest node
Constraint: you need to know the position at which the vendor term occurs in the Headers field.
Approach 1
If so, you can use the grok filter to extract the term and then, based on the term found, do a lookup to get the corresponding value.
Reference
https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html
https://www.elastic.co/guide/en/logstash/current/plugins-filters-jdbc_static.html
https://www.elastic.co/guide/en/logstash/current/plugins-filters-jdbc_streaming.html
Approach 2
Create an index consisting of key-value pairs. On the ingest node, create a pipeline that consists of a grok processor followed by an enrich processor (see the sketch at the end of this answer). The grok processor works the same way as in Approach 1, and you seem to have already got the enrich part working.
Reference
https://www.elastic.co/guide/en/elasticsearch/reference/current/grok-processor.html
If you are able to isolate the subfield within "Headers" where the term of interest is present, it will make things easier for you.
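A minimal sketch of Approach 2, assuming the vendor term can be isolated from a known Headers.Server subfield and sits at a known position in that string; the index, policy, pipeline, and field names below are placeholders, not something from the original question:

# 1) lookup index with one key/value document per vendor
PUT vendor-lookup/_doc/1
{ "Value": "Avtech", "Vendor": "Avtech" }

# 2) enrich policy that matches on the extracted term and copies Vendor
PUT _enrich/policy/vendor-policy
{
  "match": {
    "indices": "vendor-lookup",
    "match_field": "Value",
    "enrich_fields": ["Vendor"]
  }
}
PUT _enrich/policy/vendor-policy/_execute

# 3) ingest pipeline: grok isolates the term, enrich looks it up,
#    and set falls back to "Unknown" when nothing matched
PUT _ingest/pipeline/tag-vendor
{
  "processors": [
    { "grok": {
        "field": "Headers.Server",
        "patterns": ["%{NOTSPACE} %{NOTSPACE} %{WORD:vendor_term}/%{NOTSPACE}"],
        "ignore_failure": true
    }},
    { "enrich": {
        "policy_name": "vendor-policy",
        "field": "vendor_term",
        "target_field": "vendor_info",
        "ignore_missing": true
    }},
    { "set": {
        "if": "ctx.vendor_info == null",
        "field": "vendor_info.Vendor",
        "value": "Unknown"
    }}
  ]
}

The grok pattern here assumes the vendor token is the third whitespace-separated token of Headers.Server (as in "Linux/2.x UPnP/1.0 Avtech/1.0"); because the extracted term is matched exactly against "Value", no wildcards are needed in the lookup.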

How to set field suffix as alias in Grafana with Elasticsearch

I have graphs defined from an Elasticsearch source with long field names such as private_data.systemMetrics.systemData.cpu.usage_user. I would like to set an alias for the CPU usage fields that displays only the field name suffix, in the above example usage_user.
Using Grafana v4.5.2
I found a way to display a short alias but it is costly: I split the query with multiple fields into a list of queries, each on a single field with an explicit alias.
Is there a better way to do it?

Elastic Search - Fetch only certain field across all indexes

I have hundreds of indexes and I want to fetch only a given field from every record under these indexes. I can do the following
curl -XGET 'http://localhost:9200/cms-2016-03-30/job/_search?pretty=true&field=CMSDataset'
This unfortunately returns a lot of things I don't want and also doesn't give me all the records (~10^6).
Also, there are many cms-* style indexes, and I want to parse through all of them and get only this field. How do I do this?
You need to use source filtering instead of fields (which is deprecated):
curl -XGET 'http://localhost:9200/cms-2016-03-30/job/_search?pretty=true&_source=CMSDataset'
(note the _source parameter in place of field)
From the official documentation:
The fields parameter is about fields that are explicitly marked as stored in the mapping, which is off by default and generally not recommended. Use source filtering instead to select subsets of the original source document to be returned.
UPDATE
You can use the size parameter (e.g. 100) in order to return more records (by default it's 10) and then simply use * as the index name:
curl -XGET 'http://localhost:9200/*/_search?pretty=true&size=100&_source=CMSDataset'
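For completeness, an equivalent request-body form (just a sketch; cms-* and CMSDataset are taken from the question, and on newer Elasticsearch versions the JSON Content-Type header is required):

curl -XGET 'http://localhost:9200/cms-*/_search?pretty' -H 'Content-Type: application/json' -d '
{
  "size": 100,
  "_source": ["CMSDataset"],
  "query": { "match_all": {} }
}'

Note that size alone won't get anywhere near ~10^6 hits (the result window is capped at 10,000 by default on most versions), so retrieving every record would require the scroll API or some other form of paging.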

Elasticsearch query for what fields have a given type?

I have an elasticsearch (version 1.7) cluster with multiple indices. Each index has multiple doc_types, and each has fields w/ a variety of types. I'd like to get a list of field names for a given field type. This would necessarily be a nested list. For example, I'd like to query for field type "string" and get back {index1: {doc_type1.1: [field1.1.1, field1.1.2], ...}, ...} -- the leaves of this nested dict are only those fields w/ the given type. So the hits for this query won't be documents but rather a subset of the cluster's mapping. Is this possible using Elasticsearch?
One solution: I know I can get the mapping as a dict using Python, then work on the mapping dict to recover this nested list. But I think there should be an elasticsearch way of doing this, not a Python solution. In my searches through the documentation I just keep finding the "type filter" which filters by doc_type, not field type.
There's currently no way of achieving this. The _mapping endpoint will return all fields of the requested mapping type(s).
However, there might be a way, provided your fields have a special naming convention hinting at their type, for instance name_str (string field for "name"), age_int (integer field for "age"), etc. In this case, you could use response filtering on the _mapping call and retrieve only the fields ending with _str:
curl -XGET localhost:9200/yourindex/_mapping/yourtype?filter_path=*_str
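If the shorthand above doesn't filter as expected, keep in mind that filter_path is matched against the full dot-separated response path, so spelling the path out may help; a sketch reusing the answer's placeholders (yourindex, yourtype, and the _str suffix convention):

curl -XGET 'localhost:9200/yourindex/_mapping/yourtype?pretty&filter_path=*.mappings.*.properties.*_str'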

How to query elasticsearch without specifying which property to match

I have an elasticsearch document with multiple properties, like title, body, author.
I'd like to get results from a query that matches ANY of the fields.
Usually, I'll query by specifying a property to match. I'd rather write a more flexible query that will return the whole document if any of the properties are matched.
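For illustration, a sketch of both query styles (the index name articles is a placeholder, title is one of the properties mentioned above, and the query_string variant is a common field-agnostic option rather than an answer from this thread; its default-field behaviour varies by Elasticsearch version):

# the usual, single-property query
curl -XGET 'http://localhost:9200/articles/_search?pretty' -H 'Content-Type: application/json' -d '
{ "query": { "match": { "title": "elasticsearch" } } }'

# a field-agnostic alternative: query_string with no field specified
curl -XGET 'http://localhost:9200/articles/_search?pretty' -H 'Content-Type: application/json' -d '
{ "query": { "query_string": { "query": "elasticsearch" } } }'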
