Is there an option in Elasticsearch to store values just for the purpose of retrieval, not for searching? So when indexing we'll index all fields, but when searching we'll search on a single field only, while still needing the other data returned.
For example, we'll index products with fields such as Name, SKU, Supplier Name, etc. Of these, only Name needs to be indexed and searched. SKU and Supplier Name are stored only so they can be retrieved along with a search.
Since the _source document is stored anyway, the best way to achieve what you want is to neither store nor index any fields except the one you're searching on, like this:
PUT my-index
{
  "mappings": {
    "_source": {
      "enabled": true          <--- true by default, but adding for completeness
    },
    "properties": {
      "name": {
        "type": "text",
        "index": true          <--- true by default, but adding for completeness
      },
      "sku": {
        "type": "keyword",
        "index": false,        <--- don't index this field
        "store": false         <--- false by default, but adding for completeness
      },
      "supplier": {
        "type": "keyword",
        "index": false,        <--- don't index this field
        "store": false         <--- false by default, but adding for completeness
      }
    }
  }
}
So to sum up:
- the fields you want to search on must have index: true
- the fields you don't want to search on must have index: false
- store is false by default, so you don't need to specify it
- _source is enabled by default, so you don't need to specify it
- enabled should only be used at the top level or on object fields, so it has no place here
With the above mapping, you can:
- search on name
- retrieve all fields from the _source document, since _source is stored by default and contains the original document
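As a sketch, assuming documents were indexed into my-index with the mapping above, a search on name still returns sku and supplier in each hit's _source (the query value "widget" is illustrative):

```json
GET my-index/_search
{
  "query": {
    "match": { "name": "widget" }
  }
}
```

Each hit's _source will contain name, sku, and supplier, even though only name is searchable.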
Related
I have a big size field in MySQL and do not want to save the original value to ElasticSearch. Is there a method just like Lucene Field.Store.NO?
Thanks.
You just need to define the "store" mapping accordingly, e.g.:
PUT your-index
{
  "mappings": {
    "properties": {
      "some_field": {
        "type": "text",
        "index": true,
        "store": false
      }
    }
  }
}
You may also want to disable the _source field. From the documentation on disabling the _source field:
The _source field contains the original JSON document body that was passed at index time [...] Though very handy to have around, the source field does incur storage overhead within the index.
For this reason, it can be disabled as follows:
PUT your-index
{
  "mappings": {
    "_source": {
      "enabled": false
    }
  }
}
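Note that with _source disabled you can no longer retrieve the original document, so any field you still want returned must be mapped with "store": true and requested via stored_fields. A minimal sketch (field names are illustrative):

```json
PUT your-index
{
  "mappings": {
    "_source": { "enabled": false },
    "properties": {
      "title":    { "type": "text", "store": true },
      "big_blob": { "type": "text", "store": false }
    }
  }
}

GET your-index/_search
{
  "query": { "match": { "title": "foo" } },
  "stored_fields": [ "title" ]
}
```

Be aware that disabling _source also breaks features that need the original document, such as update, reindex, and highlighting.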
I am using Elasticsearch v6 to search my product catalog.
My product has a number of fields, such as title, description, price, etc. One of the fields is photo_path, which contains the location of the product photo on disk.
photo_path does not need to be searched, but does need to be retrieved.
Question: Is there a way to mark this field as not searchable/not indexed? And is this a good idea? For example, will I save storage/processing time by marking this field not searchable?
I have seen this answer and read about _source and _all, but since _all is deprecated in version 6, I am confused about what to do.
If you want a field to be neither indexed nor queryable, set the property "index": false. And if you only want the "photo_path" field in the search results, include just this field in _source (which saves disk space and fetches less data from disk). The mappings would look like below:
{
  "mappings": {
    "data": {
      "_source": {
        "includes": [
          "photo_path" // only this field is kept in _source, so it's all a search returns
        ]
      },
      "properties": {
        "photo_path": {
          "type": "keyword",
          "doc_values": false, // set doc_values to false if you don't need this field for sorting/aggregations
          "index": false // don't index this field
        },
        "title": {
          "type": "..."
        }
      }
    }
  }
}
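With that mapping, a search on title returns hits whose _source contains only photo_path. A sketch, assuming the mapping above was created for an index named products (both the index name and the query value are illustrative):

```json
GET products/_search
{
  "query": {
    "match": { "title": "camera" }
  }
}
```

Each hit's _source will contain only photo_path, since the other fields were pruned by the includes setting before being stored.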
Using Elastic 2.3.5. Is there a way to make a field filterable, but not searchable? For example, I have a language field with values like en-US. By setting several filters in query->bool->filter->term, I'm able to filter the result set without affecting the score, for example restricting to only documents that have en-US in the language field.
However, I want a query searching for the term en-US to return no results, since this is not really a field indexed for searching, but just one I can filter on.
Can I do this?
Elasticsearch uses an _all field to allow fast full-text search on entire documents. This is why searching for en-US across all fields of all documents returns the one containing 'language':'en-US'.
https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-all-field.html
You can specify "include_in_all": false in the mapping to exclude a field from _all.
PUT my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "title": {
          "type": "string"
        },
        "country": {
          "type": "string"
        },
        "language": {
          "type": "string",
          "include_in_all": false
        }
      }
    }
  }
}
In this example, searching for 'US' across all fields will return only documents containing US in title or country. But you will still be able to filter your query using the language field.
https://www.elastic.co/guide/en/elasticsearch/reference/current/include-in-all.html
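A sketch of such a query (the match value is illustrative): the full-text part no longer sees language, while the term filter still restricts by it without affecting the score:

```json
GET my_index/_search
{
  "query": {
    "bool": {
      "must": {
        "match": { "title": "travel guide" }
      },
      "filter": {
        "term": { "language": "en-US" }
      }
    }
  }
}
```

Meanwhile, a plain search for en-US against _all will no longer match documents on their language field alone.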
Does Elasticsearch provide functionality to map different fields to a single field and use that single field for search?
For example, _all refers to all the fields in the docs.
Similarly, do we have any mapping configuration to define a field that refers to multiple fields?
E.g.: I have the fields Brand, Name, and Category.
I need to map Brand and Name to a single field custome_field.
I want this at mapping time and not at query time. I know cross_fields does that at query time.
Take a look at the copy_to functionality. It acts just like a custom _all. See more about this here:
In Metadata: _all field we explained that the special _all field
indexes the values from all other fields as one big string. Having all
fields indexed into one field is not terribly flexible though. It
would be nice to have one custom _all field for the person’s name, and
another custom _all field for their address.
Elasticsearch provides us with this functionality via the copy_to
parameter in a field mapping:
PUT /my_index
{
  "mappings": {
    "person": {
      "properties": {
        "first_name": {
          "type": "string",
          "copy_to": "full_name"
        },
        "last_name": {
          "type": "string",
          "copy_to": "full_name"
        },
        "full_name": {
          "type": "string"
        }
      }
    }
  }
}
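With that mapping, a query against full_name matches terms copied from either first_name or last_name (a sketch; the name is illustrative):

```json
GET /my_index/_search
{
  "query": {
    "match": {
      "full_name": "John Smith"
    }
  }
}
```

Note that copy_to copies values into the target field's index only; full_name does not appear in the document's _source.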
Ok, in my Elasticsearch instance I am using the following mapping for an index:
{
  "mappings": {
    "mytype": {
      "type": "object",
      "dynamic": "false",
      "properties": {
        "name": {
          "type": "string"
        },
        "address": {
          "type": "string"
        },
        "published": {
          "type": "date"
        }
      }
    }
  }
}
it works. In fact if I put a malformed date in the field "published" it complains and fails.
Also I've the following configuration:
...
node.name : node1
index.mapper.dynamic : false
index.mapper.dynamic.strict : true
...
And without the mapping, I can't really use the type. The problem is that if I insert something like:
{
  "name": "boh58585",
  "address": "hiohio",
  "published": "2014-4-4",
  "test": "hophiophop"
}
it will happily accept it. This is not the behaviour I expect, because the field test is not in the mapping. How can I restrict the fields of the document to only those that are in the mapping?
The use of "dynamic": false tells Elasticsearch to silently ignore new fields rather than add them to the mapping. If you want an error thrown when you try to index documents with fields outside of the defined mapping, use "dynamic": "strict" instead.
From the docs:
"The dynamic parameter can also be set to strict, meaning that not only new fields will not be introduced into the mapping, parsing (indexing) docs with such new fields will fail."
Since you've defined this in the settings, I would guess that leaving out the dynamic from the mapping definition completely will default to "dynamic": "strict".
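A sketch of the strict variant applied to the mapping from the question: with "dynamic": "strict", indexing the document above (which carries the extra test field) is rejected with a mapping error instead of being silently accepted:

```json
{
  "mappings": {
    "mytype": {
      "dynamic": "strict",
      "properties": {
        "name": { "type": "string" },
        "address": { "type": "string" },
        "published": { "type": "date" }
      }
    }
  }
}
```

Documents containing only name, address, and published are still indexed normally.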
Is your problem with the malformed date field?
I would fix the date issue and continue to use dynamic: false.
You can read about the ways to set up the date field mapping for a custom format here.
Stick the date format string in a {type: date, format: ?} mapping.
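As a sketch, a format matching dates like 2014-4-4 from the question could look like the following (the yyyy-M-d pattern is an assumption about your input data; adjust it to what you actually index):

```json
{
  "mappings": {
    "mytype": {
      "properties": {
        "published": {
          "type": "date",
          "format": "yyyy-M-d" // assumed pattern allowing single-digit months/days
        }
      }
    }
  }
}
```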