Error occurred when adding a new field in Elasticsearch - elasticsearch

I am using Elasticsearch as a database. Due to changing requirements, I added a new field in ES. After adding it, my Node.js application threw an error on the line where I made the new field sortable.
My expectation is to update the index mappings for all old documents when I add a new field in ES.
Thank you very much for helping me!
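A minimal sketch of the usual two-step approach, assuming a hypothetical index my-index and a new keyword field new_field (both names are placeholders, not from the question): first add the field to the live mapping with the PUT mapping API, then optionally backfill old documents so the field is present (and sortable) on them too.

# 1) Add the new field to the existing mapping (keyword fields are sortable)
PUT /my-index/_mapping
{
  "properties": {
    "new_field": { "type": "keyword" }
  }
}

# 2) Optionally backfill old documents that lack the field with a default value
POST /my-index/_update_by_query
{
  "query": {
    "bool": {
      "must_not": { "exists": { "field": "new_field" } }
    }
  },
  "script": {
    "lang": "painless",
    "source": "ctx._source.new_field = 'default'"
  }
}

Sorting on a field that some documents lack does not fail, but those documents sort last (or first), which is why the backfill step can matter.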

Related

Deleting new index after/during reindex operation

I'm new to Elasticsearch operations. I had to rename the fields in an existing Elasticsearch index.
I updated the existing ingest pipeline by resubmitting its configuration with an additional rename processor.
I did not update the index template because the pipeline name stayed the same.
Then I posted the reindex request without creating the new index first.
I noticed that the new index got created on the master node.
I have not added an alias for the new index yet, since I wanted to see whether the operation completed fine first; then I would add the aliases so the data shows up in my Kibana search.
Also, I had not specified any field types, and one of the date fields was receiving string data, so during the field-name conversion errors started appearing in the Elasticsearch logs.
Now the data for the index is not showing up in the Kibana search at all, and there are error messages in the logs:
[o.e.x.i.IndexLifecycleRunner] [leader-node-02] policy [logs-for-dev-policy] for index [new-index-v0] on an error step due to a transitive error, moving back to the failed step [check-rollover-ready] for execution. retry attempt [2]
I'm thinking of deleting the newly created index new-index-v0, since the old index index-for-dev is still there.
I tried reading the documentation but could not find anything that says whether deleting the new index will cause any problem for the old one.
Any suggestions, please?
I am using Elasticsearch version 7.2.
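For what it's worth, _reindex only reads from the source index and writes to the destination, so deleting the destination does not touch the source. A minimal sketch, using the index names from the question (check for a still-running reindex task first):

# Check whether the reindex task is still running (cancel it if so)
GET _tasks?detailed=true&actions=*reindex

# Deleting the reindex destination leaves the source untouched
DELETE /new-index-v0

# Sanity check: the old index is still intact
GET /index-for-dev/_count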

Update indices in Elasticsearch on adding new documents to my database

I'm new to Elasticsearch but had to work with it. I have successfully set it up using Logstash to connect it to my Oracle database (one particular table). Now, if new records are added to the table in my Oracle database that I built the index on, what should be done?
I have thought of two solutions:
Re-build the indices by running the Logstash conf file.
On insert into the table, also POST to Elasticsearch.
The first solution is not working as it should. I mean that if 'users' is the table I have updated with new records, then on re-building the indices (for the 'users' table) in Elasticsearch, the new records should also be reflected in the Logstash get query.
The first solution would do as a POC.
So, any help is appreciated.
Thank you Val for pointing me in the right direction.
However, the first brute-force solution came down to the document type set in the Logstash conf file:
{"document_type":"same_type"}
This must be consistent with the previously used type. The first time, I had run it with a different type (Same_type). After adding new records, I used same_type, so Elasticsearch threw a multiple-mapping rejection exception.
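To illustrate the rejection (index and type names here are made up), on Elasticsearch 6.x an index can hold only one mapping type, so indexing under a second type fails with an error along these lines:

# First document creates the index with type "Same_type"
POST /users/Same_type/1
{ "name": "alice" }

# Indexing under a second type into the same index is rejected
POST /users/same_type/2
{ "name": "bob" }

# => Rejecting mapping update to [users] as the final mapping
#    would have more than 1 type: [Same_type, same_type]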
For further clarification, I looked it up here.
Thank you guys.

Elasticsearch update index template

I have a question about Elasticsearch index templates; here is the scenario.
I created a template for a series of indices, named templateA, and there are some indices created from this template, named Index-yyyy.mm.dd1 and Index-yyyy.mm.dd2. After a period of time, I need to add some new fields to the indices, so I updated templateA.
So, how can I make the previously created indices use the new template? Please give me some suggestions. Thanks a lot!
The template is only applied at index creation time. You'll have to either modify the mapping of the existing indices, or recreate the indices and reindex your data.
You can use the PUT mapping API to modify an existing mapping.
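A minimal sketch, assuming the template's index pattern is Index-* and the new field is a keyword called new_field (both placeholders): the PUT mapping API accepts wildcards, so one request can add the field to all existing matching indices.

# Add the new field to every existing index matching the pattern
PUT /Index-*/_mapping
{
  "properties": {
    "new_field": { "type": "keyword" }
  }
}

Note that this only works for additive changes; changing the type of an existing field still requires a reindex.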
At least in Elasticsearch 7.15 you can create or update an index template using the same endpoint. Note, though, that:
Index templates are applied during data stream or index creation
It is obvious, but "old" data still needs to be updated in some way.
In case you are using Logstash, just restart it to trigger the reindex.
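A minimal sketch of the create-or-update call mentioned above, using the composable template endpoint available since 7.8 (the index pattern and field are placeholders):

# The same request both creates and updates the template
PUT _index_template/templateA
{
  "index_patterns": ["Index-*"],
  "template": {
    "mappings": {
      "properties": {
        "new_field": { "type": "keyword" }
      }
    }
  }
}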

Specifying data type and analyzer while creating index

I am using Elasticsearch to index my data, and I was able to do it with the following POST request:
http://localhost:9200/index/type/id
{
JSON data over here
}
Yesterday, while I was going through some Elasticsearch tutorials, I found one person mentioning setting an analyzer on the fields where we plan to do full-text search. While googling, I found that the mapping API can be used to update data types and analyzers, but in my case I want to do it while creating the index.
How can I do that?
You can create the index with custom settings (and mappings) in a first request and then index your data with a second request. You cannot do both at the same time.
However, if you index your data first and the index does not exist yet, it will be created automatically with default settings. You can then update your mappings.
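A minimal sketch of that two-request flow, with made-up analyzer and field names, in 7.x syntax:

# 1) Create the index with a custom analyzer and a mapping in one request
PUT /index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "asciifolding"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "description": { "type": "text", "analyzer": "my_analyzer" }
    }
  }
}

# 2) Then index documents as before
POST /index/_doc/1
{ "description": "full text to be analyzed" }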
Source: Index

Reindexing Elasticsearch or updating indexes?

I am now working with Elasticsearch, and I can't figure out how to update an Elasticsearch index, type, or document without deleting and reindexing. Or is that the best way to achieve it?
So if I have products in my SQL product table, should I rather delete the product type and reindex it, or even the entire DB as an index on Elasticsearch? What is the best use case and how can I achieve it?
I would like to do it with NEST preferably, but if it is easier, plain Elasticsearch works for me as well.
Thanks
This can be a real challenge! Historic records in Elasticsearch will need to be reindexed when the template changes. New records will automatically be formatted according to the template you specify.
Using this link has helped us a lot:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html
You'll want to be sure to have the Logstash filter set up to match the fields in your template.
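A minimal sketch of reindexing the historic records after a template change (index and alias names are placeholders):

# Copy the old documents into a new index created under the updated template
POST _reindex
{
  "source": { "index": "products-old" },
  "dest":   { "index": "products-new" }
}

# Then atomically repoint the alias the application queries
POST _aliases
{
  "actions": [
    { "remove": { "index": "products-old", "alias": "products" } },
    { "add":    { "index": "products-new", "alias": "products" } }
  ]
}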
