I have created an Elasticsearch index for searching my MySQL database, using a Python Flask API. I want to know: if the database is updated with new tables and records, will the index be updated by itself, or do I have to create the index again or version the index manually?
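Elasticsearch does not watch an external MySQL database, so the index will not update by itself; new and changed rows have to be pushed to it, either continuously (e.g. with Logstash's JDBC input) or by rebuilding. Below is a minimal sketch of the manual-versioning approach mentioned in the question, assuming elasticsearch-py 7.x and PyMySQL; every name (index, alias, table, credentials) is a placeholder.

```python
# Sketch: rebuild the index under a new versioned name, then flip an alias.
# Assumes elasticsearch-py 7.x and PyMySQL; every name below is a placeholder.
import time

import pymysql
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")
db = pymysql.connect(host="localhost", user="app", password="secret",
                     database="appdb", cursorclass=pymysql.cursors.DictCursor)

ALIAS = "products"                         # what the Flask API queries
new_index = f"{ALIAS}-{int(time.time())}"  # e.g. products-1700000000

es.indices.create(index=new_index)

with db.cursor() as cur:
    cur.execute("SELECT id, name, description FROM products")
    helpers.bulk(es, ({"_index": new_index, "_id": row["id"], "_source": row}
                      for row in cur))

# Flip the alias atomically; older index versions can be deleted afterwards.
old = (list(es.indices.get_alias(name=ALIAS))
       if es.indices.exists_alias(name=ALIAS) else [])
actions = [{"add": {"index": new_index, "alias": ALIAS}}]
actions += [{"remove": {"index": i, "alias": ALIAS}} for i in old]
es.indices.update_aliases(body={"actions": actions})
```

Because the API searches through the alias, the swap is atomic and needs no code change; whether you rebuild on a schedule or stream changes instead depends on how stale the index is allowed to get.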
I have to design a GET API on Elasticsearch such that, if the data is not yet present in Elastic, it queries the backing DB and checks whether the record is present there.
The main reason we need to think about this is that there is a delay of a few seconds between the DB and Elastic, and the GET call will be invoked far too many times for it to hit the DB directly.
Any ideas? I'm using spring-data-elasticsearch right now.
If you are using Elastic for search, then keep it Elastic-only. I would suggest that if the data is not present, you reindex it: delete the specific data from Elastic if present, then save it again from the database (JPA). Your Elastic needs to be consistent with JPA. Anyway, if you still want to go ahead with it, add an if-condition for null, and when it triggers, search via the JPA repository.
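The thread uses spring-data-elasticsearch, but the shape of that fallback is client-agnostic. A minimal sketch of it, with elasticsearch-py and PyMySQL standing in and the index, table, and column names all assumed:

```python
# Sketch of the fallback above: try Elastic first, fall back to the DB on a
# miss, and write the row back so Elastic converges. All names are assumed.
import pymysql
from elasticsearch import Elasticsearch, NotFoundError

es = Elasticsearch("http://localhost:9200")
db = pymysql.connect(host="localhost", user="app", password="secret",
                     database="appdb", cursorclass=pymysql.cursors.DictCursor)

def get_record(record_id):
    try:
        return es.get(index="records", id=record_id)["_source"]
    except NotFoundError:
        pass  # not indexed yet -- the few-second lag described above

    with db.cursor() as cur:
        cur.execute("SELECT * FROM records WHERE id = %s", (record_id,))
        row = cur.fetchone()

    if row is not None:
        # Write-through so the next GET for this id is served by Elastic.
        es.index(index="records", id=record_id, body=row)
    return row
```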
I have worked with Solr in local (standalone) mode. Now I am trying to import my tables from an Oracle database into a collection created in SolrCloud. How do I achieve this?
I have searched a lot, but no documents helped much. I want a clear idea of how to index database tables into SolrCloud.
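One manual route is to read the rows out of Oracle and post them to the SolrCloud collection over HTTP. A minimal sketch, assuming the cx_Oracle and pysolr packages; hosts, credentials, table, and field names are placeholders, and the collection schema is assumed to already define these fields (with emp_id as its uniqueKey):

```python
# Sketch: push rows from an Oracle table into a SolrCloud collection in batches.
# Assumes cx_Oracle and pysolr; all connection details and names are placeholders.
import cx_Oracle
import pysolr

# Any node of the SolrCloud cluster works; address the collection by name.
solr = pysolr.Solr("http://solr-node1:8983/solr/mycollection", always_commit=True)

conn = cx_Oracle.connect("scott", "tiger", "dbhost:1521/ORCLPDB1")
cur = conn.cursor()
cur.execute("SELECT emp_id, emp_name, dept FROM employees")
cols = [d[0].lower() for d in cur.description]

batch = []
for row in cur:
    batch.append(dict(zip(cols, row)))
    if len(batch) >= 500:   # send in chunks rather than one doc at a time
        solr.add(batch)
        batch = []
if batch:
    solr.add(batch)
```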
I am working on an application built using Spring/Hibernate with PostgreSQL as the database. I recently implemented Elasticsearch in this application; I am new to it, so I have some doubts about how to use ES here.
Is it good to save/update a document in Elasticsearch whenever I save/update in PostgreSQL? If not, what is the best way to do it?
Should I use data from ES for all my queries (get user profile, educational profile, contact profile, etc.), or is it better to fetch those from PostgreSQL and use ES only for search?
What is the best way to keep my PostgreSQL and ES in sync?
Thanks in advance for your help.
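On the sync question, a widely used pattern is to treat PostgreSQL as the source of truth and push changes into ES incrementally, keyed off an updated_at column; by-id profile reads then stay on PostgreSQL and ES serves only search, since ES is only eventually consistent with the DB. A rough sketch under those assumptions, with psycopg2 and elasticsearch-py, where the users table, index, and columns are invented:

```python
# Sketch: incremental one-way sync, PostgreSQL (source of truth) -> Elasticsearch,
# driven by an updated_at column. Table, index, and column names are assumptions.
import psycopg2
import psycopg2.extras
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")
pg = psycopg2.connect("dbname=appdb user=app password=secret host=localhost")

def sync_users(last_sync):
    """Index every user row changed since last_sync; return the new watermark."""
    with pg.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
        cur.execute(
            "SELECT id, name, education, contact, updated_at "
            "FROM users WHERE updated_at > %s ORDER BY updated_at",
            (last_sync,),
        )
        rows = cur.fetchall()

    if rows:
        helpers.bulk(es, ({"_index": "users", "_id": r["id"],
                           "_source": {k: v for k, v in r.items()
                                       if k != "updated_at"}}
                          for r in rows))
    return rows[-1]["updated_at"] if rows else last_sync
```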
I'm running a site as an Azure Web App, using Azure SQL, Azure Search, and Azure Blob Storage.
Currently the Azure Search index (for the document search) is built using an indexer drawing data from multiple SQL tables (via a view) to associate permissions and other metadata indirectly associated with the documents, including the URL to the doc in Azure Blob Storage.
The newly released update to Azure Search seems to allow full-text searching of blobs, which is great, but the data source has to be changed to the blob storage container, missing out on the additional metadata that would be populated by my SQL view.
Can a Search index document be populated by more than one data source, or can a second indexer update an existing search document (to add the full-text data to the document)?
I've looked at capturing the data and creating the full text within the SQL DB at document upload, but on Azure Web Apps there doesn't seem to be a suitable parser, and Azure SQL full-text indexing doesn't support Word or PDF docs, which are mostly what I'm uploading.
Is it possible to modify the indexer to incorporate Azure Blob Storage full text indexing, or should I be looking for a completely different approach?
Azure Search indexes can be populated by multiple indexers, or even by a mix of indexers and your own code calling the indexing API. (Specifically, indexers use the mergeOrUpload indexing action.)
You just need to make sure that the SQL and blob indexers agree on the document key, so that they update the same documents.
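As a rough illustration of that setup with the Python azure-search-documents SDK, where the service URL, keys, and every data source, index, and field name are assumptions:

```python
# Sketch: two data sources and two indexers feeding one index. Both must resolve
# to the same document key so blob content merges into the SQL-built document.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    FieldMapping,
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient("https://myservice.search.windows.net",
                             AzureKeyCredential("<admin-key>"))

client.create_data_source_connection(SearchIndexerDataSourceConnection(
    name="sql-ds", type="azuresql",
    connection_string="<azure-sql-connection-string>",
    container=SearchIndexerDataContainer(name="DocumentPermissionsView")))

client.create_data_source_connection(SearchIndexerDataSourceConnection(
    name="blob-ds", type="azureblob",
    connection_string="<storage-connection-string>",
    container=SearchIndexerDataContainer(name="documents")))

# The SQL view exposes the key column directly.
client.create_indexer(SearchIndexer(
    name="sql-indexer", data_source_name="sql-ds", target_index_name="docs"))

# Map the blob path onto the same key field. Blob paths typically need a
# base64 mapping function to be key-safe, in which case the SQL view must
# expose the identically encoded value.
client.create_indexer(SearchIndexer(
    name="blob-indexer", data_source_name="blob-ds", target_index_name="docs",
    field_mappings=[FieldMapping(source_field_name="metadata_storage_path",
                                 target_field_name="docKey")]))
```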
HTH!
I have some metadata in Elasticsearch. When the metadata is updated, I need to sync the updates to MySQL. So I'm wondering: is there an ES watch/trigger that could do this automatically?
There isn't any direct action in Watcher for calling MySQL (that I know of).
But you could create a simple API that updates your database and have Watcher call it via a webhook action.
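For example, the receiving end could be a tiny Flask endpoint that the watch's webhook action POSTs to. A sketch assuming Flask and PyMySQL, where the route, payload shape, and metadata table are all invented:

```python
# Sketch: minimal endpoint for a Watcher webhook action to POST to; it applies
# the change carried in the watch payload to MySQL. All names are placeholders.
import pymysql
from flask import Flask, request

app = Flask(__name__)

def db():
    return pymysql.connect(host="localhost", user="app", password="secret",
                           database="appdb", autocommit=True)

@app.route("/es-meta-updated", methods=["POST"])
def es_meta_updated():
    payload = request.get_json(force=True)  # shape is whatever the watch sends
    conn = db()
    try:
        with conn.cursor() as cur:
            cur.execute("REPLACE INTO metadata (id, value) VALUES (%s, %s)",
                        (payload["id"], payload["value"]))
    finally:
        conn.close()
    return "", 204

if __name__ == "__main__":
    app.run(port=5000)
```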