Is Elasticsearch Watcher able to write data to MySQL?

I have some metadata in Elasticsearch, and when the metadata is updated I need to sync the updates to MySQL. So I'm wondering: is there an ES Watcher/trigger that could do this automatically?

There isn't any direct Watcher action that writes to MySQL (that I know of), but you could create a simple API, called by Watcher through a webhook action, that updates your database.
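As a minimal sketch of that idea (the metadata index, the updated_at field, the one-minute schedule, and the sync-api.example.com endpoint are all hypothetical placeholders, not anything from your setup), a watch could poll for recently updated documents and POST them to your API, which in turn writes to MySQL:

{
  "trigger": {
    "schedule": { "interval": "1m" }
  },
  "input": {
    "search": {
      "request": {
        "indices": ["metadata"],
        "body": {
          "query": {
            "range": { "updated_at": { "gte": "now-1m" } }
          }
        }
      }
    }
  },
  "condition": {
    "compare": { "ctx.payload.hits.total": { "gt": 0 } }
  },
  "actions": {
    "sync_to_mysql": {
      "webhook": {
        "scheme": "https",
        "host": "sync-api.example.com",
        "port": 443,
        "method": "post",
        "path": "/sync",
        "headers": { "Content-Type": "application/json" },
        "body": "{{#toJson}}ctx.payload{{/toJson}}"
      }
    }
  }
}

Note that depending on your Elasticsearch version, ctx.payload.hits.total may be an object rather than a plain number, so the compare condition may need adjusting.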

Related

How to use JpaRepository methods with an existing database?

Can anyone help me understand this scenario, please? Let's say a client has been using PHP for his backend and now wants to migrate his project from PHP to Spring. He already has complete, fully defined data (mappings, primary keys, tables) and has exported it as SQL (database.sql). My question is: how can we work with and interact with this data in Spring Data?
So you have the database export file(s): create a new empty database, import those files into it, configure Spring Data to connect to the new DB, and stop using the PHP app if you can (if you keep using it, you will have to somehow synchronize the two databases, which is more complicated). On the Spring side, the main work is mapping entities onto the tables that already exist, as sketched below.
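As a hedged illustration (the customer table and its id_customer/email columns are made-up stand-ins for whatever database.sql actually defines), mapping an existing table in Spring Data JPA mostly means matching the entity annotations to the schema you imported:

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.springframework.data.jpa.repository.JpaRepository;

// Maps onto a table that already exists in the imported database.sql.
// Table and column names here are hypothetical; use the real schema's names.
@Entity
@Table(name = "customer")
public class Customer {

    @Id
    @Column(name = "id_customer")
    private Long id;

    @Column(name = "email")
    private String email;

    // getters and setters omitted for brevity
}

// Spring Data derives the query implementations from this interface.
interface CustomerRepository extends JpaRepository<Customer, Long> {
}

Setting spring.jpa.hibernate.ddl-auto=validate is a reasonable safeguard here: at startup, Hibernate checks the mappings against the imported schema instead of trying to alter it. (On newer Spring Boot versions the imports are jakarta.persistence rather than javax.persistence.)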

Send PrestaShop logs to Kibana

How can I send the logs of my application, which runs on PrestaShop, to Kibana? As you know, PrestaShop saves its logs in the database. Thanks.
You can use one of these approaches:

1. Using the official client (elasticsearch-php), create a single-file script that talks directly to your ES cluster and pulls the data with a direct query to the DB, e.g.:

SELECT * FROM ps_log WHERE date_add > xxxx

2. Create a daily CSV log of the table data with a PHP script and ingest this log into your Elasticsearch cluster using Logstash, along the lines sketched below.
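For the Logstash route, a pipeline along these lines could pick up the daily export (the file path, the sincedb location, and the column list are assumptions about what your PHP script produces; adjust them to your export format):

input {
  file {
    # Hypothetical location of the daily CSV export produced by the PHP script.
    path => "/var/log/prestashop/ps_log-*.csv"
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_ps_log"
  }
}
filter {
  csv {
    # Assumed export columns; match these to what the script writes out.
    columns => ["id_log", "severity", "message", "date_add"]
  }
  date {
    # Parse the ps_log timestamp so Kibana gets the real event time.
    match => ["date_add", "yyyy-MM-dd HH:mm:ss"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "prestashop-logs-%{+YYYY.MM.dd}"
  }
}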

Apache Kafka for an existing GET request with Oracle DB

I'm trying to learn about streaming services and am reading the Kafka docs:
https://kafka.apache.org/quickstart
https://kafka.apache.org/24/documentation/streams/quickstart
To take a simple example, I'm attempting to refactor a Spring web service GET request which accepts an ID parameter and returns a list of attributes associated with that ID. The DB backend is Oracle.
What is the approach for loading a single Oracle DB table so it can be served by Kafka? The above docs don't contain information on this. Do I need to replicate the Oracle DB to a NoSQL DB such as MongoDB? (Why do we require Apache Kafka with NoSQL databases?)
Kafka is an event streaming platform; it is not a database. Instead of thinking about "loading a single Oracle DB table which can be served by Kafka", you need to think in terms of which events you are looking for that will trigger processing.
Change Data Capture (CDC) products like Oracle GoldenGate (there are other products too) will detect changes to rows and send messages into Kafka each time a row changes.
Alternatively, you could configure a Kafka JDBC Source Connector to execute a query and pull data into Kafka, as sketched below.
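As a sketch (the connection URL, credentials, MY_TABLE, and the LAST_UPDATED/ID columns are all placeholders for your actual schema), a Confluent JDBC Source Connector could be registered with Kafka Connect like this:

{
  "name": "oracle-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
    "connection.user": "app_user",
    "connection.password": "********",
    "table.whitelist": "MY_TABLE",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "LAST_UPDATED",
    "incrementing.column.name": "ID",
    "topic.prefix": "oracle-",
    "poll.interval.ms": "5000"
  }
}

The timestamp+incrementing mode picks up both new and updated rows, which keeps the topic current without a full CDC product; note that unlike GoldenGate it polls the table rather than reading the redo log, so deletes are not captured.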

How to use JPA as a backup for Elastic using Spring Data?

I have to design a GET API on Elasticsearch such that, if the data is not yet present in Elastic, it should query the backing DB and check whether the record is present there.
The main reason to consider this is that there is a delay of a few seconds between the DB and Elastic, and the GET call will be invoked far too many times to have it hit the DB directly.
Any ideas? I'm using spring-data-elasticsearch right now.
If you are using Elastic for search, then keep it as Elastic only. I would suggest that if the data is not present, you reindex it (delete the specific data from Elastic if present, then save it again from the database via JPA); your Elastic index needs to stay consistent with the JPA data. Anyway, if you still want to go ahead with it, add a null check: when Elastic returns nothing, fall back to a lookup through the JPA repository, as sketched below.
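A minimal sketch of that fallback-and-reindex flow with spring-data-elasticsearch (Product, the repository names, and the String id are all hypothetical; the Product class itself is omitted, and in practice the ES document and the JPA entity are usually separate classes):

import java.util.Optional;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;

// Hypothetical repositories over the same Product data.
interface ProductSearchRepository extends ElasticsearchRepository<Product, String> {}
interface ProductJpaRepository extends JpaRepository<Product, String> {}

@Service
public class ProductLookupService {

    private final ProductSearchRepository searchRepository;
    private final ProductJpaRepository jpaRepository;

    public ProductLookupService(ProductSearchRepository searchRepository,
                                ProductJpaRepository jpaRepository) {
        this.searchRepository = searchRepository;
        this.jpaRepository = jpaRepository;
    }

    public Optional<Product> findById(String id) {
        // Serve from Elastic when the document has already been indexed.
        Optional<Product> fromEs = searchRepository.findById(id);
        if (fromEs.isPresent()) {
            return fromEs;
        }
        // Fall back to the DB (covers the indexing delay), and reindex the
        // hit so Elastic catches up and stays consistent.
        Optional<Product> fromDb = jpaRepository.findById(id);
        fromDb.ifPresent(searchRepository::save);
        return fromDb;
    }
}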

How to restore Parse data after migrating to my own Mongo?

I was migrating the Parse-hosted Mongo to my own Mongo, but I have lost both my Mongo and the Parse data.
Is there a way to restore the Parse data?
If you clicked Finalize on parse.com when you migrated the DB, I don't think you can go back.
