I am working on an application built with Spring/Hibernate and PostgreSQL as the database. I recently added Elasticsearch to the application. I am new to it, so I have some doubts about how to use ES in my application:
Is it good practice to save/update a document in Elasticsearch whenever I save/update the row in PostgreSQL? If not, what is the best way to do it?
Should I serve all my read queries (get user profile, educational profile, contact profile, etc.) from ES, or is it better to fetch them from PostgreSQL and use ES only for search?
What is the best way to keep PostgreSQL and ES in sync?
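To make the first question concrete, here is roughly what I mean by saving/updating in both places. This is only a sketch: the repository interfaces and the UserProfile entity / UserProfileDocument document classes are made-up names and are not shown in full.

```java
// Sketch only: write to PostgreSQL (via Spring Data JPA) and mirror the change
// into Elasticsearch (via spring-data-elasticsearch) in the same service call.
// UserProfile (JPA entity) and UserProfileDocument (ES document) are assumed classes.

import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

interface UserProfileRepository extends JpaRepository<UserProfile, Long> {}
interface UserProfileSearchRepository extends ElasticsearchRepository<UserProfileDocument, Long> {}

@Service
class UserProfileService {

    private final UserProfileRepository jpa;
    private final UserProfileSearchRepository search;

    UserProfileService(UserProfileRepository jpa, UserProfileSearchRepository search) {
        this.jpa = jpa;
        this.search = search;
    }

    @Transactional
    public UserProfile save(UserProfile profile) {
        UserProfile saved = jpa.save(profile);         // PostgreSQL stays the system of record
        search.save(UserProfileDocument.from(saved));  // then index the same data for search
        return saved;
    }
}
```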
Thanks in advance for your help.
Related
I have created an Elasticsearch index for searching my MySQL database using a Python Flask API. If the database is updated with new tables and records, will the index be updated by itself, or do I have to recreate the index or version it manually?
I have to design a GET API on Elasticsearch such that, if the data is not yet present in Elastic, it queries the backing DB to see if the record is present there.
The main reason we need to think about this is that there is a delay of a few seconds between the DB and Elastic, and the GET call will be invoked far too many times to have it hit the DB directly.
Any ideas? I'm using spring-data-elasticsearch right now.
If you are using Elastic for search, then keep the reads on Elastic only. I would suggest that if the data is not present, you reindex it (delete the specific document from Elastic if present, then save it again from the database via JPA). Your Elastic data needs to stay consistent with JPA. If you still want to go ahead with the fallback, add a null check and, when it triggers, look the record up through the JPA repository.
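Something along these lines, as a rough sketch (the repository interfaces and the Product/ProductDocument classes are placeholders, assuming spring-data-elasticsearch plus Spring Data JPA):

```java
// Sketch: try Elasticsearch first; if the document is missing (Elastic is lagging behind
// the DB), read the record from the database via JPA and reindex it on the way out.
// Product (JPA entity) and ProductDocument (ES document) are assumed classes.

import java.util.Optional;
import org.springframework.stereotype.Service;

@Service
class ProductLookupService {

    private final ProductSearchRepository search; // extends ElasticsearchRepository<ProductDocument, Long>
    private final ProductRepository jpa;          // extends JpaRepository<Product, Long>

    ProductLookupService(ProductSearchRepository search, ProductRepository jpa) {
        this.search = search;
        this.jpa = jpa;
    }

    public Optional<ProductDocument> getById(Long id) {
        Optional<ProductDocument> fromEs = search.findById(id);
        if (fromEs.isPresent()) {
            return fromEs;                        // normal path: Elastic already has it
        }
        // Fallback: Elastic has not caught up yet, so read from the DB ...
        return jpa.findById(id).map(entity -> {
            ProductDocument doc = ProductDocument.from(entity);
            search.save(doc);                     // ... and reindex so Elastic is consistent again
            return doc;
        });
    }
}
```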
Are they ALL indexed by default? Should I index them? If so, how? By talking to MongoDB directly?
Thanks!
When you use parse-server you must maintain the DB indexes on your own.
Parse.com did it for you, but since Parse.com shut down their service and open-sourced it (parse-server), you need to maintain the indexes yourself.
You have multiple options to create indexes in MongoDB:
via MongoDB shell
via MongoDB client tools (e.g. Compass, Robomongo etc.)
via code
By far the easiest option is 2.
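If you do want option 3, here is a rough example with the MongoDB Java driver; the connection string, database, collection and field names are only placeholders, and the same indexes can be created from the mongo shell or any other driver:

```java
// Illustrative only: creating indexes from code with the MongoDB Java driver.
// Parse-server stores users in the _User collection; adjust names to your own schema.

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.IndexOptions;
import com.mongodb.client.model.Indexes;
import org.bson.Document;

public class CreateIndexes {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> users =
                    client.getDatabase("parse").getCollection("_User");

            // Single-field unique index on the field your queries filter on.
            users.createIndex(Indexes.ascending("email"), new IndexOptions().unique(true));

            // Compound index for queries that filter/sort on both fields.
            users.createIndex(Indexes.compoundIndex(
                    Indexes.ascending("lastName"), Indexes.descending("createdAt")));
        }
    }
}
```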
As part of my bachelor's thesis I'm building a microservice using Postgres, which would partly replace an existing part of an application that uses MongoDB. To change as little as possible on the client side for now, I was wondering whether there is an easy way to translate a Mongoid::Criteria into an SQL query (assuming all fields are named the same, of course), without having to write a complete parser myself. Are there any gems out there that might support this?
Any input is highly appreciated.
Maybe you're looking for this: https://github.com/stripe/mosql.
I haven't dug into it, but it seems to do what you need:
"MoSQL imports the contents of your MongoDB database cluster into a PostgreSQL instance, using an oplog tailer to keep the SQL mirror live up-to-date. This lets you run production services against a MongoDB database, and then run offline analytics or reporting using the full power of SQL."
I'm trying to move snapshots of data from our MongoDB into our Oracle BI data store.
From the BI team I've been asked to make the data available for ODI, but I haven't been able to find an example of that being done.
Is it possible and what do I need to implement it?
If there is a more generic way of getting MongoDB data into Oracle then I'm happy to propose that as well.
Versions
MongoDB: 2.0.1
ODI: 11.1.1.5
Oracle: 11.2g
Edit:
This is something that will be queried once a day, maybe twice, but at this stage the BI report granularity is daily.
In ODI, under the Topology tab and Physical Architecture sub-tab, you can see all technologies that are supported out of the box. MongoDB is not one of them. There are also no Knowledge Modules available for importing/exporting from/to MongoDB.
ODI supports implementing your own technologies and your own Knowledge Modules.
This manual will get you started with developing your own Knowledge Module, and in one of the other manuals I'm sure you can find an explanation of how to implement your own technologies. (Ctrl-F for "Data Integrator")
If you're lucky, you might find someone else who has already implemented it. Your best places to look would be The Oracle Technology Network Forum, or a forum related to MongoDB.
Instead of creating a direct link, you could also take an easier workaround: export the data from MongoDB to a format that ODI supports and that MongoDB can extract to (CSV or XML, maybe), then load the data through ODI into the Oracle database. I think that will be the best option, unless you have to do this frequently.
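For the export step, something like this would produce a flat CSV file that ODI's file technology can then load; the database, collection and field names are only examples, and mongoexport's CSV output would do much the same job from the command line:

```java
// Sketch: dump a MongoDB collection to a CSV file that ODI can pick up as a file datastore.
// No quoting/escaping is done here, so it only suits simple field values.

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

public class MongoToCsv {
    public static void main(String[] args) throws IOException {
        List<String> fields = Arrays.asList("orderId", "customer", "total", "createdAt");

        try (MongoClient client = MongoClients.create("mongodb://localhost:27017");
             PrintWriter out = new PrintWriter(Files.newBufferedWriter(Paths.get("orders.csv")))) {

            MongoCollection<Document> orders =
                    client.getDatabase("shop").getCollection("orders");

            out.println(String.join(",", fields));          // header row for the ODI file model
            for (Document doc : orders.find()) {
                StringBuilder row = new StringBuilder();
                for (String field : fields) {
                    if (row.length() > 0) row.append(',');
                    Object value = doc.get(field);
                    row.append(value == null ? "" : value.toString());
                }
                out.println(row);
            }
        }
    }
}
```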
Look at the blog post below for an option:
https://blogs.oracle.com/dataintegration/entry/odi_mongodb_and_a_java
Cheers
David