Different Microservices Using Different Relational Databases [closed] - microservices

Closed as opinion-based 9 months ago; it is not accepting answers.
My goal is to build a social media application, and I'm planning to use a microservice architecture. I want to create a user-service and a post-service. The post-service will take a user id and return that user's posts.
On the database side, I was thinking of using a relational database with a "post" table that has user_id as a foreign key. However, I've heard that each microservice should have its own database, so the "user" and "post" tables would end up in separate databases.
In that case, why would I choose a relational database over a non-relational one? Referencing a primary key in a different database with a foreign key in the current database makes no sense to me.

In most cases each service will need multiple tables (e.g. your user-service would have several other user-related tables, not just one), and a relational database helps you maintain the relations among the tables that belong to a single service. The only foreign key you give up is the cross-service one; within a service you still get referential integrity, transactions and joins.
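As a concrete illustration, here is a minimal sketch of what the post-service's own schema could look like. The table and column names are hypothetical, and SQLite is used only to keep the example self-contained: the user id from the user-service is stored as a plain column (no cross-database foreign key is possible), while tables owned by the post-service still reference each other with real foreign keys.

    import sqlite3

    # Hypothetical private database of the post-service.
    conn = sqlite3.connect("post_service.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS post (
        id      INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL,   -- owned by the user-service; no FK possible here
        body    TEXT NOT NULL
    );

    CREATE TABLE IF NOT EXISTS post_comment (
        id      INTEGER PRIMARY KEY,
        post_id INTEGER NOT NULL REFERENCES post(id),  -- intra-service FK still works
        user_id INTEGER NOT NULL,
        body    TEXT NOT NULL
    );
    """)
    conn.commit()

Fetching a user's posts then only needs the user id the caller already has; any further user details come from the user-service's API rather than from a join across databases.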

Related

Kafka source - sink connectors - multiple tables with single topic [closed]

Closed 2 years ago because it needs to be more focused; it is not accepting answers.
I am populating messages from different tables into a single topic, but I need to separate the messages back out and sink them into a destination table set with the same names.
Using Debezium/Connect, I populate the topic from the source table with the JDBC source connector, and with the JDBC sink connector I populate the destination table (topic to destination table). In this scenario I need one topic per table. Since there are hundreds of tables, I want a single topic to sync many tables.
Any idea how to achieve this?
You can use Kafka Connect transforms to modify the topic / table names in flight.
It seems you may want to use a RegexRouter to collect multiple topics together.
https://docs.confluent.io/current/connect/transforms/regexrouter.html
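For instance, a RegexRouter transform on the source connector can rewrite every per-table topic name to one shared topic. The sketch below is only an illustration: the connector name, connection URL and topic names are made up, the Connect worker is assumed to be on localhost:8083, and only the transform settings follow the documented RegexRouter configuration.

    import json
    import urllib.request

    # Hypothetical JDBC source connector: every topic it would normally
    # create under the prefix "db-server." is rewritten by the RegexRouter
    # transform to the single topic "all-tables".
    connector = {
        "name": "jdbc-source-all-tables",
        "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "connection.url": "jdbc:postgresql://db:5432/app",
            "mode": "incrementing",
            "incrementing.column.name": "id",
            "topic.prefix": "db-server.",
            "transforms": "route",
            "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
            "transforms.route.regex": "db-server\\.(.*)",
            "transforms.route.replacement": "all-tables",
        },
    }

    # Register the connector with the Connect REST API.
    request = urllib.request.Request(
        "http://localhost:8083/connectors",
        data=json.dumps(connector).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

Note that the reverse direction (fanning one topic back out to many destination tables) still needs some way to recover the original table name, for example from a field carried in each record.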

Materialized view in same DB [closed]

Closed as opinion-based 8 years ago; it is not accepting answers.
Is there any sense in creating materialized views from one schema to another in the same database? Would a public synonym or a simple view suffice? Please suggest.
The question is about the same as asking whether it makes sense to have a copy of the same table in another schema of the same database. If anybody who can access the materialized view / table already has access to the source schema, why bother? And even if they don't, I would ask why they have no access to that schema.
As for your question whether a public synonym or a simple view would suffice: yes, I would say so.
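A minimal sketch of that alternative, with made-up schema, table and connection details, using the python-oracledb driver only to keep the examples in one language: grant read access on the source table and expose it through a view or synonym instead of maintaining a materialized copy.

    import oracledb

    # Hypothetical connection as the owning schema / a privileged user.
    conn = oracledb.connect(user="sales", password="secret", dsn="dbhost/orclpdb1")
    cur = conn.cursor()

    # Let the reporting schema read the source table directly ...
    cur.execute("GRANT SELECT ON sales.orders TO reporting")

    # ... and expose it under a local name: a plain view (or a synonym)
    # instead of a materialized copy of the data.
    cur.execute("CREATE OR REPLACE VIEW reporting.orders_v AS SELECT * FROM sales.orders")
    # Synonym-based alternative:
    # cur.execute("CREATE PUBLIC SYNONYM orders FOR sales.orders")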

Lotus Notes to Oracle database migration [closed]

Closed 8 years ago because it needs to be more focused; it is not accepting answers.
I have an NSF file of a Lotus database. The objective is to retire the legacy Lotus Notes database and migrate it to a relational Oracle database. Does anyone with expertise in this area have a step-by-step process for carrying out a migration from Lotus Notes to Oracle?
Ten years ago I integrated Domino and Oracle, and it was pretty impressive.
I googled "migrate from Domino to Oracle" and didn't find much more than LEI (or DECS), which allows connecting the data between the two systems.
Some steps:
1. Analyze the NSF: its size (MB or GB?), the number of forms/views, and the logic in code. (My five cents: find someone who really uses the database and can explain what they actually use in it!)
2. Forms/views will become tables and queries in Oracle.
3. Data migration: plain text, dates and so on will be straightforward, BUT attachments, rich text and inline images will be painful.
4. Logic: you will have to rewrite all the formula/LotusScript/XPages code in J2EE or something else.
Read also http://searchdomino.techtarget.com/answer/Migrating-from-Domino-to-Java-and-Oracle

using materialised views to fix bugs and reduce code [closed]

Closed as opinion-based 8 years ago; it is not accepting answers.
The application I'm working on has a legacy problem: two tables, ADULT and CHILD, were created in an Oracle 11g DB.
This has led to a number of related tables that have a field for both ADULT and CHILD, with no FK applied.
Bugs have arisen where poor development mapped relationships to the wrong field.
Our technical architect plans to merge the ADULT and CHILD tables into a new ADULT_CHILD table and create materialised views in place of the original tables. The plan is also to create a new id value and replace the id values in all associated tables, so that even if the PL/SQL/APEX code maps to the wrong field the data mapping will still be correct.
The reasoning behind this solution is that it does not require changing any other code.
My opinion is that this is a fudge, but my background is more Java/.NET OO.
What arguments can I use to convince the architect that this is wrong and not a real solution? I'm concerned we are creating a more complex solution and that performance will be an issue.
Thanks for any pointers
While it may be a workable solution, it might also create new issues. If you really need an MV that is up to date at all times, you need ON COMMIT refresh, and that in turn tends to make all updates sequential: every process writing to it waits in line for the one updating the table to commit. Note: at the level of the table, not the row.
So it is prudent to test the approach with realistic loads. Why does it have to become a single table? Could the tables not stay separate, with an FK added? If you need more control over the updates, rename them and put views with INSTEAD OF triggers in their place, as sketched below.
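A minimal sketch of that last idea, with entirely hypothetical table and column names (again via python-oracledb, just to keep the examples in one language): the physical table is renamed, a view keeps the old name so existing PL/SQL/APEX code is untouched, and an INSTEAD OF trigger routes writes made through the view to the real table.

    import oracledb

    conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
    cur = conn.cursor()

    # Keep ADULT as its own physical table, just renamed ...
    cur.execute("ALTER TABLE adult RENAME TO adult_tbl")

    # ... and put a view with the old name in its place, so code that
    # references ADULT keeps working unchanged.
    cur.execute("CREATE OR REPLACE VIEW adult AS SELECT id, name FROM adult_tbl")

    # An INSTEAD OF trigger gives full control over writes through the view.
    cur.execute("""
    CREATE OR REPLACE TRIGGER adult_ioi
    INSTEAD OF INSERT ON adult
    FOR EACH ROW
    BEGIN
        INSERT INTO adult_tbl (id, name) VALUES (:NEW.id, :NEW.name);
    END;
    """)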

SQLite-like alternative for MongoDB? [closed]

Closed 3 years ago because it asks for recommendations for tools or libraries, which is off-topic; it is not accepting answers.
I'm looking for a document-oriented db with a Ruby API that has SQLite-like properties:
self-contained,
serverless,
zero-configuration.
Are there light alternatives to MongoDB or CouchDB?
Is RDDB a possibility?
If not, what are the best paths to walk then?
I know the question was asked 5 years ago, but just for completeness' sake, embedded MongoDB has happened since:
https://github.com/hamiltop/MongoLiteDB
It's not ready yet, but an embeddable version of CouchDB is on the long-term roadmap.
Replication is intended to enable offline applications with CouchDB. If you end up with very specific needs, you could replicate data from CouchDB to a local data structure, store it locally, update it, and push the data back via replication, but it would take some code.
If you were using Perl, I'd recommend DBM::Deep, which stores arbitrary data structures on disk, including transactions with commit/rollback, and it's a single pure-Perl module to install (no C). Doesn't get much lighter than that.
I almost feel you could do some sort of hack to achieve this.
Have a table keyed on SQLite's row ids, with a field for the collection name and a text blob holding the document as JSON.
Have another table for indexing the fields in a collection (collection name, field name, field value, document row id).
You could write a wrapper class to handle things like updates and lookups; a sketch of the idea follows. It would be interesting.
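A minimal sketch of that hack, assuming nothing beyond the standard library. The question asks about Ruby, but the example below uses Python's built-in sqlite3 module purely to illustrate the structure; the class, table and column names are all made up.

    import json
    import sqlite3

    # Sketch of the hack described above: documents are JSON blobs keyed by
    # SQLite's rowid, and a side table indexes individual top-level fields
    # so lookups don't have to scan and parse every document.
    class TinyDocStore:
        def __init__(self, path=":memory:"):
            self.db = sqlite3.connect(path)
            self.db.executescript("""
                CREATE TABLE IF NOT EXISTS documents (
                    collection TEXT NOT NULL,
                    body       TEXT NOT NULL            -- JSON blob
                );
                CREATE TABLE IF NOT EXISTS field_index (
                    collection  TEXT NOT NULL,
                    field_name  TEXT NOT NULL,
                    field_value TEXT,
                    doc_rowid   INTEGER NOT NULL        -- rowid in documents
                );
            """)

        def insert(self, collection, doc):
            cur = self.db.execute(
                "INSERT INTO documents (collection, body) VALUES (?, ?)",
                (collection, json.dumps(doc)),
            )
            rowid = cur.lastrowid
            # Index every top-level field of the document.
            self.db.executemany(
                "INSERT INTO field_index (collection, field_name, field_value, doc_rowid)"
                " VALUES (?, ?, ?, ?)",
                [(collection, k, str(v), rowid) for k, v in doc.items()],
            )
            self.db.commit()
            return rowid

        def find(self, collection, field, value):
            rows = self.db.execute(
                "SELECT d.body FROM documents d"
                " JOIN field_index i ON i.doc_rowid = d.rowid"
                " WHERE i.collection = ? AND i.field_name = ? AND i.field_value = ?",
                (collection, field, str(value)),
            )
            return [json.loads(body) for (body,) in rows]

    # Example usage
    store = TinyDocStore()
    store.insert("users", {"name": "alice", "age": 30})
    print(store.find("users", "name", "alice"))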
