We have a scenario where a third-party system needs to update some data in our database and also needs to read from it.
We are thinking of using a pub/sub model where the third-party system writes data to a topic and our application then reads that data and inserts it into the database.
Multiple messages will be written to the topic, and we need to read them in real time.
What would be the best design for this pub/sub setup? It will be deployed on GCP, and we are thinking of using GCP Pub/Sub. We are using Spring Boot. Can we have an event-driven design?
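Roughly, this is the kind of consumer we have in mind on our side. It is just a sketch, assuming the Spring Cloud GCP Pub/Sub starter (spring-cloud-gcp-starter-pubsub) and Spring JDBC; the subscription name and table name are placeholders.

```java
import com.google.cloud.spring.pubsub.core.PubSubTemplate;
import jakarta.annotation.PostConstruct;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class ThirdPartyEventConsumer {

    private final PubSubTemplate pubSubTemplate;
    private final JdbcTemplate jdbcTemplate;

    public ThirdPartyEventConsumer(PubSubTemplate pubSubTemplate, JdbcTemplate jdbcTemplate) {
        this.pubSubTemplate = pubSubTemplate;
        this.jdbcTemplate = jdbcTemplate;
    }

    @PostConstruct
    public void subscribe() {
        // Streaming pull: the callback fires as messages arrive on the subscription,
        // so inserts happen in near real time rather than on a polling schedule.
        pubSubTemplate.subscribe("third-party-events-sub", message -> {
            String payload = message.getPubsubMessage().getData().toStringUtf8();
            jdbcTemplate.update("INSERT INTO incoming_events (payload) VALUES (?)", payload);
            message.ack(); // acknowledge only after the row is persisted
        });
    }
}
```

The third-party system would publish to the topic via the Pub/Sub client in whatever language it uses, and the subscription above decouples its write rate from our database writes.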
Thanks
This is for a project on Bot Framework Composer (not the SDK), so I'm using the built-in telemetry export settings.
I am looking for the best way to store event logs from bot conversations for analysis. From what I've researched, the recommended method is to go through Application Insights, which I activated and tested. All the data I require seems to be captured in the customEvents table.
The issue is that I need to be able to manipulate the data for analysis, but Application Insights is read-only (apart from purging via the API). I need to be able to add tables, edit text, etc. I have a lot of experience with PostgreSQL, so that's my first choice for bot log storage.
So my question is: what is an efficient way to get the customEvents table from Application Insights into a Postgres database? From what I can see, Application Insights only exports to Azure Storage, which does not have a database option. And if I understand some of the suggested pipelines, they copy the data to storage and then copy it to a database. Isn't that a lot of storage cost, since the same data will be in Application Insights, storage blobs AND Postgres?
What is the best pipeline? The goal is a non-redundant pipeline that transfers the event data in 'customEvents' to a Postgres table with the same columns.
(If there were a way to redirect the data that goes to customEvents in Application Insights directly to a Postgres table, that would be perfect too.)
There is no way to redirect data from Application Insights directly into a Postgres table.
The first solution is continuous export to Azure Storage, as you know. Blob storage does not cost very much, and you can clear out old data periodically to reduce the cost.
Another way is to use the Application Insights query API. To do that, you write your own logic to query the custom events from Application Insights and then insert them into your database yourself.
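A minimal sketch of that second approach, assuming the Application Insights REST query API (api.applicationinsights.io) and plain JDBC against Postgres. The app id, API key, connection string, and target table/columns are placeholders you would adapt.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CustomEventsExporter {

    public static void main(String[] args) throws Exception {
        String appId = System.getenv("APPINSIGHTS_APP_ID");
        String apiKey = System.getenv("APPINSIGHTS_API_KEY");
        String kusto = "customEvents | where timestamp > ago(1d) | project timestamp, name, customDimensions";

        // Run the Kusto query against the Application Insights query API.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.applicationinsights.io/v1/apps/" + appId
                        + "/query?query=" + URLEncoder.encode(kusto, StandardCharsets.UTF_8)))
                .header("x-api-key", apiKey)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The result comes back as tables -> columns/rows; take the first (primary) table.
        JsonNode rows = new ObjectMapper().readTree(response.body())
                .path("tables").get(0).path("rows");

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/botlogs", "bot", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO custom_events (event_time, name, custom_dimensions) "
                             + "VALUES (?::timestamptz, ?, ?::jsonb)")) {
            for (JsonNode row : rows) {
                ps.setString(1, row.get(0).asText());
                ps.setString(2, row.get(1).asText());
                JsonNode dims = row.get(2);
                ps.setString(3, dims.isTextual() ? dims.asText() : dims.toString());
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```

Run on a schedule (cron, Azure Function timer, etc.), this avoids the intermediate blob copy entirely: the data only lives in Application Insights (for its retention period) and in your Postgres table.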
Are there any concerns with using Snowflake as the data repository for a web API from an enterprise architecture perspective?
I think the question to ask is how you are going to use the data. It is not clear what you mean by a web API data repository. If you are talking about the API interaction data, then Snowflake is not the right choice; you should look for a transactional data store for such use cases. However, if you want to derive insights and analytics from that data, you can ingest the transactional data into Snowflake and build your analytics layer on top of it. But then the question is why you would want to do that, since most API products already have an analytics engine built into the product.
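For illustration, a rough sketch of that split: a transactional store (Postgres here) behind the API, and a periodic batch that copies the same records into Snowflake over JDBC for analytics. Connection details, table and column names are placeholders, and at real volume you would stage files and use COPY INTO or Snowpipe rather than row-by-row inserts.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class ApiEventsToSnowflake {

    public static void main(String[] args) throws Exception {
        try (Connection oltp = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/api_db", "api", "secret");
             Connection snowflake = DriverManager.getConnection(
                     "jdbc:snowflake://myaccount.snowflakecomputing.com/?db=ANALYTICS&schema=PUBLIC&warehouse=LOAD_WH",
                     "loader", "secret")) {

            try (Statement read = oltp.createStatement();
                 // Pull yesterday's API interaction records from the transactional store.
                 ResultSet rs = read.executeQuery(
                         "SELECT id, endpoint, status_code, created_at FROM api_requests "
                                 + "WHERE created_at > now() - interval '1 day'");
                 PreparedStatement write = snowflake.prepareStatement(
                         "INSERT INTO API_REQUESTS (ID, ENDPOINT, STATUS_CODE, CREATED_AT) VALUES (?, ?, ?, ?)")) {

                while (rs.next()) {
                    write.setLong(1, rs.getLong("id"));
                    write.setString(2, rs.getString("endpoint"));
                    write.setInt(3, rs.getInt("status_code"));
                    write.setTimestamp(4, rs.getTimestamp("created_at"));
                    write.addBatch();
                }
                write.executeBatch();
            }
        }
    }
}
```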
I am preparing a microservice design for an application. The microservices communicate with the UI, and that part is very straightforward and achievable. The other part is that one of the microservices communicates with a third-party system using SOAP, which is synchronous communication. I want to avoid that synchronous call and instead build a component that is responsible for fetching the data from the third-party system and storing it in a local database.
So far I have considered making this a separate module that the microservice talks to internally (a rough sketch of what I mean is below the questions).
Now the problems are:
Is there any possible solution where the application achieves asynchronous communication, even though these are real-time requests?
Is it possible to store millions of transaction records in our database and then communicate via the database?
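This is the kind of component I have in mind. It is only a rough sketch: the SOAP client wrapper, record shape, and table names are hypothetical, it assumes Spring scheduling (@EnableScheduling on a configuration class) plus JdbcTemplate, and the upsert syntax shown is Postgres-specific.

```java
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ThirdPartySyncJob {

    /** Hypothetical wrapper around the generated SOAP port. */
    public interface ThirdPartySoapClient {
        List<ThirdPartyRecord> fetchLatest();
    }

    /** Hypothetical shape of one record returned by the third party. */
    public record ThirdPartyRecord(String externalId, String payload) {}

    private final ThirdPartySoapClient soapClient;
    private final JdbcTemplate jdbcTemplate;

    public ThirdPartySyncJob(ThirdPartySoapClient soapClient, JdbcTemplate jdbcTemplate) {
        this.soapClient = soapClient;
        this.jdbcTemplate = jdbcTemplate;
    }

    // Pulls third-party data in the background instead of calling SOAP
    // synchronously inside the request path; the serving microservice then
    // reads only from the local third_party_data table.
    @Scheduled(fixedDelayString = "${sync.delay-ms:30000}")
    public void refreshLocalCopy() {
        for (ThirdPartyRecord record : soapClient.fetchLatest()) {
            jdbcTemplate.update(
                    "INSERT INTO third_party_data (external_id, payload, updated_at) "
                            + "VALUES (?, ?, now()) "
                            + "ON CONFLICT (external_id) DO UPDATE SET payload = EXCLUDED.payload, updated_at = now()",
                    record.externalId(), record.payload());
        }
    }
}
```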
I'm managing a very large enterprise application in which I've implemented a microservice architecture. Standalone microservices have been created based on business entities and operations.
For example,
User Operations Service
Product Operations Service
Finance Operations Service
Please note that each service is implemented using an n-tier architecture with WCF, i.e. it has separate tiers (each independently deployable to a separate server) for business logic and data access.
There is a centralized database that is accessed by all the microservices. A couple of common entities, like 'user', are accessed by all the services, so we have redundant database calls in multiple services. More effort is required because the database is accessed from many places (e.g. a column rename requires deployment of all the apps).
To reduce and optimize code, I'm planning to create a separate microservice and move all the database operations into it, i.e. services would call a "Database Operations Service" for any database operation like add/update/select.
I want to know if there are any hidden challenges that I'm not aware of. Should I go ahead with this idea? What could I consider as improvements to this concept?
I'm planning to create a separate microservice and move all the database operations into it
That's how you lose all the benefits of a microservice architecture: if that one service is down, the whole application is down, unless you have replication across several nodes.
If your app stops working when one service goes down (not necessarily the service that connects to the database), then it's still a bad architecture and you are not getting the benefits of microservices.
The correct form of communication would be for each service to have its own database. Or, at the very least, every service that needs, for example, the User entity would not fetch it from the DB but from the appropriate service, and that service could fetch it from the common DB at the beginning.
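For example, a rough sketch in Java of what the calling side looks like. The user-service URL and the DTO shape are placeholders, and in your WCF setup this would be a service reference rather than an HTTP client; the point is only that the Users table is touched by one service and everyone else goes through its API.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class UserServiceClient {

    private final HttpClient http = HttpClient.newHttpClient();
    private final ObjectMapper mapper = new ObjectMapper();

    // Instead of running "SELECT * FROM Users WHERE Id = ?" inside every service,
    // ask the User Operations Service, which owns that data.
    public UserDto getUser(long id) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://user-operations-service/api/users/" + id))
                .GET()
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        return mapper.readValue(response.body(), UserDto.class);
    }

    public record UserDto(long id, String name, String email) {}
}
```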
The next step (improvement) in moving towards a microservice architecture would be to create a separate database for each service. By "separate" I mean that a temporary fault in one service or in one database still leaves the rest of the app alive and functioning.
Generally, there are no hidden challenges in your approach. It just does not give you any benefits; it is an intermediate form between a monolithic application and a microservice-based one.
Is it possible to load/import data from a third-party API into Parse.com daily?
To be more specific, I want to load data from https://api.sample.com/event.json?key=1234 into Parse.com daily, then sort that data on Parse, and then have iOS users pull the data from Parse.
Of course. Parse provides a REST API for all collections (DB tables), or you could write your own custom code to load the data if you want to add a business rules layer.
You just need to watch the volume of JSON data that you are posting.
I should also add that since you want to do this daily, you will need to run a daily batch job. Parse has this ability, or you can trigger the process yourself; it depends on how automated this needs to be.
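A rough sketch of such a job in Java, assuming the Parse REST API for object creation and an external scheduler (cron or similar) to run it once a day. The class name "Event" and the app keys are placeholders, and it assumes the feed returns a JSON object you can store as-is; a real job would split an array of events and use Parse's batch endpoint to cut request volume.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DailyEventImport {

    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();

        // 1. Pull the day's data from the third-party API.
        HttpResponse<String> source = http.send(
                HttpRequest.newBuilder()
                        .uri(URI.create("https://api.sample.com/event.json?key=1234"))
                        .GET()
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        // 2. Push it into a Parse class via the REST API.
        HttpResponse<String> created = http.send(
                HttpRequest.newBuilder()
                        .uri(URI.create("https://api.parse.com/1/classes/Event"))
                        .header("X-Parse-Application-Id", System.getenv("PARSE_APP_ID"))
                        .header("X-Parse-REST-API-Key", System.getenv("PARSE_REST_KEY"))
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(source.body()))
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        System.out.println("Parse responded with status " + created.statusCode());
    }
}
```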