I would like to know if it's possible to use Oracle.EntityFrameworkCore with NServiceBus 7.6.
I am using a hosted .NET 6 service running NSB 7.6.
Data access uses Oracle.EntityFrameworkCore.
From the documentation it looks like you have to use NHibernate or the "old" .NET 4.x.
There's a sample for using Entity Framework Core and, as Martin Anderson mentioned, you can use it together with SQL Persistence for Oracle and share the connection. As the documentation mentions:
To maintain consistency, the business data has to reuse the same connection context as NServiceBus persistence. With SQL persistence, this is achieved by using the same ADO.NET connection and transaction objects in both NServiceBus and Entity Framework.
We are rewriting a legacy app using microservices. Each microservice has its own DB. There are certain API calls that require calling another microservice and persisting data into both DBs. How do we implement distributed transaction management effectively in this case?
Since we have not completely migrated to the new microservices environment, we still write back data to the old monolith. For this, when a microservice endpoint is called, we call the monolith service from the microservice API to write back the same data. How do we deal with the same problem in this case as well?
Thanks in advance.
There are different distributed transaction frameworks, usually included and maintained as part of heavyweight application servers like JBoss and WebLogic.
The standard usually used by such services is Jakarta Transactions (JTA; formerly the Java Transaction API).
Tomcat and Spring don't support distributed transactions out of the box. You can add this functionality using a third-party framework like Atomikos (just googled it, I've never used it).
But remember, a microservice with JTA is not "micro" anymore :-)
Here is a small overview of the available technologies and possible workarounds:
https://www.baeldung.com/transactions-across-microservices
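If you do end up needing JTA outside a full application server, the usual pattern is to register a standalone transaction manager such as Atomikos and hand it to Spring's JtaTransactionManager. A minimal sketch, assuming the Atomikos dependency is on the classpath and that the Spring and Atomikos versions agree on the javax/jakarta transaction API (the configuration class and bean names are made up):

```java
import com.atomikos.icatch.jta.UserTransactionImp;
import com.atomikos.icatch.jta.UserTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.jta.JtaTransactionManager;

@Configuration
public class JtaConfig {

    // Atomikos acts as the standalone JTA transaction manager,
    // since Tomcat and plain Spring do not ship one out of the box.
    @Bean(initMethod = "init", destroyMethod = "close")
    public UserTransactionManager atomikosTransactionManager() {
        UserTransactionManager tm = new UserTransactionManager();
        tm.setForceShutdown(false);
        return tm;
    }

    // Spring's JtaTransactionManager delegates @Transactional boundaries
    // to Atomikos, so XA-capable resources (XADataSource, XAConnectionFactory)
    // can enlist in one distributed transaction.
    @Bean
    public JtaTransactionManager transactionManager() {
        return new JtaTransactionManager(new UserTransactionImp(), atomikosTransactionManager());
    }
}
```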
If you can afford to write to the legacy system later (i.e. allow some latency between updating the microservice and the legacy system), you can use the outbox pattern.
Essentially, that means you write to the microservice database in a single transaction, both to the tables you usually write to and to an additional "outbox" table of changes to apply, and then have a separate process that reads that table and updates the legacy system (a sketch follows below).
You can also achieve something similar with a change data capture (CDC) mechanism on the database used in the microservice(s).
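To make the outbox idea concrete, here is a minimal JDBC sketch (the table names, columns, and LegacyClient interface are invented for illustration): the business row and the outbox row are written in one local transaction, and a separate poller forwards unprocessed outbox entries to the legacy system, giving at-least-once delivery instead of a distributed transaction.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.sql.DataSource;

public class OrderService {

    private final DataSource dataSource;

    public OrderService(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    /** Writes the business row and the outbox row atomically in one local transaction. */
    public void placeOrder(long orderId, String payloadJson) throws Exception {
        try (Connection con = dataSource.getConnection()) {
            con.setAutoCommit(false);
            try (PreparedStatement order = con.prepareStatement(
                         "INSERT INTO orders (id, status) VALUES (?, 'NEW')");
                 PreparedStatement outbox = con.prepareStatement(
                         "INSERT INTO outbox (order_id, payload, processed) VALUES (?, ?, 0)")) {
                order.setLong(1, orderId);
                order.executeUpdate();

                outbox.setLong(1, orderId);
                outbox.setString(2, payloadJson);
                outbox.executeUpdate();

                con.commit();          // both rows or neither
            } catch (Exception e) {
                con.rollback();
                throw e;
            }
        }
    }

    /** Run periodically: forwards unprocessed outbox rows to the legacy system. */
    public void relayOutbox(LegacyClient legacyClient) throws Exception {
        try (Connection con = dataSource.getConnection();
             PreparedStatement select = con.prepareStatement(
                     "SELECT id, payload FROM outbox WHERE processed = 0");
             ResultSet rs = select.executeQuery()) {
            while (rs.next()) {
                legacyClient.writeBack(rs.getString("payload"));   // may be retried on failure
                try (PreparedStatement done = con.prepareStatement(
                        "UPDATE outbox SET processed = 1 WHERE id = ?")) {
                    done.setLong(1, rs.getLong("id"));
                    done.executeUpdate();
                }
            }
        }
    }

    /** Placeholder for whatever client talks to the monolith. */
    public interface LegacyClient {
        void writeBack(String payload);
    }
}
```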
Check out this answer on "Why is 2-phase commit not suitable for a microservices architecture?": https://stackoverflow.com/a/55258458/3794744
I am new to reactive programming and have started using WebFlux. Previously I worked with Spring Boot, where I used Hibernate as the ORM framework. My question is: what is the replacement for Hibernate in the reactive stack, and which framework should I use to connect to the database and implement the data access logic?
I am using MongoDB.
Thanks in advance.
You have to use either R2DBC or Hibernate Reactive.
If you are migrating an old service from Spring Boot to WebFlux, I recommend using Hibernate Reactive.
When you use R2DBC, you can't use Hibernate mappings and annotations.
According to the Hibernate Reactive documentation:
When using a stateless session, you should be aware of the following additional limitations:
persistence operations never cascade to associated instances,
changes to @ManyToMany associations and @ElementCollections cannot be made persistent, and
operations performed via a stateless session bypass callbacks.
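For illustration, usage of Hibernate Reactive with its Mutiny API looks roughly like this (the Book entity and the "bookstore" persistence unit are made-up names); the familiar annotated mappings stay, but every operation returns a non-blocking type:

```java
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Persistence;
import org.hibernate.reactive.mutiny.Mutiny;

@Entity
class Book {
    @Id
    Long id;
    String title;
}

public class HibernateReactiveExample {

    public static void main(String[] args) {
        // The "bookstore" persistence unit would list the Book entity and a
        // reactive-capable configuration in persistence.xml.
        Mutiny.SessionFactory factory = Persistence
                .createEntityManagerFactory("bookstore")
                .unwrap(Mutiny.SessionFactory.class);

        Book book = new Book();
        book.id = 1L;
        book.title = "Reactive Persistence";

        // withTransaction opens a session, begins a transaction and commits
        // (or rolls back) when the returned Uni completes -- nothing blocks.
        factory.withTransaction((session, tx) -> session.persist(book))
               .await().indefinitely();   // blocking here only to keep the demo self-contained

        factory.close();
    }
}
```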
Hibernate is based on JDBC. JDBC is blocking. Blocking APIs don't work well in reactive stacks. Also, Hibernate uses ThreadLocals under the hood, which makes it an even worse candidate for reactive applications.
For WebFlux, as the alternative to Hibernate, you should look into Spring Data R2DBC, which does the basic mapping of database results to Java objects, but keep in mind that it's not a full-fledged ORM like Hibernate.
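As a rough idea of the programming model, a Spring Data R2DBC repository returns Reactor types instead of entities and lists (the Customer entity and query methods here are invented for the example):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

// Mapped by convention to a "customer" table; there is no lazy loading,
// caching, or dirty checking here -- just row-to-object mapping.
class Customer {
    @Id
    private Long id;
    private String firstName;
    private String lastName;
    // getters and setters omitted for brevity
}

// Query derivation works much like in Spring Data JPA,
// but every result comes back wrapped in Mono or Flux.
interface CustomerRepository extends ReactiveCrudRepository<Customer, Long> {
    Flux<Customer> findByLastName(String lastName);
    Mono<Long> countByLastName(String lastName);
}
```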
You may also want to give Hibernate Reactive a try. With it you can use the power of Hibernate mappings in a reactive, non-blocking way. One thing that will not work, though (at least not yet), is declarative transaction management with @Transactional.
It depends on what database driver you are using.
If you are using a JDBC driver to talk to your database, then yes, you can use Hibernate. But it is important to note that the JDBC spec is blocking, so every call to the database will block and must be placed on its own scheduler (thread), and you will most likely not get the full performance benefits of a fully reactive application.
If you want a fully reactive application, you must use a database driver that supports the R2DBC protocol. Hibernate does not support R2DBC, so it cannot be used if you want a fully reactive application.
Hibernate is most commonly used with relational databases such as MySQL, Postgres, Oracle, etc., and not NoSQL databases such as MongoDB.
If you are using MongoDB, there is a fully reactive driver available (for example via Spring Data Reactive MongoDB), so there is no need for Hibernate.
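Since the question mentions MongoDB, a sketch of the fully reactive route there is a Spring Data Reactive MongoDB repository (the Product document, fields, and collection name are made up for the example):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;
import reactor.core.publisher.Flux;

// Stored in the "products" collection; no ORM involved, the reactive
// MongoDB driver handles the I/O without blocking.
@Document("products")
class Product {
    @Id
    private String id;
    private String name;
    private double price;
    // getters and setters omitted for brevity
}

interface ProductRepository extends ReactiveMongoRepository<Product, String> {
    Flux<Product> findByName(String name);
}
```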
Is it possible to use transactions when Neo4j is used as a standalone server? I am using functions from my Spring repositories, and probably each of them is executed as a separate transaction, but I would like to merge them into one. Is it possible to do this?
SDN doesn't support remote transactions (which only work with the transactional endpoint and Cypher) yet.
So the option you have to speed your operation up is to move the processing of the SDN entities into the server and expose a domain-level REST API to your clients (either with Jersey or SD-REST).
see: http://inserpio.wordpress.com/2014/04/30/extending-the-neo4j-server-with-spring-data-neo4j/
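As a very rough sketch of that server-side approach, in the style of the Neo4j 2.x unmanaged extensions the answer refers to (the resource path and properties are invented, and the embedded core API is used here instead of SDN entities for brevity), all the writes happen inside one transaction on the server:

```java
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Response;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;

// Deployed into the Neo4j server as an unmanaged extension, so the whole
// operation runs as one local transaction instead of many remote calls.
@Path("/orders")
public class OrderResource {

    private final GraphDatabaseService db;

    public OrderResource(@Context GraphDatabaseService db) {
        this.db = db;
    }

    @POST
    public Response createOrder() {
        // Everything in this block is a single server-side transaction,
        // rather than one transaction per repository method call.
        try (Transaction tx = db.beginTx()) {
            Node order = db.createNode();
            order.setProperty("status", "NEW");
            tx.success();
        }
        return Response.status(Response.Status.CREATED).build();
    }
}
```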
I came to know that AquaLogic Service Bus was renamed to Oracle Service Bus when Oracle acquired BEA. Is there a relationship between Oracle Fusion Middleware and Oracle Service Bus, or are they totally independent of each other? Thanks
Oracle Service Bus is a product within the Oracle Fusion Middleware suite.
Enterprise Integration aims to connect and combine people, processes, systems, and technologies to ensure that the right people and the right processes have the right information and the right resources at the right time.
Integration can be done using
Integration Framework
ESB
Integration Suite
The Enterprise Service Bus (ESB) has been most widely accepted as a tool to support application integration. However, it should be noted that on the integration-complexity scale, an ESB usually falls between a framework and a suite as an alternative for application integration.
OSB is an Enterprise Service Bus, and Oracle Fusion Middleware is an Integration Suite.
Anyone aware of any good examples/resources using WCF to interact with Oracle AQ (Advanced Queueing), possibly even a custom binding?
Thanks.
To my mind, WCF solves the same problems that Oracle AQ addresses. WCF is tightly coupled to data contracts and can be bound to multiple transport types. Oracle AQ is configurable by data contract (via a registered XSD) or can use a RAW XML queue. I am not aware of a custom binding to differing transport mechanisms for Oracle AQ, but it is very well documented and supported by ODAC. This wasn't the case for OO4O chatting to 9i, but that's another story...
I would consider expending effort on implementing Oracle AQ support within the client connection, as this follows a similar overall pattern to WCF. Putting a service between Oracle AQ and the client endpoint to translate bindings doesn't appear to make sense, other than introducing a level of complexity.