Spring Kafka JDBC Connector compatibility - spring-boot

Is the Kafka JDBC connector compatible with the Spring Kafka library?
I followed https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/ and still have some confusion.

Let's say you want to consume from a Kafka topic and write to a JDBC database. Some of your options are:
Use a plain Kafka consumer to consume from the topic and the JDBC API to write the consumed records to the database.
Use Spring Kafka to consume from the topic and Spring JdbcTemplate or Spring Data to write the records to the database.
Use Kafka Connect with the JDBC sink connector to read from the topic and write to a table.
So as you can see:
The Kafka JDBC connector is a specialised component that can only do one job.
The Kafka consumer is a very generic component that can do many jobs, and you will be writing a lot of code. In fact, it is the foundational API on which the other frameworks build and specialise.
Spring Kafka simplifies this and lets you deal with Kafka records as Java objects, but it doesn't tell you how to write those objects to your database.
So they are alternative solutions for fulfilling the task. That said, you may have a flow where different segments are controlled by different teams; for each segment, any of these approaches can be used, with a Kafka topic acting as the joining channel.
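To make the third option concrete, here is a minimal sketch of a JDBC sink connector configuration. The topic, table, and connection details are illustrative, not from the question; the property names are standard Confluent JDBC sink connector settings:

```json
{
  "name": "jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "secret",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id"
  }
}
```

With `pk.mode=record_key` and `insert.mode=upsert`, the connector writes each record keyed by the Kafka message key and updates the row if it already exists; no consumer code is needed at all.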

Spring Batch and Kafka

I am a junior programmer in banking. I want to build a microservice system that gets data from Kafka and processes it, then saves the result to a database and sends the final data to a client app. What technology can I use? I plan to use Spring Batch and Kafka. Can these technologies be used in my project, or is there a better alternative?
To process data from a Kafka topic, I recommend using the Kafka Streams API, in particular through Spring Kafka's Kafka Streams support.
Kafka Streams and Spring
And to store the data in a database, you should use a Kafka Sink Connector.
Kafka Connect
This approach is very common and easy if your company has a Kafka ecosystem.
In terms of alternatives, here you will find an interesting comparison:
https://scramjet.org/blog/welcome-to-the-family
3 in 1 serverless
Scramjet takes a slightly different approach: three platforms in one.
Both the free product https://hub.scramjet.org/ for installation on your own server and the cloud platform are available; the latter is currently also free in its beta version: https://scramjet.org/#join-beta

Kafka connect with EventStoreDB

I'm working on a small academic project: event sourcing with EventStoreDB and Apache Kafka as a broker. The idea is to get events from EventStoreDB and push them to Kafka for further distribution. I saw that Apache Kafka has connectors for different database systems, but I didn't find any connector for EventStoreDB.
How can I create (code myself, or use an existing one) a Kafka connector for EventStoreDB, so that these two systems can transfer events in both directions, from Kafka to EventStoreDB and from EventStoreDB to Kafka?
There is no official Kafka Connect connector between Kafka and EventStoreDB, and I haven't heard of any unofficial one so far. Still, there is a tool called Replicator that enables replicating data from EventStoreDB to Kafka (https://replicator.eventstore.org/docs/features/sinks/kafka/). It's open source, so you can either use it or study the implementation.
For the EventStoreDB-to-Kafka direction, I recommend using the subscriptions mechanism: catch-up subscriptions if you need an ordering guarantee, persistent subscriptions if ordering is not critical: https://developers.eventstore.com/clients/grpc/subscriptions.html. The crucial part here is to define how to map EventStoreDB streams to Kafka topics and partitions. Typically you'd expect at least an ordering guarantee at the stream level, so events from a single stream should land on the same partition.
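The stream-to-partition mapping can be as simple as a stable hash of the stream name. A minimal sketch (the class and scheme are illustrative, not part of any EventStoreDB or Kafka API; Kafka's own default partitioner uses murmur2 rather than `String.hashCode`):

```java
// Illustrative helper: maps an EventStoreDB stream name to a Kafka partition.
// Because the mapping depends only on the stream name, all events of one
// stream land on the same partition, preserving per-stream ordering.
public class StreamPartitioner {
    public static int partitionFor(String streamName, int numPartitions) {
        // Mask the sign bit so the hash is non-negative, then take the modulo.
        return (streamName.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // Repeated calls for the same stream always yield the same partition.
        System.out.println(partitionFor("order-123", 12));
        System.out.println(partitionFor("order-123", 12));
    }
}
```

In a real relay you would pass the computed partition (or simply the stream name as the message key) to the Kafka producer, letting it keep per-stream ordering for you.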
For the Kafka-to-EventStoreDB direction, you could either write your own pass-through service or try the HTTP sink connector (e.g. https://docs.confluent.io/kafka-connect-http/current/overview.html). EventStoreDB exposes an HTTP API (https://developers.eventstore.com/clients/http-api/v5/introduction/). Side note: this API (AtomPub-based) may be replaced with another HTTP API in the future, so the structure may change.
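A sketch of what such an HTTP sink configuration might look like, assuming Confluent's HTTP sink connector; the hostnames, topic, stream name, and header value are illustrative and would need to match your EventStoreDB deployment:

```json
{
  "name": "eventstore-http-sink",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "topics": "events",
    "http.api.url": "http://eventstore:2113/streams/from-kafka",
    "request.method": "POST",
    "headers": "Content-Type:application/vnd.eventstore.events+json",
    "confluent.topic.bootstrap.servers": "localhost:9092"
  }
}
```

Each Kafka record is POSTed to the target EventStoreDB stream; you would still need to shape the record body into the event envelope the EventStoreDB HTTP API expects.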
You can use Event Store Replicator, which has a Kafka sink.
Keep in mind that it doesn't do anything with regard to event schemas, so things like Kafka Streams and KSQL might not work properly.
The sink was created solely for pushing events to Kafka when Kafka is being used as a message broker.

Using Kafka without RabbitMQ

I am testing MassTransit with Kafka, without RabbitMQ.
I need an IBusControl; can I use bus.UsingInMemory()?
Is it safe to use for Kafka ITopicProducers?
Will features like scheduling and sagas work with Kafka or with the in-memory bus?
You can use an in-memory bus with Kafka, but realize that the in-memory bus is not durable. The only reason you would want to use it is that you want to consume/produce Kafka messages without using an accompanying broker.
Sagas work with Kafka just fine, since sagas are a type of consumer.
Scheduling is not supported with Kafka, since it isn't a broker. Using scheduling with in-memory is not recommended outside of unit tests.

More than one JDBC Connectors one single topic

I have a requirement to consume two identical tables in different databases (the locations also differ) and publish to the same topic.
The Kafka JDBC connector documentation doesn't explain how it manages the high watermark, so I thought I'd check what the best practice is in this scenario:
1. Can we keep 2 separate JDBC connectors publishing to separate topics?
2. Can we keep 2 separate JDBC connectors publishing to the same topic?
If we choose option 2, how does the Kafka JDBC connector manage the case where messages arrive in the tables concurrently at the same time? How does it manage different database time zones?
Can we keep 2 separate JDBC connectors publishing to separate topics?
Yes.
Can we keep 2 separate JDBC connectors publishing to the same topic?
Yes.
how Kafka JDBC connector manage in case message arrived into table concurrently at same time?
You'll get both messages on the target topic. Your consumer would need logic in it to deal with the duplication if there is any. You could use a Single Message Transform to set the key on the message written to the topic and use that as part of the de-duplication.
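A sketch of one of the two source connectors, including the key-setting SMT chain; the database, table, and column names are illustrative, while the connector class, mode, and transform classes are standard Kafka Connect / Confluent JDBC source names:

```json
{
  "name": "jdbc-source-db1",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://db1.example.com:3306/inventory",
    "table.whitelist": "orders",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "id",
    "topic.prefix": "shared-",
    "transforms": "createKey,extractKey",
    "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.createKey.fields": "id",
    "transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractKey.field": "id"
  }
}
```

The second connector would be identical apart from its `name` and `connection.url`; because both use the same `topic.prefix`, rows from both databases land on the same `shared-orders` topic, each keyed by `id` for downstream de-duplication. Each connector tracks its own source offset (the high watermark per table), so they do not interfere with each other.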

Connection between Apache Kafka and JMS

I was wondering, can Apache Kafka communicate with and send messages to JMS? Can I establish a connection between them? For example, I'm using JMS in my system, and it should send messages to another system that uses Kafka.
Answering a bit late, but if I understood the requirement correctly:
If the requirement is synchronous messaging from
client -> JMS -> Kafka -> consumer
then the following is not the solution. But if it is (and it most likely is) an asynchronous requirement like:
client -> JMS | ----> Kafka ---> consumer
then this is a job for the Kafka Connect framework, which solves the problem of integrating different sources and sinks with Kafka.
http://docs.confluent.io/2.0.0/connect/
http://www.confluent.io/product/connectors
So what you need is a JMS source connector.
Not directly, and the two are not directly comparable concepts. JMS is a vendor-neutral API specification for a messaging service.
While Kafka may be classified as a messaging service, it is not compatible with the JMS API, and to the best of my knowledge there is no trivial way of adapting JMS to fit Kafka's use cases without making significant compromises.
However, if you simply need to move messages between Kafka and a JMS-compliant broker, this can easily be achieved either by writing a simple relay app that consumes from one and publishes to the other, or by using something like Kafka Connect, which has pre-built sinks for most data sources, including JMS brokers, databases, etc.
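As a sketch of the Kafka Connect route, a JMS source connector configuration might look like the following, assuming Confluent's (commercially licensed) JMS source connector and an ActiveMQ broker; all hostnames, the queue name, and the topic name are illustrative:

```json
{
  "name": "jms-source",
  "config": {
    "connector.class": "io.confluent.connect.jms.JmsSourceConnector",
    "kafka.topic": "from-jms",
    "jms.destination.name": "orders.queue",
    "jms.destination.type": "queue",
    "java.naming.factory.initial": "org.apache.activemq.jndi.ActiveMQInitialContextFactory",
    "java.naming.provider.url": "tcp://activemq:61616",
    "confluent.topic.bootstrap.servers": "localhost:9092"
  }
}
```

The connector drains messages from the JMS queue and republishes them on the `from-jms` Kafka topic, which is exactly the JMS-to-Kafka direction the question asks about.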
If the requirement is the reverse of the previous answer:
Kafka Producer -> Kafka Broker -> JMS Broker -> JMS Consumer
then you would need a Kafka Connect sink like the following one from DataMountaineer:
http://docs.datamountaineer.com/en/latest/jms.html
