Connection between Apache Kafka and JMS

I was wondering whether Apache Kafka can communicate with and send messages to JMS. Can I establish a connection between them? For example, I'm using JMS in my system and it should send messages to another system that uses Kafka.

Answering a bit late, but if I understood the requirement correctly:
If the requirement is synchronous messaging from
client -> JMS -> Kafka -> consumer
then the following is not the solution. But if it is (and most likely) an asynchronous requirement like:
client -> JMS | -> Kafka -> consumer
then this is what the Kafka Connect framework is for: it solves the problem of integrating different sources and sinks with Kafka.
http://docs.confluent.io/2.0.0/connect/
http://www.confluent.io/product/connectors
So what you need is a JMSSourceConnector.

Not directly. And the two are not directly comparable concepts: JMS is a vendor-neutral API specification of a messaging service.
While Kafka may be classified as a messaging service, it is not compatible with the JMS API, and to the best of my knowledge there is no trivial way of adapting JMS to fit Kafka's use cases without making significant compromises.
However, if your need is simply to move messages between Kafka and a JMS-compliant broker, this can easily be achieved either by writing a simple relay app that consumes from one and publishes onto the other, or by using something like Kafka Connect, which has pre-canned connectors for most data sources and sinks, including JMS brokers, databases, etc.
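For illustration, here is a minimal sketch of such a relay in Java, assuming an ActiveMQ broker on the JMS side and the plain Kafka producer API; the broker URL, queue name, and topic name are placeholders:

    import java.util.Properties;
    import javax.jms.*;
    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class JmsToKafkaRelay {
        public static void main(String[] args) throws JMSException {
            // JMS side: connect to the broker and subscribe to the source queue
            ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(session.createQueue("source.queue"));

            // Kafka side: a plain producer writing to the target topic
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            KafkaProducer<String, String> producer = new KafkaProducer<>(props);

            connection.start();
            while (true) {
                Message message = consumer.receive();  // block until a JMS message arrives
                if (message instanceof TextMessage) {
                    String payload = ((TextMessage) message).getText();
                    producer.send(new ProducerRecord<>("target-topic", payload));
                }
            }
        }
    }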

If the requirement is the reverse of the previous answer:
Kafka Producer -> Kafka Broker -> JMS Broker -> JMS Consumer
then you would need a Kafka Connect sink like the following one from Data Mountaineer:
http://docs.datamountaineer.com/en/latest/jms.html

Related

Kafka connect with EventStoreDB

I'm working on a small academic project: event sourcing with EventStoreDB and Apache Kafka as a broker. The idea is to get events from EventStoreDB and push them to Kafka for further distribution. I saw that Apache Kafka has connectors to different DB systems, but I didn't find any connector for EventStoreDB.
How can I create (code, or use an existing) Kafka connector to EventStoreDB, so that these two systems would be able to transfer events both ways, from Kafka to EventStoreDB and from EventStoreDB to Kafka?
There is no official Kafka Connect connector between Kafka and EventStoreDB, and I haven't heard of any unofficial one so far. Still, there is a tool called Replicator that enables replicating data from EventStoreDB to Kafka (https://replicator.eventstore.org/docs/features/sinks/kafka/). It's open-sourced, so you can either use it or check the implementation.
For the EventStoreDB to Kafka direction, I recommend using the subscriptions mechanism: catch-up if you need an ordering guarantee, persistent if ordering is not critical: https://developers.eventstore.com/clients/grpc/subscriptions.html. The crucial part here is to define how to map EventStoreDB streams to Kafka topics and partitions. Typically you'd expect to have at least an ordering guarantee at the stream level, so events from a single stream should land in the same partition.
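A minimal sketch of that mapping idea, using the plain Kafka producer API; the onEvent callback and its arguments are hypothetical stand-ins for whatever your subscription handler actually delivers:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class EsdbToKafkaForwarder {
        private final KafkaProducer<String, byte[]> producer;
        private final String topic;

        public EsdbToKafkaForwarder(KafkaProducer<String, byte[]> producer, String topic) {
            this.producer = producer;
            this.topic = topic;
        }

        // Hypothetical callback, invoked by an EventStoreDB catch-up subscription.
        public void onEvent(String streamId, byte[] eventData) {
            // Using the stream id as the record key means Kafka's default
            // partitioner hashes all events of one stream to the same
            // partition, preserving per-stream ordering.
            producer.send(new ProducerRecord<>(topic, streamId, eventData));
        }
    }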
For the Kafka to EventStoreDB direction, you could either write your own pass-through service or try the HTTP sink connector (e.g. https://docs.confluent.io/kafka-connect-http/current/overview.html). EventStoreDB exposes an HTTP API (https://developers.eventstore.com/clients/http-api/v5/introduction/). Sidenote: this API (Atom-pub based) may be replaced with another HTTP API in the future, so the structure may change.
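As a rough sketch of the pass-through direction, a write against the v5 Atom-based HTTP API might look roughly like this; the endpoint path, content type, and payload shape are assumptions based on that API, which, as noted, may change:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.UUID;

    public class KafkaToEsdbWriter {
        private final HttpClient client = HttpClient.newHttpClient();

        // Append one event (already serialized as JSON) to an EventStoreDB stream.
        public void append(String stream, String eventType, String dataJson) throws Exception {
            String body = "[{\"eventId\":\"" + UUID.randomUUID()
                    + "\",\"eventType\":\"" + eventType
                    + "\",\"data\":" + dataJson + "}]";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:2113/streams/" + stream))
                    .header("Content-Type", "application/vnd.eventstore.events+json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() != 201) {
                throw new IllegalStateException("Append failed: " + response.statusCode());
            }
        }
    }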
You can use Event Store Replicator, which has a Kafka sink.
Keep in mind that it doesn't do anything with regard to event schemas, so things like Kafka Streams and KSQL might not work properly.
The sink was created solely for the purpose of pushing events to Kafka when Kafka is used as a message broker.

Is it possible to send websocket messages to a Kafka topic?

I am trying to find a way to consume messages that are being sent over a websocket and put them on a Kafka topic (the messages are sent by the websocket to the address 'ws://address:port/topic_name' and I want to add all of those messages to a Kafka topic).
I read about Kafka Connect and tried to find a way to do it with that, but it doesn't seem to work...
Thanks in advance :)
There is no Kafka connector for a plain socket in the Confluent Platform.
I work in a team that uses Kafka in production and our source is a socket, so your options are to use a platform that supports socket-to-Kafka producing, or to write one yourself.
As for possible platforms, I think most of them will be overkill, though you can use them for this problem. Some options are:
1. NiFi or MiNiFi for smaller loads, using the PublishKafka processor
2. StreamSets with the Kafka Producer destination
3. Apache Flume: not really recommended, as the project has stopped evolving
If you wish to write your own producer, you basically have to create a listener on this port and produce the incoming messages to Kafka; if this is a websocket, just get the payload of each request and produce it to Kafka.
Example Kafka producer code can be copied from the tutorialspoint simple producer example.
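To make that concrete, here is a minimal sketch assuming a javax.websocket server endpoint and the plain Kafka producer API; the endpoint path, broker address, and topic name are placeholders:

    import java.util.Properties;
    import javax.websocket.OnMessage;
    import javax.websocket.server.ServerEndpoint;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    @ServerEndpoint("/topic_name")
    public class WebSocketToKafkaEndpoint {
        private static final KafkaProducer<String, String> PRODUCER = createProducer();

        private static KafkaProducer<String, String> createProducer() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            return new KafkaProducer<>(props);
        }

        // Every message received on the websocket is produced to the Kafka topic as-is.
        @OnMessage
        public void onMessage(String message) {
            PRODUCER.send(new ProducerRecord<>("topic_name", message));
        }
    }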
Here are some open-source project examples:
1. https://github.com/DataReply/kafka-connect-socket-source
2. https://github.com/kafka-socket/miniature_engine
3. https://github.com/dhanuka84/kafka-connect-tcp
4. https://github.com/krux/tcp-stream-kafka-producer
The idea of Kafka Connect is that you have some sort of external integration that serves as storage. This can be SAP, Salesforce, an RDBMS, MQ, or anything else that has state. Your websocket endpoint does not have data; you cannot poll it, since it is someone else who invokes it, and that is how the data gets transferred. Now, if you know who is actually holding the data, then you can potentially build a connector using this guide: https://docs.confluent.io/current/connect/devguide.html
For your particular case, the best you can do is either to use the Kafka Producer API https://docs.confluent.io/current/clients/producer.html
and call that producer from your websocket endpoint to post a message to the topic, or, even better, if you are using Spring, you can use a higher-level abstraction, KafkaTemplate: https://docs.spring.io/spring-kafka/reference/html/#sending-messages.
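With Spring, that forwarding step shrinks to roughly the following sketch, assuming spring-kafka is configured with its usual defaults; the topic name is a placeholder:

    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;

    @Component
    public class KafkaForwarder {
        private final KafkaTemplate<String, String> template;

        public KafkaForwarder(KafkaTemplate<String, String> template) {
            this.template = template;
        }

        // Call this from the websocket endpoint with each incoming payload.
        public void forward(String payload) {
            template.send("topic_name", payload);
        }
    }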
Full disclosure: I work for MigratoryData.
You can check out MigratoryData's solution for Kafka. MigratoryData is a scalable WebSocket server. The MigratoryData source/sink connector for Kafka makes use of the Kafka Connect API and can be used to stream data in real time from Kafka to WebSocket clients and vice versa. The main advantage of the solution is that it extends Kafka messaging to WebSocket clients while preserving Kafka's key features like guaranteed delivery, message ordering, etc.

Spring Integration - ActiveMQ to Kafka

I am currently trying to write an adapter which will consume messages from ActiveMQ and publish them to Kafka.
I am thinking of using Spring Integration to integrate these two messaging systems.
My problem is that my application will not maintain a registry of the models that the many publishing applications use to publish records to ActiveMQ. I want to receive these javax.jms messages and perform some transformation, like adding the JMSCorrelationID to the Kafka message.
Also, another requirement is to send the acknowledgement to ActiveMQ only when the Kafka send/publish is successful.
Can the ack be sent back to ActiveMQ using Spring Integration?
Would Spring Integration be a good option?
Kindly note my tech architect is not in favor of using Camel/Mule. Also, he does not want to use Kafka Connect, as I was planning to use a Kafka Connect source.
Please suggest.
The Spring Integration Kafka extension project has a sync mode for publishing, which blocks the thread until Kafka confirms delivery (or throws an exception on failure).
The JMS inbound gateway can be used to return a reply to a JMS queue.
You can add transformers (or whatever) in the flow to modify the message.
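Putting those pieces together, a rough sketch with the Spring Integration Java DSL might look like the following. This assumes the spring-integration-jms and spring-integration-kafka DSL modules; the queue and topic names are placeholders, and exact DSL methods should be checked against your versions:

    import javax.jms.ConnectionFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.jms.dsl.Jms;
    import org.springframework.integration.kafka.dsl.Kafka;
    import org.springframework.kafka.core.KafkaTemplate;

    @Configuration
    public class ActiveMqToKafkaFlow {

        @Bean
        public IntegrationFlow amqToKafka(ConnectionFactory connectionFactory,
                                          KafkaTemplate<String, String> kafkaTemplate) {
            return IntegrationFlows
                    // Consume from ActiveMQ; with a transacted session, the JMS
                    // message is only acknowledged if the flow completes normally.
                    .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                            .destination("source.queue"))
                    // Copy the JMS correlation id into a header of our own
                    // (Spring Integration maps JMSCorrelationID to the
                    // "jms_correlationId" header).
                    .enrichHeaders(h -> h.headerExpression("correlationId",
                            "headers['jms_correlationId']"))
                    // sync(true) blocks until Kafka confirms delivery and throws
                    // on failure, which prevents the JMS ack above.
                    .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                            .topic("target-topic")
                            .sync(true))
                    .get();
        }
    }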

Connect JMS client to Apache Kafka

I have a 3rd party system pumping data into HornetQ using JMS. I need to replace HornetQ with Kafka, but I cannot change the 3rd party system. What is the correct way to get the data into Kafka?
I googled around and found the Kafka JMS client and Kafka Connect. After reading the documentation for both, I'm confused and not sure which one is the right one.
Does anyone have any experience with this and can give me some hints on how to do it?
The right way is to use the JMS client, because it's an implementation of the JMS API specification but with the Kafka wire protocol. This means you can use this client in your 3rd party system and have Kafka instead of HornetQ on the other side. At a minimum, you need to add this dependency to the 3rd party system so that it uses the Kafka JMS implementation instead of the HornetQ one.
Use the Kafka JMS Client when you want to replace a JMS Broker with Apache Kafka
Use the Kafka JMS Connector when you want to integrate Kafka with a legacy JMS broker and send messages between the two different systems.
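The point in the broker-replacement case is that code written against the javax.jms API stays the same; only the ConnectionFactory implementation (and the dependency providing it) changes. A sketch, with the Kafka-backed factory left abstract since its class name depends on the JMS client vendor (e.g. Confluent's JMS client):

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    public class ThirdPartyPublisher {
        // The only provider-specific part is where the ConnectionFactory is
        // obtained (directly or via JNDI). Swapping in a Kafka-backed JMS
        // ConnectionFactory leaves everything below untouched.
        public void publish(ConnectionFactory factory, String text) throws Exception {
            try (Connection connection = factory.createConnection()) {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(session.createQueue("orders"));
                TextMessage message = session.createTextMessage(text);
                producer.send(message);
            }
        }
    }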

JMS and ESB - how are they related?

To me, JMS and ESB seem to be very related things, and I'm trying to understand how exactly they are related.
I've seen a sentence saying that JMS can be used as a transport for an ESB; then what else, besides the transport, should be present in such an ESB? Is JMS a simple ESB, and if not, what does it lack compared to a real ESB?
JMS offers a set of APIs for messaging: put a message on a queue; someone else, sometime later, perhaps geographically far away, takes the message off the queue and processes it. We have decoupled the message provider and consumer in time and location. Even if the message consumer happens to be down for a time, we can keep producing messages.
JMS also offers a publish/subscribe capability, where the producer puts the message on a "topic" and any interested parties can subscribe to that topic, receiving messages as and when they are produced. But for the moment, focus just on the queue capability.
We have decoupled some aspects of the relationship between provider and consumer. However, some coupling remains. First, as things stand, every message is processed in the same way. Suppose we want to introduce different kinds of processing for different kinds of messages:
    if ( message.customer.type == Platinum )
        do something special
Obviously we can write code like that, but an alternative would be to have a messaging system that can send different messages to different places. We set up three queues:
Request Queue: the producer(s) put their requests here
Platinum Queue: the platinum consumer processing reads from here
Standard Queue: a standard consumer reads messages from here
Then all we need is a little bit of cleverness in the queue system itself to transfer the message from the Request Queue to the Platinum Queue or Standard Queue.
So this is a content-based routing capability, which is something an ESB provides. Note that the ESB uses the fundamental queueing capabilities offered by JMS.
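A sketch of that "little bit of cleverness" as a standalone JMS router; this is what the ESB would do for you through configuration rather than code, and the queue names and message property here are illustrative:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Session;

    public class ContentBasedRouter {
        public void run(ConnectionFactory factory) throws Exception {
            try (Connection connection = factory.createConnection()) {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageConsumer requests = session.createConsumer(session.createQueue("REQUEST.QUEUE"));
                MessageProducer platinum = session.createProducer(session.createQueue("PLATINUM.QUEUE"));
                MessageProducer standard = session.createProducer(session.createQueue("STANDARD.QUEUE"));
                connection.start();
                while (true) {
                    Message message = requests.receive();
                    // Route on message content: a property set by the producer.
                    if ("Platinum".equals(message.getStringProperty("customerType"))) {
                        platinum.send(message);
                    } else {
                        standard.send(message);
                    }
                }
            }
        }
    }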
A second kind of coupling is that the consumer and producer must agree on the message format. In simple cases that's fine, but when you start to have many producers all putting messages on the same queue, you start to hit versioning problems. New message formats are introduced, but you don't want to change all the existing providers.
Request Version 1 Queue: existing providers write here
Request Version 2 Queue: new providers write here; the new consumer reads here
And the ESB picks up the Version 1 Queue messages and transforms them into Version 2 messages and puts them onto the Version 2 queue.
Message transformation is another possible ESB capability.
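Sticking with plain JMS for illustration, that transformation step is just another consume-transform-produce hop; the payload mapping below is purely illustrative:

    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    public class VersionUpgradingTransformer {
        // Placeholder for the real V1 -> V2 mapping logic.
        private String transformToV2(String v1Payload) {
            return v1Payload.replace("\"version\":1", "\"version\":2");
        }

        // Relay one message from the Version 1 queue to the Version 2 queue.
        public void relayOne(Session session, MessageConsumer v1Queue, MessageProducer v2Queue)
                throws Exception {
            TextMessage incoming = (TextMessage) v1Queue.receive();
            v2Queue.send(session.createTextMessage(transformToV2(incoming.getText())));
        }
    }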
Have a look at ESB products and see what they can do. As I work for IBM, I'm most familiar with WebSphere ESB.
I would say an ESB is like a facade over a number of protocols, JMS being one of them.
An addition to the above list is another open source ESB, UltraESB.
JMS is not well suited to the integration of REST services, file systems, S/FTP, email, Hessian, SOAP, etc., which are better handled with an ESB that supports these types natively. For example, if you have a process that dumps a 500 MB CSV file at midnight, and you want another system to pick up the file, parse the CSV, and import it into a database, this can easily be accomplished by an ESB, whereas a solution with just JMS would be poor. Similarly, integration of REST services, with load balancing/failover across multiple backend instances, is better done with an ESB supporting HTTP/S natively.
This transformation does not happen automatically. You need to configure the mapping or write a transformation service.
See https://access.redhat.com/knowledge/docs/en-US/JBoss_Enterprise_SOA_Platform/4.2/html/SOA_ESB_Message_Transformation_Guide/ch02s03.html
An ESB offers integration with a lot of different protocols in addition to JMS.
Most ESBs use JMS behind the scenes to transfer, store, and move messages. One such solution, OpenESB, uses XML-format messages.
There are open source ESBs which you could check out:
OpenESB
Apache Camel
MuleESB
WSO2 ESB
JMS implementations like ActiveMQ come with Camel built in.
JMS is an API for communicating with an underlying messaging layer. An ESB operates at a higher level, offering integration with multiple technologies and protocols, one of which may be JMS, in a uniform way that makes managing complex flows much simpler.
There are JMS message brokers that you can easily configure with an ESB: https://docs.wso2.com/display/ESB470/JMS+Transport
JMS and ESB both provide a way of communication between different applications, but the contexts for JMS and ESB are different. JMS is for simple needs. It is implemented by a JMS provider and is Java-specific.
Examples of JMS providers are: Apache ActiveMQ, IBM MQ, HornetQ, etc.
An ESB is for complex needs. An ESB is a component in EAI providing communication facilities to various applications. It is generic and not specific to Java; JMS is just one of the supported protocols.
Examples of ESB providers are: MuleESB, Apache Camel, OpenESB.
Use case: it may be overhead to use an ESB if all our communicating applications are in Java and use the same message format. There, JMS may be sufficient.
