What's the best way to integrate an MQTT broker with Tomcat (tomcat7)?

I have a Tomcat 7 application that currently accepts HTTPS calls carrying a JSON payload, so this works perfectly in a client/server relationship.
I want to be able to push data out to 'clients', so I am investigating MQTT. This works fine - I can publish/subscribe to messages between the MQTT broker and the 'clients'.
I now want to be able to re-use my Tomcat code. Do I configure Tomcat to publish/subscribe to MQTT topics? Do I make some 3rd process which subscribes to the MQTT topic and calls into Tomcat?
I'm at the beginning of the investigation stage of a project. Any help/recommendations are appreciated.

Yes, it is possible. There are many client libraries available (e.g. Paho for Java), so your server can subscribe to and publish messages. To handle multiple messages published from various clients, implement a message queue.
Check out the RabbitMQ MQTT adapter:
https://www.rabbitmq.com/mqtt.html
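One common way to wire this up, as a rough sketch rather than a definitive recipe: let the webapp hold a single Paho client that is opened when Tomcat starts. Everything below (the broker URL tcp://localhost:1883, the client id, and the topic name devices/updates) is an assumption for illustration.

    // Sketch only: assumes the Eclipse Paho Java client is on the classpath and
    // a broker is reachable at tcp://localhost:1883 (both are assumptions).
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;
    import javax.servlet.annotation.WebListener;
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttException;
    import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

    @WebListener
    public class MqttBridgeListener implements ServletContextListener {

        private MqttClient client;

        @Override
        public void contextInitialized(ServletContextEvent sce) {
            try {
                client = new MqttClient("tcp://localhost:1883", "tomcat-app", new MemoryPersistence());
                client.connect();
                // React to messages from clients, e.g. hand them to existing service code.
                client.subscribe("devices/updates", (topic, msg) ->
                        sce.getServletContext().log("MQTT " + topic + ": " + new String(msg.getPayload())));
                // Expose the client so servlets can push data out to subscribers.
                sce.getServletContext().setAttribute("mqttClient", client);
            } catch (MqttException e) {
                throw new IllegalStateException("Cannot connect to MQTT broker", e);
            }
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            try {
                if (client != null) {
                    client.disconnect();
                }
            } catch (MqttException ignored) {
            }
        }
    }

Servlets that already handle the HTTPS/JSON calls can then fetch the client from the ServletContext and call publish() to push data out, so the existing Tomcat code is reused instead of being moved into a third process.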

Related

Is it possible to send WebSocket messages to a Kafka topic?

I am trying to find a way to consume messages that are being sent over a WebSocket to a Kafka topic (the messages are sent by the WebSocket to the address 'ws://address:port/topic_name' and I want to add all of those messages to a Kafka topic).
I read about Kafka Connect and tried to find a way to do it with it, but it doesn't seem to work...
thanks in advance :)
There is no Kafka connector for a socket in Confluent Platform.
I work in a team that uses Kafka in production and our source is a socket, so your options are to use a platform that supports this socket-to-Kafka producing, or to write one yourself.
As for possible platforms, I think most of them will be overkill, though you can use them for this problem. Some options are:
1. NiFi or MiNiFi for smaller loads, using the PublishKafka processor
2. StreamSets with the Kafka Producer destination
3. Apache Flume - not really recommended, as the project has stopped evolving.
If you wish to write your own producer, you basically have to create a listener on this port and produce the incoming messages to Kafka; if this is a WebSocket, just get the payload of the requests and produce them to Kafka.
Example Kafka producer code can be copied from the tutorialspoint simple producer example.
Here are some open-source project examples:
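If you go the hand-rolled route, the relay can be as small as a client endpoint that forwards every WebSocket frame to a Kafka producer. The sketch below assumes the kafka-clients library plus a javax.websocket client implementation (e.g. Tyrus) on the classpath; the broker address, endpoint URL and topic name are placeholders standing in for the question's ws://address:port/topic_name.

    // Sketch of a hand-rolled WebSocket-to-Kafka relay; all addresses and names
    // below are placeholder assumptions, not values from the question's system.
    import java.net.URI;
    import java.util.Properties;
    import javax.websocket.ClientEndpoint;
    import javax.websocket.ContainerProvider;
    import javax.websocket.OnMessage;
    import javax.websocket.WebSocketContainer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    @ClientEndpoint
    public class WebSocketToKafkaRelay {

        private static KafkaProducer<String, String> producer;

        @OnMessage
        public void onMessage(String payload) {
            // Each WebSocket text frame becomes one Kafka record.
            producer.send(new ProducerRecord<>("topic_name", payload));
        }

        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producer = new KafkaProducer<>(props);

            WebSocketContainer container = ContainerProvider.getWebSocketContainer();
            // Placeholder for the question's ws://address:port/topic_name endpoint.
            container.connectToServer(WebSocketToKafkaRelay.class, new URI("ws://localhost:8080/topic_name"));
            Thread.currentThread().join(); // keep the relay running
        }
    }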
1. https://github.com/DataReply/kafka-connect-socket-source
2. https://github.com/kafka-socket/miniature_engine
3. https://github.com/dhanuka84/kafka-connect-tcp
4. https://github.com/krux/tcp-stream-kafka-producer
The idea of Kafka Connect is that you have some sort of external integration that serves as storage. This can be SAP, Salesforce, an RDBMS, MQ or anything else that has state. Your WebSocket endpoint does not hold data; you cannot poll it - it is someone else that is invoking it, and that is how the data is transferred. If you know who is actually holding the data, then you can potentially build a connector using this guide: https://docs.confluent.io/current/connect/devguide.html
For your particular case, the best you can do is to use the Kafka Producer API https://docs.confluent.io/current/clients/producer.html
and, from your WebSocket endpoint, use this producer to post a message to the topic. Even better, if you are using Spring you can use a higher-level abstraction, KafkaTemplate: https://docs.spring.io/spring-kafka/reference/html/#sending-messages.
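For the Spring route, a minimal sketch could look like the handler below. It assumes Spring Boot with spring-kafka (which auto-configures a KafkaTemplate) and spring-websocket; the topic name incoming-events is made up.

    // Sketch: forward every WebSocket text message to Kafka via KafkaTemplate.
    // Assumes a Spring Boot app with spring-kafka and spring-websocket.
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;
    import org.springframework.web.socket.TextMessage;
    import org.springframework.web.socket.WebSocketSession;
    import org.springframework.web.socket.handler.TextWebSocketHandler;

    @Component
    public class KafkaForwardingHandler extends TextWebSocketHandler {

        private final KafkaTemplate<String, String> kafkaTemplate;

        public KafkaForwardingHandler(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        @Override
        protected void handleTextMessage(WebSocketSession session, TextMessage message) {
            // "incoming-events" is an assumed topic name for illustration.
            kafkaTemplate.send("incoming-events", message.getPayload());
        }
    }

The handler still has to be registered for a path via a WebSocketConfigurer, exactly like any other Spring WebSocket handler.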
Full disclosure: I work for MigratoryData.
You can check out MigratoryData's solution for Kafka. MigratoryData is a scalable WebSocket server. The MigratoryData Source/Sink Connector for Kafka uses the Kafka Connect API and can be used to stream data in real time from Kafka to WebSocket clients and vice versa. The main advantage of the solution is that it extends Kafka messaging to WebSocket clients while preserving Kafka's key features like guaranteed delivery, message ordering, etc.

Difference between SockJS and ActiveMQ/RabbitMQ

I have recently developed a simple messaging application with Spring Boot and Spring Security. The application has two users - user A and user B. Once user A performs a specific task, a notification is sent to user B. Currently I am doing this by adding the Spring Messaging dependency and SockJS, and it works great.
Here is where I am confused and hoping to receive some guidance. I realize there are many tutorials that talk about RabbitMQ and ActiveMQ. From what I understand, they are message brokers. May I ask what the difference is between SockJS and RabbitMQ/ActiveMQ? And do I need RabbitMQ/ActiveMQ in my current application together with SockJS?
SockJS is a JavaScript-based WebSocket client library that runs in a browser. It can be used to send messages to or receive messages from a broker.
Both RabbitMQ and ActiveMQ are message brokers, examples of message-oriented middleware. They both support WebSocket clients which use a messaging protocol (e.g. STOMP or AMQP). Brokers receive messages from and dispatch messages to clients.
You haven't really provided enough information to determine whether or not you actually need to use either RabbitMQ or ActiveMQ in your current application given that it's already working as it is.
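To make the distinction concrete, here is a sketch of the choice as it appears in a Spring WebSocket configuration (assuming spring-websocket/spring-messaging; the endpoint path, destination prefixes and port are only conventional examples): the in-memory simple broker needs no external software, while the relay delegates to a real broker such as RabbitMQ or ActiveMQ.

    // Sketch of the two broker options in Spring's STOMP-over-SockJS support.
    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.simp.config.MessageBrokerRegistry;
    import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
    import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
    import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

    @Configuration
    @EnableWebSocketMessageBroker
    public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

        @Override
        public void registerStompEndpoints(StompEndpointRegistry registry) {
            registry.addEndpoint("/ws").withSockJS(); // SockJS transport for browsers
        }

        @Override
        public void configureMessageBroker(MessageBrokerRegistry registry) {
            // Option 1: in-memory simple broker -- no RabbitMQ/ActiveMQ needed.
            registry.enableSimpleBroker("/topic");

            // Option 2: delegate to a full STOMP broker such as RabbitMQ or ActiveMQ
            // (uncomment and remove the simple broker above; host/port are assumptions):
            // registry.enableStompBrokerRelay("/topic")
            //         .setRelayHost("localhost")
            //         .setRelayPort(61613);

            registry.setApplicationDestinationPrefixes("/app");
        }
    }

With the simple broker your current setup is already complete; the relay is typically only worth adding when you need things like clustering, durable subscriptions, or sharing the broker with non-web components.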

Spring Integration - ActiveMQ to Kafka

I am currently trying to write an adapter that will consume messages from ActiveMQ and publish them to Kafka.
I am thinking of using Spring Integration to integrate these two messaging systems.
My problem is that my application will not maintain a registry of the models that the many publishing applications use to publish records to ActiveMQ. I want to receive these javax.jms Messages and perform some transformation, such as adding the JMSCorrelationID to the Kafka message.
Also, another requirement is to send the acknowledgement to ActiveMQ only when the Kafka send/publish is successful.
Can the ack be sent back to ActiveMQ using Spring Integration?
Would Spring Integration be a good option?
Kindly note that my tech architect is not in favor of using Camel/Mule. Also, he does not want to use Kafka Connect, as I was planning to use the Kafka Connect source.
Please suggest.
The Spring Integration Kafka extension project has a sync mode for publishing, which will block the thread until Kafka confirms delivery (or throw an exception on a failure).
The JMS inbound gateway can be used to return a reply to a JMS queue.
You can add transformers (or whatever) in the flow to modify the message.
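A sketch of such a flow with the Spring Integration Java DSL might look like the following. It assumes spring-integration-jms and spring-integration-kafka on the classpath, plus an ActiveMQ ConnectionFactory and a KafkaTemplate already defined as beans; the queue and topic names are invented, and the transacted listener container is one way (not the only way) to tie the ActiveMQ acknowledgement to a successful Kafka publish.

    // Sketch: ActiveMQ -> (transform) -> Kafka, with sync publishing so a Kafka
    // failure rolls back the transacted JMS session and the message is redelivered.
    import javax.jms.ConnectionFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.jms.dsl.Jms;
    import org.springframework.integration.kafka.dsl.Kafka;
    import org.springframework.kafka.core.KafkaTemplate;

    @Configuration
    public class ActiveMqToKafkaFlow {

        @Bean
        public IntegrationFlow jmsToKafka(ConnectionFactory connectionFactory,
                                          KafkaTemplate<String, String> kafkaTemplate) {
            return IntegrationFlows
                    // Transacted consumption: the JMS message is only committed
                    // when the downstream send completes without an exception.
                    .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                            .destination("incoming.queue")
                            .configureListenerContainer(c -> c.sessionTransacted(true)))
                    // Add transformers here, e.g. to copy the JMS correlation id
                    // into the outgoing payload or headers.
                    .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                            .topic("outgoing-topic")
                            .sync(true)) // block until Kafka confirms delivery
                    .get();
        }
    }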

Connection between Apache Kafka and JMS

I was wondering, can Apache Kafka communicate with and send messages to JMS? Can I establish a connection between them? For example, I'm using JMS in my system and it should send messages to another system that uses Kafka.
Answering a bit late, but if I understood the requirement correctly:
If the requirement is synchronous messaging from
client -> JMS -> Kafka -> consumer
then the following is not the solution, but if it is (and most likely) the asynchronous requirement, like:
client -> JMS | -> Kafka -> consumer
then this relates to the Kafka Connect framework, which solves the problem of how to integrate different sources and sinks with Kafka.
http://docs.confluent.io/2.0.0/connect/
http://www.confluent.io/product/connectors
so what you need is a JMSSourceConnector.
Not directly. And the two are not really comparable concepts: JMS is a vendor-neutral API specification for a messaging service.
While Kafka may be classified as a messaging service, it is not compatible with the JMS API, and to the best of my knowledge there is no trivial way of adapting JMS to fit Kafka's use cases without making significant compromises.
However, if your needs are simply to move messages between Kafka and a JMS-compliant broker, then this can easily be achieved by either writing a simple relay app that consumes from one and publishes onto another, or use something like Kafka Connect, which has pre-canned sinks for most data sources, including JMS brokers, databases, etc.
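As an illustration of the relay approach (a sketch only, assuming the activemq-client and kafka-clients libraries; the broker URLs, queue and topic names are placeholders), a few dozen lines of plain Java are enough to move messages from a JMS queue into a Kafka topic:

    // Sketch of a JMS-to-Kafka relay; all addresses and names are assumptions.
    import java.util.Properties;
    import javax.jms.Connection;
    import javax.jms.JMSException;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class JmsToKafkaRelay {

        public static void main(String[] args) throws Exception {
            // Kafka side: a plain producer.
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            KafkaProducer<String, String> producer = new KafkaProducer<>(props);

            // JMS side: consume from a queue on a JMS-compliant broker (ActiveMQ here).
            Connection connection = new ActiveMQConnectionFactory("tcp://localhost:61616").createConnection();
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(session.createQueue("source.queue"));

            // Forward every JMS text message to the Kafka topic.
            consumer.setMessageListener(message -> {
                try {
                    if (message instanceof TextMessage) {
                        producer.send(new ProducerRecord<>("target-topic", ((TextMessage) message).getText()));
                    }
                } catch (JMSException e) {
                    e.printStackTrace();
                }
            });

            Thread.currentThread().join(); // keep the relay alive
        }
    }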
If the requirement is the reverse of the previous answer:
Kafka Producer -> Kafka Broker -> JMS Broker -> JMS Consumer
then you would need a Kafka Connect sink like the following one from DataMountaineer:
http://docs.datamountaineer.com/en/latest/jms.html

How does RabbitMQ compare to Mule

How does RabbitMQ compare to Mule? I am going to build an application using a message-oriented architecture, and AMQP (RabbitMQ) provides everything I want, but I am perplexed by so many related technology choices and similar concepts like an ESB. I am worried that I am making a choice without considering other alternatives.
I am mostly clear that RabbitMQ is a message broker and that it helps me mediate messages between producer and consumer (all forms of publish/subscribe - I could understand how it's used from real examples like Twitter or Facebook updates, etc.).
What is Mule? If I could achieve what I do in RabbitMQ using Mule, should I consider Mule similar to RabbitMQ?
Does Mule have a different objective than that of a message broker?
Does Mule assume that underneath it there is a message broker that delivers messages to the appropriate Mule listeners? (I could easily write a listener in RabbitMQ.)
Is Mule a completely Java-based system? (The experiment I did with RabbitMQ took me less than 30 minutes to write a simple RPC client/server with the client in C# and the server in Java; can such things be done easily in Mule?)
Mule is an ESB (Enterprise Service Bus). RabbitMQ is a message broker.
An ESB provides added layers on top of a message broker, such as routing, transformations and business process management. It is a mediator between applications, integrating Web Services, REST endpoints, database connections, email and FTP servers - you name it. It is a high-level integration backbone which orchestrates interoperability within a network of applications that speak different protocols.
A message broker is a lower level component which enables you as a developer to relay raw messages between publishers and subscribers, typically between components of the same system but not always. It is used to enable asynchronous processing to keep response times low. Some tasks take longer to process and you don't want them to hold things up if they're not time-sensitive. Instead, post a message to a queue (as a publisher) and have a subscriber pick it up and process it "later".
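To make that publish-a-message-and-process-it-later pattern concrete, here is a minimal publish/consume sketch with RabbitMQ's Java client (assuming the amqp-client library, a broker on localhost, and a made-up queue name):

    // Sketch: post a task to a queue, then have a consumer pick it up asynchronously.
    import com.rabbitmq.client.Channel;
    import com.rabbitmq.client.Connection;
    import com.rabbitmq.client.ConnectionFactory;
    import java.nio.charset.StandardCharsets;

    public class TaskQueueExample {

        public static void main(String[] args) throws Exception {
            ConnectionFactory factory = new ConnectionFactory();
            factory.setHost("localhost");

            try (Connection connection = factory.newConnection();
                 Channel channel = connection.createChannel()) {

                // "tasks" is an assumed queue name for illustration.
                channel.queueDeclare("tasks", true, false, false, null);

                // Publisher side: post work to the queue and move on.
                channel.basicPublish("", "tasks", null, "resize-image-42".getBytes(StandardCharsets.UTF_8));

                // Subscriber side: pick the work up "later" and process it.
                channel.basicConsume("tasks", true, (consumerTag, delivery) ->
                        System.out.println("Processing: " + new String(delivery.getBody(), StandardCharsets.UTF_8)),
                        consumerTag -> { });

                Thread.sleep(1000); // give the consumer a moment before the connection closes
            }
        }
    }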
Mule is a "higher level" service implemented with message broker. From the docs
The messaging backbone of the ESB is
usually implemented using JMS, but any
other message server implementation
could be used
You can build an ESB with RabbitMQ; however, you're going to be limited to sending byte[] packages, and you'll have to build your system out of messaging primitives like topics and queues. It might be a bit faster (based on absolutely no benchmarking, testing or data) because there are fewer layers of translation. Mule provides an abstraction on top of this, speaks a variety of transports, and can handle some routing logic.
Mule is an enterprise service bus providing an end-to-end integration solution, whereas RabbitMQ is a message broker for queueing messages between producers and consumers.
RabbitMQ, an open-source message broker, is written in Erlang and built on the Open Telecom Platform for clustering and failover. It is easy to use, supports a huge number of developer platforms, runs on all major operating systems, and works on the concept of exchanges.
Mule connects to RabbitMQ with its AMQP connector.
