Spring Kafka or Kafka Streams for a high-volume data processing Spring Boot application [closed] - apache-kafka-streams

I am working on creating a high-volume JSON data processing application for a bank using Spring Boot, Kafka, and QuickFIX/J. This is my first time working with technologies like Kafka and QuickFIX/J, and I am unable to decide whether I should use plain Spring Kafka, Kafka Streams, or Spring Cloud Stream.
Here is the requirement:
Read data from multiple Kafka topics
Process and send the data to a QuickFIX/J initiator that further sends it to an external FIX engine
A QuickFIX/J acceptor receives the data from the external FIX engine and writes it back to multiple Kafka topics, but different ones this time
I have gone through tutorials/articles that say Kafka Streams or Spring Cloud Stream is good if you have both consumers and producers, are performing high-volume data streaming, and want to achieve exactly-once processing. But here I need to send data to an external party after processing, receive it back, and then write it to Kafka topics.
Is Kafka Streams a good choice here, or should I use Spring Kafka with normal producers and consumers?

Spring Cloud Stream is just a higher level, opinionated, abstraction on top of Spring for Apache Kafka. It can handle your use case (there are several "sink" sample applications).
Similarly, Kafka Streams does not necessarily have to produce output to Kafka (although that's what it is designed to do).
Probably the fastest on-ramp is Spring Cloud Stream (or Spring for Apache Kafka with Spring Boot) because most cookie-cutter configuration is provided for you and you can just concentrate on your business logic.
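For illustration, here is a minimal sketch of what the consume-and-forward leg could look like with the Spring Cloud Stream functional model. The binding name, the JSON-to-FIX mapping, and the idea of exposing the initiator's SessionID as a Spring bean are all assumptions, not something prescribed by either library.

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import quickfix.Session;
import quickfix.SessionID;
import quickfix.SessionNotFound;

@Configuration
public class FixBridgeConfig {

    // Bound to a source topic via configuration, e.g.:
    // spring.cloud.stream.bindings.toFix-in-0.destination=orders-in
    @Bean
    public Consumer<String> toFix(SessionID initiatorSessionId) { // SessionID exposed as a bean is an assumption
        return json -> {
            try {
                Session.sendToTarget(mapJsonToFix(json), initiatorSessionId);
            } catch (SessionNotFound e) {
                throw new IllegalStateException("FIX session not available", e);
            }
        };
    }

    private quickfix.Message mapJsonToFix(String json) {
        quickfix.Message msg = new quickfix.Message();
        // Real code would translate the JSON payload into proper FIX fields;
        // stuffing it into Text (tag 58) is just a placeholder.
        msg.setString(quickfix.field.Text.FIELD, json);
        return msg;
    }
}
```

The return leg (acceptor receives from the FIX engine, publishes to different Kafka topics) could be done the same way in reverse, e.g. with a StreamBridge or a plain KafkaTemplate inside the QuickFIX/J application callback.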

Related

Spring Batch and Kafka

I am a junior programmer in banking. I want to build a microservice system that gets data from Kafka and processes it. After that, it saves the result to a database and sends the final data to a client app. What technology can I use? I plan to use Spring Batch and Kafka. Can these technologies be implemented in my project, or is there a better alternative?
To process data from a Kafka topic, I recommend using the Kafka Streams API, specifically Spring Kafka Streams.
Kafka Streams and Spring
And to store the data in a database, you should use a Kafka Sink Connector.
Kafka Connect
This approach is very common and easy if your company has a Kafka ecosystem.
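As a sketch of that recommendation, a Kafka Streams topology declared through Spring for Apache Kafka could look like the following. The topic names and the trivial transformation are made up, and it assumes Spring Boot's Kafka Streams auto-configuration (e.g. spring.kafka.streams.application-id plus default serdes) supplies the StreamsBuilder.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;

@Configuration
@EnableKafkaStreams
public class ProcessingTopology {

    @Bean
    public KStream<String, String> process(StreamsBuilder builder) {
        KStream<String, String> input = builder.stream("raw-events"); // source topic (assumed name)
        input.mapValues(String::toUpperCase)      // stand-in for the real processing
             .to("processed-events");             // a Kafka Connect sink on this topic
                                                  // can then persist records to the database
        return input;
    }
}
```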
In terms of alternatives, here you will find an interesting comparison:
https://scramjet.org/blog/welcome-to-the-family (3-in-1 serverless)
Scramjet takes a slightly different approach: three platforms in one. Both the free product for installation on your own server (https://hub.scramjet.org/) and the cloud platform are available; the latter is currently also free in its beta version (https://scramjet.org/#join-beta).

Calling Hibernate in Spring Cloud Stream

I'm new to Spring Cloud Stream.
Say I have a Spring Cloud Stream app that listens to some topic from Kafka using @StreamListener("input-channel").
I want to do some calculation and send the result to another topic, but in the middle of the processing I also need to call Hibernate (via Spring Data JPA) to persist some data to my MySQL database.
Is it valid to call Hibernate in the middle of stream processing? Is there another pattern for doing this?
Yes, it's a database call, so why not? People do it all the time.
Also, @StreamListener has been deprecated for 3 years now and has already been removed from the newer versions, so please transition to the functional programming model.
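For reference, the functional equivalent could look like the sketch below; CalculationRepository, CalculationResult, and the binding names are hypothetical, and the calculation itself is a stand-in.

```java
import java.util.function.Function;

import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.JpaRepository;

// Hypothetical JPA entity persisted mid-stream.
@Entity
class CalculationResult {
    @Id @GeneratedValue
    Long id;
    String value;

    protected CalculationResult() { }                       // required by JPA
    CalculationResult(String value) { this.value = value; }
}

// Hypothetical Spring Data JPA repository backed by Hibernate.
interface CalculationRepository extends JpaRepository<CalculationResult, Long> { }

@Configuration
public class CalculationStream {

    // Bound via spring.cloud.stream.bindings.calculate-in-0.destination=input-topic
    // and spring.cloud.stream.bindings.calculate-out-0.destination=output-topic
    @Bean
    public Function<String, String> calculate(CalculationRepository repository) {
        return payload -> {
            String result = doCalculation(payload);
            repository.save(new CalculationResult(result)); // the Hibernate call mid-stream
            return result;                                  // sent to the output topic
        };
    }

    private String doCalculation(String payload) {
        return payload.trim(); // stand-in for the real calculation
    }
}
```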

How to document the message formats required to communicate with an ActiveMQ instance? [closed]

I am trying to provide some services over ActiveMQ using Camel routing features. But I need my clients to know what kind of messages they can send over ActiveMQ. I am thinking of something like Swagger documentation for Spring MVC REST APIs. Is there any mechanism for that, or should I do it manually?
What you're asking for isn't really the way messaging works. ActiveMQ is a message broker. Each protocol which the broker supports can have client implementations in essentially any language on any platform, and each such client implementation would have its own API documentation.
ActiveMQ does provide a JMS client implementation, as that is expected of JMS providers. You can read the JMS 1.1 specification or peruse the JavaDoc in order to understand the API better.
Aside from that, ActiveMQ supports the following protocols:
AMQP 1.0
STOMP 1.0, 1.1, & 1.2
MQTT 3.1
Again, each of these protocols will have various client implementations with their own documentation.
These protocols would be akin to HTTP in your REST use-case. They are essentially a transport mechanism. You will have to specify message formats in order to exchange data between applications. These message formats would be akin to your REST API.
Thanks to @Helen's comment, I found out about AsyncAPI. It provides documentation and code-generation tools for services exposed over event-driven architectures. It is based on OpenAPI specifications like Swagger. As stated in the AsyncAPI specification v2.1.0:
The AsyncAPI Specification is a project used to describe and document message-driven APIs in a machine-readable format. It’s protocol-agnostic, so you can use it for APIs that work over any protocol (e.g., AMQP, MQTT, WebSockets, Kafka, STOMP, HTTP, Mercure, etc).
The AsyncAPI Specification defines a set of files required to describe such an API. These files can then be used to create utilities, such as documentation, integration and/or testing tools.
You just have to create a YAML or JSON file. They provide multiple generators that generate code and documentation from your specification files. I used their HTML generator to generate my documents.
Also, this is a good example of how to define your specifications based on AsyncAPI.
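For a flavor of what such a file looks like, here is a minimal, entirely hypothetical AsyncAPI 2.1.0 document; the channel and payload names are made up for illustration.

```yaml
asyncapi: '2.1.0'
info:
  title: Order Events API
  version: '1.0.0'
channels:
  orders/created:            # hypothetical broker destination
    subscribe:
      message:
        name: OrderCreated
        payload:
          type: object
          properties:
            orderId:
              type: string
            amount:
              type: number
```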

Spring Cloud Stream vs Spring AMQP

I'm developing an application that consumes messages from an exchange and can publish to one of multiple exchanges based on the result of transforming the input message.
I am trying to decide whether to go with Spring AMQP or Spring Cloud Stream.
Which would be apt for this scenario?
Spring Cloud Stream (its Rabbit Binder) is a higher-level abstraction on top of Spring AMQP.
It is more opinionated and performs some configuration automatically.
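To make the difference concrete, a content-based routing sketch with Spring Cloud Stream's functional model and StreamBridge might look like this; the binding names and the routing condition are assumptions, and each output binding would map to a different exchange in application properties.

```java
import java.util.function.Consumer;

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RoutingConfig {

    @Bean
    public Consumer<String> route(StreamBridge bridge) {
        return message -> {
            String transformed = transform(message);
            // Each binding maps to a different exchange, e.g.:
            // spring.cloud.stream.bindings.priorityOut.destination=priority-exchange
            // spring.cloud.stream.bindings.standardOut.destination=standard-exchange
            String binding = transformed.startsWith("URGENT") ? "priorityOut" : "standardOut";
            bridge.send(binding, transformed);
        };
    }

    private String transform(String message) {
        return message.toUpperCase(); // stand-in for the real transformation
    }
}
```

With plain Spring AMQP you would wire the same flow yourself with a @RabbitListener and a RabbitTemplate, which gives more control at the cost of more configuration.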

Spring Boot fully reactive Kafka processing

Is there any fully reactive, production-ready Kafka support within the Spring Boot ecosystem? By fully reactive I mean respecting backpressure/polling, concurrent message processing (flatMap), and handling possible failures (out-of-order processing errors). I did my research and there are several promising options (spring-cloud-stream with Reactor-based spring-cloud-function support, spring-integration with KafkaMessageSource, and the Reactor Kafka project), but I am not sure whether they meet all the requirements, and it is actually quite confusing.
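As one concrete data point among those options, the core Reactor Kafka consumption pattern looks roughly like the sketch below: backpressure-aware receiving, bounded concurrency via flatMap, and per-record acknowledgement. The broker address, topic, group, and processing step are placeholders, and this says nothing by itself about production readiness.

```java
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import reactor.core.publisher.Mono;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverOptions;

public class ReactiveConsumer {

    public static void main(String[] args) {
        ReceiverOptions<String, String> options = ReceiverOptions
                .<String, String>create(Map.of(
                        ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                        ConsumerConfig.GROUP_ID_CONFIG, "demo-group",
                        ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                        ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class))
                .subscription(List.of("demo-topic"));

        KafkaReceiver.create(options)
                .receive()                                // Flux of records, demand-driven
                .flatMap(record ->
                        Mono.fromRunnable(() -> process(record.value()))
                            .doOnSuccess(v -> record.receiverOffset().acknowledge()),
                        8)                                // concurrency limit
                .blockLast();                             // demo only; subscribe() in real code
    }

    private static void process(String value) {
        System.out.println("processed: " + value);
    }
}
```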
