Does Spring Kafka producer guarantee delivery by default? - spring-boot

I wonder whether the Spring Kafka producer within Spring Boot guarantees delivery or not.
Does anybody know what happens if some random listener fails to receive a message? Would Spring Kafka retry sending the message?

There are a few concepts at play here:
The producer produces events and sends them to the Kafka broker. On the producer side, you must handle retries and related settings yourself for the cases where Kafka has downtime, or for other error scenarios specific to your context.
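For example, a minimal sketch of a delivery-focused producer configuration; the broker address and topic name are placeholders, and it assumes Spring Kafka 3.x, where KafkaTemplate.send() returns a CompletableFuture:

import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;

public class ReliableProducerSketch {

    public static void main(String[] args) {
        Map<String, Object> props = Map.of(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092", // placeholder
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
                ProducerConfig.ACKS_CONFIG, "all",               // wait for all in-sync replicas
                ProducerConfig.RETRIES_CONFIG, 5,                // retry transient send failures
                ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true); // avoid duplicates on retry

        KafkaTemplate<String, String> template =
                new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));

        // The returned future is how you learn that a send ultimately failed.
        template.send("my-topic", "key", "value").whenComplete((result, ex) -> {
            if (ex != null) {
                // the broker never acknowledged the write: log, alert, or re-send here
            }
        });
    }
}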
Consumers are assigned partitions by Kafka; each partition delivers events, and each event has an offset. Consumers poll Kafka for data (they request data; Kafka does not push it to them). Every event that Kafka delivers successfully to a consumer produces an acknowledgment, and Kafka commits the event's offset, so the next event, with a higher offset, gets delivered to that consumer. If a consumer goes down, its partitions are reassigned to other consumers, so you won't lose data. If you have only one consumer, the data stays stored in Kafka, and when the consumer comes back it will request data from the latest/earliest offset.
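On the consumer side, Spring Kafka can surface that acknowledgment explicitly. A minimal sketch, assuming spring.kafka.listener.ack-mode=manual is set in application.properties and with made-up topic and group names:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class OrderListener {

    @KafkaListener(topics = "orders", groupId = "order-processors")
    public void listen(String message, Acknowledgment ack) {
        // process the message first...
        // ...then commit the offset; if the pod dies before this line,
        // the record is redelivered after the rebalance
        ack.acknowledge();
    }
}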

Related

Kafka refresh event is not broadcasted to all subscriber on single topic

I am hitting an unexpected scenario where not all subscribers (Spring Boot applications) of a single Kafka topic are getting Spring Cloud Config configuration-change refresh notifications. Only the one subscriber that holds the Kafka partition gets the refresh notification; the other subscribers are not assigned Kafka partitions and do not get the refresh event.
That is how Kafka works, and so should be expected; only one active consumer in a consumer group can read any single message from a partition.
You'll need external libraries that distribute that consumed event to other channels.
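To illustrate the consumer-group rule: if every subscriber registers under its own group id, each one receives its own copy of every message. A sketch with a made-up topic name, assuming each instance may keep its own offsets:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class RefreshEventListener {

    // The random suffix puts each instance in its own consumer group,
    // so Kafka delivers the refresh event to all of them.
    @KafkaListener(topics = "config-refresh",
                   groupId = "refresh-#{T(java.util.UUID).randomUUID()}")
    public void onRefresh(String event) {
        // trigger the local configuration refresh here
    }
}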

Persist state of Kafka Producer within Spring Cloud/Boot

I want to implement a Kafka producer with Spring that observes a cloud storage location and emits meta information about newly arrived files.
Until now we did that with a Kafka Connector but for some reasons we now have to do this with a simple Kafka producer.
Now I need to persist the state of the producer (e.g. the timestamp of the last committed file) in a kind of offset topic, like the Connector did, but I have not found a reasonable approach to do that.
My current idea is to hold the state by committing it to a topic that the producer also consumes, and to acknowledge the last consumed state only when committing a new one. So if the Kubernetes pod of the producer dies and comes back up, it consumes the last (unacknowledged) state and knows where it stopped.
But this idea seems a bit complex just to hold the state of a Kafka app. Is there a better approach for that?
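For reference, a compacted, single-partition state topic can make the restart path simpler than acknowledging the previous record: on startup, replay the topic to its end and keep the last value seen. A rough sketch; the topic name "file-observer-state" and the broker address are made up, and error handling is omitted:

import java.time.Duration;
import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class StateRecovery {

    public static String readLastState() {
        TopicPartition state = new TopicPartition("file-observer-state", 0);
        Map<String, Object> props = Map.of(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false); // no group needed

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.assign(List.of(state));          // no consumer group: assign directly
            consumer.seekToBeginning(List.of(state));
            long end = consumer.endOffsets(List.of(state)).get(state);

            String lastState = null;
            while (consumer.position(state) < end) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                    lastState = record.value();       // the newest record wins
                }
            }
            return lastState;                         // null means no checkpoint yet
        }
    }
}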

How can I send a message to disconnected customers with Spring Kafka?

I can not send messages to disconnected clients.
I use Spring Boot with Apache Kafka as a message broker.
If you assign a consumer group id to a user's inbox, the consumer protocol will automatically resume reading from the first unread message, because consumption restarts from the last offset committed back to Kafka.
Kafka persists messages itself, and consumers are not required to be online at the moment producers send events; they receive them the next time they poll.
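A sketch of that idea (the topic, group id, and per-user naming scheme are all invented); auto.offset.reset=earliest makes a brand-new group start from the oldest retained message rather than only new ones:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class InboxListener {

    // On reconnect, the group resumes at its last committed offset, so
    // messages produced while the client was offline are still delivered.
    @KafkaListener(topics = "inbox",
                   groupId = "inbox-user-42", // hypothetical per-user group
                   properties = "auto.offset.reset=earliest")
    public void onMessage(String message) {
        // push the message to the (re)connected client
    }
}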

Multiple instances of the same Spring Boot application, but only one instance consumes messages from an ActiveMQ queue

I am running multiple instances of the same Spring Boot 2.0.4 Application, for scaling purposes, that consume messages from an ActiveMQ queue using the following:
@JmsListener(destination = "myQ")
Only the first instance receives messages; if I stop it, the second instance starts receiving them. I want the instances to consume messages in a round-robin fashion, with each message going to exactly one consumer, but only the first one ever consumes.
It sounds like you want a JMS Topic rather than a Queue. You should also research durable subscriptions, shared subscriptions, and durable topics before you settle on the configuration you need for your setup.
See:
JMS API Programming Model (Search for JMS Message Consumers)
Queues vs Topics
Durable Queues and Topics
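For instance, with a broker that supports JMS 2.0 shared subscriptions (ActiveMQ Artemis does; classic ActiveMQ 5.x does not), a shared durable topic subscription gives exactly that competing-consumer behaviour across instances. A sketch; the factory bean and subscription names are made up:

import javax.jms.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;
import org.springframework.stereotype.Component;

@Configuration
class SharedTopicConfig {

    @Bean
    public DefaultJmsListenerContainerFactory topicFactory(ConnectionFactory cf) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(cf);
        factory.setPubSubDomain(true);        // consume from a topic, not a queue
        factory.setSubscriptionDurable(true); // buffer messages while an instance is down
        factory.setSubscriptionShared(true);  // all instances share one subscription
        return factory;
    }
}

@Component
class SharedTopicListener {

    // The broker hands each message to exactly one member of the shared
    // subscription, giving round-robin-style distribution across instances.
    @JmsListener(destination = "myQ", containerFactory = "topicFactory",
                 subscription = "my-shared-subscription")
    public void onMessage(String message) {
        // process the message
    }
}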

Kafka consumer able to process tons of messages

In the general case, a Kafka consumer could be anything that connects to Kafka and gets messages.
I'm interested in known Kafka consumers for several purposes:
1) process messages and save result in DB(Oracle)
2) process messages and save result in files
What established Kafka consumers can you suggest?
Thanks.
You can use the Camus consumer for Kafka->HDFS. It is a MapReduce job that does distributed data loads out of Kafka.
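For (1), Kafka Connect's JDBC sink connector is the usual off-the-shelf choice. If you would rather write your own in Spring Boot, a minimal sketch (topic, group, table, and column names are invented) could look like:

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DbSinkListener {

    private final JdbcTemplate jdbc;

    public DbSinkListener(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    // Each polled record is inserted into an Oracle table; batching and
    // error handling are left out of this sketch.
    @KafkaListener(topics = "events", groupId = "db-sink")
    public void persist(String payload) {
        jdbc.update("INSERT INTO events (payload) VALUES (?)", payload);
    }
}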
