How to consume events from Kafka in a Spring REST endpoint

I'm new to Kafka. I've seen that a consumer is "always running" and retrieves messages from a topic as soon as they are published.
In a typical database web application you have a REST API that connects to the DB and returns some response.
From what I can see, the consumer stays active and never closes.
So I can't figure out how to return a subset of messages from a topic based on a client request.
I thought the service would create a consumer to fetch what I need, but since a consumer never closes, I guess that idea is not correct.
What should I do?

Then it's simply a matter of persisting the messages received through a @KafkaListener, say by adding each of them (along with its timestamp) to a simple collection, and implementing an endpoint that filters the messages accordingly and returns some of them.
@Controller
public class KafkaController {

    @Autowired
    private KafkaProducerConfig kafkaProducerConfig;

    // Thread-safe map: the listener thread and the web request threads both touch it
    private final Map<Date, String> msgMap = new ConcurrentHashMap<>();

    @KafkaListener(topics = "myTopic", groupId = "myGroup")
    public void listenAndAddMsg(String message) {
        msgMap.put(new Date(), message);
    }

    @PostMapping("messages")
    @ResponseBody
    public Map<Date, String> filterMessages(@RequestBody Interval interval) {
        return msgMap.entrySet()
                .stream()
                .filter(entry -> entry.getKey().after(interval.getStartDate())
                        && entry.getKey().before(interval.getEndDate()))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }
}
public class Interval {
    private Date startDate;
    private Date endDate;
    // getters and setters
}

Related

How many Kafka topics to create for an API?

I'm using Kafka for my API, with Spring and microservices. I'll post my Kafka code below:
Command:
private static final Logger logger = LoggerFactory.getLogger(UserCommandServiceImpl.class);

@Autowired
private KafkaTemplate<String, Object> kafkaTemplate;

public void sendMessage(User objeto) {
    logger.info(String.format("Message sent -> %s", objeto.toString()));
    this.kafkaTemplate.send("quickstart-events", objeto);
}
Query:
private final Logger logger = LoggerFactory.getLogger(UserQueryServiceImpl.class);

@Autowired
private MongoTemplate mongoTemplate;

@KafkaListener(topics = "quickstart-events", groupId = "group-id")
public void consume(String message) {
    logger.info(String.format("Message received -> %s", message));
    mongoTemplate.insert(message, "user");
}
I installed Kafka from that site:
I'm using the CQRS pattern, so each query is one microservice and each command another.
My question is simple: should I create a Kafka topic for each microservice?
Thanks!
Think of a Kafka topic like a database table: use one topic per kind of data.
If you are wondering how to scale your application, you may ask how many partitions your topic should have. A topic is a set of partitions that together handle all of its data.
A topic will receive values from more than one producer and will hold just one kind of message. A message can be stored in any partition, and which partition it lands in is determined by the message key.
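For example, with Spring's KafkaTemplate you can pass a key when sending, so all records for the same entity land in the same partition (the getId() accessor on User is an assumption for illustration):

// Records with the same key are always written to the same partition of the topic
this.kafkaTemplate.send("quickstart-events", objeto.getId().toString(), objeto);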

SSE events, check if there is active session

In a Spring Boot application I have an endpoint that returns a Flux, backed by a custom sink similar to the one in this post.
I don't have a Flux with an interval.
The code looks like this. Constructor:
private final Flux<MyDto> events;

public MyController(MyService service, EventPublisher publisher) {
    this.service = service;
    this.events = Flux.create(publisher).share();
}
API:
@GetMapping(path = "/path/events-stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<MyDto> fluxEvents() {
    return this.events.map(event -> (MyDto) event.getSource());
}
The event is triggered by an external factor, exactly as in the post mentioned above.
How can I know whether there are active connections on the events-stream?
I need this to decide whether I should trigger new events for the user.
The question has been asked here before, with no answer.
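One way to track this at the Reactor level (a sketch, not from the original thread; the activeSubscribers counter and hasActiveSessions() helper are names introduced here for illustration) is to increment a counter on every subscription and decrement it when the subscriber disconnects:

private final AtomicInteger activeSubscribers = new AtomicInteger();

@GetMapping(path = "/path/events-stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<MyDto> fluxEvents() {
    return this.events
            .map(event -> (MyDto) event.getSource())
            // doOnSubscribe fires once for each SSE client that connects
            .doOnSubscribe(subscription -> activeSubscribers.incrementAndGet())
            // doFinally fires on cancel, complete, or error, i.e. when the client goes away
            .doFinally(signal -> activeSubscribers.decrementAndGet());
}

public boolean hasActiveSessions() {
    return activeSubscribers.get() > 0;
}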

Spring-Kafka: How to insert an application.yml topic in Producer Kafka

I have a spring-kafka microservice to which I recently added a dead letter topic, so that I can send the various error messages to it:
// some code..
@Component
public class KafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendDeadLetter(String message) {
        kafkaTemplate.send("myDeadLetter", message);
    }
}
I would like to name the dead letter topic "messageTopic" + "_deadLetter", my main topic being "messageTopic". In my consumer, the topic name comes from application.yml as follows:
@KafkaListener(topics = "${spring.kafka.topic.name}")
How can I build the dead letter topic name from the same application.yml property, appending "_deadLetter" to it? I tried something like this:
@Component
@KafkaListener(topics = "${spring.kafka.topic.name}" + "_deadLetter")
public class KafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendDeadLetter(String message) {
        kafkaTemplate.send("messageTopic_deadLetter", message);
    }
}
but that creates two different topics for me. Any advice is welcome, thanks for the help!
@KafkaListener accepts a constant for the topic name; you can't modify the topic name there.
Ideally, use separate methods (Kafka listeners) for the actual topic and the dead letter topic, and define two different properties in the YAML to hold the two topic names (a sample application.yml follows the listeners below).
@KafkaListener(topics = "${spring.kafka.topic.name}")
public void listen(......) {
}

@KafkaListener(topics = "${spring.kafka.deadletter.topic.name}")
public void listenDlt(......) {
}
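For example, the two properties could look like this in application.yml (the topic values are only illustrative):

spring:
  kafka:
    topic:
      name: messageTopic
    deadletter:
      topic:
        name: messageTopic_deadLetter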
To refer to the topic name inside send(...) from the YAML or properties file:
@Component
@KafkaListener(topics = "${spring.kafka.deadletter.topic.name}")
public class KafkaProducer {

    @Value("${spring.kafka.deadletter.topic.name}")
    private String DLT_TOPIC_NAME;

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendDeadLetter(String message) {
        kafkaTemplate.send(DLT_TOPIC_NAME, message);
    }
}
You can construct the topic name with SpEL:
@KafkaListener(topics = "#{'${spring.kafka.topic.name}' + '_deadLetter'}")
Note the single quotes around the property placeholder and literal.
This example may not be relevant to your use case, but sharing in case it's helpful to someone.
If you are building a Kafka Streams application, variable sink topic names can be achieved as follows.
When producing to the sink topic, pass a lambda that receives the record context and delegates the name definition to a method:
... /* preceding stream operations */
// terminal operation 'to'
.to(
        (key, value, recordContext) -> sinkTopicNameGenerator(recordContext),
        Produced.with(keySerde, valueSerde)   // the Serdes for your key and value types
);
Implement the method that generates the sink topic names:
protected static String sinkTopicNameGenerator(RecordContext ctx) {
    return ctx.topic().concat("_deadLetter");
}
The above example is simple enough to be simplified to (k, v, ctx) -> ctx.topic().concat("_deadLetter"), but I wanted to keep the separate method approach for cases where further transformations are required, i.e. when part of the topic name will be replaced by some constant or regex defined in the config file.

Not able to filter messages received using the condition attribute of the Spring Cloud Stream @StreamListener annotation

I am trying to create an event-based system for communication between services, using Apache Kafka as the messaging system and Spring Cloud Stream Kafka.
I have written my receiver class methods as below.
@StreamListener(target = Sink.INPUT, condition = "headers['eventType']=='EmployeeCreatedEvent'")
public void handleEmployeeCreatedEvent(@Payload String payload) {
    logger.info("Received EmployeeCreatedEvent: " + payload);
}
This method specifically catches messages or events related to EmployeeCreatedEvent.
@StreamListener(target = Sink.INPUT, condition = "headers['eventType']=='EmployeeTransferredEvent'")
public void handleEmployeeTransferredEvent(@Payload String payload) {
    logger.info("Received EmployeeTransferredEvent: " + payload);
}
This method specifically catches messages or events related to EmployeeTransferredEvent.
@StreamListener(target = Sink.INPUT)
public void handleDefaultEvent(@Payload String payload) {
    logger.info("Received payload: " + payload);
}
This is the default method.
When I run the application, the methods annotated with the condition attribute are never called; I only see the handleDefaultEvent method being called.
I am sending a message to this receiver application from the sending/source app using the CustomMessageSource class below:
@Component
@EnableBinding(Source.class)
public class CustomMessageSource {

    @Autowired
    private Source source;

    public void sendMessage(String payload, String eventType) {
        Message<String> myMessage = MessageBuilder.withPayload(payload)
                .setHeader("eventType", eventType)
                .build();
        source.output().send(myMessage);
    }
}
I am calling the method from my controller in the source app as below:
customMessageSource.sendMessage("Hello","EmployeeCreatedEvent");
The customMessageSource instance is autowired as below,
@Autowired
CustomMessageSource customMessageSource;
Basically, I would like to filter the messages received by the sink/receiver application and handle them accordingly.
For this I have used the @StreamListener annotation with the condition attribute to handle the different events.
I am using Spring Cloud Stream version Chelsea.SR2.
Can someone help me resolve this issue?
It seems the headers are not being propagated. Make sure you include the custom header in the spring.cloud.stream.kafka.binder.headers property: http://docs.spring.io/autorepo/docs/spring-cloud-stream-docs/Chelsea.SR2/reference/htmlsingle/#_kafka_binder_properties
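For example, in the application.yml on the binder side (a minimal sketch, using the eventType header from the question):

spring:
  cloud:
    stream:
      kafka:
        binder:
          headers: eventType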

How to get a piece of identifiable data when writing to an AWS SQS Queue?

I've got an app that needs to track the transaction id (or something similar) of a message sent to an SQS queue.
If I send with the following class, how could I get back some piece of identifiable data without querying the queue?
@Component
public class SqsQueueDao {

    private static final Logger LOGGER = LoggerFactory.getLogger(SqsQueueDao.class);

    private final QueueMessagingTemplate queueMessagingTemplate;

    @Autowired
    @Qualifier("awsClient")
    AmazonSQSAsyncClient amazonSQSAsyncClient;

    public SqsQueueDao(AmazonSQSAsync amazonSQSAsyncClient) {
        this.queueMessagingTemplate = new QueueMessagingTemplate(amazonSQSAsyncClient);
    }

    // TODO: implement a strategy for identifying the message id
    public Long send(String queueName, String message) {
        queueMessagingTemplate.convertAndSend(queueName, MessageBuilder.withPayload(message).build());
        // return some long identifying data
    }
}
SQS assigns a message ID, but the queueMessagingTemplate.convertAndSend method doesn't return anything. If you send the message using the SQS client directly, you get back a SendMessageResult object that carries the message ID. However, the SQS message ID is a String, not a number, so you still wouldn't be able to fulfill your contract of returning a Long.
If you can return a String message ID instead of a Long, then the code would look like this:
public String send(String queueName, String message) {
    // Could probably cache this URL instead of looking it up each time
    String queueUrl = amazonSQSAsyncClient.getQueueUrl(queueName).getQueueUrl();
    SendMessageResult result = amazonSQSAsyncClient.sendMessage(queueUrl, message);
    return result.getMessageId();
}
