How to load a compacted topic into memory before starting the context - Spring

I'm using a compacted topic in Kafka which I load into a HashMap at application startup.
Then I listen to a normal topic for messages and process them using the HashMap built from the compacted topic.
How can I make sure the compacted topic is fully read and the HashMap fully initialized before the application starts listening to the other topics?
(Same for RestControllers.)

Implement SmartLifecycle and load the map in start(). Make sure the phase is earlier than that of any other component that needs the map.
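That suggestion can be sketched roughly as follows. This is a minimal sketch with assumed names (the topic name, cache type, and phase value are illustrative); the point is that start() blocks until the compacted topic is drained, and getPhase() returns a phase earlier than the listener containers, which by default start in a very late phase:

```java
@Component
public class CompactedTopicCacheLoader implements SmartLifecycle {

    private final ConsumerFactory<String, String> consumerFactory;
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private volatile boolean running;

    public CompactedTopicCacheLoader(ConsumerFactory<String, String> consumerFactory) {
        this.consumerFactory = consumerFactory;
    }

    @Override
    public void start() {
        // Drain the compacted topic before any @KafkaListener container starts.
        try (Consumer<String, String> consumer = consumerFactory.createConsumer()) {
            consumer.subscribe(List.of("my-compacted-topic")); // assumed topic name
            ConsumerRecords<String, String> records;
            do {
                records = consumer.poll(Duration.ofSeconds(2));
                records.forEach(r -> cache.put(r.key(), r.value()));
            } while (!records.isEmpty()); // empty poll => caught up
        }
        running = true;
    }

    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }

    @Override
    public int getPhase() {
        // Must be lower than the listener containers' phase so this runs first.
        return Integer.MAX_VALUE - 200;
    }
}
```

The "empty poll means caught up" check is the same heuristic used in the InitializingBean answer below; for a stronger guarantee you could compare the consumer position against endOffsets() instead.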

This is an old question, I know, but I wanted to provide a more complete code sample of the solution I ended up with when I struggled with this very problem myself.
The idea is that, as Gary mentioned in the comments on his own answer, a listener isn't the correct thing to use during initialization - that comes afterwards. An alternative to Gary's SmartLifecycle idea, however, is InitializingBean, which I find less complicated to implement, since it's only one method: afterPropertiesSet():
@Slf4j
@Configuration
@RequiredArgsConstructor
public class MyCacheInitializer implements InitializingBean {

    private final ApplicationProperties applicationProperties; // A custom ConfigurationProperties class
    private final KafkaProperties kafkaProperties;
    private final ConsumerFactory<String, Bytes> consumerFactory;
    private final MyKafkaMessageProcessor messageProcessor;

    @Override
    public void afterPropertiesSet() {
        String topicName = applicationProperties.getKafka().getConsumer().get("my-consumer").getTopic();
        Duration pollTimeout = kafkaProperties.getListener().getPollTimeout();

        try (Consumer<String, Bytes> consumer = consumerFactory.createConsumer()) {
            consumer.subscribe(List.of(topicName));
            log.info("Starting to cache the contents of {}", topicName);

            ConsumerRecords<String, Bytes> records;
            do {
                records = consumer.poll(pollTimeout);
                records.forEach(messageProcessor::process);
            } while (!records.isEmpty());
        }
        log.info("Completed caching {}", topicName);
    }
}
For brevity's sake I'm using Lombok's @Slf4j and @RequiredArgsConstructor annotations, but those can easily be replaced. The ApplicationProperties class is just my way of getting the name of the topic I'm interested in. It can be replaced with something else, but my implementation uses Lombok's @Data annotation and looks something like this:
@Data
@Configuration
@ConfigurationProperties(prefix = "app")
public class ApplicationProperties {

    private Kafka kafka = new Kafka();

    @Data
    public static class Kafka {
        private Map<String, KafkaConsumer> consumer = new HashMap<>();
    }

    @Data
    public static class KafkaConsumer {
        private String topic;
    }
}


KafkaListener Not triggered in Spring Boot test

I have a Spring Boot test that checks whether a Kafka consumer listens for a message on a specific topic. The Kafka listener is triggered when using @SpringBootTest. But I don't want to load all the classes, so I supplied only the listener class, like this: @SpringBootTest(classes = {KafkaConsumerTest.class}).
When only the consumer class is loaded, the listener no longer triggers. Is there something I am missing?
Here is the KafkaTestConsumer class
@Service
public class KafkaTestConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaTestConsumer.class);

    private CountDownLatch latch = new CountDownLatch(1);
    private String payload;

    @KafkaListener(topics = {"topic"})
    public void receive(ConsumerRecord<?, ?> consumerRecord) {
        payload = consumerRecord.toString();
        latch.countDown();
    }

    public CountDownLatch getLatch() {
        return latch;
    }

    public void resetLatch() {
        latch = new CountDownLatch(1);
    }

    public String getPayload() {
        return payload;
    }
}
It would be great to see what your KafkaConsumerTest looks like, but perhaps you just override the whole auto-configuration with a plain @Configuration.
See more in docs: https://docs.spring.io/spring-boot/docs/current/reference/html/features.html#features.testing.spring-boot-applications.detecting-configuration
If you want to customize the primary configuration, you can use a nested @TestConfiguration class. Unlike a nested @Configuration class, which would be used instead of your application's primary configuration, a nested @TestConfiguration class is used in addition to your application's primary configuration.
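Applied to the case above, that could look like the following sketch (class and bean names are assumptions; the test assumes a broker is available, e.g. via an embedded Kafka broker, and that a KafkaTemplate sends the record):

```java
@SpringBootTest
class KafkaConsumerTest {

    // Used in addition to the primary configuration, so auto-configured
    // Kafka infrastructure (consumer factory, listener container factory)
    // is still present.
    @TestConfiguration
    static class ListenerTestConfig {

        @Bean
        KafkaTestConsumer kafkaTestConsumer() {
            return new KafkaTestConsumer();
        }
    }

    @Autowired
    private KafkaTestConsumer consumer;

    @Test
    void listenerReceivesMessage() throws Exception {
        // ... send a record to "topic" with a KafkaTemplate here ...
        boolean received = consumer.getLatch().await(10, TimeUnit.SECONDS);
        assertTrue(received);
    }
}
```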

Why are we Injecting on constructor/method

I am new to Spring. Please help me understand the logic below.
I have searched Google and Stack Overflow, but nowhere was I able to find a clear and straightforward explanation.
Exporter.java
public class Exporter {

    @NonNull
    public final Subscription subscription;

    @Inject
    public Exporter(@Named(DOWNLOAD) final boolean download,
                    final Subscription subscription) {
        this.subscription = subscription;
    }
}
Subscription.java
@Getter
@Singleton
@AllArgsConstructor(onConstructor = @__(@Inject))
public class Subscription {
    // some logic
}
Why are we using @Inject here on the constructor of Exporter? What exactly does it do?
When does it get initialized?
Why are we required to use @AllArgsConstructor for the Subscription class?

Spring Boot Kafka ABSwitchCluster

I couldn't find any example of switching between Kafka clusters.
Has anyone implemented the ABSwitchCluster class from Spring Kafka?
https://docs.spring.io/spring-kafka/reference/html/
I tried the code below, but it's not switching clusters.
@RestController
public class ApacheKafkaWebController {

    @Autowired
    ConsumerKakfaConfiguration configuration;

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    @Autowired
    private ABSwitchCluster switcher;

    @GetMapping(value = "/switch")
    public String producer() {
        registry.stop();
        switcher.secondary();
        registry.start();
        return "switched!";
    }
}
and the switcher bean here:
@Bean
public ABSwitchCluster switcher() {
    return new ABSwitchCluster("127.0.0.1:9095", "127.0.0.1:9096");
}
Could you please tell me if I am missing anything here? It is still running on port 9095.
See this answer and this test.
Basically, you switch the cluster and reset the connections by stopping and starting listener containers and resetting the producer factory.
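A sketch of that sequence, under the assumption that the ABSwitchCluster bean has been registered as the bootstrap-servers supplier on both the consumer and producer factories (setBootstrapServersSupplier is available on the Default* factories in Spring Kafka 2.5+; without that registration, switching has no effect, which may be why the code above keeps using port 9095):

```java
// Somewhere in configuration, wire the switcher into both factories so all
// new connections resolve bootstrap servers through it (assumed bean setup):
// consumerFactory.setBootstrapServersSupplier(switcher);
// producerFactory.setBootstrapServersSupplier(switcher);

@GetMapping("/switch")
public String switchCluster() {
    registry.stop();          // stop all @KafkaListener containers
    switcher.secondary();     // subsequent connections target the secondary cluster
    producerFactory.reset();  // close cached producers so they reconnect
    registry.start();         // consumers reconnect with the new bootstrap servers
    return "switched!";
}
```

Here producerFactory is assumed to be an injected DefaultKafkaProducerFactory; its reset() closes the cached producer so the next send creates a fresh one against the switched cluster.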

Spring-Kafka: How to pass the kafka topic from the application.yml

I have a small project in Spring Kafka.
I would like to pass my Kafka topic from application.yml and avoid hard-coding it. At the moment I have this situation:
public class KafkaConsumer {

    @Autowired
    private UserRepository userRepository;

    @KafkaListener(topics = "myTopic")
    public void listen(@Validate UserDto userDto) {
        User user = new User(userDto);
        userRepository.save(userDto.getAge(), user);
    }
}
At the moment the Kafka topic is a static string. Is it possible to put it in application.yml and have it read from there? Thanks everyone for any help.
You can put your topic in application.yml (note the spring: prefix, which the placeholder below expects):
spring:
  kafka:
    template:
      default-topic: "MyTopic"
In your KafkaListener:
@KafkaListener(topics = "#{'${spring.kafka.template.default-topic}'}")
This solves the problem of the annotation attribute failing to accept a dynamic value.
This worked for me.
You can use the entry below in the application.yml file.
Usually we use @Value as below to pick data from properties/yaml files for a specified key in your Java class:
@Value("${kafka.topic.name}")
private String TOPIC_NAME;
Since the annotation attribute must be a constant expression, you cannot use the injected field, but you can use the property placeholder directly, as below:
public class KafkaConsumer {

    @Autowired
    private UserRepository userRepository;

    @KafkaListener(topics = "${kafka.topic.name}")
    public void listen(@Validate UserDto userDto) {
        User user = new User(userDto);
        userRepository.save(userDto.getAge(), user);
    }
}

Spring Integration LoggingHandler logs all messages at ERROR level

I created a Spring Boot application that sends messages through a PublishSubscribeChannel. This channel is autowired as the SubscribableChannel interface.
I am subscribing only one MessageHandler to this channel, a KafkaProducerMessageHandler.
My problem is that one additional MessageHandler is subscribed, and this is a LoggingHandler. It is instantiated with ERROR level, so I see every message logged as an error.
I want to know why and where this LoggingHandler is wired (instantiated) and why it is subscribed to the channel - I want to disable it.
(I debugged around a bit, but it was not really helpful:
The LoggingHandler is instantiated and subscribed after the KafkaHandler.
I see this call chain: EventDrivenConsumer.doStart() <-- ConsumerEndpointFactoryBean.initializeEndpoint() <-- ... down to reflective calls.)
EDIT
As suggested in the comments, here is some code (I can't share the whole project). My problem is that the code can't explain the behavior. The LoggingHandler is being subscribed to my PublishSubscribeChannel for some unknown reason, and it is instantiated with error as its level for some unknown reason.
The class that subscribes the KafkaHandler:
@Component
public class EventRelay {

    @Autowired
    private EventRelay(SubscribableChannel eventBus,
                       @Qualifier(KafkaProducerConfig.KAFKA_PRODUCER) MessageHandler kafka) {
        eventBus.subscribe(kafka);
    }
}
The class that sends events implements a proprietary interface with many callback methods:
public class PropEvents implements PropClass.IEvents {

    private SubscribableChannel eventBus;
    private final ObjectMapper om;
    private final String userId;

    public PropEvents(SubscribableChannel eventBus, ObjectMapper om, String userId) {
        this.eventBus = eventBus;
        this.om = om;
        this.userId = userId;
    }

    @Override
    public void onLogin() {
        eventBus.send(new OnLoginMessage(...));
    }

    // many other onXYZ methods
}
Here is the Factory that produces instances of PropEvents:
@Configuration
public class EventHandlerFactory {

    private final ObjectMapper om;
    private final SubscribableChannel eventBus;

    @Autowired
    public EventHandlerFactory(ObjectMapper om, SubscribableChannel eventBus) {
        this.om = checkNotNull(om);
        this.eventBus = checkNotNull(eventBus);
    }

    @Bean
    @Scope(SCOPE_PROTOTYPE)
    public IEvents getEventHandler(String userId) {
        if (Strings.isNullOrEmpty(userId)) {
            throw new IllegalArgumentException("user id must be set.");
        }
        return new PropEvents(eventBus, om, userId);
    }
}
I appreciate any help with debugging, or with tooling (e.g. Eclipse Spring Tools does not show any hint of a LoggingHandler bean), to find where and why a LoggingHandler is instantiated and subscribed to my autowired channel.
My current workaround is to disable logging for LoggingHandler.
My question at a glance:
Why does Spring instantiate a LoggingHandler with error level and subscribe it to my SubscribableChannel (provided by PublishSubscribeChannel)? How can I disable this?
When you @Autowired a SubscribableChannel, there has to be exactly one in the application context. That might be a bit confusing and misleading, but Spring Integration provides a PublishSubscribeChannel for the global errorChannel: https://docs.spring.io/spring-integration/docs/5.0.2.RELEASE/reference/html/messaging-channels-section.html#channel-special-channels
This one has a LoggingHandler as a default subscriber to log errors.
I don't think it is OK to base your logic on the errorChannel.
You should consider declaring your own MessageChannel bean and injecting it with a particular @Qualifier.
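Following that suggestion, a minimal sketch (the bean name eventBus is an assumption; with a dedicated bean, the autowired channel is no longer the global errorChannel, so its LoggingHandler never sees your messages):

```java
@Configuration
public class EventBusConfig {

    // A dedicated channel bean, distinct from the framework's errorChannel.
    @Bean
    public SubscribableChannel eventBus() {
        return new PublishSubscribeChannel();
    }
}

// Then inject it explicitly by qualifier wherever it is needed:
@Component
public class EventRelay {

    @Autowired
    private EventRelay(@Qualifier("eventBus") SubscribableChannel eventBus,
                       @Qualifier(KafkaProducerConfig.KAFKA_PRODUCER) MessageHandler kafka) {
        eventBus.subscribe(kafka);
    }
}
```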
