Edit FYI: working GitHub example
I was searching the internet and couldn't find a working, simple example of an embedded Kafka test.
My setup is:
Spring Boot
Multiple @KafkaListener methods with different topics in one class
Embedded Kafka for the test, which starts fine
A test with KafkaTemplate that sends to the topic, but the
@KafkaListener methods receive nothing, even after a long sleep time
No warnings or errors are shown, only INFO spam from Kafka in the logs
Please help me. Most examples out there are over-configured or over-engineered. I am sure it can be done simply.
Thanks, guys!
@Controller
public class KafkaController {
private static final Logger LOG = getLogger(KafkaController.class);
@KafkaListener(topics = "test.kafka.topic")
public void receiveDunningHead(final String payload) {
LOG.debug("Receiving event with payload [{}]", payload);
// I will do database stuff here which I could check in the DB for testing
}
}
private static final String SENDER_TOPIC = "test.kafka.topic";
@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, SENDER_TOPIC);
@Test
public void testSend() throws InterruptedException, ExecutionException {
Map<String, Object> senderProps = KafkaTestUtils.producerProps(embeddedKafka);
KafkaProducer<Integer, String> producer = new KafkaProducer<>(senderProps);
producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 0, "message00")).get();
producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 1, "message01")).get();
producer.send(new ProducerRecord<>(SENDER_TOPIC, 1, 0, "message10")).get();
Thread.sleep(10000);
}
Embedded Kafka tests work for me with the configs below.
Annotations on the test class:
@EnableKafka
@SpringBootTest(classes = {KafkaController.class}) // Specify the @KafkaListener class if it's not the same class, or not loaded with the test config
@EmbeddedKafka(
partitions = 1,
controlledShutdown = false,
brokerProperties = {
"listeners=PLAINTEXT://localhost:3333",
"port=3333"
})
public class KafkaConsumerTest {
@Autowired
KafkaEmbedded kafkaEmbedded;
@Autowired
KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;
Setup method annotated with @Before:
@Before
public void setUp() throws Exception {
for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
ContainerTestUtils.waitForAssignment(messageListenerContainer,
kafkaEmbedded.getPartitionsPerTopic());
}
}
Note: I am not using @ClassRule to create the embedded Kafka; instead I auto-wire it with @Autowired.
@Test
public void testReceive() throws Exception {
kafkaTemplate.send(topic, data);
}
Hope this helps!
Edit: Test configuration class marked with @TestConfiguration
@TestConfiguration
public class TestConfig {
// kafkaEmbedded and topic are assumed to be provided by the surrounding test setup
@Bean
public ProducerFactory<String, String> producerFactory() {
return new DefaultKafkaProducerFactory<>(KafkaTestUtils.producerProps(kafkaEmbedded));
}
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
KafkaTemplate<String, String> kafkaTemplate = new KafkaTemplate<>(producerFactory());
kafkaTemplate.setDefaultTopic(topic);
return kafkaTemplate;
}
}
Now the @Test method can autowire the KafkaTemplate and use it to send a message:
kafkaTemplate.send(topic, data);
(I updated the answer's code block with the line above.)
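Put together, the sending side of the test looks roughly like this (a sketch; topic and data are the same placeholders as above):

@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

@Test
public void testReceive() throws Exception {
    // the template comes from TestConfig; topic/data are whatever your listener expects
    kafkaTemplate.send(topic, data);
    // then assert on the listener's side effects (e.g. rows written to the database)
}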
Since the accepted answer doesn't compile or work for me, I found another solution based on https://blog.mimacom.com/testing-apache-kafka-with-spring-boot/ which I would like to share with you.
The dependency is 'spring-kafka-test' version: '2.2.7.RELEASE'
@RunWith(SpringRunner.class)
@EmbeddedKafka(partitions = 1, topics = { "testTopic" })
@SpringBootTest
public class SimpleKafkaTest {
private static final String TEST_TOPIC = "testTopic";
@Autowired
EmbeddedKafkaBroker embeddedKafkaBroker;
@Test
public void testReceivingKafkaEvents() {
Consumer<Integer, String> consumer = configureConsumer();
Producer<Integer, String> producer = configureProducer();
producer.send(new ProducerRecord<>(TEST_TOPIC, 123, "my-test-value"));
ConsumerRecord<Integer, String> singleRecord = KafkaTestUtils.getSingleRecord(consumer, TEST_TOPIC);
assertThat(singleRecord).isNotNull();
assertThat(singleRecord.key()).isEqualTo(123);
assertThat(singleRecord.value()).isEqualTo("my-test-value");
consumer.close();
producer.close();
}
private Consumer<Integer, String> configureConsumer() {
Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("testGroup", "true", embeddedKafkaBroker);
consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
Consumer<Integer, String> consumer = new DefaultKafkaConsumerFactory<Integer, String>(consumerProps)
.createConsumer();
consumer.subscribe(Collections.singleton(TEST_TOPIC));
return consumer;
}
private Producer<Integer, String> configureProducer() {
Map<String, Object> producerProps = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
return new DefaultKafkaProducerFactory<Integer, String>(producerProps).createProducer();
}
}
I have solved the issue now:
@BeforeClass
public static void setUpBeforeClass() {
System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}
While I was debugging, I saw that the embedded Kafka server takes a random port.
I couldn't find the configuration for it, so I am setting the Kafka config to match the server's. It still looks a bit ugly to me.
I would love to have just the line @Mayur mentioned,
@EmbeddedKafka(partitions = 1, controlledShutdown = false, brokerProperties = {"listeners=PLAINTEXT://localhost:9092", "port=9092"})
but can't find the right dependency on the internet (the annotation ships with spring-kafka-test).
In integration testing, fixed ports like 9092 are not recommended, because multiple tests should have the flexibility to open their own ports for embedded instances. So the following implementation takes that approach.
NB: this implementation is based on JUnit 5 (Jupiter 5.7.0) and Spring Boot 2.3.4.RELEASE.
Test class:
@EnableKafka
@SpringBootTest(classes = {ConsumerTest.Config.class, Consumer.class})
@EmbeddedKafka(
partitions = 1,
controlledShutdown = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class ConsumerTest {
@Autowired
private EmbeddedKafkaBroker kafkaEmbedded;
@Autowired
private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;
@BeforeAll
public void setUp() throws Exception {
for (final MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
ContainerTestUtils.waitForAssignment(messageListenerContainer,
kafkaEmbedded.getPartitionsPerTopic());
}
}
@Value("${topic.name}")
private String topicName;
@Autowired
private KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate;
@Test
public void consume_success() {
requestKafkaTemplate.send(topicName, load);
}
@Configuration
@Import({
KafkaListenerConfig.class,
TopicConfig.class
})
public static class Config {
@Value(value = "${spring.kafka.bootstrap-servers}")
private String bootstrapAddress;
@Bean
public ProducerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestProducerFactory() {
final Map<String, Object> configProps = new HashMap<>();
configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
return new DefaultKafkaProducerFactory<>(configProps);
}
@Bean
public KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate() {
return new KafkaTemplate<>(requestProducerFactory());
}
}
}
Listener Class:
@Component
public class Consumer {
@KafkaListener(
topics = "${topic.name}",
containerFactory = "resolvedTreeListenerContainerFactory" // must match the factory bean defined in KafkaListenerConfig
)
public void listener(
final ConsumerRecord<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> consumerRecord,
final @Payload Optional<Map<String, List<ImmutablePair<String, String>>>> payload
) {
}
}
Listener config:
@Configuration
public class KafkaListenerConfig {
@Value(value = "${spring.kafka.bootstrap-servers}")
private String bootstrapAddress;
@Value(value = "${topic.name}")
private String resolvedTreeQueueName;
@Bean
public ConsumerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeConsumerFactory() {
final Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
props.put(ConsumerConfig.GROUP_ID_CONFIG, resolvedTreeQueueName);
return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new CustomDeserializer());
}
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeListenerContainerFactory() {
final ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(resolvedTreeConsumerFactory());
return factory;
}
}
TopicConfig:
@Configuration
public class TopicConfig {
@Value(value = "${spring.kafka.bootstrap-servers}")
private String bootstrapAddress;
@Value(value = "${topic.name}")
private String requestQueue;
@Bean
public KafkaAdmin kafkaAdmin() {
Map<String, Object> configs = new HashMap<>();
configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
return new KafkaAdmin(configs);
}
@Bean
public NewTopic requestTopic() {
return new NewTopic(requestQueue, 1, (short) 1);
}
}
application.properties:
spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}
This assignment is the most important one: it binds the embedded instance's randomly chosen port to the KafkaTemplate and the @KafkaListeners (EmbeddedKafkaBroker exposes its broker addresses via the spring.embedded.kafka.brokers system property, which the placeholder above resolves).
Following the above implementation, you can open dynamic ports per test class, which is more convenient.
What am I trying to do? I am new to Spring Integration and have already read many similar questions about error handling, but I don't understand how to catch exceptions using the error-channel.
What I have done so far:
@EnableIntegration
@IntegrationComponentScan
@Configuration
public class TcpClientConfig implements ApplicationEventPublisherAware {
private ApplicationEventPublisher applicationEventPublisher;
private final ConnectionProperty connectionProperty;
@Override
public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
this.applicationEventPublisher = applicationEventPublisher;
}
TcpClientConfig(ConnectionProperty connectionProperty) {
this.connectionProperty = connectionProperty;
}
@Bean
public AbstractClientConnectionFactory clientConnectionFactory() {
TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
getTcpNioClientConnectionFactoryOf(
connectionProperty.getPrimaryHSMServerIpAddress(),
connectionProperty.getPrimaryHSMServerPort());
final List<AbstractClientConnectionFactory> fallBackConnections = getFallBackConnections();
fallBackConnections.add(tcpNioClientConnectionFactory);
final FailoverClientConnectionFactory failoverClientConnectionFactory =
new FailoverClientConnectionFactory(fallBackConnections);
return new CachingClientConnectionFactory(
failoverClientConnectionFactory, connectionProperty.getConnectionPoolSize());
}
@Bean
DefaultTcpNioSSLConnectionSupport connectionSupport() {
final DefaultTcpSSLContextSupport defaultTcpSSLContextSupport =
new DefaultTcpSSLContextSupport(
connectionProperty.getKeystorePath(),
connectionProperty.getTrustStorePath(),
connectionProperty.getKeystorePassword(),
connectionProperty.getTruststorePassword());
final String protocol = "TLSv1.2";
defaultTcpSSLContextSupport.setProtocol(protocol);
return new DefaultTcpNioSSLConnectionSupport(defaultTcpSSLContextSupport, false);
}
@Bean
public MessageChannel outboundChannel() {
return new DirectChannel();
}
@Bean
@ServiceActivator(inputChannel = "outboundChannel")
public MessageHandler outboundGateway(AbstractClientConnectionFactory clientConnectionFactory) {
TcpOutboundGateway tcpOutboundGateway = new TcpOutboundGateway();
tcpOutboundGateway.setConnectionFactory(clientConnectionFactory);
return tcpOutboundGateway;
}
@Bean
@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
throw new RuntimeException(String.valueOf(em));
}
private List<AbstractClientConnectionFactory> getFallBackConnections() {
final int size = connectionProperty.getAdditionalHSMServersConfig().size();
List<AbstractClientConnectionFactory> collector = new ArrayList<>(size);
for (final Map.Entry<String, Integer> server :
connectionProperty.getAdditionalHSMServersConfig().entrySet()) {
collector.add(getTcpNioClientConnectionFactoryOf(server.getKey(), server.getValue()));
}
return collector;
}
private TcpNioClientConnectionFactory getTcpNioClientConnectionFactoryOf(
final String ipAddress, final int port) {
TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
new TcpNioClientConnectionFactory(ipAddress, port);
tcpNioClientConnectionFactory.setUsingDirectBuffers(true);
tcpNioClientConnectionFactory.setDeserializer(new CustomDeserializer());
tcpNioClientConnectionFactory.setApplicationEventPublisher(applicationEventPublisher);
tcpNioClientConnectionFactory.setSoKeepAlive(true);
tcpNioClientConnectionFactory.setConnectTimeout(connectionProperty.getConnectionTimeout());
tcpNioClientConnectionFactory.setSoTcpNoDelay(true);
tcpNioClientConnectionFactory.setTcpNioConnectionSupport(connectionSupport());
return tcpNioClientConnectionFactory;
}
}
Gateway
@Component
@MessagingGateway(defaultRequestChannel = "outboundChannel", errorChannel = "error-channel")
public interface TcpClientGateway {
String send(String message);
}
Also, I am currently facing:
required a bean of type 'org.springframework.messaging.support.ErrorMessage' that could not be found
I need some assistance!
Thanking you in advance.
EDIT
@AllArgsConstructor
@Service
public class AsyncNonBlockingClient implements Connector {
TcpClientGateway tcpClientGateway;
@Override
public String send(final String payload) {
return tcpClientGateway.send(payload);
}
}
See the documentation about messaging annotations; your problem is described here: https://docs.spring.io/spring-integration/docs/current/reference/html/configuration.html#annotations_on_beans
@Bean
@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
This is a plain POJO method, therefore it cannot be marked with @Bean. You use @Bean only for beans you want to expose. Then you decide whether it has to be a @ServiceActivator or not. So, just remove @Bean from this method and your error-channel consumer should be OK.
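For clarity, a minimal sketch of the corrected consumer: the same method from the question with @Bean removed, leaving a plain @ServiceActivator method:

@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
    // handle or log the failed message; rethrowing propagates the error back to the gateway caller
    throw new RuntimeException(String.valueOf(em));
}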
I have a service that sends a message:
@Service
class ExportTaskService {
@Autowired
private KafkaTemplate<String, Object> template;
public void exportNewTask(ImportTaskRequest req) {
template.send("my-topic-name", req);
}
}
I configured the beans consumerFactory, producerFactory, and kafkaTemplate (in src/main/java).
If I run the application and execute the method, everything is OK and the message reaches the real message broker.
Now I need a Spring test that calls ExportTaskService.exportNewTask(request) and waits for the message on the same topic.
My code, which is not working (I can't receive the message):
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
@TestPropertySource(locations="classpath:test.properties")
@EnableKafka
@EmbeddedKafka(
topics = "new-bitrix-leads", ports = 9092
)
public class ExportingLeadTests {
@Autowired
private EmbeddedKafkaBroker embeddedKafkaBroker;
@Autowired
ExportTaskService exportTaskService;
@Autowired
ConsumerFactory<String, Object> consumerFactory;
@Test
public void test() throws InterruptedException {
assert(embeddedKafkaBroker != null);
assert(exportTaskService != null);
Consumer<String, Object> consumer = consumerFactory.createConsumer();
consumer.subscribe(Collections.singletonList("new-bitrix-leads"));
exportTaskService.exportNewTask(ImportTaskRequest.builder()
.description("descr")
.title("title")
.build());
ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(3));
assert (records.count() == 1);
}
}
How can I read this message? What do I need to do? I have no ideas...
Big thanks! :)
My simple solution is:
@SpringBootTest
@DirtiesContext
@TestPropertySource(locations="classpath:test.properties")
@EnableKafka
@EmbeddedKafka(
topics = "new-bitrix-leads", ports = 9092
)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class ExportingLeadTests {
private BlockingQueue<ConsumerRecord<String, Object>> records;
private KafkaMessageListenerContainer<String, Object> container;
@Autowired
private EmbeddedKafkaBroker embeddedKafkaBroker;
@Autowired
ExportTaskService exportTaskService;
@BeforeAll
void setUp() {
DefaultKafkaConsumerFactory<String, Object> consumerFactory = new DefaultKafkaConsumerFactory<>(getConsumerProperties());
ContainerProperties containerProperties = new ContainerProperties("new-bitrix-leads");
container = new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);
records = new LinkedBlockingQueue<>();
container.setupMessageListener((MessageListener<String, Object>) e -> records.add(e));
container.start();
ContainerTestUtils.waitForAssignment(container, embeddedKafkaBroker.getPartitionsPerTopic());
}
@AfterAll
void tearDown() {
container.stop();
}
private Map<String, Object> getConsumerProperties() {
Map<String, Object> map = new HashMap<>();
map.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker.getBrokersAsString());
map.put(ConsumerConfig.GROUP_ID_CONFIG, "consumer");
map.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
map.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "10");
map.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "60000");
map.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
map.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
map.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
return map;
}
@Test
public void test() throws InterruptedException {
exportTaskService.exportNewTask(ImportTaskRequest.builder()
.description("descr")
.title("title")
.gclid("gclid")
.id("id")
.name("my name")
.website("http://website.com")
.yclid("yclid")
.source("Source")
.build());
ConsumerRecord<String, Object> record = records.poll(5, TimeUnit.SECONDS);
assert (record != null);
assertThat(record.value().toString(), containsString("gclid"));
}
}
This really works. Nice!
I have been trying to write a simple Kafka listener unit test using KafkaEmbedded, but my listener never gets invoked. I have been using this link for inspiration, since I also need an Avro serializer/deserializer.
Below is how my test class looks:
@ExtendWith(SpringExtension.class)
public class KafkaTest {
public static final String TOPIC_2 = "topic";
@Autowired
private Service listener;
@Autowired
private SomeClient someClient;
@Autowired
private KafkaTemplate<String, Avro> template;
@Test
void testSimple() {
template.send(TOPIC_2, "test", Avro.newBuilder()
.setGameProvider("abcd")
.setMessageName("someMessageName")
.setRequestId(UUID.randomUUID().toString())
.setBody(UUID.randomUUID().toString())
.build());
verify(someClient).register(anyString(), anyString(), anyString(), anyString());
template.flush();
}
@Configuration
@EnableKafka
public static class Config {
@Bean
public EmbeddedKafkaBroker kafkaEmbedded() {
return new EmbeddedKafkaBroker(1, true, 1, TOPIC_2);
}
@Bean
public ConsumerFactory<String, Avro> createConsumerFactory() {
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
props.put(ConsumerConfig.GROUP_ID_CONFIG, "group-1");
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CustomKafkaAvroDeserializer.class);
props.put("schema.registry.url", "not-used");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
return new DefaultKafkaConsumerFactory<>(props);
}
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Avro> kafkaListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<String, Avro> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(createConsumerFactory());
return factory;
}
@MockBean
private SomeClient client;
@MockBean
private Marshaller marshaller;
@Bean
public SomeService listener() {
return new SomeService(client, marshaller, kafkaTemplate());
}
@Bean
public ProducerFactory<String, Avro> producerFactory() {
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
props.put(ProducerConfig.RETRIES_CONFIG, 1);
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CustomKafkaAvroSerializer.class);
props.put("schema.registry.url", "not-used");
return new DefaultKafkaProducerFactory<>(props);
}
@Bean
public KafkaTemplate<String, Avro> kafkaTemplate() {
return new KafkaTemplate<>(producerFactory());
}
}
}
public class CustomKafkaAvroDeserializer extends KafkaAvroDeserializer {
@Override
public Object deserialize(String topic, byte[] bytes) {
if (topic.equals("topic")) {
this.schemaRegistry = getMockClient(SafeReport.SCHEMA$);
}
return super.deserialize(topic, bytes);
}
private static SchemaRegistryClient getMockClient(final Schema schema$) {
return new MockSchemaRegistryClient() {
@Override
public synchronized Schema getById(int id) {
return schema$;
}
};
}
}
public class CustomKafkaAvroSerializer extends KafkaAvroSerializer {
public CustomKafkaAvroSerializer() {
super();
super.schemaRegistry = new MockSchemaRegistryClient();
}
public CustomKafkaAvroSerializer(SchemaRegistryClient client) {
super(new MockSchemaRegistryClient());
}
public CustomKafkaAvroSerializer(SchemaRegistryClient client, Map<String, ?> props) {
super(new MockSchemaRegistryClient(), props);
}
}
@Service
@EnableBinding(Sink.class)
public class SomeService {
@KafkaListener(topics = "topic", groupId = "group-1")
public void listen(ConsumerRecord<String, Avro> cr) {
System.out.println(String.format("#### -> Consumed message -> %s", cr.toString()));
}
}
My listener is never invoked when I run this test.
I have a Spring Boot service that consumes from a Kafka topic. When I consume, I perform certain tasks on the Kafka message, but before I can perform these operations I need to wait for the service to load some data into caches that I have set up. My issue is that if I set the Kafka consumer to auto-start, it starts consuming before the cache loads and errors out.
I am trying to explicitly start the consumer after I load the cache; however, I get null pointer exceptions.
@Configuration
public class KafkaConfig {
@Value("${kafka.server}")
String server;
@Value("${kafka.port}")
String port;
@Value("${kafka.group.id}")
String groupid;
@Bean
public ConsumerFactory<String, String> consumerFactory() {
Map<String, Object> config = new HashMap<>();
config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, server+":"+port);
config.put(ConsumerConfig.GROUP_ID_CONFIG, groupid);
config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
// config.put("security.protocol","SASL_PLAINTEXT");
// config.put("sasl.kerberos.service.name","kafka");
return new DefaultKafkaConsumerFactory<>(config);
}
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory());
factory.setAutoStartup(false);
return factory;
}
}
KafkaListener:
@Service
public class KafkaConsumer {
@Autowired
AggregationService aggregationService;
@Autowired
private KafkaListenerEndpointRegistry registry;
private final CounterService counterService;
public KafkaConsumer(CounterService counterService) {
this.counterService = counterService;
}
@KafkaListener(topics = "gliTransactionTopic", group = "gliDecoupling", id = "gliKafkaListener")
public boolean consume(String message,
@Header(KafkaHeaders.RECEIVED_PARTITION_ID) Integer partition,
@Header(KafkaHeaders.OFFSET) Long offset) throws ParseException {
System.out.println("Inside kafka listener :" + message+" partition :"+partition.toString()+" offset :"+offset.toString());
aggregationService.run();
return true;
}
}
Service to start/stop the listener:
@Service
public class DecouplingController {
@Autowired
private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;
public void stop() {
MessageListenerContainer listenerContainer = kafkaListenerEndpointRegistry
.getListenerContainer("gliKafkaListener");
listenerContainer.stop();
}
public void start() {
MessageListenerContainer listenerContainer = kafkaListenerEndpointRegistry
.getListenerContainer("gliKafkaListener");
listenerContainer.start();
}
}
Main method:
@SpringBootApplication
public class DecouplingApplication {
Ignite ignite;
static IgniteCache<Long, MappingsEntity> mappingsCache;
public static void main(String[] args) {
SpringApplication.run(DecouplingApplication.class, args);
Ignition.setClientMode(true);
Ignite ignite = Ignition.ignite("ignite");
loadCaches(ignite);
}
public static boolean loadCaches(Ignite ignite) {
mappingsCache = ignite.getOrCreateCache("MappingsCache");
mappingsCache.loadCache(null);
System.out.println("Data Loaded");
DecouplingController dc=new DecouplingController();
dc.start();
return true;
}
}
Below is the exception
Data Loaded
Exception in thread "main" java.lang.NullPointerException
at com.ignite.spring.decoupling.controller.DecouplingController.start(DecouplingController.java:126)
at com.ignite.spring.decoupling.DecouplingApplication.loadCaches(DecouplingApplication.java:64)
at com.ignite.spring.decoupling.DecouplingApplication.main(DecouplingApplication.java:37)
Instead of manually creating an object of DecouplingController, autowire the dependency in DecouplingApplication:
@Autowired
DecouplingController decouplingController;
The ApplicationContext, which wires autowired dependencies, is not aware of the object you manually created using new. The autowired kafkaListenerEndpointRegistry is unknown to the new DecouplingController object you created, hence the NullPointerException.
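A minimal sketch of one way to wire this up (CacheStartupRunner is a made-up name for illustration; it assumes the Ignite cache-loading logic moves out of the static main method into a Spring-managed bean):

@Component
public class CacheStartupRunner implements ApplicationRunner {

    @Autowired
    private DecouplingController decouplingController;

    @Override
    public void run(ApplicationArguments args) {
        // load the Ignite caches first (the logic currently in loadCaches) ...
        // then start the listener through the Spring-managed controller, whose
        // autowired KafkaListenerEndpointRegistry has been properly injected
        decouplingController.start();
    }
}

Because the container factory sets setAutoStartup(false), the listener only begins consuming after this runner finishes loading the caches.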
It also seems the gliKafkaListener1 was not registered in some part of the ConsumerConfig/ListenerConfig.
How do I use li-apache-kafka-clients in a Spring Boot app to send large messages (above 1 MB) from a Kafka producer to a Kafka consumer? Below is the GitHub link of li-apache-kafka-clients:
https://github.com/linkedin/li-apache-kafka-clients
I have imported the .jar file of li-apache-kafka-clients and put the below configuration for the producer:
props.put("large.message.enabled", "true");
props.put("max.message.segment.bytes", 1000 * 1024);
props.put("segment.serializer", DefaultSegmentSerializer.class.getName());
and for the consumer:
message.assembler.buffer.capacity,
max.tracked.messages.per.partition,
exception.on.message.dropped,
segment.deserializer.class
but I am still getting an error for large messages. Please help me solve it.
Below is my code; please let me know where I need to create the LiKafkaProducer:
@Configuration
public class KafkaProducerConfig {
@Value("${kafka.boot.server}")
private String kafkaServer;
@Bean
public ProducerFactory<String, String> producerFactory() {
return new DefaultKafkaProducerFactory<>(producerConfig());
}
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
return new KafkaTemplate<String, String>(producerFactory());
}
@Bean
public Map<String, Object> producerConfig() {
// TODO Auto-generated method stub
Map<String, Object> config = new HashMap<>();
config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
config.put("bootstrap.servers", "localhost:9092");
config.put("acks", "all");
config.put("retries", 0);
config.put("batch.size", 16384);
config.put("linger.ms", 1);
config.put("buffer.memory", 33554432);
// The following properties are used by LiKafkaProducerImpl
config.put("large.message.enabled", "true");
config.put("max.message.segment.bytes", 1000 * 1024);
config.put("segment.serializer", DefaultSegmentSerializer.class.getName());
config.put("auditor.class", LoggingAuditor.class.getName());
return config;
}
}
@RestController
@RequestMapping("/kafkaProducer")
public class KafkaProducerController {
@Autowired
private KafkaSender sender;
@PostMapping
public ResponseEntity<List<Student>> sendData(@RequestBody List<Student> student) {
sender.sendData(student);
return new ResponseEntity<List<Student>>(student, HttpStatus.OK);
}
}
@Service
public class KafkaSender {
private static final Logger LOGGER = LoggerFactory.getLogger(KafkaSender.class);
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;
@Value("${kafka.topic.name}")
private String topicName;
public void sendData(List<Student> student) {
// TODO Auto-generated method stub
Map<String, Object> headers = new HashMap<>();
headers.put(KafkaHeaders.TOPIC, topicName);
headers.put("payload", student.get(0));
// Construct a JSONObject from a Map.
JSONObject HeaderObject = new JSONObject(headers);
System.out.println("\nMethod-2: Using new JSONObject() ==> " + HeaderObject);
final String record = HeaderObject.toString();
Message<String> message = MessageBuilder.withPayload(record).setHeader(KafkaHeaders.TOPIC, topicName)
.setHeader(KafkaHeaders.MESSAGE_KEY, "Message")
.build();
kafkaTemplate.send(topicName, message.toString());
}
}
You would need to implement your own ConsumerFactory and ProducerFactory to create the LiKafkaConsumer and LiKafkaProducer respectively.
You should be able to subclass the default factories provided by the framework.
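For example, a rough sketch of the producer side (this assumes LiKafkaProducerImpl implements Kafka's Producer interface and has a constructor accepting the config map; verify both against the library before relying on it):

import com.linkedin.kafka.clients.producer.LiKafkaProducerImpl;
import org.apache.kafka.clients.producer.Producer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import java.util.Map;

public class LiKafkaProducerFactory extends DefaultKafkaProducerFactory<String, String> {

    private final Map<String, Object> liConfigs;

    public LiKafkaProducerFactory(Map<String, Object> configs) {
        super(configs);
        this.liConfigs = configs;
    }

    @Override
    public Producer<String, String> createProducer() {
        // hands the large-message settings (large.message.enabled etc.) to LinkedIn's producer
        return new LiKafkaProducerImpl<>(liConfigs);
    }
}

Returning this factory from producerFactory() in KafkaProducerConfig (and building the KafkaTemplate from it) routes kafkaTemplate.send(...) through the large-message-aware producer; the consumer side would mirror this with a ConsumerFactory that creates a LiKafkaConsumerImpl.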