Trying to connect a containerized Spring Boot app to a containerized Kafka server - spring-boot

I have a Spring Boot application that worked fine with Kafka running in a container, but once I containerize the Spring Boot application itself, it can no longer connect.
This is the docker-compose file with which I created the Kafka and Zookeeper containers:
version: "3.4"
services:
  zookeeper:
    image: bitnami/zookeeper
    restart: always
    container_name: "zookeeper"
    ports:
      - "2181:2181"
    volumes:
      - "zookeeper_data:/bitnami"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: bitnami/kafka
    ports:
      - "9092:9092"
    restart: always
    container_name: "kafka"
    volumes:
      - "kafka_data:/bitnami"
    environment:
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
      - KAFKA_LISTENERS=PLAINTEXT://:9092
      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092
    depends_on:
      - zookeeper
volumes:
  zookeeper_data:
    driver: local
  kafka_data:
    driver: local
The application.yml of the Spring Boot application
server:
  port: 5001
spring:
  jpa:
    database-platform: org.hibernate.dialect.MySQL8Dialect
    show-sql: true
    hibernate:
      ddl-auto: update
  datasource:
    url: jdbc:mysql://mysql-container:3306/craproject?autoReconnect=true&useSSL=false&useSSL=false&serverTimezone=UTC&createDatabaseIfNotExist=true
    username: ***
    password: ****
  data:
    mongodb:
      host: mongo-container
      port: 27017
      database: craprojet
  kafka:
    bootstrap-servers:
      - kafka:9092
    consumer:
      group-id: project-group
      enable-auto-commit: false
      auto-offset-reset: latest
      isolation-level: read_committed
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
The Dockerfile with which I created the image of the app:
FROM openjdk:11
COPY target/project-service-1.jar project-service-1.jar
EXPOSE 5001
ENTRYPOINT ["java", "-jar" , "project-service-1.jar"]
The Kafka and database containers are running fine.
This is the command I use to run the Spring Boot app container:
docker run --name project-service \
  --network techbankNet \
  -p 5001:5001 \
  --link mysql-container:mysql \
  --link mongo-container:mongo \
  --link adminer:adminer \
  --link kafka:kafka project-service
The log:
2022-08-23 19:05:05.170 INFO 1 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = latest
bootstrap.servers = [kafka:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-project-group-1
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = project-group
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_committed
key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 127000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.springframework.kafka.support.serializer.JsonDeserializer
2022-08-23 19:05:15.315 WARN 1 --- [ main] org.apache.kafka.clients.ClientUtils : Couldn't resolve server kafka:9092 from bootstrap.servers as DNS resolution failed for kafka
2022-08-23 19:05:15.316 INFO 1 --- [ main] org.apache.kafka.common.metrics.Metrics : Metrics scheduler closed
2022-08-23 19:05:15.316 INFO 1 --- [ main] org.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter
2022-08-23 19:05:15.317 INFO 1 --- [ main] org.apache.kafka.common.metrics.Metrics : Metrics reporters closed
2022-08-23 19:05:15.319 INFO 1 --- [ main] o.a.kafka.common.utils.AppInfoParser : App info kafka.consumer for consumer-project-group-1 unregistered
2022-08-23 19:05:15.319 WARN 1 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
2022-08-23 19:05:15.340 INFO 1 --- [ main] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2022-08-23 19:05:15.343 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown initiated...
2022-08-23 19:05:15.365 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown completed.
2022-08-23 19:05:15.368 INFO 1 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2022-08-23 19:05:15.389 INFO 1 --- [ main] ConditionEvaluationReportLoggingListener :
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2022-08-23 19:05:15.420 ERROR 1 --- [ main] o.s.boot.SpringApplication : Application run failed
org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:181) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:54) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:356) ~[spring-context-5.3.9.jar!/:5.3.9]
at java.base/java.lang.Iterable.forEach(Iterable.java:75) ~[na:na]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:155) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:123) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:935) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:586) ~[spring-context-5.3.9.jar!/:5.3.9]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:145) ~[spring-boot-2.5.3.jar!/:2.5.3]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:754) ~[spring-boot-2.5.3.jar!/:2.5.3]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:434) ~[spring-boot-2.5.3.jar!/:2.5.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:338) ~[spring-boot-2.5.3.jar!/:2.5.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343) ~[spring-boot-2.5.3.jar!/:2.5.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1332) ~[spring-boot-2.5.3.jar!/:2.5.3]
at com.project.CQRS.ProjectServiceApplication.main(ProjectServiceApplication.java:16) ~[classes!/:1]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49) ~[project-service-1.jar:1]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108) ~[project-service-1.jar:1]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58) ~[project-service-1.jar:1]
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88) ~[project-service-1.jar:1]
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:819) ~[kafka-clients-2.7.1.jar!/:na]
at org.springframework.kafka.core.DefaultKafkaConsumerFactory.createRawConsumer(DefaultKafkaConsumerFactory.java:366) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.core.DefaultKafkaConsumerFactory.createKafkaConsumer(DefaultKafkaConsumerFactory.java:334) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.core.DefaultKafkaConsumerFactory.createConsumerWithAdjustedProperties(DefaultKafkaConsumerFactory.java:310) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.core.DefaultKafkaConsumerFactory.createKafkaConsumer(DefaultKafkaConsumerFactory.java:277) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.core.DefaultKafkaConsumerFactory.createConsumer(DefaultKafkaConsumerFactory.java:254) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.<init>(KafkaMessageListenerContainer.java:715) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.listener.KafkaMessageListenerContainer.doStart(KafkaMessageListenerContainer.java:320) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.listener.AbstractMessageListenerContainer.start(AbstractMessageListenerContainer.java:397) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.listener.ConcurrentMessageListenerContainer.doStart(ConcurrentMessageListenerContainer.java:205) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.listener.AbstractMessageListenerContainer.start(AbstractMessageListenerContainer.java:397) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.config.KafkaListenerEndpointRegistry.startIfNecessary(KafkaListenerEndpointRegistry.java:327) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.kafka.config.KafkaListenerEndpointRegistry.start(KafkaListenerEndpointRegistry.java:272) ~[spring-kafka-2.7.4.jar!/:2.7.4]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:178) ~[spring-context-5.3.9.jar!/:5.3.9]
... 22 common frames omitted
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:89) ~[kafka-clients-2.7.1.jar!/:na]
at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:48) ~[kafka-clients-2.7.1.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:728) ~[kafka-clients-2.7.1.jar!/:na]

You should put project-service (and adminer, mongo, and mysql) in the same Docker Compose file as Kafka rather than starting it with docker run.
Compose then creates a default bridge network on which the containers can reach each other by service name, so kafka:9092 resolves.
Alternatively, attach the techbankNet Docker network to the Kafka service in the compose file. A sketch of both options follows.
https://docs.docker.com/compose/networking/
Also see: Connect to Kafka running in Docker
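For example, a minimal compose sketch (the service names, the project-service image tag, and the techbankNet network come from the question; everything else is an assumption to adapt to your project):

services:
  project-service:
    image: project-service          # assumed tag for the image built from the Dockerfile above
    ports:
      - "5001:5001"
    depends_on:
      - kafka
    networks:
      - default                     # same default network as the kafka/zookeeper services
      - techbankNet                 # only needed if mysql/mongo/adminer stay on that network
  # ... zookeeper and kafka services exactly as in the compose file above ...

networks:
  techbankNet:
    external: true                  # reuse the already-created techbankNet network

If you keep starting the app with docker run instead, the Kafka container has to be attached to the same network as the app (for example, docker network connect techbankNet kafka) so that the hostname kafka resolves; --link only works between containers on the same network.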

Related

The Spring Boot service on a Windows machine cannot connect to the Kafka started with docker-compose on a cloud server

Please help!
I built a stand-alone Kafka service with docker-compose on a cloud server (IP: 49.234.12.199).
docker-compose.yml
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    depends_on: [ zookeeper ]
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 49.234.12.199
      KAFKA_CREATE_TOPICS: "test:1:1"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://49.234.12.199:9092
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
I can reach port 9092 of 49.234.12.199 with telnet, and I can also create topics normally with commands inside the container.
But on my local Windows machine, the service started by Spring Boot cannot reach the Kafka service in Docker, which is driving me crazy.
The relevant Spring Boot configuration is as follows:
maven
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.6.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <artifactId>boot-kafka</artifactId>
    <name>boot-kafka</name>
    <description>Kafka demo project for Spring Boot</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
application.yml
server:
  port: 9090
spring:
  kafka:
    bootstrap-servers: 49.234.12.199:9092
    consumer:
      auto-offset-reset: earliest
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      retries: 3
kafka:
  topic:
    my-topic: my-topic
    my-topic2: my-topic2
KafkaConfig.class
package org.liu.demo.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.converter.RecordMessageConverter;
import org.springframework.kafka.support.converter.StringJsonMessageConverter;

@Configuration
public class KafkaConfig {

    @Value("${kafka.topic.my-topic}")
    private String myTopic;

    @Value("${kafka.topic.my-topic2}")
    private String myTopic2;

    @Bean
    public RecordMessageConverter jsonConverter() {
        return new StringJsonMessageConverter();
    }

    @Bean
    public NewTopic myTopic() {
        return new NewTopic(myTopic, 2, (short) 1);
    }

    @Bean
    public NewTopic myTopic2() {
        return new NewTopic(myTopic2, 1, (short) 1);
    }
}
Finally, the error message on startup:
E:\Java\jdk1.8.0_152\bin\java.exe -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52858,suspend=y,server=n -XX:TieredStopAtLevel=1 -noverify -Dspring.output.ansi.enabled=always -Dcom.sun.management.jmxremote -Dspring.jmx.enabled=true -Dspring.liveBeansView.mbeanDomain -Dspring.application.admin.enabled=true -javaagent:C:\Users\yue.liu2\AppData\Local\JetBrains\IntelliJIdea2020.2\captureAgent\debugger-agent.jar -Dfile.encoding=UTF-8 -classpath "E:\Java\jdk1.8.0_152\jre\lib\charsets.jar;E:\Java\jdk1.8.0_152\jre\lib\deploy.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\access-bridge-64.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\cldrdata.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\dnsns.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\jaccess.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\jfxrt.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\localedata.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\nashorn.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\sunec.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\sunjce_provider.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\sunmscapi.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\sunpkcs11.jar;E:\Java\jdk1.8.0_152\jre\lib\ext\zipfs.jar;E:\Java\jdk1.8.0_152\jre\lib\javaws.jar;E:\Java\jdk1.8.0_152\jre\lib\jce.jar;E:\Java\jdk1.8.0_152\jre\lib\jfr.jar;E:\Java\jdk1.8.0_152\jre\lib\jfxswt.jar;E:\Java\jdk1.8.0_152\jre\lib\jsse.jar;E:\Java\jdk1.8.0_152\jre\lib\management-agent.jar;E:\Java\jdk1.8.0_152\jre\lib\plugin.jar;E:\Java\jdk1.8.0_152\jre\lib\resources.jar;E:\Java\jdk1.8.0_152\jre\lib\rt.jar;E:\个人项目\boot-related-frame-study\boot-kafka\target\classes;E:\Maven\Repository\org\springframework\boot\spring-boot-starter-web\2.1.6.RELEASE\spring-boot-starter-web-2.1.6.RELEASE.jar;E:\Maven\Repository\org\springframework\boot\spring-boot-starter\2.1.6.RELEASE\spring-boot-starter-2.1.6.RELEASE.jar;E:\Maven\Repository\org\springframework\boot\spring-boot\2.1.6.RELEASE\spring-boot-2.1.6.RELEASE.jar;E:\Maven\Repository\org\springframework\boot\spring-boot-autoconfigure\2.1.6.RELEASE\spring-boot-autoconfigure-2.1.6.RELEASE.jar;E:\Maven\Repository\org\springframework\boot\spring-boot-starter-logging\2.1.6.RELEASE\spring-boot-starter-logging-2.1.6.RELEASE.jar;E:\Maven\Repository\ch\qos\logback\logback-classic\1.2.3\logback-classic-1.2.3.jar;E:\Maven\Repository\ch\qos\logback\logback-core\1.2.3\logback-core-1.2.3.jar;E:\Maven\Repository\org\apache\logging\log4j\log4j-to-slf4j\2.11.2\log4j-to-slf4j-2.11.2.jar;E:\Maven\Repository\org\apache\logging\log4j\log4j-api\2.11.2\log4j-api-2.11.2.jar;E:\Maven\Repository\org\slf4j\jul-to-slf4j\1.7.26\jul-to-slf4j-1.7.26.jar;E:\Maven\Repository\javax\annotation\javax.annotation-api\1.3.2\javax.annotation-api-1.3.2.jar;E:\Maven\Repository\org\yaml\snakeyaml\1.23\snakeyaml-1.23.jar;E:\Maven\Repository\org\springframework\boot\spring-boot-starter-json\2.1.6.RELEASE\spring-boot-starter-json-2.1.6.RELEASE.jar;E:\Maven\Repository\com\fasterxml\jackson\core\jackson-databind\2.9.9\jackson-databind-2.9.9.jar;E:\Maven\Repository\com\fasterxml\jackson\core\jackson-annotations\2.9.0\jackson-annotations-2.9.0.jar;E:\Maven\Repository\com\fasterxml\jackson\core\jackson-core\2.9.9\jackson-core-2.9.9.jar;E:\Maven\Repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk8\2.9.9\jackson-datatype-jdk8-2.9.9.jar;E:\Maven\Repository\com\fasterxml\jackson\datatype\jackson-datatype-jsr310\2.9.9\jackson-datatype-jsr310-2.9.9.jar;E:\Maven\Repository\com\fasterxml\jackson\module\jackson-module-parameter-names\2.9.9\jackson-module-parameter-names-2.9.9.jar;E:\Maven\Repository\org\springframework\boot\spring-boot-starter-tomcat\2.1.6.RELEASE\spring-boot-star
ter-tomcat-2.1.6.RELEASE.jar;E:\Maven\Repository\org\apache\tomcat\embed\tomcat-embed-core\9.0.21\tomcat-embed-core-9.0.21.jar;E:\Maven\Repository\org\apache\tomcat\embed\tomcat-embed-el\9.0.21\tomcat-embed-el-9.0.21.jar;E:\Maven\Repository\org\apache\tomcat\embed\tomcat-embed-websocket\9.0.21\tomcat-embed-websocket-9.0.21.jar;E:\Maven\Repository\org\hibernate\validator\hibernate-validator\6.0.17.Final\hibernate-validator-6.0.17.Final.jar;E:\Maven\Repository\javax\validation\validation-api\2.0.1.Final\validation-api-2.0.1.Final.jar;E:\Maven\Repository\org\jboss\logging\jboss-logging\3.3.2.Final\jboss-logging-3.3.2.Final.jar;E:\Maven\Repository\com\fasterxml\classmate\1.4.0\classmate-1.4.0.jar;E:\Maven\Repository\org\springframework\spring-web\5.1.8.RELEASE\spring-web-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-beans\5.1.8.RELEASE\spring-beans-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-webmvc\5.1.8.RELEASE\spring-webmvc-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-aop\5.1.8.RELEASE\spring-aop-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-expression\5.1.8.RELEASE\spring-expression-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\kafka\spring-kafka\2.2.7.RELEASE\spring-kafka-2.2.7.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-context\5.1.8.RELEASE\spring-context-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-messaging\5.1.8.RELEASE\spring-messaging-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-tx\5.1.8.RELEASE\spring-tx-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\retry\spring-retry\1.2.4.RELEASE\spring-retry-1.2.4.RELEASE.jar;E:\Maven\Repository\org\apache\kafka\kafka-clients\2.0.1\kafka-clients-2.0.1.jar;E:\Maven\Repository\org\lz4\lz4-java\1.4.1\lz4-java-1.4.1.jar;E:\Maven\Repository\org\xerial\snappy\snappy-java\1.1.7.1\snappy-java-1.1.7.1.jar;E:\Maven\Repository\org\slf4j\slf4j-api\1.7.26\slf4j-api-1.7.26.jar;E:\Maven\Repository\org\springframework\spring-core\5.1.8.RELEASE\spring-core-5.1.8.RELEASE.jar;E:\Maven\Repository\org\springframework\spring-jcl\5.1.8.RELEASE\spring-jcl-5.1.8.RELEASE.jar;D:\IntelliJ IDEA 2020.2\lib\idea_rt.jar" org.liu.demo.BootKafkaApplication
Connected to the target VM, address: '127.0.0.1:52858', transport: 'socket'
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.1.6.RELEASE)
2020-09-30 17:45:09.656 INFO 3268 --- [ main] org.liu.demo.BootKafkaApplication : Starting BootKafkaApplication on SH-CODE-PC0638 with PID 3268 (started by yue.liu2 in E:\个人项目\boot-related-frame-study)
2020-09-30 17:45:09.658 INFO 3268 --- [ main] org.liu.demo.BootKafkaApplication : No active profile set, falling back to default profiles: default
2020-09-30 17:45:10.179 INFO 3268 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$880d02f5] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-09-30 17:45:10.339 INFO 3268 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 9090 (http)
2020-09-30 17:45:10.354 INFO 3268 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2020-09-30 17:45:10.354 INFO 3268 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.21]
2020-09-30 17:45:10.359 INFO 3268 --- [ main] o.a.catalina.core.AprLifecycleListener : Loaded APR based Apache Tomcat Native library [1.2.24] using APR version [1.7.0].
2020-09-30 17:45:10.359 INFO 3268 --- [ main] o.a.catalina.core.AprLifecycleListener : APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true].
2020-09-30 17:45:10.359 INFO 3268 --- [ main] o.a.catalina.core.AprLifecycleListener : APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
2020-09-30 17:45:10.361 INFO 3268 --- [ main] o.a.catalina.core.AprLifecycleListener : OpenSSL successfully initialized [OpenSSL 1.1.1g 21 Apr 2020]
2020-09-30 17:45:10.419 INFO 3268 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2020-09-30 17:45:10.419 INFO 3268 --- [ main] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 732 ms
2020-09-30 17:45:10.576 INFO 3268 --- [ main] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor'
2020-09-30 17:45:10.749 INFO 3268 --- [ main] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
bootstrap.servers = [49.234.12.199:9092]
client.id =
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
2020-09-30 17:45:10.780 INFO 3268 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version : 2.0.1
2020-09-30 17:45:10.780 INFO 3268 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId : fa14705e51bd2ce5
2020-09-30 17:45:49.400 ERROR 3268 --- [ main] o.springframework.kafka.core.KafkaAdmin : Could not configure topics
org.springframework.kafka.KafkaException: Timed out waiting to get existing topics; nested exception is java.util.concurrent.TimeoutException
at org.springframework.kafka.core.KafkaAdmin.lambda$checkPartitions$2(KafkaAdmin.java:238) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
at java.util.HashMap.forEach(HashMap.java:1289) ~[na:1.8.0_152]
at org.springframework.kafka.core.KafkaAdmin.checkPartitions(KafkaAdmin.java:213) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
at org.springframework.kafka.core.KafkaAdmin.addTopicsIfNeeded(KafkaAdmin.java:199) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
at org.springframework.kafka.core.KafkaAdmin.initialize(KafkaAdmin.java:169) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
at org.springframework.kafka.core.KafkaAdmin.afterSingletonsInstantiated(KafkaAdmin.java:139) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:862) [spring-beans-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877) [spring-context-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549) [spring-context-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:742) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:389) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:311) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1213) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1202) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.liu.demo.BootKafkaApplication.main(BootKafkaApplication.java:10) [classes/:na]
Caused by: java.util.concurrent.TimeoutException: null
at org.apache.kafka.common.internals.KafkaFutureImpl$SingleWaiter.await(KafkaFutureImpl.java:108) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:274) ~[kafka-clients-2.0.1.jar:na]
at org.springframework.kafka.core.KafkaAdmin.lambda$checkPartitions$2(KafkaAdmin.java:216) [spring-kafka-2.2.7.RELEASE.jar:2.2.7.RELEASE]
... 15 common frames omitted
2020-09-30 17:45:59.401 INFO 3268 --- [| adminclient-1] o.a.k.clients.admin.KafkaAdminClient : [AdminClient clientId=adminclient-1] Forcing a hard I/O thread shutdown. Requests in progress will be aborted.
2020-09-30 17:45:59.421 INFO 3268 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 9090 (http) with context path ''
2020-09-30 17:45:59.423 INFO 3268 --- [ main] org.liu.demo.BootKafkaApplication : Started BootKafkaApplication in 50.077 seconds (JVM running for 50.591)
There seems to be an issue with the KAFKA_ADVERTISED_LISTENERS config. I am guessing you would need the public host name for the Docker cloud server.
The following article might help with configuring this correctly:
https://rmoff.net/2018/08/02/kafka-listeners-explained/
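For reference, a common listener layout for this situation (a sketch only; these wurstmeister/kafka environment variables would replace KAFKA_ADVERTISED_HOST_NAME and KAFKA_ADVERTISED_LISTENERS above, and 49.234.12.199 stands in for whatever address external clients actually use to reach the server):

    environment:
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,EXTERNAL://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:29092,EXTERNAL://49.234.12.199:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL

Containers on the Docker network would then connect to kafka:29092, while the Windows machine keeps using 49.234.12.199:9092; whatever address the broker advertises for the EXTERNAL listener must be routable, and the port open, from the client's side.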

Spring App Not Connecting to Kafka with SSL

I have a Spring Boot app with a very simple Kafka producer. Everything works great if I connect to a Kafka cluster without encryption, but it times out if I try to connect to a cluster with SSL. Is there some other configuration I need in the producer, or some other property I need to define, to allow Spring to use all of the configurations correctly?
I have the following properties set:
spring.kafka.producer.bootstrap-servers=broker1.kafka.poc.com:9093,broker3.kafka.poc.com:9093,broker4.kafka.poc.com:9093,broker5.kafka.poc.com:9093
spring.kafka.ssl.key-store-type=jks
spring.kafka.ssl.trust-store-location=file:/home/ec2-user/truststore.jks
spring.kafka.ssl.trust-store-password=test1234
spring.kafka.ssl.key-store-location=file:/home/ec2-user/keystore.jks
spring.kafka.ssl.key-store-password=test1234
logging.level.org.apache.kafka=debug
server.ssl.key-password=test1234
spring.kafka.ssl.key-password=test1234
spring.kafka.producer.client-id=sym
spring.kafka.admin.ssl.protocol=ssl
The following ProducerConfig values are printed when the app starts up:
o.a.k.clients.producer.ProducerConfig : ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [broker1.kafka.allypoc.com:9093, broker3.kafka.allypoc.com:9093, broker4.kafka.allypoc.com:9093, broker5.kafka.allypoc.com:9093]
buffer.memory = 33554432
client.dns.lookup = default
client.id = sym
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = /home/ec2-user/keystore.jks
ssl.keystore.password = [hidden]
ssl.keystore.type = jks
ssl.protocol = ssl
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = /home/ec2-user/truststore.jks
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.StringSerializer
My producer is extremely simple:
@Service
public class Producer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public Producer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    void sendMessage(String topic, String message) {
        this.kafkaTemplate.send(topic, message);
    }

    void sendMessage(String topic, String key, String message) {
        this.kafkaTemplate.send(topic, key, message);
    }
}
Connecting to Kafka with SSL gets a TimeoutException saying "Topic symbols not present in metadata after 60000 ms."
If I turn on debug logs, I see the following repeatedly, looping through all of my brokers.
2019-05-29 20:10:25.768 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Completed connection to node -4. Fetching API versions.
2019-05-29 20:10:25.768 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Initiating API versions fetch from node -4.
2019-05-29 20:10:25.768 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Initialize connection to node 10.25.77.13:9093 (id: -3 rack: null) for sending metadata request
2019-05-29 20:10:25.768 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Initiating connection to node 10.25.77.13:9093 (id: -3 rack: null) using address /10.25.77.13
2019-05-29 20:10:25.994 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--3.bytes-sent
2019-05-29 20:10:25.996 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--3.bytes-received
2019-05-29 20:10:25.997 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--3.latency
2019-05-29 20:10:25.998 DEBUG 1381 --- [rk-thread | sym] o.apache.kafka.common.network.Selector : [Producer clientId=sym] Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 131072, SO_TIMEOUT = 0 to node -3
2019-05-29 20:10:26.107 DEBUG 1381 --- [rk-thread | sym] o.apache.kafka.common.network.Selector : [Producer clientId=sym] Connection with /10.25.75.151 disconnected
java.io.EOFException: null
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:119) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:381) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:342) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:609) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:541) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:467) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:535) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:311) ~[kafka-clients-2.1.1.jar!/:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:235) ~[kafka-clients-2.1.1.jar!/:na]
at java.base/java.lang.Thread.run(Thread.java:835) ~[na:na]
2019-05-29 20:10:26.108 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Node -1 disconnected.
2019-05-29 20:10:26.110 DEBUG 1381 --- [rk-thread | sym] org.apache.kafka.clients.NetworkClient : [Producer clientId=sym] Completed connection to node -3. Fetching API versions.
In the producer config, security.protocol should be set to SSL (the ProducerConfig output above shows it is still PLAINTEXT). You could also try setting ssl.endpoint.identification.algorithm = "" to disable hostname validation of the certificate, in case that's the issue. Other than that, it would be useful to see the Kafka broker config.
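A minimal sketch of the corresponding Spring Boot properties, assuming the same spring.kafka.* prefix used above (keys under spring.kafka.properties are passed through to every client Spring Boot builds; newer Boot versions also offer a dedicated spring.kafka.security.protocol key):

# apply SSL to every Kafka client Spring Boot creates (producer, consumer, admin)
spring.kafka.properties.security.protocol=SSL
# or scope it to the producer only
spring.kafka.producer.properties.security.protocol=SSL
# optional while debugging certificate/hostname issues: disable hostname verification
spring.kafka.properties.ssl.endpoint.identification.algorithm=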

Cannot connect to Kafka through a Spring Boot (Docker) application

I started Kafka locally and wrote a sample Spring Boot producer. When I run this application directly it works fine, but when I start it in a Docker container I get the logs below: "Connection to node 0 could not be established. Broker may not be available."
2019-03-20 06:06:56.023 INFO 1 --- [ XNIO-2 task-1] o.a.k.c.u.AppInfoParser : Kafka version : 1.0.1
2019-03-20 06:06:56.023 INFO 1 --- [ XNIO-2 task-1] o.a.k.c.u.AppInfoParser : Kafka commitId : c0518aa65f25317e
2019-03-20 06:06:56.224 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
2019-03-20 06:06:56.263 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
2019-03-20 06:06:56.355 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
2019-03-20 06:06:56.594 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
2019-03-20 06:06:56.919 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
2019-03-20 06:06:57.877 WARN 1 --- [ad | producer-1] o.a.k.c.NetworkClient : [Producer clientId=producer-1] Connection to node 0 could not be established. Broker may not be available.
The ProducerConfig values from the log are below:
2019-03-20 06:06:55.953 INFO 1 --- [ XNIO-2 task-1] o.a.k.c.p.ProducerConfig : ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [192.168.0.64:9092]
buffer.memory = 33554432
client.id =
compression.type = none
connections.max.idle.ms = 540000
enable.idempotence = false
interceptor.classes = null
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 0
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer
My ProducerConfig is as below:
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.0.64:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return props;
}
Is there any additional configuration required when connecting through docker?
You are probably connecting to the wrong port. Run docker ps, e.g.:
2ca7f0cdddd confluentinc/cp-enterprise-kafka:5.1.2 "/etc/confluent/dock…" 2 weeks ago Up 50 seconds 0.0.0.0:9092->9092/tcp, 0.0.0.0:29092->29092/tcp broker
and use the latter broker port (29092 in the example above).
Also, from your laptop you can usually reach the Docker network at localhost.
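As a sketch, the two usual ways to make the broker reachable from inside the app container (the network and container names below are placeholders, and host.docker.internal only resolves on Docker Desktop): whatever ends up in bootstrap.servers has to be an address the application container itself can reach, since it is hard-coded in producerConfigs() here.

# option 1: attach the app container to the broker's Docker network and use the broker's
#   container name plus the port the broker advertises on that network, e.g. broker:9092
docker network connect <kafka-network> <app-container>
# option 2: keep a host-published port and point at the Docker host from inside the container
#   (Docker Desktop), e.g. use host.docker.internal:29092 instead of 192.168.0.64:9092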

Spring Boot: I have Kafka broker code that makes the application get stuck

Spring Boot: I have Kafka code in the application which listens to a topic, and a controller with API endpoints. The Kafka listener polls the topic. When I start the application I see Kafka getting started, but the problem is that my endpoints are not working, and the application never comes up on the configured port. I need to test my endpoints but I cannot. I see Kafka start and work fine, but I never see the prompt saying that the application started on port XXXX, and the endpoints do not work.
Logs on console:
2018-04-17 17:18:59.447 INFO 59720 --- [ main] c.f.s.e.EventAggregationApplication : Starting EventAggregationApplication on FPTECHS48s-MacBook-Pro.local with PID 59720 (/Users/fptechs48/IdeaProjects/event-aggregation/target/classes started by fptechs48 in /Users/fptechs48/IdeaProjects/event-aggregation)
2018-04-17 17:18:59.450 DEBUG 59720 --- [ main] c.f.s.e.EventAggregationApplication : Running with Spring Boot v1.5.9.RELEASE, Spring v4.3.13.RELEASE
2018-04-17 17:18:59.450 INFO 59720 --- [ main] c.f.s.e.EventAggregationApplication : The following profiles are active: dev
2018-04-17 17:18:59.487 INFO 59720 --- [ main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext#1198b989: startup date [Tue Apr 17 17:18:59 IST 2018]; root of context hierarchy
2018-04-17 17:18:59.894 INFO 59720 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$b36fb556] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2018-04-17 17:19:00.521 INFO 59720 --- [ main] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
bootstrap.servers = [13.126.200.243:9092]
client.id =
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
2018-04-17 17:19:00.553 INFO 59720 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version : 1.0.0
2018-04-17 17:19:00.553 INFO 59720 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId : aaa7af6d4a11b29d
After this, Kafka starts listening to the topic, but if I hit an endpoint defined in my application it does not respond.
The console never even says "Tomcat started on port XXXX".

Spring Cloud Kafka Stream Unable to create Producer Config Error

I have two Spring Boot projects with Kafka stream dependencies. They have exactly the same dependencies in Gradle and exactly the same configuration, yet one of the projects logs the error below when started:
11:35:37.974 [restartedMain] INFO o.a.k.c.admin.AdminClientConfig - AdminClientConfig values:
bootstrap.servers = [192.169.0.109:6667]
client.id = client
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
11:35:38.017 [restartedMain] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version : 1.0.0
11:35:38.017 [restartedMain] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId : aaa7af6d4a11b29d
11:35:38.136 [restartedMain] INFO o.s.c.s.b.k.p.KafkaTopicProvisioner - Using kafka topic for outbound: createschedule
11:36:08.147 [restartedMain] ERROR o.s.c.stream.binding.BindingService - Failed to create producer binding; retrying in 30 seconds
org.springframework.cloud.stream.provisioning.ProvisioningException: provisioning exception; nested exception is java.util.concurrent.TimeoutException
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:243)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionProducerDestination(KafkaTopicProvisioner.java:126)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionProducerDestination(KafkaTopicProvisioner.java:71)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindProducer(AbstractMessageChannelBinder.java:140)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindProducer(AbstractMessageChannelBinder.java:77)
at org.springframework.cloud.stream.binder.AbstractBinder.bindProducer(AbstractBinder.java:136)
at org.springframework.cloud.stream.binding.BindingService.doBindProducer(BindingService.java:244)
at org.springframework.cloud.stream.binding.BindingService.bindProducer(BindingService.java:221)
at org.springframework.cloud.stream.binding.BindableProxyFactory.bindOutputs(BindableProxyFactory.java:252)
at org.springframework.cloud.stream.binding.OutputBindingLifecycle.doStartWithBindable(OutputBindingLifecycle.java:46)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
at org.springframework.cloud.stream.binding.AbstractBindingLifecycle.start(AbstractBindingLifecycle.java:47)
at org.springframework.cloud.stream.binding.OutputBindingLifecycle.start(OutputBindingLifecycle.java:29)
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:181)
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:52)
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:356)
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:157)
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:121)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:884)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:161)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:752)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:388)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:327)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1246)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1234)
at com.und.ClientPanelApplicationKt.main(ClientPanelApplication.kt:21)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Caused by: java.util.concurrent.TimeoutException: null
at org.apache.kafka.common.internals.KafkaFutureImpl$SingleWaiter.await(KafkaFutureImpl.java:108)
at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:225)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicAndPartitions(KafkaTopicProvisioner.java:271)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicIfNecessary(KafkaTopicProvisioner.java:251)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:236)
... 34 common frames omitted
I am using Kotlin 1.2.30 and Spring Boot 2.0.0.RELEASE.
The relevant part of my build.gradle is below:
ext {
    kotlinVersion = '1.2.30'
    springBootVersion = '2.0.0.RELEASE'
    springCloudVersion = 'Finchley.M8'
}

dependencies {
    compile('org.springframework.cloud:spring-cloud-starter-stream-kafka')
}
I have defined one output channel as below:
interface EventStream {
    @Output("createschedule")
    fun outputEvent(): MessageChannel
}
Below is the relevant section of my code where I have defined the sender and listener:
@Service
class TestService {

    @Autowired
    private lateinit var eventStream: EventStream

    fun test() {
        processSchedule("Hello")
    }

    @SendTo("createschedule")
    fun processSchedule(campaign: String): String {
        return campaign
    }

    @StreamListener("createschedule")
    fun listenSchedule(campaign: String) {
        println(campaign)
        //return campaign
    }
}
Below is the relevant section of my application.yaml:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: ${KAFKA_IP}
          defaultBrokerPort: ${KAFKA_PORT}
          zkNodes: ${ZK_IP}
          defaultZkPort: ${ZK_PORT}
      bindings:
        createschedule:
          group: createscheduleGroup
          destination: createschedule
          consumer:
            group: createscheduleGroup
  kafka:
    admin:
      properties:
        #security.protocol: SSL
        client.id: client
As I already stated, it throws this error while starting, and after startup it keeps throwing the error below in the logs:
11:54:38.231 [pool-3-thread-1] INFO o.s.c.s.b.k.p.KafkaTopicProvisioner - Using kafka topic for outbound: createschedule
11:54:59.070 [kafka-admin-client-thread | client-panel] WARN o.apache.kafka.clients.NetworkClient - [AdminClient clientId=client-panel] Connection to node -1 could not be established. Broker may not be available.
11:55:08.234 [pool-3-thread-1] ERROR o.s.c.stream.binding.BindingService - Failed to create producer binding; retrying in 30 seconds
org.springframework.cloud.stream.provisioning.ProvisioningException: provisioning exception; nested exception is java.util.concurrent.TimeoutException
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:243)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionProducerDestination(KafkaTopicProvisioner.java:126)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionProducerDestination(KafkaTopicProvisioner.java:71)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindProducer(AbstractMessageChannelBinder.java:140)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindProducer(AbstractMessageChannelBinder.java:77)
at org.springframework.cloud.stream.binder.AbstractBinder.bindProducer(AbstractBinder.java:136)
at org.springframework.cloud.stream.binding.BindingService.lambda$rescheduleProducerBinding$2(BindingService.java:262)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at java.util.concurrent.Executors$RunnableAdapter.call$$$capture(Executors.java:511)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java)
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
at java.util.concurrent.FutureTask.run(FutureTask.java)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException: null
at org.apache.kafka.common.internals.KafkaFutureImpl$SingleWaiter.await(KafkaFutureImpl.java:108)
at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:225)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicAndPartitions(KafkaTopicProvisioner.java:271)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicIfNecessary(KafkaTopicProvisioner.java:251)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:236)
... 16 common frames omitted
This is a problem with Kafka itself.
If you are running Kafka in Docker containers, stop the Kafka container, remove it, and start a fresh Kafka container.
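A minimal sketch, assuming the broker runs from a docker-compose file with a service named kafka (adjust the service name to your setup):

docker-compose stop kafka      # stop the broker container
docker-compose rm -f kafka     # remove it (topic data held in anonymous volumes is lost)
docker-compose up -d kafka     # recreate and start a fresh broker

If the broker was started with plain docker run, the equivalent is docker stop kafka && docker rm kafka, followed by the original docker run command.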
