I am using Docker to run Kafka and ZooKeeper services on my local machine (macOS). Here is my docker-compose.yml file:
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 10.200.10.1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
10.200.10.1 is my Docker host IP; I find it via this command:
ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1
I can create a topic inside the Docker container via:
kafka-topics.sh --bootstrap-server :9092 --create --topic topic1 --partitions 1 --replication-factor 1
start a producer with:
kafka-console-producer.sh --broker-list :9092 --topic topic1
and start a consumer with:
kafka-console-consumer.sh --bootstrap-server :9092 --group sam --topic topic1
Everything works fine in the terminals: I can send a message from the producer and receive it on the consumer.
But I cannot send a message from my Spring Boot app, which I describe below.
application.yml file:
spring:
  kafka:
    producer:
      bootstrap-servers: 0.0.0.0:9092
KafkaConfiguration:
@Configuration
public class KafkaConfiguration {

    @Value("${spring.kafka.producer.bootstrap-servers}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
I am using this sendSms function in my producer:
@Slf4j
@Component
@RequiredArgsConstructor
public class MessageProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public void sendSms(String sms) {
        kafkaTemplate.send("topic1", sms)
                .addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
                    @Override
                    public void onSuccess(SendResult<String, String> result) {
                        log.info("Message '{}' sent to kafka with offset : {}", sms, result.getRecordMetadata().offset());
                    }

                    @Override
                    public void onFailure(Throwable ex) {
                        log.error("Unable to send message : {}. ex : {}", sms, ex.getMessage());
                    }
                });
    }
}
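The MessageController that appears in the log below is not included here; a minimal sketch of that endpoint (mapping and names are assumptions, not the real class) looks roughly like this:
@Slf4j
@RestController
@RequiredArgsConstructor
public class MessageController {

    private final MessageProducer messageProducer;

    // Hypothetical endpoint; the real MessageController is not shown in this question.
    @GetMapping("/messages/{sms}")
    public ResponseEntity<Void> send(@PathVariable String sms) {
        log.info("Message request received : {}", sms);
        messageProducer.sendSms(sms);
        return ResponseEntity.accepted().build();
    }
}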
When I try to send a message, I get this error:
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_201]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_201]
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:50) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:216) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:531) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:483) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:539) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:335) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:244) [kafka-clients-2.3.1.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_201]
2020-05-17 19:53:06.210 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node -1 disconnected.
2020-05-17 19:53:06.210 WARN 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node -1 (/0.0.0.0:9092) could not be established. Broker may not be available.
2020-05-17 19:53:06.210 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.265 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.319 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initialize connection to node 0.0.0.0:9092 (id: -1 rack: null) for sending metadata request
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initiating connection to node 0.0.0.0:9092 (id: -1 rack: null) using address /0.0.0.0
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] o.apache.kafka.common.network.Selector : [Producer clientId=producer-1] Connection with /0.0.0.0 disconnected
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_201]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_201]
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:50) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:216) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:531) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:483) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:539) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:335) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:244) [kafka-clients-2.3.1.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_201]
Here is the whole log output:
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.2.7.RELEASE)
2020-05-17 19:52:57.176 INFO 15739 --- [ restartedMain] .p.SpringKafkaProducerExampleApplication : Starting SpringKafkaProducerExampleApplication on sam-MacBook-Pro.local with PID 15739 (/Users/sam/Downloads/kafka-hello/spring-kafka-producer-example/build/classes/java/main started by sam in /Users/sam/Downloads/kafka-hello/spring-kafka-producer-example)
2020-05-17 19:52:57.180 INFO 15739 --- [ restartedMain] .p.SpringKafkaProducerExampleApplication : No active profile set, falling back to default profiles: default
2020-05-17 19:52:57.231 INFO 15739 --- [ restartedMain] .e.DevToolsPropertyDefaultsPostProcessor : Devtools property defaults active! Set 'spring.devtools.add-properties' to 'false' to disable
2020-05-17 19:52:57.231 INFO 15739 --- [ restartedMain] .e.DevToolsPropertyDefaultsPostProcessor : For additional web related logging consider setting the 'logging.level.web' property to 'DEBUG'
2020-05-17 19:52:58.111 INFO 15739 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8081 (http)
2020-05-17 19:52:58.120 INFO 15739 --- [ restartedMain] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2020-05-17 19:52:58.120 INFO 15739 --- [ restartedMain] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.34]
2020-05-17 19:52:58.181 INFO 15739 --- [ restartedMain] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2020-05-17 19:52:58.181 INFO 15739 --- [ restartedMain] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 950 ms
2020-05-17 19:52:58.373 INFO 15739 --- [ restartedMain] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor'
2020-05-17 19:52:58.562 INFO 15739 --- [ restartedMain] o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729
2020-05-17 19:52:58.610 INFO 15739 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8081 (http) with context path ''
2020-05-17 19:52:58.614 INFO 15739 --- [ restartedMain] .p.SpringKafkaProducerExampleApplication : Started SpringKafkaProducerExampleApplication in 1.878 seconds (JVM running for 2.602)
2020-05-17 19:53:04.243 INFO 15739 --- [nio-8081-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring DispatcherServlet 'dispatcherServlet'
2020-05-17 19:53:04.243 INFO 15739 --- [nio-8081-exec-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'
2020-05-17 19:53:04.248 INFO 15739 --- [nio-8081-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 5 ms
2020-05-17 19:53:06.105 INFO 15739 --- [nio-8081-exec-1] c.s.e.kafka.producer.MessageController : Message request received : sam
2020-05-17 19:53:06.130 INFO 15739 --- [nio-8081-exec-1] o.a.k.clients.producer.ProducerConfig : ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [0.0.0.0:9092]
buffer.memory = 33554432
client.dns.lookup = default
client.id =
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.StringSerializer
2020-05-17 19:53:06.143 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name bufferpool-wait-time
2020-05-17 19:53:06.148 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name buffer-exhausted-records
2020-05-17 19:53:06.153 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name errors
2020-05-17 19:53:06.158 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name produce-throttle-time
2020-05-17 19:53:06.163 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name connections-closed:
2020-05-17 19:53:06.164 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name connections-created:
2020-05-17 19:53:06.164 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name successful-authentication:
2020-05-17 19:53:06.164 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name successful-reauthentication:
2020-05-17 19:53:06.164 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name successful-authentication-no-reauth:
2020-05-17 19:53:06.165 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name failed-authentication:
2020-05-17 19:53:06.165 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name failed-reauthentication:
2020-05-17 19:53:06.165 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name reauthentication-latency:
2020-05-17 19:53:06.165 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name bytes-sent-received:
2020-05-17 19:53:06.166 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name bytes-sent:
2020-05-17 19:53:06.166 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name bytes-received:
2020-05-17 19:53:06.167 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name select-time:
2020-05-17 19:53:06.167 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name io-time:
2020-05-17 19:53:06.170 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name batch-size
2020-05-17 19:53:06.171 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name compression-rate
2020-05-17 19:53:06.171 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name queue-time
2020-05-17 19:53:06.171 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name request-time
2020-05-17 19:53:06.172 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name records-per-request
2020-05-17 19:53:06.172 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name record-retries
2020-05-17 19:53:06.172 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name record-size
2020-05-17 19:53:06.173 DEBUG 15739 --- [nio-8081-exec-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name batch-split-rate
2020-05-17 19:53:06.174 DEBUG 15739 --- [ad | producer-1] o.a.k.clients.producer.internals.Sender : [Producer clientId=producer-1] Starting Kafka producer I/O thread.
2020-05-17 19:53:06.176 INFO 15739 --- [nio-8081-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.3.1
2020-05-17 19:53:06.177 INFO 15739 --- [nio-8081-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 18a913733fb71c01
2020-05-17 19:53:06.177 INFO 15739 --- [nio-8081-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1589741586174
2020-05-17 19:53:06.178 DEBUG 15739 --- [nio-8081-exec-1] o.a.k.clients.producer.KafkaProducer : [Producer clientId=producer-1] Kafka producer started
2020-05-17 19:53:06.179 DEBUG 15739 --- [nio-8081-exec-1] o.s.k.core.DefaultKafkaProducerFactory : Created new Producer: CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer#4f619fb3]
2020-05-17 19:53:06.185 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initialize connection to node 0.0.0.0:9092 (id: -1 rack: null) for sending metadata request
2020-05-17 19:53:06.187 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initiating connection to node 0.0.0.0:9092 (id: -1 rack: null) using address /0.0.0.0
2020-05-17 19:53:06.194 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--1.bytes-sent
2020-05-17 19:53:06.195 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--1.bytes-received
2020-05-17 19:53:06.195 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.common.metrics.Metrics : Added sensor with name node--1.latency
2020-05-17 19:53:06.208 DEBUG 15739 --- [ad | producer-1] o.apache.kafka.common.network.Selector : [Producer clientId=producer-1] Connection with /0.0.0.0 disconnected
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_201]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_201]
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:50) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:216) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:531) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:483) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:539) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:335) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:244) [kafka-clients-2.3.1.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_201]
2020-05-17 19:53:06.210 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node -1 disconnected.
2020-05-17 19:53:06.210 WARN 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node -1 (/0.0.0.0:9092) could not be established. Broker may not be available.
2020-05-17 19:53:06.210 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.265 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.319 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initialize connection to node 0.0.0.0:9092 (id: -1 rack: null) for sending metadata request
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initiating connection to node 0.0.0.0:9092 (id: -1 rack: null) using address /0.0.0.0
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] o.apache.kafka.common.network.Selector : [Producer clientId=producer-1] Connection with /0.0.0.0 disconnected
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_201]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_201]
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:50) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:216) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:531) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:483) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:539) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:335) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:244) [kafka-clients-2.3.1.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_201]
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node -1 disconnected.
2020-05-17 19:53:06.320 WARN 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node -1 (/0.0.0.0:9092) could not be established. Broker may not be available.
2020-05-17 19:53:06.320 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.375 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.427 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initialize connection to node 0.0.0.0:9092 (id: -1 rack: null) for sending metadata request
2020-05-17 19:53:06.427 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Initiating connection to node 0.0.0.0:9092 (id: -1 rack: null) using address /0.0.0.0
2020-05-17 19:53:06.427 DEBUG 15739 --- [ad | producer-1] o.apache.kafka.common.network.Selector : [Producer clientId=producer-1] Connection with /0.0.0.0 disconnected
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_201]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_201]
at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:50) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:216) ~[kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:531) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:483) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:539) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.runOnce(Sender.java:335) [kafka-clients-2.3.1.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:244) [kafka-clients-2.3.1.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_201]
2020-05-17 19:53:06.427 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node -1 disconnected.
2020-05-17 19:53:06.427 WARN 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node -1 (/0.0.0.0:9092) could not be established. Broker may not be available.
2020-05-17 19:53:06.427 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Give up sending metadata request since no node is available
2020-05-17 19:53:06.482 DEBUG 15739 --- [ad | producer-1] org.apache.kafka.clients.Netwo
Replace KAFKA_ADVERTISED_HOST_NAME with KAFKA_ADVERTISED_LISTENERS, add KAFKA_LISTENERS and expose the ports correctly:
kafka:
  image: wurstmeister/kafka
  ports:
    - 9092:9092
  environment:
    KAFKA_LISTENERS: PLAINTEXT://:9092
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
Connect to localhost:9092:
spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
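As an optional sanity check (not part of the original setup, just a sketch using the plain kafka-clients AdminClient), you can confirm the broker is reachable from the host on localhost:9092 before wiring it into Spring:
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class BrokerSanityCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // If this times out, the listeners/advertised.listeners setup is still wrong.
            Set<String> topics = admin.listTopics().names().get(10, TimeUnit.SECONDS);
            System.out.println("Topics visible from the host: " + topics);
        }
    }
}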
Regarding bootstrap-servers: 0.0.0.0:9092 in your application.yml:
0.0.0.0 is an invalid IP address on the client side.
0.0.0.0 on the server side means listen on all interfaces.
You must use the actual address; presumably 10.200.10.1:9092.
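To see why 0.0.0.0 cannot work from the client side, a plain TCP connect test (a sketch, nothing Kafka-specific; the host list just mirrors the addresses discussed above) makes the difference obvious: the bootstrap address has to be one that something is actually listening on from the app's point of view.
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityCheck {
    public static void main(String[] args) {
        // 0.0.0.0 is a wildcard *bind* address for servers ("listen on all interfaces");
        // as a client-side destination it does not name a reachable broker.
        for (String host : new String[] {"0.0.0.0", "10.200.10.1", "localhost"}) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, 9092), 3_000);
                System.out.println(host + ":9092 is reachable");
            } catch (IOException e) {
                System.out.println(host + ":9092 is NOT reachable: " + e.getMessage());
            }
        }
    }
}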
Related
I tried to use Spring Cloud Stream with the Kafka binder, but when I call WebClient in the chain, the trace id is lost.
My flow is 'external service' -> 'functionStream-in' -> 'http call' -> 'functionStream-out' -> 'testStream-in' -> 'testStream-out' -> 'external service'.
After the HTTP call (or not?) the trace id is no longer propagated and I don't understand why. If I remove the HTTP call, everything is OK.
I tried adding Hooks.enableAutomaticContextPropagation(); but that didn't help.
I tried adding ContextSnapshot.setThreadLocalsFrom around the HTTP call; same thing.
How can I solve it?
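Roughly, that ContextSnapshot attempt looked like this (a sketch, mirroring the deferContextual/try-with-resources pattern used in the /test router further down; note that the scope is closed when the lambda returns, before the WebClient Mono is subscribed, which may be part of why it made no difference):
// Fragment intended to slot into the flatMap of the functionStream bean below.
.flatMap(msg -> Mono.deferContextual(contextView -> {
    try (var scope = ContextSnapshot.setThreadLocalsFrom(contextView, ObservationThreadLocalAccessor.KEY)) {
        log.info("before http call");
    } // the scope is already closed here, before the WebClient Mono below is subscribed
    return webClient.get()
            .uri("http://localhost:8080/test")
            .exchangeToMono(httpResponse -> httpResponse.bodyToMono(String.class))
            .map(body -> MessageBuilder.withPayload(body).copyHeaders(msg.getHeaders()).build());
}))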
Dependencies:
dependencies {
implementation 'org.springframework.boot:spring-boot-starter-actuator'
implementation 'org.springframework.boot:spring-boot-starter-webflux'
implementation 'org.springframework.cloud:spring-cloud-stream'
implementation 'org.springframework.cloud:spring-cloud-starter-stream-kafka'
implementation 'io.micrometer:micrometer-tracing-bridge-brave'
implementation 'io.zipkin.reporter2:zipkin-reporter-brave'
implementation "io.projectreactor:reactor-core:3.5.3"
implementation "io.micrometer:context-propagation:1.0.2"
implementation "io.micrometer:micrometer-core:1.10.4"
implementation "io.micrometer:micrometer-tracing:1.0.2"
}
application.yml:
spring:
  cloud.stream:
    kafka.binder:
      enableObservation: true
      headers:
        - b3
    function.definition: functionStream;testStream
    default.producer.useNativeEncoding: true
    bindings:
      functionStream-in-0:
        destination: spring-in
        group: spring-test1
      functionStream-out-0:
        destination: test-in
      testStream-in-0:
        destination: test-in
        group: spring-test2
      testStream-out-0:
        destination: spring-out
  integration:
    management:
      observation-patterns: "*"
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.value.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
management:
  tracing:
    enabled: true
    sampling.probability: 1.0
    propagation.type: b3
logging.pattern.level: "%5p [%X{traceId:-},%X{spanId:-}]"
Code:
@Bean
WebClient webClient(final WebClient.Builder builder) {
    return builder.build();
}

@Bean
Function<Flux<Message<String>>, Flux<Message<String>>> functionStream(final WebClient webClient, final ObservationRegistry registry) {
    return flux -> flux
            .<Message<String>>handle((msg, sink) -> {
                log.info("functionStream-1");
                sink.next(msg);
            })
            .flatMap(msg -> webClient.get()
                    .uri("http://localhost:8080/test")
                    .exchangeToMono(httpResponse -> httpResponse.bodyToMono(String.class)
                            .map(httpBody -> MessageBuilder.withPayload(httpBody)
                                    .copyHeaders(httpResponse.headers().asHttpHeaders())
                                    .build())
                            .<Message<String>>handle((m, sink) -> {
                                log.info("functionStream-3");
                                sink.next(m);
                            })
                    )
            )
            .handle((msg, sink) -> {
                log.info("functionStream-2");
                sink.next(msg);
            });
}

@Bean
Function<Flux<Message<String>>, Flux<Message<String>>> testStream(final ObservationRegistry registry) {
    return flux -> flux
            .publishOn(Schedulers.boundedElastic())
            .<Message<String>>handle((msg, sink) -> {
                log.info("testStream-1");
                sink.next(msg);
            })
            .map(msg -> MessageBuilder
                    .withPayload(msg.getPayload())
                    .copyHeaders(msg.getHeaders())
                    .build());
}

@Bean
RouterFunction<ServerResponse> router(final ObservationRegistry registry) {
    return route()
            .GET("/test", r -> ServerResponse.ok().body(Mono.deferContextual(contextView -> {
                try (final var scope = ContextSnapshot.setThreadLocalsFrom(contextView, ObservationThreadLocalAccessor.KEY)) {
                    log.info("GET /test");
                }
                return Mono.just("answer");
            }), String.class))
            .build();
}
With this code I get the following output:
2023-02-16T17:06:22.111 INFO [63ee385de15f1061dea076eb06b0d1e0,39a60588a695a702] 220348 --- [container-0-C-1] com.example.demo.TestApplication : functionStream-1
2023-02-16T17:06:22.166 WARN [63ee385de15f1061dea076eb06b0d1e0,39a60588a695a702] 220348 --- [container-0-C-1] i.m.o.c.ObservationThreadLocalAccessor : Scope from ObservationThreadLocalAccessor [null] is not the same as the one from ObservationRegistry [io.micrometer.observation.SimpleObservation$SimpleScope#523fe6a9]. You must have created additional scopes and forgotten to close them. Will close both of them
2023-02-16T17:06:22.170 WARN [63ee385de15f1061dea076eb06b0d1e0,de5d233d531b10f7] 220348 --- [container-0-C-1] i.m.o.c.ObservationThreadLocalAccessor : Scope from ObservationThreadLocalAccessor [null] is not the same as the one from ObservationRegistry [io.micrometer.observation.SimpleObservation$SimpleScope#545339d8]. You must have created additional scopes and forgotten to close them. Will close both of them
2023-02-16T17:06:22.187 WARN [63ee385de15f1061dea076eb06b0d1e0,de5d233d531b10f7] 220348 --- [container-0-C-1] i.m.o.c.ObservationThreadLocalAccessor : Scope from ObservationThreadLocalAccessor [null] is not the same as the one from ObservationRegistry [io.micrometer.observation.SimpleObservation$SimpleScope#44400bcc]. You must have created additional scopes and forgotten to close them. Will close both of them
2023-02-16T17:06:22.361 INFO [63ee385de15f1061dea076eb06b0d1e0,908f48f8485a4277] 220348 --- [ctor-http-nio-4] com.example.demo.TestApplication : GET /test
2023-02-16T17:06:22.407 INFO [,] 220348 --- [ctor-http-nio-3] com.example.demo.TestApplication : functionStream-3
2023-02-16T17:06:22.409 INFO [,] 220348 --- [ctor-http-nio-3] com.example.demo.TestApplication : functionStream-2
2023-02-16T17:06:22.448 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,dd1b0fd86a6c39ca] 220348 --- [ctor-http-nio-3] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
2023-02-16T17:06:22.456 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,dd1b0fd86a6c39ca] 220348 --- [ctor-http-nio-3] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.3.2
2023-02-16T17:06:22.457 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,dd1b0fd86a6c39ca] 220348 --- [ctor-http-nio-3] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: b66af662e61082cb
2023-02-16T17:06:22.457 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,dd1b0fd86a6c39ca] 220348 --- [ctor-http-nio-3] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1676556382456
2023-02-16T17:06:22.477 INFO [,] 220348 --- [| adminclient-6] o.a.kafka.common.utils.AppInfoParser : App info kafka.admin.client for adminclient-6 unregistered
2023-02-16T17:06:22.481 INFO [,] 220348 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed
2023-02-16T17:06:22.481 INFO [,] 220348 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter
2023-02-16T17:06:22.481 INFO [,] 220348 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed
2023-02-16T17:06:22.512 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,b5babc6bef4e30ca] 220348 --- [oundedElastic-1] com.example.demo.TestApplication : testStream-1
2023-02-16T17:06:22.539 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,30126c50752d5928] 220348 --- [oundedElastic-1] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
2023-02-16T17:06:22.543 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,30126c50752d5928] 220348 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.3.2
2023-02-16T17:06:22.544 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,30126c50752d5928] 220348 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: b66af662e61082cb
2023-02-16T17:06:22.544 INFO [63ee385eda64dcebdd1b0fd86a6c39ca,30126c50752d5928] 220348 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1676556382543
Without the HTTP call I get the following output:
2023-02-16T17:03:09.518 INFO [63ee379d924e5645fc1d9e27b8135b48,9ad408700a3b5684] 204228 --- [container-0-C-1] com.example.demo.TestApplication : functionStream-1
2023-02-16T17:03:09.518 INFO [63ee379d924e5645fc1d9e27b8135b48,9ad408700a3b5684] 204228 --- [container-0-C-1] com.example.demo.TestApplication : functionStream-2
2023-02-16T17:03:09.615 INFO [63ee379d924e5645fc1d9e27b8135b48,3d4c6bd14a3ca4b6] 204228 --- [container-0-C-1] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
2023-02-16T17:03:09.629 INFO [63ee379d924e5645fc1d9e27b8135b48,3d4c6bd14a3ca4b6] 204228 --- [container-0-C-1] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.3.2
2023-02-16T17:03:09.629 INFO [63ee379d924e5645fc1d9e27b8135b48,3d4c6bd14a3ca4b6] 204228 --- [container-0-C-1] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: b66af662e61082cb
2023-02-16T17:03:09.629 INFO [63ee379d924e5645fc1d9e27b8135b48,3d4c6bd14a3ca4b6] 204228 --- [container-0-C-1] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1676556189628
2023-02-16T17:03:09.691 INFO [,] 204228 --- [| adminclient-6] o.a.kafka.common.utils.AppInfoParser : App info kafka.admin.client for adminclient-6 unregistered
2023-02-16T17:03:09.693 INFO [,] 204228 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed
2023-02-16T17:03:09.693 INFO [,] 204228 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter
2023-02-16T17:03:09.693 INFO [,] 204228 --- [| adminclient-6] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed
2023-02-16T17:03:09.859 INFO [63ee379d924e5645fc1d9e27b8135b48,b92a1a59ffd32d80] 204228 --- [oundedElastic-1] com.example.demo.TestApplication : testStream-1
2023-02-16T17:03:09.868 INFO [63ee379d924e5645fc1d9e27b8135b48,db97f5eed98602f6] 204228 --- [oundedElastic-1] o.a.k.clients.admin.AdminClientConfig : AdminClientConfig values:
2023-02-16T17:03:09.874 INFO [63ee379d924e5645fc1d9e27b8135b48,db97f5eed98602f6] 204228 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.3.2
2023-02-16T17:03:09.874 INFO [63ee379d924e5645fc1d9e27b8135b48,db97f5eed98602f6] 204228 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: b66af662e61082cb
2023-02-16T17:03:09.874 INFO [63ee379d924e5645fc1d9e27b8135b48,db97f5eed98602f6] 204228 --- [oundedElastic-1] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1676556189874
I'm updating a JHipster microservice and found some issues when running the app in the production environment. The application crashes with a strange Hibernate error:
2020-10-22 10:59:12.755 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is STARTED
2020-10-22 10:59:13.066 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : Driver class org.postgresql.Driver found in Thread context class loader jdk.internal.loader.ClassLoaders$AppClassLoader#4ae3c1cd
2020-10-22 10:59:13.868 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [timestamp=1603364353868, current=UP, previous=STARTING]
2020-10-22 10:59:13.946 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9: registering service...
2020-10-22 10:59:14.182 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9 - registration status: 204
2020-10-22 10:59:14.257 INFO 1 --- [ main] t.h.e.m.config.WebConfigurer : Web application configuration, using profiles: prod
2020-10-22 10:59:14.258 INFO 1 --- [ main] t.h.e.m.config.WebConfigurer : Web application fully configured
2020-10-22 10:59:14.961 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : Hikari - configuration:
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : allowPoolSuspension.............false
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : autoCommit......................false
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : catalog.........................none
2020-10-22 10:59:14.970 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionInitSql...............none
2020-10-22 10:59:14.970 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionTestQuery.............none
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionTimeout...............30000
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSource......................none
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceClassName.............none
2020-10-22 10:59:14.972 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceJNDI..................none
2020-10-22 10:59:15.042 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceProperties............{password=<masked>}
2020-10-22 10:59:15.043 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : driverClassName................."org.postgresql.Driver"
2020-10-22 10:59:15.043 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : exceptionOverrideClassName......none
2020-10-22 10:59:15.044 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : healthCheckProperties...........{}
2020-10-22 10:59:15.045 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : healthCheckRegistry.............none
2020-10-22 10:59:15.045 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : idleTimeout.....................600000
2020-10-22 10:59:15.047 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : initializationFailTimeout.......1
2020-10-22 10:59:15.048 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : isolateInternalQueries..........false
2020-10-22 10:59:15.049 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : jdbcUrl.........................jdbc:postgresql://postgresql:5432/miscellaneous?socketTimeout=30
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : leakDetectionThreshold..........15000
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : maxLifetime.....................1800000
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : maximumPoolSize.................5
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : metricRegistry..................none
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : metricsTrackerFactory...........none
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : minimumIdle.....................5
2020-10-22 10:59:15.054 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : password........................<masked>
2020-10-22 10:59:15.055 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : poolName........................"Hikari"
2020-10-22 10:59:15.056 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : readOnly........................false
2020-10-22 10:59:15.057 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : registerMbeans..................false
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : scheduledExecutor...............none
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : schema..........................none
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : threadFactory...................internal
2020-10-22 10:59:15.060 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : transactionIsolation............default
2020-10-22 10:59:15.061 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : username........................"teste"
2020-10-22 10:59:15.064 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : validationTimeout...............5000
2020-10-22 10:59:15.065 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Starting...
2020-10-22 10:59:15.445 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#47d6a31a
2020-10-22 10:59:15.450 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Start completed.
2020-10-22 10:59:15.550 DEBUG 1 --- [ari housekeeper] com.zaxxer.hikari.pool.HikariPool : Hikari - Pool stats (total=1, active=1, idle=0, waiting=0)
2020-10-22 10:59:15.566 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#c2d5356
2020-10-22 10:59:15.573 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#2eaaca3
2020-10-22 10:59:15.643 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#6d53826
2020-10-22 10:59:15.654 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#43d7a698
2020-10-22 10:59:15.659 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - After adding stats (total=5, active=1, idle=4, waiting=0)
2020-10-22 10:59:28.156 WARN 1 --- [scoveryClient-0] c.netflix.discovery.TimedSupervisorTask : task supervisor timed out
java.util.concurrent.TimeoutException: null
at java.base/java.util.concurrent.FutureTask.get(Unknown Source)
at com.netflix.discovery.TimedSupervisorTask.run(TimedSupervisorTask.java:68)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
2020-10-22 10:59:30.453 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Shutdown initiated...
2020-10-22 10:59:30.454 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - Before shutdown stats (total=5, active=0, idle=5, waiting=0)
2020-10-22 10:59:30.465 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#47d6a31a: (connection evicted)
2020-10-22 10:59:30.471 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#c2d5356: (connection evicted)
2020-10-22 10:59:30.544 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#2eaaca3: (connection evicted)
2020-10-22 10:59:30.545 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#6d53826: (connection evicted)
2020-10-22 10:59:30.548 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#43d7a698: (connection evicted)
2020-10-22 10:59:30.564 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - After shutdown stats (total=0, active=0, idle=0, waiting=0)
2020-10-22 10:59:30.564 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Shutdown completed.
2020-10-22 10:59:30.565 WARN 1 --- [ main] i.g.j.c.liquibase.AsyncSpringLiquibase : Warning, Liquibase took more than 5 seconds to start up!
2020-10-22 10:59:31.050 DEBUG 1 --- [ main] o.hibernate.jpa.internal.util.LogHelper : PersistenceUnitInfo [
name: default
persistence provider classname: null
classloader: jdk.internal.loader.ClassLoaders$AppClassLoader#4ae3c1cd
excludeUnlistedClasses: true
JTA datasource: null
Non JTA datasource: HikariDataSource (Hikari)
Transaction type: RESOURCE_LOCAL
PU root URL: file:/app/libs/commons-microservice-1.1.0.jar
Shared Cache Mode: UNSPECIFIED
Validation Mode: AUTO
Jar files URLs []
Managed classes names [
tech.h2r.ecommerce.miscellaneous.domain.AbstractAuditingEntity
tech.h2r.ecommerce.miscellaneous.domain.Banner
tech.h2r.ecommerce.miscellaneous.domain.CustomPage
tech.h2r.ecommerce.miscellaneous.domain.DirectMail
tech.h2r.ecommerce.miscellaneous.domain.PersistentAuditEvent
tech.h2r.ecommerce.miscellaneous.domain.Theme
tech.h2r.commons.microservice.domain.ConsumedMessage
tech.h2r.commons.microservice.domain.ProducedMessage
tech.h2r.commons.domain.Config]
Mapping files names []
Properties []
2020-10-22 10:59:31.064 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.cfg.beanvalidation.BeanValidationIntegrator].
2020-10-22 10:59:31.066 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.secure.spi.JaccIntegrator].
2020-10-22 10:59:31.068 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.cache.internal.CollectionCacheInvalidator].
2020-10-22 10:59:31.270 INFO 1 --- [ main] org.hibernate.Version : HHH000412: Hibernate ORM core version 5.4.15.Final
2020-10-22 10:59:31.343 DEBUG 1 --- [ main] org.hibernate.cfg.Environment : HHH000206: hibernate.properties not found
2020-10-22 10:59:31.851 DEBUG 1 --- [ main] o.hibernate.service.spi.ServiceBinding : Overriding existing service binding [org.hibernate.secure.spi.JaccService]
2020-10-22 10:59:31.867 DEBUG 1 --- [ main] o.h.c.internal.RegionFactoryInitiator : Cannot default RegionFactory based on registered strategies as `[]` RegionFactory strategies were registered
2020-10-22 10:59:31.868 DEBUG 1 --- [ main] o.h.c.internal.RegionFactoryInitiator : Cache region factory : org.hibernate.cache.internal.NoCachingRegionFactory
2020-10-22 10:59:31.960 INFO 1 --- [ main] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
2020-10-22 10:59:32.769 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting JPA temp ClassLoader [org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557] into BootstrapContext; was [null]
2020-10-22 10:59:32.770 DEBUG 1 --- [ main] o.h.boot.internal.ClassLoaderAccessImpl : ClassLoaderAccessImpl#injectTempClassLoader(org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557) [was null]
2020-10-22 10:59:32.771 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting ScanEnvironment [org.hibernate.jpa.boot.internal.StandardJpaScanEnvironmentImpl#41ca7df9] into BootstrapContext; was [null]
2020-10-22 10:59:32.842 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting ScanOptions [org.hibernate.boot.archive.scan.internal.StandardScanOptions#4f0908de] into BootstrapContext; was [org.hibernate.boot.archive.scan.internal.StandardScanOptions#19469022]
2020-10-22 10:59:33.153 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting JPA temp ClassLoader [null] into BootstrapContext; was [org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557]
2020-10-22 10:59:33.160 DEBUG 1 --- [ main] o.h.boot.internal.ClassLoaderAccessImpl : ClassLoaderAccessImpl#injectTempClassLoader(null) [was org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557]
2020-10-22 10:59:33.243 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid2] -> [org.hibernate.id.UUIDGenerator]
2020-10-22 10:59:33.243 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [guid] -> [org.hibernate.id.GUIDGenerator]
2020-10-22 10:59:33.244 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid] -> [org.hibernate.id.UUIDHexGenerator]
2020-10-22 10:59:33.245 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid.hex] -> [org.hibernate.id.UUIDHexGenerator]
2020-10-22 10:59:33.245 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [assigned] -> [org.hibernate.id.Assigned]
2020-10-22 10:59:33.247 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [identity] -> [org.hibernate.id.IdentityGenerator]
2020-10-22 10:59:33.249 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [select] -> [org.hibernate.id.SelectGenerator]
2020-10-22 10:59:33.251 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [sequence] -> [org.hibernate.id.enhanced.SequenceStyleGenerator]
2020-10-22 10:59:33.253 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [seqhilo] -> [org.hibernate.id.SequenceHiLoGenerator]
2020-10-22 10:59:33.254 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [increment] -> [org.hibernate.id.IncrementGenerator]
2020-10-22 10:59:33.255 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [foreign] -> [org.hibernate.id.ForeignGenerator]
2020-10-22 10:59:33.256 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [sequence-identity] -> [org.hibernate.id.SequenceIdentityGenerator]
2020-10-22 10:59:33.256 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [enhanced-sequence] -> [org.hibernate.id.enhanced.SequenceStyleGenerator]
2020-10-22 10:59:33.258 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [enhanced-table] -> [org.hibernate.id.enhanced.TableGenerator]
2020-10-22 10:59:33.262 WARN 1 --- [ main] o.h.e.j.e.i.JdbcEnvironmentInitiator : HHH000342: Could not obtain connection to query metadata : HikariDataSource HikariDataSource (Hikari) has been closed.
2020-10-22 10:59:33.266 WARN 1 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [tech/h2r/ecommerce/miscellaneous/config/DatabaseConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
2020-10-22 10:59:33.365 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is SHUTTING_DOWN
2020-10-22 10:59:33.449 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Shutting down connection manager...
2020-10-22 10:59:33.457 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Shutting down node engine...
2020-10-22 10:59:33.547 INFO 1 --- [ main] com.hazelcast.instance.NodeExtension : [172.18.0.3]:5701 [dev] [3.12.7] Destroying node NodeExtension.
2020-10-22 10:59:33.548 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Hazelcast Shutdown is completed in 101 ms.
2020-10-22 10:59:33.549 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is SHUTDOWN
2020-10-22 10:59:33.550 INFO 1 --- [ main] t.h.e.m.config.CacheConfiguration : Closing Cache Manager
2020-10-22 10:59:33.554 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Shutting down DiscoveryClient ...
2020-10-22 10:59:36.557 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Unregistering ...
2020-10-22 10:59:36.666 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9 - deregister status: 200
2020-10-22 10:59:36.744 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Completed shut down of DiscoveryClient
2020-10-22 10:59:36.951 ERROR 1 --- [ main] o.s.boot.SpringApplication : Application run failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [tech/h2r/ecommerce/miscellaneous/config/DatabaseConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1796)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:595)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1108)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:868)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:747)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
at tech.h2r.ecommerce.miscellaneous.MiscellaneousApp.main(MiscellaneousApp.java:44)
Caused by: org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:275)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:237)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214)
at org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory.injectServices(DefaultIdentifierGeneratorFactory.java:152)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.injectDependencies(AbstractServiceRegistryImpl.java:286)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:243)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214)
at org.hibernate.boot.internal.InFlightMetadataCollectorImpl.<init>(InFlightMetadataCollectorImpl.java:176)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:118)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:1214)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:1245)
at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:58)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:391)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:378)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792)
... 14 common frames omitted
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.determineDialect(DialectFactoryImpl.java:100)
at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.buildDialect(DialectFactoryImpl.java:54)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:137)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:101)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:263)
... 31 common frames omitted
The error says the problem is with the dialect, but it is configured correctly:
datasource:
  type: com.zaxxer.hikari.HikariDataSource
  url: jdbc:postgresql://postgresql:5432/miscellaneous?socketTimeout=30
  hikari:
    poolName: Hikari
    auto-commit: false
jpa:
  database-platform: io.github.jhipster.domain.util.FixedPostgreSQL10Dialect
  show-sql: false
  open-in-view: false
  properties:
    hibernate.jdbc.time_zone: UTC
    hibernate.id.new_generator_mappings: true
    hibernate.connection.provider_disables_autocommit: true
    hibernate.cache.use_second_level_cache: true
    hibernate.cache.use_query_cache: false
    hibernate.generate_statistics: false
    hibernate.jdbc.batch_size: 25
    hibernate.order_inserts: true
    hibernate.order_updates: true
    hibernate.query.fail_on_pagination_over_collection_fetch: true
    hibernate.query.in_clause_parameter_padding: true
    hibernate.cache.region.factory_class: com.hazelcast.hibernate.HazelcastCacheRegionFactory
    hibernate.cache.use_minimal_puts: true
    hibernate.cache.hazelcast.instance_name: miscellaneous
    hibernate.cache.hazelcast.use_lite_member: true
  hibernate:
    ddl-auto: none
    naming:
      physical-strategy: org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy
      implicit-strategy: org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy
What could be causing this problem?
Instead of using the injected JpaVendorAdapter in my entityManagerFactory bean, I was creating it myself:
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, SchemaPerTenantConnectionProvider schemaPerTenantConnectionProvider, HeaderTenantIdentifierResolver headerTenantIdentifierResolver) {
    ...
    em.setJpaVendorAdapter(jpaVendorAdapter());
    em.setJpaPropertyMap(properties);
    return em;
}
Simply using the injected bean solved the problem (the auto-configured JpaVendorAdapter already carries the settings from spring.jpa, such as database-platform):
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, SchemaPerTenantConnectionProvider schemaPerTenantConnectionProvider, HeaderTenantIdentifierResolver headerTenantIdentifierResolver, JpaVendorAdapter jpaVendorAdapter) {
    ...
    em.setJpaVendorAdapter(jpaVendorAdapter);
    em.setJpaPropertyMap(properties);
    return em;
}
Add the dialect to the JPA properties under the key hibernate.dialect.
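For example, a minimal sketch of how that could look in the YAML above, reusing the dialect class the question already sets as database-platform (whether that exact class is the right dialect for your setup is an assumption):
jpa:
  properties:
    # Assumed: same dialect class as database-platform; swap in whatever matches your database.
    hibernate.dialect: io.github.jhipster.domain.util.FixedPostgreSQL10Dialect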
I have definitely searched Stack Exchange for this already. Problems with hasIpAddress often seem unique.
I believe I understand the route of my request to my server.
User -> Zuul -> My web service
http.authorizeRequests().antMatchers("/**").permitAll();
in my web service allows me to send requests and receive responses from localhost and my system's IP.
http.authorizeRequests().antMatchers("/**").hasIpAddress(10.10.1.24);
or
http.authorizeRequests().antMatchers("/**").hasIpAddress("127.0.0.1");
both fail.
When Zuul gives access to my web service... is it misreporting my request IP or something?
If my hasIpAddress() shouldn't be localhost, 127.0.0.1 or 10.10.1.24, then what else could it be?
I've shut down Zuul, Eureka and the web service and started them all up again.
I also did a Maven clean.
2019-10-23 11:58:46.608 INFO 7468 --- [ restartedMain] o.s.s.web.DefaultSecurityFilterChain : Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter#27040a7b, org.springframework.security.web.context.SecurityContextPersistenceFilter#6d6a2d29, org.springframework.security.web.header.HeaderWriterFilter#3485fdae, org.springframework.security.web.authentication.logout.LogoutFilter#198a3831, org.springframework.security.web.savedrequest.RequestCacheAwareFilter#2c7fb62d, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter#20043371, org.springframework.security.web.authentication.AnonymousAuthenticationFilter#ad82f08, org.springframework.security.web.session.SessionManagementFilter#2285c828, org.springframework.security.web.access.ExceptionTranslationFilter#6511c7f9, org.springframework.security.web.access.intercept.FilterSecurityInterceptor#7e27dfef]
2019-10-23 11:58:46.618 WARN 7468 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : No URLs will be polled as dynamic configuration sources.
2019-10-23 11:58:46.618 INFO 7468 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2019-10-23 11:58:46.621 WARN 7468 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : No URLs will be polled as dynamic configuration sources.
2019-10-23 11:58:46.621 INFO 7468 --- [ restartedMain] c.n.c.sources.URLConfigurationSource : To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2019-10-23 11:58:46.731 INFO 7468 --- [ restartedMain] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor'
2019-10-23 11:58:47.264 WARN 7468 --- [ restartedMain] ockingLoadBalancerClientRibbonWarnLogger : You already have RibbonLoadBalancerClient on your classpath. It will be used by default. As Spring Cloud Ribbon is in maintenance mode. We recommend switching to BlockingLoadBalancerClient instead. In order to use it, set the value of `spring.cloud.loadbalancer.ribbon.enabled` to `false` or remove spring-cloud-starter-netflix-ribbon from your project.
2019-10-23 11:58:47.365 INFO 7468 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 54293 (http) with context path ''
2019-10-23 11:58:47.366 INFO 7468 --- [ restartedMain] .s.c.n.e.s.EurekaAutoServiceRegistration : Updating port to 54293
2019-10-23 11:58:47.370 INFO 7468 --- [ restartedMain] o.s.c.n.eureka.InstanceInfoFactory : Setting initial instance status as: STARTING
2019-10-23 11:58:47.392 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Initializing Eureka in region us-east-1
2019-10-23 11:58:47.491 INFO 7468 --- [ restartedMain] c.n.d.provider.DiscoveryJerseyProvider : Using JSON encoding codec LegacyJacksonJson
2019-10-23 11:58:47.492 INFO 7468 --- [ restartedMain] c.n.d.provider.DiscoveryJerseyProvider : Using JSON decoding codec LegacyJacksonJson
2019-10-23 11:58:47.570 INFO 7468 --- [ restartedMain] c.n.d.provider.DiscoveryJerseyProvider : Using XML encoding codec XStreamXml
2019-10-23 11:58:47.571 INFO 7468 --- [ restartedMain] c.n.d.provider.DiscoveryJerseyProvider : Using XML decoding codec XStreamXml
2019-10-23 11:58:47.690 INFO 7468 --- [ restartedMain] c.n.d.s.r.aws.ConfigClusterResolver : Resolving eureka endpoints via configuration
2019-10-23 11:58:47.814 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Disable delta property : false
2019-10-23 11:58:47.814 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
2019-10-23 11:58:47.814 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
2019-10-23 11:58:47.814 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Application is null : false
2019-10-23 11:58:47.815 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
2019-10-23 11:58:47.815 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Application version is -1: true
2019-10-23 11:58:47.815 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
2019-10-23 11:58:47.890 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : The response status is 200
2019-10-23 11:58:47.892 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Starting heartbeat executor: renew interval is: 30
2019-10-23 11:58:47.894 INFO 7468 --- [ restartedMain] c.n.discovery.InstanceInfoReplicator : InstanceInfoReplicator onDemand update allowed rate per min is 4
2019-10-23 11:58:47.897 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Discovery Client initialized at timestamp 1571842727896 with initial instances count: 0
2019-10-23 11:58:47.900 INFO 7468 --- [ restartedMain] o.s.c.n.e.s.EurekaServiceRegistry : Registering application USERS-WS with eureka with status UP
2019-10-23 11:58:47.900 INFO 7468 --- [ restartedMain] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [timestamp=1571842727900, current=UP, previous=STARTING]
2019-10-23 11:58:47.902 INFO 7468 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_USERS-WS/users-ws:90ae4ec0932916bcd2b9155854f3a269: registering service...
2019-10-23 11:58:47.945 INFO 7468 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_USERS-WS/users-ws:90ae4ec0932916bcd2b9155854f3a269 - registration status: 204
2019-10-23 11:58:48.064 INFO 7468 --- [ restartedMain] c.p.p.a.u.PhotoAppApiUsersApplication : Started PhotoAppApiUsersApplication in 5.037 seconds (JVM running for 5.825)
2019-10-23 11:59:17.895 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Disable delta property : false
2019-10-23 11:59:17.895 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
2019-10-23 11:59:17.895 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
2019-10-23 11:59:17.896 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application is null : false
2019-10-23 11:59:17.896 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
2019-10-23 11:59:17.896 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application version is -1: false
2019-10-23 11:59:17.896 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
2019-10-23 11:59:17.959 INFO 7468 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : The response status is 200
Here is a working basic example using Spring Boot 2.2.0.RELEASE with spring-boot-starter-security and spring-boot-starter-web.
It works when accessing it via http://localhost:8080/ip
@SpringBootApplication
public class SpringSecurityHasIpAddressApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringSecurityHasIpAddressApplication.class, args);
    }
}

@RestController
class HelloController {

    @GetMapping("/hello")
    public String hello() {
        return "Hello World!";
    }

    @GetMapping("/ip")
    public String ip(HttpServletRequest request) {
        return request.getRemoteAddr();
    }

    @GetMapping("/secure")
    public String secure(Principal principal, HttpServletRequest request) {
        return principal.getName() + " with " + request.getRemoteAddr();
    }
}

@Configuration
class SecurityConfig extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http
            .authorizeRequests(
                authorizeRequests ->
                    authorizeRequests
                        .antMatchers("/hello").permitAll()
                        .antMatchers("/secure").authenticated()
                        .antMatchers("/ip").hasIpAddress("0:0:0:0:0:0:0:1") // localhost
                        .anyRequest().authenticated()
            )
            .formLogin();
    }
}
If you access the /secure path, you can see the IP address you are actually using.
0:0:0:0:0:0:0:1 is my localhost (IPv6 loopback) address, so I can access /ip without authentication.
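As a quick sanity check (a sketch; whether localhost resolves to the IPv6 or the IPv4 loopback depends on your machine):
curl http://localhost:8080/ip
# -> 0:0:0:0:0:0:0:1 on my machine

curl -i http://127.0.0.1:8080/ip
# different remote address, so hasIpAddress("0:0:0:0:0:0:0:1") denies it
# and formLogin() answers with a 302 redirect to the login page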
Setting the log level of org.springframework.security to debug can also be very helpful.
application.properties
logging.level.org.springframework.security=debug
Then you can see something like this in the log:
2019-10-30 19:14:01.039 DEBUG 3692 --- [nio-8080-exec-6] o.s.s.w.a.i.FilterSecurityInterceptor : Previously Authenticated: org.springframework.security.authentication.AnonymousAuthenticationToken#536ff536: Principal: anonymousUser; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails#166c8: RemoteIpAddress: 0:0:0:0:0:0:0:1; SessionId: 7602343558C34E2576CD0D3E20EDCBEE; Granted Authorities: ROLE_ANONYMOUS
2019-10-30 19:14:01.040 DEBUG 3692 --- [nio-8080-exec-6] o.s.s.access.vote.AffirmativeBased : Voter: org.springframework.security.web.access.expression.WebExpressionVoter#7527e914, returned: -1
2019-10-30 19:14:01.041 DEBUG 3692 --- [nio-8080-exec-6] o.s.s.w.a.ExceptionTranslationFilter : Access is denied (user is anonymous); redirecting to authentication entry point
If you try via http://127.0.0.1/ip, the solution above will fail;
then you can use
...
.antMatchers("/ip").hasIpAddress("127.0.0.1/32")
...
If you want to allow a range of IP addresses, then you could use
...
.antMatchers("/access") // multiple IP matching
.access("hasIpAddress('192.168.0.1/16') or hasIpAddress('127.0.0.1/32')")
...
hasIpAddress("1.1.1.1") has always worked fine for me. You don't need the /32, but you can use it; it is the same IP with /32 appended, and you only really need the suffix if you are trying to match a range of IPs in a subnet. My guess is that you are getting the IP of Zuul rather than localhost/127.0.0.1, and that you are using embedded Tomcat without the <Valve className="org.apache.catalina.valves.RemoteIpValve" /> installed. Also, enable the Tomcat access log to see which IP is actually hitting your service, via the Spring Boot properties listed at https://docs.spring.io/spring-boot/docs/current/reference/html/appendix-application-properties.html - just search for tomcat.
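For reference, a minimal application.properties sketch along those lines (the property names are my assumption for Spring Boot 2.x; double-check them against the appendix linked above):
# Log every request Tomcat handles, including the remote address (%a):
server.tomcat.accesslog.enabled=true
server.tomcat.accesslog.pattern=%a %t "%r" %s

# Honour the X-Forwarded-* headers set by a proxy such as Zuul,
# so getRemoteAddr() reports the original client instead of the proxy:
server.tomcat.remote-ip-header=X-Forwarded-For
server.tomcat.protocol-header=X-Forwarded-Proto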
I'm building a microservices app and I've run into problem with configuring the Spring Cloud gateway to proxy the calls to the API from frontend running on Nginx server.
When I make a POST request to /users/login, I get this response: OPTIONS http://28a41511677e:8082/login net::ERR_NAME_NOT_RESOLVED.
The string 28a41511677e is the service's Docker container ID. When I call another service (using the GET method), it returns data just fine.
I'm using a Eureka discovery server, which seems to find all the services correctly. The service in question is registered as 28a41511677e:users-service:8082.
Docker compose:
version: "3.7"
services:
  db:
    build: db/
    expose:
      - 5432
  registry:
    build: registryservice/
    expose:
      - 8761
    ports:
      - 8761:8761
  gateway:
    build: gatewayservice/
    expose:
      - 8080
    depends_on:
      - registry
  users:
    build: usersservice/
    expose:
      - 8082
    depends_on:
      - registry
      - db
  timetable:
    build: timetableservice/
    expose:
      - 8081
    depends_on:
      - registry
      - db
  ui:
    build: frontend/
    expose:
      - 80
    ports:
      - 80:80
    depends_on:
      - gateway
Gateway implementation:
@EnableDiscoveryClient
@SpringBootApplication
public class GatewayserviceApplication {

    @Bean
    public RouteLocator customRouteLocator(RouteLocatorBuilder builder) {
        return builder.routes()
                .route("users-service", p -> p.path("/user/**")
                        .uri("lb://users-service"))
                .route("timetable-service", p -> p.path("/routes/**")
                        .uri("lb://timetable-service"))
                .build();
    }

    public static void main(String[] args) {
        SpringApplication.run(GatewayserviceApplication.class, args);
    }
}
Gateway settings:
spring:
  application:
    name: gateway-service
  cloud:
    gateway:
      globalcors:
        cors-configurations:
          '[/**]':
            allowedOrigins: "*"
            allowedMethods:
              - GET
              - POST
              - PUT
              - DELETE
eureka:
  client:
    service-url:
      defaultZone: http://registry:8761/eureka
Users service controller:
@RestController
@CrossOrigin
@RequestMapping("/user")
public class UserController {

    // Logger added so the snippet compiles; it is used in login() below.
    private static final Logger logger = LoggerFactory.getLogger(UserController.class);

    private UserService userService;

    @Autowired
    public UserController(UserService userService) {
        this.userService = userService;
    }

    @PostMapping(path = "/login")
    ResponseEntity<Long> login(@RequestBody LoginDto loginDto) {
        logger.info("Logging in user");
        Long uid = userService.logIn(loginDto);
        return new ResponseEntity<>(uid, HttpStatus.OK);
    }
}
Edit:
This also happens on the NPM dev server. I tried changing lb://users-service to http://users:8082, with no success; I am still getting ERR_NAME_NOT_RESOLVED.
I did, however, find that when I call the endpoint, the following output can be seen in the log:
gateway_1 | 2019-05-19 23:55:10.842 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Disable delta property : false
gateway_1 | 2019-05-19 23:55:10.866 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
gateway_1 | 2019-05-19 23:55:10.867 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
gateway_1 | 2019-05-19 23:55:10.868 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application is null : false
gateway_1 | 2019-05-19 23:55:10.868 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
gateway_1 | 2019-05-19 23:55:10.869 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application version is -1: false
gateway_1 | 2019-05-19 23:55:10.871 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
gateway_1 | 2019-05-19 23:55:11.762 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : The response status is 200
users_1 | 2019-05-19 21:55:19.268 INFO 1 --- [nio-8082-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring DispatcherServlet 'dispatcherServlet'
users_1 | 2019-05-19 21:55:19.273 INFO 1 --- [nio-8082-exec-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'
users_1 | 2019-05-19 21:55:19.513 INFO 1 --- [nio-8082-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 239 ms
users_1 | 2019-05-19 21:55:20.563 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Disable delta property : false
users_1 | 2019-05-19 21:55:20.565 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
users_1 | 2019-05-19 21:55:20.565 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
users_1 | 2019-05-19 21:55:20.566 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application is null : false
users_1 | 2019-05-19 21:55:20.566 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
users_1 | 2019-05-19 21:55:20.566 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Application version is -1: false
users_1 | 2019-05-19 21:55:20.567 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
users_1 | 2019-05-19 21:55:20.958 INFO 1 --- [freshExecutor-0] com.netflix.discovery.DiscoveryClient : The response status is 200
Edit 2:
I enabled logging for the gateway service and this is the output whenever I call /user/login. According to the logs, the gateway matches /user/login correctly, but then starts using just /login for some reason.
2019-05-20 12:58:47.002 DEBUG 1 --- [or-http-epoll-2] r.n.http.server.HttpServerOperations : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] New http connection, requesting read
2019-05-20 12:58:47.025 DEBUG 1 --- [or-http-epoll-2] reactor.netty.channel.BootstrapHandlers : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Initialized pipeline DefaultChannelPipeline{(BootstrapHandlers$BootstrapInitializerHandler#0 = reactor.netty.channel.BootstrapHandlers$BootstrapInitializerHandler), (reactor.left.httpCodec = io.netty.handler.codec.http.HttpServerCodec), (reactor.left.httpTrafficHandler = reactor.netty.http.server.HttpTrafficHandler), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}
2019-05-20 12:58:47.213 DEBUG 1 --- [or-http-epoll-2] r.n.http.server.HttpServerOperations : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Increasing pending responses, now 1
2019-05-20 12:58:47.242 DEBUG 1 --- [or-http-epoll-2] reactor.netty.http.server.HttpServer : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Handler is being applied: org.springframework.http.server.reactive.ReactorHttpHandlerAdapter#575e590e
2019-05-20 12:58:47.379 TRACE 1 --- [or-http-epoll-2] o.s.c.g.f.WeightCalculatorWebFilter : Weights attr: {}
2019-05-20 12:58:47.817 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition CompositeDiscoveryClient_USERS-SERVICE applying {pattern=/USERS-SERVICE/**} to Path
2019-05-20 12:58:47.952 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition CompositeDiscoveryClient_USERS-SERVICE applying filter {regexp=/USERS-SERVICE/(?<remaining>.*), replacement=/${remaining}} to RewritePath
2019-05-20 12:58:47.960 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition matched: CompositeDiscoveryClient_USERS-SERVICE
2019-05-20 12:58:47.961 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition CompositeDiscoveryClient_GATEWAY-SERVICE applying {pattern=/GATEWAY-SERVICE/**} to Path
2019-05-20 12:58:47.964 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition CompositeDiscoveryClient_GATEWAY-SERVICE applying filter {regexp=/GATEWAY-SERVICE/(?<remaining>.*), replacement=/${remaining}} to RewritePath
2019-05-20 12:58:47.968 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.r.RouteDefinitionRouteLocator : RouteDefinition matched: CompositeDiscoveryClient_GATEWAY-SERVICE
2019-05-20 12:58:47.979 TRACE 1 --- [or-http-epoll-2] o.s.c.g.h.p.RoutePredicateFactory : Pattern "/user/**" matches against value "/user/login"
2019-05-20 12:58:47.980 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.h.RoutePredicateHandlerMapping : Route matched: users-service
2019-05-20 12:58:47.981 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.h.RoutePredicateHandlerMapping : Mapping [Exchange: POST http://gateway:8080/user/login] to Route{id='users-service', uri=lb://users-service, order=0, predicate=org.springframework.cloud.gateway.support.ServerWebExchangeUtils$$Lambda$333/0x000000084035ac40#276b060f, gatewayFilters=[]}
2019-05-20 12:58:47.981 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.h.RoutePredicateHandlerMapping : [ff6d8305] Mapped to org.springframework.cloud.gateway.handler.FilteringWebHandler#4faea64b
2019-05-20 12:58:47.994 DEBUG 1 --- [or-http-epoll-2] o.s.c.g.handler.FilteringWebHandler : Sorted gatewayFilterFactories: [OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.AdaptCachedBodyGlobalFilter#773f7880}, order=-2147482648}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.NettyWriteResponseFilter#65a4798f}, order=-1}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.ForwardPathFilter#4c51bb7}, order=0}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.RouteToRequestUrlFilter#878452d}, order=10000}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.LoadBalancerClientFilter#4f2613d1}, order=10100}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.WebsocketRoutingFilter#83298d7}, order=2147483646}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.NettyRoutingFilter#6d24ffa1}, order=2147483647}, OrderedGatewayFilter{delegate=GatewayFilterAdapter{delegate=org.springframework.cloud.gateway.filter.ForwardRoutingFilter#426b6a74}, order=2147483647}]
2019-05-20 12:58:47.996 TRACE 1 --- [or-http-epoll-2] o.s.c.g.filter.RouteToRequestUrlFilter : RouteToRequestUrlFilter start
2019-05-20 12:58:47.999 TRACE 1 --- [or-http-epoll-2] o.s.c.g.filter.LoadBalancerClientFilter : LoadBalancerClientFilter url before: lb://users-service/user/login
2019-05-20 12:58:48.432 INFO 1 --- [or-http-epoll-2] c.netflix.config.ChainedDynamicProperty : Flipping property: users-service.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2019-05-20 12:58:48.492 INFO 1 --- [or-http-epoll-2] c.n.u.concurrent.ShutdownEnabledTimer : Shutdown hook installed for: NFLoadBalancer-PingTimer-users-service
2019-05-20 12:58:48.496 INFO 1 --- [or-http-epoll-2] c.netflix.loadbalancer.BaseLoadBalancer : Client: users-service instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=users-service,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null
2019-05-20 12:58:48.506 INFO 1 --- [or-http-epoll-2] c.n.l.DynamicServerListLoadBalancer : Using serverListUpdater PollingServerListUpdater
2019-05-20 12:58:48.543 INFO 1 --- [or-http-epoll-2] c.netflix.config.ChainedDynamicProperty : Flipping property: users-service.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2019-05-20 12:58:48.555 INFO 1 --- [or-http-epoll-2] c.n.l.DynamicServerListLoadBalancer : DynamicServerListLoadBalancer for client users-service initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=users-service,current list of Servers=[157e1f567371:8082],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:157e1f567371:8082; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 01:00:00 CET 1970; First connection made: Thu Jan 01 01:00:00 CET 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList#3cd9b0bf
2019-05-20 12:58:48.580 TRACE 1 --- [or-http-epoll-2] o.s.c.g.filter.LoadBalancerClientFilter : LoadBalancerClientFilter url chosen: http://157e1f567371:8082/user/login
2019-05-20 12:58:48.632 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : Creating new client pool [proxy] for 157e1f567371:8082
2019-05-20 12:58:48.646 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439] Created new pooled channel, now 0 active connections and 1 inactive connections
2019-05-20 12:58:48.651 DEBUG 1 --- [or-http-epoll-2] reactor.netty.channel.BootstrapHandlers : [id: 0xa9634439] Initialized pipeline DefaultChannelPipeline{(BootstrapHandlers$BootstrapInitializerHandler#0 = reactor.netty.channel.BootstrapHandlers$BootstrapInitializerHandler), (SimpleChannelPool$1#0 = io.netty.channel.pool.SimpleChannelPool$1), (reactor.left.httpCodec = io.netty.handler.codec.http.HttpClientCodec), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}
2019-05-20 12:58:48.673 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] onStateChange(PooledConnection{channel=[id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082]}, [connected])
2019-05-20 12:58:48.679 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] onStateChange(GET{uri=/, connection=PooledConnection{channel=[id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082]}}, [configured])
2019-05-20 12:58:48.682 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Registering pool release on close event for channel
2019-05-20 12:58:48.690 DEBUG 1 --- [or-http-epoll-2] r.netty.http.client.HttpClientConnect : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Handler is being applied: {uri=http://157e1f567371:8082/user/login, method=POST}
2019-05-20 12:58:48.701 DEBUG 1 --- [or-http-epoll-2] r.n.channel.ChannelOperationsHandler : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] New sending options
2019-05-20 12:58:48.720 DEBUG 1 --- [or-http-epoll-2] r.n.channel.ChannelOperationsHandler : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Writing object DefaultHttpRequest(decodeResult: success, version: HTTP/1.1)
POST /user/login HTTP/1.1
content-length: 37
accept-language: cs-CZ,cs;q=0.9,en;q=0.8
referer: http://localhost/user/login
cookie: JSESSIONID=6797219EB79F6026BD8F19E9C46C09DB
accept: application/json, text/plain, */*
user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36
content-type: application/json;charset=UTF-8
origin: http://gateway:8080
accept-encoding: gzip, deflate, br
Forwarded: proto=http;host="gateway:8080";for="172.19.0.7:42958"
X-Forwarded-For: 172.19.0.1,172.19.0.7
X-Forwarded-Proto: http,http
X-Forwarded-Port: 80,8080
X-Forwarded-Host: localhost,gateway:8080
host: 157e1f567371:8082
2019-05-20 12:58:48.751 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Channel connected, now 1 active connections and 0 inactive connections
2019-05-20 12:58:48.759 DEBUG 1 --- [or-http-epoll-2] r.n.channel.ChannelOperationsHandler : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Writing object
2019-05-20 12:58:48.762 DEBUG 1 --- [or-http-epoll-2] reactor.netty.channel.FluxReceive : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Subscribing inbound receiver [pending: 1, cancelled:false, inboundDone: true]
2019-05-20 12:58:48.808 DEBUG 1 --- [or-http-epoll-2] r.n.channel.ChannelOperationsHandler : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Writing object EmptyLastHttpContent
2019-05-20 12:58:48.809 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] onStateChange(POST{uri=/user/login, connection=PooledConnection{channel=[id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082]}}, [request_sent])
2019-05-20 12:58:49.509 INFO 1 --- [erListUpdater-0] c.netflix.config.ChainedDynamicProperty : Flipping property: users-service.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2019-05-20 12:58:49.579 DEBUG 1 --- [or-http-epoll-2] r.n.http.client.HttpClientOperations : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Received response (auto-read:false) : [Set-Cookie=JSESSIONID=7C47A99C1F416F910AB554F4617247D6; Path=/; HttpOnly, X-Content-Type-Options=nosniff, X-XSS-Protection=1; mode=block, Cache-Control=no-cache, no-store, max-age=0, must-revalidate, Pragma=no-cache, Expires=0, X-Frame-Options=DENY, Location=http://157e1f567371:8082/login, Content-Length=0, Date=Mon, 20 May 2019 10:58:49 GMT]
2019-05-20 12:58:49.579 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] onStateChange(POST{uri=/user/login, connection=PooledConnection{channel=[id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082]}}, [response_received])
2019-05-20 12:58:49.581 TRACE 1 --- [or-http-epoll-2] o.s.c.g.filter.NettyWriteResponseFilter : NettyWriteResponseFilter start
2019-05-20 12:58:49.586 DEBUG 1 --- [or-http-epoll-2] reactor.netty.channel.FluxReceive : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Subscribing inbound receiver [pending: 0, cancelled:false, inboundDone: false]
2019-05-20 12:58:49.586 DEBUG 1 --- [or-http-epoll-2] r.n.http.client.HttpClientOperations : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Received last HTTP packet
2019-05-20 12:58:49.593 DEBUG 1 --- [or-http-epoll-2] r.n.channel.ChannelOperationsHandler : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Writing object DefaultFullHttpResponse(decodeResult: success, version: HTTP/1.1, content: EmptyByteBufBE)
HTTP/1.1 302 Found
Set-Cookie: JSESSIONID=7C47A99C1F416F910AB554F4617247D6; Path=/; HttpOnly
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
X-Frame-Options: DENY
Location: http://157e1f567371:8082/login
Date: Mon, 20 May 2019 10:58:49 GMT
content-length: 0
2019-05-20 12:58:49.595 DEBUG 1 --- [or-http-epoll-2] r.n.http.server.HttpServerOperations : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Detected non persistent http connection, preparing to close
2019-05-20 12:58:49.595 DEBUG 1 --- [or-http-epoll-2] r.n.http.server.HttpServerOperations : [id: 0xff6d8305, L:/172.19.0.4:8080 - R:/172.19.0.7:42958] Last Http packet was sent, terminating channel
2019-05-20 12:58:49.598 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] onStateChange(POST{uri=/user/login, connection=PooledConnection{channel=[id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082]}}, [disconnecting])
2019-05-20 12:58:49.598 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Releasing channel
2019-05-20 12:58:49.598 DEBUG 1 --- [or-http-epoll-2] r.n.resources.PooledConnectionProvider : [id: 0xa9634439, L:/172.19.0.4:59624 - R:157e1f567371/172.19.0.5:8082] Channel cleaned, now 0 active connections and 1 inactive connections
I managed to fix it. The problem was actually not in the gateway; it was in the users service. It had an improper security configuration and required a login when accessing its endpoints, so when I called any endpoint, the request got redirected to /login.
I added the following code to the service and it works properly now.
@Configuration
public class SecurityConfig extends WebSecurityConfigurerAdapter {

    @Bean
    public PasswordEncoder passwordEncoder() {
        return new BCryptPasswordEncoder();
    }

    @Override
    protected void configure(HttpSecurity httpSecurity) throws Exception {
        httpSecurity.authorizeRequests().antMatchers("/").permitAll();
        httpSecurity.cors().and().csrf().disable();
    }

    @Bean
    CorsConfigurationSource corsConfigurationSource() {
        CorsConfiguration configuration = new CorsConfiguration();
        configuration.setAllowedOrigins(Arrays.asList("*"));
        configuration.setAllowedMethods(Arrays.asList("*"));
        configuration.setAllowedHeaders(Arrays.asList("*"));
        configuration.setAllowCredentials(true);
        UrlBasedCorsConfigurationSource source = new UrlBasedCorsConfigurationSource();
        source.registerCorsConfiguration("/**", configuration);
        return source;
    }
}
That's probably not a proper solution, but for non-production code it gets the job done.
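If you want something a bit tighter than permitting everything, here is a sketch of an alternative configure method (assuming /user/login is the only endpoint that needs to stay open; that path is taken from the controller above):
@Override
protected void configure(HttpSecurity httpSecurity) throws Exception {
    // Keep CORS enabled and CSRF disabled as before, but only open the login endpoint.
    httpSecurity
        .cors().and().csrf().disable()
        .authorizeRequests()
            .antMatchers("/user/login").permitAll()
            .anyRequest().authenticated();
}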
I created a Spring Boot app with @EnableSidecar and I see that routes for the registered Eureka clients "offers" and "customers" are created (Mapped URL path [/customers/**] onto ...) and the routes show up on the http://localhost:9000/routes endpoint:
{
  "_links": {
    "self": {
      "href": "http://localhost:8090/routes"
    }
  },
  "/customers/**": "customers",
  "/zuul-proxy/**": "zuul-proxy",
  "/offers/**": "offers"
}
When accessing http://localhost:9000/customers on the zuul-proxy in the browser, I get a 404 though.
My application.yml
spring:
  application:
    name: zuul-proxy
server:
  port: 9000
eureka:
  client:
    service-url:
      defaultZone: http://localhost:8761/eureka/
ZuulProxyApplication.java:
@EnableSidecar
@SpringBootApplication
public class ZuulProxyApplication {

    public static void main(String[] args) {
        SpringApplication.run(ZuulProxyApplication.class, args);
    }
}
Log output:
2015-09-11 23:33:09.236 INFO 42750 --- [ main] s.b.c.e.t.TomcatEmbeddedServletContainer : Tomcat started on port(s): 8090 (http)
2015-09-11 23:33:09.237 INFO 42750 --- [ main] c.n.e.EurekaDiscoveryClientConfiguration : Registering application zuul-proxy with eureka with status UP
2015-09-11 23:33:09.238 INFO 42750 --- [ main] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [current=UP, previous=STARTING]
2015-09-11 23:33:09.241 INFO 42750 --- [ main] o.s.c.n.zuul.web.ZuulHandlerMapping : Mapped URL path [/customers/**] onto handler of type [class org.springframework.cloud.netflix.zuul.web.ZuulController]
2015-09-11 23:33:09.242 INFO 42750 --- [ main] o.s.c.n.zuul.web.ZuulHandlerMapping : Mapped URL path [/offers/**] onto handler of type [class org.springframework.cloud.netflix.zuul.web.ZuulController]
2015-09-11 23:33:09.245 INFO 42750 --- [ main] ch.local.zuul.ZuulProxyApplication : Started ZuulProxyApplication in 8.637 seconds (JVM running for 9.26)
2015-09-11 23:33:09.245 INFO 42750 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_ZUUL-PROXY/192.168.0.108: registering service...
2015-09-11 23:33:09.294 INFO 42750 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_ZUUL-PROXY/192.168.0.108 - registration status: 204
2015-09-11 23:33:14.545 INFO 42750 --- [nio-8090-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring FrameworkServlet 'dispatcherServlet'
2015-09-11 23:33:14.545 INFO 42750 --- [nio-8090-exec-1] o.s.web.servlet.DispatcherServlet : FrameworkServlet 'dispatcherServlet': initialization started
2015-09-11 23:33:14.572 INFO 42750 --- [nio-8090-exec-1] o.s.web.servlet.DispatcherServlet : FrameworkServlet 'dispatcherServlet': initialization completed in 27 ms
2015-09-11 23:33:14.616 INFO 42750 --- [nio-8090-exec-1] o.s.c.n.zuul.filters.ProxyRouteLocator : Finding route for path: /customers
2015-09-11 23:33:14.626 INFO 42750 --- [nio-8090-exec-1] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext#7442808: startup date [Fri Sep 11 23:33:14 CEST 2015]; parent: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#3a60c416
2015-09-11 23:33:14.646 INFO 42750 --- [nio-8090-exec-1] f.a.AutowiredAnnotationBeanPostProcessor : JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
2015-09-11 23:33:14.759 INFO 42750 --- [nio-8090-exec-1] c.netflix.config.ChainedDynamicProperty : Flipping property: customers.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2015-09-11 23:33:14.802 INFO 42750 --- [nio-8090-exec-1] c.n.u.concurrent.ShutdownEnabledTimer : Shutdown hook installed for: NFLoadBalancer-PingTimer-customers
2015-09-11 23:33:14.829 INFO 42750 --- [nio-8090-exec-1] c.netflix.loadbalancer.BaseLoadBalancer : Client:customers instantiated a LoadBalancer:DynamicServerListLoadBalancer:{NFLoadBalancer:name=customers,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null
2015-09-11 23:33:14.855 INFO 42750 --- [nio-8090-exec-1] c.netflix.config.ChainedDynamicProperty : Flipping property: customers.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2015-09-11 23:33:14.858 INFO 42750 --- [nio-8090-exec-1] c.n.l.DynamicServerListLoadBalancer : DynamicServerListLoadBalancer for client customers initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=customers,current list of Servers=[192.168.0.108:customers - 8083, 192.168.0.108:customers - 8081],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:2; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:192.168.0.108:customers - 8081; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 01:00:00 CET 1970; First connection made: Thu Jan 01 01:00:00 CET 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
, [Server:192.168.0.108:customers - 8083; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 01:00:00 CET 1970; First connection made: Thu Jan 01 01:00:00 CET 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList#55d7aafe
2015-09-11 23:33:14.941 INFO 42750 --- [nio-8090-exec-1] com.netflix.http4.ConnectionPoolCleaner : Initializing ConnectionPoolCleaner for NFHttpClient:customers
2015-09-11 23:33:15.838 INFO 42750 --- [ool-10-thread-1] c.netflix.config.ChainedDynamicProperty : Flipping property: customers.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2015-09-11 23:33:19.287 INFO 42750 --- [nio-8090-exec-2] o.s.c.n.zuul.filters.ProxyRouteLocator : Finding route for path: /customers
2015-09-11 23:33:35.780 INFO 42750 --- [pool-3-thread-1] o.s.c.n.zuul.web.ZuulHandlerMapping : Mapped URL path [/zuul-proxy/**] onto handler of type [class org.springframework.cloud.netflix.zuul.web.ZuulController]
I see the customers and offers instances registered in Eureka. I can also access them via a RestTemplate in the zuul-proxy app, like:
@RestController
public class TestController {

    @Autowired
    private RestTemplate restTemplate;

    @RequestMapping("/test")
    public Customer test() {
        return restTemplate.getForObject("http://customers/customers/1", Customer.class);
    }
}
This is working when I call http://localhost:9000/test
What could be the problem, given that I get a 404 for the registered routes http://localhost:9000/customers and http://localhost:9000/offers?