Reactor GroupBy with Parallelism Runs on same thread - spring

I'm trying to achieve parallelism per group: groups should be processed in parallel, while the elements within each group are processed sequentially. However, with the code below, the first batch of emissions uses the parallel threads, but subsequent emissions run on a different thread pool. How can I achieve parallelism across groups and sequential execution within each group?
public class ReactorTest implements SmartLifecycle, ApplicationListener<ApplicationReadyEvent> {

    private AtomicInteger counter = new AtomicInteger(1);
    private Many<Integer> healthSink;
    private Disposable dispose;
    private ScheduledExecutorService executor;

    @Override
    public void start() {
        executor = Executors.newSingleThreadScheduledExecutor();
        healthSink = Sinks.many().unicast().onBackpressureBuffer();
        dispose = healthSink.asFlux().groupBy(v -> v % 3 == 0).parallel(10)
                .runOn(Schedulers.newBoundedElastic(10, 100, "k-task")).log().flatMap(v -> v)
                .subscribe(v -> log.info("Data {}", v));
    }

    @Override
    public void stop() {
        executor.shutdownNow();
        if (dispose != null) {
            dispose.dispose();
        }
    }

    @Override
    public boolean isRunning() {
        return executor == null ? false : !executor.isShutdown();
    }

    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        executor.scheduleAtFixedRate(() -> {
            healthSink.tryEmitNext(counter.incrementAndGet());
            healthSink.tryEmitNext(counter.incrementAndGet());
            healthSink.tryEmitNext(counter.incrementAndGet());
        }, 10, 10, TimeUnit.SECONDS);
    }
}
log
2021-07-27 14:15:34.189 INFO 22212 --- [ restartedMain] i.g.kprasad99.reactor.DemoApplication : Started DemoApplication in 1.464 seconds (JVM running for 1.795)
2021-07-27 14:15:44.206 INFO 22212 --- [ k-task-1] reactor.Parallel.RunOn.1 : onNext(UnicastGroupedFlux)
2021-07-27 14:15:44.207 INFO 22212 --- [ k-task-2] reactor.Parallel.RunOn.1 : onNext(UnicastGroupedFlux)
2021-07-27 14:15:44.207 INFO 22212 --- [ k-task-1] io.github.kprasad99.reactor.ReactorTest : Data 2
2021-07-27 14:15:44.207 INFO 22212 --- [ k-task-2] io.github.kprasad99.reactor.ReactorTest : Data 3
2021-07-27 14:15:44.207 INFO 22212 --- [ k-task-1] io.github.kprasad99.reactor.ReactorTest : Data 4
2021-07-27 14:15:54.200 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 5
2021-07-27 14:15:54.200 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 6
2021-07-27 14:15:54.200 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 7
2021-07-27 14:16:04.195 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 8
2021-07-27 14:16:04.195 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 9
2021-07-27 14:16:04.195 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 10
2021-07-27 14:16:14.206 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 11
2021-07-27 14:16:14.206 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 12
2021-07-27 14:16:14.206 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 13
2021-07-27 14:16:24.197 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 14
2021-07-27 14:16:24.197 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 15
2021-07-27 14:16:24.197 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 16
2021-07-27 14:16:34.196 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 17
2021-07-27 14:16:34.196 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 18
2021-07-27 14:16:34.196 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 19
2021-07-27 14:16:44.201 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 20
2021-07-27 14:16:44.201 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 21
2021-07-27 14:16:44.201 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 22
2021-07-27 14:16:54.201 INFO 22212 --- [pool-3-thread-1] io.github.kprasad99.reactor.ReactorTest : Data 23

You need to put the .parallel(..) after the .flatMap(..) operator:
Flux.interval(Duration.ofMillis(100))
.take(10)
.groupBy(v -> v % 2 == 0)
.flatMap(f -> f)
.parallel(2)
.runOn(Schedulers.newBoundedElastic(2, 10, "k-task"))
.subscribe(i -> log.info("Data {}", i));
Result:
10:32:33.377 [k-task-1] INFO Data 0
10:32:33.466 [k-task-2] INFO Data 1
10:32:33.562 [k-task-1] INFO Data 2
10:32:33.673 [k-task-2] INFO Data 3
10:32:33.766 [k-task-1] INFO Data 4
10:32:33.860 [k-task-2] INFO Data 5
10:32:33.971 [k-task-1] INFO Data 6
10:32:34.065 [k-task-2] INFO Data 7
10:32:34.163 [k-task-1] INFO Data 8
10:32:34.268 [k-task-2] INFO Data 9
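If the requirement is strictly "groups run in parallel, elements within a group stay in order", another pattern worth noting is to drop ParallelFlux and pin each GroupedFlux to its own worker with publishOn: flatMap subscribes to the groups concurrently, while publishOn preserves the ordering inside each group. The sketch below only illustrates that idea (the class name, the range(1, 20) source and the modulo-3 key are made up for the demo); it is not part of the original answer:
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Scheduler;
import reactor.core.scheduler.Schedulers;

public class GroupedParallelismSketch {
    public static void main(String[] args) {
        Scheduler scheduler = Schedulers.newBoundedElastic(10, 100, "k-task");

        Flux.range(1, 20)
            .groupBy(v -> v % 3)                 // one GroupedFlux per key
            .flatMap(group -> group
                .publishOn(scheduler)            // each group gets its own worker, elements stay in order
                .doOnNext(v -> System.out.printf("group %s -> %s on %s%n",
                        group.key(), v, Thread.currentThread().getName())))
            .blockLast();                        // block only for this demo
    }
}
Any Scheduler that hands out independent workers would do here; newBoundedElastic is used only to mirror the question.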

Related

Inside docker container Eureka doesn't see the defaultZone parameter value in properties.yml

There are no problems if I launch the apps without containers. My temporary solution is to pass the parameter through an environment variable, but I can't understand why it works like this. Could somebody please explain what I am doing wrong?
application-docker.yml (api-gateway)
server:
  port: 8080
spring:
  application:
    name: api-gateway
  cloud:
    gateway:
      routes:
        - id: customer
          uri: lb://CUSTOMER
          predicates:
            - Path=/api/v1/customers/**
  zipkin:
    base-url: http://zipkin:9411/
eureka:
  client:
    serviceUrl:
      defaultZone: http://eureka-server:8761/eureka/
    fetch-registry: true
    register-with-eureka: true
application-docker.yml (eureka-server)
server:
  port: 8761
spring:
  application:
    name: eureka-server
  zipkin:
    base-url: http://zipkin:9411/
eureka:
  client:
    fetch-registry: false
    register-with-eureka: false
docker-compose.yaml
services:
  postgres:
    container_name: postgres
    image: postgres
    environment:
      POSTGRES_USER: *
      POSTGRES_PASSWORD: *
      PGDATA: /data/postgres
    volumes:
      - postgres:/data/postgres
    ports:
      - "5432:5432"
    networks:
      - postgres
    restart: unless-stopped
  zipkin:
    image: openzipkin/zipkin
    container_name: zipkin
    ports:
      - "9411:9411"
    networks:
      - spring
  rabbitmq:
    image: rabbitmq:3.9.11-management-alpine
    container_name: rabbitmq
    ports:
      - "5672:5672"
      - "15672:15672"
    networks:
      - spring
  eureka-server:
    image: ftmpt/eureka-server:latest
    container_name: eureka-server
    environment:
      - SPRING_PROFILES_ACTIVE=docker
    ports:
      - "8761:8761"
    networks:
      - spring
    depends_on:
      - zipkin
  api-gateway:
    image: ftmpt/api-gateway:latest
    container_name: api-gateway
    environment:
      - SPRING_PROFILES_ACTIVE=docker
      # - eureka.client.service-url.defaultZone=http://eureka-server:8761/eureka
    ports:
      - "8080:8080"
    networks:
      - spring
      - postgres
    depends_on:
      - eureka-server
      - zipkin
networks:
  postgres:
    driver: bridge
  spring:
    driver: bridge
volumes:
  postgres:
logs from api-gateway container:
Powered by Spring Boot 2.5.7
2022-10-03 16:19:28.632 INFO [api-gateway,,] 1 --- [ main] e.fatumepta.apigw.ApiGatewayApplication : Starting ApiGatewayApplication using Java 17.0.1 on f492a34f31a2 with PID 1 (/app/classes started by root in /)
2022-10-03 16:19:28.640 INFO [api-gateway,,] 1 --- [ main] e.fatumepta.apigw.ApiGatewayApplication : The following profiles are active: docker
2022-10-03 16:19:31.883 INFO [api-gateway,,] 1 --- [ main] o.s.cloud.context.scope.GenericScope : BeanFactory id=7282d2ab-5b80-3882-a881-0bf42ac5f5e5
2022-10-03 16:19:32.624 INFO [api-gateway,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.cloud.client.loadbalancer.reactive.LoadBalancerBeanPostProcessorAutoConfiguration' of type [org.springframework.cloud.client.loadbalancer.reactive.LoadBalancerBeanPostProcessorAutoConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2022-10-03 16:19:32.638 INFO [api-gateway,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.cloud.client.loadbalancer.reactive.LoadBalancerBeanPostProcessorAutoConfiguration$ReactorDeferringLoadBalancerFilterConfig' of type [org.springframework.cloud.client.loadbalancer.reactive.LoadBalancerBeanPostProcessorAutoConfiguration$ReactorDeferringLoadBalancerFilterConfig] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2022-10-03 16:19:32.647 INFO [api-gateway,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'reactorDeferringLoadBalancerExchangeFilterFunction' of type [org.springframework.cloud.client.loadbalancer.reactive.DeferringLoadBalancerExchangeFilterFunction] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [After]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Before]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Between]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Cookie]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Header]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Host]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Method]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Path]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Query]
2022-10-03 16:19:34.880 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [ReadBody]
2022-10-03 16:19:34.881 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [RemoteAddr]
2022-10-03 16:19:34.881 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [Weight]
2022-10-03 16:19:34.881 INFO [api-gateway,,] 1 --- [ main] o.s.c.g.r.RouteDefinitionRouteLocator : Loaded RoutePredicateFactory [CloudFoundryRouteService]
2022-10-03 16:19:36.003 INFO [api-gateway,,] 1 --- [ main] DiscoveryClientOptionalArgsConfiguration : Eureka HTTP Client uses RestTemplate.
2022-10-03 16:19:36.316 WARN [api-gateway,,] 1 --- [ main] iguration$LoadBalancerCaffeineWarnLogger : Spring Cloud LoadBalancer is currently working with the default cache. You can switch to using Caffeine cache, by adding it and org.springframework.cache.caffeine.CaffeineCacheManager to the classpath.
2022-10-03 16:19:36.509 INFO [api-gateway,,] 1 --- [ main] o.s.c.n.eureka.InstanceInfoFactory : Setting initial instance status as: STARTING
2022-10-03 16:19:36.782 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Initializing Eureka in region us-east-1
2022-10-03 16:19:36.800 INFO [api-gateway,,] 1 --- [ main] c.n.d.s.r.aws.ConfigClusterResolver : Resolving eureka endpoints via configuration
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Disable delta property : false
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Application is null : false
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Application version is -1: true
2022-10-03 16:19:36.843 INFO [api-gateway,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
2022-10-03 16:19:37.584 INFO [api-gateway,,] 1 --- [ main] c.n.d.s.t.d.RedirectingEurekaHttpClient : Request execution error. endpoint=DefaultEndpoint{ serviceUrl='http://localhost:8761/eureka/}, exception=I/O error on GET request for "http://localhost:8761/eureka/apps/": Connect to localhost:8761 [localhost/127.0.0.1] failed: Connection refused; nested exception is org.apache.http.conn.HttpHostConnectException: Connect to localhost:8761 [localhost/127.0.0.1] failed: Connection refused stacktrace=org.springframework.web.client.ResourceAccessException: I/O error on GET request for "http://localhost:8761/eureka/apps/": Connect to localhost:8761 [localhost/127.0.0.1] failed: Connection refused; nested exception is org.apache.http.conn.HttpHostConnectException: Connect to localhost:8761 [localhost/127.0.0.1] failed: Connection refused
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:785)
at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:711)
at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:602)
at org.springframework.cloud.netflix.eureka.http.RestTemplateEurekaHttpClient.getApplicationsInternal(RestTemplateEurekaHttpClient.java:145)
at org.springframework.cloud.netflix.eureka.http.RestTemplateEurekaHttpClient.getApplications(RestTemplateEurekaHttpClient.java:135)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator$6.execute(EurekaHttpClientDecorator.java:137)
at com.netflix.discovery.shared.transport.decorator.RedirectingEurekaHttpClient.executeOnNewServer(RedirectingEurekaHttpClient.java:121)
at com.netflix.discovery.shared.transport.decorator.RedirectingEurekaHttpClient.execute(RedirectingEurekaHttpClient.java:80)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.getApplications(EurekaHttpClientDecorator.java:134)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator$6.execute(EurekaHttpClientDecorator.java:137)
at com.netflix.discovery.shared.transport.decorator.RetryableEurekaHttpClient.execute(RetryableEurekaHttpClient.java:120)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.getApplications(EurekaHttpClientDecorator.java:134)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator$6.execute(EurekaHttpClientDecorator.java:137)
at com.netflix.discovery.shared.transport.decorator.SessionedEurekaHttpClient.execute(SessionedEurekaHttpClient.java:77)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.getApplications(EurekaHttpClientDecorator.java:134)
at com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:1101)
at com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:1014)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:441)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:283)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:279)
at org.springframework.cloud.netflix.eureka.CloudEurekaClient.<init>(CloudEurekaClient.java:66)
at org.springframework.cloud.netflix.eureka.EurekaClientAutoConfiguration$RefreshableEurekaClientConfiguration.eurekaClient(EurekaClientAutoConfiguration.java:295)
eureka-server logs from container
2022-10-06 11:46:06.693 INFO [eureka-server,,] 1 --- [ main] e.f.eureka.EurekaServerApplication : Starting EurekaServerApplication using Java 17.0.1 on bf39ca283c65 with PID 1 (/app/classes started by root in /)
2022-10-06 11:46:06.727 INFO [eureka-server,,] 1 --- [ main] e.f.eureka.EurekaServerApplication : The following profiles are active: docker
2022-10-06 11:46:10.568 INFO [eureka-server,,] 1 --- [ main] o.s.cloud.context.scope.GenericScope : BeanFactory id=c87a0e94-d419-31a3-a4bc-1879ab705f9c
2022-10-06 11:46:12.271 INFO [eureka-server,,] 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8761 (http)
2022-10-06 11:46:12.311 INFO [eureka-server,,] 1 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2022-10-06 11:46:12.311 INFO [eureka-server,,] 1 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.55]
2022-10-06 11:46:12.604 INFO [eureka-server,,] 1 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2022-10-06 11:46:12.604 INFO [eureka-server,,] 1 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 5728 ms
2022-10-06 11:46:15.226 INFO [eureka-server,,] 1 --- [ main] c.s.j.s.i.a.WebApplicationImpl : Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:20 PM'
2022-10-06 11:46:15.401 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using JSON encoding codec LegacyJacksonJson
2022-10-06 11:46:15.402 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using JSON decoding codec LegacyJacksonJson
2022-10-06 11:46:15.755 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using XML encoding codec XStreamXml
2022-10-06 11:46:15.755 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using XML decoding codec XStreamXml
2022-10-06 11:46:18.003 INFO [eureka-server,,] 1 --- [ main] DiscoveryClientOptionalArgsConfiguration : Eureka HTTP Client uses Jersey
2022-10-06 11:46:18.261 WARN [eureka-server,,] 1 --- [ main] iguration$LoadBalancerCaffeineWarnLogger : Spring Cloud LoadBalancer is currently working with the default cache. You can switch to using Caffeine cache, by adding it and org.springframework.cache.caffeine.CaffeineCacheManager to the classpath.
2022-10-06 11:46:18.274 INFO [eureka-server,,] 1 --- [ main] o.s.c.n.eureka.InstanceInfoFactory : Setting initial instance status as: STARTING
2022-10-06 11:46:18.293 INFO [eureka-server,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Initializing Eureka in region us-east-1
2022-10-06 11:46:18.295 INFO [eureka-server,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Client configured to neither register nor query for data.
2022-10-06 11:46:18.300 INFO [eureka-server,,] 1 --- [ main] com.netflix.discovery.DiscoveryClient : Discovery Client initialized at timestamp 1665056778299 with initial instances count: 0
2022-10-06 11:46:18.396 INFO [eureka-server,,] 1 --- [ main] c.n.eureka.DefaultEurekaServerContext : Initializing ...
2022-10-06 11:46:18.405 INFO [eureka-server,,] 1 --- [ main] c.n.eureka.cluster.PeerEurekaNodes : Adding new peer nodes [http://localhost:8761/eureka/]
2022-10-06 11:46:18.857 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using JSON encoding codec LegacyJacksonJson
2022-10-06 11:46:18.857 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using JSON decoding codec LegacyJacksonJson
2022-10-06 11:46:18.857 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using XML encoding codec XStreamXml
2022-10-06 11:46:18.857 INFO [eureka-server,,] 1 --- [ main] c.n.d.provider.DiscoveryJerseyProvider : Using XML decoding codec XStreamXml
2022-10-06 11:46:19.589 INFO [eureka-server,,] 1 --- [ main] c.n.eureka.cluster.PeerEurekaNodes : Replica node URL: http://localhost:8761/eureka/
2022-10-06 11:46:19.608 INFO [eureka-server,,] 1 --- [ main] c.n.e.registry.AbstractInstanceRegistry : Finished initializing remote region registries. All known remote regions: []
2022-10-06 11:46:19.609 INFO [eureka-server,,] 1 --- [ main] c.n.eureka.DefaultEurekaServerContext : Initialized
2022-10-06 11:46:19.855 INFO [eureka-server,,] 1 --- [ main] o.s.b.a.e.web.EndpointLinksResolver : Exposing 1 endpoint(s) beneath base path '/actuator'
2022-10-06 11:46:19.953 INFO [eureka-server,,] 1 --- [ main] o.s.c.n.e.s.EurekaServiceRegistry : Registering application EUREKA-SERVER with eureka with status UP
2022-10-06 11:46:19.978 INFO [eureka-server,,] 1 --- [ Thread-9] o.s.c.n.e.server.EurekaServerBootstrap : Setting the eureka configuration..
2022-10-06 11:46:19.991 INFO [eureka-server,,] 1 --- [ Thread-9] o.s.c.n.e.server.EurekaServerBootstrap : isAws returned false
2022-10-06 11:46:20.008 INFO [eureka-server,,] 1 --- [ Thread-9] o.s.c.n.e.server.EurekaServerBootstrap : Initialized server context
2022-10-06 11:46:20.008 INFO [eureka-server,,] 1 --- [ Thread-9] c.n.e.r.PeerAwareInstanceRegistryImpl : Got 1 instances from neighboring DS node
2022-10-06 11:46:20.008 INFO [eureka-server,,] 1 --- [ Thread-9] c.n.e.r.PeerAwareInstanceRegistryImpl : Renew threshold is: 1
2022-10-06 11:46:20.008 INFO [eureka-server,,] 1 --- [ Thread-9] c.n.e.r.PeerAwareInstanceRegistryImpl : Changing status to UP
2022-10-06 11:46:20.034 INFO [eureka-server,,] 1 --- [ Thread-9] e.s.EurekaServerInitializerConfiguration : Started Eureka Server
2022-10-06 11:46:20.034 INFO [eureka-server,,] 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8761 (http) with context path ''
2022-10-06 11:46:20.035 INFO [eureka-server,,] 1 --- [ main] .s.c.n.e.s.EurekaAutoServiceRegistration : Updating port to 8761
2022-10-06 11:46:20.089 INFO [eureka-server,,] 1 --- [ main] e.f.eureka.EurekaServerApplication : Started EurekaServerApplication in 14.267 seconds (JVM running for 14.702)

How to make different instances of consumers in the same consumer group consume different shards of the same kinesis stream?

I'm following the example given in spring-cloud-stream-samples with the following modifications.
application.yml
spring:
  cloud:
    stream:
      instanceCount: 2
      bindings:
        produceOrder-out-0:
          destination: test_stream
          content-type: application/json
          producer:
            partitionCount: 2
            partitionSelectorName: eventPartitionSelectorStrategy
            partitionKeyExtractorName: eventPartitionKeyExtractorStrategy
        processOrder-in-0:
          group: eventConsumers
          destination: test_stream
          content-type: application/json
      function:
        definition: processOrder;produceOrder
ProducerConfiguration.java
package demo.config;

import demo.stream.Event;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.stream.binder.PartitionKeyExtractorStrategy;
import org.springframework.cloud.stream.binder.PartitionSelectorStrategy;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

@Configuration
public class ProducerConfiguration {

    private static Logger logger = LoggerFactory.getLogger(ProducerConfiguration.class);

    @Bean
    public PartitionSelectorStrategy eventPartitionSelectorStrategy() {
        return new PartitionSelectorStrategy() {
            @Override
            public int selectPartition(Object key, int partitionCount) {
                if (key instanceof Integer) {
                    int partition = (((Integer) key) % partitionCount + partitionCount) % partitionCount;
                    logger.info("key {} falls into partition {}", key, partition);
                    return partition;
                }
                return 0;
            }
        };
    }

    @Bean
    public PartitionKeyExtractorStrategy eventPartitionKeyExtractorStrategy() {
        return new PartitionKeyExtractorStrategy() {
            @Override
            public Object extractKey(Message<?> message) {
                if (message.getPayload() instanceof Event) {
                    return ((Event) message.getPayload()).hashCode();
                } else {
                    return 0;
                }
            }
        };
    }
}
When I run two instances of this application, setting --spring.cloud.stream.instanceIndex=0 and --spring.cloud.stream.instanceIndex=1, I can see the events being produced. However, only one of the instances consumes records, and it consumes from both partitions; the other instance consumes nothing, even though the producer is creating partitioned records.
Logs seen in KinesisProducer
2022-09-04 00:17:22.628 INFO 34029 --- [ main] a.i.k.KinesisMessageDrivenChannelAdapter : started KinesisMessageDrivenChannelAdapter{shardOffsets=[KinesisShardOffset{iteratorType=LATEST, sequenceNumber='null', timestamp=null, stream='test_stream', shard='shardId-000000000000', reset=false}], consumerGroup='eventConsumers'}
2022-09-04 00:17:22.658 INFO 34029 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 64398 (http) with context path ''
2022-09-04 00:17:22.723 INFO 34029 --- [ main] demo.KinesisApplication : Started KinesisApplication in 18.487 seconds (JVM running for 19.192)
2022-09-04 00:17:23.938 INFO 34029 --- [esis-consumer-1] a.i.k.KinesisMessageDrivenChannelAdapter : The [ShardConsumer{shardOffset=KinesisShardOffset{iteratorType=LATEST, sequenceNumber='null', timestamp=null, stream='test_stream', shard='shardId-000000000000', reset=false}, state=NEW}] has been started.
2022-09-04 00:17:55.222 INFO 34029 --- [io-64398-exec-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring DispatcherServlet 'dispatcherServlet'
2022-09-04 00:17:55.222 INFO 34029 --- [io-64398-exec-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'
2022-09-04 00:17:55.224 INFO 34029 --- [io-64398-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 2 ms
2022-09-04 00:17:55.598 INFO 34029 --- [io-64398-exec-1] demo.stream.OrdersSource : Event sent: Event [id=null, subject=Order [id=5fbaca2f-d947-423d-a1f1-b1c9c268d2d0, name=pen], type=ORDER, originator=KinesisProducer]
2022-09-04 00:17:56.337 INFO 34029 --- [ask-scheduler-3] demo.config.ProducerConfiguration : key 1397835167 falls into partition 1
2022-09-04 00:18:02.047 INFO 34029 --- [io-64398-exec-2] demo.stream.OrdersSource : Event sent: Event [id=null, subject=Order [id=83021259-89b5-4451-a0ec-da3152d37a58, name=pen], type=ORDER, originator=KinesisProducer]
2022-09-04 00:18:02.361 INFO 34029 --- [ask-scheduler-3] demo.config.ProducerConfiguration : key 147530256 falls into partition 0
Logs seen in KinesisConsumer
2022-09-04 00:17:28.050 INFO 34058 --- [ main] a.i.k.KinesisMessageDrivenChannelAdapter : started KinesisMessageDrivenChannelAdapter{shardOffsets=[KinesisShardOffset{iteratorType=LATEST, sequenceNumber='null', timestamp=null, stream='test_stream', shard='shardId-000000000001', reset=false}], consumerGroup='eventConsumers'}
2022-09-04 00:17:28.076 INFO 34058 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 64399 (http) with context path ''
2022-09-04 00:17:28.116 INFO 34058 --- [ main] demo.KinesisApplication : Started KinesisApplication in 18.566 seconds (JVM running for 19.839)
2022-09-04 00:17:29.365 INFO 34058 --- [esis-consumer-1] a.i.k.KinesisMessageDrivenChannelAdapter : The [ShardConsumer{shardOffset=KinesisShardOffset{iteratorType=AFTER_SEQUENCE_NUMBER, sequenceNumber='49632927200161141377996226513172299243826807332967284754', timestamp=null, stream='test_stream', shard='shardId-000000000001', reset=false}, state=NEW}] has been started.
2022-09-04 00:17:57.346 INFO 34058 --- [esis-consumer-1] demo.stream.OrderStreamConfiguration : An order has been placed from this service Event [id=null, subject=Order [id=5fbaca2f-d947-423d-a1f1-b1c9c268d2d0, name=pen], type=ORDER, originator=KinesisProducer]
2022-09-04 00:18:04.384 INFO 34058 --- [esis-consumer-1] demo.stream.OrderStreamConfiguration : An order has been placed from this service Event [id=null, subject=Order [id=83021259-89b5-4451-a0ec-da3152d37a58, name=pen], type=ORDER, originator=KinesisProducer]
spring-cloud-stream-binder-kinesis version : 2.2.0
I have the following questions:
For static shard distribution within a single consumer group, is there any other parameter that needs to be configured that I have missed?
Do I need to specify the DynamoDB checkpoint properties only for dynamic shard distribution?
EDIT
I have added the DEBUG logs seen in KinesisProducer below:
2022-09-07 08:30:38.120 INFO 4993 --- [io-64398-exec-1] demo.stream.OrdersSource : Event sent: Event [id=null, subject=Order [id=b3927132-a80d-481e-a219-dbd0c0c7d124, name=pen], type=ORDER, originator=KinesisProducer]
2022-09-07 08:30:38.806 INFO 4993 --- [ask-scheduler-3] demo.config.ProducerConfiguration : key 1842629003 falls into partition 1
2022-09-07 08:30:38.812 DEBUG 4993 --- [ask-scheduler-3] o.s.c.s.m.DirectWithAttributesChannel : preSend on channel 'bean 'produceOrder-out-0'', message: GenericMessage [payload=byte[126], headers={scst_partition=1, id=9cb8ec58-4a9e-7b6f-4263-c9d4d1eec906, contentType=application/json, timestamp=1662519638809}]
2022-09-07 08:30:38.813 DEBUG 4993 --- [ask-scheduler-3] tractMessageChannelBinder$SendingHandler : org.springframework.cloud.stream.binder.AbstractMessageChannelBinder$SendingHandler#63811d15 received message: GenericMessage [payload=byte[126], headers={scst_partition=1, scst_partitionOverride=0, id=731f444b-d3df-a51a-33de-8adf78e1e746, contentType=application/json, timestamp=1662519638813}]
2022-09-07 08:30:38.832 DEBUG 4993 --- [ask-scheduler-3] o.s.c.s.m.DirectWithAttributesChannel : postSend (sent=true) on channel 'bean 'produceOrder-out-0'', message: GenericMessage [payload=byte[126], headers={scst_partition=1, scst_partitionOverride=0, id=731f444b-d3df-a51a-33de-8adf78e1e746, contentType=application/json, timestamp=1662519638813}]
2022-09-07 08:35:51.153 INFO 4993 --- [io-64398-exec-2] demo.stream.OrdersSource : Event sent: Event [id=null, subject=Order [id=6a5b3084-11dc-4080-a80e-61cc73315139, name=pen], type=ORDER, originator=KinesisProducer]
2022-09-07 08:35:51.915 INFO 4993 --- [ask-scheduler-5] demo.config.ProducerConfiguration : key 1525662264 falls into partition 0
2022-09-07 08:35:51.916 DEBUG 4993 --- [ask-scheduler-5] o.s.c.s.m.DirectWithAttributesChannel : preSend on channel 'bean 'produceOrder-out-0'', message: GenericMessage [payload=byte[126], headers={scst_partition=0, id=115c5421-00f2-286d-de02-0020e9322a17, contentType=application/json, timestamp=1662519951916}]
2022-09-07 08:35:51.916 DEBUG 4993 --- [ask-scheduler-5] tractMessageChannelBinder$SendingHandler : org.springframework.cloud.stream.binder.AbstractMessageChannelBinder$SendingHandler#63811d15 received message: GenericMessage [payload=byte[126], headers={scst_partition=0, scst_partitionOverride=0, id=145be7e8-381f-af73-e430-9cb645ff785f, contentType=application/json, timestamp=1662519951916}]
2022-09-07 08:35:51.917 DEBUG 4993 --- [ask-scheduler-5] o.s.c.s.m.DirectWithAttributesChannel : postSend (sent=true) on channel 'bean 'produceOrder-out-0'', message: GenericMessage [payload=byte[126], headers={scst_partition=0, scst_partitionOverride=0, id=145be7e8-381f-af73-e430-9cb645ff785f, contentType=application/json, timestamp=1662519951916}]

application shutting down in spring boot

In a Spring Boot project's configuration file there is a task executor whose code looks like this:
@Bean(name = "asyncExec")
public Executor taskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(10);
    executor.setMaxPoolSize(50);
    executor.setQueueCapacity(500);
    executor.setThreadNamePrefix("CashFlowThread-");
    executor.initialize();
    return executor;
}
I am deploying an API which downloads from an S3 bucket, creates 4 PDFs, and stores them in the target folder. While the API is being called, the console shows that asyncExec is shutting down.
The stack trace shows:
Initializing Spring DispatcherServlet 'dispatcherServlet'
2020-12-01 17:04:30.174 INFO 3680 --- [nio-5000-exec-2] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'
2020-12-01 17:04:30.179 INFO 3680 --- [nio-5000-exec-2] o.s.web.servlet.DispatcherServlet : Completed initialization in 5 ms
2020-12-01 17:04:30.185 INFO 3680 --- [nio-5000-exec-2] com.zaxxer.hikari.HikariDataSource : HikariPool-17 - Starting...
2020-12-01 17:04:35.767 INFO 3680 --- [nio-5000-exec-2] com.zaxxer.hikari.HikariDataSource : HikariPool-17 - Start completed.
File is created!
Successfully obtained bytes from an S3 object
2020-12-01 17:04:43.907 INFO 3680 --- [ Thread-174] o.s.s.concurrent.ThreadPoolTaskExecutor : Shutting down ExecutorService 'asyncExec'
2020-12-01 17:04:43.907 INFO 3680 --- [ Thread-174] com.zaxxer.hikari.HikariDataSource : HikariPool-17 - Shutdown initiated...
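As a side note, if the shutdown of asyncExec turns out to be expected (for example the application context closing while the async PDF work is still in flight), ThreadPoolTaskExecutor can be told to drain its tasks before being destroyed. This is only a sketch of that option and does not explain why the context is shutting down:
@Bean(name = "asyncExec")
public Executor taskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(10);
    executor.setMaxPoolSize(50);
    executor.setQueueCapacity(500);
    executor.setThreadNamePrefix("CashFlowThread-");
    // Wait for running and queued tasks when the context shuts the pool down.
    executor.setWaitForTasksToCompleteOnShutdown(true);
    executor.setAwaitTerminationSeconds(60);
    executor.initialize();
    return executor;
}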

Hibernate - Dialect configured but still getting error

I'm updating a JHipster microservice and ran into some issues when running the app in the production environment. The application crashes with a strange Hibernate error:
2020-10-22 10:59:12.755 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is STARTED
2020-10-22 10:59:13.066 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : Driver class org.postgresql.Driver found in Thread context class loader jdk.internal.loader.ClassLoaders$AppClassLoader#4ae3c1cd
2020-10-22 10:59:13.868 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [timestamp=1603364353868, current=UP, previous=STARTING]
2020-10-22 10:59:13.946 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9: registering service...
2020-10-22 10:59:14.182 INFO 1 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9 - registration status: 204
2020-10-22 10:59:14.257 INFO 1 --- [ main] t.h.e.m.config.WebConfigurer : Web application configuration, using profiles: prod
2020-10-22 10:59:14.258 INFO 1 --- [ main] t.h.e.m.config.WebConfigurer : Web application fully configured
2020-10-22 10:59:14.961 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : Hikari - configuration:
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : allowPoolSuspension.............false
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : autoCommit......................false
2020-10-22 10:59:14.969 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : catalog.........................none
2020-10-22 10:59:14.970 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionInitSql...............none
2020-10-22 10:59:14.970 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionTestQuery.............none
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : connectionTimeout...............30000
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSource......................none
2020-10-22 10:59:14.971 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceClassName.............none
2020-10-22 10:59:14.972 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceJNDI..................none
2020-10-22 10:59:15.042 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : dataSourceProperties............{password=<masked>}
2020-10-22 10:59:15.043 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : driverClassName................."org.postgresql.Driver"
2020-10-22 10:59:15.043 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : exceptionOverrideClassName......none
2020-10-22 10:59:15.044 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : healthCheckProperties...........{}
2020-10-22 10:59:15.045 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : healthCheckRegistry.............none
2020-10-22 10:59:15.045 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : idleTimeout.....................600000
2020-10-22 10:59:15.047 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : initializationFailTimeout.......1
2020-10-22 10:59:15.048 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : isolateInternalQueries..........false
2020-10-22 10:59:15.049 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : jdbcUrl.........................jdbc:postgresql://postgresql:5432/miscellaneous?socketTimeout=30
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : leakDetectionThreshold..........15000
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : maxLifetime.....................1800000
2020-10-22 10:59:15.050 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : maximumPoolSize.................5
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : metricRegistry..................none
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : metricsTrackerFactory...........none
2020-10-22 10:59:15.051 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : minimumIdle.....................5
2020-10-22 10:59:15.054 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : password........................<masked>
2020-10-22 10:59:15.055 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : poolName........................"Hikari"
2020-10-22 10:59:15.056 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : readOnly........................false
2020-10-22 10:59:15.057 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : registerMbeans..................false
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : scheduledExecutor...............none
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : schema..........................none
2020-10-22 10:59:15.059 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : threadFactory...................internal
2020-10-22 10:59:15.060 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : transactionIsolation............default
2020-10-22 10:59:15.061 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : username........................"teste"
2020-10-22 10:59:15.064 DEBUG 1 --- [ main] com.zaxxer.hikari.HikariConfig : validationTimeout...............5000
2020-10-22 10:59:15.065 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Starting...
2020-10-22 10:59:15.445 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#47d6a31a
2020-10-22 10:59:15.450 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Start completed.
2020-10-22 10:59:15.550 DEBUG 1 --- [ari housekeeper] com.zaxxer.hikari.pool.HikariPool : Hikari - Pool stats (total=1, active=1, idle=0, waiting=0)
2020-10-22 10:59:15.566 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#c2d5356
2020-10-22 10:59:15.573 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#2eaaca3
2020-10-22 10:59:15.643 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#6d53826
2020-10-22 10:59:15.654 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - Added connection org.postgresql.jdbc.PgConnection#43d7a698
2020-10-22 10:59:15.659 DEBUG 1 --- [onnection adder] com.zaxxer.hikari.pool.HikariPool : Hikari - After adding stats (total=5, active=1, idle=4, waiting=0)
2020-10-22 10:59:28.156 WARN 1 --- [scoveryClient-0] c.netflix.discovery.TimedSupervisorTask : task supervisor timed out
java.util.concurrent.TimeoutException: null
at java.base/java.util.concurrent.FutureTask.get(Unknown Source)
at com.netflix.discovery.TimedSupervisorTask.run(TimedSupervisorTask.java:68)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
2020-10-22 10:59:30.453 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Shutdown initiated...
2020-10-22 10:59:30.454 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - Before shutdown stats (total=5, active=0, idle=5, waiting=0)
2020-10-22 10:59:30.465 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#47d6a31a: (connection evicted)
2020-10-22 10:59:30.471 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#c2d5356: (connection evicted)
2020-10-22 10:59:30.544 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#2eaaca3: (connection evicted)
2020-10-22 10:59:30.545 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#6d53826: (connection evicted)
2020-10-22 10:59:30.548 DEBUG 1 --- [nnection closer] com.zaxxer.hikari.pool.PoolBase : Hikari - Closing connection org.postgresql.jdbc.PgConnection#43d7a698: (connection evicted)
2020-10-22 10:59:30.564 DEBUG 1 --- [ main] com.zaxxer.hikari.pool.HikariPool : Hikari - After shutdown stats (total=0, active=0, idle=0, waiting=0)
2020-10-22 10:59:30.564 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : Hikari - Shutdown completed.
2020-10-22 10:59:30.565 WARN 1 --- [ main] i.g.j.c.liquibase.AsyncSpringLiquibase : Warning, Liquibase took more than 5 seconds to start up!
2020-10-22 10:59:31.050 DEBUG 1 --- [ main] o.hibernate.jpa.internal.util.LogHelper : PersistenceUnitInfo [
name: default
persistence provider classname: null
classloader: jdk.internal.loader.ClassLoaders$AppClassLoader#4ae3c1cd
excludeUnlistedClasses: true
JTA datasource: null
Non JTA datasource: HikariDataSource (Hikari)
Transaction type: RESOURCE_LOCAL
PU root URL: file:/app/libs/commons-microservice-1.1.0.jar
Shared Cache Mode: UNSPECIFIED
Validation Mode: AUTO
Jar files URLs []
Managed classes names [
tech.h2r.ecommerce.miscellaneous.domain.AbstractAuditingEntity
tech.h2r.ecommerce.miscellaneous.domain.Banner
tech.h2r.ecommerce.miscellaneous.domain.CustomPage
tech.h2r.ecommerce.miscellaneous.domain.DirectMail
tech.h2r.ecommerce.miscellaneous.domain.PersistentAuditEvent
tech.h2r.ecommerce.miscellaneous.domain.Theme
tech.h2r.commons.microservice.domain.ConsumedMessage
tech.h2r.commons.microservice.domain.ProducedMessage
tech.h2r.commons.domain.Config]
Mapping files names []
Properties []
2020-10-22 10:59:31.064 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.cfg.beanvalidation.BeanValidationIntegrator].
2020-10-22 10:59:31.066 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.secure.spi.JaccIntegrator].
2020-10-22 10:59:31.068 DEBUG 1 --- [ main] o.h.i.internal.IntegratorServiceImpl : Adding Integrator [org.hibernate.cache.internal.CollectionCacheInvalidator].
2020-10-22 10:59:31.270 INFO 1 --- [ main] org.hibernate.Version : HHH000412: Hibernate ORM core version 5.4.15.Final
2020-10-22 10:59:31.343 DEBUG 1 --- [ main] org.hibernate.cfg.Environment : HHH000206: hibernate.properties not found
2020-10-22 10:59:31.851 DEBUG 1 --- [ main] o.hibernate.service.spi.ServiceBinding : Overriding existing service binding [org.hibernate.secure.spi.JaccService]
2020-10-22 10:59:31.867 DEBUG 1 --- [ main] o.h.c.internal.RegionFactoryInitiator : Cannot default RegionFactory based on registered strategies as `[]` RegionFactory strategies were registered
2020-10-22 10:59:31.868 DEBUG 1 --- [ main] o.h.c.internal.RegionFactoryInitiator : Cache region factory : org.hibernate.cache.internal.NoCachingRegionFactory
2020-10-22 10:59:31.960 INFO 1 --- [ main] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
2020-10-22 10:59:32.769 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting JPA temp ClassLoader [org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557] into BootstrapContext; was [null]
2020-10-22 10:59:32.770 DEBUG 1 --- [ main] o.h.boot.internal.ClassLoaderAccessImpl : ClassLoaderAccessImpl#injectTempClassLoader(org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557) [was null]
2020-10-22 10:59:32.771 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting ScanEnvironment [org.hibernate.jpa.boot.internal.StandardJpaScanEnvironmentImpl#41ca7df9] into BootstrapContext; was [null]
2020-10-22 10:59:32.842 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting ScanOptions [org.hibernate.boot.archive.scan.internal.StandardScanOptions#4f0908de] into BootstrapContext; was [org.hibernate.boot.archive.scan.internal.StandardScanOptions#19469022]
2020-10-22 10:59:33.153 DEBUG 1 --- [ main] o.h.boot.internal.BootstrapContextImpl : Injecting JPA temp ClassLoader [null] into BootstrapContext; was [org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557]
2020-10-22 10:59:33.160 DEBUG 1 --- [ main] o.h.boot.internal.ClassLoaderAccessImpl : ClassLoaderAccessImpl#injectTempClassLoader(null) [was org.springframework.instrument.classloading.SimpleThrowawayClassLoader#20852557]
2020-10-22 10:59:33.243 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid2] -> [org.hibernate.id.UUIDGenerator]
2020-10-22 10:59:33.243 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [guid] -> [org.hibernate.id.GUIDGenerator]
2020-10-22 10:59:33.244 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid] -> [org.hibernate.id.UUIDHexGenerator]
2020-10-22 10:59:33.245 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [uuid.hex] -> [org.hibernate.id.UUIDHexGenerator]
2020-10-22 10:59:33.245 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [assigned] -> [org.hibernate.id.Assigned]
2020-10-22 10:59:33.247 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [identity] -> [org.hibernate.id.IdentityGenerator]
2020-10-22 10:59:33.249 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [select] -> [org.hibernate.id.SelectGenerator]
2020-10-22 10:59:33.251 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [sequence] -> [org.hibernate.id.enhanced.SequenceStyleGenerator]
2020-10-22 10:59:33.253 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [seqhilo] -> [org.hibernate.id.SequenceHiLoGenerator]
2020-10-22 10:59:33.254 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [increment] -> [org.hibernate.id.IncrementGenerator]
2020-10-22 10:59:33.255 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [foreign] -> [org.hibernate.id.ForeignGenerator]
2020-10-22 10:59:33.256 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [sequence-identity] -> [org.hibernate.id.SequenceIdentityGenerator]
2020-10-22 10:59:33.256 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [enhanced-sequence] -> [org.hibernate.id.enhanced.SequenceStyleGenerator]
2020-10-22 10:59:33.258 DEBUG 1 --- [ main] .i.f.i.DefaultIdentifierGeneratorFactory : Registering IdentifierGenerator strategy [enhanced-table] -> [org.hibernate.id.enhanced.TableGenerator]
2020-10-22 10:59:33.262 WARN 1 --- [ main] o.h.e.j.e.i.JdbcEnvironmentInitiator : HHH000342: Could not obtain connection to query metadata : HikariDataSource HikariDataSource (Hikari) has been closed.
2020-10-22 10:59:33.266 WARN 1 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [tech/h2r/ecommerce/miscellaneous/config/DatabaseConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
2020-10-22 10:59:33.365 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is SHUTTING_DOWN
2020-10-22 10:59:33.449 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Shutting down connection manager...
2020-10-22 10:59:33.457 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Shutting down node engine...
2020-10-22 10:59:33.547 INFO 1 --- [ main] com.hazelcast.instance.NodeExtension : [172.18.0.3]:5701 [dev] [3.12.7] Destroying node NodeExtension.
2020-10-22 10:59:33.548 INFO 1 --- [ main] com.hazelcast.instance.Node : [172.18.0.3]:5701 [dev] [3.12.7] Hazelcast Shutdown is completed in 101 ms.
2020-10-22 10:59:33.549 INFO 1 --- [ main] com.hazelcast.core.LifecycleService : [172.18.0.3]:5701 [dev] [3.12.7] [172.18.0.3]:5701 is SHUTDOWN
2020-10-22 10:59:33.550 INFO 1 --- [ main] t.h.e.m.config.CacheConfiguration : Closing Cache Manager
2020-10-22 10:59:33.554 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Shutting down DiscoveryClient ...
2020-10-22 10:59:36.557 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Unregistering ...
2020-10-22 10:59:36.666 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : DiscoveryClient_MISCELLANEOUS/miscellaneous:2b1983b040c376288a15e1a536a4f8f9 - deregister status: 200
2020-10-22 10:59:36.744 INFO 1 --- [ main] com.netflix.discovery.DiscoveryClient : Completed shut down of DiscoveryClient
2020-10-22 10:59:36.951 ERROR 1 --- [ main] o.s.boot.SpringApplication : Application run failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [tech/h2r/ecommerce/miscellaneous/config/DatabaseConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1796)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:595)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1108)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:868)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:747)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
at tech.h2r.ecommerce.miscellaneous.MiscellaneousApp.main(MiscellaneousApp.java:44)
Caused by: org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:275)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:237)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214)
at org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory.injectServices(DefaultIdentifierGeneratorFactory.java:152)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.injectDependencies(AbstractServiceRegistryImpl.java:286)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:243)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:214)
at org.hibernate.boot.internal.InFlightMetadataCollectorImpl.<init>(InFlightMetadataCollectorImpl.java:176)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:118)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.metadata(EntityManagerFactoryBuilderImpl.java:1214)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:1245)
at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:58)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:391)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:378)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792)
... 14 common frames omitted
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.determineDialect(DialectFactoryImpl.java:100)
at org.hibernate.engine.jdbc.dialect.internal.DialectFactoryImpl.buildDialect(DialectFactoryImpl.java:54)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:137)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:101)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:263)
... 31 common frames omitted
The error says the problem is with the dialect, but it is configured correctly:
datasource:
  type: com.zaxxer.hikari.HikariDataSource
  url: jdbc:postgresql://postgresql:5432/miscellaneous?socketTimeout=30
  hikari:
    poolName: Hikari
    auto-commit: false
jpa:
  database-platform: io.github.jhipster.domain.util.FixedPostgreSQL10Dialect
  show-sql: false
  open-in-view: false
  properties:
    hibernate.jdbc.time_zone: UTC
    hibernate.id.new_generator_mappings: true
    hibernate.connection.provider_disables_autocommit: true
    hibernate.cache.use_second_level_cache: true
    hibernate.cache.use_query_cache: false
    hibernate.generate_statistics: false
    hibernate.jdbc.batch_size: 25
    hibernate.order_inserts: true
    hibernate.order_updates: true
    hibernate.query.fail_on_pagination_over_collection_fetch: true
    hibernate.query.in_clause_parameter_padding: true
    hibernate.cache.region.factory_class: com.hazelcast.hibernate.HazelcastCacheRegionFactory
    hibernate.cache.use_minimal_puts: true
    hibernate.cache.hazelcast.instance_name: miscellaneous
    hibernate.cache.hazelcast.use_lite_member: true
  hibernate:
    ddl-auto: none
    naming:
      physical-strategy: org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy
      implicit-strategy: org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy
What could be causing this problem?
Instead of using the injected jpaVendorAdapter in my EntityManagerFactory bean, I was creating it myself:
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, SchemaPerTenantConnectionProvider schemaPerTenantConnectionProvider, HeaderTenantIdentifierResolver headerTenantIdentifierResolver) {
    ...
    em.setJpaVendorAdapter(jpaVendorAdapter());
    em.setJpaPropertyMap(properties);
    return em;
}
Simply using the injected bean solved the problem:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, SchemaPerTenantConnectionProvider schemaPerTenantConnectionProvider, HeaderTenantIdentifierResolver headerTenantIdentifierResolver, JpaVendorAdapter jpaVendorAdapter) {
    ...
    em.setJpaVendorAdapter(jpaVendorAdapter);
    em.setJpaPropertyMap(properties);
    return em;
}
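This works because the JpaVendorAdapter auto-configured by Spring Boot already carries the database and dialect settings derived from spring.jpa.database-platform, whereas a freshly constructed adapter knows nothing about them. If you really must build the adapter yourself, the dialect has to be supplied by hand. A minimal sketch, assuming a hand-built HibernateJpaVendorAdapter (the dialect class name is taken from the configuration above; everything else is illustrative):
HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
// Assumption: without Boot's auto-configuration the dialect must be set explicitly,
// otherwise Hibernate falls back to DialectResolutionInfo and fails as shown above.
adapter.setDatabasePlatform("io.github.jhipster.domain.util.FixedPostgreSQL10Dialect");
em.setJpaVendorAdapter(adapter);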
Alternatively, add the dialect explicitly to the JPA properties under the key hibernate.dialect, for example:
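A minimal sketch of that approach for the factory bean above, assuming the map shown is the one passed to setJpaPropertyMap (the concrete dialect class is only an example):
Map<String, Object> properties = new HashMap<>();
// Assumption: naming the dialect directly skips Hibernate's metadata-based dialect resolution,
// which is what fails with "Access to DialectResolutionInfo cannot be null".
properties.put("hibernate.dialect", "org.hibernate.dialect.PostgreSQL10Dialect");
em.setJpaPropertyMap(properties);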

I get org.springframework.amqp.AmqpConnectException when I run an integration test with an embedded Qpid broker

I configure my test class with @TestConfiguration, and when I declare a DirectMessageListenerContainer with a consumersPerQueue value greater than 1, I get an exception when running the test.
I use Spring Boot 2.1.5.RELEASE, Spring AMQP 2.1.6.RELEASE and Qpid 7.1.0.
Here is my configuration:
@Configuration
public class MyConfiguration {

    public static final String EXCHANGE_NAME = "x.sample.first";
    public static final String Q_NAME = "q.sample.first";

    @Bean(name = "firstQueue")
    Queue queue() {
        final boolean durable = true;
        final boolean exclusive = false;
        final boolean autoDelete = false;
        return new Queue(Q_NAME, durable, exclusive, autoDelete);
    }

    @Bean("firstExchange")
    DirectExchange exchange() {
        final boolean durable = true;
        final boolean autoDelete = false;
        return new DirectExchange(EXCHANGE_NAME, durable, autoDelete);
    }

    @Bean(name = "firstQueueBinding")
    Binding binding(@Qualifier("firstQueue") Queue queue, @Qualifier("firstExchange") DirectExchange exchange) {
        return BindingBuilder.bind(queue).to(exchange).withQueueName();
    }

    @Bean("firstQueueMessageContainer")
    public DirectMessageListenerContainer messageContainer(ConnectionFactory connectionFactory, MyMessageListener consumer) {
        DirectMessageListenerContainer container = new DirectMessageListenerContainer(connectionFactory);
        container.setConsumersPerQueue(2);
        container.setMessageListener(consumer);
        container.setQueueNames(Q_NAME);
        return container;
    }
}
My test class looks like this:
#ActiveProfiles("test")
#SpringBootTest
#ExtendWith(SpringExtension.class)
public class MyMessageListenerTest {
#Test
public void test() {
//some test here...
}
#TestConfiguration
static class TestConfig {
#Bean(name = "qpidBroker", initMethod = "start", destroyMethod = "shutdown")
public EmbeddedInMemoryQpidBroker broker() {
return new EmbeddedInMemoryQpidBroker(); //this my wrapper class to provide in memory qpid broker
}
#Bean
#DependsOn("qpidBroker")
public ConnectionFactory connectionFactory() {
CachingConnectionFactory connectionFactory = new CachingConnectionFactory();
connectionFactory.setHost("localhost");
connectionFactory.setPort(5672);
connectionFactory.setUsername("guest");
connectionFactory.setPassword("guest");
return connectionFactory;
}
#Bean("firstQueueMessageContainer")
public DirectMessageListenerContainer messageContainer(ConnectionFactory connectionFactory, MyMessageListener consumer) {
DirectMessageListenerContainer container = new DirectMessageListenerContainer(connectionFactory);
container.setConsumersPerQueue(2); // if I remove this line everything working correctly.
container.setMessageListener(consumer);
container.setQueueNames(Q_NAME);
return container;
}
}
}
And I get the exception below.
2019-06-06 15:18:37.108 INFO 8576 --- [ main] com.example.MyMessageListenerTest : Starting MyMessageListenerTest on AliveX with PID 8576 (started by development in C:\seamless\workspace-spring\spring-rabbit-sample)
2019-06-06 15:18:37.109 INFO 8576 --- [ main] com.example.MyMessageListenerTest : The following profiles are active: test
2019-06-06 15:18:38.782 INFO 8576 --- [ Broker-Config] o.a.q.server.store.GenericStoreUpgrader : Broker store has model version 7.0. Number of record(s) 6
2019-06-06 15:18:38.798 INFO 8576 --- [ Broker-Config] q.message.authenticationprovider.create : [Broker] ATH-1001 : Create "hardcoded"
2019-06-06 15:18:38.806 INFO 8576 --- [ Broker-Config] qpid.message.port.create : [Broker] PRT-1001 : Create "AMQP"
[Broker] BRK-1006 : Using configuration : N/A
2019-06-06 15:18:38.818 INFO 8576 --- [ Broker-Config] qpid.message.broker.config : [Broker] BRK-1006 : Using configuration : N/A
[Broker] BRK-1001 : Startup : Version: 7.1.0 Build: 5cb4ba20207da1390c79ef8b654a395e58dad5a0
2019-06-06 15:18:39.119 INFO 8576 --- [ Broker-Config] qpid.message.broker.startup : [Broker] BRK-1001 : Startup : Version: 7.1.0 Build: 5cb4ba20207da1390c79ef8b654a395e58dad5a0
[Broker] BRK-1010 : Platform : JVM : Oracle Corporation version: 1.8.0_112-b15 OS : Windows 10 version: 10.0 arch: amd64 cores: 8
2019-06-06 15:18:39.120 INFO 8576 --- [ Broker-Config] qpid.message.broker.platform : [Broker] BRK-1010 : Platform : JVM : Oracle Corporation version: 1.8.0_112-b15 OS : Windows 10 version: 10.0 arch: amd64 cores: 8
[Broker] BRK-1011 : Maximum Memory : Heap : 3,791,650,816 bytes Direct : 3,791,650,816 bytes
2019-06-06 15:18:39.121 INFO 8576 --- [ Broker-Config] qpid.message.broker.max_memory : [Broker] BRK-1011 : Maximum Memory : Heap : 3,791,650,816 bytes Direct : 3,791,650,816 bytes
[Broker] BRK-1017 : Process : PID : 8576
2019-06-06 15:18:39.122 INFO 8576 --- [ Broker-Config] qpid.message.broker.process : [Broker] BRK-1017 : Process : PID : 8576
2019-06-06 15:18:39.129 INFO 8576 --- [ Broker-Config] qpid.message.configstore.created : [Broker] [vh(/default)/ms(MemoryConfigurationStore)] CFG-1001 : Created
2019-06-06 15:18:39.130 INFO 8576 --- [ Broker-Config] qpid.message.configstore.recovery_start : [Broker] [vh(/default)/ms(MemoryConfigurationStore)] CFG-1004 : Recovery Start
2019-06-06 15:18:39.137 INFO 8576 --- [ Broker-Config] o.a.q.server.store.GenericStoreUpgrader : VirtualHost store has model version 7.1. Number of record(s) 5
2019-06-06 15:18:39.142 INFO 8576 --- [ Broker-Config] qpid.message.virtualhost.created : [Broker] VHT-1001 : Created : default
2019-06-06 15:18:39.164 INFO 8576 --- [ Broker-Config] q.message.configstore.recovery_complete : [Broker] [vh(/default)/ms(MemoryConfigurationStore)] CFG-1005 : Recovery Complete
2019-06-06 15:18:39.179 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [Broker] EXH-1001 : Create : Durable Type: fanout Name: amq.fanout
2019-06-06 15:18:39.180 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [Broker] EXH-1001 : Create : Durable Type: headers Name: amq.match
2019-06-06 15:18:39.180 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [Broker] EXH-1001 : Create : Durable Type: topic Name: amq.topic
2019-06-06 15:18:39.180 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [Broker] EXH-1001 : Create : Durable Type: direct Name: amq.direct
[Broker] BRK-1002 : Starting : Listening on TCP port 5672
2019-06-06 15:18:39.187 INFO 8576 --- [ Broker-Config] qpid.message.broker.listening : [Broker] BRK-1002 : Starting : Listening on TCP port 5672
2019-06-06 15:18:39.216 INFO 8576 --- [-default-Config] q.message.messagestore.recovery_start : [Broker] [vh(/default)/ms(MemoryMessageStore)] MST-1004 : Recovery Start
2019-06-06 15:18:39.218 INFO 8576 --- [-default-Config] q.message.transactionlog.recovery_start : [Broker] [vh(/default)/ms(MemoryMessageStore)] TXN-1004 : Recovery Start
2019-06-06 15:18:39.219 INFO 8576 --- [-default-Config] q.m.transactionlog.recovery_complete : [Broker] [vh(/default)/ms(MemoryMessageStore)] TXN-1006 : Recovery Complete
2019-06-06 15:18:39.220 INFO 8576 --- [-default-Config] qpid.message.messagestore.recovered : [Broker] [vh(/default)/ms(MemoryMessageStore)] MST-1005 : Recovered 0 messages
2019-06-06 15:18:39.220 INFO 8576 --- [-default-Config] q.m.messagestore.recovery_complete : [Broker] [vh(/default)/ms(MemoryMessageStore)] MST-1006 : Recovery Complete
[Broker] BRK-1004 : Qpid Broker Ready
2019-06-06 15:18:39.224 INFO 8576 --- [ Broker-Config] qpid.message.broker.ready : [Broker] BRK-1004 : Qpid Broker Ready
2019-06-06 15:18:39.344 INFO 8576 --- [ main] o.s.s.c.ThreadPoolTaskScheduler : Initializing ExecutorService
2019-06-06 15:18:40.023 INFO 8576 --- [ main] o.s.a.r.c.CachingConnectionFactory : Attempting to connect to: localhost:5672
2019-06-06 15:18:40.060 INFO 8576 --- [ Broker-Config] qpid.message.connection.open : [con:0(/127.0.0.1:49260)] CON-1001 : Open : Destination : AMQP(127.0.0.1:5672) : Protocol Version : 0-9-1
2019-06-06 15:18:40.120 INFO 8576 --- [127.0.0.1:49260] qpid.message.connection.open : [con:0(guest#/127.0.0.1:49260/default)] CON-1001 : Open : Destination : AMQP(127.0.0.1:5672) : Protocol Version : 0-9-1 : Client ID : e1ac1a25-0511-49ae-99d3-95e35b4fc0f9 : Client Version : 5.4.3 : Client Product : RabbitMQ
2019-06-06 15:18:40.124 INFO 8576 --- [ main] o.s.a.r.c.CachingConnectionFactory : Created new connection: connectionFactory#64b7225f:0/SimpleConnection#288ca5f0 [delegate=amqp://guest#127.0.0.1:5672/, localPort= 49260]
2019-06-06 15:18:40.177 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.create : [con:0(guest#/127.0.0.1:49260/default)/ch:1] CHN-1001 : Create
2019-06-06 15:18:40.206 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] EXH-1001 : Create : Durable Type: direct Name: x.sample.second
2019-06-06 15:18:40.212 INFO 8576 --- [-default-Config] qpid.message.exchange.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] EXH-1001 : Create : Durable Type: direct Name: x.sample.first
2019-06-06 15:18:40.263 INFO 8576 --- [-default-Config] qpid.message.queue.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [vh(/default)/qu(q.sample.second)] QUE-1001 : Create : ID: 91584d34-9030-4a85-963b-dc45fbb97bb8 Durable
2019-06-06 15:18:40.275 INFO 8576 --- [-default-Config] qpid.message.queue.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [vh(/default)/qu(q.sample.first)] QUE-1001 : Create : ID: 040a1dc8-de33-4aa9-9c9a-f4b67a3094be Durable
2019-06-06 15:18:40.285 INFO 8576 --- [-default-Config] qpid.message.binding.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [vh(/default)/ex(direct/x.sample.second)] BND-1001 : Create : {bindingKey=q.sample.second, destination=q.sample.second, arguments={}}
2019-06-06 15:18:40.290 INFO 8576 --- [-default-Config] qpid.message.binding.created : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [vh(/default)/ex(direct/x.sample.first)] BND-1001 : Create : {bindingKey=q.sample.first, destination=q.sample.first, arguments={}}
2019-06-06 15:18:40.306 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.prefetch_size : [con:0(guest#/127.0.0.1:49260/default)/ch:1] CHN-1004 : Prefetch Size (bytes) 0 : Count 250
2019-06-06 15:18:40.321 INFO 8576 --- [-default-Config] qpid.message.subscription.create : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [sub:0(vh(/default)/qu(q.sample.second)] SUB-1001 : Create
2019-06-06 15:18:40.327 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.create : [con:0(guest#/127.0.0.1:49260/default)/ch:2] CHN-1001 : Create
2019-06-06 15:18:40.332 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.prefetch_size : [con:0(guest#/127.0.0.1:49260/default)/ch:2] CHN-1004 : Prefetch Size (bytes) 0 : Count 250
2019-06-06 15:18:40.334 INFO 8576 --- [-default-Config] qpid.message.subscription.create : [con:0(guest#/127.0.0.1:49260/default)/ch:2] [sub:1(vh(/default)/qu(q.sample.first)] SUB-1001 : Create
2019-06-06 15:18:40.341 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.create : [con:0(guest#/127.0.0.1:49260/default)/ch:3] CHN-1001 : Create
2019-06-06 15:18:40.344 INFO 8576 --- [ main] o.s.a.r.l.DirectMessageListenerContainer : Container initialized for queues: [q.sample.first]
2019-06-06 15:18:40.346 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.prefetch_size : [con:0(guest#/127.0.0.1:49260/default)/ch:3] CHN-1004 : Prefetch Size (bytes) 0 : Count 250
2019-06-06 15:18:40.350 INFO 8576 --- [-default-Config] qpid.message.subscription.create : [con:0(guest#/127.0.0.1:49260/default)/ch:3] [sub:2(vh(/default)/qu(q.sample.first)] SUB-1001 : Create
2019-06-06 15:18:40.351 INFO 8576 --- [sageContainer-1] o.s.a.r.l.DirectMessageListenerContainer : SimpleConsumer [queue=q.sample.first, consumerTag=sgen_1 identity=124f5893] started
2019-06-06 15:18:40.353 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.create : [con:0(guest#/127.0.0.1:49260/default)/ch:4] CHN-1001 : Create
2019-06-06 15:18:40.355 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.prefetch_size : [con:0(guest#/127.0.0.1:49260/default)/ch:4] CHN-1004 : Prefetch Size (bytes) 0 : Count 250
2019-06-06 15:18:40.359 INFO 8576 --- [-default-Config] qpid.message.subscription.create : [con:0(guest#/127.0.0.1:49260/default)/ch:4] [sub:3(vh(/default)/qu(q.sample.first)] SUB-1001 : Create
2019-06-06 15:18:40.360 INFO 8576 --- [sageContainer-1] o.s.a.r.l.DirectMessageListenerContainer : SimpleConsumer [queue=q.sample.first, consumerTag=sgen_1 identity=18b733e3] started
2019-06-06 15:18:40.363 INFO 8576 --- [ main] com.example.MyMessageListenerTest : Started MyMessageListenerTest in 3.502 seconds (JVM running for 4.314)
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.027 s - in com.example.MyMessageListenerTest
2019-06-06 15:18:40.561 INFO 8576 --- [-default-Config] qpid.message.subscription.close : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [sub:0(vh(/default)/qu(q.sample.second)] SUB-1002 : Close
2019-06-06 15:18:40.566 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Waiting for workers to finish.
2019-06-06 15:18:40.568 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.close_forced : [IO Pool] [con:0(guest#/127.0.0.1:49260/default)/ch:1] CHN-1003 : Close : 320 - Connection closed by external action
2019-06-06 15:18:40.568 INFO 8576 --- [-default-Config] qpid.message.subscription.close : [IO Pool] [sub:1(vh(/default)/qu(q.sample.first)] SUB-1002 : Close
2019-06-06 15:18:40.569 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.close_forced : [IO Pool] [con:0(guest#/127.0.0.1:49260/default)/ch:2] CHN-1003 : Close : 320 - Connection closed by external action
2019-06-06 15:18:40.569 INFO 8576 --- [-default-Config] qpid.message.subscription.close : [IO Pool] [sub:2(vh(/default)/qu(q.sample.first)] SUB-1002 : Close
2019-06-06 15:18:40.570 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.close_forced : [IO Pool] [con:0(guest#/127.0.0.1:49260/default)/ch:3] CHN-1003 : Close : 320 - Connection closed by external action
2019-06-06 15:18:40.570 INFO 8576 --- [-default-Config] qpid.message.subscription.close : [IO Pool] [sub:3(vh(/default)/qu(q.sample.first)] SUB-1002 : Close
2019-06-06 15:18:40.571 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.close_forced : [IO Pool] [con:0(guest#/127.0.0.1:49260/default)/ch:4] CHN-1003 : Close : 320 - Connection closed by external action
2019-06-06 15:18:40.574 ERROR 8576 --- [ 127.0.0.1:5672] o.s.a.r.c.CachingConnectionFactory : Channel shutdown: connection error; protocol method: #method<connection.close>(reply-code=320, reply-text=Connection closed by external action, class-id=0, method-id=0)
2019-06-06 15:18:40.574 ERROR 8576 --- [ 127.0.0.1:5672] o.s.a.r.c.CachingConnectionFactory : Channel shutdown: connection error; protocol method: #method<connection.close>(reply-code=320, reply-text=Connection closed by external action, class-id=0, method-id=0)
2019-06-06 15:18:40.574 ERROR 8576 --- [ 127.0.0.1:5672] o.s.a.r.c.CachingConnectionFactory : Channel shutdown: connection error; protocol method: #method<connection.close>(reply-code=320, reply-text=Connection closed by external action, class-id=0, method-id=0)
2019-06-06 15:18:40.575 ERROR 8576 --- [ 127.0.0.1:5672] o.s.a.r.c.CachingConnectionFactory : Channel shutdown: connection error; protocol method: #method<connection.close>(reply-code=320, reply-text=Connection closed by external action, class-id=0, method-id=0)
2019-06-06 15:18:40.579 INFO 8576 --- [ Broker-Config] qpid.message.broker.shutting_down : [Shutdown] BRK-1003 : Shutting down : TCP port 5672
2019-06-06 15:18:40.582 INFO 8576 --- [ Broker-Config] qpid.message.connection.close : [con:0(guest#/127.0.0.1:49260/default)] CON-1002 : Close : 320 - Connection closed by external action
2019-06-06 15:18:40.586 INFO 8576 --- [-default-Config] qpid.message.virtualhost.closed : [Shutdown] VHT-1002 : Closed : default
2019-06-06 15:18:40.588 INFO 8576 --- [ Broker-Config] qpid.message.configstore.close : [Shutdown] [vh(/default)/ms(MemoryConfigurationStore)] CFG-1003 : Closed
2019-06-06 15:18:41.226 INFO 8576 --- [ Broker-Config] qpid.message.broker.stopped : [Shutdown] BRK-1005 : Stopped
2019-06-06 15:18:41.325 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Successfully waited for workers to finish.
2019-06-06 15:18:41.326 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Waiting for workers to finish.
2019-06-06 15:18:41.327 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Successfully waited for workers to finish.
2019-06-06 15:18:41.327 INFO 8576 --- [ Thread-5] o.s.a.r.c.CachingConnectionFactory : Attempting to connect to: localhost:5672
2019-06-06 15:18:43.346 WARN 8576 --- [ Thread-5] o.s.c.support.DefaultLifecycleProcessor : Failed to stop bean 'firstQueueMessageContainer'
org.springframework.amqp.AmqpConnectException: java.net.ConnectException: Connection refused: connect
at org.springframework.amqp.rabbit.support.RabbitExceptionTranslator.convertRabbitAccessException(RabbitExceptionTranslator.java:62) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.connection.AbstractConnectionFactory.createBareConnection(AbstractConnectionFactory.java:509) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.connection.CachingConnectionFactory.createConnection(CachingConnectionFactory.java:700) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.connection.CachingConnectionFactory.createBareChannel(CachingConnectionFactory.java:651) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.connection.CachingConnectionFactory.access$800(CachingConnectionFactory.java:102) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.connection.CachingConnectionFactory$CachedChannelInvocationHandler.invoke(CachingConnectionFactory.java:1138) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at com.sun.proxy.$Proxy97.basicCancel(Unknown Source) ~[na:na]
at org.springframework.amqp.rabbit.listener.DirectMessageListenerContainer.cancelConsumer(DirectMessageListenerContainer.java:819) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at java.lang.Iterable.forEach(Iterable.java:75) ~[na:1.8.0_112]
at org.springframework.amqp.rabbit.listener.DirectMessageListenerContainer.actualShutDown(DirectMessageListenerContainer.java:798) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.DirectMessageListenerContainer.doShutdown(DirectMessageListenerContainer.java:756) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.shutdown(AbstractMessageListenerContainer.java:1237) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.doStop(AbstractMessageListenerContainer.java:1353) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.DirectMessageListenerContainer.doStop(DirectMessageListenerContainer.java:377) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.stop(AbstractMessageListenerContainer.java:1326) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.stop(AbstractMessageListenerContainer.java:1342) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.doStop(DefaultLifecycleProcessor.java:238) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.access$300(DefaultLifecycleProcessor.java:53) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.stop(DefaultLifecycleProcessor.java:377) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.stopBeans(DefaultLifecycleProcessor.java:210) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.onClose(DefaultLifecycleProcessor.java:128) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1018) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
at org.springframework.context.support.AbstractApplicationContext$1.run(AbstractApplicationContext.java:945) [spring-context-5.1.7.RELEASE.jar:5.1.7.RELEASE]
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method) ~[na:1.8.0_112]
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85) ~[na:1.8.0_112]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) ~[na:1.8.0_112]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) ~[na:1.8.0_112]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) ~[na:1.8.0_112]
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172) ~[na:1.8.0_112]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[na:1.8.0_112]
at java.net.Socket.connect(Socket.java:589) ~[na:1.8.0_112]
at com.rabbitmq.client.impl.SocketFrameHandlerFactory.create(SocketFrameHandlerFactory.java:60) ~[amqp-client-5.4.3.jar:5.4.3]
at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1102) ~[amqp-client-5.4.3.jar:5.4.3]
at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1054) ~[amqp-client-5.4.3.jar:5.4.3]
at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1218) ~[amqp-client-5.4.3.jar:5.4.3]
at org.springframework.amqp.rabbit.connection.AbstractConnectionFactory.createBareConnection(AbstractConnectionFactory.java:471) ~[spring-rabbit-2.1.6.RELEASE.jar:2.1.6.RELEASE]
... 21 common frames omitted
2019-06-06 15:18:43.348 INFO 8576 --- [ Thread-5] o.s.a.r.l.DirectMessageListenerContainer : Shutdown ignored - container is not active already
2019-06-06 15:18:43.349 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Shutdown ignored - container is not active already
2019-06-06 15:18:43.349 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Shutdown ignored - container is not active already
2019-06-06 15:18:43.349 ERROR 8576 --- [ Thread-5] o.a.q.s.c.updater.TaskExecutorImpl : Task executor Broker-Config is not in ACTIVE state, unable to execute : Task['close' on 'SystemConfig[id=00000000-0000-0000-0000-000000000000, name=System, type=Memory]']
2019-06-06 15:18:43.351 INFO 8576 --- [ Thread-5] o.s.b.f.support.DisposableBeanAdapter : Destroy method 'shutdown' on bean with name 'qpidBroker' threw an exception: java.lang.IllegalStateException: Task executor Broker-Config is not in ACTIVE state
Results:
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
What am I doing wrong?
From the logs above I'm not sure there is a problem per se. It looks like the test "completes" but then you see errors because the broker has closed before the client has shut down, so when the client tries to shut down it gets errors trying to communicate with the broker:
2019-06-06 15:18:40.359 INFO 8576 --- [-default-Config] qpid.message.subscription.create : [con:0(guest#/127.0.0.1:49260/default)/ch:4] [sub:3(vh(/default)/qu(q.sample.first)] SUB-1001 : Create
2019-06-06 15:18:40.360 INFO 8576 --- [sageContainer-1] o.s.a.r.l.DirectMessageListenerContainer : SimpleConsumer [queue=q.sample.first, consumerTag=sgen_1 identity=18b733e3] started
2019-06-06 15:18:40.363 INFO 8576 --- [ main] com.example.MyMessageListenerTest : Started MyMessageListenerTest in 3.502 seconds (JVM running for 4.314)
Everything is fine up to this point.
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.027 s - in com.example.MyMessageListenerTest
This indicates the test finished.
2019-06-06 15:18:40.561 INFO 8576 --- [-default-Config] qpid.message.subscription.close : [con:0(guest#/127.0.0.1:49260/default)/ch:1] [sub:0(vh(/default)/qu(q.sample.second)] SUB-1002 : Close
2019-06-06 15:18:40.566 INFO 8576 --- [ Thread-5] o.s.a.r.l.SimpleMessageListenerContainer : Waiting for workers to finish.
2019-06-06 15:18:40.568 INFO 8576 --- [127.0.0.1:49260] qpid.message.channel.close_forced : [IO Pool] [con:0(guest#/127.0.0.1:49260/default)/ch:1] CHN-1003 : Close : 320 - Connection closed by external action
The messages above indicate the server is closing open connections (presumably because the broker is being shut down).
You then get exceptions indicating that the client can't connect to the server, but if you look at the methods from which these exceptions are thrown, they are all part of the client's own shutdown logic.
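One way to confirm this (just a sketch, not part of the original test; the autowired field and the point at which stop() is called are assumptions) is to stop the listener container from inside the test, while the embedded broker is still running, so that the cancel/close traffic reaches a live broker:
@Autowired
@Qualifier("firstQueueMessageContainer")
private DirectMessageListenerContainer container;

@Test
public void test() {
    // ... exercise the listener as before ...
    container.stop(); // assumption: stopping here, before the broker's destroy method runs, should avoid the AmqpConnectException
}
If the exception goes away with that change, the errors really are just shutdown-ordering noise between the broker bean and the Rabbit client, not a problem with consumersPerQueue itself.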

Resources