Sleuth does not show Trace Id / Span Id in logs for WebClient REST calls - spring-boot

On a REST API call with WebClient, a few default logs like the one below are printed, but Sleuth does not add the trace id to them:
2022-08-10 10:18:26.123 DEBUG [cib_bulk,,] 1 --- [or-http-epoll-1] r.netty.http.client.HttpClientConnect : [7c54bef8-1, L:/1.1.1.:60568 - R:xyz.c11.1.1.:443] Handler is being applied: {uri=xyz.c/services/productInventory/v2/product/search/count?abc=2346&status=ACTIVE, method=GET}
Only the application name is attached here ([cib_bulk,,]). Yet everywhere else in the application, when I log manually through a logger, Sleuth attaches the trace id and span id.
@Bean
public WebClient webClientWithTimeout() {
    String baseUrl = environment.getProperty("cibase.productapi.service.url");
    LOG.info("Base Url of Product Inventory Service: {}", baseUrl);
    String username = environment.getProperty("cibase.productapi.basicauth.username");
    String password = environment.getProperty("cibase.productapi.basicauth.password");
    String trackingid = environment.getProperty("cibase.productapi.basicauth.trackingid");
    String trackingIdValue = environment.getProperty("cibase.productapi.basicauth.trackingid.value");

    HttpClient httpClient = HttpClient.create();
    Builder builder =
        WebClient.builder()
            .codecs(configurer -> configurer.defaultCodecs().maxInMemorySize(IN_MEMORY_SIZE))
            .filter(basicAuthentication(username, password));
    if (trackingid != null) {
        builder.defaultHeader(trackingid, trackingIdValue);
    }
    return builder.baseUrl(baseUrl).clientConnector(new ReactorClientHttpConnector(httpClient)).build();
}
=============
List<Product> productList = webClient
    .get()
    .uri(uriBuilder -> uriBuilder.path(MessageConstants.PRODUCT_INVENTORY_API_URL).replaceQuery(queryString).build())
    .retrieve()
    .bodyToFlux(Product.class)
    .collectList()
    .retryWhen(retryConfiguration())
    .block();
=====
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-sleuth</artifactId>
    <version>3.1.1</version>
</dependency>

I found the solution. Just use the code below to print the trace id and span id in the logs. This code is also useful for printing the REST call's request and response body in a readable format.
import static org.springframework.web.reactive.function.client.ExchangeFilterFunctions.basicAuthentication;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import org.springframework.http.client.reactive.ReactorClientHttpConnector;
import org.springframework.web.reactive.function.client.WebClient;
import org.springframework.web.reactive.function.client.WebClient.Builder;
import brave.http.HttpTracing;
import io.netty.handler.logging.LogLevel;
import reactor.netty.http.brave.ReactorNettyHttpTracing;
import reactor.netty.http.client.HttpClient;
import reactor.netty.transport.logging.AdvancedByteBufFormat;

@Configuration
public class WebClientConfiguration {

    public static final Logger LOG = LoggerFactory.getLogger(WebClientConfiguration.class);

    @Autowired
    private Environment environment;

    private static final int IN_MEMORY_SIZE = -1; // unlimited in-memory buffer size

    /* Step 1: this bean is responsible for adding the Sleuth-generated trace id and span id to the logs */
    @Bean
    ReactorNettyHttpTracing reactorNettyHttpTracing(final HttpTracing httpTracing) {
        return ReactorNettyHttpTracing.create(httpTracing);
    }

    @Bean
    public WebClient webClientWithTimeout(final ReactorNettyHttpTracing reactorNettyHttpTracing) {
        String baseUrl = environment.getProperty("cibase.productapi.service.url");
        LOG.info("Base Url of Product Inventory Service: {}", baseUrl);
        String username = environment.getProperty("cibase.productapi.basicauth.username");
        String password = environment.getProperty("cibase.productapi.basicauth.password");
        String trackingid = environment.getProperty("cibase.productapi.basicauth.trackingid");
        String trackingIdValue = environment.getProperty("cibase.productapi.basicauth.trackingid.value");

        // wiretap is used to log the request and response body
        HttpClient httpClient = HttpClient.create()
                .wiretap(this.getClass().getCanonicalName(), LogLevel.DEBUG, AdvancedByteBufFormat.TEXTUAL);

        Builder builder =
            WebClient.builder()
                .codecs(configurer -> configurer.defaultCodecs().maxInMemorySize(IN_MEMORY_SIZE))
                .filter(basicAuthentication(username, password));
        if (trackingid != null) {
            builder.defaultHeader(trackingid, trackingIdValue);
        }

        /* Step 2: the reactorNettyHttpTracing bean from above is used to decorate the HttpClient */
        return builder
            .baseUrl(baseUrl)
            .clientConnector(new ReactorClientHttpConnector(reactorNettyHttpTracing.decorateHttpClient(httpClient)))
            .build();
    }
}
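A note on the wiretap call above: Reactor Netty logs the request/response bodies at the level given (DEBUG here) against the logger category passed as the first argument, so that category must be enabled in your logging configuration or nothing will show up. A minimal application.properties sketch, assuming the configuration class sits in a package such as com.example.config (adjust to your actual package):

logging.level.com.example.config.WebClientConfiguration=DEBUG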

Related

How can I implement slf4j MDC in Spring Boot WebFlux [duplicate]

I referenced the blog post Contextual Logging with Reactor Context and MDC, but I don't know how to access the Reactor context in a WebFilter.
import java.util.List;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;

@Component
public class RequestIdFilter implements WebFilter {

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        List<String> myHeader = exchange.getRequest().getHeaders().get("X-My-Header");
        if (myHeader != null && !myHeader.isEmpty()) {
            MDC.put("myHeader", myHeader.get(0));
        }
        return chain.filter(exchange);
    }
}
Here's one solution based on the latest approach, as of May 2021, taken from the official documentation:
import java.util.List;
import java.util.Optional;
import java.util.UUID;
import java.util.function.Consumer;
import lombok.extern.slf4j.Slf4j;
import org.slf4j.MDC;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpHeaders;
import org.springframework.http.server.reactive.ServerHttpRequest;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Signal;
import reactor.util.context.Context;

@Slf4j
@Configuration
public class RequestIdFilter implements WebFilter {

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        ServerHttpRequest request = exchange.getRequest();
        String requestId = getRequestId(request.getHeaders());
        return chain
            .filter(exchange)
            .doOnEach(logOnEach(r -> log.info("{} {}", request.getMethod(), request.getURI())))
            .contextWrite(Context.of("CONTEXT_KEY", requestId));
    }

    private String getRequestId(HttpHeaders headers) {
        List<String> requestIdHeaders = headers.get("X-Request-ID");
        return requestIdHeaders == null || requestIdHeaders.isEmpty()
            ? UUID.randomUUID().toString()
            : requestIdHeaders.get(0);
    }

    public static <T> Consumer<Signal<T>> logOnEach(Consumer<T> logStatement) {
        return signal -> {
            String contextValue = signal.getContextView().get("CONTEXT_KEY");
            try (MDC.MDCCloseable cMdc = MDC.putCloseable("MDC_KEY", contextValue)) {
                logStatement.accept(signal.get());
            }
        };
    }

    public static <T> Consumer<Signal<T>> logOnNext(Consumer<T> logStatement) {
        return signal -> {
            if (!signal.isOnNext()) return;
            String contextValue = signal.getContextView().get("CONTEXT_KEY");
            try (MDC.MDCCloseable cMdc = MDC.putCloseable("MDC_KEY", contextValue)) {
                logStatement.accept(signal.get());
            }
        };
    }
}
Given you have the following line in your application.properties:
logging.pattern.level=[%X{MDC_KEY}] %5p
then every time an endpoint is called your server logs will contain a log like this:
2021-05-06 17:07:41.852 [60b38305-7005-4a05-bac7-ab2636e74d94] INFO 20158 --- [or-http-epoll-6] my.package.RequestIdFilter : GET http://localhost:12345/my-endpoint/444444/
Every time you want to manually log something within a reactive context, you will have to add the following to your reactive chain:
.doOnEach(logOnNext(r -> log.info("Something")))
If you want the X-Request-ID to be propagated to other services for distributed tracing, you need to read it from the reactive context (not from MDC) and wrap your WebClient code with the following:
Mono.deferContextual(
    ctx -> {
        RequestHeadersSpec<?> request = webClient.get().uri(uri);
        request = request.header("X-Request-ID", ctx.get("CONTEXT_KEY"));
        // The rest of your request logic...
    });
You can do something similar to the code below. You can set the context with any class you like; for this example I just used headers, but a custom class will do just fine.
If you set it here, then any logging in handlers etc. will also have access to the context.
The logWithContext method below sets the MDC and clears it afterwards. Obviously this can be replaced with anything you like.
public class RequestIdFilter implements WebFilter {

    private Logger LOG = LoggerFactory.getLogger(RequestIdFilter.class);

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        HttpHeaders headers = exchange.getRequest().getHeaders();
        return chain.filter(exchange)
            .doAfterSuccessOrError((r, t) -> logWithContext(headers, httpHeaders -> LOG.info("Some message with MDC set")))
            .subscriberContext(Context.of(HttpHeaders.class, headers));
    }

    static void logWithContext(HttpHeaders headers, Consumer<HttpHeaders> logAction) {
        try {
            headers.forEach((name, values) -> MDC.put(name, values.get(0)));
            logAction.accept(headers);
        } finally {
            headers.keySet().forEach(MDC::remove);
        }
    }
}
As of Spring Boot 2.2 there is Schedulers.onScheduleHook that enables you to handle MDC:
Schedulers.onScheduleHook("mdc", runnable -> {
    Map<String, String> map = MDC.getCopyOfContextMap();
    return () -> {
        if (map != null) {
            MDC.setContextMap(map);
        }
        try {
            runnable.run();
        } finally {
            MDC.clear();
        }
    };
});
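Where you register this hook is up to you; one option (an assumption on my part, not something prescribed by the answer) is a small configuration class that installs it once at startup:

import java.util.Map;
import javax.annotation.PostConstruct;
import org.slf4j.MDC;
import org.springframework.context.annotation.Configuration;
import reactor.core.scheduler.Schedulers;

@Configuration
public class MdcSchedulerHookConfiguration {

    // Install the hook once, before any reactive work is scheduled.
    @PostConstruct
    void registerMdcHook() {
        Schedulers.onScheduleHook("mdc", runnable -> {
            Map<String, String> contextMap = MDC.getCopyOfContextMap();
            return () -> {
                if (contextMap != null) {
                    MDC.setContextMap(contextMap);
                }
                try {
                    runnable.run();
                } finally {
                    MDC.clear();
                }
            };
        });
    }
}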
Alternatively, Hooks.onEachOperator can be used to pass around the MDC values via subscriber context.
http://ttddyy.github.io/mdc-with-webclient-in-webmvc/
This is not a full MDC solution; e.g., in my case I cannot clean up MDC values in R2DBC threads.
UPDATE: this article really solves my MDC problem: https://www.novatec-gmbh.de/en/blog/how-can-the-mdc-context-be-used-in-the-reactive-spring-applications/
It provides a correct way of updating the MDC based on the subscriber context.
Combine it with the SecurityContext::class.java key populated by AuthenticationWebFilter and you will be able to put the user login into your logs.
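For reference, a minimal Java sketch of the Hooks.onEachOperator approach described in that article (the class name, hook key, and String-only filtering are my own choices; treat it as an illustration rather than the article's exact code):

import org.reactivestreams.Subscription;
import org.slf4j.MDC;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Hooks;
import reactor.core.publisher.Operators;
import reactor.util.context.Context;

public final class MdcContextLifterHook {

    private static final String HOOK_KEY = "mdcContextLifter";

    // Call once at application startup (and Hooks.resetOnEachOperator(HOOK_KEY) on shutdown).
    public static void install() {
        Hooks.onEachOperator(HOOK_KEY,
                Operators.lift((scannable, subscriber) -> new MdcContextLifter<>(subscriber)));
    }

    // Copies String entries from the Reactor Context into the MDC on every signal.
    static final class MdcContextLifter<T> implements CoreSubscriber<T> {

        private final CoreSubscriber<? super T> delegate;

        MdcContextLifter(CoreSubscriber<? super T> delegate) {
            this.delegate = delegate;
        }

        @Override
        public void onSubscribe(Subscription subscription) {
            delegate.onSubscribe(subscription);
        }

        @Override
        public void onNext(T value) {
            copyContextToMdc(delegate.currentContext());
            delegate.onNext(value);
        }

        @Override
        public void onError(Throwable throwable) {
            copyContextToMdc(delegate.currentContext());
            delegate.onError(throwable);
        }

        @Override
        public void onComplete() {
            delegate.onComplete();
        }

        @Override
        public Context currentContext() {
            return delegate.currentContext();
        }

        private void copyContextToMdc(Context context) {
            context.stream()
                    .filter(entry -> entry.getKey() instanceof String && entry.getValue() instanceof String)
                    .forEach(entry -> MDC.put((String) entry.getKey(), (String) entry.getValue()));
        }
    }
}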
My solution is based on the Reactor 3 Reference Guide approach, but uses doOnSuccess instead of doOnEach.
The main idea is to use the Context for MDC propagation in the following way:
1. Fill a downstream Context (which will be used by derived threads) with the MDC state from the upstream flow (can be done with .contextWrite(context -> Context.of(MDC.getCopyOfContextMap()))).
2. Access the downstream Context in derived threads and fill the MDC in those threads with values from the downstream Context (the main challenge).
3. Clear the MDC in the downstream Context (can be done with .doFinally(signalType -> MDC.clear())).
The main problem is accessing the downstream Context in derived threads; you can implement step 2 with whatever approach is most convenient for you. Here is my solution:
webclient.post()
    .bodyValue(someRequestData)
    .retrieve()
    .bodyToMono(String.class)
    // By this action we wrap our response with a new Mono and also
    // in parallel fill MDC with values from a downstream Context because
    // we have access to it
    .flatMap(wrapWithFilledMDC())
    .doOnSuccess(response -> someActionWhichRequiresFilledMdc(response))
    // Fill a downstream context with the current MDC state
    .contextWrite(context -> Context.of(MDC.getCopyOfContextMap()))
    // Allows us to clear MDC from derived threads
    .doFinally(signalType -> MDC.clear())
    .block();
// Function which implements the second step from the above main idea
public static <T> Function<T, Mono<T>> wrapWithFilledMDC() {
    // Using deferContextual we have access to the downstream Context, so
    // we can just fill MDC in derived threads with
    // values from the downstream Context
    return item -> Mono.deferContextual(contextView -> {
        // Function for filling MDC with Context values
        // (you can apply your own action)
        fillMdcWithContextValues(contextView);
        return Mono.just(item);
    });
}

public static void fillMdcWithContextValues(ContextView contextView) {
    contextView.forEach(
        (key, value) -> {
            if (key instanceof String keyStr && value instanceof String valueStr) {
                MDC.put(keyStr, valueStr);
            }
        });
}
This approach can also be applied to the doOnError and onErrorResume methods, since the main idea is the same.
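For example, the error path could be handled the same way (a chain fragment only, reusing the fillMdcWithContextValues helper above; logger stands for your class's SLF4J logger):

.onErrorResume(throwable -> Mono.deferContextual(contextView -> {
    // Restore the MDC from the downstream Context before logging the failure
    fillMdcWithContextValues(contextView);
    logger.error("Request failed", throwable);
    return Mono.error(throwable);
}))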
Used versions:
spring-boot: 2.7.3
spring-webflux: 5.3.22 (from spring-boot)
reactor-core: 3.4.22 (from spring-webflux)
reactor-netty: 1.0.22 (from spring-webflux)
I achieved this with:
package com.nks.app.filter;

import lombok.extern.slf4j.Slf4j;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;

/**
 * @author nks
 */
@Component
@Slf4j
public class SessionIDFilter implements WebFilter {

    private static final String APP_SESSION_ID = "app-session-id";

    /**
     * Process the Web request and (optionally) delegate to the next
     * {@code WebFilter} through the given {@link WebFilterChain}.
     *
     * @param serverWebExchange the current server exchange
     * @param webFilterChain    provides a way to delegate to the next filter
     * @return {@code Mono<Void>} to indicate when request processing is complete
     */
    @Override
    public Mono<Void> filter(ServerWebExchange serverWebExchange, WebFilterChain webFilterChain) {
        serverWebExchange.getResponse()
                .getHeaders().add(APP_SESSION_ID, serverWebExchange.getRequest().getHeaders().getFirst(APP_SESSION_ID));
        MDC.put(APP_SESSION_ID, serverWebExchange.getRequest().getHeaders().getFirst(APP_SESSION_ID));
        log.info("[{}] : Inside filter of SessionIDFilter, ADDED app-session-id in MDC Logs", MDC.get(APP_SESSION_ID));
        return webFilterChain.filter(serverWebExchange);
    }
}
and values associated with app-session-id for the thread can be logged.
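To actually see that value in the output, the MDC key also has to appear in the logging pattern, for example (assuming Spring Boot's logging.pattern.level property, as in the earlier answer):

logging.pattern.level=[%X{app-session-id}] %5p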

Calling a blocking Feign client from a reactive Spring service

I am trying to call a generated Feign client from a reactive Spring flux like this:
.doOnNext(user1 -> {
    ResponseEntity<Void> response = recorderClient.createUserProfile(new UserProfileDto().principal(user1.getLogin()));
    if (!response.getStatusCode().equals(HttpStatus.OK)) {
        log.error("recorder backend could not create user profile for user: {} ", user1.getLogin());
        throw new RuntimeException("recorder backend could not create user profile for login name" + user1.getLogin());
    }
})
The call is executed, but when I try to retrieve the JWT token from the reactive security context (in a request interceptor) like this:
public static Mono<String> getCurrentUserJWT() {
    return ReactiveSecurityContextHolder
        .getContext()
        .map(SecurityContext::getAuthentication)
        .filter(authentication -> authentication.getCredentials() instanceof String)
        .map(authentication -> (String) authentication.getCredentials());
}
....
SecurityUtils.getCurrentUserJWT().blockOptional().ifPresent(s -> template.header(AUTHORIZATION_HEADER, String.format("%s %s", BEARER, s)));
the context is empty. As I am pretty new to reactive Spring, I am surely missing something stupid and important.
Not sure how your interceptor is configured, but in my case I simply implement ReactiveHttpRequestInterceptor and override the apply() function:
import feign.RequestInterceptor;
import feign.RequestTemplate;
import org.springframework.stereotype.Component;
import reactivefeign.client.ReactiveHttpRequest;
import reactivefeign.client.ReactiveHttpRequestInterceptor;
import reactor.core.publisher.Mono;
import java.util.Collections;

@Component
public class UserFeignClientInterceptor implements ReactiveHttpRequestInterceptor {

    private static final String AUTHORIZATION_HEADER = "Authorization";
    private static final String BEARER = "Bearer";

    @Override
    public Mono<ReactiveHttpRequest> apply(ReactiveHttpRequest reactiveHttpRequest) {
        return SecurityUtils.getCurrentUserJWT()
            .flatMap(s -> {
                reactiveHttpRequest.headers().put(AUTHORIZATION_HEADER, Collections.singletonList(String.format("%s %s", BEARER, s)));
                return Mono.just(reactiveHttpRequest);
            });
    }
}

Spring-data-elasticsearch: cannot convert from Flux<SearchHit<Sugestao>> to Flux<Sugestao> after updating to 7.6.2. How to deal with SearchHit?

Context: I want to use Elasticsearch in a fully reactive stack composed of Elasticsearch and Spring WebFlux.
It is my first time using springframework.data.elasticsearch.client.reactive.ReactiveElasticsearchClient and springframework.data.elasticsearch.core.ReactiveElasticsearchOperations. I have worked on a reactive stack using MongoDB, but it is my first time with Elasticsearch.
I have successfully followed a tutorial using ReactiveElasticsearchOperations with spring-data-elasticsearch-3.2.6 and elasticsearch-6.8.7 (Elastic Tutorial),
and findAll/findById work properly with elasticsearch-6.8.7 and spring-data-elasticsearch-3.2.6.
MyModelService:
...
private final ReactiveElasticsearchOperations reactiveElasticsearchOperations;
private final ReactiveElasticsearchClient reactiveElasticsearchClient;

public MyModelServiceImpl(ReactiveElasticsearchOperations reactiveElasticsearchOperations,
                          ReactiveElasticsearchClient reactiveElasticsearchClient) {
    this.reactiveElasticsearchOperations = reactiveElasticsearchOperations;
    this.reactiveElasticsearchClient = reactiveElasticsearchClient;
}

@Override
public Mono<MyModel> findMyModelById(String id) {
    return reactiveElasticsearchOperations.findById(
        id,
        MyModel.class,
        MYMODEL_ES_INDEX,
        DEFAULT_ES_DOC_TYPE
    ).doOnError(throwable -> logger.error(throwable.getMessage(), throwable));
}

@Override
public Flux<MyModel> findAllMyModels(String field, String value) {
    NativeSearchQueryBuilder query = new NativeSearchQueryBuilder();
    if (!StringUtils.isEmpty(field) && !StringUtils.isEmpty(value)) {
        query.withQuery(QueryBuilders.matchQuery(field, value));
    }
    return reactiveElasticsearchOperations.find(
        query.build(),
        MyModel.class,
        MYMODEL_ES_INDEX
    ).doOnError(throwable -> logger.error(throwable.getMessage(), throwable));
}
I tried to follow the same idea with updated versions (spring-data-elasticsearch-4 and elasticsearch-7.6.2). Since I can read "Deprecated. since 4.0, use search(Query, ...) Flux emitting matching entities one by one wrapped in a SearchHit.", I got completely stuck because the result is wrapped in SearchHit. Searching around, I didn't understand why there is such a wrapper, nor how to convert/map/flatMap/etc. it to a Flux of my model to return from the controller method.
Here is my attempt, which causes the issue mentioned in the question title:
service:
import com.poc.favoritos.model.Sugestao;
import org.elasticsearch.index.query.QueryBuilders;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.elasticsearch.client.reactive.ReactiveElasticsearchClient;
import org.springframework.data.elasticsearch.core.ReactiveElasticsearchOperations;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.util.StringUtils;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class SugestaoServiceImpl implements SugestaoService {

    private static final Logger logger = LoggerFactory.getLogger(SugestaoServiceImpl.class);

    private final ReactiveElasticsearchOperations reactiveElasticsearchOperations;
    private final ReactiveElasticsearchClient reactiveElasticsearchClient;

    public SugestaoServiceImpl(ReactiveElasticsearchOperations reactiveElasticsearchOperations,
                               ReactiveElasticsearchClient reactiveElasticsearchClient) {
        this.reactiveElasticsearchOperations = reactiveElasticsearchOperations;
        this.reactiveElasticsearchClient = reactiveElasticsearchClient;
    }

    @Override
    public Mono<Sugestao> findSugestaoById(String id) {
        return reactiveElasticsearchOperations.get(id, Sugestao.class)
            .doOnError(throwable -> logger.error(throwable.getMessage(), throwable));
    }

    @Override
    public Flux<Sugestao> findAllMySugestoes(String field, String value) {
        NativeSearchQueryBuilder query = new NativeSearchQueryBuilder();
        if (!StringUtils.isEmpty(field) && !StringUtils.isEmpty(value)) {
            query.withQuery(QueryBuilders.matchQuery(field, value));
        }
        return reactiveElasticsearchOperations.search(query.build(), Sugestao.class);
    }
}
ElasticsearchConfig, originally copied from the same tutorial mentioned above. Honestly, I am not sure why I need it and what this config adds to my project. BTW, I am also studying it from the operations reference.
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.client.ClientConfiguration;
import org.springframework.data.elasticsearch.client.reactive.ReactiveElasticsearchClient;
import org.springframework.data.elasticsearch.client.reactive.ReactiveRestClients;
import org.springframework.data.elasticsearch.core.ReactiveElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ReactiveElasticsearchTemplate;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchConverter;
import org.springframework.data.elasticsearch.core.convert.MappingElasticsearchConverter;
import org.springframework.data.elasticsearch.core.mapping.SimpleElasticsearchMappingContext;
import org.springframework.web.reactive.function.client.ExchangeStrategies;

@Configuration
public class ElasticsearchConfig {

    @Value("${spring.data.elasticsearch.client.reactive.endpoints}")
    private String elassandraHostAndPort;

    @Bean
    public ReactiveElasticsearchClient reactiveElasticsearchClient() {
        ClientConfiguration clientConfiguration = ClientConfiguration.builder()
            .connectedTo(elassandraHostAndPort)
            .withWebClientConfigurer(webClient -> {
                ExchangeStrategies exchangeStrategies = ExchangeStrategies.builder()
                    .codecs(configurer -> configurer.defaultCodecs()
                        .maxInMemorySize(-1))
                    .build();
                return webClient.mutate().exchangeStrategies(exchangeStrategies).build();
            })
            .build();
        return ReactiveRestClients.create(clientConfiguration);
    }

    @Bean
    public ElasticsearchConverter elasticsearchConverter() {
        return new MappingElasticsearchConverter(elasticsearchMappingContext());
    }

    @Bean
    public SimpleElasticsearchMappingContext elasticsearchMappingContext() {
        return new SimpleElasticsearchMappingContext();
    }

    @Bean
    public ReactiveElasticsearchOperations reactiveElasticsearchOperations() {
        return new ReactiveElasticsearchTemplate(reactiveElasticsearchClient(), elasticsearchConverter());
    }
}
As for the SearchHit: this class contains information from a search result that is not part of the entity but part of the search result, like the score, sort values, and highlight entries.
If you don't need this and just want to have a Flux with the entity alone:
Flux<SearchHit<Entity>> fluxSearchHits = ...
Flux<Entity> fluxEntity = fluxSearchHits.map(searchHit -> searchHit.getContent());
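Applied to the findAllMySugestoes method from the question, that could look roughly like this (a sketch; it additionally needs an import of org.springframework.data.elasticsearch.core.SearchHit):

@Override
public Flux<Sugestao> findAllMySugestoes(String field, String value) {
    NativeSearchQueryBuilder query = new NativeSearchQueryBuilder();
    if (!StringUtils.isEmpty(field) && !StringUtils.isEmpty(value)) {
        query.withQuery(QueryBuilders.matchQuery(field, value));
    }
    // search(...) emits SearchHit<Sugestao>; map unwraps the entity
    return reactiveElasticsearchOperations
        .search(query.build(), Sugestao.class)
        .map(SearchHit::getContent);
}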
As for the configuration: you need the ReactiveElasticsearchClient bean to configure Spring Data Elasticsearch. As for the other three beans: I don't know why they are there; they are not needed for Spring Data Elasticsearch 4.0.
Edit 16.05.2020:
The configuration: You should derive your configuration class from AbstractReactiveElasticsearchConfiguration, then you don't need the other beans, because the base class defines the necessary things:
@Configuration
public class ElasticsearchConfig extends AbstractReactiveElasticsearchConfiguration {

    @Value("${spring.data.elasticsearch.client.reactive.endpoints}")
    private String elassandraHostAndPort;

    @Bean
    public ReactiveElasticsearchClient reactiveElasticsearchClient() {
        ClientConfiguration clientConfiguration = ClientConfiguration.builder()
            .connectedTo(elassandraHostAndPort)
            .build();
        return ReactiveRestClients.create(clientConfiguration);
    }
}
The customized WebClient configuration is only needed if you retrieve large result sets and the default in-memory size for the result buffer is too low.

Spring-Integration: Tcp Server Response not sent on Exception

I migrated legacy TCP server code into Spring Boot and added Spring Integration (annotation based) dependencies to handle TCP socket connections.
My inbound channel is tcpIn(), my outbound channel is serviceChannel(), and I have created a custom channel [exceptionEventChannel()] to hold exception event messages.
I have a custom serializer/deserializer (ByteArrayLengthPrefixSerializer extends AbstractPooledBufferByteArraySerializer) and a MessageHandler @ServiceActivator method to send the response back to the TCP client.
//SpringBoot 2.0.3.RELEASE, Spring Integration 5.0.6.RELEASE
package com.test.config;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.event.EventListener;
import org.springframework.integration.annotation.IntegrationComponentScan;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.annotation.Transformer;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.event.inbound.ApplicationEventListeningMessageProducer;
import org.springframework.integration.ip.IpHeaders;
import org.springframework.integration.ip.tcp.TcpReceivingChannelAdapter;
import org.springframework.integration.ip.tcp.TcpSendingMessageHandler;
import org.springframework.integration.ip.tcp.connection.*;
import org.springframework.integration.ip.tcp.serializer.TcpDeserializationExceptionEvent;
import org.springframework.integration.router.ErrorMessageExceptionTypeRouter;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHandlingException;
import org.springframework.messaging.MessagingException;
import java.io.IOException;

@Configuration
@IntegrationComponentScan
public class TcpConfiguration {

    @SuppressWarnings("unused")
    @Value("${tcp.connection.port}")
    private int tcpPort;

    @Bean
    TcpConnectionEventListener customerTcpListener() {
        return new TcpConnectionEventListener();
    }

    @Bean
    public MessageChannel tcpIn() {
        return new DirectChannel();
    }

    @Bean
    public MessageChannel serviceChannel() {
        return new DirectChannel();
    }

    @ConditionalOnMissingBean(name = "errorChannel")
    @Bean
    public MessageChannel errorChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageChannel exceptionEventChannel() {
        return new DirectChannel();
    }

    @Bean
    public ByteArrayLengthPrefixSerializer byteArrayLengthPrefixSerializer() {
        ByteArrayLengthPrefixSerializer byteArrayLengthPrefixSerializer = new ByteArrayLengthPrefixSerializer();
        byteArrayLengthPrefixSerializer.setMaxMessageSize(98304); //max allowed size set to 96kb
        return byteArrayLengthPrefixSerializer;
    }

    @Bean
    public AbstractServerConnectionFactory tcpNetServerConnectionFactory() {
        TcpNetServerConnectionFactory tcpServerCf = new TcpNetServerConnectionFactory(tcpPort);
        tcpServerCf.setSerializer(byteArrayLengthPrefixSerializer());
        tcpServerCf.setDeserializer(byteArrayLengthPrefixSerializer());
        return tcpServerCf;
    }

    @Bean
    public TcpReceivingChannelAdapter tcpReceivingChannelAdapter() {
        TcpReceivingChannelAdapter adapter = new TcpReceivingChannelAdapter();
        adapter.setConnectionFactory(tcpNetServerConnectionFactory());
        adapter.setOutputChannel(tcpIn());
        adapter.setErrorChannel(exceptionEventChannel());
        return adapter;
    }

    @ServiceActivator(inputChannel = "exceptionEventChannel", outputChannel = "serviceChannel")
    public String handle(Message<MessagingException> msg) {
        //String unfilteredMessage = new String(byteMessage, StandardCharsets.US_ASCII);
        System.out.println("-----------------EXCEPTION ==> " + msg);
        return msg.toString();
    }

    @Transformer(inputChannel = "errorChannel", outputChannel = "serviceChannel")
    public String transformer(String msg) {
        //String unfilteredMessage = new String(byteMessage, StandardCharsets.US_ASCII);
        System.out.println("-----------------ERROR ==> " + msg);
        return msg.toString();
    }

    @ServiceActivator(inputChannel = "serviceChannel")
    @Bean
    public TcpSendingMessageHandler out(AbstractServerConnectionFactory cf) {
        TcpSendingMessageHandler tcpSendingMessageHandler = new TcpSendingMessageHandler();
        tcpSendingMessageHandler.setConnectionFactory(cf);
        return tcpSendingMessageHandler;
    }

    @Bean
    public ApplicationListener<TcpDeserializationExceptionEvent> listener() {
        return new ApplicationListener<TcpDeserializationExceptionEvent>() {
            @Override
            public void onApplicationEvent(TcpDeserializationExceptionEvent tcpDeserializationExceptionEvent) {
                exceptionEventChannel().send(MessageBuilder.withPayload(tcpDeserializationExceptionEvent.getCause())
                    .build());
            }
        };
    }
}
Messages on tcpIn() are sent to a @ServiceActivator method inside a separate @Component class, which is structured like so:
@Component
public class TcpServiceActivator {

    @Autowired
    public TcpServiceActivator() {
    }

    @ServiceActivator(inputChannel = "tcpIn", outputChannel = "serviceChannel")
    public String service(byte[] byteMessage) {
        // Business logic returns a String ack response
    }
}
I don't have issues running the success scenario. My TCP test client gets the ack response as expected.
However, when I try to simulate an exception, say a deserializer exception, the exception message is not sent back as a response to the TCP client.
I can see my ApplicationListener getting the TcpDeserializationExceptionEvent and sending the message to exceptionEventChannel. The @ServiceActivator method handle(Message msg) also prints my exception message. But it never reaches the breakpoints (in debug mode) inside the MessageHandler method out(AbstractServerConnectionFactory cf).
I am struggling to understand what's going wrong. Thanks for any help in advance.
UPDATE: I notice that the socket is closed due to the exception before the response can be sent. I'm trying to figure out a way around this.
SOLUTION UPDATE (12th Mar 2019):
Courtesy of Gary, I edited my deserializer to return a message that can be recognized by a @Router method and redirected to errorChannel. The @ServiceActivator listening to errorChannel then sends the desired error message to outputChannel. This solution seems to work.
My deserializer method inside ByteArrayLengthPrefixSerializer returns a "special value", as Gary recommended, instead of the original input stream message:
public byte[] doDeserialize(InputStream inputStream, byte[] buffer) throws IOException {
    boolean isValidMessage = false;
    try {
        int messageLength = this.readPrefix(inputStream);
        if (messageLength > 0 && fillUntilMaxDeterminedSize(inputStream, buffer, messageLength)) {
            return this.copyToSizedArray(buffer, messageLength);
        }
        return EventType.MSG_INVALID.getName().getBytes();
    } catch (SoftEndOfStreamException eose) {
        return EventType.MSG_INVALID.getName().getBytes();
    }
}
I also made a few new channels to accommodate my @Router, such that the flow is as follows:
Success flow:
tcpIn (@Router) -> serviceChannel (@ServiceActivator that holds the business logic) -> outputChannel (@ServiceActivator that sends the response to the client)
Exception flow:
tcpIn (@Router) -> errorChannel (@ServiceActivator that prepares the error response message) -> outputChannel (@ServiceActivator that sends the response to the client)
My @Router and error-handling @ServiceActivator:
@Router(inputChannel = "tcpIn", defaultOutputChannel = "errorChannel")
public String messageRouter(byte[] byteMessage) {
    String unfilteredMessage = new String(byteMessage, StandardCharsets.US_ASCII);
    System.out.println("------------------> " + unfilteredMessage);
    if (Arrays.equals(EventType.MSG_INVALID.getName().getBytes(), byteMessage)) {
        return "errorChannel";
    }
    return "serviceChannel";
}

@ServiceActivator(inputChannel = "errorChannel", outputChannel = "outputChannel")
public String errorHandler(byte[] byteMessage) {
    return Message.ACK_RETRY;
}
The error channel is for handling exceptions that occur while processing a message. Deserialization errors occur before a message is created (the deserializer decodes the payload for the message).
Deserialization exceptions are fatal and, as you have observed, the socket is closed.
One option would be to catch the exception in the deserializer and return a "special" value that indicates a deserialization exception occurred, then check for that value in your main flow.

CXF InInterceptor not firing

I have created a web service. It works fine. Now I'm trying to add authentication to it. I'm using CXF interceptors for that purpose. For some reason the interceptors won't fire. What am I missing? This is my first web service.
import javax.annotation.Resource;
import javax.inject.Inject;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.WebServiceContext;
import org.apache.cxf.interceptor.InInterceptors;

@WebService
@InInterceptors(interceptors = "ws.BasicAuthAuthorizationInterceptor")
public class Service {

    @WebMethod
    public void test(@WebParam(name = "value") Integer value) throws Exception {
        System.out.println("Value = " + value);
    }
}
-
package ws;

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.cxf.binding.soap.interceptor.SoapHeaderInterceptor;
import org.apache.cxf.configuration.security.AuthorizationPolicy;
import org.apache.cxf.endpoint.Endpoint;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.message.Exchange;
import org.apache.cxf.message.Message;
import org.apache.cxf.transport.Conduit;
import org.apache.cxf.ws.addressing.EndpointReferenceType;

public class BasicAuthAuthorizationInterceptor extends SoapHeaderInterceptor {

    @Override
    public void handleMessage(Message message) throws Fault {
        System.out.println("**** GET THIS LINE TO CONSOLE TO SEE IF INTERCEPTOR IS FIRING!!!");
        AuthorizationPolicy policy = message.get(AuthorizationPolicy.class);

        // If the policy is not set, the user did not specify credentials.
        // 401 is sent to the client to indicate that authentication is required.
        if (policy == null) {
            sendErrorResponse(message, HttpURLConnection.HTTP_UNAUTHORIZED);
            return;
        }

        String username = policy.getUserName();
        String password = policy.getPassword();

        // CHECK USERNAME AND PASSWORD
        if (!checkLogin(username, password)) {
            System.out.println("handleMessage: Invalid username or password for user: "
                    + policy.getUserName());
            sendErrorResponse(message, HttpURLConnection.HTTP_FORBIDDEN);
        }
    }

    private boolean checkLogin(String username, String password) {
        if (username.equals("admin") && password.equals("admin")) {
            return true;
        }
        return false;
    }

    private void sendErrorResponse(Message message, int responseCode) {
        Message outMessage = getOutMessage(message);
        outMessage.put(Message.RESPONSE_CODE, responseCode);

        // Set the response headers
        @SuppressWarnings("unchecked")
        Map<String, List<String>> responseHeaders = (Map<String, List<String>>) message
                .get(Message.PROTOCOL_HEADERS);
        if (responseHeaders != null) {
            responseHeaders.put("WWW-Authenticate", Arrays.asList(new String[] { "Basic realm=realm" }));
            responseHeaders.put("Content-Length", Arrays.asList(new String[] { "0" }));
        }
        message.getInterceptorChain().abort();
        try {
            getConduit(message).prepare(outMessage);
            close(outMessage);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private Message getOutMessage(Message inMessage) {
        Exchange exchange = inMessage.getExchange();
        Message outMessage = exchange.getOutMessage();
        if (outMessage == null) {
            Endpoint endpoint = exchange.get(Endpoint.class);
            outMessage = endpoint.getBinding().createMessage();
            exchange.setOutMessage(outMessage);
        }
        outMessage.putAll(inMessage);
        return outMessage;
    }

    private Conduit getConduit(Message inMessage) throws IOException {
        Exchange exchange = inMessage.getExchange();
        EndpointReferenceType target = exchange.get(EndpointReferenceType.class);
        Conduit conduit = exchange.getDestination().getBackChannel(inMessage, null, target);
        exchange.setConduit(conduit);
        return conduit;
    }

    private void close(Message outMessage) throws IOException {
        OutputStream os = outMessage.getContent(OutputStream.class);
        os.flush();
        os.close();
    }
}
I've been fighting with this for a few days now. I don't know what to google any more. Help is appreciated.
I've found the solution. I was missing the following line in the MANIFEST.MF file in the war project:
Dependencies: org.apache.cxf
Maven wasn't including this line by itself, so I had to find a workaround. I found out about that here. It says: "When using annotations on your endpoints / handlers such as the Apache CXF ones (@InInterceptor, @GZIP, ...) remember to add the proper module dependency in your manifest. Otherwise your annotations are not picked up and added to the annotation index by JBoss Application Server 7, resulting in them being completely and silently ignored."
This is where I found out how to change the MANIFEST.MF file.
In short, I added a custom manifest file to my project and referenced it in pom.xml. Hope this helps someone.
The answer provided by Felix is accurate. I managed to solve the problem using his instructions. Just for completeness, here is the Maven config that lets you use your own MANIFEST.MF file placed in the META-INF folder:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <archive>
            <manifestFile>src/main/resources/META-INF/MANIFEST.MF</manifestFile>
        </archive>
    </configuration>
</plugin>
and here is the relevant content of the MANIFEST.MF file I was using:
Manifest-Version: 1.0
Description: yourdescription
Dependencies: org.apache.ws.security,org.apache.cxf
