API call not returning response with HttpRequestExecutingMessageHandler - spring-boot

I'm facing an issue where whichever API I call first using
HttpRequestExecutingMessageHandler
returns a response, but the second API, also called through
HttpRequestExecutingMessageHandler,
just hangs and eventually returns a 504 timeout, even though the server accepts the second request and finishes processing it. The two API call methods live in two different classes with separate output queue channels.
If I restart the server and call the second API first, that call now returns a 200 response, but the first API then stops returning its 200 response.
@Configuration
class OutgoingHttpChannelAdapterConfig {

    @Bean
    @Qualifier("responseChannel1")
    fun fromResponseChannel1(): QueueChannel = MessageChannels.queue().get()

    @Bean
    @Qualifier("customRestTemplateOut")
    fun customRestTemplateOut(): RestTemplate {
        return RestTemplate()
    }

    @Bean
    @ServiceActivator(inputChannel = "forwardDataChannel")
    @Throws(MessageHandlingException::class)
    fun forwardRequestMethod(
        @Qualifier("customRestTemplateOut") restTemplate: RestTemplate
    ): MessageHandler {
        val headerMapper = DefaultHttpHeaderMapper()
        headerMapper.setOutboundHeaderNames("Authorization", "key")
        val msgHandler = HttpRequestExecutingMessageHandler(url)
        msgHandler.setHeaderMapper(headerMapper)
        msgHandler.setHttpMethod(HttpMethod.POST)
        msgHandler.isExpectReply = true
        msgHandler.outputChannel = fromResponseChannel1()
        msgHandler.setExpectedResponseType(DataResponse::class.java)
        return msgHandler
    }
}
@Configuration
class IncomingHttpChannelAdapterConfig {

    @Bean
    @Qualifier("responseChannel2")
    fun fromResponseChannel2(): QueueChannel = MessageChannels.queue().get()

    @Bean
    @Qualifier("customRestTemplate")
    fun customRestTemplate(): RestTemplate {
        return RestTemplate()
    }

    @Bean
    @ServiceActivator(inputChannel = "acceptRequestChannel")
    @Throws(MessageHandlingException::class)
    fun acceptRequestMethod(
        @Qualifier("customRestTemplate") restTemplate: RestTemplate
    ): MessageHandler {
        val parser = SpelExpressionParser()
        val map = mapOf<String, Expression>(
            "id" to parser.parseRaw("payload.id")
        )
        val msgHandler = HttpRequestExecutingMessageHandler(url, restTemplate)
        msgHandler.setHeaderMapper(headerMapper)
        msgHandler.setHttpMethod(HttpMethod.PUT)
        msgHandler.outputChannel = fromResponseChannel2()
        msgHandler.setUriVariableExpressions(map)
        return msgHandler
    }
}
@MessagingGateway(
    defaultRequestChannel = "forwardDataChannel", errorChannel = "newErrorChannel",
    defaultReplyChannel = "replyChannel1"
)
interface ForwardRequest {
    fun forwardRequest(msg: Message<MessageNotification>): Message<*>
}

@MessagingGateway(
    defaultRequestChannel = "acceptRequestChannel", errorChannel = "newErrorChannel",
    defaultReplyChannel = "replyChannel2"
)
interface AcceptRequest {
    fun acceptRequest(msg: Message<MessageNotification>): Message<*>
}
Right now we are not doing anything with the queue channels; they are only placeholders.

Related

Spring cloud function send image / bytes

I want to receive and send back an image file using Spring Cloud Function Web.
Receiving works fine, but not sending. Somehow only JSON is delivered.
I am using Spring Boot 3.0.0 with Spring Cloud 2022.0.0.
Here is my Kotlin source code:
// you can call this with POST http://localhost:8080/kotlinByteConsumer sending the file as form-data
@Bean
fun kotlinByteConsumer(): (MultipartFile) -> Message<ByteArray> {
    return {
        // save file to disk
        val receivedFile = File("${it.originalFilename}")
        receivedFile.writeBytes(it.bytes)
        // send file back
        MessageBuilder
            .withPayload(receivedFile.readBytes())
            .setHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_OCTET_STREAM_VALUE)
            .build()
    }
}
The trick was to configure the JSON Mapper to not fail on binary transfer:
@Configuration
class JSONConfig {
    @Bean
    fun getObjectMapper(): ObjectMapper {
        val mapper = ObjectMapper()
        mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
        return mapper
    }
}
With this you can create the function:
@Bean
fun getUserPicture(): () -> Message<InputStreamResource> {
    return {
        val file = File("avatar_dummy.jpg")
        val resource = InputStreamResource(file.inputStream())
        MessageBuilder
            .withPayload(resource)
            .setHeader(
                HttpHeaders.CONTENT_DISPOSITION,
                ContentDisposition.attachment().filename(file.name).build().toString()
            )
            .setHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_OCTET_STREAM_VALUE)
            .build()
    }
}

Two-way streaming using Spring WebFlux

I'm wondering if it's possible to achieve two-way streaming using Spring WebFlux?
Basically, I'm looking to have the client send a Flux of data that the server receives, maps to Strings, and returns as the result, all fluently and without having to collect the data.
I did it using RSocket, but I'm wondering if I can get the same result using HTTP/2 (with Spring and Project Reactor).
I tried doing it like this:
1- Client:
public Mono<Void> stream() {
var input = Flux.range(1, 10).delayElements(Duration.ofMillis(500));
return stockWebClient.post()
.uri("/stream")
.body(BodyInserters.fromPublisher(input, Integer.class))
.accept(MediaType.TEXT_EVENT_STREAM)
.retrieve()
.bodyToFlux(String.class)
.log()
.then();
}
2- Server:
@PostMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> stream(@RequestBody Integer i) {
    return Flux.range(i, i + 10).map(n -> String.valueOf(i)).log();
}
Or:
public Flux<String> stream(@RequestBody Flux<Integer> i) {
    return i.map(n -> String.valueOf(i)).log();
}
Or:
@PostMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> stream(@RequestBody List<Integer> i) {
    return Flux.fromIterable(i).map(n -> String.valueOf(i)).log();
}
None worked correctly.
If you want to use Server-Sent Events you need to return a Flux<ServerSentEvent<String>>.
So your server method should be:
@PostMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<ServerSentEvent<String>> stream(@RequestBody Integer i) {
    return Flux.range(i, i + 10).map(n -> ServerSentEvent.builder(String.valueOf(n)).build());
}
But in this case the body is only an Integer and your client code becomes:
input.flatMap(i ->
stockWebClient
.post()
.uri("/stream")
.bodyValue(i)
.accept(MediaType.TEXT_EVENT_STREAM)
.retrieve()
.bodyToFlux(new ParameterizedTypeReference<ServerSentEvent<String>>() {})
.mapNotNull(ServerSentEvent::data)
.log())
.blockLast();
You can also do the same with a functional endpoint.
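For reference, here is a minimal sketch of what such a functional endpoint could look like, mirroring the annotated controller above (the class and bean names are illustrative, not from the original answer):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.MediaType;
import org.springframework.http.codec.ServerSentEvent;
import org.springframework.web.reactive.function.server.RequestPredicates;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.RouterFunctions;
import org.springframework.web.reactive.function.server.ServerResponse;
import reactor.core.publisher.Flux;

@Configuration
public class StreamRouteConfig {

    @Bean
    public RouterFunction<ServerResponse> streamRoute() {
        return RouterFunctions.route(RequestPredicates.POST("/stream"), request ->
                request.bodyToMono(Integer.class)
                        .flatMap(i -> ServerResponse.ok()
                                .contentType(MediaType.TEXT_EVENT_STREAM)
                                // same Flux.range(i, i + 10) sequence as the annotated controller above
                                .body(Flux.range(i, i + 10)
                                                .map(n -> ServerSentEvent.builder(String.valueOf(n)).build()),
                                        new ParameterizedTypeReference<ServerSentEvent<String>>() {})));
    }
}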
If you want to be able to stream data from the client to the server and back, you won't be able to use SSE, but you can achieve this with WebSockets.
You will need a HandlerMapping and a WebSocketHandler:
public class TestWebSocketHandler implements WebSocketHandler {
    @Override
    public Mono<Void> handle(WebSocketSession session) {
        Flux<WebSocketMessage> output = session.receive()
                .map(WebSocketMessage::getPayloadAsText)
                .map(Integer::parseInt)
                .concatMap(i -> Flux.range(i, i + 10).map(String::valueOf))
                .map(session::textMessage);
        return session.send(output);
    }
}
The configuration with the handler:
@Bean
public TestWebSocketHandler myHandler() {
    return new TestWebSocketHandler();
}

@Bean
public HandlerMapping handlerMapping(final TestWebSocketHandler myHandler) {
    Map<String, WebSocketHandler> map = new HashMap<>();
    map.put("/streamSocket", myHandler);
    int order = -1; // before annotated controllers
    return new SimpleUrlHandlerMapping(map, order);
}
On the client side:
var input2 = Flux.range(1, 10).delayElements(Duration.ofMillis(500));
WebSocketClient client = new ReactorNettyWebSocketClient();
client.execute(URI.create("http://localhost:8080/streamSocket"), session ->
        session.send(input2.map(i -> session.textMessage("" + i)))
                .then(session.receive().map(WebSocketMessage::getPayloadAsText).log().then())
).block();

Streaming upload via @Bean-provided RestTemplateBuilder buffers full file

I'm building a reverse-proxy for uploading large files (multiple gigabytes), and therefore want to use a streaming model that does not buffer entire files. Large buffers would introduce latency and, more importantly, they could result in out-of-memory errors.
My client class contains
@Autowired private RestTemplate restTemplate;

@Bean
public RestTemplate restTemplate(RestTemplateBuilder restTemplateBuilder) {
    int REST_TEMPLATE_MODE = 1; // 1=streams, 2=streams, 3=buffers
    return
        REST_TEMPLATE_MODE == 1 ? new RestTemplate() :
        REST_TEMPLATE_MODE == 2 ? (new RestTemplateBuilder()).build() :
        REST_TEMPLATE_MODE == 3 ? restTemplateBuilder.build() : null;
}
and
public void upload_via_streaming(InputStream inputStream, String originalname) {
    SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
    requestFactory.setBufferRequestBody(false);
    restTemplate.setRequestFactory(requestFactory);

    InputStreamResource inputStreamResource = new InputStreamResource(inputStream) {
        @Override public String getFilename() { return originalname; }
        @Override public long contentLength() { return -1; }
    };

    MultiValueMap<String, Object> body = new LinkedMultiValueMap<String, Object>();
    body.add("myfile", inputStreamResource);
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.MULTIPART_FORM_DATA);
    HttpEntity<MultiValueMap<String, Object>> requestEntity = new HttpEntity<>(body, headers);

    String response = restTemplate.postForObject(UPLOAD_URL, requestEntity, String.class);
    System.out.println("response: " + response);
}
This is working, but notice my REST_TEMPLATE_MODE value controls whether or not it meets my streaming requirement.
Question: Why does REST_TEMPLATE_MODE == 3 result in full-file buffering?
References:
How to forward large files with RestTemplate?
How to send Multipart form data with restTemplate Spring-mvc
Spring - How to stream large multipart file uploads to database without storing on local file system -- establishing the InputStream
How to autowire RestTemplate using annotations
Design notes and usage caveats; also note that RestTemplate does not support streaming downloads.
In short, the instance of RestTemplateBuilder provided as a @Bean by Spring Boot includes an interceptor (filter) associated with actuator/metrics, and the interceptor interface requires buffering the request body into a plain byte[].
If you instantiate your own RestTemplateBuilder or RestTemplate from scratch, it won't include this by default.
I seem to be the only person visiting this post, but just in case it helps someone before I get around to posting a complete solution, I've found a big clue:
restTemplate.getInterceptors().forEach(item -> System.out.println(item));
displays...
org.springframework.boot.actuate.metrics.web.client.MetricsClientHttpRequestInterceptor
If I clear the interceptor list via setInterceptors, it solves the problem. Furthermore, I found that any interceptor, even if it only performs a NOP, will introduce full-file buffering.
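A minimal sketch of that workaround (my own illustration; the class, bean, and method names are made up), clearing the interceptors while keeping the streaming request factory:
import java.util.Collections;

import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

@Configuration
public class StreamingRestTemplateConfig {

    @Bean
    public RestTemplate streamingRestTemplate(RestTemplateBuilder builder) {
        RestTemplate template = builder.build();
        // drop the auto-configured interceptors (e.g. the metrics one); with a non-empty
        // interceptor list an InterceptingClientHttpRequestFactory is used, which buffers the body
        template.setInterceptors(Collections.emptyList());

        SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
        requestFactory.setBufferRequestBody(false); // stream the request body instead of buffering it
        template.setRequestFactory(requestFactory);
        return template;
    }
}
Note the trade-off: clearing the interceptor list also removes the actuator metrics for this RestTemplate.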
public class SimpleClientHttpRequestFactory { ...
I have explicitly set bufferRequestBody = false, but apparently this code is bypassed if interceptors are used. This would have been nice to know earlier...
@Override
public ClientHttpRequest createRequest(URI uri, HttpMethod httpMethod) throws IOException {
    HttpURLConnection connection = openConnection(uri.toURL(), this.proxy);
    prepareConnection(connection, httpMethod.name());
    if (this.bufferRequestBody) {
        return new SimpleBufferingClientHttpRequest(connection, this.outputStreaming);
    }
    else {
        return new SimpleStreamingClientHttpRequest(connection, this.chunkSize, this.outputStreaming);
    }
}
public abstract class InterceptingHttpAccessor extends HttpAccessor { ...
This shows that the InterceptingClientHttpRequestFactory is used if the list of interceptors is not empty.
/**
 * Overridden to expose an {@link InterceptingClientHttpRequestFactory}
 * if necessary.
 * @see #getInterceptors()
 */
@Override
public ClientHttpRequestFactory getRequestFactory() {
    List<ClientHttpRequestInterceptor> interceptors = getInterceptors();
    if (!CollectionUtils.isEmpty(interceptors)) {
        ClientHttpRequestFactory factory = this.interceptingRequestFactory;
        if (factory == null) {
            factory = new InterceptingClientHttpRequestFactory(super.getRequestFactory(), interceptors);
            this.interceptingRequestFactory = factory;
        }
        return factory;
    }
    else {
        return super.getRequestFactory();
    }
}
class InterceptingClientHttpRequest extends AbstractBufferingClientHttpRequest { ...
The interfaces make it clear that using InterceptingClientHttpRequest requires buffering the body into a byte[]. There is no option to use a streaming interface.
@Override
public ClientHttpResponse execute(HttpRequest request, byte[] body) throws IOException {

Combining @SqsListener and @RequestMapping

We're currently in the middle of migrating our current architecture to Spring-AWS-based microservices. One of my tasks is to research how our microservices communicate with one another. I'm aiming to set up a hybrid system of RESTful HTTP endpoints and SQS producers and consumers.
As an example, I have the below code:
#SqsListener("request_queue")
#SendTo("response_queue")
#PostMapping("/send")
public Object send(#RequestBody Request request, #Header("SenderId") String senderId) {
if (senderId != null && !senderId.trim().isEmpty()) {
logger.info("SQS Message Received!");
logger.info("Sender ID: ".concat(senderId));
request = new Gson().fromJson(payload, Request.class);
}
Response response = processRequest(request); // Process request
return response;
}
Theoretically, this method should be able to handle the following:
Receive a Request object via HTTP
Continually poll the request_queue for a message containing the Request object
As an HTTP endpoint, the method returns no error. However, as an SQS listener, it runs into the following exception:
org.springframework.messaging.converter.MessageConversionException:
Cannot convert from [java.lang.String] to [com.oriente.salt.Request] for
GenericMessage [payload={"source":"QueueTester","message":"This is a wonderful
message send by queue from Habanero to Salt. Spicy.","msisdn":"+639772108550"},
headers={LogicalResourceId=salt_queue, ApproximateReceiveCount=1,
SentTimestamp=1523444620218, ....
I've tried annotating the Request param with @Payload, but to no avail. Currently I've also set up the AWS config via Java, as seen below:
ConsumerAWSSQSConfig.java
@Configuration
public class ConsumerAWSSQSConfig {

    @Bean
    public SimpleMessageListenerContainer simpleMessageListenerContainer() {
        SimpleMessageListenerContainer msgListenerContainer = simpleMessageListenerContainerFactory()
                .createSimpleMessageListenerContainer();
        msgListenerContainer.setMessageHandler(queueMessageHandler());
        return msgListenerContainer;
    }

    @Bean
    public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory() {
        SimpleMessageListenerContainerFactory msgListenerContainerFactory = new SimpleMessageListenerContainerFactory();
        msgListenerContainerFactory.setAmazonSqs(amazonSQSClient());
        return msgListenerContainerFactory;
    }

    @Bean
    public QueueMessageHandler queueMessageHandler() {
        QueueMessageHandlerFactory queueMsgHandlerFactory = new QueueMessageHandlerFactory();
        queueMsgHandlerFactory.setAmazonSqs(amazonSQSClient());
        QueueMessageHandler queueMessageHandler = queueMsgHandlerFactory.createQueueMessageHandler();

        List<HandlerMethodArgumentResolver> list = new ArrayList<>();
        HandlerMethodArgumentResolver resolver = new PayloadArgumentResolver(new MappingJackson2MessageConverter());
        list.add(resolver);
        queueMessageHandler.setArgumentResolvers(list);
        return queueMessageHandler;
    }

    @Lazy
    @Bean(name = "amazonSQS", destroyMethod = "shutdown")
    public AmazonSQSAsync amazonSQSClient() {
        AmazonSQSAsync awsSQSAsync = AmazonSQSAsyncClientBuilder.standard().withRegion(Regions.AP_SOUTHEAST_1).build();
        return awsSQSAsync;
    }
}
What do you guys think?

How to log request and response bodies in Spring WebFlux

I want to have centralised logging for requests and responses in my REST API on Spring WebFlux with Kotlin. So far I've tried these approaches:
@Bean
fun apiRouter() = router {
    (accept(MediaType.APPLICATION_JSON) and "/api").nest {
        "/user".nest {
            GET("/", userHandler::listUsers)
            POST("/{userId}", userHandler::updateUser)
        }
    }
}.filter { request, next ->
    logger.info { "Processing request $request with body ${request.bodyToMono<String>()}" }
    next.handle(request).doOnSuccess { logger.info { "Handling with response $it" } }
}
Here the request method and path log successfully, but the body is a Mono, so how should I log it? Should it be the other way around: do I have to subscribe to the request body Mono and log it in the callback?
Another problem is that the ServerResponse interface here doesn't give access to the response body. How can I get it?
Another approach I've tried is using a WebFilter:
@Bean
fun loggingFilter(): WebFilter =
    WebFilter { exchange, chain ->
        val request = exchange.request
        logger.info { "Processing request method=${request.method} path=${request.path.pathWithinApplication()} params=[${request.queryParams}] body=[${request.body}]" }
        val result = chain.filter(exchange)
        logger.info { "Handling with response ${exchange.response}" }
        return@WebFilter result
    }
Same problem here: the request body is a Flux and there is no response body.
Is there a way to access the full request and response for logging from some filter? What am I not understanding?
This is more or less similar to the situation in Spring MVC.
In Spring MVC, you can use an AbstractRequestLoggingFilter and the ContentCachingRequestWrapper and/or ContentCachingResponseWrapper classes (a minimal MVC sketch follows these points). Many tradeoffs here:
if you'd like to access servlet request attributes, you need to actually read and parse the request body
logging the request body means buffering the request body, which can use a significant amount of memory
if you'd like to access the response body, you need to wrap the response and buffer the response body as it's being written, for later retrieval
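For comparison, a minimal Spring MVC sketch of that wrapper-based approach (my own illustration, assuming a Jakarta servlet baseline; the filter name is made up and this is not part of the original answer):
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

import org.springframework.web.filter.OncePerRequestFilter;
import org.springframework.web.util.ContentCachingRequestWrapper;
import org.springframework.web.util.ContentCachingResponseWrapper;

public class BodyLoggingFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        ContentCachingRequestWrapper wrappedRequest = new ContentCachingRequestWrapper(request);
        ContentCachingResponseWrapper wrappedResponse = new ContentCachingResponseWrapper(response);
        try {
            chain.doFilter(wrappedRequest, wrappedResponse);
        } finally {
            // the cached bodies are only available after the chain has read/written them
            String requestBody = new String(wrappedRequest.getContentAsByteArray(), StandardCharsets.UTF_8);
            String responseBody = new String(wrappedResponse.getContentAsByteArray(), StandardCharsets.UTF_8);
            logger.info(">>> request body: " + requestBody + " <<< response body: " + responseBody);
            // copy the cached body back so the client still receives the response
            wrappedResponse.copyBodyToResponse();
        }
    }
}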
ContentCaching*Wrapper classes don't exist in WebFlux but you could create similar ones. But keep in mind other points here:
buffering data in memory somewhat goes against the reactive stack, since we're trying to be very efficient with the available resources there
you should not tamper with the actual flow of data and flush more/less often than expected, otherwise you'd risk breaking streaming use cases
at that level, you only have access to DataBuffer instances, which are (roughly) memory-efficient byte arrays. Those belong to buffer pools and are recycled for other exchanges. If those aren't properly retained/released, memory leaks are created (and buffering data for later consumption certainly fits that scenario)
again at that level, it's only bytes and you don't have access to any codec to parse the HTTP body. I'd forget about buffering the content if it's not human-readable in the first place
Other answers to your question:
yes, the WebFilter is probably the best approach
no, you shouldn't subscribe to the request body otherwise you'd consume data that the handler won't be able to read; you can flatMap on the request and buffer data in doOn operators
wrapping the response should give you access to the response body as it's being written; don't forget about memory leaks, though
I didn't find a good way to log request/response bodies, but if you are just interested in metadata then you can do it as follows.
import org.springframework.http.HttpHeaders
import org.springframework.http.HttpStatus
import org.springframework.http.server.reactive.ServerHttpResponse
import org.springframework.stereotype.Component
import org.springframework.web.server.ServerWebExchange
import org.springframework.web.server.WebFilter
import org.springframework.web.server.WebFilterChain
import reactor.core.publisher.Mono
@Component
class LoggingFilter(val requestLogger: RequestLogger, val requestIdFactory: RequestIdFactory) : WebFilter {
    val logger = logger()

    override fun filter(exchange: ServerWebExchange, chain: WebFilterChain): Mono<Void> {
        logger.info(requestLogger.getRequestMessage(exchange))
        val filter = chain.filter(exchange)
        exchange.response.beforeCommit {
            logger.info(requestLogger.getResponseMessage(exchange))
            Mono.empty()
        }
        return filter
    }
}
@Component
class RequestLogger {

    fun getRequestMessage(exchange: ServerWebExchange): String {
        val request = exchange.request
        val method = request.method
        val path = request.uri.path
        val acceptableMediaTypes = request.headers.accept
        val contentType = request.headers.contentType
        return ">>> $method $path ${HttpHeaders.ACCEPT}: $acceptableMediaTypes ${HttpHeaders.CONTENT_TYPE}: $contentType"
    }

    fun getResponseMessage(exchange: ServerWebExchange): String {
        val request = exchange.request
        val response = exchange.response
        val method = request.method
        val path = request.uri.path
        val statusCode = getStatus(response)
        val contentType = response.headers.contentType
        return "<<< $method $path HTTP${statusCode.value()} ${statusCode.reasonPhrase} ${HttpHeaders.CONTENT_TYPE}: $contentType"
    }

    private fun getStatus(response: ServerHttpResponse): HttpStatus =
        try {
            response.statusCode
        } catch (ex: Exception) {
            HttpStatus.CONTINUE
        }
}
This is what I came up with for Java.
public class RequestResponseLoggingFilter implements WebFilter {

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        ServerHttpRequest httpRequest = exchange.getRequest();
        final String httpUrl = httpRequest.getURI().toString();

        ServerHttpRequestDecorator loggingServerHttpRequestDecorator = new ServerHttpRequestDecorator(exchange.getRequest()) {
            String requestBody = "";

            @Override
            public Flux<DataBuffer> getBody() {
                return super.getBody().doOnNext(dataBuffer -> {
                    try (ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream()) {
                        Channels.newChannel(byteArrayOutputStream).write(dataBuffer.asByteBuffer().asReadOnlyBuffer());
                        requestBody = IOUtils.toString(byteArrayOutputStream.toByteArray(), "UTF-8");
                        commonLogger.info(LogMessage.builder()
                                .step(httpUrl)
                                .message("log incoming http request")
                                .stringPayload(requestBody)
                                .build());
                    } catch (IOException e) {
                        commonLogger.error(LogMessage.builder()
                                .step("log incoming request for " + httpUrl)
                                .message("fail to log incoming http request")
                                .errorType("IO exception")
                                .stringPayload(requestBody)
                                .build(), e);
                    }
                });
            }
        };

        ServerHttpResponseDecorator loggingServerHttpResponseDecorator = new ServerHttpResponseDecorator(exchange.getResponse()) {
            String responseBody = "";

            @Override
            public Mono<Void> writeWith(Publisher<? extends DataBuffer> body) {
                Mono<DataBuffer> buffer = Mono.from(body);
                return super.writeWith(buffer.doOnNext(dataBuffer -> {
                    try (ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream()) {
                        Channels.newChannel(byteArrayOutputStream).write(dataBuffer.asByteBuffer().asReadOnlyBuffer());
                        responseBody = IOUtils.toString(byteArrayOutputStream.toByteArray(), "UTF-8");
                        commonLogger.info(LogMessage.builder()
                                .step("log outgoing response for " + httpUrl)
                                .message("incoming http request")
                                .stringPayload(responseBody)
                                .build());
                    } catch (Exception e) {
                        commonLogger.error(LogMessage.builder()
                                .step("log outgoing response for " + httpUrl)
                                .message("fail to log http response")
                                .errorType("IO exception")
                                .stringPayload(responseBody)
                                .build(), e);
                    }
                }));
            }
        };

        return chain.filter(exchange.mutate().request(loggingServerHttpRequestDecorator).response(loggingServerHttpResponseDecorator).build());
    }
}
You can actually enable DEBUG logging for the Netty and Reactor Netty loggers to see the full picture of what's happening. You can play with the loggers below and see which ones you want and which you don't. That was the best I could do.
reactor.ipc.netty.channel.ChannelOperationsHandler: DEBUG
reactor.ipc.netty.http.server.HttpServer: DEBUG
reactor.ipc.netty.http.client: DEBUG
io.reactivex.netty.protocol.http.client: DEBUG
io.netty.handler: DEBUG
io.netty.handler.proxy.HttpProxyHandler: DEBUG
io.netty.handler.proxy.ProxyHandler: DEBUG
org.springframework.web.reactive.function.client: DEBUG
reactor.ipc.netty.channel: DEBUG
Since Spring Boot 2.2.x, Spring WebFlux supports Kotlin coroutines. With coroutines, you get the advantages of non-blocking calls without having to handle Mono- and Flux-wrapped objects. It adds extensions to ServerRequest and ServerResponse, adding methods like ServerRequest#awaitBody() and ServerResponse.BodyBuilder.bodyValueAndAwait(body: Any). So you could rewrite your code like this:
@Bean
fun apiRouter() = coRouter {
    (accept(MediaType.APPLICATION_JSON) and "/api").nest {
        "/user".nest {
            /* the handler methods now use ServerRequest and ServerResponse directly;
               you just need to add suspend before your function declaration:
               suspend fun listUsers(req: ServerRequest, res: ServerResponse) */
            GET("/", userHandler::listUsers)
            POST("/{userId}", userHandler::updateUser)
        }
    }
    // this filter will be applied to all routes built by this coRouter
    filter { request, next ->
        // using the non-blocking request.awaitBody<T>()
        logger.info("Processing $request with body ${request.awaitBody<String>()}")
        val res = next(request)
        logger.info("Handling with Content-Type ${res.headers().contentType} and status code ${res.rawStatusCode()}")
        res
    }
}
In order to create a WebFilter bean with coroutines, I think you can use this CoroutineWebFilter interface (I haven't tested it, so I don't know if it works).
I am pretty new to Spring WebFlux, and I don't know how to do it in Kotlin, but it should be the same as in Java using a WebFilter:
public class PayloadLoggingWebFilter implements WebFilter {

    public static final ByteArrayOutputStream EMPTY_BYTE_ARRAY_OUTPUT_STREAM = new ByteArrayOutputStream(0);

    private final Logger logger;
    private final boolean encodeBytes;

    public PayloadLoggingWebFilter(Logger logger) {
        this(logger, false);
    }

    public PayloadLoggingWebFilter(Logger logger, boolean encodeBytes) {
        this.logger = logger;
        this.encodeBytes = encodeBytes;
    }

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        if (logger.isInfoEnabled()) {
            return chain.filter(decorate(exchange));
        } else {
            return chain.filter(exchange);
        }
    }

    private ServerWebExchange decorate(ServerWebExchange exchange) {
        final ServerHttpRequest decorated = new ServerHttpRequestDecorator(exchange.getRequest()) {

            @Override
            public Flux<DataBuffer> getBody() {
                if (logger.isDebugEnabled()) {
                    final ByteArrayOutputStream baos = new ByteArrayOutputStream();
                    return super.getBody().map(dataBuffer -> {
                        try {
                            Channels.newChannel(baos).write(dataBuffer.asByteBuffer().asReadOnlyBuffer());
                        } catch (IOException e) {
                            logger.error("Unable to log input request due to an error", e);
                        }
                        return dataBuffer;
                    }).doOnComplete(() -> flushLog(baos));
                } else {
                    return super.getBody().doOnComplete(() -> flushLog(EMPTY_BYTE_ARRAY_OUTPUT_STREAM));
                }
            }

            // logs method, URI, remote address and (at DEBUG level) the captured payload once the body completes
            private void flushLog(ByteArrayOutputStream baos) {
                if (logger.isInfoEnabled()) {
                    StringBuffer data = new StringBuffer();
                    data.append('[').append(getMethodValue())
                            .append("] '").append(String.valueOf(getURI()))
                            .append("' from ")
                            .append(Optional.ofNullable(getRemoteAddress())
                                    .map(addr -> addr.getHostString())
                                    .orElse("null"));
                    if (logger.isDebugEnabled()) {
                        data.append(" with payload [\n");
                        if (encodeBytes) {
                            data.append(new HexBinaryAdapter().marshal(baos.toByteArray()));
                        } else {
                            data.append(baos.toString());
                        }
                        data.append("\n]");
                        logger.debug(data.toString());
                    } else {
                        logger.info(data.toString());
                    }
                }
            }
        };

        return new ServerWebExchangeDecorator(exchange) {
            @Override
            public ServerHttpRequest getRequest() {
                return decorated;
            }
        };
    }
}
Here are some tests on this: github
I think this is what Brian Clozel (@brian-clozel) meant.
Here is the GitHub repo with a complete implementation that logs both the request and response body, along with the HTTP headers, for a WebFlux/Java-based application...
What Brian said. In addition, logging request/response bodies doesn't make sense for reactive streaming. If you imagine the data flowing through a pipe as a stream, you don't have the full content at any time unless you buffer it, which defeats the whole point. For small requests/responses, you can get away with buffering, but then why use the reactive model (other than to impress your coworkers :-) )?
The only reason for logging requests/responses that I could conjure up is debugging, but with the reactive programming model, the debugging approach has to change too. The Project Reactor docs have an excellent section on debugging that you can refer to: http://projectreactor.io/docs/core/snapshot/reference/#debugging
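As a small illustration of that debugging advice (my own sketch, not from the answer), Reactor's global debug hook and per-pipeline checkpoints look like this:
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Hooks;
import reactor.core.publisher.Mono;

class ReactorDebugExample {

    static Mono<String> fetchUser(WebClient webClient) {
        // global assembly-time stack capture; expensive, so enable it for development/debugging only
        Hooks.onOperatorDebug();

        // alternatively, mark individual pipelines so failures report a readable marker
        return webClient.get()
                .uri("/users/42") // illustrative endpoint
                .retrieve()
                .bodyToMono(String.class)
                .checkpoint("fetch user 42");
    }
}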
Assuming we are dealing with a simple JSON or XML response, if the debug level of the corresponding loggers is not sufficient for some reason, one can use the string representation before transforming it into an object:
Mono<Response> mono = WebClient.create()
.post()
.body(Mono.just(request), Request.class)
.retrieve()
.bodyToMono(String.class)
.doOnNext(this::sideEffectWithResponseAsString)
.map(this::transformToResponse);
the following are the side-effect and transformation methods:
private void sideEffectWithResponseAsString(String response) { ... }
private Response transformToResponse(String response) { /*use Jackson or JAXB*/ }
If you're using a controller instead of a handler, the best way is AOP: annotate your controller methods with a @Log annotation. And FYI, this takes a plain JSON object as the request, not a Mono.
@Target(AnnotationTarget.FUNCTION)
@Retention(AnnotationRetention.RUNTIME)
annotation class Log

@Aspect
@Component
class LogAspect {

    companion object {
        val log = KLogging().logger
    }

    @Around("@annotation(Log)")
    @Throws(Throwable::class)
    fun logAround(joinPoint: ProceedingJoinPoint): Any? {
        val start = System.currentTimeMillis()
        val result = joinPoint.proceed()
        return if (result is Mono<*>) result.doOnSuccess(getConsumer(joinPoint, start)) else result
    }

    fun getConsumer(joinPoint: ProceedingJoinPoint, start: Long): Consumer<Any>? {
        return Consumer {
            var response = ""
            if (Objects.nonNull(it)) response = it.toString()
            log.info(
                "Enter: {}.{}() with argument[s] = {}",
                joinPoint.signature.declaringTypeName, joinPoint.signature.name,
                joinPoint.args
            )
            log.info(
                "Exit: {}.{}() had arguments = {}, with result = {}, Execution time = {} ms",
                joinPoint.signature.declaringTypeName, joinPoint.signature.name,
                joinPoint.args[0],
                response, System.currentTimeMillis() - start
            )
        }
    }
}
I think the appropriate thing to do here is to write the contents of each request to a file asynchronously (java.nio), set up an interval that reads those request-body files asynchronously and writes them to the log in a memory-usage-aware manner (at least one file at a time, but up to 100 MB at a time), and, after logging them, remove the files from disk.
Ivan Lymar's answer but in Kotlin:
import org.apache.commons.io.IOUtils
import org.reactivestreams.Publisher
import org.springframework.core.io.buffer.DataBuffer
import org.springframework.http.server.reactive.ServerHttpRequestDecorator
import org.springframework.http.server.reactive.ServerHttpResponseDecorator
import org.springframework.stereotype.Component
import org.springframework.web.server.ServerWebExchange
import org.springframework.web.server.WebFilter
import org.springframework.web.server.WebFilterChain
import reactor.core.publisher.Flux
import reactor.core.publisher.Mono
import java.io.ByteArrayOutputStream
import java.io.IOException
import java.nio.channels.Channels
@Component
class LoggingWebFilter : WebFilter {
override fun filter(exchange: ServerWebExchange, chain: WebFilterChain): Mono<Void> {
val httpRequest = exchange.request
val httpUrl = httpRequest.uri.toString()
val loggingServerHttpRequestDecorator: ServerHttpRequestDecorator =
object : ServerHttpRequestDecorator(exchange.request) {
var requestBody = ""
override fun getBody(): Flux<DataBuffer> {
return super.getBody().doOnNext { dataBuffer: DataBuffer ->
try {
ByteArrayOutputStream().use { byteArrayOutputStream ->
Channels.newChannel(byteArrayOutputStream)
.write(dataBuffer.asByteBuffer().asReadOnlyBuffer())
requestBody =
IOUtils.toString(
byteArrayOutputStream.toByteArray(),
"UTF-8"
)
log.info(
"Logging Request Filter: {} {}",
httpUrl,
requestBody
)
}
} catch (e: IOException) {
log.error(
"Logging Request Filter Error: {} {}",
httpUrl,
requestBody,
e
)
}
}
}
}
val loggingServerHttpResponseDecorator: ServerHttpResponseDecorator =
object : ServerHttpResponseDecorator(exchange.response) {
var responseBody = ""
override fun writeWith(body: Publisher<out DataBuffer>): Mono<Void> {
val buffer: Mono<DataBuffer> = Mono.from(body)
return super.writeWith(
buffer.doOnNext { dataBuffer: DataBuffer ->
try {
ByteArrayOutputStream().use { byteArrayOutputStream ->
Channels.newChannel(byteArrayOutputStream)
.write(
dataBuffer
.asByteBuffer()
.asReadOnlyBuffer()
)
responseBody = IOUtils.toString(
byteArrayOutputStream.toByteArray(),
"UTF-8"
)
log.info(
"Logging Response Filter: {} {}",
httpUrl,
responseBody
)
}
} catch (e: Exception) {
log.error(
"Logging Response Filter Error: {} {}",
httpUrl,
responseBody,
e
)
}
}
)
}
}
return chain.filter(
exchange.mutate().request(loggingServerHttpRequestDecorator)
.response(loggingServerHttpResponseDecorator)
.build()
)
}
}
