Spring WebFlux WebClient response: convert List of String to String

Response:
[
  {
    "version": "1.0",
    "content": [
      "12345",
      "67076",
      "123462",
      "604340",
      "1331999",
      "1332608",
      "1785581"
    ]
  }
]
Code:
Mono<List<String>> mp = webClient.get().uri(accountMgmtURI)
    .retrieve()
    .bodyToMono(Map.class)
    .flatMap(trans -> {
        List<String> content = (List<String>) trans.get("content");
        System.out.println("content :: " + content);
        return Mono.just(content);
    });
System.out.println("content :: " + mp.toString());
String sites = mp.toString();

The first issue is that the API you're using is not returning a single object, but an array of objects, indicated by the square brackets ([]).
This means you should at least refactor your code to use bodyToFlux() instead of bodyToMono():
client
.get()
.retrieve()
// bodyToFlux() instead of bodyToMono()
.bodyToFlux(Map.class)
// ...
The second issue is that it isn't easy to work with a Map in this case, since you would have to cast everything all the time because you can't pass any generics. Working with a proper class would make things easier. For example, you could write the following class:
public class VersionContent {
    private String version;
    private List<String> content;
    // TODO: Getters + Setters
}
And change your code to:
client
.get()
.retrieve()
.bodyToFlux(VersionContent.class)
.map(VersionContent::getContent)
.flatMap(Flux::fromIterable)
// ...
This piece of code retrieves the content of each object and flatMaps it so that each individual value is emitted separately.
Right now, each item within the content array is published individually. This brings us to the third issue, which is that you're not concatenating your strings.
To concatenate the items, you can use the reduce() operator:
client
.get()
.retrieve()
.bodyToFlux(VersionContent.class)
.map(VersionContent::getContent)
.flatMap(Flux::fromIterable)
// reduce() can be used to merge all individual items to a single item
.reduce((sites, site) -> sites + "|" + site)
// ...
The final issue is that you're using toString(), which won't work: toString() on a Mono only describes the publisher, it never gives you its value. One of the key aspects of reactive programming is that everything happens asynchronously. That means that if you try to do anything with your data in the main thread like this, nothing will happen.
Additionally, another feature of publishers like Mono and Flux is that they're lazy. Without a proper subscription, nothing will even happen.
The solution is to properly subscribe() to get your value, for example:
client
.get()
.retrieve()
.bodyToFlux(VersionContent.class)
.map(VersionContent::getContent)
.flatMap(Flux::fromIterable)
.reduce((sites, site) -> sites + "|" + site)
.subscribe(System.out::println);
For your example, the code above would print the following to the console:
12345|67076|123462|604340|1331999|1332608|1785581
Be aware that this also means every operation you want to perform on these sites should be done asynchronously.
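For example, if this code lives inside a WebFlux controller, you can simply return the Mono and let the framework subscribe for you; a minimal sketch (the mapping path and method name are made up for illustration):
// hypothetical endpoint; client and accountMgmtURI are taken from the question
@GetMapping("/sites")
public Mono<String> getSites() {
    return client
            .get()
            .uri(accountMgmtURI)
            .retrieve()
            .bodyToFlux(VersionContent.class)
            .map(VersionContent::getContent)
            .flatMap(Flux::fromIterable)
            .reduce((sites, site) -> sites + "|" + site);
}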
If you don't want to work asynchronously, you can use the block() operator like this:
String sites = client
.get()
.retrieve()
.bodyToFlux(VersionContent.class)
.map(VersionContent::getContent)
.flatMap(Flux::fromIterable)
.reduce((sites, site) -> sites + "|" + site)
.block();

Please use ready-made solutions. You should not use List<String> content = (List<String>) trans.get("content"). Java is a strongly typed language, so create classes for your types; frameworks such as Spring work with classes and objects.
In this case:
public class VersionedDataResponse {
    private List<VersionedData> versionedDataList;
}
....
public class VersionedData {
    private String version;
    private List<String> content;
}
Since the sample response is a top-level JSON array, Spring will convert it for you with bodyToFlux(VersionedData.class); bodyToMono(VersionedDataResponse.class) only works if the API wraps the array in an object.
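If you'd rather receive the whole array as a single typed list instead of a Flux, a ParameterizedTypeReference also works; a minimal sketch, assuming VersionedData has the usual getters and setters for Jackson:
// ParameterizedTypeReference (org.springframework.core) preserves the generic type,
// so the top-level JSON array binds straight to a List<VersionedData>.
Mono<List<VersionedData>> data = webClient.get()
        .uri(accountMgmtURI)
        .retrieve()
        .bodyToMono(new ParameterizedTypeReference<List<VersionedData>>() {});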

Related

Spring Cloud Function - how to pass multiple parameters to a function call in Kotlin

In the documentation for Spring Cloud Function, the examples for Kotlin consist of functions that take a single parameter, e.g.
@Bean
open fun lower(): (String) -> String = { it.lowercase() }
which is called via a URL that has the single parameter on the end, like so:
http://localhost/lower/UpperCaseParam
How can more than one parameter be passed?
Is something like this supported?
@Bean
open fun upper(): (String, String) -> String = { x, y -> x + y }
or, if not multiple parameters, an object?
@Bean
open fun upper(): (Pair<String, String>) -> String = { it.first + it.second }
A Function by definition has only a single input and output. Even if we were to add support for BiFunction, that would only satisfy cases where you have two inputs, etc.
The best way to achieve what you want is to use message headers, which you can pass as HTTP headers.
Then you can make your function signature accept a Message, e.g. Function<Message<YourPOJOType>, ...> uppercase(), and get the payload (your main argument, such as the request param) and the headers from the Message.
You can also use a BiFunction where the second argument is a Map representing the message headers and the first argument is the payload. This way you can deal with your own types and keep your function completely free from anything Spring: BiFunction<YourPOJOType, Map, ...>
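For illustration, here is a minimal Java sketch of the Message-based variant (the header name "secondParam" and the String payload type are just assumptions; the Kotlin version is analogous):
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

@Configuration
public class UpperConfiguration {

    // The "second parameter" travels as a message header, which Spring Cloud Function
    // populates from the HTTP headers of the incoming request.
    @Bean
    public Function<Message<String>, String> upper() {
        return message -> {
            String first = message.getPayload();
            Object second = message.getHeaders().get("secondParam"); // assumed header name
            return first + second;
        };
    }
}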

Access Spring WebClient response body **BEFORE** being parsed

I've got a problem with the encoding of a URL call's response.
Just before Spring's WebClient converts the response body into a String object, as desired, I need to access the raw body so I can parse it with the proper encoding. So, just before:
<execution>.flatMap(servletResponse -> {
    Mono<String> mono = servletResponse.bodyToMono(String.class);
});
I need to access the raw URL call response; I think before "flatMap".
So... I've been looking at "codecs" within the Spring documentation. Even for testing, I have:
myWriter is an instance of EncoderHttpMessageWriter.
myReader is an instance of DecoderHttpMessageReader.
myWriter wraps myEncoder, an instance of Encoder.
myReader wraps myDecoder, an instance of Decoder.
as per the Spring documentation about codecs; and I'm testing with both options for the WebClient Builder:
myWebClientBuilder = WebClient.Builder; // provided by Spring Context,
myWebClientBuilder = WebClient.builder(); // "by hand"
So, the relevant part of code looks like this (tried even with register and registerWithDefaultConfig):
WebClient.builder().codecs(consumer -> {
consumer.customCodecs().register(myWriter.getEncoder());
consumer.customCodecs().register(myWriter);
consumer.customCodecs().register(myReader.getDecoder());
consumer.customCodecs().register(myReader);
})
.build();
shows that the codecs load, and internal basic methods are called:
canDecode
canEncode
canRead
canWrite
getDecodableMimeTypes
getDecoder
getEncodableMimeTypes
getEncoder
but... none of the read/write Mono<T> / Flux<T> methods are ever used. Is there anything left to configure so the codec properly parses the incoming response with the proper encoding?
The response is a String; an old-fashioned String, with all data fields in a single row, that will be parsed later according to rules about positions and lengths of fields; nothing related to JSON or Jackson.
Is there another better way to perform this pre-action?
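For illustration, something along these lines is roughly what I have in mind: read the raw bytes myself and decode them with an explicit charset before the positional parsing (the URI and the charset below are just placeholders):
// Sketch only: take the body as raw bytes, decode with the charset I actually need,
// and hand the resulting String to the positional parser afterwards.
Mono<String> decoded = webClient.get()          // webClient built as above
        .uri(legacyServiceUri)                  // placeholder URI
        .retrieve()
        .bodyToMono(byte[].class)               // raw body, no String conversion by Spring yet
        .map(bytes -> new String(bytes, StandardCharsets.ISO_8859_1)); // example charset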
Many thanks in advance.

Mirror @RequestPart behavior in WebFlux functional router definitions with different content types

Problem
We're developing a Spring Boot service to upload data to different back end databases. The idea is that, in one multipart/form-data request a user will send a "model" (basically a file) and "modelMetadata" (which is JSON that defines an object of the same name in our code).
We got the below to work in the WebFlux annotated controller syntax, when the user sends the "modelMetadata" in the multipart form with the content-type of "application/json":
@PostMapping(consumes = [MediaType.MULTIPART_FORM_DATA_VALUE], produces = [MediaType.APPLICATION_JSON_VALUE])
fun saveModel(@RequestPart("modelMetadata") monoModelMetadata: Mono<ModelMetadata>,
              @RequestPart("model") monoModel: Mono<FilePart>,
              @RequestHeader headers: HttpHeaders): Mono<ResponseEntity<ModelMetadata>> {
    return modelService.saveModel(monoModelMetadata, monoModel, headers)
}
But we can't seem to figure out how to do the same thing in WebFlux's functional router definition. Below are the relevant code snippets we have:
@Bean
fun modelRouter() = router {
    accept(MediaType.MULTIPART_FORM_DATA).nest {
        POST(ROOT, handler::saveModel)
    }
}
fun saveModel(r: ServerRequest): Mono<ServerResponse> {
    val headers = r.headers().asHttpHeaders()
    val monoModelPart = r.multipartData().map { multiValueMap ->
        multiValueMap["model"]         // What do we do with this List<Part!> to get a Mono<FilePart>?
        multiValueMap["modelMetadata"] // What do we do with this List<Part!> to get a Mono<ModelMetadata>?
    }
From everything we've read, we should be able to replicate the same functionality found in the annotation controller syntax with the functional router syntax, but this particular aspect doesn't seem to be well documented. Our goal was to move over to the new functional router syntax since this is a new application we're developing and there are some nice forward-thinking features/benefits as described here.
What we've tried
Googling to the ends of the Earth for a relevant example
this is a similar question, but hasn't gained any traction and doesn't relate to our need to create an object from one piece of the multipart request data
this may be close to what we need for uploading the file component of our multipart request data, but doesn't handle the object creation from JSON
Tried looking at the @RequestPart annotation code to see how things are done on that side, there's a nice comment that seems to hint at how they are converting the parts to objects, but we weren't able to figure out where that code lives or any relevant example of how to use an HttpMessageConverter on the ``
the content of the part is passed through an {@link HttpMessageConverter} taking into consideration the 'Content-Type' header of the request part.
Any and all help would be appreciated! Even just some links for us to better understand Part/FilePart types and their role in multipart requests would be helpful!
I was able to come up with a solution to this issue using an autowired ObjectMapper. From the below solution I could turn the modelMetadata and modelPart into Monos to mirror the @RequestPart return types, but that seems ridiculous.
I was also able to solve this by creating a MappingJackson2HttpMessageConverter and turning the metadataDataBuffer into a MappingJacksonInputMessage, but this solution seemed better for our needs.
fun saveModel(r: ServerRequest): Mono<ServerResponse> {
    val headers = r.headers().asHttpHeaders()
    return r.multipartData().flatMap {
        // We're only expecting one Part of each to come through...assuming we understand what these Parts are
        if (it.getOrDefault("modelMetadata", listOf()).size == 1 && it.getOrDefault("model", listOf()).size == 1) {
            val modelMetadataPart = it["modelMetadata"]!![0]
            val modelPart = it["model"]!![0] as FilePart
            modelMetadataPart
                .content()
                .map { metadataDataBuffer ->
                    // TODO: Only do this if the content is JSON?
                    objectMapper.readValue(metadataDataBuffer.asInputStream(), ModelMetadata::class.java)
                }
                .next() // We're only expecting one object to be serialized from the buffer
                .flatMap { modelMetadata ->
                    // Function was updated to work without needing the Mono's of each type
                    // since we're mapping here
                    modelService.saveModel(modelMetadata, modelPart, headers)
                }
        } else {
            // Send bad request response message
        }
    }
}
Although this solution works, I feel like it's not as elegant as the one alluded to in the @RequestPart annotation comments. Thus I will accept this as the solution for now, but if someone has a better solution please let us know and I will accept it!

Spring Reactive Programming: How to create a dynamic list of Publishers as input to Flux.merge

I'm new to Spring Reactive programming and I'm developing a REST endpoint that returns a Flux. For example:
@PostMapping
public Flux<MyResponse> processRequests(@RequestBody List<MyRequest> requests) {
    return Flux.merge(Arrays.asList(dataSource.processRequest(requests.get(0)), dataSource2.processRequest(requests.get(0)))).parallel()
            .runOn(Schedulers.elastic()).sequential();
}
Each data source (dataSource and dataSource2) in the example code implements an interface that looks like this:
public interface MyResponseAdapter {
    Flux<MyResponse> processRequest(MyRequest request);
}
This code works fine in that it returns the Flux as expected, but as you can see, the code only references the first element in the list of MyRequest. What I need to do is construct the Flux.merge for each element in the list of MyRequest. Can anyone point me in the right direction?
I think I've identified a simple solution:
List<Flux<MyResponse>> results = new ArrayList<>();
for (MyRequest myRequest : requests) {
    results.add(dataSource.processRequest(myRequest));
    results.add(dataSource2.processRequest(myRequest));
}
return Flux.merge(results).parallel().runOn(Schedulers.elastic()).sequential();
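For what it's worth, the same dynamic merge can be built without the intermediate list by streaming over the requests themselves; just an equivalent sketch of the loop above:
// Build the pair of publishers per request and merge them all into one Flux.
return Flux.fromIterable(requests)
        .flatMap(request -> Flux.merge(
                dataSource.processRequest(request),
                dataSource2.processRequest(request)))
        .parallel()
        .runOn(Schedulers.elastic())
        .sequential();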

How to limit number of results in ReactiveMongoRepository

I am looking for a way to pass the limit to the mongo query in ReactiveCrudRepository
I tried adding "First2" to the method name but I'm still getting all the results.
What I'm really looking for is a way to pass the value of the 'limit' to the method, passing it in the request as @RequestParam int limit.
This is my code for the repository
public interface ReactiveUserRepository extends ReactiveCrudRepository<User, String> {
    @Query("{ 'roles': ?0 }")
    Flux<User> findFirst2ByRole(String role);
}
And this is the controller method:
@GetMapping(path = "/byrole", produces = "application/stream+json")
Flux<User> getByRole(@RequestParam String role) {
    return users.findFirst2ByRole(role).doOnNext(next -> {
        System.out.println("Next user=" + next.getAssocId());
    }).switchIfEmpty(Mono.error(new ResponseStatusException(HttpStatus.NOT_FOUND, String.format("No users found with role=%s", role))));
}
limitRequest(long) on a Flux may also be worth a look: https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Flux.html#limitRequest-long-
It can be used as a stricter form of take(long), as it also caps the request amount that goes to the upstream source. This may prevent the reactive MongoDB driver from requesting/loading a huge batch of data when only a limited number of items is needed.
When using limitRequest, make sure that the provided long is > 0 (< 0 makes no sense and 0 results in a never-completing Flux).
Try using the Reactor operator take(): users.findFirst2ByRole(role).take(2)
You can also use skip() if needed.
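To actually pass the limit in from the request, as the question asks, one option is to drop the hard-coded First2 and cap the stream in the controller; a sketch, where findByRole is an assumed derived query returning Flux<User>:
@GetMapping(path = "/byrole", produces = "application/stream+json")
Flux<User> getByRole(@RequestParam String role, @RequestParam int limit) {
    // take(limit) caps the number of emitted users; limitRequest(limit) would also
    // cap the demand signalled to the MongoDB driver upstream.
    return users.findByRole(role) // assumed repository method: Flux<User> findByRole(String role)
            .take(limit)
            .switchIfEmpty(Mono.error(new ResponseStatusException(HttpStatus.NOT_FOUND,
                    String.format("No users found with role=%s", role))));
}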
