Webflux subscribe to nested Publishers and serialize them to JSON - spring-boot

I have a UserDto with related items taken from a repository that is Project Reactor based and thus returns Flux/Mono publishers.
My idea was to add fields/getters to the DTO that are themselves publishers and lazily evaluate (subscribe to) them on demand, but there is a problem:
The controller returns a Flux of DTOs and everything is fine, except that Spring doesn't serialize the inner Publishers.
What I'm trying to achieve in short:
@Repository
class RelatedItemsRepo {
    static Flux<Integer> findAll() {
        // simulates a Flux of user-related data (e.g. Orders or Articles)
        return Flux.just(1, 2, 3);
    }
}

@Component
class UserDto {

    // Trying to get related items as a field
    Flux<Integer> relatedItemsAsField = RelatedItemsRepo.findAll();

    // And as a getter
    @JsonProperty("related_items_as_method")
    Flux<Integer> relatedItemsAsMethod() {
        return RelatedItemsRepo.findAll();
    }

    // There was a suggestion to collect the Flux into a list and return a Mono,
    // but unfortunately that doesn't do the trick either
    @JsonProperty("related_items_collected_to_list")
    Mono<List<Integer>> relatedItemsAsList() {
        return RelatedItemsRepo.findAll().collectList();
    }

    // ... other user data
}

@RestController
@RequestMapping(produces = MediaType.APPLICATION_JSON_VALUE)
public class MyController {

    @GetMapping
    Flux<UserDto> dtoFlux() {
        return Flux.just(new UserDto(), new UserDto(), new UserDto());
    }
}
And this is the response I get:
HTTP/1.1 200 OK
Content-Type: application/json
transfer-encoding: chunked
[
    {
        "related_items_as_method": {
            "prefetch": -1,
            "scanAvailable": true
        },
        "related_items_collected_to_list": {
            "scanAvailable": true
        }
    },
    {
        "related_items_as_method": {
            "prefetch": -1,
            "scanAvailable": true
        },
        "related_items_collected_to_list": {
            "scanAvailable": true
        }
    },
    {
        "related_items_as_method": {
            "prefetch": -1,
            "scanAvailable": true
        },
        "related_items_collected_to_list": {
            "scanAvailable": true
        }
    }
]
It seems like Jackson doesn't know how to serialize a Flux and just serializes its bean properties (prefetch, scanAvailable) instead of the emitted values.
My question is: are there existing Jackson serializers for Reactor publishers, should I implement my own, or am I doing something conceptually wrong?
So in short: how can I get Spring to evaluate those fields (subscribe to them)?

If I understand correctly, what you're trying to achieve is an API that responds with the following:
HTTP 200
[
    {
        "relatedItemsAsField": [1,2,3]
    },
    {
        "relatedItemsAsField": [1,2,3]
    },
    {
        "relatedItemsAsField": [1,2,3]
    }
]
I would collect all the elements emitted by the Flux returned by RelatedItemsRepo#findAll using Flux#collectList, then map the result to populate the UserDto as required.
Here is a gist.
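For illustration, here is a minimal sketch of that approach (my own, not the linked gist), assuming a plain UserDto that holds the already-collected values in a List<Integer> field and reusing the RelatedItemsRepo from the question:

class UserDto {

    public final List<Integer> relatedItems;

    UserDto(List<Integer> relatedItems) {
        this.relatedItems = relatedItems;
    }
}

@RestController
@RequestMapping(produces = MediaType.APPLICATION_JSON_VALUE)
public class UserController {

    @GetMapping
    Flux<UserDto> dtoFlux() {
        // Resolve the nested publisher before serialization: collect the related
        // items into a List, then build a plain DTO that Jackson can serialize.
        return Flux.range(1, 3) // simulates three users, as in the question
                .flatMap(i -> RelatedItemsRepo.findAll()
                        .collectList()
                        .map(UserDto::new));
    }
}

Because the DTO now only contains plain values, Jackson serializes it without any custom serializers.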

Related

Simple Spring GraphQL Subscription returning error

I'm trying to create a simple Spring GraphQL subscription handler. Here's my controller:
@Controller
public class GreetingController {

    @QueryMapping
    String sayHello() {
        return "Hello!";
    }

    @SubscriptionMapping
    Flux<String> greeting(@Argument int count) {
        return Flux.fromStream(Stream.generate(() -> "Hello @ " + Instant.now()))
                .delayElements(Duration.ofSeconds(1))
                .take(count);
    }
}
Here's the GraphQL schema:
type Query {
    sayHello: String
}

type Subscription {
    greeting(count: Int): String
}
Spring configuration:
spring:
  graphql:
    graphiql:
      enabled: true
      path: /graphiql
When I try to run the above subscription using the GraphiQL UI hosted by Spring, I receive the following error:
{
    "errors": [
        {
            "isTrusted": true
        }
    ]
}
When I run the same GraphQL request using Postman, I receive the following response:
{
    "data": {
        "upstreamPublisher": {
            "scanAvailable": true,
            "prefetch": -1
        }
    }
}
What is causing the subscription not to return data from my controller?
As explained in the linked GitHub issue, a subscription requires the ability to stream data over a persistent transport connection - this is not available over plain HTTP.
You'll need to enable WebSocket support in your application first; with Spring Boot this is typically done by setting the spring.graphql.websocket.path property. The GraphiQL UI should then use the WebSocket transport transparently.

Spring rest api same url for bulk and single request

I would like to have this API in a Spring boot app:
POST /items
{
    "name": "item1"
}
POST /items
[
    {
        "name": "item1"
    },
    {
        "name": "item2"
    }
]
So the same endpoint should accept either an array or a single element in the JSON body.
Unfortunately this doesn't work:
#PostMapping(path="items")
public ResponseEntity<String> items(#RequestBody Item item) {}
#PostMapping(path="items")
public ResponseEntity<String> items(#RequestBody List<Item> items) {}
I also tried this:
#PostMapping(path="items")
public ResponseEntity<String> items(#RequestBody #JsonFormat(with= JsonFormat.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY) List<Item> items) {}
it doesn't work.
If I wrap the list like:
public class Items {
    @JsonFormat(with = JsonFormat.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY)
    private List<Item> items;
}
then unfortunately my request body would look like:
{
    "items": [
        {
            "name": "item1"
        },
        {
            "name": "item2"
        }
    ]
}
Do you know how I can build such an API with Spring Boot?
You want a variable that can directly hold either an array or an object. There is no straightforward way to achieve that because of Java's static typing.
The only way I can imagine so far is to create a single endpoint that takes some generic type.
Like an Object:
#PostMapping(path="items")
public ResponseEntity<String> items(#RequestBody Object body) {
if(body instanceof List) {
} else {
}
}
Or a String:
#PostMapping(path="items")
public ResponseEntity<String> items(#RequestBody String body) {
if(body.charAt(0) == '[' && body.charAt(body.length() - 1) == ']') {
} else if(body.charAt(0) == '{' && body.charAt(body.length() - 1) == '}') {
} else {
}
}
You'll have to do a lot of work manually.
Also, you can't create two handler methods with the same path /items and the same method POST; Spring will fail with an ambiguous-mapping error at startup.
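If you do go the generic-type route, one way to reduce the manual work is to bind the body to a Jackson JsonNode and let an ObjectMapper do the conversion. A rough sketch under that assumption (the Item class and the response text are illustrative, not from the original post):

@RestController
public class ItemController {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @PostMapping(path = "items")
    public ResponseEntity<String> items(@RequestBody JsonNode body) {
        List<Item> items;
        if (body.isArray()) {
            // bulk request: convert the whole array
            items = objectMapper.convertValue(body, new TypeReference<List<Item>>() {});
        } else {
            // single request: convert the one object and wrap it in a list
            items = List.of(objectMapper.convertValue(body, Item.class));
        }
        return ResponseEntity.ok("received " + items.size() + " item(s)");
    }
}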

Send message only to certain client using websockets with Rsocket and Spring Webflux

I am trying to use RSocket over WebSocket in one of my POC projects. In my case user login is not required. I would like to send a message only to certain clients when I receive a message from another service. Basically, my flow goes like this:
                            Service A                              Service B
|--------|  websocket  |------------------|  Queue based comm  |---------------|
|  Web   |------------>|  RSocket server  |------------------->|    Another    |
|        |<------------|  using WebSocket |<-------------------|    service    |
|--------|  websocket  |------------------|  Queue based comm  |---------------|
In my case, I am thinking of using a unique id for each connection and each request, merging both identifiers into a correlation id, and sending that with the message to Service B; when I get the response from Service B, I figure out which client it needs to go to and send it. I understand I may not need two services to do this, but I am doing it for a few other reasons. Though I have a rough idea about how to implement the other pieces, I am new to the RSocket concept. Is it possible to send a message only to a certain client, identified by some id, using Spring Boot WebFlux, RSocket, and WebSocket?
Basically, I think you have two options here. The first one is to filter the Flux that comes from Service B; the second one is to use RSocketRequester and a Map, as @NikolaB described.
First option:
data class News(val category: String, val news: String)

data class PrivateNews(val destination: String, val news: News)

class NewsProvider {

    private val duration: Long = 250
    private val externalNewsProcessor = DirectProcessor.create<News>().serialize()
    private val sink = externalNewsProcessor.sink()

    fun allNews(): Flux<News> {
        return Flux
            .merge(
                carNews(), bikeNews(), cosmeticsNews(),
                externalNewsProcessor)
            .delayElements(Duration.ofMillis(duration))
    }

    fun externalNews(): Flux<News> {
        return externalNewsProcessor
    }

    fun addExternalNews(news: News) {
        sink.next(news)
    }

    fun carNews(): Flux<News> {
        return Flux
            .just("new lambo!!", "amazing ferrari!", "great porsche", "very cool audi RS4 Avant", "Tesla i smarter than you")
            .map { News("CAR", it) }
            .delayElements(Duration.ofMillis(duration))
            .log()
    }

    fun bikeNews(): Flux<News> {
        return Flux
            .just("specialized enduro still the biggest dream", "giant anthem fast as hell", "gravel long distance test")
            .map { News("BIKE", it) }
            .delayElements(Duration.ofMillis(duration))
            .log()
    }

    fun cosmeticsNews(): Flux<News> {
        return Flux
            .just("nivea - no one wants to hear about that", "rexona anti-odor test")
            .map { News("COSMETICS", it) }
            .delayElements(Duration.ofMillis(duration))
            .log()
    }
}
@RestController
@RequestMapping("/sse")
@CrossOrigin("*")
class NewsRestController() {

    private val log = LoggerFactory.getLogger(NewsRestController::class.java)
    val newsProvider = NewsProvider()

    @GetMapping(value = ["/news/{category}"], produces = [MediaType.TEXT_EVENT_STREAM_VALUE])
    fun allNewsByCategory(@PathVariable category: String): Flux<News> {
        log.info("hello, getting all news by category: {}!", category)
        return newsProvider
            .allNews()
            .filter { it.category == category }
    }
}
The NewsProvider class is a simulation of your Service B, which should return a Flux<News>. Whenever you call addExternalNews, the item is pushed into the stream returned by allNews. In the NewsRestController class we filter the news by category. Open the browser at localhost:8080/sse/news/CAR to see only car news.
If you want to use RSocket instead, you can use a method like this:
#MessageMapping("news.{category}")
fun allNewsByCategory(#DestinationVariable category: String): Flux<News> {
log.info("RSocket, getting all news by category: {}!", category)
return newsProvider
.allNews()
.filter { it.category == category }
}
Second option:
Let's store the RSocketRequester instances in a HashMap (I use vavr.io), populated via @ConnectMapping.
@Controller
class RSocketConnectionController {

    private val log = LoggerFactory.getLogger(RSocketConnectionController::class.java)
    private var requesterMap: Map<String, RSocketRequester> = HashMap.empty()

    @Synchronized
    private fun getRequesterMap(): Map<String, RSocketRequester> {
        return requesterMap
    }

    @Synchronized
    private fun addRequester(rSocketRequester: RSocketRequester, clientId: String) {
        log.info("adding requester {}", clientId)
        requesterMap = requesterMap.put(clientId, rSocketRequester)
    }

    @Synchronized
    private fun removeRequester(clientId: String) {
        log.info("removing requester {}", clientId)
        requesterMap = requesterMap.remove(clientId)
    }

    @ConnectMapping("client-id")
    fun onConnect(rSocketRequester: RSocketRequester, clientId: String) {
        val clientIdFixed = clientId.replace("\"", "") // check the serializer: it adds quotes around strings
        // rSocketRequester.rsocket().dispose() // to reject the connection
        rSocketRequester
            .rsocket()
            .onClose()
            .subscribe(null, null, {
                log.info("{} just disconnected", clientIdFixed)
                removeRequester(clientIdFixed)
            })
        addRequester(rSocketRequester, clientIdFixed)
    }

    @MessageMapping("private.news")
    fun privateNews(news: PrivateNews, rSocketRequesterParam: RSocketRequester) {
        getRequesterMap()
            .filterKeys { key -> checkDestination(news, key) }
            .values()
            .forEach { requester -> sendMessage(requester, news) }
    }

    private fun sendMessage(requester: RSocketRequester, news: PrivateNews) {
        requester
            .route("news.${news.news.category}")
            .data(news.news)
            .send()
            .subscribe()
    }

    private fun checkDestination(news: PrivateNews, key: String): Boolean {
        val list = destinations(news)
        return list.contains(key)
    }

    private fun destinations(news: PrivateNews): List<String> {
        return news.destination
            .split(",")
            .map { it.trim() }
    }
}
Note that we have to add two things to the rsocket-js client: a payload in the SETUP frame to provide the client-id, and a registered Responder to handle messages sent via the RSocketRequester.
const client = new RSocketClient({
    // send/receive JSON objects instead of strings/buffers
    serializers: {
        data: JsonSerializer,
        metadata: IdentitySerializer
    },
    setup: {
        // for connection mapping on the server
        payload: {
            data: "provide-unique-client-id-here",
            metadata: String.fromCharCode("client-id".length) + "client-id"
        },
        // ms between keepalives sent to the server
        keepAlive: 60000,
        // ms timeout if no keepalive response
        lifetime: 180000,
        // format of `data`
        dataMimeType: "application/json",
        // format of `metadata`
        metadataMimeType: "message/x.rsocket.routing.v0"
    },
    responder: responder,
    transport
});
For more information about that please see this question: How to handle message sent from server to client with RSocket?
I haven't personally used RSocket with the WebSocket transport yet, but as stated in the RSocket specification, the underlying transport protocol shouldn't even matter.
One RSocket component is a server and a client at the same time. So when browsers connect to your RSocket "server", you can inject the RSocketRequester instance, which you can then use to send messages to the "client".
You can add these instances to a local cache (e.g. put them in some globally available ConcurrentHashMap with a key of your choosing - something from which you'll know, or be able to calculate, which clients a message from Service B should be propagated to).
Then, in the code where you receive a message from Service B, just fetch all RSocketRequester instances from the local cache that match your criteria and send them the message.
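For illustration, a minimal Java sketch of that registry idea, mirroring the Kotlin example above; the class name, route, and forwardToClient method are made up for this example:

@Controller
public class ClientRegistryController {

    // globally available cache of connected clients, keyed by client id
    private final ConcurrentHashMap<String, RSocketRequester> clients = new ConcurrentHashMap<>();

    @ConnectMapping("client-id")
    public void onConnect(RSocketRequester requester, String clientId) {
        // removal on disconnect is omitted here; see the onClose() handling in the Kotlin example
        clients.put(clientId, requester);
    }

    // call this when a message for a specific client arrives from Service B
    public void forwardToClient(String clientId, Object message) {
        RSocketRequester requester = clients.get(clientId);
        if (requester != null) {
            requester.route("private.news") // route handled by the browser-side responder
                    .data(message)
                    .send()
                    .subscribe();
        }
    }
}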

Spring ModelAndView to RestController json response

I have legacy Spring code that uses ModelAndView and adds objects to it as below.
ModelAndView result = new ModelAndView();
result.addObject("folders", folders);
return result;
For the above I am getting the response as:
{
    "folders": [
        {
            "recordCount": 0,
            "folderContentType": "Reports",
            "folderId": 34
        },
        {
            "recordCount": 2,
            "folderContentType": "SharedReports",
            "folderId": 88
        }
    ]
}
I have changed this to use Spring's RestController with a POJO backing the results returned from the DB.
#GetMapping("/folders")
public List<Folder> getAllFolders() {
return Service.findAllFolders(1,2);
}
This returns JSON as below:
[
    {
        "folderId": 359056,
        "folderName": "BE Shared Report Inbox",
        "folderDescription": "BE Shared Report Inbox"
    },
    {
        "folderId": 359057,
        "folderName": "BE Shared Spec Inbox"
    }
]
How can I return this exactly like my legacy code's response? I know I can convert the List to a Map and display that, but is there an equivalent way?
Thanks.
You can put your result into a map.
#GetMapping("/folders")
public List<Folder> getAllFolders() {
return Service.findAllFolders(1,2);
}
Change to:
#GetMapping("/folders")
public Map<String,List<Folder>> getAllFolders() {
Map<String,List<Folder>> map = new HashMap<>();
map.put("folders",Service.findAllFolders(1,2));
return map;
}
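If you'd rather not expose a raw Map, an equivalent option is a small wrapper class; a sketch (the FoldersResponse name is just for illustration):

class FoldersResponse {

    private final List<Folder> folders;

    FoldersResponse(List<Folder> folders) {
        this.folders = folders;
    }

    // Jackson serializes this getter as the "folders" key
    public List<Folder> getFolders() {
        return folders;
    }
}

@GetMapping("/folders")
public FoldersResponse getAllFolders() {
    return new FoldersResponse(Service.findAllFolders(1, 2));
}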

How to properly format data with AppSync and DynamoDB when Lambda is in between

Receiving data with AppSync directly from DynamoDB seems to work in my case, but when I try to put a Lambda function in between, I receive an error that says "Can't resolve value (/issueNewMasterCard/masterCards) : type mismatch error, expected type LIST".
Looking at the AppSync CloudWatch response-mapping output, I get this:
"context": {
"arguments": {
"userId": "18e946df-d3de-49a8-98b3-8b6d74dfd652"
},
"result": {
"Item": {
"masterCards": {
"L": [
{
"M": {
"cardId": {
"S": "95d67f80-b486-11e8-ba85-c3623f6847af"
},
"cardImage": {
"S": "https://s3.eu-central-1.amazonaws.com/logo.png"
},
"cardWallet": {
"S": "0xFDB17d12057b6Fe8c8c434653456435634565"
},...............
Here is how I configured my response mapping template:
$utils.toJson($context.result.Item)
I'm doing this mutation:
mutation IssueNewMasterCard {
    issueNewMasterCard(userId: "18e946df-d3de-49a8-98b3-8b6d74dfd652") {
        masterCards {
            cardId
        }
    }
}
And this is my schema:
type User {
    userId: ID!
    masterCards: [MasterCard]
}

type MasterCard {
    cardId: String
}

type Mutation {
    issueNewMasterCard(userId: ID!): User
}
The Lambda function:
const AWS = require('aws-sdk');
const dynamoDB = new AWS.DynamoDB(); // assumed setup; the original snippet omits the client initialization

exports.handler = (event, context, callback) => {
    const userId = event.arguments.userId;
    const userParam = {
        Key: {
            "userId": { S: userId }
        },
        TableName: "FidelityCardsUsers"
    };
    dynamoDB.getItem(userParam, function(err, data) {
        if (err) {
            console.log('error from DynamoDB: ', err);
            callback(err);
        } else {
            console.log('mastercards: ', JSON.stringify(data));
            callback(null, data);
        }
    });
};
I think the problem is that the getItem you get with the DynamoDB data source is not the same as the DynamoDB.getItem function in the aws-sdk.
Specifically, it seems the data-source version returns an already unmarshalled response (that is, instead of something: { L: [ list of things ] } it just returns something: [ list of things ]).
This is important, because it means that $utils.toJson($context.result.Item) in your current setup is returning { masterCards: { L: [ ..., which is why you are seeing the type error: masterCards in this case is an object with a key L rather than an array/list.
To solve this in the resolver, you can use the $util.dynamodb.toDynamoDBJson(Object) macro (https://docs.aws.amazon.com/appsync/latest/devguide/resolver-util-reference.html#dynamodb-helpers-in-util-dynamodb), i.e. your resolver should be:
$util.dynamodb.toDynamoDBJson($context.result.Item)
Alternatively, you might want to look at the AWS.DynamoDB.DocumentClient class (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html). This includes versions of getItem, etc. that automatically marshal and unmarshall the proprietary DynamoDB typing to and from native JSON. (Frankly, I find this much nicer to work with and use it all the time.)
In that case you can keep your old resolver, because you'll be returning an object where masterCards is just a JSON array.
