How to produce and consume a RabbitMQ message inside a Spring RestController and send it back to the user - spring

Hello Anyone and Everyone. I am working on a Spring Boot application. Here is my problem: I have a Spring RestController with a post mapping that takes in some data. I then need to send that data over RabbitMQ to another application, which in turn performs some calculations on it and sends the result back to me, which I then want to return to the user.
I know that RabbitMQ is for async communication, but I need my controller to return the result that comes back from RabbitMQ all in one go. Right now I am using:
@EnableBinding(Sink::class)
class OptimizedScheduleMessageListener {

    @StreamListener(Sink.INPUT)
    fun handler(incomingMessage: MyDTO) {
        println(incomingMessage)
    }
}
to retrieve the results from RabbitMQ. Now I just need my Controller to return it.
@PostMapping(produces = ["application/json"])
fun retrieveOptimizedSchedule(): Result<MyDTO> {
    val myUncalculatedDTO = MyDTO()
    source.output().send(MessageBuilder.withPayload(myUncalculatedDTO).build())
    return ???
}
Any help with this endeavor is much appreciated.
Thanks in Advance.

Spring Cloud Stream is not designed for request/reply processing.
See the Spring AMQP (Spring for RabbitMQ) project.
The RabbitTemplate has sendAndReceive and convertSendAndReceive methods to implement the RPC model.
On the server side, a @RabbitListener method can be used for request/reply.
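A minimal sketch of that RPC model (in Java for illustration), assuming a shared queue named "calc.requests" and a message converter that can handle MyDTO; the names and the queue are placeholders, not part of the original question:

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Client side: the controller blocks until the reply arrives (or the
// RabbitTemplate reply timeout expires) and returns it to the HTTP caller.
@RestController
public class ScheduleController {

    private final RabbitTemplate rabbitTemplate;

    public ScheduleController(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @PostMapping(path = "/schedules", produces = "application/json")
    public MyDTO retrieveOptimizedSchedule(@RequestBody MyDTO request) {
        return (MyDTO) rabbitTemplate.convertSendAndReceive("calc.requests", request);
    }
}

// Server side, in the other application (separate project): the return value
// of the listener method is sent back to the caller's reply-to address automatically.
@Component
public class CalculationListener {

    @RabbitListener(queues = "calc.requests")
    public MyDTO calculate(MyDTO incoming) {
        // ... perform the calculations here ...
        return incoming;
    }
}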

What you are trying to do is not advised, for a couple of reasons:
1. A failure of the "other application" which consumes the RabbitMQ messages will result in requests being blocked on the controller end.
2. There is a limit on how many requests the server can hold open simultaneously for clients.
What you can do is use a communication protocol other than REST for this specific part. Maybe a WebSocket would be an ideal solution. If not, you need two REST endpoints: one to submit the request and get back a request-id, and another to poll periodically with that request-id to fetch the processed, completed response, as sketched below.
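A minimal sketch of the submit/poll variant (in Java), assuming results are kept in an in-memory map keyed by a generated request id; in practice a shared store would be the more realistic choice, and the endpoint paths are illustrative:

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
public class AsyncScheduleController {

    // In-memory store of finished results; a shared cache/DB would be more realistic.
    private final Map<String, MyDTO> results = new ConcurrentHashMap<>();

    @PostMapping("/schedules")
    public Map<String, String> submit(@RequestBody MyDTO dto) {
        String requestId = UUID.randomUUID().toString();
        // ... publish dto together with requestId to RabbitMQ here ...
        return Map.of("requestId", requestId);
    }

    @GetMapping("/schedules/{requestId}")
    public ResponseEntity<MyDTO> poll(@PathVariable String requestId) {
        MyDTO result = results.get(requestId);
        return result == null
                ? ResponseEntity.status(HttpStatus.ACCEPTED).build()  // still processing
                : ResponseEntity.ok(result);
    }

    // Called by the RabbitMQ listener once the calculated result comes back.
    public void onResult(String requestId, MyDTO result) {
        results.put(requestId, result);
    }
}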

Related

Spring Cloud Stream - Send and receive synchronized

In my Spring Boot REST controller, I'm using StreamBridge to send a message to the RabbitMQ server with something like
streamBridge.send("consumer-in-0", "hello world");
Is there a way to do a send and wait for the response?
Alright, as I said, late to the party, but there seems to be a way: you can simply make the producer synchronous. This can be done in application.properties as such:
spring.cloud.stream.kafka.bindings.functionName-out-0.producer.sync=true
The same configuration originally worked for the MessageChannel approach before Spring Cloud Function, but as I understand it, the underlying functionality is the same.

Create thread and route message from Camel to microservice and back

I'm using Camel with JMS and Spring Boot and would like to build a route for the following scenario:
User 1 (MQTT client) sends a message (topic) to ActiveMQ Artemis.
Camel (by using from) catches that message and prints it out with the help of log.
What I would like to do is create a new thread (asynchronous) for each caught message, send that message from Camel to a microservice (a Python program) that takes the message and inserts some extra strings, and then send the changed message back to Camel and ActiveMQ.
In the end, the changed message will be sent from ActiveMQ to User 2.
Can you give me some directions or route examples of how to do something like that?
So, the key points are to create a new thread for every message and to create a route to and back from that microservice.
The route could look like this:
from("jms:queue:inputQueue")
    .log("${body}")
    .to("http://oldhost")
    .to("jms:queue:outputQueue");
Some notes:
You can call a downstream HTTP endpoint with .to(). The current message body is used as request body. The response of the service overwrites the message body.
I don't know why you want to create a new thread for a synchronous call. You can leverage parallel processing by consuming multiple messages from Artemis in parallel with multiple consumers (see the sketch after these notes). That way, every message is processed in its own thread. If your motivation is resilience, there is also a Circuit Breaker EIP in Camel.
If you use Camel 2.x, use the HTTP4 component ("http4://") to get the newer HTTP client library. In Camel 3.x the old one was dropped and the new one is simply called the HTTP component.
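As a sketch of the parallel-consumption note above, using the concurrentConsumers option of the Camel JMS component; the queue names and the target URL are placeholders, not taken from the question:

import org.apache.camel.builder.RouteBuilder;

public class EnrichRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Five JMS consumers, so up to five messages are processed in parallel,
        // each on its own thread.
        from("jms:queue:inputQueue?concurrentConsumers=5")
            .log("${body}")
            .to("http://python-service:5000/enrich")   // HTTP response replaces the body
            .to("jms:queue:outputQueue");
    }
}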

Does Spring make "calling its own controller" multithread?

I have a Spring Boot application that pulls messages from a cloud message queue and puts them into a cloud DB. I realize that my program is single-threaded (I am not using request mapping, just pull, process, put to DB). I want Spring to handle the concurrency. So can I make a dispatcher function which calls a controller in the application with @RequestMapping?
@RestController
@RequestMapping("/test")
public class GatewayController {

    @RequestMapping("/service")
    public void InvokeService(...) {...}
}
I need multithreading to call another service for the response, and I don't want it to block the others. If I receive 10 messages, I want it to call /test/service... with 10 threads processing them.
My question is:
Will Spring make the controller multithreaded?
How do I call its own controller? Send a request to the URL? (I don't need the response from the controller, just let the controller call a service that puts the response in a DB on the cloud.)
@RequestMapping is an MVC thing - intended for handling HTTP requests. And yes, it uses Tomcat under the hood.
If you inject the RestController into your class, it won't issue any HTTP requests; you'll only be calling the controller as a regular bean. If you consume messages in one thread, it won't become multithreaded - to answer your first question.
You can, of course, create an HTTP request, but frankly it's just wrong, so don't do it. This answers your second question to some extent :)
Now, there is nothing wrong conceptually if your microservice acts as a consumer and producer and deals with queues, not all microservices have to be accessible via HTTP.
In order to work in a multi-threaded environment:
Check whether you can consume messages in a multi-threaded manner. Maybe the client of your "cloud message queue" offers multi-threaded configuration (a thread pool or something).
If that's not possible, create a thread pool executor yourself and, upon each message, submit the processing task to this thread pool (see the sketch below). This will make the processing logic multithreaded, with a parallelism level bounded by the thread pool size and its configuration.
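A minimal sketch of the second option (the thread pool), with a hypothetical QueueClient interface standing in for whatever cloud SDK is in use:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class MessagePump {

    // Hypothetical stand-in for your cloud queue client.
    interface QueueClient {
        String poll();
    }

    private final ExecutorService pool = Executors.newFixedThreadPool(10);

    public void run(QueueClient client) {
        while (true) {
            String message = client.poll();               // one thread pulls messages
            pool.submit(() -> processMessage(message));   // processing runs in the pool
        }
    }

    private void processMessage(String message) {
        // call the downstream service and write the result to the cloud DB
    }
}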

Spring 5 Reactive WebSockets: recommended use

I've been learning a bit about Spring 5 WebFlux, reactive programming, and WebSockets. I've watched Josh Long's Spring Tips: Reactive WebSockets with Spring Framework 5. The code that sends data from server to client through a WebSocket connection uses a Spring Integration IntegrationFlow that publishes to a PublishSubscribeChannel. A custom MessageHandler subscribed to that channel takes the message, converts it to an object, converts that to JSON, and emits it to the FluxSink obtained from the callback supplied to Flux.create(), which is then used to send over the WebSocketConnection.
I was wondering if the use of IntegrationFlow and PublishSubscribeChannel is the recommended way to push events from a background process to the client, or if this is just more convenient in this particular example (monitoring the file system). I'd think if you have control over the background process, you could have it emit to the FluxSink directly?
I'm thinking about use cases similar to the following:
a machine learning process whose progress is monitored
updates to the state of a game world that are sent to players
chat rooms / team collaboration software
...
What I've done in the past that has worked for me is to create a Spring Component that implements WebSocketHandler:
@Component
public class ReactiveWebSocketHandler implements WebSocketHandler {
Then, in the handle method, Spring injects the WebSocketSession object:
@Override
public Mono<Void> handle(WebSocketSession session) {
Then create one or more Flux reactive publishers that emit messages (WebSocketMessage) for the client:
final var output = session.send(Flux.merge(flux1, flux2));
Then you can zip up the incoming and outgoing Flux objects in a Mono and then Spring will take it from there.
return Mono.zip(incomingWebsocketMsgResponse.getWebSocketMsgFlux().then(),
outputWithErrorMsgs)
.then();
Example: https://howtodoinjava.com/spring-webflux/reactive-websockets/
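Pulled together, a minimal self-contained variant of that handler might look like this; the heartbeat flux is just an illustrative stand-in for flux1/flux2 from the snippets above:

import java.time.Duration;
import org.springframework.stereotype.Component;
import org.springframework.web.reactive.socket.WebSocketHandler;
import org.springframework.web.reactive.socket.WebSocketMessage;
import org.springframework.web.reactive.socket.WebSocketSession;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@Component
public class ReactiveWebSocketHandler implements WebSocketHandler {

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        // Incoming messages: just log them here.
        Mono<Void> incoming = session.receive()
                .doOnNext(msg -> System.out.println("received: " + msg.getPayloadAsText()))
                .then();

        // Outgoing messages: a heartbeat as a stand-in for the real event fluxes.
        Flux<WebSocketMessage> outgoing = Flux.interval(Duration.ofSeconds(1))
                .map(tick -> session.textMessage("tick " + tick));

        // Complete when both directions are done, as in the snippet above.
        return Mono.zip(incoming, session.send(outgoing)).then();
    }
}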
Since this question was asked, Spring has introduced RSocket support - you might think of it like the WebSocket STOMP support that exists in Spring MVC, but much more powerful and efficient, supporting backpressure and advanced communication patterns at the protocol level.
For the use cases you're mentioning, I'd advise using RSocket, as you'd get a powerful programming model with @MessageMapping and all the expected support in Spring (codecs for JSON and CBOR, security, etc).
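A minimal sketch of the server side with @MessageMapping over RSocket; the route name and payload are illustrative, and spring-boot-starter-rsocket is assumed to be on the classpath:

import java.time.Duration;
import org.springframework.messaging.handler.annotation.MessageMapping;
import org.springframework.stereotype.Controller;
import reactor.core.publisher.Flux;

@Controller
public class GameUpdatesController {

    // Request-stream interaction: the client requests updates for a game id
    // and receives a stream of state changes pushed by the server.
    @MessageMapping("game.updates")
    public Flux<String> updates(String gameId) {
        // Placeholder source; in a real app this would come from the game engine.
        return Flux.interval(Duration.ofSeconds(1))
                   .map(tick -> "state of " + gameId + " at tick " + tick);
    }
}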

How to send data to multiple servers in spring boot micro service?

I have a requirement something like this:
Once the request is received by my service, I need to send it to 2-3 third-party servers at a time, get the responses from all of the servers, and return the combined response.
How can I achieve that?
My thought: I can create separate threads for the different servers and send the request to all of them in parallel, but the issue is how I will know that the threads are finished, so I can consolidate the responses from all the servers and return them to the caller.
Is there any other way to do this in Spring Boot (microservice)?
You can leverage the async feature supported by the Spring Framework. See the following example from Spring's official guides, which issues multiple calls in async style: Async Method. A sketch of that pattern follows below.
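A minimal sketch following the @Async/CompletableFuture pattern from that guide; the URLs and class names are placeholders, and @EnableAsync is assumed to be present on a configuration class:

import java.util.List;
import java.util.concurrent.CompletableFuture;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class ThirdPartyClient {

    private final RestTemplate restTemplate = new RestTemplate();

    @Async   // runs on the async executor when called from another bean
    public CompletableFuture<String> call(String url, String payload) {
        return CompletableFuture.completedFuture(
                restTemplate.postForObject(url, payload, String.class));
    }
}

// In a separate file, so the @Async proxy is applied when call(...) is invoked.
@Service
public class FanOutService {

    private final ThirdPartyClient client;

    public FanOutService(ThirdPartyClient client) {
        this.client = client;
    }

    public List<String> fanOut(String payload) {
        // The three calls run in parallel on the async executor.
        CompletableFuture<String> a = client.call("http://server-a/api", payload);
        CompletableFuture<String> b = client.call("http://server-b/api", payload);
        CompletableFuture<String> c = client.call("http://server-c/api", payload);

        // allOf completes when every call has finished; then consolidate.
        CompletableFuture.allOf(a, b, c).join();
        return List.of(a.join(), b.join(), c.join());
    }
}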
Another possible solution for internal communication between microservices is to use a message queue.
You didn't specify what type of services you are consuming. If they are HTTP, you may want to use some enterprise integration abstractions (the most popular are Spring Integration and Apache Camel).
If you don't want to introduce a message bus solution into your microservice, you may want to take a look at AsyncRestTemplate.
