Spring 5 reactive WebSocket and multiple endpoints?

We are doing a little hackathon at work and I wanted to try some new technology to get away from the usual controller.
I started using Spring WebFlux with reactive WebSockets, and everything is working fine so far. I configured my WebSocket handler as follows:
import my.handler.DomWebSocketHandler;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.HandlerMapping;
import org.springframework.web.reactive.config.WebFluxConfigurer;
import org.springframework.web.reactive.handler.SimpleUrlHandlerMapping;
import org.springframework.web.reactive.socket.WebSocketHandler;
import org.springframework.web.reactive.socket.server.support.WebSocketHandlerAdapter;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class AppConfig implements WebFluxConfigurer {

    @Autowired
    private WebSocketHandler domWebSocketHandler;

    @Bean
    public HandlerMapping webSocketMapping() {
        Map<String, WebSocketHandler> map = new HashMap<>();
        map.put("/event-emitter", domWebSocketHandler);

        SimpleUrlHandlerMapping mapping = new SimpleUrlHandlerMapping();
        mapping.setOrder(1);
        mapping.setUrlMap(map);
        return mapping;
    }

    @Bean
    public WebSocketHandlerAdapter handlerAdapter() {
        return new WebSocketHandlerAdapter();
    }
}
After some more research, I learned that it is best practice to work with one connection per client.
Furthermore, using more than one WebSocket per browsing session for the same application seems like overkill, since you can use pub/sub channels. See the answer here.
Is there a way to restrict the connections per client and use only one endpoint for all required client "requests", or would it be better to create additional endpoints (as you would with a normal controller)?
Thank you in advance for the help.
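For illustration, a minimal sketch of what I mean by one endpoint for all client requests, assuming each message carries a type prefix (the handler name and wire format below are made up, not from my actual code):

import org.springframework.web.reactive.socket.WebSocketHandler;
import org.springframework.web.reactive.socket.WebSocketMessage;
import org.springframework.web.reactive.socket.WebSocketSession;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

// Hypothetical handler: every client "request" arrives on the same endpoint
// and is dispatched on a type prefix instead of a separate WebSocket URL.
public class DispatchingWebSocketHandler implements WebSocketHandler {

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        Flux<WebSocketMessage> replies = session.receive()
                .map(WebSocketMessage::getPayloadAsText)
                .map(this::dispatch)
                .map(session::textMessage);
        return session.send(replies);
    }

    // Assumed wire format: "type:payload", e.g. "subscribe:events"
    private String dispatch(String message) {
        String type = message.substring(0, Math.max(message.indexOf(':'), 0));
        switch (type) {
            case "subscribe": return "subscribed";
            case "ping": return "pong";
            default: return "unknown type: " + type;
        }
    }
}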

Related

How to close Spring Boot with routing properly

I'm trying to use a route to stop my Spring Boot application with the code below:

@GetMapping("/close/")
fun terminate() {
    exitProcess(0)
}

but the test server has a different API, so I can't use this code (it shuts down the whole system).
My question is: how do I stop only my Spring Boot application (i.e. replace exitProcess(0))?
You can do it using Actuator's shutdown endpoint, for example.
Find an example here.
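For reference, a minimal sketch of the Actuator route, assuming Spring Boot 2.x property names (the endpoint is disabled by default; this is not part of the linked example):

# application.properties
management.endpoint.shutdown.enabled=true
management.endpoints.web.exposure.include=shutdown

With those properties set, a POST to /actuator/shutdown stops the application.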
Or you can just close your ApplicationContext, and that will work:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.AbstractApplicationContext;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SomeRestController {

    @Autowired
    ApplicationContext applicationContext;

    @GetMapping("/close")
    public void terminate() {
        ((AbstractApplicationContext) applicationContext).close();
    }
}
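A closely related variant, sketched here as an assumption on my part rather than part of the original answer, is SpringApplication.exit, which closes the context and computes an exit code:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.context.ApplicationContext;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ExitController {

    @Autowired
    ApplicationContext applicationContext;

    @GetMapping("/exit")
    public void exit() {
        // Closes the context, runs any ExitCodeGenerator beans, and returns the exit code.
        int exitCode = SpringApplication.exit(applicationContext, () -> 0);
        System.exit(exitCode);
    }
}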

How do I connect my webMethods RESTful service to my Spring Boot application

All I want to know is: if I have a flow service in Software AG webMethods and have converted it into a RESTful service by creating a REST resource and exposing it, how do I make a call to that service from a Spring Boot application?
This is the code for my Spring application. Can someone please suggest how to make a REST call to a webMethods REST resource flow service that has already been exposed?
Any quick help is appreciated.
package com.scb.controller;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

import com.scb.entity.ReadingEntity;

@RestController
public class PreProcessorController {

    private List<ReadingEntity> myEntity = new ArrayList<>();
    private final AtomicLong counter = new AtomicLong();

    @GetMapping("/GetData")
    public void getData(@RequestBody ReadingEntity entity) {
        // myEntity.add(entity);
        final String uri = "http://uklvadapp881.uk.dev.net:5555/invoke/scb.wb.fm.support.flow.ResourceGet/_get?";
        RestTemplate template = new RestTemplate();
        String result = template.getForObject(uri, String.class);
        System.out.println(result);
    }
}
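One caveat worth sketching, purely as an assumption about a typical webMethods Integration Server setup (the helper class and credentials here are hypothetical): the /invoke endpoint usually requires HTTP Basic authentication, which you can attach to the RestTemplate up front. This assumes Spring Boot 2.1+, where the builder method is named basicAuthentication (older versions call it basicAuthorization):

import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.web.client.RestTemplate;

public class WebMethodsClient {

    // Hypothetical helper: builds a RestTemplate that sends Basic auth
    // credentials on every request to the Integration Server.
    public static RestTemplate basicAuthTemplate(String user, String password) {
        return new RestTemplateBuilder()
                .basicAuthentication(user, password)
                .build();
    }
}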

Transfer a file using Apache Camel file component

I am trying out a demo file transfer program using Spring Boot and the Apache Camel file component. I have exposed a REST controller with Spring Boot which calls an Apache Camel route that does the file transfer. I have three files in the directory C:\CamelDemo\inputFolder, namely input1.txt, input2.txt and input3.txt. I want to transfer only the file input2.txt to the output folder. My Spring Boot controller is as below:
package com.example.demo.controller;

import java.util.HashMap;
import java.util.Map;

import org.apache.camel.ProducerTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/camel")
public class FileTransferController {

    @Autowired
    private ProducerTemplate producerTemplate;

    @RequestMapping(value = "/file", method = RequestMethod.GET)
    public String callCamelRoute() {
        String fileName = "input2.txt";
        Map<String, Object> headerMap = new HashMap<>();
        headerMap.put("fileName", fileName);
        producerTemplate.sendBodyAndHeaders("direct:transferFile", null, headerMap);
        return "Route invoked";
    }
}
My Route is as below:
package com.example.demo.route;

import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class FileTransferRoute extends RouteBuilder {

    @SuppressWarnings("deprecation")
    @Override
    public void configure() {
        errorHandler(defaultErrorHandler()
                .maximumRedeliveries(3)
                .redeliverDelay(1000)
                .retryAttemptedLogLevel(LoggingLevel.WARN));

        from("direct:transferFile")
            .log("Route reached")
            .log("file:C:\\CamelDemo\\inputFolder?fileName=${in.headers.fileName}&noop=true")
            .pollEnrich("file://C:/CamelDemo/inputFolder?fileName=${in.headers.fileName}&noop=true")
            .to("file://C:/CamelDemo/outputFolder?autoCreate=false")
            .end();
    }
}
But the first time I invoke this route, the file input1.txt gets transferred even though I have specified the fileName parameter. Please help.
I think the issue is that your file name isn't being set, because you're not telling Camel that you're using a Simple expression, rather than a fixed URI.
Looking at the manual (https://camel.apache.org/manual/latest/pollEnrich-eip.html#_using_dynamic_uris), it implies that you will need
.pollEnrich().simple("file://C:/CamelDemo/inputFolder?fileName=${in.headers.fileName}&noop=true")
to be able to use the dynamic endpoint.
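Applied to the route above, the change would look roughly like this (a sketch based on the linked manual page, not tested against this exact setup):

from("direct:transferFile")
    .log("Route reached")
    // simple() marks the URI as a Simple expression, so ${in.headers.fileName}
    // is evaluated per message instead of being read as a literal file name
    .pollEnrich().simple("file://C:/CamelDemo/inputFolder?fileName=${in.headers.fileName}&noop=true")
    .to("file://C:/CamelDemo/outputFolder?autoCreate=false")
    .end();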

Adding custom header using Spring Kafka

I am planning to use the Spring Kafka client to consume and produce messages from a Kafka setup in a Spring Boot application. I see support for custom headers in Kafka 0.11, as detailed here. While it is available for native Kafka producers and consumers, I don't see support for adding/reading custom headers in Spring Kafka.
I am trying to implement a DLQ for messages based on a retry count that I was hoping to store in the message header, without having to parse the payload.
I was looking for an answer when I stumbled upon this question. However, I'm using the ProducerRecord<?, ?> class instead of Message<?>, so the header mapper does not seem to be relevant.
Here is my approach to adding a custom header:
var record = new ProducerRecord<String, String>(topicName, "Hello World");
record.headers().add("foo", "bar".getBytes());
kafkaTemplate.send(record);
Now to read the headers (before consuming), I've added a custom interceptor.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;

import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.consumer.ConsumerInterceptor;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.header.Header;

@Slf4j
public class MyConsumerInterceptor implements ConsumerInterceptor<Object, Object> {

    @Override
    public ConsumerRecords<Object, Object> onConsume(ConsumerRecords<Object, Object> records) {
        Set<TopicPartition> partitions = records.partitions();
        partitions.forEach(partition -> interceptRecordsFromPartition(records.records(partition)));
        return records;
    }

    private void interceptRecordsFromPartition(List<ConsumerRecord<Object, Object>> records) {
        records.forEach(record -> {
            var myHeaders = new ArrayList<Header>();
            record.headers().headers("MyHeader").forEach(myHeaders::add);
            log.info("My Headers: {}", myHeaders);
            // Do with the header as you see fit
        });
    }

    @Override public void onCommit(Map<TopicPartition, OffsetAndMetadata> offsets) {}
    @Override public void close() {}
    @Override public void configure(Map<String, ?> configs) {}
}
The final bit is to register this interceptor with the Kafka Consumer Container with the following (Spring Boot) configuration:
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class MessagingConfiguration {

    @Bean
    public ConsumerFactory<?, ?> kafkaConsumerFactory(KafkaProperties properties) {
        Map<String, Object> consumerProperties = properties.buildConsumerProperties();
        consumerProperties.put(ConsumerConfig.INTERCEPTOR_CLASSES_CONFIG,
                MyConsumerInterceptor.class.getName());
        return new DefaultKafkaConsumerFactory<>(consumerProperties);
    }
}
Well, Spring Kafka has provided header support since version 2.0: https://docs.spring.io/spring-kafka/docs/2.1.2.RELEASE/reference/html/_reference.html#headers
You can obtain a KafkaHeaderMapper instance and use it to populate headers on the Message before sending it via KafkaTemplate.send(Message<?> message). Or you can use the plain KafkaTemplate.send(ProducerRecord<K, V> record).
When you receive records using KafkaMessageListenerContainer, the KafkaHeaderMapper can be supplied via a MessagingMessageConverter injected into the RecordMessagingMessageListenerAdapter.
So, custom headers can be transferred either way.
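A short sketch of the Message<?>-based route described above (the topic name and header key are placeholders I made up; MessageBuilder is from spring-messaging):

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

public class HeaderSendingExample {

    // Builds a Message whose custom headers are mapped onto the outgoing
    // ProducerRecord by Spring Kafka's header mapper.
    public static void send(KafkaTemplate<String, String> template) {
        Message<String> message = MessageBuilder
                .withPayload("Hello World")
                .setHeader(KafkaHeaders.TOPIC, "my-topic") // placeholder topic
                .setHeader("MyHeader", "some-value")       // custom header
                .build();
        template.send(message);
    }
}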

Validating JMS payload in Spring

I have a simple service sending emails. It can be invoked using REST and JMS APIs. I want the requests to be validated before processing.
When I invoke it using REST I can see that org.springframework.validation.DataBinder invokes void validate(Object target, Errors errors, Object... validationHints) and then validator from Hibernate is invoked. This works as expected.
The problem is I can't achieve the same effect with JMS Listener. The listener is implemented as follows:
import lombok.AllArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;
import our.domain.mailing.Mailing;
import our.domain.mailing.jms.api.SendEmailFromTemplateRequest;
import our.domain.mailing.jms.api.SendSimpleEmailRequest;

import javax.validation.Valid;

@ConditionalOnProperty("jms.configuration.destination")
@Component
@AllArgsConstructor(onConstructor = @__(@Autowired))
@Slf4j
public class SendMailMessageListener {

    Mailing mailing;

    @JmsListener(destination = "${jms.configuration.destination}")
    public void sendEmailUsingTemplate(@Valid SendEmailFromTemplateRequest request) {
        log.debug("Received jms message: {}", request);
        mailing.sendEmailTemplate(
                request.getEmailDetails().getRecipients(),
                request.getEmailDetails().getAccountType(),
                request.getTemplateDetails().getTemplateCode(),
                request.getTemplateDetails().getLanguage(),
                request.getTemplateDetails().getParameters());
    }

    @JmsListener(destination = "${jms.configuration.destination}")
    public void sendEmail(@Valid SendSimpleEmailRequest request) {
        log.debug("Received jms message: {}", request);
        mailing.sendEmail(
                request.getRecipients(),
                request.getSubject(),
                request.getMessage());
    }
}
The methods receive payloads, but they are not validated. It's a Spring Boot application and I have @EnableJms added. Can you point me to the part of the Spring source code responsible for discovering @Valid and handling it? If you have any hints on how to get it working, I would appreciate it a lot.
The solution is simple and is clearly described in the official documentation: 29.6.3 Annotated endpoint method signature. There are a few things you have to do (see the sketch after this list):

1. Provide a configuration class implementing JmsListenerConfigurer (add a @Configuration class implementing this interface).
2. Add the @EnableJms annotation on top of this configuration.
3. Create a DefaultMessageHandlerMethodFactory bean. It can be done in this configuration.
4. Implement the method void configureJmsListeners(JmsListenerEndpointRegistrar registrar) of the JmsListenerConfigurer interface, and set the MessageHandlerMethodFactory using the bean you've just created.
5. Use @Validated instead of @Valid on the payload parameters.
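A minimal sketch of the configuration these steps describe (the class name is arbitrary; the setValidator call anticipates the follow-up answer below):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.jms.annotation.JmsListenerConfigurer;
import org.springframework.jms.config.JmsListenerEndpointRegistrar;
import org.springframework.messaging.handler.annotation.support.DefaultMessageHandlerMethodFactory;
import org.springframework.validation.beanvalidation.LocalValidatorFactoryBean;

@Configuration
@EnableJms
public class JmsValidationConfig implements JmsListenerConfigurer {

    @Bean
    public LocalValidatorFactoryBean validator() {
        return new LocalValidatorFactoryBean();
    }

    // Factory that resolves annotated payload parameters and runs the validator
    @Bean
    public DefaultMessageHandlerMethodFactory messageHandlerMethodFactory() {
        DefaultMessageHandlerMethodFactory factory = new DefaultMessageHandlerMethodFactory();
        factory.setValidator(validator());
        return factory;
    }

    @Override
    public void configureJmsListeners(JmsListenerEndpointRegistrar registrar) {
        registrar.setMessageHandlerMethodFactory(messageHandlerMethodFactory());
    }
}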
You can use @Valid in your listeners. Your answer was very close. In the step where you create the DefaultMessageHandlerMethodFactory, call .setValidator(validator), where validator is from org.springframework.validation. You can configure the validator like this:
@Bean
public LocalValidatorFactoryBean configureValidator() {
    return new LocalValidatorFactoryBean();
}
And then inject the validator instance into your JMS config.
