I have a Spring Boot application that needs to send messages of ~1 MB to WebSocket clients. I have implemented WebSocketMessageBrokerConfigurer for the WebSocket configuration.
The data needs to be sent in chunks of 8 KB, since I do not want to increase WebSocketTransportRegistration.sendBufferSizeLimit as it would affect server-side memory.
I have set up the clients to obtain data in chunks using a custom WebSocketHandler that can handle partial messages.
Is there any Spring configuration I can set so that the data is automatically split into chunks and sent to clients? If not, how can I add a custom handler to split the data into chunks and put it on the output channel? I do not see how I can add a custom handler in an implementation of WebSocketMessageBrokerConfigurer.
Do I need to add a custom WebSocketHandlerDecoratorFactory? If yes, are there any examples that I can look at?
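A decorator factory along these lines could work; this is only a sketch of the WebSocketHandlerDecoratorFactory idea from the question, not a confirmed Spring feature. The class name, the 8 KB constant, and the use of character counts as a stand-in for byte counts are all illustrative; the decorated session re-sends oversized text messages as partial frames (isLast = false until the final chunk), which matches the custom client-side handler that accepts partial messages.

```java
import java.io.IOException;

import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.WebSocketTransportRegistration;
import org.springframework.web.socket.TextMessage;
import org.springframework.web.socket.WebSocketMessage;
import org.springframework.web.socket.WebSocketSession;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;
import org.springframework.web.socket.handler.WebSocketHandlerDecorator;
import org.springframework.web.socket.handler.WebSocketSessionDecorator;

@Configuration
@EnableWebSocketMessageBroker
public class ChunkingWebSocketConfig implements WebSocketMessageBrokerConfigurer {

    private static final int CHUNK_SIZE = 8 * 1024; // illustrative 8 KB frame size

    @Override
    public void configureWebSocketTransport(WebSocketTransportRegistration registration) {
        // Wrap each handler so that its session splits large outbound text
        // messages into partial frames instead of one big frame.
        registration.addDecoratorFactory(handler -> new WebSocketHandlerDecorator(handler) {
            @Override
            public void afterConnectionEstablished(WebSocketSession session) throws Exception {
                super.afterConnectionEstablished(new WebSocketSessionDecorator(session) {
                    @Override
                    public void sendMessage(WebSocketMessage<?> message) throws IOException {
                        if (message instanceof TextMessage
                                && ((TextMessage) message).getPayloadLength() > CHUNK_SIZE) {
                            String payload = ((TextMessage) message).getPayload();
                            for (int i = 0; i < payload.length(); i += CHUNK_SIZE) {
                                int end = Math.min(i + CHUNK_SIZE, payload.length());
                                // isLast is true only for the final chunk
                                super.sendMessage(new TextMessage(
                                        payload.substring(i, end), end == payload.length()));
                            }
                        } else {
                            super.sendMessage(message);
                        }
                    }
                });
            }
        });
    }
}
```

This keeps each frame at roughly the chunk size without raising sendBufferSizeLimit; binary messages would need the analogous treatment with BinaryMessage.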
The messages created by the producer are all being consumed as expected.
The thing is, I need to create an endpoint to retrieve the latest messages from the consumer.
Is there a way to do it?
Like an on-demand consumer?
I found this SO post, but it only covers consuming the last N records. I want to consume the latest records without caring about the offsets.
Spring Kafka Consumer, rewind consumer offset to go back 'n' records
I'm working with Kotlin but if you have the answer in Java I don't mind either.
There are several ways to create listener containers dynamically; you can then start/stop them on demand. To get the records back into the controller, you'd need to use something like a blocking queue, or make the controller itself a MessageListener.
These answers show a couple of techniques for creating containers on demand:
How to dynamically create multiple consumers in Spring Kafka
Kafka Consumer in spring can I re-assign partitions programmatically?
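Combining both suggestions above, a container can be created on demand inside the controller with a blocking queue collecting the records; a minimal sketch, assuming a ConsumerFactory bean configured with auto.offset.reset=latest, a topic named "my-topic", and a 5-second collection window (all illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class LatestRecordsController {

    private final ConsumerFactory<String, String> consumerFactory;

    public LatestRecordsController(ConsumerFactory<String, String> consumerFactory) {
        this.consumerFactory = consumerFactory;
    }

    @GetMapping("/latest")
    public List<String> latest() throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        ContainerProperties props = new ContainerProperties("my-topic");
        // A fresh group id plus auto.offset.reset=latest means this consumer
        // only sees records that arrive from now on - no offset bookkeeping.
        props.setGroupId(UUID.randomUUID().toString());
        props.setMessageListener(
                (MessageListener<String, String>) record -> queue.add(record.value()));
        KafkaMessageListenerContainer<String, String> container =
                new KafkaMessageListenerContainer<>(consumerFactory, props);
        container.start();
        try {
            List<String> results = new ArrayList<>();
            String value;
            // Drain until the topic has been quiet for 5 seconds.
            while ((value = queue.poll(5, TimeUnit.SECONDS)) != null) {
                results.add(value);
            }
            return results;
        } finally {
            container.stop(); // the on-demand consumer goes away after the request
        }
    }
}
```

Creating and stopping a container per request is expensive; if the endpoint is hit frequently, a longer-lived container feeding a bounded queue would be a better fit.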
Sorry this might sound naive to JMS gurus, but still.
I have a requirement where a Spring-based application cannot connect synchronously to an SAP back-end (via its web-service interface) because the response from SAP is far too slow. We are considering a solution where the updates from the GUI would be saved by the Spring middleware in a local database while simultaneously sending a message to a JMS queue. Then, every few hours (or perhaps nightly), a batch job would consume the messages from the JMS queue, query the local database based on the message contents, and send the result to the SAP web service.
Is this approach correct? Would I need a batch job to trigger the JMS message consumption (because I don't want to consume the messages immediately, but in a deferred manner at a pre-decided time)? Is there any way to implement this gracefully in Spring (like Camel)? I appreciate your help.
Spring Batch has a JmsItemReader that can be used in a batch program; an empty queue signals the end of the batch. Spring Cloud Task is built on top of batch and can be used for cloud deployments.
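A minimal sketch of wiring JmsItemReader into a step, assuming a JmsTemplate bean, a StepBuilderFactory, and a hypothetical sapWriter that calls the SAP web service; the queue name, timeout, and chunk size are illustrative:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.jms.JmsItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.core.JmsTemplate;

public class DeferredJmsBatchConfig {

    @Bean
    public JmsItemReader<Object> jmsItemReader(JmsTemplate jmsTemplate) {
        // Without a timeout, JmsTemplate blocks indefinitely on an empty queue;
        // a short timeout makes the reader return null, which ends the step.
        jmsTemplate.setReceiveTimeout(2000);
        jmsTemplate.setDefaultDestinationName("pending-updates"); // illustrative queue
        JmsItemReader<Object> reader = new JmsItemReader<>();
        reader.setJmsTemplate(jmsTemplate);
        return reader;
    }

    @Bean
    public Step drainQueueStep(StepBuilderFactory steps,
                               JmsItemReader<Object> reader,
                               ItemWriter<Object> sapWriter) { // hypothetical SAP writer
        return steps.get("drainQueueStep")
                .<Object, Object>chunk(10)
                .reader(reader)
                .writer(sapWriter)
                .build();
    }
}
```

The job containing this step can then be launched on the pre-decided schedule (for example from a @Scheduled method or an external scheduler), which gives you the deferred consumption without any custom trigger logic.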
Is there a way to split large message payloads and send them through Spring JmsTemplate?
IBM MQ supports splitting large payloads via the properties JMSX_GROUPID and JMS_IBM_LAST_MSG_IN_GROUP, yet there seems to be no way (I could not find any resources) to send large files over JMS.
My queue's maximum message size is set at 4 MB, and I have to send messages of over 100 MB as a BytesMessage.
I managed to get it working by using the following property names instead:
JMSXGroupID
JMSXGroupSeq
JMS_IBM_Last_Msg_In_Group
With these, the message headers are set up properly.
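The chunking and grouping logic can be kept broker-independent; a sketch in plain Java (the helper class and the 1-based sequence convention are illustrative), where each chunk would then be sent as a BytesMessage via jmsTemplate.send(...), copying the property map entries onto the message with setObjectProperty:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Pure-Java sketch: split a payload larger than the queue limit into chunks
// and compute the JMS group properties each chunk needs. Applying the map via
// JmsTemplate (session.createBytesMessage() plus setObjectProperty per entry)
// is omitted because it needs a live broker.
public class MessageGrouper {

    public static List<byte[]> split(byte[] payload, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int offset = 0; offset < payload.length; offset += chunkSize) {
            int end = Math.min(offset + chunkSize, payload.length);
            byte[] chunk = new byte[end - offset];
            System.arraycopy(payload, offset, chunk, 0, chunk.length);
            chunks.add(chunk);
        }
        return chunks;
    }

    // Properties for the seq-th chunk (1-based) out of chunkCount in group groupId,
    // using the property names that worked above.
    public static Map<String, Object> groupProperties(String groupId, int seq, int chunkCount) {
        Map<String, Object> props = new LinkedHashMap<>();
        props.put("JMSXGroupID", groupId);
        props.put("JMSXGroupSeq", seq);
        if (seq == chunkCount) {
            props.put("JMS_IBM_Last_Msg_In_Group", true); // marks the final chunk
        }
        return props;
    }
}
```

The consumer can then reassemble the payload by collecting messages with the same JMSXGroupID in JMSXGroupSeq order until it sees the last-in-group flag.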
I am using Spring Cloud Stream with RabbitMQ binder. I need to call external service which does not use Spring Cloud Stream. This service is using type and correlation_id message properties.
I tried to set headers on the outgoing message, but even though properties are technically headers, they are treated in a special way, so setting a type header does not set the property.
I am aware of interceptors, and if I were using plain Spring RabbitMQ this would not be a problem. But since Spring Cloud Stream is a higher level of abstraction, all binder-specific settings are hidden.
Is there any possibility to set up RabbitMQ properties in outgoing stream message?
Properties are mapped from message headers keyed by AmqpHeaders constants; in this case AmqpHeaders.TYPE (amqp_type) and AmqpHeaders.CORRELATION_ID (amqp_correlationId).
All "unknown" message headers are mapped as rabbit headers.
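In code, that mapping looks like the following sketch; the payload and header values are illustrative, and the resulting Message would be sent through whatever output binding the application already uses:

```java
import org.springframework.amqp.support.AmqpHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Build an outgoing message whose AmqpHeaders-keyed entries the RabbitMQ binder
// maps onto the native AMQP properties, rather than generic headers.
public class OutgoingMessages {

    public static Message<String> withRabbitProperties(String payload,
                                                       String type,
                                                       String correlationId) {
        return MessageBuilder.withPayload(payload)
                // header "amqp_type" -> AMQP property "type"
                .setHeader(AmqpHeaders.TYPE, type)
                // header "amqp_correlationId" -> AMQP property "correlation_id"
                .setHeader(AmqpHeaders.CORRELATION_ID, correlationId)
                .build();
    }
}
```

Setting a plain "type" header would land in the rabbit headers table instead, which is exactly the behavior described in the question.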
Our project is to integrate two applications using the REST API of each, with JMS in between to make the exchange asynchronous. Application-1 writes the message to the queue. The next step is to read the message from the queue, process it, and send it to Application-2.
I have two questions:
Should we use one more queue for storing messages after processing and before sending them to Application-2?
Should we use Spring Batch or Spring Integration to read and process the data?
Either you aren't showing the whole premise, or you are really over-engineering your app. If you just need to read messages from the queue, plain Spring JMS is enough... On the other hand, with Spring Integration and the power of its adapters, you can simply pass messages from an <int-jms:message-driven-channel-adapter> to an <int-http:outbound-channel-adapter>.
I don't see a reason to store the message anywhere else during the read-and-send process, because on an exception the message is simply rolled back to the JMS queue.
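The same two-adapter flow can be written with the Spring Integration Java DSL instead of XML; a sketch assuming a ConnectionFactory bean, a queue named "app1-out", and Application-2's endpoint URL (all illustrative):

```java
import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.http.HttpMethod;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.http.dsl.Http;
import org.springframework.integration.jms.dsl.Jms;

public class JmsToHttpConfig {

    @Bean
    public IntegrationFlow jmsToHttpFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                // equivalent of <int-jms:message-driven-channel-adapter>
                .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("app1-out"))
                // equivalent of <int-http:outbound-channel-adapter>
                .handle(Http.outboundChannelAdapter("http://app2.example.com/api/messages")
                        .httpMethod(HttpMethod.POST))
                .get();
    }
}
```

If the HTTP call throws, the message-driven adapter's transaction rolls the message back onto the queue, which is the redelivery behavior described above.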