I am using Spring Cloud Stream with the RabbitMQ binder. I need to call an external service that does not use Spring Cloud Stream; it relies on the type and correlation_id message properties.
I tried to set headers on the outgoing message, but even though properties are technically headers, they are treated in a special way, so setting a type header does not set the property.
I am aware of interceptors, and if I were using only Spring RabbitMQ this would not be a problem. But since Spring Cloud Stream is a higher level of abstraction, all binder-specific settings are hidden.
Is there any way to set RabbitMQ properties on an outgoing stream message?
Properties are mapped from message headers keyed by AmqpHeaders constants; in this case AmqpHeaders.TYPE (amqp_type) and AmqpHeaders.CORRELATION_ID (amqp_correlationId).
All "unknown" message headers are mapped as rabbit headers.
I'm experiencing a problem: I have a Spring Boot application that consumes from one topic and produces to another.
The topic the application consumes from is on-premises; the topic it produces to is in the AWS cloud.
Is there a way to specify a bootstrap server and schema registry for each topic?
My application.properties has the following properties:
spring.kafka.bootstrap-servers=localhost:32202
spring.kafka.properties.schema.registry.url=127.0.0.1:8082
The problem is that these properties apply to both the consumer and the producer.
I need to specify one bootstrap server for the consumer and another for the producer,
and likewise a separate schema registry for each.
I don't know if this is the best way to deal with the problem:
spring.kafka.consumer.bootstrap-servers=consumer-localhost:32202
spring.kafka.consumer.schema.registry.url=consumer-127.0.0.1:8082
spring.kafka.producer.bootstrap-servers=producer-localhost:10010
spring.kafka.producer.schema.registry.url=producer-127.0.0.1:9090
Thanks in advance!
See the Spring Boot documentation.
The properties supported by auto configuration are shown in the “Integration Properties” section of the Appendix. Note that, for the most part, these properties (hyphenated or camelCase) map directly to the Apache Kafka dotted properties. See the Apache Kafka documentation for details.
The first few of these properties apply to all components (producers, consumers, admins, and streams) but can be specified at the component level if you wish to use different values. Apache Kafka designates properties with an importance of HIGH, MEDIUM, or LOW. Spring Boot auto-configuration supports all HIGH importance properties, some selected MEDIUM and LOW properties, and any properties that do not have a default value.
Only a subset of the properties supported by Kafka are available directly through the KafkaProperties class. If you wish to configure the producer or consumer with additional properties that are not directly supported, use the following properties:
spring.kafka.properties[prop.one]=first
spring.kafka.admin.properties[prop.two]=second
spring.kafka.consumer.properties[prop.three]=third
spring.kafka.producer.properties[prop.four]=fourth
spring.kafka.streams.properties[prop.five]=fifth
This sets the common prop.one Kafka property to first (applies to producers, consumers and admins), the prop.two admin property to second, the prop.three consumer property to third, the prop.four producer property to fourth and the prop.five streams property to fifth.
So...
spring.kafka.consumer.bootstrap-servers=consumer-localhost:32202
spring.kafka.consumer.properties.schema.registry.url=consumer-127.0.0.1:8082
spring.kafka.producer.bootstrap-servers=producer-localhost:10010
spring.kafka.producer.properties.schema.registry.url=producer-127.0.0.1:9090
For managing the headers of messages produced/consumed with the Kafka binder, there is the KafkaHeaderMapper interface; an implementation can be registered as a bean and configured via the spring.cloud.stream.kafka.binder.headerMapperBeanName property.
Is there something similar for the Kafka Streams binder in Spring Cloud Stream? My intention is to control how message headers are serialized/deserialized and which ones are included/excluded on stream input and output. Does anyone know how to achieve this?
There is no equivalent for streams; Spring is only involved in setting up the infrastructure/topology; it is not involved in the runtime record processing.
You can, however, use a custom serializer/deserializer and manipulate the headers there.
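For example, a minimal sketch of a delegating value serializer (the class and header names are illustrative) that drops one header and adds another before the record is written; you would plug it in through a Serde on the output binding:

import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.serialization.Serializer;

public class HeaderManipulatingSerializer implements Serializer<String> {

    @Override
    public byte[] serialize(String topic, String data) {
        return data == null ? null : data.getBytes(StandardCharsets.UTF_8);
    }

    // Kafka calls this overload when record headers are available.
    @Override
    public byte[] serialize(String topic, Headers headers, String data) {
        headers.remove("internal-only");                                        // exclude a header
        headers.add("source", "streams-app".getBytes(StandardCharsets.UTF_8)); // include a header
        return serialize(topic, data);
    }
}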
Is dynamic routing the same as dynamic destination binding in Spring Cloud Stream?
In RabbitMQ, dynamic routing means all producers publish to the same exchange: each producer is configured with a routingKeyExpression, each consumer listener with a bindingRoutingKey, and the exchange routes each message to the queues whose binding keys match.
Can this be accomplished using StreamBridge or BinderAwareChannelResolver? If not, how does Spring handle this if someone wants to move from RabbitMQ to another broker?
Yes, this can be accomplished with StreamBridge, the RoutingFunction, spring.cloud.stream.sendto.destination, etc., depending on your use case, which is not clear from your post; hence I am giving you all the options.
You can find more information here and here for StreamBridge.
The BinderAwareChannelResolver is deprecated in favor of StreamBridge.
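For example, a minimal StreamBridge sketch (the class and destination names are illustrative); the output binding for the destination is created and cached on first use:

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Service;

@Service
public class DynamicSender {

    private final StreamBridge streamBridge;

    public DynamicSender(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void send(String destination, Object payload) {
        // Resolves an output binding for "destination" at runtime and sends to it.
        streamBridge.send(destination, payload);
    }
}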
I have a use case where I want to get the underlying Kafka producer (KafkaTemplate) in a Spring Cloud Stream application. While navigating the code I stumbled upon KafkaProducerMessageHandler, which has a getKafkaTemplate method; however, it fails to auto-wire.
Also, if I auto-wire KafkaTemplate directly, the template is initialized with default properties and ignores the brokers configured under the binder key in the Spring Cloud Stream configuration.
How can I access the underlying KafkaTemplate or a producer/consumer in a Spring Cloud Stream app?
EDIT: Actually, my SCSt app has multiple Kafka binders, and I want to get the KafkaTemplate or Kafka producer corresponding to each binder. Is that possible somehow?
It's not entirely clear why you would need to do that, but you can capture the KafkaTemplates by adding a ProducerMessageHandlerCustomizer @Bean to the application context.
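For example, a minimal sketch (the map and bean names are illustrative) that captures the template for each output binding as the handler is created, which also covers the multi-binder case since the customizer is called once per binding:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.cloud.stream.config.ProducerMessageHandlerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
class KafkaTemplateCapture {

    // Keyed by binding (destination) name, so a multi-binder app can look up
    // the template that belongs to each output.
    private final Map<String, KafkaTemplate<?, ?>> templates = new ConcurrentHashMap<>();

    @Bean
    ProducerMessageHandlerCustomizer<KafkaProducerMessageHandler<?, ?>> customizer() {
        return (handler, destinationName) ->
                templates.put(destinationName, handler.getKafkaTemplate());
    }
}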
Configuring multiple queues with a topic exchange and using a routing key to direct messages to a specific queue with Spring Cloud Stream
My requirement, for example: I have the queues and exchange defined as below on the consumer end:
spring.cloud.stream.bindings.inputA.destination=Common-Exchange
spring.cloud.stream.bindings.inputA.group=A-Queue
spring.cloud.stream.bindings.inputB.destination=Common-Exchange
spring.cloud.stream.bindings.inputB.group=B-Queue
I should be able to specify the routing key on the consumer side, just as in AMQP, where we can pass the exchange, queue, and routing key to create the binding.
I should also be able to set the routing key when sending a message on the producer end using MessageBuilder:
channel.send(MessageBuilder.withPayload(message).build())
Of course, we could use one queue and use headers to direct different types of messages, but I need to know how multiple queues bound to a single exchange work with streams.
See the Rabbit binder documentation.
On the consumer side, set the bindingRoutingKey consumer binding property.
On the producer side, set the routingKeyExpression producer binding property (e.g. headers['routingKey']) and set that header as needed.
Also see Using Existing Queues/Exchanges.
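For example, a minimal sketch assuming the inputA/inputB bindings above and a producer binding named output (the binding name and key values are illustrative):

spring.cloud.stream.rabbit.bindings.inputA.consumer.bindingRoutingKey=key.a
spring.cloud.stream.rabbit.bindings.inputB.consumer.bindingRoutingKey=key.b
spring.cloud.stream.rabbit.bindings.output.producer.routingKeyExpression=headers['routingKey']

Then, on the producer end, set that header when sending, mirroring the MessageBuilder call from the question:

channel.send(MessageBuilder.withPayload(message).setHeader("routingKey", "key.a").build())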