I am having a problem customizing the input channel name when using the Spring Cloud Stream Azure Event Hub binder.
spring-cloud-azure-eventhubs-stream-binder/1.2.6
When I use the Sink interface provided by Spring Cloud like this:
@EnableBinding(Sink::class)
internal class SomeListener {
    @StreamListener(Sink.INPUT)
    fun listener(
        message: GenericMessage<MyMessage>
    ) {
        logger.trace("Got message")
    }
}
with application properties like this:
spring:
  cloud:
    azure:
      eventhub:
        connection-string: ...
        checkpoint-access-key: ...
        checkpoint-storage-account: ...
    stream:
      eventhub:
        bindings:
          input:
            consumer:
              checkpoint-mode: BATCH
      bindings:
        input:
          destination: test
          group: $Default
          binder: eventhub
I can receive messages without any problems.
But when I want to use a custom channel like this:
internal interface CustomSink {
    @Output(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "custom-input"
    }
}

@EnableBinding(CustomSink::class)
internal class SomeEventSinkListener {
    @StreamListener(CustomSink.INPUT)
    fun listener(
        message: GenericMessage<MyMessage>
    ) {
        logger.trace("Got message")
    }
}
and properties like this:
spring:
  cloud:
    azure:
      eventhub:
        connection-string: <secret>
        checkpoint-access-key: <secret>
        checkpoint-storage-account: <secret>
    stream:
      eventhub:
        bindings:
          custom-input:
            consumer:
              checkpoint-mode: BATCH
      bindings:
        custom-input:
          destination: test
          group: $Default
          binder: eventhub
I do not receive any messages. The only thing that changed is the channel name, and I don't know what is wrong. I can customize the output channel name without a problem, but input channels only work when I use the Sink interface binding. I would like to listen on two different destinations in the future, which is why I need custom input channel names (see the sketch below).
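For comparison, here is a minimal sketch (in Java, with made-up interface and channel names) of a custom binding interface that consumes from two destinations. Note that consumer channels are declared with @Input, and each channel then needs its own spring.cloud.stream.bindings.<channel>.destination entry:
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.messaging.SubscribableChannel;

// Interface and channel names below are placeholders, not part of the original question.
public interface TwoDestinationSink {

    String FIRST_INPUT = "first-input";
    String SECOND_INPUT = "second-input";

    // Consumer channels are declared with @Input
    @Input(FIRST_INPUT)
    SubscribableChannel firstInput();

    @Input(SECOND_INPUT)
    SubscribableChannel secondInput();
}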
I appreciate any help. Thanks.
Related: this page says that you can't have just one output, but I need to have just one output using the Spring Cloud Stream Kafka binder. What should I do?
Some articles say to use org.springframework.cloud.stream.function.StreamBridge, but it does not work for me. I made a StreamBridge to send topics to Kafka, but my Spring Boot consumer application does not receive anything from those topics.
Here are my application.yml files and the code that produces the topic.
// producer Spring Boot application
spring.cloud.stream:
  kafka:
    binder:
      brokers: {AWS.IP}:9092
      zkNodes: {AWS.IP}:2181
  bindings:
    deliveryIncoming:
      destination: deliveryIncomingtopic
@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    // wanna produce deliveryIncomingtopic and send it to the consumer application
    return args -> { };
}
// consumer Spring Boot application
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
@Bean
public Consumer<KStream<String, String>> deliveryIncoming() {
    return input ->
        input.foreach((key, value) -> {
            System.out.println("deliveryIncoming is playing");
            System.out.println("Key: " + key + " Value: " + value);
        });
}
EDIT
Sorry, I think I was a bit unclear. I just want to do the following:
produce(deliveryIncomingtopic) -> Kafka -> consumer(deliveryIncomingtopic)
If that's the case, then you need to change your bean function definition to return a java.util.function.Function instead of a java.util.function.Consumer.
@Bean
public Function<KStream<String, String>, KStream<String, String>> deliveryIncoming() {
    // peek (rather than foreach) logs each record and still returns the stream,
    // so the lambda matches the Function signature
    return input -> input.peek((key, value) -> {
        System.out.println("deliveryIncoming is playing");
        System.out.println("Key: " + key + " Value: " + value);
    });
}
However, AFAIK you still need to define the output channel in your application.yml. You can use the same name with a different suffix, something like below:
deliveryIncoming-in-0:
  destination: <your_topic_name>
deliveryIncoming-out-0:
  destination: <your_topic_name>
Just to make things clear here: are you looking to consume a message from your inbound topic deliveryIncomingtopic and then generate/produce another message to another output topic?
If that's what your question is about, then I believe you are missing something in your application.yml: you need another configuration entry for your output topic.
Since you're using Spring Cloud Function (based on what I see in your application.yml), you should add more configuration for your output topic as follows:
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: {AWS.IP}:9092
          zkNodes: {AWS.IP}:2181
      function:
        definition: deliveryIncoming;deliveryOutput
      bindings:
        deliveryIncoming-in-0:
          destination: deliveryIncomingtopic
        deliveryOutput-out-0:
          destination: deliveryOutput
And also define another bean for your producer function:
@Bean
public Supplier<KStream<String, String>> deliveryOutput() {
    // do your necessary logic here to build the outbound stream
}
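If the outbound side uses the regular message-channel Kafka binder rather than the KStream binder (an assumption; the class name, payload, and key below are made up), a runnable version of such a producer could look like this, with the binder polling the Supplier and publishing each returned message to the topic bound to deliveryOutput-out-0:
import java.util.function.Supplier;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

@Configuration
public class DeliveryOutputConfig {

    // The binder polls this Supplier periodically and sends each returned
    // Message to the destination configured for deliveryOutput-out-0.
    @Bean
    public Supplier<Message<String>> deliveryOutput() {
        return () -> MessageBuilder.withPayload("delivery update")
                .setHeader(KafkaHeaders.MESSAGE_KEY, "some-key".getBytes())
                .build();
    }
}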
Hope this will match your expectation.
I am writing a service with Spring Boot and Kotlin. The service should listen to Kafka topics, and I am using Spring Cloud Stream for that.
What works is having a hardcoded topic to listen to. For example, in application.yaml I define the topic to listen to:
spring:
  cloud:
    stream:
      default:
        contentType: application/*+avro
        group: my-consumer-group
        consumer:
          useNativeDecoding: false
      # Binding-specific configs (Kafka agnostic)
      bindings:
        my_topic:
          # Topic to consume from
          destination: my_topic
and then access it like:
interface MyTopicSink {
    @Input(INPUT)
    fun input(): SubscribableChannel

    companion object {
        const val INPUT = "my_topic" // (from application.yaml)
    }
}
/**
 * Reads from the my_topic Kafka topic.
 */
@Service
@EnableBinding(MyTopicSink::class)
class MyFancyConsumer {

    /**
     * Listens to & consumes from my_topic.
     */
    @StreamListener(MyTopicSink.INPUT)
    fun processTopic(
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) key: String,
        @Payload payload: org.apache.avro.generic.GenericData.Record
    ) {
        log.info("Yaay, this is working")
    }
}
That's great and all, but does anyone know if I can make this dynamic? For example, one of my ideas is to have a list of topics defined somewhere and then have this consumer read from all of them. Has anyone done something similar? Thoughts?
How can I send a message with the new Spring Cloud Stream Kafka functional model?
The deprecated way looked like this:
public interface OutputTopic {
    @Output("output")
    MessageChannel output();
}

@Autowired
OutputTopic outputTopic;

public void someMethod() {
    outputTopic.output().send(MessageBuilder.withPayload("foo").build());
}
But how can I send a message in the functional style?
application.yml
spring:
  cloud:
    function:
      definition: process
    stream:
      bindings:
        process-out-0:
          destination: output
          binder: kafka
@Configuration
public class Configuration {

    @Bean
    Supplier<Message<String>> process() {
        return () -> MessageBuilder.withPayload("foo")
                .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                .build();
    }
}
I would autowire a MessageChannel, but there is no MessageChannel bean for process, process-out-0, output, or anything like that. Or can I send a message with the Supplier bean?
Could someone please give me an example?
Thanks a lot!
You can either use the StreamBridge or the reactor API - see Sending arbitrary data to an output (e.g. Foreign event-driven sources)
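For illustration, here is a minimal StreamBridge sketch, assuming the process-out-0 binding from the application.yml above (the OutputSender class name is made up):
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

@Service
public class OutputSender {

    private final StreamBridge streamBridge;

    public OutputSender(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void someMethod() {
        // "process-out-0" must match the binding name in application.yml
        streamBridge.send("process-out-0",
                MessageBuilder.withPayload("foo")
                        .setHeader(KafkaHeaders.MESSAGE_KEY, "bar".getBytes())
                        .build());
    }
}
With this approach the Supplier bean is not needed; StreamBridge resolves the output binding by name on first use.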
I am using Spring Cloud Stream to consume messages from Kafka.
Is it possible to read the Kafka message key from the code?
I have a Kafka topic that generally has two types of messages, and the action to be taken varies depending on the message key. The Spring documentation only shows the following way to read a message, where I need to specify the actual mapping of the message (the Greetings class here). However, I need a way to read the message key and decide which POJO to deserialize into.
public class GreetingsListener {

    @StreamListener(GreetingsProcessor.INPUT)
    public void handleGreetings(@Payload Greetings request) {
    }
}
You can try something like this:
@StreamListener(GreetingsProcessor.INPUT)
public void handleGreetings(@Payload Greetings request,
                            @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) {
}
You need to provide a proper deserializer for the key. For example, if your key is a String, then you can provide:
spring.cloud.stream.kafka.binder.configuration.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
If there is a need to use a different key deserializer for different input channels, this setting can be set under the consumer section of each Kafka binding. For example:
spring:
  cloud:
    stream:
      kafka:
        bindings:
          <channel_name>:
            consumer:
              startOffset: latest
              autoCommitOffset: true
              autoCommitOnError: true
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
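To act on the key and pick the target POJO, one option is to take the payload as a raw String and deserialize it yourself; a minimal sketch, assuming JSON payloads and Jackson on the classpath (the "greeting" key value and the Farewell class are placeholders):
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.messaging.handler.annotation.Payload;

public class GreetingsListener {

    private final ObjectMapper mapper = new ObjectMapper();

    @StreamListener(GreetingsProcessor.INPUT)
    public void handleGreetings(@Payload String rawPayload,
                                @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) throws Exception {
        // Branch on the record key to decide which POJO to deserialize into
        if ("greeting".equals(key)) {
            Greetings greetings = mapper.readValue(rawPayload, Greetings.class);
            // ... handle the Greetings message
        } else {
            Farewell farewell = mapper.readValue(rawPayload, Farewell.class);
            // ... handle the Farewell message
        }
    }
}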
I have a config server with the properties and a microservice as a consumer.
I've tried to configure maxAttempts to avoid retries on the consumer microservice, but it does not seem to work.
I also define the binding properties on the config server, and they work fine. My consumer is listening and receives messages, but it tries 3 times and then crashes.
This is my application.yml in my config server
server:
  servlet:
    contextPath: /cmsrsssitemap/v1
spring:
  cloud:
    stream:
      bindings:
        sitemap-main-output:
          destination: sitemap-main
          group: cms-microservices-v1
          content-type: application/json
          #consumer.concurrency: 2
        test-job-output:
          destination: test-job
          group: cms-microservices-v1
          content-type: application/json
      rabbit:
        bindings:
          test-job-output:
            consumer:
              maxAttempts: 1
              requeueRejected: false
              autoBindDlq: true
              #dlqTtl: 5000
              #requeueRejected: false
              #dlqDeadLetterExchange: dltexchange1
              #republishToDlq: true
This is the application.yml on the producer side:
server.servlet.contextPath: /cmsjmshandler/v1
spring:
  cloud:
    stream:
      bindings:
        sitemap-main-input:
          destination: sitemap-main
          content-type: application/json
        test-job-input:
          destination: test-job
          group: cms-microservices-v1
          content-type: application/json
And this is the listener. It throws a NullPointerException for testing purposes:
@Component
public class TestJobListener {

    @StreamListener(StreamProcessor.TEST_JOB)
    public void testJobInput(@Payload String input) throws InterruptedException {
        // Thread.sleep(3000);
        System.out.println("########################### " + new Date() + " Mensaje Recibido");
        throw new NullPointerException();
    }
}
StreamProcessor.java
public interface StreamProcessor {

    public static final String TEST_JOB = "test-job";
    public static final String SITEMAP_MAIN = "sitemap-main";

    @Input(StreamProcessor.TEST_JOB)
    SubscribableChannel testJobOutputInput();

    @Input(StreamProcessor.SITEMAP_MAIN)
    SubscribableChannel processSitemapMain();
}
The goal of this is to move failed messages to a DLQ, but that isn't working either.
EDIT 1: I still can't make it work. I made the changes Artem Bilan suggested, but it doesn't work either.
server:
  servlet:
    contextPath: /cmsrsssitemap/v1
spring:
  cloud:
    stream:
      bindings:
        test-job-output:
          destination: test-job
          group: cms-microservices-v1
          content-type: application/json
          consumer:
            maxAttempts: 1
      rabbit:
        bindings:
          test-job-output:
            consumer:
              requeueRejected: false
The maxAttempts is not a rabbit property. It is a core one.
There is a sample in the Docs on the matter: https://docs.spring.io/spring-cloud-stream/docs/Elmhurst.RELEASE/reference/htmlsingle/#spring-cloud-stream-overview-error-handling
spring.cloud.stream.bindings.input.consumer.max-attempts=1
spring.cloud.stream.rabbit.bindings.input.consumer.requeue-rejected=true
The problem was that I put the wrong name in the StreamProcessor:
@StreamListener(StreamProcessor.TEST_JOB)
StreamProcessor.TEST_JOB should be the channel name, not the destination. Updating my question.
Corrected StreamProcessor.java:
public interface StreamProcessor {

    public static final String TEST_JOB = "test-job-output";
    public static final String SITEMAP_MAIN = "sitemap-main";

    @Input(StreamProcessor.TEST_JOB)
    SubscribableChannel testJobOutputInput();

    @Input(StreamProcessor.SITEMAP_MAIN)
    SubscribableChannel processSitemapMain();
}
I just tested it and it works fine for me with this (corrected) config. If you enable the actuator env endpoint on the client, you can see the properties.
(I used input and a local file-based config server.)