StateStore is never added on Spring Cloud

Can anyone help me add a state store on Spring Cloud Stream?
I always receive this error: "nested exception is org.springframework.kafka.KafkaException: Could not start stream: ; nested exception is org.apache.kafka.streams.errors.TopologyException: Invalid topology: StateStore myStore is not added yet."
Here is the bean definition; however, it never works:
@Bean
public StoreBuilder storeBuilder() {
    KeyValueBytesStoreSupplier storeSupplier = Stores.persistentKeyValueStore("mystore");
    StoreBuilder<KeyValueStore<String, MyData>> storeBuilder = Stores.keyValueStoreBuilder(storeSupplier, Serdes.String(), new StreamsSerde.MyDataSerde());
    return storeBuilder;
}
Here is the Serde:
public static final class MyDataSerde extends Serdes.WrapperSerde<MyData> {
    public MyDataSerde() {
        super(new JsonSerializer<>(), new JsonDeserializer<>(MyData.class));
    }
}
Here is the data class:
public class MyData {
    private String name;
    private String course;
}
Here are the Spring Cloud dependencies:
springBootVersion = "2.2.5.RELEASE"
set('springCloudVersion', "Hoxton.SR3")
implementation group:"org.springframework.cloud", name: "spring-cloud-stream"
implementation group: "org.springframework.cloud", name: "spring-cloud-stream-binder-kafka-streams"
implementation group: "org.springframework.cloud", name: "spring-cloud-starter-stream-kafka"

You need to add state stores like this when you are using the lower-level processor or transformer API. Did you try to add the state store to your process or transform method call? Here is a test that works; take a look at the process call and the way the state store names are passed along.
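For illustration, here is a minimal sketch of that wiring using the transformer API (the binding names and the Transformer body are my assumptions, not the asker's code). Note that the store name passed to transform() must match the name given to the StoreBuilder exactly:
@StreamListener("input")
@SendTo("output")
public KStream<String, MyData> handle(KStream<String, MyData> input) {
    // The varargs after the supplier connect the already-registered store to this node.
    return input.transform(() -> new Transformer<String, MyData, KeyValue<String, MyData>>() {

        private KeyValueStore<String, MyData> store;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            this.store = (KeyValueStore<String, MyData>) context.getStateStore("mystore");
        }

        @Override
        public KeyValue<String, MyData> transform(String key, MyData value) {
            store.put(key, value); // e.g. remember the latest value per key
            return KeyValue.pair(key, value);
        }

        @Override
        public void close() {
        }
    }, "mystore");
}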

I found a solution that adds the store programmatically in the article linked below:
public void initializeStateStores() throws Exception {
    StreamsBuilderFactoryBean streamsBuilderFactoryBean =
            applicationContext.getBean("&stream-builder-requestListener", StreamsBuilderFactoryBean.class);
    StreamsBuilder streamsBuilder = streamsBuilderFactoryBean.getObject();
    StoreBuilder<KeyValueStore<String, Long>> keyValueStoreBuilder = Stores.keyValueStoreBuilder(Stores.persistentKeyValueStore(stateStoreName), Serdes.String(), Serdes.Long());
    streamsBuilder.addStateStore(keyValueStoreBuilder);
}
https://medium.com/@daniyaryeralin/utilizing-kafka-streams-processor-api-and-implementing-custom-aggregator-6cb23d00eaa7
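Two details the snippet leaves implicit: the & prefix in the getBean call retrieves the StreamsBuilderFactoryBean itself rather than the StreamsBuilder it produces, and the method has to run before the factory bean starts the topology. One way to arrange the latter (my assumption, not something the snippet shows) is a @PostConstruct hook, since the StreamsBuilderFactoryBean is a SmartLifecycle and is only started after all singletons are initialized:
// Assumption: @PostConstruct runs during singleton initialization, i.e. before
// the StreamsBuilderFactoryBean (a SmartLifecycle) builds and starts the topology.
@PostConstruct
public void init() throws Exception {
    initializeStateStores();
}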

Related

Validate whether a consumer group is created on a topic

I am working on a project where I need to validate whether a consumer group has been created on a topic. Is there any way in Spring Kafka to validate this?
Currently, describeConsumerGroups does not appear to be supported by Spring Kafka's KafkaAdmin, so you may need to create a Kafka AdminClient and call the method yourself.
For example, here I took advantage of the auto-configuration property class KafkaProperties and autowired it into the service.
@Service
public class KafkaBrokerService implements BrokerService {

    private Map<String, Object> configs;

    public KafkaBrokerService(KafkaProperties kafkaProperties) { // Autowired
        this.configs = kafkaProperties.buildAdminProperties();
    }

    private AdminClient createAdmin() {
        Map<String, Object> configs2 = new HashMap<>(this.configs);
        return AdminClient.create(configs2);
    }

    public SomeDto consumerGroupDescription(String groupId) {
        try (AdminClient adminClient = createAdmin()) {
            // ConsumerGroup's members
            ConsumerGroupDescription consumerGroupDescription = adminClient.describeConsumerGroups(Collections.singletonList(groupId))
                    .describedGroups().get(groupId).get();
            // ConsumerGroup's partitions and the committed offset in each partition
            Map<TopicPartition, OffsetAndMetadata> offsets = adminClient.listConsumerGroupOffsets(groupId).partitionsToOffsetAndMetadata().get();
            // When you get the information, you can validate it here.
            ...
        } catch (ExecutionException | InterruptedException e) {
            //
        }
    }
}
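As a possible follow-up (my sketch, not part of the original answer): once you have the offsets map from listConsumerGroupOffsets, validating the group against a specific topic can be as simple as checking whether the group has committed offsets for any partition of that topic.
// Sketch under the assumption that "created on a topic" means the group has
// committed offsets for at least one partition of that topic.
public boolean groupConsumesTopic(Map<TopicPartition, OffsetAndMetadata> offsets, String topic) {
    return offsets.keySet().stream().anyMatch(tp -> tp.topic().equals(topic));
}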

ReadOnlyKeyValueStore from a KTable using spring-kafka

I am migrating a Kafka Streams implementation that uses the pure Kafka APIs to spring-kafka, as it's incorporated into a Spring Boot application.
Everything works fine: the stream, GlobalKTable, and branching I have all work perfectly, but I am having a hard time incorporating a ReadOnlyKeyValueStore. Based on the spring-kafka documentation here: https://docs.spring.io/spring-kafka/docs/2.6.10/reference/html/#streams-spring
It says:
If you need to perform some KafkaStreams operations directly, you can
access that internal KafkaStreams instance by using
StreamsBuilderFactoryBean.getKafkaStreams(). You can autowire
StreamsBuilderFactoryBean bean by type, but you should be sure to use
the full type in the bean definition.
Based on that, I tried to incorporate it into my example as in the following fragments:
@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
public KafkaStreamsConfiguration defaultKafkaStreamsConfig() {
    Map<String, Object> props = defaultStreamsConfigs();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "quote-stream");
    props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, SpecificAvroSerde.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "stock-quotes-stream-group");
    return new KafkaStreamsConfiguration(props);
}

@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_BUILDER_BEAN_NAME)
public StreamsBuilderFactoryBean defaultKafkaStreamsBuilder(KafkaStreamsConfiguration defaultKafkaStreamsConfig) {
    return new StreamsBuilderFactoryBean(defaultKafkaStreamsConfig);
}
...
final GlobalKTable<String, LeveragePrice> leverageBySymbolGKTable = streamsBuilder
        .globalTable(KafkaConfiguration.LEVERAGE_PRICE_TOPIC,
                Materialized.<String, LeveragePrice, KeyValueStore<Bytes, byte[]>>as("leverage-by-symbol-table")
                        .withKeySerde(Serdes.String())
                        .withValueSerde(leveragePriceSerde));
leveragePriceView = myKStreamsBuilder.getKafkaStreams().store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
But adding the StreamsBuilderFactoryBean definition (which seems to be needed to get a reference to KafkaStreams) causes an error:
The bean 'defaultKafkaStreamsBuilder', defined in class path resource [com/resona/springkafkastream/repository/KafkaConfiguration.class], could not be registered. A bean with that name has already been defined in class path resource [org/springframework/kafka/annotation/KafkaStreamsDefaultConfiguration.class] and overriding is disabled.
The issue is that I don't want to control the lifecycle of the stream (that's what I get with the plain Kafka APIs); I would like Spring to manage the default one and just get a reference to it, but whenever I try to expose the bean I get the error above. Any ideas on the correct approach to this using spring-kafka?
P.S. I am not interested in solutions using spring-cloud-stream; I am looking for an implementation with spring-kafka.
You don't need to define any new beans; something like this should work...
spring.application.name=quote-stream
spring.kafka.streams.properties.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.kafka.streams.properties.default.value.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
@SpringBootApplication
@EnableKafkaStreams
public class So69669791Application {

    public static void main(String[] args) {
        SpringApplication.run(So69669791Application.class, args);
    }

    @Bean
    GlobalKTable<String, String> leverageBySymbolGKTable(StreamsBuilder sb) {
        return sb.globalTable("gkTopic",
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>> as("leverage-by-symbol-table"));
    }

    private ReadOnlyKeyValueStore<String, String> leveragePriceView;

    @Bean
    StreamsBuilderFactoryBean.Listener afterStart(StreamsBuilderFactoryBean sbfb,
            GlobalKTable<String, String> leverageBySymbolGKTable) {
        StreamsBuilderFactoryBean.Listener listener = new StreamsBuilderFactoryBean.Listener() {

            @Override
            public void streamsAdded(String id, KafkaStreams streams) {
                leveragePriceView = streams.store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
            }

        };
        sbfb.addListener(listener);
        return listener;
    }

    @Bean
    KStream<String, String> stream(StreamsBuilder builder) {
        KStream<String, String> stream = builder.stream("someTopic");
        stream.to("otherTopic");
        return stream;
    }

}
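The streamsAdded callback only fires once the factory bean has created and started the KafkaStreams instance, so the view stays null until then. A hypothetical accessor (my addition, not part of the original answer) that you could put in the same class would guard for that:
// Hypothetical query helper; leveragePriceView is populated by the listener
// above only after the streams instance has started.
public String leverageFor(String symbol) {
    if (leveragePriceView == null) {
        throw new IllegalStateException("Kafka Streams has not started yet");
    }
    return leveragePriceView.get(symbol);
}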

Axon 4 snapshot on demand

I work with a Spring Boot and Axon example. I implemented the snapshot feature, and with the code below it works fine: after 3 events I find the data in the snapshot_event_entry table in the database.
@Configuration
@AutoConfigureAfter(value = { AxonAutoConfiguration.class })
public class AxonConfig {

    @Bean
    public SnapshotTriggerDefinition catalogSnapshotTrigger(Snapshotter snapshotter) {
        return new EventCountSnapshotTriggerDefinition(snapshotter, 3);
    }
}

@Aggregate(snapshotTriggerDefinition = "catalogSnapshotTrigger")
public class CatalogAggregate { }
My question: is there a way to take a snapshot on demand? That is, I want to implement an API that triggers the snapshot, rather than having it happen automatically after 3 events.
There is nothing already in place for this.
One way to implement what you need is to create a dedicated command, e.g. PerformSnapshotCmd, that carries the aggregateId information, plus a @CommandHandler in your aggregate. You can then let Spring autowire the Snapshotter instance bean and call its scheduleSnapshot(Class<?> aggregateType, String aggregateIdentifier) method.
Below are some code snippets that could guide you.
data class PerformSnapshotCmd(@TargetAggregateIdentifier val id: String)

@CommandHandler
public void handle(PerformSnapshotCmd cmd, Snapshotter snapshotter) {
    logger.debug("handling {}", cmd);
    snapshotter.scheduleSnapshot(this.getClass(), cmd.getId());
}
You should also define one bean of type Snapshotter in your config:
@Bean
public SpringAggregateSnapshotterFactoryBean snapshotter() {
    SpringAggregateSnapshotterFactoryBean springAggregateSnapshotterFactoryBean = new SpringAggregateSnapshotterFactoryBean();
    // Setting async executors
    springAggregateSnapshotterFactoryBean.setExecutor(Executors.newSingleThreadExecutor());
    return springAggregateSnapshotterFactoryBean;
}
Please note that the first argument of your command handler needs to be the command; otherwise the framework will complain with an exception at startup time.
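To complete the "API" part of the question, a hypothetical REST endpoint could dispatch the command through Axon's CommandGateway (the controller and route are my assumptions, not from the original answer):
@RestController
public class SnapshotController {

    private final CommandGateway commandGateway;

    public SnapshotController(CommandGateway commandGateway) {
        this.commandGateway = commandGateway;
    }

    // POST /aggregates/{id}/snapshot triggers the on-demand snapshot.
    @PostMapping("/aggregates/{id}/snapshot")
    public void snapshot(@PathVariable String id) {
        commandGateway.sendAndWait(new PerformSnapshotCmd(id));
    }
}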

Implementing Axon snapshot with Spring Boot 2.3.3 and Axon 4.4.2

Can anyone suggest a tutorial or sample project for implementing snapshots in Axon 4.4.2 with Spring Boot 2.3.3?
I went through the documentation (https://docs.axoniq.io/reference-guide/axon-framework/tuning/event-snapshots#snapshotting) and did the below:
The AxonConfig class:
@Configuration
public class AxonConfig {

    @Bean
    public SnapshotTriggerDefinition app1SnapshotTrigger(Snapshotter snapshotter) {
        return new EventCountSnapshotTriggerDefinition(snapshotter, 10);
    }
}
The Aggregate:
@Aggregate(snapshotTriggerDefinition = "app1SnapshotTrigger")
public class MyAggregate {

    @AggregateIdentifier
    private String id;
    private String name;

    @AggregateMember
    private List<Address> addresses = new ArrayList<>();

    private MyAggregate() {
    }

    @CommandHandler
    private MyAggregate(CreateNameCommand createNameCommand) {
        -----
    }

    @EventSourcingHandler
    private void on(NameCreatedEvent nameCreatedEvent) {
        ----
    }
}
Am I missing something? Will it create a snapshot at the threshold value of 10?
Thanks.
Unfortunately, we have no sample demo ready to show for this case.
From your code snippet it looks like everything is in place. Maybe some other configuration is taking precedence over your annotation.
To give it a try, I applied your configuration to our https://github.com/AxonIQ/giftcard-demo/
A first note that can guide you: if you declared a Repository bean, as we did in https://github.com/AxonIQ/giftcard-demo/blob/master/src/main/java/io/axoniq/demo/giftcard/command/GcCommandConfiguration.java#L17, that configuration takes precedence over the annotation placed on your aggregate. If you prefer the annotation, you can remove this bean definition.
Here, instead, is the code to have this configured as beans:
@Bean
public Repository<GiftCard> giftCardRepository(EventStore eventStore, SnapshotTriggerDefinition giftCardSnapshotTrigger) {
    return EventSourcingRepository.builder(GiftCard.class)
            .snapshotTriggerDefinition(giftCardSnapshotTrigger)
            .eventStore(eventStore)
            .build();
}

@Bean
public SpringAggregateSnapshotterFactoryBean snapshotter() {
    var springAggregateSnapshotterFactoryBean = new SpringAggregateSnapshotterFactoryBean();
    // Setting async executors
    springAggregateSnapshotterFactoryBean.setExecutor(Executors.newSingleThreadExecutor());
    return springAggregateSnapshotterFactoryBean;
}

@Bean("giftCardSnapshotTrigger")
public SnapshotTriggerDefinition giftCardSnapshotTriggerDefinition(Snapshotter snapshotter) {
    return new EventCountSnapshotTriggerDefinition(snapshotter, 10);
}
You can check that your snapshotting is working by looking at the client log: after 10 events on the same aggregateId, you should find this info log entry:
o.a.a.c.event.axon.AxonServerEventStore : Snapshot created
To check further, you can use the REST API to retrieve the events for an aggregate:
curl -X GET "http://localhost:8024/v1/events?aggregateId=A01"
This will produce a stream containing events starting from the latest snapshot: you will see at most nine events listed until the tenth event is processed; after that, the endpoint will list events starting from the snapshot.
You can also check the /actuator/health endpoint: it will show the last snapshot token if showDetails is enabled (enabled by default in EE, not enabled by default in SE).
Corrado.

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating adopting spring-data-mongodb for a project. In summary, my aim is:
Use existing XML schema files to generate Java classes. This is achieved using JAXB xjc. The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any;, allowing it to store objects of any class. In my case, it is among the classes named TSDModule_Name_HereModuleType and can be browsed here.
Use spring-data-mongodb as the persistence store. This is achieved using a simple ProductDataRepository:
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement, which spring-data-mongodb doesn't seem to handle by itself; it throws a CodecConfigurationException: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here; however, it seems I am missing something, since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement.
Note: this works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose.
It seems that I missed something while trying to add the converters earlier, so I added them like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        //Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        //dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        if (name.getNamespaceURI() == XMLConstants.NULL_NS_URI) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate MongoConverter converter?
MongoConverter is an interface, so I guess I need an instantiable class adhering to it. Any suggestions?
I understand the desire for the convenience of mapping an existing domain object to the database layer with no boilerplate, but even if you weren't having the JAXB class-structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain models need to change while your persisted data must remain in its existing state. If you just persist the data directly, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context, either with @Component plus component scanning or manually via some configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}
