Hello, I have a problem writing an object that I defined from Spring XD to GemFire.
If I deploy my class and run the following command in the GemFire console, I can create a new entry in a region containing an Employee object:
put --key-class=java.lang.String --value-class=Employee --key=('id':'998') --value=('id':186,'firstName':'James','lastName':'Goslinga') --region=replicated2
What I want to do is send the data from Spring XD and end up with a new Employee object in GemFire.
I create the following stream, which gets data from RabbitMQ and sends it to GemFire:
stream create --name reference-data-import --definition "rabbit --outputType=text/plain | gemfire-json-server --host=MC0WJ1BC --regionName=region10 --keyExpression=payload.getField('id')" --deploy
I can see that the data arrives as com.gemstone.gemfire.pdx.internal.PdxInstanceImpl.
According to the Spring XD documentation I can use a parameter such as outputType=application/x-java-object;type=com.bar.Foo, but I never managed to get it working even though I deployed my class. A simple working example would be great.
Spring XD provides no facilities for automatically converting one object type to another, since there is no easy way to handle the mapping in an automated way. Instead, you'll need to put a transformer before the gemfire-server sink to convert the payload to your com.bar.Foo object. From there, it will be serialized correctly into GemFire.
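For example, here is a minimal transformer sketch (the class name is illustrative; it assumes the incoming JSON text maps straight onto Employee's fields and that Employee is on the XD classpath, e.g. a jar in xd/lib):

import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative transformer for a custom XD processor module:
// converts the JSON text payload into an Employee instance.
public class EmployeeTransformer {

    private final ObjectMapper mapper = new ObjectMapper();

    public Employee transform(String payload) throws java.io.IOException {
        // Jackson maps the JSON fields onto Employee's properties by name
        return mapper.readValue(payload, Employee.class);
    }
}

Packaged as a custom processor module (say, employee-transformer), the stream would then look something like rabbit | employee-transformer | gemfire-server.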
I think I'm not grasping some basic concepts of Kafka here, so I'm hoping Stack Overflow may be able to help.
I've been trying to learn Kafka with Spring Boot by following this Git repo:
I understand how, without Avro, to take a Java class from one microservice, send it to Kafka, and consume/deserialise it on another microservice. However, I hate that idea, as it means I must have an identical class on the other microservice in terms of package location, name, etc.
So overall I have two questions here, I guess:
1. I want to understand how I can share messages across my Spring Boot microservices and map them to classes without copying those classes from one service to the other.
2. I want to be able to consume, from my Spring Kafka listeners, messages created in another language, say C#.
Where I'm currently at: I have the Avro example from the repo above up and running, along with my local Kafka and Schema Registry instances.
However, if I create a duplicate class, call it UserTest (for example), and make it identical to the User class consumed here, I get stack traces like the following:
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot convert from [io.confluent.developer.User] to [io.confluent.developer.kafkaworkshop.streams.User] for GenericMessage [payload={"name": "vik", "age": 33}, headers={kafka_offset=6, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#54200a0e, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=vik, kafka_receivedPartitionId=1, kafka_receivedTopic=users12, kafka_receivedTimestamp=1611278455840, kafka_groupId=simple-consumer}]
Am I missing something exceptionally basic here? I thought that once the message was sent in Avro format it could be consumed and mapped to another object with the same fields; that way, if the object was created in C#, the Spring service would be able to interpret it, no?
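From what I can tell, the deserializer resolves the target class from the Avro schema's namespace and record name, which is where the coupling comes from. One workaround I'm considering is consuming GenericRecord instead of a specific class; a minimal sketch of what I mean, assuming the Confluent Avro deserializer with specific.avro.reader=false and my users12 topic:

import org.apache.avro.generic.GenericRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Consuming as GenericRecord avoids coupling the listener to the
// producer's generated class and package. Assumes these consumer props:
//   value.deserializer = io.confluent.kafka.serializers.KafkaAvroDeserializer
//   specific.avro.reader = false   (yields GenericRecord)
@Component
public class UserListener {

    @KafkaListener(topics = "users12", groupId = "simple-consumer")
    public void onUser(GenericRecord record) {
        // Fields are read by name, independent of any Java package
        String name = record.get("name").toString();
        int age = (Integer) record.get("age");
        System.out.printf("received user %s, age %d%n", name, age);
    }
}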
If anyone can help me that would be great....
Thanks!
I have 2 Spring Boot microservices, core and web:
The core service reacts to some event (EmployeeCreatedEvent) which is triggered by web.
The core service uses the Jackson serializer to serialize commands, queries, events, and messages, whereas the web service uses the XStream serializer.
I am getting the below error in core while handling the EmployeeCreatedEvent triggered by web:
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)):
expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
I am using the below properties (jackson for core, default for web):
axon.serializer.general = jackson/default
axon.serializer.events = jackson/default
axon.serializer.messages = jackson/default
Can someone suggest whether it is OK to use different serializers for the same event in different services?
I agree with @Augusto here: you should make a decision about which serialization format you are going to use across all your services.
I am assuming you started with the default serializer (which is XStream and XML) and later on decided to move to Jackson (which is JSON).
In that case, there are 2 pieces of advice I can share with you:
You can write a custom Serializer which wraps both implementations and tries each of them to see which one works, for example trying XML first and falling back to JSON (see the sketch after this list).
Or you can have a component which listens to all events from your EventStore, deserializes them using XStream, and writes them back to another EventStore using Jackson. During this migration period the component will use 2 event streams (one for each EventStore), but once the migration is done your whole EventStore will be in JSON. This requires some work, but in my opinion it is the best approach and will save you a lot of time and pain in the future.
You can look more about configuring 2 sources here.
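For the first of those options, here is a minimal sketch assuming Axon 4's Serializer interface (the class name and fallback policy are illustrative, not an Axon-provided API):

import org.axonframework.serialization.Converter;
import org.axonframework.serialization.SerializedObject;
import org.axonframework.serialization.SerializedType;
import org.axonframework.serialization.Serializer;

// Tries the primary serializer first and falls back to the other format
// when deserialization fails. New objects are always written with the
// primary format.
public class FallbackSerializer implements Serializer {

    private final Serializer primary;   // e.g. JacksonSerializer
    private final Serializer fallback;  // e.g. XStreamSerializer

    public FallbackSerializer(Serializer primary, Serializer fallback) {
        this.primary = primary;
        this.fallback = fallback;
    }

    @Override
    public <S, T> T deserialize(SerializedObject<S> serializedObject) {
        try {
            return primary.deserialize(serializedObject);
        } catch (Exception e) {
            // The payload was probably written in the other format
            return fallback.deserialize(serializedObject);
        }
    }

    @Override
    public <T> SerializedObject<T> serialize(Object object, Class<T> expectedRepresentation) {
        return primary.serialize(object, expectedRepresentation);
    }

    @Override
    public <T> boolean canSerializeTo(Class<T> expectedRepresentation) {
        return primary.canSerializeTo(expectedRepresentation);
    }

    @Override
    public Class classForType(SerializedType type) {
        return primary.classForType(type);
    }

    @Override
    public SerializedType typeForClass(Class type) {
        return primary.typeForClass(type);
    }

    @Override
    public Converter getConverter() {
        return primary.getConverter();
    }
}

Since new events are always written with the primary format, the fallback path only matters while old payloads are still around.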
I have multiple consumers for an API who post similar data into my API. My API needs to consume this data and persist it into Cassandra tables identified by consumer name, e.g. consumername_tablename.
My Spring Boot entity is annotated with @Table, which doesn't let me change the table name dynamically. Most recommendations online suggest that it's not something we should try to change.
But in my scenario, identifying all consumers and creating tables in advance doesn't sound right. In the future I want to be able to add consumers to my API seamlessly.
I want to use a variable passed in my API call as the prefix for my Cassandra table names. Is this something I can achieve?
For starters: you cannot change annotations without recompiling; they are baked into the compiled class file. This is not the right approach.
Why not put everything in one table and make consumer part of the key? This should give you identical functionality without any of the hassle.
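A minimal sketch of that idea with Spring Data Cassandra (table and field names are illustrative):

import org.springframework.data.cassandra.core.cql.PrimaryKeyType;
import org.springframework.data.cassandra.core.mapping.PrimaryKeyColumn;
import org.springframework.data.cassandra.core.mapping.Table;

// One table for all consumers; the consumer name is the partition key,
// so each consumer's rows stay together on disk.
@Table("consumer_data")
public class ConsumerData {

    @PrimaryKeyColumn(name = "consumer_name", ordinal = 0, type = PrimaryKeyType.PARTITIONED)
    private String consumerName;

    @PrimaryKeyColumn(name = "record_id", ordinal = 1, type = PrimaryKeyType.CLUSTERED)
    private String recordId;

    private String payload;
}

New consumers then require no schema changes; you just write rows with a new consumer_name value.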
If the message payload passed from one "filter" to the next "filter" in a Spring XD stream is a custom Java class instance, I suppose some kind of serialization mechanism is required if the "pipe" in between is a remote transport.
What kind of "serialization"/"transformation" is available in Spring XD for this case?
Will Java serialization work here? And if the custom class is serializable, will Spring XD automatically serialize/deserialize the object, or do we still need to give some hints in the stream/module definition?
Thanks,
Simon
XD uses Kryo serialization with remote transports. java.io serialization would work in theory; however, we don't want to assume that payload types implement java.io.Serializable. Also, I personally don't see any advantage in automatically choosing Java serialization over Kryo when the payload is Serializable. That said, Java serialization is supported via Spring XD's type conversion.
You should be able to create a stream containing something like:
filter1 --outputType=application/x-java-serialized-object | filter2 --inputType=application/x-java-object;type=my.custom.SerializablePayloadType
In this case, the type conversion will use Java serialization before hitting the transport. The message bus will detect that the payload is a byte array and will send it to the next module as is. The message containing the bytes will have its content-type header set to the declared outputType, so that the inbound converter can deserialize it using Java serialization.
The above requires that the payload implements Serializable. Also, custom payload types must be included in Spring XD's class path, i.e., add a jar to xd/lib on each installed container.
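For completeness, a payload type along these lines is all that's needed (the class name echoes the stream example above and is purely illustrative):

import java.io.Serializable;

// A payload type that can cross XD's remote transport via Java
// serialization once declared via outputType/inputType as shown above.
public class SerializablePayloadType implements Serializable {

    private static final long serialVersionUID = 1L;

    private String field;

    public String getField() { return field; }

    public void setField(String field) { this.field = field; }
}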
The Mule documentation gives an example on how to connect to Oracle AQ using a queue table with the queue_payload_type set to sys.aq$_jms_text_message.
How would I get Mule to work with queue_payload_type set to my own Oracle Object Type?
When I try to run the flow, I get the following error: JMS-137: Payload factory must be specified for destinations with ADT payloads. According to this question and this Oracle documentation, it seems I need to create my own class that implements ORADataFactory and works with my Oracle object type (which I've done) and use it when calling createConsumer, but I don't know how to get my ORADataFactory passed to createConsumer.
Do I have to create my own custom JMS Connector to get this working or is there a simpler way?
This is how you can solve it if you are using Spring: http://blog.javaforge.net/post/30858904340/oracle-advanced-queuing-spring-custom-types
In a "springless" environment just create your own message consumer like described in the article above.