Problem with Protostream and UUID in Infinispan 13.0.0.Final - protocol-buffers

I'm using Infinispan 13.0.0.Final with the default marshaller (ProtoStream/protobuf). When I try to use UUID fields in my data types:
data class CounterState(
    @get:ProtoField(number = 1) var index: Long? = null,
    @get:ProtoField(number = 2) var uuid: UUID? = null
)
I get the following error at build time:
.../gradle-kotlin-protobuf/build/tmp/kapt3/stubs/main/io/radiosphere/ProtoSchema.java:8: error: org.infinispan.protostream.annotations.ProtoSchemaBuilderException: The class java.util.UUID must be instantiable using an accessible no-argument constructor.
public abstract interface ProtoSchema extends org.infinispan.protostream.GeneratedSchema {
It seems like I'm not allowed to use UUID in my types unless I generate a proto schema for it, but since UUID is a class outside of my control I can't do that.
Previous questions on the topic have gotten the suggestion to use the JavaSerializationMarshaller, but I want to solve this while still using the ProtoStream marshaller. It has also been suggested that this would be fixed in version 12.0.0 here.
An example of this not working can be found here. Note that this project will not build because the annotation processing fails as mentioned above. If it did build, the proof that it works would be shown by running the main project (i.e. not the tests).
The question becomes: what do I need to do to configure UUID to be usable in my protobuf-marshalled classes in Infinispan 13, both for embedded caches and for a program using the Hot Rod client?
EDIT:
Based on a given answer I have also tried doing the following:
@AutoProtoSchemaBuilder(
    includeClasses = [UUIDAdapter::class, CounterState::class],
    schemaPackageName = "tutorial")
interface ProtoSchema : GeneratedSchema {
}
This makes the build work, but when starting Quarkus I get the following error:
Caused by: org.infinispan.protostream.DescriptorParserException: Duplicate type id 1005 for type org.infinispan.protostream.commons.UUID. Already used by tutorial.UUID
at org.infinispan.protostream.descriptors.ResolutionContext.checkUniqueTypeId(ResolutionContext.java:151)
at org.infinispan.protostream.descriptors.ResolutionContext.addGenericDescriptor(ResolutionContext.java:97)
at org.infinispan.protostream.descriptors.FileDescriptor.collectDescriptors(FileDescriptor.java:313)
at org.infinispan.protostream.descriptors.FileDescriptor.resolveDependencies(FileDescriptor.java:245)
at org.infinispan.protostream.descriptors.FileDescriptor.resolveDependencies(FileDescriptor.java:210)
at org.infinispan.protostream.descriptors.ResolutionContext.resolve(ResolutionContext.java:57)
at org.infinispan.protostream.impl.SerializationContextImpl.registerProtoFiles(SerializationContextImpl.java:127)
at org.infinispan.protostream.types.java.CommonTypesSchema.registerSchema(CommonTypesSchema.java:49)
at org.infinispan.client.hotrod.RemoteCacheManager.registerSerializationContextInitializer(RemoteCacheManager.java:422)
at org.infinispan.client.hotrod.RemoteCacheManager.registerDefaultSchemas(RemoteCacheManager.java:437)
at org.infinispan.client.hotrod.RemoteCacheManager.initializeProtoStreamMarshaller(RemoteCacheManager.java:409)
at org.infinispan.client.hotrod.RemoteCacheManager.actualStart(RemoteCacheManager.java:365)
at org.infinispan.client.hotrod.RemoteCacheManager.start(RemoteCacheManager.java:334)
at org.infinispan.client.hotrod.RemoteCacheManager.<init>(RemoteCacheManager.java:192)
at org.infinispan.client.hotrod.RemoteCacheManager.<init>(RemoteCacheManager.java:149)
at io.quarkus.infinispan.client.runtime.InfinispanClientProducer.initialize(InfinispanClientProducer.java:68)
If I instead change to use dependsOn like this:
@AutoProtoSchemaBuilder(
    includeClasses = [CounterState::class],
    dependsOn = [org.infinispan.protostream.types.java.CommonTypes::class, org.infinispan.protostream.types.java.CommonContainerTypes::class],
    schemaPackageName = "tutorial")
I'm back to the build failing with:
error: org.infinispan.protostream.annotations.ProtoSchemaBuilderException: The class java.util.UUID must be instantiable using an accessible no-argument constructor.
public abstract interface ProtoSchema extends org.infinispan.protostream.GeneratedSchema {
It seems like Quarkus and the annotation processor are getting in each other's way here when it comes to having a simple working solution for UUID marshalling.

You have to include the org.infinispan.protostream.types.java.util.UUIDAdapter class in your annotation:
@AutoProtoSchemaBuilder(includeClasses = [CounterState::class, UUIDAdapter::class], schemaPackageName = "tutorial")
For more info, check the documentation page.
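If including the bundled adapter leads to the duplicate type id 1005 clash described in the edit above (because the Hot Rod client also registers the common-types schema on its own), one possible workaround is to write your own ProtoStream adapter for java.util.UUID in your own schema package and include that instead of the bundled one. This is only a sketch based on the general @ProtoAdapter mechanism, not a verified fix from this answer; the class name LocalUUIDAdapter is made up:
import java.util.UUID
import org.infinispan.protostream.annotations.ProtoAdapter
import org.infinispan.protostream.annotations.ProtoFactory
import org.infinispan.protostream.annotations.ProtoField
import org.infinispan.protostream.descriptors.Type

// Hypothetical adapter in your own package; it carries no @ProtoTypeId, so the
// message generated under the "tutorial" package cannot collide with the
// reserved id used by org.infinispan.protostream.commons.UUID.
@ProtoAdapter(UUID::class)
class LocalUUIDAdapter {

    @ProtoFactory
    fun create(mostSigBits: Long, leastSigBits: Long): UUID = UUID(mostSigBits, leastSigBits)

    @ProtoField(number = 1, type = Type.UINT64, defaultValue = "0")
    fun getMostSigBits(uuid: UUID): Long = uuid.mostSignificantBits

    @ProtoField(number = 2, type = Type.UINT64, defaultValue = "0")
    fun getLeastSigBits(uuid: UUID): Long = uuid.leastSignificantBits
}
You would then list LocalUUIDAdapter instead of the bundled UUIDAdapter in includeClasses of the @AutoProtoSchemaBuilder annotation. Treat this as a starting point rather than a confirmed fix for the Quarkus clash.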

Related

ConfigurationProperties loading list from YML

I'm trying to load configuration from YAML. I can load a value, and I can also load a list if it is given as comma-separated values, but I can't load a typical YAML list.
Configuration Class
@Component
@PropertySource("classpath:routing.yml")
@ConfigurationProperties
class RoutingProperties() {
    var angular = listOf("nothing")
    var value: String = ""
}
Working routing.yml
angular: /init, /home
value: Hello World
Not Working routing.yml
angular:
- init
- home
value: Hello World
Why can't I load the second version of the YAML / do I have a syntax error?
ENV: Kotlin, Spring 2.0.0.M3
As @flyx says, @PropertySource does not work with YAML files. But in Spring you can override almost everything :)
@PropertySource has an additional parameter: factory. It's possible to create your own PropertySourceFactory based on DefaultPropertySourceFactory:
open class YamlPropertyLoaderFactory : DefaultPropertySourceFactory() {
    override fun createPropertySource(name: String?, resource: EncodedResource?): org.springframework.core.env.PropertySource<*> {
        if (resource == null)
            return super.createPropertySource(name, resource)
        return YamlPropertySourceLoader().load(resource.resource.filename, resource.resource, null)
    }
}
Then use this factory in the @PropertySource annotation:
@PropertySource("classpath:/routing.yml", factory = YamlPropertyLoaderFactory::class)
The last thing you need is to initialize the angular variable with a mutable list.
Full code sample:
@Component
@PropertySource("classpath:/routing.yml", factory = YamlPropertyLoaderFactory::class)
@ConfigurationProperties
open class RoutingProperties {
    var angular = mutableListOf("nothing")
    var value: String = ""

    override fun toString(): String {
        return "RoutingProperties(angular=$angular, value='$value')"
    }
}

open class YamlPropertyLoaderFactory : DefaultPropertySourceFactory() {
    override fun createPropertySource(name: String?, resource: EncodedResource?): org.springframework.core.env.PropertySource<*> {
        if (resource == null)
            return super.createPropertySource(name, resource)
        return YamlPropertySourceLoader().load(resource.resource.filename, resource.resource, null)
    }
}

@SpringBootApplication
@EnableAutoConfiguration(exclude = arrayOf(DataSourceAutoConfiguration::class))
open class Application {
    companion object {
        @JvmStatic
        fun main(args: Array<String>) {
            val context = SpringApplication.run(Application::class.java, *args)
            val bean = context.getBean(RoutingProperties::class.java)
            println(bean)
        }
    }
}
Kinda old post, I know, but I'm on the very same topic right now.
As of now, it seems that @PropertySource does indeed work with YAML files, with the restriction that it only allows primitive types (it seems) and can't handle nested elements. I'm probably going to dig a bit deeper and update my answer accordingly, but as of now the accepted answer seems like a functioning workaround.
Well, according to the docs, your YAML file will be rewritten into a property file. The first YAML file becomes:
angular=/init, /home
value=Hello World
While the second one becomes:
angular[0]=init
angular[1]=home
value=Hello World
These are obviously two very different things and therefore behave differently.
Moreover, later in the docs, it is stated that YAML does not even work with @PropertySource:
24.6.4 YAML shortcomings
YAML files can’t be loaded via the #PropertySource annotation. So in the case that you need to load values that way, you need to use a properties file.
That makes me kind of wonder why the first case works for you at all.
The docs say this about the generated …[index] properties:
To bind to properties like that using the Spring DataBinder utilities (which is what #ConfigurationProperties does) you need to have a property in the target bean of type java.util.List (or Set) and you either need to provide a setter, or initialize it with a mutable value, e.g. this will bind to the properties above
So, let's have a look at the Kotlin docs: listOf returns a new read-only list of the given elements. So the list is not mutable, as required by the docs, which I assume is why it doesn't work. Try using a mutable list (since I have never used Kotlin, I cannot give you working code). Also try to declare it as java.util.List if that's possible in Kotlin.
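A minimal Kotlin sketch of that suggestion (property names taken from the question; everything else is assumed): initialize angular with a mutable list so Spring's DataBinder can populate angular[0], angular[1], and so on.
import org.springframework.boot.context.properties.ConfigurationProperties
import org.springframework.stereotype.Component

// Binding target with a mutable list, as the docs quoted above require.
// Per the quoted shortcoming, the YAML itself cannot be loaded through @PropertySource;
// keep it in application.yml or use a custom factory like the one in the accepted answer.
@Component
@ConfigurationProperties
class RoutingProperties {
    var angular: MutableList<String> = mutableListOf()
    var value: String = ""
}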

Lagom framework / streamed response / websocket / pathCall / Descriptor / Creator instead of Function

I have my service declared this way:
public interface BlogQueryService extends Service {
    public ServiceCall<String, Source<String, ?>> tick(int interval);
    public ServiceCall<String, Source<String, ?>> tock();
    public ServiceCall<NotUsed, Source<PostSummary, ?>> newPosts();
    public ServiceCall<String, Source<PostSummary, ?>> getPostSummaries();

    @Override
    default Descriptor descriptor() {
        return named("blog-query").with(
            //pathCall("/api/bloggie/tick/:interval", this::tick),
            pathCall("/api/bloggie/tock", tock())
            //pathCall("/api/bloggie/newPosts", this::newPosts),
            //pathCall("/api/bloggie/postSummaries", this::getPostSummaries)
        ).withAutoAcl(true);
    }
}
The tick works. The tock doesn't.
When I invoke it using a websocket client (to ws://localhost:9000/api/bloggie/tock), I get "undefined" as the response, indicating that no mapping was found for that URL.
After some experimenting, I found out why: tick works because it has a URL param (the :interval). Tock doesn't work because it doesn't have a URL param. Does pathCall seriously require you to have a param in your URL? So I checked the API of Service: http://www.lagomframework.com/documentation/1.0.x/api/java/com/lightbend/lagom/javadsl/api/Service.html
There are several overloaded declarations of pathCall. Apparently tick uses this one:
static <Request,Response,A> Descriptor.Call<Request,Response> pathCall(String pathPattern, akka.japi.function.Function<A,ServiceCall<Request,Response>> methodRef)
So from the signature, yes, it requires the method to take a parameter. So if the method (such as tock) doesn't take a param, the binding will fail at runtime. So I guess I need to use this one instead:
static <Request,Response> Descriptor.Call<Request,Response> pathCall(String pathPattern, akka.japi.function.Creator<ServiceCall<Request,Response>> methodRef)
The problem is... I don't know how. I haven't seen any example of the use of akka.japi.function.Creator in pathCall.
I tried this:
default Descriptor descriptor() {
    return named("blog-query").with(
        pathCall("/api/bloggie/tick/:interval", this::tick),
        pathCall("/api/bloggie/tock", new Creator<ServiceCall<String, Source<String, ?>>>() {
            public ServiceCall<String, Source<String, ?>> create() {
                return tock();
            }
        })
        //pathCall("/api/bloggie/newPosts", this::newPosts),
        //pathCall("/api/bloggie/postSummaries", this::getPostSummaries)
    ).withAutoAcl(true);
}
It compiles. But it throws an error at runtime:
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, java.lang.IllegalStateException: Unable to resolve method for service call with ID PathCallId{pathPattern='/api/bloggie/tock'}. Ensure that the you have passed a method reference (ie, this::someMethod). Passing anything else, for example lambdas, anonymous classes or actual implementation classes, is forbidden in declaring a service descriptor.
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
while locating com.lightbend.lagom.internal.server.ResolvedServices
Thanks in advance!
I just did some experiments... All compiled, but none of them works....
namedCall("/api/bloggie/tock", this::tock)
Result: Compile success. Runtime: path unknown (no binding (?)).
Then I tried
pathCall("/api/bloggie/tock", () -> this.tock())
Result: exception.
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, scala.MatchError: Request (of class sun.reflect.generics.reflectiveObjects.TypeVariableImpl)
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
while locating com.lightbend.lagom.internal.server.ResolvedServices
for parameter 1 at com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry.<init>(ServiceRegistrationModule.scala:55)
at com.lightbend.lagom.internal.server.ServiceRegistrationModule.bindings(ServiceRegistrationModule.scala:29):
Binding(class com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry to self eagerly) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
while locating com.lightbend.lagom.internal.server.ServiceRegistrationModule$RegisterWithServiceRegistry
Then I tried:
public ServiceCall<NotUsed, Source<String, ?>> tock(Void x);
Result: exception
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error in custom provider, java.lang.IllegalArgumentException: Don't know how to serialize ID class java.lang.Void
at com.lightbend.lagom.javadsl.server.ServiceGuiceSupport.bindServices(ServiceGuiceSupport.java:43) (via modules: com.google.inject.util.Modules$OverrideModule -> sample.bloggie.impl.BlogServiceModule)
Update: "Solved" (partially). Figured out that this one works:
pathCall("/tock", this::tock)
I can open it using this URL: ws://localhost:9000/tock
So... I can't have a nicely structured URL for those functions that return a stream, when those functions need no param? At least for now(?).
UPDATE: it seems like this problem is happening not only with pathCall. I encountered the same problem with restCall. This one doesn't work (no binding):
public ServiceCall<NotUsed, PSequence<PostSummary>> getPostSummaries();
...
restCall(Method.GET, "/api/bloggie/postSummaries", this::getPostSummaries)
This one works:
public ServiceCall<NotUsed, PSequence<PostSummary>> getPostSummaries();
...
restCall(Method.GET, "/postSummaries", this::getPostSummaries)
Thanks!
So firstly, namedCall should only be used if you don't care about the path. You are invoking the service call directly, which means you do care about the path, so you have to use pathCall or restCall.
This should work:
pathCall("/api/bloggie/tock", this::tock)
Also, I think you're not pasting the full errors. Make sure you check right to the bottom of the list of Guice errors; that should explain exactly what the problem is. In many of the cases above, the problem is that you're not passing a method reference, you're passing a lambda, and the error message should say that.

grails 2.2.2 platform-core-plugin No signature of method event in domain model

I'm trying out the platform-core 1.0.RC5 plugin to connect services via events. I have written a service in the Grails plugin "listadmin":
package listadmin

class SECO_ListenService {

    @grails.events.Listener(topic='getEntriesOfList', namespace='listadmin')
    def getEntriesOfList(String intnalListName) {
        println "SECO_ListenService"
        def Liste aList = Liste.findByInternal_name(intnalListName)
        return aList.eintrage.toList()
    }
}
This service should return a list for a dropdown in another Grails plugin called "institutionadmin". I want to use this list from the service for a dropdown of a domain model. I should mention that I use dynamic scaffolding. Now I try to call this event in the domain model:
package institutionadmin

import org.springframework.dao.DataIntegrityViolationException

class Einrichtung {
    Long einrichtungs_type
    Long type_of_conzept
    int anzahl_gruppen
    int anzahl_kinder_pro_Gruppe
    String offnungszeiten

    static hasMany = [rooms: Raum]

    static constraints = {
        def aList = []
        def reply = event(for: "listadmin", topic: "getEntriesOfList", data: "einrichtung_type").waitFor()
        aList = reply.value.toList()
        einrichtungs_type(inList: aList)
    }
}
If I try to run this application I get the following error:
Caused by MissingMethodException: No signature of method: institutionadmin.Einrichtung.event() is applicable for argument types: (java.util.LinkedHashMap) values: [[for:listadmin, topic:testEventBus]]
Possible solutions: ident(), every(), every(groovy.lang.Closure), count(), get(java.io.Serializable), print(java.lang.Object)
If I call this event in a controller everything is fine, and the documentation of this plugin says that I can also call events in domain models and services... This error message tells me that the class doesn't know the event method.
Do I have to configure anything else?
Should I call the event in another way, or where is my mistake?
Does anybody have experience with this module?
The event(...) dynamic methods are not available at class (static) level.
Alternatively, you can pull the grailsEvents Spring bean and call its event() method. You still have to get the bean from the application context statically, though.
You could also use a custom validator instead, as you can get the current domain instance as a parameter, which should have the event() method injected.
Something like this:
static myList = []

static constraints = {
    einrichtungs_type validator: { value, instance ->
        if (!myList) {
            // cache it the first time you save/validate the domain
            // I would probably recommend you NOT to do this here though in
            // real life scenario
            def reply = instance.event('blabla').get()
            myList = reply.value.toList()
        }
        return value in myList
    }
}
Anyway, in my case I would probably load the list elsewhere (in Bootstrap.groovy, for instance) and use it / inject it into my domain instead of doing it in the constraints closure.
I faced a similar kind of problem: I wanted to use the event call inside a service class that calls the listener in another service class. When I started my application I got the same error. What I did was add the plugin (platform-core:1.0.RC5) entries in BuildConfig.groovy like below:
plugins {
    build(":tomcat:$grailsVersion",
          ":platform-core:1.0.RC5") {
        export = false
    }
    compile ':platform-core:1.0.RC5'
    runtime ':platform-core:1.0.RC5'
}
Then I ran grails clean and grails compile on that project and restarted the server. It started working. Maybe you can give it a try.

MongoDB - override default Serializer for a C# primitive type

I'd like to change the representation of C# Doubles to rounded Int64 with a four-decimal-place shift in the MongoDB C# driver's serialization stack. In other words, store (Double)29.99 as (Int64)299900.
I'd like this to be transparent to my app. I've had a look at custom serializers but I don't want to override everything and then switch on the Type with fallback to the default, as that's a bit messy.
I can see that RegisterSerializer() won't let me add one for an existing type, and that BsonDefaultSerializationProvider has a static list of primitive serializers and it's marked as internal with private members so I can't easily subclass.
I can also see that it's possible to RepresentAs Int64 for Doubles, but this is a cast not a conversion. I need essentially a cast AND a conversion in both serialization directions.
I wish I could just give the default serializer a custom serializer to override one of its own, but that would mean a dirty hack.
Am I missing a really easy way?
You can definitely do this, you just have to get the timing right. When the driver starts up there are no serializers registered. When it needs a serializer, it looks it up in the dictionary where it keeps track of the serializers it knows about (i.e. the ones that have been registered). Only if it can't find one in the dictionary does it start figuring out where to get one (including calling the serialization providers), and if it finds one it registers it.
The limitation in RegisterSerializer is there so that you can't replace an existing serializer that has already been used. But that doesn't mean you can't register your own if you do it early enough.
However, keep in mind that registering a serializer is a global operation, so if you register a custom serializer for double it will be used for all doubles, which could lead to unexpected results!
Anyway, you could write the custom serializer something like this:
public class CustomDoubleSerializer : BsonBaseSerializer
{
    public override object Deserialize(BsonReader bsonReader, Type nominalType, Type actualType, IBsonSerializationOptions options)
    {
        var rep = bsonReader.ReadInt64();
        return rep / 100.0;
    }

    public override void Serialize(BsonWriter bsonWriter, Type nominalType, object value, IBsonSerializationOptions options)
    {
        var rep = (long)((double)value * 100);
        bsonWriter.WriteInt64(rep);
    }
}
And register it like this:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());
You could test it using the following class:
public class C
{
    public int Id;
    public double X;
}
and this code:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());
var c = new C { Id = 1, X = 29.99 };
var json = c.ToJson();
Console.WriteLine(json);
var r = BsonSerializer.Deserialize<C>(json);
Console.WriteLine(r.X);
You can also use your own serialization provider to tell Mongo which serializer to use for certain types, which I ended up doing to mitigate some of the timing issues mentioned when trying to override existing serializers. Here's an example of a serialisation provider that overrides how to serialize decimals:
public class CustomSerializationProvider : IBsonSerializationProvider
{
    public IBsonSerializer GetSerializer(Type type)
    {
        if (type == typeof(decimal)) return new DecimalSerializer(BsonType.Decimal128);
        return null; // falls back to Mongo defaults
    }
}
If you return null from your custom serialization provider, it will fall back to using Mongo's default serialization provider.
Once you've written your provider, you just need to register it:
BsonSerializer.RegisterSerializationProvider(new CustomSerializationProvider());
I looked through the latest iteration of the driver's code and checked if there's some sort of backdoor to set custom serializers. I am afraid there's none; you should open an issue in the project's bug tracker if you think this needs to be looked at for future iterations of the driver (https://jira.mongodb.org/).
Personally, I'd open a ticket -- and if a quick workaround is necessary or required, I'd subclass DoubleSerializer, implement the new behavior, and then use Reflection to inject it into either MongoDB.Bson.Serialization.Serializers.DoubleSerializer.__instance or MongoDB.Bson.Serialization.BsonDefaultSerializationProvider.__serializers.

Enterprise Library Validation Block - Should validation be placed on class or interface?

I am not sure where the best place to put validation (using the Enterprise Library Validation Block) is. Should it be on the class or on the interface?
Things that may affect it:
Validation rules would not be changed in classes which inherit from the interface.
Validation rules would not be changed in classes which inherit from the class.
Inheritance will occur from the class in most cases - I suspect some fringe cases will inherit from the interface (but I would try to avoid it).
The interface's main use is for DI, which will be done with the Unity block.
Given the way you are trying to use the Validation Block with DI, I don't think it's a problem if you set the attributes at the interface level. Also, I don't think it should create problems in the inheritance chain. However, I have mostly seen this block used at the class level, with the intent of keeping interfaces from over-specifying things. IMO I don't see a big threat in doing this.
Be very careful here, your test is too simple.
This will not work as you expect for SelfValidation validators or class validators, only for the simple property validators like you have there.
Also, if you are using the PropertyProxyValidator in an ASP.NET page, I don't believe it will work either, because it only looks at field validators, not inherited/implemented validators...
Yes, big holes in the VAB if you ask me.
For the sake of completeness, I decided to write a small test to make sure it would work as expected, and it does. I'm just posting it here in case anyone else wants it in the future.
using System;
using Microsoft.Practices.EnterpriseLibrary.Validation;
using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            ISpike spike = new Spike();
            spike.Name = "A really long name that will fail.";
            ValidationResults r = Validation.Validate<ISpike>(spike);
            if (!r.IsValid)
            {
                throw new InvalidOperationException("Validation error found.");
            }
        }
    }

    public class Spike : ConsoleApplication1.ISpike
    {
        public string Name { get; set; }
    }

    interface ISpike
    {
        [StringLengthValidator(2, 5)]
        string Name { get; set; }
    }
}
What version of Enterprise Library are you using for your code example? I tried it using Enterprise Library 5.0, but it didn't work.
I tracked it down to the following section of code within the EL 5.0 source code:
// in namespace Microsoft.Practices.EnterpriseLibrary.Validation, public static class Validation
public static ValidationResults Validate<T>(T target, ValidationSpecificationSource source)
{
    Type targetType = target != null ? target.GetType() : typeof(T);
    Validator validator = ValidationFactory.CreateValidator(targetType, source);
    return validator.Validate(target);
}
If the target object is defined, then target.GetType() will return the most specific class definition, NOT the interface definition.
My workaround is to replace your line:
ValidationResults r = Validation.Validate<ISpike>(spike);
With:
ValidationResults r = ValidationFactory.CreateValidator<ISpike>().Validate(spike);
This got it working for me.
