How to ignore unknown enum values? - graphql

I'm wondering what the best way would be to ignore/discard unknown enum values in GraphQL/Apollo Server.
Let's say my GraphQL schema defines an array of enums, "enum Service { Supermarket, TicketSales }", and it works fine now. But later on, another service I'm using adds a new value (e.g. Playground) that my client just doesn't support, and I would like to ignore it and return only the supported values without an error.
What would be the best way to do this in GraphQL? My first idea was to make a directive that would read the supported values from the schema and ignore everything else, but after googling around I didn't find any good examples of how to do it. Can you point me in a direction to go about this?

If your resolver function will accept arbitrary strings, then you can use a custom scalar type, or just String.
"""
The type of a service. `Supermarket` means..., and
`TicketSales` means...; any other value is ignored.
"""
scalar Service
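A minimal sketch of what that scalar might look like in graphql-js (the pass-through behavior here is an assumption; tighten it to whatever your clients actually tolerate):

import { GraphQLScalarType, Kind } from 'graphql';

export const ServiceScalar = new GraphQLScalarType({
  name: 'Service',
  description: 'The type of a service; unknown values pass through as strings.',
  // Outgoing values: pass anything through instead of failing on unknowns.
  serialize: (value) => String(value),
  // Incoming values supplied via variables.
  parseValue: (value) => String(value),
  // Incoming values written inline in the query document.
  parseLiteral: (ast) => (ast.kind === Kind.STRING ? ast.value : undefined),
});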
GraphQL generally places responsibility on the client to conform to the server's expectations, rather than making the server try to support any request. There are a couple of places you can reasonably expect an enum value like this to appear:
enum Service { Supermarket, TicketSales }

type Query {
  inAReturnValue: Service!
  asAQueryParam(service: Service!): Node
}

type Mutation {
  asAMutationInput(service: Service!): Node
}
In particular it may not make sense to tell the server "make the type of this object be a playground" if the server just doesn't understand that. Conversely, if the server knows about "playground", it could return it in cases the client may not expect. Having an enum here makes it explicit what the server knows about. The server has said what it supports and it's the client's responsibility to cooperate.
Note that the client can find out whether the server supports playgrounds, if it's an enum value, and this might help inform its behavior.
query GetServiceTypes {
  __type(name: "Service") {
    enumValues { name }
  }
}

After playing around I found something I can use to get around my original problem, so I will post it here in case somebody else is wondering the same thing.
In short, my original problem was that I'm receiving several different "available services" kinds of string arrays from other services, and I was thinking of mapping them to an enum for better TypeScript support etc. But the problem was that if I get some unknown value from another service, my GraphQL query will fail.
So my original idea was to fix it with a directive, which I eventually got working:
# In schema
directive @mapUnknownTo(value: String) on ENUM

enum SomeAttribute @mapUnknownTo(value: "__UNKNOWN__") {
  SomeAttribute1
  AnotherAttribute
  SomethingElse
  __UNKNOWN__
}
And the directive implementation is:
import { SchemaDirectiveVisitor } from 'graphql-tools';
import { GraphQLEnumType } from 'graphql';

export class MapUnknownToDirective extends SchemaDirectiveVisitor {
  visitEnum(type: GraphQLEnumType) {
    const { value = '__UNKNOWN__' } = this.args;
    // Index the values the schema actually declares.
    const valueMap = type.getValues().reduce(
      (map, v) => map.set(v.value, v.name),
      new Map<string, string>(),
    );
    // Serialize declared values normally; map anything else to the fallback.
    type.serialize = (v: string): string => valueMap.get(v) || value;
  }
}
So this will map all values not defined in the schema to some custom value, which is not exactly what I originally wanted, but at least it doesn't produce an error, so it's okay-ish.
I'm still not 100% sure directives are the way to go in cases like this, but at least it's one possible solution.
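For completeness, here is a sketch of how such a directive can be wired up with Apollo Server 2 (the import path and the typeDefs/resolvers variables are assumptions; the schemaDirectives key must match the directive name used in the schema):

import { ApolloServer } from 'apollo-server';
import { MapUnknownToDirective } from './map-unknown-to-directive';

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    // Applies MapUnknownToDirective wherever @mapUnknownTo appears in the schema.
    mapUnknownTo: MapUnknownToDirective,
  },
});

server.listen();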

Related

Is there a known way to add new syntax features to Protobuf?

Protobuf provides the service keyword, which defines the RPC interface of an application.
I also want to use the concept of an entity, which is a part of a service (one service contains multiple entities). Each entity type has its own unique identifier, which makes it possible to address different entities in a service.
I would like to use a proto like this:
message UserReq {
  string username = 1;
  string password = 2;
}

message RegReq {
  uint8 result_code = 1;
}

message RemoteEntityInterface {
  MyEntity entity = 1;
}

message GiveItemResult {
  uint8 result_code = 1;
}

service MyService {
  rpc RegisterUser (UserReq) returns (RegReq) {}
  rpc Login(UserReq) returns (RemoteEntityInterface) {}
}

entity MyEntity {
  rpc GiveItem (GiveItemReq) returns (GiveItemResult) {}
}
As you can see in the example, I used the unknown protobuf keyword entity; it means that MyService can return an interface to some remote object (MyEntity) through the Login remote method.
What are the ways to do this (maybe writing a plugin, or some known way of modifying protobuf's source code)? Or are there more flexible solutions than protobuf?
I would also like to use multiple parameters per rpc; to add Java-like attributes to rpc, service, and entity; and to define a data model for an entity (variables/fields) to add real-time replication support from an entity to another service.
I think this would be very flexible for services in game development.
The only official way to extend .proto syntax is to define custom options.
For example, you could have something like:
import "google/protobuf/descriptor.proto"; // required in order to extend the built-in options

extend google.protobuf.ServiceOptions {
  optional bool is_entity = 123456;
}

service MyEntity {
  option (is_entity) = true;
  rpc GiveItem (GiveItemReq) returns (GiveItemResult) {}
}
The default code generator will not do anything special with this option, but you can access it from your own code and from a protoc plugin if you write one.
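For illustration, here is roughly how that option could be read back via reflection in Python (assuming the file above compiles to my_service_pb2; the module name is an assumption):

from my_service_pb2 import DESCRIPTOR, is_entity

# Fetch the service descriptor and read the custom option off it.
service = DESCRIPTOR.services_by_name['MyEntity']
print(service.GetOptions().Extensions[is_entity])  # True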

Passing default/static values from server to client

I have an input type with two fields used for filtering a query on the client.
I want to pass the default values (rentIntervalLow + rentIntervalHigh) from server to the client, but don't know how to do it.
Below is my current code. I've come up with two naïve solutions:
Letting the client introspect the whole schema.
Having a global config object and creating a queryable Config type with a resolver that returns the config object's values.
Are there any better suggestions than the above for how to make default/config values on the server accessible to the client?
// schema.js
const typeDefs = gql`
  input FilteringOptions {
    rentIntervalLow: Int = 4000
    rentIntervalHigh: Int = 10000
  }

  type Home {
    id: Int
    roomCount: Int
    rent: Int
  }

  type Query {
    allHomes(first: Int, cursor: Int, input: FilteringOptions): [Home]
  }
`

export default typeDefs
I'm using Apollo Server 2.8.1 and Apollo React 3.0.
It's unnecessary to introspect the whole schema to get information about a particular type. You can just write a query like:
query {
  __type(name: "FilteringOptions") {
    inputFields {
      name
      description
      defaultValue
    }
  }
}
Default values are values that will be used when a particular input value is omitted from the query. So to utilize the defaults, the client would pass an empty object to the input argument of the allHomes field. You could also give input a default value of {}, which would allow the client not to provide the input argument at all, while still relaying the min and max default values to the resolver.
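For example, the default for the argument itself can be declared directly in the schema (a one-line variation on the Query type above):

type Query {
  allHomes(first: Int, cursor: Int, input: FilteringOptions = {}): [Home]
}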
If, however, your intent is to provide the minimum and maximum values to your client in order to drive some client-specific logic (like validation, drop down menu values, etc.), then you should not utilize default values for this. Instead, this information should be queried directly by the client, using, for example, a Config type like you suggested.
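A sketch of that approach (the type and field names are illustrative):

// schema.js
const typeDefs = gql`
  type Config {
    rentIntervalLow: Int!
    rentIntervalHigh: Int!
  }

  type Query {
    config: Config!
  }
`

// resolvers.js
const config = { rentIntervalLow: 4000, rentIntervalHigh: 10000 };

const resolvers = {
  Query: {
    // Expose the server's config object so clients can query the defaults directly.
    config: () => config,
  },
};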

protobuf-net concurrent performance issue in TakeLock

We're using protobuf-net for sending log messages between services. When profiling a stress test under high concurrency, we see very high CPU usage, and TakeLock in RuntimeTypeModel is the culprit. The hot call stack looks something like:
*Our code...*
ProtoBuf.Serializer.SerializeWithLengthPrefix(class System.IO.Stream,!!0,valuetype ProtoBuf.PrefixStyle)
ProtoBuf.Serializer.SerializeWithLengthPrefix(class System.IO.Stream,!!0,valuetype ProtoBuf.PrefixStyle,int32)
ProtoBuf.Meta.TypeModel.SerializeWithLengthPrefix(class System.IO.Stream,object,class System.Type,valuetype ProtoBuf.PrefixStyle,int32)
ProtoBuf.Meta.TypeModel.SerializeWithLengthPrefix(class System.IO.Stream,object,class System.Type,valuetype ProtoBuf.PrefixStyle,int32,class ProtoBuf.SerializationContext)
ProtoBuf.ProtoWriter.WriteObject(object,int32,class ProtoBuf.ProtoWriter,valuetype ProtoBuf.PrefixStyle,int32)
ProtoBuf.BclHelpers.WriteNetObject(object,class ProtoBuf.ProtoWriter,int32,valuetype ProtoBuf.BclHelpers/NetObjectOptions)
ProtoBuf.Meta.TypeModel.GetKey(class System.Type&)
ProtoBuf.Meta.RuntimeTypeModel.GetKey(class System.Type,bool,bool)
ProtoBuf.Meta.RuntimeTypeModel.FindOrAddAuto(class System.Type,bool,bool,bool)
ProtoBuf.Meta.RuntimeTypeModel.TakeLock(int32&)
[clr.dll]
I see that we can use the new precompiler to get a speed boost, but I'm wondering whether that will get rid of the issue (it sounds like it doesn't use reflection); it would be a bit of work for me to integrate, so I haven't tested it yet. I also see the option to call Serializer.PrepareSerializer. My initial (small-scale) testing didn't make the prepare approach seem promising.
A little more info about the type we're serializing:
[ProtoContract]
public class SomeMessage
{
    [ProtoMember(1)]
    public SomeEnumType SomeEnum { get; set; }

    [ProtoMember(2)]
    public long SomeId { get; set; }

    [ProtoMember(3)]
    public string SomeString { get; set; }

    [ProtoMember(4)]
    public DateTime SomeDate { get; set; }

    [ProtoMember(5, DynamicType = true, OverwriteList = true)]
    public Collection<object> SomeArguments { get; set; }
}
Thanks for your help!
UPDATE 9/17
Thanks for your response! We're going to try the workaround you suggest and see if that fixes things.
This code lives in our logging system, so in the SomeMessage example, SomeString is really a format string (e.g. "Hello {0}") and the SomeArguments collection is a list of objects used to fill in the format string, just like String.Format. Before we serialize, we look at each argument and call DynamicSerializer.IsKnownType(argument.GetType()); if the type isn't known, we convert the argument to a string first. I haven't looked at the ratios of the data, but I'm pretty sure we have a lot of different strings coming in as arguments.
Let me know if this helps. If you need, I'll try to get more details.
TakeLock is only used when the model is changing, for example because it is seeing a type for the first time. You shouldn't normally see TakeLock after the first time a particular type has been used. In most cases, calling Serializer.PrepareSerializer<SomeMessage>() should perform all the necessary initialization (and similarly for any other contracts you are using).
However! I wonder if perhaps this is also related to your use of DynamicType; what are the actual objects being used here? It might be that I need to tweak the logic here, so that it doesn't spend any time on that step. If you let me know the actual objects (so I can repro), I will try to run some tests.
As for whether the precompiler would change this: yes, it would. A fully compiled static model has a completely different implementation of the ProtoBuf.Meta.TypeModel.GetKey method, so it would never call TakeLock (you don't need to protect a model that can never change!). But you can actually do something very similar without needing to use precompile. Consider the following, run as part of your app's initialization:
static readonly TypeModel serializer;
...
var model = TypeModel.Create();
model.Add(typeof(SomeMessage), true);
// TODO add other contracts you use here
serializer = model.Compile();
This will create a fully static-compiled serializer assembly in memory (instead of a mutable model with individual operations compiled). If you now use serializer.Serialize(...) instead of Serializer.Serialize (i.e. the instance method on your stored TypeModel rather than the static method on Serializer), then it will essentially be doing something very similar to the precompiler, but without the need to actually precompile it (obviously this will only be available on "full" .NET). This will then never call TakeLock, as it is running a fixed model rather than a flexible model. It does, however, require you to know which contract types you use. You could use reflection to find these, by looking for all the types with a given attribute:
static readonly TypeModel serializer;
...
var model = TypeModel.Create();
Type attributeType = typeof(ProtoContractAttribute);
foreach (var type in typeof(SomeMessage).Assembly.GetTypes()) {
    if (Attribute.IsDefined(type, attributeType)) {
        model.Add(type, true);
    }
}
serializer = model.Compile();
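To make the switch concrete, a sketch of the call site against the stored model (the stream and message variables are illustrative; the overload matches the TypeModel.SerializeWithLengthPrefix signature visible in the stack trace above):

using (var stream = new MemoryStream())
{
    // Instance method on the fixed, compiled model: FindOrAddAuto/TakeLock never run.
    serializer.SerializeWithLengthPrefix(stream, message, typeof(SomeMessage), PrefixStyle.Base128, 0);
}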
But emphasis: the above is a workaround; it sounds like there's a glitch, which I'll happily investigate if I can see an example where it actually happens; most importantly: what are the objects in SomeArguments?

MongoDB - override default Serializer for a C# primitive type

I'd like to change the representation of C# doubles to rounded Int64 values, with a four-decimal-place shift, in the MongoDB C# driver's serialization stack. In other words, store (Double)29.99 as (Int64)299900.
I'd like this to be transparent to my app. I've had a look at custom serializers, but I don't want to override everything and then switch on the type with a fallback to the default, as that's a bit messy.
I can see that RegisterSerializer() won't let me add one for an existing type, and that BsonDefaultSerializationProvider has a static list of primitive serializers and is marked internal with private members, so I can't easily subclass it.
I can also see that it's possible to RepresentAs Int64 for doubles, but this is a cast, not a conversion. I essentially need a cast AND a conversion in both serialization directions.
I wish I could just give the default serializer a custom serializer to override one of its own, but that would mean a dirty hack.
Am I missing a really easy way?
You can definitely do this, you just have to get the timing right. When the driver starts up there are no serializers registered. When it needs a serializer, it looks it up in the dictionary where it keeps track of the serializers it knows about (i.e. the ones that have been registered). Only if it can't find one in the dictionary does it start figuring out where to get one (including calling the serialization providers), and if it finds one it registers it.
The limitation in RegisterSerializer is there so that you can't replace an existing serializer that has already been used. But that doesn't mean you can't register your own if you do it early enough.
However, keep in mind that registering a serializer is a global operation, so if you register a custom serializer for double it will be used for all doubles, which could lead to unexpected results!
Anyway, you could write the custom serializer something like this:
public class CustomDoubleSerializer : BsonBaseSerializer
{
    public override object Deserialize(BsonReader bsonReader, Type nominalType, Type actualType, IBsonSerializationOptions options)
    {
        // Shift the stored integer back four decimal places: 299900 -> 29.99
        var rep = bsonReader.ReadInt64();
        return rep / 10000.0;
    }

    public override void Serialize(BsonWriter bsonWriter, Type nominalType, object value, IBsonSerializationOptions options)
    {
        // Round and shift four decimal places: 29.99 -> 299900
        var rep = (long)Math.Round((double)value * 10000);
        bsonWriter.WriteInt64(rep);
    }
}
And register it like this:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());
You could test it using the following class:
public class C
{
    public int Id;
    public double X;
}
and this code:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());

var c = new C { Id = 1, X = 29.99 };
var json = c.ToJson();
Console.WriteLine(json);

var r = BsonSerializer.Deserialize<C>(json);
Console.WriteLine(r.X);
You can also use your own serialization provider to tell Mongo which serializer to use for certain types, which I ended up doing to mitigate some of the timing issues mentioned above when trying to override existing serializers. Here's an example of a serialization provider that overrides how decimals are serialized:
public class CustomSerializationProvider : IBsonSerializationProvider
{
    public IBsonSerializer GetSerializer(Type type)
    {
        if (type == typeof(decimal)) return new DecimalSerializer(BsonType.Decimal128);
        return null; // fall back to Mongo's defaults
    }
}
If you return null from your custom serialization provider, it will fall back to using Mongo's default serialization provider.
Once you've written your provider, you just need to register it:
BsonSerializer.RegisterSerializationProvider(new CustomSerializationProvider());
I looked through the latest iteration of the driver's code and checked whether there's some sort of backdoor for setting custom serializers. I'm afraid there's none; you should open an issue in the project's bug tracker if you think this needs to be looked at for future iterations of the driver (https://jira.mongodb.org/).
Personally, I'd open a ticket -- and if a quick workaround is necessary or required, I'd subclass DoubleSerializer, implement the new behavior, and then use Reflection to inject it into either MongoDB.Bson.Serialization.Serializers.DoubleSerializer.__instance or MongoDB.Bson.Serialization.BsonDefaultSerializationProvider.__serializers.

Separation of Concerns: Returning Projected Data between layers From a Linq Query

I'm using LINQ and having trouble doing something that I believe should be trivial. I want to return data from one layer so it can be used independently of LINQ in another layer.
Suppose I have a Data Access Layer. It knows about the Entity Framework and how to interact with it, but it doesn't care who accesses it. The one interesting requirement I have is that the queries in the Entity Framework return projected data that is not part of the entity model itself. Please don't ask me to change this part of the requirement and make POCOs for each return type, as that is not the best design given the problem I am trying to solve. Below is an example.
public class ChartData
{
    public <<returnType??>> GetData()
    {
        MyEntities context = new MyEntities();
        var results = from v in context.vManyColumnsOfData
                      where v.CompanyName == "acme"
                      select new { Year = v.SalesYear, Income = v.Income };
        return ??;
    }
}
Then I would like an ASP.NET UI layer to be able to call into the Data Access Layer to get the data in order to bind it to a control. The UI layer should have no notion of where the data came from; it should only know that it has the data it needs to bind. Below is an example.
protected void chart_Load(object sender, EventArgs e)
{
    // set some chart properties
    chart.Skin = "Default";
    ...

    // Set the data source
    ChartData dataMgr = new ChartData();
    <<returnType?>> data = dataMgr.GetData();
    chart.DataSource = data;
    chart.DataBind();
}
What is the best way to send linq projected data back to another layer?
If you don't need to use the projected type statically, just return IEnumerable<object>.
"Please don't ask me to change this part of the requirement and make POCOs for each return type, as it is not the best design given the problem I am trying to solve."
I feel like I should rightly ignore this, as the best thing to do is to return a defined type. Anonymous types are useful when they are wholly contained within the method that creates them. Once you start passing them around, it is time to go ahead and give them the proper class treatment.
However, to live within your imposed limitations, you can return IEnumerable<object> from the method and use that or var at the call site, relying on the dynamic binding of the control to get at the data. It won't help if you need to deal with the objects programmatically, but it will serve fine for data binding.
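For contrast, a minimal sketch of the "defined type" approach recommended above (ChartPoint is an illustrative name; MyEntities and vManyColumnsOfData come from the question, and the int field types are assumptions):

using System.Collections.Generic;
using System.Linq;

public class ChartPoint
{
    public int Year { get; set; }
    public int Income { get; set; }
}

public class ChartData
{
    public IEnumerable<ChartPoint> GetData()
    {
        var context = new MyEntities();
        // Project into the named type instead of an anonymous one,
        // so callers in other layers get a statically typed result.
        return (from v in context.vManyColumnsOfData
                where v.CompanyName == "acme"
                select new ChartPoint { Year = v.SalesYear, Income = v.Income }).ToList();
    }
}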
You cannot return an anonymous type, so basically for this you will need POCOs even though you don't want them.
"not the best design given the problem I am trying to solve"
Could you explain what you are trying to achieve a little more? It might be possible to return some type of list containing a dictionary of items (i.e. rows and columns); think something like an untyped dataset (yuck).
Your GetData method can use IEnumerable (the "old" non-generic interface) as its return type.
Any dynamic resolution (e.g. ASP.NET or XAML bindings) should work as expected, which seems to be what you want to do.
However, if you want to use the results in your code, you will probably have to resort to .NET 4's dynamic keyword.
The following example can be run in LINQPad (in "C# Program" mode) and illustrates this:
void Main()
{
    var v = GetData();
    foreach (dynamic element in v)
    {
        ((string)element.Name).Dump();
    }
}

public IEnumerable GetData()
{
    return from i in Enumerable.Range(1, 10)
           select new
           {
               Name = "Item " + i,
               Value = i
           };
}
Keep in mind that, design-wise, coding like this will make most people frown and can affect performance.
