.NET 5.0 signature change of System.Text.Json.JsonSerializer.Deserialize()

I am attempting the upgrade from .NET Core 3.1 to .NET 5.0 and get a bunch of nullability warnings at the uses of Deserialize<TValue>(String, JsonSerializerOptions). A quick investigation shows that the signature has changed from
public static TValue Deserialize<TValue> (string json, System.Text.Json.JsonSerializerOptions options = default); (doc) in .NET Core 3.1 to
public static TValue? Deserialize<TValue> (string json, System.Text.Json.JsonSerializerOptions? options = default); (doc) in .NET 5.0.
It looks like a reasonable change, but I haven't been able to provoke a null to actually be returned, since all bad input/bad use throws an exception in my experiments, and the documentation does not describe why the call would return null as far as I can tell.
It seems a bit unnecessary to add null return checks to all our uses, if failed deserialization will throw rather than returning null.
What am I missing?

As shown in the original JSON proposal, the text null is perfectly well-formed JSON:
A value can be a string in double quotes, or a number, or true or false or null, or an object or an array. These structures can be nested.
This is further clarified in RFC 8259: The JavaScript Object Notation (JSON) Data Interchange Format which states that a well-formed JSON text need be nothing more than a single primitive value including null:
A JSON text is a sequence of tokens. The set of tokens includes six structural characters, strings, numbers, and three literal names [false, true and null].
A JSON text is a serialized value. Note that certain previous specifications of JSON constrained a JSON text to be an object or an array. Implementations that generate only objects or arrays where a JSON text is called for will be interoperable in the sense that all implementations will accept these as conforming JSON texts.
Since null is a well-formed JSON text according to this most recent JSON RFC, JsonSerializer will not throw when deserializing it to a reference type or nullable value type, and will instead just return a null value:
object? obj1 = JsonSerializer.Deserialize<object>("null"); // Does not throw; explicitly typed for clarity.
Assert.IsNull(obj1); // Passes
var array = JsonSerializer.Deserialize<int[]>("null"); // Does not throw
Assert.IsNull(array); // Passes
var nullable = JsonSerializer.Deserialize<int?>("null"); // Does not throw
Assert.IsNull(nullable); // Passes
Conversely, the following generates a compiler warning:
#nullable enable
object obj2 = JsonSerializer.Deserialize<object>("null"); // Compiler warning: Converting null literal or possible null value to non-nullable type
And the following throws, since an int is a non-nullable value type to which null cannot be assigned:
var i = JsonSerializer.Deserialize<int>("null"); // Throws, since int is a non-nullable value type.
If you want an exception to be thrown when deserializing the JSON text null, you could add the following extension method:
public static class ObjectExtensions
{
    public static T ThrowOnNull<T>(this T? value) where T : class => value ?? throw new ArgumentNullException();
}
And do:
var value = JsonSerializer.Deserialize<TValue>(json).ThrowOnNull();
Demo fiddle here.
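If you also need the same guard for nullable value types (int?, DateTime?, etc.), a struct-constrained overload can sit alongside the one above. This is a sketch of an additional helper, not part of the original demo:
public static T ThrowOnNull<T>(this T? value) where T : struct => value ?? throw new ArgumentNullException(); // overload for Nullable<T>
Used the same way:
var count = JsonSerializer.Deserialize<int?>(json).ThrowOnNull(); // count is a plain int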

Related

Elastic NEST De-serializing the wrong field

Using ElasticSearch.Net v6.0.2
Given the Indexed item
{
    "PurchaseFrequency": 76,
    "purchaseFrequency": 80
}
and the POCO Object
public class Product
{
    public int PurchaseFrequency { get; set; }
}
and the setting
this.DefaultFieldNameInferrer(x => x);
NEST is returning PurchaseFrequency = 80 even though this is the wrong field.
How can I get NEST to pull the correctly cased field from Elasticsearch?
I don't think that this is going to be easily possible because this behaviour is defined in Json.NET, which NEST uses internally (not a direct dependency in 6.x, it's IL-merged into the assembly).
For example,
JsonConvert.DeserializeAnonymousType("{\"a\":1, \"A\":2}", new { a = 0 })
deserializes the anonymous type property a value to 2. But
JsonConvert.DeserializeAnonymousType("{\"A\":2, \"a\":1}", new { a = 0 })
deserializes the anonymous type property a value to 1 i.e. the order of properties as they appear in the returned JSON has a bearing on the final value assigned to a property on an instance of a type.
If you can, avoid JSON property names that differ only in case. If you can't, then you'd need to hook up the JsonNetSerializer in the NEST.JsonSerializer nuget package and write a custom JsonConverter for your type which only honours the exact casing expected.
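A rough sketch of what such a converter might look like, using Json.NET's JsonConverter<T> base class (the NEST.JsonSerializer hook-up is omitted, and the property handling below is only illustrative):
// requires Newtonsoft.Json, Newtonsoft.Json.Linq and System.Linq
public class ProductConverter : JsonConverter<Product>
{
    public override Product ReadJson(JsonReader reader, Type objectType, Product existingValue, bool hasExistingValue, JsonSerializer serializer)
    {
        var obj = JObject.Load(reader);
        var product = new Product();
        // Honour only the exactly cased property name; "purchaseFrequency" is ignored.
        var prop = obj.Properties().FirstOrDefault(p => string.Equals(p.Name, "PurchaseFrequency", StringComparison.Ordinal));
        if (prop != null)
            product.PurchaseFrequency = (int)prop.Value;
        return product;
    }

    public override void WriteJson(JsonWriter writer, Product value, JsonSerializer serializer)
    {
        throw new NotImplementedException(); // only reading is customised here
    }
}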

Handle NullObjects across two models web api

I have two models in my ASP.NET Web API. One is the database model, and the other is the model the end user passes in, and we map the properties like this:
public static IndividualProc ToIndividualInternal(this AssignmentExternal item) {
    return new IndividualProc() {
        IndividualID = (int)item.person.id,
        EventID = (int)item.event.id,
        EventScheduleID = item.schedule.id,
        EventGroupID = item.group.id
    };
}
The problem is that when the user passes a null, I get an exception: "Nullable object must have a value". How can I cast the nullable properties so that this exception is not thrown?
What's nullable? For the purpose of this answer I'm going to assume this is a nullable int:
item.event.id
and this is a regular int:
EventID
In that case, clearly you can't directly cast the former to the latter because there's no way for int to handle the case of a null value. The Nullable<T> structure has properties to help check for that value:
EventID = item.event.id.HasValue ? item.event.id.Value : default(int)
This will check if the nullable int has a value and, if it does, use that value. If it doesn't, use the default value for int (which is 0).
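The null-coalescing operator expresses the same check more compactly, and Nullable<T> also has GetValueOrDefault(); either of these is equivalent to the line above:
EventID = item.event.id ?? default(int)
EventID = item.event.id.GetValueOrDefault()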

Adding the localized Display Name to the constraint violation message?

We are using the new GWT validation library in 2.5.
We are adding an aggregated list of violations to our screen. This list must display the localized field name.
@MyNotNull(foo = "Stage")
public String getStage();
Localized message needs to display
"Stage is a required field"
The message in MyValidationMessages.properties reads
{foo} is a required field
Note that annotations do not allow non-constant values to be assigned to attributes. So we have to get the locale value somehow at design time :/
This will not work:
@MyNotNull(foo = injector.getLocale().errorMessage())
public String errorMessage()
How do I use localeKey to look up the locale in the locale files since the property requires a constant?
The solution is to
Add something like FieldLocale.properties (this is a constants lookup)
Add an attribute to your annotation like localeKey
Iterate your ConstraintViolation collection
Use something like the below to get the attribute value
Look up the localized value in your FieldLocale.properties file
Copy the violation and change the message to the localized version
protected String getAttributeValue(ConstraintViolation violation, String key) {
    ConstraintDescriptor descriptor = violation.getConstraintDescriptor();
    if (descriptor.getAttributes().containsKey(key))
        return (String) descriptor.getAttributes().get(key);
    return null;
}
protected ConstraintViolation<T> copyMessage(ConstraintViolation<T> violation, String message) {
    return ConstraintViolationImpl.<T> builder() //
        .setConstraintDescriptor(violation.getConstraintDescriptor()) //
        .setInvalidValue(violation.getInvalidValue()) //
        .setLeafBean(violation.getLeafBean()) //
        .setMessage(message) //
        .setMessageTemplate(violation.getMessageTemplate()) //
        .setPropertyPath(violation.getPropertyPath()) //
        .setRootBean(violation.getRootBean()) //
        .setRootBeanClass(violation.getRootBeanClass()) //
        .build();
}

XmlReader.ReadContentAsObject always returns string type

According to the MSDN documentation, XmlWriter.WriteValue writes xsd type information to the XML for simple CLR types. Then XmlReader.ReadContentAsObject supposedly reads out the appropriately-typed object when the XML is parsed. However, this always seems to return a string object for me, and the ValueType property of the XmlReader is string. I've tried inserting longs and DateTimes, but they always end up as strings. Any ideas what I'm doing wrong, or is this a Windows Phone bug?
XML Writing Code
public void WriteXml(XmlWriter writer) {
    // KeyValuePair<string, object> pair initialized previously
    writer.WriteStartElement(pair.Key);
    writer.WriteValue(pair.Value);
    writer.WriteEndElement();
}
XML Parsing Code
public void ReadXml(XmlReader reader) {
    while (reader.Read()) {
        if (reader.NodeType == XmlNodeType.Element) {
            Type T = reader.ValueType; // T is string
            reader.ReadStartElement();
            object o = reader.ReadContentAsObject(); // o is string
            o = reader.ReadContentAs(T, null); // o is string
        }
    }
}
You need to use a schema file (XSD) so that the framework can infer the type of a node. Otherwise ValueType will always return System.String.
MSDN says:
If a validation error occurs while parsing the content and the reader is an XmlReader object created by the Create method, the reader returns the content as a string. In other words when a validation error or warning occurs, the content is considered to be untyped.
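On platforms that support schema validation, wiring the schema into the reader would look roughly like this (a sketch; the file names and schema are assumed, not taken from the question):
var settings = new XmlReaderSettings { ValidationType = ValidationType.Schema };
settings.Schemas.Add(null, "types.xsd"); // schema declaring the element as xs:long, xs:dateTime, etc.
using (var reader = XmlReader.Create("data.xml", settings))
{
    reader.MoveToContent();
    reader.ReadStartElement();
    object o = reader.ReadContentAsObject(); // now typed according to the schema
}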
I was making this too difficult. My goal was to serialize a Dictionary<string, object> by traversing its KeyValuePairs, but that class doesn't seem to be serializable using XmlSerializer. I just created another class with two public properties, Key and Value, so I could use XmlSerializer. When deserializing with XmlSerializer, the type of Value is restored as long as it is a supported CLR type.
public void WriteXml(XmlWriter writer) {
    // KeyValuePair<string, object> pair initialized previously
    writer.WriteStartElement("entry");
    MyClass toSerialize = new MyClass(pair.Key, pair.Value);
    XmlSerializer serializer = new XmlSerializer(typeof(MyClass));
    serializer.Serialize(writer, toSerialize);
    writer.WriteEndElement();
}
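For completeness, a sketch of the wrapper class and the matching read side (the names mirror the snippet above; the parameterless constructor is required by XmlSerializer):
public class MyClass
{
    public string Key { get; set; }
    public object Value { get; set; }

    public MyClass() { } // required by XmlSerializer
    public MyClass(string key, object value) { Key = key; Value = value; }
}

public void ReadXml(XmlReader reader) {
    XmlSerializer serializer = new XmlSerializer(typeof(MyClass));
    reader.ReadStartElement("entry");
    MyClass entry = (MyClass)serializer.Deserialize(reader);
    reader.ReadEndElement();
    // entry.Value keeps its original CLR type (e.g. long, DateTime) for supported types
}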

Azure Table Storage, WCF Service and Enum

Here's my problem. A class which defines an order has a property called PaymentStatus, which is an enum defined like so:
public enum PaymentStatuses : int
{
    OnDelivery = 1,
    Paid = 2,
    Processed = 3,
    Cleared = 4
}
And later on, in the class itself, the property definition is very simple:
public PaymentStatuses? PaymentStatus { get; set; }
However, if I try to save an order to the Azure Table Storage, I get the following exception:
System.InvalidOperationException: The type 'Order+PaymentStatuses' has no settable properties.
At this point I thought using enum isn't possible, but a quick Google search returned this: http://social.msdn.microsoft.com/Forums/en-US/windowsazure/thread/7eb1a2ca-6c1b-4440-b40e-012db98ccb0a
This page lists two answers, one of which seems to ignore the problems and suggests that using an enum in Azure Storage is fine.
Now, I don't NEED to store the enum in the Azure Table Storage as such, I could just as well store a corresponding int, however, I do need this property to be exposed in the WCF service.
I've tried making the property use get and set to return the enum from a stored integer, and removing this property from Azure by using the WritingEntity event on my DataContext, but I get that exception before the event for this entity is fired.
At this point, I'm at a loss, I don't know what else I can do to have this property in WCF as an enum, but have Azure store just the int.
Enum is not supported. Even though it is defined like an int, it is really not an integral type supported by Table Storage. Here is the list of types supported. An enum is just a string expression of an integral number with an object-oriented flavor.
You can store int in table storage and then convert it using Enum.Parse.
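For example (a small sketch using the PaymentStatuses enum from the question):
int stored = 2; // the integer persisted in table storage
var status = (PaymentStatuses)stored; // a plain cast works when the number was stored
var parsed = (PaymentStatuses)Enum.Parse(typeof(PaymentStatuses), "Paid"); // Enum.Parse when the name was stored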
Here's a simple workaround:
public int MyEnumValue { get; set; } // for use by the Azure client libraries only
[IgnoreProperty]
public MyEnum MyEnum
{
    get { return (MyEnum)MyEnumValue; }
    set { MyEnumValue = (int)value; }
}
It would have been nicer if a simple backing value could have been employed rather than an additional (public!) property - without the hassle of overriding ReadEntity/WriteEntity of course. I opened a user voice ticket that would facilitate that, so you might want to upvote it.
I was having this same problem.
I changed my property, which was previously an enum, to an int. The int property now parses the incoming int and saves it into a variable of the same enum type, so the code that was
public CompilerOutputTypes Type { get; set; }
is changed to
private CompilerOutputTypes type;
public int Type
{
    get { return (int)type; }
    set { type = (CompilerOutputTypes)value; }
}
Just suggestions...
I remember that in WCF you have to mark enums with special attributes: http://msdn.microsoft.com/en-us/library/aa347875.aspx
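Applied to the enum from the question, that would look something like this (a sketch based on the linked documentation; requires System.Runtime.Serialization):
[DataContract]
public enum PaymentStatuses : int
{
    [EnumMember] OnDelivery = 1,
    [EnumMember] Paid = 2,
    [EnumMember] Processed = 3,
    [EnumMember] Cleared = 4
}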
Also, when you declare PaymentStatuses? PaymentStatus, you are declaring Nullable<PaymentStatuses> PaymentStatus. The ? syntax is just syntactic sugar. Try to remove the ? and see what happens (you could add a PaymentStatuses.NoSet = 0, because the default value for an Int32 is 0).
Good luck.
Parv's solution put me on the right track, but I made some minor adjustments.
private string _EnumType;
private EnumType _Type;
//*********************************************
//*********************************************
public string EnumType
{
    get { return _Type.ToString(); }
    set
    {
        _EnumType = value;
        try
        {
            _Type = (EnumType)Enum.Parse(typeof(EnumType), value);
        }
        catch (Exception)
        {
            _EnumType = "Undefined";
            _Type = [mynamespace].EnumType.Undefined;
        }
    }
}
I have come across a similar problem and have implemented a generic object flattener/recomposer API that will flatten your complex entities into flat EntityProperty dictionaries and make them writeable to Table Storage, in the form of DynamicTableEntity.
The same API will then recompose the entire complex object from the EntityProperty dictionary of the DynamicTableEntity.
This is relevant to your question because the ObjectFlattenerRecomposer API supports flattening property types that are normally not writeable to Azure Table Storage like Enum, TimeSpan, all Nullable types, ulong and uint by converting them into writeable EntityProperties.
The API also handles the conversion back to the original complex object from the flattened EntityProperty Dictionary. All that the client needs to do is to tell the API, I have this EntityProperty Dictionary that I just read from Azure Table (in the form of DynamicTableEntity.Properties), can you convert it to an object of this specific type. The API will recompose the full complex object with all of its properties including 'Enum' properties with their original correct values.
All of this flattening and recomposing of the original object is done transparently to the client (user of the API). Client does not need to provide any schema or any knowledge to the ObjectFlattenerRecomposer API about the complex object that it wants to write, it just passes the object to the API as 'object' to flatten it. When converting it back, the client only needs to provide the actual type of object it wants the flattened EntityProperty Dictionary to be converted to. The generic ConvertBack method of the API will simply recompose the original object of Type T and return it to the client.
See the usage example below. The objects do not need to implement any interface like 'ITableEntity' or inherit from a particular base class either. They do not need to provide a special set of constructors.
Blog: https://doguarslan.wordpress.com/2016/02/03/writing-complex-objects-to-azure-table-storage/
Nuget Package: https://www.nuget.org/packages/ObjectFlattenerRecomposer/
Usage:
// Flatten the object (e.g. of type Order) and convert it to an EntityProperty dictionary
Dictionary<string, EntityProperty> flattenedProperties = EntityPropertyConverter.Flatten(order);
// Create a DynamicTableEntity and set its PK and RK
DynamicTableEntity dynamicTableEntity = new DynamicTableEntity(partitionKey, rowKey);
dynamicTableEntity.Properties = flattenedProperties;
// Write the DynamicTableEntity to Azure Table Storage using the client SDK
// Read the entity back from Azure Table Storage as a DynamicTableEntity using the same PK and RK
DynamicTableEntity entity = [Read from Azure using the PK and RK];
// Convert the DynamicTableEntity back to the original complex object
Order order = EntityPropertyConverter.ConvertBack<Order>(entity.Properties);
