I have been using Gson to serialize objects and send JSON data to the front-end. It has always worked as expected in the past.
Now I have a class. It has an object field called "destination". The owning class is serialized correctly, but the object field does not get serialized, and furthermore, its JSON data has a mysterious field named "handler" (the class for "destination" does not have this "handler" field).
Here is the JSON data (it shows the idea):
{
    name: my name (this gets serialized)
    destination: {
        seq: 0 (this does not get serialized. 0 is the default)
        validation: true (this does not get serialized. true is the default)
        (don't know why this handler is here)
        handler: {
            Object { interfaces=[1], constructed=true, persistentClass={...}, more...}
            constructed: true
            entityName: "abc.MyClass"
            initialized: true
            interfaces: [Object {}]
            overridesEquals: false
            persistentClass: Object {}
            readOnly: false
            specjLazyLoad: false
            target: Object { validation=false, seq=2, more...} (this object contains all the actual values)
            unwrap: false
        }
    }
}
Interestingly, the "handler" has a property called "target", which has all the serialized data.
Here is my Java code:
Gson gson = new GsonBuilder().create();
String json = gson.toJson(myData);
Has anyone ever seen this "handler" situation? Could something I'm doing be wrong?
I use Gson 1.7.1.
Thanks!
Gson serializes only fields (private, public, or protected) via the Java Reflection API.
(don't know why this handler is here)
handler: { ... }
Gson is not doing any magic during serialization to add an extra handler field to the JSON. If it is coming through in the generated JSON, it means you have a handler property somewhere inside your destination class, so check your code again. If you find it, you can exclude it using the @Expose annotation:
@Expose(serialize = false) // don't serialize handler field during serialization process
private Handler handler;
Interestingly, the "handler" has a property called "target", which has all the serialized data.
Nothing interesting there; it means you have a handler property inside your destination object, which is why you are seeing it in the generated JSON.
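If the field really does exist on the class behind "destination" and you want Gson to skip it, a minimal sketch could look like the following (the Destination class, its field types, and the main method are illustrative, not taken from your code). Note that @Expose is only honored when excludeFieldsWithoutExposeAnnotation() is enabled on the builder:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.annotations.Expose;

public class GsonExposeSketch {

    // Illustrative stand-in for the real class behind "destination".
    static class Destination {
        @Expose int seq = 0;
        @Expose boolean validation = true;
        @Expose(serialize = false) Object handler; // excluded when writing JSON
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .excludeFieldsWithoutExposeAnnotation() // @Expose is only honored with this setting
                .create();
        System.out.println(gson.toJson(new Destination())); // {"seq":0,"validation":true}
    }
}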
Related
My goal is to generate the JSON configuration of an ApexCharts chart ( https://apexcharts.com/javascript-chart-demos/bar-charts/custom-datalabels/ ) in Java with Gson.
The configuration contains a property "formatter" that has a JavaScript function as a value:
formatter: function (val, opt) {
return opt.w.globals.labels[opt.dataPointIndex] + ": " + val
},
When I add a property to a JsonObject like this: jsonDataLabels.addProperty("formatter", "(val, opt) {...}"); then the property value in the output is (as expected) a String (with quotes), and ApexCharts doesn't interpret it.
How can I add an unquoted JavaScript function into a GSON JsonObject?
How can I add an unquoted JavaScript function into a GSON JsonObject?
The short answer is you can't; Gson is a JSON library and what you are creating is not JSON but a JavaScript object.
However, you might be able to achieve this with Gson's JsonWriter.jsonValue method. That method is intended to write a raw JSON value, but you could misuse it to write your JavaScript function value:
JsonWriter jsonWriter = new JsonWriter(...);
jsonWriter.beginObject();
jsonWriter.name("formatter");
jsonWriter.jsonValue("function (val, opt) { ... }");
jsonWriter.endObject();
But be careful because jsonValue does not check the syntax of the provided value, so if you provide a malformed value, the resulting JSON (or in your case JavaScript object) might be malformed.
Maybe there are JavaScript code writer libraries out there though, which would be better suited for your use case.
A workaround (not my preferred approach; see the sketch after this list) is ...
to write a unique placeholder into the JSON property,
export the JSON document into a String
replace the unique placeholder within the String
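A rough sketch of that placeholder approach with Gson (the placeholder string, class name, and variable names are made up for illustration):
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonObject;

public class FormatterPlaceholderSketch {
    public static void main(String[] args) {
        // 1. Write a unique placeholder where the function should go.
        String placeholder = "__FORMATTER_PLACEHOLDER__";
        JsonObject jsonDataLabels = new JsonObject();
        jsonDataLabels.addProperty("formatter", placeholder);

        // 2. Export the JSON document into a String.
        Gson gson = new GsonBuilder().setPrettyPrinting().create();
        String json = gson.toJson(jsonDataLabels);

        // 3. Replace the quoted placeholder with the raw JavaScript function.
        String config = json.replace("\"" + placeholder + "\"",
                "function (val, opt) { return opt.w.globals.labels[opt.dataPointIndex] + \": \" + val }");
        System.out.println(config);
    }
}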
I'm designing a gRPC service written in Go.
In front of the gRPC service is Envoy which converts incoming HTTP requests to gRPC and converts the gRPC responses to JSON.
The requirement of this application is to have an endpoint that returns the following JSON object:
{
my_id: "AAA"
}
I can model this response pretty simply in my proto file like this:
// A MyResponse object.
message MyResponse {
    // my_id is the ID of the response.
    string my_id = 1;
}
But the requirement that I have is that sometimes my_id might be null. In that case, I want to get the following JSON back:
{
my_id: null
}
Is it possible to modify MyResponse such that my_id can be a string or a null in the JSON object that is returned? If so, how? If not, isn't this a pretty big gap in the design of gRPC?
I suggest you use the StringValue type from the google.protobuf package:
StringValue Wrapper message for string.
The JSON representation for StringValue is JSON string.
So in your proto files, you should import:
import "google/protobuf/wrappers.proto";
then use as example:
google.protobuf.StringValue name = 2;
To handle the values, you can check the wrappers.StringValue type of the github.com/golang/protobuf/ptypes/wrappers package and the helpers of the google.golang.org/protobuf/types/known/wrapperspb package.
TL;DR: I was trying to test DTO validation in the controller spec instead of in e2e specs, which are precisely crafted for that. McDoniel's answer pointed me in the right direction.
I develop a NestJS entrypoint, looking like that:
@Post()
async doStuff(@Body() dto: MyDto): Promise<string> {
// some code...
}
I use class-validator so that when my API receives a request, the payload is parsed and turned into a MyDto object, and the validations present as annotations in the MyDto class are performed. Note that MyDto has an array of nested objects of class MySubDto. With the @ValidateNested and @Type annotations, the nested objects are also validated correctly.
This works great.
Now I want to write tests for the performed validations. In my .spec file, I write:
import { validate } from 'class-validator';
// ...
it('should FAIL on invalid DTO', async () => {
  const dto = {
    //...
  };
  const errors = await validate( dto );
  expect(errors.length).not.toBe(0);
})
This fails because the validated dto object is not a MyDto. I can rewrite the test as such:
it('should FAIL on invalid DTO', async () => {
  const dto = new MyDto();
  dto.attribute1 = 1;
  dto.subDto = { 'name':'Vincent' };
  const errors = await validate( dto );
  expect(errors.length).not.toBe(0);
})
Validations are now properly performed on the MyDto object, but not on my nested subDto object, which means I would have to instantiate all the nested objects of my DTO with their corresponding classes, which would be quite inefficient. Also, instantiating the classes means that TypeScript will raise errors if I voluntarily omit some required properties or provide incorrect values.
So the question is:
How can I use NestJs built-in request body parser in my tests, so that I can write any JSON I want for dto, parse it as a MyDto object and validate it with class-validator validate function?
Any alternate better-practice ways to tests validations are welcome too!
Although we should test how our validation DTOs work with ValidationPipe, that's a form of integration or e2e testing. Unit tests are unit tests, right?! Every unit should be testable independently.
The DTOs in Nest.js are perfectly unit-testable. It becomes necessary to unit-test the DTOs when they contain complex regular expressions or sanitization logic.
Creating an object of the DTO for test
The request body parser in Nest.js that you are looking for is the class-transformer package. It has a function plainToInstance() to turn your literal or JSON object into an object of the specified type. In your example the specified type is the type of your DTO:
const myDtoObject = plainToInstance(MyDto, myBodyObject)
Here, myBodyObject is your plain object that you created for test, like:
const myBodyObject = { attribute1: 1, subDto: { name: 'Vincent' } }
The plainToInstance() function also applies all the transformations that you have in your DTO. If you just want to test the transformations, you can assert after this statement. You don't have to call the validate() function to test the transformations.
Validating the object of the DTO in test
To the emulate validation of Nest.js, simply pass the myDtoObject to the validate() function of the class-validator package:
const errors = await validate(myDtoObject)
Also, if your DTO or SubDTO object is too big or too complex to create, you have the option to skip the remaining properties or subObjects like your subDto:
const errors = await validate(myDtoObject, { skipMissingProperties: true })
Now your test object could be without the subDto, like:
const myBodyObject = { attribute1: 1 }
Asserting the errors
Apart from asserting that the errors array is not empty, I also like to specify a custom error message for each validation in the DTO:
@IsPositive({ message: `Attribute1 must be a positive number.` })
readonly attribute1: number
One advantage of a custom error message is that we can write it in a user-friendly way instead of the generic messages created by the library. Another big advantage is that I can assert this error message in my tests. This way I can be sure that the errors array is not empty because it contains the error for this particular validation and not something else:
expect(stringified(errors)).toContain(`Attribute1 must be a positive number.`)
Here, stringified() is a simple utility function to convert the errors object to a JSON string, so we can search our error message in it:
export function stringified(errors: ValidationError[]): string {
return JSON.stringify(errors)
}
Your final test code
Instead of the controller.spec.ts file, create a new file specific to your DTO, like my-dto.spec.ts for unit tests of your DTO. A DTO can have plenty of unit tests and they should not be mixed with the controller's tests:
it('should fail on invalid DTO', async () => {
  const myBodyObject = { attribute1: -1, subDto: { name: 'Vincent' } }
  const myDtoObject = plainToInstance(MyDto, myBodyObject)
  const errors = await validate(myDtoObject)
  expect(errors.length).not.toBe(0)
  expect(stringified(errors)).toContain(`Attribute1 must be a positive number.`)
})
Notice how you don't have to assign the values to the properties one by one for creating the myDtoObject. In most cases, the properties of your DTOs should be marked readonly. So, you can't assign the values one by one. The plainToInstance() to the rescue!
That's it! You were almost there, unit testing your DTO. Good efforts! Hope that helps now.
To test input validation with the validation pipes, I think it is agreed that the best place to do this is in e2e tests rather than in unit tests. Just make sure that you remember to register your pipes (if you normally use app.useGlobalPipes() instead of relying on dependency injection).
Using Java 8 in the Eclipse AWS SDK, I've created and uploaded a Lambda function that is hooked in upon fulfillment of my Lex intent.
Lambda has no problem receiving and parsing the JSON request.
Then I format a simple "Close" dialogAction response, send it back to Lex, and receive the following error from the Test Bot page in the Lex console:
An error has occurred: Received invalid response from Lambda:
Can not construct instance of IntentResponse:
no String-argument constructor/factory method to deserialize
from String value
('{"dialogAction
{"type":"Close","fulfillmentState":"Fulfilled","message":
{"contentType":"PlainText","content":"Thanks I got your info"}}}')
at [Source: "{\"dialogAction\":
{\"type\":\"Close\",\"fulfillmentState\":\"Fulfilled\",\"message\":
{\"contentType\":\"PlainText\",\"content\":\"Thanks I got your
info\"}}}";line: 1, column: 1]
It seems to have a problem right away with the format (line 1, column 1), but my JSON string looks OK to me. Before returning the output string in the handleRequest Java function, I am writing it to the CloudWatch log and it writes as follows:
{
    "dialogAction": {
        "type": "Close",
        "fulfillmentState": "Fulfilled",
        "message": {
            "contentType": "PlainText",
            "content": "Thanks I got your info"
        }
    }
}
Things I've tried:
Removing the message element as it's not required
Adding in non-required properties like sessionAttributes, responseCard, etc.
Removing the double quotes
Replacing double quotes with single quotes
Hardcoding JSON from the sample response format message in the documentation
Is there something hidden at the HTTP header level, or is Java 8 doing something to the JSON that is not visible?
Not sure if this is because I'm using Java 8 or not, but a return value of "String" from the RequestHandler class's handleRequest method will not work.
Yes, String is an object, but the constructors on the Lex side are expecting an "Object". I was converting my Lex response POJO to a String before returning it in the handleRequest method. That was my mistake.
I fixed it by changing the return type of the handleRequest method to be "Object" instead of "String".
public Object handleRequest(Object input, Context context)
instead of
public String handleRequest(Object input, Context context)
You also have to implement the
public class LambdaFunctionHandler implements RequestHandler<Object, Object>
not
public class LambdaFunctionHandler implements RequestHandler<Object, String>
This solved my issue.
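To make that concrete, here is a hedged sketch of such a handler; the Map-based body is my illustration, not the original poster's code, and it mirrors the dialogAction JSON shown in the question:
import java.util.HashMap;
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Sketch only: builds the "Close" dialogAction response as nested Maps
// and lets the Lambda runtime serialize it to JSON.
public class LambdaFunctionHandler implements RequestHandler<Object, Object> {

    @Override
    public Object handleRequest(Object input, Context context) {
        Map<String, Object> message = new HashMap<>();
        message.put("contentType", "PlainText");
        message.put("content", "Thanks I got your info");

        Map<String, Object> dialogAction = new HashMap<>();
        dialogAction.put("type", "Close");
        dialogAction.put("fulfillmentState", "Fulfilled");
        dialogAction.put("message", message);

        Map<String, Object> response = new HashMap<>();
        response.put("dialogAction", dialogAction);
        return response; // returned as an Object, not a pre-serialized String
    }
}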
In my case I was facing exactly the same issue and was able to fix it by creating a specific response POJO type and using this POJO as the return type of the 'handleRequest' method, e.g. BotResponse.java as follows:
public class BotResponse implements Serializable {

    private static final long serialVersionUID = 1L;

    public DialogAction dialogAction = new DialogAction();

    public DialogAction getDialogAction() {
        return dialogAction;
    }

    public void setDialogAction(DialogAction dialogAction) {
        this.dialogAction = dialogAction;
    }
}
Note, I have also added 'implements Serializable' just to be on the safer side. It is probably overkill.
Not sure why, but for me returning a well-formatted JSON String did not work even after changing the return type of the 'handleRequest' method to 'Object'.
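The DialogAction type referenced by BotResponse is not shown above. Roughly, it (and a nested Message class) might look like the following sketch; the field names mirror the dialogAction JSON from the question, while the defaults and structure are just a guess:
import java.io.Serializable;

// Sketch of the nested types BotResponse refers to; field names mirror the
// dialogAction JSON shown in the question, everything else is illustrative.
public class DialogAction implements Serializable {

    private static final long serialVersionUID = 1L;

    private String type = "Close";
    private String fulfillmentState = "Fulfilled";
    private Message message = new Message();

    public String getType() { return type; }
    public void setType(String type) { this.type = type; }

    public String getFulfillmentState() { return fulfillmentState; }
    public void setFulfillmentState(String fulfillmentState) { this.fulfillmentState = fulfillmentState; }

    public Message getMessage() { return message; }
    public void setMessage(Message message) { this.message = message; }

    public static class Message implements Serializable {

        private static final long serialVersionUID = 1L;

        private String contentType = "PlainText";
        private String content = "Thanks I got your info";

        public String getContentType() { return contentType; }
        public void setContentType(String contentType) { this.contentType = contentType; }

        public String getContent() { return content; }
        public void setContent(String content) { this.content = content; }
    }
}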
I know this is an old question, however I thought this might help someone else.
@Mattbob's solution didn't fix my issue; however, he is on the right path. The best approach is to use a custom response object and make the Lambda return that custom response object. So I went to the documentation and created a custom object that matches the response format here:
http://docs.aws.amazon.com/lex/latest/dg/lambda-input-response-format.html
At the time of answering the question I couldn't find an object in the SDK that matched the response object, so I had to recreate it, but if someone knows of one, please comment below.
class xxxxx implements RequestHandler<Object, AccountResponse> {

    @Override
    public AccountResponse handleRequest(Object input, Context context) {
        // populate and return an AccountResponse here
    }
}
The Lambda will look somewhat like this; just populate and return the object to match the response structure and the error goes away. Hope this helps.
Whenever we return the object to the bot from the backend, we need to make sure we pass the content type along with the content. But here we are passing it wrong, so we need to pass it as below. This is in Node.js:
let message = {
    contentType: "PlainText",
    content: 'Testing bot'
};
I have to consume a REST web service which has the following syntax for all responses:
{
    message: "OK",
    success: true,
    results: 1,
    data: {
        name: "Berlin",
        lat: 52.2,
        lon: 13.25,
        id: 1701
    },
    (...)
}
When I try to deserialize using:
City source = getRestTemplate().getForObject("http://myws.com/cities/{cityId}", City.class, "1701");
The default HttpMessageConverter tries to look for attributes named message, success, results in the City bean, and since it cannot find them, it's throwing an Exception.
I wonder if there's any way to take advantage of the default HttpMessageConverter but somehow tell it to interpret message, success and results differently, or do I have to create my own HttpMessageConverter altogether?
I had the same issue and the way around it is to create a wrapper object that contains the metadata fields. This method turned out to be quite useful and made it incredibly easy to get at the data.
In your case, the core model is obviously the city, but the wrapper object would be something like this:
class CityWrapper {
    String message;
    String success;
    Integer results;

    @JsonProperty("data")
    City city;
}
When I tackled it, I had a list of data coming back, and that worked fine with:
@JsonProperty("data")
List<City> cities;
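For completeness, a short usage sketch of the wrapper (the URL and city ID come from the question; direct field access assumes the field is visible, otherwise add a getter):
CityWrapper wrapper = getRestTemplate().getForObject(
        "http://myws.com/cities/{cityId}", CityWrapper.class, "1701");
City city = wrapper.city; // or wrapper.getCity() if you add an accessor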