Trouble Deserializing Object Pulled from SQS using GSON - events

I have a Lambda function that receives an S3Event object when a file is put into an S3 bucket. When the Lambda fails, the event goes to a dead-letter queue set up in Amazon SQS.
When I pull these messages, this is the body:
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventTime": "d",
      "eventName": "d:Put",
      "userIdentity": {
        "principalId": ""
      },
      "requestParameters": {
        "sourceIPAddress": "2"
      },
      "responseElements": {
        "x-amz-request-id": "",
        "x-amz-id-2": "g"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "",
        "bucket": {
          "name": "",
          "ownerIdentity": {
            "principalId": ""
          },
          "arn": ""
        },
        "object": {
          "key": "",
          "size": 12502,
          "eTag": "",
          "sequencer": ""
        }
      }
    }
  ]
}
That looks quite a bit like the S3Event object, which contains a list of S3EventNotification records. I have tried to deserialize it to the S3Event object using the following:
S3Event event = new GsonBuilder().serializeNulls().create().fromJson(s3EventString, S3Event.class);
This results in a null object like so:
{"records":null}
I noticed that in the JSON returned from SQS, the "R" in Records is capitalized. I wasn't sure if that made a difference, so I changed it to a lowercase "r", and it throws this error:
java.lang.IllegalStateException: Expected BEGIN_OBJECT but was STRING
I'm really not sure what type of object this actually is.
Any help would be greatly appreciated.

Strange. Using Jackson it works perfectly, so I will use that for now.
import java.io.IOException;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.sqs.model.Message;
import com.fasterxml.jackson.databind.ObjectMapper;

// Deserialize the SQS message body back into an S3Event
private S3Event extractS3Event(Message message) throws IOException {
    ObjectMapper objectMapper = new ObjectMapper();
    return objectMapper.readValue(message.getBody(), S3Event.class);
}
//Then to get the S3 Details
S3Event event = extractS3Event(message);
S3Entity entity = event.getRecords().get(0).getS3();
String bucketName = entity.getBucket().getName();
String s3Key = entity.getObject().getKey();

Re: Expected BEGIN_OBJECT but was STRING.
This is because AWS uses Joda-Time for eventTime. You can avoid the issue by removing that field from the JSON text (assuming you do not need it).
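For example, a minimal sketch of that workaround using Gson's own tree model (assuming a recent Gson version that has JsonParser.parseString; the stripEventTime helper name is hypothetical):
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// Hypothetical helper: drop the Joda-typed eventTime field so Gson no longer
// tries to map a plain string onto a DateTime object.
private static String stripEventTime(String s3EventString) {
    JsonObject root = JsonParser.parseString(s3EventString).getAsJsonObject();
    // The DLQ body uses a capital "R"; fall back to "records" if it was renamed.
    String key = root.has("Records") ? "Records" : "records";
    for (JsonElement record : root.getAsJsonArray(key)) {
        record.getAsJsonObject().remove("eventTime");
    }
    return root.toString();
}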

Related

Issue with Google Classroom courses.courseWork.list "field_mask: Unknown field mask values: individual_students_options"

I need to query partial fields for courseWork.list and courseWork.get, so I am passing this value in fields as described in the documentation. But the courseWork(individualStudentsOptions) API call returns an error:
{
  "error": {
    "code": 400,
    "message": "field_mask: Unknown field mask values: individual_students_options",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.BadRequest",
        "fieldViolations": [
          {
            "field": "field_mask",
            "description": "Unknown field mask values: individual_students_options"
          }
        ]
      }
    ]
  }
}
In other experiments, for example with courseWork(id), everything is fine and the API call returns this:
{
  "courseWork": [
    {
      "id": "93359557635"
    },
    {
      "id": "93359557700"
    },
    ...
  ]
}
So please help me understand how to fill in the individualStudentsOptions field.
How exactly are you calling this API?
This is how we do it here:
import java.util.ArrayList;
import com.google.api.services.classroom.model.CourseWork;
import com.google.api.services.classroom.model.IndividualStudentsOptions;

String studentUser = "student@gmail.com";

// Restrict the assignment to a specific list of students
IndividualStudentsOptions individualStudentsOptions = new IndividualStudentsOptions();
ArrayList<String> studentIdList = new ArrayList<>();
studentIdList.add(studentUser);
individualStudentsOptions.setStudentIds(studentIdList);

CourseWork courseWork = new CourseWork()
        .setCourseId(course.getId())
        .setTitle("My course work")
        .setDescription("desc")
        .setMaxPoints(100.0)
        .setDueDate(date)
        .setDueTime(timeOfDay)
        .setAssigneeMode("INDIVIDUAL_STUDENTS")
        .setIndividualStudentsOptions(individualStudentsOptions)
        .setWorkType("ASSIGNMENT")
        .setState("PUBLISHED");

courseWork = service.courses().courseWork().create(course.getId(), courseWork).execute();
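As a follow-up, a hedged sketch of reading the assignment back (assuming service is an authorized com.google.api.services.classroom.Classroom client, as in the snippet above):
// Fetch the created coursework; individualStudentsOptions is returned on items
// whose assigneeMode is INDIVIDUAL_STUDENTS.
CourseWork created = service.courses().courseWork()
        .get(course.getId(), courseWork.getId())
        .execute();
System.out.println(created.getIndividualStudentsOptions().getStudentIds());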

Jackson deserialization with Spring Boot: how to get the field names present in the request along with their respective field mapping

I have a requirement to throw different errors for different scenarios like the ones below, and there are many such fields, not just one.
For example:
{
"id": 1,
"name": "nameWithSpecialChar$"
}
Here it should throw an error for the special character.
{
"id": 1,
"name": null
}
Here it should throw a "field is null" error.
{
"id": 1
}
Here it should throw a "field is missing" error.
Handling the 1st and 2nd scenarios is easy, but for the 3rd one, is there any way to get a list of the field names that were passed in the input JSON at deserialization time with Jackson?
One way I am able to do it is by mapping the request to a JsonNode, checking whether nodes are present for the required fields, and then deserializing that JsonNode manually and validating the rest of the members, as below.
public ResponseEntity myGetRequest(@RequestBody JsonNode requestJsonNode) {
    if (!requestJsonNode.has("name")) {
        // throw a "field is missing" error here
    }
    MyRequest request = objectMapper.convertValue(requestJsonNode, MyRequest.class);
    validateIfFieldsAreInvalid(request);
    // ...
}
But I do not like this approach; is there any other way of doing it?
You can define a JSON schema and validate your object against it. In your example, your schema may look like this:
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "id": {
      "description": "The identifier",
      "type": "integer"
    },
    "name": {
      "description": "The item name",
      "type": "string",
      "pattern": "^[a-zA-Z]*$"
    }
  },
  "required": [ "id", "name" ]
}
To validate your object, you could use the json-schema-validator library. This library is built on Jackson. Since you're using Spring Boot anyway, you already have Jackson imported.
The example code looks more or less like this:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

String schema = "<define your schema here>";
String data = "<put your data here>";

JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
ObjectMapper m = new ObjectMapper();
JsonSchema jsonSchema = factory.getJsonSchema(m.readTree(schema));
JsonNode json = m.readTree(data);
ProcessingReport report = jsonSchema.validate(json);
System.out.println(report);
The report includes detailed errors for different input cases. For example, with this input
{
"id": 1,
"name": "nameWithSpecialChar$"
}
this output is printed out
--- BEGIN MESSAGES ---
error: ECMA 262 regex "^[a-zA-Z]*$" does not match input string "nameWithSpecialChar$"
level: "error"
schema: {"loadingURI":"#","pointer":"/properties/name"}
instance: {"pointer":"/name"}
domain: "validation"
keyword: "pattern"
regex: "^[a-zA-Z]*$"
string: "nameWithSpecialChar$"
--- END MESSAGES ---
Or, instead of just printing out the report, you can loop through all the errors and apply your specific logic:
for (ProcessingMessage message : report) {
// Add your logic here
}
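For instance, a sketch of that loop that distinguishes the three scenarios from the question (assuming the fge ProcessingMessage API; the keyword values follow the report format shown above):
for (ProcessingMessage message : report) {
    // Each message exposes its details as a JsonNode, including the failing keyword
    String keyword = message.asJson().path("keyword").asText();
    if ("required".equals(keyword)) {
        // the field is missing entirely
    } else if ("pattern".equals(keyword)) {
        // the value is present but contains a disallowed character
    } else if ("type".equals(keyword)) {
        // the value is null or has the wrong type
    }
}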
You could check the example code to gain more information about how to use the library.

Spring MVC Converter and Swagger doc: how to?

I use converters in my Spring MVC controllers. In this example, the String from the path variable is mapped into a UserId:
@GetMapping(path = "/user/{user-id}")
public User get(@Parameter(description = "User id", required = true, example = "3fa85f64-5717-4562-b3fc-2c963f66afa6")
                @PathVariable("user-id")
                UserId userId) {
    return userService.get(userId);
}
It seems to confuse Swagger, as the generated doc requires an object as the parameter and not a plain string:
...
"/api/v1/user/{user-id}": {
  "get": {
    "operationId": "get",
    "parameters": [
      {
        "name": "user-id",
        "in": "path",
        "schema": {
          "$ref": "#/components/schemas/UserId"
        }
      }
    ],
...
with the UserId schema:
"UserId": {
"type": "object",
"properties": {
"value": {
"type": "string",
"format": "uuid"
}
}
}
And thus the Swagger UI cannot be used, because either the parameter is considered invalid when a plain string is provided, or the data is actually invalid when the object format is used.
What is an option to fix that?
To achieve that, the schema attribute of the @Parameter annotation is the answer.
The above example becomes:
@Parameter(description = "User id",
           required = true,
           schema = @Schema(implementation = String.class), // this is new
           example = "3fa85f64-5717-4562-b3fc-2c963f66afa6")

laravel/codeception : test if json response contains only certain keys

I have a JSON array coming from my API as the response:
{
  "data": [
    {
      "id": 1,
      "name": "abc"
    }
  ]
}
I am using Laravel for the API and laravel-codeception for testing.
public function getAll(ApiTester $I)
{
    $I->sendGET($this->endpoint);
}
I have to test whether the response contains only the id and name keys (and no other keys). For example, this response should fail the test:
{
  "data": [
    {
      "id": 1,
      "name": "abc",
      "email": "abc@xyz"
    }
  ]
}
I have found $I->seeResponseContainsJson(), but it only checks whether the given JSON is present. It does not check whether the JSON response contains only the specified keys.
Thanks.

Spring Data Elasticsearch built-in IN query returning partial matches

I am new to Spring Data Elasticsearch. Today I was trying to get an In query working with a Spring Data ES repository.
I have to do a lookup for a list of user names, and if a name exactly matches one in the index, I need to get those users back as the result.
I tried to use the built-in repository 'In' method to do so, but it returns partial matches. Please help me make this work like a SQL IN query.
Here is my repository code:
public interface UserRepository extends ElasticsearchRepository<EsUser, String> {
    List<EsUser> findByUserAccountUserNameIn(Collection<String> terms);
}
REQUEST:
{"terms":["vijay", "arun"], "type":"NAME"}
RESPONSE:
[
  {
    "userId": "236000",
    "fbId": "",
    "userAccount": {
      "userName": "arun",
      "urlFriendlyName": "arun"
    },
    "userProfile": {},
    "userStats": {}
  },
  {
    "userId": "6228",
    "userAccount": {
      "userName": "vijay",
      "urlFriendlyName": "vijay"
    },
    "userProfile": {},
    "userStats": {}
  },
  {
    "userId": "236000",
    "fbId": "",
    "userAccount": {
      "userName": "arun singh",
      "urlFriendlyName": "arun-singh"
    },
    "userProfile": {},
    "userStats": {}
  },
  {
    "userId": "236000",
    "fbId": "",
    "userAccount": {
      "userName": "vijay mohan",
      "urlFriendlyName": "vijay-mohan"
    },
    "userProfile": {},
    "userStats": {}
  }
]
This is because your userAccount.userName field is an analyzed string, and thus the two tokens arun and singh have been indexed. Your query then matches the first token, which is expected behavior.
In order to prevent this and guarantee an exact match, you need to declare your field as not_analyzed, like this:
@Field(index = FieldIndex.not_analyzed)
private String userName;
Then you'll need to delete your index and the associated template in /_template, and restart your application so that a new template and index are created with the proper field mapping.
Then your query will work.
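For reference, a hedged equivalent for newer Spring Data Elasticsearch versions (assumption: 3.x or later, where FieldIndex was replaced by FieldType):
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

// Keyword fields are not analyzed, so findBy...In only matches the whole value
@Field(type = FieldType.Keyword)
private String userName;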
