Publishing Avro messages using Kafka REST Proxy throws "Conversion of JSON to Avro failed" - spring-boot

I am trying to publish a message whose schema has a union type for one field:
{
  "name": "somefield",
  "type": [
    "null",
    {
      "type": "array",
      "items": {
        "type": "record",
        ...
Publishing the message using the Kafka REST Proxy keeps throwing the following error whenever somefield has a populated array:
{
  "error_code": 42203,
  "message": "Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Expected start-union. Got START_ARRAY"
}
The same schema with somefield set to null works fine.
The Java classes are built from the Avro schemas in the Spring Boot project using the Gradle plugin. When I publish a message with the array populated, using the generated Java classes and the Spring KafkaTemplate, the message is published correctly with the correct schema (the schema is taken from the generated Avro SpecificRecord). When I copy the same JSON value and schema and publish via the REST Proxy, it fails with the above error.
I am using these content types in the API call:
accept:application/vnd.kafka.v2+json, application/vnd.kafka+json, application/json
content-type:application/vnd.kafka.avro.v2+json
What am I missing here? Any pointers for troubleshooting the issue are appreciated.

The messages I tested were:
{
  "somefield": null
}
and
{
  "somefield": [
    { "field1": "hello" }
  ]
}
However, Avro's JSON encoding requires the non-null branch of a union to be wrapped in an object keyed by the branch's type name, so the value should instead be passed as:
{
  "somefield": {
    "array": [
      { "field1": "hello" }
    ]
  }
}
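For reference, here is a minimal sketch of publishing such a record through the REST Proxy with Java's built-in HTTP client; the proxy URL, topic name, and schema ID are placeholders, not values from the original question:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestProxyPublish {
    public static void main(String[] args) throws Exception {
        // Hypothetical proxy address and topic; adjust to your environment.
        String url = "http://localhost:8082/topics/my-topic";
        // Avro JSON encoding: the non-null branch of the union is wrapped
        // in an object keyed by its type name ("array" here).
        String body = "{"
                + "\"value_schema_id\": 1,"
                + "\"records\": [{\"value\": {\"somefield\": {\"array\": [{\"field1\": \"hello\"}]}}}]"
                + "}";
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Content-Type", "application/vnd.kafka.avro.v2+json")
                .header("Accept", "application/vnd.kafka.v2+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}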

Related

Fluentd JSON string parsing with multiple data type in array

I am trying to set up a logging pipeline with Fluentd and Elasticsearch. One of my log patterns looks like the following:
{
  "key": "value",
  "inputs": [
    [
      "2023-01-16T04:45:12.238Z",
      {
        "type": "channel",
        "subtype": "profile",
        "data": {
          "firstName": "Customer"
        }
      }
    ]
  ]
}
The issue with this structure is that the first element of the inner array is a date string rather than an object. Whenever Fluentd tries to write it to Elasticsearch, Elasticsearch rejects it with error code 400 and the following message:
#0 dump an error event: error_class=Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError error="400 - Rejected by Elasticsearch [error type]: illegal_argument_exception [reason]: 'can't merge a non object mapping [data.inputs] with an object mapping'" location=nil
What is the way forward? When I remove the date from the array, it syncs to Elasticsearch correctly.
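The thread has no accepted fix, but one possible direction, sketched here with Jackson purely for illustration, is to normalize each [timestamp, payload] pair into a single object before indexing, so every element of inputs has a uniform object mapping (the timestamp field name is an assumption):
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class NormalizeInputs {
    public static JsonNode normalize(String rawLog) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        ObjectNode root = (ObjectNode) mapper.readTree(rawLog);
        ArrayNode normalized = mapper.createArrayNode();
        for (JsonNode pair : root.withArray("inputs")) {
            // Each pair is ["2023-01-16T04:45:12.238Z", { ... }]: fold the
            // date string into the object so the array holds only objects.
            ObjectNode entry = pair.get(1).deepCopy();
            entry.put("timestamp", pair.get(0).asText()); // assumed field name
            normalized.add(entry);
        }
        root.set("inputs", normalized);
        return root;
    }
}
The same reshaping could equally be done inside the pipeline itself, e.g. in a Fluentd filter, before the record reaches Elasticsearch.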

Error to create a Databricks datasource using Power BI Rest API

I successfully used the "Gateways - Create Datasource" method (https://learn.microsoft.com/en-us/rest/api/power-bi/gateways/create-datasource) from the Power BI REST API to create a SQL datasource, but I'm stuck when trying to create a Databricks datasource.
I saw that there is no Databricks datasource kind, but it might be possible to use the kind "Extension", so I tried the code below.
Note: I generated the credential part using the data gateway public key and a Databricks key.
What am I missing, or doing wrong?
{
  "datasourceName": "Databricks AIDA Teste",
  "datasourceType": "Extension",
  "connectionDetails": {
    "path": "{\"host\":\"adb-xxx.azuredatabricks.net\",\"httpPath\":\"\\/sql\\/1.0\\/warehouses\\/xxx\"}",
    "kind": "Databricks"
  },
  "credentialDetails": {
    "credentialType": "Key",
    "credentials": "xxx",
    "privacyLevel": "Organizational"
  }
}
The request fails with the following error:
{
  "error": {
    "code": "BadRequest",
    "message": "Bad Request",
    "details": [
      {
        "message": "Unexpected character encountered while parsing value: {. Path 'connectionDetails', line 4, position 30.",
        "target": "datasourceToGatewayRequest.connectionDetails"
      },
      {
        "message": "'datasourceToGatewayRequest' is a required parameter",
        "target": "datasourceToGatewayRequest"
      }
    ]
  }
}
Many thanks!

Jackson deserialization with Spring Boot: how to get the field names present in the request along with their respective field mappings

I have a requirement to throw a different error for each of the scenarios below, and there are many such fields, not just one. For example:
{
  "id": 1,
  "name": "nameWithSpecialChar$"
}
Here it should throw an error for the special character.
{
  "id": 1,
  "name": null
}
Here it should throw a null-field error.
{
  "id": 1
}
Here it should throw a missing-field error.
Handling the first and second scenarios is easy, but for the third: is there any way to get, with Jackson, a list of the field names that were actually present in the input JSON at deserialization time?
One way I am able to do it is by mapping the request to a JsonNode, checking that nodes are present for the required fields, and then deserializing the JsonNode manually and validating the rest of the members, as below.
public ResponseEntity<?> myGetRequest(@RequestBody JsonNode requestJsonNode) {
    if (!requestJsonNode.has("name")) {
        // throw the field-missing error here
    }
    MyRequest request = objectMapper.convertValue(requestJsonNode, MyRequest.class);
    validateIfFieldsAreInvalid(request);
    // ...
}
But I do not like this approach. Is there any other way of doing it?
You can define a JSON schema and validate your object against it. In your example, your schema may look like this:
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "id": {
      "description": "The identifier",
      "type": "integer"
    },
    "name": {
      "description": "The item name",
      "type": "string",
      "pattern": "^[a-zA-Z]*$"
    }
  },
  "required": [ "id", "name" ]
}
To validate your object, you could use the json-schema-validator library, which is built on Jackson. Since you're using Spring Boot anyway, you already have Jackson on the classpath.
The example code looks more or less like this:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

String schema = "<define your schema here>";
String data = "<put your data here>";

JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
ObjectMapper m = new ObjectMapper();
JsonSchema jsonSchema = factory.getJsonSchema(m.readTree(schema));
JsonNode json = m.readTree(data);
ProcessingReport report = jsonSchema.validate(json);
System.out.println(report);
The report includes detailed errors for different input cases. For example, with this input
{
  "id": 1,
  "name": "nameWithSpecialChar$"
}
the following output is printed:
--- BEGIN MESSAGES ---
error: ECMA 262 regex "^[a-zA-Z]*$" does not match input string "nameWithSpecialChar$"
level: "error"
schema: {"loadingURI":"#","pointer":"/properties/name"}
instance: {"pointer":"/name"}
domain: "validation"
keyword: "pattern"
regex: "^[a-zA-Z]*$"
string: "nameWithSpecialChar$"
--- END MESSAGES ---
Or, instead of just printing the report, you can loop through all the errors and apply your own specific logic:
for (ProcessingMessage message : report) {
// Add your logic here
}
You could check the example code to gain more information about how to use the library.
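As a further illustration (not from the original answer), each ProcessingMessage exposes its details as JSON via asJson(), so the loop body could dispatch on the keyword field shown in the report above to raise the three different errors:
import com.github.fge.jsonschema.core.report.ProcessingMessage;

for (ProcessingMessage message : report) {
    // asJson() exposes the fields printed above (keyword, instance, ...).
    JsonNode details = message.asJson();
    switch (details.path("keyword").asText()) {
        case "pattern":  // scenario 1: special character in "name"
            // throw the special-character error
            break;
        case "type":     // scenario 2: "name" is null instead of a string
            // throw the null-field error
            break;
        case "required": // scenario 3: "name" is missing entirely
            // throw the missing-field error
            break;
        default:
            break; // fall back to a generic validation error
    }
}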

Micronaut GraphQL: How to respond with a non-200 HTTP status code from within GraphQL handler?

Following the docs, here's my exception handler (Kotlin):
@Produces
@Singleton
@Requirements(Requires(classes = [ForbiddenException::class, ExceptionHandler::class]))
class ForbiddenExceptionHandler : ExceptionHandler<ForbiddenException, HttpResponse<*>> {
    override fun handle(request: HttpRequest<*>, exception: ForbiddenException): HttpResponse<*> {
        return HttpResponse.status<String>(HttpStatus.FORBIDDEN, exception.message)
    }
}
Throwing a ForbiddenException from within my GraphQL handler bubbles the message into the response body, but the status code is always 200.
Example response:
{
  "errors": [
    {
      "message": "Exception while fetching data (/createUser) : FORBIDDEN",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "createUser"
      ],
      "extensions": {
        "classification": "DataFetchingException"
      }
    }
  ],
  "data": null
}
Micronaut version: 1.3.3
Micronaut GraphQL version: 1.3.0.RC1
Disclaimer: GraphQL is not REST. You are asking about a core aspect of the GraphQL specification (and of GraphQL implementations in general).
The specification chose to embed most errors encountered during query execution in the response body while still returning a 200 HTTP status. You won't be able to change that in your project; it is not a configuration option of graphql-java.
The good news is that the format of the errors is well defined, so you can deserialize the error payload in your application and correctly handle any error thrown by GraphQL.
Please have a look at this link for an in-depth explanation of the main differences between REST and GraphQL.
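To make that concrete, here is a minimal client-side sketch (assuming Jackson is available; this is not part of the original answer) of surfacing an embedded GraphQL error despite the 200 status:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public final class GraphQLErrors {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Even on HTTP 200, a GraphQL response may carry errors in its body.
    public static void failOnErrors(String responseBody) throws Exception {
        JsonNode errors = MAPPER.readTree(responseBody).path("errors");
        if (errors.isArray() && errors.size() > 0) {
            // Per the spec, every error entry has a "message" field.
            throw new IllegalStateException(errors.get(0).path("message").asText());
        }
    }
}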

graphql-java error data (like extensions) is being stuffed into "message" field on client

In graphql-java, I'm overriding getExtensions on a GraphQLError, in order to pass a "code" extension for Apollo Client to consume.
With a breakpoint set, I can see that GraphQLObjectMapper sees the error in the executionResult with all its fields, including extensions, properly set as expected.
However, when the error arrives at Apollo, it gets strangely morphed: the entire error object (including the extensions map) appears to be stuffed into the string message field, like so:
{
  "errors": [
    {
      "message": "Unexpected error value: { message: \"Exception while fetching data (/myQuery) : Didn't work\", locations: [[Object]], path: [\"myQuery\"], extensions: { code: \"MY_CODE\", classification: \"DataFetchingException\" } }",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "mailLabels"
      ],
      "extensions": {
        "code": "INTERNAL_SERVER_ERROR"
      }
    }
  ],
  "data": null
}
As far as I can tell, Apollo is not doing anything wrong. My suspicion is that graphql-java may be the one stuffing this entire error into the "message" field and then setting the code to "INTERNAL_SERVER_ERROR", but I'm not really sure. Is there something I can do on the graphql-java end to prevent this and make it properly pass the extensions rather than stuffing them into the message value? Any assistance would be appreciated.
It is not caused by graphql-java but by graphql-java-servlet. In some older versions it did not, by default, serialize GraphQLError to the structure defined by the GraphQL specification; this was fixed in 7.5.1 in this issue.
If you cannot upgrade to the latest version, the simplest way is to customize GraphQLObjectMapper and override its convertSanitizedExecutionResult:
import static java.util.stream.Collectors.toList;
import java.util.Map;
import graphql.ExecutionResult;
import graphql.servlet.GraphQLObjectMapper; // package varies by graphql-java-servlet version

public class CustomObjectMapper extends GraphQLObjectMapper {
    @Override
    public Map<String, Object> convertSanitizedExecutionResult(ExecutionResult executionResult, boolean includeData) {
        Map<String, Object> result = super.convertSanitizedExecutionResult(executionResult, includeData);
        if (result.containsKey("errors")) {
            // Re-serialize the errors in the spec-compliant form, preserving extensions.
            result.put("errors", executionResult.getErrors().stream()
                    .map(err -> err.toSpecification())
                    .collect(toList()));
        }
        return result;
    }
}
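Note that how a custom GraphQLObjectMapper is registered depends on the graphql-java-servlet version you are on, so check the configuration API of your release; once registered, errors follow the specification's format and Apollo can read the extensions instead of a stuffed message string.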
