I want to send a POST request from Node-RED to the Composer REST server.
Error trying invoke business network. Error: No valid responses from any peers.Response from attempted peer comms was an error: Error: 2 UNKNOWN: error executing chaincode: transaction returned with failure: ValidationException: Instance org.acme.shipping.perishable.AccelReading#c8c829bfd738d7ec63180c5225ae85bd77fad29b4ad9d8ad4bc40a14362f1060 missing required field accel_x
Playground/Test
{
"$class": "org.acme.shipping.perishable.AccelReading",
"accel_x": 0,
"accel_y": 0,
"accel_z": 0,
"latitude": "",
"longitude": "",
"readingTime": "",
"shipment": "resource:org.acme.shipping.perishable.Shipment#4879"
}
Node-RED URL
http://...:31090/api/AccelReading?data=
{"$class":"org.acme.shipping.perishable.AccelReading",
"accel_x":23264,
"accel_y":-20960,
"accel_z":-2448,
"readingTime":"2018-02-14T15:16:44.284Z",
"latitude":"51",
"longitude":"11",
"shipment":"resource:org.acme.shipping.perishable.Shipment#320022000251363131363432"
}
Payload
Postman POST request with all parameters defined as key/value pairs in the body
Response
{
"error": {
"statusCode": 422,
"name": "ValidationError",
"message": "The `AccelReading` instance is not valid. Details: `shipment` can't be blank (value: undefined).",
"details": {
"context": "AccelReading",
"codes": {
"shipment": [
"presence"
]
},
"messages": {
"shipment": [
"can't be blank"
]
}
},
"stack": "ValidationError: The `AccelReading` instance is not valid. Details: `shipment` can't be blank (value: undefined).\n at /home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/dao.js:398:12\n at AccelReading.<anonymous> (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:578:11)\n at AccelReading.next (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/hooks.js:93:12)\n at AccelReading.<anonymous> (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:575:23)\n at AccelReading.trigger (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/hooks.js:83:12)\n at AccelReading.Validatable.isValid (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/validations.js:541:8)\n at /home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/dao.js:394:9\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at doNotify (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:155:49)\n at Function.ObserverMixin._notifyBaseObservers (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:178:5)\n at Function.ObserverMixin.notifyObserversOf (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:153:8)\n at Function.ObserverMixin._notifyBaseObservers (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:176:15)\n at Function.ObserverMixin.notifyObserversOf (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:153:8)\n at Function.ObserverMixin._notifyBaseObservers (/home/composer/.npm-global/lib/node_modules/#ibmblockchain/composer-rest-server/node_modules/loopback-datasource-juggler/lib/observer.js:176:15)"
}
}
Postman/JSON
The issue was that the Composer REST server needed to be refreshed so that the desired fields appeared in Swagger. That means deleting the REST server container and re-creating it using the internal 192.x address; the steps on Kubernetes are:
bx cs cluster-config blockchain
export KUBECONFIG=/Users/<name>/.bluemix/plugins/container-service/clusters/blockchain/kube-config-mil01-blockchain.yml
./delete/delete_composer-rest-server.sh
./create/create_composer-rest-server.sh --business-network-card admin@perishable-network
A screenshot of what the POST should look like (after the refresh) for the IoT business network referred to was attached; a rough sketch of that request is below.
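A minimal sketch of that request, assuming the reading is POSTed as a JSON body to the /api/AccelReading endpoint of the regenerated REST server (all field values are taken from the question above):
{
  "$class": "org.acme.shipping.perishable.AccelReading",
  "accel_x": 23264,
  "accel_y": -20960,
  "accel_z": -2448,
  "latitude": "51",
  "longitude": "11",
  "readingTime": "2018-02-14T15:16:44.284Z",
  "shipment": "resource:org.acme.shipping.perishable.Shipment#320022000251363131363432"
}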
Related
I successfully used the "Gateways - Create Datasource" (https://learn.microsoft.com/en-us/rest/api/power-bi/gateways/create-datasource) method from the Power BI REST API to create a SQL datasource, but I'm stuck when I try to create a Databricks datasource.
I saw that there is no Databricks datasource kind, but it might be possible to use the kind "Extension", so I tried the code below.
Note: I generated the credential part using the data gateway public key and a Databricks key.
What am I missing, or doing wrong?
{
"datasourceName": "Databricks AIDA Teste",
"datasourceType":"Extension",
"connectionDetails":{
"path":"{\"host\":\"adb-xxx.azuredatabricks.net\",\"httpPath\":\"\\/sql\\/1.0\\/warehouses\\/xxx\"}",
"kind":"Databricks"
},
"credentialDetails": {
"credentialType": "Key",
"credentials": "xxx",
"privacyLevel": "Organizational"
}
}
The response I get is:
{
"error": {
"code": "BadRequest",
"message": "Bad Request",
"details": [
{
"message": "Unexpected character encountered while parsing value: {. Path 'connectionDetails', line 4, position 30.",
"target": "datasourceToGatewayRequest.connectionDetails"
},
{
"message": "'datasourceToGatewayRequest' is a required parameter",
"target": "datasourceToGatewayRequest"
}
]
}
}
Many thanks!
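Not a confirmed fix, but the first error ("Unexpected character encountered while parsing value: {. Path 'connectionDetails'") suggests the gateway API tries to parse connectionDetails as a JSON-encoded string rather than as a nested object (the documented SQL examples pass it as a string). A hedged sketch of the same request with your connectionDetails object serialized into a single escaped string (whether the path/kind key names are what the Extension kind expects is not verified):
{
  "datasourceName": "Databricks AIDA Teste",
  "datasourceType": "Extension",
  "connectionDetails": "{\"path\":\"{\\\"host\\\":\\\"adb-xxx.azuredatabricks.net\\\",\\\"httpPath\\\":\\\"/sql/1.0/warehouses/xxx\\\"}\",\"kind\":\"Databricks\"}",
  "credentialDetails": {
    "credentialType": "Key",
    "credentials": "xxx",
    "privacyLevel": "Organizational"
  }
}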
Problem you have encountered:
Following the steps at the link below for the transferJobs.patch API:
https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/patch
The patch API works as expected if I want to update the description. Sample below.
Request:
{
"projectId": "<MY_PROJECT>",
"transferJob": {
"transferSpec": {
"objectConditions": {
"lastModifiedSince": "2022-01-24T18:30:00Z"
}
},
"description": "updated description"
},
"updateTransferJobFieldMask": "description"
}
Response: Success 200
The patch API does not work if I want to update a nested object field. Sample below:
{
"projectId": "<MY_PROJECT>",
"transferJob": {
"transferSpec": {
"objectConditions": {
"lastModifiedSince": "2022-01-22T18:30:00Z"
}
},
"description": "updated description"
},
"updateTransferJobFieldMask": "transferSpec.objectConditions.lastModifiedSince"
}
Response: 400
{"error": {
"code": 400,
"message": "Invalid path in the field mask.",
"status": "INVALID_ARGUMENT"}}
I tried other combinations following the documentation and sample code references, but none of them work. Options tried:
transferSpec.objectConditions.lastModifiedSince
transferJob.transferSpec.objectConditions.lastModifiedSince
objectConditions.lastModifiedSince
lastModifiedSince
the snake_case combination referring to FieldMaskUtil, i.e. transfer_spec.object_conditions.last_modified_since
What I expected to happen:
The patch API to work successfully for a nested object as per the documentation, i.e. "updateTransferJobFieldMask": "transferSpec.objectConditions.lastModifiedSince"
updateTransferJobFieldMask works on the top level object, in this case transferSpec.
Changing that line to updateTransferJobFieldMask: transferSpec should work.
From the documentation:
The field mask of the fields in transferJob that are to be updated in this request. Fields in transferJob that can be updated are: description, transfer_spec, notification_config, and status. To update the transfer_spec of the job, a complete transfer specification must be provided. An incomplete specification missing any required fields will be rejected with the error INVALID_ARGUMENT.
Providing the complete object with the required child field worked. Sample example below for future reference for other devs.
The job below transfers data from Azure to a GCP bucket, and during the patch it updates the last-modified time. Both transfer_spec and transferSpec work as updateTransferJobFieldMask.
{
"projectId": "<MY_PROJECT>",
"updateTransferJobFieldMask": "transfer_spec",
"transferJob": {
"transferSpec": {
"gcsDataSink": {
"bucketName": "<BUCKET_NAME>"
},
"objectConditions": {
"lastModifiedSince": "2021-12-30T18:30:00Z"
},
"transferOptions": {},
"azureBlobStorageDataSource": {
"storageAccount": "<ACCOUNT_NAME>",
"container": "<CONTAINER>",
"azureCredentials": {
"sasToken": "<SAS TOKEN>"
}
}
}
}
}
I am facing the following problem and don't know how to solve it.
I have a GraphQL endpoint to fetch a list of users, and it already has an authentication check enabled.
Basically, when I send a fetchUsers request without an authorization header, it should throw an exception or return a status code to let the user know, but currently it just responds with:
{
"errors": [
{
"message": null,
"locations": [
{
"line": 2,
"column": 3
}
],
"path": [
"fetchUsers"
],
"extensions": {
"classification": "DataFetchingException"
}
}
],
"data": {
"fetchUsers": null
}
}
And on the backend server, an exception is thrown:
SRGQL012000: Data Fetching Error: io.quarkus.security.UnauthorizedException
at io.quarkus.security.runtime.interceptor.check.AuthenticatedCheck.apply(AuthenticatedCheck.java:28)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer.check(SecurityConstrainer.java:28)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer_Subclass.check$$superforward1(SecurityConstrainer_Subclass.zig:100)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer_Subclass$$function$$1.apply(SecurityConstrainer_Subclass$$function$$1.zig:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
Is there any way to catch this UnauthorizedException and customize it, so that we can respond with a 401 and the error message that we want to return?
I am throwing an error from a gRPC service using responseObserver.onError(), but I am not getting the messages in JSON format when hitting the REST API from a REST client, though the positive scenario works fine and returns the response as JSON.
I am using Envoy as a transcoder. Can anyone help me with how to get the error response as JSON as well? Currently I am getting BadRequest in error scenarios. The project is in Spring Boot.
TIA
You can use convert_grpc_status: true to do this.
http_filters:
- name: envoy.filters.http.grpc_json_transcoder
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.http.grpc_json_transcoder.v3.GrpcJsonTranscoder
    proto_descriptor: "/tmp/envoy/proto.pb"
    services: ["xxxxxxxx"]
    convert_grpc_status: true
    print_options:
      always_print_primitive_fields: true
      always_print_enums_as_ints: false
      preserve_proto_field_names: false
If you mean returning a details key like this:
{
"code": 3,
"message": "API call quota depleted",
"details": [
{
"#type": "type.googleapis.com/google.rpc.ResourceInfo",
"resourceType": "xxxxxx",
"resourceName": "",
"owner": "",
"description": ""
}
]
}
You MUST compile your .proto file with:
import "google/rpc/error_details.proto";
because Envoy can't deserialize binary details from your backend server without error types.
You can also read about how to send a detailed error response with Python: How to send error details like as BadRequest
I'm creating the following request in VBScript and sending it to the GoCardless sandbox:
url="https://api-sandbox.gocardless.com/"
typ="GET"
Set xml = Server.CreateObject("MSXML2.ServerXMLHTTP")
xml.Open typ, url, False
xml.setRequestHeader "Authorization", "Bearer " & GCAccessToken
xml.SetRequestHeader "GoCardless-Version", "2015-07-06"
xml.SetRequestHeader "Accept","application/json"
xml.SetRequestHeader "Content-Type", "application/json"
xml.Send
GetGC = xml.responseText
Set xml = Nothing
The response I always get despite any tweaks I do is:
{"error":{"message":"not found","errors":[{"reason":"not_found","message":"not found"}],"documentation_url":"https://developer.gocardless.com/api-reference#not_found","type":"invalid_api_usage","request_id":"0AA4000DECCD_AC121CEB1F90_5BE18701_19AD0009","code":404}}
Any help would be appreciated. I have successfully done something similar for Stripe but now need to use GoCardless.
If you read the response from the API
{
"error": {
"message": "not found",
"errors": [{
"reason": "not_found",
"message": "not found"
}
],
"documentation_url": "https://developer.gocardless.com/api-reference#not_found",
"type": "invalid_api_usage",
"request_id": "0AA4000DECCD_AC121CEB1F90_5BE18701_19AD0009",
"code": 404
}
}
The error appears to be an HTTP status code (as is common with RESTful APIs): 404 Not Found. Looking at the documentation link provided in the response:
404
Not Found. The requested resource was not found or the authenticated user cannot access the resource. The response body will explain which resource was not found.
So the issue could be one of the following:
You have failed to authenticate using the token in the code provided.
You authenticated but don't have permission to access the resource.
The resource you are looking for does not exist.
In this particular instance, I would suggest it is because the resource doesn't exist: the code doesn't specify a resource, only the base URL of the API, which isn't an endpoint you can interact with.
Looking at the documentation, it's clear you need to provide a valid endpoint in the URL; at the time of writing there are 15 core endpoints to interact with, along with 2 helper endpoints.
For example, a create payment request/response would look like:
POST https://api.gocardless.com/payments HTTP/1.1
{
"payments": {
"amount": 100,
"currency": "GBP",
"charge_date": "2014-05-19",
"reference": "WINEBOX001",
"metadata": {
"order_dispatch_date": "2014-05-22"
},
"links": {
"mandate": "MD123"
}
}
}
HTTP/1.1 201 (Created)
Location: /payments/PM123
{
"payments": {
"id": "PM123",
"created_at": "2014-05-08T17:01:06.000Z",
"charge_date": "2014-05-21",
"amount": 100,
"description": null,
"currency": "GBP",
"status": "pending_submission",
"reference": "WINEBOX001",
"metadata": {
"order_dispatch_date": "2014-05-22"
},
"amount_refunded": 0,
"links": {
"mandate": "MD123",
"creditor": "CR123"
}
}
}
Unfortunately, the code sample provided in the question doesn't really do anything beyond hitting the base URL, so it's difficult to tell what you are trying to do. In conclusion, I would suggest revisiting the documentation for the API and looking through the samples provided; a minimal sketch of your code pointed at a real endpoint is below.
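For example, a minimal reworking of the original VBScript that targets the payments endpoint shown above (GET on /payments lists payments in the sandbox, assuming the access token is valid and has permission to read them) would look something like:
' Point the request at a concrete endpoint rather than the bare base URL
url = "https://api-sandbox.gocardless.com/payments"
typ = "GET"

Set xml = Server.CreateObject("MSXML2.ServerXMLHTTP")
xml.Open typ, url, False
xml.setRequestHeader "Authorization", "Bearer " & GCAccessToken
xml.setRequestHeader "GoCardless-Version", "2015-07-06"
xml.setRequestHeader "Accept", "application/json"
xml.setRequestHeader "Content-Type", "application/json"
xml.Send

' On success this should now contain a JSON document with a "payments" collection
' instead of the 404 "not found" error
GetGC = xml.responseText
Set xml = Nothing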