I'm following the tutorial "Extract Data from a Source File." I'm able to upload a file as described, but when I try to convert the file to SVF it fails with "TranslationWorker-InternalFailure." I get the same error with both a .3ds file created with Blender and an .f3d file created with Autodesk Fusion 360.
Here's my code (python):
import json
import time

import requests

# auth_type, auth_token and urn_cube_base64 are defined earlier in the script
# (auth_type is typically "Bearer"; urn_cube_base64 is the base64-encoded
# objectId of the uploaded file).

# Submit the translation job.
r = requests.post(
    'https://developer.api.autodesk.com/modelderivative/v2/designdata/job',
    headers={
        'authorization': '{0} {1}'.format(auth_type, auth_token)
    },
    json={
        "input": {
            "urn": urn_cube_base64,
        },
        "output": {
            "formats": [
                {
                    "type": "svf",
                    "views": ["2d", "3d"]
                }
            ]
        }
    }
)
print(r.json())

# Poll the manifest until the translation finishes.
while True:
    r = requests.get(
        'https://developer.api.autodesk.com/modelderivative/v2/designdata/{0}/manifest'.format(urn_cube_base64),
        headers={
            'authorization': '{0} {1}'.format(auth_type, auth_token)
        }
    )
    rj = r.json()
    progress = rj['progress']
    status = rj['status']
    print(status, progress)
    if status in ('success', 'failed', 'timeout'):
        print(json.dumps(rj, indent=2))
        break
    time.sleep(5)
It produces the following output:
{
"hasThumbnail": "false",
"status": "failed",
"derivatives": [
{
"hasThumbnail": "false",
"status": "failed",
"name": "LMV Bubble",
"messages": [
{
"message": "Extractor error code -1073741819",
"code": "TranslationWorker-InternalFailure",
"type": "error"
}
],
"outputType": "svf",
"progress": "complete"
}
],
"region": "US",
"version": "1.0",
"progress": "complete",
"type": "manifest",
"urn": "dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6dG1wX2J1Y2tldDEvY3ViZS4zZHM"
}
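For completeness, the urn in the job request is just the URL-safe base64 encoding (without padding) of the objectId returned when the file was uploaded. A minimal sketch of how urn_cube_base64 is produced, using the objectId visible in the manifest above:

import base64

# objectId returned by the OSS upload step (decoded from the manifest's urn above)
object_id = "urn:adsk.objects:os.object:tmp_bucket1/cube.3ds"

# URL-safe base64 without '=' padding, as expected by the Model Derivative endpoints
urn_cube_base64 = base64.urlsafe_b64encode(object_id.encode()).decode().rstrip("=")
print(urn_cube_base64)  # dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6dG1wX2J1Y2tldDEvY3ViZS4zZHM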
When querying an archival node for transactions with the EXPERIMENTAL_tx_status method, some transactions have no receipts while having receipts_outcome. How is that possible, and how is such a transaction different from others?
If I understand correctly, receipts_outcome are the results of applying receipts. According to the explorer, this transaction has a "Convert Transaction To Receipt" part, so there should be some receipts generated.
According to the documentation:
A Receipt is the only actionable object in the system. When we talk about "processing a transaction" on the NEAR platform, this eventually means "applying receipts" at some point.
A good mental model is to think of a Receipt as a paid message to be executed at the destination (receiver). And a Transaction is an externally issued request to create the Receipt (there is a 1 to 1 relationship).
My query
{
"jsonrpc": "2.0",
"id": "2",
"method": "EXPERIMENTAL_tx_status",
"params": ["7beNxrbHxMRspJWT9NeEVwx719kVcmY9tRdPG9SYro26", "bumbleee99.near"]
}
Response
{
"jsonrpc": "2.0",
"result": {
"status": {
"SuccessValue": ""
},
"transaction": {
"signer_id": "bumbleee99.near",
"public_key": "ed25519:DFM5GRGbpNkk4XkhcFnRUFeKG8a3nzTH8NwZp754pC48",
"nonce": 59080995000003,
"receiver_id": "bumbleee99.near",
"actions": [
{
"AddKey": {
"public_key": "ed25519:CUoNs153GHrPZ9F8HpvhzFr1mwuUFUdGQsRNE2CTNjVH",
"access_key": {
"nonce": 0,
"permission": "FullAccess"
}
}
}
],
"signature": "ed25519:15v34qoyCHSvSL5uLcaPqD9vXvjcPrCaZVStCMms8e58C62z2UHiazwUXzHajPEgdHpwn7s4J9dd5UPmtvzbYgM",
"hash": "7beNxrbHxMRspJWT9NeEVwx719kVcmY9tRdPG9SYro26"
},
"transaction_outcome": {
"proof": [
{
"hash": "ECKDm5FVhzit7Wqs9sEyBB9NtuTrVRZmWwcxkkg2yUh4",
"direction": "Right"
},
{
"hash": "E4VXdwsNj3fZCbP6y9YH3M5oZHPDcdArqU9kbZJa95Qp",
"direction": "Right"
}
],
"block_hash": "ASY6HgDUQUXUa99L7dPEfghKEnEk5SNkwQrx24u3Fobz",
"id": "7beNxrbHxMRspJWT9NeEVwx719kVcmY9tRdPG9SYro26",
"outcome": {
"logs": [],
"receipt_ids": [
"JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
],
"gas_burnt": 209824625000,
"tokens_burnt": "20982462500000000000",
"executor_id": "bumbleee99.near",
"status": {
"SuccessReceiptId": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
},
"metadata": {
"version": 1,
"gas_profile": null
}
}
},
"receipts_outcome": [
{
"proof": [
{
"hash": "8RwCWE9HgqenPKv8JW9eg2iSLMaQW82wvebYSfjPbdTY",
"direction": "Left"
},
{
"hash": "E4VXdwsNj3fZCbP6y9YH3M5oZHPDcdArqU9kbZJa95Qp",
"direction": "Right"
}
],
"block_hash": "ASY6HgDUQUXUa99L7dPEfghKEnEk5SNkwQrx24u3Fobz",
"id": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ",
"outcome": {
"logs": [],
"receipt_ids": [],
"gas_burnt": 209824625000,
"tokens_burnt": "20982462500000000000",
"executor_id": "bumbleee99.near",
"status": {
"SuccessValue": ""
},
"metadata": {
"version": 1,
"gas_profile": []
}
}
}
],
"receipts": []
},
"id": "2"
}
You can see that both transaction_outcome.outcome.receipt_ids and transaction_outcome.outcome.status point to a receipt with ID JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ. I've tried querying the node for this receipt with the EXPERIMENTAL_receipt method like this:
{
"jsonrpc": "2.0",
"id": "2",
"method": "EXPERIMENTAL_receipt",
"params": {"receipt_id": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"}
}
yet the node returns an error indicating that there is no receipt with the given ID:
{
"jsonrpc": "2.0",
"error": {
"name": "HANDLER_ERROR",
"cause": {
"name": "UNKNOWN_RECEIPT",
"info": {
"receipt_id": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
}
},
"code": -32000,
"message": "Server error",
"data": {
"name": "UNKNOWN_RECEIPT",
"info": {
"receipt_id": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
}
}
},
"id": "2"
}
TL;DR: the receipt is a local receipt.
The transaction from your example is a simple AddKey action where the sender is the receiver (remember this, it's important). Processing a transaction happens in two steps:
1. "Execute" the transaction (i.e. convert the transaction into a Receipt)
2. Apply the Receipts
The result of converting the transaction into a receipt is your transaction_outcome:
"outcome": {
"receipt_ids": [
"JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
],
"status": {
"SuccessReceiptId": "JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ"
},
This receipt is about to be applied, and its predecessor_id and receiver_id are equal. In nearcore such receipts are called local receipts (sir, sender-is-receiver), and they are not stored in the nearcore database.
We emulate them on the NEAR Indexer Framework side (that's why you can see receipt JDnBrxh6L9KFgVUEg6U8d39rEUEmbvLQ5tZQUmJTMyFJ on the transaction details page on NEAR Explorer).
And because nearcore doesn't store such receipts in the database, you get UNKNOWN_RECEIPT from the RPC.
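As a practical consequence, before calling EXPERIMENTAL_receipt you can check whether the receipt would be a local one. A minimal sketch (Python, against the public archival RPC; adjust the endpoint to your own node if needed):

import requests

RPC = "https://archival-rpc.mainnet.near.org"  # public archival RPC endpoint

def tx_status(tx_hash, sender):
    payload = {
        "jsonrpc": "2.0",
        "id": "1",
        "method": "EXPERIMENTAL_tx_status",
        "params": [tx_hash, sender],
    }
    return requests.post(RPC, json=payload).json()["result"]

result = tx_status("7beNxrbHxMRspJWT9NeEVwx719kVcmY9tRdPG9SYro26", "bumbleee99.near")
tx = result["transaction"]

if tx["signer_id"] == tx["receiver_id"]:
    # Local (sender-is-receiver) receipt: nearcore does not persist it, so
    # EXPERIMENTAL_receipt would return UNKNOWN_RECEIPT. Use the outcomes
    # already present in this response instead.
    print("local receipt(s):", result["transaction_outcome"]["outcome"]["receipt_ids"])
else:
    print("receipt(s) should be resolvable via EXPERIMENTAL_receipt")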
I am implementing my Alexa Home Skill using AWS Lambda.
Given the following request, which I receive when I try to detect new devices on the Alexa Skill test page:
{directive={header={namespace=Alexa.Discovery, name=Discover, payloadVersion=3, messageId=0160c7e7-031f-47ee-a1d9-a23f38f87a9e}, payload={scope={type=BearerToken, token=...}}}}
I respond with the following:
{
"event": {
"payload": {
"endpoints": [
{
"displayCategories": [
"SMARTPLUG"
],
"capabilities": [
{
"type": "AlexaInterface",
"interface": "Alexa",
"version": "3"
},
{
"type": "AlexaInterface",
"interface": "Alexa.PowerController",
"version": "3",
"properties": {
"retrievable": true,
"supported": [
{
"name": "powerState"
}
],
"proactivelyReported": true
}
},
{
"type": "AlexaInterface",
"interface": "Alexa.EndpointHealth",
"version": "3",
"properties": {
"retrievable": true,
"supported": [
{
"name": "connectivity"
}
],
"proactivelyReported": true
}
}
],
"manufacturerName": "mirko.io",
"endpointId": "ca84ef6d-53b1-430a-8a5e-a62f174eac5e",
"description": "mirko.io forno (id: ca84ef6d-53b1-430a-8a5e-a62f174eac5e)",
"friendlyName": "forno"
}
]
},
"header": {
"payloadVersion": "3",
"namespace": "Alexa.Discovery",
"name": "Discover.Response",
"messageId": "c0555cc8-ad7a-4377-b310-9de9b9ab6282"
}
}
}
Despite that, for some reason Alexa answers that it did not find any new devices.
I may be mistaken, but I am pretty sure this used to work before I decided to add the Alexa.EndpointHealth interface.
Your response object looks right to me, except for the extra "endpoint" field:
"endpoint": {
"endpointId": "INVALID",
"scope": {
"type": "BearerToken",
"token": "INVALID"
}
}
There's no such field in the Alexa.Discovery documentation. Try removing it and see if it resolves the issue.
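For reference, a minimal sketch of a discovery handler that returns only header and payload (Python for brevity; the original skill appears to be Java, and build_endpoints is a hypothetical helper returning the endpoints array shown above):

import uuid

def build_endpoints():
    # Hypothetical helper: return the "endpoints" array from the response above.
    return []

def lambda_handler(event, context):
    header = event["directive"]["header"]
    if header["namespace"] == "Alexa.Discovery" and header["name"] == "Discover":
        return {
            "event": {
                # Discover.Response carries only "header" and "payload";
                # no top-level "endpoint" object belongs here.
                "header": {
                    "namespace": "Alexa.Discovery",
                    "name": "Discover.Response",
                    "payloadVersion": "3",
                    "messageId": str(uuid.uuid4()),
                },
                "payload": {"endpoints": build_endpoints()},
            }
        }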
I have established a "subscription" to updates on a specific "Bundle". For some reason, the webhook is not firing. Is FHIRCast supported on Asymmetrik's FHIR Server? Here is my "subscription" JSON payload:
{
"resourceType" : "Subscription",
"status" : "active",
"contact": [
{
"relationship": [
{
"coding": [
{
"system": "http://terminology.hl7.org/CodeSystem/v2-0131",
"code": "N"
}
]
}
],
"name": {
"family": "du Marché",
"_family": {
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/humanname-own-prefix",
"valueString": "VV"
}
]
},
"given": [
"Bénédicte"
]
},
"telecom": [
{
"system": "phone",
"value": "+33 (237) 998327"
}
],
"address": {
"use": "home",
"type": "both",
"line": [
"534 Erewhon St"
],
"city": "PleasantVille",
"district": "Rainbow",
"state": "Vic",
"postalCode": "3999",
"period": {
"start": "1974-12-25"
}
},
"gender": "female",
"period": {
"start": "2012"
}
}
],
"end" : "2021-02-07T13:28:17.239+02:00",
"reason" : "FHIR web hook",
"criteria" : "Bundle/af03af555d9eb78229619cfeac8767409fd22f72",
"error" : "error note",
"channel" : {
"type" : "rest-hook",
"endpoint" : "https://localhost:5001/api/FHIRNotification",
"payload" : "application/fhir+json",
"header" : [""]
}
}
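For reference, a minimal sketch of how this resource is POSTed to the server (Python just for illustration; the base URL is a placeholder):

import requests

FHIR_BASE = "https://example.org/4_0_0"  # placeholder base URL for the FHIR server

with open("subscription.json") as f:
    subscription_json = f.read()

resp = requests.post(
    "{0}/Subscription".format(FHIR_BASE),
    data=subscription_json.encode("utf-8"),
    headers={"Content-Type": "application/fhir+json"},
)
print(resp.status_code, resp.text)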
Perhaps I am missing an implementation step to add the subscription webhook functionality?
Thanks for the help!
Todd
Currently we do not support FHIRCast out of the box, since this is a facade server. To get FHIRCast working properly you would also probably need triggers or something else set up on the database.
I'm not super familiar with FHIRCast personally, but it does seem like there is some connection to a hub separate from the FHIR server as well.
I am having trouble figuring out the problem with my API call to create a virtual machine through the Google Compute Engine API.
URL: https://cloud.google.com/compute/docs/reference/latest/instances/insert?apix=true#examples
My request data is:
{
"machineType": "zones/us-central1-c/machineTypes/f1-micro",
"name": "api-test",
"networkInterfaces": [
{
"accessConfigs": [
{
"type": "ONE_TO_ONE_NAT",
"name": "External NAT"
}
],
"network": "global/networks/default"
}
],
"disks": [
{
"boot": true,
"autoDelete": true,
"type": "SCRATCH"
}
]
}
and I am getting output:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "backendError",
"message": "Code: '55C355EC47648.A8E5D85.FA0DAF10'"
}
],
"code": 503,
"message": "Code: '55C355EC47648.A8E5D85.FA0DAF10'"
}
}
It doesn't give me any reason for the error. The same issue occurs when I call the API using the Ruby library. Authentication is fine, as I can do various other things like fetching images and listing running instances. Please help me out.
Figured out the problem. You have to set the disk type to "PERSISTENT" rather than "SCRATCH" and specify the disk's initializeParams.sourceImage as an existing image from https://console.cloud.google.com/compute/images, then use it like this in the request body:
{
"name": "api-test3",
"machineType": "zones/us-central1-c/machineTypes/f1-micro",
"networkInterfaces": [
{
"accessConfigs": [
{
"type": "ONE_TO_ONE_NAT",
"name": "External NAT"
}
],
"network": "global/networks/default"
}
],
"disks": [
{
"boot": "true",
"type": "PERSISTENT",
"autoDelete": "true",
"initializeParams": [
{
"sourceImage": "global/images/ubuntu-1404-lts"
}
]
}
]
}
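For reference, a minimal sketch of submitting this body with the Google API Python client (project and zone are placeholders; credentials come from Application Default Credentials):

from googleapiclient import discovery

PROJECT = "my-project"   # placeholder
ZONE = "us-central1-c"

body = {
    "name": "api-test3",
    "machineType": "zones/{0}/machineTypes/f1-micro".format(ZONE),
    "networkInterfaces": [{
        "accessConfigs": [{"type": "ONE_TO_ONE_NAT", "name": "External NAT"}],
        "network": "global/networks/default",
    }],
    "disks": [{
        "boot": True,
        "type": "PERSISTENT",
        "autoDelete": True,
        "initializeParams": {"sourceImage": "global/images/ubuntu-1404-lts"},
    }],
}

compute = discovery.build("compute", "v1")
operation = compute.instances().insert(project=PROJECT, zone=ZONE, body=body).execute()
print(operation["name"])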
I am unable to connect using artillery.io with the engine set to socketio. I get this socket error:
socket error {"type":"Transport error", "description":400}
Here is my scenarios configuration JSON:
"scenarios": [
{
"name": "my test",
"engine": "socketio",
"flow": [
{
"emit": {
"channel": "command",
"namespace": "command"
}
},
{
"think": 1
}
]
}
]
"scenarios": [
{
"name": "my test",
"engine": "socketio",
"flow": [
{
"emit": {
"channel": "command"
"data": "hello"
"namespace": "/command"
}
},
{
"think": 1
}
]
}
]
Please try using the above configuration.
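If it still fails with the transport error, double-check that the scenarios are wrapped in a complete test script with a config section. A minimal sketch (target URL and phase values are placeholders):

{
  "config": {
    "target": "http://localhost:3000",
    "phases": [
      { "duration": 10, "arrivalRate": 1 }
    ]
  },
  "scenarios": [
    {
      "name": "my test",
      "engine": "socketio",
      "flow": [
        {
          "emit": {
            "channel": "command",
            "data": "hello",
            "namespace": "/command"
          }
        },
        { "think": 1 }
      ]
    }
  ]
}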