org.geotools.mbstyle.parse.MBFormatException: "layers" requires JSONArray - geoserver

GeoServer 2.14.0 is installed on Win7 and I'm trying to use MBStyle styling of layers. I get the error "org.geotools.mbstyle.parse.MBFormatException: "layers" requires JSONArray" when using an MBStyle on a layer.
I tried installing java.util.jar in (Program Files)\GeoServer 2.14.0\webapps\geoserver\WEB-INF\lib, but nothing changed.
What am I missing?
Log output:
org.geotools.mbstyle.parse.MBFormatException: "layers" requires JSONArray
    at org.geotools.mbstyle.parse.MBObjectParser.getJSONArray(MBObjectParser.java:245)
    at org.geotools.mbstyle.MBStyle.layers(MBStyle.java:135)
    at org.geotools.mbstyle.MBStyle.transform(MBStyle.java:347)
    at org.geotools.mbstyle.MapBoxStyle.parse(MapBoxStyle.java:53)
    at org.geoserver.community.mbstyle.MBStyleHandler.convertToSLD(MBStyleHandler.java:121)
    at org.geoserver.community.mbstyle.MBStyleHandler.parse(MBStyleHandler.java:100)

It means that you need to provide an array in the "layers" element of your style, such as:
{
  "version": 8,
  "name": "point-circle-test",
  "layers": [
    {
      "id": "point",
      "type": "circle",
      "paint": {
        "circle-radius": 3,
        "circle-color": "#FF0000",
        "circle-pitch-scale": "map"
      }
    }
  ]
}
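By contrast, a style whose "layers" member is not a JSON array is exactly what triggers this exception, for example the hypothetical snippet below where it is a single object:
{
  "version": 8,
  "name": "point-circle-test",
  "layers": {
    "id": "point",
    "type": "circle"
  }
}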

Related

FHIR - Contained Resources and Referencing

I am still new to FHIR and trying to connect the dots.
If I have a resource that contains other resources, can I refer to a contained resource by element name (#myElementName), or do I need to use the contained resource's id (#myDeviceId)?
I've included sample code below. What I would like to accomplish is a Basic resource that has two extensions: TestConfiguration (Device) and DigitalSample (ImagingStudy). I would like both of these resources to be contained.
PS: I generated the code below using custom classes and the .NET API.
Thank you very much!
{
  "resourceType": "TestInput",
  "contained": [
    {
      "resourceType": "TestConfiguration",
      "id": "TestConfigurationId",
      "contained": [
        {
          "resourceType": "DeviceDefinition",
          "modelNumber": "ABC123"
        }
      ],
      "definition": {
        "reference": "#definition"
      }
    },
    {
      "resourceType": "DigitalSample",
      "id": "DigitalSampleId"
    }
  ],
  "extension": [
    {
      "url": "http://MyOrganization.com/fhir/R4/StructureDefinition/Basic-TestConfiguration",
      "valueReference": {
        "reference": "#testConfiguration"
      }
    },
    {
      "url": "http://MyOrganization.com/fhir/R4/StructureDefinition/Basic-DigitalSample",
      "valueReference": {
        "reference": "#digitalSampleId"
      }
    }
  ]
}
Every local reference must point to an id of a contained resource.
In your case it should be:
"reference": "#TestConfigurationId"
"reference":"#DigitalSampleId"
Always check https://www.hl7.org/fhir/ for what you need to do, and always check the FHIR version.

Latest polkadot.js.org does not connect to the Pirl Coin (Substrate 2) network

Can you inspect this and explain the problem, so that I don't get the error?
This is the Pirl source code:
https://github.com/pirl/pirl-2_0
(at v0.8.25-ad031f3)
This is the Pirl polkadot.js.org clone (api v2.2.2-2, apps v0.62.2-2; check the top right of the linked page):
https://dashboard.pirl.network/
custom endpoint: wss://rpc.pirl.network
When I try to transfer coins I get an error.
Pirl has their own (now stale and seemingly unmaintained) fork of the Polkadot UI:
https://github.com/pirl
I would recommend using their own products and also reading their docs, which detail how to use it properly.
EDIT: It looks like there is a solution here: https://github.com/paritytech/subport/issues/139
It worked after I set the custom types JSON (Julien, #masterdubs, a Pirl coin developer, gave me this JSON; it is set under Settings > Developer in the apps UI):
{
  "Address": "AccountId",
  "LookupSource": "AccountId",
  "Account": {
    "nonce": "U256",
    "balance": "U256"
  },
  "Transaction": {
    "nonce": "U256",
    "action": "String",
    "gas_price": "u64",
    "gas_limit": "u64",
    "value": "U256",
    "input": "Vec",
    "signature": "Signature"
  },
  "Signature": {
    "v": "u64",
    "r": "H256",
    "s": "H256"
  },
  "Keys": "SessionKeys5"
}

"Objects in arrays are not well supported" error observed with the ELK Docker image

I'm using the latest ELK image for a Kibana dashboard. I have a JSON file that contains arrays of objects, and I'm not able to show those as fields in Kibana; it shows the "objects in arrays are not well supported" error message.
As per the Kibana documentation I went through the link below, but I didn't find anything useful for the ELK Docker image:
https://github.com/istresearch/kibana-object-format
I tried to run the command
Run bin/kibana-plugin install <package.zip>
but it returned "run is an unknown command". I removed "Run" and ran the remaining command, but it says that's invalid.
I'm using a Linux box and Kibana version 7.3.
Is it possible to overcome this issue? How do I deploy that plugin for the ELK image, or is there any other way to make those array objects available as fields in Kibana?
I'm not sure how to proceed. Please help me.
Sample Data:
{
  "expand": "schema,names",
  "startAt": 0,
  "maxResults": 50,
  "total": 4,
  "issues": [
    {
      "expand": "operations,versionedRepresentations,editmeta,changelog,renderedFields",
      "id": "1999875",
      "self": "https://amazon.kindle.com/jira/rest/api/2/issue/1999875",
      "key": "KINDLEAMZ-67578",
      "fields": {
        "summary": "contingency is displaying for confirmed card.",
        "priority": {
          "name": "P1",
          "id": "1"
        },
        "created": "2019-09-23T11:25:21.000+0000"
      }
    },
    {
      "expand": "operations,versionedRepresentations,editmeta,changelog,renderedFields",
      "id": "2019428",
      "self": "https://amazon.kindle.com/jira/rest/api/2/issue/2019428",
      "key": "KINDLEAMZ-68661",
      "fields": {
        "summary": "card",
        "priority": {
          "name": "P1",
          "id": "1"
        },
        "created": "2019-09-23T11:25:21.000+0000"
      }
    },
    {
      "expand": "operations,versionedRepresentations,editmeta,changelog,renderedFields",
      "id": "2010958",
      "self": "https://amazon.kindle.com/jira/rest/api/2/issue/2010958",
      "key": "KINDLEAMZ-68167",
      "fields": {
        "summary": "Test Card",
        "priority": {
          "name": "P1",
          "id": "1"
        },
        "created": "2019-09-23T11:25:21.000+0000"
      }
    }
  ]
}
I just want to fetch the key, summary, and priority from each element of the issues array above, but it's not working as expected: when I try to make a field, it shows up as an array in Kibana. If this is not working with 7.3.0, should I downgrade to a lower version? The steps for Docker users are missing from that document. Is there any way to get those details?
Checking here: https://github.com/istresearch/kibana-object-format/releases, it looks like the plugin's latest release was for Elasticsearch 6.3. I guess that is the reason for your error.
I'm not sure there's a fix for this in Kibana. There are many issues on this subject that have been open for a long time, like https://github.com/elastic/kibana/issues/3333.
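One possible workaround (not part of the answer above, just a sketch of a common pattern): Kibana handles flat documents without any problem, so you can reshape the data at ingest time so that each element of issues becomes its own document, for example:
{
  "key": "KINDLEAMZ-67578",
  "summary": "contingency is displaying for confirmed card.",
  "priority": "P1",
  "created": "2019-09-23T11:25:21.000+0000"
}
With documents shaped like this, key, summary and priority show up as ordinary fields in Kibana without any plugin.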

Missing elements for a Google Cloud Video Annotation request

I am trying to run annotation on a video using the Google Cloud Video Intelligence API. With annotation requests for just one feature (i.e., one of "LABEL_DETECTION", "SHOT_CHANGE_DETECTION" or "EXPLICIT_CONTENT_DETECTION"), things work fine. However, when I request an annotation with two or more features at the same time, the response does not always return all the requested feature fields. For example, here is a request I ran recently using the API explorer:
{
  "features": [
    "EXPLICIT_CONTENT_DETECTION",
    "LABEL_DETECTION",
    "SHOT_CHANGE_DETECTION"
  ],
  "inputUri": "gs://gccl_dd_01/Video1"
}
The operation ID I got back is "us-east1.11264560501473964275". When I run a GET with this ID, I get the following response:
200
{
  "name": "us-east1.11264560501473964275",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.videointelligence.v1.AnnotateVideoProgress",
    "annotationProgress": [
      {
        "inputUri": "/gccl_dd_01/Video1",
        "progressPercent": 100,
        "startTime": "2018-08-06T17:13:58.129978Z",
        "updateTime": "2018-08-06T17:18:01.274877Z"
      },
      {
        "inputUri": "/gccl_dd_01/Video1",
        "progressPercent": 100,
        "startTime": "2018-08-06T17:13:58.129978Z",
        "updateTime": "2018-08-06T17:14:39.074505Z"
      },
      {
        "inputUri": "/gccl_dd_01/Video1",
        "progressPercent": 100,
        "startTime": "2018-08-06T17:13:58.129978Z",
        "updateTime": "2018-08-06T17:16:23.230536Z"
      }
    ]
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.videointelligence.v1.AnnotateVideoResponse",
    "annotationResults": [
      {
        "inputUri": "/gccl_dd_01/Video1",
        "segmentLabelAnnotations": [
          ...
        ],
        "shotLabelAnnotations": [
          ...
        ],
        "shotAnnotations": [
          ...
        ]
      }
    ]
  }
}
The done parameter of the response is set to true, but it does not have any field containing the annotations for explicit content.
This issue seems to occur at random to my novice eyes. The API returns a response with all parameters on some occasions and is missing one on others. I am wondering if there is anything I am missing here or something on my end that is causing this?
I did some tests using just LABEL_DETECTION, just EXPLICIT_CONTENT_DETECTION, and all three of them.
As I am not using videos with explicit content, I don't see any specific field when adding just EXPLICIT_CONTENT_DETECTION:
{
  "name": "europe-west1.462458490043912485",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.videointelligence.v1.AnnotateVideoProgress",
    "annotationProgress": [
      {
        "inputUri": "/cloud-ml-sandbox/video/chicago.mp4",
        "startTime": "2018-08-07T14:18:40.086713Z",
        "updateTime": "2018-08-07T14:18:40.230351Z"
      }
    ]
  }
}
Can you share a specific video sample, the request.json used and two different outputs, please?
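For reference, when the API does return explicit content results, the corresponding entry in annotationResults contains an explicitAnnotation block roughly like the following (illustrative values only, not output from these runs):
"explicitAnnotation": {
  "frames": [
    {
      "timeOffset": "1.200s",
      "pornographyLikelihood": "VERY_UNLIKELY"
    }
  ]
}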

Kartograph SVG map is empty when using "polygons" bounds mode

I'm trying to generate SVG maps from the GEOFLA shapefiles.
Using the 'bbox' bounds mode and setting the bbox values manually works well:
{
  "layers": [{
    "id": "depts",
    "src": "data/DEPARTEMENTS/DEPARTEMENT.shp",
    "filter": {"CODE_REG": "24"},
    "simplify": {
      "method": "distance",
      "tolerance": 8
    },
    "attributes": "all"
  }],
  "bounds": {
    "mode": "bbox",
    "data": [-4.5, 42, 8, 48]
  },
  "export": {
    "width": 600,
    "ratio": 0.8
  }
}
But when I set the bounds mode to 'polygons', I get an empty SVG map:
{
  "layers": [{
    "id": "depts",
    "src": "data/DEPARTEMENTS/DEPARTEMENT.shp",
    "filter": {"CODE_REG": "24"},
    "simplify": {
      "method": "distance",
      "tolerance": 8
    },
    "attributes": "all"
  }],
  "bounds": {
    "mode": "polygons",
    "data": {
      "layer": "depts"
    },
    "padding": 0.06
  },
  "export": {
    "width": 600,
    "ratio": 0.8
  }
}
I had a look at the Kartograph source and noticed that the "get_features" method in "map.py" returns a Polygon whose coordinates do not intersect with the feature geometries previously extracted from the shapefile.
Each feature is then thrown away in the "get_features" method of the "maplayer.py" file when checking whether the feature geometry intersects with the "layer.map.view_poly" property.
I had a similar problem with the GEOFLA file projection.
The solution I found is basically to change the shapefile's projection using QGIS. My idea was to use the projection of the example shapefile given in the installation guide, which worked for me.
1. Get the example shapefile from the Kartograph installation page.
2. Load this vector layer in QGIS.
3. Add your GEOFLA layer in QGIS.
4. Right-click on the GEOFLA layer and choose the "Save as..." menu.
5. In the save window, give your layer a new name (e.g. DEPARTEMENT_WGS84.shp).
6. Click the CRS button and select the test layer's projection (WGS 84 / EPSG:4326).
7. Click OK.
Check that the new shapefile has the correct projection:
cat DEPARTEMENT_WGS84.prj
GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137,298.257223563]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]]
Now your script should work fine using the new shapefile.
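In the config from the question, that just means pointing the layer's "src" at the reprojected file, for example:
"layers": [{
  "id": "depts",
  "src": "data/DEPARTEMENTS/DEPARTEMENT_WGS84.shp",
  "filter": {"CODE_REG": "24"},
  "simplify": {
    "method": "distance",
    "tolerance": 8
  },
  "attributes": "all"
}]
The "bounds" and "export" sections stay the same as in the 'polygons' config above.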
