JSONata: method or function for testing JSON value datatype - jsonata

JSONata offers conditional expressions and predicates which can be used to select values out of JSON trees.
However, I have not been able to find a way to test the datatype of a JSON value.
For example, given the array:
[null, true, false, 1, 2.3, "a", ["x"], {}, {"y" : "z"}]
I only want to pull out the numeric values.
[1, 2.3]
Q: In a JSONata query, how does one test the JSON datatype (null, boolean, number, string, array, object) of a value?

Currently there is no way to do this in JSONata. Worthy of an enhancement request though.

Wow, I discovered this cool JSONata today. Here is my try:
http://try.jsonata.org/
[null, true, false, 1, 2.3, "a", ["x"], {}, {"y" : "z"}]
*[$ ~> /^[0-9\.]{1,}$/m]

JSONata offers the $type function to check the datatype of a JSON value. For example, the following snippet will return the datatypes of the values in the invoices example data at https://try.jsonata.org.
Account.Order.Product.{
  "priceType": $type(Price),
  "productNameType": $type($.'Product Name'),
  "descriptionType": $type(Description)
}
The result is:
[
  {
    "priceType": "number",
    "productNameType": "string",
    "descriptionType": "object"
  },
  {
    "priceType": "number",
    "productNameType": "string",
    "descriptionType": "object"
  },
  {
    "priceType": "number",
    "productNameType": "string",
    "descriptionType": "object"
  },
  {
    "priceType": "number",
    "productNameType": "string",
    "descriptionType": "object"
  }
]
By changing the values of Price and Product Name to null in the example JSON, the result for that particular object will change to:
{
  "priceType": "null",
  "productNameType": "null",
  "descriptionType": "object"
},
One can check for 'null', 'number', 'string', 'array', 'object', 'boolean'.
In my case, I've used it to check for null values when converting dates to milliseconds:
'enddate': $type($v.enddate) = 'null' ? null : $toMillis($v.enddate),
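To answer the original question directly, $type can also be used as a filter. A minimal sketch, assuming a JSONata version that provides $type, applied to the array from the question:
$filter($, function($v) { $type($v) = "number" })
This should keep only the numeric entries, i.e. [1, 2.3].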

You can check whether a value is a number by evaluating value - value = 0. If the value is a number, the result will always be 0, so the expression evaluates to true. If it is a string, it will raise an error.

Related

Why does Swagger selection of enum contain an unwanted empty string option?

Trying to get rid of the 'empty' value from the enum list in Swagger UI.
My field options is a list of enum values, and null is not part of it. My OpenAPI 3 specification is as below:
"description": "number of options",
"explode": true,
"in": "query",
"name": "options",
"required": false,
"schema": {
"items": {
"enum": [
1,
2,
3
],
"type": "string"
},
"type": "array"
},
"style": "form"
I can make the empty option disappear if I make the field required, but I don't want this. The field should either be nullable or be one of the enum values. The reason is that, if the 'empty' value is selected, the request goes like
http://localhost?options=&...
This brings up an HTTP 400 result with the message
"The value '' is invalid."
I don't want this '--' entry to be listed at all. The field should be nullable (nothing selected) or be part of the enum values.

JSON Schema - array with multiple different object types

I'm writing a schema using draft/2020-12 to validate an array of multiple objects. I will be using YAML for the examples for readability (and it's YAML documents that are going to be validated).
I've seen many different approaches but none seem to do a very good job for some reason.
The data to validate
pipeline:
  - foo: abc
    someOption: 123
  - baz: def
    anotherOption: 223
  - foo: ghi
As you can see, the pipeline array contains objects that have completely different properties. The same object types can appear multiple times (there are 2 instances of an object containing foo; they are to be seen as the same object type).
Schema
$id: 'https://example.com/schema'
$schema: 'https://json-schema.org/draft/2020-12/schema'
type: object
properties:
  pipeline:
    type: array
    additionalItems: false
    anyOf:
      - items:
          type: object
          additionalProperties: false
          properties:
            foo:
              type: string
            someOption:
              type: integer
          required: [foo]
      - items:
          type: object
          additionalProperties: false
          properties:
            baz:
              type: string
            anotherOption:
              type: integer
          required: [baz]
To me, this looks correct, although it fails validation and I can't quite see why...
Even if it had worked, I'm not sure if I'm supposed to use anyOf or oneOf in this case. Is it matching across all array items or per individual array item?
You have it slightly wrong. Currently your schema says: for the pipeline array, either all of the items must be foo, or all of the items must be baz.
You need to swap the locations of anyOf and items:
items applies to every item in the array.
anyOf checks that any of the subschemas are valid against the instance location (in this case, each item in the array).
{
  "$id": "https://example.com/schema",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "pipeline": {
      "type": "array",
      "items": {
        "anyOf": [
          {
            "type": "object",
            "additionalProperties": false,
            "properties": {
              "foo": {
                "type": "string"
              },
              "someOption": {
                "type": "integer"
              }
            },
            "required": [
              "foo"
            ]
          },
          {
            "type": "object",
            "additionalProperties": false,
            "properties": {
              "baz": {
                "type": "string"
              },
              "anotherOption": {
                "type": "integer"
              }
            },
            "required": [
              "baz"
            ]
          }
        ]
      }
    }
  }
}
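Since the question uses YAML, the same corrected schema can also be written in YAML form (a direct translation of the JSON above):
$id: 'https://example.com/schema'
$schema: 'https://json-schema.org/draft/2020-12/schema'
type: object
properties:
  pipeline:
    type: array
    items:
      anyOf:
        - type: object
          additionalProperties: false
          properties:
            foo:
              type: string
            someOption:
              type: integer
          required: [foo]
        - type: object
          additionalProperties: false
          properties:
            baz:
              type: string
            anotherOption:
              type: integer
          required: [baz]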

ElasticSearch URI Search null field

I need to create a query via the URI that filters all data between two dates and also matches documents where this date field is null.
For example:
I have the field "creation_date" in some objects; however, I also want the objects that do not have this field to appear in the result.
I tried something similar below:
http://localhost//elasticsearch/channels/channel/_search?q=channel.schedule.creation_date:[2018-06-19 TO 2018-12-22] OR channel.schedule.creation_date: NULL
Comparing the dates works fine; the problem is getting the NULL values.
Edited
Source sample:
"_source": {
"channel": {
"activated": false,
"approved": false,
"content": "Jvjv",
"creation_date": "2018-06-21T13:06:10.000Z",
"facebookLink": "J jv",
"id": "Kvjvjv",
"instagramId": "Jvjv",
"name": "Kbkbkvk",
"ownerId": "sZtxdhiNbNY9sr2DtiCzlgJfsqb2",
"plan": 0,
"purpose": "Jvjv",
"recurrence": 1,
"segment": "Jvjvjv",
"twitterId": "Jvjv",
"youtubeId": "Jvj"
}
}
}
You can do this using the NOT(_exists_:field_name) constraint. Can you try this?
http://localhost//elasticsearch/channels/channel/_search?q=channel.schedule.creation_date:[2018-06-19 TO 2018-12-22] OR NOT(_exists_:channel.schedule.creation_date)
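If a request-body search is an option, here is a rough sketch of the equivalent bool query (the field name and date range are taken from the question; this is an assumption about what fits your setup, not part of the original answer):
{
  "query": {
    "bool": {
      "should": [
        { "range": { "channel.schedule.creation_date": { "gte": "2018-06-19", "lte": "2018-12-22" } } },
        { "bool": { "must_not": { "exists": { "field": "channel.schedule.creation_date" } } } }
      ],
      "minimum_should_match": 1
    }
  }
}
The should clause with minimum_should_match: 1 mirrors the OR in the URI query: either the date is in range, or the field does not exist.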

how to make array of objects for grape

I am building APIs using Grape API for a Rails application.
What I am trying right now is this form:
And this is the output:
{
  "page_score_master": {
    "issue_date": "2014-06-23"
  },
  "press_id": "1",
  "print_date": "2014-06-23",
  "product_id": 1,
  "pull_id": 2,
  "press_run_id": 1,
  "total_section": 1,
  "ssa": [
    {
      "ss": {
        "section_name": "A"
      },
      "ss1": {
        "section_name": "B"
      }
    }
  ],
  "foreman_id": 1,
  "pic_id": 1,
  "score_sheet_master_id": 1,
  "score_sheet_sections_attributes": {
    "score_sheet_id": "1"
  },
  "route_info": {
    "options": {
      "description": "create score sheet",
      "params": {
        "page_score_master": {
          "required": true,
          "type": "Hash"
        },
        "page_score_master[issue_date]": {
          "required": true,
          "type": "String"
        },
        "print_date": {
          "required": true,
          "type": "String"
        },
        "total_section": {
          "required": true,
          "type": "Integer"
        },
        "ssa": {
          "required": false,
          "type": "Array"
        },
        "ssa[section_name]": {
          "required": false,
          "type": "String"
        },
        "ssa[total_pages]": {
          "required": false,
          "type": "Integer"
        },
        "ssa[color_pages]": {
          "required": false,
          "type": "String"
        },
        "ssa[score_sheet_id]": {
          "required": false,
          "type": "Integer"
        }
      }
    }
I have omitted part of the JSON to make it shorter.
What I need is an array of ssa, but somehow I have been unable to make it work so far. It makes an array of ssa with only one object.
In my API controller I have the following code:
optional :ssa, type: Array do
  requires :ss, type: Hash do
    optional :section_name, type: String
    optional :total_pages, type: Integer
    optional :color_pages, type: String
    optional :score_sheet_id, type: Integer
  end
end
I think you have 2 problems here.
The first one lies in your form declaration.
In the code, you say you have an Array (called ssa) of Hashes (called ss).
In your form, you're sending a hash called ss1 as part of your 'ssa' Array. The ss1 hash will be ignored, so you'll only have one 'ss' element in your array.
If you rename ss1 to ss in your form:
ssa[][ss][section_name] A
ssa[][ss][section_name] B
you'll get to the second problem, which lies in the API controller definition:
Your controller expects a 'ssa' array that can only have one 'ss' hash element. Thus it will overwrite the first [ss][section_name].
What you want to do is declare ssa as an Array and remove the ss group:
requires :ssa, type: Array do
  optional :section_name, type: String
  optional :total_pages, type: Integer
  optional :color_pages, type: String
  optional :score_sheet_id, type: Integer
end
This will require an array (ssa) of hashes. You don't need to declare the ss group, it already expects an array of hashes with section_name, total_pages, etc. as keys. If ssa is not a required param, simply declare it optional, as you did in your controller.
Then, your form should look like this:
ssa[][section_name] ABC
opportunity[ssa][][total_pages] 3
ssa[][section_name] DEF
opportunity[ssa][][total_pages] 6
This will result in:
:ssa=>
  [{:section_name=>"DEF",
    :total_pages=>3,
    :color_pages=>nil,
    :score_sheet_id=>nil},
   {:section_name=>"HGJK",
    :total_pages=>6,
    :color_pages=>nil,
    :score_sheet_id=>nil}]
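For context, here is a minimal sketch of how that declaration might sit inside a Grape endpoint (the class name, route, and response body are made up for illustration):
class ScoreSheetsAPI < Grape::API
  format :json

  params do
    requires :ssa, type: Array do
      optional :section_name, type: String
      optional :total_pages, type: Integer
      optional :color_pages, type: String
      optional :score_sheet_id, type: Integer
    end
  end
  post '/score_sheets' do
    # params[:ssa] arrives as an array of hashes, one per submitted section
    { received_sections: params[:ssa] }
  end
end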

Ruby: Delete a reoccurring hash in a large nested data structure

I am trying to move data between services and need to remove a reoccurring hash from a large record that contains both hashes and arrays.
The hash to remove from every section of the record is
{
  "description": "simple identifier",
  "name": "id",
  "type": "id"
},
Here's example data:
{"stuff": { "defs": [
{
"description": "simple identifiery",
"name": "id",
"type": "id"
},
{
"name": "aDate",
"type": "date"
},
{
"defs": [
{
"description": "simple identifier",
"name": "id",
"type": "id"
},
{
"case-sensitive": true,
"length": null,
"name": "Id",
"type": "string"
},
{
"name": "anotherDate",
"type": "dateTime"
}
],
},
{
"defs": [
{
"description": "simple identifier",
"name": "id",
"type": "id"
},
...lots more....
I created a couple of recursive functions to remove the element(s), but I'm left with an empty hash '{}'. I also tried to remove the parent, but found that I removed the hash's parent and not the hash itself.
I'm pretty sure I could create a new hash and populate it with the data I want, but there must be a better way to do this.
I am not working in Rails and would like to avoid using Rails gems.
I figured this out by looking more closely at the data structure. The elements that need to be removed are always in an array, so before recursing I check whether the hash key/value exists and delete it if so. I'm sure this could be coded better, so let me know what you think.
def recursive_delete!(node, key, value)
  if node.is_a?(Array)
    # Drop any hash in this array whose key matches the unwanted value,
    # then keep descending into the remaining elements.
    node.delete_if { |elm| elm.is_a?(Hash) && elm[key] == value }
    node.each do |elm|
      recursive_delete!(elm, key, value)
    end
  elsif node.is_a?(Hash)
    # Hashes themselves are never deleted here; only their values are searched.
    node.each_value do |v|
      recursive_delete!(v, key, value)
    end
  end
end
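A hypothetical usage sketch (the input file name is made up), assuming the record has been parsed from JSON so that keys are strings:
require 'json'

# Parse the record and strip every hash whose "name" is "id",
# wherever it appears in the nested structure.
record = JSON.parse(File.read('record.json'))
recursive_delete!(record, 'name', 'id')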
If you are looking for a way to delete an exact copy of a given hash from inside a complex Array/Hash data structure, it's easy:
def remove_hash_from(source, hsh)
  return unless source.is_a?(Hash) || source.is_a?(Array)
  source.each do |*args|
    if args.last == hsh
      source.delete(args.first)
    elsif args.last.is_a?(Hash) || args.last.is_a?(Array)
      remove_hash_from(args.last, hsh)
    end
  end
  source
end
data = [
  {h: 'v',
   j: [{h: 'v'},
       {a: 'c'},
       8,
       'asdf']
  },
  asdf: {h: 'v', j: 'c'}
]
remove_hash_from(data, {h: 'v'})
# => [{:h=>"v", :j=>[{:a=>"c"}, 8, "asdf"]}, {:asdf=>{:h=>"v", :j=>"c"}}]
You may need to adjust the method above for your needs, but I hope the general idea is clear.
