Mix Request/Response Body and Data Structures - apiblueprint

I would like to use aglio/API Blueprint to create nice documentation for our new API.
The JSON might be quite big (with lots of optional values), so I'd like to give a proper use case in the body, but also use data structures for the JSON schema.
However, whenever the schema exactly matches the body, the resulting HTML throws "Hello, world!" placeholders at me, since I didn't fill in the example data. But since I've got a complete and valid example in the body, I wouldn't have expected aglio to generate Hello World output.
For reference, this is what I would expect to appear in the resulting HTML's body:
{
    "a": "i want this to appear",
    "b": "in my aglio",
    "c": "html file"
}
This is what actually does appear:
{
    "a": "Hello, world!",
    "b": "Hello, world!",
    "c": "Hello, world!",
    "d": "Hello, world!"
}
And this is the raw API Blueprint:
FORMAT: 1A
# JSON Schema
# Test [/post/something]
## A Test [POST]
+ Request (application/json)
    + Attributes (SomeObject)
    + Body

            {
                "a": "i want this to appear",
                "b": "in my aglio",
                "c": "html file"
            }

+ Response 200
# Data Structures
## SomeObject (object)
+ a (string) - A
+ b (string) - B
+ c (string) - C
+ d (string, optional) - I'm optional, yet don't want to appear in the html, only the schema
So, first: is that a valid way to do things? Would you recommend a different approach? Or is this a bug in aglio, given that in Apiary it works as I intend it to?
Thanks!

Found the corresponding GitHub issue; it looks like nothing's wrong with my description:
https://github.com/danielgtaylor/aglio/issues/221

You can do the following:
FORMAT: 1A
# JSON Schema
# Test [/post/something]
JSON Schema Title
## A Test [POST]
+ Request (application/json)
    + Attributes (object)
        + a: a-value (string, required) - description about a
        + b: b-value (number, required) - description about b
        + c: c-value (string, required) - description about c
        + d: [a1, a2, a3] (array, optional) - I'm optional
+ Response 200 (application/json)

        {
            "message": "this works"
        }
+ a: a-value (string, required) - description
In the above line, "a" is the attribute name, "a-value" is the value that appears in the body, "(string, required)" is used for schema generation, and "description" becomes the schema attribute's description.


How to "inspect to file" (or to string) in Elixir?

In Elixir, we can IO.inspect anyStructure to get anyStructure's internals printed to output. Is there a similar method to output it to a file (or, as a more flexible solution, to a string)?
I've looked through some articles on debugging and io but don't see a solution. I've also tried
{:ok, file} = File.open("test.log", [:append, {:delayed_write, 100, 20}])
structure = %{ a: 1, b: 2 }
IO.binwrite(file, structure)
File.close file
but that results in
no function clause matching in IO.binwrite/2 [...]
def binwrite(device, iodata) when is_list(iodata) or is_binary(iodata)
I’ve also googled some "elixir serialize" and "elixir object to string", but haven't found anything useful (like :erlang.term_to_binary which returns, well, binary). Is there a simple way to get the same result that IO.inspect prints, into a file or a string?
There is already an inspect/2 function (not the same as IO.inspect/2); just go with it:
#> inspect({1,2,3})
"{1, 2, 3}"
#> h inspect/2

def inspect(term, opts \\ [])

@spec inspect(
        Inspect.t(),
        keyword()
      ) :: String.t()
Inspects the given argument according to the Inspect protocol. The second
argument is a keyword list with options to control inspection.
You can do whatever you wish with the string afterwards.
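For example, applying it to the failing snippet from the question: inspect/2 turns the structure into a binary, which IO.binwrite/2 accepts:
{:ok, file} = File.open("test.log", [:append, {:delayed_write, 100, 20}])
structure = %{a: 1, b: 2}
# inspect/2 returns a binary, so IO.binwrite/2 no longer raises
IO.binwrite(file, inspect(structure))
File.close(file)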
You can give IO.inspect an additional param to tell it where to write to:
{:ok, pid} = StringIO.open("")
IO.inspect(pid, %{test: "data"}, label: "IO.inspect options work too \o/")
{:ok, {_in, out}} = StringIO.close(pid)
out # "IO.inspect options work too o/: %{test: \"data\"}\n"
It accepts a pid of a process to write to. StringIO provides such a process, returning you a string on close.
In Elixir, we can IO.inspect anyStructure to get anyStructure's internals printed to output.
This is not quite true; IO.inspect uses the Inspect protocol. What you see is not the internals of the struct, but whatever that struct's implementation of the Inspect protocol produces. There are different options you can give to inspect, defined in Inspect.Opts; one of them is structs: false, which will print structs as maps.
For example, inspecting a range struct:
iex> inspect(1..10)
"1..10"
iex> inspect(1..10, structs: false)
"%{__struct__: Range, first: 1, last: 10, step: 1}"
To answer your question and to add to the other answers, here is a method that uses File.open!/3 to reuse an open file and log multiple inspect calls to the same file, then close the file:
File.open!("test.log", [:write], fn file ->
  IO.inspect(file, %{a: 1, b: 2}, [])
  IO.inspect(file, "logging a string", [])
  IO.inspect(file, DateTime.utc_now(), [])
  IO.inspect(file, DateTime.utc_now(), structs: false)
end)
This produces the following test.log file:
%{a: 1, b: 2}
"logging a string"
~U[2022-04-29 09:51:46.467338Z]
%{
  __struct__: DateTime,
  calendar: Calendar.ISO,
  day: 29,
  hour: 9,
  microsecond: {485474, 6},
  minute: 51,
  month: 4,
  second: 46,
  std_offset: 0,
  time_zone: "Etc/UTC",
  utc_offset: 0,
  year: 2022,
  zone_abbr: "UTC"
}
You simply need to combine inspect/2, which returns a binary, with File.write/3 or any other function that dumps to a file.
File.write("test.log", inspect(%{a: 1, b: 2}, limit: :infinity))
Note the limit: :infinity option; without it, long structures are truncated, the default behavior that keeps output readable when inspecting to stdout.
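For example (a quick sketch; the exact truncated form may vary across Elixir versions):
inspect(Enum.to_list(1..100))
# cut off at the default :limit of 50 items, rendered as "[1, 2, 3, ...]"
inspect(Enum.to_list(1..100), limit: :infinity)
# the full hundred-element list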

Using construct_undefined in ruamel from_yaml

I'm creating a custom yaml tag MyTag. It can contain any given valid yaml - map, scalar, anchor, sequence etc.
How do I implement class MyTag to model this tag so that ruamel parses the contents of a !mytag in exactly the same way as it would parse any given yaml? The MyTag instance just stores whatever the parsed result of the yaml contents is.
The following code works, and the asserts should demonstrate exactly what it should do; they all pass.
But I'm not sure if it's working for the right reasons... Specifically, in the from_yaml class method, is using commented_obj = constructor.construct_undefined(node) a recommended way of achieving this, and is consuming one and only one item from the yielded generator correct? It's not just working by accident?
Should I instead be using something like construct_object, or construct_map, or...? The examples I've been able to find tend to know what type they are constructing, so they use construct_map or construct_sequence to pick which type of object to construct. In this case I effectively want to piggy-back on the usual/standard ruamel parsing for whatever unknown type there might be, and just store the result in my own type.
import ruamel.yaml
from ruamel.yaml.comments import CommentedMap, CommentedSeq, TaggedScalar


class MyTag():
    yaml_tag = '!mytag'

    def __init__(self, value):
        self.value = value

    @classmethod
    def from_yaml(cls, constructor, node):
        commented_obj = constructor.construct_undefined(node)
        flag = False
        for data in commented_obj:
            if flag:
                raise AssertionError('should only be 1 thing in generator??')
            flag = True
        return cls(data)


with open('mytag-sample.yaml') as yaml_file:
    yaml_parser = ruamel.yaml.YAML()
    yaml_parser.register_class(MyTag)
    yaml = yaml_parser.load(yaml_file)

custom_tag_with_list = yaml['root'][0]['arb']['k2']
assert type(custom_tag_with_list) is MyTag
assert type(custom_tag_with_list.value) is CommentedSeq
print(custom_tag_with_list.value)

standard_list = yaml['root'][0]['arb']['k3']
assert type(standard_list) is CommentedSeq
assert standard_list == custom_tag_with_list.value

custom_tag_with_map = yaml['root'][1]['arb']
assert type(custom_tag_with_map) is MyTag
assert type(custom_tag_with_map.value) is CommentedMap
print(custom_tag_with_map.value)

standard_map = yaml['root'][1]['arb_no_tag']
assert type(standard_map) is CommentedMap
assert standard_map == custom_tag_with_map.value

custom_tag_scalar = yaml['root'][2]
assert type(custom_tag_scalar) is MyTag
assert type(custom_tag_scalar.value) is TaggedScalar

standard_tag_scalar = yaml['root'][3]
assert type(standard_tag_scalar) is str
assert standard_tag_scalar == str(custom_tag_scalar.value)
And some sample yaml:
root:
  - item: blah
    arb:
      k1: v1
      k2: !mytag
        - one
        - two
        - three-k1: three-v1
          three-k2: three-v2
          three-k3: 123 # arb comment
          three-k4:
            - a
            - b
            - True
      k3:
        - one
        - two
        - three-k1: three-v1
          three-k2: three-v2
          three-k3: 123 # arb comment
          three-k4:
            - a
            - b
            - True
  - item: argh
    arb: !mytag
      k1: v1
      k2: 123
      # blah line 1
      # blah line 2
      k3:
        k31: v31
        k32:
          - False
          - string here
          - 321
    arb_no_tag:
      k1: v1
      k2: 123
      # blah line 1
      # blah line 2
      k3:
        k31: v31
        k32:
          - False
          - string here
          - 321
  - !mytag plain scalar
  - plain scalar
  - item: no comment
    arb:
      - one1
      - two2
In YAML you can have anchors and aliases, and it is perfectly fine to have an object be a child of itself (using an alias). If you want to dump the Python data structure data:
data = [1, 2, 4, dict(a=42)]
data[3]['b'] = data
it dumps to:
&id001
- 1
- 2
- 4
- a: 42
  b: *id001
and for that anchors and aliases are necessary.
When loading such a construct, ruamel.yaml recurses into the nested data structures, but if the top-level node has not yet caused a real object to be constructed that the anchor can reference, the recursive leaf cannot resolve the alias.
To solve that, a generator is used, except for scalar values: it first creates an empty object, then recurses and updates its values. The code calling the constructor checks whether a generator was returned; in that case it calls next() on it, and potential self-recursion is "resolved".
Because you call construct_undefined(), you always get a generator. In practice, that method could return a plain value when it detects a scalar node (which of course cannot recurse), but it doesn't. If it did, your code could not load the following YAML document:
!mytag 1
without modifications that test whether you get a generator or not, as is done in the code in ruamel.yaml that calls the various constructors, so that it can handle both construct_undefined and e.g. construct_yaml_int (which does not return a generator).
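Adding such a test to the question's MyTag class is straightforward (a sketch; the isinstance check on types.GeneratorType is what the calling code inside ruamel.yaml effectively performs):
import types

class MyTag():
    yaml_tag = '!mytag'

    def __init__(self, value):
        self.value = value

    @classmethod
    def from_yaml(cls, constructor, node):
        data = constructor.construct_undefined(node)
        # construct_undefined() currently always returns a generator, but a
        # constructor like construct_yaml_int returns a plain value; checking
        # for a generator handles both cases
        if isinstance(data, types.GeneratorType):
            data = next(data)  # exactly one item is yielded
        return cls(data)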

How to get an array response of a JDBC request in JMeter?

For example, I have a JDBC Request whose response looks like:
X    Y    Z
a1   b1   c1
a2   b2   c2
a3   b3   c3
a4   b4   c4
a5   b5   c5
...  ...  ...
How can I get all the values of X, Y and Z?
Then I have an HTTP request, and I'm going to assert that its response matches the data selected from JDBC.
example response:
[
    {
        "x": "a1",
        "y": "b1",
        "z": "c1"
    },
    {
        "x": "a2",
        "y": "b2",
        "z": "c2"
    },
    {
        "x": "a3",
        "y": "b3",
        "z": "c3"
    },
    {
        "x": "a4",
        "y": "b4",
        "z": "c4"
    },
    {
        "x": "a5",
        "y": "b5",
        "z": "c5"
    },
    {
        "x": "a6",
        "y": "b6",
        "z": "c6"
    },
    {
        "x": "a7",
        "y": "b7",
        "z": "c7"
    },
    {
        "x": "a8",
        "y": "b8",
        "z": "c8"
    },
    ...
]
As per JDBC Request sampler documentation:
If the Variable Names list is provided, then for each row returned by a Select statement, the variables are set up with the value of the corresponding column (if a variable name is provided), and the count of rows is also set up. For example, if the Select statement returns 2 rows of 3 columns, and the variable list is A,,C, then the following variables will be set up:
A_#=2 (number of rows)
A_1=column 1, row 1
A_2=column 1, row 2
C_#=2 (number of rows)
C_1=column 3, row 1
C_2=column 3, row 2
So given you provide "Variable Names" as X,Y,Z you should be able to access the values as ${X_1}, ${Y_2}, etc.
See Debugging JDBC Sampler Results in JMeter for more detailed information on working with JDBC Test Elements results and result sets.
You should fill in the "Variable Names" field and also declare a Result Variable Name (e.g. result1).
Then you can access the values using the _1, _2, ... suffixes. Please find below sample code that you can use in a Beanshell post-processor.
import java.util.ArrayList;
import net.minidev.json.parser.JSONParser;
import net.minidev.json.JSONObject;
import net.minidev.json.JSONArray;

ArrayList items = vars.getObject("result1");
for (int i = items.size() - 1; i >= 0; i--) {
    JSONObject jsonitemElement = new JSONObject();
    jsonitemElement.put("x", vars.get("x_" + (i + 1)));
    jsonitemElement.put("y", vars.get("y_" + (i + 1)));
    jsonitemElement.put("z", vars.get("z_" + (i + 1)));
    log.info(jsonitemElement.toString());
}
Since you are getting these values in the response payload of the HTTP request, you should parse that JSON response in an assertion or post-processor and compare it with the variables shown in the sample code above.
One point to note: different applications may return the target JSON in any order. There is no guarantee that the HTTP response will always list A1,B1,C1 - A2,B2,C2 etc.; it could just as well start with A5,B5,C5. It is better to use a hash map or to write your array comparison so that it verifies the result set completely matches the HTTP response, as in the sketch below.
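A minimal sketch of such an order-independent comparison in a Beanshell Assertion (assuming the JDBC "Variable Names" were set to X,Y,Z as above, the HTTP response is the JSON array from the question, and SampleResult/Failure/FailureMessage are the standard Beanshell Assertion bindings):
import java.util.HashSet;
import net.minidev.json.JSONArray;
import net.minidev.json.JSONObject;
import net.minidev.json.JSONValue;

// collect the JDBC rows as "x|y|z" strings
HashSet expected = new HashSet();
int rows = Integer.parseInt(vars.get("X_#"));
for (int i = 1; i <= rows; i++) {
    expected.add(vars.get("X_" + i) + "|" + vars.get("Y_" + i) + "|" + vars.get("Z_" + i));
}

// collect the HTTP response items the same way
HashSet actual = new HashSet();
JSONArray response = (JSONArray) JSONValue.parse(SampleResult.getResponseDataAsString());
for (int i = 0; i < response.size(); i++) {
    JSONObject item = (JSONObject) response.get(i);
    actual.add(item.get("x") + "|" + item.get("y") + "|" + item.get("z"));
}

// sets ignore ordering, so this matches regardless of row order
if (!expected.equals(actual)) {
    Failure = true;
    FailureMessage = "JDBC result set and HTTP response do not match";
}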

YAML deserializer with position information?

Does anyone know of a YAML deserializer that can provide position information for the constructed objects?
I know how to deserialize a YAML file into a Java object. Simple instructions on http://yamlbeans.sourceforge.net/.
However, I want to do some algorithmic validation on the deserialized object and report errors back to the user, pointing to the position in the YAML that caused the error.
Example:
=========YAML file==========
name: Nathan Sweet
age: 28
address: 4011 16th Ave S
=======JAVA class======
public class Contact {
    public String name;
    public int age;
    public String address;
}
Imagine I want to first load the YAML into the Contact class and then validate the address against some repository, and report an error back if it's invalid. Something like:
'Line 3 Column 9: The address does not match valid entry in the database'
The problem is, currently there is no way to get the position inside a deserialized object from YAML.
Anyone know a solution to this issue?
Most YAML parsers, if they keep any position information around at all, drop it while constructing the language-native objects.
In ruamel.yaml ¹, I keep more information around because I want to be able to round-trip with minimal loss of original layout (e.g. keeping comments and key order in mappings).
I don't keep information on individual key-value pairs, but I do on the "upper-left" position of a mapping². Because of the kept order of the mapping items you can give some rather nice feedback. Given an input file:
- name: anthon
  age: 53
  adres: Rijn en Schiekade 105
- name: Nathan Sweet
  age: 28
  address: 4011 16th Ave S
And a program that you call with the input file as argument:
#! /usr/bin/env python2.7
# coding: utf-8
# http://stackoverflow.com/questions/30677517/yaml-deserializer-with-position-information?noredirect=1#comment49491314_30677517

import sys
import ruamel.yaml

up_arrow = '↑'


def key_error(key, value, line, col, error, e='E'):
    print('E[{}:{}]: {}'.format(line, col, error))
    print('{}{}: {}'.format(' '*col, key, value))
    print('{}{}'.format(' '*(col), up_arrow))
    print('---')


def value_error(key, value, line, col, error, e='E'):
    val_col = col + len(key) + 2
    print('{}[{}:{}]: {}'.format(e, line, val_col, error))
    print('{}{}: {}'.format(' '*col, key, value))
    print('{}{}'.format(' '*(val_col), up_arrow))
    print('---')


def value_warning(key, value, line, col, error):
    value_error(key, value, line, col, error, e='W')


class Contact(object):
    def __init__(self, vals):
        for offset, k in enumerate(vals):
            self.check(k, vals[k], vals.lc.line+offset, vals.lc.col)
        for k in ['name', 'address', 'age']:
            if k not in vals:
                print('K[{}:{}]: {}'.format(
                    vals.lc.line+offset, vals.lc.col, "missing key: "+k
                ))
                print('---')

    def check(self, key, value, line, col):
        if key == 'name':
            if value[0].lower() == value[0]:
                value_error(key, value, line, col,
                            'value should start with uppercase')
        elif key == 'age':
            if value < 50:
                value_warning(key, value, line, col,
                              'probably too young for knowing ALGOL 60')
        elif key == 'address':
            pass
        else:
            key_error(key, value, line, col,
                      "unexpected key")


data = ruamel.yaml.load(open(sys.argv[1]), Loader=ruamel.yaml.RoundTripLoader)
for x in data:
    contact = Contact(x)
giving you E(rrors), W(arnings) and K(eys missing):
E[0:8]: value should start with uppercase
  name: anthon
        ↑
---
E[2:2]: unexpected key
  adres: Rijn en Schiekade 105
  ↑
---
K[2:2]: missing key: address
---
W[4:7]: probably too young for knowing ALGOL 60
  age: 28
       ↑
---
This output you should be able to parse in a calling program, in any language, to give feedback. The check method of course needs adjusting to your requirements. This is not as nice as being able to do all that in the language the rest of your application is in, but it might be better than nothing.
In my experience, handling the above format is certainly simpler than extending an existing (open-source) YAML parser.
¹ Disclaimer: I am the author of that package
² I want to use that kind of information at some point to preserve spurious newlines, inserted for readability
In Python, you can readily write custom Dumper/Loader objects and use them to load (or dump) your yaml code. You can have these objects track the file/line info:
import yaml
from collections import OrderedDict


class YamlOrderedDict(OrderedDict):
    """
    An OrderedDict that was loaded from a yaml file, and is annotated
    with file/line info for reporting about errors in the source file
    """
    def _annotate(self, node):
        self._key_locs = {}
        self._value_locs = {}
        nodeiter = node.value.__iter__()
        for key in self:
            subnode = nodeiter.next()
            self._key_locs[key] = subnode[0].start_mark.name + ':' + \
                str(subnode[0].start_mark.line + 1)
            self._value_locs[key] = subnode[1].start_mark.name + ':' + \
                str(subnode[1].start_mark.line + 1)

    def key_loc(self, key):
        try:
            return self._key_locs[key]
        except (AttributeError, KeyError):
            return ''

    def value_loc(self, key):
        try:
            return self._value_locs[key]
        except (AttributeError, KeyError):
            return ''


# Use YamlOrderedDict objects for yaml maps instead of normal dict
yaml.add_representer(OrderedDict, lambda dumper, data:
                     dumper.represent_dict(data.iteritems()))
yaml.add_representer(YamlOrderedDict, lambda dumper, data:
                     dumper.represent_dict(data.iteritems()))


def _load_YamlOrderedDict(loader, node):
    rv = YamlOrderedDict(loader.construct_pairs(node))
    rv._annotate(node)
    return rv


yaml.add_constructor(yaml.resolver.BaseResolver.DEFAULT_MAPPING_TAG, _load_YamlOrderedDict)
Now when you read a yaml file, any mapping objects will be read as a YamlOrderedDict, which allows looking up the file location of keys in the mapping object. You can also add an iterator method like:
def iter_with_lines(self):
    for key, val in self.items():
        yield (key, val, self.key_loc(key))
...and now you can write a loop like:
for key, value, location in obj.iter_with_lines():
    # iterate through the key/value pairs in a YamlOrderedDict, with
    # the source file location
    pass

Issue in parsing JSON response in Ruby

@response = Typhoeus::Request.get(FOUR_SQUARE_API_SERVER_ADDRESS + 'search?ll=' + current_user.altitude.to_s + "&query=" + params[:query] + FOUR_SQUARE_API_ACESS_CODE)
@venues = ActiveSupport::JSON.decode(@response.body)
@venues['response']['groups'][0]['items'].each do |venue|
  venue['name']                              # working
  venue['name']['location'][0]['address']    # issues
  venue['name']['categories'][0]['id']       # issues
end
Please check inline comments for issues.
First of all, venue['name'] is a scalar, not an array; secondly, venue['location'] (which I think you're trying to access) is not encoded as an array; it's just an object:
location: {
  address: "...",
  city: "...",
  // ...
}
So here you want:
venue['location']
Then, your venue['name']['categories'][0]['id'] will fail because, again, venue['name'] is a scalar; for this one, you want:
venue['categories'][0]['id']
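Putting both corrections together, the loop from the question becomes:
@venues['response']['groups'][0]['items'].each do |venue|
  venue['name']                  # a scalar string, works as-is
  venue['location']['address']   # location is an object, not an array
  venue['categories'][0]['id']   # categories is an array of objects
end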
