Multidimensional array from JavaScript/jQuery to Ruby/Sinatra

How do I pass a two-dimensional array from JavaScript to Ruby, please? I have this on the client side:
function send_data() {
    var testdata = {
        "1": {
            "name": "client_1",
            "note": "bigboy"
        },
        "2": {
            "name": "client_2",
            "note": "smallboy"
        }
    };
    console.log(testdata);
    $.ajax({
        type: 'POST',
        url: 'test',
        dataType: 'json',
        data: testdata
    });
}
and this on the server side:
post '/test' do p params end
but I can't get it right. The best I could get on the server side is something like
{"1"=>"[object Object]", "2"=>"[object Object]"}
I tried to add JSON.stringify on client side and JSON.parse on server side, but the first resulted in
{"{\"1\":{\"name\":\"client_1\",\"note\":\"bigboy\"},\"2\":{\"name\":\"client_2\",\"note\":\"smallboy\"}}"=>nil}
while the latter threw a TypeError: can't convert Hash into String.
Could anyone help, or maybe post a short snippet of correct code, please? Thank you

You may want to build up the JSON manually, on the JavaScript side:
[[{'object':'name1'},{'object':'name2'}],[...],[...]]
This will build an array of arrays with objects.
It may look like this:
var testdata = [
    [{
        "1": {
            "name": "client_1",
            "note": "bigboy"
        }
    }],
    [{
        "2": {
            "name": "client_2",
            "note": "smallboy"
        }
    }]
];
I may have something off here, but this should be close to what it would look like.
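Alternatively, keeping the nested-object shape from the question, a minimal client-side sketch of the JSON.stringify approach could look like the following (assuming the Sinatra route then parses the raw request body itself, e.g. with JSON.parse(request.body.read), instead of relying on params):
function send_data() {
    var testdata = {
        "1": { "name": "client_1", "note": "bigboy" },
        "2": { "name": "client_2", "note": "smallboy" }
    };
    $.ajax({
        type: 'POST',
        url: 'test',
        contentType: 'application/json',  // send the body as JSON, not as form data
        data: JSON.stringify(testdata),
        dataType: 'json'                  // expect a JSON response back
    });
}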

I'm not sure if this will help, but I've got two thoughts: serialize fields and/or iterate the array.
I managed to get a JSON array into ActiveRecord objects by serializing the fields which had to store sub-arrays:
class MyModel < ActiveRecord::Base
serialize :tags
end
and using an iterator to process the JSON array:
f = File.read("myarrayof.json")
jarray = JSON.load(f)
jarray.each { |j| MyModel.create.from_json(j.to_json).save }
The conversion back-and-forth seems a bit cumbersome but I found it the most obvious way to handle the array.

Related

How can I sort a request to a view - using nodejs API

I'm querying a Cloudant DB from my Node.js app.
I am now trying to sort the results of a view query.
My index keys look like this:
[ "FR000001", 1577189089166 ]
[ "FR000001", 1577189089165 ]
etc
from the following view:
function(doc) {
    emit([doc.siteId, doc.creationDate], {
        "id": doc._id,
        "rev": doc._rev,
        "siteId": doc.siteId,
        "creationDate": doc.creationDate,
        "scores": doc.scores,
        locationId: doc.locationId
    });
}
I managed to make that work on a search index using the syntax sort: "-creationDate", which I found in the issues section of the Cloudant GitHub repo.
var ddoc = {
    q: "site:\"" + id + "\"",
    include_docs: false,
    sort: "-creationDate",
};
const tmp = await cloudant.use('alarms').search('alarmSearch', 'IndexBySite', ddoc);
I can't make it work on my view with an array of query parameters. I have tried different variations of:
var ddoc_view = {
    startkey: ["siteid1", 0000000000000],
    endkey: ["siteid1", 9999999999999],
    include_docs: true,
    sort: "creationDate"
};
Can anyone help me find the right syntax, or point me to good "Cloudant API for Node.js" documentation? For instance, there is nothing on how to use sort in the GitHub docs... Thanks...
OK, after another day of searching:
- best documentation I found is directly the couchdb doc: https://docs.couchdb.org/en/stable/ddocs/views/intro.html
- I ended up modifying the view:
emit([doc.creationDate, doc.siteId], {"id" :doc._id, "rev": doc._rev, "siteId": doc.siteId, "locationTag":doc.locationTag});
and the request:
var ddoc_view = {
    endkey: [0000000000000, siteid],
    startkey: [9999999999999, siteid],
    include_docs: false,
    descending: true,
    limit: docsReturned,
};
to get a sorted response.
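For reference, a sketch of issuing that view query through the same Node.js Cloudant client could look like this (the design document name 'alarmViews' and view name 'alarmsByDate' are placeholders, not taken from the original code):
// run the view with the parameters defined above
const result = await cloudant.use('alarms').view('alarmViews', 'alarmsByDate', ddoc_view);
result.rows.forEach(row => console.log(row.key, row.value));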

RxJS Map array to observable and back to plain object in array

I have an array of objects, and I need to pass each object separately into an async method (the process behind it is handled with a Promise and then converted back to an Observable via Observable.fromPromise(...); it has to work this way because the same method is also used whenever just a single object is passed; the process saves the objects into a database). For example, this is an array of objects:
[
    {
        "name": "John",
        ...
    },
    {
        "name": "Anna",
        ...
    },
    {
        "name": "Joe",
        ...
    },
    {
        "name": "Alexandra",
        ...
    },
    ...
]
Now I have a method called insert which inserts an object into the database. The store method of the database instance returns the newly created id. At the end the initial object is copied and given its new id:
insert(user: User): Observable<User> {
    return Observable.fromPromise(this.database.store(user)).map(
        id => {
            let storedUser = Object.assign({}, user);
            storedUser.id = id;
            return storedUser;
        }
    );
}
This works well when I insert a single object. However, I would like to add support for inserting multiple objects by simply calling the single-insert method for each of them. Currently this is what I have, but it doesn't work:
insertAll(users: User[]): Observable<User[]> {
    return Observable.forkJoin(
        users.map(user => this.insert(user))
    );
}
The insertAll method is inserting the users as expected (or something else filled the database with those users), but I don't get any response back from it. While debugging, it seems that forkJoin gets a response only from the first mapped user and the others are ignored. Subscribing to insertAll does not do anything, and there is no error either, whether via catch on insertAll or via the second parameter of subscribe to insertAll.
So I'm looking for a solution where the Observable (in insertAll) would emit back an array of new objects with users in that form:
[
    {
        "id": 1,
        "name": "John",
        ...
    },
    {
        "id": 2,
        "name": "Anna",
        ...
    },
    {
        "id": 3,
        "name": "Joe",
        ...
    },
    {
        "id": 4,
        "name": "Alexandra",
        ...
    },
    ...
]
I would be very happy for any suggestion pointing in the right direction. Thanks in advance!
To convert from array to observable you can use Rx.Observable.from(array).
To convert from observable to array, use obs.toArray(). Notice this does return an observable of an array, so you still need to .subscribe(arr => ...) to get it out.
That said, your code with forkJoin does look correct. But if you do want to try from, write the code like this:
insertAll(users: User[]): Observable<User[]> {
    return Observable.from(users)
        .mergeMap(user => this.insert(user))
        .toArray();
}
Another, more Rx-like way to do this would be to emit values as they complete, instead of waiting for all of them like forkJoin or toArray do. We can just omit the toArray from the previous example and we have it:
insertAll(users: User[]): Observable<User> {
    return Observable.from(users)
        .mergeMap(user => this.insert(user));
}
As #cartant mentioned, the problem might not be in Rx; it might be that your database does not support multiple concurrent connections. In that case, you can replace the mergeMap with concatMap to make Rx send only one request at a time:
insertAll(users: User[]): Observable<User[]> {
    return Observable.from(users)
        .concatMap(user => this.insert(user))
        .toArray(); // still optional
}

Why is an Array in my payload being flattened in Sinatra / Rack::Test?

I'm trying to test a small Sinatra app using RSpec. I want to pass a rather complex payload and am running into issues I do not understand: my payload contains an array of hashes. When I run the actual application this works as expected, yet when I use the post helper to run my tests, the array contains a single merged hash:
post(
  "/#{bot}/webhook",
  sessionId: "test-session-#{session_counter}",
  result: {
    contexts: [
      { some: 'fixture' },
      { name: 'generic', parameters: { facebook_sender_id: 'zuck-so-cool' } }
    ]
  }
)
In the sinatra handler I use params to access this payload:
post '/:bot/webhook' do |bot|
  do_something_with(params)
end
When I look at the structure of params while running the test suite, I see the following:
[{"some" => "fixture", "name" => "generic", "parameters" => {"facebook_sender_id" => "zuck-so-cool"}}]
which I do not really understand. Is this a syntax issue (me being a Ruby noob), am I using params wrong, or is this a bug?
EDIT: So I found out this is an "issue" with the way Rack::Test serializes the given payload when you do not specify how to (i.e. it serializes it as form data). If I pass JSON and set the correct headers it does what I expect:
post(
  "/#{bot}/webhook",
  {
    sessionId: "test-session-#{session_counter}",
    result: {
      contexts: [
        { some: 'fixture' },
        { name: 'generic', parameters: { facebook_sender_id: 'zuck-so-cool' } }
      ]
    }
  }.to_json,
  { 'HTTP_ACCEPT' => 'application/json', 'CONTENT_TYPE' => 'application/json' }
)
Still, I am unsure whether this is an issue with the passed data structure not being serializable into form data, or a bug in the way Rack::Test serializes data.
Looking at the relevant portion of the specs, it looks like this is expected behavior.

Parse.com manipulate Response Object

I am trying to get Ember working with Parse.com using the ember-model-parse-adapter by samharnack.
I added a function to make multi-word search (like a search engine) work, for which I have defined a function in Cloud Code using Parse.Cloud.define and run it from the client.
The problem is that the array my cloud response returns is not compatible with Ember Model because of two attributes: __type and className. How can I modify the response to get one similar to what I get when I run a find query from the client, i.e. without __type and className?
Example responses
for App.List.find():
{
    "results": [
        {
            "text": "zzz",
            "words": [
                "zzz"
            ],
            "createdAt": "2013-06-25T16:19:04.120Z",
            "updatedAt": "2013-06-25T16:19:04.120Z",
            "objectId": "L1X55krC8x"
        }
    ]
}
for App.List.cloudFunction("sliptSearch",{"text" : this.get("searchText")})
{
    "results": [
        {
            "text": "zzz",
            "words": [
                "zzz"
            ],
            "createdAt": "2013-06-25T16:19:04.120Z",
            "updatedAt": "2013-06-25T16:19:04.120Z",
            "objectId": "L1X55krC8x",
            "__type": Object,     //undesired
            "className": "Lists"  //undesired
        }
    ]
}
Thanks Vlad, something like this worked for me for the array:
resultobj = [];
searchListQuery.find({
    success: function(results) {
        for (var i = 0, l = results.length; i < l; i++) {
            temp = results.pop();
            resultobj.push({
                text: temp.get("text"),
                createdAt: temp.createdAt,
                updatedAt: temp.updatedAt,
                objectId: temp.id,
                words: "",
                hashtags: ""
            });
        }
    }
});
In your cloud code, before you send any response, create an object, copy into it the attributes/members you need from the result, and then respond with it, like so:
// let's say result is some Parse.User or any other Parse.Object
function(result) {
    var responseObj = {};
    responseObj.name = result.get("name");
    responseObj.age = result.get("age");
    responseObj.id = result.id;
    response.success(responseObj);
}
On the response side you will get {"result": {"name": "jhon", "age": "26", "id": "zxc123s21"}}.
Hope this helps you.
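Putting it together, a rough sketch of the whole cloud function (the query and field names here are illustrative, not taken from the original code) might be:
Parse.Cloud.define("sliptSearch", function(request, response) {
    var query = new Parse.Query("Lists");
    // illustrative query: match all words from the search text
    query.containsAll("words", request.params.text.split(" "));
    query.find({
        success: function(results) {
            // build plain objects so the response carries no __type / className
            var plain = results.map(function(item) {
                return {
                    text: item.get("text"),
                    words: item.get("words"),
                    createdAt: item.createdAt,
                    updatedAt: item.updatedAt,
                    objectId: item.id
                };
            });
            response.success(plain);
        },
        error: function(error) {
            response.error(error);
        }
    });
});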

Translating JSON into custom dijit objects

I am looking for an example where JSON constructed on the server side is used to represent objects that are then translated into customized widgets in Dojo. The JSON would have to be very specific in its structure, so it would not be a very general solution. Could someone point me to an example of this? It would essentially be the reverse of this:
http://docs.dojocampus.org/dojo/formToJson
First of all let me point out that JSON produced by dojo.formToJson() is not enough to recreate the original widgets:
{"field1": "value1", "field2": "value2"}
field1 can be literally anything: a checkbox, a radio button, a select, a text area, a text box, or anything else. You have to be more specific about which widgets to use to represent the fields. And I am not even touching the whole UI presentation layer: placement, styling, and so on.
But it is possible to a certain degree.
If we want to use Dojo widgets (Dijits), we can leverage the fact that they all are created uniformly:
var myDijit = new dijit.form.DijitName(props, node);
In this line:
dijit.form.DijitName is the dijit's class.
props is an object of dijit-specific properties.
node is the anchor node where this dijit is placed. It is optional, and you don't need to specify it, but at some point you will have to insert your dijit manually.
So let's encode this information as a JSON string taking this dijit snippet as an example:
var myDijit = new dijit.form.DropDownSelect({
    options: [
        { label: 'foo', value: 'foo', selected: true },
        { label: 'bar', value: 'bar' }
    ]
}, "myNode");
The corresponding JSON can be something like that:
{
    type: "DropDownSelect",
    props: {
        options: [
            { label: 'foo', value: 'foo', selected: true },
            { label: 'bar', value: 'bar' }
        ]
    },
    node: "myNode"
}
And the code to parse it:
function createDijit(json){
    if(!json.type){
        throw new Error("type is missing!");
    }
    var cls = dojo.getObject(json.type, false, dijit.form);
    if(!cls){
        // we couldn't find the type in dijit.form
        // dojox widget? custom widget? let's try the global scope
        cls = dojo.getObject(json.type, false);
    }
    if(!cls){
        throw new Error("cannot find your widget type!");
    }
    var myDijit = new cls(json.props, json.node);
    return myDijit;
}
That's it. This snippet correctly handles the dot notation in types, and it is smart enough to check the global scope too, so you can use JSON like that for your custom dijits:
{
    type: "my.form.Box",
    props: {
        label: "The answer is:",
        value: 42
    },
    node: "answer"
}
You can treat DOM elements the same way by wrapping the dojo.create() function, which unifies the creation of DOM elements:
var myWidget = dojo.create("input", {
    type: "text",
    value: "42"
}, "myNode", "replace");
Obviously you can specify any placement option, or no placement at all.
Now let's repeat the familiar procedure and create our JSON sample:
{
    tag: "input",
    props: {
        type: "text",
        value: 42
    },
    node: "myNode",
    pos: "replace"
}
And the code to parse it is straightforward:
function createNode(json){
    if(!json.tag){
        throw new Error("tag is missing!");
    }
    var myNode = dojo.create(json.tag, json.props, json.node, json.pos);
    return myNode;
}
You can even categorize JSON items dynamically:
function create(json){
    if("tag" in json){
        // this is a node definition
        return createNode(json);
    }
    // otherwise it is a dijit definition
    return createDijit(json);
}
You can represent your form as an array of JSON snippets we defined earlier and go over it creating your widgets:
function createForm(array){
    dojo.forEach(array, create);
}
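For example, a small (purely illustrative) form description mixing a dijit definition and a plain DOM node definition could be passed to createForm() like this:
var formDef = [
    {
        type: "TextBox",                        // resolved against dijit.form
        props: { name: "answer", value: "42" },
        node: "answerNode"
    },
    {
        tag: "button",                          // handled by dojo.create()
        props: { innerHTML: "Send", type: "submit" },
        node: "buttonNode",
        pos: "last"
    }
];
createForm(formDef);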
All functions are trivial and essentially one-liners — just how I like it ;-)
I hope it'll give you something to build on your own custom solution.

Resources