I am trying to create a dialog that displays an object dynamically in an Adaptive Card. Dynamic as in I don't know what the object keys are, how many keys there are, etc.
Assuming that the object will not have nested arrays or other objects (basically the object will look like a map), how can I extract all the keys and put them in an array?
An example of an object that I wish to extract the keys from:
{
"symbol": "MSFT",
"companyName": "Microsoft Corporation",
"primaryExchange": "Nasdaq Global Select",
"sector": "Technology",
"calculationPrice": "close",
"open": 127.42,
"openTime": 1556890200,
"close": 128.9,
"closeTime": 1556913600,
"high": 129.43,
"low": 127.25,
"latestPrice": 128.9,
"latestSource": "Close",
"latestTime": "May 3, 2019",
"latestUpdate": 1556913600,
"latestVolume": 24835154,
"iexRealtimePrice": null,
"iexRealtimeSize": null,
"iexLastUpdated": null,
"delayedPrice": 128.9,
"delayedPriceTime": 1556913600,
"extendedPrice": 129.04,
"extendedChange": 0.14,
"extendedChangePercent": 0.00109,
"extendedPriceTime": 1556917190,
"previousClose": 126.21,
"change": 2.69,
"changePercent": 0.02131,
"iexMarketPercent": null,
"iexVolume": null,
"avgTotalVolume": 22183270,
"iexBidPrice": null,
"iexBidSize": null,
"iexAskPrice": null,
"iexAskSize": null,
"marketCap": 987737229888,
"peRatio": 30.84,
"week52High": 131.37,
"week52Low": 93.96,
"ytdChange": 0.30147812013916003
}
Use the select prebuilt function from Adaptive Expressions.
It would be something like:
select(myobject, x, x.key)
and assign the result to a new property (or use that array however you need).
You can test this out further by using the expressions playground:
https://playgroundclient.azurewebsites.net/
I ran a quick test with the expression:
select(myobject, x, x.key)
And my data/properties as:
{
"myobject": {
"symbol": "MSFT",
"companyName": "Microsoft Corporation",
"primaryExchange": "Nasdaq Global Select",
"sector": "Technology",
"calculationPrice": "close"
}
}
And I get the result:
["symbol","companyName","primaryExchange","sector","calculationPrice"]
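If you would rather evaluate the expression in code than in the playground, a minimal sketch using the adaptive-expressions npm package might look like the following (assuming the package is installed in your bot project; the data object is shortened here):
const { Expression } = require('adaptive-expressions');

// Parse the same expression used above and evaluate it against the data
const expression = Expression.parse('select(myobject, x, x.key)');
const { value, error } = expression.tryEvaluate({
    myobject: {
        symbol: 'MSFT',
        companyName: 'Microsoft Corporation'
    }
});

if (!error) {
    console.log(value); // expected: ["symbol", "companyName"]
}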
Please let me know if I misunderstood your requirement/question. And if this answers your question, please mark it as answered.
This can be done using the Object.keys method.
const myObject = {
"symbol": "MSFT",
"companyName": "Microsoft Corporation",
"primaryExchange": "Nasdaq Global Select",
"sector": "Technology",
}
const keys = Object.keys(myObject);
keys.forEach(key => console.log(key, "->", myObject[key]));
Now if you have an inner object, you need a function that checks whether each value is an object. If it is, you can run Object.keys on it again and push the results into a new array.
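For example, a rough sketch of that recursive idea (a hypothetical helper, not part of any library) could be:
// Collect keys from an object, descending into nested objects as described above
function collectKeys(obj) {
    const keys = [];
    for (const key of Object.keys(obj)) {
        keys.push(key);
        const value = obj[key];
        // If the value is itself a plain object, collect its keys as well
        if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
            keys.push(...collectKeys(value));
        }
    }
    return keys;
}

console.log(collectKeys({ a: 1, b: { c: 2 } })); // ["a", "b", "c"]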
Related
I have the following objects in my database:
[
{
"id": 1,
"name": "foo",
"objType": "A"
},
{
"id": 2,
"name": "bar",
"objType": "B"
}
]
And the following users:
[
{
"id": 3,
"name": "User A",
"role": "admin"
},
{
"id": 4,
"name": "User B",
"role": "client"
}
]
And I have a schema like:
enum ObjTypeEnum {
A
B
}
type MyObj {
id: Int
name: String
objType: ObjTypeEnum
}
type Mutation {
updateObj(id: Int!, name: String): MyObj
}
User A can update any object he wants because he is an admin. However, user B can only update an object if it is of type B.
That means:
If user B tries to update object 2 using the mutation updateObj(2, "new name"), this should be totally OK. However, if he tries to update object 1, updateObj(1, "new name"), this should return an error for this user.
My naïve solution for this is to get the object in the resolver, check its type and, if it is OK for the current user, proceed with the update, otherwise throw an error. But I have the feeling I'm heading in the wrong direction and not using GraphQL properly...
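Roughly, the naïve resolver I have in mind looks like this (the db helpers, the user object, and the role-to-type mapping are placeholders for my real context):
const { ForbiddenError } = require('apollo-server'); // placeholder error type

const resolvers = {
    Mutation: {
        updateObj: async (_, { id, name }, { user, db }) => {
            const obj = await db.findObjById(id); // placeholder data-access helper
            // Admins may update anything; other users only objects of their allowed type
            if (user.role !== 'admin' && obj.objType !== user.allowedObjType) {
                throw new ForbiddenError('You are not allowed to update this object');
            }
            return db.updateObj(id, { name }); // placeholder update helper
        },
    },
};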
Is it possible to do this using directives or something more generic, since the key used to validate the update is an enum?
I'm trying to validate JSON files that have an element with a property whose value should exist in another part of the JSON. I'm using JSON Schema Draft 07.
This is a simple little example that shows the scenario I'm trying to validate in my data.
{
"objects": {
"object1": {
"colorKey": "orange"
}
},
"colors": {
"orange": {
"red": "FF",
"green": "AF",
"blue": "00"
}
}
}
How can I validate that the 'value' of colorKey (in this case 'orange') actually exists as a property of the 'colors' object? The data isn't stored in arrays, just defined properties.
For official JSON Schema...
You cannot check that a key in the data is the same as a value of the data.
You cannot extract the value of data from your JSON instance to use in your JSON Schema.
That being said, ajv, the most popular validator, implements some unofficial extensions, one of which is $data.
Example taken from: https://github.com/epoberezkin/ajv#data-reference
var Ajv = require("ajv");
var ajv = new Ajv({ $data: true });
var schema = {
"properties": {
"smaller": {
"type": "number",
"maximum": { "$data": "1/larger" }
},
"larger": { "type": "number" }
}
};
var validData = {
smaller: 5,
larger: 7
};
ajv.validate(schema, validData); // true
This would not work for anyone else using your schemas.
I am developing a .NET Core 1.1 app in which I am trying to use Accord.NET. According to the examples on this page (Naive Bayes), I need to convert data retrieved from the DB to a DataTable.
The thing is that while using DataTable I got this error:
The type 'DataTable' exists in both 'Shim, ...' and
'System.Data.Common, ...'
Even if I use this:
DataTable learningDataNotCodifiedAsDataTable = new DataTable();
or this:
System.Data.DataTable learningDataNotCodifiedAsDataTable = new System.Data.DataTable();
TG.
While the DataTable is not available in .NET Core 1.1, it is now available in .NET Core 2.0. If you can upgrade your project to .NET Core 2.0, then you will be able to use it in your code.
However, if you cannot switch to .NET Core 2.0 right now, then please note that you are not required to use DataTables with any of the methods in the Accord.NET framework. They are shown just because they offer some extra convenience, but they are not really required, as the example below shows:
string[] columnNames = { "Outlook", "Temperature", "Humidity", "Wind", "PlayTennis" };
string[][] data =
{
new string[] { "Sunny", "Hot", "High", "Weak", "No" },
new string[] { "Sunny", "Hot", "High", "Strong", "No" },
new string[] { "Overcast", "Hot", "High", "Weak", "Yes" },
new string[] { "Rain", "Mild", "High", "Weak", "Yes" },
new string[] { "Rain", "Cool", "Normal", "Weak", "Yes" },
new string[] { "Rain", "Cool", "Normal", "Strong", "No" },
new string[] { "Overcast", "Cool", "Normal", "Strong", "Yes" },
new string[] { "Sunny", "Mild", "High", "Weak", "No" },
new string[] { "Sunny", "Cool", "Normal", "Weak", "Yes" },
new string[] { "Rain", "Mild", "Normal", "Weak", "Yes" },
new string[] { "Sunny", "Mild", "Normal", "Strong", "Yes" },
new string[] { "Overcast", "Mild", "High", "Strong", "Yes" },
new string[] { "Overcast", "Hot", "Normal", "Weak", "Yes" },
new string[] { "Rain", "Mild", "High", "Strong", "No" },
};
// Create a new codification codebook to
// convert strings into discrete symbols
Codification codebook = new Codification(columnNames, data);
// Extract input and output pairs to train
int[][] symbols = codebook.Transform(data);
int[][] inputs = symbols.Get(null, 0, -1); // Gets all rows, and all columns except the last one (the inputs)
int[] outputs = symbols.GetColumn(-1); // Gets only the last column
// Create a new Naive Bayes learning algorithm
var learner = new NaiveBayesLearning();
NaiveBayes nb = learner.Learn(inputs, outputs);
// Consider we would like to know whether one should play tennis at a
// sunny, cool, humid and windy day. Let us first encode this instance
int[] instance = codebook.Translate("Sunny", "Cool", "High", "Strong");
// Let us obtain the numeric output that represents the answer
int c = nb.Decide(instance); // answer will be 0
// Now let us convert the numeric output to an actual "Yes" or "No" answer
string result = codebook.Translate("PlayTennis", c); // answer will be "No"
// We can also extract the probabilities for each possible answer
double[] probs = nb.Probabilities(instance); // { 0.795, 0.205 }
If you have the System.Data assembly in Assemblies and don't want to or can't delete it, you can bypass the conflict by using an extern alias. However, when I bypassed the error this way I got a 'DataTable' does not contain a constructor that takes 0/1 arguments error, and if you believe this discussion, the reason is:
System.Data.DataTable is present in .Net core(1.0,1.1) as an empty class to
complete the interfaces implementation. This issue is to track the
work needed to bring in an API to provide DataTable like API in .Net
Core.
This changed only in .NET Core 2.0; see this SO post. I tried your code in a .NET Core 2.0 project (in VS 2017 15.3) and only then did it work fine.
UPDATE:
I meant these assemblies.
But as you say you have only NuGet packages, you can also use aliases in your csproj file for NuGet packages like below (I used System.Data.Common; you can replace it with your Shim package if needed):
<Target Name="DataAlias" BeforeTargets="FindReferenceAssembliesForReferences;ResolveReferences">
<ItemGroup>
<ReferencePath Condition="'%(FileName)' == 'System.Data.Common'">
<Aliases>MyData</Aliases>
</ReferencePath>
</ItemGroup>
</Target>
and then reference it in C# like this:
extern alias MyData; //1st line in .cs file
...
using MyData::System.Data;
...
DataTable datatable = new DataTable();
But you still won't be able to use it, because you will get the constructor error I mentioned above. Here you have two options to solve this:
Switch to .NET Core 2.0
Try the workaround solution from this post using DbDataReader if it suits you
I'm using Laravel 5.2 and I have Role and Permission models with
Role.php
public function permissions()
{
return $this->hasMany('App\Permissions');
}
And if I call
return Role::with('permissions')->get()
it will return
[{
"id": 2,
"name": "training_vendor",
"display_name": "Training Vendor",
"description": "Role for vendor",
"created_at": "2016-06-23 08:05:47",
"updated_at": "2016-06-23 08:05:47",
"permissions": [
{
"permission_id": 1,
"role_id": 2
},
{
"permission_id": 2,
"role_id": 2
},
{
"permission_id": 3,
"role_id": 2
},
{
"permission_id": 4,
"role_id": 2
},
{
"permission_id": 5,
"role_id": 2
}
]
}]
Is it possible to change the "permissions" structure to something like this?
[{
"id": 2,
"name": "training_vendor",
"display_name": "Training Vendor",
"description": "Role for vendor",
"created_at": "2016-06-23 08:05:47",
"updated_at": "2016-06-23 08:05:47",
"permissions": [1,2,3,4,5]
}]
If you want to get an array (as the title says), use the toArray() method:
return Role::with('permissions')->get()->toArray();
It will convert the collection to an array.
If you need to get custom formatted JSON (as your example shows), use toArray(), then rebuild this array using foreach or the array_map()/array_filter() methods and encode the result into JSON with json_encode().
I would recommend sending the data as is (without rebuilding its structure) to the frontend and working with it there.
In order to extract a Collection of permission IDs you can use the pluck() function:
$permissionIds = $role->permissions->pluck('permission_id');
You can also write a getter function in your Role model:
public function getPermissionIds()
{
return $this->permissions->pluck('permission_id');
}
and use it like:
$permissionIds = $role->getPermissionIds();
You can even override the magic __get() function:
public function __get($attr)
{
if ($attr === 'permissionIds') {
return $this->getPermissionIds();
} else {
return parent::__get($attr);
}
}
and access the permission IDs like an attribute:
$permissionIds = $role->permissionIds;
In Laravel 5.2 the returned value of models is a Collection instance, which can be used to transform the result:
$roles = Role::with('permissions')->get();
$roles->transform(function($role){
$role['permissions'] = $role['permissions']->pluck('permission_id')->toArray();
return $role;
});
return $roles;
This code will provide the desired result.
Note: You can even chain the transform function after the get function.
I hope this answer will help you.
$role = Role::all();
foreach ($role as $index){
    $permissionId = []; // reset per role so IDs don't accumulate across roles
    $permissions = $index->permissions;
    foreach ($permissions as $permission){
        $permissionId[] = $permission->permission_id;
    }
    unset($index->permissions);
    $index['all_permissions'] = $permissionId;
}
return response()->json($role, 200);
This works for me, so check it against your code.
Pretty straightforward (I hope). I'd like to be able to use the API endpoint and have it only return specified fields, i.e. something like this:
http://localhost:1337/api/reference?select=["name"]
Would ideally return something of the form
[{"name": "Ref1"}]
Unfortunately that is not the case, and in actuality it returns the following.
[
{
"contributors": [
{
"username": "aduensing",
"email": "standin#gmail.com",
"lang": "en_US",
"template": "default",
"id_ref": "1",
"provider": "local",
"id": 1,
"createdAt": "2016-07-28T19:39:09.349Z",
"updatedAt": "2016-07-28T19:39:09.360Z"
}
],
"createdBy": {
"username": "aduensing",
"email": "standin#gmail.com",
"lang": "en_US",
"template": "default",
"id_ref": "1",
"provider": "local",
"id": 1,
"createdAt": "2016-07-28T19:39:09.349Z",
"updatedAt": "2016-07-28T19:39:09.360Z"
},
"updatedBy": {
"username": "aduensing",
"email": "standin#gmail.com",
"lang": "en_US",
"template": "default",
"id_ref": "1",
"provider": "local",
"id": 1,
"createdAt": "2016-07-28T19:39:09.349Z",
"updatedAt": "2016-07-28T19:39:09.360Z"
},
"question": {
"createdBy": 1,
"createdAt": "2016-07-28T19:41:33.152Z",
"template": "default",
"lang": "en_US",
"name": "My Question",
"content": "Cool stuff, huh?",
"updatedBy": 1,
"updatedAt": "2016-07-28T19:45:02.893Z",
"id": "579a5ff83af4445c179bd8a9"
},
"createdAt": "2016-07-28T19:44:31.516Z",
"template": "default",
"lang": "en_US",
"name": "Ref1",
"link": "Google",
"priority": 1,
"updatedAt": "2016-07-28T19:45:02.952Z",
"id": "579a60ab5c8592c01f946cb5"
}
]
This immediately becomes problematic in any real-world context: if I decide to load 10, 20, 30, or more records at once, I end up loading 50 times the data I needed. More bandwidth is used up, load times are slower, etc.
How I solved this:
Create custom controller action (for example, 'findPaths')
in contributor/controllers/contributor.js
module.exports = {
findPaths: async ctx => {
const result = await strapi
.query('contributor')
.model.fetchAll({ columns: ['slug'] }) // fetch only the 'slug' column
ctx.send(result);
}
}
Add custom route (for example 'paths')
in contributor/config/routes.json
{
"method": "GET",
"path": "/contributors/paths",
"handler": "contributor.findPaths",
"config": {
"policies": []
}
},
Add permission in the admin panel for the Contributor entity (the findPaths action)
That's it. Now it shows only slug field from all contributor's records.
http://your-host:1337/contributors/paths
Here is how you can return specific fields and also exclude the relations to optimize the response.
async list (ctx) {
const result = await strapi.query('article').model.query(qb => {
qb.select('id', 'title', 'link', 'content');
}).fetchAll({
withRelated: []
}).catch(e => {
console.error(e)
});
if(result) {
ctx.send(result);
} else {
ctx.send({"statusCode": 404, "error": "Not Found", "message": "Not Found"});
}
}
I know this is an old thread, but I just ran into exactly the same problem and could not find any solution, nothing in the docs or anywhere else.
After a few minutes of console logging and playing with the service, I was able to filter my fields using the following piece of code:
const q = Post
.find()
.sort(filters.sort)
.skip(filters.start)
.limit(filters.limit)
.populate(populate);
return filterFields(q, ['title', 'content']);
where filterFields is following function:
function filterFields(q, fields) {
q._fields = fields;
return q;
}
It is kind of a dirty solution and I haven't figured out how to apply this to included relation entities yet, but I hope it can help somebody looking for a solution to this problem.
I'm not sure why Strapi does not support this, since it is clearly capable of filtering the fields when they are explicitly set. It would be nice to use it like this:
return Post
.find()
.fields(['title', 'content'])
.sort(filters.sort)
.skip(filters.start)
.limit(filters.limit)
.populate(populate);
It would be better to have the query select the fields rather than relying on Node to remove content. However, I have found this to be useful in some situations and thought I would share it. The Strapi sanitizeEntity function can take extra options, one of which lets you include only the fields you need. It is similar to manually deleting the fields, but a more reusable way to do so.
const { sanitizeEntity } = require('strapi-utils');
let entities = await strapi.query('posts').find({ parent: parent.id })
return entities.map(entity => {
return sanitizeEntity(entity, {
model: strapi.models['posts'],
includeFields: ['id', 'name', 'title', 'type', 'parent', 'userType']
});
});
This feature is not implemented in Strapi yet. To compensate, the best option for you is probably to use GraphQL (http://strapi.io/documentation/graphql).
Feel free to create an issue or to submit a pull request: https://github.com/wistityhq/strapi
You can use the select function if you are using a MongoDB database:
await strapi.query('game-category').model.find().select(["Code"])
As you can see, I have a model called game-category and I just need the "Code" field, so I used the select function.
In the current Strapi version (3.x, not sure about previous ones) this can be achieved using the select method in custom queries, regardless of which ORM is being used.
SQL example:
const restaurant = await strapi
.query('restaurant')
.model.query((qb) => {
qb.where('id', 1);
qb.select('name');
})
.fetch();
Not very beautiful, but you can delete fields before returning.
Reference here:
https://strapi.io/documentation/developer-docs/latest/guides/custom-data-response.html#apply-our-changes
const { sanitizeEntity } = require('strapi-utils');
module.exports = {
async find(ctx) {
let entities;
if (ctx.query._q) {
entities = await strapi.services.restaurant.search(ctx.query);
} else {
entities = await strapi.services.restaurant.find(ctx.query);
}
return entities.map(entity => {
const restaurant = sanitizeEntity(entity, {
model: strapi.models.restaurant,
});
if (restaurant.chef && restaurant.chef.email) {
delete restaurant.chef.email;
}
return restaurant;
});
},
};
Yeah, I remember another way.
You can use attributes in the model's xx.settings.json file.
Reference:
model-options
{
"options": {
"timestamps": true,
"privateAttributes": ["id", "created_at"], <- these are the fields you don't want to return
"populateCreatorFields": true <- these are the system creator fields; set to false to not return them
}
}
You can override the default Strapi entity response of:
entity = await strapi.services.weeklyplans.create(add_plan);
return sanitizeEntity(entity, { model: strapi.models.weeklyplans });
By using:
ctx.response.body = {
status: "your API status",
message: "Your own message"
}
Using the ctx object, we can choose which fields to include in the response object.
There is no need to return anything; place the ctx.response.body assignment where the response has to be sent, once your condition is fulfilled.
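For example, a rough sketch of that idea, reusing the weeklyplans service from the snippet above (the field names are placeholders; use your model's real attributes):
module.exports = {
    async create(ctx) {
        const entity = await strapi.services.weeklyplans.create(ctx.request.body);

        // Build the response by hand with only the fields we want to expose
        ctx.response.body = {
            id: entity.id,
            name: entity.name, // placeholder field
        };
        // No explicit return needed; ctx.response.body is what gets sent
    },
};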
It is now 2023, and for a little while it has been possible to do this using the fields parameter:
http://localhost:1337/api/reference?fields[0]=name&fields[1]=something