Here is my problem:
I want to use the power of simple-schema to check my inserts against the following schema:
let UprocSchema = new SimpleSchema({
"name": { type : String, label: "Nom Uproc" },
"label": { type : String, label: "Libellé Uproc" },
"status": { type : String, label: "Status UPR" }
});
For some reason unknown to me, even though the SimpleSchema seems to be correctly instantiated, I cannot use the attachSchema method on Mongo.Collection.
Here is my code:
let repo_collection = new Mongo.Collection('repository');
export const Repository = new MongoObservable.Collection<Uproc>('repo_collection');
repo_collection.attachSchema( UprocSchema );
Here are my error messages:
Property 'attachSchema' does not exist on type 'Collection<{}>'.
TypeError: repo_collection.attachSchema is not a function
attachSchema is part of the collection2 package.
Documentation states:
Create one or more SimpleSchema instances and then use them to validate objects. By adding the aldeed:collection2 package to your app, you can attach them to collections to get automatic validation of your insert and update operations.
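For illustration, here is a minimal sketch of how this fits together once the package has been added with meteor add aldeed:collection2. The simpl-schema npm import and the any cast are assumptions: the cast only quiets the TypeScript compiler, whose typings for Mongo.Collection do not know about the attachSchema extension that collection2 adds at runtime.
import { Mongo } from 'meteor/mongo';
import SimpleSchema from 'simpl-schema';

const UprocSchema = new SimpleSchema({
  name: { type: String, label: 'Nom Uproc' },
  label: { type: String, label: 'Libellé Uproc' },
  status: { type: String, label: 'Status UPR' },
});

const repo_collection = new Mongo.Collection('repository');

// attachSchema exists at runtime only after adding aldeed:collection2;
// the cast works around the missing TypeScript typings
(repo_collection as any).attachSchema(UprocSchema);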
I am using the following code to call the NHS Retrieve Reference Data method
var result = await fhirClient.ReadAsync<CodeSystem>(url);
which returns the following JSON (this is a snippet of the full JSON):
concept": [
{
"code": "BOOKED_CLINICAL_NEED",
"display": "Booked more urgently due to clinical need",
"property": [
{
"code": "effectiveFrom",
"valueDateTime": "2019-07-23T17:09:56.000Z"
},
{
"code": "commentIsMandatory",
"valueBoolean": true
},
{
"code": "canCancelAppointment",
"valueBoolean": false
}
]
}
I have used the GetExtensionValue method for other calls when the data is within an extension but I can't find a similar method for properties.
Is there a simple method or do I need to just cast into the required type manually?
Thanks in advance
There is no convenience method for this. However, the properties per concept are a list, so you could for example iterate over the concepts and select the properties with boolean values using regular list methods:
foreach (var c in myCodeSystem.Concept)
{
var booleanProperties = c.Property.Where(p => (p.Value.TypeName == "boolean"));
// do something with these properties
}
or find all concepts that have a dateTime property:
var conceptsWithDateTimeProperties = myCodeSystem.Concept.Where(c => c.Property.Exists(p => (p.Value.TypeName == "dateTime")));
Of course you can make your selections as specific as you need.
A content-type "Product" has the following fields:
string title
int qty
string description
double price
Is there an API endpoint to retrieve the structure or schema of the "Product" content-type as opposed to getting the values?
For example: on the endpoint localhost:1337/products, the response could look like:
[
{
field: "title",
type: "string",
other: "col-xs-12, col-5"
},
{
field: "qty",
type: "int"
},
{
field: "description",
type: "string"
},
{
field: "price",
type: "double"
}
]
where the structure of the schema or the table is sent instead of the actual values?
If not in Strapi CMS, is this possible on other headless CMS such as Hasura and Sanity?
You need to use Models. From the documentation:
Models are a representation of the database's structure. They are split into two separate files. A JavaScript file that contains the model options (e.g: lifecycle hooks), and a JSON file that represents the data structure stored in the database.
This is exactly what you are after.
The way I GET this info is by adding a custom endpoint - check my answers here for how to do this - https://stackoverflow.com/a/63283807/5064324 & https://stackoverflow.com/a/62634233/5064324.
For handlers you can do something like:
async getProductModel(ctx) {
return strapi.models['product'].allAttributes;
}
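For context, here is a sketch of the kind of custom route that would expose such a handler in a Strapi v3 project, e.g. in api/product/config/routes.json; the path and handler name are assumptions:
{
  "routes": [
    {
      "method": "GET",
      "path": "/product-model",
      "handler": "product.getProductModel",
      "config": { "policies": [] }
    }
  ]
}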
I needed the solution for all Content Types, so I made a plugin with /modelStructure/* endpoints where you can supply the model name, which is then passed to a handler:
//more generic wrapper
async getModel(ctx) {
const { model } = ctx.params;
let data = strapi.models[model].allAttributes;
return data;
},
async getProductModel(ctx) {
ctx.params['model'] = "product"
return this.getModel(ctx)
},
//define all endpoints you need, like maybe a Page content type
async getPageModel(ctx) {
ctx.params['model'] = "page"
return this.getModel(ctx)
},
//finally I ended up writing an `allModels` handler
async getAllModels(ctx) {
  Object.keys(strapi.models).forEach(key => {
    //iterate through all models
    //possibly filter some models
    Object.keys(strapi.models[key].allAttributes).forEach(fieldKey => {
      //iterate through all fields
      //build the response from models and their fields
    });
  });
  //return your desired custom response
}
Comments & questions welcome
This answer pointed me in the right direction, but strapi.models was undefined for me on strapi 4.4.3.
What worked for me was a controller like so:
async getFields(ctx) {
const model = strapi.db.config.models.find( model => model.collectionName === 'clients' );
return model.attributes;
},
Where clients is replaced by the plural name of your content-type.
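In case it is useful, a sketch of a custom route that could expose this controller action, following Strapi v4 conventions; the file location, path, and handler name are assumptions:
// src/api/client/routes/custom-client.js
module.exports = {
  routes: [
    {
      method: 'GET',
      path: '/clients/fields',
      handler: 'client.getFields',
    },
  ],
};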
We are in a situation where the response of our GraphQL query has to return some dynamic properties of an object. In our case we are not able to predefine all possible properties - so it has to be dynamic.
As we see it, there are two options to solve this.
const MyType = new GraphQLObjectType({
name: 'SomeType',
fields: {
name: {
type: GraphQLString,
},
elements: {
/*
THIS is our special field which needs to return a dynamic object
*/
},
// ...
},
});
As you can see in the example code, elements is the property which has to return a dynamic object. A response when resolving this could be:
{
name: 'some name',
elements: {
an_unknown_key: {
some_nested_field: {
some_other: true,
},
},
another_unknown_prop: 'foo',
},
}
1) Return an "any" object
We could just return any object, so that GraphQL would not need to know which fields the object has. But when we tell GraphQL that the field is of the type GraphQLObjectType, it requires fields to be defined. Because of this it seems impossible to tell GraphQL that something is just a plain object.
For this we changed it like this:
elements: {
  type: new GraphQLObjectType({ name: 'elements' }),
},
2) Define the field properties dynamically with a fields function
When we define fields as a function, we could build our object dynamically. But the fields function would need some information (in our case, the information which would be passed to elements), and we would need to access it to build the field object.
Example:
const MyType = new GraphQLObjectType({
name: 'SomeType',
fields: {
name: {
type: GraphQLString,
},
elements: {
type: new GraphQLObjectType({
name: 'elements',
fields: (argsFromElements) => {
// here we can now access keys from "args"
const fields = {};
argsFromElements.keys.forEach((key) => {
// some logic here ..
fields[someGeneratedProperty] = someGeneratedGraphQLType;
});
return fields;
},
}),
args: {
keys: {
type: new GraphQLList(GraphQLString),
},
},
},
// ...
},
});
This could work, but the question is whether there is a way to pass the args and/or the resolve object to the fields function.
Question
So our question now is: which way would be recommended in our case in GraphQL, and is solution 1 or 2 possible? Maybe there is another solution?
Edit
Solution 1 would work when using a ScalarType. Example:
type: new GraphQLScalarType({
name: 'elements',
serialize(value) {
return value;
},
}),
I am not sure if this is a recommended way to solve our situation.
Neither option is really viable:
GraphQL is strongly typed. GraphQL.js doesn't support some kind of "any" field, and all object types defined in your schema must have their fields defined. If you look in the docs, fields is required; if you try to leave it out, you'll hit an error.
Args are used to resolve queries on a per-request basis. There's no way to pass them back to your schema. Your schema is supposed to be static.
As you suggest, it's possible to accomplish what you're trying to do by rolling your own custom scalar. I think a simpler solution would be to just use JSON: you can import a custom scalar for it like this one. Then just have your elements field resolve to a JSON object or array containing the dynamic fields. You could also manipulate the JSON object inside the resolver based on arguments if necessary (if you wanted to limit the fields returned to a subset defined in the args, for example).
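As a sketch of that suggestion, assuming the graphql-type-json package is installed:
import { GraphQLObjectType, GraphQLString } from 'graphql';
import GraphQLJSON from 'graphql-type-json';

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      type: GraphQLJSON,
      // the resolver can return any JSON-serializable value; nothing
      // about its shape has to be declared in the schema
      resolve: (source) => source.elements,
    },
  },
});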
Word of warning: The issue with utilizing JSON, or any custom scalar that includes nested data, is that you're limiting the client's flexibility in requesting what it actually needs. It also results in less helpful errors on the client side -- I'd much rather be told that the field I requested doesn't exist or returned null when I make the request than to find out later down the line the JSON blob I got didn't include a field I expected it to.
One more possible solution could be to declare any such dynamic object as a string: pass a stringified version of the object as the value from your resolver functions, and then parse that string back into JSON on the client side to make it an object again.
I'm not sure if it's the recommended way or not, but I tried this approach and it worked smoothly, so I'm sharing it here.
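A sketch of what that can look like (the source field name is an assumption):
import { GraphQLObjectType, GraphQLString } from 'graphql';

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      type: GraphQLString,
      // serialize the dynamic object into a string on the server...
      resolve: (source) => JSON.stringify(source.elements),
    },
  },
});

// ...and parse it back into an object on the client:
// const elements = JSON.parse(data.someType.elements);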
I'm trying to use the internationalization feature of sails based on i18n.
In my controller it works well. However, I would like to set this up in my model definition.
Please see the code below:
module.exports = {
attributes: {
name:{
type:'string',
required:true,
displayName: sails.__("test")
},
....
Unfortunately it does not work. I have the error below:
displayName: sails.__("test")
^
TypeError: Object [a Sails app] has no method '__'
Would you have an idea?
Any help will be very much appreciated.
Thanks,
displayName: sails.__("test")
You are trying to invoke the internationalization function statically; that is, you're seeing the error because you're running that function the moment your .js file is require()d by node.js, and before sails has finished loading.
There are two ways you can go about solving this problem.
1. Translate the value on each query
If you'd like to store the original value of displayName, and instead internationalize it each time you query for the model, you can override toJSON().
Instead of writing custom code for every controller action that uses a particular model (including the "out of the box" blueprints), you can manipulate outgoing records by simply overriding the default toJSON function in your model.
For example:
attributes: {
name:{
type:'string',
required:true,
},
getDisplayName: function () {
return sails.__(this.name);
},
toJSON: function () {
var obj = this.toObject();
obj.displayName = sails.__(this.name);
return obj;
},
...
}
2. Translate the value before create
You can use the Waterline Lifecycle Callbacks to translate the value to a particular language before the model is saved to the database.
Sails exposes a handful of lifecycle callbacks on models that are called automatically before or after certain actions. For example, we sometimes use lifecycle callbacks for automatically encrypting a password before creating or updating an Account model.
attributes: {
name:{
type:'string',
required:true,
},
displayName: {
type: 'string'
},
...
},
beforeCreate: function (model, next) {
model.displayName = sails.__(model.name);
next();
}
The internationalized value of displayName will now be set on your model before it is inserted into the database.
Let me know how this works out for you.
Your solution is interesting. However, what I would like is to have a display name for each property.
module.exports = {
attributes: {
name:{
type:'string',
required:true,
displayName: "Your great name"
},
adress:{
type:'string',
required:true,
displayName: "Where do you live?"
},
....
So is there a simple or clean way to apply sails.__() to the displayName of each attribute?
Thanks,
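For what it's worth, one conceivable approach is to keep the displayName strings untranslated in the model definitions and translate them all in one pass after sails has loaded, e.g. from config/bootstrap.js. This is a rough sketch only; whether custom attribute keys like displayName remain visible on sails.models[...].attributes depends on your Sails version, so treat all of it as an assumption:
// e.g. in config/bootstrap.js, once sails (and sails.__) is available
function translateDisplayNames(model) {
  Object.keys(model.attributes).forEach(function (key) {
    var attr = model.attributes[key];
    if (attr.displayName) {
      // assumes displayName survives as a custom attribute key
      attr.displayName = sails.__(attr.displayName);
    }
  });
}

Object.keys(sails.models).forEach(function (name) {
  translateDisplayNames(sails.models[name]);
});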
What's the best way to validate data being inserted or updated into MongoDB? Is it to write some sort of server executed Javascript code that does the validation?
Starting from MongoDB 3.2, document validation was added.
You can specify validation rules for each collection using the validator option, with almost all mongo query operators supported (except $geoNear, $near, $nearSphere, $text, and $where).
To create a new collection with a validator, use:
db.createCollection("your_coll", {
validator: { `your validation query` }
})
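For instance, a minimal sketch with a concrete rule (the collection and field names are assumed):
db.createCollection("contacts", {
  validator: {
    $and: [
      { phone: { $exists: true } },
      { status: { $in: ["Unknown", "Incomplete"] } }
    ]
  }
})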
To add a validator to an existing collection, use the collMod command:
db.runCommand({
  collMod: "your_coll",
  validator: { `your validation query` }
})
Validation works only on insert/update, so when you create a validator on your old collection, the previous data will not be validated (you can write application-level validation for the previous data). You can also specify validationLevel and validationAction to control what happens when a document does not pass the validation.
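For example, a sketch of a lenient configuration (the collection name and rule are assumed): validationLevel: "moderate" skips validation for updates to documents that are already invalid, and validationAction: "warn" logs violations instead of rejecting the write.
db.runCommand({
  collMod: "your_coll",
  validator: { name: { $exists: true } },
  validationLevel: "moderate",  // don't check updates to already-invalid documents
  validationAction: "warn"      // log violations instead of failing the operation
})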
If you try to insert/update a document with something that fails the validation (and have not specified any non-default validationLevel/validationAction), then you will get an error in the WriteResult (sadly, the error does not tell you what failed; you only get the default "Document failed validation"):
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 121,
"errmsg" : "Document failed validation"
}
})
MongoDB doesn't have constraints or triggers so the application has to validate the data.
You can also write Javascript scripts that check once a day or more if there is invalid data. You can use this to check the quality of the business logic of your application.
I think it would be normal for your app to handle this kind of thing. If the data is invalid in some way, don't let it get added to the datastore until the user has corrected whatever error you have detected.
Starting in 2.4, MongoDB enables basic BSON object validation for mongod and mongorestore when writing to MongoDB data files. This prevents any client from inserting invalid or malformed BSON into a MongoDB database.
source: http://docs.mongodb.org/manual/release-notes/2.4/
Starting with MongoDB 3.6, you can also use JSON Schema to express validation rules. These checks will happen on the database side on insert/update.
Here is an example from the docs:
validator = {
$jsonSchema: {
bsonType: "object",
required: [ "name", "year", "major", "address" ],
properties: {
name: {
bsonType: "string",
description: "must be a string and is required"
},
year: {
bsonType: "int",
minimum: 2017,
maximum: 3017,
description: "must be an integer in [ 2017, 3017 ] and is required"
},
major: {
enum: [ "Math", "English", "Computer Science", "History", null ],
description: "can only be one of the enum values and is required"
},
gpa: {
bsonType: [ "double" ],
description: "must be a double if the field exists"
},
address: {
bsonType: "object",
required: [ "city" ],
properties: {
street: {
bsonType: "string",
description: "must be a string if the field exists"
},
city: {
bsonType: "string",
description: "must be a string and is required"
}
}
}
}
}
}
db.runCommand( {
collMod: "collectionName",
validator: validator
} )
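As a usage sketch (field values assumed), an insert that violates the schema above would now be rejected:
db.collectionName.insert({
  name: "Alice",
  year: NumberInt(2019),  // must be NumberInt; plain shell numbers are doubles
  major: "Math"
})
// fails with "Document failed validation": the required "address" field is missing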
I've just started using MongoDB and PHP together, inside a Zend Framework based application.
I have created one object for each MongoDB collection (e.g. User.php maps to the user collection). Each object knows which collection it maps to and which fields are required. It also knows which filters (Zend_Filter_Input) and validators (Zend_Validate) should be applied to each field. Before doing a MongoDB insert() or save(), I run $object->isValid(), which executes all the validators. If they all pass, isValid() returns true and I proceed with the insert() or save(); otherwise I display the errors.