Loopback custom password validation

Very simple question: if I try to validate a password in a User model, it seems I can only validate the already-encrypted password. So, for example, if I use

Customer.validatesLengthOf('password', { min: 8, message: 'Too short' })

then the encrypted password is checked (which is always longer than 8 characters), so no good. If I try to use a custom validation, how can I get access to the original password (the original req.body.password, basically)?

EDIT (August 20, 2019): I am unsure whether this is still an issue in the latest LoopBack releases.
In fact, this is a known problem in LoopBack. The tacitly approved solution is to override the <UserModel>.validatePassword() method with your own. YMMV.
akapaul commented on Jan 10, 2017:
I've found another way to do this. In the common User model there is a method called validatePassword. If we extend our UserModel from User, we can redefine this method in JS, like the following:
var g = require('loopback/lib/globalize');

module.exports = function(UserModel) {
  UserModel.validatePassword = function(plain) {
    var err;
    var passwordProperties = UserModel.definition.properties.password;
    if (plain.length > passwordProperties.max) {
      err = new Error(g.f('Password too long: %s (maximum %d symbols)', plain, passwordProperties.max));
      err.code = 'PASSWORD_TOO_LONG';
    } else if (plain.length < passwordProperties.min) {
      err = new Error(g.f('Password too short: %s (minimum %d symbols)', plain, passwordProperties.min));
      err.code = 'PASSWORD_TOO_SHORT';
    } else if (!(new RegExp(passwordProperties.pattern, 'g').test(plain))) {
      err = new Error(g.f('Invalid password: %s (symbols and numbers are allowed)', plain));
      err.code = 'INVALID_PASSWORD';
    } else {
      return true;
    }
    err.statusCode = 422;
    throw err;
  };
};
This works for me. I don't think the g (globalize) object is required here, but I added it just in case. Also, I've added my validator options to the JSON definition of UserModel, per the LoopBack docs.
To use the above code, put your validation rules in the model's .json definition like so (see max, min, and pattern under properties.password):
{
  "name": "UserModel",
  "base": "User",
  ...
  "properties": {
    ...
    "password": {
      "type": "string",
      "required": true,
      ...
      "max": 50,
      "min": 8,
      "pattern": "(?=.*[A-Z])(?=.*[!##$&*])(?=.*[0-9])(?=.*[a-z])^.*$"
    },
    ...
  },
  ...
}
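As a quick illustration (not part of the original answer), the pattern above can be tested in plain JavaScript:

// The pattern requires at least one uppercase letter, one character from
// the set !##$&*, one digit, and one lowercase letter, in any order.
const pattern = new RegExp('(?=.*[A-Z])(?=.*[!##$&*])(?=.*[0-9])(?=.*[a-z])^.*$');
console.log(pattern.test('Passw0rd!')); // true - meets all four requirements
console.log(pattern.test('password'));  // false - no uppercase, digit, or special character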

OK, no answer, so what I'm doing is using a remote hook to get access to the original plain password, and that'll do for now:
var plainPwd // note: module-scoped, so it is shared across concurrent requests

Customer.beforeRemote('create', function (ctx, inst, next) {
  plainPwd = ctx.req.body.password
  next()
})
Then I can use it in a custom validation:
Customer.validate('password', function (err, res) {
  const pattern = new RegExp(/some-regex/)
  if (plainPwd && !pattern.test(plainPwd)) err()
}, { message: 'Invalid format' })

OK, I guess the above answer is quite novel and obviously accepted, but if you want a really easy solution with just some basic validations and not much code, then loopback-mixin-complexity is the solution for you.
If you don't want to add another dependency, you can go ahead with a custom mixin that you add to your user model (or any other model where you need this kind of validation), and it will do the validation for you.
Here's some sample code for how to create such a mixin:
module.exports = function(Model, options) {
  'use strict';

  // Observe any insert/update event on Model
  Model.observe('before save', function event(ctx, next) {
    // On a full save the document is in ctx.instance; on a partial update it is in ctx.data
    var password = ctx.instance ? ctx.instance.password : ctx.data.password;
    if (!yourValidatorFn(password)) {
      next(new Error('password not valid'));
    } else {
      next();
    }
  });
};
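The snippet leaves yourValidatorFn undefined, so here is a minimal sketch of what it could look like; the name and the rules (length plus character classes) are illustrative, not part of the original answer:

// Illustrative validator: at least 8 characters, with at least one
// uppercase letter, one lowercase letter, and one digit.
function yourValidatorFn(password) {
  return typeof password === 'string' &&
    password.length >= 8 &&
    /[A-Z]/.test(password) &&
    /[a-z]/.test(password) &&
    /[0-9]/.test(password);
}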

Related

How can I test validation functionality in a terraform provider

I have written some more sophisticated validation logic for fields I need to validate in a custom Terraform provider. I can, of course, cover these with unit tests, but that's insufficient; what if I forgot to actually apply the validator?
So, I need to actually use Terraform config and have the provider do its normal, natural thing.
Basically, I expect it to error. The documentation seems to indicate that I should do a regex match on the output. But this can't be right; it seems super brittle. Can someone tell me how this is done?
func TestValidation(t *testing.T) {
    const userBasic = `
resource "my_user" "dude" {
  name     = "the.dude%d"
  password = "Password1" // needs a special char to pass
  email    = "the.dude%d#domain.com"
  groups   = [ "readers" ]
}
`
    rgx, _ := regexp.Compile("abc")
    resource.UnitTest(t, resource.TestCase{
        Providers: testAccProviders,
        Steps: []resource.TestStep{
            {
                Config:      userBasic,
                ExpectError: rgx,
            },
        },
    })
}
This code obviously doesn't work. And, a lot of research isn't yielding answers.
Since SDK version 2.3.0 you can set an ErrorCheck function on resource.TestCase to provide more complex validation of errors.
For example:
resource.UnitTest(t, resource.TestCase{
    Providers: testAccProviders,
    ErrorCheck: func(err error) error {
        if err == nil {
            return errors.New("expected error but got none")
        }
        // your validation code here
        // some simple example with string matching
        if strings.Contains(err.Error(), "your expected error message blah") {
            return nil
        }
        // return original error if no match
        return err
    },
    Steps: []resource.TestStep{
        {
            Config: userBasic,
        },
    },
})

How to modify just a property from a dexie store without deleting the rest?

I have the Dexie stores shown in the screenshot below:
(screenshot: Dexie stores)
My goal is to update one field of a row in a store without losing the rest of the data.
For example: when I edit and save the field "com_name" of the second row (key = {2}), I want to update "com_name" only and not lose the rest of the properties; see the first and third rows.
I already tried collection.modify and table.update, but both deleted the rest of the properties when I used the code below:
dexieDB.table('company').where('dexieKey').equals('{1}')
  // USING table.update
  // .update(dexieRecord.dexiekey, {
  //   company: {
  //     com_name: "TOP SERVE 2"
  //   }
  // })
  .modify({
    company: {
      com_name: "TOP SERVE 2"
    }
  })
  .then(function (updated) {
    if (updated)
      console.log("Success.");
    else
      console.log("Nothing was updated.");
  })
  .catch(function (err) { console.log(err); });
Any idea how I can accomplish that?
Thanks,
Alex
You were right to use Table.update or Collection.modify. They should never delete properties other than the ones specified. Could you paste a jsitor.com or jsfiddle repro of it, so someone can help you pinpoint why the code doesn't work as expected?
Now that you say it, I realise that the company and contact stores are created dynamically, while the editedRecords store has its indexes explicitly declared; therefore, when updating the company or contact store, Dexie doesn't see the indexes and will overwrite. I haven't tested it yet, but I suspect this is the behaviour.
See the screenshot below:
(screenshot: Dexie stores overview)
Basically, I have raw JSON data from the DB, and in the browser I create the stores and store the data based on it; see the code below:
function createDexieTables(jsonData) { // jsonData - array, the JSON from the DB
  const stores = {};
  const editedRecordsTable = 'editedRecords';
  jsonData.forEach((jsonPackage) => {
    for (const table in jsonPackage) {
      if (_.find(dexieDB.tables, { 'name': table }) == undefined) {
        stores[table] = 'dexieKey';
      }
    }
  });
  stores[editedRecordsTable] = 'dexieKey, table';
  addDataToDexie(stores, jsonData);
}

function addDataToDexie(stores, jsonData) {
  const dbv1 = dexieDB.version(1);
  if (jsonData.length > 0) {
    dbv1.stores(stores);
    jsonData.forEach((jsonPackage) => {
      for (const table in jsonPackage) {
        jsonPackage[table].forEach((tableRow) => {
          dexieDB.table(table).add(tableRow)
            .then(function () {
              console.log(tableRow, ' added to dexie db.');
            })
            .catch(function () {
              console.log(tableRow, ' already exists.');
            });
        });
      }
    });
  }
}
This is the JSON, which I convert to an object and save to Dexie in the value column; the key is "dexieKey":
[
  {
    "company": [
      {
        "dexieKey": "{1}",
        "company": {
          "com_pk": 1,
          "com_name": "CloudFire",
          "com_city": "Round Rock",
          "serverLastEdit": [
            { "com_pk": "2021-06-02T11:30:24.774Z" },
            { "com_name": "2021-06-02T11:30:24.774Z" },
            { "com_city": "2021-06-02T11:30:24.774Z" }
          ],
          "userLastEdit": []
        }
      }
    ]
  }
]
Any idea why indexes were not populated when generating them dynamically?
Given the JSON data, I understand what's going wrong.
Instead of passing the following to update():
{
  company: {
    com_name: "TOP SERVE 2"
  }
}
You probably meant to pass this:
{
  "company.com_name": "TOP SERVE 2"
}
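Put together, the update from the question might look like this (a sketch using the dotted key-path form; Dexie treats each property key passed to modify() or update() as a key path):

dexieDB.table('company').where('dexieKey').equals('{2}')
  // "company.com_name" is a key path: only that nested property is replaced,
  // the rest of the company object stays intact
  .modify({ "company.com_name": "TOP SERVE 2" })
  .then(function (updated) {
    console.log(updated ? "Success." : "Nothing was updated.");
  })
  .catch(function (err) { console.log(err); });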
Another hint is to do the adds within an 'rw' transaction, or even better, use bulkAdd() instead to optimize performance.
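For example, a minimal sketch of the bulkAdd() variant, reusing the table and jsonPackage variables from the question's loop:

// Add all rows of one table in a single bulk operation inside a readwrite
// transaction; this is much faster than many individual add() calls.
dexieDB.transaction('rw', dexieDB.table(table), function () {
  return dexieDB.table(table).bulkAdd(jsonPackage[table]);
}).catch(function (err) { console.log(err); });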

What happens if you do a get item in DynamoDb using a projection expression, if the attribute in the expression may not exist

Inside a Lambda, I'm calling getItem on a table with a projection expression for a single field. This is working fine.
const usersTableParams = {
  TableName: 'users',
  Key: {
    'user-name': { S: userID }
  },
  ProjectionExpression: 'notificationEndpointARN'
};

ddb.getItem(usersTableParams, function (err, data) {
  if (err) {
    console.log('error getting user info', err);
  }
  else {
    // success
    // code...
  }
});
Now I want to add another attribute to the projection expression, but that attribute might not exist yet on the item. (If it doesn't exist, I will add it at the end of the function.)
Does the call fail? Does it return null for that attribute? Or does it not return that attribute at all?
I can't find the answer in the documentation or in any Google searches.
If the ProjectionExpression contains an attribute that doesn't exist on the item, it doesn't throw any error or return null.
The attribute will simply not appear in the result, and the remaining attributes that were found are returned.
cli> aws dynamodb get-item --table-name my-DynamoDBTable-I3BL7EX05JQR --key file://test.json --projection-expression "data_type,ts,username"
{
    "Item": {
        "ts": {
            "N": "1600755209826"
        },
        "data_type": {
            "S": "Int32"
        }
    }
}
You can refer to this for details: https://docs.aws.amazon.com/cli/latest/reference/dynamodb/get-item.html
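Applied to the code from the question, that means you can simply check whether the attribute is present on the returned item (a sketch; notificationEndpointARN is the attribute named in the question):

ddb.getItem(usersTableParams, function (err, data) {
  if (err) {
    console.log('error getting user info', err);
  } else if (data.Item && data.Item.notificationEndpointARN) {
    // the attribute exists on the item
    const arn = data.Item.notificationEndpointARN.S;
    // code...
  } else {
    // item not found, or the attribute is not set yet - add it here
  }
});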

Parallel promise execution in resolve functions

I have a question about handling promises in resolve functions for a GraphQL client. Traditionally, resolvers would be implemented on the server, but I am wrapping a REST API on the client.
Background and Motivation
Given resolvers like:
const resolvers = {
  Query: {
    posts: (obj, args, context) => {
      return fetch('/posts').then(res => res.json());
    }
  },
  Post: {
    author: (obj, args, _, context) => {
      return fetch(`/users/${obj.userId}`)
        .then(res => res.json())
        .then(data => cache.users[data.id] = data);
    }
  }
};
If I run the query:
posts {
  author {
    firstName
  }
}
and the /posts API behind Query.posts() returns four post objects:
[
  { "id": 1, "body": "It's a nice prototyping tool", "user_id": 1 },
  { "id": 2, "body": "I wonder if he used logo?", "user_id": 2 },
  { "id": 3, "body": "Is it even worth arguing?", "user_id": 1 },
  { "id": 4, "body": "Is there a form above all forms? I think so.", "user_id": 1 }
]
the Post.author() resolver will get called four times to resolve the author field.
graphql-js has a very nice feature where each of the promises returned from the Post.author() resolver will execute in parallel.
I've further been able to eliminate re-fetching authors with the same userId using Facebook's dataloader library. BUT, I'd like to use a custom cache instead of dataloader.
The Question
Is there a way to prevent the Post.author() resolver from executing in parallel? Inside the Post.author() resolver, I would like to fetch authors one at a time, checking my cache in between to prevent duplicate HTTP requests.
But right now the promises returned from Post.author() are all queued and executed at once, so I cannot check the cache before each request.
Thank you for any tips!
I definitely recommend looking at DataLoader, as it's designed to solve exactly this problem. If you don't use it directly, you can at least read its implementation (which is not that many lines) and borrow its techniques atop your custom cache.
GraphQL and the graphql-js library themselves are not concerned with loading data; they leave that up to you via resolver functions. graphql-js just calls these resolver functions as eagerly as it can to provide the fastest overall execution of your query. You can absolutely decide to return promises which resolve sequentially (which I wouldn't recommend), or, as DataLoader implements, deduplicate with memoization (which is what you want for solving this).
For example:
const resolvers = {
  Post: {
    author: (obj, args, _, context) => {
      return fetchAuthor(obj.userId);
    }
  }
};

// Very simple memoization
var authorPromises = {};

function fetchAuthor(id) {
  var author = authorPromises[id];
  if (!author) {
    author = fetch(`/users/${id}`)
      .then(res => res.json())
      .then(data => cache.users[data.id] = data);
    authorPromises[id] = author;
  }
  return author;
}
Just for people who use a REST data source along with DataLoader (in this case it doesn't really help, as it's a single request), here is a simple caching solution/example.
import { RESTDataSource } from 'apollo-datasource-rest'
import DataLoader from 'dataloader'
import cache from 'memory-cache'

export class RetrievePostAPI extends RESTDataSource {
  constructor() {
    super()
    this.baseURL = 'http://localhost:3000/'
  }

  postLoader = new DataLoader(async ids => {
    return await Promise.all(
      ids.map(id => {
        if (cache.keys().includes(id)) {
          return cache.get(id)
        } else {
          // this.get() already returns a promise; cache it for one minute
          const postPromise = this.get(`posts/${id}`)
          cache.put(id, postPromise, 1000 * 60)
          return postPromise
        }
      })
    )
  })

  async getPost(id) {
    return this.postLoader.load(id)
  }
}
Note: here I use memory-cache as the caching mechanism.
Hope this helps.
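For completeness, a resolver might then call this data source along these lines (a sketch; the dataSources wiring follows common Apollo Server conventions, and retrievePostAPI is an assumed registration name):

const resolvers = {
  Query: {
    // Apollo Server exposes registered data sources on the context
    post: (obj, { id }, { dataSources }) => dataSources.retrievePostAPI.getPost(id)
  }
};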

Extjs validate in separate files

I'm trying to validate fields in my form, but I keep getting an error message.
Here is my code:
Ext.define('ExtDoc.views.extfields.FieldsValidator', {
  valEng: function(val) {
    var engTest = /^[a-zA-Z0-9\s]+$/;
    Ext.apply(Ext.form.field.VTypes, {
      eng: function(val, field) {
        return engTest.test(val);
      },
      engText: 'Write it in English Please',
      // vtype Mask property: the keystroke filter mask
      engMask: /[a-zA-Z0-9_\u0600-\u06FF\s]/i
    });
  }
});
And I define my field as follows:
{
  "name": "tik_moed_chasifa",
  "type": "ExtDoc.views.extfields.ExtDocTextField",
  "label": "moed_hasifa",
  "vtype": "eng",
  "msgTarget": "under"
}
The first snippet is in a separate JS file, and I have included it in my fields JS file as required.
When I start typing text in the text field, I keep seeing the following error message in the Explorer debugger:
"SCRIPT438: Object doesn't support property or method 'eng'"
What could it be? Have I declared something wrong?
You have defined your own class with a function valEng(val), but you never instantiate it, nor do you call the function anywhere.
Furthermore, your function valEng(val) does not need a parameter, because you are not using the parameter anywhere.
It would be far easier and more readable if you removed the Ext.define part and created the validators right where you need them. For instance, if you need them inside an initComponent function:
initComponent: function() {
  var me = this;
  Ext.apply(Ext.form.field.VTypes, {
    mobileNumber: function(val, field) {
      var numeric = /^[0-9]+$/;
      if (!Ext.String.startsWith(val, '+')) return false;
      if (!numeric.test(val.substring(1))) return false;
      return true;
    },
    mobileNumberText: 'This is not a valid mobile number'
  });
  Ext.apply(me, {
    ....
    items: [{
      xtype: 'fieldcontainer',
      items: [{
        xtype: 'combobox',
        vtype: 'mobileNumber',
Or, if you need it quite often at different levels of your application, you could add it to your Application.js, in the init method:
Ext.define('MyApp.Application', {
  extend: 'Ext.app.Application',
  views: [
  ],
  controllers: [
  ],
  stores: [
  ],
  init: function() {
    Ext.apply(Ext.form.field.VTypes, {
      mobileNumber: function(val, field) {
        var numeric = /^[0-9]+$/;
        if (!Ext.String.startsWith(val, '+')) return false;
        if (!numeric.test(val.substring(1))) return false;
        return true;
      },
      mobileNumberText: 'This is not a valid mobile number'
    });
  }
});
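Once the vtype is registered globally like this, any form field can use it; an illustrative field config:

{
  xtype: 'textfield',
  fieldLabel: 'Mobile',
  vtype: 'mobileNumber', // validator registered in Application.init
  msgTarget: 'under'
}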
