Date and datetime set to null in beforeValidate transforms to '0000-00-00' and throws error - validation

I've looked around quite extensively for an answer to my date-related problem, but I can't seem to find a proper one anywhere.
I'm using SailsJS (beta) with Waterline as the data handler. My case is as follows:
My User model looks like this:
module.exports = {
  attributes: {
    (.. some attributes ..),
    birthDate: {
      type: 'date',
      required: false
    }
  },

  // Modifies user input before validation
  beforeValidation: function(user, cb){
    // Make sure birthdate is not saved as 0000-00-00
    if(!user.birthDate || user.birthDate == '0000-00-00'){
      user.birthDate == null;
    }
    cb(null, user);
  },
}
The beforeValidation() function triggers as it should, but an error is always thrown, as shown below. This seems to be the case for both date and datetime types in Waterline models.
warn: Error (E_VALIDATION) :: 1 attribute is invalid
at WLValidationError.WLError (C:\web\node_modules\sails\node_modules\waterline\lib\waterline\error\WLError.js:33:18)
at new WLValidationError (C:\web\node_modules\sails\node_modules\waterline\lib\waterline\error\WLValidationError.js:20:28)
at C:\web\node_modules\sails\node_modules\waterline\lib\waterline\query\validate.js:44:43
at allValidationsChecked (C:\web\node_modules\sails\node_modules\waterline\lib\waterline\core\validations.js:181:5)
at done (C:\web\node_modules\sails\node_modules\waterline\node_modules\async\lib\async.js:128:19)
at C:\web\node_modules\sails\node_modules\waterline\node_modules\async\lib\async.js:25:16
at C:\web\node_modules\sails\node_modules\waterline\lib\waterline\core\validations.js:162:23
at Object.async.each (C:\web\node_modules\sails\node_modules\waterline\node_modules\async\lib\async.js:114:20)
at validate (C:\web\node_modules\sails\node_modules\waterline\lib\waterline\core\validations.js:142:11)
at C:\web\node_modules\sails\node_modules\waterline\node_modules\async\lib\async.js:118:13
Invalid attributes sent to User:
birthDate
`undefined` should be a date (instead of "0000-00-00", which is a string)
How do I set the birthDate to null in the database using sailsjs/waterline?
I hope someone can help:)

Change
cb(null, user);
to
cb();
You also have a typo in the if statement:
user.birthDate == null;
has to be
user.birthDate = null;
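Putting both fixes together, a minimal sketch of the corrected lifecycle callback (assuming the rest of the model stays exactly as posted):

// Modifies user input before validation
beforeValidation: function(user, cb){
  // Normalize an empty or zeroed-out birthdate to null so the 'date' validation passes
  if(!user.birthDate || user.birthDate == '0000-00-00'){
    user.birthDate = null;  // single '=' actually assigns the value
  }
  cb();  // signal completion without passing the model back
},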

Related

NestJS GraphQL custom argument type

I'm trying to use the LocalDate type from js-joda as a parameter on a GraphQL query, like this:
@Query(() => DataResponse)
async getData(@Args() filter: DataFilter): Promise<DataResponse> { ... }
And here is the filter type definition:
@ArgsType()
export class DataFilter {
  @Field({ nullable: true })
  @IsOptional()
  date?: LocalDate;

  @Field()
  @Min(1)
  page: number;

  @Field()
  @Min(1)
  pageSize: number;
}
I've also registered LocalDate as a scalar type and added it to the application providers.
@Scalar('LocalDate', (type) => LocalDate)
export class LocalDateScalar implements CustomScalar<string, LocalDate> {
  description = 'A date string, such as 2018-07-01, serialized in ISO8601 format';

  parseValue(value: string): LocalDate {
    return LocalDate.parse(value);
  }

  serialize(value: LocalDate): string {
    return value.toString();
  }

  parseLiteral(ast: ValueNode): LocalDate {
    if (ast.kind === Kind.STRING) {
      return LocalDate.parse(ast.value);
    }
    return null;
  }
}
This is the error I'm getting:
[Nest] 9973  - 02/16/2022, 5:33:41 PM   ERROR [ExceptionsHandler] year must not be null
NullPointerException: year must not be null
    at requireNonNull (/Users/usr/my-app/node_modules/@js-joda/core/src/assert.js:33:15)
    at new LocalDate (/Users/usr/my-app/node_modules/@js-joda/core/src/LocalDate.js:284:9)
    at TransformOperationExecutor.transform (/Users/usr/my-app/node_modules/src/TransformOperationExecutor.ts:160:22)
    at TransformOperationExecutor.transform (/Users/usr/my-app/node_modules/src/TransformOperationExecutor.ts:333:33)
    at ClassTransformer.plainToInstance (/Users/usr/my-app/node_modules/src/ClassTransformer.ts:77:21)
    at Object.plainToClass (/Users/usr/my-app/node_modules/src/index.ts:71:27)
    at ValidationPipe.transform (/Users/usr/my-app/node_modules/@nestjs/common/pipes/validation.pipe.js:51:39)
    at /Users/usr/my-app/node_modules/@nestjs/core/pipes/pipes-consumer.js:17:33
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
I'm not sure exactly why this is happening, but from what I've managed to debug, the LocalDateScalar defined above transforms the value from string to LocalDate correctly; the problem is that class-transformer also tries to transform the value, and since it has already been transformed it treats it as an object and calls the parameterless constructor, which causes this error.
This is the line from class-transformer that's calling the constructor
newValue = new (targetType as any)();
Is there maybe a way to tell class-transformer which types to ignore? I'm aware of the @Exclude decorator, but then the property is completely excluded; I just need to stop the property from being transformed by class-transformer's plainToClass. Or should this whole situation be handled differently?
Any suggestion will be much appreciated.
I'm not sure if this is the right solution, but I had a similar <string, Big> scalar working with the following decorators:
@Field(() => AmountScalar) // your actual scalar class
@Type(() => String) // the "serialized" type of the scalar
@Transform(({ value }) => {
  return Big(value) // custom parse function
})
amount: Big // the "parsed" type of the scalar
The two custom parse functions in the scalar can also contain some validation steps (like moment.isValid() in your case), since they are called before class-validator.
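Applied to the DataFilter from the question, a hedged sketch of the same pattern might look like the following. The @Field decorator is kept exactly as in the question (the registered scalar already resolves the GraphQL type), while the @Type/@Transform lines follow the Big example above; the guard against an already-parsed LocalDate is my assumption, since the scalar may have converted the value before the validation pipe runs:

import { ArgsType, Field } from '@nestjs/graphql';
import { IsOptional, Min } from 'class-validator';
import { Transform, Type } from 'class-transformer';
import { LocalDate } from '@js-joda/core';

@ArgsType()
export class DataFilter {
  @Field({ nullable: true })
  @Type(() => String) // the "serialized" type, so class-transformer does not call new LocalDate()
  @Transform(({ value }) =>
    value == null || value instanceof LocalDate
      ? value                   // absent, or already parsed by the scalar: keep as is
      : LocalDate.parse(value), // otherwise parse the raw ISO string ourselves
  )
  @IsOptional()
  date?: LocalDate;

  @Field()
  @Min(1)
  page: number;

  @Field()
  @Min(1)
  pageSize: number;
}

With @Type and @Transform present, class-transformer uses the transform result instead of instantiating the target type through its parameterless constructor (the newValue = new (targetType as any)() line quoted above).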

Create complex argument-driven queries from AWS Lambda?

Look for the // HERE IS THE PROBLEM PART comments to find the code in question.
I am trying to implement AppSync using AWS Lambda (connected to an RDS Postgres server) as a data source. I want to create a putKnowledgeFile mutation that updates my KnowledgeFile with optional arguments. If the client only provides htmlText and properties as arguments, then my update query should only update those two fields.
type Mutation {
  putKnowledgeFile(
    id: ID!,
    htmlText: String,
    plainText: String,
    properties: AWSJSON
  ): KnowledgeFile
}

type KnowledgeFile {
  id: ID!
  htmlText: String!
  plainText: String!
  properties: AWSJSON!
  lastDateTimeModified: AWSDateTime!
  dateTimeCreated: AWSDateTime!
}
Here is a piece of the AWS Lambda code:
exports.handler = async (event, context, callback) => {
  /* Connecting to Postgres */
  let data = null;
  let query = ``;
  let values = [];

  switch (event.info.fieldName) {
    case "putKnowledgeFile":
      if (event.arguments.htmlText === undefined &&
          event.arguments.plainText === undefined &&
          event.arguments.properties === undefined) {
        callback(`At least one argument except id should be provided in putKnowledgeFile request`);
        return;
      }

      // HERE IS THE PROBLEM PART
      query += `update knowledge_file`;
      query += `
      set `;
      let index = 0;
      for (let fieldName in event.arguments) {
        if (event.arguments.hasOwnProperty(fieldName)) {
          const fieldValue = event.arguments[fieldName];
          if (index === 0) {
            query += `${fieldName}=$${index + 1}`;
            values.push(fieldValue);
          } else {
            query += `, ${fieldName}=$${index + 1}`;
            values.push(fieldValue);
          }
          index++;
        }
      }
      query += `
      where knowledge_file.id = $${index + 1};`;
      values.push(event.arguments.id);
      // HERE IS THE PROBLEM PART
      break;
    default:
      callback(`There is no functionality to process this field: ${event.info.fieldName}`);
      return;
  }

  let res = null;
  try {
    res = await client.query(query, values); // just sending the query built above
  } catch (error) {
    console.log("#client.query");
    console.log(error);
  }
  /* DisConnecting from Postgres */
  callback(null, res.rows);
};
Basically, this algorithm creates my query string through multiple string concatenations. I think it's too complicated and error-prone. Is there a way to create dynamic queries based on the presence / absence of certain arguments easily?
Just in case, here is my PostgreSQL schema:
-- main client object for clients
CREATE TABLE client (
  id bigserial primary key,
  full_name varchar(255)
);

-- knowledge_file
create table knowledge_file (
  id bigserial primary key,
  html_text text,
  plain_text text,
  properties jsonb,
  last_date_modified timestamptz,
  date_created timestamptz,
  word_count varchar(50)
);

-- which client holds which knowledge file
create TABLE client_knowledge_file (
  id bigserial primary key,
  client_id bigint not null references client(id),
  knowledge_file_id bigint not null unique references knowledge_file(id) on delete cascade
);
I know this is not an optimal solution and might not completely answer your question, but I ran into a similar problem and this is how I solved it.
I created a resolver pipeline.
In the first function, I used a select statement to get the current record.
In the second function, I checked whether the fields (in your case htmlText and properties) are null. If they are, I used the ctx.prev.result values; otherwise the new ones.
Practical example
First resolver function:
{
  "version": "2018-05-29",
  "statements": [
    "select id, html_text AS \"htmlText\", plain_text AS \"plainText\", properties, last_date_modified AS \"lastDateTimeModified\", date_created AS \"dateTimeCreated\" from knowledge_file where id = $ctx.args.Id"
  ]
}
Second resolver function:
#set($htmlText = $util.defaultIfNull($ctx.args.htmlText, $ctx.prev.result.htmlText))
#set($properties = $util.defaultIfNull($ctx.args.properties, $ctx.prev.result.properties))
{
  "version": "2018-05-29",
  "statements": [
    "update knowledge_file set html_text = $htmlText, plain_text = $ctx.args.plainText, properties = $properties, last_date_modified = CURRENT_TIMESTAMP, date_created = CURRENT_DATE where id = $ctx.args.Id returning id, html_text AS \"htmlText\", plain_text AS \"plainText\", properties, last_date_modified AS \"lastDateTimeModified\", date_created AS \"dateTimeCreated\""
  ]
}
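If you would rather keep everything inside the Lambda, a hedged alternative to building the SET clause by string concatenation is a single fixed, parameterized statement that uses coalesce() to keep the existing column value whenever an argument is absent (a sketch against the schema above; client is the pg client from the question):

const { id, htmlText, plainText, properties } = event.arguments;

const query = `
  update knowledge_file
     set html_text  = coalesce($2, html_text),
         plain_text = coalesce($3, plain_text),
         properties = coalesce($4, properties),
         last_date_modified = now()
   where id = $1
   returning *;`;

const res = await client.query(query, [
  id,
  htmlText ?? null,   // absent arguments become null, so coalesce() keeps the old value
  plainText ?? null,
  properties ?? null,
]);

One trade-off: with coalesce() a client can no longer explicitly set a column to NULL, because an omitted argument and an explicit null look identical to the query.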

How can I search by a field of a joined table in GraphQL and NestJS

I created two tables, Tag and TagTranslation.
The fields of each are as follows:
Tag
id, type, translations, created_at, updated_at
TagTranslation
id, tag_id, name, language
I use GraphQL, and I want to get a tag list filtered by type, name and language:
{ tags(name:"tag1", language:"en, type:3){
id,
type,
translations{
id,
name,
language,
}
}
}
So I created a resolver like the following:
@Query(returns => [Tag])
tags(@Args() tagArgs: TagArgs): Promise<Tag[]> {
  const where = {
    ...(tagArgs.type) && {type: tagArgs.type}
  };
  const include_where = {
    ...(tagArgs.name) && {name: { [Op.like]: `%${tagArgs.name}%` }},
    ...(tagArgs.language) && {language: tagArgs.language}
  };
  return this.tagService.findAll({
    where: where,
    include: {
      as: 'translations',
      model: TagTranslation,
      where: include_where,
      required: true,
    }
  });
}

@Query(returns => Tag)
tag(@Args({name: 'id', type: () => Int}) id: number): Promise<Tag> {
  return this.tagService.get(id)
}

@ResolveProperty()
async translations(@Parent() tag): Promise<TagTranslation[]> {
  const { id } = tag;
  return await this.tagTranslationService.findAll({tag_id: id});
}
When I call tags, two queries are executed.
First, a query runs that gets the results I want.
But then a second query,
SELECT `id`, `tag_id`, `name`, `language`, `created_at`, `updated_at` FROM `tag_translation` AS `TagTranslation` WHERE `TagTranslation`.`tag_id` = 1;
is executed once more, so I don't get the results I want.
I think the second query is caused by the ResolveProperty, so I removed the ResolveProperty; but after that, the tag query doesn't include the tagTranslation info at all...
How can I solve this problem? Or is there another idea?
Relations between entities should be resolved at the field-resolver (@ResolveProperty()) level, because when someone requests only id and type you would otherwise still perform an additional, unneeded join on TagTranslation in the SQL query.
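One hedged way to combine that advice with the filtered list query from the question is to let the field resolver reuse translations whenever the parent query already eager-loaded (and filtered) them, and only fall back to a separate lookup otherwise (a sketch; the service methods are the ones from the question):

@ResolveProperty()
async translations(@Parent() tag): Promise<TagTranslation[]> {
  // If the tags query already included (and filtered) translations via the
  // Sequelize include, reuse them instead of re-querying and overwriting
  // the filtered result.
  if (tag.translations) {
    return tag.translations;
  }
  // Fallback for queries that did not eager-load translations (e.g. tag(id)).
  return this.tagTranslationService.findAll({ tag_id: tag.id });
}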

Loopback custom password validation

Very simple question: if I try to validate a password in a User model, it seems I can only validate the already-encrypted password?
So for example if I use
Customer.validatesLengthOf('password', { min: 8, message: 'Too short' })
Then the encrypted password is checked (which is always longer than 8 characters), so no good... If I try to use a custom validation, how can I get access to the original password (the original req.body.password basically)?
EDIT (August 20, 2019): I am unsure if this is still an issue in the latest loopback releases.
In fact, this is a known problem in loopback. The tacitly approved solution is to override the <UserModel>.validatePassword() method with your own. YMMV.
akapaul commented on Jan 10, 2017:
I've found another way to do this. In the common User model there is a method called validatePassword. If we extend our UserModel from User, we can redefine this method in JS, like the following:
var g = require('loopback/lib/globalize');

module.exports = function(UserModel) {
  UserModel.validatePassword = function(plain) {
    var err,
        passwordProperties = UserModel.definition.properties.password;
    if (plain.length > passwordProperties.max) {
      err = new Error(g.f('Password too long: %s (maximum %d symbols)', plain, passwordProperties.max));
      err.code = 'PASSWORD_TOO_LONG';
    } else if (plain.length < passwordProperties.min) {
      err = new Error(g.f('Password too short: %s (minimum %d symbols)', plain, passwordProperties.min));
      err.code = 'PASSWORD_TOO_SHORT';
    } else if (!(new RegExp(passwordProperties.pattern, 'g').test(plain))) {
      err = new Error(g.f('Invalid password: %s (symbols and numbers are allowed)', plain));
      err.code = 'INVALID_PASSWORD';
    } else {
      return true;
    }
    err.statusCode = 422;
    throw err;
  };
};
This works for me. I don't think the g (globalize) object is required here, but I added it just in case. Also, I've added my validator options in the JSON definition of UserModel, as the Loopback docs suggest.
For using the above code, one would put their validation rules in the model's .json definition like so (see max, min, and pattern under properties.password):
{
  "name": "UserModel",
  "base": "User",
  ...
  "properties": {
    ...
    "password": {
      "type": "string",
      "required": true,
      ...
      "max": 50,
      "min": 8,
      "pattern": "(?=.*[A-Z])(?=.*[!@#$&*])(?=.*[0-9])(?=.*[a-z])^.*$"
    },
    ...
  },
  ...
}
OK, no answer, so what I'm doing is using a remote hook to get access to the original plain password; that'll do for now.
var plainPwd

Customer.beforeRemote('create', function (ctx, inst, next) {
  plainPwd = ctx.req.body.password
  next()
})
Then I can use it in a custom validation:
Customer.validate('password', function (err, res) {
  const pattern = new RegExp(/some-regex/)
  if (plainPwd && !pattern.test(plainPwd)) err()
}, { message: 'Invalid format' })
OK, I guess the above answer is quite novel and is obviously accepted, but if you want a really easy solution with just some basic validations and not much code, then loopback-mixin-complexity is the solution for you.
If you don't want to add another dependency, you can go ahead with a custom mixin that you add to your user model (or any other model where you need this kind of validation) and it will do the validation for you.
Here's sample code for how to create such a mixin:
module.exports = function(Model, options) {
  'use strict';

  Model.observe('before save', function event(ctx, next) { // Observe any insert/update event on Model
    if (ctx.instance) {
      if (!yourValidatorFn(ctx.instance.password))
        next('password not valid');
      else
        next();
    } else {
      if (!yourValidatorFn(ctx.data.password))
        next('password not valid');
      else
        next();
    }
  });
};
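For context, a hedged sketch of how such a mixin is typically wired up in LoopBack 3 (the mixin name PasswordValidator and the directory layout are illustrative assumptions, not part of the answer above): register the mixin directory in server/model-config.json and enable the mixin in the model's JSON definition.

server/model-config.json (excerpt):
{
  "_meta": {
    "mixins": ["loopback/common/mixins", "../common/mixins"]
  }
}

common/models/user-model.json (excerpt):
{
  "name": "UserModel",
  "base": "User",
  "mixins": {
    "PasswordValidator": true
  }
}

The mixin function itself would then live in common/mixins/password-validator.js.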

Grails update fails on unique constraint

I have a User domain in which I have a unique constraint on its employee number:
class User {
    Integer employeeNumber
    String employeeLogin
    String firstName
    String middleName
    String lastName
    String nickname
    Date birthday

    static mapping = {
        table 'user'
        id generator: 'assigned', name: 'employeeNumber', type: 'int'
        employeeNumber column: 'employee_number'
        version false
        sort 'lastName'
    }

    static constraints = {
        employeeNumber(blank: false, nullable: false, unique: true)
        employeeLogin(blank: false, nullable: false, unique: true)
        firstName(blank: false, nullable: false)
        middleName(blank: true, nullable: true)
        lastName(blank: false, nullable: false)
        nickname(blank: true, nullable: true)
        birthday(blank: true, nullable: true)
    }
}
and I am trying to update the user with
class UserController {
    ...
    def saveUser() {
        SimpleDateFormat formatter = new SimpleDateFormat("MM/dd/yyyy");
        if (params.userId) { // handles user update for existing employeeNumber
            def user = User.findByEmployeeNumber(params.userId) // sent in hidden field named userId with value employeeNumber
            user.employeeNumber = Integer.parseInt(params.employeeNumber)
            user.employeeLogin = params.employeeLogin
            user.firstName = params.firstName
            user.middleName = params.middleName
            user.lastName = params.lastName
            user.nickname = params.nickname
            try {
                user.birthday = formatter.parse(params.birthday)
            }
            catch (Exception ignore) {
                user.birthday = null
            }
            if (user.validate()) {
                user.save(flush: true, failOnError: true)
                redirect(action: 'profile', id: user.employeeNumber)
            } else {
                render(view: 'editUser', model: [user: user])
            }
        } else { // handles new user
            ... // this part works
        }
    }
    ...
}
but it fails at if (user.validate()) { ... } due to the unique constraints on employeeNumber and employeeLogin.
When creating a new user I want the username and id to be unique, but on update I obviously want to update an existing user; this unique constraint is blocking me from doing so. Any ideas on how to solve this problem?
You are getting this error because you have made the employeeNumber field your primary key.
For Hibernate/Grails, the primary key is the identifier value for an object. It is something that must never change during the life of the object.
During the update you are trying to modify the identifier, which is causing the issue.
You can definitely modify a unique field, so in your case you can modify employeeLogin; but you can't modify the primary key.
Also, since employeeNumber is already the primary key, don't declare it as a unique key. This will remove the unique-key error you are facing.
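Following that advice, a minimal sketch of the adjusted constraints block (assuming the rest of the domain class stays as posted); the controller's update branch should likewise stop reassigning user.employeeNumber:

static constraints = {
    // employeeNumber is the assigned identifier; drop unique: true here --
    // the primary key is already unique and must not change after creation
    employeeNumber(blank: false, nullable: false)
    employeeLogin(blank: false, nullable: false, unique: true)
    firstName(blank: false, nullable: false)
    middleName(blank: true, nullable: true)
    lastName(blank: false, nullable: false)
    nickname(blank: true, nullable: true)
    birthday(blank: true, nullable: true)
}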
