I'm looking for a way to validate the structure of an input in the Scaffolder, for example checking whether a string follows the "kebab case" pattern.
I would like to be able to use, for example, a RegEx for this validation.
Since the steps are written in a YAML file, I haven't found an effective way to enforce this validation.
NOTE: keep in mind that "react-jsonschema" is used for the construction of the forms.
It would be something like the following:
import {
scaffolderPlugin,
createScaffolderFieldExtension,
} from '@backstage/plugin-scaffolder';
import {
ValidateKebabCase,
validateKebabCaseValidation,
} from './ValidateKebabCase/ValidateKebabCaseExtension';
export const ValidateKebabCaseFieldExtension = scaffolderPlugin.provide(
createScaffolderFieldExtension({
name: 'ValidateKebabCase',
component: ValidateKebabCase,
validation: validateKebabCaseValidation,
}),
);
with
// FieldValidation comes from the JSON Schema form library used by the Scaffolder
// (e.g. '@rjsf/core'; newer setups import it from '@rjsf/utils' instead).
import { FieldValidation } from '@rjsf/core';

export const validateKebabCaseValidation = (
value: string,
validation: FieldValidation,
) => {
const kebabCase = /^[a-z0-9-_]+$/g.test(value);
if (kebabCase === false) {
validation.addError(
`Only use letters, numbers, hyphen ("-") and underscore ("_").`,
);
}
};
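Note that the pattern above also accepts underscores and leading or trailing hyphens. If you want to enforce strict kebab-case (lowercase words separated by single hyphens), a tighter check could be swapped in; a minimal sketch:

// one or more lowercase/digit groups joined by single hyphens
const isStrictKebabCase = (value: string) =>
  /^[a-z0-9]+(-[a-z0-9]+)*$/.test(value);

isStrictKebabCase('my-component-2'); // true
isStrictKebabCase('My_Component');   // false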
Check out our documentation for the full example: https://backstage.io/docs/features/software-templates/writing-custom-field-extensions#creating-a-field-extension
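For completeness, the ValidateKebabCase component imported above can be a plain text field that hands its value back through onChange. The sketch below follows the pattern from the linked docs, but the package paths and prop types vary by Backstage version (newer releases expose FieldExtensionComponentProps from @backstage/plugin-scaffolder-react instead of FieldProps from @rjsf/core). Once the extension is registered in the app, the template YAML refers to it with ui:field: ValidateKebabCase.

import React from 'react';
import { FieldProps } from '@rjsf/core';
import FormControl from '@material-ui/core/FormControl';
import Input from '@material-ui/core/Input';
import InputLabel from '@material-ui/core/InputLabel';

// Minimal sketch of the custom field: a single text input whose value is
// handed back to the form via onChange so the validation above can run on it.
export const ValidateKebabCase = ({
  onChange,
  rawErrors,
  required,
  formData,
}: FieldProps) => (
  <FormControl
    margin="normal"
    required={required}
    error={!!rawErrors?.length && !formData}
  >
    <InputLabel htmlFor="validateName">Name</InputLabel>
    <Input id="validateName" onChange={e => onChange(e.target.value)} />
  </FormControl>
);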
I wonder if there is a way (a plugin, middleware, etc.) to apply some function to filter the strings that are used in each mutation.
For instance:
# In my schema
input Comment {
  status: String # e.g. "SENT"
  comment: String
}

mutation updateStatus($id: String, $input: Comment!) {
  updateStatus(id: $id, input: $input)
}
// and I call the mutation from React
useMutation(UPDATE_STATUS, { options });
But I have many similar mutations where I use some "input", so I wonder if it's possible to filter each used input with a simple function like:
// This will take each value of the input object and replace the strings that match with "some char"
const clearMutationPayload = (input) => Object.values(input).map(val => val.replace('some char', ''));
Any idea?
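One possible shape for such a middleware, sketched here with Apollo Client's ApolloLink (the question's useMutation suggests Apollo or a similar client; the link and helper names are made up for illustration, and the character to strip is a placeholder):

import { ApolloLink } from '@apollo/client';

// Recursively strips the unwanted character from every string value in the
// variables object, descending into nested objects and arrays.
const sanitize = (value: unknown): unknown => {
  if (typeof value === 'string') return value.replace(/some char/g, '');
  if (Array.isArray(value)) return value.map(sanitize);
  if (value && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [k, sanitize(v)]),
    );
  }
  return value;
};

// A link that rewrites the variables of every operation before it is sent;
// chain it in front of your HttpLink when constructing the ApolloClient.
export const sanitizeVariablesLink = new ApolloLink((operation, forward) => {
  operation.variables = sanitize(operation.variables) as Record<string, unknown>;
  return forward(operation);
});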
I'm using PostGraphile and I have this query:
query Products($categories: [String]){
products(
filter: {
category: { in: $categories }
}) {
id
name
category
}
}
Is there a way to not apply the filter if the $categories array is empty?
If the array is not set or is empty, I want to get all the results.
I saw there is an option to pass the whole filter as an argument, but I wonder if there is another way to handle this.
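For context, the client-side route (passing the whole filter as a variable and only building it when there is something to filter by) could look roughly like the sketch below, assuming an Apollo-style client; the ProductFilter type name depends on how the connection-filter plugin names your schema's filter input. The answer that follows covers the server-side alternatives.

import { gql, useQuery } from '@apollo/client';

// Expose the whole filter object as a variable instead of hard-coding the "in" clause.
const PRODUCTS = gql`
  query Products($filter: ProductFilter) {
    products(filter: $filter) {
      id
      name
      category
    }
  }
`;

// Only build the filter when the categories array is non-empty;
// an undefined variable means the filter argument is simply omitted.
export const useProducts = (categories?: string[]) =>
  useQuery(PRODUCTS, {
    variables: {
      filter: categories?.length ? { category: { in: categories } } : undefined,
    },
  });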
Without making the filter a generic argument on the client side, there are two server-side options for customizing filter operators in the postgraphile-plugin-connection-filter (v2.3.0) plugin. Both require you to create an additional plugin that registers a build hook.
Option 1: Use addConnectionFilterOperator
addConnectionFilterOperator is a build extension function added by ConnectionFilterPlugin that is meant to allow you to add new operators for custom functionality.
Option 2: Modify Connection Filter Operators Directly
It's also possible to directly change the SQL output of a given operator by modifying the OperatorSpec values for each GraphQLType. These specs can be found in the following fields on the Build object:
connectionFilterArrayOperators,
connectionFilterEnumOperators,
connectionFilterRangeOperators,
connectionFilterScalarOperators
Below is an example of a combined plugin implementation for Options 1 and 2 that omits SQL generation when the argument value array for the operator is empty, effectively nullifying the operator and resulting in full data returns. This implementation allows for easy switching between modifying the in operators directly and adding new inExpansive operators to each connection filter's scalar fields.
import type { Build } from "postgraphile";
import type { Plugin, Options } from "graphile-build";
import type { AddConnectionFilterOperator } from "postgraphile-plugin-connection-filter/dist/PgConnectionArgFilterPlugin";
import type { PgType, SQL } from "graphile-build-pg";
import type { GraphQLInputType } from "graphql";
import type { OperatorSpec } from "postgraphile-plugin-connection-filter/dist/PgConnectionArgFilterOperatorsPlugin";
export interface InOperatorEmptyArrayCustomizationPluginBuildOpts {
inOperatorEmptyArrayCustomizationPlugin?: {
/**
* Add new "expansiveIn" operators with custom empty array instead of
* modifying existing "in" operators
*/
addNewOperator?: boolean;
};
}
/**
* Implements custom empty-array handling either by registering a new "inExpansive"
* operator for each connection filter or by modifying the existing "in" operators.
* This plugin must be appended AFTER postgraphile-plugin-connection-filter
* as it depends on its build extensions.
*/
const InOperatorEmptyArrayCustomizationPlugin: Plugin = (builder, options) => {
const { inOperatorEmptyArrayCustomizationPlugin } = options as Options &
InOperatorEmptyArrayCustomizationPluginBuildOpts;
const addNewOperator =
inOperatorEmptyArrayCustomizationPlugin?.addNewOperator === true;
// create a build hook to access ConnectionFilterPlugin build extensions.
builder.hook("build", (build) => {
const {
pgSql: sql,
graphql: { GraphQLList, GraphQLNonNull },
// this function is added as a build extension by the ConnectionFilterPlugin
// and allows for the addition of custom filter operators.
addConnectionFilterOperator,
gql2pg,
// this contains all existing ConnectionFilterPlugin scalar operators
// by GraphQL type
connectionFilterScalarOperators,
} = build as Build & {
addConnectionFilterOperator: AddConnectionFilterOperator;
connectionFilterScalarOperators: Record<
string,
Record<string, OperatorSpec>
>;
};
// Generate "in" SQL fragment from argument input values if values are
// present in the array. Otherwise, return null.
const resolveListSqlValue = (
input: unknown,
pgType: PgType,
pgTypeModifier: number | null,
resolveListItemSqlValue: (
elem: unknown,
pgType: PgType,
pgTypeModifier: number | null
) => unknown
) =>
(input as unknown[]).length === 0
? null
: sql.query`(${sql.join(
(input as unknown[]).map((i) =>
resolveListItemSqlValue
? resolveListItemSqlValue(i, pgType, pgTypeModifier)
: gql2pg(i, pgType, pgTypeModifier)
),
","
)})`;
// checks whether value is present before adding the sql filter fragment.
const resolve = (i: SQL, v: SQL) =>
v != null ? sql.fragment`${i} IN ${v}` : null;
// Find all the scalar GraphQLTypes that have an "in" operator.
const typesWithScalarInOperators = Object.entries(
connectionFilterScalarOperators
)
.filter(([, operations]) => operations.in)
.map(([typeName]) => typeName);
// modify existing "in" operators for every scalar type.
if (!addNewOperator) {
// The graphile build engine will emit a warning if you create
// a new build object using the standard javascript mechanisms.
// It will also throw an error if the existing
// connectionFilterScalarOperators field is replaced in the extension
// object...
const extendedBuild = build.extend(build, {});
// ...so we merge in the new operators in a separate step.
typesWithScalarInOperators.forEach((typeName) => {
extendedBuild.connectionFilterScalarOperators[typeName].in = {
// see https://github.com/graphile-contrib/postgraphile-plugin-connection-filter/blob/v2.3.0/src/PgConnectionArgFilterOperatorsPlugin.ts#L80-L85
// for existing "in" operator configuration
...extendedBuild.connectionFilterScalarOperators[typeName].in,
resolveSqlValue: resolveListSqlValue,
resolve,
};
});
return extendedBuild;
}
// Otherwise add a new operator called "inExpansive" that implements the custom
// empty array argument handling.
// see https://github.com/graphile-contrib/postgraphile-plugin-connection-filter/blob/v2.3.0/__tests__/customOperatorsPlugin.ts
// for `addConnectionFilterOperator` usage examples.
addConnectionFilterOperator(
// add the new operator to any type that has an "in" operator.
typesWithScalarInOperators,
"inExpansive",
"Included in the specified list -unless list is empty in which case this operator is not applied.",
// list of non-null element type
(fieldInputType: GraphQLInputType) =>
new GraphQLList(new GraphQLNonNull(fieldInputType)),
resolve,
{
resolveSqlValue: resolveListSqlValue,
}
);
return build;
});
};
export default InOperatorEmptyArrayCustomizationPlugin;
Append the plugin after ConnectionFilterPlugin in the PostGraphile middleware options:
// ...
appendPlugins: [
// ..
ConnectionFilterPlugin,
InOperatorEmptyArrayCustomizationPlugin
],
// ...
To enable the inExpansive operator (Option 1), add the relevant configuration to graphileBuildOptions in the PostGraphile middleware options:
graphileBuildOptions: {
inOperatorEmptyArrayCustomizationPlugin: {
addNewOperator: true
},
// other plugin options
}
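Putting the two pieces together, the PostGraphile middleware setup could look roughly like this (a sketch; the connection string, schema name, and plugin file path are placeholders):

import { postgraphile } from 'postgraphile';
import ConnectionFilterPlugin from 'postgraphile-plugin-connection-filter';
import InOperatorEmptyArrayCustomizationPlugin from './InOperatorEmptyArrayCustomizationPlugin';

export const middleware = postgraphile('postgres:///mydb', 'public', {
  appendPlugins: [
    ConnectionFilterPlugin,
    // must come after the connection filter plugin, which it depends on
    InOperatorEmptyArrayCustomizationPlugin,
  ],
  graphileBuildOptions: {
    inOperatorEmptyArrayCustomizationPlugin: {
      // true => add the new "inExpansive" operator (Option 1);
      // omit or set to false to modify the existing "in" operators (Option 2)
      addNewOperator: true,
    },
  },
});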
You can then use the inExpansive operator in the same way as the in operator:
query Products($categories: [String]){
products(
filter: {
category: { inExpansive: $categories }
}) {
id
name
category
}
}
I am trying to implement react-redux for my login form's input values.
I have added values to the redux state, but I cannot access the data individually from the state object.
Here are the details:
In App.js file
console.log(useSelector((state) => state));
gives the result {email: "demo@demo.com", password: "123456"}.
I am not able to access the email inside the state object using
console.log(useSelector((state) => state.email));
It is giving the error that
'email' does not exist on type 'DefaultRootState'. TS2339
Here is the reducer.js file
let formValues = {
email: "",
password: "",
};
export const inputReducer = (state = formValues, action) => {
switch (action.type) {
case "inputValue":
return { ...state, [action.name]: action.inputValue };
default:
return state;
}
};
Here is the action.ts file
export const handleChange = (name: string, inputValue: string) => {
return {
type: "inputValue",
name: name,
inputValue: inputValue,
};
}
I wrote a function to get around this problem:
function getProperty<T, K extends keyof T>(o: T, propertyName: K): T[K] {
return o[propertyName]; // o[propertyName] is of type T[K]
}
You have to pass your object as the first parameter, then the name of your property (here email or password).
If you want to get all your properties at once, you have to encapsulate them in an object property like this:
{ value: { email: "alan.turing@gmail.com", password: "123" } }
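For illustration, the getProperty helper above would be used like this (a minimal sketch, where formState stands in for the object returned by useSelector):

const formState = { email: "demo@demo.com", password: "123456" };

const email = getProperty(formState, "email");       // inferred as string
const password = getProperty(formState, "password"); // inferred as string
// getProperty(formState, "username");               // compile-time error: not a key of formState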
I may be late, but I thought I'd provide a solution. Basically, this type of error message appears when you don't provide the typing in the useSelector hook.
As per the React-Redux docs, which state:
Using configureStore should not need any additional typings. You will,
however, want to extract the RootState type and the Dispatch type so
that they can be referenced as needed.
Here in your code the RootState type is missing; it can be declared in your store file as below:
import {createStore} from 'redux';
// ... rootReducer definition ...
const store = createStore(rootReducer);
export default store;
export type RootState = ReturnType<typeof store.getState>;
And in the .tsx or .jsx file where you want to access your store values with the react-redux useSelector hook, add the type as below:
useSelector((state: RootState) => state)
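With that in place, the individual fields from the question can be read directly (a sketch, assuming the rootReducer is the inputReducer shown above so that email and password sit at the top level of the state; the store import path is an assumption):

import React from 'react';
import { useSelector } from 'react-redux';
import type { RootState } from './store';

// No more TS2339: state is typed, so state.email and state.password resolve.
export const LoginSummary = () => {
  const email = useSelector((state: RootState) => state.email);
  const password = useSelector((state: RootState) => state.password);
  return <p>{email} ({password.length} characters)</p>;
};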
I'm using the built in NestJS ValidationPipe along with class-validator and class-transformer to validate and sanitize inbound JSON body payloads. One scenario I'm facing is a mixture of upper and lower case property names in the inbound JSON objects. I'd like to rectify and map these properties to standard camel-cased models in our new TypeScript NestJS API so that I don't couple mismatched patterns in a legacy system to our new API and new standards, essentially using the @Transform in the DTOs as an isolation mechanism for the rest of the application. For example, properties on the inbound JSON object:
"propertyone",
"PROPERTYTWO",
"PropertyThree"
should map to
"propertyOne",
"propertyTwo",
"propertyThree"
I'd like to use @Transform to accomplish this, but I don't think my approach is correct. I'm wondering if I need to write a custom ValidationPipe. Here is my current approach.
Controller:
import { Body, Controller, Post, UsePipes, ValidationPipe } from '@nestjs/common';
import { TestMeRequestDto } from './testmerequest.dto';
@Controller('test')
export class TestController {
constructor() {}
@Post()
@UsePipes(new ValidationPipe({ transform: true }))
async get(@Body() testMeRequestDto: TestMeRequestDto): Promise<TestMeResponseDto> {
const response = do something useful here... ;
return response;
}
}
TestMeModel:
import { IsNotEmpty } from 'class-validator';
export class TestMeModel {
@IsNotEmpty()
someTestProperty!: string;
}
TestMeRequestDto:
import { IsNotEmpty, ValidateNested } from 'class-validator';
import { Transform, Type } from 'class-transformer';
import { TestMeModel } from './testme.model';
export class TestMeRequestDto {
@IsNotEmpty()
@Transform((propertyone) => propertyone.valueOf())
propertyOne!: string;
@IsNotEmpty()
@Transform((PROPERTYTWO) => PROPERTYTWO.valueOf())
propertyTwo!: string;
@IsNotEmpty()
@Transform((PropertyThree) => PropertyThree.valueOf())
propertyThree!: string;
@ValidateNested({ each: true })
@Type(() => TestMeModel)
simpleModel!: TestMeModel
}
Sample payload used to POST to the controller:
{
"propertyone": "test1",
"PROPERTYTWO": "test2",
"PropertyThree": "test3",
"simpleModel": { "sometestproperty": "test4" }
}
The issues I'm having:
The transforms seem to have no effect. class-validator tells me that each of those properties cannot be empty. If, for example, I change "propertyone" to "propertyOne", then the class-validator validation passes for that property, i.e. it sees the value. The same goes for the other two properties: if I camel-case them, class-validator is happy. Is this a symptom of the transform not running before the validation occurs?
This one is very weird. When I debug and evaluate the TestMeRequestDto object, I can see that the simpleModel property contains an object with a property named "sometestproperty", even though the class definition for TestMeModel has a camel-cased "someTestProperty". Why doesn't the @Type(() => TestMeModel) respect the proper casing of that property name? The value of "test4" is present in this property, so it knows how to understand that value and assign it.
Weirder still, the @IsNotEmpty() validation for the "someTestProperty" property on the TestMeModel is not failing, i.e. it sees the "test4" value and is satisfied, even though the inbound property name in the sample JSON payload is "sometestproperty", which is all lower case.
Any insight and direction from the community would be greatly appreciated. Thanks!
You'll probably need to make use of the Advanced Usage section of the class-transformer docs. Essentially, your @Transform() would need to look something like this:
import { IsNotEmpty, ValidateNested } from 'class-validator';
import { Transform, Type } from 'class-transformer';
import { TestMeModel } from './testme.model';
export class TestMeRequestDto {
@IsNotEmpty()
@Transform((value, obj) => obj.propertyone.valueOf())
propertyOne!: string;
@IsNotEmpty()
@Transform((value, obj) => obj.PROPERTYTWO.valueOf())
propertyTwo!: string;
@IsNotEmpty()
@Transform((value, obj) => obj.PropertyThree.valueOf())
propertyThree!: string;
@ValidateNested({ each: true })
@Type(() => TestMeModel)
simpleModel!: TestMeModel
}
This should take an incoming payload of
{
"propertyone": "value1",
"PROPERTYTWO": "value2",
"PropertyThree": "value3",
}
and turn it into the DTO you envision.
Edit 12/30/2020
So the original idea I had of using @Transform() doesn't quite work as envisioned, which is a real bummer cause it looks so nice. So what you can do instead isn't quite as DRY, but it still works with class-transformer, which is a win. By making use of @Exclude() and @Expose() you're able to use property accessors as an alias for the weirdly named property, looking something like this:
class CorrectedDTO {
@Expose()
get propertyOne(): string {
return this.propertyONE;
}
@Expose()
get propertyTwo(): string {
return this.PROPERTYTWO;
}
@Expose()
get propertyThree(): string {
return this.PrOpErTyThReE;
}
@Exclude({ toPlainOnly: true })
propertyONE: string;
@Exclude({ toPlainOnly: true })
PROPERTYTWO: string;
@Exclude({ toPlainOnly: true })
PrOpErTyThReE: string;
}
Now you're able to access dto.propertyOne and get the expected property, and when you do classToPlain it will strip out propertyONE and the other oddly cased properties (if you're using Nest's serialization interceptor; otherwise, in a secondary pipe, you could do plainToClass(NewDTO, classToPlain(value)), where NewDTO has only the corrected fields).
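The secondary-pipe idea mentioned above could look roughly like this (a sketch; CleanDTO and StripAliasedPropsPipe are hypothetical names, and classToPlain/plainToClass match the older class-transformer API used elsewhere in this answer):

import { Injectable, PipeTransform } from '@nestjs/common';
import { classToPlain, plainToClass } from 'class-transformer';

// Hypothetical DTO that carries only the corrected, camelCased fields.
export class CleanDTO {
  propertyOne: string;
  propertyTwo: string;
  propertyThree: string;
}

// Serializes the CorrectedDTO from the snippet above (dropping the excluded,
// oddly cased properties) and deserializes the result into CleanDTO.
@Injectable()
export class StripAliasedPropsPipe implements PipeTransform {
  transform(value: CorrectedDTO): CleanDTO {
    return plainToClass(CleanDTO, classToPlain(value));
  }
}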
The other thing you may want to look into is an automapper and see if it has better capabilities for something like this.
If you're interested, here's the StackBlitz I was using to test this out
As an alternative to Jay's excellent answer, you could also create a custom pipe where you keep the logic for mapping/transforming the request payload to your desired DTO. It can be as simple as this:
import { ArgumentMetadata, PipeTransform } from '@nestjs/common';
import { IsNotEmpty } from 'class-validator';

export class RequestConverterPipe implements PipeTransform {
  transform(body: any, metadata: ArgumentMetadata): TestMeRequestDto {
    const result = new TestMeRequestDto();
    // can of course contain more sophisticated mapping logic
    result.propertyOne = body.propertyone;
    result.propertyTwo = body.PROPERTYTWO;
    result.propertyThree = body.PropertyThree;
    return result;
  }
}

export class TestMeRequestDto {
  @IsNotEmpty()
  propertyOne: string;
  @IsNotEmpty()
  propertyTwo: string;
  @IsNotEmpty()
  propertyThree: string;
}
You can then use it like this in your controller (but you need to make sure that the order is correct, i.e. the RequestConverterPipe must run before the ValidationPipe which also means that the ValidationPipe cannot be globally set):
@UsePipes(new RequestConverterPipe(), new ValidationPipe())
async post(@Body() requestDto: TestMeRequestDto): Promise<TestMeResponseDto> {
// ...
}
I have a component that needs to query two entirely separate tables. What do the schema, query and resolver need to look like in this case? I've googled but haven't found examples yet. Thanks in advance for any info.
UPDATE:
On Slack I see there may be a way to use compose for this purpose, e.g.:
export default compose(
graphql(query1,
....),
graphql(query2,
....),
graphql(query3,
....),
withApollo
)(PrintListEditPage)
Is there a way to have multiple declarations like this:
const withMutations = graphql(updateName, {
props({ mutate }) {
return {
updatePrintListName({ printListId, name }) {
return mutate({
variables: { printListId, name },
});
},
};
},
});
...that come before the call to export default compose?
The graphql function takes an optional second (config) argument that allows you to alias the passed-down property name. If you have multiple mutations, you can use the name property to rename mutate as needed:
import { graphql, compose } from 'react-apollo'
export default compose(
graphql(mutation1, { name: 'createSomething' }),
graphql(mutation2, { name: 'deleteSomething' }),
)(Component)
For more details see the complete API.
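For illustration, the aliased mutations then arrive as individual props on the wrapped component; a sketch (prop and variable names are assumptions):

import React from 'react';

type MutationFn = (opts: { variables: Record<string, unknown> }) => Promise<unknown>;

// Each graphql() call above injects its mutation under the alias given in `name`,
// so the component receives createSomething and deleteSomething instead of `mutate`.
export const Buttons = ({ createSomething, deleteSomething }: {
  createSomething: MutationFn;
  deleteSomething: MutationFn;
}) => (
  <div>
    <button onClick={() => createSomething({ variables: { name: 'demo' } })}>Create</button>
    <button onClick={() => deleteSomething({ variables: { id: 1 } })}>Delete</button>
  </div>
);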