When I execute a request with a hexadecimal alias, I get an error message:
query infosTokenAddress {
00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3: transaction(address: "00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3") {
... TransactionFields
}
}
fragment TransactionFields on Transaction {data { content } }
Syntax Error GraphQL request (2:4) Invalid number, unexpected digit after 0: "0".
1: query infosTokenAddress {
2: 00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3: transaction(address: "00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3") {
^
3: ... TransactionFields
If I add an alphabetic prefix, it works, but I need to keep the hex value.
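For context, the GraphQL spec restricts every Name (aliases included) to /[_A-Za-z][_0-9A-Za-z]*/, so an alias can never begin with a digit. A minimal illustration of the prefix workaround, using an underscore prefix that the client strips again when reading the response:

```graphql
query infosTokenAddress {
  # "_" prefix makes the alias a valid GraphQL Name; strip it client-side
  _00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3: transaction(address: "00007659311B67BAA83A952944B75F09142DA1D554EBDD6F88E9C9FCF9BD365F0BB3") {
    ...TransactionFields
  }
}
```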
I'm using PostGraphile and I have this query:
query Products($categories: [String]){
products(
filter: {
category: { in: $categories }
}) {
id
name
category
}
}
Is there a way to skip the filter if the $categories array is empty?
If the array is not set or is empty, I want to get all the results.
I saw there is an option to pass the whole filter as an argument, but I wonder if there is another approach.
Without making the filter a generic argument on the client side, there are two server-side options for customizing filter operators in the postgraphile-plugin-connection-filter (v2.3.0) plugin. Both require you to create an additional plugin that registers a build hook.
Option 1: Use addConnectionFilterOperator
addConnectionFilterOperator is a build extension function added by ConnectionFilterPlugin that allows you to add new operators for custom functionality.
Option 2: Modify Connection Filter Operators Directly
It's also possible to directly change the SQL output of a given operator by modifying the OperatorSpec values for each GraphQLType. These specs can be found in the following fields on the Build object:
connectionFilterArrayOperators,
connectionFilterEnumOperators,
connectionFilterRangeOperators,
connectionFilterScalarOperators
Below is an example of a combined plugin implementation for Options 1 and 2 that omits SQL generation when the argument value array for the operator is empty, effectively nullifying the operator and returning the full result set. This implementation allows for easy switching between modifying the in operators directly and adding new inExpansive operators to each connection filter's scalar fields.
import type { Build } from "postgraphile";
import type { Plugin, Options } from "graphile-build";
import type { AddConnectionFilterOperator } from "postgraphile-plugin-connection-filter/dist/PgConnectionArgFilterPlugin";
import type { PgType, SQL } from "graphile-build-pg";
import type { GraphQLInputType } from "graphql";
import type { OperatorSpec } from "postgraphile-plugin-connection-filter/dist/PgConnectionArgFilterOperatorsPlugin";
export interface InOperatorEmptyArrayCustomizationPluginBuildOpts {
inOperatorEmptyArrayCustomizationPlugin?: {
/**
 * Add new "inExpansive" operators with custom empty-array handling instead
 * of modifying the existing "in" operators
*/
addNewOperator?: boolean;
};
}
/**
 * Implements custom empty-array handling either by registering a new "inExpansive"
 * operator for each connection filter or by modifying the existing "in" operators.
* This plugin must be appended AFTER postgraphile-plugin-connection-filter
* as it depends on its build extensions.
*/
const InOperatorEmptyArrayCustomizationPlugin: Plugin = (builder, options) => {
const { inOperatorEmptyArrayCustomizationPlugin } = options as Options &
InOperatorEmptyArrayCustomizationPluginBuildOpts;
const addNewOperator =
inOperatorEmptyArrayCustomizationPlugin?.addNewOperator === true;
// create a build hook to access ConnectionFilterPlugin build extensions.
builder.hook("build", (build) => {
const {
pgSql: sql,
graphql: { GraphQLList, GraphQLNonNull },
// this function is added as a build extension by the ConnectionFilterPlugin
// and allows for the addition of custom filter operators.
addConnectionFilterOperator,
gql2pg,
// this contains all existing ConnectionFilterPlugin scalar operators
// by GraphQL type
connectionFilterScalarOperators,
} = build as Build & {
addConnectionFilterOperator: AddConnectionFilterOperator;
connectionFilterScalarOperators: Record<
string,
Record<string, OperatorSpec>
>;
};
// Generate "in" SQL fragment from argument input values if values are
// present in the array. Otherwise, return null.
const resolveListSqlValue = (
input: unknown,
pgType: PgType,
pgTypeModifier: number | null,
resolveListItemSqlValue: (
elem: unknown,
pgType: PgType,
pgTypeModifier: number | null
) => unknown
) =>
(input as unknown[]).length === 0
? null
: sql.query`(${sql.join(
(input as unknown[]).map((i) =>
resolveListItemSqlValue
? resolveListItemSqlValue(i, pgType, pgTypeModifier)
: gql2pg(i, pgType, pgTypeModifier)
),
","
)})`;
// checks whether value is present before adding the sql filter fragment.
const resolve = (i: SQL, v: SQL) =>
v != null ? sql.fragment`${i} IN ${v}` : null;
// Find all the scalar GraphQLTypes that have an "in" operator.
const typesWithScalarInOperators = Object.entries(
connectionFilterScalarOperators
)
.filter(([, operations]) => operations.in)
.map(([typeName]) => typeName);
// modify existing "in" operators for every scalar type.
if (!addNewOperator) {
// The graphile build engine will emit a warning if you create
// a new build object using the standard javascript mechanisms.
// It will also throw an error if the existing
// connectionFilterScalarOperations field is replaced in the extension
// object...
const extendedBuild = build.extend(build, {});
// ...so we merge in the new operators in a separate step.
typesWithScalarInOperators.forEach((typeName) => {
extendedBuild.connectionFilterScalarOperators[typeName].in = {
// see https://github.com/graphile-contrib/postgraphile-plugin-connection-filter/blob/v2.3.0/src/PgConnectionArgFilterOperatorsPlugin.ts#L80-L85
// for existing "in" operator configuration
...extendedBuild.connectionFilterScalarOperators[typeName].in,
resolveSqlValue: resolveListSqlValue,
resolve,
};
});
return extendedBuild;
}
// Otherwise add a new operator called "inExpansive" that implements the custom
// empty array argument handling.
// see https://github.com/graphile-contrib/postgraphile-plugin-connection-filter/blob/v2.3.0/__tests__/customOperatorsPlugin.ts
// for `addConnectionFilterOperator` usage examples.
addConnectionFilterOperator(
// add the new operator to any type that has an "in" operator.
typesWithScalarInOperators,
"inExpansive",
"Included in the specified list, unless the list is empty, in which case this operator is not applied.",
// list of non-null element type
(fieldInputType: GraphQLInputType) =>
new GraphQLList(new GraphQLNonNull(fieldInputType)),
resolve,
{
resolveSqlValue: resolveListSqlValue,
}
);
return build;
});
};
export default InOperatorEmptyArrayCustomizationPlugin;
Append the plugin after ConnectionFilterPlugin in the PostGraphile middleware options:
// ...
appendPlugins: [
// ..
ConnectionFilterPlugin,
InOperatorEmptyArrayCustomizationPlugin
],
// ...
To enable the inExpansive operator (Option 1), add the relevant configuration to `graphileBuildOptions` in the PostGraphile middleware options:
graphileBuildOptions: {
inOperatorEmptyArrayCustomizationPlugin: {
addNewOperator: true
},
// other plugin options
}
You can then use the inExpansive operator the same way as the in operator:
query Products($categories: [String]){
products(
filter: {
category: { inExpansive: $categories }
}) {
id
name
category
}
}
Document stored in MongoDB:
{
"CNF_SERVICE_ID":"1",
"SERVICE_CATEGORY":"COMMON_SERVICE",
"SERVICES":[{
"SERVICE_NAME":"Authentication Service",
"VERSIONS":[{
"VERSION_NAME":"AuthenticationServiceV6_3",
"VERSION_NUMBER":"2",
"VERSION_NOTES":"test",
"RELEASE_DATE":"21-02-2020",
"OBSOLETE_DATE":"21-02-2020",
"STATUS":"Y",
"GROUPS":[{
"GROUP_NAME":"TEST GROUP",
"CREATED_DATE":"",
"NODE_NAMES":[
""
],
"CUSTOMERS":[{
"CUSTOMER_CONFIG_ID":"4",
"ACTIVATION_DATE":"21-02-2020",
"DEACTIVATION_DATE":"21-02-2020",
"STATUS":"Y"
}]
}]
}]
}
]
}
Now, I need to add another customer JSON to the "CUSTOMERS" array inside "GROUPS" in the same document above. The customer JSON would look like this:
{
"CUSTOMER_CONFIG_ID":"10",
"ACTIVATION_DATE":"16-03-2020",
"DEACTIVATION_DATE":"16-03-2021",
"STATUS":"Y"
}
I tried this:
Update update = new Update().push("SERVICES.$.VERSIONS.GROUPS.CUSTOMERS",customerdto);
mongoOperations.update(query, update, Myclass.class, "mycollection");
But I am getting this exception: org.springframework.data.mongodb.UncategorizedMongoDbException: Command failed with error 28 (PathNotViable): 'Cannot create field 'GROUPS' in element
[ EDIT ADD ]
I was able to update it using the filtered positional operator. Below is the query I used:
update(
{ "SERVICE_CATEGORY":"COMMON_SERVICE", "SERVICES.SERVICE_NAME":"Authentication Service", "SERVICES.VERSIONS.VERSION_NAME":"AuthenticationServiceV6_3"},
{ $push:{"SERVICES.$[].VERSIONS.$[].GROUPS.$[].CUSTOMERS": { "CUSTOMER_CONFIG_ID":"6", "ACTIVATION_DATE":"31-03-2020", "STATUS":"Y" } } }
);
Actually, this query updated all the fields irrespective of the filter conditions. So I tried the following, but I am facing a syntax exception. Please help.
update(
{"SERVICE_CATEGORY":"COMMON_SERVICE"},
{"SERVICES.SERVICE_NAME":"Authentication Service"},
{"SERVICES.VERSIONS.VERSION_NAME":"AuthenticationServiceV6_3"}
{
$push:{"SERVICES.$[service].VERSIONS.$[version].GROUPS.$[group].CUSTOMERS":{
"CUSTOMER_CONFIG_ID":"6",
"ACTIVATION_DATE":"31-03-2020",
"STATUS":"Y"
}
}
},
{
multi: true,
arrayFilters: [ { $and:[{ "version.VERSION_NAME": "AuthenticationServiceV6_3"},{"service.SERVICE_NAME":"Authentication Service"},{"group.GROUP_NAME":"TEST GROUP"}]} ]
}
);
Update: April 1, 2020
The code I tried:
validationquery.addCriteria(Criteria.where("SERVICE_CATEGORY").is(servicedto.getService_category()).and("SERVICES.SERVICE_NAME").is(servicedetail.getService_name()).and("SERVICES.VERSIONS.VERSION_NAME").is(version.getVersion_name()));
Update update=new Update().push("SERVICES.$[s].VERSIONS.$[v].GROUPS.$[].CUSTOMERS", customer).filterArray(Criteria.where("SERVICE_CATEGORY").is(servicedto.getService_category()).and("s.SERVICE_NAME").is(servicedetail.getService_name()).and("v.VERSION_NAME").is(version.getVersion_name()));
mongoOperations.updateMulti(validationquery, update, ServiceRegistrationDTO.class, collection, key,env);
The below exception is thrown:
ERROR com.sample.amt.mongoTemplate.MongoOperations - Exception in count(query, collectionName,key,env) :: org.springframework.dao.DataIntegrityViolationException: Error parsing array filter :: caused by :: Expected a single top-level field name, found 'SERVICE_CATEGORY' and 's'; nested exception is com.mongodb.MongoWriteException: Error parsing array filter :: caused by :: Expected a single top-level field name, found 'SERVICE_CATEGORY' and 's'
This update query adds the JSON to the nested array, "SERVICES.VERSIONS.GROUPS.CUSTOMERS", based upon the specified filter conditions. Note that your filter conditions direct the update operation to the specific array (of the nested arrays).
// JSON document to be added to the CUSTOMERS array
new_cust = {
"CUSTOMER_CONFIG_ID": "6",
"ACTIVATION_DATE": "31-03-2020",
"STATUS": "Y"
}
db.collection.update(
{
"SERVICE_CATEGORY": "COMMON_SERVICE",
"SERVICES.SERVICE_NAME": "Authentication Service",
"SERVICES.VERSIONS.VERSION_NAME": "AuthenticationServiceV6_3"
},
{
$push: { "SERVICES.$[s].VERSIONS.$[v].GROUPS.$[g].CUSTOMERS": new_cust }
},
{
multi: true,
arrayFilters: [
{ "s.SERVICE_NAME": "Authentication Service" },
{ "v.VERSION_NAME": "AuthenticationServiceV6_3" },
{ "g.GROUP_NAME": "TEST GROUP" }
]
}
);
A few things to note when updating documents with nested arrays of more than one level of nesting:
Use the all positional operator $[] and the filtered positional operator $[<identifier>], not the $ positional operator.
With the filtered positional operator, specify the array filter conditions using the arrayFilters parameter. Note that this directs your update to target the specific nested array.
For the filtered positional operator $[<identifier>], the identifier must begin with a lowercase letter and contain only alphanumeric characters.
References:
Array Update Operators
db.collection.update() with arrayFilters
Thanks to @prasad_ for providing the query. I was eventually able to convert the query successfully to code with Spring Data MongoTemplate's updateMulti method. I have posted the code below:
Query validationquery = new Query();
validationquery.addCriteria(Criteria.where("SERVICE_CATEGORY").is(servicedto.getService_category()).and("SERVICES.SERVICE_NAME").is(servicedetail.getService_name()).and("SERVICES.VERSIONS.VERSION_NAME").is(version.getVersion_name()));
Update update=new Update().push("SERVICES.$[s].VERSIONS.$[v].GROUPS.$[].CUSTOMERS", customer).filterArray(Criteria.where("s.SERVICE_NAME").is(servicedetail.getService_name())).filterArray(Criteria.where("v.VERSION_NAME").is(version.getVersion_name()));
mongoOperations.updateMulti(validationquery, update, ServiceRegistrationDTO.class, collection, key,env);
I've built some new Ace Editor modes for my custom language (JMS message representations) with a sophisticated state machine. Now it would be great to reuse that syntax highlighting to also produce the errors. Is that possible?
In other words, let's say my syntax highlighting creates 'invalid' tokens, and I want to use the line number of each such token to flag an error and then do something like this: https://github.com/ajaxorg/ace/wiki/Syntax-validation
The simplest format is the HEX format:
this.$rules = {
"start": [
{ regex: /[!#].*$/, token: "comment" },
{ regex: /^0x[0-9a-f]+:/, token: "constant" }, // hex offset
{ regex: /(?:[0-9a-fA-F]{4} |[0-9a-fA-F]{2} )/, token: "constant.numeric" }, // hex value
{ regex: /[\S ]{1,16}$/, token: "string" }, // printable value
{ regex: "\\s+", token: "text" },
{ defaultToken: "invalid" }
]
};
And let's say the editor produced a state with an invalid token in line 4.
Is there a (preferably easy) way to get to the line numbers of my invalid tokens? Or to reuse my $rules state machine for syntax checking?
Found it. I must admit, Ace Editor is really good stuff; it always works as expected.
What works for me: after computing the tokens of the document with the rules state machine, I iterate through all tokens, find the ones that are 'invalid', and then set annotations on those lines. Initially the message is simply 'Syntax error', but different types of 'invalid' could mean different things in the future. This way I only have to write the syntax validation once.
aceEditor.session.on('change', function(delta) {
var sess = aceEditor.session;
sess.clearAnnotations();
var invalids = [];
for( var row=0;row<sess.getLength();row++ ) {
var tokens = sess.getTokens(row);
if( !tokens ) continue;
for( var t=0;t<tokens.length;t++ ) {
if( tokens[t].type==="invalid" ) {
invalids.push({ row: row, column: 0, text: "Syntax error", type: "error" });
}
}
}
sess.setAnnotations( invalids );
});
There might be a smarter way to do this (maybe an onToken(type,row,column) function somewhere?), but the above works for me.
I have following GraphQLEnumType
const PackagingUnitType = new GraphQLEnumType({
name: 'PackagingUnit',
description: '',
values: {
Carton: { value: 'Carton' },
Stack: { value: 'Stack' },
},
});
In a mutation, if I pass the PackagingUnit value as Carton (without quotes) it works, but if I pass it as the string 'Carton' it throws the following error:
In field "packagingUnit": Expected type "PackagingUnit", found "Carton"
Is there a way to pass the enum as a string from the client side?
EDIT:
I have a form on my front end where I collect the PackagingUnit type from the user along with other fields. The PackagingUnit type is represented as a string on the front end (not the GraphQL enum type). Since I am not using Apollo Client or Relay, I had to construct the GraphQL query string myself.
Right now I collect the form data as JSON, run it through JSON.stringify(), and then remove the double quotes on the property names to get the final GraphQL-compatible query.
E.g. my form has two fields, packagingUnitType (a GraphQLEnumType) and noOfUnits (a GraphQLFloat).
My JSON structure is:
{
packagingUnitType: "Carton",
noOfUnits: 10
}
I convert this to a string using JSON.stringify():
'{"packagingUnitType":"Carton","noOfUnits":10}'
And then remove the double quotes on the property names:
{packagingUnitType:"Carton",noOfUnits:10}
Now this can be passed to the GraphQL server like:
newStackMutation(input: {packagingUnitType:"Carton", noOfUnits:10}) {
...
}
This works only if the enum value does not have any quotes, like below:
newStackMutation(input: {packagingUnitType:Carton, noOfUnits:10}) {
...
}
Thanks
GraphQL queries can accept variables. This will be easier for you, as you will not have to do any tricky string concatenation.
I suppose you use GraphQLHttp or similar. To send your variables along with the query, send a JSON body with a query key and a variables key:
// JSON body
{
"query": "query MyQuery { ... }",
"variables": {
"variable1": ...,
}
}
The query syntax is:
mutation MyMutation($input: NewStackMutationInput) {
newStackMutation(input: $input) {
...
}
}
And then, you can pass your variable as:
{
"input": {
"packagingUnitType": "Carton",
"noOfUnits": 10
}
}
GraphQL will understand that packagingUnitType is an enum type and will do the conversion for you.
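Concretely, the quote-stripping step from the question then becomes unnecessary: the whole request body can be built with plain JSON.stringify. A minimal sketch (the /graphql endpoint and the id selection are assumptions for illustration):

```javascript
// Build the operation once; variables stay ordinary JSON.
const query = `
  mutation MyMutation($input: NewStackMutationInput) {
    newStackMutation(input: $input) {
      id
    }
  }
`;

const variables = {
  input: {
    packagingUnitType: "Carton", // plain string; coerced to the enum server-side
    noOfUnits: 10,
  },
};

// No manual quote removal needed - JSON.stringify produces the full body.
const body = JSON.stringify({ query, variables });

// e.g. with fetch (browser or Node 18+):
// fetch("/graphql", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body,
// });
```

The enum value travels as a normal JSON string inside variables; only because it is declared as $input: NewStackMutationInput in the operation does the server coerce it to the enum.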