Validate mandatory field in Request Data in JMeter using JSR223 Assertion - jmeter

I want to check whether a mandatory field is missing from the Request Data JSON. Secondly, I want to check whether the supporting JSON object for the given CustomerType is missing.
Here is my JSON that I want to validate.
{
"Transaction": {
"TrType": "Vehicle", -- This is a mandatory field
"CustomerType": "Individual" -- Mandatory; depending on the CustomerType, the user must also pass IndividualClient or CompanyClient
},
"IndividualClient": {
"FirstName": "Test First Name", -- Optional field
"LName": "Test Last Name" -- Mandatory field
},
"CompanyClient": {
"CompanyName": "Company Name" -- Mandatory field
}
}
How can I achieve this using a JSR223 Assertion?

You need to assert only the object that matches the CustomerType, Individual or Company:
if (jsonRequest.Transaction.CustomerType.contains("Individual")) {
    assert jsonRequest.IndividualClient
    assert jsonRequest.IndividualClient.size() > 0
    assert jsonRequest.IndividualClient.LName
} else {
    assert jsonRequest.CompanyClient
    assert jsonRequest.CompanyClient.size() > 0
    assert jsonRequest.CompanyClient.CompanyName
}
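A fuller sketch of the same checks, written as a standalone Python function for illustration only (in an actual JSR223 Assertion you would parse the request with Groovy's JsonSlurper and report failures via AssertionResult.setFailure / setFailureMessage):

```python
import json

# Standalone sketch of the validation logic from the question.
# Returns a list of error messages; an empty list means the request is valid.
def validate_request(body: str) -> list[str]:
    data = json.loads(body)
    errors = []
    tx = data.get("Transaction", {})
    if not tx.get("TrType"):
        errors.append("Transaction.TrType is mandatory")
    customer_type = tx.get("CustomerType")
    if not customer_type:
        errors.append("Transaction.CustomerType is mandatory")
    elif customer_type == "Individual":
        client = data.get("IndividualClient")
        if not client:
            errors.append("IndividualClient object is mandatory for Individual")
        elif not client.get("LName"):
            errors.append("IndividualClient.LName is mandatory")
    else:
        client = data.get("CompanyClient")
        if not client:
            errors.append("CompanyClient object is mandatory for Company")
        elif not client.get("CompanyName"):
            errors.append("CompanyClient.CompanyName is mandatory")
    return errors
```

In JMeter, the equivalent Groovy script would read the request body from the sampler and flag the sample as failed when the returned list is non-empty.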

Related

Get complete GraphQL response using POST without specifying field names in request [duplicate]

Assume you have a GraphQL type and it includes many fields.
How to query all the fields without writing down a long query that includes the names of all the fields?
For example, If I have these fields :
public function fields()
{
return [
'id' => [
'type' => Type::nonNull(Type::string()),
'description' => 'The id of the user'
],
'username' => [
'type' => Type::string(),
'description' => 'The email of user'
],
'count' => [
'type' => Type::int(),
'description' => 'login count for the user'
]
];
}
To query all the fields usually the query is something like this:
FetchUsers{users(id:"2"){id,username,count}}
But I want a way to have the same results without writing all the fields, something like this:
FetchUsers{users(id:"2"){*}}
//or
FetchUsers{users(id:"2")}
Is there a way to do this in GraphQL?
I'm using the Folkloreatelier/laravel-graphql library.
Unfortunately what you'd like to do is not possible. GraphQL requires you to be explicit about specifying which fields you would like returned from your query.
Yes, you can do this using introspection. Make a GraphQL query like (for type UserType)
{
__type(name:"UserType") {
fields {
name
description
}
}
}
and you'll get a response like (actual field names will depend on your actual schema/type definition)
{
"data": {
"__type": {
"fields": [
{
"name": "id",
"description": ""
},
{
"name": "username",
"description": "Required. 150 characters or fewer. Letters, digits, and #/./+/-/_ only."
},
{
"name": "firstName",
"description": ""
},
{
"name": "lastName",
"description": ""
},
{
"name": "email",
"description": ""
},
( etc. etc. ...)
]
}
}
}
You can then read this list of fields in your client and dynamically build a second GraphQL query to get the values of these fields.
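For illustration, a minimal sketch (in Python, with a hypothetical helper name) of that second step, assuming the introspection response has already been parsed into a list of field dicts:

```python
# Build a query string from the fields returned by an introspection query.
# `type_fields` mirrors the "fields" array in the introspection response above;
# `root` and `args` are the root field name and its inline arguments.
def build_query(type_fields: list, root: str, args: str) -> str:
    names = " ".join(f["name"] for f in type_fields)
    return f"{{ {root}({args}) {{ {names} }} }}"

fields = [{"name": "id"}, {"name": "username"}, {"name": "count"}]
query = build_query(fields, "users", 'id: "2"')
# query == '{ users(id: "2") { id username count } }'
```

A real client would then send this generated string as its second request.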
This relies on you knowing the name of the type that you want to get the fields for -- if you don't know the type, you could get all the types and fields together using introspection like
{
__schema {
types {
name
fields {
name
description
}
}
}
}
NOTE: This is the over-the-wire GraphQL data -- you're on your own to figure out how to read and write with your actual client. Your GraphQL javascript library may already employ introspection in some capacity. For example, the apollo codegen command uses introspection to generate types.
2022 Update
Since this answer was originally written, it is now a recommended security practice to TURN OFF introspection in production. Reference: Why you should disable GraphQL introspection in production.
For an environment where introspection is off in production, you could use it in development as a way to assist in creating a static query that was used in production; you wouldn't actually be able to create a query dynamically in production.
I guess the only way to do this is by utilizing reusable fragments:
fragment UserFragment on Users {
id
username
count
}
FetchUsers {
users(id: "2") {
...UserFragment
}
}
I faced the same issue when I needed to load location data that I had serialized into the database from the Google Places API. Generally I wanted the whole object so it would work with maps, but I didn't want to have to specify all of the fields every time.
I was working in Ruby so I can't give you the PHP implementation but the principle should be the same.
I defined a custom scalar type called JSON which just returns a literal JSON object.
The ruby implementation was like so (using graphql-ruby)
module Graph
module Types
JsonType = GraphQL::ScalarType.define do
name "JSON"
coerce_input -> (x) { x }
coerce_result -> (x) { x }
end
end
end
Then I used it for our objects like so
field :location, Types::JsonType
I would use this very sparingly, though, only where you know you always need the whole JSON object (as I did in my case). Otherwise it defeats the purpose of GraphQL more generally.
The GraphQL query format was designed to ensure that:
the query and the result have exactly the same shape, and
the server knows exactly which fields were requested, so the client downloads only essential data.
However, according to GraphQL documentation, you may create fragments in order to make selection sets more reusable:
# Only most used selection properties
fragment UserDetails on User {
id,
username
}
Then you could query all user details by:
FetchUsers {
users {
...UserDetails
}
}
You can also add additional fields alongside your fragment:
FetchUserById($id: ID!) {
users(id: $id) {
...UserDetails
count
}
}
The graphql-type-json package provides a custom JSON scalar type.
Using it, you can return all the fields of your JSON objects.
Here is a link to an example in the Apollo GraphQL Server docs:
https://www.apollographql.com/docs/apollo-server/schema/scalars-enums/#custom-scalars

Apollo Array of Custom Input types as Mutation argument throws "__typename": Unknown field error

I do not quite understand what the error even is, so I am unable to tackle this problem. The server console doesn't show any descriptive error either. I have added all the code related to this issue below.
Here is the mutation:
mutation SaveTrials($event: ID!, $input: [ResultTrialsInputType!]!) {
saveTrials(event: $event, results: $input) {
results {
id
trials
}
}
}
I am using Graphene (Python) in backend but the types correspond to the following:
input ResultTrialsInputType {
id: ID
person: ID!
trials: [String]
}
Here is the Python code if it matters:
class ResultTrialsInputType(graphene.InputObjectType):
id = graphene.ID()
person = graphene.ID(required=True)
trials = graphene.List(graphene.String)
When I send data from the apollo using the mutation above, this is what is being sent to the API:
{
"operationName": "SaveTrials",
"variables": {
"event": "207e9f27-be66-4564-9c28-ac92ec44235d",
"input": [
{
"id": "8eb80b8b-c93a-44b1-9624-e75436c13780",
"trials": [
"32.1",
"92.2",
"12.1",
"12.2",
"23.2",
""
],
"__typename": "ResultTrialsObjectType",
"person": "a6f18ab5-df23-421e-b916-73e569bf73ad"
}
]
},
"query": "mutation SaveTrials($event: ID!, $input: [ResultTrialsInputType!]!) {\n saveTrials(event: $event, results: $input) {\n results {\n id\n trials\n __typename\n }\n __typename\n }\n}\n"
}
Response for this query is an error about "__typename":
{
"errors": [
{
"message": "Variable \"$input\" got invalid value [{\"__typename\": \"ResultTrialsObjectType\", \"person\": \"a6f18ab5-df23-421e-b916-73e569bf73ad\", \"id\": \"8eb80b8b-c93a-44b1-9624-e75436c13780\", \"trials\": [\"32.1\", \"92.2\", \"12.1\", \"12.2\", \"23.2\", \"\"]}].\nIn element #0: In field \"__typename\": Unknown field.",
"locations": [
{
"line": 1,
"column": 34
}
]
}
]
}
Everywhere else in my application, where the input argument is not an array of custom objects, mutations work as expected. What is the deal here? Am I setting my input arguments in the wrong way? Or am I missing something here?
I tried to add __typename manually to the input type; however, nothing happened.
Thanks!
EDIT: Now that I am checking this out, for some reason __typename is displayed as ResultTrialsObjectType but it should be ResultTrialsInputType. How is this value being generated? Does Apollo generate it or does server generate it and Apollo fetches it?
Your schema specifies that ResultTrialsInputType has three fields: id, person, trials. __typename is a special meta-field that signifies the type of an object -- it should not be added to the schema. In fact, any names that start with two underscores are reserved and should not be used for field names.
As the error indicates, the issue is that __typename is not a field that's specified for ResultTrialsInputType, but you're sending it anyway.
Apollo will automatically attach __typename to any selection sets in your request (not inputs or variable values). So a query like this:
query {
foo {
bar
}
}
becomes:
query {
foo {
bar
__typename
}
}
Apollo needs the __typename for every Object returned in your response in order to effectively cache the response. However, this means any time you are working with a data object returned by Apollo, it will have __typename properties throughout its structure.
What this boils down to is that, generally speaking, you cannot and should not make a query, mutate the response and then turn around and use that as an input to another query or mutation.
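If you do need to reuse a response object as input, a common workaround (sketched here in Python; this is not an Apollo API, and client-side you would write the equivalent in JavaScript) is to strip the injected __typename keys first:

```python
# Recursively remove Apollo's injected __typename keys from a response
# object so the result can be reused as mutation input.
def strip_typename(value):
    if isinstance(value, dict):
        return {k: strip_typename(v) for k, v in value.items() if k != "__typename"}
    if isinstance(value, list):
        return [strip_typename(v) for v in value]
    return value
```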

Graphql mandatory check doesn't work when using GraphiQL UI with input variables

I have defined a mutation like the one below, with the email field marked as mandatory:
type Mutation {
bookTicket (person: PersonInput! ,email: String!): Ticket
}
input PersonInput{
personId: ID!
name: String!
date: String!
comment: String
}
When I try to execute the mutation through the GraphiQL UI without passing the email field, the UI doesn't throw a validation error and calls the endpoint with an empty value for the email field.
mutation($personInput:PersonInput!, $email :String!){
bookTicket(person:$personInput,email: $email){
id
}
}
Variables
{
"personInput": {
"personId": "111",
"name": "test",
"date": "10-Oct-2018",
"comment": "Book"
}
}
If I run the mutation with inline arguments, the validation works fine and throws an exception that email cannot be empty.
mutation {
  bookTicket(
    person: {personId: "111", name: "test", date: "10-Oct-2018", comment: "Book"}
    email: ""
  ) {
    id
  }
}
Can anyone help me understand why the validation doesn't work in the first case?

createUser update related field - understanding relation

I need to set a related field's value on create, is this possible?
Details:
I have a User model with fields: email, displayname.
I have a Verify model with fields: code, action.
I created a relation between the two models like this:
I want to createUser and set the related fields of code and action at the same time. I tried this:
mutation {
createUser
(
email:"noit#mail.com",
displayname:"noit",
password:"123",
code: "this is a code",
action: "REGISTER"
) {
id
}
}
This fails with:
{
"data": null,
"errors": [
{
"message": "Unknown argument 'code' on field 'createUser' of type 'Mutation'. (line 2, column 76):\n createUser(email: \"noit#mail.com\", displayname: \"noit\", password: \"123\", code: \"this is a code\", action: \"REGISTER\") {\n ^",
"locations": [
{
"line": 2,
"column": 76
}
]
},
{
"message": "Unknown argument 'action' on field 'createUser' of type 'Mutation'. (line 2, column 100):\n createUser(email: \"noit#mail.com\", displayname: \"noit\", password: \"123\", code: \"this is a code\", action: \"REGISTER\") {\n ^",
"locations": [
{
"line": 2,
"column": 100
}
]
}
]
}
We specifically designed the Graphcool API to handle cases like this as simply as possible. You can do it like this:
mutation {
createUser (
email:"noit#mail.com",
displayname:"noit",
password:"123",
blahVerify: {
code: "this is a code",
action: "REGISTER"
}) {
id
blahVerify {
id
}
}
}
Note the nested blahVerify object argument.
This answer to a similar question goes a bit more into detail and also shows how you can use GraphQL variables to send nested mutations from Apollo Client.
As a side note, depending on the possible values for the action of a Verify node, you might want to use an enum field rather than strings. You can read more about enum fields in the documentation.
You can do this on scaphold.io. Its Logic system includes more than just mutation callbacks: you can fire functions before mutations to validate or clean input before it is saved to the DB, after mutations to manage connections like this one (returned in the same mutation payload), and asynchronously (like mutation callbacks) to kick off long-running tasks. You can even compose functions together to pass metadata through a chain of function invocations.

What is the best way to validate data in mongo?

What's the best way to validate data being inserted or updated into MongoDB? Is it to write some sort of server executed Javascript code that does the validation?
Starting from MongoDB 3.2, document validation was added (slides).
You can specify validation rules for each collection using the validator option, which accepts almost all mongo query operators (except $geoNear, $near, $nearSphere, $text, and $where).
To create a new collection with a validator, use:
db.createCollection("your_coll", {
validator: { `your validation query` }
})
To add a validator to an existing collection, use the collMod command:
db.runCommand({
collMod: "your_coll",
validator: { `your validation query` }
})
Validation runs only on insert/update, so when you add a validator to an existing collection, the pre-existing data is not validated (you can write application-level validation for that data). You can also specify validationLevel and validationAction to control what happens when a document fails validation.
If you try to insert/update a document that fails validation (and have not specified a non-default validationLevel/validationAction), you will get an error in the writeResult (sadly, the error does not tell you what failed; you get only a default "Document failed validation"):
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 121,
"errmsg" : "Document failed validation"
}
})
MongoDB doesn't have constraints or triggers so the application has to validate the data.
You can also write JavaScript scripts that run once a day or more to check for invalid data; you can use this to check the quality of your application's business logic.
I think it would be normal for your app to handle this kind of thing. If the data is invalid in some way, don't let it get added to the datastore until the user has corrected whatever error you have detected.
Starting in 2.4, MongoDB enables basic BSON object validation for mongod and mongorestore when writing to MongoDB data files. This prevents any client from inserting invalid or malformed BSON into a MongoDB database.
source: http://docs.mongodb.org/manual/release-notes/2.4/
Starting with MongoDB 3.6, you can also use JSON Schema to express validation rules. These checks happen on the database side on insert/update.
Here is an example from the docs:
validator = {
$jsonSchema: {
bsonType: "object",
required: [ "name", "year", "major", "address" ],
properties: {
name: {
bsonType: "string",
description: "must be a string and is required"
},
year: {
bsonType: "int",
minimum: 2017,
maximum: 3017,
description: "must be an integer in [ 2017, 3017 ] and is required"
},
major: {
enum: [ "Math", "English", "Computer Science", "History", null ],
description: "can only be one of the enum values and is required"
},
gpa: {
bsonType: [ "double" ],
description: "must be a double if the field exists"
},
address: {
bsonType: "object",
required: [ "city" ],
properties: {
street: {
bsonType: "string",
description: "must be a string if the field exists"
},
city: {
bsonType: "string",
description: "must be a string and is required"
}
}
}
}
}
}
db.runCommand( {
collMod: "collectionName",
validator: validator
} )
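To see what these rules enforce, here is a rough client-side mirror of a few of them in plain Python (a sketch for illustration; MongoDB applies the real checks server-side):

```python
# Client-side mirror of a few of the $jsonSchema rules above.
# Returns a list of violations; an empty list means the document passes.
def check_student(doc: dict) -> list:
    errors = []
    for field in ("name", "year", "major", "address"):
        if field not in doc:
            errors.append(f"{field} is required")
    if "year" in doc and not (isinstance(doc["year"], int) and 2017 <= doc["year"] <= 3017):
        errors.append("year must be an integer in [2017, 3017]")
    if "major" in doc and doc["major"] not in ("Math", "English", "Computer Science", "History", None):
        errors.append("major must be one of the enum values")
    return errors
```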
I've just started using MongoDB and PHP together, inside a Zend Framework based application.
I created one object for each MongoDB collection (e.g. User.php maps to the user collection). Each object knows which collection it maps to and which fields are required. It also knows which filters (Zend_Filter_Input) and validators (Zend_Validate) should be applied to each field. Before doing a MongoDB insert() or save(), I run $object->isValid(), which executes all the validators. If they all pass, isValid() returns true and I proceed with the insert() or save(); otherwise I display the errors.
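The same validate-before-save pattern can be sketched in Python (the class, field names, and validators here are made up for illustration; Zend_Validate is replaced by plain callables):

```python
# A model that knows its collection, required fields, and per-field
# validators, and refuses to save until is_valid() passes.
class MongoModel:
    collection = "user"
    required = ["email", "displayname"]
    validators = {"email": lambda v: "@" in v}

    def __init__(self, **fields):
        self.fields = fields
        self.errors = []

    def is_valid(self) -> bool:
        # Collect missing required fields, then run each field validator.
        self.errors = [f"{name} is required"
                       for name in self.required if name not in self.fields]
        for name, check in self.validators.items():
            if name in self.fields and not check(self.fields[name]):
                self.errors.append(f"{name} is invalid")
        return not self.errors
```

Calling code would then only perform the database insert when is_valid() returns true, and render self.errors otherwise.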
