JHipster i18n for entity not being used - internationalization

I have a JHipster application, and I can see that the UI shows all the field labels with some sort of defaults. But I also see that the entity-specific translations are not being used.
I have a file src/main/webapp/i18n/en/employee.json, for example:
{
  "myApp": {
    "employee": {
      "home": {
        "title": "Employees",
        "createLabel": "Create a new Employee TEST TEST",
        "createOrEditLabel": "Create or edit an Employee",
        "notFound": "No Employees found"
      },
      "created": "A new Employee is created with identifier {{ param }}",
      "updated": "An Employee is updated with identifier TEST TEST {{ param }}",
      "deleted": "An Employee is deleted with identifier {{ param }}",
      "delete": {
        "question": "Are you sure you want to delete Employee {{ id }}?"
      },
      "detail": {
        "title": "Employee"
      },
      "Employee": "Employee"
    }
  }
}
The create label just says "Create a new Employee", not "Create a new Employee TEST TEST".
Also, when I save an entity, I see translation-not-found[A employee is updated with identifier 1].
Adding information suggested by Gaël Marziou.
Does it work when you restart the webpack dev server with npm start?
I restarted my server and, in a terminal, ran npm start.
Webpack: Finished after 35.767 seconds.
DONE Compiled successfully in 35797ms 12:50:47 PM
No type errors found
Version: typescript 3.4.5
Same error: translation-not-found[A employee is updated with identifier 1].
What are the keys in your HTML templates?
There is a file employee.component.html and it uses keys:
myApp.employee.home.title
myApp.employee.home.createLabel
myApp.employee.home.notFound
global.field.id
myApp.employee.name
entity.action.view
entity.action.edit
entity.action.delete
but in no file do I see the use of myApp.employee.updated, a key to which I also added "TEST TEST".
Which version of JHipster?
jhipsterVersion "6.7.1"

Found the reason. In EmployeeResource, the updateEmployee method was calling a version of createEntityUpdateAlert that sends a plain message, not a translation key.
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(ENTITY_NAME, employee.getId().toString()))
.body(result);
The client side expected a key, got a message, and threw an error. I need to change that to the overload that sends a key:
return ResponseEntity.ok()
.headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, employee.getId().toString()))
.body(result);
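To see why the boolean matters, here is a rough sketch of the client-side behavior (illustrative only, not JHipster's actual code; the translations table stands in for employee.json): with the key-sending overload the alert header carries a translation key such as myApp.employee.updated, which is looked up in the i18n file; with the key-less overload the header carries a literal message, the lookup fails, and the raw text is shown wrapped in translation-not-found[...].

```javascript
// Illustrative sketch of client-side alert handling (not JHipster's source).
const translations = {
  "myApp.employee.updated": "An Employee is updated with identifier TEST TEST {{ param }}",
};

function resolveAlert(headerValue, param) {
  const template = translations[headerValue];
  if (template === undefined) {
    // No matching i18n key: the raw header value is wrapped in an error marker.
    return `translation-not-found[${headerValue}]`;
  }
  return template.replace("{{ param }}", param);
}

// Key-style header (what the key-sending overload produces):
console.log(resolveAlert("myApp.employee.updated", "1"));
// → "An Employee is updated with identifier TEST TEST 1"

// Message-style header (what the key-less overload produces):
console.log(resolveAlert("A employee is updated with identifier 1", "1"));
// → "translation-not-found[A employee is updated with identifier 1]"
```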

Related

GraphQL AppSync DynamoDB only update one field in mutation

So I am working on an app with AWS Amplify. I am using a single-table design, and I am trying to run a mutation that updates only the profile field of the UserV0 in the single table. I want to update only the profile (an S3 key), but when I run my mutation it deletes the rest of the contents of UserV0.
GraphQL Schema
type SingleTable @model {
pk: String! @primaryKey(sortKeyFields: ["sk"])
sk: String!
user: UserV0
post: PostV0
}
type UserV0 {
name: String
username: String
email: String
profile: String
}
type PostV0 {
...
}
query getUserInfo {
getSingleTable(pk: "TEST", sk: "TEST") {
user {
username
name
profile
email
}
}
}
mutation createTable {
createSingleTable(input: {pk: "TEST", sk: "TEST", user: {email: "email@email.com", name: "testname", profile: "testPath", username: "testusername"}}) {
updatedAt
}
}
mutation updateTable {
updateSingleTable(input: {pk: "TEST", sk: "TEST", user: {profile: "TESTING", username: "TESTING123"}}) {
createdAt
}
}
If I run the update mutation above, the entire user is reset, and when I check in my DynamoDB table the name and email fields are lost. How can I make it so that when I run the mutation, only the profile field is updated and the other fields are left intact? Thanks in advance.
Edit: I put in all of the queries and mutations that I am running in AppSync. I run createTable and then getUserInfo, and it returns this, as it should.
{
"data": {
"getSingleTable": {
"user": {
"username": "testusername",
"name": "testname",
"profile": "testPath",
"email": "email#email.com"
}
}
}
}
But after I run the updateTable and then getUserInfo it returns this.
{
"data": {
"getSingleTable": {
"user": {
"username": "TESTING123",
"name": null,
"profile": "TESTING",
"email": null
}
}
}
}
As you can see, the name and email fields are reset to null and removed from the DynamoDB database. I am pretty sure it is because it sees the user object as a new input. But how do I get it to recognize that I only want to update certain fields in UserV0 and not the entire thing?
Make sure your updateSingleTable resolver uses the UpdateItem operation:
"operation" : "UpdateItem"
If you're using PutItem, which I assume you are, it performs an overwrite and thus removes existing data.
https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html
Under the hood, I believe the DynamoDB client is the DynamoDB Mapper. As a result, it will delete values that are set to null. To avoid this, ensure you do not set values to null; instead, omit any values not being used in the request.
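To illustrate the "omit, don't null out" idea, here is a sketch (illustrative only, not the resolver Amplify generates) of building a DynamoDB UpdateItem request that SETs only the fields actually supplied, so any attribute you omit survives the update:

```javascript
// Illustrative sketch: build an UpdateExpression from only the provided,
// non-null fields, so untouched attributes are not overwritten or deleted.
function buildUpdate(input) {
  const sets = [];
  const names = {};
  const values = {};
  for (const [field, value] of Object.entries(input)) {
    if (value === undefined || value === null) continue; // omit, don't null out
    sets.push(`#${field} = :${field}`);
    names[`#${field}`] = field;
    values[`:${field}`] = value;
  }
  return {
    UpdateExpression: `SET ${sets.join(", ")}`,
    ExpressionAttributeNames: names,
    ExpressionAttributeValues: values,
  };
}

// For the nested UserV0 map you would target document paths like "user.profile"
// rather than top-level names, but the principle is the same:
console.log(buildUpdate({ profile: "TESTING", username: "TESTING123", name: null }));
// → UpdateExpression "SET #profile = :profile, #username = :username" (name untouched)
```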

Aws AppSync Query erring out while using a resolver

I'm new to AWS AppSync; however, it's been pretty easy to learn and understand.
I'm trying to create a resolver so that when the user runs getChore(id: "") it returns all the chore information, which it successfully does. The problem is that within the chore there are two fields, createdBy and assignedTo, which are linked to a User type.
type Chore {
id: ID!
title: String
desc: String
status: String
reward: Float
retryDeduction: Float
required: Boolean
createdDate: AWSDateTime
date: AWSDateTime
interval: String
assignedTo: User
createdBy: User
}
type User {
id: ID!
age: Int
f_name: String
l_name: String
type: Int
admin: Boolean
family: Family
}
Within AWS AppSync I am trying to attach a resolver to assignedTo: User and createdBy: User so my query will look like:
query getChore {
getChore(id: "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b") {
id
...
...
assignedTo {
id
f_name
l_name
}
createdBy {
id
f_name
l_name
}
}
}
However, when I fire off this query, I get an error:
The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException)
which I have researched, and I can't seem to find the correct solution.
The resolver I'm using is:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.args.id),
}
}
return:
$util.toJson($ctx.result)
When you get the "The provided key element does not match the schema" error, it's because your request mapping template key doesn't match the primary key in DynamoDB. You can enable CloudWatch Logs in your application settings to see exactly what was sent to DynamoDB.
I'm not able to tell what's wrong with your template because your sample lacks some information. It would help if you could answer these questions about your application:
- Where are the users stored? Are they stored in their own DDB table separate from the chores, and is the hash key on the users table id as well?
- In the chores table how do you know which user your chore is assignedTo or createdBy? Is there a user id stored on the chore DDB item?
- Is the request mapping template you posted corresponding to the resolver attached to Chore.assignedTo? If yes, using $ctx.args.id will actually do a GetItem based on the chore id not the user it's assigned to.
Finally, I reproduced your application and I was able to make it work with a few changes.
Prerequisites:
I have a chores and a users DynamoDB table, both with id as the hash key. These two tables are mapped as data sources in AppSync.
I have one chore in the chores table that looks like:
{
"assignedTo": "1",
"createdBy": "2",
"id": "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b",
"title": "Chore1"
}
and two users in the users table:
{
"f_name": "Alice",
"id": "2",
"l_name": "Wonderland"
}
and
{
"f_name": "John",
"id": "1",
"l_name": "McCain"
}
I used your GraphQL schema.
Resolvers
Resolver on Query.getChore pointing to the chores table:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.args.id),
}
}
Resolver on Chore.assignedTo pointing to the users table (note the $ctx.source.assignedTo instead of $ctx.args)
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.source.assignedTo),
}
}
Similarly, resolver on Chore.createdBy pointing to the users table:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.source.createdBy),
}
}
All resolvers' response mapping templates use the pass-through.
Running the query
Finally, when running your query:
query getChore {
getChore(id: "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b") {
id
assignedTo {
id
f_name
l_name
}
createdBy {
id
f_name
l_name
}
}
}
I get the following results:
{
"data": {
"getChore": {
"id": "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b",
"assignedTo": {
"id": "1",
"f_name": "John",
"l_name": "McCain"
},
"createdBy": {
"id": "2",
"f_name": "Alice",
"l_name": "Wonderland"
}
}
}
}
Hope it helps!

createUser update related field - understanding relation

I need to set a related field's value on create, is this possible?
Details:
I have a User model with fields: email, displayname.
I have a Verify model with fields: code, action.
I created a relation between the two models like this:
I want to call createUser and set the related fields code and action at the same time. I tried this:
mutation {
createUser
(
email:"noit#mail.com",
displayname:"noit",
password:"123",
code: "this is a code",
action: "REGISTER"
) {
id
}
}
This fails with:
{
"data": null,
"errors": [
{
"message": "Unknown argument 'code' on field 'createUser' of type 'Mutation'. (line 2, column 76):\n createUser(email: \"noit#mail.com\", displayname: \"noit\", password: \"123\", code: \"this is a code\", action: \"REGISTER\") {\n ^",
"locations": [
{
"line": 2,
"column": 76
}
]
},
{
"message": "Unknown argument 'action' on field 'createUser' of type 'Mutation'. (line 2, column 100):\n createUser(email: \"noit#mail.com\", displayname: \"noit\", password: \"123\", code: \"this is a code\", action: \"REGISTER\") {\n ^",
"locations": [
{
"line": 2,
"column": 100
}
]
}
]
}
We specifically designed the Graphcool API to handle cases like this as simply as possible. You can do it like this:
mutation {
createUser (
email:"noit#mail.com",
displayname:"noit",
password:"123",
blahVerify: {
code: "this is a code",
action: "REGISTER"
}) {
id
blahVerify {
id
}
}
}
Note the nested blahVerify object argument.
This answer to a similar question goes a bit more into detail and also shows how you can use GraphQL variables to send nested mutations from Apollo Client.
As a sidenote, depending on the different possible values for the action field of a Verify node, you might want to use an enum field rather than strings. You can read more about enum fields in the documentation.
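For example, the action field could be declared as an enum (a sketch; the type name and values other than REGISTER are assumptions, not from the question):

```graphql
enum VerifyAction {
  REGISTER
  # ...other actions your application needs
}

type Verify {
  code: String
  action: VerifyAction
}
```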
You can also do this on scaphold.io. The Logic system includes more than just mutation callbacks. You can fire functions before mutations to validate or clean input before it is saved to the DB, after mutations to manage connections like this one (returned in the same mutation payload), and asynchronously (like mutation callbacks) to kick off long-standing tasks. You can even compose functions to pass metadata through a chain of function invocations.

Error when querying in IBM Content Navigator

We are using IBM Content Navigator 2.0.3 with IBM FileNet P8, Version 5.2. We want to make a query according to the Process Engine REST Service Reference. I retrieved the columns for a given workbasket (according to here) with this query:
[ECM SERVER]/CaseManager/P8BPMREST/p8/bpm/v1/queues/SPLN_Autuacao/workbaskets/Autuacao/columns?cp=CP1
My result was this (showing just 2 columns):
{
...,
"SPLN_itemkey":{
"ordinal":2,
"prompt":"itemkey",
"attributes":"queues\/SPLN_Autuacao\/workbaskets\/Autuacao\/columns\/SPLN_itemkey\/attributes",
"sortable":false,
"type":2,
"name":"SPLN_itemkey"
},
"SPLN_actid":{
"ordinal":3,
"prompt":"actid",
"attributes":"queues\/SPLN_Autuacao\/workbaskets\/Autuacao\/columns\/SPLN_actid\/attributes",
"sortable":false,
"type":1,
"name":"SPLN_actid"
},
...
}
If I make a query with no parameters, like
[ECM SERVER]/CaseManager/P8BPMREST/p8/bpm/v1/queues/SPLN_Autuacao/workbaskets/Autuacao/queueelements?cp=CP1
Some results I get are:
{
"lastRecord":null,
"queueElements":[
{
"lockedBy":"",
"stepProcessorId":165458,
"milestones":"queues\/SPLN_Autuacao\/stepelements\/942CF4FC538FDC46A9E3ADBE3CF607C1\/milestones",
"caseTaskId":"{C087B74F-0100-C29D-9C14-EB557CC6F2D6}",
"stepElement":"queues\/SPLN_Autuacao\/stepelements\/942CF4FC538FDC46A9E3ADBE3CF607C1",
"canReassign":true,
"boundUserName":"",
"ETag":"14313.0",
"stepDeadlineStatus":0,
"stepName":"020 Autuacao",
"workObjectNumber":"942CF4FC538FDC46A9E3ADBE3CF607C1",
"caseFolderId":"{C087B74F-0000-C51C-8788-3E63307F980B}",
"queueName":"SPLN_Autuacao",
"lockedById":0,
"columns":{
"F_StepName":"020 Autuacao",
"SPLN_resultout":"COMPLETE: ",
"F_CreateTime":"2015-09-10T13:52:53Z",
"F_Subject":"Fluxo Autuacao",
"SPLN_funcmode":"PR_ELABORACAO:FN_ST_ELABORACAO",
"SPLN_IDDocumento":"1098857",
"SPLN_itemkey":"620006",
"SPLN_itemtype":"SPL",
"SPLN_actid":null,
"SPLN_Natureza":"Mo\u00e7\u00e3o"
}
},
{
"lockedBy":"",
"stepProcessorId":165458,
"milestones":"queues\/SPLN_Autuacao\/stepelements\/9E1DCCF25AEE4A4FA4C61421214B9F40\/milestones",
"caseTaskId":"{008DB74F-0100-C600-9410-D38352275E36}",
"stepElement":"queues\/SPLN_Autuacao\/stepelements\/9E1DCCF25AEE4A4FA4C61421214B9F40",
"canReassign":true,
"boundUserName":"",
"ETag":"14315.0",
"stepDeadlineStatus":0,
"stepName":"020 Autuacao",
"workObjectNumber":"9E1DCCF25AEE4A4FA4C61421214B9F40",
"caseFolderId":"{008DB74F-0000-C516-B965-5D1351219C0E}",
"queueName":"SPLN_Autuacao",
"lockedById":0,
"columns":{
"F_StepName":"020 Autuacao",
"SPLN_resultout":"COMPLETE: ",
"F_CreateTime":"2015-09-10T13:58:38Z",
"F_Subject":"Fluxo Autuacao",
"SPLN_funcmode":"PR_ELABORACAO:FN_ST_ELABORACAO",
"SPLN_IDDocumento":"1098858",
"SPLN_itemkey":"620007",
"SPLN_itemtype":"SPL",
"SPLN_actid":null,
"SPLN_Natureza":"Projeto de lei"
}
},
...
],
...
}
However, I want to query for a specific element, say the one where SPLN_itemkey equals 620007, so I tried a URL such as the one described here:
[ECM SERVER]/CaseManager/P8BPMREST/p8/bpm/v1/queues/SPLN_Autuacao/workbaskets/Autuacao/queueelements?cp=CP1&filters=[SPLN_itemkey=620006]
I get this error message:
{
"msg":"filenet.pe.rest.VWRESTException",
"UnderlyingDetails":{
"Causes":[
"Invalid filters parameter.\n[FNRPE0450100011E] Invalid filter name",
"Invalid filters parameter.\n[FNRPE0450100011E] Invalid filter name",
"[FNRPE0450100011E] Invalid filter name"
]
},
"stack": "filenet.pe.rest.handlers.QueueElements$FiltersParam.parseFilters(QueueElements.java:458)\r\n\tfilenet.pe.rest.handlers.QueueElements.onGet(QueueElements.java:357)\r\n\tfilenet.pe.rest.P8BPMRESTServlet.doMethod(P8BPMRESTServlet.java:714)\r\n\t",
"UserMessage":{
"UniqueId":"FNRPE0450100011E",
"Severity":"ERROR",
"Text":"[FNRPE0450100011E] Invalid filter name"
}
}
According to the message, I am using an invalid filter name. So I tried filters=[itemkey=620006] instead of filters=[SPLN_itemkey=620006]. However, I get the same error message.
Actually, the problem was not in the URL. It was necessary to first create a query on the ECM server for this URL to work.

What is the best way to validate data in mongo?

What's the best way to validate data being inserted or updated into MongoDB? Is it to write some sort of server executed Javascript code that does the validation?
Starting with MongoDB 3.2, document validation was added (slides).
You can specify validation rules for each collection, using the validator option with almost all MongoDB query operators (except $geoNear, $near, $nearSphere, $text, and $where).
To create a new collection with a validator, use:
db.createCollection("your_coll", {
validator: { `your validation query` }
})
To add a validator to an existing collection, use the collMod command:
db.runCommand({
collMod: "your_coll",
validator: { `your validation query` }
})
Validation works only on insert/update, so when you add a validator to an existing collection, the pre-existing data is not validated (you can write application-level validation for previous data). You can also specify validationLevel and validationAction to control what happens when a document fails validation.
If you try to insert/update a document with something that fails validation (and have not specified any non-default validationLevel/validationAction), you will get an error on the writeResult (sadly, the error does not say what failed; you get only a default "Document failed validation"):
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 121,
"errmsg" : "Document failed validation"
}
})
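As a mongo shell sketch of those options (the collection name and the rule are placeholders, not from the question), validationLevel and validationAction can be set when attaching a validator with collMod:

```javascript
// Illustrative mongo shell fragment; "your_coll" and the rule are placeholders.
db.runCommand({
  collMod: "your_coll",
  validator: { status: { $in: ["draft", "published"] } },
  validationLevel: "moderate", // existing invalid documents may still be updated
  validationAction: "warn"     // log violations instead of rejecting the write
});
```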
MongoDB doesn't have constraints or triggers so the application has to validate the data.
You can also write JavaScript scripts that check, once a day or more, whether there is invalid data. You can use this to check the quality of your application's business logic.
I think it would be normal for your app to handle this kind of thing. If the data is invalid in some way, don't let it get added to the datastore until the user has corrected whatever error you have detected.
Starting in 2.4, MongoDB enables basic BSON object validation for mongod and mongorestore when writing to MongoDB data files. This prevents any client from inserting invalid or malformed BSON into a MongoDB database.
source: http://docs.mongodb.org/manual/release-notes/2.4/
Starting with MongoDB 3.6, you can also use JSON Schema to express validation rules. These checks happen on the database side on insert/update.
Here is an example from the docs:
validator = {
$jsonSchema: {
bsonType: "object",
required: [ "name", "year", "major", "address" ],
properties: {
name: {
bsonType: "string",
description: "must be a string and is required"
},
year: {
bsonType: "int",
minimum: 2017,
maximum: 3017,
description: "must be an integer in [ 2017, 3017 ] and is required"
},
major: {
enum: [ "Math", "English", "Computer Science", "History", null ],
description: "can only be one of the enum values and is required"
},
gpa: {
bsonType: [ "double" ],
description: "must be a double if the field exists"
},
address: {
bsonType: "object",
required: [ "city" ],
properties: {
street: {
bsonType: "string",
description: "must be a string if the field exists"
},
city: {
bsonType: "string",
description: "must be a string and is required"
}
}
}
}
}
}
db.runCommand( {
collMod: "collectionName",
validator: validator
} )
I've just started using MongoDB and PHP together, inside a Zend Framework based application.
I created one object for each MongoDB collection (e.g. User.php maps to the user collection). Each object knows which collection it maps to and which fields are required. It also knows which filters (Zend_Filter_Input) and validators (Zend_Validate) should be applied to each field. Before doing a MongoDB insert() or save(), I run $object->isValid(), which executes all the validators. If they all pass, isValid() returns true and I proceed with the insert() or save(); otherwise I display the errors.
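The same validate-before-save pattern can be sketched in plain JavaScript (illustrative only; the field names and rules are assumptions, not the Zend implementation):

```javascript
// Illustrative application-level validation before an insert.
// Each rule is a predicate; a document is valid only if every rule passes.
const userRules = {
  email: (v) => typeof v === "string" && v.includes("@"),
  displayname: (v) => typeof v === "string" && v.length > 0,
};

function isValid(doc, rules) {
  const errors = [];
  for (const [field, check] of Object.entries(rules)) {
    if (!check(doc[field])) errors.push(field);
  }
  return { ok: errors.length === 0, errors };
}

// Only call collection.insert()/save() when the check passes:
console.log(isValid({ email: "a@b.com", displayname: "alice" }, userRules));
// → { ok: true, errors: [] }
console.log(isValid({ displayname: "alice" }, userRules));
// → { ok: false, errors: ["email"] }
```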