I'm trying to create a Spring Boot application in which a user can select a type of role on registration, e.g. patient/doctor.
The roles are created automatically in the database on application start.
I'm using Postman to make the registration request, but when I pass the role it is duplicated in the database.
Any advice would be much appreciated!
Here is the method I use to create a user:
public User createUser(RegistrationRequest registrationRequest) {
    User user = new User();
    user.setFirstname(registrationRequest.getFirstname());
    user.setLastname(registrationRequest.getLastname());
    user.setEmail(registrationRequest.getEmail());
    user.setUsername(registrationRequest.getUsername());
    user.setPhone(registrationRequest.getPhone());
    user.setAddress(registrationRequest.getAddress());
    user.setPassword(passwordEncoder.encode(registrationRequest.getPassword()));
    user.setCreatedAt(Instant.now());
    user.setIsEnabled(false);
    user.setRoles(registrationRequest.getRoles());
    return user;
}
And here is the JSON request:
{
"firstname":"Mary",
"lastname":"K",
"email":"example#gmail.com",
"username":"rain",
"password":"123",
"phone":"2848392",
"address":
{
"city": "Burdwan",
"region": "Paschimbanga",
"street": "gwg",
"zipcode": "713102"
},
"roles":[
{
"name":"DOCTOR_ROLE"
} ]
}
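A common cause of this (an assumption, since the entity and repository mappings are not shown) is that the Role objects deserialized from the JSON arrive as new, unmanaged entities, so saving the user with a cascading roles association inserts fresh role rows instead of reusing the ones created at startup. A rough sketch of resolving the existing roles by name instead, assuming a hypothetical Spring Data RoleRepository with a findByName method:

// Sketch only: look up the managed Role entities created at application start
// instead of persisting the detached Role objects that came from the request.
// Assumes a hypothetical repository (requires java.util.Set, java.util.Optional
// and java.util.stream.Collectors imports):
//   public interface RoleRepository extends JpaRepository<Role, Long> {
//       Optional<Role> findByName(String name);
//   }
Set<Role> managedRoles = registrationRequest.getRoles().stream()
        .map(requested -> roleRepository.findByName(requested.getName())
                .orElseThrow(() -> new IllegalArgumentException("Unknown role: " + requested.getName())))
        .collect(Collectors.toSet());
user.setRoles(managedRoles);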
Related
I am working on an app with AWS Amplify using a single-table design. I am trying to run a mutation that updates only the profile field (an S3 key) of UserV0, but when I run my mutation it deletes the rest of the contents of UserV0.
GraphQL schema:
type SingleTable @model {
pk: String! @primaryKey(sortKeyFields: ["sk"])
sk: String!
user: UserV0
post: PostV0
}
type UserV0 {
name: String
username: String
email: String
profile: String
}
type PostV0 {
...
}
query getUserInfo {
getSingleTable(pk: "TEST", sk: "TEST") {
user {
username
name
profile
email
}
}
}
mutation createTable {
createSingleTable(input: {pk: "TEST", sk: "TEST", user: {email: "email@email.com", name: "testname", profile: "testPath", username: "testusername"}}) {
updatedAt
}
}
mutation updateTable {
updateSingleTable(input: {pk: "TEST", sk: "TEST", user: {profile: "TESTING", username: "TESTING123"}}) {
createdAt
}
}
If I run the update mutation above, the entire user is reset, and when I check my DynamoDB table the name and email fields are lost. How can I make it so that when I run the mutation, only the profile field is updated and the other fields are left alone without being deleted? Thanks in advance.
Edit: I have put in all of the queries and mutations that I am running in AppSync. I run createTable and then getUserInfo, and it returns this as it should:
{
"data": {
"getSingleTable": {
"user": {
"username": "testusername",
"name": "testname",
"profile": "testPath",
"email": "email#email.com"
}
}
}
}
But after I run updateTable and then getUserInfo, it returns this:
{
"data": {
"getSingleTable": {
"user": {
"username": "TESTING123",
"name": null,
"profile": "TESTING",
"email": null
}
}
}
}
As you can see, the name and email fields are reset: set to null and removed from the DynamoDB table. I am pretty sure it is because it sees the user object as a new input. But how do I get it to recognize that I only want to update certain fields in UserV0 and not the entire thing?
Make sure your function updateSingleTable uses the UpdateItem operation:
"operation" : "UpdateItem"
If you're using PutItem, which I assume you are, it performs an overwrite and thus removes existing data.
https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html
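As a rough sketch only (Amplify normally generates these resolvers, so the field paths and argument names here are assumptions to adapt), an UpdateItem request mapping template that touches just user.profile for the pk/sk schema above could look like:

{
  "version": "2017-02-28",
  "operation": "UpdateItem",
  "key": {
    "pk": $util.dynamodb.toDynamoDBJson($ctx.args.input.pk),
    "sk": $util.dynamodb.toDynamoDBJson($ctx.args.input.sk)
  },
  "update": {
    ## SET only the nested attribute that was supplied, leaving the rest of "user" intact
    "expression": "SET #user.#profile = :profile",
    "expressionNames": { "#user": "user", "#profile": "profile" },
    "expressionValues": {
      ":profile": $util.dynamodb.toDynamoDBJson($ctx.args.input.user.profile)
    }
  }
}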
Under the hood I believe the DynamoDB client is DynamoDB Mapper. As a result, it will delete values that are set to null. To avoid this, make sure you do not set values to null; instead, omit any values not being used in the request.
I am having trouble getting the user's identity information from within the API.
My project consists of a standalone WASM app, an IDP and a WebApi.
I have everything set up and it works, but what I am after is a call from the Blazor client to get some data from the API. The API then uses the user's email address to identify them and return the data just for them.
I have looked at similar questions and the solutions don't work for me on my project.
[HttpGet("GetData")]
public async Task<IActionResult> GetData()
{
string test = User.Identity.Name; // returns null
string username = "myuser@users.com";
List<string> data = new List<string>();
data = (await _dataRepository.GetData(username)).ToList();
if (data.Count > 0)
{
return Ok(data);
}
else
{
return NoContent();
}
}
So, where I am setting the username, is there a way to get hold of the email of the user who made the request?
Edited
Access Token:
{
"alg": "RS256",
"kid": "2D49329C75FC43C78590AF6F6A0EFDB2",
"typ": "at+jwt"
}
{
"nbf": 1639243158,
"exp": 1639246758,
"iss": "https://localhost:5000",
"aud": "https://localhost:5000/resources",
"client_id": "ATS",
"sub": "4892725f-f6da-4a28-827a-ce666bb6f098",
"auth_time": 1638729064,
"idp": "local",
"jti": "53CB2F8FCB2EB34E3501E2C210B59B5D",
"sid": "8463E4AA74D7369C1176249ED8FA46B1",
"iat": 1639243158,
"scope": [
"openid",
"profile",
"MY_API"
"email"
],
"amr": [
"pwd"
]
}
First, you would typically secure your controller action methods using the [Authorize(...)] attribute; look up authorization in ASP.NET Core for more details about that.
Second, the most common problem when the name/email is not found is that you need to turn off the claims mapping using:
JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();
JwtSecurityTokenHandler.DefaultOutboundClaimTypeMap.Clear();
This is because, by default, Microsoft and OpenID Connect have different opinions on what the claim names should be. Because of that, it can be wise to first clear this mapping and then tell Microsoft what the name of the name/role claim is by setting this:
opt.TokenValidationParameters.RoleClaimType = "roles";
opt.TokenValidationParameters.NameClaimType = "name";
For more details about claims mapping visit:
https://learn.microsoft.com/en-us/aspnet/core/security/authentication/claims?view=aspnetcore-6.0
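As a hedged sketch of where those lines go (the scheme name, authority URL and audience below are taken from the token shown above; whether an email claim actually ends up in the access token depends on your IDP configuration):

// In Startup.ConfigureServices (or the equivalent builder.Services calls in Program.cs):
using System.IdentityModel.Tokens.Jwt;

// Clear Microsoft's claim type mapping so the original JWT claim names survive.
JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();
JwtSecurityTokenHandler.DefaultOutboundClaimTypeMap.Clear();

services.AddAuthentication("Bearer")
    .AddJwtBearer("Bearer", opt =>
    {
        opt.Authority = "https://localhost:5000";
        opt.Audience = "https://localhost:5000/resources";
        opt.TokenValidationParameters.NameClaimType = "name";
        opt.TokenValidationParameters.RoleClaimType = "roles";
    });

// In the controller, read the claim directly; this only returns a value if the IDP
// actually includes an email (or name) claim in the access token.
string email = User.FindFirst("email")?.Value;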
I am new to Plaid.
I created a Plaid access_token and now it's showing
"error_code":"ITEM_LOGIN_REQUIRED"
Using the docs, I understand that we need to use update mode to solve this,
so the access token will not change and there is no need to call token exchange
after getting this error.
I tried calling:
https://sandbox.plaid.com/link/token/create
method: POST
{
"client_id": "xxxxxx",
"secret": "xxxxxx",
"client_name": "test",
"user": { "client_user_id": "xxxx" },
"country_codes": ["US"],
"language": "en",
"access_token": "access-sandbox-xxxx-xxx-xxx-xxx-111111"
}
Then I got a new link_token:
{
"expiration": "2021-11-09T13:46:12Z",
"link_token": "link-sandbox-xxxx-xxx-xxxx-xxx-xxx",
"request_id": "xxxxx"
}
What do I need to do after that? I understand that there is no need to call the token exchange API,
but if I try to call the following API with the existing access_token, it shows the same error:
https://sandbox.plaid.com/accounts/get
method: POST
{
"client_id": "xxxxxx",
"secret": "xxxxxx",
"access_token": "access-sandbox-xxxx-xxx-xxx-xxx-111111"
}
output
{
"display_message": null,
"error_code": "ITEM_LOGIN_REQUIRED",
"error_message": "the login details of this item have changed (credentials, MFA, or required user action) and a user login is required to update this information. use Link's update mode to restore the item to a good state",
"error_type": "ITEM_ERROR",
"request_id": "3LMjpQHxYAMDwos",
"suggested_action": null
}
In the documentation they say this:
An Item's access_token does not change when using Link in update mode, so there is no need to repeat the exchange token process.
Then why am I getting this error again?
What do I need to do to solve this issue?
// Initialize Link with the token parameter
// set to the generated link_token for the Item
const linkHandler = Plaid.create({
token: 'GENERATED_LINK_TOKEN',
onSuccess: (public_token, metadata) => {
// You do not need to repeat the /item/public_token/exchange
// process when a user uses Link in update mode.
// The Item's access_token has not changed.
},
onExit: (err, metadata) => {
// The user exited the Link flow.
if (err != null) {
// The user encountered a Plaid API error prior
// to exiting.
}
// metadata contains the most recent API request ID and the
// Link session ID. Storing this information is helpful
// for support.
},
});
After getting the Link token, you need to initialize Link with the Link token. Per the docs:
"To use update mode for an Item, initialize Link with a link_token configured with the access_token for the Item that you wish to update."
https://plaid.com/docs/link/update-mode/
Once the user has successfully completed the Link flow, the access token should be reactivated.
I'm new to AWS AppSync; however, it's been pretty easy to learn and understand.
I'm trying to create a resolver so that when the user runs getChore(id: "") it returns all the chore information, which it is successfully doing. The problem is that within the chore there are two fields, createdBy and assignedTo, which are linked to a User type.
type Chore {
id: ID!
title: String
desc: String
status: String
reward: Float
retryDeduction: Float
required: Boolean
createdDate: AWSDateTime
date: AWSDateTime
interval: String
assignedTo: User
createdBy: User
}
type User {
id: ID!
age: Int
f_name: String
l_name: String
type: Int
admin: Boolean
family: Family
}
Within AWS AppSync I'm trying to attach a resolver to assignedTo: User and createdBy: User, so my query will look like:
query getChore {
getChore(id: "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b") {
id
...
...
assignedTo {
id
f_name
l_name
}
createdBy {
id
f_name
l_name
}
}
}
However, when I fire off this query I'm getting an error:
The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException;
which I have researched but can't seem to find the correct solution for.
The resolver I'm using is:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.args.id),
}
}
return:
$util.toJson($ctx.result)
When you get the The provided key element does not match the schema error, it's because your request mapping template key doesn't match the primary key in DynamoDB. You can enable CloudWatch Logs in your Application settings to see exactly what was sent to DynamoDB.
I'm not able to tell what's wrong with your template because your sample lacks some information; it would help if you could answer these questions about your application:
- Where are the users stored? Are they stored in their own DDB table separate from the chores, and is the hash key on the users table id as well?
- In the chores table how do you know which user your chore is assignedTo or createdBy? Is there a user id stored on the chore DDB item?
- Is the request mapping template you posted corresponding to the resolver attached to Chore.assignedTo? If yes, using $ctx.args.id will actually do a GetItem based on the chore id not the user it's assigned to.
Finally, I reproduced your application and I was able to make it work with a few changes.
Prerequisites:
I have a chores and a users DynamoDB table, both having id as the hash key. These two tables are mapped as data sources in AppSync.
I have one chore in the chores table that looks like:
{
"assignedTo": "1",
"createdBy": "2",
"id": "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b",
"title": "Chore1"
}
and two users in the users table:
{
"f_name": "Alice",
"id": "2",
"l_name": "Wonderland"
}
and
{
"f_name": "John",
"id": "1",
"l_name": "McCain"
}
I used your GraphQL schema.
Resolvers
Resolver on Query.getChore pointing to the chores table:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.args.id),
}
}
Resolver on Chore.assignedTo pointing to the users table (note the $ctx.source.assignedTo instead of $ctx.args)
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.source.assignedTo),
}
}
Similarly, resolver on Chore.createdBy pointing to the users table:
{
"version": "2017-02-28",
"operation": "GetItem",
"key": {
"id": $util.dynamodb.toDynamoDBJson($ctx.source.createdBy),
}
}
All resolvers' response mapping templates use the pass-through.
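That is, the same response mapping template shown in the question:
$util.toJson($ctx.result)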
Running the query
Finally, when running your query:
query getChore {
getChore(id: "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b") {
id
assignedTo {
id
f_name
l_name
}
createdBy {
id
f_name
l_name
}
}
}
I get the following results:
{
"data": {
"getChore": {
"id": "36d597c8-2c7e-4f63-93ee-38e5aa8f1d5b",
"assignedTo": {
"id": "1",
"f_name": "John",
"l_name": "McCain"
},
"createdBy": {
"id": "2",
"f_name": "Alice",
"l_name": "Wonderland"
}
}
}
}
Hope it helps!
I'm developing a LoopBack application extending the base User model into a UserCode model where each user is identified by an email plus a code field.
That way a user can register with the same email twice but with a different code.
I've seen that in node_modules/loopback/common/models/user.js at line 691 there is:
UserModel.validatesUniquenessOf('email', {message: 'Email already exists'});
I want to delete this restriction/validation, but without changing the LoopBack code, of course.
How can I do it?
Maybe in the boot script I can loop through all validations and delete this one?
Figured it out
In this case you need to remove the default validations set by the User model
common/models/userCode.js
module.exports = function(UserCode){
//Add this line and it will start receiving multiple email.
delete UserCode.validations.email;
}
Also you can play with the required:true|false property to make any default defined property required or not.
common/models/userCode.json
{
"name": "UserCode",
"base": "User",
"idInjection": true,
"properties": {
"password": {
"type": "string",
"required": true
},
....
....
}
The following code from the accepted answer will remove ALL the email validations:
module.exports = function(UserCode){
//Add this line and it will start receiving multiple email.
delete UserCode.validations.email;
}
Instead be selective and do something like this:
module.exports = function(UserCode){
// remove ONLY email uniqueness validation
UserCode.validations.email = UserCode.validations.email.reduce((all, one) => {
if (one.validation !== 'uniqueness') {
all.push(one);
}
return all;
}, []);
}