Best way to update an object with a Set field via PATCH mapping in Spring

What is the best way to do a PATCH mapping on an object like this?
@Data
public class UnifiedOfferEntity {
private String companyName;
private String city;
private String title;
private Set<SkillEntity> skills;
}
The entity has a many-to-many relation. I wrote an update method that takes a Map<fieldName, value>, and it works properly until I try to put a Set into the request body.
public UnifiedOfferEntity patchEntity(String id, Map<Object, Object> fields) {
UnifiedOfferEntity unifiedOfferEntity = getEntityById(id);
fields.forEach((key, value) -> {
Field field = ReflectionUtils.findField(UnifiedOfferEntity.class, (String) key);
field.setAccessible(true);
ReflectionUtils.setField(field, unifiedOfferEntity, value);
});
return unifiedOfferEntity;
}
I am getting:
java.lang.IllegalArgumentException: Can not set java.util.Set field com.example.jobfinder.entity.UnifiedOfferEntity.seniority to java.util.ArrayList

Have you thought about this approach:
public UnifiedOfferEntity patchEntity(String id, Map<Object, Object> fields) {
UnifiedOfferEntity unifiedOfferEntity = getEntityById(id);
enhanceUnifiedOfferEntity(unifiedOfferEntity, fields);
return unifiedOfferEntity;
}
where the method would manually populate the object based on keys defined as constants... or another object...
private void enhanceUnifiedOfferEntity(UnifiedOfferEntity unifiedOfferEntity, Map<Object, Object> fields) {
    fields.forEach((k, v) -> {
        switch ((String) k) {
            case "COMPANY_NAME":
                unifiedOfferEntity.setCompanyName((String) v);
                break;
            case "CITY":
                unifiedOfferEntity.setCity((String) v);
                break;
            case "TITLE":
                unifiedOfferEntity.setTitle((String) v);
                break;
            case "SKILLS":
                // note: a JSON array arrives as a List, so it still has to be
                // converted to a Set<SkillEntity> before this call
                unifiedOfferEntity.setSkills((Set<SkillEntity>) v);
                break;
            default:
                break;
        }
    });
}
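If you want to keep the generic reflection-based patch instead, one option is to convert each incoming value to the field's declared type before setting it. This is a minimal sketch assuming Jackson is on the classpath and SkillEntity is bindable by Jackson; getEntityById is the same helper as in the question:

public UnifiedOfferEntity patchEntity(String id, Map<Object, Object> fields) {
    UnifiedOfferEntity entity = getEntityById(id);
    ObjectMapper mapper = new ObjectMapper(); // com.fasterxml.jackson.databind.ObjectMapper
    fields.forEach((key, value) -> {
        Field field = ReflectionUtils.findField(UnifiedOfferEntity.class, (String) key);
        if (field == null) {
            return; // skip unknown keys (or throw, depending on your API contract)
        }
        field.setAccessible(true);
        // Jackson binds JSON arrays to ArrayList; convert the raw value
        // (e.g. ArrayList -> Set<SkillEntity>) to the field's declared generic type first.
        Object converted = mapper.convertValue(value, mapper.constructType(field.getGenericType()));
        ReflectionUtils.setField(field, entity, converted);
    });
    return entity;
}

With that, a JSON array sent for skills no longer fails with the IllegalArgumentException, because the value is converted to the field's Set type before ReflectionUtils.setField is called.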

Related

How do I add a Type to a graphql-java-annotations project?

The documentation for graphql-java-annotations doesn't do such a great job of telling me how to add a Custom Scalar to my schema: https://github.com/Enigmatis/graphql-java-annotations/tree/v8.0.1#annotations-schema-creator
What I need is to create some 'scalar Date' in the Schema. It is unclear how to do this with the AnnotationsSchemaCreator builder thing.
GraphQLSchema schema = AnnotationsSchemaCreator.newAnnotationsSchema()
.query(Query.class) // to create your query object
.mutation(Mutation.class) // to create your mutation object
.subscription(Subscription.class) // to create your subscription object
.directive(UpperDirective.class) // to create a directive
.additionalType(AdditionalType.class) // to create some additional type and add it to the schema
.typeFunction(CustomType.class) // to add a typefunction
.setAlwaysPrettify(true) // to set the global prettifier of field names (removes get/set/is prefixes from names)
.setRelay(customRelay) // to add a custom relay object
.build();
The docs give me just that. Is a typeFunction what I need here? Do I have to first get the graphql-java "Custom Scalar" stuff set up and put that into the typeFunction?
What's happening right now is that my graphql-java-annotations Types which need the Date type...
public abstract class BasePart {
@GraphQLField
@GraphQLNonNull
@JsonIgnore
public String id;
...
@GraphQLField
public Date createdOn;
...
}
get into the schema without the Date scalar defined, so the GraphiQL UI rejects them with errors like...
Error: Date fields must be an object with field names as keys or a function which returns such an object.
at invariant (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:13:12678)
at defineFieldMap (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:14:16395)
at e.getFields (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:14:22028)
at http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:22055
at typeMapReducer (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:22227)
at http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:22200
at Array.forEach (<anonymous>)
at http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:22082
at typeMapReducer (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:22227)
at typeMapReducer (http://localhost.blueorigin.com:8080/webjars/graphiql/0.10.1/graphiql.min.js:15:21564)
I'm trying to figure out how to get that information into the, what, AnnotationsSchemaCreator.newAnnotationsSchema() builder?
How do you add a Custom Scalar to a graphql-java-annotations project?
The TypeFunction is the key. You pass the TypeFunction when you are building the schema with the AnnotationsSchemaCreator. The following code effectively got scalar Date into the service's GraphQL schema:
graphQLSchema = AnnotationsSchemaCreator.newAnnotationsSchema()
.query(QuerySchema.class)
.setAlwaysPrettify(true)
.typeFunction(new MyDateTypeFunction()) // <-- This got scalar Date onto the schema
.build();
The TypeFunction itself implements the support for the Date scalar.
public class MyDateTypeFunction implements TypeFunction {
@Override
public boolean canBuildType(Class<?> clazz, AnnotatedType annotatedType) {
return clazz == java.util.Date.class;
}
@Override
public GraphQLType buildType(
boolean b,
Class<?> clazz,
AnnotatedType annotatedType,
ProcessingElementsContainer processingElementsContainer) {
return MY_DATE;
}
public static final GraphQLScalarType MY_DATE = GraphQLScalarType
.newScalar()
.name("Date")
.description("Coerce java.util.Date to/from a String representation of the long value of getTime().")
.coercing(
new Coercing() {
@Override
public Object serialize(Object dataFetcherResult) throws CoercingSerializeException {
if (dataFetcherResult instanceof Date) {
final String result = String.format("%d", ((Date) dataFetcherResult).getTime());
return result;
}
final String message =
String.format("Expected type java.util.Date but found %s", typeName(dataFetcherResult));
throw new CoercingSerializeException(message);
}
@Override
public Object parseValue(Object input) throws CoercingParseValueException {
if (input instanceof String) {
try {
return stringToDate((String) input);
} catch (NumberFormatException nfe) {
final String message = String.format("NumberFormatException %s", nfe.getMessage());
throw new CoercingParseValueException(message);
}
}
final String message = String.format("Unable to parseValue %s to a java.util.Date", input);
throw new CoercingParseValueException(message);
}
@Override
public Object parseLiteral(Object input) throws CoercingParseLiteralException {
if (input instanceof StringValue) {
try {
final String inputStringValue = ((StringValue) input).getValue();
return stringToDate(inputStringValue);
} catch (NumberFormatException nfe) {
final String message = String.format("NumberFormatException %s", nfe.getMessage());
throw new CoercingParseLiteralException(message);
}
}
final String message = String.format("Unable to parseLiteral %s to a java.util.Date", input);
throw new CoercingParseLiteralException(message);
}
}
)
.build();
public static Date stringToDate(String input) throws NumberFormatException {
final long inputAsLong = Long.parseLong(input);
return new Date(inputAsLong);
}
public static String typeName(Object input) {
return input == null ? "null" : input.getClass().getName();
}
}
Note that I'm not recommending that you represent java.util.Date as the String value of the long getTime(); java.time.Instant's ISO-8601 format is much more readable. But my service needed this string value, and this is how I got it into a graphql-java-annotations project's schema.
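If you do want the more readable ISO-8601 form, the same pattern works for java.time.Instant. This is only a sketch (the class name MyInstantTypeFunction and the scalar name "Instant" are my own; imports are the same as for the Date version above, plus java.time.Instant and java.time.format.DateTimeParseException):

public class MyInstantTypeFunction implements TypeFunction {
    @Override
    public boolean canBuildType(Class<?> clazz, AnnotatedType annotatedType) {
        return clazz == java.time.Instant.class;
    }
    @Override
    public GraphQLType buildType(
            boolean input,
            Class<?> clazz,
            AnnotatedType annotatedType,
            ProcessingElementsContainer processingElementsContainer) {
        return MY_INSTANT;
    }
    public static final GraphQLScalarType MY_INSTANT = GraphQLScalarType
        .newScalar()
        .name("Instant")
        .description("Coerce java.time.Instant to/from its ISO-8601 String representation.")
        .coercing(
            new Coercing() {
                @Override
                public Object serialize(Object dataFetcherResult) throws CoercingSerializeException {
                    if (dataFetcherResult instanceof Instant) {
                        return dataFetcherResult.toString(); // Instant.toString() is ISO-8601
                    }
                    throw new CoercingSerializeException("Expected type java.time.Instant");
                }
                @Override
                public Object parseValue(Object input) throws CoercingParseValueException {
                    try {
                        return Instant.parse(String.valueOf(input));
                    } catch (DateTimeParseException e) {
                        throw new CoercingParseValueException(e.getMessage());
                    }
                }
                @Override
                public Object parseLiteral(Object input) throws CoercingParseLiteralException {
                    if (input instanceof StringValue) {
                        try {
                            return Instant.parse(((StringValue) input).getValue());
                        } catch (DateTimeParseException e) {
                            throw new CoercingParseLiteralException(e.getMessage());
                        }
                    }
                    throw new CoercingParseLiteralException("Expected an ISO-8601 String literal");
                }
            }
        )
        .build();
}

Register it the same way, with .typeFunction(new MyInstantTypeFunction()) on the AnnotationsSchemaCreator builder.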

Generic Search and Filter by dynamic fields for Criteria (Global Search)

I have a scenario where I need to add Criteria to perform search and filter in Spring using MongoTemplate.
Scenario:
Let's say I have Student, Course, and PotentialStudent, and I have to define only certain fields to be used for search and filter purposes. PotentialStudent contains both Student and Course information that is collected before all the required information is gathered to fill in Student and Course.
Search fields are the fields used for searching across any of them. For example: get values matching either courseName or courseType in Course.
Filters are used to match specific fields against multiple values; the values to filter on are set in FilterParams. Meaning, if I get values in FilterParams.studentTypes, then for PotentialStudent I should add Criteria that search inside PotentialStudent's student.type for the list of values, whereas for Student I should add Criteria that search in Student's type.
public abstract class Model {
@Id
protected String id;
@CreatedDate
protected Date createdDateTime;
@LastModifiedDate
protected Date modifiedDateTime;
protected abstract List<String> searchFields();
protected abstract Map<String, String> filterFields();
}
@Getter
@Setter
@Document("student")
public class Student extends Model {
private String firstName;
private String lastName;
private String address;
private StudentType type;
@Override
protected List<String> searchFields() {
return Lists.newArrayList("firstName","lastName","address");
}
@Override
protected Map<String, String> filterFields() {
Map<String, String> filterMap = Maps.newHashMap();
filterMap.put("studentType", "type");
return filterMap;
}
}
@Getter
@Setter
@Document("course")
public class Course extends Model {
private String courseName;
private String courseType;
private int duration;
private Difficulty difficulty;
@Override
protected List<String> searchFields() {
return Lists.newArrayList("courseName","courseType");
}
@Override
protected Map<String, String> filterFields() {
Map<String, String> filterMap = Maps.newHashMap();
filterMap.put("courseDifficulty", "difficulty");
return filterMap;
}
}
@Getter
@Setter
@Document("course")
public class PotentialStudent extends Model {
private Student student;
private Course course;
@Override
protected List<String> searchFields() {
return Lists.newArrayList("student.firstName","student.lastName","course.courseName");
}
@Override
protected Map<String, String> filterFields() {
Map<String, String> filterMap = Maps.newHashMap();
filterMap.put("studentType", "student.type");
filterMap.put("courseDifficulty", "course.difficulty");
return filterMap;
}
}
public class FilterParams {
private List<StudentType> studentTypes;
private List<Difficulty> difficulties;
}
public class PageData<T extends Model> {
public void setPageRecords(List<T> pageRecords) {
this.pageRecords = pageRecords;
}
private List<T> pageRecords;
}
//Generic Search Filter Implementation Class
public class GenericSearchFilter {
public <T extends Model> PageData getRecordsWithPageSearchFilter(Integer page, Integer size, String sortName, String sortOrder, String value, FilterParams filterParams, Class<T> ormClass) {
PageRequestBuilder pageRequestBuilder = new PageRequestBuilder();
Pageable pageable = pageRequestBuilder.getPageRequest(page, size, sortName, sortOrder);
Query mongoQuery = new Query().with(pageable);
//add Criteria for the domain specified search fields
Criteria searchCriteria = searchCriteria(value, ormClass);
if (searchCriteria != null) {
mongoQuery.addCriteria(searchCriteria);
}
//Handle Filter (this is the part I want to make dynamic; filterFields() is defined on the document classes)
mongoQuery.addCriteria(Criteria.where(filterFields().get("studentType")).in(filterParams.getStudentTypes()));
mongoQuery.addCriteria(Criteria.where(filterFields().get("courseDifficulty")).in(filterParams.getDifficulties()));
List<T> records = mongoTemplate.find(mongoQuery, ormClass);
PageData pageData = new PageData();
pageData.setPageRecords(records);
return pageData;
}
private <T extends Model> Criteria searchCriteria(String value, Class<T> ormClass) {
try {
Criteria orCriteria = new Criteria();
if (StringUtils.isNotBlank(value)) {
Model document = ormClass.getDeclaredConstructor().newInstance();
Method method = ormClass.getDeclaredMethod("searchFields");
method.setAccessible(true); // searchFields() is protected on the document classes
List<String> records = (List<String>) method.invoke(document);
Criteria[] orCriteriaArray = records.stream().map(s -> Criteria.where(s).regex(value, "i")).toArray(Criteria[]::new);
orCriteria.orOperator(orCriteriaArray);
}
return orCriteria;
} catch (Exception e) {
log.error(e.getMessage());
}
return null;
}
}
Given this scenario, my question is: how do I handle the filter cases in a better, more dynamic way, and how do I implement a global search, if needed, that searches across all document types on the fields each type specifies?
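One possible direction, shown only as a sketch, is to treat filterFields() the same way searchFields() is already treated: look the mapping up reflectively on the document class and apply only the filters that class actually maps. The applyFilters name is mine, the FilterParams getters (getStudentTypes(), getDifficulties()) are assumed to exist, and imports mirror those already used in GenericSearchFilter:

private <T extends Model> void applyFilters(Query mongoQuery, FilterParams filterParams, Class<T> ormClass) throws Exception {
    Model document = ormClass.getDeclaredConstructor().newInstance();
    Method method = ormClass.getDeclaredMethod("filterFields");
    method.setAccessible(true);
    @SuppressWarnings("unchecked")
    Map<String, String> filterFields = (Map<String, String>) method.invoke(document);

    // Logical filter name -> requested values; a filter is applied only if the
    // document class maps that logical name to a concrete field path.
    Map<String, List<?>> requested = new HashMap<>();
    requested.put("studentType", filterParams.getStudentTypes());
    requested.put("courseDifficulty", filterParams.getDifficulties());

    requested.forEach((name, values) -> {
        String path = filterFields.get(name);
        if (path != null && values != null && !values.isEmpty()) {
            mongoQuery.addCriteria(Criteria.where(path).in(values));
        }
    });
}

A global search could follow the same shape: loop over the Class<? extends Model> types you want to include, build the search Criteria from each type's searchFields(), and run one mongoTemplate.find per type.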

Get the value of a nested map (a Map inside of a Map) from Postman

How can I get the values of nested key-value pairs (a Map inside of a Map) sent from Postman?
{
"message_key": {
"device_id": "12548652",
"message": "Y5482lsdfkOjEyNDUysdfsdfMTc1sdfOTM3MjU=",
"messageType": "Text"
}
}
Actually, I want to bind the value of message_key to a domain object so I can validate all of its properties.
I have found the answer, e.g.:
DTO:
public @Data class MessageKey {
@JsonProperty("device_id")
private String deviceId;
@JsonProperty("message")
private String message;
@JsonProperty("messageType")
private String messageType;
}
Controller:
public void test(@RequestBody Map<String, MessageKey> bodyParameters) {
MessageKey messageKey = bodyParameters.get("message_key");
System.out.println(messageKey);
}
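An alternative, sketched here with a hypothetical MessageRequest wrapper and mapping path, is to bind the whole body to a DTO instead of a raw Map:

public @Data class MessageRequest {
    @JsonProperty("message_key")
    private MessageKey messageKey;
}

@PostMapping("/message")
public void test(@RequestBody MessageRequest request) {
    MessageKey messageKey = request.getMessageKey();
    System.out.println(messageKey);
}

Binding to a concrete type also makes it easier to add Bean Validation annotations to MessageKey and validate the body with @Valid, which is what "validate every property" is aiming at.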

How to deserialize a POJO containing a Hashtable?

I have a POJO like this:
public class Test implements java.io.Serializable {
private static final long serialVersionUID = 1L;
private String hash;
private java.util.Hashtable<Integer, Long> myTempTable;
public java.util.Hashtable<Integer, Long> getMyTempTable() {
return this.myTempTable;
}
public void setMyTempTable(java.util.Hashtable<Integer, Long> myTempTable) { this.myTempTable = myTempTable; }
//And a few more variables
}
In a response I get this POJO in JSON format, but I get an error while converting the JSON to a "Test" Java object like this:
gson.fromJson(tempString, Test.class);
It gives the error:
java.lang.IllegalArgumentException: Can not set java.util.Hashtable field <package_name>.Temp.myTempTable to java.util.LinkedHashMap
Why is Gson converting the Hashtable to a LinkedHashMap?
And what does this error mean?
UPDATE: the JSON looks like this:
{
"hash": "abc",
"myTempTable": {
"1": 30065833999,
"2": 34364325903,
"3": 536872959
}
}
For converting an Object to a JSON String:
public static <T> String convertObjectToStringJson(T someObject, Type type) {
Gson mGson = new Gson();
String strJson = mGson.toJson(someObject, type);
return strJson;
}
For converting a JSON String to an Object:
public static <T> T getObjectFromJson(String json, Type type) {
Gson mGson = new Gson();
if (json != null) {
if (json.isEmpty()) {
return null;
}
}
return mGson.fromJson(json, type);
}
where Type is the type of your object. For example, for an object:
new TypeToken<YOUR_POJO>(){}.getType();
for a list:
new TypeToken<List<YOUR_POJO>>(){}.getType();
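Applied to the Test POJO from the question, that would look something like this (a sketch; tempString is the JSON shown above):

Type testType = new TypeToken<Test>() {}.getType();
Test test = getObjectFromJson(tempString, testType);
Hashtable<Integer, Long> table = test.getMyTempTable(); // populated according to the declared generic types

Passing the full Type this way gives Gson the declared field types to work with; the LinkedHashMap fallback is what you get when that type information is lost, for example through generics and type erasure.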

Hibernate CompositeUserType mapping has wrong number of columns

I am new to Hibernate and am writing a CompositeUserType. When I run the code I am getting the error:
property mapping has wrong number of columns
Please help me figure out what I am missing.
My CompositeUserType goes as follows:
public class EncryptedAsStringType implements CompositeUserType {
@Override
public String[] getPropertyNames() {
return new String[] { "stockId", "stockCode", "stockName","stockDescription" };
}
@Override
public Type[] getPropertyTypes() {
//stockId, stockCode,stockName,modifiedDate
return new Type[] {
Hibernate.INTEGER, Hibernate.STRING, Hibernate.STRING,Hibernate.STRING
};
}
@Override
public Object getPropertyValue(final Object component, final int property)
throws HibernateException {
Object returnValue = null;
final Stock auditData = (Stock) component;
if (0 == property) {
returnValue = auditData.getStockId();
} else if (1 == property) {
returnValue = auditData.getStockCode();
} else if (2 == property) {
returnValue = auditData.getStockName();
}
return returnValue;
}
@Override
public void setPropertyValue(final Object component, final int property,
final Object setValue) throws HibernateException {
final Stock auditData = (Stock) component;
}
@Override
public Object nullSafeGet(final ResultSet resultSet,
final String[] names,
final SessionImplementor paramSessionImplementor, final Object paramObject)
throws HibernateException, SQLException {
//owner here is of type TestUser or the actual owning Object
Stock auditData = null;
final Integer createdBy = resultSet.getInt(names[0]);
//Deferred check after first read
if (!resultSet.wasNull()) {
auditData = new Stock();
System.out.println(">>>>>>>>>>>>"+resultSet.getInt(names[1]));
System.out.println(">>>>>>>>>>>>"+resultSet.getString(names[2]));
System.out.println(">>>>>>>>>>>>"+resultSet.getString(names[3]));
System.out.println(">>>>>>>>>>>>"+resultSet.getString(names[4]));
}
return auditData;
}
@Override
public void nullSafeSet(final PreparedStatement preparedStatement,
final Object value, final int property,
final SessionImplementor sessionImplementor)
throws HibernateException, SQLException {
if (null == value) {
} else {
final Stock auditData = (Stock) value;
System.out.println("::::::::::::::::::::::::::::::::"+auditData.getStockCode());
System.out.println("::::::::::::::::::::::::::::::::"+auditData.getStockDescription());
System.out.println("::::::::::::::::::::::::::::::::"+auditData.getStockId());
System.out.println("::::::::::::::::::::::::::::::::"+auditData.getStatus());
}
}
My domain class Stock has five attributes (stockId, stockCode, stockName, status, stockDescription).
I need to declare the stockDescription field with the composite field type.
private Integer stockId;
private String stockCode;
private String stockName;
private String status;
private String stockDescription;
//Constructors
@Column(name = "STOCK_CC", unique = true, nullable = false, length = 20)
@Type(type="com.mycheck.EncryptedAsStringType")
@Columns(columns = { @Column(name="STOCK_ID"),
@Column(name="STOCK_CODE"),
@Column(name="STOCK_NAME")
})
public String getStockDescription() {
return stockDescription;
}
}
When I try to execute an insert for Stock, I get the error: Error creating bean with name
'sessionFactory' defined in class path resource [spring/config/../database/Hibernate.xml]:
Invocation of init method failed. nested exception is org.hibernate.MappingException:
property mapping has wrong number of columns: com.stock.model.Stock.stockDescription type:
com.mycheck.EncryptedAsStringType
Where am I going wrong?
One can extract the answer from the code samples and the comments to the original question, but to save everyone some reading, I've compiled a quick summary.
If you declare a CompositeUserType that maps a type to n columns, you have to declare n columns in @Columns alongside the @Type annotation. Example:
public class EncryptedAsStringType implements CompositeUserType {
@Override
public String[] getPropertyNames() {
return new String[] { "stockId", "stockCode", "stockName","stockDescription" };
}
// ...
}
This CompositeUserType maps to 4 separate columns, so 4 separate @Column annotations have to be declared:
@Type(type="com.mycheck.EncryptedAsStringType")
@Columns(columns = {
@Column(name="STOCK_ID"),
@Column(name="STOCK_CODE"),
@Column(name="STOCK_NAME"),
@Column(name="STOCK_DESCRIPTION")
})
public String getStockDescription() {
return stockDescription;
}
That's it and Hibernate is happy.
