How to fire DRL rules deployed on KIE Server using spring boot? - spring-boot

I created a simple project in Drools Workbench. The project has one data object and two DRL files. I built and deployed the project to the KIE Server, and I created a simple Spring Boot application which loads data into the data object via a REST service and fires the rules. Below is the code:
public class Main {

    private static String containerId = "ImportProducts_1.0.1-LATEST";
    private static String user = "kieserver";
    private static String password = "kieserver1!";
    private static String url = "http://localhost:8180/kieserver/services/rest/server";
    private static final Logger LOGGER = LoggerFactory.getLogger(Main.class);
    private static final MarshallingFormat FORMAT = MarshallingFormat.JSON;
    private static String CLASS_NAME = "ImportProduct";

    public static void main(String[] args) {
        List<ImportProduct> prods = new ArrayList<>();
        prods.add(new ImportProduct("1", "Grocery - Milk", "OK", 25.0));
        prods.add(new ImportProduct("2", "Fashion - Trouser", "NOT_OK", 1300.0));
        prods.add(new ImportProduct("3", "Grocery - Wheat", "OK", 425.0));
        prods.add(new ImportProduct("4", "Grocery - Dairy Milk Chocolate", "OK", 100.0));

        KieServicesConfiguration config = KieServicesFactory.newRestConfiguration(url, user, password, 60000);
        config.setMarshallingFormat(MarshallingFormat.JSON);
        RuleServicesClient client = KieServicesFactory.newKieServicesClient(config).getServicesClient(RuleServicesClient.class);

        List<Command<?>> cmds = new ArrayList<>();
        KieCommands commands = KieServices.Factory.get().getCommands();
        cmds.add(commands.newInsert(prods, CLASS_NAME));
        cmds.add(commands.newFireAllRules());
        BatchExecutionCommand myCommands = CommandFactory.newBatchExecution(cmds);
        ServiceResponse<ExecutionResults> response = client.executeCommandsWithResults(containerId, myCommands);

        if (response.getType() == ServiceResponse.ResponseType.SUCCESS) {
            LOGGER.info("Commands executed with success! Response: ");
            LOGGER.info("{}", response.getResult());
            List<ImportProduct> prodUpdated = (List<ImportProduct>) response.getResult().getValue(CLASS_NAME);
            //sale.setDiscount(saleUpdated.getDiscount());
            LOGGER.info("Response is: {}", response.getMsg());
            LOGGER.info("Output is: {}", prodUpdated.toString());
        } else {
            LOGGER.error("Error executing rules. Message: {}", response.getMsg());
        }
        //KieServices kieServices = KieServices.Factory.get();
        //ReleaseId releaseId = (ReleaseId) kieServices.newReleaseId( "com.test", "ImportProducts", "1.0.1-LATEST" );
    }
}
The application runs without errors, but the rules have no effect on the data fed into the KIE container. The rule deletes the data object if its status != 'OK', yet I am getting back all the data I sent in the request body of the POST request. I think the rules (BRM/DRL) capability is enabled on the server, as shown in the log below:
22:11:40.139 [main] DEBUG org.kie.server.client.impl.AbstractKieServicesClientImpl - About to deserialize content:
'{
"type" : "SUCCESS",
"msg" : "Kie Server info",
"result" : {
"kie-server-info" : {
"id" : "kie-server-a017ffcb29dc",
"version" : "7.55.0.Final",
"name" : "kie-server-a017ffcb29dc",
"location" : "http://172.17.0.3:8080/kie-server/services/rest/server",
**"capabilities" : [ "KieServer", "BRM", "BPM", "CaseMgmt", "BPM-UI", "BRP", "DMN", "Swagger" ],**
"messages" : [ {
"severity" : "INFO",
"timestamp" : {
"java.util.Date" : 1623926563857
},
"content" : [ "Server KieServerInfo{serverId='kie-server-a017ffcb29dc', version='7.55.0.Final', name='kie-server-a017ffcb29dc', location='http://172.17.0.3:8080/kie-server/services/rest/server', capabilities=[KieServer, BRM, BPM, CaseMgmt, BPM-UI, BRP, DMN, Swagger]', messages=null', mode=DEVELOPMENT}started successfully at Thu Jun 17 10:42:43 UTC 2021" ]
} ],
"mode" : "DEVELOPMENT"
}
}
}'
Below is the response I got back:
22:11:41.515 [main] DEBUG org.kie.server.client.impl.AbstractKieServicesClientImpl - About to deserialize content:
'{
"type" : "SUCCESS",
"msg" : "Container ImportProducts_1.0.1-LATEST successfully called.",
"result" : {
"execution-results" : {
"results" : [ {
"value" : [{"com.test.importproducts.ImportProduct":{
"id" : "1",
"category" : "Grocery - Milk",
"status" : "OK",
"price" : 25.0
}},{"com.test.importproducts.ImportProduct":{
"id" : "2",
"category" : "Fashion - Trouser",
"status" : "NOT_OK",
"price" : 1300.0
}},{"com.test.importproducts.ImportProduct":{
"id" : "3",
"category" : "Grocery - Wheat",
"status" : "OK",
"price" : 425.0
}},{"com.test.importproducts.ImportProduct":{
"id" : "4",
"category" : "Grocery - Dairy Milk Chocolate",
"status" : "OK",
"price" : 100.0
}}],
"key" : "ImportProduct"
} ],
"facts" : [ {
"value" : {"org.drools.core.common.DefaultFactHandle":{
"external-form" : "0:3:459769708:-992461270:3:DEFAULT:NON_TRAIT:java.util.ArrayList"
}},
"key" : "ImportProduct"
} ]
}
}
}'
Please help.

I fixed the issue. The problem was that commands.newInsert(prods, CLASS_NAME) inserts the whole List as a single fact (note the java.util.ArrayList fact handle in the response above), so rules written against ImportProduct never match. Inserting each object individually fixes it. Below is the working code:
private Command<?> prepareCommands(List<ImportProduct> facts, String sessionName, String outIdentifier) {
    KieCommands commandsFactory = KieServices.Factory.get().getCommands();
    // Insert each fact individually instead of inserting the List as one object
    List<Command> commands = facts.stream().map(commandsFactory::newInsert).collect(Collectors.toList());
    commands.add(commandsFactory.newFireAllRules());
    // Collect the ImportProduct facts still in the session after the rules have fired,
    // registered under the caller-supplied identifier (the original used the CLASS_NAME constant here)
    ObjectFilter factsFilter = new ClassObjectFilter(ImportProduct.class);
    commands.add(commandsFactory.newGetObjects(factsFilter, outIdentifier));
    return commandsFactory.newBatchExecution(commands, sessionName);
}
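For completeness, a minimal sketch of how this batch can be sent to the KIE Server and the surviving facts read back, reusing the client, containerId, and CLASS_NAME setup from the question (passing null as the session name is an assumption; use the ksession name configured in the kjar if one is defined):
// A sketch, reusing the client built in the question. The null session name
// is an assumption (it falls back to the container's default session).
Command<?> batch = prepareCommands(prods, null, CLASS_NAME);
ServiceResponse<ExecutionResults> response = client.executeCommandsWithResults(containerId, batch);
if (response.getType() == ServiceResponse.ResponseType.SUCCESS) {
    // newGetObjects(filter, outIdentifier) registered the surviving facts under
    // CLASS_NAME, so only the ImportProduct objects still in the session
    // (i.e. not retracted by the rules) come back here.
    Collection<?> survivors = (Collection<?>) response.getResult().getValue(CLASS_NAME);
    survivors.forEach(p -> LOGGER.info("Kept: {}", p));
}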

Related

Spring data mongodb Aggregation group and count

I want to group and count a collection's data by its nested field type.name. However, Spring Data MongoDB's Aggregation count() method returns 1 as the count for each type, even though there is more than one document per type. Is there any mistake in my pipeline functions? If so, could you correct me please? Thanks in advance.
My collection data structure is:
Java code to fetch the data:
private List<OrganizationSummary> getOrganizationSummary() {
    Aggregation agg = newAggregation(
            match(Criteria.where("status").is(Boolean.TRUE).and("deleted").is(Boolean.FALSE)),
            group("type.name").count().as("count"),
            project("count").and("name").previousOperation()
    );
    AggregationResults<OrganizationSummary> groupResults
            = mongoTemplate.aggregate(agg, Organization.class, OrganizationSummary.class);
    List<OrganizationSummary> result = groupResults.getMappedResults();
    return result;
}
Result from the debugger: as you can see, all count fields default to 1, even though there is more than one document per type.
Result from mongoshell:
Result dto to be projected:
@Getter
@Setter
@Builder
@AllArgsConstructor
@NoArgsConstructor
public class OrganizationSummary {
private String name;
private Long count;
}
Logs after the mongoTemplate.aggregate() method is executed:
06-10-2022 15:04:18 [DEBUG] org.springframework.data.mongodb.core.MongoTemplate : Executing aggregation: [{ "$match" : { "status" : true, "deleted" : false}}, { "$group" : { "_id" : "$type.name", "count" : { "$sum" : { "$date" : "1970-01-01T00:00:00.001Z"}}}}, { "$project" : { "count" : { "$date" : "1970-01-01T00:00:00.001Z"}, "_id" : { "$date" : "1970-01-01T00:00:00Z"}, "name" : "$_id"}}] in collection organization
06-10-2022 15:04:18 [DEBUG] org.mongodb.driver.protocol.command : Sending command '{"aggregate": "organization", "pipeline": [{"$match": {"status": true, "deleted": false}}, {"$group": {"_id": "$type.name", "count": {"$sum": {"$date": "1970-01-01T00:00:00.001Z"}}}}, {"$project": {"count": {"$date": "1970-01-01T00:00:00.001Z"}, "_id": {"$date": "1970-01-01T00:00:00Z"}, "name": "$_id"}}], "cursor": {}, "allowDiskUse": false, "$db": "organizationidentity", "lsid": {"id": {"$binary": {"base64": "GA/dUO4HRlWxF1m9NVs1iQ==", "subType": "04"}}}}' with request id 22 to database organizationidentity on connection [connectionId{localValue:3, serverValue:78}] to server localhost:27017
06-10-2022 15:04:18 [DEBUG] org.mongodb.driver.protocol.command : Execution of command with request id 22 completed successfully in 13.70 ms on connection [connectionId{localValue:3, serverValue:78}] to server localhost:27017
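Note that both log lines show count() being serialized as {"$sum": {"$date": "1970-01-01T00:00:00.001Z"}}, i.e. the numeric literal 1 rendered as a BSON date of one millisecond, so a wrong value type is reaching the server, which would explain the bogus counts. For reference, a minimal sketch of the same pipeline that aliases _id directly instead of using previousOperation() (untested, and not a confirmed fix):
// A sketch (untested): the same pipeline, aliasing _id to "name" directly
// instead of relying on previousOperation().
Aggregation agg = newAggregation(
        match(Criteria.where("status").is(Boolean.TRUE).and("deleted").is(Boolean.FALSE)),
        group("type.name").count().as("count"),
        project("count").and("_id").as("name")
);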

java 11 collect to map by keeping value unique

I have a specific JSON value as shown below:
{
"record_id" : "r01",
"teacherNstudents": [
{
"teacher" : {
"name" : "tony",
"tid" : "T01"
},
"student" : {
"name" : "steve",
"sid" : "S01"
}
},
{
"teacher" : {
"name" : "tony",
"tid" : "T01"
},
"student" : {
"name" : "natasha",
"sid" : "S02"
}
},
{
"teacher" : {
"name" : "tony",
"tid" : "T01"
},
"student" : {
"name" : "bruce",
"sid" : "S03"
}
},
{
"teacher" : {
"name" : "tony",
"tid" : "T01"
},
"student" : {
"name" : "victor",
"sid" : "S04"
}
},
{
"teacher" : {
"name" : "henry",
"tid" : "T02"
},
"student" : {
"name" : "jack",
"sid" : "S05"
}
},
{
"teacher" : {
"name" : "henry",
"tid" : "T02"
},
"student" : {
"name" : "robert",
"sid" : "S06"
}
}
]
}
I am trying to generate a map like the one below:
[ {"S01", "T01"} , {"S05","T02"} ]
That is, removing all duplicate values and selecting only one teacher and one student. The current code I wrote for this is:
var firstMap = records.getTeacherNstudents()
.stream()
.collect(Collectors.toMap(tS -> tS.getTeacher().getTid(),
tS -> tS.getStudent().getSid(),
(a1, a2) -> a1));
return firstMap.entrySet()
.stream()
.collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));
I believe this can be improved by using Collectors.groupingBy. I am still working on it, but if anyone has a good idea on how to solve this, please share.
Using Java 8 groupingBy
You can try the approach below to get a Map<String, List<String>> or a Map<String, Set<String>> (which avoids duplicates), where the key is the teacher id and the value is the List or Set of students corresponding to that teacher.
I used the groupingBy collector from Java 8, grouped by tid, and used a downstream collector to map each group to the List or Set of student ids corresponding to each tid.
Approach A: Map<String, Set<String>> (unique values)
data.getTeacherStudentMappingList()
.stream()
.collect(Collectors.groupingBy(x -> x.getTeacher().getTid(), LinkedHashMap::new,
Collectors.mapping(y -> y.getStudent().getSid(),Collectors.toSet())));
Approach B: Map<String, List<String>> (non-unique, allows duplicates)
data.getTeacherStudentMappingList()
.stream()
.collect(Collectors.groupingBy(x -> x.getTeacher().getTid(), LinkedHashMap::new,
Collectors.mapping(y -> y.getStudent().getSid(),Collectors.toList())));
Here,
data is the object converted from the given JSON.
LinkedHashMap::new is used to preserve the order of the student data from the JSON in the output.
Collectors.mapping is used to convert the values corresponding to each key into student ids.
Collectors.toList() collects the student ids into a list.
Collectors.toSet() collects the unique student ids into a set.
Output:
{T01=[S01, S02, S03, S04], T02=[S05, S06]}
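If the goal is exactly the sample output in the question (one entry per teacher, keyed by the first student's id), the two-step toMap can also be collapsed into a single pass. A sketch, assuming the same accessor names as the question and a sequential stream:
// A sketch (accessor names follow the question; sequential stream assumed,
// since the filter below relies on a stateful "seen" set).
Set<String> seenTeachers = new HashSet<>();
Map<String, String> studentToTeacher = records.getTeacherNstudents().stream()
        .filter(ts -> seenTeachers.add(ts.getTeacher().getTid())) // first record per teacher
        .collect(Collectors.toMap(
                ts -> ts.getStudent().getSid(),  // key: that student's id
                ts -> ts.getTeacher().getTid(),  // value: the teacher's id
                (a, b) -> a,
                LinkedHashMap::new));
// => {S01=T01, S05=T02}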

Elasticsearch Java API access source when Exception is thrown

I am learning the Java RestHighLevelClient but I can't find the answer to this question.
When you submit a REST request for something where a document is not found, you will see something like this:
$ curl localhost:9200/customer/_doc/1?pretty
{
"error" : {
"root_cause" : [
{
"type" : "index_not_found_exception",
"reason" : "no such index [customer]",
"resource.type" : "index_expression",
"resource.id" : "customer",
"index_uuid" : "_na_",
"index" : "customer"
}
],
"type" : "index_not_found_exception",
"reason" : "no such index [customer]",
"resource.type" : "index_expression",
"resource.id" : "customer",
"index_uuid" : "_na_",
"index" : "customer"
},
"status" : 404
}
However in the Java client you code something like this:
GetRequest request = new GetRequest(INDEX, ROOT);
GetResponse response = null;
try {
    response = client.get(request, RequestOptions.DEFAULT);
} catch (IOException ioe) {
    // do something with the IOException
} catch (ElasticsearchException ese) {
    // where is the response source?
}
So if the document is not found, you get the ElasticsearchException, in which case the local variable response is null. So where do you get the source document that was present at the low level? (Preferably as a Map.)
You can get to the source of the response at the low level via the suppressed exceptions in the ElasticsearchException:
Throwable[] suppressed = ese.getSuppressed();
if (suppressed.length > 0 && suppressed[0] instanceof ResponseException) {
    ResponseException re = (ResponseException) suppressed[0];
    Response response = re.getResponse();
}
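From there, the body of the low-level Response can be read and parsed into a Map. A sketch, assuming Jackson is on the classpath (note that what comes back here is the error JSON shown above, not a document source, since no document exists):
// A sketch: read the raw response body and parse it into a Map.
// EntityUtils is org.apache.http.util.EntityUtils; ObjectMapper and
// TypeReference are from Jackson (assumed to be available).
String body = EntityUtils.toString(re.getResponse().getEntity());
Map<String, Object> error = new ObjectMapper()
        .readValue(body, new TypeReference<Map<String, Object>>() {});
Also note that when the index exists but the document id does not, client.get(...) does not throw; it returns a GetResponse whose isExists() is false.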

RepositoryRestResource returns results in a different format than RepositoryRestController

I am facing an issue with using different Spring controllers.
We are using a standard Spring PagingAndSortingRepository annotated with @RepositoryRestResource to serve search responses. One of the methods is:
Page<Custodian> findByShortNameContainingIgnoreCase(@Param("shortName") String shortName, Pageable p);
It returns all entities of Custodian that satisfy the conditions grouped in Pages.
The result looks like this:
{
"_embedded" : {
"custodians" : [ {
"shortName" : "fr-custodian",
"name" : "french custodian",
"contact" : "Francoir",
"_links" : {
"self" : {
"href" : "http://127.0.0.1:9000/api/custodians/10004"
},
"custodian" : {
"href" : "http://127.0.0.1:9000/api/custodians/10004"
}
}
} ]
},
"_links" : {
"self" : {
"href" : "http://127.0.0.1:9000/api/custodians/search/findByShortNameContainingIgnoreCase?shortName=fr&page=0&size=3&sort=shortName,asc"
}
},
"page" : {
"size" : 3,
"totalElements" : 1,
"totalPages" : 1,
"number" : 0
}
}
This is the format our frontend expects.
However, we need another query that results in a pretty long method (and thus URL) because it takes multiple parameters.
To be specific, it globally searches for a string in Custodian, so every parameter has the same value.
In order to shorten the URL, we created a RepositoryRestController annotated with @ResponseBody and implemented a function that takes only one parameter, calls the long query internally, and re-returns the result (a Page):
@RequestMapping(value = "/custodian", method = RequestMethod.GET)
public Page<Custodian> search(@RequestParam(value = "keyWord") String keyWord, Pageable p) {
    return repo.LONGURL(keyWord, keyWord, p);
}
Unfortunately, Spring doesn't apply the same format to the result of our function.
It looks like this:
{
"content" : [ {
"id" : 10004,
"shortName" : "fr-custodian",
"name" : "french custodian",
"contact" : "Francoir",
} ],
"pageable" : {
"sort" : {
"sorted" : true,
"unsorted" : false
},
"offset" : 0,
"pageSize" : 3,
"pageNumber" : 0,
"unpaged" : false,
"paged" : true
},
"totalElements" : 3,
"totalPages" : 1,
"last" : true,
"size" : 3,
"number" : 0,
"sort" : {
"sorted" : true,
"unsorted" : false
},
"numberOfElements" : 3,
"first" : true
}
How do you get Spring to deliver the same format in our custom method?
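One commonly used way to get the HAL shape from a custom controller is Spring HATEOAS's PagedResourcesAssembler, which Spring MVC can inject as a handler method parameter. A sketch, assuming Spring HATEOAS 1.x is on the classpath and reusing the method and parameter names from the question:
// A sketch (not from the original thread): wrap the Page in the same HAL
// representation (_embedded, _links, page metadata) that RepositoryRestResource
// endpoints produce.
@RequestMapping(value = "/custodian", method = RequestMethod.GET)
public PagedModel<EntityModel<Custodian>> search(@RequestParam(value = "keyWord") String keyWord,
                                                 Pageable p,
                                                 PagedResourcesAssembler<Custodian> assembler) {
    return assembler.toModel(repo.LONGURL(keyWord, keyWord, p));
}
On older Spring HATEOAS versions the corresponding types are PagedResources/Resource and the method is toResource(...). For the per-entity self links that Spring Data REST adds, a @RepositoryRestController method can additionally take Spring Data REST's PersistentEntityResourceAssembler as a parameter.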

$sort pipeline after $group not working using Spring aggregate class

I have a User collection which looks like the sample below:
User :{
"_id" : ObjectId("59f6dc660a975a3e3290ea01"),
"basicInfo" : {
"name" : "xxxx",
"age" : 27,
"gender" : "Male"
},
"otherInfo" : {
"projects" : [
{
"_id" : ObjectId("59f6f9230a975a67cc7d7638"),
"name" : "Test Project",
"projectImage" : "images/project/59f6f9230a975a67cc7d7638.jpg",
"desc" : "This is a testing project",
"status" : "Active",
"verifyDet" : {
"method" : "Admin",
"status" : "PENDING",
"isVerified" : false
}
},
{
"_id" : ObjectId("59f6f9230a975a67cc7d5556"),
"name" : "Test Project Two",
"projectImage" : "images/project/59f6f9230a975a67cc7d5556.jpg",
"desc" : "This is a testing project",
"status" : "Closed",
"verifyDet" : {
"method" : "Admin",
"status" : "APPROVED",
"isVerified" : true
}
}
]
}
}
Note: One user can be part of multiple projects, but he needs approval from the Admin to participate in project activities. Verification is managed by verifyDet, and projects are managed by the projects array.
The actual requirement is to show the Admin the list of members such that members with pending verification come first in alphabetic order, followed by approved/verified members in alphabetic order.
When I run the query below in the mongo shell, I get the list of users with only the one project detail (_id=59f6f9230a975a67cc7d7638) I am searching for, sorted by verification-pending users first and then by user name. The result comes out correctly.
db.User.aggregate(
{$unwind:"$otherInfo.projects"},
{
$match:{
"otherInfo.projects._id":ObjectId("59f6f9230a975a67cc7d7638"),
"otherInfo.projects.status":"Active"
}
},
{$group: {_id: {"_id":"$_id", "basicInfo":"$basicInfo"}, "projects": {$push: "$otherInfo.projects"}}},
{$project:{"_id":"$_id._id", "basicInfo":"$_id.basicInfo", "otherInfo.projects":"$projects"}},
{$sort:{"otherInfo.projects.verifyDet.isVerified":1, "basicInfo.name":1}}
)
But when I create the same aggregation in Spring, as shown below, I get an exception:
public List<User> fetchUsersList(String projectId, Pageable pageable) {
    // unwind operation
    AggregationOperation unwindOp = Aggregation.unwind("$otherInfo.projects");
    Criteria criteria = Criteria.where("otherInfo.projects._id").is(new ObjectId(projectId));
    criteria.and("otherInfo.projects.status").is("Active");
    AggregationOperation matchOp = Aggregation.match(criteria);
    AggregationOperation groupOp = Aggregation.group(
            Fields.from(Fields.field("_id", "$_id")).and(Fields.field("basicInfo", "$basicInfo")))
            .push("$otherInfo.projects").as("projects");
    AggregationOperation projectOp = Aggregation.project(
            Fields.from(Fields.field("_id", "$_id._id"),
                    Fields.field("basicInfo", "$_id.basicInfo"),
                    Fields.field("otherInfo.projects", "$projects")));
    AggregationOperation sortOp = Aggregation.sort(Direction.DESC, "otherInfo.projects.verifyDet.isVerified")
            .and(Direction.DESC, "basicInfo.name");
    Aggregation agg = Aggregation.newAggregation(unwindOp, matchOp, groupOp, projectOp, sortOp);
    AggregationResults<User> results = mongoTemplate.aggregate(agg, "User", User.class);
    return results.getMappedResults();
}
Exception :
2017-12-15 19:24:31,852 ERROR GlobalExceptionHandler:75 - Exception Stack Trace :
java.lang.IllegalArgumentException: Invalid reference 'otherInfo.projects.verifyDet.isVerified'!
at org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext.getReference(ExposedFieldsAggregationOperationContext.java:99)
at org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext.getReference(ExposedFieldsAggregationOperationContext.java:80)
at org.springframework.data.mongodb.core.aggregation.SortOperation.toDBObject(SortOperation.java:73)
at org.springframework.data.mongodb.core.aggregation.AggregationOperationRenderer.toDBObject(AggregationOperationRenderer.java:56)
at org.springframework.data.mongodb.core.aggregation.Aggregation.toDbObject(Aggregation.java:580)
at org.springframework.data.mongodb.core.aggregation.Aggregation.toString(Aggregation.java:596)
at com.grpbk.gp.repository.impl.UserRepositoryCustomImpl.fetchUsersList(UserRepositoryCustomImpl.java:1128)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
Please let me know what I am doing wrong.
"otherInfo.projects.verifyDet.isVerified" field needs to be present in Project Oepration or group operation so that sort would get a reference for that.
