Spring Data Elasticsearch - How to return the score

I am using Spring Data Elasticsearch and I can successfully run a match query that returns some items. I would like to display the score as well, but it always returns zero.
Here is the code:
String query = "test";
SearchQuery searchQuery = new NativeSearchQueryBuilder()
        .withQuery(matchQuery("_all", query))
        .build();
String scrollId = elasticsearchTemplate.scan(searchQuery, 1000, false);
List<GlobalSearchDTO> sampleEntities = new ArrayList<GlobalSearchDTO>();
boolean hasRecords = true;
while (hasRecords) {
    Page<GlobalSearchDTO> page = elasticsearchTemplate.scroll(scrollId, 5000L, new SearchResultMapper() {
        @Override
        public <T> AggregatedPage<T> mapResults(SearchResponse response, Class<T> clazz, Pageable pageable) {
            List<GlobalSearchDTO> out = new ArrayList<>();
            GlobalSearchDTO tmp;
            ObjectMapper objectMapper = new ObjectMapper();
            log.debug("MAX SCORE {}", response.getHits().getMaxScore()); // <-- ZERO
            for (SearchHit hit : response.getHits()) {
                tmp = new GlobalSearchDTO();
                tmp.setId(Long.valueOf(hit.getId()));
                tmp.setType(hit.getType());
                log.debug("SCORE: {}", hit.getScore()); // <-- ZERO
                try {
                    switch (hit.getIndex()) {
                        case "document": // map to DTO
                            tmp.setObj(objectMapper.readValue(hit.getSourceAsString(), Document.class));
                            break;
                        case "product": // map to DTO
                            tmp.setObj(objectMapper.readValue(hit.getSourceAsString(), Product.class));
                            break;
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
                out.add(tmp);
            }
            if (out.size() > 0) {
                return new AggregatedPageImpl<T>((List<T>) out);
            }
            return null;
        }
    });
    if (page != null) {
        sampleEntities.addAll(page.getContent());
    } else {
        hasRecords = false;
    }
}
elasticsearchTemplate.clearScroll(scrollId);
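For what it's worth, zero is expected here: a scan/scroll pass in Elasticsearch skips relevance scoring, so every hit (and the max score) comes back as 0. If you don't strictly need to scroll through the whole result set, a regular paged query does compute scores. A minimal sketch under that assumption, reusing the question's mapper as resultMapper:

    SearchQuery scoredQuery = new NativeSearchQueryBuilder()
            .withQuery(matchQuery("_all", query))
            .withPageable(new PageRequest(0, 50)) // page through results instead of scrolling
            .build();

    // hit.getScore() and response.getHits().getMaxScore() are populated here,
    // because a normal query (unlike scan) ranks the hits.
    Page<GlobalSearchDTO> scoredPage =
            elasticsearchTemplate.queryForPage(scoredQuery, GlobalSearchDTO.class, resultMapper);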

Related

How can I read Flux<DataBuffer> content?

I want to read multipart/form-data where one part is application/json. I can't get the parts into a Map<String, String>. Is there any way to parse a Part to a String?
private Map<String, String> getFormData(String path, MultiValueMap<String, Part> partMultiValueMap) {
    if (partMultiValueMap != null) {
        Map<String, String> formData = new HashMap<>();
        Map<String, Part> multiPartMap = partMultiValueMap.toSingleValueMap();
        for (Map.Entry<String, Part> partEntry : multiPartMap.entrySet()) {
            Part part = partEntry.getValue();
            if (part instanceof FormFieldPart) {
                formData.put(partEntry.getKey(), ((FormFieldPart) part).value());
            } else {
                String bodyString = bufferToStr(part.content());
                formData.put(partEntry.getKey(), bodyString);
            }
        }
        return formData;
    }
    return null;
}
And the extra Flux helper:
private String bufferToStr(Flux<DataBuffer> content) {
    AtomicReference<String> res = new AtomicReference<>();
    content.subscribe(buffer -> {
        byte[] bytes = new byte[buffer.readableByteCount()];
        buffer.read(bytes);
        DataBufferUtils.release(buffer);
        res.set(new String(bytes, StandardCharsets.UTF_8));
    });
    return res.get();
}
subscribe is asynchronous, so by the time bufferToStr returns, the value may still be null?
You could do it in a non-blocking way with StringDecoder. Basically, you could write your code to return a Mono<Map<>>.
Note: I'm using the Pair class here to return key-value pairs and later collect them into a Map; the Pair is from the org.springframework.data.util package.
public Mono<Map<String, String>> getFormData(MultiValueMap<String, Part> partMultiValueMap) {
    Map<String, Part> multiPartMap = partMultiValueMap.toSingleValueMap();
    return Flux.fromIterable(multiPartMap.entrySet())
            .flatMap(entry -> {
                Part part = entry.getValue();
                if (part instanceof FormFieldPart) {
                    return Mono.just(
                            Pair.of(entry.getKey(), ((FormFieldPart) part).value()) // return Pair
                    );
                } else {
                    return decodePartToString(part.content()) // decode DataBuffers to a String
                            .flatMap(decodedString ->
                                    Mono.just(Pair.of(entry.getKey(), decodedString))); // return Pair
                }
            })
            .collectMap(Pair::getFirst, Pair::getSecond); // collect the pairs into a Map<>
}

private Mono<String> decodePartToString(Flux<DataBuffer> dataBufferFlux) {
    StringDecoder stringDecoder = StringDecoder.textPlainOnly();
    return stringDecoder.decodeToMono(dataBufferFlux,
            ResolvableType.NONE,
            MimeTypeUtils.TEXT_PLAIN,
            Collections.emptyMap()
    );
}
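For completeness, a sketch of how this could be consumed from a WebFlux controller; the /form endpoint and handler name are made up for illustration:

    // Hypothetical endpoint: WebFlux can inject the multipart map as a Mono.
    @PostMapping(value = "/form", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
    public Mono<Map<String, String>> handleForm(@RequestBody Mono<MultiValueMap<String, Part>> parts) {
        return parts.flatMap(this::getFormData); // reactive all the way down, no blocking subscribe
    }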

How to hit ElasticSearch using Apache HttpClient

I have a SearchRequest object with all the Elasticsearch (ES) query data set. I cannot use the RestHighLevelClient for my use case because it requires the endpoint to be passed in at instantiation time, and I get the ES endpoint dynamically based on some condition. One way would be to always create a new RestHighLevelClient, which is inefficient. The other way is to create a static CloseableHttpClient at service start and make an HttpPost request with the dynamic endpoint. I want to take the latter approach, but I don't know how to convert the SearchRequest object into a JSON query string.
Any code reference/snippet would be very helpful.
private final CloseableHttpClient client;

public GenericElasticSearchResponse search(@Nonnull final SearchRequest searchRequest,
                                           @Nonnull final RoutingConfig route) {
    final URIBuilder builder = new URIBuilder()
            .setScheme(route.getScheme())
            .setHost(route.getESEndpoint())
            .setPort(Optional.ofNullable(route.getPort())
                    .orElse(80))
            .setPath("/sessions*/_search");
    final URI uri = builder.build();
    final ContentType contentType = ContentType.create("application/json", "UTF-8");
    final HttpPost httpPost = new HttpPost(uri);
    httpPost.setEntity(entity); // <-- how do I build this entity from searchRequest?
    final CloseableHttpResponse response = client.execute(httpPost);
    final String responseEntity;
    try (final Reader reader = new InputStreamReader(response.getEntity().getContent(), Charsets.UTF_8)) {
        responseEntity = CharStreams.toString(reader);
    }
    final SearchResponse searchResponse = objectMapper.readValue(responseEntity, SearchResponse.class);
    return new ElasticSearchResponse(searchResponse);
}
I found that searchRequest.source().toString() actually returns the JSON form of the SearchRequest. The following is the complete code snippet for hitting ES via the Apache client:
public GenericElasticSearchResponse search(@Nonnull final SearchRequest searchRequest,
                                           @Nonnull final RoutingConfig route) {
    try {
        final EndpointConfig endpoint = route.getEndpoint();
        final URIBuilder builder = new URIBuilder()
                .setScheme(endpoint.getScheme())
                .setHost(endpoint.getHost())
                .setPort(Optional.ofNullable(endpoint.getPort())
                        .orElse(HTTPS_PORT))
                .setPath(Optional.ofNullable(endpoint.getQueryPath())
                        .orElse(StringUtils.EMPTY));
        final URI uri = builder.build();
        final ContentType contentType = ContentType.create("application/json", "UTF-8");
        final String queryString = searchRequest.source().toString(); // JSON form of the request
        final StringEntity entity = new StringEntity(queryString, contentType);
        final HttpPost httpPost = new HttpPost(uri);
        httpPost.setEntity(entity);
        final CloseableHttpResponse response = sendRequest(httpPost);
        final String responseEntity;
        try (final Reader reader = new InputStreamReader(response.getEntity().getContent(), Charsets.UTF_8)) {
            responseEntity = CharStreams.toString(reader);
        }
        log.info("ElasticSearchClient response: Code: {}, Entity {}", response.getCode(), responseEntity);
        SearchResponse searchResponse = null;
        if (Objects.nonNull(responseEntity)) {
            searchResponse = parseResponse(responseEntity, searchRequest, response.getCode());
            log.info("ElasticSearchClient searchResponse- {} ", searchResponse);
        }
        return new ElasticSearchResponse(searchResponse);
    } catch (final URISyntaxException e) {
        throw new IllegalStateException(
                String.format("Invalid URI. host: %s", route.getEndpoint()), e);
    } catch (final IOException e) {
        throw new IllegalStateException("ElasticSearch Request failed.", e);
    }
}

private SearchResponse parseResponse(@Nonnull final String responseEntity,
                                     @Nonnull final SearchRequest searchRequest,
                                     final int responseCode) {
    if (responseCode >= 400 || responseCode < 200) {
        log.info("ES error response - {} ", responseEntity);
        final ESErrorResponse response = GSON.fromJson(responseEntity, ESErrorResponse.class);
        throw new IllegalStateException();
    }
    SearchResponse searchResponse = null;
    final NamedXContentRegistry registry = new NamedXContentRegistry(getDefaultNamedXContents());
    final XContentParser parser;
    try {
        parser = JsonXContent.jsonXContent.createParser(registry,
                DeprecationHandler.THROW_UNSUPPORTED_OPERATION, responseEntity);
        searchResponse = SearchResponse.fromXContent(parser);
    } catch (IOException e) {
        throw new IllegalStateException("Error while parsing response ", e);
    }
    return searchResponse;
}

public static List<NamedXContentRegistry.Entry> getDefaultNamedXContents() {
    final Map<String, ContextParser<Object, ? extends Aggregation>> map = new HashMap<>();
    map.put(TopHitsAggregationBuilder.NAME, (p, c) -> ParsedTopHits.fromXContent(p, (String) c));
    map.put(StringTerms.NAME, (p, c) -> ParsedStringTerms.fromXContent(p, (String) c));
    return map.entrySet().stream()
            .map(entry -> new NamedXContentRegistry.Entry(Aggregation.class, new ParseField(entry.getKey()), entry.getValue()))
            .collect(Collectors.toList());
}

private CloseableHttpResponse sendRequest(final HttpPost httpPost) throws IOException {
    return client.execute(httpPost);
}
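To illustrate the key insight in isolation: SearchSourceBuilder's toString() serializes to the JSON body that the _search endpoint expects, so the entity can be built directly from it. The index and field names below are made up:

    // Hypothetical request; source().toString() yields the JSON body for _search.
    SearchRequest searchRequest = new SearchRequest("sessions-2020-01");
    searchRequest.source(new SearchSourceBuilder()
            .query(QueryBuilders.matchQuery("status", "ACTIVE"))
            .size(50));

    String queryString = searchRequest.source().toString();
    StringEntity entity = new StringEntity(queryString, ContentType.create("application/json", "UTF-8"));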

Handling multipart response from spring rest controller

I have a controller method like this:
@PostMapping(path = "/downloadAttachment",
        produces = "application/octet-stream")
public ResponseEntity<?> downloadAttachment(@Valid @RequestBody Attachment attachmentModel) {
    refreshProp(false);
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_OCTET_STREAM);
    try {
        String byteRes = null;
        JSONArray responseFromDownloadAttachment =
                databaseOperations.downloadAttachment(attachmentModel);
        if (responseFromDownloadAttachment.length() == 0) {
            return new ResponseEntity<>("", HttpStatus.NO_CONTENT);
        } else {
            for (int blobRes = 0; blobRes < responseFromDownloadAttachment.length(); blobRes++) {
                JSONObject blobObj = responseFromDownloadAttachment.getJSONObject(blobRes);
                if (blobObj != null) {
                    byteRes = blobObj.getString("file");
                }
            }
        }
        byte[] byteArray = byteRes.getBytes();
        return new ResponseEntity<>(byteArray, HttpStatus.OK);
    } catch (Exception e) {
        log.error("Exception occurred!" + e);
        e.printStackTrace();
        JSONObject errObj = new JSONObject();
        errObj.put("status", "E");
        errObj.put("message", e);
        return new ResponseEntity<>(errObj.toString(), HttpStatus.INTERNAL_SERVER_ERROR);
    }
}
I am sending a byte array as the response, but I am not sure which type of file I will get from the service layer. It could be anything: xlsx, txt, png, jpg, or other media. I am setting the headers to octet-stream and produces to octet-stream as well. Can I use octet-stream to handle these types of responses?
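For reference, application/octet-stream is commonly used as exactly this kind of catch-all for arbitrary binary content; the client just receives opaque bytes. What usually matters for downloads is a Content-Disposition header carrying the filename, since that is the main hint the client gets about what the bytes are. A minimal sketch; the filename here is hypothetical and would come from your service layer:

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_OCTET_STREAM);
    // "report.xlsx" is a placeholder; derive the real name/extension from the service response
    headers.set(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"report.xlsx\"");
    return new ResponseEntity<>(byteArray, headers, HttpStatus.OK);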

AggregatingReplyingKafkaTemplate releaseStrategy Question

There seems to be an issue when I use AggregatingReplyingKafkaTemplate with template.setReturnPartialOnTimeout(true): it returns a timeout exception even when partial results are available from the consumers.
In the example below, I have 3 consumers replying to the request topic, and I've set the reply timeout to 10 seconds. I've explicitly delayed the response of Consumer 3 to 11 seconds, so I expect the responses from Consumers 1 and 2 back and a partial result. However, I am getting a KafkaReplyTimeoutException. Appreciate your inputs. Thanks.
I based the code on the unit test below:
[ReplyingKafkaTemplateTests][1]
I've provided the actual code below:
@RestController
public class SumController {

    @Value("${kafka.bootstrap-servers}")
    private String bootstrapServers;

    public static final String D_REPLY = "dReply";
    public static final String D_REQUEST = "dRequest";

    @ResponseBody
    @PostMapping(value = "/sum")
    public String sum(@RequestParam("message") String message) throws InterruptedException, ExecutionException {
        AggregatingReplyingKafkaTemplate<Integer, String, String> template = aggregatingTemplate(
                new TopicPartitionOffset(D_REPLY, 0), 3, new AtomicInteger());
        String resultValue = "";
        String currentValue = "";
        try {
            template.setDefaultReplyTimeout(Duration.ofSeconds(10));
            template.setReturnPartialOnTimeout(true);
            ProducerRecord<Integer, String> record = new ProducerRecord<>(D_REQUEST, null, null, null, message);
            RequestReplyFuture<Integer, String, Collection<ConsumerRecord<Integer, String>>> future =
                    template.sendAndReceive(record);
            future.getSendFuture().get(5, TimeUnit.SECONDS); // send ok
            System.out.println("Send Completed Successfully");
            ConsumerRecord<Integer, Collection<ConsumerRecord<Integer, String>>> consumerRecord = future.get(10, TimeUnit.SECONDS);
            System.out.println("Consumer record size " + consumerRecord.value().size());
            Iterator<ConsumerRecord<Integer, String>> iterator = consumerRecord.value().iterator();
            while (iterator.hasNext()) {
                currentValue = iterator.next().value();
                System.out.println("response " + currentValue);
                System.out.println("Record header " + consumerRecord.headers().toString());
                resultValue = resultValue + currentValue + "\r\n";
            }
        } catch (Exception e) {
            System.out.println("Error Message is " + e.getMessage());
        }
        return resultValue;
    }

    public AggregatingReplyingKafkaTemplate<Integer, String, String> aggregatingTemplate(
            TopicPartitionOffset topic, int releaseSize, AtomicInteger releaseCount) {
        // Create container properties
        ContainerProperties containerProperties = new ContainerProperties(topic);
        containerProperties.setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
        // Create a consumer factory with the consumer config
        DefaultKafkaConsumerFactory<Integer, Collection<ConsumerRecord<Integer, String>>> cf =
                new DefaultKafkaConsumerFactory<>(consumerConfigs());
        // Create a listener container with the consumer factory and container properties
        KafkaMessageListenerContainer<Integer, Collection<ConsumerRecord<Integer, String>>> container =
                new KafkaMessageListenerContainer<>(cf, containerProperties);
        // container.setBeanName(this.testName);
        AggregatingReplyingKafkaTemplate<Integer, String, String> template =
                new AggregatingReplyingKafkaTemplate<>(new DefaultKafkaProducerFactory<>(producerConfigs()), container,
                        (list, timeout) -> {
                            releaseCount.incrementAndGet();
                            return list.size() == releaseSize;
                        });
        template.setSharedReplyTopic(true);
        template.start();
        return template;
    }

    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test_id");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
        return props;
    }

    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        // list of host:port pairs used for establishing the initial connections to the Kafka cluster
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
        return props;
    }

    public ProducerFactory<Integer, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @KafkaListener(id = "def1", topics = { D_REQUEST }, groupId = "D_REQUEST1")
    @SendTo // default REPLY_TOPIC header
    public String dListener1(String in) throws InterruptedException {
        return "First Consumer : " + in.toUpperCase();
    }

    @KafkaListener(id = "def2", topics = { D_REQUEST }, groupId = "D_REQUEST2")
    @SendTo // default REPLY_TOPIC header
    public String dListener2(String in) throws InterruptedException {
        return "Second Consumer : " + in.toLowerCase();
    }

    @KafkaListener(id = "def3", topics = { D_REQUEST }, groupId = "D_REQUEST3")
    @SendTo // default REPLY_TOPIC header
    public String dListener3(String in) throws InterruptedException {
        Thread.sleep(11000);
        return "Third Consumer : " + in;
    }
}
[1]: https://github.com/spring-projects/spring-kafka/blob/master/spring-kafka/src/test/java/org/springframework/kafka/requestreply/ReplyingKafkaTemplateTests.java
template.setReturnPartialOnTimeout(true) simply means the template will consult the release strategy on timeout (with the timeout argument = true, to tell the strategy it's a timeout rather than a delivery call).
It must return true to release the partial result.
This is to allow you to look at (and possibly modify) the list to decide whether you want to release or discard.
Your strategy ignores the timeout parameter:
(list, timeout) -> {
    releaseCount.incrementAndGet();
    return list.size() == releaseSize;
});
You need something like return timeout ? true : { ... } so that the strategy releases when the timeout flag is set.
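In other words, the release strategy has to honor the timeout flag, for example:

    (list, timeout) -> {
        releaseCount.incrementAndGet();
        // timeout == true: the reply window expired, so release the partial list;
        // otherwise keep waiting until all releaseSize replies have arrived.
        return timeout || list.size() == releaseSize;
    });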

Spring Boot GET Request Mapping URL problems

How do I turn this: http://myurl.com/test/api/v1/data/?:getlicense=1234
into this: http://myurl.com/test/api/v1/data/:getlicense/1234
With the code below, a Postman GET to http://myurl.com/test/api/v1/data/?:getlicense=1234 returns a successful result.
Below is my code:
@RequestMapping(value = "/api/v1/data/", produces = MediaType.APPLICATION_JSON_VALUE, headers = "Accept=*/*", method = { RequestMethod.GET })
public Map ReturnData(@RequestParam(":getlicense") String getdata) {
    Map returns = new HashMap();
    try {
        queryData qD = new queryData();
        qD.setData(getdata);
        returns = result.getdataList(qD);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return returns;
}
Please help me, thank you.
You have to convert your RequestParam into a PathVariable:
@GetMapping("/api/v1/data/licenses/{id}")
public Map returnData(@PathVariable(value = "id") String id) {
    Map returns = new HashMap();
    try {
        queryData qD = new queryData();
        qD.setData(id);
        returns = result.getdataList(qD);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return returns;
}
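With this mapping, the Postman call becomes GET http://myurl.com/test/api/v1/data/licenses/1234.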
Please find the solution below (the same mapping in the @RequestMapping style):
@RequestMapping(value = "/api/v1/data/licenses/{id}", produces = MediaType.APPLICATION_JSON_VALUE, headers = "Accept=*/*", method = { RequestMethod.GET })
public Map ReturnData(@PathVariable(value = "id") String id) {
    Map returns = new HashMap();
    try {
        queryData qD = new queryData();
        qD.setData(id);
        returns = result.getdataList(qD);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return returns;
}
