mget producing: Unable to parse response body for Response - elasticsearch

I'm issuing a Multi-Get request via the Java High Level REST Client and I'm receiving the following exception:
"Unable to parse response body for Response{requestLine=POST /_mget HTTP/1.1, host=http://localhost:9200, response=HTTP/1.1 200 OK}"
I pulled the following JSON, the request body that was sent to Elastic, from the logs:
{
  "docs": [
    {
      "_index": "blah",
      "_type": null,
      "_id": "some-id-232332",
      "routing": null,
      "stored_fields": null,
      "version": -3,
      "version_type": "internal",
      "_source": {
        "includes": [],
        "excludes": []
      }
    }
  ]
}
I sent the above JSON to Elastic via Postman and I'm seeing the following response (the same one I see in the logs):
{
  "docs": [
    {
      "_index": "blah",
      "_type": null,
      "_id": "some-id-232332",
      "found": false
    }
  ]
}
Isn't that a valid response? Is this an issue w/ the elasticsearch-rest-high-level-client?
Elastic 7.5.0, org.elasticsearch.client:elasticsearch-rest-high-level-client:7.5.2

I needed to check the exception's getCause() chain to find the real problem: I was passing null to Jackson in mapper.readValue(nullBytes, Customer.class).
Interestingly enough, what is essentially an NPE only shows itself buried in the cause chain 🤷‍♂️.
Stack trace:
java.util.concurrent.ExecutionException: java.io.IOException: Unable to parse response body for Response{requestLine=POST /_mget HTTP/1.1, host=http://localhost:9200, response=HTTP/1.1 200 OK}
...
...
THE REAL PROBLEM IS HERE!!! 🚨🚨
Caused by: java.lang.IllegalArgumentException: argument "src" is null
at com.fasterxml.jackson.databind.ObjectMapper._assertNotNull(ObjectMapper.java:4429)
restHighLevelClient.mgetAsync(multiGetRequest, RequestOptions.DEFAULT, new ActionListener<MultiGetResponse>() {
    @Override
    public void onResponse(MultiGetResponse response) {
        for (var responseItem : response.getResponses()) {
            try {
                // simulating a null source
                byte[] nullBytes = null;
                customer = mapper.readValue(nullBytes, Customer.class);
            } catch (IOException e) {
                result.completeExceptionally(e);
            }
        }
        result.complete(true);
    }

    @Override
    public void onFailure(Exception ex) {
        // the real problem!!!
        // log.error("ex.cause", ex.getCause());
        log.error("error with mget", ex);
        result.completeExceptionally(ex);
    }
});
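For anyone landing here: the two changes that avoid the confusion are (1) skipping documents that were not found before handing the source to Jackson, and (2) logging the root cause in onFailure. Below is a minimal sketch, not the exact code from the linked repro; it reuses the mapper, Customer and log names from above, and IOException handling is omitted for brevity.

for (MultiGetItemResponse item : response.getResponses()) {
    GetResponse get = item.getResponse();
    if (item.isFailed() || get == null || !get.isExists()) {
        // "found": false (or a per-item failure): skip instead of feeding null bytes to Jackson
        continue;
    }
    Customer customer = mapper.readValue(get.getSourceAsBytes(), Customer.class);
    // ... use customer ...
}

// in onFailure: walk the cause chain so the real problem isn't hidden behind the IOException wrapper
Throwable root = ex;
while (root.getCause() != null && root.getCause() != root) {
    root = root.getCause();
}
log.error("root cause of mget failure", root);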
Full source of repro, Full log file

Related

Issue while Querying GraphQL API as a client

I am trying to run the query below from Postman and I see HTTP 400 Bad Request as the error. Please help.
query getContract($data: String!) {
  contract(contractNumber: $data) {
    contractNumber
    versions(getCurrentContractVersion: true) {
      nodes {
        customer {
          birthDate
          legalEntityCode
        }
        contractAssets {
          nodes {
            assetDetails {
              plate
            }
          }
        }
      }
    }
  }
}
Postman Response - HTTP 400 Bad request
{
  "errors": [
    {
      "message": "Either the parameter query or the parameter id has to be set.",
      "extensions": {
        "code": "HC0013"
      }
    }
  ]
}
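For what it's worth, HC0013 from Hot Chocolate usually means the server never saw a query field in the request body: over HTTP, a GraphQL request is a JSON envelope with "query" and "variables" keys, not the bare query text. Below is a hedged sketch of a request shape that should be accepted, using Java 11's HttpClient; the endpoint URL and contract number are placeholders, and the query is shortened for brevity. In Postman the equivalent is a raw JSON body sent with Content-Type: application/json.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GraphQLPost {
    public static void main(String[] args) throws Exception {
        // JSON envelope: the GraphQL document goes under "query", variable values under "variables"
        String body = "{ \"query\": \"query getContract($data: String!) { contract(contractNumber: $data) { contractNumber } }\","
                + " \"variables\": { \"data\": \"CONTRACT-123\" } }";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/graphql")) // placeholder endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}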

How can we catch the unauthorized exception thrown by Quarkus

I am facing this problem but don't know how to solve it.
I have a GraphQL endpoint to fetch a list of users, and the authentication check is already enabled on it.
Basically, when I send a fetchUsers request without an Authorization header, it should throw an exception or return a status code to let the user know, but currently it just responds with:
{
  "errors": [
    {
      "message": null,
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "fetchUsers"
      ],
      "extensions": {
        "classification": "DataFetchingException"
      }
    }
  ],
  "data": {
    "fetchUsers": null
  }
}
And on the backend server, this exception is thrown:
SRGQL012000: Data Fetching Error: io.quarkus.security.UnauthorizedException
at io.quarkus.security.runtime.interceptor.check.AuthenticatedCheck.apply(AuthenticatedCheck.java:28)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer.check(SecurityConstrainer.java:28)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer_Subclass.check$$superforward1(SecurityConstrainer_Subclass.zig:100)
at io.quarkus.security.runtime.interceptor.SecurityConstrainer_Subclass$$function$$1.apply(SecurityConstrainer_Subclass$$function$$1.zig:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:62)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
Is there any way to catch this UnauthorizedException and customize it, so that we respond with a 401 and the error message we want?
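One hedged option, assuming the authentication check sits on a delegated service bean rather than on the resolver method itself (so the exception actually reaches the resolver): catch the UnauthorizedException and rethrow it as a MicroProfile GraphQLException, which puts a readable message into the errors array instead of "message": null. UserService and User are placeholder names here. Note that this does not change the HTTP status; GraphQL-over-HTTP responses generally stay 200 and report failures in the errors array.

import java.util.List;

import org.eclipse.microprofile.graphql.GraphQLApi;
import org.eclipse.microprofile.graphql.GraphQLException;
import org.eclipse.microprofile.graphql.Query;

import io.quarkus.security.UnauthorizedException;

@GraphQLApi
public class UserGraphQLApi {

    private final UserService userService; // hypothetical bean that carries the security check

    public UserGraphQLApi(UserService userService) {
        this.userService = userService;
    }

    @Query("fetchUsers")
    public List<User> fetchUsers() throws GraphQLException {
        try {
            return userService.fetchUsers();
        } catch (UnauthorizedException e) {
            // replace the null message with one we control
            throw new GraphQLException("Unauthorized: missing or invalid credentials");
        }
    }
}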

Reducing output of GraphQL

I have set up a GraphQL-mongoose-express-apollo combo as per this guide.
When I run a query that returns multiple results, is there a way to reduce the resulting array before I actually get to processing the response from the query?
Query:
query GetSomeUsers {
  userMany(limit: 3) {
    _id
  }
}
Actual output:
{
  "data": {
    "userMany": [
      {
        "_id": "5e950543cb48dbaafc60722d"
      },
      {
        "_id": "5e950543cb48dbaafc60722e"
      },
      {
        "_id": "5e950547cb48dbaafc60722f"
      }
    ]
  }
}
Desired output:
{
  "data": {
    "userMany": [
      "5e950543cb48dbaafc60722d",
      "5e950543cb48dbaafc60722e",
      "5e950547cb48dbaafc60722f"
    ]
  }
}
So far I have only found something that seems to be relevant in an article on GraphQL Leveler, but I don't see how it would work with graphql-compose-mongoose, as the GraphQL schema is automatically generated and there does not seem to be any place in the code to put in that LevelerObjectType in place of a GraphQLObjectType.

C# ElasticClient v6.0.2 LowLevel.IndexAsync Creating Empty Documents

Below is a portion of my code, which I've taken from here. I had to make a few changes to make it work with the new v6.x of Elasticsearch.
It runs without any errors and creates new documents, BUT with empty field values. If I take the same JSON payload and PUT it into Elasticsearch using Postman, the document gets indexed fine, with all fields populated. Please let me know whether I am using the right Elasticsearch API methods, and using them correctly.
string strJsonMessage = @"
{
    message : {
        content: 'Test Message Content'
    }
}";
ConnectionSettings connectionSettings = new ConnectionSettings(new Uri("xxx")).BasicAuthentication("xx", "xx");
ElasticClient client = new ElasticClient(connectionSettings);
JObject msg = JObject.Parse(strJsonMessage);
var result = await client.LowLevel.IndexAsync<BytesResponse>("events-2018.03.27", "event", PostData.Serializable(msg));
if (result.Success)
{
    log.Info("Data successfully sent.");
    log.Verbose(result.Body);
}
else
{
    log.Error("Failed to send data.");
}
OUTPUT TRACE:
2018-03-27T18:37:18.961 [Info] Data successfully sent.
2018-03-27T18:37:18.961 [Verbose] {"_index":"events-2018.03.27","_type":"event","_id":"u9HPaGIBBm3ZG7GB5jM_","_version":1,"result":"created","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":29,"_primary_term":1}
Querying Elasticsearch for this document by its ID gives me this:
{
  "_index": "events-2018.03.27",
  "_type": "event",
  "_id": "u9HPaGIBBm3ZG7GB5jM_",
  "_version": 1,
  "found": true,
  "_source": {
    "message": {
      "content": []
    }
  }
}
In case someone comes across the same issue, I got the solution here.

Error while updating nested field

Hi, I am using the Elasticsearch Java API to update a document with a script, but I am getting the exception below:
Exception in thread "main" MapperParsingException[object mapping for [content] tried to parse field [content] as object, but found a concrete value]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:215)
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:308)
at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:438)
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:264)
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:124)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:309)
at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:580)
at org.elasticsearch.index.shard.IndexShard.prepareIndexOnPrimary(IndexShard.java:559)
at org.elasticsearch.action.index.TransportIndexAction.prepareIndexOperationOnPrimary(TransportIndexAction.java:211)
at org.elasticsearch.action.index.TransportIndexAction.executeIndexRequestOnPrimary(TransportIndexAction.java:223)
at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:157)
at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:66)
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase.doRun(TransportReplicationAction.java:657)
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:287)
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:279)
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:77)
at org.elasticsearch.transport.TransportService$4.doRun(TransportService.java:376)
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Below is the existing document in ES:
{
  "_index": "index1",
  "_type": "type1",
  "_id": "1",
  "_version": 8,
  "found": true,
  "_source": {
    "content": {
      "contentId": 1,
      "metadata": {
        "title": "content one",
        "duration": 4500
      }
    },
    "custom": {
      "field1": "value1"
    }
  }
}
I would like to update the "content" field as below
"content": {
"contentId": 1,
"metadata": {
"duration": 900
}
}
When I update with a REST call (localhost:9200/index1/type1/1/_update), it works fine. I get the error with the Java API prepareUpdate.
I have 3 Java classes:
DTO class has Content object
Content class has Metadata object and contentId as long
Metadata class has title (String) and duration(long).
Below is the code to update
Map<String, Object> params = new HashMap<>();
params.put("contentScript", dto.toString());
Script s = new Script("ctx._source.content=contentScript",ScriptType.INLINE,null,params);
UpdateResponse resp = client.prepareUpdate("index1", "type1", "1").setScript(s).setScriptedUpsert(true).get();
dto is an object of the DTO class and its values are set accordingly.
Please help.
params.put("contentScript", dto.toString());
You are passing a string where it expects an object. The code below might help:
String script = "ctx._source.pete = jsonMap";
// "json" here is the new value for the field as a JSON string; it gets parsed into a Map before being passed as a param
Map<String, Object> jsonMap = new ObjectMapper().readValue(json, HashMap.class);
Map<String, Object> params = ImmutableMap.of("jsonMap", jsonMap);
return new Script(script, ScriptService.ScriptType.INLINE, null, params);
https://discuss.elastic.co/t/how-to-update-nested-objects-in-elasticsearch-2-2-script-via-java-api/43135/2
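Adapted to the question's code, a hedged sketch: serialize the DTO's content into a Map and pass that Map as the script parameter. dto.getContent() is an assumed getter on the DTO class, and the Script constructor is kept exactly as in the question so it matches whatever client version that code compiles against.

// needs com.fasterxml.jackson.databind.ObjectMapper and com.fasterxml.jackson.core.type.TypeReference
Map<String, Object> contentAsMap =
        new ObjectMapper().convertValue(dto.getContent(), new TypeReference<Map<String, Object>>() {});
Map<String, Object> params = new HashMap<>();
params.put("contentScript", contentAsMap); // a Map this time, not a String
Script s = new Script("ctx._source.content = contentScript", ScriptType.INLINE, null, params);
UpdateResponse resp = client.prepareUpdate("index1", "type1", "1").setScript(s).setScriptedUpsert(true).get();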
