Elasticsearch wrapper query does not work with base64-encoded string
ES version: 5.2.3
To encode, I used base64:
char[] data = Base64Coder.encode(text.getBytes());
return data.ToString();
Note: text is the underlying JSON query.
query:
curl -XPOST 'http://localhost:9200/entitymaster_qa_t4/_search' -d '{
"query" : {
"wrapper" : {
"query" : "W0NAMTZiN2MzYw=="
}
}
}'
Response:
{"error":{"root_cause":[{"type":"parse_exception","reason":"Failed to derive xcontent"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"entitymaster_qa_t4","node":"8WVaVr9ATmaqOPDHGpNyHw","reason":{"type":"parse_exception","reason":"Failed to derive xcontent"}}]},"status":400}
The wrapper query appeared in ES 6.0 according to the docs, so if you want to use it you need to upgrade your version. Also, the base64 string must decode into a valid query, not just a piece of data: "W0NAMTZiN2MzYw==" decodes to "[C@16b7c3c", which is what you get from calling toString() on a Java char[] (an object reference), not the JSON itself. Build the string from the array contents instead, e.g. new String(data) rather than data.toString().
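For reference, on a version where the wrapper query is supported, a working request looks like the one below; the difference is that the base64 value decodes to an actual query, here {"match_all":{}} (a minimal sketch reusing the index from the question, not your real query):
curl -XPOST 'http://localhost:9200/entitymaster_qa_t4/_search' -d '{
"query" : {
"wrapper" : {
"query" : "eyJtYXRjaF9hbGwiOnt9fQ=="
}
}
}'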
Here is my requirement. This is my 3 levels of data which I am getting from the DB. My requirement is: when I search for Developer, I should get all the values of Developer, such as GEO and GRAPH from data2, in a list; and when it comes to Support, my values should contain SERVER and Data in a list. Then, on the basis of the selection from data1, data3 should be able to do the search, like suppose when we select Developer, then GeoPos and GraphPos...
The logic I need to use here is Elasticsearch.
data1 data2 data3
Developer GEO GeoPos
Developer GRAPH GraphPos
Support SERVER ServerPos
Support Data DataPos
This is what I have done to create the index and to get the values:
curl -X PUT "localhost:9200/mapping_log" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "data1": { "type": "text", "fields": { "keyword": { "type": "keyword" } } },
      "data2": { "type": "text", "fields": { "keyword": { "type": "keyword" } } },
      "data3": { "type": "text", "fields": { "keyword": { "type": "keyword" } } }
    }
  }
}'
For searching the values, I am not sure what I am going to get. Can you please help with the search DSL query too?
curl -X GET "localhost:9200/mapping_log/_search?pretty" -H 'Content-Type: application/json' -d'
{
"query": {
"match": {
"data1.data2": "product"
}
}
}
'
How do I create documents for this type of data? Can we create JSON and post it through Postman or curl?
If your documents are not indexed in Elasticsearch yet, you first need to ingest them into an existing index with the aid of Logstash; you can find many configuration files related to your input database.
Before transforming your documents, create an index in Elasticsearch with a multi-field mapping. You can also use dynamic mapping (Elasticsearch's default mapping) and change your DSL query, but I recommend using a multi-field mapping as follows:
PUT /mapping
{
  "mappings": {
    "properties": {
      "rating": { "type": "float" },
      "content": { "type": "text" },
      "author": {
        "properties": {
          "name": { "type": "text" },
          "email": { "type": "keyword" }
        }
      }
    }
  }
}
Then you can query the fields in the Kibana Dev Tools console with a DSL query like the one below:
GET /mapping/_search
{
  "query": {
    "match": { "author.email": "SOMEMAIL" }
  }
}
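As for creating the documents: yes, each record can simply be sent as a JSON document to the index, either from the Kibana Dev Tools console or through Postman/curl with a Content-Type: application/json header. A minimal sketch against the example index above (the field values are made up):
POST /mapping/_doc
{
  "rating": 4.5,
  "content": "some text",
  "author": { "name": "some name", "email": "SOMEMAIL" }
}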
The following works fine:
curl -X GET "localhost:9200/_search?pretty" -H 'Content-Type: application/json' -d'
{
"query" : {
"query_string" : {
"query" : "chicag*",
"fields" : ["name"],
"_name":"myqry"
}
}
}
'
How can I create a browser URL for this GET request? I URL-encoded the JSON and tried:
localhost:9200/_search?data=%7B%0A%20%20%22query%22%20%3A%20%7B%0A%20%20%20%20%22query_string%22%20%3A%20%7B%0A%20%20%20%20%20%20%22query%22%20%3A%20%22chicag%2A%22%2C%0A%20%20%20%20%20%20%22fields%22%20%20%3A%20%5B%22name%22%5D%2C%0A%20%20%20%20%20%20%22_name%22%3A%22Chcagoooo%22%0A%20%20%20%20%7D%0A%20%20%7D
but it did not work
It is definitely possible to send your DSL query as a query string parameter using the source parameter like this:
localhost:9200/_search?pretty&source={"query":{"query_string":{"query":"chicag*","fields":["name"],"_name":"myqry"}}}&source_content_type=application/json
I am using the multi-get API _mget. In my case, I am using the following URL to get all the documents with the specified IDs.
Refer: https://www.elastic.co/guide/en/elasticsearch/reference/7.10/docs-multi-get.html#mget-ids
This example displays documents with id 49 and 50.
http://localhost:9200/my_index/_mget?pretty&source={ "ids" : [49, 50] }&source_content_type=application/json
I am running elasticsearch 2.3.4, but the syntax does not seem to have changed in 5.x.
Multiget over curl is working just fine. Here is what my curl looks like:
curl 'localhost:9200/_mget' -d '{
"docs" : [
{
"_index" : "logs-2017-04-30",
"_id" : "e72927c2-751c-4b33-86de-44a494abf78f"
}
]
}'
And when I want to pull the "message" field off that response, I use this request:
curl 'localhost:9200/_mget' -d '{
"docs" : [
{
"_index" : "logs-2017-04-30",
"_id" : "e72927c2-751c-4b33-86de-44a494abf78f",
"fields" : ["message"]
}
]
}'
Both of the above queries return the log and information that I am looking for.
But when I try to translate it to Java like this:
MultiGetRequestBuilder request = client.prepareMultiGet();
request.add("logs-2017-04-30", null, "e72927c2-751c-4b33-86de-44a494abf78f");

MultiGetResponse mGetResponse = request.get();
for (MultiGetItemResponse itemResponse : mGetResponse.getResponses()) {
    GetResponse response = itemResponse.getResponse();
    logger.debug("Outputing object: " + ToStringBuilder.reflectionToString(response));
}
I appear to be getting null objects back. When I try to grab the message field off the null-looking GetResponse object, nothing is there:
GetField field = response.getField("message"); <--- returns null
What am I doing wrong? Doing a REST call to Elasticsearch proves the log exists, but my Java call is wrong somehow.
The documentation page for the Java multi get completely skips over the extra syntax required to retrieve data beyond the _source field. Just like the REST API, doing a multi get with the minimum information required to locate a log gets very limited information about it. In order to get specific fields from a log in a multi get call through the Java API, you must pass in a MultiGetRequest.Item to the builder. This item needs to have the fields you want specified in it before you execute the request.
Here is the code change (broken into multiple lines for clarity) that results in the fields I want being present when I make the query:
MultiGetRequestBuilder request = client.prepareMultiGet();
MultiGetRequest.Item item = new MultiGetRequest.Item("logs-2017-04-30", "null", "e72927c2-751c-4b33-86de-44a494abf78f");
item.fields("message");
request.add(item);
MultiGetResponse mGetResponse = request.get();
Now I can ask for the field I specified earlier:
GetField field = response.getField("message");
Elasticsearch has from/size parameters, and size is 10 by default. How do I get all the results without pagination?
First, get the count of results. You can get it with the count API, using a QueryBuilder as in the following call:
CountResponse response = client.prepareCount("test")
.setQuery(termQuery("_type", "type1"))
.execute()
.actionGet();
Then call
long n = response.getCount();
and use setSize(n) on your search request.
For non-Java clients, use a curl request like this:
curl -XGET 'http://localhost:9200/twitter/tweet/_count' -d '
{
"query" : {
"term" : { "user" : "kimchy" }
}
}'
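Then plug the returned count into the size parameter of the search; a sketch, where 1000 stands in for whatever count the previous call returned:
curl -XGET 'http://localhost:9200/twitter/tweet/_search?size=1000' -d '
{
"query" : {
"term" : { "user" : "kimchy" }
}
}'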
More on this can be found at this link:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/search-count.html
"-" is acting like a or operator for e.g. I am searching "t-link", then it showing the result containing "t-link" as well as "t", why it is giving two terms, but i interested in the "t-link", why it is happening so? How can i recover from it?
By default, Elasticsearch uses the standard analyzer for strings.
Basically, your string is tokenized into two lowercased tokens:
t
link
If you need to know what Elasticsearch does with your fields, use the _analyze API:
$ curl -XGET 'localhost:9200/_analyze?analyzer=standard' -d 't-link'
$ curl -XGET 'localhost:9200/_analyze?analyzer=simple' -d 't-link'
If you don't want that, make sure you put the right mapping for that field and use either a simple analyzer or a keyword analyzer or no analyzer at all depending on your requirements. See also String core type.
$ curl -XPUT 'http://localhost:9200/twitter/tweet/_mapping' -d '
{
"tweet" : {
"properties" : {
"message" : {"type" : "string", "analyzer" : "simple"},
"other" : {"type" : "string", "index" : "not_analyzed"}
}
}
}
'
With this mapping, the message field will be analyzed with the simple analyzer and the other field won't be analyzed at all.
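For example, with the other field indexed as not_analyzed, an exact term query should now match the whole value t-link instead of its tokens (a sketch, assuming the twitter/tweet example above):
$ curl -XGET 'http://localhost:9200/twitter/tweet/_search' -d '
{
"query" : {
"term" : { "other" : "t-link" }
}
}
'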