elasticsearch.logQueries has no effect in Kibana config

I'm trying to log all queries from kibana. So I edited config/kibana.yml and added the following lines:
logging.dest: /tmp/test.log
logging.silent: false
logging.quiet: false
logging.verbose: true
elasticsearch.logQueries: true
Then I restarted Kibana and ran a query.
Logs started to appear, but only access logs are recorded; there are no ES queries.
{
  "type": "response",
  "@timestamp": "2018-08-21T02:41:03Z",
  "tags": [],
  "pid": 28701,
  "method": "post",
  "statusCode": 200,
  "req": {
    "url": "/elasticsearch/_msearch",
    "method": "post",
    "headers": {
      ...
    },
    "remoteAddress": "xxxxx",
    "userAgent": "xxxxx",
    "referer": "http://xxxxxxx:8901/app/kibana"
  },
  "res": {
    "statusCode": 200,
    "responseTime": 62,
    "contentLength": 9
  },
  "message": "POST /elasticsearch/_msearch 200 62ms - 9.0B"
}
Any ideas? I'm using ELK 6.2.2.

The elasticsearch.logQueries setting was introduced in Kibana 6.3, as can be seen in this pull request.
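On 6.3+, note that per the Kibana docs elasticsearch.logQueries only takes effect when verbose logging is also enabled. A sketch of the relevant kibana.yml lines (same paths and values as in the question):

logging.dest: /tmp/test.log
logging.verbose: true          # required for elasticsearch.logQueries to take effect
elasticsearch.logQueries: true # log queries sent to Elasticsearch

The queries should then show up in the log as events tagged with "query".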

Magento 2 Klarna checkout

I need to integrate the Klarna Checkout module into Magento 2.1.2. I am using version 4.2.2 of the klarna/m2-checkout module.
When choosing a delivery method, I always get an error in the pop-up window:
Sorry, the delivery option you chose cannot be processed. Please select another delivery option.
When I choose a shipping method, I get this response:
{
  "shared": {
    "customer": {
      "type": "person"
    },
    "user_preferences": {
      "remember_me": true
    },
    "language": "en",
    "locale": "en-US",
    "customer_details": {
      "client_token": "eyJhbGciOiJSUzUxMiJ9.eyJz",
      "country": "swe",
      "completed": true,
      "fields_with_obfuscation": {
        "email": "melosicuva@royalhost.info",
        "given_name": "Testperson-se",
        "family_name": "Approved",
        "street_address": "Stårgatan 1",
        "postal_code": "123 45",
        "city": "Ankeborg",
        "country": "SE",
        "phone": "076-526 00 00",
        "date_of_birth": "1941-03-21",
        "national_identification_number": "19410321-9202"
      },
      "reference": "2f9a445a57a49215175178099002fc7165ee"
    },
    "shipping_details": {
      "client_token": "eyJhbGciOiJSUzUxMiJ9.eyJzZXNzaW9uX"
    },
    "currency": "SEK",
    "obfuscated_fields": []
  },
  "cart": {
    "total_tax_amount": 30000,
    "total_price_including_tax": 150000,
    "total_price_excluding_tax": 120000,
    "total_shipping_amount_excluding_tax": 0,
    "total_surcharge_amount_excluding_tax": 0,
    "total_discount_amount_excluding_tax": 0,
    "total_shipping_amount_including_tax": 0,
    "total_surcharge_amount_including_tax": 0,
    "total_discount_amount_including_tax": 0,
    "subtotal": 120000,
    "total_store_credit": 0,
    "items": [{
      "type": "physical",
      "reference": "1201018390010",
      "name": "Armour Bib Shorts",
      "quantity": 1,
      "unit_price": 150000,
      "total_tax_amount": 30000,
      "tax_rate": 2500,
      "total_price_including_tax": 150000,
      "total_price_excluding_tax": 120000,
      "product_url": "https://local.com/armour-bib-shorts-black.html?___store%5B_data%5D%5Bstore_id%5D=2&___store%5B_data%5D%5Bcode%5D=se&___store%5B_data%5D%5Bwebsite_id%5D=2&___store%5B_data%5D%5Bgroup_id%5D=2&___store%5B_data%5D%5Bname%5D=Sweden+Store&___store%5B_data%5D%5Bsort_order%5D=30&___store%5B_data%5D%5Bis_active%5D=1&___store%5B_data%5D%5Balias%5D=Sweden&___store%5B_data%5D%5Bavailable_currency_codes%5D%5B0%5D=SEK",
      "image_url": "https://local.com//media/catalog/product/a/r/armour-bib-shorts-aw18-01.jpg"
    }]
  },
  "errors": {
    "generic": ["shipping_service_failed"]
  },
  "options": {
    "allow_separate_shipping_address": false,
    "date_of_birth_mandatory": false,
    "title_mandatory": false,
    "national_identification_number_mandatory": false,
    "phone_mandatory": true,
    "allowed_customer_types": ["person"],
    "payment_selector_on_load": false
  },
  "preview_payment_methods": [{
    "id": "-1",
    "type": "invoice",
    "locked": false,
    "selected": false,
    "data": {
      "days": 14
    }
  }, {
    "id": "-1",
    "type": "direct_debit",
    "locked": false,
    "selected": false
  }, {
    "id": "-1",
    "type": "credit_card",
    "locked": false,
    "selected": false,
    "data": {
      "available_cards": ["VISA", "MASTER"],
      "allow_saved_card": false,
      "do_save_card": false,
      "collect_consent": false,
      "consent_given": false
    }
  }],
  "allowed_billing_countries": ["swe"],
  "status": {
    "prescreened": false
  },
  "analytics_user_id": "ELmpDn1f600JYxHtagC7FcsOdAXe9-2iwWhIzHSfmhM=",
  "merchant": {
    "hashed_id": "a9c814c7a780d46a7fb2403e452829b3",
    "name": "Your business name"
  },
  "merchant_urls": {
    "checkout": "https://local.com/checkout/klarna",
    "confirmation": "https://checkout-eu.playground.klarna.com/yaco/orders/ffc4101d-00cb-5e63-81fc-0f0c15baeac3/redirect?auth_token=0el7mltb89prfz2fz2mw",
    "terms": "https://local.com/terms",
    "confirmation_page": "https://local.com/checkout/klarna/confirmation/id/ffc4101d-00cb-5e63-81fc-0f0c15baeac3"
  }
}
The block that worries me is:
"errors": {
  "generic": ["shipping_service_failed"]
}
Does anyone know how to fix it?
Delivery error:
This error occurs when you set the address_update callback and it isn't handled correctly. This callback should be set only if you need to update the order's addresses, and it must not take more than 10 seconds to respond.
Here's an example: https://developers.klarna.com/api/#checkout-api-callbacks-address-update
And some best practices: https://developers.klarna.com/documentation/klarna-checkout/best-practices/#address-updated
If you run Klarna Checkout on localhost, then you should make the localhost-based application reachable from Klarna via the HTTP protocol (e.g., for the address_update callback).
You can do it via services like Ngrok.
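For example, a minimal sketch (assuming your local web server listens on port 80):

ngrok http 80

Ngrok prints a public https forwarding URL; use that host in your shop/callback URLs so Klarna's address_update request can reach your machine.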
In case of this error it's good to know that Klarna Checkout calls two callbacks regarding shipping on the checkout page:
address_update
shipping_option_update
If Klarna doesn't receive an answer to the callback request within 10 s, it ends the connection and eventually you will see the error message. You can find the access status in your HTTP server logs (for example, status 499 in nginx). In the Klarna Merchant Portal, on the other hand, you will see logs with status "???".
The callback endpoint may be unreachable, or too slow to respond within 10 s; to rule that out:
if you work on localhost, configure a tunnel (for example with ngrok) to expose your local environment so Klarna can reach it
make sure that the Magento cache is enabled
disable Xdebug (unless it's version >= 3)
check internet connection quality
check php.ini and HTTP server performance-related settings
If the error still occurs, you can debug the callback API to find the bottleneck. For example, you can use the logs in the Klarna Merchant Portal to rebuild a Postman request against the callback API, as in the sketch below.
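A minimal sketch with curl, assuming the callback URL and the JSON payload are copied from the Merchant Portal logs (both are placeholders here), to measure how long the endpoint takes to answer:

# replay the address_update callback and print the total response time in seconds
curl -s -o /dev/null -w '%{time_total}\n' \
  -X POST 'https://your-shop.example/checkout/klarna/api/address_update' \
  -H 'Content-Type: application/json' \
  -d @callback_payload.json

Anything close to (or above) 10 seconds points at the callback handler as the culprit.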

Decoding gzip response body from Packetbeat

I am using Packetbeat to monitor the requests/responses into/out of Elasticsearch client nodes, using the http protocol watcher on port 9200. I am sending the output of Packetbeat through Logstash, and from there out to a different instance of Elasticsearch. We have compression support enabled in the Elasticsearch being monitored, so I occasionally see requests with "Accept-Encoding: gzip, deflate" headers returning responses that are gzipped. Unfortunately, I have not been able to decode any of these gzip responses with any tool at my disposal, including web-based converters, the gzip command-line tool, and Zlib::GzipReader in a Logstash ruby filter script; they all report that the data is not in gzip format.
Does anyone know why I can't seem to decode the gzip content?
Below is the filter I'm using in Logstash to attempt the decompression on the fly as the event passes through (it always reports that http.response.body is not in gzip format).
filter {
  if [type] == "http" {
    # only attempt decompression when the response was actually gzip-encoded
    if [http][response][headers][content-encoding] == "gzip" {
      ruby {
        init => "
          require 'zlib'
          require 'stringio'
        "
        code => "
          # read the captured body and replace it with the gunzipped text
          body = event.get('[http][response][body]').to_s
          sio = StringIO.new(body)
          gz = Zlib::GzipReader.new(sio)
          result = gz.read.to_s
          event.set('[http][response][body]', result)
        "
      }
    }
  }
}
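One sanity check worth doing before blaming the filter: a valid gzip stream starts with the magic bytes 0x1f 0x8b, and in the sample event below the second of those bytes already appears as the Unicode replacement character (�), which suggests the raw bytes were mangled into UTF-8 somewhere before the ruby filter ran. A standalone sketch (the dump file name is hypothetical) to test a captured body:

# check_gzip.rb - verify a captured body still carries the gzip magic header
require 'zlib'
require 'stringio'

body = File.binread('captured_body.bin') # hypothetical dump of http.response.body

if body.bytes[0, 2] == [0x1f, 0x8b]
  # header intact: decompression should succeed
  puts Zlib::GzipReader.new(StringIO.new(body)).read
else
  # header mangled: the capture or transport lost raw bytes
  puts "not a gzip stream, first bytes: #{body.bytes[0, 2].inspect}"
end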
I'm also providing a sample of the logged event here which includes the gzip content in case you would like to try to decompress it yourself:
{
  "_index": "packetbeat-6.2.3-2018.05.19",
  "_type": "doc",
  "_id": "oH0bemMB2mAXfg5euIiP",
  "_score": 1,
  "_source": {
    "server": "",
    "client_server": "",
    "bytes_in": 160,
    "bytes_out": 361,
    "@timestamp": "2018-05-19T20:33:46.470Z",
    "client_port": 55863,
    "path": "/",
    "type": "http",
    "client_proc": "",
    "query": "GET /",
    "port": 9200,
    "host": "gke-main-production-elastic-clients-5728bab3-t1z8",
    "@version": "1",
    "responsetime": 0,
    "fields": {
      "nodePool": "production-elastic-clients"
    },
    "response": "HTTP/1.1 200 OK\r\ncontent-type: application/json; charset=UTF-8\r\ncontent-encoding: gzip\r\ncontent-length: 250\r\n\r\n\u001f�\b\u0000\u0000\u0000\u0000\u0000\u0000\u0000T��n�0\u0014Fw���\u001c\u0010\u0018�����&��vH\u0016d�K������\u0010��\u000b�C\u0018����{��\u0010]\u0001�\u001aap1W\u0012�\u0018\u0017�,y)���oC�\n��A��\u001b�6/��\u001a�\u000e��\"l+�����\u001d\u000f\u0005y/���k�?�\u0005�\u0005���3���Y�_[���Mh�\u0007nzo�T����C�1�\u0011�]����\u0007H�\u0015q��)�&i��u^%iF�k�i6�ތs�c���)�9hh^�0�T2<�<���.J����x���}�:c�\u0011��=���\u001f\u0000\u0000\u0000��\u0003\u0000��.�S\u0001\u0000\u0000",
    "proc": "",
    "request": "GET / HTTP/1.1\r\nUser-Agent: vscode-restclient\r\nhost: es-http-dev.elastic-prod.svc.cluster.local:9200\r\naccept-encoding: gzip, deflate\r\nConnection: keep-alive\r\n\r\n",
    "beat": {
      "name": "gke-main-production-elastic-clients-5728bab3-t1z8",
      "version": "6.2.3",
      "hostname": "gke-main-production-elastic-clients-5728bab3-t1z8"
    },
    "status": "OK",
    "method": "GET",
    "client_ip": "10.24.20.6",
    "http": {
      "response": {
        "phrase": "OK",
        "headers": {
          "content-encoding": "gzip",
          "content-length": 250,
          "content-type": "application/json; charset=UTF-8"
        },
        "body": "\u001f�\b\u0000\u0000\u0000\u0000\u0000\u0000\u0000T��n�0\u0014Fw���\u001c\u0010\u0018�����&��vH\u0016d�K������\u0010��\u000b�C\u0018����{��\u0010]\u0001�\u001aap1W\u0012�\u0018\u0017�,y)���oC�\n��A��\u001b�6/��\u001a�\u000e��\"l+�����\u001d\u000f\u0005y/���k�?�\u0005�\u0005���3���Y�_[���Mh�\u0007nzo�T����C�1�\u0011�]����\u0007H�\u0015q��)�&i��u^%iF�k�i6�ތs�c���)�9hh^�0�T2<�<���.J����x���}�:c�\u0011��=���\u001f\u0000\u0000\u0000��\u0003\u0000��.�S\u0001\u0000\u0000",
        "code": 200
      },
      "request": {
        "params": "",
        "headers": {
          "connection": "keep-alive",
          "user-agent": "vscode-restclient",
          "content-length": 0,
          "host": "es-http-dev.elastic-prod.svc.cluster.local:9200",
          "accept-encoding": "gzip, deflate"
        }
      }
    },
    "tags": [
      "beats",
      "beats_input_raw_event"
    ],
    "ip": "10.24.41.5"
  },
  "fields": {
    "@timestamp": [
      "2018-05-19T20:33:46.470Z"
    ]
  }
}
And this is the response for that message as received at the client, after it was decompressed successfully:
HTTP/1.1 200 OK
content-type: application/json; charset=UTF-8
content-encoding: gzip
content-length: 250
{
  "name": "es-client-7688c8d9b9-qp9l7",
  "cluster_name": "esprod",
  "cluster_uuid": "8iRwLMMSR72F76ZEONYcUg",
  "version": {
    "number": "5.6.3",
    "build_hash": "1a2f265",
    "build_date": "2017-10-06T20:33:39.012Z",
    "build_snapshot": false,
    "lucene_version": "6.6.1"
  },
  "tagline": "You Know, for Search"
}
I had a different situation and was able to resolve my issue; posting it here in case it helps your case.
I was using the Postman tool to test my REST API services locally. My Packetbeat used the following config:
type: http
ports: [80, 8080, 8000, 5000, 8002]
send_all_headers: true
include_body_for: ["application/json", "x-www-form-urlencoded"]
send_request: true
send_response: true
I was getting unreadable output in the body.
I was able to get http.response.body in clear text when I added the following to my Postman request:
Accept-Encoding: application/json

unable to parse my custom log in logstash

Given below is my service log. I want to parse this log in Logstash; please suggest a plugin or method for parsing it.
"msgs": [{
"ts": "2017-07-17T12:22:00.2657422Z",
"tid": 4,
"eid": 1,
"lvl": "Information",
"cat": "Microsoft.AspNetCore.Hosting.Internal.WebHost",
"msg": {
"cnt": "Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 601",
"Protocol": "HTTP/1.1",
"Method": "POST",
"ContentType": "text/xml; charset=utf-8",
"ContentLength": 601,
"Scheme": "http",
"Host": "localhost:20001",
"PathBase": "",
"Path": "/Processor",
"QueryString": ""
}
},
{
"ts": "2017-07-17T12:22:00.4617773Z",
"tid": 4,
"lvl": "Information",
"cat": "NCR.CP.Service.ServiceHostMiddleware",
"msg": {
"cnt": "REQ"
},
"data": {
"Headers": {
"Connection": "Keep-Alive",
"Content-Length": "601",
"Content-Type": "text/xml; charset=utf-8",
"Accept-Encoding": "gzip, deflate",
"Expect": "100-continue",
"Host": "localhost:20001",
"SOAPAction": "\"http://servereps.mtxeps.com/TransactionService/SendTransaction\""
}
}
}]
Please suggest a way to apply a filter to this type of log so that I can extract fields from it and visualize them in Kibana.
I have heard of the grok filter, but what pattern would I have to use here?
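Since the log is already JSON, one option (a sketch, assuming the whole JSON document arrives in the message field) is to skip grok entirely: parse it with the json filter, then split the msgs array so each entry becomes its own event with its ts, lvl, cat, and msg fields:

filter {
  # parse the raw line as JSON (assumes the document is in "message")
  json {
    source => "message"
  }
  # emit one event per element of the msgs array
  split {
    field => "msgs"
  }
}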

MobileFirst create wrong SMS request

IBM MobileFirst Platform Foundation 8.0.0.
After configuring the SMS settings I am trying to send a message, but the request is created incorrectly. See the result below.
//REST API : send notification request
{
  "message": {
    "alert": "Hello World from an SMS message"
  },
  "notificationType": 3,
  "target": {
    "deviceIds": ["9a149c24-8859-3383-6067-d161e46d2554"]
  }
}
The created request:
473607:[2017-01-02 16:44:02.494] - [440093822] Request received: HTTP GET /send.aspx?
encode=false&name=toParamName&value=Recipients&encode=false&name=textParamName&value=MessageText&encode=false&name=MessageType&value=text&encode=false&name=SenderName&value=PLIX&encode=false&name=UserName&value=MahmoudSamy&encode=true&name=Password&value=xyz&to=20100051111&text=Hello+World+from+an+SMS+message+2
//SMS settings
{
  "port": "80",
  "programName": "/sendsms",
  "host": "xyz.com",
  "name": "SMSGateway",
  "parameters": [
    {
      "encode": "false",
      "name": "toParamName",
      "value": "to"
    },
    {
      "encode": "false",
      "name": "textParamName",
      "value": "text"
    },
    {
      "encode": "false",
      "name": "SenderName",
      "value": "Support"
    },
    {
      "encode": "false",
      "name": "UserName",
      "value": "xyz"
    },
    {
      "encode": "false",
      "name": "Password",
      "value": "xyz"
    }
  ]
}
We tried to send an SMS with the SMS settings you shared and were able to get the correct value pairs in the created request. Below is the created request:
GET /gateway/add.php?encode=false&name=toParamName&value=to&encode=false&name=textParamName&value=text&encode=false&name=SenderName&value=Support&encode=false&name=UserName&value=xyz&encode=false&name=Password&value=xyz&to=99&text=Hello+World+from+an+SMS+message HTTP/1.1
Also, in the created request you shared, I notice a username value different from the one given in the SMS settings.
Could you please tell us how you are capturing the request? We are using Wireshark.
The configuration below works for me, but it forces me to accept the to and text parameters.
{
  "port": "80",
  "programName": "/sendsms",
  "host": "xyz.com",
  "name": "SMSGateway",
  "parameters": [{
    "SenderName": "Support",
    "MessageType": "text",
    "UserName": "xyz",
    "Password": "xyz"
  }]
}
HTTP GET /send.aspx?SenderName=Support&MessageType=text&UserName=xyz&Password=xyz&to=083127312763&to=hello+world

LogicApp CRM connector create record gives a 500 error

I've built a fairly simple LogicApp with the most recent version, which came out about a week ago. It runs every hour and tries to create a record in CRM Online. In the same manner I've created a LogicApp that retrieves records, and that one works.
The failed input and output look like this:
{
  "host": {
    "api": {
      "runtimeUrl": "https://logic-apis-westeurope.azure-apim.net/apim/dynamicscrmonline"
    },
    "connection": {
      "name": "subscriptions/6e779c81-1d0c-4df9-92b8-b287ba919b51/resourceGroups/spdev-eiramar/providers/Microsoft.Web/connections/EE2AF043-71F0-4780-A8D1-25438A7746C0"
    }
  },
  "method": "post",
  "path": "/datasets/xxxx.crm4/tables/accounts/items",
  "body": {
    "name": "Test 1234"
  }
}
Output:
{
  "statusCode": 500,
  "headers": {
    "pragma": "no-cache",
    "x-ms-request-id": "d121f98c-3dd5-4e6d-899b-5150b17795a3",
    "cache-Control": "no-cache",
    "date": "Tue, 01 Mar 2016 12:27:09 GMT",
    "set-Cookie": "ARRAffinity=xxxxx082f2bca2639f8a68e283db0eba612ddd71036cf5a5cf2597f99;Path=/;Domain=127.0.0.1",
    "server": "Microsoft-HTTPAPI/2.0",
    "x-AspNet-Version": "4.0.30319",
    "x-Powered-By": "ASP.NET"
  },
  "body": {
    "status": 500,
    "message": "Unknown error.",
    "source": "127.0.0.1"
  }
}
Does anybody know how to solve this issue?
Do you have security roles set up to access the CRM org, i.e. can you access the URL from a browser? https://xxxx.crm4.dynamics.com
Update: Fix for this issue has been rolled out. Please retry your scenario.
