Given below is my service log. I want to parse this log in Logstash; please suggest a plugin or method for parsing it.
"msgs": [{
"ts": "2017-07-17T12:22:00.2657422Z",
"tid": 4,
"eid": 1,
"lvl": "Information",
"cat": "Microsoft.AspNetCore.Hosting.Internal.WebHost",
"msg": {
"cnt": "Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 601",
"Protocol": "HTTP/1.1",
"Method": "POST",
"ContentType": "text/xml; charset=utf-8",
"ContentLength": 601,
"Scheme": "http",
"Host": "localhost:20001",
"PathBase": "",
"Path": "/Processor",
"QueryString": ""
}
},
{
"ts": "2017-07-17T12:22:00.4617773Z",
"tid": 4,
"lvl": "Information",
"cat": "NCR.CP.Service.ServiceHostMiddleware",
"msg": {
"cnt": "REQ"
},
"data": {
"Headers": {
"Connection": "Keep-Alive",
"Content-Length": "601",
"Content-Type": "text/xml; charset=utf-8",
"Accept-Encoding": "gzip, deflate",
"Expect": "100-continue",
"Host": "localhost:20001",
"SOAPAction": "\"http://servereps.mtxeps.com/TransactionService/SendTransaction\""
}
}
}]
}
Please suggest a way to apply a filter to this type of log so that I can extract fields from it and visualize them in Kibana.
I have heard of the grok filter, but what pattern would I have to use here?
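Since the log is already structured JSON, the json filter (or the json codec on the input) is a better fit than grok here. Below is a minimal sketch, assuming each event is written as a single line of JSON (if the file is pretty-printed as in the sample, you would need a multiline codec first); the file path, the split on the msgs array, and the Elasticsearch address are assumptions to adapt to your setup:
input {
  file {
    path => "/var/log/myservice/service.log"  # assumed location of the log file
    codec => "json"                           # parse each line as a JSON document
  }
}
filter {
  # The sample wraps the individual messages in a "msgs" array;
  # split emits one event per array element.
  split {
    field => "msgs"
  }
  # Use the event's own timestamp instead of the ingest time.
  date {
    match => ["[msgs][ts]", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]               # assumed Elasticsearch address
  }
}
With this in place, fields such as [msgs][lvl], [msgs][cat], and [msgs][msg][Method] become searchable fields you can visualize in Kibana.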
I'm trying to log all queries from Kibana, so I edited config/kibana.yml and added the following lines:
logging.dest: /tmp/test.log
logging.silent: false
logging.quiet: false
logging.verbose: true
elasticsearch.logQueries: true
Then I restarted Kibana and queried for something.
Logs now start to appear, but only access logs are recorded; there are no ES queries.
{
"type": "response",
"#timestamp": "2018-08-21T02:41:03Z",
"tags": [],
"pid": 28701,
"method": "post",
"statusCode": 200,
"req": {
"url": "/elasticsearch/_msearch",
"method": "post",
"headers": {
...
},
"remoteAddress": "xxxxx",
"userAgent": "xxxxx",
"referer": "http://xxxxxxx:8901/app/kibana"
},
"res": {
"statusCode": 200,
"responseTime": 62,
"contentLength": 9
},
"message": "POST /elasticsearch/_msearch 200 62ms - 9.0B"
}
Any ideas? I'm using ELK 6.2.2.
The elasticsearch.logQueries setting was introduced in Kibana 6.3, as can be seen in this pull request.
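Once you are on Kibana 6.3 or later, the settings from the question should be enough. For reference, a minimal kibana.yml (values taken from the question):
# kibana.yml (requires Kibana >= 6.3)
logging.dest: /tmp/test.log
logging.verbose: true
elasticsearch.logQueries: true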
I am using Packetbeat to monitor the requests/responses into and out of Elasticsearch client nodes, using the http protocol watcher on port 9200. I am sending the output of Packetbeat through Logstash, and from there out to a different instance of Elasticsearch. We have compression support enabled in the Elasticsearch cluster being monitored, so I occasionally see requests with "Accept-Encoding: gzip, deflate" headers returning responses that are gzipped. Unfortunately, I have not been able to decode any of these gzip responses with any tool at my disposal (including web-based converters, the gzip command-line tool, and Zlib::GzipReader in a Logstash ruby filter script). They all report that the data is not in gzip format.
Does anyone know why I can't seem to decode the gzip content?
I have provided a sample of the filter I'm using in Logstash to try to do this on the fly as the event passes through Logstash (it always reports that http.response.body is not in gzip format).
filter {
  if [type] == "http" {
    if [http][response][headers][content-encoding] == "gzip" {
      ruby {
        init => "
          require 'zlib'
          require 'stringio'
        "
        code => "
          body = event.get('[http][response][body]').to_s
          begin
            # GzipReader needs the raw compressed bytes; force a binary
            # encoding so the string is not reinterpreted as UTF-8.
            gz = Zlib::GzipReader.new(StringIO.new(body.force_encoding('BINARY')))
            event.set('[http][response][body]', gz.read.to_s)
          rescue Zlib::GzipFile::Error
            # Tag the event instead of crashing the pipeline when the
            # payload is not valid gzip.
            event.tag('gzip_decode_failure')
          end
        "
      }
    }
  }
}
I'm also providing a sample of the logged event here, which includes the gzip content, in case you would like to try to decompress it yourself:
{
"_index": "packetbeat-6.2.3-2018.05.19",
"_type": "doc",
"_id": "oH0bemMB2mAXfg5euIiP",
"_score": 1,
"_source": {
"server": "",
"client_server": "",
"bytes_in": 160,
"bytes_out": 361,
"#timestamp": "2018-05-19T20:33:46.470Z",
"client_port": 55863,
"path": "/",
"type": "http",
"client_proc": "",
"query": "GET /",
"port": 9200,
"host": "gke-main-production-elastic-clients-5728bab3-t1z8",
"#version": "1",
"responsetime": 0,
"fields": {
"nodePool": "production-elastic-clients"
},
"response": "HTTP/1.1 200 OK\r\ncontent-type: application/json; charset=UTF-8\r\ncontent-encoding: gzip\r\ncontent-length: 250\r\n\r\n\u001f�\b\u0000\u0000\u0000\u0000\u0000\u0000\u0000T��n�0\u0014Fw���\u001c\u0010\u0018�����&��vH\u0016d�K������\u0010��\u000b�C\u0018����{��\u0010]\u0001�\u001aap1W\u0012�\u0018\u0017�,y)���oC�\n��A��\u001b�6/��\u001a�\u000e��\"l+�����\u001d\u000f\u0005y/���k�?�\u0005�\u0005���3���Y�_[���Mh�\u0007nzo�T����C�1�\u0011�]����\u0007H�\u0015q��)�&i��u^%iF�k�i6�ތs�c���)�9hh^�0�T2<�<���.J����x���}�:c�\u0011��=���\u001f\u0000\u0000\u0000��\u0003\u0000��.�S\u0001\u0000\u0000",
"proc": "",
"request": "GET / HTTP/1.1\r\nUser-Agent: vscode-restclient\r\nhost: es-http-dev.elastic-prod.svc.cluster.local:9200\r\naccept-encoding: gzip, deflate\r\nConnection: keep-alive\r\n\r\n",
"beat": {
"name": "gke-main-production-elastic-clients-5728bab3-t1z8",
"version": "6.2.3",
"hostname": "gke-main-production-elastic-clients-5728bab3-t1z8"
},
"status": "OK",
"method": "GET",
"client_ip": "10.24.20.6",
"http": {
"response": {
"phrase": "OK",
"headers": {
"content-encoding": "gzip",
"content-length": 250,
"content-type": "application/json; charset=UTF-8"
},
"body": "\u001f�\b\u0000\u0000\u0000\u0000\u0000\u0000\u0000T��n�0\u0014Fw���\u001c\u0010\u0018�����&��vH\u0016d�K������\u0010��\u000b�C\u0018����{��\u0010]\u0001�\u001aap1W\u0012�\u0018\u0017�,y)���oC�\n��A��\u001b�6/��\u001a�\u000e��\"l+�����\u001d\u000f\u0005y/���k�?�\u0005�\u0005���3���Y�_[���Mh�\u0007nzo�T����C�1�\u0011�]����\u0007H�\u0015q��)�&i��u^%iF�k�i6�ތs�c���)�9hh^�0�T2<�<���.J����x���}�:c�\u0011��=���\u001f\u0000\u0000\u0000��\u0003\u0000��.�S\u0001\u0000\u0000",
"code": 200
},
"request": {
"params": "",
"headers": {
"connection": "keep-alive",
"user-agent": "vscode-restclient",
"content-length": 0,
"host": "es-http-dev.elastic-prod.svc.cluster.local:9200",
"accept-encoding": "gzip, deflate"
}
}
},
"tags": [
"beats",
"beats_input_raw_event"
],
"ip": "10.24.41.5"
},
"fields": {
"#timestamp": [
"2018-05-19T20:33:46.470Z"
]
}
}
And this is the response for that request as the client receives it, after the client has decompressed it successfully:
HTTP/1.1 200 OK
content-type: application/json; charset=UTF-8
content-encoding: gzip
content-length: 250
{
"name": "es-client-7688c8d9b9-qp9l7",
"cluster_name": "esprod",
"cluster_uuid": "8iRwLMMSR72F76ZEONYcUg",
"version": {
"number": "5.6.3",
"build_hash": "1a2f265",
"build_date": "2017-10-06T20:33:39.012Z",
"build_snapshot": false,
"lucene_version": "6.6.1"
},
"tagline": "You Know, for Search"
}
I had a different situation and was able to resolve my issue. I'm posting it here in case it helps your case.
I was using the Postman tool to test my REST API services locally. My Packetbeat used the following config:
type: http
ports: [80, 8080, 8000, 5000, 8002]
send_all_headers: true
include_body_for: ["application/json", "x-www-form-urlencoded"]
send_request: true
send_response: true
I was getting the following output in the body.
I was able to get http.response.body in clear text when I added the following header to my Postman request:
Accept-Encoding: application/json
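Any Accept-Encoding value that omits gzip has the same effect: the server answers uncompressed, so Packetbeat captures the body in clear text. If you control the monitored cluster, another option (an assumption on my part, not something tested in this thread) is to disable HTTP compression on the Elasticsearch side in elasticsearch.yml:
# elasticsearch.yml on the monitored client nodes
http.compression: false    # respond uncompressed even when clients offer gzip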
I am trying to access the service endpoint set up in my extension code.
The extension is as follows:
{
"manifestVersion": 1,
"id": "vsts-extensions-myExtensions",
"version": "0.5.1",
"name": "xxx Projects Time Entry",
"description": "Record time spent in xxx Projects",
"publisher": "xxx",
"targets": [
{
"id": "Microsoft.VisualStudio.Services"
}
],
"icons": {
"default": "img/logo.png"
},
"contributions":
[
{
"id": "xxTimeEntry",
"type": "ms.vss-dashboards-web.widget",
...
},
{
"id": "service-endpoint",
"description": "Service Endpoint type for xx connections",
"type": "ms.vss-endpoint.service-endpoint-type",
"targets": [ "ms.vss-endpoint.endpoint-types" ],
"properties": {
"name": "xxxyyy",
"displayName": "xx server connection",
"url": {
"displayName": "Server Url",
"helpText": "Url for the xxx server to connect to."
},
"dataSources": [
{
"name": "xxx Projects",
"endpointUrl": "{{endpoint.url}}api/timesheetwidgetprojects",
"resultSelector": "jsonpath:$[*].nm"
}
],
"authenticationSchemes": [
{
"type": "ms.vss-endpoint.endpoint-auth-scheme-basic",
"inputDescriptors": [
{
"id": "username",
"name": "Username",
"description": "Username",
"inputMode": "textbox",
"validation": {
"isRequired": false,
"dataType": "string"
}
},
{
"id": "password",
"name": "Password",
"description": "Password",
"inputMode": "passwordbox",
"isConfidential": true,
"validation": {
"isRequired": false,
"dataType": "string"
}
}
]
}
]
}
}
],
...
The code to access the service endpoint is something like:
VSS.require(["VSS/Service", "TFS/DistributedTask/TaskAgentRestClient"],
    function (VSS_Service, TaskAgentRestClient) {
        var webContext = VSS.getWebContext();
        // getServiceEndpoints lives on the task agent HTTP client,
        // so that module has to be required above.
        var client = VSS_Service.getCollectionClient(TaskAgentRestClient.TaskAgentHttpClient);
        client.getServiceEndpoints(webContext.project.id).then(
            function (endpoints) {
                alert('endpoints');
            }
        );
    }
);
However, I am not using a task and just have my endpoint in the main vss-extension.json.
Any ideas?
Thanks
Martin
Based on the supported scopes, there isn't a scope for service endpoints, so you can't do it.
I submitted a user voice for this: VSTS extension service endpoint scope; you can vote for it and follow up.
The workaround is to call the REST API from JS code in your extension, using a Personal Access Token.
Simple code to call the REST API:
$.ajax({
    url: 'https://fabrikam.visualstudio.com/defaultcollection/_apis/projects?api-version=1.0',
    dataType: 'json',
    headers: {
        // Basic auth with an empty username and the PAT as the password
        'Authorization': 'Basic ' + btoa("" + ":" + myPatToken)
    }
}).done(function( results ) {
    console.log( results.value[0].id + " " + results.value[0].name );
});
The scope has now been added; it is "vso.serviceendpoint".
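With that scope available, you would request it in the extension manifest. A sketch based on the manifest shown above (only the scopes entry is new):
{
  "manifestVersion": 1,
  "id": "vsts-extensions-myExtensions",
  "version": "0.5.1",
  "scopes": [
    "vso.serviceendpoint"
  ],
  ...
}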
A quick question re: a 401 error I'm running into when executing reports.getFile in the DoubleClick Search API Explorer (full error message below).
I am currently logged into the Google account that is used for DoubleClick Search. I've tried logging in on a different browser as well.
I've gone through the reports.request and reports.get executions without issue; I only get this error at this step. isReportReady is also true before I execute reports.getFile. The full list of requests/responses is below this initial message.
My understanding is that with the API Explorer I should just be able to turn on OAuth 2.0 and execute, but correct me if I'm wrong!
"domain": "global",
"reason": "required",
"message": "Login Required",
"locationType": "header",
"location": "Authorization"
"code": 401,
"message": "Login Required"
If any additional details are needed, definitely let me know. The only questions I've seen come up on here seem to be based more around manual coding or other API Explorer services.
Thanks in advance for any insight!
reports.Request (Response)
POST https://www.googleapis.com/doubleclicksearch/v2/reports?fields=isReportReady%2Crequest%2Fcolumns%2FcolumnName&key={YOUR_API_KEY}
{
"downloadFormat": "csv",
"maxRowsPerFile": 6000000,
"reportType": "conversion",
"statisticsCurrency": "agency",
"reportScope": {
"agencyId": "xxxxxxxxxxxxxxxxxxxx",
"advertiserId": "xxxxxxxxxxxxxxxxxxx"
},
"timeRange": {
"startDate": "2014-01-1",
"endDate": "2016-11-3"
},
"columns": [
{
"columnName": "conversionVisitExternalClickId"
},
{
"columnName": "conversionTimestamp"
},
{
"columnName": "keywordid"
},
{
"columnName": "adGroup"
},
{
"columnName": "campaign"
},
{
"columnName": "account"
},
{
"columnName": "floodlightActivity"
},
{
"columnName": "conversionID"
}
]
}
reports.Get (Response)
"kind": "doubleclicksearch#report",
"id": "xxxxxxxxxxxxx",
"isReportReady": true,
"request": {
"reportScope": {
"agencyId": "xxxxxxxxxxxxxxx",
"advertiserId": "xxxxxxxxxxxxx"
},
"reportType": "conversion",
"columns": [
{
"columnName": "conversionVisitExternalClickId"
},
{
"columnName": "conversionTimestamp"
},
{
"columnName": "keywordId"
},
{
"columnName": "adGroup"
},
{
"columnName": "campaign"
},
{
"columnName": "account"
},
{
"columnName": "floodlightActivity"
},
{
"columnName": "conversionId"
}
],
"timeRange": {
"startDate": "2014-01-1",
"endDate": "2016-11-3"
},
"includeRemovedEntities": false,
"downloadFormat": "csv",
"statisticsCurrency": "agency",
"maxRowsPerFile": 6000000
},
"statisticsCurrencyCode": "USD",
"statisticsTimeZone": "America/New_York",
"rowCount": 5225201,
"files": [
{
"url": "https://www.googleapis.com/doubleclicksearch/v2/reports/AAAncMCcKrSJ9MS_/files/0",
"byteCount": "1271002762"
}
]
}
reports.getFile (Response)
GET https://www.googleapis.com/doubleclicksearch/v2/reports/AAAncMCcKrSJ9MS_/files/0?key={YOUR_API_KEY}
200
cache-control: private, max-age=0, must-revalidate, no-transform
content-length: 1694670352
Content-Type: text/csv
date: Tue, 21 Mar 2017 14:24:03 GMT
etag: W/"u6ybyji1w_vhYZMTUlOUhEWemRI/nWLUgbdAKD9EU6srfscDcZyGxz8"
expires: Tue, 21 Mar 2017 14:24:03 GMT
server: UploadServer
vary: Origin, X-Origin
x-guploader-uploadid: AEnB2UoZzz4AafhGUKg-1edKZWLCmiML_a6HnP7lr11aD7tXZAnEIgyEno9ZhHOCbJ02XTlAkgtYcD4AOGbhmar6dpSx-uzh-n_6SCQH-iShQma-k6GLsMk
I've built a fairly simple Logic App with the most recent version that came out about a week ago. It runs every hour and tries to create a record in CRM Online. In the same manner I've created a Logic App that retrieves records, and that one works.
The failed input and output look like this:
{
"host": {
"api": {
"runtimeUrl": "https://logic-apis-westeurope.azure-apim.net/apim/dynamicscrmonline"
},
"connection": {
"name": "subscriptions/6e779c81-1d0c-4df9-92b8-b287ba919b51/resourceGroups/spdev-eiramar/providers/Microsoft.Web/connections/EE2AF043-71F0-4780-A8D1-25438A7746C0"
}
},
"method": "post",
"path": "/datasets/xxxx.crm4/tables/accounts/items",
"body": {
"name": "Test 1234"
}
}
Output:
{
"statusCode": 500,
"headers": {
"pragma": "no-cache",
"x-ms-request-id": "d121f98c-3dd5-4e6d-899b-5150b17795a3",
"cache-Control": "no-cache",
"date": "Tue, 01 Mar 2016 12:27:09 GMT",
"set-Cookie": "ARRAffinity=xxxxx082f2bca2639f8a68e283db0eba612ddd71036cf5a5cf2597f99;Path=/;Domain=127.0.0.1",
"server": "Microsoft-HTTPAPI/2.0",
"x-AspNet-Version": "4.0.30319",
"x-Powered-By": "ASP.NET"
},
"body": {
"status": 500,
"message": "Unknown error.",
"source": "127.0.0.1"
}
}
Does anybody know how to solve this issue?
Do you have security roles set up to access the CRM org, i.e. can you access the URL from a browser? https://xxxx.crm4.dynamics.com
Update: A fix for this issue has been rolled out. Please retry your scenario.