Logstash: Split JSON object - Ruby

I have a JSON object which results from an XML input; it looks like this:
{
"#version" => "1",
"#timestamp" => "2016-04-11T15:35:07.372Z",
"host" => "YUSUF-PC",
"command" => "nana",
"doc" => {
"TotalResults" => "1892",
"Audit" => [
[0] {
"Id" => "2260167",
"Action" => "UPDATE",
"ParentId" => "30612",
"ParentType" => "defect",
"Time" => "2016-01-04 08:27:59",
"User" => "nana",
"Properties" => {
"Property" => [
[0] {
"Label" => "Statut",
"Name" => "status",
"NewValue" => [
[0] "En cours"
]
},
[1] {
"Label" => "Affecté à",
"Name" => "owner",
"NewValue" => [
[0] "nana"
]
},
[2] {
"Label" => "Priorité",
"Name" => "severity",
"NewValue" => [
[0] "nana"
]
}
]
}
},
[1] {
"Id" => "2260168",
"Action" => "UPDATE",
"ParentId" => "30612",
"ParentType" => "defect",
"Time" => "2016-01-04 09:45:33",
"User" => "nana",
"Properties" => {
"Property" => [
[0] {
"Label" => "Affecté à",
"Name" => "owner",
"NewValue" => [
[0] "nana"
],
"OldValue" => [
[0] "nana"
]
}
]
}
}
]
} }
I need to split this JSON by property, i.e. to get one document per property. The problem is not the split operation itself: when I insert the result into Elasticsearch, the "NewValue" field is not taken into account because it is a single-element array. So I need to write a ruby filter to alter each value to value[0]. Can anyone help? I'm not good at Ruby.
I want to get JSON like this:
{
"#version" => "1",
"#timestamp" => "2016-04-11T15:35:07.372Z",
"host" => "YUSUF-PC",
"command" => "nana",
"doc" => {
"TotalResults" => "1892",
"Audit" => [
[0] {
"Id" => "2260167",
"Action" => "UPDATE",
"ParentId" => "30612",
"ParentType" => "defect",
"Time" => "2016-01-04 08:27:59",
"User" => "nana",
"Properties" => {
"Property" =>
{
"Label" => "Statut",
"Name" => "status",
"NewValue" => "En cours"
}
}
}
]
}
}
Thank you

I hope this helps.
old = {
"#version" => "1",
"#timestamp" => "2016-04-11T15:35:07.372Z",
"host" => "YUSUF-PC",
"command" => "nana",
"doc" => {
"TotalResults" => "1892",
"Audit" => [
{
"Id" => "2260167",
"Action" => "UPDATE",
"ParentId" => "30612",
"ParentType" => "defect",
"Time" => "2016-01-04 08:27:59",
"User" => "nana",
"Properties" => {
"Property" => [
{
"Label" => "Statut",
"Name" => "status",
"NewValue" => [
"En cours"
]
},
{
"Label" => "Affecté à",
"Name" => "owner",
"NewValue" => [
"nana"
]
},
{
"Label" => "Priorité",
"Name" => "severity",
"NewValue" => [
"nana"
]
}
]
}
},
{
"Id" => "2260168",
"Action" => "UPDATE",
"ParentId" => "30612",
"ParentType" => "defect",
"Time" => "2016-01-04 09:45:33",
"User" => "nana",
"Properties" => {
"Property" => [
{
"Label" => "Affecté à",
"Name" => "owner",
"NewValue" => [
"nana"
],
"OldValue" => [
"nana"
]
}
]
}
}
]
} }
# This is the line actually doing the work: it replaces each single-element
# "NewValue" array with its first element, in place.
old["doc"]["Audit"].each { |audit| audit["Properties"]["Property"].each { |prop| prop["NewValue"] = prop["NewValue"].first } }
old
=> {"#version"=>"1", "#timestamp"=>"2016-04-11T15:35:07.372Z", "host"=>"YUSUF-PC", "command"=>"nana", "doc"=>{"TotalResults"=>"1892", "Audit"=>[{"Id"=>"2260167", "Action"=>"UPDATE", "ParentId"=>"30612", "ParentType"=>"defect", "Time"=>"2016-01-04 08:27:59", "User"=>"nana", "Properties"=>{"Property"=>[{"Label"=>"Statut", "Name"=>"status", "NewValue"=>"En cours"}, {"Label"=>"Affecté à", "Name"=>"owner", "NewValue"=>"nana"}, {"Label"=>"Priorité", "Name"=>"severity", "NewValue"=>"nana"}]}}, {"Id"=>"2260168", "Action"=>"UPDATE", "ParentId"=>"30612", "ParentType"=>"defect", "Time"=>"2016-01-04 09:45:33", "User"=>"nana", "Properties"=>{"Property"=>[{"Label"=>"Affecté à", "Name"=>"owner", "NewValue"=>"nana", "OldValue"=>["nana"]}]}}]}}

Related

Magento 2 REST API doesn't update salable quantity

I use the REST API to create a simple product in my Magento store. Everything works correctly except the quantity: it is updated correctly in the "Quantity" column but not in the salable quantity.
REST call: "www.mysite.com/V1/products"
Array data:
$data = [
"product" => [
"sku" => $sku,
"name" => $product_title,
"attribute_set_id" => 4,
"price" => $price,
"status" => 1,
"visibility" => 4,
"type_id" => "simple",
"weight" => "1",
"extension_attributes" => [
"category_links" => [
[
"position" => 0,
"category_id" => "53"
]
],
"stock_item" => [
"qty" => $qty,
"is_in_stock" => true
]
],
"custom_attributes" => [
[
"attribute_code" => "special_price",
"value" => $salable_price
],
[
"attribute_code" => "special_from_date",
"value" => "2021-02-07 00:00:00"
],
[
"attribute_code" => "special_to_date",
"value" => "2091-02-07 00:00:00"
],
[
"attribute_code" => "cost",
"value" => $salable_price
],
[
"attribute_code" => "description",
"value" => $description
],
[
"attribute_code" => "short_description",
"value" => $short_description
]
]
]
];
As you can see, qty is updated correctly in the qty column but not in the salable quantity. What is my mistake?
Please try this:
"stockItem": {
"is_in_stock": 1,
"qty": 10,
"manage_stock": true,
"use_config_manage_stock": 1
}
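Mapped onto the question's PHP payload, that stock data would sit under extension_attributes (a sketch using the snake_case stock_item key from the question's own array; manage_stock mirrors the flag suggested above):
$data["product"]["extension_attributes"]["stock_item"] = [
    "qty" => $qty,
    "is_in_stock" => true,
    "manage_stock" => true,
    "use_config_manage_stock" => 1
];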

I tried to import a 3GB system log file through ELK but am getting errors

The files started importing like this through Logstash. Kindly suggest how I can remove the errors; I use only a kv filter in my conf file.
{
"authserver" => "a_India RADIUS",
"proto" => "6",
"devname" => "FW_1",
"10:56:12\tdate" => "2020-06-22\tlocal7\tnotice\t\ttime=10:56:11",
"host" => "kali",
"dstintf" => "wan1",
"path" => "/root/Cybrotech-/log00",
"subtype" => "webfilter",
"srcintf" => "ssl.root",
"method" => "domain",
"eventtype" => "ftgd_allow",
"hostname" => "webmail.accessarellc.net",
"cat" => "33",
"srcintfrole" => "undefined",
"dstip" => "20.73.98.154",
"type" => "utm",
"sessionid" => "677535",
"dstintfrole" => "wan",
"srcport" => "6095",
"url" => "/",
"profile" => "monitor-all",
"srcip" => "10.212.134.190",
"logid" => "07013312",
"policyid" => "17",
"eventtime" => "12803571",
"direction" => "outgoing",
"level" => "notice",
"#version" => "1",
"reqtype" => "direct",
"catdesc" => ""Health",
"action" => "passthrough",
"vd" => "root",
"dstport" => "443",
"service" => "HTTPS",
"#timestamp" => 2020-07-13T09:10:47.811Z,
"sentbyte" => "192",
"devid" => "FG0TK19907000",
"group" => "SSLVPN_Group",
"msg" => "URL belongs to an allowed category in policy",
"user" => "\ASINGH",
"rcvdbyte" => "0"
}
After some time I got this error on screen:
"_type"=>"doc", "_id"=>"LzhxR3MBoH6QvDEw21Sy", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [log00_210270] has been exceeded"}}}}

Can't send @metadata to elasticsearch

I want to include the @metadata field contents in my Elasticsearch output.
This is the output when I use stdout in my output section:
{
"#timestamp" => 2018-03-08T08:17:42.059Z,
"thread_name" => "SimpleAsyncTaskExecutor-2",
"#metadata" => {
"dead_letter_queue" => {
"entry_time" => 2018-03-08T08:17:50.082Z,
"reason" => "Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"applog-2018.03.08", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x3ab79ab5], response: {"index"=>{"_index"=>"applog-2018.03.08", "_type"=>"doc", "_id"=>"POuwBGIB0PJDPQOoDy1Q", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [message]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:223"}}}}",
"plugin_type" => "elasticsearch",
"plugin_id" => "7ee60ceccc2ef7c933cf5aa718d42f24a65b489e12a1e1c7b67ce82e04ef0d37"
}
},
"#version" => "1",
"beat" => {
"name" => "filebeat-kwjn6",
"version" => "6.0.0"
},
"dateOffset" => 408697,
"source" => "/var/log/applogs/spring-cloud-dataflow/Log.log",
"logger_name" => "decurtis.dxp.deamon.JobConfiguration",
"message" => {
"timeStamp" => "2018-01-30",
"severity" => "ERROR",
"hostname" => "",
"commonUtility" => {},
"offset" => "Etc/UTC",
"messageCode" => "L_9001",
"correlationId" => "ea5b13c3-d395-4fa5-8124-19902e400316",
"componentName" => "dxp-deamon-refdata-country",
"componentVersion" => "1",
"message" => "Unhandled exceptions",
},
"tags" => [
[0] "webapp-log",
[1] "beats_input_codec_plain_applied",
[2] "_jsonparsefailure"
]
}
I want the @metadata field in my Elasticsearch output.
Below is my conf file:
input {
dead_letter_queue {
path => "/usr/share/logstash/data/dead_letter_queue"
commit_offsets => true
pipeline_id => "main"
}
}
filter {
json {
source => "message"
}
mutate {
rename => { "[#metadata][dead_letter_queue][reason]" => "reason" }
}
}
output {
elasticsearch {
hosts => "elasticsearch"
manage_template => false
index => "deadletterlog-%{+YYYY.MM.dd}"
}
}
Now in my output there is a field called "reason" but without any content. Is there something I am missing?
This can help:
mutate {
add_field => {
"reason" => "%{[#metadata][dead_letter_queue][reason]}"
"plugin_id" => "%{[#metadata][dead_letter_queue][plugin_id]}"
"plugin_type" => "%{[#metadata][dead_letter_queue][plugin_type]}"
}
}
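Note that anything still under @metadata is dropped when the event reaches an output, so copying the values into regular fields with add_field, as above, is what makes them reach Elasticsearch.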

How to solve a date parsing error in Logstash?

I have the following logstash configuration:
input {
file{
path => ["C:/Users/MISHAL/Desktop/ELK_Files/rm/evsb.json"]
type => "json"
start_position => "beginning"
}
}
filter {
json {
source => "message"
}
mutate {
convert => [ "increasedFare", "float"]
convert => ["enq", "float"]
convert => ["bkd", "float"]
}
date{
match => [ "date" , "YYYY-MM-dd HH:mm:ss" ]
target => "#timestamp"
}
}
output {
stdout {
codec => rubydebug
}
elasticsearch {
hosts => "localhost"
index => "zsx"
}
}
And this is the JSON data, jt.json:
[{"id":1,"date":"2015-11-11 23:00:00","enq":"105","bkd":"9","increasedFare":"0"}, {"id":2,"date":"2015-11-15 23:00:00","eng":"55","bkd":"2","increasedFare":"0"}, {"id":3,"date":"2015-11-20 23:00:00","enq":"105","bkd":"9","increasedFare":"0"}, {"id":4,"date":"2015-11-25 23:00:00","eng":"55","bkd":"2","increasedFare":"0"}]
I tried running this in Logstash; however, I am not able to parse the date or get it into @timestamp.
The following is the warning message I'm getting:
Failed parsing date from field {:field=>"[date]", :value=>"%{[date]}", :exception=>"Invalid format: \"%{[date]}\"", :config_parsers=>"YYYY-MM-dd HH:mm:ss", :config_locale=>"default=en_IN", :level=>:warn}
The following is the stdout:
Logstash startup completed
{
"message" => "{\"id\":2,\"date\":\"2015-09-15 23:00:00\",\"enq\":\"34\",\"bkd\":\"2\",\"increasedFare\":\"0\"}\r",
"#version" => "1",
"#timestamp" => "2015-09-15T17:30:00.000Z",
"host" => "TCHWNG",
"path" => "C:/Users/MISHAL/Desktop/ELK_Files/jsonTest/jt.json",
"type" => "json",
"id" => 2,
"date" => "2015-09-15 23:00:00",
"enq" => 34.0,
"bkd" => 2.0,
"increasedFare" => 0.0
}
{
"message" => "{\"id\":3,\"date\":\"2015-09-20 23:00:00\",\"enq\":\"22\",\"bkd\":\"9\",\"increasedFare\":\"0\"}\r",
"#version" => "1",
"#timestamp" => "2015-09-20T17:30:00.000Z",
"host" => "TCHWNG",
"path" => "C:/Users/MISHAL/Desktop/ELK_Files/jsonTest/jt.json",
"type" => "json",
"id" => 3,
"date" => "2015-09-20 23:00:00",
"enq" => 22.0,
"bkd" => 9.0,
"increasedFare" => 0.0
}
{
"message" => "{\"id\":4,\"date\":\"2015-09-25 23:00:00\",\"enq\":\"66\",\"bkd\":\"2\",\"increasedFare\":\"0\"}\r",
"#version" => "1",
"#timestamp" => "2015-09-25T17:30:00.000Z",
"host" => "TCHWNG",
"path" => "C:/Users/MISHAL/Desktop/ELK_Files/jsonTest/jt.json",
"type" => "json",
"id" => 4,
"date" => "2015-09-25 23:00:00",
"enq" => 66.0,
"bkd" => 2.0,
"increasedFare" => 0.0
}
I've been trying to solve this for two days and have tried various things, but I can't figure it out. Please tell me what I'm doing wrong here.
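One plausible cause, offered as an assumption rather than a confirmed diagnosis: the file input emits some lines the json filter cannot parse (the array wrapper, for instance), so those events have no [date] field, and the date filter then receives the literal sprintf reference %{[date]} shown in the warning. Guarding the date filter so it only runs when the field exists avoids the warning:
filter {
  json {
    source => "message"
  }
  if [date] {
    date {
      match => [ "date", "YYYY-MM-dd HH:mm:ss" ]
      target => "@timestamp"
    }
  }
}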

How can I update the ids field with this RethinkDB document structure?

I'm having trouble updating the ids field in this document structure:
[
[0] {
"rank" => nil,
"profile_id" => 3,
"daily_providers" => [
[0] {
"relationships" => [
[0] {
"relationship_type" => "friend",
"count" => 0
},
[1] {
"relationship_type" => "acquaintance",
"ids" => [],
"count" => 0
}
],
"countries" => [
[0] {
"country_name" => "United States",
"count" => 0
},
[1] {
"country_name" => "Great Britain",
"count" => 0
}
],
"provider_name" => "foo",
"date" => 20130912
},
[1] {
"provider_name" => "bar"
}
]
}
]
In JavaScript, you can do:
r.db('test').table('test').get(3).update(function(doc) {
return {daily_providers: doc("daily_providers").changeAt(
0,
doc("daily_providers").nth(0).merge({
relationships: doc("daily_providers").nth(0)("relationships").changeAt(
1,
doc("daily_providers").nth(0)("relationships").nth(1).merge({
ids: [1]
})
)
})
)}
})
Which becomes, in Ruby (note the snake_case change_at and the quoted "ids" key; a bare ids would raise a NameError):
r.db('test').table('test').get(3).update{ |doc|
{"daily_providers" => doc["daily_providers"].change_at(
0,
doc["daily_providers"][0].merge({
"relationships" => doc["daily_providers"][0]["relationships"].change_at(
1,
doc["daily_providers"][0]["relationships"][1].merge({
"ids" => [1]
})
)
})
)}
}
You should probably have another table for the daily providers and do joins.
That would make things much simpler; a sketch follows below.
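For instance, with a hypothetical relationships table holding one row per relationship, keyed by profile_id and provider_name, the update no longer needs positional change_at calls:
r.db('test').table('relationships').filter({
  "profile_id" => 3,
  "provider_name" => "foo",
  "relationship_type" => "acquaintance"
}).update({ "ids" => [1] }).run(conn)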
