Logstash config for XML - Elasticsearch

I'm new to Elasticsearch and Logstash, and I would like to create a Logstash config file that can load XML file data so I can search it in Elasticsearch using Kibana. How do I create this config?
The XML file structure is:
<?xml version="1.0" encoding="ISO-8859-15"?>
<ORDERS>
<ORDER>
<COMPANY_CODE>CHU</COMPANY_CODE>
<ETABLISSEMENET_CODE>CHU</ETABLISSEMENET_CODE>
<FOURNISSEUR>BI</FOURNISSEUR>
<DESTINATAIRE>CHUSUDRUN2</DESTINATAIRE>
<NUM_COMMANDE_MYTOWER>342</NUM_COMMANDE_MYTOWER>
<NUM_COMMANDE_CHU>CMD12345</NUM_COMMANDE_CHU>
<INSTRUCTIONS>COLIS</INSTRUCTIONS>
<ETAT>4</ETAT>
<DATE_DE_COMMANDE>01-01-2018</DATE_DE_COMMANDE>
<DATE_DE_DISPONIBILITE>01-01-2018</DATE_DE_DISPONIBILITE>
<MONTANT_HT>3695.0</MONTANT_HT>
<DATE_DE_CREATION></DATE_DE_CREATION>
<POIDS_BRUT>20.0</POIDS_BRUT>
<NOMBRE_COLIS>3</NOMBRE_COLIS>

Below is an example of an XML config in Logstash:
input {
  file {
    path => "/home/Test_xml.xml"
    start_position => "beginning"
    codec => multiline {
      pattern => "^<\?book .*\>"
      negate => true
      what => "previous"
    }
    sincedb_path => "/dev/null"
  }
}
filter {
  xml {
    source => "message"
    target => "parsed"
  }
  split {
    field => "[parsed][book]"
    add_field => {
      bookAuthor => "%{[parsed][book][author]}"
      title => "%{[parsed][book][title]}"
      genre => "%{[parsed][book][genre]}"
      price => "%{[parsed][book][price]}"
      publish_date => "%{[parsed][book][publish_date]}"
      description => "%{[parsed][book][description]}"
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "xml_test"
  }
  stdout {
    codec => rubydebug
  }
}
link for xml file
I tried to insert data into Elasticsearch with Logstash a while back; I hope this will work.
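For the ORDERS structure in the question, a minimal adaptation of the same approach might look like the sketch below. It is untested: the multiline pattern, the index name, and the subset of copied fields are assumptions based only on the sample XML above.
input {
  file {
    path => "/home/Test_xml.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      # lines that are not the <ORDERS> root are appended to the previous line,
      # so the whole document body becomes one event
      pattern => "<ORDERS>"
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
  }
}
filter {
  xml {
    source => "message"
    target => "parsed"
  }
  # one event per <ORDER> element, with a few example fields copied out
  split {
    field => "[parsed][ORDER]"
    add_field => {
      "company_code" => "%{[parsed][ORDER][COMPANY_CODE]}"
      "num_commande" => "%{[parsed][ORDER][NUM_COMMANDE_CHU]}"
      "montant_ht"   => "%{[parsed][ORDER][MONTANT_HT]}"
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "orders_xml"
  }
  stdout { codec => rubydebug }
}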

Related

JSON parse error, original data now in message field {:message=>"incompatible json object type=java

My Logstash filter code catches incorrect data; can anyone help me with the correct syntax? The error points to a problem in the pipeline. There are multiple pipelines I'm working on, but I'm also in doubt about the syntax.
input {
  file {
    path => "/var/tmp/wd/accounts/*.json"
    codec => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    template => "/etc/logstash/templates/accounts-template.json"
    template_name => ["accounts-template.json"]
    template_overwrite => true
    index => "accounts-%{+yyyy.MM.dd}"
    user => "user"
    password => "password"
  }
  stdout { codec => rubydebug }
}
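Two things stand out in this config: template_name expects a single string rather than an array, and with codec => "json" on the input the events are already parsed, so the json filter on message is redundant. A possible cleanup, as a sketch only (the template name chosen here is a guess):
input {
  file {
    path => "/var/tmp/wd/accounts/*.json"
    codec => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
# no json filter needed: the codec has already parsed each line
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    template => "/etc/logstash/templates/accounts-template.json"
    template_name => "accounts-template"   # a plain string, not an array (name is an assumption)
    template_overwrite => true
    index => "accounts-%{+yyyy.MM.dd}"
    user => "user"
    password => "password"
  }
  stdout { codec => rubydebug }
}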

Loading a number of XML files into Logstash

I want to load a number of XML files into Logstash at the same time. What should I add to my config file?
Thanks guys for your support :)
This is my config file:
input {
  file {
    path => "D:/test*.xml"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "<invoicing>|</invoicing>"
      negate => "true"
      what => "previous"
      auto_flush_interval => 1
      max_lines => 3000
    }
  }
}
filter {
  xml {
    source => "message"
    target => "message.parsed"
    store_xml => false
    force_array => false
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "tizer005"
    hosts => ["localhost:9200"]
    document_type => "ChannelFiles"
  }
}
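The path option of the file input already takes glob patterns, so "D:/test*.xml" will pick up every file matching that pattern; it also accepts an array of paths or globs if the files live in different places. A minimal sketch (the extra directory name is an assumption), keeping the rest of the config as above:
input {
  file {
    # one glob already matches many files; an array combines several locations
    path => ["D:/test*.xml", "D:/more_xml/*.xml"]
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "<invoicing>|</invoicing>"
      negate => "true"
      what => "previous"
      auto_flush_interval => 1
      max_lines => 3000
    }
  }
}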

Parsing an XML file using Logstash

I am trying to parse an XML file in Logstash and want to use XPath to do the parsing. When I run my config file the data loads into Elasticsearch, but not in the way I want: each line of the XML document becomes its own document in Elasticsearch.
Structure of my XML file
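The structure itself isn't reproduced in the post, but judging from the XPath expressions in the config below it is presumably something like:
<stations>
  <station>
    <id>1</id>
    <name>Finch</name>
  </station>
  ...
</stations>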
What I want to achieve:
Create fields in Elasticsearch that store the following:
ID =1
Name = "Finch"
My Config file:
input {
  file {
    path => "C:\Users\186181152\Downloads\stations.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    exclude => "*.gz"
    type => "xml"
  }
}
filter {
  xml {
    source => "message"
    store_xml => false
    target => "stations"
    xpath => [
      "/stations/station/id/text()", "station_id",
      "/stations/station/name/text()", "station_name"
    ]
  }
}
output {
  elasticsearch {
    codec => json
    hosts => "localhost"
    index => "xmlns"
  }
  stdout {
    codec => rubydebug
  }
}
Output in Logstash:
{
"station_name" => "%{station_name}",
"path" => "C:\Users\186181152\Downloads\stations.xml",
"#timestamp" => 2018-02-09T04:03:12.908Z,
"station_id" => "%{station_id}",
"#version" => "1",
"host" => "BW",
"message" => "\t\r",
"type" => "xml"
}
The multiline codec lets the whole XML file be read as a single event, and we can then use the xml filter or XPath to parse the XML and ingest the data into Elasticsearch.
In the multiline codec we specify a pattern (see the example below) that Logstash uses to scan your XML file. Once the pattern matches, all entries after it are treated as part of a single event.
The following is an example of a working config file for my data:
input {
  file {
    path => "C:\Users\186181152\Downloads\stations3.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    exclude => "*.gz"
    type => "xml"
    codec => multiline {
      pattern => "<stations>"
      negate => "true"
      what => "previous"
    }
  }
}
filter {
  xml {
    source => "message"
    store_xml => false
    target => "stations"
    xpath => [
      "/stations/station/id/text()", "station_id",
      "/stations/station/name/text()", "station_name"
    ]
  }
}
output {
  elasticsearch {
    codec => json
    hosts => "localhost"
    index => "xmlns24"
  }
  stdout {
    codec => rubydebug
  }
}
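One detail worth noting that is not spelled out in the original answer: the xpath option stores its results as arrays of strings, so with several stations in one file the fields come out roughly like this (the second station is invented purely for illustration):
"station_id"   => ["1", "2"],
"station_name" => ["Finch", "Union"]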

Logstash Not Reading "File" Input

I am trying to use a file as an input to Logstash. Here is my logstash.conf:
input {
  file {
    path => "/home/dxp/elb.log"
    type => "elb"
    start_position => "beginning"
    sincedb_path => "/home/dxp/log.db"
  }
}
filter {
  if [type] == "elb" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:loadbalancer} %{IP:client_ip}:%{NUMBER:client_port:int} %{IP:backend_ip}:%{NUMBER:backend_port:int} %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{NUMBER:elb_status_code:int} %{NUMBER:backend_status_code:int} %{NUMBER:received_bytes:int} %{NUMBER:sent_bytes:int} %{QS:request}" ]
    }
  }
}
output {
  elasticsearch {
    hosts => "10.99.0.180:9200"
    manage_template => false
    index => "elblog-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
My logs show this:
[2017-10-27T13:11:31,164][DEBUG][logstash.inputs.file ]_globbed_files: /home/dxp/elb.log: glob is []
I guess my file has not been read by Logstash, so a new index is not formed in Elasticsearch.
Please help me with what I am missing in this.
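That DEBUG line is the file input reporting that the glob for /home/dxp/elb.log matched no files, so the first things to check are that the file actually exists at that path and is readable by the user running Logstash. While testing it can also help to discard the stored offset; a sketch with the paths left unchanged from the question:
input {
  file {
    path => "/home/dxp/elb.log"
    type => "elb"
    start_position => "beginning"
    # /dev/null instead of a real sincedb file, so every restart re-reads from the beginning
    sincedb_path => "/dev/null"
  }
}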

How can I index a JSON file using Logstash?

I am trying to index my JSON file, shown below. I have to write a grok expression, but I could not work out how to do that. Can you help me?
{"level":"Information","ClientIP":"10.201.21.188","Test":"10.210.21.188"}
{"level":"Information","ClientIP":"10.202.21.187","Test":"10.220.21.188"}
{"level":"Information","ClientIP":"10.203.21.186","Test":"10.230.21.188"}
{"level":"Information","ClientIP":"10.204.21.185","Test":"10.240.21.188"}
My logstash.conf is below:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "%{WORD:level} I HAVE TO WRITE OTHER ELEMENTS BUT HOW????" ]
  }
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
I guess that we need a grok expression to achieve that. I am also open to new, creative solutions.
You don't need to grok anything; your file input simply needs a JSON codec and you're good to go:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
    codec => "json"   # <-- add this
  }
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
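With the json codec each line of the file becomes its own event and the JSON keys show up as top-level fields, so the rubydebug output for the first line looks roughly like this (host, path, and timestamp values are illustrative):
{
         "level" => "Information",
      "ClientIP" => "10.201.21.188",
          "Test" => "10.210.21.188",
          "type" => "json",
          "path" => "C:/logs/test-20170933.json",
          "host" => "WIN-HOST",
      "@version" => "1",
    "@timestamp" => 2017-10-01T12:00:00.000Z
}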
