logstash.conf file error -- no discernible error - elasticsearch

Here is the file itself:
input {
file {
path => ["~/Downloads/log20170629.csv"]
start_position=> "beginning"
}
}
filter {
csv {
separator=>","
columns=>["ip","date","time","zone",”cik”,”accession”,”doc”,”code”,”filesize”,”idx”,”norefer”,”noagent”,”find”,”crawler”,”browser"]
}
}
output {
elasticsearch {
hosts => "localhost:9200"
index => "sec_data"
}
stdout{
}
}
I am unsure of the issue. I have double-checked all the curly braces, and everything else seems to be in order.
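One quick way to see whether the file input is picking the file up at all (a debugging sketch, not part of the original question; the absolute path and sincedb_path are assumptions) is to use an absolute path, disable the sincedb, and print parsed events to stdout, keeping the csv filter as-is:
input {
  file {
    # the file input does not perform shell tilde expansion, so use an absolute path
    path => ["/home/youruser/Downloads/log20170629.csv"]
    start_position => "beginning"
    # re-read the file from the start on every run while debugging
    sincedb_path => "/dev/null"
  }
}
output {
  # print each event so you can see whether anything is being read at all
  stdout { codec => rubydebug }
}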

Related

grok not parsing logs

Log Sample
[2020-01-09 04:45:56] VERBOSE[20735][C-0000ccf3] pbx.c: Executing [9081228577525#from-internal:9] Macro("PJSIP/3512-00010e39", "dialout-trunk,1,081228577525,,off") in new stack
I'm trying to parse some logs. I have tested the grok pattern on some sample logs and it returns the result I need, but when I combine it with my config and run it, the logs are not parsed into the index.
here is my config:
input{
beats{
port=>5044
}
}
filter
{
if [type]=="asterisk_debug"
{
if [message] =~ /^\[/
{
grok
{
match =>
{
"message" => "\[%{TIMESTAMP_ISO8601:log_timestamp}\] +(?<log_level>(?i)(?:debug|notice|warning|error|verbose|dtmf|fax|security)(?-i))\[%{INT:thread_id}\](?:\[%{DATA:call_thread_id}\])? %{DATA:module_name}\: %{GREEDYDATA:log_message}"
}
add_field => [ "received_timestamp", "%{#timestamp}"]
add_field => [ "process_name", "asterisk"]
}
if ![log_message]
{
mutate
{
add_field => {"log_message" => ""}
}
}
if [log_message] =~ /^Executing/ and [module_name] == "pbx.c"
{
grok
{
match =>
{
"log_message" => "Executing +\[%{DATA:TARGET}#%{DATA:dialplan_context}:%{INT:dialplan_priority}\] +%{DATA:asterisk_app}\(\"%{DATA:protocol}/%{DATA:Ext}-%{DATA:Channel}\",+ \"%{DATA:procedure},%{INT:trunk},%{DATA:dest},,%{DATA:mode}\"\) %{GREEDYDATA:log_message}"
}
}
}
}
}
}
output{
elasticsearch{
hosts=>"127.0.0.1:9200"
index=>"new_asterisk"
}
}
When I check the index in Kibana, it just shows the raw logs.
Question:
Why is my conf not parsing the logs, even though the grok pattern I made tested successfully (by me)?
Solved: the logs were not getting into the if condition.
It seems like your grok actions don't get applied at all, because the data gets indexed raw and no error tags are thrown. Evidently your documents don't contain a type field with the value asterisk_debug, which is your condition for executing the grok actions.
To verify this, you could implement a simple else path that adds a field or tag indicating that the condition was not met, like so:
filter{
if [type]=="asterisk_debug"{
# your grok's ...
}
else{
mutate{
add_tag => [ "no_asterisk_debug_type" ]
}
}
}
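Since the events arrive through the beats input, one common cause (an assumption, as the Filebeat configuration is not shown) is that custom fields defined in Filebeat end up nested under [fields] unless fields_under_root is enabled, so the conditional would need to reference that path:
filter {
  # check the nested Filebeat field instead of a top-level [type]
  if [fields][type] == "asterisk_debug" {
    # grok actions here
  }
}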

Logstash Array split gives NilClass for one-element array

I am trying to input JSON data from logs through Filebeat -> Logstash to Elasticsearch, but I seem to get a NilClass error no matter what I try.
The data sample:
{"student":[{"details":{"name":chirs,"lname":dave},"age":10,"grade":1.2,"id":"323"}],"id":"metric95"}
my logstash configuration is:
input {
beats {
port => "5044"
}
}
filter {
json {
source => "message"
}
split {
field => "[student]"
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
}
stdout { codec => rubydebug }
}
Error: split - Only String and Array types are splittable. field:[student] is of type = NilClass
Please try
split {
  field => "student"
}
and put double quotes around the string values, e.g. {"name":"chirs","lname":"dave"}. The sample line is not valid JSON, so the json filter never creates the student field, which is why split reports NilClass.
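As a defensive measure (a sketch, not from the original answer), the split can also be guarded so the pipeline does not raise the error when the json filter fails to produce the field:
filter {
  json {
    source => "message"
  }
  # only split when the json filter actually created the array
  if [student] {
    split {
      field => "student"
    }
  }
}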

Logstash filter section

Could you please advise how to filter specific words with Logstash 1.5? For example, it's necessary to filter the following words: Critical, Exit, Not connected.
As I remember, in previous versions of Logstash (i.e. 1.4 and earlier) this was possible with the grep filter.
Currently my logstash.conf contains:
input {
file {
path => ["C:\ExportQ\export.log"]
type => "Exporter-log"
codec => plain {
charset => "CP1251"
}
start_position => "beginning"
sincedb_path => "C:\Progra~1\logstash\sincedb"
}
}
filter {
}
output {
stdout { codec => rubydebug }
zabbix {
zabbix_host => "VS-EXP"
zabbix_key => "log.exp"
zabbix_server_host => "192.168.1.71"
zabbix_value => "message"
}
}
Many thanks in advance!
Use a conditional and the drop filter to delete matching messages.
filter {
# Simple substring condition
if "boring" in [message] {
drop { }
}
# Regexp match
if [message] =~ /boring/ {
drop { }
}
}
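Applied to the words from the question (assuming the goal is to keep only events that mention Critical, Exit, or Not connected and drop everything else), the same approach might look like this:
filter {
  # drop any event whose message does not contain one of the keywords
  if [message] !~ /Critical|Exit|Not connected/ {
    drop { }
  }
}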

_xmlparsefailure thrown by Logstash

Logstash is throwing an _xmlparsefailure error for the script and log file below. Because of this, multiple unwanted events are generated for a single log statement. How can we remove the parsing error?
input {
file {
path => "/novus/users/arun/a*"
start_position => "end"
codec => multiline {
pattern => "(^\t)|(</stacktrace>)"
what => previous
}
}
}
filter {
grep {
match => { "message" => "Exception" }
}
}
output {
elasticsearch {
host => "localhost"
protocol => "http"
}
}
Log file:
<event><date>5444-01-28-01:40:49.940</date><key>Exception</key><machine>ns9066</machine><timestamp>1422430849940</timestamp>><thread>UniqueIdRunnable_Runnable0</thread><product></product><novusid>ns9066.novusqc.1</novusid><application>NORM</application><environment>qc</environment><eventId>#23 Return relationships when MaxRelationships is equals to 10</eventId><requestId>0acd447514b2f7c4b8a3a497f1c</requestId><userid></userid><engineName>ns9066.novusqc.1</engineName><Class>RelationshipResolver.java</Class><Method>RelationshipResolver.getRelationshipGroups() </Method><eventlevel>warning</eventlevel><text>RelationshipGroup. normscalingloadRelationshipGroup. normscalingload</text>?><stacktrace>com.westgroup.novus.cci.CciRecordNotFoundException: RelationshipGroup. normscalingload
at 2019com.westgroup.novus.norm.NormCciAccess_Cached1.retrieveRelationshipGroup(No>rmCciAccess_Cached1.java:68)
at 2019com.westgroup.novus.norm.RelationshipResolver.getRelationshipGroups(Relatio>nshipResolver.java:3012)
at 2019com.westgroup.novus.norm.splitmerge.GetUniqueIdRunnable.performTasks(GetUniqueIdRunnable.java:190)
at 2019com.westgroup.novus.services.splitmerge.BaseRunnable.run(BaseRunnable.java:>107)
at 2019com.westgroup.novus.commonutils.PooledThread.run(PooledThread.java:128)
</stacktrace><eventguid>2019</eventguid></event>

Parsing a string as date in logstash while inserting in elasticsearch

One record in my csv file looks like
,No,FMN1116CD,Holiday Order,2782427,Mr Vijay Ran ,9/6/2014,17/11/2014,,To Deliver,0,S,FALSE,2726149,-1,-1,,0,,,,-1,Delhi Inbound,NEW DELHI,Basic Hotel Order Details,Custom Package,Double,,Others,25500,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,890,0,0,3280,0,29670,,,FALSE,not paid,287894747,,TRUE,,5,-301767,0,50764,,,,,,,,,,,,-3065174,,,,,,,,,,,,,,,,,,,,2,India,22/11/2014,,Manual,Custom Package,26889,Balasore,Callcenter,0,0,0,0
My conf file looks like
input {
file {
path => "/home/sagnik/Work/logstash-1.4.2/bin/ho.csv"
start_position => "beginning"
}
}
filter {
date {
match => ["Travel_Date", "dd/MM/YYYY"]
}
csv {
columns => ["Comm_Plan","Queue_Booking","Order_Reference","Multi_Ordertype","Order_Item_Id","Pax_Name","Generation_Date","Travel_Date","Desk_ID","Status","SalesID","UserRole","Group_Booking","Agent_ID","Admin_ID","Partner_ID","Partner_Name","AgencyAdmin_Id","Supp_Pmt_Ref","Supp_Pmt_Acc","Supp_Pmt_Status","Distributor","Agent_Name","State","Supplier_Code","Secondary_Supplier_Code","Supplier_Number","PNR","Ticket_Number","Basic","Taxes","OCTax","Meal_Price","Cab_Price","Handling","PLB","Deposit_Incentive","Subagent_Handling","Subagent_Plb","Subagent_Deposit_Incentive","Dist_Comm","Stax_Air","Booking_Surcharge","TDS","SubAgent_TDS","Dist_TDS","Dist_Service_Tax","STax_Bas","Partner_Booking_Fee","Old_Payment_Fee","Transaction_Fee_Rcvd","Transaction_Fee_Givn","Net_Amount","Vouchers","CC","Dist_Credit","Partner_Payment_Status","Call_CenterId","Linked_Order","Is_Holiday","Child_Ordertype","Room_Nights","Payment_Sum","Credit_Outstanding","Payment_Fee","DepositCharge","DepositComm_Cr","CreditCharge","CreditComm_Cr","Distributor_CreditCharge","Distributor_CreditComm_Cr","Vendor_7Charge","CCICICI_MOTO_3DCharge","IPSPCharge","NetBanking_TPSCharge","CCICICI_EMICharge","NetBanking_CITRUSCharge","CCHDFC_MOTOCharge","ACharge","CCAMEXCharge","NetBanking_4Charge","NetBanking_PayUCharge","Ccivrscharge","Vch_LossVoucher","Vch_StaffTravel","Vch_DiscountB2C","Vch_ViaPointsRedemption","Vch_DealVoucher","Vch_BonusRedemption","Vch_Loss","Vch_MultiOrder","Vch_SME","Vch_TripCard","Vch_NetPayments","Vch_OfferPromo","Vch_HotelPromotion","No_Of_Pax","Hotel_CountryName","Checkout_Date","Hotel_Booking_Code","Hotel_Type","Hotel_Name","Hotel_Id","Hotel_City","Hotel_Booked_By","Hotel_Net","Hotel_Taxes","Hotel_Gross","Hotel_Supplier_Commission"]
separator => ","
}
}
output {
elasticsearch {
host => "localhost"
index => "hotel"
}
stdout { codec => rubydebug }
}
But after insertion, Travel_Date comes through as a string and not a date. As a result, I am unable to do any date-based navigation with it. Please help.
You need to use target for this case:
date {
match => ["Travel_Date", "dd/MM/YYYY"]
target => "New_Field_Name"
}
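Worth noting (not mentioned in the answers) is that in the question's config the date filter runs before the csv filter, so Travel_Date does not exist yet when date is evaluated. A minimal sketch with the filters reordered and the parsed value written back onto the same field (the target value here is an assumption) would be:
filter {
  csv {
    separator => ","
    # same columns list as in the question, truncated here for brevity
    columns => ["Comm_Plan","Queue_Booking","Order_Reference","Multi_Ordertype","Order_Item_Id","Pax_Name","Generation_Date","Travel_Date"]
  }
  date {
    match => ["Travel_Date", "dd/MM/YYYY"]
    # write the parsed date back onto Travel_Date instead of overwriting @timestamp
    target => "Travel_Date"
  }
}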
I think you have misunderstood the date plugin. The date plugin is used to parse a field value and write the result to the @timestamp field.
If you need to convert a field value from string to date, you can use the ruby plugin to do it.
With the conf below I can parse Travel_Date into a date and navigate it in Elasticsearch.
Have a look.
input {
file {
path => "/home/sagnik/Work/logstash-1.4.2/bin/ho.csv"
start_position => "beginning"
}
}
filter {
csv {
columns => ["Comm_Plan","Queue_Booking","Order_Reference","Multi_Ordertype","Order_Item_Id","Pax_Name","Generation_Date","Travel_Date","Desk_ID","Status","SalesID","UserRole","Group_Booking","Agent_ID","Admin_ID","Partner_ID","Partner_Name","AgencyAdmin_Id","Supp_Pmt_Ref","Supp_Pmt_Acc","Supp_Pmt_Status","Distributor","Agent_Name","State","Supplier_Code","Secondary_Supplier_Code","Supplier_Number","PNR","Ticket_Number","Basic","Taxes","OCTax","Meal_Price","Cab_Price","Handling","PLB","Deposit_Incentive","Subagent_Handling","Subagent_Plb","Subagent_Deposit_Incentive","Dist_Comm","Stax_Air","Booking_Surcharge","TDS","SubAgent_TDS","Dist_TDS","Dist_Service_Tax","STax_Bas","Partner_Booking_Fee","Old_Payment_Fee","Transaction_Fee_Rcvd","Transaction_Fee_Givn","Net_Amount","Vouchers","CC","Dist_Credit","Partner_Payment_Status","Call_CenterId","Linked_Order","Is_Holiday","Child_Ordertype","Room_Nights","Payment_Sum","Credit_Outstanding","Payment_Fee","DepositCharge","DepositComm_Cr","CreditCharge","CreditComm_Cr","Distributor_CreditCharge","Distributor_CreditComm_Cr","Vendor_7Charge","CCICICI_MOTO_3DCharge","IPSPCharge","NetBanking_TPSCharge","CCICICI_EMICharge","NetBanking_CITRUSCharge","CCHDFC_MOTOCharge","ACharge","CCAMEXCharge","NetBanking_4Charge","NetBanking_PayUCharge","Ccivrscharge","Vch_LossVoucher","Vch_StaffTravel","Vch_DiscountB2C","Vch_ViaPointsRedemption","Vch_DealVoucher","Vch_BonusRedemption","Vch_Loss","Vch_MultiOrder","Vch_SME","Vch_TripCard","Vch_NetPayments","Vch_OfferPromo","Vch_HotelPromotion","No_Of_Pax","Hotel_CountryName","Checkout_Date","Hotel_Booking_Code","Hotel_Type","Hotel_Name","Hotel_Id","Hotel_City","Hotel_Booked_By","Hotel_Net","Hotel_Taxes","Hotel_Gross","Hotel_Supplier_Commission"]
separator => ","
}
ruby {
code => "
event['Travel_Date'] = Date.parse(event['Travel_Date']);
"
}
}
output {
elasticsearch {
host => "localhost"
index => "hotel"
}
stdout { codec => rubydebug }
}
Hope this can help you.
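One caveat (not part of the original answer): the event['field'] accessors in the ruby block are the Logstash 1.x/2.x API. On Logstash 5 and later the ruby filter has to use event.get and event.set, so an equivalent block would look roughly like this:
ruby {
  code => "
    # guard against missing or empty values so Date.parse does not raise
    travel = event.get('Travel_Date')
    event.set('Travel_Date', Date.parse(travel).to_s) unless travel.nil? || travel.empty?
  "
}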
