Extract a timestamp from a date without the year component - elasticsearch

This topic is related, but it skips the part that interests me.
I'm using Filebeat to read CA Service Desk Manager logs written in a custom format that I cannot change. A single log line looks something like this:
11/07 13:05:26.65 <hostname> dbmonitor_nxd 9192 SIGNIFICANT bpobject.c 2587 Stats: imp(0) lcl(0) rmt(1,1) rmtref(0,0) dbchg(0)
As you can see, the date at the beginning carries no year information.
I then use Logstash to parse the date from the log line. I have the extra pattern TIMESTAMP defined like this:
TIMESTAMP %{MONTHNUM}[/]%{MONTHDAY}%{SPACE}%{TIME}
And then in logstash.conf I have the following filter:
grok {
  patterns_dir => ["./patterns"]
  match => { "message" => [
    "%{TIMESTAMP:time_stamp}%{SPACE}%{WORD:server_name}%{SPACE}%{DAEMONNAME:object_name}%{SPACE}%{INT:object_id:int}%{SPACE}%{WORD:event_type}%{SPACE}%{USERNAME:object_file}%{SPACE}%{INT:object_line_number:int}%{SPACE}%{GREEDYDATA:log_message}"
  ] }
}
date {
  match => ["time_stamp", "MM/d HH:mm:ss.SS", "MM/dd HH:mm:ss.SS", "ISO8601"]
}
Currently I have to rely on the automatic timestamp, as time_stamp is indexed as text. This has been fine so far, but occasionally the time a log line was written on the server is not the same as the time it was pushed into ES. Around New Year I fear I will run into trouble with this and with how the year is deduced from the current time. My questions are:
Is it possible to parse a date field from the data given?
Is there a way to write some advanced logic for the date conversion?
As we're not past the new year yet, is there a way to manually check/ensure the automatic conversion is done correctly?
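For the second question, one possible shape for such logic is a minimal sketch (not from the original thread) of a ruby filter placed after the date filter. It assumes log events are never timestamped in the future:
ruby {
  code => "
    # Assumption: events are never logged in the future. If parsing with the
    # current year pushed @timestamp ahead of now (e.g. a 12/31 line ingested
    # on 01/01), shift the timestamp back one year.
    ts = event.get('@timestamp').time
    if ts > Time.now.utc
      event.set('@timestamp', LogStash::Timestamp.new(
        Time.utc(ts.year - 1, ts.month, ts.day, ts.hour, ts.min, ts.sec, ts.usec)))
    end
  "
}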

Related

Convert date format, BMC Remedy/smart-it

Problem:
In a field called $Detailed Description$, a date in the format 08/09/2021 is sometimes entered, and it needs to be converted to the Swedish format, e.g. 2022-02-11.
I'm going to use BMC Developer Studio to build a filter, but I can't find a fitting solution. Replacing it won't work (I think) because a replacement needs a value to replace with.
Maybe there is a function that reads the regex (\d{2})/(\d{1,2})/(\d{4}), but how can I convert it?
If it only happens sometimes, look at the AR System User Preference form and check the affected user's locale and date/time configuration.
It also matters where the data comes from; it could be a browser setting or a JavaScript modification.
1- Using a Set Fields action, copy the date value from Detailed Description to a Date/Time field (i.e. z1D_DateTime01).
2- Using a Set Fields action and functions (MONTH, YEAR, DAY, HOUR, MINUTE, SECOND), you can parse the date/time and convert it to the format you like.
Something like this:
SwedishDate = YEAR($z1D_DateTime01$) + "-" + MONTH($z1D_DateTime01$) + "-" + DAY($z1D_DateTime01$)
This captures the parts of the date and combines them, with "-" in the middle, in the order you want.

Formatting date to a particular type in zapier cli

In my inputFields in Zapier, there's one field where the user can enter a date, but I want the Zap to work only if the date is written like "2020-09-18T15:30"; otherwise it should show a message that the entered data does not match the specified format. I tried this, but it's not working.
const activityEditableFields = async (z, bundle) => {
  if (bundle.inputData.dueDate) {
    // Note: the test result is never used, so this check has no effect.
    (/\d{4}-\d{2}-\d{2}T\d{2}:\d{2}Z/.test(`${bundle.inputData.dueDate}`));
  }
};
Here dueDate is the field, and my intent is that if it contains data, it must match the specified format, but it makes no difference in the Zap. Any help would be appreciated.
This shouldn't be necessary. Per the docs:
DateTime fields let users enter dates and times, using their human readable values, machine readable datetimes, or standard English words for time like tomorrow. Zapier interprets the date input from users and outputs a standard ISO 8601 datetime to your API.
If you declare the field as a datetime, then bundle.inputData.dueDate will always be a proper ISO 8601 datetime.
The easiest workaround I could find was to use, in JS code, only the first 10 characters of the ISO 8601 date Zapier produces:
bundle.inputData.dueDate.substr(0,10)
or to use moment to parse the date (this library is supported by Zapier):
const moment = z.require('moment');
....
moment(bundle.inputData.dueDate).format("YYYY-MM-DD HH:mm:ss [UTC]")
The only drawback is that you need to use the code editor.

Facing Issue while sending data from Filebeats to Multiple Logstash files

To be precise, I am handling a log file which has millions of records. Since it is a billing summary log, customer information is recorded in no particular order.
I am using customized grok patterns and the Logstash XML filter plugin to extract enough data to track activity. To track individual customer activities, I use "Customer_ID" as a unique key, so even though I use multiple Logstash files and multiple grok patterns, all of a customer's information can be bound/aggregated using "Customer_ID" (the unique key).
Here is a sample from my log file:
7-04-2017 08:49:41 INFO abcinfo (ABC_RemoteONUS_Processor.java52) - Customer_Entry :::<?xml version="1.0" encoding="UTF-8"?><ns2:ReqListAccount xmlns:ns2="http://vcb.org/abc/schema/"/"><Head msgId="1ABCDEFegAQtQOSuJTEs3u" orgId="ABC" ts="2017-04-27T08:49:51+05:30" ver="1.0"/><Cust id="ABCDVFR233cd662a74a229002159220ce762c" note="Account CUST Listing" refId="DCVD849512576821682" refUrl="http://www.ABC.org.in/" ts="2017-04-27T08:49:51+05:30"
My grok pattern:
grok {
  patterns_dir => "D:\elk\logstash-5.2.1\vendor\bundle\jruby\1.9\gems\logstash-patterns-core-4.0.2\patterns"
  match => [ "message", "%{DATESTAMP:datestamp} %{LOGLEVEL:Logseverity}\s+%{WORD:ModuleInfo} \(%{NOTSPACE:JavaClass}\)%{ABC:Customer_Init}%{GREEDYDATA:Cust}" ]
  add_field => { "Details" => "Request" }
  remove_tag => ["_grokparsefailure"]
}
My customized pattern, stored inside the patterns directory:
ABC ( - Customer_Entry :::)
My XML filter plugin:
xml {
  source => "Cust"
  store_xml => false
  xpath => [
    "//Head/@ts", "Cust_Req_time",
    "//Cust/@id", "Customer_ID",
    "//Cust/@note", "Cust_note"
  ]
}
So whatever details come after " - Customer_Entry :::", I can extract using the XML filter plugin (stored much as a multiline codec would store them). I have written 5 different Logstash files to extract different customer activities with 5 different grok patterns, one each for:
1. Customer_Entry
2. Customer_Purchase
3. Customer_Last_Purchase
4. Customer_Transaction
5. Customer_Authorization
All of the above grok patterns extract a different set of information, which is grouped by Customer_ID as I said earlier.
I can extract the information and visualize it clearly in Kibana without any flaw using my customized patterns with different log files.
Since I have hundreds of log files to push into Logstash each and every day, I opted for Filebeat, but Filebeat ships to only one port, "5044". I tried to run 5 different ports for 5 different Logstash files, but that was not working; only one of the 5 Logstash files was getting loaded, and the rest of the config files sat idle.
Here is a sample of my Filebeat output configuration:
output.logstash:
  hosts: ["localhost:5044"]
output.logstash:
  hosts: ["localhost:5045"]
output.logstash:
  hosts: ["localhost:5046"]
I couldn't add all the grok patterns to one Logstash config file, because the XML filter plugin takes its source from the GREEDYDATA capture; in that case I would need 5 different source => settings for the 5 different grok patterns.
I even tried that, but it was not working.
I am looking for a better approach.
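As an aside, here is a minimal single-pipeline sketch (not from the thread) of that last idea: if each grok pattern captures its XML into a differently named field, conditionals can route the parsing, so only one xml source is active per event. The field names entry_xml and purchase_xml, and the shortened pattern pair, are hypothetical.
filter {
  grok {
    patterns_dir => ["./patterns"]
    # Each pattern writes its GREEDYDATA capture to its own field.
    match => { "message" => [
      "%{DATESTAMP:datestamp} %{LOGLEVEL:Logseverity}\s+%{WORD:ModuleInfo} \(%{NOTSPACE:JavaClass}\) - Customer_Entry :::%{GREEDYDATA:entry_xml}",
      "%{DATESTAMP:datestamp} %{LOGLEVEL:Logseverity}\s+%{WORD:ModuleInfo} \(%{NOTSPACE:JavaClass}\) - Customer_Purchase :::%{GREEDYDATA:purchase_xml}"
    ] }
  }
  # Route each capture to its own xml filter; both still emit the shared Customer_ID key.
  if [entry_xml] {
    xml { source => "entry_xml" store_xml => false xpath => [ "//Cust/@id", "Customer_ID" ] }
  } else if [purchase_xml] {
    xml { source => "purchase_xml" store_xml => false xpath => [ "//Cust/@id", "Customer_ID" ] }
  }
}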
Sounds like you're looking for scale, with parallel ingestion. As it happens, Filebeat supports something called load balancing, which sounds like what you're looking for.
output.logstash:
  hosts: [ "localhost:5044", "localhost:5045", "localhost:5046" ]
  loadbalance: true
That's for the outputs, though I believe you wanted multithreading on the input. Filebeat is supposed to track all files specified in the prospector config, but you've found its limits. Globbing or specifying a directory will single-thread the files in that glob/directory. If your file names support it, creative globbing may get you better parallelism by defining multiple globs over the same directory.
Assuming your logs are coming in by type:
- input_type: log
  paths:
    - /mnt/billing/*entry.log
    - /mnt/billing/*purchase.log
    - /mnt/billing/*transaction.log
This would enable prospectors on multiple threads, reading the files here in parallel.
If your logs come in with random names, you could use a similar setup:
- input_type: log
  paths:
    - /mnt/billing/a*
    - /mnt/billing/b*
    - /mnt/billing/c*
    [...]
    - /mnt/billing/z*
If you are processing lots of files with unique names that never repeat, adding the clean_inactive config option to your prospectors will keep your Filebeat running fast.
- input_type: log
  ignore_older: 18h
  clean_inactive: 24h
  paths:
    - /mnt/billing/a*
    - /mnt/billing/b*
    - /mnt/billing/c*
    [...]
    - /mnt/billing/z*
This will remove all state for files older than 24 hours and won't bother processing any file more than 18 hours old.

logstash individual log parse with multi line

I have a set of log files, where each log file is for a specific machine.
What I am trying to achieve is to use the multiline {} filter to join the multi-line messages in each of the files, because I would like to have a single @timestamp for each file.
Example data in a log file:
title
description
test1
test pass
test end
filter {
  multiline {
    pattern => "from_start_line to end of line"
    what => "previous"
    negate => true
  }
}
I just want to make all the data in the log file a single event, without using a pattern; pretty much like telling Logstash to build one multi-line event until EOF.
You can't do it like that, because Logstash will always keep monitoring the file; EOF is therefore meaningless to it.
What you can do instead is add a marker pattern to the end of the logs. For example, add log_end to the end of each log output:
title
description
test1
test pass
test end-log_end
Then you can use this pattern to join all the log lines into one event:
multiline {
  pattern => "log_end$"
  negate => true
  what => "next"
}
Hope this can help you.
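Another option, added here rather than taken from the thread: if each file is written once and never appended to, the multiline codec's auto_flush_interval can approximate EOF by flushing the buffered event once the input goes quiet. The path and the 5-second threshold below are assumptions.
input {
  file {
    path => "/path/to/machine-logs/*.log"
    codec => multiline {
      # A pattern that matches no line, combined with negate => true and
      # what => "previous", folds every line into one pending event per file.
      pattern => "^__NEVER_MATCHES__"
      negate => true
      what => "previous"
      auto_flush_interval => 5   # flush the buffered event after 5 quiet seconds
    }
  }
}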

My date format: 09/28/2012 16:35:34, I want the date format like: 2012-09-28T16:35:34, using gsub [duplicate]

Possible Duplicate:
Ruby - Change string in a date format to another format
My date format is 09/28/2012 16:35:34 and I want it formatted like 2012-09-28T16:35:34. I need the code to run under JRuby.
You probably want to convert the date to something more useful:
require 'date'
dt = DateTime.strptime "09/28/2012 16:35:34", '%m/%d/%Y %H:%M:%S'
# => #<DateTime: 2012-09-28T16:35:34+00:00 (106107805067/43200,0/1,2299161)>
Now you can do any transformation:
dt.strftime '%FT%T'
# => "2012-09-28T16:35:34"
This also raises an exception when the date format is wrong, which is useful to notice when things break.
For more information, see the API docs for Date.
Find
(\d+)\/(\d+)\/(\d+) ([\d:]+)
replace with
$3-$1-$2T$4
Here you can see the groups (the parenthesized parts of the regex): $1 is the first group, $2 the second, and so on. Basically you need to reorder the groups, putting - between the date parts and T before the time.
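Since the title asks about gsub and the other threads in this document run Logstash, here is a hedged sketch of the same reordering done with Logstash's mutate filter; the field name time_stamp is hypothetical. In plain JRuby, String#gsub with the same pattern and '\3-\1-\2T\4' as the replacement does the same job.
filter {
  mutate {
    # gsub takes field / pattern / replacement triples; \1..\4 are backreferences.
    gsub => [ "time_stamp", "(\d+)/(\d+)/(\d+) ([\d:]+)", "\3-\1-\2T\4" ]
  }
}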
