access fields from log using ruby filter - ruby

What I am trying to do is pass my grok fields to an external ruby filter script and set specific tags based on those fields. The problem is that I can only get the whole log message with the event API.
My question is: is it possible to access fields from the already processed log message in the ruby filter, or do I have to parse the whole message myself? That would not be optimal, because every log message would be processed twice. Alternatively, I could drop the grok filter entirely and do everything myself in the script.

Yes, it is possible.
You can get read-only access to any field using the Event API:
filter {
  ruby {
    code => 'event.get("foo")'
  }
}
The field can also be a nested field reference, such as [foo][bar]:
event.get("[foo][bar]")

Related

Laravel validation; human names for array fields

In Laravel form validation you can use file_description.* to validate each item of an array according to a set of rules. The problem is that the system automatically returns "file_description.1 is required" as an error message if the field is required and not filled in.
More on that here: https://ericlbarnes.com/2015/04/04/laravel-array-validation/
Now, I'm not a complex man; I just want the field to say "File Description 1 is required". I am aware you can set messages, but a) my input arrays are dynamically generated by jQuery (a click-to-add-more scenario), so I'd have to use a loop like in the above example, and b) I feel like there must be a better way.
Is there a way to either extend the core validation system to return a humanized name for the array field, as Laravel does with regular fields, or is there an option I missed in the docs that allows for this? I'd rather not get involved with some regex-type search to fix this.

How to get Json in Grok Logstash

So currently I'm building a log system using the ELK Stack. Before building this ELK stack, I already had a custom log format for my apps so that it can be easily read by humans. My log is formatted something like this:
Method: POST
URL: https://localhost:8888/api
Body: {
  "field1": "value1",
  "field2": [
    {
      "field3": "value2",
      "field4": "value3"
    },
    {
      "field3": "value2",
      "field4": "value3"
    }
  ]
}
Using a grok pattern, I can get the Method and the URL, but how can I get the full JSON body in grok / Logstash so that I can send it to Elasticsearch? The length of the JSON is not fixed and can be longer or shorter in each log.
Thank you
You can use the JSON Filter.
It should parse the JSON for you and put it into a structured format so you can then send it wherever you need (e.g. Elasticsearch, another pipeline).
From the docs:
It takes an existing field which contains JSON and expands it into an actual data
structure within the Logstash event.
There are also some other questions here on SO that could be helpful. An example: Using JSON with LogStash
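Put together, the pipeline could look something like this minimal sketch. The grok pattern and the raw_body/body field names are assumptions based on the log sample above, and the multiline log entry is assumed to have already been joined into a single message (e.g. by a multiline codec on the input):
filter {
  grok {
    # (?m) lets GREEDYDATA match across the newlines inside the JSON body
    match => { "message" => "(?m)Method: %{WORD:method}\nURL: %{URI:url}\nBody: %{GREEDYDATA:raw_body}" }
  }
  json {
    # parse the captured JSON string into a structured field
    source => "raw_body"
    target => "body"
  }
}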

ndc (nested diagnostic context) in LARAVEL - is it possible?

In log4net you can add an NDC to filter all the logs in a specific context, for example per session, request, CLI command, etc.
It's basically another field in the log that holds a random string, and the string is shared between the contexts from above.
When you search for this string you can see the entire request's logs.
Is this possible in Laravel?
Thanks

How can I log the value of a variable sent in an HTTP Request sent from JMeter, if the value was first read in from a csv file

I would like to read the exact value of a variable I use to pass through an HTTP Request. I first read in many values of variables using the CSV Data Set Config. For the username, it is in the form of an email address. So, I have a variable called "email" in the Data Set Config. In the actual HTTP Request, for "name", I call it "username". For the "Value" field for this same "username", I added a time() function to it like this so I would end up creating unique users in my tests:
${email}${__time()}
When I view the "Request" in a View Results Tree, I can see my parameter is listed correctly:
username=email1%40email.com1390854377360
I do not care if this is correct in real world terms. I already know that is not a valid email. That is ok for now.
What I want to know is how can I log that email that I just created on the fly? I would also prefer not to pull in the whole request every time and then use some type of Regular Expression Extractor. It just seems like there should be an easy way to do this.
I think there are two ways:
Beanshell Pre/Post Processors: you can write custom code that logs all your variables to a custom log file (see the sketch below).
Simple Data Writer: you can configure it and check the "save URL", "save field names", and "save response data" checkboxes; that gives you the complete data, but post-processing of the result file is then required to extract all the usernames (emails in your case).
The first approach is easier and lets you create your own logging format (easy to retrieve and use somewhere else).
The second approach is somewhat tedious and requires post-processing.
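For the first approach, here is a minimal Beanshell PreProcessor sketch. The variable name email comes from the question; vars and log are the standard bindings JMeter provides to Beanshell, while the log line format and the username variable are my own assumptions:
// Beanshell PreProcessor attached to the HTTP Request
String email = vars.get("email");                       // value read by the CSV Data Set Config
String username = email + System.currentTimeMillis();   // same idea as ${email}${__time()}
vars.put("username", username);                          // reference it as ${username} in the request
log.info("Generated username: " + username);
Computing the value once in the PreProcessor and referencing ${username} in the request also keeps the logged value and the sent value identical, which evaluating ${__time()} separately in the request cannot guarantee.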

Organize validation messages in Struts2 validation(XML)

I am trying to do validations in Struts 2 for my current project. I have to group my validation messages. For example, if there are 3 fields that are empty and 3 other fields whose format is not right, I need to get a message like:
"The following fields are required: field1, field2, field3
The format of the following fields are invalid: field4, field5, field6"
I tried providing a param to fieldError.
Eg:
<s:fielderror>
  <s:param value="%{requiredstring}"/>
</s:fielderror>
As I understand it, this is like specifying "show all errors whose validator type is requiredstring". Please correct me if I am wrong.
But this will display the message "The following fields are required" each time for every field that is empty. I want it displayed only once.
Is there a way to do this cleanly in Struts2 using validation through XML? I do not want to do all the validations in a validate method.
Thanks
You are wrong; I have no idea why you thought that would work, and the docs don't imply it's possible.
Field errors are just that: errors for a specific field. If you need to group errors by arbitrary criteria, like the validation type, you'll need to implement that yourself.
There are a number of ways to do this, including writing a custom validation interceptor, providing validators that group errors in a different way, or simply gathering the appropriate messages in an action or validation method.
You could gather errors based on the message content, but IMO that would be brittle. If this is a cross-application issue, you're better off doing it a different way.
All that said, by presenting error messages in an order that doesn't necessarily reflect the form, you're pushing more cognitive overhead onto the user: I don't want to see groups of messages telling me which fields share the same error; I want to see what's wrong with each field, in the same order the fields are presented on the form, preferably near the form field itself.
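
As one illustration of the last option, gathering grouped messages in a validation method (even though the question would prefer to stay in XML), here is a minimal sketch. The action and field names are hypothetical; validate and addActionError come from Struts2's ActionSupport:
import com.opensymphony.xwork2.ActionSupport;
import java.util.ArrayList;
import java.util.List;

public class RegisterAction extends ActionSupport {
    private String field1;
    private String field2;
    // getters/setters omitted

    @Override
    public void validate() {
        List<String> required = new ArrayList<>();
        if (field1 == null || field1.trim().isEmpty()) required.add("field1");
        if (field2 == null || field2.trim().isEmpty()) required.add("field2");
        if (!required.isEmpty()) {
            // one grouped message instead of one error per field
            addActionError("The following fields are required: " + String.join(", ", required));
        }
    }
}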
