Non-Zero Metrics - Filebeat - Elasticsearch

I am using elastic.co/filebeat:6.3.1 and ELK elastic.co:6.3.0 as Docker containers on Ubuntu. While running Filebeat I am facing this issue:
https://i.stack.imgur.com/9rZjt.png
My filebeat.yml is:
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /usr/local/java/ABC_LOGS/*/*.log
    #- c:\programdata\elasticsearch\logs\*

#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#============================== Dashboards =====================================
setup.dashboards.enabled: true

#============================== Kibana =====================================
setup.kibana:
  host: "10.0.0.0:5601"

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["10.0.0.0:9200"]
Please help me. Thanks in advance.
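Note that in the filebeat.yml above the log input is explicitly disabled (enabled: false), so Filebeat starts, prints its periodic "Non-zero metrics" INFO lines, and collects nothing from ABC_LOGS. A minimal sketch of the input section with collection actually switched on:

filebeat.inputs:
- type: log
  enabled: true   # was false, which turns the whole input off
  paths:
    - /usr/local/java/ABC_LOGS/*/*.log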

Related

Filebeat Netflow not showing up in Kibana

I use Filebeat to send Netflow to Elasticsearch and visualize it with Kibana.
The problem is that the Netflow events are not showing up in Kibana.
Here are my netflow and filebeat configuration files.
netflow.yml:

# Module: netflow
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0
      netflow_port: 2055
filebeat.yml:

Kibana section:

setup.kibana:
  host: "X.X.X.X:5601"

Elasticsearch output:

output.elasticsearch:
  hosts: ["localhost:9200"]
  username: "xxxxxxxxxxxx"
  password: "XXXXXXXXXXXX"
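One thing worth ruling out (an assumption, since the rest of filebeat.yml is not shown): a module file only takes effect if Filebeat is configured to load it and the file is enabled, e.g. with filebeat modules enable netflow, which renames modules.d/netflow.yml.disabled to netflow.yml. A minimal sketch of the loader section:

filebeat.config.modules:
  # Load enabled module configs from modules.d/
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false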

Can I avoid repetition in Filebeat input settings?

I have input settings like this (proof of concept) and I will add more prospectors later on.
Can I avoid repeating the multiline properties?
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /data/server/logs/inode-stage/inode-stage.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    env: 'stage'
    app: 'inode'
- type: log
  enabled: true
  paths:
    - /data/server/logs/inode-dev/inode-dev.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    env: 'dev'
    app: 'inode'
I don't think that is possible right now. I'm not sure how many variations you will have in your inputs, but based on your current example I would extract the env with the dissect processor. If you need something more powerful, you could even go for the script processor.
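A hedged sketch of the dissect approach (assuming Filebeat 6.4+, where the dissect processor is available and the file path lives in the source field): one prospector covers both directories, and dissect pulls the environment out of the path.

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    # one glob instead of one prospector per environment
    - /data/server/logs/inode-*/inode-*.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    app: 'inode'

processors:
- dissect:
    # e.g. /data/server/logs/inode-stage/inode-stage.log -> fields.env: "stage"
    tokenizer: "/data/server/logs/inode-%{env}/%{filename}"
    field: "source"
    target_prefix: "fields"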

Unable to use custom index in filebeat configuration

I'm working with Elasticsearch version 7.2.0 and shipping logs using Filebeat. I can use a custom pipeline, but I'm unable to set a custom index name. Kindly help.
Below is my filebeat output configuration:
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  pipeline: reindex_timestamp
  index: "logstash-%{+yyyy.MM.dd}"

setup.template.name: "logstash"
setup.template.pattern: "logstash-*"
setup.template.enabled: true
setup.template.overwrite: true
I'm not sure, though, how I have to create the custom template whose name I specified here.
Update: I found the solution to my requirement. The configuration below worked for me, since my requirement is to write web logs (logs that contain name: web) to a separate index and the rest of the application logs to another index (currently the default filebeat-* index):
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  #index: "filebeat-7.2.0-logstash-%{+yyyy.MM.dd}"  # It's not taking the custom index
  pipeline: reindex_timestamp_logstash
  indices:
    - index: "node-%{+yyyy.MM.dd}"
      when.contains:
        name: web
  pipelines:
    - pipeline: reindex_timestamp_node
      when.contains:
        name: web

setup.template.name: "filebeat-7.2.0"
setup.template.pattern: "filebeat-7.2.0-*"
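For what it's worth, a likely reason the plain index setting was ignored (hedged, based on documented Filebeat 7.x behavior): when index lifecycle management is active, and it is enabled by default against Elasticsearch 7.x, output.elasticsearch.index is ignored in favor of the ILM rollover alias. A minimal sketch that disables ILM so a custom index name takes effect:

setup.ilm.enabled: false   # with ILM active, output.elasticsearch.index is ignored
setup.template.name: "logstash"
setup.template.pattern: "logstash-*"

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "logstash-%{+yyyy.MM.dd}"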

How to constrain Filebeat to only ship logs to ELK if they contain a specific field?

I’m trying to collect logs from Kubernetes nodes using Filebeat and ONLY ship them to ELK IF the logs originate from a specific Kubernetes Namespace.
So far I've discovered that you can define processors, which I think should accomplish this. However, no matter what I do I cannot get the shipped logs to be constrained. Does the following look correct?
filebeat.config:
  inputs:
    path: ${path.config}/inputs.d/*.yml
    reload.enabled: true
    reload.period: 10s
    when.contains:
      kubernetes.namespace: "NAMESPACE"
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

processors:
  - add_kubernetes_metadata:
      namespace: "NAMESPACE"

xpack.monitoring.enabled: true

output.elasticsearch:
  hosts: ['elasticsearch:9200']
Despite this configuration I still get logs from all of the namespaces.
Filebeat is running as a DaemonSet on Kubernetes. Here is an example of an expanded log entry: https://i.imgur.com/xfTwbhl.png
You have a number of options to do this:
Filter the data in Filebeat with a drop_event processor:

processors:
  - drop_event:
      when:
        contains:
          source: "field"
Use an ingest pipeline in Elasticsearch:

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: my_pipeline_id
and then drop matching events inside the pipeline with a drop processor:

{
  "drop": {
    "if": "ctx['field'] == null"
  }
}
Use the drop filter of Logstash:

filter {
  if ![field] {
    drop { }
  }
}
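Applied to this question, a hedged sketch of the first option (assuming add_kubernetes_metadata has already populated kubernetes.namespace on each event): drop everything that does not come from the wanted namespace.

processors:
  - add_kubernetes_metadata: ~
  - drop_event:
      when:
        not:
          equals:
            kubernetes.namespace: "NAMESPACE"   # placeholder namespace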
In the end, I resolved this by moving the drop processor from the main configuration file to the input configuration file.

Filebeat not sending specific Log Files

I have configured Filebeat 6.6 on a Windows instance. The weird thing is, it is sending logs for IIS but not for the file I have specified, even though Filebeat can detect it.
Filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\LowError.txt
- type: log
  enabled: true
  paths:
    - C:\inetpub\logs\LogFiles\*\*
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\*\*
  recursive_glob.enabled: true
- type: log
  enabled: true
  paths:
    - C:\Hosting\stagingb2c\ErrorLogs\*

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

output.logstash:
  hosts: ["13.234.83.186:5044"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

logging:
  to_files: true
  files:
    path: C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\filebeat-6.6.1-windows-x86_64\LOG
  level: info
I can see logs from the C:\inetpub\logs\LogFiles folder but not from C:\Hosting\stagingb2c\PaymentGatewayLogs.
I cannot see any errors or warnings in filebeat.log when I start it with:
PS C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\filebeat-6.6.1-windows-x86_64> .\filebeat.exe -e -d "*"
2019-03-04T21:15:51.602+0300  INFO  log/harvester.go:255  Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId_12f1050220190810\CredimaxPayment_TransactionDetails_OrderId_12f1050220190810
2019-03-04T21:15:51.761+0300  INFO  log/harvester.go:255  Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId_Sw2m\CredimaxPayment_PROCESS_ACS_RESULT_Response_20190213124610_OrderId_Sw2m.txt
2019-03-04T21:15:51.920+0300  INFO  log/harvester.go:255  Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId__SoLx\CredimaxPayment_PAY_Request_20190205085701_OrderId__SoLx.txt
I am not able to see these logs in Logstash, though I can see other files coming into Logstash.
Change your input section to this and check:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\LowError.txt
    - C:\inetpub\logs\LogFiles\*\*
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\*\*
    - C:\Hosting\stagingb2c\ErrorLogs\*
  recursive_glob.enabled: true
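If the PaymentGatewayLogs files sit more than two directory levels deep, the *\* pattern will not reach them. A hedged sketch using the recursive ** glob instead (supported in Filebeat 6.x when recursive_glob.enabled is on, which is its default):

filebeat.inputs:
- type: log
  enabled: true
  recursive_glob.enabled: true
  paths:
    # ** recurses into subdirectories (expanded up to 8 levels)
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\**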
