Filebeat not sending specific Log Files - windows

I have configured Filebeat 6.6 on a Windows instance. The weird thing is, it is sending logs for IIS but not for the files I have specified, even though Filebeat can detect them.
Filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\LowError.txt
- type: log
  enabled: true
  paths:
    - C:\inetpub\logs\LogFiles\*\*
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\*\*
  recursive_glob.enabled: true
- type: log
  enabled: true
  paths:
    - C:\Hosting\stagingb2c\ErrorLogs\*
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
output.logstash:
  hosts: ["13.234.83.186:5044"]
processors:
- add_host_metadata: ~
- add_cloud_metadata: ~
logging:
  to_files: true
  files:
    path: C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\filebeat-6.6.1-windows-x86_64\LOG
  level: info
I can see logs from the C:\inetpub\logs\LogFiles folder but not from C:\Hosting\stagingb2c\PaymentGatewayLogs.
I cannot see any errors or warnings in filebeat.log when I start it with:
PS C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\filebeat-6.6.1-windows-x86_64> .\filebeat.exe -e -d "*"
2019-03-04T21:15:51.602+0300 INFO log/harvester.go:255 Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId_12f1050220190810\CredimaxPayment_TransactionDetails_OrderId_12f1050220190810
2019-03-04T21:15:51.761+0300 INFO log/harvester.go:255 Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId_Sw2m\CredimaxPayment_PROCESS_ACS_RESULT_Response_20190213124610_OrderId_Sw2m.txt
2019-03-04T21:15:51.920+0300 INFO log/harvester.go:255 Harvester started for file: C:\Hosting\stagingb2c\PaymentGatewayLogs\CredimaxPaymentGateway_OrderId__SoLx\CredimaxPayment_PAY_Request_20190205085701_OrderId__SoLx.txt
So harvesters are started for these files, yet I am not able to see these logs in Logstash, even though the other files are definitely coming in.

Change your input section to this and check:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\ELK-Logger\filebeat-6.6.1-windows-x86_64\LowError.txt
    - C:\inetpub\logs\LogFiles\*\*
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\*\*
    - C:\Hosting\stagingb2c\ErrorLogs\*
  recursive_glob.enabled: true
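As a side note, the *\* pattern only matches files exactly one directory level below PaymentGatewayLogs. If the per-order folders can nest deeper, that is what recursive_glob is for. A minimal sketch, assuming the directory layout from the question (the ** pattern is a suggestion, not from the original thread):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    # With recursive_glob.enabled, ** matches the directory itself
    # and nested subdirectories (up to 8 levels in Filebeat 6.x).
    - C:\Hosting\stagingb2c\PaymentGatewayLogs\**
  recursive_glob.enabled: true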

Related

Getting this error while deploying filebeat "Exiting: No outputs are defined. Please define one under the output section."

I am trying to deploy Filebeat in one cluster and send the logs to another cluster which has Elasticsearch and Kibana installed in it.
This is my YAML file:
---
apiVersion: beat.k8s.elastic.co/v1beta1
kind: Beat
metadata:
  name: es-beats
  namespace: elastic
spec:
  type: filebeat
  version: 7.12.1
  elasticsearchRef:
    name: elastic
  config:
    filebeat.inputs:
    - type: container
      paths:
        - /var/log/containers/*.log
    - output.elasticsearch:
        # Array of hosts to connect to.
        hosts: ["https://<my-other-cluster-ip>:9200"]
        # Protocol - either `http` (default) or `https`.
        protocol: "https"
        username: "elastic"
        password: "mypass"
    - setup.kibana:
        host: "https://<my-other-cluster-ip>:9200"
        username: "elastic"
        password: "mypass"
  daemonSet:
    podTemplate:
      spec:
        dnsPolicy: ClusterFirstWithHostNet
        hostNetwork: true
        securityContext:
          runAsUser: 0
        containers:
        - name: filebeat
          volumeMounts:
          - name: varlogcontainers
            mountPath: /var/log/containers
          - name: varlogpods
            mountPath: /var/log/pods
          - name: varlibdockercontainers
            mountPath: /var/lib/docker/containers
        volumes:
        - name: varlogcontainers
          hostPath:
            path: /var/log/containers
        - name: varlogpods
          hostPath:
            path: /var/log/pods
        - name: varlibdockercontainers
          hostPath:
            path: /var/lib/docker/containers
Beats is successfully installed, and when I apply this file everything seems fine, but the pods don't get deployed; when I list the pods the status says CrashLoopBackOff.
I deployed the Beats using the Kubernetes operator and want to get the logs into the other cluster. I read the documentation, but I am confused about where to enter the Elasticsearch host, the Kibana host, the username and password, and what the output should be.
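No answer is quoted in this thread, but one thing stands out in the manifest above: output.elasticsearch and setup.kibana are written as list items (with a leading dash) under filebeat.inputs, so Filebeat parses them as inputs and never sees an output, which matches the "Exiting: No outputs are defined" error. A minimal sketch of the config section with the output promoted to a top-level key (the placeholder host and credentials are the question's own; note that ECK manages the output through elasticsearchRef, so that field would be dropped when defining a custom output):

spec:
  type: filebeat
  version: 7.12.1
  # elasticsearchRef removed here: when it is set, the ECK operator
  # injects its own output, which conflicts with a manual one.
  config:
    filebeat.inputs:
    - type: container
      paths:
        - /var/log/containers/*.log
    # A top-level key inside config, not a list item under filebeat.inputs:
    output.elasticsearch:
      hosts: ["https://<my-other-cluster-ip>:9200"]
      username: "elastic"
      password: "mypass"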

Can I avoid repetition in Filebeat input settings?

I have input settings like this (proof of concept), and I will add more prospectors later on.
Can I avoid repeating the multiline properties?
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /data/server/logs/inode-stage/inode-stage.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    env: 'stage'
    app: 'inode'
- type: log
  enabled: true
  paths:
    - /data/server/logs/inode-dev/inode-dev.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    env: 'dev'
    app: 'inode'
I don't think that is possible right now. I'm not sure how many variations you will have in your inputs, but based on your current example I would extract the env with dissect. If you need something more powerful, you could even go for the script processor.
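For illustration, a minimal sketch of the dissect approach (my adaptation, not from the thread, and it assumes a Beats version that ships the dissect processor): one prospector with a glob path, plus a processor that pulls env out of the file path, which lives in the source field in Filebeat 6.x.

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    # One glob covers inode-stage, inode-dev, and future environments.
    - /data/server/logs/inode-*/inode-*.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    app: 'inode'

processors:
- dissect:
    # Extracts "stage" or "dev" from the path into fields.env.
    field: "source"
    tokenizer: "/data/server/logs/inode-%{env}/%{filename}.log"
    target_prefix: "fields"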

How to constrain Filebeat to only ship logs to ELK if they contain a specific field?

I’m trying to collect logs from Kubernetes nodes using Filebeat and ONLY ship them to ELK IF the logs originate from a specific Kubernetes Namespace.
So far I’ve discovered that you can define processors, which I think should accomplish this. However, no matter what I do, I cannot get the shipped logs to be constrained. Does this configuration look correct?
filebeat.config:
  inputs:
    path: ${path.config}/inputs.d/*.yml
    reload.enabled: true
    reload.period: 10s
    when.contains:
      kubernetes.namespace: "NAMESPACE"
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false
processors:
- add_kubernetes_metadata:
    namespace: "NAMESPACE"
xpack.monitoring.enabled: true
output.elasticsearch:
  hosts: ['elasticsearch:9200']
Despite this configuration I still get logs from all of the namespaces.
Filebeat is running as a DaemonSet on Kubernetes. Here is an example of an expanded log entry: https://i.imgur.com/xfTwbhl.png
You have a number of options to do it:
Filter the data in Filebeat:

processors:
- drop_event:
    when:
      contains:
        source: "field"

Use an ingest pipeline in Elasticsearch:

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: my_pipeline_id

and then drop events in the pipeline:

{
  "drop": {
    "if": "ctx['field'] == null"
  }
}

Use the drop filter of Logstash:

filter {
  if ![field] {
    drop { }
  }
}
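Applied to the namespace question specifically, the first option would look something like this (my adaptation of the answer above, not a quote from the thread): drop every event whose kubernetes.namespace does not match.

processors:
# Requires kubernetes.namespace to be present on the event,
# e.g. via the add_kubernetes_metadata processor.
- drop_event:
    when:
      not:
        contains:
          kubernetes.namespace: "NAMESPACE"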
In the end, I resolved this by moving the drop processor from the main configuration file to the input configuration file.

Non-Zero Metrics-FileBeat

I am using elastic.co/filebeat:6.3.1 and the Elastic Stack 6.3.0 on Ubuntu as Docker containers. While running Filebeat I am facing this issue:
https://i.stack.imgur.com/9rZjt.png
And my filebeat.yml is:
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /usr/local/java/ABC_LOGS/*/*.log
    #- c:\programdata\elasticsearch\logs\*

#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#============================== Dashboards =====================================
setup.dashboards.enabled: true

#============================== Kibana =====================================
setup.kibana:
  host: "10.0.0.0:5601"

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["10.0.0.0:9200"]
Please help me. Thanks in advance.
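No answer is quoted in this thread, but two details are worth noting. The "non-zero metrics" lines are Filebeat's periodic internal metrics report at INFO level, not an error in themselves. And the log input in the configuration above has enabled: false, so nothing is harvested from the ABC_LOGS paths. Enabling it would be the first thing to try (a guess based on the posted config, not a confirmed fix):

filebeat.inputs:
- type: log
  # Was false above, which means the paths were never harvested.
  enabled: true
  paths:
    - /usr/local/java/ABC_LOGS/*/*.log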

Why would Code Climate be unable to show GPA?

What are the possible reasons a Code Climate GPA badge would show up as a question mark/unknown?
The other badges are working, however; I can see the number of issues and % LoC covered badges.
and % LoC Covered badges.
Here's my .codeclimate.yml file:
engines:
  rubocop:
    enabled: true
  eslint:
    enabled: true
  csslint:
    enabled: true
  duplication:
    enabled: true
    config:
      languages:
        - ruby:
        - javascript:
exclude_paths:
- "test/"
- "coverage/"
- "doc/"
- "bin/"
I think you are missing this from your .codeclimate.yml file:
ratings:
  paths:
    - Gemfile.lock
    - "**.css"
    - "**.js"
    - "**.jsx"
    - "**.rb"
You can read more about it here: https://docs.codeclimate.com/v1.0/docs/ratings
You also need to make sure your files have UTF-8 encoding.
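For context, a sketch of how that block sits in the full file (my arrangement; the engines and exclude_paths entries are the question's own): ratings is a top-level key alongside engines and exclude_paths, and it tells Code Climate which files the GPA is computed from.

engines:
  rubocop:
    enabled: true
  # ... other engines as in the question ...
ratings:
  paths:
    - Gemfile.lock
    - "**.rb"
    - "**.js"
exclude_paths:
- "test/"
- "coverage/"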