I have ELK installed and use a Logstash config file to ingest my log .txt files. However, when I open Kibana, I cannot see the .raw field data. How can I see it?
You could ask on the Elasticsearch forum.
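For context, .raw fields typically come from a multi-field mapping; the old default Logstash index template (pre-5.x) added a not_analyzed .raw sub-field to every string field. A minimal sketch of such a mapping, with "message" as a hypothetical field name:

"message": {
  "type": "string",
  "fields": {
    "raw": { "type": "string", "index": "not_analyzed" }
  }
}

If your index template doesn't define such a multi-field, there is no .raw data to show; if it does, try refreshing the field list for your index pattern in Kibana's settings so the field is picked up.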
I have installed ELK on one server and Filebeat on another server where the logs reside. My logs are shipped and viewable in Kibana. But I don't want the commented lines, or lines containing certain text, to be displayed in Kibana. So I used drop_event and exclude_lines in Filebeat, and I even used the drop filter in Logstash, but I don't see this reflected in the Kibana dashboard. Can anyone help with this?
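For reference, a minimal Filebeat sketch of the two options mentioned in the question; the log path and patterns are hypothetical, and exclude_lines entries are regular expressions matched against each raw line:

filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.log   # hypothetical path
  # drop comment lines and lines containing DEBUG before shipping
  exclude_lines: ['^#', 'DEBUG']

processors:
  # alternative: drop whole events whose message matches a condition
  - drop_event:
      when:
        regexp:
          message: '^#'

If changes like these don't show up, make sure Filebeat was restarted after the config change, and confirm Kibana's time range covers the newly shipped events; already-indexed lines won't disappear retroactively.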
I'm trying to visualize logs from my app. My logs are formatted as JSON and stored in a file. I have Filebeat installed, which uses that file as input. Filebeat can send the logs either to Logstash or directly to Elasticsearch. Logstash can process logs, do something with them, parse them...
But my logs are already JSON formatted.
Elasticsearch is going to be installed on another server, on the other side of the planet...
So my question is: is there any good reason to use Logstash in such a scenario (no processing needed), or is it OK to send logs to the Elasticsearch server directly?
I'm guessing Logstash could do some buffering, but I want to keep my app's server light and don't want to install anything on top of it.
Thanks.
This may help you: https://www.elastic.co/guide/en/beats/filebeat/current/elasticsearch-output.html
You can send the JSON into ES with Filebeat, without Logstash; Logstash is sometimes too heavy.
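A minimal sketch of that Filebeat-to-Elasticsearch setup, assuming the classic log input (newer Filebeat versions use the filestream input instead), a hypothetical log path, and a hypothetical ES host; the json.* options make Filebeat decode each line as a JSON document:

filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/app.json        # hypothetical path
  json.keys_under_root: true         # lift the parsed JSON keys to the top level
  json.add_error_key: true           # mark lines that fail to decode

output.elasticsearch:
  hosts: ["http://es.example.com:9200"]   # hypothetical host

Since the logs are already JSON, this skips Logstash entirely; Filebeat also retries failed sends and applies backpressure, which covers basic buffering without installing anything extra on the app server.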
I use Elasticsearch and Graylog to show and analyse logs. This solution is great, but I want to replace Graylog with Grafana. I can see that it can produce a lot of great graphs, but I haven't found any solution for showing logs in Grafana.
I want to collect syslog and other logs, plus metrics from the system, parse and store them in Elasticsearch, and finally show both of them, logs and metrics, in Grafana.
If you have a suggestion or a solution for that, please help me.
I'm new to the ELK stack, so I'm not sure what the problem is. I have a configuration file (see screenshot; it's based on the Elasticsearch tutorial):
Configuration File
Logstash is able to read the logs (it says "Pipeline main started"), but when the configuration file is run, Elasticsearch doesn't react. I can still search through the files.
However, when I open Kibana, it says no results were found. I checked and made sure that my time range covers the full day.
Any help would be appreciated!
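One quick way to check whether any documents actually reached Elasticsearch is to list the indices (assuming the default localhost binding; adjust the address to your setup):

curl 'http://localhost:9200/_cat/indices?v'

If no logstash-* index appears, the problem is on the Logstash output side; if an index is there and has documents, check the index pattern and the time field configured in Kibana.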
So I have this Elasticsearch installation; I insert data with Logstash and visualize it with Kibana.
Everything in the conf file is commented out, so it's using the default folders, which are relative to the Elasticsearch folder.
1/ I store data with Logstash
2/ I look at it with Kibana
3/ I close the instances of Elasticsearch, Kibana and Logstash
4/ I DELETE their folders
5/ I re-extract everything and reconfigure it
6/ I go into Kibana and the data is still there
How is this possible?
This command will, however, delete the data: curl -XDELETE 'http://127.0.0.1:9200/_all'
Thanks.
PS: I forgot to say that I'm on Windows.
If you've installed ES on Linux, the default data folder is /var/lib/elasticsearch (CentOS) or /var/lib/elasticsearch/data (Ubuntu).
If you're on Windows, or if you've simply extracted ES from the ZIP/TGZ file, then you should have a data sub-folder in the extraction folder.
Have a look at the Nodes Stats API and try:
http://127.0.0.1:9200/_nodes/stats/fs?pretty
On Windows 10 with Elasticsearch 7 it shows:
"path" : "C:\\ProgramData\\Elastic\\Elasticsearch\\data\\nodes\\0"
According to the documentation, the data is stored in a folder called "data" in the Elasticsearch root directory.
If you run the Windows MSI installer (at least for 5.5.x), the default location for data files is:
C:\ProgramData\Elastic\Elasticsearch\data
The config and logs directories are siblings of data.
Elasticsearch stores data under the 'data' folder, as mentioned in the answers above.
Is there any other Elasticsearch instance available on your local network?
If yes, please check the cluster name. If you use the same cluster name on the same network, the nodes will share data.
Refer to this link for more info.
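A quick way to check which cluster a node belongs to (assuming the default localhost binding; adjust the address for each instance):

curl 'http://localhost:9200/?pretty'

The response includes a cluster_name field, so you can compare it across the instances on your network.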
On CentOS:
/var/lib/elasticsearch
It should be in your extracted Elasticsearch folder, something like es/data.