I'm using Elasticsearch to analyze my logs in WSO2 API Manager. I'm using basic authentication mode. After setting up Elasticsearch and Kibana and configuring their settings, these errors appear when I want to see the Kibana dashboards. How can I solve these problems?
It looks like there is no index in your Elasticsearch that starts with apim_event_faulty or matches apim_event*. You can check all the indices in your Elasticsearch cluster by hitting the _cat/indices?v API of Elasticsearch.
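For example, a quick check with Python's requests (the host, credentials and self-signed-certificate workaround below are assumptions about a default local setup; adjust to yours):

```python
# Minimal sketch: list all indices and look for the ones the dashboards expect.
# Assumes Elasticsearch on localhost:9200 with basic auth; adjust as needed.
import requests

resp = requests.get(
    "https://localhost:9200/_cat/indices?v",
    auth=("elastic", "changeme"),  # your Elasticsearch credentials
    verify=False,                  # only if you use a self-signed certificate
)
print(resp.text)

# The output should contain indices starting with apim_event
# (e.g. apim_event_response, apim_event_faulty); if none appear,
# the analytics data never reached Elasticsearch.
```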
Check whether there is a /repository/logs/apim_metrics.log file inside your WSO2 API Manager home directory.
If you don't have the apim_metrics.log file, most likely there is an issue in the configuration you have done in API Manager. Refer to this documentation: https://apim.docs.wso2.com/en/latest/api-analytics/on-prem/elk-installation-guide/
If you have the apim_metrics.log file, check its content. If it does not have any logs, most likely API Manager hasn't gone through any event that would trigger apim_event_faulty or apim_event_response logs. Try invoking an API and observe the logs.
I have an Elasticsearch cluster.
I am currently designing a Python service for a client that reads from and writes to my Elasticsearch. The Python service will not be maintained by me; it will only be called internally to fetch from and write to our Elasticsearch.
Is there any way to configure Elasticsearch so that we know the requests are coming from the Python service? Or is there any way to pass some extra fields while querying, so that based on those fields we can find the requests in the logs?
There is no built-in feature in Elasticsearch to resolve your request (you want to check the source of a request and add fields to a query).
However, there is a solution for audit logs:
https://www.elastic.co/guide/en/elasticsearch/reference/current/enable-audit-logging.html
What you can do is place a proxy in front of it and do the logging there; we have an Apache in front of our Elastic clusters to enable SSL offloading there and to add logging and ACL possibilities.
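One addition to the audit-log suggestion: Elasticsearch honours the X-Opaque-Id request header and records its value in audit log entries (and in slow logs and task info), so the Python service can tag its own requests without a proxy. A minimal sketch, assuming a local cluster with basic auth; the index name and query are made up:

```python
# Sketch: tag every request from the Python service with X-Opaque-Id.
# Elasticsearch records this header value (as opaque_id) in audit logs,
# slow logs and the task management API, so requests can be traced back
# to this service. Index name and query below are hypothetical.
import requests

HEADERS = {"X-Opaque-Id": "python-client-service"}

resp = requests.post(
    "http://localhost:9200/my-index/_search",
    json={"query": {"match_all": {}}},
    headers=HEADERS,
    auth=("elastic", "changeme"),  # adjust credentials
)
print(resp.json()["hits"]["total"])
```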
I'm using Elasticsearch to drive a "search website" feature. I'd like to collect statistics about what people search for (and which search queries are popular).
Elasticsearch is currently running behind Nginx, so I could extract this information from the Nginx access logs - but maybe Elasticsearch can be made to track this information itself?
I found the Index stats API, but that seems to be more abstract. It can be used to determine the average time needed to answer a query and such things, but it does not keep track of individual queries.
I am using a similar configuration (ES behind nginx), and up to now I have always just checked nginx's log files directly. However, thinking about your question, it makes much sense to route the nginx log files through the Elastic Stack to Elasticsearch using Logstash; this seems to be the cleanest way.
Apparently, in now-deprecated versions there were some security auditing options using a plugin called Shield (later Security), but as I said, configuring Logstash to ingest the nginx log files directly seems the most durable way for your purposes.
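If you want the raw idea without setting up Logstash first, here is a rough Python sketch: pull the request path out of each nginx access-log line, extract the search term and index it into a dedicated stats index. The log path, the q query parameter and the index name are assumptions about your setup:

```python
# Rough sketch: extract the search term from nginx access-log lines and
# index one document per search into Elasticsearch. The log path, the
# "q" query parameter and the index name are assumptions about your setup.
import re
import requests
from urllib.parse import urlparse, parse_qs

LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP')

with open("/var/log/nginx/access.log") as f:
    for line in f:
        m = LOG_LINE.search(line)
        if not m:
            continue
        params = parse_qs(urlparse(m.group("path")).query)
        for term in params.get("q", []):
            requests.post(
                "http://localhost:9200/search-stats/_doc",
                json={"query": term},
            )
```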
Further reading and detailed instructions
discuss.elastic.co: How to get elasticsearch access logs
https://sysadmins.co.za/how-to-ingest-nginx-access-logs-to-elasticsearch-using-filebeat-and-logstash/
Elasticsearch Access Log
how to enable ElasticSearch http access log
I am new to Kibana. The requirement is to build an analytical dashboard, so we are thinking of uploading the data into Elasticsearch and giving access only to the visualization part of Kibana to build reports and use the different dashboards. I have googled and found some links for building custom dashboards using a Kibana plugin, but we don't need any customization, so we want to use the standard Kibana UI. Please share your thoughts.
Elasticsearch and Kibana are both open-source.
If you want to prevent your users from reading/writing to the cluster but allow them to create visualisations and dashboards, you can do so with a basic license by setting up role-based access control. You'll need to give them full access to the .kibana index. Have a look at Elastic subscriptions to understand the different types of licenses (subscriptions).
If they only want to view and not create visualisations/dashboards, then create an RO (read-only) user with limited privileges, i.e. read-all but no-write permissions.
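To make this concrete, here is a sketch against the Elasticsearch security API using Python's requests. The role name, user name, index pattern and passwords are examples; note that kibana_admin grants full Kibana access, so for a strictly view-only user you would tighten this further with Kibana feature privileges:

```python
# Sketch: create a read-only role and user via the Elasticsearch security
# API (works on a basic license). Names and patterns are examples.
import requests

ES = "http://localhost:9200"
AUTH = ("elastic", "changeme")  # a superuser for these setup calls

# Role that can read the data indices but not write to them.
requests.put(
    f"{ES}/_security/role/logs_read_only",
    auth=AUTH,
    json={
        "indices": [
            {"names": ["logs-*"], "privileges": ["read", "view_index_metadata"]}
        ]
    },
)

# User combining that role with Kibana access. kibana_admin allows
# creating dashboards; restrict via Kibana feature privileges if the
# user should only view them.
requests.put(
    f"{ES}/_security/user/report_viewer",
    auth=AUTH,
    json={
        "password": "a-strong-password",
        "roles": ["logs_read_only", "kibana_admin"],
    },
)
```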
Can I use the Kibana UI for clients? Is it open source?
Yes, Kibana is open source. You just need a server to host Kibana on for free, or you can buy some special options, like a 10-node ES cluster and Kibana with SAML (Platinum plan).
Check: https://www.elastic.co/subscriptions
The name you are looking for is the Elastic Stack, not Kibana alone; Kibana always needs Elasticsearch, for example.
Some features are available only in paid versions, or in some free additional plugins.
I am currently working on the ELK setup for my Kubernetes clusters. I set up logging for all the pods and fortunately, it's working fine.
Now I want to push the logs of all terminated/crashed pods (which we get by describing the pod, but not as docker logs) to my Kibana instance as well.
I checked on my server for those logs, but they don't seem to be stored anywhere on my machine (inside /var/log/). Maybe it's not enabled, or I might not be aware of where to find them.
If these logs were available in a log file similar to the system log, then I think it would be very easy to push them to Kibana.
It would be a great help if anyone can help me achieve this.
You need to use kube-state-metrics, which gives you all pod-related metrics. You can configure kube-state-metrics to feed Elasticsearch; it will create indices for the different kinds of metrics. Then you can easily use those indices to display your charts/graphs in the Kibana UI.
https://github.com/kubernetes/kube-state-metrics
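Note that kube-state-metrics exposes Prometheus-format metrics over HTTP rather than writing to Elasticsearch itself; the usual glue is Metricbeat's prometheus module. As a rough illustration of what ends up indexed, here is a Python sketch that scrapes the endpoint directly. The service URL, the metric name (which varies across kube-state-metrics versions) and the index name are assumptions:

```python
# Sketch: pull terminated-container info from kube-state-metrics and push
# it into Elasticsearch. Assumes kube-state-metrics is reachable on
# :8080/metrics inside the cluster; endpoint and index name are examples.
import re
import requests

METRIC = re.compile(
    r'kube_pod_container_status_terminated_reason\{(?P<labels>[^}]*)\}\s+(?P<value>\S+)'
)

text = requests.get("http://kube-state-metrics:8080/metrics").text
for m in METRIC.finditer(text):
    if m.group("value") == "0":
        continue  # this termination reason is not active for the container
    labels = dict(kv.split("=", 1) for kv in m.group("labels").split(","))
    doc = {k: v.strip('"') for k, v in labels.items()}  # pod, namespace, reason, ...
    requests.post("http://localhost:9200/pod-terminations/_doc", json=doc)
```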
I started working on ELK recently and have a question regarding the handling of multiple types of logs.
I have two sets of logs on my server that I want to analyse: one from my Android application and the other from my website. I have successfully transferred logs from this server via Filebeat to the ELK server.
I have created two filters, one for each type of log, and have successfully imported these logs into Logstash and then Kibana.
This link helped me do the above: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7
The above link says to use the logs in the filebeat index in Kibana and start analysing (which I successfully did for one type of log). But the problem I am facing is that since these two types of logs are very different, they need to be analysed differently. How do I do this in Kibana? Should I create multiple Filebeat indexes there and import them, should it be just one single index, or is there some other way? I am not very clear on this (I could not find much documentation), so I would appreciate any help and guidance here.
Elasticsearch organizes data by index and type. Elastic used to compare these to SQL concepts, but now offers a new explanation.
Since you say that the logs are very different, Elastic is saying that you should use different indexes.
In Kibana, each visualization is tied to an index. If you have one panel from each index, you can show them both on the same dashboard.
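Setting aside how the routing is done in Logstash, the end state could look like the following sketch: each log type lands in its own index, and each index gets its own index pattern in Kibana (index names and document fields are invented for illustration):

```python
# Sketch of the two-index approach: each log type goes to its own index,
# so each can be analysed with its own mappings and visualizations.
# Index names and fields below are examples, not your real schema.
import requests

def index_log(index, doc):
    requests.post(f"http://localhost:9200/{index}/_doc", json=doc)

# Android application log -> one index
index_log("android-logs", {"level": "ERROR", "message": "NullPointerException"})

# Website log -> a separate index with its own, different fields
index_log("website-logs", {"status": 500, "path": "/checkout"})
```

In Kibana you would then create one index pattern per index (e.g. android-logs* and website-logs*) and build each set of visualizations on its own pattern.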