I am exploring the idea of capturing certain Telescope entries (e.g. type = request) and saving them to a third-party service such as Elasticsearch.
Why? The hosting I am using does not support running Logstash (or an equivalent integration) directly, and I need a way to access the request history. I could do this with MySQL, but tools such as Elasticsearch provide more insight.
Any ideas on where to get started?
Related
I need to send an email automatically whenever an error appears in my Elasticsearch.
Is there any way to do it?
I don't want to use Elastic Cloud for this.
I could use Watcher in Kibana, but my question is whether Watcher is also available in a local installation, or only in the cloud.
Please help!
Watcher is available in on-premises installations if you have at least a Gold license; it is not available with the free Basic license.
The same goes for the Kibana e-mail action: it needs a Gold license.
You can check what is available at the subscription page.
If you do not have a Gold license for your on-premises cluster, you will need an external tool to query Elasticsearch and send e-mails. You can build one using one of the official client libraries (Python, Node.js, Java, etc.), or you can try another tool like ElastAlert.
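As a minimal sketch of such an external tool using the official Python client: the index pattern, field names, time window and SMTP settings below are assumptions you would adapt to your own setup, and the 8.x client API is assumed.

```python
import smtplib
from email.message import EmailMessage

from elasticsearch import Elasticsearch  # official Python client (8.x-style API assumed)

es = Elasticsearch("http://localhost:9200")  # adjust host/credentials for your cluster

# Look for documents with level "error" in the last 5 minutes
# (index pattern and field names are assumptions about your mapping).
resp = es.search(
    index="app-logs-*",
    query={
        "bool": {
            "must": [{"term": {"level": "error"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-5m"}}}],
        }
    },
    size=10,
)

hits = resp["hits"]["hits"]
if hits:
    msg = EmailMessage()
    msg["Subject"] = f"{len(hits)} error(s) found in Elasticsearch"
    msg["From"] = "alerts@example.com"
    msg["To"] = "ops@example.com"
    msg.set_content("\n".join(hit["_source"].get("message", "") for hit in hits))

    with smtplib.SMTP("localhost") as smtp:  # assumed local mail relay
        smtp.send_message(msg)
```

Scheduled from cron every few minutes, something like this acts as a very basic substitute for Watcher; ElastAlert gives you the same idea with ready-made rule types.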
I have a few questions regarding Elasticsearch architecture and the associated services and/or products that are not clear to me.
The idea is to set up an Elasticsearch instance for searching through file shares, Exchange mailboxes, SharePoint sites and, if possible, even Teams conversations.
How would I set up the Elasticsearch instance to support the following requirements?
Security filtering of results from these sources per user
Developing a simple and clean web search page, like Search UI from Elastic themselves
Active Directory or ADFS authentication
Using Node.js on a separate server to proxy requests to Elasticsearch, since Elastic's built-in user management means that users get access to all search results
I can find tutorials and blog posts on some of these items, but no comprehensive description of how the architecture would actually work, specifically the Search UI and the proxying of data to Elasticsearch.
Please have a look at this product released by Elastic, built on the same Elasticsearch framework:
https://www.elastic.co/workplace-search
It closely matches your requirements.
I'm installing the ELK stack for my company. My cousin, a programmer, uses it for his company too, so I asked him whether he bought X-Pack; he said no, since the MySQL logs he processes aren't of value. I know that I can buy X-Pack or use NGINX to add authentication, but let's assume that, like many Elastic users, I won't do any authentication. I have a couple of questions about that scenario.
So I have Filebeat shipping MySQL logs to Logstash, which feeds them into Elasticsearch, and the analytics is done in Kibana.
How do I make sure that no information of value ends up in the logs while still having meaningful analytics? My company develops an ERP and has many companies as customers, so at the very least you'll have the company ID and the user ID in the logs in order to have any meaningful data. Isn't this considered sensitive data?
How do I make sure that no unauthorized user sends a POST request to Elasticsearch or accesses Kibana? Do you run them locally, not on the internet?
Do you filter any sensitive data out of the logs before sending them to Filebeat?
I'm just trying to understand how so many users manage to run ELK without authentication while still being able to get meaningful data.
How do I make sure that no information of value ends up in the logs while still having meaningful analytics? My company develops an ERP and has many companies as customers, so at the very least you'll have the company ID and the user ID in the logs in order to have any meaningful data. Isn't this considered sensitive data?
If you don't want sensitive data stored in your Elasticsearch, you need to filter it out or anonymize it. For example, you can use a Logstash filter to create a fingerprint combining the company ID and user ID fields, or you can remove any field with sensitive data from your message.
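As a rough sketch, a Logstash pipeline snippet along these lines would hash the two IDs into a single keyed fingerprint and drop the originals (the field names company_id and user_id are assumptions about your log format):

```
filter {
  fingerprint {
    source              => ["company_id", "user_id"]
    concatenate_sources => true
    method              => "SHA256"
    key                 => "change-me"            # keyed hash (HMAC), so the IDs cannot be recovered by re-hashing guesses
    target              => "tenant_fingerprint"
  }
  mutate {
    remove_field => ["company_id", "user_id"]     # keep only the anonymized value
  }
}
```

The fingerprint stays stable for a given company/user pair, so you can still group and count in Kibana without storing the raw IDs.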
How do I make sure that no unauthorized user sends a POST request to Elasticsearch or accesses Kibana? Do you run them locally, not on the internet?
Without authentication this is almost impossible; you would need full control over who knows about your Elasticsearch instance and who can access it. If anyone besides you has access, they can send requests to your instance. To avoid that, you can use a firewall on your servers and only allow access from specific IPs.
Even if you take some precautions, running an Elasticsearch instance in production without any kind of access control is not recommended and is very risky.
You should use an access control method; it could be X-Pack, NGINX, or a plugin like Search Guard.
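If you go the NGINX route, a minimal sketch of a reverse proxy with HTTP basic authentication in front of Elasticsearch could look like the following (the port, paths and htpasswd file are assumptions; you would also bind Elasticsearch to 127.0.0.1 so it is only reachable through the proxy):

```
server {
    listen 8080;

    location / {
        auth_basic           "Elasticsearch";
        auth_basic_user_file /etc/nginx/htpasswd;    # create with: htpasswd -c /etc/nginx/htpasswd someuser
        proxy_pass           http://127.0.0.1:9200;  # Elasticsearch listening on localhost only
    }
}
```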
Please check out Search Guard (https://search-guard.com/). The basic version (which is sufficient for most use cases and definitely better than nothing) is free and open source (Apache 2 License).
Disclaimer: I work for Search Guard/floragunn GmbH
If you need to grant per-user access/privileges, you can use Grafana instead of Kibana for free.
For Elasticsearch access, this is like any database security: configure your server to allow only specific IPs on ports 9200 and 9300.
You can also look at https://github.com/sscarduzio/elasticsearch-readonlyrest-plugin to restrict destructive queries such as deletes (still free).
I've set up an ELK (Elasticsearch, Logstash and Kibana) stack and created some Kibana dashboard widgets. So far everything has gone amazingly well. Now I want to send daily and weekly emails with the generated reports.
What is the best way to do that? Do I need to install a plugin, or can I send it right from Kibana?
You can use ElastAlert. You will be able to mail a link to the Kibana dashboard showing only the data for the period you want; the time-range parameter in the top right corner will be set automatically in Kibana.
There are some workarounds, such as PhantomJS, but they are not straightforward to implement. For specific events and Kibana queries there are alerting mechanisms available (Watcher, Logz.io), but I'm guessing you're looking to receive the entire dashboard by email.
There are two out-of-the-box options for sending email reports from a Kibana dashboard:
Skedler, which allows you to schedule and send automated email reports based on your Kibana dashboard or search.
If you have an Elasticsearch license/subscription, then you can use the Reporting plugin.
Hope it helps.
You can use Sentinl, which extends Kibana with alerting and reporting functionality to monitor, notify and report on data series changes, using standard queries, programmable validators and a variety of configurable actions. Think of it as a free and independent "Watcher" that also has scheduled "Reporting" capabilities (PNG/PDF snapshots).
The greatest thing about Sentinl is that you can easily configure alerts through its native app interface integrated into Kibana.
What are the options when it comes to SaaS/hosted full text search? How should I evaluate the different options available?
I'm looking for something that uses Lucene, Solr, or Sphinx on the backend and provides a REST API for submitting documents to index and running searches.
I could build my own EC2 AMI, but I'd have to configure EBS and other stuff, monitor it, etc.
Websolr provides a cloud-based Solr with a control panel. It's in private beta as of this writing, but you can get the service through Heroku.
Another hosted Solr service is PowCloud, also in private beta, which seems to offer strong WordPress integration.
SolrHQ: another beta service providing a hosted Solr solution, with Joomla and WordPress integrations.
Acquia Search offers Solr integration for Drupal sites.
If you decide to build your own EC2 instance, the SolrOnAmazonEC2 wiki page might be useful. Or you could just get LucidWorks Solr for EC2, which is probably the easiest and fastest way to get Solr on EC2.
Engine Yard provides a cloud-based Sphinx service.
IndexTank is a hosted real-time full-text search solution. It's pretty simple to set up (you can get an index running in a couple of minutes) and it's very powerful (Reddit runs on IndexTank). It provides Java, Python, Ruby and PHP clients as well as a REST API specification. There's an awesome support service (including live chat). You should give it a try.
Another option, particularly for UK people, is http://www.netaphorsearch.com/ . I should point out that I own Netaphor Ltd. We support the Solr REST API but also have a PHP connector so that you can get up and running very quickly.
Have a look at Artirix, a UK company that also operates in the US: http://www.artirix.com. I know they power some sites such as Globrix.com in the UK based on Solr, and they have a bunch of other products for crawling and data processing.
My five cents:
http://indexisto.com/
It offers free hosted Elasticsearch if you are prepared to have advertisements in your search results. You can start with the free plan and later switch to a paid, ad-free account.
It's also not just hosted Elasticsearch, but a ready-to-use Ajax search box (which is really impressive) to embed in your site (adapted for mobile and tablet), plus some useful features like statistics and image resizing. There are several options for filling the index with documents: a crawler, an API, and a DB connector.
Another option for lower-volume websites is Midwestern Mac's hosted Solr search (I am the owner of Midwestern Mac, LLC, just fyi).
Although it's not too hard (if you can use a command line respectably well) to provision your own server on a VPS somewhere...