Elasticsearch using GeoLite2-City.mmdb - elasticsearch

I was wondering if it's possible to use the GeoLite2-City.mmdb or GeoLite2-Country.mmdb databases included with Elasticsearch as an API. I mean, could I query Elasticsearch to give me details about an IP address instead of using an API from a different service?
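One possible approach (an assumption on my part, not something stated in the question) is the geoip ingest processor, which uses the bundled GeoLite2 databases; simulating a pipeline returns the IP details without indexing anything. A minimal sketch in Python, assuming a recent Elasticsearch running locally on port 9200 without authentication:

import requests

# Sketch only: assumes the geoip processor is available (bundled in recent
# Elasticsearch releases, a separate ingest-geoip plugin in older ones) and
# that the cluster is reachable at localhost:9200 without authentication.
body = {
    "pipeline": {
        "processors": [{"geoip": {"field": "ip"}}]  # looks the field up in GeoLite2
    },
    "docs": [{"_source": {"ip": "8.8.8.8"}}],       # example IP, replace with yours
}
resp = requests.post("http://localhost:9200/_ingest/pipeline/_simulate", json=body)
print(resp.json())  # the simulated doc carries a "geoip" object (country, city, location)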

Related

What details do I need to GET data from an Elasticsearch cluster?

My team has data stored on Elasticsearch and has given me an API key, the URL of a remote cluster, and a username/password combination (to what, I don't know) to GET data.
How do I use this API key to get data from the Elasticsearch cluster with Python? I've looked through the docs, but none of them include the use of a raw API key, and most involve localhost rather than a remote host as in my case.
Surely I need to know the names of nodes or indices at least? And what would I need the username/password combo for? There must be more details I need in order to connect than what I've been given?
We're moving from Node.js+Couchbase work to Elasticsearch+Python, so I'm more than a bit lost.
TYIA
Most probably X-Pack basic security is enabled in your Elasticsearch (ES) cluster, which you can check by hitting your cluster URL on port 9200; if it asks for a username/password, then you can provide what you have.
Please refer to the X-Pack page for more info.
In short, it's used to secure your cluster and indices, and there are various types of authentication; basic auth (which requires a username/password) is the one your team might be using.
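A minimal sketch of connecting from Python with the official elasticsearch client (8.x-style arguments); the cluster URL, index name, and which credential your cluster actually accepts are assumptions, so keep whichever of the API key or the username/password works for you:

from elasticsearch import Elasticsearch

# Placeholders throughout: the cluster URL, index name, and credentials are
# illustrative, not real values.
es = Elasticsearch(
    "https://your-cluster-url:9200",
    api_key="YOUR_API_KEY",                 # use the API key...
    # basic_auth=("username", "password"),  # ...or basic auth, but not both
)

print(es.info())          # quick connectivity/authentication check
print(es.cat.indices())   # lists index names if you don't know them yet

resp = es.search(index="your-index", query={"match_all": {}}, size=5)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"])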

Searching query in Solr using JMeter

I configured the Solr search server in a Tomcat server. I started Tomcat with the extra parameters below.
-Dcom.sun.management.jmxremote.port=9191
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false
Now I want to test Solr search requests in JMeter for load-testing purposes. Will I be able to do it in JMeter?
As per Solr Quick Start
Searching
Solr can be queried via REST clients, cURL, wget, Chrome POSTMAN, etc., as well as via the native clients available for many programming languages.
The Solr Admin UI includes a query builder interface - see the gettingstarted query tab at http://localhost:8983/solr/#/gettingstarted/query.
So you should be able to perform a search using an HTTP Request sampler, for example:
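A sketch of the kind of URL the sampler would request (assuming the gettingstarted core from the Quick Start and the standard /select handler):
http://localhost:8983/solr/gettingstarted/select?q=YOUR_QUERY_HERE&wt=json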
Replace gettingstarted with your Solr core name and YOUR_QUERY_HERE with your actual query.
You will also be able to use the XPath or JSON Path Extractor in order to extract some response parts into JMeter Variables if needed.

How do I ensure proper routing with Logstash when I update a parent/child relationship document?

I have a parent/child relationship set up in Elasticsearch. When I try to update the parent, it sometimes works and sometimes doesn't. I believe I've narrowed it down to getting a missing-document error because I can't figure out how to specify routing using Logstash (parent/child relationships must route to the same shard). I thought Elasticsearch would do this automatically, given that I have set up the routing path in the mappings, but it only seems to work when I specify the routing parameter in the REST API URL. Logstash, however, doesn't seem to have a way to add that when updating. I'm using the Logstash elasticsearch output plugin with the http protocol.
To add to my confusion, it seems Elasticsearch 1.5 is deprecating the "path" property in mappings.
Is there any way to ensure proper routing with parent/child updates using Logstash?
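One hedged possibility (assuming a logstash-output-elasticsearch plugin version recent enough to support it) is the plugin's routing option, which can be set per event with the %{field} syntax. A sketch, where myindex, child_id, and parent_id are hypothetical names, not anything from the question:

output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "myindex"
    action      => "update"
    document_id => "%{child_id}"   # hypothetical field holding the child's id
    routing     => "%{parent_id}"  # hypothetical field; routes the update to the parent's shard
  }
}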

How to use Elasticsearch queries in Windows?

Hi, I am new to Elasticsearch. I installed Elasticsearch on my Windows 7 machine, but I don't know how to run and use Elasticsearch queries in Windows. Where should I type the Elasticsearch queries and where should I run them?
If anyone knows about it, please help me. Thanks in advance...
There are multiple ways to do that.
Via the HTTP interface, which means that you can run GET queries via your browser (Firefox, Chrome, etc.) by accessing the proper URL, like:
http://localhost:9200/_search?q=tag:wow
Elasticsearch's HEAD plugin. You can execute any query with it. It also has multiple additional functionalities.
Install cURL for Windows and then run queries just like every tutorial suggests.
Use any programming language, like PHP, that supports a curl or HTTP library (see the sketch after this list).
Personally I prefer HEAD plugin since it has other functionalities that I use anyway.
You can also check the Sense plugin for Chrome. It will also help you with the syntax for queries.
You can get it from here:
https://github.com/bleskes/sense
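As an illustration of the programming-language route, a minimal Python sketch using the requests library, assuming Elasticsearch runs locally on the default port 9200 without authentication (tag:wow is just the same example query as in the browser URL above):

import requests

# Same query-string search as the browser example, just issued from code.
resp = requests.get(
    "http://localhost:9200/_search",
    params={"q": "tag:wow"},  # Lucene query-string syntax
)
print(resp.json())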

Multitenant setup with Kibana and Elasticsearch

I am going to use Logstash+ES+Kibana for my project. I want to know how to use this framework for multiple tenants. Can anyone explain to me how, after authentication, Kibana queries the Elasticsearch index and loads it into Kibana's dashboard? Can I restrict Kibana to look at a specific Elasticsearch index for a particular user or some ID? Has anybody tried this?
Thnx
You could, but depending on your use case it is probably not a good idea. There are a few gotchas, particularly regarding security and separating the users. First, Kibana is just JavaScript running in the browser, so whatever Kibana is allowed to do, so is your user. You can, however, have a separate index pattern for each "user", but Elasticsearch does not provide any way of authenticating a user or authorizing a user's access to a specific index. You would have to use some sort of proxy for this.
I recommend http://www.found.no/foundation/elasticsearch-in-production/ and http://www.found.no/foundation/elasticsearch-security/ for a more in-depth explanation.
Create an index for each tenant.
That way you can use a proxy (like the app that hosts Kibana) to intercept the request and return settings that include the index to use.
The value that specifies the index to use can be the logged-in user, or you can get that value somewhere else.
To separate the data even more, you can use a prefix in each index name, and then when you specify an index you can use a pattern to take all the indices related to only a certain kind of data/entities.
Hope this helps.
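To make the prefix idea concrete, a small Python sketch of per-tenant, per-day index names (the tenant id and the logstash- prefix are illustrative assumptions); Kibana would then be pointed at the matching wildcard pattern for each tenant:

from datetime import date

def tenant_index(tenant_id: str, day: date) -> str:
    # e.g. "logstash-acme-2015.04.01"; the per-tenant Kibana pattern
    # would then be "logstash-acme-*".
    return "logstash-{}-{}".format(tenant_id, day.strftime("%Y.%m.%d"))

print(tenant_index("acme", date.today()))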
Elasticsearch announced today a plugin they are working on that should provide security features for the ES product. It will probably contain ways of restricting access based on roles and users set up at the cluster and index level. If this happens, I see no way for them not to extend this security layer to Kibana as well. Also, it seems this plugin will only have a commercial version.
