How to get which indexed values are contained in a field with Elasticsearch? - elasticsearch

I have 500 profiles, with 2 fields: name and description.
Each description can contain other profiles' names. I want to use Elasticsearch to find which profiles are mentioned in each profile's description, so that I can show them as related profiles.
I know how to take one profile's name and search for it in other profiles' descriptions; however, in many cases mentions are not bidirectional.
E.g.:
A mentions B but B does not mention A.
How can I take one profile's description and search for the names it contains, rather than the other way round? Is it possible to do this with Elasticsearch?
Thanks

What you want is essentially a relational join, and that is not possible in ES. So the way to do it is to fetch the description, parse out the profile names it contains, and then make a query for them (see the sketch below). You can also try to explore the new ES Graph plugin.
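For illustration, a minimal sketch of that fetch-parse-query flow, assuming the official Python client and hypothetical index and field names ("profiles", "name", "description"):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# 1. Fetch the profile whose related profiles we want.
doc = es.get(index="profiles", id="profile_a")
description = doc["_source"]["description"]

# 2. Parse out the mentioned names. With only 500 profiles, naively
#    checking every known name against the text is feasible; a real
#    parser (or the percolator) would be more robust.
all_hits = es.search(index="profiles", size=500,
                     query={"match_all": {}})["hits"]["hits"]
mentioned = [hit["_source"]["name"] for hit in all_hits
             if hit["_source"]["name"] in description]

# 3. Retrieve the mentioned profiles in a single terms query.
#    "name.keyword" assumes the default text + keyword multi-field mapping.
related = es.search(index="profiles",
                    query={"terms": {"name.keyword": mentioned}})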

Related

How to make a Country-State-City-like search in Elasticsearch

I want to make a dependent search: when the user types and selects a country, the next dropdown/text search results should come from that particular country's states, and after selecting a state, the next text search should be based only on that selected state. Can anyone help me achieve this kind of thing via Elasticsearch?
I am new to Elasticsearch and have a basic idea of it, but I didn't get how to do this kind of thing, where I need to search from the child and fill data like a map.
First, it is important to understand how Elasticsearch stores its data. You can find this kind of info here: https://www.elastic.co/guide/en/elasticsearch/reference/master/documents-indices.html
So, basically what you need is to build a query with two must terms:
One for the object type (Country, State, etc.).
Another for the name ("Los Angeles", "Massachusetts", etc.). If you want an autocomplete feature, you could add a wildcard query: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-wildcard-query.html
Obs.: when you store your State object, do not forget to store the Country name with it. Since Elasticsearch is non-relational, you have to have the Country name indexed in the State document. See the sketch below.
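For illustration, a minimal sketch of such a query with the official Python client; the index name "locations" and the field names are assumptions:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Find states of the selected country whose name matches what the user
# has typed so far (wildcard for autocomplete-style matching).
resp = es.search(index="locations", query={
    "bool": {
        "must": [
            {"term": {"type": "state"}},        # the object type
            {"term": {"country": "usa"}},       # parent country stored on the doc
            {"wildcard": {"name": "massach*"}}  # partial name typed by the user
        ]
    }
})
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["name"])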
Hope that helps.

Good way to exclude records in Solr or Elasticsearch

For a matchmaking portal, we have a requirement wherein, if a customer has viewed the complete profile details of a bride or groom, we have to exclude that profile from further search results. Currently, along with the other details, we store the viewed profile IDs as a comma-separated field against that bride's or groom's record.
E.g., if A viewed B, then in B's record, under the field saw_me, we will add A (comma separated).
While searching, say the currently searching member's ID is 123456; then we fire a query like
Select * from profiledetails where (OTHER CON) AND 123456 not in saw_me;
The problem here is that the saw_me field value keeps growing. Is there any better way to handle this requirement? Please guide.
If this is using Solr:
First, DON'T add the 'AND NOT ...' clauses along with the main query in the q param; add them to fq instead. This has many benefits (the fq will be cached).
Until the list of values reaches maybe the thousands, this approach is simple and should work fine.
After you reach the point where the list is huge, it may be time to move to a post filter with a high cost (so it is looked up last). This would look up the docs to remove in an external source (Redis, a DB, ...). A minimal sketch of the fq approach follows.
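For illustration, a sketch of the fq approach, querying Solr over HTTP from Python; the core name "profiledetails" and the saw_me field come from the question, the rest are placeholders:

import requests

params = {
    "q": "*:*",              # your main query (the OTHER CON part)
    "fq": "-saw_me:123456",  # the exclusion as a filter query, so it is cached
    "wt": "json",
}
resp = requests.get("http://localhost:8983/solr/profiledetails/select",
                    params=params)
print(resp.json()["response"]["numFound"])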
In my opinion, no matter how much the saw_me field grows, it will not make much difference in search time. Tokens are stored in an inverted index, and doc_values are created at index time in a column-major fashion for efficient reads, with caching support from the OS. ES handles these things for you efficiently.

How to add a (not set) value to an include filter in Google Analytics

I want to add a location filter to the view that only includes hits where the city is "Mumbai". However, in the data, the city sometimes shows as "(not set)", and I don't want to exclude that data.
I tried creating a custom include filter with the filter pattern "Mumbai|(not set)". It doesn't work: it filters for Mumbai but not for the unknown locations.
How can I have a geographical view and not ignore the hits from unknown locations?
You could use the filter pattern ga:city!=(not set).
Test your filter with the Query Explorer and check out the filter reference.
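If you are instead pulling the data through the Core Reporting API (v3), note that a comma in the filters expression means OR, so you can include both Mumbai and the unknown locations. A hedged sketch in Python, with the view ID and token as placeholders:

import requests

params = {
    "ids": "ga:XXXXXXXX",                             # your view (profile) ID
    "start-date": "30daysAgo",
    "end-date": "today",
    "metrics": "ga:sessions",
    "dimensions": "ga:city",
    "filters": "ga:city==Mumbai,ga:city==(not set)",  # comma = OR
    "access_token": "YOUR_OAUTH_TOKEN",
}
resp = requests.get("https://www.googleapis.com/analytics/v3/data/ga",
                    params=params)
print(resp.json())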
Good luck!

Elasticsearch with multiple search criteria

I am trying to build a full-text search engine using Elasticsearch. We have an application with conferences running across the globe, and we have data for both future and past conferences. For a POC we have already loaded the conference details into Elasticsearch; each document contains fields like title, date, venue, and the geo_location of the venue.
I am able to do a simple search using a match_all query. Also, by using function_score I can get the currently ongoing conferences, and using the user's geo location I can get conferences near the user's location.
But there are some use cases where I got stuck and could not proceed. The use cases are:
1) If the user searches with "title + location", I should not use the user's current geo location; rather, I should use the geo location of whatever city_name the user provided and retrieve those docs. I know some programming is also required here.
2) The user searches with "title + year", e.g. cardio 2014. The user is interested in seeing all the cardiology conferences of 2014, so it should retrieve only that year's documents. But using function_score it retrieves the current year's documents.
First of all, let me know whether the above two use cases can be handled in a single query. I am thinking of handling them in one request, but got stuck.
A proper solution would require you to write your own query parser in your application (outside of Elasticsearch) that parses the query and extracts dates, locations, etc. Once all features are extracted, the parser should generate a bool query where each feature becomes an appropriate must clause. So, the date becomes a range query, the location a geo query (e.g., geo_distance), and everything else goes into a match query for full-text matching. Then this query can be sent to Elasticsearch.
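A minimal sketch of that parse-then-query idea in Python: the parser is deliberately naive, the field names (title, date, geo_location) come from the question, and everything else (index name, city lookup) is an assumption:

import re
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# A hypothetical lookup from known city names to coordinates.
CITY_COORDS = {"paris": {"lat": 48.85, "lon": 2.35}}

def build_query(text):
    must = []
    # Feature 1: a four-digit year becomes a range clause on the date.
    m = re.search(r"\b(19|20)\d{2}\b", text)
    if m:
        year = m.group(0)
        must.append({"range": {"date": {"gte": f"{year}-01-01",
                                        "lte": f"{year}-12-31"}}})
        text = text.replace(year, "")
    # Feature 2: a known city name becomes a geo_distance clause.
    for city, coords in CITY_COORDS.items():
        if city in text.lower():
            must.append({"geo_distance": {"distance": "50km",
                                          "geo_location": coords}})
            text = re.sub(city, "", text, flags=re.IGNORECASE)
            break
    # Whatever remains is matched against the title as full text.
    if text.strip():
        must.append({"match": {"title": text.strip()}})
    return {"bool": {"must": must}}

resp = es.search(index="conferences", query=build_query("cardio 2014"))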

Elasticsearch not searching some fields

I have just updated a website; the update adds new fields to Elasticsearch.
In my dev environment it all works fine, but on the live site the new fields are not being found.
E.g., I have added a new field with the value 1.
However, when adding a filtered query of
{"field":1}
It does not find any matching results.
When I look in the documents, I can see docs with the field set to 1
Could the reason for this be that the new field was added after the mapping was set? I am not all that familiar with Elasticsearch, so I am not really sure where to start looking to fix it.
Any help would be appreciated.
Update:
Querying from the URL shows nothing either:
_search/?pretty=true&size=50&q=field1:*
However, there is another field that was added at the same time which I can search on.
I can see field1 in the result set, but it just won't let me search on it.
The only difference I see in the mapping is that the field that works is of type long, whereas the one that doesn't is of type string.
Is it a length issue with the ngram? What is your "min_gram" setting?
You can check your index settings like this:
GET <host>/<index_name>/_settings
Does it work when you filter for a two digit field?
Are all the field values one digit?
It's OK to add a field after the mapping was set; Elasticsearch will guess the mapping for you. (In fact, it's one of its selling features: no need to define the mapping, just throw the data at it.)
There are a few things that can go wrong:
Verify that the data is actually in the index. To do that, just navigate to the _search URL with no parameters; you should see the field if it is indexed.
Look at your mapping. Could it be that the field is explicitly set not to be indexed?
Another possibility is that your query is wrong (but that is unlikely, since you say it works in the development environment).
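A quick sketch of the first two checks with the official Python client; the index name is a placeholder:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# 1. Verify the data is actually in the index.
print(es.search(index="myindex", query={"match_all": {}}, size=1))

# 2. Inspect the mapping: is the field there, and is it set to be indexed?
print(es.indices.get_mapping(index="myindex"))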
