Need concrete documentation / examples of building complex index using NEST ElasticSearch library - elasticsearch

I would like to use the NEST library's Fluent interface to create an index, which involves setting up custom filters, analyzers, and type mappings. I would like to avoid decorating my classes with NEST-specific annotations.
I have seen the documentation at http://nest.azurewebsites.net/indices/create-indices.html and http://nest.azurewebsites.net/indices/put-mapping.html. This documentation, while showing some examples, is not complete enough to help me figure out how to use the Fluent API to build some complex indexing scenarios.
I have found the tutorial at http://euphonious-intuition.com/2012/08/more-complicated-mapping-in-elasticsearch/ to be quite helpful; some code showing how to build the filters, analyzers and mappings in this tutorial via the NEST Fluent interface in place of the straight JSON would be a great answer to this question.

The more specific you can be with your question, the better the answers you will receive. Nevertheless, here is an index that sets up a custom analyzer (with a lowercase filter) and an EdgeNGram tokenizer, and then uses them to build an autocomplete field on the Name property of a Tag class.
public class Tag
{
    public string Name { get; set; }
}
Nest.IElasticClient client = null; // Connect to ElasticSearch
var createResult = client.CreateIndex(indexName, index => index
    .Analysis(analysis => analysis
        .Analyzers(a => a
            .Add(
                "autocomplete",
                new Nest.CustomAnalyzer()
                {
                    Tokenizer = "edgeNGram",
                    Filter = new string[] { "lowercase" }
                }
            )
        )
        .Tokenizers(t => t
            .Add(
                "edgeNGram",
                new Nest.EdgeNGramTokenizer()
                {
                    MinGram = 1,
                    MaxGram = 20
                }
            )
        )
    )
    .AddMapping<Tag>(tmd => tmd
        .Properties(props => props
            .MultiField(p => p
                .Name(t => t.Name)
                .Fields(tf => tf
                    .String(s => s
                        .Name(t => t.Name)
                        .Index(Nest.FieldIndexOption.not_analyzed)
                    )
                    .String(s => s
                        .Name(t => t.Name.Suffix("autocomplete"))
                        .Index(Nest.FieldIndexOption.analyzed)
                        .IndexAnalyzer("autocomplete")
                    )
                )
            )
        )
    )
);
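After creating the index, you index Tag documents as usual. A minimal sketch, assuming the same client and indexName as above and the NEST 1.x-style syntax used in this answer (the tag value is just an illustrative placeholder):
// index a single Tag document into the autocomplete-enabled index
var tag = new Tag { Name = "elasticsearch" };
var indexResult = client.Index(tag, i => i.Index(indexName));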
There is also a fairly complete mapping example in NEST's unit test project on github.
https://github.com/elasticsearch/elasticsearch-net/blob/develop/src/Tests/Nest.Tests.Unit/Core/Map/FluentMappingFullExampleTests.cs
Edit:
To query the index, do something like the following:
string queryString = ""; // search string
var results = client.Search<Tag>(s => s
    .Query(q => q
        .Text(tq => tq
            .OnField(t => t.Name.Suffix("autocomplete"))
            .QueryString(queryString)
        )
    )
);

Related

How to add conditional properties for index creation in elasticsearch nest?

I want to create an index with some conditions, similar to using a QueryContainer to add conditional filters.
PropertiesDescriptor<object> ps = new PropertiesDescriptor<object>();
if (condition)
{
    ps.Text(s => s.Name(name[1]));
}
if (condition)
{
    ps.Number(s => s.Name(name[1]));
}
if (!_con.client.Indices.Exists(indexname).Exists)
{
    var createIndexResponse = _con.client.Indices.Create(indexname, index => index
        .Settings(s => s.NumberOfShards(1).NumberOfReplicas(0))
        .Map(m => m.Properties(ps)));
}
But I receive the following error; can you guide me on how to achieve this?
cannot convert from 'Nest.PropertiesDescriptor<object>' to 'System.Func<Nest.PropertiesDescriptor<object>, Nest.IPromise<Nest.IProperties>>'
You are almost there; just change the Properties part to m.Properties(p => ps).
_con.client.Indices.Create(indexname, index => index
    .Settings(s => s.NumberOfShards(1).NumberOfReplicas(0))
    .Map(m => m.Properties(p => ps)));
Hope that helps.
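Putting it together, a minimal sketch of the conditional mapping; the Doc POCO, its Title property, the useText flag, and the index name below are hypothetical placeholders:
// build the properties conditionally, then hand the descriptor to Map via a
// lambda so it satisfies Func<PropertiesDescriptor<T>, IPromise<IProperties>>
var ps = new PropertiesDescriptor<Doc>();
if (useText)
    ps.Text(t => t.Name(d => d.Title));
else
    ps.Number(n => n.Name(d => d.Title));

if (!client.Indices.Exists("my-index").Exists)
{
    var createIndexResponse = client.Indices.Create("my-index", c => c
        .Settings(s => s.NumberOfShards(1).NumberOfReplicas(0))
        .Map<Doc>(m => m.Properties(_ => ps)));
}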

Adding FunctionScore/FieldValueFactor to a MultiMatch query

We've got a pretty basic query we're using to let users provide query text, which then boosts matches on different fields. Now we want to add another boost based on votes, but we're not sure where to nest the FunctionScore.
Our original query is:
var results = await _ElasticClient.SearchAsync<dynamic>(s => s
    .Query(q => q
        .MultiMatch(mm => mm
            .Fields(f => f
                .Field("name^5")
                .Field("hobbies^2")
            )
            .Query(queryText)
        )
    )
);
If I try to nest in FunctionScore around the MultiMatch, it basically ignores the query/fields, and just returns everything in the index:
var results = await _ElasticClient.SearchAsync<dynamic>(s => s
    .Query(q => q
        .FunctionScore(fs => fs
            .Query(q2 => q2
                .MultiMatch(mm => mm
                    .Fields(f => f
                        .Field("name^5")
                        .Field("hobbies^2")
                    )
                    .Query(queryText)
                )
            )
        )
    )
);
My expectation is that since I'm not providing any functions, this should do basically the same thing as the query above. Then, adding functions to the FunctionScore will boost the results based on the functions I give it (in my case, boosting based on the votes field via FieldValueFactor).
The documentation around this is a little fuzzy, particularly for certain combinations like MultiMatch, FunctionScore, and query text. I did find this answer, but it doesn't cover the case where query text is included.
I'm pretty sure it boils down to my still-foggy understanding of how Elastic queries work, but I'm just not finding much that covers what I would think is a pretty common scenario:
A user entering a query
Boosting matches of that query with certain fields
Boosting all results based on the value of a numeric field
Your function_score query is correct, but the reason you are not seeing the results you expect is a feature in NEST called conditionless queries. A function_score query is considered conditionless when it has no functions, so the query is omitted from the serialized form sent in the request.
The easiest way to see this is with a small example:
using System;
using System.Text;
using Elasticsearch.Net;
using Nest;

private static void Main()
{
    var defaultIndex = "my-index";
    var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
    var settings = new ConnectionSettings(pool, new InMemoryConnection())
        .DefaultIndex(defaultIndex)
        .DisableDirectStreaming()
        .PrettyJson()
        .OnRequestCompleted(callDetails =>
        {
            if (callDetails.RequestBodyInBytes != null)
            {
                Console.WriteLine(
                    $"{callDetails.HttpMethod} {callDetails.Uri} \n" +
                    $"{Encoding.UTF8.GetString(callDetails.RequestBodyInBytes)}");
            }
            else
            {
                Console.WriteLine($"{callDetails.HttpMethod} {callDetails.Uri}");
            }

            Console.WriteLine();

            if (callDetails.ResponseBodyInBytes != null)
            {
                Console.WriteLine($"Status: {callDetails.HttpStatusCode}\n" +
                    $"{Encoding.UTF8.GetString(callDetails.ResponseBodyInBytes)}\n" +
                    $"{new string('-', 30)}\n");
            }
            else
            {
                Console.WriteLine($"Status: {callDetails.HttpStatusCode}\n" +
                    $"{new string('-', 30)}\n");
            }
        });

    var client = new ElasticClient(settings);
    var queryText = "query text";

    var results = client.Search<dynamic>(s => s
        .Query(q => q
            .FunctionScore(fs => fs
                .Query(q2 => q2
                    .MultiMatch(mm => mm
                        .Fields(f => f
                            .Field("name^5")
                            .Field("hobbies^2")
                        )
                        .Query(queryText)
                    )
                )
            )
        )
    );
}
which emits the following request
POST http://localhost:9200/my-index/object/_search?pretty=true&typed_keys=true
{}
You can disable the conditionless feature by marking a query as Verbatim
var results = client.Search<dynamic>(s => s
    .Query(q => q
        .FunctionScore(fs => fs
            .Verbatim() // <-- send the query *exactly as is*
            .Query(q2 => q2
                .MultiMatch(mm => mm
                    .Fields(f => f
                        .Field("name^5")
                        .Field("hobbies^2")
                    )
                    .Query(queryText)
                )
            )
        )
    )
);
This now sends the query
POST http://localhost:9200/my-index/object/_search?pretty=true&typed_keys=true
{
  "query": {
    "function_score": {
      "query": {
        "multi_match": {
          "query": "query text",
          "fields": [
            "name^5",
            "hobbies^2"
          ]
        }
      }
    }
  }
}
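Once you add the functions you actually want, the function_score query is no longer conditionless, so Verbatim() becomes unnecessary. A sketch of the votes-based boost with FieldValueFactor; the factor, modifier, missing value, and boost mode below are illustrative assumptions rather than values from the question:
var results = client.Search<dynamic>(s => s
    .Query(q => q
        .FunctionScore(fs => fs
            .Query(q2 => q2
                .MultiMatch(mm => mm
                    .Fields(f => f
                        .Field("name^5")
                        .Field("hobbies^2")
                    )
                    .Query(queryText)
                )
            )
            .Functions(fu => fu
                // boost each hit by a function of its votes field
                .FieldValueFactor(fvf => fvf
                    .Field("votes")
                    .Factor(1.2)
                    .Modifier(FieldValueFactorModifier.Log1P)
                    .Missing(0)
                )
            )
            // multiply the query score by the function score
            .BoostMode(FunctionBoostMode.Multiply)
        )
    )
);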

Nest DeleteByQuery without the Object name

I want to send a NEST delete-by-query request to Elasticsearch without specifying the document class, which I don't have. I've seen solutions like:
var response = elasticClient.DeleteByQuery<MyClass>(q => q
    .Match(m => m.OnField(f => f.Guid).Equals(someObject.Guid))
);
From: DeleteByQuery using NEST and ElasticSearch
As I'm just reading plain text from a queue, I don't have access to the MyClass object to use with the delete request. Basically I just want to delete all documents in an index (whose name I know) where a field matches, for example orgId = 1234. Something like:
var response = client.DeleteByQuery<string>(q => q
    .Index(indexName)
    .AllTypes()
    .Routing(route)
    .Query(rq => rq
        .Term("orgId", "1234"))
);
I see that the NEST IElasticClient interface does have a DeleteByQuery method that doesn't require the mapping object, but I'm just not sure how to use it.
You can just specify object as the document type T for DeleteByQuery<T>; just be sure to explicitly provide the index name and type name to target in this case. T is used to provide strongly typed access within the body of the request only. For example,
var client = new ElasticClient();

var deleteByQueryResponse = client.DeleteByQuery<object>(d => d
    .Index("index-name")
    .Type("type-name")
    .Query(q => q
        .Term("orgId", "1234")
    )
);
will generate the following query:
POST http://localhost:9200/index-name/type-name/_delete_by_query
{
  "query": {
    "term": {
      "orgId": {
        "value": "1234"
      }
    }
  }
}
Replace _delete_by_query with _search in the URI first, to ensure you're targeting the expected documents :)
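For example, a sketch of that sanity check, reusing the same (hypothetical) index and type names:
// run the same term query against _search first to preview which
// documents the delete-by-query would remove
var previewResponse = client.Search<object>(s => s
    .Index("index-name")
    .Type("type-name")
    .Query(q => q
        .Term("orgId", "1234")
    )
);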

NEST Search not finding any results

I just found out about NEST. I have already inserted a number of documents into Elasticsearch. Now I want to search the data based on my type's subscribeId. It works just fine when I run the query through curl, but when I try it using NEST, no results are found.
My curl which work:
http://localhost:9200/20160902/_search?q=subscribeId:aca0ca1a-c96a-4534-ab0e-f844b81499b7
My NEST code:
var local = new Uri("http://localhost:9200");
var settings = new ConnectionSettings(local);
var elastic = new ElasticClient(settings);

var response = elastic.Search<IntegrationLog>(s => s
    .Index(DateTime.Now.ToString("yyyyMMdd"))
    .Type("integrationlog")
    .Query(q => q
        .Term(p => p.SubscribeId, new Guid("aca0ca1a-c96a-4534-ab0e-f844b81499b7"))
    )
);
Can someone point out what I did wrong?
A key difference between your curl request and your NEST query is that the former uses a query_string query and the latter a term query. A query_string query input undergoes analysis at query time whilst a term query input does not, so depending on how subscribeId is analyzed (or not), you may see different results. Additionally, your curl request is searching across all document types within the index 20160902.
To perform the exact same query in NEST as your curl request:
using System;
using Elasticsearch.Net;
using Nest;

void Main()
{
    var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
    var connectionSettings = new ConnectionSettings(pool)
        // set up NEST with the convention to use the type name
        // "integrationlog" for the IntegrationLog POCO type
        .InferMappingFor<IntegrationLog>(m => m
            .TypeName("integrationlog")
        );

    var client = new ElasticClient(connectionSettings);

    var searchResponse = client.Search<IntegrationLog>(s => s
        .Index("20160902")
        // search across all types. Note that documents found
        // will be deserialized into instances of the IntegrationLog type
        .AllTypes()
        .Query(q => q
            // use a query_string query
            .QueryString(qs => qs
                .Fields(f => f
                    .Field(ff => ff.SubscribeId)
                )
                .Query("aca0ca1a-c96a-4534-ab0e-f844b81499b7")
            )
        )
    );
}

public class IntegrationLog
{
    public Guid SubscribeId { get; set; }
}
This yields
POST http://localhost:9200/20160902/_search
{
  "query": {
    "query_string": {
      "query": "aca0ca1a-c96a-4534-ab0e-f844b81499b7",
      "fields": [
        "subscribeId"
      ]
    }
  }
}
This specifies the query_string query in the body of the request, which is analogous to using the q query string parameter to specify the query.
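As an aside, if subscribeId is mapped as a not-analyzed (keyword) field, a term query in NEST can also match the raw value. A sketch under that mapping assumption:
// term query against the un-analyzed field value; this only matches if the
// subscribeId mapping is not analyzed (e.g. a keyword field)
var termResponse = client.Search<IntegrationLog>(s => s
    .Index("20160902")
    .AllTypes()
    .Query(q => q
        .Term(t => t
            .Field(f => f.SubscribeId)
            .Value("aca0ca1a-c96a-4534-ab0e-f844b81499b7")
        )
    )
);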

Example of how to use synonyms in NEST

I haven't found a solid example of how to create and use synonyms using NEST for Elasticsearch. If anyone has one, it would be helpful.
My attempt looks like this, but I don't know how to apply it to a field.
var syn = new SynonymTokenFilter
{
    Synonyms = new[] { "pink, p!nk => pink", "lil, little", "ke$ha, kesha => ke$ha" },
    IgnoreCase = true,
    Tokenizer = "standard"
};

client.CreateIndex("myindex", i =>
{
    i
        .Analysis(a => a
            .Analyzers(an => an
                .Add("fullTermCaseInsensitive", fullTermCaseInsensitive)
            )
            .TokenFilters(x => x
                .Add("synonym", syn)
            )
        )
        ...
It's very simple :) You first need to define the synonym token filter, then you can use it in your custom analyzer, where you can also add other types of filters.
A small example:
.Analysis(descriptor => descriptor
    .Analyzers(bases => bases
        .Add("folded_word", new CustomAnalyzer()
            {
                Filter = new List<string> { "icu_folding", "trim", "synonym" },
                Tokenizer = "standard"
            }
        )
    )
    .TokenFilters(i => i
        .Add("synonym", new SynonymTokenFilter()
            {
                SynonymsPath = "analysis/synonym.txt",
                Format = "Solr"
            }
        )
    )
Then you can use the custom analyzer in the mapping part
Assuming your fullTermCaseInsensitive analyzer is custom, you need to add your synonym filter to it:
var fullTermCaseInsensitive = new CustomAnalyzer()
{
    .
    .
    .
    Filter = new string[] { "syn" }
};
And upon creating your index, you can add a mapping and apply the fullTermCaseInsensitive analyzer to your field(s):
client.CreateIndex("myindex", c => c
.Analysis(a => a
.Analyzers(an => an.Add("fullTermCaseInsensitive", fullTermCaseInsensitive))
.TokenFilters(tf => tf.Add("syn", syn)))
.AddMapping<MyType>(m => m
.Properties(p => p
.String(s => s.Name(t => t.MyField).Analyzer("fullTermCaseInsensitive")))));
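As a quick check that the synonyms are applied, a hedged sketch in the same NEST 1.x-style syntax: with fullTermCaseInsensitive used at search time as well, querying one synonym should match documents containing the other.
// searching for "p!nk" should match documents indexed with "pink",
// because the synonym filter runs in the analyzer at query time too
var searchResponse = client.Search<MyType>(s => s
    .Index("myindex")
    .Query(q => q
        .Match(m => m
            .OnField(t => t.MyField)
            .Query("p!nk")
        )
    )
);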
