Abstract object not mapped correctly in Elasticsearch using Nest 7.0.0-alpha1 - elasticsearch

I am using NEST (.NET 4.8) to import my data, and I have a problem getting the mapping to work in NEST 7.0.0-alpha1.
I have the following class structure:
class LinkActor
{
public Actor Actor { get; set; }
}
abstract class Actor
{
public string Description { get; set; }
}
class Person : Actor
{
public string Name { get; set; }
}
I connect to Elasticsearch this way:
var connectionSettings = new ConnectionSettings(new Uri(connection));
connectionSettings.DefaultIndex(indexName);
var client = new ElasticClient(connectionSettings);
The actual data looks like this:
var personActor = new Person
{
Description = "Description",
Name = "Name"
};
var linkActor = new LinkActor
{
Actor = personActor
};
And the data is indexed like this:
result = client.IndexDocument(linkActor);
Using NEST 6.6 I am getting the following data in Elasticsearch 6.5.2:
"actor": {
"name": "Name",
"description": "Description"
}
However when using NEST 7.0.0-alpha1 I get the following data in Elasticsearch 7.0.0:
"actor": {
"description": "Description"
}
So the data from the concrete class is missing. I am obviously missing or not understanding some new mapping feature, but my attempts with AutoMap have failed:
client.Map<LinkActor>(m => m.AutoMap()); // also attempted with Actor and Person
Is it still possible to map the data from the concrete class in NEST 7.0.0-alpha1?

I found a workaround using NEST.JsonNetSerializer (remember to install this package), which allows me to pass a JObject directly:
Connect to Elasticsearch using a connection pool so you can pass JsonNetSerializer.Default:
var pool = new SingleNodeConnectionPool(new Uri(connection));
var connectionSettings = new ConnectionSettings(pool, JsonNetSerializer.Default);
connectionSettings.DefaultIndex(indexName);
var client = new ElasticClient(connectionSettings);
Convert the linkActor object from above to a JObject (the JsonSerializerSettings are omitted here for clarity; add them to get camel casing, as shown in the sketch after the result below):
var linkActorSerialized = JsonConvert.SerializeObject(linkActor);
var linkActorJObject = JObject.Parse(linkActorSerialized);
result = client.IndexDocument(linkActorJObject);
This gives the desired result:
"actor": {
"name": "Name",
"description": "Description"
}
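For reference, the camel casing mentioned above (needed to get the lowercase field names shown in the result) can be added through Json.NET serializer settings. A minimal sketch, assuming the standard CamelCasePropertyNamesContractResolver, since the original omits the settings:
// Assumption: camel casing via Json.NET's CamelCasePropertyNamesContractResolver
// (requires the Newtonsoft.Json, Newtonsoft.Json.Linq and Newtonsoft.Json.Serialization namespaces)
var serializerSettings = new JsonSerializerSettings
{
    ContractResolver = new CamelCasePropertyNamesContractResolver()
};
var linkActorSerialized = JsonConvert.SerializeObject(linkActor, serializerSettings);
var linkActorJObject = JObject.Parse(linkActorSerialized);
result = client.IndexDocument(linkActorJObject);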
It is a workaround; hopefully someone will be able to explain the mapping behaviour asked about in the question.

Related

HotChocolate (GraphQL) schema first approach on complex type

I'm a novice with HotChocolate and I'm trying to PoC some simple usage.
I've created a very simple .graphql file:
#camera.graphql
type Camera {
id: ID!
name: String!
}
type Query {
getCamera: Camera!
}
And a very simple .NET code for camera wrapping:
public class QlCamera
{
public static QlCamera New()
{
return new QlCamera
{
Id = Guid.NewGuid().ToString(),
Name = Guid.NewGuid().ToString()
};
}
public string Id { get; set; }
public string Name { get; set; }
}
as well as such for schema creation:
public void CreateSchema()
{
string path = System.IO.Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
var smBuilder = SchemaBuilder.New();
smBuilder.AddDocumentFromFile(path + "/GraphQL/camera.graphql");
smBuilder.AddResolver("Query", "getCamera", () => QlCamera.New());
var schema = smBuilder.Create();
}
On the last line, however, I get an exception:
HotChocolate.SchemaException: 'Multiple schema errors occured:
The field Camera.id has no resolver. - Type: Camera
The field Camera.name has no resolver. - Type: Camera
'
I've tried to create:
public class QlCameraType : ObjectType<QlCamera>
{
protected override void Configure(IObjectTypeDescriptor<QlCamera> descriptor)
{
descriptor.Name("Camera");
descriptor.Field(t => t.Id).Type<NonNullType<StringType>>();
descriptor.Field(t => t.Name).Type<StringType>();
}
}
and to replace
smBuilder.AddResolver("Query", "getCamera", () => QlCamera.New());
with
smBuilder.AddResolver("Query", "getCamera", () => new QlCameraType());
But I continue to get the same exception.
Obviously I am missing something here, but I cannot understand what exactly.
Could someone explain what I am missing?
(I've been through the documentation a few times, but I cannot find any relevant help there.)
As the exception clearly states, there are no resolvers bound for the particular fields ("id" and "name") of the "Camera" type/object.
So they just have to be added with:
smBuilder.AddResolver("Camera", "id", rc => rc.Parent<QlCamera>().Id);
smBuilder.AddResolver("Camera", "name", rc => rc.Parent<QlCamera>().Name);
And that is it.
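To verify that the resolvers are wired up, you can execute a query directly against the schema. A minimal sketch, assuming a HotChocolate version from that era that exposes MakeExecutable and Execute (these names are my assumption, not taken from the question):
// Assumption: HotChocolate's executable-schema API of that era
var executor = schema.MakeExecutable();
var result = executor.Execute("{ getCamera { id name } }");
// result.Errors should now be empty and the camera fields populated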

More Like This Query Not Getting Serialized - NEST

I am trying to create an Elasticsearch more like this (MLT) query using NEST's object initializer syntax. However, when the final query is serialized, the MLT part is the ONLY part missing; every other query is present.
When inspecting the query object, the MLT is present. It's just not getting serialized.
I wonder what I may be doing wrong.
I also noticed that it works when I add Fields, but I don't believe Fields is a mandatory property whose absence should cause the MLT query to be ignored.
The MLT query is initialized like this:
new MoreLikeThisQuery
{
Like = new[]
{
new Like(new MLTDocProvider
{
Id = parameters.Id
}),
}
}
MLTDocProvider implements the ILikeDocument interface.
I expect the serialized query to contain the MLT part, but it is the only part that is missing.
This looks like a bug in the conditionless behaviour of the more like this query in NEST; I've opened an issue to address it. In the meantime, you can get the desired behaviour by marking the MoreLikeThisQuery as verbatim, which will override NEST's conditionless behaviour:
var client = new ElasticClient();
var parameters = new
{
Id = 1
};
var searchRequest = new SearchRequest<Document>
{
Query = new MoreLikeThisQuery
{
Like = new[]
{
new Like(new MLTDocProvider
{
Id = parameters.Id
}),
},
IsVerbatim = true
}
};
var searchResponse = client.Search<Document>(searchRequest);
which serializes as
{
"query": {
"more_like_this": {
"like": [
{
"_id": 1
}
]
}
}
}
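As an aside, a handy way to confirm what NEST actually puts on the wire (useful when part of a query silently disappears like this) is to disable direct streaming so the request bytes are buffered and can be read back afterwards. A minimal sketch using the standard NEST 6.x/7.x debugging settings (the URL and Document type are placeholders):
// Buffer request/response bytes so they can be inspected after the call
var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
    .DisableDirectStreaming();
var client = new ElasticClient(settings);
var searchResponse = client.Search<Document>(searchRequest);
// The exact JSON body NEST sent to Elasticsearch
var requestJson = System.Text.Encoding.UTF8.GetString(searchResponse.ApiCall.RequestBodyInBytes);
Console.WriteLine(requestJson);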

In Nest 1.7.1 Delete or DeleteByQuery nothing works

In NEST 1.7.1, neither Delete nor DeleteByQuery works for me.
I am trying to delete the documents below:
Article article1 = new Article()
{
Id = 1111,
Title = "Title - Test Elastic Search",
Summary = "Summary - Test Elastic Search",
Body = "Body - Test Elastic Search",
ArticleDate = _dateToday,
Author = new Author() { Id = 100, Name = "Mikey" },
};
Article article2 = new Article()
{
Id = 2222,
Title = "Title - Test Elastic Search",
Summary = "Summary - Test Elastic Search",
Body = "Body - Test Elastic Search",
ArticleDate = _dateToday,
Author = new Author() { Id = 100, Name = "Mikey" },
Published = true
};
I was expecting the queries below to delete a single document and all documents in an index, but none of them delete anything.
_elasticClient.Delete(article).Found;
_elasticClient.DeleteByQuery<Article>(q => q.Query(t => t.Term(m => m.OnField(f => f.Id).Value(articleId))))
.Found;
_elasticClient.DeleteByQuery<Article>(q => q.MatchAll()).IsValid;
Please correct me if I am doing anything wrong.
Here's a working example:
void Main()
{
var settings = new ConnectionSettings(new Uri("http://localhost:9200"), "articles");
var client = new ElasticClient(settings);
if (client.IndexExists("articles").Exists)
{
client.DeleteIndex("articles");
}
client.CreateIndex("articles", c => c
.AddMapping<Article>(m => m
.MapFromAttributes()
)
);
var today = DateTime.Now.Date;
var article1 = CreateArticle(1111, today);
var article2 = CreateArticle(2222, today);
var article3 = CreateArticle(3333, today);
var article4 = CreateArticle(4444, today);
var bulkRequest = new BulkDescriptor();
bulkRequest.Index<Article>(i => i.Document(article1));
bulkRequest.Index<Article>(i => i.Document(article2));
bulkRequest.Index<Article>(i => i.Document(article3));
bulkRequest.Index<Article>(i => i.Document(article4));
bulkRequest.Refresh();
client.Bulk(bulkRequest);
var searchResponse = client.Search<Article>(q => q.MatchAll());
Console.WriteLine("Documents from search: {0}. Expect 4", searchResponse.Documents.Count());
client.Delete(article1, d => d.Refresh());
searchResponse = client.Search<Article>(q => q.MatchAll());
Console.WriteLine("Documents from search {0}. Expect 3", searchResponse.Documents.Count());
client.Delete(article2, d => d.Refresh());
searchResponse = client.Search<Article>(q => q.MatchAll());
Console.WriteLine("Documents from search {0}. Expect 2", searchResponse.Documents.Count());
client.DeleteByQuery<Article>(q => q.MatchAll());
searchResponse = client.Search<Article>(q => q.MatchAll());
Console.WriteLine("Documents from search {0}. Expect 0", searchResponse.Documents.Count());
}
private Article CreateArticle(int id, DateTime articleDate)
{
return new Article()
{
Id = id,
Title = "Title - Test Elastic Search",
Summary = "Summary - Test Elastic Search",
Body = "Body - Test Elastic Search",
ArticleDate = articleDate,
Author = new Author() { Id = 100, Name = "Mikey" },
Published = true
};
}
public class Article
{
public int Id { get; set;}
public string Title{ get; set;}
public string Summary { get; set;}
public string Body { get; set;}
public DateTime ArticleDate { get; set; }
public Author Author { get; set; }
public bool Published { get; set;}
}
public class Author
{
public int Id { get; set; }
public string Name { get; set;}
}
results in
Documents from search: 4. Expect 4
Documents from search 3. Expect 3
Documents from search 2. Expect 2
Documents from search 0. Expect 0
as expected.
Something to bear in mind is that Elasticsearch is eventually consistent, meaning that an indexed document does not appear in search results until after a refresh interval (by default, 1 second); likewise, with a delete query, a document marked for deletion will still appear in search results until the refresh interval has elapsed.
A GET request for a document with a given id will, however, return the document before the refresh interval has passed.
If you need documents to be searchable immediately (or to not show in search results right after deletion), you can refresh the index after an operation, as I did with the bulk and delete calls above using .Refresh(). You might be tempted to call refresh after every operation, but I would recommend using it only when you really need to, as it adds overhead to the cluster and calling it all the time would likely degrade performance.
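For example, an explicit index refresh and a by-id GET in NEST 1.x look roughly like this; a sketch only, since the exact 1.x descriptor signatures are my assumption:
// Assumption: NEST 1.x style Refresh and Get calls
client.Refresh(r => r.Index("articles"));               // force a refresh instead of waiting for the interval
var getResponse = client.Get<Article>(g => g.Id(1111)); // a GET by id returns the document even before a refresh
var article = getResponse.Source;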
I finally got it working.
The delete request sent via NEST (seen in Fiddler) is DELETE /articlestest/article/_query, while the query that worked in the plugin is DELETE /articlestest/articles/_query (the document type name was misspelled in the code). That's the reason the query was not deleting the documents via NEST. The bad part is that it doesn't even complain about the non-existent document type :( It took me a while to find that issue.

MongoDB: how to update an element in an array using Spring Query and Update

In my project I'm using Spring Boot 1.3.2 and org.springframework.data.mongodb.core.query.*.
I'm trying to update an element in an array. In my main object I have an array that looks like this:
"sections" : [
{
"sectionId" : "56cc3c908f5e6c56e677bd2e",
"name" : "Wellcome"
},
{
"sectionId" : "56cc3cd28f5e6c56e677bd2f",
"name" : "Hello my friends"
}
]
Using Spring I want to update the name of the record with sectionId 56cc3c908f5e6c56e677bd2e.
I tried to do it like this, but it didn't work:
Query query = Query.query(Criteria
.where("sections")
.elemMatch(
Criteria.where("sectionId").is(editedSection.getId())
)
);
Update update = new Update().set("sections", new BasicDBObject("sectionId", "56cc3c908f5e6c56e677bd2e").append("name","Hi there"));
mongoTemplate.updateMulti(query, update, Offer.class);
It creates something like:
"sections" : {
"sectionId" : "56cc3c908f5e6c56e677bd2e",
"name" : "Hi there"
}
But the result above is an object { }; I want an array [ ], and I don't want it to remove the other elements.
Can anybody help me update the name of the record with sectionId 56cc3c908f5e6c56e677bd2e using Spring?
You essentially want to replicate this mongo shell update operation:
db.collection.update(
{ "sections.sectionId": "56cc3c908f5e6c56e677bd2e" },
{
"$set": { "sections.$.name": "Hi there" }
},
{ "multi": true }
)
The equivalent Spring Data MongoDB code follows:
import static org.springframework.data.mongodb.core.query.Criteria.where;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
...
WriteResult wr = mongoTemplate.updateMulti(
new Query(where("sections.sectionId").is("56cc3c908f5e6c56e677bd2e")),
new Update().set("sections.$.name", "Hi there"),
Offer.class // the mapped entity class, as used in the question's updateMulti call
);
You can use the BulkOperations approach to update a list or array of document objects:
BulkOperations bulkOps = mongoTemplate.bulkOps(BulkMode.UNORDERED, Person.class);
for(Person person : personList) {
Query query = new Query().addCriteria(new Criteria("id").is(person.getId()));
Update update = new Update().set("address", "new Address"); // set the value directly; person.setAddress(...) returns void and cannot be passed to set(...)
bulkOps.updateOne(query, update);
}
BulkWriteResult results = bulkOps.execute();
That's my solution for this problem:
public Mono<ProjectChild> UpdateCritTemplChild(
String id, String idch, String ownername) {
Query query = new Query();
query.addCriteria(Criteria.where("_id")
.is(id)); // find the parent
query.addCriteria(Criteria.where("tasks._id")
.is(idch)); // find the child which will be changed
Update update = new Update();
update.set("tasks.$.ownername", ownername); // change the field inside the child that must be updated
return template
// findAndModify:
// Find/modify/get the "new object" from a single operation.
.findAndModify(
query, update,
new FindAndModifyOptions().returnNew(true), ProjectChild.class
)
;
}

load specific fields in Elasticsearch Nest query

The documentation seems to indicate I can return a subset of fields instead of the entire document. Here's my code:
var result = client.Search<MyObject>(s => s
.Fields(f => f.Title)
.Query(q => q
.QueryString(qs => qs
.OnField("title")
.Query("the"))));
I'm searching on the word 'the' in the 'title' field and want to return just 'title'. My result.Documents object contains 10 objects that are each null.
I do see the values I want, but they're buried deep in the search response:
result.Hits[0].Fields.FieldValues[0]...
Is there a better way to get at the list of 'title' fields returned?
My mapping for the data (truncated) is this:
{
"myidex": {
"mappings": {
"myobject": {
"properties": {
"title": {
"type": "string"
},
"artists": {
"properties": {
"id": {
"type": "string",
"index": "not_analyzed",
"analyzer": "fullTerm"
},
"name": {
"type": "string",
"index": "not_analyzed",
"analyzer": "fullTerm"
}
}
}
}
}
}
}
}
and my class objects are like this:
[Table("MyTable")]
[Serializable]
[ElasticType(Name="myobject")]
public class MyObject
{
[ElasticProperty]
public string Title { get; set; }
[JsonIgnore]
public string Artistslist { get; set; }
[ElasticProperty(Analyzer = "caseInsensitive")]
public List<Person> Artists { get; set; }
}
[Serializable]
public class Person
{
[ElasticProperty(Analyzer = "fullTerm", Index = FieldIndexOption.not_analyzed)]
public string Name { get; set; }
[ElasticProperty(Analyzer = "fullTerm", Index = FieldIndexOption.not_analyzed)]
public string Id { get; set; }
}
Artistslist comes from my data source (SQL); I then parse it out into a new List object before indexing the data.
I think this deeply nested value is due to a change in Elasticsearch 1.0 and how partial fields are now returned as arrays (see 1.0 Breaking Changes - Return Values for details). This is addressed in the NEST 1.0 Breaking Changes documentation, in the Fields() vs SourceIncludes() section, which shows an example of using a FieldValue helper method to access these values. Based on that, try the following:
For all items:
foreach (var hit in result.Hits)
{
var title = hit.Fields.FieldValue<MyObject, string>(f => f.Title);
}
For a specific item:
var title = result.Hits.ElementAt(0)
.Fields.FieldValue<MyObject, string>(f => f.Title);
I know it is still a bit verbose but it should work for you and will handle the new array return formatting of Elasticsearch 1.0.
I found the solution in NEST's GitHub repo. They have created an issue about this problem. You should use FielddataFields instead of Fields:
https://github.com/elastic/elasticsearch-net/issues/1551
var result = client.Search<MyObject>(s => s
.FielddataFields(f => f.Title)
.Query(q => q
.QueryString(qs => qs
.OnField("title")
.Query("the"))));
and in the response you will see FieldSelections, which contains the fields you wanted.
