Azure Cosmos DB - MongoDB - C# - how to increment a value inside an array of objects? - mongodb-.net-driver

I have this schema for my db:
{
    "_id" : "test_schema",
    "t" : 5,
    "p" : [
        {
            "id" : "207",
            "v" : 4
        },
        {
            "id" : "309",
            "v" : 1
        }
        ....
    ]
}
I'm trying to $inc the v value of the array element whose id equals "207".
I'm currently able to increment the t value with this code:
var result = collection.UpdateOneAsync(new BsonDocument("_id", "test_schema"), new BsonDocument("$inc", new BsonDocument("t", 4)), new UpdateOptions() { IsUpsert = true }).Result;
but when I try to update a value inside the array, nothing happens (and there's no error either!):
var result = collection.UpdateOneAsync(new BsonDocument(new Dictionary<string, object>() { { "_id", "test_schema" }, { "p.id", "207" } }), new BsonDocument("$inc", new BsonDocument("p.v", 4)), new UpdateOptions() { IsUpsert = true }).Result;
Following the MongoDB documentation, I noticed that "p.v" should be "p.$.v", but in Cosmos DB that raises a "not valid $ symbol" exception.
Any suggestion?

Cosmos DB doesn't yet support the positional operator; it is on the roadmap. The feedback item https://feedback.azure.com/forums/599059-azure-cosmos-db-mongodb-api/suggestions/20091454-positional-array-update-via-query-support was filed to track interest in this feature. Please vote there if you need this support.
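Until the positional operator is supported, a common workaround is read-modify-write: fetch the document, increment the matching array element in application code, then write the whole `p` array back with `$set` (or replace the document). A minimal sketch of the in-memory step in Python; the helper name is made up for illustration:

```python
def inc_array_value(doc, target_id, delta):
    """Increment 'v' on the element of doc['p'] whose 'id' matches target_id.

    Returns the modified document; the caller would then $set the 'p' array
    (or replace the stored document) in a follow-up update.
    """
    for element in doc["p"]:
        if element["id"] == target_id:
            element["v"] += delta
            break
    return doc

# Sample document from the question.
doc = {"_id": "test_schema", "t": 5,
       "p": [{"id": "207", "v": 4}, {"id": "309", "v": 1}]}
inc_array_value(doc, "207", 4)
print(doc["p"][0]["v"])  # 8
```

Note this is not atomic: a concurrent writer could clobber the array between the read and the write, so an optimistic-concurrency check (e.g. matching on the old value in the update filter) is advisable.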

Related

How to get field values from a query using NEST with Elastic Search

I'm new to Elasticsearch and am using (trying to use) the NEST library. I'm writing logs to an index using the Serilog Elasticsearch sink. So the first consideration is that I have no control over the structure the sink uses, just the structured logging properties that I choose to log.
Anyway, I'm simply trying to run a basic search where I want to return the first X documents from an index. I'm able to get some of the property values back from the query but nothing for any of the fields.
The query is as follows:
var searchResponse = await _elasticClient.SearchAsync<LogsViewModel>(s => s
    .Index("webapp-razor-*")
    .From(0)
    .Size(5)
    .Query(q => q.MatchAll()));
I'm guessing the reason I'm returning null for the fields is because the model class is not structured correctly.
Running the console tool within the Elasticsearch portal with a simple GET request, an example document returned is below:
{
"_index" : "webapp-razor-2021.05",
"_type" : "_doc",
"_id" : "34v3t43kBwE34t3vJowGRgl",
"_score" : 1.0,
"_source" : {
"#timestamp" : "2021-05-03T20:19:46.9329848+01:00",
"level" : "Information",
"messageTemplate" : "{#LogEventCategory}{#LogEventType}{#LogEventSource}{#LogCountry}{#LogRegion}{#LogCity}{#LogZip}{#LogLatitude}{#LogLongitude}{#LogIsp}{#LogIpAddress}{#LogMobile}{#LogUserId}{#LogUsername}{#LogForename}{#LogSurname}{#LogData}",
"message" : "\"Open Id Connect\"\"User Sign In\"\"WebApp-RAZOR\"\"United Kingdom\"\"England\"\"MyTown\"\"PX27\"\"54.8951\"\"-9.1585\"\"My ISP\"\"123.345.789.180\"\"False\"\"a8vce3vc-8e61-44fc-b142-93ck396ad91ce\"\"joe#email.net\"\"joe#email.net\"\"Bloggs\"\"User with username [joe#email.net] forename [joe#email.net] surname [Bloggs] from IP Address [123.345.789.180] signed into the application [WebApp_RAZOR] Succesfully\"",
"fields" : {
"LogEventCategory" : "Open Id Connect",
"LogEventType" : "User Sign In",
"LogEventSource" : "WebApp-RAZOR",
"LogCountry" : "United Kingdom",
"LogRegion" : "England",
"LogCity" : "MyTown",
"LogZip" : "PX27",
"LogLatitude" : "54.8951",
"LogLongitude" : "-9.1585",
"LogIsp" : "My ISP",
"LogIpAddress" : "123.345.789.180",
"LogMobile" : "False",
"LogUserId" : "a8vce3vc-8e61-44fc-b142-93ck396ad91ce",
"LogUsername" : "joe#email.net",
"LogForename" : "joe#email.net",
"LogSurname" : "Bloggs",
"LogData" : "User with username [joe#email.net] forename [Joe] surname [Bloggs] from IP Address [123.345.789.180] signed into the application [WebApp_RAZOR] Succesfully",
"RequestId" : "0HM8ED1IRB7AK:00000001",
"RequestPath" : "/signin-oidc",
"ConnectionId" : "0HM8ED1IRB7AK",
"MachineName" : "DESKTOP-OS52032",
"MemoryUsage" : 23688592,
"ProcessId" : 26212,
"ProcessName" : "WebApp-RAZOR",
"ThreadId" : 6
}
}
}
Sample model class (or part of it):
public class LogsViewModel
{
    [JsonProperty("#timestamp")]
    public string Timestamp { get; set; }

    [JsonProperty("level")]
    public string Level { get; set; }

    [JsonProperty("fields")]
    public Fields Fields { get; set; }
}

public class Fields
{
    [JsonProperty("LogEventCategory")]
    public string LogEventCategory { get; set; }

    // Not all properties shown here, but it would be the same principle...
}
Could someone please give me an idea of how to go about this? Once I know how to get the values from fields such as "LogEventCategory", I should be able to move forward and figure it out. None of the documentation examples for Elastic have worked for me. Thanks!
After a few days of trial and error, I finally arrived at a solution for pulling the fields of choice from the _source object in the Elastic document. There may well be a more optimized approach here, so I welcome any feedback on the topic.
My first step was to view the structure of a sample document from an index that Serilog is writing to. Note that in my case I'm not necessarily including all structured log event properties in every log event written to Elastic; e.g. on system startup, I simply don't need details of the user/location etc.
Using the DevTools in the Elastic Portal, I performed a simple GET request:
Great tip from user Russ Cam in the comments above, who advises using the Elastic Common Schema .NET NuGet package, which provides some standardization for using Serilog and logging to Elastic from various different apps/sources. Reading the forums, it looks like Elastic is strongly encouraging use of a common schema, as it will play better when creating charts/metrics/dashboards etc.
My WebApp is using .NET 5. I've included the code section below from the Program.cs file that shows where I added the reference to the above Elastic Common Schema .NET library. Because I'm connecting to Elastic Cloud, I have to include the authentication details when building the Elastic client, and it took me a few attempts before I figured out how to incorporate this package reference alongside some of the other Elastic client options:
Program.cs file:
public static void Main(string[] args)
{
    var configuration = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile(path: "appsettings.json", optional: false, reloadOnChange: true)
        .Build();

    // Credentials used for the elastic cloud logging sink.
    var elkUri = configuration.GetSection("ElasticCloud").GetValue<string>("Uri");
    var elkUsername = configuration.GetSection("ElasticCloud").GetValue<string>("Username");
    var elkPassword = configuration.GetSection("ElasticCloud").GetValue<string>("Password");
    var elkApplicationName = configuration.GetSection("ElasticCloud").GetValue<string>("ApplicationName");

    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(elkUri))
        {
            ModifyConnectionSettings = x => x.BasicAuthentication(elkUsername, elkPassword),
            IndexFormat = "webapp-razor-{0:yyyy.MM}",
            AutoRegisterTemplate = true,
            CustomFormatter = new EcsTextFormatter() // *Elastic Common Schema .NET package ref HERE*
        })
        .CreateLogger();

    var host = CreateHostBuilder(args).Build();
    using var scope = host.Services.CreateScope();
    var services = scope.ServiceProvider;

    string logEventCategory = "WebApp-RAZOR";
    string logEventType = "Application Startup";
    string logEventSource = "System";
    string logData = "";

    try
    {
        // Tested OK 1.5.2021
        //throw new Exception(); // Testing only..
        logData = "Application Starting Up";
        Log.Information(
            "{#LogEventCategory}" +
            "{#LogEventType}" +
            "{#LogEventSource}" +
            "{#LogData}",
            logEventCategory,
            logEventType,
            logEventSource,
            logData);
        host.Run(); // Run the WebHostBuilder.
    }
    catch (Exception ex)
    {
        logData = "The Application failed to start correctly.";
        // Tested on 08/07/2020
        Log.Fatal(ex,
            "{#LogEventCategory}" +
            "{#LogEventType}" +
            "{#LogEventSource}" +
            "{#LogData}",
            logEventCategory,
            logEventType,
            logEventSource,
            logData);
    }
    finally // Cleanup code.
    {
        Log.CloseAndFlush();
    }
}
I used a dynamic type in the NEST client method so that I could avoid a strongly typed model; this made life much easier when trying to figure out the structure of the data returned from the query, by pausing the result in the debugger and having a peek inside the content structure.
var searchResponse = await _elasticClient.SearchAsync<dynamic>(s => s
    //.AllIndices()
    .Index("webapp-razor-*")
    .Query(q => q
        .MatchAll()
    )
);
// Once the searchResponse data is returned from the query,
// I then map the results to a View Model
// (which I use for rendering the list of results to my Razor page)
LogsViewModel = new LogsViewModel
{
    ScannedEventCount = searchResponse.Hits.Count,
    LogEventProperties = new List<LogEventProperties>()
};
foreach (var doc in searchResponse.Documents)
{
    var lep = new LogEventProperties();
    lep.Timestamp = DateTime.Parse(doc["#timestamp"].ToString());
    lep.Level = doc["log.level"];
    // Properties
    if (((IDictionary<string, object>)doc).ContainsKey("_metadata"))
    {
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_event_category", out object value1)) { lep.LogEventCategory = value1.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_event_type", out object value2)) { lep.LogEventType = value2.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_event_source", out object value3)) { lep.LogEventSource = value3.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_device_id", out object value4)) { lep.LogDeviceId = value4.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_country", out object value5)) { lep.LogCountry = value5.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_region", out object value6)) { lep.LogRegion = value6.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_city", out object value7)) { lep.LogCity = value7.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_zip", out object value8)) { lep.LogZip = value8.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_latitude", out object value9)) { lep.LogLatitude = value9.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_longitude", out object value10)) { lep.LogLongitude = value10.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_isp", out object value11)) { lep.LogIsp = value11.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_ip_address", out object value12)) { lep.LogIpAddress = value12.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_mobile", out object value13)) { lep.LogMobile = value13.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_user_id", out object value14)) { lep.LogUserId = value14.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_username", out object value15)) { lep.LogUsername = value15.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_forename", out object value16)) { lep.LogForename = value16.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_surname", out object value17)) { lep.LogSurname = value17.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("log_data", out object value18)) { lep.LogData = value18.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("request_id", out object value19)) { lep.RequestId = value19.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("request_path", out object value20)) { lep.RequestPath = value20.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("connection_id", out object value21)) { lep.ConnectionId = value21.ToString(); }
        if (((IDictionary<String, object>)doc["_metadata"]).TryGetValue("memory_usage", out object value22)) { lep.MemoryUsage = (Int64)value22; }
    }
    // Exception
    if (((IDictionary<string, object>)doc).ContainsKey("error"))
    {
        if (((IDictionary<String, object>)doc["error"]).TryGetValue("message", out object value23)) { lep.ErrorMessage = value23.ToString(); }
        if (((IDictionary<String, object>)doc["error"]).TryGetValue("type", out object value24)) { lep.ErrorType = value24.ToString(); }
        if (((IDictionary<String, object>)doc["error"]).TryGetValue("stack_trace", out object value25)) { lep.ErrorStackTrace = value25.ToString(); }
    }
    // Machine Name
    if (((IDictionary<string, object>)doc).ContainsKey("host"))
    {
        if (((IDictionary<String, object>)doc["host"]).TryGetValue("name", out object value26)) { lep.MachineName = value26.ToString(); }
    }
    // Process
    if (((IDictionary<string, object>)doc).ContainsKey("process"))
    {
        if (((IDictionary<String, object>)doc["process"]["thread"]).TryGetValue("id", out object value27)) { lep.ThreadId = (Int64)value27; }
        if (((IDictionary<String, object>)doc["process"]).TryGetValue("pid", out object value28)) { lep.ProcessId = (Int64)value28; }
        if (((IDictionary<String, object>)doc["process"]).TryGetValue("name", out object value29)) { lep.ProcessName = value29.ToString(); }
    }
    LogsViewModel.LogEventProperties.Add(lep);
}
return View(LogsViewModel);
}
The fundamental reason I went with the above method is that some documents will not contain all of the structured logging event properties. I had to derive a way of checking for the existence of the dictionary keys before trying to access the values; otherwise I'd get exceptions when the keys are missing. An example of this is the difference between a log event generated during an exception versus a log information event for when a user logs into the app.
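The same guard-before-access pattern can be sketched in plain Python for brevity (the helper name is made up; field names are taken from the sample documents):

```python
def get_nested(doc, section, key, default=None):
    """Safely read doc[section][key], tolerating a missing section or key."""
    inner = doc.get(section)
    if isinstance(inner, dict):
        return inner.get(key, default)
    return default

# A startup event carries fewer metadata fields than a sign-in event.
startup_doc = {"_metadata": {"log_event_category": "WebApp-RAZOR"}}
signin_doc = {"_metadata": {"log_event_category": "Open Id Connect",
                            "log_country": "United Kingdom"}}

print(get_nested(startup_doc, "_metadata", "log_country"))  # None: key absent
print(get_nested(signin_doc, "_metadata", "log_country"))   # United Kingdom
print(get_nested(startup_doc, "error", "message"))          # None: section absent
```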
The two documents below show a slightly different JSON structure, which emphasises my decision to fetch the results using a dynamic type. In general, for any documents that I create myself in Elastic, I would usually map the items to a proper model, given that I would always know the full structure beforehand.
{
"took" : 0,
"timed_out" : false,
"_shards" : {
"total" : 1,
"successful" : 1,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : {
"value" : 70,
"relation" : "eq"
},
"max_score" : 1.0,
"hits" : [
{
"_index" : "webapp-razor-2021.05",
"_type" : "_doc",
"_id" : "_2sOPnkBwE4YgJownxnP",
"_score" : 1.0,
"_source" : {
"#timestamp" : "2021-05-05T20:43:34.6041763+01:00",
"log.level" : "Information",
"message" : "\"WebApp-RAZOR\"\"Application Startup\"\"System\"\"Application Starting Up\"",
"_metadata" : {
"message_template" : "{#LogEventCategory}{#LogEventType}{#LogEventSource}{#LogData}",
"log_event_category" : "WebApp-RAZOR",
"log_event_type" : "Application Startup",
"log_event_source" : "System",
"log_data" : "Application Starting Up",
"memory_usage" : 4680920
},
"ecs" : {
"version" : "1.5.0"
},
"event" : {
"severity" : 2,
"timezone" : "GMT Standard Time",
"created" : "2021-05-05T20:43:34.6041763+01:00"
},
"host" : {
"name" : "DESKTOP-OS52032"
},
"log" : {
"logger" : "Elastic.CommonSchema.Serilog",
"original" : null
},
"process" : {
"thread" : {
"id" : 9
},
"pid" : 3868,
"name" : "WebApp-RAZOR",
"executable" : "WebApp-RAZOR"
}
}
},
{
"_index" : "webapp-razor-2021.05",
"_type" : "_doc",
"_id" : "AGsOPnkBwE4YgJowyBrP",
"_score" : 1.0,
"_source" : {
"#timestamp" : "2021-05-05T20:43:44.3936344+01:00",
"log.level" : "Information",
"message" : "\"Open Id Connect\"\"User Sign In\"\"WebApp-RAZOR\"\"United Kingdom\"\"England\"\"MyTown\"\"OX26\"\"51.8951\"\"-1.1585\"\"My ISP\"\"123.456.789.101\"\"False\"\"34vc34-34v34534-44fc-b142-32223ad91ce\"\"joe.bloggs#email.net\"\"joe.bloggs#email.net\"\"Bloggs\"\"User with username [joe.bloggs#email.net] forename [Jose] surname [Bloggs] from IP Address [123.345.789.101] signed into the application [WebApp_RAZOR] Succesfully\"",
"_metadata" : {
"message_template" : "{#LogEventCategory}{#LogEventType}{#LogEventSource}{#LogCountry}{#LogRegion}{#LogCity}{#LogZip}{#LogLatitude}{#LogLongitude}{#LogIsp}{#LogIpAddress}{#LogMobile}{#LogUserId}{#LogUsername}{#LogForename}{#LogSurname}{#LogData}",
"log_event_category" : "Open Id Connect",
"log_event_type" : "User Sign In",
"log_event_source" : "WebApp-RAZOR",
"log_country" : "United Kingdom",
"log_region" : "England",
"log_city" : "MyTown",
"log_zip" : "OX26",
"log_latitude" : "55.1234",
"log_longitude" : "-10.1585",
"log_isp" : "My ISP",
"log_ip_address" : "123.456.789.101",
"log_mobile" : "False",
"log_user_id" : "34vc34-34v3434-44fc-b142-32223ad91ce",
"log_username" : "joe.bloggs#email.net",
"log_forename" : "joe.bloggs#email.net",
"log_surname" : "Bloggs",
"log_data" : "User with username [joe.bloggs#email.net] forename [Joe] surname [Bloggs] from IP Address [123.456.789.101] signed into the application [WebApp_RAZOR] Succesfully",
"request_id" : "0HM8FVO9FFHDD:00000001",
"request_path" : "/signin-oidc",
"connection_id" : "0HM8FVO9FFHDD",
"memory_usage" : 23954480
},
"ecs" : {
"version" : "1.5.0"
},
"event" : {
"severity" : 2,
"timezone" : "GMT Standard Time",
"created" : "2021-05-05T20:43:44.3936344+01:00"
},
"host" : {
"name" : "DESKTOP-OS52032"
},
"log" : {
"logger" : "Elastic.CommonSchema.Serilog",
"original" : null
},
"process" : {
"thread" : {
"id" : 16
},
"pid" : 3868,
"name" : "WebApp-RAZOR",
"executable" : "WebApp-RAZOR"
}
}
}
]
}
}

Upserting a MongoDB collection doc with $max

TL;DR: Only the last $max statement seems to get applied.
Hi there, I am trying to add or update data based on whether the new incoming value is greater than the stored value, using pymongo.
{
    'site': 'xyz.com',
    'site_data': {
        'particular_aspect_about_site': {'score_1': 2, 'score_2': 2, 'score_3': 1},
        'a_different_aspect_about_site': {'score_a': 3, 'score_b': 1, 'score_c': 4},
    },
}
What I am trying is something like:
def upsert_site_data():
    site_to_upsert = None
    data_to_upsert = None
    json_object = request.get_json()
    if "site" in json_object:
        site_to_upsert = json_object["site"]
        data_to_upsert = {"site": site_to_upsert}
        # check if data was collected
        if "site_data" in json_object:
            data_to_upsert.update(json_object["site_data"])
        collection_name = mongo.db.SiteData  # establish mongo db instance to work with
        try:
            result = collection_name.update_one(
                {"site": site_to_upsert},
                {
                    "$max": {"particular_aspect_about_site.score_2": data_to_upsert["particular_aspect_about_site"]["score_2"]},
                    "$max": {"particular_aspect_about_site.score_3": data_to_upsert["particular_aspect_about_site"]["score_3"]},
                    "$max": {"a_different_aspect_about_site.score_b": data_to_upsert["a_different_aspect_about_site"]["score_b"]},
                },
                upsert=True)
            if result.raw_result["updatedExisting"] != True:
                return jsonify({"status": "ok"}), 200
            if result.raw_result["updatedExisting"] == True:
                return jsonify({"error": "Site data was not updated, thanks though :D "}), 200
        except Exception as e:
            return jsonify({"error": e}), 400
    else:
        return jsonify({"error": "A site must be referenced"}), 400
The issue is with the $max statements in the update document: only the final $max is ever applied.
I have also tried other methods with even less success, such as:
{
    "$max" : [
        { "particular_aspect_about_site.score_2" : data_to_upsert["particular_aspect_about_site"]["score_2"] },
        { "particular_aspect_about_site.score_3" : data_to_upsert["particular_aspect_about_site"]["score_3"] },
        { "a_different_aspect_about_site.score_b" : data_to_upsert["a_different_aspect_about_site"]["score_b"] }
    ]
}
So it was a mistake on my part. I did:
{ "$max" : { "particular_aspect_about_site.score_2" : data_to_upsert["particular_aspect_about_site"]["score_2"]} ,
"$max" : { "particular_aspect_about_site.score_3" : data_to_upsert["particular_aspect_about_site"]["score_3"]},
"$max" : { "a_different_aspect_about_site.score_b" : data_to_upsert["a_different_aspect_about_site"]["score_b"]},
}
But I should have used this template:
{ $max: { field1: value1, field2: value2 ... } }
from [GeeksforGeeks][1]
Now it looks like:
{ "$max" : {
"particular_aspect_about_site.score_2" : data_to_upsert["particular_aspect_about_site"]["score_2"] ,
"particular_aspect_about_site.score_3" : data_to_upsert["particular_aspect_about_site"]["score_3"],
"a_different_aspect_about_site.score_b" : data_to_upsert["a_different_aspect_about_site"]["score_b"]},
}
This does what I want now. I'm sure there is a better way to do it, and when I find it I will post it; for now it does what I need.
[1]: https://www.geeksforgeeks.org/mongodb-maximum-operator-max/
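The root cause is visible in plain Python, independent of pymongo: a dict literal with duplicate keys silently keeps only the last value, so only one $max ever reached MongoDB. A minimal sketch with the question's field names and placeholder scores:

```python
# A Python dict literal with duplicate keys keeps only the last value,
# so the first two "$max" entries below are silently discarded before
# the update document is ever sent to MongoDB.
broken = {
    "$max": {"particular_aspect_about_site.score_2": 2},
    "$max": {"particular_aspect_about_site.score_3": 1},
    "$max": {"a_different_aspect_about_site.score_b": 1},
}
print(len(broken))     # 1: a single "$max" key survives
print(broken["$max"])  # only the last mapping

# The fix: one "$max" key whose value maps every field at once.
fixed = {
    "$max": {
        "particular_aspect_about_site.score_2": 2,
        "particular_aspect_about_site.score_3": 1,
        "a_different_aspect_about_site.score_b": 1,
    }
}
print(len(fixed["$max"]))  # 3: all fields kept
```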

How to check an empty JSONArray in swiftyJSON

I have a JSON that has a JSON array as a value in one of the JSON objects inside it. Here is an example of it:
[
{
"id": 1,
"symptoms" : [{\"key\":\"sample1\",\"value\":5},{\"key\":\"sample2\",\"value\":5}]
},
{
"id": 2,
"symptoms" : [{\"key\":\"sample3\",\"value\":1}]
},
{ "id": 3,
"symptoms" : []
},
{
"id": 4,
"symptoms": [{\"key\":\"sample4\",\"value\":1}]
}
]
What I am doing is parsing the inner JSON and placing it in a string array. But whenever I look up symptoms, it skips the empty JSON array, so when I print the string array it looks like this (with the sample above): ["sample1", "sample2", "sample3", "sample4"]. What I want is to append an "" to the string array whenever the JSON array is empty, so it should look like this: ["sample1", "sample2", "sample3", "", "sample4"]. Can anyone help me with this? Here is my code:
var arrayHolder: [String] = []
var idHolder: [Int] = []
for item in swiftyJSON.arrayValue {
    idHolder.append(item["id"].intValue)
    // for the inner JSON
    let innerJSON = JSON(data: item["symptoms"].dataUsingEncoding(NSUTF8StringEncoding)!)
    for symptoms in innerJSON.arrayValue {
        arrayHolder.append(symptoms["key"].stringValue)
    }
}
print(idHolder) // [1, 2, 3, 4]
print(arrayHolder) // ["sample1", "sample2", "sample3", "sample4"]
Just check if innerJSON is empty:
for item in swiftyJSON.arrayValue {
    idHolder.append(item["id"].intValue)
    // for the inner JSON
    let innerJSON = item["symptoms"].arrayValue // no need to create a new JSON object
    if innerJSON.isEmpty {
        arrayHolder.append("")
    } else {
        for symptoms in innerJSON {
            arrayHolder.append(symptoms["key"].stringValue)
        }
    }
}
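For comparison, the same empty-check logic can be sketched in plain Python (sample data taken from the question):

```python
items = [
    {"id": 1, "symptoms": [{"key": "sample1", "value": 5}, {"key": "sample2", "value": 5}]},
    {"id": 2, "symptoms": [{"key": "sample3", "value": 1}]},
    {"id": 3, "symptoms": []},
    {"id": 4, "symptoms": [{"key": "sample4", "value": 1}]},
]

array_holder = []
for item in items:
    symptoms = item["symptoms"]
    if not symptoms:
        # Empty array: append a placeholder so positions stay aligned with ids.
        array_holder.append("")
    else:
        array_holder.extend(s["key"] for s in symptoms)

print(array_holder)  # ['sample1', 'sample2', 'sample3', '', 'sample4']
```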

MongoDB dynamic update of collection when changes occurs in another collection

I created two collections using Robomongo:
collection_Project, which contains documents like this:
{
    "_id" : ObjectId("5537ba643a45781cc8912d8f"),
    "_Name" : "ProjectName",
    "_Guid" : LUUID("16cf098a-fead-9d44-9dc9-f0bf7fb5b60f"),
    "_Obj" : [ ]
}
which I create with this function:
public static void CreateProject(string ProjectName)
{
    MongoClient client = new MongoClient("mongodb://localhost/TestCreationMongo");
    var db = client.GetServer().GetDatabase("TestMongo");
    var collection = db.GetCollection("collection_Project");
    var project = new Project
    {
        _Name = ProjectName,
        _Guid = Guid.NewGuid(),
        _Obj = new List<c_Object>()
    };
    collection.Insert(project);
}
and collection_Object, which contains documents like this:
{
    "_id" : ObjectId("5537ba6c3a45781cc8912d90"),
    "AssociatedProject" : "ProjectName",
    "_Guid" : LUUID("d0a5565d-a0aa-7a4a-9683-b86f1c1de188"),
    "First" : 42,
    "Second" : 1000
}
which I create with this function:
public static void CreateObject(c_Object ToAdd)
{
    MongoClient client = new MongoClient("mongodb://localhost/TestCreationMongo");
    var db = client.GetServer().GetDatabase("TestMongo");
    var collection = db.GetCollection("collection_Object");
    collection.Insert(ToAdd);
}
I update the documents of collection_Project with this function:
public static void AddObjToProject(c_Object ObjToAdd, string AssociatedProject)
{
    MongoClient client = new MongoClient("mongodb://localhost/TestCreationMongo");
    var db = client.GetServer().GetDatabase("TestMongo");
    var collection = db.GetCollection<Project>("collection_Project");
    var query = Query.EQ("_Name", AssociatedProject);
    var update = Update.AddToSetWrapped<c_Object>("_Obj", ObjToAdd);
    collection.Update(query, update);
}
so that the documents in collection_Project look like this
{
"_id" : ObjectId("5537ba643a45781cc8912d8f"),
"_Name" : "ProjectName",
"_Guid" : LUUID("16cf098a-fead-9d44-9dc9-f0bf7fb5b60f"),
"_Obj" : [
{
"_id" : ObjectId("5537ba6c3a45781cc8912d90"),
"AssociatedProject" : "ProjectName",
"_Guid" : LUUID("d0a5565d-a0aa-7a4a-9683-b86f1c1de188"),
"First" : 42,
"Second" : 1000
}
]
}
Can I update the document only in the collection_Object and see the change in the collection_Project as well ?
I tried to do that
public static void UpdateObject(c_Object ToUpdate)
{
    MongoClient client = new MongoClient("mongodb://localhost/TestCreationMongo");
    var db = client.GetServer().GetDatabase("TestMongo");
    var collection = db.GetCollection("collection_Object");
    var query = Query.EQ("_Guid", ToUpdate._Guid);
    var update = Update.Replace<c_Object>(ToUpdate);
    collection.Update(query, update);
}
but collection_Project doesn't change.
Do you have any clue ?
It looks like you are embedding the 'Object' document inside the 'Project' document, which might be fine, but that approach eliminates the need for your separate collection_Object collection. That is to say, collection_Object is redundant because each object (not just a reference) is actually stored inside the Project document as you have implemented it.
See the documentation for information on using embedded documents.
Alternatively, you could use document references.
The best approach to use depends on your specific use case.
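The reference alternative can be sketched in Python with plain dicts standing in for the two collections (collection names from the question; the `_ObjRefs` field and helper are hypothetical):

```python
# Projects store only the objects' GUIDs; the objects themselves live solely
# in collection_Object, so a single update to an object is visible from every
# project that references it. Plain dicts stand in for the collections here.
collection_object = {
    "d0a5565d": {"_Guid": "d0a5565d", "First": 42, "Second": 1000},
}
collection_project = {
    "ProjectName": {"_Name": "ProjectName", "_ObjRefs": ["d0a5565d"]},
}

def resolve_project(name):
    """Join a project's object references against collection_Object."""
    project = collection_project[name]
    return [collection_object[guid] for guid in project["_ObjRefs"]]

# A single update to the object...
collection_object["d0a5565d"]["First"] = 99
# ...is immediately visible through the project's references.
print(resolve_project("ProjectName")[0]["First"])  # 99
```

The trade-off is that reading a project now requires a second lookup (MongoDB has no server-side joins for this case in the classic driver), which is why the embedded approach may still be preferable when objects belong to exactly one project.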

How to select attribute in RavenDB Index in Json-Linq?

In RavenDB my document (ID = 1234) is
"datacontainer": {
"data": [
{
"#idx": "1",
"#idy": "a",
"value": {
"#text": "test 2010"
}
},
{
"#idx": "2",
"#idy": "b",
"value": {
"#text": "test 2011"
}
},
{
"#idx": "3",
"#idy": "c",
"value": {
"#text": "test 2012"
}
}
]
}
I want to create an index where I choose my favourite values (for example idx = "2" and idy = "b"), and the output will be:
(ID, value_text) = (1234, "test 2011")
Now I can select a single element and check its value in Linq:
where p.datacontainer.data[0]["#idx"] == "2" && p.datacontainer.data[0]["#idy"] == "b"
How can I search for the right element in my list?
Luigi,
In RavenDB, you don't search for a list value; you search for a document that matches the query you have.
In your case, what does your entity look like?
I solved my problem! In RavenDB the index, called "MyIndex", is:
Map:
from p in docs
select new
{
    Id = p.id,
    M = p.dataApplication.datacontainer.data.Where(x => x["#idx"] == "2").First(x => x["#idy"] == "b").value["#text"]
};
Reduce:
from test in results
group test by new { test.Id, test.M } into g
select new { g.Key.Id, g.Key.M }
Now I can use this Index in my queries, so I will search for a document that contains a particular value, for example:
var results = from p in session.Query<QueryResult>("MyIndex")
              where p.M == "test 2011"
              select p;
Maybe there is a better solution, but now it works!
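For reference, the element selection the index performs (match both attributes, then project out the nested text) can be sketched in Python against the sample data:

```python
# Sample "data" array from the document in the question.
data = [
    {"#idx": "1", "#idy": "a", "value": {"#text": "test 2010"}},
    {"#idx": "2", "#idy": "b", "value": {"#text": "test 2011"}},
    {"#idx": "3", "#idy": "c", "value": {"#text": "test 2012"}},
]

# Equivalent of the index's Where(...).First(...): take the first element
# matching both attributes, then read the nested text value.
match = next(e for e in data if e["#idx"] == "2" and e["#idy"] == "b")
print(match["value"]["#text"])  # test 2011
```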
