I want a list of counts for some of my data (count the number of open/closed tasks, etc.), and I want to get all the counts inside one query, so I am not sure what to do with my LINQ statement below.
_user is an object that returns info about the currently logged-on user.
_repo is an object that returns an IQueryable of whichever table I want to select.
var counters = (from task in _repo.All<InstructionTask>()
                where task.AssignedToCompanyID == _user.CompanyID || task.CompanyID == _user.CompanyID
                join instructions in _repo.GetAllMyInstructions(_user)
                    on task.InstructionID equals instructions.InstructionID
                group new { task, instructions }
                    by new { task }
                    into g
                select new
                {
                    TotalEveryone = g.Count(),
                    TotalMine = g.Count(),
                    TotalOpen = g.Count(x => x.task.IsOpen),
                    TotalClosed = g.Count(c => !c.task.IsOpen)
                }).SingleOrDefault();
Do I convert my object with Single or SingleOrDefault? The exception I am getting is: "Sequence contains more than one element".
Note: I want overall stats - not for each task, but for all tasks. I'm not sure how to get that.
You need to dump everything into a single group, and use a regular Single. I am not sure if LINQ-to-SQL would be able to translate it correctly, but it's definitely worth a try.
var counters = (from task in _repo.All<InstructionTask>()
                where task.AssignedToCompanyID == _user.CompanyID || task.CompanyID == _user.CompanyID
                join instructions in _repo.GetAllMyInstructions(_user)
                    on task.InstructionID equals instructions.InstructionID
                group task by 1 /* <<=== All tasks go into one group */ into g
                select new
                {
                    TotalEveryone = g.Count(),
                    TotalMine = g.Count(),               // <<=== You probably need a condition here
                    TotalOpen = g.Count(x => x.IsOpen),
                    TotalClosed = g.Count(c => !c.IsOpen)
                }).Single();
From MSDN
Returns the only element of a sequence, or a default value if the
sequence is empty; this method throws an exception if there is more
than one element in the sequence.
You need to use FirstOrDefault. SingleOrDefault is designed for sequences that contain exactly one element (or none).
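A minimal sketch of the difference, using a throwaway list (the data here is illustrative, not from the original post):

var numbers = new List<int> { 1, 2, 3 };
var first = numbers.FirstOrDefault();             // 1 - the first element, no exception
// var single = numbers.SingleOrDefault();       // throws InvalidOperationException:
//                                               // "Sequence contains more than one element"
var none = new List<int>().FirstOrDefault();      // 0 - the default value for an empty sequence
var one = new List<int> { 7 }.SingleOrDefault();  // 7 - exactly one element is fine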
I'm pulling data from a third-party API. The API runs multiple times a day, so if the same data is already present in the table it should ignore that record, update the record if anything has changed, or insert a new record if anything new shows up in the JSON received.
I'm using the below code for inserting any new data.
var input = JsonConvert.DeserializeObject<List<DeserializeLookup>>(resultJson).ToList();
var entryset = input.Select(y => new Lookup
{
lookupType = "JOBCODE",
code = y.Code,
description = y.Description,
isNew = true,
lastUpdatedDate = DateTime.UtcNow
}).ToList();
await _context.Lookup.AddRangeAsync(entryset);
await _context.SaveChangesAsync();
But after the first run, when the API runs again, it inserts the same data into the table again, so duplicate entries end up in the table. To handle this, I used a foreach loop as below before inserting data into the table.
foreach (var item in input)
{
if (!_context.Lookup.Any(r =>
r.code== item.Code))
{
//above insert code
}
}
But this doesn't work as expected. Also, the API takes a lot of time to run when I use the foreach loop. Is there a solution to this in .NET Core 3.1?
var newList = new List<Lookup>();
foreach (var item in input)
{
    if (!_context.Lookup.Any(r => r.code == item.Code))
    {
        // map the deserialized record to the entity before inserting
        newList.Add(new Lookup
        {
            lookupType = "JOBCODE",
            code = item.Code,
            description = item.Description,
            isNew = true,
            lastUpdatedDate = DateTime.UtcNow
        });
    }
}
await _context.Lookup.AddRangeAsync(newList);
await _context.SaveChangesAsync();
It will be better if you try it this way.
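Since each call to Any() issues a separate database query, it may also help to load the existing codes once and filter in memory. A sketch of that idea, assuming code is unique in the table:

// one query to fetch all existing codes, then a purely in-memory filter
var existingCodes = _context.Lookup
    .Select(r => r.code)
    .ToHashSet();

var newEntries = input
    .Where(item => !existingCodes.Contains(item.Code))
    .Select(item => new Lookup
    {
        lookupType = "JOBCODE",
        code = item.Code,
        description = item.Description,
        isNew = true,
        lastUpdatedDate = DateTime.UtcNow
    })
    .ToList();

await _context.Lookup.AddRangeAsync(newEntries);
await _context.SaveChangesAsync();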
I'm on my phone, so forgive me for not being able to format the code in my response. The solution to your problem is something I actually just encountered myself while syncing data from an Azure Function and a third-party app into a SQL database.
Depending on your table schema, you need one column with a unique identifier. Make this column a primary key (the first step to preventing duplicates). Here's a resource for that: https://www.w3schools.com/sql/sql_primarykey.ASP
The next step is your stored procedure, where you'll need to perform what's commonly referred to as an UPSERT: you merge a table with the incoming data, matched on a specified column (whichever is your primary key).
That would look something like this:
MERGE Table_1 AS T1
USING Incoming_Data AS source
    ON T1.column1 = source.column1
    -- you can use AND / OR operators here to match on additional values or combinations
WHEN MATCHED THEN
    UPDATE SET T1.column2 = source.column2
    -- etc. for more columns
WHEN NOT MATCHED THEN
    INSERT (column1, column2, column3)
    VALUES (source.column1, source.column2, source.column3);
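If you want to drive that MERGE from .NET Core, one hedged option is to wrap it in a stored procedure and call it with raw SQL; the procedure name usp_UpsertLookup and its parameters below are assumptions, not something from the original post:

// hypothetical stored procedure wrapping the MERGE above (EF Core 3.x)
foreach (var item in input)
{
    await _context.Database.ExecuteSqlInterpolatedAsync(
        $"EXEC usp_UpsertLookup @code={item.Code}, @description={item.Description}");
}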
First of all, you should decouple the format in which you receive your data from your actual data handling. In your case: get rid of the JSON before you actually interpret the data.
Alas, I haven't got a clue what your data represents, so let's assume your data is a sequence of customer Orders. When you get new data, you want to add all new Orders and update all changed Orders.
So somewhere you have a method that takes your JSON data as input and produces a sequence of Orders as output:
IEnumerable<Order> InterpretJsonData(string jsonData)
{
...
}
You know JSON better than I do; besides, this conversion is a bit beside your question.
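For completeness, a minimal sketch using Json.NET, assuming the Order properties line up with the JSON fields:

IEnumerable<Order> InterpretJsonData(string jsonData)
{
    // deserialize straight into the domain type; adjust if the JSON shape differs
    return JsonConvert.DeserializeObject<List<Order>>(jsonData);
}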
You wrote:
So, if the same data is present in the table it should ignore that record, else if there are any changes it should update that record or insert a new record
You need an Equality Comparer
To detect whether there are added or changed customer Orders, you need something that can detect whether Order A equals Order B. There must be at least one unique field by which you can identify an Order, even if all its other values change.
This unique value is usually called the primary key, or the Id. I assume your Orders have an Id.
So if your new Order data contains an Id that was not available before, then you are certain that the Order was Added.
If your new Order data has an Id that was already in previously processed Orders, then you have to check the other values to detect whether it was changed.
For this you need two equality comparers: one that says two Orders are equal if they have the same Id, and one that checks all values for equality.
A standard pattern is to derive your comparer from class EqualityComparer<Order>
class OrderComparer : EqualityComparer<Order>
{
public static IEqualityComparer<Order> ByValue = new OrderComparer();
... // TODO implement
}
First I'll show you how to use this to detect additions and changes; then I'll show you how to implement it.
Somewhere you have access to the already processed Orders:
IEnumerable<Order> GetProcessedOrders() {...}
var jsondata = FetchNewJsonOrderData();
// convert the jsonData into a sequence of Orders
IEnumerable<Order> orders = this.InterpretJsonData(jsondata);
To detect which Orders are added or changed, you could make a Dictionary of the already processed Orders and check the incoming Orders one by one to see whether they changed:
IEqualityComparer<Order> comparer = OrderComparer.ByValue;
Dictionary<int, Order> processedOrders = this.GetProcessedOrders()
    .ToDictionary(order => order.Id);

foreach (Order order in orders)
{
    if (processedOrders.TryGetValue(order.Id, out Order originalOrder))
    {
        // order already existed. Is it changed?
        if (!comparer.Equals(order, originalOrder))
        {
            // unequal!
            this.ProcessChangedOrder(order);

            // remember the changed values of this Order
            processedOrders[order.Id] = order;
        }
        // else: no changes, nothing to do
    }
    else
    {
        // Added!
        this.ProcessAddedOrder(order);
        processedOrders.Add(order.Id, order);
    }
}
Immediately after Processing the changed / added order, I remember the new value, because the same Order might be changed again.
If you want this in a LINQ fashion, you have to GroupJoin the Orders with the ProcessedOrders, to get "Orders with their zero or more Previously processed Orders" (there will probably be zero or one Previously processed order).
var ordersWithTheirPreviouslyProcessedOrder = orders.GroupJoin(this.GetProcessedOrders(),
    order => order.Id,                   // from every Order take the Id
    processedOrder => processedOrder.Id, // from every previously processed Order take the Id

    // parameter resultSelector: from every Order, with its zero or more previously
    // processed Orders, make one new:
    (order, previouslyProcessedOrders) => new
    {
        Order = order,
        ProcessedOrder = previouslyProcessedOrders.FirstOrDefault(),
    })
    .ToList();
I use GroupJoin instead of Join, because this way I also get the "Orders that have no previously processed orders" (= new orders). If you would use a simple Join, you would not get them.
I do a ToList, so that in the next statements the group join is not done twice:
var addedOrders = ordersWithTheirPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder == null);
var changedOrders = ordersWithTheirPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder != null
        && !comparer.Equals(orderCombi.Order, orderCombi.ProcessedOrder));
Implementation of "Compare by Value"
// equal if all values are equal
public override bool Equals(Order x, Order y)
{
    if (x == null) return y == null; // true if both null, false if x null but y not null
    if (y == null) return false;     // because x is not null
    if (Object.ReferenceEquals(x, y)) return true;
    if (x.GetType() != y.GetType()) return false;

    // compare all properties one by one:
    return x.Id == y.Id
        && x.Date == y.Date
        && ...
}
For GetHashCode there is one rule: if X equals Y, then they must have the same hash code. If they are not equal there is no rule, but lookups are more efficient if unequal values tend to have different hash codes. Make a trade-off between calculation speed and hash code uniqueness.
In this case: if two Orders are equal, I am certain they have the same Id. For speed I don't check the other properties.
public override int GetHashCode(Order x)
{
    if (x == null)
        return 0x34339d98; // just a hash code for all null Orders
    else
        return x.Id.GetHashCode();
}
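Putting the pieces together, the complete comparer might look like this (the property list in Equals is an assumption; extend it to cover all your Order properties):

class OrderComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ByValue = new OrderComparer();

    public override bool Equals(Order x, Order y)
    {
        if (x == null) return y == null;
        if (y == null) return false;
        if (Object.ReferenceEquals(x, y)) return true;
        if (x.GetType() != y.GetType()) return false;
        return x.Id == y.Id
            && x.Date == y.Date; // TODO: extend with the remaining properties
    }

    public override int GetHashCode(Order x)
    {
        return x == null ? 0x34339d98 : x.Id.GetHashCode();
    }
}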
I'm trying to build a LINQ query driven by a table grid on the client side, so I'm expecting page offset, page start, order, and the traditional paging parameters. I have the following code:
[Route("api/settings/logs")]
public Rest.DatatablesResponse GetLogs(int draw, int start, int length) {
var query_string = Request.GetQueryNameValuePairs().ToDictionary(x => x.Key, x => x.Value);
var search = query_string["search.value"];
int order_column = int.Parse(query_string["order[0].column"]);
var order_direction = query_string["order[0].dir"];
var count = db.Logs.Count(q => q.Mode == 2);
var logs = (from l in db.Logs
where l.Mode == 2
select new {
id = l.ID,
mode = l.Mode,
phase_id = l.Phase.ID,
created = l.Created,
user = l.User.Name,
blender_name = l.Blender.Name,
oil_name = l.Oil,
oil_quantity = l.OilQuantity,
production_cycle_name = l.ProductionCycle.Name
});
if (order_direction == "asc") {
if (order_column == 0) logs.OrderBy(q => q.created);
else if (order_column == 2) logs.OrderBy(q => q.production_cycle_name);
} else {
if (order_column == 0) logs.OrderByDescending(q => q.created);
else if (order_column == 2) logs.OrderByDescending(q => q.production_cycle_name);
};
if (!string.IsNullOrEmpty(search)) {
logs.Where(q => q.blender_name.Contains(search) ||
q.oil_name.Contains(search) ||
SqlFunctions.StringConvert((decimal)q.id).Contains(search));
}
logs.Skip(start).Take(length);
DateTime dtDateTime = new DateTime(1970,1,1,0,0,0,0,System.DateTimeKind.Utc);
var steps = from l in logs.ToList()
select new {
id = l.id,
message = StringHelpers.FormatWith(_tpl_message[l.phase_id.ToString() + l.mode.ToString() ], l) ,
created = dtDateTime.AddSeconds(l.created).ToString("h:mmtt - MMMM d, yyyy"),
production_cycle_name = l.production_cycle_name
};
return new Rest.DatatablesResponse {
draw = draw,
recordsTotal = count,
recordsFiltered = count,
data = steps.ToArray()
};
}
My problem is that the Skip/Take and OrderBy expressions are getting ignored for some reason. This is the SQL generated just before I convert my LINQ expression to a list; from my understanding, the query should not be executed or evaluated until the logs.ToList() call, so the ordering and take/skip should be taken into account, but they are not:
SELECT
[Extent1].[ID] AS [ID],
[Extent1].[Mode] AS [Mode],
[Extent1].[Phase_ID] AS [Phase_ID],
[Extent1].[Created] AS [Created],
[Extent2].[Name] AS [Name],
[Extent3].[Name] AS [Name1],
[Extent1].[Oil] AS [Oil],
[Extent1].[OilQuantity] AS [OilQuantity],
[Extent4].[Name] AS [Name2]
FROM [dbo].[Steps] AS [Extent1]
LEFT OUTER JOIN [dbo].[Users] AS [Extent2] ON [Extent1].[User_Id] = [Extent2].[Id]
LEFT OUTER JOIN [dbo].[Blenders] AS [Extent3] ON [Extent1].[Blender_ID] = [Extent3].[ID]
LEFT OUTER JOIN [dbo].[ProductionCycles] AS [Extent4] ON [Extent1].[ProductionCycle_ID] = [Extent4].[ID]
WHERE 2 = [Extent1].[Mode]
Irrelevant P.S.: I'm using the not-so-clever ifs to build the order expression instead of DynamicLINQ, since I have only two sortable columns.
logs.Skip(start).Take(length);
Creates an IQueryable<T>, where T is the same anonymous type of which logs is an IQueryable<T>, but with start items skipped. From that it creates a similarly typed IQueryable<T> from which at most length items will be taken.
Then it throws that away and lets it be garbage collected. (Or ideally the compiler or jitter realises it's all thrown away and cuts the whole thing out.)
Then logs.ToList() goes back to the logs you still have and creates a list from it.
You should replace the Skip and Take line with:
logs = logs.Skip(start).Take(length);
So that you are actually making use of this skipping and taking.
I'm using the not-so-clever ifs to build the order expression instead of DynamicLINQ, since I have only two sortable columns.
There's nothing particularly not-clever about that, except that you make the same mistake: you apply the OrderBy and then throw away the result instead of using it. Likewise with the Where. You need logs = logs.OrderBy(...) etc.
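Putting that together, a sketch of the corrected composition, reusing the names from your code:

// reassign at every step so each operator is actually part of the final query
if (order_direction == "asc") {
    if (order_column == 0) logs = logs.OrderBy(q => q.created);
    else if (order_column == 2) logs = logs.OrderBy(q => q.production_cycle_name);
} else {
    if (order_column == 0) logs = logs.OrderByDescending(q => q.created);
    else if (order_column == 2) logs = logs.OrderByDescending(q => q.production_cycle_name);
}
if (!string.IsNullOrEmpty(search)) {
    logs = logs.Where(q => q.blender_name.Contains(search) ||
                           q.oil_name.Contains(search) ||
                           SqlFunctions.StringConvert((decimal)q.id).Contains(search));
}
logs = logs.Skip(start).Take(length);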
I'd also question from l in logs.ToList() select new {…} here.
It might be the best approach if obtaining that list in one step has some advantage. Otherwise, though:
from l in logs select new {…}
Does the select work on the database, retrieving just what you need.
from l in logs.AsEnumerable() select new {…}
Does the select work in the application; appropriate if part of it cannot be converted to database work, but processes items as they come rather than loading them all into memory first.
from l in await logs.ToListAsync() select new {…}
Has the downside of ToList(), but in asynchronous use it allows for awaiting (assuming your provider has a ToListAsync() method).
ToList() is rarely the best option here.
I am using LINQ to SQL and have always thought that I was querying the results of a query in memory. I have just looked at the database, and it shows many thousands of queries rather than the one I expected.
My approach has been to run a query and then use LINQ to search within the result set.
IQueryable<mapping> fieldList = from mm in db.mappings
join mi in db.metaItems on mm.secondaryMetaItemId equals mi.metaItemId
join mo in db.metaObjects on mi.metaObjectId equals mo.metaObjectId
where mm.linkageId == 277
select mm;
for (int i=0;i<100;i++)
{
mapping thisVar = fieldList.FirstOrDefault(m => m.primaryItem == info.Name);
}
How can I stop LINQ re-querying every time I access my result set?
Thanks for your help!
When you write a LINQ query, the query doesn't actually get executed until you perform an action that enumerates it (deferred execution). Calling FirstOrDefault() is an example of one such method: the first result has to be found, so the query runs. You'll want to enumerate the results once and store them, so that whenever you refer to those results throughout your program, you work on the stored copy.
The easiest way to do that is to convert the result to a list. That puts the results in memory, and you can then reuse them.
IQueryable<mapping> fieldList =
from mm in db.mappings
join mi in db.metaItems on mm.secondaryMetaItemId equals mi.metaItemId
join mo in db.metaObjects on mi.metaObjectId equals mo.metaObjectId
where mm.linkageId == 277
select mm;
// save it!
var result = fieldList.ToList(); // query is processed only once here
// do stuff with result
for (int i=0;i<100;i++)
{
// using the stored result
mapping thisVar = result.FirstOrDefault(m => m.primaryItem == info.Name);
}
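If you look up values by the same key many times, it can also be worth indexing the stored results once; a sketch, assuming primaryItem is the field you search on:

// build an in-memory index over the materialized results
var byPrimaryItem = fieldList.ToList().ToLookup(m => m.primaryItem);

// each lookup is now a hash probe instead of a scan over the whole list
mapping thisVar = byPrimaryItem[info.Name].FirstOrDefault();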
Try this:
var fieldList = (from mm in db.mappings
join mi in db.metaItems on mm.secondaryMetaItemId equals mi.metaItemId
join mo in db.metaObjects on mi.metaObjectId equals mo.metaObjectId
where mm.linkageId == 277
select mm).ToArray(); // materialize once so the loop below doesn't hit the database again
for (int i = 0; i < 100; i++)
{
mapping thisVar = fieldList.FirstOrDefault(m => m.primaryItem == info.Name);
}
I am trying to access a user object in a collection with the Id equal to "user101" and replace it with another user.
Controller.MyObject.SingleOrDefault(x => x.Id == "user101") = OtherUser();
Thanks in advance.
You can't do it with one LINQ expression.
LINQ extensions usually work on enumerables; if MyObject is a collection, you first have to find the required item and then overwrite it with the new object (moreover, SingleOrDefault() will simply return null if the condition is not satisfied).
You should write something like this (the exact code depends on what MyObject is):
var item = Controller.MyObject.SingleOrDefault(x => x.Id == "user101");
if (item != null)
Controller.MyObject[Controller.MyObject.IndexOf(item)] = new OtherUser();
Please note that if you do not really need the check performed by SingleOrDefault(), you can simplify the code (and avoid the double search performed by SingleOrDefault() and IndexOf()).
If this is "performance critical", it may be better to write an ad-hoc implementation that does the task in one single pass.
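A sketch of that single pass, assuming MyObject is index-accessible like a List<T> and that Id is unique:

for (int i = 0; i < Controller.MyObject.Count; i++)
{
    if (Controller.MyObject[i].Id == "user101")
    {
        Controller.MyObject[i] = new OtherUser(); // replace in place
        break;                                    // Id is unique, so we can stop here
    }
}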
Try it in two lines (assuming MyObject is a List<T> and the item exists):
var index = Controller.MyObject.FindIndex(x => x.Id == "user101");
Controller.MyObject[index] = new OtherUser();
I've been at this for a while. I have a data set with a recurring key and a sequence, similar to this:
id  status      sequence
1   open        1
1   processing  2
2   open        1
2   processing  2
2   closed      3
A new row is added for each 'action' that happens, so the various ids can have a variable number of sequence entries. I need to get the max sequence number for each id, but I still need the complete record.
I want to end up with sequence 2 for id 1, and sequence 3 for id 2.
I can't seem to get this to work without selecting the distinct ids, looping through the results, ordering the values, and then adding the first item to another list, but that's so slow.
var ids = this.ObjectContext.TNTP_FILE_MONITORING.Select(i => i.FILE_EVENT_ID).Distinct();
List<TNTP_FILE_MONITORING> vals = new List<TNTP_FILE_MONITORING>();
foreach (var id in ids)
{
    vals.Add(this.ObjectContext.TNTP_FILE_MONITORING
        .Where(mfe => mfe.FILE_EVENT_ID == id)
        .OrderByDescending(mfe => mfe.FILE_EVENT_SEQ)
        .First<TNTP_FILE_MONITORING>());
}
There must be a better way!
Here's what worked for me:
var ts = new[] { new T(1,1), new T(1,2), new T(2,1), new T(2,2), new T(2,3) };
var q =
from t in ts
group t by t.ID into g
let max = g.Max(x => x.Seq)
select g.FirstOrDefault(t1 => t1.Seq == max);
(Just need to apply that to your datatable, but the query stays about the same)
Note that with your current method, because you iterate over all records, you also fetch all records from the datastore. A query like the above can be translated into a query against the datastore, which is not only faster but also returns only the results you need (assuming you are using Entity Framework or Linq2SQL).
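Applied to the table from the question, that would look roughly like this (a sketch reusing the names from your snippet; exact translatability depends on your provider):

var vals = (from mfe in this.ObjectContext.TNTP_FILE_MONITORING
            group mfe by mfe.FILE_EVENT_ID into g
            let maxSeq = g.Max(x => x.FILE_EVENT_SEQ)
            select g.FirstOrDefault(x => x.FILE_EVENT_SEQ == maxSeq))
           .ToList(); // one round trip, one max-sequence record per id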