I'm pulling data from a third-party API that runs multiple times a day. If the same data is already present in the table, the record should be ignored; if there are any changes, the record should be updated; and a new record should be inserted if anything new shows up in the JSON received.
I'm using the code below to insert any new data.
var input = JsonConvert.DeserializeObject<List<DeserializeLookup>>(resultJson).ToList();
var entryset = input.Select(y => new Lookup
{
    lookupType = "JOBCODE",
    code = y.Code,
    description = y.Description,
    isNew = true,
    lastUpdatedDate = DateTime.UtcNow
}).ToList();
await _context.Lookup.AddRangeAsync(entryset);
await _context.SaveChangesAsync();
But after the first run, when the API runs again it inserts the same data into the table again, so duplicate entries end up in the table. To handle this, I added a foreach loop as below before inserting data into the table.
foreach (var item in input)
{
    if (!_context.Lookup.Any(r => r.code == item.Code))
    {
        //above insert code
    }
}
But this doesn't work as expected, and the API takes a lot of time to run with the foreach loop in place. Is there a solution to this in .NET Core 3.1?
var newList = new List<Lookup>();
foreach (var item in input)
{
    if (!_context.Lookup.Any(r => r.code == item.Code))
    {
        // map the incoming record to a Lookup entity (same mapping as the insert code above)
        newList.Add(new Lookup
        {
            lookupType = "JOBCODE",
            code = item.Code,
            description = item.Description,
            isNew = true,
            lastUpdatedDate = DateTime.UtcNow
        });
    }
}
await _context.Lookup.AddRangeAsync(newList);
await _context.SaveChangesAsync();
It will be better if you try it this way.
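If the per-item Any() lookups are what make the run slow, a further variation is to load the existing codes once and filter in memory. This is only a sketch: it assumes Code uniquely identifies a record, and it only skips duplicates; rows whose other values changed would still need a separate update step.
// Query the database once for the existing codes instead of once per incoming record.
var existingCodes = new HashSet<string>(
    await _context.Lookup.Select(r => r.code).ToListAsync());

var newEntries = input
    .Where(item => !existingCodes.Contains(item.Code))
    .Select(item => new Lookup
    {
        lookupType = "JOBCODE",
        code = item.Code,
        description = item.Description,
        isNew = true,
        lastUpdatedDate = DateTime.UtcNow
    })
    .ToList();

await _context.Lookup.AddRangeAsync(newEntries);
await _context.SaveChangesAsync();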
I'm on my phone, so forgive me for not being able to format the code in my response. The solution to your problem is something I actually just encountered myself while syncing data from an Azure Function and a third-party app into a SQL database.
Depending on your table schema, you need one column with a unique identifier. Make this column a primary key (the first step to preventing duplicates). Here's a resource for that: https://www.w3schools.com/sql/sql_primarykey.ASP
The next step is your stored procedure. You'll need to perform what's commonly referred to as an UPSERT. To do this you'll need to merge a table with the incoming data on a specified column (whichever is your primary key).
That would look something like this:
MERGE Table_1 AS T1
USING Incoming_Data AS source
ON T1.column1 = source.column1
-- you can use an AND / OR operator here for matching on additional values or combinations
WHEN MATCHED THEN
    UPDATE SET T1.column2 = source.column2
    -- etc. for more columns
WHEN NOT MATCHED THEN
    INSERT (column1, column2, column3)
    VALUES (source.column1, source.column2, source.column3);
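To tie this back to the Lookup table from the question, one way to drive such a MERGE from EF Core 3.1 is sketched below. It is untested: the table and column names are taken from the question, the values are passed as parameters via ExecuteSqlInterpolatedAsync, and it still issues one statement per incoming record.
foreach (var item in input)
{
    // Update the description when the code already exists, otherwise insert a new row.
    await _context.Database.ExecuteSqlInterpolatedAsync($@"
        MERGE Lookup AS T1
        USING (SELECT {item.Code} AS code, {item.Description} AS description) AS source
        ON T1.code = source.code
        WHEN MATCHED THEN
            UPDATE SET T1.description = source.description,
                       T1.lastUpdatedDate = GETUTCDATE()
        WHEN NOT MATCHED THEN
            INSERT (lookupType, code, description, isNew, lastUpdatedDate)
            VALUES ('JOBCODE', source.code, source.description, 1, GETUTCDATE());");
}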
First of all, you should decouple the format in which you get your data from your actual data handling. In your case: get rid of the JSON before you actually interpret the data.
Alas, I haven't got a clue what your data represents, so let's assume your data is a sequence of Customer Orders. When you get new data, you want to add all new orders, and you want to update changed orders.
So somewhere you have a method that takes your JSON data as input and returns a sequence of Orders as output:
IEnumerable<Order> InterpretJsonData(string jsonData)
{
...
}
You know JSON better than I do; besides, this conversion is a bit beside your question.
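Just as an illustration, with Newtonsoft.Json (as already used in the question) and assuming the JSON is simply an array of objects whose property names match Order, it could be as small as this:
IEnumerable<Order> InterpretJsonData(string jsonData)
{
    // Assumption: the JSON is an array whose property names match the Order class.
    return JsonConvert.DeserializeObject<List<Order>>(jsonData) ?? new List<Order>();
}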
You wrote:
So, if the same data is present in the table it should ignore that record, else if there are any changes it should update that record or insert a new record
You need an Equality Comparer
To detect whether there are added or changed Customer Orders, you need something to detect whether Order A equals Order B. There must be at least one unique field by which you can identify an Order, even if all other values of the Order are changed.
This unique value is usually called the primary key, or the Id. I assume your Orders have an Id.
So if your new Order data contains an Id that was not available before, then you are certain that the Order was Added.
If your new Order data has an Id that was already in previously processed Orders, then you have to check the other values to detect whether it was changed.
For this you need equality comparers: one that says that two Orders are equal if they have the same Id, and one that checks all values for equality.
A standard pattern is to derive your comparer from class EqualityComparer<Order>
class OrderComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ByValue = new OrderComparer();

    ... // TODO implement
}
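For the "same Id" comparer mentioned above, a minimal sketch (assuming Id is an int) could look like this:
class OrderIdComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ById = new OrderIdComparer();

    public override bool Equals(Order x, Order y)
    {
        if (x == null) return y == null;
        if (y == null) return false;
        return x.Id == y.Id;              // same Id means same Order
    }

    public override int GetHashCode(Order x)
    {
        return x == null ? 0 : x.Id.GetHashCode();
    }
}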
First I'll show you how to use the ByValue comparer to detect additions and changes, then I'll show you how to implement it.
Somewhere you have access to the already processed Orders:
IEnumerable<Order> GetProcessedOrders() {...}
var jsondata = FetchNewJsonOrderData();
// convert the jsonData into a sequence of Orders
IEnumerable<Order> orders = this.InterpretJsonData(jsondata);
To detect which Orders are added or changed, you could make a Dictionary of the already processed orders and check the new orders one by one to see whether they are changed:
IEqualityComparer<Order> comparer = OrderComparer.ByValue;
Dictionary<int, Order> processedOrders = this.GetProcessedOrders()
    .ToDictionary(order => order.Id);

foreach (Order order in orders)
{
    if (processedOrders.TryGetValue(order.Id, out Order originalOrder))
    {
        // order already existed. Is it changed?
        if (!comparer.Equals(order, originalOrder))
        {
            // unequal!
            this.ProcessChangedOrder(order);

            // remember the changed values of this Order
            processedOrders[order.Id] = order;
        }
        // else: no changes, nothing to do
    }
    else
    {
        // Added!
        this.ProcessAddedOrder(order);
        processedOrders.Add(order.Id, order);
    }
}
Immediately after Processing the changed / added order, I remember the new value, because the same Order might be changed again.
If you want this in a LINQ fashion, you have to GroupJoin the Orders with the ProcessedOrders, to get "Orders with their zero or more Previously processed Orders" (there will probably be zero or one Previously processed order).
var ordersWithPreviouslyProcessedOrder = orders.GroupJoin(this.GetProcessedOrders(),

    order => order.Id,                   // from every Order take the Id
    processedOrder => processedOrder.Id, // from every previously processed Order take the Id

    // parameter resultSelector: from every Order, with its zero or more previously
    // processed Orders, make one new:
    (order, previouslyProcessedOrders) => new
    {
        Order = order,
        ProcessedOrder = previouslyProcessedOrders.FirstOrDefault(),
    })
    .ToList();
I use GroupJoin instead of Join, because this way I also get the "Orders that have no previously processed orders" (= new orders). If you would use a simple Join, you would not get them.
I do a ToList, so that in the next statements the group join is not done twice:
var addedOrders = ordersWithPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder == null);
var changedOrders = ordersWithPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder != null
        && !comparer.Equals(orderCombi.Order, orderCombi.ProcessedOrder));
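From there the two sets can be fed to the same processing methods that the foreach version used:
foreach (var combi in addedOrders)
{
    this.ProcessAddedOrder(combi.Order);
}
foreach (var combi in changedOrders)
{
    this.ProcessChangedOrder(combi.Order);
}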
Implementation of "Compare by Value"
// equal if all values equal
public override bool Equals(Order x, Order y)
{
    if (x == null) return y == null;   // true if both null, false if x null but y not null
    if (y == null) return false;       // because x is not null
    if (Object.ReferenceEquals(x, y)) return true;
    if (x.GetType() != y.GetType()) return false;

    // compare all properties one by one:
    return x.Id == y.Id
        && x.Date == y.Date
        && ...
}
For GetHashCode there is one rule: if X equals Y then they must have the same hash code. If they are not equal there is no rule, but lookups are more efficient if they have different hash codes. Make a trade-off between calculation speed and hash code uniqueness.
In this case: If two Orders are equal, then I am certain that they have the same Id. For speed I don't check the other properties.
public override int GetHashCode(Order x)
{
    if (x == null)
        return 0;                      // just a fixed hash code for all null Orders
    else
        return x.Id.GetHashCode();
}
I have a list of Customers who each have a list of Orders. Each Order has a list of LineItems.
I would like to write a LINQ query that would get me the top 10 customers based on order value (i.e. money spent) and not the total number of orders.
One customer could have 2 orders but could have spent £10,000, but another customer could have 100 orders, and only spent £500.
Right now, I have this which gets me the top 10 customers by the number of orders.
var customers = (from c in _context.Customers
                 where c.SaleOrders.Count > 0
                 let activeCount = c.SaleOrders.Count(so => so.Status != SaleOrderStatus.Cancelled)
                 orderby activeCount descending
                 select c).Take(10);
UPDATE
Thanks to Jon Skeet's comment about doing a double Sum, I wrote the following query which compiles.
var customers = (from c in _context.Customers
                 where c.SaleOrders.Count > 0
                 let orderSum = c.SaleOrders.Where(so => so.Status != SaleOrderStatus.Cancelled)
                                            .Sum(so => so.LineItems.Sum(li => li.CalculateTotal()))
                 orderby orderSum descending
                 select c).Take(10);
But when I run this, I get the following error:
It seems LINQ doesn't recognise my .CalculateTotal() method, which sits on my LineItem.cs entity.
The problem you are seeing is that CalculateTotal() is not something that LINQ can translate into SQL (the translation is done at run-time, hence no compiler error).
The essential problem here is that LINQ doesn't really work on lambdas (Func<>) but on Expressions (Expression<Func<>>), which are the code in a partially compiled state that LINQ then disassembles and translates into SQL.
So, let's assume CalculateTotal is a member function defined like this:
public decimal CalculateTotal()
{
    return this.quantity * this.value;
}
We could define that as a local lambda function
Func<LineItem, decimal> CalculateTotal = (li => li.quantity * li.value);
Now, we have a lambda which takes a LineItem and returns a value, which is exactly what Sum() wants, so now we can replace:
.Sum(so => so.LineItems.Sum(li => li.CalculateTotal()))
with
.Sum(so => so.LineItems.Sum(CalculateTotal))
But that will crash, just as it did before, because, as I said, it wants an Expression. So, we give it one:
Expression<Func<LineItem, decimal>> CalculateTotal = (li => li.quantity * li.value);
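For reference, inlining that same calculation directly into the query from the question, so the provider only ever sees translatable expressions, would look roughly like this (assuming quantity and value are the properties CalculateTotal multiplies):
var customers = (from c in _context.Customers
                 where c.SaleOrders.Count > 0
                 let orderSum = c.SaleOrders.Where(so => so.Status != SaleOrderStatus.Cancelled)
                                            .Sum(so => so.LineItems.Sum(li => li.quantity * li.value))
                 orderby orderSum descending
                 select c).Take(10);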
I have two issues I'm struggling with in LINQ and would appreciate your advice. I have two lists: rawStates (storing rows of entity-downtime-uptime-eventtype) and rawData (storing products' in and out times for an entity).
I want to select those elements from rawStates that occurred while a product was still waiting for that entity to be processed:
foreach (var t in rawData)
{
    var s = rawStates
        // I am not sure if this single logic clause in Where is enough;
        .Where(o => o.Entity == t.Entity
                    && o.DownDate > t.InTime
                    && o.Update < t.OutTime)
        .ToList();
}
If I group my rawData by ProductID (there are multiple rows with the same ProductID), how can I relate this "s" back to those groups so that, for each ProductID, I can group by eventtype and summarise the durations?
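Something like the sketch below is the shape I'm trying to reach, though I'm not sure this is the right way to combine the two lists (Entity, DownDate, Update, InTime and OutTime are the names from the query above; ProductID, EventType and the duration calculation are my assumptions):
var durationsByProduct = rawData
    .SelectMany(t => rawStates
        .Where(o => o.Entity == t.Entity
                    && o.DownDate > t.InTime
                    && o.Update < t.OutTime)
        .Select(o => new { t.ProductID, o.EventType, Duration = o.Update - o.DownDate }))
    .GroupBy(x => new { x.ProductID, x.EventType })
    .Select(g => new
    {
        g.Key.ProductID,
        g.Key.EventType,
        TotalDuration = TimeSpan.FromTicks(g.Sum(x => x.Duration.Ticks))
    })
    .ToList();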
I have two tables with a one to many relation.
One Order has many Products.
I want to take a list of Orders with the free Product in each (one per order).
I've tried something like this:
this.ObjectContext.ORDERS.Include("PRODUCTS").Where(e=>e.PRODUCTS.price == 0).OrderBy(e => e.Order);
But this is not working.
Is there any other approach??
Thanks in advance.
If PRODUCTS is a collection, you can use the Any() extension method to find out whether any product is free (or the All() method to find out whether all products are free):
this.ObjectContext.ORDERS
.Include("PRODUCTS")
.Where(o => o.PRODUCTS.Any(p => p.price == 0))
.OrderBy(e => e.Order);
Based on your clarification of the desired output (all orders with optional free product), you can use this query:
this.ObjectContext.ORDERS
.Include("PRODUCTS")
.Select(o => new {
Order = o,
FreeProduct = o.PRODUCTS.FirstOrDefault(p => p.price == 0)
});
It will return a sequence of an anonymous type containing the order and the free product (or null if no free product exists).
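For what it's worth, consuming that projection might look like this (a hypothetical sketch; it assumes the query above was assigned to ordersWithFreeProduct):
foreach (var item in ordersWithFreeProduct)
{
    if (item.FreeProduct != null)
    {
        // this order has a free product; item.FreeProduct is the PRODUCTS entity
    }
    else
    {
        // no free product on this order
    }
}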
All,
This problem relates to the Dynamics CRM 2011 LINQ provider, which does have a LOT of quirks. However, I have not tagged it as such, as I think this is a general LINQ question.
I have a class- Product. It has a property (say ProductPrice) of type Price.
I am doing an Outer Join on this in Linq. The CRM documentation says outer joins are not possible, but it seems to work (with the obvious problem I am asking here).
So say I am doing something like this (apologies for the pseudo-LINQ):
IList<Product> products = (from p in xrmContext.Products
                           join pr in xrmContext.Prices
                               on p.ProductId equals pr.ProductId into prx
                           from prices in prx.DefaultIfEmpty()
                           select new Product
                           {
                               ProductName = p.productName,
                               ProductPrice = new Price { Amount = prices.PriceValue }
                           }).ToList();
This works great to a point. It creates all the products irrespective of whether they have a price object or not. Tippety top.
The problem is the DefaultIfEmpty. As you are no doubt aware, if a product has no price, this DefaultIfEmpty will create a 'default' price object, i.e. an object with null values. What I actually want is NO price object, i.e. null, not a 'blank' object.
How is that possible?
I have worked around it by testing for a blank price name: ProductPrice = price.priceName == "" ? null : new Price ...
It would be nice to be able to do something like NullIfEmpty. Any ideas?
You can skip the join:
from p in xrmContext.Products
let price = xrmContext.Prices.FirstOrDefault(pr => pr.ProductID == p.ProductID)
select new Product()
{
    ProductName = p.productName,
    ProductPrice = price != null ? new Price() { Amount = price.PriceValue } : null
}
CRM 2011 LINQ doesn't support outer joins.
If you try Amiram's code, you'll get an error when evaluating the select: "products doesn't contain attribute "pricevalue"".