I had to write a method that does the following:
There is a DataSet, let's say CarDataSet, with one table Car that contains a primary key Id and one more column ColorId. There is also a string of Ids separated by commas, for example "5,6,7,8" (of arbitrary length). The task is to check whether the ColorIds are identical for all of the given Car Ids.
For example:
String ids = "5,6,7,8"
If the ColorIds of the cars with Ids 5,6,7,8 are, for example, 3,3,3,3, then return true;
In other words: check whether all cars with the given Ids have the same color. I don't have my code anymore, but I did this using 3 foreach loops and 3 LINQ expressions. Is there any simpler way to do this?
If you want to check that all cars have the same color, it means all of them should have the same color as the first one:
// first find the cars with given ids
var selectedCars = Cars.Where(x => ids.Contains(x.ID.ToString()));
// select one of them as comparer:
var firstCar = selectedCars.FirstOrDefault();
if (firstCar == null)
return true;
// check that all of them have the same color as the first one:
return selectedCars.All(x=>x.ColorID == firstCar.ColorID);
Edit: Or, if you have no problem with an exception being thrown when there is no car with the given ids, you can use two queries in lambda syntax:
var selectedCars = Cars.Where(x=>ids.Contains(x.ID.ToString()));
return selectedCars.All(x=>x.ColorID == selectedCars.First().ColorID);
You could do this by performing a Distinct on the ColorIDs and asserting that the count is 1.
var count = Cars.Where(x => ids.Contains(x.ID.ToString()))
                .Select(x => x.ColorID)
                .Distinct()
                .Count();
return count == 1;
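For the DataSet described in the question, a rough equivalent could look like this (a minimal sketch: the Car table and the Id / ColorId column names come from the question; I'm assuming both columns are ints and that System.Data, System.Linq and the System.Data.DataSetExtensions assembly are referenced):
// sketch: do all cars with the given Ids share one ColorId?
static bool AllSameColor(DataSet carDataSet, string ids)
{
    var idSet = new HashSet<int>(ids.Split(',').Select(int.Parse));

    var distinctColorIds = carDataSet.Tables["Car"].AsEnumerable()
        .Where(row => idSet.Contains(row.Field<int>("Id")))
        .Select(row => row.Field<int>("ColorId"))
        .Distinct()
        .Count();

    // zero matching cars counts as "all the same" here; adjust if you prefer false
    return distinctColorIds <= 1;
}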
I'm pulling data from a third-party API. The API runs multiple times a day. If the same data is already present in the table it should ignore that record, update the record if there are any changes, or insert a new record if anything new shows up in the JSON received.
I'm using the below code for inserting any new data.
var input = JsonConvert.DeserializeObject<List<DeserializeLookup>>(resultJson).ToList();
var entryset = input.Select(y => new Lookup
{
lookupType = "JOBCODE",
code = y.Code,
description = y.Description,
isNew = true,
lastUpdatedDate = DateTime.UtcNow
}).ToList();
await _context.Lookup.AddRangeAsync(entryset);
await _context.SaveChangesAsync();
But after the first run, when the API runs again, it inserts the same data into the table again, so duplicate entries end up in the table. To handle this, I used a foreach loop as below before inserting data into the table.
foreach (var item in input)
{
if (!_context.Lookup.Any(r =>
r.code== item.Code))
{
//above insert code
}
}
But this doesn't work as expected. Also, the API takes a lot of time to run when I use the foreach loop. Is there a solution to this in .NET Core 3.1?
It will be better if you try it this way: collect the new items first, then add them all in one batch.
// collect only the items whose code is not in the table yet
List<DeserializeLookup> newItems = new();
foreach (var item in input)
{
    if (!_context.Lookup.Any(r => r.code == item.Code))
    {
        newItems.Add(item);
    }
}
// map them with the same insert code as in the question
var entryset = newItems.Select(y => new Lookup
{
    lookupType = "JOBCODE",
    code = y.Code,
    description = y.Description,
    isNew = true,
    lastUpdatedDate = DateTime.UtcNow
}).ToList();
await _context.Lookup.AddRangeAsync(entryset);
await _context.SaveChangesAsync();
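If the per-item Any() lookups are what make the run slow, a possible variation (my own sketch, not part of the original answer) is to pull the existing codes once and filter in memory:
// sketch: one database round trip for the existing codes instead of one query per item
var existingCodes = _context.Lookup.Select(r => r.code).ToHashSet();
var newItems = input.Where(item => !existingCodes.Contains(item.Code)).ToList();
// then map to Lookup entities and AddRangeAsync / SaveChangesAsync as above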
I'm on my phone, so forgive me for not being able to format the code in my response. The solution to your problem is something I actually just encountered myself while syncing data from an Azure Function and a third-party app into a SQL database.
Depending on your table schema, you would need one column with a unique identifier. Make this column a primary key (first step to preventing duplicates). Here’s a resource for that: https://www.w3schools.com/sql/sql_primarykey.ASP
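If the table is mapped through EF Core (as in the question), a rough equivalent is to declare a unique index in OnModelCreating so duplicate inserts fail at the database. A sketch, using the property names from the question's Lookup class:
// sketch: enforce uniqueness of the business key in the EF Core model
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Lookup>()
        .HasIndex(l => new { l.lookupType, l.code })
        .IsUnique();
}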
The next step you want to take care of is your stored procedure. You'll need to perform what's commonly referred to as an UPSERT. To do this you'll need to merge a table with the incoming data on a specified column (whichever is your primary key).
That would look something like this:
MERGE Table_1 AS T1
USING Incoming_Data AS source
ON T1.column1 = source.column1
-- you can use an AND / OR operator here to match on additional values or combinations
WHEN MATCHED THEN
    UPDATE SET T1.column2 = source.column2
    -- etc. for more columns
WHEN NOT MATCHED THEN
    INSERT (column1, column2, column3)
    VALUES (source.column1, source.column2, source.column3);
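If you wrap the MERGE above in a stored procedure, you can trigger it from the .NET Core side once the incoming rows have been staged. A minimal sketch, assuming a hypothetical procedure name dbo.UpsertLookups:
// sketch: dbo.UpsertLookups is a hypothetical wrapper around the MERGE above,
// run after the incoming rows have been written to the staging table
await _context.Database.ExecuteSqlRawAsync("EXEC dbo.UpsertLookups");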
First of all, you should decouple the format in which you get your data from your actual data handling. In your case: get rid of the JSON before you actually interpret the data.
Alas, I haven't got a clue what your data represents, so let's assume your data is a sequence of Customer Orders. When you get new data, you want to add all new Orders, and you want to update the changed Orders.
So somewhere you have a method that takes your JSON data as input and produces a sequence of Orders as output:
IEnumerable<Order> InterpretJsonData(string jsonData)
{
...
}
You know JSON better than I do; besides, this conversion is a bit beside your question.
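For completeness, a minimal sketch of that method, reusing the JsonConvert call from the question (it assumes the JSON is an array whose property names line up with the Order class):
IEnumerable<Order> InterpretJsonData(string jsonData)
{
    // sketch: reuse Json.NET exactly as in the question's code
    return JsonConvert.DeserializeObject<List<Order>>(jsonData);
}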
You wrote:
So, if the same data is present in the table it should ignore that record, else if there are any changes it should update that record or insert a new record
You need an Equality Comparer
To detect whether there are added or changed Customer Orders, you need something that detects whether Order A equals Order B. There must be at least one unique field by which you can identify an Order, even if all other values of the Order are changed.
This unique value is usually called the primary key, or the Id. I assume your Orders have an Id.
So if your new Order data contains an Id that was not available before, then you are certain that the Order was Added.
If your new Order data has an Id that was already in previously processed Orders, then you have to check the other values to detect whether it was changed.
For this you need equality comparers: one that says two Orders are equal if they have the same Id, and one that checks all values for equality.
A standard pattern is to derive your comparer from class EqualityComparer<Order>
class OrderComparer : EqualityComparer<Order>
{
public static IEqualityComparer<Order> ByValue = new OrderComparer();
... // TODO implement
}
First I'll show you how to use this to detect additions and changes, then I'll show you how to implement it.
Somewhere you have access to the already processed Orders:
IEnumerable<Order> GetProcessedOrders() {...}
var jsondata = FetchNewJsonOrderData();
// convert the jsonData into a sequence of Orders
IEnumerable<Order> orders = this.InterpretJsonData(jsondata);
To detect which Orders are added or changed, you could make a Dictionary of the already processed Orders and check the new Orders one by one to see whether they have changed:
IEqualityComparer<Order> comparer = OrderComparer.ByValue;
Dictionary<int, Order> processedOrders = this.GetProcessedOrders()
.ToDictionary(order => order.Id);
foreach (Order order in orders)
{
    if (processedOrders.TryGetValue(order.Id, out Order originalOrder))
    {
        // order already existed. Is it changed?
        if (!comparer.Equals(order, originalOrder))
        {
            // unequal!
            this.ProcessChangedOrder(order);
            // remember the changed values of this Order
            processedOrders[order.Id] = order;
        }
        // else: no changes, nothing to do
    }
    else
    {
        // Added!
        this.ProcessAddedOrder(order);
        processedOrders.Add(order.Id, order);
    }
}
Immediately after Processing the changed / added order, I remember the new value, because the same Order might be changed again.
If you want this in a LINQ fashion, you have to GroupJoin the Orders with the ProcessedOrders, to get "Orders with their zero or more Previously processed Orders" (there will probably be zero or one Previously processed order).
var ordersWithTheirPreviouslyProcessedOrder = orders.GroupJoin(this.GetProcessedOrders(),
    order => order.Id,                    // from every Order take the Id
    processedOrder => processedOrder.Id,  // from every previously processed Order take the Id
    // parameter resultSelector: from every Order, with its zero or more previously
    // processed Orders, make one new object:
    (order, previouslyProcessedOrders) => new
    {
        Order = order,
        ProcessedOrder = previouslyProcessedOrders.FirstOrDefault(),
    })
    .ToList();
I use GroupJoin instead of Join, because this way I also get the "Orders that have no previously processed Orders" (= new Orders). If you used a simple Join, you would not get them.
I do a ToList, so that in the next statements the group join is not done twice:
var addedOrders = ordersWithTheirPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder == null);
var changedOrders = ordersWithTheirPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder != null
        && !comparer.Equals(orderCombi.Order, orderCombi.ProcessedOrder));
Implementation of "Compare by Value"
// equal if all values are equal
public override bool Equals(Order x, Order y)
{
    if (x == null) return y == null;      // true if both null, false if x is null but y is not
    if (y == null) return false;          // because x is not null
    if (Object.ReferenceEquals(x, y)) return true;
    if (x.GetType() != y.GetType()) return false;
    // compare all properties one by one:
    return x.Id == y.Id
        && x.Date == y.Date
        && ...
}
For GetHashCode there is one rule: if X equals Y, then they must have the same hash code. If they are not equal, there is no rule, but lookups are more efficient if unequal values have different hash codes. Make a tradeoff between calculation speed and hash code uniqueness.
In this case: If two Orders are equal, then I am certain that they have the same Id. For speed I don't check the other properties.
public override int GetHashCode(Order x)
{
    if (x == null)
        return 34339;                 // just an arbitrary hash code for all null Orders
    else
        return x.Id.GetHashCode();
}
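For completeness, the "same Id" comparer mentioned earlier would be much simpler. A sketch (the dictionary keyed on Id above already plays this role, so you only need this if you prefer set operations such as Except or Intersect):
class OrderIdComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ById = new OrderIdComparer();

    // two Orders are "the same Order" when their Ids match
    public override bool Equals(Order x, Order y)
    {
        if (x == null) return y == null;
        if (y == null) return false;
        return x.Id == y.Id;
    }

    public override int GetHashCode(Order x)
    {
        return x == null ? 34339 : x.Id.GetHashCode();
    }
}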
I have an Action method in my controller which returns a list of objects:
public ActionResult GetCats(long Id, string strsortorder, string dltIds)
{
var Result=objrepo.GetCats(Id);//this method returns me List of Result
}
My array looks like this:
var Result=[{CatId:1015,CatName:Abc},{CatId:1016,CatName:Acd},
{CatId:1017,CatName:Adf},{CatId:1018,CatName:CDdf},{CatId:1019,CatName:asdas},
{CatId:1020,CatName:Abc},{CatId:1021,CatName:Abc},{CatId:1022,CatName:Abc},
{CatId:1023,CatName:Abc},{CatId:1024,CatName:Abc}]
What I want to do is:
Using two more parameters in my Action Method "strsortorder" and "dltIds"
that have a list of ids like this:
strsortorder="1021,1015,1016,1019,1022";
dltIds="1017,1018,1020";
From this the "Result" returned from my method , I want to remove the records which are in "dltids" and the remaining array should be sorted in the order which I have in "strsortorder";
In the end the new object should look like this:
var NewResult=[{CatId:1021,CatName:Abc},{CatId:1015,CatName:Abc},
{CatId:1016,CatName:Acd},{CatId:1019,CatName:asdas},{CatId:1022,CatName:Abc},
{CatId:1023,CatName:Abc},{CatId:1024,CatName:Abc}]
Can anyone help me achieve this in LINQ or any other way?
I want to avoid any type of loop or foreach here as much as possible. I know it can be done by looping, but I want to avoid that since the result can sometimes contain large amounts of data.
I realized you can use an ArrayList instead of a Dictionary, and it would be faster. I think the Dictionary version makes it clear how this works, but here is the "better" implementation using an ArrayList:
var excludeList = dltIds.Split(",".ToCharArray());
ArrayList sortList = new ArrayList(strsortorder.Split(",".ToCharArray()));
var NewResult =
Result.Where(item => ! excludeList.Contains(item.CatId.ToString()))
.OrderBy(item => {
if (sortList.Contains(item.CatId.ToString()))
return sortList.IndexOf(item.CatId.ToString());
return sortList.Count;
});
Original answer below:
public ActionResult GetCats(long Id, string strsortorder, string dltIds)
{
var Result=objrepo.GetCats(Id);//this method returns me List of Result
var excludeList = dltIds.Split(",".ToCharArray());
int orderCount = 0; // used in the closure creating the Dictionary below
var sortList = strsortorder.Split(",".ToCharArray())
.ToDictionary(x => x,x => orderCount++);
// filter
var NewResult =
Result.Where(item => ! excludeList.Contains(item.CatId.ToString()))
.OrderBy(item => {
if (sortList.ContainsKey(item.CatId.ToString()))
return sortList[item.CatId.ToString()];
return sortList.Count();
});
}
How this works:
First I create lists out of your comma-separated strings using Split.
Then I create a dictionary with the key being the ordering ID and the value being an integer that goes up by one for each entry.
For the filtering, I check whether an item is in the exclude array before I continue processing the item.
I then sort by matching the CatId against the dictionary key and returning the value; this sorts things in the order of the list, since I incremented a counter when creating the values. If an item is not in the dictionary, I return one more than the maximum value in the dictionary, which is the count of the items. (I could have used the current value of orderCount instead.)
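For the sample strsortorder from the question, the dictionary and the resulting sort keys look like this (illustrative comments only):
// strsortorder = "1021,1015,1016,1019,1022" produces:
//   { "1021" -> 0, "1015" -> 1, "1016" -> 2, "1019" -> 3, "1022" -> 4 }
// CatIds not in the dictionary (1023, 1024) get sort key 5 (the count), so they sort last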
Questions?
I have a very basic SQL view which joins 3 tables: users, pictures, and tags.
How would one create the query below in a way that it won't list the same picture more than once? In other words, I want to group by picture (I think) and return the first instance of each.
I think this is very similar to the post Linq Query Group By and Selecting First Items, but I cannot figure out how to apply it in this case where the query is instantiating MyImageClass.
validPicSummaries = (from x in db.PicsTagsUsers
where x.enabled == 1
select new MyImageClass {
PicName = x.picname,
Username= x.Username,
Tag = x.tag }).Take(50);
To exclude duplicates, you can use the Distinct LINQ method:
validPicSummaries =
(from x in db.PicsTagsUsers
where x.tag == searchterm && x.enabled == 1
select new MyImageClass
{
PicName = x.picname,
Username= x.Username,
Tag = x.tag
})
.Distinct()
.Take(50);
You will need to make sure that the objects are comparable so that two MyImageClass objects that have the same PicName, Username, and Tag are considered equal (or however you wish to consider two of them as being equal).
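One way to do that (a sketch, assuming you can edit MyImageClass) is to override Equals and GetHashCode on the class itself; when the query runs in memory, the parameterless Distinct() will then use them, and a database provider will typically translate Distinct() to a SQL DISTINCT over the projected columns anyway:
// sketch: add these overrides inside your existing MyImageClass
// (property names taken from the query above)
public override bool Equals(object obj)
{
    var other = obj as MyImageClass;
    return other != null
        && PicName == other.PicName
        && Username == other.Username
        && Tag == other.Tag;
}

public override int GetHashCode()
{
    return (PicName ?? "").GetHashCode()
        ^ (Username ?? "").GetHashCode()
        ^ (Tag ?? "").GetHashCode();
}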
You can write a small class that implements IEqualityComparer<T> if you would like to have a custom comparer for just this case. Ex:
private class MyImageClassComparer : IEqualityComparer<MyImageClass>
{
public bool Equals(MyImageClass pMyImage1, MyImageClass pMyImage2)
{
// some test of the two objects to determine
// whether they should be considered equal
return pMyImage1.PicName == pMyImage2.PicName
&& pMyImage1.Username == pMyImage2.Username
&& pMyImage1.Tag == pMyImage2.Tag;
}
public int GetHashCode(MyImageClass pMyImageClass)
{
    // LINQ's Distinct uses GetHashCode first to group candidates and then
    // calls Equals to confirm equality. A common way to combine hash codes
    // is to XOR them:
    return pMyImageClass.PicName.GetHashCode()
        ^ pMyImageClass.Username.GetHashCode()
        ^ pMyImageClass.Tag.GetHashCode();
}
}
Then when you call distinct:
...
.Distinct(new MyImageClassComparer())
.Take(50);
I am trying to select some records using LINQ for Entities (EF4 Code First).
I have a table called Monitoring with a field called AnimalType which has values such as
"Lion,Tiger,Goat"
"Snake,Lion,Horse"
"Rattlesnake"
"Mountain Lion"
I want to pass in some values in a string array (animalValues) and have the rows returned from the Monitorings table where one or more values in the field AnimalType match the one or more values from the animalValues. The following code ALMOST works as I wanted but I've discovered a major flaw with the approach I've taken.
public IQueryable<Monitoring> GetMonitoringList(string[] animalValues)
{
var result = from m in db.Monitorings
where animalValues.Any(c => m.AnimalType.Contains(c))
select m;
return result;
}
To explain the problem: if I pass in animalValues = { "Lion", "Tiger" }, I find that three rows are selected, because the 4th record, "Mountain Lion", contains the word "Lion", which it regards as a match.
This isn't what I wanted to happen. I need "Lion" to only match "Lion" and not "Mountain Lion".
Another example: if I pass in "Snake" I get rows which include "Rattlesnake". I'm hoping somebody has a better bit of LINQ code that will match the exact comma-delimited value and not just part of it, as with "Snake" matching "Rattlesnake".
This is a kind of hack that will do the work:
public IQueryable<Monitoring> GetMonitoringList(string[] animalValues)
{
var values = animalValues.Select(x => "," + x + ",");
var result = from m in db.Monitorings
where values.Any(c => ("," + m.AnimalType + ",").Contains(c))
select m;
return result;
}
This way, you will have
",Lion,Tiger,Goat,"
",Snake,Lion,Horse,"
",Rattlesnake,"
",Mountain Lion,"
And check for ",Lion," and "Mountain Lion" won't match.
It's dirty, I know.
Because the data in your field is comma-delimited, you really need to break those entries up individually. Since SQL doesn't really support a way to split strings, the option I've come up with is to execute two queries.
The first query uses the code you started with to at least get you in the ballpark and minimize the amount of data you're retrieving. It converts the result to a List<> to actually execute the query and bring the results into memory, which allows access to string methods like Split().
The second query uses the subset of data in memory and joins it with your database table to then pull out the exact matches:
public IQueryable<Monitoring> GetMonitoringList(string[] animalValues)
{
// execute a query that is greedy in its matches, but at least
// it's still only a subset of data. The ToList()
// brings the data into memory, so to speak
var subsetData = (from m in db.Monitorings
where animalValues.Any(c => m.AnimalType.Contains(c))
select m).ToList();
// given that subset of data in the List<>, join it against the DB again
// and get the exact matches this time
var result = from data in subsetData
join m in db.Monitorings on data.ID equals m.ID
where data.AnimalType.Split(',').Intersect(animalValues).Any ()
select m;
return result;
}
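With the sample rows from the question, the two stages behave like this (illustrative comments only):
// animalValues = { "Lion", "Tiger" }
// first (greedy) query keeps:  "Lion,Tiger,Goat", "Snake,Lion,Horse", "Mountain Lion"
// second (exact) filter keeps: "Lion,Tiger,Goat", "Snake,Lion,Horse"
//   ("Mountain Lion" is dropped: Split(',') yields "Mountain Lion", which never equals "Lion")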
I want a list of counts for some of my data (count the number of open/closed tasks, etc.). I want to get all the counts inside one query, so I am not sure what to do with my LINQ statement below...
_user is an object that returns info about the currently logged-on user.
_repo is an object that returns an IQueryable of whichever table I want to select.
var counters = (from task in _repo.All<InstructionTask>()
where task.AssignedToCompanyID == _user.CompanyID || task.CompanyID == _user.CompanyID
join instructions in _repo.GetAllMyInstructions(_user) on task.InstructionID equals
instructions.InstructionID
group new {task, instructions}
by new
{
task
}
into g
select new
{
TotalEveryone = g.Count(),
TotalMine = g.Count(),
TotalOpen = g.Count(x => x.task.IsOpen),
TotalClosed = g.Count(c => !c.task.IsOpen)
}).SingleOrDefault();
Do I convert my object with SingleOrDefault? The exception I am getting is "Sequence contains more than one element".
Note: I want overall stats, not for each task but for all tasks; I'm not sure how to get that.
You need to dump everything into a single group, and use a regular Single. I am not sure if LINQ-to-SQL would be able to translate it correctly, but it's definitely worth a try.
var counters = (from task in _repo.All<InstructionTask>()
                where task.AssignedToCompanyID == _user.CompanyID || task.CompanyID == _user.CompanyID
                join instructions in _repo.GetAllMyInstructions(_user)
                    on task.InstructionID equals instructions.InstructionID
                group task by 1 /* <<=== All tasks go into one group */ into g
                select new
                {
                    TotalEveryone = g.Count(),
                    TotalMine = g.Count(),               // <<=== You probably need a condition here
                    TotalOpen = g.Count(x => x.IsOpen),
                    TotalClosed = g.Count(c => !c.IsOpen)
                }).Single();
From MSDN
Returns the only element of a sequence, or a default value if the
sequence is empty; this method throws an exception if there is more
than one element in the sequence.
You need to use FirstOrDefault. SingleOrDefault is designed for sequences that contain exactly one element (or none).
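A quick illustration of the difference (a standalone sketch, not tied to the query above):
var one = new[] { 1 };
var many = new[] { 1, 2 };

one.SingleOrDefault();   // 1
many.FirstOrDefault();   // 1  (takes the first element and ignores the rest)
many.SingleOrDefault();  // throws InvalidOperationException: "Sequence contains more than one element"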