EF5 .Local performance

I'm doing this on a table with ~43k rows:
MyDbContext.Stores.Load();
MyDbContext.Stores.Local.Count.Dump(); //horrible performance!
I can see in the profiler that the first instruction fires the SELECT statement that fetches all rows. The second instruction does return the correct value, but only after ~12 seconds, which is not what I expected given that all the data should already be in memory.
What is wrong (or what is its real purpose) with .Local in Entity Framework?

I think you should do this:
var stores = MyDbContext.Stores.ToList();
// stores is in memory after executing .ToList()
var count = stores.Count();
DbSet.Local Property
This property returns an ObservableCollection that contains all Unchanged, Modified, and Added objects that are currently tracked by the context for the given DbSet. The returned observable collection stays in sync with the underlying DbSet collection and the contents of the context. This means that you can modify the observable collection or add/remove entities to/from the underlying DbSet collection (that includes adding entities by executing a query) and both collections will be synchronized.
This property is often used in data binding applications.
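To illustrate that intended use, here is a minimal WPF-style sketch of binding .Local, reusing the MyDbContext/Stores names from the question; the StoresWindow class and StoresGrid control are hypothetical names introduced only for the example.
using System;
using System.Data.Entity;   // Load() extension method
using System.Windows;

public partial class StoresWindow : Window
{
    private readonly MyDbContext _context = new MyDbContext();

    public StoresWindow()
    {
        InitializeComponent();
        _context.Stores.Load();                          // one SELECT; the entities become tracked
        StoresGrid.ItemsSource = _context.Stores.Local;  // ObservableCollection stays in sync with the context
    }

    protected override void OnClosed(EventArgs e)
    {
        _context.Dispose();
        base.OnClosed(e);
    }
}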

Related

Entity Framework and caching - Changes are tracking back to cache

I have some data being pulled in from an Entity model. This contains attributes of items, let's say car parts with max-speed, weight and size. Since there are a lot of parts and the base attributes never change, I've cached all the records.
Depending on the car these parts are used in, these attributes might now be changed, so I setup a new car, copy the values from the cached item "Engine" to the new car object and then add "TurboCharger", which boosts max speed, weight and size of the Engine.
The problem I'm running into is that it seems that the Entity model is still tracking the context back to the cached data. So when weight is increased by the local method, it increases it for all users. I tried adding "MergeOption.NoTracking" to my context as this is supposed to remove all entity tracking, but it still seems to be tracking back. If I turn off the cache, it works fine as it pulls fresh values from the database each time.
If I want to copy a record from my entity model, is there a way I can say "Copy the object but treat it as a standard object with no history of coming from entity" so that once my car has the attributes from an item, it is just a flattened object?
Cheers!
I'm not too sure about MergeOption.NoTracking on the whole context and exactly what it does, but what you can do as an alternative is to add .AsNoTracking() to your query against the database. This will definitely return a detached object.
Take a look here for some details on AsNoTracking usage : http://blog.staticvoid.co.nz/2012/04/entity-framework-and-asnotracking.html.
The other thing is to make sure you enumerate your collection before you insert it into the cache, to ensure that you aren't acting on the queryable, i.e. use .ToArray().
The other option is to manually detach the object from the context (using Detach(T entity)).
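For example, a minimal sketch of materializing detached entities before caching them, assuming the DbContext API (MergeOption.NoTracking belongs to the older ObjectContext API); PartsContext and Part are hypothetical names standing in for your model.
using System.Linq;

public static class PartCache
{
    public static Part[] LoadPartsForCache()
    {
        using (var db = new PartsContext())
        {
            return db.Parts
                     .AsNoTracking()   // returned entities are detached, not change-tracked
                     .ToArray();       // enumerate now, so the cache holds objects rather than a query
        }
    }
}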

Entity Framework - Querying from ObjectContext vs Querying from Navigation Property

I've noticed that depending on how I extract data from my Entity Framework model, I get different types of results. For example, when getting the list of employees in a particular department:
If I pull directly from ObjectContext, I get an IQueryable<Employee>, which is actually a System.Data.Objects.ObjectQuery<Employee>:
var employees = MyObjectContext.Employees.Where(e => e.DepartmentId == MyDepartment.Id && e.SomeCondition);
But if I use the Navigation Property of MyDepartment, I get an IEnumerable<Employee>, which is actually a System.Linq.WhereEnumerableIterator<Employee> (private class in System.Linq.Enumerable):
var employees = MyDepartment.Employees.Where(e => e.SomeCondition);
In the code that follows, I heavily use employees in several LINQ queries (Where, OrderBy, First, Sum, etc.)
Should I be taking into consideration which query method I use? Will there be a performance difference? Does the latter use deferred execution? Is one better practice? Or does it not make a difference?
I ask this because, since installing ReSharper 6, I've been getting lots of "Possible multiple enumeration of IEnumerable" warnings when using the latter method, but none when using direct queries. I've been using the latter method more often, simply because it's much cleaner to write, and I'm wondering whether doing so has actually had a detrimental effect!
There is a very big difference.
With the first approach you have an IQueryable = an expression tree, so you can still add other expressions, and only when you execute the query (deferred execution) is the expression tree converted to SQL and run in the database. So if you take your first example and add .Sum of something, the operation really is executed in the database and only a single number is transferred back to your application. That is LINQ to Entities.
The second example uses an in-memory collection. A navigation property doesn't represent an IQueryable (expression tree); all LINQ commands are treated as LINQ to Objects, which means all records representing the related data in the navigation property must first be loaded from the database into your application, and all operations run in the memory of your application server. You can load a navigation property eagerly (by using Include), explicitly (by using Load), or lazily (it happens automatically when you access the property for the first time, if lazy loading is enabled). So if you want the sum of something, this scenario requires you to load all the data from the database and then perform the operation locally.
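A short sketch of the difference, reusing the names from the question (Salary is a hypothetical column, used only to have something to sum):
// LINQ to Entities: Where and Sum are translated to SQL and run in the database;
// only the resulting number is sent back to the application.
var dbSum = MyObjectContext.Employees
    .Where(e => e.DepartmentId == MyDepartment.Id && e.SomeCondition)
    .Sum(e => e.Salary);

// LINQ to Objects: the navigation property first loads every related Employee into memory,
// then Where and Sum run locally over the loaded objects.
var localSum = MyDepartment.Employees
    .Where(e => e.SomeCondition)
    .Sum(e => e.Salary);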

How to refresh relational property of a LINQ class?

I have two instances of a program that manipulate same Northwind database.
When I add some records to the database from one of the instances (for example adding some orders to Orders table with a customer foreign key John), I can query these new records from the other instance of the program properly. The problem begins when I want to access these new records using John.Orders. In this situation, the second instance of the program does not see newly added records. What should I do?
The problem you are having is probably related to the time you keep the LINQ to SQL DataContext class alive. It should typically be destroyed after each unit of work you do with it (since it follows the 'unit of work' design pattern), which typically means after each use case / business transaction.
You are probably keeping the DataContext class alive during the entire lifetime of the application. The DataContext class is not suited for this, because it caches every object it has ever retrieved, which means your data will get stale.
Create a new DataContext class for every operation or every time the user opens a new form / screen.
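A minimal sketch of the per-unit-of-work pattern; NorthwindDataContext, Order, and CustomerID are the usual designer-generated names for the Northwind sample, so adjust them to your actual model.
using System.Collections.Generic;
using System.Linq;

public static class OrderQueries
{
    public static IList<Order> GetOrdersForCustomer(string customerId)
    {
        // A fresh context per unit of work: nothing stale is cached between calls.
        using (var db = new NorthwindDataContext())
        {
            return db.Orders
                     .Where(o => o.CustomerID == customerId)
                     .ToList();   // materialize before the context is disposed
        }
    }
}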

LINQ to SQL - method run for 2nd time does not return data changes that happened since 1st time

I have created a method in my data context that is mapped to a SQL stored procedure. The method is used in an ASP.NET application, and it is called twice in the lifecycle of a page. It returns the same single object in both cases (i.e. same primary key).
After the 1st call some data changes are made, so on the 2nd call the stored procedure returns the same record but with different property values. If I use the debugger and SQL Profiler I can verify absolutely that the record being returned has the same PK but different property values between the 1st and 2nd calls.
However, on the 2nd call the object returned by the method is identical to the object returned in the 1st call. It is as if LINQ has run the stored procedure but then totally ignored the results, deciding instead that the data couldn't have changed since the first time it was run, so it may as well return a copy of the original object that it happened to hang on to!
I have experimented with setting the datacontext's ObjectTrackingEnabled to false immediately before calling my method, but this stops me being able to reference related objects.
Here's the code I use to call the method:
Dim stl = _DataContext.GetMyStatus(SelectedUserID)
Dim st As MyStatus = stl.FirstOrDefault()
I really need to be able to call this method more than once in the lifecycle of the page, and for it to accurately reflect the current state of the database, so how do I do it?
DataContext produces a single instance per primary key value. It populates this single instance the first time it sees the record, and then returns that instance for any future requests with that key.
If you want to update an existing instance's value from the database, use the Refresh method.
I really need to be able to call this method more than once in the lifecycle of the page, and for it to accurately reflect the current state of the database.
Don't share datacontexts between different page requests.
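If you do have to reuse the same DataContext within the page, here is a C# sketch of the Refresh call (the question's code is VB; _DataContext, GetMyStatus, MyStatus, and SelectedUserID come from the question):
var st = _DataContext.GetMyStatus(SelectedUserID).FirstOrDefault();
if (st != null)
{
    // Discard the cached property values and re-read the current values from the database.
    _DataContext.Refresh(System.Data.Linq.RefreshMode.OverwriteCurrentValues, st);
}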

issue with submitChanges() inserting unwanted records in linq

I am using LINQ to insert records in the database. I create these records and keep track of them using a List. Based on some logic, I delete some of the records by deleting from the List. (I am using the same DataContext object).
When I want to insert the records into the database, I call the corresponding LINQ table's InsertOnSubmit(), followed by SubmitChanges() on the DataContext object. LINQ inserts the records that were deleted from the list too, along with the ones that are still in the list.
example:
//list to keep track of the records to insert
List list
//add the records to the list
list.Add(some records)
//delete the last 2 records
list.Remove()
//call InsertAllOnSubmit on the LINQ table, passing the list object with the records to insert
linqTable.InsertAllOnSubmit(list)
//call SubmitChanges on the DataContext object
dataContext.SubmitChanges()
I came across this MSDN article, Object States and Change-Tracking (LINQ to SQL):
You can explicitly request Inserts by using InsertOnSubmit. Alternatively, LINQ to SQL can infer Inserts by finding objects connected to one of the known objects that must be updated. For example, if you add an Untracked object to an EntitySet(TEntity) or set an EntityRef(TEntity) to an Untracked object, you make the Untracked object reachable by way of tracked objects in the graph. While processing SubmitChanges, LINQ to SQL traverses the tracked objects and discovers any reachable persistent objects that are not tracked. Such objects are candidates for insertion into the database.
I guess the question boils down to this - how do I change the deleted objects' state to 'Untracked'?
I tried DeleteOnSubmit after deleting the objects from the list, but that throws an exception (Cannot remove an entity that has not been attached).
Can someone please point me to a solution? Thanks.
I would like to know if I can achieve this using LINQ only. (I know that I can use a stored proc and insert only the records in the list.)
I think that the elements of your list are not being removed, because the List.Remove method determines equality using the default equality comparer.
If you don't want to write a custom comparer, I recommend using the RemoveAll method, which takes a predicate as its first parameter to match the elements to remove from the list:
list.RemoveAll(e => /* condition to remove the element */);
Or the RemoveAt method, that removes the element based on the specified index:
list.RemoveAt(0); // Delete first element
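Putting it together, a sketch of the corrected flow with a hypothetical Record entity and Name property (linqTable and dataContext are the objects from your example):
var list = new List<Record>();
list.Add(new Record { Name = "keep me" });
list.Add(new Record { Name = "drop me" });

// Remove by predicate instead of relying on the default equality comparer.
list.RemoveAll(r => r.Name == "drop me");

linqTable.InsertAllOnSubmit(list);   // only the entities still in the list are queued for insert
dataContext.SubmitChanges();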
