EF 6 - Client-side LINQ evaluation logging

Is there a way to figure out whether my LINQ queries were evaluated on the client or on the server side, maybe via some flag, log, etc.?
We are migrating from EF6 to EF Core 6, and since client-side evaluation is disabled in EF Core, we need to know which of our EF6 queries will fail once they are ported.
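EF6 has no built-in flag that marks a query as client-evaluated, but one practical approach is to log the SQL that EF6 actually sends with Database.Log and compare it against the LINQ expression: any operator that does not show up in the generated SQL was evaluated on the client and will throw under EF Core. A minimal sketch, assuming a hypothetical Blog entity and BlogContext:

```csharp
using System;
using System.Data.Entity; // EF6
using System.Linq;

public class Blog
{
    public int Id { get; set; }
    public string Url { get; set; }
}

public class BlogContext : DbContext // hypothetical context
{
    public DbSet<Blog> Blogs { get; set; }
}

class Program
{
    static void Main()
    {
        using (var db = new BlogContext())
        {
            // Log every SQL command EF6 sends to the server.
            db.Database.Log = Console.Write;

            // If part of the LINQ expression (e.g. a custom C# method
            // in the Where clause) does not appear in the logged SQL,
            // EF6 evaluated it on the client.
            var blogs = db.Blogs
                .Where(b => b.Url.StartsWith("http"))
                .ToList();
        }
    }
}
```

This only surfaces queries you actually exercise, so it works best combined with a test suite that covers the query surface.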

Related

Using the Entity Framework and ADO in a hybrid setting for transition?

We have an app in ASP.NET MVC 3 that, due to legacy and porting reasons, is written entirely using traditional ADO.NET for the data layer.
I am now tasked with adding some reporting to this website, and the reports can result in some extremely complicated queries.
Are there any pitfalls in using the EF Power Tools to reverse-engineer a code-first model and using it side-by-side with our current ADO.NET model? Doing so would allow me to use LINQ for querying the data I need, greatly reducing the time required to write each report. I would need to shut off data context initialization, as we have our current model do that, but are there any glaring risks or problems associated with trying to do this?
If it's of any relevance (I know EF 5 has a ton of new features), we are using .NET 4 and will begin moving to .NET 4.5 as soon as it launches.
I think this is a very sensible thing to do. You could also use a database-first model, which you can refresh whenever the database changes and which does not try to initialize a database.
Since you will use the context read-only, you can optimize the query process by setting the MergeOption property of your ObjectQuery instances to MergeOption.NoTracking. This reduces overhead because the context will not track changes to the generated objects.
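As a sketch of that suggestion (ReportContext and Customers are hypothetical names), with the ObjectContext API you can set MergeOption directly on the ObjectSet; the DbContext-era equivalent is AsNoTracking():

```csharp
using System.Data.Objects; // EF4/EF5 ObjectContext API
using System.Linq;

using (var context = new ReportContext()) // hypothetical ObjectContext
{
    // Read-only reporting: skip change tracking entirely.
    context.Customers.MergeOption = MergeOption.NoTracking;

    var active = context.Customers
        .Where(c => c.IsActive)
        .ToList();
}

// With the newer DbContext API the equivalent would be:
// var active = dbContext.Customers.AsNoTracking()
//                  .Where(c => c.IsActive).ToList();
```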
A problem might be that there is more maintenance if the database changes, but I think the absence of walls of boiler-plate query code for reporting on the old data layer far outweighs that.
One day :) you may even decide to use the EF model to display data that users want to filter in the UI and use the old data layer for CUD commands (a bit like CQRS).

Should I use Pooling=False in an Entity Framework connection string?

We have an ASP.NET MVC3 application running on Azure, connecting to a SQL Azure database via Entity Framework.
The Developer that produced this has Pooling=False in the connection string. I would have thought this was a bad idea - wouldn't we want to re-use the connections if possible?
Can anyone give me some guidance of when this might be a good idea?
Thanks!
Yes, it is a bad idea to have Pooling=False. I don't use this option even for development. You should always use SQL Server connection pooling.
What you have to be aware of, however, is transient errors in SQL Azure. There is a lot of good reading on the Internet about handling transient errors in SQL Azure with Entity Framework. Here is a (non-complete) list:
Best practices for handling Transient conditions in SQL Azure
SQL Azure and EF fault handling
EF CodeFirst and Transient Errors
Handling Transient Errors in SQL Azure
Always use connection pooling, but be aware of the Transient conditions in SQL Azure.
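Later EF releases formalized this advice with execution strategies (EF6 ships SqlAzureExecutionStrategy), but the underlying idea is simply retrying on transient failures. A minimal hand-rolled sketch of the pattern; the transient-detection predicate is left as an assumption, since a real implementation would inspect SqlException error numbers:

```csharp
using System;
using System.Threading;

static class TransientRetry
{
    // Retry an operation a few times with linear backoff.
    // isTransient decides which exceptions are worth retrying.
    public static T Execute<T>(Func<T> operation,
                               Func<Exception, bool> isTransient,
                               int maxAttempts = 3,
                               int baseDelayMs = 500)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return operation();
            }
            catch (Exception ex) when (attempt < maxAttempts && isTransient(ex))
            {
                Thread.Sleep(baseDelayMs * attempt);
            }
        }
    }
}

// Usage sketch against a hypothetical context:
// var user = TransientRetry.Execute(
//     () => db.Users.First(u => u.Id == id),
//     ex => ex is System.Data.SqlClient.SqlException);
```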

Entity Framework with SQL Azure over net.tcp: slow performance

Hi guys,
I have a really big performance issue here. I have a WPF application which connects to a service running inside a Worker Role. The service uses the net.tcp binding with full duplex. The data access layer is all in a library which I reference in my service, so when my service wants to get data it uses the methods in that library. That library uses EF 4.1, mapped to a SQL Azure database.
The problem I am facing is that a query like getting a user from the database takes somewhere above 4 seconds. I also have an HTTP service (used by a Silverlight app) which uses the same data access library; the same query over there takes 115 ms, which is normal.
Is there a problem with Entity Framework when I am using a net.tcp service? I really don't know where the issue is, because over the HTTP service all the queries behave normally.
Is it possible you are using lazy loading instead of eager loading with your entity? Lazy loading over the Internet is much slower since it results in many more round trips to SQL Azure, which would be the bottleneck in this case. Eager loading will simply get all of the data at once with a single round trip.
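To illustrate the difference (context and entity names here are hypothetical): with lazy loading, each navigation property touched triggers another round trip to SQL Azure, while Include() pulls everything back in one query:

```csharp
using System.Data.Entity; // the Include() extension method
using System.Linq;

// Lazy loading: one query for the user, plus one more per navigation
// property accessed later -- each one a full round trip to SQL Azure.
var lazyUser = context.Users.First(u => u.Id == id);
var orderCount = lazyUser.Orders.Count; // extra round trip happens here

// Eager loading: a single round trip that joins in the related data.
var eagerUser = context.Users
    .Include(u => u.Orders)
    .First(u => u.Id == id);
```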
Reference: http://msdn.microsoft.com/en-us/library/bb896272.aspx

Is it possible to use LINQ to SQL with Oracle?

I need to develop a program that must delete and insert data into an Oracle database. Is it possible to use LINQ to SQL with Oracle?
For development I use MS SQL Server, but it will be an Oracle database in production. What do you recommend?
Officially, no. LINQ to SQL was originally built with the ability to swap out the data provider to allow connections to other databases, but this functionality was disabled in the released versions to encourage people to use more stable and supported data access layers (like EF). The recommended approach is to use Entity Framework if you want to switch between SQL Server and Oracle.
Also, Patrick is very right, make sure you are developing and testing against the same database platform you are going to use in production, there is a world of difference in how they operate. Sure, you should be able to abstract it away to not care about whether you are using SQL or Oracle, but that is almost never really the case.
No, you can't. Although LINQ to SQL was initially designed with multi-database support in mind (you can see this when looking at the code using .NET Reflector) via a provider model, this model was never made public, and Microsoft has no intention of adding multi-database support to LINQ to SQL.
If you need multi-database support, please use Entity Framework.
No, LINQ-to-SQL doesn't support Oracle. Internally, the project had support for multiple back-ends, but this never made it into the final public release. I believe LINQ-to-Entities supports other databases.
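For completeness, the Entity Framework route works by installing an Oracle EF provider and pointing the context at a connection string registered for it. A rough sketch; the package, connection-string name, and entity are assumptions, and the managed Oracle provider postdates this question:

```csharp
using System.Data.Entity;

// Assumes an Oracle EF provider (e.g. ODP.NET's EF package) is installed
// and a connection string named "OracleDb" with the matching providerName
// exists in app.config. All names here are illustrative.
public class OracleContext : DbContext
{
    public OracleContext() : base("name=OracleDb") { }

    public DbSet<Customer> Customers { get; set; } // hypothetical entity
}
```

Because the LINQ queries go through EF's provider model, the same context code can target SQL Server in development and Oracle in production by swapping the registered provider and connection string.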

Finding performance bottlenecks in a classic ASP/SQL Server website

I've got an old classic ASP/SQL Server app which is constantly throwing 500 errors/timeouts even though the load is not massive. Some of the DB queries are pretty intensive, but nothing that should be causing it to fall over.
Are there any good pieces of software I can install on my server which will show up precisely where the bottlenecks are in either the asp or the DB?
Some tools you can try:
HP (formerly Mercury) LoadRunner or Performance Center
Visual Studio Application Center Test (Enterprise Editions only?)
Microsoft Web Application Stress tool (aka WAS, aka "Homer"; predecessor to Application Center Test)
WebLoad
MS Visual Studio Analyzer if you want to trace through the application code. This can show you how long the app waits on DB calls, and what SQL was used. You can then use SQL Profiler to tune the queries.
Where is the timeout occurring? Is it at the lines where ASP is connecting to or executing SQL? If so, your problem is either with the connection to the DB server or at the DB itself. Load up SQL Profiler in MSSQL to see how long the queries take. Perhaps it is due to locks in the database.
Do you use transactions? If so, make sure they do not lock your database for a long time. Make sure you use transactions in ADO and not around the entire ASP page. You can also ignore locks in SQL SELECTs by using the WITH (NOLOCK) hint on tables.
Make sure your database is optimized with indexes.
Also make sure you are connected to the DB for as short a time as possible, i.e. open, execute, and close immediately: conn.Open : Set rs = conn.Execute(sql) : data = rs.GetRows() : rs.Close : conn.Close. So store recordsets in a variable instead of looping through them while holding the connection to the DB open. A good way to do this is the GetRows() function in ADO.
Always explicitly close ADO objects and set them to Nothing; failing to do so can cause the connection to the DB to remain open.
Enable connection pooling.
Load ADO constants in global.asa if you are using them.
Do not store any objects in session or application scopes.
Upgrade to the latest versions of ADO, MDAC, SQL Server service packs, etc.
Are you sure the server can handle the load? Maybe upgrade it? Is it on shared hosting? Maybe your app is not the problem.
It is quite simple to measure a script's performance by timing it from the first line to the last line. This way you can identify slow-running pages.
Have you tried running the SQL Server Profiler on the server? It will highlight any unexpected activity hitting the database from the app as well as help identifying badly performing queries.
If you're happy that the DB queries are necessarily intensive, then perhaps you need to set more appropriate timeouts on those pages that use them.
Set Server.ScriptTimeout to something larger; you may also need to set the timeout on ADO Command objects used by the script.
Here's how I'd approach it.
Look at the running tasks on the server. Which is taking up more CPU time - SQL server or IIS? Most of the time, it will be SQL server and it certainly sounds that way based on your post. It's very rare that any ASP application actually does a lot of processing on the ASP side of things as opposed to the COM or SQL sides.
Use SQL Profiler to check out all the queries hitting the database server.
Deal with the low-hanging fruit first. Generally you will have a few "problem" queries that hit the database frequently and chew up a lot of time. Deal with these. (A truism in software development is that 10% of the code chews up 90% of the execution time...)
In addition to looking at query costs with SQL Profiler and Query Analyzer/SQL Studio and doing the normal SQL performance detective work you might also want to check if your database calls are returning inordinate amounts of data to your ASP code. I've seen cases where innocuous-looking queries returned HUGE amounts of unneeded data to ASP - the classic ("select * from tablename") kind of query written by lazy/inexperienced programmers that returns 10,000 huge rows when the programmer really only needed 1 field from 1 row. The reason I mention this special case is because these sorts of queries often have low execution times and low query costs on the SQL side of things and can therefore slip under the radar.