Best strategy for retrieving large dynamically-specified tables on an ASP.NET page

Looking for a bit of advice on how to optimise one of our projects. We have an ASP.NET/C# system that retrieves data from a SQL Server 2008 database and presents it on a DevExpress ASPxGridView. The data that's retrieved can come from one of a number of databases - all of which are slightly different and are being added and removed regularly. The user is presented with a list of live "companies", and the data is retrieved from the corresponding database.
At the moment, data is being retrieved using a standard SqlDataSource and a dynamically-created SQL SELECT statement. There are a few JOINs in the statement, as well as optional WHERE constraints, again dynamically-created depending on the database and the user's permission level.
All of this works great (honest!), apart from performance. When it comes to some databases, there are several hundreds of thousands of rows, and retrieving and paging through the data is quite slow (the databases are already properly indexed). I've therefore been looking at ways of speeding the system up, and it seems to boil down to two choices: XPO or LINQ.
LINQ seems to be the popular choice, but I'm not sure how easy it will be to implement with a system that is so dynamic in nature - would I need to create "definitions" for each database that LINQ could access? I'm also a bit unsure about creating the LINQ queries dynamically too, although looking at a few examples that part at least seems doable.
XPO, on the other hand, seems to allow me to create an XPO Data Source on the fly. However, I can't find much information on how to JOIN to other tables.
Can anyone offer any advice on which method - if any - is the best to try and retro-fit into this project? Or is the dynamic SQL model currently used fundamentally different from LINQ and XPO and best left alone?

Before you go and change the whole way that your app talks to the database, have you had a look at the following:
Run your code through a performance profiler (such as Redgate's performance profiler); the results are often surprising.
If you are constructing the SQL string on the fly, are you following .NET best practices, e.g. building it with a StringBuilder rather than repeated string concatenation in a loop? Remember, multiple small gains add up to big gains (see the sketch after this list).
Have you thought about having a summary table or database that is periodically updated (say every 15 mins; you might need to run a service to update this data automatically) so that you are only hitting one database? New connections to databases are quite expensive.
Have you looked at the query plans for the SQL that you are running? Today, I moved a dynamically created SQL string to a sproc (only 1 param changed) and shaved 5-10 seconds off the running time (it was being called 100-10000 times, depending on some conditions).
Just a warning if you do use LINQ: I have seen some developers who decided to use LINQ write more inefficient code because they did not know what they were doing (pulling 36,000 records when they needed to check for one, for example). These things are very easily overlooked.
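On the string-building point above, a minimal sketch; the method, table, and parameter names are invented for the example, and the clauses are assumed to be parameter placeholders built from the user's permission level, never raw input:

using System.Collections.Generic;
using System.Text;

static string BuildSelect(IEnumerable<string> whereClauses)
{
    // StringBuilder avoids allocating a new intermediate string
    // for every concatenation in the loop.
    var sql = new StringBuilder("SELECT o.Id, o.Total FROM Orders o");
    string separator = " WHERE ";
    foreach (string clause in whereClauses)
    {
        sql.Append(separator).Append(clause); // e.g. "CompanyId = @companyId"
        separator = " AND ";
    }
    return sql.ToString();
}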
Just something to get you started on and hopefully there is something there that you haven't thought of.
Cheers,
Stu

As far as I understand, you are talking about so-called server mode, where all data manipulations are done on the DB server instead of passing them to the web server and processing them there. In this mode the grid works very fast with data sources that contain hundreds of thousands of records. If you want to use this mode, you should create either the corresponding LINQ classes or XPO classes. If you decide to use LINQ-based server mode, the LINQServerModeDataSource provides the Selecting event, which can be used to set a custom IQueryable and KeyExpression. I would suggest that you use LINQ in your application. I hope this information is helpful to you.
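A rough sketch of the Selecting event mentioned above. QueryableSource and KeyExpression are the event-args properties to set; the context, table, and helper names are invented for the example:

// Code-behind for the page hosting the ASPxGridView in LINQ server mode.
protected void LinqServerModeDataSource1_Selecting(object sender,
    DevExpress.Data.Linq.LinqServerModeDataSourceSelectEventArgs e)
{
    // MyDataContext / Orders / OrderId are illustrative names.
    var db = new MyDataContext(GetConnectionStringForSelectedCompany());
    e.QueryableSource = db.Orders.Where(o => o.CompanyId == SelectedCompanyId);
    e.KeyExpression = "OrderId"; // server mode needs a unique key column
}

Because the grid only ever asks the IQueryable for the page it is displaying, paging through hundreds of thousands of rows stays fast.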

I guess there are two points where performance might be tweaked in this case. I'll assume that you're accessing the database directly rather than through some kind of secondary layer.
First, you don't say how you're displaying the data itself. If you're loading thousands of records into a grid, that will take time no matter how fast everything else is. Obviously the trick here is to show a subset of the data and allow the user to page, etc. If you're not doing this then that might be a good place to start.
Second, you say that the tables are properly indexed. If this is the case, and assuming that you're not loading 1,000 records into the page at once but retrieving only subsets at a time, then you should be OK.
But if you're only doing an ExecuteQuery() against a SQL connection to get a dataset back, I don't see how LINQ or anything else will help you. I'd say that the problem is obviously on the DB side.
So to solve the problem with the database, you need to profile the different SELECT statements you're running against it, examine the query plans, and identify the places where things are slowing down. You might want to start with the SQL Server Profiler, but if you have a good DBA, just looking at the query plan (which you can get from Management Studio) is often enough.

Related

ORACLE - Which is better to generate a large resultset of records, View, SP, or Function

I've recently been working with an Oracle database to generate some reports. What I need is to get result sets of specific records (SELECT statements only), sometimes large ones, to be used for generating the report in an Excel file.
At first, the reports were queried through views, but some of them are slow (they have some complex subqueries). I was asked to improve the performance and also fix some field mappings. I also want to tidy things up, because when I query against a view, I must specifically call the right column names. I want to move the data work into the database, so the web app just passes parameters and calls the right result set.
I'm new to Oracle, so which is better for this kind of task: an SP or a function? Or under what conditions might a view be better?
Makes no difference whether you compile your SQL in a view, SP or function. It is the SQL itself that matters.
As long as you are able to meet your requirements with the views, they should be a good option. If you intend to break up your queries into multiple ones to achieve better performance, then you should go for stored procedures. If you decide to go for stored procedures, it would be advisable to create a package and bundle all the stored procedures together in it. If your problem is performance, there may not be a silver bullet solution; you will have to work on your queries and your design.
If the problem is performance due to complex SELECT queries, you can consider tuning them. Often you will find queries written 15-20 years ago which do not use functionality and techniques introduced by Oracle in more recent versions (even if the organization spent the big bucks to buy those versions - making it a waste of money). Honestly, that may be too much of a task for you if you are new to Oracle; also, some slow queries may have been written by people just like you, many years ago - before they had a chance to learn a lot about Oracle and gain experience with it.
Another thing: if the reports don't need the absolute current state of the underlying tables (for example, if "what was in the tables at the end of the business day yesterday" is acceptable), you can create a materialized view. It will not run any faster than a regular view, but it can be refreshed overnight (say), or every six hours, or whatever - so the further reporting processing will not have to wait for the queries to complete. This is one of the main uses of materialized views.
Good luck!

Multidimensional data types

So I was thinking... Imagine you have to write a program that would represent a schedule of a whole college.
That schedule has several dimensions (e.g.):
time
location
individual(s) attending it
lecturer(s)
subject
You would have to be able to display the schedule from several standpoints:
everything held in one location in a certain timeframe
everything attended by an individual in a certain timeframe
everything lectured by a certain lecturer in a certain timeframe
etc.
How would you save such data, and yet keep the ability to view it from different angles?
Only way I could think of was to save it in every form you might need it:
E.g. you have a folder "students", and in it each student has a file that contains when and why and where he has to be. However, you also have a folder "locations", and each location has a file which contains who has to be there, when, and why. The more angles you have, the more the size-per-info ratio increases.
But that seems highly inefficient, space-wise.
Is there any other way?
My knowledge of Javascript is 0, but I wonder if such things would be possible with it, even in this space inefficient form.
If not that, I wonder if it would work in any other standard (C++, C#, Java, etc.) language, primarily in Java...
EDIT: Could this be done by using MySQL database?
Basically, you are trying to first store data and then present it under different views.
SQL databases were made exactly for that: on one side you build a schema and instantiate it in a database to store your data (the language is called Data Definition Language, DDL), then you make requests on it with the query language (SQL) - what you call "views". There are even "view" objects in SQL databases to build these views inside the database (rather than having to put the code of the request in the user code).
MySQL can do that for sure. Note that it is also possible to compile an SQL engine to JavaScript (SQLite, for example) and use local web storage to store the data.
There is another aspect to your question: optimization of the queries. While SQL can do most of the request work for your views, it is sometimes preferable to create actual copies of the request results in so-called "datamarts" (this is called de-normalizing a request), so that the hard work of selecting or computing aggregate/group functions and so on is done once per period of time (imagine that a specific view changes only on Mondays); requesters then just have to read these results. It is important in this case to separate, at least semantically, primary data from secondary data (and for performance/user-rights reasons, physical separation is often a good idea).
Note that as you cited MySQL, I wrote about SQL, but almost any database technology could do what you are trying to do (hierarchical, object-oriented, XML...) as long as the particular implementation you use is flexible enough for your data and requests.
So in short:
I would use a SQL database to store the data
make appropriate views / requests
if I need huge request performance, make appropriate de-normalized data available
the language is not important there, any will do
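As a minimal sketch of the store-once, view-many-ways idea (C# with the Microsoft.Data.Sqlite package; the schema is invented for the example, and a real one would normalize students into a joining table):

using System;
using Microsoft.Data.Sqlite;

// Store each scheduled session exactly once; every "angle" is just a query.
using var con = new SqliteConnection("Data Source=schedule.db");
con.Open();

var ddl = con.CreateCommand();
ddl.CommandText = @"
    CREATE TABLE IF NOT EXISTS session (
        start_time TEXT, location TEXT, student TEXT,
        lecturer TEXT, subject TEXT)";
ddl.ExecuteNonQuery();

// 'Everything lectured by a certain lecturer in a certain timeframe':
var q = con.CreateCommand();
q.CommandText = @"
    SELECT start_time, location, subject FROM session
    WHERE lecturer = $who AND start_time BETWEEN $from AND $to";
q.Parameters.AddWithValue("$who", "Dr. Smith");
q.Parameters.AddWithValue("$from", "2024-01-08T08:00");
q.Parameters.AddWithValue("$to", "2024-01-12T18:00");

using var reader = q.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetString(0)}  {reader.GetString(1)}  {reader.GetString(2)}");

An index per angle (e.g. on lecturer, or on location) keeps each viewpoint fast without duplicating the data the way the folder scheme does.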

How to reduce the pressure on the database

I have a database (SQL Server 2005). There are now about 100,000 records in a table called users. When I query it using LINQ to SQL, it gets slower and slower. What can I do to improve the speed?
Analysing your query and adding some indexes to your table may help.
To get a more specific answer, post more specific information (table structure, the indexes you have, the SQL code L2S generates, ...)
You could (in order of preference):
Save your query as a stored procedure
Add indexes to your users table, for what you are querying for/sorting on
Analyze your query (if it is complicated) and see if there's a less resource-intensive way of doing it. There are graphical query analyzers to help you.
As a last resort, don't use LINQ to SQL but ADO.NET Entity Framework instead; it's significantly faster. But you'll only see performance improvements for crazy stuff, and only if you've already done all of the above.
Use stored procedures and then use LINQ to SQL to get the desired rows; this will improve performance.
The best tools at your disposal for analyzing your database access and seeing what needs to be optimized are:
SQL Server Profiler
Graphical Execution Plans
The first one will allow you to see the exact queries being sent to your database from your application, which is especially useful if it turns out that your application is chattier than you think. The second one will allow you to take those queries and see exactly what the SQL server is doing with them.
In the graphical execution plan, look for steps which use a lot of CPU and paths which transfer a lot of records. Those are what you'll want to optimize. It's possible that you're doing a table scan somewhere, which is slow, or maybe joining on many more records than you need somewhere, which is slow, etc.
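If, as in this question, the application uses LINQ to SQL, there is also a cheap way to capture the generated SQL without firing up Profiler: the DataContext.Log property. The context and entity names below are illustrative:

using System;
using System.Linq;

string connectionString = "...your connection string...";
// MyDataContext is your generated LINQ to SQL context.
using var db = new MyDataContext(connectionString);
db.Log = Console.Out; // echoes every T-SQL statement LINQ to SQL sends

var over30 = db.Users.Where(u => u.Age > 30).ToList();
// Paste the logged SQL into Management Studio and inspect its
// graphical execution plan for table scans and fat arrows.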

How to access data in Dynamics CRM?

What is the best way, in terms of platform speed and maintainability, to access data (read-only) in Dynamics CRM 4? I've done all three, but I'm interested in the opinions of the crowd.
Via the API
Via the webservices directly
Via DB calls to the views
...and why?
My thoughts normally center around DB calls to the views but I know there are purists out there.
Given both requirements I'd say you want to call the views. Properly crafted SQL queries will fly.
Going through the API is required if you plan to modify data, but it isn't the fastest approach around because it doesn't allow deep loading of entities. For instance, if you want to look at customers and their orders, you'll have to load both up individually and then join them manually, whereas a SQL query will already have the data joined.
Never mind that the TDS stream is a lot more efficient than the SOAP messages being used by the API and web services.
UPDATE
I should point out, in regard to the views and the CRM database in general: CRM does not optimize the indexes on the tables or views for custom entities (how could it?). So if you have a truckload entity that you look up by destination all the time, you'll need to add an index for that property. Depending upon your application it could make a huge difference in performance.
I'll add to Jake's comment by saying that querying against the tables directly instead of the views (*base & *extensionbase) will be even faster.
In order of speed, fastest first, it'd be (a comparison sketch follows the list):
direct table query
view query
filtered view query
api call
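To make the comparison concrete, here is a hedged sketch of the same read-only lookup by the two SQL routes. FilteredContact and ContactBase follow CRM 4's naming conventions, but verify the actual names in your own organization database:

using System.Data.SqlClient;

using (var con = new SqlConnection(crmConnectionString))
{
    con.Open();

    // Direct table query: fastest, but read-only use and unsupported.
    var direct = new SqlCommand(
        "SELECT FullName FROM ContactBase WHERE LastName = @name", con);
    direct.Parameters.AddWithValue("@name", "Smith");

    // Filtered view: slower, but enforces the calling user's CRM security.
    var filtered = new SqlCommand(
        "SELECT fullname FROM FilteredContact WHERE lastname = @name", con);
    filtered.Parameters.AddWithValue("@name", "Smith");

    using (var r = direct.ExecuteReader())
        while (r.Read()) { /* consume rows */ }
}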
Direct table updates:
I disagree with Jake that all updates must go through the API. The correct statement is that going through the API is the only supported way to do updates. There are in fact several instances where directly modifying the tables is the most reasonable option:
One time imports of large volumes of data while the system is not in operation.
Modification of specific fields across large volumes of data.
I agree that this sort of direct modification should only be a last resort when the performance of the API is unacceptable. However, if you want to modify a boolean field on thousands of records, doing a direct SQL update to the table is a great option.
Relative Speed
I agree with XVargas as far as relative speed.
Unfiltered Views vs Tables: I have not found the performance advantage to be worth the hassle of manually joining the base and extension tables.
Unfiltered views vs filtered views: I was recently working with a complicated query which took about 15 minutes to run using the filtered views. After switching to the unfiltered views, this query ran in about 10 seconds. Looking at the respective query plans, the raw query had 8 operations while the query against the filtered views had over 80.
Unfiltered Views vs API: I have never compared querying through the API against querying views, but I have compared the cost of writing data through the API vs inserting directly through SQL. Importing millions of records through the API can take several days, while the same operation using insert statements might take several minutes. I assume the difference isn't as great during reads but it is probably still large.

LINQ-to-SQL vs stored procedures? [closed]

I took a look at the "Beginner's Guide to LINQ" post here on Stack Overflow, but I had a follow-up question:
We're about to ramp up a new project where nearly all of our database op's will be fairly simple data retrievals (there's another segment of the project which already writes the data). Most of our other projects up to this point make use of stored procedures for such things. However, I'd like to leverage LINQ-to-SQL if it makes more sense.
So, the question is this: for simple data retrievals, which approach is better, LINQ-to-SQL or stored procs? Any specific pros or cons?
Thanks.
Some advantages of LINQ over sprocs:
Type safety: I think we all understand this.
Abstraction: This is especially true with LINQ-to-Entities. This abstraction also allows the framework to add improvements that you can easily take advantage of. PLINQ is an example of adding multi-threading support to LINQ; code changes to take advantage of it are minimal. It would be MUCH harder to do this with data access code that simply calls sprocs.
Debugging support: I can use any .NET debugger to debug the queries. With sprocs, you cannot easily debug the SQL, and that experience is largely tied to your database vendor (MS SQL Server provides a query analyzer, but often that isn't enough).
Vendor agnostic: LINQ works with lots of databases and the number of supported databases will only increase. Sprocs are not always portable between databases, either because of varying syntax or feature support (if the database supports sprocs at all).
Deployment: Others have mentioned this already, but it's easier to deploy a single assembly than to deploy a set of sprocs. This also ties in with #4.
Easier: You don't have to learn T-SQL to do data access, nor do you have to learn the data access API (e.g. ADO.NET) necessary for calling the sprocs. This is related to #3 and #4.
Some disadvantages of LINQ vs sprocs:
Network traffic: sprocs need only serialize sproc-name and argument data over the wire while LINQ sends the entire query. This can get really bad if the queries are very complex. However, LINQ's abstraction allows Microsoft to improve this over time.
Less flexible: Sprocs can take full advantage of a database's feature set. LINQ tends to be more generic in its support. This is common in any kind of language abstraction (e.g. C# vs assembler).
Recompiling: If you need to make changes to the way you do data access, you need to recompile, version, and redeploy your assembly. Sprocs can sometimes allow a DBA to tune the data access routine without a need to redeploy anything.
Security and manageability are something that people argue about too.
Security: For example, you can protect your sensitive data by restricting access to the tables directly, and put ACLs on the sprocs. With LINQ, however, you can still restrict direct access to tables and instead put ACLs on updatable table views to achieve a similar end (assuming your database supports updatable views).
Manageability: Using views also gives you the advantage of shielding your application from schema changes (like table normalization) without breaking it. You can update the view without requiring your data access code to change.
I used to be a big sproc guy, but I'm starting to lean towards LINQ as a better alternative in general. If there are some areas where sprocs are clearly better, then I'll probably still write a sproc but access it using LINQ. :)
I am generally a proponent of putting everything in stored procedures, for all of the reasons DBAs have been harping on for years. In the case of Linq, it is true that there will be no performance difference with simple CRUD queries.
But keep a few things in mind when making this decision: using any ORM couples you tightly to your data model. A DBA has no freedom to make changes to the data model without forcing you to change your compiled code. With stored procedures, you can hide these sorts of changes to an extent, since the parameter list and results set(s) returned from a procedure represent its contract, and the innards can be changed around, just so long as that contract is still met.
And also, if Linq is used for more complex queries, tuning the database becomes a much more difficult task. When a stored procedure is running slow, the DBA can totally focus on the code in isolation, and has lots of options, just so that contract is still satisfied when he/she is done.
I have seen many, many cases where serious problems in an application were addressed by changes to the schema and code in stored procedures without any change to deployed, compiled code.
Perhaps a hybrid approach would be nice with LINQ? LINQ can, of course, be used to call stored procedures.
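For reference, this is roughly the shape of the wrapper the LINQ to SQL designer generates when you drag a sproc onto it; the sproc, entity, and property names here are invented:

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Reflection;

[Table(Name = "dbo.Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public string City;
}

public partial class MyDataContext : DataContext
{
    public MyDataContext(string connection) : base(connection) { }

    // The designer emits a wrapper like this for each mapped sproc.
    [Function(Name = "dbo.GetCustomersByCity")]
    public ISingleResult<Customer> GetCustomersByCity(
        [Parameter(DbType = "NVarChar(50)")] string city)
    {
        IExecuteResult result = ExecuteMethodCall(
            this, (MethodInfo)MethodInfo.GetCurrentMethod(), city);
        return (ISingleResult<Customer>)result.ReturnValue;
    }
}

// Usage: var locals = db.GetCustomersByCity("London").ToList();

The sproc call then composes with the rest of your LINQ code, so you keep the type safety while the DBA keeps the query.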
Linq to Sql.
SQL Server will cache the query plans, so there's no performance gain for sprocs.
Your LINQ statements, on the other hand, will be logically part of your application and tested along with it. Sprocs are always a bit separated and are harder to maintain and test.
If I was working on a new application from scratch right now I would just use Linq, no sprocs.
For basic data retrieval I would be going for Linq without hesitation.
Since moving to Linq I've found the following advantages:
Debugging my DAL has never been easier.
Compile time safety when your schema changes is priceless.
Deployment is easier because everything is compiled into DLLs. No more managing deployment scripts.
Because LINQ can support querying anything that implements the IQueryable interface, you will be able to use the same syntax to query XML, objects and any other data source without having to learn a new syntax (see the sketch below).
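A small illustration of that last point - the same query shape over in-memory objects and over XML (the data here is invented):

using System;
using System.Linq;
using System.Xml.Linq;

var people = new[] { new { Name = "Ana", Age = 22 }, new { Name = "Bo", Age = 15 } };
var doc = XElement.Parse(
    "<people><person name='Ana' age='22'/><person name='Bo' age='15'/></people>");

// LINQ to Objects:
var adults = from p in people where p.Age >= 18 select p.Name;

// LINQ to XML - same shape, different data source:
var adultsXml = from e in doc.Elements("person")
                where (int)e.Attribute("age") >= 18
                select (string)e.Attribute("name");

Console.WriteLine(string.Join(", ", adults));    // Ana
Console.WriteLine(string.Join(", ", adultsXml)); // Ana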
LINQ will bloat the procedure cache
If an application is using LINQ to SQL and the queries involve the use of strings that can be highly variable in length, the SQL Server procedure cache will become bloated with one version of the query for every possible string length. For example, consider the following very simple queries created against the Person.AddressTypes table in the AdventureWorks2008 database:
// 'x' is the LINQ to SQL DataContext for AdventureWorks2008.
var p1 = from n in x.AddressTypes
         where n.Name == "Billing"     // parameter typed NVARCHAR(7)
         select n;

var p2 = from n in x.AddressTypes
         where n.Name == "Main Office" // same query, parameter NVARCHAR(11)
         select n;
If both of these queries are run, we will see two entries in the SQL Server procedure cache: One bound with an NVARCHAR(7), and the other with an NVARCHAR(11). Now imagine if there were hundreds or thousands of different input strings, all with different lengths. The procedure cache would become unnecessarily filled with all sorts of different plans for the exact same query.
More here: https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=363290
I think the pro LINQ argument seems to be coming from people who don't have a history with database development (in general).
Especially if using a product like VS DB Pro or Team Suite, many of the arguments made here do not apply, for instance:
Harder to maintain and Test:
VS provides full syntax checking, style checking, referential and constraint checking and more. It also provide full unit testing capabilities and refactoring tools.
LINQ makes true unit testing impossible as (in my mind) it fails the ACID test.
Debugging is easier in LINQ:
Why? VS allows full step-in from managed code and regular debugging of SPs.
Compiled into a single DLL rather than deployment scripts:
Once again, VS comes to the rescue where it can build and deploy full databases or make data-safe incremental changes.
Don't have to learn TSQL with LINQ:
No you don't, but you have to learn LINQ - where's the benefit?
(Quoting another answer:) "I really don't see this as being a benefit. Being able to change something in isolation might sound good in theory, but just because the changes fulfil a contract doesn't mean it's returning the correct results. To be able to determine what the correct results are, you need context, and you get that context from the calling code."
Um, loosely coupled apps are the ultimate goal of all good programmers as they really do increase flexibility. Being able to change things in isolation is fantastic, and it is your unit tests that will ensure it is still returning appropriate results.
Before you all get upset, I think LINQ has its place and has a grand future. But for complex, data-intensive applications I do not think it is ready to take the place of stored procedures. This was a view I had echoed by an MVP at TechEd this year (they will remain nameless).
EDIT: The LINQ to SQL Stored Procedure side of things is something I still need to read more on - depending on what I find I may alter my above diatribe ;)
LINQ is new and has its place. LINQ was not invented to replace stored procedures.
Here I will focus on some performance myths and cons, just for "LINQ to SQL"; of course, I might be totally wrong ;-)
(1) People say a LINQ statement can be "cached" in SQL Server, so it doesn't lose performance. Partially true. "LINQ to SQL" is actually the runtime translating the LINQ syntax to a T-SQL statement, so from the performance perspective a hard-coded ADO.NET SQL statement is no different from LINQ.
(2) As an example: a customer service UI has an "account transfer" function. This function itself might update 10 DB tables and return some messages in one shot. With LINQ, you have to build a set of statements and send them as one batch to SQL Server. The performance of this translated LINQ->TSQL batch can hardly match that of a stored procedure. Reason? Because you can tweak the smallest unit of the statement in a stored procedure by using the built-in SQL profiler and execution plan tool; you cannot do this in LINQ.
The point is, when talking about a single DB table and small-set CRUD, LINQ is as fast as an SP. But for much more complicated logic, a stored procedure is more performance-tweakable.
(3) "LINQ to SQL" easily lets newbies introduce performance hogs. Any senior T-SQL guy can tell you when not to use a CURSOR (basically, you should not use a CURSOR in T-SQL in most cases). With LINQ and the charming "foreach" loop over a query, it's so easy for a newbie to write code like this:
foreach (Customer c in query)
{
    c.Country = "Wonder Land";
}
ctx.SubmitChanges(); // one UPDATE statement per modified row
You can see how attractive this easy, decent-looking code is. But under the hood, the .NET runtime just translates it into an update batch. If there are only 500 rows, it's a 500-statement T-SQL batch; if there are a million rows, it's a hit. Of course, an experienced user won't do the job this way, but the point is that it's so easy to fall into this trap (see the sketch below).
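For a job like this, the experienced route is a single set-based statement. LINQ to SQL's DataContext.ExecuteCommand can send one; a sketch reusing the ctx and Customer names from the example above, with an invented WHERE condition:

// One UPDATE on the server instead of one round trip per row.
// The {0}/{1} placeholders are sent as SQL parameters, not concatenated.
int rows = ctx.ExecuteCommand(
    "UPDATE Customers SET Country = {0} WHERE Country = {1}",
    "Wonder Land", "Old Land");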
The best code is no code, and with stored procedures you have to write at least some code in the database and some code in the application to call it, whereas with LINQ to SQL or LINQ to Entities you don't have to write any additional code beyond any other LINQ query, aside from instantiating a context object.
LINQ definitely has its place in application-specific databases and in small businesses.
But in a large enterprise, where central databases serve as a hub of common data for many applications, we need abstraction. We need to centrally manage security and show access histories. We need to be able to do impact analysis: if I make a small change to the data model to serve a new business need, what queries need to be changed and what applications need to be re-tested? Views and Stored Procedures give me that. If LINQ can do all that, and make our programmers more productive, I'll welcome it -- does anyone have experience using it in this kind of environment?
(Quoting an earlier answer:) "A DBA has no freedom to make changes to the data model without forcing you to change your compiled code. With stored procedures, you can hide these sorts of changes to an extent, since the parameter list and results set(s) returned from a procedure represent its contract, and the innards can be changed around, just so long as that contract is still met."
I really don't see this as being a benefit. Being able to change something in isolation might sound good in theory, but just because the changes fulfil a contract doesn't mean it's returning the correct results. To be able to determine what the correct results are you need context and you get that context from the calling code.
I think you need to go with procs for anything real.
A) Writing all your logic in linq means your database is less useful because only your application can consume it.
B) I'm not convinced that object modelling is better than relational modelling anyway.
C) Testing and developing a stored procedure in SQL is a hell of a lot faster than a compile-edit cycle in any Visual Studio environment. You just edit, hit F5, run a select, and you are off to the races.
D) It's easier to manage and deploy stored procedures than assemblies... you just put the file on the server, and press F5...
E) Linq to sql still writes crappy code at times when you don't expect it.
Honestly, I think the ultimate thing would be for MS to augment T-SQL so that it can do a join projection implicitly the way LINQ does. T-SQL should know if you wanted to do order.lineitems.part, for example.
LINQ doesn't prohibit the use of stored procedures. I've used a mixed mode with LINQ-SQL and LINQ-storedproc. Personally, I'm glad I don't have to write the stored procs... pwet-tu.
IMHO, RAD = LINQ, RUP = stored procs. I worked for a large Fortune 500 company for many years, at many levels including management, and frankly, I would never hire RUP developers to do RAD development. They are so siloed that they have very limited knowledge of what to do at other levels of the process. With a siloed environment, it makes sense to give DBAs control over the data through very specific entry points, because others frankly don't know the best ways to accomplish data management.
But large enterprises move painfully slow in the development arena, and this is extremely costly. There are times when you need to move faster to save both time and money, and LINQ provides that and more in spades.
Sometimes I think that DBAs are biased against LINQ because they feel it threatens their job security. But that's the nature of the beast, ladies and gentlemen.
According to the gurus, I define LINQ as a motorcycle and an SP as a car.
If you want to go on a short trip and only have small passengers (in this case, two), go gracefully with LINQ.
But if you want to go on a journey with a large band, I think you should choose an SP.
In conclusion, choosing between a motorcycle and a car depends on your route (business), length (time), and passengers (data).
Hope it helps; I may be wrong. :D
Also, there is the issue of a possible 2.0 rollback. Trust me, it has happened to me a couple of times, so I am sure it has happened to others.
I also agree that abstraction is best, along with the fact that the original purpose of an ORM is to make an RDBMS match up nicely with OO concepts. However, if everything worked fine before LINQ by deviating a bit from OO concepts, then screw 'em. Concepts and reality don't always fit well together. There is no room for militant zealots in IT.
I'm assuming you mean Linq To Sql
For any CRUD command it's easy to profile the performance of a stored procedure vs. any technology. In this case any difference between the two will be negligible. Try profiling a 5-field object (simple types) over 100,000 select queries to find out if there's a real difference (see the sketch below).
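A rough harness for that kind of measurement (a sketch; the two commented-out lambdas are placeholders for your own LINQ to SQL query and sproc call):

using System;
using System.Diagnostics;

static void Time(string label, int iterations, Action query)
{
    query(); // warm-up: JIT, connection pool, plan cache
    var sw = Stopwatch.StartNew();
    for (int i = 0; i < iterations; i++) query();
    sw.Stop();
    Console.WriteLine($"{label}: {sw.Elapsed} for {iterations} runs");
}

// Time("LINQ to SQL", 100000, () => db.Users.First(u => u.Id == 42));
// Time("Sproc",       100000, () => db.GetUserById(42));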
On the other hand, the real deal-breaker will be the question of whether you feel comfortable putting your business logic in your database or not, which is an argument against stored procedures.
All these answers leaning towards LINQ are mainly talking about EASE of DEVELOPMENT, which is more or less connected to poor quality of coding or laziness in coding. I am like that myself.
Some advantages of LINQ I read here, such as easy to test, easy to debug, etc., are in no way connected to the final output or the end user. This is always going to cause the end user trouble with performance. What's the point of loading many things into memory and then applying filters to them using LINQ?
Again, type safety is the caution that "we are careful to avoid wrong typecasting", which again is poor quality that we are trying to improve by using LINQ. Even in that case, if anything in the database changes, e.g. the size of a string column, then the LINQ code needs to be recompiled and would not be type-safe without that... I tried.
Although we found LINQ good, sweet and interesting to work with, it has the sheer disadvantage of making developers lazy :) and it has been proved 1000 times that it is bad (maybe the worst) on performance compared to stored procs.
Stop being lazy. I am trying hard. :)
For simple CRUD operations with a single data access point, I would say go for LINQ if you feel comfortable with the syntax. For more complicated logic I think sprocs are more efficient performance-wise, if you are good at T-SQL and its more advanced operations. You also have help from the Tuning Advisor, SQL Server Profiler, debugging your queries from SSMS, etc.
The outcome can be summarized as:
LinqToSql for small sites and prototypes - it really saves time for prototyping.
SPs: universal. I can fine-tune my queries and always check the ActualExecutionPlan / EstimatedExecutionPlan.
CREATE PROCEDURE userInfoProcedure
    -- Add the parameters for the stored procedure here
    @FirstName varchar(50),
    @LastName varchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    -- Select statement for the procedure here
    SELECT FirstName, LastName, Age
    FROM UserInfo
    WHERE FirstName = @FirstName
      AND LastName = @LastName
END
GO
http://www.totaldotnet.com/Article/ShowArticle121_StoreProcBasic.aspx
A stored procedure makes testing easier, and you can change the query without touching the application code. Also, with LINQ, getting data does not mean it's the right data; testing the correctness of the data means running the application, but with a stored procedure it's easy to test without touching the application.
Both LINQ and SQL have their places. Both have their disadvantages and advantages.
Sometimes for complex data retrieval you might need stored procs. And sometimes you may want other people to use your stored proc in Sql Server Management Studio.
Linq to Entities is great for fast CRUD development.
Sure, you can build an app using only one or the other, or you can mix it up. It all comes down to your requirements. But SQL stored procs will not go away any time soon.
