EF and repository pattern - ending up with multiple DbContexts in one controller - any issues (performance, data integrity)?

Most of my knowledge of ASP.NET MVC 3 comes from reading through the book Pro ASP.NET MVC 3 Framework by Adam Freeman and Steven Sanderson. For my test application I have tried to stick to their examples very closely. I am using the repository pattern plus Ninject and Moq, which means that unit testing works quite well (i.e. without needing to pull data from the database).
In the book repositories are used like this:
public class EFDbTestChildRepository
{
    private EFDbContext context = new EFDbContext();

    public IQueryable<TestChild> TestChildren
    {
        get { return context.TestChildren; }
    }

    public void SaveTestChild(TestChild testChild)
    {
        if (testChild.TestChildID == 0)
        {
            context.TestChildren.Add(testChild);
        }
        else
        {
            context.Entry(testChild).State = EntityState.Modified;
        }
        context.SaveChanges();
    }
}
And here is the DbContext that goes with it:
public class EFDbContext : DbContext
{
    public DbSet<TestParent> TestParents { get; set; }
    public DbSet<TestChild> TestChildren { get; set; }
}
Please note: to keep things simple in this extracted example I have left out the interface ITestChildRepository here which Ninject would then use.
In other sources I have seen a more general approach where one single repository is enough for the whole application. Obviously in my case I end up with quite a list of repositories - basically one for each entity in my domain model. I am not sure about the pros and cons of the two approaches; I just followed the book to be on the safe side.
To finally get to my question: each repository has its own DbContext - private EFDbContext context = new EFDbContext();. Do I risk ending up with multiple DbContexts within one request? And would that lead to any significant performance overhead? How about a potential for conflicts between the contexts and any consequences to the data integrity?
Here is an example where I ended up with more than one repository within a controller.
My two database tables are linked with a foreign key relationship. My domain model classes:
public class TestParent
{
    public int TestParentID { get; set; }
    public string Name { get; set; }
    public string Comment { get; set; }
    public virtual ICollection<TestChild> TestChildren { get; set; }
}

public class TestChild
{
    public int TestChildID { get; set; }
    public int TestParentID { get; set; }
    public string Name { get; set; }
    public string Comment { get; set; }
    public virtual TestParent TestParent { get; set; }
}
The web application contains a page that allows the user to create a new TestChild. On it there is a selectbox that contains a list of available TestParents to pick from. This is what my controller looks like:
public class ChildController : Controller
{
    private EFDbTestParentRepository testParentRepository = new EFDbTestParentRepository();
    private EFDbTestChildRepository testChildRepository = new EFDbTestChildRepository();

    public ActionResult List()
    {
        return View(testChildRepository.TestChildren);
    }

    public ViewResult Edit(int testChildID)
    {
        ChildViewModel cvm = new ChildViewModel();
        cvm.TestChild = testChildRepository.TestChildren.First(tc => tc.TestChildID == testChildID);
        cvm.TestParents = testParentRepository.TestParents;
        return View(cvm);
    }

    public ViewResult Create()
    {
        ChildViewModel cvm = new ChildViewModel();
        cvm.TestChild = new TestChild();
        cvm.TestParents = testParentRepository.TestParents;
        return View("Edit", cvm);
    }

    [HttpPost]
    public ActionResult Edit(TestChild testChild)
    {
        try
        {
            if (ModelState.IsValid)
            {
                testChildRepository.SaveTestChild(testChild);
                TempData["message"] = string.Format("Changes to test child have been saved: {0} (ID = {1})",
                    testChild.Name,
                    testChild.TestChildID);
                return RedirectToAction("List");
            }
        }
        catch (DataException)
        {
            // Log the error (add a variable name after DataException)
            ModelState.AddModelError("", "Unable to save changes. Try again, and if the problem persists see your system administrator.");
        }
        // something wrong with the data values
        return View(testChild);
    }
}
It's not enough to have an EFDbTestChildRepository available but I also need an EFDbTestParentRepository. Both of them are assigned to private variables of the controller - and voila, it seems to me that two DbContexts have been created. Or is that not correct?
To avoid the issue I tried using EFDbTestChildRepository to get to the TestParents. But that obviously will only bring up those that are already hooked up to at least one TestChild - so not what I want.
Here is the code for the view model:
public class ChildViewModel
{
    public TestChild TestChild { get; set; }
    public IQueryable<TestParent> TestParents { get; set; }
}
Please let me know if I forgot to include some relevant code. Thanks so much for your advice!

There won't be a performance problem (unless we are talking about nanoseconds; instantiating a context is very cheap), and you won't damage your data integrity (you'll get exceptions before that happens).
But the approach is very limited and will work only in very simple situations. Multiple contexts will lead to problems in many scenarios. As an example: Suppose you want to create a new child for an existing parent and would try it with the following code:
var parent = parentRepo.TestParents.Single(p => p.TestParentID == 1);
var child = new TestChild { TestParent = parent };
childrenRepo.SaveTestChild(child);
This simple code won't work because parent is already attached to the context inside parentRepo, but childrenRepo.SaveTestChild will try to attach it to the context inside childrenRepo, which causes an exception because an entity must not be attached to two contexts at once. (There is actually a workaround here: you could set the FK property instead of loading the parent, child.TestParentID = 1. But without a FK property it would be a problem.)
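A minimal sketch of that FK workaround, reusing the names from the snippet above (TestParentID is the foreign key property from the question's model; the literal values are illustrative only):

// Hedged sketch: set the foreign key value directly so no TestParent
// entity from another context needs to be attached.
var child = new TestChild
{
    TestParentID = 1,   // FK set directly, the parent is never loaded
    Name = "New child"  // hypothetical value for illustration
};
childrenRepo.SaveTestChild(child);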
How to solve such a problem?
One approach could be to extend the EFDbTestChildRepository by a new property:
public IQueryable<TestParent> TestParents
{
    get { return context.TestParents; }
}
In the example code above you could then use only one repository and the code would work. But as you can see, the name EFDbTestChildRepository no longer really fits the purpose of the new repository; it would now be something like EFDbTestParentAndChildRepository.
I would call this the Aggregate Root approach, which means that you create one repository not for a single entity but for a few entities which are closely related to each other and have navigation properties between them.
An alternative solution is to inject the context into the repositories (instead of creating it in the repositories) to make sure that every repository uses the same context. (The context is often abstracted into a IUnitOfWork interface.) Example:
public class MyController : Controller
{
    private readonly MyContext _context;

    public MyController()
    {
        _context = new MyContext();
    }

    public ActionResult SomeAction(...)
    {
        var parentRepo = new EFDbTestParentRepository(_context);
        var childRepo = new EFDbTestChildRepository(_context);
        //...
    }

    protected override void Dispose(bool disposing)
    {
        _context.Dispose();
        base.Dispose(disposing);
    }
}
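For completeness, a sketch of how a repository could look once the context is injected rather than created internally. It uses the question's EFDbContext (the example above calls its context MyContext; the idea is the same):

public class EFDbTestParentRepository
{
    private readonly EFDbContext context;

    // The shared context is supplied from outside, so several repositories
    // participate in the same unit of work.
    public EFDbTestParentRepository(EFDbContext context)
    {
        this.context = context;
    }

    public IQueryable<TestParent> TestParents
    {
        get { return context.TestParents; }
    }
}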
This gives you a single context per controller that you can use across multiple repositories.
The next step might be to create a single context per request by dependency injection, like...
private readonly MyContext _context;

public MyController(MyContext context)
{
    _context = context;
}
...and then configuring the IOC container to create a single context instance which gets injected into perhaps multiple controllers.
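With Ninject (which the question already uses), the per-request binding might look roughly like this; a sketch assuming the Ninject.Web.Common request-scope extension and the question's EFDbContext class:

// Hedged example: one context instance per HTTP request.
ninjectKernel.Bind<EFDbContext>().ToSelf().InRequestScope();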

Do I risk ending up with multiple DbContexts within one request?
Yes. Each instance of a repository is going to instantiate its own DbContext instance. Depending on the size and use of the application, this may not be a problem, although it is not a very scalable approach. There are several ways of handling this, though. In my web projects I add the DbContext(s) to the request's Context.Items collection; this way it is available to all classes that require it. I use Autofac (similar to Ninject) to control which DbContexts are created within specific scenarios and how they are stored, e.g. I have a different 'session manager' for a WCF context than for an HTTP context.
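As an illustration of the request-scoped storage mentioned above, a hedged sketch (the class and key names here are hypothetical, and the context would still need disposing at the end of the request, e.g. in Application_EndRequest):

public static class RequestDbContext
{
    private const string Key = "__RequestDbContext";

    // Returns the single context for the current request, creating it on first use.
    public static EFDbContext Current
    {
        get
        {
            var items = System.Web.HttpContext.Current.Items;
            if (items[Key] == null)
            {
                items[Key] = new EFDbContext();
            }
            return (EFDbContext)items[Key];
        }
    }
}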
And would that lead to any significant performance overhead?
Yes, but again not massively if the application is relatively small. As it grows though, you may notice the overhead.
How about a potential for conflicts between the contexts and any consequences to the data integrity?
One of the reasons for using an ORM like this is so that changes can be maintained within the DbContext. If you are instantiating multiple context instances per request you lose this benefit. You wouldn't notice conflicts or any impact on integrity per se unless you were handling a lot of updates asynchronously.

As promised I post my solution.
I came across your question because I was having trouble with the IIS application pool memory growing beyond limits, and multiple DbContexts were one of my suspects. In retrospect it is fair to say that there were other causes for my trouble. However, it challenged me to find a better layer-based design for my repositories.
I found this excellent blog: Correct use of Repository and Unit Of Work patterns in ASP.NET MVC, which pointed me in the right direction. The redesign is based on the UnitOfWork pattern. It enables me to have just one constructor parameter for all my controllers instead of never-ending constructor parameters. After that, I was able to introduce proactive caching as well, which solved a great deal of the trouble mentioned earlier.
Now I only have these classes:
IUnitOfWork
EFUnitOfWork
IGenericRepository
EFGenericRepository
See the referred blog for complete information and implementation of these classes. Just to give an example, IUnitOfWork contains repository definitions for all entities that I need, like:
namespace MyWebApp.Domain.Abstract
{
    public interface IUnitOfWork : IDisposable
    {
        IGenericRepository<AAAAA> AAAAARepository { get; }
        IGenericRepository<BBBBB> BBBBBRepository { get; }
        IGenericRepository<CCCCC> CCCCCRepository { get; }
        IGenericRepository<DDDDD> DDDDDRepository { get; }
        // etc.

        string Commit();
    }
}
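To give a rough idea of the other side of the abstraction, here is a minimal, illustrative EFUnitOfWork; the real implementation in the referenced blog post is more complete, and I am assuming here that EFGenericRepository takes the shared context in its constructor:

namespace MyWebApp.Domain.Concrete
{
    public class EFUnitOfWork : IUnitOfWork
    {
        private readonly EFDbContext context = new EFDbContext();
        private IGenericRepository<AAAAA> aaaaaRepository;

        public IGenericRepository<AAAAA> AAAAARepository
        {
            // repositories are created lazily and all share the same context
            get { return aaaaaRepository ?? (aaaaaRepository = new EFGenericRepository<AAAAA>(context)); }
        }

        // ... the remaining repository properties follow the same pattern ...

        public string Commit()
        {
            context.SaveChanges();
            return null; // the blog's version returns an error message on failure
        }

        public void Dispose()
        {
            context.Dispose();
        }
    }
}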
The Dependency Injection (DI) is just one statement (I use Ninject):
ninjectKernel.Bind<IUnitOfWork>().To<EFUnitOfWork>();
The controller constructors stay maintainable:
public class MyController : BaseController
{
    private MyModel mdl = new MyModel();
    private IUnitOfWork _context;

    public MyController(IUnitOfWork unitOfWork)
    {
        _context = unitOfWork;
        // initialize whatever needs to be exposed to the View:
        mdl.whatever = unitOfWork.SomeRepository.AsQueryable();
    }

    // etc.
}
Within the controller I can use _context to access all repositories, if needed. The nice part is that a single Commit() call saves the changed data for all repositories:
_context.Commit();
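As a usage illustration (the entity names are the placeholders from above, and Insert/Update are assumed members of the blog's IGenericRepository, not something defined in this answer):

[HttpPost]
public ActionResult Save(AAAAA item, BBBBB related)
{
    _context.AAAAARepository.Update(item);     // assumed repository method
    _context.BBBBBRepository.Insert(related);  // assumed repository method
    _context.Commit();                         // one SaveChanges covers both repositories
    return RedirectToAction("List");
}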

Related

What is the best way to create EF DbContext instance for ASP.NET MVC

In order to support the lazy loading feature in EF, what is the best way to instantiate a DbContext?
I know HttpContext.Current.Items is a good place to create the DbContext (via the Application_BeginRequest and Application_EndRequest methods), but in some sample code from MSDN and the official asp.net mvc site, they just create the DbContext in the controller's constructor and dispose of it in the controller's Dispose() method.
I think the two approaches are not very different, because both implement the session-per-request pattern.
I just want to make sure that my understanding is correct or not.
The Dispose() method in the controller isn't always reliable. By the same token, Session is probably not a good idea either. "Best" is probably subjective, but we've had the best success by using dependency injection (Castle Windsor) and following a Unit of Work Repository pattern.
Setup the unit of work along the following lines:
public class UnitOfWork : IUnitOfWork
{
    public UnitOfWork()
    {
        this.Context = new MyEFEntities();
        this.Context.ContextOptions.LazyLoadingEnabled = true;
    }

    public void Dispose()
    {
        this.Context.Dispose();
    }

    public ObjectContext Context { get; internal set; }
}
Setup your repository:
public class Repository<TEntity> : IRepository<TEntity>
    where TEntity : class
{
    public Repository(IUnitOfWork unitOfWork)
    {
        Context = unitOfWork.Context;
        ObjectSet = Context.CreateObjectSet<TEntity>();
    }

    public ObjectContext Context { get; set; }
    public IObjectSet<TEntity> ObjectSet { get; set; }
}
Register with Castle in Global.asax:
void Application_Start()
{
    this.Container.Register(
        Component.For<IUnitOfWork>()
            .UsingFactoryMethod(() => new UnitOfWork())
            .LifeStyle
            .Is(LifestyleType.PerWebRequest)
    );

    ControllerBuilder.Current.SetControllerFactory(
        new WindsorControllerFactory(this.Container));
}
And use in your controller (or wherever you're using it, as long as it's injectable):
public class SomeController : Controller
{
    public SomeController(IRepository<MyEntity> repository)
    {
        this.Repository = repository;
    }

    public IRepository<MyEntity> Repository { get; set; }

    public ActionResult MyAction()
    {
        ViewData.Model = this.Repository.ObjectSet.Single(x => x.Condition); // or something...
        return View();
    }
}
Any lazy loading here could potentially be a trap for a future issue. Without DI and without a repository, it's hard to see anything working without it being a hack for lazy loading. Also, do you plan on passing your entities to your view? If so, this is going to create a bad overlap: the controller should package data for your view, not have things evaluated later in your view.
For MVC best practice, you should flatten out your domain model as much as possible into a view model (if flattening makes sense) and use the view model. Since you would ideally then know what would be lazy loaded, it may make more sense to take the hit up front and use .Include() in your query to eager load, as sketched below; otherwise you can end up issuing many, many queries against the database.
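For example, a hedged sketch of eager loading with Include, using the entity names from the original question and assuming a DbContext instance called context (the lambda overload of Include comes from System.Data.Entity in EF 4.1):

using System.Data.Entity; // brings in the Include(...) lambda overload

var children = context.TestChildren
                      .Include(c => c.TestParent) // parent loaded up front, no per-row lazy query
                      .ToList();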
I've used a session factory pattern and saved the DbContext in the session object. It will stay open per session. I haven't had problems with it so far.

"An object with the same key already exists in the ObjectStateManager..." exception is thrown when setting an entity state to modified

I followed some examples (including books such as "Pro ASP.NET MVC 3" and "Professional ASP.NET MVC 3") to create simple ASP.NET MVC 3 apps using EF 4.1 (since I'm new to these technologies).
I'm using the following repository (a single instance of it is used by all action methods of the controller) to access the DB:
public class ProductRepository : IProductRepository
{
    private readonly EFDbContext _context = new EFDbContext();

    #region Implementation of IProductRepository
    ....

    public void SaveProduct(Product product)
    {
        if (product.ProductId == 0)
        {
            _context.Products.Add(product);
        }
        else
        {
            _context.Entry(product).State = EntityState.Modified;
        }
        _context.SaveChanges();
    }

    ....
}
This repository performs updating as it was shown in the examples I used.
Product class:
public class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public decimal Price { get; set; }
    public string Category { get; set; }
}
When updating a product, I get the exception "An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key".
I know that similar questions have already been discussed here, but my question is a bit different:
Why is this code, which was taken from the examples, not working (it looks pretty simple and straightforward)? What might I have done wrong or missed?
After searching for hours for a solution, I have found one that seems suitable after doing enough reading.
The fix is here:
An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key
Basically, fetch the record from the Context and call:
var currentProduct = _context.Products.Find(product.ProductId);
_context.Entry(currentProduct).CurrentValues.SetValues(product);
This seems like a bad idea and something I've always hated about EF in my previous work, but according to Ladislav Mrnka (who apparently answers every EF-related question on Stack Overflow) in this post:
Entity Framework and Connection Pooling
EF will store a request for an entity internally, so ideally, it will already be there and it won't be making an additional call back to the database.
The root cause of the problem seems to be that once a product has been fetched from the context, the context keeps track of it, and that's what causes all the trouble. So merging your changes back in is the only way.
Hope that helps.
It looks like you're not updating product.ProductId when the item is saved for the first time. This means that when you come back to save the item again it's adding it to the context again, hence the error.
As the Id will be added by the database (I'm assuming it's an autogenerated Id), you'll need to read your product data back onto the client.
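One way to make that explicit is to return the saved entity from the repository so the caller can pick up the generated key; a hedged sketch based on the SaveProduct method shown earlier:

public Product SaveProduct(Product product)
{
    if (product.ProductId == 0)
    {
        _context.Products.Add(product);
    }
    else
    {
        _context.Entry(product).State = EntityState.Modified;
    }
    _context.SaveChanges();
    // For inserts, EF has now populated the database-generated ProductId
    // on the tracked entity, so the caller sees the real key.
    return product;
}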
From a Generics standpoint, here's how I resolved the same problem very recently:
public TEntity Update(TEntity model, bool persist)
{
    if (model == null)
    {
        throw new ArgumentException("Cannot update a null entity.");
    }

    var updateModel = Get(model.Id);
    if (updateModel == null)
    {
        return model;
    }

    this.context.Entry<TEntity>(updateModel).CurrentValues.SetValues(model);
    this.Save(persist);
    return model;
}

Serializing EF4.1 Entities using JSON.Net

I am building an application using MVC3, Razor view engine, Repository Pattern with Unit of Work and using EF4.1 Code First to define my data model.
Here is a bit of background (gloss over it if you want).
The application itself is just an Intranet 'Menu'.
The 2 main entities are MenuItem and Department of which:
MenuItem can have many Departments
Departments can have many MenuItems
MenuItem may have a MenuItem as a parent
This is how I have defined my Entities
public class MenuItem
{
    public int MenuItemId { get; set; }
    public string Name { get; set; }
    public string Url { get; set; }
    public virtual ICollection<Department> Departments { get; set; }
    public int? ParentId { get; set; }
    public virtual MenuItem ParentMenuItem { get; set; }
}

public class Department
{
    public int DepartmentId { get; set; }
    public string Name { get; set; }
    public virtual ICollection<MenuItem> MenuItems { get; set; }
}
I am using the FluentAPI to define the Self Reference Many-to-Many for the MenuItem.
The issue I am having is passing a MenuItem to the view via JSON.
The central issues are that I have a circular reference between my entities that the built-in JSON serializer can't deal with, and that I still have lazy loading and proxy generation enabled.
I am using the JSON.NET library from NuGet as my JSON serializer, as this seems to be a nice way around the circular reference issue. I am now unsure how to 'fix' the proxy generation issue. Currently the serializer throws: The RelationshipManager object could not be serialized. This type of object cannot be serialized when the RelationshipManager belongs to an entity object that does not implement IEntityWithRelationships.
Can anyone help me with this? If I turn off proxy generation, I am going to have a hell of a time loading all of the MenuItem children, so I am keen to leave it on. I have read a fair amount and there seems to be a variety of different answers, including projecting the entities into another object and serializing that, etc. Ideally there would be some way of configuring JSON.NET to ignore the RelationshipManager object?
Update
Here is what I have used as a Custom ContractResolver for JSON.Net serializer. This seems to have sorted out my issue.
public class ContractResolver : DefaultContractResolver
{
    private static readonly IEnumerable<Type> Types = GetEntityTypes();

    private static IEnumerable<Type> GetEntityTypes()
    {
        var assembly = Assembly.GetAssembly(typeof(IEntity));
        var types = assembly.GetTypes().Where(t => String.Equals(t.Namespace, "Namespace", StringComparison.Ordinal));
        return types;
    }

    protected override List<MemberInfo> GetSerializableMembers(Type objectType)
    {
        if (!AllowType(objectType))
            return new List<MemberInfo>();

        var members = base.GetSerializableMembers(objectType);
        members.RemoveAll(memberInfo => IsMemberEntityWrapper(memberInfo));
        return members;
    }

    private static bool AllowType(Type objectType)
    {
        return Types.Contains(objectType) || Types.Contains(objectType.BaseType);
    }

    private static bool IsMemberEntityWrapper(MemberInfo memberInfo)
    {
        return memberInfo.Name == "_entityWrapper";
    }
}
IEntity is an interface all my Code First entity objects implement.
I realise this question has an accepted answer, but I thought I would post my EF Code First solution for future viewers. I was able to get around the error message with the contract resolver below:
class ContractResolver : DefaultContractResolver
{
    protected override List<System.Reflection.MemberInfo> GetSerializableMembers(Type objectType)
    {
        if (objectType.Namespace.StartsWith("System.Data.Entity.Dynamic"))
        {
            return base.GetSerializableMembers(objectType.BaseType);
        }
        return base.GetSerializableMembers(objectType);
    }
}
This works because EF Code First proxy classes inherit from the POCO class that you actually want serialized, so if we can identify when we are looking at an EF-generated class (by checking the namespace), we can serialize using just the properties of the base class, and therefore only serialize the POCO properties we were really after in the first place.
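Wiring the resolver into a serialization call might look like this; a sketch, where menuItem is an assumed instance of the entity being handed to the view:

var settings = new JsonSerializerSettings
{
    ContractResolver = new ContractResolver(),
    // also guards against the circular MenuItem <-> Department references
    ReferenceLoopHandling = ReferenceLoopHandling.Ignore
};
string json = JsonConvert.SerializeObject(menuItem, Formatting.None, settings);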
Well, you used a powerful serialization API which serializes references and all members as well, and now you complain that it serializes all members :)
I didn't test it but I believe this will bring you close to the solution.
JSON.NET is quite a powerful tool and it should offer you the extensibility point to avoid this behavior, but you will have to code it yourself. You will need a custom contract resolver where you define which members should be serialized. Here is a similar example for NHibernate.
You can implement some logic which takes only the members present in the parent class of the dynamic proxy. I hope this will not break lazy loading. To validate that the current entity is a proxy, you can use this code to get all known proxy types:
IEnumerable<Type> types = ((IObjectContextAdapter)dbContext).ObjectContext.GetKnownProxyTypes();

nHibernate [TransactionAttribute] for UoW conflicts with Repository Pattern

Doing research into the best way to design IRepository<T> structures, I came across a project called 'Whiteboard' (http://whiteboardchat.codeplex.com/) while looking through some forums for NHProf.
I dug around its source code for a while and found a really interesting attribute for MVC called TransactionAttribute, defined as follows (I have made brief adjustments to suit my IoC solution):
using System;
using System.Linq;
using Ninject;

namespace System.Web.Mvc
{
    /// <summary>
    /// This will allow ASP.NET MVC to apply Transactions to the controllers.
    /// </summary>
    [AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
    public class TransactionAttribute : ActionFilterAttribute
    {
        [Inject]
        public NHibernate.ISession Session { get; set; }

        public override void OnActionExecuting(ActionExecutingContext filterContext)
        {
            Session.BeginTransaction();
        }

        public override void OnActionExecuted(ActionExecutedContext filterContext)
        {
            if (Session.Transaction.IsActive)
            {
                if (filterContext.Exception == null)
                {
                    Session.Flush();
                    Session.Transaction.Commit();
                }
                else
                {
                    Session.Transaction.Rollback();
                }
            }
        }
    }
}
This is really interesting and useful; however, something about it bothers me. When I run my queries using NHProf, it gives me warnings about 'not using transactions properly' and suggests I wrap all queries in a transaction. Alright, that's fine and good...
So then I go and decorate my Repository<T> : IRepository<T> class like this ...
public T Update(T instance)
{
    using (var transaction = session.BeginTransaction())
    {
        // attempt to perform the given update
        session.SaveOrUpdate(instance);
        try
        {
            // commit the transaction to the database
            transaction.Commit();
            // update succeeded, so return the saved instance
            return instance;
        }
        catch
        {
            // restore the database to its previous state if we failed
            transaction.Rollback();
            // update failed, so return a null object
            return default(T);
        }
    }
}
Here's the problem I am running into.
Everywhere I read, the common practice is to always use a repository for adding to the collections. However, the TransactionAttribute, which was brought to my attention by the blog of Ayende Rahien (who is, from what I can gather, one of the primary developers of NHProf and one of the people working on this Whiteboard project), assumes that you are performing repository commands at the MVC controller level.
So which is it? I'm utterly confused now about where my transaction logic is supposed to go for best practice. I keep finding conflicting answers, in some cases from the same people.
You are not supposed to deal with transactions inside repositories. A controller (like you have) or HTTP module should start and commit/rollback transactions. Saves or updates are not supposed to be done in isolation. They will be committed at the end of the operation by the controller. This way you can take advantage of ADO batching and other NHibernate features.
Also, make sure to set the FlushMode of the NHibernate ISession to Commit.
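For example, right after the session is opened (a one-line sketch):

session.FlushMode = FlushMode.Commit; // changes are flushed only when the transaction commits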
Are you decorating your action method or controller class with the [Transaction] attribute? If not, this action filter code won't even be called.
Also, you will need to ensure that you [Inject] the session object into your repository as well and that the session object is scoped to the request.
as an example:
public class MyRepository
{
    [Inject]
    public ISession Session { get; set; }

    public void Save(MyModel model)
    {
        Session.Save(model);
    }
}

public class MyController : Controller
{
    [Inject]
    public MyRepository MyRepository { get; set; }

    [Transaction]
    public ActionResult Save(MyModel model)
    {
        MyRepository.Save(model);
        return View();
    }
}
and when registering your session;
var configuration = new NHibernateConfiguration();
Bind<ISessionFactory>().ToConstant(configuration.GetSessionFactory());
Bind<ISession>().ToMethod(x => x.Kernel.Get<ISessionFactory>().OpenSession()).InRequestScope();
notice the InRequestScope() part
Posting this for #Fatal.
The original response answered my question, but this is ultimately what I ended up doing to avoid using method-level attributes.
Instead of declaring the code to control my transaction in an attribute, I included it right in my ISession management for Ninject.
Bind<ISession>()
    .ToMethod(c => OpenSession())
    .InRequestScope()
    .OnActivation(session =>
    {
        session.BeginTransaction();
        session.FlushMode = FlushMode.Commit;
    })
    .OnDeactivation(session =>
    {
        if (session.Transaction.IsActive)
        {
            try
            {
                session.Transaction.Commit();
            }
            catch
            {
                session.Transaction.Rollback();
            }
        }
    });
What this does is open a new ISession for each request where an ISession is injected; when it is activated, it begins a new transaction. Ninject then tracks the state and handles committing or rolling it back on deactivation, thus implementing a very simplistic unit of work pattern.
I do not know if this is the best approach in the world, but I have shown it to a few people and it has not been shot down for bad practice, and it has worked well for me so far.

loosely coupled development

I'm reading Sanderson's "Pro ASP.NET MVC Framework".
I'm a little confused about the decoupling implementation.
He uses LINQ to SQL in the code sample and the repository pattern to interact with the database.
[Table(Name = "Products")]
public class Product
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true, AutoSync = AutoSync.OnInsert)]
    public int ProductID { get; set; }

    [Column]
    public string Name { get; set; }

    [Column]
    public string Description { get; set; }

    [Column]
    public decimal Price { get; set; }

    [Column]
    public string Category { get; set; }
}
public class SqlProductsRepository : IProductsRepository
{
    private Table<Product> productsTable;

    public SqlProductsRepository(string connectionString)
    {
        productsTable = (new DataContext(connectionString)).GetTable<Product>();
    }

    public IQueryable<Product> Products
    {
        get { return productsTable; }
    }
}
SqlProductsRepository is the data layer here as it interacts with the database.
1. However, it is located in the DomainModel project. Maybe that is just for the demo?
So where is the domain logic here?
2. I can't see full decoupling, as the Products property returns IQueryable<Product>.
Is it assumed that if we swap out the component, it must still contain the Product class?
It seems to me that one more project with abstractions is required:
repository interfaces such as IProductsRepository, and mapping-class interfaces such as IProduct.
The data layer component must implement these abstractions.
Is that right?
Maybe it is difficult to explain briefly, but how does this usually work in live projects?
IMHO, this must have been for demo purposes, as it doesn't make much sense (in real-world environments) to separate your architecture into layers and then keep those different layers in a single DLL. That said, here is one valid reason: what if you want multiple applications to use your business layer without giving them immediate access to the data layer? You'd have to carefully consider the access modifiers of your data layer, but it would be possible.
Whether you should expose IQueryable objects from your data layer is a discussion that has been going on since the invention of the repository pattern, and there are quite a lot of resources to be found about it (a small sketch of the non-IQueryable alternative follows the links below).
To list a few:
http://mikehadlow.blogspot.com/2009/01/should-my-repository-expose-iqueryable.html
How can I write a clean Repository without exposing IQueryable to the rest of my application?
To return IQueryable<T> or not return IQueryable<T>
http://www.weirdlover.com/2010/05/11/iqueryable-can-kill-your-dog-steal-your-wife-kill-your-will-to-live-etc/
... (google)
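To make the trade-off concrete, here is a hypothetical alternative shape for the same repository that does not leak IQueryable: the query is composed and executed inside the repository, and callers receive materialized results (the method name is illustrative only, not from the book):

public interface IProductsRepository
{
    IEnumerable<Product> GetByCategory(string category);
}

public class SqlProductsRepository : IProductsRepository
{
    private readonly Table<Product> productsTable;

    public SqlProductsRepository(string connectionString)
    {
        productsTable = new DataContext(connectionString).GetTable<Product>();
    }

    public IEnumerable<Product> GetByCategory(string category)
    {
        // ToList() executes the query here, so no deferred SQL leaks out of the data layer.
        return productsTable.Where(p => p.Category == category).ToList();
    }
}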
