I have been querying EF 6 DbSet.Local with great success for a number of years. However, doing the same in EF Core is a factor of 6 slower, which unfortunately is a showstopper for me: I'm dealing with a huge amount of data, and this prevents me from making the shift from EF 6 to EF Core.
Please help me to solve this issue.
Below is an example which can be run in both EF 6 and EF Core.
private BlogContext _blogContext;
public void BlogTest()
{
_logger.Information("BlogTest started");
_blogContext = new BlogContext();
_blogContext.ChangeTracker.AutoDetectChangesEnabled = false; // EF Core
//_blogContext.Configuration.AutoDetectChangesEnabled = false; // EF 6
// Add blogs to context
for (int i = 0; i < 10000; i++)
{
_blogContext.Blogs.Add(new Blog { ID = i });
}
_logger.Information("BlogTest continued");
// Loop blogs in context
for (int i = 1; i < 100000; i++)
{
foreach (var blog in _blogContext.Blogs.Local)
{
}
}
_logger.Information("BlogTest ended");
}
public class Blog
{
public int ID { get; set; }
}
public class BlogContext: DbContext
{
public DbSet<Blog> Blogs { get; set; }
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
optionsBuilder.UseSqlServer("Server=(localdb)\\mssqllocaldb;Database=Blog;Trusted_Connection=True;MultipleActiveResultSets=True;");
}
}
This has been solved in the upcoming EF Core 3.0:
https://github.com/aspnet/EntityFrameworkCore/issues/14231
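Until you can move to EF Core 3.0, one possible workaround (a sketch, not an official fix, and only valid when the set of tracked entities does not change between passes) is to snapshot the local view into a plain list once and iterate that instead of DbSet.Local:
// Snapshot the local view once; enumerating a List<Blog> avoids the per-iteration
// overhead of the EF Core 2.x LocalView implementation.
var localBlogs = _blogContext.Blogs.Local.ToList();
for (int i = 1; i < 100000; i++)
{
    foreach (var blog in localBlogs)
    {
        // same work as before, against the same tracked entities
    }
}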
This may help here:
Check the BlogContext model; EF Core does not support many things yet.
I'm trying to change from SQLite to Realm.io in my Xamarin projects, but I can't find any auto-increment for IDs. I found a post using Java with the following line:
int nextID = (int) (realm.where(dbObj.class).maximumInt("id") + 1);
In Xamarin there isn't a where, but I tried this:
realm.All<DebitorPlateDBModel> ().Max (x => x.Id + 1);
Sadly "Max" isn't support.
Has anyone succeed on this?
There are different ways of achieving this; it just depends on what fits your model best. Here are just a couple:
Test Model:
public class IdIntKeyModel : RealmObject
{
[Indexed]
public int ID { get; set; }
public string Humanized { get; set; }
}
Gap-less key ordering (via Count):
Note: Good for initial bulk imports
Note: Assumes only one thread adding records and you do not have gaps in your record ids, i.e. no deletes without reordering keys, etc...
var config = RealmConfiguration.DefaultConfiguration;
config.SchemaVersion = 1;
using (var theRealm = Realm.GetInstance("StackoverFlow.realm"))
{
var key = theRealm.All<IdIntKeyModel>();
theRealm.Write(() =>
{
for (int i = 1; i < 1000; i++)
{
var model = theRealm.CreateObject<IdIntKeyModel>();
model.ID = key.Count() + 1;
model.Humanized = model.ID.ToWords();
System.Diagnostics.Debug.WriteLine($"{model.ID} : {model.Humanized}");
}
});
var whatIsTheKey = theRealm.All<IdIntKeyModel>().OrderBy(modelKey => modelKey.ID).Last();
System.Diagnostics.Debug.WriteLine($"{whatIsTheKey.ID} : {whatIsTheKey.Humanized}");
}
Gap'ie key ordering (refetch the last record by indexed ID):
Note: "Gap'ie" is Trademark pending ;-)
var rand = new Random();
var config = RealmConfiguration.DefaultConfiguration;
config.SchemaVersion = 1;
using (var theRealm = Realm.GetInstance("StackOverflow.realm"))
{
theRealm.Write(() =>
{
for (int i = 1; i < 1000; i++)
{
var lastID = theRealm.All<IdIntKeyModel>().OrderByDescending(modelKey => modelKey.ID).FirstOrDefault();
var model = theRealm.CreateObject<IdIntKeyModel>();
model.ID = lastID != null ? lastID.ID + rand.Next(10) : 1; // use lastID.ID++ for normal code flow, using rand.Next as a test to check ID indexing
model.Humanized = model.ID.ToWords();
}
});
var lastKey = theRealm.All<IdIntKeyModel>().OrderBy(modelKey => modelKey.ID).Last();
System.Diagnostics.Debug.WriteLine($"{lastKey.ID} : {lastKey.Humanized}");
}
Note: Code updates based on added support for FirstOrDefault, tested w/ v0.78.1
"In Xamarin there isn't a where" is not correct - we support LINQ as you can see in the snippets on the home page.
However, you are correct that we don't (yet) have an auto-increment or anything for that role.
We will get something at some point, but due to synchronisation issues it will not be auto-increment, but rather something like auto-unique-id.
We just released the full Mobile Platform with sync (Xamarin is getting there). One of the big deals of the Realm Object Server is dealing with people who are editing offline data and then having highly reliable synchronisation to other Realms.
There is no way that simple auto-increment can be made to work with disconnected data creation (the first time I dealt with this was on a Mac back in 1996 but the laws of physics haven't changed, we just stopped using floppy disks).
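In the meantime, one pattern that already works with the current API (a sketch; DebitorPlateDBModel is the asker's class name, and the payload property is made up for illustration) is a string primary key filled with a GUID, which stays unique even when records are created offline on several devices:
using System;
using Realms;

public class DebitorPlateDBModel : RealmObject
{
    [PrimaryKey]
    public string Id { get; set; }

    public string PlateNumber { get; set; } // illustrative payload property
}

// inside a write transaction, give each new object a collision-free ID:
// plate.Id = Guid.NewGuid().ToString();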
The where clause is actually supported by Realm; you only need a using System.Linq directive. However, an auto-increment ID is really a big deal.
I solved the auto-increment issue by creating my own ID:
using Realms;
using System;
namespace RealmDatabase
{
public class RealmUserObject : RealmObject
{
[PrimaryKey]
public int userID { get; set; }
public string userLoginName { get; set; }
public DateTimeOffset userCreated { get; set; }
public bool userActive { get; set; }
}
}
Then, when adding an account, I get the last user record from Realm, take its ID (which is an int), and add 1 to it before inserting the new account.
public List<RealmUserObject> getAllUserAccountsFromDatabase()
{
try
{
realm = Realm.GetInstance(config);
return realm.All<RealmUserObject>().ToList(); // materialize so the return type matches List<RealmUserObject>
}
catch (Exception) { throw; }
}
I'm retrieving the whole account list because it is useful to me in other scenarios, but you can ask directly for what you want, like this:
return realm.All<RealmUserObject>().Last().userID;
Note: of course, the issue with this is the case where no record exists yet; if the table is empty, just insert the record with its ID set to 1, and use the last-ID-plus-one approach only when the account count is greater than 0.
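A small helper covering the empty-database case might look like this (a sketch that reuses the OrderByDescending + FirstOrDefault pattern from the earlier answer; GetNextUserId is a made-up name):
private int GetNextUserId(Realm realm)
{
    // FirstOrDefault returns null while the table is still empty
    var lastUser = realm.All<RealmUserObject>()
                        .OrderByDescending(u => u.userID)
                        .FirstOrDefault();
    return lastUser != null ? lastUser.userID + 1 : 1;
}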
I am trying to find a way to improve insert performance with the following code (please read my questions after the code block):
//Domain classes
[Table("Products")]
public class Product
{
[Key]
[DatabaseGenerated(DatabaseGeneratedOption.Identity)]
public int Id { get; set; }
public string Sku { get; set; }
[ForeignKey("Orders")]
public virtual ICollection<Order> Orders { get; set; }
public Product()
{
Orders = new List<Order>();
}
}
[Table("Orders")]
public class Order
{
[Key]
[DatabaseGenerated(DatabaseGeneratedOption.Identity)]
public int Id { get; set; }
public string Title { get; set; }
public decimal Total { get; set; }
[ForeignKey("Products")]
public virtual ICollection<Product> Products { get; set; }
public Order()
{
Products = new List<Product>();
}
}
//Data access
public class MyDataContext : DbContext
{
public MyDataContext()
: base("MyDataContext")
{
Configuration.LazyLoadingEnabled = true;
Configuration.ProxyCreationEnabled = true;
Database.SetInitializer(new CreateDatabaseIfNotExists<MyDataContext>());
}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Entity<Product>().ToTable("Products");
modelBuilder.Entity<Order>().ToTable("Orders");
}
}
//Service layer
public interface IServices<T, K>
{
T Create(T item);
T Read(K key);
IEnumerable<T> ReadAll(Expression<Func<IEnumerable<T>, IEnumerable<T>>> pre);
T Update(T item);
void Delete(K key);
void Save();
void Dispose();
void BatchSave(IEnumerable<T> list);
void BatchUpdate(IEnumerable<T> list, Action<UpdateSpecification<T>> spec);
}
public class BaseServices<T, K> : IDisposable, IServices<T, K> where T : class
{
protected MyDataContext Context;
public BaseServices()
{
Context = new MyDataContext();
}
public T Create(T item)
{
T created;
created = Context.Set<T>().Add(item);
return created;
}
public void Delete(K key)
{
var item = Read(key);
if (item == null)
return;
Context.Set<T>().Attach(item);
Context.Set<T>().Remove(item);
}
public T Read(K key)
{
T read;
read = Context.Set<T>().Find(key);
return read;
}
public IEnumerable<T> ReadAll(Expression<Func<IEnumerable<T>, IEnumerable<T>>> pre)
{
IEnumerable<T> read;
read = Context.Set<T>().ToList();
read = pre.Compile().Invoke(read);
return read;
}
public T Update(T item)
{
Context.Set<T>().Attach(item);
Context.Entry<T>(item).CurrentValues.SetValues(item);
Context.Entry<T>(item).State = System.Data.Entity.EntityState.Modified;
return item;
}
public void Save()
{
Context.SaveChanges();
}
}
public interface IOrderServices : IServices<Order, int>
{
//custom logic goes here
}
public interface IProductServices : IServices<Product, int>
{
//custom logic goes here
}
//Web project's controller
public ActionResult TestCreateProducts()
{
//Create 100 new rest products
for (int i = 0; i < 100; i++)
{
_productServices.Create(new Product
{
Sku = i.ToString()
});
}
_productServices.Save();
var products = _productServices.ReadAll(r => r); //get a list of saved products to add them to orders
var random = new Random();
var orders = new List<Order>();
var count = 0;
//Create 3000 orders
for (int i = 1; i <= 3000; i++)
{
//Generate a random list of products to attach to the current order
var productIds = new List<int>();
var x = random.Next(1, products.Count() - 1);
for (int j = 0; j < x; j++)
{
productIds.Add(random.Next(products.Min(r => r.Id), products.Max(r => r.Id)));
}
//Create the order
var order = new Order
{
Title = "Order" + i,
Total = i,
Products = products.Where(p => productIds.Contains(p.Id)).ToList() // ToList() so the result can be assigned to ICollection<Product>
};
orders.Add(order);
}
_orderServices.CreateRange(orders);
_orderServices.Save();
return RedirectToAction("Index");
}
This code works fine but is very, VERY slow when SaveChanges is executed.
Behind the scenes, the annotations on the domain objects create all the relationships needed: an OrderProducts table with the proper foreign keys is automatically created, and the inserts are done by EF properly.
I've tried many things with bulk inserts using EntityFramework.Utilities, SqlBulkCopy, etc... but none worked.
Is there a way to achieve this?
Understand that this is only for testing purposes; my goal is to optimize, as best I can, any operation in our software that uses EF.
Thanks!
Just before you do your inserts, disable your context's AutoDetectChangesEnabled (by setting it to false). Do your inserts and then set AutoDetectChangesEnabled back to true, e.g.:
try
{
MyContext.Configuration.AutoDetectChangesEnabled = false;
// do your inserts updates etc..
}
finally
{
MyContext.Configuration.AutoDetectChangesEnabled = true;
}
You can find more information on what this setting does here.
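As one way to apply this, the BatchSave member already declared on IServices<T, K> in the question could be implemented along these lines (a sketch; the try/finally ensures the flag is restored even if SaveChanges throws):
public void BatchSave(IEnumerable<T> list)
{
    try
    {
        // turn change detection off while adding many entities
        Context.Configuration.AutoDetectChangesEnabled = false;
        foreach (var item in list)
        {
            Context.Set<T>().Add(item);
        }
        Context.SaveChanges();
    }
    finally
    {
        // always restore the default behaviour
        Context.Configuration.AutoDetectChangesEnabled = true;
    }
}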
I see two reasons why your code is slow.
Add vs. AddRange
You add entities one by one using the Create method.
You should always prefer AddRange over Add: the Add method triggers DetectChanges every time it is invoked, while AddRange does so only once.
You should add a "CreateRange" method to your code:
public IEnumerable<T> CreateRange(IEnumerable<T> list)
{
return Context.Set<T>().AddRange(list);
}
var products = new List<Product>();
//Create 100 new rest products
for (int i = 0; i < 100; i++)
{
products.Add(new Product { Sku = i.ToString() });
}
_productServices.CreateRange(products);
_productServices.Save();
Disabling/enabling the AutoDetectChanges property also works, as @mark_h proposed; however, I personally don't like this kind of solution.
Database Round Trip
A database round trip is required for every record you add, modify, or delete. So if you insert 3,000 records, 3,000 database round trips are required, which is VERY slow.
You already tried EntityFramework.BulkInsert or SqlBulkCopy, which is great. I recommend you first try them again with the "AddRange" fix to see the new performance.
Here is a biased comparison of libraries supporting bulk insert for EF:
Entity Framework - Bulk Insert Library Reviews & Comparisons
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to BulkSaveChanges, BulkInsert, BulkUpdate, BulkDelete, and BulkMerge within your database.
It supports all inheritances and associations.
// Easy to use
public void Save()
{
// Context.SaveChanges();
Context.BulkSaveChanges();
}
// Easy to customize
public void Save()
{
// Context.SaveChanges();
Context.BulkSaveChanges(bulk => bulk.BatchSize = 100);
}
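If you prefer to stay with what ships in ADO.NET rather than a third-party library, a raw SqlBulkCopy pass is another option for the initial product insert. A minimal sketch, assuming the "MyDataContext" connection string name from the question's DbContext and the Products/Sku columns from the model (requires System.Data, System.Data.SqlClient and System.Configuration):
var table = new DataTable();
table.Columns.Add("Sku", typeof(string));
for (int i = 0; i < 100; i++)
{
    table.Rows.Add(i.ToString());
}

var connectionString = ConfigurationManager.ConnectionStrings["MyDataContext"].ConnectionString;
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "Products";
        bulkCopy.ColumnMappings.Add("Sku", "Sku"); // Id is an identity column, let SQL Server generate it
        bulkCopy.WriteToServer(table);
    }
}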
EDIT: Added answer to the sub-question:
"An entity object cannot be referenced by multiple instances of IEntityChangeTracker"
The issue happens because you use two different DbContexts: one for the product and one for the order.
You may find a better answer than mine in a different thread, such as this answer.
The Add method successfully attaches the product; subsequent calls with the same product don't throw an error because it is the same instance.
The AddRange method, however, attaches the product multiple times since it doesn't come from the same context, so when DetectChanges is called, it doesn't know how to handle it.
One way to fix it is by re-using the same context:
var _productServices = new BaseServices<Product, int>();
var _orderServices = new BaseServices<Order, int>(_productServices.Context);
While it may not be elegant, the performance will be improved.
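For that snippet to compile against the BaseServices<T, K> shown in the question, the class needs to expose its context and accept one from outside; a minimal sketch of those two changes (an assumption about the wiring, not the only way to do it):
public class BaseServices<T, K> : IDisposable, IServices<T, K> where T : class
{
    // expose the context so a second service can share it
    public MyDataContext Context { get; private set; }

    public BaseServices()
    {
        Context = new MyDataContext();
    }

    // reuse an existing context so both services share one change tracker
    public BaseServices(MyDataContext context)
    {
        Context = context;
    }

    // ...the remaining members stay exactly as they are...
}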
Hello, I downloaded the nopCommerce solution, an open-source e-commerce application, which I managed to run and install without problems against an MS SQL Server database; however, I would like to implement it with an Oracle database.
Official site: http://www.nopcommerce.com/
I have been guided by this post:
http://www.nopcommerce.com/boards/t/17712/mysql-support.aspx
I have tried to follow the steps indicated for MySQL and adapt them to Oracle; one of the first things it tells me to do is create two classes.
OracleConnectionFactory:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.Entity.Infrastructure;
using System.Data.Common;
using Oracle.DataAccess.Client;
namespace Nop.Data
{
public class OracleConnectionFactory : IDbConnectionFactory
{
private readonly string _baseConnectionString;
private Func<string, DbProviderFactory> _providerFactoryCreator;
public OracleConnectionFactory()
{
}
public OracleConnectionFactory(string baseConnectionString)
{
this._baseConnectionString = baseConnectionString;
}
public DbConnection CreateConnection(string nameOrConnectionString)
{
string connectionString = nameOrConnectionString;
bool treatAsConnectionString = nameOrConnectionString.IndexOf('=') >= 0;
if (!treatAsConnectionString)
{
OracleConnectionStringBuilder builder = new OracleConnectionStringBuilder(this.BaseConnectionString);
//MySqlConnectionStringBuilder builder = new MySqlConnectionStringBuilder(this.BaseConnectionString);
//builder.Server = nameOrConnectionString;
connectionString = builder.ConnectionString;
}
DbConnection connection = null;
try
{
connection = this.ProviderFactory("Oracle.DataAccess.Client").CreateConnection();
connection.ConnectionString = connectionString;
}
catch
{
//connection = new MySqlConnection(connectionString);
connection = new OracleConnection(connectionString);
}
return connection;
}
public string BaseConnectionString
{
get
{
return this._baseConnectionString;
}
}
internal Func<string, DbProviderFactory> ProviderFactory
{
get
{
Func<string, DbProviderFactory> func1 = this._providerFactoryCreator;
return delegate(string name)
{
return DbProviderFactories.GetFactory(name);
};
}
set
{
this._providerFactoryCreator = value;
}
}
}
}
OracleDataProvider:
using System;
using System.Collections.Generic;
using System.Data.Common;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.SqlClient;
using System.IO;
using System.Text;
using System.Web.Hosting;
using Nop.Data.Initializers;
using Oracle.DataAccess.Client;
using Nop.Core.Data;
namespace Nop.Data
{
public class OracleDataProvider : IDataProvider
{
#region Utilities
protected virtual string[] ParseCommands(string filePath, bool throwExceptionIfNonExists)
{
if (!File.Exists(filePath))
{
if (throwExceptionIfNonExists)
throw new ArgumentException(string.Format("Specified file doesn't exist - {0}", filePath));
else
return new string[0];
}
var statements = new List<string>();
using (var stream = File.OpenRead(filePath))
using (var reader = new StreamReader(stream))
{
var statement = "";
while ((statement = readNextStatementFromStream(reader)) != null)
{
statements.Add(statement);
}
}
return statements.ToArray();
}
protected virtual string readNextStatementFromStream(StreamReader reader)
{
var sb = new StringBuilder();
string lineOfText;
while (true)
{
lineOfText = reader.ReadLine();
if (lineOfText == null)
{
if (sb.Length > 0)
return sb.ToString();
else
return null;
}
//MySql doesn't support GO, so just use a commented out GO as the separator
if (lineOfText.TrimEnd().ToUpper() == "-- GO")
break;
sb.Append(lineOfText + Environment.NewLine);
}
return sb.ToString();
}
#endregion
#region Methods
public virtual void InitConnectionFactory()
{
//var connectionFactory = new SqlConnectionFactory();
var connectionFactory = new OracleConnectionFactory();
//TODO fix compilation warning (below)
#pragma warning disable 0618
Database.DefaultConnectionFactory = connectionFactory;
}
/// <summary>
/// Initialize database
/// </summary>
public virtual void InitDatabase()
{
InitConnectionFactory();
SetDatabaseInitializer();
}
/// <summary>
/// Set database initializer
/// </summary>
public virtual void SetDatabaseInitializer()
{
//pass some table names to ensure that we have nopCommerce 2.X installed
var tablesToValidate = new[] { "Customer", "Discount", "Order", "Product", "ShoppingCartItem" };
//custom commands (stored procedures, indexes)
var customCommands = new List<string>();
//use webHelper.MapPath instead of HostingEnvironment.MapPath which is not available in unit tests
customCommands.AddRange(ParseCommands(HostingEnvironment.MapPath("~/App_Data/Install/SqlServer.Indexes.sql"), false));
//use webHelper.MapPath instead of HostingEnvironment.MapPath which is not available in unit tests
customCommands.AddRange(ParseCommands(HostingEnvironment.MapPath("~/App_Data/Install/SqlServer.StoredProcedures.sql"), false));
var initializer = new CreateTablesIfNotExist<NopObjectContext>(tablesToValidate, customCommands.ToArray());
Database.SetInitializer(initializer);
}
/// <summary>
/// A value indicating whether this data provider supports stored procedures
/// </summary>
public virtual bool StoredProceduredSupported
{
get { return true; }
}
/// <summary>
/// Gets a support database parameter object (used by stored procedures)
/// </summary>
/// <returns>Parameter</returns>
public virtual DbParameter GetParameter()
{
//return new SqlParameter();
return new OracleParameter();
}
#endregion
}
}
I also installed the managed NuGet package (Oracle ODP.NET Managed Driver) in Nop.Data and Nop.Web, as this link describes:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/dotnet/CodeFirst/index.html
I would appreciate any help clarifying the steps I need to take, or where I may be going wrong.
One of the first things I am trying to do is get the Oracle provider recognized so it can connect to my database.
This is an interesting question. Using nopCommerce with Oracle is technically possible, but it would be a very wild ride for you.
Good news first: nopCommerce is architected around the repository pattern. Basically, Nop.Data abstracts all of the SQL Server related data access. You would need to re-write this almost entirely.
Not so good news next: search, paging, and catalog listing use stored procedures in SQL Server. You may need to re-write these completely, and most of the time you are on your own. Unless you are confident with Oracle and .NET EF, it would be a really wild ride.
I would say it is less of a problem if you stick with SQL Server. I understand that sometimes you don't get to make the technical decision, but you can clearly explain the complexity and effort needed to migrate to the person who does make that decision.
Source: nopCommerce developer for the last 2.5 years.
I have a Product object, which can depend on other products.
This dependency information is stored in a ServiceTemplate object.
These dependencies can be chained (I use the arrow to indicate that 3 depends on 2, and 2 depends on 1):
1 <= 2 <= 3
I need to prevent circular references where, with the above, 1 could be set to depend on 3, causing a loop.
This is hashed up from how I think it may work, but it brute-forces the solution. What would be the optimal algorithmic approach, or is this already the best way?
class Program
{
class Product
{
public Product(int id)
{
this.id = id;
}
private readonly int id;
public int Id { get { return this.id; } }
}
class ServiceTemplate
{
// stores the list of other products that a product depends on
Dictionary<int, List<int>> relationships = new Dictionary<int, List<int>>();
public void SetRequires(Product on, Product requires)
{
if (WouldCauseCircularDependency(on.Id, requires.Id))
throw new ArgumentException("circular dependencies not allowed");
List<int> list = null;
this.relationships.TryGetValue(on.Id, out list);
if(list==null)
{
list = new List<int>();
this.relationships.Add(on.Id, list);
}
list.Add(requires.Id);
}
private bool WouldCauseCircularDependency(int on, int requires)
{
// get relationships of product we will depend on
List<int> list = null;
this.relationships.TryGetValue(requires, out list);
if (list == null)
{
return false;
}
else if (list.Contains(on))
{
return true;
}
else
{
foreach (var id in list)
{
// traverse each child recursively
if (WouldCauseCircularDependency(on, id))
return true;
}
}
return false; // if we got this far, no circular references detected
}
}
static void Main(string[] args)
{
var windows = new Product(1);
var linux = new Product(2);
var mySql = new Product(3);
var ms_sql = new Product(4);
var cluster = new Product(5);
var other = new Product(6);
var config = new ServiceTemplate();
config.SetRequires(mySql, windows); // mySql requires windows
config.SetRequires(mySql, linux); // mySql requires linux
config.SetRequires(ms_sql, windows); // microsoft sql requires windows
config.SetRequires(cluster, ms_sql); // cluster requires microsoft sql
config.SetRequires(other, cluster);
// this should throw an exception due to circular dependency
config.SetRequires(windows, other);
/* at this point the config relationships dictionary is as follows:
3 => 1,2
4 => 1
5 => 4
6 => 5
(the final SetRequires(windows, other) throws, so 1 => 6 is never added)
*/
}
}
You could try topological sorting: if a topological ordering can be constructed, you have no circular dependency; otherwise, you have a cycle.
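If you would rather not pull in a package, the same check can be written directly against the question's Dictionary<int, List<int>> structure using Kahn's algorithm; a rough sketch (returns true when a topological ordering exists, i.e. no cycle; needs System.Collections.Generic and System.Linq):
static bool HasTopologicalOrder(Dictionary<int, List<int>> edges)
{
    // compute the in-degree of every node that appears in the graph
    var inDegree = new Dictionary<int, int>();
    foreach (var pair in edges)
    {
        if (!inDegree.ContainsKey(pair.Key)) inDegree[pair.Key] = 0;
        foreach (var target in pair.Value)
        {
            if (!inDegree.ContainsKey(target)) inDegree[target] = 0;
            inDegree[target]++;
        }
    }
    // start from the nodes with no incoming edges
    var queue = new Queue<int>(inDegree.Where(kv => kv.Value == 0).Select(kv => kv.Key));
    int visited = 0;
    while (queue.Count > 0)
    {
        var node = queue.Dequeue();
        visited++;
        List<int> targets;
        if (edges.TryGetValue(node, out targets))
        {
            foreach (var target in targets)
            {
                if (--inDegree[target] == 0)
                    queue.Enqueue(target);
            }
        }
    }
    // any node that was never dequeued sits on a cycle
    return visited == inDegree.Count;
}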
I ended up using the QuickGraph NuGet package and an AdjacencyGraph<int, Edge<int>>; once a relationship is added, I attempt a TopologicalSort() as advised by @Vesi:
class Program
{
class Product
{
public Product(int id)
{
this.id = id;
}
private readonly int id;
public int Id { get { return this.id; } }
}
class ServiceTemplate
{
// stores the list of other products that a product depends on
AdjacencyGraph<int, Edge<int>> relationshipGraph = new AdjacencyGraph<int, Edge<int>>();
public void SetRequires(Product on, Product requires)
{
var toAdd = new Edge<int>(on.Id, requires.Id);
this.relationshipGraph.AddVerticesAndEdge(toAdd);
try
{
var list = this.relationshipGraph.TopologicalSort();
}
catch (NonAcyclicGraphException)
{
this.relationshipGraph.RemoveEdge(toAdd);
throw new ArgumentException("Circular dependencies not allowed");
}
}
}
static void Main(string[] args)
{
var windows = new Product(1);
var linux = new Product(2);
var mySql = new Product(3);
var ms_sql = new Product(4);
var cluster = new Product(5);
var other = new Product(6);
var config = new ServiceTemplate();
config.SetRequires(mySql, windows); // mySql requires windows
config.SetRequires(mySql, linux); // mySql requires linux
config.SetRequires(ms_sql, windows); // microsoft sql requires windows
config.SetRequires(cluster, ms_sql); // cluster requires microsoft sql
config.SetRequires(other, cluster);
// this should throw an exception due to circular dependency
config.SetRequires(windows, other);
}
}
Given the following code:
public class RMAInfo
{
public enum RMAStatuses {
Undefined = 0, Approved = 1, Denied = 2,
Pending = 3, Received = 4, Closed = 5
}
public enum ReturnLocations { Undefined = 0, Utah = 1, Indy = 2 }
public RMAInfo()
{
ID = -1;
RMACode = string.Empty;
}
public int ID { get; set; }
public string RMACode { get; set; }
public string ResellerID { get; set; }
public RMAStatuses RMAStatus { get; set; }
}
private List<RMAInfo> GetRMAInfos_Internal(string resellerID)
{
List<RMAInfo> returnRMAInfos = new List<RMAInfo>();
using (Models.RMAEntities context = new Models.RMAEntities())
{
returnRMAInfos = (from r in context.RMAs
where r.ResellerID == resellerID
select new RMAInfo
{
ID = r.ID,
RMACode = r.RMACode,
ResellerID = r.ResellerID,
// error on next line!
RMAStatus = RMAInfo.RMAStatuses.Pending
}).ToList();
}
return returnRMAInfos;
}
I am getting an error on the assignment to the RMAStatus field. The error is
The specified value is not an instance of type 'Edm.Int32'
If I comment out that line, it works fine.
I have also tried to do this same code without using EF, and it seems to work fine.
Any ideas?
Entity Framework does not like the enum, as it cannot translate it to SQL. You would need to expose a way for EF to set the underlying int value, or you would have to set the value yourself once EF was done with it.
What you might do is expose an int property to set it. If you wish, you could restrict it to internal access so that perhaps callers can't see it but your EF code can (assuming callers are in different assemblies, but your context is not). Then you could have
public class RMAInfo
{
///<summary>
/// Integer representation of RMAStatus
///</summary>
internal int RMAStatusCode
{
get { return (int)this.RMAStatus; } // you could omit the getter
set { this.RMAStatus = (RMAInfo.RMAStatuses)value; }
}
}
...
select new RMAInfo
{
...
RMAStatusCode = (int)RMAInfo.RMAStatuses.Pending
}
To avoid this, you would basically select your RMAInfo sans status, and then iterate over the result to set each status to pending, leaving EF out of it entirely.
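A sketch of that alternative, reusing the query shape from GetRMAInfos_Internal above:
var returnRMAInfos = (from r in context.RMAs
                      where r.ResellerID == resellerID
                      select new RMAInfo
                      {
                          ID = r.ID,
                          RMACode = r.RMACode,
                          ResellerID = r.ResellerID
                      }).ToList();

// the query has already executed, so this is plain C# and EF never sees the enum
foreach (var info in returnRMAInfos)
{
    info.RMAStatus = RMAInfo.RMAStatuses.Pending;
}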
Installing .NET 4.5 appears to fix the issue as well (your project can still be on 4.0).
I was having this issue on our staging server (dev and test servers worked fine) and discovered that it did not have .NET 4.5 installed. Once I installed 4.5, the issue cleared up without any code changes.