Check if a table has been created in the code-first approach - LINQ

I am using Entity Framework's code-first approach to create tables, and I need to check if there are any entities in the database that I need to delete:
class MyDocument
{
    public string Id { get; set; }
    public string Text { get; set; }
}

class MyContext : DbContext
{
    public DbSet<MyDocument> Documents { get; set; }
}

using (var data = new MyContext())
{
    var present = from d in data.Documents
                  where d.Id == "some id" || d.Id == "other id"
                  select d;
    // delete above documents
}
On first run, when there is no table yet, the LINQ expression above throws an exception:
Invalid object name 'dbo.Documents'
How do I check if the table is there and if it is not, then set present to the empty set, perhaps? Or maybe there is a way to force database/table creation before I issue the LINQ query?

EF actually checks the entire context against the database it is attached to. The database can contain more than the context, but not less. So what you actually check is

Context.Database.CreateIfNotExists();

If the database and the context don't match and you are using automatic migrations, you get errors about specific objects, but this can be misleading in terms of how EF performs the context-to-database comparison. You could of course try to access every DbSet in the context, though I'm not sure how useful that is.

EF Code First supports migrations, either automatic or on demand. See EF Code First migrations and Database.SetInitializer; use SetInitializer to turn on automatic migrations, for example. The link provides more information on the manual, controlled approach to database migration for advanced scenarios; the easier automatic approach is also described there.
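For illustration, here is a minimal sketch of both options, assuming the MyContext class from the question and EF 6; the MyMigrationsConfiguration class is a hypothetical stand-in for the one Enable-Migrations generates:

using System.Data.Entity;
using System.Data.Entity.Migrations;

// Hypothetical migrations configuration; Enable-Migrations generates an
// equivalent class named Configuration in the Migrations folder.
public class MyMigrationsConfiguration : DbMigrationsConfiguration<MyContext>
{
    public MyMigrationsConfiguration()
    {
        AutomaticMigrationsEnabled = true;
    }
}

public static class DatabaseBootstrap
{
    public static void EnsureDatabase()
    {
        // Option 1: create the database (and all mapped tables) if it does not exist yet.
        using (var data = new MyContext())
        {
            data.Database.CreateIfNotExists();
        }

        // Option 2: register an initializer that migrates the database to the
        // current model the first time the context is used.
        Database.SetInitializer(
            new MigrateDatabaseToLatestVersion<MyContext, MyMigrationsConfiguration>());
    }
}

Either way, the dbo.Documents table exists before the LINQ query runs, so present no longer needs special handling for the missing-table case.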

Related

DbSet declaration does not accept the table name shown in database

I have developed an app for tracking multi-party coordination on proposed change requests.
I only use two tables, with a one-to-one relationship between them. One table corresponds to the fields on an existing official paper form, while the other tracks additional information.
I previously developed this app as a standalone project using MS Access, but now I am adding the app to a "one-stop shopping" SQL Server database environment.
My problem comes in my DbSet statements. The table names that the DBA chose result in errors that I never had when the app was standalone:
Below is the C# code for the DbContext portion:
namespace FormTracker
{
    public class ApplicationDbContext : DbContext
    {
        public ApplicationDbContext(DbContextOptions options) : base(options)
        {
        }

        public DbSet<T__AODMS_1067_tracking_fields> T__AODMS_1067_tracking_fieldss { get; set; }
        public DbSet<T__AODMS_1067_tracking_non_1067_fields> T__AODMS_1067_tracking_non_1067_fields_Recordss { get; set; }
    }
}
The portions between the <> are what gets flagged when the build is executed.
Any ideas? Possibly something totally obvious that I'm not seeing?
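For comparison, the type arguments inside the <> must be entity classes that actually exist in the project; the database table names themselves can be mapped separately. A minimal sketch of that pattern, assuming EF Core (the constructor takes DbContextOptions) and hypothetical cleaner class names:

using Microsoft.EntityFrameworkCore;
using System.ComponentModel.DataAnnotations.Schema;

// Hypothetical entity classes mapped onto the DBA's table names.
[Table("T__AODMS_1067_tracking_fields")]
public class TrackingFormFields
{
    public int Id { get; set; }
    // ... columns mirroring the official paper form
}

[Table("T__AODMS_1067_tracking_non_1067_fields")]
public class TrackingNonFormFields
{
    public int Id { get; set; }
    // ... additional one-to-one tracking information
}

public class ApplicationDbContext : DbContext
{
    public ApplicationDbContext(DbContextOptions options) : base(options)
    {
    }

    public DbSet<TrackingFormFields> TrackingFormFields { get; set; }
    public DbSet<TrackingNonFormFields> TrackingNonFormFields { get; set; }
}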

How can I delete all records from a table?

I've been searching for an answer on how to delete ALL records from a table using LINQ method syntax, but all the answers do it based on an attribute.
I want to delete every single record from the database.
The table looks like so:
public class Inventory
{
    public int InventoryId { get; set; }
    public string InventoryName { get; set; }
}
I'm not looking to delete records based on a specific name or id.
I want to delete ALL records.
LINQ method syntax isn't a must, but I do prefer it since it's easier to read.
To delete all data from a DB table I recommend using SQL:

TRUNCATE TABLE <tableName>
LINQ is not meant to change the source. There are no LINQ methods to delete or update any element of your input.
The only way to change your input is to select the (identifiers of the) data that you want to delete into some collection, and then delete the items one by one in a foreach. It might be that your interface with the source collection already has a DeleteRange; in that case you don't have to do the foreach.
Alas, you didn't mention what your table is: is it a System.Data.DataTable? Or maybe an Entity Framework DbSet<...>? Or any other commonly used class that represents a table?
If your table class is a System.Data.DataTable, or implements ICollection, it should have a Clear method.
If your table is an Entity Framework DbSet<...>, then it depends on your provider (the database management system that you use) whether you can use Clear. Usually you need to do the following:
using (var dbContext = new MyDbContext(...))
{
    List<...> itemsToDelete = dbContext.MyTable.Where(...).ToList();
    dbContext.MyTable.RemoveRange(itemsToDelete);
    dbContext.SaveChanges();
}
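If you prefer the TRUNCATE route, you can send that SQL through the context instead of loading and removing the rows. A minimal sketch, assuming EF 6 and that the Inventory entities are mapped to a table named dbo.Inventories:

using (var dbContext = new MyDbContext())
{
    // Bypasses change tracking entirely; requires TRUNCATE permission and
    // assumes the table name is dbo.Inventories.
    dbContext.Database.ExecuteSqlCommand("TRUNCATE TABLE dbo.Inventories");
}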

Breeze: Remove entities from the cache that were removed from the database by another user, without clearing the whole cache?

I'm facing a problem that is probably quite common, but I can't find any solution to it.
The problem occurs when a user has entities in the cache on the client and another user removes some of those entities (on the server). When the first user then wants to refresh their data, the removed entities are not removed from the cache. You could solve it by clearing the cache on every update, but then you also lose all unsaved changes.
Am I missing something obvious?
Example:
Model:
public class Order
{
    [Key]
    public int Id { get; set; }

    public ICollection<OrderDetail> OrderDetails { get; set; }
}

public class OrderDetail
{
    [Key]
    public int Id { get; set; }

    [ForeignKey("Order")]
    public int Order_Id { get; set; }

    public virtual Order Order { get; set; }
}
Client code:
function getOrder(orderId, orderObservable) {
    var query = EntityQuery.from("Orders")
        .where("orderId", "==", orderId)
        .expand("orderDetails");

    return manager.executeQuery(query).then(querySucceeded).fail(queryFailed);

    function querySucceeded(data) {
        var order = data.results[0];
        // NOTE: the removed orderdetail is still there 'order.orderDetails'
        orderObservable(order);
    }
}
Step-by-step scenario:
1. User A queries for an order with its corresponding orderdetails.
2. The order and orderdetails are then placed in the cache.
3. User B removes an orderdetail and saves the changes to the server.
4. User A queries again to get the latest updates for the order.
5. When the query returns, the removed orderdetail is still there.
In the Breeze docs, under the heading "Important Caveats about cache clearing", there is a solution that compares the cache with the query result and detaches the cached entities that are missing from the result.
http://www.breezejs.com/documentation/entitymanager-and-caching
But that doesn't work in this case. I'm guessing it has to do with the fact that the orderdetails are related to the order and are "picked up" from the cache before the order is passed to the success callback.
All help is appreciated!
The problem you are facing isn't with Breeze, but with design in general. There are a couple of options that come to mind:

Use SignalR to notify your web application that a change has occurred, and detach any removed entities from the cache.
Use an archived or deleted flag instead of removing the entities from the database.

Both have their advantages and disadvantages.
With SignalR you will need to get the pipework in place for notifications and set up a specific workflow around removing deleted entities:

manager.detachEntity(entityToDetach);

The reason you detach instead of marking as deleted is that if you set the entity to deleted, your Breeze entity manager still thinks you need to persist that change to the database.
If you use a flag, you can simply have your business logic ignore entities that are flagged as deleted or archived; when you query the DB, it returns the changed entity and the client stops showing it:

myEntity().archived(true);

The problem here is that if the entity no longer matches your query, the query will never return the updated entity to let the client know it was archived or deleted. The other caveat is that you would have information lying around in your database that isn't active anymore.
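To make the flag option concrete on the server side, here is a minimal sketch, assuming a hypothetical IsArchived column on the OrderDetail entity from the question; the client-side logic that hides archived rows is not shown:

public class OrderDetail
{
    [Key]
    public int Id { get; set; }

    [ForeignKey("Order")]
    public int Order_Id { get; set; }

    public virtual Order Order { get; set; }

    // Hypothetical soft-delete flag: User B sets this instead of deleting the row,
    // so User A's next query still returns the entity and picks up the change.
    public bool IsArchived { get; set; }
}

Display code on the client would then filter on the flag (for example, only showing order.orderDetails whose IsArchived is false) rather than expecting the row to disappear from the result set.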
Depending on the type of application and the requirements you have, you should make one of these choices, or come up with another. Hope that helps.

Can't Persist Field to Aspnet_Users via NHibernate/ActiveRecord

I'm using ActiveRecord with NHibernate on the backend. I've set up a mapping for Users; I can create/retrieve/register users without any issues.
I now want to add an association called Role to my users (many users per role). I've created the appropriate Role class, tables, data, etc. and everything seems to be working on that end as well.
The problem is that when I save a user and associate a Role, that association does not persist to the database.
I've added a RoleId (int16) column to the aspnet_Users table to match the Role table's Id (int16) column. I've tried using Save and SaveAndFlush without success.
Here's some code:
Role superUser = Role.First(r => r.Name == "Super User");
User me = User.First(r => r.UserName == myUserName);
me.Role = superUser;
me.Save(); // Or: SaveAndFlush
When debugging, I can see the association on the objects when they're saved (i.e. me.Role is not null and has the right attributes/properties/etc.) However, when I look at the database, the RoleId value for that user is still NULL. (SaveAndFlush doesn't make a difference.)
What am I missing?
I've read somewhere on SO that extending the users table is usually done by adding another table and linking the two by a foreign key; I assume the classes would then use inheritance by composition for the new ExtendedUser class. Assuming I don't want to go that route, why isn't this working? Is it because of the specific ASP.NET MVC stored procedures et al.?
Some relevant mapping:
[ActiveRecord("aspnet_Users", Mutable = false)]
public class User : ActiveRecordLinqBase<User>
{
[PrimaryKey(PrimaryKeyType.Assigned)]
public Guid UserId { get; set; }
// ...
[BelongsTo("RoleId", Cascade = CascadeEnum.SaveUpdate)]
public Role Role { get; set; }
}
[ActiveRecord]
public class Role : ActiveRecordLinqBase<Role>
{
[PrimaryKey]
public int Id { get; set; }
// ...
[HasMany(Inverse = true)]
public IList<User> Users { get; set; }
[Property]
public string Name { get; set; }
}
Edit: mutable="false" - this clearly stands that entity is read only, which is the source of your problem.
Immutable classes, mutable="false", may not be updated or deleted by the application. This allows NHibernate to make some minor
performance optimizations.
Also:
I believe that you need to have cascading defined. You are not saving just the entity itself but also a reference to another entity. Use attributes, fluent configuration, or hbm.xml to define this the way you need. Here are the cascading options:
Here is what each cascade option means:

none - do not do any cascades; let the users handle them by themselves.
save-update - when the object is saved/updated, check the associations and save/update any object that requires it (including saving/updating the associations in a many-to-many scenario).
delete - when the object is deleted, delete all the objects in the association.
delete-orphan - when the object is deleted, delete all the objects in the association. In addition, when an object is removed from the association and not associated with another object (orphaned), also delete it.
all - when an object is saved/updated/deleted, check the associations and save/update/delete all the objects found.
all-delete-orphan - when an object is saved/updated/deleted, check the associations and save/update/delete all the objects found. In addition, when an object is removed from the association and not associated with another object (orphaned), also delete it.
You may want to read this article.
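To make the suggested fix concrete, here is a minimal sketch of the adjusted mapping, assuming you are allowed to make the aspnet_Users mapping writable (if that table must stay untouched, the separate linked table mentioned in the question is the safer route):

[ActiveRecord("aspnet_Users")] // Mutable defaults to true, so saves are persisted
public class User : ActiveRecordLinqBase<User>
{
    [PrimaryKey(PrimaryKeyType.Assigned)]
    public Guid UserId { get; set; }

    // SaveUpdate cascading ensures the Role reference is written along with the User.
    [BelongsTo("RoleId", Cascade = CascadeEnum.SaveUpdate)]
    public Role Role { get; set; }
}

With the entity mutable again, me.Save() (or SaveAndFlush) should write the RoleId column.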

How to use a Dictionary or Hashtable for LINQ query performance underneath an OData service

I am very new to OData (only started on it yesterday) so please excuse me if this question is too dumb :-)
I have built a test project as a proof of concept for migrating our current web services to OData. For this test project, I am using Reflection Providers to expose POCO classes via OData. These POCO classes come from an in-memory cache. Below is the code so far:
public class DataSource
{
    public IQueryable<Category> CategoryList
    {
        get
        {
            List<Category> categoryList = GetCategoryListFromCache();
            return categoryList.AsQueryable();
        }
    }

    // below property is only required to allow navigation
    // from Category to Product via OData urls
    // eg: OData.svc/CategoryList(1)/ProductList(2) and so on
    public IQueryable<Product> ProductList
    {
        get
        {
            return null;
        }
    }
}

[DataServiceKeyAttribute("CategoryId")]
public class Category
{
    public int CategoryId { get; set; }
    public string CategoryName { get; set; }
    public List<Product> ProductList { get; set; }
}

[DataServiceKeyAttribute("ProductId")]
public class Product
{
    public int ProductId { get; set; }
    public string ProductName { get; set; }
}
To the best of my knowledge, OData is going to use LINQ behind the scenes to query these in-memory objects (i.e. the List in this case) if somebody navigates to OData.svc/CategoryList(1)/ProductList(2) and so on.
Here is the problem, though: in the real-world scenario, I am looking at over 18 million records inside the cache, representing over 24 different entities.
The current production web services make very good use of .NET Dictionary and Hashtable collections to ensure very fast lookups and to avoid a lot of looping. So to get to the Product with ProductId 2 under the Category with CategoryId 1, the current web services just do two lookups, i.e. the first one to locate the Category and the second one to locate the Product inside the Category. Something like a b-tree.
I wanted to know how I could follow a similar architecture with OData, where I could tell OData and LINQ to use a Dictionary or Hashtable for locating records rather than looping over a generic List.
Is it possible using Reflection Providers, or am I left with no choice but to write my own custom provider for OData?
Thanks in advance.
You will need to process expression trees, so you will need at least a partial IQueryable implementation over the underlying LINQ to Objects. For this you don't need a full-blown custom provider, though; just return your IQueryable from the properties on the context class.
In that IQueryable you would have to recognize filters on the "key" properties (.Where(p => p.ProductId == 2)) and translate them into a dictionary/hashtable lookup. Then you can use LINQ to Objects to process the rest of the query.
But if the client issues a query with a filter which doesn't touch the key property, it will end up doing a full scan. Your custom IQueryable could detect that and fail such a query if you choose to.
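Here is a minimal sketch of the recognition step only, assuming a Dictionary<int, Product> cache; a complete solution would wrap this logic in an IQueryable/IQueryProvider so the data service can hand it the expression tree directly, and the names used here (WhereFast, byId) are purely illustrative:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public static class KeyLookupSketch
{
    // Answer "p => p.ProductId == <constant>" with a dictionary lookup;
    // anything else falls back to a LINQ to Objects scan.
    public static IEnumerable<Product> WhereFast(
        Dictionary<int, Product> byId,
        Expression<Func<Product, bool>> predicate)
    {
        if (predicate.Body is BinaryExpression eq &&
            eq.NodeType == ExpressionType.Equal &&
            eq.Left is MemberExpression member &&
            member.Member.Name == "ProductId" &&
            TryEvaluateInt(eq.Right, out int key))
        {
            // O(1) lookup instead of scanning millions of cached records.
            return byId.TryGetValue(key, out Product match)
                ? new[] { match }
                : Enumerable.Empty<Product>();
        }

        // Not a key-equality filter: full scan via LINQ to Objects.
        return byId.Values.Where(predicate.Compile());
    }

    private static bool TryEvaluateInt(Expression expr, out int value)
    {
        // Handles literal constants as well as captured local variables.
        if (expr is ConstantExpression constant && constant.Value is int i)
        {
            value = i;
            return true;
        }
        try
        {
            value = (int)Expression.Lambda(expr).Compile().DynamicInvoke();
            return true;
        }
        catch
        {
            value = 0;
            return false;
        }
    }
}

A filter written the other way around (2 == p.ProductId) or wrapped in a conversion would fall through to the scan here, which is exactly the full-scan caveat mentioned above.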

Resources