Problem refreshing tables in the LINQ to SQL designer - visual-studio

I have been using LINQ to SQL for a while, and there is one thing that has always bothered me. Whenever I modify the schema of a table, in order to refresh it in the designer, I have to delete it and then add it back. That's fine, but this means I have to actually find the table in the designer. I have about 100+ tables in my database, and every time I do this, it's like finding a needle in a haystack. Well, maybe it's not that bad, but seriously, it takes way longer than it should.
Is there another option for refreshing tables that I am unaware of?

Some people use SqlMetal to 'refresh/update' their LINQ to SQL model. The designer itself has no support for refreshing the schema when the database changes; you have to manually drop the table and re-add it.
The ADO.NET Entity Framework, I believe, can refresh its model. I've not used it, but I think I saw this in a TechEd demo this year.
Helpful Info: Google's results for SqlMetal.
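If you go the SqlMetal route, a typical invocation that regenerates a .dbml file from the database looks something like this (server, database, namespace, and file names are placeholders):
sqlmetal /server:.\SQLEXPRESS /database:Northwind /dbml:Northwind.dbml /namespace:MyApp.Data /pluralize
Keep in mind that SqlMetal regenerates the whole file, so any customizations made in the designer get overwritten - which is why 'refresh/update' is in quotes above.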

This is not possible using the VS linq to sql designer.
You can do this using LLBLGEN PRO, a third party tool, instead of the built-in linq to sql designer. It isn't free but it does do a ton of other stuff as well, which of course you may or may not need.
LLBLGEN PRO is actually a full set of ORM tools, but also includes an enhanced linq-to-sql designer with 'refresh model from SQL' functionality.
See here for description of the issue - http://weblogs.asp.net/fbouma/archive/2008/05/01/linq-to-sql-support-added-to-llblgen-pro.aspx
And here for the tool - http://www.llblgen.com/

I don't do any customization of the content in the designer, so after table changes I just hit CTRL+A followed by DEL, then shift-select all of my tables in Server Explorer and slap them back onto the designer. I don't have hundreds of tables yet, so I'm not sure whether things slow down at some point, but with 20+ tables it only takes a second.

I have written an add-in that can do that (in both directions; database -> DBML or DBML -> SQL-DDL diff script).
Unlike SQLMetal (or EF's "update model from database") mentioned in another reply, the add-in does a true sync/refresh; applying changes corresponding only to the differences between the model and the underlying db.
That means any customizations (renamed properties/navigation properties etc) that you have made in other areas of your model will not be removed/overwritten unless they are in conflict with the underlying db schema. (in which case you can still preserve them by adding them to the add-in's "exclusion list")
You can download it and get a free 30-day trial license from http://www.huagati.com/dbmltools/

I have a similar comment, thought it might fit in here for anybody out there Googling a solution to this issue...
When I change the columns that are returned by a stored procedure, deleting the procedure from the designer and re-adding it does not work. The custom return type entity that the designer generates does not reflect the changes to the SP.
I've tried disconnecting the DB in the server explorer, even deleting and re-adding the connection.
The only solution I've found is this:
1. Delete the SP from the designer.
2. Save the dbml file (or the whole solution, whatever)
3. Completely close Visual Studio.
4. Re-open Visual Studio and your solution.
5. Re-add the stored procedure to the designer.
I think that qualifies as a blue ribbon pain in the rump.
Anybody got a simpler solution?
PS- To those of you with 100+ tables: Go get a real (real == mature) ORM tool. I personally vote for NetTiers. It rocks. Used it for years with no (or at least very few) complaints. You'll probably have to buy CodeSmith to use it effectively, but it's worth it. The templates are open source. And there are templates for nHibernate as well. But I've found that I don't really dig on Java ports. If I'm gonna code on MS platforms I want code that was "born" there...
...editorial complete. :P

I have had similar issues with the designer. The best thing I can suggest is creating multiple contexts for different areas of your data access - I broke mine down to as few related tables as I could get away with for each functional area. You can re-use tables across contexts, so it isn't a big deal.

There's a template for VS 2008 that replaces the designer, it should ease refreshing your LINQtoSQL classes: http://damieng.com/blog/2008/09/14/linq-to-sql-template-for-visual-studio-2008

There are a couple of other options:
Edit the .dbml file that the designer uses to draw the tables and generate the code. I've used this approach when the changes are small (adding a couple of columns, creating a simple table); a sketch of the relevant XML is shown below.
Use SqlMetal to generate the required XML for the changed tables and move the declarations into the .dbml file by hand. This one is better when the changes are more complex or larger.
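To give an idea of what the first option involves, a table entry in the .dbml file looks roughly like this (table and column names here are invented; the attributes shown - Type, DbType, CanBeNull, IsPrimaryKey, IsDbGenerated - are the ones you typically touch when a column changes):
<Table Name="dbo.Customer" Member="Customers">
  <Type Name="Customer">
    <Column Name="Id" Type="System.Int32" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" CanBeNull="false" />
    <Column Name="Name" Type="System.String" DbType="NVarChar(50) NOT NULL" CanBeNull="false" />
  </Type>
</Table>
Saving the edited .dbml re-runs the custom tool and regenerates the designer code, so small schema changes can be folded in without deleting and re-adding the table.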

I personally detest using the designer, and I've had various issues with it whenever I've dared to use it.
I mostly use LINQ for very simple CRUD (no linked entities or anything), and if that's the case with you, it might be worth straying from the designer crutch. Especially since defining LINQ-to-SQL entities is as easy as this:
[Table("dbo.my_table")]
public class MyTable
{
[Column("id", AutoSync = AutoSync.OnInsert, IsDbGenerated = true, IsPrimaryKey = true)]
public Int32 Id { get; set; }
[Column("name", DbType="NVarChar(50) NOT NULL")]
public String Name { get; set; }
}
This way, all your entities have their own files, which makes finding them much easier, though you'll still have to add/update the properties manually.
Of course, if you'd refactor 100+ tables, that might not be an option ;)
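For completeness, a minimal usage sketch of such hand-mapped entities (the connection string and the MyTable class above are assumed):
using System.Data.Linq;
using System.Linq;

// DataContext works directly against attribute-mapped classes
using (var db = new DataContext(connectionString))
{
    var rows = db.GetTable<MyTable>()
                 .Where(t => t.Name == "example")
                 .ToList();
}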

Related

How to debug problems with The Entity Data Model Designer (Entity Framework)

I have inherited a project that uses Entity Framework in a way that makes it hard to make any changes. It uses QueryViews for almost all tables (circa 50 tables) and, of course, stored procedures. Now I have to change quite a lot of things: rename tables, add tables, change columns, etc.
When I tried to use the "Update Model from database ..." wizard, then after the update from the database (where I added/removed the tables and let the wizard refresh the others) the Entity Data Model Designer stops rendering: there is just a blank window with the text "The Entity Data Model Designer is unable to display the file you requested."
So I tried different approaches (like manually editing the edmx file), but the problem remains. The editor shows only "The Entity Data Model Designer is unable to display the file you requested."
The mapping using QueryViews probably makes it more complicated. It is well known that the designer cannot work with QueryViews properly (you cannot edit them using the designer), and the Entity Framework engine does not even recognize that the columns from the CSDL are mapped using the QueryViews, so it complains about each and every column (which is mapped using a QueryView) with "Error 11009: Property 'XXX' is not mapped." I see exactly 100 errors like this. Maybe somewhere after the 100th error there is some hint (in the form of other errors) about how to fix the designer issue, but I don't know how to see them. The 100 limit is most likely hardcoded in VS2010 (http://stackoverflow.com/questions/2880936/how-to-increase-error-limit-in-visual-studio).
Btw. the code (classes for entities etc.) is generated without problems.
So, the question is: is there a way to see a log or something that would explain why the Entity Framework Data Model Designer is not able to render anything?
Or is there at least some way to see the rest of the errors (beyond the first 100)?
Or does anybody know the ideal way of dealing with schema updates in EF besides using the wizard?
Try adding a new, empty EDMX and right-click >> Open With >> XML Editor; that gives you a complete, empty model definition. You can then compare the two EDMX files and notice which part of your EDMX is missing.
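As a rough guide (structure from memory, so treat it as an approximation), an empty EF4 EDMX has this overall shape; the designer typically refuses to render when the edmx:Designer/Diagrams part is missing or broken:
<edmx:Edmx Version="2.0" xmlns:edmx="http://schemas.microsoft.com/ado/2008/10/edmx">
  <edmx:Runtime>
    <edmx:StorageModels> <!-- SSDL --> </edmx:StorageModels>
    <edmx:ConceptualModels> <!-- CSDL --> </edmx:ConceptualModels>
    <edmx:Mappings> <!-- MSL, including any QueryViews --> </edmx:Mappings>
  </edmx:Runtime>
  <edmx:Designer>
    <edmx:Diagrams> <!-- designer layout --> </edmx:Diagrams>
  </edmx:Designer>
</edmx:Edmx>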
In the end I just did all the changes manually by editing the XML. However, I used the model designer (the GUI integrated in VS for EF) for creating the whole CSDL layer. So my approach was to carefully choose tables in the correct order and add them one by one over multiple iterations of the following steps:
1. Use the model designer to create the CSDL layer for the chosen table, including all relations with already existing tables. This at least ensures that the designer stays usable later on, and it saves the manual writing of the CSDL objects.
2. Write the SSDL layer, which should reflect the DB table.
3. Write the mapping layer (in my case using QueryViews).
4. Try to compile and resolve all compile errors.
5. Repeat for the next table (or more tables if you find it easier).
I hope this will help somebody.

How do you generate classes for LINQ to SQL?

I am using LINQ to SQL for my MVC 3 project. There are several ways to generate the domain model class files:
sqlmetal
Object Relational Designer
hand code
I always hand-code those model class files, because the files generated by SqlMetal or the designer are messy. What's your opinion? What's the best way to do it?
EDIT:
I am using MVC 3, not 2. Maybe I am wrong, but this is how I validate. I'm going to end up writing all those class files anyway, so what's the point of using tools to generate them?
// Required lives in System.ComponentModel.DataAnnotations;
// in MVC 3 the Compare attribute lives in System.Web.Mvc.
using System.ComponentModel.DataAnnotations;
using System.Web.Mvc;

public class User
{
    [Required]
    public string Password { get; set; }

    [Required, Compare("Password")]
    public string ComparePassword { get; set; }
}
We have hundreds of tables across multiple databases (one server). We do table-first development, dragging the tables onto different DBML designer files, each in different folders representing different namespaces within each project. The designer files are marked not to compile, and we use a custom-built T4 template that generates our code by reading from whatever DBML files are in the project.
This lets us have full control of the code that's generated, so we can do things like implement an interface (IAuditable is one example, where we have CreatedBy, CreatedDate, ModifiedBy, ModifiedDate). We can also put System.ComponentModel.DataAnnotations on our Linqed objects this way, without resorting to buddy classes.
We have a second T4 template that's in charge of refreshing the DBML from the database, so we can ensure that tables have the 3-part prefix (db.schema.tbl) and so we don't have to delete and re-add to the designer. The XML just gets changed based on reading the db schema and updating the DBML.
We also generate a repository/manager object for each POCO that has a few common query operations like GetByID(), and that also handles commits and the audit logging. These managers get extended with all the custom queries you'd need to write against each table, and they own the DataContext. This design is sometimes known as the "Mommy-may-I?" approach, where the object Linqed to the table has to ask its manager to do everything for it.
I've found this to be a very versatile and slick way of doing L2S, and it's made our back-end development a breeze so that we can put our focus on the user experience. The only downside is that if we do associations across namespaces, you have to manually add those to the partial class yourself, because otherwise you'd have to add that foreign table to another DBML in order to draw the association. This is actually not such a bad thing, as it causes us to really think about the specificity of our namespaces and cut down on extra ones.
Using T4 this way is a great way to develop DRY (don't repeat yourself). The table definition is the only place you need to change the structure, and it all propagates. Validation goes in one place, the POCO. Queries go in one place, the manager. If you want to do something similar, here's a good place to start.
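To make the idea concrete, here is a very rough sketch of a T4 template that reads a .dbml file and emits partial classes implementing an IAuditable interface. All file, class, and interface names here are illustrative, not the actual templates described above:
<#@ template language="C#" hostspecific="true" #>
<#@ assembly name="System.Xml" #>
<#@ assembly name="System.Xml.Linq" #>
<#@ import namespace="System.Xml.Linq" #>
<#@ output extension=".cs" #>
<#
    // hypothetical: load one DBML file and walk its Table/Type elements
    XNamespace ns = "http://schemas.microsoft.com/linqtosql/dbml/2007";
    var dbml = XDocument.Load(Host.ResolvePath("Northwind.dbml"));
    foreach (var type in dbml.Root.Elements(ns + "Table").Elements(ns + "Type"))
    {
#>
public partial class <#= (string)type.Attribute("Name") #> : IAuditable
{
    public string CreatedBy { get; set; }
    public System.DateTime CreatedDate { get; set; }
    public string ModifiedBy { get; set; }
    public System.DateTime ModifiedDate { get; set; }
}
<#
    }
#>
In the real setup the emitted members would of course come from the Column elements in the DBML rather than being hard-coded; the point is simply that the DBML XML is the single source the template reads from.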
Even though the designer-generated classes are messy, what does it matter to you?
There's, I dare say, absolutely no need to ever open one of the design files.
If you need to extend any of the entities defined in your model, they are all partial classes so you can just create your own partial class of the same name and implement your stuff...
When I do use L2S, I just use the designer.
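For example, a sketch of extending a designer-generated entity via a partial class (entity and property names are made up):
// must be in the same namespace as the generated Customer class
public partial class Customer
{
    // computed property layered on top of the generated mapped columns
    public string DisplayName
    {
        get { return FirstName + " " + LastName; }
    }
}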

How to make LINQ to SQL backward compatible

I am new to LINQ to SQL... I needed to add a new column to an existing table, so I updated the dbml to include this new field. Then in my C# code I query the database and access this new field. Everything is fine with the new database; however, if I load an older database without this new field, my program crashes when it accesses the database (obviously because of this new field). How do I make either the database or my C# code support backward compatibility?
Here's my code snippet.
I added a field email to the Customer table and also added it to the DataContext.dbml; below is the C# code:
DataContext ctx = new DataContext();
var cusList = ctx.Customer;
foreach (var c in cusList)
{
    .
    .
    .
    //access the new field
    if (c.email != null)
        displayEmail(c.email);
    .
    .
    .
}
When I run it through the debugger, it crashes on the very first iteration of the foreach loop if I am using an older version of the database without the new email field.
Thanks.
Make sure you upgrade the old database. That's what updates are made for.
I don't think there's a better option, but I might be wrong.
Should be a code-land fix. Make your code check for the existence of the column, and use different queries in each case.
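A rough sketch of that idea, assuming SQL Server and reusing the ctx DataContext from the question (table and column names as in the question; probing INFORMATION_SCHEMA via ExecuteQuery is just one way to do it):
// returns 1 if the email column exists on the Customer table, 0 otherwise
int hasEmail = ctx.ExecuteQuery<int>(
    @"SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = 'Customer' AND COLUMN_NAME = 'email'").Single();

if (hasEmail == 0)
{
    // fall back to a query/projection that never touches c.email
}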
I agree with Arnis L.: Upgrade your old database. LTS is going to want to look for that column called email in your table, and will complain if it can't find it. I could suggest a workaround that would entail using a Stored Procedure, but you'd need to update your old database to use this stored proc, so it's not a very helpful suggestion :-)
How to update your old database? This is the old-school way:
ALTER TABLE Customers
ADD Email VARCHAR(130) NULL
You could execute this manually against the older database through Query Analyzer, for example. See here for full docs on ALTER TABLE: http://msdn.microsoft.com/en-us/library/aa275462%28SQL.80%29.aspx
If you are working on a team with very strict procedures for deployment from development to production systems, you would already be writing your "change scripts" to the database in this same fashion. However, if you are developing through Enterprise Manager, it might seem counter-productive to have to do the same work, a second time, just to keep old database schemas in sync with the latest schema.
For a friendlier, more "gooey" approach to this latter style of development, I can't recommend enough something like the very excellent Red Gate SQL Compare tools to help you keep multiple SQL Server databases in sync. (There are other third-party utilities out there that can supposedly do roughly the same thing, and that might even be a little cheaper, but I haven't looked much further into them.)
Best of luck! -Mike

DataAdapters against Typed DataSets = SQL Schema nightmares

I have seen many references stating that TableAdapters are weak and silly, and that any real dev would use DataAdapters. I don't know if that is true or not, but I am exploring the matter, and stressing out over how bad this whole 'DataAdapter/TableAdapter against a Typed DataSet' setup smells.
Let me try to explain...
Suppose I have my Typed DataSet defined in the xsd file, and now I'm ready to create a DataAdapter in code against that schema... (By the way, I am using OleDb to access free-standing .dbf files in a folder... no SQL Server stored procedures to call here, just plain old raw tables, ready for action.)
From my studies so far, here is how I see the DataAdapter used in conjunction with a Typed DataSet. Tell me if I am wrong. (Then I have my big complaint / question at the end.)
public DataTable GetJobsByCustomer(string CustNo)
{
OleDbConnection conn1 = new OleDbConnection(dbConnectionString);
conn1.Open();
LMVFP ds1 = new LMVFP(); //My Typed DataSet
string sqlstring = #"SELECT act_compda, contact, cust_num, est_cost, invoiced, job_hours,
job_invnum, job_num, job_remark, job_start, mach_cost, mat_cost, mat_mkup,
p_o_num, priority, quote_no, quoted_by, ship_date, ship_info, shop_notes, status, total_cost
FROM job_info
WHERE (cust_num = ?) AND (status = 'A')
ORDER BY priority";
OleDbDataAdapter JobsAdapter = new OleDbDataAdapter(sqlstring,conn1);
JobsAdapter.SelectCommand.Parameters.Add("?", OleDbType.VarChar,6).Value=CustNo;
JobsAdapter.Fill(ds1, "Jobs"); // A table schema in the Typed DataSet
return ds1.Jobs;
}
Is that how it goes? It does work, so that's good. And indeed the strongly typed behavior is great.
Now, my gripe... You mean to tell me that I've got to maintain the same exact SQL syntax in my DAL method (GetJobsByCustomer) to match the schema of the table in the xsd? It's crazy to have so much maintenance and disconnect between my hand-coded SQL and the xsd schema. There's no error catching at all, since you are writing a text string!! You get to find out at run time whether it will work.
When you're typing all the SQL in code, it's terrible to have to look back and forth to keep your coded SQL in sync with the xsd table schema.
Surely I am missing something.
What a farce. The typed dataset works with beautiful IntelliSense and all, because it's generated from the schema, but when it comes down to it, it's just a pain to write SQL that matches the typed schema. All they've done is move the headache to a new area.
Please tell me I am missing something here that will make this much better.
I second Adam's appreciation for LINQ to SQL and EF, but I'm thinking this wouldn't be an option for you (yet) because of the lack of support for third-party DBMS. On the other hand, a third-party ORM (e.g. NHibernate) may be an option.
Perhaps I don't pay enough attention, but I'm not aware of any good reason to avoid TableAdapters vs DataAdapters. Do you have a link or two?
I don't believe you're missing anything; maintaining this type of code is never fun. Thankfully we now have LINQ to SQL and Entity Framework which can both reduce the amount of manual code maintenance necessary to keep your model objects in sync with your database.
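As a rough illustration of what that buys you, GetJobsByCustomer from the question might look like this in LINQ to SQL, assuming a hypothetical MyDataContext with a mapped JobInfo table (and noting the caveat above that LINQ to SQL itself only targets SQL Server, not OleDb/.dbf files):
using System.Collections.Generic;
using System.Linq;

public List<JobInfo> GetJobsByCustomer(string custNo)
{
    using (var db = new MyDataContext(dbConnectionString))
    {
        // the query is checked against the mapped classes at compile time,
        // instead of living in a hand-maintained SQL string
        return (from j in db.JobInfos
                where j.cust_num == custNo && j.status == "A"
                orderby j.priority
                select j).ToList();
    }
}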

LINQ to SQL not recognizing new associations?

I have two projects using legacy databases with no associations between the tables. In one, if I create associations in the DBML file, I can reference the associations in LINQ like this:
From c In context.Cities Where c.city_name = "Portland" _
Select c.State.state_name
(assuming I added the link from City.state_abbr to State.state_abbr in the DBML file.)
In a different project that uses a different database, adding the association manually doesn't seem to give me that functionality, and I'm forced to write the LINQ query like this:
From c In context.Cities Where c.city_name = "Portland" _
Join s In context.States On c.state_abbr = s.state_abbr _
Select s.state_name
Any idea what I could be missing in the second project?
Note: These are completely contrived examples - the real source tables are nothing like each other, and are very cryptic.
Check your Error List page. You might have something like the following in there:
DBML1062: The Type attribute '[ParentTable]' of the Association element 'ParentTable_ChildTable' of the Type element 'ChildTable' does not have a primary key. No code will be generated for the association.
In which case all you should need to do is make sure that both tables have a primary key set and re-save the dbml file. This will invoke the custom tool, which will in turn update the designer.cs file and create code for the association.
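For reference, after the fix the relevant pieces of the .dbml look roughly like this (names invented to match the contrived City/State example; the key point is the IsPrimaryKey="true" column on the parent side that the association hangs off):
<Table Name="dbo.State" Member="States">
  <Type Name="State">
    <Column Name="state_abbr" Type="System.String" DbType="Char(2) NOT NULL" IsPrimaryKey="true" CanBeNull="false" />
    <Association Name="State_City" Member="Cities" ThisKey="state_abbr" OtherKey="state_abbr" Type="City" />
  </Type>
</Table>
<Table Name="dbo.City" Member="Cities">
  <Type Name="City">
    <Column Name="city_name" Type="System.String" DbType="NVarChar(100) NOT NULL" IsPrimaryKey="true" CanBeNull="false" />
    <Column Name="state_abbr" Type="System.String" DbType="Char(2) NOT NULL" CanBeNull="false" />
    <Association Name="State_City" Member="State" ThisKey="state_abbr" OtherKey="state_abbr" Type="State" IsForeignKey="true" />
  </Type>
</Table>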
It looks like my problem was that my tables didn't have primary keys in the second project. Like I stated, these are legacy tables, so I had to do the linking and primary key stuff in the DBML/DataContext instead of the database itself, and I just forgot to specify the primary keys the second time around. Frustrating when you don't spot it, but it makes sense now.
Sometimes, when everything is configured correctly but still not working, the solution can be as simple as restarting Visual Studio.
I don't know why it happens sometimes, but I thought I should add this answer because having done some searching for a solution myself, it seems nobody has suggested this yet...
