I need to refresh a number of EDMX files in my solution. We have dissected our tables into groups, and each model represents one component or process. However, some tables overlap, which means I sometimes need to refresh/update multiple entity models.
Refreshing a group of different entity models in VS 2008 is slow and dangerous: if I miss an out-of-date model, my application won't work.
I need to verify that all of the models in my solution are up to date with my development database.
Ultimate solution: How can I script this? Is there a Visual Studio API for refreshing an EDMX file? I perform the exact same steps every time; can't I automate this?
Acceptable solution: Can I set something up in Visual Studio to tell me when an entity model doesn't match the database? What is the recommended way to test a model against a DB?
Thanks in advance.
Check out the EDM Generator (EdmGen.exe):
http://msdn.microsoft.com/en-us/library/bb387165.aspx
You could put in a pre-build event to regenerate the model.
It's also wise to pre-generate the views, which increases performance substantially.
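For example, a pre-build event along these lines could regenerate the model artifacts and pre-generate the views (a rough sketch only; the framework path, connection string, and model name are assumptions to adapt, and note that EdmGen works against the split CSDL/SSDL/MSL files it produces rather than an .edmx):

    rem Regenerate the model (produces MyModel.csdl/.ssdl/.msl plus the object layer)
    "%windir%\Microsoft.NET\Framework\v3.5\EdmGen.exe" /mode:FullGeneration /c:"Data Source=.;Initial Catalog=MyDb;Integrated Security=True" /project:MyModel /language:CSharp

    rem Pre-generate the views so the first query doesn't pay the view-compilation cost
    "%windir%\Microsoft.NET\Framework\v3.5\EdmGen.exe" /mode:ViewGeneration /inssdl:MyModel.ssdl /incsdl:MyModel.csdl /inmsl:MyModel.msl /outviews:MyModel.Views.cs /language:CSharp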
Hope this helps.
As far as I can tell, EdmGen won't help here. Since you are looking to update an .edmx file, you won't have access to the separate XML files (CSDL/SSDL/MSL) that EdmGen tries to validate.
I would suggest checking the ADO.NET team blog: http://blogs.msdn.com/adonet/archive/2008/06/26/edm-tools-options-part-3-of-4.aspx
They have an explanation of how to update an EDMX file, but it is pretty involved. Hopefully VS2010 will have a better solution for this.
Open the .edmx file in the designer (double-click it in VS.NET). Right-click anywhere and choose "Update Model from Database...". The wizard that opens will show you a limited diff: new tables and deleted tables. But its granularity stops there; it doesn't show changes to individual fields, for instance.
So I've created a 2017 SSAS tabular model in VS and deployed it to the SSAS tabular workspace server. But for some reason it creates two separate models: one normal, and one with some random characters appended to its name. Worse still, when I deploy changes to the model, it only updates the one with the characters appended.
What is going on here?
Deployment setting:
Two models showing in the workspace server:
The tabular model with your username and a GUID is the workspace database. This is a local copy of the tabular model with the changes that you've applied to it when Integrated Workspace mode is not used. The workspace database is kept in memory while it's open in SSDT and, depending on the Workspace Retention property, may be removed from memory, removed from memory and disk, or kept in memory. The default setting is to remove it from memory but not from disk, which is why you may not see this database whenever you close the model in SSDT. This property can be accessed from SSDT by highlighting the .bim file and viewing the properties (press F4).
The changes made to the workspace database should be applied to the deployed model when it's deployed using the model name as the target database on the deployment server, as in your screenshot. When you examine the model (the non-workspace database) in SSMS, how do you know the changes are not applied to it, and have you tried refreshing the view from SSMS? This can be done by right-clicking the Databases folder above the tabular models and pressing Refresh. Also, the deployment from SSDT is succeeding without errors, correct?
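If it helps to see exactly which databases exist on the instance, the catalogs can be listed with a standard schema rowset. Here's a minimal sketch using ADOMD.NET (the server name is an assumption); the same query can also simply be run from an MDX query window in SSMS:

    // Requires a reference to Microsoft.AnalysisServices.AdomdClient (ships with SSMS / ADOMD.NET).
    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class ListTabularDatabases
    {
        static void Main()
        {
            // Assumption: adjust the server/instance name to your deployment server.
            using (var conn = new AdomdConnection("Data Source=localhost\\TABULAR"))
            {
                conn.Open();
                var cmd = new AdomdCommand(
                    "SELECT [CATALOG_NAME], [DATE_MODIFIED] FROM $SYSTEM.DBSCHEMA_CATALOGS", conn);
                using (var reader = cmd.ExecuteReader())
                {
                    // The workspace copy shows up with the username/GUID suffix;
                    // DATE_MODIFIED indicates whether a deployment actually touched a database.
                    while (reader.Read())
                        Console.WriteLine("{0}  (modified {1})", reader[0], reader[1]);
                }
            }
        }
    }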
Since I cannot comment on the correct answer I'll make it a separate answer.
What I saw happening was when you import a cube in Visual Studio (while creating a new project) a new tabular DB is created with the name "CubeName_username_Guid".
Everything you modify gets deployed to the real cube after you close Visual Studio. Also, after you close VS, the oddly named cube disappears.
I hope it helps.
We have two software projects that both communicate with a single database. Right now, SQL updates are all applied directly to the database, and it's up to developers to remember to update both projects independently to use the latest database model. Making matters worse, the projects are in separate solutions in separate source control repositories.
I acknowledge this is a terrible situation to be in; I inherited it. My long-term goal is to consolidate and share the (considerable) duplicated logic in one common project used by both applications, but for various reasons it's not feasible to jump into that right now: critical deadlines are coming up, and the consolidation needs to happen iteratively and be scheduled with the other developers so as not to disrupt work too much.
Keeping that in mind, I really want to use SSDT to at least start bringing the database structure under source control and make it easier to manage, as there are quite a few database changes that I'm about to do.
The problem with SSDT in this scenario is that you can only import from the database once. After that, the option is greyed out and unavailable, which is apparently a design decision of SSDT, since it's explicitly listed in the MSDN documentation.
Is there any easy way to update my SSDT project without nuking the current project and recreating it each time someone makes a change to the database structure?
Firstly, you're right, it is a horrible situation, so work on improving it in the long term!
There are two things you can do. First, you could use SSMS's "Generate Scripts" wizard to export all the objects, and then use the import in SSDT to import from the scripts; this isn't greyed out.
The second thing you can do is manually bring the changes in using the schema compare in SSDT: set the database as the source and the project as the destination, and choose what to drop, update, and import.
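If you'd rather automate the first option than click through the wizard each time, the same export can be scripted with SMO (a rough sketch, assuming a local instance and a database named MyDb; for SSDT you'd likely want one file per object rather than a single script):

    // Requires references to Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo
    // (installed alongside SSMS / SQL Server).
    using System.IO;
    using Microsoft.SqlServer.Management.Smo;

    class ScriptAllTables
    {
        static void Main()
        {
            var server = new Server("localhost");      // assumption: local default instance
            var database = server.Databases["MyDb"];   // assumption: database name

            var options = new ScriptingOptions
            {
                DriAll = true,      // include keys and constraints
                Indexes = true,     // include indexes
                ScriptDrops = false // CREATE statements only
            };

            using (var writer = new StreamWriter("AllTables.sql"))
            {
                foreach (Table table in database.Tables)
                {
                    if (table.IsSystemObject) continue;
                    foreach (string line in table.Script(options))
                    {
                        writer.WriteLine(line);
                        writer.WriteLine("GO");
                    }
                }
            }
        }
    }

The resulting script(s) can then be pulled into the SSDT project with its import-from-script option.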
Ed
A bit of a late answer. I am using a VS2017 database project, in which I achieved this by comparing a local database with the database project; once the comparison is done, you can apply the changes with the Update button.
Step 1: Right-click the database project and click on the Schema Compare item.
Step 2: Select target -> select the database connection option.
Step 3: Swap the source and target.
Review the screenshots for more detail.
I am going with the compare solution:
Choose Schema Compare, make your database the source and the database project the target, then compare and update.
See this answer.
Make a new temp Database project (outside of TFS) and import all the objects.
Check out the Database project (inside TFS), copy and paste all the folders (excluding the bin and obj folders) from the new temp Database project into the Database project (in TFS), and check in. This way you get the latest DB objects into TFS without duplication.
If the copy/paste operation brings in new files, those new files should be included in the DB project.
Delete the temp Database project folder.
You will need to repeat this process whenever you want to update all the DB objects in TFS; a batch sketch follows.
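A small batch file can take the drudgery out of the copy step (a sketch; the paths are assumptions, and the exclusions match the folders and project files you don't want to overwrite):

    rem Copy the temp project's objects into the TFS project folder,
    rem skipping bin/obj and the project/user files
    robocopy "C:\Temp\TempDbProject" "C:\Work\TFS\MyDbProject" /E /XD bin obj /XF *.sqlproj *.user

Any brand-new .sql files still have to be included in the project inside Visual Studio afterwards, as noted above.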
This is a workaround that worked for me for this file-duplication issue.
I'm currently investigating ASP.NET MVC 2 and LINQ to SQL. It all looks pretty cool. But I have a few application and development lifecycle issues.
Currently, I design the DB in SQL Server Management Studio.
Then I update my DBML files by deleting and re-importing modified tables.
Issues:
I can't find how to simply update the whole DBML schema.
My DBML then loses some of the changes I made, such as renamed relation members or an int mapped to an enum.
If I want a SQL script to deploy my DB (or to keep the schema under source control), I need to use the 'Generate Scripts' SSMS wizard, which would be cool if a) it could remember my settings and b) it could be automated.
Should I work the other way around (start from my DBML and generate the DB)? Should I go for some other framework (NHibernate? Can I use some Linq flavor with it?)
Also, I read that LINQ2SQL is already obsolete in favor of LINQ to Entities. Does that mean the ultimate tool that's supposed to make my life so much better will again make me lose time in the long term?
Thanks for shedding some light.
If you are starting your DB Schema from scratch you could consider "Code-First Development with Entity Framework 4" as outlined by Scottgu.
I have been using this on a new project and am finding it extremely beneficial - especially for testing.
I started with simple POCO classes representing my data. Then, as the project progressed, I let EF4 generate the schema to a "real" DB using my "in-memory" example data. Now I use a mixture of both: in-memory POCOs (for development and TDD) and the auto-generated DB schema (auto-loaded with more "realistic" data) for demonstrations, etc. So far I am very happy.
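For anyone who hasn't seen the approach, a minimal sketch in the DbContext flavour of EF code-first (the class and property names here are just examples, not from Scottgu's article):

    using System.Data.Entity;   // EF code-first (the EntityFramework package)

    public class Product
    {
        public int Id { get; set; }       // becomes the primary key by convention
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Product> Products { get; set; }   // maps to a Products table by convention
    }

By convention, EF creates (or expects) a database whose schema matches these classes, so the schema falls out of the POCOs rather than the other way around.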
There is a lot of opinion over LINQ2SQL and whether it's 'obsolete' or 'discontinued'. But it is still in the .NET Framework and a good tool, so if it suits your needs then you should use it. Frankly, the Entity Framework is still not perfect, and if you don't need the extra flexibility it affords then it is not worth the pain. If I had a small to midsize project I would definitely use LINQ2SQL again (and over EF).
As for your question: yes, you'll lose any names or custom type mappings when you remove and re-add a table. The options that I'm aware of are:
Only remove / re-add the table that has changed (not all tables)
Try altering the DBML tables in place, rather than remove / re-add. You can add and remove columns, change column names and data types, add relationships all on the DBML.
I like JcMalta's suggestion of creating objects as classes before rendering them into the database, but if you find SQL Studio quick to develop with, then it might simply be quickest to create tables there and drop them into your DBML. It's a touch annoying to have to change something in the database and then push the changes into your code, but the code-gen tools are quite good and take away most of the pain.
You can try CodeSmith/PLINQO to auto-sync DB/code:
http://plinqo.com/
As a follow-up, just wanted to say that I eventually found and fell in love with Huagati DBML/EDMX Tools.
To be totally honest, I must say that the price has significantly increased since I purchased it. I believe it is still worth the money anyway.
And for people who are looking for the same kind of tool for MySQL (or other), DevArt is your friend.
I'm generating a domain model using LINQ to SQL via the VS2008 built-in editor, and that works really well: when I adjust my database schema I simply delete everything from the editor and then pull it back in from the Server Explorer by selecting all tables and dragging them onto the designer surface. That works great too.
Now the problem: I have properties that I manually set to auto-generated, read-only, etc. using the property inspector on the right. Every time I re-create the entire schema, I have to do all of this manually all over again.
Is there a way to persist these settings externally and/or automate them to bring it back to the state from before?
You can use something like the Huagati DBML Tools. This will allow you to update the DBML file from the VS designer.
I've also used the following process before:
Create my schema in SSMS
Create a script that uses the SQL Metal command line tool to generate the DBML file
As the DBML file is XML, you can run transformations on the file. I used this to simply change a few things like setting certain fields to be auto-generated (DateCreated, etc).
Then, either use SQL Metal or T4 to create the model files from the altered DBML file.
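As a rough sketch of steps 2 through 4 (the server, database, and the DateCreated column convention are assumptions to adapt):

    // Step 2 (command line):
    //   sqlmetal /server:localhost /database:MyDb /dbml:MyDb.dbml /pluralize
    //
    // Step 3: tweak the generated DBML before code generation.
    using System.Xml.Linq;

    class FixupDbml
    {
        static void Main()
        {
            // The namespace used by SqlMetal/designer-generated DBML files.
            XNamespace ns = "http://schemas.microsoft.com/linqtosql/dbml/2007";
            var dbml = XDocument.Load("MyDb.dbml");

            // Mark every DateCreated column as database-generated and read-only.
            foreach (var column in dbml.Descendants(ns + "Column"))
            {
                if ((string)column.Attribute("Name") == "DateCreated")
                {
                    column.SetAttributeValue("IsDbGenerated", "true");
                    column.SetAttributeValue("IsReadOnly", "true");
                }
            }

            dbml.Save("MyDb.dbml");
        }
    }

For step 4, a final run of sqlmetal with /code: over the altered DBML (or a T4 template) then produces the model classes.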
This process worked great - however I had complete control over the database schema. This process also allowed me to use L2S with SQL Server Compact Edition.
Hope this helps!
T4 Toolbox has a LINQ to SQL schema generator which allows you to develop your LINQ to SQL applications in a model-first approach. I have used it a little and it works really well; here is a blog post with details and usage info.
Your solution may appear to work when you have very few database entities / tables, but it does not scale and as you've found, syncing is less than ideal.
Do not use the Visual Studio 2008 LinqToSql O/R Designer
After looking at many alternatives to the problems you are describing with LinqToSql, I decided to abandon LinqToSql altogether as I didn't find any of the workarounds very good. Competing ORMs don't have the silly problems that LinqToSql has and they are much more mature and feature rich.
I could/should probably list some of the alternatives I ran across, but I don't want to spend the time and give you false hope, sorry.
I'm fairly new to LINQ to SQL, so I could be missing something basic here.
I created a LINQ to SQL layer, generated all the dbml files etc., and created a LINQ query which worked fine. I then made a change to the database, and wanted to get that change reflected in the ORM layer. To do this, I deleted my ORM layer and created a new one (may not be the best way?).
Now my code is not able to see the datacontext object in intellisense and won't compile. I imagine this may be something simple, but I'd also like to understand the bigger picture of how to update the LINQ to SQL ORM layer when the database changes.
Yeah, you don't want to delete your whole DBML file. Open it in the designer and delete the table that changed. Then drag and drop it again from the Server Explorer (in the View menu). This will load an updated copy of the database.
Note that if server explorer is already open while you make the change to the SQL schema, you'll need to refresh server explorer so it has the latest versions.
The drawback to this approach is that if you do customizations to the table in the DBML, those need to be redone. This is an infrequent case for me.
I remember having this issue a bunch. The fix is simple, really: rebuild your solution! The DataContext and other such classes are generated during a build.
Quite a headache - I wish the DBML tool did this for you when you closed it.
You can also use SQLMetal to update your DBML classes. Some people even write a script or batch file to automate the process.
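For example, a small batch file along these lines (the server, database, and namespace are assumptions) regenerates the DBML and then the classes from it:

    rem Regenerate the DBML from the live database
    sqlmetal /server:localhost /database:MyDb /dbml:MyDb.dbml /namespace:MyApp.Data /pluralize

    rem Regenerate the C# classes from the DBML
    sqlmetal /code:MyDb.cs /namespace:MyApp.Data MyDb.dbml

Keep in mind this overwrites the generated files, so any hand-made designer customizations (as discussed above) are lost and need to be reapplied or scripted.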