Prevent Entity Framework model from rebuilding every time

I've got an Entity Framework-based data layer in a separate assembly. In the same solution I've got a business assembly, an ASP.NET MVC application, and some unit test assemblies. The EF model is quite large and takes around 20 seconds to build. My problem is that it gets rebuilt every time a build is required: if I change one line in a unit test, I have to wait 20+ seconds for the build to complete. Does anyone know of any tricks to prevent the EF model from triggering rebuilds (I don't want to unload the project from the solution)?

In the properties of the solution, I believe you can specify which projects get built. Right-click the solution and select "Properties", then select Configuration Properties. In the right pane you can choose which projects are built for each solution configuration, so you can leave the EF project out of the configuration you use for day-to-day work. Hope this helps.
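If you also build from the command line, MSBuild can approximate this with its BuildProjectReferences property; a minimal sketch from a Visual Studio command prompt, assuming a hypothetical UnitTests project that references the already-built EF data layer:
```powershell
# Build only the test project, reusing the previously built output of its
# project references (including the EF data layer) instead of rebuilding them.
& msbuild .\UnitTests\UnitTests.csproj /p:BuildProjectReferences=false
```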

Related

How to skip building a DACPAC in an SSDT project

In my solution, I have an SSDT project. Every time I build my solution, a line stating:
1> Database -> <some\path>\Database\bin\Debug\Database.dacpac
shows up, and takes about 20 seconds to complete.
If I decide to use DACPACs in the future, I'll only generate the DACPAC when the database is ready to publish.
Is there any way to continue building the SSDT project, but not generate the DACPAC every time the project is built, to cut down on build times?
No, this is not possible: building the dacpac file is part of the SQL Server Database Project's job. Note, though, that producing the dacpac file itself does not take long. Internally, most of the time is spent parsing the T-SQL, interpreting the object declarations, building a model representation of the declared database, resolving references between objects, and validating. Serializing the model into XML and storing it in a dacpac file would typically not take 20 seconds, so I would guess that other things are happening during that time.
This might not sound like a good solution, but if you are not interested in the SQL project and its dacpac output, for example when you have made a code change and just want to test or debug it:
Right click the project and click "Unload Project".
It is not a real solution, but it can save you that 20 seconds.

VS2010 Updating Service Reference *crazy* Slow (like 5 minutes)

Our team is starting to dread updating the service references in our solution because it's a 5+ minute investment. Everything is localhost inside Visual Studio's web server.
My question is - how can I debug what this problem is? It works fine once it is over, but the long delay is crazy. If I had a clue where to look, perhaps I could resolve this.
With VS2012, I ran into the same issue: it took me almost 10 minutes just to update one service reference. I just managed to fix this by re-adding the service in the following way:
Delete the service reference.
Right-click "Service References" and select "Add Service Reference".
Click "Discover" (required in my case, might be different for others).
Select the service that you want to add under "Services".
Give the service a name (under "Namespace" at the bottom).
Press "Advanced".
Uncheck "Reuse types in referenced assemblies" and press "OK".
Press "OK" to add the Service Reference.
For me, the reuse of types was the big issue: now that this is unchecked, updates only take a few seconds. Since I couldn't find this solution anywhere else, I thought I'd post it here in case others run into a similar issue.
More than likely the .suo file has gotten ridiculously large due to constant refreshes. You can check this by examining the file. If this is the case, delete the .suo file and update the reference. You might want to make a backup first, just in case it holds other user settings you've forgotten about.
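A sketch in PowerShell for inspecting and clearing the .suo files (they carry the hidden file attribute, so -Force is needed; paths are relative to the solution folder):
```powershell
# List hidden .suo files under the solution folder with their sizes,
# then delete them; Visual Studio recreates them on the next open.
Get-ChildItem -Path . -Filter *.suo -Recurse -Force |
    Select-Object FullName, Length

Get-ChildItem -Path . -Filter *.suo -Recurse -Force | Remove-Item -Force
```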
The other possibility is that the WSDL for the service has just gotten too damned large, and you have to bite the bullet.
If you want to reduce the impact, get the service guys to nail down the contract using a little-known secret called planning. ;-) Honestly, poor planning is often the root cause of a lot of the issues that crop up in VS.
I noticed that using svcutil instead of Add Service Reference in Visual Studio leads to shorter generation times, although the generated code is sometimes slightly different (more on that below).
At work we have a WCF service composed of about 100 service operations and 100 service contracts, and proxy generation in Visual Studio 2012, starting from the WSDL exposed by the service, takes about 7 minutes. I then tried svcutil (without any options) and the generation took only about 2 minutes.
I had to add some options to match the characteristics configured in the service reference (/enableDataBinding, /serializable, /namespace:*,myns, /syncOnly and /collectionType:System.ComponentModel.BindingList`1), and with these options the generation time rose to three and a half minutes. Overall the proxy generation is not an order of magnitude faster, but at least the generation time should be cut in half.
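For reference, a sketch of the svcutil invocation with the options above, run from a Visual Studio command prompt (the service URL and namespace are placeholders):
```powershell
# Generate the proxy from the service's WSDL with the options discussed above.
& svcutil.exe 'http://localhost/MyService.svc?wsdl' `
    /enableDataBinding `
    /serializable `
    '/namespace:*,myns' `
    /syncOnly `
    '/collectionType:System.ComponentModel.BindingList`1'
```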
In my experience the two generation methods have some differences that I'd like to point out:
Visual Studio generates datasource files (the ones generated when adding an object data source to a Windows Forms project; see also this SO thread); svcutil has no option for generating them. This shouldn't be a major problem, since the first time you need to databind to a contract the file should be generated by Visual Studio.
As an aside, if the proxy is compiled into a separate assembly, the referring project cannot reuse the generated datasource files, since they are not included in the assembly and will be regenerated anyway.
The ConfigurationName property of the service contracts can be different, apparently because the two generation methods treat the target namespace differently when generating the attribute value. This is a problem in our case, since we do not use the generated app.config. It can, however, be managed easily by changing the app.config to match the new value, or by (automatically) changing the ConfigurationName property in the generated proxy source.
svcutil does not decorate the ExtensionData property with the Browsable(false) attribute. This can be a problem if (like us) you use the data contracts as a source for data binding in Windows Forms, since every grid will now acquire an additional column for ExtensionData. Like the previous hiccup, this can be handled by adding the attribute with a sed-like tool (for example, I used the PowerShell snippet contained in this answer; a sketch of the idea follows).
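A minimal sketch of that idea in PowerShell (the proxy file name is a placeholder, and this is a reconstruction of the approach rather than the exact snippet from the linked answer):
```powershell
# Insert [Browsable(false)] above every generated ExtensionData property,
# keeping the original indentation intact.
$proxy = 'ServiceProxy.cs'   # hypothetical name of the generated proxy file
$src = Get-Content $proxy -Raw
$src = $src -replace '(?m)^([ \t]*)(public System\.Runtime\.Serialization\.ExtensionDataObject ExtensionData)',
    "`$1[System.ComponentModel.Browsable(false)]`r`n`$1`$2"
Set-Content -Path $proxy -Value $src
```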
I faced this same problem just now: updating my service reference was taking around 10-15 minutes, and sometimes it failed to update altogether. Frustrated, I finally deleted the reference and added it once again, and now everything is working fine.
So I suggest you delete the reference and add it again, and let's see what happens.
I had a slow problem updating web references; the times were crazy, more than an hour.
A co-worker told me to add my workspace path to the exclusions in Windows Defender, and it solved my problem.
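On Windows 8 / Server 2012 and later, the same exclusion can be added from an elevated PowerShell prompt (the path is a placeholder for your workspace):
```powershell
# Exclude the workspace folder from Windows Defender's real-time scanning.
Add-MpPreference -ExclusionPath 'C:\src\MyWorkspace'
```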

Disable deserialization of Database Project Schema

Recently we started working with a database project in Visual Studio 2010. I have added a reasonably large database to the solution and imported all objects. All warnings have been eliminated, so it builds fine.
The one thing that really annoys me is that when you open the solution, the database project starts to load the database schema, which takes some time (minutes) and uses a lot of resources. In practice, I am not able to start working the moment the solution has loaded.
Is there any way to disable or change that behavior, other than removing the database project from the solution or getting other hardware?
Answering my own question: unload the project.
This way it will (locally) not load the next time you open the solution.
If you need to work on the database project, just load it, work on it and unload it again.

Is there a better way to refresh Entity Models (*.edmx files)

I need to refresh a bunch of EDMX files in my solution. We have divided our tables into groups, and each model represents one component or process. However, there are some overlapping tables, which means I sometimes need to refresh/update multiple entity models.
Refreshing a group of different entity models in VS 2008 is slow and dangerous: if I miss an out-of-date model, my application won't work.
I need to verify that all of the models in my solution are up to date with my development database.
Ultimate solution: How can I script this? Is there an API to Visual Studio for refreshing an EDMX file? I do the exact same thing every time. Can't I program this?
OK solution: Can I set something up in Visual Studio to inform me when an entity model doesn't match a DB? What is the recommended way to test the model against a DB?
Thanks in advance.
Check out the EDM Generator
http://msdn.microsoft.com/en-us/library/bb387165.aspx
You could put in a pre-build event to regenerate the model.
It's also wise to pre-generate the views, which increases performance substantially.
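A sketch in PowerShell of what such a pre-build step might run (the model name, file paths, and connection string are all placeholders; a real Visual Studio pre-build event would use the cmd equivalent):
```powershell
# Hypothetical pre-build step. EdmGen.exe ships with the .NET Framework
# (the folder is v3.5 for .NET 3.5 SP1, v4.0.30319 for .NET 4).
$edmgen = "$env:windir\Microsoft.NET\Framework\v3.5\EdmGen.exe"

# Regenerate all model artifacts and code from the development database:
& $edmgen /mode:FullGeneration /project:MyModel /language:CSharp `
    "/c:Data Source=.;Initial Catalog=MyDb;Integrated Security=True"

# Or, against existing artifacts, pre-generate just the mapping views:
& $edmgen /mode:ViewGeneration /language:CSharp `
    /inssdl:MyModel.ssdl /incsdl:MyModel.csdl /inmsl:MyModel.msl `
    /outviews:MyModel.Views.cs
```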
Hope this helps.
As far as I can tell, EdmGen will only work for WPF apps. Since you are looking to update an .EDMX file, you will not have access to the XML files that EdmGen tries to validate.
I would suggest checking the ADO.NET blog: http://blogs.msdn.com/adonet/archive/2008/06/26/edm-tools-options-part-3-of-4.aspx
They have an explanation of how to update an EDMX file, but it is pretty involved. Hopefully VS2010 will have a better solution for this.
Open the .edmx file in the Model Browser (double-click it in VS.NET). Right-click anywhere and choose "Update Model from Database...". The wizard that opens will show you a limited diff: new tables and deleted tables. But its granularity stops there; it doesn't show changes in fields, for instance.

Problem with large solutions and service factory

My team is developing WCF services in Visual Studio 2008 SP1 with the Service Factory Modeling Edition. The problem is that we have so many services to develop and we've put everything in the same solution.
250 projects later, the solution barely loads and adding to it is nearly impossible. We thought it would be good to split out every service into its own solution, but the service factory stuff effectively prevents us from doing that.
We are generating code from our model project into our other projects, but if we try to use multiple solutions, we can't reference the model project because it can't maintain the project mappings it needs in order to generate to the proper location. And we use a couple shared data contracts in our Model Project, so splitting the model project up doesn't seem to make sense either.
We have so many projects now that we can't really turn back and not use the service factory method of doing everything.
What should we do?
At this point you should break up your 250 projects into groups and create a solution for each group. My recommendation is to use ~25 projects per solution. Use these mini-solutions for your day-to-day editing.
It is still possible to do a full build of the overall solution via MSBuild and the command line, but VS was not meant to handle so many projects at the same time.
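A sketch of such a full command-line build (the solution file name is a placeholder):
```powershell
# Build the entire overall solution in parallel from a developer prompt,
# without opening it in Visual Studio.
& msbuild .\Everything.sln /m /p:Configuration=Release
```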
I'm surprised to hear you say this. I used the Service Factory in my previous job, and we created a separate solution for each suite of services. In each solution, we created one or more models in the model project.
I recommend that you experiment by creating a couple of new SF solutions, adding a couple of models and generating code. See if that causes any problems.
Have you discussed this or created an issue on CodePlex?
I agree with John Saunders. Keep them small. Also, keep a continuous integration environment running to keep track of your mess-ups during development.
