In my solution, I have an SSDT project. Every time I build my solution, a line stating:
1> Database -> <some\path>\Database\bin\Debug\Database.dacpac
shows up, and takes about 20 seconds to complete.
If I decide to use DACPACs in the future, I'll only generate the DACPAC when the database is ready to publish.
Is there any way to continue building the SSDT project but not generate the DACPAC every time the project is built, to cut down on build times?
No, this is not possible. Building the dacpac file is part of the SQL Server Database Project's job. Note that the process of producing the dacpac file does not take long to execute. Internally most of the time is spent parsing the T-SQL, interpreting the object declarations, building a model representation of the declared database, resolving references between objects, and validating. The serialization of the model into XML and storing it in a dacpac file would typically not take 20 seconds. I would guess that other things are happening during that time.
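That said, if the goal is just to avoid paying that cost on every build, a workaround some people use is to exclude the SSDT project from the active solution configuration (Build > Configuration Manager, uncheck its Build checkbox) or to build only the project actually being changed. A minimal command-line sketch, assuming a hypothetical MyApp\MyApp.csproj that does not reference the database project:

    # Build just the application project; the SSDT project is skipped
    # entirely, so no dacpac is produced:
    msbuild .\MyApp\MyApp.csproj /p:Configuration=Debug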
This might not sound like a good solution, but if you are not interested in the SQL project and dacpac output right now, for example if you made a code change and just want to test/debug it:
Right click the project and click "Unload Project".
It is not a real solution, but it can save you those 20 seconds.
Related
We have two software projects that both communicate with a single database. Right now SQL updates are all done on the database, and it relies on developers to remember to update both sets of projects independently to use the latest database model. Making matters worse, the two projects are in separate solutions in separate source control repositories.
While I acknowledge this is a terrible situation to be in, I inherited it. My long-term goal is to consolidate the (large amount of) duplicated logic into one common project shared by both applications, but for various reasons it is not feasible to jump right into that now: critical deadlines are coming up, and the consolidation needs to happen iteratively and be scheduled with other developers so as not to disrupt work too much.
Keeping that in mind, I really want to use SSDT to at least start bringing the database structure under source control and make it easier to manage, as there are quite a few database changes I'm about to make.
The problem with SSDT in this scenario is that you can only import from a database once. After that the option is greyed out and unavailable, which is apparently a design decision of SSDT, since it's explicitly called out in the MSDN documentation.
Is there any easy way to update my SSDT project without nuking the current project and recreating it each time someone makes a change to the database structure?
Firstly, you're right: it is a horrible situation, so work on improving it in the long term!
There are two things you can do. Firstly, you could use SSMS "Generate Scripts" to export all the objects and then use Import in SSDT to import from those scripts; that option isn't greyed out.
Secondly, you can bring the changes in manually using Schema Compare in SSDT: set the database as the source and the project as the destination, and choose what to drop, update, and import.
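If you'd rather not point Schema Compare at a live connection every time, SqlPackage can also snapshot the database into a dacpac that can then serve as the compare source. A rough sketch, assuming SqlPackage.exe is on your PATH and using placeholder server and database names:

    # Extract the current schema of the live database into a dacpac
    # (server, database, and output path are placeholders):
    SqlPackage /Action:Extract `
        /SourceServerName:"MYSERVER" `
        /SourceDatabaseName:"MyDatabase" `
        /TargetFile:"C:\temp\MyDatabase.dacpac"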
This answer is a bit late, but I am using a VS2017 database project, and I achieved this by comparing a local database with the database project; once the comparison is done, you can update the project with the Update button.
Step 1: right-click the database project and click the Schema Compare item.
Step 2: select Target -> Select Database Connection.
Step 3: swap the source and target so the database is the source and the project is the target.
I went with the compare solution: choose Schema Compare, make your database the source and the database project the target, then compare and update. See the schema compare answer above for the detailed steps.
Make a new temp Database project (outside of TFS) and import all the objects.
Check out the database project (inside TFS), copy and paste all the folders (excluding the bin and obj folders) from the new temp database project into the TFS database project, and check in. This way you get the latest DB objects into TFS without duplicating them; see the copy sketch after these steps.
If the copy/paste operation brings in new files, those new files need to be included in the DB project.
Delete the temp database project folder.
You will need to repeat this process whenever you want to update all the DB objects in TFS.
This is a workaround which worked for me for this file duplicating issue.
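The copy step above can be scripted so that the build output folders never come along. A rough sketch, assuming hypothetical TempDbProject and TfsDbProject folder names:

    # Copy everything from the temp project into the TFS project,
    # excluding the bin and obj build folders and the project file
    # (so the TFS .sqlproj is not overwritten):
    robocopy .\TempDbProject .\TfsDbProject /E /XD bin obj /XF *.sqlproj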
I have an application written in Pro*C (C with embedded Oracle SQL) that I had building in VS2008 using custom build files to do the preprocessing. The source is in .m4 format (for legacy reasons; I will at some point get rid of this, but it's not high on my priority list). The .m4 gets processed by the M4 utility into .pc (Pro*C) files, which then get converted into .c files by the Oracle Pro*C utility.
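For reference, the two preprocessing stages described above amount to something like the following (file names are placeholders; proc is Oracle's Pro*C precompiler):

    # Stage 1: expand the M4 macros into a Pro*C source file
    m4 app.m4 > app.pc
    # Stage 2: precompile the embedded SQL, producing plain C
    proc iname=app.pc oname=app.c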
We have clients that are using both Oracle 10 and Oracle 11 clients, so I have one project set up for each database version since they need to be built with different Oracle versions. Both projects use the same source code - the only difference is in the Rule files.
When using VS2008, I never had any problems with this process. When I moved the project to VS2010 (rules files converted to .xml, .props, and .targets files), I started getting a build error where my custom build rule was returning a bad exit code for one of the projects.
What's interesting is that each time I "Rebuild All", the project that fails and the project that succeeds switch (i.e. the first time the Oracle 10 project succeeds and the Oracle 11 project fails; the second time Oracle 10 fails and Oracle 11 succeeds). The project that succeeds reliably alternates each time I build, though it seems to always be the first project processed (I'm not sure why Visual Studio wouldn't start with the same project each time).
If I build each project individually, I don't have any problems. If I create a solution-level project dependency, regardless of which project references which, I don't have any problems.
So despite the fact that I've found a work-around, I'm curious about what was causing this issue to begin with.
PS - I'm pretty new to StackOverflow, so I apologize in advance if I missed anything in asking this question. Let me know if there's anything else I could provide to help solve the issue.
In the last year I've worked on two relatively large .NET projects, and both of them have ended up with project/code-generation strangeness that I just haven't figured out how to fix.
The first project generates some bad code for forms that causes the VB.NET build to fail. I actually had to make a search-and-replace macro that fixes the five problems by adding a "Global." prefix to a few references.
I chalked that up to a random act of unkindness against me and went on my way, since the macro takes about 2 seconds to run...
So now, 6 months later, a new project is cranking along and I get a similar-ish problem. I have a bunch of form controls that store state in a settings file using the built-in capabilities of .NET; about 20 controls were configured automatically this way. It worked great until today, when, for reasons I don't understand, the designer.vb file got corrupted. At least one other person on the planet has had this problem here:
http://social.msdn.microsoft.com/Forums/en-US/winformsdesigner/thread/9bd20b56-7264-4a1f-a379-ad66b372ddd3
but the proposed solution didn't change the behavior.
So now I've had two larger projects with project file issues that I can't resolve (several smaller projects have been just fine).
What tools are available to fix projects, migrate projects, lint projects ... anything to recover projects to a reasonable state? Any successful recovery procedures beyond a roll-back/merge?
I had a corrupted-reference issue linked to my use of Mercurial and VS getting confused by file save times... in case this helps.
If you open it in Notepad and it's corrupted, then it probably is corrupted, and the only way to restore it is to go to a backup:
--> go to your backup
--> click your project name
--> find the files that are corrupt
I've got an Entity Framework based data layer that lives in a separate assembly. In the same solution I've got a business assembly, an ASP.NET MVC application, and some unit test assemblies. The EF model is quite large and takes around 20 seconds to build. My problem is that it gets rebuilt every time a build is required: if I change one line in a unit test, I have to wait 20+ seconds for the build to complete. Does anyone know of any tricks to prevent the EF model from triggering rebuilds (I don't want to unload the assembly from the solution)?
In the properties of the solution, I believe you can specify which projects get built. Right-click the solution and select "Properties", then select Configuration Properties; in the right pane you can select which items get built. Hope this helps.
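A command-line variant of the same idea: MSBuild can build just the project you changed without rebuilding the projects it references, as long as their outputs are already on disk. A sketch, assuming a hypothetical UnitTests\UnitTests.csproj:

    # Build only the test project, reusing the previously built EF
    # assembly instead of rebuilding every project reference:
    msbuild .\UnitTests\UnitTests.csproj /p:BuildProjectReferences=false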
Recently we started working with a database project in Visual Studio 2010. I have added a reasonably large database to the solution and imported all objects. All warnings have been eliminated, so it builds fine.
The one thing that really annoys me is that when you open the solution, the database project starts to load the database schema, which takes some time (minutes) and uses a lot of resources. In practice I am not able to start working the moment the solution has loaded.
Is there any way to disable or change that behavior, other than removing the database project from the solution or buying better hardware?
Answering my own question: unload the project.
This way it will (locally) not load the next time you open the solution.
If you need to work on the database project, just load it, work on it and unload it again.