DB_A -> DB project
DB_B -> Schema Comparison with DB Project
Apply Update Script to DB Project
Schema View shows changes, but Solution Explorer does not.
When applying changes during a schema comparison with a database as the source and a database project as the target, the changes are reflected in the Schema View, but Solution Explorer still shows all the old scripts, even though those objects have been dropped as a result of the schema comparison. Is this expected behavior?
How do I synchronize the Solution Explorer (and all its scripts) with the Schema View?
Best answer I've found to this is:
http://social.msdn.microsoft.com/Forums/en/vstsdb/thread/81c1d830-1921-47c0-87d5-c8cf27d35df6
We have two software projects that both communicate with a single database. Right now SQL updates are all made directly on the database, and we rely on developers to update both sets of projects independently to use the latest database model. Making matters worse, the two projects are in separate solutions in separate source control repositories.
While I acknowledge this is a terrible situation to be in, I inherited it. My long-term goal is to consolidate and share the (large amount of) duplicated logic in one common project shared by both sets of applications, but for various reasons it is not feasible to jump right into that now: critical deadlines are coming up, and the work needs to be done iteratively and scheduled with the other developers so it doesn't disrupt them too much.
Keeping that in mind, I really want to use SSDT to at least start bringing the database structure under source control and make it easier to manage, as there are quite a few database changes I'm about to make.
The problem with SSDT in this scenario is that you can only import from a database once. After that the option is greyed out and unavailable, which is apparently a design decision of SSDT, since it's explicitly called out in the MSDN documentation.
Is there any easy way to update my SSDT project without nuking the current project and recreating it each time someone makes a change to the database structure?
Firstly, you're right, it is a horrible situation, so work on improving it in the long term!
There are two things you can do. Firstly, you can use SSMS's "Generate Scripts" feature to export all the objects and then use the import option in SSDT to import from those scripts - that option isn't greyed out (see the example script below).
The second thing you can do is bring the changes in manually using Schema Compare in SSDT: set the database as the source and the project as the destination, and choose what to drop, update, and import.
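To illustrate the first option, "Generate Scripts" emits ordinary per-object DDL, and SSDT's import-from-script option consumes a .sql file of such statements. The table below is a made-up example, not one from the question:
-- Hypothetical example of the per-object DDL that SSMS "Generate Scripts" produces;
-- a .sql file of statements like this can be imported into the SSDT project.
CREATE TABLE [dbo].[Customer]
(
    [CustomerId] INT           NOT NULL,
    [Name]       NVARCHAR(100) NOT NULL,
    CONSTRAINT [PK_Customer] PRIMARY KEY ([CustomerId])
);
GO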
Ed
This answer is a bit late. I am using a VS2017 database project, in which I achieved this by comparing a local database with the database project; once the comparison is done, you can apply the changes with the Update button.
Step 1: Right-click the database project and click the Schema Compare item.
Step 2: Select Target -> Select Database Connection option.
Step 3: Swap the source and target as needed.
I am going with the schema compare solution:
Choose Schema Compare, set your database as the source and the database project as the target, then compare and update.
See this answer.
Make a new temp Database project (outside of TFS) and import all the objects.
Check out the Database project (inside TFS), copy and paste all the folders (excluding the bin and obj folders) from the new temp Database project into the Database project (in TFS), and check in. This way you get the latest DB objects into TFS without duplicating them.
If the copy/paste brings in new files, those new files need to be included in the DB project.
Delete the temp Database project folder.
You will need to repeat this process whenever you want to update all the DB objects in TFS.
This is a workaround that worked for me for the file-duplication issue.
We are using Visual Studio 2010 and our database scripts are in a database project.
We have two databases DB1 and DB2. DB1 uses DB2.
I created a database project for each of the databases and added DB2's .dbschema file as a "Database Reference" to DB1's project.
So the code for my view in DB1 looks like this:
CREATE VIEW dbo.myView
AS
SELECT * FROM [$(DB2Ref)].dbo.SomeTable
GO
Up to here, everything is fine.
But when I run a schema comparison between the actual DB1 database and the DB1 database project, the comparison finds a difference between "myView" in the project and "myView" in the database.
Is there a way to make schema comparisons ignore these referenced-database variables?
You can set the Default for the SQLCMD variable in the project settings to the actual database name. The schema compare in Visual Studio will then know that there is no change.
Unfortunately, if you compare against different databases with different names, you'll need to change this Default each time to match the database you're comparing to.
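As a rough sketch of why this works (assuming the referenced database really is named DB2, as in the question), once the Default is set the project's view resolves to the same text as the deployed view, so the compare comes back clean:
-- With the SQLCMD variable DB2Ref defaulted to "DB2", the project view effectively
-- deploys as the definition the live database already holds, so schema compare
-- reports no difference.
CREATE VIEW dbo.myView
AS
SELECT * FROM [DB2].dbo.SomeTable
GO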
[Screenshot: setting the SQLCMD variable Default in the project settings.]
[Screenshot: SQL Schema Compare of the view - the top is without the Default defined, so the object is marked as a change; the bottom is with the variable defined, so it is marked as no action.]
I am having a very weird problem with building a SQL Server 2008 Database Project from within Visual Studio 2010. I created the database project and then imported the database objects and settings from a local database that I am working with. I then went to build the database project and got the following error:
SQL03006: View: [dbo].[GovCAStaff] has an unresolved reference to object [CTS_Staff].[dbo].[Client_Assignments].
The problem appears to be that the view GovCAStaff is referencing a table in a different database (CTS_Staff). However, I have numerous functions and stored procedures in the same database project that reference tables in a different database, but the build process only generates warnings for those, not errors. Other than rewriting the view as a function, does anyone know of a way to get rid of this build error? Is this a known limitation of views within database projects? Anyway, I am really stumped; I've googled this topic and haven't found anything relevant. Any help would be greatly appreciated. Thanks in advance.
The reason this error shows up for views but not for functions and stored procedures is that this is how SQL Server itself behaves when the database/table does not actually exist. In other words, in SQL Server you can define stored procedures and functions that reference tables that don't exist or are otherwise inaccessible; not so for views.
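As a minimal T-SQL illustration of that difference (the object names here are invented), deferred name resolution lets a procedure be created against a missing table, while the equivalent view fails immediately:
-- Deferred name resolution: this CREATE succeeds even though the table does not
-- exist; the reference is only checked when the procedure is executed.
CREATE PROCEDURE dbo.usp_Demo
AS
SELECT * FROM dbo.TableThatDoesNotExist
GO

-- Views get no such deferral: this CREATE fails right away with an
-- "Invalid object name 'dbo.TableThatDoesNotExist'" error.
CREATE VIEW dbo.vw_Demo
AS
SELECT * FROM dbo.TableThatDoesNotExist
GO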
The way to resolve this issue is to add a 'database reference' (a .dbschema file) to your project so that the project build process knows about the schema of that other database; a sketch of what the referencing view then looks like follows the list below. Where do you get this magical .dbschema file?
1. Create another database project (presumably in the same solution) for the other database. This is the most convenient option, as you can just create a 'project reference' and everything stays up-to-date (you wanted a DB project for that other database, anyway, right?).
2. Create the .dbschema file manually via vsdbcmd.
3. If the database is a 'system' database (e.g. 'master' or 'msdb'), you can use one of the pre-built .dbschema files ({Program Files}\Microsoft Visual Studio 9.0\VSTSDB\Extensions\SqlServer{version}\DBSchemas).
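As a rough sketch (the SQLCMD variable name is whatever you choose when adding the reference; CTS_Staff is assumed here), once the database reference is in place the view swaps the hard-coded database name for the reference variable, which the build resolves against the .dbschema:
-- After adding the database reference, the literal database name is replaced with
-- the reference's SQLCMD variable, which the build resolves via the .dbschema.
CREATE VIEW [dbo].[GovCAStaff]
AS
SELECT ca.* -- select list abbreviated; the original view's columns go here
FROM [$(CTS_Staff)].[dbo].[Client_Assignments] AS ca
GO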
First question ever on this site.
I've been having a really stubborn problem using Visual Studio 2008, and I'm hoping someone has figured this out before.
I have 2 libraries and 1 project that use strongly typed datasets (MSSQL backend) that I generated using the "Configure DataSet with Wizard" option in Data Sources. I've had them working just fine for a while, and I've written a lot of code in the non-designer file for the row classes. I've also specified a lot of custom queries using the dataset designer. This is all work I can't afford to lose.
I've recently made some changes to reorganize my libraries, which included changing the names of the libraries themselves. I also changed the connection string to point to a different database that is a development copy (exact same schema).
The problem is that now, when I open up "Configure DataSet with Wizard" to pick up a new column I've added to one of the tables, it no longer matches the tables correctly in the wizard. The wizard displays all of the tables in the database, and none of them have check boxes next to them (i.e. they are not part of this dataset). Below those it shows all of the tables again, but with red Xs, and these are checked. Basically, Visual Studio sees all of the tables it currently has in the DataSet and sees all of the tables in the database, but believes they are no longer the same and thus do not match!
I've had this same thing happen quite a while back, and I think I just rebuilt the .xsd from scratch, manually copied the code over, and then had to redefine all of the custom queries I had built in the dataset designer. That's not a good solution.
I'm looking for 2 answers:
1. What causes this to happen, and how can I prevent it?
2. How do I fix this so that the wizard once again believes the tables in its .xsd are the same tables that are in the database (yes, they still have the exact same names)?
Thanks.
The dataset designer uses the default query (the first one with a check mark on it) to sync up the schema for each table. Whenever you go to edit the default query, VS will actually connect to your data source and look for changes in the query. If new columns were added, they will show up as new columns for you to add to your table. Renamed columns show up as new, since VS doesn't have any way to know that you changed the name.
Answer 1. The XSD file contains the names of the database tables that it used to create the table originally. If you change the name of the table, the designer won't know which table to sync to.
Answer 2. You can edit the XML inside the XSD file. Do a "Find and Replace" inside the XSD file, replacing the old table name with the new table name. Make sure you have a backup of the XSD file before you do. Be careful to change only instances of the old table name and not any other working XML.
I'm fairly new to LINQ to SQL, so I could be missing something basic here.
I created a LINQ to SQL layer, generated all the dbml files etc., and created a LINQ query which worked fine. I then made a change to the database, and wanted to get that change reflected in the ORM layer. To do this, I deleted my ORM layer and created a new one (may not be the best way?).
Now my code can't see the DataContext object in IntelliSense and won't compile. I imagine this may be something simple, but I'd also like to understand the bigger picture of how to update the LINQ to SQL ORM layer when the database changes.
Yeah, you don't want to delete your whole DBML file. Open it in the designer and delete the table that changed. Then drag and drop it again from Server Explorer (in the View menu). This will load an updated copy of the table from the database.
Note that if Server Explorer was already open while you made the change to the SQL schema, you'll need to refresh Server Explorer so it shows the latest version of the schema.
The drawback to this approach is that if you do customizations to the table in the DBML, those need to be redone. This is an infrequent case for me.
I remember having this issue a bunch. The fix is simple, really. Rebuild your solution! The DataContext and other such classes are generated during a build.
Quite a headache - I wish the DBML tool did this for you when you closed it.
You can also use SQLMetal to update your DBML classes. Some people even write a script or batch file to automate the process.