I usually create a solution folder in Visual Studio and put my DB scripts in it. I always use at least this set of scripts:
Drop model
Create model script
User functions
Stored procedures
Static data (lookup tables)
Test data (not deployed)
Then I simply combine them into a single script and run it against SQL Server, so I can recreate the whole DB in a single step.
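To make that concrete, here is a minimal sketch of how that combine-and-run step could be automated in C# (the file names mirror the list above and the connection string is a placeholder). The GO separator is understood by SSMS and sqlcmd but not by ADO.NET, so the combined script is split into batches first:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

class RebuildDatabase
{
    static void Main()
    {
        // Hypothetical file names; order matters: drop, schema, code, data.
        var scripts = new[]
        {
            "DropModel.sql", "CreateModel.sql", "UserFunctions.sql",
            "StoredProcedures.sql", "StaticData.sql"
        };
        var combined = string.Join("\nGO\n", scripts.Select(File.ReadAllText));

        // "GO" is a client-side batch separator, not T-SQL, so split on it
        // and execute each batch on its own.
        var batches = Regex.Split(combined, @"^\s*GO\s*$",
                                  RegexOptions.Multiline | RegexOptions.IgnoreCase);

        using (var conn = new SqlConnection(
            @"Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True"))
        {
            conn.Open();
            foreach (var batch in batches.Where(b => !string.IsNullOrWhiteSpace(b)))
            {
                using (var cmd = new SqlCommand(batch, conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```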
Anyway, I've never used database projects in either Visual Studio or SQL Server Management Studio.
I've tried creating a SQL Server 2008 Database Project in Visual Studio 2010, but I'm somewhat overwhelmed by all the possible server settings (which I'd prefer to leave at the server defaults anyway). So I'm a bit confused: should I use this project template, or should I just keep doing what I've always done?
What do you use, and why? What advantages might I gain from either approach?
If I were you I would continue to do it the way you are doing it. In fact, I do! In my opinion, the advantages of having the actual .sql files right there in a folder for you to use, edit, and look at far outweigh the advantages you get from a DB project. A DB project would be useful if you were doing something like storage reports, where you have to communicate with, say, eight databases, compare them to eight other databases, save result sets, and so on. Now don't get me wrong, there are advantages to database projects; I just don't think they help much when you have such a simple setup that already works.
Advantages of the SQL Server 2008 Database Project in VS 2010:
No need to switch back and forth between Visual Studio and the client you currently use to communicate with your SQL Server.
Decent data and schema compare tools.
Gives you a one-click way to reverse engineer a database into source control and keep it up to date.
You can compare projects to physical databases and vice versa. (This makes it pretty easy to keep your database up to date no matter where you make the change: in the database project on the file system or in the physical database itself.)
If the tool you're currently using is not specifically tailored to SQL Server, this one is.
Extremely helpful if you need to run unit tests directly against the database without using abstractions.
If you're looking for something a little less complicated, you might want to try SQL Source Control. This won't even require you to maintain scripts, as it does this for you behind the scenes. It will, however, only work as a solution for you if you use either TFS or SVN. And it costs $295...
It has a 28-day trial period, so if you're happy to try it out, I'd be interested in your feedback.
Related
I have a Visual Studio 2010 database project and I'd like to do a partial deployment, i.e. of specified objects. Is this possible? The only options I can see are to either do a full deployment or stop after generating the script.
For example, I'm changing many tables and stored procs but not everything is 100% finished and I'd like to push out a specific stored procedure to my test database, including its permissions, etc.
I read a little bit about SQL Server Data Tools, which apparently supports this, but I'm not clear on whether I'd have to migrate my database project to use that instead (would also need the ok from team lead), or if it's simply a plugin that would allow extra functionality.
Check out Schema Comparisons. They allow you to select the objects you want to deploy. They are available without SQL Server Data Tools.
A "partial deployment" is actually a little dangerous. Consider that you will have just built your database project, your entire database project, complete with the table changes, and it has built with no errors or warnings (right?). Now you want to deploy just your stored procedure, into a database that does not have the table changes.
Your stored procedure got no errors or warnings in the context of all the changes. Are you sure it will get no errors or warnings without those changes?
You should consider a source control solution to this problem. Save a copy of your stored procedure, revert to a version of the code that matches the database you'll be deploying to, then make your stored procedure changes to that. When you deploy, you will be checking to see if the stored procedure makes sense in the context of the database you'll be deploying into.
I've always been intrigued by Visual Studio Database Projects, and while they seem to be quite capable, I've never used them to any great degree outside of simplistic proof-of-concept work. I want to try this for a new project, and I'm also interested in using an EF layer on top of it, but in past test projects this has involved some decent effort.
I'm curious: has Visual Studio matured its product integration to support a single workflow that builds the database project, builds the EF layer on top of it, and finally builds the code, without intermediate steps involved?
We are a small team without dedicated SQL developers. Our primary goal is to bring the database into Visual Studio, get it nicely under source control (TFS), and achieve strong integration from end to end. We're interested in growing into EF, and will probably start by treating it as a simple ORM tool if possible.
Has anyone actually done this that can provide insight into the process?
We have used VS2014; the tooling seems much the same as in earlier versions, and I don't think there have been many changes over the years.
We have an EDMX model and a DB project in the same solution. This does mean you need to keep the DB project up to date, but that's easy to do: you publish your EDMX to a local box/target, then import the changes with a schema compare of the local database against the project. That way you can still have model-driven DB design, use the DB project to deploy changes to the Dev/Stage/Live boxes, and publish with automated deployments as well.
The DB project also has a post-deployment script option, which you can use to seed data, and a pre-deployment script where you can do DB manipulation if you need to change the structure or types of fields while the data is on a live DB.
The schema compare tools in Visual Studio are rather good: you can compare DB to DB, DB to project, or a schema file to either.
Currently I'm using Visual Studio 2012 RC and SQL Server 2012 RTM.
I'd like to know how to re-deploy/re-create a test database for each test run.
Keep in mind I have a SQL Server database project for the database, created from Visual Studio 2012's template.
I'm not entirely sure about an idea I have in mind, but the .testsettings file has setup and cleanup scripts. Is this the way to go? For example, a PowerShell script that reads the script generated by the database project and executes it against the database?
I imagine there are better, out-of-the-box ways of doing this, but I don't know them and Google isn't helping me find the right solution.
As mentioned, you'll probably want to use the VS 2012 .Local.testsettings > Setup and Cleanup scripts to create and tear down your SQL Server database.
For the script you may want to use PowerShell with a .dacpac (rather than just a T-SQL script), since you are using an SSDT project. Here's a link to some example code; in particular you may want to take a look at the 'Deploy-Dac' command.
If you are unfamiliar with .dacpacs as the (build) output of SSDT-created database projects, take a look at this reference link.
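If you'd rather stay in C# than PowerShell, the DacFx API that .dacpacs are built on can do the same deployment. A minimal sketch, assuming a reference to the Microsoft.SqlServer.Dac assembly; the paths and database name are placeholders:

```csharp
using Microsoft.SqlServer.Dac;

class DeployTestDatabase
{
    static void Main()
    {
        // Placeholder path/names; point these at your SSDT build output.
        const string dacpacPath = @"bin\Debug\MyDatabase.dacpac";
        const string connectionString =
            @"Data Source=(localdb)\v11.0;Integrated Security=True";

        var services = new DacServices(connectionString);
        using (var package = DacPackage.Load(dacpacPath))
        {
            // Creates the database if it is missing, upgrades it otherwise;
            // call this from your setup script before each test run.
            services.Deploy(package, "MyTestDb", upgradeExisting: true);
        }
    }
}
```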
Edit: Although this doesn't answer the question in a plain SQL Server way, an easy Entity Framework approach would be the following: I found I could correctly create and destroy my database on every run by using the DbContext.Database.CreateIfNotExists() and DbContext.Database.Delete() methods in the setup and cleanup phases of my tests.
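As an illustration, a minimal MSTest sketch of that approach (MyDbContext is a hypothetical EF code-first context):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class RepositoryTests
{
    private MyDbContext _context; // hypothetical EF code-first context

    [TestInitialize]
    public void Setup()
    {
        _context = new MyDbContext();
        // Build a fresh schema from the model if the database is absent.
        _context.Database.CreateIfNotExists();
    }

    [TestCleanup]
    public void Cleanup()
    {
        // Drop the database so the next test run starts from scratch.
        _context.Database.Delete();
        _context.Dispose();
    }

    [TestMethod]
    public void CanRoundTripAnEntity()
    {
        // ...exercise _context here...
    }
}
```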
The fastest solution, while a bit of a hack, is really straightforward. You can set the DB project's properties, under the Debug tab, to always re-create the database. Then test in two clicks: do a debug build, then run all tests, and you get a freshly built DB on LocalDB for your tests to run against. You can also change the target debugging DB (again in the DB project's properties) to whatever you want, so you can deploy to a .dacpac, to an existing SQL DB, or wherever. It means testing in two steps, and if your build is long it may be annoying, but it works. Otherwise, I believe scripting is your only option.
Why does Visual Studio (optionally) ask to add the database .mdf file to the project output folder? It's still a requirement that the .mdf file be part of a running SQL Server instance for the application to work with the database.
For instance, if I stop the SQL Server instance and run the application, it throws exceptions, etc. So I wonder why it stays in the VS solution folders. Is there any advantage to this?
I think this is generally to allow for the "User Instance" feature, which lets you make a copy of the MDF file for local debugging purposes (without impacting the database that's running within SQL Server).
You can see this URL for more information on how this feature works, but I would just ignore it, since it is deprecated and in SQL Server 2012 is replaced with a fundamentally better and different way of dealing with isolation and avoiding instance maintenance (no more AttachDbFileName nonsense).
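For reference, the deprecated User Instance feature is driven entirely by the connection string, roughly as below (server name, file name, and database name are illustrative):

```csharp
class ConnectionStrings
{
    // Deprecated "User Instance" style: SQL Server Express attaches a private
    // copy of the .mdf from the application's output folder on the fly.
    public const string UserInstance =
        @"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\MyApp.mdf;" +
        @"User Instance=True;Integrated Security=True";

    // Conventional style: the database stays attached to a proper instance.
    public const string Attached =
        @"Data Source=.\SQLEXPRESS;Initial Catalog=MyApp;Integrated Security=True";
}
```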
Personally, I think it's much better to work with a single copy of the database, attached to a proper instance of SQL Server, because these other methods just seem far too convoluted and confusing for very little gain. But maybe that's just me.
I'm generating my domain model using LINQ to SQL via the VS 2008 built-in editor, and that works really well: when I adjust my database schema, I simply delete everything from the editor and pull it back in from Server Explorer by selecting all tables and dragging them onto the designer surface.
Now the problem: I have properties that I manually set to auto-generated, read-only, etc. using the property inspector on the right. Every time I re-create the schema I have to do all of this manually again.
Is there a way to persist these settings externally and/or automate them to bring it back to the state from before?
You can use something like the Huagati DBML Tools. This will allow you to update the DBML file from the VS designer.
I've also used the following process before:
Create my schema in SSMS
Create a script that uses the SQL Metal command line tool to generate the DBML file
As the DBML file is XML, you can run transformations on it. I used this simply to change a few things, like setting certain fields to be auto-generated (DateCreated, etc.); see the sketch after this list.
Then, either use SQL Metal or T4 to create the model files from the altered DBML file.
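For step 3, a small C# sketch of such a transformation (the file name and the rule for which columns to touch are hypothetical; the XML namespace is the one the DBML format declares):

```csharp
using System.Linq;
using System.Xml.Linq;

class DbmlTransform
{
    static void Main()
    {
        XNamespace ns = "http://schemas.microsoft.com/linqtosql/dbml/2007";
        var dbml = XDocument.Load("Northwind.dbml"); // hypothetical file name

        // Mark every DateCreated column as database-generated so the setting
        // survives regeneration (illustrative rule; adjust to your schema).
        foreach (var column in dbml.Descendants(ns + "Column")
                                   .Where(c => (string)c.Attribute("Name") == "DateCreated"))
        {
            column.SetAttributeValue("IsDbGenerated", "true");
            column.SetAttributeValue("AutoSync", "OnInsert");
        }

        dbml.Save("Northwind.dbml");
    }
}
```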
This process worked great - however I had complete control over the database schema. This process also allowed me to use L2S with SQL Server Compact Edition.
Hope this helps!
T4 Toolbox has a LINQ to SQL schema generator which allows you to develop your LINQ to SQL applications in a model-first approach. I have used it a little and it works really well; here is a blog post with details and usage info.
Your solution may appear to work when you have very few database entities/tables, but it does not scale, and as you've found, syncing is less than ideal.
Do not use the Visual Studio 2008 LinqToSql O/R Designer
After looking at many alternatives to the problems you are describing with LinqToSql, I decided to abandon LinqToSql altogether, as I didn't find any of the workarounds very good. Competing ORMs don't have the silly problems that LinqToSql has, and they are much more mature and feature-rich.
I could/should probably list some of the alternatives I ran across, but I don't want to spend the time and give you false hope, sorry.